| Field | Type | Stats |
| --- | --- | --- |
| pipeline_tag | stringclasses | 48 values |
| library_name | stringclasses | 198 values |
| text | stringlengths | 1–900k |
| metadata | stringlengths | 2–438k |
| id | stringlengths | 5–122 |
| last_modified | null | n/a |
| tags | listlengths | 1–1.84k |
| sha | null | n/a |
| created_at | stringlengths | 25–25 |
| arxiv | listlengths | 0–201 |
| languages | listlengths | 0–1.83k |
| tags_str | stringlengths | 17–9.34k |
| text_str | stringlengths | 0–389k |
| text_lists | listlengths | 0–722 |
| processed_texts | listlengths | 1–723 |
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1062301628475289600/sCq-edVm_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Su Başak 🤖 AI Bot </div> <div style="font-size: 15px">@inmidonot bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@inmidonot's tweets](https://twitter.com/inmidonot). | Data | Quantity | | --- | --- | | Tweets downloaded | 344 | | Retweets | 6 | | Short tweets | 15 | | Tweets kept | 323 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2mghgkpx/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @inmidonot's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/12l0mm4t) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/12l0mm4t/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/inmidonot') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/inmidonot/1616937673978/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/inmidonot
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Su Başak AI Bot @inmidonot bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @inmidonot's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @inmidonot's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1342654008520011783/ELNBkoe__400x400.png')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Insert 🚩🦮 🤖 AI Bot </div> <div style="font-size: 15px">@insert_name27 bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@insert_name27's tweets](https://twitter.com/insert_name27). | Data | Quantity | | --- | --- | | Tweets downloaded | 3246 | | Retweets | 111 | | Short tweets | 491 | | Tweets kept | 2644 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3m2d1hmb/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @insert_name27's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/ajldnpxe) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/ajldnpxe/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/insert_name27') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/insert_name27/1617820538616/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/insert_name27
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Insert AI Bot @insert\_name27 bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @insert\_name27's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @insert\_name27's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1418652395119153153/dvMUbHmM_400x400.jpg&#39;)"> </div> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1449364913890074627/SNmSlTYD_400x400.jpg&#39;)"> </div> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1450840619132260357/r9rdJtIp_400x400.jpg&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI CYBORG 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Pratham & Insha & Savio Martin ⚡️</div> <div style="text-align: center; font-size: 14px;">@insharamin-prathkum-saviomartin7</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Pratham & Insha & Savio Martin ⚡️. | Data | Pratham | Insha | Savio Martin ⚡️ | | --- | --- | --- | --- | | Tweets downloaded | 3246 | 3249 | 3249 | | Retweets | 461 | 24 | 118 | | Short tweets | 317 | 457 | 201 | | Tweets kept | 2468 | 2768 | 2930 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/o7jfvmhp/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @insharamin-prathkum-saviomartin7's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/p2md0wva) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/p2md0wva/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/insharamin-prathkum-saviomartin7') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "http://www.huggingtweets.com/insharamin-prathkum-saviomartin7/1637920907734/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/insharamin-prathkum-saviomartin7
null
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI CYBORG Pratham & Insha & Savio Martin @insharamin-prathkum-saviomartin7 I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Pratham & Insha & Savio Martin. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @insharamin-prathkum-saviomartin7's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1276993788821540872/edbR86Jw_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Insufficiently Outraged 🤖 AI Bot </div> <div style="font-size: 15px">@insufficientout bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@insufficientout's tweets](https://twitter.com/insufficientout). | Data | Quantity | | --- | --- | | Tweets downloaded | 784 | | Retweets | 26 | | Short tweets | 68 | | Tweets kept | 690 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/5cu9fjjj/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @insufficientout's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2c1v17ew) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2c1v17ew/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/insufficientout') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/insufficientout/1616757946042/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/insufficientout
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Insufficiently Outraged AI Bot @insufficientout bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @insufficientout's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @insufficientout's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1374410731358019593/eBVT1vhW_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Spark Of Inquiry 🤖 AI Bot </div> <div style="font-size: 15px">@interro__bang bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@interro__bang's tweets](https://twitter.com/interro__bang). | Data | Quantity | | --- | --- | | Tweets downloaded | 114 | | Retweets | 2 | | Short tweets | 19 | | Tweets kept | 93 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1k112d2n/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @interro__bang's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/uppi8vz0) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/uppi8vz0/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/interro__bang') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/interro__bang/1616611219490/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/interro__bang
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Spark Of Inquiry AI Bot @interro\_\_bang bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @interro\_\_bang's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @interro\_\_bang's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<link rel="stylesheet" href="https://unpkg.com/@tailwindcss/typography@0.2.x/dist/typography.min.css"> <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/608132742224568320/x3yrArdT_400x400.png')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Electronic Intifada 🤖 AI Bot </div> <div style="font-size: 15px; color: #657786">@intifada bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@intifada's tweets](https://twitter.com/intifada). <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3241</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>6</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>0</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>3235</td> </tr> </tbody> </table> [Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/1qmm4ybr/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @intifada's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/8f4jzilg) for full transparency and reproducibility. At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/8f4jzilg/artifacts) is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/intifada'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. 
## About *Built by Boris Dayma* </section> [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) <section class='prose'> For more details, visit the project repository. </section> [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/intifada/1603110719648/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/intifada
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
<link rel="stylesheet" href="URL <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('URL </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Electronic Intifada AI Bot </div> <div style="font-size: 15px; color: #657786">@intifada bot</div> </div> I was made with huggingtweets. Create your own bot based on your favorite user with the demo! ## How does it work? The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. ## Training data The model was trained on @intifada's tweets. <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3241</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>6</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>0</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>3235</td> </tr> </tbody> </table> Explore the data, which is tracked with W&B artifacts at every step of the pipeline. ## Training procedure The model is based on a pre-trained GPT-2 which is fine-tuned on @intifada's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/intifada'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> ![Follow](URL <section class='prose'> For more details, visit the project repository. </section> ![GitHub stars](URL
[ "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @intifada's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3241</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>6</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>0</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>3235</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @intifada's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/intifada'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @intifada's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3241</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>6</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>0</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>3235</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @intifada's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/intifada'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
text-generation
transformers
<link rel="stylesheet" href="https://unpkg.com/@tailwindcss/typography@0.2.x/dist/typography.min.css"> <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('http://pbs.twimg.com/profile_images/922432805426130944/Zv5SABlH_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Carlos E. Perez 🤖 AI Bot </div> <div style="font-size: 15px; color: #657786">@intuitmachine bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@intuitmachine's tweets](https://twitter.com/intuitmachine). <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3216</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>222</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>82</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>2912</td> </tr> </tbody> </table> [Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/3a25w014/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @intuitmachine's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/g4lfqgv1) for full transparency and reproducibility. At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/g4lfqgv1/artifacts) is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/intuitmachine'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. 
## About *Built by Boris Dayma* </section> [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) <section class='prose'> For more details, visit the project repository. </section> [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true", "widget": [{"text": "My dream is"}]}
huggingtweets/intuitmachine
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
<link rel="stylesheet" href="URL <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('URL </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Carlos E. Perez AI Bot </div> <div style="font-size: 15px; color: #657786">@intuitmachine bot</div> </div> I was made with huggingtweets. Create your own bot based on your favorite user with the demo! ## How does it work? The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. ## Training data The model was trained on @intuitmachine's tweets. <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3216</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>222</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>82</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>2912</td> </tr> </tbody> </table> Explore the data, which is tracked with W&B artifacts at every step of the pipeline. ## Training procedure The model is based on a pre-trained GPT-2 which is fine-tuned on @intuitmachine's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/intuitmachine'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> ![Follow](URL <section class='prose'> For more details, visit the project repository. </section> ![GitHub stars](URL
[ "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @intuitmachine's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3216</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>222</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>82</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>2912</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @intuitmachine's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/intuitmachine'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @intuitmachine's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3216</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>222</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>82</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>2912</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @intuitmachine's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/intuitmachine'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
text-generation
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1306312443388334081/oABG6C1L_400x400.jpg&#39;)"> </div> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1393211665001459713/gobLbDve_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI CYBORG 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Steve | Millionaire Habits & Investor's Theory</div> <div style="text-align: center; font-size: 14px;">@investorstheory-steveonspeed</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Steve | Millionaire Habits & Investor's Theory. | Data | Steve | Millionaire Habits | Investor's Theory | | --- | --- | --- | | Tweets downloaded | 3245 | 3250 | | Retweets | 330 | 168 | | Short tweets | 320 | 660 | | Tweets kept | 2595 | 2422 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2yk0pwia/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @investorstheory-steveonspeed's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3hmaq3cx) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3hmaq3cx/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/investorstheory-steveonspeed') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/investorstheory-steveonspeed/1622080865723/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/investorstheory-steveonspeed
null
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;URL </div> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;URL </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> AI CYBORG </div> <div style="text-align: center; font-size: 16px; font-weight: 800">Steve | Millionaire Habits & Investor's Theory</div> <div style="text-align: center; font-size: 14px;">@investorstheory-steveonspeed</div> </div> I was made with huggingtweets. Create your own bot based on your favorite user with the demo! ## How does it work? The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. ## Training data The model was trained on tweets from Steve | Millionaire Habits & Investor's Theory. | Data | Steve | Millionaire Habits | Investor's Theory | | --- | --- | --- | | Tweets downloaded | 3245 | 3250 | | Retweets | 330 | 168 | | Short tweets | 320 | 660 | | Tweets kept | 2595 | 2422 | Explore the data, which is tracked with W&B artifacts at every step of the pipeline. ## Training procedure The model is based on a pre-trained GPT-2 which is fine-tuned on @investorstheory-steveonspeed's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ## Limitations and bias The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[ "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on tweets from Steve | Millionaire Habits & Investor's Theory.\n\n| Data | Steve | Millionaire Habits | Investor's Theory |\n| --- | --- | --- |\n| Tweets downloaded | 3245 | 3250 |\n| Retweets | 330 | 168 |\n| Short tweets | 320 | 660 |\n| Tweets kept | 2595 | 2422 |\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @investorstheory-steveonspeed's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## How to use\n\nYou can use this model directly with a pipeline for text generation:", "## Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n![Follow](URL\n\nFor more details, visit the project repository.\n\n![GitHub stars](URL" ]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on tweets from Steve | Millionaire Habits & Investor's Theory.\n\n| Data | Steve | Millionaire Habits | Investor's Theory |\n| --- | --- | --- |\n| Tweets downloaded | 3245 | 3250 |\n| Retweets | 330 | 168 |\n| Short tweets | 320 | 660 |\n| Tweets kept | 2595 | 2422 |\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @investorstheory-steveonspeed's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## How to use\n\nYou can use this model directly with a pipeline for text generation:", "## Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n![Follow](URL\n\nFor more details, visit the project repository.\n\n![GitHub stars](URL" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1343581966541545472/Bs7oM0IV_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Ada IO 🤖 AI Bot </div> <div style="font-size: 15px">@ioorbust bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@ioorbust's tweets](https://twitter.com/ioorbust). | Data | Quantity | | --- | --- | | Tweets downloaded | 789 | | Retweets | 79 | | Short tweets | 102 | | Tweets kept | 608 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/zuxd4c8i/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @ioorbust's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1nt569uh) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1nt569uh/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/ioorbust') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/ioorbust/1617757328084/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/ioorbust
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Ada IO AI Bot @ioorbust bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @ioorbust's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @ioorbust's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<link rel="stylesheet" href="https://unpkg.com/@tailwindcss/typography@0.2.x/dist/typography.min.css"> <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1144996963252940800/VIHkkMCF_400x400.png')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">⌞ʙᴀʟᴀᴢꜱ⌝ 🤖 AI Bot </div> <div style="font-size: 15px; color: #657786">@iotnerd bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@iotnerd's tweets](https://twitter.com/iotnerd). <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3200</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>915</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>102</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>2183</td> </tr> </tbody> </table> [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2gq45sm3/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @iotnerd's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/ksu06s41) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/ksu06s41/artifacts) is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/iotnerd'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. 
## About *Built by Boris Dayma* </section> [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) <section class='prose'> For more details, visit the project repository. </section> [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/iotnerd/1611677898375/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/iotnerd
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
<link rel="stylesheet" href="URL <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('URL </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">⌞ʙᴀʟᴀᴢꜱ⌝ AI Bot </div> <div style="font-size: 15px; color: #657786">@iotnerd bot</div> </div> I was made with huggingtweets. Create your own bot based on your favorite user with the demo! ## How does it work? The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. ## Training data The model was trained on @iotnerd's tweets. <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3200</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>915</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>102</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>2183</td> </tr> </tbody> </table> Explore the data, which is tracked with W&B artifacts at every step of the pipeline. ## Training procedure The model is based on a pre-trained GPT-2 which is fine-tuned on @iotnerd's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/iotnerd'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> ![Follow](URL <section class='prose'> For more details, visit the project repository. </section> ![GitHub stars](URL
[ "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @iotnerd's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3200</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>915</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>102</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>2183</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @iotnerd's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/iotnerd'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @iotnerd's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3200</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>915</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>102</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>2183</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @iotnerd's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/iotnerd'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
text-generation
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1420146838779400197/98VH7-UW_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Ivan Poduje</div> <div style="text-align: center; font-size: 14px;">@ipoduje</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Ivan Poduje. | Data | Ivan Poduje | | --- | --- | | Tweets downloaded | 3230 | | Retweets | 1035 | | Short tweets | 135 | | Tweets kept | 2060 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/gyttyi09/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @ipoduje's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/29wmg1mk) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/29wmg1mk/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/ipoduje') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "http://www.huggingtweets.com/ipoduje/1641572179072/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/ipoduje
null
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT Ivan Poduje @ipoduje I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Ivan Poduje. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @ipoduje's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1432037158072856578/a_Fty68E_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Riikka Purra</div> <div style="text-align: center; font-size: 14px;">@ir_rkp</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Riikka Purra. | Data | Riikka Purra | | --- | --- | | Tweets downloaded | 3250 | | Retweets | 141 | | Short tweets | 78 | | Tweets kept | 3031 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1w0bzvgu/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @ir_rkp's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1nj4v31w) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1nj4v31w/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/ir_rkp') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "http://www.huggingtweets.com/ir_rkp/1643976228944/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/ir_rkp
null
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT Riikka Purra @ir\_rkp I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Riikka Purra. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @ir\_rkp's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1360344954485227529/r2dktZMm_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Kevin 🤖 AI Bot </div> <div style="font-size: 15px">@is_he_batman bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@is_he_batman's tweets](https://twitter.com/is_he_batman). | Data | Quantity | | --- | --- | | Tweets downloaded | 960 | | Retweets | 51 | | Short tweets | 75 | | Tweets kept | 834 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/25g6159m/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @is_he_batman's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2yerrfcg) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2yerrfcg/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/is_he_batman') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/is_he_batman/1614109879160/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/is_he_batman
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Kevin AI Bot @is\_he\_batman bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @is\_he\_batman's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @is\_he\_batman's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1092831572645036035/yvgPGtOn_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Ishan 🤖 AI Bot </div> <div style="font-size: 15px">@ishanspatil bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@ishanspatil's tweets](https://twitter.com/ishanspatil). | Data | Quantity | | --- | --- | | Tweets downloaded | 2468 | | Retweets | 346 | | Short tweets | 231 | | Tweets kept | 1891 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/4iupc1l1/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @ishanspatil's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/k7nyg63n) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/k7nyg63n/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/ishanspatil') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/ishanspatil/1617782474953/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/ishanspatil
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Ishan AI Bot @ishanspatil bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @ishanspatil's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @ishanspatil's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1448436144388009985/zWh5cSQ3_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">نورهان</div> <div style="text-align: center; font-size: 14px;">@islamocommunism</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from نورهان. | Data | نورهان | | --- | --- | | Tweets downloaded | 3196 | | Retweets | 1205 | | Short tweets | 227 | | Tweets kept | 1764 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2l8ikj22/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @islamocommunism's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2kngkxcq) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2kngkxcq/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/islamocommunism') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/islamocommunism/1635014280450/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/islamocommunism
null
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT نورهان @islamocommunism I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from نورهان. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @islamocommunism's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1381764452098437120/74IgKP07_400x400.jpg&#39;)"> </div> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1368077075127603200/Z08slO2P_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI CYBORG 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Boston Psychology PhD & keyvan</div> <div style="text-align: center; font-size: 14px;">@islamphobiacow-praisegodbarbon</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Boston Psychology PhD & keyvan. | Data | Boston Psychology PhD | keyvan | | --- | --- | --- | | Tweets downloaded | 3224 | 3242 | | Retweets | 858 | 179 | | Short tweets | 251 | 223 | | Tweets kept | 2115 | 2840 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3egvdux4/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @islamphobiacow-praisegodbarbon's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/34hmjrwi) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/34hmjrwi/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/islamphobiacow-praisegodbarbon') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/islamphobiacow-praisegodbarbon/1627056382131/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/islamphobiacow-praisegodbarbon
null
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI CYBORG Boston Psychology PhD & keyvan @islamphobiacow-praisegodbarbon I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Boston Psychology PhD & keyvan. | Data | Boston Psychology PhD | keyvan | | --- | --- | --- | | Tweets downloaded | 3224 | 3242 | | Retweets | 858 | 179 | | Short tweets | 251 | 223 | | Tweets kept | 2115 | 2840 | Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @islamphobiacow-praisegodbarbon's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1368077075127603200/Z08slO2P_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">beff jezos</div> <div style="text-align: center; font-size: 14px;">@islamphobiacow</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from beff jezos. | Data | beff jezos | | --- | --- | | Tweets downloaded | 395 | | Retweets | 36 | | Short tweets | 37 | | Tweets kept | 322 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1crtakdb/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @islamphobiacow's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/29lljwti) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/29lljwti/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/islamphobiacow') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/islamphobiacow/1627597861566/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/islamphobiacow
null
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT beff jezos @islamphobiacow I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from beff jezos. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @islamphobiacow's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1344394699470082049/YzE4UMsj_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Rizza Islam 🤖 AI Bot </div> <div style="font-size: 15px">@islamrizza bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@islamrizza's tweets](https://twitter.com/islamrizza). | Data | Quantity | | --- | --- | | Tweets downloaded | 3195 | | Retweets | 73 | | Short tweets | 394 | | Tweets kept | 2728 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/t09cn5o0/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @islamrizza's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/m6l6wkff) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/m6l6wkff/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/islamrizza') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/islamrizza/1619378181874/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/islamrizza
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Rizza Islam AI Bot @islamrizza bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @islamrizza's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @islamrizza's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1152361188862365697/HWUuVltf_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">nick casino 🤖 AI Bot </div> <div style="font-size: 15px">@island_iverson bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@island_iverson's tweets](https://twitter.com/island_iverson). | Data | Quantity | | --- | --- | | Tweets downloaded | 3182 | | Retweets | 367 | | Short tweets | 193 | | Tweets kept | 2622 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/dlr58v3e/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @island_iverson's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1vy3qci6) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1vy3qci6/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/island_iverson') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/island_iverson/1614113195211/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/island_iverson
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
nick casino AI Bot @island\_iverson bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @island\_iverson's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @island\_iverson's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
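The pipeline snippet referenced just above was stripped from this processed field; a restored copy, with the model id taken from this record's raw text field, follows:

```python
from transformers import pipeline

# GPT-2 fine-tuned on @island_iverson's tweets, per this record's raw text field
generator = pipeline('text-generation', model='huggingtweets/island_iverson')
generator("My dream is", num_return_sequences=5)
```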
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1340996475472494593/yqCQjZ06_400x400.jpg&#39;)"> </div> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1341001037142999041/h86Ch8TO_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI CYBORG 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Science Bits & International Science Teaching Foundation</div> <div style="text-align: center; font-size: 14px;">@istfoundation-sciencebits</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Science Bits & International Science Teaching Foundation. | Data | Science Bits | International Science Teaching Foundation | | --- | --- | --- | | Tweets downloaded | 2741 | 163 | | Retweets | 759 | 103 | | Short tweets | 47 | 1 | | Tweets kept | 1935 | 59 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/c9crff9r/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @istfoundation-sciencebits's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2c68vj42) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2c68vj42/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/istfoundation-sciencebits') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/istfoundation-sciencebits/1634209108264/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/istfoundation-sciencebits
null
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI CYBORG Science Bits & International Science Teaching Foundation @istfoundation-sciencebits I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Science Bits & International Science Teaching Foundation. Data: Tweets downloaded, Science Bits: 2741, International Science Teaching Foundation: 163 Data: Retweets, Science Bits: 759, International Science Teaching Foundation: 103 Data: Short tweets, Science Bits: 47, International Science Teaching Foundation: 1 Data: Tweets kept, Science Bits: 1935, International Science Teaching Foundation: 59 Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @istfoundation-sciencebits's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
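The pipeline snippet referenced just above was stripped from this processed field; a restored copy, with the model id taken from this record's raw text field, follows:

```python
from transformers import pipeline

# GPT-2 fine-tuned on the combined Science Bits / ISTF tweets, per this record's raw text field
generator = pipeline('text-generation', model='huggingtweets/istfoundation-sciencebits')
generator("My dream is", num_return_sequences=5)
```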
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1359348009725808641/KyPjQGzk_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">itemLabel 🤖 AI Bot </div> <div style="font-size: 15px">@itemlabel bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@itemlabel's tweets](https://twitter.com/itemlabel). | Data | Quantity | | --- | --- | | Tweets downloaded | 3188 | | Retweets | 1796 | | Short tweets | 389 | | Tweets kept | 1003 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/10hookja/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @itemlabel's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1u63m0wj) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1u63m0wj/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/itemlabel') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true", "widget": [{"text": "My dream is"}]}
huggingtweets/itemlabel
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
itemLabel AI Bot @itemlabel bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @itemlabel's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @itemlabel's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
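The pipeline snippet referenced just above was stripped from this processed field; a restored copy, with the model id taken from this record's raw text field, follows:

```python
from transformers import pipeline

# GPT-2 fine-tuned on @itemlabel's tweets, per this record's raw text field
generator = pipeline('text-generation', model='huggingtweets/itemlabel')
generator("My dream is", num_return_sequences=5)
```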
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1378965688459489280/VViTlDIl_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Google ‘Its All Bullshit’ 🤖 AI Bot </div> <div style="font-size: 15px">@itsall_bullshit bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@itsall_bullshit's tweets](https://twitter.com/itsall_bullshit). | Data | Quantity | | --- | --- | | Tweets downloaded | 3158 | | Retweets | 1762 | | Short tweets | 98 | | Tweets kept | 1298 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/25y8c5ov/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @itsall_bullshit's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/y0ks8zfn) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/y0ks8zfn/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/itsall_bullshit') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/itsall_bullshit/1617823122662/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/itsall_bullshit
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Google ‘Its All Bullshit’ AI Bot @itsall\_bullshit bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @itsall\_bullshit's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @itsall\_bullshit's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
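The pipeline snippet referenced just above was stripped from this processed field; a restored copy, with the model id taken from this record's raw text field, follows:

```python
from transformers import pipeline

# GPT-2 fine-tuned on @itsall_bullshit's tweets, per this record's raw text field
generator = pipeline('text-generation', model='huggingtweets/itsall_bullshit')
generator("My dream is", num_return_sequences=5)
```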
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1346051916556623875/e66ZNvO2_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Big Ian 🤖 AI Bot </div> <div style="font-size: 15px">@itsbigian bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@itsbigian's tweets](https://twitter.com/itsbigian). | Data | Quantity | | --- | --- | | Tweets downloaded | 3238 | | Retweets | 218 | | Short tweets | 552 | | Tweets kept | 2468 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2oczo3b8/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @itsbigian's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/245obnds) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/245obnds/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/itsbigian') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/itsbigian/1616883483325/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/itsbigian
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Big Ian AI Bot @itsbigian bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @itsbigian's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @itsbigian's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
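The pipeline snippet referenced just above was stripped from this processed field; a restored copy, with the model id taken from this record's raw text field, follows:

```python
from transformers import pipeline

# GPT-2 fine-tuned on @itsbigian's tweets, per this record's raw text field
generator = pipeline('text-generation', model='huggingtweets/itsbigian')
generator("My dream is", num_return_sequences=5)
```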
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1376928476633137157/d4J78Fmv_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Harveen 🤖 AI Bot </div> <div style="font-size: 15px">@itsharveen bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@itsharveen's tweets](https://twitter.com/itsharveen). | Data | Quantity | | --- | --- | | Tweets downloaded | 632 | | Retweets | 30 | | Short tweets | 40 | | Tweets kept | 562 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/a779ia8t/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @itsharveen's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1dip1d5b) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1dip1d5b/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/itsharveen') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/itsharveen/1617627052674/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/itsharveen
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Harveen AI Bot @itsharveen bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @itsharveen's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @itsharveen's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
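The pipeline snippet referenced just above was stripped from this processed field; a restored copy, with the model id taken from this record's raw text field, follows:

```python
from transformers import pipeline

# GPT-2 fine-tuned on @itsharveen's tweets, per this record's raw text field
generator = pipeline('text-generation', model='huggingtweets/itsharveen')
generator("My dream is", num_return_sequences=5)
```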
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1348712980112936966/i5-XHX3G_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">jane.flowers 🤖 AI Bot </div> <div style="font-size: 15px">@itsjaneflowers bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@itsjaneflowers's tweets](https://twitter.com/itsjaneflowers). | Data | Quantity | | --- | --- | | Tweets downloaded | 1054 | | Retweets | 166 | | Short tweets | 79 | | Tweets kept | 809 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1af8sp4r/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @itsjaneflowers's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/25kv3ol0) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/25kv3ol0/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/itsjaneflowers') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/itsjaneflowers/1616859152962/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/itsjaneflowers
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
jane.flowers AI Bot @itsjaneflowers bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @itsjaneflowers's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @itsjaneflowers's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
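The pipeline snippet referenced just above was stripped from this processed field; a restored copy, with the model id taken from this record's raw text field, follows:

```python
from transformers import pipeline

# GPT-2 fine-tuned on @itsjaneflowers's tweets, per this record's raw text field
generator = pipeline('text-generation', model='huggingtweets/itsjaneflowers')
generator("My dream is", num_return_sequences=5)
```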
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1355537154538000391/0mOGv6Mw_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">june party corner</div> <div style="text-align: center; font-size: 14px;">@itskillerdog</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from june party corner. | Data | june party corner | | --- | --- | | Tweets downloaded | 196 | | Retweets | 20 | | Short tweets | 30 | | Tweets kept | 146 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1u7twx27/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @itskillerdog's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1vg0bbs8) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1vg0bbs8/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/itskillerdog') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/itskillerdog/1630971994166/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/itskillerdog
null
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT june party corner @itskillerdog I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from june party corner. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @itskillerdog's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
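The pipeline snippet referenced just above was stripped from this processed field; a restored copy, with the model id taken from this record's raw text field, follows:

```python
from transformers import pipeline

# GPT-2 fine-tuned on @itskillerdog's tweets, per this record's raw text field
generator = pipeline('text-generation', model='huggingtweets/itskillerdog')
generator("My dream is", num_return_sequences=5)
```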
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1337464500446957570/ptHOR4kZ_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Luci Keller 🤖 AI Bot </div> <div style="font-size: 15px">@itslucikeller bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@itslucikeller's tweets](https://twitter.com/itslucikeller). | Data | Quantity | | --- | --- | | Tweets downloaded | 3246 | | Retweets | 69 | | Short tweets | 352 | | Tweets kept | 2825 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3nhr24ju/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @itslucikeller's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2zv0hvjq) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2zv0hvjq/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/itslucikeller') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/itslucikeller/1616622417664/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/itslucikeller
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Luci Keller AI Bot @itslucikeller bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @itslucikeller's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @itslucikeller's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
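The pipeline snippet referenced just above was stripped from this processed field; a restored copy, with the model id taken from this record's raw text field, follows:

```python
from transformers import pipeline

# GPT-2 fine-tuned on @itslucikeller's tweets, per this record's raw text field
generator = pipeline('text-generation', model='huggingtweets/itslucikeller')
generator("My dream is", num_return_sequences=5)
```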
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1389992995848658948/XT1CKTIg_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Aqsa.</div> <div style="text-align: center; font-size: 14px;">@itsmeaqsaa</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Aqsa.. | Data | Aqsa. | | --- | --- | | Tweets downloaded | 3246 | | Retweets | 77 | | Short tweets | 1543 | | Tweets kept | 1626 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1xy28krg/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @itsmeaqsaa's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/18kg27bt) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/18kg27bt/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/itsmeaqsaa') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/itsmeaqsaa/1631734394856/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/itsmeaqsaa
null
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT Aqsa. @itsmeaqsaa I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Aqsa.. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @itsmeaqsaa's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
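The pipeline snippet referenced just above was stripped from this processed field; a restored copy, with the model id taken from this record's raw text field, follows:

```python
from transformers import pipeline

# GPT-2 fine-tuned on @itsmeaqsaa's tweets, per this record's raw text field
generator = pipeline('text-generation', model='huggingtweets/itsmeaqsaa')
generator("My dream is", num_return_sequences=5)
```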
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1365914927580344322/b5PadSd5_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">NFT ChiΞf of Staff 🤖 AI Bot </div> <div style="font-size: 15px">@itspublu bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@itspublu's tweets](https://twitter.com/itspublu). | Data | Quantity | | --- | --- | | Tweets downloaded | 1768 | | Retweets | 481 | | Short tweets | 282 | | Tweets kept | 1005 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2l8q7e87/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @itspublu's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1vo0wnnt) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1vo0wnnt/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/itspublu') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/itspublu/1616709602963/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/itspublu
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
NFT ChiΞf of Staff AI Bot @itspublu bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @itspublu's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @itspublu's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
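The pipeline snippet referenced just above was stripped from this processed field; a restored copy, with the model id taken from this record's raw text field, follows:

```python
from transformers import pipeline

# GPT-2 fine-tuned on @itspublu's tweets, per this record's raw text field
generator = pipeline('text-generation', model='huggingtweets/itspublu')
generator("My dream is", num_return_sequences=5)
```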
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/628257137060229120/_3q_D4g2_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Six words story</div> <div style="text-align: center; font-size: 14px;">@itssixword</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Six words story. | Data | Six words story | | --- | --- | | Tweets downloaded | 282 | | Retweets | 0 | | Short tweets | 2 | | Tweets kept | 280 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2dbtmbzz/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @itssixword's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2wydugsv) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2wydugsv/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/itssixword') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/itssixword/1629833127428/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/itssixword
null
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT Six words story @itssixword I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Six words story. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @itssixword's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
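The pipeline snippet referenced just above was stripped from this processed field; a restored copy, with the model id taken from this record's raw text field, follows:

```python
from transformers import pipeline

# GPT-2 fine-tuned on @itssixword's tweets, per this record's raw text field
generator = pipeline('text-generation', model='huggingtweets/itssixword')
generator("My dream is", num_return_sequences=5)
```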
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1457774258063437824/VgJyJ_c2_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">uditgoenka.eth</div> <div style="text-align: center; font-size: 14px;">@iuditg</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from uditgoenka.eth. | Data | uditgoenka.eth | | --- | --- | | Tweets downloaded | 3250 | | Retweets | 993 | | Short tweets | 450 | | Tweets kept | 1807 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1r2lhfr0/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @iuditg's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/iswph9y4) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/iswph9y4/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/iuditg') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "http://www.huggingtweets.com/iuditg/1639532212187/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/iuditg
null
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT URL @iuditg I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from URL. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @iuditg's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
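The pipeline call referenced under "How to use" is preserved in this record's full card text; for reference:

```python
from transformers import pipeline

# Load the @iuditg fine-tuned GPT-2 checkpoint
generator = pipeline('text-generation', model='huggingtweets/iuditg')

# Generate five continuations of the prompt
generator("My dream is", num_return_sequences=5)
```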
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<link rel="stylesheet" href="https://unpkg.com/@tailwindcss/typography@0.2.x/dist/typography.min.css"> <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1202257345734037504/tRJA6HEx_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">| praveen narayan 〉 🤖 AI Bot </div> <div style="font-size: 15px; color: #657786">@ivanpeer bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@ivanpeer's tweets](https://twitter.com/ivanpeer). <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>971</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>110</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>102</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>759</td> </tr> </tbody> </table> [Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/2thafoo8/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @ivanpeer's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/3fepz7hm) for full transparency and reproducibility. At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/3fepz7hm/artifacts) is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/ivanpeer'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. 
## About *Built by Boris Dayma* </section> [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) <section class='prose'> For more details, visit the project repository. </section> [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/ivanpeer/1603607581850/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/ivanpeer
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
<link rel="stylesheet" href="URL <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('URL </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">| praveen narayan 〉 AI Bot </div> <div style="font-size: 15px; color: #657786">@ivanpeer bot</div> </div> I was made with huggingtweets. Create your own bot based on your favorite user with the demo! ## How does it work? The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. ## Training data The model was trained on @ivanpeer's tweets. <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>971</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>110</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>102</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>759</td> </tr> </tbody> </table> Explore the data, which is tracked with W&B artifacts at every step of the pipeline. ## Training procedure The model is based on a pre-trained GPT-2 which is fine-tuned on @ivanpeer's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/ivanpeer'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> ![Follow](URL <section class='prose'> For more details, visit the project repository. </section> ![GitHub stars](URL
[ "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @ivanpeer's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>971</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>110</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>102</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>759</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @ivanpeer's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/ivanpeer'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @ivanpeer's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>971</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>110</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>102</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>759</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @ivanpeer's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/ivanpeer'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
text-generation
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1404902950607089665/CLa3e4aK_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">##lainpilled</div> <div style="text-align: center; font-size: 14px;">@ivegottagetagf</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from ##lainpilled. | Data | ##lainpilled | | --- | --- | | Tweets downloaded | 128 | | Retweets | 7 | | Short tweets | 16 | | Tweets kept | 105 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/7kyd6ojb/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @ivegottagetagf's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3ropyewj) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3ropyewj/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/ivegottagetagf') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/ivegottagetagf/1623876885491/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/ivegottagetagf
null
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT ##lainpilled @ivegottagetagf I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from ##lainpilled. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @ivegottagetagf's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
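For reference, the generation snippet from this record's full card:

```python
from transformers import pipeline

# Load the fine-tuned GPT-2 checkpoint for ##lainpilled (@ivegottagetagf)
generator = pipeline('text-generation', model='huggingtweets/ivegottagetagf')

generator("My dream is", num_return_sequences=5)  # five sampled continuations
```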
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/598663964340301824/im3Wzn-o_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Robert Evans (The Only Robert Evans)</div> <div style="text-align: center; font-size: 14px;">@iwriteok</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Robert Evans (The Only Robert Evans). | Data | Robert Evans (The Only Robert Evans) | | --- | --- | | Tweets downloaded | 3218 | | Retweets | 1269 | | Short tweets | 142 | | Tweets kept | 1807 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3hjcp2ib/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @iwriteok's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/wq4n95ia) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/wq4n95ia/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/iwriteok') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "http://www.huggingtweets.com/iwriteok/1668924855688/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/iwriteok
null
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT Robert Evans (The Only Robert Evans) @iwriteok I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Robert Evans (The Only Robert Evans). Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @iwriteok's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
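The usage example from this record's full card, reproduced for readability:

```python
from transformers import pipeline

# Load the @iwriteok fine-tuned GPT-2 model
generator = pipeline('text-generation', model='huggingtweets/iwriteok')

# Generate five continuations of the prompt
generator("My dream is", num_return_sequences=5)
```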
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1393393642958635008/P1qx1TlP_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">웃</div> <div style="text-align: center; font-size: 14px;">@iyxnmt</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from 웃. | Data | 웃 | | --- | --- | | Tweets downloaded | 3073 | | Retweets | 1416 | | Short tweets | 660 | | Tweets kept | 997 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1lpd2izx/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @iyxnmt's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3qg153k0) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3qg153k0/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/iyxnmt') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/iyxnmt/1621146502054/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/iyxnmt
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT 웃 @iyxnmt I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from 웃. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @iyxnmt's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
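The pipeline snippet from this record's full card, for reference:

```python
from transformers import pipeline

# Load the @iyxnmt fine-tuned GPT-2 checkpoint
generator = pipeline('text-generation', model='huggingtweets/iyxnmt')

generator("My dream is", num_return_sequences=5)
```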
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1311033052592656385/V-9XECfj_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Jamie Beck 🤖 AI Bot </div> <div style="font-size: 15px">@j_beck00 bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@j_beck00's tweets](https://twitter.com/j_beck00). | Data | Quantity | | --- | --- | | Tweets downloaded | 75 | | Retweets | 14 | | Short tweets | 4 | | Tweets kept | 57 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/23mq58mv/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @j_beck00's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2mbmtl4r) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2mbmtl4r/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/j_beck00') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/j_beck00/1617471704579/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/j_beck00
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Jamie Beck AI Bot @j\_beck00 bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @j\_beck00's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @j\_beck00's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
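As in the full card above, the model is used through the text-generation pipeline:

```python
from transformers import pipeline

# Load the @j_beck00 fine-tuned GPT-2 checkpoint
generator = pipeline('text-generation', model='huggingtweets/j_beck00')

# Generate five continuations of the prompt
generator("My dream is", num_return_sequences=5)
```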
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<link rel="stylesheet" href="https://unpkg.com/@tailwindcss/typography@0.2.x/dist/typography.min.css"> <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1333957151576887297/_1ExBQa3_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Jocelyn (male) of the 365 Followers 🤖 AI Bot </div> <div style="font-size: 15px; color: #657786">@j_j_j_j_j_jones bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@j_j_j_j_j_jones's tweets](https://twitter.com/j_j_j_j_j_jones). <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3225</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>320</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>482</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>2423</td> </tr> </tbody> </table> [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/uz60miha/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @j_j_j_j_j_jones's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/soi1lw7l) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/soi1lw7l/artifacts) is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/j_j_j_j_j_jones'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). 
In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) <section class='prose'> For more details, visit the project repository. </section> [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/j_j_j_j_j_jones/1609141746129/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/j_j_j_j_j_jones
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
<link rel="stylesheet" href="URL <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('URL </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Jocelyn (male) of the 365 Followers AI Bot </div> <div style="font-size: 15px; color: #657786">@j_j_j_j_j_jones bot</div> </div> I was made with huggingtweets. Create your own bot based on your favorite user with the demo! ## How does it work? The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. ## Training data The model was trained on @j_j_j_j_j_jones's tweets. <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3225</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>320</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>482</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>2423</td> </tr> </tbody> </table> Explore the data, which is tracked with W&B artifacts at every step of the pipeline. ## Training procedure The model is based on a pre-trained GPT-2 which is fine-tuned on @j_j_j_j_j_jones's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/j_j_j_j_j_jones'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> ![Follow](URL <section class='prose'> For more details, visit the project repository. </section> ![GitHub stars](URL
[ "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @j_j_j_j_j_jones's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3225</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>320</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>482</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>2423</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @j_j_j_j_j_jones's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/j_j_j_j_j_jones'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @j_j_j_j_j_jones's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3225</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>320</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>482</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>2423</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @j_j_j_j_j_jones's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/j_j_j_j_j_jones'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
text-generation
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1115644092329758721/AFjOr-K8_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">jack</div> <div style="text-align: center; font-size: 14px;">@jack</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from jack. | Data | jack | | --- | --- | | Tweets downloaded | 3231 | | Retweets | 1147 | | Short tweets | 817 | | Tweets kept | 1267 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/dibfzjll/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jack's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3f3e0roo) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3f3e0roo/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/jack') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "http://www.huggingtweets.com/jack/1653287961086/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/jack
null
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT jack @jack I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from jack. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @jack's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
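The full card's snippet, here with an optional `set_seed` call (an assumption for illustration, not part of the original card) to make sampling reproducible:

```python
from transformers import pipeline, set_seed

set_seed(42)  # optional: fix the RNG so repeated runs return the same samples

# Load the @jack fine-tuned GPT-2 checkpoint
generator = pipeline('text-generation', model='huggingtweets/jack')

generator("My dream is", num_return_sequences=5)
```

GPT-2 checkpoints typically ship task-specific sampling defaults, so `num_return_sequences=5` returns five distinct sampled continuations rather than failing under greedy decoding.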
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1314324082457018369/nPHIUIxe_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Jack Walsh 🤖 AI Bot </div> <div style="font-size: 15px">@jack_walshh bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@jack_walshh's tweets](https://twitter.com/jack_walshh). | Data | Quantity | | --- | --- | | Tweets downloaded | 1095 | | Retweets | 234 | | Short tweets | 121 | | Tweets kept | 740 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1o93caoq/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jack_walshh's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/23dq75x4) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/23dq75x4/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/jack_walshh') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/jack_walshh/1616646386178/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/jack_walshh
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Jack Walsh AI Bot @jack\_walshh bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @jack\_walshh's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @jack\_walshh's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
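The generation call from this record's full card, for reference:

```python
from transformers import pipeline

# Load the @jack_walshh fine-tuned GPT-2 checkpoint
generator = pipeline('text-generation', model='huggingtweets/jack_walshh')

generator("My dream is", num_return_sequences=5)
```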
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1251200537388695557/96JxUIrJ_400x400.jpg&#39;)"> </div> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1384243878748856321/vreel6UH_400x400.jpg&#39;)"> </div> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1417910390051246080/wKq6pjPR_400x400.jpg&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI CYBORG 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">DAN KOE & humble farmer & Jack Butcher</div> <div style="text-align: center; font-size: 14px;">@jackbutcher-paikcapital-thedankoe</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from DAN KOE & humble farmer & Jack Butcher. | Data | DAN KOE | humble farmer | Jack Butcher | | --- | --- | --- | --- | | Tweets downloaded | 3249 | 3247 | 3220 | | Retweets | 18 | 601 | 208 | | Short tweets | 899 | 500 | 1048 | | Tweets kept | 2332 | 2146 | 1964 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/mvqun4ol/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jackbutcher-paikcapital-thedankoe's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2qd8720q) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2qd8720q/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/jackbutcher-paikcapital-thedankoe') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. 
[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
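The usage snippet in the card above relies on the high-level `pipeline` helper. For finer control over decoding, the same checkpoint can be loaded explicitly. A minimal sketch, assuming the standard `transformers` auto classes; the sampling values below are illustrative choices, not settings from this training run:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained('huggingtweets/jackbutcher-paikcapital-thedankoe')
model = AutoModelForCausalLM.from_pretrained('huggingtweets/jackbutcher-paikcapital-thedankoe')

# Encode a prompt and sample one continuation
inputs = tokenizer("My dream is", return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_length=50,                         # illustrative cap, roughly tweet-sized
    do_sample=True,                        # sample instead of greedy decoding
    top_p=0.95,                            # nucleus sampling threshold (assumed value)
    pad_token_id=tokenizer.eos_token_id,   # GPT-2 has no dedicated pad token
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```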
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true", "widget": [{"text": "My dream is"}]}
huggingtweets/jackbutcher-paikcapital-thedankoe
null
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI CYBORG DAN KOE & humble farmer & Jack Butcher @jackbutcher-paikcapital-thedankoe I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from DAN KOE & humble farmer & Jack Butcher. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @jackbutcher-paikcapital-thedankoe's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<link rel="stylesheet" href="https://unpkg.com/@tailwindcss/typography@0.2.x/dist/typography.min.css"> <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('http://pbs.twimg.com/profile_images/726446881547517952/ULhSTKxN_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Jack Clark 🤖 AI Bot </div> <div style="font-size: 15px; color: #657786">@jackclarksf bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://bit.ly/2TGXMZf). ## Training data The model was trained on [@jackclarksf's tweets](https://twitter.com/jackclarksf). <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3216</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>603</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>187</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>2426</td> </tr> </tbody> </table> [Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/3r89xyps/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jackclarksf's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/3ovybsy5) for full transparency and reproducibility. At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/3ovybsy5/artifacts) is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/jackclarksf'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. 
## About *Built by Boris Dayma* </section> [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/borisdayma) <section class='prose'> For more details, visit the project repository. </section> [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
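Because generation is stochastic, repeated runs of the snippet above will differ unless the random seed is fixed. A small sketch, assuming `transformers.set_seed` (which seeds the Python, NumPy, and PyTorch RNGs); the seed value is arbitrary:

```python
from transformers import pipeline, set_seed

set_seed(42)  # arbitrary seed; makes the sampled outputs reproducible
generator = pipeline('text-generation', model='huggingtweets/jackclarksf')
print(generator("My dream is", num_return_sequences=5))
```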
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://github.com/borisdayma/huggingtweets/blob/master/img/logo_share.png?raw=true", "widget": [{"text": "My dream is"}]}
huggingtweets/jackclarksf
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
<link rel="stylesheet" href="URL <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('URL </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Jack Clark AI Bot </div> <div style="font-size: 15px; color: #657786">@jackclarksf bot</div> </div> I was made with huggingtweets. Create your own bot based on your favorite user with the demo! ## How does it work? The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. ## Training data The model was trained on @jackclarksf's tweets. <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3216</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>603</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>187</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>2426</td> </tr> </tbody> </table> Explore the data, which is tracked with W&B artifacts at every step of the pipeline. ## Training procedure The model is based on a pre-trained GPT-2 which is fine-tuned on @jackclarksf's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/jackclarksf'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> ![Follow](URL <section class='prose'> For more details, visit the project repository. </section> ![GitHub stars](URL
[ "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @jackclarksf's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3216</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>603</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>187</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>2426</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @jackclarksf's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/jackclarksf'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @jackclarksf's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3216</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>603</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>187</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>2426</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @jackclarksf's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/jackclarksf'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1316066224938332162/hVIofspH_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">JackGordon 🤖 AI Bot </div> <div style="font-size: 15px">@jackgordonyt bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@jackgordonyt's tweets](https://twitter.com/jackgordonyt). | Data | Quantity | | --- | --- | | Tweets downloaded | 660 | | Retweets | 146 | | Short tweets | 106 | | Tweets kept | 408 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3d7wzfbd/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jackgordonyt's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/fa0cjwj6) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/fa0cjwj6/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/jackgordonyt') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/jackgordonyt/1615830241451/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/jackgordonyt
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
JackGordon AI Bot @jackgordonyt bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @jackgordonyt's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @jackgordonyt's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1365531191470923776/iPBbGURg_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">JackieRacc_VTuber</div> <div style="text-align: center; font-size: 14px;">@jackieracc_</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from JackieRacc_VTuber. | Data | JackieRacc_VTuber | | --- | --- | | Tweets downloaded | 3249 | | Retweets | 252 | | Short tweets | 827 | | Tweets kept | 2170 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/gx7e8h18/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jackieracc_'s tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1cvwo68s) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1cvwo68s/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/jackieracc_') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/jackieracc_/1620680912006/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/jackieracc_
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT JackieRacc\_VTuber @jackieracc\_ I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from JackieRacc\_VTuber. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @jackieracc\_'s tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<link rel="stylesheet" href="https://unpkg.com/@tailwindcss/typography@0.2.x/dist/typography.min.css"> <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('http://pbs.twimg.com/profile_images/1026642891374874625/GPdw8p_L_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Jacknjellify 🤖 AI Bot </div> <div style="font-size: 15px; color: #657786">@jacknjellify bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@jacknjellify's tweets](https://twitter.com/jacknjellify). <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3103</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>1025</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>336</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>1742</td> </tr> </tbody> </table> [Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/nmeryp1f/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jacknjellify's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/3q5b8kag) for full transparency and reproducibility. At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/3q5b8kag/artifacts) is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/jacknjellify'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. 
## About *Built by Boris Dayma* </section> [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/borisdayma) <section class='prose'> For more details, visit the project repository. </section> [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
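Once downloaded, the checkpoint described above can be saved locally and reloaded without hitting the Hub again. A sketch, assuming write access to `./jacknjellify-model` (a directory name chosen purely for illustration):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = 'huggingtweets/jacknjellify'
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Persist both pieces; later generation only needs the local folder
save_dir = "./jacknjellify-model"  # illustrative path
tokenizer.save_pretrained(save_dir)
model.save_pretrained(save_dir)

# Reload entirely from disk
model = AutoModelForCausalLM.from_pretrained(save_dir)
```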
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://github.com/borisdayma/huggingtweets/blob/master/img/logo_share.png?raw=true", "widget": [{"text": "My dream is"}]}
huggingtweets/jacknjellify
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
<link rel="stylesheet" href="URL <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('URL </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Jacknjellify AI Bot </div> <div style="font-size: 15px; color: #657786">@jacknjellify bot</div> </div> I was made with huggingtweets. Create your own bot based on your favorite user with the demo! ## How does it work? The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. ## Training data The model was trained on @jacknjellify's tweets. <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3103</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>1025</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>336</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>1742</td> </tr> </tbody> </table> Explore the data, which is tracked with W&B artifacts at every step of the pipeline. ## Training procedure The model is based on a pre-trained GPT-2 which is fine-tuned on @jacknjellify's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/jacknjellify'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> ![Follow](URL <section class='prose'> For more details, visit the project repository. </section> ![GitHub stars](URL
[ "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @jacknjellify's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3103</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>1025</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>336</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>1742</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @jacknjellify's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/jacknjellify'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @jacknjellify's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3103</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>1025</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>336</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>1742</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @jacknjellify's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/jacknjellify'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
text-generation
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1418813091140227072/iXDCqBz0_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Jack Posobiec 🇺🇸</div> <div style="text-align: center; font-size: 14px;">@jackposobiec</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Jack Posobiec 🇺🇸. | Data | Jack Posobiec 🇺🇸 | | --- | --- | | Tweets downloaded | 3246 | | Retweets | 818 | | Short tweets | 511 | | Tweets kept | 1917 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3s4mnium/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jackposobiec's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2vllrmfa) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2vllrmfa/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/jackposobiec') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/jackposobiec/1630169093455/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/jackposobiec
null
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT Jack Posobiec 🇺🇸 @jackposobiec I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Jack Posobiec 🇺🇸. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @jackposobiec's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1523106752668966913/tWNV2zbS_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">jacksfilms🌹</div> <div style="text-align: center; font-size: 14px;">@jacksfilms</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from jacksfilms🌹. | Data | jacksfilms🌹 | | --- | --- | | Tweets downloaded | 3249 | | Retweets | 97 | | Short tweets | 444 | | Tweets kept | 2708 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1hsenlsv/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jacksfilms's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2ow20675) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2ow20675/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/jacksfilms') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "http://www.huggingtweets.com/jacksfilms/1653095886748/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/jacksfilms
null
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT jacksfilms @jacksfilms I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from jacksfilms. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @jacksfilms's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<link rel="stylesheet" href="https://unpkg.com/@tailwindcss/typography@0.2.x/dist/typography.min.css"> <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1284100202421342209/MVXATULR_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Day6 Jae 🤖 AI Bot </div> <div style="font-size: 15px; color: #657786">@jae_day6 bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@jae_day6's tweets](https://twitter.com/jae_day6). <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3229</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>123</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>1021</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>2085</td> </tr> </tbody> </table> [Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/3lpvhxwq/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jae_day6's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/3vyjrutx) for full transparency and reproducibility. At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/3vyjrutx/artifacts) is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/jae_day6'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. 
## About *Built by Boris Dayma* </section> [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) <section class='prose'> For more details, visit the project repository. </section> [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
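Since the model above was trained on tweets, its generations often read best when trimmed to a single tweet-length line. A small sketch of that post-processing; the trimming rule is just one reasonable choice, not part of the card:

```python
from transformers import pipeline

generator = pipeline('text-generation', model='huggingtweets/jae_day6')
out = generator("My dream is", max_length=60, do_sample=True)[0]["generated_text"]
tweet = out.split("\n")[0][:280]  # keep the first line, capped at Twitter's 280 chars
print(tweet)
```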
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/jae_day6/1601274497991/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/jae_day6
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
<link rel="stylesheet" href="URL <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('URL </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Day6 Jae AI Bot </div> <div style="font-size: 15px; color: #657786">@jae_day6 bot</div> </div> I was made with huggingtweets. Create your own bot based on your favorite user with the demo! ## How does it work? The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. ## Training data The model was trained on @jae_day6's tweets. <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3229</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>123</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>1021</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>2085</td> </tr> </tbody> </table> Explore the data, which is tracked with W&B artifacts at every step of the pipeline. ## Training procedure The model is based on a pre-trained GPT-2 which is fine-tuned on @jae_day6's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/jae_day6'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> ![Follow](URL <section class='prose'> For more details, visit the project repository. </section> ![GitHub stars](URL
[ "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @jae_day6's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3229</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>123</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>1021</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>2085</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @jae_day6's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/jae_day6'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @jae_day6's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3229</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>123</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>1021</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>2085</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @jae_day6's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/jae_day6'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
text-generation
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1410183697534439426/Db5MDUaw_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Programo, luego existo</div> <div style="text-align: center; font-size: 14px;">@jagedn</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Programo, luego existo. | Data | Programo, luego existo | | --- | --- | | Tweets downloaded | 3244 | | Retweets | 549 | | Short tweets | 220 | | Tweets kept | 2475 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/ptz28obp/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jagedn's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1i8g6srp) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1i8g6srp/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/jagedn') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/jagedn/1625062317603/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/jagedn
null
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT Programo, luego existo @jagedn I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Programo, luego existo. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @jagedn's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
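For reference, the generation snippet this section refers to, as given in the full card for this model (comments added for clarity):

```python
from transformers import pipeline

# Load the fine-tuned GPT-2 checkpoint from the Hugging Face Hub
generator = pipeline('text-generation', model='huggingtweets/jagedn')

# Generate five continuations of the prompt
generator("My dream is", num_return_sequences=5)
```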
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1374457518995349507/LPSYSW4N_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">certified cael moment™ 🔜 BLFC 🤖 AI Bot </div> <div style="font-size: 15px">@jaguarunlocked bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@jaguarunlocked's tweets](https://twitter.com/jaguarunlocked). | Data | Quantity | | --- | --- | | Tweets downloaded | 3176 | | Retweets | 1521 | | Short tweets | 203 | | Tweets kept | 1452 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2j5t38f8/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jaguarunlocked's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3n6tm7lj) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3n6tm7lj/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/jaguarunlocked') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/jaguarunlocked/1617770655879/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/jaguarunlocked
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
certified cael moment™ BLFC AI Bot @jaguarunlocked bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @jaguarunlocked's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @jaguarunlocked's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
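The pipeline call mentioned above, matching the snippet in the full card for this model:

```python
from transformers import pipeline

# Load the fine-tuned GPT-2 checkpoint from the Hugging Face Hub
generator = pipeline('text-generation', model='huggingtweets/jaguarunlocked')

# Generate five continuations of the prompt
generator("My dream is", num_return_sequences=5)
```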
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1374180612609880068/QkkHvC6R_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">jacob 🤖 AI Bot </div> <div style="font-size: 15px">@jakeaccino bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@jakeaccino's tweets](https://twitter.com/jakeaccino). | Data | Quantity | | --- | --- | | Tweets downloaded | 179 | | Retweets | 8 | | Short tweets | 53 | | Tweets kept | 118 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/239ufxkc/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jakeaccino's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3myo5k1y) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3myo5k1y/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/jakeaccino') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true", "widget": [{"text": "My dream is"}]}
huggingtweets/jakeaccino
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
jacob AI Bot @jakeaccino bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @jakeaccino's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @jakeaccino's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
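For reference, the generation call this card describes, reproduced from the full card for this model:

```python
from transformers import pipeline

# Load the fine-tuned GPT-2 checkpoint from the Hugging Face Hub
generator = pipeline('text-generation', model='huggingtweets/jakeaccino')

# Generate five continuations of the prompt
generator("My dream is", num_return_sequences=5)
```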
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<link rel="stylesheet" href="https://unpkg.com/@tailwindcss/typography@0.2.x/dist/typography.min.css"> <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('http://pbs.twimg.com/profile_images/1253134985948614657/xN4lDF3W_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">James Cham ✍🏻 🤖 AI Bot </div> <div style="font-size: 15px; color: #657786">@jamescham bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://bit.ly/2TGXMZf). ## Training data The model was trained on [@jamescham's tweets](https://twitter.com/jamescham). <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3213</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>744</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>317</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>2152</td> </tr> </tbody> </table> [Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/20ku8js2/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jamescham's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/32to3ioi) for full transparency and reproducibility. At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/32to3ioi/artifacts) is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/jamescham'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. 
## About *Built by Boris Dayma* </section> [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/borisdayma) <section class='prose'> For more details, visit the project repository. </section> [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://github.com/borisdayma/huggingtweets/blob/master/img/logo_share.png?raw=true", "widget": [{"text": "My dream is"}]}
huggingtweets/jamescham
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
<link rel="stylesheet" href="URL <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('URL </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">James Cham AI Bot </div> <div style="font-size: 15px; color: #657786">@jamescham bot</div> </div> I was made with huggingtweets. Create your own bot based on your favorite user with the demo! ## How does it work? The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. ## Training data The model was trained on @jamescham's tweets. <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3213</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>744</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>317</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>2152</td> </tr> </tbody> </table> Explore the data, which is tracked with W&B artifacts at every step of the pipeline. ## Training procedure The model is based on a pre-trained GPT-2 which is fine-tuned on @jamescham's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/jamescham'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> ![Follow](URL <section class='prose'> For more details, visit the project repository. </section> ![GitHub stars](URL
[ "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @jamescham's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3213</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>744</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>317</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>2152</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @jamescham's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/jamescham'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @jamescham's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3213</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>744</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>317</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>2152</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @jamescham's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/jamescham'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
text-generation
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1420806762408464385/10y3M0iO_400x400.jpg&#39;)"> </div> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1324782032124215296/HMG6-q8g_400x400.jpg&#39;)"> </div> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1401837042934468611/okzqIoMb_400x400.jpg&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI CYBORG 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">CANCELLED & James Charles & Logan Paul</div> <div style="text-align: center; font-size: 14px;">@jamescharles-loganpaul-tanamongeau</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from CANCELLED & James Charles & Logan Paul. | Data | CANCELLED | James Charles | Logan Paul | | --- | --- | --- | --- | | Tweets downloaded | 3167 | 3182 | 3246 | | Retweets | 938 | 480 | 98 | | Short tweets | 522 | 496 | 287 | | Tweets kept | 1707 | 2206 | 2861 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2avr905u/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jamescharles-loganpaul-tanamongeau's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2at101p1) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2at101p1/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/jamescharles-loganpaul-tanamongeau') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. 
[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/jamescharles-loganpaul-tanamongeau/1631598787303/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/jamescharles-loganpaul-tanamongeau
null
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI CYBORG CANCELLED & James Charles & Logan Paul @jamescharles-loganpaul-tanamongeau I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from CANCELLED & James Charles & Logan Paul. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @jamescharles-loganpaul-tanamongeau's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
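The text-generation call referenced above, as it appears in the full card for this model:

```python
from transformers import pipeline

# Load the fine-tuned GPT-2 checkpoint from the Hugging Face Hub
generator = pipeline('text-generation',
                     model='huggingtweets/jamescharles-loganpaul-tanamongeau')

# Generate five continuations of the prompt
generator("My dream is", num_return_sequences=5)
```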
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/958932211973152769/FUpkmn4u_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">James Clear 🤖 AI Bot </div> <div style="font-size: 15px">@jamesclear bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@jamesclear's tweets](https://twitter.com/jamesclear). | Data | Quantity | | --- | --- | | Tweets downloaded | 3247 | | Retweets | 190 | | Short tweets | 385 | | Tweets kept | 2672 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2hvyoab9/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jamesclear's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/v67076s3) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/v67076s3/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/jamesclear') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/jamesclear/1616666243525/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/jamesclear
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
James Clear AI Bot @jamesclear bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @jamesclear's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @jamesclear's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
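For reference, the pipeline snippet this card refers to, reproduced from the full card for this model:

```python
from transformers import pipeline

# Load the fine-tuned GPT-2 checkpoint from the Hugging Face Hub
generator = pipeline('text-generation', model='huggingtweets/jamesclear')

# Generate five continuations of the prompt
generator("My dream is", num_return_sequences=5)
```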
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1309510427240534022/Us-RCD-5_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">James Hutton 🤖 AI Bot </div> <div style="font-size: 15px">@jameshuttonphil bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@jameshuttonphil's tweets](https://twitter.com/jameshuttonphil). | Data | Quantity | | --- | --- | | Tweets downloaded | 648 | | Retweets | 25 | | Short tweets | 89 | | Tweets kept | 534 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/bamdk9dm/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jameshuttonphil's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2jp3j37a) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2jp3j37a/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/jameshuttonphil') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/jameshuttonphil/1617296338533/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/jameshuttonphil
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
James Hutton AI Bot @jameshuttonphil bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @jameshuttonphil's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @jameshuttonphil's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
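The generation call described above, matching the snippet in the full card for this model:

```python
from transformers import pipeline

# Load the fine-tuned GPT-2 checkpoint from the Hugging Face Hub
generator = pipeline('text-generation', model='huggingtweets/jameshuttonphil')

# Generate five continuations of the prompt
generator("My dream is", num_return_sequences=5)
```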
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1040878369594896384/eusyG8Np_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">James Sherlock 🤖 AI Bot </div> <div style="font-size: 15px">@jamespsherlock bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@jamespsherlock's tweets](https://twitter.com/jamespsherlock). | Data | Quantity | | --- | --- | | Tweets downloaded | 743 | | Retweets | 260 | | Short tweets | 44 | | Tweets kept | 439 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2ulatc4k/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jamespsherlock's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1btltx5f) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1btltx5f/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/jamespsherlock') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/jamespsherlock/1616781166201/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/jamespsherlock
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
James Sherlock AI Bot @jamespsherlock bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @jamespsherlock's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @jamespsherlock's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
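For reference, the pipeline usage this section refers to, as given in the full card for this model:

```python
from transformers import pipeline

# Load the fine-tuned GPT-2 checkpoint from the Hugging Face Hub
generator = pipeline('text-generation', model='huggingtweets/jamespsherlock')

# Generate five continuations of the prompt
generator("My dream is", num_return_sequences=5)
```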
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1375502415240122373/JO1DArJT_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Jamila Husain</div> <div style="text-align: center; font-size: 14px;">@jamz5251</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Jamila Husain. | Data | Jamila Husain | | --- | --- | | Tweets downloaded | 3234 | | Retweets | 900 | | Short tweets | 65 | | Tweets kept | 2269 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/r9z40rld/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jamz5251's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/20gadkdv) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/20gadkdv/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/jamz5251') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/jamz5251/1622370618440/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/jamz5251
null
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT Jamila Husain @jamz5251 I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Jamila Husain. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @jamz5251's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
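The generation snippet referenced above, reproduced from the full card for this model:

```python
from transformers import pipeline

# Load the fine-tuned GPT-2 checkpoint from the Hugging Face Hub
generator = pipeline('text-generation', model='huggingtweets/jamz5251')

# Generate five continuations of the prompt
generator("My dream is", num_return_sequences=5)
```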
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1536389142287892481/N6kCwACw_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Columbine Janie</div> <div style="text-align: center; font-size: 14px;">@janieclone</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Columbine Janie. | Data | Columbine Janie | | --- | --- | | Tweets downloaded | 3072 | | Retweets | 1211 | | Short tweets | 462 | | Tweets kept | 1399 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1divgffx/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @janieclone's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1ic6ynmd) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1ic6ynmd/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/janieclone') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true", "widget": [{"text": "My dream is"}]}
huggingtweets/janieclone
null
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT Columbine Janie @janieclone I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Columbine Janie. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @janieclone's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
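For reference, the pipeline call this card describes, as given in the full card for this model:

```python
from transformers import pipeline

# Load the fine-tuned GPT-2 checkpoint from the Hugging Face Hub
generator = pipeline('text-generation', model='huggingtweets/janieclone')

# Generate five continuations of the prompt
generator("My dream is", num_return_sequences=5)
```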
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1455690959132532738/Z4UvDtLA_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Poolgirl Janie Diamond</div> <div style="text-align: center; font-size: 14px;">@janiedied</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Poolgirl Janie Diamond. | Data | Poolgirl Janie Diamond | | --- | --- | | Tweets downloaded | 1505 | | Retweets | 552 | | Short tweets | 283 | | Tweets kept | 670 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3232onrl/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @janiedied's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/smx9pf1l) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/smx9pf1l/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/janiedied') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "http://www.huggingtweets.com/janiedied/1645111847557/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/janiedied
null
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT Poolgirl Janie Diamond @janiedied I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Poolgirl Janie Diamond. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @janiedied's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<link rel="stylesheet" href="https://unpkg.com/@tailwindcss/typography@0.2.x/dist/typography.min.css"> <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/2158604209/feuilles_400x400.png')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Au Jardin 🤖 AI Bot </div> <div style="font-size: 15px; color: #657786">@jardininfo bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@jardininfo's tweets](https://twitter.com/jardininfo). <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3200</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>0</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>375</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>2825</td> </tr> </tbody> </table> [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/48yjj01v/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jardininfo's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3t0scjqn) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3t0scjqn/artifacts) is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/jardininfo'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. 
## About *Built by Boris Dayma* </section> [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) <section class='prose'> For more details, visit the project repository. </section> [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
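Since the generator samples stochastically, repeated calls return different tweets. As a small hedged addition (not part of the original card): `transformers` ships a `set_seed` helper that makes runs reproducible, which can be useful when comparing outputs across the bots in this collection:

```python
from transformers import pipeline, set_seed

set_seed(42)  # fix the RNG so repeated runs produce identical samples
generator = pipeline('text-generation', model='huggingtweets/jardininfo')
generator("My dream is", num_return_sequences=5)
```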
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/jardininfo/1610568803876/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/jardininfo
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
<link rel="stylesheet" href="URL <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('URL </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Au Jardin AI Bot </div> <div style="font-size: 15px; color: #657786">@jardininfo bot</div> </div> I was made with huggingtweets. Create your own bot based on your favorite user with the demo! ## How does it work? The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. ## Training data The model was trained on @jardininfo's tweets. <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3200</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>0</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>375</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>2825</td> </tr> </tbody> </table> Explore the data, which is tracked with W&B artifacts at every step of the pipeline. ## Training procedure The model is based on a pre-trained GPT-2 which is fine-tuned on @jardininfo's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/jardininfo'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> ![Follow](URL <section class='prose'> For more details, visit the project repository. </section> ![GitHub stars](URL
[ "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @jardininfo's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3200</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>0</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>375</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>2825</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @jardininfo's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/jardininfo'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @jardininfo's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3200</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>0</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>375</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>2825</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @jardininfo's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/jardininfo'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1360504508678246401/WpE9tJiC_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Jason Chen 🤖 AI Bot </div> <div style="font-size: 15px">@jasonchen0325 bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@jasonchen0325's tweets](https://twitter.com/jasonchen0325). | Data | Quantity | | --- | --- | | Tweets downloaded | 354 | | Retweets | 30 | | Short tweets | 9 | | Tweets kept | 315 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1wsqq8bl/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jasonchen0325's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/w2gqjxjr) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/w2gqjxjr/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/jasonchen0325') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/jasonchen0325/1616715271094/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/jasonchen0325
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Jason Chen AI Bot @jasonchen0325 bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @jasonchen0325's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @jasonchen0325's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/757580607882948608/-KXkY5qL_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">J.A. Sutherland SciFi Books 🤖 AI Bot </div> <div style="font-size: 15px">@jasutherlandbks bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@jasutherlandbks's tweets](https://twitter.com/jasutherlandbks). | Data | Quantity | | --- | --- | | Tweets downloaded | 3192 | | Retweets | 952 | | Short tweets | 169 | | Tweets kept | 2071 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/210hhn5z/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jasutherlandbks's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/23qtgnsl) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/23qtgnsl/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/jasutherlandbks') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/jasutherlandbks/1616634473974/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/jasutherlandbks
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
J.A. Sutherland SciFi Books AI Bot @jasutherlandbks bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @jasutherlandbks's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @jasutherlandbks's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1370572501269553152/Tl_3viV2_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">🧠Jattazo Shin🧠 !!COMMISSIONS OPEN!!</div> <div style="text-align: center; font-size: 14px;">@jattazo</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from 🧠Jattazo Shin🧠 !!COMMISSIONS OPEN!!. | Data | 🧠Jattazo Shin🧠 !!COMMISSIONS OPEN!! | | --- | --- | | Tweets downloaded | 3243 | | Retweets | 196 | | Short tweets | 757 | | Tweets kept | 2290 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/oc8tbgql/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jattazo's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/7n3lt4bb) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/7n3lt4bb/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/jattazo') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/jattazo/1620679164511/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/jattazo
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT Jattazo Shin !!COMMISSIONS OPEN!! @jattazo I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Jattazo Shin !!COMMISSIONS OPEN!!. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @jattazo's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1356081411762040834/9L5GPrEi_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">🧠Jattazo Shin🧠 🤖 AI Bot </div> <div style="font-size: 15px">@jattazoshin bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@jattazoshin's tweets](https://twitter.com/jattazoshin). | Data | Quantity | | --- | --- | | Tweets downloaded | 2768 | | Retweets | 179 | | Short tweets | 414 | | Tweets kept | 2175 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2gto8yaa/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jattazoshin's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1gdg6xx3) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1gdg6xx3/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/jattazoshin') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/jattazoshin/1613105660546/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/jattazoshin
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Jattazo Shin AI Bot @jattazoshin bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @jattazoshin's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @jattazoshin's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1377879160173993987/20XH6CdP_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Cool Narcissist 🤖 AI Bot </div> <div style="font-size: 15px">@java_jigga bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@java_jigga's tweets](https://twitter.com/java_jigga). | Data | Quantity | | --- | --- | | Tweets downloaded | 3246 | | Retweets | 313 | | Short tweets | 426 | | Tweets kept | 2507 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/kvpyc8u1/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @java_jigga's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/6p3ishch) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/6p3ishch/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/java_jigga') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/java_jigga/1617788084385/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/java_jigga
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Cool Narcissist AI Bot @java\_jigga bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @java\_jigga's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @java\_jigga's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1329236115325390850/L6QYc5Qd_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Javi Ballester 🤖 AI Bot </div> <div style="font-size: 15px">@javiballester4 bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@javiballester4's tweets](https://twitter.com/javiballester4). | Data | Quantity | | --- | --- | | Tweets downloaded | 369 | | Retweets | 4 | | Short tweets | 108 | | Tweets kept | 257 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/33xklndf/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @javiballester4's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/s6kbzp61) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/s6kbzp61/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/javiballester4') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/javiballester4/1616630748333/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/javiballester4
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Javi Ballester AI Bot @javiballester4 bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @javiballester4's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @javiballester4's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1355323445622558725/vl1-gUcf_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Javierhalamadrid 🤖 AI Bot </div> <div style="font-size: 15px">@javierito321 bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@javierito321's tweets](https://twitter.com/javierito321). | Data | Quantity | | --- | --- | | Tweets downloaded | 3243 | | Retweets | 144 | | Short tweets | 90 | | Tweets kept | 3009 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/36rnr3vs/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @javierito321's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/bj56pvfw) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/bj56pvfw/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/javierito321') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/javierito321/1617016242704/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/javierito321
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Javierhalamadrid AI Bot @javierito321 bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @javierito321's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @javierito321's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<link rel="stylesheet" href="https://unpkg.com/@tailwindcss/typography@0.2.x/dist/typography.min.css"> <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/922569683420794880/bk2ERDe2_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Gabor Javorszky 🤖 AI Bot </div> <div style="font-size: 15px; color: #657786">@javorszky bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@javorszky's tweets](https://twitter.com/javorszky). <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3137</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>2139</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>67</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>931</td> </tr> </tbody> </table> [Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/1cyr2cuz/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @javorszky's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/2sa503ur) for full transparency and reproducibility. At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/2sa503ur/artifacts) is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/javorszky'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. 
## About

*Built by Boris Dayma*

</section>

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

<section class='prose'>
For more details, visit the project repository.
</section>

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/javorszky/1602234108282/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/javorszky
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
<link rel="stylesheet" href="URL <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('URL </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Gabor Javorszky AI Bot </div> <div style="font-size: 15px; color: #657786">@javorszky bot</div> </div> I was made with huggingtweets. Create your own bot based on your favorite user with the demo! ## How does it work? The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. ## Training data The model was trained on @javorszky's tweets. <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3137</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>2139</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>67</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>931</td> </tr> </tbody> </table> Explore the data, which is tracked with W&B artifacts at every step of the pipeline. ## Training procedure The model is based on a pre-trained GPT-2 which is fine-tuned on @javorszky's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/javorszky'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> ![Follow](URL <section class='prose'> For more details, visit the project repository. </section> ![GitHub stars](URL
[ "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @javorszky's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3137</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>2139</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>67</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>931</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @javorszky's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/javorszky'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @javorszky's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3137</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>2139</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>67</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>931</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @javorszky's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/javorszky'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
text-generation
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1325460517922729984/xDO9dBt-_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Jay Alammar</div> <div style="text-align: center; font-size: 14px;">@jayalammar</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Jay Alammar. | Data | Jay Alammar | | --- | --- | | Tweets downloaded | 692 | | Retweets | 198 | | Short tweets | 35 | | Tweets kept | 459 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1wf3zug3/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jayalammar's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/hq8g8xlh) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/hq8g8xlh/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/jayalammar') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "http://www.huggingtweets.com/jayalammar/1638460288971/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/jayalammar
null
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT Jay Alammar @jayalammar I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Jay Alammar. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @jayalammar's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
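The flattened card above drops the code block that follows "You can use this model directly with a pipeline for text generation:". A minimal sketch, restored from this record's full markdown text (model id `huggingtweets/jayalammar`):

```python
from transformers import pipeline

# Load the GPT-2 checkpoint fine-tuned on @jayalammar's tweets
generator = pipeline('text-generation', model='huggingtweets/jayalammar')

# Generate five continuations of the prompt
generator("My dream is", num_return_sequences=5)
```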
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1345079087337938944/tUHfuOi2_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Jasmine Persephone ☭ Black Podcast Revolution 🤖 AI Bot </div> <div style="font-size: 15px">@jazzpomegranate bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@jazzpomegranate's tweets](https://twitter.com/jazzpomegranate). | Data | Quantity | | --- | --- | | Tweets downloaded | 3208 | | Retweets | 184 | | Short tweets | 720 | | Tweets kept | 2304 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/312m9owm/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jazzpomegranate's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/jvni6p8a) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/jvni6p8a/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/jazzpomegranate') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/jazzpomegranate/1614106581220/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/jazzpomegranate
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Jasmine Persephone Black Podcast Revolution AI Bot @jazzpomegranate bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @jazzpomegranate's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @jazzpomegranate's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
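As with the record above, the flattening stripped the usage snippet; restored from this record's full card (model id `huggingtweets/jazzpomegranate`):

```python
from transformers import pipeline

# Fine-tuned GPT-2 for @jazzpomegranate
generator = pipeline('text-generation', model='huggingtweets/jazzpomegranate')
generator("My dream is", num_return_sequences=5)
```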
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/63826202/jon_buffalo_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Jon Beasley-Murray 🤖 AI Bot </div> <div style="font-size: 15px">@jbmurray bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@jbmurray's tweets](https://twitter.com/jbmurray). | Data | Quantity | | --- | --- | | Tweets downloaded | 2861 | | Retweets | 364 | | Short tweets | 260 | | Tweets kept | 2237 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1yppksvx/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jbmurray's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/dqw1zvsq) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/dqw1zvsq/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/jbmurray') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/jbmurray/1617246417542/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/jbmurray
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Jon Beasley-Murray AI Bot @jbmurray bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @jbmurray's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @jbmurray's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
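The generation call omitted from the flattened text, taken from the record's full card (model id `huggingtweets/jbmurray`):

```python
from transformers import pipeline

# Fine-tuned GPT-2 for @jbmurray
generator = pipeline('text-generation', model='huggingtweets/jbmurray')
generator("My dream is", num_return_sequences=5)
```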
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/948678870990954496/5moZ7K0__400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Jordan Peterson Quotes 🤖 AI Bot </div> <div style="font-size: 15px">@jbpetersonquote bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@jbpetersonquote's tweets](https://twitter.com/jbpetersonquote). | Data | Quantity | | --- | --- | | Tweets downloaded | 1983 | | Retweets | 605 | | Short tweets | 47 | | Tweets kept | 1331 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1n1ihdfe/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jbpetersonquote's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1qijh16v) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1qijh16v/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/jbpetersonquote') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/jbpetersonquote/1620104584619/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/jbpetersonquote
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Jordan Peterson Quotes AI Bot @jbpetersonquote bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @jbpetersonquote's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @jbpetersonquote's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
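The usage snippet dropped during flattening, restored from the full card above (model id `huggingtweets/jbpetersonquote`):

```python
from transformers import pipeline

# Fine-tuned GPT-2 for @jbpetersonquote
generator = pipeline('text-generation', model='huggingtweets/jbpetersonquote')
generator("My dream is", num_return_sequences=5)
```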
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1198536241769267200/6qNt8FmJ_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">jacob 🤖 AI Bot </div> <div style="font-size: 15px">@jcbdwsn bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@jcbdwsn's tweets](https://twitter.com/jcbdwsn). | Data | Quantity | | --- | --- | | Tweets downloaded | 3068 | | Retweets | 568 | | Short tweets | 735 | | Tweets kept | 1765 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1upu09rs/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jcbdwsn's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/iw1elh48) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/iw1elh48/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/jcbdwsn') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/jcbdwsn/1616697091563/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/jcbdwsn
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
jacob AI Bot @jcbdwsn bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @jcbdwsn's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @jcbdwsn's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
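The pipeline call referenced but missing in the flattened text, restored from the record's full card (model id `huggingtweets/jcbdwsn`):

```python
from transformers import pipeline

# Fine-tuned GPT-2 for @jcbdwsn
generator = pipeline('text-generation', model='huggingtweets/jcbdwsn')
generator("My dream is", num_return_sequences=5)
```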
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1343045735936245768/cRORtnYI_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">James Medlock 🤖 AI Bot </div> <div style="font-size: 15px">@jdcmedlock bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@jdcmedlock's tweets](https://twitter.com/jdcmedlock). | Data | Quantity | | --- | --- | | Tweets downloaded | 3249 | | Retweets | 186 | | Short tweets | 598 | | Tweets kept | 2465 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3hjp1k22/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jdcmedlock's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1dpg5f7d) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1dpg5f7d/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/jdcmedlock') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/jdcmedlock/1617465399388/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/jdcmedlock
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
James Medlock AI Bot @jdcmedlock bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @jdcmedlock's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @jdcmedlock's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
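The code block stripped from this flattened card, restored from the record's full text (model id `huggingtweets/jdcmedlock`):

```python
from transformers import pipeline

# Fine-tuned GPT-2 for @jdcmedlock
generator = pipeline('text-generation', model='huggingtweets/jdcmedlock')
generator("My dream is", num_return_sequences=5)
```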
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1363680905215291399/Bl--YnLP_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Jan Dogmart</div> <div style="text-align: center; font-size: 14px;">@jdogmart</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Jan Dogmart. | Data | Jan Dogmart | | --- | --- | | Tweets downloaded | 1333 | | Retweets | 106 | | Short tweets | 243 | | Tweets kept | 984 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/8hacy1dt/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jdogmart's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/uebjr2z5) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/uebjr2z5/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/jdogmart') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/jdogmart/1627065726745/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/jdogmart
null
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT Jan Dogmart @jdogmart I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Jan Dogmart. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @jdogmart's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
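The generation example omitted from the flattened text, restored from the full card above (model id `huggingtweets/jdogmart`):

```python
from transformers import pipeline

# Fine-tuned GPT-2 for @jdogmart
generator = pipeline('text-generation', model='huggingtweets/jdogmart')
generator("My dream is", num_return_sequences=5)
```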
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1379272172166860801/Ovdcs6RI_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">almond milk producer 🤖 AI Bot </div> <div style="font-size: 15px">@jeansingod bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@jeansingod's tweets](https://twitter.com/jeansingod). | Data | Quantity | | --- | --- | | Tweets downloaded | 3229 | | Retweets | 121 | | Short tweets | 928 | | Tweets kept | 2180 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2lclop5h/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jeansingod's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/h3lqu127) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/h3lqu127/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/jeansingod') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true", "widget": [{"text": "My dream is"}]}
huggingtweets/jeansingod
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
almond milk producer AI Bot @jeansingod bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @jeansingod's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @jeansingod's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
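The usage snippet dropped during flattening, restored from the record's full card (model id `huggingtweets/jeansingod`):

```python
from transformers import pipeline

# Fine-tuned GPT-2 for @jeansingod
generator = pipeline('text-generation', model='huggingtweets/jeansingod')
generator("My dream is", num_return_sequences=5)
```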
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1398500371828547586/h4mvrfn7_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Bigger than Joe, Smaller than Corn Pop</div> <div style="text-align: center; font-size: 14px;">@jeebustrump</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Bigger than Joe, Smaller than Corn Pop. | Data | Bigger than Joe, Smaller than Corn Pop | | --- | --- | | Tweets downloaded | 3250 | | Retweets | 196 | | Short tweets | 446 | | Tweets kept | 2608 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2zx4vfav/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jeebustrump's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/23spbbyg) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/23spbbyg/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/jeebustrump') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/jeebustrump/1628222373006/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/jeebustrump
null
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT Bigger than Joe, Smaller than Corn Pop @jeebustrump I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Bigger than Joe, Smaller than Corn Pop. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @jeebustrump's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
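The pipeline call missing from this flattened card, restored from the record's full text (model id `huggingtweets/jeebustrump`):

```python
from transformers import pipeline

# Fine-tuned GPT-2 for @jeebustrump
generator = pipeline('text-generation', model='huggingtweets/jeebustrump')
generator("My dream is", num_return_sequences=5)
```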
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1323369978247172097/HXimQ-3i_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">jeemers 🤖 AI Bot </div> <div style="font-size: 15px">@jeemstate bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@jeemstate's tweets](https://twitter.com/jeemstate). | Data | Quantity | | --- | --- | | Tweets downloaded | 3243 | | Retweets | 220 | | Short tweets | 400 | | Tweets kept | 2623 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3053t272/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jeemstate's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2vb2sesn) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2vb2sesn/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/jeemstate') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/jeemstate/1614134968037/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/jeemstate
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
jeemers AI Bot @jeemstate bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @jeemstate's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @jeemstate's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
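The code block stripped from the flattened text, restored from the full card above (model id `huggingtweets/jeemstate`):

```python
from transformers import pipeline

# Fine-tuned GPT-2 for @jeemstate
generator = pipeline('text-generation', model='huggingtweets/jeemstate')
generator("My dream is", num_return_sequences=5)
```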
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<link rel="stylesheet" href="https://unpkg.com/@tailwindcss/typography@0.2.x/dist/typography.min.css"> <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('http://pbs.twimg.com/profile_images/935325968280907776/AcBo6zJc_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Jeff Dean (@🏡) 🤖 AI Bot </div> <div style="font-size: 15px; color: #657786">@jeffdean bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://bit.ly/2TGXMZf). ## Training data The model was trained on [@jeffdean's tweets](https://twitter.com/jeffdean). <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3222</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>926</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>153</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>2143</td> </tr> </tbody> </table> [Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/2wbq7a5s/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jeffdean's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/1hpit9nl) for full transparency and reproducibility. At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/1hpit9nl/artifacts) is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/jeffdean'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. 
## About *Built by Boris Dayma* </section> [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/borisdayma) <section class='prose'> For more details, visit the project repository. </section> [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://github.com/borisdayma/huggingtweets/blob/master/img/logo_share.png?raw=true", "widget": [{"text": "My dream is"}]}
huggingtweets/jeffdean
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
<link rel="stylesheet" href="URL <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('URL </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Jeff Dean (@) AI Bot </div> <div style="font-size: 15px; color: #657786">@jeffdean bot</div> </div> I was made with huggingtweets. Create your own bot based on your favorite user with the demo! ## How does it work? The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. ## Training data The model was trained on @jeffdean's tweets. <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3222</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>926</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>153</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>2143</td> </tr> </tbody> </table> Explore the data, which is tracked with W&B artifacts at every step of the pipeline. ## Training procedure The model is based on a pre-trained GPT-2 which is fine-tuned on @jeffdean's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/jeffdean'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> ![Follow](URL <section class='prose'> For more details, visit the project repository. </section> ![GitHub stars](URL
[ "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @jeffdean's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3222</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>926</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>153</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>2143</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @jeffdean's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/jeffdean'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @jeffdean's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3222</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>926</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>153</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>2143</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @jeffdean's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/jeffdean'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/946265213795426304/WwqX2wdC_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Jeff 🤖 AI Bot </div> <div style="font-size: 15px">@jeffdeecee bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@jeffdeecee's tweets](https://twitter.com/jeffdeecee). | Data | Quantity | | --- | --- | | Tweets downloaded | 1918 | | Retweets | 424 | | Short tweets | 318 | | Tweets kept | 1176 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/9vmqstjp/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jeffdeecee's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1bwn2sjp) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1bwn2sjp/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/jeffdeecee') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/jeffdeecee/1616649023498/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/jeffdeecee
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Jeff AI Bot @jeffdeecee bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @jeffdeecee's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @jeffdeecee's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1280520627531780096/1kM9No10_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Jemma 🤖 AI Bot </div> <div style="font-size: 15px">@jematrics bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@jematrics's tweets](https://twitter.com/jematrics). | Data | Quantity | | --- | --- | | Tweets downloaded | 1860 | | Retweets | 131 | | Short tweets | 437 | | Tweets kept | 1292 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/mkienuu2/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jematrics's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2ztizvjf) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2ztizvjf/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/jematrics') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/jematrics/1617891433167/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/jematrics
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Jemma AI Bot @jematrics bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @jematrics's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @jematrics's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1407394105966088198/wb4S3Yea_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Jen 🌸</div> <div style="text-align: center; font-size: 14px;">@jen_122</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Jen 🌸. | Data | Jen 🌸 | | --- | --- | | Tweets downloaded | 3206 | | Retweets | 1337 | | Short tweets | 137 | | Tweets kept | 1732 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1ikjihay/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jen_122's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1by45dby) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1by45dby/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/jen_122') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/jen_122/1628851017758/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/jen_122
null
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT Jen @jen\_122 I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Jen . Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @jen\_122's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1406127247283548162/tOo7-e6j_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Jenny Nicholson</div> <div style="text-align: center; font-size: 14px;">@jennyenicholson</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Jenny Nicholson. | Data | Jenny Nicholson | | --- | --- | | Tweets downloaded | 3247 | | Retweets | 126 | | Short tweets | 252 | | Tweets kept | 2869 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3kptwa31/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jennyenicholson's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/37kyl0hh) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/37kyl0hh/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/jennyenicholson') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/jennyenicholson/1626298632174/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/jennyenicholson
null
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT Jenny Nicholson @jennyenicholson I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Jenny Nicholson. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @jennyenicholson's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1404750730670473221/dKZZf947_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Jens 🧲 | Email Marketing</div> <div style="text-align: center; font-size: 14px;">@jenslennartsson</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Jens 🧲 | Email Marketing. | Data | Jens 🧲 | Email Marketing | | --- | --- | | Tweets downloaded | 3250 | | Retweets | 316 | | Short tweets | 346 | | Tweets kept | 2588 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1kaofe1s/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jenslennartsson's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3mdvlzx0) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3mdvlzx0/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/jenslennartsson') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/jenslennartsson/1630958497152/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/jenslennartsson
null
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;URL </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> AI BOT </div> <div style="text-align: center; font-size: 16px; font-weight: 800">Jens | Email Marketing</div> <div style="text-align: center; font-size: 14px;">@jenslennartsson</div> </div> I was made with huggingtweets. Create your own bot based on your favorite user with the demo! ## How does it work? The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. ## Training data The model was trained on tweets from Jens | Email Marketing. | Data | Jens | Email Marketing | | --- | --- | | Tweets downloaded | 3250 | | Retweets | 316 | | Short tweets | 346 | | Tweets kept | 2588 | Explore the data, which is tracked with W&B artifacts at every step of the pipeline. ## Training procedure The model is based on a pre-trained GPT-2 which is fine-tuned on @jenslennartsson's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ## Limitations and bias The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[ "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on tweets from Jens | Email Marketing.\n\n| Data | Jens | Email Marketing |\n| --- | --- |\n| Tweets downloaded | 3250 |\n| Retweets | 316 |\n| Short tweets | 346 |\n| Tweets kept | 2588 |\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @jenslennartsson's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## How to use\n\nYou can use this model directly with a pipeline for text generation:", "## Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n![Follow](URL\n\nFor more details, visit the project repository.\n\n![GitHub stars](URL" ]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on tweets from Jens | Email Marketing.\n\n| Data | Jens | Email Marketing |\n| --- | --- |\n| Tweets downloaded | 3250 |\n| Retweets | 316 |\n| Short tweets | 346 |\n| Tweets kept | 2588 |\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @jenslennartsson's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## How to use\n\nYou can use this model directly with a pipeline for text generation:", "## Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n![Follow](URL\n\nFor more details, visit the project repository.\n\n![GitHub stars](URL" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1181722925482876929/LupG_6O4_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Jeremy 🤖 AI Bot </div> <div style="font-size: 15px">@jeremymmele bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@jeremymmele's tweets](https://twitter.com/jeremymmele). | Data | Quantity | | --- | --- | | Tweets downloaded | 2670 | | Retweets | 596 | | Short tweets | 135 | | Tweets kept | 1939 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/10aqw0np/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jeremymmele's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/31xfyhy7) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/31xfyhy7/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/jeremymmele') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/jeremymmele/1616718191129/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/jeremymmele
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Jeremy AI Bot @jeremymmele bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @jeremymmele's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @jeremymmele's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1296667294148382721/9Pr6XrPB_400x400.jpg&#39;)"> </div> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/2387565623/7gew8nz1z7ik1ch148so_400x400.jpeg&#39;)"> </div> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1279600070145437696/eocLhSLu_400x400.jpg&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI CYBORG 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Andrej Karpathy & Yann LeCun & Jeremy Howard</div> <div style="text-align: center; font-size: 14px;">@jeremyphoward-karpathy-ylecun</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Andrej Karpathy & Yann LeCun & Jeremy Howard. | Data | Andrej Karpathy | Yann LeCun | Jeremy Howard | | --- | --- | --- | --- | | Tweets downloaded | 3217 | 3249 | 3246 | | Retweets | 426 | 940 | 1437 | | Short tweets | 98 | 185 | 154 | | Tweets kept | 2693 | 2124 | 1655 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/qtj3s22r/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jeremyphoward-karpathy-ylecun's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/38rnlg1v) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/38rnlg1v/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/jeremyphoward-karpathy-ylecun') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. 
[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
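Because this checkpoint blends three accounts, it can be fun to compare completions across several prompts in one call; the pipeline accepts a list of prompts and returns one list of candidates per prompt (the prompts below are arbitrary examples):

```python
from transformers import pipeline

generator = pipeline('text-generation',
                     model='huggingtweets/jeremyphoward-karpathy-ylecun')

prompts = ["My dream is", "Deep learning is", "The best optimizer is"]  # arbitrary
for result in generator(prompts, max_length=40, do_sample=True):
    # with a list input, each element is itself a list of generated candidates
    print(result[0]['generated_text'])
```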
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/jeremyphoward-karpathy-ylecun/1620435583163/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/jeremyphoward-karpathy-ylecun
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI CYBORG Andrej Karpathy & Yann LeCun & Jeremy Howard @jeremyphoward-karpathy-ylecun I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Andrej Karpathy & Yann LeCun & Jeremy Howard. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @jeremyphoward-karpathy-ylecun's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<link rel="stylesheet" href="https://unpkg.com/@tailwindcss/typography@0.2.x/dist/typography.min.css"> <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('http://pbs.twimg.com/profile_images/1279600070145437696/eocLhSLu_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Jeremy Howard 🤖 AI Bot </div> <div style="font-size: 15px; color: #657786">@jeremyphoward bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@jeremyphoward's tweets](https://twitter.com/jeremyphoward). <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3226</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>1332</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>191</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>1703</td> </tr> </tbody> </table> [Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/1bhcd11t/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jeremyphoward's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/zfkvv5hj) for full transparency and reproducibility. At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/zfkvv5hj/artifacts) is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/jeremyphoward'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). 
In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/borisdayma) <section class='prose'> For more details, visit the project repository. </section> [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "http://res.cloudinary.com/huggingtweets/image/upload/v1599877649/jeremyphoward.jpg", "widget": [{"text": "My dream is"}]}
huggingtweets/jeremyphoward
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
<link rel="stylesheet" href="URL <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('URL </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Jeremy Howard AI Bot </div> <div style="font-size: 15px; color: #657786">@jeremyphoward bot</div> </div> I was made with huggingtweets. Create your own bot based on your favorite user with the demo! ## How does it work? The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. ## Training data The model was trained on @jeremyphoward's tweets. <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3226</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>1332</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>191</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>1703</td> </tr> </tbody> </table> Explore the data, which is tracked with W&B artifacts at every step of the pipeline. ## Training procedure The model is based on a pre-trained GPT-2 which is fine-tuned on @jeremyphoward's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/jeremyphoward'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> ![Follow](URL <section class='prose'> For more details, visit the project repository. </section> ![GitHub stars](URL
[ "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @jeremyphoward's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3226</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>1332</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>191</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>1703</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @jeremyphoward's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/jeremyphoward'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @jeremyphoward's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3226</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>1332</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>191</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>1703</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @jeremyphoward's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/jeremyphoward'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
text-generation
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1294479143904653312/qP7tP-nr_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Jessica Taylor</div> <div style="text-align: center; font-size: 14px;">@jessi_cata</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Jessica Taylor. | Data | Jessica Taylor | | --- | --- | | Tweets downloaded | 907 | | Retweets | 145 | | Short tweets | 12 | | Tweets kept | 750 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/125iwpq5/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jessi_cata's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1ba9qak3) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1ba9qak3/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/jessi_cata') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/jessi_cata/1622268778505/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/jessi_cata
null
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT Jessica Taylor @jessi\_cata I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Jessica Taylor. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @jessi\_cata's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/939573265005133824/TPJRo-bL_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Jessi rihanna (Top .00002%)</div> <div style="text-align: center; font-size: 14px;">@jessi_rihanna</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Jessi rihanna (Top .00002%). | Data | Jessi rihanna (Top .00002%) | | --- | --- | | Tweets downloaded | 3220 | | Retweets | 495 | | Short tweets | 209 | | Tweets kept | 2516 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/14pd4m51/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jessi_rihanna's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/p0syf1v9) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/p0syf1v9/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/jessi_rihanna') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/jessi_rihanna/1627956346427/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/jessi_rihanna
null
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT Jessi rihanna (Top .00002%) @jessi\_rihanna I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Jessi rihanna (Top .00002%). Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @jessi\_rihanna's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1331300707216068609/s4UcWg76_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">serial experiments maK 🤖 AI Bot </div> <div style="font-size: 15px">@jesusisathembo bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@jesusisathembo's tweets](https://twitter.com/jesusisathembo). | Data | Quantity | | --- | --- | | Tweets downloaded | 3161 | | Retweets | 1164 | | Short tweets | 243 | | Tweets kept | 1754 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1bqj16zu/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jesusisathembo's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/kuiuxq9x) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/kuiuxq9x/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/jesusisathembo') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/jesusisathembo/1614096400764/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/jesusisathembo
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
serial experiments maK AI Bot @jesusisathembo bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @jesusisathembo's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @jesusisathembo's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1260604010681139201/BhgTOxCS_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">الرأسمالية حرام ☭ 🤖 AI Bot </div> <div style="font-size: 15px">@jeveuxrien95 bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@jeveuxrien95's tweets](https://twitter.com/jeveuxrien95). | Data | Quantity | | --- | --- | | Tweets downloaded | 3143 | | Retweets | 2419 | | Short tweets | 124 | | Tweets kept | 600 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/ou9dif8u/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jeveuxrien95's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2ugpi1zi) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2ugpi1zi/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/jeveuxrien95') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/jeveuxrien95/1616675447899/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/jeveuxrien95
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
الرأسمالية حرام AI Bot @jeveuxrien95 bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @jeveuxrien95's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @jeveuxrien95's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1047129590630633472/sXDJOGMO_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">JF Carrasco 🤖 AI Bot </div> <div style="font-size: 15px">@jfcarrasco bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@jfcarrasco's tweets](https://twitter.com/jfcarrasco). | Data | Quantity | | --- | --- | | Tweets downloaded | 3218 | | Retweets | 1155 | | Short tweets | 407 | | Tweets kept | 1656 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/o2dxy46e/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jfcarrasco's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1f2jq47h) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1f2jq47h/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/jfcarrasco') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/jfcarrasco/1619525052137/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/jfcarrasco
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
JF Carrasco AI Bot @jfcarrasco bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @jfcarrasco's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @jfcarrasco's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1351964374672396289/AxM62cUt_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Jonathan Ichikawa 🤖 AI Bot </div> <div style="font-size: 15px">@jichikawa bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@jichikawa's tweets](https://twitter.com/jichikawa). | Data | Quantity | | --- | --- | | Tweets downloaded | 3247 | | Retweets | 149 | | Short tweets | 327 | | Tweets kept | 2771 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3i8ms47x/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jichikawa's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1gtjo12t) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1gtjo12t/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/jichikawa') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/jichikawa/1616722342889/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/jichikawa
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Jonathan Ichikawa AI Bot @jichikawa bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @jichikawa's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @jichikawa's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/626944793302581248/TxzPTAYL_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Jim Groom 🤖 AI Bot </div> <div style="font-size: 15px">@jimgroom bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@jimgroom's tweets](https://twitter.com/jimgroom). | Data | Quantity | | --- | --- | | Tweets downloaded | 3244 | | Retweets | 359 | | Short tweets | 327 | | Tweets kept | 2558 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/niqph9pw/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jimgroom's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2kwyza2p) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2kwyza2p/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/jimgroom') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/jimgroom/1617257800236/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/jimgroom
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Jim Groom AI Bot @jimgroom bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @jimgroom's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @jimgroom's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1418336753170239493/xADOlUfY_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Jimmy</div> <div style="text-align: center; font-size: 14px;">@jimlbsp</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Jimmy. | Data | Jimmy | | --- | --- | | Tweets downloaded | 3239 | | Retweets | 348 | | Short tweets | 383 | | Tweets kept | 2508 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/213jvoqo/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jimlbsp's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2rtm7jla) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2rtm7jla/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/jimlbsp') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true", "widget": [{"text": "My dream is"}]}
huggingtweets/jimlbsp
null
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT Jimmy @jimlbsp I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Jimmy. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @jimlbsp's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1265044655890149376/WpCp6n9e_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">J.K. Rowling 🤖 AI Bot </div> <div style="font-size: 15px">@jk_rowling bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@jk_rowling's tweets](https://twitter.com/jk_rowling). | Data | Quantity | | --- | --- | | Tweets downloaded | 3186 | | Retweets | 719 | | Short tweets | 190 | | Tweets kept | 2277 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/17fpax3c/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jk_rowling's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2362x8t2) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2362x8t2/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/jk_rowling') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
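The data table in the card above is internally consistent: the kept count equals tweets downloaded minus retweets minus short tweets, matching the filtering the pipeline describes. A one-line check (the variable names are ours, not the card's):

```python
downloaded, retweets, short = 3186, 719, 190
kept = downloaded - retweets - short
assert kept == 2277  # value reported in the @jk_rowling card
```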
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/jk_rowling/1617719835072/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/jk_rowling
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
J.K. Rowling AI Bot @jk\_rowling bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @jk\_rowling's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @jk\_rowling's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1361886758514827264/9c2tO5mH_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Jeffrey Mark Epstein 🤖 AI Bot </div> <div style="font-size: 15px">@jmlepstein bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@jmlepstein's tweets](https://twitter.com/jmlepstein). | Data | Quantity | | --- | --- | | Tweets downloaded | 687 | | Retweets | 136 | | Short tweets | 31 | | Tweets kept | 520 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2nouh38r/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jmlepstein's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/wraxaefs) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/wraxaefs/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/jmlepstein') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/jmlepstein/1616644934198/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/jmlepstein
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Jeffrey Mark Epstein AI Bot @jmlepstein bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @jmlepstein's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @jmlepstein's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<link rel="stylesheet" href="https://unpkg.com/@tailwindcss/typography@0.2.x/dist/typography.min.css"> <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1271116010637135878/NMVGOTdS_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Rafael Mourad 🤖 AI Bot </div> <div style="font-size: 15px; color: #657786">@jmourad bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@jmourad's tweets](https://twitter.com/jmourad). <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3217</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>1639</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>85</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>1493</td> </tr> </tbody> </table> [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/f3pk6pjw/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jmourad's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/181j4wxm) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/181j4wxm/artifacts) is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/jmourad'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. 
## About *Built by Boris Dayma* </section> [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) <section class='prose'> For more details, visit the project repository. </section> [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/jmourad/1610325650442/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/jmourad
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
<link rel="stylesheet" href="URL <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('URL </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Rafael Mourad AI Bot </div> <div style="font-size: 15px; color: #657786">@jmourad bot</div> </div> I was made with huggingtweets. Create your own bot based on your favorite user with the demo! ## How does it work? The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. ## Training data The model was trained on @jmourad's tweets. <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3217</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>1639</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>85</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>1493</td> </tr> </tbody> </table> Explore the data, which is tracked with W&B artifacts at every step of the pipeline. ## Training procedure The model is based on a pre-trained GPT-2 which is fine-tuned on @jmourad's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/jmourad'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> ![Follow](URL <section class='prose'> For more details, visit the project repository. </section> ![GitHub stars](URL
[ "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @jmourad's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3217</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>1639</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>85</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>1493</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @jmourad's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/jmourad'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @jmourad's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3217</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>1639</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>85</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>1493</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @jmourad's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/jmourad'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
text-generation
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1380530524779859970/TfwVAbyX_400x400.jpg&#39;)"> </div> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1308769664240160770/AfgzWVE7_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI CYBORG 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">President Biden & Joe Biden</div> <div style="text-align: center; font-size: 14px;">@joebiden-potus</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from President Biden & Joe Biden. | Data | President Biden | Joe Biden | | --- | --- | --- | | Tweets downloaded | 872 | 3250 | | Retweets | 32 | 384 | | Short tweets | 3 | 38 | | Tweets kept | 837 | 2828 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1c3s9vhj/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @joebiden-potus's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/tcstvtkt) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/tcstvtkt/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/joebiden-potus') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true", "widget": [{"text": "My dream is"}]}
huggingtweets/joebiden-potus
null
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "has_space", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us
AI CYBORG President Biden & Joe Biden @joebiden-potus I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from President Biden & Joe Biden. Data: Tweets downloaded, President Biden: 872, Joe Biden: 3250 Data: Retweets, President Biden: 32, Joe Biden: 384 Data: Short tweets, President Biden: 3, Joe Biden: 38 Data: Tweets kept, President Biden: 837, Joe Biden: 2828 Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @joebiden-potus's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us \n" ]
text-generation
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1308769664240160770/AfgzWVE7_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Joe Biden</div> <div style="text-align: center; font-size: 14px;">@joebiden</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Joe Biden. | Data | Joe Biden | | --- | --- | | Tweets downloaded | 3215 | | Retweets | 629 | | Short tweets | 31 | | Tweets kept | 2555 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/tbtim2bm/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @joebiden's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1w4wo0t6) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1w4wo0t6/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/joebiden') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "http://www.huggingtweets.com/joebiden/1662283925554/predictions.png", "widget": [{"text": "My dream is"}]}
huggingtweets/joebiden
null
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT Joe Biden @joebiden I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Joe Biden. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @joebiden's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
text-generation
transformers
<link rel="stylesheet" href="https://unpkg.com/@tailwindcss/typography@0.2.x/dist/typography.min.css"> <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('http://pbs.twimg.com/profile_images/1246455428894076933/4cFOl1LQ_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Joe Davison 🤖 AI Bot </div> <div style="font-size: 15px; color: #657786">@joeddav bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@joeddav's tweets](https://twitter.com/joeddav). <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>1557</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>632</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>67</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>858</td> </tr> </tbody> </table> [Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/2jn1y5u5/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @joeddav's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/2bbyx49y) for full transparency and reproducibility. At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/2bbyx49y/artifacts) is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/joeddav'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. 
## About *Built by Boris Dayma* </section> [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/borisdayma) <section class='prose'> For more details, visit the project repository. </section> [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "http://res.cloudinary.com/huggingtweets/image/upload/v1599921389/joeddav.jpg", "widget": [{"text": "My dream is"}]}
huggingtweets/joeddav
null
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
<link rel="stylesheet" href="URL <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('URL </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Joe Davison AI Bot </div> <div style="font-size: 15px; color: #657786">@joeddav bot</div> </div> I was made with huggingtweets. Create your own bot based on your favorite user with the demo! ## How does it work? The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. ## Training data The model was trained on @joeddav's tweets. <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>1557</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>632</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>67</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>858</td> </tr> </tbody> </table> Explore the data, which is tracked with W&B artifacts at every step of the pipeline. ## Training procedure The model is based on a pre-trained GPT-2 which is fine-tuned on @joeddav's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/joeddav'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> ![Follow](URL <section class='prose'> For more details, visit the project repository. </section> ![GitHub stars](URL
[ "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @joeddav's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>1557</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>632</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>67</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>858</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @joeddav's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/joeddav'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @joeddav's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>1557</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>632</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>67</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>858</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @joeddav's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/joeddav'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]