| Column | Type | Min | Max |
| --- | --- | --- | --- |
| modelId | string | length 5 | length 139 |
| author | string | length 2 | length 42 |
| last_modified | timestamp[us, tz=UTC] | 2020-02-15 11:33:14 | 2025-09-24 06:28:30 |
| downloads | int64 | 0 | 223M |
| likes | int64 | 0 | 11.7k |
| library_name | string | 573 classes | n/a |
| tags | list | length 1 | length 4.05k |
| pipeline_tag | string | 55 classes | n/a |
| createdAt | timestamp[us, tz=UTC] | 2022-03-02 23:29:04 | 2025-09-24 06:27:16 |
| card | string | length 11 | length 1.01M |
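As a quick orientation for working with rows that follow this schema, here is a minimal sketch that loads such a metadata dump with pandas and filters it. The file name `models.parquet` is an assumption (a hypothetical local export), not part of this dataset; adapt it to however you obtained the table.

```python
import pandas as pd

# Load a local export of the model-metadata table (hypothetical file name).
df = pd.read_parquet("models.parquet")

# Per the schema above: modelId/author/card are strings, downloads/likes are
# int64, last_modified/createdAt are UTC timestamps, tags is a list of strings.
huggingtweets = df[df["author"] == "huggingtweets"]

# Most-liked text-generation models by this author.
top = (
    huggingtweets[huggingtweets["pipeline_tag"] == "text-generation"]
    .sort_values("likes", ascending=False)
    .head(10)
)
print(top[["modelId", "downloads", "likes", "last_modified"]])
```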
huggingtweets/praticoslo
huggingtweets
2021-05-22T19:21:52Z
4
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
---
language: en
thumbnail: https://www.huggingtweets.com/praticoslo/1612438916423/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1315738168994533378/xJehqWDO_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Oslo City Sanjay 🤖 AI Bot </div>
<div style="font-size: 15px">@praticoslo bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@praticoslo's tweets](https://twitter.com/praticoslo).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3114 |
| Retweets | 1186 |
| Short tweets | 222 |
| Tweets kept | 1706 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2k3trssu/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @praticoslo's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/c8foylk6) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/c8foylk6/artifacts) is logged and versioned.

## Intended uses & limitations

### How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/praticoslo')
generator("My dream is", num_return_sequences=5)
```

### Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
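The "Training data" tables in all of these cards follow the same arithmetic: tweets kept = tweets downloaded minus retweets minus short tweets (e.g. 3114 - 1186 - 222 = 1706 above). The actual preprocessing lives in the huggingtweets repository; the function below is a minimal illustrative sketch of that filtering step, with the input format and threshold as assumptions.

```python
def filter_tweets(tweets, min_words=3):
    """Drop retweets and very short tweets; keep the rest for fine-tuning.

    `tweets` is assumed to be a list of dicts with a 'text' field. This is an
    illustrative reconstruction of the filtering described in the cards, not
    the project's actual code.
    """
    kept = []
    for tweet in tweets:
        text = tweet["text"]
        if text.startswith("RT @"):        # retweet: dropped
            continue
        if len(text.split()) < min_words:  # short tweet: dropped
            continue
        kept.append(text)
    return kept
```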
huggingtweets/prakash1729brt
huggingtweets
2021-05-22T19:20:25Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
---
language: en
thumbnail: https://www.huggingtweets.com/prakash1729brt/1601628881012/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1149024688116510721/YsVabUsx_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">prakash sellathurai 🤖 AI Bot </div>
<div style="font-size: 15px">@prakash1729brt bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@prakash1729brt's tweets](https://twitter.com/prakash1729brt).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 169 |
| Retweets | 43 |
| Short tweets | 39 |
| Tweets kept | 87 |

[Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/2ck8tafl/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @prakash1729brt's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/32hnix57) for full transparency and reproducibility.

At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/32hnix57/artifacts) is logged and versioned.

## Intended uses & limitations

### How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/prakash1729brt')
generator("My dream is", num_return_sequences=5)
```

### Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/postpastiche
huggingtweets
2021-05-22T19:11:55Z
4
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/postpastiche/1616644005211/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1118076636090503168/r39GQ1Ec_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">charley 🤖 AI Bot </div> <div style="font-size: 15px">@postpastiche bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@postpastiche's tweets](https://twitter.com/postpastiche). | Data | Quantity | | --- | --- | | Tweets downloaded | 1056 | | Retweets | 116 | | Short tweets | 110 | | Tweets kept | 830 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/35w04o6v/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @postpastiche's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1a3antnl) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1a3antnl/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/postpastiche') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
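The "Training procedure" sections in these cards point to W&B runs rather than showing code. For orientation, here is a minimal sketch of fine-tuning GPT-2 on a list of tweets with the transformers `Trainer`. The hyperparameters, the placeholder `tweets` corpus, and the output directory are assumptions; huggingtweets uses its own training script, so treat this as an outline of the technique, not the project's implementation.

```python
from datasets import Dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token by default
model = AutoModelForCausalLM.from_pretrained("gpt2")

tweets = ["example tweet one", "example tweet two"]  # placeholder corpus
ds = Dataset.from_dict({"text": tweets}).map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=128),
    batched=True,
    remove_columns=["text"],
)

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="out",                  # illustrative path
        num_train_epochs=4,                # assumed value
        per_device_train_batch_size=8,     # assumed value
    ),
    train_dataset=ds,
    # mlm=False gives the causal (next-token) language-modeling objective.
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```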
huggingtweets/portgarden
huggingtweets
2021-05-22T19:09:10Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1362069452045565952/C9XrhddS_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">sam 🤖 AI Bot </div> <div style="font-size: 15px">@portgarden bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@portgarden's tweets](https://twitter.com/portgarden). | Data | Quantity | | --- | --- | | Tweets downloaded | 3225 | | Retweets | 239 | | Short tweets | 1180 | | Tweets kept | 1806 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/35dkjzwj/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @portgarden's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/10m6qz0w) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/10m6qz0w/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/portgarden') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/porter_esq
huggingtweets
2021-05-22T19:08:05Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1346175758817902592/pYMf2A-D_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">seman🐄 🤖 AI Bot </div> <div style="font-size: 15px">@porter_esq bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@porter_esq's tweets](https://twitter.com/porter_esq). | Data | Quantity | | --- | --- | | Tweets downloaded | 3184 | | Retweets | 970 | | Short tweets | 370 | | Tweets kept | 1844 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/y4rnqx7f/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @porter_esq's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/eqz1nxjx) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/eqz1nxjx/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/porter_esq') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/poppy_haze
huggingtweets
2021-05-22T19:05:57Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/poppy_haze/1614097147143/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1353162751368192003/Zeq8UG0M_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Rev. Poppy Haze 🤖 AI Bot </div> <div style="font-size: 15px">@poppy_haze bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@poppy_haze's tweets](https://twitter.com/poppy_haze). | Data | Quantity | | --- | --- | | Tweets downloaded | 3193 | | Retweets | 1024 | | Short tweets | 229 | | Tweets kept | 1940 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1hn22d89/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @poppy_haze's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2kkk999d) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2kkk999d/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/poppy_haze') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/poppunkarsonist
huggingtweets
2021-05-22T19:04:47Z
4
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/poppunkarsonist/1617962318351/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1350067635019509762/i5v0L1oK_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">yaoi deuteragonist 🤖 AI Bot </div> <div style="font-size: 15px">@poppunkarsonist bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@poppunkarsonist's tweets](https://twitter.com/poppunkarsonist). | Data | Quantity | | --- | --- | | Tweets downloaded | 3236 | | Retweets | 177 | | Short tweets | 414 | | Tweets kept | 2645 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3rhs577s/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @poppunkarsonist's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2sg13pk3) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2sg13pk3/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/poppunkarsonist') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
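Each card notes that data and models are "tracked with W&B artifacts at every step" and that the final model "is logged and versioned". As a rough sketch of what that looks like with the wandb client (project, metric, and path names below are illustrative, not taken from the huggingtweets runs):

```python
import wandb

run = wandb.init(project="huggingtweets", job_type="train")

# Log training metrics as fine-tuning progresses.
wandb.log({"train/loss": 2.31, "epoch": 1})

# Version the fine-tuned model files as an artifact.
artifact = wandb.Artifact("model-praticoslo", type="model")
artifact.add_dir("out")  # directory containing the saved model (illustrative)
run.log_artifact(artifact)
run.finish()
```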
huggingtweets/pop2bycharlixcx
huggingtweets
2021-05-22T19:01:58Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/pop2bycharlixcx/1617806922566/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1377000617474920455/38AtGvJK_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Charli at GAY420 📦 🤖 AI Bot </div> <div style="font-size: 15px">@pop2bycharlixcx bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@pop2bycharlixcx's tweets](https://twitter.com/pop2bycharlixcx). | Data | Quantity | | --- | --- | | Tweets downloaded | 3227 | | Retweets | 349 | | Short tweets | 439 | | Tweets kept | 2439 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3qewrl8r/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @pop2bycharlixcx's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2hamntab) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2hamntab/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/pop2bycharlixcx') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/poly_metis
huggingtweets
2021-05-22T19:00:13Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/poly_metis/1616689405149/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1331954891116466176/HAz7a4M4_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">«Sam•Forsythe» 🤖 AI Bot </div> <div style="font-size: 15px">@poly_metis bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@poly_metis's tweets](https://twitter.com/poly_metis). | Data | Quantity | | --- | --- | | Tweets downloaded | 3202 | | Retweets | 553 | | Short tweets | 276 | | Tweets kept | 2373 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/7402gag0/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @poly_metis's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2rhxhjay) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2rhxhjay/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/poly_metis') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/planeselchu
huggingtweets
2021-05-22T18:52:58Z
4
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
---
language: en
thumbnail: https://www.huggingtweets.com/planeselchu/1604759866977/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/3663964297/3ebe8ed41d3c797e79e4b6591f06e02e_400x400.png')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">El Chupacabra 🤖 AI Bot </div>
<div style="font-size: 15px">@planeselchu bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@planeselchu's tweets](https://twitter.com/planeselchu).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 65 |
| Retweets | 0 |
| Short tweets | 2 |
| Tweets kept | 63 |

[Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/i749t2fu/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @planeselchu's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/hrsgzzhb) for full transparency and reproducibility.

At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/hrsgzzhb/artifacts) is logged and versioned.

## Intended uses & limitations

### How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/planeselchu')
generator("My dream is", num_return_sequences=5)
```

### Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/pkmn_elfbooks
huggingtweets
2021-05-22T18:51:51Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/pkmn_elfbooks/1619948689912/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/378800000271614092/2afadc59e43ec2c7e8cae83d15506bd8_400x400.png')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">PKMN ELFBOOKS 🤖 AI Bot </div> <div style="font-size: 15px">@pkmn_elfbooks bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@pkmn_elfbooks's tweets](https://twitter.com/pkmn_elfbooks). | Data | Quantity | | --- | --- | | Tweets downloaded | 3250 | | Retweets | 0 | | Short tweets | 335 | | Tweets kept | 2915 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2wv4f1mc/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @pkmn_elfbooks's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3vxudyrh) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3vxudyrh/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/pkmn_elfbooks') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/piratepilots
huggingtweets
2021-05-22T18:48:30Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/piratepilots/1617914218242/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1254995258485747712/wTAawAgy_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Adam 🤖 AI Bot </div> <div style="font-size: 15px">@piratepilots bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@piratepilots's tweets](https://twitter.com/piratepilots). | Data | Quantity | | --- | --- | | Tweets downloaded | 3246 | | Retweets | 163 | | Short tweets | 1052 | | Tweets kept | 2031 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1nqn8xdk/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @piratepilots's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/26pugvha) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/26pugvha/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/piratepilots') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/piersmorgan
huggingtweets
2021-05-22T18:47:22Z
5
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
---
language: en
thumbnail: https://github.com/borisdayma/huggingtweets/blob/master/img/logo_share.png?raw=true
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('http://pbs.twimg.com/profile_images/1289830755917537280/SVdTkxiN_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Piers Morgan 🤖 AI Bot </div>
<div style="font-size: 15px">@piersmorgan bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@piersmorgan's tweets](https://twitter.com/piersmorgan).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3224 |
| Retweets | 481 |
| Short tweets | 339 |
| Tweets kept | 2404 |

[Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/e34ep2e8/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @piersmorgan's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/3v7ca9mt) for full transparency and reproducibility.

At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/3v7ca9mt/artifacts) is logged and versioned.

## Intended uses & limitations

### How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/piersmorgan')
generator("My dream is", num_return_sequences=5)
```

### Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/pidgezero_one
huggingtweets
2021-05-22T18:46:13Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/pidgezero_one/1614132740987/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1334990538647461889/XfOoKZ3w_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">ATK | pidge 🤖 AI Bot </div> <div style="font-size: 15px">@pidgezero_one bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@pidgezero_one's tweets](https://twitter.com/pidgezero_one). | Data | Quantity | | --- | --- | | Tweets downloaded | 3229 | | Retweets | 351 | | Short tweets | 168 | | Tweets kept | 2710 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/18ne87gw/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @pidgezero_one's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/xb62y69a) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/xb62y69a/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/pidgezero_one') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/philosophy_mark
huggingtweets
2021-05-22T18:38:29Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/philosophy_mark/1616697551802/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1359971689690329090/PDRloAA7_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Mark Schroeder 🤖 AI Bot </div> <div style="font-size: 15px">@philosophy_mark bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@philosophy_mark's tweets](https://twitter.com/philosophy_mark). | Data | Quantity | | --- | --- | | Tweets downloaded | 3250 | | Retweets | 171 | | Short tweets | 347 | | Tweets kept | 2732 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/21aq4o19/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @philosophy_mark's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1erhw6qr) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1erhw6qr/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/philosophy_mark') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/philoso_foster
huggingtweets
2021-05-22T18:37:27Z
5
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/philoso_foster/1616729629058/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1357526479987331073/YKjgUnEz_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">jen foster, scrunchie rights activist 🤖 AI Bot </div> <div style="font-size: 15px">@philoso_foster bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@philoso_foster's tweets](https://twitter.com/philoso_foster). | Data | Quantity | | --- | --- | | Tweets downloaded | 3241 | | Retweets | 634 | | Short tweets | 410 | | Tweets kept | 2197 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/15lvsiy3/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @philoso_foster's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1m83r9mb) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1m83r9mb/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/philoso_foster') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/phantasyphiend
huggingtweets
2021-05-22T18:35:02Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/phantasyphiend/1616698324465/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/378800000130578639/207d4a749cd598bc91c77b9f9599cfaf_400x400.jpeg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Boredom is Strength 🤖 AI Bot </div> <div style="font-size: 15px">@phantasyphiend bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@phantasyphiend's tweets](https://twitter.com/phantasyphiend). | Data | Quantity | | --- | --- | | Tweets downloaded | 3233 | | Retweets | 1236 | | Short tweets | 105 | | Tweets kept | 1892 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3o8six8s/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @phantasyphiend's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/367y74jo) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/367y74jo/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/phantasyphiend') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/peteskomoroch
huggingtweets
2021-05-22T18:32:34Z
4
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://github.com/borisdayma/huggingtweets/blob/master/img/logo_share.png?raw=true tags: - huggingtweets widget: - text: "My dream is" --- <link rel="stylesheet" href="https://unpkg.com/@tailwindcss/[email protected]/dist/typography.min.css"> <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('http://pbs.twimg.com/profile_images/1245441601196810240/mTgj2Xuj_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">peteskomoroch 🤖 AI Bot </div> <div style="font-size: 15px; color: #657786">@peteskomoroch bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://bit.ly/2TGXMZf). ## Training data The model was trained on [@peteskomoroch's tweets](https://twitter.com/peteskomoroch). <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3204</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>1989</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>155</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>1060</td> </tr> </tbody> </table> [Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/fcor2sn3/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @peteskomoroch's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/3hsgl063) for full transparency and reproducibility. At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/3hsgl063/artifacts) is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/peteskomoroch'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). 
In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/borisdayma) <section class='prose'> For more details, visit the project repository. </section> [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
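This card predates the fenced-code template, so its usage snippet is rendered as inline-styled HTML. The same snippet in the plain form the newer cards use, for copy-paste:

```python
from transformers import pipeline

generator = pipeline('text-generation', model='huggingtweets/peteskomoroch')
generator("My dream is", num_return_sequences=5)
```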
huggingtweets/petersinger
huggingtweets
2021-05-22T18:30:21Z
5
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/petersinger/1616651178099/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/726051571549802496/C7CCfzNg_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Peter Singer 🤖 AI Bot </div> <div style="font-size: 15px">@petersinger bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@petersinger's tweets](https://twitter.com/petersinger). | Data | Quantity | | --- | --- | | Tweets downloaded | 2280 | | Retweets | 153 | | Short tweets | 9 | | Tweets kept | 2118 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3ts8iug1/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @petersinger's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/m34q3fhn) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/m34q3fhn/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/petersinger') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
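Across all of these cards, the data table follows one identity: tweets kept = tweets downloaded − retweets − short tweets. A quick check against the @petersinger numbers (the helper below is illustrative, not part of the pipeline):

```python
def tweets_kept(downloaded: int, retweets: int, short: int) -> int:
    """Tweets remaining for fine-tuning after dropping retweets and short tweets."""
    return downloaded - retweets - short

assert tweets_kept(2280, 153, 9) == 2118  # matches the @petersinger table
```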
huggingtweets/peterhurford
huggingtweets
2021-05-22T18:26:56Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/peterhurford/1617082349548/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1213492335012651010/mrTNCZyF_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Peter Hurford is hiring interns 🤖 AI Bot </div> <div style="font-size: 15px">@peterhurford bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@peterhurford's tweets](https://twitter.com/peterhurford). | Data | Quantity | | --- | --- | | Tweets downloaded | 3244 | | Retweets | 633 | | Short tweets | 173 | | Tweets kept | 2438 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3tfl6jpj/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @peterhurford's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/iczego0s) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/iczego0s/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/peterhurford') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
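The cards describe the training procedure only as "fine-tuned on @user's tweets", with hyperparameters recorded in W&B. Purely as a sketch of what that entails, assuming a local plain-text dump `tweets.txt` of the kept tweets and made-up hyperparameters; the real settings live in the linked run:

```python
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, TextDataset,
                          Trainer, TrainingArguments)

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

# tweets.txt is an assumed local dump of the kept tweets. TextDataset
# (deprecated in recent transformers, but compact) chunks it into
# fixed-length blocks for causal-LM training.
train_dataset = TextDataset(tokenizer=tokenizer, file_path="tweets.txt", block_size=128)
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)  # no masking

args = TrainingArguments(output_dir="finetuned",   # made-up hyperparameters below
                         num_train_epochs=4,
                         per_device_train_batch_size=8)
Trainer(model=model, args=args, data_collator=collator,
        train_dataset=train_dataset).train()
```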
huggingtweets/peter_shoes_
huggingtweets
2021-05-22T18:25:53Z
5
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/peter_shoes_/1616614828484/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1364286254511194122/2k1Xq9KR_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Peter Shoes 🤖 AI Bot </div> <div style="font-size: 15px">@peter_shoes_ bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@peter_shoes_'s tweets](https://twitter.com/peter_shoes_). | Data | Quantity | | --- | --- | | Tweets downloaded | 2893 | | Retweets | 653 | | Short tweets | 156 | | Tweets kept | 2084 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2lh8o2ik/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @peter_shoes_'s tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/akr3u3cc) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/akr3u3cc/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/peter_shoes_') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/persoverant
huggingtweets
2021-05-22T18:22:35Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/persoverant/1616687540093/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1336360264816599045/cfhEnxC7_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Perso Verant 🤖 AI Bot </div> <div style="font-size: 15px">@persoverant bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@persoverant's tweets](https://twitter.com/persoverant). | Data | Quantity | | --- | --- | | Tweets downloaded | 3245 | | Retweets | 106 | | Short tweets | 335 | | Tweets kept | 2804 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/31i1riwh/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @persoverant's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/205cbqur) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/205cbqur/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/persoverant') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
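The pipeline also accepts a batch of prompts, each receiving its own continuations; a sketch with assumed example prompts:

```python
from transformers import pipeline

generator = pipeline('text-generation', model='huggingtweets/persoverant')

prompts = ["My dream is", "Today I learned"]  # assumed example prompts
results = generator(prompts, num_return_sequences=2, do_sample=True)
for prompt, completions in zip(prompts, results):
    # One list of dicts per prompt, each with a 'generated_text' key.
    print(prompt, "->", [c["generated_text"] for c in completions])
```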
huggingtweets/persimfan
huggingtweets
2021-05-22T18:21:26Z
5
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/persimfan/1614096598302/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1363872381907988481/EuhhK3gG_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">bad vibes only 😇🧘🏻 🤖 AI Bot </div> <div style="font-size: 15px">@persimfan bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@persimfan's tweets](https://twitter.com/persimfan). | Data | Quantity | | --- | --- | | Tweets downloaded | 3089 | | Retweets | 762 | | Short tweets | 530 | | Tweets kept | 1797 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/vjosb2cs/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @persimfan's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2aaa509f) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2aaa509f/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/persimfan') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/permafuddled
huggingtweets
2021-05-22T18:20:25Z
4
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/permafuddled/1617768895101/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1318270049468243969/fLfecmYW_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">eskatonic.water 🤖 AI Bot </div> <div style="font-size: 15px">@permafuddled bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@permafuddled's tweets](https://twitter.com/permafuddled). | Data | Quantity | | --- | --- | | Tweets downloaded | 255 | | Retweets | 41 | | Short tweets | 38 | | Tweets kept | 176 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/319s90s7/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @permafuddled's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/28zluakd) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/28zluakd/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/permafuddled') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/penners827
huggingtweets
2021-05-22T18:18:01Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/penners827/1616129257605/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1352431980126855169/6B6H29nl_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">penny | officially gameing 🤖 AI Bot </div> <div style="font-size: 15px">@penners827 bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@penners827's tweets](https://twitter.com/penners827). | Data | Quantity | | --- | --- | | Tweets downloaded | 3206 | | Retweets | 1282 | | Short tweets | 396 | | Tweets kept | 1528 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1ifmktei/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @penners827's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3lynjmin) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3lynjmin/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/penners827') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/pebblessss12
huggingtweets
2021-05-22T18:15:49Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1375097459945304067/1epYiME7_400x400.png')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Peb 🤖 AI Bot </div> <div style="font-size: 15px">@pebblessss12 bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@pebblessss12's tweets](https://twitter.com/pebblessss12). | Data | Quantity | | --- | --- | | Tweets downloaded | 2143 | | Retweets | 949 | | Short tweets | 339 | | Tweets kept | 855 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/24nvmhab/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @pebblessss12's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/15ub33vb) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/15ub33vb/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/pebblessss12') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
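Sampling makes each call nondeterministic; for reproducible outputs, seed everything first with `set_seed` from transformers (a minimal sketch):

```python
from transformers import pipeline, set_seed

set_seed(42)  # fixes the Python, NumPy and torch RNGs for reproducible samples
generator = pipeline('text-generation', model='huggingtweets/pebblessss12')
generator("My dream is", num_return_sequences=5, do_sample=True)
```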
huggingtweets/pdobryden
huggingtweets
2021-05-22T18:14:42Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1131522752303816704/xI89q9-z_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Paul Dobryden 🤖 AI Bot </div> <div style="font-size: 15px">@pdobryden bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@pdobryden's tweets](https://twitter.com/pdobryden). | Data | Quantity | | --- | --- | | Tweets downloaded | 1004 | | Retweets | 35 | | Short tweets | 164 | | Tweets kept | 805 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3fjzzc3x/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @pdobryden's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2cp6vkoc) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2cp6vkoc/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/pdobryden') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
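Checkpoints pulled from the Hub can be cached as a local copy so later runs work offline; a minimal sketch, with an assumed local path:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("huggingtweets/pdobryden")
tokenizer = AutoTokenizer.from_pretrained("huggingtweets/pdobryden")

# Save both pieces side by side; from_pretrained() accepts this directory later.
model.save_pretrained("./pdobryden-local")
tokenizer.save_pretrained("./pdobryden-local")
```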
huggingtweets/pareinoia
huggingtweets
2021-05-22T18:01:20Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/pareinoia/1616618526006/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1350516642049110016/5Fm9kSGJ_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">🥝 kiwi, best of fruits 🥝 🤖 AI Bot </div> <div style="font-size: 15px">@pareinoia bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@pareinoia's tweets](https://twitter.com/pareinoia). | Data | Quantity | | --- | --- | | Tweets downloaded | 1999 | | Retweets | 464 | | Short tweets | 306 | | Tweets kept | 1229 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2ca7c493/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @pareinoia's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2tntgk3a) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2tntgk3a/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/pareinoia') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/palaeoplushies
huggingtweets
2021-05-22T17:57:35Z
5
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/palaeoplushies/1614170470431/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/956582649748914176/LHJnQe0x_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Rebecca Groom 𓆣 🤖 AI Bot </div> <div style="font-size: 15px">@palaeoplushies bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@palaeoplushies's tweets](https://twitter.com/palaeoplushies). | Data | Quantity | | --- | --- | | Tweets downloaded | 3231 | | Retweets | 1264 | | Short tweets | 328 | | Tweets kept | 1639 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3v2wg239/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @palaeoplushies's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/f2cvcgpc) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/f2cvcgpc/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/palaeoplushies') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/pakalupapitow
huggingtweets
2021-05-22T17:56:33Z
4
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/pakalupapitow/1614043057184/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1104739030229348352/qOCWwK4Y_400x400.png')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Pakalu Papito 🤖 AI Bot </div> <div style="font-size: 15px">@pakalupapitow bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@pakalupapitow's tweets](https://twitter.com/pakalupapitow). | Data | Quantity | | --- | --- | | Tweets downloaded | 825 | | Retweets | 0 | | Short tweets | 26 | | Tweets kept | 799 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3nhcqfcm/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @pakalupapitow's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/304r4enc) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/304r4enc/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/pakalupapitow') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/paharnic
huggingtweets
2021-05-22T17:55:08Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/paharnic/1614095924704/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1361100730829049858/3MUqI3ao_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">paharnic 📕 (jurnal arc) 🤖 AI Bot </div> <div style="font-size: 15px">@paharnic bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@paharnic's tweets](https://twitter.com/paharnic). | Data | Quantity | | --- | --- | | Tweets downloaded | 2594 | | Retweets | 694 | | Short tweets | 179 | | Tweets kept | 1721 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2wukqiq6/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @paharnic's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1m2fq298) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1m2fq298/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/paharnic') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/paguetisqueso
huggingtweets
2021-05-22T17:54:00Z
4
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/paguetisqueso/1605894116196/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <link rel="stylesheet" href="https://unpkg.com/@tailwindcss/[email protected]/dist/typography.min.css"> <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1325149910971265026/O1uABo-F_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">fullmetal autist 🤖 AI Bot </div> <div style="font-size: 15px; color: #657786">@paguetisqueso bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@paguetisqueso's tweets](https://twitter.com/paguetisqueso). <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3184</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>855</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>754</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>1575</td> </tr> </tbody> </table> [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/37zz525d/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @paguetisqueso's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/19oxcev4) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/19oxcev4/artifacts) is logged and versioned. 
## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/paguetisqueso'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) <section class='prose'> For more details, visit the project repository. </section> [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets) <!--- random size file -->
huggingtweets/pabloiglesias
huggingtweets
2021-05-22T17:52:55Z
4
1
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/pabloiglesias/1621002350351/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1337047075859668992/vsS3FHEd_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Pablo Iglesias 🔻</div> <div style="text-align: center; font-size: 14px;">@pabloiglesias</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Pablo Iglesias 🔻. | Data | Pablo Iglesias 🔻 | | --- | --- | | Tweets downloaded | 3230 | | Retweets | 1157 | | Short tweets | 191 | | Tweets kept | 1882 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1cxyib7q/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @pabloiglesias's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/auuc2mv0) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/auuc2mv0/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/pabloiglesias') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
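Generation runs on CPU by default; moving the pipeline to a GPU is one argument. A sketch that falls back gracefully when no CUDA device is present:

```python
import torch
from transformers import pipeline

device = 0 if torch.cuda.is_available() else -1  # GPU index, or -1 for CPU
generator = pipeline('text-generation', model='huggingtweets/pabloiglesias', device=device)
generator("My dream is", num_return_sequences=5, do_sample=True)
```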
huggingtweets/p69ns
huggingtweets
2021-05-22T17:51:58Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/p69ns/1620455795681/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1333057562418135050/ddMkA8SB_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Puneet 🌌</div> <div style="text-align: center; font-size: 14px;">@p69ns</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Puneet 🌌. | Data | Puneet 🌌 | | --- | --- | | Tweets downloaded | 223 | | Retweets | 44 | | Short tweets | 51 | | Tweets kept | 128 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/7i9ntszz/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @p69ns's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2wsrhmmr) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2wsrhmmr/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/p69ns') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/oxtrf
huggingtweets
2021-05-22T17:50:31Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/oxtrf/1616669978589/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1332803125598687237/doZnTkBs_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">oscar ridout 🤖 AI Bot </div> <div style="font-size: 15px">@oxtrf bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@oxtrf's tweets](https://twitter.com/oxtrf). | Data | Quantity | | --- | --- | | Tweets downloaded | 1006 | | Retweets | 111 | | Short tweets | 120 | | Tweets kept | 775 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/13yo29fo/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @oxtrf's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/18i97j4h) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/18i97j4h/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/oxtrf') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/owlsimulator
huggingtweets
2021-05-22T17:49:29Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/owlsimulator/1618071835912/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1344571440704548864/oDFUNN1t_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">✿Atelier Victoria: Alchemist of Dark Eyebags✿ 🤖 AI Bot </div> <div style="font-size: 15px">@owlsimulator bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@owlsimulator's tweets](https://twitter.com/owlsimulator). | Data | Quantity | | --- | --- | | Tweets downloaded | 3184 | | Retweets | 2445 | | Short tweets | 52 | | Tweets kept | 687 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/tpdnekp3/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @owlsimulator's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3ekgy332) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3ekgy332/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/owlsimulator') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
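Each training procedure ends with the final model being logged and versioned. A minimal sketch of that versioning step with the `wandb` artifacts API; the project and artifact names here are illustrative, not the run's actual identifiers:

```python
import wandb

run = wandb.init(project="huggingtweets")            # illustrative project name
artifact = wandb.Artifact("finetuned-gpt2", type="model")
artifact.add_dir("output")                           # directory holding the weights
run.log_artifact(artifact)                           # each log bumps the version (v0, v1, ...)
run.finish()
```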
huggingtweets/ourqueeningreen
huggingtweets
2021-05-22T17:47:15Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/ourqueeningreen/1614214881697/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1359357742952157187/mQGr8RbR_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">🌿 🤖 AI Bot </div> <div style="font-size: 15px">@ourqueeningreen bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@ourqueeningreen's tweets](https://twitter.com/ourqueeningreen). | Data | Quantity | | --- | --- | | Tweets downloaded | 244 | | Retweets | 189 | | Short tweets | 5 | | Tweets kept | 50 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2z9h01nl/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @ourqueeningreen's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3dos6ygx) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3dos6ygx/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/ourqueeningreen') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
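Upstream of the data table, the pipeline pictured in each card starts by downloading the user's recent tweets from the Twitter API. A sketch of that first step with `tweepy`; the credentials are placeholders and the call shown is an assumption about the download stage, not the project's verbatim code:

```python
import tweepy

# Placeholder credentials; real values come from a Twitter developer account.
auth = tweepy.OAuth1UserHandler("API_KEY", "API_SECRET",
                                "ACCESS_TOKEN", "ACCESS_SECRET")
api = tweepy.API(auth)

# Fetch recent tweets with their full, untruncated text.
tweets = api.user_timeline(screen_name="ourqueeningreen",
                           count=200, tweet_mode="extended")
texts = [tweet.full_text for tweet in tweets]
```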
huggingtweets/oughton_andrew
huggingtweets
2021-05-22T17:46:02Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/oughton_andrew/1616652616725/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1308077162394779649/95Akm8K6_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Andrew!! 🤖 AI Bot </div> <div style="font-size: 15px">@oughton_andrew bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@oughton_andrew's tweets](https://twitter.com/oughton_andrew). | Data | Quantity | | --- | --- | | Tweets downloaded | 2848 | | Retweets | 1110 | | Short tweets | 499 | | Tweets kept | 1239 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3m8g16om/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @oughton_andrew's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1fieo2nf) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1fieo2nf/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/oughton_andrew') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
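The usage snippet in each card samples five completions, so output varies between runs. For reproducible samples, the sampling RNG can be seeded first, for example:

```python
from transformers import pipeline, set_seed

set_seed(42)  # fix the RNG so repeated runs return the same samples
generator = pipeline('text-generation', model='huggingtweets/oughton_andrew')
generator("My dream is", num_return_sequences=5)
```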
huggingtweets/orkoliberal
huggingtweets
2021-05-22T17:42:22Z
5
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/orkoliberal/1616723109984/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1345930771203125248/Js275GOL_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">orko 🤖 AI Bot </div> <div style="font-size: 15px">@orkoliberal bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@orkoliberal's tweets](https://twitter.com/orkoliberal). | Data | Quantity | | --- | --- | | Tweets downloaded | 3249 | | Retweets | 57 | | Short tweets | 1021 | | Tweets kept | 2171 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/32939qwk/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @orkoliberal's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/odi2o1dn) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/odi2o1dn/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/orkoliberal') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/opossumzavod
huggingtweets
2021-05-22T17:36:48Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/opossumzavod/1617768474582/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1352327440174080005/iNLT3tbN_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Leonid Bruhzhnev 🚩⚔👑 🤖 AI Bot </div> <div style="font-size: 15px">@opossumzavod bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@opossumzavod's tweets](https://twitter.com/opossumzavod). | Data | Quantity | | --- | --- | | Tweets downloaded | 3184 | | Retweets | 1034 | | Short tweets | 242 | | Tweets kept | 1908 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/23nwwphy/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @opossumzavod's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1dqk2qsz) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1dqk2qsz/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/opossumzavod') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/ookinanami73
huggingtweets
2021-05-22T17:32:21Z
4
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/ookinanami73/1602171715530/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/947803794410057728/38X54KmI_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">大樹七海オオキナナミ 書籍『弁理士にお任せあれ』|『ビジネスツールとしての知的財産』発売中 🤖 AI Bot </div> <div style="font-size: 15px; color: #657786">@ookinanami73 bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@ookinanami73's tweets](https://twitter.com/ookinanami73). | Data | Quantity | | --- | --- | | Tweets downloaded | 3192 | | Retweets | 1202 | | Short tweets | 1279 | | Tweets kept | 711 | [Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/1y5h9ibk/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @ookinanami73's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/1h8u0g0r) for full transparency and reproducibility. At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/1h8u0g0r/artifacts) is logged and versioned.
## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/ookinanami73') generator("My dream is", num_return_sequences=5) ``` ### Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/omalliecatt
huggingtweets
2021-05-22T17:24:54Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/omalliecatt/1614108504212/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1351181888904441857/kRazJkru_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">omalliecatt 🤖 AI Bot </div> <div style="font-size: 15px">@omalliecatt bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@omalliecatt's tweets](https://twitter.com/omalliecatt). | Data | Quantity | | --- | --- | | Tweets downloaded | 3034 | | Retweets | 1789 | | Short tweets | 161 | | Tweets kept | 1084 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2dmajkuu/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @omalliecatt's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/u4woldqy) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/u4woldqy/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/omalliecatt') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/oliversherouse
huggingtweets
2021-05-22T17:22:39Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/oliversherouse/1607051207212/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/960537652738121728/9SNKYOB-_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Oliver Sherouse 🤖 AI Bot </div> <div style="font-size: 15px; color: #657786">@oliversherouse bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@oliversherouse's tweets](https://twitter.com/oliversherouse). | Data | Quantity | | --- | --- | | Tweets downloaded | 84 | | Retweets | 15 | | Short tweets | 7 | | Tweets kept | 62 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2xs14x8e/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @oliversherouse's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/30zp73af) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/30zp73af/artifacts) is logged and versioned.
## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/oliversherouse') generator("My dream is", num_return_sequences=5) ``` ### Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/olikuchi
huggingtweets
2021-05-22T17:20:22Z
4
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/olikuchi/1616735068585/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1342840760698167297/knVda2aD_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Melissa Nakod 🤖 AI Bot </div> <div style="font-size: 15px">@olikuchi bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@olikuchi's tweets](https://twitter.com/olikuchi). | Data | Quantity | | --- | --- | | Tweets downloaded | 522 | | Retweets | 40 | | Short tweets | 26 | | Tweets kept | 456 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1vyt5dd1/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @olikuchi's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1vwx4xay) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1vwx4xay/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/olikuchi') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/ohitstarik
huggingtweets
2021-05-22T17:17:59Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/ohitstarik/1617806598893/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1344429516433797121/EWFnfof-_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">aerospace action bronson 🛩️ 🤖 AI Bot </div> <div style="font-size: 15px">@ohitstarik bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@ohitstarik's tweets](https://twitter.com/ohitstarik). | Data | Quantity | | --- | --- | | Tweets downloaded | 3234 | | Retweets | 169 | | Short tweets | 410 | | Tweets kept | 2655 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/28wqr38y/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @ohitstarik's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/24op17hz) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/24op17hz/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/ohitstarik') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/oframblers
huggingtweets
2021-05-22T17:16:42Z
4
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/oframblers/1617759790860/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1365027685173518336/5emgEh7A_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Vamyz 🤖 AI Bot </div> <div style="font-size: 15px">@oframblers bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@oframblers's tweets](https://twitter.com/oframblers). | Data | Quantity | | --- | --- | | Tweets downloaded | 2471 | | Retweets | 75 | | Short tweets | 352 | | Tweets kept | 2044 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/m1uvdmz7/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @oframblers's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/32608sfl) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/32608sfl/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/oframblers') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/ofrainandruin
huggingtweets
2021-05-22T17:14:59Z
5
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/ofrainandruin/1614192469112/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1364178872229646336/3YGZu06Y_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">static tincture vending machine 🤖 AI Bot </div> <div style="font-size: 15px">@ofrainandruin bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@ofrainandruin's tweets](https://twitter.com/ofrainandruin). | Data | Quantity | | --- | --- | | Tweets downloaded | 3210 | | Retweets | 424 | | Short tweets | 528 | | Tweets kept | 2258 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1qe3i7df/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @ofrainandruin's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/vw5x6cdg) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/vw5x6cdg/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/ofrainandruin') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/nyshra_
huggingtweets
2021-05-22T17:07:53Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1133336385711214592/CVletvRA_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Nyshra 🤖 AI Bot </div> <div style="font-size: 15px">@nyshra_ bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@nyshra_'s tweets](https://twitter.com/nyshra_). | Data | Quantity | | --- | --- | | Tweets downloaded | 2405 | | Retweets | 1968 | | Short tweets | 134 | | Tweets kept | 303 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/39zp1zix/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @nyshra_'s tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/128qa3vv) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/128qa3vv/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/nyshra_') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/nykteli_os
huggingtweets
2021-05-22T17:06:24Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/nykteli_os/1617765021419/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1378316147217428484/KHvoGqqC_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">kat 🤖 AI Bot </div> <div style="font-size: 15px">@nykteli_os bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@nykteli_os's tweets](https://twitter.com/nykteli_os). | Data | Quantity | | --- | --- | | Tweets downloaded | 3181 | | Retweets | 1380 | | Short tweets | 333 | | Tweets kept | 1468 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2i76tul5/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @nykteli_os's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2wvcbiag) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2wvcbiag/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/nykteli_os') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/nyanberryy
huggingtweets
2021-05-22T17:01:43Z
4
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/nyanberryy/1617147047295/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1342198467251359748/b4U3zZCb_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">sanrio mentall illness 🤖 AI Bot </div> <div style="font-size: 15px">@nyanberryy bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@nyanberryy's tweets](https://twitter.com/nyanberryy). | Data | Quantity | | --- | --- | | Tweets downloaded | 3171 | | Retweets | 2093 | | Short tweets | 430 | | Tweets kept | 648 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2n3cxdi4/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @nyanberryy's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2m61vei9) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2m61vei9/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/nyanberryy') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/nuggetprime
huggingtweets
2021-05-22T16:59:29Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/nuggetprime/1614192411560/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1343993986726449153/9mjKvHIl_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Emi the Nugget 🤖 AI Bot </div> <div style="font-size: 15px">@nuggetprime bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@nuggetprime's tweets](https://twitter.com/nuggetprime). | Data | Quantity | | --- | --- | | Tweets downloaded | 3115 | | Retweets | 268 | | Short tweets | 514 | | Tweets kept | 2333 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1hpu3zl5/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @nuggetprime's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2epf97f0) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2epf97f0/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/nuggetprime') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/nueclear333
huggingtweets
2021-05-22T16:58:17Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/nueclear333/1616127804862/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1360502222472110080/eLpcrOJ2_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">{nuclear333} 🤖 AI Bot </div> <div style="font-size: 15px">@nueclear333 bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@nueclear333's tweets](https://twitter.com/nueclear333). | Data | Quantity | | --- | --- | | Tweets downloaded | 3244 | | Retweets | 379 | | Short tweets | 505 | | Tweets kept | 2360 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1q7uhjmk/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @nueclear333's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2kao691s) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2kao691s/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/nueclear333') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/notpretzel
huggingtweets
2021-05-22T16:57:10Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/notpretzel/1616725707255/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1374396890255556610/-Hs0KiBN_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Julie 🤖 AI Bot </div> <div style="font-size: 15px">@notpretzel bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@notpretzel's tweets](https://twitter.com/notpretzel). | Data | Quantity | | --- | --- | | Tweets downloaded | 628 | | Retweets | 39 | | Short tweets | 127 | | Tweets kept | 462 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/273cvupn/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @notpretzel's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3ev1zmou) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3ev1zmou/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/notpretzel') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/notjohnfante
huggingtweets
2021-05-22T16:54:50Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/notjohnfante/1614109599350/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1121491295304867841/XQw9vw3T_400x400.png')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">2,300 Guillotines 💀 🤖 AI Bot </div> <div style="font-size: 15px">@notjohnfante bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@notjohnfante's tweets](https://twitter.com/notjohnfante). | Data | Quantity | | --- | --- | | Tweets downloaded | 2857 | | Retweets | 229 | | Short tweets | 239 | | Tweets kept | 2389 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3eh51lx0/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @notjohnfante's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2nu4o83i) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2nu4o83i/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/notjohnfante') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/notdaijob
huggingtweets
2021-05-22T16:53:47Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/notdaijob/1601881653058/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1297897300568576000/xlT6AkR4_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">honda hanako san 🤖 AI Bot </div> <div style="font-size: 15px">@notdaijob bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@notdaijob's tweets](https://twitter.com/notdaijob). | Data | Quantity | | --- | --- | | Tweets downloaded | 2993 | | Retweets | 1127 | | Short tweets | 414 | | Tweets kept | 1452 | [Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/f3hrx6g6/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @notdaijob's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/mhnjpovn) for full transparency and reproducibility. At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/mhnjpovn/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/notdaijob') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/notanastronomer
huggingtweets
2021-05-22T16:52:36Z
5
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/notanastronomer/1616727503635/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1344888565176487936/SIjKeap6_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Lauren Gilbert 🤖 AI Bot </div> <div style="font-size: 15px">@notanastronomer bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@notanastronomer's tweets](https://twitter.com/notanastronomer). | Data | Quantity | | --- | --- | | Tweets downloaded | 3221 | | Retweets | 255 | | Short tweets | 349 | | Tweets kept | 2617 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/a2lf1xnl/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @notanastronomer's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1kzasb23) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1kzasb23/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/notanastronomer') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/normmacdonald
huggingtweets
2021-05-22T16:47:41Z
4
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/normmacdonald/1617162362414/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1281990037/Unknown_400x400.jpeg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Norm Macdonald 🤖 AI Bot </div> <div style="font-size: 15px">@normmacdonald bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@normmacdonald's tweets](https://twitter.com/normmacdonald). | Data | Quantity | | --- | --- | | Tweets downloaded | 3190 | | Retweets | 160 | | Short tweets | 275 | | Tweets kept | 2755 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3j8ka0eq/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @normmacdonald's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3stwuwin) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3stwuwin/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/normmacdonald') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/nolemonnomelon
huggingtweets
2021-05-22T16:44:05Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/nolemonnomelon/1616619611154/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1274178806719361025/YI5SJVX__400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">zest of lemon stove 🤖 AI Bot </div> <div style="font-size: 15px">@nolemonnomelon bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@nolemonnomelon's tweets](https://twitter.com/nolemonnomelon). | Data | Quantity | | --- | --- | | Tweets downloaded | 3242 | | Retweets | 103 | | Short tweets | 342 | | Tweets kept | 2797 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/23u2u7ar/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @nolemonnomelon's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1pvkyrlm) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1pvkyrlm/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/nolemonnomelon') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/nolanatlas
huggingtweets
2021-05-22T16:43:03Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/nolanatlas/1617955301190/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1375301394853339138/U-JYYHuL_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">nolan 🤖 AI Bot </div> <div style="font-size: 15px">@nolanatlas bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@nolanatlas's tweets](https://twitter.com/nolanatlas). | Data | Quantity | | --- | --- | | Tweets downloaded | 537 | | Retweets | 252 | | Short tweets | 80 | | Tweets kept | 205 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/fuq0bsmg/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @nolanatlas's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3hczao8v) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3hczao8v/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/nolanatlas') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/noetic_emetic
huggingtweets
2021-05-22T16:41:48Z
5
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/noetic_emetic/1616706015114/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1343992596377468930/e2uDWKzB_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">matty matt, aspiring non-cynic (🙅‍♂️🐶) 🤖 AI Bot </div> <div style="font-size: 15px">@noetic_emetic bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@noetic_emetic's tweets](https://twitter.com/noetic_emetic). | Data | Quantity | | --- | --- | | Tweets downloaded | 2038 | | Retweets | 136 | | Short tweets | 170 | | Tweets kept | 1732 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/5wbnriy4/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @noetic_emetic's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/27jn5mzh) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/27jn5mzh/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/noetic_emetic') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/nobu_hibiki
huggingtweets
2021-05-22T16:35:57Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/nobu_hibiki/1616869686388/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1375076697909854208/vK7nbssh_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Nobu Hibiki 🧵🍫🎶 🤖 AI Bot </div> <div style="font-size: 15px">@nobu_hibiki bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@nobu_hibiki's tweets](https://twitter.com/nobu_hibiki). | Data | Quantity | | --- | --- | | Tweets downloaded | 3247 | | Retweets | 126 | | Short tweets | 493 | | Tweets kept | 2628 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1o4af15n/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @nobu_hibiki's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/nm0uz8lj) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/nm0uz8lj/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/nobu_hibiki') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/nintendobower
huggingtweets
2021-05-22T16:27:58Z
4
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/nintendobower/1617034457304/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1329621245868982273/7gZABQqW_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">NintendoBower 🤖 AI Bot </div> <div style="font-size: 15px">@nintendobower bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@nintendobower's tweets](https://twitter.com/nintendobower). | Data | Quantity | | --- | --- | | Tweets downloaded | 896 | | Retweets | 51 | | Short tweets | 84 | | Tweets kept | 761 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1e0up07b/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @nintendobower's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3h9vnuv5) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3h9vnuv5/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/nintendobower') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/nillster
huggingtweets
2021-05-22T16:25:41Z
5
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/nillster/1616723709207/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1174493980353945600/caRV_TcD_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Belated Halloween Name Mr. Nilla 🤖 AI Bot </div> <div style="font-size: 15px">@nillster bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@nillster's tweets](https://twitter.com/nillster). | Data | Quantity | | --- | --- | | Tweets downloaded | 3186 | | Retweets | 2476 | | Short tweets | 41 | | Tweets kept | 669 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1eynsn57/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @nillster's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3gmy1d91) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3gmy1d91/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/nillster') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/nikkibomm
huggingtweets
2021-05-22T16:24:39Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1348723772396892161/2vB4zAtF_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">nicole kayla🦋 🤖 AI Bot </div> <div style="font-size: 15px">@nikkibomm bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@nikkibomm's tweets](https://twitter.com/nikkibomm). | Data | Quantity | | --- | --- | | Tweets downloaded | 2707 | | Retweets | 1332 | | Short tweets | 251 | | Tweets kept | 1124 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1ikrygq8/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @nikkibomm's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/elmjn9n2) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/elmjn9n2/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/nikkibomm') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/nicodelon
huggingtweets
2021-05-22T16:21:09Z
5
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/nicodelon/1616724924763/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1362417941690552328/e7Fg40V5_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Nicolas Delon 🤖 AI Bot </div> <div style="font-size: 15px">@nicodelon bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@nicodelon's tweets](https://twitter.com/nicodelon). | Data | Quantity | | --- | --- | | Tweets downloaded | 894 | | Retweets | 150 | | Short tweets | 99 | | Tweets kept | 645 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2ip6ohm5/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @nicodelon's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/doeia66z) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/doeia66z/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/nicodelon') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/nicedaysareweak
huggingtweets
2021-05-22T16:17:39Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/nicedaysareweak/1617767429628/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1308228998267191296/0csXvzOF_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Alice 🤖 AI Bot </div> <div style="font-size: 15px">@nicedaysareweak bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@nicedaysareweak's tweets](https://twitter.com/nicedaysareweak). | Data | Quantity | | --- | --- | | Tweets downloaded | 1448 | | Retweets | 119 | | Short tweets | 97 | | Tweets kept | 1232 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/ffpnpv2f/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @nicedaysareweak's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/50bytich) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/50bytich/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/nicedaysareweak') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/nflfantasy
huggingtweets
2021-05-22T16:16:28Z
5
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/nflfantasy/1601268210358/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1157019440443039744/xkFjRwnR_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">NFL Fantasy Football 🤖 AI Bot </div> <div style="font-size: 15px">@nflfantasy bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@nflfantasy's tweets](https://twitter.com/nflfantasy). | Data | Quantity | | --- | --- | | Tweets downloaded | 3227 | | Retweets | 848 | | Short tweets | 265 | | Tweets kept | 2114 | [Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/30y9jcf1/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @nflfantasy's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/3a1kvcd4) for full transparency and reproducibility. At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/3a1kvcd4/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/nflfantasy') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/nfl
huggingtweets
2021-05-22T16:15:18Z
13
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/nfl/1615605347734/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1324024227528278016/wN_LQ_cj_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">NFL 🤖 AI Bot </div> <div style="font-size: 15px">@nfl bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@nfl's tweets](https://twitter.com/nfl). | Data | Quantity | | --- | --- | | Tweets downloaded | 3250 | | Retweets | 1068 | | Short tweets | 32 | | Tweets kept | 2150 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/7gm7ii1y/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @nfl's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/e9eunvix) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/e9eunvix/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/nfl') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/news8
huggingtweets
2021-05-22T16:09:08Z
4
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/news8/1613275881464/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1052648330956664832/ayJW5Xra_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">News 8 San Diego 🤖 AI Bot </div> <div style="font-size: 15px">@news8 bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@news8's tweets](https://twitter.com/news8). | Data | Quantity | | --- | --- | | Tweets downloaded | 3215 | | Retweets | 3 | | Short tweets | 2 | | Tweets kept | 3210 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/x7zifsh0/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @news8's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/jr7qo9sl) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/jr7qo9sl/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/news8') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/neuro_stack
huggingtweets
2021-05-22T16:06:35Z
5
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/neuro_stack/1616634401290/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1365043885823717376/-S-wwvpg_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">neurostack 🤖 AI Bot </div> <div style="font-size: 15px">@neuro_stack bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@neuro_stack's tweets](https://twitter.com/neuro_stack). | Data | Quantity | | --- | --- | | Tweets downloaded | 321 | | Retweets | 23 | | Short tweets | 19 | | Tweets kept | 279 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1mltk6iq/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @neuro_stack's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1n6m7afn) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1n6m7afn/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/neuro_stack') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/nestor_d
huggingtweets
2021-05-22T16:05:32Z
4
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/nestor_d/1616784666647/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1347888379/Narvpjedi_400x400.png')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Ṅ̮̖̦̬̬̬̼̓͊͂̾̂͆é͆ṡ͍̼̱̜̦̋̀t̡̯̭̝̮̍͑̐̽o̺͎͐ͫ̅̉͒̑̚r̋ͮ͗ 🤖 AI Bot </div> <div style="font-size: 15px">@nestor_d bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@nestor_d's tweets](https://twitter.com/nestor_d). | Data | Quantity | | --- | --- | | Tweets downloaded | 3239 | | Retweets | 232 | | Short tweets | 340 | | Tweets kept | 2667 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1iu58bxy/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @nestor_d's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/kyb41zp8) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/kyb41zp8/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/nestor_d') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/neonacho
huggingtweets
2021-05-22T16:01:48Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/neonacho/1616874127037/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/839830093686050816/c1WsoCk8_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">. 🤖 AI Bot </div> <div style="font-size: 15px">@neonacho bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@neonacho's tweets](https://twitter.com/neonacho). | Data | Quantity | | --- | --- | | Tweets downloaded | 3249 | | Retweets | 70 | | Short tweets | 888 | | Tweets kept | 2291 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/366dutzu/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @neonacho's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1ktsqptj) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1ktsqptj/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/neonacho') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/neil_jetter
huggingtweets
2021-05-22T15:55:47Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/neil_jetter/1616624365889/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1125464452529152000/8GSujJ8l_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Puzzle man 🧩 🤖 AI Bot </div> <div style="font-size: 15px">@neil_jetter bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@neil_jetter's tweets](https://twitter.com/neil_jetter). | Data | Quantity | | --- | --- | | Tweets downloaded | 481 | | Retweets | 117 | | Short tweets | 96 | | Tweets kept | 268 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/39dpbluj/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @neil_jetter's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3a7kufsc) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3a7kufsc/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/neil_jetter') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/nbthieves
huggingtweets
2021-05-22T15:54:39Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1296003688562130944/K_R9DCAP_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Nothing But Thieves 🤖 AI Bot </div> <div style="font-size: 15px">@nbthieves bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@nbthieves's tweets](https://twitter.com/nbthieves). | Data | Quantity | | --- | --- | | Tweets downloaded | 3159 | | Retweets | 959 | | Short tweets | 187 | | Tweets kept | 2013 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/lpdh8nfr/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @nbthieves's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/ml5d0ypp) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/ml5d0ypp/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/nbthieves') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
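The actual training code lives in the huggingtweets repository, and the real hyperparameters are recorded in the W&B run linked above. As a rough illustration of the procedure this card describes (fine-tuning GPT-2 on a corpus of cleaned tweets), a minimal `Trainer`-based sketch might look like the following; the file name and hyperparameters are assumptions, not the project's settings:

```python
from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

tokenizer = AutoTokenizer.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 ships without a pad token
model = AutoModelForCausalLM.from_pretrained("gpt2")

# "tweets.txt" is a hypothetical file with one cleaned tweet per line
dataset = load_dataset("text", data_files={"train": "tweets.txt"})
tokenized = dataset.map(
    lambda batch: tokenizer(batch["text"], truncation=True),
    batched=True,
    remove_columns=["text"],
)

# Causal LM objective: the collator derives labels from the inputs
collator = DataCollatorForLanguageModeling(tokenizer, mlm=False)
args = TrainingArguments(output_dir="out", num_train_epochs=4,
                         per_device_train_batch_size=8)
Trainer(model=model, args=args, train_dataset=tokenized["train"],
        data_collator=collator).train()
```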
huggingtweets/natashajaques
huggingtweets
2021-05-22T15:47:56Z
4
1
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: http://res.cloudinary.com/huggingtweets/image/upload/v1599942934/natashajaques.jpg tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('http://pbs.twimg.com/profile_images/1105961729987620864/Q7OBLflN_400x400.png')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Natasha Jaques 🤖 AI Bot </div> <div style="font-size: 15px">@natashajaques bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@natashajaques's tweets](https://twitter.com/natashajaques). | Data | Quantity | | --- | --- | | Tweets downloaded | 799 | | Retweets | 518 | | Short tweets | 23 | | Tweets kept | 258 | [Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/3ab9hmc0/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @natashajaques's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/3nw4qkaf) for full transparency and reproducibility. At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/3nw4qkaf/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/natashajaques') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/mxrtinli
huggingtweets
2021-05-22T15:39:26Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/mxrtinli/1616696860405/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1359637588630528004/SqovhhAH_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">martin 李润林 🤖 AI Bot </div> <div style="font-size: 15px">@mxrtinli bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@mxrtinli's tweets](https://twitter.com/mxrtinli). | Data | Quantity | | --- | --- | | Tweets downloaded | 375 | | Retweets | 105 | | Short tweets | 31 | | Tweets kept | 239 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/24avrm4e/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @mxrtinli's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1344ky2b) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1344ky2b/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/mxrtinli') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/mutual_ayyde
huggingtweets
2021-05-22T15:37:45Z
5
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/mutual_ayyde/1616654015077/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1357503109790978050/pkBmTm4h_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">mutual - dsa matrioshka brain caucus 🤖 AI Bot </div> <div style="font-size: 15px">@mutual_ayyde bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@mutual_ayyde's tweets](https://twitter.com/mutual_ayyde). | Data | Quantity | | --- | --- | | Tweets downloaded | 3226 | | Retweets | 341 | | Short tweets | 377 | | Tweets kept | 2508 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1078r58l/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @mutual_ayyde's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3240hia4) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3240hia4/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/mutual_ayyde') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
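The "Explore the data" link exposes the run's artifacts in the W&B UI; they can also be fetched programmatically with the public `wandb` client. A hedged sketch: the listing call is standard `wandb` API, but exactly which artifacts a given run logged is not guaranteed:

```python
import wandb

api = wandb.Api()
run = api.run("wandb/huggingtweets/1078r58l")  # run id from the link above

# List and download whatever artifacts this run logged
for artifact in run.logged_artifacts():
    print(artifact.name, artifact.type)
    artifact.download()
```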
huggingtweets/mutilumila
huggingtweets
2021-05-22T15:35:13Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/mutilumila/1616785118212/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1367580181171470336/VGbeIwgL_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">p a ' u l 🤖 AI Bot </div> <div style="font-size: 15px">@mutilumila bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@mutilumila's tweets](https://twitter.com/mutilumila). | Data | Quantity | | --- | --- | | Tweets downloaded | 3227 | | Retweets | 432 | | Short tweets | 618 | | Tweets kept | 2177 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2xkgonzr/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @mutilumila's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2oplbn5a) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2oplbn5a/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/mutilumila') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/mraofnull
huggingtweets
2021-05-22T15:23:05Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/mraofnull/1614169554638/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/897994820362416128/MUi78ucT_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Mraof 🤖 AI Bot </div> <div style="font-size: 15px">@mraofnull bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@mraofnull's tweets](https://twitter.com/mraofnull). | Data | Quantity | | --- | --- | | Tweets downloaded | 1389 | | Retweets | 486 | | Short tweets | 237 | | Tweets kept | 666 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2ostzmpk/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @mraofnull's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2sr0ddvm) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2sr0ddvm/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/mraofnull') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/mothsprite
huggingtweets
2021-05-22T15:19:14Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/mothsprite/1614115281996/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1360064787552608259/9-NoRXNL_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">carsen 💝 🤖 AI Bot </div> <div style="font-size: 15px">@mothsprite bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@mothsprite's tweets](https://twitter.com/mothsprite). | Data | Quantity | | --- | --- | | Tweets downloaded | 3168 | | Retweets | 563 | | Short tweets | 660 | | Tweets kept | 1945 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/31yl64zo/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @mothsprite's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/10118mvg) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/10118mvg/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/mothsprite') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/most_lamentable
huggingtweets
2021-05-22T15:17:59Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/most_lamentable/1618953821275/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1324108343217192960/6sVP_i_6_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">ACAB Rocky 🖤 🤖 AI Bot </div> <div style="font-size: 15px">@most_lamentable bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@most_lamentable's tweets](https://twitter.com/most_lamentable). | Data | Quantity | | --- | --- | | Tweets downloaded | 3155 | | Retweets | 2752 | | Short tweets | 63 | | Tweets kept | 340 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1qgfbjlk/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @most_lamentable's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3i44t2y3) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3i44t2y3/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/most_lamentable') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/mormo_music
huggingtweets
2021-05-22T15:16:51Z
5
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/mormo_music/1619264382586/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1309110567383322624/_bG1P3yC_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">zeta mask yo (42/?? years) 🤖 AI Bot </div> <div style="font-size: 15px">@mormo_music bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@mormo_music's tweets](https://twitter.com/mormo_music). | Data | Quantity | | --- | --- | | Tweets downloaded | 3247 | | Retweets | 178 | | Short tweets | 325 | | Tweets kept | 2744 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1hjkc8nh/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @mormo_music's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/8guhilo5) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/8guhilo5/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/mormo_music') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/morganstanley
huggingtweets
2021-05-22T15:15:39Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/morganstanley/1607110195449/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/767759044849336328/99u_IE90_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Morgan Stanley 🤖 AI Bot </div> <div style="font-size: 15px">@morganstanley bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@morganstanley's tweets](https://twitter.com/morganstanley). | Data | Quantity | | --- | --- | | Tweets downloaded | 3234 | | Retweets | 106 | | Short tweets | 1 | | Tweets kept | 3127 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3mn5apem/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @morganstanley's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1gcjvbjs) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1gcjvbjs/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/morganstanley') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/moratorias
huggingtweets
2021-05-22T15:14:29Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/moratorias/1614113587590/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1354820099107037197/5rPiix_w_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Indy 🤖 AI Bot </div> <div style="font-size: 15px">@moratorias bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@moratorias's tweets](https://twitter.com/moratorias). | Data | Quantity | | --- | --- | | Tweets downloaded | 3197 | | Retweets | 710 | | Short tweets | 339 | | Tweets kept | 2148 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1twsutkc/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @moratorias's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2qbw3sqa) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2qbw3sqa/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/moratorias') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/morallawwithin
huggingtweets
2021-05-22T15:12:26Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1374727712355577856/PsAz792x_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">florence bacus 💜 🤖 AI Bot </div> <div style="font-size: 15px">@morallawwithin bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@morallawwithin's tweets](https://twitter.com/morallawwithin). | Data | Quantity | | --- | --- | | Tweets downloaded | 3227 | | Retweets | 666 | | Short tweets | 491 | | Tweets kept | 2070 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3hnxbkm1/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @morallawwithin's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3ue5m0yh) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3ue5m0yh/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/morallawwithin') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/monodevice
huggingtweets
2021-05-22T15:08:48Z
4
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/monodevice/1616731608711/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1342210639595532289/_IT2n4Yn_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">🐜onio 🤖 AI Bot </div> <div style="font-size: 15px">@monodevice bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@monodevice's tweets](https://twitter.com/monodevice). | Data | Quantity | | --- | --- | | Tweets downloaded | 3246 | | Retweets | 28 | | Short tweets | 982 | | Tweets kept | 2236 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/28ckopf3/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @monodevice's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2gglhzx0) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2gglhzx0/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/monodevice') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/molleindustria
huggingtweets
2021-05-22T15:04:01Z
5
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/molleindustria/1607297976960/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1093212724/logo_small_400x400.png')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Paolo Pedercini 🤖 AI Bot </div> <div style="font-size: 15px">@molleindustria bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@molleindustria's tweets](https://twitter.com/molleindustria). | Data | Quantity | | --- | --- | | Tweets downloaded | 3240 | | Retweets | 376 | | Short tweets | 172 | | Tweets kept | 2692 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/r51uy9bs/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @molleindustria's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1cdzfc0q) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1cdzfc0q/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/molleindustria') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/molassesgrey
huggingtweets
2021-05-22T15:02:59Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- language: en thumbnail: https://www.huggingtweets.com/molassesgrey/1614173478568/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1362448847746830336/iwo39ze1_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">David Foster Winnie 🤖 AI Bot </div> <div style="font-size: 15px">@molassesgrey bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@molassesgrey's tweets](https://twitter.com/molassesgrey). | Data | Quantity | | --- | --- | | Tweets downloaded | 3159 | | Retweets | 1239 | | Short tweets | 290 | | Tweets kept | 1630 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3ve0e5vf/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @molassesgrey's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/24eh8794) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/24eh8794/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/molassesgrey') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/ml_nlp
huggingtweets
2021-05-22T14:58:50Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
---
language: en
thumbnail: https://www.huggingtweets.com/ml_nlp/1606838395922/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<link rel="stylesheet" href="https://unpkg.com/@tailwindcss/[email protected]/dist/typography.min.css">

<style>
@media (prefers-color-scheme: dark) {
.prose { color: #E2E8F0 !important; }
.prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; }
}
</style>

<section class='prose'>

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/483993250470969344/_hfa_iHG_400x400.jpeg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Machine Learning and NLP 🤖 AI Bot </div>
<div style="font-size: 15px; color: #657786">@ml_nlp bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@ml_nlp's tweets](https://twitter.com/ml_nlp).

<table style='border-width:0'>
<thead style='border-width:0'>
<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>
<th style='border-width:0'>Data</th>
<th style='border-width:0'>Quantity</th>
</tr>
</thead>
<tbody style='border-width:0'>
<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>
<td style='border-width:0'>Tweets downloaded</td>
<td style='border-width:0'>1669</td>
</tr>
<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>
<td style='border-width:0'>Retweets</td>
<td style='border-width:0'>185</td>
</tr>
<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>
<td style='border-width:0'>Short tweets</td>
<td style='border-width:0'>13</td>
</tr>
<tr style='border-width:0'>
<td style='border-width:0'>Tweets kept</td>
<td style='border-width:0'>1471</td>
</tr>
</tbody>
</table>

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1us77dfn/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @ml_nlp's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3kg0h84e) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3kg0h84e/artifacts) is logged and versioned.

## Intended uses & limitations

### How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingtweets/ml_nlp')
generator("My dream is", num_return_sequences=5)
```

### Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

</section>

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

<section class='prose'>

For more details, visit the project repository.

</section>

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
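If you prefer to work below the pipeline abstraction, you can load the tokenizer and model directly. A short sketch (passing `pad_token_id` explicitly only silences a warning, since GPT-2 has no padding token):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained('huggingtweets/ml_nlp')
model = AutoModelForCausalLM.from_pretrained('huggingtweets/ml_nlp')

inputs = tokenizer("My dream is", return_tensors="pt")
# GPT-2 has no pad token, so reuse the end-of-text token to avoid a warning.
outputs = model.generate(**inputs, do_sample=True, max_length=60,
                         pad_token_id=tokenizer.eos_token_id)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```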
huggingtweets/mitchellsolomo1
huggingtweets
2021-05-22T14:55:21Z
4
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
---
language: en
thumbnail: https://www.huggingtweets.com/mitchellsolomo1/1614098943754/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1354235179892674562/Ku6uOc6K_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Mitchell Solomon 🤖 AI Bot </div>
<div style="font-size: 15px">@mitchellsolomo1 bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@mitchellsolomo1's tweets](https://twitter.com/mitchellsolomo1).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 243 |
| Retweets | 38 |
| Short tweets | 25 |
| Tweets kept | 180 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3du8kd6m/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @mitchellsolomo1's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3duwyidn) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3duwyidn/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingtweets/mitchellsolomo1')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/mit_csail
huggingtweets
2021-05-22T14:53:45Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
---
language: en
thumbnail: https://www.huggingtweets.com/mit_csail/1620429689752/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/885505956272115712/U81HpDxb_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">MIT CSAIL 🤖 AI Bot </div>
<div style="font-size: 15px">@mit_csail bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@mit_csail's tweets](https://twitter.com/mit_csail).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3226 |
| Retweets | 105 |
| Short tweets | 44 |
| Tweets kept | 3077 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/nj6zg8vq/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @mit_csail's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1vkl4au0) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1vkl4au0/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingtweets/mit_csail')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
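Since generation is stochastic, fixing the random seed makes outputs repeatable; a minimal sketch using the `set_seed` helper shipped with `transformers`:

```python
from transformers import pipeline, set_seed

set_seed(42)  # seeds Python, NumPy and PyTorch RNGs for repeatable sampling
generator = pipeline('text-generation', model='huggingtweets/mit_csail')
generator("My dream is", num_return_sequences=5)
```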
huggingtweets/mistykrueger
huggingtweets
2021-05-22T14:52:29Z
5
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
---
language: en
thumbnail: https://www.huggingtweets.com/mistykrueger/1619113130071/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1365730731180363785/qqDYQuLX_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Dr. Misty Krueger 🤖 AI Bot </div>
<div style="font-size: 15px">@mistykrueger bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@mistykrueger's tweets](https://twitter.com/mistykrueger).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 2056 |
| Retweets | 313 |
| Short tweets | 323 |
| Tweets kept | 1420 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/29y7s3fq/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @mistykrueger's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/t7fw1d2s) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/t7fw1d2s/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingtweets/mistykrueger')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/misogenist
huggingtweets
2021-05-22T14:50:20Z
4
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
---
language: en
thumbnail: https://www.huggingtweets.com/misogenist/1617971482479/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1231843464532221952/sTSwvexI_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">💊 🤖 AI Bot </div>
<div style="font-size: 15px">@misogenist bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@misogenist's tweets](https://twitter.com/misogenist).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3199 |
| Retweets | 252 |
| Short tweets | 1022 |
| Tweets kept | 1925 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1iudua4o/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @misogenist's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1kn4lk1o) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1kn4lk1o/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingtweets/misogenist')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/milligram3d
huggingtweets
2021-05-22T14:46:20Z
4
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
---
language: en
thumbnail: https://www.huggingtweets.com/milligram3d/1616791387103/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1329940613718949888/ta7GE35b_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">im gay 🤖 AI Bot </div>
<div style="font-size: 15px">@milligram3d bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@milligram3d's tweets](https://twitter.com/milligram3d).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3102 |
| Retweets | 514 |
| Short tweets | 267 |
| Tweets kept | 2321 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2b28e9ko/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @milligram3d's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2dnn0apc) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2dnn0apc/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingtweets/milligram3d')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
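By default the pipeline runs on CPU; if a CUDA GPU is available, you can pass a device index to speed up generation:

```python
from transformers import pipeline

# device=0 places the model on the first CUDA GPU; use -1 (the default) for CPU.
generator = pipeline('text-generation',
                     model='huggingtweets/milligram3d',
                     device=0)
generator("My dream is", num_return_sequences=5)
```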
huggingtweets/milezmarkus
huggingtweets
2021-05-22T14:45:13Z
5
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
---
language: en
thumbnail: https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1364075918327746560/jG0rQra-_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Miles Markus 🤖 AI Bot </div>
<div style="font-size: 15px">@milezmarkus bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@milezmarkus's tweets](https://twitter.com/milezmarkus).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3164 |
| Retweets | 1121 |
| Short tweets | 203 |
| Tweets kept | 1840 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3sb1xj7c/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @milezmarkus's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/16cneqjr) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/16cneqjr/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingtweets/milezmarkus')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
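To clarify how the rows in the table above relate: retweets and short tweets are removed from the downloaded tweets, and only the remainder ("Tweets kept") is used for fine-tuning. The sketch below is a hypothetical reconstruction of that filter for illustration only; `MIN_CHARS` and `keep_tweet` are invented names, and the real preprocessing lives in the project repository:

```python
MIN_CHARS = 10  # hypothetical cutoff; the project's actual threshold may differ

def keep_tweet(text: str) -> bool:
    """Illustrative filter: drop retweets and very short tweets."""
    if text.startswith("RT @"):   # counted under "Retweets"
        return False
    if len(text) < MIN_CHARS:     # counted under "Short tweets"
        return False
    return True

tweets = ["RT @someone: interesting paper", "ok", "My dream is to build tweet bots"]
print([t for t in tweets if keep_tweet(t)])  # only the last tweet is kept
```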
huggingtweets/mild_lakes
huggingtweets
2021-05-22T14:42:41Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
---
language: en
thumbnail: https://www.huggingtweets.com/mild_lakes/1614174488992/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1345777271240617987/wwqcknPt_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Mild 🤖 AI Bot </div>
<div style="font-size: 15px">@mild_lakes bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@mild_lakes's tweets](https://twitter.com/mild_lakes).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 2207 |
| Retweets | 517 |
| Short tweets | 601 |
| Tweets kept | 1089 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/30nz4ixw/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @mild_lakes's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/122k4eob) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/122k4eob/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingtweets/mild_lakes')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/mikrodystopies
huggingtweets
2021-05-22T14:41:31Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
---
language: en
thumbnail: https://www.huggingtweets.com/mikrodystopies/1604658435538/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<link rel="stylesheet" href="https://unpkg.com/@tailwindcss/[email protected]/dist/typography.min.css">

<style>
@media (prefers-color-scheme: dark) {
.prose { color: #E2E8F0 !important; }
.prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; }
}
</style>

<section class='prose'>

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1313931951791902720/P5xuzPnM_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Mikrodystopies 🤖 🤖 AI Bot </div>
<div style="font-size: 15px; color: #657786">@mikrodystopies bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@mikrodystopies's tweets](https://twitter.com/mikrodystopies).

<table style='border-width:0'>
<thead style='border-width:0'>
<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>
<th style='border-width:0'>Data</th>
<th style='border-width:0'>Quantity</th>
</tr>
</thead>
<tbody style='border-width:0'>
<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>
<td style='border-width:0'>Tweets downloaded</td>
<td style='border-width:0'>1353</td>
</tr>
<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>
<td style='border-width:0'>Retweets</td>
<td style='border-width:0'>14</td>
</tr>
<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>
<td style='border-width:0'>Short tweets</td>
<td style='border-width:0'>3</td>
</tr>
<tr style='border-width:0'>
<td style='border-width:0'>Tweets kept</td>
<td style='border-width:0'>1336</td>
</tr>
</tbody>
</table>

[Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/3ujepu0f/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @mikrodystopies's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/6omc5zso) for full transparency and reproducibility.

At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/6omc5zso/artifacts) is logged and versioned.

## Intended uses & limitations

### How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingtweets/mikrodystopies')
generator("My dream is", num_return_sequences=5)
```

### Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

</section>

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

<section class='prose'>

For more details, visit the project repository.

</section>

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
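The training procedure above is described but not shown. For orientation, here is a generic `Trainer`-based sketch of fine-tuning GPT-2 on a text file of tweets (one per line in an assumed `tweets.txt`); it is a simplified stand-in, not the huggingtweets project's exact recipe, and the hyperparameter values are placeholders:

```python
from transformers import (DataCollatorForLanguageModeling, GPT2LMHeadModel,
                          GPT2TokenizerFast, TextDataset, Trainer,
                          TrainingArguments)

tokenizer = GPT2TokenizerFast.from_pretrained('gpt2')
model = GPT2LMHeadModel.from_pretrained('gpt2')

# Assumes the kept tweets were written one per line to 'tweets.txt'.
dataset = TextDataset(tokenizer=tokenizer, file_path='tweets.txt', block_size=128)
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)

args = TrainingArguments(output_dir='output', overwrite_output_dir=True,
                         num_train_epochs=4,  # placeholder, not the real value
                         per_device_train_batch_size=8)
Trainer(model=model, args=args, data_collator=collator,
        train_dataset=dataset).train()
```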
huggingtweets/mikekyismad
huggingtweets
2021-05-22T14:39:53Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
---
language: en
thumbnail: https://www.huggingtweets.com/mikekyismad/1616782600007/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1364645809728315393/XaERYHCb_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Mikeky Mckekerson 🤖 AI Bot </div>
<div style="font-size: 15px">@mikekyismad bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@mikekyismad's tweets](https://twitter.com/mikekyismad).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 576 |
| Retweets | 11 |
| Short tweets | 198 |
| Tweets kept | 367 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/e9p5fru6/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @mikekyismad's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/tq9x0dms) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/tq9x0dms/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingtweets/mikekyismad')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/mike_massive
huggingtweets
2021-05-22T14:37:44Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
---
language: en
thumbnail: https://www.huggingtweets.com/mike_massive/1614095900280/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1299026526999019520/erdLPGsC_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">mike insane 🤖 AI Bot </div>
<div style="font-size: 15px">@mike_massive bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@mike_massive's tweets](https://twitter.com/mike_massive).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 434 |
| Retweets | 67 |
| Short tweets | 73 |
| Tweets kept | 294 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/sk151pwp/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @mike_massive's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/rli832dd) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/rli832dd/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingtweets/mike_massive')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/micky_cow
huggingtweets
2021-05-22T14:28:03Z
6
0
transformers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
---
language: en
thumbnail: https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1362790988356464645/TGSSbvT0_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Micky the cow 🤖 AI Bot </div>
<div style="font-size: 15px">@micky_cow bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@micky_cow's tweets](https://twitter.com/micky_cow).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 135 |
| Retweets | 0 |
| Short tweets | 15 |
| Tweets kept | 120 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/ugkdnx6z/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @micky_cow's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2jfh2mjg) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2jfh2mjg/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingtweets/micky_cow')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
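The pipeline also accepts a list of prompts, returning one list of candidates per prompt; a small sketch (the second prompt is just an example string):

```python
from transformers import pipeline

generator = pipeline('text-generation', model='huggingtweets/micky_cow')

prompts = ["My dream is", "Life on the farm is"]  # example prompts
for candidates in generator(prompts, num_return_sequences=2, max_length=40):
    for candidate in candidates:
        print(candidate['generated_text'])
```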