---
title: Augmented Poetry
emoji: 
colorFrom: red
colorTo: gray
sdk: gradio
sdk_version: 3.4
app_file: app.py
pinned: false
---

- fine-tune a large language model (LLM) on the text corpus of a specific poet
- select a certain rhyme from the Gutenberg corpus and fine-tune on that
- try fine-tuning on a few lines of a poem that Eva has started
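For the fine-tuning experiments above, the corpus first has to be turned into fixed-size training examples. A minimal sketch of that preparation step, mirroring the grouping logic used in the Hugging Face language-modeling examples (the helper name `chunk_token_ids` and the toy token ids are assumptions, not the real tokenizer output):

```python
# Sketch of the corpus-preparation step for causal-LM fine-tuning,
# assuming the poet's corpus has already been tokenized into a flat
# list of token ids. `chunk_token_ids` is a hypothetical helper.

BLOCK_SIZE = 128  # context length of each training example

def chunk_token_ids(token_ids, block_size=BLOCK_SIZE):
    """Concatenate the tokenized corpus and split it into fixed-size
    blocks, dropping the ragged remainder (as the Hugging Face
    language-modeling example scripts do)."""
    n = (len(token_ids) // block_size) * block_size
    return [token_ids[i:i + block_size] for i in range(0, n, block_size)]

# Toy stand-in for real tokenizer output:
ids = list(range(300))
blocks = chunk_token_ids(ids)
print(len(blocks), len(blocks[0]))  # → 2 128
```

Each block then serves as one training example, with the labels equal to the inputs shifted by one token.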

Run the app in a Docker container so it can be transferred to another machine.

Would it be better to train a sequence-to-sequence transformer on successive lines of the poetry corpus?
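If the seq2seq framing is pursued, the training data would be (source, target) pairs of successive lines, so the model learns line → next-line. A minimal sketch (the helper name `make_line_pairs` is an assumption):

```python
# Turn a poem into (source, target) pairs of successive lines for
# seq2seq training. `make_line_pairs` is a hypothetical helper name.

def make_line_pairs(poem_text):
    """Pair each non-empty line with the line that follows it."""
    lines = [ln.strip() for ln in poem_text.splitlines() if ln.strip()]
    return list(zip(lines[:-1], lines[1:]))

poem = """Shall I compare thee to a summer's day?
Thou art more lovely and more temperate:
Rough winds do shake the darling buds of May,"""
for src, tgt in make_line_pairs(poem):
    print(src, "->", tgt)
```

The same pairs could feed a causal LM instead by concatenating source and target with a separator, which may be simpler than maintaining a separate encoder-decoder model.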

<https://github.com/huggingface/transformers/tree/main/examples/pytorch/language-modeling>

The `merve/poetry` dataset only has 573 rows.

TODO:

- upload the Gutenberg Poetry Corpus to Hugging Face
- ask its creator first

## Research

<https://github.com/aparrish/gutenberg-dammit/>
Implement language generation with a basic transformer.

<https://github.com/aparrish/gutenberg-poetry-corpus>
Gutenberg Poetry Autocomplete, a search engine-like interface for writing poems mined from Project Gutenberg. (A poem written using this interface was recently published in the Indianapolis Review!)
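The Gutenberg Poetry Corpus is distributed as gzipped newline-delimited JSON. A minimal sketch of extracting the verse lines, assuming (as an unverified assumption about the format) that each record stores the line of verse under `"s"` and the source book's Project Gutenberg ID under `"gid"`:

```python
import json

# Extract verse lines from newline-delimited JSON records.
# Assumption: each record looks like {"s": "<line of verse>", "gid": "<id>"}.

def lines_from_ndjson(ndjson_text):
    """Return the verse lines from a chunk of newline-delimited JSON."""
    return [json.loads(row)["s"] for row in ndjson_text.splitlines() if row.strip()]

sample = (
    '{"s": "The woods are lovely, dark and deep,", "gid": "748"}\n'
    '{"s": "But I have promises to keep,", "gid": "748"}'
)
print(lines_from_ndjson(sample))
```

In practice the real file would be opened with `gzip.open(..., "rt")` and streamed line by line rather than loaded as one string.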

<https://ymeadows.com/en-articles/fine-tuning-transformer-based-language-models>
<https://thegradient.pub/prompting/>
<https://towardsdatascience.com/fine-tuning-for-domain-adaptation-in-nlp-c47def356fd6>
<https://ruder.io/recent-advances-lm-fine-tuning/>

<https://streamlit.io/>