YashikaNagpal committed on
Commit 5512374 · verified · 1 Parent(s): ee31b37

Update README.md

Files changed (1)
  1. README.md +5 -15

README.md CHANGED
````diff
@@ -34,15 +34,15 @@ T5ForConditionalGeneration(
   )
   (lm_head): Linear(in_features=768, out_features=32100, bias=False)
 )
-```
 
+```bash
 pip install -U transformers torch datasets
-Then, load the model and run inference:
-
+#Then, load the model and run inference:
 ```
 from transformers import T5ForConditionalGeneration, RobertaTokenizer
 
-# Download from the 🤗 Hub (replace with your model ID after uploading)
+# Download from the 🤗 Hub
+```python
 model_name = "your-username/codet5-conala-comments" # Update with your HF model ID
 tokenizer = RobertaTokenizer.from_pretrained(model_name)
 model = T5ForConditionalGeneration.from_pretrained(model_name)
@@ -65,8 +65,8 @@ comment = tokenizer.decode(outputs[0], skip_special_tokens=True)
 print(f"Code: {code_snippet}")
 print(f"Comment: {comment}")
 # Expected output: Something close to "Concatenate elements of a list 'x' of multiple integers to a single integer"
-
 ```
+
 # Training Details
 Training Dataset
 **Name:** janrauhl/conala
@@ -99,13 +99,3 @@ Non-Default Hyperparameters:
 **learning_rate:** 1e-4
 **fp16:** True
 
-```
-@article{wang2021codet5,
-  title={CodeT5: Identifier-aware Unified Pre-trained Encoder-Decoder Models for Code Understanding and Generation},
-  author={Wang, Yue and Wang, Weishi and Joty, Shafiq and Hoi, Steven C. H.},
-  journal={Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing},
-  year={2021},
-  url={https://arxiv.org/abs/2109.00859}
-}
-
-```
````