Dataset Viewer
Auto-converted to Parquet
The viewer lists the following columns; the type and the value range or distinct-value count refer to the displayed rows (English glosses in parentheses):

| Column | Type | Displayed values |
|---|---|---|
| T | string | 1 distinct value |
| Modelo (model) | string | 6 distinct values |
| Tipo (type) | string | 1 distinct value |
| Arquitetura (architecture) | string | 3 distinct values |
| Tipo de Peso (weight type) | string | 1 distinct value |
| Precisão (precision) | string | 1 distinct value |
| Licença (license) | string | 2 distinct values |
| #Params (B) | float64 | 0.49–7 |
| Hub Likes | int64 | 0–613 |
| Disponível no hub (available on the Hub) | bool | 2 classes |
| SHA do modelo (model SHA) | string | 4 distinct values |
| Área do Direito (law) | float64 | 0.31–0.52 |
| Área Médica (medicine) | float64 | 0–0 |
| Computação (computing) | float64 | 0–0 |
| Discurso de Ódio (hate speech) | float64 | 0.44–0.77 |
| Economia e Contabilidade (economics and accounting) | float64 | 0–0 |
| Multidisciplinar (multidisciplinary) | float64 | 0.38–0.7 |
| Provas Militares (military exams) | float64 | 0–0 |
| Semântica e Inferência (semantics and inference) | float64 | 0.42–0.86 |
| HateBR | float64 | 0–0.87 |
| PT Hate Speech | float64 | 0.63–0.77 |
| tweetSentBR | float64 | 0.17–0.71 |
| OAB | float64 | 0.31–0.52 |
| Revalida | float64 | 0–0 |
| MREX | float64 | 0–0 |
| ENAM | float64 | 0–0 |
| AFA | float64 | 0–0 |
| ITA | float64 | 0–0 |
| IME | float64 | 0–0 |
| POSCOMP | float64 | 0–0 |
| OBI | float64 | 0–0 |
| BCB | float64 | 0–0 |
| CFCES | float64 | 0–0 |
| ASSIN2 RTE | float64 | 0.76–0.92 |
| ASSIN2 STS | float64 | 0.01–0.73 |
| FAQUAD NLI | float64 | 0.49–0.97 |
| BLUEX | float64 | 0.38–0.66 |
| ENEM | float64 | 0.38–0.75 |
| CNPU | float64 | 0–0 |
| ENADE | float64 | 0–0 |
| BNDES | float64 | 0–0 |
| CACD (1ª fase) | float64 | 0–0 |
| CACD (2ª fase) | float64 | 0–0 |
| Média Geral (overall average) | float64 | 0.4–0.75 |
| Datasets Área Médica | string | 1 distinct value |
| Datasets Área do Direito | string | 1 distinct value |
| Datasets Provas Militares | string | 1 distinct value |
| Datasets Computação | string | 1 distinct value |
| Datasets Discurso de Ódio | string | 1 distinct value |
| Datasets Economia e Contabilidade | string | 1 distinct value |
| Datasets Semântica e Inferência | string | 1 distinct value |
| Datasets Multidisciplinar | string | 1 distinct value |
| energy_dataset | float64 | 0.5–0.5 |
| reasoning_dataset | float64 | 0.5–0.5 |
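Because the data has been auto-converted to Parquet, the rows can be loaded directly with the Hugging Face `datasets` library. A minimal sketch, assuming a hypothetical repository id `org-name/dataset-name` and a `train` split (neither is shown on this page, so substitute the real values):

```python
# Minimal sketch: load the auto-converted Parquet data with the "datasets"
# library and inspect the columns listed above.
# NOTE: "org-name/dataset-name" and split="train" are placeholders, not the
# dataset's actual id/split -- replace them with the real ones from the Hub.
from datasets import load_dataset

ds = load_dataset("org-name/dataset-name", split="train")

print(ds.column_names)  # e.g. ["T", "Modelo", "Tipo", ..., "Média Geral", ...]
print(ds.num_rows)

# Sort the leaderboard by the overall average and show the top entries.
best = ds.sort("Média Geral", reverse=True)
for row in best.select(range(3)):
    print(row["Modelo"], row["Média Geral"])
```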
The viewer shows ten rows. In every displayed row, `T` is `SFT`, `Tipo` is `SFT : Supervised Finetuning`, `Tipo de Peso` is `Original`, `Precisão` is `BF16`, and `energy_dataset` and `reasoning_dataset` are both `0.5`. Rows 3, 5 and 8 are identical, rows 4 and 7 are identical, and rows 6 and 9 differ only in `Hub Likes`.

Model metadata:

| # | Modelo | Arquitetura | Licença | #Params (B) | Hub Likes | Disponível no hub | SHA do modelo |
|---|---|---|---|---|---|---|---|
| 1 | qwen2.5-7B-1E_fulltrain | N/A | qwen-research | 7 | 0 | false | N/A |
| 2 | qwen2.5-7B-2E_fulltrain | N/A | qwen-research | 7 | 0 | false | N/A |
| 3 | Qwen/Qwen2.5-7B-Instruct | N/A | qwen-research | 7 | 0 | false | N/A |
| 4 | Qwen/Qwen2.5-0.5B-Instruct | Qwen2ForCausalLM | qwen-research | 0.494 | 329 | true | 7ae557604adf67be50417f59c2c2f167def9a775 |
| 5 | Qwen/Qwen2.5-7B-Instruct | N/A | qwen-research | 7 | 0 | false | N/A |
| 6 | CEIA-UFG/Gemma-3-Gaia-PT-BR-4b-it | Gemma3ForConditionalGeneration | gemma-research | 4.3 | 26 | true | a5758227f60fce2f4feb8d1479c3c12609b92cbb |
| 7 | Qwen/Qwen2.5-0.5B-Instruct | Qwen2ForCausalLM | qwen-research | 0.494 | 329 | true | 7ae557604adf67be50417f59c2c2f167def9a775 |
| 8 | Qwen/Qwen2.5-7B-Instruct | N/A | qwen-research | 7 | 0 | false | N/A |
| 9 | CEIA-UFG/Gemma-3-Gaia-PT-BR-4b-it | Gemma3ForConditionalGeneration | gemma-research | 4.3 | 46 | true | a5758227f60fce2f4feb8d1479c3c12609b92cbb |
| 10 | google/gemma-3-4b-it | Gemma3ForConditionalGeneration | gemma-research | 4.3 | 613 | true | 093f9f388b31de276ce2de164bdc2081324b9767 |

Category scores (Área Médica, Computação, Economia e Contabilidade and Provas Militares are 0 in every displayed row):

| # | Modelo | Área do Direito | Discurso de Ódio | Multidisciplinar | Semântica e Inferência | Média Geral |
|---|---|---|---|---|---|---|
| 1 | qwen2.5-7B-1E_fulltrain | 0.391101 | 0.541988 | 0.494075 | 0.858661 | 0.620133 |
| 2 | qwen2.5-7B-2E_fulltrain | 0.386425 | 0.660035 | 0.498106 | 0.809891 | 0.643602 |
| 3 | Qwen/Qwen2.5-7B-Instruct | 0.518703 | 0.769782 | 0.703835 | 0.847292 | 0.753066 |
| 4 | Qwen/Qwen2.5-0.5B-Instruct | 0.311111 | 0.437037 | 0.377778 | 0.417116 | 0.403236 |
| 5 | Qwen/Qwen2.5-7B-Instruct | 0.518703 | 0.769782 | 0.703835 | 0.847292 | 0.753066 |
| 6 | CEIA-UFG/Gemma-3-Gaia-PT-BR-4b-it | 0.440573 | 0.737568 | 0.563194 | 0.784777 | 0.681555 |
| 7 | Qwen/Qwen2.5-0.5B-Instruct | 0.311111 | 0.437037 | 0.377778 | 0.417116 | 0.403236 |
| 8 | Qwen/Qwen2.5-7B-Instruct | 0.518703 | 0.769782 | 0.703835 | 0.847292 | 0.753066 |
| 9 | CEIA-UFG/Gemma-3-Gaia-PT-BR-4b-it | 0.440573 | 0.737568 | 0.563194 | 0.784777 | 0.681555 |
| 10 | google/gemma-3-4b-it | 0.444042 | 0.728994 | 0.568899 | 0.774315 | 0.676863 |

Per-task scores (Revalida, MREX, ENAM, AFA, ITA, IME, POSCOMP, OBI, BCB, CFCES, CNPU, ENADE, BNDES, CACD (1ª fase) and CACD (2ª fase) are 0 in every displayed row):

| # | HateBR | PT Hate Speech | tweetSentBR | OAB | ASSIN2 RTE | ASSIN2 STS | FAQUAD NLI | BLUEX | ENEM |
|---|---|---|---|---|---|---|---|---|---|
| 1 | 0.719101 | 0.733728 | 0.173134 | 0.391101 | 0.921053 | 0.687538 | 0.967391 | 0.441828 | 0.546322 |
| 2 | 0.775672 | 0.745174 | 0.459258 | 0.386425 | 0.919182 | 0.665711 | 0.844781 | 0.451985 | 0.544227 |
| 3 | 0.82935 | 0.771373 | 0.708624 | 0.518703 | 0.923559 | 0.726579 | 0.891739 | 0.656971 | 0.750698 |
| 4 | 0 | 0.666667 | 0.644444 | 0.311111 | 0.755556 | 0.006902 | 0.488889 | 0.377778 | 0.377778 |
| 5 | 0.82935 | 0.771373 | 0.708624 | 0.518703 | 0.923559 | 0.726579 | 0.891739 | 0.656971 | 0.750698 |
| 6 | 0.867857 | 0.649157 | 0.695688 | 0.440573 | 0.888599 | 0.710348 | 0.755385 | 0.500923 | 0.625466 |
| 7 | 0 | 0.666667 | 0.644444 | 0.311111 | 0.755556 | 0.006902 | 0.488889 | 0.377778 | 0.377778 |
| 8 | 0.82935 | 0.771373 | 0.708624 | 0.518703 | 0.923559 | 0.726579 | 0.891739 | 0.656971 | 0.750698 |
| 9 | 0.867857 | 0.649157 | 0.695688 | 0.440573 | 0.888599 | 0.710348 | 0.755385 | 0.500923 | 0.625466 |
| 10 | 0.86211 | 0.629847 | 0.695025 | 0.444042 | 0.879315 | 0.705681 | 0.737949 | 0.500462 | 0.637337 |

Every displayed row carries the same category-to-dataset mapping in its `Datasets …` columns:

- Datasets Área Médica: Revalida, MREX
- Datasets Área do Direito: OAB, ENAM
- Datasets Provas Militares: AFA, ITA, IME
- Datasets Computação: POSCOMP, OBI
- Datasets Discurso de Ódio: HateBR, PT Hate Speech, tweetSentBR
- Datasets Economia e Contabilidade: BCB, CFCES
- Datasets Semântica e Inferência: FAQUAD NLI, ASSIN2 RTE, ASSIN2 STS
- Datasets Multidisciplinar: ENEM, BLUEX, CNPU, ENADE, BNDES, CACD (1ª fase), CACD (2ª fase)
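In the displayed rows, each category score matches the plain mean of the evaluated tasks in that category (per the `Datasets …` mapping above), and `Média Geral` matches the mean of the nine task columns that carry scores; the remaining task columns appear not to have been evaluated here and are stored as 0. This is an observation about these ten rows, not a documented definition. A minimal sketch that reproduces the first row's aggregates under that assumption:

```python
# Minimal sketch: reproduce row 1's category scores and "Média Geral",
# assuming each aggregate is the plain mean of the evaluated task scores.
# Task scores are copied from the first displayed row (qwen2.5-7B-1E_fulltrain);
# the task-to-category grouping follows the "Datasets ..." columns.

row = {
    "HateBR": 0.719101, "PT Hate Speech": 0.733728, "tweetSentBR": 0.173134,
    "OAB": 0.391101,
    "ASSIN2 RTE": 0.921053, "ASSIN2 STS": 0.687538, "FAQUAD NLI": 0.967391,
    "BLUEX": 0.441828, "ENEM": 0.546322,
}

categories = {  # only the categories with evaluated tasks in the displayed rows
    "Discurso de Ódio": ["HateBR", "PT Hate Speech", "tweetSentBR"],
    "Área do Direito": ["OAB"],
    "Semântica e Inferência": ["FAQUAD NLI", "ASSIN2 RTE", "ASSIN2 STS"],
    "Multidisciplinar": ["ENEM", "BLUEX"],
}

def mean(values):
    return sum(values) / len(values)

for name, tasks in categories.items():
    print(f"{name}: {mean([row[t] for t in tasks]):.6f}")
print(f"Média Geral: {mean(list(row.values())):.6f}")
# Matches the table: 0.541988, 0.391101, 0.858661, 0.494075 and 0.620133.
```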
README.md exists but content is empty.
Downloads last month: 8