---
language: multilingual
license: mit
tags:
- transformer
- summarization
- translation
- question-answering
- english
- arabic
datasets:
- miscovery/arabic_egypt_english_world_facts
pipeline_tag: summarization
library_name: transformers
---

# Miscovery Transformer Model

This is a transformer-based encoder-decoder model for multiple NLP tasks:
- Text summarization
- Translation (English-Arabic)
- Question-answering

## Model Architecture

- Model type: miscovery
- Number of parameters: 485,674,144 (~486M)
- Encoder layers: 12
- Decoder layers: 12
- Attention heads: 12
- Hidden size: 768
- Feed-forward size: 3072
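
Since `miscovery` is a custom model type registered with `transformers`, the hyperparameters above should be visible on the model's config object. A minimal sketch, assuming the repository ships custom modeling code (hence `trust_remote_code=True`):

```python
from transformers import AutoConfig

# Assumption: the custom "miscovery" model type needs trust_remote_code=True
# so transformers can load the repository's own configuration class.
config = AutoConfig.from_pretrained("miscovery/model", trust_remote_code=True)
print(config)  # should echo the layer counts and hidden sizes listed above
```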

## Training

The model was trained in two stages:
1. Pre-training on sentence rearrangement tasks (see the sketch below)
2. Fine-tuning on downstream tasks
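
The exact pre-training data format is not documented here; the sketch below is an illustrative guess at how a sentence-rearrangement example could be constructed, pairing shuffled sentences (input) with their original order (target):

```python
import random

# Illustrative only: the real pre-training format may differ.
sentences = [
    "World War I began in 1914.",
    "It ended in 1918.",
    "The armistice was signed in November.",
]
shuffled = sentences[:]
random.shuffle(shuffled)

source = " ".join(shuffled)   # model input: sentences in scrambled order
target = " ".join(sentences)  # training target: sentences in original order
print(source, "->", target)
```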

## Usage

1. Install the package:

```bash
pip install miscovery-model
```

2. Load the pipeline and run the model:

```python
from miscovery_model import standard_pipeline

# Create a pipeline
model = standard_pipeline("miscovery/model")

# Run inference; this example asks for an English-to-Arabic translation
result = model("Translate this to Arabic: What year did World War I begin?")
print(result)
```
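
The same pipeline is advertised for summarization and question-answering. The prompt prefixes below are assumptions modeled on the translation example above, not documented formats; check the model repository for the exact task prefixes:

```python
# Assumed prompt formats; verify against the model repository.
summary = model(
    "Summarize this: World War I began in 1914 and ended in 1918, "
    "reshaping the political map of Europe."
)
print(summary)

answer = model("Answer this: What year did World War I begin?")
print(answer)
```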

## Limitations

This model was trained on a narrow set of datasets, notably `miscovery/arabic_egypt_english_world_facts`, and may not generalize well to other domains.