---
license: mit
tags:
- text-generation-inference
- q&a
- italian
- mamba
- question answering
language:
- it
pipeline_tag: question-answering
---

# Question Answering Generative Model
This model differs from DeepMount00/Mamba-QA-ITA in scale and performance: it has approximately 790 million parameters, up from the 370 million of DeepMount00/Mamba-QA-ITA, and delivers noticeably more accurate and precise answers.

## Overview
The model is a question-answering generative system built on the Mamba architecture with 790 million parameters. It can answer complex questions and recognize when the answer is not present in the provided context.

## Model Architecture
The model is based on the Mamba architecture, which enables it to handle complex question answering. It is designed to understand and respond accurately even when the context is limited or the question is intricate.

## Unique Features
- **Advanced Parameterization**: With 790 million parameters, the model strikes a balance between efficiency and capability (the earlier DeepMount00/Mamba-QA-ITA has 370 million). A quick way to verify the count is shown below.
- **Contextual Understanding**: The model can recognize when the answer to a question is not available in the provided context.
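
As a quick sanity check, the parameter count can be verified after loading the model. This is a minimal sketch, not part of the original card; it assumes `model` has already been loaded as shown in the "How to Use" section below.

```python
# Sum the element counts of all weight tensors to confirm the model size
# (assumes `model` is the MambaLMHeadModel loaded in "How to Use" below).
num_params = sum(p.numel() for p in model.parameters())
print(f"{num_params / 1e6:.0f}M parameters")  # expected: roughly 790M
```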

## Capabilities
- **Complex Question Handling**: Understands and responds to a wide range of complex questions.
- **Parameter Efficiency**: Despite having fewer parameters than many larger models, it maintains high efficiency and accuracy.

## How to Use
To use this model for question answering (the example assumes a CUDA-capable GPU and the `mamba-ssm` package, which provides the `mamba_ssm` import):

```python
import torch
from transformers import AutoTokenizer
from mamba_ssm.models.mixer_seq_simple import MambaLMHeadModel

model_name = "DeepMount00/Mamba-QA-ITA-790m"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = MambaLMHeadModel.from_pretrained(model_name, device="cuda", dtype=torch.float16)

def run_qa_mamba(model, question, context):
    # Build the prompt: context, then the question, ending with "A:" to cue the answer.
    input_ids = torch.LongTensor([tokenizer.encode(f"{context}\n\nQ: {question}\nA:")]).cuda()
    output = model.generate(input_ids=input_ids, max_length=2048, eos_token_id=tokenizer.eos_token_id)
    # Strip the prompt from the decoded output and keep only the first answer paragraph.
    answer = tokenizer.batch_decode(output)[0].replace(f"{context}\n\nQ: {question}\nA:", "").split("\n\n")[0].strip()
    answer = answer.replace("<|endoftext|>", "")
    return answer

question = """Quante torri ha Bologna?"""
context = """La torre degli Asinelli è una delle cosiddette due torri di Bologna, simbolo della città, situate in piazza di porta Ravegnana, all'incrocio tra le antiche strade San Donato (ora via Zamboni), San Vitale, Maggiore e Castiglione. Eretta, secondo la tradizione, fra il 1109 e il 1119 dal nobile Gherardo Asinelli, la torre è alta 97,20 metri, pende verso ovest per 2,23 metri e presenta all'interno una scalinata composta da 498 gradini. Ancora non si può dire con certezza quando e da chi fu costruita la torre degli Asinelli. Si presume che la torre debba il proprio nome a Gherardo Asinelli, il nobile cavaliere di fazione ghibellina al quale se ne attribuisce la costruzione, iniziata secondo una consolidata tradizione l'11 ottobre 1109 e terminata dieci anni dopo, nel 1119."""

answer = run_qa_mamba(model, question, context)
print(answer)
```
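
Because the model is trained to recognize unanswerable questions, you can probe this behaviour by asking about something absent from the context. The snippet below is an illustrative sketch reusing `run_qa_mamba` from above; the exact string the model emits for unanswerable questions is not documented in this card, so treat the printed output as model-specific.

```python
# A question whose answer is NOT in the context above: the model should
# indicate that the information is unavailable rather than invent one.
# (The exact no-answer wording is model-specific and not documented here.)
unanswerable_question = "Chi ha vinto il campionato di calcio nel 2020?"
print(run_qa_mamba(model, unanswerable_question, context))
```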

---
## Developer
Michele Montebovi