awaisakhtar committed
Commit 6f874c5
1 Parent(s): 2184fe1

Update README.md
# Project Title

Short description of your project or the model you've fine-tuned.

## Table of Contents

- [Overview](#overview)
- [Training Procedure](#training-procedure)
- [Quantization Configuration](#quantization-configuration)
- [Framework Versions](#framework-versions)
- [Usage](#usage)
- [Evaluation](#evaluation)
- [Contributing](#contributing)
- [License](#license)

## Overview

Provide a brief introduction to your project. Explain what your fine-tuned model does and its potential applications. Mention any notable achievements or improvements over the base model.

## Training Procedure

Describe the training process for your fine-tuned model. Include details such as:
- Dataset used (XSum).
- Amount of data used (3% of the dataset).
- Number of training epochs (1 epoch).
- Any specific data preprocessing or augmentation.
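The setup described above can be sketched with the 🤗 `datasets` and `transformers` libraries. This is an illustrative configuration only: the 3% slice and single epoch come from the bullets above, while `output_dir` and everything else are placeholder assumptions, not the actual training script.

```python
from datasets import load_dataset
from transformers import TrainingArguments

# Load roughly 3% of the XSum training split, as described above.
train_data = load_dataset("xsum", split="train[:3%]")

# One training epoch, matching the bullet above; the output
# directory and any omitted arguments are placeholders.
args = TrainingArguments(
    output_dir="out",
    num_train_epochs=1,
)
```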

## Quantization Configuration

Explain the quantization configuration used during training. Include details such as:
- Quantization method (bitsandbytes).
- Whether the model was loaded in 8-bit or 4-bit.
- Threshold and skip modules for int8 quantization.
- Use of FP32 CPU offload and FP16 weight.
- Configuration for 4-bit quantization (fp4, double quant, compute dtype).
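As an illustration, the bullets above map onto a `transformers` `BitsAndBytesConfig`. The specific values below (fp4, double quantization, fp16 compute dtype) are assumptions drawn from the list above; the real values should be copied from the actual training run.

```python
import torch
from transformers import BitsAndBytesConfig

# Hypothetical 4-bit setup mirroring the bullets above.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,                      # 4-bit rather than 8-bit loading
    bnb_4bit_quant_type="fp4",              # fp4 quantization
    bnb_4bit_use_double_quant=True,         # nested (double) quantization
    bnb_4bit_compute_dtype=torch.float16,   # fp16 compute dtype
)
```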

## Framework Versions

List the versions of the frameworks or libraries you used for this project. Include specific versions, e.g., PEFT 0.5.0.
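For example, an install line pinning the versions. PEFT 0.5.0 is the only version stated in this repository; the other packages are the libraries this README mentions, left unpinned as placeholders to fill in:

```shell
pip install peft==0.5.0 transformers bitsandbytes datasets
```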

## Usage

Provide instructions on how to use your fine-tuned model. Include code snippets or examples on how to generate summaries using the model. Mention any dependencies that need to be installed.

```bash
# Example usage command
python generate_summary.py --model your-model-name --input input.txt --output output.txt
```

Files changed (1): README.md (+7 -0)

```diff
@@ -1,5 +1,10 @@
 ---
 library_name: peft
+datasets:
+- xsum
+language:
+- en
+pipeline_tag: summarization
 ---
 ## Training procedure
 
@@ -19,3 +24,5 @@ The following `bitsandbytes` quantization config was used during training:
 
 
 - PEFT 0.5.0
+
+
```