---
license: mit
base_model: facebook/bart-large-cnn
model-index:
- name: lora-bart-cnn-tib-1024
  results: []
library_name: peft
datasets:
- gigant/tib
pipeline_tag: summarization
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# lora-bart-cnn-tib-1024

This model is a fine-tuned version of [facebook/bart-large-cnn](https://huggingface.co/facebook/bart-large-cnn) on the TIB dataset.

## Model description

Fine-tuned with LoRA on the TIB dataset.

A quick demo of its capabilities:
```
Moderator: Good afternoon, everyone, and welcome to today's webinar on the fascinating and rapidly evolving topic of Artificial Intelligence. We have a distinguished panel of experts with us today who will shed light on the latest developments in AI and its impact on various aspects of our lives. I'll start by introducing our first speaker, Dr. Emily Rodriguez, a renowned AI researcher and professor.
Dr. Rodriguez: Thank you, it's a pleasure to be here. Artificial Intelligence has witnessed remarkable growth over the past few decades, and it's now ingrained in our daily lives, from voice assistants in our smartphones to self-driving cars and even in healthcare diagnostics. AI technologies are advancing at an unprecedented rate, driven by deep learning and neural networks. These innovations have allowed machines to perform tasks that were once thought to be exclusive to humans, such as natural language understanding, image recognition, and decision-making. The future of AI holds immense promise, but it also presents important ethical and societal challenges that we need to address.
Moderator: Indeed, the ethical aspect of AI is a crucial issue. Let's hear from our next speaker, Dr. James Chen, a pioneer in AI ethics.
Dr. Chen: Thank you for having me. As AI technologies continue to advance, it's essential that we consider the ethical implications. AI can perpetuate biases, invade privacy, and disrupt the job market. We must work collectively to ensure that AI is developed and deployed in a way that respects human rights, diversity, and transparency. Regulatory frameworks and ethical guidelines are crucial to navigate this evolving landscape and strike a balance between innovation and safeguarding societal values.
Moderator: Excellent points, Dr. Chen. Now, I'd like to turn to Dr. Sarah Patel, who has expertise in AI and its applications in healthcare.
Dr. Patel: Thank you. AI in healthcare is revolutionizing how we diagnose, treat, and manage diseases. Machine learning models can analyze vast datasets to predict disease outcomes and personalize treatment plans. It can improve the accuracy of medical imaging and reduce diagnostic errors. However, we must be cautious about data privacy and the need for responsible AI implementation in the healthcare sector. Ensuring data security and patient trust is essential for the successful integration of AI into healthcare systems.
Moderator: Thank you, Dr. Patel. Lastly, we have Dr. Michael Johnson, an expert in AI and its economic implications.
Dr. Johnson: AI is reshaping industries and economies worldwide. While it has the potential to boost productivity and drive economic growth, it also poses challenges in terms of job displacement and workforce adaptation. The role of governments, businesses, and educational institutions in upskilling and retraining the workforce is paramount. Additionally, fostering innovation and entrepreneurship in AI-related fields can create new opportunities and ensure a balanced and prosperous AI-driven economy.
Moderator: Thank you to all our speakers for their valuable insights on the multifaceted world of AI. It's clear that AI's impact on our society is immense, with profound implications across ethics, healthcare, and the economy. As we continue to advance, it is crucial that we remain vigilant and considerate of the ethical and societal dimensions, ensuring that AI remains a force for good. Thank you all for participating in this enlightening webinar
```
The transcript above is summarized as:
```
Artificial Intelligence (AI) has become ingrained in our daily lives, from voice assistants in our smartphones to self-driving cars, and even in healthcare diagnostics. The future of AI holds immense promise, but it also presents important ethical and societal challenges that we need to address. This webinar will present the latest developments in AI and its impact on various aspects of our lives, including the ethical implications of the technology, as well as the economic and societal implications of AI. We will hear from a panel of experts who will share their insights on the current state of the art in the field of AI, including pioneers in AI ethics such as Dr. James Chen and Dr. Emily Rodriguez.
```

## Intended uses & limitations

Intended for summarizing video conferences/webinars.

Try out the model with the code below :D
```python
import torch
from peft import PeftModel, PeftConfig
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

# Load the base model, then apply the LoRA adapter weights on top of it
config = PeftConfig.from_pretrained("jolenechong/lora-bart-cnn-tib-1024")
model = AutoModelForSeq2SeqLM.from_pretrained("facebook/bart-large-cnn")
model = PeftModel.from_pretrained(model, "jolenechong/lora-bart-cnn-tib-1024")
tokenizer = AutoTokenizer.from_pretrained("jolenechong/lora-bart-cnn-tib-1024")

text = """[add transcript you want to summarize here]"""
# Truncate to the model's 1024-token input limit
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=1024)

with torch.no_grad():
    outputs = model.generate(input_ids=inputs["input_ids"])
    print(tokenizer.batch_decode(outputs, skip_special_tokens=True)[0])
```
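Transcripts of full webinars often exceed the 1024-token input limit, so anything past the limit is silently truncated. One common workaround is to split the transcript into chunks, summarize each chunk, and concatenate the results. A minimal sketch of the splitting step (it uses a whitespace word count as a rough, conservative proxy for BPE tokens; verify the budget against the actual tokenizer before relying on it):

```python
def chunk_words(text: str, max_words: int = 700):
    """Split text into word-based chunks of at most max_words words each.

    700 words is an assumed, conservative stand-in for 1024 BPE tokens;
    adjust after checking real token counts with the tokenizer.
    """
    words = text.split()
    return [" ".join(words[i:i + max_words])
            for i in range(0, len(words), max_words)]
```

Each chunk can then be passed through the snippet above, and the per-chunk summaries joined (or summarized once more) to produce a single overview.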

## Training and evaluation data

Fine-tuned on the [gigant/tib](https://huggingface.co/datasets/gigant/tib) dataset of talk transcripts.

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.001
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
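For reference, the hyperparameters above map onto keyword arguments in the shape `transformers`' `Seq2SeqTrainingArguments` expects. This is a sketch of that mapping only; LoRA-specific settings such as rank and alpha are not listed in this card and are deliberately omitted rather than guessed:

```python
# Hyperparameters from this card, as Seq2SeqTrainingArguments-style kwargs
training_kwargs = {
    "learning_rate": 1e-3,
    "per_device_train_batch_size": 8,
    "per_device_eval_batch_size": 8,
    "seed": 42,
    "adam_beta1": 0.9,            # optimizer: Adam with betas=(0.9, 0.999)
    "adam_beta2": 0.999,
    "adam_epsilon": 1e-8,
    "lr_scheduler_type": "linear",
    "num_train_epochs": 5,
}
```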

### Training results



### Framework versions

- PEFT 0.5.0
- Transformers 4.34.1
- Pytorch 2.1.0+cu121
- Datasets 2.14.6
- Tokenizers 0.14.1