Update README.md
README.md
CHANGED
@@ -4,23 +4,150 @@ language:
 - en
 datasets:
 - togethercomputer/RedPajama-Data-1T
-- Muennighoff/natural-instructions
+- togethercomputer/RedPajama-Data-Instruct
 widget:
-- text:
+- text: |-
+    Label the tweets as either 'positive', 'negative', 'mixed', or 'neutral':
+
+    Tweet: I can say that there isn't anything I would change.
+    Label: positive
+
+    Tweet: I'm not sure about this.
+    Label: neutral
+
+    Tweet: I liked some parts but I didn't like other parts.
+    Label: mixed
+
+    Tweet: I think the background image could have been better.
+    Label: negative
+
+    Tweet: I really like it.
+    Label:
+  example_title: Sentiment Analysis
+- text: |-
+    Please answer the following question:
+
+    Question: What is the capital of Canada?
+    Answer: Ottawa
+
+    Question: What is the currency of Switzerland?
+    Answer: Swiss franc
+
+    Question: In which country is Wisconsin located?
+    Answer:
+  example_title: Question Answering
+- text: >-
+    Given a news article, classify its topic.
+
+    Possible labels: 1. World 2. Sports 3. Business 4. Sci/Tech
+
+
+    Article: A nearby star thought to harbor comets and asteroids now appears to
+    be home to planets, too.
+
+    Label: Sci/Tech
+
+
+    Article: Soaring crude prices plus worries about the economy and the outlook
+    for earnings are expected to hang over the stock market next week during the
+    depth of the summer doldrums.
+
+    Label: Business
+
+
+    Article: Murtagh a stickler for success Northeastern field hockey coach
+    Cheryl Murtagh doesn't want the glare of the spotlight that shines on her to
+    detract from a team that has been the America East champion for the past
+    three years and has been to the NCAA tournament 13 times.
+
+    Label:
+  example_title: Topic Classification
+- text: |-
+    Paraphrase the given sentence into a different sentence.
+
+    Input: Can you recommend some upscale restaurants in New York?
+    Output: What upscale restaurants do you recommend in New York?
+
+    Input: What are the famous places we should not miss in Paris?
+    Output: Recommend some of the best places to visit in Paris?
+
+    Input: Could you recommend some hotels that have cheap price in Zurich?
+    Output:
+  example_title: Paraphrasing
+- text: >-
+    Given a review from Amazon's food products, the task is to generate a short
+    summary of the given review in the input.
+
+
+    Input: I have bought several of the Vitality canned dog food products and
+    have found them all to be of good quality. The product looks more like a
+    stew than a processed meat and it smells better. My Labrador is finicky and
+    she appreciates this product better than most.
+
+    Output: Good Quality Dog Food
+
+
+    Input: Product arrived labeled as Jumbo Salted Peanuts...the peanuts were
+    actually small sized unsalted. Not sure if this was an error or if the
+    vendor intended to represent the product as 'Jumbo'.
+
+    Output: Not as Advertised
+
+
+    Input: My toddler loves this game to a point where he asks for it. That's a
+    big thing for me. Secondly, no glitching unlike one of their competitors
+    (PlayShifu). Any tech I don’t have to reach out to support for help is a
+    good tech for me. I even enjoy some of the games and activities in this.
+    Overall, this is a product that shows that the developers took their time
+    and made sure people would not be asking for refund. I’ve become bias
+    regarding this product and honestly I look forward to buying more of this
+    company’s stuff. Please keep up the great work.
+
+    Output:
+  example_title: Text Summarization
+- text: |-
+    Identify which sense of a word is meant in a given context.
+
+    Context: The river overflowed the bank.
+    Word: bank
+    Sense: river bank
+
+    Context: A mouse takes much more room than a trackball.
+    Word: mouse
+    Sense: computer mouse
+
+    Context: The bank will not be accepting cash on Saturdays.
+    Word: bank
+    Sense: commercial (finance) banks
+
+    Context: Bill killed the project
+    Word: kill
+    Sense:
+  example_title: Word Sense Disambiguation
+- text: >-
+    Given a pair of sentences, choose whether the two sentences agree
+    (entailment)/disagree (contradiction) with each other.
+
+    Possible labels: 1. entailment 2. contradiction
+
+
+    Sentence 1: The skier was on the edge of the ramp. Sentence 2: The skier was
+    dressed in winter clothes.
+
+    Label: entailment
+
+
+    Sentence 1: The boy skated down the staircase railing. Sentence 2: The boy
+    is a newbie skater.
+
+    Label: contradiction
+
+
+    Sentence 1: Two middle-aged people stand by a golf hole. Sentence 2: A
+    couple riding in a golf cart.
+
+    Label:
+  example_title: Natural Language Inference
 inference:
   parameters:
     temperature: 0.7
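The widget entries added above are plain few-shot prompts, and `inference.parameters.temperature: 0.7` is the sampling temperature the hosted widget applies to them. As a rough local sketch (not part of this commit), one of those prompts can be run through the `transformers` text-generation pipeline. The model id below is a placeholder, since the diff does not name the repository, the prompt is a shortened copy of the Sentiment Analysis widget example, and the generation settings simply mirror the YAML above.

```python
from transformers import pipeline

# Placeholder model id -- this diff does not name the repository, so substitute the actual one.
model_id = "your-org/your-redpajama-instruct-model"

# Few-shot prompt copied (and shortened) from the "Sentiment Analysis" widget example above.
prompt = (
    "Label the tweets as either 'positive', 'negative', 'mixed', or 'neutral':\n"
    "\n"
    "Tweet: I can say that there isn't anything I would change.\n"
    "Label: positive\n"
    "\n"
    "Tweet: I'm not sure about this.\n"
    "Label: neutral\n"
    "\n"
    "Tweet: I really like it.\n"
    "Label:"
)

generator = pipeline("text-generation", model=model_id)

# temperature mirrors the `inference: parameters: temperature: 0.7` block in the front matter.
result = generator(
    prompt,
    max_new_tokens=5,
    do_sample=True,
    temperature=0.7,
    return_full_text=False,
)
print(result[0]["generated_text"].strip())  # should continue the pattern with a label such as "positive"
```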