
the prompt

#7 by JaheimLee - opened

What's the prompt format for this model? Is it the same as for mpt-7b or mpt-7b-chat?

Hi, you can see https://huggingface.co/spaces/mosaicml/mpt-7b-instruct/blob/main/quick_pipeline.py#L8-L23 for the prompt format we use in the demo space.
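For anyone who finds that link dead later (see below in this thread), here is a sketch of the format from that file, reconstructed from memory rather than copied verbatim, so treat the linked file as authoritative where it still resolves:

```python
# Dolly/Alpaca-style template used in the mpt-7b-instruct demo space
# (reconstructed from memory of quick_pipeline.py; not copied verbatim).
INSTRUCTION_KEY = "### Instruction:"
RESPONSE_KEY = "### Response:"
INTRO_BLURB = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request."
)

PROMPT_FOR_GENERATION_FORMAT = """{intro}
{instruction_key}
{instruction}
{response_key}
""".format(
    intro=INTRO_BLURB,
    instruction_key=INSTRUCTION_KEY,
    instruction="{instruction}",
    response_key=RESPONSE_KEY,
)

# Usage: fill in the instruction, then generate from the resulting string.
prompt = PROMPT_FOR_GENERATION_FORMAT.format(instruction="What is the capital of France?")
```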

daking changed discussion status to closed

Hello,
Is the prompt in https://huggingface.co/spaces/mosaicml/mpt-7b-instruct/blob/main/quick_pipeline.py#L8-L23 also the one used during instruction finetuning?
Also, it looks like END_KEY isn't used anywhere. Is it appended at the very end during training data generation?

Yes, that was the prompt used during finetuning.

You are correct, we don't use END_KEY in training. Generation seems to stop at a reasonable place without it.

What should the prompt look like if I have text and a list of options for zero-shot classification?

I have tried:
"""
Below is an instruction that describes a task, paired with a input that provides further context. Write a response that appropriately completes the request.

Instruction:

Select the option from the list that best describes it:
OPTIONS:
carbon footprint/tracking,
call wait time/phone wait time,
wait time,
branch closure,
atm removal,
fossil fuel,
none of the above

Input:

Customer is unhappy with his recent experience at the Elizabeth Street branch. Customer has experienced wait times when requesting 12 months worth of statements.

Response:

"""

It does not work :(
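For later readers: one way to assemble that kind of zero-shot classification prompt is with the "instruction + input" variant of the template, keeping the ### markers intact. A minimal sketch; the helper name and list formatting here are illustrative, not part of llm-foundry:

```python
# Hypothetical prompt builder for zero-shot classification with
# mpt-7b-instruct. The "instruction + input" template wording follows the
# Alpaca/dolly convention; build_prompt is an illustrative name.
PROMPT_WITH_INPUT = (
    "Below is an instruction that describes a task, paired with an input that "
    "provides further context. Write a response that appropriately completes "
    "the request.\n"
    "### Instruction:\n{instruction}\n"
    "### Input:\n{input}\n"
    "### Response:\n"
)

def build_prompt(options: list[str], text: str) -> str:
    instruction = (
        "Select the option from the list that best describes the input.\n"
        "OPTIONS:\n" + "\n".join(options)
    )
    return PROMPT_WITH_INPUT.format(instruction=instruction, input=text)

options = [
    "carbon footprint/tracking",
    "call wait time/phone wait time",
    "wait time",
    "branch closure",
    "atm removal",
    "fossil fuel",
    "none of the above",
]
print(build_prompt(options, "Customer is unhappy with his recent experience at "
                            "the Elizabeth Street branch."))
```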

@sam-mosaic You say that generation stops at a reasonable place, but I'm experiencing a weird tag-on phenomenon (which I've also heard about in a YouTube video on this model):

Prompt:

Below is an instruction that describes a task. Write a response that appropriately completes the request. Be concise. Once the request is completed, include no other text.
### Instruction:
What is the best ice cream flavor?
### Response:
The answer to this question depends on who you ask and what time of year it happens be asked! Some people like vanilla or chocolate while others prefer something more exotic such as mango sorbet with chili flakes sprinkled over top#!/bin/bash # This script will create two directories called "before" & after", then copy all files from before into After except for one file named 'file1'. The last line copies only those filenames which end in.txt (text) so

Hey, we are having the same prompting issues on mpt-7b. Has anyone resolved this? The prompting link above is down.

In my experience you can solve this with better generation kwargs, but you can also make # a stop token, since the format biases the model to produce that character when it is done.
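A sketch of that suggestion using transformers' stopping criteria; the generation kwarg values are illustrative starting points, not MosaicML-recommended settings:

```python
# Minimal sketch: make "#" a stop token and tighten generation kwargs.
# Kwarg values here are illustrative, not MosaicML recommendations.
import torch
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          StoppingCriteria, StoppingCriteriaList)

name = "mosaicml/mpt-7b-instruct"
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForCausalLM.from_pretrained(
    name, trust_remote_code=True, torch_dtype=torch.bfloat16
)

class StopOnTokens(StoppingCriteria):
    """Stop as soon as the last generated token is one of the stop ids."""
    def __init__(self, stop_ids):
        self.stop_ids = set(stop_ids)
    def __call__(self, input_ids, scores, **kwargs):
        return input_ids[0, -1].item() in self.stop_ids

# "#" and " #" may tokenize to different ids; collect both to be safe.
stop_ids = [tid for s in ("#", " #")
            for tid in tokenizer(s, add_special_tokens=False).input_ids]

prompt = ("Below is an instruction that describes a task. Write a response "
          "that appropriately completes the request.\n"
          "### Instruction:\nWhat is the best ice cream flavor?\n"
          "### Response:\n")
inputs = tokenizer(prompt, return_tensors="pt")
out = model.generate(
    **inputs,
    max_new_tokens=128,
    do_sample=True,
    temperature=0.3,
    top_p=0.9,
    repetition_penalty=1.1,
    stopping_criteria=StoppingCriteriaList([StopOnTokens(stop_ids)]),
)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(out[0, inputs.input_ids.shape[1]:], skip_special_tokens=True))
```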

Who wrote "La Cigale et la Fourmi"?

Hi, you can see https://huggingface.co/spaces/mosaicml/mpt-7b-instruct/blob/main/quick_pipeline.py#L8-L23 for the prompt format we use in the demo space.

Hi @daking, thank you for your reply. However, this link is no longer available. Would you please share a new one? Thank you!
