
Electric Sheep 7B - α - "The Author"

  • Developed by: maldv
  • License: cc-by-nc-4.0
  • Finetuned from model: maldv/winter-garden-7b-alpha
  • Methodology: Simple newline-delimited, rolling-window book and conversation data.

Will It Write

I spent the first evening after baking this model watching it write page after page, story after story. The answer is yes: it will write, and never stop. 100% story, 0% plot.

It is one of the most beautiful things I've ever seen. laughs

Data

70% book data, with 10% from each of the other datasets; LoRA r = 64, learning rate 7e-5, 2 epochs. Trained for around 2 days on an A6000; loss fell to 0.4, while the gradient norm leveled out at around 3 and ground in really nicely.
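As a minimal sketch, the setup above can be restated as a config dict. The key names are assumptions (the card doesn't say which training framework was used); the values come from the card.

```python
# Hedged restatement of the training setup described above.
# Key names are assumptions; the values are taken from the card.
train_config = {
    "lora_r": 64,                    # LoRA rank
    "learning_rate": 7e-5,           # the card's .00007
    "num_epochs": 2,
    "data_mix": {
        "books": 0.70,               # 70% book data
        "each_other_dataset": 0.10,  # 10% from each remaining dataset
    },
}

print(train_config["learning_rate"])
```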

Chat Template

It was trained to follow no prompt at all; it just starts writing. It can be encouraged with [WP] Topic\n\n, and once it gets going, an author's note doesn't seem to get regurgitated.
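The [WP] kick-off can be wrapped in a small helper. The `wp_prompt` name is hypothetical, and the generation call in the trailing comment is an untested sketch, not part of the card.

```python
# Hypothetical helper for the "[WP] Topic\n\n" kick-off described above.
def wp_prompt(topic: str) -> str:
    # The double newline separates the topic line from the story the model writes.
    return f"[WP] {topic}\n\n"

prompt = wp_prompt("A lighthouse keeper finds a message in a bottle")
print(repr(prompt))

# From here you would hand `prompt` to your generation stack, e.g. (untested sketch):
# from transformers import pipeline
# generator = pipeline("text-generation", model="maldv/electric-sheep-7b-alpha")
# print(generator(prompt, max_new_tokens=512, do_sample=True)[0]["generated_text"])
```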

If you have to follow a chat template, use the one it shipped with, since that is what the conversation turns were conditioned on. The format is not critical, but if you use one, provide a few turns of dialogue to get it rolling.

{{bos_token}}
{% for message in messages %}
  {% if 'name' in message %}{{message['name'] + ('' if 'to' not in message else ' (to ' + message['to'] + ')') + ': ' + message['content'] + '\n\n'}}{% else %}{{message['content'] + '\n\n '}}
{% endif %}
{% endfor %}

Which produces results like:

Paul (to Jane): The sky is blue.

Jane (to Paul): Yes, it sure is!
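The template's behavior can be sanity-checked by rendering it with jinja2. The logic below is the same as the shipped template, written on one line so jinja2's default whitespace handling doesn't add stray newlines; the bos_token value "&lt;s&gt;" is an assumption, so substitute your tokenizer's.

```python
from jinja2 import Template

# Same logic as the shipped chat template above, flattened to one line.
chat_template = (
    "{{ bos_token }}"
    "{% for message in messages %}"
    "{% if 'name' in message %}"
    "{{ message['name']"
    " + ('' if 'to' not in message else ' (to ' + message['to'] + ')')"
    " + ': ' + message['content'] + '\\n\\n' }}"
    "{% else %}"
    "{{ message['content'] + '\\n\\n' }}"
    "{% endif %}"
    "{% endfor %}"
)

messages = [
    {"name": "Paul", "to": "Jane", "content": "The sky is blue."},
    {"name": "Jane", "to": "Paul", "content": "Yes, it sure is!"},
]

# bos_token "<s>" is an assumption (Mistral-style base).
rendered = Template(chat_template).render(bos_token="<s>", messages=messages)
print(rendered)
```

This reproduces the two-turn example shown above, with each turn separated by a blank line.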

Model size: 7.24B params (Safetensors, BF16)
