README.md
# AI Town

Inspired by the research paper [_Generative Agents: Interactive Simulacra of Human Behavior_](https://arxiv.org/pdf/2304.03442.pdf).

The goal of this project is to provide a platform with a strong foundation that is meant to be extended. The back-end natively supports shared global state, transactions, and a simulation engine, and should be suitable for everything from a simple project to play around with to a scalable, multi-player game. A secondary goal is to make a JS/TS framework available, as most simulators in this space (including the original paper above) are written in Python.

## Overview

- 💻 [Stack](#stack)
- 🧠 [Installation](#installation)
- 🤖 [Customize - run YOUR OWN simulated world](#customize-your-own-simulation)
- 👩‍💻 [Deploying](#deploy-the-app)
- 🏆 [Credits](#credits)

## Stack

- Game engine, database, and vector search: [Convex](https://convex.dev/)
- Auth (Optional): [Clerk](https://clerk.com/)
- The default chat model is `llama3`, with `mxbai-embed-large` for embeddings.
- Local inference: [Ollama](https://github.com/jmorganca/ollama)
- Configurable for other cloud LLMs: [Together.ai](https://together.ai/) or anything that speaks the [OpenAI API](https://platform.openai.com/). PRs welcome to add more cloud provider support.
- Pixel Art Generation: [Replicate](https://replicate.com/), [Fal.ai](https://serverless.fal.ai/lora)
- Background Music Generation: [Replicate](https://replicate.com/) using [MusicGen](https://huggingface.co/spaces/facebook/MusicGen)

## Installation

**Note**: There is a one-click install of a fork of this project on [Pinokio](https://pinokio.computer/item?uri=https://github.com/cocktailpeanutlabs/aitown) for anyone interested in running it without modifying it.

### 1. Clone the repo and install packages

```bash
git clone https://github.com/a16z-infra/ai-town.git
cd ai-town
npm install
```

### 2. To develop locally with [Convex](https://convex.dev)

Either [download a pre-built binary (recommended)](https://github.com/get-convex/convex-backend/releases) or [build it from source and run it](https://stack.convex.dev/building-the-oss-backend).

```sh
# For new Macs:
curl -L -O https://github.com/get-convex/convex-backend/releases/latest/download/convex-local-backend-aarch64-apple-darwin.zip
unzip convex-local-backend-aarch64-apple-darwin.zip

brew install just

# Runs the server
./convex-local-backend
```

This also [installs `just`](https://github.com/casey/just?tab=readme-ov-file#installation) (e.g. `brew install just` or `cargo install just`). We use `just` like `make` to add extra params, so you run `just convex ...` instead of `npx convex ...` for local development.

If you're running the pre-built binary on Mac and there's an Apple warning, go to the folder it's in, right-click it, and select "Open" to bypass it. From then on you can run it from the command line. Or you can compile it from source and run it (see above).

To develop against the cloud-hosted version, change the package.json scripts to use `convex ...` instead of `just convex ...`.

### 3. To run a local LLM, download and run [Ollama](https://ollama.com/)

You can leave the app running, or run `ollama serve`. `ollama serve` will warn you if the app is already running. Run `ollama pull llama3` to have it download `llama3`, and test it out with `ollama run llama3`. If you want to customize which model to use, adjust `convex/util/llm.ts` or run `just convex env set LLM_MODEL # model`. Ollama model options can be found [here](https://ollama.ai/library).

If you see slowness, you might want to set `NUM_MEMORIES_TO_SEARCH` to `1` in `constants.ts` to reduce the size of conversation prompts.

Check out `convex/config.ts` to configure which models to offer in the UI, or to set it up to talk to a cloud-hosted LLM.

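Ollama exposes an OpenAI-compatible chat endpoint, which is what makes it interchangeable with the cloud providers above. As a rough sketch, the requests your configuration ultimately produces look like this (the host and model are Ollama's defaults, and the helper name is illustrative, not code from this repo):

```ts
// Build an OpenAI-compatible chat-completions request for a local Ollama server.
interface ChatMessage {
  role: 'system' | 'user' | 'assistant';
  content: string;
}

function buildChatRequest(
  model: string,
  messages: ChatMessage[],
  host = 'http://127.0.0.1:11434', // Ollama's default address
) {
  return {
    url: `${host}/v1/chat/completions`,
    init: {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ model, messages }),
    },
  };
}

const req = buildChatRequest('llama3', [
  { role: 'user', content: 'Say hello in five words.' },
]);
// req.url === 'http://127.0.0.1:11434/v1/chat/completions'
```

Pointing `LLM_API_HOST` at a different OpenAI-compatible server changes only the `host` part of this request.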
### 4. Adding background music with Replicate (Optional)

For daily background music generation, create a [Replicate](https://replicate.com/) account and create a token in your profile's [API Token page](https://replicate.com/account/api-tokens). Then set it with `npx convex env set REPLICATE_API_TOKEN # token`. Specify `just` instead of `npx` if you're doing local development.

### 5. Run the code

To run both the front and back end:

```bash
npm run dev
```

**Note**: If you encounter a Node version error on the Convex server at application startup, use Node version 18, which is the most stable. One way to do this is by [installing nvm](https://nodejs.org/en/download/package-manager) and running `nvm install 18` or `nvm use 18`. Do this before both the `npm run dev` above and the `./convex-local-backend` in step 2.

You can now visit http://localhost:5173.

If you'd rather run the frontend in a separate terminal from Convex (which syncs your backend functions as they're saved), you can run these two commands:

```bash
npm run dev:frontend
npm run dev:backend
```

See package.json for details; dev:backend runs `just convex dev`.

**Note**: The simulation will pause after 5 minutes if the window is idle. Loading the page will unpause it. You can also manually freeze and unfreeze the world with a button in the UI. If you want to run the world without the browser, you can comment out the "stop inactive worlds" cron in `convex/crons.ts`.

### Various commands to run / test / debug

**To stop the back end, in case of too much activity**

This will stop running the engine and agents. You can still run queries and functions to debug.

```bash
just convex run testing:stop
```

**To restart the back end after stopping it**

```bash
just convex run testing:resume
```

**To kick the engine in case the game engine or agents aren't running**

```bash
just convex run testing:kick
```

**To archive the world**

If you'd like to reset the world and start from scratch, you can archive the current world:

```bash
just convex run testing:archive
```

Then you can still look at the world's data in the dashboard, but the engine and agents will no longer run.

You can then create a fresh world with `init`:

```bash
just convex run init
```

**To clear all databases**

You can wipe all tables with the `wipeAllTables` testing function:

```bash
just convex run testing:wipeAllTables
```

**To pause your backend deployment**

You can go to the [dashboard](https://dashboard.convex.dev), then to your deployment settings, to pause and un-pause your deployment. This will stop all functions, whether invoked from the client, scheduled, or run as a cron job. Consider this a last resort, as there are gentler ways of stopping above.

## Customize your own simulation

NOTE: every time you change character data, you should re-run `just convex run testing:wipeAllTables` and then `npm run dev` to re-upload everything to Convex. This is because character data is sent to Convex on the initial load. However, beware that `just convex run testing:wipeAllTables` WILL wipe all of your data.

1. Create your own characters and stories: all characters and stories, as well as their spritesheet references, are stored in [characters.ts](./data/characters.ts). You can start by changing character descriptions.

2. Updating spritesheets: in `data/characters.ts`, you will see this code:

```ts
export const characters = [
  {
    name: 'f1',
    textureUrl: '/assets/32x32folk.png',
    spritesheetData: f1SpritesheetData,
    speed: 0.1,
  },
  ...
];
```
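Adding a character means appending another entry with the same shape. A minimal sketch (the name, texture path, and spritesheet variable below are hypothetical examples, not assets that ship with the repo):

```ts
// Shape of one entry in data/characters.ts (field names from the snippet above).
interface Character {
  name: string;
  textureUrl: string;
  spritesheetData: unknown; // frame/animation metadata for the sprite sheet
  speed: number;
}

// Stand-in: real spritesheet data describes the sheet's frames and animations.
const f8SpritesheetData = {};

const newCharacter: Character = {
  name: 'f8', // hypothetical new character
  textureUrl: '/assets/my-new-folk.png', // hypothetical asset path
  spritesheetData: f8SpritesheetData,
  speed: 0.1,
};

// Appending it to the exported array makes it available on the next world init.
const characters: Character[] = [newCharacter];
```

Remember the wipe-and-reload note above: the new entry only takes effect after re-uploading to Convex.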

3. Update the background (environment): the map gets loaded in `convex/init.ts` from `data/gentle.js`. To update the map, follow these steps:

- Use [Tiled](https://www.mapeditor.org/) to export tilemaps as a JSON file (2 layers named `bgtiles` and `objmap`)
- Use the `convertMap.js` script to convert the JSON to a format that the engine can use:

```console
node data/convertMap.js <mapDataPath> <assetPath> <tilesetpxw> <tilesetpxh>
```

- `<mapDataPath>`: Path to the Tiled JSON file.
- `<assetPath>`: Path to tileset images.
- `<tilesetpxw>`: Tileset width in pixels.
- `<tilesetpxh>`: Tileset height in pixels.

This generates `converted-map.js`, which you can use like `gentle.js`.
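Conceptually, the core of the conversion is reshaping Tiled's flat per-layer tile arrays into rows. A simplified sketch of that step, with made-up layer data (not `convertMap.js` itself):

```ts
// Tiled stores each layer as a flat array of tile indices plus a width/height.
// Reshape it into rows so tiles can be indexed as grid[y][x].
interface TiledLayer {
  name: string;
  width: number;
  height: number;
  data: number[]; // length === width * height
}

function layerToGrid(layer: TiledLayer): number[][] {
  const grid: number[][] = [];
  for (let y = 0; y < layer.height; y++) {
    grid.push(layer.data.slice(y * layer.width, (y + 1) * layer.width));
  }
  return grid;
}

// Tiny 3x2 example layer (made-up data, not from gentle.js).
const bgtiles: TiledLayer = { name: 'bgtiles', width: 3, height: 2, data: [1, 2, 3, 4, 5, 6] };
const grid = layerToGrid(bgtiles); // [[1, 2, 3], [4, 5, 6]]
```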

4. Change the background music by modifying the prompt in `convex/music.ts`.
5. Change how often new music is generated by modifying the `generate new background music` job in `convex/crons.ts`.

## Using a cloud AI Provider

Configure `convex/util/llm.ts` or set these env variables:

```sh
# Local Convex
just convex env set LLM_API_HOST # url
just convex env set LLM_MODEL # model
# Cloud Convex
npx convex env set LLM_API_HOST # url
npx convex env set LLM_MODEL # model
```

The embeddings model config needs to be changed [in code](./convex/util/llm.ts), since you need to specify the embeddings dimension.
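The dimension has to match the model, since the vector index is declared with a fixed size. A sketch of the kind of lookup involved (the sizes below are the models' published dimensions; the table itself is illustrative, not this repo's code):

```ts
// Published embedding dimensions for two common models:
// mxbai-embed-large (the Ollama default above) and OpenAI's ada-002.
const EMBEDDING_DIMENSIONS: Record<string, number> = {
  'mxbai-embed-large': 1024,
  'text-embedding-ada-002': 1536,
};

function embeddingDimension(model: string): number {
  const dim = EMBEDDING_DIMENSIONS[model];
  if (dim === undefined) throw new Error(`Unknown embeddings model: ${model}`);
  return dim;
}

embeddingDimension('mxbai-embed-large'); // 1024
```

Swapping embeddings models therefore means updating both the model name and the dimension, and re-embedding any stored memories.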

### Keys

For Together.ai, visit https://api.together.xyz/settings/api-keys. For OpenAI, visit https://platform.openai.com/account/api-keys.

## Using hosted Convex

You can run your Convex backend in the cloud by running

```sh
npx convex dev --once --configure
```

and updating the `package.json` scripts to remove `just`: change `just convex ...` to `convex ...`.

You'll then need to set any environment variables you had locally in the cloud environment with `npx convex env set` or on the dashboard: https://dashboard.convex.dev/deployment/settings/environment-variables

To run commands, use `npx convex ...` where you used to run `just convex ...`.

## Deploy the app

### Deploy Convex functions to prod environment

Before you can run the app, you will need to make sure the Convex functions are deployed to their production environment.

1. Run `npx convex deploy` to deploy the Convex functions to production
2. Run `npx convex run init --prod`

If you have existing data you want to clear, you can run `npx convex run testing:wipeAllTables --prod`.

### Adding Auth (Optional)

You can add Clerk auth back in with `git revert b44a436`, or just look at that diff for what changed in order to remove it.

**Make a Clerk account**

- Go to https://dashboard.clerk.com/ and click on "Add Application"
- Name your application and select the sign-in providers you would like to offer users
- Create the application
- Add `VITE_CLERK_PUBLISHABLE_KEY` and `CLERK_SECRET_KEY` to `.env.local`

```bash
VITE_CLERK_PUBLISHABLE_KEY=pk_***
CLERK_SECRET_KEY=sk_***
```

- Go to JWT Templates and create a new Convex Template.
- Copy the JWKS endpoint URL for use below.

```sh
npx convex env set CLERK_ISSUER_URL # e.g. https://your-issuer-url.clerk.accounts.dev/
```
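Convex then reads that issuer from your auth config. Per Convex's auth docs, the config has roughly this shape (a sketch; check the docs for the exact file name and fields in your setup):

```ts
// convex/auth.config.ts, a sketch of the shape described in Convex's auth docs.
// `applicationID` should match the "aud" claim of the Clerk Convex JWT template.
export default {
  providers: [
    {
      domain: process.env.CLERK_ISSUER_URL,
      applicationID: 'convex',
    },
  ],
};
```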

### Deploy to Vercel

- Register an account on Vercel and then [install the Vercel CLI](https://vercel.com/docs/cli).
- **If you are using GitHub Codespaces**: you will need to [install the Vercel CLI](https://vercel.com/docs/cli) and authenticate from your Codespaces CLI by running `vercel login`.
- Deploy the app to Vercel with `vercel --prod`.

## Using local inference from a cloud deployment

We support using [Ollama](https://github.com/jmorganca/ollama) for conversation generation. To make it accessible from the web, you can use Tunnelmole, ngrok, or similar.

**Using Tunnelmole**

[Tunnelmole](https://github.com/robbie-cahill/tunnelmole-client) is an open source tunneling tool.

You can install Tunnelmole using one of the following options:

- NPM: `npm install -g tunnelmole`
- Linux: `curl -s https://tunnelmole.com/sh/install-linux.sh | sudo bash`
- Mac: `curl -s https://tunnelmole.com/sh/install-mac.sh --output install-mac.sh && sudo bash install-mac.sh`
- Windows: Install with NPM, or if you don't have NodeJS installed, download the `exe` file for Windows [here](https://tunnelmole.com/downloads/tmole.exe) and put it somewhere in your PATH.

Once Tunnelmole is installed, run the following command:

```
tmole 11434
```

Tunnelmole should output a unique URL once you run this command.

**Using ngrok**

ngrok is a popular closed source tunneling tool.

- [Install ngrok](https://ngrok.com/docs/getting-started/)

Once ngrok is installed and authenticated, run the following command:

```
ngrok http http://localhost:11434
```

ngrok should output a unique URL once you run this command.

**Add the Ollama endpoint to Convex**

```sh
npx convex env set OLLAMA_HOST # your tunnelmole/ngrok unique url from the previous step
```
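The backend can then prefer the tunnel URL and fall back to the local address when `OLLAMA_HOST` is unset. A sketch of that resolution logic (the default host is Ollama's standard port; the function name is illustrative, not the repo's exact code):

```ts
// Resolve the Ollama host: prefer an OLLAMA_HOST env var (e.g. a tunnelmole or
// ngrok URL), otherwise fall back to Ollama's default local address.
function resolveOllamaHost(env: Record<string, string | undefined>): string {
  const host = env.OLLAMA_HOST ?? 'http://127.0.0.1:11434';
  // Normalize away a trailing slash so paths can be appended cleanly.
  return host.replace(/\/$/, '');
}

resolveOllamaHost({}); // 'http://127.0.0.1:11434'
resolveOllamaHost({ OLLAMA_HOST: 'https://example.tunnelmole.net/' }); // 'https://example.tunnelmole.net'
```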

**Update Ollama domains**

Ollama has a list of accepted domains. Add the ngrok domain so it won't reject traffic; see ollama.ai for more details.

## Credits

- All interactions, background music, and rendering of the `<Game/>` component in the project are powered by [PixiJS](https://pixijs.com/).
- Tilesheet:
  - https://opengameart.org/content/16x16-game-assets by George Bailey
  - https://opengameart.org/content/16x16-rpg-tileset by hilau
- We used https://github.com/pierpo/phaser3-simple-rpg for the original POC of this project. We have since rewritten the whole app, but appreciated the easy starting point.
- Original assets by [ansimuz](https://opengameart.org/content/tiny-rpg-forest)
- The UI is based on original assets by [Mounir Tohami](https://mounirtohami.itch.io/pixel-art-gui-elements)

# 🧑‍🏫 What is Convex?

[Convex](https://convex.dev) is a hosted backend platform with a built-in database that lets you write your [database schema](https://docs.convex.dev/database/schemas) and [server functions](https://docs.convex.dev/functions) in [TypeScript](https://docs.convex.dev/typescript). Server-side database [queries](https://docs.convex.dev/functions/query-functions) automatically [cache](https://docs.convex.dev/functions/query-functions#caching--reactivity) and [subscribe](https://docs.convex.dev/client/react#reactivity) to data, powering a [realtime `useQuery` hook](https://docs.convex.dev/client/react#fetching-data) in our [React client](https://docs.convex.dev/client/react). There are also clients for [Python](https://docs.convex.dev/client/python), [Rust](https://docs.convex.dev/client/rust), [React Native](https://docs.convex.dev/client/react-native), and [Node](https://docs.convex.dev/client/javascript), as well as a straightforward [HTTP API](https://docs.convex.dev/http-api/).

The database supports [NoSQL-style documents](https://docs.convex.dev/database/document-storage) with [opt-in schema validation](https://docs.convex.dev/database/schemas), [relationships](https://docs.convex.dev/database/document-ids), and [custom indexes](https://docs.convex.dev/database/indexes/) (including on fields in nested objects).

The [`query`](https://docs.convex.dev/functions/query-functions) and [`mutation`](https://docs.convex.dev/functions/mutation-functions) server functions have transactional, low-latency access to the database and leverage our [`v8` runtime](https://docs.convex.dev/functions/runtimes) with [determinism guardrails](https://docs.convex.dev/functions/runtimes#using-randomness-and-time-in-queries-and-mutations) to provide the strongest ACID guarantees on the market: immediate consistency, serializable isolation, and automatic conflict resolution via [optimistic multi-version concurrency control](https://docs.convex.dev/database/advanced/occ) (OCC / MVCC).
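The OCC idea can be sketched in miniature: each transaction records the versions it read and commits only if none of them changed in the meantime, retrying otherwise. A toy illustration of the concept, not Convex's implementation:

```ts
// Toy optimistic concurrency: a store tracks a version per key; a transaction
// commits only if every key it read is still at the version it observed.
class OccStore {
  private data = new Map<string, { value: number; version: number }>();

  read(key: string) {
    const rec = this.data.get(key) ?? { value: 0, version: 0 };
    return { ...rec };
  }

  // Returns true on commit, false on conflict (caller should retry).
  commit(reads: Map<string, number>, writes: Map<string, number>): boolean {
    for (const [key, version] of reads) {
      if (this.read(key).version !== version) return false; // conflict
    }
    for (const [key, value] of writes) {
      this.data.set(key, { value, version: this.read(key).version + 1 });
    }
    return true;
  }
}

// Increment with retry, the way an OCC runtime re-executes a conflicting txn.
function increment(store: OccStore, key: string) {
  for (;;) {
    const { value, version } = store.read(key);
    if (store.commit(new Map([[key, version]]), new Map([[key, value + 1]]))) return;
  }
}

const store = new OccStore();
increment(store, 'counter');
increment(store, 'counter');
store.read('counter').value; // 2
```

Determinism guardrails matter here: because transactions can be re-executed on conflict, they must produce the same result each time.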

The [`action` server functions](https://docs.convex.dev/functions/actions) have access to external APIs and enable other side-effects and non-determinism in either our [optimized `v8` runtime](https://docs.convex.dev/functions/runtimes) or a more [flexible `node` runtime](https://docs.convex.dev/functions/runtimes#nodejs-runtime).

Functions can run in the background via [scheduling](https://docs.convex.dev/scheduling/scheduled-functions) and [cron jobs](https://docs.convex.dev/scheduling/cron-jobs).

Development is cloud-first, with [hot reloads for server function](https://docs.convex.dev/cli#run-the-convex-dev-server) editing via the [CLI](https://docs.convex.dev/cli), [preview deployments](https://docs.convex.dev/production/hosting/preview-deployments), and [logging and exception reporting integrations](https://docs.convex.dev/production/integrations/). There is a [dashboard UI](https://docs.convex.dev/dashboard) to [browse and edit data](https://docs.convex.dev/dashboard/deployments/data), [edit environment variables](https://docs.convex.dev/production/environment-variables), [view logs](https://docs.convex.dev/dashboard/deployments/logs), [run server functions](https://docs.convex.dev/dashboard/deployments/functions), and more.

There are built-in features for [reactive pagination](https://docs.convex.dev/database/pagination), [file storage](https://docs.convex.dev/file-storage), [reactive text search](https://docs.convex.dev/text-search), [vector search](https://docs.convex.dev/vector-search), [HTTPS endpoints](https://docs.convex.dev/functions/http-actions) (for webhooks), [snapshot import/export](https://docs.convex.dev/database/import-export/), [streaming import/export](https://docs.convex.dev/production/integrations/streaming-import-export), and [runtime validation](https://docs.convex.dev/database/schemas#validators) for [function arguments](https://docs.convex.dev/functions/args-validation) and [database data](https://docs.convex.dev/database/schemas#schema-validation).

---
title: AI Town on HuggingFace
emoji: 🏠🐈‍⬛
colorFrom: green
colorTo: red
sdk: docker
app_port: 5173
pinned: false
disable_embedding: true
# header: mini
short_description: AI Town on HuggingFace
hf_oauth: true
---

15 |
+
# AI Town π π»π on Hugging Face π€
|
16 |
|
17 |
+
[**Demo on Hugging Face Spaces**](https://huggingface.co/spaces/radames/ai-town)
|
18 |
|
19 |
+
AI Town is a very cool project by [Yoko](https://github.com/ykhli) et [al.](https://github.com/a16z-infra/ai-town), a virtual town with live AI characters where they can chat and socialize. You can also interact with them by sending them messages.
|
20 |
|
21 |
+
This repository contains a few code patches to make AI Town run on [Hugging Face π€ Spaces](https://huggingface.co/spaces), as well as a Dockerfile capable of running [Convex open-source backend](https://github.com/get-convex/convex-backend), the backend and frontend on a single container.
|
22 |
|
23 |
+
## How to run locally
|

Grab your Hugging Face API token from https://huggingface.co/settings/tokens, then build and run the container:

```bash
export HF_TOKEN=hf_**********
docker build -t ai-town -f Dockerfile .
docker run -ti -p 5173:5173 -e LLM_API_KEY=$HF_TOKEN ai-town
```

## How to run on Hugging Face

You can duplicate this Space at https://huggingface.co/spaces/radames/ai-town?duplicate=true and add your `HF_TOKEN`. Then you can customize [patches/constants.ts](patches/constants.ts) and [patches/characters.ts](patches/characters.ts) as you wish, as well as the LLM model and embeddings model in [patches/llm.ts](patches/llm.ts).