---
license: other
language:
- en
thumbnail: https://h2o.ai/etc.clientlibs/h2o/clientlibs/clientlib-site/resources/images/favicon.ico
tags:
- gpt
- llm
- large language model
---
# h2oGPT DataBase Data Card
## Summary

H2O.ai's Chroma database files for the h2oGPT LangChain integration.  Sources are generated and processed by [get_db()](https://github.com/h2oai/h2ogpt/blob/40780bc6f4197e7f54753d40adafabe7c6e582f0/gpt_langchain.py#L491-L492).

| File        |Purpose         | Source            | License
|-------------|----------------|-------------------|----------
|[db_dir_DriverlessAI_docs.zip](db_dir_DriverlessAI_docs.zip) | DriverlessAI Documentation Q/A | [Source](https://github.com/h2oai/h2ogpt/blob/40780bc6f4197e7f54753d40adafabe7c6e582f0/gpt_langchain.py#L469-L473) | CC-BY-NC
|[db_dir_UserData.zip](db_dir_UserData.zip) | Example PDFs and Text Files Q/A | [Source](https://github.com/h2oai/h2ogpt/blob/40780bc6f4197e7f54753d40adafabe7c6e582f0/gpt_langchain.py#L474-L478)  | ArXiv
|[db_dir_github_h2oGPT.zip](db_dir_github_h2oGPT.zip) | h2oGPT GitHub repo Q/A | [Source](https://github.com/h2oai/h2ogpt/blob/40780bc6f4197e7f54753d40adafabe7c6e582f0/gpt_langchain.py#L463-L468) | Apache V2
|[db_dir_wiki.zip](db_dir_wiki.zip) | Example subset of Wikipedia (from API) Q/A | [Source](https://github.com/h2oai/h2ogpt/blob/40780bc6f4197e7f54753d40adafabe7c6e582f0/gpt_langchain.py#L463-L468) | Wikipedia CC-BY-SA
|[db_dir_wiki_full.zip](db_dir_wiki_full.zip) | All of Wikipedia as of 04/01/2023, restricted to articles with >5k views, for Q/A | [Source](https://github.com/h2oai/h2ogpt/blob/40780bc6f4197e7f54753d40adafabe7c6e582f0/gpt_langchain.py#L448-L457) | Wikipedia CC-BY-SA


A UserData database can be generated for any collection of private offline documents by running [make_db.py](https://github.com/h2oai/h2ogpt/blob/8bde589f1c532c6fb6badb313b073761ddc31f73/make_db.py#L15-L22).  To quickly use a private document collection for Q/A, place the documents (PDFs, text files, etc.) into a folder called `user_path` and run
```bash
python make_db.py
```
To use the chatbot with such docs, run:
```bash
python generate.py --base_model=h2oai/h2ogpt-oig-oasst1-512-6.9b --langchain_mode=UserData
```
using [h2oGPT](https://github.com/h2oai/h2ogpt).  Any other instruction-tuned base model can be used, including non-h2oGPT ones, as long as the required GPU memory is available for the given model size; alternatively, one can choose 8-bit generation.

See also the LangChain usage example in [test_langchain_simple.py](https://github.com/h2oai/h2ogpt/blob/4637531b928dfa458d708615ebd2cb6454d23064/tests/test_langchain_simple.py).
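The zipped databases listed above must be extracted before h2oGPT can use them. A minimal stdlib sketch of that setup step, assuming the `db_dir_*.zip` archives have already been downloaded into the current directory (`extract_dbs` is a hypothetical helper, not part of h2oGPT):

```python
import glob
import zipfile


def extract_dbs(pattern="db_dir_*.zip", dest="."):
    """Extract every matching database archive into dest.

    Returns the list of archive paths that were extracted, so the
    caller can verify which databases are now available locally.
    """
    extracted = []
    for path in sorted(glob.glob(pattern)):
        with zipfile.ZipFile(path) as zf:
            zf.extractall(dest)
        extracted.append(path)
    return extracted
```

Equivalently, one can simply `unzip` each archive by hand; the commands below assume the resulting `db_dir_*` folders sit in the current directory.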

If one has obtained all databases (except wiki_full) and unzipped them into the current directory, one can run the h2oGPT chatbot like:
```bash
python generate.py --base_model=h2oai/h2ogpt-oasst1-512-12b --load_8bit=True --langchain_mode=UserData --visible_langchain_modes="['UserData', 'wiki', 'MyData', 'github h2oGPT', 'DriverlessAI docs']"
```
which now uses the 12B model in 8-bit mode, which fits onto a single 24GB GPU.

If one has obtained all databases (including wiki_full) and unzipped them into the current directory, one can run the h2oGPT chatbot like:
```bash
python generate.py --base_model=h2oai/h2ogpt-oasst1-512-12b --load_8bit=True --langchain_mode=wiki_full --visible_langchain_modes="['UserData', 'wiki_full', 'MyData', 'github h2oGPT', 'DriverlessAI docs']"
```
which defaults to wiki_full for Q/A against full Wikipedia.