# Hosting UI and Models separately

Sometimes it's beneficial to host Ollama separately from the UI, while retaining the RAG and RBAC features shared across users:

# Open WebUI Configuration

## UI Configuration
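
The VirtualHost examples in this guide rely on Apache's reverse-proxy, SSL, and rewrite modules. As a quick sketch (module names assume the stock Debian/Ubuntu Apache packaging), you can enable them before adding any of the configurations below:

```shell
# Enable the modules used by the VirtualHost examples in this guide
sudo a2enmod proxy proxy_http ssl rewrite
sudo systemctl restart apache2
```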

For the UI configuration, you can set up the Apache VirtualHost as follows:

```
# Assuming you have a website hosting this UI at "server.com"
<VirtualHost 192.168.1.100:80>
    ServerName server.com
    DocumentRoot /home/server/public_html

    ProxyPass / http://server.com:3000/ nocanon
    ProxyPassReverse / http://server.com:3000/
</VirtualHost>
```

Enable the site before you request SSL:

`a2ensite server.com.conf` enables the site (a2ensite is short for "Apache 2 Enable Site").

```
# For SSL
<VirtualHost 192.168.1.100:443>
    ServerName server.com
    DocumentRoot /home/server/public_html

    ProxyPass / http://server.com:3000/ nocanon
    ProxyPassReverse / http://server.com:3000/

    SSLEngine on
    SSLCertificateFile /etc/ssl/virtualmin/170514456861234/ssl.cert
    SSLCertificateKeyFile /etc/ssl/virtualmin/170514456861234/ssl.key
    SSLProtocol all -SSLv2 -SSLv3 -TLSv1 -TLSv1.1

    SSLProxyEngine on
    SSLCACertificateFile /etc/ssl/virtualmin/170514456865864/ssl.ca
</VirtualHost>
```

I'm using Virtualmin here to manage my SSL certificates, but you can also use Certbot directly or your preferred SSL method. To use Certbot:

### Prerequisites

Run the following commands:

`sudo snap install certbot --classic`
`sudo apt install python3-certbot-apache` (this installs the Apache plugin)

Navigate to the apache sites-available directory:

`cd /etc/apache2/sites-available/`

Create `server.com.conf` if it does not already exist, containing the `<VirtualHost>` configuration above (adjust it to match your setup). Use the one without the SSL:

Once it's created, run `certbot --apache -d server.com`. This will request and install the SSL keys for you, and also create `server.com.le-ssl.conf`.
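
After Certbot finishes, you can sanity-check that the UI is reachable over HTTPS (this assumes Apache and the Open WebUI instance on port 3000 are already running):

```shell
# Should return the Open WebUI response headers through the HTTPS proxy
curl -I https://server.com
```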

# Configuring Ollama Server

On your latest installation of Ollama, make sure that you have set up your API server per the official Ollama reference:

[Ollama FAQ](https://github.com/jmorganca/ollama/blob/main/docs/faq.md)

### TL;DR

The guide doesn't quite match the current service file on Linux, so we'll address that here:

Unless you're compiling Ollama from source, the standard install (`curl https://ollama.com/install.sh | sh`) creates a file called `ollama.service` in `/etc/systemd/system`. You can edit it with nano:

```
sudo nano /etc/systemd/system/ollama.service
```

Add the following line to the `[Service]` section:

```
# This line is mandatory. You can also bind to a specific IP:PORT instead.
Environment="OLLAMA_HOST=0.0.0.0:11434"
```

(Note that systemd unit files do not support trailing `#` comments on a directive; comments must be on their own line.)

For instance:

```
[Unit]
Description=Ollama Service
After=network-online.target

[Service]
ExecStart=/usr/local/bin/ollama serve
# Mandatory. You can also use an IP:PORT format such as 192.168.254.109:DIFFERENT_PORT
Environment="OLLAMA_HOST=0.0.0.0:11434"
# Optional
Environment="OLLAMA_ORIGINS=http://192.168.254.106:11434,https://models.server.city"
User=ollama
Group=ollama
Restart=always
RestartSec=3
Environment="PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/s>

[Install]
WantedBy=default.target
```

Save the file (CTRL+O, then Enter), then exit nano with CTRL+X.
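
To apply the change without rebooting, reload systemd's unit files and restart the Ollama service:

```shell
sudo systemctl daemon-reload
sudo systemctl restart ollama
systemctl status ollama   # confirm it shows "active (running)"
```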

Once the service restarts, the Ollama server will be listening on the IP:PORT you specified — in this case `0.0.0.0:11434` (all interfaces), reachable at your machine's local address, e.g. `192.168.254.106:11434`. If you want it reachable from outside your network, make sure your router forwards port 11434 to that local IP.
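
You can verify that Ollama is listening on the new address from another machine on your LAN (the IP below is this guide's example; substitute your server's):

```shell
# A plain GET on the root returns "Ollama is running"
curl http://192.168.254.106:11434

# The version endpoint returns JSON with the running Ollama version
curl http://192.168.254.106:11434/api/version
```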

# Ollama Model Configuration

For the Ollama model configuration, use the following Apache VirtualHost setup.

Navigate to the apache sites-available directory:

`cd /etc/apache2/sites-available/`

`nano models.server.city.conf` (match this with your Ollama server's domain)

Add the following virtualhost containing this example (modify as needed):

```
# Assuming you have a website hosting this UI at "models.server.city"
<IfModule mod_ssl.c>
    <VirtualHost 192.168.254.109:443>
        DocumentRoot "/var/www/html/"
        ServerName models.server.city
        <Directory "/var/www/html/">
            Options None
            Require all granted
        </Directory>

        ProxyRequests Off
        ProxyPreserveHost On
        ProxyAddHeaders On
        SSLProxyEngine on

        # Use your Ollama backend's port here (e.g. 11434)
        ProxyPass / http://server.city:1000/ nocanon
        ProxyPassReverse / http://server.city:1000/

        SSLCertificateFile /etc/letsencrypt/live/models.server.city/fullchain.pem
        SSLCertificateKeyFile /etc/letsencrypt/live/models.server.city/privkey.pem
        Include /etc/letsencrypt/options-ssl-apache.conf
    </VirtualHost>
</IfModule>
```

(Apache does not allow trailing `#` comments after a directive, so port notes are kept on their own line.)

You may need to enable the site first (if you haven't done so yet) before you can request SSL:

`a2ensite models.server.city.conf`

#### For the SSL part of the Ollama server

Run the following commands:

Navigate to the apache sites-available directory:

`cd /etc/apache2/sites-available/`
`certbot --apache -d models.server.city`

```
<VirtualHost 192.168.254.109:80>
    DocumentRoot "/var/www/html/"
    ServerName models.server.city
    <Directory "/var/www/html/">
        Options None
        Require all granted
    </Directory>

    ProxyRequests Off
    ProxyPreserveHost On
    ProxyAddHeaders On
    SSLProxyEngine on

    # Use your Ollama backend's port here (e.g. 11434)
    ProxyPass / http://server.city:1000/ nocanon
    ProxyPassReverse / http://server.city:1000/

    RewriteEngine on
    RewriteCond %{SERVER_NAME} =models.server.city
    RewriteRule ^ https://%{SERVER_NAME}%{REQUEST_URI} [END,NE,R=permanent]
</VirtualHost>
```

Don't forget to restart/reload Apache with `systemctl reload apache2`.

Open your site at https://server.com!
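
As a final end-to-end check, you can query the proxied Ollama API through the new HTTPS domain (the domain is this guide's example; the list of models will be whatever you have pulled):

```shell
# Lists locally available models through the Apache HTTPS proxy
curl https://models.server.city/api/tags
```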

**Congratulations**, your _**OpenAI-like, ChatGPT-style UI**_ is now serving AI with RAG, RBAC, and multimodal features! Download Ollama models if you haven't done so yet!

If you encounter any misconfiguration or errors, please file an issue or join our discussions. There are a lot of friendly developers here to assist you.

Let's make this UI much more user friendly for everyone!

Thanks for making Open WebUI your UI choice for AI!

This doc is made by **Bob Reyes**, your **Open-WebUI** fan from the Philippines.