Self-Hosting Perplexica and Ollama
Set up Perplexica and Ollama for a secure, self-hosted AI search. Follow our guide to install and configure both tools for seamless integration.
Perplexica and Ollama Setup
Are you in the self-hosted camp, enjoying Ollama, and wondering when something like Perplexity AI would show up that runs locally and is perhaps a bit more secure? I had been keeping an eye out when I came across an article on MarkTechPost about Perplexica, so I decided to take a crack at it. I ran into a few issues along the way, which we’ll work around in the Perplexica setup: aside from the configuration, there is a call property we need to address. Let’s dive in.
Ollama Install and Setup
To begin with Ollama, follow these steps:
Run the installation script using
curl -fsSL https://ollama.com/install.sh | sh
Pull the latest version of Llama3 using
ollama pull llama3:latest
Pull the latest version of Nomic-Embed-Text using
ollama pull nomic-embed-text:latest
Edit the Ollama service file by running sudo systemctl edit ollama.service and adding the following lines
[Service]
Environment="OLLAMA_HOST=0.0.0.0"
Reload the systemd daemon using
sudo systemctl daemon-reload
Restart the Ollama service using
sudo systemctl restart ollama
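Before moving on to Perplexica, it’s worth confirming that Ollama is now reachable over the network. A quick sketch of a check (server-ip is a placeholder for your host’s address; the /api/tags endpoint lists the models you pulled):

```shell
# Placeholder; substitute your Ollama host's address
OLLAMA_URL="http://server-ip:11434"

# The root endpoint replies "Ollama is running" when the service is up
curl -s --max-time 5 "$OLLAMA_URL" || echo "Ollama not reachable at $OLLAMA_URL"

# /api/tags returns JSON listing the pulled models (llama3, nomic-embed-text)
curl -s --max-time 5 "$OLLAMA_URL/api/tags" || true
```

If the first command doesn’t answer, revisit the OLLAMA_HOST override above before continuing.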
Perplexica Setup
To set up Perplexica, follow these steps:
Clone the Perplexica repository using
git clone https://github.com/ItzCrazyKns/Perplexica.git
Copy the sample configuration file to a new file named config.toml using
cp sample.config.toml config.toml
Open config.toml in a text editor (such as nano) and make the following changes:
Change OLLAMA = "http://server-ip:11434" (TOML string values need the quotes)
Comment out the SEARXNG line, then press CTRL+X to exit and Y to save
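Taken together, the relevant part of config.toml would look roughly like this (a sketch based on the sample file I used; section and key names may vary between Perplexica versions, and server-ip stands in for your host’s address):

```toml
[API_ENDPOINTS]
# SEARXNG = "http://localhost:32768"  # commented out; docker-compose supplies the SearXNG URL instead
OLLAMA = "http://server-ip:11434"     # point Perplexica at your Ollama host
```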
- Open ui/components/theme/Switcher.tsx in a text editor (such as nano) and, according to this issue, change line 10 to
const ThemeSwitcher = ({ className, size }: { className?: string; size?: number }) => {
Then press CTRL+X, then Y to save the file
- Open docker-compose.yml in a text editor (such as nano) and make the following changes:
  - Change SEARXNG_API_URL=http://server-ip:4000
  - Change NEXT_PUBLIC_API_URL=http://server-ip:3001/api
  - Change NEXT_PUBLIC_WS_URL=ws://server-ip:3001
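For reference, the edited entries in docker-compose.yml would look something like this (a sketch based on the compose file at the time of writing; service names and whether the NEXT_PUBLIC_* values live under environment or build args may differ in your version):

```yaml
services:
  perplexica-backend:
    environment:
      - SEARXNG_API_URL=http://server-ip:4000
  perplexica-frontend:
    environment:
      - NEXT_PUBLIC_API_URL=http://server-ip:3001/api
      - NEXT_PUBLIC_WS_URL=ws://server-ip:3001
```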
- Build and start the Perplexica container using
docker compose up -d --build
- Access Perplexica by visiting http://server-ip:3000 in your web browser
That’s it! With these steps, you should be able to set up both Perplexica and Ollama on your system. If you found this helpful, please share this post, donate to my Buymeacoffee, or clap if you’re reading this on Medium. Till next time, fair winds and following seas!