Setting Up Llama 3.2 Locally with Ollama and Open WebUI: A Complete Guide

This content originally appeared on DEV Community and was authored by Ravi Agheda

We'll use Ollama as the tool for running the Llama 3.2 model on our local device.

Requirements for Llama 3.2

(Image: Llama 3.2 requirements)

1 Install Ollama

Download Ollama from https://ollama.com/ and install it locally. Installation should be very easy, just one click.
It will automatically set up the CLI path; if not, please explore the documentation.

You can explore the models supported by Ollama here:
https://ollama.com/search
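Once the CLI is on your path, you can manage models without starting a chat session. A minimal sketch of the common model-management commands (the exact tags available are listed on the search page above):

```shell
# Download a model without immediately opening a chat session
ollama pull llama3.2:latest

# List the models currently installed on this machine
ollama list

# Remove a model you no longer need to free up disk space
ollama rm llama3.2:latest
```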

2 Set up the model, in our case Llama 3.2

ollama run llama3.2:latest

It should download the model and spin it up in your terminal; you can chat with it directly from the terminal.

(Image: example of a chat session in the terminal)
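Besides the interactive terminal chat, Ollama also serves a local REST API (on port 11434 by default) that other tools can call. A minimal sketch, assuming the Ollama server is running and the model has been pulled:

```shell
# Send a single prompt to the model via Ollama's local REST API.
# "stream": false returns one complete JSON response instead of
# streaming tokens as they are generated.
curl http://localhost:11434/api/generate -d '{
  "model": "llama3.2:latest",
  "prompt": "Why is the sky blue?",
  "stream": false
}'
```

This is the same server that the Open WebUI interface below will talk to.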

Once we're done with that part, it's time to set up a UI (like ChatGPT).

3 Set up Open WebUI

Open WebUI is an extensible, self-hosted AI interface that adapts to your workflow, all while operating entirely offline.

Check out their docs here: https://docs.openwebui.com/

We'll use Docker to set up the interface locally. Run this command:

docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main

Once the container is up and running, go to http://localhost:3000 and you're good to go.
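If the page doesn't load, a couple of standard Docker commands help confirm what's going on. Note that `-p 3000:8080` in the command above maps the container's internal port 8080 to port 3000 on your machine; change the `3000` if that port is already taken:

```shell
# Confirm the open-webui container is running and see its port mapping
docker ps --filter name=open-webui

# Follow the container logs to spot startup errors
docker logs -f open-webui
```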

(Image: Open WebUI screenshot)

You can run other models and switch between them directly from the Open WebUI interface.

Enjoy!



