Local LLM Memo


Table of contents

  • Tools
  • How to run an LLM locally
  • How to use an LLM as a personal research agent
  • Create API Server

Tools

Ollama

Ollama runs large language models (LLMs) on your local machine.
Although each model has a different interface, Ollama provides a unified interface for local LLMs, so you can run them easily.

LangChain

LangChain is a composable framework to build with LLMs.
Langchain
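
As a minimal sketch (assuming the langchain-ollama package is installed and a Llama 3.2 model has already been pulled with Ollama), calling a local model through LangChain looks like this:

from langchain_ollama import ChatOllama

# Connect LangChain to the locally running Ollama server
llm = ChatOllama(model="llama3.2")

# Send a prompt and print the model's reply
response = llm.invoke("Explain what Ollama is in one sentence.")
print(response.content)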

LangGraph

LangGraph is the orchestration framework for controllable agentic workflows.
It lets you define state and build complex workflows.
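
A minimal sketch of a one-node LangGraph workflow (the state fields and node name here are illustrative, not from the article):

from typing import TypedDict
from langgraph.graph import StateGraph, START, END

# The shared state that flows through the graph
class State(TypedDict):
    question: str
    answer: str

# A single node; in a real agent this would call an LLM
def answer_node(state: State) -> dict:
    return {"answer": f"You asked: {state['question']}"}

graph = StateGraph(State)
graph.add_node("answer", answer_node)
graph.add_edge(START, "answer")
graph.add_edge("answer", END)

app = graph.compile()
print(app.invoke({"question": "What is a local LLM?"}))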

duckduckgo-search

DuckDuckGo is a search engine that protects user privacy, and the duckduckgo-search package lets you query it from Python.

DuckDuckGo
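
A minimal sketch of a web search with the duckduckgo-search package (the query string is just an example):

from duckduckgo_search import DDGS

# Run a text search and print the top results
with DDGS() as ddgs:
    for result in ddgs.text("local LLM with Ollama", max_results=3):
        print(result["title"], result["href"])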

How to run an LLM locally

I used Ollama to run LLMs locally.

Download Ollama

Download Ollama from the official website.
Ollama

Install

Install Ollama by following the installer's instructions.

Download LLM

Download the LLM you want to use.

ollama pull <model_name>
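
For example, to pull Llama 3.2 (the model name is just an illustration; any model from the Ollama library works):

ollama pull llama3.2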

Run LLM

Run the LLM

ollama run <model_name>
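
Once a model is running, Ollama also serves a local HTTP API (by default on port 11434). A minimal sketch of calling it from Python, assuming the llama3.2 model from the previous step:

import requests

# Ask the local Ollama server for a single, non-streamed completion
resp = requests.post(
    "http://localhost:11434/api/generate",
    json={"model": "llama3.2", "prompt": "Say hello.", "stream": False},
)
print(resp.json()["response"])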

How to use an LLM as a personal research agent

I referred to the technical article below.
How to build a Interactive Personal AI Research Agent with Llama 3.2 : A Step-by-Step Guide using LangChain and Ollama
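
The article walks through a full agent; as a rough sketch of the core idea (search the web, then summarize with a local model), and not the article's actual code:

from duckduckgo_search import DDGS
from langchain_ollama import ChatOllama

# 1. Gather a few search results for the research topic
topic = "latest developments in small local LLMs"
with DDGS() as ddgs:
    snippets = [r["body"] for r in ddgs.text(topic, max_results=3)]

# 2. Ask the local model to summarize what was found
llm = ChatOllama(model="llama3.2")
prompt = "Summarize these search results:\n" + "\n".join(snippets)
print(llm.invoke(prompt).content)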

Create API Server

I built an API server using Render and Flask.
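
A minimal sketch of such a Flask server (the /ask route, port, and model name are my own illustrative choices, not necessarily what the deployed server uses):

from flask import Flask, request, jsonify
from langchain_ollama import ChatOllama

app = Flask(__name__)
llm = ChatOllama(model="llama3.2")  # assumes an Ollama server is reachable

@app.route("/ask", methods=["POST"])
def ask():
    # Expect a JSON body like {"question": "..."}
    question = request.get_json().get("question", "")
    answer = llm.invoke(question)
    return jsonify({"answer": answer.content})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)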

