Use OpenAI’s Node.js SDK with DeepSeek R1 Running Locally via Ollama

This content originally appeared on DEV Community and was authored by Deepak Sharma

Introduction

DeepSeek R1 is an open-source LLM that offers powerful generative AI capabilities. If you're running it locally using Ollama, you might be wondering how to integrate it with your Node.js applications. This guide will show you how to set up and use the OpenAI SDK with your locally running DeepSeek R1 model.

Step 1: Start DeepSeek R1 Locally with Ollama

Make sure Ollama is running and the DeepSeek R1 model is downloaded. If you haven't pulled it yet, run:

ollama pull deepseek-r1:1.5b

Then, start a test session to verify it's working:

ollama run deepseek-r1:1.5b

Step 2: Install Dependencies (Node.js)

First, ensure you have Node.js installed, then install the OpenAI SDK:

npm install openai
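The openai package ships both CommonJS and ES-module entry points. The example in the next step uses require; if your project is ESM (i.e. your package.json contains "type": "module"), use import OpenAI from "openai"; instead. A minimal package.json for this guide might look like the following (the package name is illustrative; the ^4 range matches the SDK version that supports the baseURL option used below):

```json
{
  "name": "deepseek-ollama-demo",
  "private": true,
  "dependencies": {
    "openai": "^4.0.0"
  }
}
```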

Step 3: Configure the OpenAI SDK to Use Ollama

const OpenAI = require("openai");

const openai = new OpenAI({
    baseURL: "http://localhost:11434/v1", // Pointing to Ollama's local API
    apiKey: "ollama", // Required by the OpenAI SDK, but Ollama doesn’t validate it
});

async function chatWithDeepSeek(prompt) {
    try {
        const response = await openai.chat.completions.create({
            model: "deepseek-r1:1.5b", // Ensure this model is running
            messages: [{ role: "user", content: prompt }],
        });

        console.log(response.choices[0].message.content);
    } catch (error) {
        console.error("Error:", error.message);
    }
}

// Test the function
chatWithDeepSeek("Hello, how are you?");
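One quirk to be aware of: DeepSeek R1 is a reasoning model, and its replies typically begin with a chain-of-thought wrapped in <think>...</think> tags before the final answer. If you only want the answer, a small helper can strip that block. This is a sketch; stripThinkBlock is our own name, not part of any SDK:

```javascript
// DeepSeek R1 replies usually start with reasoning wrapped in <think>...</think>.
// This helper removes that block and returns only the final answer.
function stripThinkBlock(text) {
    return text.replace(/<think>[\s\S]*?<\/think>/g, "").trim();
}

// Example of what a raw R1 reply might look like:
const raw =
    "<think>The user greeted me, so I should greet back.</think>\n" +
    "Hello! I'm doing well, thanks for asking.";
console.log(stripThinkBlock(raw)); // Hello! I'm doing well, thanks for asking.
```

In the function above, you would apply it as stripThinkBlock(response.choices[0].message.content) before logging.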

Step 4: Enabling Streaming Responses

Streaming doesn't speed up generation itself, but it lets you display tokens as they arrive instead of waiting for the full completion.

Streaming Version of the Function

async function chatWithDeepSeekStream(prompt) {
    try {
        const stream = await openai.chat.completions.create({
            model: "deepseek-r1:1.5b",
            messages: [{ role: "user", content: prompt }],
            stream: true, // Enable streaming
        });

        for await (const chunk of stream) {
            process.stdout.write(chunk.choices[0]?.delta?.content || "");
        }
        console.log("\n");
    } catch (error) {
        console.error("Error:", error.message);
    }
}

chatWithDeepSeekStream("Tell me a fun fact about space.");
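When streaming, the <think> block arrives token by token, so you can't strip it with a single regex after the fact without buffering the whole reply. A stateful filter can suppress output until the closing </think> tag has been seen, then pass everything else through. This is a sketch under the assumption that the model always emits a think block (if it doesn't, this filter buffers forever); createThinkFilter is our own name:

```javascript
// Stateful filter for streamed R1 output: suppress everything up to and
// including the closing </think> tag, then pass later chunks through as-is.
function createThinkFilter() {
    let buffer = "";
    let thinking = true;
    return (chunk) => {
        if (!thinking) return chunk;   // already past the reasoning block
        buffer += chunk;
        const end = buffer.indexOf("</think>");
        if (end === -1) return "";     // still inside the reasoning block
        thinking = false;
        return buffer.slice(end + "</think>".length);
    };
}
```

Inside the streaming loop you would write process.stdout.write(filter(chunk.choices[0]?.delta?.content || "")), with const filter = createThinkFilter() created once per request.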

