Engee documentation
Notebook

Using DeepSeek-R1 in Engee

In this example, we will look at the process of creating and sending requests, as well as outputting responses to Engee for the DeepSeek-R1 Large Language Model (LLM) via a third-party Python API.

Introduction

DeepSeek-R1 is the most talked-about of DeepSeek's recent large language models (LLMs). There are several ways, both free and paid, to access a chat with this model. In this example, we will consider one of the simplest free access methods.

We will be sending JSON requests to an open LLM access service, <https://openrouter.ai>. This is one of the possible ways to integrate language models into user applications.

The openrouter.ai website is used here only as an example of an API for accessing a chat with an LLM. Be aware that it also offers paid services, and we recommend consulting specialists before performing similar work. This example is not an advertisement for the service used.

Getting started

First, let's make sure the Python requests library, which is used for working with HTTP requests, is installed. The following command can be executed, for example, in a cell of an .ipynb file:

!pip install requests

After that, we will install and load the PyCall.jl package, which lets us run Python code from Julia.

In [ ]:
import Pkg
Pkg.add("PyCall")     # Install the package
   Resolving package versions...
  No Changes to `~/.project/Project.toml`
  No Changes to `~/.project/Manifest.toml`
In [ ]:
using PyCall    # Load the package

Now let's import the requests library:

In [ ]:
py"""
import requests
"""

Preparation is complete; we can now proceed to forming the request.

Forming a request

Let's create variables holding the OpenRouter API endpoint (API_URL) and a user key (API_KEY). The key shown below was used only for this example and has since been deactivated. Enter your own API key if necessary.

In [ ]:
py"""
API_KEY = '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'
API_URL = 'https://openrouter.ai/api/v1/chat/completions'
"""

Next, let's define the HTTP headers of the API request:

In [ ]:
py"""
headers = {
    'Authorization': f'Bearer {API_KEY}',
    'Content-Type': 'application/json'
}
"""

Defining the data for the request in JSON format:

In [ ]:
py"""
data = {
    "model": "deepseek/deepseek-r1:free",
    "messages": [{"role": "user", "content": "Напиши программу Hello World! на Julia"}]
}
"""

Here, deepseek/deepseek-r1:free is the free DeepSeek-R1 model available for chat via OpenRouter.
In the messages field, the text query is set in content for the user role ("role": "user").
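The messages list follows the OpenAI-style chat format, so a system instruction or previous turns of a conversation can be passed as additional elements. A minimal sketch (the message texts here are our own illustration):

```python
# A request carrying a system instruction and an earlier exchange;
# the model sees the whole list as context for its next reply.
data = {
    "model": "deepseek/deepseek-r1:free",
    "messages": [
        {"role": "system", "content": "Answer briefly, in one sentence."},
        {"role": "user", "content": "What is Julia?"},
        {"role": "assistant", "content": "Julia is a high-performance language for technical computing."},
        {"role": "user", "content": "Write a Hello World! program in it."},
    ],
}
```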

Receiving and outputting the response

Let's send the generated POST request to OpenRouter:

In [ ]:
py"""
response = requests.post(API_URL, json=data, headers=headers)   # Отправляем POST-запрос
"""

Let's check whether the request succeeded and print the required data.

In [ ]:
py"""
if response.status_code == 200:                                                 # if the response code is 200:
    for content in response.json()['choices']:                                      # iterate over the "choices" field of the response,
        output = content                                                                # keeping its last element
    print(output['message']['content'])                                             # and print the "content" of its "message" field
else:                                                                           # otherwise:
    print("Failed to fetch data from API. Status Code:", response.status_code)      # print an error message
"""
```julia
println("Hello World!")
```

**Explanation:**
1. `println` is a Julia function that prints its arguments to the console and appends a newline.
2. The string `"Hello World!"` is passed to `println` for output.
3. The program prints to the console: `Hello World!`.

If the request is processed successfully, as above, we receive a response from the neural network. Since DeepSeek's response is marked up in Markdown, we can save it to a file with the matching extension for clarity.

In [ ]:
try
    write("Py_LLM.md", py"output['message']['content']")
catch err
    println("Enter a valid API_KEY")
end
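The status-check cell above keeps only the last element of choices. The same extraction can be packaged as a small reusable helper; this is a sketch, and extract_reply is our own name, not part of the requests API:

```python
def extract_reply(payload):
    """Return the text of the last choice in an OpenRouter-style
    chat-completion payload, or None if it is missing."""
    choices = payload.get('choices') or []
    if not choices:
        return None
    return choices[-1].get('message', {}).get('content')

# Usage with an already-parsed response:
# reply = extract_reply(response.json())
```

Centralizing the extraction makes it easier to handle malformed or empty responses in one place.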

Code Masking

An undeniable convenience of Engee when working with scripts is the ability of [masking code cells](https://engee.com/helpcenter/stable/ru/interactive-scripts/language_basics/code_cell_masks.html). With it, we can easily create the minimal GUI needed to work with the application we have written:

In [ ]:
py"""
# Define the request payload (data)
Вопрос = "Answer as briefly as possible the question about life, the universe and everything" # @param {type:"string"}

data = {
    "model": "deepseek/deepseek-r1:free",
    "messages": [{"role": "user", "content": str(Вопрос)}]
}

# Send the POST request to the DeepSeek API
response = requests.post(API_URL, json=data, headers=headers)

# Check if the request was successful
if response.status_code == 200:
    for content in response.json()['choices']:
        output = content
    print(output['message']['content'])
else:
    print("Failed to fetch data from API. Status Code:", response.status_code)
"""
42. But the meaning is in finding your own path.

Conclusion

In this example, we have shown a fairly simple way to connect a chat with the DeepSeek-R1 neural network to an Engee script. The program is implemented in Python; its job is to form a JSON request to the API of a third-party service. In addition, we have shown how to save the received response to a Markdown file and how to mask a code cell to create a GUI for the program in the Engee script.