Engee documentation
Notebook

Using DeepSeek-R1 in Engee

In this example, we will look at creating and sending queries to the DeepSeek-R1 large language model (LLM) from Python via a third-party service's API, and at displaying the responses in Engee.

Introduction

DeepSeek-R1 is currently the most talked-about model in the DeepSeek LLM family. There are several free and paid ways to chat with this model; in this example we will use the simplest free one.

We will send JSON requests to the open LLM access service https://openrouter.ai/. This is one possible way to integrate language models into user applications.

The openrouter.ai site is used here only as an example of an LLM chat API. Be aware that it also offers paid services, and we recommend consulting specialists before performing similar work. This example is not an advertisement for the service.

Getting started

First, let's make sure the requests Python library, which provides HTTP support, is installed. The following code can be executed, for example, in a cell of an .ipynb file:

!pip install requests

After that, we'll install and load the PyCall.jl library, which lets us call Python code from Julia.

In [ ]:
]add PyCall     # Install the library
In [ ]:
using PyCall    # Load the library

Now let's import the requests library:

In [ ]:
py"""
import requests
"""

The preparation is done; we can move on to forming the query.

Forming a query

Create variables holding the OpenRouter API endpoint (API_URL) and the user key (API_KEY). The key shown below was used only in this example and has since been disabled. Enter your own API key if necessary.

In [ ]:
py"""
API_KEY = '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'
API_URL = 'https://openrouter.ai/api/v1/chat/completions'
"""
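For real applications, it is safer to read the key from the environment than to hardcode it in the script. The sketch below is an illustration, and the `OPENROUTER_API_KEY` variable name is our own assumption, not something mandated by OpenRouter:

```python
import os

# Read the key from an environment variable instead of hardcoding it.
# "OPENROUTER_API_KEY" is an illustrative name chosen for this example.
API_KEY = os.environ.get("OPENROUTER_API_KEY", "")
API_URL = "https://openrouter.ai/api/v1/chat/completions"

if not API_KEY:
    # Without a key the service will reject requests with HTTP 401.
    print("Warning: OPENROUTER_API_KEY is not set; requests will fail.")
```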

Next, let's define the API request headers:

In [ ]:
py"""
headers = {
    'Authorization': f'Bearer {API_KEY}',
    'Content-Type': 'application/json'
}
"""

Let's define the data for the request in JSON format:

In [ ]:
py"""
data = {
    "model": "deepseek/deepseek-r1:free",
    "messages": [{"role": "user", "content": "Write a Hello World! program in Julia"}]
}
"""

Here deepseek/deepseek-r1:free is the free DeepSeek-R1 model available for chat via OpenRouter.
The messages field defines the text of the query for the user role ("role": "user").
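The messages list can carry a whole conversation, not just a single question: each entry is tagged with a role ("system", "user", or "assistant"). The sketch below shows a multi-turn request body; the system prompt text is an illustrative assumption, not required by the API:

```python
import json

# A multi-turn request body: the "messages" list holds the conversation history.
# The system prompt below is an example of our own, not part of the API contract.
data = {
    "model": "deepseek/deepseek-r1:free",
    "messages": [
        {"role": "system", "content": "You are a concise Julia tutor."},
        {"role": "user", "content": "Write a Hello World! program in Julia"},
    ],
}

# requests.post(..., json=data) serializes the dict for us; checking explicitly
# confirms the payload is valid JSON-serializable data.
payload = json.dumps(data, ensure_ascii=False)
```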

Receive and output the answer

Let's send the generated POST request to OpenRouter:

In [ ]:
py"""
response = requests.post(API_URL, json=data, headers=headers)   # Send the POST request
"""

Let's check that the request succeeded and print the relevant data.

In [ ]:
py"""
if response.status_code == 200:                                                 # if the response code is 200:
    for content in response.json()['choices']:                                      # iterate over the "choices" field of the response,
        output = content                                                                # keeping the last choice,
    print(output['message']['content'])                                             # and print the "content" of its "message" field
else:                                                                           # otherwise:
    print("Failed to fetch data from API. Status Code:", response.status_code)      # print an error message
"""
To write a "Hello World!" program in Julia, it is enough to use the built-in `println` function, which prints text followed by a newline. Here is an example:

```julia
println("Hello World!")
```

**Explanation:**
- `println` is a function that prints its arguments to the console and appends a newline character.
- String literals are enclosed in double quotes (`"..."`).

**How to run:**
1. Save the code in a file with the `.jl` extension (for example, `hello.jl`).
2. Run it in the terminal: `julia hello.jl`.

This is a minimal example demonstrating basic output in Julia. For more complex programs, the code can be organized into functions and modules.

If the request is processed successfully, as above, we receive a response from the neural network.
Since DeepSeek's response is marked up in Markdown, for clarity we can save it to a file with the appropriate extension.
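The parsing in the cell above can also be wrapped in a small helper with defensive checks, so that a malformed or empty response does not raise a KeyError. The `extract_content` function and the mocked payload below are illustrative sketches, not part of the OpenRouter API:

```python
def extract_content(response_json):
    """Return the assistant text from an OpenRouter-style chat completion,
    or None if the payload does not have the expected shape."""
    choices = response_json.get("choices") or []
    if not choices:
        return None
    message = choices[-1].get("message") or {}
    return message.get("content")

# Usage with a mocked response body (no network call needed):
mock = {"choices": [{"message": {"role": "assistant",
                                 "content": 'println("Hello World!")'}}]}
print(extract_content(mock))
```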

In [ ]:
write("Py_LLM.md", py"output['message']['content']");

Code masking

An undeniable convenience of Engee when working with scripts is the ability to mask code cells. This makes it easy to create the minimum necessary GUI for the written application:

In [ ]:
py"""
# Define the request payload (data)
Вопрос = "Answer as briefly as possible the question about life, the universe, and everything" # @param {type:"string"}

data = {
    "model": "deepseek/deepseek-r1:free",
    "messages": [{"role": "user", "content": str(Вопрос)}]
}

# Send the POST request to the DeepSeek API
response = requests.post(API_URL, json=data, headers=headers)

# Check if the request was successful
if response.status_code == 200:
    for content in response.json()['choices']:
        output = content
    print(output['message']['content'])
else:
    print("Failed to fetch data from API. Status Code:", response.status_code)
"""
42.

$$\blacksquare$$

Conclusion

In this example we considered a fairly simple approach to connecting a chat with the DeepSeek-R1 neural network in an Engee script. The program is implemented in Python, and its job is to form a JSON request via the API of a third-party service. We also showed how to write the received response to a Markdown file and how to mask a code cell to create a GUI program in an Engee script.