You can install this package from CRAN:

```r
install.packages("rollama")
```

Or you can install the development version of `rollama` from [GitHub](https://github.com/JBGruber/rollama). This version is updated more frequently and may contain bug fixes (or new bugs):

```r
# install.packages("remotes")
remotes::install_github("JBGruber/rollama")
```

After installation, you can check whether your local Ollama instance is up and running:

```r
rollama::ping_ollama()
```

### Installation of Ollama through Docker

For beginners we recommend downloading Ollama from [their website](https://ollama.com/). However, if you are familiar with Docker, you can also run Ollama through Docker. The advantage of running things through Docker is that the application is isolated from the rest of your system, behaves the same on different systems, and is easy to download and update. You can also get a nice web interface.

After making sure [Docker](https://docs.docker.com/desktop/) is installed, you can simply use the Docker Compose file from [this gist](https://gist.github.com/JBGruber/73f9f49f833c6171b8607b976abc0ddc). If you don’t know how to use Docker Compose, you can follow this [video](https://www.youtube.com/watch?v=iMyCdd5nP5U) to use the compose file and start Ollama and Open WebUI.

## Example

You can make single requests, which do not store any history and treat each query as a new conversation:

```{r query}
# ask a single question
query("Why is the sky blue? Answer with one sentence.")
```

With the `output` argument, you can specify the format of the response. Available options include "text", "list", "data.frame", "response", "httr2_response", and "httr2_request":

```{r output}
# ask a single question and specify the output format
query("Why is the sky blue? Answer with one sentence.", output = "text")
```
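
For example, `output = "data.frame"` should return the answer together with metadata about the request; a sketch, the exact columns may differ between rollama versions:

```{r output_df}
# the same question, but with the response returned as a data frame
answer_df <- query("Why is the sky blue? Answer with one sentence.",
                   output = "data.frame")
str(answer_df)
```
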

Or you can use the `chat` function, which treats all messages sent during an R session as part of the same conversation:

```{r chat}
# hold a conversation
chat("Why is the sky blue? Give a short answer.")
chat("And how do you know that? Give a short answer.")
```
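
Because `chat` keeps the conversation in the background, you can inspect what has been sent so far; a sketch, assuming rollama's `chat_history` helper:

```{r history}
# show the messages exchanged in the current conversation
chat_history()
```
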
If you are done with a conversation and want to start a new one, you can do that like so:

```{r}
new_chat()
```

You can set a number of model parameters, either by creating a new model, with a [modelfile](https://jbgruber.github.io/rollama/reference/create_model.html), or by including the parameters in the prompt:

```{r}
query("Why is the sky blue? Answer with one sentence.", output = "text",
      model_params = list(
        seed = 42,
        temperature = 0,
        num_gpu = 0
      ))
```

```{r}
# if you don't have the model yet: pull_model("llama3.2:3b-instruct-q4_1")
query("Why is the sky blue? Answer with one sentence.")
```
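
Models first need to be downloaded from the Ollama library before they can be used. A quick sketch using rollama's model-management helpers (`pull_model` and `list_models`; the model tag is just an example):

```{r models}
# list models that are already available on your machine
list_models()

# download a model from the Ollama library
# (uncomment to run; the download can be large)
# pull_model("llama3.2:3b-instruct-q4_1")
```
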

## Easy query generation

The `make_query` function simplifies the creation of structured queries, which can, for example, be used in [annotation tasks](https://jbgruber.github.io/rollama/articles/annotation.html#the-make_query-helper-function).

Main components (check the [documentation](https://jbgruber.github.io/rollama/articles/annotation.html#the-make_query-helper-function) for more options):

- **`text`**: The text(s) to classify.
- **`prompt`**: The (classification) question to ask about each text.
- **`system`**: Optional system prompt providing context or instructions for the task.
- **`examples`**: Optional prior examples for one-shot or few-shot learning (user messages and assistant responses).

**Zero-shot Example**

In this example, the function is used without examples:

```{r make_query}
# Create a query using make_query
q_zs <- make_query(
  text = "the pizza tastes terrible",
  prompt = "Is this text: 'positive', 'neutral', or 'negative'?",
  system = "You assign texts into categories. Answer with just the correct category."
)
# Print the query
print(q_zs)
# Run the query
query(q_zs, output = "text")
```
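
**Few-shot Example**

Prior examples can be supplied through the `examples` argument. A sketch: I assume here that `examples` accepts a data frame with `text` and `answer` columns, as in the annotation article:

```{r make_query_fs}
# Create a few-shot query (the examples format is an assumption)
q_fs <- make_query(
  text = "the service was really friendly",
  prompt = "Is this text: 'positive', 'neutral', or 'negative'?",
  system = "You assign texts into categories. Answer with just the correct category.",
  examples = data.frame(
    text = c("the pizza tastes terrible", "the soup was fine"),
    answer = c("negative", "neutral")
  )
)
# Run the query
query(q_fs, output = "text")
```
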

## Learn more

- [Use rollama for annotation tasks](https://jbgruber.github.io/rollama/articles/annotation.html)