Commit cfc4947

Merge pull request #35 from JBGruber/ReadMe

2 parents f1ba0eb + 4030b0e
File tree

3 files changed: +194 −199 lines

README.Rmd

+70 −19
````diff
@@ -37,7 +37,7 @@ You can install this package from CRAN:
 install.packages("rollama")
 ```

-Or you can install the development version of `rollama` from [GitHub](https://github.com/JBGruber/rollama) with:
+Or you can install the development version of `rollama` from [GitHub](https://github.com/JBGruber/rollama). This version is updated more frequently and may contain bug fixes (or new bugs):

 ``` r
 # install.packages("remotes")
````
````diff
@@ -56,13 +56,11 @@ rollama::ping_ollama()

 ### Installation of Ollama through Docker

-The advantage of running things through Docker is that the application is isolated from the rest of your system, behaves the same on different systems, and is easy to download and update.
+For beginners, we recommend downloading Ollama from [their website](https://ollama.com/). However, if you are familiar with Docker, you can also run Ollama through Docker. The advantage of running things through Docker is that the application is isolated from the rest of your system, behaves the same on different systems, and is easy to download and update.
 You can also get a nice web interface.
 After making sure [Docker](https://docs.docker.com/desktop/) is installed, you can simply use the Docker Compose file from [this gist](https://gist.github.com/JBGruber/73f9f49f833c6171b8607b976abc0ddc).

-If you don’t know how to use Docker Compose, you can follow this video to use the compose file and start Ollama and Open WebUI:
-
-[![Install Docker on macOS, Windows and Linux](https://img.youtube.com/vi/iMyCdd5nP5U/0.jpg)](https://www.youtube.com/watch?v=iMyCdd5nP5U)
+If you don’t know how to use Docker Compose, you can follow this [video](https://www.youtube.com/watch?v=iMyCdd5nP5U) to use the compose file and start Ollama and Open WebUI.

 ## Example

````
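The compose file linked in the gist above bundles Ollama with the Open WebUI interface. Its exact contents are not reproduced here; as a rough sketch of what such a file typically looks like (service names, image tags, ports, and volume names are illustrative assumptions, not taken from the gist):

```yaml
# docker-compose.yml — illustrative sketch, NOT the gist's actual file
services:
  ollama:
    image: ollama/ollama          # official Ollama image
    ports:
      - "11434:11434"             # Ollama's default API port
    volumes:
      - ollama:/root/.ollama      # persist downloaded models across restarts
  open-webui:
    image: ghcr.io/open-webui/open-webui:main  # web interface (assumed tag)
    ports:
      - "3000:8080"               # browse to http://localhost:3000
    depends_on:
      - ollama
volumes:
  ollama:
```

Saved as `docker-compose.yml`, such a file would be started with `docker compose up -d`, after which the Ollama API listens on port 11434 as `rollama` expects.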
````diff
@@ -81,15 +79,22 @@ You can make single requests, which does not store any history and treats each q

 ```{r query}
 # ask a single question
-query("why is the sky blue?")
+query("Why is the sky blue? Answer with one sentence.")
+```
+
+With the `output` argument, we can specify the format of the response. Available options include "text", "list", "data.frame", "response", "httr2_response", and "httr2_request":
+
+```{r output}
+# ask a single question and specify the output format
+query("Why is the sky blue? Answer with one sentence.", output = "text")
 ```

 Or you can use the `chat` function, which treats all messages sent during an R session as part of the same conversation:

 ```{r chat}
 # hold a conversation
-chat("why is the sky blue?")
-chat("and how do you know that?")
+chat("Why is the sky blue? Give a short answer.")
+chat("And how do you know that? Give a short answer.")
 ```

 If you are done with a conversation and want to start a new one, you can do that like so:
````
````diff
@@ -103,11 +108,11 @@ new_chat()
 You can set a number of model parameters, either by creating a new model, with a [modelfile](https://jbgruber.github.io/rollama/reference/create_model.html), or by including the parameters in the prompt:

 ```{r}
-query("why is the sky blue?", model_params = list(
-  seed = 42,
-  temperature = 0,
-  num_gpu = 0
-))
+query("Why is the sky blue? Answer with one sentence.", output = "text",
+      model_params = list(
+        seed = 42,
+        num_gpu = 0)
+)
 ```

 ```{r include=FALSE, results='asis'}
````
````diff
@@ -131,17 +136,63 @@ options(rollama_server = "http://localhost:11434")
 You can change how a model answers by setting a configuration or system message in plain English (or another language supported by the model):

 ```{r config}
-options(rollama_config = "You make answers understandable to a 5 year old")
-query("why is the sky blue?")
+options(rollama_config = "You make short answers understandable to a 5 year old")
+query("Why is the sky blue?")
 ```

 By default, the package uses the "llama3.1 8B" model. Supported models can be found at <https://ollama.com/library>.
-To download a specific model, make use of the additional information available under "Tags" <https://ollama.com/library/mistral/tags>.
+To download a specific model, make use of the additional information available under "Tags" <https://ollama.com/library/llama3.2/tags>.
 Change this via `rollama_model`:

 ```{r model}
-options(rollama_model = "mixtral")
-# if you don't have the model yet: pull_model("mixtral")
-query("why is the sky blue?")
+options(rollama_model = "llama3.2:3b-instruct-q4_1")
+# if you don't have the model yet: pull_model("llama3.2:3b-instruct-q4_1")
+query("Why is the sky blue? Answer with one sentence.")
+```
+
+## Easy query generation
+
+The `make_query` function simplifies the creation of structured queries, which can, for example, be used in [annotation tasks](https://jbgruber.github.io/rollama/articles/annotation.html#the-make_query-helper-function).
+
+Main components (check the [documentation](https://jbgruber.github.io/rollama/articles/annotation.html#the-make_query-helper-function) for more options):
+
+- **`text`**: The text(s) to classify.
+- **`prompt`**: Could be a (classification) question.
+- **`system`**: Optional system prompt providing context or instructions for the task.
+- **`examples`**: Optional prior examples for one-shot or few-shot learning (user messages and assistant responses).
+
+**Zero-shot Example**
+In this example, the function is used without examples:
+
+```{r make_query}
+# Create a query using make_query
+q_zs <- make_query(
+  text = "the pizza tastes terrible",
+  prompt = "Is this text: 'positive', 'neutral', or 'negative'?",
+  system = "You assign texts into categories. Answer with just the correct category."
+)
+# Print the query
+print(q_zs)
+# Run the query
+query(q_zs, output = "text")
 ```

+## Learn more
+
+- [Use rollama for annotation tasks](https://jbgruber.github.io/rollama/articles/annotation.html)
+- [Annotate images](https://jbgruber.github.io/rollama/articles/image-annotation.html)
+- [Get text embedding](https://jbgruber.github.io/rollama/articles/text-embedding.html)
+- [Use more models (GGUF format) from Hugging Face](https://jbgruber.github.io/rollama/articles/hf-gguf.html)
+
+## Citation
+
+Please cite the package using the [preprint](https://arxiv.org/abs/2404.07654) DOI: <https://doi.org/10.48550/arXiv.2404.07654>
````
