* Fix an issue where CMD+K does not clear the terminal when the terminal has focus (#1671)
On macOS, ⌘+K is bound by default to the Terminal: Clear command. Without this
change, ⌘+K does not clear the terminal but instead initiates a chord
sequence and waits for the next keystroke of the chord.
Co-authored-by: Rob Leidle <[email protected]>
* Change treeSitter to cache the Language objects it loads from wasm (#1672)
Without this change, for a repository with 600 TypeScript files, the
indexer would fail to finish correctly and the webview console log would
fill with many errors of the form:
'Unable to load language for file ${path} RuntimeError: table index is out of bounds'
The following bash script creates a repository that reproduces the problem (a caching sketch follows the script):
```bash
current_path="."
for ((i=1; i<=20; i++)); do
  new_folder="folder-$i"
  mkdir -p "$current_path/$new_folder"
  current_path="$current_path/$new_folder"
  for ((a=1; a<=30; a++)); do
    head -c 10000 /dev/urandom | base64 > "$current_path/file-$a.ts"
  done
done
```
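A minimal sketch of the caching described above, assuming the `web-tree-sitter` API; the function names are illustrative and not Continue's actual code:

```typescript
import Parser from "web-tree-sitter";

// Initialize the wasm runtime once, up front.
const treeSitterInit = Parser.init();

// Cache Language objects per wasm path so repeated loads reuse the same
// instance instead of re-instantiating the wasm module for every file.
const languageCache = new Map<string, Promise<Parser.Language>>();

export function getCachedLanguage(wasmPath: string): Promise<Parser.Language> {
  let cached = languageCache.get(wasmPath);
  if (!cached) {
    cached = treeSitterInit.then(() => Parser.Language.load(wasmPath));
    languageCache.set(wasmPath, cached);
  }
  return cached;
}

// Usage: one parser per file is fine, but the Language object is shared.
export async function parserForFile(wasmPath: string): Promise<Parser> {
  const parser = new Parser();
  parser.setLanguage(await getCachedLanguage(wasmPath));
  return parser;
}
```

Caching the promise rather than the resolved value also deduplicates concurrent loads of the same wasm file during indexing.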
Co-authored-by: Rob Leidle <[email protected]>
* acknowledge sourcemap flag in esbuild.js
* don't run jetbrains-release.yaml on vscode releases
* further testing for walkDir
* chore: add telemetry to commands (#1673)
* test: Add basic unit test to baseLLM (#1668)
* update version
* test: Add basic unit test to baseLLM
---------
Co-authored-by: Nate Sesti <[email protected]>
Co-authored-by: inimaz <[email protected]>
* feat: add Quick Actions CodeLens feature (#1674)
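For orientation, a CodeLens feature like this is wired through VS Code's `languages.registerCodeLensProvider` API. The sketch below is generic and hedged; it is not Continue's implementation, and the command id and document selector are assumptions:

```typescript
import * as vscode from "vscode";

// A provider that surfaces a single "quick action" lens at the top of a file.
class QuickActionsCodeLensProvider implements vscode.CodeLensProvider {
  provideCodeLenses(document: vscode.TextDocument): vscode.CodeLens[] {
    const range = new vscode.Range(0, 0, 0, 0);
    return [
      new vscode.CodeLens(range, {
        title: "Explain",
        command: "continue.exampleQuickAction", // hypothetical command id
        arguments: [document.uri, range],
      }),
    ];
  }
}

export function activate(context: vscode.ExtensionContext) {
  context.subscriptions.push(
    vscode.languages.registerCodeLensProvider(
      { scheme: "file" }, // assumed selector: all local files
      new QuickActionsCodeLensProvider(),
    ),
  );
}
```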
* docs: add docs and schema for "OS" provider (#1536)
* ignore .env
* ✨ use and cache imports for autocomplete (#1456)
* ✨ use and cache imports for autocomplete
* fix tsc
* add voyage rerank-1
* import Handlebars
* feat: open pane on install (#1564)
* feat: open pane on activation
* comment out testing code
* chore: add telemetry for pageviews (#1576)
* feat: update onboarding w/ embeddings model (#1570)
* chore(gui): remove unused pages
* feat: add embeddings step
* feat: update styles
* feat: copy button updates
* fix: correct pull command for embed model
* fix: remove commented code
* fix: remove commented code
* feat: simplify copy btn props
* chore: rename onboarding selection event
* feat: add provider config
* fix: undo msg name
* remove dead code
* fix: invalid mode check
* fix: remove testing logic
* fix: fullscreen gui retains context when hidden, fixed fullscreen focusing (#1582)
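In VS Code, keeping a webview's state alive while it is hidden is done with the `retainContextWhenHidden` panel option; the sketch below is illustrative, and the view id and title are assumptions rather than values from Continue's source:

```typescript
import * as vscode from "vscode";

// Open a full-screen-style panel whose DOM and JS state survive tab switches.
export function openFullScreenPanel(context: vscode.ExtensionContext): vscode.WebviewPanel {
  const panel = vscode.window.createWebviewPanel(
    "continue.fullScreenExample", // hypothetical view type
    "Continue",
    vscode.ViewColumn.One,
    {
      enableScripts: true,
      retainContextWhenHidden: true, // keep the GUI's context when hidden
    },
  );
  context.subscriptions.push(panel);
  return panel;
}
```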
* small UI tweaks
* media query
* feat: add best experience onboarding
* small fixes
* feat: add free trial card to onboarding (#1600)
* feat: add free trial card to onboarding
* add import
* chore: add telemetry for full screen toggle (#1618)
* rerank-lite-1
* remove docs
* basic tests for VS Code extension
* improved testing of VS Code extension
* manually implement stop tokens for hf inference api
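Manually enforcing stop tokens generally means scanning the generated text client-side and truncating at the first stop sequence, since some hosted APIs ignore or limit a server-side stop parameter. A minimal, illustrative sketch (not the extension's actual code):

```typescript
// Cut a completion off at the earliest occurrence of any stop token.
function truncateAtStopTokens(completion: string, stopTokens: string[]): string {
  let cutoff = completion.length;
  for (const stop of stopTokens) {
    const index = completion.indexOf(stop);
    if (index !== -1 && index < cutoff) {
      cutoff = index;
    }
  }
  return completion.slice(0, cutoff);
}

// Example: truncateAtStopTokens("def foo():\n  pass\n</s>junk", ["</s>"])
// returns "def foo():\n  pass\n".
```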
* chore: onboarding metrics (#1626)
* fix: pageview tracking
* feat: add onboarding telemetry
* create single `onboardingStatus` type
* improved var naming
* remove console logs
* fix windows performance issue
* rename vscodeExtension.ts
* migration of onboarding variables
* "stash" instead of "delete" in indexing progress
* fix preview.yaml
* also fix main.yaml
* Update troubleshooting.md (#1637)
* feat: add quick actions
* Update index.d.ts
* quick actions mvp
* update docs
* subscribe to vscode change settings
* Update commands.ts
* cleanup
* Update quick-actions.md
* Update VerticalPerLineCodeLensProvider.ts
* resolve feedback
---------
Co-authored-by: Nate Sesti <[email protected]>
Co-authored-by: Nate Sesti <[email protected]>
Co-authored-by: Jonah Wagner <[email protected]>
* chore: add `isCommandEvent` to command telemetry (#1675)
* chore: add `isCommandEvent` to command telemetry
* Update commands.ts
* Nate/better retrieval (#1677)
* deduplicatearray tests
* break out separate retrieval pipelines
* IConfigHandler
* tests for codebase indexer
* better .continueignore for continue
* indexing fixes
* ignore .gitignore and .continueignore when indexing
* retrieval pipeline improvements
* fix formatting err in our .continueignore
* add necessary filter to lance_db_cache
* update package.json version
* skip unused tests
* don't ignore .prompt files
* update version
* Update pull_request_template.md
* don't use multi-media format when there are multiple text items
* add free trial experience (#1685)
* fix: add code range for quick actions/fixes (#1687)
* fix: add code range for quick actions/fixes
* Update test.js
* add pathSep message type
* docs improvements
* jetbrains fix
* update package.json version
---------
Co-authored-by: Rob Leidle <[email protected]>
Co-authored-by: Rob Leidle <[email protected]>
Co-authored-by: Patrick Erichsen <[email protected]>
Co-authored-by: inimaz <[email protected]>
Co-authored-by: inimaz <[email protected]>
Co-authored-by: Jonah Wagner <[email protected]>
Co-authored-by: Priyash <[email protected]>
Documentation changes (sample `config.json` setups):

````diff
 Want a quick and easy setup for Continue? We've got you covered with some sample `config.json` files for different scenarios. Just copy and paste them into your `config.json` by clicking the gear icon at the bottom right of the Continue sidebar.

-## Best Overall Experience
+## Quick Setup Options
+
+You can use Continue in different ways. Here are some quick setups for common uses:
+
+- [Free Trial](#free-trial) - Try Continue without any additional setup.
+- [Best Overall Experience](#best-overall-experience) - Utilize the hand-picked models for the best experience.
+- [Local and Offline](#local-and-offline-configuration) - Use local models for offline use with better privacy.
+
+### Free Trial
+
+The `free-trial` lets new users try out Continue with GPT-4o, Llama3, Claude 3.5, and other models using a ContinueDev proxy server that securely makes API calls to these services.
+
+```json title="~/.continue/config.json"
+{
+  "models": [
+    {
+      "title": "GPT-4o (trial)",
+      "provider": "free-trial",
+      "model": "gpt-4o"
+    }
+  ],
+  "tabAutocompleteModel": {
+    "title": "Codestral (trial)",
+    "provider": "free-trial",
+    "model": "AUTODETECT"
+  },
+  "embeddingsProvider": {
+    "provider": "free-trial"
+  },
+  "reranker": {
+    "name": "free-trial"
+  }
+}
+```
+
+### Best Overall Experience

 This setup uses Claude 3.5 Sonnet for chatting, Codestral for autocomplete, and Voyage AI for embeddings and reranking.

 **What You Need:**

-1. Get a Codestral API key from [Mistral AI's La Plateforme](https://console.mistral.ai/codestral)
-2. Get an Anthropic API key from [Anthropic Console](https://console.anthropic.com/account/keys)
-3. Replace `[CODESTRAL_API_KEY]` and `[ANTHROPIC_API_KEY]` with the keys you got from the above links.
-
-:::note
-This example uses a free trial for embeddings and reranking, forwarding requests via ContinueDev proxy. For direct service, get a Voyage AI API key and update the `provider` and `apiKey` fields. See the [config reference for Voyage AI](../walkthroughs//codebase-embeddings.md#voyage-ai) for details on how to set this up.
-:::
+1. Get an Anthropic API key from [Anthropic Console](https://console.anthropic.com/account/keys)
+2. Get a Codestral API key from [Mistral AI's La Plateforme](https://console.mistral.ai/codestral)
+3. Get a Voyage AI API key from [Voyage AI Dashboard](https://dash.voyageai.com/)
+4. Replace `[CODESTRAL_API_KEY]`, `[ANTHROPIC_API_KEY]`, and `[VOYAGE_API_KEY]` with the keys you got from the above links.

 ```json title="~/.continue/config.json"
 {
@@ -39,15 +71,21 @@ This example uses a free trial for embeddings and reranking, forwarding requests
     "apiKey": "[CODESTRAL_API_KEY]"
   },
   "embeddingsProvider": {
-    "provider": "free-trial"
+    "provider": "openai",
+    "model": "voyage-code-2",
+    "apiBase": "https://api.voyageai.com/v1/",
+    "apiKey": "[VOYAGE_AI_API_KEY]"
   },
   "reranker": {
-    "name": "free-trial"
+    "name": "voyage",
+    "params": {
+      "apiKey": "[VOYAGE_AI_API_KEY]"
+    }
   }
 }
 ```

-## Local and Offline Configuration
+### Local and Offline Configuration

 This configuration leverages Ollama for all functionalities - chat, autocomplete, and embeddings - ensuring that no code is transmitted outside your machine, allowing Continue to be run even on an air-gapped computer.

@@ -135,39 +173,57 @@ If you need to send custom headers for authentication, you may use the `requestO

 ```json title="~/.continue/config.json"
 {
   "models": [
     {
       "title": "Ollama",
       "provider": "ollama",
       "model": "llama2-7b",
       "requestOptions": {
         "headers": {
           "X-Auth-Token": "xxx"
         }
       }
     }
   ]
 }
 ```

 Similarly if your model requires a Certificate for authentication, you may use the `requestOptions.clientCertificate` property like in the example below:

 ```json title="~/.continue/config.json"
 {
   "models": [
     {
       "title": "Ollama",
       "provider": "ollama",
       "model": "llama2-7b",
       "requestOptions": {
         "clientCertificate": {
           "cert": "C:\temp\ollama.pem",
           "key": "C:\temp\ollama.key",
           "passphrase": "c0nt!nu3"
         }
       }
     }
   ]
 }
 ```
+
+## Context Length
+
+Continue by default knows the context length for common models. For example, it will automatically assume 200k tokens for Claude 3. For Ollama, the context length is determined automatically by asking Ollama. If neither of these is sufficient, you can manually specify the context length using the `"contextLength"` property in your model in config.json.
````
`docs/docs/walkthroughs/codebase-embeddings.md` (+1, -1):

````diff
@@ -294,7 +294,7 @@ Continue offers a free trial of Voyage AI's reranking model.
 }
 ```

-## Customizing which files are indexed
+## Ignore files during indexing

 Continue respects `.gitignore` files in order to determine which files should not be indexed. If you'd like to exclude additional files, you can add them to a `.continueignore` file, which follows the exact same rules as `.gitignore`.
````
`extensions/intellij/src/main/kotlin/com/github/continuedev/continueintellijextension/actions/ContinuePluginActions.kt` (+1, -16):

```diff
@@ -172,22 +172,7 @@ class ViewLogsAction : AnAction() {
     }
 }

-class ToggleAuxiliaryBarAction : AnAction() {
-    override fun actionPerformed(e: AnActionEvent) {
-        val project = e.project ?: return
-        val toolWindowManager = ToolWindowManager.getInstance(project)
-        val toolWindow = toolWindowManager.getToolWindow("Continue")
```
`extensions/intellij/src/main/kotlin/com/github/continuedev/continueintellijextension/continue/CoreMessenger.kt` (+1):

```diff
@@ -147,6 +147,7 @@ class CoreMessenger(private val project: Project, esbuildPath: String, continueC
```
`extensions/intellij/src/main/kotlin/com/github/continuedev/continueintellijextension/continue/IdeProtocolClient.kt`