- Added `transform` option for OpenAI-FIM-compatible providers. This feature enables support for non-OpenAI-FIM-compatible APIs through the OpenAI-FIM-compatible provider, such as the DeepInfra FIM API. Example configurations are available in recipes.md.
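
  A minimal sketch of what such a `transform` entry might look like, assuming it is a list of functions that each receive the prepared request (end point, headers, body) and return a modified table; the option names, transform signature, and DeepInfra field names below are assumptions, and recipes.md remains the authoritative reference.

  ```lua
  -- Hypothetical sketch: the transform signature and the DeepInfra field names
  -- are assumptions; see recipes.md for the authoritative configuration.
  require('minuet').setup {
    provider = 'openai_fim_compatible',
    provider_options = {
      openai_fim_compatible = {
        name = 'DeepInfra',
        end_point = 'https://api.deepinfra.com/v1/inference/...', -- placeholder
        api_key = 'DEEPINFRA_API_KEY',
        model = '...',
        transform = {
          -- Reshape the OpenAI-FIM style body into the form the target API expects.
          function(request)
            request.body = {
              input = request.body.prompt, -- field name is an assumption
              stream = request.body.stream,
            }
            return request
          end,
        },
      },
    },
  }
  ```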
- Modified the Gemini provider's default prompt strategy to use the new Prefix-First structure.
- Other providers will continue to use their previous default prompt configurations.
- lsp: always `prepend_to_complete_word` without checking `TriggerCharacter`.
- Add a new "Prefix-First" prompt structure for chat LLMs.
- Change multi-line indicators to the unicode character `⏎`.
- lsp: Change multi-line indicators to the unicode character `⏎`.
- Lualine integration: add spinner component.
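
  Wiring the spinner into lualine might look like the sketch below; the `minuet.lualine` component path is an assumption, so check the plugin documentation for the actual name.

  ```lua
  -- Sketch: the component module path ('minuet.lualine') is an assumption;
  -- consult the plugin docs for the exact component name.
  require('lualine').setup {
    sections = {
      lualine_x = {
        require('minuet.lualine'), -- spinner shown while a completion request is in flight
        'encoding',
        'filetype',
      },
    },
  }
  ```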
- lsp: add `detail` field to show provider name.
- Minuet Event: add three user events emitted during the request workflow.
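
  These can be consumed with a `User` autocommand; the event names below are assumptions, so refer to the plugin documentation for the names actually emitted.

  ```lua
  -- Sketch: the event names ('MinuetRequestStarted', 'MinuetRequestFinished') are
  -- assumptions; the plugin docs list the actual User events fired.
  vim.api.nvim_create_autocmd('User', {
    pattern = { 'MinuetRequestStarted', 'MinuetRequestFinished' },
    callback = function(args)
      -- args.match holds the event name; args.data, when present, carries request details.
      vim.notify('minuet: ' .. args.match, vim.log.levels.DEBUG)
    end,
  })
  ```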
- lsp: Handle requests where `params.context` is not provided (for mini.completion).
- lsp: Don't explicitly disable auto trigger for filetypes not in `enabled_auto_trigger_ft`.
- lsp: Fix cursor column position when trying to get current line content.
- lsp: Early return on throttle.
- Introduced in-process LSP support for built-in completion.
- For blink-cmp, the completion item's `kind_name` now reflects the LLM provider name.
- Improve error handling for both stream and non-stream JSON decoding, providing more informative messages.
- Added recipes for llama.cpp
- Added recipes for integration with VectorCode
- Add luarocks release
- `Minuet change_model` now supports `vim.ui.select`
- Remove Hugging Face provider
- Remove deprecated commands (`MinuetToggle*`, `MinuetChange*`)
- Add option to show suggestion when completion menu is visible
- Support `api_key` as a function
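
  This lets the key be resolved lazily, for example from a secrets manager instead of an environment variable. A sketch, assuming the provider options accept a function that returns the key string:

  ```lua
  -- Sketch: assumes api_key may be either an environment-variable name or a
  -- function returning the key at request time.
  require('minuet').setup {
    provider_options = {
      openai = {
        api_key = function()
          -- e.g. read the key from the `pass` password manager
          return (vim.fn.system({ 'pass', 'show', 'openai/api-key' }):gsub('%s+$', ''))
        end,
      },
    },
  }
  ```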
- Update default gemini model to `gemini-2.0-flash`
- Add command `Minuet change_preset`
- Add experimental RAG feature doc with VectorCode
- Initial release