- Add README, LICENSE, and pyproject.toml.
- Add basic functionality.
- Rename project to aitelegrambot from rationalai-telegram-bot.
- Add script for pipx.
- Add additional instructions to the README.
- Use a separate module for constant messages.
- Fix a bug that caused the command itself to be sent to the LLM.
- The bot now replies to the message that invoked the infer command.
- Add an administration feature that allows the admin to:
- Change the current model.
- Pull a model.
- Delete a model.
- List models.
- Fix a bug in the list models command: reply with an error message when no models are available.
- Add support for streaming responses (receiving messages as the LLM generates them).
The following two releases are minor bug fixes that should have been tested more thoroughly.
- Fix bugs
- Fix bugs
- Add support for both streaming and non-streaming inference.
- Default to non-streaming.
- Add an option to choose between streaming and non-streaming inference.
- Fix bug
- Fix bug
- Add a help command.
- Reply to the user even when message streaming is disabled.
- Fix bugs with pull and delete commands.
- Add support for multiple administrators.
- Indicate the current model in the list_models command.
- Fix an error in the /help command.
- Remove debugging statements from the /list_models command.
- Fix a BadRequest error when using streaming inference.
- Add a help message when the user sends an empty message to the /infer command.
- Fix an error in the /infer command.
- Fix the behaviour of the /infer command.