Commit 8ad97cb

doc: update installation and dependency
1 parent f0ddd0e

5 files changed: +18 -5 lines

README.md (+9 -2)

@@ -228,13 +228,20 @@ https://github.com/OpenBMB/AgentVerse/assets/11704492/4d07da68-f942-4205-b558-f1
 
 ## Installation
 
+
 **Manually Install (Recommended!)**
+
+**Make sure you have Python >= 3.9**
 ```bash
 git clone https://github.com/OpenBMB/AgentVerse.git --depth 1
 cd AgentVerse
-python setup.py develop
+pip install -e .
+```
+
+If you want to use AgentVerse with local models such as LLaMA, you need to additionally install some other dependencies:
+```bash
+pip install -r requirements_local.txt
 ```
-Some users have reported problems installing the `orjson` required by `gradio`. One simple workaround is to install it with Anaconda `conda install -c conda-forge orjson`.
 
 **Install with pip**
 

agentverse/llms/utils/token_counter.py (+1 -1)

@@ -5,7 +5,6 @@
 from agentverse.logging import logger
 from agentverse.message import Message
 from agentverse.llms import LOCAL_LLMS
-from transformers import AutoTokenizer
 
 
 def count_string_tokens(prompt: str = "", model: str = "gpt-3.5-turbo") -> int:
@@ -29,6 +28,7 @@ def count_message_tokens(
         tokens_per_name = 1
         encoding_model = "gpt-4"
     elif model in LOCAL_LLMS:
+        from transformers import AutoTokenizer
         encoding = AutoTokenizer.from_pretrained(model)
     else:
         raise NotImplementedError(

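The only functional change in this file is that `transformers` is now imported lazily, inside the branch that handles models listed in `LOCAL_LLMS`, so importing the module no longer requires `transformers` when only hosted models are used. A minimal sketch of that deferred-import pattern, with a simplified stand-in function (`count_tokens`, the example model id, and the tiktoken fallback are illustrative assumptions, not the repository's actual `count_message_tokens`):

```python
# Sketch of the deferred-import pattern used by this commit (simplified;
# not the actual count_message_tokens implementation).
LOCAL_LLMS = ["meta-llama/Llama-2-7b-chat-hf"]  # assumed example entry

def count_tokens(text: str, model: str = "gpt-3.5-turbo") -> int:
    if model in LOCAL_LLMS:
        # Heavy optional dependency, imported only when a local model is
        # actually requested, so API-only installs stay lightweight.
        from transformers import AutoTokenizer

        tokenizer = AutoTokenizer.from_pretrained(model)
        return len(tokenizer.encode(text))
    # Hosted models: tiktoken is pinned in requirements.txt, so it is
    # available without the local extras.
    import tiktoken

    encoding = tiktoken.encoding_for_model(model)
    return len(encoding.encode(text))
```

Because the import now happens at call time, a missing `transformers` only surfaces when a local model is actually selected, which matches moving `fschat` out of the base requirements below.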
requirements.txt (+1 -2)

@@ -2,7 +2,7 @@ pyyaml
 fastapi==0.95.1
 uvicorn
 py3langid
-iso-639
+setuptools-scm
 openai==0.27.8
 opencv-python==4.8.0.76
 gradio
@@ -17,5 +17,4 @@ colorlog
 rapidfuzz
 spacy
 colorama==0.4.6
-fschat[model_worker,webui]
 tiktoken==0.5.1

requirements_local.txt (+1)

@@ -0,0 +1 @@
+fschat[model_worker,webui]

setup.py (+6)

@@ -5,6 +5,9 @@
 with open("requirements.txt", "r") as f:
     requirements = f.read().splitlines()
 
+with open("requirements_local.txt", "r") as f:
+    requirements_local = f.read().splitlines()
+
 with open("README.md", "r", encoding='utf8') as fh:
     long_description = fh.read()
 
@@ -38,6 +41,9 @@
     # "langchain",
     # ],
     install_requires=requirements,
+    extras_require={
+        'local': requirements_local
+    },
     include_package_data = True,
     entry_points={
         "console_scripts": [

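The `extras_require` entry exposes the local-model dependencies as a pip extra, so the separate `pip install -r requirements_local.txt` step in the README can alternatively be done in one command with `pip install -e ".[local]"`. A trimmed sketch of the resulting wiring (the package name, version, and other metadata below are placeholders, not the repository's actual values):

```python
# Trimmed setup.py sketch of the optional-dependency wiring added in this
# commit; name/version/packages are placeholders.
from setuptools import find_packages, setup

with open("requirements.txt", "r") as f:
    requirements = f.read().splitlines()

with open("requirements_local.txt", "r") as f:
    requirements_local = f.read().splitlines()

setup(
    name="agentverse",   # placeholder
    version="0.0.0",     # placeholder
    packages=find_packages(),
    install_requires=requirements,   # core deps, always installed
    extras_require={
        # `pip install -e ".[local]"` additionally installs
        # fschat[model_worker,webui] from requirements_local.txt
        "local": requirements_local,
    },
)
```

Plain `pip install -e .` keeps the install to the core requirements; the `[local]` extra is only needed when running local models such as LLaMA.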