Description
To test out LMQL as per the guidance in the documentation, I went through the following steps and encountered several issues:
1. I started in a blank folder in a Linux environment and created the environment from the provided requirements file:
   conda env create -f requirements.yml -n lmql-dev
2. After activating the environment in conda, I ran the activation script:
   source activate-dev.sh
3. I then installed LMQL into the environment so it runs on my GPU:
   pip install lmql[hf]
To test a local model (Llama-2-7B):
4. I installed llama.cpp support via llama-cpp-python, as per its documentation:
   CMAKE_ARGS="-DLLAMA_BLAS=ON -DLLAMA_BLAS_VENDOR=OpenBLAS" pip install llama-cpp-python
5. To load the .gguf file of Llama-2-7B-GGUF, I served it with:
   lmql serve-model llama.cpp:/home/gayanukaa/llm-test/lmql-test/llama-2-7b.Q4_K_M.gguf --cuda --port 9999 --trust_remote_code True
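As a sanity check before running any queries, one can verify that the serve-model process is actually listening on the port given above (9999 in the command). This is a minimal stdlib-only sketch, not part of LMQL itself; the helper name `port_open` is my own:

```python
import socket

def port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    # 9999 matches the --port flag passed to lmql serve-model above.
    print("inference server reachable:", port_open("localhost", 9999))
```

If this prints False, the server never came up (or crashed while loading the .gguf), which would explain clients hanging with no output.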
From this point onwards I faced several issues:
- Despite installing lmql, I got a ModuleNotFoundError when running Python files.
- After starting the inference server as in step 5 above, I tried running Python scripts via each of the three possible methods, as well as lmql run for .lmql files. There was no output; I was not sure whether model loading simply takes a long time, but it stays stuck like this indefinitely.
I would appreciate any help or guidance on resolving these issues so that I can run and learn LMQL.