Description
I'm running into a few issues when using LoRA. First, if I load a regular LoRA and select the ComfyUI (or XLab) type, the following error occurs:
model_type FLUX
!!! Exception during processing !!!
Traceback (most recent call last):
File "E:\comfyui\ComfyUI-aki-v1.3\execution.py", line 327, in execute
output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\comfyui\ComfyUI-aki-v1.3\execution.py", line 202, in get_output_data
return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\comfyui\ComfyUI-aki-v1.3\execution.py", line 174, in _map_node_over_list
process_inputs(input_dict, i)
File "E:\comfyui\ComfyUI-aki-v1.3\execution.py", line 163, in process_inputs
results.append(getattr(obj, func)(**inputs))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\comfyui\ComfyUI-aki-v1.3\custom_nodes\svdquant\nodes\lora\flux.py", line 94, in load_lora
input_lora = comfyui2diffusers(lora_path)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\comfyui\ComfyUI-aki-v1.3\python312\Lib\site-packages\nunchaku\lora\flux\comfyui_converter.py", line 53, in comfyui2diffusers
assert "lora_unet_single_blocks" in k
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
AssertionError
If I instead switch to the Diffusers type, the conversion runs but loading then fails with a different error:
[2025-02-25 01:38:22.066] [info] Loading weights from E:\comfyui\ComfyUI-aki-v1.3\models\diffusion_models\svdqfluxdev\transformer_blocks.safetensors
[2025-02-25 01:38:23.854] [info] Done.
model_type FLUX
Converting 57 transformer blocks...
Converting LoRA branch for block single_transformer_blocks.0...
- Found single_transformer_blocks.0 LoRA of qkv_proj (rank: 32)
- Using original LoRA
- Found single_transformer_blocks.0 LoRA of out_proj (rank: 32)
- Using original LoRA
......
- Found transformer_blocks.18 LoRA of mlp_fc2 (rank: 32)
- Using original LoRA
- Found transformer_blocks.18 LoRA of mlp_context_fc1 (rank: 32)
- Using original LoRA
- Found transformer_blocks.18 LoRA of mlp_context_fc2 (rank: 32)
- Using original LoRA
[2025-02-25 01:38:26.979] [info] Loading partial weights from C:\Users\Admin\AppData\Local\Temp\tmp5hm0i2ys.safetensors
!!! Exception during processing !!! 'utf-8' codec can't decode byte 0xc1 in position 0: invalid start byte
Traceback (most recent call last):
File "E:\comfyui\ComfyUI-aki-v1.3\execution.py", line 327, in execute
output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\comfyui\ComfyUI-aki-v1.3\execution.py", line 202, in get_output_data
return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\comfyui\ComfyUI-aki-v1.3\execution.py", line 174, in _map_node_over_list
process_inputs(input_dict, i)
File "E:\comfyui\ComfyUI-aki-v1.3\execution.py", line 163, in process_inputs
results.append(getattr(obj, func)(**inputs))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "E:\comfyui\ComfyUI-aki-v1.3\custom_nodes\svdquant\nodes\lora\flux.py", line 110, in load_lora
model.model.diffusion_model.model.update_lora_params(tmp_file.name)
File "E:\comfyui\ComfyUI-aki-v1.3\python312\Lib\site-packages\nunchaku\models\transformer_flux.py", line 166, in update_lora_params
block.m.load(path, True)
UnicodeDecodeError: 'utf-8' codec can't decode byte 0xc1 in position 0: invalid start byte
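The `UnicodeDecodeError` suggests the temporary `.safetensors` file handed to `block.m.load` has a malformed header: the safetensors format starts with an 8-byte little-endian length followed by that many bytes of UTF-8 JSON, and `0xc1` is never a valid UTF-8 start byte. A minimal sketch (my own helper, not nunchaku code) that reproduces and detects this kind of corruption:

```python
import json
import os
import struct
import tempfile

def check_safetensors_header(path):
    """Parse a safetensors header: 8-byte little-endian length, then that
    many bytes of UTF-8 JSON. Raises UnicodeDecodeError (as in the
    traceback) or json.JSONDecodeError if the file is malformed."""
    with open(path, "rb") as f:
        (header_len,) = struct.unpack("<Q", f.read(8))
        return json.loads(f.read(header_len).decode("utf-8"))

# Build a minimal well-formed file: empty metadata, no tensors.
good_header = json.dumps({"__metadata__": {}}).encode("utf-8")
with tempfile.NamedTemporaryFile(delete=False, suffix=".safetensors") as f:
    f.write(struct.pack("<Q", len(good_header)) + good_header)
    good_path = f.name

# Simulate the corruption from the traceback: a leading 0xc1 byte,
# which is never a valid UTF-8 start byte.
bad_header = b"\xc1" + good_header[1:]
with tempfile.NamedTemporaryFile(delete=False, suffix=".safetensors") as f:
    f.write(struct.pack("<Q", len(bad_header)) + bad_header)
    bad_path = f.name

parsed = check_safetensors_header(good_path)  # a valid header parses cleanly
try:
    check_safetensors_header(bad_path)
    corrupted_detected = False
except UnicodeDecodeError:
    corrupted_detected = True  # matches the error in the log

os.unlink(good_path)
os.unlink(bad_path)
```

Running a check like this on the temp file written by the node (the `tmp*.safetensors` path in the log) before `update_lora_params` would distinguish a corrupted write from a bug further down in the native loader.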