Reference Video:
【DeepSeek+LoRA+FastAPI】How Developers Fine-tune Large Models and Expose Interfaces for Backend Calls

I ran into a few problems while following the blogger's installation video. Part of the reason is probably that the blogger was working on Linux while I am on Windows, and the rented server may have come with some basic configuration pre-installed, so following the steps exactly produced several errors.

1. Error: pip install -e ".[torch,metrics]"

  1. Check the .condarc configuration file: remove the defaults channel and replace it with a domestic mirror source, for example:
show_channel_urls: true
channels:
- https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/main
- https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/r
- https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/msys2
envs_dirs:
- D:\Software\LLaMA-Factory
- e:\llama-factory-domain

2. Error running llamafactory-cli webui

1. Running llamafactory-cli version reports a traceback:

Traceback (most recent call last):

This shows that llamafactory was not installed successfully and needs to be reinstalled. Note that installing the latest version may cause version incompatibilities, so pin the version:
pip install llama-factory==4.44.1 --upgrade -i https://pypi.tuna.tsinghua.edu.cn/simple
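
Before retrying, you can confirm from Python whether the package is actually present in the current environment (the distribution name llamafactory below is an assumption; use whatever name pip list reports):

# report the installed version of llamafactory, if any
from importlib.metadata import PackageNotFoundError, version

try:
    print("llamafactory", version("llamafactory"))  # assumed distribution name
except PackageNotFoundError:
    print("llamafactory is not installed in this environment")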

2. Error File "...\httpx\_transports\default.py", line 153... FileNotFoundError: [Errno 2] No such file or directory

  • Manually set the SSL_CERT_FILE environment variable so that it points to the cacert.pem file in the Python installation directory.
  • If no cacert.pem file can be found, download one from the official cURL site and place it in that directory.
  • If the error persists after setting the variable, run echo %SSL_CERT_FILE% to see where it currently points and correct it with set SSL_CERT_FILE=C:\Python310\DLLs\cacert.pem (see the snippet after this list for a way to locate a usable certificate bundle).
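
If no cacert.pem can be found anywhere, the certifi package that ships with most Python environments bundles one, and its path is a valid value for SSL_CERT_FILE (using certifi here is my suggestion, not something the tutorial requires):

# print the path of certifi's CA bundle; point SSL_CERT_FILE at this file
import certifi
print(certifi.where())

Run it with python -c and then set SSL_CERT_FILE to the printed path.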

3. ERROR: Exception in ASGI application Traceback (most recent call last):

  • Check the dependencies. A normal installation generally shouldn't have dependency problems; if it does, try pinning the version with pip install llama-factory==4.44.1 --upgrade -i https://pypi.tuna.tsinghua.edu.cn/simple.

  • TypeError: argument of type 'bool' is not iterable: set the global_share attribute to True in src/llamafactory/webui/interface.py (see the gradio sketch after this list).

    Reference: "LLaMA-Factory Deployment and llamafactory-cli webui UI Not Opening: Issue Resolution Record"

  • ValueError: When localhost is not accessible, a shareable link must be created. Please set share=True or check your proxy settings to allow access to localhost.

This is caused by mismatched component versions; reinstall the pinned version of the component: pip install pydantic==2.10.6.
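
For context on the share=True hints in the two items above: the LLaMA-Factory web UI is built on gradio, and share is simply an argument of gradio's launch() call. A minimal standalone sketch (illustrative gradio code, not the actual LLaMA-Factory source):

# a bare gradio app; share=True asks gradio to create a public tunnel
# when localhost is not directly reachable
import gradio as gr

with gr.Blocks() as demo:
    gr.Markdown("placeholder UI")

demo.queue().launch(share=True)

Note that share=True is also what makes gradio fetch the frpc helper described in error 4 below.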

4. Please check your internet connection…

“Please check your internet connection. This can happen if your antivirus software blocks the download of this file. You can install manually by following these steps:”

Just follow the steps given in the message:

  1. Download this file: https://cdn-media.huggingface.co/frpc-gradio-0.2/frpc_windows_amd64.exe
  2. Rename the downloaded file to: frpc_windows_amd64_v0.2
  3. Move the file to this location: D:\Software\LLaMA-Factory\LLaMA-Factory\lib\site-packages\gradio
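
The same three steps can also be scripted. The sketch below downloads the binary and copies it, under the expected name, into the directory of the installed gradio package (the URL and file name come from the message above; locating the target folder through gradio.__file__ is my assumption, consistent with the site-packages\gradio path in step 3):

# download gradio's frpc helper and place it inside the installed gradio package
import shutil
import urllib.request
from pathlib import Path

import gradio

url = "https://cdn-media.huggingface.co/frpc-gradio-0.2/frpc_windows_amd64.exe"
target_dir = Path(gradio.__file__).parent        # ...\site-packages\gradio
target = target_dir / "frpc_windows_amd64_v0.2"  # the file name gradio looks for

tmp_file, _ = urllib.request.urlretrieve(url)    # download to a temporary file
shutil.copy(tmp_file, target)                    # copy into place under the new name
print("copied to", target)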