LLaMA-Factory Gradio Environment Configuration
Reference Video:
【DeepSeek+LoRA+FastAPI】How Developers Fine-tune Large Models and Expose Interfaces for Backend Calls
After following the installation steps in the blogger’s video, I ran into a number of issues. This is probably because the blogger was using Linux while I am on Windows, or because the rented server already had some basic configuration pre-installed, so following the steps exactly produced a few errors.
1. Error: `pip install -e ".[torch,metrics]"`
- Check the `.condarc` configuration file. Delete the default channels and replace them with a domestic mirror source, keeping `show_channel_urls: true` (see the example below).
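For reference, a minimal `.condarc` sketch assuming the Tsinghua (TUNA) mirror; any domestic mirror works, and the channel list below is just one common choice:

```yaml
# Example .condarc using the TUNA Anaconda mirror instead of the default channels
show_channel_urls: true
channels:
  - https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/main
  - https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/free
```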
2. Error running `llamafactory-cli webui`
1. Running `llamafactory-cli version` prints `Traceback (most recent call last):`

This proves that llamafactory was not installed successfully and needs to be reinstalled. Note that installing the latest version may cause version incompatibilities, so pin the version: `pip install llama-factory==4.44.1 --upgrade -i https://pypi.tuna.tsinghua.edu.cn/simple`.
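If the command still fails after reinstalling, it is worth confirming that the package is present in the currently active environment at all. A minimal check, assuming the distribution is published under the name `llamafactory`:

```python
# Check whether llamafactory is installed in the active environment
# and print the version that pip actually resolved.
from importlib.metadata import PackageNotFoundError, version

try:
    print("llamafactory", version("llamafactory"))
except PackageNotFoundError:
    print("llamafactory is not installed in this environment")
```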
2. Error File "...\httpx\_transports\default.py", line 153... FileNotFoundError: [Errno 2] No such file or directory
- Manually set the `SSL_CERT_FILE` environment variable so that it points to the `cacert.pem` file in the Python installation directory.
- If the `cacert.pem` file cannot be found, download it from the official cURL repository and place it in the correct directory.
- If it still cannot be found after setting the variable, use `echo %SSL_CERT_FILE%` to view the actual location and change it manually with `set SSL_CERT_FILE=C:\Python310\DLLs\cacert.pem` (the snippet below can help locate a usable `cacert.pem`).
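If you are not sure where a usable `cacert.pem` lives on your machine, the `certifi` package (a dependency of httpx and requests, so it is normally already installed) can report where its bundled CA file is. A small sketch, assuming `certifi` is importable:

```python
# Print the path of the CA bundle shipped with certifi;
# SSL_CERT_FILE can then be pointed at this file.
import certifi

print(certifi.where())
```

Point `SSL_CERT_FILE` at the printed path (with `set SSL_CERT_FILE=...` in the same terminal) before launching the web UI.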
3. ERROR: Exception in ASGI application Traceback (most recent call last):
Check the dependencies. Generally, a normal installation shouldn’t have dependency issues; try pinning the version with `pip install llama-factory==4.44.1 --upgrade -i https://pypi.tuna.tsinghua.edu.cn/simple`.

For `TypeError: argument of type 'bool' is not iterable`, modify the `global_share` attribute value to `True` in `/src/llamafactory/webui/interface.py`.

Reference: LLaMa-Factory Deployment and llamafactory-cli webui UI Interface Not Opening Issue Resolution Record
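The edit described above boils down to making sure Gradio's `launch()` is called with `share=True`. A standalone sketch of the same idea, assuming `create_ui` is still the builder exposed by `llamafactory.webui.interface` (the exact code inside `interface.py` varies between versions):

```python
# Launch the LLaMA-Factory web UI with a public Gradio share link instead of
# relying on localhost being reachable (the effect of the global_share edit).
from llamafactory.webui.interface import create_ui

if __name__ == "__main__":
    create_ui().queue().launch(share=True, inbrowser=True)
```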
`ValueError: When localhost is not accessible, a shareable link must be created. Please set share=True or check your proxy settings to allow access to localhost.`

This is caused by inconsistent component versions. Reinstall the specified version of the component: `pip install pydantic==2.10.6`.
4. Please check your internet connection…
“Please check your internet connection. This can happen if your antivirus software blocks the download of this file. You can install manually by following these steps:”
Just follow the steps in the message and download the file manually:

- Download this file: https://cdn-media.huggingface.co/frpc-gradio-0.2/frpc_windows_amd64.exe
- Rename the downloaded file to: `frpc_windows_amd64_v0.2`
- Move the file to this location: `D:\Software\LLaMA-Factory\LLaMA-Factory\lib\site-packages\gradio` (the snippet below shows how to find the matching `gradio` directory in your own environment)
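The `...\lib\site-packages\gradio` path above is specific to this machine. To find the equivalent directory in your own environment, you can ask Python where the installed gradio package lives:

```python
# Print the directory of the installed gradio package; the renamed
# frpc_windows_amd64_v0.2 binary needs to be placed in this folder.
import os

import gradio

print(os.path.dirname(gradio.__file__))
```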

