D:\Projects>git clone https://github.com/QiuYannnn/Local-File-Organizer.git
Cloning into 'Local-File-Organizer'...
remote: Enumerating objects: 170, done.
remote: Counting objects: 100% (40/40), done.
remote: Compressing objects: 100% (20/20), done.
remote: Total 170 (delta 24), reused 20 (delta 20), pack-reused 130 (from 1)
Receiving objects: 100% (170/170), 27.91 MiB | 2.75 MiB/s, done.
Resolving deltas: 100% (75/75), done.
D:\Projects>cd lo*
D:\Projects\Local-File-Organizer>conda activate py
Error while loading conda entry point: conda-libmamba-solver (initialization failed)
EnvironmentNameNotFound: Could not find conda environment: py
You can list all discoverable environments with `conda info --envs`.
Terminate batch job (Y/N)? conda activate pyg
Terminate batch job (Y/N)? y
D:\Projects\Local-File-Organizer>conda activate pyg
(pyg) D:\Projects\Local-File-Organizer>python
Python 3.12.8 (tags/v3.12.8:2dc476b, Dec 3 2024, 19:30:04) [MSC v.1942 64 bit (AMD64)] on win32
Type "help", "copyright", "credits" or "license" for more information.
>>>
KeyboardInterrupt
>>> exit()
(pyg) D:\Projects\Local-File-Organizer>pip install nexaai --prefer-binary --index-url https://nexaai.github.io/nexa-sdk/whl/cpu --extra-index-url https://pypi.org/simple --no-cache-dir
Looking in indexes: https://nexaai.github.io/nexa-sdk/whl/cpu, https://pypi.org/simple
Collecting nexaai
Downloading https://github.com/NexaAI/nexa-sdk/releases/download/v0.1.1.0/nexaai-0.1.1.0-cp312-cp312-win_amd64.whl (5.7 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 5.7/5.7 MB 8.2 MB/s eta 0:00:00
Collecting cmake (from nexaai)
Downloading cmake-4.0.0-py3-none-win_amd64.whl.metadata (6.3 kB)
...
Successfully installed altair-5.5.0 audioread-3.0.1 cmake-4.0.0 coloredlogs-15.0.1 ctranslate2-4.6.0 diskcache-5.6.3 fastapi-0.115.12 faster_whisper-1.1.1 flatbuffers-25.2.10 gitdb-4.0.12 gitpython-3.1.44 humanfriendly-10.0 librosa-0.11.0 llvmlite-0.44.0 modelscope-1.25.0 mpmath-1.3.0 msgpack-1.1.0 narwhals-1.35.0 nexaai-0.1.1.0 numba-0.61.2 onnxruntime-1.21.0 pooch-1.8.2 pyarrow-19.0.1 pydeck-0.9.1 pyreadline3-3.5.4 python-multipart-0.0.20 scikit-learn-1.6.1 smmap-5.0.2 soundfile-0.13.1 soxr-0.5.0.post1 starlette-0.46.2 streamlit-1.44.1 streamlit-audiorec-0.1.3 sympy-1.13.3 tabulate-0.9.0 threadpoolctl-3.6.0 toml-0.10.2 uvicorn-0.34.1
[notice] A new release of pip is available: 24.3.1 -> 25.0.1
[notice] To update, run: python.exe -m pip install --upgrade pip
(pyg) D:\Projects\Local-File-Organizer>pip install -r requirements.txt
Requirement already satisfied: cmake in c:\python312\lib\site-packages (from -r requirements.txt (line 1)) (4.0.0)
Requirement already satisfied: pytesseract in c:\python312\lib\site-packages (from -r requirements.txt (line 2)) (0.3.13)
...
[notice] A new release of pip is available: 24.3.1 -> 25.0.1
[notice] To update, run: python.exe -m pip install --upgrade pip
(pyg) D:\Projects\Local-File-Organizer>python main.py
--------------------------------------------------
**NOTE: Silent mode logs all outputs to a text file instead of displaying them in the terminal.
Would you like to enable silent mode? (yes/no): yes
Enter the path of the directory you want to organize: C:\Users\Admin\Downloads\Telegram Desktop
Enter the path to store organized files and folders (press Enter to use 'organized_folder' in the input directory):
Please choose the mode to organize your files:
1. By Content
2. By Date
3. By Type
Enter 1, 2, or 3 (or type '/exit' to exit): 1
model-q4_0.gguf: 100%|████████████████████████████████████████████████████████████| 3.56G/3.56G [08:14<00:00, 7.73MB/s]
Verifying download: 100%|██████████████████████████████████████████████████████████| 3.56G/3.56G [00:14<00:00, 265MB/s]
projector-q4_0.gguf: 100%|██████████████████████████████████████████████████████████| 596M/596M [01:18<00:00, 7.99MB/s]
Verifying download: 100%|████████████████████████████████████████████████████████████| 596M/596M [00:02<00:00, 281MB/s]
Traceback (most recent call last):
  File "D:\Projects\Local-File-Organizer\main.py", line 337, in <module>
    main()
  File "D:\Projects\Local-File-Organizer\main.py", line 222, in main
    initialize_models()
  File "D:\Projects\Local-File-Organizer\main.py", line 51, in initialize_models
    image_inference = NexaVLMInference(
                      ^^^^^^^^^^^^^^^^^
  File "C:\Python312\Lib\site-packages\nexa\gguf\nexa_inference_vlm.py", line 155, in __init__
    self._load_model()
  File "C:\Python312\Lib\site-packages\nexa\utils.py", line 312, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "C:\Python312\Lib\site-packages\nexa\gguf\nexa_inference_vlm.py", line 168, in _load_model
    self.projector_handler(
  File "C:\Python312\Lib\site-packages\nexa\gguf\llama\llama_chat_format.py", line 2693, in __init__
    import llama_cpp.llava_cpp as llava_cpp
ModuleNotFoundError: No module named 'llama_cpp'
I followed the exact steps from the README, but it failed with the traceback above. After running pip install llama-cpp-python and retrying python main.py, I get a different error instead:
--------------------------------------------------
**NOTE: Silent mode logs all outputs to a text file instead of displaying them in the terminal.
Would you like to enable silent mode? (yes/no): yes
Enter the path of the directory you want to organize: D:\tg\
Enter the path to store organized files and folders (press Enter to use 'organized_folder' in the input directory):
Please choose the mode to organize your files:
1. By Content
2. By Date
3. By Type
Enter 1, 2, or 3 (or type '/exit' to exit): 1
⠸ 2025-04-16 21:22:22,747 - ERROR - Failed to load model: Failed to load model from file: C:\Users\Admin\.cache\nexa\hub\official\llava-v1.6-vicuna-7b\model-q4_0.gguf. Falling back to CPU.
Traceback (most recent call last):
  File "C:\Python312\Lib\site-packages\nexa\gguf\nexa_inference_vlm.py", line 181, in _load_model
    self.model = Llama(
                 ^^^^^^
  File "C:\Python312\Lib\site-packages\nexa\gguf\llama\llama.py", line 372, in __init__
    internals.LlamaModel(
  File "C:\Python312\Lib\site-packages\nexa\gguf\llama\_internals_transformers.py", line 56, in __init__
    raise ValueError(f"Failed to load model from file: {path_model}")
ValueError: Failed to load model from file: C:\Users\Admin\.cache\nexa\hub\official\llava-v1.6-vicuna-7b\model-q4_0.gguf
Traceback (most recent call last):
  File "C:\Python312\Lib\site-packages\nexa\gguf\nexa_inference_vlm.py", line 181, in _load_model
    self.model = Llama(
                 ^^^^^^
  File "C:\Python312\Lib\site-packages\nexa\gguf\llama\llama.py", line 372, in __init__
    internals.LlamaModel(
  File "C:\Python312\Lib\site-packages\nexa\gguf\llama\_internals_transformers.py", line 56, in __init__
    raise ValueError(f"Failed to load model from file: {path_model}")
ValueError: Failed to load model from file: C:\Users\Admin\.cache\nexa\hub\official\llava-v1.6-vicuna-7b\model-q4_0.gguf

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "D:\Projects\Local-File-Organizer\main.py", line 337, in <module>
    main()
  File "D:\Projects\Local-File-Organizer\main.py", line 222, in main
    initialize_models()
  File "D:\Projects\Local-File-Organizer\main.py", line 51, in initialize_models
    image_inference = NexaVLMInference(
                      ^^^^^^^^^^^^^^^^^
  File "C:\Python312\Lib\site-packages\nexa\gguf\nexa_inference_vlm.py", line 155, in __init__
    self._load_model()
  File "C:\Python312\Lib\site-packages\nexa\utils.py", line 312, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "C:\Python312\Lib\site-packages\nexa\gguf\nexa_inference_vlm.py", line 194, in _load_model
    self.model = Llama(
                 ^^^^^^
  File "C:\Python312\Lib\site-packages\nexa\gguf\llama\llama.py", line 372, in __init__
    internals.LlamaModel(
  File "C:\Python312\Lib\site-packages\nexa\gguf\llama\_internals_transformers.py", line 56, in __init__
    raise ValueError(f"Failed to load model from file: {path_model}")
ValueError: Failed to load model from file: C:\Users\Admin\.cache\nexa\hub\official\llava-v1.6-vicuna-7b\model-q4_0.gguf