2024-04-14 15:00:23 Checking for script in /app/prestart.sh
2024-04-14 15:00:23 There is no script /app/prestart.sh
2024-04-14 15:00:23 INFO: Will watch for changes in these directories: ['/app']
2024-04-14 15:00:23 WARNING: "workers" flag is ignored when reloading is enabled.
2024-04-14 15:00:23 INFO: Uvicorn running on http://0.0.0.0:4891 (Press CTRL+C to quit)
2024-04-14 15:00:23 INFO: Started reloader process [1] using WatchFiles
2024-04-14 15:00:24 Process SpawnProcess-1:
2024-04-14 15:00:24 Traceback (most recent call last):
2024-04-14 15:00:24 File "/usr/local/lib/python3.11/multiprocessing/process.py", line 314, in _bootstrap
2024-04-14 15:00:24 self.run()
2024-04-14 15:00:24 File "/usr/local/lib/python3.11/multiprocessing/process.py", line 108, in run
2024-04-14 15:00:24 self._target(*self._args, **self._kwargs)
2024-04-14 15:00:24 File "/usr/local/lib/python3.11/site-packages/uvicorn/_subprocess.py", line 76, in subprocess_started
2024-04-14 15:00:24 target(sockets=sockets)
2024-04-14 15:00:24 File "/usr/local/lib/python3.11/site-packages/uvicorn/server.py", line 59, in run
2024-04-14 15:00:24 return asyncio.run(self.serve(sockets=sockets))
2024-04-14 15:00:24 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-04-14 15:00:24 File "/usr/local/lib/python3.11/asyncio/runners.py", line 190, in run
2024-04-14 15:00:24 return runner.run(main)
2024-04-14 15:00:24 ^^^^^^^^^^^^^^^^
2024-04-14 15:00:24 File "/usr/local/lib/python3.11/asyncio/runners.py", line 118, in run
2024-04-14 15:00:24 return self._loop.run_until_complete(task)
2024-04-14 15:00:24 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-04-14 15:00:24 File "uvloop/loop.pyx", line 1517, in uvloop.loop.Loop.run_until_complete
2024-04-14 15:00:24 File "/usr/local/lib/python3.11/site-packages/uvicorn/server.py", line 66, in serve
2024-04-14 15:00:24 config.load()
2024-04-14 15:00:24 File "/usr/local/lib/python3.11/site-packages/uvicorn/config.py", line 471, in load
2024-04-14 15:00:24 self.loaded_app = import_from_string(self.app)
2024-04-14 15:00:24 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-04-14 15:00:24 File "/usr/local/lib/python3.11/site-packages/uvicorn/importer.py", line 21, in import_from_string
2024-04-14 15:00:24 module = importlib.import_module(module_str)
2024-04-14 15:00:24 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-04-14 15:00:24 File "/usr/local/lib/python3.11/importlib/init.py", line 126, in import_module
2024-04-14 15:00:24 return _bootstrap._gcd_import(name[level:], package, level)
2024-04-14 15:00:24 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-04-14 15:00:24 File "", line 1204, in _gcd_import
2024-04-14 15:00:24 File "", line 1176, in _find_and_load
2024-04-14 15:00:24 File "", line 1147, in _find_and_load_unlocked
2024-04-14 15:00:24 File "", line 690, in _load_unlocked
2024-04-14 15:00:24 File "", line 940, in exec_module
2024-04-14 15:00:24 File "", line 241, in _call_with_frames_removed
2024-04-14 15:00:24 File "/app/main.py", line 6, in
2024-04-14 15:00:24 from api_v1.api import router as v1_router
2024-04-14 15:00:24 File "/app/api_v1/api.py", line 1, in
2024-04-14 15:00:24 from api_v1.routes import chat, completions, engines, health
2024-04-14 15:00:24 File "/app/api_v1/routes/chat.py", line 6, in
2024-04-14 15:00:24 from gpt4all import GPT4All
2024-04-14 15:00:24 File "/usr/local/lib/python3.11/site-packages/gpt4all/init.py", line 1, in
2024-04-14 15:00:24 from . import gpt4all # noqa
2024-04-14 15:00:24 ^^^^^^^^^^^^^^^^^^^^^
2024-04-14 15:00:24 File "/usr/local/lib/python3.11/site-packages/gpt4all/gpt4all.py", line 6, in
2024-04-14 15:00:24 from . import pyllmodel
2024-04-14 15:00:24 File "/usr/local/lib/python3.11/site-packages/gpt4all/pyllmodel.py", line 39, in
2024-04-14 15:00:24 llmodel, llama = load_llmodel_library()
2024-04-14 15:00:24 ^^^^^^^^^^^^^^^^^^^^^^
2024-04-14 15:00:24 File "/usr/local/lib/python3.11/site-packages/gpt4all/pyllmodel.py", line 32, in load_llmodel_library
2024-04-14 15:00:24 llama_lib = ctypes.CDLL(llama_dir, mode=ctypes.RTLD_GLOBAL)
2024-04-14 15:00:24 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-04-14 15:00:24 File "/usr/local/lib/python3.11/ctypes/init.py", line 376, in init
2024-04-14 15:00:24 self._handle = _dlopen(self._name, mode)
2024-04-14 15:00:24 ^^^^^^^^^^^^^^^^^^^^^^^^^
2024-04-14 15:00:24 OSError: /usr/local/lib/python3.11/site-packages/gpt4all/llmodel_DO_NOT_MODIFY/build/libllama.so: cannot open shared object file: No such file or directory
I am getting the above error when I run "docker compose up", after first building the image with "DOCKER_BUILDKIT=1 docker build -t gpt4all_api --progress plain -f gpt4all_api/Dockerfile.buildkit .". Any idea why libllama.so would be missing from the image?
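For reference, these are the exact commands in the order I run them (image tag and Dockerfile path exactly as in my setup); the failure only appears at the compose step:

```sh
# Build the API image with BuildKit enabled
DOCKER_BUILDKIT=1 docker build -t gpt4all_api --progress plain -f gpt4all_api/Dockerfile.buildkit .

# Bring the stack up; this is where the OSError above is printed
docker compose up
```

One check I am considering (just my own idea, not something from the gpt4all docs): listing the build directory inside the built image, e.g. "docker run --rm --entrypoint ls gpt4all_api /usr/local/lib/python3.11/site-packages/gpt4all/llmodel_DO_NOT_MODIFY/build/", to confirm whether libllama.so was ever copied into the image at all.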