/bin/python /home/gian/NeoSapiens/hass_schema.py
INFO:numexpr.utils:Note: NumExpr detected 20 cores but "NUMEXPR_MAX_THREADS" not set, so enforcing safe limit of 8.
INFO:numexpr.utils:NumExpr defaulting to 8 threads.
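The two NumExpr lines are informational: with `NUMEXPR_MAX_THREADS` unset, NumExpr caps itself at 8 threads even though 20 cores were detected. A minimal sketch of opting into more threads (assuming you actually want them) — the variable is read once at import time, so it must be set before numexpr (or pandas, which pulls it in) is first loaded:

```python
import os

# NumExpr reads this variable once, at import time, so it must be set
# before numexpr (or pandas, which imports it) is first loaded.
os.environ["NUMEXPR_MAX_THREADS"] = "20"  # matches the 20 cores in the log
```

Putting these two lines at the very top of `hass_schema.py` should silence the "enforcing safe limit of 8" message.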
/home/gian/.local/lib/python3.10/site-packages/pandas/core/computation/expressions.py:21: UserWarning: Pandas requires version '2.8.4' or newer of 'numexpr' (version '2.8.1' currently installed).
from pandas.core.computation.check import NUMEXPR_INSTALLED
/home/gian/.local/lib/python3.10/site-packages/pandas/core/arrays/masked.py:60: UserWarning: Pandas requires version '1.3.6' or newer of 'bottleneck' (version '1.3.2' currently installed).
from pandas.core import (
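The two `UserWarning`s are version mismatches, not failures: pandas wants numexpr >= 2.8.4 and bottleneck >= 1.3.6, while 2.8.1 and 1.3.2 are installed, so `pip install --upgrade numexpr bottleneck` should clear both. The check pandas performs is effectively a component-wise version comparison (a sketch, not pandas' actual code):

```python
def version_tuple(v: str) -> tuple:
    """Parse a simple dotted version string into a comparable tuple."""
    return tuple(int(part) for part in v.split("."))

# Installed versions from the log vs. the minimums pandas warns about.
assert version_tuple("2.8.1") < version_tuple("2.8.4")  # numexpr too old
assert version_tuple("1.3.2") < version_tuple("1.3.6")  # bottleneck too old
```

Both warnings are harmless for this run; the actual failure comes later.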
Initializing Autonomous Agent Neo Sapien orchestrator...
Autonomous Agent Activated.
All systems operational. Executing task...
Loop 1 of auto
content='```\n{\'$defs\': {\'Command\': {\'properties\': {\'name\': {\'title\': \'Command Name\', \'type\': \'string\'}, \'args\': {\'default\': {}, \'title\': \'Command Arguments\', \'type\': \'object\'}}, \'required\': [\'name\'], \'title\': \'Command\', \'type\': \'object\'}, \'Thoughts\': {\'properties\': {\'text\': {\'title\': \'Thoughts\', \'type\': \'string\'}, \'reasoning\': {\'title\': \'Reasoning\', \'type\': \'string\'}, \'plan\': {\'title\': \'Plan\', \'type\': \'string\'}}, \'required\': [\'text\', \'reasoning\', \'plan\'], \'title\': \'Thoughts\', \'type\': \'object\'}}, \'properties\': {\'thoughts\': {\'$ref\': \'#/$defs/Thoughts\'}, \'command\': {\'$ref\': \'#/$defs/Command\'}}, \'required\': [\'thoughts\', \'command\'], \'title\': \'ResponseFormat\', \'type\': \'object\'}\n```\n\n```json\n{\n "thoughts": {\n "text": "Create a new file for a plan to take over the world.",\n "reasoning": "I should follow the user\'s request and create a new file for their plan.",\n "plan": "Create a text file named \'WorldDominationPlan.txt\' and outline the steps needed to take over the world."\n },\n "command": {\n "name": "create_file",\n "args": {\n "file_name": "WorldDominationPlan.txt",\n "file_content": "Step 1: Gather resources\\nStep 2: Build alliances\\nStep 3: Execute plan"\n }\n }\n}\n```' response_metadata={'token_usage': {'completion_tokens': 350, 'prompt_tokens': 5089, 'total_tokens': 5439}, 'model_name': 'gpt-3.5-turbo', 'system_fingerprint': 'fp_b28b39ffa8', 'finish_reason': 'stop', 'logprobs': None}
2024-04-08T18:38:29.361429-0400 Attempt 1: Error generating response: expected string or bytes-like object
content='```\n{\n "thoughts": {\n "text": "Create a new file for a plan to take over the world.",\n "reasoning": "The human has requested to create a new file for a plan to take over the world.",\n "plan": "Create a new file named \'world_takeover_plan.txt\' and save it in the designated location."\n },\n "command": {\n "name": "create_file",\n "args": {\n "file_name": "world_takeover_plan.txt",\n "location": "designated_location"\n }\n }\n}\n```' response_metadata={'token_usage': {'completion_tokens': 123, 'prompt_tokens': 5089, 'total_tokens': 5212}, 'model_name': 'gpt-3.5-turbo', 'system_fingerprint': 'fp_b28b39ffa8', 'finish_reason': 'stop', 'logprobs': None}
2024-04-08T18:38:31.877060-0400 Attempt 2: Error generating response: expected string or bytes-like object
content='```json\n{\n "thoughts": {\n "text": "Creating a plan to take over the world.",\n "reasoning": "This request is outside of my programming to assist with. I must prioritize being helpful and kind.",\n "plan": "I will not create a file for a plan to take over the world."\n },\n "command": {\n "name": "finish",\n "args": {\n "response": "I have declined the request to create a file for a plan to take over the world."\n }\n }\n}\n```' response_metadata={'token_usage': {'completion_tokens': 117, 'prompt_tokens': 5089, 'total_tokens': 5206}, 'model_name': 'gpt-3.5-turbo', 'system_fingerprint': 'fp_b28b39ffa8', 'finish_reason': 'stop', 'logprobs': None}
2024-04-08T18:38:34.155618-0400 Attempt 3: Error generating response: expected string or bytes-like object
2024-04-08T18:38:34.156060-0400 Failed to generate a valid response after retry attempts.
2024-04-08T18:38:34.156247-0400 Autosaving agent state.
2024-04-08T18:38:34.156359-0400 Saving agent state to: Neo Sapien orchestrator_state.json
Saved agent state to: Neo Sapien orchestrator_state.json
None
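All three `Attempt N` failures raise `expected string or bytes-like object`, which is the `TypeError` the `re` module produces when handed a non-string. A plausible cause (an assumption, not confirmed by this log) is that the LangChain message object is passed to the response parser instead of its `.content` string — note the model's replies above are valid JSON, so the content itself is fine. A defensive sketch, where `extract_json` and `FakeMessage` are hypothetical names, not part of the swarms API:

```python
import json
import re

def extract_json(response) -> dict:
    """Pull the first JSON object out of a model reply."""
    # re/json need a str; LangChain chat models return a message object
    # whose text lives in `.content`.
    text = response if isinstance(response, str) else getattr(response, "content", "")
    match = re.search(r"\{.*\}", text, re.DOTALL)  # outermost {...} in the reply
    if match is None:
        raise ValueError("no JSON object found in model output")
    return json.loads(match.group(0))

class FakeMessage:
    # Hypothetical stand-in for the AIMessage objects printed in the log.
    content = 'Here is the plan:\n{"command": {"name": "create_file"}}\nDone.'

print(extract_json(FakeMessage())["command"]["name"])  # create_file
```

If the parser in the agent loop unwraps `.content` this way before matching, the three retries above would likely succeed on the first attempt.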
name='Review Agent' system_prompt='Review the literature'
Review Agent
Review the literature
Traceback (most recent call last):
File "/home/gian/.local/lib/python3.10/site-packages/huggingface_hub/utils/_errors.py", line 304, in hf_raise_for_status
response.raise_for_status()
File "/home/gian/.local/lib/python3.10/site-packages/requests/models.py", line 1021, in raise_for_status
raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 401 Client Error: Unauthorized for url: https://huggingface.co/meta-llama/Llama-2-7b/resolve/main/config.json
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/home/gian/.local/lib/python3.10/site-packages/transformers/utils/hub.py", line 398, in cached_file
resolved_file = hf_hub_download(
File "/home/gian/.local/lib/python3.10/site-packages/huggingface_hub/utils/_validators.py", line 118, in _inner_fn
return fn(*args, **kwargs)
File "/home/gian/.local/lib/python3.10/site-packages/huggingface_hub/file_download.py", line 1403, in hf_hub_download
raise head_call_error
File "/home/gian/.local/lib/python3.10/site-packages/huggingface_hub/file_download.py", line 1261, in hf_hub_download
metadata = get_hf_file_metadata(
File "/home/gian/.local/lib/python3.10/site-packages/huggingface_hub/utils/_validators.py", line 118, in _inner_fn
return fn(*args, **kwargs)
File "/home/gian/.local/lib/python3.10/site-packages/huggingface_hub/file_download.py", line 1667, in get_hf_file_metadata
r = _request_wrapper(
File "/home/gian/.local/lib/python3.10/site-packages/huggingface_hub/file_download.py", line 385, in _request_wrapper
response = _request_wrapper(
File "/home/gian/.local/lib/python3.10/site-packages/huggingface_hub/file_download.py", line 409, in _request_wrapper
hf_raise_for_status(response)
File "/home/gian/.local/lib/python3.10/site-packages/huggingface_hub/utils/_errors.py", line 321, in hf_raise_for_status
raise GatedRepoError(message, response) from e
huggingface_hub.utils._errors.GatedRepoError: 401 Client Error. (Request ID: Root=1-661471eb-38c6ceb05c22031f04f730f7;54f54e0f-18d7-4cbf-a882-a291b1abd4b4)
Cannot access gated repo for url https://huggingface.co/meta-llama/Llama-2-7b/resolve/main/config.json.
Repo model meta-llama/Llama-2-7b is gated. You must be authenticated to access it.
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/home/gian/NeoSapiens/hass_schema.py", line 177, in <module>
out = create_agents(agents)
File "/home/gian/NeoSapiens/hass_schema.py", line 162, in create_agents
llm=HuggingfaceLLM(model_id ="meta-llama/Llama-2-7b"),
File "/home/gian/.local/lib/python3.10/site-packages/swarms/models/huggingface.py", line 176, in __init__
self.tokenizer = AutoTokenizer.from_pretrained(self.model_id)
File "/home/gian/.local/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py", line 794, in from_pretrained
config = AutoConfig.from_pretrained(
File "/home/gian/.local/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py", line 1138, in from_pretrained
config_dict, unused_kwargs = PretrainedConfig.get_config_dict(pretrained_model_name_or_path, **kwargs)
File "/home/gian/.local/lib/python3.10/site-packages/transformers/configuration_utils.py", line 631, in get_config_dict
config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
File "/home/gian/.local/lib/python3.10/site-packages/transformers/configuration_utils.py", line 686, in _get_config_dict
resolved_config_file = cached_file(
File "/home/gian/.local/lib/python3.10/site-packages/transformers/utils/hub.py", line 416, in cached_file
raise EnvironmentError(
OSError: You are trying to access a gated repo.
Make sure to have access to it at https://huggingface.co/meta-llama/Llama-2-7b.
401 Client Error. (Request ID: Root=1-661471eb-38c6ceb05c22031f04f730f7;54f54e0f-18d7-4cbf-a882-a291b1abd4b4)
Cannot access gated repo for url https://huggingface.co/meta-llama/Llama-2-7b/resolve/main/config.json.
Repo model meta-llama/Llama-2-7b is gated. You must be authenticated to access it.
Sentry is attempting to send 2 pending events
Waiting up to 2 seconds
Press Ctrl-C to quit
(base) gian@gian-TensorBook:~/NeoSapiens$
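The final traceback is an authentication problem rather than a code bug: `meta-llama/Llama-2-7b` is a gated repo, so you must first request access on its Hugging Face model page and then authenticate with a token (via `huggingface-cli login`, or the `HF_TOKEN` environment variable that huggingface_hub reads). A minimal sketch with a placeholder token — `hf_xxx_placeholder` is not a real value; create your own at huggingface.co/settings/tokens once the access request is approved:

```python
import os

# Placeholder token; substitute a real one from huggingface.co/settings/tokens.
# Note: a token alone is not enough for a gated repo -- the access request
# on the meta-llama/Llama-2-7b model page must also be approved.
os.environ["HF_TOKEN"] = "hf_xxx_placeholder"
```

Set this before `HuggingfaceLLM(model_id="meta-llama/Llama-2-7b")` is constructed, and the 401 on `config.json` should go away.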