Setting Up Your Project

Installation

Like most Python packages, Dandy can be installed with pip.

pip install dandy

Info

Installing Dandy also pulls in the blessed, requests, pydantic and python-dotenv packages.

Creating a Settings File

You can create a dandy_settings.py file in the root of your project with the following contents.

dandy_settings.py
import os
from pathlib import Path

ALLOW_RECORDING_TO_FILE = True

BASE_PATH = Path(__file__).resolve().parent

LLM_CONFIGS = {
    'DEFAULT': {
        'HOST': os.getenv('OPENAI_HOST', 'https://api.openai.com'),
        'PORT': int(os.getenv('OPENAI_PORT', 443)),
        'API_KEY': os.getenv('OPENAI_API_KEY'),
        'MODEL': 'gpt-4o-mini',
    },
    'GPT_4o': {
        'MODEL': 'gpt-4o',
    },
}

This configuration sets up OpenAI as the LLM service with multiple model options.

The DEFAULT entry in LLM_CONFIGS is used whenever no other config is specified for an LLM action.

Tip

Once the DEFAULT config is defined, its HOST, PORT and API_KEY values flow into the other configs wherever they are not specified.
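The inheritance described above can be sketched with plain dictionaries. This is not Dandy's actual implementation, just a minimal illustration of how DEFAULT values could fill in the gaps of a named config:

```python
# Hypothetical sketch of DEFAULT-config inheritance (not Dandy's real code).
LLM_CONFIGS = {
    'DEFAULT': {
        'HOST': 'https://api.openai.com',
        'PORT': 443,
        'API_KEY': 'sk-example',
        'MODEL': 'gpt-4o-mini',
    },
    'GPT_4o': {
        'MODEL': 'gpt-4o',
    },
}

def resolve_config(name: str) -> dict:
    # Start from DEFAULT, then overlay any keys the named config sets itself.
    resolved = dict(LLM_CONFIGS['DEFAULT'])
    resolved.update(LLM_CONFIGS.get(name, {}))
    return resolved

config = resolve_config('GPT_4o')
# HOST, PORT and API_KEY are inherited from DEFAULT; MODEL is overridden.
```

Here `GPT_4o` only declares MODEL, so every other value comes from DEFAULT.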

Environment Variables

The DANDY_SETTINGS_MODULE environment variable can be used to specify the settings module to be used.

export DANDY_SETTINGS_MODULE=dandy_settings

Note

If the DANDY_SETTINGS_MODULE environment variable is not set, Dandy defaults to looking for a dandy_settings.py file in the current working directory or on sys.path.
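The resolution order in the note above can be sketched as follows. This is an illustrative stand-in, not Dandy's actual lookup code, and the function name is hypothetical:

```python
import os
import sys
from pathlib import Path

def find_settings_module_name() -> str:
    # Hypothetical sketch of the settings lookup described above.
    # 1. An explicit DANDY_SETTINGS_MODULE environment variable wins.
    name = os.getenv('DANDY_SETTINGS_MODULE')
    if name:
        return name
    # 2. Otherwise, look for dandy_settings.py in the current working
    #    directory or anywhere on sys.path.
    for directory in [str(Path.cwd()), *sys.path]:
        if (Path(directory) / 'dandy_settings.py').is_file():
            return 'dandy_settings'
    raise ModuleNotFoundError('No dandy_settings module found')
```

With the environment variable set as shown earlier, the first branch returns immediately without touching the filesystem.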

More Settings

There are more settings you can configure in your project; the packaged defaults are shown below.

dandy/default_settings.py
import os
from pathlib import Path

AGENT_DEFAULT_PLAN_TIME_LIMIT_SECONDS: int | None = 600
AGENT_DEFAULT_PLAN_TASK_COUNT_LIMIT: int | None = 100

ALLOW_RECORDING_TO_FILE: bool = False

BASE_PATH: Path | str = Path.cwd()

CACHE_MEMORY_LIMIT: int = 1000
CACHE_SQLITE_DATABASE_PATH: Path | str = BASE_PATH
CACHE_SQLITE_LIMIT: int = 10000

DANDY_DIRECTORY = '.dandy'

DEBUG: bool = False

FUTURES_MAX_WORKERS: int = 10

HTTP_CONNECTION_RETRY_COUNT: int = 4
HTTP_CONNECTION_TIMEOUT_SECONDS: int | None = 60

LLM_CONFIGS = {
    'DEFAULT': {
        'HOST': os.getenv("AI_API_HOST"),
        'PORT': int(os.getenv("AI_API_PORT", 443)),
        'API_KEY': os.getenv("AI_API_KEY"),
        'MODEL': os.getenv("AI_API_MODEL"),
        'OPTIONS': {
            'frequency_penalty': None,
            'max_completion_tokens': None,
            'presence_penalty': None,
            'temperature': None,
            'top_p': None,
        }
    },
}
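Any value you set in your project's dandy_settings.py shadows the corresponding default above. A minimal sketch of that shadowing, using a hypothetical helper rather than Dandy's actual loader:

```python
from types import SimpleNamespace

def load_setting(name, project_settings, default_settings):
    # A project-level value shadows the packaged default; missing
    # names fall through to default_settings.
    return getattr(project_settings, name, getattr(default_settings, name))

# Stand-ins for the two settings modules.
defaults = SimpleNamespace(DEBUG=False, FUTURES_MAX_WORKERS=10)
project = SimpleNamespace(DEBUG=True)

load_setting('DEBUG', project, defaults)                # overridden by the project
load_setting('FUTURES_MAX_WORKERS', project, defaults)  # falls back to the default
```

In practice you only declare the settings you want to change; everything else keeps the default shown above.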