Configuration

ell provides various configuration options to customize its behavior.

ell.init(store: None | str = None, verbose: bool = False, autocommit: bool = True, lazy_versioning: bool = True, default_api_params: Dict[str, Any] | None = None, default_client: Any | None = None, autocommit_model: str = 'gpt-4o-mini') → None

Initialize the ELL configuration with various settings.

Parameters:
  • verbose (bool) – Set verbosity of ELL operations.

  • store (Union[Store, str], optional) – Set the store for ELL. Can be a Store instance or a string path for SQLiteStore.

  • autocommit (bool) – Set autocommit for the store operations.

  • lazy_versioning (bool) – Enable or disable lazy versioning.

  • default_api_params (Dict[str, Any], optional) – Set default parameters for language models.

  • default_client (openai.Client | Any, optional) – Set the default client (typically an OpenAI client) used when a model has no client of its own.

  • autocommit_model (str) – Set the model used for autocommitting.

init is a convenience function that sets up the configuration for ell; it is a thin wrapper around the Config class, which is a Pydantic model.
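
For example, a typical setup might look like this (the store path and parameter values below are placeholders, not required values):

    import ell

    # Persist versions and invocations to a local SQLite store, enable verbose
    # logging, and set default sampling parameters for language model calls.
    ell.init(
        store="./logdir",                     # placeholder path for an SQLiteStore
        verbose=True,
        autocommit=True,
        default_api_params={"temperature": 0.7},
    )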

You can modify the global configuration using the ell.config object, which is an instance of Config:
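
For example (a minimal sketch; the settings shown are arbitrary):

    import ell

    # Adjust individual settings after (or instead of) calling ell.init().
    ell.config.verbose = True
    ell.config.wrapped_logging = False
    ell.config.override_wrapped_logging_width = 120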

pydantic model ell.Config

Configuration class for ELL.

Fields:
  • autocommit (bool)

  • autocommit_model (str)

  • default_api_params (Dict[str, Any])

  • default_client (openai.OpenAI | None)

  • lazy_versioning (bool)

  • override_wrapped_logging_width (int | None)

  • providers (Dict[Type, ell.provider.Provider])

  • registry (Dict[str, ell.configurator._Model])

  • store (Store | None)

  • verbose (bool)

  • wrapped_logging (bool)

field autocommit: bool = False

If True, enables automatic committing of changes to the store.

field autocommit_model: str = 'gpt-4o-mini'

The model used for generating autocommit messages; defaults to GPT-4o mini.

field default_api_params: Dict[str, Any] [Optional]

Default parameters for language models.

field lazy_versioning: bool = True

If True, enables lazy versioning for improved performance.

field override_wrapped_logging_width: int | None = None

If set, overrides the default width for wrapped logging.

field verbose: bool = False

If True, enables verbose logging.

field wrapped_logging: bool = True

If True, enables wrapped logging for better readability.

get_client_for(model_name: str) → Tuple[OpenAI | None, bool]

Get the OpenAI client for a specific model name.

Parameters:

model_name (str) – The name of the model to get the client for.

Returns:

The OpenAI client for the specified model (or None if no client is registered), together with a flag indicating whether a fallback client was used.

Return type:

Tuple[Optional[openai.Client], bool]
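
Normally ell resolves clients for you; a minimal sketch of calling the lookup directly (assuming a client was registered via ell.init() or register_model()):

    import ell

    client, used_fallback = ell.config.get_client_for("gpt-4o-mini")
    if client is None:
        print("No client registered for this model and no default client set.")
    elif used_fallback:
        print("Falling back to the default client for this model.")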

get_provider_for(client: Type[Any] | Any) → Provider | None

Get the provider instance for a specific client instance.

Parameters:

client (Any) – The client instance to get the provider for.

Returns:

The provider instance for the specified client, or None if not found.

Return type:

Optional[Provider]
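
A minimal sketch, assuming the standard OpenAI client type has a registered provider and OPENAI_API_KEY is set in the environment:

    import ell
    import openai

    client = openai.OpenAI()
    provider = ell.config.get_provider_for(client)
    if provider is None:
        print("No provider registered for this client type.")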

model_registry_override(overrides: Dict[str, _Model])

Temporarily override the model registry with new model configurations.

Parameters:

overrides (Dict[str, _Model]) – A dictionary mapping model names to _Model configurations that temporarily replace the corresponding registry entries.
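
"Temporarily" suggests context-manager usage; a hedged sketch that reuses an existing registry entry (so the private _Model type is not constructed by hand, and assuming "gpt-4o-mini" is already registered):

    import ell

    # "my-alias" is a hypothetical model name used only for illustration.
    with ell.config.model_registry_override(
        {"my-alias": ell.config.registry["gpt-4o-mini"]}
    ):
        ...  # code in this block sees "my-alias" as a registered model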

register_model(name: str, default_client: OpenAI | Any | None = None, supports_streaming: bool | None = None) → None

Register a model with its configuration.
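
For example, to point a custom model name at an OpenAI-compatible endpoint (the model name, base URL, and API key below are placeholders):

    import ell
    import openai

    # Hypothetical OpenAI-compatible server hosting a local model.
    local_client = openai.OpenAI(base_url="http://localhost:8000/v1", api_key="unused")

    ell.config.register_model(
        "my-local-model",             # placeholder model name
        default_client=local_client,
        supports_streaming=True,
    )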

register_provider(provider: Provider, client_type: Type[Any]) → None

Register a provider class for a specific client type.

Parameters:

provider (Provider) – The provider instance to register.

client_type (Type[Any]) – The client class to associate with the provider.