Version: devel

dlt.destinations.impl.destination.factory

destination Objects

class destination(Destination[CustomDestinationClientConfiguration,
"DestinationClient"])


spec

@property
def spec() -> Type[CustomDestinationClientConfiguration]


The destination configuration spec, resolved from the sink function's signature.

__init__

def __init__(destination_callable: Union[TDestinationCallable, str] = None,
destination_name: str = None,
environment: str = None,
loader_file_format: TLoaderFileFormat = None,
batch_size: int = 10,
naming_convention: str = "direct",
spec: Type[CustomDestinationClientConfiguration] = None,
**kwargs: Any) -> None


Configure the destination to use in a pipeline.

The @dlt.destination decorator wraps this constructor and is the preferred way to create a destination.

All arguments provided here supersede other configuration sources, such as environment variables and dlt config files.

Arguments:

  • destination_callable Union[TDestinationCallable, str], optional - The destination callable to use.
  • destination_name str, optional - The name of the destination; can be used in a config section to differentiate between multiple destinations of the same type.
  • environment str, optional - The environment of the destination.
  • loader_file_format TLoaderFileFormat, optional - The file format the loader uses.
  • batch_size int, optional - The batch size the loader uses, i.e. how many items are passed to the destination callable per call.
  • naming_convention str, optional - The naming convention the loader uses.
  • spec Type[CustomDestinationClientConfiguration], optional - The configuration spec to use for the destination.
  • **kwargs Any - Additional arguments passed to the destination configuration.

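To make the role of batch_size concrete, here is a minimal pure-Python sketch (not dlt's actual implementation) of how a loader could partition incoming items into batches of at most batch_size before invoking a sink callable; the run_sink helper and its names are hypothetical:

```python
from typing import Any, Callable, Iterable, List

def run_sink(items: Iterable[Any],
             sink: Callable[[List[Any]], None],
             batch_size: int = 10) -> None:
    """Invoke `sink` once per batch of at most `batch_size` items,
    mimicking how a destination callable receives items in batches."""
    batch: List[Any] = []
    for item in items:
        batch.append(item)
        if len(batch) == batch_size:
            sink(batch)
            batch = []  # start a fresh batch; the sent list is not reused
    if batch:
        sink(batch)  # flush the final, possibly smaller batch

received: List[List[Any]] = []
run_sink(range(25), received.append, batch_size=10)
# received now holds three batches, of sizes 10, 10 and 5
```

A smaller batch_size trades throughput for lower memory use and more frequent sink calls; the last batch may be smaller than batch_size.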