

plateforme.core.schema.json

This module provides utilities for managing JSON schemas within the Plateforme framework using Pydantic features.

DEFAULT_REF_TEMPLATE module-attribute

DEFAULT_REF_TEMPLATE = '#/$defs/{model}'

The default format string for generating reference names in JSON schemas.
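
For illustration, the template is a plain format string over a single model placeholder; a minimal sketch, using a hypothetical model name:

# Minimal sketch: formatting the default reference template with a
# hypothetical model name ('Pet' is an illustrative placeholder).
DEFAULT_REF_TEMPLATE = '#/$defs/{model}'

print(DEFAULT_REF_TEMPLATE.format(model='Pet'))  # '#/$defs/Pet'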

JsonEncoder module-attribute

JsonEncoder = Callable[[Any], Any]

A type alias for the JSON encoder function.

JsonSchemaDict module-attribute

JsonSchemaDict = Dict[str, 'JsonSchemaValue']

A type alias for a JSON schema dictionary.

JsonSchemaExtra module-attribute

JsonSchemaExtra = Union[
    JsonSchemaDict, Callable[[JsonSchemaDict], None]
]

A type alias for the extra JSON schema data.

JsonSchemaExtraCallable module-attribute

JsonSchemaExtraCallable = Union[
    JsonSchemaExtra,
    Callable[[JsonSchemaDict, Type[Any]], None],
]

A type alias for the extra JSON schema data callable.

JsonSchemaMode module-attribute

JsonSchemaMode = Literal['validation', 'serialization']

A type alias for the mode of a JSON schema.

For some types, the inputs for validation differ from the outputs of serialization. For example, computed fields are only present when serializing and should not be provided when validating. This flag indicates whether you want the JSON schema required for validation inputs, or the one that will be matched by serialization outputs.
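
A minimal sketch of the difference, assuming plain Pydantic (the hypothetical Rectangle model is illustrative): a computed field only appears in the serialization-mode schema.

from pydantic import BaseModel, computed_field

class Rectangle(BaseModel):
    width: int
    height: int

    @computed_field
    @property
    def area(self) -> int:
        return self.width * self.height

print(Rectangle.model_json_schema(mode='validation')['properties'].keys())
# e.g. dict_keys(['width', 'height'])
print(Rectangle.model_json_schema(mode='serialization')['properties'].keys())
# e.g. dict_keys(['width', 'height', 'area'])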

JsonSchemaSource module-attribute

JsonSchemaSource = Literal['key', 'model', 'both']

A type alias for the source of a JSON schema.

It describes the source type to use when generating a resource's JSON schema. It can be either key, model, or both, where the latter accepts, when applicable, integer and string values for key identifiers in addition to the standard model schema generation.

JsonSchemaValue module-attribute

JsonSchemaValue = Union[
    int,
    float,
    str,
    bool,
    None,
    List["JsonSchemaValue"],
    JsonSchemaDict,
]

A type alias for a JSON schema value.
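
For illustration, any JSON-compatible structure satisfies these aliases; a minimal sketch with no framework imports:

# A plain dictionary that is both a valid JsonSchemaDict and a valid
# JsonSchemaValue: string keys mapping to nested JSON values.
example_schema = {
    'type': 'object',
    'properties': {'name': {'type': 'string'}},
    'required': ['name'],
}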

GenerateJsonSchema

GenerateJsonSchema(
    by_alias: bool = True,
    ref_template: str = DEFAULT_REF_TEMPLATE,
)

Bases: GenerateJsonSchema

A class for generating JSON schemas.

This class generates JSON schemas based on the configured parameters. The default schema dialect is https://json-schema.org/draft/2020-12/schema. The class uses by_alias to configure how fields with multiple names are handled, ref_template to format reference names, and source to determine the source type of the schema when applicable.

Attributes:

Name Type Description
schema_dialect

The JSON schema dialect used to generate the schema.

ignored_warning_kinds set[JsonSchemaWarningKind]

Warnings to ignore when generating the schema. A call to self.render_warning_message will do nothing if its argument kind is in ignored_warning_kinds; this value can be modified on subclasses to easily control which warnings are emitted.

by_alias

Whether to use field aliases when generating the schema, i.e. if True, fields will be serialized according to their alias, otherwise according to their attribute name. Defaults to True.

ref_template

The template format string to use when generating reference names. Defaults to DEFAULT_REF_TEMPLATE.

source

The source type of the schema. It can be either key, model, or both, where the latter accepts, when applicable, string values for key identifiers in addition to the standard model schema generation. Defaults to model.

core_to_json_refs dict[CoreModeRef, JsonRef]

A mapping of core refs to JSON refs.

core_to_defs_refs dict[CoreModeRef, DefsRef]

A mapping of core refs to definition refs.

defs_to_core_refs dict[DefsRef, CoreModeRef]

A mapping of definition refs to core refs.

json_to_defs_refs dict[JsonRef, DefsRef]

A mapping of JSON refs to definition refs.

definitions dict[DefsRef, JsonSchemaValue]

Definitions in the schema.

Raises:

Type Description
JsonSchemaError

If the instance of the class is inadvertently re-used after generating a schema.

Note

See the documentation at https://json-schema.org/understanding-json-schema/reference/schema.html for more information about schema dialects.

Initialize the JSON schema generator.

Parameters:

Name Type Description Default
by_alias bool

Whether to use field aliases when generating the schema, i.e. if True, fields will be serialized according to their alias, otherwise according to their attribute name. Defaults to True.

True
ref_template str

The template format string to use when generating reference names. Defaults to DEFAULT_REF_TEMPLATE.

DEFAULT_REF_TEMPLATE
Source code in .venv/lib/python3.12/site-packages/plateforme/core/schema/json.py
def __init__(self,
    by_alias: bool = True,
    ref_template: str = DEFAULT_REF_TEMPLATE,
):
    """Initialize the JSON schema generator.

    Args:
        by_alias: Whether to use field aliases when generating the schema,
            i.e. if ``True``, fields will be serialized according to their
            alias, otherwise according to their attribute name.
            Defaults to ``True``.
        ref_template: The template format string to use when generating
            reference names. Defaults to ``DEFAULT_REF_TEMPLATE``.
    """
    super().__init__(by_alias=by_alias, ref_template=ref_template)
    self._source: JsonSchemaSource = 'model'
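
A minimal usage sketch, assuming the class can be passed wherever Pydantic accepts a schema_generator class (which its base class supports); the User model is illustrative:

from pydantic import BaseModel

from plateforme.core.schema.json import DEFAULT_REF_TEMPLATE, GenerateJsonSchema

class User(BaseModel):
    id: int
    name: str

# Pydantic instantiates the generator with by_alias and ref_template,
# matching the constructor documented above.
schema = User.model_json_schema(
    by_alias=True,
    ref_template=DEFAULT_REF_TEMPLATE,
    schema_generator=GenerateJsonSchema,
)
print(schema['title'])  # 'User'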

ValidationsMapping

This class just contains mappings from core_schema attribute names to the corresponding JSON schema attribute names. While I suspect it is unlikely to be necessary, you can in principle override this class in a subclass of GenerateJsonSchema (by inheriting from GenerateJsonSchema.ValidationsMapping) to change these mappings.

build_schema_type_to_method

build_schema_type_to_method() -> dict[
    CoreSchemaOrFieldType,
    Callable[[CoreSchemaOrField], JsonSchemaValue],
]

Builds a dictionary mapping fields to methods for generating JSON schemas.

Returns:

Type Description
dict[CoreSchemaOrFieldType, Callable[[CoreSchemaOrField], JsonSchemaValue]]

A dictionary containing the mapping of CoreSchemaOrFieldType to a handler method.

Raises:

Type Description
TypeError

If no method has been defined for generating a JSON schema for a given pydantic core schema type.

Source code in .venv/lib/python3.12/site-packages/pydantic/json_schema.py
def build_schema_type_to_method(
    self,
) -> dict[CoreSchemaOrFieldType, Callable[[CoreSchemaOrField], JsonSchemaValue]]:
    """Builds a dictionary mapping fields to methods for generating JSON schemas.

    Returns:
        A dictionary containing the mapping of `CoreSchemaOrFieldType` to a handler method.

    Raises:
        TypeError: If no method has been defined for generating a JSON schema for a given pydantic core schema type.
    """
    mapping: dict[CoreSchemaOrFieldType, Callable[[CoreSchemaOrField], JsonSchemaValue]] = {}
    core_schema_types: list[CoreSchemaOrFieldType] = _typing_extra.all_literal_values(
        CoreSchemaOrFieldType  # type: ignore
    )
    for key in core_schema_types:
        method_name = f"{key.replace('-', '_')}_schema"
        try:
            mapping[key] = getattr(self, method_name)
        except AttributeError as e:  # pragma: no cover
            raise TypeError(
                f'No method for generating JsonSchema for core_schema.type={key!r} '
                f'(expected: {type(self).__name__}.{method_name})'
            ) from e
    return mapping
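
Because handlers are looked up via the '<core schema type>_schema' naming convention, overriding a method of that name on a subclass changes the output for that core schema type. A minimal sketch using the base Pydantic generator (the VerboseNullGenerator name and the description text are illustrative):

from pydantic import BaseModel
from pydantic.json_schema import GenerateJsonSchema, JsonSchemaValue
from pydantic_core import core_schema

class VerboseNullGenerator(GenerateJsonSchema):
    # 'none' core schemas are dispatched here via the naming convention.
    def none_schema(self, schema: core_schema.NoneSchema) -> JsonSchemaValue:
        return {'type': 'null', 'description': 'explicit null'}

class Model(BaseModel):
    note: None = None

print(Model.model_json_schema(schema_generator=VerboseNullGenerator)['properties']['note'])
# e.g. {'default': None, 'description': 'explicit null', 'title': 'Note', 'type': 'null'}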

generate_definitions

generate_definitions(
    inputs: Sequence[
        tuple[JsonSchemaKeyT, JsonSchemaMode, CoreSchema]
    ],
) -> tuple[
    dict[
        tuple[JsonSchemaKeyT, JsonSchemaMode],
        JsonSchemaValue,
    ],
    dict[DefsRef, JsonSchemaValue],
]

Generates JSON schema definitions from a list of core schemas, pairing the generated definitions with a mapping that links the input keys to the definition references.

Parameters:

Name Type Description Default
inputs Sequence[tuple[JsonSchemaKeyT, JsonSchemaMode, CoreSchema]]

A sequence of tuples, where:

  • The first element is a JSON schema key type.
  • The second element is the JSON mode: either 'validation' or 'serialization'.
  • The third element is a core schema.
required

Returns:

Type Description
tuple[dict[tuple[JsonSchemaKeyT, JsonSchemaMode], JsonSchemaValue], dict[DefsRef, JsonSchemaValue]]

A tuple where:

  • The first element is a dictionary whose keys are tuples of JSON schema key type and JSON mode, and whose values are the JSON schema corresponding to that pair of inputs. (These schemas may have JsonRef references to definitions that are defined in the second returned element.)
  • The second element is a dictionary whose keys are definition references for the JSON schemas from the first returned element, and whose values are the actual JSON schema definitions.

Raises:

Type Description
PydanticUserError

Raised if the JSON schema generator has already been used to generate a JSON schema.

Source code in .venv/lib/python3.12/site-packages/pydantic/json_schema.py
def generate_definitions(
    self, inputs: Sequence[tuple[JsonSchemaKeyT, JsonSchemaMode, core_schema.CoreSchema]]
) -> tuple[dict[tuple[JsonSchemaKeyT, JsonSchemaMode], JsonSchemaValue], dict[DefsRef, JsonSchemaValue]]:
    """Generates JSON schema definitions from a list of core schemas, pairing the generated definitions with a
    mapping that links the input keys to the definition references.

    Args:
        inputs: A sequence of tuples, where:

            - The first element is a JSON schema key type.
            - The second element is the JSON mode: either 'validation' or 'serialization'.
            - The third element is a core schema.

    Returns:
        A tuple where:

            - The first element is a dictionary whose keys are tuples of JSON schema key type and JSON mode, and
                whose values are the JSON schema corresponding to that pair of inputs. (These schemas may have
                JsonRef references to definitions that are defined in the second returned element.)
            - The second element is a dictionary whose keys are definition references for the JSON schemas
                from the first returned element, and whose values are the actual JSON schema definitions.

    Raises:
        PydanticUserError: Raised if the JSON schema generator has already been used to generate a JSON schema.
    """
    if self._used:
        raise PydanticUserError(
            'This JSON schema generator has already been used to generate a JSON schema. '
            f'You must create a new instance of {type(self).__name__} to generate a new JSON schema.',
            code='json-schema-already-used',
        )

    for key, mode, schema in inputs:
        self._mode = mode
        self.generate_inner(schema)

    definitions_remapping = self._build_definitions_remapping()

    json_schemas_map: dict[tuple[JsonSchemaKeyT, JsonSchemaMode], DefsRef] = {}
    for key, mode, schema in inputs:
        self._mode = mode
        json_schema = self.generate_inner(schema)
        json_schemas_map[(key, mode)] = definitions_remapping.remap_json_schema(json_schema)

    json_schema = {'$defs': self.definitions}
    json_schema = definitions_remapping.remap_json_schema(json_schema)
    self._used = True
    return json_schemas_map, _sort_json_schema(json_schema['$defs'])  # type: ignore
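
A minimal sketch of calling this method directly, assuming the base Pydantic generator and using TypeAdapter.core_schema as a convenient way to obtain core schemas (the 'point' key is an arbitrary label):

from pydantic import BaseModel, TypeAdapter
from pydantic.json_schema import GenerateJsonSchema

class Point(BaseModel):
    x: int
    y: int

gen = GenerateJsonSchema(by_alias=True, ref_template='#/$defs/{model}')
schemas_map, definitions = gen.generate_definitions([
    ('point', 'validation', TypeAdapter(Point).core_schema),
    ('point', 'serialization', TypeAdapter(Point).core_schema),
])
print(schemas_map[('point', 'validation')])  # e.g. {'$ref': '#/$defs/Point'}
print(list(definitions))                     # e.g. ['Point']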

generate_inner

generate_inner(
    schema: CoreSchemaOrField,
) -> JsonSchemaValue

Generates a JSON schema for a given core schema.

Parameters:

Name Type Description Default
schema CoreSchemaOrField

The given core schema.

required

Returns:

Type Description
JsonSchemaValue

The generated JSON schema.

Source code in .venv/lib/python3.12/site-packages/pydantic/json_schema.py
def generate_inner(self, schema: CoreSchemaOrField) -> JsonSchemaValue:  # noqa: C901
    """Generates a JSON schema for a given core schema.

    Args:
        schema: The given core schema.

    Returns:
        The generated JSON schema.
    """
    # If a schema with the same CoreRef has been handled, just return a reference to it
    # Note that this assumes that it will _never_ be the case that the same CoreRef is used
    # on types that should have different JSON schemas
    if 'ref' in schema:
        core_ref = CoreRef(schema['ref'])  # type: ignore[typeddict-item]
        core_mode_ref = (core_ref, self.mode)
        if core_mode_ref in self.core_to_defs_refs and self.core_to_defs_refs[core_mode_ref] in self.definitions:
            return {'$ref': self.core_to_json_refs[core_mode_ref]}

    # Generate the JSON schema, accounting for the json_schema_override and core_schema_override
    metadata_handler = _core_metadata.CoreMetadataHandler(schema)

    def populate_defs(core_schema: CoreSchema, json_schema: JsonSchemaValue) -> JsonSchemaValue:
        if 'ref' in core_schema:
            core_ref = CoreRef(core_schema['ref'])  # type: ignore[typeddict-item]
            defs_ref, ref_json_schema = self.get_cache_defs_ref_schema(core_ref)
            json_ref = JsonRef(ref_json_schema['$ref'])
            self.json_to_defs_refs[json_ref] = defs_ref
            # Replace the schema if it's not a reference to itself
            # What we want to avoid is having the def be just a ref to itself
            # which is what would happen if we blindly assigned any
            if json_schema.get('$ref', None) != json_ref:
                self.definitions[defs_ref] = json_schema
                self._core_defs_invalid_for_json_schema.pop(defs_ref, None)
            json_schema = ref_json_schema
        return json_schema

    def convert_to_all_of(json_schema: JsonSchemaValue) -> JsonSchemaValue:
        if '$ref' in json_schema and len(json_schema.keys()) > 1:
            # technically you can't have any other keys next to a "$ref"
            # but it's an easy mistake to make and not hard to correct automatically here
            json_schema = json_schema.copy()
            ref = json_schema.pop('$ref')
            json_schema = {'allOf': [{'$ref': ref}], **json_schema}
        return json_schema

    def handler_func(schema_or_field: CoreSchemaOrField) -> JsonSchemaValue:
        """Generate a JSON schema based on the input schema.

        Args:
            schema_or_field: The core schema to generate a JSON schema from.

        Returns:
            The generated JSON schema.

        Raises:
            TypeError: If an unexpected schema type is encountered.
        """
        # Generate the core-schema-type-specific bits of the schema generation:
        json_schema: JsonSchemaValue | None = None
        if self.mode == 'serialization' and 'serialization' in schema_or_field:
            ser_schema = schema_or_field['serialization']  # type: ignore
            json_schema = self.ser_schema(ser_schema)
        if json_schema is None:
            if _core_utils.is_core_schema(schema_or_field) or _core_utils.is_core_schema_field(schema_or_field):
                generate_for_schema_type = self._schema_type_to_method[schema_or_field['type']]
                json_schema = generate_for_schema_type(schema_or_field)
            else:
                raise TypeError(f'Unexpected schema type: schema={schema_or_field}')
        if _core_utils.is_core_schema(schema_or_field):
            json_schema = populate_defs(schema_or_field, json_schema)
            json_schema = convert_to_all_of(json_schema)
        return json_schema

    current_handler = _schema_generation_shared.GenerateJsonSchemaHandler(self, handler_func)

    for js_modify_function in metadata_handler.metadata.get('pydantic_js_functions', ()):

        def new_handler_func(
            schema_or_field: CoreSchemaOrField,
            current_handler: GetJsonSchemaHandler = current_handler,
            js_modify_function: GetJsonSchemaFunction = js_modify_function,
        ) -> JsonSchemaValue:
            json_schema = js_modify_function(schema_or_field, current_handler)
            if _core_utils.is_core_schema(schema_or_field):
                json_schema = populate_defs(schema_or_field, json_schema)
            original_schema = current_handler.resolve_ref_schema(json_schema)
            ref = json_schema.pop('$ref', None)
            if ref and json_schema:
                original_schema.update(json_schema)
            return original_schema

        current_handler = _schema_generation_shared.GenerateJsonSchemaHandler(self, new_handler_func)

    for js_modify_function in metadata_handler.metadata.get('pydantic_js_annotation_functions', ()):

        def new_handler_func(
            schema_or_field: CoreSchemaOrField,
            current_handler: GetJsonSchemaHandler = current_handler,
            js_modify_function: GetJsonSchemaFunction = js_modify_function,
        ) -> JsonSchemaValue:
            json_schema = js_modify_function(schema_or_field, current_handler)
            if _core_utils.is_core_schema(schema_or_field):
                json_schema = populate_defs(schema_or_field, json_schema)
                json_schema = convert_to_all_of(json_schema)
            return json_schema

        current_handler = _schema_generation_shared.GenerateJsonSchemaHandler(self, new_handler_func)

    json_schema = current_handler(schema)
    if _core_utils.is_core_schema(schema):
        json_schema = populate_defs(schema, json_schema)
        json_schema = convert_to_all_of(json_schema)
    return json_schema

any_schema

any_schema(schema: AnySchema) -> JsonSchemaValue

Generates a JSON schema that matches any value.

Parameters:

Name Type Description Default
schema AnySchema

The core schema.

required

Returns:

Type Description
JsonSchemaValue

The generated JSON schema.

Source code in .venv/lib/python3.12/site-packages/pydantic/json_schema.py
def any_schema(self, schema: core_schema.AnySchema) -> JsonSchemaValue:
    """Generates a JSON schema that matches any value.

    Args:
        schema: The core schema.

    Returns:
        The generated JSON schema.
    """
    return {}

none_schema

none_schema(schema: NoneSchema) -> JsonSchemaValue

Generates a JSON schema that matches None.

Parameters:

Name Type Description Default
schema NoneSchema

The core schema.

required

Returns:

Type Description
JsonSchemaValue

The generated JSON schema.

Source code in .venv/lib/python3.12/site-packages/pydantic/json_schema.py
def none_schema(self, schema: core_schema.NoneSchema) -> JsonSchemaValue:
    """Generates a JSON schema that matches `None`.

    Args:
        schema: The core schema.

    Returns:
        The generated JSON schema.
    """
    return {'type': 'null'}

bool_schema

bool_schema(schema: BoolSchema) -> JsonSchemaValue

Generates a JSON schema that matches a bool value.

Parameters:

Name Type Description Default
schema BoolSchema

The core schema.

required

Returns:

Type Description
JsonSchemaValue

The generated JSON schema.

Source code in .venv/lib/python3.12/site-packages/pydantic/json_schema.py
def bool_schema(self, schema: core_schema.BoolSchema) -> JsonSchemaValue:
    """Generates a JSON schema that matches a bool value.

    Args:
        schema: The core schema.

    Returns:
        The generated JSON schema.
    """
    return {'type': 'boolean'}

int_schema

int_schema(schema: IntSchema) -> JsonSchemaValue

Generates a JSON schema that matches an int value.

Parameters:

Name Type Description Default
schema IntSchema

The core schema.

required

Returns:

Type Description
JsonSchemaValue

The generated JSON schema.

Source code in .venv/lib/python3.12/site-packages/pydantic/json_schema.py
def int_schema(self, schema: core_schema.IntSchema) -> JsonSchemaValue:
    """Generates a JSON schema that matches an int value.

    Args:
        schema: The core schema.

    Returns:
        The generated JSON schema.
    """
    json_schema: dict[str, Any] = {'type': 'integer'}
    self.update_with_validations(json_schema, schema, self.ValidationsMapping.numeric)
    json_schema = {k: v for k, v in json_schema.items() if v not in {math.inf, -math.inf}}
    return json_schema
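
A minimal sketch of how numeric constraints surface in the result, assuming plain Pydantic annotations:

from typing import Annotated

from pydantic import Field, TypeAdapter

BoundedInt = Annotated[int, Field(gt=0, le=100)]
print(TypeAdapter(BoundedInt).json_schema())
# e.g. {'exclusiveMinimum': 0, 'maximum': 100, 'type': 'integer'}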

float_schema

float_schema(schema: FloatSchema) -> JsonSchemaValue

Generates a JSON schema that matches a float value.

Parameters:

Name Type Description Default
schema FloatSchema

The core schema.

required

Returns:

Type Description
JsonSchemaValue

The generated JSON schema.

Source code in .venv/lib/python3.12/site-packages/pydantic/json_schema.py
def float_schema(self, schema: core_schema.FloatSchema) -> JsonSchemaValue:
    """Generates a JSON schema that matches a float value.

    Args:
        schema: The core schema.

    Returns:
        The generated JSON schema.
    """
    json_schema: dict[str, Any] = {'type': 'number'}
    self.update_with_validations(json_schema, schema, self.ValidationsMapping.numeric)
    json_schema = {k: v for k, v in json_schema.items() if v not in {math.inf, -math.inf}}
    return json_schema

decimal_schema

decimal_schema(schema: DecimalSchema) -> JsonSchemaValue

Generates a JSON schema that matches a decimal value.

Parameters:

Name Type Description Default
schema DecimalSchema

The core schema.

required

Returns:

Type Description
JsonSchemaValue

The generated JSON schema.

Source code in .venv/lib/python3.12/site-packages/pydantic/json_schema.py
def decimal_schema(self, schema: core_schema.DecimalSchema) -> JsonSchemaValue:
    """Generates a JSON schema that matches a decimal value.

    Args:
        schema: The core schema.

    Returns:
        The generated JSON schema.
    """
    json_schema = self.str_schema(core_schema.str_schema())
    if self.mode == 'validation':
        multiple_of = schema.get('multiple_of')
        le = schema.get('le')
        ge = schema.get('ge')
        lt = schema.get('lt')
        gt = schema.get('gt')
        json_schema = {
            'anyOf': [
                self.float_schema(
                    core_schema.float_schema(
                        allow_inf_nan=schema.get('allow_inf_nan'),
                        multiple_of=None if multiple_of is None else float(multiple_of),
                        le=None if le is None else float(le),
                        ge=None if ge is None else float(ge),
                        lt=None if lt is None else float(lt),
                        gt=None if gt is None else float(gt),
                    )
                ),
                json_schema,
            ],
        }
    return json_schema

str_schema

str_schema(schema: StringSchema) -> JsonSchemaValue

Generates a JSON schema that matches a string value.

Parameters:

Name Type Description Default
schema StringSchema

The core schema.

required

Returns:

Type Description
JsonSchemaValue

The generated JSON schema.

Source code in .venv/lib/python3.12/site-packages/pydantic/json_schema.py
def str_schema(self, schema: core_schema.StringSchema) -> JsonSchemaValue:
    """Generates a JSON schema that matches a string value.

    Args:
        schema: The core schema.

    Returns:
        The generated JSON schema.
    """
    json_schema = {'type': 'string'}
    self.update_with_validations(json_schema, schema, self.ValidationsMapping.string)
    return json_schema

bytes_schema

bytes_schema(schema: BytesSchema) -> JsonSchemaValue

Generates a JSON schema that matches a bytes value.

Parameters:

Name Type Description Default
schema BytesSchema

The core schema.

required

Returns:

Type Description
JsonSchemaValue

The generated JSON schema.

Source code in .venv/lib/python3.12/site-packages/pydantic/json_schema.py
def bytes_schema(self, schema: core_schema.BytesSchema) -> JsonSchemaValue:
    """Generates a JSON schema that matches a bytes value.

    Args:
        schema: The core schema.

    Returns:
        The generated JSON schema.
    """
    json_schema = {'type': 'string', 'format': 'base64url' if self._config.ser_json_bytes == 'base64' else 'binary'}
    self.update_with_validations(json_schema, schema, self.ValidationsMapping.bytes)
    return json_schema

date_schema

date_schema(schema: DateSchema) -> JsonSchemaValue

Generates a JSON schema that matches a date value.

Parameters:

Name Type Description Default
schema DateSchema

The core schema.

required

Returns:

Type Description
JsonSchemaValue

The generated JSON schema.

Source code in .venv/lib/python3.12/site-packages/pydantic/json_schema.py
def date_schema(self, schema: core_schema.DateSchema) -> JsonSchemaValue:
    """Generates a JSON schema that matches a date value.

    Args:
        schema: The core schema.

    Returns:
        The generated JSON schema.
    """
    json_schema = {'type': 'string', 'format': 'date'}
    self.update_with_validations(json_schema, schema, self.ValidationsMapping.date)
    return json_schema

time_schema

time_schema(schema: TimeSchema) -> JsonSchemaValue

Generates a JSON schema that matches a time value.

Parameters:

Name Type Description Default
schema TimeSchema

The core schema.

required

Returns:

Type Description
JsonSchemaValue

The generated JSON schema.

Source code in .venv/lib/python3.12/site-packages/pydantic/json_schema.py
def time_schema(self, schema: core_schema.TimeSchema) -> JsonSchemaValue:
    """Generates a JSON schema that matches a time value.

    Args:
        schema: The core schema.

    Returns:
        The generated JSON schema.
    """
    return {'type': 'string', 'format': 'time'}

datetime_schema

datetime_schema(schema: DatetimeSchema) -> JsonSchemaValue

Generates a JSON schema that matches a datetime value.

Parameters:

Name Type Description Default
schema DatetimeSchema

The core schema.

required

Returns:

Type Description
JsonSchemaValue

The generated JSON schema.

Source code in .venv/lib/python3.12/site-packages/pydantic/json_schema.py
def datetime_schema(self, schema: core_schema.DatetimeSchema) -> JsonSchemaValue:
    """Generates a JSON schema that matches a datetime value.

    Args:
        schema: The core schema.

    Returns:
        The generated JSON schema.
    """
    return {'type': 'string', 'format': 'date-time'}

timedelta_schema

timedelta_schema(
    schema: TimedeltaSchema,
) -> JsonSchemaValue

Generates a JSON schema that matches a timedelta value.

Parameters:

Name Type Description Default
schema TimedeltaSchema

The core schema.

required

Returns:

Type Description
JsonSchemaValue

The generated JSON schema.

Source code in .venv/lib/python3.12/site-packages/pydantic/json_schema.py
def timedelta_schema(self, schema: core_schema.TimedeltaSchema) -> JsonSchemaValue:
    """Generates a JSON schema that matches a timedelta value.

    Args:
        schema: The core schema.

    Returns:
        The generated JSON schema.
    """
    if self._config.ser_json_timedelta == 'float':
        return {'type': 'number'}
    return {'type': 'string', 'format': 'duration'}

literal_schema

literal_schema(schema: LiteralSchema) -> JsonSchemaValue

Generates a JSON schema that matches a literal value.

Parameters:

Name Type Description Default
schema LiteralSchema

The core schema.

required

Returns:

Type Description
JsonSchemaValue

The generated JSON schema.

Source code in .venv/lib/python3.12/site-packages/pydantic/json_schema.py
def literal_schema(self, schema: core_schema.LiteralSchema) -> JsonSchemaValue:
    """Generates a JSON schema that matches a literal value.

    Args:
        schema: The core schema.

    Returns:
        The generated JSON schema.
    """
    expected = [v.value if isinstance(v, Enum) else v for v in schema['expected']]
    # jsonify the expected values
    expected = [to_jsonable_python(v) for v in expected]

    if len(expected) == 1:
        return {'const': expected[0]}

    types = {type(e) for e in expected}
    if types == {str}:
        return {'enum': expected, 'type': 'string'}
    elif types == {int}:
        return {'enum': expected, 'type': 'integer'}
    elif types == {float}:
        return {'enum': expected, 'type': 'number'}
    elif types == {bool}:
        return {'enum': expected, 'type': 'boolean'}
    elif types == {list}:
        return {'enum': expected, 'type': 'array'}
    # there is not None case because if it's mixed it hits the final `else`
    # if it's a single Literal[None] then it becomes a `const` schema above
    else:
        return {'enum': expected}
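
A minimal sketch of the two branches above, assuming plain Pydantic:

from typing import Literal

from pydantic import TypeAdapter

print(TypeAdapter(Literal['red']).json_schema())
# e.g. {'const': 'red'}
print(TypeAdapter(Literal['red', 'green']).json_schema())
# e.g. {'enum': ['red', 'green'], 'type': 'string'}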

is_instance_schema

is_instance_schema(
    schema: IsInstanceSchema,
) -> JsonSchemaValue

Handles JSON schema generation for a core schema that checks if a value is an instance of a class.

Unless overridden in a subclass, this raises an error.

Parameters:

Name Type Description Default
schema IsInstanceSchema

The core schema.

required

Returns:

Type Description
JsonSchemaValue

The generated JSON schema.

Source code in .venv/lib/python3.12/site-packages/pydantic/json_schema.py
def is_instance_schema(self, schema: core_schema.IsInstanceSchema) -> JsonSchemaValue:
    """Handles JSON schema generation for a core schema that checks if a value is an instance of a class.

    Unless overridden in a subclass, this raises an error.

    Args:
        schema: The core schema.

    Returns:
        The generated JSON schema.
    """
    return self.handle_invalid_for_json_schema(schema, f'core_schema.IsInstanceSchema ({schema["cls"]})')

is_subclass_schema

is_subclass_schema(
    schema: IsSubclassSchema,
) -> JsonSchemaValue

Handles JSON schema generation for a core schema that checks if a value is a subclass of a class.

For backwards compatibility with v1, this does not raise an error, but can be overridden to change this.

Parameters:

Name Type Description Default
schema IsSubclassSchema

The core schema.

required

Returns:

Type Description
JsonSchemaValue

The generated JSON schema.

Source code in .venv/lib/python3.12/site-packages/pydantic/json_schema.py
def is_subclass_schema(self, schema: core_schema.IsSubclassSchema) -> JsonSchemaValue:
    """Handles JSON schema generation for a core schema that checks if a value is a subclass of a class.

    For backwards compatibility with v1, this does not raise an error, but can be overridden to change this.

    Args:
        schema: The core schema.

    Returns:
        The generated JSON schema.
    """
    # Note: This is for compatibility with V1; you can override if you want different behavior.
    return {}

callable_schema

callable_schema(schema: CallableSchema) -> JsonSchemaValue

Generates a JSON schema that matches a callable value.

Unless overridden in a subclass, this raises an error.

Parameters:

Name Type Description Default
schema CallableSchema

The core schema.

required

Returns:

Type Description
JsonSchemaValue

The generated JSON schema.

Source code in .venv/lib/python3.12/site-packages/pydantic/json_schema.py
def callable_schema(self, schema: core_schema.CallableSchema) -> JsonSchemaValue:
    """Generates a JSON schema that matches a callable value.

    Unless overridden in a subclass, this raises an error.

    Args:
        schema: The core schema.

    Returns:
        The generated JSON schema.
    """
    return self.handle_invalid_for_json_schema(schema, 'core_schema.CallableSchema')

list_schema

list_schema(schema: ListSchema) -> JsonSchemaValue

Returns a schema that matches a list schema.

Parameters:

Name Type Description Default
schema ListSchema

The core schema.

required

Returns:

Type Description
JsonSchemaValue

The generated JSON schema.

Source code in .venv/lib/python3.12/site-packages/pydantic/json_schema.py
def list_schema(self, schema: core_schema.ListSchema) -> JsonSchemaValue:
    """Returns a schema that matches a list schema.

    Args:
        schema: The core schema.

    Returns:
        The generated JSON schema.
    """
    items_schema = {} if 'items_schema' not in schema else self.generate_inner(schema['items_schema'])
    json_schema = {'type': 'array', 'items': items_schema}
    self.update_with_validations(json_schema, schema, self.ValidationsMapping.array)
    return json_schema

tuple_positional_schema

tuple_positional_schema(
    schema: TupleSchema,
) -> JsonSchemaValue

Replaced by tuple_schema.

Source code in .venv/lib/python3.12/site-packages/pydantic/json_schema.py
@deprecated('`tuple_positional_schema` is deprecated. Use `tuple_schema` instead.', category=None)
@final
def tuple_positional_schema(self, schema: core_schema.TupleSchema) -> JsonSchemaValue:
    """Replaced by `tuple_schema`."""
    warnings.warn(
        '`tuple_positional_schema` is deprecated. Use `tuple_schema` instead.',
        PydanticDeprecatedSince26,
        stacklevel=2,
    )
    return self.tuple_schema(schema)

tuple_variable_schema

tuple_variable_schema(
    schema: TupleSchema,
) -> JsonSchemaValue

Replaced by tuple_schema.

Source code in .venv/lib/python3.12/site-packages/pydantic/json_schema.py
@deprecated('`tuple_variable_schema` is deprecated. Use `tuple_schema` instead.', category=None)
@final
def tuple_variable_schema(self, schema: core_schema.TupleSchema) -> JsonSchemaValue:
    """Replaced by `tuple_schema`."""
    warnings.warn(
        '`tuple_variable_schema` is deprecated. Use `tuple_schema` instead.',
        PydanticDeprecatedSince26,
        stacklevel=2,
    )
    return self.tuple_schema(schema)

tuple_schema

tuple_schema(schema: TupleSchema) -> JsonSchemaValue

Generates a JSON schema that matches a tuple schema e.g. Tuple[int, str, bool] or Tuple[int, ...].

Parameters:

Name Type Description Default
schema TupleSchema

The core schema.

required

Returns:

Type Description
JsonSchemaValue

The generated JSON schema.

Source code in .venv/lib/python3.12/site-packages/pydantic/json_schema.py
def tuple_schema(self, schema: core_schema.TupleSchema) -> JsonSchemaValue:
    """Generates a JSON schema that matches a tuple schema e.g. `Tuple[int,
    str, bool]` or `Tuple[int, ...]`.

    Args:
        schema: The core schema.

    Returns:
        The generated JSON schema.
    """
    json_schema: JsonSchemaValue = {'type': 'array'}
    if 'variadic_item_index' in schema:
        variadic_item_index = schema['variadic_item_index']
        if variadic_item_index > 0:
            json_schema['minItems'] = variadic_item_index
            json_schema['prefixItems'] = [
                self.generate_inner(item) for item in schema['items_schema'][:variadic_item_index]
            ]
        if variadic_item_index + 1 == len(schema['items_schema']):
            # if the variadic item is the last item, then represent it faithfully
            json_schema['items'] = self.generate_inner(schema['items_schema'][variadic_item_index])
        else:
            # otherwise, 'items' represents the schema for the variadic
            # item plus the suffix, so just allow anything for simplicity
            # for now
            json_schema['items'] = True
    else:
        prefixItems = [self.generate_inner(item) for item in schema['items_schema']]
        if prefixItems:
            json_schema['prefixItems'] = prefixItems
        json_schema['minItems'] = len(prefixItems)
        json_schema['maxItems'] = len(prefixItems)
    self.update_with_validations(json_schema, schema, self.ValidationsMapping.array)
    return json_schema
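
A minimal sketch of both tuple forms, assuming plain Pydantic:

from pydantic import TypeAdapter

# Fixed-length tuple: prefixItems with matching minItems/maxItems.
print(TypeAdapter(tuple[int, str]).json_schema())
# e.g. {'maxItems': 2, 'minItems': 2,
#       'prefixItems': [{'type': 'integer'}, {'type': 'string'}], 'type': 'array'}

# Variadic tuple: a single 'items' schema.
print(TypeAdapter(tuple[int, ...]).json_schema())
# e.g. {'items': {'type': 'integer'}, 'type': 'array'}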

set_schema

set_schema(schema: SetSchema) -> JsonSchemaValue

Generates a JSON schema that matches a set schema.

Parameters:

Name Type Description Default
schema SetSchema

The core schema.

required

Returns:

Type Description
JsonSchemaValue

The generated JSON schema.

Source code in .venv/lib/python3.12/site-packages/pydantic/json_schema.py
def set_schema(self, schema: core_schema.SetSchema) -> JsonSchemaValue:
    """Generates a JSON schema that matches a set schema.

    Args:
        schema: The core schema.

    Returns:
        The generated JSON schema.
    """
    return self._common_set_schema(schema)

frozenset_schema

frozenset_schema(
    schema: FrozenSetSchema,
) -> JsonSchemaValue

Generates a JSON schema that matches a frozenset schema.

Parameters:

Name Type Description Default
schema FrozenSetSchema

The core schema.

required

Returns:

Type Description
JsonSchemaValue

The generated JSON schema.

Source code in .venv/lib/python3.12/site-packages/pydantic/json_schema.py
def frozenset_schema(self, schema: core_schema.FrozenSetSchema) -> JsonSchemaValue:
    """Generates a JSON schema that matches a frozenset schema.

    Args:
        schema: The core schema.

    Returns:
        The generated JSON schema.
    """
    return self._common_set_schema(schema)

generator_schema

generator_schema(
    schema: GeneratorSchema,
) -> JsonSchemaValue

Returns a JSON schema that represents the provided GeneratorSchema.

Parameters:

Name Type Description Default
schema GeneratorSchema

The schema.

required

Returns:

Type Description
JsonSchemaValue

The generated JSON schema.

Source code in .venv/lib/python3.12/site-packages/pydantic/json_schema.py
def generator_schema(self, schema: core_schema.GeneratorSchema) -> JsonSchemaValue:
    """Returns a JSON schema that represents the provided GeneratorSchema.

    Args:
        schema: The schema.

    Returns:
        The generated JSON schema.
    """
    items_schema = {} if 'items_schema' not in schema else self.generate_inner(schema['items_schema'])
    json_schema = {'type': 'array', 'items': items_schema}
    self.update_with_validations(json_schema, schema, self.ValidationsMapping.array)
    return json_schema

dict_schema

dict_schema(schema: DictSchema) -> JsonSchemaValue

Generates a JSON schema that matches a dict schema.

Parameters:

Name Type Description Default
schema DictSchema

The core schema.

required

Returns:

Type Description
JsonSchemaValue

The generated JSON schema.

Source code in .venv/lib/python3.12/site-packages/pydantic/json_schema.py
def dict_schema(self, schema: core_schema.DictSchema) -> JsonSchemaValue:
    """Generates a JSON schema that matches a dict schema.

    Args:
        schema: The core schema.

    Returns:
        The generated JSON schema.
    """
    json_schema: JsonSchemaValue = {'type': 'object'}

    keys_schema = self.generate_inner(schema['keys_schema']).copy() if 'keys_schema' in schema else {}
    keys_pattern = keys_schema.pop('pattern', None)

    values_schema = self.generate_inner(schema['values_schema']).copy() if 'values_schema' in schema else {}
    values_schema.pop('title', None)  # don't give a title to the additionalProperties
    if values_schema or keys_pattern is not None:  # don't add additionalProperties if it's empty
        if keys_pattern is None:
            json_schema['additionalProperties'] = values_schema
        else:
            json_schema['patternProperties'] = {keys_pattern: values_schema}

    self.update_with_validations(json_schema, schema, self.ValidationsMapping.object)
    return json_schema
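
A minimal sketch, assuming plain Pydantic: value constraints land in additionalProperties (or patternProperties when the key schema carries a pattern).

from pydantic import TypeAdapter

print(TypeAdapter(dict[str, int]).json_schema())
# e.g. {'additionalProperties': {'type': 'integer'}, 'type': 'object'}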

function_before_schema

function_before_schema(
    schema: BeforeValidatorFunctionSchema,
) -> JsonSchemaValue

Generates a JSON schema that matches a function-before schema.

Parameters:

Name Type Description Default
schema BeforeValidatorFunctionSchema

The core schema.

required

Returns:

Type Description
JsonSchemaValue

The generated JSON schema.

Source code in .venv/lib/python3.12/site-packages/pydantic/json_schema.py
def function_before_schema(self, schema: core_schema.BeforeValidatorFunctionSchema) -> JsonSchemaValue:
    """Generates a JSON schema that matches a function-before schema.

    Args:
        schema: The core schema.

    Returns:
        The generated JSON schema.
    """
    return self._function_schema(schema)

function_after_schema

function_after_schema(
    schema: AfterValidatorFunctionSchema,
) -> JsonSchemaValue

Generates a JSON schema that matches a function-after schema.

Parameters:

Name Type Description Default
schema AfterValidatorFunctionSchema

The core schema.

required

Returns:

Type Description
JsonSchemaValue

The generated JSON schema.

Source code in .venv/lib/python3.12/site-packages/pydantic/json_schema.py
def function_after_schema(self, schema: core_schema.AfterValidatorFunctionSchema) -> JsonSchemaValue:
    """Generates a JSON schema that matches a function-after schema.

    Args:
        schema: The core schema.

    Returns:
        The generated JSON schema.
    """
    return self._function_schema(schema)

function_plain_schema

function_plain_schema(
    schema: PlainValidatorFunctionSchema,
) -> JsonSchemaValue

Generates a JSON schema that matches a function-plain schema.

Parameters:

Name Type Description Default
schema PlainValidatorFunctionSchema

The core schema.

required

Returns:

Type Description
JsonSchemaValue

The generated JSON schema.

Source code in .venv/lib/python3.12/site-packages/pydantic/json_schema.py
def function_plain_schema(self, schema: core_schema.PlainValidatorFunctionSchema) -> JsonSchemaValue:
    """Generates a JSON schema that matches a function-plain schema.

    Args:
        schema: The core schema.

    Returns:
        The generated JSON schema.
    """
    return self._function_schema(schema)

function_wrap_schema

function_wrap_schema(
    schema: WrapValidatorFunctionSchema,
) -> JsonSchemaValue

Generates a JSON schema that matches a function-wrap schema.

Parameters:

Name Type Description Default
schema WrapValidatorFunctionSchema

The core schema.

required

Returns:

Type Description
JsonSchemaValue

The generated JSON schema.

Source code in .venv/lib/python3.12/site-packages/pydantic/json_schema.py
def function_wrap_schema(self, schema: core_schema.WrapValidatorFunctionSchema) -> JsonSchemaValue:
    """Generates a JSON schema that matches a function-wrap schema.

    Args:
        schema: The core schema.

    Returns:
        The generated JSON schema.
    """
    return self._function_schema(schema)

default_schema

default_schema(
    schema: WithDefaultSchema,
) -> JsonSchemaValue

Generates a JSON schema that matches a schema with a default value.

Parameters:

Name Type Description Default
schema WithDefaultSchema

The core schema.

required

Returns:

Type Description
JsonSchemaValue

The generated JSON schema.

Source code in .venv/lib/python3.12/site-packages/pydantic/json_schema.py
def default_schema(self, schema: core_schema.WithDefaultSchema) -> JsonSchemaValue:
    """Generates a JSON schema that matches a schema with a default value.

    Args:
        schema: The core schema.

    Returns:
        The generated JSON schema.
    """
    json_schema = self.generate_inner(schema['schema'])

    if 'default' not in schema:
        return json_schema
    default = schema['default']
    # Note: if you want to include the value returned by the default_factory,
    # override this method and replace the code above with:
    # if 'default' in schema:
    #     default = schema['default']
    # elif 'default_factory' in schema:
    #     default = schema['default_factory']()
    # else:
    #     return json_schema

    try:
        encoded_default = self.encode_default(default)
    except pydantic_core.PydanticSerializationError:
        self.emit_warning(
            'non-serializable-default',
            f'Default value {default} is not JSON serializable; excluding default from JSON schema',
        )
        # Return the inner schema, as though there was no default
        return json_schema

    if '$ref' in json_schema:
        # Since reference schemas do not support child keys, we wrap the reference schema in a single-case allOf:
        return {'allOf': [json_schema], 'default': encoded_default}
    else:
        json_schema['default'] = encoded_default
        return json_schema
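
A minimal sketch, assuming plain Pydantic: a serializable field default is embedded in the schema under 'default'.

from pydantic import BaseModel

class Settings(BaseModel):
    retries: int = 3

print(Settings.model_json_schema()['properties']['retries'])
# e.g. {'default': 3, 'title': 'Retries', 'type': 'integer'}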

nullable_schema

nullable_schema(schema: NullableSchema) -> JsonSchemaValue

Generates a JSON schema that matches a schema that allows null values.

Parameters:

Name Type Description Default
schema NullableSchema

The core schema.

required

Returns:

Type Description
JsonSchemaValue

The generated JSON schema.

Source code in .venv/lib/python3.12/site-packages/pydantic/json_schema.py
def nullable_schema(self, schema: core_schema.NullableSchema) -> JsonSchemaValue:
    """Generates a JSON schema that matches a schema that allows null values.

    Args:
        schema: The core schema.

    Returns:
        The generated JSON schema.
    """
    null_schema = {'type': 'null'}
    inner_json_schema = self.generate_inner(schema['schema'])

    if inner_json_schema == null_schema:
        return null_schema
    else:
        # Thanks to the equality check against `null_schema` above, I think 'oneOf' would also be valid here;
        # I'll use 'anyOf' for now, but it could be changed it if it would work better with some external tooling
        return self.get_flattened_anyof([inner_json_schema, null_schema])
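
A minimal sketch, assuming plain Pydantic: an optional type yields an anyOf with a null alternative.

from typing import Optional

from pydantic import TypeAdapter

print(TypeAdapter(Optional[int]).json_schema())
# e.g. {'anyOf': [{'type': 'integer'}, {'type': 'null'}]}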

union_schema

union_schema(schema: UnionSchema) -> JsonSchemaValue

Generates a JSON schema that matches a schema that allows values matching any of the given schemas.

Parameters:

Name Type Description Default
schema UnionSchema

The core schema.

required

Returns:

Type Description
JsonSchemaValue

The generated JSON schema.

Source code in .venv/lib/python3.12/site-packages/pydantic/json_schema.py
def union_schema(self, schema: core_schema.UnionSchema) -> JsonSchemaValue:
    """Generates a JSON schema that matches a schema that allows values matching any of the given schemas.

    Args:
        schema: The core schema.

    Returns:
        The generated JSON schema.
    """
    generated: list[JsonSchemaValue] = []

    choices = schema['choices']
    for choice in choices:
        # choice will be a tuple if an explicit label was provided
        choice_schema = choice[0] if isinstance(choice, tuple) else choice
        try:
            generated.append(self.generate_inner(choice_schema))
        except PydanticOmit:
            continue
        except PydanticInvalidForJsonSchema as exc:
            self.emit_warning('skipped-choice', exc.message)
    if len(generated) == 1:
        return generated[0]
    return self.get_flattened_anyof(generated)

tagged_union_schema

tagged_union_schema(
    schema: TaggedUnionSchema,
) -> JsonSchemaValue

Generates a JSON schema that matches a schema that allows values matching any of the given schemas, where the schemas are tagged with a discriminator field that indicates which schema should be used to validate the value.

Parameters:

Name Type Description Default
schema TaggedUnionSchema

The core schema.

required

Returns:

Type Description
JsonSchemaValue

The generated JSON schema.

Source code in .venv/lib/python3.12/site-packages/pydantic/json_schema.py
def tagged_union_schema(self, schema: core_schema.TaggedUnionSchema) -> JsonSchemaValue:
    """Generates a JSON schema that matches a schema that allows values matching any of the given schemas, where
    the schemas are tagged with a discriminator field that indicates which schema should be used to validate
    the value.

    Args:
        schema: The core schema.

    Returns:
        The generated JSON schema.
    """
    generated: dict[str, JsonSchemaValue] = {}
    for k, v in schema['choices'].items():
        if isinstance(k, Enum):
            k = k.value
        try:
            # Use str(k) since keys must be strings for json; while not technically correct,
            # it's the closest that can be represented in valid JSON
            generated[str(k)] = self.generate_inner(v).copy()
        except PydanticOmit:
            continue
        except PydanticInvalidForJsonSchema as exc:
            self.emit_warning('skipped-choice', exc.message)

    one_of_choices = _deduplicate_schemas(generated.values())
    json_schema: JsonSchemaValue = {'oneOf': one_of_choices}

    # This reflects the v1 behavior; TODO: we should make it possible to exclude OpenAPI stuff from the JSON schema
    openapi_discriminator = self._extract_discriminator(schema, one_of_choices)
    if openapi_discriminator is not None:
        json_schema['discriminator'] = {
            'propertyName': openapi_discriminator,
            'mapping': {k: v.get('$ref', v) for k, v in generated.items()},
        }

    return json_schema
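
A minimal sketch of a discriminated union, assuming plain Pydantic (the Cat/Dog models are illustrative):

from typing import Annotated, Literal, Union

from pydantic import BaseModel, Field, TypeAdapter

class Cat(BaseModel):
    kind: Literal['cat'] = 'cat'

class Dog(BaseModel):
    kind: Literal['dog'] = 'dog'

Pet = Annotated[Union[Cat, Dog], Field(discriminator='kind')]
schema = TypeAdapter(Pet).json_schema()
print(schema['discriminator']['propertyName'])     # 'kind'
print(sorted(schema['discriminator']['mapping']))  # e.g. ['cat', 'dog']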

chain_schema

chain_schema(schema: ChainSchema) -> JsonSchemaValue

Generates a JSON schema that matches a core_schema.ChainSchema.

When generating a schema for validation, we return the validation JSON schema for the first step in the chain. For serialization, we return the serialization JSON schema for the last step in the chain.

Parameters:

Name Type Description Default
schema ChainSchema

The core schema.

required

Returns:

Type Description
JsonSchemaValue

The generated JSON schema.

Source code in .venv/lib/python3.12/site-packages/pydantic/json_schema.py
def chain_schema(self, schema: core_schema.ChainSchema) -> JsonSchemaValue:
    """Generates a JSON schema that matches a core_schema.ChainSchema.

    When generating a schema for validation, we return the validation JSON schema for the first step in the chain.
    For serialization, we return the serialization JSON schema for the last step in the chain.

    Args:
        schema: The core schema.

    Returns:
        The generated JSON schema.
    """
    step_index = 0 if self.mode == 'validation' else -1  # use first step for validation, last for serialization
    return self.generate_inner(schema['steps'][step_index])

lax_or_strict_schema

lax_or_strict_schema(
    schema: LaxOrStrictSchema,
) -> JsonSchemaValue

Generates a JSON schema that matches a schema that allows values matching either the lax schema or the strict schema.

Parameters:

Name Type Description Default
schema LaxOrStrictSchema

The core schema.

required

Returns:

Type Description
JsonSchemaValue

The generated JSON schema.

Source code in .venv/lib/python3.12/site-packages/pydantic/json_schema.py
def lax_or_strict_schema(self, schema: core_schema.LaxOrStrictSchema) -> JsonSchemaValue:
    """Generates a JSON schema that matches a schema that allows values matching either the lax schema or the
    strict schema.

    Args:
        schema: The core schema.

    Returns:
        The generated JSON schema.
    """
    # TODO: Need to read the default value off of model config or whatever
    use_strict = schema.get('strict', False)  # TODO: replace this default False
    # If your JSON schema fails to generate it is probably
    # because one of the following two branches failed.
    if use_strict:
        return self.generate_inner(schema['strict_schema'])
    else:
        return self.generate_inner(schema['lax_schema'])

json_or_python_schema

json_or_python_schema(
    schema: JsonOrPythonSchema,
) -> JsonSchemaValue

Generates a JSON schema that matches a schema that allows values matching either the JSON schema or the Python schema.

The JSON schema is used instead of the Python schema. If you want to use the Python schema, you should override this method.

Parameters:

Name Type Description Default
schema JsonOrPythonSchema

The core schema.

required

Returns:

Type Description
JsonSchemaValue

The generated JSON schema.

Source code in .venv/lib/python3.12/site-packages/pydantic/json_schema.py
def json_or_python_schema(self, schema: core_schema.JsonOrPythonSchema) -> JsonSchemaValue:
    """Generates a JSON schema that matches a schema that allows values matching either the JSON schema or the
    Python schema.

    The JSON schema is used instead of the Python schema. If you want to use the Python schema, you should override
    this method.

    Args:
        schema: The core schema.

    Returns:
        The generated JSON schema.
    """
    return self.generate_inner(schema['json_schema'])

typed_dict_schema

typed_dict_schema(
    schema: TypedDictSchema,
) -> JsonSchemaValue

Generates a JSON schema that matches a schema that defines a typed dict.

Parameters:

Name Type Description Default
schema TypedDictSchema

The core schema.

required

Returns:

Type Description
JsonSchemaValue

The generated JSON schema.

Source code in .venv/lib/python3.12/site-packages/pydantic/json_schema.py
def typed_dict_schema(self, schema: core_schema.TypedDictSchema) -> JsonSchemaValue:
    """Generates a JSON schema that matches a schema that defines a typed dict.

    Args:
        schema: The core schema.

    Returns:
        The generated JSON schema.
    """
    total = schema.get('total', True)
    named_required_fields: list[tuple[str, bool, CoreSchemaField]] = [
        (name, self.field_is_required(field, total), field)
        for name, field in schema['fields'].items()
        if self.field_is_present(field)
    ]
    if self.mode == 'serialization':
        named_required_fields.extend(self._name_required_computed_fields(schema.get('computed_fields', [])))

    config = _get_typed_dict_config(schema)
    with self._config_wrapper_stack.push(config):
        json_schema = self._named_required_fields_schema(named_required_fields)

    extra = schema.get('extra_behavior')
    if extra is None:
        extra = config.get('extra', 'ignore')
    if extra == 'forbid':
        json_schema['additionalProperties'] = False
    elif extra == 'allow':
        json_schema['additionalProperties'] = True

    return json_schema
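
A minimal sketch, assuming plain Pydantic: a non-total TypedDict produces optional properties, and the extra behavior controls additionalProperties.

from typing import TypedDict

from pydantic import TypeAdapter

class Movie(TypedDict, total=False):
    title: str

print(TypeAdapter(Movie).json_schema())
# e.g. {'properties': {'title': {'title': 'Title', 'type': 'string'}},
#       'title': 'Movie', 'type': 'object'}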

typed_dict_field_schema

typed_dict_field_schema(
    schema: TypedDictField,
) -> JsonSchemaValue

Generates a JSON schema that matches a schema that defines a typed dict field.

Parameters:

Name Type Description Default
schema TypedDictField

The core schema.

required

Returns:

Type Description
JsonSchemaValue

The generated JSON schema.

Source code in .venv/lib/python3.12/site-packages/pydantic/json_schema.py
def typed_dict_field_schema(self, schema: core_schema.TypedDictField) -> JsonSchemaValue:
    """Generates a JSON schema that matches a schema that defines a typed dict field.

    Args:
        schema: The core schema.

    Returns:
        The generated JSON schema.
    """
    return self.generate_inner(schema['schema'])

dataclass_field_schema

dataclass_field_schema(
    schema: DataclassField,
) -> JsonSchemaValue

Generates a JSON schema that matches a schema that defines a dataclass field.

Parameters:

Name Type Description Default
schema DataclassField

The core schema.

required

Returns:

Type Description
JsonSchemaValue

The generated JSON schema.

Source code in .venv/lib/python3.12/site-packages/pydantic/json_schema.py
def dataclass_field_schema(self, schema: core_schema.DataclassField) -> JsonSchemaValue:
    """Generates a JSON schema that matches a schema that defines a dataclass field.

    Args:
        schema: The core schema.

    Returns:
        The generated JSON schema.
    """
    return self.generate_inner(schema['schema'])

model_field_schema

model_field_schema(schema: ModelField) -> JsonSchemaValue

Generates a JSON schema that matches a schema that defines a model field.

Parameters:

Name Type Description Default
schema ModelField

The core schema.

required

Returns:

Type Description
JsonSchemaValue

The generated JSON schema.

Source code in .venv/lib/python3.12/site-packages/pydantic/json_schema.py
def model_field_schema(self, schema: core_schema.ModelField) -> JsonSchemaValue:
    """Generates a JSON schema that matches a schema that defines a model field.

    Args:
        schema: The core schema.

    Returns:
        The generated JSON schema.
    """
    return self.generate_inner(schema['schema'])

computed_field_schema

computed_field_schema(
    schema: ComputedField,
) -> JsonSchemaValue

Generates a JSON schema that matches a schema that defines a computed field.

Parameters:

Name Type Description Default
schema ComputedField

The core schema.

required

Returns:

Type Description
JsonSchemaValue

The generated JSON schema.

Source code in .venv/lib/python3.12/site-packages/pydantic/json_schema.py
def computed_field_schema(self, schema: core_schema.ComputedField) -> JsonSchemaValue:
    """Generates a JSON schema that matches a schema that defines a computed field.

    Args:
        schema: The core schema.

    Returns:
        The generated JSON schema.
    """
    return self.generate_inner(schema['return_schema'])

model_schema

model_schema(schema: ModelSchema) -> JsonSchemaValue

Generates a JSON schema that matches a schema that defines a model.

Parameters:

Name Type Description Default
schema ModelSchema

The core schema.

required

Returns:

Type Description
JsonSchemaValue

The generated JSON schema.

Source code in .venv/lib/python3.12/site-packages/pydantic/json_schema.py
def model_schema(self, schema: core_schema.ModelSchema) -> JsonSchemaValue:
    """Generates a JSON schema that matches a schema that defines a model.

    Args:
        schema: The core schema.

    Returns:
        The generated JSON schema.
    """
    # We do not use schema['model'].model_json_schema() here
    # because it could lead to inconsistent refs handling, etc.
    cls = cast('type[BaseModel]', schema['cls'])
    config = cls.model_config
    title = config.get('title')

    with self._config_wrapper_stack.push(config):
        json_schema = self.generate_inner(schema['schema'])

    json_schema_extra = config.get('json_schema_extra')
    if cls.__pydantic_root_model__:
        root_json_schema_extra = cls.model_fields['root'].json_schema_extra
        if json_schema_extra and root_json_schema_extra:
            raise ValueError(
                '"model_config[\'json_schema_extra\']" and "Field.json_schema_extra" on "RootModel.root"'
                ' field must not be set simultaneously'
            )
        if root_json_schema_extra:
            json_schema_extra = root_json_schema_extra

    json_schema = self._update_class_schema(json_schema, title, config.get('extra', None), cls, json_schema_extra)

    return json_schema

resolve_schema_to_update

resolve_schema_to_update(
    json_schema: JsonSchemaValue,
) -> JsonSchemaValue

Resolve a JsonSchemaValue to the non-ref schema if it is a $ref schema.

Parameters:

Name Type Description Default
json_schema JsonSchemaValue

The schema to resolve.

required

Returns:

Type Description
JsonSchemaValue

The resolved schema.

Source code in .venv/lib/python3.12/site-packages/pydantic/json_schema.py
def resolve_schema_to_update(self, json_schema: JsonSchemaValue) -> JsonSchemaValue:
    """Resolve a JsonSchemaValue to the non-ref schema if it is a $ref schema.

    Args:
        json_schema: The schema to resolve.

    Returns:
        The resolved schema.
    """
    if '$ref' in json_schema:
        schema_to_update = self.get_schema_from_definitions(JsonRef(json_schema['$ref']))
        if schema_to_update is None:
            raise RuntimeError(f'Cannot update undefined schema for $ref={json_schema["$ref"]}')
        return self.resolve_schema_to_update(schema_to_update)
    else:
        schema_to_update = json_schema
    return schema_to_update

model_fields_schema

model_fields_schema(
    schema: ModelFieldsSchema,
) -> JsonSchemaValue

Generates a JSON schema that matches a schema that defines a model's fields.

Parameters:

Name Type Description Default
schema ModelFieldsSchema

The core schema.

required

Returns:

Type Description
JsonSchemaValue

The generated JSON schema.

Source code in .venv/lib/python3.12/site-packages/pydantic/json_schema.py
def model_fields_schema(self, schema: core_schema.ModelFieldsSchema) -> JsonSchemaValue:
    """Generates a JSON schema that matches a schema that defines a model's fields.

    Args:
        schema: The core schema.

    Returns:
        The generated JSON schema.
    """
    named_required_fields: list[tuple[str, bool, CoreSchemaField]] = [
        (name, self.field_is_required(field, total=True), field)
        for name, field in schema['fields'].items()
        if self.field_is_present(field)
    ]
    if self.mode == 'serialization':
        named_required_fields.extend(self._name_required_computed_fields(schema.get('computed_fields', [])))
    json_schema = self._named_required_fields_schema(named_required_fields)
    extras_schema = schema.get('extras_schema', None)
    if extras_schema is not None:
        schema_to_update = self.resolve_schema_to_update(json_schema)
        schema_to_update['additionalProperties'] = self.generate_inner(extras_schema)
    return json_schema

field_is_present

field_is_present(field: CoreSchemaField) -> bool

Whether the field should be included in the generated JSON schema.

Parameters:

Name Type Description Default
field CoreSchemaField

The schema for the field itself.

required

Returns:

Type Description
bool

True if the field should be included in the generated JSON schema, False otherwise.

Source code in .venv/lib/python3.12/site-packages/pydantic/json_schema.py
def field_is_present(self, field: CoreSchemaField) -> bool:
    """Whether the field should be included in the generated JSON schema.

    Args:
        field: The schema for the field itself.

    Returns:
        `True` if the field should be included in the generated JSON schema, `False` otherwise.
    """
    if self.mode == 'serialization':
        # If you still want to include the field in the generated JSON schema,
        # override this method and return True
        return not field.get('serialization_exclude')
    elif self.mode == 'validation':
        return True
    else:
        assert_never(self.mode)
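
As the inline comment suggests, fields marked with serialization_exclude are dropped from serialization-mode schemas, and a subclass can override this hook to keep them. A minimal, hedged sketch using Pydantic's base generator (the model and field names are hypothetical):

from pydantic import BaseModel, Field
from pydantic.json_schema import GenerateJsonSchema

class IncludeExcludedFields(GenerateJsonSchema):
    """Keep serialization-excluded fields in the generated schema."""

    def field_is_present(self, field) -> bool:
        # Include every field, even those marked with serialization_exclude
        return True

class Item(BaseModel):
    name: str
    secret: str = Field(exclude=True)

# The 'secret' field now appears in the serialization schema as well
print(Item.model_json_schema(mode='serialization', schema_generator=IncludeExcludedFields))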

field_is_required

field_is_required(
    field: ModelField | DataclassField | TypedDictField,
    total: bool,
) -> bool

Whether the field should be marked as required in the generated JSON schema. (Note that this is irrelevant if the field is not present in the JSON schema.)

Parameters:

Name Type Description Default
field ModelField | DataclassField | TypedDictField

The schema for the field itself.

required
total bool

Only applies to TypedDictFields. Indicates if the TypedDict this field belongs to is total, in which case any fields that don't explicitly specify required=False are required.

required

Returns:

Type Description
bool

True if the field should be marked as required in the generated JSON schema, False otherwise.

Source code in .venv/lib/python3.12/site-packages/pydantic/json_schema.py
def field_is_required(
    self,
    field: core_schema.ModelField | core_schema.DataclassField | core_schema.TypedDictField,
    total: bool,
) -> bool:
    """Whether the field should be marked as required in the generated JSON schema.
    (Note that this is irrelevant if the field is not present in the JSON schema.).

    Args:
        field: The schema for the field itself.
        total: Only applies to `TypedDictField`s.
            Indicates if the `TypedDict` this field belongs to is total, in which case any fields that don't
            explicitly specify `required=False` are required.

    Returns:
        `True` if the field should be marked as required in the generated JSON schema, `False` otherwise.
    """
    if self.mode == 'serialization' and self._config.json_schema_serialization_defaults_required:
        return not field.get('serialization_exclude')
    else:
        if field['type'] == 'typed-dict-field':
            return field.get('required', total)
        else:
            return field['schema']['type'] != 'default'
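
The first branch above is driven by the json_schema_serialization_defaults_required model setting. A short, hedged illustration of its effect (the model is hypothetical):

from pydantic import BaseModel, ConfigDict

class Item(BaseModel):
    model_config = ConfigDict(json_schema_serialization_defaults_required=True)

    name: str = 'unnamed'

# With a default, the field is optional for validation inputs...
print(Item.model_json_schema(mode='validation').get('required'))
#> None
# ...but marked as required in the serialization schema.
print(Item.model_json_schema(mode='serialization').get('required'))
#> ['name']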

dataclass_args_schema

dataclass_args_schema(
    schema: DataclassArgsSchema,
) -> JsonSchemaValue

Generates a JSON schema that matches a schema that defines a dataclass's constructor arguments.

Parameters:

Name Type Description Default
schema DataclassArgsSchema

The core schema.

required

Returns:

Type Description
JsonSchemaValue

The generated JSON schema.

Source code in .venv/lib/python3.12/site-packages/pydantic/json_schema.py
def dataclass_args_schema(self, schema: core_schema.DataclassArgsSchema) -> JsonSchemaValue:
    """Generates a JSON schema that matches a schema that defines a dataclass's constructor arguments.

    Args:
        schema: The core schema.

    Returns:
        The generated JSON schema.
    """
    named_required_fields: list[tuple[str, bool, CoreSchemaField]] = [
        (field['name'], self.field_is_required(field, total=True), field)
        for field in schema['fields']
        if self.field_is_present(field)
    ]
    if self.mode == 'serialization':
        named_required_fields.extend(self._name_required_computed_fields(schema.get('computed_fields', [])))
    return self._named_required_fields_schema(named_required_fields)

dataclass_schema

dataclass_schema(
    schema: DataclassSchema,
) -> JsonSchemaValue

Generates a JSON schema that matches a schema that defines a dataclass.

Parameters:

Name Type Description Default
schema DataclassSchema

The core schema.

required

Returns:

Type Description
JsonSchemaValue

The generated JSON schema.

Source code in .venv/lib/python3.12/site-packages/pydantic/json_schema.py
def dataclass_schema(self, schema: core_schema.DataclassSchema) -> JsonSchemaValue:
    """Generates a JSON schema that matches a schema that defines a dataclass.

    Args:
        schema: The core schema.

    Returns:
        The generated JSON schema.
    """
    cls = schema['cls']
    config: ConfigDict = getattr(cls, '__pydantic_config__', cast('ConfigDict', {}))
    title = config.get('title') or cls.__name__

    with self._config_wrapper_stack.push(config):
        json_schema = self.generate_inner(schema['schema']).copy()

    json_schema_extra = config.get('json_schema_extra')
    json_schema = self._update_class_schema(json_schema, title, config.get('extra', None), cls, json_schema_extra)

    # Dataclass-specific handling of description
    if is_dataclass(cls) and not hasattr(cls, '__pydantic_validator__'):
        # vanilla dataclass; don't use cls.__doc__ as it will contain the class signature by default
        description = None
    else:
        description = None if cls.__doc__ is None else inspect.cleandoc(cls.__doc__)
    if description:
        json_schema['description'] = description

    return json_schema

arguments_schema

arguments_schema(
    schema: ArgumentsSchema,
) -> JsonSchemaValue

Generates a JSON schema that matches a schema that defines a function's arguments.

Parameters:

Name Type Description Default
schema ArgumentsSchema

The core schema.

required

Returns:

Type Description
JsonSchemaValue

The generated JSON schema.

Source code in .venv/lib/python3.12/site-packages/pydantic/json_schema.py
def arguments_schema(self, schema: core_schema.ArgumentsSchema) -> JsonSchemaValue:
    """Generates a JSON schema that matches a schema that defines a function's arguments.

    Args:
        schema: The core schema.

    Returns:
        The generated JSON schema.
    """
    metadata = _core_metadata.CoreMetadataHandler(schema).metadata
    prefer_positional = metadata.get('pydantic_js_prefer_positional_arguments')

    arguments = schema['arguments_schema']
    kw_only_arguments = [a for a in arguments if a.get('mode') == 'keyword_only']
    kw_or_p_arguments = [a for a in arguments if a.get('mode') in {'positional_or_keyword', None}]
    p_only_arguments = [a for a in arguments if a.get('mode') == 'positional_only']
    var_args_schema = schema.get('var_args_schema')
    var_kwargs_schema = schema.get('var_kwargs_schema')

    if prefer_positional:
        positional_possible = not kw_only_arguments and not var_kwargs_schema
        if positional_possible:
            return self.p_arguments_schema(p_only_arguments + kw_or_p_arguments, var_args_schema)

    keyword_possible = not p_only_arguments and not var_args_schema
    if keyword_possible:
        return self.kw_arguments_schema(kw_or_p_arguments + kw_only_arguments, var_kwargs_schema)

    if not prefer_positional:
        positional_possible = not kw_only_arguments and not var_kwargs_schema
        if positional_possible:
            return self.p_arguments_schema(p_only_arguments + kw_or_p_arguments, var_args_schema)

    raise PydanticInvalidForJsonSchema(
        'Unable to generate JSON schema for arguments validator with positional-only and keyword-only arguments'
    )

kw_arguments_schema

kw_arguments_schema(
    arguments: list[ArgumentsParameter],
    var_kwargs_schema: CoreSchema | None,
) -> JsonSchemaValue

Generates a JSON schema that matches a schema that defines a function's keyword arguments.

Parameters:

Name Type Description Default
arguments list[ArgumentsParameter]

The core schema.

required

Returns:

Type Description
JsonSchemaValue

The generated JSON schema.

Source code in .venv/lib/python3.12/site-packages/pydantic/json_schema.py
def kw_arguments_schema(
    self, arguments: list[core_schema.ArgumentsParameter], var_kwargs_schema: CoreSchema | None
) -> JsonSchemaValue:
    """Generates a JSON schema that matches a schema that defines a function's keyword arguments.

    Args:
        arguments: The core schema.

    Returns:
        The generated JSON schema.
    """
    properties: dict[str, JsonSchemaValue] = {}
    required: list[str] = []
    for argument in arguments:
        name = self.get_argument_name(argument)
        argument_schema = self.generate_inner(argument['schema']).copy()
        argument_schema['title'] = self.get_title_from_name(name)
        properties[name] = argument_schema

        if argument['schema']['type'] != 'default':
            # This assumes that if the argument has a default value,
            # the inner schema must be of type WithDefaultSchema.
            # I believe this is true, but I am not 100% sure
            required.append(name)

    json_schema: JsonSchemaValue = {'type': 'object', 'properties': properties}
    if required:
        json_schema['required'] = required

    if var_kwargs_schema:
        additional_properties_schema = self.generate_inner(var_kwargs_schema)
        if additional_properties_schema:
            json_schema['additionalProperties'] = additional_properties_schema
    else:
        json_schema['additionalProperties'] = False
    return json_schema

p_arguments_schema

p_arguments_schema(
    arguments: list[ArgumentsParameter],
    var_args_schema: CoreSchema | None,
) -> JsonSchemaValue

Generates a JSON schema that matches a schema that defines a function's positional arguments.

Parameters:

Name Type Description Default
arguments list[ArgumentsParameter]

The core schema.

required

Returns:

Type Description
JsonSchemaValue

The generated JSON schema.

Source code in .venv/lib/python3.12/site-packages/pydantic/json_schema.py
def p_arguments_schema(
    self, arguments: list[core_schema.ArgumentsParameter], var_args_schema: CoreSchema | None
) -> JsonSchemaValue:
    """Generates a JSON schema that matches a schema that defines a function's positional arguments.

    Args:
        arguments: The core schema.

    Returns:
        The generated JSON schema.
    """
    prefix_items: list[JsonSchemaValue] = []
    min_items = 0

    for argument in arguments:
        name = self.get_argument_name(argument)

        argument_schema = self.generate_inner(argument['schema']).copy()
        argument_schema['title'] = self.get_title_from_name(name)
        prefix_items.append(argument_schema)

        if argument['schema']['type'] != 'default':
            # This assumes that if the argument has a default value,
            # the inner schema must be of type WithDefaultSchema.
            # I believe this is true, but I am not 100% sure
            min_items += 1

    json_schema: JsonSchemaValue = {'type': 'array', 'prefixItems': prefix_items}
    if min_items:
        json_schema['minItems'] = min_items

    if var_args_schema:
        items_schema = self.generate_inner(var_args_schema)
        if items_schema:
            json_schema['items'] = items_schema
    else:
        json_schema['maxItems'] = len(prefix_items)

    return json_schema

get_argument_name

get_argument_name(argument: ArgumentsParameter) -> str

Retrieves the name of an argument.

Parameters:

Name Type Description Default
argument ArgumentsParameter

The core schema.

required

Returns:

Type Description
str

The name of the argument.

Source code in .venv/lib/python3.12/site-packages/pydantic/json_schema.py
def get_argument_name(self, argument: core_schema.ArgumentsParameter) -> str:
    """Retrieves the name of an argument.

    Args:
        argument: The core schema.

    Returns:
        The name of the argument.
    """
    name = argument['name']
    if self.by_alias:
        alias = argument.get('alias')
        if isinstance(alias, str):
            name = alias
        else:
            pass  # might want to do something else?
    return name

call_schema

call_schema(schema: CallSchema) -> JsonSchemaValue

Generates a JSON schema that matches a schema that defines a function call.

Parameters:

Name Type Description Default
schema CallSchema

The core schema.

required

Returns:

Type Description
JsonSchemaValue

The generated JSON schema.

Source code in .venv/lib/python3.12/site-packages/pydantic/json_schema.py
def call_schema(self, schema: core_schema.CallSchema) -> JsonSchemaValue:
    """Generates a JSON schema that matches a schema that defines a function call.

    Args:
        schema: The core schema.

    Returns:
        The generated JSON schema.
    """
    return self.generate_inner(schema['arguments_schema'])

custom_error_schema

custom_error_schema(
    schema: CustomErrorSchema,
) -> JsonSchemaValue

Generates a JSON schema that matches a schema that defines a custom error.

Parameters:

Name Type Description Default
schema CustomErrorSchema

The core schema.

required

Returns:

Type Description
JsonSchemaValue

The generated JSON schema.

Source code in .venv/lib/python3.12/site-packages/pydantic/json_schema.py
def custom_error_schema(self, schema: core_schema.CustomErrorSchema) -> JsonSchemaValue:
    """Generates a JSON schema that matches a schema that defines a custom error.

    Args:
        schema: The core schema.

    Returns:
        The generated JSON schema.
    """
    return self.generate_inner(schema['schema'])

json_schema

json_schema(schema: JsonSchema) -> JsonSchemaValue

Generates a JSON schema that matches a schema that defines a JSON object.

Parameters:

Name Type Description Default
schema JsonSchema

The core schema.

required

Returns:

Type Description
JsonSchemaValue

The generated JSON schema.

Source code in .venv/lib/python3.12/site-packages/pydantic/json_schema.py
def json_schema(self, schema: core_schema.JsonSchema) -> JsonSchemaValue:
    """Generates a JSON schema that matches a schema that defines a JSON object.

    Args:
        schema: The core schema.

    Returns:
        The generated JSON schema.
    """
    content_core_schema = schema.get('schema') or core_schema.any_schema()
    content_json_schema = self.generate_inner(content_core_schema)
    if self.mode == 'validation':
        return {'type': 'string', 'contentMediaType': 'application/json', 'contentSchema': content_json_schema}
    else:
        # self.mode == 'serialization'
        return content_json_schema
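
This is what makes a pydantic Json field appear as a JSON-encoded string in validation mode but as its content schema in serialization mode. A minimal, hedged example (the model is hypothetical):

from pydantic import BaseModel, Json

class Payload(BaseModel):
    data: Json[dict]

# Validation accepts a JSON string, so the schema describes a string
# with contentMediaType 'application/json'.
print(Payload.model_json_schema(mode='validation')['properties']['data'])
# Serialization outputs the decoded value, so the schema describes the content itself.
print(Payload.model_json_schema(mode='serialization')['properties']['data'])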

url_schema

url_schema(schema: UrlSchema) -> JsonSchemaValue

Generates a JSON schema that matches a schema that defines a URL.

Parameters:

Name Type Description Default
schema UrlSchema

The core schema.

required

Returns:

Type Description
JsonSchemaValue

The generated JSON schema.

Source code in .venv/lib/python3.12/site-packages/pydantic/json_schema.py
def url_schema(self, schema: core_schema.UrlSchema) -> JsonSchemaValue:
    """Generates a JSON schema that matches a schema that defines a URL.

    Args:
        schema: The core schema.

    Returns:
        The generated JSON schema.
    """
    json_schema = {'type': 'string', 'format': 'uri', 'minLength': 1}
    self.update_with_validations(json_schema, schema, self.ValidationsMapping.string)
    return json_schema

multi_host_url_schema

multi_host_url_schema(
    schema: MultiHostUrlSchema,
) -> JsonSchemaValue

Generates a JSON schema that matches a schema that defines a URL that can be used with multiple hosts.

Parameters:

Name Type Description Default
schema MultiHostUrlSchema

The core schema.

required

Returns:

Type Description
JsonSchemaValue

The generated JSON schema.

Source code in .venv/lib/python3.12/site-packages/pydantic/json_schema.py
def multi_host_url_schema(self, schema: core_schema.MultiHostUrlSchema) -> JsonSchemaValue:
    """Generates a JSON schema that matches a schema that defines a URL that can be used with multiple hosts.

    Args:
        schema: The core schema.

    Returns:
        The generated JSON schema.
    """
    # Note: 'multi-host-uri' is a custom/pydantic-specific format, not part of the JSON Schema spec
    json_schema = {'type': 'string', 'format': 'multi-host-uri', 'minLength': 1}
    self.update_with_validations(json_schema, schema, self.ValidationsMapping.string)
    return json_schema

uuid_schema

uuid_schema(schema: UuidSchema) -> JsonSchemaValue

Generates a JSON schema that matches a UUID.

Parameters:

Name Type Description Default
schema UuidSchema

The core schema.

required

Returns:

Type Description
JsonSchemaValue

The generated JSON schema.

Source code in .venv/lib/python3.12/site-packages/pydantic/json_schema.py
def uuid_schema(self, schema: core_schema.UuidSchema) -> JsonSchemaValue:
    """Generates a JSON schema that matches a UUID.

    Args:
        schema: The core schema.

    Returns:
        The generated JSON schema.
    """
    return {'type': 'string', 'format': 'uuid'}

definitions_schema

definitions_schema(
    schema: DefinitionsSchema,
) -> JsonSchemaValue

Generates a JSON schema that matches a schema that defines a JSON object with definitions.

Parameters:

Name Type Description Default
schema DefinitionsSchema

The core schema.

required

Returns:

Type Description
JsonSchemaValue

The generated JSON schema.

Source code in .venv/lib/python3.12/site-packages/pydantic/json_schema.py
def definitions_schema(self, schema: core_schema.DefinitionsSchema) -> JsonSchemaValue:
    """Generates a JSON schema that matches a schema that defines a JSON object with definitions.

    Args:
        schema: The core schema.

    Returns:
        The generated JSON schema.
    """
    for definition in schema['definitions']:
        try:
            self.generate_inner(definition)
        except PydanticInvalidForJsonSchema as e:
            core_ref: CoreRef = CoreRef(definition['ref'])  # type: ignore
            self._core_defs_invalid_for_json_schema[self.get_defs_ref((core_ref, self.mode))] = e
            continue
    return self.generate_inner(schema['schema'])

definition_ref_schema

definition_ref_schema(
    schema: DefinitionReferenceSchema,
) -> JsonSchemaValue

Generates a JSON schema that matches a schema that references a definition.

Parameters:

Name Type Description Default
schema DefinitionReferenceSchema

The core schema.

required

Returns:

Type Description
JsonSchemaValue

The generated JSON schema.

Source code in .venv/lib/python3.12/site-packages/pydantic/json_schema.py
def definition_ref_schema(self, schema: core_schema.DefinitionReferenceSchema) -> JsonSchemaValue:
    """Generates a JSON schema that matches a schema that references a definition.

    Args:
        schema: The core schema.

    Returns:
        The generated JSON schema.
    """
    core_ref = CoreRef(schema['schema_ref'])
    _, ref_json_schema = self.get_cache_defs_ref_schema(core_ref)
    return ref_json_schema

ser_schema

ser_schema(
    schema: SerSchema
    | IncExSeqSerSchema
    | IncExDictSerSchema,
) -> JsonSchemaValue | None

Generates a JSON schema that matches a schema that defines a serialized object.

Parameters:

Name Type Description Default
schema SerSchema | IncExSeqSerSchema | IncExDictSerSchema

The core schema.

required

Returns:

Type Description
JsonSchemaValue | None

The generated JSON schema.

Source code in .venv/lib/python3.12/site-packages/pydantic/json_schema.py
def ser_schema(
    self, schema: core_schema.SerSchema | core_schema.IncExSeqSerSchema | core_schema.IncExDictSerSchema
) -> JsonSchemaValue | None:
    """Generates a JSON schema that matches a schema that defines a serialized object.

    Args:
        schema: The core schema.

    Returns:
        The generated JSON schema.
    """
    schema_type = schema['type']
    if schema_type == 'function-plain' or schema_type == 'function-wrap':
        # PlainSerializerFunctionSerSchema or WrapSerializerFunctionSerSchema
        return_schema = schema.get('return_schema')
        if return_schema is not None:
            return self.generate_inner(return_schema)
    elif schema_type == 'format' or schema_type == 'to-string':
        # FormatSerSchema or ToStringSerSchema
        return self.str_schema(core_schema.str_schema())
    elif schema['type'] == 'model':
        # ModelSerSchema
        return self.generate_inner(schema['schema'])
    return None

get_title_from_name

get_title_from_name(name: str) -> str

Retrieves a title from a name.

Parameters:

Name Type Description Default
name str

The name to retrieve a title from.

required

Returns:

Type Description
str

The title.

Source code in .venv/lib/python3.12/site-packages/pydantic/json_schema.py
def get_title_from_name(self, name: str) -> str:
    """Retrieves a title from a name.

    Args:
        name: The name to retrieve a title from.

    Returns:
        The title.
    """
    return name.title().replace('_', ' ')

field_title_should_be_set

field_title_should_be_set(
    schema: CoreSchemaOrField,
) -> bool

Returns true if a field with the given schema should have a title set based on the field name.

Intuitively, we want this to return true for schemas that wouldn't otherwise provide their own title (e.g., int, float, str), and false for those that would (e.g., BaseModel subclasses).

Parameters:

Name Type Description Default
schema CoreSchemaOrField

The schema to check.

required

Returns:

Type Description
bool

True if the field should have a title set, False otherwise.

Source code in .venv/lib/python3.12/site-packages/pydantic/json_schema.py
def field_title_should_be_set(self, schema: CoreSchemaOrField) -> bool:
    """Returns true if a field with the given schema should have a title set based on the field name.

    Intuitively, we want this to return true for schemas that wouldn't otherwise provide their own title
    (e.g., int, float, str), and false for those that would (e.g., BaseModel subclasses).

    Args:
        schema: The schema to check.

    Returns:
        `True` if the field should have a title set, `False` otherwise.
    """
    if _core_utils.is_core_schema_field(schema):
        if schema['type'] == 'computed-field':
            field_schema = schema['return_schema']
        else:
            field_schema = schema['schema']
        return self.field_title_should_be_set(field_schema)

    elif _core_utils.is_core_schema(schema):
        if schema.get('ref'):  # things with refs, such as models and enums, should not have titles set
            return False
        if schema['type'] in {'default', 'nullable', 'definitions'}:
            return self.field_title_should_be_set(schema['schema'])  # type: ignore[typeddict-item]
        if _core_utils.is_function_with_inner_schema(schema):
            return self.field_title_should_be_set(schema['schema'])
        if schema['type'] == 'definition-ref':
            # Referenced schemas should not have titles set for the same reason
            # schemas with refs should not
            return False
        return True  # anything else should have title set

    else:
        raise PydanticInvalidForJsonSchema(f'Unexpected schema type: schema={schema}')  # pragma: no cover

normalize_name

normalize_name(name: str) -> str

Normalizes a name to be used as a key in a dictionary.

Parameters:

Name Type Description Default
name str

The name to normalize.

required

Returns:

Type Description
str

The normalized name.

Source code in .venv/lib/python3.12/site-packages/pydantic/json_schema.py
def normalize_name(self, name: str) -> str:
    """Normalizes a name to be used as a key in a dictionary.

    Args:
        name: The name to normalize.

    Returns:
        The normalized name.
    """
    return re.sub(r'[^a-zA-Z0-9.\-_]', '_', name).replace('.', '__')

get_defs_ref

get_defs_ref(core_mode_ref: CoreModeRef) -> DefsRef

Override this method to change the way that definitions keys are generated from a core reference.

Parameters:

Name Type Description Default
core_mode_ref CoreModeRef

The core reference.

required

Returns:

Type Description
DefsRef

The definitions key.

Source code in .venv/lib/python3.12/site-packages/pydantic/json_schema.py
def get_defs_ref(self, core_mode_ref: CoreModeRef) -> DefsRef:
    """Override this method to change the way that definitions keys are generated from a core reference.

    Args:
        core_mode_ref: The core reference.

    Returns:
        The definitions key.
    """
    # Split the core ref into "components"; generic origins and arguments are each separate components
    core_ref, mode = core_mode_ref
    components = re.split(r'([\][,])', core_ref)
    # Remove IDs from each component
    components = [x.rsplit(':', 1)[0] for x in components]
    core_ref_no_id = ''.join(components)
    # Remove everything before the last period from each "component"
    components = [re.sub(r'(?:[^.[\]]+\.)+((?:[^.[\]]+))', r'\1', x) for x in components]
    short_ref = ''.join(components)

    mode_title = _MODE_TITLE_MAPPING[mode]

    # It is important that the generated defs_ref values be such that at least one choice will not
    # be generated for any other core_ref. Currently, this should be the case because we include
    # the id of the source type in the core_ref
    name = DefsRef(self.normalize_name(short_ref))
    name_mode = DefsRef(self.normalize_name(short_ref) + f'-{mode_title}')
    module_qualname = DefsRef(self.normalize_name(core_ref_no_id))
    module_qualname_mode = DefsRef(f'{module_qualname}-{mode_title}')
    module_qualname_id = DefsRef(self.normalize_name(core_ref))
    occurrence_index = self._collision_index.get(module_qualname_id)
    if occurrence_index is None:
        self._collision_counter[module_qualname] += 1
        occurrence_index = self._collision_index[module_qualname_id] = self._collision_counter[module_qualname]

    module_qualname_occurrence = DefsRef(f'{module_qualname}__{occurrence_index}')
    module_qualname_occurrence_mode = DefsRef(f'{module_qualname_mode}__{occurrence_index}')

    self._prioritized_defsref_choices[module_qualname_occurrence_mode] = [
        name,
        name_mode,
        module_qualname,
        module_qualname_mode,
        module_qualname_occurrence,
        module_qualname_occurrence_mode,
    ]

    return module_qualname_occurrence_mode

get_cache_defs_ref_schema

get_cache_defs_ref_schema(
    core_ref: CoreRef,
) -> tuple[DefsRef, JsonSchemaValue]

This method wraps the get_defs_ref method with some cache-lookup/population logic, and returns both the produced defs_ref and the JSON schema that will refer to the right definition.

Parameters:

Name Type Description Default
core_ref CoreRef

The core reference to get the definitions reference for.

required

Returns:

Type Description
tuple[DefsRef, JsonSchemaValue]

A tuple of the definitions reference and the JSON schema that will refer to it.

Source code in .venv/lib/python3.12/site-packages/pydantic/json_schema.py
def get_cache_defs_ref_schema(self, core_ref: CoreRef) -> tuple[DefsRef, JsonSchemaValue]:
    """This method wraps the get_defs_ref method with some cache-lookup/population logic,
    and returns both the produced defs_ref and the JSON schema that will refer to the right definition.

    Args:
        core_ref: The core reference to get the definitions reference for.

    Returns:
        A tuple of the definitions reference and the JSON schema that will refer to it.
    """
    core_mode_ref = (core_ref, self.mode)
    maybe_defs_ref = self.core_to_defs_refs.get(core_mode_ref)
    if maybe_defs_ref is not None:
        json_ref = self.core_to_json_refs[core_mode_ref]
        return maybe_defs_ref, {'$ref': json_ref}

    defs_ref = self.get_defs_ref(core_mode_ref)

    # populate the ref translation mappings
    self.core_to_defs_refs[core_mode_ref] = defs_ref
    self.defs_to_core_refs[defs_ref] = core_mode_ref

    json_ref = JsonRef(self.ref_template.format(model=defs_ref))
    self.core_to_json_refs[core_mode_ref] = json_ref
    self.json_to_defs_refs[json_ref] = defs_ref
    ref_json_schema = {'$ref': json_ref}
    return defs_ref, ref_json_schema

handle_ref_overrides

handle_ref_overrides(
    json_schema: JsonSchemaValue,
) -> JsonSchemaValue

It is not valid for a schema with a top-level $ref to have sibling keys.

During our own schema generation, we treat sibling keys as overrides to the referenced schema, but this is not how the official JSON schema spec works.

Because of this, we first remove any sibling keys that are redundant with the referenced schema, then if any remain, we transform the schema from a top-level '$ref' to use allOf to move the $ref out of the top level. (See bottom of https://swagger.io/docs/specification/using-ref/ for a reference about this behavior)

Source code in .venv/lib/python3.12/site-packages/pydantic/json_schema.py
def handle_ref_overrides(self, json_schema: JsonSchemaValue) -> JsonSchemaValue:
    """It is not valid for a schema with a top-level $ref to have sibling keys.

    During our own schema generation, we treat sibling keys as overrides to the referenced schema,
    but this is not how the official JSON schema spec works.

    Because of this, we first remove any sibling keys that are redundant with the referenced schema, then if
    any remain, we transform the schema from a top-level '$ref' to use allOf to move the $ref out of the top level.
    (See bottom of https://swagger.io/docs/specification/using-ref/ for a reference about this behavior)
    """
    if '$ref' in json_schema:
        # prevent modifications to the input; this copy may be safe to drop if there is significant overhead
        json_schema = json_schema.copy()

        referenced_json_schema = self.get_schema_from_definitions(JsonRef(json_schema['$ref']))
        if referenced_json_schema is None:
            # This can happen when building schemas for models with not-yet-defined references.
            # It may be a good idea to do a recursive pass at the end of the generation to remove
            # any redundant override keys.
            if len(json_schema) > 1:
                # Make it an allOf to at least resolve the sibling keys issue
                json_schema = json_schema.copy()
                json_schema.setdefault('allOf', [])
                json_schema['allOf'].append({'$ref': json_schema['$ref']})
                del json_schema['$ref']

            return json_schema
        for k, v in list(json_schema.items()):
            if k == '$ref':
                continue
            if k in referenced_json_schema and referenced_json_schema[k] == v:
                del json_schema[k]  # redundant key
        if len(json_schema) > 1:
            # There is a remaining "override" key, so we need to move $ref out of the top level
            json_ref = JsonRef(json_schema['$ref'])
            del json_schema['$ref']
            assert 'allOf' not in json_schema  # this should never happen, but just in case
            json_schema['allOf'] = [{'$ref': json_ref}]

    return json_schema

encode_default

encode_default(dft: Any) -> Any

Encode a default value to a JSON-serializable value.

This is used to encode default values for fields in the generated JSON schema.

Parameters:

Name Type Description Default
dft Any

The default value to encode.

required

Returns:

Type Description
Any

The encoded default value.

Source code in .venv/lib/python3.12/site-packages/pydantic/json_schema.py
def encode_default(self, dft: Any) -> Any:
    """Encode a default value to a JSON-serializable value.

    This is used to encode default values for fields in the generated JSON schema.

    Args:
        dft: The default value to encode.

    Returns:
        The encoded default value.
    """
    config = self._config
    return pydantic_core.to_jsonable_python(
        dft,
        timedelta_mode=config.ser_json_timedelta,
        bytes_mode=config.ser_json_bytes,
    )

update_with_validations

update_with_validations(
    json_schema: JsonSchemaValue,
    core_schema: CoreSchema,
    mapping: dict[str, str],
) -> None

Update the json_schema with the corresponding validations specified in the core_schema, using the provided mapping to translate keys in core_schema to the appropriate keys for a JSON schema.

Parameters:

Name Type Description Default
json_schema JsonSchemaValue

The JSON schema to update.

required
core_schema CoreSchema

The core schema to get the validations from.

required
mapping dict[str, str]

A mapping from core_schema attribute names to the corresponding JSON schema attribute names.

required
Source code in .venv/lib/python3.12/site-packages/pydantic/json_schema.py
def update_with_validations(
    self, json_schema: JsonSchemaValue, core_schema: CoreSchema, mapping: dict[str, str]
) -> None:
    """Update the json_schema with the corresponding validations specified in the core_schema,
    using the provided mapping to translate keys in core_schema to the appropriate keys for a JSON schema.

    Args:
        json_schema: The JSON schema to update.
        core_schema: The core schema to get the validations from.
        mapping: A mapping from core_schema attribute names to the corresponding JSON schema attribute names.
    """
    for core_key, json_schema_key in mapping.items():
        if core_key in core_schema:
            json_schema[json_schema_key] = core_schema[core_key]

get_json_ref_counts

get_json_ref_counts(
    json_schema: JsonSchemaValue,
) -> dict[JsonRef, int]

Get all values corresponding to the key '$ref' anywhere in the json_schema.

Source code in .venv/lib/python3.12/site-packages/pydantic/json_schema.py
def get_json_ref_counts(self, json_schema: JsonSchemaValue) -> dict[JsonRef, int]:
    """Get all values corresponding to the key '$ref' anywhere in the json_schema."""
    json_refs: dict[JsonRef, int] = Counter()

    def _add_json_refs(schema: Any) -> None:
        if isinstance(schema, dict):
            if '$ref' in schema:
                json_ref = JsonRef(schema['$ref'])
                if not isinstance(json_ref, str):
                    return  # in this case, '$ref' might have been the name of a property
                already_visited = json_ref in json_refs
                json_refs[json_ref] += 1
                if already_visited:
                    return  # prevent recursion on a definition that was already visited
                defs_ref = self.json_to_defs_refs[json_ref]
                if defs_ref in self._core_defs_invalid_for_json_schema:
                    raise self._core_defs_invalid_for_json_schema[defs_ref]
                _add_json_refs(self.definitions[defs_ref])

            for v in schema.values():
                _add_json_refs(v)
        elif isinstance(schema, list):
            for v in schema:
                _add_json_refs(v)

    _add_json_refs(json_schema)
    return json_refs

emit_warning

emit_warning(
    kind: JsonSchemaWarningKind, detail: str
) -> None

This method simply emits PydanticJsonSchemaWarnings based on handling in the render_warning_message method.

Source code in .venv/lib/python3.12/site-packages/pydantic/json_schema.py
def emit_warning(self, kind: JsonSchemaWarningKind, detail: str) -> None:
    """This method simply emits PydanticJsonSchemaWarnings based on handling in the `warning_message` method."""
    message = self.render_warning_message(kind, detail)
    if message is not None:
        warnings.warn(message, PydanticJsonSchemaWarning)

render_warning_message

render_warning_message(
    kind: JsonSchemaWarningKind, detail: str
) -> str | None

This method is responsible for ignoring warnings as desired, and for formatting the warning messages.

You can override the value of ignored_warning_kinds in a subclass of GenerateJsonSchema to modify what warnings are generated. If you want more control, you can override this method; just return None in situations where you don't want warnings to be emitted.

Parameters:

Name Type Description Default
kind JsonSchemaWarningKind

The kind of warning to render. It can be one of the following:

  • 'skipped-choice': A choice field was skipped because it had no valid choices.
  • 'non-serializable-default': A default value was skipped because it was not JSON-serializable.
required
detail str

A string with additional details about the warning.

required

Returns:

Type Description
str | None

The formatted warning message, or None if no warning should be emitted.

Source code in .venv/lib/python3.12/site-packages/pydantic/json_schema.py
def render_warning_message(self, kind: JsonSchemaWarningKind, detail: str) -> str | None:
    """This method is responsible for ignoring warnings as desired, and for formatting the warning messages.

    You can override the value of `ignored_warning_kinds` in a subclass of GenerateJsonSchema
    to modify what warnings are generated. If you want more control, you can override this method;
    just return None in situations where you don't want warnings to be emitted.

    Args:
        kind: The kind of warning to render. It can be one of the following:

            - 'skipped-choice': A choice field was skipped because it had no valid choices.
            - 'non-serializable-default': A default value was skipped because it was not JSON-serializable.
        detail: A string with additional details about the warning.

    Returns:
        The formatted warning message, or `None` if no warning should be emitted.
    """
    if kind in self.ignored_warning_kinds:
        return None
    return f'{detail} [{kind}]'
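
As noted above, the emitted warnings are filtered through ignored_warning_kinds. A hedged sketch of a subclass that surfaces every warning kind (it can be passed, for instance, as schema_generator to model_json_schema):

from pydantic.json_schema import GenerateJsonSchema

class StrictWarningsGenerator(GenerateJsonSchema):
    # An empty set means no warning kind is ignored; by default,
    # 'skipped-choice' warnings are suppressed.
    ignored_warning_kinds = set()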

generate

generate(
    schema: CoreSchema,
    mode: JsonSchemaMode = "validation",
    source: JsonSchemaSource = "model",
) -> JsonSchemaDict

Generates a JSON schema for a specified core schema.

It generates a JSON schema for a specified core schema using the configured parameters. The schema is generated based on the specified mode and source type.

Parameters:

Name Type Description Default
schema CoreSchema

A pydantic-core model core schema.

required
mode JsonSchemaMode

The mode to use for generating the JSON Schema. It can be either validation or serialization where respectively the schema is generated for validating data or serializing data. Defaults to validation.

'validation'
source JsonSchemaSource

The source type to use for generating the resources JSON schema. It can be either key, model, or both where the latter accepts, when applicable, integer and string values for key identifiers in addition to the standard model schema generation. Defaults to model.

'model'

Returns:

Type Description
JsonSchemaDict

A generated JSON schema representing the given model core schema.

Raises:

Type Description
PydanticUserError

If the JSON schema generator has already been used to generate a JSON schema.

Source code in .venv/lib/python3.12/site-packages/plateforme/core/schema/json.py
def generate(
    self,
    schema: CoreSchema,
    mode: JsonSchemaMode = 'validation',
    source: JsonSchemaSource = 'model',
) -> JsonSchemaDict:
    """Generates a JSON schema for a specified core schema.

    It generates a JSON schema for a specified core schema using the
    configured parameters. The schema is generated based on the specified
    mode and source type.

    Args:
        schema: A ``pydantic-core`` model core schema.
        mode: The mode to use for generating the JSON Schema. It can be
            either ``validation`` or ``serialization`` where respectively
            the schema is generated for validating data or serializing
            data. Defaults to ``validation``.
        source: The source type to use for generating the resources JSON
            schema. It can be either ``key`` , ``model``, or ``both`` where
            the latter accepts, when applicable, integer and string values
            for key identifiers in addition to the standard model schema
            generation. Defaults to ``model``.

    Returns:
        A generated JSON schema representing the given model core schema.

    Raises:
        PydanticUserError: If the JSON schema generator has already been
            used to generate a JSON schema.
    """
    self._mode = mode
    self._source = source
    return super().generate(schema, mode)
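
A minimal, hedged usage sketch, assuming the class is importable from plateforme.core.schema.json as documented here and that the model's core schema is available as __pydantic_core_schema__; the key and both sources only matter for Plateforme resources with key identifiers:

from pydantic import BaseModel

from plateforme.core.schema.json import GenerateJsonSchema

class User(BaseModel):
    id: int
    name: str

generator = GenerateJsonSchema(by_alias=True)
schema = generator.generate(
    User.__pydantic_core_schema__,
    mode='serialization',
    source='model',
)
print(sorted(schema['properties']))
#> ['id', 'name']

Note that a generator instance can only be used once; build a new instance for each schema, otherwise a PydanticUserError is raised as described above.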

plateforme.core.schema.types

This module provides utilities for managing schema types within the Plateforme framework using Pydantic features.

PydanticConfigDict

Bases: TypedDict

A TypedDict for configuring Pydantic behaviour.

title instance-attribute

title: str | None

The title for the generated JSON schema. Defaults to the model's name.

str_to_lower instance-attribute

str_to_lower: bool

Whether to convert all characters to lowercase for str types. Defaults to False.

str_to_upper instance-attribute

str_to_upper: bool

Whether to convert all characters to uppercase for str types. Defaults to False.

str_strip_whitespace instance-attribute

str_strip_whitespace: bool

Whether to strip leading and trailing whitespace for str types.

str_min_length instance-attribute

str_min_length: int

The minimum length for str types. Defaults to None.

str_max_length instance-attribute

str_max_length: int | None

The maximum length for str types. Defaults to None.
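
For illustration, a brief, hedged example combining the string settings above (the model is hypothetical):

from pydantic import BaseModel, ConfigDict

class Tag(BaseModel):
    model_config = ConfigDict(
        str_strip_whitespace=True,
        str_to_lower=True,
        str_max_length=16,
    )

    label: str

print(Tag(label='  Python  '))
#> label='python'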

extra instance-attribute

extra: ExtraValues | None

Whether to ignore, allow, or forbid extra attributes during model initialization. Defaults to 'ignore'.

You can configure how pydantic handles the attributes that are not defined in the model:

  • allow - Allow any extra attributes.
  • forbid - Forbid any extra attributes.
  • ignore - Ignore any extra attributes.
from pydantic import BaseModel, ConfigDict


class User(BaseModel):
    model_config = ConfigDict(extra='ignore')  # (1)!

    name: str


user = User(name='John Doe', age=20)  # (2)!
print(user)
#> name='John Doe'
  1. This is the default behaviour.
  2. The age argument is ignored.

Instead, with extra='allow', the age argument is included:

from pydantic import BaseModel, ConfigDict


class User(BaseModel):
    model_config = ConfigDict(extra='allow')

    name: str


user = User(name='John Doe', age=20)  # (1)!
print(user)
#> name='John Doe' age=20
  1. The age argument is included.

With extra='forbid', an error is raised:

from pydantic import BaseModel, ConfigDict, ValidationError


class User(BaseModel):
    model_config = ConfigDict(extra='forbid')

    name: str


try:
    User(name='John Doe', age=20)
except ValidationError as e:
    print(e)
    '''
    1 validation error for User
    age
    Extra inputs are not permitted [type=extra_forbidden, input_value=20, input_type=int]
    '''

frozen instance-attribute

frozen: bool

Whether models are faux-immutable, i.e. whether __setattr__ is allowed, and also generates a __hash__() method for the model. This makes instances of the model potentially hashable if all the attributes are hashable. Defaults to False.

Note

On V1, the inverse of this setting was called allow_mutation, and was True by default.
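
A short, hedged example (the model is hypothetical; the exact error wording may differ between Pydantic versions):

from pydantic import BaseModel, ConfigDict, ValidationError

class Point(BaseModel):
    model_config = ConfigDict(frozen=True)

    x: int
    y: int

point = Point(x=1, y=2)
try:
    point.x = 3
except ValidationError as e:
    print(e)
    '''
    1 validation error for Point
    x
      Instance is frozen [type=frozen_instance, input_value=3, input_type=int]
    '''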

populate_by_name instance-attribute

populate_by_name: bool

Whether an aliased field may be populated by its name as given by the model attribute, as well as the alias. Defaults to False.

Note

The name of this configuration setting was changed in v2.0 from allow_population_by_field_name to populate_by_name.

from pydantic import BaseModel, ConfigDict, Field


class User(BaseModel):
    model_config = ConfigDict(populate_by_name=True)

    name: str = Field(alias='full_name')  # (1)!
    age: int


user = User(full_name='John Doe', age=20)  # (2)!
print(user)
#> name='John Doe' age=20
user = User(name='John Doe', age=20)  # (3)!
print(user)
#> name='John Doe' age=20
  1. The field 'name' has an alias 'full_name'.
  2. The model is populated by the alias 'full_name'.
  3. The model is populated by the field name 'name'.

use_enum_values instance-attribute

use_enum_values: bool

Whether to populate models with the value property of enums, rather than the raw enum. This may be useful if you want to serialize model.model_dump() later. Defaults to False.

Note

If you have an Optional[Enum] value that you set a default for, you need to use validate_default=True for said Field to ensure that the use_enum_values flag takes effect on the default, as extracting an enum's value occurs during validation, not serialization.

from enum import Enum
from typing import Optional

from pydantic import BaseModel, ConfigDict, Field


class SomeEnum(Enum):
    FOO = 'foo'
    BAR = 'bar'
    BAZ = 'baz'


class SomeModel(BaseModel):
    model_config = ConfigDict(use_enum_values=True)

    some_enum: SomeEnum
    another_enum: Optional[SomeEnum] = Field(default=SomeEnum.FOO, validate_default=True)


model1 = SomeModel(some_enum=SomeEnum.BAR)
print(model1.model_dump())
#> {'some_enum': 'bar', 'another_enum': 'foo'}

model2 = SomeModel(some_enum=SomeEnum.BAR, another_enum=SomeEnum.BAZ)
print(model2.model_dump())
#> {'some_enum': 'bar', 'another_enum': 'baz'}

validate_assignment instance-attribute

validate_assignment: bool

Whether to validate the data when the model is changed. Defaults to False.

The default behavior of Pydantic is to validate the data when the model is created.

In case the user changes the data after the model is created, the model is not revalidated.

from pydantic import BaseModel

class User(BaseModel):
    name: str

user = User(name='John Doe')  # (1)!
print(user)
#> name='John Doe'
user.name = 123  # (2)!
print(user)
#> name=123
  1. The validation happens only when the model is created.
  2. The validation does not happen when the data is changed.

In case you want to revalidate the model when the data is changed, you can use validate_assignment=True:

from pydantic import BaseModel, ValidationError

class User(BaseModel, validate_assignment=True):  # (1)!
    name: str

user = User(name='John Doe')  # (2)!
print(user)
#> name='John Doe'
try:
    user.name = 123  # (3)!
except ValidationError as e:
    print(e)
    '''
    1 validation error for User
    name
      Input should be a valid string [type=string_type, input_value=123, input_type=int]
    '''
  1. You can either use class keyword arguments, or model_config to set validate_assignment=True.
  2. The validation happens when the model is created.
  3. The validation also happens when the data is changed.

arbitrary_types_allowed instance-attribute

arbitrary_types_allowed: bool

Whether arbitrary types are allowed for field types. Defaults to False.

from pydantic import BaseModel, ConfigDict, ValidationError

# This is not a pydantic model, it's an arbitrary class
class Pet:
    def __init__(self, name: str):
        self.name = name

class Model(BaseModel):
    model_config = ConfigDict(arbitrary_types_allowed=True)

    pet: Pet
    owner: str

pet = Pet(name='Hedwig')
# A simple check of instance type is used to validate the data
model = Model(owner='Harry', pet=pet)
print(model)
#> pet=<__main__.Pet object at 0x0123456789ab> owner='Harry'
print(model.pet)
#> <__main__.Pet object at 0x0123456789ab>
print(model.pet.name)
#> Hedwig
print(type(model.pet))
#> <class '__main__.Pet'>
try:
    # If the value is not an instance of the type, it's invalid
    Model(owner='Harry', pet='Hedwig')
except ValidationError as e:
    print(e)
    '''
    1 validation error for Model
    pet
      Input should be an instance of Pet [type=is_instance_of, input_value='Hedwig', input_type=str]
    '''

# Nothing in the instance of the arbitrary type is checked
# Here name probably should have been a str, but it's not validated
pet2 = Pet(name=42)
model2 = Model(owner='Harry', pet=pet2)
print(model2)
#> pet=<__main__.Pet object at 0x0123456789ab> owner='Harry'
print(model2.pet)
#> <__main__.Pet object at 0x0123456789ab>
print(model2.pet.name)
#> 42
print(type(model2.pet))
#> <class '__main__.Pet'>

from_attributes instance-attribute

from_attributes: bool

Whether to build models and look up discriminators of tagged unions using python object attributes.

loc_by_alias instance-attribute

loc_by_alias: bool

Whether to use the actual key provided in the data (e.g. alias) for error locs rather than the field's name. Defaults to True.

alias_generator instance-attribute

alias_generator: (
    Callable[[str], str] | AliasGenerator | None
)

A callable that takes a field name and returns an alias for it or an instance of AliasGenerator. Defaults to None.

When using a callable, the alias generator is used for both validation and serialization. If you want to use different alias generators for validation and serialization, you can use AliasGenerator instead.

If data source field names do not match your code style (e.g. CamelCase fields), you can automatically generate aliases using alias_generator. Here's an example with a basic callable:

from pydantic import BaseModel, ConfigDict
from pydantic.alias_generators import to_pascal

class Voice(BaseModel):
    model_config = ConfigDict(alias_generator=to_pascal)

    name: str
    language_code: str

voice = Voice(Name='Filiz', LanguageCode='tr-TR')
print(voice.language_code)
#> tr-TR
print(voice.model_dump(by_alias=True))
#> {'Name': 'Filiz', 'LanguageCode': 'tr-TR'}

If you want to use different alias generators for validation and serialization, you can use AliasGenerator.

from pydantic import AliasGenerator, BaseModel, ConfigDict
from pydantic.alias_generators import to_camel, to_pascal

class Athlete(BaseModel):
    first_name: str
    last_name: str
    sport: str

    model_config = ConfigDict(
        alias_generator=AliasGenerator(
            validation_alias=to_camel,
            serialization_alias=to_pascal,
        )
    )

athlete = Athlete(firstName='John', lastName='Doe', sport='track')
print(athlete.model_dump(by_alias=True))
#> {'FirstName': 'John', 'LastName': 'Doe', 'Sport': 'track'}
Note

Pydantic offers three built-in alias generators: to_pascal, to_camel, and to_snake.

ignored_types instance-attribute

ignored_types: tuple[type, ...]

A tuple of types that may occur as values of class attributes without annotations. This is typically used for custom descriptors (classes that behave like property). If an attribute is set on a class without an annotation and has a type that is not in this tuple (or otherwise recognized by pydantic), an error will be raised. Defaults to ().
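
A hedged sketch of the typical use case, a property-like descriptor left unannotated on the model (the classes are hypothetical):

from pydantic import BaseModel, ConfigDict

class Answer:
    """A custom descriptor that behaves like a read-only property."""

    def __get__(self, obj, objtype=None):
        return 42

class Model(BaseModel):
    model_config = ConfigDict(ignored_types=(Answer,))

    answer = Answer()  # unannotated; would raise an error without ignored_types
    name: str

print(Model(name='x').answer)
#> 42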

allow_inf_nan instance-attribute

allow_inf_nan: bool

Whether to allow infinity (+inf and -inf) and NaN values for float fields. Defaults to True.
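
A brief, hedged example of rejecting non-finite floats (the model is hypothetical; the exact error wording may vary):

from pydantic import BaseModel, ConfigDict, ValidationError

class Measurement(BaseModel):
    model_config = ConfigDict(allow_inf_nan=False)

    value: float

try:
    Measurement(value=float('inf'))
except ValidationError as e:
    print(e)
    # Input should be a finite number [type=finite_number, ...]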

json_schema_extra instance-attribute

json_schema_extra: JsonDict | JsonSchemaExtraCallable | None

A dict or callable to provide extra JSON schema properties. Defaults to None.
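
A minimal, hedged example with a plain dict; the extra keys are merged into the model's generated schema (the model is hypothetical):

from pydantic import BaseModel, ConfigDict

class User(BaseModel):
    model_config = ConfigDict(
        json_schema_extra={'examples': [{'name': 'John Doe'}]}
    )

    name: str

print(User.model_json_schema()['examples'])
#> [{'name': 'John Doe'}]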

json_encoders instance-attribute

json_encoders: dict[type[object], JsonEncoder] | None

A dict of custom JSON encoders for specific types. Defaults to None.

Deprecated

This config option is a carryover from v1. We originally planned to remove it in v2 but didn't have a 1:1 replacement so we are keeping it for now. It is still deprecated and will likely be removed in the future.

strict instance-attribute

strict: bool

(new in V2) If True, strict validation is applied to all fields on the model.

By default, Pydantic attempts to coerce values to the correct type, when possible.

There are situations in which you may want to disable this behavior, and instead raise an error if a value's type does not match the field's type annotation.

To configure strict mode for all fields on a model, you can set strict=True on the model.

from pydantic import BaseModel, ConfigDict

class Model(BaseModel):
    model_config = ConfigDict(strict=True)

    name: str
    age: int

See Strict Mode for more details.

See the Conversion Table for more details on how Pydantic converts data in both strict and lax modes.

revalidate_instances instance-attribute

revalidate_instances: Literal[
    "always", "never", "subclass-instances"
]

When and how to revalidate models and dataclasses during validation. Accepts the string values of 'never', 'always' and 'subclass-instances'. Defaults to 'never'.

  • 'never' will not revalidate models and dataclasses during validation
  • 'always' will revalidate models and dataclasses during validation
  • 'subclass-instances' will revalidate models and dataclasses during validation if the instance is a subclass of the model or dataclass

By default, model and dataclass instances are not revalidated during validation.

from typing import List

from pydantic import BaseModel

class User(BaseModel, revalidate_instances='never'):  # (1)!
    hobbies: List[str]

class SubUser(User):
    sins: List[str]

class Transaction(BaseModel):
    user: User

my_user = User(hobbies=['reading'])
t = Transaction(user=my_user)
print(t)
#> user=User(hobbies=['reading'])

my_user.hobbies = [1]  # (2)!
t = Transaction(user=my_user)  # (3)!
print(t)
#> user=User(hobbies=[1])

my_sub_user = SubUser(hobbies=['scuba diving'], sins=['lying'])
t = Transaction(user=my_sub_user)
print(t)
#> user=SubUser(hobbies=['scuba diving'], sins=['lying'])
  1. revalidate_instances is set to 'never' by default.
  2. The assignment is not validated, unless you set validate_assignment to True in the model's config.
  3. Since revalidate_instances is set to never, this is not revalidated.

If you want to revalidate instances during validation, you can set revalidate_instances to 'always' in the model's config.

from typing import List

from pydantic import BaseModel, ValidationError

class User(BaseModel, revalidate_instances='always'):  # (1)!
    hobbies: List[str]

class SubUser(User):
    sins: List[str]

class Transaction(BaseModel):
    user: User

my_user = User(hobbies=['reading'])
t = Transaction(user=my_user)
print(t)
#> user=User(hobbies=['reading'])

my_user.hobbies = [1]
try:
    t = Transaction(user=my_user)  # (2)!
except ValidationError as e:
    print(e)
    '''
    1 validation error for Transaction
    user.hobbies.0
      Input should be a valid string [type=string_type, input_value=1, input_type=int]
    '''

my_sub_user = SubUser(hobbies=['scuba diving'], sins=['lying'])
t = Transaction(user=my_sub_user)
print(t)  # (3)!
#> user=User(hobbies=['scuba diving'])
  1. revalidate_instances is set to 'always'.
  2. The model is revalidated, since revalidate_instances is set to 'always'.
  3. Using 'never' we would have gotten user=SubUser(hobbies=['scuba diving'], sins=['lying']).

It's also possible to set revalidate_instances to 'subclass-instances' to only revalidate instances of subclasses of the model.

from typing import List

from pydantic import BaseModel

class User(BaseModel, revalidate_instances='subclass-instances'):  # (1)!
    hobbies: List[str]

class SubUser(User):
    sins: List[str]

class Transaction(BaseModel):
    user: User

my_user = User(hobbies=['reading'])
t = Transaction(user=my_user)
print(t)
#> user=User(hobbies=['reading'])

my_user.hobbies = [1]
t = Transaction(user=my_user)  # (2)!
print(t)
#> user=User(hobbies=[1])

my_sub_user = SubUser(hobbies=['scuba diving'], sins=['lying'])
t = Transaction(user=my_sub_user)
print(t)  # (3)!
#> user=User(hobbies=['scuba diving'])
  1. revalidate_instances is set to 'subclass-instances'.
  2. This is not revalidated, since my_user is an instance of User itself, not of a subclass.
  3. Using 'never' we would have gotten user=SubUser(hobbies=['scuba diving'], sins=['lying']).

ser_json_timedelta instance-attribute

ser_json_timedelta: Literal['iso8601', 'float']

The format of JSON serialized timedeltas. Accepts the string values of 'iso8601' and 'float'. Defaults to 'iso8601'.

  • 'iso8601' will serialize timedeltas to ISO 8601 durations.
  • 'float' will serialize timedeltas to the total number of seconds.
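For example, with the 'float' format a timedelta should be dumped as its total number of seconds (a minimal sketch):

from datetime import timedelta

from pydantic import BaseModel, ConfigDict

class Model(BaseModel):
    model_config = ConfigDict(ser_json_timedelta='float')

    delay: timedelta

print(Model(delay=timedelta(seconds=90)).model_dump_json())
#> {"delay":90.0}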

ser_json_bytes instance-attribute

ser_json_bytes: Literal['utf8', 'base64']

The encoding of JSON serialized bytes. Accepts the string values of 'utf8' and 'base64'. Defaults to 'utf8'.

  • 'utf8' will serialize bytes to UTF-8 strings.
  • 'base64' will serialize bytes to URL safe base64 strings.
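For example, with the 'base64' encoding a bytes field should be dumped as a base64 string (a minimal sketch; the expected output is shown as a comment):

from pydantic import BaseModel, ConfigDict

class Model(BaseModel):
    model_config = ConfigDict(ser_json_bytes='base64')

    data: bytes

print(Model(data=b'hello').model_dump_json())
#> {"data":"aGVsbG8="}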

ser_json_inf_nan instance-attribute

ser_json_inf_nan: Literal['null', 'constants']

The encoding of JSON serialized infinity and NaN float values. Accepts the string values of 'null' and 'constants'. Defaults to 'null'.

  • 'null' will serialize infinity and NaN values as null.
  • 'constants' will serialize infinity and NaN values as Infinity and NaN.
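For example, with the 'constants' encoding an infinite value should be dumped as the (non-standard JSON) Infinity constant (a minimal sketch):

from pydantic import BaseModel, ConfigDict

class Model(BaseModel):
    model_config = ConfigDict(ser_json_inf_nan='constants')

    value: float

print(Model(value=float('inf')).model_dump_json())
#> {"value":Infinity}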

validate_default instance-attribute

validate_default: bool

Whether to validate default values during validation. Defaults to False.
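For example, with validate_default enabled an invalid default should be caught at instantiation time (a minimal sketch; the exact error wording may differ between versions):

from pydantic import BaseModel, ConfigDict, ValidationError

class Model(BaseModel):
    model_config = ConfigDict(validate_default=True)

    value: int = 'not an int'

try:
    Model()
except ValidationError as e:
    print(e)
    '''
    1 validation error for Model
    value
      Input should be a valid integer, unable to parse string as an integer [type=int_parsing, input_value='not an int', input_type=str]
    '''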

validate_return instance-attribute

validate_return: bool

Whether to validate the return value from call validators. Defaults to False.

protected_namespaces instance-attribute

protected_namespaces: tuple[str, ...]

A tuple of string prefixes; models are prevented from having fields whose names start with any of them. Defaults to ('model_',).

Pydantic prevents collisions between model attributes and BaseModel's own methods by namespacing them with the prefix model_.

import warnings

from pydantic import BaseModel

warnings.filterwarnings('error')  # Raise warnings as errors

try:

    class Model(BaseModel):
        model_prefixed_field: str

except UserWarning as e:
    print(e)
    '''
    Field "model_prefixed_field" has conflict with protected namespace "model_".

    You may be able to resolve this warning by setting `model_config['protected_namespaces'] = ()`.
    '''

You can customize this behavior using the protected_namespaces setting:

import warnings

from pydantic import BaseModel, ConfigDict

warnings.filterwarnings('error')  # Raise warnings as errors

try:

    class Model(BaseModel):
        model_prefixed_field: str
        also_protect_field: str

        model_config = ConfigDict(
            protected_namespaces=('protect_me_', 'also_protect_')
        )

except UserWarning as e:
    print(e)
    '''
    Field "also_protect_field" has conflict with protected namespace "also_protect_".

    You may be able to resolve this warning by setting `model_config['protected_namespaces'] = ('protect_me_',)`.
    '''

While Pydantic will only emit a warning when an item is in a protected namespace but does not actually have a collision, an error is raised if there is an actual collision with an existing attribute:

from pydantic import BaseModel

try:

    class Model(BaseModel):
        model_validate: str

except NameError as e:
    print(e)
    '''
    Field "model_validate" conflicts with member <bound method BaseModel.model_validate of <class 'pydantic.main.BaseModel'>> of protected namespace "model_".
    '''

hide_input_in_errors instance-attribute

hide_input_in_errors: bool

Whether to hide inputs when printing errors. Defaults to False.

Pydantic shows the input value and type when it raises ValidationError during the validation.

from pydantic import BaseModel, ValidationError

class Model(BaseModel):
    a: str

try:
    Model(a=123)
except ValidationError as e:
    print(e)
    '''
    1 validation error for Model
    a
      Input should be a valid string [type=string_type, input_value=123, input_type=int]
    '''

You can hide the input value and type by setting the hide_input_in_errors config to True.

from pydantic import BaseModel, ConfigDict, ValidationError

class Model(BaseModel):
    a: str
    model_config = ConfigDict(hide_input_in_errors=True)

try:
    Model(a=123)
except ValidationError as e:
    print(e)
    '''
    1 validation error for Model
    a
      Input should be a valid string [type=string_type]
    '''

defer_build instance-attribute

defer_build: bool

Whether to defer model validator and serializer construction until the first model validation.

This can be useful to avoid the overhead of building models which are only used nested within other models, or when you want to manually define type namespace via Model.model_rebuild(_types_namespace=...). Defaults to False.
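A minimal sketch of the deferred behavior: class creation skips building the core validator and serializer, and they should be built transparently on first use:

from pydantic import BaseModel, ConfigDict

class Deferred(BaseModel):
    model_config = ConfigDict(defer_build=True)

    value: int

# The core validator is built lazily, on the first validation
print(Deferred(value=1))
#> value=1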

plugin_settings instance-attribute

plugin_settings: dict[str, object] | None

A dict of settings for plugins. Defaults to None.

See Pydantic Plugins for details.

schema_generator instance-attribute

schema_generator: type[GenerateSchema] | None

A custom core schema generator class to use when generating JSON schemas. Useful if you want to change the way types are validated across an entire model/schema. Defaults to None.

The GenerateSchema interface is subject to change, currently only the string_schema method is public.

See #6737 for details.

json_schema_serialization_defaults_required instance-attribute

json_schema_serialization_defaults_required: bool

Whether fields with default values should be marked as required in the serialization schema. Defaults to False.

This ensures that the serialization schema will reflect the fact that a field with a default will always be present when serializing the model, even though it is not required for validation.

However, there are scenarios where this may be undesirable — in particular, if you want to share the schema between validation and serialization, and don't mind fields with defaults being marked as not required during serialization. See #7209 for more details.

from pydantic import BaseModel, ConfigDict

class Model(BaseModel):
    a: str = 'a'

    model_config = ConfigDict(json_schema_serialization_defaults_required=True)

print(Model.model_json_schema(mode='validation'))
'''
{
    'properties': {'a': {'default': 'a', 'title': 'A', 'type': 'string'}},
    'title': 'Model',
    'type': 'object',
}
'''
print(Model.model_json_schema(mode='serialization'))
'''
{
    'properties': {'a': {'default': 'a', 'title': 'A', 'type': 'string'}},
    'required': ['a'],
    'title': 'Model',
    'type': 'object',
}
'''

json_schema_mode_override instance-attribute

json_schema_mode_override: Literal[
    "validation", "serialization", None
]

If not None, the specified mode will be used to generate the JSON schema regardless of what mode was passed to the function call. Defaults to None.

This provides a way to force the JSON schema generation to reflect a specific mode, e.g., to always use the validation schema.

It can be useful when using frameworks (such as FastAPI) that may generate different schemas for validation and serialization that must both be referenced from the same schema; when this happens, we automatically append -Input to the definition reference for the validation schema and -Output to the definition reference for the serialization schema. By specifying a json_schema_mode_override though, this prevents the conflict between the validation and serialization schemas (since both will use the specified schema), and so prevents the suffixes from being added to the definition references.

from pydantic import BaseModel, ConfigDict, Json

class Model(BaseModel):
    a: Json[int]  # requires a string to validate, but will dump an int

print(Model.model_json_schema(mode='serialization'))
'''
{
    'properties': {'a': {'title': 'A', 'type': 'integer'}},
    'required': ['a'],
    'title': 'Model',
    'type': 'object',
}
'''

class ForceInputModel(Model):
    # the following ensures that even with mode='serialization', we
    # will get the schema that would be generated for validation.
    model_config = ConfigDict(json_schema_mode_override='validation')

print(ForceInputModel.model_json_schema(mode='serialization'))
'''
{
    'properties': {
        'a': {
            'contentMediaType': 'application/json',
            'contentSchema': {'type': 'integer'},
            'title': 'A',
            'type': 'string',
        }
    },
    'required': ['a'],
    'title': 'ForceInputModel',
    'type': 'object',
}
'''

coerce_numbers_to_str instance-attribute

coerce_numbers_to_str: bool

If True, enables automatic coercion of any Number type to str in "lax" (non-strict) mode. Defaults to False.

Pydantic doesn't allow number types (int, float, Decimal) to be coerced as type str by default.

from decimal import Decimal

from pydantic import BaseModel, ConfigDict, ValidationError

class Model(BaseModel):
    value: str

try:
    print(Model(value=42))
except ValidationError as e:
    print(e)
    '''
    1 validation error for Model
    value
      Input should be a valid string [type=string_type, input_value=42, input_type=int]
    '''

class Model(BaseModel):
    model_config = ConfigDict(coerce_numbers_to_str=True)

    value: str

repr(Model(value=42).value)
#> "42"
repr(Model(value=42.13).value)
#> "42.13"
repr(Model(value=Decimal('42.13')).value)
#> "42.13"

regex_engine instance-attribute

regex_engine: Literal['rust-regex', 'python-re']

The regex engine to use for pattern validation. Defaults to 'rust-regex'.

  • rust-regex uses the regex Rust crate, which is non-backtracking and therefore more DDoS resistant, but does not support all regex features.
  • python-re uses the re module, which supports all regex features, but may be slower.

from pydantic import BaseModel, ConfigDict, Field, ValidationError

class Model(BaseModel):
    model_config = ConfigDict(regex_engine='python-re')

    value: str = Field(pattern=r'^abc(?=def)')

print(Model(value='abcdef').value)
#> abcdef

try:
    print(Model(value='abxyzcdef'))
except ValidationError as e:
    print(e)
    '''
    1 validation error for Model
    value
      String should match pattern '^abc(?=def)' [type=string_pattern_mismatch, input_value='abxyzcdef', input_type=str]
    '''

validation_error_cause instance-attribute

validation_error_cause: bool

If True, Python exceptions that were part of a validation failure will be attached as an exception group in the error's cause. Can be useful for debugging. Defaults to False.

Note

Python 3.10 and older don't support exception groups natively; on these versions, the backport must be installed: pip install exceptiongroup.

Note

The structure of validation errors is likely to change in future pydantic versions; Pydantic offers no guarantees about it. This option should be used for visual traceback debugging only.

PydanticIPvAnyAddress

Validate an IPv4 or IPv6 address.

from pydantic import BaseModel
from pydantic.networks import IPvAnyAddress

class IpModel(BaseModel):
    ip: IPvAnyAddress

print(IpModel(ip='127.0.0.1'))
#> ip=IPv4Address('127.0.0.1')

try:
    IpModel(ip='http://www.example.com')
except ValueError as e:
    print(e.errors())
    '''
    [
        {
            'type': 'ip_any_address',
            'loc': ('ip',),
            'msg': 'value is not a valid IPv4 or IPv6 address',
            'input': 'http://www.example.com',
        }
    ]
    '''

PydanticIPvAnyInterface

Validate an IPv4 or IPv6 interface.

PydanticIPvAnyNetwork

Validate an IPv4 or IPv6 network.
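A minimal sketch combining both types (the NetworkModel class is only illustrative):

from pydantic import BaseModel
from pydantic.networks import IPvAnyInterface, IPvAnyNetwork

class NetworkModel(BaseModel):
    interface: IPvAnyInterface
    network: IPvAnyNetwork

m = NetworkModel(interface='192.168.0.1/24', network='192.168.0.0/24')
print(m)
#> interface=IPv4Interface('192.168.0.1/24') network=IPv4Network('192.168.0.0/24')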

UrlConstraints dataclass

UrlConstraints(
    max_length: int | None = None,
    allowed_schemes: list[str] | None = None,
    host_required: bool | None = None,
    default_host: str | None = None,
    default_port: int | None = None,
    default_path: str | None = None,
)

Bases: PydanticMetadata

Url constraints.

Attributes:

Name Type Description
max_length int | None

The maximum length of the url. Defaults to None.

allowed_schemes list[str] | None

The allowed schemes. Defaults to None.

host_required bool | None

Whether the host is required. Defaults to None.

default_host str | None

The default host. Defaults to None.

default_port int | None

The default port. Defaults to None.

default_path str | None

The default path. Defaults to None.
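Url constraints are typically attached to a URL type via Annotated metadata. A minimal sketch (the HttpsUrl alias is hypothetical and only restricts the scheme and length):

from typing_extensions import Annotated

from pydantic import AnyUrl, TypeAdapter
from pydantic.networks import UrlConstraints

HttpsUrl = Annotated[
    AnyUrl, UrlConstraints(allowed_schemes=['https'], max_length=2083)
]

print(TypeAdapter(HttpsUrl).validate_python('https://example.com'))
#> https://example.com/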

TypeAdapter

TypeAdapter(
    type: type[T],
    *,
    config: ConfigDict | None = ...,
    _parent_depth: int = ...,
    module: str | None = ...,
)
TypeAdapter(
    type: T,
    *,
    config: ConfigDict | None = ...,
    _parent_depth: int = ...,
    module: str | None = ...,
)
TypeAdapter(
    type: type[T] | T,
    *,
    config: ConfigDict | None = None,
    _parent_depth: int = 2,
    module: str | None = None,
)

Bases: Generic[T]

Type adapters provide a flexible way to perform validation and serialization based on a Python type.

A TypeAdapter instance exposes some of the functionality from BaseModel instance methods for types that do not have such methods (such as dataclasses, primitive types, and more).

Note: TypeAdapter instances are not types, and cannot be used as type annotations for fields.
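A minimal usage sketch for an arbitrary type:

from pydantic import TypeAdapter

adapter = TypeAdapter(list[int])

print(adapter.validate_python(['1', 2]))
#> [1, 2]
print(adapter.dump_json([1, 2]))
#> b'[1,2]'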

Attributes:

Name Type Description
core_schema

The core schema for the type.

validator SchemaValidator

The schema validator for the type.

serializer

The schema serializer for the type.

Initializes the TypeAdapter object.

Parameters:

Name Type Description Default
type type[T] | T

The type associated with the TypeAdapter.

required
config ConfigDict | None

Configuration for the TypeAdapter, should be a dictionary conforming to ConfigDict.

None
_parent_depth int

depth at which to search the parent namespace to construct the local namespace.

2
module str | None

The module to pass to the plugin, if provided.

None

Note

You cannot use the config argument when instantiating a TypeAdapter if the type you're using has its own config that cannot be overridden (ex: BaseModel, TypedDict, and dataclass). A type-adapter-config-unused error will be raised in this case.

Note

The _parent_depth argument is named with an underscore to suggest its private nature and discourage use. It may be deprecated in a minor version, so we only recommend using it if you're comfortable with potential change in behavior / support.

Compatibility with mypy

Depending on the type used, mypy might raise an error when instantiating a TypeAdapter. As a workaround, you can explicitly annotate your variable:

from typing import Union

from pydantic import TypeAdapter

ta: TypeAdapter[Union[str, int]] = TypeAdapter(Union[str, int])  # type: ignore[arg-type]

Returns:

Type Description
None

A type adapter configured for the specified type.

Source code in .venv/lib/python3.12/site-packages/pydantic/type_adapter.py
def __init__(
    self,
    type: type[T] | T,
    *,
    config: ConfigDict | None = None,
    _parent_depth: int = 2,
    module: str | None = None,
) -> None:
    """Initializes the TypeAdapter object.

    Args:
        type: The type associated with the `TypeAdapter`.
        config: Configuration for the `TypeAdapter`, should be a dictionary conforming to [`ConfigDict`][pydantic.config.ConfigDict].
        _parent_depth: depth at which to search the parent namespace to construct the local namespace.
        module: The module that passes to plugin if provided.

    !!! note
        You cannot use the `config` argument when instantiating a `TypeAdapter` if the type you're using has its own
        config that cannot be overridden (ex: `BaseModel`, `TypedDict`, and `dataclass`). A
        [`type-adapter-config-unused`](../errors/usage_errors.md#type-adapter-config-unused) error will be raised in this case.

    !!! note
        The `_parent_depth` argument is named with an underscore to suggest its private nature and discourage use.
        It may be deprecated in a minor version, so we only recommend using it if you're
        comfortable with potential change in behavior / support.

    ??? tip "Compatibility with `mypy`"
        Depending on the type used, `mypy` might raise an error when instantiating a `TypeAdapter`. As a workaround, you can explicitly
        annotate your variable:

        ```py
        from typing import Union

        from pydantic import TypeAdapter

        ta: TypeAdapter[Union[str, int]] = TypeAdapter(Union[str, int])  # type: ignore[arg-type]
        ```

    Returns:
        A type adapter configured for the specified `type`.
    """
    type_is_annotated: bool = _typing_extra.is_annotated(type)
    annotated_type: Any = get_args(type)[0] if type_is_annotated else None
    type_has_config: bool = _type_has_config(annotated_type if type_is_annotated else type)

    if type_has_config and config is not None:
        raise PydanticUserError(
            'Cannot use `config` when the type is a BaseModel, dataclass or TypedDict.'
            ' These types can have their own config and setting the config via the `config`'
            ' parameter to TypeAdapter will not override it, thus the `config` you passed to'
            ' TypeAdapter becomes meaningless, which is probably not what you want.',
            code='type-adapter-config-unused',
        )

    config_wrapper = _config.ConfigWrapper(config)

    core_schema: CoreSchema
    try:
        core_schema = _getattr_no_parents(type, '__pydantic_core_schema__')
    except AttributeError:
        core_schema = _get_schema(type, config_wrapper, parent_depth=_parent_depth + 1)

    core_config = config_wrapper.core_config(None)
    validator: SchemaValidator
    try:
        validator = _getattr_no_parents(type, '__pydantic_validator__')
    except AttributeError:
        if module is None:
            f = sys._getframe(1)
            module = cast(str, f.f_globals.get('__name__', ''))
        validator = create_schema_validator(
            core_schema, type, module, str(type), 'TypeAdapter', core_config, config_wrapper.plugin_settings
        )  # type: ignore

    serializer: SchemaSerializer
    try:
        serializer = _getattr_no_parents(type, '__pydantic_serializer__')
    except AttributeError:
        serializer = SchemaSerializer(core_schema, core_config)

    self.core_schema = core_schema
    self.validator = validator
    self.serializer = serializer

validate_python

validate_python(
    __object: Any,
    *,
    strict: bool | None = None,
    from_attributes: bool | None = None,
    context: dict[str, Any] | None = None,
) -> T

Validate a Python object against the model.

Parameters:

Name Type Description Default
__object Any

The Python object to validate against the model.

required
strict bool | None

Whether to strictly check types.

None
from_attributes bool | None

Whether to extract data from object attributes.

None
context dict[str, Any] | None

Additional context to pass to the validator.

None

Note

When using TypeAdapter with a Pydantic dataclass, the use of the from_attributes argument is not supported.

Returns:

Type Description
T

The validated object.

Source code in .venv/lib/python3.12/site-packages/pydantic/type_adapter.py
def validate_python(
    self,
    __object: Any,
    *,
    strict: bool | None = None,
    from_attributes: bool | None = None,
    context: dict[str, Any] | None = None,
) -> T:
    """Validate a Python object against the model.

    Args:
        __object: The Python object to validate against the model.
        strict: Whether to strictly check types.
        from_attributes: Whether to extract data from object attributes.
        context: Additional context to pass to the validator.

    !!! note
        When using `TypeAdapter` with a Pydantic `dataclass`, the use of the `from_attributes`
        argument is not supported.

    Returns:
        The validated object.
    """
    return self.validator.validate_python(__object, strict=strict, from_attributes=from_attributes, context=context)

validate_json

validate_json(
    __data: str | bytes,
    *,
    strict: bool | None = None,
    context: dict[str, Any] | None = None,
) -> T

Usage docs: https://docs.pydantic.dev/2.6/concepts/json/#json-parsing

Validate a JSON string or bytes against the model.

Parameters:

Name Type Description Default
__data str | bytes

The JSON data to validate against the model.

required
strict bool | None

Whether to strictly check types.

None
context dict[str, Any] | None

Additional context to use during validation.

None

Returns:

Type Description
T

The validated object.

Source code in .venv/lib/python3.12/site-packages/pydantic/type_adapter.py
def validate_json(
    self, __data: str | bytes, *, strict: bool | None = None, context: dict[str, Any] | None = None
) -> T:
    """Usage docs: https://docs.pydantic.dev/2.6/concepts/json/#json-parsing

    Validate a JSON string or bytes against the model.

    Args:
        __data: The JSON data to validate against the model.
        strict: Whether to strictly check types.
        context: Additional context to use during validation.

    Returns:
        The validated object.
    """
    return self.validator.validate_json(__data, strict=strict, context=context)

validate_strings

validate_strings(
    __obj: Any,
    *,
    strict: bool | None = None,
    context: dict[str, Any] | None = None,
) -> T

Validate an object containing string data against the model.

Parameters:

Name Type Description Default
__obj Any

The object containing string data to validate.

required
strict bool | None

Whether to strictly check types.

None
context dict[str, Any] | None

Additional context to use during validation.

None

Returns:

Type Description
T

The validated object.

Source code in .venv/lib/python3.12/site-packages/pydantic/type_adapter.py
def validate_strings(self, __obj: Any, *, strict: bool | None = None, context: dict[str, Any] | None = None) -> T:
    """Validate object contains string data against the model.

    Args:
        __obj: The object contains string data to validate.
        strict: Whether to strictly check types.
        context: Additional context to use during validation.

    Returns:
        The validated object.
    """
    return self.validator.validate_strings(__obj, strict=strict, context=context)

get_default_value

get_default_value(
    *,
    strict: bool | None = None,
    context: dict[str, Any] | None = None,
) -> Some[T] | None

Get the default value for the wrapped type.

Parameters:

Name Type Description Default
strict bool | None

Whether to strictly check types.

None
context dict[str, Any] | None

Additional context to pass to the validator.

None

Returns:

Type Description
Some[T] | None

The default value wrapped in a Some if there is one or None if not.

Source code in .venv/lib/python3.12/site-packages/pydantic/type_adapter.py
def get_default_value(self, *, strict: bool | None = None, context: dict[str, Any] | None = None) -> Some[T] | None:
    """Get the default value for the wrapped type.

    Args:
        strict: Whether to strictly check types.
        context: Additional context to pass to the validator.

    Returns:
        The default value wrapped in a `Some` if there is one or None if not.
    """
    return self.validator.get_default_value(strict=strict, context=context)

dump_python

dump_python(
    __instance: T,
    *,
    mode: Literal["json", "python"] = "python",
    include: IncEx | None = None,
    exclude: IncEx | None = None,
    by_alias: bool = False,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
    round_trip: bool = False,
    warnings: bool = True,
) -> Any

Dump an instance of the adapted type to a Python object.

Parameters:

Name Type Description Default
__instance T

The Python object to serialize.

required
mode Literal['json', 'python']

The output format.

'python'
include IncEx | None

Fields to include in the output.

None
exclude IncEx | None

Fields to exclude from the output.

None
by_alias bool

Whether to use alias names for field names.

False
exclude_unset bool

Whether to exclude unset fields.

False
exclude_defaults bool

Whether to exclude fields with default values.

False
exclude_none bool

Whether to exclude fields with None values.

False
round_trip bool

Whether to output the serialized data in a way that is compatible with deserialization.

False
warnings bool

Whether to display serialization warnings.

True

Returns:

Type Description
Any

The serialized object.

Source code in .venv/lib/python3.12/site-packages/pydantic/type_adapter.py
def dump_python(
    self,
    __instance: T,
    *,
    mode: Literal['json', 'python'] = 'python',
    include: IncEx | None = None,
    exclude: IncEx | None = None,
    by_alias: bool = False,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
    round_trip: bool = False,
    warnings: bool = True,
) -> Any:
    """Dump an instance of the adapted type to a Python object.

    Args:
        __instance: The Python object to serialize.
        mode: The output format.
        include: Fields to include in the output.
        exclude: Fields to exclude from the output.
        by_alias: Whether to use alias names for field names.
        exclude_unset: Whether to exclude unset fields.
        exclude_defaults: Whether to exclude fields with default values.
        exclude_none: Whether to exclude fields with None values.
        round_trip: Whether to output the serialized data in a way that is compatible with deserialization.
        warnings: Whether to display serialization warnings.

    Returns:
        The serialized object.
    """
    return self.serializer.to_python(
        __instance,
        mode=mode,
        by_alias=by_alias,
        include=include,
        exclude=exclude,
        exclude_unset=exclude_unset,
        exclude_defaults=exclude_defaults,
        exclude_none=exclude_none,
        round_trip=round_trip,
        warnings=warnings,
    )

dump_json

dump_json(
    __instance: T,
    *,
    indent: int | None = None,
    include: IncEx | None = None,
    exclude: IncEx | None = None,
    by_alias: bool = False,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
    round_trip: bool = False,
    warnings: bool = True,
) -> bytes

Usage docs: https://docs.pydantic.dev/2.6/concepts/json/#json-serialization

Serialize an instance of the adapted type to JSON.

Parameters:

Name Type Description Default
__instance T

The instance to be serialized.

required
indent int | None

Number of spaces for JSON indentation.

None
include IncEx | None

Fields to include.

None
exclude IncEx | None

Fields to exclude.

None
by_alias bool

Whether to use alias names for field names.

False
exclude_unset bool

Whether to exclude unset fields.

False
exclude_defaults bool

Whether to exclude fields with default values.

False
exclude_none bool

Whether to exclude fields with a value of None.

False
round_trip bool

Whether to serialize and deserialize the instance to ensure round-tripping.

False
warnings bool

Whether to emit serialization warnings.

True

Returns:

Type Description
bytes

The JSON representation of the given instance as bytes.

Source code in .venv/lib/python3.12/site-packages/pydantic/type_adapter.py
def dump_json(
    self,
    __instance: T,
    *,
    indent: int | None = None,
    include: IncEx | None = None,
    exclude: IncEx | None = None,
    by_alias: bool = False,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
    round_trip: bool = False,
    warnings: bool = True,
) -> bytes:
    """Usage docs: https://docs.pydantic.dev/2.6/concepts/json/#json-serialization

    Serialize an instance of the adapted type to JSON.

    Args:
        __instance: The instance to be serialized.
        indent: Number of spaces for JSON indentation.
        include: Fields to include.
        exclude: Fields to exclude.
        by_alias: Whether to use alias names for field names.
        exclude_unset: Whether to exclude unset fields.
        exclude_defaults: Whether to exclude fields with default values.
        exclude_none: Whether to exclude fields with a value of `None`.
        round_trip: Whether to serialize and deserialize the instance to ensure round-tripping.
        warnings: Whether to emit serialization warnings.

    Returns:
        The JSON representation of the given instance as bytes.
    """
    return self.serializer.to_json(
        __instance,
        indent=indent,
        include=include,
        exclude=exclude,
        by_alias=by_alias,
        exclude_unset=exclude_unset,
        exclude_defaults=exclude_defaults,
        exclude_none=exclude_none,
        round_trip=round_trip,
        warnings=warnings,
    )

json_schema

json_schema(
    *,
    by_alias: bool = True,
    ref_template: str = DEFAULT_REF_TEMPLATE,
    schema_generator: type[
        GenerateJsonSchema
    ] = GenerateJsonSchema,
    mode: JsonSchemaMode = "validation",
) -> dict[str, Any]

Generate a JSON schema for the adapted type.

Parameters:

Name Type Description Default
by_alias bool

Whether to use alias names for field names.

True
ref_template str

The format string used for generating $ref strings.

DEFAULT_REF_TEMPLATE
schema_generator type[GenerateJsonSchema]

The generator class used for creating the schema.

GenerateJsonSchema
mode JsonSchemaMode

The mode to use for schema generation.

'validation'

Returns:

Type Description
dict[str, Any]

The JSON schema for the model as a dictionary.

Source code in .venv/lib/python3.12/site-packages/pydantic/type_adapter.py
def json_schema(
    self,
    *,
    by_alias: bool = True,
    ref_template: str = DEFAULT_REF_TEMPLATE,
    schema_generator: type[GenerateJsonSchema] = GenerateJsonSchema,
    mode: JsonSchemaMode = 'validation',
) -> dict[str, Any]:
    """Generate a JSON schema for the adapted type.

    Args:
        by_alias: Whether to use alias names for field names.
        ref_template: The format string used for generating $ref strings.
        schema_generator: The generator class used for creating the schema.
        mode: The mode to use for schema generation.

    Returns:
        The JSON schema for the model as a dictionary.
    """
    schema_generator_instance = schema_generator(by_alias=by_alias, ref_template=ref_template)
    return schema_generator_instance.generate(self.core_schema, mode=mode)

json_schemas staticmethod

json_schemas(
    __inputs: Iterable[
        tuple[
            JsonSchemaKeyT, JsonSchemaMode, TypeAdapter[Any]
        ]
    ],
    *,
    by_alias: bool = True,
    title: str | None = None,
    description: str | None = None,
    ref_template: str = DEFAULT_REF_TEMPLATE,
    schema_generator: type[
        GenerateJsonSchema
    ] = GenerateJsonSchema,
) -> tuple[
    dict[
        tuple[JsonSchemaKeyT, JsonSchemaMode],
        JsonSchemaValue,
    ],
    JsonSchemaValue,
]

Generate a JSON schema including definitions from multiple type adapters.

Parameters:

Name Type Description Default
__inputs Iterable[tuple[JsonSchemaKeyT, JsonSchemaMode, TypeAdapter[Any]]]

Inputs to schema generation. The first two items will form the keys of the (first) output mapping; the type adapters will provide the core schemas that get converted into definitions in the output JSON schema.

required
by_alias bool

Whether to use alias names.

True
title str | None

The title for the schema.

None
description str | None

The description for the schema.

None
ref_template str

The format string used for generating $ref strings.

DEFAULT_REF_TEMPLATE
schema_generator type[GenerateJsonSchema]

The generator class used for creating the schema.

GenerateJsonSchema

Returns:

Type Description
tuple[dict[tuple[JsonSchemaKeyT, JsonSchemaMode], JsonSchemaValue], JsonSchemaValue]

A tuple where:

  • The first element is a dictionary whose keys are tuples of JSON schema key type and JSON mode, and whose values are the JSON schema corresponding to that pair of inputs. (These schemas may have JsonRef references to definitions that are defined in the second returned element.)
  • The second element is a JSON schema containing all definitions referenced in the first returned element, along with the optional title and description keys.
Source code in .venv/lib/python3.12/site-packages/pydantic/type_adapter.py
@staticmethod
def json_schemas(
    __inputs: Iterable[tuple[JsonSchemaKeyT, JsonSchemaMode, TypeAdapter[Any]]],
    *,
    by_alias: bool = True,
    title: str | None = None,
    description: str | None = None,
    ref_template: str = DEFAULT_REF_TEMPLATE,
    schema_generator: type[GenerateJsonSchema] = GenerateJsonSchema,
) -> tuple[dict[tuple[JsonSchemaKeyT, JsonSchemaMode], JsonSchemaValue], JsonSchemaValue]:
    """Generate a JSON schema including definitions from multiple type adapters.

    Args:
        __inputs: Inputs to schema generation. The first two items will form the keys of the (first)
            output mapping; the type adapters will provide the core schemas that get converted into
            definitions in the output JSON schema.
        by_alias: Whether to use alias names.
        title: The title for the schema.
        description: The description for the schema.
        ref_template: The format string used for generating $ref strings.
        schema_generator: The generator class used for creating the schema.

    Returns:
        A tuple where:

            - The first element is a dictionary whose keys are tuples of JSON schema key type and JSON mode, and
                whose values are the JSON schema corresponding to that pair of inputs. (These schemas may have
                JsonRef references to definitions that are defined in the second returned element.)
            - The second element is a JSON schema containing all definitions referenced in the first returned
                element, along with the optional title and description keys.

    """
    schema_generator_instance = schema_generator(by_alias=by_alias, ref_template=ref_template)

    inputs = [(key, mode, adapter.core_schema) for key, mode, adapter in __inputs]

    json_schemas_map, definitions = schema_generator_instance.generate_definitions(inputs)

    json_schema: dict[str, Any] = {}
    if definitions:
        json_schema['$defs'] = definitions
    if title:
        json_schema['title'] = title
    if description:
        json_schema['description'] = description

    return json_schemas_map, json_schema
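A minimal sketch collecting two adapters into a single schema with shared definitions (the Cat and Dog models are only illustrative):

from pydantic import BaseModel, TypeAdapter

class Cat(BaseModel):
    name: str

class Dog(BaseModel):
    name: str

schemas_map, defs = TypeAdapter.json_schemas(
    [
        ('cat', 'validation', TypeAdapter(Cat)),
        ('dog', 'validation', TypeAdapter(Dog)),
    ],
    title='Pets',
)
print(sorted(schemas_map))
#> [('cat', 'validation'), ('dog', 'validation')]
print(sorted(defs['$defs']))
#> ['Cat', 'Dog']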

AllowInfNan dataclass

AllowInfNan(allow_inf_nan: bool = True)

Bases: PydanticMetadata

A field metadata class to indicate that a field should allow -inf, inf, and nan.
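A minimal sketch (the FiniteFloat alias is hypothetical; pydantic also ships an equivalent built-in type):

from typing_extensions import Annotated

from pydantic import TypeAdapter
from pydantic.types import AllowInfNan

FiniteFloat = Annotated[float, AllowInfNan(False)]

print(TypeAdapter(FiniteFloat).validate_python(1.5))
#> 1.5
# Passing float('inf') or float('nan') would raise a ValidationError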

PydanticAwareDateTime

A datetime that requires timezone info.

Discriminator dataclass

Discriminator(
    discriminator: str | Callable[[Any], Hashable],
    custom_error_type: str | None = None,
    custom_error_message: str | None = None,
    custom_error_context: dict[str, int | str | float]
    | None = None,
)

Usage docs: https://docs.pydantic.dev/2.6/concepts/unions/#discriminated-unions-with-callable-discriminator

Provides a way to use a custom callable as the way to extract the value of a union discriminator.

This allows you to get validation behavior like you'd get from Field(discriminator=<field_name>), but without needing to have a single shared field across all the union choices. This also makes it possible to handle unions of models and primitive types with discriminated-union-style validation errors. Finally, this allows you to use a custom callable as the way to identify which member of a union a value belongs to, while still seeing all the performance benefits of a discriminated union.

Consider this example, which is much more performant with the use of Discriminator and thus a TaggedUnion than it would be as a normal Union.

from typing import Any, Union

from typing_extensions import Annotated, Literal

from pydantic import BaseModel, Discriminator, Tag

class Pie(BaseModel):
    time_to_cook: int
    num_ingredients: int

class ApplePie(Pie):
    fruit: Literal['apple'] = 'apple'

class PumpkinPie(Pie):
    filling: Literal['pumpkin'] = 'pumpkin'

def get_discriminator_value(v: Any) -> str:
    if isinstance(v, dict):
        return v.get('fruit', v.get('filling'))
    return getattr(v, 'fruit', getattr(v, 'filling', None))

class ThanksgivingDinner(BaseModel):
    dessert: Annotated[
        Union[
            Annotated[ApplePie, Tag('apple')],
            Annotated[PumpkinPie, Tag('pumpkin')],
        ],
        Discriminator(get_discriminator_value),
    ]

apple_variation = ThanksgivingDinner.model_validate(
    {'dessert': {'fruit': 'apple', 'time_to_cook': 60, 'num_ingredients': 8}}
)
print(repr(apple_variation))
'''
ThanksgivingDinner(dessert=ApplePie(time_to_cook=60, num_ingredients=8, fruit='apple'))
'''

pumpkin_variation = ThanksgivingDinner.model_validate(
    {
        'dessert': {
            'filling': 'pumpkin',
            'time_to_cook': 40,
            'num_ingredients': 6,
        }
    }
)
print(repr(pumpkin_variation))
'''
ThanksgivingDinner(dessert=PumpkinPie(time_to_cook=40, num_ingredients=6, filling='pumpkin'))
'''

See the Discriminated Unions concepts docs for more details on how to use Discriminators.

discriminator instance-attribute

discriminator: str | Callable[[Any], Hashable]

The callable or field name for discriminating the type in a tagged union.

A Callable discriminator must extract the value of the discriminator from the input. A str discriminator must be the name of a field to discriminate against.

custom_error_type class-attribute instance-attribute

custom_error_type: str | None = None

Type to use in custom errors replacing the standard discriminated union validation errors.

custom_error_message class-attribute instance-attribute

custom_error_message: str | None = None

Message to use in custom errors.

custom_error_context class-attribute instance-attribute

custom_error_context: (
    dict[str, int | str | float] | None
) = None

Context to use in custom errors.

PydanticFutureDate

A date in the future.

PydanticFutureDateTime

A datetime that must be in the future.

PydanticNaiveDateTime

A datetime that doesn't require timezone info.

PydanticPastDate

A date in the past.

PydanticPastDateTime

A datetime that must be in the past.

Strict dataclass

Strict(strict: bool = True)

Bases: PydanticMetadata, BaseMetadata

Usage docs: https://docs.pydantic.dev/2.6/concepts/strict_mode/#strict-mode-with-annotated-strict

A field metadata class to indicate that a field should be validated in strict mode.

Attributes:

Name Type Description
strict bool

Whether to validate the field in strict mode.

Example
from typing_extensions import Annotated

from pydantic.types import Strict

StrictBool = Annotated[bool, Strict()]

Tag dataclass

Tag(tag: str)

Provides a way to specify the expected tag to use for a case of a (callable) discriminated union.

Also provides a way to label a union case in error messages.

When using a callable Discriminator, attach a Tag to each case in the Union to specify the tag that should be used to identify that case. In the example below, the Tag is used to specify that if get_discriminator_value returns 'apple', the input should be validated as an ApplePie, and if it returns 'pumpkin', the input should be validated as a PumpkinPie.

The primary role of the Tag here is to map the return value from the callable Discriminator function to the appropriate member of the Union in question.

from typing import Any, Union

from typing_extensions import Annotated, Literal

from pydantic import BaseModel, Discriminator, Tag

class Pie(BaseModel):
    time_to_cook: int
    num_ingredients: int

class ApplePie(Pie):
    fruit: Literal['apple'] = 'apple'

class PumpkinPie(Pie):
    filling: Literal['pumpkin'] = 'pumpkin'

def get_discriminator_value(v: Any) -> str:
    if isinstance(v, dict):
        return v.get('fruit', v.get('filling'))
    return getattr(v, 'fruit', getattr(v, 'filling', None))

class ThanksgivingDinner(BaseModel):
    dessert: Annotated[
        Union[
            Annotated[ApplePie, Tag('apple')],
            Annotated[PumpkinPie, Tag('pumpkin')],
        ],
        Discriminator(get_discriminator_value),
    ]

apple_variation = ThanksgivingDinner.model_validate(
    {'dessert': {'fruit': 'apple', 'time_to_cook': 60, 'num_ingredients': 8}}
)
print(repr(apple_variation))
'''
ThanksgivingDinner(dessert=ApplePie(time_to_cook=60, num_ingredients=8, fruit='apple'))
'''

pumpkin_variation = ThanksgivingDinner.model_validate(
    {
        'dessert': {
            'filling': 'pumpkin',
            'time_to_cook': 40,
            'num_ingredients': 6,
        }
    }
)
print(repr(pumpkin_variation))
'''
ThanksgivingDinner(dessert=PumpkinPie(time_to_cook=40, num_ingredients=6, filling='pumpkin'))
'''

Note

You must specify a Tag for every case in a Union that is associated with a callable Discriminator. Failing to do so will result in a PydanticUserError with code callable-discriminator-no-tag.

See the Discriminated Unions concepts docs for more details on how to use Tags.

UuidVersion dataclass

UuidVersion(uuid_version: Literal[1, 3, 4, 5])

A field metadata class to indicate a UUID version.
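A minimal sketch (the Uuid4 alias is hypothetical; pydantic also ships an equivalent UUID4 type):

from uuid import UUID, uuid4

from typing_extensions import Annotated

from pydantic import TypeAdapter
from pydantic.types import UuidVersion

Uuid4 = Annotated[UUID, UuidVersion(4)]

value = TypeAdapter(Uuid4).validate_python(str(uuid4()))
print(isinstance(value, UUID))
#> True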

PydanticMultiHostUrl

fragment property

fragment: str | None

path property

path: str | None

query property

query: str | None

scheme property

scheme: str

build builtin

build(
    *,
    scheme: str,
    hosts: Optional[list[MultiHostHost]] = None,
    path: Optional[str] = None,
    query: Optional[str] = None,
    fragment: Optional[str] = None,
    host: Optional[str] = None,
    username: Optional[str] = None,
    password: Optional[str] = None,
    port: Optional[int] = None,
) -> Self

Build a new MultiHostUrl instance from its component parts.

This method takes either hosts - a list of MultiHostHost typed dicts, or the individual components username, password, host and port.

Parameters:

Name Type Description Default
scheme str

The scheme part of the URL.

required
hosts Optional[list[MultiHostHost]]

Multiple hosts to build the URL from.

None
username Optional[str]

The username part of the URL.

None
password Optional[str]

The password part of the URL.

None
host Optional[str]

The host part of the URL.

None
port Optional[int]

The port part of the URL.

None
path Optional[str]

The path part of the URL.

None
query Optional[str]

The query part of the URL, or omit for no query.

None
fragment Optional[str]

The fragment part of the URL, or omit for no fragment.

None

Returns:

Type Description
Self

An instance of MultiHostUrl

hosts method descriptor

hosts() -> list[MultiHostHost]

The hosts of the MultiHostUrl as MultiHostHost typed dicts.

from pydantic_core import MultiHostUrl

mhu = MultiHostUrl('https://foo.com:123,foo:bar@bar.com/path')
print(mhu.hosts())
"""
[
    {'username': None, 'password': None, 'host': 'foo.com', 'port': 123},
    {'username': 'foo', 'password': 'bar', 'host': 'bar.com', 'port': 443}
]
"""
Returns: A list of dicts, each representing a host.

query_params method descriptor

query_params() -> list[tuple[str, str]]

The query part of the URL as a list of key-value pairs.

e.g. [('foo', 'bar')] in https://foo.com,bar.com/path?query#fragment

unicode_string method descriptor

unicode_string() -> str

The URL as a unicode string, unlike __str__() this will not punycode encode the hosts.

PydanticUrl

fragment property

fragment: str | None

host property

host: str | None

password property

password: str | None

path property

path: str | None

port property

port: int | None

query property

query: str | None

scheme property

scheme: str

username property

username: str | None

build builtin

build(
    *,
    scheme: str,
    host: str,
    username: Optional[str] = None,
    password: Optional[str] = None,
    port: Optional[int] = None,
    path: Optional[str] = None,
    query: Optional[str] = None,
    fragment: Optional[str] = None,
) -> Self

Build a new Url instance from its component parts.

Parameters:

Name Type Description Default
scheme str

The scheme part of the URL.

required
username Optional[str]

The username part of the URL, or omit for no username.

None
password Optional[str]

The password part of the URL, or omit for no password.

None
host str

The host part of the URL.

required
port Optional[int]

The port part of the URL, or omit for no port.

None
path Optional[str]

The path part of the URL, or omit for no path.

None
query Optional[str]

The query part of the URL, or omit for no query.

None
fragment Optional[str]

The fragment part of the URL, or omit for no fragment.

None

Returns:

Type Description
Self

An instance of URL

query_params method descriptor

query_params() -> list[tuple[str, str]]

The query part of the URL as a list of key-value pairs.

e.g. [('foo', 'bar')] in https://user:pass@host:port/path?foo=bar#fragment

unicode_host method descriptor

unicode_host() -> str | None

The host part of the URL as a unicode string, or None.

e.g. host in https://user:pass@host:port/path?query#fragment

If the URL must be punycode encoded, this is the decoded host, e.g. if the input URL is https://£££.com, unicode_host() will be £££.com.

unicode_string method descriptor

unicode_string() -> str

The URL as a unicode string, unlike __str__() this will not punycode encode the host.

If the URL must be punycode encoded, this is the decoded string, e.g. if the input URL is https://£££.com, unicode_string() will be https://£££.com.

Schema dataclass

Schema(model: str)

Representation of the model schema argument for annotations.

OneOrMany

Bases: list[_T], Generic[_T]

A class for representing a single value or a sequence of values.

validate classmethod

validate(
    obj: _T | list[_T] | set[_T] | tuple[_T],
) -> OneOrMany[_T]

Validate the one or many given object.

Parameters:

Name Type Description Default
obj _T | list[_T] | set[_T] | tuple[_T]

The input object to handle either as a single entry or a sequence of entries. If the object is a list, set, or tuple, it will be used as is. Otherwise, it will be wrapped in a list.

required

Returns:

Type Description
OneOrMany[_T]

The validated one or many object.

Source code in .venv/lib/python3.12/site-packages/plateforme/core/schema/types.py
@classmethod
def validate(
    cls,
    obj: _T | list[_T] | set[_T] | tuple[_T],
) -> 'OneOrMany[_T]':
    """Validate the one or many given object.

    Args:
        obj: The input object to handle either as a single entry or a
            sequence of entries. If the object is a list, set, or tuple, it
            will be used as is. Otherwise, it will be wrapped in a list.

    Returns:
        The validated one or many object.
    """
    if isinstance(obj, (list, set, tuple)):
        return cls(obj)
    else:
        return cls([obj])
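A minimal usage sketch, assuming the class is imported from the module path shown above:

from plateforme.core.schema.types import OneOrMany

print(OneOrMany.validate('a'))
#> ['a']
print(OneOrMany.validate(['a', 'b']))
#> ['a', 'b']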

TypeAdapterList

TypeAdapterList(
    type_: type[_T],
    *,
    config: ConfigDict | None = None,
    _parent_depth: int = 2,
    module: str | None = None,
)
TypeAdapterList(
    type_: _T,
    *,
    config: ConfigDict | None = None,
    _parent_depth: int = 2,
    module: str | None = None,
)
TypeAdapterList(
    type_: type[_T] | _T,
    *,
    config: ConfigDict | None = None,
    _parent_depth: int = 2,
    module: str | None = None,
)

Bases: Generic[_T]

A list type adapter for handling one or many input values.

The type adapters provide a flexible way to perform validation and serialization based on a Python type. It proxies the TypeAdapter class with a list type checking mechanism that allows for a single value or a sequence of values.

Attributes:

Name Type Description
core_schema

The core schema for the type.

validator

The schema validator for the type.

serializer

The schema serializer for the type.

Note

TypeAdapterList instances are not types, and cannot be used as type annotations for fields.

Initialize a list type adapter.

Parameters:

Name Type Description Default
type_ type[_T] | _T

The type associated with the adapter.

required
config ConfigDict | None

The configuration to use for the adapter.

None
_parent_depth int

The depth at which to search the parent namespace to construct the local namespace.

2
module str | None

The module to pass to the plugin, if provided.

None

Returns:

Type Description
None

A type adapter configured for the specified type.

Source code in .venv/lib/python3.12/site-packages/plateforme/core/schema/types.py
def __init__(
    self,
    type_: type[_T] | _T,
    *,
    config: PydanticConfigDict | None = None,
    _parent_depth: int = 2,
    module: str | None = None,
) -> None:
    """Initialize a list type adapter.

    Args:
        type_: The type associated with the adapter.
        config: The configuration to use for the adapter.
        _parent_depth: The depth at which to search the parent
            namespace to construct the local namespace.
        module: The module that passes to plugin if provided.

    Returns:
        A type adapter configured for the specified `type`.
    """
    self.__pydantic_adapter__ = TypeAdapter(
        OneOrMany[type_],  # type: ignore
        config=config,
        _parent_depth=_parent_depth,
        module=module,
    )

validate_python

validate_python(
    __object: Any,
    *,
    strict: bool | None = None,
    from_attributes: bool | None = None,
    context: dict[str, Any] | None = None,
) -> list[_T]

Validate a Python object against the model.

Parameters:

Name Type Description Default
__object Any

The Python object to validate against the model.

required
strict bool | None

Whether to strictly check types.

None
from_attributes bool | None

Whether to extract data from object attributes.

None
context dict[str, Any] | None

Additional context to pass to the validator.

None

Returns:

Type Description
list[_T]

The validated object.

Note

When using TypeAdapterList with a Pydantic dataclass, the use of the from_attributes argument is not supported.

Source code in .venv/lib/python3.12/site-packages/plateforme/core/schema/types.py
def validate_python(  # type: ignore[empty-body, unused-ignore]
    self,
    __object: Any,
    *,
    strict: bool | None = None,
    from_attributes: bool | None = None,
    context: dict[str, Any] | None = None,
) -> list[_T]:
    """Validate a Python object against the model.

    Args:
        __object: The Python object to validate against the model.
        strict: Whether to strictly check types.
        from_attributes: Whether to extract data from object attributes.
        context: Additional context to pass to the validator.

    Returns:
        The validated object.

    Note:
        When using `TypeAdapterList` with a Pydantic `dataclass`, the use
        of the `from_attributes` argument is not supported.
    """

validate_json

validate_json(
    __data: str | bytes,
    *,
    strict: bool | None = None,
    context: dict[str, Any] | None = None,
) -> list[_T]

Validate a JSON string or bytes against the model.

Parameters:

Name Type Description Default
__data str | bytes

The JSON data to validate against the model.

required
strict bool | None

Whether to strictly check types.

None
context dict[str, Any] | None

Additional context to use during validation.

None

Returns:

Type Description
list[_T]

The validated object.

Source code in .venv/lib/python3.12/site-packages/plateforme/core/schema/types.py
def validate_json(  # type: ignore[empty-body, unused-ignore]
    self,
    __data: str | bytes,
    *,
    strict: bool | None = None,
    context: dict[str, Any] | None = None,
) -> list[_T]:
    """Validate a JSON string or bytes against the model.

    Args:
        __data: The JSON data to validate against the model.
        strict: Whether to strictly check types.
        context: Additional context to use during validation.

    Returns:
        The validated object.
    """

validate_strings

validate_strings(
    __object: Any,
    *,
    strict: bool | None = None,
    context: dict[str, Any] | None = None,
) -> list[_T]

Validate an object containing string data against the model.

Parameters:

Name Type Description Default
__object Any

The object containing string data to validate.

required
strict bool | None

Whether to strictly check types.

None
context dict[str, Any] | None

Additional context to use during validation.

None

Returns:

Type Description
list[_T]

The validated object.

Source code in .venv/lib/python3.12/site-packages/plateforme/core/schema/types.py
def validate_strings(  # type: ignore[empty-body, unused-ignore]
    self,
    __object: Any,
    *,
    strict: bool | None = None,
    context: dict[str, Any] | None = None,
) -> list[_T]:
    """Validate object contains string data against the model.

    Args:
        __object: The object contains string data to validate.
        strict: Whether to strictly check types.
        context: Additional context to use during validation.

    Returns:
        The validated object.
    """

get_default_value

get_default_value(
    *,
    strict: bool | None = None,
    context: dict[str, Any] | None = None,
) -> Some[list[_T]] | None

Get the default value for the wrapped type.

Parameters:

Name Type Description Default
strict bool | None

Whether to strictly check types.

None
context dict[str, Any] | None

Additional context to pass to the validator.

None

Returns:

Type Description
Some[list[_T]] | None

The default value wrapped in a Some if there is one or None if not.

Source code in .venv/lib/python3.12/site-packages/plateforme/core/schema/types.py
def get_default_value(  # type: ignore[empty-body, unused-ignore]
    self,
    *,
    strict: bool | None = None,
    context: dict[str, Any] | None = None,
) -> Some[list[_T]] | None:
    """Get the default value for the wrapped type.

    Args:
        strict: Whether to strictly check types.
        context: Additional context to pass to the validator.

    Returns:
        The default value wrapped in a `Some` if there is one or ``None``
        if not.
    """

dump_python

dump_python(
    __instances: _T | list[_T],
    *,
    mode: Literal["json", "python"] = "python",
    include: IncEx | None = None,
    exclude: IncEx | None = None,
    by_alias: bool = False,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
    round_trip: bool = False,
    warnings: bool = True,
) -> Any

Dump an instance of the adapted type to a Python object.

Parameters:

Name Type Description Default
__instances _T | list[_T]

The Python object to serialize.

required
mode Literal['json', 'python']

The output format.

'python'
include IncEx | None

Fields to include in the output.

None
exclude IncEx | None

Fields to exclude from the output.

None
by_alias bool

Whether to use alias names for field names.

False
exclude_unset bool

Whether to exclude unset fields.

False
exclude_defaults bool

Whether to exclude fields with default values.

False
exclude_none bool

Whether to exclude fields with None values.

False
round_trip bool

Whether to output the serialized data in a way that is compatible with deserialization.

False
warnings bool

Whether to display serialization warnings.

True

Returns:

Type Description
Any

The serialized object.

Source code in .venv/lib/python3.12/site-packages/plateforme/core/schema/types.py
def dump_python(  # type: ignore[empty-body, unused-ignore]
    self,
    __instances: _T | list[_T],
    *,
    mode: Literal['json', 'python'] = 'python',
    include: IncEx | None = None,
    exclude: IncEx | None = None,
    by_alias: bool = False,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
    round_trip: bool = False,
    warnings: bool = True,
) -> Any:
    """Dump an instance of the adapted type to a Python object.

    Args:
        __instances: The Python object to serialize.
        mode: The output format.
        include: Fields to include in the output.
        exclude: Fields to exclude from the output.
        by_alias: Whether to use alias names for field names.
        exclude_unset: Whether to exclude unset fields.
        exclude_defaults: Whether to exclude fields with default values.
        exclude_none: Whether to exclude fields with ``None`` values.
        round_trip: Whether to output the serialized data in a way that is
            compatible with deserialization.
        warnings: Whether to display serialization warnings.

    Returns:
        The serialized object.
    """

dump_json

dump_json(
    __instance: _T | list[_T],
    *,
    indent: int | None = None,
    include: IncEx | None = None,
    exclude: IncEx | None = None,
    by_alias: bool = False,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
    round_trip: bool = False,
    warnings: bool = True,
) -> bytes

Serialize an instance of the adapted type to JSON.

Parameters:

Name Type Description Default
__instance _T | list[_T]

The instance to be serialized.

required
indent int | None

Number of spaces for JSON indentation.

None
include IncEx | None

Fields to include.

None
exclude IncEx | None

Fields to exclude.

None
by_alias bool

Whether to use alias names for field names.

False
exclude_unset bool

Whether to exclude unset fields.

False
exclude_defaults bool

Whether to exclude fields with default values.

False
exclude_none bool

Whether to exclude fields with a value of None.

False
round_trip bool

Whether to serialize and deserialize the instance to ensure round-tripping.

False
warnings bool

Whether to emit serialization warnings.

True

Returns:

Type Description
bytes

The JSON representation of the given instance as bytes.

Source code in .venv/lib/python3.12/site-packages/plateforme/core/schema/types.py
def dump_json(  # type: ignore[empty-body, unused-ignore]
    self,
    __instance: _T | list[_T],
    *,
    indent: int | None = None,
    include: IncEx | None = None,
    exclude: IncEx | None = None,
    by_alias: bool = False,
    exclude_unset: bool = False,
    exclude_defaults: bool = False,
    exclude_none: bool = False,
    round_trip: bool = False,
    warnings: bool = True,
) -> bytes:
    """Serialize an instance of the adapted type to JSON.

    Args:
        __instance: The instance to be serialized.
        indent: Number of spaces for JSON indentation.
        include: Fields to include.
        exclude: Fields to exclude.
        by_alias: Whether to use alias names for field names.
        exclude_unset: Whether to exclude unset fields.
        exclude_defaults: Whether to exclude fields with default values.
        exclude_none: Whether to exclude fields with a value of ``None``.
        round_trip: Whether to serialize and deserialize the instance to
            ensure round-tripping.
        warnings: Whether to emit serialization warnings.

    Returns:
        The JSON representation of the given instance as bytes.
    """

json_schema

json_schema(
    *,
    by_alias: bool = True,
    ref_template: str = DEFAULT_REF_TEMPLATE,
    schema_generator: type[
        GenerateJsonSchema
    ] = GenerateJsonSchema,
    mode: JsonSchemaMode = "validation",
) -> dict[str, Any]

Generate a JSON schema for the adapted type.

Parameters:

Name Type Description Default
by_alias bool

Whether to use alias names for field names.

True
ref_template str

The format string used for generating $ref strings.

DEFAULT_REF_TEMPLATE
schema_generator type[GenerateJsonSchema]

The generator class used for creating the schema.

GenerateJsonSchema
mode JsonSchemaMode

The mode to use for schema generation.

'validation'

Returns:

Type Description
dict[str, Any]

The JSON schema for the model as a dictionary.

Source code in .venv/lib/python3.12/site-packages/plateforme/core/schema/types.py
def json_schema(  # type: ignore[empty-body, unused-ignore]
    self,
    *,
    by_alias: bool = True,
    ref_template: str = DEFAULT_REF_TEMPLATE,
    schema_generator: type[GenerateJsonSchema] = GenerateJsonSchema,
    mode: JsonSchemaMode = 'validation',
) -> dict[str, Any]:
    """Generate a JSON schema for the adapted type.

    Args:
        by_alias: Whether to use alias names for field names.
        ref_template: The format string used for generating $ref strings.
        schema_generator: The generator class used for creating the schema.
        mode: The mode to use for schema generation.

    Returns:
        The JSON schema for the model as a dictionary.
    """

json_schemas staticmethod

json_schemas(
    __inputs: Iterable[
        tuple[
            JsonSchemaKeyT, JsonSchemaMode, TypeAdapter[Any]
        ]
    ],
    *,
    by_alias: bool = True,
    title: str | None = None,
    description: str | None = None,
    ref_template: str = DEFAULT_REF_TEMPLATE,
    schema_generator: type[
        GenerateJsonSchema
    ] = GenerateJsonSchema,
) -> tuple[
    dict[
        tuple[JsonSchemaKeyT, JsonSchemaMode],
        JsonSchemaValue,
    ],
    JsonSchemaValue,
]

Generate a JSON schema from multiple type adapters.

Parameters:

Name Type Description Default
__inputs Iterable[tuple[JsonSchemaKeyT, JsonSchemaMode, TypeAdapter[Any]]]

Inputs to schema generation. The first two items will form the keys of the (first) output mapping; the type adapters will provide the core schemas that get converted into definitions in the output JSON schema.

required
by_alias bool

Whether to use alias names.

True
title str | None

The title for the schema.

None
description str | None

The description for the schema.

None
ref_template str

The format string used for generating $ref strings.

DEFAULT_REF_TEMPLATE
schema_generator type[GenerateJsonSchema]

The generator class used for creating the schema.

GenerateJsonSchema

Returns:

Type Description
tuple[dict[tuple[JsonSchemaKeyT, JsonSchemaMode], JsonSchemaValue], JsonSchemaValue]

A tuple where:

  • The first element is a dictionary whose keys are tuples of JSON schema key type and JSON mode, and whose values are the JSON schema corresponding to that pair of inputs. (These schemas may have JsonRef references to definitions that are defined in the second returned element.)
  • The second element is a JSON schema containing all definitions referenced in the first returned element, along with the optional title and description keys.

Source code in .venv/lib/python3.12/site-packages/plateforme/core/schema/types.py
@staticmethod
def json_schemas(  # type: ignore[empty-body, unused-ignore]
    __inputs: Iterable[
        tuple[JsonSchemaKeyT, JsonSchemaMode, TypeAdapter[Any]]
    ],
    *,
    by_alias: bool = True,
    title: str | None = None,
    description: str | None = None,
    ref_template: str = DEFAULT_REF_TEMPLATE,
    schema_generator: type[GenerateJsonSchema] = GenerateJsonSchema,
) -> tuple[
    dict[tuple[JsonSchemaKeyT, JsonSchemaMode], JsonSchemaValue],
    JsonSchemaValue,
]:
    """Generate a JSON schema from multiple type adapters.

    Args:
        __inputs: Inputs to schema generation. The first two items will
            form the keys of the (first) output mapping; the type adapters
            will provide the core schemas that get converted into
            definitions in the output JSON schema.
        by_alias: Whether to use alias names.
        title: The title for the schema.
        description: The description for the schema.
        ref_template: The format string used for generating $ref strings.
        schema_generator: The generator class used for creating the schema.

    Returns:
        A tuple where:
        - The first element is a dictionary whose keys are tuples of JSON
            schema key type and JSON mode, and whose values are the JSON
            schema corresponding to that pair of inputs. (These schemas may
            have JsonRef references to definitions that are defined in the
            second returned element).
        - The second element is a JSON schema containing all definitions
            referenced in the first returned element, along with the
            optional title and description keys.
    """