
BaseModel

Pydantic models are simply classes that inherit from BaseModel and define fields as annotated attributes.
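For example, a minimal model is created by subclassing BaseModel and annotating its fields; the User model below is an illustrative sketch rather than part of the API reference:

```python
from pydantic import BaseModel


class User(BaseModel):
    id: int
    name: str = 'Jane Doe'  # a default value makes the field optional at construction


user = User(id='123')  # the string '123' is coerced to the int 123 during validation
print(user.model_dump())
#> {'id': 123, 'name': 'Jane Doe'}
```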

pydantic.BaseModel

Usage Documentation

Models

A base class for creating Pydantic models.
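As the mock validator in the source below enforces (error code 'base-model-instantiated'), BaseModel itself is only meant to be subclassed and cannot be instantiated directly. A small sketch of that behavior:

```python
from pydantic import BaseModel

try:
    BaseModel()
except Exception as exc:  # pydantic raises a PydanticUserError for code 'base-model-instantiated'
    print(type(exc).__name__)
    #> PydanticUserError
```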

Attributes:

__class_vars__ (set[str]): The names of the class variables defined on the model.
__private_attributes__ (Dict[str, ModelPrivateAttr]): Metadata about the private attributes of the model.
__signature__ (Signature): The synthesized __init__ Signature of the model.
__pydantic_complete__ (bool): Whether model building is completed, or if there are still undefined fields.
__pydantic_core_schema__ (CoreSchema): The core schema of the model.
__pydantic_custom_init__ (bool): Whether the model has a custom __init__ function.
__pydantic_decorators__ (DecoratorInfos): Metadata containing the decorators defined on the model. This replaces Model.__validators__ and Model.__root_validators__ from Pydantic V1.
__pydantic_generic_metadata__ (PydanticGenericMetadata): Metadata for generic models; contains data used for a similar purpose to __args__, __origin__ and __parameters__ in typing-module generics. May eventually be replaced by these.
__pydantic_parent_namespace__ (Dict[str, Any] | None): Parent namespace of the model, used for automatic rebuilding of models.
__pydantic_post_init__ (None | Literal['model_post_init']): The name of the post-init method for the model, if defined.
__pydantic_root_model__ (bool): Whether the model is a RootModel.
__pydantic_serializer__ (SchemaSerializer): The pydantic-core SchemaSerializer used to dump instances of the model.
__pydantic_validator__ (SchemaValidator | PluggableSchemaValidator): The pydantic-core SchemaValidator used to validate instances of the model.
__pydantic_fields__ (Dict[str, FieldInfo]): A dictionary of field names and their corresponding FieldInfo objects.
__pydantic_computed_fields__ (Dict[str, ComputedFieldInfo]): A dictionary of computed field names and their corresponding ComputedFieldInfo objects.
__pydantic_extra__ (dict[str, Any] | None): A dictionary containing extra values, if extra is set to 'allow'.
__pydantic_fields_set__ (set[str]): The names of fields explicitly set during instantiation.
__pydantic_private__ (dict[str, Any] | None): Values of private attributes set on the model instance.
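Several of these dunder attributes are surfaced through public accessors. The sketch below (the Settings model is illustrative) shows model_fields_set reflecting __pydantic_fields_set__, and model_extra reflecting __pydantic_extra__ when extra is set to 'allow':

```python
from pydantic import BaseModel, ConfigDict


class Settings(BaseModel):
    model_config = ConfigDict(extra='allow')

    host: str = 'localhost'
    port: int = 8080


s = Settings(host='example.com', debug=True)

# 'host' was passed explicitly; 'port' was filled from its default
print('host' in s.model_fields_set, 'port' in s.model_fields_set)
#> True False

# because extra is 'allow', unrecognized keys are collected in __pydantic_extra__
print(s.model_extra)
#> {'debug': True}
```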

Source code in pydantic/main.py, lines 121-1643:
class BaseModel(metaclass=_model_construction.ModelMetaclass):  """!!! abstract "Usage Documentation"  [Models](../concepts/models.md)  A base class for creating Pydantic models.  Attributes:  __class_vars__: The names of the class variables defined on the model.  __private_attributes__: Metadata about the private attributes of the model.  __signature__: The synthesized `__init__` [`Signature`][inspect.Signature] of the model.  __pydantic_complete__: Whether model building is completed, or if there are still undefined fields.  __pydantic_core_schema__: The core schema of the model.  __pydantic_custom_init__: Whether the model has a custom `__init__` function.  __pydantic_decorators__: Metadata containing the decorators defined on the model.  This replaces `Model.__validators__` and `Model.__root_validators__` from Pydantic V1.  __pydantic_generic_metadata__: Metadata for generic models; contains data used for a similar purpose to  __args__, __origin__, __parameters__ in typing-module generics. May eventually be replaced by these.  __pydantic_parent_namespace__: Parent namespace of the model, used for automatic rebuilding of models.  __pydantic_post_init__: The name of the post-init method for the model, if defined.  __pydantic_root_model__: Whether the model is a [`RootModel`][pydantic.root_model.RootModel].  __pydantic_serializer__: The `pydantic-core` `SchemaSerializer` used to dump instances of the model.  __pydantic_validator__: The `pydantic-core` `SchemaValidator` used to validate instances of the model.  __pydantic_fields__: A dictionary of field names and their corresponding [`FieldInfo`][pydantic.fields.FieldInfo] objects.  __pydantic_computed_fields__: A dictionary of computed field names and their corresponding [`ComputedFieldInfo`][pydantic.fields.ComputedFieldInfo] objects.  __pydantic_extra__: A dictionary containing extra values, if [`extra`][pydantic.config.ConfigDict.extra]  is set to `'allow'`.  __pydantic_fields_set__: The names of fields explicitly set during instantiation.  __pydantic_private__: Values of private attributes set on the model instance.  """ # Note: Many of the below class vars are defined in the metaclass, but we define them here for type checking purposes. model_config: ClassVar[ConfigDict] = ConfigDict()  """  Configuration for the model, should be a dictionary conforming to [`ConfigDict`][pydantic.config.ConfigDict].  """ __class_vars__: ClassVar[set[str]]  """The names of the class variables defined on the model.""" __private_attributes__: ClassVar[Dict[str, ModelPrivateAttr]] # noqa: UP006  """Metadata about the private attributes of the model.""" __signature__: ClassVar[Signature]  """The synthesized `__init__` [`Signature`][inspect.Signature] of the model.""" __pydantic_complete__: ClassVar[bool] = False  """Whether model building is completed, or if there are still undefined fields.""" __pydantic_core_schema__: ClassVar[CoreSchema]  """The core schema of the model.""" __pydantic_custom_init__: ClassVar[bool]  """Whether the model has a custom `__init__` method.""" # Must be set for `GenerateSchema.model_schema` to work for a plain `BaseModel` annotation. __pydantic_decorators__: ClassVar[_decorators.DecoratorInfos] = _decorators.DecoratorInfos()  """Metadata containing the decorators defined on the model.  
This replaces `Model.__validators__` and `Model.__root_validators__` from Pydantic V1.""" __pydantic_generic_metadata__: ClassVar[_generics.PydanticGenericMetadata]  """Metadata for generic models; contains data used for a similar purpose to  __args__, __origin__, __parameters__ in typing-module generics. May eventually be replaced by these.""" __pydantic_parent_namespace__: ClassVar[Dict[str, Any] | None] = None # noqa: UP006  """Parent namespace of the model, used for automatic rebuilding of models.""" __pydantic_post_init__: ClassVar[None | Literal['model_post_init']]  """The name of the post-init method for the model, if defined.""" __pydantic_root_model__: ClassVar[bool] = False  """Whether the model is a [`RootModel`][pydantic.root_model.RootModel].""" __pydantic_serializer__: ClassVar[SchemaSerializer]  """The `pydantic-core` `SchemaSerializer` used to dump instances of the model.""" __pydantic_validator__: ClassVar[SchemaValidator | PluggableSchemaValidator]  """The `pydantic-core` `SchemaValidator` used to validate instances of the model.""" __pydantic_fields__: ClassVar[Dict[str, FieldInfo]] # noqa: UP006  """A dictionary of field names and their corresponding [`FieldInfo`][pydantic.fields.FieldInfo] objects.  This replaces `Model.__fields__` from Pydantic V1.  """ __pydantic_setattr_handlers__: ClassVar[Dict[str, Callable[[BaseModel, str, Any], None]]] # noqa: UP006  """`__setattr__` handlers. Memoizing the handlers leads to a dramatic performance improvement in `__setattr__`""" __pydantic_computed_fields__: ClassVar[Dict[str, ComputedFieldInfo]] # noqa: UP006  """A dictionary of computed field names and their corresponding [`ComputedFieldInfo`][pydantic.fields.ComputedFieldInfo] objects.""" __pydantic_extra__: dict[str, Any] | None = _model_construction.NoInitField(init=False)  """A dictionary containing extra values, if [`extra`][pydantic.config.ConfigDict.extra] is set to `'allow'`.""" __pydantic_fields_set__: set[str] = _model_construction.NoInitField(init=False)  """The names of fields explicitly set during instantiation.""" __pydantic_private__: dict[str, Any] | None = _model_construction.NoInitField(init=False)  """Values of private attributes set on the model instance.""" if not TYPE_CHECKING: # Prevent `BaseModel` from being instantiated directly # (defined in an `if not TYPE_CHECKING` block for clarity and to avoid type checking errors): __pydantic_core_schema__ = _mock_val_ser.MockCoreSchema( 'Pydantic models should inherit from BaseModel, BaseModel cannot be instantiated directly', code='base-model-instantiated', ) __pydantic_validator__ = _mock_val_ser.MockValSer( 'Pydantic models should inherit from BaseModel, BaseModel cannot be instantiated directly', val_or_ser='validator', code='base-model-instantiated', ) __pydantic_serializer__ = _mock_val_ser.MockValSer( 'Pydantic models should inherit from BaseModel, BaseModel cannot be instantiated directly', val_or_ser='serializer', code='base-model-instantiated', ) __slots__ = '__dict__', '__pydantic_fields_set__', '__pydantic_extra__', '__pydantic_private__' def __init__(self, /, **data: Any) -> None:  """Create a new model by parsing and validating input data from keyword arguments.  Raises [`ValidationError`][pydantic_core.ValidationError] if the input data cannot be  validated to form a valid model.  `self` is explicitly positional-only to allow `self` as a field name.  
""" # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks __tracebackhide__ = True validated_self = self.__pydantic_validator__.validate_python(data, self_instance=self) if self is not validated_self: warnings.warn( 'A custom validator is returning a value other than `self`.\n' "Returning anything other than `self` from a top level model validator isn't supported when validating via `__init__`.\n" 'See the `model_validator` docs (https://docs.pydantic.dev/latest/concepts/validators/#model-validators) for more details.', stacklevel=2, ) # The following line sets a flag that we use to determine when `__init__` gets overridden by the user __init__.__pydantic_base_init__ = True # pyright: ignore[reportFunctionMemberAccess] @_utils.deprecated_instance_property @classmethod def model_fields(cls) -> dict[str, FieldInfo]:  """A mapping of field names to their respective [`FieldInfo`][pydantic.fields.FieldInfo] instances.  !!! warning  Accessing this attribute from a model instance is deprecated, and will not work in Pydantic V3.  Instead, you should access this attribute from the model class.  """ return getattr(cls, '__pydantic_fields__', {}) @_utils.deprecated_instance_property @classmethod def model_computed_fields(cls) -> dict[str, ComputedFieldInfo]:  """A mapping of computed field names to their respective [`ComputedFieldInfo`][pydantic.fields.ComputedFieldInfo] instances.  !!! warning  Accessing this attribute from a model instance is deprecated, and will not work in Pydantic V3.  Instead, you should access this attribute from the model class.  """ return getattr(cls, '__pydantic_computed_fields__', {}) @property def model_extra(self) -> dict[str, Any] | None:  """Get extra fields set during validation.  Returns:  A dictionary of extra fields, or `None` if `config.extra` is not set to `"allow"`.  """ return self.__pydantic_extra__ @property def model_fields_set(self) -> set[str]:  """Returns the set of fields that have been explicitly set on this model instance.  Returns:  A set of strings representing the fields that have been set,  i.e. that were not filled from defaults.  """ return self.__pydantic_fields_set__ @classmethod def model_construct(cls, _fields_set: set[str] | None = None, **values: Any) -> Self: # noqa: C901  """Creates a new instance of the `Model` class with validated data.  Creates a new model setting `__dict__` and `__pydantic_fields_set__` from trusted or pre-validated data.  Default values are respected, but no other validation is performed.  !!! note  `model_construct()` generally respects the `model_config.extra` setting on the provided model.  That is, if `model_config.extra == 'allow'`, then all extra passed values are added to the model instance's `__dict__`  and `__pydantic_extra__` fields. If `model_config.extra == 'ignore'` (the default), then all extra passed values are ignored.  Because no validation is performed with a call to `model_construct()`, having `model_config.extra == 'forbid'` does not result in  an error if extra values are passed, but they will be ignored.  Args:  _fields_set: A set of field names that were originally explicitly set during instantiation. If provided,  this is directly used for the [`model_fields_set`][pydantic.BaseModel.model_fields_set] attribute.  Otherwise, the field names from the `values` argument will be used.  values: Trusted or pre-validated data dictionary.  Returns:  A new instance of the `Model` class with validated data.  
""" m = cls.__new__(cls) fields_values: dict[str, Any] = {} fields_set = set() for name, field in cls.__pydantic_fields__.items(): if field.alias is not None and field.alias in values: fields_values[name] = values.pop(field.alias) fields_set.add(name) if (name not in fields_set) and (field.validation_alias is not None): validation_aliases: list[str | AliasPath] = ( field.validation_alias.choices if isinstance(field.validation_alias, AliasChoices) else [field.validation_alias] ) for alias in validation_aliases: if isinstance(alias, str) and alias in values: fields_values[name] = values.pop(alias) fields_set.add(name) break elif isinstance(alias, AliasPath): value = alias.search_dict_for_path(values) if value is not PydanticUndefined: fields_values[name] = value fields_set.add(name) break if name not in fields_set: if name in values: fields_values[name] = values.pop(name) fields_set.add(name) elif not field.is_required(): fields_values[name] = field.get_default(call_default_factory=True, validated_data=fields_values) if _fields_set is None: _fields_set = fields_set _extra: dict[str, Any] | None = values if cls.model_config.get('extra') == 'allow' else None _object_setattr(m, '__dict__', fields_values) _object_setattr(m, '__pydantic_fields_set__', _fields_set) if not cls.__pydantic_root_model__: _object_setattr(m, '__pydantic_extra__', _extra) if cls.__pydantic_post_init__: m.model_post_init(None) # update private attributes with values set if hasattr(m, '__pydantic_private__') and m.__pydantic_private__ is not None: for k, v in values.items(): if k in m.__private_attributes__: m.__pydantic_private__[k] = v elif not cls.__pydantic_root_model__: # Note: if there are any private attributes, cls.__pydantic_post_init__ would exist # Since it doesn't, that means that `__pydantic_private__` should be set to None _object_setattr(m, '__pydantic_private__', None) return m def model_copy(self, *, update: Mapping[str, Any] | None = None, deep: bool = False) -> Self:  """!!! abstract "Usage Documentation"  [`model_copy`](../concepts/serialization.md#model_copy)  Returns a copy of the model.  !!! note  The underlying instance's [`__dict__`][object.__dict__] attribute is copied. This  might have unexpected side effects if you store anything in it, on top of the model  fields (e.g. the value of [cached properties][functools.cached_property]).  Args:  update: Values to change/add in the new model. Note: the data is not validated  before creating the new model. You should trust this data.  deep: Set to `True` to make a deep copy of the model.  Returns:  New model instance.  """ copied = self.__deepcopy__() if deep else self.__copy__() if update: if self.model_config.get('extra') == 'allow': for k, v in update.items(): if k in self.__pydantic_fields__: copied.__dict__[k] = v else: if copied.__pydantic_extra__ is None: copied.__pydantic_extra__ = {} copied.__pydantic_extra__[k] = v else: copied.__dict__.update(update) copied.__pydantic_fields_set__.update(update.keys()) return copied def model_dump( self, *, mode: Literal['json', 'python'] | str = 'python', include: IncEx | None = None, exclude: IncEx | None = None, context: Any | None = None, by_alias: bool | None = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, round_trip: bool = False, warnings: bool | Literal['none', 'warn', 'error'] = True, fallback: Callable[[Any], Any] | None = None, serialize_as_any: bool = False, ) -> dict[str, Any]:  """!!! 
abstract "Usage Documentation"  [`model_dump`](../concepts/serialization.md#modelmodel_dump)  Generate a dictionary representation of the model, optionally specifying which fields to include or exclude.  Args:  mode: The mode in which `to_python` should run.  If mode is 'json', the output will only contain JSON serializable types.  If mode is 'python', the output may contain non-JSON-serializable Python objects.  include: A set of fields to include in the output.  exclude: A set of fields to exclude from the output.  context: Additional context to pass to the serializer.  by_alias: Whether to use the field's alias in the dictionary key if defined.  exclude_unset: Whether to exclude fields that have not been explicitly set.  exclude_defaults: Whether to exclude fields that are set to their default value.  exclude_none: Whether to exclude fields that have a value of `None`.  round_trip: If True, dumped values should be valid as input for non-idempotent types such as Json[T].  warnings: How to handle serialization errors. False/"none" ignores them, True/"warn" logs errors,  "error" raises a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError].  fallback: A function to call when an unknown value is encountered. If not provided,  a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError] error is raised.  serialize_as_any: Whether to serialize fields with duck-typing serialization behavior.  Returns:  A dictionary representation of the model.  """ return self.__pydantic_serializer__.to_python( self, mode=mode, by_alias=by_alias, include=include, exclude=exclude, context=context, exclude_unset=exclude_unset, exclude_defaults=exclude_defaults, exclude_none=exclude_none, round_trip=round_trip, warnings=warnings, fallback=fallback, serialize_as_any=serialize_as_any, ) def model_dump_json( self, *, indent: int | None = None, include: IncEx | None = None, exclude: IncEx | None = None, context: Any | None = None, by_alias: bool | None = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, round_trip: bool = False, warnings: bool | Literal['none', 'warn', 'error'] = True, fallback: Callable[[Any], Any] | None = None, serialize_as_any: bool = False, ) -> str:  """!!! abstract "Usage Documentation"  [`model_dump_json`](../concepts/serialization.md#modelmodel_dump_json)  Generates a JSON representation of the model using Pydantic's `to_json` method.  Args:  indent: Indentation to use in the JSON output. If None is passed, the output will be compact.  include: Field(s) to include in the JSON output.  exclude: Field(s) to exclude from the JSON output.  context: Additional context to pass to the serializer.  by_alias: Whether to serialize using field aliases.  exclude_unset: Whether to exclude fields that have not been explicitly set.  exclude_defaults: Whether to exclude fields that are set to their default value.  exclude_none: Whether to exclude fields that have a value of `None`.  round_trip: If True, dumped values should be valid as input for non-idempotent types such as Json[T].  warnings: How to handle serialization errors. False/"none" ignores them, True/"warn" logs errors,  "error" raises a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError].  fallback: A function to call when an unknown value is encountered. If not provided,  a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError] error is raised.  
serialize_as_any: Whether to serialize fields with duck-typing serialization behavior.  Returns:  A JSON string representation of the model.  """ return self.__pydantic_serializer__.to_json( self, indent=indent, include=include, exclude=exclude, context=context, by_alias=by_alias, exclude_unset=exclude_unset, exclude_defaults=exclude_defaults, exclude_none=exclude_none, round_trip=round_trip, warnings=warnings, fallback=fallback, serialize_as_any=serialize_as_any, ).decode() @classmethod def model_json_schema( cls, by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE, schema_generator: type[GenerateJsonSchema] = GenerateJsonSchema, mode: JsonSchemaMode = 'validation', ) -> dict[str, Any]:  """Generates a JSON schema for a model class.  Args:  by_alias: Whether to use attribute aliases or not.  ref_template: The reference template.  schema_generator: To override the logic used to generate the JSON schema, as a subclass of  `GenerateJsonSchema` with your desired modifications  mode: The mode in which to generate the schema.  Returns:  The JSON schema for the given model class.  """ return model_json_schema( cls, by_alias=by_alias, ref_template=ref_template, schema_generator=schema_generator, mode=mode ) @classmethod def model_parametrized_name(cls, params: tuple[type[Any], ...]) -> str:  """Compute the class name for parametrizations of generic classes.  This method can be overridden to achieve a custom naming scheme for generic BaseModels.  Args:  params: Tuple of types of the class. Given a generic class  `Model` with 2 type variables and a concrete model `Model[str, int]`,  the value `(str, int)` would be passed to `params`.  Returns:  String representing the new class where `params` are passed to `cls` as type variables.  Raises:  TypeError: Raised when trying to generate concrete names for non-generic models.  """ if not issubclass(cls, typing.Generic): raise TypeError('Concrete names should only be generated for generic models.') # Any strings received should represent forward references, so we handle them specially below. # If we eventually move toward wrapping them in a ForwardRef in __class_getitem__ in the future, # we may be able to remove this special case. param_names = [param if isinstance(param, str) else _repr.display_as_type(param) for param in params] params_component = ', '.join(param_names) return f'{cls.__name__}[{params_component}]' def model_post_init(self, context: Any, /) -> None:  """Override this method to perform additional initialization after `__init__` and `model_construct`.  This is useful if you want to do some validation that requires the entire model to be initialized.  """ pass @classmethod def model_rebuild( cls, *, force: bool = False, raise_errors: bool = True, _parent_namespace_depth: int = 2, _types_namespace: MappingNamespace | None = None, ) -> bool | None:  """Try to rebuild the pydantic-core schema for the model.  This may be necessary when one of the annotations is a ForwardRef which could not be resolved during  the initial attempt to build the schema, and automatic rebuilding fails.  Args:  force: Whether to force the rebuilding of the model schema, defaults to `False`.  raise_errors: Whether to raise errors, defaults to `True`.  _parent_namespace_depth: The depth level of the parent namespace, defaults to 2.  _types_namespace: The types namespace, defaults to `None`.  Returns:  Returns `None` if the schema is already "complete" and rebuilding was not required.  
If rebuilding _was_ required, returns `True` if rebuilding was successful, otherwise `False`.  """ if not force and cls.__pydantic_complete__: return None for attr in ('__pydantic_core_schema__', '__pydantic_validator__', '__pydantic_serializer__'): if attr in cls.__dict__ and not isinstance(getattr(cls, attr), _mock_val_ser.MockValSer): # Deleting the validator/serializer is necessary as otherwise they can get reused in # pydantic-core. We do so only if they aren't mock instances, otherwise — as `model_rebuild()` # isn't thread-safe — concurrent model instantiations can lead to the parent validator being used. # Same applies for the core schema that can be reused in schema generation. delattr(cls, attr) cls.__pydantic_complete__ = False if _types_namespace is not None: rebuild_ns = _types_namespace elif _parent_namespace_depth > 0: rebuild_ns = _typing_extra.parent_frame_namespace(parent_depth=_parent_namespace_depth, force=True) or {} else: rebuild_ns = {} parent_ns = _model_construction.unpack_lenient_weakvaluedict(cls.__pydantic_parent_namespace__) or {} ns_resolver = _namespace_utils.NsResolver( parent_namespace={**rebuild_ns, **parent_ns}, ) if not cls.__pydantic_fields_complete__: typevars_map = _generics.get_model_typevars_map(cls) try: cls.__pydantic_fields__ = _fields.rebuild_model_fields( cls, ns_resolver=ns_resolver, typevars_map=typevars_map, ) except NameError as e: exc = PydanticUndefinedAnnotation.from_name_error(e) _mock_val_ser.set_model_mocks(cls, f'`{exc.name}`') if raise_errors: raise exc from e if not raise_errors and not cls.__pydantic_fields_complete__: # No need to continue with schema gen, it is guaranteed to fail return False assert cls.__pydantic_fields_complete__ return _model_construction.complete_model_class( cls, _config.ConfigWrapper(cls.model_config, check=False), raise_errors=raise_errors, ns_resolver=ns_resolver, ) @classmethod def model_validate( cls, obj: Any, *, strict: bool | None = None, from_attributes: bool | None = None, context: Any | None = None, by_alias: bool | None = None, by_name: bool | None = None, ) -> Self:  """Validate a pydantic model instance.  Args:  obj: The object to validate.  strict: Whether to enforce types strictly.  from_attributes: Whether to extract data from object attributes.  context: Additional context to pass to the validator.  by_alias: Whether to use the field's alias when validating against the provided input data.  by_name: Whether to use the field's name when validating against the provided input data.  Raises:  ValidationError: If the object could not be validated.  Returns:  The validated model instance.  """ # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks __tracebackhide__ = True if by_alias is False and by_name is not True: raise PydanticUserError( 'At least one of `by_alias` or `by_name` must be set to True.', code='validate-by-alias-and-name-false', ) return cls.__pydantic_validator__.validate_python( obj, strict=strict, from_attributes=from_attributes, context=context, by_alias=by_alias, by_name=by_name ) @classmethod def model_validate_json( cls, json_data: str | bytes | bytearray, *, strict: bool | None = None, context: Any | None = None, by_alias: bool | None = None, by_name: bool | None = None, ) -> Self:  """!!! abstract "Usage Documentation"  [JSON Parsing](../concepts/json.md#json-parsing)  Validate the given JSON data against the Pydantic model.  Args:  json_data: The JSON data to validate.  strict: Whether to enforce types strictly.  
context: Extra variables to pass to the validator.  by_alias: Whether to use the field's alias when validating against the provided input data.  by_name: Whether to use the field's name when validating against the provided input data.  Returns:  The validated Pydantic model.  Raises:  ValidationError: If `json_data` is not a JSON string or the object could not be validated.  """ # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks __tracebackhide__ = True if by_alias is False and by_name is not True: raise PydanticUserError( 'At least one of `by_alias` or `by_name` must be set to True.', code='validate-by-alias-and-name-false', ) return cls.__pydantic_validator__.validate_json( json_data, strict=strict, context=context, by_alias=by_alias, by_name=by_name ) @classmethod def model_validate_strings( cls, obj: Any, *, strict: bool | None = None, context: Any | None = None, by_alias: bool | None = None, by_name: bool | None = None, ) -> Self:  """Validate the given object with string data against the Pydantic model.  Args:  obj: The object containing string data to validate.  strict: Whether to enforce types strictly.  context: Extra variables to pass to the validator.  by_alias: Whether to use the field's alias when validating against the provided input data.  by_name: Whether to use the field's name when validating against the provided input data.  Returns:  The validated Pydantic model.  """ # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks __tracebackhide__ = True if by_alias is False and by_name is not True: raise PydanticUserError( 'At least one of `by_alias` or `by_name` must be set to True.', code='validate-by-alias-and-name-false', ) return cls.__pydantic_validator__.validate_strings( obj, strict=strict, context=context, by_alias=by_alias, by_name=by_name ) @classmethod def __get_pydantic_core_schema__(cls, source: type[BaseModel], handler: GetCoreSchemaHandler, /) -> CoreSchema: # This warning is only emitted when calling `super().__get_pydantic_core_schema__` from a model subclass. # In the generate schema logic, this method (`BaseModel.__get_pydantic_core_schema__`) is special cased to # *not* be called if not overridden. warnings.warn( 'The `__get_pydantic_core_schema__` method of the `BaseModel` class is deprecated. If you are calling ' '`super().__get_pydantic_core_schema__` when overriding the method on a Pydantic model, consider using ' '`handler(source)` instead. However, note that overriding this method on models can lead to unexpected ' 'side effects.', PydanticDeprecatedSince211, stacklevel=2, ) # Logic copied over from `GenerateSchema._model_schema`: schema = cls.__dict__.get('__pydantic_core_schema__') if schema is not None and not isinstance(schema, _mock_val_ser.MockCoreSchema): return cls.__pydantic_core_schema__ return handler(source) @classmethod def __get_pydantic_json_schema__( cls, core_schema: CoreSchema, handler: GetJsonSchemaHandler, /, ) -> JsonSchemaValue:  """Hook into generating the model's JSON schema.  Args:  core_schema: A `pydantic-core` CoreSchema.  You can ignore this argument and call the handler with a new CoreSchema,  wrap this CoreSchema (`{'type': 'nullable', 'schema': current_schema}`),  or just call the handler with the original schema.  handler: Call into Pydantic's internal JSON schema generation.  This will raise a `pydantic.errors.PydanticInvalidForJsonSchema` if JSON schema  generation fails.  
Since this gets called by `BaseModel.model_json_schema` you can override the  `schema_generator` argument to that function to change JSON schema generation globally  for a type.  Returns:  A JSON schema, as a Python object.  """ return handler(core_schema) @classmethod def __pydantic_init_subclass__(cls, **kwargs: Any) -> None:  """This is intended to behave just like `__init_subclass__`, but is called by `ModelMetaclass`  only after the class is actually fully initialized. In particular, attributes like `model_fields` will  be present when this is called.  This is necessary because `__init_subclass__` will always be called by `type.__new__`,  and it would require a prohibitively large refactor to the `ModelMetaclass` to ensure that  `type.__new__` was called in such a manner that the class would already be sufficiently initialized.  This will receive the same `kwargs` that would be passed to the standard `__init_subclass__`, namely,  any kwargs passed to the class definition that aren't used internally by pydantic.  Args:  **kwargs: Any keyword arguments passed to the class definition that aren't used internally  by pydantic.  """ pass def __class_getitem__( cls, typevar_values: type[Any] | tuple[type[Any], ...] ) -> type[BaseModel] | _forward_ref.PydanticRecursiveRef: cached = _generics.get_cached_generic_type_early(cls, typevar_values) if cached is not None: return cached if cls is BaseModel: raise TypeError('Type parameters should be placed on typing.Generic, not BaseModel') if not hasattr(cls, '__parameters__'): raise TypeError(f'{cls} cannot be parametrized because it does not inherit from typing.Generic') if not cls.__pydantic_generic_metadata__['parameters'] and typing.Generic not in cls.__bases__: raise TypeError(f'{cls} is not a generic class') if not isinstance(typevar_values, tuple): typevar_values = (typevar_values,) # For a model `class Model[T, U, V = int](BaseModel): ...` parametrized with `(str, bool)`, # this gives us `{T: str, U: bool, V: int}`: typevars_map = _generics.map_generic_model_arguments(cls, typevar_values) # We also update the provided args to use defaults values (`(str, bool)` becomes `(str, bool, int)`): typevar_values = tuple(v for v in typevars_map.values()) if _utils.all_identical(typevars_map.keys(), typevars_map.values()) and typevars_map: submodel = cls # if arguments are equal to parameters it's the same object _generics.set_cached_generic_type(cls, typevar_values, submodel) else: parent_args = cls.__pydantic_generic_metadata__['args'] if not parent_args: args = typevar_values else: args = tuple(_generics.replace_types(arg, typevars_map) for arg in parent_args) origin = cls.__pydantic_generic_metadata__['origin'] or cls model_name = origin.model_parametrized_name(args) params = tuple( {param: None for param in _generics.iter_contained_typevars(typevars_map.values())} ) # use dict as ordered set with _generics.generic_recursion_self_type(origin, args) as maybe_self_type: cached = _generics.get_cached_generic_type_late(cls, typevar_values, origin, args) if cached is not None: return cached if maybe_self_type is not None: return maybe_self_type # Attempt to rebuild the origin in case new types have been defined try: # depth 2 gets you above this __class_getitem__ call. # Note that we explicitly provide the parent ns, otherwise # `model_rebuild` will use the parent ns no matter if it is the ns of a module. # We don't want this here, as this has unexpected effects when a model # is being parametrized during a forward annotation evaluation. 
parent_ns = _typing_extra.parent_frame_namespace(parent_depth=2) or {} origin.model_rebuild(_types_namespace=parent_ns) except PydanticUndefinedAnnotation: # It's okay if it fails, it just means there are still undefined types # that could be evaluated later. pass submodel = _generics.create_generic_submodel(model_name, origin, args, params) _generics.set_cached_generic_type(cls, typevar_values, submodel, origin, args) return submodel def __copy__(self) -> Self:  """Returns a shallow copy of the model.""" cls = type(self) m = cls.__new__(cls) _object_setattr(m, '__dict__', copy(self.__dict__)) _object_setattr(m, '__pydantic_extra__', copy(self.__pydantic_extra__)) _object_setattr(m, '__pydantic_fields_set__', copy(self.__pydantic_fields_set__)) if not hasattr(self, '__pydantic_private__') or self.__pydantic_private__ is None: _object_setattr(m, '__pydantic_private__', None) else: _object_setattr( m, '__pydantic_private__', {k: v for k, v in self.__pydantic_private__.items() if v is not PydanticUndefined}, ) return m def __deepcopy__(self, memo: dict[int, Any] | None = None) -> Self:  """Returns a deep copy of the model.""" cls = type(self) m = cls.__new__(cls) _object_setattr(m, '__dict__', deepcopy(self.__dict__, memo=memo)) _object_setattr(m, '__pydantic_extra__', deepcopy(self.__pydantic_extra__, memo=memo)) # This next line doesn't need a deepcopy because __pydantic_fields_set__ is a set[str], # and attempting a deepcopy would be marginally slower. _object_setattr(m, '__pydantic_fields_set__', copy(self.__pydantic_fields_set__)) if not hasattr(self, '__pydantic_private__') or self.__pydantic_private__ is None: _object_setattr(m, '__pydantic_private__', None) else: _object_setattr( m, '__pydantic_private__', deepcopy({k: v for k, v in self.__pydantic_private__.items() if v is not PydanticUndefined}, memo=memo), ) return m if not TYPE_CHECKING: # We put `__getattr__` in a non-TYPE_CHECKING block because otherwise, mypy allows arbitrary attribute access # The same goes for __setattr__ and __delattr__, see: https://github.com/pydantic/pydantic/issues/8643 def __getattr__(self, item: str) -> Any: private_attributes = object.__getattribute__(self, '__private_attributes__') if item in private_attributes: attribute = private_attributes[item] if hasattr(attribute, '__get__'): return attribute.__get__(self, type(self)) # type: ignore try: # Note: self.__pydantic_private__ cannot be None if self.__private_attributes__ has items return self.__pydantic_private__[item] # type: ignore except KeyError as exc: raise AttributeError(f'{type(self).__name__!r} object has no attribute {item!r}') from exc else: # `__pydantic_extra__` can fail to be set if the model is not yet fully initialized. 
# See `BaseModel.__repr_args__` for more details try: pydantic_extra = object.__getattribute__(self, '__pydantic_extra__') except AttributeError: pydantic_extra = None if pydantic_extra: try: return pydantic_extra[item] except KeyError as exc: raise AttributeError(f'{type(self).__name__!r} object has no attribute {item!r}') from exc else: if hasattr(self.__class__, item): return super().__getattribute__(item) # Raises AttributeError if appropriate else: # this is the current error raise AttributeError(f'{type(self).__name__!r} object has no attribute {item!r}') def __setattr__(self, name: str, value: Any) -> None: if (setattr_handler := self.__pydantic_setattr_handlers__.get(name)) is not None: setattr_handler(self, name, value) # if None is returned from _setattr_handler, the attribute was set directly elif (setattr_handler := self._setattr_handler(name, value)) is not None: setattr_handler(self, name, value) # call here to not memo on possibly unknown fields self.__pydantic_setattr_handlers__[name] = setattr_handler # memoize the handler for faster access def _setattr_handler(self, name: str, value: Any) -> Callable[[BaseModel, str, Any], None] | None:  """Get a handler for setting an attribute on the model instance.  Returns:  A handler for setting an attribute on the model instance. Used for memoization of the handler.  Memoizing the handlers leads to a dramatic performance improvement in `__setattr__`  Returns `None` when memoization is not safe, then the attribute is set directly.  """ cls = self.__class__ if name in cls.__class_vars__: raise AttributeError( f'{name!r} is a ClassVar of `{cls.__name__}` and cannot be set on an instance. ' f'If you want to set a value on the class, use `{cls.__name__}.{name} = value`.' ) elif not _fields.is_valid_field_name(name): if (attribute := cls.__private_attributes__.get(name)) is not None: if hasattr(attribute, '__set__'): return lambda model, _name, val: attribute.__set__(model, val) else: return _SIMPLE_SETATTR_HANDLERS['private'] else: _object_setattr(self, name, value) return None # Can not return memoized handler with possibly freeform attr names attr = getattr(cls, name, None) # NOTE: We currently special case properties and `cached_property`, but we might need # to generalize this to all data/non-data descriptors at some point. For non-data descriptors # (such as `cached_property`), it isn't obvious though. `cached_property` caches the value # to the instance's `__dict__`, but other non-data descriptors might do things differently. if isinstance(attr, cached_property): return _SIMPLE_SETATTR_HANDLERS['cached_property'] _check_frozen(cls, name, value) # We allow properties to be set only on non frozen models for now (to match dataclasses). # This can be changed if it ever gets requested. 
if isinstance(attr, property): return lambda model, _name, val: attr.__set__(model, val) elif cls.model_config.get('validate_assignment'): return _SIMPLE_SETATTR_HANDLERS['validate_assignment'] elif name not in cls.__pydantic_fields__: if cls.model_config.get('extra') != 'allow': # TODO - matching error raise ValueError(f'"{cls.__name__}" object has no field "{name}"') elif attr is None: # attribute does not exist, so put it in extra self.__pydantic_extra__[name] = value return None # Can not return memoized handler with possibly freeform attr names else: # attribute _does_ exist, and was not in extra, so update it return _SIMPLE_SETATTR_HANDLERS['extra_known'] else: return _SIMPLE_SETATTR_HANDLERS['model_field'] def __delattr__(self, item: str) -> Any: cls = self.__class__ if item in self.__private_attributes__: attribute = self.__private_attributes__[item] if hasattr(attribute, '__delete__'): attribute.__delete__(self) # type: ignore return try: # Note: self.__pydantic_private__ cannot be None if self.__private_attributes__ has items del self.__pydantic_private__[item] # type: ignore return except KeyError as exc: raise AttributeError(f'{cls.__name__!r} object has no attribute {item!r}') from exc # Allow cached properties to be deleted (even if the class is frozen): attr = getattr(cls, item, None) if isinstance(attr, cached_property): return object.__delattr__(self, item) _check_frozen(cls, name=item, value=None) if item in self.__pydantic_fields__: object.__delattr__(self, item) elif self.__pydantic_extra__ is not None and item in self.__pydantic_extra__: del self.__pydantic_extra__[item] else: try: object.__delattr__(self, item) except AttributeError: raise AttributeError(f'{type(self).__name__!r} object has no attribute {item!r}') # Because we make use of `@dataclass_transform()`, `__replace__` is already synthesized by # type checkers, so we define the implementation in this `if not TYPE_CHECKING:` block: def __replace__(self, **changes: Any) -> Self: return self.model_copy(update=changes) def __getstate__(self) -> dict[Any, Any]: private = self.__pydantic_private__ if private: private = {k: v for k, v in private.items() if v is not PydanticUndefined} return { '__dict__': self.__dict__, '__pydantic_extra__': self.__pydantic_extra__, '__pydantic_fields_set__': self.__pydantic_fields_set__, '__pydantic_private__': private, } def __setstate__(self, state: dict[Any, Any]) -> None: _object_setattr(self, '__pydantic_fields_set__', state.get('__pydantic_fields_set__', {})) _object_setattr(self, '__pydantic_extra__', state.get('__pydantic_extra__', {})) _object_setattr(self, '__pydantic_private__', state.get('__pydantic_private__', {})) _object_setattr(self, '__dict__', state.get('__dict__', {})) if not TYPE_CHECKING: def __eq__(self, other: Any) -> bool: if isinstance(other, BaseModel): # When comparing instances of generic types for equality, as long as all field values are equal, # only require their generic origin types to be equal, rather than exact type equality. # This prevents headaches like MyGeneric(x=1) != MyGeneric[Any](x=1). self_type = self.__pydantic_generic_metadata__['origin'] or self.__class__ other_type = other.__pydantic_generic_metadata__['origin'] or other.__class__ # Perform common checks first if not ( self_type == other_type and getattr(self, '__pydantic_private__', None) == getattr(other, '__pydantic_private__', None) and self.__pydantic_extra__ == other.__pydantic_extra__ ): return False # We only want to compare pydantic fields but ignoring fields is costly. 
# We'll perform a fast check first, and fallback only when needed # See GH-7444 and GH-7825 for rationale and a performance benchmark # First, do the fast (and sometimes faulty) __dict__ comparison if self.__dict__ == other.__dict__: # If the check above passes, then pydantic fields are equal, we can return early return True # We don't want to trigger unnecessary costly filtering of __dict__ on all unequal objects, so we return # early if there are no keys to ignore (we would just return False later on anyway) model_fields = type(self).__pydantic_fields__.keys() if self.__dict__.keys() <= model_fields and other.__dict__.keys() <= model_fields: return False # If we reach here, there are non-pydantic-fields keys, mapped to unequal values, that we need to ignore # Resort to costly filtering of the __dict__ objects # We use operator.itemgetter because it is much faster than dict comprehensions # NOTE: Contrary to standard python class and instances, when the Model class has a default value for an # attribute and the model instance doesn't have a corresponding attribute, accessing the missing attribute # raises an error in BaseModel.__getattr__ instead of returning the class attribute # So we can use operator.itemgetter() instead of operator.attrgetter() getter = operator.itemgetter(*model_fields) if model_fields else lambda _: _utils._SENTINEL try: return getter(self.__dict__) == getter(other.__dict__) except KeyError: # In rare cases (such as when using the deprecated BaseModel.copy() method), # the __dict__ may not contain all model fields, which is how we can get here. # getter(self.__dict__) is much faster than any 'safe' method that accounts # for missing keys, and wrapping it in a `try` doesn't slow things down much # in the common case. self_fields_proxy = _utils.SafeGetItemProxy(self.__dict__) other_fields_proxy = _utils.SafeGetItemProxy(other.__dict__) return getter(self_fields_proxy) == getter(other_fields_proxy) # other instance is not a BaseModel else: return NotImplemented # delegate to the other item in the comparison if TYPE_CHECKING: # We put `__init_subclass__` in a TYPE_CHECKING block because, even though we want the type-checking benefits # described in the signature of `__init_subclass__` below, we don't want to modify the default behavior of # subclass initialization. def __init_subclass__(cls, **kwargs: Unpack[ConfigDict]):  """This signature is included purely to help type-checkers check arguments to class declaration, which  provides a way to conveniently set model_config key/value pairs.  ```python  from pydantic import BaseModel  class MyModel(BaseModel, extra='allow'): ...  ```  However, this may be deceiving, since the _actual_ calls to `__init_subclass__` will not receive any  of the config arguments, and will only receive any keyword arguments passed during class initialization  that are _not_ expected keys in ConfigDict. (This is due to the way `ModelMetaclass.__new__` works.)  Args:  **kwargs: Keyword arguments passed to the class definition, which set model_config  Note:  You may want to override `__pydantic_init_subclass__` instead, which behaves similarly but is called  *after* the class is fully initialized.  
""" def __iter__(self) -> TupleGenerator:  """So `dict(model)` works.""" yield from [(k, v) for (k, v) in self.__dict__.items() if not k.startswith('_')] extra = self.__pydantic_extra__ if extra: yield from extra.items() def __repr__(self) -> str: return f'{self.__repr_name__()}({self.__repr_str__(", ")})' def __repr_args__(self) -> _repr.ReprArgs: # Eagerly create the repr of computed fields, as this may trigger access of cached properties and as such # modify the instance's `__dict__`. If we don't do it now, it could happen when iterating over the `__dict__` # below if the instance happens to be referenced in a field, and would modify the `__dict__` size *during* iteration. computed_fields_repr_args = [ (k, getattr(self, k)) for k, v in self.__pydantic_computed_fields__.items() if v.repr ] for k, v in self.__dict__.items(): field = self.__pydantic_fields__.get(k) if field and field.repr: if v is not self: yield k, v else: yield k, self.__repr_recursion__(v) # `__pydantic_extra__` can fail to be set if the model is not yet fully initialized. # This can happen if a `ValidationError` is raised during initialization and the instance's # repr is generated as part of the exception handling. Therefore, we use `getattr` here # with a fallback, even though the type hints indicate the attribute will always be present. try: pydantic_extra = object.__getattribute__(self, '__pydantic_extra__') except AttributeError: pydantic_extra = None if pydantic_extra is not None: yield from ((k, v) for k, v in pydantic_extra.items()) yield from computed_fields_repr_args # take logic from `_repr.Representation` without the side effects of inheritance, see #5740 __repr_name__ = _repr.Representation.__repr_name__ __repr_recursion__ = _repr.Representation.__repr_recursion__ __repr_str__ = _repr.Representation.__repr_str__ __pretty__ = _repr.Representation.__pretty__ __rich_repr__ = _repr.Representation.__rich_repr__ def __str__(self) -> str: return self.__repr_str__(' ') # ##### Deprecated methods from v1 ##### @property @typing_extensions.deprecated( 'The `__fields__` attribute is deprecated, use `model_fields` instead.', category=None ) def __fields__(self) -> dict[str, FieldInfo]: warnings.warn( 'The `__fields__` attribute is deprecated, use `model_fields` instead.', category=PydanticDeprecatedSince20, stacklevel=2, ) return getattr(type(self), '__pydantic_fields__', {}) @property @typing_extensions.deprecated( 'The `__fields_set__` attribute is deprecated, use `model_fields_set` instead.', category=None, ) def __fields_set__(self) -> set[str]: warnings.warn( 'The `__fields_set__` attribute is deprecated, use `model_fields_set` instead.', category=PydanticDeprecatedSince20, stacklevel=2, ) return self.__pydantic_fields_set__ @typing_extensions.deprecated('The `dict` method is deprecated; use `model_dump` instead.', category=None) def dict( # noqa: D102 self, *, include: IncEx | None = None, exclude: IncEx | None = None, by_alias: bool = False, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, ) -> Dict[str, Any]: # noqa UP006 warnings.warn( 'The `dict` method is deprecated; use `model_dump` instead.', category=PydanticDeprecatedSince20, stacklevel=2, ) return self.model_dump( include=include, exclude=exclude, by_alias=by_alias, exclude_unset=exclude_unset, exclude_defaults=exclude_defaults, exclude_none=exclude_none, ) @typing_extensions.deprecated('The `json` method is deprecated; use `model_dump_json` instead.', category=None) def json( # noqa: D102 self, *, include: IncEx 
| None = None, exclude: IncEx | None = None, by_alias: bool = False, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, encoder: Callable[[Any], Any] | None = PydanticUndefined, # type: ignore[assignment] models_as_dict: bool = PydanticUndefined, # type: ignore[assignment] **dumps_kwargs: Any, ) -> str: warnings.warn( 'The `json` method is deprecated; use `model_dump_json` instead.', category=PydanticDeprecatedSince20, stacklevel=2, ) if encoder is not PydanticUndefined: raise TypeError('The `encoder` argument is no longer supported; use field serializers instead.') if models_as_dict is not PydanticUndefined: raise TypeError('The `models_as_dict` argument is no longer supported; use a model serializer instead.') if dumps_kwargs: raise TypeError('`dumps_kwargs` keyword arguments are no longer supported.') return self.model_dump_json( include=include, exclude=exclude, by_alias=by_alias, exclude_unset=exclude_unset, exclude_defaults=exclude_defaults, exclude_none=exclude_none, ) @classmethod @typing_extensions.deprecated('The `parse_obj` method is deprecated; use `model_validate` instead.', category=None) def parse_obj(cls, obj: Any) -> Self: # noqa: D102 warnings.warn( 'The `parse_obj` method is deprecated; use `model_validate` instead.', category=PydanticDeprecatedSince20, stacklevel=2, ) return cls.model_validate(obj) @classmethod @typing_extensions.deprecated( 'The `parse_raw` method is deprecated; if your data is JSON use `model_validate_json`, ' 'otherwise load the data then use `model_validate` instead.', category=None, ) def parse_raw( # noqa: D102 cls, b: str | bytes, *, content_type: str | None = None, encoding: str = 'utf8', proto: DeprecatedParseProtocol | None = None, allow_pickle: bool = False, ) -> Self: # pragma: no cover warnings.warn( 'The `parse_raw` method is deprecated; if your data is JSON use `model_validate_json`, ' 'otherwise load the data then use `model_validate` instead.', category=PydanticDeprecatedSince20, stacklevel=2, ) from .deprecated import parse try: obj = parse.load_str_bytes( b, proto=proto, content_type=content_type, encoding=encoding, allow_pickle=allow_pickle, ) except (ValueError, TypeError) as exc: import json # try to match V1 if isinstance(exc, UnicodeDecodeError): type_str = 'value_error.unicodedecode' elif isinstance(exc, json.JSONDecodeError): type_str = 'value_error.jsondecode' elif isinstance(exc, ValueError): type_str = 'value_error' else: type_str = 'type_error' # ctx is missing here, but since we've added `input` to the error, we're not pretending it's the same error: pydantic_core.InitErrorDetails = { # The type: ignore on the next line is to ignore the requirement of LiteralString 'type': pydantic_core.PydanticCustomError(type_str, str(exc)), # type: ignore 'loc': ('__root__',), 'input': b, } raise pydantic_core.ValidationError.from_exception_data(cls.__name__, [error]) return cls.model_validate(obj) @classmethod @typing_extensions.deprecated( 'The `parse_file` method is deprecated; load the data from file, then if your data is JSON ' 'use `model_validate_json`, otherwise `model_validate` instead.', category=None, ) def parse_file( # noqa: D102 cls, path: str | Path, *, content_type: str | None = None, encoding: str = 'utf8', proto: DeprecatedParseProtocol | None = None, allow_pickle: bool = False, ) -> Self: warnings.warn( 'The `parse_file` method is deprecated; load the data from file, then if your data is JSON ' 'use `model_validate_json`, otherwise `model_validate` instead.', 
category=PydanticDeprecatedSince20, stacklevel=2, ) from .deprecated import parse obj = parse.load_file( path, proto=proto, content_type=content_type, encoding=encoding, allow_pickle=allow_pickle, ) return cls.parse_obj(obj) @classmethod @typing_extensions.deprecated( 'The `from_orm` method is deprecated; set ' "`model_config['from_attributes']=True` and use `model_validate` instead.", category=None, ) def from_orm(cls, obj: Any) -> Self: # noqa: D102 warnings.warn( 'The `from_orm` method is deprecated; set ' "`model_config['from_attributes']=True` and use `model_validate` instead.", category=PydanticDeprecatedSince20, stacklevel=2, ) if not cls.model_config.get('from_attributes', None): raise PydanticUserError( 'You must set the config attribute `from_attributes=True` to use from_orm', code=None ) return cls.model_validate(obj) @classmethod @typing_extensions.deprecated('The `construct` method is deprecated; use `model_construct` instead.', category=None) def construct(cls, _fields_set: set[str] | None = None, **values: Any) -> Self: # noqa: D102 warnings.warn( 'The `construct` method is deprecated; use `model_construct` instead.', category=PydanticDeprecatedSince20, stacklevel=2, ) return cls.model_construct(_fields_set=_fields_set, **values) @typing_extensions.deprecated( 'The `copy` method is deprecated; use `model_copy` instead. ' 'See the docstring of `BaseModel.copy` for details about how to handle `include` and `exclude`.', category=None, ) def copy( self, *, include: AbstractSetIntStr | MappingIntStrAny | None = None, exclude: AbstractSetIntStr | MappingIntStrAny | None = None, update: Dict[str, Any] | None = None, # noqa UP006 deep: bool = False, ) -> Self: # pragma: no cover  """Returns a copy of the model.  !!! warning "Deprecated"  This method is now deprecated; use `model_copy` instead.  If you need `include` or `exclude`, use:  ```python {test="skip" lint="skip"}  data = self.model_dump(include=include, exclude=exclude, round_trip=True)  data = {**data, **(update or {})}  copied = self.model_validate(data)  ```  Args:  include: Optional set or mapping specifying which fields to include in the copied model.  exclude: Optional set or mapping specifying which fields to exclude in the copied model.  update: Optional dictionary of field-value pairs to override field values in the copied model.  deep: If True, the values of fields that are Pydantic models will be deep-copied.  Returns:  A copy of the model with included, excluded and updated fields as specified.  """ warnings.warn( 'The `copy` method is deprecated; use `model_copy` instead. 
' 'See the docstring of `BaseModel.copy` for details about how to handle `include` and `exclude`.', category=PydanticDeprecatedSince20, stacklevel=2, ) from .deprecated import copy_internals values = dict( copy_internals._iter( self, to_dict=False, by_alias=False, include=include, exclude=exclude, exclude_unset=False ), **(update or {}), ) if self.__pydantic_private__ is None: private = None else: private = {k: v for k, v in self.__pydantic_private__.items() if v is not PydanticUndefined} if self.__pydantic_extra__ is None: extra: dict[str, Any] | None = None else: extra = self.__pydantic_extra__.copy() for k in list(self.__pydantic_extra__): if k not in values: # k was in the exclude extra.pop(k) for k in list(values): if k in self.__pydantic_extra__: # k must have come from extra extra[k] = values.pop(k) # new `__pydantic_fields_set__` can have unset optional fields with a set value in `update` kwarg if update: fields_set = self.__pydantic_fields_set__ | update.keys() else: fields_set = set(self.__pydantic_fields_set__) # removing excluded fields from `__pydantic_fields_set__` if exclude: fields_set -= set(exclude) return copy_internals._copy_and_set_values(self, values, fields_set, extra, private, deep=deep) @classmethod @typing_extensions.deprecated('The `schema` method is deprecated; use `model_json_schema` instead.', category=None) def schema( # noqa: D102 cls, by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE ) -> Dict[str, Any]: # noqa UP006 warnings.warn( 'The `schema` method is deprecated; use `model_json_schema` instead.', category=PydanticDeprecatedSince20, stacklevel=2, ) return cls.model_json_schema(by_alias=by_alias, ref_template=ref_template) @classmethod @typing_extensions.deprecated( 'The `schema_json` method is deprecated; use `model_json_schema` and json.dumps instead.', category=None, ) def schema_json( # noqa: D102 cls, *, by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE, **dumps_kwargs: Any ) -> str: # pragma: no cover warnings.warn( 'The `schema_json` method is deprecated; use `model_json_schema` and json.dumps instead.', category=PydanticDeprecatedSince20, stacklevel=2, ) import json from .deprecated.json import pydantic_encoder return json.dumps( cls.model_json_schema(by_alias=by_alias, ref_template=ref_template), default=pydantic_encoder, **dumps_kwargs, ) @classmethod @typing_extensions.deprecated('The `validate` method is deprecated; use `model_validate` instead.', category=None) def validate(cls, value: Any) -> Self: # noqa: D102 warnings.warn( 'The `validate` method is deprecated; use `model_validate` instead.', category=PydanticDeprecatedSince20, stacklevel=2, ) return cls.model_validate(value) @classmethod @typing_extensions.deprecated( 'The `update_forward_refs` method is deprecated; use `model_rebuild` instead.', category=None, ) def update_forward_refs(cls, **localns: Any) -> None: # noqa: D102 warnings.warn( 'The `update_forward_refs` method is deprecated; use `model_rebuild` instead.', category=PydanticDeprecatedSince20, stacklevel=2, ) if localns: # pragma: no cover raise TypeError('`localns` arguments are not longer accepted.') cls.model_rebuild(force=True) @typing_extensions.deprecated( 'The private method `_iter` will be removed and should no longer be used.', category=None ) def _iter(self, *args: Any, **kwargs: Any) -> Any: warnings.warn( 'The private method `_iter` will be removed and should no longer be used.', category=PydanticDeprecatedSince20, stacklevel=2, ) from .deprecated import copy_internals return 
copy_internals._iter(self, *args, **kwargs) @typing_extensions.deprecated( 'The private method `_copy_and_set_values` will be removed and should no longer be used.', category=None, ) def _copy_and_set_values(self, *args: Any, **kwargs: Any) -> Any: warnings.warn( 'The private method `_copy_and_set_values` will be removed and should no longer be used.', category=PydanticDeprecatedSince20, stacklevel=2, ) from .deprecated import copy_internals return copy_internals._copy_and_set_values(self, *args, **kwargs) @classmethod @typing_extensions.deprecated( 'The private method `_get_value` will be removed and should no longer be used.', category=None, ) def _get_value(cls, *args: Any, **kwargs: Any) -> Any: warnings.warn( 'The private method `_get_value` will be removed and should no longer be used.', category=PydanticDeprecatedSince20, stacklevel=2, ) from .deprecated import copy_internals return copy_internals._get_value(cls, *args, **kwargs) @typing_extensions.deprecated( 'The private method `_calculate_keys` will be removed and should no longer be used.', category=None, ) def _calculate_keys(self, *args: Any, **kwargs: Any) -> Any: warnings.warn( 'The private method `_calculate_keys` will be removed and should no longer be used.', category=PydanticDeprecatedSince20, stacklevel=2, ) from .deprecated import copy_internals return copy_internals._calculate_keys(self, *args, **kwargs) 

__init__

__init__(**data: Any) -> None 

Raises ValidationError if the input data cannot be validated to form a valid model.

self is explicitly positional-only to allow self as a field name.

Source code in pydantic/main.py
def __init__(self, /, **data: Any) -> None:
    """Create a new model by parsing and validating input data from keyword arguments.

    Raises [`ValidationError`][pydantic_core.ValidationError] if the input data cannot be
    validated to form a valid model.

    `self` is explicitly positional-only to allow `self` as a field name.
    """
    # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks
    __tracebackhide__ = True
    validated_self = self.__pydantic_validator__.validate_python(data, self_instance=self)
    if self is not validated_self:
        warnings.warn(
            'A custom validator is returning a value other than `self`.\n'
            "Returning anything other than `self` from a top level model validator isn't supported when validating via `__init__`.\n"
            'See the `model_validator` docs (https://docs.pydantic.dev/latest/concepts/validators/#model-validators) for more details.',
            stacklevel=2,
        )
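
For illustration, here is a minimal sketch of the behaviour described above; the `User` model and its fields are assumptions made for this example, not part of the pydantic source:

```python
from pydantic import BaseModel, ValidationError


class User(BaseModel):  # hypothetical model, for illustration only
    id: int
    name: str = 'Jane Doe'


user = User(id='123')  # '123' is coerced to the int 123 during validation
print(user.id, user.name)
#> 123 Jane Doe

try:
    User(id='not an int')
except ValidationError as exc:
    print(exc.errors()[0]['type'])
    #> int_parsing
```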

model_config class-attribute

model_config: ConfigDict = ConfigDict() 

Configuration for the model; it should be a dictionary conforming to ConfigDict.

model_fields classmethod

model_fields() -> dict[str, FieldInfo] 

A mapping of field names to their respective FieldInfo instances.

Warning

Accessing this attribute from a model instance is deprecated, and will not work in Pydantic V3. Instead, you should access this attribute from the model class.

Source code in pydantic/main.py
@_utils.deprecated_instance_property
@classmethod
def model_fields(cls) -> dict[str, FieldInfo]:
    """A mapping of field names to their respective [`FieldInfo`][pydantic.fields.FieldInfo] instances.

    !!! warning
        Accessing this attribute from a model instance is deprecated, and will not work in Pydantic V3.
        Instead, you should access this attribute from the model class.
    """
    return getattr(cls, '__pydantic_fields__', {})
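
As a short sketch of the access pattern described above (the `User` model is an illustrative assumption):

```python
from pydantic import BaseModel


class User(BaseModel):  # hypothetical model, for illustration only
    id: int
    name: str = 'Jane Doe'


# Access the mapping on the class; instance access is deprecated.
print(list(User.model_fields))
#> ['id', 'name']
print(User.model_fields['name'].default)
#> Jane Doe
```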

model_computed_fields classmethod

model_computed_fields() -> dict[str, ComputedFieldInfo] 

A mapping of computed field names to their respective ComputedFieldInfo instances.

Warning

Accessing this attribute from a model instance is deprecated, and will not work in Pydantic V3. Instead, you should access this attribute from the model class.

Source code in pydantic/main.py
@_utils.deprecated_instance_property
@classmethod
def model_computed_fields(cls) -> dict[str, ComputedFieldInfo]:
    """A mapping of computed field names to their respective [`ComputedFieldInfo`][pydantic.fields.ComputedFieldInfo] instances.

    !!! warning
        Accessing this attribute from a model instance is deprecated, and will not work in Pydantic V3.
        Instead, you should access this attribute from the model class.
    """
    return getattr(cls, '__pydantic_computed_fields__', {})
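
A minimal sketch showing the mapping for a model with a computed field; the `Rectangle` model is an illustrative assumption:

```python
from pydantic import BaseModel, computed_field


class Rectangle(BaseModel):  # hypothetical model, for illustration only
    width: int
    height: int

    @computed_field
    @property
    def area(self) -> int:
        return self.width * self.height


print(list(Rectangle.model_computed_fields))
#> ['area']
```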

__pydantic_core_schema__ class-attribute

__pydantic_core_schema__: CoreSchema 

The core schema of the model.

model_extra property

model_extra: dict[str, Any] | None 

Get extra fields set during validation.

Returns:

Type Description
dict[str, Any] | None

A dictionary of extra fields, or None if config.extra is not set to "allow".
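
For example, assuming a hypothetical `Pet` model configured with `extra='allow'` (the model is not part of the source, just an illustration):

```python
from pydantic import BaseModel, ConfigDict


class Pet(BaseModel):  # hypothetical model, for illustration only
    model_config = ConfigDict(extra='allow')

    name: str


pet = Pet(name='Rex', breed='collie')  # 'breed' is not a declared field
print(pet.model_extra)
#> {'breed': 'collie'}
```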

model_fields_set property

model_fields_set: set[str] 

Returns the set of fields that have been explicitly set on this model instance.

Returns:

Type Description
set[str]

A set of strings representing the fields that have been set, i.e. that were not filled from defaults.
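
A quick sketch of the distinction between explicitly set fields and defaults, using an illustrative `User` model:

```python
from pydantic import BaseModel


class User(BaseModel):  # hypothetical model, for illustration only
    id: int
    name: str = 'Jane Doe'


user = User(id=1)
print(user.model_fields_set)  # 'name' was filled from its default, so it is not included
#> {'id'}
```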

model_construct classmethod

model_construct( _fields_set: set[str] | None = None, **values: Any ) -> Self 

Creates a new instance of the Model class with validated data.

Creates a new model setting __dict__ and __pydantic_fields_set__ from trusted or pre-validated data. Default values are respected, but no other validation is performed.

Note

model_construct() generally respects the model_config.extra setting on the provided model. That is, if model_config.extra == 'allow', then all extra passed values are added to the model instance's __dict__ and __pydantic_extra__ fields. If model_config.extra == 'ignore' (the default), then all extra passed values are ignored. Because no validation is performed with a call to model_construct(), having model_config.extra == 'forbid' does not result in an error if extra values are passed, but they will be ignored.

Parameters:

Name Type Description Default
_fields_set set[str] | None

A set of field names that were originally explicitly set during instantiation. If provided, this is directly used for the model_fields_set attribute. Otherwise, the field names from the values argument will be used.

None
values Any

Trusted or pre-validated data dictionary.

{}

Returns:

Type Description
Self

A new instance of the Model class with validated data.

Source code in pydantic/main.py
@classmethod def model_construct(cls, _fields_set: set[str] | None = None, **values: Any) -> Self: # noqa: C901  """Creates a new instance of the `Model` class with validated data.  Creates a new model setting `__dict__` and `__pydantic_fields_set__` from trusted or pre-validated data.  Default values are respected, but no other validation is performed.  !!! note  `model_construct()` generally respects the `model_config.extra` setting on the provided model.  That is, if `model_config.extra == 'allow'`, then all extra passed values are added to the model instance's `__dict__`  and `__pydantic_extra__` fields. If `model_config.extra == 'ignore'` (the default), then all extra passed values are ignored.  Because no validation is performed with a call to `model_construct()`, having `model_config.extra == 'forbid'` does not result in  an error if extra values are passed, but they will be ignored.  Args:  _fields_set: A set of field names that were originally explicitly set during instantiation. If provided,  this is directly used for the [`model_fields_set`][pydantic.BaseModel.model_fields_set] attribute.  Otherwise, the field names from the `values` argument will be used.  values: Trusted or pre-validated data dictionary.  Returns:  A new instance of the `Model` class with validated data.  """ m = cls.__new__(cls) fields_values: dict[str, Any] = {} fields_set = set() for name, field in cls.__pydantic_fields__.items(): if field.alias is not None and field.alias in values: fields_values[name] = values.pop(field.alias) fields_set.add(name) if (name not in fields_set) and (field.validation_alias is not None): validation_aliases: list[str | AliasPath] = ( field.validation_alias.choices if isinstance(field.validation_alias, AliasChoices) else [field.validation_alias] ) for alias in validation_aliases: if isinstance(alias, str) and alias in values: fields_values[name] = values.pop(alias) fields_set.add(name) break elif isinstance(alias, AliasPath): value = alias.search_dict_for_path(values) if value is not PydanticUndefined: fields_values[name] = value fields_set.add(name) break if name not in fields_set: if name in values: fields_values[name] = values.pop(name) fields_set.add(name) elif not field.is_required(): fields_values[name] = field.get_default(call_default_factory=True, validated_data=fields_values) if _fields_set is None: _fields_set = fields_set _extra: dict[str, Any] | None = values if cls.model_config.get('extra') == 'allow' else None _object_setattr(m, '__dict__', fields_values) _object_setattr(m, '__pydantic_fields_set__', _fields_set) if not cls.__pydantic_root_model__: _object_setattr(m, '__pydantic_extra__', _extra) if cls.__pydantic_post_init__: m.model_post_init(None) # update private attributes with values set if hasattr(m, '__pydantic_private__') and m.__pydantic_private__ is not None: for k, v in values.items(): if k in m.__private_attributes__: m.__pydantic_private__[k] = v elif not cls.__pydantic_root_model__: # Note: if there are any private attributes, cls.__pydantic_post_init__ would exist # Since it doesn't, that means that `__pydantic_private__` should be set to None _object_setattr(m, '__pydantic_private__', None) return m 
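
To illustrate the "no validation" behaviour described above, here is a minimal sketch with a hypothetical `User` model:

```python
from pydantic import BaseModel


class User(BaseModel):  # hypothetical model, for illustration only
    id: int
    name: str = 'Jane Doe'


# No validation is performed: the string is stored as-is and defaults are applied.
user = User.model_construct(id='not validated')
print(repr(user))
#> User(id='not validated', name='Jane Doe')
```

Because validation is skipped, `model_construct()` is best reserved for data that is already validated or comes from a trusted source.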

model_copy

model_copy( *, update: Mapping[str, Any] | None = None, deep: bool = False ) -> Self 

Usage Documentation

model_copy

Returns a copy of the model.

Note

The underlying instance's __dict__ attribute is copied. This might have unexpected side effects if you store anything in it, on top of the model fields (e.g. the value of cached properties).

Parameters:

Name Type Description Default
update Mapping[str, Any] | None

Values to change/add in the new model. Note: the data is not validated before creating the new model. You should trust this data.

None
deep bool

Set to True to make a deep copy of the model.

False

Returns:

Type Description
Self

New model instance.

Source code in pydantic/main.py
def model_copy(self, *, update: Mapping[str, Any] | None = None, deep: bool = False) -> Self:
    """!!! abstract "Usage Documentation"
        [`model_copy`](../concepts/serialization.md#model_copy)

    Returns a copy of the model.

    !!! note
        The underlying instance's [`__dict__`][object.__dict__] attribute is copied. This
        might have unexpected side effects if you store anything in it, on top of the model
        fields (e.g. the value of [cached properties][functools.cached_property]).

    Args:
        update: Values to change/add in the new model. Note: the data is not validated
            before creating the new model. You should trust this data.
        deep: Set to `True` to make a deep copy of the model.

    Returns:
        New model instance.
    """
    copied = self.__deepcopy__() if deep else self.__copy__()
    if update:
        if self.model_config.get('extra') == 'allow':
            for k, v in update.items():
                if k in self.__pydantic_fields__:
                    copied.__dict__[k] = v
                else:
                    if copied.__pydantic_extra__ is None:
                        copied.__pydantic_extra__ = {}
                    copied.__pydantic_extra__[k] = v
        else:
            copied.__dict__.update(update)
        copied.__pydantic_fields_set__.update(update.keys())
    return copied
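
A minimal sketch of copying with `update` and `deep`, using an illustrative `User` model; note that the `update` data is applied without validation:

```python
from pydantic import BaseModel


class User(BaseModel):  # hypothetical model, for illustration only
    id: int
    name: str = 'Jane Doe'


user = User(id=1)
updated = user.model_copy(update={'name': 'John Doe'})  # `update` is not validated
print(repr(updated))
#> User(id=1, name='John Doe')
print(repr(user.model_copy(deep=True)))
#> User(id=1, name='Jane Doe')
```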

model_dump

model_dump( *, mode: Literal["json", "python"] | str = "python", include: IncEx | None = None, exclude: IncEx | None = None, context: Any | None = None, by_alias: bool | None = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, round_trip: bool = False, warnings: ( bool | Literal["none", "warn", "error"] ) = True, fallback: Callable[[Any], Any] | None = None, serialize_as_any: bool = False ) -> dict[str, Any] 

Usage Documentation

model_dump

Generate a dictionary representation of the model, optionally specifying which fields to include or exclude.

Parameters:

Name Type Description Default
mode Literal['json', 'python'] | str

The mode in which to_python should run. If mode is 'json', the output will only contain JSON serializable types. If mode is 'python', the output may contain non-JSON-serializable Python objects.

'python'
include IncEx | None

A set of fields to include in the output.

None
exclude IncEx | None

A set of fields to exclude from the output.

None
context Any | None

Additional context to pass to the serializer.

None
by_alias bool | None

Whether to use the field's alias in the dictionary key if defined.

None
exclude_unset bool

Whether to exclude fields that have not been explicitly set.

False
exclude_defaults bool

Whether to exclude fields that are set to their default value.

False
exclude_none bool

Whether to exclude fields that have a value of None.

False
round_trip bool

If True, dumped values should be valid as input for non-idempotent types such as Json[T].

False
warnings bool | Literal['none', 'warn', 'error']

How to handle serialization errors. False/"none" ignores them, True/"warn" logs errors, "error" raises a PydanticSerializationError.

True
fallback Callable[[Any], Any] | None

A function to call when an unknown value is encountered. If not provided, a PydanticSerializationError error is raised.

None
serialize_as_any bool

Whether to serialize fields with duck-typing serialization behavior.

False

Returns:

Type Description
dict[str, Any]

A dictionary representation of the model.

Source code in pydantic/main.py
def model_dump( self, *, mode: Literal['json', 'python'] | str = 'python', include: IncEx | None = None, exclude: IncEx | None = None, context: Any | None = None, by_alias: bool | None = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, round_trip: bool = False, warnings: bool | Literal['none', 'warn', 'error'] = True, fallback: Callable[[Any], Any] | None = None, serialize_as_any: bool = False, ) -> dict[str, Any]:  """!!! abstract "Usage Documentation"  [`model_dump`](../concepts/serialization.md#modelmodel_dump)  Generate a dictionary representation of the model, optionally specifying which fields to include or exclude.  Args:  mode: The mode in which `to_python` should run.  If mode is 'json', the output will only contain JSON serializable types.  If mode is 'python', the output may contain non-JSON-serializable Python objects.  include: A set of fields to include in the output.  exclude: A set of fields to exclude from the output.  context: Additional context to pass to the serializer.  by_alias: Whether to use the field's alias in the dictionary key if defined.  exclude_unset: Whether to exclude fields that have not been explicitly set.  exclude_defaults: Whether to exclude fields that are set to their default value.  exclude_none: Whether to exclude fields that have a value of `None`.  round_trip: If True, dumped values should be valid as input for non-idempotent types such as Json[T].  warnings: How to handle serialization errors. False/"none" ignores them, True/"warn" logs errors,  "error" raises a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError].  fallback: A function to call when an unknown value is encountered. If not provided,  a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError] error is raised.  serialize_as_any: Whether to serialize fields with duck-typing serialization behavior.  Returns:  A dictionary representation of the model.  """ return self.__pydantic_serializer__.to_python( self, mode=mode, by_alias=by_alias, include=include, exclude=exclude, context=context, exclude_unset=exclude_unset, exclude_defaults=exclude_defaults, exclude_none=exclude_none, round_trip=round_trip, warnings=warnings, fallback=fallback, serialize_as_any=serialize_as_any, ) 
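
A short sketch contrasting `mode='python'` and `mode='json'` output, plus `include`; the `Event` model is an illustrative assumption:

```python
from datetime import date

from pydantic import BaseModel


class Event(BaseModel):  # hypothetical model, for illustration only
    name: str
    when: date


event = Event(name='launch', when=date(2020, 1, 1))
print(event.model_dump())
#> {'name': 'launch', 'when': datetime.date(2020, 1, 1)}
print(event.model_dump(mode='json'))  # only JSON-serializable types
#> {'name': 'launch', 'when': '2020-01-01'}
print(event.model_dump(include={'name'}))
#> {'name': 'launch'}
```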

model_dump_json

model_dump_json( *, indent: int | None = None, include: IncEx | None = None, exclude: IncEx | None = None, context: Any | None = None, by_alias: bool | None = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, round_trip: bool = False, warnings: ( bool | Literal["none", "warn", "error"] ) = True, fallback: Callable[[Any], Any] | None = None, serialize_as_any: bool = False ) -> str 

Usage Documentation

model_dump_json

Generates a JSON representation of the model using Pydantic's to_json method.

Parameters:

Name Type Description Default
indent int | None

Indentation to use in the JSON output. If None is passed, the output will be compact.

None
include IncEx | None

Field(s) to include in the JSON output.

None
exclude IncEx | None

Field(s) to exclude from the JSON output.

None
context Any | None

Additional context to pass to the serializer.

None
by_alias bool | None

Whether to serialize using field aliases.

None
exclude_unset bool

Whether to exclude fields that have not been explicitly set.

False
exclude_defaults bool

Whether to exclude fields that are set to their default value.

False
exclude_none bool

Whether to exclude fields that have a value of None.

False
round_trip bool

If True, dumped values should be valid as input for non-idempotent types such as Json[T].

False
warnings bool | Literal['none', 'warn', 'error']

How to handle serialization errors. False/"none" ignores them, True/"warn" logs errors, "error" raises a PydanticSerializationError.

True
fallback Callable[[Any], Any] | None

A function to call when an unknown value is encountered. If not provided, a PydanticSerializationError error is raised.

None
serialize_as_any bool

Whether to serialize fields with duck-typing serialization behavior.

False

Returns:

Type Description
str

A JSON string representation of the model.

Source code in pydantic/main.py
def model_dump_json( self, *, indent: int | None = None, include: IncEx | None = None, exclude: IncEx | None = None, context: Any | None = None, by_alias: bool | None = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, round_trip: bool = False, warnings: bool | Literal['none', 'warn', 'error'] = True, fallback: Callable[[Any], Any] | None = None, serialize_as_any: bool = False, ) -> str:  """!!! abstract "Usage Documentation"  [`model_dump_json`](../concepts/serialization.md#modelmodel_dump_json)  Generates a JSON representation of the model using Pydantic's `to_json` method.  Args:  indent: Indentation to use in the JSON output. If None is passed, the output will be compact.  include: Field(s) to include in the JSON output.  exclude: Field(s) to exclude from the JSON output.  context: Additional context to pass to the serializer.  by_alias: Whether to serialize using field aliases.  exclude_unset: Whether to exclude fields that have not been explicitly set.  exclude_defaults: Whether to exclude fields that are set to their default value.  exclude_none: Whether to exclude fields that have a value of `None`.  round_trip: If True, dumped values should be valid as input for non-idempotent types such as Json[T].  warnings: How to handle serialization errors. False/"none" ignores them, True/"warn" logs errors,  "error" raises a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError].  fallback: A function to call when an unknown value is encountered. If not provided,  a [`PydanticSerializationError`][pydantic_core.PydanticSerializationError] error is raised.  serialize_as_any: Whether to serialize fields with duck-typing serialization behavior.  Returns:  A JSON string representation of the model.  """ return self.__pydantic_serializer__.to_json( self, indent=indent, include=include, exclude=exclude, context=context, by_alias=by_alias, exclude_unset=exclude_unset, exclude_defaults=exclude_defaults, exclude_none=exclude_none, round_trip=round_trip, warnings=warnings, fallback=fallback, serialize_as_any=serialize_as_any, ).decode() 
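
A minimal sketch of compact versus indented JSON output, using an illustrative `User` model:

```python
from pydantic import BaseModel


class User(BaseModel):  # hypothetical model, for illustration only
    id: int
    name: str = 'Jane Doe'


user = User(id=1)
print(user.model_dump_json())
#> {"id":1,"name":"Jane Doe"}
print(user.model_dump_json(indent=2, exclude={'name'}))
#> {
#>   "id": 1
#> }
```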

model_json_schema classmethod

model_json_schema( by_alias: bool = True, ref_template: str = DEFAULT_REF_TEMPLATE, schema_generator: type[ GenerateJsonSchema ] = GenerateJsonSchema, mode: JsonSchemaMode = "validation", ) -> dict[str, Any] 

Generates a JSON schema for a model class.

Parameters:

Name Type Description Default
by_alias bool

Whether to use attribute aliases or not.

True
ref_template str

The reference template.

DEFAULT_REF_TEMPLATE
schema_generator type[GenerateJsonSchema]

To override the logic used to generate the JSON schema, pass a subclass of GenerateJsonSchema with your desired modifications.

GenerateJsonSchema
mode JsonSchemaMode

The mode in which to generate the schema.

'validation'

Returns:

Type Description
dict[str, Any]

The JSON schema for the given model class.

Source code in pydantic/main.py
@classmethod
def model_json_schema(
    cls,
    by_alias: bool = True,
    ref_template: str = DEFAULT_REF_TEMPLATE,
    schema_generator: type[GenerateJsonSchema] = GenerateJsonSchema,
    mode: JsonSchemaMode = 'validation',
) -> dict[str, Any]:
    """Generates a JSON schema for a model class.

    Args:
        by_alias: Whether to use attribute aliases or not.
        ref_template: The reference template.
        schema_generator: To override the logic used to generate the JSON schema, as a subclass of
            `GenerateJsonSchema` with your desired modifications
        mode: The mode in which to generate the schema.

    Returns:
        The JSON schema for the given model class.
    """
    return model_json_schema(
        cls, by_alias=by_alias, ref_template=ref_template, schema_generator=schema_generator, mode=mode
    )
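
A quick sketch of generating and inspecting a schema; the `User` model and its field metadata are illustrative assumptions:

```python
from pydantic import BaseModel, Field


class User(BaseModel):  # hypothetical model, for illustration only
    id: int
    name: str = Field(default='Jane Doe', description='The display name')


schema = User.model_json_schema()
print(schema['title'], schema['required'])
#> User ['id']
print(schema['properties']['name']['description'])
#> The display name
```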

model_parametrized_name classmethod

model_parametrized_name( params: tuple[type[Any], ...] ) -> str 

Compute the class name for parametrizations of generic classes.

This method can be overridden to achieve a custom naming scheme for generic BaseModels.

Parameters:

Name Type Description Default
params tuple[type[Any], ...]

Tuple of types of the class. Given a generic class Model with 2 type variables and a concrete model Model[str, int], the value (str, int) would be passed to params.

required

Returns:

Type Description
str

String representing the new class where params are passed to cls as type variables.

Raises:

Type Description
TypeError

Raised when trying to generate concrete names for non-generic models.

Source code in pydantic/main.py
@classmethod
def model_parametrized_name(cls, params: tuple[type[Any], ...]) -> str:
    """Compute the class name for parametrizations of generic classes.

    This method can be overridden to achieve a custom naming scheme for generic BaseModels.

    Args:
        params: Tuple of types of the class. Given a generic class
            `Model` with 2 type variables and a concrete model `Model[str, int]`,
            the value `(str, int)` would be passed to `params`.

    Returns:
        String representing the new class where `params` are passed to `cls` as type variables.

    Raises:
        TypeError: Raised when trying to generate concrete names for non-generic models.
    """
    if not issubclass(cls, typing.Generic):
        raise TypeError('Concrete names should only be generated for generic models.')

    # Any strings received should represent forward references, so we handle them specially below.
    # If we eventually move toward wrapping them in a ForwardRef in __class_getitem__ in the future,
    # we may be able to remove this special case.
    param_names = [param if isinstance(param, str) else _repr.display_as_type(param) for param in params]
    params_component = ', '.join(param_names)
    return f'{cls.__name__}[{params_component}]'
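
A sketch of overriding the naming scheme for a generic model; `Response` and its naming rule are assumptions made for this example:

```python
from typing import Any, Generic, TypeVar

from pydantic import BaseModel

T = TypeVar('T')


class Response(BaseModel, Generic[T]):  # hypothetical generic model, for illustration only
    data: T

    @classmethod
    def model_parametrized_name(cls, params: tuple[type[Any], ...]) -> str:
        # Custom naming: Response[int] becomes 'IntResponse'.
        return f'{params[0].__name__.title()}Response'


print(Response[int].__name__)
#> IntResponse
```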

model_post_init

model_post_init(context: Any) -> None 

Override this method to perform additional initialization after __init__ and model_construct. This is useful if you want to do some validation that requires the entire model to be initialized.

Source code in pydantic/main.py
def model_post_init(self, context: Any, /) -> None:
    """Override this method to perform additional initialization after `__init__` and `model_construct`.
    This is useful if you want to do some validation that requires the entire model to be initialized.
    """
    pass
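
A minimal sketch of overriding the hook to derive a value once the whole model is available; the `Box` model is an illustrative assumption:

```python
from typing import Any

from pydantic import BaseModel


class Box(BaseModel):  # hypothetical model, for illustration only
    width: int
    height: int
    area: int = 0

    def model_post_init(self, context: Any, /) -> None:
        # Runs after validation, once the whole model is initialized.
        self.area = self.width * self.height


print(Box(width=2, height=3).area)
#> 6
```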

model_rebuild classmethod

model_rebuild( *, force: bool = False, raise_errors: bool = True, _parent_namespace_depth: int = 2, _types_namespace: MappingNamespace | None = None ) -> bool | None 

Try to rebuild the pydantic-core schema for the model.

This may be necessary when one of the annotations is a ForwardRef which could not be resolved during the initial attempt to build the schema, and automatic rebuilding fails.

Parameters:

Name Type Description Default
force bool

Whether to force the rebuilding of the model schema, defaults to False.

False
raise_errors bool

Whether to raise errors, defaults to True.

True
_parent_namespace_depth int

The depth level of the parent namespace, defaults to 2.

2
_types_namespace MappingNamespace | None

The types namespace, defaults to None.

None

Returns:

Type Description
bool | None

Returns None if the schema is already "complete" and rebuilding was not required.

bool | None

If rebuilding was required, returns True if rebuilding was successful, otherwise False.

Source code in pydantic/main.py
@classmethod def model_rebuild( cls, *, force: bool = False, raise_errors: bool = True, _parent_namespace_depth: int = 2, _types_namespace: MappingNamespace | None = None, ) -> bool | None:  """Try to rebuild the pydantic-core schema for the model.  This may be necessary when one of the annotations is a ForwardRef which could not be resolved during  the initial attempt to build the schema, and automatic rebuilding fails.  Args:  force: Whether to force the rebuilding of the model schema, defaults to `False`.  raise_errors: Whether to raise errors, defaults to `True`.  _parent_namespace_depth: The depth level of the parent namespace, defaults to 2.  _types_namespace: The types namespace, defaults to `None`.  Returns:  Returns `None` if the schema is already "complete" and rebuilding was not required.  If rebuilding _was_ required, returns `True` if rebuilding was successful, otherwise `False`.  """ if not force and cls.__pydantic_complete__: return None for attr in ('__pydantic_core_schema__', '__pydantic_validator__', '__pydantic_serializer__'): if attr in cls.__dict__ and not isinstance(getattr(cls, attr), _mock_val_ser.MockValSer): # Deleting the validator/serializer is necessary as otherwise they can get reused in # pydantic-core. We do so only if they aren't mock instances, otherwise — as `model_rebuild()` # isn't thread-safe — concurrent model instantiations can lead to the parent validator being used. # Same applies for the core schema that can be reused in schema generation. delattr(cls, attr) cls.__pydantic_complete__ = False if _types_namespace is not None: rebuild_ns = _types_namespace elif _parent_namespace_depth > 0: rebuild_ns = _typing_extra.parent_frame_namespace(parent_depth=_parent_namespace_depth, force=True) or {} else: rebuild_ns = {} parent_ns = _model_construction.unpack_lenient_weakvaluedict(cls.__pydantic_parent_namespace__) or {} ns_resolver = _namespace_utils.NsResolver( parent_namespace={**rebuild_ns, **parent_ns}, ) if not cls.__pydantic_fields_complete__: typevars_map = _generics.get_model_typevars_map(cls) try: cls.__pydantic_fields__ = _fields.rebuild_model_fields( cls, ns_resolver=ns_resolver, typevars_map=typevars_map, ) except NameError as e: exc = PydanticUndefinedAnnotation.from_name_error(e) _mock_val_ser.set_model_mocks(cls, f'`{exc.name}`') if raise_errors: raise exc from e if not raise_errors and not cls.__pydantic_fields_complete__: # No need to continue with schema gen, it is guaranteed to fail return False assert cls.__pydantic_fields_complete__ return _model_construction.complete_model_class( cls, _config.ConfigWrapper(cls.model_config, check=False), raise_errors=raise_errors, ns_resolver=ns_resolver, ) 
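
A sketch of the forward-reference scenario described above; the `Foo` and `Bar` models are illustrative assumptions:

```python
from pydantic import BaseModel


class Foo(BaseModel):  # hypothetical models, for illustration only
    x: 'Bar'  # forward reference that cannot be resolved when Foo is defined


class Bar(BaseModel):
    y: int = 0


print(Foo.model_rebuild())  # the forward reference can now be resolved
#> True
print(repr(Foo(x={'y': 1})))
#> Foo(x=Bar(y=1))
```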

model_validate classmethod

model_validate( obj: Any, *, strict: bool | None = None, from_attributes: bool | None = None, context: Any | None = None, by_alias: bool | None = None, by_name: bool | None = None ) -> Self 

Validate a pydantic model instance.

Parameters:

Name Type Description Default
obj Any

The object to validate.

required
strict bool | None

Whether to enforce types strictly.

None
from_attributes bool | None

Whether to extract data from object attributes.

None
context Any | None

Additional context to pass to the validator.

None
by_alias bool | None

Whether to use the field's alias when validating against the provided input data.

None
by_name bool | None

Whether to use the field's name when validating against the provided input data.

None

Raises:

Type Description
ValidationError

If the object could not be validated.

Returns:

Type Description
Self

The validated model instance.

Source code in pydantic/main.py
@classmethod def model_validate( cls, obj: Any, *, strict: bool | None = None, from_attributes: bool | None = None, context: Any | None = None, by_alias: bool | None = None, by_name: bool | None = None, ) -> Self:  """Validate a pydantic model instance.  Args:  obj: The object to validate.  strict: Whether to enforce types strictly.  from_attributes: Whether to extract data from object attributes.  context: Additional context to pass to the validator.  by_alias: Whether to use the field's alias when validating against the provided input data.  by_name: Whether to use the field's name when validating against the provided input data.  Raises:  ValidationError: If the object could not be validated.  Returns:  The validated model instance.  """ # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks __tracebackhide__ = True if by_alias is False and by_name is not True: raise PydanticUserError( 'At least one of `by_alias` or `by_name` must be set to True.', code='validate-by-alias-and-name-false', ) return cls.__pydantic_validator__.validate_python( obj, strict=strict, from_attributes=from_attributes, context=context, by_alias=by_alias, by_name=by_name ) 
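
A short sketch of validating a plain dictionary, using an illustrative `User` model:

```python
from pydantic import BaseModel, ValidationError


class User(BaseModel):  # hypothetical model, for illustration only
    id: int
    name: str = 'Jane Doe'


print(repr(User.model_validate({'id': '1'})))
#> User(id=1, name='Jane Doe')

try:
    User.model_validate({'id': 'not an int'})
except ValidationError as exc:
    print(exc.error_count())
    #> 1
```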

model_validate_json classmethod

model_validate_json( json_data: str | bytes | bytearray, *, strict: bool | None = None, context: Any | None = None, by_alias: bool | None = None, by_name: bool | None = None ) -> Self 

Usage Documentation

JSON Parsing

Validate the given JSON data against the Pydantic model.

Parameters:

Name Type Description Default
json_data str | bytes | bytearray

The JSON data to validate.

required
strict bool | None

Whether to enforce types strictly.

None
context Any | None

Extra variables to pass to the validator.

None
by_alias bool | None

Whether to use the field's alias when validating against the provided input data.

None
by_name bool | None

Whether to use the field's name when validating against the provided input data.

None

Returns:

Type Description
Self

The validated Pydantic model.

Raises:

Type Description
ValidationError

If json_data is not a JSON string or the object could not be validated.

Source code in pydantic/main.py
@classmethod def model_validate_json( cls, json_data: str | bytes | bytearray, *, strict: bool | None = None, context: Any | None = None, by_alias: bool | None = None, by_name: bool | None = None, ) -> Self:  """!!! abstract "Usage Documentation"  [JSON Parsing](../concepts/json.md#json-parsing)  Validate the given JSON data against the Pydantic model.  Args:  json_data: The JSON data to validate.  strict: Whether to enforce types strictly.  context: Extra variables to pass to the validator.  by_alias: Whether to use the field's alias when validating against the provided input data.  by_name: Whether to use the field's name when validating against the provided input data.  Returns:  The validated Pydantic model.  Raises:  ValidationError: If `json_data` is not a JSON string or the object could not be validated.  """ # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks __tracebackhide__ = True if by_alias is False and by_name is not True: raise PydanticUserError( 'At least one of `by_alias` or `by_name` must be set to True.', code='validate-by-alias-and-name-false', ) return cls.__pydantic_validator__.validate_json( json_data, strict=strict, context=context, by_alias=by_alias, by_name=by_name ) 
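
A minimal sketch of parsing and validating JSON directly, with an illustrative `User` model:

```python
from pydantic import BaseModel, ValidationError


class User(BaseModel):  # hypothetical model, for illustration only
    id: int
    name: str = 'Jane Doe'


print(repr(User.model_validate_json('{"id": 1}')))
#> User(id=1, name='Jane Doe')

try:
    User.model_validate_json('not valid json')
except ValidationError as exc:
    print(exc.errors()[0]['type'])
    #> json_invalid
```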

model_validate_strings classmethod

model_validate_strings( obj: Any, *, strict: bool | None = None, context: Any | None = None, by_alias: bool | None = None, by_name: bool | None = None ) -> Self 

Validate the given object with string data against the Pydantic model.

Parameters:

Name Type Description Default
obj Any

The object containing string data to validate.

required
strict bool | None

Whether to enforce types strictly.

None
context Any | None

Extra variables to pass to the validator.

None
by_alias bool | None

Whether to use the field's alias when validating against the provided input data.

None
by_name bool | None

Whether to use the field's name when validating against the provided input data.

None

Returns:

Type Description
Self

The validated Pydantic model.

Source code in pydantic/main.py
@classmethod def model_validate_strings( cls, obj: Any, *, strict: bool | None = None, context: Any | None = None, by_alias: bool | None = None, by_name: bool | None = None, ) -> Self:  """Validate the given object with string data against the Pydantic model.  Args:  obj: The object containing string data to validate.  strict: Whether to enforce types strictly.  context: Extra variables to pass to the validator.  by_alias: Whether to use the field's alias when validating against the provided input data.  by_name: Whether to use the field's name when validating against the provided input data.  Returns:  The validated Pydantic model.  """ # `__tracebackhide__` tells pytest and some other tools to omit this function from tracebacks __tracebackhide__ = True if by_alias is False and by_name is not True: raise PydanticUserError( 'At least one of `by_alias` or `by_name` must be set to True.', code='validate-by-alias-and-name-false', ) return cls.__pydantic_validator__.validate_strings( obj, strict=strict, context=context, by_alias=by_alias, by_name=by_name ) 
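
A sketch of validating data where every value is a string (for example, from environment variables or query parameters); the `Event` model is an illustrative assumption:

```python
from datetime import date

from pydantic import BaseModel


class Event(BaseModel):  # hypothetical model, for illustration only
    when: date
    attendees: int


event = Event.model_validate_strings({'when': '2020-01-01', 'attendees': '5'})
print(repr(event))
#> Event(when=datetime.date(2020, 1, 1), attendees=5)
```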

pydantic.create_model

create_model( model_name: str, /, *, __config__: ConfigDict | None = None, __doc__: str | None = None, __base__: None = None, __module__: str = __name__, __validators__: ( dict[str, Callable[..., Any]] | None ) = None, __cls_kwargs__: dict[str, Any] | None = None, **field_definitions: Any | tuple[str, Any], ) -> type[BaseModel] 
create_model( model_name: str, /, *, __config__: ConfigDict | None = None, __doc__: str | None = None, __base__: type[ModelT] | tuple[type[ModelT], ...], __module__: str = __name__, __validators__: ( dict[str, Callable[..., Any]] | None ) = None, __cls_kwargs__: dict[str, Any] | None = None, **field_definitions: Any | tuple[str, Any], ) -> type[ModelT] 
create_model( model_name: str, /, *, __config__: ConfigDict | None = None, __doc__: str | None = None, __base__: ( type[ModelT] | tuple[type[ModelT], ...] | None ) = None, __module__: str | None = None, __validators__: ( dict[str, Callable[..., Any]] | None ) = None, __cls_kwargs__: dict[str, Any] | None = None, **field_definitions: Any | tuple[str, Any], ) -> type[ModelT] 

Usage Documentation

Dynamic Model Creation

Dynamically creates and returns a new Pydantic model; in other words, create_model dynamically creates a subclass of BaseModel.

Parameters:

Name Type Description Default
model_name str

The name of the newly created model.

required
__config__ ConfigDict | None

The configuration of the new model.

None
__doc__ str | None

The docstring of the new model.

None
__base__ type[ModelT] | tuple[type[ModelT], ...] | None

The base class or classes for the new model.

None
__module__ str | None

The name of the module that the model belongs to; if None, the value is taken from sys._getframe(1).

None
__validators__ dict[str, Callable[..., Any]] | None

A dictionary of methods that validate fields. The keys are the names of the validation methods to be added to the model, and the values are the validation methods themselves. You can read more about functional validators here.

None
__cls_kwargs__ dict[str, Any] | None

A dictionary of keyword arguments for class creation, such as metaclass.

None
**field_definitions Any | tuple[str, Any]

Field definitions of the new model. Either:

  • a single element, representing the type annotation of the field.
  • a two-tuple, the first element being the type and the second element the assigned value (either a default or the Field() function).
{}

Returns:

Type Description
type[ModelT]

The new model.

Raises:

Type Description
PydanticUserError

If __base__ and __config__ are both passed.

Source code in pydantic/main.py
def create_model( # noqa: C901 model_name: str, /, *, __config__: ConfigDict | None = None, __doc__: str | None = None, __base__: type[ModelT] | tuple[type[ModelT], ...] | None = None, __module__: str | None = None, __validators__: dict[str, Callable[..., Any]] | None = None, __cls_kwargs__: dict[str, Any] | None = None, # TODO PEP 747: replace `Any` by the TypeForm: **field_definitions: Any | tuple[str, Any], ) -> type[ModelT]:  """!!! abstract "Usage Documentation"  [Dynamic Model Creation](../concepts/models.md#dynamic-model-creation)  Dynamically creates and returns a new Pydantic model, in other words, `create_model` dynamically creates a  subclass of [`BaseModel`][pydantic.BaseModel].  Args:  model_name: The name of the newly created model.  __config__: The configuration of the new model.  __doc__: The docstring of the new model.  __base__: The base class or classes for the new model.  __module__: The name of the module that the model belongs to;  if `None`, the value is taken from `sys._getframe(1)`  __validators__: A dictionary of methods that validate fields. The keys are the names of the validation methods to  be added to the model, and the values are the validation methods themselves. You can read more about functional  validators [here](https://docs.pydantic.dev/2.9/concepts/validators/#field-validators).  __cls_kwargs__: A dictionary of keyword arguments for class creation, such as `metaclass`.  **field_definitions: Field definitions of the new model. Either:  - a single element, representing the type annotation of the field.  - a two-tuple, the first element being the type and the second element the assigned value  (either a default or the [`Field()`][pydantic.Field] function).  Returns:  The new [model][pydantic.BaseModel].  Raises:  PydanticUserError: If `__base__` and `__config__` are both passed.  """ if __base__ is None: __base__ = (cast('type[ModelT]', BaseModel),) elif not isinstance(__base__, tuple): __base__ = (__base__,) __cls_kwargs__ = __cls_kwargs__ or {} fields: dict[str, Any] = {} annotations: dict[str, Any] = {} for f_name, f_def in field_definitions.items(): if isinstance(f_def, tuple): if len(f_def) != 2: raise PydanticUserError( f'Field definition for {f_name!r} should a single element representing the type or a two-tuple, the first element ' 'being the type and the second element the assigned value (either a default or the `Field()` function).', code='create-model-field-definitions', ) annotations[f_name] = f_def[0] fields[f_name] = f_def[1] else: annotations[f_name] = f_def if __module__ is None: f = sys._getframe(1) __module__ = f.f_globals['__name__'] namespace: dict[str, Any] = {'__annotations__': annotations, '__module__': __module__} if __doc__: namespace.update({'__doc__': __doc__}) if __validators__: namespace.update(__validators__) namespace.update(fields) if __config__: namespace['model_config'] = __config__ resolved_bases = types.resolve_bases(__base__) meta, ns, kwds = types.prepare_class(model_name, resolved_bases, kwds=__cls_kwargs__) if resolved_bases is not __base__: ns['__orig_bases__'] = __base__ namespace.update(ns) return meta( model_name, resolved_bases, namespace, __pydantic_reset_parent_namespace__=False, _create_model_module=__module__, **kwds, )
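
A minimal sketch of dynamic model creation; the `User` model name and its field definitions are assumptions made for this example:

```python
from pydantic import create_model

# Roughly equivalent to a statically defined BaseModel subclass with:
#     id: int
#     name: str = 'Jane Doe'
User = create_model('User', id=(int, ...), name=(str, 'Jane Doe'))

print(repr(User(id=1)))
#> User(id=1, name='Jane Doe')
```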