
Conversation


@dependabot dependabot bot commented on behalf of github Nov 5, 2025

Bumps @types/node from 24.9.1 to 24.10.0.

Commits

Dependabot compatibility score

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • @dependabot show <dependency name> ignore conditions will show all of the ignore conditions of the specified dependency
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
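
The ignore commands above persist "ignore conditions" that can also be declared up front in the repository's Dependabot configuration. A minimal sketch, assuming the standard `.github/dependabot.yml` layout (illustrative only, not this repository's actual file):

```yaml
# Hypothetical .github/dependabot.yml sketch, not authentik's actual configuration.
# Declaring an ignore rule here has the same effect as commenting
# "@dependabot ignore this minor version" on an open PR.
version: 2
updates:
  - package-ecosystem: "npm"
    directory: "/"
    schedule:
      interval: "weekly"
    ignore:
      - dependency-name: "@types/node"
        # Skip further 24.10.x update PRs; remove this entry to resume them.
        versions: ["24.10.x"]
```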
Bumps [@types/node](https://github.com/DefinitelyTyped/DefinitelyTyped/tree/HEAD/types/node) from 24.9.1 to 24.10.0.

- [Release notes](https://github.com/DefinitelyTyped/DefinitelyTyped/releases)
- [Commits](https://github.com/DefinitelyTyped/DefinitelyTyped/commits/HEAD/types/node)

---

updated-dependencies:
- dependency-name: "@types/node"
  dependency-version: 24.10.0
  dependency-type: direct:development
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
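
Because the dependency type is `direct:development` and the update is `semver-minor`, the change amounts to bumping the `@types/node` entry under `devDependencies`. A sketch of the resulting fragment (the caret range style is an assumption; the repository may pin exact versions):

```json
{
  "devDependencies": {
    "@types/node": "^24.10.0"
  }
}
```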
@dependabot dependabot bot added the dependencies Pull requests that update a dependency file label Nov 5, 2025
@dependabot dependabot bot requested a review from a team as a code owner November 5, 2025 04:19
@dependabot dependabot bot requested a review from fheisler November 5, 2025 04:19

netlify bot commented Nov 5, 2025

Deploy Preview for authentik-storybook canceled.

| Name | Link |
|------|------|
| 🔨 Latest commit | 7430801 |
| 🔍 Latest deploy log | https://app.netlify.com/projects/authentik-storybook/deploys/690ad043edaac50008558e00 |

netlify bot commented Nov 5, 2025

Deploy Preview for authentik-docs ready!

| Name | Link |
|------|------|
| 🔨 Latest commit | 7430801 |
| 🔍 Latest deploy log | https://app.netlify.com/projects/authentik-docs/deploys/690ad0434bbe5200075eab93 |
| 😎 Deploy Preview | https://deploy-preview-17948--authentik-docs.netlify.app |
To edit notification comments on pull requests, go to your Netlify project configuration.


netlify bot commented Nov 5, 2025

Deploy Preview for authentik-integrations canceled.

| Name | Link |
|------|------|
| 🔨 Latest commit | 7430801 |
| 🔍 Latest deploy log | https://app.netlify.com/projects/authentik-integrations/deploys/690ad043653b2500087b0ea9 |

codecov bot commented Nov 5, 2025

❌ 2 Tests Failed:

| Tests completed | Failed | Passed | Skipped |
|-----------------|--------|--------|---------|
| 2199 | 2 | 2197 | 2 |
View the top 2 failed test(s) by shortest run time
tests.e2e.test_provider_saml.TestProviderSAML::test_sp_initiated_implicit_post_buffer
Stack Traces | 23.4s run time
self = <django.db.backends.utils.CursorWrapper object at 0x7fb39239a450> sql = 'TRUNCATE "django_postgres_cache_cacheentry", "authentik_enterprise_license", "authentik_providers_radius_radiusprovid...enticator_webauthn_webauthndevice", "authentik_sources_oauth_oauthsource", "authentik_policies_reputation_reputation";' params = None ignored_wrapper_args = (False, {'connection': <DatabaseWrapper vendor='postgresql' alias='default'>, 'cursor': <django.db.backends.utils.CursorWrapper object at 0x7fb39239a450>}) def _execute(self, sql, params, *ignored_wrapper_args): # Raise a warning during app initialization (stored_app_configs is only # ever set during testing). if not apps.ready and not apps.stored_app_configs: warnings.warn(self.APPS_NOT_READY_WARNING_MSG, category=RuntimeWarning) self.db.validate_no_broken_transaction() with self.db.wrap_database_errors: if params is None: # params default might be backend specific. > return self.cursor.execute(sql) .venv/lib/python3.13.../db/backends/utils.py:103: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <django_prometheus.db.common.ExportingCursorWrapper.<locals>.CursorWrapper [closed] [IDLE] (host=localhost user=authentik database=test_authentik) at 0x7fb394fc8b90> args = ('TRUNCATE "django_postgres_cache_cacheentry", "authentik_enterprise_license", "authentik_providers_radius_radiusprovi...ticator_webauthn_webauthndevice", "authentik_sources_oauth_oauthsource", "authentik_policies_reputation_reputation";',) kwargs = {} def execute(self, *args, **kwargs): execute_total.labels(alias, vendor).inc() with ( query_duration_seconds.labels(**labels).time(), ExceptionCounterByType(errors_total, extra_labels=labels), ): > return super().execute(*args, **kwargs) .venv/lib/python3.13.../django_prometheus/db/common.py:69: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <django_prometheus.db.common.ExportingCursorWrapper.<locals>.CursorWrapper [closed] [IDLE] (host=localhost user=authentik database=test_authentik) at 0x7fb394fc8b90> query = 'TRUNCATE "django_postgres_cache_cacheentry", "authentik_enterprise_license", "authentik_providers_radius_radiusprovid...enticator_webauthn_webauthndevice", "authentik_sources_oauth_oauthsource", "authentik_policies_reputation_reputation";' params = None def execute( self, query: Query, params: Params | None = None, *, prepare: bool | None = None, binary: bool | None = None, ) -> Self: """ Execute a query or command to the database. """ try: with self._conn.lock: self._conn.wait( self._execute_gen(query, params, prepare=prepare, binary=binary) ) except e._NO_TRACEBACK as ex: > raise ex.with_traceback(None) E psycopg.errors.DeadlockDetected: deadlock detected E DETAIL: Process 357 waits for AccessExclusiveLock on relation 20129 of database 16389; blocked by process 483. E Process 483 waits for AccessShareLock on relation 20230 of database 16389; blocked by process 357. E HINT: See server log for query details. 
.venv/lib/python3.13....../site-packages/psycopg/cursor.py:97: DeadlockDetected The above exception was the direct cause of the following exception: self = <django.core.management.commands.flush.Command object at 0x7fb396296360> options = {'allow_cascade': False, 'database': 'default', 'force_color': False, 'inhibit_post_migrate': False, ...} database = 'default' connection = <DatabaseWrapper vendor='postgresql' alias='default'> verbosity = 0, interactive = False, reset_sequences = False allow_cascade = False, inhibit_post_migrate = False def handle(self, **options): database = options["database"] connection = connections[database] verbosity = options["verbosity"] interactive = options["interactive"] # The following are stealth options used by Django's internals. reset_sequences = options.get("reset_sequences", True) allow_cascade = options.get("allow_cascade", False) inhibit_post_migrate = options.get("inhibit_post_migrate", False) self.style = no_style() # Import the 'management' module within each installed app, to register # dispatcher events. for app_config in apps.get_app_configs(): try: import_module(".management", app_config.name) except ImportError: pass sql_list = sql_flush( self.style, connection, reset_sequences=reset_sequences, allow_cascade=allow_cascade, ) if interactive: confirm = input( """You have requested a flush of the database. This will IRREVERSIBLY DESTROY all data currently in the "%s" database, and return each table to an empty state. Are you sure you want to do this? Type 'yes' to continue, or 'no' to cancel: """ % connection.settings_dict["NAME"] ) else: confirm = "yes" if confirm == "yes": try: > connection.ops.execute_sql_flush(sql_list) .venv/lib/python3.13.../management/commands/flush.py:74: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <psqlextra.backend.operations.PostgresOperations object at 0x7fb3a1b74050> sql_list = ['TRUNCATE "django_postgres_cache_cacheentry", "authentik_enterprise_license", "authentik_providers_radius_radiusprovi...nticator_webauthn_webauthndevice", "authentik_sources_oauth_oauthsource", "authentik_policies_reputation_reputation";'] def execute_sql_flush(self, sql_list): """Execute a list of SQL statements to flush the database.""" with transaction.atomic( using=self.connection.alias, savepoint=self.connection.features.can_rollback_ddl, ): with self.connection.cursor() as cursor: for sql in sql_list: > cursor.execute(sql) .venv/lib/python3.13.../backends/base/operations.py:473: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = (<django.db.backends.utils.CursorWrapper object at 0x7fb39239a450>, 'TRUNCATE "django_postgres_cache_cacheentry", "aut...nticator_webauthn_webauthndevice", "authentik_sources_oauth_oauthsource", "authentik_policies_reputation_reputation";') kwargs = {} def runner(*args: "P.args", **kwargs: "P.kwargs"): # type: (...) 
-> R if sentry_sdk.get_client().get_integration(integration) is None: return original_function(*args, **kwargs) > return sentry_patched_function(*args, **kwargs) .venv/lib/python3.13.../site-packages/sentry_sdk/utils.py:1816: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <django.db.backends.utils.CursorWrapper object at 0x7fb39239a450> sql = 'TRUNCATE "django_postgres_cache_cacheentry", "authentik_enterprise_license", "authentik_providers_radius_radiusprovid...enticator_webauthn_webauthndevice", "authentik_sources_oauth_oauthsource", "authentik_policies_reputation_reputation";' params = None @ensure_integration_enabled(DjangoIntegration, real_execute) def execute(self, sql, params=None): # type: (CursorWrapper, Any, Optional[Any]) -> Any with record_sql_queries( cursor=self.cursor, query=sql, params_list=params, paramstyle="format", executemany=False, span_origin=DjangoIntegration.origin_db, ) as span: _set_db_data(span, self) > result = real_execute(self, sql, params) .venv/lib/python3.13.../integrations/django/__init__.py:651: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <django.db.backends.utils.CursorWrapper object at 0x7fb39239a450> sql = 'TRUNCATE "django_postgres_cache_cacheentry", "authentik_enterprise_license", "authentik_providers_radius_radiusprovid...enticator_webauthn_webauthndevice", "authentik_sources_oauth_oauthsource", "authentik_policies_reputation_reputation";' params = None def execute(self, sql, params=None): > return self._execute_with_wrappers( sql, params, many=False, executor=self._execute ) .venv/lib/python3.13.../db/backends/utils.py:79: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <django.db.backends.utils.CursorWrapper object at 0x7fb39239a450> sql = 'TRUNCATE "django_postgres_cache_cacheentry", "authentik_enterprise_license", "authentik_providers_radius_radiusprovid...enticator_webauthn_webauthndevice", "authentik_sources_oauth_oauthsource", "authentik_policies_reputation_reputation";' params = None, many = False executor = <bound method CursorWrapper._execute of <django.db.backends.utils.CursorWrapper object at 0x7fb39239a450>> def _execute_with_wrappers(self, sql, params, many, executor): context = {"connection": self.db, "cursor": self} for wrapper in reversed(self.db.execute_wrappers): executor = functools.partial(wrapper, executor) > return executor(sql, params, many, context) .venv/lib/python3.13.../db/backends/utils.py:92: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <django.db.backends.utils.CursorWrapper object at 0x7fb39239a450> sql = 'TRUNCATE "django_postgres_cache_cacheentry", "authentik_enterprise_license", "authentik_providers_radius_radiusprovid...enticator_webauthn_webauthndevice", "authentik_sources_oauth_oauthsource", "authentik_policies_reputation_reputation";' params = None ignored_wrapper_args = (False, {'connection': <DatabaseWrapper vendor='postgresql' alias='default'>, 'cursor': <django.db.backends.utils.CursorWrapper object at 0x7fb39239a450>}) def _execute(self, sql, params, *ignored_wrapper_args): # Raise a warning during app initialization (stored_app_configs is only # ever set during testing). 
if not apps.ready and not apps.stored_app_configs: warnings.warn(self.APPS_NOT_READY_WARNING_MSG, category=RuntimeWarning) self.db.validate_no_broken_transaction() > with self.db.wrap_database_errors: .venv/lib/python3.13.../db/backends/utils.py:100: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <django.db.utils.DatabaseErrorWrapper object at 0x7fb399916510> exc_type = <class 'psycopg.errors.DeadlockDetected'> exc_value = DeadlockDetected('deadlock detected\nDETAIL: Process 357 waits for AccessExclusiveLock on relation 20129 of database ...ccessShareLock on relation 20230 of database 16389; blocked by process 357.\nHINT: See server log for query details.') traceback = <traceback object at 0x7fb3901b6500> def __exit__(self, exc_type, exc_value, traceback): if exc_type is None: return for dj_exc_type in ( DataError, OperationalError, IntegrityError, InternalError, ProgrammingError, NotSupportedError, DatabaseError, InterfaceError, Error, ): db_exc_type = getattr(self.wrapper.Database, dj_exc_type.__name__) if issubclass(exc_type, db_exc_type): dj_exc_value = dj_exc_type(*exc_value.args) # Only set the 'errors_occurred' flag for errors that may make # the connection unusable. if dj_exc_type not in (DataError, IntegrityError): self.wrapper.errors_occurred = True > raise dj_exc_value.with_traceback(traceback) from exc_value .venv/lib/python3.13.../django/db/utils.py:91: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <django.db.backends.utils.CursorWrapper object at 0x7fb39239a450> sql = 'TRUNCATE "django_postgres_cache_cacheentry", "authentik_enterprise_license", "authentik_providers_radius_radiusprovid...enticator_webauthn_webauthndevice", "authentik_sources_oauth_oauthsource", "authentik_policies_reputation_reputation";' params = None ignored_wrapper_args = (False, {'connection': <DatabaseWrapper vendor='postgresql' alias='default'>, 'cursor': <django.db.backends.utils.CursorWrapper object at 0x7fb39239a450>}) def _execute(self, sql, params, *ignored_wrapper_args): # Raise a warning during app initialization (stored_app_configs is only # ever set during testing). if not apps.ready and not apps.stored_app_configs: warnings.warn(self.APPS_NOT_READY_WARNING_MSG, category=RuntimeWarning) self.db.validate_no_broken_transaction() with self.db.wrap_database_errors: if params is None: # params default might be backend specific. 
> return self.cursor.execute(sql) .venv/lib/python3.13.../db/backends/utils.py:103: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <django_prometheus.db.common.ExportingCursorWrapper.<locals>.CursorWrapper [closed] [IDLE] (host=localhost user=authentik database=test_authentik) at 0x7fb394fc8b90> args = ('TRUNCATE "django_postgres_cache_cacheentry", "authentik_enterprise_license", "authentik_providers_radius_radiusprovi...ticator_webauthn_webauthndevice", "authentik_sources_oauth_oauthsource", "authentik_policies_reputation_reputation";',) kwargs = {} def execute(self, *args, **kwargs): execute_total.labels(alias, vendor).inc() with ( query_duration_seconds.labels(**labels).time(), ExceptionCounterByType(errors_total, extra_labels=labels), ): > return super().execute(*args, **kwargs) .venv/lib/python3.13.../django_prometheus/db/common.py:69: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <django_prometheus.db.common.ExportingCursorWrapper.<locals>.CursorWrapper [closed] [IDLE] (host=localhost user=authentik database=test_authentik) at 0x7fb394fc8b90> query = 'TRUNCATE "django_postgres_cache_cacheentry", "authentik_enterprise_license", "authentik_providers_radius_radiusprovid...enticator_webauthn_webauthndevice", "authentik_sources_oauth_oauthsource", "authentik_policies_reputation_reputation";' params = None def execute( self, query: Query, params: Params | None = None, *, prepare: bool | None = None, binary: bool | None = None, ) -> Self: """ Execute a query or command to the database. """ try: with self._conn.lock: self._conn.wait( self._execute_gen(query, params, prepare=prepare, binary=binary) ) except e._NO_TRACEBACK as ex: > raise ex.with_traceback(None) E django.db.utils.OperationalError: deadlock detected E DETAIL: Process 357 waits for AccessExclusiveLock on relation 20129 of database 16389; blocked by process 483. E Process 483 waits for AccessShareLock on relation 20230 of database 16389; blocked by process 357. E HINT: See server log for query details. .venv/lib/python3.13....../site-packages/psycopg/cursor.py:97: OperationalError The above exception was the direct cause of the following exception: self = <tests.e2e.test_provider_saml.TestProviderSAML testMethod=test_sp_initiated_implicit_post_buffer> result = <TestCaseFunction test_sp_initiated_implicit_post_buffer> debug = False def _setup_and_call(self, result, debug=False): """ Perform the following in order: pre-setup, run test, post-teardown, skipping pre/post hooks if test is set to be skipped. If debug=True, reraise any errors in setup and use super().debug() instead of __call__() to run the test. """ testMethod = getattr(self, self._testMethodName) skipped = getattr(self.__class__, "__unittest_skip__", False) or getattr( testMethod, "__unittest_skip__", False ) # Convert async test methods. 
if iscoroutinefunction(testMethod): setattr(self, self._testMethodName, async_to_sync(testMethod)) if not skipped: try: if self.__class__._pre_setup_ran_eagerly: self.__class__._pre_setup_ran_eagerly = False else: self._pre_setup() except Exception: if debug: raise result.addError(self, sys.exc_info()) return if debug: super().debug() else: super().__call__(result) if not skipped: try: > self._post_teardown() .venv/lib/python3.13.../django/test/testcases.py:379: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <tests.e2e.test_provider_saml.TestProviderSAML testMethod=test_sp_initiated_implicit_post_buffer> def _post_teardown(self): """ Perform post-test things: * Flush the contents of the database to leave a clean slate. If the class has an 'available_apps' attribute, don't fire post_migrate. * Force-close the connection so the next test gets a clean cursor. """ try: > self._fixture_teardown() .venv/lib/python3.13.../django/test/testcases.py:1231: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <tests.e2e.test_provider_saml.TestProviderSAML testMethod=test_sp_initiated_implicit_post_buffer> def _fixture_teardown(self): # Allow TRUNCATE ... CASCADE and don't emit the post_migrate signal # when flushing only a subset of the apps for db_name in self._databases_names(include_mirrors=False): # Flush the database inhibit_post_migrate = ( self.available_apps is not None or ( # Inhibit the post_migrate signal when using serialized # rollback to avoid trying to recreate the serialized data. self.serialized_rollback and hasattr(connections[db_name], "_test_serialized_contents") ) ) > call_command( "flush", verbosity=0, interactive=False, database=db_name, reset_sequences=False, allow_cascade=self.available_apps is not None, inhibit_post_migrate=inhibit_post_migrate, ) .venv/lib/python3.13.../django/test/testcases.py:1266: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ command_name = 'flush', args = () options = {'allow_cascade': False, 'database': 'default', 'inhibit_post_migrate': False, 'interactive': False, ...} command = <django.core.management.commands.flush.Command object at 0x7fb396296360> app_name = 'django.core' parser = CommandParser(prog=' flush', usage=None, description='Removes ALL DATA from the database, including data added during ....', formatter_class=<class 'django.core.management.base.DjangoHelpFormatter'>, conflict_handler='error', add_help=True) opt_mapping = {'database': 'database', 'force_color': 'force_color', 'help': 'help', 'no_color': 'no_color', ...} arg_options = {'allow_cascade': False, 'database': 'default', 'inhibit_post_migrate': False, 'interactive': False, ...} parse_args = [] def call_command(command_name, *args, **options): """ Call the given command, with the given options and args/kwargs. This is the primary API you should use for calling specific commands. `command_name` may be a string or a command object. Using a string is preferred unless the command object is required for further processing or testing. Some examples: call_command('migrate') call_command('shell', plain=True) call_command('sqlmigrate', 'myapp') from django.core.management.commands import flush cmd = flush.Command() call_command(cmd, verbosity=0, interactive=False) # Do something with cmd ... """ if isinstance(command_name, BaseCommand): # Command object passed in. 
command = command_name command_name = command.__class__.__module__.split(".")[-1] else: # Load the command object by name. try: app_name = get_commands()[command_name] except KeyError: raise CommandError("Unknown command: %r" % command_name) if isinstance(app_name, BaseCommand): # If the command is already loaded, use it directly. command = app_name else: command = load_command_class(app_name, command_name) # Simulate argument parsing to get the option defaults (see #10080 for details). parser = command.create_parser("", command_name) # Use the `dest` option name from the parser option opt_mapping = { min(s_opt.option_strings).lstrip("-").replace("-", "_"): s_opt.dest for s_opt in parser._actions if s_opt.option_strings } arg_options = {opt_mapping.get(key, key): value for key, value in options.items()} parse_args = [] for arg in args: if isinstance(arg, (list, tuple)): parse_args += map(str, arg) else: parse_args.append(str(arg)) def get_actions(parser): # Parser actions and actions from sub-parser choices. for opt in parser._actions: if isinstance(opt, _SubParsersAction): for sub_opt in opt.choices.values(): yield from get_actions(sub_opt) else: yield opt parser_actions = list(get_actions(parser)) mutually_exclusive_required_options = { opt for group in parser._mutually_exclusive_groups for opt in group._group_actions if group.required } # Any required arguments which are passed in via **options must be passed # to parse_args(). for opt in parser_actions: if opt.dest in options and ( opt.required or opt in mutually_exclusive_required_options ): opt_dest_count = sum(v == opt.dest for v in opt_mapping.values()) if opt_dest_count > 1: raise TypeError( f"Cannot pass the dest {opt.dest!r} that matches multiple " f"arguments via **options." ) parse_args.append(min(opt.option_strings)) if isinstance(opt, (_AppendConstAction, _CountAction, _StoreConstAction)): continue value = arg_options[opt.dest] if isinstance(value, (list, tuple)): parse_args += map(str, value) else: parse_args.append(str(value)) defaults = parser.parse_args(args=parse_args) defaults = dict(defaults._get_kwargs(), **arg_options) # Raise an error if any unknown options were passed. stealth_options = set(command.base_stealth_options + command.stealth_options) dest_parameters = {action.dest for action in parser_actions} valid_options = (dest_parameters | stealth_options).union(opt_mapping) unknown_options = set(options) - valid_options if unknown_options: raise TypeError( "Unknown option(s) for %s command: %s. " "Valid options are: %s." % ( command_name, ", ".join(sorted(unknown_options)), ", ".join(sorted(valid_options)), ) ) # Move positional args out of options to mimic legacy optparse args = defaults.pop("args", ()) if "skip_checks" not in options: defaults["skip_checks"] = True > return command.execute(*args, **defaults) .venv/lib/python3.13.../core/management/__init__.py:194: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <django.core.management.commands.flush.Command object at 0x7fb396296360> args = () options = {'allow_cascade': False, 'database': 'default', 'force_color': False, 'inhibit_post_migrate': False, ...} def execute(self, *args, **options): """ Try to execute this command, performing system checks if needed (as controlled by the ``requires_system_checks`` attribute, except if force-skipped). """ if options["force_color"] and options["no_color"]: raise CommandError( "The --no-color and --force-color options can't be used together." 
) if options["force_color"]: self.style = color_style(force_color=True) elif options["no_color"]: self.style = no_style() self.stderr.style_func = None if options.get("stdout"): self.stdout = OutputWrapper(options["stdout"]) if options.get("stderr"): self.stderr = OutputWrapper(options["stderr"]) if self.requires_system_checks and not options["skip_checks"]: check_kwargs = self.get_check_kwargs(options) self.check(**check_kwargs) if self.requires_migrations_checks: self.check_migrations() > output = self.handle(*args, **options) .venv/lib/python3.13.../core/management/base.py:460: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <django.core.management.commands.flush.Command object at 0x7fb396296360> options = {'allow_cascade': False, 'database': 'default', 'force_color': False, 'inhibit_post_migrate': False, ...} database = 'default' connection = <DatabaseWrapper vendor='postgresql' alias='default'> verbosity = 0, interactive = False, reset_sequences = False allow_cascade = False, inhibit_post_migrate = False def handle(self, **options): database = options["database"] connection = connections[database] verbosity = options["verbosity"] interactive = options["interactive"] # The following are stealth options used by Django's internals. reset_sequences = options.get("reset_sequences", True) allow_cascade = options.get("allow_cascade", False) inhibit_post_migrate = options.get("inhibit_post_migrate", False) self.style = no_style() # Import the 'management' module within each installed app, to register # dispatcher events. for app_config in apps.get_app_configs(): try: import_module(".management", app_config.name) except ImportError: pass sql_list = sql_flush( self.style, connection, reset_sequences=reset_sequences, allow_cascade=allow_cascade, ) if interactive: confirm = input( """You have requested a flush of the database. This will IRREVERSIBLY DESTROY all data currently in the "%s" database, and return each table to an empty state. Are you sure you want to do this? Type 'yes' to continue, or 'no' to cancel: """ % connection.settings_dict["NAME"] ) else: confirm = "yes" if confirm == "yes": try: connection.ops.execute_sql_flush(sql_list) except Exception as exc: > raise CommandError( "Database %s couldn't be flushed. Possible reasons:\n" " * The database isn't running or isn't configured correctly.\n" " * At least one of the expected database tables doesn't exist.\n" " * The SQL was invalid.\n" "Hint: Look at the output of 'django-admin sqlflush'. " "That's the SQL this command wasn't able to run." % (connection.settings_dict["NAME"],) ) from exc E django.core.management.base.CommandError: Database test_authentik couldn't be flushed. Possible reasons: E * The database isn't running or isn't configured correctly. E * At least one of the expected database tables doesn't exist. E * The SQL was invalid. E Hint: Look at the output of 'django-admin sqlflush'. That's the SQL this command wasn't able to run. .venv/lib/python3.13.../management/commands/flush.py:76: CommandError
tests.e2e.test_source_saml.TestSourceSAML::test_idp_post_auto
Stack Traces | 25.4s run time
self = <unittest.case._Outcome object at 0x7fb395e0ab50> test_case = <tests.e2e.test_source_saml.TestSourceSAML testMethod=test_idp_post_auto> subTest = False @contextlib.contextmanager def testPartExecutor(self, test_case, subTest=False): old_success = self.success self.success = True try: > yield .../hostedtoolcache/Python/3.13.9................../x64/lib/python3.13/unittest/case.py:58: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <tests.e2e.test_source_saml.TestSourceSAML testMethod=test_idp_post_auto> result = <TestCaseFunction test_idp_post_auto> def run(self, result=None): if result is None: result = self.defaultTestResult() startTestRun = getattr(result, 'startTestRun', None) stopTestRun = getattr(result, 'stopTestRun', None) if startTestRun is not None: startTestRun() else: stopTestRun = None result.startTest(self) try: testMethod = getattr(self, self._testMethodName) if (getattr(self.__class__, "__unittest_skip__", False) or getattr(testMethod, "__unittest_skip__", False)): # If the class or method was skipped. skip_why = (getattr(self.__class__, '__unittest_skip_why__', '') or getattr(testMethod, '__unittest_skip_why__', '')) _addSkip(result, self, skip_why) return result expecting_failure = ( getattr(self, "__unittest_expecting_failure__", False) or getattr(testMethod, "__unittest_expecting_failure__", False) ) outcome = _Outcome(result) start_time = time.perf_counter() try: self._outcome = outcome with outcome.testPartExecutor(self): self._callSetUp() if outcome.success: outcome.expecting_failure = expecting_failure with outcome.testPartExecutor(self): > self._callTestMethod(testMethod) .../hostedtoolcache/Python/3.13.9................../x64/lib/python3.13/unittest/case.py:651: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <tests.e2e.test_source_saml.TestSourceSAML testMethod=test_idp_post_auto> method = <bound method TestSourceSAML.test_idp_post_auto of <tests.e2e.test_source_saml.TestSourceSAML testMethod=test_idp_post_auto>> def _callTestMethod(self, method): > if method() is not None: .../hostedtoolcache/Python/3.13.9................../x64/lib/python3.13/unittest/case.py:606: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <tests.e2e.test_source_saml.TestSourceSAML testMethod=test_idp_post_auto> args = (), kwargs = {} @wraps(func) def wrapper(self: TransactionTestCase, *args, **kwargs): """Run test again if we're below max_retries, including tearDown and setUp. 
Otherwise raise the error""" nonlocal count try: > return func(self, *args, **kwargs) tests/e2e/utils.py:324: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = (<tests.e2e.test_source_saml.TestSourceSAML testMethod=test_idp_post_auto>,) kwargs = {}, file = 'default/flow-default-invalidation-flow.yaml' content = 'version: 1\nmetadata:\n name: Default - Invalidation flow\nentries:\n- attrs:\n designation: invalidation\n na...0\n stage: !KeyOf default-invalidation-logout\n target: !KeyOf flow\n model: authentik_flows.flowstagebinding\n' @wraps(func) def wrapper(*args, **kwargs): for file in files: content = BlueprintInstance(path=file).retrieve() Importer.from_string(content).apply() > return func(*args, **kwargs) .../blueprints/tests/__init__.py:25: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ args = (<tests.e2e.test_source_saml.TestSourceSAML testMethod=test_idp_post_auto>,) kwargs = {}, file = 'default/flow-default-source-pre-authentication.yaml' content = 'version: 1\nmetadata:\n name: Default - Source pre-authentication flow\nentries:\n- attrs:\n designation: stage_c... authentication: none\n identifiers:\n slug: default-source-pre-authentication\n model: authentik_flows.flow\n' @wraps(func) def wrapper(*args, **kwargs): for file in files: content = BlueprintInstance(path=file).retrieve() Importer.from_string(content).apply() > return func(*args, **kwargs) .../blueprints/tests/__init__.py:25: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <tests.e2e.test_source_saml.TestSourceSAML testMethod=test_idp_post_auto> @retry() @apply_blueprint( "default/flow-default-authentication-flow.yaml", "default/flow-default-invalidation-flow.yaml", ) @apply_blueprint( "default/flow-default-source-authentication.yaml", "default/flow-default-source-enrollment.yaml", "default/flow-default-source-pre-authentication.yaml", ) def test_idp_post_auto(self): """test SAML Source With post binding (auto redirect)""" # Bootstrap all needed objects authentication_flow = Flow.objects.get(slug="default-source-authentication") enrollment_flow = Flow.objects.get(slug="default-source-enrollment") pre_authentication_flow = Flow.objects.get(slug="default-source-pre-authentication") keypair = CertificateKeyPair.objects.create( name=generate_id(), certificate_data=IDP_CERT, key_data=IDP_KEY, ) source = SAMLSource.objects.create( name=generate_id(), slug=self.slug, authentication_flow=authentication_flow, enrollment_flow=enrollment_flow, pre_authentication_flow=pre_authentication_flow, issuer="entity-id", sso_url=f"http://{self.host}:.../saml2/idp/SSOService.php", binding_type=SAMLBindingTypes.POST_AUTO, signing_kp=keypair, ) ident_stage = IdentificationStage.objects.first() ident_stage.sources.set([source]) ident_stage.save() self.driver.get(self.live_server_url) flow_executor = self.get_shadow_root("ak-flow-executor") identification_stage = self.get_shadow_root("ak-stage-identification", flow_executor) wait = WebDriverWait(identification_stage, self.wait_timeout) wait.until( ec.presence_of_element_located( (By.CSS_SELECTOR, ".pf-c-login__main-footer-links-item > button") ) ) identification_stage.find_element( By.CSS_SELECTOR, ".pf-c-login__main-footer-links-item > button" ).click() # Now we should be at the IDP, wait for the username field self.wait.until(ec.presence_of_element_located((By.ID, "username"))) self.driver.find_element(By.ID, "username").send_keys("user1") self.driver.find_element(By.ID, 
"password").send_keys("user1pass") self.driver.find_element(By.ID, "password").send_keys(Keys.ENTER) # Wait until we're logged in self.wait_for_url(self.if_user_url()) > self.assert_user( User.objects.exclude(username="akadmin") .exclude(username__startswith="ak-outpost") .exclude_anonymous() .exclude(pk=self.user.pk) .first() ) tests/e2e/test_source_saml.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <tests.e2e.test_source_saml.TestSourceSAML testMethod=test_idp_post_auto> expected_user = <User: Jt5icgqv3K39xzfBA3cG> def assert_user(self, expected_user: User): """Check users/me API and assert it matches expected_user""" self.driver.get(self.url("authentik_api:user-me") + "?format=json") user_json = self.driver.find_element(By.CSS_SELECTOR, "pre").text user = UserSerializer(data=json.loads(user_json)["user"]) user.is_valid() > self.assertEqual(user["username"].value, expected_user.username) tests/e2e/utils.py:294: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <tests.e2e.test_source_saml.TestSourceSAML testMethod=test_idp_post_auto> first = 'user1@example.com', second = 'Jt5icgqv3K39xzfBA3cG', msg = None def assertEqual(self, first, second, msg=None): """Fail if the two objects are unequal as determined by the '==' operator. """ assertion_func = self._getAssertEqualityFunc(first, second) > assertion_func(first, second, msg=msg) .../hostedtoolcache/Python/3.13.9................../x64/lib/python3.13/unittest/case.py:907: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <tests.e2e.test_source_saml.TestSourceSAML testMethod=test_idp_post_auto> first = 'user1@example.com', second = 'Jt5icgqv3K39xzfBA3cG', msg = None def assertMultiLineEqual(self, first, second, msg=None): """Assert that two multi-line strings are equal.""" self.assertIsInstance(first, str, "First argument is not a string") self.assertIsInstance(second, str, "Second argument is not a string") if first != second: # Don't use difflib if the strings are too long if (len(first) > self._diffThreshold or len(second) > self._diffThreshold): self._baseAssertEqual(first, second, msg) # Append \n to both strings if either is missing the \n. # This allows the final ndiff to show the \n difference. 
The # exception here is if the string is empty, in which case no # \n should be added first_presplit = first second_presplit = second if first and second: if first[-1] != '\n' or second[-1] != '\n': first_presplit += '\n' second_presplit += '\n' elif second and second[-1] != '\n': second_presplit += '\n' elif first and first[-1] != '\n': first_presplit += '\n' firstlines = first_presplit.splitlines(keepends=True) secondlines = second_presplit.splitlines(keepends=True) # Generate the message and diff, then raise the exception standardMsg = '%s != %s' % _common_shorten_repr(first, second) diff = '\n' + ''.join(difflib.ndiff(firstlines, secondlines)) standardMsg = self._truncateMessage(standardMsg, diff) > self.fail(self._formatMessage(msg, standardMsg)) .../hostedtoolcache/Python/3.13.9................../x64/lib/python3.13/unittest/case.py:1273: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <tests.e2e.test_source_saml.TestSourceSAML testMethod=test_idp_post_auto> msg = "'user1@example.com' != 'Jt5icgqv3K39xzfBA3cG'\n- user1@example.com\n+ Jt5icgqv3K39xzfBA3cG\n" def fail(self, msg=None): """Fail immediately, with the given message.""" > raise self.failureException(msg) E AssertionError: 'user1@example.com' != 'Jt5icgqv3K39xzfBA3cG' E - user1@example.com E + Jt5icgqv3K39xzfBA3cG .../hostedtoolcache/Python/3.13.9................../x64/lib/python3.13/unittest/case.py:732: AssertionError
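
Both traces pass through a retry wrapper in `tests/e2e/utils.py` ("Run test again if we're below max_retries, including tearDown and setUp. Otherwise raise the error") before the failure is reported. A minimal sketch of that kind of flaky-e2e retry decorator, reconstructed only from the docstring and frames visible above (parameter names and the exception handling are assumptions, not the project's actual implementation):

```python
import functools
from collections.abc import Callable


def retry(max_retries: int = 3, exceptions: tuple[type[Exception], ...] = (Exception,)):
    """Re-run a flaky test method up to max_retries times, resetting test state
    between attempts and re-raising the last error once the budget is exhausted.

    Hypothetical sketch based on the wrapper visible in the traceback above,
    not authentik's actual tests/e2e/utils.py implementation.
    """

    def decorator(func: Callable) -> Callable:
        @functools.wraps(func)
        def wrapper(self, *args, **kwargs):
            for attempt in range(1, max_retries + 1):
                try:
                    return func(self, *args, **kwargs)
                except exceptions:
                    if attempt == max_retries:
                        # Out of retries: surface the original failure.
                        raise
                    # Reset fixtures so the next attempt starts from a clean slate.
                    self.tearDown()
                    self.setUp()

        return wrapper

    return decorator
```

In the trace above, `test_idp_post_auto` is decorated with `@retry()`, so a transient failure of this kind triggers another attempt before the test is finally reported as failed.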

To view more test analytics, go to the Test Analytics Dashboard

@github-project-automation github-project-automation bot moved this from Todo to In Progress in authentik Core Nov 7, 2025
