Closed
Labels
- api: vertex-ai (Issues related to the googleapis/python-aiplatform API.)
- flakybot: issue (An issue filed by the Flaky Bot. Should not be added manually.)
- priority: p1 (Important issue which blocks shipping the next release. Will be fixed prior to next release.)
- type: bug (Error or flaw in code with unintended results or allowing sub-optimal usage patterns.)
Description
Note: #1629 was also for this test, but it was closed more than 10 days ago. So, I didn't mark it flaky.
commit: 95855a2
buildURL: Build Status, Sponge
status: failed
Test output
```
self =

    def test_mdm_two_models_one_valid_config(self):
        """
        Enable model monitoring on two existing models deployed to the same endpoint.
        """
        # test model monitoring configurations
        job = aiplatform.ModelDeploymentMonitoringJob.create(
            display_name=self._make_display_name(key=JOB_NAME),
            logging_sampling_strategy=sampling_strategy,
            schedule_config=schedule_config,
            alert_config=alert_config,
            objective_configs=objective_config,
            create_request_timeout=3600,
            project=e2e_base._PROJECT,
            location=e2e_base._LOCATION,
            endpoint=self.endpoint,
            predict_instance_schema_uri="",
            analysis_instance_schema_uri="",
        )
        assert job is not None
        gapic_job = job._gca_resource
        assert (
            gapic_job.logging_sampling_strategy.random_sample_config.sample_rate
            == LOG_SAMPLE_RATE
        )
        assert (
            gapic_job.model_deployment_monitoring_schedule_config.monitor_interval.seconds
            == MONITOR_INTERVAL * 3600
        )
        assert (
            gapic_job.model_monitoring_alert_config.email_alert_config.user_emails
            == [USER_EMAIL]
        )
        assert gapic_job.model_monitoring_alert_config.enable_logging
        gca_obj_config = gapic_job.model_deployment_monitoring_objective_configs[
            0
        ].objective_config
        expected_training_dataset = (
            gca_model_monitoring.ModelMonitoringObjectiveConfig.TrainingDataset(
                bigquery_source=gca_io.BigQuerySource(input_uri=DATASET_BQ_URI),
                target_field=TARGET,
            )
        )
        assert gca_obj_config.training_dataset == expected_training_dataset
        assert (
            gca_obj_config.training_prediction_skew_detection_config
            == skew_config.as_proto()
        )
        assert (
            gca_obj_config.prediction_drift_detection_config
            == drift_config.as_proto()
        )
        job_resource = job._gca_resource.name

        # test job update and delete()
        timeout = time.time() + 3600
        new_obj_config = model_monitoring.ObjectiveConfig(skew_config)
        while time.time() < timeout:
            if job.state == gca_job_state.JobState.JOB_STATE_RUNNING:
>               job.update(objective_configs=new_obj_config)

tests/system/aiplatform/test_model_monitoring.py:167:

google/cloud/aiplatform/jobs.py:2486: in update
    ModelDeploymentMonitoringJob._parse_configs(

cls = <class 'google.cloud.aiplatform.jobs.ModelDeploymentMonitoringJob'>
objective_configs = <google.cloud.aiplatform.model_monitoring.objective.ObjectiveConfig object at 0x7fa9e6b0b0a0>
endpoint = 'projects/580378083368/locations/us-central1/endpoints/8289570005524152320'
deployed_model_ids = None

    @classmethod
    def _parse_configs(
        cls,
        objective_configs: Union[
            model_monitoring.ObjectiveConfig,
            Dict[str, model_monitoring.ObjectiveConfig],
        ],
        endpoint: "aiplatform.Endpoint",
        deployed_model_ids: Optional[List[str]] = None,
    ) -> List[
        gca_model_deployment_monitoring_job_compat.ModelDeploymentMonitoringObjectiveConfig
    ]:
        """Helper function for matching objective configs with their corresponding models.

        Args:
            objective_configs (Union[model_monitoring.objective.ObjectiveConfig, Dict[str, model_monitoring.objective.ObjectiveConfig]]):
                Required. A single config if it applies to all models, or a
                dictionary of model_id: model_monitoring.objective.ObjectiveConfig
                if different model IDs have different configs.
            endpoint (aiplatform.Endpoint):
                Required. A valid instance of aiplatforn.Endpoint to launch
                the MDM job on.
            deployed_model_ids (Optional[List[str]]):
                Optional. A list of deployed model IDs to apply the objective
                config to. Note that a model will have a deployed_model_id that
                is different from the uploaded model ID, and IDs in this list
                should consist of deployed model IDs on the same endpoint
                passed in the argument. If `objective_configs` is a dictionary,
                then this parameter is ignored. If `objective_configs` is an
                instance of `model_monitoring.ObjectiveConfig` and
                `deployed_model_ids` is a non-empty list of valid IDs, then the
                same objective config will apply to all models in this list.

        Returns:
            A List of ModelDeploymentMonitoringObjectiveConfig objects.

        Raises:
            ValueError, when the model IDs given are invalid.
            RuntimeError, when XAI is enabled on a model that doesn't have XAI
                parameters configured.
        """
        all_models = []
        xai_enabled = []
>       for model in endpoint.list_models():
E       AttributeError: 'str' object has no attribute 'list_models'

google/cloud/aiplatform/jobs.py:2107: AttributeError
```
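The traceback suggests that `ModelDeploymentMonitoringJob.update` hands `_parse_configs` the endpoint's resource name (the plain string stored on the underlying GAPIC proto) rather than an `aiplatform.Endpoint` instance, so the `endpoint.list_models()` call inside `_parse_configs` fails. A minimal sketch of the mismatch, using only the public `aiplatform` API; the resource name is the one from the failure above, everything else is illustrative:

```python
from google.cloud import aiplatform

# Resource name taken from the failure above.
ENDPOINT_NAME = (
    "projects/580378083368/locations/us-central1/endpoints/8289570005524152320"
)

# What _parse_configs appears to receive from update(): a bare resource name.
endpoint = ENDPOINT_NAME
# endpoint.list_models()  # AttributeError: 'str' object has no attribute 'list_models'

# What its signature expects: a hydrated Endpoint object, which does expose list_models().
endpoint = aiplatform.Endpoint(ENDPOINT_NAME)
for deployed_model in endpoint.list_models():
    print(deployed_model.id, deployed_model.display_name)
```

If that is indeed the root cause, one possible direction for a fix is to hydrate the stored resource name before delegating to `_parse_configs`. The wrapper below is a hypothetical sketch of that idea, not the library's actual update path; only the `_parse_configs` signature shown in the traceback is taken as given:

```python
from typing import Dict, List, Optional, Union

from google.cloud import aiplatform
from google.cloud.aiplatform import model_monitoring
from google.cloud.aiplatform.jobs import ModelDeploymentMonitoringJob


def parse_configs_with_hydrated_endpoint(
    objective_configs: Union[
        model_monitoring.ObjectiveConfig,
        Dict[str, model_monitoring.ObjectiveConfig],
    ],
    endpoint_resource_name: str,
    deployed_model_ids: Optional[List[str]] = None,
):
    """Hypothetical helper: wrap the resource name so list_models() works."""
    return ModelDeploymentMonitoringJob._parse_configs(
        objective_configs,
        aiplatform.Endpoint(endpoint_resource_name),  # not the raw string
        deployed_model_ids,
    )
```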