[Obs AI Assistant] Check for documents before starting semantic text migration #221152
Conversation
Pinging @elastic/obs-ai-assistant (Team:Obs AI Assistant)
💛 Build succeeded, but was flaky
cc @viduni94
Starting backport for target branches: 8.19
💚 All backports created successfully
Note: Successful backport PRs will be merged automatically after passing CI. Questions? Please refer to the Backport tool documentation.
…c text migration (#221152) (#221203)

# Backport

This will backport the following commits from `main` to `8.19`:
- [[Obs AI Assistant] Check for documents before starting semantic text migration (#221152)](#221152)

### Questions ?
Please refer to the [Backport tool documentation](https://github.com/sorenlouv/backport)

Co-authored-by: Viduni Wickramarachchi <viduni.wickramarachchi@elastic.co>
Starting backport for target branches: 8.18, 8.19, 9.0
💔 All backports failed
Manual backport
To create the backport manually run:
Questions? Please refer to the Backport tool documentation.
💚 All backports created successfully
Note: Successful backport PRs will be merged automatically after passing CI. Questions? Please refer to the Backport tool documentation.

Closes #221157
Summary
We run a semantic text migration at startup to add the semantic text field to documents that were created before 8.17.
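For illustration only, here is a minimal sketch of what such a startup migration could look like. It assumes the KB index stores the entry body in a `text` field and exposes a `semantic_text` field; the function, field, and index names are hypothetical and are not the actual Kibana implementation in `populate_missing_semantic_text_fields.ts`.

```ts
import type { ElasticsearchClient } from '@kbn/core/server';

// Hypothetical sketch: copy the plain text of pre-8.17 entries into the
// `semantic_text` field so the inference endpoint attached to that field can
// generate embeddings. Field names are assumptions made for illustration.
async function populateMissingSemanticTextFields(
  esClient: ElasticsearchClient,
  index: string
): Promise<void> {
  await esClient.updateByQuery({
    index,
    // Only touch documents that do not have the field yet.
    query: {
      bool: {
        must_not: { exists: { field: 'semantic_text' } },
      },
    },
    script: {
      source: 'ctx._source.semantic_text = ctx._source.text',
      lang: 'painless',
    },
  });
}
```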
Before multilingual KB was introduced:
- We created index assets for the KB when the AI Assistant flyout opens.
- Even if the user does not set up the KB, they will have a component template pointing to the custom inference endpoint.
With the introduction of multilingual KB:
- We moved some of the index creation to when the KB is set up.
- We try to do the semantic_text migration at startup. During this migration, for users who didn't set up the KB but had the index assets created at startup, the custom inference endpoint will be unavailable.
- But since the migration uses the inference endpoint from the write index, we try to access an endpoint that's not available.
This is why the following error gets logged:
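```
Inference endpoint "obs_ai_assistant_kb_inference" not found or unavailable: resource_not_found_exception
	Root causes:
		resource_not_found_exception: Inference endpoint not found [obs_ai_assistant_kb_inference]
```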
There is no customer impact; the logged error just creates a lot of noise.
Solution
This PR checks whether there are documents without semantic_text before starting the migration.
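A minimal sketch of that pre-check, assuming an Elasticsearch client and the KB index name are available; the helper name and wiring are illustrative, not the exact Kibana code:

```ts
import type { ElasticsearchClient } from '@kbn/core/server';

// Hypothetical sketch: count KB documents that still lack `semantic_text`.
// If there are none, the migration (and the inference endpoint lookup it
// requires) can be skipped entirely.
async function hasDocsMissingSemanticText(
  esClient: ElasticsearchClient,
  index: string
): Promise<boolean> {
  const { count } = await esClient.count({
    index,
    query: {
      bool: {
        must_not: { exists: { field: 'semantic_text' } },
      },
    },
  });
  return count > 0;
}

// Usage sketch inside the startup migration:
// if (!(await hasDocsMissingSemanticText(esClient, kbIndex))) {
//   return; // nothing to migrate, so never touch the inference endpoint
// }
```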
The PR also reduces the log level to warn, because we hit the `/status` endpoint once when a user opens the AI Assistant.
Checklist
- [x] The PR description includes the appropriate Release Notes section, and the correct `release_note:*` label is applied per the [guidelines](https://www.elastic.co/guide/en/kibana/master/contributing.html#kibana-release-notes-process)