
Commit a1967ce

benironside, mdbirnstiehl, and florent-leborgne authored
Moves LLM connector guides to explore-analyze section (#4224)
## Summary

Moves the LLM connector guides from the security docs section to the new AI section within explore-analyze. This fixes elastic/docs-content-internal/issues/487 as part of elastic/docs-content-internal/issues/298. The purpose is to update the IA for these docs to be solution-agnostic, since they are helpful to users of any solution, not just security.

This PR:

- Moves the four connector-specific guides to the recently created section, in a new sub-section.
- Creates a landing page for the new subsection.
- Moves `explore-analyze/ai-features.md` to `explore-analyze/ai-features/ai-features.md` (this is just cleanup from our [previous PR](#3768)).
- Updates links to the LLM connector guides and creates corresponding redirects.
- Moves the `mapped_pages` frontmatter from the page in the security solution docs that was previously the landing page for the connector guides (that page should still exist, as some security-specific guides were not moved) to the new landing page within explore-analyze. I would appreciate a check on my thinking here, but my thinking is that we should generally be sending users to the new landing page.
- Updates the security LLM connectors landing page introduction.
- Updates the Elastic Managed LLM snippet.
- Fixes a few Vale-identified sentence-level issues in the Amazon Bedrock guide.

## Generative AI disclosure

1. Did you use a generative AI (GenAI) tool to assist in creating this contribution?
   - [x] Yes
   - [ ] No

   Tool(s) and model(s) used: Copilot with GPT-4.1 to help with the redirects.

---------

Co-authored-by: Mike Birnstiehl <114418652+mdbirnstiehl@users.noreply.github.com>
Co-authored-by: florent-leborgne <florent.leborgne@elastic.co>
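For illustration, the redirects mentioned above might be expressed along these lines (a hypothetical sketch — the actual file name and schema depend on the docs build tooling this repo uses):

```yaml
# Hypothetical redirect entries; the file name (redirects.yml) and schema are assumptions.
redirects:
  'solutions/security/ai/connect-to-amazon-bedrock.md': 'explore-analyze/ai-features/llm-guides/connect-to-amazon-bedrock.md'
  'solutions/security/ai/connect-to-azure-openai.md': 'explore-analyze/ai-features/llm-guides/connect-to-azure-openai.md'
  'solutions/security/ai/connect-to-google-vertex.md': 'explore-analyze/ai-features/llm-guides/connect-to-google-vertex.md'
  'solutions/security/ai/connect-to-openai.md': 'explore-analyze/ai-features/llm-guides/connect-to-openai.md'
```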
1 parent b91ceeb commit a1967ce

26 files changed: +138 additions, −108 deletions

explore-analyze/ai-features.md

Lines changed: 9 additions & 9 deletions
@@ -20,8 +20,8 @@ For pricing information, refer to [pricing](https://www.elastic.co/pricing).
 
 ## Requirements
 
-- To use Elastic's AI-powered features, you need an appropriate license and feature tier. These vary by solution and feature. Refer to each feature's documentation to learn more.
-- Most features require at least one working LLM connector. To learn about setting up large language model (LLM) connectors used by AI-powered features, refer to [](/solutions/security/ai/set-up-connectors-for-large-language-models-llm.md). Elastic Managed LLM is available by default if your license supports it.
+- To use Elastic's AI-powered features, you need an appropriate subscription level or serverless feature tier. These vary by solution and feature. Refer to each feature's documentation to learn more.
+- Most features require at least one working LLM connector. To learn about setting up large language model (LLM) connectors used by AI-powered features, refer to [](/explore-analyze/ai-features/llm-guides/llm-connectors.md). Elastic Managed LLM is available by default if your license supports it.
 
 ## AI-powered features on the Elastic platform
 
@@ -87,7 +87,7 @@ The [Model Context Protocol (MCP)](/solutions/search/mcp.md) lets you connect AI
 
 ## AI-powered features in {{observability}}
 
-{{observability}}'s AI-powered features all require an [LLM connector](/solutions/security/ai/set-up-connectors-for-large-language-models-llm.md). When you use one of these features, you can select any LLM connector that's configured in your environment. The connector you select for one feature does not affect which connector any other feature uses. For specific configuration instructions, refer to each feature's documentation.
+{{observability}}'s AI-powered features all require an [LLM connector](/explore-analyze/ai-features/llm-guides/llm-connectors.md). When you use one of these features, you can select any LLM connector that's configured in your environment. The connector you select for one feature does not affect which connector any other feature uses. For specific configuration instructions, refer to each feature's documentation.
 
 ### AI assistant for {{observability}}
 
@@ -104,20 +104,20 @@ The [Model Context Protocol (MCP)](/solutions/search/mcp.md) lets you connect AI
 
 ## AI-powered features in {{elastic-sec}}
 
-{{elastic-sec}}'s AI-powered features all require an [LLM connector](/solutions/security/ai/set-up-connectors-for-large-language-models-llm.md). When you use one of these features, you can select any LLM connector that's configured in your environment. The connector you select for one feature does not affect which connector any other feature uses. For specific configuration instructions, refer to each feature's documentation.
+{{elastic-sec}}'s AI-powered features all require an [LLM connector](/explore-analyze/ai-features/llm-guides/llm-connectors.md). When you use one of these features, you can select any LLM connector that's configured in your environment. The connector you select for one feature does not affect which connector any other feature uses. For specific configuration instructions, refer to each feature's documentation.
 
 ### AI Assistant for Security
 
 [Elastic AI Assistant for Security](/solutions/security/ai/ai-assistant.md) helps you with tasks such as alert investigation, incident response, and query generation throughout {{elastic-sec}}. It provides a chat interface where you can ask questions about the {{stack}} and your data, and provides contextual insights that explain errors and messages and suggest remediation steps.
 
-This feature requires an [LLM connector](/solutions/security/ai/set-up-connectors-for-large-language-models-llm.md).
+This feature requires an [LLM connector](/explore-analyze/ai-features/llm-guides/llm-connectors.md).
 
 
 ### Attack Discovery
 
 [Attack Discovery](/solutions/security/ai/attack-discovery.md) uses AI to triage your alerts and identify potential threats. Each "discovery" represents a potential attack and describes relationships among alerts to identify related users and hosts, map alerts to the MITRE ATT&CK matrix, and help identify threat actors.
 
-This feature requires an [LLM connector](/solutions/security/ai/set-up-connectors-for-large-language-models-llm.md).
+This feature requires an [LLM connector](/explore-analyze/ai-features/llm-guides/llm-connectors.md).
 
 
 ### Automatic Migration
@@ -127,14 +127,14 @@ This feature requires an [LLM connector](/solutions/security/ai/set-up-connector
 * Splunk rules
 * Splunk dashboards
 
-This feature requires an [LLM connector](/solutions/security/ai/set-up-connectors-for-large-language-models-llm.md).
+This feature requires an [LLM connector](/explore-analyze/ai-features/llm-guides/llm-connectors.md).
 
 
 ### Automatic Import
 
 [Automatic Import](/solutions/security/get-started/automatic-import.md) helps you ingest data from sources that do not have prebuilt Elastic integrations. It uses AI to parse a sample of the data you want to ingest, and creates a new integration specifically for that type of data.
 
-This feature requires an [LLM connector](/solutions/security/ai/set-up-connectors-for-large-language-models-llm.md).
+This feature requires an [LLM connector](/explore-analyze/ai-features/llm-guides/llm-connectors.md).
 
 
 ### Automatic Troubleshooting
@@ -144,4 +144,4 @@ This feature requires an [LLM connector](/solutions/security/ai/set-up-connector
 * **Policy responses**: Detect warnings or failures in {{elastic-defend}}’s integration policies.
 * **Third-party antivirus (AV) software**: Identify installed third-party antivirus (AV) products that might conflict with {{elastic-defend}}.
 
-This feature requires an [LLM connector](/solutions/security/ai/set-up-connectors-for-large-language-models-llm.md).
+This feature requires an [LLM connector](/explore-analyze/ai-features/llm-guides/llm-connectors.md).

explore-analyze/ai-features/ai-assistant.md

Lines changed: 1 addition & 1 deletion
@@ -38,7 +38,7 @@ AI Assistant requires specific privileges and a large language model (LLM) conne
 
 To learn more about configuring LLM connectors, refer to:
 
-- [Enable LLM access](../../solutions/security/ai/set-up-connectors-for-large-language-models-llm.md)
+- [Enable LLM access](/explore-analyze/ai-features/llm-guides/llm-connectors.md)
 
 ## Prompt best practices [rag-for-esql]
 Elastic AI Assistant allows you to take full advantage of the Elastic platform to improve your operations. It can help you write an ES|QL query for a particular use case, or answer general questions about how to use the platform. Its ability to assist you depends on the specificity and detail of your questions. The more context and detail you provide, the more tailored and useful its responses will be.

solutions/security/ai/connect-to-amazon-bedrock.md renamed to explore-analyze/ai-features/llm-guides/connect-to-amazon-bedrock.md

Lines changed: 3 additions & 3 deletions
@@ -63,7 +63,7 @@ The following video demonstrates these steps (click to watch).
 
 ### Configure an IAM User [_configure_an_iam_user]
 
-Next, assign the policy you just created to a new user:
+Next, assign the policy you created to a new user:
 
 1. Return to the **IAM** menu. Select **Users** from the navigation menu, then click **Create User**.
 2. Name the user, then click **Next**.
@@ -82,7 +82,7 @@ The following video demonstrates these steps (click to watch).
 Create the access keys that will authenticate your Elastic connector:
 
 1. Return to the **IAM** menu. Select **Users** from the navigation menu.
-2. Search for the user you just created, and click its name.
+2. Search for the user you created, and click its name.
 3. Go to the **Security credentials** tab.
 4. Under **Access keys**, click **Create access key**.
 5. Select **Third-party service**, check the box under **Confirmation**, click **Next**, then click **Create access key**.
@@ -102,7 +102,7 @@ Finally, configure the connector in {{kib}}:
 2. Find the **Connectors** page in the navigation menu or use the [global search field](/explore-analyze/find-and-organize/find-apps-and-objects.md). Then click **Create Connector**, and select **Amazon Bedrock**.
 3. Name your connector.
 4. (Optional) Configure the Amazon Bedrock connector to use a different AWS region where Anthropic models are supported by editing the **URL** field, for example by changing `us-east-1` to `eu-central-1`.
-5. (Optional) Add one of the following strings if you want to use a model other than the default. Note that these model IDs should have a prefix of `us.` or `eu.`, depending on your region, for example `us.anthropic.claude-3-5-sonnet-20240620-v1:0` or `eu.anthropic.claude-3-5-sonnet-20240620-v1:0`.
+5. (Optional) Add one of the following strings if you want to use a model other than the default. These model IDs should have a prefix of `us.` or `eu.`, depending on your region, for example `us.anthropic.claude-3-5-sonnet-20240620-v1:0` or `eu.anthropic.claude-3-5-sonnet-20240620-v1:0`.
 
 * Sonnet 3.5: `us.anthropic.claude-3-5-sonnet-20240620-v1:0` or `eu.anthropic.claude-3-5-sonnet-20240620-v1:0`
 * Sonnet 3.5 v2: `us.anthropic.claude-3-5-sonnet-20241022-v2:0` or `eu.anthropic.claude-3-5-sonnet-20241022-v2:0`
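If you manage connectors in `kibana.yml` rather than the UI, the region and model choices from steps 4–5 might be expressed roughly like this (a hedged sketch — the connector ID, name, and credential values are placeholders, and the field names assume {{kib}}'s Amazon Bedrock connector schema):

```yaml
# Hypothetical preconfigured Amazon Bedrock connector; ID, name, and secrets are placeholders.
xpack.actions.preconfigured:
  my-bedrock-connector:
    name: Amazon Bedrock (eu-central-1)
    actionTypeId: .bedrock
    config:
      apiUrl: https://bedrock-runtime.eu-central-1.amazonaws.com   # region edited as in step 4
      defaultModel: eu.anthropic.claude-3-5-sonnet-20240620-v1:0   # eu. prefix to match the region
    secrets:
      accessKey: <aws-access-key-id>
      secret: <aws-secret-access-key>
```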

solutions/security/ai/connect-to-azure-openai.md renamed to explore-analyze/ai-features/llm-guides/connect-to-azure-openai.md

File renamed without changes.

solutions/security/ai/connect-to-google-vertex.md renamed to explore-analyze/ai-features/llm-guides/connect-to-google-vertex.md

File renamed without changes.

solutions/observability/connect-to-own-local-llm.md renamed to explore-analyze/ai-features/llm-guides/connect-to-lmstudio-observability.md

Lines changed: 6 additions & 5 deletions
@@ -1,19 +1,20 @@
 ---
-navigation_title: Connect to a local LLM
+navigation_title: Connect to LM Studio for {{observability}}
 mapped_pages:
   - https://www.elastic.co/guide/en/observability/current/connect-to-local-llm.html
 applies_to:
   stack: ga 9.2
-  serverless: ga
+  serverless:
+    observability: ga
 products:
   - id: observability
 ---
 
-# Connect to your own local LLM
+# Connect to a local LLM for {{observability}} using LM Studio
 
 :::{important}
 Elastic doesn’t support the setup and configuration of local LLMs. The example provided is for reference only.
-Before using a local LLM, evaluate its performance according to the [LLM performance matrix](./llm-performance-matrix.md#evaluate-your-own-model).
+Before using a local LLM, evaluate its performance according to the [LLM performance matrix](/solutions/observability/llm-performance-matrix.md#evaluate-your-own-model).
 :::
 
 This page provides instructions for setting up a connector to a large language model (LLM) of your choice using LM Studio. This allows you to use your chosen model within the {{obs-ai-assistant}}. You’ll first need to set up LM Studio, then download and deploy a model via LM studio and finally configure the connector in your Elastic deployment.
@@ -152,5 +153,5 @@ While local (open-weight) LLMs offer greater privacy and control, they generally
 
 Local LLMs in air-gapped environments have specific installation and configuration instructions for deploying ELSER and configuring product documentation. Refer to the following links for more information:
 
-- [Deploy ELSER in an air-gapped environment](../../explore-analyze/machine-learning/nlp/ml-nlp-elser.md#air-gapped-install)
+- [Deploy ELSER in an air-gapped environment](/explore-analyze/machine-learning/nlp/ml-nlp-elser.md#air-gapped-install)
 - [Configure product documentation for air-gapped-environments](kibana://reference/configuration-reference/ai-assistant-settings.md#configuring-product-doc-for-airgap)

solutions/security/ai/connect-to-own-local-llm.md renamed to explore-analyze/ai-features/llm-guides/connect-to-lmstudio-security.md

Lines changed: 2 additions & 1 deletion
@@ -1,4 +1,5 @@
 ---
+navigation_title: Connect to LM Studio for {{elastic-sec}}
 mapped_pages:
   - https://www.elastic.co/guide/en/security/current/connect-to-byo-llm.html
   - https://www.elastic.co/guide/en/serverless/current/connect-to-byo-llm.html
@@ -11,7 +12,7 @@ products:
   - id: cloud-serverless
 ---
 
-# Connect to your own local LLM using LM Studio
+# Connect to a local LLM for {{elastic-sec}} using LM Studio
 
 This page provides instructions for setting up a connector to a large language model (LLM) of your choice using LM Studio. This allows you to use your chosen model within {{elastic-sec}}. You’ll first need to set up a reverse proxy to communicate with {{elastic-sec}}, then set up LM Studio on a server, and finally configure the connector in your Elastic deployment. [Learn more about the benefits of using a local LLM](https://www.elastic.co/blog/ai-assistant-locally-hosted-models).
 
solutions/security/ai/connect-to-openai.md renamed to explore-analyze/ai-features/llm-guides/connect-to-openai.md

File renamed without changes.

solutions/security/ai/connect-to-vLLM.md renamed to explore-analyze/ai-features/llm-guides/connect-to-vLLM.md

Lines changed: 3 additions & 2 deletions
@@ -1,4 +1,5 @@
 ---
+navigation_title: Connect to vLLM for {{elastic-sec}}
 applies_to:
   stack: all
   serverless:
@@ -8,7 +9,7 @@ products:
   - id: cloud-serverless
 ---
 
-# Connect to your own LLM using vLLM (air gapped environments)
+# Connect to your own LLM using vLLM (air-gapped environments)
 This guide shows you how to run an OpenAI-compatible large language model with [vLLM](https://docs.vllm.ai/en/latest/) and connect it to Elastic. The setup runs inside Docker or Podman, is served through an Nginx reverse proxy, and does not require any outbound network access. This makes it a safe option for air-gapped environments or deployments with strict network controls.
 
 The steps below show one example configuration, but you can use any model supported by vLLM, including private and gated models on Hugging Face.
@@ -189,4 +190,4 @@ With your vLLM connector set up, you can use it to power features including:
 * [Automatic import](/solutions/security/get-started/automatic-import.md): Use AI to create custom integrations for third-party data sources.
 * [AI Assistant for Observability and Search](/solutions/observability/observability-ai-assistant.md): Interact with an agent designed to assist with {{observability}} and Search tasks.
 
-You can also learn how to [set up other types of LLM connectors](/solutions/security/ai/set-up-connectors-for-large-language-models-llm.md).
+You can also learn how to [set up other types of LLM connectors](/explore-analyze/ai-features/llm-guides/llm-connectors.md).
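As a point of reference for the setup this guide describes, serving an OpenAI-compatible model with vLLM in Docker might look roughly like the following compose file (a hypothetical sketch — the model, port, and GPU settings are illustrative assumptions, not the guide's exact configuration; in an air-gapped environment the model weights would need to be staged locally in advance):

```yaml
# Hypothetical compose file for an OpenAI-compatible vLLM server; the model,
# port, and GPU settings are illustrative, not the guide's exact values.
services:
  vllm:
    image: vllm/vllm-openai:latest
    command: ["--model", "mistralai/Mistral-7B-Instruct-v0.3"]
    ports:
      - "8000:8000"          # vLLM's default OpenAI-compatible API port
    volumes:
      - ~/.cache/huggingface:/root/.cache/huggingface   # pre-downloaded model weights
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: all
              capabilities: [gpu]
```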
explore-analyze/ai-features/llm-guides/llm-connectors.md

Lines changed: 49 additions & 0 deletions
@@ -0,0 +1,49 @@
+---
+mapped_pages:
+  - https://www.elastic.co/guide/en/security/current/llm-connector-guides.html
+  - https://www.elastic.co/guide/en/serverless/current/security-llm-connector-guides.html
+applies_to:
+  stack: ga
+  serverless: ga
+products:
+  - id: observability
+  - id: elasticsearch
+  - id: security
+  - id: cloud-serverless
+---
+
+# Configure access to LLMs
+
+Elastic's [AI features](/explore-analyze/ai-features.md) work with the out-of-the-box Elastic Managed LLMs or with third-party LLMs configured using one of the available connectors.
+
+## Elastic Managed LLMs
+
+:::{include} ../../../solutions/_snippets/elastic-managed-llm.md
+:::
+
+## Connect to a third-party or self-managed LLM
+
+Follow these guides to connect to one or more third-party LLM providers:
+
+* [Azure OpenAI](/explore-analyze/ai-features/llm-guides/connect-to-azure-openai.md)
+* [Amazon Bedrock](/explore-analyze/ai-features/llm-guides/connect-to-amazon-bedrock.md)
+* [OpenAI](/explore-analyze/ai-features/llm-guides/connect-to-openai.md)
+* [Google Vertex](/explore-analyze/ai-features/llm-guides/connect-to-google-vertex.md)
+* [Self-managed LLMs](/explore-analyze/ai-features/llm-guides/local-llms-overview.md)
+
+## Preconfigured connectors
+
+```{applies_to}
+stack: ga
+serverless: unavailable
+```
+
+You can also use [preconfigured connectors](kibana://reference/connectors-kibana/pre-configured-connectors.md) to set up third-party LLM connectors by editing the `kibana.yml` file. This allows you to enable a connector for multiple spaces at once, without performing setup in the {{kib}} UI for each space.
+
+If you use a preconfigured connector for your LLM connector, we recommend adding the `exposeConfig: true` parameter to the `xpack.actions.preconfigured` section of the `kibana.yml` config file. This parameter makes debugging easier by adding configuration information to the debug logs, including which LLM the connector uses.
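For example, a preconfigured OpenAI-compatible connector with `exposeConfig: true` might look roughly like this in `kibana.yml` (a hedged sketch — the connector ID, name, endpoint, and key are placeholders, not values from this PR):

```yaml
# Hypothetical preconfigured connector entry; ID, name, URL, and key are placeholders.
xpack.actions.preconfigured:
  my-llm-connector:
    name: Preconfigured LLM connector
    actionTypeId: .gen-ai
    exposeConfig: true   # adds connector config to debug logs for easier troubleshooting
    config:
      apiProvider: OpenAI
      apiUrl: https://api.openai.com/v1/chat/completions
    secrets:
      apiKey: <your-api-key>
```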
