fix(docs): standardize ai-prompt-template attribute table to 6-column format
Convert attribute table from 4-column to 6-column format (Name, Type,
Required, Default, Valid values, Description) for consistency across
plugin docs. Move role enum values to Valid values column.
docs/en/latest/plugins/ai-prompt-template.md (9 additions, 9 deletions)
@@ -40,15 +40,15 @@ The `ai-prompt-template` Plugin supports pre-configuring prompt templates that o

 ## Plugin Attributes

-|**Field**|**Required**|**Type**|**Description**|
-| --- | --- | --- | --- |
-|`templates`| True |array| An array of template objects. |
-|`templates.name`| True |string| Name of the template. When requesting the Route, the request should include the template name that corresponds to the configured template. |
-|`templates.template.model`| False |string| Name of the LLM model, such as `gpt-4` or `gpt-3.5`. See your LLM provider API documentation for more available models. |
-|`templates.template.messages.role`|True|string|Role of the message. Valid values are `system`, `user`, and `assistant`. |
-|`templates.template.messages.content`| True |string| Content of the message (prompt). Use `{{variable_name}}` syntax to define template variables that will be filled from the request body. |
+| Name | Type | Required | Default | Valid values | Description |
+| ---- | ---- | -------- | ------- | ------------ | ----------- |
+|`templates`|array|True||| An array of template objects. |
+|`templates.name`|string|True||| Name of the template. When requesting the Route, the request should include the template name that corresponds to the configured template. |
+|`templates.template.model`|string|False||| Name of the LLM model, such as `gpt-4` or `gpt-3.5`. See your LLM provider API documentation for more available models. |
+|`templates.template.messages.role`|string|True||[`system`, `user`, `assistant`]| Role of the message. |
+|`templates.template.messages.content`|string|True||| Content of the message (prompt). Use `{{variable_name}}` syntax to define template variables that will be filled from the request body. |
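For reference, a minimal plugin configuration matching the attribute table above might look like the following sketch. The template name `summarize` and the variable `{{text}}` are illustrative placeholders, not values from this commit:

```json
{
  "templates": [
    {
      "name": "summarize",
      "template": {
        "model": "gpt-4",
        "messages": [
          {
            "role": "system",
            "content": "You are a concise assistant."
          },
          {
            "role": "user",
            "content": "Summarize the following text: {{text}}"
          }
        ]
      }
    }
  ]
}
```

Per the `templates.template.messages.content` description, `{{text}}` would be filled from the request body when the Route is called with the matching template name.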