doc: Fix missing templates + add documentation process in CONTRIBUTING.md

Arthur Amstutz 2025-08-20 14:32:22 +00:00
parent 384c25bdb4
commit f2b00407e8
No known key found for this signature in database
GPG key ID: F3C6FC59C43A6EF6
17 changed files with 81 additions and 81 deletions

.gitignore vendored
View file

@ -36,4 +36,7 @@ terraform-provider-ovh
examples/kube-nodepool-deployment/.terraform.lock.hcl
examples/kube-nodepool-deployment/.terraform.tfstate.lock.info
examples/kube-nodepool-deployment/logs
examples/kube-nodepool-deployment/my-kube-cluster-*.yml
examples/kube-nodepool-deployment/my-kube-cluster-*.yml
# Documentation exclusions
docs/resources/dedicated_server_networking.md

View file

@ -19,6 +19,7 @@ This project accepts contributions. In order to contribute, you should pay atten
- The examples of resources and datasources in the documentation must follow the [Terraform style guidelines](https://developer.hashicorp.com/terraform/language/style)
- Check your documentation through [Terraform Doc Preview Tool](https://registry.terraform.io/tools/doc-preview)
- When adding a documentation page, use the `subcategory:` tag in the [YAML Frontmatter](https://developer.hashicorp.com/terraform/registry/providers/docs#yaml-frontmatter) with a value equal to the product name defined in the OVHcloud [product map](https://www.product-map.ovh/)
- New documentation pages should first be added in the `templates/` directory, with the corresponding examples placed in the `examples/` directory. Once this is done, the content of the `docs/` directory must be generated with [tfplugindocs](https://github.com/hashicorp/terraform-plugin-docs?tab=readme-ov-file#usage).
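A minimal sketch of that generation step (assuming `tfplugindocs` is installed via `go install`; run it from the repository root so `templates/` and `examples/` are picked up):
```bash
# Install the documentation generator (pin the version as the project requires).
go install github.com/hashicorp/terraform-plugin-docs/cmd/tfplugindocs@latest

# Render templates/ and examples/ into docs/.
tfplugindocs generate

# Optionally check the rendered pages for problems before committing.
tfplugindocs validate
```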
## Acceptance tests:

View file

@ -18,7 +18,7 @@ data "ovh_dbaas_logs_output_graylog_stream" "stream" {
## Argument Reference
* `service_name` - The service name. It's the ID of your Logs Data Platform instance.
* `title` - Stream description
* `title` - Stream name
## Attributes Reference

View file

@ -1,34 +1,29 @@
---
# generated by https://github.com/hashicorp/terraform-plugin-docs
page_title: "ovh_dbaas_logs_output_graylog_stream_url Data Source - terraform-provider-ovh"
subcategory: ""
description: |-
subcategory : "Logs Data Platform"
---
# ovh_dbaas_logs_output_graylog_stream_url (Data Source)
Use this data source to retrieve the list of URLs for a DBaas logs output Graylog stream.
## Example Usage
```terraform
data "ovh_dbaas_logs_output_graylog_stream_url" "urls" {
service_name = "ldp-xx-xxxxx"
stream_id = "STREAM_ID"
}
```
## Argument Reference
<!-- schema generated by tfplugindocs -->
## Schema
* `service_name` - The service name. It's the ID of your Logs Data Platform instance.
* `stream_id` - Stream ID.
### Required
## Attributes Reference
- `service_name` (String) The service name
- `stream_id` (String) Stream ID
The following attributes are exported:
### Read-Only
- `id` (String) The ID of this resource.
- `url` (List of Object) (see [below for nested schema](#nestedatt--url))
<a id="nestedatt--url"></a>
### Nested Schema for `url`
Read-Only:
- `address` (String)
- `type` (String)
* `url` - List of URLs. Each element contains:
* `address` - URL address
* `type` - URL type (e.g. `GRAYLOG_WEBUI`, `WEB_SOCKET`)
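As a hedged illustration of consuming the exported `url` list (building on the `urls` example above; filtering on `GRAYLOG_WEBUI` is only one possible choice):
```terraform
# Expose the Graylog web UI address from the exported url list.
output "graylog_webui_address" {
  value = one([
    for u in data.ovh_dbaas_logs_output_graylog_stream_url.urls.url :
    u.address if u.type == "GRAYLOG_WEBUI"
  ])
}
```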

View file

@ -22,8 +22,8 @@ data "ovh_domain_zone" "root_zone" {
`id` is set to the domain zone name. In addition, the following attributes are exported:
* `urn` - URN of the DNS zone
* `last_update` - Last update date of the DNS zone
* `has_dns_anycast` - hasDnsAnycast flag of the DNS zone
* `name_servers` - Name servers that host the DNS zone
* `dnssec_supported` - Is DNSSEC supported by this zone
* `urn` - URN of the DNS Zone to be used inside an IAM policy

View file

@ -25,5 +25,6 @@ data "ovh_me_api_oauth2_client" "my_oauth2_client" {
* `description` - OAuth2 client description.
* `flow` - The OAuth2 flow to use. `AUTHORIZATION_CODE` or `CLIENT_CREDENTIALS` are supported at the moment.
* `callback_urls` - List of callback urls when configuring the `AUTHORIZATION_CODE` flow.
* `identity` - Identity URN of the service account to be used inside an IAM policy.
The `client_secret` attribute is not supported in the Data Source. If you need this attribute to be in the state, see how to import a `ovh_me_api_oauth2_client` resource instead.
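A hedged sketch of how the new `identity` attribute (together with the zone `urn` from the previous page) might be wired into an IAM policy; the `ovh_iam_policy` argument names and the wildcard action are assumptions for illustration only:
```terraform
# Hypothetical policy: let the OAuth2 service account act on the DNS zone.
# The identities/resources/allow argument names are assumed, and "*" stands
# in for a properly scoped action list.
resource "ovh_iam_policy" "client_manages_zone" {
  name       = "oauth2-client-manages-zone"
  identities = [data.ovh_me_api_oauth2_client.my_oauth2_client.identity]
  resources  = [data.ovh_domain_zone.root_zone.urn]
  allow      = ["*"]
}
```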

View file

@ -37,8 +37,8 @@ resource "ovh_dbaas_logs_output_graylog_stream" "stream" {
The following arguments are supported:
* `service_name` - (Required) The service name
* `title` - (Required) Stream name
* `description` - (Required) Stream description
* `title` - (Required) Stream description
* `parent_stream_id` - Parent stream ID
* `retention_id` - Retention ID
* `cold_storage_compression` - Cold storage compression method. One of "LZMA", "GZIP", "DEFLATED", "ZSTD"
@ -50,7 +50,7 @@ The following arguments are supported:
* `indexing_enabled` - Enable ES indexing
* `indexing_max_size` - Maximum indexing size (in GB)
* `indexing_notify_enabled` - If set, notify when size is near 80, 90 or 100 % of the maximum configured setting
* `pause_indexing_on_max_size` - If set, pause indexing when maximum size is reach
* `pause_indexing_on_max_size` - If set, pause indexing when maximum size is reached
* `web_socket_enabled` - Enable Websocket
## Attributes Reference
@ -66,3 +66,11 @@ Id is set to the output stream Id. In addition, the following attributes are exp
* `stream_id` - Stream ID
* `updated_at` - Stream last updater
* `write_token` - Write token of the stream (empty if the caller is not the owner of the stream)
## Import
DBaas logs output Graylog stream can be imported using the `service_name` of the cluster and the `stream_id` of the Graylog output stream, separated by "/". E.g.,
```bash
$ terraform import ovh_dbaas_logs_output_graylog_stream.ldp ldp-az-12345/9d2f9cf8-9f92-1337-c0f3-48a0213d2c6f
```
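For projects on Terraform 1.5 or later, a config-driven equivalent of the CLI import above would look roughly like this (sketch only, reusing the same example ID):
```terraform
# Terraform 1.5+ import block, equivalent to the `terraform import` command above.
import {
  to = ovh_dbaas_logs_output_graylog_stream.ldp
  id = "ldp-az-12345/9d2f9cf8-9f92-1337-c0f3-48a0213d2c6f"
}
```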

View file

@ -1,48 +0,0 @@
---
# generated by https://github.com/hashicorp/terraform-plugin-docs
page_title: "ovh_dedicated_server_networking Resource - terraform-provider-ovh"
subcategory: ""
description: |-
---
# ovh_dedicated_server_networking (Resource)
<!-- schema generated by tfplugindocs -->
## Schema
### Required
- `interfaces` (Block Set, Min: 1, Max: 2) Interface or interfaces aggregation. (see [below for nested schema](#nestedblock--interfaces))
- `service_name` (String) The internal name of your dedicated server.
### Optional
- `timeouts` (Block, Optional) (see [below for nested schema](#nestedblock--timeouts))
### Read-Only
- `description` (String) Operation description
- `id` (String) The ID of this resource.
- `status` (String) Operation status
<a id="nestedblock--interfaces"></a>
### Nested Schema for `interfaces`
Required:
- `macs` (Set of String) Interface Mac address
- `type` (String) Interface type
<a id="nestedblock--timeouts"></a>
### Nested Schema for `timeouts`
Optional:
- `create` (String)
- `delete` (String)

View file

@ -80,7 +80,7 @@ Id is set to the order Id. In addition, the following attributes are exported:
* `last_update` - Last update date of the DNS zone
* `name` - Zone name
* `name_servers` - Name servers that host the DNS zone
* `urn` - URN of the DNS Zone to be used inside an IAM policy
* `order` - Details about an Order
* `date` - date
* `order_id` - order id

View file

@ -44,6 +44,7 @@ resource "ovh_me_api_oauth2_client" "my_oauth2_client_client_creds" {
* `description` - OAuth2 client description.
* `flow` - The OAuth2 flow to use. `AUTHORIZATION_CODE` or `CLIENT_CREDENTIALS` are supported at the moment.
* `callback_urls` - List of callback urls when configuring the `AUTHORIZATION_CODE` flow.
* `identity` - Identity URN of the service account to be used inside an IAM policy.
## Import

View file

@ -17,7 +17,7 @@ Use this data source to retrieve information about a DBaas logs output graylog s
## Argument Reference
* `service_name` - The service name. It's the ID of your Logs Data Platform instance.
* `title` - Stream description
* `title` - Stream name
## Attributes Reference

View file

@ -0,0 +1,29 @@
---
subcategory : "Logs Data Platform"
---
# ovh_dbaas_logs_output_graylog_stream_url (Data Source)
Use this data source to retrieve the list of URLs for a DBaas logs output Graylog stream.
## Example Usage
```terraform
data "ovh_dbaas_logs_output_graylog_stream_url" "urls" {
service_name = "ldp-xx-xxxxx"
stream_id = "STREAM_ID"
}
```
## Argument Reference
* `service_name` - The service name. It's the ID of your Logs Data Platform instance.
* `stream_id` - Stream ID.
## Attributes Reference
The following attributes are exported:
* `url` - List of URLs. Each element contains:
* `address` - URL address
* `type` - URL type (e.g. `GRAYLOG_WEBUI`, `WEB_SOCKET`)

View file

@ -22,8 +22,8 @@ Use this data source to retrieve information about a domain zone.
`id` is set to the domain zone name. In addition, the following attributes are exported:
* `urn` - URN of the DNS zone
* `last_update` - Last update date of the DNS zone
* `has_dns_anycast` - hasDnsAnycast flag of the DNS zone
* `name_servers` - Name servers that host the DNS zone
* `dnssec_supported` - Is DNSSEC supported by this zone
* `urn` - URN of the DNS Zone to be used inside an IAM policy

View file

@ -25,5 +25,6 @@ Use this data source to retrieve information about an existing OAuth2 service ac
* `description` - OAuth2 client description.
* `flow` - The OAuth2 flow to use. `AUTHORIZATION_CODE` or `CLIENT_CREDENTIALS` are supported at the moment.
* `callback_urls` - List of callback urls when configuring the `AUTHORIZATION_CODE` flow.
* `identity` - Identity URN of the service account to be used inside an IAM policy.
The `client_secret` attribute is not supported in the Data Source. If you need this attribute to be in the state, see how to import a `ovh_me_api_oauth2_client` resource instead.

View file

@ -22,8 +22,8 @@ To define the retention of the stream, you can use the following configuration:
The following arguments are supported:
* `service_name` - (Required) The service name
* `title` - (Required) Stream name
* `description` - (Required) Stream description
* `title` - (Required) Stream description
* `parent_stream_id` - Parent stream ID
* `retention_id` - Retention ID
* `cold_storage_compression` - Cold storage compression method. One of "LZMA", "GZIP", "DEFLATED", "ZSTD"
@ -35,7 +35,7 @@ The following arguments are supported:
* `indexing_enabled` - Enable ES indexing
* `indexing_max_size` - Maximum indexing size (in GB)
* `indexing_notify_enabled` - If set, notify when size is near 80, 90 or 100 % of the maximum configured setting
* `pause_indexing_on_max_size` - If set, pause indexing when maximum size is reach
* `pause_indexing_on_max_size` - If set, pause indexing when maximum size is reached
* `web_socket_enabled` - Enable Websocket
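A minimal sketch making the `title`/`description` distinction concrete (placeholder service name, required arguments only):
```terraform
resource "ovh_dbaas_logs_output_graylog_stream" "stream" {
  service_name = "ldp-xx-xxxxx"
  title        = "my stream"             # the stream name
  description  = "Stream for app logs"   # free-text description
}
```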
## Attributes Reference
@ -51,3 +51,11 @@ Id is set to the output stream Id. In addition, the following attributes are exp
* `stream_id` - Stream ID
* `updated_at` - Stream last updater
* `write_token` - Write token of the stream (empty if the caller is not the owner of the stream)
## Import
DBaas logs output Graylog stream can be imported using the `service_name` of the cluster and the `stream_id` of the Graylog output stream, separated by "/". E.g.,
```bash
$ terraform import ovh_dbaas_logs_output_graylog_stream.ldp ldp-az-12345/9d2f9cf8-9f92-1337-c0f3-48a0213d2c6f
```

View file

@ -51,7 +51,7 @@ Id is set to the order Id. In addition, the following attributes are exported:
* `last_update` - Last update date of the DNS zone
* `name` - Zone name
* `name_servers` - Name servers that host the DNS zone
* `urn` - URN of the DNS Zone to be used inside an IAM policy
* `order` - Details about an Order
* `date` - date
* `order_id` - order id

View file

@ -35,6 +35,7 @@ An OAuth2 client for an app hosted at `my-app.com`, that uses the client credent
* `description` - OAuth2 client description.
* `flow` - The OAuth2 flow to use. `AUTHORIZATION_CODE` or `CLIENT_CREDENTIALS` are supported at the moment.
* `callback_urls` - List of callback urls when configuring the `AUTHORIZATION_CODE` flow.
* `identity` - Identity URN of the service account to be used inside an IAM policy.
## Import