
Commit a5aa26d

Merge branch 'dev' into listing-data-source
# Conflicts: # MIGRATION_GUIDE.md
2 parents 0e9fcf0 + 8ad8fda commit a5aa26d

File tree

57 files changed: +1878 −191 lines changed


CONTRIBUTING.md

Lines changed: 101 additions & 5 deletions
@@ -8,12 +8,13 @@
 - [Making a contribution](#making-a-contribution)
 - [Discuss a change with us!](#discuss-a-change-with-us)
 - [Follow the code conventions inside the repository](#follow-the-code-conventions-inside-the-repository)
-- [Introducing a new part of the SDK](#introducing-a-new-part-of-the-sdk)
 - [Test the change](#test-the-change)
 - [Describe the breaking changes](#describe-the-breaking-changes)
 - [Before submitting the PR](#before-submitting-the-pr)
 - [Naming and describing the PR](#naming-and-describing-the-pr)
 - [Requesting the review](#requesting-the-review)
+- [Adding support for a new snowflake object](#adding-support-for-a-new-snowflake-object)
+- [Introducing a new part of the SDK](#add-the-object-to-the-sdk)
 - [Advanced Debugging](#advanced-debugging)
 - [Extending the migration script](#extending-the-migration-script)

@@ -104,10 +105,6 @@ It's best to approach us through the GitHub issues: either by commenting the alr
 ### Follow the code conventions inside the repository
 We believe that code following the same conventions is easier to maintain and extend. When working on a given part of the provider, try to follow the local solutions and not introduce too many new ideas.
-### Introducing a new part of the SDK
-To create new objects in our SDK we use quickly created generator that outputs the majority of the files needed. These files should be later edited and filled with the missing parts. We plan to improve the generator later on, but it should be enough for now. Please read more in the [generator readme](pkg/sdk/generator/README.md).
 ### Test the change
 Every introduced change should be tested. Depending on the type of the change it may require (any or a mix of):
 - adding/modifying existing unit tests (e.g. changing the behavior of validation in the SDK)
@@ -144,8 +141,107 @@ We check for the new PRs in our repository every day Monday-Friday. We usually n

During our review we try to point out unhandled special cases, missing tests, and deviations from the established conventions. Remember, a review comment is like an invitation to dance: you don't have to agree, but please provide substantive reasons.

Please do not resolve our comments. We prefer to resolve them ourselves after the contributor has addressed them.

**⚠️ Important ⚠️** Tests and checks are not run automatically after your PR. We run them manually, when we are happy with the state of the change (even if some corrections are still necessary).
## Adding support for a new Snowflake object

This guide describes the end-to-end process of adding support for a new Snowflake object in the Terraform provider. The work is typically split into multiple PRs: SDK first, then SDK integration tests, then the Terraform resource, and finally the data source (SDK → integration tests → resource → data source).

### Prerequisites and conventions

- Use the SDK generator to define the object and produce the bulk of the implementation and validations. See the SDK generator [README](pkg/sdk/generator/README.md) for the file layout, generation parts, and commands.
- Do not edit generated files; place any custom helpers or overrides in `*_ext.go` files. This is the pattern used across the SDK.
- Map both SHOW and DESCRIBE outputs. If the outputs differ, generate separate mapping structs and conversion paths; do not force a shared struct across both operations.
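The separate-mapping convention can be sketched as follows. This is a minimal, self-contained Go illustration, not the generator's actual output; the row and struct names are hypothetical:

```go
package main

import "fmt"

// Hypothetical raw row scanned from SHOW NOTEBOOKS output.
type notebookShowRow struct {
	Name    string
	Comment string
}

// Hypothetical raw row scanned from DESCRIBE NOTEBOOK output,
// which returns a different shape than SHOW.
type notebookDescribeRow struct {
	Name           string
	QueryWarehouse string
}

// Separate target structs and conversion paths per operation:
// no shared struct is forced across SHOW and DESCRIBE.
type Notebook struct{ Name, Comment string }
type NotebookDetails struct{ Name, QueryWarehouse string }

func (r notebookShowRow) convert() *Notebook {
	return &Notebook{Name: r.Name, Comment: r.Comment}
}

func (r notebookDescribeRow) convert() *NotebookDetails {
	return &NotebookDetails{Name: r.Name, QueryWarehouse: r.QueryWarehouse}
}

func main() {
	s := notebookShowRow{Name: "N1", Comment: "demo"}
	d := notebookDescribeRow{Name: "N1", QueryWarehouse: "WH"}
	fmt.Println(s.convert().Comment, d.convert().QueryWarehouse)
}
```

Keeping the two conversion paths independent means a later change to one command's output shape never forces edits to the other mapping.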
### Add the object to the SDK

- Create `<object_name_plural>_def.go` in the SDK generator defs directory.
- Use the DSL to configure operations: CREATE/ALTER/DROP, DESCRIBE, SHOW, and optional ShowById helpers. For example, notebooks use DescribeOperation, ShowOperation, and ShowByIdOperationWithFiltering to support both SHOW and DESCRIBE flows plus by-ID retrieval.
- If the server returns different shapes for SHOW and DESCRIBE, generate and map them separately.
- From the repository root, run `make generate-sdk` to build all parts.
- Expect the generator to create the interface, DTOs, builders, validations, implementation, and unit test placeholders (e.g., `_gen.go`, `_dto_gen.go`, `_dto_builders_gen.go`, `_validations_gen.go`, `_impl_gen.go`, `_gen_test.go`).
- Avoid encoding server-side numeric ranges unless they are stable and guaranteed; rely on Snowflake to validate ranges and limits, and check only basic cases like integers being non-negative.
- Implement unit tests.

Take a look at the [generator readme](pkg/sdk/generator/README.md) and an example [SDK implementation for notebooks](https://github.com/snowflakedb/terraform-provider-snowflake/pull/4084).
### Add integration tests

Add integration tests under the SDK's testint package to validate the SDK behavior against a live Snowflake connection.

Recommended coverage:
- Lifecycle: create, show, describe, alter (set/unset combinations), rename, drop, and show-by-id where applicable. Prefer asserting fields you directly control (e.g., comment) and anything with server defaults you depend on.
- Nil handling: ensure tests don't panic on optional pointers (e.g., check for `nil` before calling `.Name()` on an identifier pointer).
- ALTER validations: test both the invalid "none set" and "more than one set" branches when you have ExactlyOneValueSet or AtLeastOneValueSet rules.
- Error parity: assert the correct error kinds for missing objects (e.g., prefer consistent "object does not exist or not authorized" variants).
- Assertion helpers: the generator can produce "object asserts" for SHOW/DESC outputs. Use generated assertion structs for concision, but add nil-checks to avoid panics on optional fields.
- Use `make generate-snowflake-object-assertions` to generate the assertions for the integration tests.
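The "none set" and "more than one set" branches above are worth spelling out. The sketch below mimics the spirit of an ExactlyOneValueSet rule; the function name and signature are illustrative, not the SDK's actual API:

```go
package main

import (
	"errors"
	"fmt"
)

// exactlyOneValueSet illustrates an "exactly one set" validation:
// exactly one of the given optional pointers must be non-nil.
// This is a hypothetical helper, not the SDK's ExactlyOneValueSet.
func exactlyOneValueSet(values ...any) error {
	count := 0
	for _, v := range values {
		switch p := v.(type) {
		case *string:
			if p != nil {
				count++
			}
		case *bool:
			if p != nil {
				count++
			}
		}
	}
	if count != 1 {
		return errors.New("exactly one field must be set")
	}
	return nil
}

func main() {
	comment := "new comment"
	// Valid: exactly one alter action is set.
	fmt.Println(exactlyOneValueSet(&comment, (*bool)(nil)))
	// Invalid "none set" branch: an integration test should cover this error.
	fmt.Println(exactlyOneValueSet((*string)(nil), (*bool)(nil)))
}
```

Integration tests should exercise both error branches explicitly, since the generated validations are the only guard before the SQL reaches Snowflake.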

Take a look at the [generator readme](pkg/sdk/generator/README.md) and an example [Integration tests implementation for notebooks](https://github.com/snowflakedb/terraform-provider-snowflake/pull/4123).

### Add resource

Implement the resource schema, read/create/update/delete, acceptance tests, and docs. Use the SDK as the source of truth and mirror its SHOW/DESC coverage and validations.

- Schema design
  - Prefer nested blocks for structured inputs. For example, "create from a stage" is modeled as a `from { stage = "<db>.<schema>.<stage>" path = "path/to/file" }` block rather than a flat string, to align with Snowflake semantics and improve validation.
  - Validate identifiers with the provider's identifier validators (e.g., `IsValidIdentifier[...]`) and suppress quoting-only diffs for identifier fields (`suppressIdentifierQuoting`).
- Update semantics
  - If possible, implement rename in place (`ALTER … RENAME TO …`) rather than with ForceNew. Align with how recently refactored resources handle renames.
  - Detect external changes for derived outputs via SHOW/DESC triggers when possible. If a particular field cannot be detected externally (e.g., the notebooks "from" location, due to Snowflake limitations), document that limitation explicitly in the resource docs.
- Defaults and constraints surfaced in docs
  - Where Snowflake restricts identifier casing (e.g., only upper-case identifiers are valid for specific warehouse references), document it explicitly and add validators to prevent invalid inputs in plans.
- Documentation and migration guide
  - Add a Migration Guide entry under the correct version, grouping object support under a single H3 "(new feature) snowflake_" heading with H4 subsections for "Added resource" and "Added data source".
  - When server capabilities are incomplete, document the current limitations and ensure Create/Update sequences handle the supported paths without requiring double-applies. Remember to use the model builder and assertions that you can generate automatically.
- Implement acceptance tests
  - Provide "basic" and "complete" cases; test rename, validations, and plan drift (ConfigPlanChecks). Avoid relying on "Safe" client wrappers for correctness checks; validate against the same paths real users hit.
  - Use `make generate-show-output-schemas` to generate SHOW output schemas.
  - Use `make generate-all-assertions-and-config-models` to generate assertions and config models.

Take a look at an example [Resource implementation for notebooks](https://github.com/snowflakedb/terraform-provider-snowflake/pull/4195).
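The rename-in-place update semantics described above can be sketched roughly like this. The types and SQL strings are illustrative (the provider's real Update functions go through the SDK client, and identifiers use `sdk.SchemaObjectIdentifier`):

```go
package main

import "fmt"

// objectID is an illustrative stand-in for the SDK's schema object identifier.
type objectID struct{ db, schema, name string }

func (id objectID) fullyQualified() string {
	return fmt.Sprintf("%q.%q.%q", id.db, id.schema, id.name)
}

// planUpdateStatements decides, like a resource Update function would, whether
// a rename must run (ALTER ... RENAME TO ...) instead of forcing recreation,
// and then applies any remaining ALTER actions against the new identifier.
func planUpdateStatements(prior, desired objectID, newComment *string) []string {
	var stmts []string
	if prior.name != desired.name {
		// Rename in place instead of ForceNew (destroy and recreate).
		stmts = append(stmts, fmt.Sprintf("ALTER NOTEBOOK %s RENAME TO %s",
			prior.fullyQualified(), desired.fullyQualified()))
	}
	if newComment != nil {
		stmts = append(stmts, fmt.Sprintf("ALTER NOTEBOOK %s SET COMMENT = '%s'",
			desired.fullyQualified(), *newComment))
	}
	return stmts
}

func main() {
	prior := objectID{"DB", "SCH", "OLD_NAME"}
	desired := objectID{"DB", "SCH", "NEW_NAME"}
	comment := "updated"
	for _, s := range planUpdateStatements(prior, desired, &comment) {
		fmt.Println(s)
	}
}
```

Note that the rename runs first, so later ALTER actions always target the new name; that ordering is the part acceptance tests for rename should pin down.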

### Add data source

While not strictly required to "support" the object, a data source improves discoverability and enables read-only use cases. For parity with other objects, we recommend adding one.

Example patterns validated by the data source:
- Filtering aligned to SHOW
  - Support `like`, `starts_with`, and `limit { rows, from }` to mirror SHOW filters; include `with_describe` to optionally call DESCRIBE for each item. Keep `with_describe` enabled by default, but allow turning it off to reduce calls in large accounts.
- Output shape
  - Aggregate into a single `<object_name_plural>` collection with nested `show_output` (SHOW) and `describe_output` (DESCRIBE) blocks containing the fields as strings/numbers.
- Documentation and examples
  - Provide simple, filter, and pagination examples; include a note about the default behavior of `with_describe`.
- Provider preview gate and migration guide
  - Add the "Added data source" H4 subsection under the same feature entry in the Migration Guide and link Snowflake's SHOW docs where appropriate.
- Use `make generate-all-assertions-and-config-models` to generate the config model.
- Use `make docs` to generate documentation based on the `.md.tmpl` file (which is the file you should edit instead of the `.md` file).

Take a look at the [Data source implementation for notebooks](https://github.com/snowflakedb/terraform-provider-snowflake/pull/4209) and its follow-up with extra tests, [Extended test coverage for notebooks](https://github.com/snowflakedb/terraform-provider-snowflake/pull/4237).
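The SHOW-aligned filters above map directly onto Snowflake's SHOW syntax (`LIKE '<pattern>'`, `STARTS WITH '<prefix>'`, `LIMIT <rows> FROM '<name>'`). A rough sketch of the clause assembly, with hypothetical option types (the real builders live in the generated SDK code):

```go
package main

import (
	"fmt"
	"strings"
)

// showFilters mirrors the data source fields like, starts_with, and
// limit { rows, from }; the struct itself is illustrative.
type showFilters struct {
	Like       *string
	StartsWith *string
	LimitRows  *int
	LimitFrom  *string
}

// buildShow assembles a SHOW statement from the optional filters,
// emitting each clause only when the corresponding field is set.
func buildShow(object string, f showFilters) string {
	parts := []string{"SHOW", object}
	if f.Like != nil {
		parts = append(parts, fmt.Sprintf("LIKE '%s'", *f.Like))
	}
	if f.StartsWith != nil {
		parts = append(parts, fmt.Sprintf("STARTS WITH '%s'", *f.StartsWith))
	}
	if f.LimitRows != nil {
		limit := fmt.Sprintf("LIMIT %d", *f.LimitRows)
		if f.LimitFrom != nil {
			limit += fmt.Sprintf(" FROM '%s'", *f.LimitFrom)
		}
		parts = append(parts, limit)
	}
	return strings.Join(parts, " ")
}

func main() {
	like, rows := "my%", 10
	// Prints: SHOW NOTEBOOKS LIKE 'my%' LIMIT 10
	fmt.Println(buildShow("NOTEBOOKS", showFilters{Like: &like, LimitRows: &rows}))
}
```

Keeping the data source filters one-to-one with SHOW clauses is what lets the filtering stay server-side rather than being re-implemented in the provider.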

## Advanced Debugging

If you want to build and test the provider locally (manually, not through acceptance tests), build the binary first using `make build-local` or install to the proper local directory by invoking `make install-tf` (to uninstall run `make uninstall-tf`).

FAQ.md

Lines changed: 2 additions & 2 deletions
@@ -81,7 +81,7 @@ Please refer to [this document](https://github.com/snowflakedb/terraform-provide
 - For a new way of referencing object identifiers in resources, take a look at the ["New computed fully qualified name field in resources"](https://github.com/snowflakedb/terraform-provider-snowflake/blob/main/docs/guides/identifiers_rework_design_decisions.md#new-computed-fully-qualified-name-field-in-resources) section.
 ### Is this provider compatible with OpenTofu?
-OpenTofu is not currently supported. While it's mostly compatible with Terraform,
-OpenTofu's [reactive implementation of new Terraform features based on community demand](https://opentofu.org/faq/#opentofu-compatibility) may decrease compatibility over time.
+Although the provider is present in the OpenTofu Registry ([see](https://github.com/snowflakedb/terraform-provider-snowflake/issues/3874)), it is not currently supported.
+While it's mostly compatible with Terraform, OpenTofu's [reactive implementation of new Terraform features based on community demand](https://opentofu.org/faq/#opentofu-compatibility) may decrease compatibility over time.
 We plan to research OpenTofu support in the future, but there's no timeline yet (once planned, it will appear in the [roadmap](https://github.com/snowflakedb/terraform-provider-snowflake/blob/main/ROADMAP.md)).
 For now, you must research and assess the risk of provider incompatibility.

MIGRATION_GUIDE.md

Lines changed: 26 additions & 3 deletions
@@ -24,7 +24,7 @@ for changes required after enabling given [Snowflake BCR Bundle](https://docs.sn
 > [!TIP]
 > If you're still using the `Snowflake-Labs/snowflake` source, see [Upgrading from Snowflake-Labs Provider](./SNOWFLAKEDB_MIGRATION.md) to upgrade to the snowflakedb namespace.
-## v2.11.0 ➞ v2.12.0
+## v2.11.x ➞ v2.12.0
 ### *(new feature)* snowflake_listings datasource
 Added a new preview data source for listings. See reference [docs](https://docs.snowflake.com/en/sql-reference/sql/show-listings).
@@ -33,12 +33,35 @@ This data source focuses on base query commands (SHOW LISTINGS and DESCRIBE LIST
 This feature will be marked as a stable feature in future releases. Breaking changes are expected, even without bumping the major version. To use this feature, add `snowflake_listings_datasource` to the `preview_features_enabled` field in the provider configuration.
+### *(improvement)* snowflake_scim_integration now accepts custom role names for run_as_role
+Previously, the `run_as_role` field in the [snowflake_scim_integration](https://registry.terraform.io/providers/snowflakedb/snowflake/2.11.0/docs/resources/scim_integration) resource only accepted the predefined role names `OKTA_PROVISIONER`, `AAD_PROVISIONER`, or `GENERIC_SCIM_PROVISIONER`.
+Now the field accepts any custom role name, allowing you to use organization-specific roles for SCIM provisioning. The field is now case-sensitive, with one exception: if you set `okta_provisioner`, `aad_provisioner`, or `generic_scim_provisioner`, the provider uppercases the value (as it did before) to maintain compatibility and avoid breaking changes. This will change in v3 of the provider (the field will behave like all other identifier fields).
+- If you use the `OKTA_PROVISIONER`, `AAD_PROVISIONER`, or `GENERIC_SCIM_PROVISIONER` roles in Snowflake, please make them uppercase in your configuration (this will result in an empty plan).
+- If you use `okta_provisioner`, `aad_provisioner`, or `generic_scim_provisioner` roles in Snowflake, please manage them in the provider with `snowflake_execute`, or rename the roles entirely.
+Existing configurations using the predefined roles will continue to work without modifications, but please bear in mind the potential case-sensitivity changes in v3.
+References: [#3917](https://github.com/snowflakedb/terraform-provider-snowflake/issues/3917).
+### *(new feature)* Added serverless task parameters
+Added support for new serverless task fields:
+- `target_completion_interval` - Specifies the target completion interval for serverless tasks; also added as a computed value to `show_output`.
+- `serverless_task_min_statement_size` (parameter) - The minimum statement size for serverless tasks; also added as a computed value to `parameters`.
+- `serverless_task_max_statement_size` (parameter) - The maximum statement size for serverless tasks; also added as a computed value to `parameters`.
+These fields are available in the `snowflake_task` resource for serverless task configurations.
+No changes in configuration are required for existing tasks. You can optionally update your configurations to use these new parameters.
 ### *(improvement)* New fields in user resources and data sources output fields
-We adjusted the `show_output` by adding the missing `has_workload_identity ` field. This concerns `user`, `service_user`, and `legacy_service_user` resources and `users` data source.
+We adjusted the `show_output` by adding the missing `has_workload_identity` field. This concerns the `user`, `service_user`, and `legacy_service_user` resources and the `users` data source.
 ## v2.10.x ➞ v2.11.0
-### *(new feature)* snowflake_notebook
+### *(new feature)* Notebooks preview feature
 #### Added resource
 Added a new preview resource for managing notebooks. See reference [docs](https://docs.snowflake.com/en/sql-reference/sql/create-notebook).
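The `run_as_role` backward-compatibility rule documented in the migration guide above (uppercasing only the three predefined lowercase values, passing everything else through case-sensitively) can be sketched as follows. This is an illustration of the documented rule, not the provider's actual code:

```go
package main

import (
	"fmt"
	"strings"
)

// normalizeRunAsRole applies the documented compatibility rule: the three
// predefined roles written in lowercase are uppercased (preserving the old
// behavior); every other value, including custom role names, is passed
// through unchanged and is therefore case-sensitive.
func normalizeRunAsRole(role string) string {
	switch role {
	case "okta_provisioner", "aad_provisioner", "generic_scim_provisioner":
		return strings.ToUpper(role)
	}
	return role
}

func main() {
	fmt.Println(normalizeRunAsRole("okta_provisioner")) // OKTA_PROVISIONER
	fmt.Println(normalizeRunAsRole("My_Custom_Role"))   // unchanged
}
```

Under this rule, only the exact lowercase spellings of the predefined roles are rewritten, which is why the guide recommends renaming or managing such Snowflake roles via `snowflake_execute`.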

docs/data-sources/tasks.md

Lines changed: 37 additions & 0 deletions
@@ -224,6 +224,8 @@ Read-Only:
 - `rows_per_resultset` (List of Object) (see [below for nested schema](#nestedobjatt--tasks--parameters--rows_per_resultset))
 - `s3_stage_vpce_dns_name` (List of Object) (see [below for nested schema](#nestedobjatt--tasks--parameters--s3_stage_vpce_dns_name))
 - `search_path` (List of Object) (see [below for nested schema](#nestedobjatt--tasks--parameters--search_path))
+- `serverless_task_max_statement_size` (List of Object) (see [below for nested schema](#nestedobjatt--tasks--parameters--serverless_task_max_statement_size))
+- `serverless_task_min_statement_size` (List of Object) (see [below for nested schema](#nestedobjatt--tasks--parameters--serverless_task_min_statement_size))
 - `statement_queued_timeout_in_seconds` (List of Object) (see [below for nested schema](#nestedobjatt--tasks--parameters--statement_queued_timeout_in_seconds))
 - `statement_timeout_in_seconds` (List of Object) (see [below for nested schema](#nestedobjatt--tasks--parameters--statement_timeout_in_seconds))
 - `strict_json_output` (List of Object) (see [below for nested schema](#nestedobjatt--tasks--parameters--strict_json_output))

@@ -635,6 +637,30 @@ Read-Only:
 - `value` (String)

+<a id="nestedobjatt--tasks--parameters--serverless_task_max_statement_size"></a>
+### Nested Schema for `tasks.parameters.serverless_task_max_statement_size`
+
+Read-Only:
+
+- `default` (String)
+- `description` (String)
+- `key` (String)
+- `level` (String)
+- `value` (String)
+
+<a id="nestedobjatt--tasks--parameters--serverless_task_min_statement_size"></a>
+### Nested Schema for `tasks.parameters.serverless_task_min_statement_size`
+
+Read-Only:
+
+- `default` (String)
+- `description` (String)
+- `key` (String)
+- `level` (String)
+- `value` (String)
+
 <a id="nestedobjatt--tasks--parameters--statement_queued_timeout_in_seconds"></a>
 ### Nested Schema for `tasks.parameters.statement_queued_timeout_in_seconds`

@@ -973,9 +999,20 @@ Read-Only:
 - `schedule` (String)
 - `schema_name` (String)
 - `state` (String)
+- `target_completion_interval` (List of Object) (see [below for nested schema](#nestedobjatt--tasks--show_output--target_completion_interval))
 - `task_relations` (List of Object) (see [below for nested schema](#nestedobjatt--tasks--show_output--task_relations))
 - `warehouse` (String)

+<a id="nestedobjatt--tasks--show_output--target_completion_interval"></a>
+### Nested Schema for `tasks.show_output.target_completion_interval`
+
+Read-Only:
+
+- `hours` (Number)
+- `minutes` (Number)
+- `seconds` (Number)
+
 <a id="nestedobjatt--tasks--show_output--task_relations"></a>
 ### Nested Schema for `tasks.show_output.task_relations`

docs/resources/scim_integration.md

Lines changed: 1 addition & 1 deletion
@@ -46,7 +46,7 @@ resource "snowflake_scim_integration" "test" {
 - `enabled` (Boolean) Specify whether the security integration is enabled.
 - `name` (String) String that specifies the identifier (i.e. name) for the integration; must be unique in your account. Due to technical limitations (read more [here](../guides/identifiers_rework_design_decisions#known-limitations-and-identifier-recommendations)), avoid using the following characters: `|`, `.`, `"`.
-- `run_as_role` (String) Specify the SCIM role in Snowflake that owns any users and roles that are imported from the identity provider into Snowflake using SCIM. Provider assumes that the specified role is already provided. Valid options are: `OKTA_PROVISIONER` | `AAD_PROVISIONER` | `GENERIC_SCIM_PROVISIONER`.
+- `run_as_role` (String) Specify the SCIM role in Snowflake that owns any users and roles that are imported from the identity provider into Snowflake using SCIM. The provider assumes that the specified role already exists. This field is case-sensitive. The exception is using `generic_scim_provisioner`, `okta_provisioner`, or `aad_provisioner`, which are automatically converted to uppercase for backwards compatibility.
 - `scim_client` (String) Specifies the client type for the scim integration. Valid options are: `OKTA` | `AZURE` | `GENERIC`.

 ### Optional

0 commit comments
