pkg/scripts/migration_script/README.md (44 additions, 4 deletions)
@@ -108,6 +108,26 @@ where script options are:
 Supported resources:
 - snowflake_database_role
 
+- `users`, which expects a converted CSV output from the snowflake_users data source.
+  To support object parameters, one should use the SHOW PARAMETERS output and combine it with the SHOW USERS output, so that the CSV header looks like `"comment","created_on",...,"abort_detached_query_value","abort_detached_query_level","timezone_value","timezone_level",...`
+  When the additional columns are present, the resulting resource will have the parameter values, if the parameter level is set to "USER".
+
+  Caution: the password parameter is not supported, as it is returned in the form of `"***"` from the data source.
+
+  Note: Newlines are allowed only in the `comment`, `rsa_public_key`, and `rsa_public_key2` fields; elsewhere they might cause errors and require manual corrections.
+
+  For more details about using multiple sources, visit the [Multiple sources section](#multiple-sources).
+
+  Different user types are mapped to their respective Terraform resources based on the `type` attribute:
+  - `PERSON` (or empty) → `snowflake_user` - A human user who can interact with Snowflake
+  - `SERVICE` → `snowflake_service_user` - A service or application user without human interaction (cannot use password/SAML authentication; cannot have `first_name`, `last_name`, or `must_change_password`)
+  - `LEGACY_SERVICE` → `snowflake_legacy_service_user` - Similar to `SERVICE`, but allows password and SAML authentication (cannot have `first_name` or `last_name`)
+
+  Supported resources:
+  - snowflake_user
+  - snowflake_service_user
+  - snowflake_legacy_service_user
+
 - **INPUT**:
   Migration script operates on STDIN input in CSV format. You can redirect the input from a file or pipe it from another command.
 - **OUTPUT**:
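The new `users` bullet above assumes a CSV produced from the `snowflake_users` data source, with the SHOW PARAMETERS columns appended as `<name>_value`/`<name>_level` pairs. As a hedged illustration only (not part of this diff), and assuming the data source follows the same `show_output`/`parameters` structure used in the schemas example later in this README, the flattening step could look roughly like this; verify the attribute names against your provider version:

```hcl
data "snowflake_users" "all" {}

locals {
  # Assumed shape: each user exposes show_output (SHOW USERS columns) and
  # parameters (SHOW PARAMETERS IN USER ... output). Merging them yields one
  # flat map per user whose keys match the expected CSV header, e.g.
  # "comment", "created_on", ..., "abort_detached_query_value", "abort_detached_query_level", ...
  users_flattened = [
    for user in data.snowflake_users.all.users : merge(
      user.show_output[0],
      { for name, values in user.parameters[0] : "${name}_value" => values[0].value },
      { for name, values in user.parameters[0] : "${name}_level" => values[0].level }
    )
  ]
}
```

The flattened maps would then be written out as a CSV file (for example via the `hashicorp/local` provider used in the schemas example) and piped into the migration script via STDIN.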
@@ -586,7 +606,7 @@ As an example, let's import all schemas in a given database. First, we need to d
 terraform {
   required_providers {
     snowflake = {
-      source = "Snowflake-Labs/snowflake"
+      source = "snowflakedb/snowflake"
     }
     local = {
       source = "hashicorp/local"
@@ -601,14 +621,16 @@ data "snowflake_schemas" "test" {
 }
 
 locals {
-  # Transform each schema by merging show_output and flattened parameters
+  # Transform each schema by merging show_output, describe_output, and flattened parameters
   schemas_flattened = [
     for schema in data.snowflake_schemas.test.schemas : merge(
       schema.show_output[0],
+      # Include describe output fields (if describe_output is present)
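The hunk is cut off right after the newly added comment, so the remainder of the merge call is not visible in this view. Purely as a hedged sketch of how such a `locals` block could continue (the `describe_output` and `parameters` attribute shapes are assumptions, not text from this diff):

```hcl
locals {
  # Transform each schema by merging show_output, describe_output, and flattened parameters
  schemas_flattened = [
    for schema in data.snowflake_schemas.test.schemas : merge(
      schema.show_output[0],
      # Include describe output fields (if describe_output is present);
      # try() falls back to an empty map when the list is empty.
      try(schema.describe_output[0], {}),
      # Flatten parameters into "<name>_value" / "<name>_level" keys,
      # mirroring the column naming described for the users CSV above.
      { for name, values in schema.parameters[0] : "${name}_value" => values[0].value },
      { for name, values in schema.parameters[0] : "${name}_level" => values[0].level }
    )
  ]
}
```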