
Commit 3775133: changed test structure
Parent: a0c954f

File tree: 5 files changed, +149 −184 lines

pkg/manual_tests/migration_script/README.md

Lines changed: 28 additions & 147 deletions

````diff
@@ -2,15 +2,6 @@
 
 This directory contains end-to-end tests for the migration script. Each object type has its own folder with test configuration.
 
-## Test Approach
-
-The test validates the migration script by:
-1. Creating test objects on Snowflake
-2. Fetching objects via data source and generating a CSV
-3. Running the migration script to generate Terraform code with import blocks
-4. Applying the generated code (which imports existing objects)
-5. **Verifying the plan is empty** (no changes needed = successful import)
-
 ## Quick Start
 
 ### Step 1: Navigate to object type folder
@@ -22,183 +13,73 @@ cd grants  # or schemas, warehouses, users, etc.
 ### Step 2: Clean up any previous state
 
 ```bash
-rm -rf .terraform terraform.tfstate terraform.tfstate.backup generated_output.tf
+rm -rf .terraform terraform.tfstate terraform.tfstate.backup import/.terraform import/terraform.tfstate import/terraform.tfstate.backup import/main.tf
 ```
 
-### Step 3: Initialize Terraform
+### Step 3: Initialize and create test objects
 
 ```bash
 terraform init
-```
-
-### Step 4: Create test objects AND generate CSV
-
-```bash
 terraform apply -auto-approve
 ```
 
 This creates objects from `objects_def.tf` and generates `objects.csv` via `datasource.tf`.
 
-### Step 5: Run migration script
+### Step 4: Run migration script (output to import directory)
 
 ```bash
-go run github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/scripts/migration_script@main -import=block grants < objects.csv > generated_output.tf
+go run github.com/Snowflake-Labs/terraform-provider-snowflake/pkg/scripts/migration_script@dev \
+  -import=block grants < objects.csv > import/main.tf
 ```
 
-### Step 6: Apply generated output (imports existing objects)
+### Step 5: Import in the separate directory
 
 ```bash
-terraform apply -auto-approve
+cd import
+terraform init
+terraform apply
 ```
 
-This imports the existing Snowflake objects into Terraform state.
+This imports the existing Snowflake objects into a fresh state.
 
-### Step 7: Verify plan is empty
+### Step 6: Verify plan is empty
 
 ```bash
 terraform plan
 ```
 
-**Success criteria:** The plan should show "No changes. Your infrastructure matches the configuration."
-
-If there are changes, the migration script generated incorrect values.
-
-### Step 8: Cleanup (when done)
+### Step 7: Cleanup
 
 ```bash
-terraform destroy -auto-approve
-```
-
-## Directory Structure
-
-```
-manual_tests/
-├── README.md               # This file
-├── users/                  # Users test
-│   ├── objects_def.tf      # Creates test users on Snowflake
-│   ├── datasource.tf       # Fetches users, generates CSV
-│   ├── objects.csv         # Generated CSV (after terraform apply)
-│   └── generated_output.tf # Generated by migration script
-├── schemas/
-├── warehouses/
-├── grants/
-└── <new_object_type>/      # Add new object types here
+cd ..
+terraform destroy
 ```
 
-## Adding a New Object Type
+## Test Assertions
 
-To add tests for a new object type (e.g., `warehouses`):
+The `datasource.tf` includes a **precondition assertion** that fails if no grants are found. This prevents generating an empty CSV that would cause silent failures.
 
-### Step 1: Create the folder
-
-```bash
-mkdir -p warehouses
-```
-
-### Step 2: Create `objects_def.tf`
-
-Create test objects with various configurations:
+### How it works
 
 ```hcl
-terraform {
-  required_providers {
-    snowflake = {
-      source = "snowflakedb/snowflake"
+resource "local_file" "grants_csv" {
+  # ...
+  lifecycle {
+    precondition {
+      condition     = length(local.grants_csv_rows_unique) > 0
+      error_message = "TEST ASSERTION FAILED: No grants found. Make sure objects_def.tf resources were created first."
     }
   }
 }
-
-provider "snowflake" {}
-
-# Basic warehouse
-resource "snowflake_warehouse" "basic" {
-  name = "MIGRATION_TEST_WH_BASIC"
-}
-
-# Warehouse with all parameters
-resource "snowflake_warehouse" "complete" {
-  name           = "MIGRATION_TEST_WH_COMPLETE"
-  comment        = "Test warehouse for migration"
-  warehouse_size = "XSMALL"
-  # ... more parameters
-}
 ```
 
-**Important naming convention:** Use `MIGRATION_TEST_` prefix for all test objects.
+### Testing the assertion
 
-### Step 3: Create `datasource.tf`
-
-Fetch the objects and generate CSV:
-
-```hcl
-# Fetch test warehouses
-data "snowflake_warehouses" "test_warehouses" {
-  like = "MIGRATION_TEST_WH_%"
-}
-
-locals {
-  # Flatten the data source output
-  warehouses_flattened = [
-    for wh in data.snowflake_warehouses.test_warehouses.warehouses :
-    wh.show_output[0]
-  ]
-
-  # Create CSV header
-  csv_header = length(local.warehouses_flattened) > 0 ? join(",", [
-    for key in keys(local.warehouses_flattened[0]) : "\"${key}\""
-  ]) : ""
-
-  # CSV escape function
-  csv_escape = length(local.warehouses_flattened) > 0 ? {
-    for wh in local.warehouses_flattened :
-    wh.name => {
-      for key in keys(local.warehouses_flattened[0]) :
-      key => replace(
-        replace(
-          replace(tostring(lookup(wh, key, "")), "\\", "\\\\"),
-          "\n", "\\n"
-        ),
-        "\"", "\"\""
-      )
-    }
-  } : {}
-
-  # Create CSV rows
-  csv_rows = length(local.warehouses_flattened) > 0 ? [
-    for wh in local.warehouses_flattened :
-    join(",", [
-      for key in keys(local.warehouses_flattened[0]) :
-      "\"${local.csv_escape[wh.name][key]}\""
-    ])
-  ] : []
-
-  csv_content = join("\n", concat([local.csv_header], local.csv_rows))
-}
-
-# Write CSV file
-resource "local_file" "csv" {
-  content  = local.csv_content
-  filename = "${path.module}/objects.csv"
-}
-
-# Debug outputs
-output "objects_found" {
-  value = length(local.warehouses_flattened)
-}
-```
-
-### Step 4: Test it
+To verify the assertion works correctly, use the `test_assertion/` subdirectory:
 
 ```bash
-cd warehouses
-rm -rf .terraform terraform.tfstate* generated_output.tf
+cd test_assertion
 terraform init
-terraform apply -auto-approve
-
-cd ..
-go run . -import=block warehouses < manual_tests/warehouses/objects.csv > manual_tests/warehouses/generated_output.tf
-
-cd manual_tests/warehouses
-terraform apply -auto-approve
-terraform plan  # Should show no changes!
+terraform apply
+# Expected error: "TEST ASSERTION FAILED: No grants found..."
 ```
````

pkg/manual_tests/migration_script/grants/datasource.tf

Lines changed: 20 additions & 24 deletions

```diff
@@ -59,7 +59,7 @@ data "snowflake_grants" "of_child_role" {
 # Fetch all grants TO the privilege database role
 data "snowflake_grants" "to_priv_db_role" {
   grants_to {
-    database_role = "\"${snowflake_database.test_db.name}\".\"${snowflake_database_role.priv_db_role.name}\""
+    database_role = snowflake_database_role.priv_db_role.fully_qualified_name
   }
 
   depends_on = [
@@ -73,7 +73,7 @@ data "snowflake_grants" "to_priv_db_role" {
 # Fetch all grants TO the parent database role
 data "snowflake_grants" "to_parent_db_role" {
   grants_to {
-    database_role = "\"${snowflake_database.test_db.name}\".\"${snowflake_database_role.parent_db_role.name}\""
+    database_role = snowflake_database_role.parent_db_role.fully_qualified_name
   }
 
   depends_on = [
@@ -88,7 +88,7 @@ data "snowflake_grants" "to_parent_db_role" {
 # Fetch grants OF the child database role
 data "snowflake_grants" "of_child_db_role" {
   grants_of {
-    database_role = "\"${snowflake_database.test_db.name}\".\"${snowflake_database_role.child_db_role.name}\""
+    database_role = snowflake_database_role.child_db_role.fully_qualified_name
   }
 
   depends_on = [
@@ -99,7 +99,7 @@ data "snowflake_grants" "of_child_db_role" {
 # Fetch grants OF the priv database role (granted to account role)
 data "snowflake_grants" "of_priv_db_role" {
   grants_of {
-    database_role = "\"${snowflake_database.test_db.name}\".\"${snowflake_database_role.priv_db_role.name}\""
+    database_role = snowflake_database_role.priv_db_role.fully_qualified_name
   }
 
   depends_on = [
@@ -119,42 +119,38 @@ locals {
     [for g in data.snowflake_grants.of_priv_db_role.grants : g]
   )
 
-  # Filter to only privilege grants:
-  # - Must contain MIGRATION_TEST in grantee_name or name
-  # - Must have a non-empty granted_by (implicit grants from Snowflake have empty granted_by
-  #   and cannot be managed by Terraform)
-  # - Must have a non-empty privilege (grants_of returns role membership info with empty privilege,
-  #   these are not privilege grants and the migration script can't handle them)
+  # Filter grants:
+  # - Must contain MIGRATION_TEST in grantee_name or name (test objects only)
+  # - Must have a non-empty privilege (grants_of returns role membership rows with
+  #   empty privilege - the migration script can't handle these)
   test_grants = [
     for g in local.all_grants : g
     if (can(regex("MIGRATION_TEST", g.grantee_name)) || can(regex("MIGRATION_TEST", g.name))) &&
-       g.granted_by != "" &&
        g.privilege != ""
   ]
 
+  # CSV column definitions - order matters for output
+  csv_columns = ["privilege", "granted_on", "grant_on", "name", "granted_to", "grant_to", "grantee_name", "grant_option", "granted_by"]
+
   # CSV header - matches the GrantCsvRow struct
-  grants_csv_header = "\"privilege\",\"granted_on\",\"grant_on\",\"name\",\"granted_to\",\"grant_to\",\"grantee_name\",\"grant_option\",\"granted_by\""
+  grants_csv_header = join(",", [for col in local.csv_columns : "\"${col}\""])
 
-  # CSV escape function
-  csv_escape_grant = {
+  # CSV escape helper - escapes special chars for CSV format
+  csv_escape = {
     for idx, grant in local.test_grants :
     idx => {
-      privilege    = replace(replace(replace(tostring(grant.privilege), "\\", "\\\\"), "\n", "\\n"), "\"", "\"\"")
-      granted_on   = replace(replace(replace(tostring(grant.granted_on), "\\", "\\\\"), "\n", "\\n"), "\"", "\"\"")
-      grant_on     = ""
-      name         = replace(replace(replace(tostring(grant.name), "\\", "\\\\"), "\n", "\\n"), "\"", "\"\"")
-      granted_to   = replace(replace(replace(tostring(grant.granted_to), "\\", "\\\\"), "\n", "\\n"), "\"", "\"\"")
-      grant_to     = ""
-      grantee_name = replace(replace(replace(tostring(grant.grantee_name), "\\", "\\\\"), "\n", "\\n"), "\"", "\"\"")
-      grant_option = tostring(grant.grant_option)
-      granted_by   = replace(replace(replace(tostring(grant.granted_by), "\\", "\\\\"), "\n", "\\n"), "\"", "\"\"")
+      for col in local.csv_columns :
+      col => col == "grant_on" || col == "grant_to" ? "" : (
+        col == "grant_option" ? tostring(grant[col]) :
+        replace(replace(replace(tostring(grant[col]), "\\", "\\\\"), "\n", "\\n"), "\"", "\"\"")
+      )
     }
  }
 
   # Convert each grant to CSV row
   grants_csv_rows = [
     for idx, grant in local.test_grants :
-    "\"${local.csv_escape_grant[idx].privilege}\",\"${local.csv_escape_grant[idx].granted_on}\",\"${local.csv_escape_grant[idx].grant_on}\",\"${local.csv_escape_grant[idx].name}\",\"${local.csv_escape_grant[idx].granted_to}\",\"${local.csv_escape_grant[idx].grant_to}\",\"${local.csv_escape_grant[idx].grantee_name}\",\"${local.csv_escape_grant[idx].grant_option}\",\"${local.csv_escape_grant[idx].granted_by}\""
+    join(",", [for col in local.csv_columns : "\"${local.csv_escape[idx][col]}\""])
   ]
 
   # Remove duplicates by converting to set and back
```
Lines changed: 15 additions & 0 deletions

```diff
@@ -0,0 +1,15 @@
+# Provider configuration for import directory
+# This file is kept separate from generated main.tf
+
+terraform {
+  required_providers {
+    snowflake = {
+      source = "snowflakedb/snowflake"
+    }
+  }
+}
+
+provider "snowflake" {
+  # Uses default configuration from ~/.snowflake/config or environment variables
+}
+
```
