Commit 0818101

new scenarios e2e bq review comments addressed
1 parent 47f6740 commit 0818101

9 files changed: +171 −599 lines changed

src/e2e-test/features/bigquery/sink/BigQueryToBigQuerySink.feature

Lines changed: 118 additions & 11 deletions
@@ -249,7 +249,7 @@ Feature: BigQuery sink - Verification of BigQuery to BigQuery successful data tr
 Then Validate the data transferred from BigQuery to BigQuery with actual And expected file for: "bqUpsertExpectedFile"

 @BQ_NULL_MODE_SOURCE_TEST @BQ_SINK_TEST @EXISTING_BQ_CONNECTION
-Scenario: Validate Successful record transfer from BigQuery source plugin to BigQuery sink plugin with all null values in one column and few null values in different column.
+Scenario: Validate Successful record transfer from BigQuery source plugin to BigQuery sink plugin having all null values in one column and few null values in another column in Source table
 Given Open Datafusion Project to configure pipeline
 When Expand Plugin group in the LHS plugins list: "Source"
 When Select plugin: "BigQuery" from the plugins list as: "Source"
@@ -297,7 +297,7 @@ Feature: BigQuery sink - Verification of BigQuery to BigQuery successful data tr
 Then Validate the values of records transferred to BQ sink is equal to the values from source BigQuery table

 @BQ_UPDATE_SOURCE_DEDUPE_TEST @BQ_UPDATE_SINK_DEDUPE_TEST @EXISTING_BQ_CONNECTION
-Scenario: Verify successful record transfer from BigQuery source to BigQuery sink using advance operation update with dedupe property.
+Scenario: Verify successful record transfer from BigQuery source to BigQuery sink using advance operation update with Dedupe By Property.
 Given Open Datafusion Project to configure pipeline
 When Expand Plugin group in the LHS plugins list: "Source"
 When Select plugin: "BigQuery" from the plugins list as: "Source"
@@ -328,9 +328,9 @@ Feature: BigQuery sink - Verification of BigQuery to BigQuery successful data tr
 Then Verify input plugin property: "dataset" contains value: "dataset"
 Then Enter input plugin property: "table" with value: "bqTargetTable"
 And Select radio button plugin property: "operation" with value: "update"
-Then Enter Value for plugin property table key : "relationTableKey" with values: "Name"
+Then Enter Value for plugin property table key : "relationTableKey" with values: "relationTableKeyValue"
 Then Select dropdown plugin property: "dedupeBy" with option value: "dedupeByOrder"
-Then Enter key for plugin property: "dedupeBy" with values: "ID"
+Then Enter key for plugin property: "dedupeBy" with values: "dedupeByValue"
 Then Validate "BigQuery" plugin properties
 Then Close the BigQuery properties
 Then Save the pipeline
@@ -346,7 +346,7 @@ Feature: BigQuery sink - Verification of BigQuery to BigQuery successful data tr
 Then Open and capture logs
 Then Close the pipeline logs
 Then Verify the pipeline status is "Succeeded"
-Then Validate the data transferred from BigQuery to BigQuery with actual And expected file for: "bqUpdatededupeExpectedFile"
+Then Validate the data transferred from BigQuery to BigQuery with actual And expected file for: "bqUpdateDedupeExpectedFile"

 @BQ_INSERT_INT_SOURCE_TEST @BQ_EXISTING_SINK_TEST @EXISTING_BQ_CONNECTION
 Scenario: Verify successful record transfer for the Insert operation with partition type Integer and destination table is existing already.
@@ -401,8 +401,8 @@ Feature: BigQuery sink - Verification of BigQuery to BigQuery successful data tr
 Then Verify the pipeline status is "Succeeded"
 Then Validate the data transferred from BigQuery to BigQuery with actual And expected file for: "bqInsertExpectedFile"

-@BQ_TIME_STAMP_SOURCE_TEST @BQ_SINK_TEST @EXISTING_BQ_CONNECTION
-Scenario: Verify successful record transfer for the Insert operation from BigQuery source plugin to BigQuery sink with partition type Time.
+@BQ_TIME_SOURCE_TEST @BQ_SINK_TEST @EXISTING_BQ_CONNECTION
+Scenario: Verify successful record transfer for the Insert operation from BigQuery source plugin to BigQuery sink with partition type Time and partition field is date.
 Given Open Datafusion Project to configure pipeline
 When Expand Plugin group in the LHS plugins list: "Source"
 When Select plugin: "BigQuery" from the plugins list as: "Source"
@@ -449,10 +449,110 @@ Feature: BigQuery sink - Verification of BigQuery to BigQuery successful data tr
 Then Open and capture logs
 Then Close the pipeline logs
 Then Verify the pipeline status is "Succeeded"
+Then Validate the data transferred from BigQuery to BigQuery with actual And expected file for: "bqDateExpectedFile"
+
+@BQ_TIME_SOURCE_TEST @BQ_SINK_TEST @EXISTING_BQ_CONNECTION
+Scenario: Verify successful record transfer for the Insert operation from BigQuery source plugin to BigQuery sink with partition type Time and partition field is datetime.
+Given Open Datafusion Project to configure pipeline
+When Expand Plugin group in the LHS plugins list: "Source"
+When Select plugin: "BigQuery" from the plugins list as: "Source"
+When Expand Plugin group in the LHS plugins list: "Sink"
+When Select plugin: "BigQuery" from the plugins list as: "Sink"
+Then Connect plugins: "BigQuery" and "BigQuery2" to establish connection
+Then Navigate to the properties page of plugin: "BigQuery"
+Then Click plugin property: "switch-useConnection"
+Then Click on the Browse Connections button
+Then Select connection: "bqConnectionName"
+Then Click on the Browse button inside plugin properties
+Then Select connection data row with name: "dataset"
+Then Select connection data row with name: "bqSourceTable"
+Then Wait till connection data loading completes with a timeout of 60 seconds
+Then Verify input plugin property: "dataset" contains value: "dataset"
+Then Verify input plugin property: "table" contains value: "bqSourceTable"
+Then Click on the Get Schema button
+Then Validate "BigQuery" plugin properties
+And Close the Plugin Properties page
+Then Navigate to the properties page of plugin: "BigQuery2"
+Then Click plugin property: "useConnection"
+Then Click on the Browse Connections button
+Then Select connection: "bqConnectionName"
+Then Enter input plugin property: "referenceName" with value: "BQSinkReferenceName"
+Then Click on the Browse button inside plugin properties
+Then Click SELECT button inside connection data row with name: "dataset"
+Then Wait till connection data loading completes with a timeout of 60 seconds
+Then Verify input plugin property: "dataset" contains value: "dataset"
+Then Enter input plugin property: "table" with value: "bqTargetTable"
+Then Enter input plugin property: "partitionByField" with value: "bqPartitionFieldDateTime"
+Then Click plugin property: "updateTableSchema"
+Then Validate "BigQuery" plugin properties
+Then Close the BigQuery properties
+Then Save the pipeline
+Then Preview and run the pipeline
+Then Wait till pipeline preview is in running state
+Then Open and capture pipeline preview logs
+Then Verify the preview run status of pipeline in the logs is "succeeded"
+Then Close the pipeline logs
+Then Close the preview
+Then Deploy the pipeline
+Then Run the Pipeline in Runtime
+Then Wait till pipeline is in running state
+Then Open and capture logs
+Then Close the pipeline logs
+Then Verify the pipeline status is "Succeeded"
+Then Validate the data transferred from BigQuery to BigQuery with actual And expected file for: "bqDateTimeExpectedFile"
+
+@BQ_TIME_SOURCE_TEST @BQ_SINK_TEST @EXISTING_BQ_CONNECTION
+Scenario: Verify successful record transfer for the Insert operation from BigQuery source plugin to BigQuery sink with partition type Time and partition field is timestamp.
+Given Open Datafusion Project to configure pipeline
+When Expand Plugin group in the LHS plugins list: "Source"
+When Select plugin: "BigQuery" from the plugins list as: "Source"
+When Expand Plugin group in the LHS plugins list: "Sink"
+When Select plugin: "BigQuery" from the plugins list as: "Sink"
+Then Connect plugins: "BigQuery" and "BigQuery2" to establish connection
+Then Navigate to the properties page of plugin: "BigQuery"
+Then Click plugin property: "switch-useConnection"
+Then Click on the Browse Connections button
+Then Select connection: "bqConnectionName"
+Then Click on the Browse button inside plugin properties
+Then Select connection data row with name: "dataset"
+Then Select connection data row with name: "bqSourceTable"
+Then Wait till connection data loading completes with a timeout of 60 seconds
+Then Verify input plugin property: "dataset" contains value: "dataset"
+Then Verify input plugin property: "table" contains value: "bqSourceTable"
+Then Click on the Get Schema button
+Then Validate "BigQuery" plugin properties
+And Close the Plugin Properties page
+Then Navigate to the properties page of plugin: "BigQuery2"
+Then Click plugin property: "useConnection"
+Then Click on the Browse Connections button
+Then Select connection: "bqConnectionName"
+Then Enter input plugin property: "referenceName" with value: "BQSinkReferenceName"
+Then Click on the Browse button inside plugin properties
+Then Click SELECT button inside connection data row with name: "dataset"
+Then Wait till connection data loading completes with a timeout of 60 seconds
+Then Verify input plugin property: "dataset" contains value: "dataset"
+Then Enter input plugin property: "table" with value: "bqTargetTable"
+Then Enter input plugin property: "partitionByField" with value: "bqPartitionFieldTimeStamp"
+Then Click plugin property: "updateTableSchema"
+Then Validate "BigQuery" plugin properties
+Then Close the BigQuery properties
+Then Save the pipeline
+Then Preview and run the pipeline
+Then Wait till pipeline preview is in running state
+Then Open and capture pipeline preview logs
+Then Verify the preview run status of pipeline in the logs is "succeeded"
+Then Close the pipeline logs
+Then Close the preview
+Then Deploy the pipeline
+Then Run the Pipeline in Runtime
+Then Wait till pipeline is in running state
+Then Open and capture logs
+Then Close the pipeline logs
+Then Verify the pipeline status is "Succeeded"
 Then Validate the data transferred from BigQuery to BigQuery with actual And expected file for: "bqTimeStampExpectedFile"

 @BQ_UPSERT_DEDUPE_SOURCE_TEST @BQ_UPSERT_DEDUPE_SINK_TEST @EXISTING_BQ_CONNECTION
-Scenario:Validate successful records transfer from BigQuery source to BigQuery sink with Upsert operation with dedupe source data and destination table already exists and update table schema is false
+Scenario:Validate successful records transfer from BigQuery source to BigQuery sink with Upsert operation with dedupe source data and existing destination table where update table schema is set to false
 Given Open Datafusion Project to configure pipeline
 When Expand Plugin group in the LHS plugins list: "Source"
 When Select plugin: "BigQuery" from the plugins list as: "Source"
@@ -486,7 +486,7 @@ Feature: BigQuery sink - Verification of BigQuery to BigQuery successful data tr
 Then Click on the Add Button of the property: "relationTableKey" with value:
 | TableKeyDedupe |
 Then Select dropdown plugin property: "dedupeBy" with option value: "dedupeBy"
-Then Enter key for plugin property: "dedupeBy" with values: "Price"
+Then Enter key for plugin property: "dedupeBy" with values: "dedupeByValueUpsert"
 Then Validate "BigQuery" plugin properties
 And Close the Plugin Properties page
 Then Save the pipeline
@@ -504,7 +504,7 @@ Feature: BigQuery sink - Verification of BigQuery to BigQuery successful data tr
 Then Verify the pipeline status is "Succeeded"
 Then Validate the data transferred from BigQuery to BigQuery with actual And expected file for: "bqUpsertDedupeFile"

-@BQ_RECORD_SOURCE_TEST @BQ_SINK_TEST
+@BQ_RECORD_SOURCE_TEST @BQ_SECOND_RECORD_SOURCE_TEST @BQ_SINK_TEST
 Scenario: Validate successful record transfer from two BigQuery source plugins with different schema record names, taking one extra column in BigQuery source plugin 1,and
 using wrangler transformation plugin for removing the extra column and transferring the data in BigQuery sink plugin containing all the columns from both the source plugin.
 Given Open Datafusion Project to configure pipeline
@@ -517,6 +617,13 @@ Feature: BigQuery sink - Verification of BigQuery to BigQuery successful data tr
 Then Click on the Get Schema button
 Then Click on the Validate button
 Then Close the Plugin Properties page
+Then Navigate to the properties page of plugin: "BigQuery2"
+Then Replace input plugin property: "project" with value: "projectId"
+Then Replace input plugin property: "dataset" with value: "dataset"
+Then Replace input plugin property: "table" with value: "bqSourceTable2"
+Then Click on the Get Schema button
+Then Click on the Validate button
+Then Close the Plugin Properties page
 Then Navigate to the properties page of plugin: "BigQuery3"
 Then Replace input plugin property: "project" with value: "projectId"
 Then Replace input plugin property: "table" with value: "bqTargetTable"
@@ -572,5 +679,5 @@ Feature: BigQuery sink - Verification of BigQuery to BigQuery successful data tr
 Then Run the Pipeline in Runtime
 Then Wait till pipeline is in running state
 Then Open and capture logs
-Then Close the pipeline logs
 Then Verify the pipeline status is "Succeeded"
+Then Close the pipeline logs

src/e2e-test/java/io/cdap/plugin/bigquery/runners/sinkrunner/TestRunner.java

Lines changed: 1 addition & 1 deletion
@@ -27,7 +27,7 @@
 features = {"src/e2e-test/features"},
 glue = {"io.cdap.plugin.bigquery.stepsdesign", "io.cdap.plugin.gcs.stepsdesign",
         "stepsdesign", "io.cdap.plugin.common.stepsdesign"},
-tags = {"@BigQuery_Sink not @CDAP-20830"},
+tags = {"@BigQuery_Sink and not @CDAP-20830"},
 //TODO: Enable test once issue is fixed https://cdap.atlassian.net/browse/CDAP-20830
 monochrome = true,
 plugin = {"pretty", "html:target/cucumber-html-report/bigquery-sink",
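The runner fix matters because Cucumber tag filters are boolean expressions: `@BigQuery_Sink not @CDAP-20830` has no operator between the two tags, while `@BigQuery_Sink and not @CDAP-20830` selects scenarios that carry the sink tag and lack the excluded issue tag. A minimal sketch of that selection logic in plain Java (illustrative only, not the Cucumber parser):

```java
import java.util.Set;

public class TagFilterSketch {
    // Mirrors "@BigQuery_Sink and not @CDAP-20830": run a scenario only if it
    // carries the sink tag and does not carry the excluded issue tag.
    static boolean shouldRun(Set<String> scenarioTags) {
        return scenarioTags.contains("@BigQuery_Sink")
                && !scenarioTags.contains("@CDAP-20830");
    }

    public static void main(String[] args) {
        System.out.println(shouldRun(Set.of("@BigQuery_Sink")));                // true
        System.out.println(shouldRun(Set.of("@BigQuery_Sink", "@CDAP-20830"))); // false
    }
}
```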

src/e2e-test/java/io/cdap/plugin/common/stepsdesign/TestSetupHooks.java

Lines changed: 34 additions & 7 deletions
@@ -59,6 +59,7 @@ public class TestSetupHooks {
 public static String gcsTargetBucketName = StringUtils.EMPTY;
 public static String bqTargetTable = StringUtils.EMPTY;
 public static String bqSourceTable = StringUtils.EMPTY;
+public static String bqSourceTable2 = StringUtils.EMPTY;
 public static String bqSourceView = StringUtils.EMPTY;
 public static String pubSubTargetTopic = StringUtils.EMPTY;
 public static String spannerInstance = StringUtils.EMPTY;
@@ -270,7 +271,11 @@ public static void createTempSourceBQTable() throws IOException, InterruptedExce
 }

 @After(order = 1, value = "@BQ_SOURCE_TEST or @BQ_PARTITIONED_SOURCE_TEST or @BQ_SOURCE_DATATYPE_TEST or " +
-  "@BQ_INSERT_SOURCE_TEST or @BQ_UPDATE_SINK_TEST")
+  "@BQ_INSERT_SOURCE_TEST or @BQ_UPDATE_SINK_TEST or @BQ_UPSERT_SOURCE_TEST or @BQ_UPSERT_SINK_TEST or " +
+  "@BQ_NULL_MODE_SOURCE_TEST or @BQ_UPDATE_SOURCE_DEDUPE_TEST or @BQ_UPDATE_SINK_DEDUPE_TEST or " +
+  "@BQ_INSERT_INT_SOURCE_TEST or @BQ_EXISTING_SINK_TEST or @BQ_TIME_STAMP_SOURCE_TEST or " +
+  "@BQ_UPSERT_DEDUPE_SOURCE_TEST or @BQ_UPSERT_DEDUPE_SINK_TEST or @BQ_RECORD_SOURCE_TEST or " +
+  "@BQ_SECOND_RECORD_SOURCE_TEST or @BQ_INSERT_SINK_TEST")
 public static void deleteTempSourceBQTable() throws IOException, InterruptedException {
 BigQueryClient.dropBqQuery(bqSourceTable);
 PluginPropertyUtils.removePluginProp("bqSourceTable");
@@ -1091,19 +1096,23 @@ public static void createSinkBQExistingTable() throws IOException, InterruptedEx
 BeforeActions.scenario.write("BQ Target Table " + bqTargetTable + " updated successfully");
 }

-@Before(order = 1, value = "@BQ_TIME_STAMP_SOURCE_TEST")
+@Before(order = 1, value = "@BQ_TIME_SOURCE_TEST")
 public static void createTimeStampBQTable() throws IOException, InterruptedException {
 bqSourceTable = "E2E_SOURCE_" + UUID.randomUUID().toString().replaceAll("-", "_");
 PluginPropertyUtils.addPluginProp("bqSourceTable", bqSourceTable);
 BeforeActions.scenario.write("BQ source table name - " + bqSourceTable);
 BigQueryClient.getSoleQueryResult("create table `" + datasetName + "." + bqSourceTable + "` " +
-  "(ID STRING, transaction_date DATE, Firstname STRING)");
+  "(ID STRING, transaction_date DATE, Firstname STRING," +
+  " transaction_dt DATETIME, updated_on TIMESTAMP )");
 try {
 BigQueryClient.getSoleQueryResult("INSERT INTO `" + datasetName + "." + bqSourceTable + "` " +
-  "(ID, transaction_date, Firstname)" +
-  "VALUES" + "('Agra', '2021-02-20', 'Neera')," +
-  "('Noida', '2021-02-21','')," +
-  "('Gurgaon', '2021-02-22', 'singh')");
+  "(ID, transaction_date, Firstname, transaction_dt, updated_on )" +
+  "VALUES" + "('Agra', '2021-02-20', 'Neera','2019-07-07 11:24:00', " +
+  "'2019-03-10 04:50:01 UTC')," +
+  "('Noida', '2021-02-21','', '2019-07-07 11:24:00', " +
+  "'2019-03-10 04:50:01 UTC')," +
+  "('Gurgaon', '2021-02-22', 'singh', '2019-07-07 11:24:00', " +
+  "'2019-03-10 04:50:01 UTC' )");
 } catch (NoSuchElementException e) {
 // Insert query does not return any record.
 // Iterator on TableResult values in getSoleQueryResult method throws NoSuchElementException
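The widened source table is what lets the three new partition-field scenarios cover date, datetime, and timestamp: BigQuery `DATE` is a calendar date, `DATETIME` is a zone-less wall-clock value (`2019-07-07 11:24:00`), and `TIMESTAMP` is an absolute instant whose literal here carries an explicit `UTC` suffix. The distinction expressed in `java.time` terms (a sketch for illustration, not part of the commit):

```java
import java.time.Instant;
import java.time.LocalDate;
import java.time.LocalDateTime;
import java.time.ZoneOffset;
import java.time.format.DateTimeFormatter;

public class BqTimeTypesSketch {
    public static void main(String[] args) {
        DateTimeFormatter fmt = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss");

        // DATE: calendar date only, no time component.
        LocalDate date = LocalDate.parse("2021-02-20");

        // DATETIME: wall-clock value with no zone attached.
        LocalDateTime dateTime = LocalDateTime.parse("2019-07-07 11:24:00", fmt);

        // TIMESTAMP: an absolute instant; the literal's "UTC" suffix pins the zone.
        Instant timestamp = LocalDateTime.parse("2019-03-10 04:50:01", fmt)
                .toInstant(ZoneOffset.UTC);

        System.out.println(date);       // 2021-02-20
        System.out.println(dateTime);   // 2019-07-07T11:24
        System.out.println(timestamp);  // 2019-03-10T04:50:01Z
    }
}
```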
@@ -1175,6 +1184,24 @@ public static void createSourceBQRecordTable() throws IOException, InterruptedEx
 BeforeActions.scenario.write("BQ Source Table " + bqSourceTable + " created successfully");
 }

+@Before(order = 1, value = "@BQ_SECOND_RECORD_SOURCE_TEST")
+public static void createSourceBQSecondRecordTable() throws IOException, InterruptedException {
+  bqSourceTable2 = "E2E_SOURCE_" + UUID.randomUUID().toString().replaceAll("-", "_");
+  io.cdap.e2e.utils.BigQueryClient.getSoleQueryResult("create table `" + datasetName + "." + bqSourceTable2 + "` " +
+    "(ID INT64, Name STRING, " + "Price FLOAT64 ) ");
+  try {
+    io.cdap.e2e.utils.BigQueryClient.getSoleQueryResult("INSERT INTO `" + datasetName + "." + bqSourceTable2 + "` " +
+      "(ID, Name, Price)" +
+      "VALUES" + "(1, 'string_1', 0.1)");
+  } catch (NoSuchElementException e) {
+    // Insert query does not return any record.
+    // Iterator on TableResult values in getSoleQueryResult method throws NoSuchElementException
+    BeforeActions.scenario.write("Error inserting the record in the table" + e.getStackTrace());
+  }
+  PluginPropertyUtils.addPluginProp("bqSourceTable2", bqSourceTable2);
+  BeforeActions.scenario.write("BQ Source Table " + bqSourceTable2 + " created successfully");
+}
+
 @Before(order = 1, value = "@BQ_INSERT_SINK_TEST")
 public static void createSinkBQInsertTable() throws IOException, InterruptedException {
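The new `createSourceBQSecondRecordTable` hook follows the same pattern as the existing hooks: generate a collision-free table name per scenario, create and populate it, then publish the name as a plugin property. The naming step can be sketched as (plain Java; the `E2E_SOURCE_` prefix is taken from the diff above):

```java
import java.util.UUID;

public class TableNameSketch {
    // BigQuery table names allow letters, digits, and underscores, so the
    // hyphens in a random UUID are replaced with underscores before use.
    static String uniqueSourceTable() {
        return "E2E_SOURCE_" + UUID.randomUUID().toString().replaceAll("-", "_");
    }

    public static void main(String[] args) {
        String name = uniqueSourceTable();
        System.out.println(name);
        // Sanity checks mirroring what the hook relies on.
        System.out.println(name.startsWith("E2E_SOURCE_")); // true
        System.out.println(name.contains("-"));             // false
    }
}
```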

src/e2e-test/resources/pluginDataCyAttributes.properties

Lines changed: 2 additions & 0 deletions
@@ -42,4 +42,6 @@ spannerConnectionRow=connector-Spanner
 testConnection=connection-test-button
 connectionCreate=connection-submit-button
 parsingOptionConfirm=parsing-config-confirm
+dedupeBy=dedupeBy
+relationTableKey=relationTableKey
 ## CONNECTION-MANAGEMENT-END
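The two new keys map plugin property names to their `data-cy` attribute values so the step definitions can locate the `dedupeBy` and `relationTableKey` widgets on the plugin properties page. A minimal sketch of how such a properties file resolves to a selector (standard `java.util.Properties`; file contents inlined for illustration):

```java
import java.io.IOException;
import java.io.StringReader;
import java.util.Properties;

public class CyAttributeLookupSketch {
    public static void main(String[] args) throws IOException {
        // Inlined stand-in for pluginDataCyAttributes.properties.
        String contents = "dedupeBy=dedupeBy\nrelationTableKey=relationTableKey\n";
        Properties cyAttributes = new Properties();
        cyAttributes.load(new StringReader(contents));

        // Step definitions would build a CSS selector like [data-cy="dedupeBy"].
        String selector = "[data-cy=\"" + cyAttributes.getProperty("dedupeBy") + "\"]";
        System.out.println(selector); // [data-cy="dedupeBy"]
    }
}
```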
