-Scenario: Validate Successful record transfer from BigQuery source plugin to BigQuery sink plugin with all null values in one column and few null values in different column.
+Scenario: Validate successful record transfer from BigQuery source plugin to BigQuery sink plugin with all null values in one column and a few null values in another column in the source table
Given Open Datafusion Project to configure pipeline
When Expand Plugin group in the LHS plugins list: "Source"
When Select plugin: "BigQuery" from the plugins list as: "Source"
@@ -297,7 +297,7 @@ Feature: BigQuery sink - Verification of BigQuery to BigQuery successful data tr
Then Validate the values of records transferred to BQ sink is equal to the values from source BigQuery table

Scenario: Verify successful record transfer for the Insert operation from BigQuery source plugin to BigQuery sink with partition type Time and partition field is date.
Given Open Datafusion Project to configure pipeline
When Expand Plugin group in the LHS plugins list: "Source"
When Select plugin: "BigQuery" from the plugins list as: "Source"
@@ -449,10 +449,110 @@ Feature: BigQuery sink - Verification of BigQuery to BigQuery successful data tr
Then Open and capture logs
Then Close the pipeline logs
Then Verify the pipeline status is "Succeeded"
+Then Validate the data transferred from BigQuery to BigQuery with actual And expected file for: "bqDateExpectedFile"

+Scenario: Verify successful record transfer for the Insert operation from BigQuery source plugin to BigQuery sink with partition type Time and partition field is datetime.
+Given Open Datafusion Project to configure pipeline
+When Expand Plugin group in the LHS plugins list: "Source"
+When Select plugin: "BigQuery" from the plugins list as: "Source"
+When Expand Plugin group in the LHS plugins list: "Sink"
+When Select plugin: "BigQuery" from the plugins list as: "Sink"
+Then Connect plugins: "BigQuery" and "BigQuery2" to establish connection
+Then Navigate to the properties page of plugin: "BigQuery"
+Then Click plugin property: "switch-useConnection"
+Then Click on the Browse Connections button
+Then Select connection: "bqConnectionName"
+Then Click on the Browse button inside plugin properties
+Then Select connection data row with name: "dataset"
+Then Select connection data row with name: "bqSourceTable"
+Then Wait till connection data loading completes with a timeout of 60 seconds
+Then Verify input plugin property: "dataset" contains value: "dataset"
+Then Verify input plugin property: "table" contains value: "bqSourceTable"
+Then Click on the Get Schema button
+Then Validate "BigQuery" plugin properties
+And Close the Plugin Properties page
+Then Navigate to the properties page of plugin: "BigQuery2"
+Then Click plugin property: "useConnection"
+Then Click on the Browse Connections button
+Then Select connection: "bqConnectionName"
+Then Enter input plugin property: "referenceName" with value: "BQSinkReferenceName"
+Then Click on the Browse button inside plugin properties
+Then Click SELECT button inside connection data row with name: "dataset"
+Then Wait till connection data loading completes with a timeout of 60 seconds
+Then Verify input plugin property: "dataset" contains value: "dataset"
+Then Enter input plugin property: "table" with value: "bqTargetTable"
+Then Enter input plugin property: "partitionByField" with value: "bqPartitionFieldDateTime"
+Then Click plugin property: "updateTableSchema"
+Then Validate "BigQuery" plugin properties
+Then Close the BigQuery properties
+Then Save the pipeline
+Then Preview and run the pipeline
+Then Wait till pipeline preview is in running state
+Then Open and capture pipeline preview logs
+Then Verify the preview run status of pipeline in the logs is "succeeded"
+Then Close the pipeline logs
+Then Close the preview
+Then Deploy the pipeline
+Then Run the Pipeline in Runtime
+Then Wait till pipeline is in running state
+Then Open and capture logs
+Then Close the pipeline logs
+Then Verify the pipeline status is "Succeeded"
+Then Validate the data transferred from BigQuery to BigQuery with actual And expected file for: "bqDateTimeExpectedFile"

+Scenario: Verify successful record transfer for the Insert operation from BigQuery source plugin to BigQuery sink with partition type Time and partition field is timestamp.
+Given Open Datafusion Project to configure pipeline
+When Expand Plugin group in the LHS plugins list: "Source"
+When Select plugin: "BigQuery" from the plugins list as: "Source"
+When Expand Plugin group in the LHS plugins list: "Sink"
+When Select plugin: "BigQuery" from the plugins list as: "Sink"
+Then Connect plugins: "BigQuery" and "BigQuery2" to establish connection
+Then Navigate to the properties page of plugin: "BigQuery"
+Then Click plugin property: "switch-useConnection"
+Then Click on the Browse Connections button
+Then Select connection: "bqConnectionName"
+Then Click on the Browse button inside plugin properties
+Then Select connection data row with name: "dataset"
+Then Select connection data row with name: "bqSourceTable"
+Then Wait till connection data loading completes with a timeout of 60 seconds
+Then Verify input plugin property: "dataset" contains value: "dataset"
+Then Verify input plugin property: "table" contains value: "bqSourceTable"
+Then Click on the Get Schema button
+Then Validate "BigQuery" plugin properties
+And Close the Plugin Properties page
+Then Navigate to the properties page of plugin: "BigQuery2"
+Then Click plugin property: "useConnection"
+Then Click on the Browse Connections button
+Then Select connection: "bqConnectionName"
+Then Enter input plugin property: "referenceName" with value: "BQSinkReferenceName"
+Then Click on the Browse button inside plugin properties
+Then Click SELECT button inside connection data row with name: "dataset"
+Then Wait till connection data loading completes with a timeout of 60 seconds
+Then Verify input plugin property: "dataset" contains value: "dataset"
+Then Enter input plugin property: "table" with value: "bqTargetTable"
+Then Enter input plugin property: "partitionByField" with value: "bqPartitionFieldTimeStamp"
+Then Click plugin property: "updateTableSchema"
+Then Validate "BigQuery" plugin properties
+Then Close the BigQuery properties
+Then Save the pipeline
+Then Preview and run the pipeline
+Then Wait till pipeline preview is in running state
+Then Open and capture pipeline preview logs
+Then Verify the preview run status of pipeline in the logs is "succeeded"
+Then Close the pipeline logs
+Then Close the preview
+Then Deploy the pipeline
+Then Run the Pipeline in Runtime
+Then Wait till pipeline is in running state
+Then Open and capture logs
+Then Close the pipeline logs
+Then Verify the pipeline status is "Succeeded"
Then Validate the data transferred from BigQuery to BigQuery with actual And expected file for: "bqTimeStampExpectedFile"
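The three Insert scenarios above partition the sink table on a date, datetime, and timestamp field respectively. As background, BigQuery time partitioning only accepts a DATE, DATETIME, or TIMESTAMP column as the partition field. A minimal sketch (not part of the test suite; the schema dict and field names are hypothetical) of the eligibility check one might run before setting `partitionByField`:

```python
# Sketch: check that a chosen partition field is eligible for BigQuery
# time partitioning. Only DATE, DATETIME and TIMESTAMP columns qualify.
TIME_PARTITION_TYPES = {"DATE", "DATETIME", "TIMESTAMP"}

def is_valid_time_partition_field(schema: dict, field: str) -> bool:
    """Return True if `field` exists in `schema` with a partitionable type."""
    return schema.get(field, "").upper() in TIME_PARTITION_TYPES

# Hypothetical source schema for illustration.
schema = {"id": "INTEGER", "created": "DATE", "updated": "TIMESTAMP"}
print(is_valid_time_partition_field(schema, "created"))  # True
print(is_valid_time_partition_field(schema, "id"))       # False
```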

-Scenario:Validate successful records transfer from BigQuery source to BigQuery sink with Upsert operation with dedupe source data and destination table already exists and update table schema is false
+Scenario: Validate successful records transfer from BigQuery source to BigQuery sink with Upsert operation with dedupe source data and an existing destination table where update table schema is set to false
Given Open Datafusion Project to configure pipeline
When Expand Plugin group in the LHS plugins list: "Source"
When Select plugin: "BigQuery" from the plugins list as: "Source"
@@ -486,7 +586,7 @@ Feature: BigQuery sink - Verification of BigQuery to BigQuery successful data tr
Then Click on the Add Button of the property: "relationTableKey" with value:
| TableKeyDedupe |
Then Select dropdown plugin property: "dedupeBy" with option value: "dedupeBy"
-Then Enter key for plugin property: "dedupeBy" with values: "Price"
+Then Enter key for plugin property: "dedupeBy" with values: "dedupeByValueUpsert"
Then Validate "BigQuery" plugin properties
And Close the Plugin Properties page
Then Save the pipeline
@@ -504,7 +604,7 @@ Feature: BigQuery sink - Verification of BigQuery to BigQuery successful data tr
Then Verify the pipeline status is "Succeeded"
Then Validate the data transferred from BigQuery to BigQuery with actual And expected file for: "bqUpsertDedupeFile"
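The Upsert scenario above deduplicates source rows on the relation table key before merging into the existing destination table. A rough illustration of that dedupe-by semantics, keeping one row per key chosen by the largest value of the dedupe-by field (MAX-style); the row data and field names are made up, and the real plugin performs this inside BigQuery rather than in client code:

```python
# Sketch of upsert-style dedupe: one row per table key, keeping the row
# with the highest value of the dedupe-by field (MAX semantics).
# Records and field names here are illustrative only.
def dedupe(rows, key_field, dedupe_field):
    best = {}
    for row in rows:
        k = row[key_field]
        if k not in best or row[dedupe_field] > best[k][dedupe_field]:
            best[k] = row
    return list(best.values())

rows = [
    {"id": 1, "price": 10},
    {"id": 1, "price": 30},  # duplicate key: this row wins on price
    {"id": 2, "price": 5},
]
print(dedupe(rows, "id", "price"))
```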

Scenario: Validate successful record transfer from two BigQuery source plugins with different schema record names, taking one extra column in BigQuery source plugin 1, and using the Wrangler transformation plugin to remove the extra column and transfer the data to a BigQuery sink plugin containing all the columns from both source plugins.
Given Open Datafusion Project to configure pipeline
@@ -517,6 +617,13 @@ Feature: BigQuery sink - Verification of BigQuery to BigQuery successful data tr
Then Click on the Get Schema button
Then Click on the Validate button
Then Close the Plugin Properties page
+Then Navigate to the properties page of plugin: "BigQuery2"
+Then Replace input plugin property: "project" with value: "projectId"
+Then Replace input plugin property: "dataset" with value: "dataset"
+Then Replace input plugin property: "table" with value: "bqSourceTable2"
+Then Click on the Get Schema button
+Then Click on the Validate button
+Then Close the Plugin Properties page
Then Navigate to the properties page of plugin: "BigQuery3"
Then Replace input plugin property: "project" with value: "projectId"
Then Replace input plugin property: "table" with value: "bqTargetTable"
@@ -572,5 +679,5 @@ Feature: BigQuery sink - Verification of BigQuery to BigQuery successful data tr
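The two-source scenario drops the extra column from source plugin 1 via a Wrangler transformation so that both record sets share one schema before reaching the sink. A toy sketch of that shape (the records and column names are hypothetical; the actual Wrangler step would use a drop-column directive inside the pipeline, not client code):

```python
# Toy sketch of the two-source flow: source 1 carries an extra column
# that is dropped before the two record sets are combined for the sink.
# Records and column names are illustrative only.
def drop_column(records, column):
    """Return copies of `records` without the given column."""
    return [{k: v for k, v in r.items() if k != column} for r in records]

source1 = [{"id": 1, "name": "a", "extra": "x"}]  # has one extra column
source2 = [{"id": 2, "name": "b"}]

# After the drop, both sources expose the same columns for the sink.
sink_rows = drop_column(source1, "extra") + source2
print(sink_rows)
```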