[ti_flashpoint][alert] Add Flashpoint Alert data stream #16714

Open
akshraj-crest wants to merge 2 commits into elastic:feature/ti_flashpoint-0.1.0 from akshraj-crest:datastream-alert

Conversation

akshraj-crest (Contributor) commented Dec 29, 2025

Proposed commit message

The initial release includes the alert data stream and an associated dashboard.

Flashpoint fields are mapped to their corresponding ECS fields where possible.

Test samples were derived from live data samples, which were subsequently
sanitized.

Checklist

  • I have reviewed tips for building integrations and this pull request is aligned with them.
  • I have verified that all data streams collect metrics or logs.
  • I have added an entry to my package's changelog.yml file.
  • I have verified that Kibana version constraints are current according to guidelines.
  • I have verified that any added dashboard complies with Kibana's Dashboard good practices.

How to test this PR locally

To test the flashpoint package:

  • Clone the integrations repo.
  • Install elastic-package locally.
  • Start the Elastic Stack using elastic-package.
  • Change to the integrations/packages/ti_flashpoint directory.
  • Run the tests with the following command.

elastic-package test

Run asset tests for the package
--- Test results for package: ti_flashpoint - START ---
╭───────────────┬─────────────┬───────────┬────────────────────────────────────────────────────────────────────────┬────────┬──────────────╮
│ PACKAGE       │ DATA STREAM │ TEST TYPE │ TEST NAME                                                              │ RESULT │ TIME ELAPSED │
├───────────────┼─────────────┼───────────┼────────────────────────────────────────────────────────────────────────┼────────┼──────────────┤
│ ti_flashpoint │             │ asset     │ dashboard ti_flashpoint-467e6747-8c82-4bd6-8ba5-2ec3d0e3b826 is loaded │ PASS   │        992ns │
│ ti_flashpoint │ alert       │ asset     │ index_template logs-ti_flashpoint.alert is loaded                      │ PASS   │        237ns │
│ ti_flashpoint │ alert       │ asset     │ ingest_pipeline logs-ti_flashpoint.alert-0.1.0 is loaded               │ PASS   │        176ns │
╰───────────────┴─────────────┴───────────┴────────────────────────────────────────────────────────────────────────┴────────┴──────────────╯
--- Test results for package: ti_flashpoint - END   ---
Done
Run pipeline tests for the package
--- Test results for package: ti_flashpoint - START ---
╭───────────────┬─────────────┬───────────┬───────────────────────────────────────────┬────────┬──────────────╮
│ PACKAGE       │ DATA STREAM │ TEST TYPE │ TEST NAME                                 │ RESULT │ TIME ELAPSED │
├───────────────┼─────────────┼───────────┼───────────────────────────────────────────┼────────┼──────────────┤
│ ti_flashpoint │ alert       │ pipeline  │ (ingest pipeline warnings test-alert.log) │ PASS   │ 306.347643ms │
│ ti_flashpoint │ alert       │ pipeline  │ test-alert.log                            │ PASS   │ 127.278134ms │
╰───────────────┴─────────────┴───────────┴───────────────────────────────────────────┴────────┴──────────────╯
--- Test results for package: ti_flashpoint - END   ---
Done
Run policy tests for the package
--- Test results for package: ti_flashpoint - START ---
No test results
--- Test results for package: ti_flashpoint - END   ---
Done
Run script tests for the package
--- Test results for package: ti_flashpoint - START ---
PKG ti_flashpoint
[no test files]
--- Test results for package: ti_flashpoint - END ---
Done
Run static tests for the package
--- Test results for package: ti_flashpoint - START ---
╭───────────────┬─────────────┬───────────┬──────────────────────────┬────────┬──────────────╮
│ PACKAGE       │ DATA STREAM │ TEST TYPE │ TEST NAME                │ RESULT │ TIME ELAPSED │
├───────────────┼─────────────┼───────────┼──────────────────────────┼────────┼──────────────┤
│ ti_flashpoint │ alert       │ static    │ Verify sample_event.json │ PASS   │ 122.305915ms │
╰───────────────┴─────────────┴───────────┴──────────────────────────┴────────┴──────────────╯
--- Test results for package: ti_flashpoint - END   ---
Done
Run system tests for the package
2025/12/29 16:31:33  INFO Installing package...
2025/12/29 16:31:45  INFO Running test for data_stream "alert" with configuration 'default'
2025/12/29 16:31:52  INFO Setting up independent Elastic Agent...
2025/12/29 16:32:00  INFO Setting up service...
2025/12/29 16:32:18  INFO Validating test case...
2025/12/29 16:32:19  INFO Tearing down service...
2025/12/29 16:32:19  INFO Write container logs to file: /root/integrations/build/container-logs/ti_flashpoint-1767006139704997093.log
2025/12/29 16:32:22  INFO Tearing down agent...
2025/12/29 16:32:22  INFO Write container logs to file: /root/integrations/build/container-logs/elastic-agent-1767006142749489329.log
2025/12/29 16:32:30  INFO Uninstalling package...
--- Test results for package: ti_flashpoint - START ---
╭───────────────┬─────────────┬───────────┬───────────┬────────┬──────────────╮
│ PACKAGE       │ DATA STREAM │ TEST TYPE │ TEST NAME │ RESULT │ TIME ELAPSED │
├───────────────┼─────────────┼───────────┼───────────┼────────┼──────────────┤
│ ti_flashpoint │ alert       │ system    │ default   │ PASS   │ 33.91574444s │
╰───────────────┴─────────────┴───────────┴───────────┴────────┴──────────────╯
--- Test results for package: ti_flashpoint - END   ---
Done

Screenshots

(Two dashboard screenshots attached to the PR.)

Go Code for Ingest Pipeline Generation

The alert data stream pipeline is generated using Go code built on top of the Dispear library.
Below is the code used for generating the pipeline logic:

package main

import (
	"fmt"
	"strings"

	. "github.com/efd6/dispear"
)

const (
	ECSVersion = "9.2.0"
	PkgRoot    = "json"
)
const errorFormat = "Processor {{{_ingest.on_failure_processor_type}}} with tag {{{_ingest.on_failure_processor_tag}}} in pipeline {{{_ingest.on_failure_pipeline}}} failed with message: {{{_ingest.on_failure_message}}}"

// removeErrorHandler generates a series of Renders that first remove the given field
// from the document and then append a custom error message to the 'error.message' field.
// This is typically used to handle errors in an Ingest Pipeline by removing a field that
// caused an issue and appending a formatted error message to the document.
func removeErrorHandler(f string) []Renderer {
	return []Renderer{
		REMOVE(f),
		APPEND("error.message", errorFormat),
	}
}

// safeNavigateAndCheck converts a dot-separated field path to a safe navigation string.
//
// Example:
// "parent.child.grandchild" -> "ctx.parent?.child?.grandchild"
func safeNavigateAndCheck(field string) string {
	parts := strings.Split(field, ".")
	condition := "ctx"
	for i, part := range parts {
		if i > 0 { // later parts use the Painless null-safe '?.' operator
			condition += fmt.Sprintf("?.%s", part)
		} else {
			condition += fmt.Sprintf(".%s", part)
		}
	}
	return condition
}

func main() {

	// Initial processors of pipeline

	DESCRIPTION("Pipeline for processing alert logs.")

	DROP("empty events placeholder").IF("ctx.message == 'empty_events_placeholder'")

	SET("ecs.version").
		VALUE(ECSVersion).
		TAG("set ecs.version to 9.2.0")

	TERMINATE("data collection error").
		IF("ctx.error?.message != null && ctx.message == null && ctx.event?.original == null").
		DESCRIPTION("error message set and no data to process.")

	BLANK()
	BLANK().COMMENT("remove agentless metadata")

	REMOVE(
		"organization",
		"division",
		"team",
	).
		IF("ctx.organization instanceof String && ctx.division instanceof String && ctx.team instanceof String").
		IGNORE_MISSING(true).
		TAG("remove_agentless_tags").
		DESCRIPTION("Removes the fields added by Agentless as metadata, as they can collide with ECS fields.")

	BLANK()
	BLANK().COMMENT("parse the event JSON")

	RENAME("message", "event.original").
		IF("ctx.event?.original == null").
		DESCRIPTION("Renames the original `message` field to `event.original` to store a copy of the original message. The `event.original` field is not touched if the document already has one; it may happen when Logstash sends the document.").
		IGNORE_MISSING(true)

	REMOVE("message").
		TAG("remove_message").
		IF("ctx.event?.original != null").
		DESCRIPTION("The `message` field is no longer required if the document has an `event.original` field.").
		IGNORE_MISSING(true)

	JSON(PkgRoot, "event.original")

	// Add fingerprint

	BLANK()
	BLANK().COMMENT("Add fingerprint")

	FINGERPRINT("_id", "json.id").IGNORE_MISSING(true)

	// Setting event.* fields

	BLANK()
	BLANK().COMMENT("Set event.* fields")

	SET("event.kind").
		VALUE("alert").
		TAG("set event.kind to alert")

	// Script to rename fields to snake_case (hyphen to underscore)

	BLANK()
	BLANK().COMMENT("rename fields to snake_case (hyphen to underscore)")

	SCRIPT().
		TAG("script_normalize_field_names").
		DESCRIPTION("Convert field names from hyphen to underscore.").
		LANG("painless").
		SOURCE(`
        // Replace '-' with '_' in field names
        String normalize(String str) {
          return str.replace('-', '_');
        }

        // Recursive function to process objects
        def normalizeFields(def obj) {
          if (obj instanceof Map) {
            def newObj = new HashMap();
            for (entry in obj.entrySet()) {
              String newKey = normalize(entry.getKey());
              newObj.put(newKey, normalizeFields(entry.getValue()));
            }
            return newObj;
          } else if (obj instanceof List) {
            def newList = new ArrayList();
            for (item in obj) {
              newList.add(normalizeFields(item));
            }
            return newList;
          }
          return obj;
        }

        // Apply transformation
        if (ctx.json != null) {
          ctx.ti_flashpoint = ctx.ti_flashpoint ?: [:];
          ctx.ti_flashpoint.alert = normalizeFields(ctx.json);
          ctx.remove('json');
        }
		`)

	// Use Date processors

	BLANK()
	BLANK().COMMENT("Date processors")

	for _, field := range []string{
		"ti_flashpoint.alert.created_at",
		"ti_flashpoint.alert.generated_at",
		"ti_flashpoint.alert.resource.sort_date",
	} {
		DATE(field, field, "UNIX", "ISO8601", "yyyy-MM-dd' 'HH:mm:ss", "yyyy-MM-dd'T'HH:mm:ss.SSSSSSZ", "yyyy-MM-dd'T'HH:mm:ssXXXXX", "yyyy-MM-dd' 'HH:mm:ssXXXXX").
			IF(safeNavigateAndCheck(field) + " != null" + " && " + "ctx." + field + " != ''").
			ON_FAILURE(removeErrorHandler(field)...)
	}

	DATE("ti_flashpoint.alert.resource.created_at", "ti_flashpoint.alert.resource.created_at.date_time", "UNIX", "ISO8601", "yyyy-MM-dd' 'HH:mm:ss", "yyyy-MM-dd'T'HH:mm:ss.SSSSSSZ", "yyyy-MM-dd'T'HH:mm:ssXXXXX", "yyyy-MM-dd' 'HH:mm:ssXXXXX").
		IF(safeNavigateAndCheck("ti_flashpoint.alert.resource.created_at.date_time") + " != null" + " && " + "ctx." + "ti_flashpoint.alert.resource.created_at.date_time" + " != ''").
		ON_FAILURE(removeErrorHandler("ti_flashpoint.alert.resource.created_at.date_time")...)

	// Convert to Long

	BLANK()
	BLANK().COMMENT("Convert to Long")

	// Convert safe_search fields within the ti_flashpoint.alert.resource.media_v2 array to long
	for _, field := range []string{
		"image_enrichment.enrichments.v1.image_analysis.safe_search.adult",
		"image_enrichment.enrichments.v1.image_analysis.safe_search.medical",
		"image_enrichment.enrichments.v1.image_analysis.safe_search.racy",
		"image_enrichment.enrichments.v1.image_analysis.safe_search.spoof",
		"image_enrichment.enrichments.v1.image_analysis.safe_search.violence",
	} {
		FOREACH("ti_flashpoint.alert.resource.media_v2",
			CONVERT("", "_ingest._value."+field, "long").
				IGNORE_MISSING(true).
				ON_FAILURE(removeErrorHandler("_ingest._value."+field)...),
		).IF("ctx.ti_flashpoint?.alert?.resource?.media_v2 instanceof List")
	}

	// Convert to boolean

	BLANK()
	BLANK().COMMENT("Convert to Boolean")

	for _, field := range []string{
		"ti_flashpoint.alert.is_read",
	} {
		CONVERT("", field, "boolean").
			IGNORE_MISSING(true).
			ON_FAILURE(removeErrorHandler(field)...)
	}

	// Convert to String

	BLANK()
	BLANK().COMMENT("Convert to String")

	for _, field := range []string{
		"ti_flashpoint.alert.reason.id",
		"ti_flashpoint.alert.resource.container.container.native_id",
		"ti_flashpoint.alert.resource.container.native_id",
		"ti_flashpoint.alert.resource.site_actor.native_id",
	} {
		CONVERT("", field, "string").
			IGNORE_MISSING(true)
	}

	// Set ECS Mapping

	BLANK()
	BLANK().COMMENT("Map custom fields to corresponding ECS and related fields.")

	// Map ECS mapping for top-level fields

	for _, mapping := range []struct {
		ecsField, customField string
	}{
		{ecsField: "event.created", customField: "ti_flashpoint.alert.created_at"},
		{ecsField: "@timestamp", customField: "ti_flashpoint.alert.created_at"},
		{ecsField: "event.id", customField: "ti_flashpoint.alert.id"},
		{ecsField: "event.reason", customField: "ti_flashpoint.alert.reason.name"},
		{ecsField: "user.name", customField: "ti_flashpoint.alert.resource.author"},
		{ecsField: "container.name", customField: "ti_flashpoint.alert.resource.container.name"},
		{ecsField: "container.id", customField: "ti_flashpoint.alert.resource.container.native_id"},
		{ecsField: "event.url", customField: "ti_flashpoint.alert.resource.link"},
	} {
		SET(mapping.ecsField).
			COPY_FROM(mapping.customField).
			IGNORE_EMPTY(true).
			TAG(fmt.Sprintf("set %s from %s", mapping.ecsField, mapping.customField))
	}
	// Map ECS mapping for array fields
	for _, mapping := range []struct {
		ecsField, customField string
	}{
		{ecsField: "file.mime_type", customField: "mime_type"},
		{ecsField: "file.hash.sha256", customField: "phash256"},
		{ecsField: "file.hash.sha1", customField: "sha1"},
	} {
		FOREACH("ti_flashpoint.alert.resource.media_v2",
			APPEND(mapping.ecsField, "{{{_ingest._value."+mapping.customField+"}}}").
				ALLOW_DUPLICATES(false),
		).IF("ctx.ti_flashpoint?.alert?.resource?.media_v2 instanceof List")
	}

	// Map related.* field mappings

	for _, mapping := range []struct {
		ecsField, customField string
	}{
		{ecsField: "related.user", customField: "ti_flashpoint.alert.resource.author"},
	} {
		APPEND(mapping.ecsField, "{{{"+mapping.customField+"}}}").
			IF(safeNavigateAndCheck(mapping.customField) + " != null").
			ALLOW_DUPLICATES(false).
			TAG(fmt.Sprintf("append %s from %s", mapping.ecsField, mapping.customField))
	}

	FOREACH("ti_flashpoint.alert.resource.authors",
		APPEND("related.user", "{{{_ingest._value}}}").
			ALLOW_DUPLICATES(false),
	).IF("ctx.ti_flashpoint?.alert?.resource?.authors instanceof List")

	for _, mapping := range []struct {
		ecsField, customField string
	}{
		{ecsField: "related.hash", customField: "phash"},
		{ecsField: "related.hash", customField: "phash256"},
		{ecsField: "related.hash", customField: "sha1"},
	} {
		FOREACH("ti_flashpoint.alert.resource.media_v2",
			APPEND(mapping.ecsField, "{{{_ingest._value."+mapping.customField+"}}}").
				ALLOW_DUPLICATES(false),
		).IF("ctx.ti_flashpoint?.alert?.resource?.media_v2 instanceof List")
	}

	// Remove Duplicate Fields.

	BLANK()
	BLANK().COMMENT("Remove duplicate custom fields if the preserve_duplicate_custom_fields tag is not set")

	FOREACH("ti_flashpoint.alert.resource.media_v2",
		REMOVE(
			"_ingest._value.mime_type",
			"_ingest._value.phash256",
			"_ingest._value.sha1",
		).
			TAG("remove_custom_duplicate_fields_for_media_v2").
			IGNORE_MISSING(true),
	).IF("ctx.ti_flashpoint?.alert?.resource?.media_v2 instanceof List && (ctx.tags == null || !ctx.tags.contains('preserve_duplicate_custom_fields'))")

	REMOVE(
		"ti_flashpoint.alert.created_at",
		"ti_flashpoint.alert.id",
		"ti_flashpoint.alert.reason.name",
		"ti_flashpoint.alert.resource.author",
		"ti_flashpoint.alert.resource.container.name",
		"ti_flashpoint.alert.resource.container.native_id",
		"ti_flashpoint.alert.resource.link",
	).
		IF("ctx.tags == null || !ctx.tags.contains('preserve_duplicate_custom_fields')").
		TAG("remove_custom_duplicate_fields").
		IGNORE_MISSING(true)

	// Clean up script

	BLANK()
	BLANK().COMMENT("Cleanup")

	SCRIPT().
		TAG("script_to_drop_null_values").
		DESCRIPTION("This script processor iterates over the whole document to remove fields with null values.").
		LANG("painless").
		SOURCE(`
		void handleMap(Map map) {
		  map.values().removeIf(v -> {
		    if (v instanceof Map) {
		      handleMap(v);
		    } else if (v instanceof List) {
		      handleList(v);
		    }
		    return v == null || v == '' || (v instanceof Map && v.size() == 0) || (v instanceof List && v.size() == 0);
		  });
		}
		void handleList(List list) {
		  list.removeIf(v -> {
		    if (v instanceof Map) {
		      handleMap(v);
		    } else if (v instanceof List) {
		      handleList(v);
		    }
		    return v == null || v == '' || (v instanceof Map && v.size() == 0) || (v instanceof List && v.size() == 0);
		  });
		}
		handleMap(ctx);
		`)

	// Final set and append processors

	SET("event.kind").
		VALUE("pipeline_error").
		IF("ctx.error?.message != null").
		TAG("set event.kind to pipeline_error")
	APPEND("tags", "preserve_original_event").
		IF("ctx.error?.message != null").
		ALLOW_DUPLICATES(false)

	// Global on failure processor

	ON_FAILURE(
		APPEND("error.message", errorFormat),
		SET("event.kind").VALUE("pipeline_error").TAG("set event.kind to pipeline_error"),
		APPEND("tags", "preserve_original_event").
			ALLOW_DUPLICATES(false),
	)

	// Generate the pipeline

	Generate()
}
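As a quick sanity check of the path-conversion helper, `safeNavigateAndCheck` can be exercised on its own. The snippet below copies the function out of the generator above into a standalone program; the sample field path is illustrative.

```go
package main

import (
	"fmt"
	"strings"
)

// safeNavigateAndCheck is copied from the generator above: it turns a
// dot-separated field path into a Painless null-safe access expression.
func safeNavigateAndCheck(field string) string {
	parts := strings.Split(field, ".")
	condition := "ctx"
	for i, part := range parts {
		if i > 0 {
			condition += fmt.Sprintf("?.%s", part)
		} else {
			condition += fmt.Sprintf(".%s", part)
		}
	}
	return condition
}

func main() {
	fmt.Println(safeNavigateAndCheck("ti_flashpoint.alert.created_at"))
	// → ctx.ti_flashpoint?.alert?.created_at
}
```

Note that the first path segment uses plain `.` because `ctx` itself is never null in an ingest pipeline script.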

@akshraj-crest akshraj-crest requested a review from a team as a code owner December 29, 2025 11:16
@andrewkroh andrewkroh added labels on Jan 8, 2026: New Integration (Issue or pull request for creating a new integration package), dashboard (Relates to a Kibana dashboard bug, enhancement, or modification), and documentation (Improvements or additions to documentation; applied to PRs that modify *.md files).
@kcreddy kcreddy self-requested a review February 6, 2026 09:42
@@ -0,0 +1,3 @@
dependencies:
ecs:
reference: git@v9.2.0
Contributor:

9.3.0 is out. Can you update to this?

/packages/ti_domaintools @elastic/security-service-integrations
/packages/ti_eclecticiq @elastic/security-service-integrations
/packages/ti_eset @elastic/security-service-integrations
/packages/ti_flashpoint @elastic/security-service-integrations
Contributor:

Is the idea to add threat indicators to this integration?
All the ti_* integrations ingest threat indicator feeds, and detection rules rely on ti_* packages to run detections.

Contributor (Author):

Flashpoint ingests indicators and vulnerabilities; that's why we've aligned this as a ti_* package.

Comment on lines +25 to +27
Integrating Flashpoint with Elastic SIEM provides centralized visibility into threat intelligence alerts. Kibana dashboards present key metrics such as `Total Alerts`, along with visualizations showing `Alerts by Data Type`, `Source`, and `Origin`.

`Alert Trends over Time`, `Top Authors`, `MIME Types`, `Alert Sources`, and `Geographic Distribution` of related resources help analysts quickly monitor activity and investigate alerts. These insights support efficient threat monitoring and analysis workflows.
Contributor:

I don't think naming all visualisations is a good idea, as it can be a maintenance burden.

Contributor (Author):

Agreed. We can list only the relevant visualizations to avoid maintenance overhead.

Comment on lines +62 to +65
max([
state.cursor.max_created_at.parse_time("2006-01-02T15:04:05"),
body.items.map(e, timestamp(e.created_at)).max()
]).format("2006-01-02T15:04:05")
Contributor:

`max` also works on a pair, without having to instantiate a list.

Comment on lines +76 to +88
((has(body.items) && size(body.items) > 0) ?
(has(state.?cursor.max_created_at) ?
optional.of(
max([
state.cursor.max_created_at.parse_time("2006-01-02T15:04:05"),
body.items.map(e, timestamp(e.created_at)).max(),
]).format("2006-01-02T15:04:05")
)
:
optional.of(body.items.map(e, timestamp(e.created_at)).max().format("2006-01-02T15:04:05"))
)
:
state.?cursor.max_created_at)
Contributor:

This is identical to L60-71.
I think you could use just one of max_created_at or last_timestamp and keep updating it for all pages (no pagination condition), and it might still work?
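The reviewer's suggestion, restated as a minimal Go sketch: keep a single cursor value and advance it on every page to the newest `created_at` seen. The `maxCreatedAt` helper and its field names are illustrative assumptions, not part of the package source; only the `2006-01-02T15:04:05` layout comes from the CEL program quoted above.

```go
package main

import (
	"fmt"
	"time"
)

const tsLayout = "2006-01-02T15:04:05" // same layout as the CEL parse_time/format calls

// maxCreatedAt returns the later of the stored cursor value and the newest
// created_at among the current page's items, formatted with the same layout.
func maxCreatedAt(cursor string, items []string) (string, error) {
	maxT, err := time.Parse(tsLayout, cursor)
	if err != nil {
		return "", err
	}
	for _, it := range items {
		t, err := time.Parse(tsLayout, it)
		if err != nil {
			return "", err
		}
		if t.After(maxT) {
			maxT = t
		}
	}
	return maxT.Format(tsLayout), nil
}

func main() {
	next, _ := maxCreatedAt("2025-12-01T00:00:00", []string{
		"2025-12-02T10:30:00",
		"2025-11-30T23:59:59",
	})
	fmt.Println(next) // 2025-12-02T10:30:00
}
```

Running this on every page, including the last one, makes a separate "final page" branch unnecessary.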


# rename fields to snake_case (hyphen to underscore)
- script:
description: Convert field names from hyphen to underscore.
Contributor:

It also populates ctx.ti_flashpoint.alert and removes ctx.json. It's important to call these out.
Alternatively, use rename and remove processors after this script for the assignment of ctx.ti_flashpoint.alert and the removal of ctx.json, respectively.
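For reference, the recursive key rewrite that the Painless script performs can be sketched in Go. This is an illustration of the technique only; the pipeline itself runs the Painless version, and the sample document below is made up.

```go
package main

import (
	"fmt"
	"strings"
)

// normalizeFields mirrors the Painless script's normalizeFields: it walks
// maps and lists recursively, replacing '-' with '_' in every map key while
// leaving values untouched.
func normalizeFields(v interface{}) interface{} {
	switch val := v.(type) {
	case map[string]interface{}:
		out := make(map[string]interface{}, len(val))
		for k, child := range val {
			out[strings.ReplaceAll(k, "-", "_")] = normalizeFields(child)
		}
		return out
	case []interface{}:
		out := make([]interface{}, len(val))
		for i, item := range val {
			out[i] = normalizeFields(item)
		}
		return out
	default:
		return val
	}
}

func main() {
	doc := map[string]interface{}{
		"created-at": "2025-12-29",
		"resource": map[string]interface{}{
			"site-actor": []interface{}{map[string]interface{}{"native-id": 7}},
		},
	}
	fmt.Println(normalizeFields(doc))
}
```

As the reviewer notes, the Painless version additionally moves the result from ctx.json to ctx.ti_flashpoint.alert, which this sketch deliberately leaves out.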

Comment on lines +104 to +109
- UNIX
- ISO8601
- yyyy-MM-dd' 'HH:mm:ss
- yyyy-MM-dd'T'HH:mm:ss.SSSSSSZ
- yyyy-MM-dd'T'HH:mm:ssXXXXX
- yyyy-MM-dd' 'HH:mm:ssXXXXX
Contributor:

Since there are a lot of patterns, can you confirm that they are ordered from most to least likely to appear, to avoid performance bottlenecks?

Contributor (Author):

We can reorder them based on priority and likelihood of occurrence.

Comment on lines +376 to +377
field: file.hash.sha256
value: '{{{_ingest._value.phash256}}}'
Contributor:

sha256 and phash are different algorithms. Suggest not making this assignment.

multi: false
required: true
show_user: false
default: 5000
Contributor:

Is this the maximum allowed by the API?

Contributor (Author):

Yes, it is the maximum allowed by the API.

@@ -0,0 +1,118 @@
{
Contributor:

Suggest generating a sample event that populates as many ECS fields as possible. For example, this one doesn't have related or user fields.
