
Send

Send transforms send data to an external system.

Substation's send transforms differ from other transforms in two ways:

  • Data Passthrough: All data processed by a send transform passes through, without modification, to the next configured transform.
  • Data Batching: All data is batched in memory before being sent to an external system. Each batch can be further processed by applying auxiliary transforms before it is sent.
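For example, the two behaviors combine so that a batch can be post-processed as a unit. The sketch below, assuming the S3 send transform and a gzip format transform (`sub.tf.format.to.gzip`, named here illustratively), batches up to 100 events or 1 minute of data and compresses each batch before it is written; the bucket ARN is a placeholder:

```jsonnet
// Batch up to 100 events (or flush after 1 minute), gzip each batch in a
// sub-pipeline, then write the result to S3. The original events continue
// unmodified to the next transform in the main pipeline.
sub.tf.send.aws.s3({
  batch: { count: 100, duration: '1m' },
  auxiliary_transforms: [
    sub.tf.format.to.gzip(),
  ],
  aws: { arn: 'arn:aws:s3:::my-bucket' },
})
```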

send.aws.dynamodb.put

Puts JSON objects as items into an AWS DynamoDB table.

Settings

| Field | Type | Description | Required |
| --- | --- | --- | --- |
| `batch.count` | integer | Maximum number of items to batch before emitting a new array. Defaults to 1,000 items. | No |
| `batch.size` | integer | Maximum size (in bytes) of items to batch before emitting a new array. Defaults to 1MB. | No |
| `batch.duration` | string | Maximum duration to batch items before emitting a new array. Defaults to 1m. | No |
| `auxiliary_transforms` | []object | Transforms that are applied to batched data in a sub-pipeline before sending data externally. Defaults to an empty list (no additional transformation is applied). | No |
| `object.batch_key` | string | Retrieves a value from an object that is used to organize batched data. No default; all data is batched into the same array. | No |
| `aws.arn` | string | AWS resource (DynamoDB table) that is accessed. | Yes |
| `aws.assume_role_arn` | string | AWS role that is used to authenticate. Defaults to an empty string (no role assumption is used). | No |

Example

```jsonnet
sub.transform.send.aws.dynamodb({
  aws: { arn: 'arn:aws:dynamodb:us-east-2:123456789012:table/my-table' },
})

// Abbreviated form:
sub.tf.send.aws.dynamodb({ aws: { arn: 'arn:aws:dynamodb:us-east-2:123456789012:table/my-table' } })
```

send.aws.eventbridge

Puts JSON data into an AWS EventBridge bus.

Settings

| Field | Type | Description | Required |
| --- | --- | --- | --- |
| `description` | string | Description used when events are put into the EventBridge bus. Defaults to "Substation Transform". | No |
| `batch.count` | integer | Maximum number of items to batch before emitting a new array. Defaults to 1,000 items. | No |
| `batch.size` | integer | Maximum size (in bytes) of items to batch before emitting a new array. Defaults to 1MB. | No |
| `batch.duration` | string | Maximum duration to batch items before emitting a new array. Defaults to 1m. | No |
| `auxiliary_transforms` | []object | Transforms that are applied to batched data in a sub-pipeline before sending data externally. Defaults to an empty list (no additional transformation is applied). | No |
| `object.batch_key` | string | Retrieves a value from an object that is used to organize batched data. No default; all data is batched into the same array. | No |
| `aws.arn` | string | AWS resource (EventBridge bus) that is accessed. | Yes |
| `aws.assume_role_arn` | string | AWS role that is used to authenticate. Defaults to an empty string (no role assumption is used). | No |

Example

```jsonnet
sub.transform.send.aws.eventbridge()

// Abbreviated form:
sub.tf.send.aws.eventbridge()
```

send.aws.data_firehose

Puts data into an AWS Kinesis Data Firehose stream.

Settings

| Field | Type | Description | Required |
| --- | --- | --- | --- |
| `batch.duration` | string | Maximum duration to batch items before emitting a new array. Defaults to 1m. | No |
| `auxiliary_transforms` | []object | Transforms that are applied to batched data in a sub-pipeline before sending data externally. Defaults to an empty list (no additional transformation is applied). | No |
| `object.batch_key` | string | Retrieves a value from an object that is used to organize batched data. No default; all data is batched into the same array. | No |
| `aws.arn` | string | AWS resource (Data Firehose stream) that is accessed. | Yes |
| `aws.assume_role_arn` | string | AWS role that is used to authenticate. Defaults to an empty string (no role assumption is used). | No |
| `retry.count` | integer | Maximum number of times to retry putting data into the Firehose stream. Defaults to the AWS_MAX_ATTEMPTS environment variable. | No |

Example

```jsonnet
sub.transform.send.aws.data_firehose({
  aws: { arn: 'arn:aws:firehose:us-east-2:123456789012:deliverystream/my-stream' },
})

// Abbreviated form:
sub.tf.send.aws.data_firehose({ aws: { arn: 'arn:aws:firehose:us-east-2:123456789012:deliverystream/my-stream' } })
```

send.aws.kinesis_data_stream

Puts data into an AWS Kinesis Data Stream.

Settings

| Field | Type | Description | Required |
| --- | --- | --- | --- |
| `batch.duration` | string | Maximum duration to batch items before emitting a new array. Defaults to 1m. | No |
| `object.batch_key` | string | Retrieves a value from an object that is used to organize batched data. No default; all data is batched into the same array. | No |
| `auxiliary_transforms` | []object | Transforms that are applied to batched data in a sub-pipeline before sending data externally. Defaults to an empty list (no additional transformation is applied). | No |
| `aws.arn` | string | AWS resource (Kinesis Data Stream) that is accessed. | Yes |
| `aws.assume_role_arn` | string | AWS role that is used to authenticate. Defaults to an empty string (no role assumption is used). | No |
| `use_batch_key_as_partition_key` | boolean | Determines if the value retrieved using `object.batch_key` should be used as the Kinesis record's partition key. Defaults to false (partition key is a random UUID). | No |
| `enable_record_aggregation` | boolean | Determines if records should be aggregated using the Kinesis Producer Library. Defaults to false (no aggregation is used). | No |

Example

```jsonnet
sub.transform.send.aws.kinesis_data_stream({
  aws: { arn: 'arn:aws:kinesis:us-east-2:123456789012:stream/my-stream' },
})

// Abbreviated form:
sub.tf.send.aws.kinesis_data_stream({ aws: { arn: 'arn:aws:kinesis:us-east-2:123456789012:stream/my-stream' } })
```
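A sketch of combining `object.batch_key` with `use_batch_key_as_partition_key`, so that related records land on the same shard; the `tenant_id` field name is illustrative:

```jsonnet
// Group batches by a value from each event and reuse that value as the
// Kinesis partition key instead of a random UUID.
sub.tf.send.aws.kinesis_data_stream({
  aws: { arn: 'arn:aws:kinesis:us-east-2:123456789012:stream/my-stream' },
  object: { batch_key: 'tenant_id' },
  use_batch_key_as_partition_key: true,
})
```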

send.aws.lambda

Asynchronously invokes and sends data as a payload to an AWS Lambda function.

If you need to synchronously invoke a Lambda function, then use the enrich AWS Lambda transform.

Settings

| Field | Type | Description | Required |
| --- | --- | --- | --- |
| `batch.duration` | string | Maximum duration to batch items before emitting a new array. Defaults to 1m. | No |
| `object.batch_key` | string | Retrieves a value from an object that is used to organize batched data. No default; all data is batched into the same array. | No |
| `auxiliary_transforms` | []object | Transforms that are applied to batched data in a sub-pipeline before sending data externally. Defaults to an empty list (no additional transformation is applied). | No |
| `aws.arn` | string | AWS resource (Lambda function) that is accessed. | Yes |
| `aws.assume_role_arn` | string | AWS role that is used to authenticate. Defaults to an empty string (no role assumption is used). | No |

Example

```jsonnet
sub.transform.send.aws.lambda({
  aws: { arn: 'arn:aws:lambda:us-east-2:123456789012:function/my-func' },
})

// Abbreviated form:
sub.tf.send.aws.lambda({ aws: { arn: 'arn:aws:lambda:us-east-2:123456789012:function/my-func' } })
```

send.aws.s3

Writes data as an object to an AWS S3 bucket.

Settings

| Field | Type | Description | Required |
| --- | --- | --- | --- |
| `batch.count` | integer | Maximum number of items to batch before emitting a new array. Defaults to 1,000 items. | No |
| `batch.size` | integer | Maximum size (in bytes) of items to batch before emitting a new array. Defaults to 1MB. | No |
| `batch.duration` | string | Maximum duration to batch items before emitting a new array. Defaults to 1m. | No |
| `object.batch_key` | string | Retrieves a value from an object that is used to organize batched data. No default; all data is batched into the same array. | No |
| `auxiliary_transforms` | []object | Transforms that are applied to batched data in a sub-pipeline before sending data externally. Defaults to an empty list (no additional transformation is applied). | No |
| `storage_class` | string | Storage class (e.g., STANDARD, GLACIER_IR) used by the object in the S3 bucket. Defaults to STANDARD. | No |
| `aws.arn` | string | AWS resource (S3 bucket) that is accessed. | Yes |
| `aws.assume_role_arn` | string | AWS role that is used to authenticate. Defaults to an empty string (no role assumption is used). | No |
| `file_path` | object | Determines how the name of the object is constructed. Defaults to `year/month/day/uuid.extension`. | No |
| `use_batch_key_as_prefix` | boolean | Determines if the value retrieved using `object.batch_key` should replace the prefix value in `file_path`. Defaults to false. | No |

Example

```jsonnet
sub.transform.send.aws.s3({
  aws: { arn: 'arn:aws:s3:::my-bucket' },
  file_path: { prefix: 'prefix' },
})

// Abbreviated form:
sub.tf.send.aws.s3({
  aws: { arn: 'arn:aws:s3:::my-bucket' },
  file_path: { prefix: 'prefix' },
})
```
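A sketch of partitioning objects in the bucket by a value taken from each event, using `object.batch_key` together with `use_batch_key_as_prefix`; the `region` field name is illustrative:

```jsonnet
// Batches are grouped by the event's `region` value, and that value replaces
// the static prefix in file_path, producing keys like `us-east-2/2006/01/02/uuid`.
sub.tf.send.aws.s3({
  aws: { arn: 'arn:aws:s3:::my-bucket' },
  object: { batch_key: 'region' },
  use_batch_key_as_prefix: true,
})
```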

send.aws.sns

Sends data to an AWS SNS topic.

Settings

| Field | Type | Description | Required |
| --- | --- | --- | --- |
| `batch.duration` | string | Maximum duration to batch items before emitting a new array. Defaults to 1m. | No |
| `auxiliary_transforms` | []object | Transforms that are applied to batched data in a sub-pipeline before sending data externally. Defaults to an empty list (no additional transformation is applied). | No |
| `object.batch_key` | string | Retrieves a value from an object that is used to organize batched data. No default; all data is batched into the same array. | No |
| `aws.arn` | string | AWS resource (SNS topic) that is accessed. | Yes |
| `aws.assume_role_arn` | string | AWS role that is used to authenticate. Defaults to an empty string (no role assumption is used). | No |

Example

```jsonnet
sub.transform.send.aws.sns({
  aws: { arn: 'arn:aws:sns:us-east-1:123456789012:substation' },
})

// Abbreviated form:
sub.tf.send.aws.sns({ aws: { arn: 'arn:aws:sns:us-east-1:123456789012:substation' } })
```

send.aws.sqs

Sends data to an AWS SQS queue.

Settings

| Field | Type | Description | Required |
| --- | --- | --- | --- |
| `batch.duration` | string | Maximum duration to batch items before emitting a new array. Defaults to 1m. | No |
| `auxiliary_transforms` | []object | Transforms that are applied to batched data in a sub-pipeline before sending data externally. Defaults to an empty list (no additional transformation is applied). | No |
| `object.batch_key` | string | Retrieves a value from an object that is used to organize batched data. No default; all data is batched into the same array. | No |
| `aws.arn` | string | AWS resource (SQS queue) that is accessed. | Yes |
| `aws.assume_role_arn` | string | AWS role that is used to authenticate. Defaults to an empty string (no role assumption is used). | No |
| `retry.count` | integer | Maximum number of times to retry sending data to the SQS queue. Defaults to the AWS_MAX_ATTEMPTS environment variable. | No |

Example

```jsonnet
sub.transform.send.aws.sqs({
  aws: { arn: 'arn:aws:sqs:us-east-1:123456789012:substation' },
})

// Abbreviated form:
sub.tf.send.aws.sqs({ aws: { arn: 'arn:aws:sqs:us-east-1:123456789012:substation' } })
```

send.file

Writes data to a file.

Settings

| Field | Type | Description | Required |
| --- | --- | --- | --- |
| `batch.count` | integer | Maximum number of items to batch before emitting a new array. Defaults to 1,000 items. | No |
| `batch.size` | integer | Maximum size (in bytes) of items to batch before emitting a new array. Defaults to 1MB. | No |
| `batch.duration` | string | Maximum duration to batch items before emitting a new array. Defaults to 1m. | No |
| `auxiliary_transforms` | []object | Transforms that are applied to batched data in a sub-pipeline before sending data externally. Defaults to an empty list (no additional transformation is applied). | No |
| `object.batch_key` | string | Retrieves a value from an object that is used to organize batched data. No default; all data is batched into the same array. | No |
| `file_path` | object | Determines how the name of the object is constructed. Defaults to `year/month/day/uuid.extension`. | No |
| `use_batch_key_as_prefix` | boolean | Determines if the value retrieved using `object.batch_key` should replace the prefix value in `file_path`. Defaults to false. | No |

Example

```jsonnet
sub.transform.send.file()

// Abbreviated form:
sub.tf.send.file()
```

send.http.post

POSTs data to an HTTP(S) URL.

Settings

| Field | Type | Description | Required |
| --- | --- | --- | --- |
| `url` | string | The HTTP(S) URL used in the POST request. URLs support loading secrets. | Yes |
| `batch.count` | integer | Maximum number of items to batch before emitting a new array. Defaults to 1,000 items. | No |
| `batch.size` | integer | Maximum size (in bytes) of items to batch before emitting a new array. Defaults to 1MB. | No |
| `batch.duration` | string | Maximum duration to batch items before emitting a new array. Defaults to 1m. | No |
| `auxiliary_transforms` | []object | Transforms that are applied to batched data in a sub-pipeline before sending data externally. Defaults to an empty list (no additional transformation is applied). | No |
| `object.batch_key` | string | Retrieves a value from an object that is used to organize batched data. No default; all data is batched into the same array. | No |
| `headers` | []object | An array of objects that contain HTTP headers sent in the request. Header values support loading secrets. Defaults to an empty list (no headers are used). | No |

Example

```jsonnet
sub.transform.send.http.post({ url: 'api.foo.com' })

// Abbreviated form:
sub.tf.send.http.post({ url: 'api.foo.com' })
```
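A sketch of sending an authenticated request using the `headers` setting. The key/value object shape follows the settings table above; the URL, header value, and secret interpolation syntax are illustrative assumptions, not confirmed API:

```jsonnet
// POST batched data with an Authorization header. The header value is shown
// as a secret reference; replace it with your deployment's secret syntax.
sub.tf.send.http.post({
  url: 'https://api.foo.com/v1/events',
  headers: [
    { key: 'Authorization', value: '${SECRET:API_TOKEN}' },  // assumed syntax
  ],
})
```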

send.stdout

Sends data to stdout.

Settings

| Field | Type | Description | Required |
| --- | --- | --- | --- |
| `batch.count` | integer | Maximum number of items to batch before emitting a new array. Defaults to 1,000 items. | No |
| `batch.size` | integer | Maximum size (in bytes) of items to batch before emitting a new array. Defaults to 1MB. | No |
| `batch.duration` | string | Maximum duration to batch items before emitting a new array. Defaults to 1m. | No |
| `auxiliary_transforms` | []object | Transforms that are applied to batched data in a sub-pipeline before sending data externally. Defaults to an empty list (no additional transformation is applied). | No |
| `object.batch_key` | string | Retrieves a value from an object that is used to organize batched data. No default; all data is batched into the same array. | No |

Example

```jsonnet
sub.transform.send.stdout()

// Abbreviated form:
sub.tf.send.stdout()
```

File-Based Send Transforms

Send transforms that deliver file-like objects have specific settings that determine the path, format, and compression for each file.

file_path Settings

Determines how the name of the file is constructed.

| Field | Type | Description | Required |
| --- | --- | --- | --- |
| `prefix` | string | String value that is prepended to the file path. | No |
| `time_format` | string | Inserts a formatted datetime string into the file path. Must be one of: pattern-based layouts; `unix` (epoch, supports fractions of a second); `unix_milli` (epoch milliseconds). | No |
| `uuid` | boolean | Inserts a random UUID into the file path. In most configurations, this becomes the file name. | No |
| `suffix` | string | String value that is appended to the file name. | No |

Use Cases

Random, Date-Based Files

```jsonnet
{
  // creates the file pattern `year/month/day/uuid.extension`
  file_path: {
    time_format: '2006/01/02',
    uuid: true,
  },
}
```
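Prefixed, Suffixed Files

All four `file_path` settings can be combined. The sketch below is an assumed configuration (the prefix and suffix values are illustrative) that would produce object names like `logs/2024/01/02/uuid.json.gz`:

```jsonnet
{
  // creates the file pattern `logs/year/month/day/uuid.json.gz`
  file_path: {
    prefix: 'logs',
    time_format: '2006/01/02',
    uuid: true,
    suffix: '.json.gz',
  },
}
```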