Data Streaming

Category: Transform

Process large datasets with memory-efficient streaming

What this node does

  • Memory-efficient: processes data without loading the full dataset into memory
  • Chunk processing: handles data in fixed-size batches of records
  • Large dataset support
  • Async streaming: chunks are processed asynchronously
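The bullet points above describe the chunked-streaming pattern. As a rough conceptual sketch (all names here are illustrative, not the node's actual API), chunking an input means yielding fixed-size batches so the whole dataset never sits in memory at once:

```python
def stream_chunks(records, chunk_size=1000):
    """Yield successive lists of at most chunk_size records."""
    chunk = []
    for record in records:
        chunk.append(record)
        if len(chunk) == chunk_size:
            yield chunk      # hand one batch downstream
            chunk = []
    if chunk:                # flush the final partial chunk
        yield chunk

# Example: process a large iterable in chunks of 3 records.
processed = [sum(chunk) for chunk in stream_chunks(range(10), chunk_size=3)]
print(processed)  # [3, 12, 21, 9]
```

Because `stream_chunks` is a generator, each chunk can be processed and discarded before the next one is read.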

How to use

  1. In the Agentic Studio, open or create a workflow
  2. In the node palette on the left, find Data Streaming under the Transform category (or use the search bar)
  3. Drag the node onto the canvas
  4. Double-click the node to open its configuration dialog
  5. Fill in the required parameters (see Configuration below)
  6. Connect the Input Data (file/stream) input port from an upstream node
  7. Optionally connect the Chunk Size port if needed
  8. Connect the Processed Data Stream output to the next node downstream

Inputs

Port                       Type    Required  Description
Input Data (file/stream)   any     Required  Any data
Chunk Size                 number  Optional  Numeric value

Outputs

Port                   Type  Description
Processed Data Stream  any   Any data

Configuration

Open the configuration dialog by double-clicking the Data Streaming node on the canvas.

Parameter           What to enter
chunkSize           Number of records to process per chunk (see the Chunk Size input port)
processingFunction  The function applied to each chunk of the stream
concurrency         Maximum number of chunks processed in parallel
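A configuration using these parameters might look like the fragment below. The field names come from the table above; the values (and the function name `normalizeRecords`) are purely illustrative:

```json
{
  "chunkSize": 1000,
  "processingFunction": "normalizeRecords",
  "concurrency": 4
}
```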

When to use this node

  • Process large FHIR bundles
  • Stream log files
  • Handle big data
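For intuition on how a concurrency setting bounds work in flight while streaming, here is a minimal asyncio sketch of the pattern the chunkSize and concurrency parameters suggest. All names are hypothetical; the node's internals are not documented here:

```python
import asyncio

async def process_chunk(chunk):
    await asyncio.sleep(0)           # stand-in for real async I/O
    return [x * 2 for x in chunk]    # stand-in transform

async def stream_process(chunks, concurrency=4):
    sem = asyncio.Semaphore(concurrency)   # cap the number of in-flight chunks

    async def bounded(chunk):
        async with sem:
            return await process_chunk(chunk)

    # gather preserves input order even though chunks run concurrently
    return await asyncio.gather(*(bounded(c) for c in chunks))

chunks = [[1, 2], [3, 4], [5, 6]]
result = asyncio.run(stream_process(chunks, concurrency=2))
print(result)  # [[2, 4], [6, 8], [10, 12]]
```

The semaphore keeps at most `concurrency` chunks in flight at once, which bounds memory use even when the upstream source is much faster than the transform.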

Need help configuring this node?

Go to Settings → Connectors to set up the connection this node depends on, then reference the connector ID in the node configuration dialog.