Getting started

Category: Data flows

Version: 1.0

Last updated: March 17, 2026

Author: Any2Info


Description

Data flows are used to configure and control data traffic within the Any2Info platform.

A data flow is a visual sequence of connected nodes that defines how data is processed. Flows are built with a drag-and-drop interface in the Data hub.

Data can be:

  • Retrieved from external sources (ERP, databases, APIs, files)

  • Processed or enriched using logic

  • Sent to internal or external destinations

Each data flow consists of different types of nodes, including:

  • Triggers

  • Import connectors

  • Export connectors

  • Functions

  • Conditions

Every data flow starts with a trigger, which defines when the flow is executed.


How Data Flows Work

Data flows are built from left to right using nodes from the toolbox.

  • The toolbox (left panel) contains all available nodes

  • The canvas (center) is where you build your flow

  • The properties panel (right) is used to configure each node

Each node performs a specific action and passes its output to the next step in the flow.

This allows you to create flexible data pipelines without writing code.
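Although flows are built visually rather than in code, the left-to-right pipeline above can be sketched conceptually. In this illustration (all names are hypothetical, not an Any2Info API), each node is a step that receives the previous node's output and passes its own output onward:

```python
# Conceptual sketch only: Any2Info flows are built visually, not in code.
# Each "node" below is a function that transforms the data it receives,
# mirroring the left-to-right layout on the canvas.

def run_flow(nodes, payload=None):
    """Run each node in order, passing one node's output to the next."""
    for node in nodes:
        payload = node(payload)
    return payload

trigger   = lambda _: [{"id": 1, "qty": 5}, {"id": 2, "qty": 0}]  # produces data
condition = lambda rows: [r for r in rows if r["qty"] > 0]        # filters rows
export    = lambda rows: rows                                     # would send data onward

result = run_flow([trigger, condition, export])
print(result)  # [{'id': 1, 'qty': 5}]
```

The key idea is the hand-off: every node only sees the output of the node before it, which is what keeps flows readable step by step.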


Getting Started

To create your first data flow:

1. Start with a Trigger

Every data flow begins with a trigger.

You can choose:

  • Scheduled triggers (Daily, Weekly, Monthly, specific time)

  • Event-based triggers

The trigger determines when the flow runs.
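The scheduling idea behind a daily trigger can be illustrated with a small sketch. The function below is hypothetical (in the platform you configure this in the properties panel, not in code); it only shows the "fire once per day at a specific time" logic:

```python
# Hypothetical sketch of how a daily scheduled trigger decides to fire.
# Not an Any2Info API -- the platform configures this visually.

from datetime import datetime

def should_run_daily(now: datetime, hour: int, minute: int) -> bool:
    """Fire when the current time matches the configured time of day."""
    return now.hour == hour and now.minute == minute

print(should_run_daily(datetime(2026, 3, 17, 6, 0), hour=6, minute=0))  # True
```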


2. Add an Import node (Data source)

After the trigger, define where your data comes from.

You can import data from:

  • ERP systems

  • Databases

  • APIs

  • Files

  • The Any2Info platform (forms, dataclips)

Configure:

  • Entity or dataset

  • Fields (columns)

  • Filters
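The three settings above (entity, fields, filters) can be pictured as a small configuration object. This sketch uses invented names ("SalesOrders", the filter shape, the field names) purely to illustrate what the import node's configuration controls:

```python
# Hypothetical sketch of an import-node configuration: which entity to read,
# which fields (columns) to keep, and which filters to apply.
# Illustrative only -- not an actual Any2Info API.

import_config = {
    "source": "ERP",
    "entity": "SalesOrders",
    "fields": ["order_id", "customer", "total"],
    "filters": [{"field": "status", "op": "=", "value": "open"}],
}

def apply_import(rows, config):
    """Apply the configured filters, then keep only the selected fields."""
    for f in config["filters"]:
        if f["op"] == "=":
            rows = [r for r in rows if r.get(f["field"]) == f["value"]]
    return [{k: r[k] for k in config["fields"]} for r in rows]

rows = [
    {"order_id": 1, "customer": "Acme", "total": 100, "status": "open"},
    {"order_id": 2, "customer": "Beta", "total": 250, "status": "closed"},
]
print(apply_import(rows, import_config))
# [{'order_id': 1, 'customer': 'Acme', 'total': 100}]
```

Filtering at the import stage, as the sketch shows, reduces the data volume before any later node runs.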


3. Process the data (optional)

You can add logic between import and export:

  • Functions → calculations, transformations, queries

  • Conditions → control flow (if/else logic)

This step is used to enrich or modify data before sending it forward.
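The difference between a function (transform the data) and a condition (route the data) can be sketched as follows. Field names and the VAT rate are invented for illustration:

```python
# Hypothetical sketch of the processing step. A function node enriches each
# record; a condition node applies if/else logic to route it.
# Illustrative only -- not an actual Any2Info API.

def enrich(row):
    # Function node: add a calculated field (21% VAT assumed for the example).
    row["total_incl_vat"] = round(row["total"] * 1.21, 2)
    return row

def route(row):
    # Condition node: if/else logic decides which branch the record follows.
    return "export_large" if row["total"] > 200 else "export_small"

row = enrich({"order_id": 1, "total": 250.0})
print(route(row))  # export_large
```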


4. Add an Export node (Destination)

Next, define where the data should go.

Examples:

  • ERP systems

  • Databases

  • APIs

  • Files

  • Dataclips or forms within the platform

Configure field mapping between input and output data.
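Field mapping means pairing each input field with the field name the destination expects. A minimal sketch, with invented field names on both sides:

```python
# Hypothetical sketch of export field mapping: rename input fields to the
# field names the destination system expects. Illustrative only.

mapping = {"order_id": "OrderNo", "customer": "CustomerName"}

def map_fields(row, mapping):
    """Produce an output record using the destination's field names."""
    return {dest: row[src] for src, dest in mapping.items()}

print(map_fields({"order_id": 1, "customer": "Acme"}, mapping))
# {'OrderNo': 1, 'CustomerName': 'Acme'}
```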


5. Test the data flow

Before activating:

  • Run the flow manually

  • Use small datasets (e.g. Take limits)

  • Validate output and mappings
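The idea behind a Take limit is simply to cap how many records pass through while you test, so mistakes in mappings or logic surface on a handful of rows instead of the full dataset. A one-line sketch:

```python
# Hypothetical sketch of a "Take" limit: only the first N records pass
# through during a test run. Illustrative only.

def take(rows, limit):
    return rows[:limit]

sample = take(list(range(1000)), limit=10)
print(len(sample))  # 10
```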


6. Activate the data flow

Once validated:

  • Enable the flow

  • Let it run based on the configured trigger


Common Use Cases

Data flows are typically used to:

  • Import data from external systems into the platform

  • Export user input (forms) to external systems

  • Synchronize data between systems

  • Create dataclips for dashboards and applications

  • Enrich or transform data before usage


Tips & Best Practices

  • Always start with a trigger

  • Use filters to limit data volume

  • Build flows step-by-step and validate each node

  • Keep flows readable and logically structured

  • Use functions and conditions only when necessary

  • Avoid large unfiltered datasets in production


Changelog

Version | Date           | Change
1.0     | March 17, 2026 | Initial documentation version added.
