How to track global state with workflowStaticData in n8n (Crunchbase)

— Juliet Edjere

You’re building a workflow that pulls product feature data from Crunchbase’s API and syncs it into Baserow. The challenge is managing state across batches — especially tracking what needs to be created vs. updated. workflowStaticData allows you to share, store, and mutate that global state across SplitInBatches loops.

This tutorial covers:

  • Why workflowStaticData matters
  • How to design the loop right
  • How to conditionally decide create/update
  • Using global lists for batching
  • How to reset + update the state

Why Use workflowStaticData in n8n?

In batch processing or data sync workflows, n8n doesn't carry over variables across nodes like traditional programming environments. This makes global state management tricky, especially when dealing with paginated APIs or multi-step decision trees.

This is where workflowStaticData shines:

  • It's persistent across nodes in a single execution.
  • It's perfect for tracking lookup maps, aggregation arrays, counters, and contextual flags.
  • Unlike the per-item data each node outputs ($json), it isn’t replaced as data flows from node to node; the basic access pattern is sketched below.
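
A minimal sketch of that access pattern, assuming a Code node set to Run Once for All Items (the processedCount key is purely illustrative):

// Any Code node in the workflow sees and mutates the same object during a run.
const staticData = $getWorkflowStaticData('global');

// Example: a counter that keeps accumulating across nodes and loop iterations.
staticData.processedCount = (staticData.processedCount || 0) + items.length;

return items;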

Let’s sync Crunchbase data to Baserow and handle new records and updates in batches.

Step 1 — Fetch Existing Tools from Baserow

Use a Baserow List Rows node to retrieve all existing tools.

Purpose: Build a lookup map to detect duplicates and prepare update logic.
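
The Code node in Step 2 assumes each returned row item carries at least the fields below; the names match this tutorial's table schema, and the values are illustrative:

{
  "id": 42,
  "crunchbase_permalink": "example-tool",
  "status": "New - Needs LLM",
  "last_llm_run_at": "2024-01-15"
}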

Step 2 — Build Lookup Map and Initialize State

Use a Code node to transform Baserow rows into a map, keyed by crunchbase_permalink, and initialize arrays for batch operations.

// Step 2: Build Lookup Map and Initialize Lists
const toolMap = {};
for (const item of items) {
    const data = item.json;
    const permalink = data.crunchbase_permalink;
    const rowId = data.id;
    const status = data.status;
    const lastLlmRunAt = data.last_llm_run_at;

    if (permalink) {
        toolMap[permalink] = {
            row_id: rowId,
            status,
            last_llm_run_at: lastLlmRunAt
        };
    } else {
        console.warn(`Skipping row ${rowId} due to missing permalink.`);
    }
}

const workflowStaticData = $getWorkflowStaticData('global');
workflowStaticData.baserowToolMap = toolMap;
workflowStaticData.batch_create_list = [];
workflowStaticData.batch_update_list = [];

return items;

Step 3 — Fetch Crunchbase Tool List

Use an HTTP Request node to hit Crunchbase’s API, returning tool data:

  • Fields: name, permalink, etc.
  • Keep the permalink on each item as permalink; it is written to the Baserow field crunchbase_permalink when rows are created (see the normalization sketch below).
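
The exact response shape depends on the Crunchbase endpoint and plan you use, so treat the following as a sketch of the normalization step; the entities/properties/identifier paths are assumptions to adjust against the real payload:

// Flatten the raw HTTP Request output into one item per tool, keeping only the
// fields the rest of the workflow needs. Field paths are illustrative.
const response = items[0].json;
const entities = response.entities || [];

return entities.map((entity) => ({
  json: {
    name: entity.properties?.identifier?.value,
    permalink: entity.properties?.identifier?.permalink,
  },
}));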

Step 4 — Split Into Manageable Batches

Use the SplitInBatches node (labelled Loop Over Items in recent n8n versions) to process tools in groups of 50–100. Wire the last node inside the loop back to the SplitInBatches input so every batch passes through Step 5, and attach the later batch-write steps to the node's done output so they run only after all batches have been processed.

Step 5 — Process Batches & Track State

Use another Code node to:

  • Check if crunchbase_permalink exists in the lookup map.
  • If new: add to batch_create_list.
  • If existing: add to batch_update_list with LLM re-run logic.

// Step 5: Compare each tool against the lookup map and accumulate create/update payloads.
const workflowStaticData = $getWorkflowStaticData('global');
const toolMap = workflowStaticData.baserowToolMap;
let createList = workflowStaticData.batch_create_list;
let updateList = workflowStaticData.batch_update_list;

// Anything older than this cut-off counts as stale and needs an LLM re-run.
const oneMonthAgo = new Date();
oneMonthAgo.setMonth(oneMonthAgo.getMonth() - 1);
const today = new Date().toISOString().split('T')[0];

for (const item of items) {
    const cbTool = item.json;
    const existing = toolMap[cbTool.permalink];

    if (!existing) {
        createList.push({
            name: cbTool.name,
            crunchbase_permalink: cbTool.permalink,
            created_at: today,
            last_synced_cb_at: today,
            status: 'New - Needs LLM'
        });
    } else {
        const { row_id, status, last_llm_run_at } = existing;
        const updatePayload = {
            id: row_id,
            last_synced_cb_at: today
        };

        const statusNeedsLLM = ['New - Needs LLM', 'Needs LLM Re-run', 'LLM Failed'].includes(status);
        const llmNeverRan = !last_llm_run_at;
        let llmRunIsOld = false;

        if (last_llm_run_at) {
            // new Date() never throws; an unparseable value yields an Invalid Date,
            // so check for NaN and treat bad timestamps as stale.
            const lastRun = new Date(last_llm_run_at);
            llmRunIsOld = isNaN(lastRun.getTime()) || lastRun < oneMonthAgo;
        }

        if (!statusNeedsLLM && (llmNeverRan || llmRunIsOld)) {
            updatePayload.status = 'Needs LLM Re-run';
        }

        updateList.push(updatePayload);
    }
}

workflowStaticData.batch_create_list = createList;
workflowStaticData.batch_update_list = updateList;

return items;
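
Steps 6 and 7 reference $json.batch_create_list and $json.batch_update_list, but the accumulated lists live in workflowStaticData, not on any item. One way to bridge that gap, assuming the batch-write nodes hang off the loop's done output, is a small Code node that copies the lists onto a single item; its placement on the canvas is an assumption about your layout:

// Runs once, after SplitInBatches reports that every batch has been processed.
const staticData = $getWorkflowStaticData('global');

return [
  {
    json: {
      batch_create_list: staticData.batch_create_list || [],
      batch_update_list: staticData.batch_update_list || [],
    },
  },
];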

Step 6 — Baserow Batch Create

Use a Batch Create Rows call against Baserow's batch rows endpoint, with the JSON body below (a rendered example follows the settings):

{
  "items": {{ JSON.stringify($json.batch_create_list) }}
}

Set:

  • Batch Size: 100
  • Batch Interval: 500ms
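
With the create list built in Step 5, the rendered body looks roughly like this (values are illustrative):

{
  "items": [
    {
      "name": "Example Tool",
      "crunchbase_permalink": "example-tool",
      "created_at": "2024-06-01",
      "last_synced_cb_at": "2024-06-01",
      "status": "New - Needs LLM"
    }
  ]
}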

Step 7 — Baserow Batch Update

Use a Batch Update Rows call against Baserow's batch rows endpoint; each item carries the Baserow row id taken from the lookup map:

{
  "items": {{ JSON.stringify($json.batch_update_list) }}
}

Both batch calls read lists that were accumulated in workflowStaticData during the loop and copied onto $json by the bridging Code node shown after Step 5.

Best Practices for workflowStaticData

  1. Keep keys flat — Avoid nesting too deeply.
  2. Pre-validate large lists — Only store what's needed for fast lookup.
  3. Avoid async logic inside Code nodes that modify static data.
  4. Treat it as per-run scratch space — n8n does not save static data during manual test executions, so don't rely on it to carry state from one run to the next in this pattern.
  5. Clear or re-initialize on each run — because static data can survive between runs of an active workflow, rebuild the map and empty the lists at the start of every execution, as Step 2 does (see the sketch below).
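
A minimal reset sketch for practice 5, assuming the same keys this workflow uses; run it (or fold it into Step 2) before anything reads the lists:

// Re-initialize per run so leftovers from a previous execution never leak in.
const staticData = $getWorkflowStaticData('global');

staticData.baserowToolMap = {};
staticData.batch_create_list = [];
staticData.batch_update_list = [];

return items;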

What You’ve Built

You’ve set up a full-scale Crunchbase-to-Baserow integration pipeline that:

  • Identifies new vs. existing records using a static global map.
  • Batches large updates and inserts safely.
  • Makes decisions on data freshness using time thresholds.
  • Uses workflowStaticData properly to manage temporary global memory for one-run consistency.

ABOUT ME

I'm Juliet Edjere, a no-code professional focused on automation, product development, and building scalable solutions with no coding knowledge.

I document all things MVP validation and how designs, data, and market trends connect.

Click. Build. Launch.

Visit my website → built with Carrd and designed in Figma
