
Convert To File

The Convert To File node is the export stage of your workflow. It serializes large JSON arrays into flat-file or spreadsheet formats while keeping memory use low: a streaming architecture writes data to disk, or into an optimized Base64 payload, chunk by chunk instead of building the entire file in RAM.

Data Processing / Action

What can you do with Convert To File?

11 Native Formats Supported

Serialize workflow data into <strong>CSV, TSV, HTML, ICS, JSON, JSONL, ODS, RTF, Text, XLS</strong>, and <strong>XLSX</strong>, ready for external databases or BI reporting tools.

Memory-Efficient Streaming

Process large arrays without risking Out-Of-Memory crashes. The node streams data chunks directly to disk via <strong>Buffered Writers</strong> rather than holding the full serialized output in RAM.
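To illustrate the streaming idea (a Python sketch, not the node's actual implementation): serializing items one at a time through a buffered writer keeps peak memory at roughly one item, instead of the whole serialized document.

```python
import io
import json

def stream_json_array(items, fh):
    """Serialize items as a JSON array one element at a time,
    so peak memory stays at one item rather than the full document."""
    fh.write("[")
    for i, item in enumerate(items):
        if i:
            fh.write(",")
        fh.write(json.dumps(item))
    fh.write("]")

# In practice fh would be a buffered file handle; StringIO shown for brevity.
buf = io.StringIO()
stream_json_array(({"n": n} for n in range(3)), buf)
print(buf.getvalue())  # [{"n": 0},{"n": 1},{"n": 2}]
```

Because the generator is consumed lazily, the source array never has to exist in memory as a single serialized string.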

Dual-Destination Output

Save your exported data directly to a <strong>File Path</strong> on the server, or inject the generated file into the workflow pipeline as a <strong>Base64</strong> string for immediate downstream use.

Detailed Usage & Configuration

The Convert To File node is the export gateway of a workflow: once your data has been cleaned, merged, and processed upstream, this node writes it out as a permanent file.

1. Choosing Your Destination

  • File Path: The most efficient method. Provide a full file path (e.g., /var/backups/data.csv). The node opens a stream and writes your data chunk by chunk to disk. By default it overwrites any existing file; enable the Append property to accumulate ongoing daily logs instead.
  • Base64 Payload: Need to email an invoice or send an attachment to an API? Select this destination. The node skips the disk entirely and streams the generated file into a Base64 String, making it portable for immediate downstream use via an HTTP Request or Gmail node.
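The two destinations can be sketched as follows (a Python sketch with hypothetical helper names; the node's internals are not published). The same serialization loop either targets a file opened in write or append mode, or an in-memory buffer that is then Base64-encoded:

```python
import base64
import io
import json

def serialize_jsonl(items, fh):
    # Shared serialization loop: one JSON object per line.
    for item in items:
        fh.write(json.dumps(item).encode("utf-8") + b"\n")

def export_to_path(items, path, append=False):
    # File Path destination: "ab" preserves an existing log, "wb" overwrites it.
    with open(path, "ab" if append else "wb") as fh:
        serialize_jsonl(items, fh)

def export_to_base64(items):
    # Base64 destination: build the file in memory, then encode it.
    buf = io.BytesIO()
    serialize_jsonl(items, buf)
    return base64.b64encode(buf.getvalue()).decode("ascii")

payload = export_to_base64([{"id": 1}])
assert base64.b64decode(payload) == b'{"id": 1}\n'
```

Note that Base64 output is about a third larger than the raw bytes, so the File Path destination is usually preferable for very large exports.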

2. Serializing Specific Formats

The execution engine adapts intelligently based on the chosen output format:

  • XLSX, ODS & HTML: Uses dedicated structural engines to generate native Microsoft Excel and OpenDocument spreadsheet files, without heavy legacy CGO dependencies.
  • CSV & TSV: Flattens the first object in your JSON array to construct the Column Headers; all subsequent objects are mapped against those headers. When appending to an existing CSV, the node skips writing a duplicate header row.
  • JSONL (NDJSON): Ideal for large logging operations or feeding Elasticsearch. Each item is serialized as an independent JSON object on its own line, which makes line-by-line parsing very efficient.
  • ICS (Calendar): Compiles event properties (such as SUMMARY and DTSTART) into the standard iCalendar format, ready for import into Google Calendar or Microsoft Outlook.
  • RTF & Text: Fallback serializers that convert objects or plain strings into line-delimited text or Rich Text Format documents, useful for generating raw output logs.
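The CSV header-flattening rule above can be approximated in a few lines of Python (an illustrative sketch, not the node's code: extra keys on later objects are ignored, and missing keys become empty cells):

```python
import csv
import io

def rows_to_csv(rows):
    # Column headers come from the keys of the first object.
    out = io.StringIO()
    writer = csv.DictWriter(
        out,
        fieldnames=list(rows[0].keys()),
        extrasaction="ignore",  # drop keys the header row doesn't know about
    )
    writer.writeheader()
    writer.writerows(rows)  # missing keys are written as empty cells
    return out.getvalue()

print(rows_to_csv([{"name": "Ada", "year": 1815},
                   {"name": "Alan", "city": "London"}]))
```

Here the second row keeps only `name`, since `city` is not in the header derived from the first object.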
💡 Workflow Tip: When building automated daily reports, use the Date & Time node beforehand to generate a dynamic date string (like {{ $json.today_date }}). Then set your File Path dynamically to /backups/report_{{ $json.today_date }}.csv to produce organized, rotating daily logs.
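If you build the path in code rather than in an expression, the same rotating-filename pattern looks like this (Python sketch; the /backups prefix is just the example path from the tip):

```python
from datetime import date

def daily_report_path(day=None, prefix="/backups/report"):
    # Mirrors /backups/report_{{ $json.today_date }}.csv with an ISO date stamp.
    day = day or date.today()
    return f"{prefix}_{day.isoformat()}.csv"

print(daily_report_path(date(2024, 5, 1)))  # /backups/report_2024-05-01.csv
```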