Process incoming files and notify via email with GitHub storage — n8n Workflow

High complexity · Scheduled · 17 nodes · 🏷️ File Management · by vinci-king-01

Overview

File Processing Pipeline with Email and GitHub

This workflow automatically ingests newly-uploaded files, validates and transforms their contents, stores the processed files in a GitHub repository, and sends email notifications upon completion. It is ideal for teams that regularly receive data drops and need an auditable, automated pipeline to clean, version, and distribute those files.

Prerequisites

  - An n8n instance (self-hosted or n8n Cloud)
  - GitHub and SMTP credentials

Nodes used

Send Email · GitHub · HTTP Request · Code

Workflow preview

How it works
This workflow runs on a fixed schedule and checks an external source for newly uploaded files.

Trigger & Retrieval
This section contains the Schedule Trigger that launches the run and the HTTP Request node that fetches the current file list.

Processing & Validation
Nodes in this cluster handle per-file work. Files arrive via download, CSV files are parsed, and each record is validated.

Storage & Notification
After a record set passes validation, it is wrapped in a commit and pushed to the GitHub repository, and a success or error email is sent.

Node flow:

Daily File Check → Fetch File List → Check File Count → Any New Files? → Prepare File Items → Iterate Files → Download File → Is CSV? → Parse CSV → Validate Data → Validation Passed?

  - On success: Prepare GitHub Commit → Create/Update File → Success Email Content → Send Success Email
  - On failure: Error Email Content → Send Error Email

17 nodes · 18 edges
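The "Validate Data" step is implemented as a Code node, whose contents are not shown on this page. As a hedged sketch of what such a node might do, the snippet below validates parsed CSV rows; the field names (id, email, amount) and rules are assumptions for illustration, not taken from the actual workflow.

```javascript
// Hypothetical per-row validation, in the style of an n8n Code node.
// Field names and rules are illustrative assumptions.
function validateRow(row) {
  const errors = [];
  if (!row.id || String(row.id).trim() === "") errors.push("missing id");
  if (row.email && !/^[^@\s]+@[^@\s]+\.[^@\s]+$/.test(row.email)) errors.push("invalid email");
  if (row.amount !== undefined && isNaN(Number(row.amount))) errors.push("amount is not numeric");
  return errors;
}

function validateRows(rows) {
  // Collect line numbers and error lists for every failing row.
  const failures = rows
    .map((row, i) => ({ line: i + 1, errors: validateRow(row) }))
    .filter((r) => r.errors.length > 0);
  return { passed: failures.length === 0, failures };
}

// Inside n8n this would read rows from $input.all() and return
// [{ json: result }]; here we call it directly on sample data.
const result = validateRows([
  { id: "1", email: "a@example.com", amount: "10.5" },
  { id: "", email: "bad-email", amount: "x" },
]);
```

The "Validation Passed?" If node would then branch on a flag like `result.passed`.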

How it works

  1. Trigger

     The workflow starts with a schedule trigger, running on a defined schedule.

  2. Processing

     Data flows through 17 nodes, connecting the Code, Send Email, GitHub, and HTTP Request operations.

  3. Output

     The workflow completes its automation and delivers the result to the configured destination.
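In the storage step, the GitHub node commits each processed file to the repository. The GitHub Contents API expects file bodies base64-encoded, so a "Prepare GitHub Commit" Code node would typically assemble the path, commit message, and encoded content. The path layout and message format below are assumptions for illustration.

```javascript
// Hypothetical "Prepare GitHub Commit" logic: build the fields the
// GitHub Contents API needs (path, message, base64 content).
// The processed/YYYY-MM-DD/ layout is an assumption.
function prepareCommit(fileName, csvContent) {
  const today = new Date().toISOString().slice(0, 10); // YYYY-MM-DD
  return {
    path: `processed/${today}/${fileName}`,
    message: `Add processed file ${fileName} (${today})`,
    content: Buffer.from(csvContent, "utf8").toString("base64"),
  };
}

const commit = prepareCommit("report.csv", "id,email\n1,a@example.com\n");
```

The GitHub node's Create/Update File operation then consumes these fields directly.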

Node details (17)

  1. Send Email (emailSend)
  2. GitHub (github)
  3. HTTP Request (httpRequest)
  4. Code (code)
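The Code nodes carry the workflow's custom logic, including building the notification bodies. As one hedged illustration, a "Success Email Content" node might assemble a subject and HTML body from per-file results; the field names here are assumptions, not the workflow's actual schema.

```javascript
// Hypothetical "Success Email Content" builder: turn per-file results
// into a subject line and HTML list for the Send Email node.
function buildSuccessEmail(files) {
  const rows = files
    .map((f) => `<li>${f.name}: ${f.rows} rows committed</li>`)
    .join("");
  return {
    subject: `File pipeline: ${files.length} file(s) processed`,
    html: `<p>Processing complete.</p><ul>${rows}</ul>`,
  };
}

const email = buildSuccessEmail([{ name: "report.csv", rows: 42 }]);
```

The "Error Email Content" node would follow the same pattern with the validation failures instead.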

How to import this workflow

  1. Click the Download JSON button on the right to save the workflow file.
  2. Open your n8n instance. Go to Workflows → New → Import from File.
  3. Select the downloaded process-incoming-files-and-notify-via-email-with-github-storage file and click Import.
  4. Configure credentials for each service node (API keys, OAuth, etc.).
  5. Click Test workflow to verify everything works, then activate it.

Or paste directly into n8n → Import from JSON:

{ "name": "Process incoming files and notify via email with GitHub storage", "nodes": [...], ...}
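For orientation, an n8n workflow export has roughly the shape sketched below. This is an assumed skeleton with the node list abbreviated to a single trigger; the real download contains all 17 node definitions and their connections.

```json
{
  "name": "Process incoming files and notify via email with GitHub storage",
  "nodes": [
    {
      "name": "Daily File Check",
      "type": "n8n-nodes-base.scheduleTrigger",
      "typeVersion": 1,
      "position": [0, 0],
      "parameters": {}
    }
  ],
  "connections": {}
}
```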

Integrations

code · emailSend · github · httpRequest · if · scheduleTrigger · set · splitInBatches

Get this workflow

Download and import with one click

Download JSON · View on n8n.io
Nodes: 17
Complexity: high
Trigger: scheduled
Category: File Management

Created by

vinci-king-01


@vinci-king-01

Tags

code · emailSend · github · httpRequest · if · scheduleTrigger · set · splitInBatches

New to n8n?

n8n is a free, open-source workflow automation tool. Host it yourself or use the cloud version.

Get n8n for free →

Related File Management Workflows

Complexity: medium

Sync SharePoint to Google Drive via Supabase (n8n Workflow)

Eliminate manual file transfers and data silos with this enterprise-grade synchronization pipeline. This n8n workflow automates the heavy lifting of mirroring your Microsoft SharePoint document libraries to Google Drive while maintaining a robust audit trail in Supabase or Postgres.

Designed for high-reliability environments, the automation uses delta-check logic: it fetches SharePoint metadata, compares it against your existing database records using the 'Compare Datasets' node, and identifies only new or modified files for processing. By leveraging the 'Split in Batches' node, the system handles large volumes of data without hitting memory limits, ensuring every document is securely downloaded via HTTP and re-uploaded to its destination. Once the transfer is verified, the workflow updates your database status in real time, providing a transparent log of your synchronization history.

This is an essential solution for organizations transitioning between cloud ecosystems or those requiring a redundant, cross-platform backup strategy. It bridges the gap between the Microsoft and Google ecosystems using Supabase as a high-performance state-management layer.

Common Use Cases:

  - Automated Multi-Cloud Disaster Recovery: Maintain a real-time mirror of critical SharePoint corporate assets in Google Drive to ensure business continuity during service outages.
  - Cross-Departmental Collaboration: Automatically sync project documentation from an executive SharePoint site to a creative team's Google Drive folder for seamless cross-platform access.
  - Centralized Document Auditing: Use Supabase to track every file movement between ecosystems, creating a searchable metadata index for compliance and regulatory reporting.

Scheduled · 15 nodes
Complexity: medium

Automate Threads Video Backups to Google Drive & Sheets

Streamline your social media asset management with this robust n8n automation designed for digital marketers and content creators. This workflow eliminates the manual effort of downloading Threads content by providing a seamless bridge between Instagram's text-based platform and your cloud storage.

Once triggered via a simple form input, the automation uses a RapidAPI connection to fetch high-quality video assets, processes the binary data, and securely uploads the file to a designated Google Drive folder. Simultaneously, the workflow maintains a centralized audit log in Google Sheets, capturing essential metadata such as timestamps and file IDs. Built-in logic gates and wait nodes ensure API rate limits are respected and file transfers are verified before logging.

This is an essential tool for agencies managing high volumes of social content who need a reliable, automated pipeline for content repurposing, archival compliance, or collaborative review processes. By centralizing your Threads media assets, you ensure your creative team has instant access to raw footage without the friction of manual downloads.

Common Use Cases:

  - Automated Social Media Content Archiving for Compliance
  - Centralized Asset Library Syncing for Creative Agencies
  - Competitor Research and Video Content Benchmarking

Trigger · 9 nodes