Automatic Notion Database Backup to Google Drive with Telegram Notifications — n8n Workflow

High complexity · Trigger · 18 nodes · 🏷️ File Management · by Prueba

Overview

🔍 Workflow Overview

What This Workflow Does

This workflow automatically saves copies of all your Notion databases to Google Drive, creating a safety net for your Notion data much like keeping duplicates of important documents in a filing cabinet.

Target Audience: Anyone who uses Notion and wants to protect their data by creating automatic backups to Google Drive.

Prerequisites (What You Need Before Starting)

Required Accounts:
- Notion account - where your databases are stored
- Google Drive account - where the backups are uploaded
- Telegram account with a bot - where notifications are sent

Nodes used

Telegram · Google Drive · Notion · Code

Workflow preview

Notion Backup to Google Drive
Automatically backup all your Notion databases to Google Drive.
How it works
Trigger: runs manually or on a schedule.
1. Initialize
Triggers the backup process and loads configuration settings.
2. Prepare Backup
Creates a timestamped folder and retrieves all Notion databases.
3. Process Databases
Loops through each database, exports all pages, and uploads them.
4. Finalize & Metadata
Generates a backup summary with statistics and success/failure counts.
5. Cleanup Old Backups
Automatically deletes backups older than the retention period.
6. Send Notification
Formats and sends a Telegram notification with the backup results.
⚠️ CRITICAL SETUP
Before the first run, update the Configuration Settings node with:
- Your Google Drive parent folder ID
- Your Telegram bot chat ID
Without these, the workflow will fail!
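The Configuration Settings node described above can be sketched as a small n8n Code node. This is a minimal sketch, not the workflow's actual implementation: the key names and the 30-day retention default are assumptions, and the two IDs are placeholders you must replace with your own values.

```javascript
// Hypothetical shape of the Configuration Settings node (n8n Code node).
// Key names and the retention default are assumptions; replace the
// placeholder IDs with your real Google Drive folder and Telegram chat IDs.
function buildConfig() {
  return {
    driveParentFolderId: "REPLACE_WITH_DRIVE_FOLDER_ID", // Google Drive parent folder ID
    telegramChatId: "REPLACE_WITH_CHAT_ID",              // Telegram bot chat ID
    retentionDays: 30,                                   // backups older than this get deleted
  };
}

// Inside an n8n Code node you would return it as an item:
// return [{ json: buildConfig() }];
```

Downstream nodes can then reference these values via expressions instead of hard-coding IDs in several places.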
Nodes: Manual Trigger, Configuration Settings, Create Backup Folder, Get All Databases, Initialize Tracking, Loop Databases, Get Database Pages, Create Backup File, Upload Backup File, Generate Metadata, Upload Metadata, Should Cleanup?, List Old Backups, Filter Old Backups, Delete Old Backup, Format Notification, Send Telegram Notification, If

18 nodes · 20 edges
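The Filter Old Backups step can be sketched as a date comparison against the retention window. This is a hedged sketch, not the workflow's actual code: the `createdTime` field name matches what the Google Drive API returns for files, and `retentionDays` is the assumed config value from the Configuration Settings node.

```javascript
// Sketch of "Filter Old Backups": keep only backup folders whose
// createdTime is older than the retention cutoff, so "Delete Old Backup"
// can remove them. Field names are assumptions based on the Drive API.
function filterOldBackups(folders, retentionDays, now = new Date()) {
  const cutoff = now.getTime() - retentionDays * 24 * 60 * 60 * 1000;
  return folders.filter(f => new Date(f.createdTime).getTime() < cutoff);
}
```

In the workflow this sits between List Old Backups and Delete Old Backup, guarded by the Should Cleanup? check.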

How it works

  1. Trigger

     The workflow starts with a trigger.

  2. Processing

     Data flows through 18 nodes, connecting code, googledrive, and if nodes.

  3. Output

     The workflow completes its automation and delivers the result to the configured destination.
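The final delivery step, Format Notification, can be sketched as building the Telegram message text from the backup metadata. The summary field names below are assumptions based on the workflow description (statistics plus success/failure status), not the template's actual schema.

```javascript
// Sketch of "Format Notification": turn the Generate Metadata summary
// into a plain-text Telegram message. Field names (total, succeeded,
// failed, folderName) are assumptions for illustration.
function formatNotification(summary) {
  const status = summary.failed === 0
    ? "✅ Backup succeeded"
    : "⚠️ Backup finished with errors";
  return [
    status,
    `Databases: ${summary.total}`,
    `Succeeded: ${summary.succeeded}`,
    `Failed: ${summary.failed}`,
    `Folder: ${summary.folderName}`,
  ].join("\n");
}
```

The resulting string is what the Send Telegram Notification node posts to your configured chat ID.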

Node details (18)

1. Telegram (telegram)
2. Google Drive (googleDrive)
3. Notion (notion)
4. Code (code)

How to import this workflow

  1. Click the Download JSON button on the right to save the workflow file.
  2. Open your n8n instance. Go to Workflows → New → Import from File.
  3. Select the downloaded automatic-notion-database-backup-to-google-drive-with-telegram-notifications file and click Import.
  4. Configure the credentials for each service node (API keys, OAuth, etc.).
  5. Click Test workflow to verify everything works, then activate it.

Or paste directly into n8n → Import from JSON:

{ "name": "Automatic Notion Database Backup to Google Drive with Telegram Notifications", "nodes": [...], ...}

Integrations

code · googledrive · if · manualtrigger · notion · set · splitinbatches · telegram

Get this workflow

Download and import with one click

Download JSON · View on n8n.io
Nodes: 18
Complexity: high
Trigger: trigger
Category: File Management

Created by

Prueba

@prueba

Tags

code · googledrive · if · manualtrigger · notion · set · splitinbatches · telegram

New to n8n?

n8n is a free, open-source workflow automation tool. Self-host it or use the cloud version.

Get n8n free →

Related File Management Workflows

medium

Sync SharePoint to Google Drive via Supabase (n8n Workflow)

Eliminate manual file transfers and data silos with this enterprise-grade synchronization pipeline. This n8n workflow automates the heavy lifting of mirroring your Microsoft SharePoint document libraries to Google Drive while maintaining a robust audit trail in Supabase or Postgres. Designed for high-reliability environments, the automation utilizes a sophisticated delta-check logic: it fetches SharePoint metadata, compares it against your existing database records using the 'Compare Datasets' node, and identifies only new or modified files for processing. By leveraging the 'Split in Batches' node, the system handles large volumes of data without hitting memory limits, ensuring every document is securely downloaded via HTTP and re-uploaded to its destination. Once the transfer is verified, the workflow updates your database status in real-time, providing a transparent log of your synchronization history. This is an essential solution for organizations transitioning between cloud ecosystems or those requiring a redundant, cross-platform backup strategy. It effectively bridges the gap between Microsoft and Google ecosystems using Supabase as a high-performance state-management layer.

**Common Use Cases:**
- Automated Multi-Cloud Disaster Recovery: Maintain a real-time mirror of critical SharePoint corporate assets in Google Drive to ensure business continuity during service outages.
- Cross-Departmental Collaboration: Automatically sync project documentation from an executive SharePoint site to a creative team's Google Drive folder for seamless cross-platform access.
- Centralized Document Auditing: Use Supabase to track every file movement between ecosystems, creating a searchable metadata index for compliance and regulatory reporting.

Scheduled · 15 nodes
medium

Automate Threads Video Backups to Google Drive & Sheets

Streamline your social media asset management with this robust n8n automation designed for digital marketers and content creators. This workflow eliminates the manual effort of downloading Threads content by providing a seamless bridge between Instagram's text-based platform and your cloud storage. Once triggered via a simple form input, the automation utilizes a RapidAPI connection to fetch high-quality video assets, processes the binary data, and securely uploads the file to a designated Google Drive folder. Simultaneously, the workflow maintains a centralized audit log in Google Sheets, capturing essential metadata such as timestamps and file IDs. Built-in logic gates and wait nodes ensure API rate limits are respected and file transfers are verified before logging. This is an essential tool for agencies managing high volumes of social content who need a reliable, automated pipeline for content repurposing, archival compliance, or collaborative review processes. By centralizing your Threads media assets, you ensure your creative team has instant access to raw footage without the friction of manual downloads.

**Common Use Cases:**
- Automated Social Media Content Archiving for Compliance
- Centralized Asset Library Syncing for Creative Agencies
- Competitor Research and Video Content Benchmarking

Trigger · 9 nodes