Import E.ON W1000 Energy Meter Data to Home Assistant with Spook Integration — n8n Workflow

Complexity: high · Trigger · 32 nodes · 🏷️ File Management · 👁 110 views · Author: András Farkas

Overview

UPDATES: 2025-12-03 — fixed the JS code in the "Calculate hourly sum" node

E.ON W1000 → n8n → Home Assistant (Spook) “Integration”

This workflow processes emails from the E.ON portal containing 15-minute +A -A (import/export) data and daily 1.8.0 2.8.0 meter readings.
It extracts the required columns from the attached XLSX file, groups the 15-minute values by hour, then:

- updates the Spook/Recorder statistics under the IDs `sensor.grid_energy_import` and `sensor.grid_energy_export`, and
- sets the current meter state on the matching `input_number` helpers.

Nodes used

Gmail · Home Assistant · Code

Workflow preview

Subject & Attachment checks
- The Gmail payload uses `Subject` (capital S); the IMAP node uses the lowercase field.
- This template handles both: the first If node checks `$json.Subject`, the IMAP path checks the lowercase field.
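The field fallback above can be sketched as a small helper. This is a minimal sketch, not the template's exact If-node expression; the `getSubject`/`isW1000Export` names and the `'w1000'` subject token are assumptions — adjust the token to your actual E.ON subject line.

```javascript
// Gmail trigger exposes `Subject` (capital S); the IMAP trigger exposes
// lowercase `subject`. Read whichever is present before matching.
function getSubject(json) {
  return json.Subject ?? json.subject ?? '';
}

function isW1000Export(json) {
  // 'w1000' is a hypothetical subject token — replace with your filter.
  return getSubject(json).toLowerCase().includes('w1000');
}

const gmailItem = { Subject: 'W1000 export 2025-12-01' };
const imapItem  = { subject: 'W1000 export 2025-12-01' };
const other     = { subject: 'Newsletter' };
```

In the template this logic lives in two If nodes rather than one function, but the effect is the same: both trigger payload shapes reach the extraction step.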
Column Mapping
The E.ON export reuses column names; n8n appends `_1`, `_2`, … to deduplicate them.
We normalize to:
- Default (+A): `Időbélyeg` → `start`, `Érték` → `AP`
- *_1 (-A): `Időbélyeg` → `start`, `Érték_1` → `AM`
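The mapping above can be expressed as a Code-node style function. This is a minimal sketch assuming the column names from the sticky note; the template itself uses Rename Keys nodes rather than a single function.

```javascript
// Normalize one XLSX row: Hungarian headers → workflow field names.
// `Érték_1` is the duplicated -A column after n8n's `_1` suffixing.
function normalizeRow(row) {
  const out = {};
  if ('Időbélyeg' in row) out.start = row['Időbélyeg'];
  if ('Érték' in row)     out.AP = row['Érték'];     // +A (import)
  if ('Érték_1' in row)   out.AM = row['Érték_1'];   // -A (export)
  return out;
}

const row = { 'Időbélyeg': 45625.0, 'Érték': 0.12, 'Érték_1': 0.03 };
const normalized = normalizeRow(row);
```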
E.ON W1000 → n8n → Home Assistant (Spook) — Overview
Goal: Parse the E.ON W1000 email export (.xlsx), group 15-minute values by hour, and push the statistics to Home Assistant via Spook.
Credentials to configure
- Gmail OAuth2 *or* IMAP credentials (read-only)
- Home Assistant API (Long-Lived Access Token)
- Optional: adjust entity IDs if you renamed them in HA
> Tip: Keep tokens in n8n credentials, never in node parameters.
Schedule trigger
Gmail trigger
IMAP trigger
Time Conversion
- E.ON Excel time is a serial day count; the Convert Excel time node turns it into a datetime.
- The Convert datetime to Spook format node rounds timestamps down to the top of the hour.
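The two conversion steps can be sketched in plain JavaScript. This is a minimal sketch, not the template's exact node code; it assumes the standard Excel serial-date epoch (serial day 25569 = 1970-01-01T00:00:00Z) and does everything in UTC — adjust if your W1000 export uses local time.

```javascript
// Excel serial = days since 1899-12-30; subtract the Unix-epoch offset
// (25569 days) and scale to milliseconds. Math.round absorbs the tiny
// floating-point error from the fractional day part.
function excelSerialToDate(serial) {
  const ms = Math.round((serial - 25569) * 86400 * 1000);
  return new Date(ms);
}

// Round down to the top of the hour, as the Spook-format step does.
function floorToHour(date) {
  const d = new Date(date.getTime());
  d.setUTCMinutes(0, 0, 0);
  return d;
}

const reading = excelSerialToDate(45658 + 15 / 1440); // 2025-01-01 00:15 UTC
const hour = floorToHour(reading);
```

After this rounding, four consecutive 15-minute rows share the same hourly `start`, which is what makes the grouping step below a simple group-and-sum.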
Hourly Grouping Logic (Code node)
INPUT: `{ start, AP, AM, 1_8_0?, 2_8_0? }` at hourly timestamps
- Sums 15-min AP/AM into hourly totals
- Resets meter baselines when a new `1_8_0`/`2_8_0` reading arrives
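The summing part can be sketched as follows. This is a simplified sketch of the "Calculate hourly sum" node, assuming `start` has already been rounded to the hour; baseline resets on new 1.8.0/2.8.0 readings are omitted, and the exact code in the template may differ.

```javascript
// Group rows by their (already hour-rounded) `start` and sum the
// 15-minute AP/AM values into hourly totals.
function hourlySums(rows) {
  const byHour = new Map();
  for (const r of rows) {
    const acc = byHour.get(r.start) ?? { start: r.start, AP: 0, AM: 0 };
    acc.AP += r.AP ?? 0;
    acc.AM += r.AM ?? 0;
    byHour.set(r.start, acc);
  }
  return [...byHour.values()];
}

const rows = [
  { start: '2025-01-01T00:00:00Z', AP: 0.1, AM: 0 },
  { start: '2025-01-01T00:00:00Z', AP: 0.2, AM: 0.05 },
  { start: '2025-01-01T01:00:00Z', AP: 0.3, AM: 0 },
];
const hours = hourlySums(rows);
```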
recorder.import_statistics payload
We build `{ start, state, sum }` arrays for each meter:
- start: JS Date object
- state: current meter state (kWh)
- sum: same as state for total_increasing statistics
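Building those arrays can be sketched like this. A minimal sketch only: in the real node `start` is a JS Date (strings are used here for brevity), the `buildStats` name is hypothetical, the 100 kWh baseline is a made-up 1.8.0 reading, and baseline resets are omitted.

```javascript
// Turn hourly deltas into a cumulative total_increasing series:
// `state` is the running meter state and `sum` mirrors it.
function buildStats(hourly, field, baseline = 0) {
  let total = baseline;
  return hourly.map((h) => {
    total += h[field];
    return { start: h.start, state: total, sum: total };
  });
}

const stats = buildStats(
  [
    { start: '2025-01-01T00:00:00Z', AP: 0.3 },
    { start: '2025-01-01T01:00:00Z', AP: 0.2 },
  ],
  'AP',
  100 // hypothetical 1.8.0 baseline reading in kWh
);
```

The resulting array is what gets posted to Spook's `recorder.import_statistics` service, once per meter direction.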
Entity IDs updated
- `sensor.grid_energy_import` / `sensor.grid_energy_export`
- `input_number.grid_import_meter` / the matching export helper
Troubleshooting
- No rows after Extract? → Check that the email subject matches the configured filter.
- Wrong times? → Verify the timezone and the Excel serial conversion.
Security & Privacy
- Gmail/IMAP access is read-only; do not store raw attachments.
- Filter by sender (`[email protected]`) + subject token.
- Never hardcode tokens in nodes — use n8n credentials.
Home Assistant prerequisites
Before running this workflow, make sure HA is prepared:
1. Recorder enabled
   - The Recorder integration must be active (default if you use the standard configuration).
2. Spook installed
   - The workflow calls the `recorder.import_statistics` service, which the Spook custom integration provides.
Canvas nodes (32 nodes, 35 edges): Extract from File; Rename "*_1" keys for me… (×4); Get last 5 messages; Aggregate_id; Get a message[0]; Extract default data fro…; Extract '*_1' data from …; Extract '*_2' data from …; Extract '*_3' data from …; Merge (+A; -A); Merge (+A; -A)1; Merge (+A; -A)2; Calculate hourly sum and…; Spook: update +A hitoric…; Spook: update -A hitoric…; Generate 1_8_0 list for …; Generate 2_8_0 list for …; Generate 1_8_0 stats; Generate 2_8_0 stats; Update input_number.expo…; Update input_number.impo…; Gmail Trigger; Schedule Trigger; If attachment_0 is xlsx; No Operation, do nothing1; Email Trigger (IMAP); Check Email Subject; Convert datetime to Spoo…; Convert Excel time

How it works

1. Trigger — the workflow starts from one of the trigger nodes (Schedule, Gmail, or IMAP).
2. Processing — the data flows through 32 nodes, connecting aggregate, code, and datetime operations.
3. Output — the workflow completes the automation and sends the results to the configured targets.

Node details (32)

1. Gmail (`gmail`)
2. Home Assistant (`homeAssistant`)
3. Code (`code`)

How to import this workflow

1. Click the Download JSON button on the right to save the workflow file.
2. Open your n8n instance and go to Workflows → New → Import from File.
3. Select the downloaded import-eon-w1000-energy-meter-data-to-home-assistant-with-spook-integration file and click Import.
4. Configure credentials (API keys, OAuth, etc.) for each service node.
5. Click Test workflow to verify everything works, then activate it.

Or paste directly via n8n → Import from JSON:

{ "name": "Import E.ON W1000 Energy Meter Data to Home Assistant with Spook Integration", "nodes": [...], ...}

Integrations

aggregate · code · datetime · emailreadimap · extractfromfile · gmail · gmailtrigger · homeassistant · if · merge · renamekeys · scheduletrigger · set · splitout

Get this workflow

Download JSON · View on n8n.io
Nodes: 32 · Complexity: high · Trigger · Views: 110

Creator

András Farkas (@netesfiu)


New to n8n?

n8n is a free, open-source workflow-automation tool that you can self-host or use in the cloud.

Get n8n for free →

Related File Management Workflows

medium

Sync SharePoint to Google Drive via Supabase (n8n Workflow)

Eliminate manual file transfers and data silos with this enterprise-grade synchronization pipeline. This n8n workflow automates the heavy lifting of mirroring your Microsoft SharePoint document libraries to Google Drive while maintaining a robust audit trail in Supabase or Postgres. Designed for high-reliability environments, the automation utilizes a sophisticated delta-check logic: it fetches SharePoint metadata, compares it against your existing database records using the 'Compare Datasets' node, and identifies only new or modified files for processing. By leveraging the 'Split in Batches' node, the system handles large volumes of data without hitting memory limits, ensuring every document is securely downloaded via HTTP and re-uploaded to its destination.

Once the transfer is verified, the workflow updates your database status in real-time, providing a transparent log of your synchronization history. This is an essential solution for organizations transitioning between cloud ecosystems or those requiring a redundant, cross-platform backup strategy. It effectively bridges the gap between Microsoft and Google ecosystems using Supabase as a high-performance state-management layer.

**Common Use Cases:**
- Automated Multi-Cloud Disaster Recovery: Maintain a real-time mirror of critical SharePoint corporate assets in Google Drive to ensure business continuity during service outages.
- Cross-Departmental Collaboration: Automatically sync project documentation from an executive SharePoint site to a creative team's Google Drive folder for seamless cross-platform access.
- Centralized Document Auditing: Use Supabase to track every file movement between ecosystems, creating a searchable metadata index for compliance and regulatory reporting.

Scheduled · 15 nodes
medium

Automate Threads Video Backups to Google Drive & Sheets

Streamline your social media asset management with this robust n8n automation designed for digital marketers and content creators. This workflow eliminates the manual effort of downloading Threads content by providing a seamless bridge between Instagram's text-based platform and your cloud storage. Once triggered via a simple form input, the automation utilizes a RapidAPI connection to fetch high-quality video assets, processes the binary data, and securely uploads the file to a designated Google Drive folder. Simultaneously, the workflow maintains a centralized audit log in Google Sheets, capturing essential metadata such as timestamps and file IDs.

Built-in logic gates and wait nodes ensure API rate limits are respected and file transfers are verified before logging. This is an essential tool for agencies managing high volumes of social content who need a reliable, automated pipeline for content repurposing, archival compliance, or collaborative review processes. By centralizing your Threads media assets, you ensure your creative team has instant access to raw footage without the friction of manual downloads.

**Common Use Cases:**
- Automated Social Media Content Archiving for Compliance
- Centralized Asset Library Syncing for Creative Agencies
- Competitor Research and Video Content Benchmarking

Trigger · 9 nodes