Import E.ON W1000 Energy Meter Data to Home Assistant with Spook Integration — n8n Workflow

Complexity: high · Trigger-based · 32 nodes · 🏷️ File Management · 👁 110 views · Author: András Farkas

Overview

Updates: 2025-12-03 — fixed the JS code in the "Calculate hourly sum" node.

E.ON W1000 → n8n → Home Assistant (Spook) "Integration"

This workflow processes emails from the E.ON portal containing 15-minute +A/-A (import/export) data and daily 1.8.0/2.8.0 meter readings.
It extracts the required columns from the attached XLSX file, groups the 15-minute values by hour, then:

- updates the Spook/Recorder statistics under the IDs `sensor.grid_energy_import` and `sensor.grid_energy_export`, and
- sets the current meter values on the corresponding `input_number` helper entities.

Nodes Used

Gmail · Home Assistant · Code

Workflow Preview

Subject & Attachment checks
- The Gmail payload uses `Subject` (capital S); the IMAP node uses lowercase `subject`.
- This template handles both: the first If node checks `$json.Subject` and falls back to the lowercase variant.
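A minimal sketch of that case-insensitive subject handling (the `W1000` token is an assumed example filter, not necessarily the exact condition used in the template's If node):

```javascript
// Normalize the subject across Gmail (`Subject`) and IMAP (`subject`) payloads.
function getSubject(json) {
  return json.Subject ?? json.subject ?? "";
}

// Example gate similar in spirit to the "Check Email Subject" node;
// the "W1000" token is an assumed filter for illustration.
function isW1000Export(json) {
  return getSubject(json).toLowerCase().includes("w1000");
}
```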
Column Mapping
The E.ON export reuses column names; n8n appends `_1`, `_2`, … to disambiguate the duplicates.
We normalize to:
- Default (+A): `Időbélyeg` → `start`, `Érték` → `AP`
- `*_1` (-A): `Időbélyeg` → `start`, `Érték_1` → `AM`
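The mapping above can be sketched as a small Code-node helper (field names are taken from the note; the exact template uses dedicated Rename Keys nodes instead):

```javascript
// Map the original E.ON columns (and n8n's `_1` duplicate suffix) to the
// normalized keys used by the rest of the workflow.
function normalizeRow(row) {
  return {
    start: row["Időbélyeg"], // timestamp
    AP: row["Érték"],        // +A, 15-minute import energy
    AM: row["Érték_1"],      // -A, 15-minute export energy
  };
}
```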
E.ON W1000 → n8n → Home Assistant (Spook) — Overview
Goal: Parse the E.ON W1000 email export (.xlsx), group the 15-minute values by hour, and import the results into Home Assistant via Spook.
Credentials to configure
- Gmail OAuth2 *or* IMAP credentials (read-only)
- Home Assistant API (Long-Lived Access Token)
- Optional: adjust entity IDs if you renamed them in HA
> Tip: Keep
Triggers: Schedule trigger · Gmail trigger · IMAP trigger
Time Conversion
- E.ON Excel time is a serial day count; the "Convert Excel time" node turns it into a real datetime.
- The "Convert datetime to Spook format" node rounds timestamps down to the top of the hour.
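The two conversions above can be sketched as follows (a minimal sketch: Excel serials are treated as UTC here, so apply your local-timezone offset separately if your W1000 export uses local time):

```javascript
// Excel stores times as fractional days since 1899-12-30; serial 25569
// corresponds to the Unix epoch (1970-01-01).
function excelSerialToDate(serial) {
  return new Date(Math.round((serial - 25569) * 86400 * 1000));
}

// Round a datetime down to the top of the hour (UTC), as the
// "Convert datetime to Spook format" step does before grouping.
function floorToHour(date) {
  const d = new Date(date);
  d.setUTCMinutes(0, 0, 0);
  return d;
}
```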
Hourly Grouping Logic (Code node)
INPUT: `{ start, AP, AM, 1_8_0?, 2_8_0? }` at hourly timestamps
- Sums 15-min AP/AM into hourly totals
- Resets meter baselines when a new `1_8_0`/`2_8_0` reading arrives
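The grouping step can be sketched like this (a simplified version of the "Calculate hourly sum" logic, assuming `start` has already been rounded to the hour):

```javascript
// Sum 15-minute AP/AM rows into hourly buckets keyed by the
// (already hour-rounded) `start` value.
function groupHourly(rows) {
  const buckets = new Map();
  for (const row of rows) {
    const b = buckets.get(row.start) ?? { start: row.start, AP: 0, AM: 0 };
    b.AP += row.AP ?? 0;
    b.AM += row.AM ?? 0;
    // Carry a daily 1.8.0 / 2.8.0 register reading into the bucket when present.
    if (row["1_8_0"] != null) b["1_8_0"] = row["1_8_0"];
    if (row["2_8_0"] != null) b["2_8_0"] = row["2_8_0"];
    buckets.set(row.start, b);
  }
  return [...buckets.values()];
}
```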
recorder.import_statistics payload
We build `{ start, state, sum }` arrays for each meter:
- start: JS Date object
- state: current meter state (kWh)
- sum: same as state, for total_increasing statistics
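A sketch of building those rows for the import meter (`buildImportStats` and `initialMeter` are illustrative names; the baseline-reset behaviour follows the hourly-grouping note above):

```javascript
// Build `{ start, state, sum }` rows for one total_increasing statistic.
// `meter` carries the running kWh value forward; a daily 1.8.0 register
// reading re-bases it when one arrives.
function buildImportStats(hourly, initialMeter) {
  let meter = initialMeter;
  return hourly.map((h) => {
    meter = h["1_8_0"] != null ? h["1_8_0"] : meter + h.AP;
    return { start: h.start, state: meter, sum: meter };
  });
}
```

The export meter works the same way with `AM` and the `2_8_0` reading.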
Entity IDs updated
- `sensor.grid_energy_import` / `sensor.grid_energy_export` (statistics)
- `input_number.grid_import_meter` / `input_number.grid_export_meter` (current meter values)
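For illustration only, a Spook `recorder.import_statistics` call body for the import sensor might look roughly like the following — the exact field names and values are an assumption here, so verify them against the Spook documentation for your version:

```json
{
  "statistic_id": "sensor.grid_energy_import",
  "source": "recorder",
  "unit_of_measurement": "kWh",
  "has_mean": false,
  "has_sum": true,
  "stats": [
    { "start": "2025-01-01T10:00:00+00:00", "state": 1234.5, "sum": 1234.5 }
  ]
}
```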
Troubleshooting
- No rows after Extract? → Check that the email subject and attachment match the expected filters.
- Wrong times? → Verify the timezone and the Excel serial conversion.
Security & Privacy
- Gmail/IMAP access is read-only; do not store raw attachments.
- Filter by sender (`[email protected]`) + subject token.
- Never hardcode tokens in nodes — use n8n credentials.
Home Assistant prerequisites
Before running this workflow, make sure HA is prepared:
1. Recorder enabled
   - The Recorder integration must be active (the default in a standard HA install).
2. Spook installed
   - The Spook integration provides the `recorder.import_statistics` service this workflow calls.
Canvas nodes (32 nodes, 35 edges):
- Schedule Trigger / Gmail Trigger / Email Trigger (IMAP)
- Check Email Subject
- If attachment_0 is xlsx
- No Operation, do nothing1
- Get last 5 messages / Get a message[0]
- Aggregate_id
- Extract from File
- Extract default data fro… / Extract '*_1' data from … / Extract '*_2' data from … / Extract '*_3' data from …
- Rename "*_1" keys for me… (×4)
- Merge (+A; -A) / Merge (+A; -A)1 / Merge (+A; -A)2
- Convert Excel time / Convert datetime to Spoo…
- Calculate hourly sum and…
- Spook: update +A hitoric… / Spook: update -A hitoric…
- Generate 1_8_0 list for … / Generate 2_8_0 list for … / Generate 1_8_0 stats / Generate 2_8_0 stats
- Update input_number.expo… / Update input_number.impo…

How It Works

  1. Trigger

     The workflow starts from a Schedule, Gmail, or IMAP trigger.

  2. Processing

     Data flows through 32 nodes, connecting aggregate, code, and datetime operations.

  3. Output

     The workflow completes the automation and delivers the results to the configured destinations.

Node Details (32)

- Gmail (`gmail`) — #1
- Home Assistant (`homeAssistant`) — #2
- Code (`code`) — #3

How to Import This Workflow

  1. Click the Download JSON button on the right to save the workflow file.
  2. Open your n8n instance and go to Workflows → New → Import from File.
  3. Select the downloaded import-eon-w1000-energy-meter-data-to-home-assistant-with-spook-integration file and click Import.
  4. Configure the credentials (API keys, OAuth, etc.) for each service node.
  5. Click Test Workflow to verify it runs, then activate it.

Or paste directly into n8n's Import from JSON:

{ "name": "Import E.ON W1000 Energy Meter Data to Home Assistant with Spook Integration", "nodes": [...], ...}

Integrations

aggregate, code, datetime, emailreadimap, extractfromfile, gmail, gmailtrigger, homeassistant, if, merge, renamekeys, scheduletrigger, set, splitout


Author

András Farkas (@netesfiu)


