first_commit
@@ -0,0 +1,6 @@
__pycache__/
*.py[cod]
logs/
*.log
env/*.env
!env/*.example
@@ -0,0 +1,18 @@
[Unit]
Description=Astrape Database Ingest
After=network-online.target postgresql.service
Wants=network-online.target

[Service]
Type=simple
User=gibil
Group=gibil
WorkingDirectory=/mnt/astrape
Environment=PYTHONUNBUFFERED=1
Environment=PYTHONDONTWRITEBYTECODE=1
ExecStart=/usr/bin/python3 -m gibil.scripts.db_daemon
Restart=always
RestartSec=10

[Install]
WantedBy=multi-user.target
@@ -0,0 +1,18 @@
[Unit]
Description=Astrape Web UI
After=network-online.target
Wants=network-online.target

[Service]
Type=simple
User=gibil
Group=gibil
WorkingDirectory=/mnt/astrape
Environment=PYTHONUNBUFFERED=1
Environment=PYTHONDONTWRITEBYTECODE=1
ExecStart=/usr/bin/python3 -m gibil.scripts.web_daemon
Restart=always
RestartSec=5

[Install]
WantedBy=multi-user.target
@@ -0,0 +1,71 @@
# Architecture Principles

## Standalone Subsystems

Each class should behave like a small standalone subsystem. It should own one clear responsibility, expose a narrow public interface, and avoid hidden dependencies on the internals of other classes.

Good subsystem boundaries:
- accept explicit inputs
- return explicit outputs
- keep internal state private
- avoid reaching into global state
- avoid performing unrelated work
- can be tested with recorded or fixture data

Examples:
- a weather client fetches forecast data
- a weather parser converts API payloads into forecast points
- a weather builder normalizes external forecast records for storage
- a storage class persists records
- Gibil makes decisions from snapshots
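As a minimal sketch of these boundaries (the class and payload names below are illustrative stand-ins, not part of Astrape):

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class ForecastPoint:
    """Explicit output model; callers never see raw payload internals."""
    hour: int
    temperature_c: float


class ForecastParser:
    """A small standalone subsystem: explicit input in, explicit output out.

    No globals and no I/O, so it can be tested with a recorded fixture payload.
    """

    def parse(self, payload: dict) -> list[ForecastPoint]:
        hourly = payload["hourly"]
        return [
            ForecastPoint(hour=h, temperature_c=t)
            for h, t in zip(hourly["hour"], hourly["temperature_2m"])
        ]


# Fixture data stands in for a live API call.
fixture = {"hourly": {"hour": [0, 1], "temperature_2m": [3.5, 4.0]}}
points = ForecastParser().parse(fixture)
```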
## Data Models Between Subsystems

Subsystems should communicate through shared data models rather than through source-specific payloads.

For example:
- Open-Meteo JSON should become `WeatherForecastRun`
- Modbus register reads should become `Observation`
- HASS entity state should become `Observation`
- Gibil should reason from `Snapshot`

This keeps the edges messy and the core calm.

## Side Effects At The Edges

Network calls, database writes, MQTT publishes, and filesystem writes should live at clear boundaries.

Core reasoning classes should generally be pure or nearly pure:
- input data in
- answer out
- no surprise I/O

Stateful classes are allowed, but their state should be deliberate and inspectable.

## Grow By Composition

Astrape should grow by connecting small subsystems together, not by building one large object that knows everything.

The desired shape is:

```text
source client -> parser -> model -> storage -> query/snapshot -> Gibil -> publisher
```

Each part should be replaceable without rewriting the others.
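The pipeline above can be sketched as plain composition of small callables; every name and value below is a stand-in for a real subsystem:

```python
# Each stage is a small callable; swapping one out does not touch the others.
def source_client() -> dict:
    return {"raw": [1, 2, 3]}          # stands in for a network payload

def parser(payload: dict) -> list[int]:
    return payload["raw"]

def storage(records: list[int], db: list[int]) -> None:
    db.extend(records)                  # the side effect lives at the edge

def snapshot(db: list[int]) -> int:
    return sum(db)

def gibil(state: int) -> str:
    return "surplus" if state > 5 else "standard"

db: list[int] = []
storage(parser(source_client()), db)
decision = gibil(snapshot(db))
```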
## Prefer Working Slices

Build one thin working path at a time. A thin slice may start with empty storage or recorded source data, but it should still follow the real subsystem boundaries.

For example, the weather slice can start with:

```text
Open-Meteo forecast run -> WeatherBuilder -> clean forecast records
```

Then grow into:

```text
Open-Meteo -> parser -> WeatherBuilder -> TimescaleDB -> weather_predictor.py
```
@@ -0,0 +1,256 @@
# Ingestion & Storage

## Purpose

Astrape needs a reliable way to collect energy-related data, normalize it, store it, and give Gibil a clean view of the current system state. The first version should favor boring, inspectable data flows over cleverness.

Gibil should not need to know whether a value came from Modbus, Home Assistant, a weather API, a price API, or a manual override. It should receive timestamped observations and snapshots with enough metadata to decide whether the data is fresh and trustworthy.

## Initial Sources

### Sigen Inverter

- Protocol: Modbus TCP
- Polling target: every 5-10 seconds for fast-changing electrical state
- Initial metrics:
  - `solar_power_w`
  - `battery_soc_pct`
  - `battery_charge_w`
  - `battery_discharge_w`
  - `grid_import_w`
  - `grid_export_w`
  - `daily_yield_kwh`
- Risk: the register map must be confirmed before this can be real

### Home Assistant / Ganymede

- Preferred integration: MQTT
- Direction: HASS/Ganymede should publish selected state to Astrape where possible
- Initial metrics:
  - `home_power_w`
  - `indoor_temp_c`
  - selected device states
  - selected sensor values needed for water/heating logic
- Reasoning: MQTT keeps Astrape loosely coupled and avoids making HASS a synchronous dependency for every decision tick

### Weather

- Preferred first source: Open-Meteo
- Polling target: hourly forecast refresh
- Initial metrics:
  - `outdoor_temp_c`
  - `cloud_cover_pct`
  - `ghi_w_m2`
  - `wind_speed_m_s`
- Use: external forecast history for generation and heating models

### Grid Pricing

- First implementation: static time-of-use config
- Later implementation: spot pricing API if needed
- Initial metrics:
  - `grid_price_per_kwh`
  - `price_stage`
  - `cheap_window_active`
- Reasoning: static config lets Gibil produce useful behavior before the price API work is settled
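A static time-of-use table can be as simple as a list of hourly windows. The window boundaries and prices below are made-up placeholders, and the helper name is illustrative:

```python
from datetime import datetime, timezone

# (start_hour, end_hour, price_per_kwh, stage) -- placeholder values only
TOU_WINDOWS = [
    (0, 6, 0.45, "cheap"),
    (6, 22, 1.20, "standard"),
    (22, 24, 0.45, "cheap"),
]

def price_now(now: datetime) -> tuple[float, str, bool]:
    """Return (grid_price_per_kwh, price_stage, cheap_window_active)."""
    for start, end, price, stage in TOU_WINDOWS:
        if start <= now.hour < end:
            return price, stage, stage == "cheap"
    raise ValueError("hour not covered by TOU table")

price, stage, cheap = price_now(datetime(2025, 1, 1, 3, tzinfo=timezone.utc))
```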
### Manual Inputs

- Purpose: allow operator-supplied values when a real integration is not available yet
- Inputs may come from local config or a small authenticated admin path
- Manual data should be marked clearly with `source = manual`

## Observation Shape

Every collector should produce normalized observations.

```text
observed_at: timestamp when the measurement was true
received_at: timestamp when Astrape received it
source: sigen | hass | weather | price | manual
metric: stable metric name
value: number, string, or boolean
unit: W | kWh | pct | C | SEK/kWh | state | none
quality: ok | stale | estimated | missing | error
metadata: source-specific context
```

Guidelines:
- `observed_at` and `received_at` are both needed because pushed data may arrive late
- metric names should be stable and boring
- raw source names/registers/entities belong in metadata, not in the metric name
- Gibil should be able to ignore stale or low-quality observations
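A collector emitting this shape looks roughly like the sketch below. The dataclass is a minimal stand-in mirroring the shape above (the real model lives in `gibil.classes.models`), and the register value is invented:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Any

# Minimal stand-in mirroring the Observation shape described above.
@dataclass(frozen=True)
class Observation:
    source: str
    metric: str
    value: float
    unit: str
    quality: str
    observed_at: datetime
    received_at: datetime
    metadata: dict[str, Any] = field(default_factory=dict)

now = datetime.now(timezone.utc)
obs = Observation(
    source="sigen",
    metric="solar_power_w",        # stable, boring metric name
    value=2450.0,                  # invented sample reading
    unit="W",
    quality="ok",
    observed_at=now,               # when the measurement was true
    received_at=now,               # when Astrape received it
    metadata={"register": 30775},  # raw register number stays in metadata
)
```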
## Derived Snapshots

Gibil should reason from snapshots, not directly from loose individual observations.

A snapshot is the best-known whole-system state at a decision tick. It can include:

- current solar generation
- current home consumption
- battery SoC
- battery charge/discharge power
- grid import/export
- current price stage
- active forecast window
- stale/missing input flags

Snapshots should be persisted because they explain what Gibil knew when it made a decision.

## Storage Choice

Use TimescaleDB as the first primary store.

Reasons:
- It is Postgres, so querying and joining data stays straightforward
- It handles time-series retention and aggregation well
- It works for raw observations, derived snapshots, decisions, forecasts, and events
- It leaves room for later model training without needing a second historical store immediately

InfluxDB remains a reasonable alternative, but TimescaleDB is the better default if we want relational joins, auditability, and forecast training queries.

The runtime expects `ASTRAPE_DATABASE_URL` to point at TimescaleDB. Weather ingest also expects `ASTRAPE_LATITUDE` and `ASTRAPE_LONGITUDE`.

## Initial Tables

### `observations`

Raw normalized metric samples from all collectors.

Core fields:
- `id`
- `observed_at`
- `received_at`
- `source`
- `metric`
- `value_num`
- `value_text`
- `value_bool`
- `unit`
- `quality`
- `metadata`

Notes:
- use one value column based on the metric type
- keep metadata as JSON for source-specific details
- make this a hypertable on `observed_at`
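One way to express that schema is sketched below. The column names come from the field list above; the column types are assumptions, and `create_hypertable` is TimescaleDB's standard partitioning call, shown here only as SQL text to be executed via any Postgres driver:

```python
# DDL sketch for the observations table; types are assumed, not confirmed.
OBSERVATIONS_DDL = """
CREATE TABLE IF NOT EXISTS observations (
    id          BIGINT GENERATED ALWAYS AS IDENTITY,
    observed_at TIMESTAMPTZ NOT NULL,
    received_at TIMESTAMPTZ NOT NULL,
    source      TEXT NOT NULL,
    metric      TEXT NOT NULL,
    value_num   DOUBLE PRECISION,
    value_text  TEXT,
    value_bool  BOOLEAN,
    unit        TEXT,
    quality     TEXT,
    metadata    JSONB
);
"""

# TimescaleDB turns the table into a hypertable partitioned on observed_at.
HYPERTABLE_DDL = (
    "SELECT create_hypertable('observations', 'observed_at', if_not_exists => TRUE);"
)
```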
### `snapshots`

Periodic whole-system state used by Gibil.

Core fields:
- `id`
- `created_at`
- `snapshot`
- `input_quality`

Notes:
- store the snapshot as JSON initially
- this can be normalized later if query patterns demand it

### `decisions`

Gibil outputs and reasoning.

Core fields:
- `id`
- `created_at`
- `snapshot_id`
- `stage`
- `recommendations`
- `reasons`
- `confidence`

Notes:
- decisions should be explainable enough to debug after the fact
- this table becomes the audit trail for HASS-facing behavior

### `weather_forecast_points`

Clean external weather forecast points from weather sources.

Core fields:
- `id`
- `issued_at`
- `target_at`
- `horizon_hours`
- `source`
- `temperature_c`
- `shortwave_radiation_w_m2`
- `cloud_cover_pct`

Notes:
- this stores external forecasts, not internal predictions
- make this a hypertable on `target_at`

### `weather_resolved_truth`

Observed weather for target hours that have already happened.

Core fields:
- `id`
- `resolved_at`
- `source`
- `temperature_c`
- `shortwave_radiation_w_m2`

Notes:
- future prediction modules can join this to `weather_forecast_points`
- make this a hypertable on `resolved_at`

### `system_events`

Operational events from collectors, storage, Gibil, and publishers.

Core fields:
- `id`
- `created_at`
- `component`
- `severity`
- `event_type`
- `message`
- `metadata`

Notes:
- this should capture stale data, auth failures, bad Modbus reads, publish failures, and degraded-mode decisions

## Retention

Initial retention targets:
- raw 5-10 second observations: 7-30 days
- 1-minute aggregates: 6-12 months
- 15-minute/hourly aggregates: keep indefinitely unless storage becomes a problem
- decisions: keep indefinitely
- system events: keep indefinitely or archive after a year

Retention should be revisited after real sample rates and database size are known.

## First Slice

The first implementation slice should prove the shape before touching real hardware.

1. Define the observation and snapshot models.
2. Add a manual collector only if needed for operator-supplied values.
3. Store observations in TimescaleDB or a local development substitute.
4. Build one snapshot from the latest observations.
5. Let Gibil make a simple stage decision from that snapshot.
6. Persist the decision with reasons.

This gives us the whole loop:

```text
collector -> observations -> snapshot -> Gibil decision -> stored audit trail
```
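The whole loop can be sketched with stub stand-ins for the real subsystems (the real `SnapshotBuilder` and `GibilAgent` live in `gibil.classes`; the values and thresholds below are invented for illustration):

```python
# Stub stand-ins so the loop shape is visible end to end.
def collect() -> list[dict]:
    return [{"metric": "battery_soc_pct", "value": 96.0},
            {"metric": "solar_power_w", "value": 3000.0},
            {"metric": "home_power_w", "value": 800.0}]

def build_snapshot(observations: list[dict]) -> dict:
    return {o["metric"]: o["value"] for o in observations}

def decide(snapshot: dict) -> dict:
    surplus = snapshot["solar_power_w"] > snapshot["home_power_w"]
    stage = "surplus" if surplus and snapshot["battery_soc_pct"] >= 95 else "standard"
    return {"stage": stage, "reasons": ["solar exceeds load"] if surplus else []}

audit_trail: list[dict] = []            # stand-in for the decisions table

observations = collect()
snapshot = build_snapshot(observations)
decision = decide(snapshot)
audit_trail.append({"snapshot": snapshot, "decision": decision})
```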
MQTT publishing can come immediately after this loop exists.

## Open Questions

- Should development use real TimescaleDB from day one, or SQLite/Postgres first?
- What is the exact MQTT topic namespace for HASS/Ganymede integration?
- Which HASS entities should be included in the first read-only state feed?
- How should the `gibil` IPA identity authenticate to MQTT and HASS?
- What high-resolution retention target is acceptable on the Astrape VM?
- Should snapshots be created on a fixed schedule, on new data, or both?
@@ -0,0 +1,105 @@
# Operations

## Web UI

Start the web UI daemon:

```bash
python3 -m gibil.scripts.web_daemon
```

The daemon listens on:

```text
http://0.0.0.0:8080
```

By default the server binds to all network interfaces so it can be reached from another machine. Override the bind address or port if needed:

```bash
export ASTRAPE_WEB_HOST='0.0.0.0'
export ASTRAPE_WEB_PORT='8080'
```

The host process reloads `webui.py` and display modules on each request. The browser polls `/api/ui-version` and refreshes when those files change.

## Systemd Services

Install service units:

```bash
sudo cp deploy/systemd/astrape-web.service /etc/systemd/system/
sudo cp deploy/systemd/astrape-db.service /etc/systemd/system/
sudo systemctl daemon-reload
sudo systemctl enable --now astrape-web.service astrape-db.service
```

Check status:

```bash
systemctl status astrape-web.service
systemctl status astrape-db.service
journalctl -u astrape-web.service -f
journalctl -u astrape-db.service -f
```

Both services run as the IPA-managed `gibil` user from `/mnt/astrape`.

## Database Daemon

Install runtime dependencies:

```bash
python3 -m pip install -r requirements.txt
```

Create a local env file:

```bash
cp env/astrape.env.example env/astrape.env
nano env/astrape.env
```

Required values:

```text
ASTRAPE_DATABASE_URL=postgresql://USER:PASSWORD@HOST:PORT/DBNAME
ASTRAPE_LATITUDE=59.0000
ASTRAPE_LONGITUDE=18.0000
```

Optional values:

```text
ASTRAPE_WEATHER_FORECAST_HOURS=48
ASTRAPE_WEATHER_POLL_SECONDS=3600
ASTRAPE_WEATHER_TRUTH_LOOKBACK_DAYS=14
ASTRAPE_WEATHER_TRUTH_END_DELAY_DAYS=5
```

The daemons load `env/*.env` automatically. Existing process environment variables win over file values.
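That precedence rule can be demonstrated in a few lines. The snippet below mirrors the loader's behavior with a throwaway helper and invented variable names, using a temporary file rather than the real `env/` directory:

```python
import os
import tempfile
from pathlib import Path

# Mirror of the loader's precedence rule: a key already present in the
# process environment is never overwritten by an env file value.
def load_env_file(path: Path) -> None:
    for raw_line in path.read_text().splitlines():
        line = raw_line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, value = line.split("=", 1)
        if key.strip() and key.strip() not in os.environ:
            os.environ[key.strip()] = value.strip()

with tempfile.TemporaryDirectory() as tmp:
    env_file = Path(tmp) / "astrape.env"
    env_file.write_text("DEMO_A=from_file\nDEMO_B=from_file\n")
    os.environ["DEMO_A"] = "from_process"   # pre-existing process value wins
    load_env_file(env_file)
```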
For temporary frontend tuning, enable display-only sample weather data:

```text
ASTRAPE_WEB_SAMPLE_DATA=1
```

This does not write artificial data to TimescaleDB. It only changes the web UI weather API response.

Start the database ingest daemon:

```bash
python3 -m gibil.scripts.db_daemon
```

Current behavior:
- initializes TimescaleDB weather tables
- fetches real Open-Meteo hourly forecasts
- normalizes them through `WeatherBuilder`
- stores rows in `weather_forecast_points`
- fetches Open-Meteo archive data for resolved truth
- stores rows in `weather_resolved_truth`
- repeats every `ASTRAPE_WEATHER_POLL_SECONDS`

No internal weather predictions are generated here. This daemon only stores external forecast and resolved-truth data for later modules.
@@ -0,0 +1,117 @@
# Weather Source Data

## Goal

This subsystem aggregates external weather forecasts and stores them in a clean database-ready shape.

Terminology:
- **forecast**: data from an external weather source, such as Open-Meteo
- **resolved truth**: observed weather for a time that has already happened
- **prediction**: an internal estimate produced by a future Astrape/Gibil model

This module should not produce predictions or confidence scores. A later `weather_predictor.py` subsystem can use this clean forecast database to produce predictions and confidence.

## Subsystem Boundary

Initial classes should stay narrowly scoped:

- `OpenMeteoClient`: fetch raw hourly forecast payloads
- `OpenMeteoParser`: convert API payloads into external forecast runs and points
- `WeatherBuilder`: normalize and select clean forecast records for database use
- `WeatherStore`: persist forecast points and resolved truth

These classes communicate through data models like `WeatherForecastRun`, `WeatherForecastPoint`, and `WeatherResolvedTruth`.

## Core Data Shape

Every weather API pull is a forecast run.

```text
issued_at = when the external forecast was fetched
target_at = the hour being forecast
horizon_hours = target_at - issued_at
forecast_value = external forecast value for that target hour
```
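`horizon_hours` is derived rather than fetched, which a few lines make concrete (the timestamps below are invented samples):

```python
from datetime import datetime, timezone

# The horizon is the gap between when the forecast was issued
# and the hour it targets.
issued_at = datetime(2025, 6, 1, 6, 0, tzinfo=timezone.utc)
target_at = datetime(2025, 6, 1, 10, 0, tzinfo=timezone.utc)

horizon_hours = int((target_at - issued_at).total_seconds() // 3600)
```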
Later, when `target_at` is in the past, Astrape can attach resolved truth:

```text
resolved_at = the hour that actually happened
truth = observed temperature / observed solar radiation
```

That creates rows future modules can use:

```text
target_at | resolved_truth | forecast_1h | forecast_2h | ... | forecast_48h
```

The future predictor can learn from those rows without needing to know anything about Open-Meteo payloads.
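Building those wide rows is a join on the target hour. The tiny in-memory tables below stand in for the real forecast and truth tables, and all values are invented:

```python
from datetime import datetime, timezone

def hour(h: int) -> datetime:
    return datetime(2025, 6, 1, h, tzinfo=timezone.utc)

# (issued_at, target_at, temperature_c) -- invented sample forecasts
forecasts = [
    (hour(8), hour(10), 14.0),   # 2h-ahead forecast for 10:00
    (hour(6), hour(10), 13.0),   # 4h-ahead forecast for 10:00
]
# resolved_at -> observed temperature_c -- invented observed truth
truth = {hour(10): 14.5}

# Pivot horizon-specific forecasts into one row per target hour.
rows: dict[datetime, dict] = {}
for issued_at, target_at, value in forecasts:
    horizon = int((target_at - issued_at).total_seconds() // 3600)
    row = rows.setdefault(target_at, {"resolved_truth": truth.get(target_at)})
    row[f"forecast_{horizon}h"] = value

row = rows[hour(10)]
```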
## First Variables

Use the Open-Meteo hourly forecast fields:

- `temperature_2m`
- `shortwave_radiation`
- `cloud_cover`

Open-Meteo documents `shortwave_radiation` as the average incoming solar radiation over the preceding hour at the surface, equivalent to GHI, measured in W/m2. That makes it the right starting solar forecast variable for Astrape.

## Storage Shape

Forecast points should be stored as individual rows.

Core fields:
- `issued_at`
- `target_at`
- `horizon_hours`
- `source`
- `temperature_c`
- `shortwave_radiation_w_m2`
- `cloud_cover_pct`

Resolved truth should be stored separately. For now, resolved truth comes from the Open-Meteo historical archive API.

Until archive data is available, Astrape can also store the current 0-hour Open-Meteo forecast as provisional truth with `source = open_meteo_zero_hour`. This gives the UI and future joins a near-real-time truth line. Archive truth remains separate with `source = open_meteo_archive`, so later modules can choose whether to prefer archive actuals over provisional 0-hour values.

Core fields:
- `resolved_at`
- `source`
- `temperature_c`
- `shortwave_radiation_w_m2`

The future predictor can join forecast points to truth by `target_at = resolved_at`.

Open-Meteo archive data can lag behind the current time depending on model availability, so the database daemon backfills a configurable historical window instead of assuming the last completed hour is immediately available.

## Visual Explorer

We should build a small web output for inspecting forecast history.

Useful first view:
- select a weather variable, such as temperature or shortwave radiation
- select forecast horizons, such as 2h and 4h
- overlay those horizon-specific external forecasts against resolved truth
- plot by `target_at`

Example:

```text
target_at on x-axis
temperature_c on y-axis
line 1: Open-Meteo forecast made 2 hours before target_at
line 2: Open-Meteo forecast made 4 hours before target_at
line 3: resolved truth
```

This visual layer should read from the cleaned weather database. It should not be part of the Open-Meteo client or parser.

## First Implementation Slice

1. Fetch one Open-Meteo-style hourly forecast run.
2. Parse it into forecast points.
3. Normalize the run through `WeatherBuilder`.
4. Store forecast points through `WeatherStore`.
5. Add resolved truth rows when we have a source for observed weather.
6. Build the visual explorer after forecast/truth storage exists.
@@ -0,0 +1,2 @@
"""Gibil intelligence package for Astrape."""

@@ -0,0 +1,2 @@
"""Core classes used by Gibil."""

@@ -0,0 +1,81 @@
from __future__ import annotations

from datetime import datetime, timezone

from gibil.classes.models import Decision, PowerStage, Snapshot


class GibilAgent:
    """Stateful decision engine for Astrape."""

    def __init__(self) -> None:
        self.surplus_latch_set = False
        self.previous_stage: PowerStage | None = None

    def decide(self, snapshot: Snapshot) -> Decision:
        reasons: list[str] = []

        self._update_surplus_latch(snapshot, reasons)

        if self.surplus_latch_set:
            stage = PowerStage.SURPLUS
            reasons.append("surplus latch is set")
        elif snapshot.cheap_window_active:
            stage = PowerStage.CHEAP_GRID
            reasons.append("cheap grid window is active")
        elif self._should_conserve(snapshot):
            stage = PowerStage.CONSERVE
            reasons.append("battery is low and there is no useful solar surplus")
        else:
            stage = PowerStage.STANDARD
            reasons.append("no surplus, cheap window, or conserve condition is active")

        if self.previous_stage != stage:
            previous = self.previous_stage.value if self.previous_stage else "none"
            reasons.append(f"stage changed from {previous} to {stage.value}")

        self.previous_stage = stage

        return Decision(
            created_at=datetime.now(timezone.utc),
            stage=stage,
            reasons=reasons,
            confidence=self._confidence(snapshot),
        )

    def _update_surplus_latch(
        self, snapshot: Snapshot, reasons: list[str]
    ) -> None:
        if snapshot.battery_soc_pct is None or snapshot.solar_power_w is None:
            return

        home_power_w = snapshot.home_power_w or 0
        has_surplus = snapshot.solar_power_w > home_power_w

        if not self.surplus_latch_set:
            if snapshot.battery_soc_pct >= 95 and has_surplus:
                self.surplus_latch_set = True
                reasons.append("surplus latch set: battery >= 95% and solar exceeds load")
            return

        if snapshot.battery_soc_pct < 80 or not has_surplus:
            self.surplus_latch_set = False
            reasons.append("surplus latch cleared: battery < 80% or surplus ended")

    def _should_conserve(self, snapshot: Snapshot) -> bool:
        if snapshot.battery_soc_pct is None:
            return False

        solar_power_w = snapshot.solar_power_w or 0
        home_power_w = snapshot.home_power_w or 0

        return snapshot.battery_soc_pct < 25 and solar_power_w < home_power_w

    def _confidence(self, snapshot: Snapshot) -> float:
        expected_inputs = [
            snapshot.solar_power_w,
            snapshot.home_power_w,
            snapshot.battery_soc_pct,
        ]
        present = sum(value is not None for value in expected_inputs)
        return present / len(expected_inputs)
@@ -0,0 +1,37 @@
from __future__ import annotations

from os import environ
from pathlib import Path


class EnvLoader:
    """Loads Astrape env files without overriding process environment."""

    def __init__(self, env_dir: Path | None = None) -> None:
        if env_dir is None:
            env_dir = Path(__file__).resolve().parents[2] / "env"
        self.env_dir = env_dir

    def load(self) -> None:
        if not self.env_dir.exists():
            return

        for path in sorted(self.env_dir.glob("*.env")):
            self._load_file(path)

    def _load_file(self, path: Path) -> None:
        for raw_line in path.read_text(encoding="utf-8").splitlines():
            line = raw_line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue

            key, value = line.split("=", 1)
            key = key.strip()
            value = self._clean_value(value.strip())
            if key and key not in environ:
                environ[key] = value

    def _clean_value(self, value: str) -> str:
        if len(value) >= 2 and value[0] == value[-1] and value[0] in {"'", '"'}:
            return value[1:-1]
        return value
@@ -0,0 +1,82 @@
from __future__ import annotations

from dataclasses import dataclass, field
from datetime import datetime, timezone
from enum import Enum
from typing import Any


class ObservationQuality(str, Enum):
    OK = "ok"
    STALE = "stale"
    ESTIMATED = "estimated"
    MISSING = "missing"
    ERROR = "error"


class PowerStage(str, Enum):
    ALWAYS = "always"
    SURPLUS = "surplus"
    CHEAP_GRID = "cheap_grid"
    STANDARD = "standard"
    CONSERVE = "conserve"


@dataclass(frozen=True)
class Observation:
    source: str
    metric: str
    value: int | float | str | bool | None
    unit: str = "none"
    quality: ObservationQuality = ObservationQuality.OK
    observed_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
    received_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
    metadata: dict[str, Any] = field(default_factory=dict)


@dataclass(frozen=True)
class Snapshot:
    created_at: datetime
    solar_power_w: float | None = None
    home_power_w: float | None = None
    battery_soc_pct: float | None = None
    grid_import_w: float | None = None
    grid_export_w: float | None = None
    cheap_window_active: bool = False
    input_quality: dict[str, ObservationQuality] = field(default_factory=dict)


@dataclass(frozen=True)
class Decision:
    created_at: datetime
    stage: PowerStage
    reasons: list[str]
    confidence: float


@dataclass(frozen=True)
class WeatherForecastPoint:
    issued_at: datetime
    target_at: datetime
    horizon_hours: int
    temperature_c: float | None
    shortwave_radiation_w_m2: float | None
    cloud_cover_pct: float | None = None
    source: str = "open_meteo"


@dataclass(frozen=True)
class WeatherForecastRun:
    issued_at: datetime
    source: str
    latitude: float
    longitude: float
    points: list[WeatherForecastPoint]


@dataclass(frozen=True)
class WeatherResolvedTruth:
    resolved_at: datetime
    temperature_c: float | None
    shortwave_radiation_w_m2: float | None
    source: str
@@ -0,0 +1,63 @@
from __future__ import annotations

from datetime import datetime, timezone

from gibil.classes.models import Observation, ObservationQuality, Snapshot


class SnapshotBuilder:
    """Builds Gibil's decision input from the latest observations."""

    def build(self, observations: list[Observation]) -> Snapshot:
        latest = self._latest_by_metric(observations)

        return Snapshot(
            created_at=datetime.now(timezone.utc),
            solar_power_w=self._number(latest, "solar_power_w"),
            home_power_w=self._number(latest, "home_power_w"),
            battery_soc_pct=self._number(latest, "battery_soc_pct"),
            grid_import_w=self._number(latest, "grid_import_w"),
            grid_export_w=self._number(latest, "grid_export_w"),
            cheap_window_active=self._boolean(latest, "cheap_window_active"),
            input_quality={
                metric: observation.quality for metric, observation in latest.items()
            },
        )

    def _latest_by_metric(
        self, observations: list[Observation]
    ) -> dict[str, Observation]:
        latest: dict[str, Observation] = {}

        for observation in observations:
            existing = latest.get(observation.metric)
            if existing is None or observation.observed_at > existing.observed_at:
                latest[observation.metric] = observation

        return latest

    def _number(
        self, observations: dict[str, Observation], metric: str
    ) -> float | None:
        observation = observations.get(metric)
        if observation is None or observation.quality != ObservationQuality.OK:
            return None

        if isinstance(observation.value, bool):
            return None

        if isinstance(observation.value, int | float):
            return float(observation.value)

        return None

    def _boolean(self, observations: dict[str, Observation], metric: str) -> bool:
        observation = observations.get(metric)
        if observation is None or observation.quality != ObservationQuality.OK:
            return False

        if isinstance(observation.value, bool):
            return observation.value

        return False
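The core of the builder is the newest-wins selection per metric. A minimal self-contained sketch of that logic; the `Observation` class here is a simplified stand-in for the gibil model, not the real one:

```python
from dataclasses import dataclass
from datetime import datetime, timezone


@dataclass(frozen=True)
class Observation:  # simplified stand-in for gibil.classes.models.Observation
    metric: str
    value: float
    observed_at: datetime


def latest_by_metric(observations: list[Observation]) -> dict[str, Observation]:
    # Keep only the newest observation per metric, mirroring
    # SnapshotBuilder._latest_by_metric above.
    latest: dict[str, Observation] = {}
    for observation in observations:
        existing = latest.get(observation.metric)
        if existing is None or observation.observed_at > existing.observed_at:
            latest[observation.metric] = observation
    return latest


stale = Observation("solar_power_w", 120.0, datetime(2024, 1, 1, 10, tzinfo=timezone.utc))
fresh = Observation("solar_power_w", 340.0, datetime(2024, 1, 1, 11, tzinfo=timezone.utc))
latest = latest_by_metric([stale, fresh])
print(latest["solar_power_w"].value)  # → 340.0
```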
@@ -0,0 +1,251 @@
from __future__ import annotations

from datetime import date, datetime, timezone
from typing import Any
from urllib.parse import urlencode
from urllib.request import urlopen
import json

from gibil.classes.models import (
    WeatherForecastPoint,
    WeatherForecastRun,
    WeatherResolvedTruth,
)


class OpenMeteoClient:
    """Fetches external weather forecasts from Open-Meteo."""

    base_url = "https://api.open-meteo.com/v1/forecast"

    def build_url(
        self,
        latitude: float,
        longitude: float,
        forecast_hours: int = 48,
        timezone_name: str = "UTC",
    ) -> str:
        params = {
            "latitude": latitude,
            "longitude": longitude,
            "hourly": ",".join(
                [
                    "temperature_2m",
                    "shortwave_radiation",
                    "cloud_cover",
                ]
            ),
            "forecast_hours": forecast_hours,
            "timezone": timezone_name,
        }
        return f"{self.base_url}?{urlencode(params)}"

    def fetch_forecast(
        self,
        latitude: float,
        longitude: float,
        forecast_hours: int = 48,
    ) -> WeatherForecastRun:
        url = self.build_url(latitude, longitude, forecast_hours)
        with urlopen(url, timeout=10) as response:
            payload = json.loads(response.read().decode("utf-8"))

        return OpenMeteoParser().parse_forecast(
            payload=payload,
            latitude=latitude,
            longitude=longitude,
            issued_at=datetime.now(timezone.utc),
        )


class OpenMeteoArchiveClient:
    """Fetches historical weather data from the Open-Meteo archive."""

    base_url = "https://archive-api.open-meteo.com/v1/archive"

    def build_url(
        self,
        latitude: float,
        longitude: float,
        start_date: date,
        end_date: date,
        timezone_name: str = "UTC",
    ) -> str:
        params = {
            "latitude": latitude,
            "longitude": longitude,
            "start_date": start_date.isoformat(),
            "end_date": end_date.isoformat(),
            "hourly": ",".join(
                [
                    "temperature_2m",
                    "shortwave_radiation",
                ]
            ),
            "timezone": timezone_name,
        }
        return f"{self.base_url}?{urlencode(params)}"

    def fetch_resolved_truth(
        self,
        latitude: float,
        longitude: float,
        start_date: date,
        end_date: date,
    ) -> list[WeatherResolvedTruth]:
        url = self.build_url(latitude, longitude, start_date, end_date)
        with urlopen(url, timeout=20) as response:
            payload = json.loads(response.read().decode("utf-8"))

        return OpenMeteoArchiveParser().parse_resolved_truth(payload)


class OpenMeteoParser:
    """Converts Open-Meteo JSON into clean external forecast records."""

    def parse_forecast(
        self,
        payload: dict[str, Any],
        latitude: float,
        longitude: float,
        issued_at: datetime,
    ) -> WeatherForecastRun:
        hourly = payload.get("hourly", {})
        times = hourly.get("time", [])
        temperatures = hourly.get("temperature_2m", [])
        radiation = hourly.get("shortwave_radiation", [])
        cloud_cover = hourly.get("cloud_cover", [])

        points: list[WeatherForecastPoint] = []
        for index, raw_time in enumerate(times):
            target_at = self._parse_time(raw_time)
            horizon_hours = max(
                0, round((target_at - issued_at).total_seconds() / 3600)
            )

            points.append(
                WeatherForecastPoint(
                    issued_at=issued_at,
                    target_at=target_at,
                    horizon_hours=horizon_hours,
                    temperature_c=self._at(temperatures, index),
                    shortwave_radiation_w_m2=self._at(radiation, index),
                    cloud_cover_pct=self._at(cloud_cover, index),
                )
            )

        return WeatherForecastRun(
            issued_at=issued_at,
            source="open_meteo",
            latitude=latitude,
            longitude=longitude,
            points=points,
        )

    def _parse_time(self, raw_time: str) -> datetime:
        parsed = datetime.fromisoformat(raw_time)
        if parsed.tzinfo is None:
            return parsed.replace(tzinfo=timezone.utc)
        return parsed.astimezone(timezone.utc)

    def _at(self, values: list[Any], index: int) -> float | None:
        if index >= len(values):
            return None

        value = values[index]
        if value is None:
            return None

        return float(value)


class OpenMeteoArchiveParser:
    """Converts Open-Meteo archive JSON into resolved truth records."""

    def parse_resolved_truth(self, payload: dict[str, Any]) -> list[WeatherResolvedTruth]:
        hourly = payload.get("hourly", {})
        times = hourly.get("time", [])
        temperatures = hourly.get("temperature_2m", [])
        radiation = hourly.get("shortwave_radiation", [])

        truth: list[WeatherResolvedTruth] = []
        for index, raw_time in enumerate(times):
            truth.append(
                WeatherResolvedTruth(
                    resolved_at=self._parse_time(raw_time),
                    temperature_c=self._at(temperatures, index),
                    shortwave_radiation_w_m2=self._at(radiation, index),
                    source="open_meteo_archive",
                )
            )

        return truth

    def _parse_time(self, raw_time: str) -> datetime:
        parsed = datetime.fromisoformat(raw_time)
        if parsed.tzinfo is None:
            return parsed.replace(tzinfo=timezone.utc)
        return parsed.astimezone(timezone.utc)

    def _at(self, values: list[Any], index: int) -> float | None:
        if index >= len(values):
            return None

        value = values[index]
        if value is None:
            return None

        return float(value)


class WeatherBuilder:
    """Builds a clean database-ready set of external weather forecast records."""

    def build_forecast_run(
        self,
        source: str,
        latitude: float,
        longitude: float,
        points: list[WeatherForecastPoint],
        issued_at: datetime | None = None,
    ) -> WeatherForecastRun:
        if issued_at is None:
            issued_at = datetime.now(timezone.utc)

        clean_points = [
            WeatherForecastPoint(
                issued_at=issued_at,
                target_at=point.target_at,
                horizon_hours=max(
                    0, round((point.target_at - issued_at).total_seconds() / 3600)
                ),
                temperature_c=point.temperature_c,
                shortwave_radiation_w_m2=point.shortwave_radiation_w_m2,
                cloud_cover_pct=point.cloud_cover_pct,
                source=source,
            )
            for point in sorted(points, key=lambda item: item.target_at)
        ]

        return WeatherForecastRun(
            issued_at=issued_at,
            source=source,
            latitude=latitude,
            longitude=longitude,
            points=clean_points,
        )

    def points_for_horizon(
        self,
        forecast_runs: list[WeatherForecastRun],
        horizon_hours: int,
    ) -> list[WeatherForecastPoint]:
        points: list[WeatherForecastPoint] = []
        for run in forecast_runs:
            points.extend(
                point
                for point in run.points
                if point.horizon_hours == horizon_hours
            )

        return sorted(points, key=lambda point: point.target_at)
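The horizon bookkeeping above reduces to two steps: normalize the hourly ISO timestamp to UTC, then round the issued-to-target delta to whole hours. A self-contained sketch mirroring `_parse_time` and the `horizon_hours` computation:

```python
from datetime import datetime, timezone


def parse_time(raw_time: str) -> datetime:
    # Open-Meteo hourly times arrive as naive ISO strings when the
    # requested timezone is UTC; treat naive values as UTC, mirroring
    # OpenMeteoParser._parse_time above.
    parsed = datetime.fromisoformat(raw_time)
    if parsed.tzinfo is None:
        return parsed.replace(tzinfo=timezone.utc)
    return parsed.astimezone(timezone.utc)


issued_at = datetime(2024, 1, 1, 6, 0, tzinfo=timezone.utc)
target_at = parse_time("2024-01-01T18:00")
horizon_hours = max(0, round((target_at - issued_at).total_seconds() / 3600))
print(horizon_hours)  # → 12
```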
@@ -0,0 +1,310 @@
from __future__ import annotations

import json
from dataclasses import dataclass
from datetime import datetime

from gibil.classes.models import WeatherForecastPoint, WeatherResolvedTruth


@dataclass(frozen=True)
class WeatherDisplayDataset:
    forecast_points: list[WeatherForecastPoint]
    resolved_truth: list[WeatherResolvedTruth]


class WeatherDisplay:
    """Renders weather source data for the Astrape web UI."""

    def render(self) -> str:
        return """
<section class="panel weather-panel" data-module="weather-display">
  <div class="panel-heading">
    <div>
      <h2>Weather</h2>
      <p>External forecast history</p>
    </div>
    <div class="control-row">
      <label>
        Variable
        <select id="weather-variable">
          <option value="temperature_c">Temperature</option>
          <option value="shortwave_radiation_w_m2">Solar radiation</option>
        </select>
      </label>
      <div class="legend-control">
        <div class="legend-title">Horizons</div>
        <div id="weather-horizons" class="horizon-options"></div>
        <div class="horizon-option">
          <span class="legend-swatch truth-swatch"></span>
          <span>Resolved truth</span>
        </div>
      </div>
    </div>
  </div>
  <div class="chart-shell">
    <canvas id="weather-chart" width="1100" height="420"></canvas>
  </div>
</section>
<script>
window.astrapeModules = window.astrapeModules || {};
window.astrapeModules.weatherDisplay = (() => {
  const palette = ["#60a5fa", "#34d399", "#fbbf24", "#a78bfa", "#fb7185", "#22d3ee"];
  const defaultSlots = [
    { enabled: true, horizon: 4 },
    { enabled: true, horizon: 8 },
    { enabled: true, horizon: 12 },
    { enabled: false, horizon: 16 },
    { enabled: false, horizon: 24 },
    { enabled: false, horizon: 36 },
  ];

  function init() {
    const variable = document.getElementById("weather-variable");
    variable.addEventListener("change", render);
    buildHorizonControls();
    refresh();
    setInterval(refresh, 5000);
  }

  async function refresh() {
    const response = await fetch("/api/weather", { cache: "no-store" });
    const payload = await response.json();
    window.astrapeWeatherData = payload;
    render();
  }

  function buildHorizonControls() {
    const horizons = document.getElementById("weather-horizons");
    horizons.innerHTML = "";
    const slots = loadSlots();

    slots.forEach((slot, index) => {
      const option = document.createElement("div");
      option.className = "horizon-option";
      option.innerHTML = `
        <input class="horizon-enabled" type="checkbox" ${slot.enabled ? "checked" : ""}>
        <span class="legend-swatch" style="background: ${palette[index]}"></span>
        <input class="horizon-value" type="number" min="1" max="47" step="1" value="${slot.horizon}">
        <span>h</span>
      `;
      const checkbox = option.querySelector(".horizon-enabled");
      const value = option.querySelector(".horizon-value");
      checkbox.addEventListener("change", render);
      value.addEventListener("input", render);
      horizons.appendChild(option);
    });
  }

  function render() {
    const payload = window.astrapeWeatherData || { forecast_points: [], resolved_truth: [] };
    const variable = document.getElementById("weather-variable").value;
    const selectedHorizons = selectedSlots();
    saveSlots();
    drawChart(payload, variable, selectedHorizons);
  }

  function drawChart(payload, variable, selectedHorizons) {
    const canvas = document.getElementById("weather-chart");
    const ctx = canvas.getContext("2d");
    const series = buildSeries(payload, variable, selectedHorizons);

    ctx.clearRect(0, 0, canvas.width, canvas.height);

    const allPoints = series.flatMap((item) => item.points);
    const now = Date.now();
    const xs = allPoints.map((point) => new Date(point.target_at).getTime());
    xs.push(now);
    const ys = allPoints.map((point) => point.value).filter((value) => value !== null);
    if (!xs.length || !ys.length) return;

    const bounds = {
      minX: Math.min(...xs),
      maxX: Math.max(...xs),
      minY: Math.min(...ys),
      maxY: Math.max(...ys),
    };
    if (bounds.minY === bounds.maxY) {
      bounds.minY -= 1;
      bounds.maxY += 1;
    }

    drawAxes(ctx, canvas, bounds);
    drawNowMarker(ctx, canvas, bounds);
    series.forEach((item) => {
      drawSeries(ctx, canvas, bounds, item.points, item.color, item.width);
    });
  }

  function buildSeries(payload, variable, selectedHorizons) {
    const series = [];
    for (const slot of selectedHorizons) {
      const points = (payload.forecast_points || [])
        .filter((point) => point.horizon_hours === slot.horizon)
        .map((point) => ({ target_at: point.target_at, value: point[variable] }))
        .filter((point) => point.value !== null);
      series.push({ label: `${slot.horizon}h forecast`, points, color: slot.color, width: 2 });
    }

    const truth = (payload.resolved_truth || [])
      .map((point) => ({ target_at: point.resolved_at, value: point[variable] }))
      .filter((point) => point.value !== null);
    series.push({ label: "resolved truth", points: truth, color: "#f8fafc", width: 3 });
    return series;
  }

  function drawAxes(ctx, canvas, bounds) {
    const margin = chartMargin();
    ctx.strokeStyle = "#94a3b8";
    ctx.lineWidth = 1;
    ctx.beginPath();
    ctx.moveTo(margin.left, margin.top);
    ctx.lineTo(margin.left, canvas.height - margin.bottom);
    ctx.lineTo(canvas.width - margin.right, canvas.height - margin.bottom);
    ctx.stroke();

    ctx.fillStyle = "#475569";
    ctx.font = "12px system-ui";
    ctx.fillText(bounds.maxY.toFixed(1), 10, margin.top + 4);
    ctx.fillText(bounds.minY.toFixed(1), 10, canvas.height - margin.bottom);
  }

  function drawSeries(ctx, canvas, bounds, points, color, width) {
    if (!points.length) return;
    const margin = chartMargin();
    if (points.length === 1) {
      const point = points[0];
      const x = scale(new Date(point.target_at).getTime(), bounds.minX, bounds.maxX, margin.left, canvas.width - margin.right);
      const y = scale(point.value, bounds.minY, bounds.maxY, canvas.height - margin.bottom, margin.top);
      ctx.fillStyle = color;
      ctx.beginPath();
      ctx.arc(x, y, width + 3, 0, Math.PI * 2);
      ctx.fill();
      return;
    }

    ctx.strokeStyle = color;
    ctx.lineWidth = width;
    ctx.beginPath();

    points.forEach((point, index) => {
      const x = scale(new Date(point.target_at).getTime(), bounds.minX, bounds.maxX, margin.left, canvas.width - margin.right);
      const y = scale(point.value, bounds.minY, bounds.maxY, canvas.height - margin.bottom, margin.top);
      if (index === 0) ctx.moveTo(x, y);
      else ctx.lineTo(x, y);
    });
    ctx.stroke();
  }

  function drawNowMarker(ctx, canvas, bounds) {
    const now = Date.now();
    if (now < bounds.minX || now > bounds.maxX) return;

    const margin = chartMargin();
    const x = scale(now, bounds.minX, bounds.maxX, margin.left, canvas.width - margin.right);

    ctx.save();
    ctx.strokeStyle = "#f8fafc";
    ctx.lineWidth = 1;
    ctx.setLineDash([5, 5]);
    ctx.beginPath();
    ctx.moveTo(x, margin.top);
    ctx.lineTo(x, canvas.height - margin.bottom);
    ctx.stroke();
    ctx.setLineDash([]);
    ctx.fillStyle = "#f8fafc";
    ctx.font = "12px system-ui";
    ctx.fillText("now", Math.min(x + 8, canvas.width - margin.right - 28), margin.top + 14);
    ctx.restore();
  }

  function selectedSlots() {
    return [...document.querySelectorAll("#weather-horizons .horizon-option")]
      .map((item, index) => {
        const enabled = item.querySelector(".horizon-enabled").checked;
        const input = item.querySelector(".horizon-value");
        const horizon = clamp(Number(input.value), 1, 47);
        input.value = horizon;
        return enabled ? { horizon, color: palette[index] } : null;
      })
      .filter(Boolean);
  }

  function saveSlots() {
    const slots = [...document.querySelectorAll("#weather-horizons .horizon-option")]
      .map((item) => ({
        enabled: item.querySelector(".horizon-enabled").checked,
        horizon: Number(item.querySelector(".horizon-value").value),
      }));
    localStorage.setItem("astrapeWeatherHorizonSlots", JSON.stringify(slots));
  }

  function loadSlots() {
    try {
      const parsed = JSON.parse(localStorage.getItem("astrapeWeatherHorizonSlots"));
      if (Array.isArray(parsed) && parsed.length === 6) return parsed;
    } catch (error) {
      return defaultSlots;
    }
    return defaultSlots;
  }

  function clamp(value, min, max) {
    if (!Number.isFinite(value)) return min;
    return Math.min(max, Math.max(min, Math.round(value)));
  }

  function chartMargin() {
    return { top: 24, right: 28, bottom: 34, left: 52 };
  }

  function scale(value, inMin, inMax, outMin, outMax) {
    if (inMin === inMax) return (outMin + outMax) / 2;
    return outMin + ((value - inMin) / (inMax - inMin)) * (outMax - outMin);
  }

  return { init };
})();
window.astrapeModules.weatherDisplay.init();
</script>
"""

    def data_payload(self, dataset: WeatherDisplayDataset | None = None) -> str:
        if dataset is None:
            dataset = WeatherDisplayDataset(forecast_points=[], resolved_truth=[])

        forecast_points = [self._forecast_point(point) for point in dataset.forecast_points]
        resolved_truth = [self._truth_point(point) for point in dataset.resolved_truth]
        horizons = sorted({point["horizon_hours"] for point in forecast_points})

        return json.dumps(
            {
                "forecast_points": forecast_points,
                "resolved_truth": resolved_truth,
                "horizons": horizons,
                "min_horizon": 1,
                "max_horizon": 47,
            }
        )

    def _forecast_point(self, point: WeatherForecastPoint) -> dict[str, object]:
        return {
            "issued_at": self._iso(point.issued_at),
            "target_at": self._iso(point.target_at),
            "horizon_hours": point.horizon_hours,
            "source": point.source,
            "temperature_c": point.temperature_c,
            "shortwave_radiation_w_m2": point.shortwave_radiation_w_m2,
            "cloud_cover_pct": point.cloud_cover_pct,
        }

    def _truth_point(self, point: WeatherResolvedTruth) -> dict[str, object]:
        return {
            "resolved_at": self._iso(point.resolved_at),
            "source": point.source,
            "temperature_c": point.temperature_c,
            "shortwave_radiation_w_m2": point.shortwave_radiation_w_m2,
        }

    def _iso(self, value: datetime) -> str:
        return value.isoformat()
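The canvas drawing above relies on a single linear interpolation from data space to pixel space. A Python transcription of the chart's `scale()` helper, using the margins defined above (left 52, right 28 on an 1100px canvas) as the example output range:

```python
def scale(value: float, in_min: float, in_max: float,
          out_min: float, out_max: float) -> float:
    # Linear interpolation from [in_min, in_max] to [out_min, out_max];
    # a degenerate input range maps everything to the output midpoint,
    # matching the JS scale() helper above.
    if in_min == in_max:
        return (out_min + out_max) / 2
    return out_min + ((value - in_min) / (in_max - in_min)) * (out_max - out_min)


# A value halfway through the data range lands halfway across the
# drawable x-range (52 .. 1100 - 28 = 1072).
print(scale(50.0, 0.0, 100.0, 52.0, 1072.0))  # → 562.0
```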
@@ -0,0 +1,74 @@
from __future__ import annotations

from datetime import datetime, timedelta, timezone
from math import pi, sin

from gibil.classes.models import WeatherForecastPoint, WeatherResolvedTruth
from gibil.classes.weather_display import WeatherDisplayDataset


class WeatherSampleData:
    """Builds temporary display-only weather data for UI tuning."""

    def build(self, hours: int = 72) -> WeatherDisplayDataset:
        now = datetime.now(timezone.utc).replace(minute=0, second=0, microsecond=0)
        start = now - timedelta(hours=hours)
        horizons = [2, 4, 8, 12, 24]

        forecast_points: list[WeatherForecastPoint] = []
        resolved_truth: list[WeatherResolvedTruth] = []

        for offset in range(hours + 1):
            target_at = start + timedelta(hours=offset)
            truth_temperature = self._temperature(target_at, offset)
            truth_solar = self._solar(target_at, offset)

            resolved_truth.append(
                WeatherResolvedTruth(
                    resolved_at=target_at,
                    source="sample",
                    temperature_c=truth_temperature,
                    shortwave_radiation_w_m2=truth_solar,
                )
            )

            for horizon in horizons:
                forecast_points.append(
                    WeatherForecastPoint(
                        issued_at=target_at - timedelta(hours=horizon),
                        target_at=target_at,
                        horizon_hours=horizon,
                        temperature_c=truth_temperature
                        + self._temperature_error(offset, horizon),
                        shortwave_radiation_w_m2=max(
                            0,
                            truth_solar + self._solar_error(offset, horizon),
                        ),
                        cloud_cover_pct=max(
                            0,
                            min(100, 45 + 30 * sin((offset + horizon) / 9)),
                        ),
                        source="sample",
                    )
                )

        return WeatherDisplayDataset(
            forecast_points=forecast_points,
            resolved_truth=resolved_truth,
        )

    def _temperature(self, target_at: datetime, offset: int) -> float:
        daily = sin(((target_at.hour - 7) / 24) * 2 * pi)
        slow = sin(offset / 18)
        return round(6.5 + daily * 5.5 + slow * 1.3, 1)

    def _solar(self, target_at: datetime, offset: int) -> float:
        daylight = max(0, sin(((target_at.hour - 5) / 15) * pi))
        cloud_effect = 0.75 + 0.25 * sin(offset / 7)
        return round(780 * daylight * cloud_effect, 1)

    def _temperature_error(self, offset: int, horizon: int) -> float:
        return round((horizon / 8) * sin((offset + horizon) / 5), 1)

    def _solar_error(self, offset: int, horizon: int) -> float:
        return round((horizon * 9) * sin((offset + horizon) / 4), 1)
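The synthetic truth curve is just a diurnal sine plus a slow drift. A self-contained check of the temperature shape, mirroring `_temperature` above: with zero drift (offset 0), hour 13 sits at the daily peak because the sine crosses its maximum six hours after 07:00:

```python
from math import pi, sin


def sample_temperature(hour_of_day: int, offset: int) -> float:
    # Diurnal sine anchored at 07:00 plus a slow drift term,
    # matching WeatherSampleData._temperature above.
    daily = sin(((hour_of_day - 7) / 24) * 2 * pi)
    slow = sin(offset / 18)
    return round(6.5 + daily * 5.5 + slow * 1.3, 1)


print(sample_temperature(13, 0))  # → 12.0 (peak: 6.5 + 5.5)
print(sample_temperature(1, 0))   # → 1.0 (trough: 6.5 - 5.5)
```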
@@ -0,0 +1,278 @@
from __future__ import annotations

from contextlib import contextmanager
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone
from os import environ
from typing import Iterator

from gibil.classes.models import WeatherForecastPoint, WeatherForecastRun, WeatherResolvedTruth
from gibil.classes.weather_display import WeatherDisplayDataset


class WeatherStoreConfigurationError(RuntimeError):
    pass


@dataclass(frozen=True)
class WeatherStoreConfig:
    database_url: str

    @classmethod
    def from_env(cls) -> "WeatherStoreConfig":
        database_url = environ.get("ASTRAPE_DATABASE_URL")
        if not database_url:
            raise WeatherStoreConfigurationError(
                "ASTRAPE_DATABASE_URL is required for weather storage"
            )

        return cls(database_url=database_url)


class WeatherStore:
    """Persists external weather forecasts and resolved truth in TimescaleDB."""

    def __init__(self, config: WeatherStoreConfig) -> None:
        self.config = config

    @classmethod
    def from_env(cls) -> "WeatherStore":
        return cls(WeatherStoreConfig.from_env())

    def initialize(self) -> None:
        with self._connection() as connection:
            with connection.cursor() as cursor:
                cursor.execute("CREATE EXTENSION IF NOT EXISTS timescaledb")
                cursor.execute(
                    """
                    CREATE TABLE IF NOT EXISTS weather_forecast_points (
                        issued_at TIMESTAMPTZ NOT NULL,
                        target_at TIMESTAMPTZ NOT NULL,
                        horizon_hours INTEGER NOT NULL,
                        source TEXT NOT NULL,
                        latitude DOUBLE PRECISION NOT NULL,
                        longitude DOUBLE PRECISION NOT NULL,
                        temperature_c DOUBLE PRECISION,
                        shortwave_radiation_w_m2 DOUBLE PRECISION,
                        cloud_cover_pct DOUBLE PRECISION,
                        inserted_at TIMESTAMPTZ NOT NULL DEFAULT now(),
                        PRIMARY KEY (issued_at, target_at, source)
                    )
                    """
                )
                cursor.execute(
                    """
                    SELECT create_hypertable(
                        'weather_forecast_points',
                        'target_at',
                        if_not_exists => TRUE
                    )
                    """
                )
                cursor.execute(
                    """
                    CREATE TABLE IF NOT EXISTS weather_resolved_truth (
                        resolved_at TIMESTAMPTZ NOT NULL,
                        source TEXT NOT NULL,
                        temperature_c DOUBLE PRECISION,
                        shortwave_radiation_w_m2 DOUBLE PRECISION,
                        inserted_at TIMESTAMPTZ NOT NULL DEFAULT now(),
                        PRIMARY KEY (resolved_at, source)
                    )
                    """
                )
                cursor.execute(
                    """
                    SELECT create_hypertable(
                        'weather_resolved_truth',
                        'resolved_at',
                        if_not_exists => TRUE
                    )
                    """
                )
            connection.commit()

    def save_forecast_run(self, forecast_run: WeatherForecastRun) -> int:
        rows = [
            (
                point.issued_at,
                point.target_at,
                point.horizon_hours,
                forecast_run.source,
                forecast_run.latitude,
                forecast_run.longitude,
                point.temperature_c,
                point.shortwave_radiation_w_m2,
                point.cloud_cover_pct,
            )
            for point in forecast_run.points
        ]
        if not rows:
            return 0

        with self._connection() as connection:
            with connection.cursor() as cursor:
                cursor.executemany(
                    """
                    INSERT INTO weather_forecast_points (
                        issued_at,
                        target_at,
                        horizon_hours,
                        source,
                        latitude,
                        longitude,
                        temperature_c,
                        shortwave_radiation_w_m2,
                        cloud_cover_pct
                    )
                    VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s)
                    ON CONFLICT (issued_at, target_at, source)
                    DO UPDATE SET
                        horizon_hours = EXCLUDED.horizon_hours,
                        latitude = EXCLUDED.latitude,
                        longitude = EXCLUDED.longitude,
                        temperature_c = EXCLUDED.temperature_c,
                        shortwave_radiation_w_m2 = EXCLUDED.shortwave_radiation_w_m2,
                        cloud_cover_pct = EXCLUDED.cloud_cover_pct,
                        inserted_at = now()
                    """,
                    rows,
                )
            connection.commit()

        return len(rows)

    def save_resolved_truth(self, truth_points: list[WeatherResolvedTruth]) -> int:
        rows = [
            (
                point.resolved_at,
                point.source,
                point.temperature_c,
                point.shortwave_radiation_w_m2,
            )
            for point in truth_points
        ]
        if not rows:
            return 0

        with self._connection() as connection:
            with connection.cursor() as cursor:
                cursor.executemany(
                    """
                    INSERT INTO weather_resolved_truth (
                        resolved_at,
                        source,
                        temperature_c,
                        shortwave_radiation_w_m2
                    )
                    VALUES (%s, %s, %s, %s)
                    ON CONFLICT (resolved_at, source)
                    DO UPDATE SET
                        temperature_c = EXCLUDED.temperature_c,
                        shortwave_radiation_w_m2 = EXCLUDED.shortwave_radiation_w_m2,
                        inserted_at = now()
                    """,
                    rows,
                )
            connection.commit()

        return len(rows)

    def save_zero_hour_forecast_as_truth(
        self, forecast_run: WeatherForecastRun
    ) -> int:
        truth_points = [
            WeatherResolvedTruth(
                resolved_at=point.target_at,
                source="open_meteo_zero_hour",
                temperature_c=point.temperature_c,
                shortwave_radiation_w_m2=point.shortwave_radiation_w_m2,
            )
            for point in forecast_run.points
            if point.horizon_hours == 0
        ]
        return self.save_resolved_truth(truth_points)

    def load_display_dataset(
        self,
        start_at: datetime | None = None,
        end_at: datetime | None = None,
    ) -> WeatherDisplayDataset:
        now = datetime.now(timezone.utc)
        if start_at is None:
            start_at = now - timedelta(hours=24)
        if end_at is None:
            end_at = now + timedelta(hours=48)

        with self._connection() as connection:
            with connection.cursor() as cursor:
                cursor.execute(
                    """
                    SELECT
                        issued_at,
                        target_at,
                        horizon_hours,
                        source,
                        temperature_c,
                        shortwave_radiation_w_m2,
                        cloud_cover_pct
                    FROM weather_forecast_points
                    WHERE target_at >= %s AND target_at <= %s
                    ORDER BY target_at, horizon_hours
                    LIMIT 5000
                    """,
                    (start_at, end_at),
                )
                forecast_rows = cursor.fetchall()

                cursor.execute(
                    """
                    SELECT
                        resolved_at,
                        source,
                        temperature_c,
                        shortwave_radiation_w_m2
                    FROM weather_resolved_truth
                    WHERE resolved_at >= %s AND resolved_at <= %s
                    ORDER BY resolved_at
                    LIMIT 5000
                    """,
                    (start_at, end_at),
                )
                truth_rows = cursor.fetchall()

        return WeatherDisplayDataset(
            forecast_points=[
                WeatherForecastPoint(
                    issued_at=row[0],
                    target_at=row[1],
                    horizon_hours=row[2],
                    source=row[3],
                    temperature_c=row[4],
                    shortwave_radiation_w_m2=row[5],
                    cloud_cover_pct=row[6],
                )
                for row in forecast_rows
],
|
||||||
|
resolved_truth=[
|
||||||
|
WeatherResolvedTruth(
|
||||||
|
resolved_at=row[0],
|
||||||
|
source=row[1],
|
||||||
|
temperature_c=row[2],
|
||||||
|
shortwave_radiation_w_m2=row[3],
|
||||||
|
)
|
||||||
|
for row in truth_rows
|
||||||
|
],
|
||||||
|
)
|
||||||
|
|
||||||
|
@contextmanager
|
||||||
|
def _connection(self) -> Iterator[object]:
|
||||||
|
try:
|
||||||
|
import psycopg
|
||||||
|
except ImportError as error:
|
||||||
|
raise WeatherStoreConfigurationError(
|
||||||
|
"Install dependencies with `python3 -m pip install -r requirements.txt`"
|
||||||
|
) from error
|
||||||
|
|
||||||
|
with psycopg.connect(self.config.database_url) as connection:
|
||||||
|
yield connection
|
||||||
@@ -0,0 +1,254 @@
from __future__ import annotations

from os import environ

from gibil.classes.env_loader import EnvLoader
from gibil.classes.weather_sample_data import WeatherSampleData
from gibil.classes.weather_store import WeatherStore, WeatherStoreConfigurationError
from gibil.classes.weather_display import WeatherDisplay


class WebUI:
    """Composes Astrape web modules into one page."""

    def __init__(self) -> None:
        self.weather_display = WeatherDisplay()

    def render_page(self) -> str:
        return f"""<!doctype html>
<html lang="en">
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1">
<title>Astrape</title>
<style>
:root {{
    /* dark palette, so native widgets should follow suit */
    color-scheme: dark;
    --bg: #11151c;
    --surface: #181e27;
    --panel: #1f2733;
    --ink: #edf2f7;
    --muted: #9aa8ba;
    --line: #344052;
    --field: #121821;
}}

* {{
    box-sizing: border-box;
}}

body {{
    margin: 0;
    background: var(--bg);
    color: var(--ink);
    font-family: Inter, ui-sans-serif, system-ui, -apple-system, BlinkMacSystemFont, "Segoe UI", sans-serif;
    line-height: 1.4;
}}

header {{
    display: flex;
    align-items: center;
    justify-content: space-between;
    gap: 24px;
    padding: 22px 28px;
    border-bottom: 1px solid var(--line);
    background: var(--surface);
}}

h1, h2, p {{
    margin: 0;
}}

h1 {{
    font-size: 22px;
    font-weight: 700;
}}

h2 {{
    font-size: 18px;
    font-weight: 700;
}}

p {{
    color: var(--muted);
    font-size: 13px;
}}

main {{
    width: min(1280px, 100%);
    margin: 0 auto;
    padding: 24px;
}}

.panel {{
    background: var(--surface);
    border: 1px solid var(--line);
    border-radius: 8px;
    padding: 18px;
}}

.panel-heading {{
    display: grid;
    grid-template-columns: minmax(180px, auto) 1fr;
    gap: 24px;
    margin-bottom: 18px;
}}

.control-row {{
    display: flex;
    align-items: end;
    justify-content: flex-end;
    flex-wrap: wrap;
    gap: 14px;
}}

label {{
    display: grid;
    gap: 6px;
    color: var(--muted);
    font-size: 12px;
    font-weight: 600;
}}

select {{
    min-width: 150px;
    border: 1px solid var(--line);
    border-radius: 6px;
    background: var(--field);
    color: var(--ink);
    padding: 8px;
    font: inherit;
}}

.legend-control {{
    display: flex;
    align-items: center;
    justify-content: flex-end;
    flex-wrap: wrap;
    gap: 10px 12px;
    color: var(--muted);
    font-size: 12px;
    font-weight: 600;
}}

.legend-title {{
    line-height: 1;
}}

.horizon-options {{
    display: flex;
    align-items: center;
    flex-wrap: wrap;
    gap: 8px 12px;
}}

.horizon-option {{
    display: flex;
    align-items: center;
    gap: 8px;
    color: var(--ink);
    font-size: 13px;
    font-weight: 500;
    line-height: 1.2;
}}

.horizon-option input {{
    width: 14px;
    height: 14px;
    margin: 0;
}}

.horizon-option .horizon-value {{
    width: 54px;
    height: 28px;
    border: 1px solid var(--line);
    border-radius: 6px;
    background: var(--field);
    color: var(--ink);
    padding: 4px 6px;
    font: inherit;
}}

.legend-swatch {{
    width: 22px;
    height: 3px;
    border-radius: 999px;
    flex: 0 0 auto;
}}

.truth-swatch {{
    background: #f8fafc;
}}

.chart-shell {{
    position: relative;
    min-height: 420px;
    border: 1px solid var(--line);
    border-radius: 6px;
    overflow: hidden;
    background: var(--panel);
}}

canvas {{
    display: block;
    width: 100%;
    height: 420px;
}}

@media (max-width: 760px) {{
    header, .panel-heading, .control-row {{
        display: grid;
    }}

    .legend-control, .horizon-options {{
        justify-content: flex-start;
    }}

    main {{
        padding: 14px;
    }}

    select {{
        width: 100%;
    }}
}}
</style>
</head>
<body>
<header>
<h1>Astrape</h1>
<p>Gibil web UI</p>
</header>
<main>
{self.weather_display.render()}
</main>
<script>
let astrapeUiVersion = null;
async function watchUiVersion() {{
    try {{
        const response = await fetch("/api/ui-version", {{ cache: "no-store" }});
        const payload = await response.json();
        if (astrapeUiVersion === null) {{
            astrapeUiVersion = payload.version;
            return;
        }}
        if (astrapeUiVersion !== payload.version) {{
            window.location.reload();
        }}
    }} catch {{
        // Ignore transient fetch failures; the next poll retries.
    }}
}}
setInterval(watchUiVersion, 1000);
watchUiVersion();
</script>
</body>
</html>"""

    def weather_payload(self) -> str:
        EnvLoader().load()
        if environ.get("ASTRAPE_WEB_SAMPLE_DATA") == "1":
            return self.weather_display.data_payload(WeatherSampleData().build())

        try:
            dataset = WeatherStore.from_env().load_display_dataset()
        except WeatherStoreConfigurationError:
            dataset = None

        return self.weather_display.data_payload(dataset)
@@ -0,0 +1,137 @@
from __future__ import annotations

from dataclasses import dataclass
from datetime import datetime, timedelta, timezone
from os import environ
from sys import stderr
from time import sleep

from gibil.classes.env_loader import EnvLoader
from gibil.classes.weather_builder import (
    OpenMeteoArchiveClient,
    OpenMeteoClient,
    WeatherBuilder,
)
from gibil.classes.weather_store import WeatherStore


@dataclass(frozen=True)
class DbDaemonConfig:
    latitude: float
    longitude: float
    forecast_hours: int
    truth_lookback_days: int
    truth_end_delay_days: int
    poll_seconds: int

    @classmethod
    def from_env(cls) -> "DbDaemonConfig":
        return cls(
            latitude=float(_required_env("ASTRAPE_LATITUDE")),
            longitude=float(_required_env("ASTRAPE_LONGITUDE")),
            forecast_hours=int(environ.get("ASTRAPE_WEATHER_FORECAST_HOURS", "48")),
            truth_lookback_days=int(
                environ.get("ASTRAPE_WEATHER_TRUTH_LOOKBACK_DAYS", "14")
            ),
            truth_end_delay_days=int(
                environ.get("ASTRAPE_WEATHER_TRUTH_END_DELAY_DAYS", "5")
            ),
            poll_seconds=int(environ.get("ASTRAPE_WEATHER_POLL_SECONDS", "3600")),
        )


class DbDaemon:
    """Runs builder components that populate Astrape's database."""

    def __init__(
        self,
        config: DbDaemonConfig,
        weather_client: OpenMeteoClient,
        archive_client: OpenMeteoArchiveClient,
        weather_builder: WeatherBuilder,
        weather_store: WeatherStore,
    ) -> None:
        self.config = config
        self.weather_client = weather_client
        self.archive_client = archive_client
        self.weather_builder = weather_builder
        self.weather_store = weather_store

    @classmethod
    def from_env(cls) -> "DbDaemon":
        return cls(
            config=DbDaemonConfig.from_env(),
            weather_client=OpenMeteoClient(),
            archive_client=OpenMeteoArchiveClient(),
            weather_builder=WeatherBuilder(),
            weather_store=WeatherStore.from_env(),
        )

    def initialize(self) -> None:
        self.weather_store.initialize()

    def run_once(self) -> tuple[int, int]:
        raw_run = self.weather_client.fetch_forecast(
            latitude=self.config.latitude,
            longitude=self.config.longitude,
            forecast_hours=self.config.forecast_hours,
        )
        forecast_run = self.weather_builder.build_forecast_run(
            source=raw_run.source,
            latitude=raw_run.latitude,
            longitude=raw_run.longitude,
            points=raw_run.points,
            issued_at=raw_run.issued_at,
        )
        forecast_count = self.weather_store.save_forecast_run(forecast_run)
        zero_hour_truth_count = self.weather_store.save_zero_hour_forecast_as_truth(
            forecast_run
        )

        today = datetime.now(timezone.utc).date()
        truth_end = today - timedelta(days=self.config.truth_end_delay_days)
        truth_start = truth_end - timedelta(days=self.config.truth_lookback_days)
        truth_points = self.archive_client.fetch_resolved_truth(
            latitude=self.config.latitude,
            longitude=self.config.longitude,
            start_date=truth_start,
            end_date=truth_end,
        )
        archive_truth_count = self.weather_store.save_resolved_truth(truth_points)

        return forecast_count, zero_hour_truth_count + archive_truth_count

    def run_forever(self) -> None:
        self.initialize()
        while True:
            forecast_count, truth_count = self.run_once()
            print(
                f"stored_weather_forecast_points={forecast_count} "
                f"stored_weather_resolved_truth={truth_count}",
                flush=True,
            )
            sleep(self.config.poll_seconds)


def main() -> None:
    try:
        EnvLoader().load()
        daemon = DbDaemon.from_env()
        daemon.run_forever()
    except Exception as error:
        print(f"db_daemon_startup_error={error}", file=stderr)
        raise SystemExit(1) from error


def _required_env(name: str) -> str:
    value = environ.get(name)
    if not value:
        raise RuntimeError(
            f"{name} is required. Set ASTRAPE_DATABASE_URL, "
            "ASTRAPE_LATITUDE, and ASTRAPE_LONGITUDE before starting db_daemon."
        )
    return value


if __name__ == "__main__":
    main()
@@ -0,0 +1,84 @@
from __future__ import annotations

from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer
from importlib import import_module, reload
from os import environ
from pathlib import Path
import json

from gibil.classes.env_loader import EnvLoader

EnvLoader().load()

HOST = environ.get("ASTRAPE_WEB_HOST", "0.0.0.0")
PORT = int(environ.get("ASTRAPE_WEB_PORT", "8080"))
PROJECT_ROOT = Path(__file__).resolve().parents[2]
WATCHED_PATHS = [
    PROJECT_ROOT / "gibil" / "classes" / "webui.py",
    PROJECT_ROOT / "gibil" / "classes" / "weather_display.py",
    PROJECT_ROOT / "gibil" / "classes" / "weather_store.py",
]


class AstrapeWebHandler(BaseHTTPRequestHandler):
    def do_GET(self) -> None:
        if self.path == "/":
            self._send_html(self._webui().render_page())
            return

        if self.path == "/api/weather":
            self._send_json_text(self._webui().weather_payload())
            return

        if self.path == "/api/ui-version":
            self._send_json_text(json.dumps({"version": self._ui_version()}))
            return

        self.send_error(404)

    def log_message(self, format: str, *args: object) -> None:
        print(f"{self.address_string()} - {format % args}")

    def _webui(self):
        # Reload leaf modules before webui so the page picks up edits on disk.
        weather_store_module = import_module("gibil.classes.weather_store")
        weather_display_module = import_module("gibil.classes.weather_display")
        webui_module = import_module("gibil.classes.webui")
        reload(weather_store_module)
        reload(weather_display_module)
        reload(webui_module)
        return webui_module.WebUI()

    def _ui_version(self) -> str:
        mtimes = [
            str(path.stat().st_mtime_ns)
            for path in WATCHED_PATHS
            if path.exists()
        ]
        return ".".join(mtimes)

    def _send_html(self, body: str) -> None:
        encoded = body.encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.send_header("Content-Length", str(len(encoded)))
        self.end_headers()
        self.wfile.write(encoded)

    def _send_json_text(self, body: str) -> None:
        encoded = body.encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "application/json; charset=utf-8")
        self.send_header("Cache-Control", "no-store")
        self.send_header("Content-Length", str(len(encoded)))
        self.end_headers()
        self.wfile.write(encoded)


def main() -> None:
    server = ThreadingHTTPServer((HOST, PORT), AstrapeWebHandler)
    print(f"Astrape web UI listening on http://{HOST}:{PORT}")
    server.serve_forever()


if __name__ == "__main__":
    main()
@@ -0,0 +1,7 @@
def main() -> None:
    print("Run `python3 -m gibil.scripts.web_daemon` to start the Astrape web UI.")
    print("Run `python3 -m gibil.scripts.db_daemon` to start database ingest.")


if __name__ == "__main__":
    main()
@@ -0,0 +1 @@
psycopg[binary]>=3.2,<4