Add new daemons and debug scripts for Sigenergy and Oracle functionalities
- Implement `sigen_daemon.py` to poll Sigenergy plant metrics and store snapshots.
- Create `web_daemon.py` to serve the web interface and its endpoints.
- Add debug scripts:
  - `debug_duplicates.py` to find duplicate target times in forecast data.
  - `debug_energy_forecast.py` to print baseline energy forecast curves.
  - `debug_oracle_evaluations.py` to run the oracle evaluator.
  - `debug_sigen.py` to inspect stored Sigenergy plant snapshots.
  - `debug_weather.py` to trace resolved truth data.
  - `modbus_test.py` to explore Sigenergy plants or inverters over Modbus TCP.
- Introduce `oracle_evaluator.py` to evaluate stored oracle predictions against actuals.
- Add TCN training scripts in the `tcn` directory for training usage sequence models.
@@ -1,258 +0,0 @@
(deleted file: a stray 258-line capture of the `less` help screen, "SUMMARY OF LESS COMMANDS"; contents omitted)
@@ -10,7 +10,7 @@ Group=gibil
 WorkingDirectory=/mnt/astrape
 Environment=PYTHONUNBUFFERED=1
 Environment=PYTHONDONTWRITEBYTECODE=1
-ExecStart=/usr/bin/python3 -m gibil.scripts.db_daemon
+ExecStart=/usr/bin/python3 -m gibil.scripts.daemons.db_daemon
 Restart=always
 RestartSec=10
 
@@ -0,0 +1,19 @@
+[Unit]
+Description=Astrape Energy Oracle Forecast Snapshots
+After=network-online.target postgresql.service astrape-sigen.service astrape-db.service
+Wants=network-online.target
+
+[Service]
+Type=simple
+User=gibil
+Group=gibil
+WorkingDirectory=/mnt/astrape
+Environment=PYTHONUNBUFFERED=1
+Environment=PYTHONDONTWRITEBYTECODE=1
+Environment=ASTRAPE_ORACLE_POLL_SECONDS=300
+ExecStart=/usr/bin/python3 -m gibil.scripts.daemons.oracle_daemon
+Restart=always
+RestartSec=10
+
+[Install]
+WantedBy=multi-user.target
@@ -0,0 +1,18 @@
+[Unit]
+Description=Astrape Sigenergy Plant Ingest
+After=network-online.target postgresql.service
+Wants=network-online.target
+
+[Service]
+Type=simple
+User=gibil
+Group=gibil
+WorkingDirectory=/mnt/astrape
+Environment=PYTHONUNBUFFERED=1
+Environment=PYTHONDONTWRITEBYTECODE=1
+ExecStart=/usr/bin/python3 -m gibil.scripts.daemons.sigen_daemon
+Restart=always
+RestartSec=10
+
+[Install]
+WantedBy=multi-user.target
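The unit above launches `gibil.scripts.daemons.sigen_daemon`, which this commit describes as polling Sigenergy plant metrics and storing snapshots. As a rough, self-contained sketch of that kind of loop (the `read_plant_snapshot` and `store_snapshot` helpers here are placeholders, not the daemon's real API; only `SIGEN_POLL_SECONDS` comes from the schema notes later in this commit):

```python
import os
import time
from datetime import datetime, timezone

# Poll interval; SIGEN_POLL_SECONDS=5 is the raw polling target mentioned in
# the schema notes below.
POLL_SECONDS = int(os.environ.get("SIGEN_POLL_SECONDS", "5"))


def read_plant_snapshot() -> dict:
    # Placeholder for the Modbus TCP register read; the real daemon maps
    # registers into the SigenPlantSnapshot fields added by this commit.
    return {
        "observed_at": datetime.now(timezone.utc).isoformat(),
        "solar_power_w": 0.0,
        "load_power_w": 0.0,
        "battery_soc_pct": 0.0,
    }


def store_snapshot(snapshot: dict) -> None:
    # Placeholder for persistence into the sigen_plant_snapshots hypertable.
    print(snapshot)


def main() -> None:
    while True:
        store_snapshot(read_plant_snapshot())
        time.sleep(POLL_SECONDS)


if __name__ == "__main__":
    main()
```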
@@ -10,7 +10,7 @@ Group=gibil
 WorkingDirectory=/mnt/astrape
 Environment=PYTHONUNBUFFERED=1
 Environment=PYTHONDONTWRITEBYTECODE=1
-ExecStart=/usr/bin/python3 -m gibil.scripts.web_daemon
+ExecStart=/usr/bin/python3 -m gibil.scripts.daemons.web_daemon
 Restart=always
 RestartSec=5
 
@@ -200,6 +200,40 @@ Notes:
 - future prediction modules can join this to `weather_forecast_points`
 - make this a hypertable on `resolved_at`
 
+### `sigen_plant_snapshots`
+
+High-resolution Sigenergy plant state from Modbus TCP.
+
+Core fields:
+- `observed_at`
+- `received_at`
+- `source`
+- `solar_power_w`
+- `battery_soc_pct`
+- `battery_soh_pct`
+- `battery_power_w`
+- `grid_power_w`
+- `grid_import_w`
+- `grid_export_w`
+- `load_power_w`
+- `plant_active_power_w`
+- `accumulated_pv_energy_kwh`
+- `daily_consumed_energy_kwh`
+- `accumulated_consumed_energy_kwh`
+- status fields for EMS, running state, and grid sensor state
+- `raw_values`
+
+Notes:
+- raw polling target is `SIGEN_POLL_SECONDS=5`
+- make this a hypertable on `observed_at`
+- keep raw JSON during integration so unsupported or surprising registers can be debugged
+- rollup views should preserve averages, min/max spikes, and sample counts so short-duration usage signatures are not erased completely (see the sketch after this section)
+
+Initial rollups:
+- `sigen_plant_snapshots_1m`
+- `sigen_plant_snapshots_15m`
+- `sigen_plant_snapshots_1h`
+
 ### `system_events`
 
 Operational events from collectors, storage, Gibil, and publishers.
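The rollup note above asks that aggregates keep averages, min/max spikes, and sample counts. One possible shape for `sigen_plant_snapshots_1m` as a TimescaleDB continuous aggregate, applied through psycopg2, is sketched below; the exact column set and the project's real migration path are assumptions, and only `ASTRAPE_DATABASE_URL` is taken from elsewhere in this commit.

```python
import os

import psycopg2

# Illustrative only: a 1-minute rollup that keeps average, min/max spikes, and
# the number of raw samples per bucket, as the note above suggests.
ROLLUP_SQL = """
CREATE MATERIALIZED VIEW sigen_plant_snapshots_1m
WITH (timescaledb.continuous) AS
SELECT
    time_bucket('1 minute', observed_at) AS bucket,
    avg(load_power_w)  AS load_power_w_avg,
    min(load_power_w)  AS load_power_w_min,
    max(load_power_w)  AS load_power_w_max,
    avg(solar_power_w) AS solar_power_w_avg,
    max(solar_power_w) AS solar_power_w_max,
    count(*)           AS sample_count
FROM sigen_plant_snapshots
GROUP BY bucket
WITH NO DATA;
"""

connection = psycopg2.connect(os.environ["ASTRAPE_DATABASE_URL"])
connection.autocommit = True  # avoids transaction-block restrictions on continuous aggregates
with connection.cursor() as cursor:
    cursor.execute(ROLLUP_SQL)
connection.close()
```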
@@ -5,7 +5,7 @@
 Start the web UI daemon:
 
 ```bash
-python3 -m gibil.scripts.web_daemon
+python3 -m gibil.scripts.daemons.web_daemon
 ```
 
 The daemon listens on:
@@ -30,8 +30,10 @@ Install service units:
 ```bash
 sudo cp deploy/systemd/astrape-web.service /etc/systemd/system/
 sudo cp deploy/systemd/astrape-db.service /etc/systemd/system/
+sudo cp deploy/systemd/astrape-sigen.service /etc/systemd/system/
+sudo cp deploy/systemd/astrape-oracle.service /etc/systemd/system/
 sudo systemctl daemon-reload
-sudo systemctl enable --now astrape-web.service astrape-db.service
+sudo systemctl enable --now astrape-web.service astrape-db.service astrape-sigen.service astrape-oracle.service
 ```
 
 Check status:
@@ -39,8 +41,12 @@ Check status:
 ```bash
 systemctl status astrape-web.service
 systemctl status astrape-db.service
+systemctl status astrape-sigen.service
+systemctl status astrape-oracle.service
 journalctl -u astrape-web.service -f
 journalctl -u astrape-db.service -f
+journalctl -u astrape-sigen.service -f
+journalctl -u astrape-oracle.service -f
 ```
 
 Both services run as the IPA-managed `gibil` user from `/mnt/astrape`.
@@ -90,7 +96,7 @@ This does not write artificial data to TimescaleDB. It only changes the web UI w
 Start the database ingest daemon:
 
 ```bash
-python3 -m gibil.scripts.db_daemon
+python3 -m gibil.scripts.daemons.db_daemon
 ```
 
 Current behavior:
@@ -22,6 +22,11 @@ class PowerStage(str, Enum):
     CONSERVE = "conserve"
 
 
+class ForecastKind(str, Enum):
+    SOLAR = "solar"
+    LOAD = "load"
+
+
 @dataclass(frozen=True)
 class Observation:
     source: str
@@ -80,3 +85,75 @@ class WeatherResolvedTruth:
     temperature_c: float | None
     shortwave_radiation_w_m2: float | None
     source: str
+    cloud_cover_pct: float | None = None
+
+
+@dataclass(frozen=True)
+class SigenPlantSnapshot:
+    observed_at: datetime
+    received_at: datetime
+    source: str = "sigen_modbus"
+    plant_epoch_seconds: int | None = None
+    plant_ems_work_mode: int | None = None
+    plant_running_state: int | None = None
+    grid_sensor_status: int | None = None
+    solar_power_w: float | None = None
+    battery_soc_pct: float | None = None
+    battery_soh_pct: float | None = None
+    battery_power_w: float | None = None
+    grid_power_w: float | None = None
+    grid_import_w: float | None = None
+    grid_export_w: float | None = None
+    load_power_w: float | None = None
+    plant_active_power_w: float | None = None
+    accumulated_pv_energy_kwh: float | None = None
+    daily_consumed_energy_kwh: float | None = None
+    accumulated_consumed_energy_kwh: float | None = None
+    raw_values: dict[str, int | float | str | bool | None] = field(default_factory=dict)
+
+
+@dataclass(frozen=True)
+class PowerForecastPoint:
+    target_at: datetime
+    horizon_minutes: int
+    expected_power_w: float
+    p10_power_w: float
+    p50_power_w: float
+    p90_power_w: float
+    confidence: float
+    source: str
+    model_version: str
+    metadata: dict[str, Any] = field(default_factory=dict)
+
+
+@dataclass(frozen=True)
+class PowerForecastRun:
+    issued_at: datetime
+    kind: ForecastKind
+    source: str
+    model_version: str
+    points: list[PowerForecastPoint]
+
+
+@dataclass(frozen=True)
+class NetPowerForecastPoint:
+    target_at: datetime
+    horizon_minutes: int
+    expected_net_power_w: float
+    safe_net_power_w: float
+    p10_net_power_w: float
+    p50_net_power_w: float
+    p90_net_power_w: float
+    solar_p50_power_w: float
+    load_p50_power_w: float
+    solar_p10_power_w: float
+    solar_p90_power_w: float
+    load_p10_power_w: float
+    load_p90_power_w: float
+
+
+@dataclass(frozen=True)
+class NetPowerForecastRun:
+    issued_at: datetime
+    source: str
+    points: list[NetPowerForecastPoint]
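For orientation, here is how the new forecast dataclasses above compose; the field names are exactly those added in the hunk, while all numeric values are made-up placeholders.

```python
from datetime import datetime, timedelta, timezone

from gibil.classes.models import ForecastKind, PowerForecastPoint, PowerForecastRun

issued_at = datetime.now(timezone.utc)

# One 15-minute-ahead solar sample; expected_power_w mirrors the p50 value.
point = PowerForecastPoint(
    target_at=issued_at + timedelta(minutes=15),
    horizon_minutes=15,
    expected_power_w=1200.0,
    p10_power_w=600.0,
    p50_power_w=1200.0,
    p90_power_w=1800.0,
    confidence=0.7,
    source="example",
    model_version="demo_v0",
    metadata={"interpolated": False},
)

run = PowerForecastRun(
    issued_at=issued_at,
    kind=ForecastKind.SOLAR,
    source="example",
    model_version="demo_v0",
    points=[point],
)
print(run.kind.value, len(run.points))
```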
@@ -0,0 +1,15 @@
+from gibil.classes.oracle.builder import EnergyForecastBuilder, EnergyOracleBuilder
+from gibil.classes.oracle.config import EnergyForecastConfig
+from gibil.classes.oracle.display import OracleDisplay
+from gibil.classes.oracle.quality_display import OracleQualityDisplay
+from gibil.classes.oracle.store import OracleStore, OracleStoreConfig
+
+__all__ = [
+    "EnergyForecastBuilder",
+    "EnergyForecastConfig",
+    "EnergyOracleBuilder",
+    "OracleDisplay",
+    "OracleQualityDisplay",
+    "OracleStore",
+    "OracleStoreConfig",
+]
@@ -0,0 +1,191 @@
|
|||||||
|
from __future__ import annotations
|
||||||
|
|
||||||
|
from datetime import datetime, timezone
|
||||||
|
|
||||||
|
from gibil.classes.models import NetPowerForecastRun, PowerForecastPoint, PowerForecastRun
|
||||||
|
from gibil.classes.oracle.config import EnergyForecastConfig
|
||||||
|
from gibil.classes.predictors.net_forecaster import NetPowerForecaster
|
||||||
|
from gibil.classes.predictors.solar_rolling_regression import RollingSolarRegressionOracle
|
||||||
|
from gibil.classes.predictors.usage_daily import DailyUsageOracle
|
||||||
|
from gibil.classes.sigen.store import SigenStore
|
||||||
|
from gibil.classes.weather.store import WeatherStore
|
||||||
|
|
||||||
|
|
||||||
|
class EnergyOracleBuilder:
|
||||||
|
"""Builds production, load, and net oracle curves."""
|
||||||
|
|
||||||
|
def __init__(
|
||||||
|
self,
|
||||||
|
weather_store: WeatherStore,
|
||||||
|
sigen_store: SigenStore,
|
||||||
|
config: EnergyForecastConfig,
|
||||||
|
) -> None:
|
||||||
|
self.weather_store = weather_store
|
||||||
|
self.sigen_store = sigen_store
|
||||||
|
self.config = config
|
||||||
|
|
||||||
|
@classmethod
|
||||||
|
def from_env(cls) -> "EnergyOracleBuilder":
|
||||||
|
return cls(
|
||||||
|
weather_store=WeatherStore.from_env(),
|
||||||
|
sigen_store=SigenStore.from_env(),
|
||||||
|
config=EnergyForecastConfig.from_env(),
|
||||||
|
)
|
||||||
|
|
||||||
|
def build(self) -> tuple[PowerForecastRun, PowerForecastRun, NetPowerForecastRun]:
|
||||||
|
issued_at = datetime.now(timezone.utc)
|
||||||
|
hourly_solar_run = RollingSolarRegressionOracle(
|
||||||
|
weather_store=self.weather_store,
|
||||||
|
sigen_store=self.sigen_store,
|
||||||
|
config=self.config,
|
||||||
|
).forecast(issued_at=issued_at)
|
||||||
|
solar_run = self._resample_power_run(
|
||||||
|
hourly_solar_run,
|
||||||
|
issued_at=issued_at,
|
||||||
|
step_minutes=self.config.oracle_step_minutes,
|
||||||
|
)
|
||||||
|
load_run = DailyUsageOracle(
|
||||||
|
sigen_store=self.sigen_store,
|
||||||
|
config=self.config,
|
||||||
|
).forecast(
|
||||||
|
target_times=[point.target_at for point in solar_run.points],
|
||||||
|
issued_at=issued_at,
|
||||||
|
)
|
||||||
|
net_run = NetPowerForecaster().combine(solar_run, load_run)
|
||||||
|
return solar_run, load_run, net_run
|
||||||
|
|
||||||
|
def _resample_power_run(
|
||||||
|
self,
|
||||||
|
run: PowerForecastRun,
|
||||||
|
issued_at: datetime,
|
||||||
|
step_minutes: int,
|
||||||
|
) -> PowerForecastRun:
|
||||||
|
if step_minutes <= 0 or len(run.points) < 2:
|
||||||
|
return run
|
||||||
|
|
||||||
|
points = sorted(run.points, key=lambda point: point.target_at)
|
||||||
|
end_at = min(
|
||||||
|
points[-1].target_at,
|
||||||
|
issued_at + self._timedelta_hours(self.config.horizon_hours),
|
||||||
|
)
|
||||||
|
target_at = self._ceil_time(issued_at, step_minutes)
|
||||||
|
sampled_points: list[PowerForecastPoint] = []
|
||||||
|
|
||||||
|
while target_at <= end_at:
|
||||||
|
point = self._interpolate_power_point(points, target_at, issued_at)
|
||||||
|
if point is not None:
|
||||||
|
sampled_points.append(point)
|
||||||
|
target_at += self._timedelta_minutes(step_minutes)
|
||||||
|
|
||||||
|
current_point = self._current_power_point(points, issued_at)
|
||||||
|
if current_point is not None:
|
||||||
|
sampled_points.insert(0, current_point)
|
||||||
|
|
||||||
|
if not sampled_points:
|
||||||
|
return run
|
||||||
|
|
||||||
|
return PowerForecastRun(
|
||||||
|
issued_at=run.issued_at,
|
||||||
|
kind=run.kind,
|
||||||
|
source=run.source,
|
||||||
|
model_version=f"{run.model_version}_sampled_{step_minutes}m",
|
||||||
|
points=sampled_points,
|
||||||
|
)
|
||||||
|
|
||||||
|
def _interpolate_power_point(
|
||||||
|
self,
|
||||||
|
points: list[PowerForecastPoint],
|
||||||
|
target_at: datetime,
|
||||||
|
issued_at: datetime,
|
||||||
|
) -> PowerForecastPoint | None:
|
||||||
|
if target_at < points[0].target_at or target_at > points[-1].target_at:
|
||||||
|
return None
|
||||||
|
|
||||||
|
for index in range(len(points) - 1):
|
||||||
|
left = points[index]
|
||||||
|
right = points[index + 1]
|
||||||
|
if left.target_at <= target_at <= right.target_at:
|
||||||
|
ratio = self._time_ratio(left.target_at, right.target_at, target_at)
|
||||||
|
p10 = self._lerp(left.p10_power_w, right.p10_power_w, ratio)
|
||||||
|
p50 = self._lerp(left.p50_power_w, right.p50_power_w, ratio)
|
||||||
|
p90 = self._lerp(left.p90_power_w, right.p90_power_w, ratio)
|
||||||
|
return PowerForecastPoint(
|
||||||
|
target_at=target_at,
|
||||||
|
horizon_minutes=max(
|
||||||
|
0, round((target_at - issued_at).total_seconds() / 60)
|
||||||
|
),
|
||||||
|
expected_power_w=p50,
|
||||||
|
p10_power_w=p10,
|
||||||
|
p50_power_w=p50,
|
||||||
|
p90_power_w=p90,
|
||||||
|
confidence=self._lerp(left.confidence, right.confidence, ratio),
|
||||||
|
source=left.source,
|
||||||
|
model_version=left.model_version,
|
||||||
|
metadata={
|
||||||
|
"interpolated": True,
|
||||||
|
"left_target_at": left.target_at.isoformat(),
|
||||||
|
"right_target_at": right.target_at.isoformat(),
|
||||||
|
},
|
||||||
|
)
|
||||||
|
|
||||||
|
return None
|
||||||
|
|
||||||
|
def _current_power_point(
|
||||||
|
self,
|
||||||
|
points: list[PowerForecastPoint],
|
||||||
|
issued_at: datetime,
|
||||||
|
) -> PowerForecastPoint | None:
|
||||||
|
if not points:
|
||||||
|
return None
|
||||||
|
|
||||||
|
first = points[0]
|
||||||
|
return PowerForecastPoint(
|
||||||
|
target_at=issued_at,
|
||||||
|
horizon_minutes=0,
|
||||||
|
expected_power_w=first.p50_power_w,
|
||||||
|
p10_power_w=first.p10_power_w,
|
||||||
|
p50_power_w=first.p50_power_w,
|
||||||
|
p90_power_w=first.p90_power_w,
|
||||||
|
confidence=first.confidence,
|
||||||
|
source=first.source,
|
||||||
|
model_version=first.model_version,
|
||||||
|
metadata={
|
||||||
|
"interpolated": True,
|
||||||
|
"anchored_to": first.target_at.isoformat(),
|
||||||
|
},
|
||||||
|
)
|
||||||
|
|
||||||
|
def _ceil_time(self, value: datetime, step_minutes: int) -> datetime:
|
||||||
|
step_seconds = step_minutes * 60
|
||||||
|
timestamp = value.timestamp()
|
||||||
|
remainder = timestamp % step_seconds
|
||||||
|
if remainder:
|
||||||
|
timestamp += step_seconds - remainder
|
||||||
|
return datetime.fromtimestamp(timestamp, timezone.utc)
|
||||||
|
|
||||||
|
def _time_ratio(
|
||||||
|
self,
|
||||||
|
left: datetime,
|
||||||
|
right: datetime,
|
||||||
|
value: datetime,
|
||||||
|
) -> float:
|
||||||
|
span = (right - left).total_seconds()
|
||||||
|
if span <= 0:
|
||||||
|
return 0.0
|
||||||
|
return (value - left).total_seconds() / span
|
||||||
|
|
||||||
|
def _lerp(self, left: float, right: float, ratio: float) -> float:
|
||||||
|
return left + (right - left) * ratio
|
||||||
|
|
||||||
|
def _timedelta_hours(self, hours: int):
|
||||||
|
from datetime import timedelta
|
||||||
|
|
||||||
|
return timedelta(hours=hours)
|
||||||
|
|
||||||
|
def _timedelta_minutes(self, minutes: int):
|
||||||
|
from datetime import timedelta
|
||||||
|
|
||||||
|
return timedelta(minutes=minutes)
|
||||||
|
|
||||||
|
|
||||||
|
EnergyForecastBuilder = EnergyOracleBuilder
|
||||||
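The `_resample_power_run` / `_interpolate_power_point` pair above resamples an hourly solar curve onto the oracle step by linear interpolation. A stand-alone illustration of the same ratio/lerp arithmetic, with made-up times and wattages:

```python
from datetime import datetime, timedelta, timezone


def time_ratio(left: datetime, right: datetime, value: datetime) -> float:
    # Position of `value` inside [left, right]; 0.0 when the span is empty.
    span = (right - left).total_seconds()
    if span <= 0:
        return 0.0
    return (value - left).total_seconds() / span


def lerp(left: float, right: float, ratio: float) -> float:
    return left + (right - left) * ratio


left_at = datetime(2024, 6, 1, 12, 0, tzinfo=timezone.utc)
right_at = left_at + timedelta(hours=1)
target_at = left_at + timedelta(minutes=15)

ratio = time_ratio(left_at, right_at, target_at)  # 0.25
p50_w = lerp(2000.0, 4000.0, ratio)               # 2500.0 W at 12:15
print(ratio, p50_w)
```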
@@ -0,0 +1,60 @@
+from __future__ import annotations
+
+from dataclasses import dataclass
+from os import environ
+
+
+@dataclass(frozen=True)
+class EnergyForecastConfig:
+    horizon_hours: int = 24
+    oracle_step_minutes: int = 15
+    fallback_solar_peak_w: float = 10000
+    solar_peak_headroom: float = 1.05
+    solar_scale: float = 1.0
+    solar_training_days: int = 30
+    solar_min_training_samples: int = 24
+    solar_ridge_lambda: float = 0.1
+    load_lookback_minutes: int = 30
+    load_profile_days: int = 30
+    load_profile_bucket_minutes: int = 15
+    load_profile_min_samples: int = 5
+    load_recent_blend: float = 0.35
+    local_timezone: str = "Europe/Stockholm"
+
+    @classmethod
+    def from_env(cls) -> "EnergyForecastConfig":
+        return cls(
+            horizon_hours=int(environ.get("ASTRAPE_ENERGY_FORECAST_HOURS", "24")),
+            oracle_step_minutes=int(environ.get("ASTRAPE_ORACLE_STEP_MINUTES", "15")),
+            fallback_solar_peak_w=float(
+                environ.get("ASTRAPE_SOLAR_PEAK_W", "10000")
+            ),
+            solar_peak_headroom=float(
+                environ.get("ASTRAPE_SOLAR_PEAK_HEADROOM", "1.05")
+            ),
+            solar_scale=float(environ.get("ASTRAPE_SOLAR_FORECAST_SCALE", "1.0")),
+            solar_training_days=int(
+                environ.get("ASTRAPE_SOLAR_TRAINING_DAYS", "30")
+            ),
+            solar_min_training_samples=int(
+                environ.get("ASTRAPE_SOLAR_MIN_TRAINING_SAMPLES", "24")
+            ),
+            solar_ridge_lambda=float(
+                environ.get("ASTRAPE_SOLAR_RIDGE_LAMBDA", "0.1")
+            ),
+            load_lookback_minutes=int(
+                environ.get("ASTRAPE_LOAD_LOOKBACK_MINUTES", "30")
+            ),
+            load_profile_days=int(environ.get("ASTRAPE_LOAD_PROFILE_DAYS", "30")),
+            load_profile_bucket_minutes=int(
+                environ.get("ASTRAPE_LOAD_PROFILE_BUCKET_MINUTES", "15")
+            ),
+            load_profile_min_samples=int(
+                environ.get("ASTRAPE_LOAD_PROFILE_MIN_SAMPLES", "5")
+            ),
+            load_recent_blend=float(environ.get("ASTRAPE_LOAD_RECENT_BLEND", "0.35")),
+            local_timezone=environ.get(
+                "ASTRAPE_LOCAL_TIMEZONE",
+                environ.get("TZ", "Europe/Stockholm"),
+            ),
+        )
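Typical use of the config above: every field falls back to its default unless the corresponding `ASTRAPE_*` variable is set, so a quick override looks like this (the two overrides here are arbitrary examples):

```python
import os

from gibil.classes.oracle.config import EnergyForecastConfig

# Tighten the oracle step and shorten the solar training window for this run.
os.environ["ASTRAPE_ORACLE_STEP_MINUTES"] = "5"
os.environ["ASTRAPE_SOLAR_TRAINING_DAYS"] = "14"

config = EnergyForecastConfig.from_env()
print(config.oracle_step_minutes)   # 5
print(config.solar_training_days)   # 14
print(config.local_timezone)        # "Europe/Stockholm" unless ASTRAPE_LOCAL_TIMEZONE or TZ is set
```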
@@ -0,0 +1,434 @@
|
|||||||
|
from __future__ import annotations
|
||||||
|
|
||||||
|
import json
|
||||||
|
from dataclasses import asdict
|
||||||
|
from datetime import datetime
|
||||||
|
|
||||||
|
from gibil.classes.oracle.builder import EnergyOracleBuilder
|
||||||
|
from gibil.classes.models import (
|
||||||
|
NetPowerForecastPoint,
|
||||||
|
PowerForecastPoint,
|
||||||
|
PowerForecastRun,
|
||||||
|
)
|
||||||
|
from gibil.classes.oracle.store import OracleStore
|
||||||
|
|
||||||
|
|
||||||
|
class OracleDisplay:
|
||||||
|
"""Renders energy oracle curves for the Astrape web UI."""
|
||||||
|
|
||||||
|
def render(self) -> str:
|
||||||
|
return """
|
||||||
|
<section class="panel oracle-panel" data-module="oracle-display">
|
||||||
|
<div class="panel-heading">
|
||||||
|
<div>
|
||||||
|
<h2>Energy Oracle</h2>
|
||||||
|
<p>Solar, usage, and net power projection curves</p>
|
||||||
|
</div>
|
||||||
|
<div class="control-row">
|
||||||
|
<div id="oracle-legend" class="legend-control"></div>
|
||||||
|
<label>
|
||||||
|
Curve
|
||||||
|
<select id="oracle-variable">
|
||||||
|
<option value="net">Net power</option>
|
||||||
|
<option value="history">Past net predictions</option>
|
||||||
|
<option value="solar">Solar production</option>
|
||||||
|
<option value="load">Consumption</option>
|
||||||
|
</select>
|
||||||
|
</label>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
<div class="chart-shell">
|
||||||
|
<canvas id="oracle-chart" width="1100" height="420"></canvas>
|
||||||
|
</div>
|
||||||
|
</section>
|
||||||
|
<script>
|
||||||
|
window.astrapeModules = window.astrapeModules || {};
|
||||||
|
window.astrapeModules.oracleDisplay = (() => {
|
||||||
|
const colors = {
|
||||||
|
actual: "#34d399",
|
||||||
|
historical: "#a78bfa",
|
||||||
|
p10: "#60a5fa",
|
||||||
|
p50: "#f8fafc",
|
||||||
|
p90: "#fbbf24",
|
||||||
|
safe: "#fb7185"
|
||||||
|
};
|
||||||
|
|
||||||
|
function init() {
|
||||||
|
document.getElementById("oracle-variable").addEventListener("change", render);
|
||||||
|
refresh();
|
||||||
|
setInterval(refresh, 5000);
|
||||||
|
}
|
||||||
|
|
||||||
|
async function refresh() {
|
||||||
|
const response = await fetch("/api/oracle", { cache: "no-store" });
|
||||||
|
window.astrapeOracleData = await response.json();
|
||||||
|
render();
|
||||||
|
}
|
||||||
|
|
||||||
|
function render() {
|
||||||
|
const payload = window.astrapeOracleData || {};
|
||||||
|
const variable = document.getElementById("oracle-variable").value;
|
||||||
|
const series = buildSeries(payload, variable);
|
||||||
|
renderLegend(series);
|
||||||
|
drawChart(series, payload);
|
||||||
|
}
|
||||||
|
|
||||||
|
function renderLegend(series) {
|
||||||
|
const legend = document.getElementById("oracle-legend");
|
||||||
|
legend.innerHTML = "";
|
||||||
|
series.forEach((item) => {
|
||||||
|
const entry = document.createElement("div");
|
||||||
|
entry.className = "horizon-option";
|
||||||
|
entry.innerHTML = `
|
||||||
|
<span class="legend-swatch" style="${legendSwatchStyle(item)}"></span>
|
||||||
|
<span>${item.label}</span>
|
||||||
|
`;
|
||||||
|
legend.appendChild(entry);
|
||||||
|
});
|
||||||
|
}
|
||||||
|
|
||||||
|
function legendSwatchStyle(item) {
|
||||||
|
if (item.dash) {
|
||||||
|
return `background: repeating-linear-gradient(90deg, ${item.color} 0 8px, transparent 8px 13px); border: 1px solid ${item.color};`;
|
||||||
|
}
|
||||||
|
return `background: ${item.color}`;
|
||||||
|
}
|
||||||
|
|
||||||
|
function buildSeries(payload, variable) {
|
||||||
|
if (variable === "solar") {
|
||||||
|
return [
|
||||||
|
{ label: "Observed solar", color: colors.actual, width: 3, markers: true, points: actualPoints(payload.actual_points, "solar_power_w", payload.now) },
|
||||||
|
{ label: "Current solar low", color: colors.p10, width: 2, dash: [6, 5], points: powerPoints(payload.solar_points, "p10_power_w") },
|
||||||
|
{ label: "Current solar expected", color: colors.p50, width: 3, points: powerPoints(payload.solar_points, "p50_power_w") },
|
||||||
|
{ label: "Current solar high", color: colors.p90, width: 2, dash: [6, 5], points: powerPoints(payload.solar_points, "p90_power_w") },
|
||||||
|
...historicalPowerSeries(payload.historical_solar_runs || [], "Solar forecast"),
|
||||||
|
];
|
||||||
|
}
|
||||||
|
if (variable === "load") {
|
||||||
|
return [
|
||||||
|
{ label: "Observed load", color: colors.actual, width: 3, markers: true, points: actualPoints(payload.actual_points, "load_power_w", payload.now) },
|
||||||
|
{ label: "Current load low", color: colors.p10, width: 2, dash: [6, 5], points: powerPoints(payload.load_points, "p10_power_w") },
|
||||||
|
{ label: "Current load expected", color: colors.p50, width: 3, points: powerPoints(payload.load_points, "p50_power_w") },
|
||||||
|
{ label: "Current load high", color: colors.p90, width: 2, dash: [6, 5], points: powerPoints(payload.load_points, "p90_power_w") },
|
||||||
|
...historicalPowerSeries(payload.historical_load_runs || [], "Load forecast"),
|
||||||
|
];
|
||||||
|
}
|
||||||
|
if (variable === "history") {
|
||||||
|
return [
|
||||||
|
{ label: "Observed net", color: colors.actual, width: 3, markers: true, points: actualPoints(payload.actual_points, "net_power_w", payload.now) },
|
||||||
|
...historicalNetSeries(payload.historical_net_runs || []),
|
||||||
|
];
|
||||||
|
}
|
||||||
|
return [
|
||||||
|
{ label: "Observed net", color: colors.actual, width: 3, markers: true, points: actualPoints(payload.actual_points, "net_power_w", payload.now) },
|
||||||
|
{ label: "Current net low", color: colors.p10, width: 2, dash: [6, 5], points: netPoints(payload.net_points, "p10_net_power_w") },
|
||||||
|
{ label: "Current net expected", color: colors.p50, width: 3, points: netPoints(payload.net_points, "p50_net_power_w") },
|
||||||
|
{ label: "Current net high", color: colors.p90, width: 2, dash: [6, 5], points: netPoints(payload.net_points, "p90_net_power_w") },
|
||||||
|
...historicalNetSeries(payload.historical_net_runs || []),
|
||||||
|
];
|
||||||
|
}
|
||||||
|
|
||||||
|
function historicalNetSeries(runs) {
|
||||||
|
const palette = ["#a78bfa", "#c084fc", "#818cf8", "#38bdf8", "#f472b6", "#f59e0b"];
|
||||||
|
return runs.map((run, index) => ({
|
||||||
|
label: `Net forecast ${formatLag(run)}`,
|
||||||
|
color: palette[index % palette.length],
|
||||||
|
width: 2,
|
||||||
|
dash: [3, 5],
|
||||||
|
points: (run.points || []).map((point) => ({
|
||||||
|
target_at: point.target_at,
|
||||||
|
value: point.p50_net_power_w ?? point.expected_net_power_w
|
||||||
|
})).filter((point) => new Date(point.target_at).getTime() >= new Date(run.issued_at).getTime())
|
||||||
|
}));
|
||||||
|
}
|
||||||
|
|
||||||
|
function historicalPowerSeries(runs, labelPrefix) {
|
||||||
|
const palette = ["#a78bfa", "#c084fc", "#818cf8", "#38bdf8", "#f472b6", "#f59e0b"];
|
||||||
|
return runs.map((run, index) => ({
|
||||||
|
label: `${labelPrefix} ${formatLag(run)}`,
|
||||||
|
color: palette[index % palette.length],
|
||||||
|
width: 2,
|
||||||
|
dash: [3, 5],
|
||||||
|
points: (run.points || []).map((point) => ({
|
||||||
|
target_at: point.target_at,
|
||||||
|
value: point.p50_power_w ?? point.expected_power_w
|
||||||
|
})).filter((point) => new Date(point.target_at).getTime() >= new Date(run.issued_at).getTime())
|
||||||
|
}));
|
||||||
|
}
|
||||||
|
|
||||||
|
function formatLag(run) {
|
||||||
|
if (run.lag_hours) return `${run.lag_hours}h ago`;
|
||||||
|
return `issued ${formatIssuedAge(run.issued_at)}`;
|
||||||
|
}
|
||||||
|
|
||||||
|
function formatIssuedAge(issuedAt) {
|
||||||
|
const ageMs = Math.max(0, new Date(window.astrapeOracleData.now).getTime() - new Date(issuedAt).getTime());
|
||||||
|
const minutes = Math.round(ageMs / 60000);
|
||||||
|
if (minutes < 60) return `${minutes}m ago`;
|
||||||
|
return `${Math.round(minutes / 60)}h ago`;
|
||||||
|
}
|
||||||
|
|
||||||
|
function actualPoints(points, key, nowIso) {
|
||||||
|
const parsedNow = new Date(nowIso).getTime();
|
||||||
|
const now = Number.isFinite(parsedNow) ? parsedNow : Date.now();
|
||||||
|
return (points || [])
|
||||||
|
.filter((point) => new Date(point.target_at).getTime() <= now)
|
||||||
|
.map((point) => ({ target_at: point.target_at, value: point[key] }));
|
||||||
|
}
|
||||||
|
|
||||||
|
function powerPoints(points, key) {
|
||||||
|
return (points || []).map((point) => ({ target_at: point.target_at, value: point[key] }));
|
||||||
|
}
|
||||||
|
|
||||||
|
function netPoints(points, key) {
|
||||||
|
return (points || []).map((point) => ({ target_at: point.target_at, value: point[key] }));
|
||||||
|
}
|
||||||
|
|
||||||
|
function drawChart(series, payload) {
|
||||||
|
const canvas = document.getElementById("oracle-chart");
|
||||||
|
const ctx = canvas.getContext("2d");
|
||||||
|
ctx.clearRect(0, 0, canvas.width, canvas.height);
|
||||||
|
|
||||||
|
const allPoints = series.flatMap((item) => item.points).filter((point) => point.value !== null);
|
||||||
|
if (!allPoints.length) return;
|
||||||
|
|
||||||
|
const ys = allPoints.map((point) => point.value);
|
||||||
|
ys.push(0);
|
||||||
|
const windowBounds = oracleAlignedBounds(payload.now);
|
||||||
|
const bounds = {
|
||||||
|
minX: windowBounds.minX,
|
||||||
|
maxX: windowBounds.maxX,
|
||||||
|
minY: Math.min(...ys),
|
||||||
|
maxY: Math.max(...ys),
|
||||||
|
};
|
||||||
|
if (bounds.minY === bounds.maxY) {
|
||||||
|
bounds.minY -= 1;
|
||||||
|
bounds.maxY += 1;
|
||||||
|
}
|
||||||
|
|
||||||
|
drawAxes(ctx, canvas, bounds);
|
||||||
|
drawZeroLine(ctx, canvas, bounds);
|
||||||
|
drawNowMarker(ctx, canvas, bounds, windowBounds.nowX);
|
||||||
|
series.forEach((item) => drawSeries(ctx, canvas, bounds, item));
|
||||||
|
}
|
||||||
|
|
||||||
|
function drawAxes(ctx, canvas, bounds) {
|
||||||
|
const margin = chartMargin();
|
||||||
|
ctx.strokeStyle = "#94a3b8";
|
||||||
|
ctx.lineWidth = 1;
|
||||||
|
ctx.beginPath();
|
||||||
|
ctx.moveTo(margin.left, margin.top);
|
||||||
|
ctx.lineTo(margin.left, canvas.height - margin.bottom);
|
||||||
|
ctx.lineTo(canvas.width - margin.right, canvas.height - margin.bottom);
|
||||||
|
ctx.stroke();
|
||||||
|
ctx.fillStyle = "#94a3b8";
|
||||||
|
ctx.font = "12px system-ui";
|
||||||
|
ctx.fillText(`${Math.round(bounds.maxY)} W`, 10, margin.top + 4);
|
||||||
|
ctx.fillText(`${Math.round(bounds.minY)} W`, 10, canvas.height - margin.bottom);
|
||||||
|
}
|
||||||
|
|
||||||
|
function drawZeroLine(ctx, canvas, bounds) {
|
||||||
|
if (bounds.minY > 0 || bounds.maxY < 0) return;
|
||||||
|
const margin = chartMargin();
|
||||||
|
const y = scale(0, bounds.minY, bounds.maxY, canvas.height - margin.bottom, margin.top);
|
||||||
|
ctx.save();
|
||||||
|
ctx.strokeStyle = "#475569";
|
||||||
|
ctx.lineWidth = 1;
|
||||||
|
ctx.setLineDash([4, 4]);
|
||||||
|
ctx.beginPath();
|
||||||
|
ctx.moveTo(margin.left, y);
|
||||||
|
ctx.lineTo(canvas.width - margin.right, y);
|
||||||
|
ctx.stroke();
|
||||||
|
ctx.restore();
|
||||||
|
}
|
||||||
|
|
||||||
|
function drawNowMarker(ctx, canvas, bounds, now) {
|
||||||
|
if (now < bounds.minX || now > bounds.maxX) return;
|
||||||
|
const margin = chartMargin();
|
||||||
|
const x = scale(now, bounds.minX, bounds.maxX, margin.left, canvas.width - margin.right);
|
||||||
|
ctx.save();
|
||||||
|
ctx.strokeStyle = "#f8fafc";
|
||||||
|
ctx.lineWidth = 1;
|
||||||
|
ctx.setLineDash([5, 5]);
|
||||||
|
ctx.beginPath();
|
||||||
|
ctx.moveTo(x, margin.top);
|
||||||
|
ctx.lineTo(x, canvas.height - margin.bottom);
|
||||||
|
ctx.stroke();
|
||||||
|
ctx.setLineDash([]);
|
||||||
|
ctx.fillStyle = "#f8fafc";
|
||||||
|
ctx.font = "12px system-ui";
|
||||||
|
ctx.fillText("now", Math.min(x + 8, canvas.width - margin.right - 28), margin.top + 14);
|
||||||
|
ctx.restore();
|
||||||
|
}
|
||||||
|
|
||||||
|
function drawSeries(ctx, canvas, bounds, series) {
|
||||||
|
const points = series.points.filter((point) => point.value !== null);
|
||||||
|
if (!points.length) return;
|
||||||
|
const margin = chartMargin();
|
||||||
|
ctx.strokeStyle = series.color;
|
||||||
|
ctx.lineWidth = series.width;
|
||||||
|
ctx.setLineDash(series.dash || []);
|
||||||
|
ctx.beginPath();
|
||||||
|
points.forEach((point, index) => {
|
||||||
|
const x = scale(new Date(point.target_at).getTime(), bounds.minX, bounds.maxX, margin.left, canvas.width - margin.right);
|
||||||
|
const y = scale(point.value, bounds.minY, bounds.maxY, canvas.height - margin.bottom, margin.top);
|
||||||
|
if (index === 0) ctx.moveTo(x, y);
|
||||||
|
else ctx.lineTo(x, y);
|
||||||
|
});
|
||||||
|
ctx.stroke();
|
||||||
|
ctx.setLineDash([]);
|
||||||
|
|
||||||
|
if (series.markers || points.length < 12) {
|
||||||
|
ctx.fillStyle = series.color;
|
||||||
|
points.forEach((point) => {
|
||||||
|
const x = scale(new Date(point.target_at).getTime(), bounds.minX, bounds.maxX, margin.left, canvas.width - margin.right);
|
||||||
|
const y = scale(point.value, bounds.minY, bounds.maxY, canvas.height - margin.bottom, margin.top);
|
||||||
|
if (x < margin.left || x > canvas.width - margin.right) return;
|
||||||
|
ctx.beginPath();
|
||||||
|
ctx.arc(x, y, 3.5, 0, Math.PI * 2);
|
||||||
|
ctx.fill();
|
||||||
|
});
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
function scale(value, inMin, inMax, outMin, outMax) {
|
||||||
|
if (inMin === inMax) return (outMin + outMax) / 2;
|
||||||
|
return outMin + ((value - inMin) / (inMax - inMin)) * (outMax - outMin);
|
||||||
|
}
|
||||||
|
|
||||||
|
function chartMargin() {
|
||||||
|
return { top: 24, right: 28, bottom: 34, left: 64 };
|
||||||
|
}
|
||||||
|
|
||||||
|
function oracleAlignedBounds(nowIso) {
|
||||||
|
const parsedNow = new Date(nowIso).getTime();
|
||||||
|
const now = Number.isFinite(parsedNow) ? parsedNow : Date.now();
|
||||||
|
return {
|
||||||
|
minX: now - 24 * 60 * 60 * 1000,
|
||||||
|
maxX: now + 48 * 60 * 60 * 1000,
|
||||||
|
nowX: now
|
||||||
|
};
|
||||||
|
}
|
||||||
|
|
||||||
|
return { init };
|
||||||
|
})();
|
||||||
|
window.astrapeModules.oracleDisplay.init();
|
||||||
|
</script>
|
||||||
|
"""
|
||||||
|
|
||||||
|
def data_payload(self) -> str:
|
||||||
|
builder = EnergyOracleBuilder.from_env()
|
||||||
|
solar_run, load_run, net_run = builder.build()
|
||||||
|
actual_points = builder.sigen_store.load_recent_actual_points()
|
||||||
|
try:
|
||||||
|
oracle_store = OracleStore.from_env()
|
||||||
|
historical_net_runs = oracle_store.load_lagged_net_runs()
|
||||||
|
historical_solar_runs = oracle_store.load_lagged_power_runs("solar")
|
||||||
|
historical_load_runs = oracle_store.load_lagged_power_runs("load")
|
||||||
|
except Exception:
|
||||||
|
historical_net_runs = []
|
||||||
|
historical_solar_runs = []
|
||||||
|
historical_load_runs = []
|
||||||
|
return json.dumps(
|
||||||
|
{
|
||||||
|
"issued_at": self._iso(net_run.issued_at),
|
||||||
|
"now": self._iso(net_run.issued_at),
|
||||||
|
"solar_model": solar_run.model_version,
|
||||||
|
"load_model": load_run.model_version,
|
||||||
|
"solar_points": [
|
||||||
|
self._power_point(point) for point in solar_run.points
|
||||||
|
],
|
||||||
|
"load_points": [
|
||||||
|
self._power_point(point) for point in load_run.points
|
||||||
|
],
|
||||||
|
"net_points": [self._net_point(point) for point in net_run.points],
|
||||||
|
"actual_points": [
|
||||||
|
self._actual_point(point) for point in actual_points
|
||||||
|
],
|
||||||
|
"historical_net_runs": [
|
||||||
|
self._historical_net_run(run) for run in historical_net_runs
|
||||||
|
],
|
||||||
|
"historical_solar_runs": [
|
||||||
|
self._historical_power_run(run) for run in historical_solar_runs
|
||||||
|
],
|
||||||
|
"historical_load_runs": [
|
||||||
|
self._historical_power_run(run) for run in historical_load_runs
|
||||||
|
],
|
||||||
|
}
|
||||||
|
)
|
||||||
|
|
||||||
|
def _power_point(self, point: PowerForecastPoint) -> dict[str, object]:
|
||||||
|
return {
|
||||||
|
"target_at": self._iso(point.target_at),
|
||||||
|
"horizon_minutes": point.horizon_minutes,
|
||||||
|
"expected_power_w": point.expected_power_w,
|
||||||
|
"p10_power_w": point.p10_power_w,
|
||||||
|
"p50_power_w": point.p50_power_w,
|
||||||
|
"p90_power_w": point.p90_power_w,
|
||||||
|
"confidence": point.confidence,
|
||||||
|
"source": point.source,
|
||||||
|
"model_version": point.model_version,
|
||||||
|
"metadata": point.metadata,
|
||||||
|
}
|
||||||
|
|
||||||
|
def _net_point(self, point: NetPowerForecastPoint) -> dict[str, object]:
|
||||||
|
return asdict(point) | {"target_at": self._iso(point.target_at)}
|
||||||
|
|
||||||
|
def _actual_point(self, point: dict[str, object]) -> dict[str, object]:
|
||||||
|
return {
|
||||||
|
"target_at": self._iso(point["target_at"]),
|
||||||
|
"solar_power_w": point["solar_power_w"],
|
||||||
|
"load_power_w": point["load_power_w"],
|
||||||
|
"net_power_w": point["net_power_w"],
|
||||||
|
"grid_import_w": point["grid_import_w"],
|
||||||
|
"grid_export_w": point["grid_export_w"],
|
||||||
|
"sample_count": point["sample_count"],
|
||||||
|
}
|
||||||
|
|
||||||
|
def _historical_net_run(self, run: dict[str, object]) -> dict[str, object]:
|
||||||
|
return {
|
||||||
|
"lag_hours": run.get("lag_hours"),
|
||||||
|
"issued_at": self._iso(run["issued_at"]),
|
||||||
|
"points": [
|
||||||
|
{
|
||||||
|
"target_at": self._iso(point["target_at"]),
|
||||||
|
"horizon_minutes": point["horizon_minutes"],
|
||||||
|
"expected_net_power_w": point["expected_net_power_w"],
|
||||||
|
"safe_net_power_w": point["safe_net_power_w"],
|
||||||
|
"p10_net_power_w": point.get("p10_net_power_w"),
|
||||||
|
"p50_net_power_w": point.get("p50_net_power_w"),
|
||||||
|
"p90_net_power_w": point.get("p90_net_power_w"),
|
||||||
|
"solar_p50_power_w": point["solar_p50_power_w"],
|
||||||
|
"load_p50_power_w": point["load_p50_power_w"],
|
||||||
|
"solar_p10_power_w": point["solar_p10_power_w"],
|
||||||
|
"solar_p90_power_w": point.get("solar_p90_power_w"),
|
||||||
|
"load_p10_power_w": point.get("load_p10_power_w"),
|
||||||
|
"load_p90_power_w": point["load_p90_power_w"],
|
||||||
|
}
|
||||||
|
for point in run["points"]
|
||||||
|
],
|
||||||
|
}
|
||||||
|
|
||||||
|
def _historical_power_run(self, run: dict[str, object]) -> dict[str, object]:
|
||||||
|
return {
|
||||||
|
"lag_hours": run.get("lag_hours"),
|
||||||
|
"issued_at": self._iso(run["issued_at"]),
|
||||||
|
"kind": run["kind"],
|
||||||
|
"source": run["source"],
|
||||||
|
"model_version": run["model_version"],
|
||||||
|
"points": [
|
||||||
|
{
|
||||||
|
"target_at": self._iso(point["target_at"]),
|
||||||
|
"horizon_minutes": point["horizon_minutes"],
|
||||||
|
"expected_power_w": point["expected_power_w"],
|
||||||
|
"p10_power_w": point["p10_power_w"],
|
||||||
|
"p50_power_w": point["p50_power_w"],
|
||||||
|
"p90_power_w": point["p90_power_w"],
|
||||||
|
"confidence": point["confidence"],
|
||||||
|
}
|
||||||
|
for point in run["points"]
|
||||||
|
],
|
||||||
|
}
|
||||||
|
|
||||||
|
def _iso(self, value: datetime) -> str:
|
||||||
|
return value.isoformat()
|
||||||
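`OracleDisplay.data_payload` above is what the front-end module pulls from `/api/oracle`. A small client-side check of that payload from Python; the host and port are placeholders since the web daemon's bind address is not shown in this excerpt, while the JSON keys match `data_payload`:

```python
import json
from urllib.request import urlopen

# Placeholder address: adjust to wherever the web daemon is listening.
URL = "http://localhost:8080/api/oracle"

with urlopen(URL) as response:
    payload = json.load(response)

print(payload["now"], payload["solar_model"], payload["load_model"])
print("net points:", len(payload.get("net_points", [])))
print("historical net runs:", len(payload.get("historical_net_runs", [])))
```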
@@ -0,0 +1,152 @@
|
|||||||
|
from __future__ import annotations
|
||||||
|
|
||||||
|
import json
|
||||||
|
from datetime import timedelta
|
||||||
|
|
||||||
|
from gibil.classes.oracle.store import OracleStore
|
||||||
|
|
||||||
|
|
||||||
|
class OracleQualityDisplay:
|
||||||
|
"""Renders oracle prediction quality tables."""
|
||||||
|
|
||||||
|
def render(self) -> str:
|
||||||
|
return """
|
||||||
|
<section class="panel oracle-quality-panel" data-module="oracle-quality-display">
|
||||||
|
<div class="panel-heading">
|
||||||
|
<div>
|
||||||
|
<h2>Oracle Quality</h2>
|
||||||
|
<p>Prediction error by model and horizon</p>
|
||||||
|
</div>
|
||||||
|
<div class="control-row">
|
||||||
|
<label>
|
||||||
|
Window
|
||||||
|
<select id="quality-lookback">
|
||||||
|
<option value="24">24 hours</option>
|
||||||
|
<option value="168" selected>7 days</option>
|
||||||
|
<option value="720">30 days</option>
|
||||||
|
</select>
|
||||||
|
</label>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
<div class="table-shell">
|
||||||
|
<table>
|
||||||
|
<thead>
|
||||||
|
<tr>
|
||||||
|
<th>Kind</th>
|
||||||
|
<th>Model</th>
|
||||||
|
<th>Horizon</th>
|
||||||
|
<th>Samples</th>
|
||||||
|
<th>Bias</th>
|
||||||
|
<th>MAE</th>
|
||||||
|
<th>Median AE</th>
|
||||||
|
<th>MAPE</th>
|
||||||
|
<th>Coverage</th>
|
||||||
|
</tr>
|
||||||
|
</thead>
|
||||||
|
<tbody id="quality-rows"></tbody>
|
||||||
|
</table>
|
||||||
|
</div>
|
||||||
|
</section>
|
||||||
|
<script>
window.astrapeModules = window.astrapeModules || {};
window.astrapeModules.oracleQualityDisplay = (() => {
  function init() {
    document.getElementById("quality-lookback").addEventListener("change", refresh);
    refresh();
    setInterval(refresh, 10000);
  }

  async function refresh() {
    const lookback = document.getElementById("quality-lookback").value;
    const response = await fetch(`/api/oracle-quality?lookback_hours=${lookback}`, { cache: "no-store" });
    const payload = await response.json();
    render(payload.rows || []);
  }

  function render(rows) {
    const tbody = document.getElementById("quality-rows");
    if (!rows.length) {
      tbody.innerHTML = `<tr><td colspan="9">No evaluated oracle predictions yet.</td></tr>`;
      return;
    }
    tbody.innerHTML = rows.map((row) => `
      <tr>
        <td>${escapeHtml(row.kind)}</td>
        <td>${escapeHtml(row.model_version)}</td>
        <td>${formatHorizon(row)}</td>
        <td>${row.evaluated_count}</td>
        <td class="${biasClass(row.mean_error_w)}">${formatW(row.mean_error_w)}</td>
        <td>${formatW(row.mean_absolute_error_w)}</td>
        <td>${formatW(row.median_absolute_error_w)}</td>
        <td>${formatPct(row.mean_absolute_pct_error)}</td>
        <td>${formatPct(row.interval_coverage)}</td>
      </tr>
    `).join("");
  }

  function formatHorizon(row) {
    if (row.horizon_label) return row.horizon_label;
    return `${row.min_horizon_minutes}-${row.max_horizon_minutes}m`;
  }

  function formatW(value) {
    if (value === null || value === undefined) return "n/a";
    return `${Math.round(Number(value))} W`;
  }

  function formatPct(value) {
    if (value === null || value === undefined) return "n/a";
    return `${(Number(value) * 100).toFixed(1)}%`;
  }

  function biasClass(value) {
    if (value === null || value === undefined) return "";
    const absolute = Math.abs(Number(value));
    if (absolute < 250) return "metric-good";
    if (absolute < 1000) return "metric-warn";
    return "metric-bad";
  }

  function escapeHtml(value) {
    return String(value ?? "")
      .replace(/&/g, "&amp;")
      .replace(/</g, "&lt;")
      .replace(/>/g, "&gt;")
      .replace(/"/g, "&quot;")
      .replace(/'/g, "&#39;");
  }

  return { init };
})();
window.astrapeModules.oracleQualityDisplay.init();
</script>
"""

    def data_payload(self, lookback_hours: float = 168) -> str:
        try:
            rows = OracleStore.from_env().load_evaluation_summary(
                lookback=timedelta(hours=lookback_hours)
            )
        except Exception:
            rows = []

        return json.dumps(
            {
                "lookback_hours": lookback_hours,
                "rows": [self._row(row) for row in rows],
            }
        )

    def _row(self, row: dict[str, object]) -> dict[str, object]:
        return {
            key: self._json_value(value)
            for key, value in row.items()
        }

    def _json_value(self, value: object) -> object:
        if value is None or isinstance(value, (str, int, float, bool)):
            return value
        try:
            return float(value)
        except (TypeError, ValueError):
            return str(value)
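
# Illustrative sketch (not part of the daemon code): the JSON shape that
# data_payload() above returns and that the oracle-quality panel's refresh()
# reads. Field names come from load_evaluation_summary(); the numbers are
# made-up placeholder values.
import json

example_payload = {
    "lookback_hours": 168,
    "rows": [
        {
            "kind": "solar",
            "source": "rolling_solar_regression_oracle",
            "model_version": "rolling_solar_regression_v1",
            "horizon_bucket": 1,
            "horizon_label": "0-2h",
            "min_horizon_minutes": 5,
            "max_horizon_minutes": 115,
            "evaluated_count": 412,
            "mean_error_w": -84.2,             # "Bias" column
            "mean_absolute_error_w": 310.5,    # "MAE"
            "median_absolute_error_w": 240.0,  # "Median AE"
            "mean_absolute_pct_error": 0.12,   # "MAPE", fraction of 1
            "interval_coverage": 0.81,         # "Coverage", fraction of 1
        }
    ],
}

print(json.dumps(example_payload, indent=2))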
@@ -0,0 +1,888 @@
|
|||||||
|
from __future__ import annotations
|
||||||
|
|
||||||
|
from contextlib import contextmanager
|
||||||
|
from dataclasses import dataclass
|
||||||
|
from datetime import datetime, timedelta, timezone
|
||||||
|
from os import environ
|
||||||
|
from typing import Iterator
|
||||||
|
|
||||||
|
from gibil.classes.models import NetPowerForecastRun, PowerForecastRun
|
||||||
|
|
||||||
|
|
||||||
|
class OracleStoreConfigurationError(RuntimeError):
|
||||||
|
pass
|
||||||
|
|
||||||
|
|
||||||
|
@dataclass(frozen=True)
|
||||||
|
class OracleStoreConfig:
|
||||||
|
database_url: str
|
||||||
|
|
||||||
|
@classmethod
|
||||||
|
def from_env(cls) -> "OracleStoreConfig":
|
||||||
|
database_url = environ.get("ASTRAPE_DATABASE_URL")
|
||||||
|
if not database_url:
|
||||||
|
raise OracleStoreConfigurationError(
|
||||||
|
"ASTRAPE_DATABASE_URL is required for oracle storage"
|
||||||
|
)
|
||||||
|
return cls(database_url=database_url)
|
||||||
|
|
||||||
|
|
||||||
|
class OracleStore:
|
||||||
|
"""Persists generated oracle projection curves for later evaluation."""
|
||||||
|
|
||||||
|
def __init__(self, config: OracleStoreConfig) -> None:
|
||||||
|
self.config = config
|
||||||
|
|
||||||
|
@classmethod
|
||||||
|
def from_env(cls) -> "OracleStore":
|
||||||
|
return cls(OracleStoreConfig.from_env())
|
||||||
|
|
||||||
|
def initialize(self) -> None:
|
||||||
|
with self._connection() as connection:
|
||||||
|
with connection.cursor() as cursor:
|
||||||
|
cursor.execute("CREATE EXTENSION IF NOT EXISTS timescaledb")
|
||||||
|
cursor.execute(
|
||||||
|
"""
|
||||||
|
CREATE TABLE IF NOT EXISTS oracle_power_forecast_points (
|
||||||
|
issued_at TIMESTAMPTZ NOT NULL,
|
||||||
|
target_at TIMESTAMPTZ NOT NULL,
|
||||||
|
kind TEXT NOT NULL,
|
||||||
|
source TEXT NOT NULL,
|
||||||
|
model_version TEXT NOT NULL,
|
||||||
|
horizon_minutes INTEGER NOT NULL,
|
||||||
|
expected_power_w DOUBLE PRECISION NOT NULL,
|
||||||
|
p10_power_w DOUBLE PRECISION NOT NULL,
|
||||||
|
p50_power_w DOUBLE PRECISION NOT NULL,
|
||||||
|
p90_power_w DOUBLE PRECISION NOT NULL,
|
||||||
|
confidence DOUBLE PRECISION NOT NULL,
|
||||||
|
inserted_at TIMESTAMPTZ NOT NULL DEFAULT now(),
|
||||||
|
PRIMARY KEY (issued_at, target_at, kind, source, model_version)
|
||||||
|
)
|
||||||
|
"""
|
||||||
|
)
|
||||||
|
cursor.execute(
|
||||||
|
"""
|
||||||
|
SELECT create_hypertable(
|
||||||
|
'oracle_power_forecast_points',
|
||||||
|
'target_at',
|
||||||
|
if_not_exists => TRUE
|
||||||
|
)
|
||||||
|
"""
|
||||||
|
)
|
||||||
|
cursor.execute(
|
||||||
|
"""
|
||||||
|
CREATE TABLE IF NOT EXISTS oracle_net_forecast_points (
|
||||||
|
issued_at TIMESTAMPTZ NOT NULL,
|
||||||
|
target_at TIMESTAMPTZ NOT NULL,
|
||||||
|
source TEXT NOT NULL,
|
||||||
|
horizon_minutes INTEGER NOT NULL,
|
||||||
|
expected_net_power_w DOUBLE PRECISION NOT NULL,
|
||||||
|
safe_net_power_w DOUBLE PRECISION NOT NULL,
|
||||||
|
p10_net_power_w DOUBLE PRECISION,
|
||||||
|
p50_net_power_w DOUBLE PRECISION,
|
||||||
|
p90_net_power_w DOUBLE PRECISION,
|
||||||
|
solar_p50_power_w DOUBLE PRECISION NOT NULL,
|
||||||
|
load_p50_power_w DOUBLE PRECISION NOT NULL,
|
||||||
|
solar_p10_power_w DOUBLE PRECISION NOT NULL,
|
||||||
|
solar_p90_power_w DOUBLE PRECISION,
|
||||||
|
load_p10_power_w DOUBLE PRECISION,
|
||||||
|
load_p90_power_w DOUBLE PRECISION NOT NULL,
|
||||||
|
inserted_at TIMESTAMPTZ NOT NULL DEFAULT now(),
|
||||||
|
PRIMARY KEY (issued_at, target_at, source)
|
||||||
|
)
|
||||||
|
"""
|
||||||
|
)
|
||||||
|
cursor.execute(
|
||||||
|
"""
|
||||||
|
ALTER TABLE oracle_net_forecast_points
|
||||||
|
ADD COLUMN IF NOT EXISTS p10_net_power_w DOUBLE PRECISION
|
||||||
|
"""
|
||||||
|
)
|
||||||
|
cursor.execute(
|
||||||
|
"""
|
||||||
|
ALTER TABLE oracle_net_forecast_points
|
||||||
|
ADD COLUMN IF NOT EXISTS p50_net_power_w DOUBLE PRECISION
|
||||||
|
"""
|
||||||
|
)
|
||||||
|
cursor.execute(
|
||||||
|
"""
|
||||||
|
ALTER TABLE oracle_net_forecast_points
|
||||||
|
ADD COLUMN IF NOT EXISTS p90_net_power_w DOUBLE PRECISION
|
||||||
|
"""
|
||||||
|
)
|
||||||
|
cursor.execute(
|
||||||
|
"""
|
||||||
|
ALTER TABLE oracle_net_forecast_points
|
||||||
|
ADD COLUMN IF NOT EXISTS solar_p90_power_w DOUBLE PRECISION
|
||||||
|
"""
|
||||||
|
)
|
||||||
|
cursor.execute(
|
||||||
|
"""
|
||||||
|
ALTER TABLE oracle_net_forecast_points
|
||||||
|
ADD COLUMN IF NOT EXISTS load_p10_power_w DOUBLE PRECISION
|
||||||
|
"""
|
||||||
|
)
|
||||||
|
cursor.execute(
|
||||||
|
"""
|
||||||
|
SELECT create_hypertable(
|
||||||
|
'oracle_net_forecast_points',
|
||||||
|
'target_at',
|
||||||
|
if_not_exists => TRUE
|
||||||
|
)
|
||||||
|
"""
|
||||||
|
)
|
||||||
|
cursor.execute(
|
||||||
|
"""
|
||||||
|
CREATE TABLE IF NOT EXISTS oracle_forecast_evaluations (
|
||||||
|
issued_at TIMESTAMPTZ NOT NULL,
|
||||||
|
target_at TIMESTAMPTZ NOT NULL,
|
||||||
|
kind TEXT NOT NULL,
|
||||||
|
source TEXT NOT NULL,
|
||||||
|
model_version TEXT NOT NULL,
|
||||||
|
horizon_minutes INTEGER NOT NULL,
|
||||||
|
expected_power_w DOUBLE PRECISION NOT NULL,
|
||||||
|
p10_power_w DOUBLE PRECISION,
|
||||||
|
p50_power_w DOUBLE PRECISION,
|
||||||
|
p90_power_w DOUBLE PRECISION,
|
||||||
|
realized_power_w DOUBLE PRECISION,
|
||||||
|
error_w DOUBLE PRECISION,
|
||||||
|
absolute_error_w DOUBLE PRECISION,
|
||||||
|
absolute_pct_error DOUBLE PRECISION,
|
||||||
|
covered_by_p10_p90 BOOLEAN,
|
||||||
|
sample_count INTEGER NOT NULL DEFAULT 0,
|
||||||
|
evaluated_at TIMESTAMPTZ NOT NULL DEFAULT now(),
|
||||||
|
inserted_at TIMESTAMPTZ NOT NULL DEFAULT now(),
|
||||||
|
updated_at TIMESTAMPTZ NOT NULL DEFAULT now(),
|
||||||
|
PRIMARY KEY (
|
||||||
|
issued_at,
|
||||||
|
target_at,
|
||||||
|
kind,
|
||||||
|
source,
|
||||||
|
model_version
|
||||||
|
)
|
||||||
|
)
|
||||||
|
"""
|
||||||
|
)
|
||||||
|
cursor.execute(
|
||||||
|
"""
|
||||||
|
SELECT create_hypertable(
|
||||||
|
'oracle_forecast_evaluations',
|
||||||
|
'target_at',
|
||||||
|
if_not_exists => TRUE
|
||||||
|
)
|
||||||
|
"""
|
||||||
|
)
|
||||||
|
cursor.execute(
|
||||||
|
"""
|
||||||
|
CREATE INDEX IF NOT EXISTS oracle_forecast_evaluations_kind_horizon_idx
|
||||||
|
ON oracle_forecast_evaluations (
|
||||||
|
kind,
|
||||||
|
horizon_minutes,
|
||||||
|
target_at DESC
|
||||||
|
)
|
||||||
|
"""
|
||||||
|
)
|
||||||
|
connection.commit()
|
||||||
|
|
||||||
|
def save_runs(
|
||||||
|
self,
|
||||||
|
solar_run: PowerForecastRun,
|
||||||
|
load_run: PowerForecastRun,
|
||||||
|
net_run: NetPowerForecastRun,
|
||||||
|
) -> int:
|
||||||
|
self.initialize()
|
||||||
|
power_rows = [
|
||||||
|
(
|
||||||
|
run.issued_at,
|
||||||
|
point.target_at,
|
||||||
|
run.kind.value,
|
||||||
|
run.source,
|
||||||
|
run.model_version,
|
||||||
|
point.horizon_minutes,
|
||||||
|
point.expected_power_w,
|
||||||
|
point.p10_power_w,
|
||||||
|
point.p50_power_w,
|
||||||
|
point.p90_power_w,
|
||||||
|
point.confidence,
|
||||||
|
)
|
||||||
|
for run in (solar_run, load_run)
|
||||||
|
for point in run.points
|
||||||
|
]
|
||||||
|
net_rows = [
|
||||||
|
(
|
||||||
|
net_run.issued_at,
|
||||||
|
point.target_at,
|
||||||
|
net_run.source,
|
||||||
|
point.horizon_minutes,
|
||||||
|
point.expected_net_power_w,
|
||||||
|
point.safe_net_power_w,
|
||||||
|
point.p10_net_power_w,
|
||||||
|
point.p50_net_power_w,
|
||||||
|
point.p90_net_power_w,
|
||||||
|
point.solar_p50_power_w,
|
||||||
|
point.load_p50_power_w,
|
||||||
|
point.solar_p10_power_w,
|
||||||
|
point.solar_p90_power_w,
|
||||||
|
point.load_p10_power_w,
|
||||||
|
point.load_p90_power_w,
|
||||||
|
)
|
||||||
|
for point in net_run.points
|
||||||
|
]
|
||||||
|
|
||||||
|
with self._connection() as connection:
|
||||||
|
with connection.cursor() as cursor:
|
||||||
|
cursor.executemany(
|
||||||
|
"""
|
||||||
|
INSERT INTO oracle_power_forecast_points (
|
||||||
|
issued_at,
|
||||||
|
target_at,
|
||||||
|
kind,
|
||||||
|
source,
|
||||||
|
model_version,
|
||||||
|
horizon_minutes,
|
||||||
|
expected_power_w,
|
||||||
|
p10_power_w,
|
||||||
|
p50_power_w,
|
||||||
|
p90_power_w,
|
||||||
|
confidence
|
||||||
|
)
|
||||||
|
VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s)
|
||||||
|
ON CONFLICT (issued_at, target_at, kind, source, model_version)
|
||||||
|
DO UPDATE SET
|
||||||
|
horizon_minutes = EXCLUDED.horizon_minutes,
|
||||||
|
expected_power_w = EXCLUDED.expected_power_w,
|
||||||
|
p10_power_w = EXCLUDED.p10_power_w,
|
||||||
|
p50_power_w = EXCLUDED.p50_power_w,
|
||||||
|
p90_power_w = EXCLUDED.p90_power_w,
|
||||||
|
confidence = EXCLUDED.confidence,
|
||||||
|
inserted_at = now()
|
||||||
|
""",
|
||||||
|
power_rows,
|
||||||
|
)
|
||||||
|
cursor.executemany(
|
||||||
|
"""
|
||||||
|
INSERT INTO oracle_net_forecast_points (
|
||||||
|
issued_at,
|
||||||
|
target_at,
|
||||||
|
source,
|
||||||
|
horizon_minutes,
|
||||||
|
expected_net_power_w,
|
||||||
|
safe_net_power_w,
|
||||||
|
p10_net_power_w,
|
||||||
|
p50_net_power_w,
|
||||||
|
p90_net_power_w,
|
||||||
|
solar_p50_power_w,
|
||||||
|
load_p50_power_w,
|
||||||
|
solar_p10_power_w,
|
||||||
|
solar_p90_power_w,
|
||||||
|
load_p10_power_w,
|
||||||
|
load_p90_power_w
|
||||||
|
)
|
||||||
|
VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s)
|
||||||
|
ON CONFLICT (issued_at, target_at, source)
|
||||||
|
DO UPDATE SET
|
||||||
|
horizon_minutes = EXCLUDED.horizon_minutes,
|
||||||
|
expected_net_power_w = EXCLUDED.expected_net_power_w,
|
||||||
|
safe_net_power_w = EXCLUDED.safe_net_power_w,
|
||||||
|
p10_net_power_w = EXCLUDED.p10_net_power_w,
|
||||||
|
p50_net_power_w = EXCLUDED.p50_net_power_w,
|
||||||
|
p90_net_power_w = EXCLUDED.p90_net_power_w,
|
||||||
|
solar_p50_power_w = EXCLUDED.solar_p50_power_w,
|
||||||
|
load_p50_power_w = EXCLUDED.load_p50_power_w,
|
||||||
|
solar_p10_power_w = EXCLUDED.solar_p10_power_w,
|
||||||
|
solar_p90_power_w = EXCLUDED.solar_p90_power_w,
|
||||||
|
load_p10_power_w = EXCLUDED.load_p10_power_w,
|
||||||
|
load_p90_power_w = EXCLUDED.load_p90_power_w,
|
||||||
|
inserted_at = now()
|
||||||
|
""",
|
||||||
|
net_rows,
|
||||||
|
)
|
||||||
|
connection.commit()
|
||||||
|
|
||||||
|
return len(power_rows) + len(net_rows)
|
||||||
|
|
||||||
|
def load_recent_net_runs(
|
||||||
|
self,
|
||||||
|
lookback: timedelta = timedelta(hours=6),
|
||||||
|
limit: int = 6,
|
||||||
|
) -> list[dict[str, object]]:
|
||||||
|
return self.load_lagged_net_runs(
|
||||||
|
lag_hours=[hour for hour in (1, 2, 6, 24, 48) if hour <= lookback.total_seconds() / 3600],
|
||||||
|
tolerance=timedelta(minutes=45),
|
||||||
|
limit=limit,
|
||||||
|
)
|
||||||
|
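
# Worked example (standalone sketch) of the lag selection above: a six hour
# lookback keeps only the lag hours that fit inside the window, and
# load_lagged_net_runs() then matches each lag to a stored issue time within
# +/- 45 minutes.
from datetime import timedelta

lookback = timedelta(hours=6)
lag_hours = [hour for hour in (1, 2, 6, 24, 48) if hour <= lookback.total_seconds() / 3600]
print(lag_hours)  # [1, 2, 6]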
|
||||||
|
def load_lagged_net_runs(
|
||||||
|
self,
|
||||||
|
lag_hours: list[int] | None = None,
|
||||||
|
tolerance: timedelta = timedelta(minutes=45),
|
||||||
|
limit: int = 5,
|
||||||
|
) -> list[dict[str, object]]:
|
||||||
|
if lag_hours is None:
|
||||||
|
lag_hours = [1, 2, 6, 24, 48]
|
||||||
|
|
||||||
|
now = datetime.now(timezone.utc)
|
||||||
|
selected: list[tuple[int, datetime]] = []
|
||||||
|
used_issued_at: set[datetime] = set()
|
||||||
|
|
||||||
|
with self._connection() as connection:
|
||||||
|
with connection.cursor() as cursor:
|
||||||
|
for lag_hour in lag_hours:
|
||||||
|
target_issued_at = now - timedelta(hours=lag_hour)
|
||||||
|
cursor.execute(
|
||||||
|
"""
|
||||||
|
SELECT issued_at
|
||||||
|
FROM oracle_net_forecast_points
|
||||||
|
WHERE issued_at BETWEEN %s AND %s
|
||||||
|
GROUP BY issued_at
|
||||||
|
ORDER BY abs(extract(epoch FROM (issued_at - %s)))
|
||||||
|
LIMIT 1
|
||||||
|
""",
|
||||||
|
(
|
||||||
|
target_issued_at - tolerance,
|
||||||
|
target_issued_at + tolerance,
|
||||||
|
target_issued_at,
|
||||||
|
),
|
||||||
|
)
|
||||||
|
row = cursor.fetchone()
|
||||||
|
if row is None or row[0] in used_issued_at:
|
||||||
|
continue
|
||||||
|
selected.append((lag_hour, row[0]))
|
||||||
|
used_issued_at.add(row[0])
|
||||||
|
if len(selected) >= limit:
|
||||||
|
break
|
||||||
|
|
||||||
|
runs: list[dict[str, object]] = []
|
||||||
|
for lag_hour, issued_at in selected:
|
||||||
|
cursor.execute(
|
||||||
|
"""
|
||||||
|
SELECT
|
||||||
|
target_at,
|
||||||
|
horizon_minutes,
|
||||||
|
expected_net_power_w,
|
||||||
|
safe_net_power_w,
|
||||||
|
COALESCE(p10_net_power_w, safe_net_power_w),
|
||||||
|
COALESCE(p50_net_power_w, expected_net_power_w),
|
||||||
|
p90_net_power_w,
|
||||||
|
solar_p50_power_w,
|
||||||
|
load_p50_power_w,
|
||||||
|
solar_p10_power_w,
|
||||||
|
solar_p90_power_w,
|
||||||
|
load_p10_power_w,
|
||||||
|
load_p90_power_w
|
||||||
|
FROM oracle_net_forecast_points
|
||||||
|
WHERE issued_at = %s
|
||||||
|
AND target_at >= %s
|
||||||
|
ORDER BY target_at
|
||||||
|
""",
|
||||||
|
(issued_at, issued_at),
|
||||||
|
)
|
||||||
|
points = cursor.fetchall()
|
||||||
|
if not points:
|
||||||
|
continue
|
||||||
|
runs.append(
|
||||||
|
{
|
||||||
|
"lag_hours": lag_hour,
|
||||||
|
"issued_at": issued_at,
|
||||||
|
"points": [
|
||||||
|
{
|
||||||
|
"target_at": row[0],
|
||||||
|
"horizon_minutes": row[1],
|
||||||
|
"expected_net_power_w": row[2],
|
||||||
|
"safe_net_power_w": row[3],
|
||||||
|
"p10_net_power_w": row[4],
|
||||||
|
"p50_net_power_w": row[5],
|
||||||
|
"p90_net_power_w": row[6],
|
||||||
|
"solar_p50_power_w": row[7],
|
||||||
|
"load_p50_power_w": row[8],
|
||||||
|
"solar_p10_power_w": row[9],
|
||||||
|
"solar_p90_power_w": row[10],
|
||||||
|
"load_p10_power_w": row[11],
|
||||||
|
"load_p90_power_w": row[12],
|
||||||
|
}
|
||||||
|
for row in points
|
||||||
|
],
|
||||||
|
}
|
||||||
|
)
|
||||||
|
|
||||||
|
return runs
|
||||||
|
|
||||||
|
def load_lagged_power_runs(
|
||||||
|
self,
|
||||||
|
kind: str,
|
||||||
|
lag_hours: list[int] | None = None,
|
||||||
|
tolerance: timedelta = timedelta(minutes=45),
|
||||||
|
limit: int = 5,
|
||||||
|
) -> list[dict[str, object]]:
|
||||||
|
if kind not in {"solar", "load"}:
|
||||||
|
raise ValueError("kind must be 'solar' or 'load'")
|
||||||
|
if lag_hours is None:
|
||||||
|
lag_hours = [1, 2, 6, 24, 48]
|
||||||
|
|
||||||
|
now = datetime.now(timezone.utc)
|
||||||
|
selected: list[tuple[int, datetime, str, str, str]] = []
|
||||||
|
used_keys: set[tuple[datetime, str, str, str]] = set()
|
||||||
|
|
||||||
|
with self._connection() as connection:
|
||||||
|
with connection.cursor() as cursor:
|
||||||
|
for lag_hour in lag_hours:
|
||||||
|
target_issued_at = now - timedelta(hours=lag_hour)
|
||||||
|
cursor.execute(
|
||||||
|
"""
|
||||||
|
SELECT issued_at, kind, source, model_version
|
||||||
|
FROM oracle_power_forecast_points
|
||||||
|
WHERE kind = %s
|
||||||
|
AND issued_at BETWEEN %s AND %s
|
||||||
|
GROUP BY issued_at, kind, source, model_version
|
||||||
|
ORDER BY abs(extract(epoch FROM (issued_at - %s)))
|
||||||
|
LIMIT 1
|
||||||
|
""",
|
||||||
|
(
|
||||||
|
kind,
|
||||||
|
target_issued_at - tolerance,
|
||||||
|
target_issued_at + tolerance,
|
||||||
|
target_issued_at,
|
||||||
|
),
|
||||||
|
)
|
||||||
|
row = cursor.fetchone()
|
||||||
|
if row is None:
|
||||||
|
continue
|
||||||
|
key = (row[0], row[1], row[2], row[3])
|
||||||
|
if key in used_keys:
|
||||||
|
continue
|
||||||
|
selected.append((lag_hour, row[0], row[1], row[2], row[3]))
|
||||||
|
used_keys.add(key)
|
||||||
|
if len(selected) >= limit:
|
||||||
|
break
|
||||||
|
|
||||||
|
runs: list[dict[str, object]] = []
|
||||||
|
for lag_hour, issued_at, run_kind, source, model_version in selected:
|
||||||
|
cursor.execute(
|
||||||
|
"""
|
||||||
|
SELECT
|
||||||
|
target_at,
|
||||||
|
horizon_minutes,
|
||||||
|
expected_power_w,
|
||||||
|
p10_power_w,
|
||||||
|
p50_power_w,
|
||||||
|
p90_power_w,
|
||||||
|
confidence
|
||||||
|
FROM oracle_power_forecast_points
|
||||||
|
WHERE issued_at = %s
|
||||||
|
AND kind = %s
|
||||||
|
AND source = %s
|
||||||
|
AND model_version = %s
|
||||||
|
AND target_at >= %s
|
||||||
|
ORDER BY target_at
|
||||||
|
""",
|
||||||
|
(issued_at, run_kind, source, model_version, issued_at),
|
||||||
|
)
|
||||||
|
points = cursor.fetchall()
|
||||||
|
if not points:
|
||||||
|
continue
|
||||||
|
runs.append(
|
||||||
|
{
|
||||||
|
"lag_hours": lag_hour,
|
||||||
|
"issued_at": issued_at,
|
||||||
|
"kind": run_kind,
|
||||||
|
"source": source,
|
||||||
|
"model_version": model_version,
|
||||||
|
"points": [
|
||||||
|
{
|
||||||
|
"target_at": row[0],
|
||||||
|
"horizon_minutes": row[1],
|
||||||
|
"expected_power_w": row[2],
|
||||||
|
"p10_power_w": row[3],
|
||||||
|
"p50_power_w": row[4],
|
||||||
|
"p90_power_w": row[5],
|
||||||
|
"confidence": row[6],
|
||||||
|
}
|
||||||
|
for row in points
|
||||||
|
],
|
||||||
|
}
|
||||||
|
)
|
||||||
|
|
||||||
|
return runs
|
||||||
|
|
||||||
|
def evaluate_due_forecasts(
|
||||||
|
self,
|
||||||
|
actual_window: timedelta = timedelta(minutes=5),
|
||||||
|
lookback: timedelta = timedelta(days=7),
|
||||||
|
limit: int = 1000,
|
||||||
|
) -> int:
|
||||||
|
self.initialize()
|
||||||
|
start_at = datetime.now(timezone.utc) - lookback
|
||||||
|
|
||||||
|
with self._connection() as connection:
|
||||||
|
with connection.cursor() as cursor:
|
||||||
|
power_count = self._evaluate_due_power_forecasts(
|
||||||
|
cursor=cursor,
|
||||||
|
actual_window=actual_window,
|
||||||
|
start_at=start_at,
|
||||||
|
limit=limit,
|
||||||
|
)
|
||||||
|
remaining_limit = max(limit - power_count, 0)
|
||||||
|
net_count = 0
|
||||||
|
if remaining_limit > 0:
|
||||||
|
net_count = self._evaluate_due_net_forecasts(
|
||||||
|
cursor=cursor,
|
||||||
|
actual_window=actual_window,
|
||||||
|
start_at=start_at,
|
||||||
|
limit=remaining_limit,
|
||||||
|
)
|
||||||
|
connection.commit()
|
||||||
|
|
||||||
|
return power_count + net_count
|
||||||
|
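
# Hedged usage sketch of the evaluation entry point above, roughly what a
# periodic job such as oracle_evaluator.py might run. The import path
# gibil.classes.oracle.store is an assumption (this diff does not show the
# file's location), and the interval/limit values are illustrative only.
import time
from datetime import timedelta

from gibil.classes.oracle.store import OracleStore  # assumed module path

def evaluate_forever(poll_seconds: int = 300) -> None:
    store = OracleStore.from_env()
    while True:
        evaluated = store.evaluate_due_forecasts(
            actual_window=timedelta(minutes=5),
            lookback=timedelta(days=7),
            limit=1000,
        )
        print(f"evaluated {evaluated} forecast points")
        time.sleep(poll_seconds)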
|
||||||
|
def load_evaluation_summary(
|
||||||
|
self,
|
||||||
|
lookback: timedelta = timedelta(days=7),
|
||||||
|
) -> list[dict[str, object]]:
|
||||||
|
start_at = datetime.now(timezone.utc) - lookback
|
||||||
|
with self._connection() as connection:
|
||||||
|
with connection.cursor() as cursor:
|
||||||
|
cursor.execute(
|
||||||
|
"""
|
||||||
|
WITH bucketed AS (
|
||||||
|
SELECT
|
||||||
|
*,
|
||||||
|
CASE
|
||||||
|
WHEN horizon_minutes < 120 THEN 1
|
||||||
|
WHEN horizon_minutes < 240 THEN 2
|
||||||
|
WHEN horizon_minutes < 480 THEN 3
|
||||||
|
WHEN horizon_minutes < 960 THEN 4
|
||||||
|
ELSE 5
|
||||||
|
END AS horizon_bucket,
|
||||||
|
CASE
|
||||||
|
WHEN horizon_minutes < 120 THEN '0-2h'
|
||||||
|
WHEN horizon_minutes < 240 THEN '2-4h'
|
||||||
|
WHEN horizon_minutes < 480 THEN '4-8h'
|
||||||
|
WHEN horizon_minutes < 960 THEN '8-16h'
|
||||||
|
ELSE '16-24h'
|
||||||
|
END AS horizon_label
|
||||||
|
FROM oracle_forecast_evaluations
|
||||||
|
WHERE target_at >= %s
|
||||||
|
AND realized_power_w IS NOT NULL
|
||||||
|
)
|
||||||
|
SELECT
|
||||||
|
kind,
|
||||||
|
source,
|
||||||
|
model_version,
|
||||||
|
horizon_bucket,
|
||||||
|
horizon_label,
|
||||||
|
min(horizon_minutes) AS min_horizon_minutes,
|
||||||
|
max(horizon_minutes) AS max_horizon_minutes,
|
||||||
|
count(*) AS evaluated_count,
|
||||||
|
avg(error_w) AS mean_error_w,
|
||||||
|
avg(absolute_error_w) AS mean_absolute_error_w,
|
||||||
|
percentile_cont(0.50) WITHIN GROUP (
|
||||||
|
ORDER BY absolute_error_w
|
||||||
|
) AS median_absolute_error_w,
|
||||||
|
avg(absolute_pct_error) AS mean_absolute_pct_error,
|
||||||
|
avg(
|
||||||
|
CASE
|
||||||
|
WHEN covered_by_p10_p90 IS NULL THEN NULL
|
||||||
|
WHEN covered_by_p10_p90 THEN 1.0
|
||||||
|
ELSE 0.0
|
||||||
|
END
|
||||||
|
) AS interval_coverage
|
||||||
|
FROM bucketed
|
||||||
|
GROUP BY kind, source, model_version, horizon_bucket, horizon_label
|
||||||
|
ORDER BY kind, source, model_version, horizon_bucket
|
||||||
|
""",
|
||||||
|
(start_at,),
|
||||||
|
)
|
||||||
|
rows = cursor.fetchall()
|
||||||
|
|
||||||
|
return [
|
||||||
|
{
|
||||||
|
"kind": row[0],
|
||||||
|
"source": row[1],
|
||||||
|
"model_version": row[2],
|
||||||
|
"horizon_bucket": row[3],
|
||||||
|
"horizon_label": row[4],
|
||||||
|
"min_horizon_minutes": row[5],
|
||||||
|
"max_horizon_minutes": row[6],
|
||||||
|
"evaluated_count": row[7],
|
||||||
|
"mean_error_w": row[8],
|
||||||
|
"mean_absolute_error_w": row[9],
|
||||||
|
"median_absolute_error_w": row[10],
|
||||||
|
"mean_absolute_pct_error": row[11],
|
||||||
|
"interval_coverage": row[12],
|
||||||
|
}
|
||||||
|
for row in rows
|
||||||
|
]
|
||||||
|
|
||||||
|
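
# Standalone mirror of the SQL horizon bucketing above; handy for checking
# which summary row a forecast point with a given horizon lands in.
def horizon_label(horizon_minutes: int) -> str:
    if horizon_minutes < 120:
        return "0-2h"
    if horizon_minutes < 240:
        return "2-4h"
    if horizon_minutes < 480:
        return "4-8h"
    if horizon_minutes < 960:
        return "8-16h"
    return "16-24h"

print(horizon_label(90))   # 0-2h
print(horizon_label(600))  # 8-16h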
def _evaluate_due_power_forecasts(
|
||||||
|
self,
|
||||||
|
cursor: object,
|
||||||
|
actual_window: timedelta,
|
||||||
|
start_at: datetime,
|
||||||
|
limit: int,
|
||||||
|
) -> int:
|
||||||
|
cursor.execute(
|
||||||
|
"""
|
||||||
|
WITH candidates AS (
|
||||||
|
SELECT
|
||||||
|
forecast.issued_at,
|
||||||
|
forecast.target_at,
|
||||||
|
forecast.kind,
|
||||||
|
forecast.source,
|
||||||
|
forecast.model_version,
|
||||||
|
forecast.horizon_minutes,
|
||||||
|
forecast.expected_power_w,
|
||||||
|
forecast.p10_power_w,
|
||||||
|
forecast.p50_power_w,
|
||||||
|
forecast.p90_power_w
|
||||||
|
FROM oracle_power_forecast_points AS forecast
|
||||||
|
LEFT JOIN oracle_forecast_evaluations AS evaluation
|
||||||
|
ON evaluation.issued_at = forecast.issued_at
|
||||||
|
AND evaluation.target_at = forecast.target_at
|
||||||
|
AND evaluation.kind = forecast.kind
|
||||||
|
AND evaluation.source = forecast.source
|
||||||
|
AND evaluation.model_version = forecast.model_version
|
||||||
|
WHERE forecast.target_at >= %s
|
||||||
|
AND forecast.target_at <= now() - %s
|
||||||
|
AND (
|
||||||
|
evaluation.issued_at IS NULL
|
||||||
|
OR evaluation.sample_count = 0
|
||||||
|
)
|
||||||
|
ORDER BY forecast.target_at, forecast.issued_at
|
||||||
|
LIMIT %s
|
||||||
|
),
|
||||||
|
realized AS (
|
||||||
|
SELECT
|
||||||
|
candidates.*,
|
||||||
|
actual.realized_power_w,
|
||||||
|
actual.sample_count
|
||||||
|
FROM candidates
|
||||||
|
LEFT JOIN LATERAL (
|
||||||
|
SELECT
|
||||||
|
avg(
|
||||||
|
CASE candidates.kind
|
||||||
|
WHEN 'solar' THEN snapshot.solar_power_w
|
||||||
|
WHEN 'load' THEN snapshot.load_power_w
|
||||||
|
ELSE NULL
|
||||||
|
END
|
||||||
|
) AS realized_power_w,
|
||||||
|
count(*) FILTER (
|
||||||
|
WHERE CASE candidates.kind
|
||||||
|
WHEN 'solar' THEN snapshot.solar_power_w
|
||||||
|
WHEN 'load' THEN snapshot.load_power_w
|
||||||
|
ELSE NULL
|
||||||
|
END IS NOT NULL
|
||||||
|
) AS sample_count
|
||||||
|
FROM sigen_plant_snapshots AS snapshot
|
||||||
|
WHERE snapshot.observed_at >= candidates.target_at
|
||||||
|
AND snapshot.observed_at < candidates.target_at + %s
|
||||||
|
) AS actual ON TRUE
|
||||||
|
)
|
||||||
|
INSERT INTO oracle_forecast_evaluations (
|
||||||
|
issued_at,
|
||||||
|
target_at,
|
||||||
|
kind,
|
||||||
|
source,
|
||||||
|
model_version,
|
||||||
|
horizon_minutes,
|
||||||
|
expected_power_w,
|
||||||
|
p10_power_w,
|
||||||
|
p50_power_w,
|
||||||
|
p90_power_w,
|
||||||
|
realized_power_w,
|
||||||
|
error_w,
|
||||||
|
absolute_error_w,
|
||||||
|
absolute_pct_error,
|
||||||
|
covered_by_p10_p90,
|
||||||
|
sample_count,
|
||||||
|
evaluated_at
|
||||||
|
)
|
||||||
|
SELECT
|
||||||
|
issued_at,
|
||||||
|
target_at,
|
||||||
|
kind,
|
||||||
|
source,
|
||||||
|
model_version,
|
||||||
|
horizon_minutes,
|
||||||
|
expected_power_w,
|
||||||
|
p10_power_w,
|
||||||
|
p50_power_w,
|
||||||
|
p90_power_w,
|
||||||
|
realized_power_w,
|
||||||
|
realized_power_w - p50_power_w,
|
||||||
|
abs(realized_power_w - p50_power_w),
|
||||||
|
CASE
|
||||||
|
WHEN abs(realized_power_w) < 1 THEN NULL
|
||||||
|
ELSE abs(realized_power_w - p50_power_w) / abs(realized_power_w)
|
||||||
|
END,
|
||||||
|
CASE
|
||||||
|
WHEN realized_power_w IS NULL THEN NULL
|
||||||
|
ELSE realized_power_w BETWEEN p10_power_w AND p90_power_w
|
||||||
|
END,
|
||||||
|
COALESCE(sample_count, 0),
|
||||||
|
now()
|
||||||
|
FROM realized
|
||||||
|
ON CONFLICT (
|
||||||
|
issued_at,
|
||||||
|
target_at,
|
||||||
|
kind,
|
||||||
|
source,
|
||||||
|
model_version
|
||||||
|
)
|
||||||
|
DO UPDATE SET
|
||||||
|
horizon_minutes = EXCLUDED.horizon_minutes,
|
||||||
|
expected_power_w = EXCLUDED.expected_power_w,
|
||||||
|
p10_power_w = EXCLUDED.p10_power_w,
|
||||||
|
p50_power_w = EXCLUDED.p50_power_w,
|
||||||
|
p90_power_w = EXCLUDED.p90_power_w,
|
||||||
|
realized_power_w = EXCLUDED.realized_power_w,
|
||||||
|
error_w = EXCLUDED.error_w,
|
||||||
|
absolute_error_w = EXCLUDED.absolute_error_w,
|
||||||
|
absolute_pct_error = EXCLUDED.absolute_pct_error,
|
||||||
|
covered_by_p10_p90 = EXCLUDED.covered_by_p10_p90,
|
||||||
|
sample_count = EXCLUDED.sample_count,
|
||||||
|
evaluated_at = EXCLUDED.evaluated_at,
|
||||||
|
updated_at = now()
|
||||||
|
""",
|
||||||
|
(start_at, actual_window, limit, actual_window),
|
||||||
|
)
|
||||||
|
return cursor.rowcount
|
||||||
|
|
||||||
|
def _evaluate_due_net_forecasts(
|
||||||
|
self,
|
||||||
|
cursor: object,
|
||||||
|
actual_window: timedelta,
|
||||||
|
start_at: datetime,
|
||||||
|
limit: int,
|
||||||
|
) -> int:
|
||||||
|
cursor.execute(
|
||||||
|
"""
|
||||||
|
WITH candidates AS (
|
||||||
|
SELECT
|
||||||
|
forecast.issued_at,
|
||||||
|
forecast.target_at,
|
||||||
|
'net'::text AS kind,
|
||||||
|
forecast.source,
|
||||||
|
'net_forecaster_v1'::text AS model_version,
|
||||||
|
forecast.horizon_minutes,
|
||||||
|
forecast.expected_net_power_w AS expected_power_w,
|
||||||
|
COALESCE(forecast.p10_net_power_w, forecast.safe_net_power_w)
|
||||||
|
AS p10_power_w,
|
||||||
|
COALESCE(forecast.p50_net_power_w, forecast.expected_net_power_w)
|
||||||
|
AS p50_power_w,
|
||||||
|
forecast.p90_net_power_w AS p90_power_w
|
||||||
|
FROM oracle_net_forecast_points AS forecast
|
||||||
|
LEFT JOIN oracle_forecast_evaluations AS evaluation
|
||||||
|
ON evaluation.issued_at = forecast.issued_at
|
||||||
|
AND evaluation.target_at = forecast.target_at
|
||||||
|
AND evaluation.kind = 'net'
|
||||||
|
AND evaluation.source = forecast.source
|
||||||
|
AND evaluation.model_version = 'net_forecaster_v1'
|
||||||
|
WHERE forecast.target_at >= %s
|
||||||
|
AND forecast.target_at <= now() - %s
|
||||||
|
AND (
|
||||||
|
evaluation.issued_at IS NULL
|
||||||
|
OR evaluation.sample_count = 0
|
||||||
|
)
|
||||||
|
ORDER BY forecast.target_at, forecast.issued_at
|
||||||
|
LIMIT %s
|
||||||
|
),
|
||||||
|
realized AS (
|
||||||
|
SELECT
|
||||||
|
candidates.*,
|
||||||
|
actual.realized_power_w,
|
||||||
|
actual.sample_count
|
||||||
|
FROM candidates
|
||||||
|
LEFT JOIN LATERAL (
|
||||||
|
SELECT
|
||||||
|
avg(snapshot.solar_power_w - snapshot.load_power_w)
|
||||||
|
AS realized_power_w,
|
||||||
|
count(*) FILTER (
|
||||||
|
WHERE snapshot.solar_power_w IS NOT NULL
|
||||||
|
AND snapshot.load_power_w IS NOT NULL
|
||||||
|
) AS sample_count
|
||||||
|
FROM sigen_plant_snapshots AS snapshot
|
||||||
|
WHERE snapshot.observed_at >= candidates.target_at
|
||||||
|
AND snapshot.observed_at < candidates.target_at + %s
|
||||||
|
) AS actual ON TRUE
|
||||||
|
)
|
||||||
|
INSERT INTO oracle_forecast_evaluations (
|
||||||
|
issued_at,
|
||||||
|
target_at,
|
||||||
|
kind,
|
||||||
|
source,
|
||||||
|
model_version,
|
||||||
|
horizon_minutes,
|
||||||
|
expected_power_w,
|
||||||
|
p10_power_w,
|
||||||
|
p50_power_w,
|
||||||
|
p90_power_w,
|
||||||
|
realized_power_w,
|
||||||
|
error_w,
|
||||||
|
absolute_error_w,
|
||||||
|
absolute_pct_error,
|
||||||
|
covered_by_p10_p90,
|
||||||
|
sample_count,
|
||||||
|
evaluated_at
|
||||||
|
)
|
||||||
|
SELECT
|
||||||
|
issued_at,
|
||||||
|
target_at,
|
||||||
|
kind,
|
||||||
|
source,
|
||||||
|
model_version,
|
||||||
|
horizon_minutes,
|
||||||
|
expected_power_w,
|
||||||
|
p10_power_w,
|
||||||
|
p50_power_w,
|
||||||
|
p90_power_w,
|
||||||
|
realized_power_w,
|
||||||
|
realized_power_w - p50_power_w,
|
||||||
|
abs(realized_power_w - p50_power_w),
|
||||||
|
CASE
|
||||||
|
WHEN abs(realized_power_w) < 1 THEN NULL
|
||||||
|
ELSE abs(realized_power_w - p50_power_w) / abs(realized_power_w)
|
||||||
|
END,
|
||||||
|
CASE
|
||||||
|
WHEN realized_power_w IS NULL OR p90_power_w IS NULL THEN NULL
|
||||||
|
ELSE realized_power_w BETWEEN p10_power_w AND p90_power_w
|
||||||
|
END,
|
||||||
|
COALESCE(sample_count, 0),
|
||||||
|
now()
|
||||||
|
FROM realized
|
||||||
|
ON CONFLICT (
|
||||||
|
issued_at,
|
||||||
|
target_at,
|
||||||
|
kind,
|
||||||
|
source,
|
||||||
|
model_version
|
||||||
|
)
|
||||||
|
DO UPDATE SET
|
||||||
|
horizon_minutes = EXCLUDED.horizon_minutes,
|
||||||
|
expected_power_w = EXCLUDED.expected_power_w,
|
||||||
|
p10_power_w = EXCLUDED.p10_power_w,
|
||||||
|
p50_power_w = EXCLUDED.p50_power_w,
|
||||||
|
p90_power_w = EXCLUDED.p90_power_w,
|
||||||
|
realized_power_w = EXCLUDED.realized_power_w,
|
||||||
|
error_w = EXCLUDED.error_w,
|
||||||
|
absolute_error_w = EXCLUDED.absolute_error_w,
|
||||||
|
absolute_pct_error = EXCLUDED.absolute_pct_error,
|
||||||
|
covered_by_p10_p90 = EXCLUDED.covered_by_p10_p90,
|
||||||
|
sample_count = EXCLUDED.sample_count,
|
||||||
|
evaluated_at = EXCLUDED.evaluated_at,
|
||||||
|
updated_at = now()
|
||||||
|
""",
|
||||||
|
(start_at, actual_window, limit, actual_window),
|
||||||
|
)
|
||||||
|
return cursor.rowcount
|
||||||
|
|
||||||
|
@contextmanager
|
||||||
|
def _connection(self) -> Iterator[object]:
|
||||||
|
try:
|
||||||
|
import psycopg
|
||||||
|
except ImportError as error:
|
||||||
|
raise OracleStoreConfigurationError(
|
||||||
|
"Install dependencies with `python3 -m pip install -r requirements.txt`"
|
||||||
|
) from error
|
||||||
|
|
||||||
|
with psycopg.connect(self.config.database_url) as connection:
|
||||||
|
yield connection
|
||||||
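
# Minimal standalone sketch of the connection pattern wrapped by _connection():
# psycopg.connect() is itself a context manager, so the helper simply yields
# the live connection and the with-block closes it. The URL is a placeholder.
import psycopg  # installed via requirements.txt, as the error message above notes

with psycopg.connect("postgresql://astrape:astrape@localhost:5432/astrape") as connection:
    with connection.cursor() as cursor:
        cursor.execute("SELECT 1")
        print(cursor.fetchone())  # (1,)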
@@ -0,0 +1,9 @@
__all__ = [
    "BaselineSolarProductionOracle",
    "BaselineUsageOracle",
    "DailyUsageOracle",
    "HistoricalUsageOracle",
    "SequenceUsageOracle",
    "NetPowerForecaster",
    "RollingSolarRegressionOracle",
]
@@ -0,0 +1,69 @@
from __future__ import annotations


def fit_ridge_regression(
    features: list[list[float]],
    targets: list[float],
    ridge_lambda: float,
) -> list[float] | None:
    if not features:
        return None

    width = len(features[0])
    xtx = [[0.0 for _ in range(width)] for _ in range(width)]
    xty = [0.0 for _ in range(width)]

    for row, target in zip(features, targets):
        for i in range(width):
            xty[i] += row[i] * target
            for j in range(width):
                xtx[i][j] += row[i] * row[j]

    for i in range(1, width):
        xtx[i][i] += ridge_lambda

    return solve_linear_system(xtx, xty)


def solve_linear_system(
    matrix: list[list[float]],
    vector: list[float],
) -> list[float] | None:
    size = len(vector)
    rows = [matrix[index][:] + [vector[index]] for index in range(size)]

    for pivot_index in range(size):
        pivot_row = max(
            range(pivot_index, size),
            key=lambda row_index: abs(rows[row_index][pivot_index]),
        )
        if abs(rows[pivot_row][pivot_index]) < 1e-9:
            return None

        rows[pivot_index], rows[pivot_row] = rows[pivot_row], rows[pivot_index]
        pivot = rows[pivot_index][pivot_index]
        rows[pivot_index] = [value / pivot for value in rows[pivot_index]]

        for row_index in range(size):
            if row_index == pivot_index:
                continue
            factor = rows[row_index][pivot_index]
            rows[row_index] = [
                value - factor * pivot_value
                for value, pivot_value in zip(rows[row_index], rows[pivot_index])
            ]

    return [row[-1] for row in rows]


def dot(left: list[float], right: list[float]) -> float:
    return sum(left_value * right_value for left_value, right_value in zip(left, right))


def quantile(values: list[float], q: float) -> float:
    if not values:
        return 0.0

    sorted_values = sorted(values)
    index = round((len(sorted_values) - 1) * q)
    return sorted_values[index]
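
# Tiny worked example of the helpers above. The import matches the one used by
# RollingSolarRegressionOracle and assumes the gibil package is importable.
from gibil.classes.predictors.math_utils import dot, fit_ridge_regression, quantile

# y ~ 2 + 3x, written with a bias feature in column 0 (the column the ridge
# penalty above deliberately skips).
features = [[1.0, 0.0], [1.0, 1.0], [1.0, 2.0], [1.0, 3.0]]
targets = [2.0, 5.1, 7.9, 11.0]

coefficients = fit_ridge_regression(features, targets, ridge_lambda=0.01)
print(coefficients)                   # roughly [2.0, 3.0]
print(dot(coefficients, [1.0, 4.0]))  # prediction near 14

residuals = [target - dot(coefficients, row) for row, target in zip(features, targets)]
print(quantile(residuals, 0.10), quantile(residuals, 0.90))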
@@ -0,0 +1,54 @@
from __future__ import annotations

from gibil.classes.models import NetPowerForecastPoint, NetPowerForecastRun, PowerForecastRun


class NetPowerForecaster:
    """Combines production and usage curves into expected and interval net power."""

    def combine(
        self,
        solar_run: PowerForecastRun,
        load_run: PowerForecastRun,
    ) -> NetPowerForecastRun:
        load_by_target = {point.target_at: point for point in load_run.points}
        points: list[NetPowerForecastPoint] = []

        for solar_point in solar_run.points:
            load_point = load_by_target.get(solar_point.target_at)
            if load_point is None:
                continue

            points.append(
                NetPowerForecastPoint(
                    target_at=solar_point.target_at,
                    horizon_minutes=solar_point.horizon_minutes,
                    expected_net_power_w=(
                        solar_point.p50_power_w - load_point.p50_power_w
                    ),
                    safe_net_power_w=(
                        solar_point.p10_power_w - load_point.p90_power_w
                    ),
                    p10_net_power_w=(
                        solar_point.p10_power_w - load_point.p90_power_w
                    ),
                    p50_net_power_w=(
                        solar_point.p50_power_w - load_point.p50_power_w
                    ),
                    p90_net_power_w=(
                        solar_point.p90_power_w - load_point.p10_power_w
                    ),
                    solar_p50_power_w=solar_point.p50_power_w,
                    load_p50_power_w=load_point.p50_power_w,
                    solar_p10_power_w=solar_point.p10_power_w,
                    solar_p90_power_w=solar_point.p90_power_w,
                    load_p10_power_w=load_point.p10_power_w,
                    load_p90_power_w=load_point.p90_power_w,
                )
            )

        return NetPowerForecastRun(
            issued_at=solar_run.issued_at,
            source="baseline_net_forecaster",
            points=points,
        )
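
# Worked example of the interval arithmetic above with made-up numbers: the
# pessimistic net (p10, also the "safe" value) pairs low solar with high load,
# while the optimistic net (p90) pairs high solar with low load.
solar_p10, solar_p50, solar_p90 = 1200.0, 2500.0, 3400.0  # watts
load_p10, load_p50, load_p90 = 400.0, 800.0, 1900.0       # watts

expected_net = solar_p50 - load_p50  # 1700.0 W
safe_net = solar_p10 - load_p90      # -700.0 W, also used as p10_net
p90_net = solar_p90 - load_p10       # 3000.0 W
print(expected_net, safe_net, p90_net)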
@@ -0,0 +1,94 @@
|
|||||||
|
from __future__ import annotations
|
||||||
|
|
||||||
|
from datetime import datetime, timedelta, timezone
|
||||||
|
|
||||||
|
from gibil.classes.models import ForecastKind, PowerForecastPoint, PowerForecastRun, WeatherForecastPoint
|
||||||
|
from gibil.classes.oracle.config import EnergyForecastConfig
|
||||||
|
from gibil.classes.sigen.store import SigenStore
|
||||||
|
from gibil.classes.weather.store import WeatherStore
|
||||||
|
|
||||||
|
|
||||||
|
class BaselineSolarProductionOracle:
|
||||||
|
"""Forecasts solar production from shortwave radiation and recent plant peak."""
|
||||||
|
|
||||||
|
model_version = "baseline_solar_radiation_v1"
|
||||||
|
|
||||||
|
def __init__(
|
||||||
|
self,
|
||||||
|
weather_store: WeatherStore,
|
||||||
|
sigen_store: SigenStore,
|
||||||
|
config: EnergyForecastConfig,
|
||||||
|
) -> None:
|
||||||
|
self.weather_store = weather_store
|
||||||
|
self.sigen_store = sigen_store
|
||||||
|
self.config = config
|
||||||
|
|
||||||
|
def forecast(self, issued_at: datetime | None = None) -> PowerForecastRun:
|
||||||
|
if issued_at is None:
|
||||||
|
issued_at = datetime.now(timezone.utc)
|
||||||
|
|
||||||
|
weather_points = self.weather_store.load_latest_forecast_points(
|
||||||
|
start_at=issued_at,
|
||||||
|
end_at=issued_at + timedelta(hours=self.config.horizon_hours),
|
||||||
|
)
|
||||||
|
peak_w = self._solar_peak_w()
|
||||||
|
points = [
|
||||||
|
self._forecast_point(
|
||||||
|
weather_point=point,
|
||||||
|
issued_at=issued_at,
|
||||||
|
peak_w=peak_w,
|
||||||
|
)
|
||||||
|
for point in weather_points
|
||||||
|
]
|
||||||
|
|
||||||
|
return PowerForecastRun(
|
||||||
|
issued_at=issued_at,
|
||||||
|
kind=ForecastKind.SOLAR,
|
||||||
|
source="baseline_solar_oracle",
|
||||||
|
model_version=self.model_version,
|
||||||
|
points=points,
|
||||||
|
)
|
||||||
|
|
||||||
|
def _forecast_point(
|
||||||
|
self,
|
||||||
|
weather_point: WeatherForecastPoint,
|
||||||
|
issued_at: datetime,
|
||||||
|
peak_w: float,
|
||||||
|
) -> PowerForecastPoint:
|
||||||
|
radiation = max(weather_point.shortwave_radiation_w_m2 or 0.0, 0.0)
|
||||||
|
expected = min(peak_w, peak_w * (radiation / 1000.0) * self.config.solar_scale)
|
||||||
|
cloud_cover = weather_point.cloud_cover_pct
|
||||||
|
cloud_uncertainty = 1.0
|
||||||
|
if cloud_cover is not None:
|
||||||
|
cloud_uncertainty += min(max(cloud_cover, 0.0), 100.0) / 200.0
|
||||||
|
|
||||||
|
p10 = max(0.0, expected * (0.75 / cloud_uncertainty))
|
||||||
|
p90 = min(peak_w, expected * (1.15 * cloud_uncertainty))
|
||||||
|
|
||||||
|
return PowerForecastPoint(
|
||||||
|
target_at=weather_point.target_at,
|
||||||
|
horizon_minutes=self._horizon_minutes(issued_at, weather_point.target_at),
|
||||||
|
expected_power_w=expected,
|
||||||
|
p10_power_w=p10,
|
||||||
|
p50_power_w=expected,
|
||||||
|
p90_power_w=p90,
|
||||||
|
confidence=0.25,
|
||||||
|
source="open_meteo_shortwave",
|
||||||
|
model_version=self.model_version,
|
||||||
|
metadata={
|
||||||
|
"shortwave_radiation_w_m2": weather_point.shortwave_radiation_w_m2,
|
||||||
|
"cloud_cover_pct": weather_point.cloud_cover_pct,
|
||||||
|
"temperature_c": weather_point.temperature_c,
|
||||||
|
"solar_peak_w": peak_w,
|
||||||
|
"fallback_reason": "not_enough_solar_training_samples",
|
||||||
|
},
|
||||||
|
)
|
||||||
|
|
||||||
|
def _solar_peak_w(self) -> float:
|
||||||
|
recent_peak = self.sigen_store.load_recent_solar_peak_w()
|
||||||
|
if recent_peak is None or recent_peak <= 0:
|
||||||
|
return self.config.fallback_solar_peak_w
|
||||||
|
return recent_peak * max(self.config.solar_peak_headroom, 1.0)
|
||||||
|
|
||||||
|
def _horizon_minutes(self, issued_at: datetime, target_at: datetime) -> int:
|
||||||
|
return max(0, round((target_at - issued_at).total_seconds() / 60))
|
||||||
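
# Worked example of the baseline point maths above, with made-up inputs: a
# 10 kW recent plant peak, 600 W/m^2 shortwave radiation, 50% cloud cover, and
# solar_scale assumed to be 1.0.
peak_w = 10_000.0
radiation = 600.0
cloud_cover = 50.0
solar_scale = 1.0

expected = min(peak_w, peak_w * (radiation / 1000.0) * solar_scale)  # 6000.0 W
cloud_uncertainty = 1.0 + min(max(cloud_cover, 0.0), 100.0) / 200.0  # 1.25
p10 = max(0.0, expected * (0.75 / cloud_uncertainty))                # 3600.0 W
p90 = min(peak_w, expected * (1.15 * cloud_uncertainty))             # 8625.0 W
print(expected, p10, p90)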
@@ -0,0 +1,175 @@
|
|||||||
|
from __future__ import annotations
|
||||||
|
|
||||||
|
from dataclasses import dataclass
|
||||||
|
from datetime import datetime, timedelta, timezone
|
||||||
|
|
||||||
|
from gibil.classes.models import ForecastKind, PowerForecastPoint, PowerForecastRun, WeatherForecastPoint
|
||||||
|
from gibil.classes.oracle.config import EnergyForecastConfig
|
||||||
|
from gibil.classes.predictors.solar_baseline import BaselineSolarProductionOracle
|
||||||
|
from gibil.classes.predictors.math_utils import dot, fit_ridge_regression, quantile
|
||||||
|
from gibil.classes.sigen.store import SigenStore
|
||||||
|
from gibil.classes.weather.store import WeatherStore
|
||||||
|
|
||||||
|
|
||||||
|
class RollingSolarRegressionOracle:
|
||||||
|
"""Forecasts solar production with a rolling ridge regression."""
|
||||||
|
|
||||||
|
model_version = "rolling_solar_regression_v1"
|
||||||
|
|
||||||
|
def __init__(
|
||||||
|
self,
|
||||||
|
weather_store: WeatherStore,
|
||||||
|
sigen_store: SigenStore,
|
||||||
|
config: EnergyForecastConfig,
|
||||||
|
) -> None:
|
||||||
|
self.weather_store = weather_store
|
||||||
|
self.sigen_store = sigen_store
|
||||||
|
self.config = config
|
||||||
|
|
||||||
|
def forecast(self, issued_at: datetime | None = None) -> PowerForecastRun:
|
||||||
|
if issued_at is None:
|
||||||
|
issued_at = datetime.now(timezone.utc)
|
||||||
|
|
||||||
|
weather_points = self.weather_store.load_latest_forecast_points(
|
||||||
|
start_at=issued_at,
|
||||||
|
end_at=issued_at + timedelta(hours=self.config.horizon_hours),
|
||||||
|
)
|
||||||
|
training_samples = self.sigen_store.load_solar_training_samples(
|
||||||
|
lookback=timedelta(days=self.config.solar_training_days)
|
||||||
|
)
|
||||||
|
model = self._fit_model(training_samples)
|
||||||
|
if model is None:
|
||||||
|
return BaselineSolarProductionOracle(
|
||||||
|
weather_store=self.weather_store,
|
||||||
|
sigen_store=self.sigen_store,
|
||||||
|
config=self.config,
|
||||||
|
).forecast(issued_at=issued_at)
|
||||||
|
|
||||||
|
points = [
|
||||||
|
self._forecast_point(
|
||||||
|
weather_point=point,
|
||||||
|
issued_at=issued_at,
|
||||||
|
model=model,
|
||||||
|
training_sample_count=len(training_samples),
|
||||||
|
)
|
||||||
|
for point in weather_points
|
||||||
|
]
|
||||||
|
|
||||||
|
return PowerForecastRun(
|
||||||
|
issued_at=issued_at,
|
||||||
|
kind=ForecastKind.SOLAR,
|
||||||
|
source="rolling_solar_regression_oracle",
|
||||||
|
model_version=self.model_version,
|
||||||
|
points=points,
|
||||||
|
)
|
||||||
|
|
||||||
|
def _fit_model(
|
||||||
|
self,
|
||||||
|
samples: list[dict[str, float | int | object]],
|
||||||
|
) -> "_SolarRegressionModel | None":
|
||||||
|
if len(samples) < self.config.solar_min_training_samples:
|
||||||
|
return None
|
||||||
|
|
||||||
|
features = [
|
||||||
|
self._features(
|
||||||
|
radiation=float(sample["shortwave_radiation_w_m2"]),
|
||||||
|
cloud_cover=float(sample["cloud_cover_pct"]),
|
||||||
|
)
|
||||||
|
for sample in samples
|
||||||
|
]
|
||||||
|
targets = [float(sample["solar_power_w"]) for sample in samples]
|
||||||
|
|
||||||
|
coefficients = fit_ridge_regression(
|
||||||
|
features,
|
||||||
|
targets,
|
||||||
|
ridge_lambda=self.config.solar_ridge_lambda,
|
||||||
|
)
|
||||||
|
if coefficients is None:
|
||||||
|
return None
|
||||||
|
|
||||||
|
residuals = [
|
||||||
|
target - dot(coefficients, feature)
|
||||||
|
for feature, target in zip(features, targets)
|
||||||
|
]
|
||||||
|
return _SolarRegressionModel(
|
||||||
|
coefficients=coefficients,
|
||||||
|
residual_p10=quantile(residuals, 0.10),
|
||||||
|
residual_p90=quantile(residuals, 0.90),
|
||||||
|
peak_w=self._solar_peak_w(),
|
||||||
|
)
|
||||||
|
|
||||||
|
def _forecast_point(
|
||||||
|
self,
|
||||||
|
weather_point: WeatherForecastPoint,
|
||||||
|
issued_at: datetime,
|
||||||
|
model: "_SolarRegressionModel",
|
||||||
|
training_sample_count: int,
|
||||||
|
) -> PowerForecastPoint:
|
||||||
|
radiation = max(weather_point.shortwave_radiation_w_m2 or 0.0, 0.0)
|
||||||
|
cloud_cover = self._cloud_cover(weather_point.cloud_cover_pct)
|
||||||
|
expected = model.predict(self._features(radiation, cloud_cover))
|
||||||
|
expected *= self.config.solar_scale
|
||||||
|
p10 = max(0.0, expected + model.residual_p10)
|
||||||
|
p90 = min(model.peak_w, expected + model.residual_p90)
|
||||||
|
if p90 < expected:
|
||||||
|
p90 = expected
|
||||||
|
if p10 > expected:
|
||||||
|
p10 = expected
|
||||||
|
|
||||||
|
return PowerForecastPoint(
|
||||||
|
target_at=weather_point.target_at,
|
||||||
|
horizon_minutes=self._horizon_minutes(issued_at, weather_point.target_at),
|
||||||
|
expected_power_w=expected,
|
||||||
|
p10_power_w=p10,
|
||||||
|
p50_power_w=expected,
|
||||||
|
p90_power_w=p90,
|
||||||
|
confidence=0.45,
|
||||||
|
source="rolling_solar_regression",
|
||||||
|
model_version=self.model_version,
|
||||||
|
metadata={
|
||||||
|
"shortwave_radiation_w_m2": weather_point.shortwave_radiation_w_m2,
|
||||||
|
"cloud_cover_pct": weather_point.cloud_cover_pct,
|
||||||
|
"temperature_c": weather_point.temperature_c,
|
||||||
|
"solar_peak_w": model.peak_w,
|
||||||
|
"training_sample_count": training_sample_count,
|
||||||
|
"residual_p10_w": model.residual_p10,
|
||||||
|
"residual_p90_w": model.residual_p90,
|
||||||
|
},
|
||||||
|
)
|
||||||
|
|
||||||
|
def _features(self, radiation: float, cloud_cover: float) -> list[float]:
|
||||||
|
radiation_kw = radiation / 1000.0
|
||||||
|
cloud = cloud_cover / 100.0
|
||||||
|
clear = 1.0 - cloud
|
||||||
|
return [
|
||||||
|
1.0,
|
||||||
|
radiation_kw,
|
||||||
|
radiation_kw * clear,
|
||||||
|
radiation_kw * cloud,
|
||||||
|
cloud,
|
||||||
|
]
|
||||||
|
|
||||||
|
def _cloud_cover(self, value: float | None) -> float:
|
||||||
|
if value is None:
|
||||||
|
return 0.0
|
||||||
|
return min(max(value, 0.0), 100.0)
|
||||||
|
|
||||||
|
def _solar_peak_w(self) -> float:
|
||||||
|
recent_peak = self.sigen_store.load_recent_solar_peak_w()
|
||||||
|
if recent_peak is None or recent_peak <= 0:
|
||||||
|
return self.config.fallback_solar_peak_w
|
||||||
|
return recent_peak * max(self.config.solar_peak_headroom, 1.0)
|
||||||
|
|
||||||
|
def _horizon_minutes(self, issued_at: datetime, target_at: datetime) -> int:
|
||||||
|
return max(0, round((target_at - issued_at).total_seconds() / 60))
|
||||||
|
|
||||||
|
|
||||||
|
@dataclass(frozen=True)
|
||||||
|
class _SolarRegressionModel:
|
||||||
|
coefficients: list[float]
|
||||||
|
residual_p10: float
|
||||||
|
residual_p90: float
|
||||||
|
peak_w: float
|
||||||
|
|
||||||
|
def predict(self, features: list[float]) -> float:
|
||||||
|
return min(max(dot(self.coefficients, features), 0.0), self.peak_w)
|
||||||
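
# Standalone sketch of the feature vector used by the rolling regression above
# and of how residual quantiles become a p10/p90 band. The coefficients and
# residual quantiles here are made-up, not fitted values.
def features(radiation_w_m2: float, cloud_cover_pct: float) -> list[float]:
    radiation_kw = radiation_w_m2 / 1000.0
    cloud = cloud_cover_pct / 100.0
    clear = 1.0 - cloud
    return [1.0, radiation_kw, radiation_kw * clear, radiation_kw * cloud, cloud]

coefficients = [150.0, 5200.0, 2600.0, -900.0, -300.0]  # illustrative only
residual_p10, residual_p90 = -450.0, 380.0              # from past prediction errors

row = features(radiation_w_m2=700.0, cloud_cover_pct=30.0)
expected = sum(coefficient * value for coefficient, value in zip(coefficients, row))
p10 = max(0.0, expected + residual_p10)
p90 = expected + residual_p90
print(round(expected), round(p10), round(p90))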
@@ -0,0 +1,76 @@
|
|||||||
|
from __future__ import annotations
|
||||||
|
|
||||||
|
from datetime import datetime, timedelta, timezone
|
||||||
|
|
||||||
|
from gibil.classes.models import ForecastKind, PowerForecastPoint, PowerForecastRun
|
||||||
|
from gibil.classes.oracle.config import EnergyForecastConfig
|
||||||
|
from gibil.classes.sigen.store import SigenStore
|
||||||
|
|
||||||
|
|
||||||
|
class BaselineUsageOracle:
|
||||||
|
"""Forecasts near-future load from recent high-resolution Sigen history."""
|
||||||
|
|
||||||
|
model_version = "baseline_recent_load_v1"
|
||||||
|
|
||||||
|
def __init__(
|
||||||
|
self,
|
||||||
|
sigen_store: SigenStore,
|
||||||
|
config: EnergyForecastConfig,
|
||||||
|
) -> None:
|
||||||
|
self.sigen_store = sigen_store
|
||||||
|
self.config = config
|
||||||
|
|
||||||
|
def forecast(
|
||||||
|
self,
|
||||||
|
target_times: list[datetime],
|
||||||
|
issued_at: datetime | None = None,
|
||||||
|
) -> PowerForecastRun:
|
||||||
|
if issued_at is None:
|
||||||
|
issued_at = datetime.now(timezone.utc)
|
||||||
|
|
||||||
|
lookback = timedelta(minutes=self.config.load_lookback_minutes)
|
||||||
|
summary = self.sigen_store.load_recent_power_summary(lookback=lookback)
|
||||||
|
latest = self.sigen_store.load_latest_snapshot()
|
||||||
|
fallback_load_w = latest.load_power_w if latest else 0.0
|
||||||
|
|
||||||
|
p50 = self._number(summary.get("load_p50_w"), fallback_load_w)
|
||||||
|
p10 = max(0.0, self._number(summary.get("load_p10_w"), p50 * 0.7))
|
||||||
|
p90 = max(
|
||||||
|
self._number(summary.get("load_p90_w"), p50 * 1.5),
|
||||||
|
p50 * 1.25,
|
||||||
|
)
|
||||||
|
|
||||||
|
points = [
|
||||||
|
PowerForecastPoint(
|
||||||
|
target_at=target_at,
|
||||||
|
horizon_minutes=max(
|
||||||
|
0, round((target_at - issued_at).total_seconds() / 60)
|
||||||
|
),
|
||||||
|
expected_power_w=p50,
|
||||||
|
p10_power_w=p10,
|
||||||
|
p50_power_w=p50,
|
||||||
|
p90_power_w=p90,
|
||||||
|
confidence=0.35,
|
||||||
|
source="recent_sigen_load",
|
||||||
|
model_version=self.model_version,
|
||||||
|
metadata={
|
||||||
|
"lookback_minutes": self.config.load_lookback_minutes,
|
||||||
|
"load_avg_w": summary.get("load_avg_w"),
|
||||||
|
"load_max_w": summary.get("load_max_w"),
|
||||||
|
},
|
||||||
|
)
|
||||||
|
for target_at in target_times
|
||||||
|
]
|
||||||
|
|
||||||
|
return PowerForecastRun(
|
||||||
|
issued_at=issued_at,
|
||||||
|
kind=ForecastKind.LOAD,
|
||||||
|
source="baseline_usage_oracle",
|
||||||
|
model_version=self.model_version,
|
||||||
|
points=points,
|
||||||
|
)
|
||||||
|
|
||||||
|
def _number(self, value: object, fallback: float) -> float:
|
||||||
|
if value is None:
|
||||||
|
return float(fallback)
|
||||||
|
return float(value)
|
||||||
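
# Worked example of the fallbacks above with made-up summary values: the recent
# p50 is projected flat across every target time, and missing quantiles fall
# back to simple multiples of it.
summary = {"load_p50_w": 850.0, "load_p10_w": None, "load_p90_w": 1400.0}

p50 = summary["load_p50_w"]
p10 = max(0.0, summary["load_p10_w"] if summary["load_p10_w"] is not None else p50 * 0.7)
p90 = max(summary["load_p90_w"] if summary["load_p90_w"] is not None else p50 * 1.5, p50 * 1.25)
print(p10, p50, p90)  # 595.0 850.0 1400.0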
@@ -0,0 +1,188 @@
|
|||||||
|
from __future__ import annotations
|
||||||
|
|
||||||
|
from datetime import datetime, timedelta, timezone
|
||||||
|
from zoneinfo import ZoneInfo, ZoneInfoNotFoundError
|
||||||
|
|
||||||
|
from gibil.classes.models import ForecastKind, PowerForecastPoint, PowerForecastRun
|
||||||
|
from gibil.classes.oracle.config import EnergyForecastConfig
|
||||||
|
from gibil.classes.sigen.store import SigenStore
|
||||||
|
|
||||||
|
|
||||||
|
class DailyUsageOracle:
|
||||||
|
"""Forecasts load from time-of-day history blended with recent load."""
|
||||||
|
|
||||||
|
model_version = "daily_usage_profile_v1"
|
||||||
|
|
||||||
|
def __init__(
|
||||||
|
self,
|
||||||
|
sigen_store: SigenStore,
|
||||||
|
config: EnergyForecastConfig,
|
||||||
|
) -> None:
|
||||||
|
self.sigen_store = sigen_store
|
||||||
|
self.config = config
|
||||||
|
|
||||||
|
def forecast(
|
||||||
|
self,
|
||||||
|
target_times: list[datetime],
|
||||||
|
issued_at: datetime | None = None,
|
||||||
|
) -> PowerForecastRun:
|
||||||
|
if issued_at is None:
|
||||||
|
issued_at = datetime.now(timezone.utc)
|
||||||
|
|
||||||
|
recent_summary = self.sigen_store.load_recent_power_summary(
|
||||||
|
lookback=timedelta(minutes=self.config.load_lookback_minutes)
|
||||||
|
)
|
||||||
|
profile = self._daily_profile()
|
||||||
|
latest = self.sigen_store.load_latest_snapshot()
|
||||||
|
fallback_load_w = latest.load_power_w if latest else 0.0
|
||||||
|
recent_p50 = self._number(recent_summary.get("load_p50_w"), fallback_load_w)
|
||||||
|
recent_p10 = self._number(recent_summary.get("load_p10_w"), recent_p50 * 0.7)
|
||||||
|
recent_p90 = self._number(recent_summary.get("load_p90_w"), recent_p50 * 1.5)
|
||||||
|
blend = min(max(self.config.load_recent_blend, 0.0), 1.0)
|
||||||
|
|
||||||
|
points = [
|
||||||
|
self._forecast_point(
|
||||||
|
target_at=target_at,
|
||||||
|
issued_at=issued_at,
|
||||||
|
profile=profile,
|
||||||
|
recent_p10=recent_p10,
|
||||||
|
recent_p50=recent_p50,
|
||||||
|
recent_p90=recent_p90,
|
||||||
|
blend=blend,
|
||||||
|
)
|
||||||
|
for target_at in target_times
|
||||||
|
]
|
||||||
|
|
||||||
|
return PowerForecastRun(
|
||||||
|
issued_at=issued_at,
|
||||||
|
kind=ForecastKind.LOAD,
|
||||||
|
source="daily_usage_oracle",
|
||||||
|
model_version=self.model_version,
|
||||||
|
points=points,
|
||||||
|
)
|
||||||
|
|
||||||
|
def _daily_profile(self) -> dict[int, dict[str, float | int]]:
|
||||||
|
weekly_profile = self.sigen_store.load_load_profile(
|
||||||
|
lookback=timedelta(days=self.config.load_profile_days),
|
||||||
|
bucket_minutes=self.config.load_profile_bucket_minutes,
|
||||||
|
min_samples=self.config.load_profile_min_samples,
|
||||||
|
timezone_name=self._local_timezone_name(),
|
||||||
|
)
|
||||||
|
grouped: dict[int, list[dict[str, float | int]]] = {}
|
||||||
|
for (_iso_dow, minute_bucket), values in weekly_profile.items():
|
||||||
|
grouped.setdefault(minute_bucket, []).append(values)
|
||||||
|
|
||||||
|
return {
|
||||||
|
minute_bucket: self._weighted_profile(values)
|
||||||
|
for minute_bucket, values in grouped.items()
|
||||||
|
}
|
||||||
|
|
||||||
|
def _weighted_profile(
|
||||||
|
self,
|
||||||
|
values: list[dict[str, float | int]],
|
||||||
|
) -> dict[str, float | int]:
|
||||||
|
total_samples = sum(int(value["sample_count"]) for value in values)
|
||||||
|
if total_samples <= 0:
|
||||||
|
total_samples = len(values)
|
||||||
|
|
||||||
|
return {
|
||||||
|
"p10": self._weighted_average(values, "p10", total_samples),
|
||||||
|
"p50": self._weighted_average(values, "p50", total_samples),
|
||||||
|
"p90": self._weighted_average(values, "p90", total_samples),
|
||||||
|
"avg_load_power_w": self._weighted_average(
|
||||||
|
values,
|
||||||
|
"avg_load_power_w",
|
||||||
|
total_samples,
|
||||||
|
),
|
||||||
|
"max_load_power_w": max(float(value["max_load_power_w"]) for value in values),
|
||||||
|
"sample_count": total_samples,
|
||||||
|
"weekday_bucket_count": len(values),
|
||||||
|
}
|
||||||
|
|
||||||
|
def _weighted_average(
|
||||||
|
self,
|
||||||
|
values: list[dict[str, float | int]],
|
||||||
|
key: str,
|
||||||
|
total_samples: int,
|
||||||
|
) -> float:
|
||||||
|
return sum(
|
||||||
|
float(value[key]) * int(value["sample_count"])
|
||||||
|
for value in values
|
||||||
|
) / total_samples
|
||||||
|
|
||||||
|
def _forecast_point(
|
||||||
|
self,
|
||||||
|
target_at: datetime,
|
||||||
|
issued_at: datetime,
|
||||||
|
profile: dict[int, dict[str, float | int]],
|
||||||
|
recent_p10: float,
|
||||||
|
recent_p50: float,
|
||||||
|
recent_p90: float,
|
||||||
|
blend: float,
|
||||||
|
) -> PowerForecastPoint:
|
||||||
|
profile_key = self._profile_key(target_at)
|
||||||
|
profile_values = profile.get(profile_key)
|
||||||
|
|
||||||
|
if profile_values is None:
|
||||||
|
p10 = max(0.0, recent_p10)
|
||||||
|
p50 = max(0.0, recent_p50)
|
||||||
|
p90 = max(p50 * 1.25, recent_p90)
|
||||||
|
confidence = 0.25
|
||||||
|
sample_count = 0
|
||||||
|
weekday_bucket_count = 0
|
||||||
|
else:
|
||||||
|
p10 = self._blend(float(profile_values["p10"]), recent_p10, blend)
|
||||||
|
p50 = self._blend(float(profile_values["p50"]), recent_p50, blend)
|
||||||
|
p90 = self._blend(float(profile_values["p90"]), recent_p90, blend)
|
||||||
|
p10 = max(0.0, min(p10, p50))
|
||||||
|
p90 = max(p90, p50 * 1.15)
|
||||||
|
sample_count = int(profile_values["sample_count"])
|
||||||
|
weekday_bucket_count = int(profile_values["weekday_bucket_count"])
|
||||||
|
confidence = min(0.65, 0.35 + sample_count / 750.0)
|
||||||
|
|
||||||
|
return PowerForecastPoint(
|
||||||
|
target_at=target_at,
|
||||||
|
horizon_minutes=max(
|
||||||
|
0, round((target_at - issued_at).total_seconds() / 60)
|
||||||
|
),
|
||||||
|
expected_power_w=p50,
|
||||||
|
p10_power_w=p10,
|
||||||
|
p50_power_w=p50,
|
||||||
|
p90_power_w=p90,
|
||||||
|
confidence=confidence,
|
||||||
|
source="time_of_day_load_profile",
|
||||||
|
model_version=self.model_version,
|
||||||
|
metadata={
|
||||||
|
"profile_key": profile_key,
|
||||||
|
"profile_sample_count": sample_count,
|
||||||
|
"weekday_bucket_count": weekday_bucket_count,
|
||||||
|
"recent_blend": blend,
|
||||||
|
"lookback_days": self.config.load_profile_days,
|
||||||
|
"bucket_minutes": self.config.load_profile_bucket_minutes,
|
||||||
|
},
|
||||||
|
)
|
||||||
|
|
||||||
|
def _profile_key(self, target_at: datetime) -> int:
|
||||||
|
local = target_at.astimezone(self._local_timezone())
|
||||||
|
minute_of_day = local.hour * 60 + local.minute
|
||||||
|
return (
|
||||||
|
minute_of_day // self.config.load_profile_bucket_minutes
|
||||||
|
) * self.config.load_profile_bucket_minutes
|
||||||
|
|
||||||
|
def _local_timezone(self) -> ZoneInfo:
|
||||||
|
return ZoneInfo(self._local_timezone_name())
|
||||||
|
|
||||||
|
def _local_timezone_name(self) -> str:
|
||||||
|
try:
|
||||||
|
ZoneInfo(self.config.local_timezone)
|
||||||
|
except ZoneInfoNotFoundError:
|
||||||
|
return "UTC"
|
||||||
|
return self.config.local_timezone
|
||||||
|
|
||||||
|
def _blend(self, profile_value: float, recent_value: float, blend: float) -> float:
|
||||||
|
return profile_value * (1.0 - blend) + recent_value * blend
|
||||||
|
|
||||||
|
def _number(self, value: object, fallback: float) -> float:
|
||||||
|
if value is None:
|
||||||
|
return float(fallback)
|
||||||
|
return float(value)
|
||||||
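
# Worked example of the profile/recent blend used above, with made-up numbers:
# a load_recent_blend of 0.3 keeps 70% of the time-of-day profile and 30% of
# the recently observed load.
def blend(profile_value: float, recent_value: float, blend_weight: float) -> float:
    return profile_value * (1.0 - blend_weight) + recent_value * blend_weight

profile_p50 = 620.0  # typical load for this minute-of-day bucket
recent_p50 = 1100.0  # the house is currently busier than usual
print(blend(profile_p50, recent_p50, 0.3))  # 764.0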
@@ -0,0 +1,142 @@
from __future__ import annotations

from datetime import datetime, timedelta, timezone
from zoneinfo import ZoneInfo, ZoneInfoNotFoundError

from gibil.classes.models import ForecastKind, PowerForecastPoint, PowerForecastRun
from gibil.classes.oracle.config import EnergyForecastConfig
from gibil.classes.sigen.store import SigenStore


class HistoricalUsageOracle:
    """Forecasts load from time-of-week history blended with recent load."""

    model_version = "historical_usage_profile_v1"

    def __init__(
        self,
        sigen_store: SigenStore,
        config: EnergyForecastConfig,
    ) -> None:
        self.sigen_store = sigen_store
        self.config = config

    def forecast(
        self,
        target_times: list[datetime],
        issued_at: datetime | None = None,
    ) -> PowerForecastRun:
        if issued_at is None:
            issued_at = datetime.now(timezone.utc)

        recent_summary = self.sigen_store.load_recent_power_summary(
            lookback=timedelta(minutes=self.config.load_lookback_minutes)
        )
        profile = self.sigen_store.load_load_profile(
            lookback=timedelta(days=self.config.load_profile_days),
            bucket_minutes=self.config.load_profile_bucket_minutes,
            min_samples=self.config.load_profile_min_samples,
            timezone_name=self._local_timezone_name(),
        )
        latest = self.sigen_store.load_latest_snapshot()
        fallback_load_w = latest.load_power_w if latest else 0.0
        recent_p50 = self._number(recent_summary.get("load_p50_w"), fallback_load_w)
        recent_p10 = self._number(recent_summary.get("load_p10_w"), recent_p50 * 0.7)
        recent_p90 = self._number(recent_summary.get("load_p90_w"), recent_p50 * 1.5)
        blend = min(max(self.config.load_recent_blend, 0.0), 1.0)

        points = [
            self._forecast_point(
                target_at=target_at,
                issued_at=issued_at,
                profile=profile,
                recent_p10=recent_p10,
                recent_p50=recent_p50,
                recent_p90=recent_p90,
                blend=blend,
            )
            for target_at in target_times
        ]

        return PowerForecastRun(
            issued_at=issued_at,
            kind=ForecastKind.LOAD,
            source="historical_usage_oracle",
            model_version=self.model_version,
            points=points,
        )

    def _forecast_point(
        self,
        target_at: datetime,
        issued_at: datetime,
        profile: dict[tuple[int, int], dict[str, float | int]],
        recent_p10: float,
        recent_p50: float,
        recent_p90: float,
        blend: float,
    ) -> PowerForecastPoint:
        profile_key = self._profile_key(target_at)
        profile_values = profile.get(profile_key)

        if profile_values is None:
            p10 = max(0.0, recent_p10)
            p50 = max(0.0, recent_p50)
            p90 = max(p50 * 1.25, recent_p90)
            confidence = 0.25
            sample_count = 0
        else:
            p10 = self._blend(float(profile_values["p10"]), recent_p10, blend)
            p50 = self._blend(float(profile_values["p50"]), recent_p50, blend)
            p90 = self._blend(float(profile_values["p90"]), recent_p90, blend)
            p10 = max(0.0, min(p10, p50))
            p90 = max(p90, p50 * 1.15)
            confidence = min(0.65, 0.35 + float(profile_values["sample_count"]) / 500.0)
            sample_count = int(profile_values["sample_count"])

        return PowerForecastPoint(
            target_at=target_at,
            horizon_minutes=max(
                0, round((target_at - issued_at).total_seconds() / 60)
            ),
            expected_power_w=p50,
            p10_power_w=p10,
            p50_power_w=p50,
            p90_power_w=p90,
            confidence=confidence,
            source="time_of_week_load_profile",
            model_version=self.model_version,
            metadata={
                "profile_key": profile_key,
                "profile_sample_count": sample_count,
                "recent_blend": blend,
                "lookback_days": self.config.load_profile_days,
                "bucket_minutes": self.config.load_profile_bucket_minutes,
            },
        )

    def _profile_key(self, target_at: datetime) -> tuple[int, int]:
        local = target_at.astimezone(self._local_timezone())
        minute_of_day = local.hour * 60 + local.minute
        bucket = (
            minute_of_day // self.config.load_profile_bucket_minutes
        ) * self.config.load_profile_bucket_minutes
        return local.isoweekday(), bucket

    def _local_timezone(self) -> ZoneInfo:
        return ZoneInfo(self._local_timezone_name())

    def _local_timezone_name(self) -> str:
        try:
            ZoneInfo(self.config.local_timezone)
        except ZoneInfoNotFoundError:
            return "UTC"
        return self.config.local_timezone

    def _blend(self, profile_value: float, recent_value: float, blend: float) -> float:
        return profile_value * (1.0 - blend) + recent_value * blend

    def _number(self, value: object, fallback: float) -> float:
        if value is None:
            return float(fallback)
        return float(value)
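Note: a minimal sketch of how the time-of-week bucketing and blending above behave; the timestamp, bucket width, and wattages are hypothetical, not taken from the repository.

    from datetime import datetime
    from zoneinfo import ZoneInfo

    # 15-minute buckets: Tuesday 18:07 local time -> key (2, 1080).
    local = datetime(2024, 3, 5, 18, 7, tzinfo=ZoneInfo("Europe/Stockholm"))
    minute_of_day = local.hour * 60 + local.minute   # 1087
    bucket = (minute_of_day // 15) * 15              # 1080
    profile_key = (local.isoweekday(), bucket)       # (2, 1080)

    # Blending a profile p50 of 900 W with a recent p50 of 1200 W at blend=0.3
    # weights the profile by 0.7 and the recent value by 0.3 -> 990 W.
    p50 = 900.0 * (1.0 - 0.3) + 1200.0 * 0.3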
@@ -0,0 +1,35 @@
from __future__ import annotations

from dataclasses import dataclass

from gibil.classes.predictors.usage_sequence_dataset import (
    UsageSequenceDatasetBuilder,
    UsageSequenceScaleConfig,
)


@dataclass(frozen=True)
class UsageHybridModelShape:
    """Describes the fixed-plus-token sequence model input contract."""

    past_scales: tuple[UsageSequenceScaleConfig, ...]
    past_fixed_features: tuple[str, ...]
    future_fixed_features: tuple[str, ...]
    future_steps: int
    quantiles: tuple[float, ...] = (0.10, 0.50, 0.90)

    @classmethod
    def from_dataset_builder(
        cls,
        builder: UsageSequenceDatasetBuilder,
    ) -> "UsageHybridModelShape":
        return cls(
            past_scales=builder.config.past_scales,
            past_fixed_features=tuple(builder.past_feature_names),
            future_fixed_features=tuple(builder.future_feature_names),
            future_steps=builder.future_steps,
        )

    @property
    def output_width(self) -> int:
        return self.future_steps * len(self.quantiles)
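Note: a quick worked example of the shape contract above, with invented numbers: 24 hours of 15-minute steps give 96 future steps, and with the default three quantiles the model head must emit 96 * 3 = 288 values per example.

    # Hypothetical shape: 96 future steps x 3 quantiles -> 288 outputs.
    shape = UsageHybridModelShape(
        past_scales=(), past_fixed_features=(), future_fixed_features=(), future_steps=96,
    )
    assert shape.output_width == 288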
@@ -0,0 +1,158 @@
from __future__ import annotations

from dataclasses import dataclass


@dataclass(frozen=True)
class UsageHybridTCNConfig:
    past_feature_count: int
    future_feature_count: int
    future_steps: int
    scale_names: tuple[str, ...]
    hidden_channels: int = 64
    branch_layers: int = 4
    dropout: float = 0.10
    quantiles: tuple[float, ...] = (0.10, 0.50, 0.90)


def build_usage_hybrid_tcn(config: UsageHybridTCNConfig):
    try:
        return _build_usage_hybrid_tcn(config)
    except ImportError as error:
        raise RuntimeError(
            "PyTorch is required for TCN training. Install dependencies with "
            "`python3 -m pip install -r requirements.txt`."
        ) from error


def _build_usage_hybrid_tcn(config: UsageHybridTCNConfig):
    import torch
    from torch import nn

    class CausalTrim(nn.Module):
        def __init__(self, trim: int) -> None:
            super().__init__()
            self.trim = trim

        def forward(self, value):
            if self.trim <= 0:
                return value
            return value[:, :, :-self.trim]

    class TemporalBlock(nn.Module):
        def __init__(
            self,
            in_channels: int,
            out_channels: int,
            kernel_size: int,
            dilation: int,
            dropout: float,
        ) -> None:
            super().__init__()
            padding = (kernel_size - 1) * dilation
            self.net = nn.Sequential(
                nn.Conv1d(
                    in_channels,
                    out_channels,
                    kernel_size=kernel_size,
                    dilation=dilation,
                    padding=padding,
                ),
                CausalTrim(padding),
                nn.ReLU(),
                nn.Dropout(dropout),
                nn.Conv1d(
                    out_channels,
                    out_channels,
                    kernel_size=kernel_size,
                    dilation=dilation,
                    padding=padding,
                ),
                CausalTrim(padding),
                nn.ReLU(),
                nn.Dropout(dropout),
            )
            self.residual = (
                nn.Conv1d(in_channels, out_channels, kernel_size=1)
                if in_channels != out_channels
                else nn.Identity()
            )
            self.activation = nn.ReLU()

        def forward(self, value):
            return self.activation(self.net(value) + self.residual(value))

    class TemporalBranch(nn.Module):
        def __init__(self) -> None:
            super().__init__()
            layers = []
            channels = config.past_feature_count
            for layer_index in range(config.branch_layers):
                layers.append(
                    TemporalBlock(
                        in_channels=channels,
                        out_channels=config.hidden_channels,
                        kernel_size=5,
                        dilation=2**layer_index,
                        dropout=config.dropout,
                    )
                )
                channels = config.hidden_channels
            self.net = nn.Sequential(*layers)

        def forward(self, value):
            # Dataset tensors are batch x time x features; Conv1d wants batch x features x time.
            encoded = self.net(value.transpose(1, 2))
            return encoded[:, :, -1]

    class UsageHybridTCN(nn.Module):
        def __init__(self) -> None:
            super().__init__()
            self.branches = nn.ModuleDict(
                {name: TemporalBranch() for name in config.scale_names}
            )
            branch_width = config.hidden_channels * len(config.scale_names)
            self.context = nn.Sequential(
                nn.Linear(branch_width, config.hidden_channels),
                nn.ReLU(),
                nn.Dropout(config.dropout),
            )
            self.future_encoder = nn.Sequential(
                nn.Linear(config.future_feature_count, config.hidden_channels),
                nn.ReLU(),
            )
            self.head = nn.Sequential(
                nn.Linear(config.hidden_channels * 2, config.hidden_channels),
                nn.ReLU(),
                nn.Dropout(config.dropout),
                nn.Linear(config.hidden_channels, len(config.quantiles)),
            )

        def forward(self, past_by_scale, future_features):
            branch_outputs = [
                self.branches[name](past_by_scale[name])
                for name in config.scale_names
            ]
            context = self.context(torch.cat(branch_outputs, dim=1))
            future = self.future_encoder(future_features)
            repeated_context = context.unsqueeze(1).expand(-1, future.size(1), -1)
            return self.head(torch.cat([repeated_context, future], dim=2))

    return UsageHybridTCN()


def pinball_loss(prediction, target, quantiles: tuple[float, ...]):
    try:
        import torch
    except ImportError as error:
        raise RuntimeError(
            "PyTorch is required for TCN training. Install dependencies with "
            "`python3 -m pip install -r requirements.txt`."
        ) from error

    target = target.unsqueeze(-1)
    losses = []
    for index, quantile in enumerate(quantiles):
        error = target - prediction[:, :, index : index + 1]
        losses.append(torch.maximum(quantile * error, (quantile - 1) * error))
    return torch.stack(losses, dim=-1).mean()
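Note: a minimal smoke-test sketch of the builder and loss above, assuming PyTorch is installed; the batch size, history length, and step counts are invented for illustration only.

    import torch

    config = UsageHybridTCNConfig(
        past_feature_count=10, future_feature_count=7, future_steps=96,
        scale_names=("recent", "medium", "daily"),
    )
    model = build_usage_hybrid_tcn(config)

    batch = 4
    past = {
        name: torch.randn(batch, 32, config.past_feature_count)
        for name in config.scale_names
    }
    future = torch.randn(batch, config.future_steps, config.future_feature_count)
    prediction = model(past, future)  # shape: (batch, future_steps, len(quantiles))
    target = torch.randn(batch, config.future_steps)
    loss = pinball_loss(prediction, target, config.quantiles)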
@@ -0,0 +1,32 @@
from __future__ import annotations

from datetime import datetime

from gibil.classes.models import PowerForecastRun
from gibil.classes.oracle.config import EnergyForecastConfig
from gibil.classes.predictors.usage_daily import DailyUsageOracle
from gibil.classes.sigen.store import SigenStore


class SequenceUsageOracle:
    """Forecasts load from recent sequence state when a trained model exists."""

    model_version = "sequence_usage_tcn_v1"

    def __init__(
        self,
        sigen_store: SigenStore,
        config: EnergyForecastConfig,
    ) -> None:
        self.sigen_store = sigen_store
        self.config = config
        self.fallback = DailyUsageOracle(sigen_store=sigen_store, config=config)

    def forecast(
        self,
        target_times: list[datetime],
        issued_at: datetime | None = None,
    ) -> PowerForecastRun:
        # The sequence model scaffold is present, but production should remain
        # deterministic until we have a trained artifact and evaluation history.
        return self.fallback.forecast(target_times=target_times, issued_at=issued_at)
@@ -0,0 +1,405 @@
from __future__ import annotations

from bisect import bisect_right
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone
from math import cos, pi, sin
from os import environ
from typing import Iterator

from gibil.classes.env_loader import EnvLoader


@dataclass(frozen=True)
class UsageSequenceScaleConfig:
    name: str
    hours: int
    step_seconds: int


@dataclass(frozen=True)
class UsageFeatureToken:
    name: str
    value: float


@dataclass(frozen=True)
class UsageSequenceDatasetConfig:
    lookback_days: int = 30
    future_hours: int = 24
    future_step_minutes: int = 15
    stride_minutes: int = 15
    local_timezone: str = "Europe/Stockholm"
    past_scales: tuple[UsageSequenceScaleConfig, ...] = (
        UsageSequenceScaleConfig(name="recent", hours=2, step_seconds=10),
        UsageSequenceScaleConfig(name="medium", hours=6, step_seconds=30),
        UsageSequenceScaleConfig(name="daily", hours=24, step_seconds=120),
    )

    @classmethod
    def from_env(cls) -> "UsageSequenceDatasetConfig":
        EnvLoader().load()
        return cls(
            lookback_days=int(environ.get("ASTRAPE_USAGE_SEQUENCE_LOOKBACK_DAYS", "30")),
            future_hours=int(environ.get("ASTRAPE_USAGE_SEQUENCE_FUTURE_HOURS", "24")),
            future_step_minutes=int(
                environ.get("ASTRAPE_USAGE_SEQUENCE_FUTURE_STEP_MINUTES", "15")
            ),
            stride_minutes=int(environ.get("ASTRAPE_USAGE_SEQUENCE_STRIDE_MINUTES", "15")),
            local_timezone=environ.get(
                "ASTRAPE_LOCAL_TIMEZONE",
                environ.get("TZ", "Europe/Stockholm"),
            ),
        )


@dataclass(frozen=True)
class UsageSequenceExample:
    issued_at: datetime
    past_by_scale: dict[str, list[list[float]]]
    past_tokens_by_scale: dict[str, list[list[UsageFeatureToken]]]
    future_features: list[list[float]]
    future_tokens: list[list[UsageFeatureToken]]
    targets: list[float]


class UsageSequenceDatasetBuilder:
    """Builds load forecasting windows from Sigen history."""

    past_feature_names = [
        "load_power_w",
        "solar_power_w",
        "grid_import_w",
        "grid_export_w",
        "battery_power_w",
        "battery_soc_pct",
        "hour_sin",
        "hour_cos",
        "dow_sin",
        "dow_cos",
    ]
    future_feature_names = [
        "hour_sin",
        "hour_cos",
        "dow_sin",
        "dow_cos",
        "temperature_c",
        "shortwave_radiation_w_m2",
        "cloud_cover_pct",
    ]

    def __init__(self, config: UsageSequenceDatasetConfig) -> None:
        self.config = config

    @classmethod
    def from_env(cls) -> "UsageSequenceDatasetBuilder":
        return cls(UsageSequenceDatasetConfig.from_env())

    def build(self, limit: int | None = None) -> list[UsageSequenceExample]:
        samples_by_scale = {
            scale.name: self._load_samples(step_seconds=scale.step_seconds)
            for scale in self.config.past_scales
        }
        target_samples = self._load_samples(
            step_seconds=self.config.future_step_minutes * 60
        )
        weather_by_target = self._load_weather_forecasts()
        if not target_samples or any(not samples for samples in samples_by_scale.values()):
            return []

        by_scale = {
            name: {sample["bucket"]: sample for sample in samples}
            for name, samples in samples_by_scale.items()
        }
        target_by_time = {
            sample["bucket"]: sample
            for sample in target_samples
        }
        first_available = max(samples[0]["bucket"] for samples in samples_by_scale.values())
        last_available = min(
            [samples[-1]["bucket"] for samples in samples_by_scale.values()]
            + [target_samples[-1]["bucket"]]
        )
        start_at = first_available + timedelta(hours=self.max_past_hours)
        end_at = last_available - timedelta(hours=self.config.future_hours)
        issued_at = self._ceil_time(start_at, self.config.stride_minutes)
        examples: list[UsageSequenceExample] = []

        while issued_at <= end_at:
            example = self._build_example(
                issued_at,
                by_scale,
                target_by_time,
                weather_by_target,
            )
            if example is not None:
                examples.append(example)
                if limit is not None and len(examples) >= limit:
                    break
            issued_at += timedelta(minutes=self.config.stride_minutes)

        return examples

    def iter_examples(self) -> Iterator[UsageSequenceExample]:
        for example in self.build():
            yield example

    def _build_example(
        self,
        issued_at: datetime,
        by_scale: dict[str, dict[datetime, dict[str, object]]],
        target_by_time: dict[datetime, dict[str, object]],
        weather_by_target: dict[datetime, list[dict[str, object]]],
    ) -> UsageSequenceExample | None:
        future_times = [
            issued_at + timedelta(minutes=self.config.future_step_minutes * offset)
            for offset in range(1, self.future_steps + 1)
        ]

        past_by_scale: dict[str, list[list[float]]] = {}
        past_tokens_by_scale: dict[str, list[list[UsageFeatureToken]]] = {}
        for scale in self.config.past_scales:
            past_times = [
                issued_at - timedelta(seconds=scale.step_seconds * offset)
                for offset in range(self.past_steps(scale), 0, -1)
            ]
            past_rows = [
                by_scale[scale.name].get(target_at)
                for target_at in past_times
            ]
            if any(row is None or row["load_power_w"] is None for row in past_rows):
                return None
            past_by_scale[scale.name] = [
                self._past_features(row) for row in past_rows if row is not None
            ]
            past_tokens_by_scale[scale.name] = [
                self._past_tokens(row) for row in past_rows if row is not None
            ]

        future_rows = [target_by_time.get(target_at) for target_at in future_times]
        if any(row is None or row["load_power_w"] is None for row in future_rows):
            return None

        return UsageSequenceExample(
            issued_at=issued_at,
            past_by_scale=past_by_scale,
            past_tokens_by_scale=past_tokens_by_scale,
            future_features=[
                self._future_features(target_at, issued_at, weather_by_target)
                for target_at in future_times
            ],
            future_tokens=[
                self._future_tokens(target_at=target_at, issued_at=issued_at)
                for target_at in future_times
            ],
            targets=[
                float(row["load_power_w"])
                for row in future_rows
                if row is not None
            ],
        )

    @property
    def max_past_hours(self) -> int:
        return max(scale.hours for scale in self.config.past_scales)

    def past_steps(self, scale: UsageSequenceScaleConfig) -> int:
        return scale.hours * 60 * 60 // scale.step_seconds

    @property
    def future_steps(self) -> int:
        return self.config.future_hours * 60 // self.config.future_step_minutes

    def _past_features(self, row: dict[str, object]) -> list[float]:
        time_features = self._time_features(row["bucket"])
        return [
            self._number(row["load_power_w"]),
            self._number(row["solar_power_w"]),
            self._number(row["grid_import_w"]),
            self._number(row["grid_export_w"]),
            self._number(row["battery_power_w"]),
            self._number(row["battery_soc_pct"]),
            *time_features,
        ]

    def _past_tokens(self, row: dict[str, object]) -> list[UsageFeatureToken]:
        return []

    def _time_features(self, value: object) -> list[float]:
        timestamp = value
        if not isinstance(timestamp, datetime):
            raise TypeError("timestamp must be a datetime")

        local = timestamp.astimezone(timezone.utc)
        minutes = local.hour * 60 + local.minute
        minute_angle = 2 * pi * minutes / 1440
        dow_angle = 2 * pi * (local.isoweekday() - 1) / 7
        return [
            sin(minute_angle),
            cos(minute_angle),
            sin(dow_angle),
            cos(dow_angle),
        ]

    def _future_features(
        self,
        target_at: datetime,
        issued_at: datetime,
        weather_by_target: dict[datetime, list[dict[str, object]]],
    ) -> list[float]:
        weather = self._weather_for_target(
            target_at=target_at,
            issued_at=issued_at,
            weather_by_target=weather_by_target,
        )
        return [
            *self._time_features(target_at),
            self._number(weather.get("temperature_c")),
            self._number(weather.get("shortwave_radiation_w_m2")),
            self._number(weather.get("cloud_cover_pct")),
        ]

    def _future_tokens(
        self,
        target_at: datetime,
        issued_at: datetime,
    ) -> list[UsageFeatureToken]:
        return []

    def _weather_for_target(
        self,
        target_at: datetime,
        issued_at: datetime,
        weather_by_target: dict[datetime, list[dict[str, object]]],
    ) -> dict[str, object]:
        forecast_target_at = self._floor_time(target_at, step_minutes=60)
        rows = weather_by_target.get(forecast_target_at, [])
        if not rows:
            return {}

        issued_values = [row["issued_at"] for row in rows]
        index = bisect_right(issued_values, issued_at) - 1
        if index < 0:
            return {}
        return rows[index]

    def _load_samples(self, step_seconds: int) -> list[dict[str, object]]:
        EnvLoader().load()
        database_url = environ.get("ASTRAPE_DATABASE_URL")
        if not database_url:
            raise RuntimeError("ASTRAPE_DATABASE_URL is required")

        start_at = datetime.now(timezone.utc) - timedelta(days=self.config.lookback_days)
        bucket = self._bucket_interval(step_seconds)
        try:
            import psycopg
        except ImportError as error:
            raise RuntimeError(
                "Install dependencies with `python3 -m pip install -r requirements.txt`"
            ) from error

        with psycopg.connect(database_url) as connection:
            with connection.cursor() as cursor:
                cursor.execute(
                    f"""
                    SELECT
                        time_bucket('{bucket}', observed_at) AS bucket,
                        avg(load_power_w) AS load_power_w,
                        avg(solar_power_w) AS solar_power_w,
                        avg(grid_import_w) AS grid_import_w,
                        avg(grid_export_w) AS grid_export_w,
                        avg(battery_power_w) AS battery_power_w,
                        avg(battery_soc_pct) AS battery_soc_pct
                    FROM sigen_plant_snapshots
                    WHERE observed_at >= %s
                      AND observed_at <= now()
                    GROUP BY bucket
                    ORDER BY bucket
                    """,
                    (start_at,),
                )
                rows = cursor.fetchall()

        return [
            {
                "bucket": row[0],
                "load_power_w": row[1],
                "solar_power_w": row[2],
                "grid_import_w": row[3],
                "grid_export_w": row[4],
                "battery_power_w": row[5],
                "battery_soc_pct": row[6],
            }
            for row in rows
        ]

    def _load_weather_forecasts(self) -> dict[datetime, list[dict[str, object]]]:
        EnvLoader().load()
        database_url = environ.get("ASTRAPE_DATABASE_URL")
        if not database_url:
            raise RuntimeError("ASTRAPE_DATABASE_URL is required")

        start_at = datetime.now(timezone.utc) - timedelta(days=self.config.lookback_days)
        end_at = datetime.now(timezone.utc) + timedelta(hours=self.config.future_hours)
        try:
            import psycopg
        except ImportError as error:
            raise RuntimeError(
                "Install dependencies with `python3 -m pip install -r requirements.txt`"
            ) from error

        with psycopg.connect(database_url) as connection:
            with connection.cursor() as cursor:
                cursor.execute(
                    """
                    SELECT
                        issued_at,
                        target_at,
                        temperature_c,
                        shortwave_radiation_w_m2,
                        cloud_cover_pct
                    FROM weather_forecast_points
                    WHERE target_at >= %s
                      AND target_at <= %s
                    ORDER BY target_at, issued_at
                    """,
                    (start_at, end_at),
                )
                rows = cursor.fetchall()

        by_target: dict[datetime, list[dict[str, object]]] = {}
        for row in rows:
            by_target.setdefault(row[1], []).append(
                {
                    "issued_at": row[0],
                    "target_at": row[1],
                    "temperature_c": row[2],
                    "shortwave_radiation_w_m2": row[3],
                    "cloud_cover_pct": row[4],
                }
            )
        return by_target

    def _bucket_interval(self, step_seconds: int) -> str:
        if step_seconds % 60 == 0:
            return f"{step_seconds // 60} minutes"
        return f"{step_seconds} seconds"

    def _ceil_time(self, value: datetime, step_minutes: int) -> datetime:
        step_seconds = step_minutes * 60
        timestamp = value.timestamp()
        remainder = timestamp % step_seconds
        if remainder:
            timestamp += step_seconds - remainder
        return datetime.fromtimestamp(timestamp, timezone.utc)

    def _floor_time(self, value: datetime, step_minutes: int) -> datetime:
        step_seconds = step_minutes * 60
        timestamp = value.timestamp()
        timestamp -= timestamp % step_seconds
        return datetime.fromtimestamp(timestamp, timezone.utc)

    def _number(self, value: object) -> float:
        if value is None:
            return 0.0
        return float(value)
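Note: a minimal sketch of driving the builder above from the environment; it assumes ASTRAPE_DATABASE_URL points at a TimescaleDB with the snapshot and weather tables, and caps the number of windows only for a quick look.

    builder = UsageSequenceDatasetBuilder.from_env()
    examples = builder.build(limit=8)
    for example in examples:
        # One window per issued_at: multi-scale history in, 24 h of 15-min targets out.
        print(example.issued_at, len(example.future_features), len(example.targets))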
@@ -0,0 +1,11 @@
from gibil.classes.sigen.builder import SigenBuilder, SigenPlantClient
from gibil.classes.sigen.modbus import SigenModbusClient
from gibil.classes.sigen.store import SigenStore, SigenStoreConfig

__all__ = [
    "SigenBuilder",
    "SigenModbusClient",
    "SigenPlantClient",
    "SigenStore",
    "SigenStoreConfig",
]
@@ -0,0 +1,175 @@
from __future__ import annotations

from datetime import datetime, timezone
from os import environ
from typing import Any

from gibil.classes.models import SigenPlantSnapshot
from gibil.classes.sigen.modbus import SigenModbusClient
from gibil.classes.sigen.registers import PLANT_REGISTERS, SigenRegister


CORE_PLANT_REGISTER_NAMES = (
    "plant_system_time",
    "plant_ems_work_mode",
    "plant_grid_sensor_status",
    "plant_grid_sensor_active_power",
    "plant_ess_soc",
    "plant_active_power",
    "plant_sigen_photovoltaic_power",
    "plant_ess_power",
    "plant_running_state",
    "plant_ess_soh",
    "plant_accumulated_pv_energy",
    "plant_daily_consumed_energy",
    "plant_accumulated_consumed_energy",
    "plant_total_load_power",
)


class SigenPlantClient:
    """Fetches plant-level Sigenergy metrics over Modbus TCP."""

    def __init__(self, modbus_client: SigenModbusClient) -> None:
        self.modbus_client = modbus_client

    @classmethod
    def from_env(cls) -> "SigenPlantClient":
        host = environ.get("SIGEN_MODBUS_HOST")
        if not host:
            raise RuntimeError("SIGEN_MODBUS_HOST is required for Sigen Modbus reads")

        return cls(
            SigenModbusClient(
                host=host,
                port=int(environ.get("SIGEN_MODBUS_PORT", "502")),
                unit_id=int(environ.get("SIGEN_MODBUS_UNIT_ID", "247")),
                timeout=float(environ.get("SIGEN_MODBUS_TIMEOUT", "20")),
                retries=int(environ.get("SIGEN_MODBUS_RETRIES", "3")),
            )
        )

    def fetch_snapshot(
        self,
        register_names: tuple[str, ...] = CORE_PLANT_REGISTER_NAMES,
    ) -> SigenPlantSnapshot:
        with self.modbus_client as client:
            values = self._read_values(client, register_names)

        return SigenBuilder().build_snapshot(values)

    def _read_values(
        self,
        client: SigenModbusClient,
        register_names: tuple[str, ...],
    ) -> dict[str, int | float | str | bool | None]:
        values: dict[str, int | float | str | bool | None] = {}
        for name in register_names:
            register = PLANT_REGISTERS[name]
            try:
                values[name] = self._read_value(client, register)
            except Exception as exc:
                values[name] = None
                values[f"{name}_error"] = str(exc)
        return values

    def _read_value(
        self,
        client: SigenModbusClient,
        register: SigenRegister,
    ) -> int | float | str | bool | None:
        result = client.read(register.kind, register.address, register.count)
        return register.decode(result.values)


class SigenBuilder:
    """Builds database-ready Sigenergy plant snapshots from decoded registers."""

    max_plant_clock_drift_seconds = 300

    def build_snapshot(
        self,
        values: dict[str, Any],
        received_at: datetime | None = None,
    ) -> SigenPlantSnapshot:
        if received_at is None:
            received_at = datetime.now(timezone.utc)

        plant_epoch_seconds = self._int_or_none(values.get("plant_system_time"))
        observed_at = self._observed_at(plant_epoch_seconds, received_at)

        grid_power_w = self._kw_to_w(values.get("plant_grid_sensor_active_power"))

        return SigenPlantSnapshot(
            observed_at=observed_at,
            received_at=received_at,
            plant_epoch_seconds=plant_epoch_seconds,
            plant_ems_work_mode=self._int_or_none(values.get("plant_ems_work_mode")),
            plant_running_state=self._int_or_none(values.get("plant_running_state")),
            grid_sensor_status=self._int_or_none(
                values.get("plant_grid_sensor_status")
            ),
            solar_power_w=self._kw_to_w(
                values.get("plant_sigen_photovoltaic_power")
            ),
            battery_soc_pct=self._float_or_none(values.get("plant_ess_soc")),
            battery_soh_pct=self._float_or_none(values.get("plant_ess_soh")),
            battery_power_w=self._kw_to_w(values.get("plant_ess_power")),
            grid_power_w=grid_power_w,
            grid_import_w=max(grid_power_w, 0.0) if grid_power_w is not None else None,
            grid_export_w=abs(min(grid_power_w, 0.0))
            if grid_power_w is not None
            else None,
            load_power_w=self._kw_to_w(values.get("plant_total_load_power")),
            plant_active_power_w=self._kw_to_w(values.get("plant_active_power")),
            accumulated_pv_energy_kwh=self._float_or_none(
                values.get("plant_accumulated_pv_energy")
            ),
            daily_consumed_energy_kwh=self._float_or_none(
                values.get("plant_daily_consumed_energy")
            ),
            accumulated_consumed_energy_kwh=self._float_or_none(
                values.get("plant_accumulated_consumed_energy")
            ),
            raw_values=dict(values),
        )

    def _observed_at(
        self,
        plant_epoch_seconds: int | None,
        fallback: datetime,
    ) -> datetime:
        if plant_epoch_seconds is None:
            return fallback
        try:
            plant_time = datetime.fromtimestamp(plant_epoch_seconds, timezone.utc)
        except (OverflowError, OSError, ValueError):
            return fallback

        drift_seconds = abs((fallback - plant_time).total_seconds())
        if drift_seconds > self.max_plant_clock_drift_seconds:
            return fallback

        return plant_time

    def _kw_to_w(self, value: Any) -> float | None:
        numeric = self._float_or_none(value)
        if numeric is None:
            return None
        return numeric * 1000

    def _float_or_none(self, value: Any) -> float | None:
        if value is None:
            return None
        if isinstance(value, bool):
            return float(value)
        try:
            return float(value)
        except (TypeError, ValueError):
            return None

    def _int_or_none(self, value: Any) -> int | None:
        numeric = self._float_or_none(value)
        if numeric is None:
            return None
        return int(numeric)
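Note: a small sketch of the builder above turning decoded register values into a snapshot; the numbers are invented and only a few keys are supplied, since missing keys simply become None.

    snapshot = SigenBuilder().build_snapshot(
        {
            "plant_grid_sensor_active_power": -1.2,  # kW, negative means exporting
            "plant_total_load_power": 0.8,           # kW
            "plant_ess_soc": 87.3,                   # %
        }
    )
    print(snapshot.load_power_w, snapshot.grid_export_w)  # 800.0 1200.0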
@@ -0,0 +1,182 @@
from __future__ import annotations

from dataclasses import dataclass
from inspect import signature
import sys
from typing import Literal

try:
    from pymodbus.client import ModbusTcpClient
    from pymodbus.exceptions import ModbusException
except ImportError:  # pragma: no cover - exercised only before dependency install
    ModbusTcpClient = None  # type: ignore[assignment]

    class ModbusException(Exception):
        pass


RegisterKind = Literal["holding", "input", "coil", "discrete"]


@dataclass(frozen=True)
class ModbusReadResult:
    kind: RegisterKind
    address: int
    count: int
    values: list[int] | list[bool]


@dataclass(frozen=True)
class ModbusReadError:
    kind: RegisterKind
    address: int
    count: int
    error: str


class SigenModbusClient:
    """Small Modbus TCP client for exploring a Sigenergy plant or inverter."""

    def __init__(
        self,
        host: str,
        port: int = 502,
        unit_id: int = 1,
        timeout: float = 5.0,
        retries: int = 3,
        trace: bool = False,
    ) -> None:
        if ModbusTcpClient is None:
            raise RuntimeError(
                "pymodbus is not installed. Install dependencies with "
                "`python3 -m pip install -r requirements.txt`."
            )

        self.host = host
        self.port = port
        self.unit_id = unit_id
        self.timeout = timeout
        self.retries = retries
        self.trace = trace
        self._client = ModbusTcpClient(
            host=host,
            port=port,
            timeout=timeout,
            retries=retries,
            trace_packet=self._trace_packet if trace else None,
        )
        self._unit_keyword = self._detect_unit_keyword()

    def __enter__(self) -> SigenModbusClient:
        self.connect()
        return self

    def __exit__(self, *args: object) -> None:
        self.close()

    def connect(self) -> None:
        if not self._client.connect():
            raise ConnectionError(
                f"Could not connect to Modbus TCP target {self.host}:{self.port}"
            )

    def close(self) -> None:
        self._client.close()

    def read(
        self,
        kind: RegisterKind,
        address: int,
        count: int = 1,
    ) -> ModbusReadResult:
        if count < 1:
            raise ValueError("count must be at least 1")
        if address < 0:
            raise ValueError("address must be zero or greater")

        response = self._read_raw(kind, address, count)
        if response.isError():
            raise ModbusException(str(response))

        values = getattr(response, "registers", None)
        if values is None:
            values = getattr(response, "bits", [])
        values = list(values[:count])

        return ModbusReadResult(
            kind=kind,
            address=address,
            count=count,
            values=list(values),
        )

    def scan(
        self,
        kind: RegisterKind,
        start: int,
        count: int,
        chunk_size: int = 10,
    ) -> list[ModbusReadResult | ModbusReadError]:
        if count < 1:
            raise ValueError("count must be at least 1")
        if chunk_size < 1:
            raise ValueError("chunk_size must be at least 1")

        results: list[ModbusReadResult | ModbusReadError] = []
        stop = start + count
        address = start
        while address < stop:
            current_count = min(chunk_size, stop - address)
            try:
                results.append(self.read(kind, address, current_count))
            except Exception as exc:
                results.append(
                    ModbusReadError(
                        kind=kind,
                        address=address,
                        count=current_count,
                        error=str(exc),
                    )
                )
            address += current_count

        return results

    def _read_raw(self, kind: RegisterKind, address: int, count: int):
        if kind == "holding":
            return self._call_read(self._client.read_holding_registers, address, count)
        if kind == "input":
            return self._call_read(self._client.read_input_registers, address, count)
        if kind == "coil":
            return self._call_read(self._client.read_coils, address, count)
        if kind == "discrete":
            return self._call_read(self._client.read_discrete_inputs, address, count)

        raise ValueError(f"Unsupported register kind: {kind}")

    def _call_read(self, method, address: int, count: int):
        kwargs = {
            "address": address,
            "count": count,
            self._unit_keyword: self.unit_id,
        }
        try:
            return method(**kwargs)
        except TypeError as exc:
            if self._unit_keyword not in str(exc):
                raise

            kwargs.pop(self._unit_keyword)
            return method(address, self.unit_id, **kwargs)

    def _detect_unit_keyword(self) -> str:
        read_signature = signature(self._client.read_holding_registers)
        for keyword in ("device_id", "slave", "unit"):
            if keyword in read_signature.parameters:
                return keyword
        return "slave"

    def _trace_packet(self, sending: bool, packet: bytes) -> bytes:
        direction = "TX" if sending else "RX"
        print(f"{direction} {packet.hex(' ')}", file=sys.stderr)
        return packet
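Note: a minimal read sketch for the client above; the host address is a placeholder, and the register used (input 30014, plant ESS SOC) comes from the register map added later in this change.

    client = SigenModbusClient(host="192.168.1.50", port=502, unit_id=247)
    with client:
        result = client.read("input", 30014, 1)
        print(result.values)  # raw 16-bit register words, e.g. [873] for 87.3 %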
@@ -0,0 +1,530 @@
|
|||||||
|
from __future__ import annotations
|
||||||
|
|
||||||
|
from dataclasses import dataclass
|
||||||
|
from typing import Literal
|
||||||
|
|
||||||
|
from gibil.classes.sigen.modbus import RegisterKind
|
||||||
|
|
||||||
|
|
||||||
|
SigenDataType = Literal["u16", "u32", "u64", "s16", "s32", "string"]
|
||||||
|
|
||||||
|
|
||||||
|
@dataclass(frozen=True)
|
||||||
|
class SigenRegister:
|
||||||
|
name: str
|
||||||
|
kind: RegisterKind
|
||||||
|
address: int
|
||||||
|
count: int
|
||||||
|
data_type: SigenDataType
|
||||||
|
gain: float = 1
|
||||||
|
unit: str | None = None
|
||||||
|
description: str | None = None
|
||||||
|
|
||||||
|
def decode(self, registers: list[int] | list[bool]) -> int | float | str:
|
||||||
|
numeric_registers = [int(register) for register in registers[: self.count]]
|
||||||
|
if self.data_type == "string":
|
||||||
|
return self._decode_string(numeric_registers)
|
||||||
|
|
||||||
|
raw_value = self._combine(numeric_registers)
|
||||||
|
|
||||||
|
if self.data_type.startswith("s"):
|
||||||
|
bits = 16 * self.count
|
||||||
|
sign_bit = 1 << (bits - 1)
|
||||||
|
if raw_value & sign_bit:
|
||||||
|
raw_value -= 1 << bits
|
||||||
|
|
||||||
|
if self.gain == 1:
|
||||||
|
return raw_value
|
||||||
|
return raw_value / self.gain
|
||||||
|
|
||||||
|
def _combine(self, registers: list[int]) -> int:
|
||||||
|
value = 0
|
||||||
|
for register in registers:
|
||||||
|
value = (value << 16) | (register & 0xFFFF)
|
||||||
|
return value
|
||||||
|
|
||||||
|
def _decode_string(self, registers: list[int]) -> str:
|
||||||
|
raw_bytes = bytearray()
|
||||||
|
for register in registers:
|
||||||
|
raw_bytes.append((register >> 8) & 0xFF)
|
||||||
|
raw_bytes.append(register & 0xFF)
|
||||||
|
return raw_bytes.rstrip(b"\x00").decode("ascii", errors="replace").strip()
|
||||||
|
|
||||||
|
|
||||||
|
PLANT_REGISTERS: dict[str, SigenRegister] = {
|
||||||
|
"plant_system_time": SigenRegister(
|
||||||
|
name="plant_system_time",
|
||||||
|
kind="input",
|
||||||
|
address=30000,
|
||||||
|
count=2,
|
||||||
|
data_type="u32",
|
||||||
|
unit="s",
|
||||||
|
),
|
||||||
|
"plant_ems_work_mode": SigenRegister(
|
||||||
|
name="plant_ems_work_mode",
|
||||||
|
kind="input",
|
||||||
|
address=30003,
|
||||||
|
count=1,
|
||||||
|
data_type="u16",
|
||||||
|
),
|
||||||
|
"plant_grid_sensor_status": SigenRegister(
|
||||||
|
name="plant_grid_sensor_status",
|
||||||
|
kind="input",
|
||||||
|
address=30004,
|
||||||
|
count=1,
|
||||||
|
data_type="u16",
|
||||||
|
),
|
||||||
|
"plant_grid_sensor_active_power": SigenRegister(
|
||||||
|
name="plant_grid_sensor_active_power",
|
||||||
|
kind="input",
|
||||||
|
address=30005,
|
||||||
|
count=2,
|
||||||
|
data_type="s32",
|
||||||
|
gain=1000,
|
||||||
|
unit="kW",
|
||||||
|
),
|
||||||
|
"plant_ess_soc": SigenRegister(
|
||||||
|
name="plant_ess_soc",
|
||||||
|
kind="input",
|
||||||
|
address=30014,
|
||||||
|
count=1,
|
||||||
|
data_type="u16",
|
||||||
|
gain=10,
|
||||||
|
unit="%",
|
||||||
|
),
|
||||||
|
"plant_active_power": SigenRegister(
|
||||||
|
name="plant_active_power",
|
||||||
|
kind="input",
|
||||||
|
address=30031,
|
||||||
|
count=2,
|
||||||
|
data_type="s32",
|
||||||
|
gain=1000,
|
||||||
|
unit="kW",
|
||||||
|
),
|
||||||
|
"plant_sigen_photovoltaic_power": SigenRegister(
|
||||||
|
name="plant_sigen_photovoltaic_power",
|
||||||
|
kind="input",
|
||||||
|
address=30035,
|
||||||
|
count=2,
|
||||||
|
data_type="s32",
|
||||||
|
gain=1000,
|
||||||
|
unit="kW",
|
||||||
|
),
|
||||||
|
"plant_ess_power": SigenRegister(
|
||||||
|
name="plant_ess_power",
|
||||||
|
kind="input",
|
||||||
|
address=30037,
|
||||||
|
count=2,
|
||||||
|
data_type="s32",
|
||||||
|
gain=1000,
|
||||||
|
unit="kW",
|
||||||
|
),
|
||||||
|
"plant_running_state": SigenRegister(
|
||||||
|
name="plant_running_state",
|
||||||
|
kind="input",
|
||||||
|
address=30051,
|
||||||
|
count=1,
|
||||||
|
data_type="u16",
|
||||||
|
),
|
||||||
|
"plant_ess_rated_energy_capacity": SigenRegister(
|
||||||
|
name="plant_ess_rated_energy_capacity",
|
||||||
|
kind="input",
|
||||||
|
address=30083,
|
||||||
|
count=2,
|
||||||
|
data_type="u32",
|
||||||
|
gain=100,
|
||||||
|
unit="kWh",
|
||||||
|
),
|
||||||
|
"plant_ess_soh": SigenRegister(
|
||||||
|
name="plant_ess_soh",
|
||||||
|
kind="input",
|
||||||
|
address=30087,
|
||||||
|
count=1,
|
||||||
|
data_type="u16",
|
||||||
|
gain=10,
|
||||||
|
unit="%",
|
||||||
|
),
|
||||||
|
"plant_accumulated_pv_energy": SigenRegister(
|
||||||
|
name="plant_accumulated_pv_energy",
|
||||||
|
kind="input",
|
||||||
|
address=30088,
|
||||||
|
count=4,
|
||||||
|
data_type="u64",
|
||||||
|
gain=100,
|
||||||
|
unit="kWh",
|
||||||
|
),
|
||||||
|
"plant_daily_consumed_energy": SigenRegister(
|
||||||
|
name="plant_daily_consumed_energy",
|
||||||
|
kind="input",
|
||||||
|
address=30092,
|
||||||
|
count=2,
|
||||||
|
data_type="u32",
|
||||||
|
gain=100,
|
||||||
|
unit="kWh",
|
||||||
|
),
|
||||||
|
"plant_accumulated_consumed_energy": SigenRegister(
|
||||||
|
name="plant_accumulated_consumed_energy",
|
||||||
|
kind="input",
|
||||||
|
address=30094,
|
||||||
|
count=4,
|
||||||
|
data_type="u64",
|
||||||
|
gain=100,
|
||||||
|
unit="kWh",
|
||||||
|
),
|
||||||
|
"plant_general_load_power": SigenRegister(
|
||||||
|
name="plant_general_load_power",
|
||||||
|
kind="input",
|
||||||
|
address=30282,
|
||||||
|
count=2,
|
||||||
|
data_type="s32",
|
||||||
|
gain=1000,
|
||||||
|
unit="kW",
|
||||||
|
description="General load power",
|
||||||
|
),
|
||||||
|
"plant_total_load_power": SigenRegister(
|
||||||
|
name="plant_total_load_power",
|
||||||
|
kind="input",
|
||||||
|
address=30284,
|
||||||
|
count=2,
|
||||||
|
data_type="s32",
|
||||||
|
gain=1000,
|
||||||
|
unit="kW",
|
||||||
|
description="Total load power",
|
||||||
|
),
|
||||||
|
}
|
||||||
|
|
||||||
|
PLANT_PARAMETER_REGISTERS: dict[str, SigenRegister] = {
|
||||||
|
"plant_start_stop": SigenRegister(
|
||||||
|
name="plant_start_stop",
|
||||||
|
kind="holding",
|
||||||
|
address=40000,
|
||||||
|
count=1,
|
||||||
|
data_type="u16",
|
||||||
|
description="Start/Stop (0: Stop, 1: Start)",
|
||||||
|
),
|
||||||
|
"plant_active_power_fixed_target": SigenRegister(
|
||||||
|
name="plant_active_power_fixed_target",
|
||||||
|
kind="holding",
|
||||||
|
address=40001,
|
||||||
|
count=2,
|
||||||
|
data_type="s32",
|
||||||
|
gain=1000,
|
||||||
|
unit="kW",
|
||||||
|
description="Active power fixed adjustment target value",
|
||||||
|
),
|
||||||
|
"plant_reactive_power_fixed_target": SigenRegister(
|
||||||
|
name="plant_reactive_power_fixed_target",
|
||||||
|
kind="holding",
|
||||||
|
address=40003,
|
||||||
|
count=2,
|
||||||
|
data_type="s32",
|
||||||
|
gain=1000,
|
||||||
|
unit="kvar",
|
||||||
|
description="Reactive power fixed adjustment target value",
|
||||||
|
),
|
||||||
|
"plant_active_power_percentage_target": SigenRegister(
|
||||||
|
name="plant_active_power_percentage_target",
|
||||||
|
kind="holding",
|
||||||
|
address=40005,
|
||||||
|
count=1,
|
||||||
|
data_type="s16",
|
||||||
|
gain=100,
|
||||||
|
unit="%",
|
||||||
|
description="Active power percentage target. Range: -100.00 to 100.00",
|
||||||
|
),
|
||||||
|
"plant_qs_ratio_target": SigenRegister(
|
||||||
|
name="plant_qs_ratio_target",
|
||||||
|
kind="holding",
|
||||||
|
address=40006,
|
||||||
|
count=1,
|
||||||
|
data_type="s16",
|
||||||
|
gain=100,
|
||||||
|
unit="%",
|
||||||
|
description="Q/S adjustment target value",
|
||||||
|
),
|
||||||
|
"plant_power_factor_target": SigenRegister(
|
||||||
|
name="plant_power_factor_target",
|
||||||
|
kind="holding",
|
||||||
|
address=40007,
|
||||||
|
count=1,
|
||||||
|
data_type="s16",
|
||||||
|
gain=1000,
|
||||||
|
description="Power factor adjustment target value",
|
||||||
|
),
|
||||||
|
"plant_remote_ems_enable": SigenRegister(
|
||||||
|
name="plant_remote_ems_enable",
|
||||||
|
kind="holding",
|
||||||
|
address=40029,
|
||||||
|
count=1,
|
||||||
|
data_type="u16",
|
||||||
|
description="Remote EMS enable (0: disabled, 1: enabled)",
|
||||||
|
),
|
||||||
|
"plant_independent_phase_power_control_enable": SigenRegister(
|
||||||
|
name="plant_independent_phase_power_control_enable",
|
||||||
|
kind="holding",
|
||||||
|
address=40030,
|
||||||
|
count=1,
|
||||||
|
data_type="u16",
|
||||||
|
description="Independent phase power control enable (0: disabled, 1: enabled)",
|
||||||
|
),
|
||||||
|
"plant_remote_ems_control_mode": SigenRegister(
|
||||||
|
name="plant_remote_ems_control_mode",
|
||||||
|
kind="holding",
|
||||||
|
address=40031,
|
||||||
|
count=1,
|
||||||
|
data_type="u16",
|
||||||
|
description=(
|
||||||
|
"Remote EMS control mode: 0 PCS remote, 1 standby, "
|
||||||
|
"2 self-consumption, 3 charge grid first, 4 charge PV first, "
|
||||||
|
"5 discharge PV first, 6 discharge ESS first"
|
||||||
|
),
|
||||||
|
),
|
||||||
|
"plant_ess_max_charging_limit": SigenRegister(
|
||||||
|
name="plant_ess_max_charging_limit",
|
||||||
|
kind="holding",
|
||||||
|
address=40032,
|
||||||
|
count=2,
|
||||||
|
data_type="u32",
|
||||||
|
gain=1000,
|
||||||
|
unit="kW",
|
||||||
|
description="ESS max charging limit",
|
||||||
|
),
|
||||||
|
"plant_ess_max_discharging_limit": SigenRegister(
|
||||||
|
name="plant_ess_max_discharging_limit",
|
||||||
|
kind="holding",
|
||||||
|
address=40034,
|
||||||
|
count=2,
|
||||||
|
data_type="u32",
|
||||||
|
gain=1000,
|
||||||
|
unit="kW",
|
||||||
|
description="ESS max discharging limit",
|
||||||
|
),
|
||||||
|
"plant_pv_max_power_limit": SigenRegister(
|
||||||
|
name="plant_pv_max_power_limit",
|
||||||
|
kind="holding",
|
||||||
|
address=40036,
|
||||||
|
count=2,
|
||||||
|
data_type="u32",
|
||||||
|
gain=1000,
|
||||||
|
unit="kW",
|
||||||
|
description="PV max power limit",
|
||||||
|
),
|
||||||
|
"plant_grid_point_maximum_export_limitation": SigenRegister(
|
||||||
|
name="plant_grid_point_maximum_export_limitation",
|
||||||
|
kind="holding",
|
||||||
|
address=40038,
|
||||||
|
count=2,
|
||||||
|
data_type="u32",
|
||||||
|
gain=1000,
|
||||||
|
unit="kW",
|
||||||
|
description="Grid point maximum export limitation",
|
||||||
|
),
|
||||||
|
"plant_grid_maximum_import_limitation": SigenRegister(
|
||||||
|
name="plant_grid_maximum_import_limitation",
|
||||||
|
kind="holding",
|
||||||
|
address=40040,
|
||||||
|
count=2,
|
||||||
|
data_type="u32",
|
||||||
|
gain=1000,
|
||||||
|
unit="kW",
|
||||||
|
description="Grid point maximum import limitation",
|
||||||
|
),
|
||||||
|
"plant_pcs_maximum_export_limitation": SigenRegister(
|
||||||
|
name="plant_pcs_maximum_export_limitation",
|
||||||
|
kind="holding",
|
        address=40042,
        count=2,
        data_type="u32",
        gain=1000,
        unit="kW",
        description="PCS maximum export limitation",
    ),
    "plant_pcs_maximum_import_limitation": SigenRegister(
        name="plant_pcs_maximum_import_limitation",
        kind="holding",
        address=40044,
        count=2,
        data_type="u32",
        gain=1000,
        unit="kW",
        description="PCS maximum import limitation",
    ),
    "plant_backup_soc": SigenRegister(
        name="plant_backup_soc",
        kind="holding",
        address=40046,
        count=1,
        data_type="u16",
        gain=10,
        unit="%",
        description="ESS backup SOC. Range: 0 to 100.0",
    ),
    "plant_charge_cut_off_soc": SigenRegister(
        name="plant_charge_cut_off_soc",
        kind="holding",
        address=40047,
        count=1,
        data_type="u16",
        gain=10,
        unit="%",
        description="ESS charge cut-off SOC. Range: 0 to 100.0",
    ),
    "plant_discharge_cut_off_soc": SigenRegister(
        name="plant_discharge_cut_off_soc",
        kind="holding",
        address=40048,
        count=1,
        data_type="u16",
        gain=10,
        unit="%",
        description="ESS discharge cut-off SOC. Range: 0 to 100.0",
    ),
}


INVERTER_REGISTERS: dict[str, SigenRegister] = {
    "inverter_model_type": SigenRegister(
        name="inverter_model_type",
        kind="input",
        address=30500,
        count=15,
        data_type="string",
    ),
    "inverter_serial_number": SigenRegister(
        name="inverter_serial_number",
        kind="input",
        address=30515,
        count=10,
        data_type="string",
    ),
    "inverter_machine_firmware_version": SigenRegister(
        name="inverter_machine_firmware_version",
        kind="input",
        address=30525,
        count=15,
        data_type="string",
    ),
    "inverter_rated_active_power": SigenRegister(
        name="inverter_rated_active_power",
        kind="input",
        address=30540,
        count=2,
        data_type="u32",
        gain=1000,
        unit="kW",
    ),
    "inverter_running_state": SigenRegister(
        name="inverter_running_state",
        kind="input",
        address=30578,
        count=1,
        data_type="u16",
    ),
    "inverter_active_power": SigenRegister(
        name="inverter_active_power",
        kind="input",
        address=30587,
        count=2,
        data_type="s32",
        gain=1000,
        unit="kW",
    ),
    "inverter_reactive_power": SigenRegister(
        name="inverter_reactive_power",
        kind="input",
        address=30589,
        count=2,
        data_type="s32",
        gain=1000,
        unit="kvar",
    ),
    "inverter_ess_charge_discharge_power": SigenRegister(
        name="inverter_ess_charge_discharge_power",
        kind="input",
        address=30599,
        count=2,
        data_type="s32",
        gain=1000,
        unit="kW",
    ),
    "inverter_ess_battery_soc": SigenRegister(
        name="inverter_ess_battery_soc",
        kind="input",
        address=30601,
        count=1,
        data_type="u16",
        gain=10,
        unit="%",
    ),
    "inverter_ess_battery_soh": SigenRegister(
        name="inverter_ess_battery_soh",
        kind="input",
        address=30602,
        count=1,
        data_type="u16",
        gain=10,
        unit="%",
    ),
    "inverter_pv_power": SigenRegister(
        name="inverter_pv_power",
        kind="input",
        address=31035,
        count=2,
        data_type="s32",
        gain=1000,
        unit="kW",
    ),
    "inverter_daily_pv_energy": SigenRegister(
        name="inverter_daily_pv_energy",
        kind="input",
        address=31509,
        count=2,
        data_type="u32",
        gain=100,
        unit="kWh",
    ),
    "inverter_accumulated_pv_energy": SigenRegister(
        name="inverter_accumulated_pv_energy",
        kind="input",
        address=31511,
        count=4,
        data_type="u64",
        gain=100,
        unit="kWh",
    ),
}


DEFAULT_PLANT_REGISTER_NAMES = (
    "plant_system_time",
    "plant_ems_work_mode",
    "plant_grid_sensor_status",
    "plant_grid_sensor_active_power",
    "plant_ess_soc",
    "plant_active_power",
    "plant_sigen_photovoltaic_power",
    "plant_ess_power",
    "plant_running_state",
    "plant_ess_rated_energy_capacity",
    "plant_ess_soh",
    "plant_accumulated_pv_energy",
    "plant_daily_consumed_energy",
    "plant_accumulated_consumed_energy",
    "plant_general_load_power",
    "plant_total_load_power",
)


DEFAULT_INVERTER_REGISTER_NAMES = (
    "inverter_model_type",
    "inverter_serial_number",
    "inverter_machine_firmware_version",
    "inverter_rated_active_power",
    "inverter_running_state",
    "inverter_active_power",
    "inverter_reactive_power",
    "inverter_ess_charge_discharge_power",
    "inverter_ess_battery_soc",
    "inverter_ess_battery_soh",
    "inverter_pv_power",
    "inverter_daily_pv_energy",
    "inverter_accumulated_pv_energy",
)
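Each entry in the register tables above pairs a Modbus address/count with a data type and a gain, so a raw read is converted to engineering units by concatenating the 16-bit words and dividing by the gain. A minimal decoding sketch follows; the decode_registers helper is hypothetical and not part of this commit.

# Hypothetical sketch: decoding raw Modbus register words using a table entry's
# data_type and gain (e.g. inverter_active_power: count=2, s32, gain=1000 -> kW).
def decode_registers(words: list[int], data_type: str, gain: int = 1) -> float | str:
    """Combine big-endian 16-bit register words and apply the table's gain."""
    if data_type == "string":
        raw = b"".join(word.to_bytes(2, "big") for word in words)
        return raw.rstrip(b"\x00").decode("ascii", errors="ignore")
    value = 0
    for word in words:
        value = (value << 16) | (word & 0xFFFF)
    bits = 16 * len(words)
    if data_type.startswith("s") and value >= 1 << (bits - 1):
        value -= 1 << bits  # two's complement for signed types
    return value / gain


print(decode_registers([0x0000, 0x0BB8], "s32", 1000))  # 3.0 (kW)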
@@ -0,0 +1,508 @@
from __future__ import annotations

from contextlib import contextmanager
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone
from os import environ
from typing import Iterator

from gibil.classes.models import SigenPlantSnapshot


class SigenStoreConfigurationError(RuntimeError):
    pass


@dataclass(frozen=True)
class SigenStoreConfig:
    database_url: str

    @classmethod
    def from_env(cls) -> "SigenStoreConfig":
        database_url = environ.get("ASTRAPE_DATABASE_URL")
        if not database_url:
            raise SigenStoreConfigurationError(
                "ASTRAPE_DATABASE_URL is required for Sigen storage"
            )

        return cls(database_url=database_url)


class SigenStore:
    """Persists Sigenergy plant snapshots in TimescaleDB."""

    def __init__(self, config: SigenStoreConfig) -> None:
        self.config = config

    @classmethod
    def from_env(cls) -> "SigenStore":
        return cls(SigenStoreConfig.from_env())

    def initialize(self) -> None:
        with self._connection() as connection:
            with connection.cursor() as cursor:
                cursor.execute("CREATE EXTENSION IF NOT EXISTS timescaledb")
                cursor.execute(
                    """
                    CREATE TABLE IF NOT EXISTS sigen_plant_snapshots (
                        observed_at TIMESTAMPTZ NOT NULL,
                        received_at TIMESTAMPTZ NOT NULL,
                        source TEXT NOT NULL,
                        plant_epoch_seconds BIGINT,
                        plant_ems_work_mode INTEGER,
                        plant_running_state INTEGER,
                        grid_sensor_status INTEGER,
                        solar_power_w DOUBLE PRECISION,
                        battery_soc_pct DOUBLE PRECISION,
                        battery_soh_pct DOUBLE PRECISION,
                        battery_power_w DOUBLE PRECISION,
                        grid_power_w DOUBLE PRECISION,
                        grid_import_w DOUBLE PRECISION,
                        grid_export_w DOUBLE PRECISION,
                        load_power_w DOUBLE PRECISION,
                        plant_active_power_w DOUBLE PRECISION,
                        accumulated_pv_energy_kwh DOUBLE PRECISION,
                        daily_consumed_energy_kwh DOUBLE PRECISION,
                        accumulated_consumed_energy_kwh DOUBLE PRECISION,
                        raw_values JSONB NOT NULL DEFAULT '{}'::jsonb,
                        inserted_at TIMESTAMPTZ NOT NULL DEFAULT now(),
                        PRIMARY KEY (observed_at, source)
                    )
                    """
                )
                cursor.execute(
                    """
                    SELECT create_hypertable(
                        'sigen_plant_snapshots',
                        'observed_at',
                        if_not_exists => TRUE
                    )
                    """
                )
                cursor.execute(
                    """
                    CREATE INDEX IF NOT EXISTS sigen_plant_snapshots_received_at_idx
                        ON sigen_plant_snapshots (received_at DESC)
                    """
                )
                self._create_rollup_view(
                    cursor,
                    view_name="sigen_plant_snapshots_1m",
                    bucket="1 minute",
                )
                self._create_rollup_view(
                    cursor,
                    view_name="sigen_plant_snapshots_15m",
                    bucket="15 minutes",
                )
                self._create_rollup_view(
                    cursor,
                    view_name="sigen_plant_snapshots_1h",
                    bucket="1 hour",
                )
            connection.commit()

    def save_snapshot(self, snapshot: SigenPlantSnapshot) -> int:
        with self._connection() as connection:
            with connection.cursor() as cursor:
                try:
                    from psycopg.types.json import Jsonb
                except ImportError as error:
                    raise SigenStoreConfigurationError(
                        "Install dependencies with `python3 -m pip install -r requirements.txt`"
                    ) from error

                cursor.execute(
                    """
                    INSERT INTO sigen_plant_snapshots (
                        observed_at,
                        received_at,
                        source,
                        plant_epoch_seconds,
                        plant_ems_work_mode,
                        plant_running_state,
                        grid_sensor_status,
                        solar_power_w,
                        battery_soc_pct,
                        battery_soh_pct,
                        battery_power_w,
                        grid_power_w,
                        grid_import_w,
                        grid_export_w,
                        load_power_w,
                        plant_active_power_w,
                        accumulated_pv_energy_kwh,
                        daily_consumed_energy_kwh,
                        accumulated_consumed_energy_kwh,
                        raw_values
                    )
                    VALUES (
                        %s, %s, %s, %s, %s, %s, %s, %s, %s, %s,
                        %s, %s, %s, %s, %s, %s, %s, %s, %s, %s
                    )
                    ON CONFLICT (observed_at, source)
                    DO UPDATE SET
                        received_at = EXCLUDED.received_at,
                        plant_epoch_seconds = EXCLUDED.plant_epoch_seconds,
                        plant_ems_work_mode = EXCLUDED.plant_ems_work_mode,
                        plant_running_state = EXCLUDED.plant_running_state,
                        grid_sensor_status = EXCLUDED.grid_sensor_status,
                        solar_power_w = EXCLUDED.solar_power_w,
                        battery_soc_pct = EXCLUDED.battery_soc_pct,
                        battery_soh_pct = EXCLUDED.battery_soh_pct,
                        battery_power_w = EXCLUDED.battery_power_w,
                        grid_power_w = EXCLUDED.grid_power_w,
                        grid_import_w = EXCLUDED.grid_import_w,
                        grid_export_w = EXCLUDED.grid_export_w,
                        load_power_w = EXCLUDED.load_power_w,
                        plant_active_power_w = EXCLUDED.plant_active_power_w,
                        accumulated_pv_energy_kwh = EXCLUDED.accumulated_pv_energy_kwh,
                        daily_consumed_energy_kwh = EXCLUDED.daily_consumed_energy_kwh,
                        accumulated_consumed_energy_kwh = EXCLUDED.accumulated_consumed_energy_kwh,
                        raw_values = EXCLUDED.raw_values,
                        inserted_at = now()
                    """,
                    (
                        snapshot.observed_at,
                        snapshot.received_at,
                        snapshot.source,
                        snapshot.plant_epoch_seconds,
                        snapshot.plant_ems_work_mode,
                        snapshot.plant_running_state,
                        snapshot.grid_sensor_status,
                        snapshot.solar_power_w,
                        snapshot.battery_soc_pct,
                        snapshot.battery_soh_pct,
                        snapshot.battery_power_w,
                        snapshot.grid_power_w,
                        snapshot.grid_import_w,
                        snapshot.grid_export_w,
                        snapshot.load_power_w,
                        snapshot.plant_active_power_w,
                        snapshot.accumulated_pv_energy_kwh,
                        snapshot.daily_consumed_energy_kwh,
                        snapshot.accumulated_consumed_energy_kwh,
                        Jsonb(snapshot.raw_values),
                    ),
                )
            connection.commit()

        return 1

    def load_latest_snapshot(self) -> SigenPlantSnapshot | None:
        with self._connection() as connection:
            with connection.cursor() as cursor:
                cursor.execute(
                    """
                    SELECT
                        observed_at,
                        received_at,
                        source,
                        plant_epoch_seconds,
                        plant_ems_work_mode,
                        plant_running_state,
                        grid_sensor_status,
                        solar_power_w,
                        battery_soc_pct,
                        battery_soh_pct,
                        battery_power_w,
                        grid_power_w,
                        grid_import_w,
                        grid_export_w,
                        load_power_w,
                        plant_active_power_w,
                        accumulated_pv_energy_kwh,
                        daily_consumed_energy_kwh,
                        accumulated_consumed_energy_kwh,
                        raw_values
                    FROM sigen_plant_snapshots
                    ORDER BY observed_at DESC
                    LIMIT 1
                    """
                )
                row = cursor.fetchone()

        if row is None:
            return None

        return SigenPlantSnapshot(
            observed_at=row[0],
            received_at=row[1],
            source=row[2],
            plant_epoch_seconds=row[3],
            plant_ems_work_mode=row[4],
            plant_running_state=row[5],
            grid_sensor_status=row[6],
            solar_power_w=row[7],
            battery_soc_pct=row[8],
            battery_soh_pct=row[9],
            battery_power_w=row[10],
            grid_power_w=row[11],
            grid_import_w=row[12],
            grid_export_w=row[13],
            load_power_w=row[14],
            plant_active_power_w=row[15],
            accumulated_pv_energy_kwh=row[16],
            daily_consumed_energy_kwh=row[17],
            accumulated_consumed_energy_kwh=row[18],
            raw_values=row[19] or {},
        )

    def load_recent_power_summary(
        self,
        lookback: timedelta = timedelta(minutes=30),
    ) -> dict[str, float | None]:
        start_at = datetime.now(timezone.utc) - lookback
        with self._connection() as connection:
            with connection.cursor() as cursor:
                cursor.execute(
                    """
                    SELECT
                        avg(load_power_w),
                        percentile_cont(0.10) WITHIN GROUP (ORDER BY load_power_w),
                        percentile_cont(0.50) WITHIN GROUP (ORDER BY load_power_w),
                        percentile_cont(0.90) WITHIN GROUP (ORDER BY load_power_w),
                        max(load_power_w),
                        max(solar_power_w)
                    FROM sigen_plant_snapshots
                    WHERE observed_at >= %s
                    """,
                    (start_at,),
                )
                row = cursor.fetchone()

        return {
            "load_avg_w": row[0],
            "load_p10_w": row[1],
            "load_p50_w": row[2],
            "load_p90_w": row[3],
            "load_max_w": row[4],
            "solar_max_w": row[5],
        }

    def load_load_profile(
        self,
        lookback: timedelta = timedelta(days=30),
        bucket_minutes: int = 15,
        min_samples: int = 5,
        timezone_name: str = "UTC",
    ) -> dict[tuple[int, int], dict[str, float | int]]:
        if bucket_minutes <= 0:
            raise ValueError("bucket_minutes must be greater than zero")

        start_at = datetime.now(timezone.utc) - lookback
        with self._connection() as connection:
            with connection.cursor() as cursor:
                cursor.execute(
                    """
                    WITH localized AS (
                        SELECT
                            observed_at AT TIME ZONE %s AS local_observed_at,
                            load_power_w
                        FROM sigen_plant_snapshots
                        WHERE observed_at >= %s
                          AND observed_at <= now()
                          AND load_power_w IS NOT NULL
                    )
                    SELECT
                        EXTRACT(ISODOW FROM local_observed_at)::int AS iso_dow,
                        (
                            EXTRACT(HOUR FROM local_observed_at)::int * 60
                            + FLOOR(EXTRACT(MINUTE FROM local_observed_at)::int / %s)::int * %s
                        ) AS minute_bucket,
                        percentile_cont(0.10) WITHIN GROUP (ORDER BY load_power_w) AS p10,
                        percentile_cont(0.50) WITHIN GROUP (ORDER BY load_power_w) AS p50,
                        percentile_cont(0.90) WITHIN GROUP (ORDER BY load_power_w) AS p90,
                        avg(load_power_w) AS avg_load_power_w,
                        max(load_power_w) AS max_load_power_w,
                        count(*) AS sample_count
                    FROM localized
                    GROUP BY iso_dow, minute_bucket
                    HAVING count(*) >= %s
                    """,
                    (
                        timezone_name,
                        start_at,
                        bucket_minutes,
                        bucket_minutes,
                        min_samples,
                    ),
                )
                rows = cursor.fetchall()

        return {
            (int(row[0]), int(row[1])): {
                "p10": float(row[2]),
                "p50": float(row[3]),
                "p90": float(row[4]),
                "avg_load_power_w": float(row[5]),
                "max_load_power_w": float(row[6]),
                "sample_count": int(row[7]),
            }
            for row in rows
        }

    def load_recent_actual_points(
        self,
        lookback: timedelta = timedelta(hours=24),
        bucket: str = "5 minutes",
    ) -> list[dict[str, object]]:
        start_at = datetime.now(timezone.utc) - lookback
        with self._connection() as connection:
            with connection.cursor() as cursor:
                cursor.execute(
                    f"""
                    SELECT
                        time_bucket('{bucket}', observed_at) AS bucket,
                        avg(solar_power_w) AS solar_power_w,
                        avg(load_power_w) AS load_power_w,
                        avg(solar_power_w - load_power_w) AS net_power_w,
                        avg(grid_import_w) AS grid_import_w,
                        avg(grid_export_w) AS grid_export_w,
                        count(*) AS sample_count
                    FROM sigen_plant_snapshots
                    WHERE observed_at >= %s
                      AND observed_at <= now()
                    GROUP BY bucket
                    ORDER BY bucket
                    LIMIT 10000
                    """,
                    (start_at,),
                )
                rows = cursor.fetchall()

        return [
            {
                "target_at": row[0],
                "solar_power_w": row[1],
                "load_power_w": row[2],
                "net_power_w": row[3],
                "grid_import_w": row[4],
                "grid_export_w": row[5],
                "sample_count": row[6],
            }
            for row in rows
        ]

    def load_recent_solar_peak_w(
        self,
        lookback: timedelta = timedelta(days=14),
    ) -> float | None:
        start_at = datetime.now(timezone.utc) - lookback
        with self._connection() as connection:
            with connection.cursor() as cursor:
                cursor.execute(
                    """
                    SELECT max(solar_power_w)
                    FROM sigen_plant_snapshots
                    WHERE observed_at >= %s
                    """,
                    (start_at,),
                )
                row = cursor.fetchone()

        return row[0] if row else None

    def load_solar_training_samples(
        self,
        lookback: timedelta = timedelta(days=30),
        min_samples_per_hour: int = 3,
    ) -> list[dict[str, float | int | object]]:
        start_at = datetime.now(timezone.utc) - lookback
        with self._connection() as connection:
            with connection.cursor() as cursor:
                cursor.execute(
                    """
                    WITH hourly_solar AS (
                        SELECT
                            time_bucket('1 hour', observed_at) AS target_at,
                            avg(solar_power_w) AS avg_solar_power_w,
                            count(*) AS sample_count
                        FROM sigen_plant_snapshots
                        WHERE observed_at >= %s
                          AND solar_power_w IS NOT NULL
                        GROUP BY target_at
                    ),
                    latest_weather AS (
                        SELECT
                            target_at,
                            shortwave_radiation_w_m2,
                            cloud_cover_pct,
                            ROW_NUMBER() OVER (
                                PARTITION BY target_at
                                ORDER BY issued_at DESC
                            ) AS rn
                        FROM weather_forecast_points
                        WHERE target_at >= %s
                    )
                    SELECT
                        h.target_at,
                        h.avg_solar_power_w,
                        h.sample_count,
                        w.shortwave_radiation_w_m2,
                        w.cloud_cover_pct
                    FROM hourly_solar h
                    JOIN latest_weather w
                        ON w.target_at = h.target_at
                        AND w.rn = 1
                    WHERE h.sample_count >= %s
                      AND w.shortwave_radiation_w_m2 IS NOT NULL
                    ORDER BY h.target_at
                    """,
                    (start_at, start_at, min_samples_per_hour),
                )
                rows = cursor.fetchall()

        return [
            {
                "target_at": row[0],
                "solar_power_w": float(row[1]),
                "sample_count": int(row[2]),
                "shortwave_radiation_w_m2": float(row[3]),
                "cloud_cover_pct": float(row[4]) if row[4] is not None else 0.0,
            }
            for row in rows
        ]

    def _create_rollup_view(self, cursor: object, view_name: str, bucket: str) -> None:
        cursor.execute(
            f"""
            CREATE OR REPLACE VIEW {view_name} AS
            SELECT
                time_bucket('{bucket}', observed_at) AS bucket,
                source,
                avg(solar_power_w) AS avg_solar_power_w,
                min(solar_power_w) AS min_solar_power_w,
                max(solar_power_w) AS max_solar_power_w,
                avg(load_power_w) AS avg_load_power_w,
                min(load_power_w) AS min_load_power_w,
                max(load_power_w) AS max_load_power_w,
                avg(grid_import_w) AS avg_grid_import_w,
                max(grid_import_w) AS max_grid_import_w,
                avg(grid_export_w) AS avg_grid_export_w,
                max(grid_export_w) AS max_grid_export_w,
                avg(battery_power_w) AS avg_battery_power_w,
                min(battery_power_w) AS min_battery_power_w,
                max(battery_power_w) AS max_battery_power_w,
                avg(battery_soc_pct) AS avg_battery_soc_pct,
                min(battery_soc_pct) AS min_battery_soc_pct,
                max(battery_soc_pct) AS max_battery_soc_pct,
                min(accumulated_pv_energy_kwh) AS start_accumulated_pv_energy_kwh,
                max(accumulated_pv_energy_kwh) AS end_accumulated_pv_energy_kwh,
                count(*) AS sample_count
            FROM sigen_plant_snapshots
            GROUP BY bucket, source
            """
        )

    @contextmanager
    def _connection(self) -> Iterator[object]:
        try:
            import psycopg
        except ImportError as error:
            raise SigenStoreConfigurationError(
                "Install dependencies with `python3 -m pip install -r requirements.txt`"
            ) from error

        with psycopg.connect(self.config.database_url) as connection:
            yield connection
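A minimal usage sketch for the store added above, assuming ASTRAPE_DATABASE_URL is set and psycopg is installed; it only calls methods defined in this file.

from datetime import timedelta

from gibil.classes.sigen.store import SigenStore

store = SigenStore.from_env()
store.initialize()  # creates the hypertable, index and rollup views idempotently

latest = store.load_latest_snapshot()
if latest is not None:
    print(latest.observed_at, latest.solar_power_w, latest.battery_soc_pct)

summary = store.load_recent_power_summary(lookback=timedelta(minutes=30))
print(summary["load_p50_w"], summary["solar_max_w"])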
@@ -0,0 +1,23 @@
from gibil.classes.weather.builder import (
    OpenMeteoArchiveClient,
    OpenMeteoArchiveParser,
    OpenMeteoClient,
    OpenMeteoParser,
    WeatherBuilder,
)
from gibil.classes.weather.display import WeatherDisplay, WeatherDisplayDataset
from gibil.classes.weather.sample_data import WeatherSampleData
from gibil.classes.weather.store import WeatherStore, WeatherStoreConfig

__all__ = [
    "OpenMeteoClient",
    "OpenMeteoParser",
    "OpenMeteoArchiveClient",
    "OpenMeteoArchiveParser",
    "WeatherBuilder",
    "WeatherDisplay",
    "WeatherDisplayDataset",
    "WeatherSampleData",
    "WeatherStore",
    "WeatherStoreConfig",
]
@@ -80,6 +80,7 @@ class OpenMeteoArchiveClient:
                 [
                     "temperature_2m",
                     "shortwave_radiation",
+                    "cloud_cover",
                 ]
             ),
             "timezone": timezone_name,
@@ -167,6 +168,7 @@ class OpenMeteoArchiveParser:
         times = hourly.get("time", [])
         temperatures = hourly.get("temperature_2m", [])
         radiation = hourly.get("shortwave_radiation", [])
+        cloud_cover = hourly.get("cloud_cover", [])

         truth: list[WeatherResolvedTruth] = []
         for index, raw_time in enumerate(times):
@@ -175,6 +177,7 @@ class OpenMeteoArchiveParser:
                     resolved_at=self._parse_time(raw_time),
                     temperature_c=self._at(temperatures, index),
                     shortwave_radiation_w_m2=self._at(radiation, index),
+                    cloud_cover_pct=self._at(cloud_cover, index),
                     source="open_meteo_archive",
                 )
             )
@@ -30,6 +30,7 @@ class WeatherDisplay:
               <select id="weather-variable">
                 <option value="temperature_c">Temperature</option>
                 <option value="shortwave_radiation_w_m2">Solar radiation</option>
+                <option value="cloud_cover_pct">Cloud cover</option>
              </select>
            </label>
            <div class="legend-control">
@@ -112,15 +113,13 @@ class WeatherDisplay:
           ctx.clearRect(0, 0, canvas.width, canvas.height);

           const allPoints = series.flatMap((item) => item.points);
-          const now = Date.now();
-          const xs = allPoints.map((point) => new Date(point.target_at).getTime());
-          xs.push(now);
+          const windowBounds = oracleAlignedBounds(payload.now);
           const ys = allPoints.map((point) => point.value).filter((value) => value !== null);
-          if (!xs.length || !ys.length) return;
+          if (!ys.length) return;

           const bounds = {
-            minX: Math.min(...xs),
-            maxX: Math.max(...xs),
+            minX: windowBounds.minX,
+            maxX: windowBounds.maxX,
             minY: Math.min(...ys),
             maxY: Math.max(...ys),
           };
@@ -130,7 +129,7 @@ class WeatherDisplay:
           }

           drawAxes(ctx, canvas, bounds);
-          drawNowMarker(ctx, canvas, bounds);
+          drawNowMarker(ctx, canvas, bounds, windowBounds.nowX);
           series.forEach((item) => {
             drawSeries(ctx, canvas, bounds, item.points, item.color, item.width);
           });
@@ -196,8 +195,7 @@ class WeatherDisplay:
           ctx.stroke();
         }

-        function drawNowMarker(ctx, canvas, bounds) {
-          const now = Date.now();
+        function drawNowMarker(ctx, canvas, bounds, now) {
           if (now < bounds.minX || now > bounds.maxX) return;

           const margin = chartMargin();
@@ -258,6 +256,16 @@ class WeatherDisplay:
           return { top: 24, right: 28, bottom: 34, left: 52 };
         }

+        function oracleAlignedBounds(nowIso) {
+          const parsedNow = new Date(nowIso).getTime();
+          const now = Number.isFinite(parsedNow) ? parsedNow : Date.now();
+          return {
+            minX: now - 24 * 60 * 60 * 1000,
+            maxX: now + 48 * 60 * 60 * 1000,
+            nowX: now
+          };
+        }
+
         function scale(value, inMin, inMax, outMin, outMax) {
           if (inMin === inMax) return (outMin + outMax) / 2;
           return outMin + ((value - inMin) / (inMax - inMin)) * (outMax - outMin);
@@ -279,6 +287,7 @@ class WeatherDisplay:

         return json.dumps(
             {
+                "now": datetime.now().astimezone().isoformat(),
                 "forecast_points": forecast_points,
                 "resolved_truth": resolved_truth,
                 "horizons": horizons,
@@ -304,6 +313,7 @@ class WeatherDisplay:
                 "source": point.source,
                 "temperature_c": point.temperature_c,
                 "shortwave_radiation_w_m2": point.shortwave_radiation_w_m2,
+                "cloud_cover_pct": point.cloud_cover_pct,
             }

     def _iso(self, value: datetime) -> str:
@@ -4,7 +4,7 @@ from datetime import datetime, timedelta, timezone
 from math import pi, sin

 from gibil.classes.models import WeatherForecastPoint, WeatherResolvedTruth
-from gibil.classes.weather_display import WeatherDisplayDataset
+from gibil.classes.weather.display import WeatherDisplayDataset


 class WeatherSampleData:
@@ -7,7 +7,7 @@ from os import environ
 from typing import Iterator

 from gibil.classes.models import WeatherForecastPoint, WeatherForecastRun, WeatherResolvedTruth
-from gibil.classes.weather_display import WeatherDisplayDataset
+from gibil.classes.weather.display import WeatherDisplayDataset


 class WeatherStoreConfigurationError(RuntimeError):
@@ -76,11 +76,18 @@ class WeatherStore:
                         source TEXT NOT NULL,
                         temperature_c DOUBLE PRECISION,
                         shortwave_radiation_w_m2 DOUBLE PRECISION,
+                        cloud_cover_pct DOUBLE PRECISION,
                         inserted_at TIMESTAMPTZ NOT NULL DEFAULT now(),
                         PRIMARY KEY (resolved_at, source)
                     )
                     """
                 )
+                cursor.execute(
+                    """
+                    ALTER TABLE weather_resolved_truth
+                    ADD COLUMN IF NOT EXISTS cloud_cover_pct DOUBLE PRECISION
+                    """
+                )
                 cursor.execute(
                     """
                     SELECT create_hypertable(
@@ -149,6 +156,7 @@ class WeatherStore:
                         point.source,
                         point.temperature_c,
                         point.shortwave_radiation_w_m2,
+                        point.cloud_cover_pct,
                     )
                     for point in truth_points
                 ]
@@ -163,13 +171,15 @@ class WeatherStore:
                         resolved_at,
                         source,
                         temperature_c,
-                        shortwave_radiation_w_m2
+                        shortwave_radiation_w_m2,
+                        cloud_cover_pct
                     )
-                    VALUES (%s, %s, %s, %s)
+                    VALUES (%s, %s, %s, %s, %s)
                     ON CONFLICT (resolved_at, source)
                     DO UPDATE SET
                         temperature_c = EXCLUDED.temperature_c,
                         shortwave_radiation_w_m2 = EXCLUDED.shortwave_radiation_w_m2,
+                        cloud_cover_pct = EXCLUDED.cloud_cover_pct,
                         inserted_at = now()
                     """,
                     rows,
@@ -187,12 +197,67 @@ class WeatherStore:
                 source="open_meteo_zero_hour",
                 temperature_c=point.temperature_c,
                 shortwave_radiation_w_m2=point.shortwave_radiation_w_m2,
+                cloud_cover_pct=point.cloud_cover_pct,
             )
             for point in forecast_run.points
             if point.horizon_hours == 0
         ]
         return self.save_resolved_truth(truth_points)

+    def load_latest_forecast_points(
+        self,
+        start_at: datetime,
+        end_at: datetime,
+    ) -> list[WeatherForecastPoint]:
+        with self._connection() as connection:
+            with connection.cursor() as cursor:
+                cursor.execute(
+                    """
+                    SELECT
+                        issued_at,
+                        target_at,
+                        horizon_hours,
+                        source,
+                        temperature_c,
+                        shortwave_radiation_w_m2,
+                        cloud_cover_pct
+                    FROM (
+                        SELECT
+                            issued_at,
+                            target_at,
+                            horizon_hours,
+                            source,
+                            temperature_c,
+                            shortwave_radiation_w_m2,
+                            cloud_cover_pct,
+                            ROW_NUMBER() OVER (
+                                PARTITION BY target_at
+                                ORDER BY issued_at DESC
+                            ) as rn
+                        FROM weather_forecast_points
+                        WHERE target_at >= %s AND target_at <= %s
+                    ) as ranked
+                    WHERE rn = 1
+                    ORDER BY target_at
+                    LIMIT 5000
+                    """,
+                    (start_at, end_at),
+                )
+                rows = cursor.fetchall()
+
+        return [
+            WeatherForecastPoint(
+                issued_at=row[0],
+                target_at=row[1],
+                horizon_hours=row[2],
+                source=row[3],
+                temperature_c=row[4],
+                shortwave_radiation_w_m2=row[5],
+                cloud_cover_pct=row[6],
+            )
+            for row in rows
+        ]
+
     def load_display_dataset(
         self,
         start_at: datetime | None = None,
@@ -243,7 +308,8 @@ class WeatherStore:
                         resolved_at,
                         source,
                         temperature_c,
-                        shortwave_radiation_w_m2
+                        shortwave_radiation_w_m2,
+                        cloud_cover_pct
                     FROM weather_resolved_truth
                     WHERE resolved_at >= %s AND resolved_at <= %s
                     ORDER BY resolved_at
@@ -272,6 +338,7 @@ class WeatherStore:
                     source=row[1],
                     temperature_c=row[2],
                     shortwave_radiation_w_m2=row[3],
+                    cloud_cover_pct=row[4],
                 )
                 for row in truth_rows
             ],
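A short sketch of the new load_latest_forecast_points query added above: for each target hour it keeps only the most recently issued forecast row (ROW_NUMBER ... ORDER BY issued_at DESC), so callers see one point per hour. Assumes the usual environment configuration is loaded.

from datetime import datetime, timedelta, timezone

from gibil.classes.weather.store import WeatherStore

store = WeatherStore.from_env()
now = datetime.now(timezone.utc)
points = store.load_latest_forecast_points(start_at=now, end_at=now + timedelta(hours=48))
for point in points[:3]:
    print(point.target_at, point.shortwave_radiation_w_m2, point.cloud_cover_pct)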
+123 -7
@@ -3,18 +3,23 @@ from __future__ import annotations
 from os import environ

 from gibil.classes.env_loader import EnvLoader
-from gibil.classes.weather_sample_data import WeatherSampleData
-from gibil.classes.weather_store import WeatherStore, WeatherStoreConfigurationError
-from gibil.classes.weather_display import WeatherDisplay
+from gibil.classes.weather.sample_data import WeatherSampleData
+from gibil.classes.weather.store import WeatherStore, WeatherStoreConfigurationError
+from gibil.classes.weather.display import WeatherDisplay
+from gibil.classes.oracle.display import OracleDisplay
+from gibil.classes.oracle.quality_display import OracleQualityDisplay


 class WebUI:
-    """Composes Astrape web modules into one page."""
+    """Composes Astrape web modules into a small control panel."""

     def __init__(self) -> None:
         self.weather_display = WeatherDisplay()
+        self.oracle_display = OracleDisplay()
+        self.oracle_quality_display = OracleQualityDisplay()

-    def render_page(self) -> str:
+    def render_page(self, page: str = "oracle") -> str:
+        current_page = page if page in {"oracle", "weather", "quality"} else "oracle"
         return f"""<!doctype html>
 <html lang="en">
 <head>
@@ -31,6 +36,7 @@ class WebUI:
           --muted: #9aa8ba;
           --line: #344052;
           --field: #121821;
+          --active: #38bdf8;
         }}

         * {{
@@ -55,6 +61,39 @@ class WebUI:
           background: var(--surface);
         }}

+        .brand {{
+          display: grid;
+          gap: 2px;
+        }}
+
+        nav {{
+          display: flex;
+          align-items: center;
+          gap: 8px;
+          flex-wrap: wrap;
+        }}
+
+        nav a {{
+          color: var(--muted);
+          text-decoration: none;
+          border: 1px solid transparent;
+          border-radius: 6px;
+          padding: 8px 10px;
+          font-size: 13px;
+          font-weight: 700;
+        }}
+
+        nav a:hover {{
+          color: var(--ink);
+          border-color: var(--line);
+        }}
+
+        nav a.active {{
+          color: var(--ink);
+          border-color: var(--active);
+          background: #102334;
+        }}
+
         h1, h2, p {{
           margin: 0;
         }}
@@ -87,6 +126,10 @@ class WebUI:
           padding: 18px;
         }}

+        .panel + .panel {{
+          margin-top: 18px;
+        }}
+
         .panel-heading {{
           display: grid;
           grid-template-columns: minmax(180px, auto) 1fr;
@@ -195,6 +238,49 @@ class WebUI:
           height: 420px;
         }}

+        table {{
+          width: 100%;
+          border-collapse: collapse;
+          font-size: 13px;
+        }}
+
+        th, td {{
+          padding: 10px 12px;
+          border-bottom: 1px solid var(--line);
+          text-align: right;
+          white-space: nowrap;
+        }}
+
+        th:first-child, td:first-child,
+        th:nth-child(2), td:nth-child(2) {{
+          text-align: left;
+        }}
+
+        th {{
+          color: var(--muted);
+          font-size: 12px;
+          font-weight: 700;
+        }}
+
+        .table-shell {{
+          overflow-x: auto;
+          border: 1px solid var(--line);
+          border-radius: 6px;
+          background: var(--panel);
+        }}
+
+        .metric-good {{
+          color: #34d399;
+        }}
+
+        .metric-warn {{
+          color: #fbbf24;
+        }}
+
+        .metric-bad {{
+          color: #fb7185;
+        }}
+
         @media (max-width: 760px) {{
           header, .panel-heading, .control-row {{
             display: grid;
@@ -216,11 +302,14 @@ class WebUI:
 </head>
 <body>
   <header>
+    <div class="brand">
       <h1>Astrape</h1>
-      <p>Gibil web UI</p>
+      <p>Gibil control panel</p>
+    </div>
+    {self._nav(current_page)}
   </header>
   <main>
-    {self.weather_display.render()}
+    {self._page_body(current_page)}
   </main>
   <script>
     let astrapeUiVersion = null;
@@ -241,6 +330,25 @@ class WebUI:
 </body>
 </html>"""

+    def _nav(self, current_page: str) -> str:
+        pages = [
+            ("oracle", "/oracle", "Oracle"),
+            ("weather", "/weather", "Weather"),
+            ("quality", "/quality", "Quality"),
+        ]
+        links = [
+            f'<a class="{"active" if key == current_page else ""}" href="{href}">{label}</a>'
+            for key, href, label in pages
+        ]
+        return f"<nav>{''.join(links)}</nav>"
+
+    def _page_body(self, page: str) -> str:
+        if page == "weather":
+            return self.weather_display.render()
+        if page == "quality":
+            return self.oracle_quality_display.render()
+        return self.oracle_display.render()
+
     def weather_payload(self) -> str:
         EnvLoader().load()
         if environ.get("ASTRAPE_WEB_SAMPLE_DATA") == "1":
@@ -252,3 +360,11 @@ class WebUI:
             dataset = None

         return self.weather_display.data_payload(dataset)
+
+    def oracle_payload(self) -> str:
+        EnvLoader().load()
+        return self.oracle_display.data_payload()
+
+    def oracle_quality_payload(self, lookback_hours: float = 168) -> str:
+        EnvLoader().load()
+        return self.oracle_quality_display.data_payload(lookback_hours=lookback_hours)
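A quick sketch of the new multi-page routing above: render_page() falls back to the oracle page for unknown slugs, and the payload helpers back the /api/* endpoints. Only methods shown in this diff are used.

from gibil.classes.webui import WebUI

ui = WebUI()
html = ui.render_page("weather")      # weather dashboard
fallback = ui.render_page("unknown")  # silently falls back to the oracle page
payload = ui.oracle_quality_payload(lookback_hours=24)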
@@ -0,0 +1 @@
"""Long-running Astrape service entrypoints."""
@@ -7,12 +7,12 @@ from sys import stderr
 from time import sleep

 from gibil.classes.env_loader import EnvLoader
-from gibil.classes.weather_builder import (
+from gibil.classes.weather.builder import (
     OpenMeteoArchiveClient,
     OpenMeteoClient,
     WeatherBuilder,
 )
-from gibil.classes.weather_store import WeatherStore
+from gibil.classes.weather.store import WeatherStore


 @dataclass(frozen=True)
@@ -0,0 +1,115 @@
from __future__ import annotations

import argparse
from dataclasses import dataclass
from os import environ
from sys import stderr
from time import sleep

from gibil.classes.oracle.builder import EnergyOracleBuilder
from gibil.classes.env_loader import EnvLoader
from gibil.classes.oracle.store import OracleStore


@dataclass(frozen=True)
class OracleDaemonConfig:
    poll_seconds: float
    evaluate_forecasts: bool
    evaluation_actual_window_minutes: float
    evaluation_lookback_hours: float
    evaluation_limit: int

    @classmethod
    def from_env(cls) -> "OracleDaemonConfig":
        return cls(
            poll_seconds=float(environ.get("ASTRAPE_ORACLE_POLL_SECONDS", "300")),
            evaluate_forecasts=environ.get(
                "ASTRAPE_ORACLE_EVALUATE_FORECASTS", "1"
            ).lower()
            not in {"0", "false", "no"},
            evaluation_actual_window_minutes=float(
                environ.get("ASTRAPE_ORACLE_EVALUATION_WINDOW_MINUTES", "5")
            ),
            evaluation_lookback_hours=float(
                environ.get("ASTRAPE_ORACLE_EVALUATION_LOOKBACK_HOURS", "168")
            ),
            evaluation_limit=int(environ.get("ASTRAPE_ORACLE_EVALUATION_LIMIT", "1000")),
        )


class OracleDaemon:
    """Periodically stores oracle projection curves for evaluation."""

    def __init__(
        self,
        config: OracleDaemonConfig,
        builder: EnergyOracleBuilder,
        store: OracleStore,
    ) -> None:
        self.config = config
        self.builder = builder
        self.store = store

    @classmethod
    def from_env(cls) -> "OracleDaemon":
        return cls(
            config=OracleDaemonConfig.from_env(),
            builder=EnergyOracleBuilder.from_env(),
            store=OracleStore.from_env(),
        )

    def run_once(self) -> int:
        solar_run, load_run, net_run = self.builder.build()
        saved_count = self.store.save_runs(solar_run, load_run, net_run)
        if self.config.evaluate_forecasts:
            from datetime import timedelta

            evaluated_count = self.store.evaluate_due_forecasts(
                actual_window=timedelta(
                    minutes=self.config.evaluation_actual_window_minutes
                ),
                lookback=timedelta(hours=self.config.evaluation_lookback_hours),
                limit=self.config.evaluation_limit,
            )
            return saved_count + evaluated_count
        return saved_count

    def run_forever(self) -> None:
        self.store.initialize()
        while True:
            try:
                saved_count = self.run_once()
                print(f"stored_oracle_records={saved_count}", flush=True)
            except Exception as error:
                print(f"oracle_poll_error={error}", file=stderr, flush=True)
            sleep(self.config.poll_seconds)


def main() -> None:
    try:
        EnvLoader().load()
        args = parse_args()
        daemon = OracleDaemon.from_env()
        if args.once:
            print(f"stored_oracle_records={daemon.run_once()}", flush=True)
            return
        daemon.run_forever()
    except Exception as error:
        print(f"oracle_daemon_startup_error={error}", file=stderr)
        raise SystemExit(1) from error


def parse_args() -> argparse.Namespace:
    parser = argparse.ArgumentParser(
        description="Store Astrape oracle projection curves."
    )
    parser.add_argument(
        "--once",
        action="store_true",
        help="Save one set of oracle curves and exit.",
    )
    return parser.parse_args()


if __name__ == "__main__":
    main()
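Equivalent of the daemon's --once flag driven from Python, shown as a sketch; the gibil.services.oracle_daemon module path is assumed from this diff, and ASTRAPE_DATABASE_URL plus the oracle settings are expected in the environment.

from gibil.classes.env_loader import EnvLoader
from gibil.services.oracle_daemon import OracleDaemon  # module path assumed

EnvLoader().load()
daemon = OracleDaemon.from_env()
daemon.store.initialize()
print(daemon.run_once())  # curves saved plus forecasts evaluated in this pass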
@@ -0,0 +1,95 @@
from __future__ import annotations

import argparse
from dataclasses import dataclass
from os import environ
from sys import stderr
from time import sleep

from gibil.classes.env_loader import EnvLoader
from gibil.classes.sigen.builder import SigenPlantClient
from gibil.classes.sigen.store import SigenStore


@dataclass(frozen=True)
class SigenDaemonConfig:
    poll_seconds: float

    @classmethod
    def from_env(cls) -> "SigenDaemonConfig":
        return cls(
            poll_seconds=float(environ.get("SIGEN_POLL_SECONDS", "5")),
        )


class SigenDaemon:
    """Polls Sigenergy plant metrics and stores normalized snapshots."""

    def __init__(
        self,
        config: SigenDaemonConfig,
        plant_client: SigenPlantClient,
        sigen_store: SigenStore,
    ) -> None:
        self.config = config
        self.plant_client = plant_client
        self.sigen_store = sigen_store

    @classmethod
    def from_env(cls) -> "SigenDaemon":
        return cls(
            config=SigenDaemonConfig.from_env(),
            plant_client=SigenPlantClient.from_env(),
            sigen_store=SigenStore.from_env(),
        )

    def initialize(self) -> None:
        self.sigen_store.initialize()

    def run_once(self) -> int:
        snapshot = self.plant_client.fetch_snapshot()
        return self.sigen_store.save_snapshot(snapshot)

    def run_forever(self) -> None:
        self.initialize()
        while True:
            try:
                saved_count = self.run_once()
                print(f"stored_sigen_plant_snapshots={saved_count}", flush=True)
            except Exception as error:
                print(f"sigen_poll_error={error}", file=stderr, flush=True)

            sleep(self.config.poll_seconds)


def main() -> None:
    try:
        EnvLoader().load()
        daemon = SigenDaemon.from_env()
        args = parse_args()
        if args.once:
            daemon.initialize()
            saved_count = daemon.run_once()
            print(f"stored_sigen_plant_snapshots={saved_count}", flush=True)
            return

        daemon.run_forever()
    except Exception as error:
        print(f"sigen_daemon_startup_error={error}", file=stderr)
        raise SystemExit(1) from error


def parse_args() -> argparse.Namespace:
    parser = argparse.ArgumentParser(
        description="Poll Sigenergy plant metrics into Astrape's database."
    )
    parser.add_argument(
        "--once",
        action="store_true",
        help="Initialize storage, save one snapshot, and exit.",
    )
    return parser.parse_args()


if __name__ == "__main__":
    main()
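Because the poller takes its collaborators through __init__, a test can inject fakes instead of a live Modbus plant and a real database. A minimal sketch; FakeClient and FakeStore are hypothetical stand-ins, and the gibil.services.sigen_daemon module path is assumed from this diff.

from gibil.services.sigen_daemon import SigenDaemon, SigenDaemonConfig  # path assumed


class FakeClient:
    def fetch_snapshot(self):
        return object()  # stand-in for a SigenPlantSnapshot


class FakeStore:
    def initialize(self) -> None:
        pass

    def save_snapshot(self, snapshot) -> int:
        return 1


daemon = SigenDaemon(SigenDaemonConfig(poll_seconds=0.0), FakeClient(), FakeStore())
daemon.initialize()
assert daemon.run_once() == 1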
@@ -0,0 +1,142 @@
from __future__ import annotations

from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer
from importlib import import_module, reload
from os import environ
from pathlib import Path
import json
from urllib.parse import parse_qs, urlparse

from gibil.classes.env_loader import EnvLoader

EnvLoader().load()

HOST = environ.get("ASTRAPE_WEB_HOST", "0.0.0.0")
PORT = int(environ.get("ASTRAPE_WEB_PORT", "8080"))
PROJECT_ROOT = Path(__file__).resolve().parents[2]
WATCHED_PATHS = [
    PROJECT_ROOT / "gibil" / "classes" / "webui.py",
    PROJECT_ROOT / "gibil" / "classes" / "weather" / "display.py",
    PROJECT_ROOT / "gibil" / "classes" / "oracle" / "display.py",
    PROJECT_ROOT / "gibil" / "classes" / "oracle" / "quality_display.py",
    PROJECT_ROOT / "gibil" / "classes" / "weather" / "store.py",
    PROJECT_ROOT / "gibil" / "classes" / "oracle" / "store.py",
    PROJECT_ROOT / "gibil" / "classes" / "oracle" / "builder.py",
    PROJECT_ROOT / "gibil" / "classes" / "oracle" / "config.py",
    PROJECT_ROOT / "gibil" / "classes" / "sigen" / "store.py",
]


class AstrapeWebHandler(BaseHTTPRequestHandler):
    def do_GET(self) -> None:
        parsed = urlparse(self.path)
        path = parsed.path

        if path in {"/", "/oracle"}:
            self._send_html(self._webui().render_page("oracle"))
            return

        if path == "/weather":
            self._send_html(self._webui().render_page("weather"))
            return

        if path == "/quality":
            self._send_html(self._webui().render_page("quality"))
            return

        if path == "/api/weather":
            self._send_json_text(self._webui().weather_payload())
            return

        if path == "/api/oracle":
            self._send_json_text(self._webui().oracle_payload())
            return

        if path == "/api/oracle-quality":
            params = parse_qs(parsed.query)
            lookback_hours = self._float_param(params, "lookback_hours", 168)
            self._send_json_text(
                self._webui().oracle_quality_payload(
                    lookback_hours=lookback_hours
                )
            )
            return

        if path == "/api/ui-version":
            self._send_json_text(json.dumps({"version": self._ui_version()}))
            return

        self.send_error(404)

    def log_message(self, format: str, *args: object) -> None:
        print(f"{self.address_string()} - {format % args}")

    def _webui(self):
        weather_store_module = import_module("gibil.classes.weather.store")
        sigen_store_module = import_module("gibil.classes.sigen.store")
        oracle_store_module = import_module("gibil.classes.oracle.store")
        oracle_builder_module = import_module("gibil.classes.oracle.builder")
        oracle_display_module = import_module("gibil.classes.oracle.display")
        oracle_quality_display_module = import_module(
            "gibil.classes.oracle.quality_display"
        )
        weather_display_module = import_module("gibil.classes.weather.display")
        webui_module = import_module("gibil.classes.webui")
        reload(weather_store_module)
        reload(sigen_store_module)
        reload(oracle_store_module)
        reload(oracle_builder_module)
        reload(oracle_display_module)
        reload(oracle_quality_display_module)
        reload(weather_display_module)
        reload(webui_module)
        return webui_module.WebUI()

    def _float_param(
        self,
        params: dict[str, list[str]],
        key: str,
        default: float,
    ) -> float:
        values = params.get(key)
        if not values:
            return default
        try:
            return float(values[0])
        except ValueError:
            return default

    def _ui_version(self) -> str:
        mtimes = [
            str(path.stat().st_mtime_ns)
            for path in WATCHED_PATHS
            if path.exists()
        ]
        return ".".join(mtimes)

    def _send_html(self, body: str) -> None:
        encoded = body.encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.send_header("Content-Length", str(len(encoded)))
        self.end_headers()
        self.wfile.write(encoded)

    def _send_json_text(self, body: str) -> None:
        encoded = body.encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "application/json; charset=utf-8")
        self.send_header("Cache-Control", "no-store")
        self.send_header("Content-Length", str(len(encoded)))
        self.end_headers()
        self.wfile.write(encoded)


def main() -> None:
    server = ThreadingHTTPServer((HOST, PORT), AstrapeWebHandler)
    print(f"Astrape web UI listening on http://{HOST}:{PORT}")
    server.serve_forever()


if __name__ == "__main__":
    main()
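A sketch of exercising the endpoints served above once the daemon is running locally; host and port default to 0.0.0.0:8080 unless ASTRAPE_WEB_HOST/ASTRAPE_WEB_PORT override them, and the 127.0.0.1 address here is just an example.

import json
from urllib.request import urlopen

base = "http://127.0.0.1:8080"
version = json.loads(urlopen(f"{base}/api/ui-version").read())["version"]
quality = json.loads(urlopen(f"{base}/api/oracle-quality?lookback_hours=24").read())
html = urlopen(f"{base}/weather").read().decode("utf-8")
print(version, len(html))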
@@ -0,0 +1,31 @@
#!/usr/bin/env python3
"""Debug script to find duplicate target times in forecast data."""

from gibil.classes.env_loader import EnvLoader
from gibil.classes.weather.store import WeatherStore
from collections import defaultdict

EnvLoader().load()

store = WeatherStore.from_env()
dataset = store.load_display_dataset()

# Group by (target_at, horizon_hours) to find duplicates
by_key = defaultdict(list)
for point in dataset.forecast_points:
    key = (point.target_at, point.horizon_hours)
    by_key[key].append(point)

# Find duplicates
duplicates = {k: v for k, v in by_key.items() if len(v) > 1}

print(f"\nTotal forecast points: {len(dataset.forecast_points)}")
print(f"Unique (target_at, horizon) pairs: {len(by_key)}")
print(f"Duplicate (target_at, horizon) pairs: {len(duplicates)}")

if duplicates:
    print("\nFirst 3 duplicates:")
    for (target_at, horizon), points in list(duplicates.items())[:3]:
        print(f"\n  target_at={target_at}, horizon={horizon}h ({len(points)} points):")
        for i, p in enumerate(points):
            print(f"    [{i}] issued_at={p.issued_at}, temp={p.temperature_c}, source={p.source}")
@@ -0,0 +1,67 @@
#!/usr/bin/env python3
"""Debug baseline energy forecast curves."""

from __future__ import annotations

import argparse
from datetime import timezone

from gibil.classes.oracle.builder import EnergyForecastBuilder
from gibil.classes.env_loader import EnvLoader
from gibil.classes.models import PowerForecastPoint


def main() -> None:
    EnvLoader().load()
    args = parse_args()
    solar_run, load_run, net_run = EnergyForecastBuilder.from_env().build()

    print(
        f"issued_at={net_run.issued_at.astimezone(timezone.utc).isoformat(timespec='seconds')}"
    )
    print(
        f"solar_model={solar_run.model_version} "
        f"load_model={load_run.model_version} points={len(net_run.points)}"
    )
    print(
        "target_at solar_p10 solar_p50 solar_p90 "
        "load_p10 load_p50 load_p90 net_p10 net_p50 net_p90"
    )
    solar_by_target = _by_target(solar_run.points)
    load_by_target = _by_target(load_run.points)
    for point in net_run.points[: args.limit]:
        solar_point = solar_by_target[point.target_at]
        load_point = load_by_target[point.target_at]
        print(
            f"{point.target_at.astimezone(timezone.utc).isoformat(timespec='minutes'):25} "
            f"{solar_point.p10_power_w:9.0f} "
            f"{solar_point.p50_power_w:9.0f} "
            f"{solar_point.p90_power_w:9.0f} "
            f"{load_point.p10_power_w:8.0f} "
            f"{load_point.p50_power_w:8.0f} "
            f"{load_point.p90_power_w:8.0f} "
            f"{point.p10_net_power_w:7.0f} "
            f"{point.p50_net_power_w:7.0f} "
            f"{point.p90_net_power_w:7.0f}"
        )


def _by_target(points: list[PowerForecastPoint]) -> dict[object, PowerForecastPoint]:
    return {point.target_at: point for point in points}


def parse_args() -> argparse.Namespace:
    parser = argparse.ArgumentParser(
        description="Print baseline solar/load/net forecast curves."
    )
    parser.add_argument(
        "--limit",
        type=int,
        default=24,
        help="Number of forecast points to show. Defaults to 24.",
    )
    return parser.parse_args()


if __name__ == "__main__":
    main()
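Note: the table rows above join the solar, load, and net runs purely on target_at, so a timestamp missing from either run would raise a KeyError. A tolerant variant of that lookup, shown only as an illustration of the join and not as part of this script, could skip unmatched rows instead:

# hypothetical tolerant join: skip target times missing from either run
for point in net_run.points[: args.limit]:
    solar_point = solar_by_target.get(point.target_at)
    load_point = load_by_target.get(point.target_at)
    if solar_point is None or load_point is None:
        continue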
@@ -0,0 +1,5 @@
from gibil.scripts.oracle_evaluator import main


if __name__ == "__main__":
    main()
@@ -0,0 +1,198 @@
#!/usr/bin/env python3
"""Debug script to inspect stored Sigenergy plant snapshots."""

from __future__ import annotations

import argparse
from datetime import timezone
import json

from gibil.classes.env_loader import EnvLoader
from gibil.classes.sigen.store import SigenStore


def main() -> None:
    EnvLoader().load()
    args = parse_args()
    store = SigenStore.from_env()

    if args.view == "raw":
        rows = load_raw_snapshots(store, args.limit)
        print_raw_snapshots(rows)
        return

    rows = load_rollup(store, args.view, args.limit)
    print_rollup(rows, args.view)


def parse_args() -> argparse.Namespace:
    parser = argparse.ArgumentParser(
        description="Inspect stored Sigenergy plant snapshots."
    )
    parser.add_argument(
        "--view",
        choices=("raw", "1m", "15m", "1h"),
        default="raw",
        help="View to inspect. Defaults to raw.",
    )
    parser.add_argument(
        "--limit",
        type=int,
        default=10,
        help="Number of most recent rows/buckets to show. Defaults to 10.",
    )
    return parser.parse_args()


def load_raw_snapshots(store: SigenStore, limit: int) -> list[tuple]:
    with store._connection() as connection:
        with connection.cursor() as cursor:
            cursor.execute(
                """
                SELECT
                    observed_at,
                    received_at,
                    solar_power_w,
                    load_power_w,
                    battery_soc_pct,
                    battery_power_w,
                    grid_import_w,
                    grid_export_w,
                    plant_active_power_w,
                    accumulated_pv_energy_kwh,
                    daily_consumed_energy_kwh,
                    raw_values
                FROM sigen_plant_snapshots
                ORDER BY observed_at DESC
                LIMIT %s
                """,
                (limit,),
            )
            return cursor.fetchall()


def load_rollup(store: SigenStore, view: str, limit: int) -> list[tuple]:
    view_name = {
        "1m": "sigen_plant_snapshots_1m",
        "15m": "sigen_plant_snapshots_15m",
        "1h": "sigen_plant_snapshots_1h",
    }[view]

    with store._connection() as connection:
        with connection.cursor() as cursor:
            cursor.execute(
                f"""
                SELECT
                    bucket,
                    sample_count,
                    avg_solar_power_w,
                    max_solar_power_w,
                    avg_load_power_w,
                    max_load_power_w,
                    avg_grid_import_w,
                    max_grid_import_w,
                    avg_grid_export_w,
                    max_grid_export_w,
                    avg_battery_soc_pct
                FROM {view_name}
                ORDER BY bucket DESC
                LIMIT %s
                """,
                (limit,),
            )
            return cursor.fetchall()


def print_raw_snapshots(rows: list[tuple]) -> None:
    print(f"raw_snapshots={len(rows)}")
    for row in rows:
        (
            observed_at,
            received_at,
            solar_power_w,
            load_power_w,
            battery_soc_pct,
            battery_power_w,
            grid_import_w,
            grid_export_w,
            plant_active_power_w,
            accumulated_pv_energy_kwh,
            daily_consumed_energy_kwh,
            raw_values,
        ) = row
        print(
            f"{_fmt_time(observed_at)} "
            f"solar={_fmt_w(solar_power_w)} "
            f"load={_fmt_w(load_power_w)} "
            f"soc={_fmt_pct(battery_soc_pct)} "
            f"battery={_fmt_w(battery_power_w)} "
            f"import={_fmt_w(grid_import_w)} "
            f"export={_fmt_w(grid_export_w)} "
            f"plant={_fmt_w(plant_active_power_w)} "
            f"pv_total={_fmt_kwh(accumulated_pv_energy_kwh)} "
            f"load_today={_fmt_kwh(daily_consumed_energy_kwh)} "
            f"lag_s={(received_at - observed_at).total_seconds():.1f}"
        )
        if raw_values and any(key.endswith("_error") for key in raw_values):
            errors = {
                key: value
                for key, value in raw_values.items()
                if key.endswith("_error")
            }
            print(f" errors={json.dumps(errors, default=str)}")


def print_rollup(rows: list[tuple], view: str) -> None:
    print(f"{view}_buckets={len(rows)}")
    for row in rows:
        (
            bucket,
            sample_count,
            avg_solar_power_w,
            max_solar_power_w,
            avg_load_power_w,
            max_load_power_w,
            avg_grid_import_w,
            max_grid_import_w,
            avg_grid_export_w,
            max_grid_export_w,
            avg_battery_soc_pct,
        ) = row
        print(
            f"{_fmt_time(bucket)} samples={sample_count:4} "
            f"solar_avg={_fmt_w(avg_solar_power_w)} "
            f"solar_max={_fmt_w(max_solar_power_w)} "
            f"load_avg={_fmt_w(avg_load_power_w)} "
            f"load_max={_fmt_w(max_load_power_w)} "
            f"import_avg={_fmt_w(avg_grid_import_w)} "
            f"import_max={_fmt_w(max_grid_import_w)} "
            f"export_avg={_fmt_w(avg_grid_export_w)} "
            f"export_max={_fmt_w(max_grid_export_w)} "
            f"soc_avg={_fmt_pct(avg_battery_soc_pct)}"
        )


def _fmt_time(value) -> str:
    return value.astimezone(timezone.utc).isoformat(timespec="seconds")


def _fmt_w(value) -> str:
    if value is None:
        return "None"
    return f"{value:.0f}W"


def _fmt_pct(value) -> str:
    if value is None:
        return "None"
    return f"{value:.1f}%"


def _fmt_kwh(value) -> str:
    if value is None:
        return "None"
    return f"{value:.2f}kWh"


if __name__ == "__main__":
    main()
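Note: the 1m/15m/1h names above are read as pre-aggregated rollup views; their definitions are not part of this diff. Purely as an illustration of what one such bucket represents, an equivalent in-Python aggregation over the raw rows returned by load_raw_snapshots might look like this (column index 2 is solar_power_w in that SELECT):

from collections import defaultdict
from datetime import datetime, timedelta, timezone

def bucket_of(observed_at: datetime, width: timedelta) -> datetime:
    # truncate a timestamp down to the start of its bucket
    seconds = int(width.total_seconds())
    epoch = int(observed_at.timestamp())
    return datetime.fromtimestamp(epoch - epoch % seconds, tz=timezone.utc)

def rollup_avg_solar(rows: list[tuple], width: timedelta) -> dict[datetime, float]:
    # average solar power per bucket, skipping NULL samples
    samples: dict[datetime, list[float]] = defaultdict(list)
    for row in rows:
        if row[2] is not None:
            samples[bucket_of(row[0], width)].append(row[2])
    return {bucket: sum(values) / len(values) for bucket, values in samples.items()}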
@@ -2,8 +2,8 @@
 """Debug script to trace resolved truth data."""

 from gibil.classes.env_loader import EnvLoader
-from gibil.classes.weather_store import WeatherStore
+from gibil.classes.weather.store import WeatherStore
-from gibil.classes.weather_display import WeatherDisplay
+from gibil.classes.weather.display import WeatherDisplay
 from datetime import datetime, timezone

 EnvLoader().load()
@@ -0,0 +1,383 @@
#!/usr/bin/env python3
"""Explore a Sigenergy plant or inverter over Modbus TCP."""

from __future__ import annotations

import argparse
import json
from dataclasses import asdict
from os import environ

from gibil.classes.sigen.builder import SigenPlantClient
from gibil.classes.env_loader import EnvLoader
from gibil.classes.sigen.modbus import (
    ModbusReadError,
    ModbusReadResult,
    RegisterKind,
    SigenModbusClient,
)
from gibil.classes.sigen.registers import (
    DEFAULT_INVERTER_REGISTER_NAMES,
    DEFAULT_PLANT_REGISTER_NAMES,
    INVERTER_REGISTERS,
    PLANT_PARAMETER_REGISTERS,
    PLANT_REGISTERS,
    SigenRegister,
)


DEFAULT_KINDS: tuple[RegisterKind, ...] = ("holding", "input")
ALL_KINDS: tuple[RegisterKind, ...] = ("holding", "input", "coil", "discrete")
DEFAULT_UNIT_CANDIDATES = (0, 1, 2, 3, 247, 255)


def main() -> None:
    EnvLoader().load()
    args = parse_args()

    if args.command == "units":
        results = probe_units(args)
        print_results(results, errors=True)
        return
    if args.command == "catalog":
        if args.group in {"plant", "all"}:
            print_catalog("Plant Sensors", PLANT_REGISTERS)
        if args.group in {"params", "all"}:
            print_catalog("Plant Parameters", PLANT_PARAMETER_REGISTERS)
        return
    if args.command == "snapshot":
        snapshot = SigenPlantClient.from_env().fetch_snapshot()
        print(json.dumps(asdict(snapshot), indent=2, default=str))
        return

    with SigenModbusClient(
        host=args.host,
        port=args.port,
        unit_id=args.unit_id,
        timeout=args.timeout,
        retries=args.retries,
        trace=args.trace,
    ) as client:
        if args.command == "probe":
            print(
                f"Connected to {args.host}:{args.port} "
                f"with unit id {args.unit_id}"
            )
            return
        if args.command == "plant":
            print_known_registers(client, args.register, PLANT_REGISTERS)
            return
        if args.command == "inverter":
            print_known_registers(client, args.register, INVERTER_REGISTERS)
            return
        if args.command == "read":
            try:
                result = client.read(args.kind, args.address, args.count)
                print_results([result], errors=True)
            except Exception as exc:
                print(
                    f"{args.kind:8} {args.address:5} "
                    f"+{args.count:<3} ERROR {exc}"
                )
            return

        results: list[ModbusReadResult | ModbusReadError] = []
        for kind in args.kind:
            results.extend(
                client.scan(
                    kind=kind,
                    start=args.start,
                    count=args.count,
                    chunk_size=args.chunk_size,
                )
            )

        if args.json:
            print(json.dumps([asdict(result) for result in results], indent=2))
        else:
            print_results(results, errors=args.errors)


def parse_args() -> argparse.Namespace:
    parser = argparse.ArgumentParser(
        description="Minimal Modbus TCP explorer for a Sigenergy plant/inverter."
    )
    parser.add_argument(
        "--host",
        default=environ.get("SIGEN_MODBUS_HOST"),
        required="SIGEN_MODBUS_HOST" not in environ,
        help="Modbus TCP host or IP. Can also be set as SIGEN_MODBUS_HOST.",
    )
    parser.add_argument(
        "--port",
        type=int,
        default=int(environ.get("SIGEN_MODBUS_PORT", "502")),
        help="Modbus TCP port. Defaults to 502.",
    )
    parser.add_argument(
        "--unit-id",
        type=int,
        default=int(environ.get("SIGEN_MODBUS_UNIT_ID", "1")),
        help="Modbus unit/slave id. Defaults to 1.",
    )
    parser.add_argument(
        "--timeout",
        type=float,
        default=float(environ.get("SIGEN_MODBUS_TIMEOUT", "5")),
        help="Socket timeout in seconds. Defaults to 5.",
    )
    parser.add_argument(
        "--retries",
        type=int,
        default=int(environ.get("SIGEN_MODBUS_RETRIES", "3")),
        help="Modbus request retries. Defaults to 3.",
    )
    parser.add_argument(
        "--trace",
        action="store_true",
        help="Print Modbus TCP packet bytes to stderr.",
    )

    subparsers = parser.add_subparsers(dest="command", required=True)

    subparsers.add_parser("probe", help="Open a connection and report success.")
    subparsers.add_parser(
        "snapshot",
        help="Read core plant metrics and print the builder snapshot as JSON.",
    )

    catalog = subparsers.add_parser(
        "catalog",
        help="List known Sigenergy plant sensors and settable parameters.",
    )
    catalog.add_argument(
        "group",
        choices=("plant", "params", "all"),
        nargs="?",
        default="all",
        help="Catalog group to list. Defaults to all.",
    )

    units = subparsers.add_parser(
        "units",
        help="Try small reads against likely unit ids.",
    )
    units.add_argument(
        "--candidate",
        action="append",
        type=int,
        default=None,
        help=(
            "Unit id candidate to try. Repeat for multiple ids. "
            "Defaults to 0, 1, 2, 3, 247, and 255."
        ),
    )
    units.add_argument(
        "--kind",
        action="append",
        choices=ALL_KINDS,
        default=None,
        help=(
            "Register table to test. Repeat for multiple kinds. "
            "Defaults to holding and input."
        ),
    )
    units.add_argument(
        "--address",
        type=int,
        default=30000,
        help="Address to test. Defaults to 30000.",
    )
    units.add_argument(
        "--count",
        type=int,
        default=1,
        help="Number of values to request. Defaults to 1.",
    )

    plant = subparsers.add_parser(
        "plant",
        help="Read a small set of known Sigenergy plant registers.",
    )
    plant.add_argument(
        "--register",
        action="append",
        choices=sorted(PLANT_REGISTERS),
        default=None,
        help="Known plant register to read. Repeat for multiple registers.",
    )

    inverter = subparsers.add_parser(
        "inverter",
        help="Read a small set of known Sigenergy inverter registers.",
    )
    inverter.add_argument(
        "--register",
        action="append",
        choices=sorted(INVERTER_REGISTERS),
        default=None,
        help="Known inverter register to read. Repeat for multiple registers.",
    )

    read = subparsers.add_parser(
        "read",
        help="Read one raw Modbus register range.",
    )
    read.add_argument(
        "kind",
        choices=ALL_KINDS,
        help="Register table to read.",
    )
    read.add_argument(
        "address",
        type=int,
        help="Modbus address to read.",
    )
    read.add_argument(
        "count",
        type=int,
        nargs="?",
        default=1,
        help="Number of values to read. Defaults to 1.",
    )

    scan = subparsers.add_parser("scan", help="Scan register ranges in chunks.")
    scan.add_argument(
        "--kind",
        action="append",
        choices=ALL_KINDS,
        default=None,
        help=(
            "Register table to scan. Repeat for multiple kinds. "
            "Defaults to holding and input."
        ),
    )
    scan.add_argument(
        "--start",
        type=int,
        default=0,
        help="Starting zero-based Modbus address. Defaults to 0.",
    )
    scan.add_argument(
        "--count",
        type=int,
        default=100,
        help="Number of addresses to scan. Defaults to 100.",
    )
    scan.add_argument(
        "--chunk-size",
        type=int,
        default=10,
        help="Addresses per Modbus request. Defaults to 10.",
    )
    scan.add_argument(
        "--errors",
        action="store_true",
        help="Show failed chunks as well as successful reads.",
    )
    scan.add_argument(
        "--json",
        action="store_true",
        help="Print raw result objects as JSON.",
    )

    args = parser.parse_args()
    if args.command == "scan" and args.kind is None:
        args.kind = list(DEFAULT_KINDS)
    if args.command == "units":
        if args.kind is None:
            args.kind = list(DEFAULT_KINDS)
        if args.candidate is None:
            args.candidate = list(DEFAULT_UNIT_CANDIDATES)
    if args.command == "plant" and args.register is None:
        args.register = list(DEFAULT_PLANT_REGISTER_NAMES)
    if args.command == "inverter" and args.register is None:
        args.register = list(DEFAULT_INVERTER_REGISTER_NAMES)

    return args


def probe_units(args: argparse.Namespace) -> list[ModbusReadResult | ModbusReadError]:
    results: list[ModbusReadResult | ModbusReadError] = []
    for unit_id in args.candidate:
        with SigenModbusClient(
            host=args.host,
            port=args.port,
            unit_id=unit_id,
            timeout=args.timeout,
            retries=args.retries,
            trace=args.trace,
        ) as client:
            for kind in args.kind:
                try:
                    result = client.read(kind, args.address, args.count)
                    results.append(result)
                except Exception as exc:
                    results.append(
                        ModbusReadError(
                            kind=kind,
                            address=args.address,
                            count=args.count,
                            error=f"unit {unit_id}: {exc}",
                        )
                    )
    return results


def print_known_registers(
    client: SigenModbusClient,
    register_names: list[str],
    registers: dict[str, SigenRegister],
) -> None:
    for register_name in register_names:
        register = registers[register_name]
        try:
            result = client.read(register.kind, register.address, register.count)
            value = register.decode(result.values)
            unit = f" {register.unit}" if register.unit else ""
            raw_values = " ".join(str(value) for value in result.values)
            print(
                f"{register.name:32} {value}{unit:4} "
                f"({register.kind} {register.address} +{register.count}: {raw_values})"
            )
        except Exception as exc:
            print(
                f"{register.name:32} ERROR "
                f"({register.kind} {register.address} +{register.count}: {exc})"
            )


def print_catalog(title: str, registers: dict[str, SigenRegister]) -> None:
    print(title)
    print("-" * len(title))
    for register in registers.values():
        unit = register.unit or ""
        description = register.description or ""
        print(
            f"{register.name:48} {register.kind:7} "
            f"{register.address:5} +{register.count:<2} "
            f"{register.data_type:6} gain={register.gain:<7g} "
            f"{unit:5} {description}"
        )
    print()


def print_results(
    results: list[ModbusReadResult | ModbusReadError],
    errors: bool,
) -> None:
    for result in results:
        if isinstance(result, ModbusReadError):
            if errors:
                print(
                    f"{result.kind:8} {result.address:5} "
                    f"+{result.count:<3} ERROR {result.error}"
                )
            continue

        values = " ".join(str(value) for value in result.values)
        print(f"{result.kind:8} {result.address:5} +{result.count:<3} {values}")


if __name__ == "__main__":
    main()
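Note: print_known_registers above relies on SigenRegister.decode to turn raw 16-bit register words into engineering values; that implementation is not part of this diff. As a rough illustration only, a typical decode for a 32-bit unsigned register pair with a gain divisor would look something like the sketch below (the big-endian word order and gain semantics are assumptions, not confirmed by this commit):

def decode_u32_sketch(values: list[int], gain: float) -> float:
    # assumption: first register word holds the high 16 bits
    raw = (values[0] << 16) | values[1]
    return raw / gain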
@@ -0,0 +1,102 @@
from __future__ import annotations

import argparse
from datetime import timedelta

from gibil.classes.env_loader import EnvLoader
from gibil.classes.oracle.store import OracleStore


def main() -> None:
    EnvLoader().load()
    args = parse_args()
    store = OracleStore.from_env()

    if args.evaluate:
        evaluated_count = store.evaluate_due_forecasts(
            actual_window=timedelta(minutes=args.actual_window_minutes),
            lookback=timedelta(hours=args.lookback_hours),
            limit=args.limit,
        )
        print(f"evaluated_oracle_forecasts={evaluated_count}")

    if args.summary:
        rows = store.load_evaluation_summary(
            lookback=timedelta(hours=args.lookback_hours)
        )
        print(f"oracle_evaluation_summary_rows={len(rows)}")
        for row in rows:
            print(
                " ".join(
                    [
                        f"kind={row['kind']}",
                        f"model={row['model_version']}",
                        f"horizon={row.get('horizon_label') or _format_horizon(row)}",
                        f"n={row['evaluated_count']}",
                        f"bias={_format_w(row['mean_error_w'])}",
                        f"mae={_format_w(row['mean_absolute_error_w'])}",
                        f"median_ae={_format_w(row['median_absolute_error_w'])}",
                        f"mape={_format_pct(row['mean_absolute_pct_error'])}",
                        f"coverage={_format_pct(row['interval_coverage'])}",
                    ]
                )
            )


def parse_args() -> argparse.Namespace:
    parser = argparse.ArgumentParser(
        description="Evaluate stored Astrape oracle predictions against Sigen actuals."
    )
    parser.add_argument(
        "--evaluate",
        action="store_true",
        help="Evaluate stored predictions whose target time has passed.",
    )
    parser.add_argument(
        "--summary",
        action="store_true",
        help="Print evaluation quality by kind/model/horizon bucket.",
    )
    parser.add_argument(
        "--actual-window-minutes",
        type=float,
        default=5,
        help="Minutes after each target timestamp to average as realized actuals.",
    )
    parser.add_argument(
        "--lookback-hours",
        type=float,
        default=168,
        help="Only evaluate/summarize predictions with target times this recent.",
    )
    parser.add_argument(
        "--limit",
        type=int,
        default=1000,
        help="Maximum unevaluated predictions to process.",
    )
    args = parser.parse_args()
    if not args.evaluate and not args.summary:
        args.evaluate = True
        args.summary = True
    return args


def _format_w(value: object) -> str:
    if value is None:
        return "n/a"
    return f"{float(value):.0f}W"


def _format_horizon(row: dict[str, object]) -> str:
    return f"{row['min_horizon_minutes']}-{row['max_horizon_minutes']}m"


def _format_pct(value: object) -> str:
    if value is None:
        return "n/a"
    return f"{float(value) * 100:.1f}%"


if __name__ == "__main__":
    main()
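Note: the summary columns printed above (bias, mae, median_ae, mape, coverage) map onto standard point-forecast and interval metrics; the store's own aggregation is not shown in this diff. Purely as a reference for how such numbers are conventionally computed from evaluated predictions, a per-group sketch:

def summarize_sketch(pairs: list[tuple[float, float, float, float]]) -> dict[str, float]:
    # each pair is (p10_w, p50_w, p90_w, actual_w) for one evaluated prediction
    errors = [p50 - actual for _, p50, _, actual in pairs]
    covered = [p10 <= actual <= p90 for p10, _, p90, actual in pairs]
    return {
        "bias_w": sum(errors) / len(errors),
        "mae_w": sum(abs(e) for e in errors) / len(errors),
        "interval_coverage": sum(covered) / len(covered),
    }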
@@ -0,0 +1 @@
"""TCN training and inspection scripts."""
@@ -0,0 +1,254 @@
from __future__ import annotations

import argparse
from pathlib import Path
from random import Random

from gibil.classes.env_loader import EnvLoader
from gibil.classes.predictors.usage_hybrid_tcn import (
    UsageHybridTCNConfig,
    build_usage_hybrid_tcn,
    pinball_loss,
)
from gibil.classes.predictors.usage_sequence_dataset import (
    UsageSequenceExample,
    UsageSequenceDatasetBuilder,
    UsageSequenceDatasetConfig,
)


def main() -> None:
    EnvLoader().load()
    args = parse_args()
    config = UsageSequenceDatasetConfig.from_env()
    builder = UsageSequenceDatasetBuilder(config=config)
    examples = builder.build(limit=args.limit)

    print(f"usage_sequence_examples={len(examples)}")
    print(
        "minimum_history_hours="
        f"{builder.max_past_hours + config.future_hours}"
    )
    print(f"past_features={len(builder.past_feature_names)}")
    for scale in config.past_scales:
        print(
            f"past_scale={scale.name} "
            f"hours={scale.hours} "
            f"step_seconds={scale.step_seconds} "
            f"steps={builder.past_steps(scale)}"
        )
    print(f"future_steps={builder.future_steps}")
    print(f"future_features={len(builder.future_feature_names)}")

    if examples:
        first = examples[0]
        last = examples[-1]
        print(f"first_issued_at={first.issued_at.isoformat()}")
        print(f"last_issued_at={last.issued_at.isoformat()}")
        for name, rows in first.past_by_scale.items():
            print(f"first_past_{name}_shape={len(rows)}x{len(rows[0])}")
            token_count = sum(
                len(tokens)
                for tokens in first.past_tokens_by_scale[name]
            )
            print(f"first_past_{name}_tokens={token_count}")
        print(
            "first_future_feature_shape="
            f"{len(first.future_features)}x{len(first.future_features[0])}"
        )
        print(
            "first_future_tokens="
            f"{sum(len(tokens) for tokens in first.future_tokens)}"
        )
        print(f"first_targets={len(first.targets)}")
        print(
            "first_target_preview="
            + ",".join(f"{value:.0f}" for value in first.targets[:8])
        )

    if args.dry_run:
        return

    if not examples:
        raise SystemExit("No usage sequence examples available for training yet.")

    train_model(
        examples=examples,
        builder=builder,
        epochs=args.epochs,
        batch_size=args.batch_size,
        learning_rate=args.learning_rate,
        artifact_path=args.artifact_path,
        seed=args.seed,
    )


def parse_args() -> argparse.Namespace:
    parser = argparse.ArgumentParser(
        description="Build training windows for the sequence usage oracle."
    )
    parser.add_argument(
        "--dry-run",
        action="store_true",
        help="Only build examples and print dataset shape.",
    )
    parser.add_argument(
        "--limit",
        type=int,
        default=None,
        help="Optional maximum number of examples to build.",
    )
    parser.add_argument(
        "--epochs",
        type=int,
        default=20,
        help="Training epochs.",
    )
    parser.add_argument(
        "--batch-size",
        type=int,
        default=32,
        help="Training batch size.",
    )
    parser.add_argument(
        "--learning-rate",
        type=float,
        default=0.001,
        help="Adam learning rate.",
    )
    parser.add_argument(
        "--artifact-path",
        type=Path,
        default=Path("models/usage_sequence_tcn_v1.pt"),
        help="Where to save the trained TCN artifact.",
    )
    parser.add_argument(
        "--seed",
        type=int,
        default=7,
        help="Deterministic shuffle seed.",
    )
    return parser.parse_args()


def train_model(
    examples: list[UsageSequenceExample],
    builder: UsageSequenceDatasetBuilder,
    epochs: int,
    batch_size: int,
    learning_rate: float,
    artifact_path: Path,
    seed: int,
) -> None:
    try:
        import torch
    except ImportError as error:
        raise SystemExit(
            "PyTorch is required for training. Install it with "
            "`python3 -m pip install -r requirements.txt`."
        ) from error

    torch.backends.mkldnn.enabled = False
    if hasattr(torch.backends, "nnpack"):
        torch.backends.nnpack.enabled = False
    scale_names = tuple(scale.name for scale in builder.config.past_scales)
    model_config = UsageHybridTCNConfig(
        past_feature_count=len(builder.past_feature_names),
        future_feature_count=len(builder.future_feature_names),
        future_steps=builder.future_steps,
        scale_names=scale_names,
    )
    model = build_usage_hybrid_tcn(model_config)
    optimizer = torch.optim.Adam(model.parameters(), lr=learning_rate)

    shuffled = examples[:]
    Random(seed).shuffle(shuffled)
    validation_count = max(1, len(shuffled) // 5) if len(shuffled) >= 5 else 0
    validation_examples = shuffled[:validation_count]
    training_examples = shuffled[validation_count:] or shuffled

    for epoch in range(1, epochs + 1):
        model.train()
        training_losses = []
        for batch in batches(training_examples, batch_size):
            past_by_scale, future_features, targets = examples_to_tensors(
                batch,
                scale_names,
                torch,
            )
            prediction = model(past_by_scale, future_features)
            loss = pinball_loss(prediction, targets, model_config.quantiles)
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
            training_losses.append(float(loss.detach()))

        validation_loss = None
        if validation_examples:
            model.eval()
            with torch.no_grad():
                validation_losses = []
                for batch in batches(validation_examples, batch_size):
                    past_by_scale, future_features, targets = examples_to_tensors(
                        batch,
                        scale_names,
                        torch,
                    )
                    prediction = model(past_by_scale, future_features)
                    loss = pinball_loss(prediction, targets, model_config.quantiles)
                    validation_losses.append(float(loss.detach()))
                validation_loss = sum(validation_losses) / len(validation_losses)

        train_loss = sum(training_losses) / len(training_losses)
        message = f"epoch={epoch} train_pinball_loss={train_loss:.4f}"
        if validation_loss is not None:
            message += f" validation_pinball_loss={validation_loss:.4f}"
        print(message)

    artifact_path.parent.mkdir(parents=True, exist_ok=True)
    torch.save(
        {
            "model_version": "sequence_usage_tcn_v1",
            "model_config": model_config.__dict__,
            "past_feature_names": builder.past_feature_names,
            "future_feature_names": builder.future_feature_names,
            "state_dict": model.state_dict(),
        },
        artifact_path,
    )
    print(f"saved_artifact={artifact_path}")


def examples_to_tensors(
    examples: list[UsageSequenceExample],
    scale_names: tuple[str, ...],
    torch,
):
    past_by_scale = {
        name: torch.tensor(
            [example.past_by_scale[name] for example in examples],
            dtype=torch.float32,
        )
        for name in scale_names
    }
    future_features = torch.tensor(
        [example.future_features for example in examples],
        dtype=torch.float32,
    )
    targets = torch.tensor(
        [example.targets for example in examples],
        dtype=torch.float32,
    )
    return past_by_scale, future_features, targets


def batches(
    examples: list[UsageSequenceExample],
    batch_size: int,
):
    for start in range(0, len(examples), batch_size):
        yield examples[start : start + batch_size]


if __name__ == "__main__":
    main()
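Note: pinball_loss is imported from the predictor module and is not defined in this diff. For readers unfamiliar with the quantile (pinball) loss that such a function conventionally averages over quantiles, a minimal sketch is shown below; the tensor shapes here are assumptions (predictions as [batch, quantile, step], targets as [batch, step]), not a statement about the actual implementation:

import torch

def pinball_loss_sketch(
    prediction: torch.Tensor,
    targets: torch.Tensor,
    quantiles: tuple[float, ...],
) -> torch.Tensor:
    # prediction: [batch, len(quantiles), steps]; targets: [batch, steps]
    losses = []
    for index, quantile in enumerate(quantiles):
        diff = targets - prediction[:, index, :]
        # penalise under-prediction by q and over-prediction by (1 - q)
        losses.append(torch.maximum(quantile * diff, (quantile - 1) * diff).mean())
    return torch.stack(losses).mean()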
@@ -1,84 +0,0 @@
from __future__ import annotations

from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer
from importlib import import_module, reload
from os import environ
from pathlib import Path
import json

from gibil.classes.env_loader import EnvLoader

EnvLoader().load()

HOST = environ.get("ASTRAPE_WEB_HOST", "0.0.0.0")
PORT = int(environ.get("ASTRAPE_WEB_PORT", "8080"))
PROJECT_ROOT = Path(__file__).resolve().parents[2]
WATCHED_PATHS = [
    PROJECT_ROOT / "gibil" / "classes" / "webui.py",
    PROJECT_ROOT / "gibil" / "classes" / "weather_display.py",
    PROJECT_ROOT / "gibil" / "classes" / "weather_store.py",
]


class AstrapeWebHandler(BaseHTTPRequestHandler):
    def do_GET(self) -> None:
        if self.path == "/":
            self._send_html(self._webui().render_page())
            return

        if self.path == "/api/weather":
            self._send_json_text(self._webui().weather_payload())
            return

        if self.path == "/api/ui-version":
            self._send_json_text(json.dumps({"version": self._ui_version()}))
            return

        self.send_error(404)

    def log_message(self, format: str, *args: object) -> None:
        print(f"{self.address_string()} - {format % args}")

    def _webui(self):
        weather_store_module = import_module("gibil.classes.weather_store")
        weather_display_module = import_module("gibil.classes.weather_display")
        webui_module = import_module("gibil.classes.webui")
        reload(weather_store_module)
        reload(weather_display_module)
        reload(webui_module)
        return webui_module.WebUI()

    def _ui_version(self) -> str:
        mtimes = [
            str(path.stat().st_mtime_ns)
            for path in WATCHED_PATHS
            if path.exists()
        ]
        return ".".join(mtimes)

    def _send_html(self, body: str) -> None:
        encoded = body.encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.send_header("Content-Length", str(len(encoded)))
        self.end_headers()
        self.wfile.write(encoded)

    def _send_json_text(self, body: str) -> None:
        encoded = body.encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "application/json; charset=utf-8")
        self.send_header("Cache-Control", "no-store")
        self.send_header("Content-Length", str(len(encoded)))
        self.end_headers()
        self.wfile.write(encoded)


def main() -> None:
    server = ThreadingHTTPServer((HOST, PORT), AstrapeWebHandler)
    print(f"Astrape web UI listening on http://{HOST}:{PORT}")
    server.serve_forever()


if __name__ == "__main__":
    main()
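Note: the handler deleted here re-imports and reloads the UI modules on every request, so edits to webui.py show up without restarting the daemon, and /api/ui-version exposes the watched files' mtime fingerprint so a page can poll for changes. A minimal client-side poll against that endpoint could look like the sketch below; the base URL follows the ASTRAPE_WEB_HOST/ASTRAPE_WEB_PORT defaults above, and everything else is illustrative only, not part of this commit:

import json
import time
from urllib.request import urlopen

def wait_for_new_ui_version(base_url: str = "http://127.0.0.1:8080") -> str:
    # poll /api/ui-version until the reported mtime fingerprint changes
    current = json.load(urlopen(f"{base_url}/api/ui-version"))["version"]
    while True:
        time.sleep(2)
        latest = json.load(urlopen(f"{base_url}/api/ui-version"))["version"]
        if latest != current:
            return latest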
@@ -1,6 +1,6 @@
 def main() -> None:
-    print("Run `python3 -m gibil.scripts.web_daemon` to start the Astrape web UI.")
+    print("Run `python3 -m gibil.scripts.daemons.web_daemon` to start the Astrape web UI.")
-    print("Run `python3 -m gibil.scripts.db_daemon` to start database ingest.")
+    print("Run `python3 -m gibil.scripts.daemons.db_daemon` to start database ingest.")


 if __name__ == "__main__":
@@ -1 +1,3 @@
 psycopg[binary]>=3.2,<4
+pymodbus>=3.8,<4
+torch>=2.2,<3
-258
@@ -1,258 +0,0 @@