diff --git a/CHANGELOG.md b/CHANGELOG.md
index 72cd634..c408887 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -92,6 +92,6 @@ All notable changes to the **AIMBAT** project will be documented in this file.
 - Update plot seismograms.
 - Remove dead code ([#231](https://github.com/pysmo/aimbat/issues/231))
-- Quality stats integrated into read models and tui
+- Quality stats integrated into read models and tui ([#232](https://github.com/pysmo/aimbat/issues/232))
 
 ### 🚀 New Features
 
diff --git a/README.md b/README.md
index 3268d12..fb9b13f 100644
--- a/README.md
+++ b/README.md
@@ -40,23 +40,27 @@ arrival times.
 
 ## Version 2
 
 AIMBAT v2 is a complete rewrite. It shares the same goal as v1 but none of the
-code.
-
-- **Complete rewrite.** The algorithms are optimised and projects are stored in
-  a SQLite database (via [SQLModel](https://sqlmodel.tiangolo.com)), making them
-  persistent, portable, and inspectable.
-- **Focused scope.** Much of the underlying code has moved into the
-  [pysmo](https://github.com/pysmo/pysmo) library, leaving AIMBAT to focus on
-  the user-facing ICCS → quality-control → MCCC workflow rather than
-  reimplementing general seismogram utilities.
-- **Flexible data storage.** A single project can hold any number of seismic
-  events. Files from different events can live anywhere on disk — no need to
-  keep them in separate directories or follow a particular layout.
-- **Maintainable.** v2 is built on modern, typed Python with a comprehensive
-  test suite and strict dependency management, so it keeps working as the
-  ecosystem evolves.
+code. The main improvements for users are:
+
+- **Flexible workflow.** Snapshots save the complete processing state at any
+  point, making it straightforward to roll back, compare parameter sets, or
+  try a different approach without losing prior work. ICCS and MCCC can be run
+  in any order and as many times as needed; results can be exported from any
+  snapshot, not only after a final MCCC pass.
+- **Multi-event projects.** A single project database holds any number of + seismic events. Waveform files can live anywhere on disk — no prescribed + directory layout and no need to manage separate directories per event. +- **Structured output for downstream analysis.** Each snapshot can be exported + as a structured JSON document containing per-station arrival times, ICCS + correlation coefficients, and — if MCCC has been run — formal timing standard + errors. This makes AIMBAT useful as a data source beyond tomographic + inversion: station quality assessment, delay patterns as a function of + back-azimuth, or any workflow that requires picks and quality metrics in a + machine-readable format. - **Multiple interfaces.** AIMBAT can be used via a CLI, an interactive shell, - a terminal UI, or directly as a Python library. + a terminal UI, or directly as a Python library. All functionality is + accessible through the Python API, making it straightforward to script any + part of the workflow. ## Quick Start diff --git a/docs/first-steps/workflow.md b/docs/first-steps/workflow.md index ef25b69..bb79dcf 100644 --- a/docs/first-steps/workflow.md +++ b/docs/first-steps/workflow.md @@ -28,8 +28,12 @@ flowchart TD AIMBAT[^2] stacks all seismograms aligned on an initial pick, then cross-correlates each seismogram against that stack to refine arrivals -simultaneously across the array. Parameters and picks are improved iteratively -before a final MCCC run. +simultaneously across the array. The diagram below reflects the original +AIMBAT workflow. In modern AIMBAT, snapshots provide save points throughout the +process, making the workflow more flexible: MCCC can be run at any point, not +only as a final step, and ICCS and MCCC runs may alternate as parameters are +refined. Results can be exported from any snapshot; running MCCC before the +final export is usual but not required. [^2]: Lou, X., et al. 
"AIMBAT: A Python/Matplotlib Tool for Measuring @@ -56,9 +60,13 @@ flowchart TD 1. Change one parameter at a time and run ICCS to observe the effect before making further adjustments. 2. Take snapshots often, with a comment describing the current state. They are - lightweight and easy to roll back to. + lightweight and easy to roll back to. There is no limit on how many + snapshots an event can have — use them as freely as version-control commits. 3. Do not focus too much on individual poorly-aligned seismograms — use autoflip and autoselect to let the algorithm handle them. +4. Export picks whenever the dataset is in a useful state — not only at the + end. ICCS picks are directly usable, and a snapshot taken at any stage is + sufficient to produce output. ### ICCS running modes @@ -82,3 +90,17 @@ automatically if parameter changes bring it into alignment. and revised pick across iterations), consider deleting it from the project. Rogue seismograms can distort the valid ranges used when updating picks and time windows. + +### MCCC running modes + +MCCC has one optional flag: + +- **`--all`**: includes deselected seismograms in the inversion. By default, + only seismograms with `select=True` participate. Passing `--all` computes + picks and quality metrics for every seismogram, regardless of selection + state, which is useful when you want timing estimates for all stations even + if some were excluded from the ICCS stack. + +Use `--all` with care: deselected seismograms are typically excluded because +they are noisy or misaligned, and including them may degrade the inversion +for the rest of the array. diff --git a/docs/usage/alignment.md b/docs/usage/alignment.md index 26ceb39..5af364b 100644 --- a/docs/usage/alignment.md +++ b/docs/usage/alignment.md @@ -5,8 +5,8 @@ ICCS alignment is inherently exploratory. 
There is no fixed sequence of steps that works for every dataset — it is a feedback loop between adjusting parameters, running the algorithm, and examining the results. The goal is a -stack that is coherent across the array and CC norms that are high enough to -give MCCC a clean dataset to work with. +stack that is coherent across the array and correlation coefficients that are +high across most of the array. Parameters interact: a filter that sharpens the waveform may allow a narrower time window, which in turn changes which seismograms align well. It is @@ -93,43 +93,48 @@ refinement. ### Minimum CC norm -`min_ccnorm` is the threshold used by autoselect to deselect seismograms +`min_cc` is the threshold used by autoselect to deselect seismograms automatically. It does not affect the cross-correlation itself — only which seismograms are excluded from contributing to the stack in subsequent iterations. Setting this too high early on may exclude seismograms that would align well once the stack improves. It is usually more effective to start with a -permissive threshold and tighten it as alignment converges. +permissive threshold and tighten it as alignment converges. The threshold can +be adjusted interactively with `aimbat tool cc`. --- ## Interactive adjustment -In addition to setting parameters directly, three tools let you adjust values -by interacting with the plot — clicking or scrolling in a waveform display -rather than typing numbers. +In addition to setting parameters directly, four interactive tools let you +adjust values by interacting with the plot — clicking or scrolling in a +waveform display rather than typing numbers. They are the most convenient way +to explore parameters, but the same values can also be set directly via CLI +arguments or the Python API. 
=== "CLI" ```bash - aimbat pick phase # adjust t1 by clicking on the stack - aimbat pick window # set window_pre / window_post by clicking - aimbat pick ccnorm # set min_ccnorm by scrolling the matrix image + aimbat tool phase # adjust t1 by clicking on the stack + aimbat tool window # set window_pre / window_post by clicking + aimbat tool cc # set min_cc by scrolling the matrix image + aimbat tool bandpass # adjust bandpass filter settings interactively ``` === "Shell" ```bash - pick phase # adjust t1 by clicking on the stack - pick window # set window_pre / window_post by clicking - pick ccnorm # set min_ccnorm by scrolling the matrix image + tool phase # adjust t1 by clicking on the stack + tool window # set window_pre / window_post by clicking + tool cc # set min_cc by scrolling the matrix image + tool bandpass # adjust bandpass filter settings interactively ``` -Each command opens a matplotlib window. Click (or scroll, for ccnorm) to -set the value, then close the window to save it. +Each command opens a matplotlib window. Click (or scroll) to set the value, +then close the window to save it. -All three accept `--no-context` and `--all` (include deselected seismograms). +All four accept `--no-context` and `--all` (include deselected seismograms). === "TUI" @@ -137,14 +142,15 @@ All three accept `--no-context` and `--all` (include deselected seismograms). - **Phase arrival (t1)** — click in the stack to shift all picks globally - **Time window** — click to place the window boundaries - - **Min CC norm** — scroll the matrix image to set the threshold + - **Min CC** — scroll the matrix image to set the threshold + - **Bandpass filter** — toggle the filter and adjust frequency bounds Before launching, toggle **Context** (`c`) and **All seismograms** (`a`) as needed. The TUI suspends while the matplotlib window is open and resumes when you close it. -The pick and window tools open the **stack view**; the CC norm tool opens the -**matrix image**. 
The behaviour of each is described in [The ICCS +The `phase`, `window`, and `bandpass` tools open the **stack view**; `cc` +opens the **matrix image**. The behaviour of each is described in [The ICCS Stack](iccs-stack.md#use-in-interactive-adjustment). --- @@ -174,7 +180,7 @@ throughout. It is safe to run repeatedly. ### Autoselect -With autoselect enabled, seismograms whose CC norm falls below `min_ccnorm` +With autoselect enabled, seismograms whose CC norm falls below `min_cc` are automatically set to `select = False` and excluded from the stack in subsequent iterations. Importantly, they are still cross-correlated against the stack — so if parameters improve and they start to align better, they can @@ -200,25 +206,29 @@ automatically; there is no need to monitor it. Running ICCS again from AIMBAT's interface always starts a fresh run from the current picks. What matters is the convergence of the *overall process*: across multiple -runs with adjusted parameters, do the stack and CC norms keep improving, or -have they plateaued? When further adjustments produce no visible improvement -in the stack, alignment is as good as it is going to get with ICCS, and it is -time to move to MCCC. +runs with adjusted parameters, do the stack and correlation coefficients keep +improving, or have they plateaued? When further adjustments produce no visible +improvement in the stack, the data is ready — either for direct export, or as +input to MCCC for formal timing uncertainties. --- ## Knowing when to stop There is no objective criterion for when ICCS alignment is "done". 
Practical -signals that the dataset is ready for MCCC: +signals that the dataset is ready: - The stack is visually coherent — individual traces closely follow its shape -- CC norms are high across most of the array +- Correlation coefficients are high across most of the array - The time window highlights a clean, well-defined arrival - Running ICCS again with or without autoflip/autoselect produces no meaningful change -It is worth taking a snapshot at this point before running MCCC. +At this point the ICCS picks can be exported directly from a snapshot — see +[Exporting Results](results.md). If formal per-station timing standard errors +are needed (for example, as input to tomographic inversion), continue to +[MCCC alignment](mccc.md) before taking the final snapshot. Either way, it is +worth taking a snapshot now before making any further changes. --- @@ -228,9 +238,11 @@ It is worth taking a snapshot at this point before running MCCC. an improvement or regression if multiple things change at once. - **Take snapshots liberally.** They are lightweight and make it easy to backtrack to a promising state. -- **Don't over-optimise.** MCCC is more precise than ICCS and will further - refine picks. The job of ICCS is to get the data to a state where MCCC can - succeed — not to produce perfect picks itself. +- **ICCS picks are directly usable.** For workflows that do not require formal + timing uncertainties, ICCS picks exported from a snapshot are suitable for + further analysis as-is. MCCC adds formal standard errors and a more rigorous + pairwise solution — run it when those are needed, but there is no obligation + to do so. - **Outlier seismograms.** If a seismogram consistently has a poor CC norm across many runs and parameter combinations, it may be worth deleting it from the project rather than letting it drag down the stack. 
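The autoflip/autoselect behaviour described in this page can be sketched as a simple per-iteration rule. This is a toy stand-in for AIMBAT's internals, not its actual data model: the dict fields and the flip-on-negative-correlation convention are illustrative assumptions.

```python
def apply_auto_rules(seismograms, min_cc, autoflip=True, autoselect=True):
    """Toy sketch of per-iteration autoflip/autoselect bookkeeping.

    `seismograms` is a list of dicts with illustrative fields: `cc`
    (correlation coefficient against the current stack), `select`, `flip`.
    """
    for s in seismograms:
        if autoflip and s["cc"] < 0:
            # A negative coefficient suggests reversed polarity: flip the
            # trace and treat the correlation as positive from now on.
            s["flip"] = not s["flip"]
            s["cc"] = -s["cc"]
        if autoselect:
            # Deselection only controls stack membership. Every seismogram
            # is still cross-correlated on the next iteration, so it can
            # rejoin once parameter changes bring it above the threshold.
            s["select"] = s["cc"] >= min_cc
    return seismograms
```

The key design point mirrored here is that `select` is recomputed every iteration from the current correlation, which is why a deselected seismogram can return to the stack automatically as alignment improves.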
diff --git a/docs/usage/index.md b/docs/usage/index.md index 914acc3..03ecd3d 100644 --- a/docs/usage/index.md +++ b/docs/usage/index.md @@ -53,7 +53,12 @@ aimbat data add *.sac export DEFAULT_EVENT_ID=$(aimbat event dump | jq -r '.[0].id') aimbat snapshot create "initial import" aimbat align iccs --autoflip --autoselect +aimbat snapshot create "post-ICCS" +aimbat snapshot results --output results.json # export ICCS picks +# or continue to MCCC for formal timing uncertainties: aimbat align mccc +aimbat snapshot create "post-MCCC" +aimbat snapshot results --output results.json ``` --- @@ -96,7 +101,9 @@ Exit with `exit`, `quit`, `q`, or **Ctrl+D**. ### Terminal UI (TUI) ```bash -aimbat tui +aimbat tui # via main CLI +aimbat-tui # standalone entry point (same app) +aimbat tui --debug # enable verbose logging ``` The TUI is a full-screen, keyboard-driven interface built for efficient @@ -174,13 +181,29 @@ Pressing `Enter` on any table row opens a context menu. Available actions depend | Reset parameters | Restore all per-seismogram parameters to their defaults | | Delete seismogram | Remove the seismogram from the project | +**Snapshots — Snapshots table:** + +| Action | Description | +|--------|-------------| +| Show details | Display the event-level parameters saved in the snapshot | +| Preview stack | Open the ICCS stack plot built from the snapshot's parameters | +| Preview matrix image | Open the matrix image built from the snapshot's parameters | +| Save results to JSON | Export per-seismogram picks and MCCC metrics to a JSON file | +| Rollback to this snapshot | Restore the snapshot's parameters as the current live values | +| Delete snapshot | Permanently remove the snapshot | + +Both preview options support `c` (toggle context waveforms) and `a` (include +all seismograms) key bindings inside the action menu before the plot opens. +See [Snapshots](snapshots.md) for a description of each action and the format +of the exported results file. 
+ #### Global key bindings | Key | Action | |-----|--------| | `e` | Open event switcher | | `a` | Run alignment (ICCS or MCCC) | -| `t` | Open interactive tools (matplotlib picking) | +| `t` | Open interactive parameter tools (phase pick, time window, min CC, bandpass filter) | | `p` | Edit processing parameters | | `n` | Create a new snapshot | | `d` | Add data files to the project | @@ -288,10 +311,12 @@ alignment and interactive tools are unavailable until the problem is resolved. AIMBAT writes a log to `aimbat.log` in the current directory. By default only `INFO`-level messages and above are recorded. To get more detail, pass -`--debug` to any CLI command: +`--debug` to any CLI command or TUI entry point: ```bash -aimbat align iccs --debug +aimbat align iccs --debug # any CLI command +aimbat tui --debug # TUI via main CLI +aimbat-tui --debug # TUI standalone entry point ``` This sets the log level to `DEBUG` for that invocation and writes verbose diff --git a/docs/usage/mccc.md b/docs/usage/mccc.md index c36d9ac..1c20efa 100644 --- a/docs/usage/mccc.md +++ b/docs/usage/mccc.md @@ -1,21 +1,21 @@ -# Finalising with MCCC +# MCCC Alignment ## When to run MCCC -MCCC works best on data that is already reasonably well aligned — coherent -stack, high CC norms across most of the array, a clean and stable time window. -It cannot recover poor alignment; if the seismograms are badly misaligned, -many pairwise correlations will be weak and the inversion will be poorly -constrained. - -Running ICCS first is therefore the usual approach, not because of a strict -rule, but because ICCS gives you tools that MCCC does not: interactive -parameter adjustment, autoflip to correct polarity, and autoselect to remove -seismograms that consistently fail to align. 
Once the dataset is in a state -where those tools are no longer improving things, MCCC is the natural next -step — and because it produces formal standard errors for each delay estimate, -its output is directly usable in further analyses such as tomographic -inversion. +ICCS alignment produces relative arrival-time picks that are directly usable +for many purposes. MCCC is the natural next step for most analyses because it +adds formal standard errors to those picks — derived from a pairwise +least-squares inversion — making the output suitable for applications that +require timing uncertainties, such as tomographic inversion. + +MCCC works best on data that is already well aligned. It cannot recover poor +alignment: if seismograms are badly misaligned, many pairwise correlations will +be weak and the inversion will be poorly constrained. Running ICCS first is +therefore the standard approach — not because of a strict rule, but because +ICCS provides tools that MCCC does not: interactive parameter adjustment, +autoflip to correct polarity, and autoselect to remove seismograms that +consistently fail to align. Once the dataset is in a state where those tools +are no longer improving things, MCCC is ready to run. Take a snapshot before running MCCC. @@ -88,24 +88,24 @@ confirm the picks improved. ## Parameters -### Minimum CC norm (`mccc_min_ccnorm`) +### Minimum CC norm (`mccc_min_cc`) Pairs of seismograms whose cross-correlation coefficient falls below this -threshold are excluded from the inversion. Unlike ICCS's `min_ccnorm`, which +threshold are excluded from the inversion. Unlike ICCS's `min_cc`, which operates on whole seismograms, this threshold applies to **pairs**: a seismogram can still contribute to the solution through its good pairs even if some of its pairings are weak. Setting this too low allows noisy pairs to degrade the inversion; setting it -too high may leave too few constraints for a stable solution. 
The ICCS CC -norms give a rough sense of which seismograms are likely to correlate well with -each other. +too high may leave too few constraints for a stable solution. The ICCS +correlation coefficients give a rough sense of which seismograms are likely to +correlate well with each other. ### Damping (`mccc_damp`) Tikhonov regularisation applied to the inversion. A small amount of damping stabilises the solution when the constraint matrix is poorly conditioned — -for example, when a seismogram has few pairs above `mccc_min_ccnorm` and its +for example, when a seismogram has few pairs above `mccc_min_cc` and its time shift is therefore weakly constrained. Higher damping pulls all shifts closer to zero (the group mean), producing a more conservative solution. @@ -122,3 +122,28 @@ subset that contributed to the ICCS stack. Passing `--all` includes deselected seismograms in the inversion. Their picks are still updated, but they may degrade the inversion if they are genuinely noisy or misaligned. Use with caution. + +--- + +## Exporting results + +Once MCCC has run and you are satisfied with the picks, take a snapshot and +export the results: + +=== "CLI" + + ```bash + aimbat snapshot create "post-MCCC" + aimbat snapshot results --output results.json + ``` + +=== "TUI" + + Press `n` to create a snapshot, then press `Enter` on the new snapshot row + and choose **Save results to JSON**. + +The output is a JSON document with event-level header fields (snapshot +metadata, event coordinates, MCCC RMSE) and a `seismograms` list containing +the frozen `t1` pick, ICCS and MCCC correlation coefficients, and formal timing +standard errors for each station. See [Exporting Results](results.md) for the +full field reference and examples of working with the output. 
diff --git a/docs/usage/quality.md b/docs/usage/quality.md index 2baf89c..fbba0b2 100644 --- a/docs/usage/quality.md +++ b/docs/usage/quality.md @@ -23,7 +23,7 @@ between that seismogram and the current ICCS stack as `iccs_cc`. minimum CC threshold. You do not need to run any explicit step. - **What it tells you**: How closely the waveform matches the array stack under the current window and filter settings. It is the basis for the - `--autoselect` threshold set with `aimbat pick cc`. + `--autoselect` threshold. - **Interpretation**: Values closer to 1.0 indicate high similarity to the stack. Values near 0 or negative suggest misalignment, poor SNR, or a polarity flip. diff --git a/docs/usage/results.md b/docs/usage/results.md new file mode 100644 index 0000000..8d14355 --- /dev/null +++ b/docs/usage/results.md @@ -0,0 +1,172 @@ +# Exporting Results + +## Overview + +Any snapshot can be exported as a structured JSON document using +`aimbat snapshot results`. The output contains everything needed to identify +the snapshot, the source event, and the per-station arrival-time picks — +including quality metrics from ICCS and, if MCCC has been run, formal timing +standard errors. + +Exporting does not require MCCC to have been run. ICCS picks alone are +sufficient for many workflows. MCCC adds formal per-station timing uncertainties +and is worth running when those are required, but the output format is the +same either way — MCCC fields are simply `null` in snapshots that pre-date any +MCCC run. + +--- + +## Running the export + +=== "CLI" + + ```bash + aimbat snapshot results # print to stdout + aimbat snapshot results --output out.json # save to file + ``` + +=== "Shell" + + ```bash + snapshot results + snapshot results --output out.json + ``` + +=== "TUI" + + Press `Enter` on a snapshot row in the **Snapshots** tab and choose + **Save results to JSON**. A file-picker dialog opens; the suggested + filename is `results_.json`. 
+ +Pass `--alias` to use camelCase field names (e.g. `snapshotId`, `eventTime`, +`mcccRmse`). + +--- + +## Output format + +The output is a JSON object with two parts: an envelope containing event-level +information, and a `seismograms` list with one entry per station. + +```json +{ + "snapshot_id": "3f1a2b4c-...", + "snapshot_time": "2025-03-01T14:22:00Z", + "snapshot_comment": "post-MCCC final", + "event_id": "6a4a...", + "event_time": "2024-11-15T08:43:12Z", + "event_latitude": 37.2, + "event_longitude": 141.8, + "event_depth_km": 35.0, + "mccc_rmse": 0.021, + "seismograms": [ + { + "seismogram_id": "...", + "name": "II.MAJO", + "channel": "BHZ", + "select": true, + "flip": false, + "t1": "2024-11-15T08:43:47.312Z", + "iccs_cc": 0.94, + "mccc_cc_mean": 0.91, + "mccc_cc_std": 0.03, + "mccc_error": 0.018 + } + ] +} +``` + +### Envelope fields + +Event-level information appears once in the envelope, rather than being +repeated on every seismogram row. + +| Field | Type | Always present | Description | +|-------|------|:---:|-------------| +| `snapshot_id` | UUID string | Yes | Snapshot this export came from | +| `snapshot_time` | ISO 8601 | Yes | When the snapshot was taken | +| `snapshot_comment` | string \| null | Yes | Optional label from snapshot creation | +| `event_id` | UUID string | Yes | Event this snapshot belongs to | +| `event_time` | ISO 8601 | Yes | Seismic event origin time | +| `event_latitude` | float | Yes | Event latitude (degrees) | +| `event_longitude` | float | Yes | Event longitude (degrees) | +| `event_depth_km` | float \| null | Yes | Event depth in km; null if not recorded | +| `mccc_rmse` | float \| null | Yes | Global MCCC RMSE (seconds); null if MCCC not run | +| `seismograms` | array | Yes | Per-seismogram entries (see below) | + +### Per-seismogram fields + +| Field | Type | Always present | Description | +|-------|------|:---:|-------------| +| `seismogram_id` | UUID string | Yes | Seismogram record identifier | +| `name` | string | 
Yes | Station name in `NETWORK.NAME` format | +| `channel` | string | Yes | Channel code (e.g. `BHZ`) | +| `select` | bool | Yes | Selection state at snapshot time | +| `flip` | bool | Yes | Whether polarity was flipped at snapshot time | +| `t1` | ISO 8601 | Yes | Frozen absolute arrival-time pick | +| `iccs_cc` | float \| null | Yes | Correlation coefficient with ICCS stack | +| `mccc_cc_mean` | float \| null | Yes | Mean pairwise MCCC correlation coefficient | +| `mccc_cc_std` | float \| null | Yes | Std of pairwise MCCC correlation coefficients | +| `mccc_error` | float \| null | Yes | Formal timing standard error from MCCC (seconds) | + +`iccs_cc` is `null` for snapshots taken before the event was first opened in +AIMBAT. All MCCC fields are `null` for snapshots taken before MCCC was run. + +--- + +## Working with the output + +### Filtering with `jq` + +Extract only selected seismograms: + +```bash +aimbat snapshot results | \ + jq '[.seismograms[] | select(.select == true)]' +``` + +Extract stations where MCCC timing error is below 0.05 s: + +```bash +aimbat snapshot results | \ + jq '[.seismograms[] | select(.mccc_error != null and .mccc_error < 0.05)]' +``` + +Export station names and pick times as CSV: + +```bash +aimbat snapshot results | \ + jq -r '.seismograms[] | [.name, .t1] | @csv' +``` + +### Python + +```python +import json + +with open("results.json") as f: + data = json.load(f) + +print(f"Event: {data['event_time']} ({data['event_latitude']}, {data['event_longitude']})") +print(f"MCCC RMSE: {data['mccc_rmse']} s") + +for seis in data["seismograms"]: + if seis["select"]: + print(f" {seis['name']:12s} t1={seis['t1']} err={seis['mccc_error']}") +``` + +--- + +## ICCS vs MCCC picks + +Picks exported from an ICCS-only snapshot have `t1` values refined by the +iterative stack alignment. 
Picks from a post-MCCC snapshot have `t1` values +replaced by the least-squares pairwise solution — slightly different in value, +with the addition of formal standard errors in `mccc_error`. + +The MCCC-derived `t1` values are generally preferred for applications that +require formal uncertainties. For applications where only relative picks are +needed and uncertainties are not, an ICCS-only snapshot is sufficient. + +See [Aligning with ICCS](alignment.md) and [MCCC Alignment](mccc.md) for a +full description of what each algorithm produces and when to use each one. diff --git a/docs/usage/snapshots.md b/docs/usage/snapshots.md index 0f5434c..bef581e 100644 --- a/docs/usage/snapshots.md +++ b/docs/usage/snapshots.md @@ -6,7 +6,8 @@ A snapshot saves the current processing parameters and quality metrics for an event at a point in time. Specifically, it stores: - All event-level parameters: the time window, bandpass filter settings, and - Min CC (`min_cc`, exposed in the CLI via `pick cc`) + Min CC (`min_cc`) — see [Aligning with ICCS](alignment.md) for what each + controls - Per-seismogram parameters for every seismogram in the event: the current `t1` pick, `select` flag, and `flip` flag - Quality metrics, if available at snapshot time: @@ -244,3 +245,92 @@ The output is a JSON object with five keys, all cross-referenced by | `seismogram_parameters` | Per-seismogram parameter snapshots | Yes | | `event_quality` | Event quality snapshots (MCCC RMSE) | Only if MCCC has been run | | `seismogram_quality` | Per-seismogram quality snapshots (ICCS CC, MCCC metrics) | Only if quality metrics exist | + +--- + +## Saving results + +Any snapshot can be exported as a structured JSON document containing the +frozen `t1` picks, ICCS correlation coefficients, and — if MCCC was run before +the snapshot — per-seismogram MCCC quality metrics and the event-level RMSE. +MCCC does not need to have been run; the export is useful at any stage of +processing. 
+ +This is the primary format for passing AIMBAT picks into downstream tools such +as tomographic inversion codes. See [Exporting Results](results.md) for full +details on the output format and how to work with it. + +=== "CLI" + + ```bash + aimbat snapshot results # print to stdout + aimbat snapshot results --output out.json # save to file + ``` + +=== "Shell" + + ```bash + snapshot results + snapshot results --output out.json + ``` + +=== "TUI" + + Press `Enter` on a snapshot row in the **Snapshots** tab and choose + **Save results to JSON**. A file-picker dialog opens; the suggested + filename is `results_.json`. Confirm to write the file. + +Pass `--alias` to use camelCase field names in the output. + +--- + +## Snapshot notes + +Each snapshot can carry a freeform Markdown note — useful for recording +observations, decisions, or links to external references at the time the +snapshot was taken. + +=== "CLI" + + ```bash + aimbat snapshot note read # display the note + aimbat snapshot note edit # open in $EDITOR and save on exit + ``` + +=== "Shell" + + ```bash + snapshot note read + snapshot note edit + ``` + +If no note exists yet, `read` prints `(no note)` and `edit` opens an empty +buffer. The note is saved when you close the editor without error. + +--- + +## Snapshot quality statistics + +A summary of quality metrics across all snapshots for an event can be viewed +without opening individual snapshot records: + +=== "CLI" + + ```bash + aimbat snapshot quality list # for a specific event + aimbat snapshot quality list --all-events # across all events + aimbat snapshot quality dump # raw JSON export + ``` + +=== "Shell" + + ```bash + snapshot quality list + snapshot quality list --all-events + snapshot quality dump + ``` + +The table shows per-snapshot aggregated ICCS correlation coefficients and, where +MCCC has been run, MCCC metrics (mean, SEM) and the global RMSE. 
This makes it +easy to compare the quality evolution across snapshots without having to export +each one individually. diff --git a/pyproject.toml b/pyproject.toml index a7e2b83..05661ed 100644 --- a/pyproject.toml +++ b/pyproject.toml @@ -39,7 +39,7 @@ Documentation = "https://aimbat.pysmo.org" [project.scripts] aimbat = "aimbat.app:app" -aimbat-tui = "aimbat._tui.app:main" +aimbat-tui = "aimbat._cli.tui:app" [dependency-groups] test = [ diff --git a/src/aimbat/_cli/__init__.py b/src/aimbat/_cli/__init__.py index db62f90..b02a150 100644 --- a/src/aimbat/_cli/__init__.py +++ b/src/aimbat/_cli/__init__.py @@ -1,13 +1,14 @@ from .align import app as align from .data import app as data from .event import app as event -from .pick import app as pick from .plot import app as plot from .project import app as project from .seismogram import app as seismogram from .shell import app as shell from .snapshot import app as snapshot from .station import app as station +from .tool import app as tool +from .tui import app as tui from .utils import app as utils __all__ = [ @@ -15,11 +16,12 @@ "align", "data", "event", - "pick", + "tool", "plot", "project", "seismogram", "shell", "snapshot", "station", + "tui", ] diff --git a/src/aimbat/_cli/snapshot.py b/src/aimbat/_cli/snapshot.py index 9203e09..c8f271f 100644 --- a/src/aimbat/_cli/snapshot.py +++ b/src/aimbat/_cli/snapshot.py @@ -6,6 +6,7 @@ to undo them if needed. 
""" +from pathlib import Path from typing import Annotated, Literal from uuid import UUID @@ -418,5 +419,52 @@ def cli_snapshot_quality_list( ) +@app.command(name="results") +@simple_exception +def cli_snapshot_results( + snapshot_id: Annotated[ + UUID, + id_parameter(AimbatSnapshot, help="UUID (or unique prefix) of snapshot."), + ], + *, + output: Annotated[ + Path | None, + Parameter( + name="output", + help="Write results to this JSON file instead of printing to stdout.", + ), + ] = None, + dump_parameters: JsonDumpParameters = JsonDumpParameters(), +) -> None: + """Export per-seismogram MCCC results from a snapshot as JSON. + + Each row contains the frozen pick time (T1), ICCS correlation coefficient, + per-seismogram MCCC quality metrics, and the event-level MCCC RMSE. + + Results are printed to stdout unless `--output` is given, in which case + they are written to the specified file. + """ + import json + + from sqlmodel import Session + + from aimbat.core import dump_snapshot_results + from aimbat.db import engine + + with Session(engine) as session: + data = dump_snapshot_results( + session, + snapshot_id, + by_alias=dump_parameters.by_alias, + ) + + if output is None: + from rich import print_json + + print_json(data=data) + else: + output.write_text(json.dumps(data, indent=2), encoding="utf-8") + + if __name__ == "__main__": app() diff --git a/src/aimbat/_cli/pick.py b/src/aimbat/_cli/tool.py similarity index 71% rename from src/aimbat/_cli/pick.py rename to src/aimbat/_cli/tool.py index 6377dc4..0c8c527 100644 --- a/src/aimbat/_cli/pick.py +++ b/src/aimbat/_cli/tool.py @@ -1,9 +1,9 @@ -"""Interactively pick phase arrival times and processing parameters. +"""Launch interactive tools for picking phase arrival times and processing parameters. -These commands open an interactive matplotlib plot for an event. Use +Each subcommand opens an interactive matplotlib plot for an event. 
Use `--event-id` or set the `DEFAULT_EVENT_ID` environment variable to choose -which event to pick. Click on the plot to set the chosen value, then close -the window to save it. +which event to work with. Click on the plot to set the chosen value, then +close the window to save it. """ from typing import Annotated @@ -19,7 +19,42 @@ use_matrix_image, ) -app = App(name="pick", help=__doc__, help_format="markdown") +app = App(name="tool", help=__doc__, help_format="markdown") + + +@app.command(name="bandpass") +@simple_exception +def cli_update_bandpass( + event_id: Annotated[UUID, event_parameter()], + *, + iccs_plot_parameters: IccsPlotParameters = IccsPlotParameters(), + use_matrix_image: Annotated[bool, use_matrix_image()] = False, + _: DebugParameter = DebugParameter(), +) -> None: + """Interactively update the bandpass filter parameters for an event. + + Opens an interactive plot with controls for enabling/disabling the bandpass + filter and adjusting the minimum and maximum frequencies. Close the window + to save the updated parameters. 
+ """ + from sqlmodel import Session + + from aimbat.core import create_iccs_instance, resolve_event + from aimbat.db import engine + from aimbat.plot import update_bandpass + + with Session(engine) as session: + event = resolve_event(session, event_id) + iccs = create_iccs_instance(session, event).iccs + update_bandpass( + session, + event, + iccs, + context=iccs_plot_parameters.context, + all_seismograms=iccs_plot_parameters.all_seismograms, + use_matrix_image=use_matrix_image, + return_fig=False, + ) @app.command(name="phase") diff --git a/src/aimbat/_cli/tui.py b/src/aimbat/_cli/tui.py new file mode 100644 index 0000000..897f302 --- /dev/null +++ b/src/aimbat/_cli/tui.py @@ -0,0 +1,16 @@ +"""Launch the AIMBAT terminal user interface.""" + +from cyclopts import App + +from .common import DebugParameter, simple_exception + +app = App(name="tui", help=__doc__, help_format="markdown") + + +@app.default +@simple_exception +def cli_tui(*, _: DebugParameter = DebugParameter()) -> None: + """Launch the AIMBAT terminal user interface.""" + from aimbat._tui.app import main + + main() diff --git a/src/aimbat/_tui/app.py b/src/aimbat/_tui/app.py index af6941a..49984e8 100644 --- a/src/aimbat/_tui/app.py +++ b/src/aimbat/_tui/app.py @@ -30,7 +30,7 @@ TabPane, Tabs, ) -from textual_fspicker import FileOpen, Filters +from textual_fspicker import FileOpen, FileSave, Filters from pysmo.tools.iccs import ICCS @@ -63,6 +63,7 @@ delete_station, dump_event_table, dump_seismogram_table, + dump_snapshot_results, dump_snapshot_table, dump_station_table, get_event_quality, @@ -76,6 +77,7 @@ from aimbat.core._project import _project_exists from aimbat.db import engine from aimbat.io import DATATYPE_SUFFIXES, DataType +from aimbat.logger import logger from aimbat.models import ( AimbatEvent, AimbatEventRead, @@ -93,6 +95,7 @@ plot_matrix_image, plot_seismograms, plot_stack, + update_bandpass, update_min_cc, update_pick, update_timewindow, @@ -190,6 +193,24 @@ def _tool_cc( ) +def 
_tool_bandpass( + session: Session, + event: AimbatEvent, + iccs: ICCS, + context: bool, + all_seismograms: bool, +) -> None: + update_bandpass( + session, + event, + iccs, + context, + all_seismograms=all_seismograms, + use_matrix_image=False, + return_fig=False, + ) + + def _tool_stack( session: Session, event: AimbatEvent, @@ -214,6 +235,7 @@ def _tool_image( "phase": ("Phase arrival (t1)", _tool_phase), "window": ("Time window", _tool_window), "cc": ("Min CC", _tool_cc), + "bandpass": ("Bandpass filter", _tool_bandpass), "stack": ("Stack plot", _tool_stack), "image": ("Matrix image", _tool_image), } @@ -297,6 +319,7 @@ def on_mount(self) -> None: self.set_interval(5, self._check_iccs_staleness) + logger.info("TUI started.") if not _project_exists(engine): self.push_screen(NoProjectModal(), self._on_no_project_modal) else: @@ -305,10 +328,12 @@ def on_mount(self) -> None: def _on_no_project_modal(self, create: bool | None) -> None: if create: + logger.info("User chose to create a new project.") create_project(engine) self._create_iccs() self.refresh_all() else: + logger.info("User declined to create a project. Exiting.") self.exit() @on(TabbedContent.TabActivated) @@ -404,6 +429,9 @@ def _create_iccs(self) -> None: Concurrent calls are ignored — only one worker runs at a time. """ if self._iccs_creating: + logger.debug( + "ICCS creation already in progress; skipping duplicate request." 
+            )
             return
         self._iccs_creating = True
         self._bound_iccs = None
@@ -417,20 +445,24 @@ def _worker_create_iccs(self) -> None:
                 event = self._get_current_event(session)
                 bound_iccs = create_iccs_instance(session, event)
         except (NoResultFound, RuntimeError):
+            logger.debug("ICCS worker: no event selected or no data; aborting.")
             self.call_from_thread(setattr, self, "_iccs_creating", False)
             return
         except Exception as exc:
+            logger.exception(f"ICCS worker: unexpected error during creation: {exc}")
             self.call_from_thread(
                 self.notify, f"ICCS init failed: {exc}", severity="error"
             )
             self.call_from_thread(setattr, self, "_iccs_creating", False)
             return
+        logger.debug("ICCS worker: instance created successfully.")
         self.call_from_thread(self._assign_iccs, bound_iccs)
 
     def _assign_iccs(self, bound_iccs: BoundICCS) -> None:
         """Main-thread callback: store the new BoundICCS instance and refresh status."""
         self._iccs_creating = False
         self._bound_iccs = bound_iccs
+        logger.info("ICCS instance ready and assigned.")
         self._refresh_event_bar()
         self._refresh_seismograms()
 
@@ -582,6 +614,9 @@ def _check_iccs_staleness(self) -> None:
         except (NoResultFound, RuntimeError):
             return
         if stale:
+            logger.debug(
+                "ICCS staleness detected; recreating instance and refreshing UI."
+            )
             self._iccs_last_modified_seen = last_modified
             self._create_iccs()
             self.refresh_all()
@@ -760,6 +795,8 @@ def on_action(result: tuple[str, bool, bool] | None) -> None:
                 self._preview_snapshot_plot(snap_id, "stack", context, all_seis)
             elif action == "preview_image":
                 self._preview_snapshot_plot(snap_id, "image", context, all_seis)
+            elif action == "save_results":
+                self._save_snapshot_results(snap_id)
             else:
                 self._handle_row_action("tab-snapshots", snap_id, action)
 
@@ -965,12 +1002,14 @@ def _handle_row_action(self, tab: str, item_id: str, action: str | None) -> None
             self._reset_seismogram_parameters(item_id)
 
     def _select_event(self, item_id: str) -> None:
+        logger.debug(f"User selected event {item_id[:8]}.")
         self._current_event_id = uuid.UUID(item_id)
         self._create_iccs()
         self.refresh_all()
         self.notify("Event selected", timeout=2)
 
     def _toggle_event_completed(self, item_id: str) -> None:
+        logger.debug(f"User toggled completed flag for event {item_id[:8]}.")
         try:
             with Session(engine) as session:
                 event = session.get(AimbatEvent, uuid.UUID(item_id))
@@ -1003,6 +1042,7 @@ def _view_seismograms(self, tab: str, item_id: str) -> None:
             self.notify(str(exc), severity="error")
 
     def _toggle_seismogram_bool(self, item_id: str, param: SeismogramParameter) -> None:
+        logger.debug(f"User toggled {param} for seismogram {item_id[:8]}.")
         try:
             seis_uuid = uuid.UUID(item_id)
             with Session(engine) as session:
@@ -1026,6 +1066,7 @@ def _toggle_seismogram_bool(self, item_id: str, param: SeismogramParameter) -> N
             self.notify(str(exc), severity="error")
 
     def _reset_seismogram_parameters(self, item_id: str) -> None:
+        logger.debug(f"User reset parameters for seismogram {item_id[:8]}.")
         try:
             with Session(engine) as session:
                 reset_seismogram_parameters(session, uuid.UUID(item_id))
@@ -1050,6 +1091,7 @@ def on_confirm(confirmed: bool | None) -> None:
                 return
             try:
                 if tab == "project-events":
+                    logger.info(f"User confirmed deletion of event {item_id[:8]}.")
                     with Session(engine) as session:
                         delete_event(session, uuid.UUID(item_id))
                     if self._current_event_id == uuid.UUID(item_id):
@@ -1058,23 +1100,27 @@ def on_confirm(confirmed: bool | None) -> None:
                     self.refresh_all()
                     self.notify("Event deleted", timeout=2)
                 elif tab == "project-stations":
+                    logger.info(f"User confirmed deletion of station {item_id[:8]}.")
                     with Session(engine) as session:
                         delete_station(session, uuid.UUID(item_id))
                     self._create_iccs()
                     self.refresh_all()
                     self.notify("Station deleted", timeout=2)
                 elif tab == "tab-seismograms":
+                    logger.info(f"User confirmed deletion of seismogram {item_id[:8]}.")
                     with Session(engine) as session:
                         delete_seismogram(session, uuid.UUID(item_id))
                     self._create_iccs()
                     self.refresh_all()
                     self.notify("Seismogram deleted", timeout=2)
                 elif tab == "tab-snapshots":
+                    logger.info(f"User confirmed deletion of snapshot {item_id[:8]}.")
                     with Session(engine) as session:
                         delete_snapshot(session, uuid.UUID(item_id))
                     self._refresh_snapshots()
                     self.notify("Snapshot deleted", timeout=2)
             except Exception as exc:
+                logger.exception(f"Deletion failed: {exc}")
                 self.notify(str(exc), severity="error")
 
         self.push_screen(ConfirmModal(msg), on_confirm)
@@ -1101,9 +1147,32 @@ def _show_snapshot_details(self, snap_id: str) -> None:
         except Exception as exc:
             self.notify(str(exc), severity="error")
 
+    def _save_snapshot_results(self, snap_id: str) -> None:
+        default_name = f"results_{snap_id[:8]}.json"
+
+        def on_path(path: Path | None) -> None:
+            if path is None:
+                return
+            import json
+
+            try:
+                with Session(engine) as session:
+                    data = dump_snapshot_results(session, uuid.UUID(snap_id))
+                path.write_text(json.dumps(data, indent=2), encoding="utf-8")
+                logger.info(f"Snapshot results saved to {path}.")
+                self.notify(f"Results saved to {path.name}", timeout=3)
+            except Exception as exc:
+                logger.exception(f"Failed to save snapshot results: {exc}")
+                self.notify(str(exc), severity="error")
+
+        self.push_screen(
+            FileSave(".", title="Save results", default_file=default_name), on_path
+        )
+
     def _preview_snapshot_plot(
         self, snap_id: str, plot_type: str, context: bool, all_seis: bool
     ) -> None:
+        logger.debug(f"User previewing {plot_type} plot for snapshot {snap_id[:8]}.")
         try:
             with self._suspend("Previewing snapshot"):
                 with Session(engine) as session:
@@ -1113,6 +1182,7 @@ def _preview_snapshot_plot(
             else:
                 plot_matrix_image(bound.iccs, context, all_seis, return_fig=False)
         except Exception as exc:
+            logger.exception(f"Snapshot preview failed: {exc}")
             self.notify(str(exc), severity="error")
 
     def _confirm_rollback(self, snap_id: str) -> None:
@@ -1120,6 +1190,7 @@ def on_confirm(confirmed: bool | None) -> None:
             if not confirmed:
                 return
             try:
+                logger.info(f"User confirmed rollback to snapshot {snap_id[:8]}.")
                 with Session(engine) as session:
                     rollback_to_snapshot(session, uuid.UUID(snap_id))
                 self._create_iccs()
@@ -1128,6 +1199,7 @@ def on_confirm(confirmed: bool | None) -> None:
                 self.query_one(TabbedContent).active = "tab-seismograms"
                 self.notify("Rolled back to snapshot", timeout=3)
             except Exception as exc:
+                logger.exception(f"Rollback failed: {exc}")
                 self.notify(str(exc), severity="error")
 
         self.push_screen(ConfirmModal("Roll back to this snapshot?"), on_confirm)
@@ -1137,6 +1209,7 @@ def on_confirm(confirmed: bool | None) -> None:
     # ------------------------------------------------------------------
 
     def action_open_parameters(self) -> None:
+        logger.debug("User opened parameters modal.")
         try:
             with Session(engine) as session:
                 event = self._get_current_event(session)
@@ -1147,6 +1220,7 @@ def action_open_parameters(self) -> None:
 
         def on_close(changed: bool | None) -> None:
             if changed:
+                logger.info("Parameters changed; recreating ICCS.")
                 self._create_iccs()
                 self.refresh_all()
 
@@ -1155,6 +1229,7 @@ def on_close(changed: bool | None) -> None:
     def action_switch_event(self) -> None:
         def on_result(result: uuid.UUID | None) -> None:
             if result is not None:
+                logger.debug(f"User switched to event {str(result)[:8]}.")
                 self._current_event_id = result
                 self._create_iccs()
                 self.refresh_all()
@@ -1180,9 +1255,11 @@ def on_file(path: Path | None) -> None:
                         session, [path], data_type, disable_progress_bar=True
                     )
                     session.commit()
+                logger.info(f"User added data file: {path}.")
                 self.notify(f"Added: {path.name}", severity="information")
                 self.refresh_all()
             except Exception as exc:
+                logger.exception(f"Failed to add data file {path}: {exc}")
                 self.notify(str(exc), severity="error")
 
         self.push_screen(
@@ -1229,6 +1306,9 @@ def _run_tool(self, tool: str, context: bool, all_seis: bool) -> None:
         matplotlib on the main thread via App.suspend(), which is the correct
         Textual pattern for blocking terminal-adjacent processes.
         """
+        logger.debug(
+            f"User launched interactive tool '{tool}' (context={context}, all_seis={all_seis})."
+        )
         if self._bound_iccs is None:
             self.notify("ICCS not ready — please wait", severity="warning")
             return
@@ -1241,6 +1321,7 @@ def _run_tool(self, tool: str, context: bool, all_seis: bool) -> None:
                 event = self._get_current_event(session)
                 fn(session, event, iccs, context, all_seis)
         except Exception as exc:
+            logger.exception(f"Interactive tool '{tool}' raised: {exc}")
             self.notify(str(exc), severity="error")
             return
 
@@ -1269,6 +1350,9 @@ def _run_align_tool(
         all_seis: bool,
     ) -> None:
         """Run ICCS or MCCC in a background thread."""
+        logger.debug(
+            f"Alignment worker starting: {algorithm=}, {autoflip=}, {autoselect=}, {all_seis=}."
+        )
         notify_msg = "Alignment complete"
         notify_severity: Literal["information", "warning", "error"] = "information"
         try:
@@ -1285,6 +1369,7 @@
                 run_mccc(session, event, bound.iccs, all_seis)
                 notify_msg = "MCCC complete"
         except Exception as exc:
+            logger.exception(f"Alignment worker error ({algorithm}): {exc}")
             self.call_from_thread(self.notify, str(exc), severity="error")
             return
         self.call_from_thread(self._post_align_complete, notify_msg, notify_severity)
@@ -1304,12 +1389,14 @@ def on_comment(comment: str | None) -> None:
             if comment is None:
                 return
             try:
+                logger.info(f"User creating snapshot with comment={comment!r}.")
                 with Session(engine) as session:
                     event = self._get_current_event(session)
                     create_snapshot(session, event, comment or None)
                 self._refresh_snapshots()
                 self.notify("Snapshot created", timeout=2)
             except Exception as exc:
+                logger.exception(f"Snapshot creation failed: {exc}")
                 self.notify(str(exc), severity="error")
 
         self.push_screen(SnapshotCommentModal(), on_comment)
@@ -1329,6 +1416,7 @@ def action_show_help(self) -> None:
         self.push_screen(HelpModal(self._active_tab))
 
     def action_refresh(self) -> None:
+        logger.debug("User triggered manual refresh.")
         self.refresh_all()
         self.notify("Refreshed", timeout=1)
diff --git a/src/aimbat/_tui/help/tab-snapshots.md b/src/aimbat/_tui/help/tab-snapshots.md
index 26b1240..3aaacf5 100644
--- a/src/aimbat/_tui/help/tab-snapshots.md
+++ b/src/aimbat/_tui/help/tab-snapshots.md
@@ -54,6 +54,7 @@ you are in the parameter space.
 | Show details | View the event parameters (window, filter, min CC) as saved |
 | Preview stack | Open the ICCS stack plot built from this snapshot's parameters, without changing anything in the database |
 | Preview matrix image | Open the cross-correlation matrix image from this snapshot |
+| Save results to JSON | Export the snapshot's quality metrics and picks to a JSON file via a file-save dialogue |
 | Rollback to this snapshot | Restore these parameters as the current live values — overwrites the current parameters for this event |
 | Delete snapshot | Permanently remove the snapshot (the live parameters are not affected) |
diff --git a/src/aimbat/_tui/modals.py b/src/aimbat/_tui/modals.py
index c6787d5..0ea6810 100644
--- a/src/aimbat/_tui/modals.py
+++ b/src/aimbat/_tui/modals.py
@@ -560,6 +560,7 @@ def action_cancel(self) -> None:
         ("show_details", "Show details"),
         ("preview_stack", "Preview stack"),
         ("preview_image", "Preview matrix image"),
+        ("save_results", "Save results to JSON"),
         ("rollback", "Rollback to this snapshot"),
         ("delete", "Delete snapshot"),
     ]
@@ -658,6 +659,7 @@ def action_cancel(self) -> None:
         ("phase", "Phase arrival (t1)"),
         ("window", "Time window"),
         ("cc", "Min CC"),
+        ("bandpass", "Bandpass filter"),
         ("stack", "Stack plot"),
         ("image", "Matrix image"),
     ]
diff --git a/src/aimbat/app.py b/src/aimbat/app.py
index 0112156..5b7f16a 100644
--- a/src/aimbat/app.py
+++ b/src/aimbat/app.py
@@ -33,7 +33,7 @@
 app.command(cli.align)
 app.command(cli.data)
 app.command(cli.event)
-app.command(cli.pick)
+app.command(cli.tool)
 app.command(cli.plot)
 app.command(cli.project)
 app.command(cli.seismogram)
@@ -41,11 +41,7 @@
 app.command(cli.station)
 app.command(cli.utils)
 app.command(cli.shell)
-app.command(
-    "aimbat._tui.app:main",
-    name="tui",
-    help="Launch the AIMBAT terminal user interface",
-)
+app.command(cli.tui)
 
 
 if __name__ == "__main__":
diff --git a/src/aimbat/core/_snapshot.py b/src/aimbat/core/_snapshot.py
index 04f1aa3..9c80cd6 100644
--- a/src/aimbat/core/_snapshot.py
+++ b/src/aimbat/core/_snapshot.py
@@ -16,12 +16,15 @@
     AimbatEventQuality,
     AimbatEventQualitySnapshot,
     AimbatSeismogram,
+    AimbatSeismogramParameters,
     AimbatSeismogramParametersSnapshot,
     AimbatSeismogramQuality,
     AimbatSeismogramQualitySnapshot,
     AimbatSnapshot,
     AimbatSnapshotRead,
     SeismogramQualityStats,
+    SnapshotResults,
+    SnapshotSeismogramResult,
 )
 from aimbat.models._parameters import (
     AimbatEventParametersBase,
@@ -43,6 +46,7 @@
     "get_snapshot_quality",
     "dump_snapshot_table",
     "dump_snapshot_quality_table",
+    "dump_snapshot_results",
     "dump_event_parameter_snapshot_table",
     "dump_seismogram_parameter_snapshot_table",
     "dump_event_quality_snapshot_table",
@@ -683,3 +687,85 @@
     )
 
     return seis_quality_dicts
+
+
+def dump_snapshot_results(
+    session: Session,
+    snapshot_id: UUID,
+    by_alias: bool = False,
+) -> dict[str, Any]:
+    """Dump per-seismogram MCCC results from a snapshot as a results envelope.
+
+    Returns a dict with event- and snapshot-level header fields plus a
+    `seismograms` list containing one entry per seismogram. Event-level
+    scalars (`snapshot_id`, `event_id`, `mccc_rmse`) appear once in the
+    envelope rather than being repeated on every row.
+
+    Args:
+        session: Database session.
+        snapshot_id: UUID of the snapshot to export results from.
+        by_alias: Whether to use camelCase serialisation aliases for field names.
+
+    Returns:
+        Dict with header fields and a `seismograms` list.
+
+    Raises:
+        NoResultFound: If no snapshot with the given ID is found.
+ """ + logger.debug(f"Dumping per-seismogram results for snapshot {snapshot_id}.") + + snapshot = session.exec( + select(AimbatSnapshot) + .where(AimbatSnapshot.id == snapshot_id) + .options( + selectinload(rel(AimbatSnapshot.event)), + selectinload(rel(AimbatSnapshot.event_quality_snapshot)), + selectinload(rel(AimbatSnapshot.seismogram_parameters_snapshots)).options( + selectinload( + rel(AimbatSeismogramParametersSnapshot.parameters) + ).options( + selectinload( + rel(AimbatSeismogramParameters.seismogram) + ).selectinload(rel(AimbatSeismogram.station)) + ) + ), + selectinload(rel(AimbatSnapshot.seismogram_quality_snapshots)).selectinload( + rel(AimbatSeismogramQualitySnapshot.quality) + ), + ) + ).one_or_none() + + if snapshot is None: + raise NoResultFound(f"No AimbatSnapshot found with id: {snapshot_id}.") + + eq = snapshot.event_quality_snapshot + mccc_rmse = eq.mccc_rmse if eq is not None else None + + # Build a lookup from seismogram_id → quality snapshot. + quality_map: dict[UUID, AimbatSeismogramQualitySnapshot] = { + sq.quality.seismogram_id: sq for sq in snapshot.seismogram_quality_snapshots + } + + seismograms = [ + SnapshotSeismogramResult.from_snapshot_records( + param_snap=ps, + quality_snap=quality_map.get(ps.parameters.seismogram_id), + ) + for ps in snapshot.seismogram_parameters_snapshots + ] + + event = snapshot.event + results = SnapshotResults( + snapshot_id=snapshot.id, + snapshot_time=snapshot.time, + snapshot_comment=snapshot.comment, + event_id=snapshot.event_id, + event_time=event.time, + event_latitude=event.latitude, + event_longitude=event.longitude, + event_depth_km=event.depth / 1000 if event.depth is not None else None, + mccc_rmse=mccc_rmse, + seismograms=seismograms, + ) + + return results.model_dump(mode="json", by_alias=by_alias) diff --git a/src/aimbat/models/_readers.py b/src/aimbat/models/_readers.py index aca0566..739be5e 100644 --- a/src/aimbat/models/_readers.py +++ b/src/aimbat/models/_readers.py @@ -16,7 +16,14 @@ if 
TYPE_CHECKING: from sqlmodel import Session - from ._models import AimbatEvent, AimbatSeismogram, AimbatSnapshot, AimbatStation + from ._models import ( + AimbatEvent, + AimbatSeismogram, + AimbatSeismogramParametersSnapshot, + AimbatSeismogramQualitySnapshot, + AimbatSnapshot, + AimbatStation, + ) __all__ = [ "AimbatEventRead", @@ -24,6 +31,8 @@ "SeismogramQualityStats", "AimbatSnapshotRead", "AimbatStationRead", + "SnapshotSeismogramResult", + "SnapshotResults", ] @@ -649,3 +658,135 @@ def from_snapshot( event_id=snapshot.event_id, short_event_id=short_event_id, ) + + +class SnapshotSeismogramResult(BaseModel): + """Per-seismogram result record from a snapshot. + + Joins the frozen parameter and quality records from a snapshot to produce + one row per seismogram. Only snapshots with MCCC data will have meaningful + MCCC columns; `iccs_cc` is populated whenever ICCS was run before the + snapshot was taken. + + Event- and snapshot-level scalars (`snapshot_id`, `event_id`, `mccc_rmse`) + are not repeated here — they live in the enclosing `SnapshotResults` envelope. 
+ """ + + model_config = ConfigDict( + frozen=True, + alias_generator=to_camel, + populate_by_name=True, + ) + + seismogram_id: UUID = Field( + title="Seismogram ID", + json_schema_extra={ + "rich": RichColSpec(style="yellow", no_wrap=True, highlight=False), # type: ignore[dict-item] + }, + ) + name: str = Field(title="Name", description="Station network and name.") + channel: str = Field(title="Channel", description="Station channel code.") + select: bool = Field( + title="Select", + description="Whether this seismogram was selected at snapshot time.", + ) + flip: bool = Field( + title="Flip", + description="Whether this seismogram was flipped at snapshot time.", + ) + t1: PydanticTimestamp | None = Field( + default=None, + title="T1", + description="Frozen pick time (absolute timestamp) at snapshot time.", + ) + iccs_cc: float | None = Field( + default=None, + title="Stack CC", + description="Cross-correlation coefficient with ICCS stack at snapshot time.", + ) + mccc_cc_mean: float | None = Field( + default=None, + title="MCCC CC", + description="Mean cross-correlation coefficient of MCCC cluster.", + ) + mccc_cc_std: float | None = Field( + default=None, + title="MCCC CC std", + description="Standard deviation of cross-correlation coefficients in MCCC cluster.", + ) + mccc_error: PydanticTimedelta | None = Field( + default=None, + title="MCCC err Δt (s)", + description="Uncertainty in the MCCC arrival time residual in seconds.", + ) + + @classmethod + def from_snapshot_records( + cls, + param_snap: "AimbatSeismogramParametersSnapshot", + quality_snap: "AimbatSeismogramQualitySnapshot | None", + ) -> "SnapshotSeismogramResult": + """Build a result record from pre-loaded snapshot records. + + Warning: + `param_snap.parameters.seismogram.station` must be loaded before + calling (e.g. via `selectinload`). `quality_snap.quality` must also + be loaded when `quality_snap` is not `None`. + + Args: + param_snap: Seismogram parameters snapshot record. 
+            quality_snap: Matching seismogram quality snapshot, or `None` if
+                no quality data was captured for this seismogram.
+
+        Returns:
+            Assembled result record.
+        """
+        seis = param_snap.parameters.seismogram
+        station = seis.station
+        name = (f"{station.network}." if station.network else "") + station.name
+        return cls(
+            seismogram_id=seis.id,
+            name=name,
+            channel=station.channel,
+            select=param_snap.select,
+            flip=param_snap.flip,
+            t1=param_snap.t1,
+            iccs_cc=getattr(quality_snap, "iccs_cc", None),
+            mccc_cc_mean=getattr(quality_snap, "mccc_cc_mean", None),
+            mccc_cc_std=getattr(quality_snap, "mccc_cc_std", None),
+            mccc_error=getattr(quality_snap, "mccc_error", None),
+        )
+
+
+class SnapshotResults(BaseModel):
+    """Full results export for a snapshot.
+
+    Contains event- and snapshot-level header information followed by
+    per-seismogram result records. All repeated scalars (UUIDs, event
+    details, MCCC RMSE) appear once in the envelope rather than being
+    duplicated across every row.
+ """ + + model_config = ConfigDict( + frozen=True, + alias_generator=to_camel, + populate_by_name=True, + ) + + snapshot_id: UUID = Field(title="Snapshot ID") + snapshot_time: PydanticTimestamp = Field(title="Snapshot time") + snapshot_comment: str | None = Field(default=None, title="Snapshot comment") + event_id: UUID = Field(title="Event ID") + event_time: PydanticTimestamp = Field(title="Event time") + event_latitude: float = Field(title="Event latitude") + event_longitude: float = Field(title="Event longitude") + event_depth_km: float | None = Field(default=None, title="Event depth (km)") + mccc_rmse: PydanticTimedelta | None = Field( + default=None, + title="MCCC RMSE", + description="Global MCCC root-mean-square error for the event (seconds).", + ) + seismograms: list[SnapshotSeismogramResult] = Field( + title="Seismograms", + description="Per-seismogram result records.", + ) diff --git a/src/aimbat/plot/_iccs.py b/src/aimbat/plot/_iccs.py index 7d1626f..51ead8f 100644 --- a/src/aimbat/plot/_iccs.py +++ b/src/aimbat/plot/_iccs.py @@ -13,6 +13,9 @@ from pysmo.tools.iccs import ( plot_stack as _plot_stack, ) +from pysmo.tools.iccs import ( + update_bandpass as _update_bandpass, +) from pysmo.tools.iccs import ( update_min_cc as _update_min_cc, ) @@ -41,6 +44,7 @@ __all__ = [ "plot_stack", "plot_matrix_image", + "update_bandpass", "update_pick", "update_timewindow", "update_min_cc", @@ -89,6 +93,56 @@ def plot_matrix_image( return _plot_matrix_image(iccs, context, all_seismograms, return_fig=return_fig) # type: ignore[call-overload] +def update_bandpass( + session: Session, + event: AimbatEvent, + iccs: ICCS, + context: bool, + all_seismograms: bool, + use_matrix_image: bool, + return_fig: bool, +) -> tuple["Figure", "Axes", Any] | None: + """Update the bandpass filter parameters for an event. + + Args: + session: Database session. + event: AimbatEvent. + iccs: ICCS instance. + context: If True, plot waveforms with extra context around the taper window. 
+        all_seismograms: If True, include deselected seismograms in the plot.
+        use_matrix_image: If True, pick from the matrix image; otherwise pick from the stack plot.
+        return_fig: If True, return the figure, axes and widget objects instead of showing the plot.
+
+    Returns:
+        A tuple of (Figure, Axes, widgets) if return_fig is True, otherwise None.
+    """
+
+    logger.info(f"Updating bandpass filter parameters for event {event.id}.")
+
+    result = _update_bandpass(  # type: ignore[call-overload]
+        iccs, context, all_seismograms, use_matrix_image, return_fig=return_fig
+    )
+
+    if not return_fig:
+        logger.debug(
+            f"Saving new bandpass filter parameters for event {event.id}: "
+            f"apply={iccs.bandpass_apply}, fmin={iccs.bandpass_fmin}, fmax={iccs.bandpass_fmax}"
+        )
+        set_event_parameter(
+            session, event.id, EventParameter.BANDPASS_APPLY, iccs.bandpass_apply
+        )
+        set_event_parameter(
+            session, event.id, EventParameter.BANDPASS_FMIN, iccs.bandpass_fmin
+        )
+        set_event_parameter(
+            session, event.id, EventParameter.BANDPASS_FMAX, iccs.bandpass_fmax
+        )
+        return None
+
+    logger.warning(_RETURN_FIG_WARNING)
+    return result
+
+
 def update_pick(
     session: Session,
     iccs: ICCS,
diff --git a/tests/conftest.py b/tests/conftest.py
index f45d307..29af654 100644
--- a/tests/conftest.py
+++ b/tests/conftest.py
@@ -296,14 +296,18 @@ def loaded_session(loaded_engine: Engine) -> Generator[Session, None, None]:
 
 
 @pytest.fixture()
-def cli() -> Callable[[str], None]:
+def cli() -> Callable[[str | list[str]], None]:
    """Returns a callable that invokes ``app()`` in-process with command tokens.
 
+    Accepts either a command string (split via ``shlex``) or a pre-tokenised
+    list of strings. Pass a list when tokens contain platform-specific path
+    separators (e.g. Windows backslashes) that ``shlex`` would otherwise mangle.
+
     Returns:
-        A callable that accepts a command string and runs it via the app.
+        A callable that accepts a command string or token list and runs it via the app.
""" - def _run(command: str) -> None: + def _run(command: str | list[str]) -> None: try: app(command) except SystemExit as exc: diff --git a/tests/functional/test_cli_snapshots.py b/tests/functional/test_cli_snapshots.py index 0f7afa2..f0b752a 100644 --- a/tests/functional/test_cli_snapshots.py +++ b/tests/functional/test_cli_snapshots.py @@ -6,6 +6,7 @@ """ from collections.abc import Callable +from pathlib import Path from unittest.mock import MagicMock, patch import pytest @@ -809,3 +810,119 @@ def test_preview_matrix_is_called( cli(f"snapshot preview {snapshot_id} --matrix") mock_plot.assert_called_once() + + +# =================================================================== +# Snapshot results +# =================================================================== + + +@pytest.mark.cli +class TestSnapshotResults: + """Tests for the `snapshot results` CLI command.""" + + def test_results_stdout_contains_expected_fields( + self, + loaded_engine: Engine, + cli: Callable[[str], None], + cli_json: Callable[[str], list | dict], + event_id: str, + capsys: pytest.CaptureFixture[str], + ) -> None: + """Verifies that stdout JSON output contains the expected envelope and seismogram fields. + + Args: + loaded_engine: The monkeypatched engine with data loaded. + cli: The in-process CLI callable. + cli_json: The in-process CLI JSON dump callable. + event_id: The default event ID fixture. + capsys: The pytest capsys fixture. 
+ """ + import json + + cli(f"snapshot create --event-id {event_id}") + data = cli_json("snapshot dump") + assert isinstance(data, dict) + snapshot_id = data["snapshots"][0]["id"] + + cli(f"snapshot results {snapshot_id}") + output = capsys.readouterr().out + result = json.loads(output) + assert isinstance(result, dict) + for field in ( + "snapshot_id", + "event_id", + "event_time", + "mccc_rmse", + "seismograms", + ): + assert field in result, f"Expected field '{field}' in results envelope" + assert len(result["seismograms"]) > 0 + for field in ("seismogram_id", "name", "flip", "t1"): + assert field in result["seismograms"][0], ( + f"Expected field '{field}' in seismogram row" + ) + + def test_results_output_to_file( + self, + loaded_engine: Engine, + cli: Callable[[str | list[str]], None], + cli_json: Callable[[str], list | dict], + event_id: str, + tmp_path: Path, + ) -> None: + """Verifies that --output writes a valid JSON file. + + Args: + loaded_engine: The monkeypatched engine with data loaded. + cli: The in-process CLI callable. + cli_json: The in-process CLI JSON dump callable. + event_id: The default event ID fixture. + tmp_path: Pytest temporary directory. + """ + import json + + cli(f"snapshot create --event-id {event_id}") + data = cli_json("snapshot dump") + assert isinstance(data, dict) + snapshot_id: str = data["snapshots"][0]["id"] + + out_file = tmp_path / "results.json" + cli(["snapshot", "results", snapshot_id, "--output", str(out_file)]) + + assert out_file.exists(), "Output file should be created" + result = json.loads(out_file.read_text()) + assert isinstance(result, dict) + assert len(result["seismograms"]) > 0 + + def test_results_by_alias( + self, + loaded_engine: Engine, + cli: Callable[[str], None], + cli_json: Callable[[str], list | dict], + event_id: str, + capsys: pytest.CaptureFixture[str], + ) -> None: + """Verifies that --alias produces camelCase keys in the output. 
+
+        Args:
+            loaded_engine: The monkeypatched engine with data loaded.
+            cli: The in-process CLI callable.
+            cli_json: The in-process CLI JSON dump callable.
+            event_id: The default event ID fixture.
+            capsys: The pytest capsys fixture.
+        """
+        import json
+
+        cli(f"snapshot create --event-id {event_id}")
+        data = cli_json("snapshot dump")
+        assert isinstance(data, dict)
+        snapshot_id = data["snapshots"][0]["id"]
+
+        cli(f"snapshot results {snapshot_id} --alias")
+        output = capsys.readouterr().out
+        result = json.loads(output)
+        assert "snapshotId" in result
+        assert "snapshot_id" not in result
+        assert "seismogramId" in result["seismograms"][0]
+        assert "seismogram_id" not in result["seismograms"][0]
diff --git a/tests/integration/core/test_snapshots.py b/tests/integration/core/test_snapshots.py
index b731adf..bbe094d 100644
--- a/tests/integration/core/test_snapshots.py
+++ b/tests/integration/core/test_snapshots.py
@@ -16,6 +16,7 @@
     dump_event_quality_snapshot_table,
     dump_seismogram_parameter_snapshot_table,
     dump_seismogram_quality_snapshot_table,
+    dump_snapshot_results,
     dump_snapshot_table,
     get_snapshots,
     rollback_to_snapshot,
@@ -967,3 +968,173 @@ def test_snapshot_without_iccs_stats_has_no_quality_records(
         create_snapshot(loaded_session, event)
         snapshot = loaded_session.exec(select(AimbatSnapshot)).one()
         assert len(snapshot.seismogram_quality_snapshots) == 0
+
+
+class TestDumpSnapshotResults:
+    """Tests for dump_snapshot_results."""
+
+    def test_returns_one_row_per_seismogram(self, loaded_session: Session) -> None:
+        """Verifies that the seismograms list has one entry per seismogram in the snapshot.
+
+        Args:
+            loaded_session: The database session.
+        """
+        event = loaded_session.exec(select(AimbatEvent)).first()
+        assert event is not None
+        create_snapshot(loaded_session, event)
+        snapshot = loaded_session.exec(select(AimbatSnapshot)).one()
+
+        result = dump_snapshot_results(loaded_session, snapshot.id)
+
+        assert isinstance(result, dict)
+        assert len(result["seismograms"]) == len(event.seismograms)
+
+    def test_contains_expected_envelope_fields(self, loaded_session: Session) -> None:
+        """Verifies that the envelope contains the required header fields.
+
+        Args:
+            loaded_session: The database session.
+        """
+        event = loaded_session.exec(select(AimbatEvent)).first()
+        assert event is not None
+        create_snapshot(loaded_session, event)
+        snapshot = loaded_session.exec(select(AimbatSnapshot)).one()
+
+        result = dump_snapshot_results(loaded_session, snapshot.id)
+
+        for field in (
+            "snapshot_id",
+            "snapshot_time",
+            "snapshot_comment",
+            "event_id",
+            "event_time",
+            "event_latitude",
+            "event_longitude",
+            "event_depth_km",
+            "mccc_rmse",
+            "seismograms",
+        ):
+            assert field in result, f"Expected field '{field}' in result envelope"
+
+    def test_contains_expected_seismogram_fields(self, loaded_session: Session) -> None:
+        """Verifies that each seismogram entry contains the required fields.
+
+        Args:
+            loaded_session: The database session.
+        """
+        event = loaded_session.exec(select(AimbatEvent)).first()
+        assert event is not None
+        create_snapshot(loaded_session, event)
+        snapshot = loaded_session.exec(select(AimbatSnapshot)).one()
+
+        result = dump_snapshot_results(loaded_session, snapshot.id)
+
+        row = result["seismograms"][0]
+        for field in (
+            "seismogram_id",
+            "name",
+            "channel",
+            "select",
+            "flip",
+            "t1",
+            "iccs_cc",
+            "mccc_cc_mean",
+            "mccc_cc_std",
+            "mccc_error",
+        ):
+            assert field in row, f"Expected field '{field}' in seismogram row"
+
+    def test_repeated_scalars_not_in_seismogram_rows(
+        self, loaded_session: Session
+    ) -> None:
+        """Verifies that envelope-level scalars are not duplicated in seismogram rows.
+
+        Args:
+            loaded_session: The database session.
+        """
+        event = loaded_session.exec(select(AimbatEvent)).first()
+        assert event is not None
+        create_snapshot(loaded_session, event)
+        snapshot = loaded_session.exec(select(AimbatSnapshot)).one()
+
+        result = dump_snapshot_results(loaded_session, snapshot.id)
+
+        for row in result["seismograms"]:
+            assert "snapshot_id" not in row
+            assert "event_id" not in row
+            assert "mccc_rmse" not in row
+
+    def test_mccc_fields_null_without_mccc_run(self, loaded_session: Session) -> None:
+        """Verifies that MCCC fields are null when no MCCC run was captured.
+
+        Args:
+            loaded_session: The database session.
+        """
+        event = loaded_session.exec(select(AimbatEvent)).first()
+        assert event is not None
+        create_snapshot(loaded_session, event)
+        snapshot = loaded_session.exec(select(AimbatSnapshot)).one()
+
+        result = dump_snapshot_results(loaded_session, snapshot.id)
+
+        assert result["mccc_rmse"] is None
+        for row in result["seismograms"]:
+            assert row["mccc_cc_mean"] is None
+            assert row["mccc_cc_std"] is None
+            assert row["mccc_error"] is None
+
+    def test_mccc_fields_populated_after_mccc_run(
+        self, loaded_session: Session
+    ) -> None:
+        """Verifies that MCCC fields are present after a mock MCCC run.
+
+        Args:
+            loaded_session: The database session.
+        """
+        event = loaded_session.exec(select(AimbatEvent)).first()
+        assert event is not None
+        seis_ids = [s.id for s in event.seismograms]
+        select_flags = [s.parameters.select for s in event.seismograms]
+        _write_mock_mccc_quality(
+            loaded_session, event.id, seis_ids, select_flags, all_seismograms=True
+        )
+        loaded_session.refresh(event)
+        create_snapshot(loaded_session, event)
+        snapshot = loaded_session.exec(select(AimbatSnapshot)).one()
+
+        result = dump_snapshot_results(loaded_session, snapshot.id)
+
+        assert result["mccc_rmse"] is not None
+        assert all(row["mccc_cc_mean"] is not None for row in result["seismograms"])
+        assert all(row["mccc_error"] is not None for row in result["seismograms"])
+
+    def test_snapshot_id_not_found_raises(self, loaded_session: Session) -> None:
+        """Verifies that a missing snapshot ID raises NoResultFound.
+
+        Args:
+            loaded_session: The database session.
+        """
+        from sqlalchemy.exc import NoResultFound
+
+        with pytest.raises(NoResultFound):
+            dump_snapshot_results(loaded_session, uuid.uuid4())
+
+    def test_by_alias_uses_camel_case_keys(self, loaded_session: Session) -> None:
+        """Verifies that by_alias=True produces camelCase field names at all levels.
+
+        Args:
+            loaded_session: The database session.
+        """
+        event = loaded_session.exec(select(AimbatEvent)).first()
+        assert event is not None
+        create_snapshot(loaded_session, event)
+        snapshot = loaded_session.exec(select(AimbatSnapshot)).one()
+
+        result = dump_snapshot_results(loaded_session, snapshot.id, by_alias=True)
+
+        assert "snapshotId" in result
+        assert "snapshot_id" not in result
+        assert "eventTime" in result
+        assert "event_time" not in result
+        assert "seismogramId" in result["seismograms"][0]
+        assert "seismogram_id" not in result["seismograms"][0]
diff --git a/uv.lock b/uv.lock
index cc4fc2e..8852b63 100644
--- a/uv.lock
+++ b/uv.lock
@@ -223,19 +223,6 @@ wheels = [
     { url = "https://files.pythonhosted.org/packages/78/b6/6307fbef88d9b5ee7421e68d78a9f162e0da4900bc5f5793f6d3d0e34fb8/annotated_types-0.7.0-py3-none-any.whl", hash = "sha256:1f02e8b43a8fbbc3f3e0d4f0f4bfc8131bcb4eebe8849b8e5c773f3a1c582a53", size = 13643, upload-time = "2024-05-20T21:33:24.1Z" },
 ]

-[[package]]
-name = "anyio"
-version = "4.12.1"
-source = { registry = "https://pypi.org/simple" }
-dependencies = [
-    { name = "idna" },
-    { name = "typing-extensions", marker = "python_full_version < '3.13'" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/96/f0/5eb65b2bb0d09ac6776f2eb54adee6abe8228ea05b20a5ad0e4945de8aac/anyio-4.12.1.tar.gz", hash = "sha256:41cfcc3a4c85d3f05c932da7c26d0201ac36f72abd4435ba90d0464a3ffed703", size = 228685, upload-time = "2026-01-06T11:45:21.246Z" }
-wheels = [
-    { url = "https://files.pythonhosted.org/packages/38/0e/27be9fdef66e72d64c0cdc3cc2823101b80585f8119b5c112c2e8f5f7dab/anyio-4.12.1-py3-none-any.whl", hash = "sha256:d405828884fc140aa80a3c667b8beed277f1dfedec42ba031bd6ac3db606ab6c", size = 113592, upload-time = "2026-01-06T11:45:19.497Z" },
-]
-
 [[package]]
 name = "attrs"
 version = "26.1.0"
@@ -449,7 +436,7 @@ wheels = [

 [[package]]
 name = "cyclopts"
-version = "4.10.0"
+version = "4.10.1"
 source = { registry = "https://pypi.org/simple" }
 dependencies = [
     { name = "attrs" },
@@ -457,9 +444,9 @@ dependencies = [
     { name = "rich" },
     { name = "rich-rst" },
 ]
-sdist = { url = "https://files.pythonhosted.org/packages/2c/e7/3e26855c046ac527cf94d890f6698e703980337f22ea7097e02b35b910f9/cyclopts-4.10.0.tar.gz", hash = "sha256:0ae04a53274e200ef3477c8b54de63b019bc6cd0162d75c718bf40c9c3fb5268", size = 166394, upload-time = "2026-03-14T14:09:31.043Z" }
+sdist = { url = "https://files.pythonhosted.org/packages/6c/c4/2ce2ca1451487dc7d59f09334c3fa1182c46cfcf0a2d5f19f9b26d53ac74/cyclopts-4.10.1.tar.gz", hash = "sha256:ad4e4bb90576412d32276b14a76f55d43353753d16217f2c3cd5bdceba7f15a0", size = 166623, upload-time = "2026-03-23T14:43:01.098Z" }
 wheels = [
-    { url = "https://files.pythonhosted.org/packages/06/06/d68a5d5d292c2ad2bc6a02e5ca2cb1bb9c15e941ab02f004a06a342d7f0f/cyclopts-4.10.0-py3-none-any.whl", hash = "sha256:50f333382a60df8d40ec14aa2e627316b361c4f478598ada1f4169d959bf9ea7", size = 204097, upload-time = "2026-03-14T14:09:32.504Z" },
+    { url = "https://files.pythonhosted.org/packages/8a/0b/2261922126b2e50c601fe22d7ff5194e0a4d50e654836260c0665e24d862/cyclopts-4.10.1-py3-none-any.whl", hash = "sha256:35f37257139380a386d9fe4475e1e7c87ca7795765ef4f31abba579fcfcb6ecd", size = 204331, upload-time = "2026-03-23T14:43:02.625Z" },
 ]

 [[package]]
@@ -723,47 +710,11 @@ wheels = [

 [[package]]
 name = "griffelib"
-version = "2.0.0"
+version = "2.0.1"
 source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/71/d7/2b805e89cdc609e5b304361d80586b272ef00f6287ee63de1e571b1f71ec/griffelib-2.0.1.tar.gz", hash = "sha256:59f39eabb4c777483a3823e39e8f9e03e69df271a7e49aee64e91a8cfa91bdf5", size = 166383, upload-time = "2026-03-23T21:05:25.882Z" }
 wheels = [
-    { url = "https://files.pythonhosted.org/packages/4d/51/c936033e16d12b627ea334aaaaf42229c37620d0f15593456ab69ab48161/griffelib-2.0.0-py3-none-any.whl", hash = "sha256:01284878c966508b6d6f1dbff9b6fa607bc062d8261c5c7253cb285b06422a7f", size = 142004, upload-time = "2026-02-09T19:09:40.561Z" },
-]
-
-[[package]]
-name = "h11"
-version = "0.16.0"
-source = { registry = "https://pypi.org/simple" }
-sdist = { url = "https://files.pythonhosted.org/packages/01/ee/02a2c011bdab74c6fb3c75474d40b3052059d95df7e73351460c8588d963/h11-0.16.0.tar.gz", hash = "sha256:4e35b956cf45792e4caa5885e69fba00bdbc6ffafbfa020300e549b208ee5ff1", size = 101250, upload-time = "2025-04-24T03:35:25.427Z" }
-wheels = [
-    { url = "https://files.pythonhosted.org/packages/04/4b/29cac41a4d98d144bf5f6d33995617b185d14b22401f75ca86f384e87ff1/h11-0.16.0-py3-none-any.whl", hash = "sha256:63cf8bbe7522de3bf65932fda1d9c2772064ffb3dae62d55932da54b31cb6c86", size = 37515, upload-time = "2025-04-24T03:35:24.344Z" },
-]
-
-[[package]]
-name = "httpcore"
-version = "1.0.9"
-source = { registry = "https://pypi.org/simple" }
-dependencies = [
-    { name = "certifi" },
-    { name = "h11" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/06/94/82699a10bca87a5556c9c59b5963f2d039dbd239f25bc2a63907a05a14cb/httpcore-1.0.9.tar.gz", hash = "sha256:6e34463af53fd2ab5d807f399a9b45ea31c3dfa2276f15a2c3f00afff6e176e8", size = 85484, upload-time = "2025-04-24T22:06:22.219Z" }
-wheels = [
-    { url = "https://files.pythonhosted.org/packages/7e/f5/f66802a942d491edb555dd61e3a9961140fd64c90bce1eafd741609d334d/httpcore-1.0.9-py3-none-any.whl", hash = "sha256:2d400746a40668fc9dec9810239072b40b4484b640a8c38fd654a024c7a1bf55", size = 78784, upload-time = "2025-04-24T22:06:20.566Z" },
-]
-
-[[package]]
-name = "httpx"
-version = "0.28.1"
-source = { registry = "https://pypi.org/simple" }
-dependencies = [
-    { name = "anyio" },
-    { name = "certifi" },
-    { name = "httpcore" },
-    { name = "idna" },
-]
-sdist = { url = "https://files.pythonhosted.org/packages/b1/df/48c586a5fe32a0f01324ee087459e112ebb7224f646c0b5023f5e79e9956/httpx-0.28.1.tar.gz", hash = "sha256:75e98c5f16b0f35b567856f597f06ff2270a374470a5c2392242528e3e3e42fc", size = 141406, upload-time = "2024-12-06T15:37:23.222Z" }
-wheels = [
-    { url = "https://files.pythonhosted.org/packages/2a/39/e50c7c3a983047577ee07d2a9e53faf5a69493943ec3f6a384bdc792deb2/httpx-0.28.1-py3-none-any.whl", hash = "sha256:d909fcccc110f8c7faf814ca82a9a4d816bc5a6dbfea25d6591d6985b8ba59ad", size = 73517, upload-time = "2024-12-06T15:37:21.509Z" },
+    { url = "https://files.pythonhosted.org/packages/4b/4c/cc8c68196db727cfc1432f2ad5de50aa6707e630d44b2e6361dc06d8f134/griffelib-2.0.1-py3-none-any.whl", hash = "sha256:b769eed581c0e857d362fc8fcd8e57ecd2330c124b6104ac8b4c1c86d76970aa", size = 142377, upload-time = "2026-03-23T21:04:01.116Z" },
 ]

 [[package]]
@@ -1974,13 +1925,12 @@ wheels = [

 [[package]]
 name = "pysmo"
-version = "1.0.0.dev39+g4a209ab66"
-source = { git = "https://github.com/pysmo/pysmo?rev=master#4a209ab66d52d498c463cf4a6e4af45df33154a0" }
+version = "1.0.0.dev42+g39ff86acb"
+source = { git = "https://github.com/pysmo/pysmo?rev=master#39ff86acbf51fa93c4e0b2628c7206ed34c08bc2" }
 dependencies = [
     { name = "annotated-types" },
     { name = "attrs" },
     { name = "cattrs" },
-    { name = "httpx" },
     { name = "matplotlib" },
     { name = "numpy" },
     { name = "pandas" },
@@ -1988,6 +1938,7 @@ dependencies = [
     { name = "pyproj" },
     { name = "scipy" },
     { name = "scipy-stubs" },
+    { name = "urllib3" },
 ]

 [[package]]
@@ -2008,16 +1959,16 @@ wheels = [

 [[package]]
 name = "pytest-cov"
-version = "7.0.0"
+version = "7.1.0"
 source = { registry = "https://pypi.org/simple" }
 dependencies = [
     { name = "coverage" },
     { name = "pluggy" },
     { name = "pytest" },
 ]
-sdist = { url = "https://files.pythonhosted.org/packages/5e/f7/c933acc76f5208b3b00089573cf6a2bc26dc80a8aece8f52bb7d6b1855ca/pytest_cov-7.0.0.tar.gz", hash = "sha256:33c97eda2e049a0c5298e91f519302a1334c26ac65c1a483d6206fd458361af1", size = 54328, upload-time = "2025-09-09T10:57:02.113Z" }
+sdist = { url = "https://files.pythonhosted.org/packages/b1/51/a849f96e117386044471c8ec2bd6cfebacda285da9525c9106aeb28da671/pytest_cov-7.1.0.tar.gz", hash = "sha256:30674f2b5f6351aa09702a9c8c364f6a01c27aae0c1366ae8016160d1efc56b2", size = 55592, upload-time = "2026-03-21T20:11:16.284Z" }
 wheels = [
-    { url = "https://files.pythonhosted.org/packages/ee/49/1377b49de7d0c1ce41292161ea0f721913fa8722c19fb9c1e3aa0367eecb/pytest_cov-7.0.0-py3-none-any.whl", hash = "sha256:3b8e9558b16cc1479da72058bdecf8073661c7f57f7d3c5f22a1c23507f2d861", size = 22424, upload-time = "2025-09-09T10:57:00.695Z" },
+    { url = "https://files.pythonhosted.org/packages/9d/7a/d968e294073affff457b041c2be9868a40c1c71f4a35fcc1e45e5493067b/pytest_cov-7.1.0-py3-none-any.whl", hash = "sha256:a0461110b7865f9a271aa1b51e516c9a95de9d696734a2f71e3e78f46e1d4678", size = 22876, upload-time = "2026-03-21T20:11:14.438Z" },
 ]

 [[package]]
@@ -2266,14 +2217,14 @@ wheels = [

 [[package]]
 name = "scipy-stubs"
-version = "1.17.1.2"
+version = "1.17.1.3"
 source = { registry = "https://pypi.org/simple" }
 dependencies = [
     { name = "optype", extra = ["numpy"] },
 ]
-sdist = { url = "https://files.pythonhosted.org/packages/c7/ab/43f681ffba42f363b7ed6b767fd215d1e26006578214ff8330586a11bf95/scipy_stubs-1.17.1.2.tar.gz", hash = "sha256:2ecadc8c87a3b61aaf7379d6d6b10f1038a829c53b9efe5b174fb97fc8b52237", size = 388354, upload-time = "2026-03-15T22:33:20.449Z" }
+sdist = { url = "https://files.pythonhosted.org/packages/a7/59/59c6cc3f9970154b9ed6b1aff42a0185cdd60cef54adc0404b9e77972221/scipy_stubs-1.17.1.3.tar.gz", hash = "sha256:5eb87a8d23d726706259b012ebe76a4a96a9ae9e141fc59bf55fc8eac2ed9e0f", size = 392185, upload-time = "2026-03-22T22:11:58.34Z" }
 wheels = [
-    { url = "https://files.pythonhosted.org/packages/8c/0b/ec4fe720c1202d9df729a3e9d9b7e4d2da9f6e7f28bd2877b7d0769f4f75/scipy_stubs-1.17.1.2-py3-none-any.whl", hash = "sha256:f19e8f5273dbe3b7ee6a9554678c3973b9695fa66b91f29206d00830a1536c06", size = 594377, upload-time = "2026-03-15T22:33:18.684Z" },
+    { url = "https://files.pythonhosted.org/packages/2c/d4/94304532c0a75a55526119043dd44a9bd1541a21e14483cbb54261c527d2/scipy_stubs-1.17.1.3-py3-none-any.whl", hash = "sha256:7b91d3f05aa47da06fbca14eb6c5bb4c28994e9245fd250cc847e375bab31297", size = 597933, upload-time = "2026-03-22T22:11:56.525Z" },
 ]

 [[package]]
@@ -2476,6 +2427,15 @@ wheels = [
     { url = "https://files.pythonhosted.org/packages/61/73/d21edf5b204d1467e06500080a50f79d49ef2b997c79123a536d4a17d97c/uc_micro_py-2.0.0-py3-none-any.whl", hash = "sha256:3603a3859af53e5a39bc7677713c78ea6589ff188d70f4fee165db88e22b242c", size = 6383, upload-time = "2026-03-01T06:31:26.257Z" },
 ]

+[[package]]
+name = "urllib3"
+version = "2.6.3"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/c7/24/5f1b3bdffd70275f6661c76461e25f024d5a38a46f04aaca912426a2b1d3/urllib3-2.6.3.tar.gz", hash = "sha256:1b62b6884944a57dbe321509ab94fd4d3b307075e0c2eae991ac71ee15ad38ed", size = 435556, upload-time = "2026-01-07T16:24:43.925Z" }
+wheels = [
+    { url = "https://files.pythonhosted.org/packages/39/08/aaaad47bc4e9dc8c725e68f9d04865dbcb2052843ff09c97b08904852d84/urllib3-2.6.3-py3-none-any.whl", hash = "sha256:bf272323e553dfb2e87d9bfd225ca7b0f467b919d7bbd355436d3fd37cb0acd4", size = 131584, upload-time = "2026-01-07T16:24:42.685Z" },
+]
+
 [[package]]
 name = "vulture"
 version = "2.15"
@@ -2633,7 +2593,7 @@ wheels = [

 [[package]]
 name = "zensical"
-version = "0.0.28"
+version = "0.0.29"
 source = { registry = "https://pypi.org/simple" }
 dependencies = [
     { name = "click" },
@@ -2643,18 +2603,18 @@ dependencies = [
     { name = "pymdown-extensions" },
     { name = "pyyaml" },
 ]
-sdist = { url = "https://files.pythonhosted.org/packages/14/0a/ed78749cd30c8b72f6b3f85de7f4da45ddcbbd006222aa63f7d6e27d68db/zensical-0.0.28.tar.gz", hash = "sha256:af7d75a1b297721dfc9b897f729b601e56b3e566990a989e9e3e373a8cd04c40", size = 3842655, upload-time = "2026-03-19T14:28:09.17Z" }
-wheels = [
-    { url = "https://files.pythonhosted.org/packages/ce/c5/05e6a8b8ecfc255ff59414c71e1904b1ceaf3ccbc26f14b90ce82aaab16e/zensical-0.0.28-cp310-abi3-macosx_10_12_x86_64.whl", hash = "sha256:2db2997dd124dc9361b9d3228925df9e51281af9529c26187a865407588f8abb", size = 12302942, upload-time = "2026-03-19T14:27:32.009Z" },
-    { url = "https://files.pythonhosted.org/packages/10/aa/c10fcbee69bcca8a545b1a868e3fec2560b984f68e91cbbce3eaee0814ff/zensical-0.0.28-cp310-abi3-macosx_11_0_arm64.whl", hash = "sha256:5c6e5ea5c057492a1473a68f0e71359d663057d7d864b32a8fd429c8ea390346", size = 12186436, upload-time = "2026-03-19T14:27:34.866Z" },
-    { url = "https://files.pythonhosted.org/packages/c2/ea/d0aaa0f0ed1b7a69aeec5f25ce2ff2ea7b13e581c9115d51a4a50bc7bf57/zensical-0.0.28-cp310-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d2ee8a1d29b61de61e6b0f9123fa395c06c24c94e509170c7f7f9ccddaeaaad4", size = 12545239, upload-time = "2026-03-19T14:27:37.613Z" },
-    { url = "https://files.pythonhosted.org/packages/d9/b1/508ea4de8b5c93a2ceb4d536314041a19a520866a5ce61c55d64417afaa9/zensical-0.0.28-cp310-abi3-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:7cef68b363c0d3598d37a1090bfc5c6267e36a87a55e9fb6a6f9d7f2768f1dfd", size = 12488943, upload-time = "2026-03-19T14:27:40.663Z" },
-    { url = "https://files.pythonhosted.org/packages/1d/35/9c1878845dfcec655f538ef523c606e585d38b84415d65009b83ebc356b2/zensical-0.0.28-cp310-abi3-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f3175440fd526cf0273859d0de355e769ba43e082e09deb04b6f6afd77af6c91", size = 12840468, upload-time = "2026-03-19T14:27:43.758Z" },
-    { url = "https://files.pythonhosted.org/packages/d0/1f/50f0ca6db76dc7888f9e0f0103c8faaaa6ee25a2c1e3664f2db5cc7bf24b/zensical-0.0.28-cp310-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0887436c5fd8fe7008c0d93407876695db67bcf55c8aec9fb36c339d82bb7fce", size = 12591152, upload-time = "2026-03-19T14:27:46.629Z" },
-    { url = "https://files.pythonhosted.org/packages/f1/6b/621b7031c24c9fb0d38c2c488d79d73fcc2e645330c27fbab4ecccc06528/zensical-0.0.28-cp310-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:b8a0ca92e04687f71aa20c9ae80fe8b840125545657e6b7c0f83adecd04d512e", size = 12723744, upload-time = "2026-03-19T14:27:50.101Z" },
-    { url = "https://files.pythonhosted.org/packages/8d/89/a8bdd6a8423e0bb4f8792793681cbe101cdfbb1e0c1128b3226afe53af5f/zensical-0.0.28-cp310-abi3-musllinux_1_2_armv7l.whl", hash = "sha256:acb31723ca82c367d1c41a6a7b0f52ce1ed87f0ee437de2ee2fc2e284e120e44", size = 12760416, upload-time = "2026-03-19T14:27:52.667Z" },
-    { url = "https://files.pythonhosted.org/packages/86/07/af4ec58b63a14c0fb6b21c8c875f34effa71d4258530a3e3d301b1c518b9/zensical-0.0.28-cp310-abi3-musllinux_1_2_i686.whl", hash = "sha256:3680b3a75560881e7fa32b450cf6de09895680b84d0dd2b611cb5fa552fdfc49", size = 12907390, upload-time = "2026-03-19T14:27:56.71Z" },
-    { url = "https://files.pythonhosted.org/packages/61/70/1b3f319ac2c05bdcd27ae73ae315a893683eb286a42a746e7e572e2675f6/zensical-0.0.28-cp310-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:93e1bc47981b50bcd9c4098edc66fb86fd881c5b52b355db92dcef626cc0b468", size = 12864434, upload-time = "2026-03-19T14:28:00.443Z" },
-    { url = "https://files.pythonhosted.org/packages/8b/21/be7c94b25e0f4281a6b5fbd471236e33c44b832a830fedad40a6c119f290/zensical-0.0.28-cp310-abi3-win32.whl", hash = "sha256:eee014ca1290463cf8471e3e1b05b7c627ac7afa0881635024d23d4794675980", size = 11888008, upload-time = "2026-03-19T14:28:03.565Z" },
-    { url = "https://files.pythonhosted.org/packages/de/88/5ce79445489edae6c1a3ff9e06b4885bea5d8e8bb8e26e1aa1b24395c337/zensical-0.0.28-cp310-abi3-win_amd64.whl", hash = "sha256:6077a85ee1f0154dbfe542db36789322fe8625d716235a000d4e0a8969b14175", size = 12094496, upload-time = "2026-03-19T14:28:06.311Z" },
+sdist = { url = "https://files.pythonhosted.org/packages/78/bd/5786ab618a60bd7469ab243a7fd2c9eecb0790c85c784abb8b97edb77a54/zensical-0.0.29.tar.gz", hash = "sha256:0d6282be7cb551e12d5806badf5e94c54a5e2f2cf07057a3e36d1eaf97c33ada", size = 3842641, upload-time = "2026-03-24T13:37:27.587Z" }
+wheels = [
+    { url = "https://files.pythonhosted.org/packages/4b/9c/8b681daa024abca9763017bec09ecee8008e110cae1254217c8dd22cc339/zensical-0.0.29-cp310-abi3-macosx_10_12_x86_64.whl", hash = "sha256:20ae0709ea14fce25ab33d0a82acdaf454a7a2e232a9ee20c019942205174476", size = 12311399, upload-time = "2026-03-24T13:36:53.809Z" },
+    { url = "https://files.pythonhosted.org/packages/81/ae/4ebb4d8bb2ef0164d473698b92f11caf431fc436e1625524acd5641102ca/zensical-0.0.29-cp310-abi3-macosx_11_0_arm64.whl", hash = "sha256:599af3ba66fcd0146d7019f3493ed3c316051fae6c4d5599bc59f3a8f4b8a6f0", size = 12191845, upload-time = "2026-03-24T13:36:56.909Z" },
+    { url = "https://files.pythonhosted.org/packages/d5/35/67f89db06571a52283b3ecbe3bcf32fd3115ca50436b3ae177a948b83ea7/zensical-0.0.29-cp310-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:eea7e48a00a71c0586e875079b5f83a070c33a147e52ad4383e4b63ab524332b", size = 12554105, upload-time = "2026-03-24T13:36:59.945Z" },
+    { url = "https://files.pythonhosted.org/packages/7c/f6/ac79e5d9c18b28557c9ff1c7c23d695fbdd82645d69bfe02292f46d935e7/zensical-0.0.29-cp310-abi3-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:59a57db35542e98d2896b833de07d199320f8ada3b4e7ddccb7fe892292d8b74", size = 12498643, upload-time = "2026-03-24T13:37:02.376Z" },
+    { url = "https://files.pythonhosted.org/packages/b1/70/5c22a96a69e0e91e569c26236918bb9bab1170f59b29ad04105ead64f199/zensical-0.0.29-cp310-abi3-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d42c2b2a96a80cf64c98ba7242f59ef95109914bd4c9499d7ebc12544663852c", size = 12854531, upload-time = "2026-03-24T13:37:04.962Z" },
+    { url = "https://files.pythonhosted.org/packages/79/25/e32237a8fcb0ceae1ef8e192e7f8db53b38f1e48f1c7cdbacd0a7b713892/zensical-0.0.29-cp310-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6b2fca39c5f6b1782c77cf6591cf346357cabee85ebdb956c5ddc0fd5169f3d9", size = 12596828, upload-time = "2026-03-24T13:37:07.817Z" },
+    { url = "https://files.pythonhosted.org/packages/ff/74/89ac909cbb258903ea53802c184e4986c17ce0ba79b1c7f77b7e78a2dce3/zensical-0.0.29-cp310-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:dfc23a74ef672aa51088c080286319da1dc0b989cd5051e9e5e6d7d4abbc2fc1", size = 12732059, upload-time = "2026-03-24T13:37:11.651Z" },
+    { url = "https://files.pythonhosted.org/packages/8c/31/2429de6a9328eed4acc7e9a3789f160294a15115be15f9870a0d02649302/zensical-0.0.29-cp310-abi3-musllinux_1_2_armv7l.whl", hash = "sha256:c9336d4e4b232e3c9a70e30258e916dd7e60c0a2a08c8690065e60350c302028", size = 12768542, upload-time = "2026-03-24T13:37:14.39Z" },
+    { url = "https://files.pythonhosted.org/packages/10/8a/55588b2a1dcbe86dad0404506c9ba367a06c663b1ff47147c84d26f7510e/zensical-0.0.29-cp310-abi3-musllinux_1_2_i686.whl", hash = "sha256:30661148f0681199f3b598cbeb1d54f5cba773e54ae840bac639250d85907b84", size = 12917991, upload-time = "2026-03-24T13:37:16.795Z" },
+    { url = "https://files.pythonhosted.org/packages/ec/5d/653901f0d3a3ca72daebc62746a148797f4e422cc3a2b66a4e6718e4398f/zensical-0.0.29-cp310-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:6a566ac1fd4bfac5d711a7bd1ae06666712127c2718daa5083c7bf3f107e8578", size = 12868392, upload-time = "2026-03-24T13:37:19.42Z" },
+    { url = "https://files.pythonhosted.org/packages/29/58/d7449bc88a174b98daa3f2fbdfbdac3493768a557d8987e88bdaa6c78b1a/zensical-0.0.29-cp310-abi3-win32.whl", hash = "sha256:a231a3a02a3851741dc4d2de8910b5c39fe81e55bf026d8edf4d803e91a922fb", size = 11905486, upload-time = "2026-03-24T13:37:22.154Z" },
+    { url = "https://files.pythonhosted.org/packages/f5/09/3fd082d016497c4d26ff20f42a8be2cc91e27191c0c5f3cd6507827f666f/zensical-0.0.29-cp310-abi3-win_amd64.whl", hash = "sha256:7145c5504380a344b8cd4586da815cdde77ef4a42319fa4f35e78250f01985af", size = 12101510, upload-time = "2026-03-24T13:37:24.77Z" },
 ]
diff --git a/zensical.toml b/zensical.toml
index 4d771b9..89e3d02 100644
--- a/zensical.toml
+++ b/zensical.toml
@@ -25,10 +25,11 @@ nav = [
     { "Selecting an Event" = "usage/event-selection.md" },
     { "Initial Inspection" = "usage/inspection.md" },
     { "The ICCS Stack" = "usage/iccs-stack.md" },
-    { "Aligning with ICCS" = "usage/alignment.md" },
     { "Snapshots" = "usage/snapshots.md" },
-    { "Finalising with MCCC" = "usage/mccc.md" },
+    { "Aligning with ICCS" = "usage/alignment.md" },
+    { "MCCC Alignment" = "usage/mccc.md" },
     { "Quality Assessment" = "usage/quality.md" },
+    { "Exporting Results" = "usage/results.md" },
     { "Python API" = "usage/api.md" },
   ] },
   { "API reference" = [