add spark typescript #30

Merged
ntheile merged 7 commits into master from typescript-spark
Feb 8, 2026
Conversation


ntheile (Contributor) commented Feb 8, 2026

Summary by CodeRabbit

  • New Features

    • Spark wallet support in TypeScript bindings with public APIs to install a Spark runtime and create/run Spark nodes; full send/receive and transaction workflows.
    • Interactive example apps (web + Expo Go) demonstrating connect, balance, transactions, send/receive invoice flows and settings persistence.
  • Documentation

    • Expanded Spark integration guides, runtime usage, Expo/Vite/Metro guidance, packaging notes, and troubleshooting.
  • Tests

    • Added unit and integration tests covering Spark runtime behavior and wallet workflows.


coderabbitai bot commented Feb 8, 2026

Warning

Rate limit exceeded

@ntheile has exceeded the limit for the number of commits that can be reviewed per hour. Please wait 9 minutes and 5 seconds before requesting another review.

⌛ How to resolve this issue?

After the wait time has elapsed, a review can be triggered using the @coderabbitai review command as a PR comment. Alternatively, push new commits to this PR.

We recommend that you space out your commits to avoid hitting the rate limit.

🚦 How do rate limits work?

CodeRabbit enforces hourly rate limits for each developer per organization.

Our paid plans have higher rate limits than the trial, open-source and free plans. In all cases, we re-allow further reviews after a brief timeout.

Please see our FAQ for further information.

📝 Walkthrough

Adds Spark support to the TypeScript bindings: new SparkNode implementation and types, spark-runtime utilities, vendor bundling and shims, example apps (web and Expo Go), build scripts, tests, and documentation updates.

Changes

Cohort / File(s) | Summary
Core runtime & node
bindings/typescript/src/spark-runtime.ts, bindings/typescript/src/nodes/spark.ts, bindings/typescript/src/types.ts, bindings/typescript/src/factory.ts, bindings/typescript/src/index.ts
Adds installSparkRuntime, createStreamCompatibleFetch, and header/stream helpers; introduces SparkNode, SparkConfig/SparkNetwork, extends createNode overloads and public exports.
Vendor build & packaging
bindings/typescript/scripts/build-spark-vendor.mjs, bindings/typescript/package.json
Adds an esbuild-based vendor pipeline producing browser-ready Spark bundles, integrates vendor build into package scripts, bumps package version, and adds Spark-related dependencies and ./spark-runtime export.
Shims & runtime helpers
bindings/typescript/scripts/spark-shims/*
Adds multiple runtime shims (crypto, fetch, headers, node-events, http/https stubs, vitest no-op) to adapt Spark/vendor code to browser/Expo environments.
Vendor bridges & typings
bindings/typescript/src/vendor/frosts-bridge.*, bindings/typescript/src/vendor/spark-sdk-bare.*
Adds passthrough vendor bridge modules and ambient TypeScript declarations exposing Frosts and Spark-bare APIs consumed by the Spark implementation.
Examples — Expo Go
bindings/typescript/examples/spark-expo-go/*
New Expo Go example (no WASM): full App.tsx demo, Metro resolver, package/tsconfig/app.json, entry index.ts, README, .gitignore, and Metro resolveRequest to map dist/ imports.
Examples — Web demo
bindings/typescript/examples/spark-web/*
New Vite-served web demo (no WASM) with full wallet UI (index.html), README, package.json, and Vite config excluding react-native.
Build entries & shim mappings
bindings/typescript/scripts/spark-vendor-entries/*, bindings/typescript/scripts/spark-shims/*
Adds entry modules and glue files used by the vendor bundling process (frosts-bridge entry, frost wrappers, aliasing and runtime patches).
Tests
bindings/typescript/src/__tests__/integration/spark.real.test.ts, bindings/typescript/src/__tests__/spark-runtime.test.ts, bindings/typescript/src/__tests__/spark-node.test.ts, minor edit bindings/typescript/src/__tests__/integration/cln.real.test.ts
Adds integration and unit tests for Spark runtime and SparkNode behavior (env-driven integration tests, runtime helpers, payInvoice unit tests).
Docs & planning
bindings/typescript/README.md, docs/SPARK_EXPO_PURE_TS_PLAN.md, docs/THREAD_CONTEXT.md
Extensive documentation and planning updates: Spark usage, sdkEntry options, packaging notes, Expo guidance, architecture and phased plan, and thread/context rewrites.
Examples/config & misc
bindings/typescript/examples/spark-expo-go/metro.config.js, .gitignore, other example config files
Adds Metro resolver logic for dist/ imports, Vite .vite/ ignore, example configs, and example package scripts and app.json.

Sequence Diagram(s)

sequenceDiagram
    participant App as App (Web/Expo)
    participant Runtime as Spark Runtime
    participant Factory as createNode factory
    participant Node as SparkNode
    participant Wallet as Spark Wallet SDK
    participant Frosts as Frosts Crypto

    App->>Runtime: installSparkRuntime(options)
    Runtime->>Runtime: wrap global fetch / inject headers
    Runtime-->>App: RuntimeHandle{fetch, restore}

    App->>Factory: createNode({kind:'spark', config})
    Factory->>Node: construct SparkNode (loads SDK entry)
    Node->>Wallet: load/initialise wallet client (vendor bundle)
    Wallet-->>Node: wallet ready

    App->>Node: createInvoice(params)
    Node->>Wallet: wallet.createInvoice(...)
    Wallet-->>Node: bolt11 invoice
    Node-->>App: Transaction{hash, bolt11}

    App->>Node: payInvoice(params)
    Node->>Frosts: crypto/signing (if required)
    Frosts-->>Node: signature
    Node->>Wallet: wallet.payLightningInvoice(...)
    Wallet-->>Node: preimage/result
    Node-->>App: PayInvoiceResponse

Estimated code review effort

🎯 4 (Complex) | ⏱️ ~75 minutes

Possibly related PRs

  • typescript #29 — Overlaps TypeScript bindings surface and factory/index exports; likely closely related to the Spark exports and factory changes.

Poem

🐰 I hopped through shims and bundled light,
Wrapped fetch and kept the runtime tight,
SparkNode planted seeds in web and app,
Expo and browser share the map,
A tiny rabbit cheers—code hops bright!

🚥 Pre-merge checks | ✅ 1 | ❌ 2
❌ Failed checks (1 warning, 1 inconclusive)
  • Docstring Coverage (⚠️ Warning): Docstring coverage is 2.99%, below the required threshold of 80.00%. Resolution: write docstrings for the functions that are missing them to satisfy the coverage threshold.
  • Title Check (❓ Inconclusive): The title 'add spark typescript' is vague and generic and does not convey the scope of this substantial changeset. Resolution: use a more descriptive title, such as 'Add Spark node support with TypeScript implementation and Expo integration' or 'Implement Spark wallet integration for TypeScript bindings'.
✅ Passed checks (1 passed)
  • Description Check (✅ Passed): check skipped because CodeRabbit's high-level summary is enabled.

✏️ Tip: You can configure your own custom pre-merge checks in the settings.



coderabbitai bot left a comment


Actionable comments posted: 11

🤖 Fix all issues with AI agents
In `@bindings/typescript/examples/spark-expo-go/App.tsx`:
- Around line 142-144: persistForm currently serializes the entire
SparkFormState (including mnemonic and apiKey) into AsyncStorage via
AsyncStorage.setItem(STORAGE_KEY, JSON.stringify(value)), which stores secrets
as plaintext; update persistForm to avoid persisting raw secrets and add a
prominent warning comment above the function referencing SparkFormState,
STORAGE_KEY, mnemonic and apiKey that warns "Do not store raw mnemonic/apiKey in
plaintext — use secure storage or encrypt before persisting" and either redact
or remove mnemonic/apiKey before calling AsyncStorage.setItem (or switch to a
secure storage API) so no raw secret values are written to AsyncStorage.
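
A minimal sketch of the redaction, assuming the identifiers named above (the exact SparkFormState shape is not reproduced from the file):

import AsyncStorage from '@react-native-async-storage/async-storage';

type SparkFormState = { mnemonic?: string; apiKey?: string; [key: string]: unknown };
declare const STORAGE_KEY: string;

// WARNING: do not store raw mnemonic/apiKey in plaintext; use secure storage
// (e.g., expo-secure-store) or encrypt before persisting.
async function persistForm(value: SparkFormState): Promise<void> {
  const { mnemonic: _mnemonic, apiKey: _apiKey, ...safe } = value;
  await AsyncStorage.setItem(STORAGE_KEY, JSON.stringify(safe));
}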

In `@bindings/typescript/examples/spark-web/index.html`:
- Around line 233-237: The saveConfig() flow currently persists sensitive data
(mnemonic) to localStorage via localStorage.setItem(STORAGE_KEY,
JSON.stringify(config)); add a visible in-UI warning before or after saving:
render a persistent ⚠️ banner near the form (or modal) explaining that storing
seed phrases in localStorage is insecure (XSS risk) and should only be used for
local demos, and require an explicit user confirmation/checkbox (e.g., "I
understand the risk — persist mnemonic") before allowing saveConfig() to persist
the mnemonic; update getConfigFromForm()/saveConfig() to respect that
confirmation and use setStatus() to surface the warning state to the user.
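
A sketch of the opt-in gate (the checkbox id and the setStatus signature are assumptions):

declare const STORAGE_KEY: string;
declare function setStatus(message: string, isError?: boolean): void;

function saveConfig(config: Record<string, unknown>): void {
  const ack = document.getElementById('persist-mnemonic-ack') as HTMLInputElement | null;
  if (config.mnemonic && !ack?.checked) {
    // Persist everything except the mnemonic until the user confirms the risk.
    const { mnemonic: _omitted, ...safe } = config;
    localStorage.setItem(STORAGE_KEY, JSON.stringify(safe));
    setStatus('Mnemonic not saved: confirm "I understand the risk" first.', true);
    return;
  }
  localStorage.setItem(STORAGE_KEY, JSON.stringify(config));
}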

In `@bindings/typescript/package.json`:
- Around line 45-47: Run an npm pack dry run to confirm tests and secrets are
excluded: use npm pack --dry-run (or npm pack --dry-run --json) and verify
entries do not include src/**/__tests__/**, scripts/examples, or any .env files;
if they appear, ensure package.json "files" remains ["dist"] and add a
.npmignore that explicitly excludes src/**/__tests__/**, scripts/, examples/,
and any *.env or secret files, and confirm tsconfig.build.json already excludes
tests (src/**/__tests__/**) so built artifacts remain safe.

In `@bindings/typescript/src/__tests__/integration/cln.real.test.ts`:
- Line 27: Remove or redact the sensitive console output that prints the full
CLN invoice: locate the console.log('CLN Invoice:', invoice) in cln.real.test.ts
(the test that uses the invoice variable) and either delete the log or replace
it with a non-sensitive message that does not include the full invoice or
payment hash (e.g., log only a masked/hashed identifier or test status). Ensure
no other prints expose the invoice or preimage values.

In `@bindings/typescript/src/__tests__/integration/spark.real.test.ts`:
- Line 56: The test currently prints the full invoice object via
console.log('Spark Invoice:', invoice), which may leak sensitive data; remove
that console.log and instead, if any output is needed for debugging, log a
non-sensitive identifier (e.g., invoice id or a masked/length-only
representation) or use assertions only; update the test in spark.real.test.ts to
stop printing the full invoice variable and reference the invoice value only in
secure/masked form or not at all.
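
For both test files above, a sketch of masked logging (maskInvoice is hypothetical; it assumes the invoice value is a bolt11 string):

// Log only a truncated, non-sensitive identifier, never the full invoice.
function maskInvoice(bolt11: string): string {
  return `${bolt11.slice(0, 12)}… (${bolt11.length} chars)`;
}

// console.log('Spark Invoice:', maskInvoice(invoice)); // instead of the full value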

In `@bindings/typescript/src/nodes/spark.ts`:
- Around line 643-645: The current computation of maxFeeSats uses
Math.floor(params.feeLimitMsat / 1000) which yields 0 for feeLimitMsat < 1000
and can cause payment rejection; update the calculation in the maxFeeSats
assignment (referencing maxFeeSats, params.feeLimitMsat,
this.config.defaultMaxFeeSats, DEFAULT_MAX_FEE_SATS) to ensure at least 1 sat
when a feeLimitMsat is provided (for example use Math.max(1,
Math.ceil(params.feeLimitMsat / 1000)) or Math.max(1, Math.floor(...)) as
appropriate) while leaving the fallback to this.config.defaultMaxFeeSats ??
DEFAULT_MAX_FEE_SATS unchanged.
- Around line 180-191: The unit-checking branch incorrectly treats "sats" as
msats; update the satoshis branch so it matches common variants (e.g., "sat",
"sats") instead of only "satoshi" or exact "sat". Concretely, modify the
condition that currently reads `unit.includes('satoshi') || unit === 'sat'` to
include `unit.includes('sat')` (or otherwise check `unit === 'sat' || unit ===
'sats'`), and ensure the branch for satoshis still returns `Math.floor(amount *
1000)` so sat → msat conversion is correct (use the same identifying variables
`unit` and `amount` from this function).
- Around line 490-527: loadSdk currently caches a rejected promise in
this.sdkPromise so transient failures become permanent; modify loadSdk so that
when the inner async IIFE catches a terminal error (i.e., before throwing the
LniError) it clears this.sdkPromise (set to undefined/null) so subsequent calls
retry, and apply the same pattern to getWallet by clearing walletPromise on
rejection; locate loadSdk, this.sdkPromise, importSparkSdkCandidate,
installPureSparkFrost, and getWallet/walletPromise to implement the change and
ensure thrown LniError still includes the original cause.
- Around line 140-141: The current branch converts a bigint via Number(value)
which silently loses precision for large bigints; replace it with a guarded
conversion in the block where typeof value === 'bigint' (the local variable
value) by checking if value is within safe numeric bounds (e.g. if value >
BigInt(Number.MAX_SAFE_INTEGER) || value < BigInt(Number.MIN_SAFE_INTEGER)) and
if so, either throw a descriptive error or return/preserve the bigint (prefer
preserving bigint through the call chain), otherwise safely return
Number(value); include Number.MAX_SAFE_INTEGER and the identifier value in the
guard and error message so callers can locate and handle oversized balances.
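
A minimal sketch pulling the four spark.ts fixes above together as standalone helpers (identifiers taken from the comments; the file's actual structure is not reproduced):

// 1. msat → sat fee limit: never round a provided limit down to 0.
function maxFeeSatsFrom(feeLimitMsat: number | undefined, fallbackSats: number): number {
  return feeLimitMsat !== undefined ? Math.max(1, Math.ceil(feeLimitMsat / 1000)) : fallbackSats;
}

// 2. Unit parsing: check 'msat' before 'sat' so msats are not multiplied again.
function toMsats(amount: number, unit: string): number {
  const u = unit.toLowerCase();
  if (u.includes('msat')) return Math.floor(amount);       // already msats
  if (u.includes('sat')) return Math.floor(amount * 1000); // sat/sats/satoshi(s) → msats
  return Math.floor(amount);                               // sketch assumption: default to msats
}

// 3. Cached async load that retries after failure instead of caching the rejection.
let sdkPromise: Promise<unknown> | undefined;
function loadSdkOnce(load: () => Promise<unknown>): Promise<unknown> {
  if (!sdkPromise) {
    sdkPromise = load().catch((cause) => {
      sdkPromise = undefined; // clear so the next call retries
      throw cause;            // the real code wraps this in an LniError with the cause
    });
  }
  return sdkPromise;
}

// 4. Guarded bigint → number conversion.
function numberFromBigInt(value: bigint): number {
  if (value > BigInt(Number.MAX_SAFE_INTEGER) || value < BigInt(Number.MIN_SAFE_INTEGER)) {
    throw new Error(`BigInt value ${value} exceeds Number.MAX_SAFE_INTEGER and cannot be safely converted`);
  }
  return Number(value);
}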

In `@bindings/typescript/src/spark-runtime.ts`:
- Around line 48-66: The current shouldAttachApiKeyHeader function makes
sameOriginOnly a no-op in non-browser contexts by returning true when typeof
window === 'undefined'; change this to a safe default: if sameOriginOnly is true
and no browser origin is available, return false (do not attach the key) or add
an optional origin parameter to shouldAttachApiKeyHeader (and callers) to supply
the server-side origin to compare against; use getRequestUrl(input) as before
and compare requestUrl.origin to the explicit origin when provided, otherwise
return false when sameOriginOnly is true and window.location.origin is
unavailable.
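
A sketch of the safe default (the URL derivation is inlined here for self-containment; the real getRequestUrl may differ):

function shouldAttachApiKeyHeader(
  input: RequestInfo | URL,
  sameOriginOnly: boolean,
  explicitOrigin?: string,
): boolean {
  if (!sameOriginOnly) return true;
  const origin =
    explicitOrigin ??
    (typeof window !== 'undefined' ? window.location.origin : undefined);
  if (!origin) return false; // safe default: unknown origin, do not attach the key
  const url =
    typeof input === 'string' ? input : input instanceof URL ? input.href : input.url;
  return new URL(url, origin).origin === origin;
}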

In `@bindings/typescript/src/types.ts`:
- Line 173: The TS type SparkNetwork currently lists
'mainnet'|'regtest'|'testnet'|'signet'|'local' but the Rust get_network() in
crates/lni/spark/lib.rs only handles "mainnet" and "regtest" (defaulting unknown
values to Network::Mainnet), causing mismatches; either narrow the TypeScript
SparkNetwork to only 'mainnet'|'regtest' to match Rust, or extend the Rust
get_network() and Network enum to explicitly handle 'testnet', 'signet', and
'local' (add new enum variants and string-match branches in get_network() to
return the correct Network variant instead of defaulting to Mainnet).
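
Option A from the comment, narrowing the TypeScript type to what the Rust side currently handles, is a one-line change:

// Narrowed to match get_network() in crates/lni/spark/lib.rs today; widen this
// union again only once the Rust Network enum handles the extra variants.
export type SparkNetwork = 'mainnet' | 'regtest';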
🧹 Nitpick comments (13)
bindings/typescript/scripts/spark-shims/bare-crypto.js (1)

1-11: Looks good overall — clean shim pattern.

One tiny nit: the error message on line 5 says "in this browser runtime", but globalThis.crypto can also be absent in non-browser environments (e.g., Node < 15 without the --experimental-global-webcrypto flag, or edge runtimes). Consider a more generic message like "globalThis.crypto is not available in this runtime.".

Suggested tweak
-      throw new Error('globalThis.crypto is not available in this browser runtime.');
+      throw new Error('globalThis.crypto is not available in this runtime.');
bindings/typescript/scripts/spark-shims/node-https.js (1)

5-7: Consider shimming https.get as well for completeness.

https.get is the other commonly used method in the Node https module. If any transitive dependency calls https.get(...) in this browser build, it will silently receive undefined instead of a clear error.

Suggested addition
 const https = {
   request: notSupported,
+  get: notSupported,
 };
bindings/typescript/scripts/spark-shims/bare-fetch-headers.js (1)

1-7: Inconsistent error timing compared to bare-fetch.js.

bare-fetch.js defers the error to call-time (stores null, throws when invoked), while this shim throws eagerly at module evaluation. If Headers is unavailable but never actually used, this import will still crash the app. Consider aligning the pattern — either both eager or both lazy — for consistent behavior.

♻️ Lazy-check variant (matching bare-fetch.js pattern)
 const BareHeaders = globalThis.Headers;
 
-if (typeof BareHeaders !== 'function') {
-  throw new Error('globalThis.Headers is not available in this browser runtime.');
-}
-
-export default BareHeaders;
+export default function getHeaders(...args) {
+  if (typeof BareHeaders !== 'function') {
+    throw new Error('globalThis.Headers is not available in this browser runtime.');
+  }
+  return new BareHeaders(...args);
+};

Note: This changes the export shape from a constructor to a factory function, so only adopt if callers can accommodate it. Otherwise, align bare-fetch.js to also throw eagerly instead.

bindings/typescript/scripts/spark-shims/vitest.js (1)

1-8: The apply trap is dead code — the proxy target is not callable.

The apply trap only fires when a Proxy is invoked as a function, which requires the target to be a function. Since the target is {}, the proxy can never be apply-trapped. The shim works correctly anyway because all vitest assertion chains go through property access (get trap), but apply on Line 6 is unreachable.

🧹 Optional cleanup
 const createNoopExpectation = () =>
   new Proxy(
-    {},
+    function () {},
     {
       get: () => (..._args) => createNoopExpectation(),
       apply: () => createNoopExpectation(),
     },
   );

Or simply remove the apply trap if you want to keep the plain-object target.

bindings/typescript/src/vendor/frosts-bridge.js (1)

1-23: Duplicate of scripts/spark-vendor-entries/frosts-bridge.js.

This file is identical to the vendor-entry copy. During build, tsc emits this to dist/vendor/frosts-bridge.js, then the vendor build script overwrites it with the esbuild-bundled version from the scripts entry. Consider importing from a single source or adding a comment explaining why both copies exist, to prevent them from drifting apart.

bindings/typescript/scripts/spark-shims/node-events.cjs (1)

17-25: Minor behavioral difference: off removes all matching listeners, not just the first.

Node's removeListener removes only the first occurrence of a listener. This filter approach removes all occurrences. This is likely fine for the shim's use case but worth noting if any Spark SDK code registers the same listener multiple times.

bindings/typescript/scripts/build-spark-vendor.mjs (2)

22-24: Fragile string-literal patching — consider a regex or AST-based approach.

The exact-string match on the BareHttpTransport call pattern will silently fail if upstream @buildonspark/spark-sdk reformats the line (e.g., whitespace changes, variable renames). The validation on lines 84-90 catches this failure at build time, which is good, but a regex with flexible whitespace would be more resilient:

const browserTransportPattern = /return\s+new\s+ConnectionManagerBrowser\s*\(\s*config\s*,\s*BareHttpTransport\s*\(\s*\)\s*\)/;

You'd then use source.replace(pattern, replacement) which also only replaces the first match — consistent with current behavior.


37-73: logLevel: 'silent' suppresses esbuild warnings — may hide resolution or bundling issues.

If a shim alias is misconfigured or a dependency can't be resolved, esbuild warnings won't surface until a downstream validation fails (or worse, doesn't catch the problem). Consider using logLevel: 'warning' or 'info' to surface build-time issues directly.

bindings/typescript/package.json (1)

67-77: New production dependencies added for Spark integration — consider documenting alpha package risks.

Several new dependencies were introduced for the Spark integration. Notable concerns:

  • @frosts/core and @frosts/secp256k1-tr at ^0.2.2-alpha.3 are alpha pre-release packages. These are the latest available versions on npm with no stable releases. Alpha APIs may change without notice. The ^ caret range for 0.x pre-release versions is relatively narrow in practice, but upgrades will still require manual attention and testing.
  • @buildonspark/spark-sdk at ^0.6.1 is the core Spark dependency and is available as a stable version on npm.

Consider documenting the intentional use of alpha dependencies in the project or lockfile so future maintainers understand the stability trade-off and are prepared for potential breaking changes in these packages.

bindings/typescript/src/spark-runtime.ts (1)

162-211: restore() only reverts globalThis.fetch but not the Buffer/btoa/atob polyfills also installed on globalThis.

Lines 171–182 install Buffer, btoa, and atob on globalThis, but restore() (lines 205–208) only restores fetch. This creates asymmetric cleanup — calling restore() doesn't fully undo the runtime mutations.

This is likely acceptable since polyfills are generally safe to leave installed, but it could surprise consumers who expect restore() to be a full teardown (e.g., in test harnesses).

♻️ Optional: track and restore all globals
 export function installSparkRuntime(options: SparkRuntimeOptions = {}): SparkRuntimeHandle {
   const runtime = globalThis as FetchRuntime;
   const previousFetch = runtime.fetch;
+  const previousBuffer = runtime.Buffer;
+  const previousBtoa = runtime.btoa;
+  const previousAtob = runtime.atob;

   // ... rest of setup ...

   return {
     fetch: baseFetch,
     restore() {
       if (options.setGlobalFetch !== false) {
         runtime.fetch = previousFetch;
       }
+      if (options.installBuffer !== false) {
+        runtime.Buffer = previousBuffer;
+      }
+      if (options.installBase64 !== false) {
+        runtime.btoa = previousBtoa;
+        runtime.atob = previousAtob;
+      }
     },
   };
 }
docs/SPARK_EXPO_PURE_TS_PLAN.md (1)

42-48: Plan SparkConfig fields diverge from the actual implementation.

The plan lists apiKey, storage, sspBaseUrl, electrsUrl, and operatorOverrides as direct config fields, but the implemented SparkConfig in types.ts uses accountNumber, sdkEntry, defaultMaxFeeSats, and a generic sparkOptions bag instead. Consider updating this section to reflect the current design so future readers aren't misled.

bindings/typescript/examples/spark-expo-go/App.tsx (1)

112-118: Runtime is reinstalled on every refresh — intentional but worth noting.

setupRuntime is called inside both connectWallet and refresh, and each call tears down the previous runtime and installs a new one. This works correctly but is redundant when the API key hasn't changed between calls. Fine for a demo, but a production version should only reinstall when the key changes.

Also applies to: 186-196
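
A sketch of key-aware reinstalling (the wrapper and import path are hypothetical; installSparkRuntime and SparkRuntimeHandle are the exports this PR adds):

import { installSparkRuntime, type SparkRuntimeHandle } from '../../src/spark-runtime'; // path assumed for this sketch

let handle: SparkRuntimeHandle | undefined;
let installedApiKey: string | undefined;

function setupRuntimeOnce(apiKey?: string): SparkRuntimeHandle {
  if (handle && installedApiKey === apiKey) return handle; // key unchanged, reuse
  handle?.restore(); // tear down the previous runtime first
  handle = installSparkRuntime(apiKey ? { apiKey } : {});
  installedApiKey = apiKey;
  return handle;
}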

bindings/typescript/src/nodes/spark.ts (1)

230-233: Multiple helper functions return any — consider narrowing return types.

Functions like signingCommitmentFromBinding, identifierFromHex, buildUserKeyPackage, and buildSigningPackage all return any. While understandable given the untyped vendor bridge, this disables type-checking across the entire Frost signing path. Even a minimal branded type alias (e.g., type FrostCommitments = ReturnType<typeof FrostSigningCommitments['prototype']['constructor']>) would catch misuse.

coderabbitai bot left a comment


Actionable comments posted: 5

🤖 Fix all issues with AI agents
In `@bindings/typescript/src/nodes/spark.ts`:
- Around line 770-779: The code currently returns fallbackSerialized after
emitting diagnostics when no z-candidate passes
validateOutboundAdaptorSignatureLocal; instead, change the flow to throw an
LniError (include a clear message and attach candidateDiagnostics) to fail fast
rather than returning the known-bad fallbackSerialized; keep the
emitSparkDebugCheckpoint('aggregate_frost:adaptor_validation_failed', {
candidates: candidateDiagnostics }) call for telemetry, then throw new LniError
with context (e.g., referencing fallbackSerialized or bigIntToFixedBytes for
clarity) so callers receive an explicit failure instead of a silent invalid
signature submission.
- Around line 1073-1108: The createInvoice implementation currently computes
amountSats with Math.floor(params.amountMsats / 1000) which converts positive
sub-1000 msat values to 0 (creating an unintended amountless invoice); update
the logic around amountSats in createInvoice so that when params.amountMsats is
provided and > 0 you use Math.max(1, Math.floor(params.amountMsats / 1000)) (and
keep 0 only when amountMsats is undefined or explicitly 0), referencing the
amountSats variable and params.amountMsats to ensure sub-satoshi positive
amounts become at least 1 sat rather than 0.
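
The amountSats guard from the second item, sketched as a standalone helper:

// Positive sub-satoshi amounts become at least 1 sat; an amountless invoice
// (0) results only when amountMsats is undefined or explicitly 0.
function invoiceAmountSats(amountMsats?: number): number {
  return amountMsats && amountMsats > 0 ? Math.max(1, Math.floor(amountMsats / 1000)) : 0;
}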

In `@bindings/typescript/src/spark-runtime.ts`:
- Around line 232-244: The installSparkRuntime behavior currently treats
options.setGlobalFetch as true when omitted, causing runtime.fetch (baseFetch)
to overwrite globalThis.fetch silently and restore() only reverts fetch; change
the logic so global fetch is only replaced when setGlobalFetch === true (i.e.,
opt-in) by updating the condition that assigns runtime.fetch (and the returned
fetch) to check for strict true, and extend the restore() implementation to also
revert any globals you install (previous Buffer, btoa, atob values captured when
installSparkRuntime runs) so restore() performs a full rollback of
Buffer/btoa/atob as well as runtime.fetch/previousFetch.

In `@docs/SPARK_EXPO_PURE_TS_PLAN.md`:
- Around line 153-156: The Phase 4 section "Security + persistence hardening"
currently defers handling of the mnemonic, leaving caller responsibilities
unclear; update the document so callers know whether LNI will persist the
mnemonic or they must, and provide concrete guidance: either (A) implement the
original security plan described in the Phase 4 draft (restore the "security
plan" details referenced in lines ~171-183 into the Phase 4 section and describe
how the SDK handles storage/encryption/secret rotation), or (B) explicitly state
caller responsibilities by adding a subsection under "Phase 4 — Security +
persistence hardening" that warns callers they must securely persist mnemonics,
lists recommended practices (encrypted storage, OS keychain, rotation,
least-privilege access), and example threat mitigations; reference the term
"mnemonic" and the Phase 4 heading so reviewers can locate the change.
- Line 3: Update the document status to accurately reflect remaining work:
change the "**Status: COMPLETE**" line to something like "**Status: IN PROGRESS
/ PENDING BROWSER/EXPO TESTS**" and add a short note referencing the outstanding
action "Test payInvoice in browser and Expo Go environments (currently only
tested in Node.js integration tests)"; ensure the term `payInvoice` is mentioned
and the follow-up item remains visible so readers know browser and Expo Go
verification is still required before marking complete.
🧹 Nitpick comments (9)
bindings/typescript/src/vendor/frosts-bridge.d.ts (1)

1-20: All exports are typed as any — consider narrowing over time.

This is understandable for an ambient declaration over a vendored WASM bridge where upstream types may not be published. However, the blanket any means callers get zero type-checking on FROST operations, which are security-critical (signing, key handling).

As a follow-up, consider:

  • Adding minimal callable/constructable signatures for the most-used symbols (e.g., aggregateWithTweak, round2).
  • Or re-exporting upstream .d.ts files if/when @frosts/core publishes them.

No action required now, but worth tracking.

bindings/typescript/examples/spark-expo-go/App.tsx (1)

125-134: appendSendDebugCheckpoint captures sendDebugEnabled by closure — toggling debug off won't stop in-flight callbacks.

The useCallback dependency on sendDebugEnabled means a new closure is created when the flag changes, but any already-dispatched async code (e.g., the global __LNI_SPARK_DEBUG__.emit) still holds the old closure reference where sendDebugEnabled was true. This is mitigated by the effect at Lines 166-190 replacing the global emit, so it's minor — but worth noting.
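
A standard React ref pattern that sidesteps the stale closure, as a sketch:

import { useEffect, useRef, type MutableRefObject } from 'react';

// Long-lived callbacks read ref.current and always see the latest value,
// instead of the one captured when the closure was created.
function useLatest<T>(value: T): MutableRefObject<T> {
  const ref = useRef(value);
  useEffect(() => {
    ref.current = value;
  }, [value]);
  return ref;
}

// Usage sketch: const sendDebugEnabledRef = useLatest(sendDebugEnabled);
// then check sendDebugEnabledRef.current inside appendSendDebugCheckpoint.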

bindings/typescript/examples/spark-web/index.html (1)

345-348: Unnecessary re-throw in connectWallet catch block.

connectWallet re-throws the error after calling setStatus, but the caller at Line 433 swallows it with .catch(() => undefined). The other async handlers (refreshData, payInvoiceTest) don't re-throw. Remove for consistency.

Proposed fix
       } catch (error) {
         setStatus(`connect failed: ${error?.message ?? String(error)}`, true);
-        throw error;
       }
bindings/typescript/src/__tests__/spark-node.test.ts (1)

55-58: Overriding private getWallet via type cast is fragile but acceptable for unit tests.

If getWallet is renamed or its signature changes, these tests will silently stop testing the intended behavior. Consider adding a brief comment noting this coupling.

Also applies to: 84-87

bindings/typescript/README.md (1)

101-136: Excellent technical writeup of the pure TS FROST signer.

This section is a valuable reference for contributors and users who need to understand the signer architecture. The documentation of the three cryptographic bugs discovered is particularly useful.

Nit on Line 111: "round2" → "round 2" for readability (flagged by static analysis).

bindings/typescript/src/nodes/spark.ts (1)

1186-1217: Payment polling uses hardcoded 60s timeout and 2s interval — not configurable.

The polling loop at lines 1186–1217 uses fixed maxPollMs = 60_000 and pollIntervalMs = 2_000. For production use, callers may need shorter timeouts (mobile UX) or longer ones (high-value payments on congested network). Consider exposing these as optional fields on PayInvoiceParams or SparkConfig.

bindings/typescript/package.json (1)

67-78: Alpha dependencies and heavy dependency footprint for optional spark-runtime.

Two concerns:

  1. Alpha dependencies: @frosts/core and @frosts/secp256k1-tr are pinned to ^0.2.2-alpha.3. The caret range means any 0.2.x-alpha.y where y >= 3 could be resolved, including potentially breaking alpha releases. Consider pinning to an exact version until these packages reach stable.

  2. Dependency weight: This adds ~12 new runtime dependencies (crypto libs, buffer polyfill, SDK, etc.). Since these are only needed for Spark, consumers who only use NWC/Phoenixd/etc. will still pull them. The ./spark-runtime entry point suggests intent to separate concerns, but verify that bundlers can actually tree-shake these imports when only the "." entry point is used — otherwise consider making some of these peerDependencies or documenting the install-size impact.

docs/SPARK_EXPO_PURE_TS_PLAN.md (2)

171-183: Clarify security plan status: recommendation vs requirement.

The security plan section describes a detailed encryption/storage approach using AES-GCM, key derivation, etc., but Phase 4 (line 153) marks this as "DEFERRED" with storage "handled by the Spark SDK's internal layer." This creates ambiguity about whether this security plan:

  • Should be implemented by callers of the LNI library
  • Is handled internally by the Spark SDK
  • Is aspirational/future work

Consider adding a note at the beginning of this section clarifying its status (e.g., "Recommended approach for applications using SparkNode" or "Future work: not currently implemented").


93-93: Protocol compatibility is practically validated but lacks explicit test vectors.

The document correctly notes that manual aggregation bypasses @frosts internal functions to match Spark's WASM behavior. However, no deterministic FROST test vectors or interop tests exist to formally verify this. End-to-end payment success with real mainnet backend provides indirect evidence that signatures are accepted, but the document itself lists "Add deterministic FROST signing test vectors to catch regressions without a live backend" as a remaining follow-up (lines 200–207). Consider adding test vectors for:

  • Nested FROST with singleton user group (lambda=1)
  • BIP-340 Schnorr signature format
  • Adaptor signature construction

This would catch regressions without relying on live backend availability.

coderabbitai bot left a comment


Actionable comments posted: 2

🤖 Fix all issues with AI agents
In `@bindings/typescript/examples/spark-expo-go/App.tsx`:
- Line 404: The UI currently exposes the sensitive payment.preimage in
setSendResult; change this to avoid displaying the full preimage by either
removing it from the visible send result or replacing it with a masked/truncated
value (e.g., show only the first 8 hex chars + ellipsis) and keep the full
preimage only in a debug-only log path; update the code that calls setSendResult
(look for setSendResult(...) and references to payment.preimage) to pass the
masked/truncated string instead and/or gate full preimage logging behind a debug
flag so the UI never shows the full secret.
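
A sketch of the masking approach (maskSecret is hypothetical; payment.preimage is the field named above):

// Show only a short prefix of the preimage in the UI; the full value should
// reach, at most, a debug-gated log path.
function maskSecret(hex: string | undefined): string {
  return hex ? `${hex.slice(0, 8)}…` : 'n/a';
}

// setSendResult(`Paid. preimage: ${maskSecret(payment.preimage)}`);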

In `@bindings/typescript/examples/spark-web/index.html`:
- Around line 420-436: The tx list rendering and showTxDetail use innerHTML with
unescaped network-supplied fields (e.g., tx.description, paymentHash) which
enables XSS; update the rendering to stop concatenating untrusted strings into
innerHTML (el.txList and the showTxDetail function) and instead build DOM nodes
via createElement/set textContent or use a single escapeHtml(text) helper to
escape <, >, &, and both quote characters; replace all interpolations of tx.description, tx.memo,
tx.paymentHash and similar with the escaped/textContent values and ensure
formatSats/formatTime outputs are also treated as text when inserted into the
DOM.
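
A minimal escapeHtml in the spirit of the comment (prefer textContent where possible):

// Escapes &, <, >, " and '. Note & must be replaced first.
function escapeHtml(text: unknown): string {
  return String(text)
    .replace(/&/g, '&amp;')
    .replace(/</g, '&lt;')
    .replace(/>/g, '&gt;')
    .replace(/"/g, '&quot;')
    .replace(/'/g, '&#39;');
}

// Safer still: build nodes and assign textContent instead of innerHTML, e.g.
// const cell = document.createElement('td'); cell.textContent = tx.description;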
🧹 Nitpick comments (5)
bindings/typescript/examples/spark-expo-go/App.tsx (2)

214-219: Silent catch in disconnectNode swallows all errors.

catch {} hides any cleanup failure. At minimum, log it to the debug trace so connection teardown issues are diagnosable.

Suggested change
-    try { await node.cleanupConnections(); } catch {}
+    try { await node.cleanupConnections(); } catch (e) {
+      console.warn('cleanupConnections error:', e);
+    }

137-186: Large number of useState hooks in a single component — consider grouping related state.

The App component has 18+ individual useState calls. For a demo/example app this is acceptable, but it increases cognitive load. A useReducer for send/receive/UI state groups would make state transitions more explicit and the component easier to follow.

bindings/typescript/examples/spark-web/index.html (3)

518-521: setupRuntime is called on every refreshData — is this intentional?

Each call to refreshData (line 521) calls setupRuntime, which invokes sparkRuntimeHandle?.restore() then re-installs the runtime. If installSparkRuntime performs global patching (e.g., monkey-patching fetch), doing this on every refresh could be wasteful or interfere with in-flight requests. Consider installing the runtime once during connectWallet and only re-installing when the API key actually changes.


652-656: Misleading variable name open — it actually means "is now hidden".

classList.toggle('hidden') returns true when the class was added (i.e., the element is now hidden). The variable open reads as the opposite of what it holds. Consider renaming for clarity:

-      const open = el.advancedFields.classList.toggle('hidden');
-      el.advancedToggle.innerHTML = (open ? '&#9656;' : '&#9662;') + ' Advanced settings';
+      const isHidden = el.advancedFields.classList.toggle('hidden');
+      el.advancedToggle.innerHTML = (isHidden ? '&#9656;' : '&#9662;') + ' Advanced settings';

686-689: beforeunload handler — cleanupConnections is async but the result is voided.

beforeunload does not wait for async work; the void state.node.cleanupConnections() call will almost certainly be aborted by the browser before it completes. This is a known browser limitation and likely acceptable here, but worth noting that this cleanup is best-effort only.

coderabbitai bot left a comment


Actionable comments posted: 3

🤖 Fix all issues with AI agents
In `@bindings/typescript/src/nodes/spark.ts`:
- Around line 1190-1224: The polling loop currently uses a hardcoded maxPollMs =
60_000 which ignores PayInvoiceParams.timeoutSeconds; update the code around the
while loop that references maxPollMs, startedAt, and pollIntervalMs so maxPollMs
is computed from PayInvoiceParams.timeoutSeconds (e.g. const maxPollMs =
(params?.timeoutSeconds ?? 60) * 1000) and ensure it is at least pollIntervalMs
or a sane minimum; keep the rest of the logic
(wallet.getLightningSendRequest(requestId), terminalStatuses check, preimage/fee
assignment, and LniError handling) unchanged so callers who set timeoutSeconds
actually control the polling timeout.
- Around line 777-779: Replace the plain Error thrown during adaptor signature
validation with an LniError so it matches other error paths; locate the throw
that uses candidateDiagnostics (the message "Adaptor signature validation
failed: no z-candidate passed validation ...") and change it to throw a new
LniError with the same descriptive message and an appropriate error code (e.g.,
"ADAPTOR_SIGNATURE_VALIDATION_FAILED") so the error name and code fields are
preserved consistent with the rest of the file.
- Around line 912-916: Replace the direct multiplication of item.totalValue by
1000 with the helper that handles CurrencyAmount objects: use
mapCurrencyAmountToMsats(item.totalValue) instead of
numberFromUnknown(item.totalValue) * 1000 in the object where
preimage/paymentHash/amountMsats/feesPaid/createdAt are set; this mirrors the
safe usage already applied for userRequest.fee and ensures CurrencyAmount
structures are converted correctly to msats.
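
Sketches for the first two items above (the LniError constructor shape is an assumption; the third item just swaps in the existing mapCurrencyAmountToMsats helper):

// Assumed shape; the real LniError signature may differ.
declare class LniError extends Error {
  constructor(code: string, message: string);
}

// 1. Respect PayInvoiceParams.timeoutSeconds, clamped to at least one interval.
function pollWindowMs(timeoutSeconds: number | undefined, pollIntervalMs = 2_000): number {
  return Math.max(pollIntervalMs, (timeoutSeconds ?? 60) * 1000);
}

// 2. Fail fast with a typed error instead of a plain Error.
function failAdaptorValidation(candidateDiagnostics: unknown): never {
  throw new LniError(
    'ADAPTOR_SIGNATURE_VALIDATION_FAILED',
    `Adaptor signature validation failed: no z-candidate passed validation (${JSON.stringify(candidateDiagnostics)})`,
  );
}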
🧹 Nitpick comments (5)
bindings/typescript/scripts/spark-shims/bare-crypto.js (1)

1-8: subtle: undefined yields a cryptic TypeError when accessed

When globalThis.crypto is absent, calling e.g. crypto.subtle.digest(...) will throw a generic TypeError: Cannot read properties of undefined instead of the clear message you provide for getRandomValues. For consistency, consider a getter that throws a descriptive error too.

♻️ Suggested improvement
 const cryptoImpl =
   globalThis.crypto ??
   {
     getRandomValues(array) {
       throw new Error('globalThis.crypto is not available in this runtime.');
     },
-    subtle: undefined,
+    get subtle() {
+      throw new Error('globalThis.crypto.subtle is not available in this runtime.');
+    },
   };
bindings/typescript/src/spark-runtime.ts (1)

201-250: installSparkRuntime wiring: apiKeySameOriginOnly defaults to true — good security posture.

Line 233: options.apiKeySameOriginOnly !== false means same-origin enforcement is on by default when an API key is provided. This is the right default for browser contexts.

However, note that options.apiKey on line 227 is checked via ?.trim(), but the untrimmed value is passed to withHeaderFetch on line 231. withHeaderFetch trims internally (line 180), so this is safe but slightly inconsistent.

Consistency nit
     if (options.apiKey?.trim()) {
       baseFetch = withHeaderFetch(
         baseFetch,
         options.apiKeyHeader?.trim() || 'x-api-key',
-        options.apiKey,
+        options.apiKey.trim(),
         {
           sameOriginOnly: options.apiKeySameOriginOnly !== false,
         },
       );
     }
bindings/typescript/src/nodes/spark.ts (1)

669-682: createUserIdentifier() called twice in pureAggregateFrost — minor redundancy.

Lines 673 and 681 each call createUserIdentifier(). Since this is deterministic (derives from the fixed string 'user'), the results are identical. Consider calling it once and reusing.

Proposed fix
   try {
+    const selfIdentifier = createUserIdentifier();
     const signingPackage = buildSigningPackage(
       params.message,
       params.selfCommitment,
-      createUserIdentifier(),
+      selfIdentifier,
       params.statechainCommitments,
     );

     const signatureShares = new Map<any, any>();
     for (const [identifier, shareBytes] of Object.entries(params.statechainSignatures ?? {})) {
       signatureShares.set(identifierFromHex(identifier), FrostSignatureShare.deserialize(Secp256K1Sha256TR, shareBytes));
     }
-    const selfIdentifier = createUserIdentifier();
     signatureShares.set(selfIdentifier, FrostSignatureShare.deserialize(Secp256K1Sha256TR, params.selfSignature));
bindings/typescript/examples/spark-web/index.html (2)

696-700: Advanced toggle arrow logic is correct but the variable name open is misleading.

classList.toggle('hidden') returns true when the class is added (fields now hidden), so open is true when fields are closed. The arrow rendering is correct but the semantics are inverted from what the name suggests. Consider renaming to isHidden or inverting with !.

Clearer naming
      el.advancedToggle.addEventListener('click', () => {
-        const open = el.advancedFields.classList.toggle('hidden');
-        el.advancedToggle.innerHTML = (open ? '&#9656;' : '&#9662;') + ' Advanced settings';
+        const collapsed = el.advancedFields.classList.toggle('hidden');
+        el.advancedToggle.innerHTML = (collapsed ? '&#9656;' : '&#9662;') + ' Advanced settings';
      });

769-773: Auto-connect on page load with saved mnemonic may be surprising.

If a mnemonic was previously saved, the demo auto-connects on load (line 771-772). This could be confusing if the user lands on the page expecting the login form. Consider showing the login view briefly or adding a small delay/confirmation.

coderabbitai bot left a comment


Actionable comments posted: 2

🤖 Fix all issues with AI agents
In `@bindings/typescript/src/internal/polling.ts`:
- Around line 39-47: The polling loop can spin tightly when elapsed() <
maxDurationMs but elapsed() + remainingSleep >= maxDurationMs; update the logic
in the loop (around lookupDuration, remainingSleep, delayMs, elapsed(),
maxDurationMs, and sleep()) to sleep for the lesser of remainingSleep and
(maxDurationMs - elapsed()) instead of executing a bare continue — i.e., compute
allowedSleep = Math.max(0, Math.min(remainingSleep, maxDurationMs - elapsed()))
and await sleep(allowedSleep) when > 0 so the loop yields until the overall
timeout is reached.
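
A self-contained sketch of the clamped-sleep loop, using the names from the comment:

async function pollUntilDone(
  lookup: () => Promise<boolean>,
  maxDurationMs: number,
  delayMs: number,
): Promise<boolean> {
  const startedAt = Date.now();
  const elapsed = () => Date.now() - startedAt;
  const sleep = (ms: number) => new Promise<void>((resolve) => setTimeout(resolve, ms));
  while (elapsed() < maxDurationMs) {
    const lookupStarted = Date.now();
    if (await lookup()) return true;
    const lookupDuration = Date.now() - lookupStarted;
    const remainingSleep = delayMs - lookupDuration;
    // Sleep the lesser of the remaining interval and the time left overall,
    // so the loop yields until the deadline instead of spinning tightly.
    const allowedSleep = Math.max(0, Math.min(remainingSleep, maxDurationMs - elapsed()));
    if (allowedSleep > 0) await sleep(allowedSleep);
  }
  return false;
}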

In `@bindings/typescript/src/nodes/spark.ts`:
- Around line 144-159: The bigint-overflow branch in numberFromUnknown currently
throws a plain Error; change it to throw an LniError with the same message used
for other error paths so callers catching LniError will handle it; update the
throw in numberFromUnknown to use new LniError(...) (and add/import LniError if
it's not present) keeping the descriptive text (`BigInt value ${value} exceeds
... cannot be safely converted`) consistent with existing error messages.
🧹 Nitpick comments (4)
bindings/typescript/src/internal/polling.ts (1)

32-36: Error swallowing: all lookup errors are now treated as 'pending'.

This is a reasonable resilience improvement for transient failures, but note that persistent errors (e.g., misconfigured endpoint, auth failure) will now silently retry until timeout rather than surfacing early. Consider whether a retry counter or distinguishing transient vs. permanent errors would be worthwhile in a follow-up.

bindings/typescript/examples/spark-expo-go/App.tsx (1)

164-164: State variable name recvAmountMsats is misleading — it holds sats, not msats.

The UI label on Line 591 says "Amount (sats)" and the conversion on Line 383 does Math.floor(parsed) * 1000 (sats → msats), but the state variable is named recvAmountMsats. Consider renaming to recvAmountSats for clarity.

bindings/typescript/src/nodes/spark.ts (2)

255-281: buildUserKeyPackage: tweaked is computed only to extract verifyingKey — consider adding a brief comment.

Line 271 computes tweaked from tweakKeyPackage(base, ...) but only tweaked.verifyingKey is used (Line 278), while all other fields come from evenY. A short comment explaining this intentional split (even-Y key adjustments + tweaked verifying key) would help future readers of this crypto-heavy code.


660-787: pureAggregateFrost creates a second selfIdentifier instead of reusing the one from buildSigningPackage.

Line 673 calls createUserIdentifier() (passed into buildSigningPackage), then Line 681 calls createUserIdentifier() again. These should produce identical identifiers since both derive from the same static seed 'user', so this is functionally correct. However, creating it once and reusing it would be cleaner and avoid any doubt.

ntheile merged commit 003c7d0 into master on Feb 8, 2026
1 check passed