67 Commits

Author SHA1 Message Date
Siavash Sameni
073756ed4b fix: auto-switch decoder codec to match incoming packets
The CallDecoder now inspects each incoming packet's codec_id and
automatically switches the audio decoder if it differs from the
current profile. This enables cross-codec interop where one client
sends Opus and the other sends Codec2 — previously the receiver
would try to decode with the wrong codec, producing garbled audio.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-07 15:25:24 +04:00
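A minimal sketch of the per-packet check this commit describes; the `CodecId` variants, struct fields, and `ensure_codec` name are illustrative stand-ins, not the real `CallDecoder` API:

```rust
// Hypothetical sketch: switch the decoder when a packet's codec id differs
// from the currently active profile. Names here are assumptions.
#[derive(Clone, Copy, PartialEq, Debug)]
enum CodecId {
    Opus,
    Codec2_3200,
    Codec2_1200,
}

struct CallDecoder {
    active: CodecId,
    switches: u32, // how many times the decoder was rebuilt
}

impl CallDecoder {
    fn new(initial: CodecId) -> Self {
        Self { active: initial, switches: 0 }
    }

    /// Inspect the incoming packet's codec id; rebuild only on change.
    fn ensure_codec(&mut self, incoming: CodecId) {
        if incoming != self.active {
            // The real client would tear down and recreate the underlying
            // audio decoder for the new codec here.
            self.active = incoming;
            self.switches += 1;
        }
    }
}

fn main() {
    let mut dec = CallDecoder::new(CodecId::Opus);
    dec.ensure_codec(CodecId::Opus); // same codec: no rebuild
    dec.ensure_codec(CodecId::Codec2_3200); // cross-codec packet: switch
    println!("active = {:?}, switches = {}", dec.active, dec.switches);
}
```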
Siavash Sameni
2fcc2d77cf feat: add --profile/--codec flag to CLI for forcing codec selection
Enables debugging Codec2 by allowing forced codec selection from CLI.
Supports: good, degraded, catastrophic, codec2-3200, codec2-1200.
Frame size, timing, and jitter buffer are all adjusted dynamically
based on the selected profile.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-07 15:23:36 +04:00
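The profile-to-frame-timing mapping can be sketched as a pure function. The only grounded values here are the standard Codec2 frame durations (20 ms at 3200 bps, 40 ms at 1200 bps) and Opus's usual 20 ms tick; the function name and the assumption that the Opus-based profiles all use 20 ms are illustrative:

```rust
// Illustrative mapping from the CLI profile names to a frame duration.
// Codec2 3200 bps uses 20 ms frames and 1200 bps uses 40 ms frames (standard
// Codec2 modes); the Opus profiles are assumed to run a 20 ms tick.
fn frame_ms(profile: &str) -> Option<u32> {
    match profile {
        "good" | "degraded" | "catastrophic" => Some(20),
        "codec2-3200" => Some(20),
        "codec2-1200" => Some(40),
        _ => None, // unknown profile: reject at CLI parse time
    }
}

fn main() {
    for p in ["good", "codec2-3200", "codec2-1200"] {
        println!("{p}: {:?} ms per frame", frame_ms(p));
    }
}
```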
Siavash Sameni
f7ccb67b02 fix: desktop ping closes endpoint properly, prevents resource leaks
Some checks failed
Mirror to GitHub / mirror (push) Failing after 39s
Build Release Binaries / build-amd64 (push) Failing after 3m46s
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-07 15:00:32 +04:00
Siavash Sameni
4df08eadbd fix: don't block connect on offline ping — always allow connection attempt

Server may be reachable even if ping failed (transient timeout).
User should always be able to try connecting. Fingerprint change
still shows confirm dialog (accept/reject).

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-07 14:20:38 +04:00
Siavash Sameni
6d776097c8 feat: relay ping handling, identity persistence, linux build script (backport)
Backported from feat/android-voip-client:
- Relay: SNI "ping" connections handled gracefully (no timeout errors)
- Relay: identity persisted in ~/.wzp/relay-identity (stable fingerprint)
- Linux fire-and-forget build script (Hetzner VM)

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-07 11:45:27 +04:00
Siavash Sameni
9f7962a6cd fix: vec allocation for desktop AudioRing (match Android fix)
Same fix as Android: Box::new([0i16; 16384]) allocates 32KB on the
stack before moving to heap. Use vec![0i16; 16384].into_boxed_slice()
for direct heap allocation.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-07 05:26:59 +04:00
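The two allocation patterns from the message, side by side. The `heap_ring` helper name is illustrative; the size and element type come from the commit:

```rust
// Box::new([0i16; 16384]) constructs the 32 KB array on the stack first and
// then moves it into the box; vec![0i16; n].into_boxed_slice() allocates the
// zeroed buffer directly on the heap, which is the fix described above.
fn heap_ring(n: usize) -> Box<[i16]> {
    vec![0i16; n].into_boxed_slice()
}

fn main() {
    let ring = heap_ring(16384);
    println!(
        "allocated {} samples ({} KB) directly on the heap",
        ring.len(),
        ring.len() * std::mem::size_of::<i16>() / 1024
    );
}
```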
Siavash Sameni
8c9befb15d ci: skip build on CI-only file changes
Add paths-ignore for .gitea/** so build.yml doesn't waste runner time
when only workflow files are modified.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-06 22:12:32 +04:00
Siavash Sameni
3f869a4cd7 ci: add GitHub mirror workflow
Automatically pushes branches and tags to github.com:manawenuz/wzp.git
on every push to Forgejo. Uses GH_SSH_KEY secret for authentication.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-06 19:50:39 +04:00
Siavash Sameni
2263e898e5 fix: port AudioRing reader-detects-lap fix to desktop client
Same fix as Android (4af7c5f): writer never touches read_pos,
reader self-corrects when lapped. Power-of-2 capacity (16384),
bitmask indexing, overflow/underrun counters.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-06 13:42:33 +04:00
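A single-threaded sketch of the ring-buffer contract this commit ports: the writer never touches read_pos, and the reader jumps forward itself when lapped. Power-of-2 capacity lets a bitmask replace the modulo. The real AudioRing uses atomics across two threads; this version only shows the cursor arithmetic:

```rust
// Minimal SPSC-style ring sketch (cursor logic only, no atomics).
struct AudioRing {
    buf: Box<[i16]>,
    write_pos: u64, // monotonically increasing, writer-owned
    read_pos: u64,  // monotonically increasing, reader-owned
    overruns: u64,
}

impl AudioRing {
    fn new(cap: usize) -> Self {
        assert!(cap.is_power_of_two());
        Self {
            buf: vec![0; cap].into_boxed_slice(),
            write_pos: 0,
            read_pos: 0,
            overruns: 0,
        }
    }

    /// Writer advances write_pos only, even when it overwrites unread data.
    fn write(&mut self, samples: &[i16]) {
        let mask = (self.buf.len() - 1) as u64;
        for &s in samples {
            self.buf[(self.write_pos & mask) as usize] = s;
            self.write_pos += 1;
        }
    }

    /// Reader self-corrects if the writer lapped it, then drains what's left.
    fn read(&mut self, out: &mut [i16]) -> usize {
        let cap = self.buf.len() as u64;
        if self.write_pos - self.read_pos > cap {
            self.read_pos = self.write_pos - cap; // skip overwritten samples
            self.overruns += 1;
        }
        let mask = cap - 1;
        let avail = (self.write_pos - self.read_pos).min(out.len() as u64);
        for slot in out.iter_mut().take(avail as usize) {
            *slot = self.buf[(self.read_pos & mask) as usize];
            self.read_pos += 1;
        }
        avail as usize
    }
}

fn main() {
    let mut ring = AudioRing::new(8);
    ring.write(&(0..12).collect::<Vec<i16>>()); // writer laps an idle reader
    let mut out = [0i16; 8];
    let n = ring.read(&mut out);
    println!("read {n} samples, overruns = {}", ring.overruns);
}
```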
Siavash Sameni
9ab57ba037 merge: fj/feat/android-voip-client — congestion fix, AEC toggle, debug logging
Merged 10 commits from Android branch:
- Send task crash fix on QUIC congestion (continue instead of break)
- AEC toggle + NoiseSuppressor on Android
- Debug reporter for crash diagnostics
- Mic mute crackling fix
- Participant dedup in UI
- Proper QUIC connection close on hangup
- Null alias display fix
- Tracing → Android logcat
- Incident reports for send-task crash and playout ring desync

Conflict resolved in room.rs: kept Android's improved debug logging
(recv gap tracking, lock contention, forward latency, send errors)
inside our media_task async block for parallel signal handling.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-06 13:13:43 +04:00
Siavash Sameni
7806d4ec04 feat: identicons, server fingerprints, lock status (TOFU)
Identicon generator:
- Deterministic 5x5 symmetric pattern from fingerprint hash
- HSL-derived colors, rendered as inline SVG
- Click any identicon to copy its fingerprint to clipboard
- Used for participants, user identity, and relay servers

Server identity (TOFU — Trust On First Use):
- Ping returns server fingerprint (QUIC peer certificate hash)
- First contact: auto-saved as known fingerprint
- Subsequent pings: compared against known fingerprint
- Lock icons: locked (verified), unlocked (new), warning (changed), red (offline)
- Fingerprint mismatch shows confirmation dialog before connecting

UI updates:
- Participants show identicons instead of letter avatars
- User identity shows identicon + fingerprint on connect screen
- Manage Relays shows identicon per server with lock status
- Relay button shows lock icon instead of colored dot

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-06 13:02:42 +04:00
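The TOFU decision table from the commit message as a small function. The four states mirror the four lock icons; the `LockState` names and `tofu_check` signature are assumptions, not the real client API:

```rust
use std::collections::HashMap;

// Trust-On-First-Use check: first contact is saved, later contacts are
// compared, and a failed ping maps to the offline state.
#[derive(Debug, PartialEq)]
enum LockState {
    Verified, // locked icon: fingerprint matches the known one
    New,      // unlocked icon: first contact, auto-saved
    Changed,  // warning icon: mismatch, confirm dialog before connecting
    Offline,  // red icon: ping failed, no fingerprint to compare
}

fn tofu_check(
    known: &mut HashMap<String, String>,
    host: &str,
    fingerprint: Option<&str>,
) -> LockState {
    let Some(fp) = fingerprint else {
        return LockState::Offline;
    };
    match known.get(host) {
        None => {
            known.insert(host.to_string(), fp.to_string());
            LockState::New
        }
        Some(k) if k.as_str() == fp => LockState::Verified,
        Some(_) => LockState::Changed,
    }
}

fn main() {
    let mut known = HashMap::new();
    println!("{:?}", tofu_check(&mut known, "relay.example", Some("abcd")));
    println!("{:?}", tofu_check(&mut known, "relay.example", Some("abcd")));
}
```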
Siavash Sameni
d31b81a21d fix: replace relay dropdown with direct dialog on click
- Click relay button opens Manage Relays dialog directly (no dropdown)
- Click a relay in the dialog to select it (highlighted with accent border)
- × button to delete, Add Relay button to add new
- Removed all dropdown menu code and CSS

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-06 12:53:13 +04:00
Claude
4d54b6f9e4 docs: incident reports for send-task crash and playout ring desync
Two root-caused bugs documented with full evidence:

1. Send task fatal exit on QUIC congestion (FIXED in 2092245)
   - send_media() Err(Blocked) caused break → killed entire call
   - Now drops packet and continues

2. Playout ring buffer cursor desync (ROOT-CAUSED, fix pending)
   - AudioRing::write() mutates read_pos from producer thread on overflow
   - Violates SPSC contract → reader/writer fight over read_pos
   - Causes 12-16s bidirectional silence ~25-30s into call
   - Both clients affected simultaneously

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-06 08:52:14 +00:00
Siavash Sameni
c268ce419a fix: relay dialog overflow — stack inputs, full-width Add button
- Dialog fits within 360px window (was overflowing at 420px)
- Add inputs stacked: name + host:port in a row, "Add Relay" button below
- Text overflow with ellipsis on relay names and addresses
- Proper min-width: 0 on flex children to prevent overflow

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-06 12:49:26 +04:00
Siavash Sameni
61b6e67610 feat: relay server dropdown with status indicators and manage dialog
- Relay selector as dropdown with green/yellow/red status dots
  (green < 200ms, yellow ≥ 200ms, red = offline, gray = unknown)
- All relays pinged on startup, RTT shown next to each
- "Manage Relays..." dialog: add/remove servers, see live status
- Clicking a relay in dropdown selects it, fills connect form
- Recent room chips auto-select matching relay
- Migrates old single-relay settings format automatically
- Prevents connecting to offline relays

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-06 12:44:19 +04:00
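The status-dot thresholds quoted above, written out as one function. The `Dot` enum and parameter names are illustrative:

```rust
// Map ping state to a status dot: gray before any ping, red when the ping
// failed, green under 200 ms RTT, yellow otherwise.
#[derive(Debug, PartialEq)]
enum Dot {
    Green,
    Yellow,
    Red,
    Gray,
}

fn status_dot(pinged: bool, rtt_ms: Option<u32>) -> Dot {
    match (pinged, rtt_ms) {
        (false, _) => Dot::Gray,        // not pinged yet: unknown
        (true, None) => Dot::Red,       // ping failed: offline
        (true, Some(ms)) if ms < 200 => Dot::Green,
        (true, Some(_)) => Dot::Yellow, // reachable but slow
    }
}

fn main() {
    println!("{:?}", status_dot(true, Some(42)));
}
```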
Siavash Sameni
dddf5d2e2d feat: relay ping with RTT display, fix dead_code warning
- New ping_relay Tauri command: QUIC connect with 3s timeout, returns RTT ms
- Relay status shown next to input field: "42ms" (green) or "offline" (red)
- Auto-pings on app startup and debounced on relay input change
- Fix SyncWrapper dead_code warning with #[allow(dead_code)]

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-06 12:41:28 +04:00
Siavash Sameni
ed272d29f8 feat: fingerprint at startup, relay+room pairs, auto-reconnect, cleanup
#7 Fingerprint shown before connecting — new get_identity command reads
   ~/.wzp/identity at startup (generates if missing). Click to copy.

#8 Recent rooms store (relay, room) pairs — clicking a chip fills both
   fields. Settings panel shows relay alongside room name. Migrates
   old string[] format automatically.

#9 Auto-reconnect on unexpected disconnect — exponential backoff
   (1s, 2s, 4s... max 10s), up to 5 attempts. Yellow blinking dot
   shows reconnecting state. Stops if user clicks hangup.

#10 Audio handle cleanup — CPAL handles stored in SyncWrapper (no more
    mem::forget), dropped properly on CallEngine::stop().

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-06 12:15:05 +04:00
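The reconnect schedule from item #9 (1s, 2s, 4s... capped at 10s, up to 5 attempts) as a pure function; the function name is illustrative:

```rust
// Exponential backoff with a cap: doubling delay, max 10 s, give up after
// 5 attempts (attempt is 0-based).
fn backoff_secs(attempt: u32) -> Option<u64> {
    if attempt >= 5 {
        return None; // stop retrying; user can still reconnect manually
    }
    Some((1u64 << attempt).min(10)) // 1, 2, 4, 8, 10
}

fn main() {
    for attempt in 0..6 {
        println!("attempt {attempt}: delay {:?}", backoff_secs(attempt));
    }
}
```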
Claude
2b3bdae440 fix: enable Rust tracing → Android logcat via tracing-android
Rust tracing subscriber was never initialized — all info!/warn!/error!
calls in the engine went to /dev/null. This meant our send/recv health
logging was invisible and we couldn't confirm the congestion fix was
active.

Now initializes tracing-android layer on first nativeInit(), routing
all Rust logs to logcat under tag "wzp_android". Also expanded logcat
filter in DebugReporter to capture engine-level log lines.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-06 08:03:28 +00:00
Siavash Sameni
21f5b24cbf fix: keep audio handles alive for call duration, fix Send+Sync
The VPIO/CPAL audio handles were dropped at the end of start(),
killing the audio unit immediately. Audio I/O stopped working
after the first frame.

- Store audio handle in CallEngine via SyncWrapper
- Drop MutexGuard before returning from status() (Send future)
- Audio streams now live for the entire call duration

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-06 12:00:16 +04:00
Siavash Sameni
9b733010ab fix: blocking_lock panic in status(), fingerprint copy-to-clipboard
- Change status() from blocking_lock to async lock().await —
  fixes "Cannot block the current thread from within a runtime" panic
  that froze the call timer and broke audio
- Click fingerprint to copy to clipboard (both connect and settings screens)
- Show "Copied!" feedback on click

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-06 11:53:31 +04:00
Siavash Sameni
80d5bd7628 fix: survive QUIC congestion — drop packets instead of killing send task
send_datagram() returns Err(Blocked) when the QUIC congestion window
is full. This is transient — the window reopens once ACKs arrive.
Previously, all send paths treated this as fatal (break/return),
which killed the send task and cascaded via tokio::select! to kill
the entire call.

Now: log warning, drop the packet, continue. Brief audio glitch
(20-100ms) instead of complete call death. FEC on the receiver
side recovers most dropped packets.

Fixed in:
- CLI run_live send task (continue + error counter)
- CLI run_file_mode send paths (2 locations)
- Desktop engine send task

Also hardened recv tasks: transient errors (non-closed/reset)
are survived instead of causing exit.

Matches the fix applied to Android client (engine.rs).

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-06 11:48:20 +04:00
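A sketch of the drop-and-continue policy this commit applies everywhere. `SendErr::Blocked` stands in for quinn's transient congestion error; the types and the `pump` helper are illustrative, not the real wzp or quinn API:

```rust
// Transient send errors drop the packet and keep the loop alive; only a
// genuinely fatal error (connection closed) ends the send task.
#[derive(Debug)]
enum SendErr {
    Blocked,          // congestion window full: transient
    ConnectionClosed, // fatal: peer is gone
}

fn pump<F>(packets: &[u32], mut send: F) -> (u32, u32)
where
    F: FnMut(u32) -> Result<(), SendErr>,
{
    let (mut sent, mut dropped) = (0, 0);
    for &p in packets {
        match send(p) {
            Ok(()) => sent += 1,
            Err(SendErr::Blocked) => dropped += 1, // log + drop, keep the call
            Err(SendErr::ConnectionClosed) => break, // kill the task
        }
    }
    (sent, dropped)
}

fn main() {
    let mut n = 0;
    let (sent, dropped) = pump(&[1, 2, 3, 4, 5, 6], |_| {
        n += 1;
        if n % 3 == 0 { Err(SendErr::Blocked) } else { Ok(()) }
    });
    println!("sent {sent}, dropped {dropped} (call survived)");
}
```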
Siavash Sameni
4a195a923a feat: settings panel with Cmd+, shortcut (macOS standard)
- Full settings page as modal overlay (blur backdrop)
- Opens via gear icon on connect/call screens or Cmd+, (Ctrl+, on Win/Linux)
- Escape or click outside to close
- Settings: relay, room, alias, OS AEC toggle, AGC toggle
- Identity section showing fingerprint and identity file path
- Recent rooms management (remove individual, clear all)
- Save syncs back to connect form
- Gear icon on both connect and in-call screens

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-06 11:44:22 +04:00
Siavash Sameni
f726f8cfa4 feat: desktop GUI enhancements — audio level, call timer, VPIO, settings
- Audio level meter with log-scale RMS visualization
- Call duration timer
- VPIO (OS AEC) wired through to engine with fallback to CPAL
- "You" badge on own participant entry
- Recent rooms list (click to reuse)
- Enter key to connect from form fields
- Improved dark theme with pulse animation on status dot
- Settings persistence via localStorage (relay, room, alias, AEC, recent rooms)
- Fingerprint display on connect screen
- Keyboard shortcuts skip input fields

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-06 11:40:07 +04:00
Claude
20922455bd fix: send task crash on QUIC congestion + AEC toggle + debug reporter
Root cause: send_media() returns Err(Blocked) when QUIC congestion
window is full. The send task treated ANY send error as fatal (break),
killing the entire call. Now send errors drop the packet and continue.

Also hardened recv task to survive transient errors and added health
logging (recv gap tracking, periodic stats) to both send and recv.

Relay: added comprehensive debug logging — recv gaps, lock contention,
forward latency, send errors — all per-participant with 5s stats.

Other changes:
- AEC toggle in Settings (persisted, applied on next call)
- Debug report: records call audio (WAV), RMS histogram (CSV), logcat,
  stats. Emailed as zip via Android share intent after call ends.
- Replaced LinearProgressIndicator with Box (compose version compat)
- FileProvider for sharing debug zip attachments

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-06 07:38:56 +00:00
Siavash Sameni
e468454464 feat: Tauri desktop GUI app with call engine
- New desktop/ directory with Tauri v2 + Vite + TypeScript
- Rust backend: CallEngine wrapping wzp-client audio + transport
- Web frontend: connect screen, in-call screen with participants,
  mic/speaker mute, keyboard shortcuts (m/s/q)
- Dark theme UI, settings persistence via localStorage
- Platform-aware --os-aec: warns on Windows/Linux (not yet implemented)
- Workspace updated to include desktop/src-tauri

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-06 11:25:54 +04:00
Siavash Sameni
d1c96cd71f feat: macOS VoiceProcessingIO for hardware AEC + delay-compensated NLMS
- Add --os-aec flag: uses Apple VoiceProcessingIO audio unit for
  hardware echo cancellation (same engine as FaceTime)
- New vpio feature + audio_vpio.rs: combined capture+playback via VPIO
- Improved software AEC: delay-compensated leaky NLMS with Geigel DTD
  (60ms tail, 40ms delay, configurable via --aec-delay)
- Add --aec-delay flag for tuning software AEC delay compensation
- Add dev-fast Cargo profile (opt-level 2 with incremental compilation)

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-06 11:10:10 +04:00
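One ingredient above, the Geigel double-talk detector, fits in a few lines: NLMS adaptation is frozen whenever the near-end sample exceeds a fraction of the recent far-end peak. The 0.5 threshold and the window handling here are illustrative, not wzp's tuned values:

```rust
// Geigel DTD sketch: declare double-talk when |near| > threshold * max|far|
// over the recent far-end history. With no far-end signal at all, any
// near-end energy counts as near-end speech.
fn geigel_double_talk(near: i16, far_history: &[i16], threshold: f32) -> bool {
    let far_peak = far_history
        .iter()
        .map(|s| s.unsigned_abs())
        .max()
        .unwrap_or(0);
    (near.unsigned_abs() as f32) > threshold * far_peak as f32
}

fn main() {
    let far = [200i16, -1000, 300];
    // Loud near-end over modest far-end: freeze adaptation.
    println!("double talk: {}", geigel_double_talk(2000, &far, 0.5));
}
```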
Siavash Sameni
1b00b5e2a4 feat: improved AEC, keyboard shortcuts, dedup participants, dev-fast profile
AEC improvements:
- Reduce echo tail from 100ms to 30ms (3.3x faster, suited for laptops)
- Add double-talk detection: freeze adaptation when near-end speaks
- Add residual echo suppression
- Disable AEC by default in --android mode (macOS has built-in AEC)

CLI features:
- Keyboard shortcuts: m=mic mute, s=speaker mute, q=quit (raw terminal mode)
- Dedup participants in RoomUpdate display (same fingerprint+alias shown once)
- Add dev-fast profile (opt-level 2 with incremental compilation)

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-06 10:15:23 +04:00
Claude
e6564bab57 fix: mic mute crackling + add AEC/NoiseSuppressor + dedup room participants
Mic mute: the send loop now zeros the capture buffer when muted instead
of relying on write_audio() to skip writes. Previously stale ring data
and AGC amplification of near-silence caused crackling artifacts.

AEC: attach Android's hardware AcousticEchoCanceler to the AudioRecord
session. Also attach NoiseSuppressor when available. Both are released
on capture stop.

Room UI: deduplicate participants by fingerprint so ghost entries from
stale relay state don't show duplicate names.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-06 06:06:35 +00:00
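The mute fix in miniature: zero the captured frame before it reaches the encoder instead of skipping the write, so neither stale ring data nor AGC-amplified noise gets encoded. The helper name is illustrative:

```rust
// When muted, overwrite the capture buffer with silence in place; the rest
// of the pipeline (AGC, encoder) then sees true zeros, not leftovers.
fn apply_mute(frame: &mut [i16], muted: bool) {
    if muted {
        frame.fill(0);
    }
}

fn main() {
    let mut frame = [120i16, -340, 77, 9001];
    apply_mute(&mut frame, true);
    println!("muted frame: {:?}", frame);
}
```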
Siavash Sameni
cfb48df1ef feat: direct playout mode, AEC far-end, audio processing switches
- Add --android/--direct-playout: bypass jitter buffer, decode on recv
  (matches Android engine architecture)
- Wire AEC far-end reference from decoded playout to encoder
- Add --no-aec, --no-agc, --no-fec, --no-silence, --no-denoise switches
- Fix BufferSize::Fixed(960) → Default for macOS CoreAudio compat
- Optimize wzp-codec, wzp-fec, audiopus, nnnoiseless in debug profile
- Add capture callback size diagnostic logging

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-06 09:48:34 +04:00
Claude
aebf9156c0 fix: dedup participants in UI, wait for QUIC close ack before exiting
UI: deduplicate room participants by fingerprint so ghost entries from
stale relay state don't show duplicates.

Engine: after select! ends, call close_now() + connection.closed() with
500ms timeout to wait for the relay to acknowledge the CONNECTION_CLOSE.
Previously the close frame was queued but the runtime died before quinn
could retransmit if the first packet was lost.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-06 05:40:06 +00:00
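The dedup rule described above, keep the first entry per fingerprint, can be sketched with a set; the `Participant` struct is illustrative, not the real wzp-proto type:

```rust
use std::collections::HashSet;

// Deduplicate room participants by fingerprint, preserving order, so stale
// relay state can't show the same user twice.
#[derive(Clone, Debug, PartialEq)]
struct Participant {
    fingerprint: String,
    alias: Option<String>,
}

fn dedup(participants: Vec<Participant>) -> Vec<Participant> {
    let mut seen = HashSet::new();
    participants
        .into_iter()
        .filter(|p| seen.insert(p.fingerprint.clone())) // false on repeats
        .collect()
}

fn main() {
    let room = vec![
        Participant { fingerprint: "aa".into(), alias: Some("sia".into()) },
        Participant { fingerprint: "aa".into(), alias: Some("sia".into()) }, // ghost
        Participant { fingerprint: "bb".into(), alias: None },
    ];
    println!("{} unique participants", dedup(room).len());
}
```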
Claude
9bbaec6b35 fix: use shutdown_timeout so QUIC CONNECTION_CLOSE actually gets sent
shutdown_background() killed the tokio runtime before quinn could send the
CONNECTION_CLOSE frame on the wire, so the relay never knew the client left.
Now use shutdown_timeout(500ms) to give quinn time to flush the close frame,
matching the desktop client pattern (which uses 2s timeout).

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-06 05:20:20 +00:00
Siavash Sameni
ba29d8354f fix: send alias via CallOffer handshake (match Android approach)
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-06 09:10:07 +04:00
Siavash Sameni
0908507a7a Merge remote-tracking branch 'origin/feat/android-voip-client' into feat/desktop-audio-rewrite 2026-04-06 09:04:55 +04:00
Siavash Sameni
860c90394d feat: rewrite desktop audio I/O with lock-free ring buffers
- Replace Mutex-based CPAL callbacks with atomic SPSC ring buffers
- Proper async send/recv loops (no block_on), 20ms playout tick
- Add signal task for RoomUpdate presence display
- Add --alias, --raw-room flags and key persistence (~/.wzp/identity)
- Add SetAlias signal variant and relay-side handling
- Graceful Ctrl+C shutdown with force-quit on second press

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-06 09:04:51 +04:00
Claude
dc66b60d18 fix: null alias display — Android JSONObject.optString returns literal "null"
o.optString("alias", null) returns the string "null" when the JSON value
is JSON null. Use o.isNull() check first. Also handle empty fingerprint
edge case with "unknown" fallback.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-06 05:04:47 +00:00
Claude
a9c4260b4e fix: close QUIC connection on hangup so relay removes participant immediately
stop_call() now calls close_now() on the stored transport handle before
killing the tokio runtime. This sends a QUIC CONNECTION_CLOSE frame so
the relay's recv loop breaks immediately, triggering leave() + RoomUpdate
broadcast. Previously the runtime was killed first, so transport.close()
never ran and the relay kept stale participants until idle timeout.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-06 04:58:24 +00:00
Claude
7eb136fcb3 fix: settings save button (back=discard), fix missing alias in featherchat tests
- Settings now uses draft state — changes only persist on explicit Save
- Back button discards unsaved changes
- Added applyServers() for batch server updates
- Added missing alias field to CallOffer in featherchat tests

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-06 04:30:23 +00:00
Claude
550a124972 fix: add missing alias arg to perform_handshake call in wzp-web
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-06 04:15:24 +00:00
Claude
0835c36d0f feat: settings page with persistence, client alias in handshake, fix null fingerprints
- Add SettingsScreen with identity (alias, key backup/restore), audio defaults,
  server management, network prefs, and default room
- SettingsRepository persists all settings via SharedPreferences
- Auto-generate random display names on first launch (e.g. "Swift Wolf")
- Thread alias through CallOffer → relay handshake → RoomUpdate broadcast
- Derive caller fingerprint from identity key in relay handshake (fixes null
  fingerprints when --auth-url is not set)
- Persist identity seed for stable fingerprints across reconnects
- Add alias field to SignalMessage::CallOffer (serde default for backward compat)

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-06 03:56:33 +00:00
Claude
6228ab32c1 ci: upload build artifacts to rustypaste
Requires PASTE_AUTH and PASTE_URL secrets configured in Forgejo.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-06 02:08:13 +00:00
Claude
bd258f432a fix: remove actions/upload-artifact (unsupported on Forgejo)
Forgejo doesn't support @actions/artifact v4. Package the tarball
and print sizes instead. Binaries can be grabbed from the runner
workspace or deployed directly.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-06 02:07:06 +00:00
Claude
8bf073aa80 fix: handle RoomUpdate variant in wzp-client signal type mapping
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-06 01:54:36 +00:00
Claude
72e834b45e fix: init git submodules in CI with HTTPS fallback
The featherchat submodule uses an SSH URL, which doesn't work in CI.
Convert it to HTTPS via git insteadOf before submodule init.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-05 18:24:59 +00:00
Claude
673ffd498c fix: use catthehacker/ubuntu:act-latest for Forgejo CI runner
The Forgejo runner needs Node.js for actions/checkout@v4.
catthehacker/ubuntu:act-latest has Node.js pre-installed.
Also install Rust in the workflow since the base image doesn't have it.
Build triggers on main + feat/* branches now.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-05 18:19:14 +00:00
Claude
2d4b8eebd5 feat: RoomUpdate protocol — broadcast participant list on join/leave
- Add RoomUpdate signal message to wzp-proto with participant count + list
- Add RoomParticipant struct (fingerprint + optional alias)
- Store fingerprint/alias in relay Participant struct
- Broadcast RoomUpdate to all room members on join and leave
- Add signal recv task in Android engine to handle RoomUpdate
- Surface room_participant_count + room_participants in CallStats JSON
- Show "X in room" with participant names in Android in-call UI

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-05 18:12:24 +00:00
Claude
a23d9f5e41 feat: foreground service, dB gain sliders, speaker routing, live network stats
- Wire CallService foreground service for background calls (microphone type)
- Add Voice Volume + Mic Gain sliders (-20 to +20 dB) applied in Kotlin
- Connect AudioRouteManager for real speaker toggle via AudioManager
- Feed quinn QUIC RTT into PathMonitor, display Loss/RTT/Jitter from live data
- Nuclear teardown between calls — recreate engine + audio pipeline each call
- Fix re-entrant teardown loop from CallService notification callback
- Park audio threads as daemons to avoid libcrypto TLS destructor crash on exit
- Remove duplicate wakelocks from Activity (service owns them now)
- Strip AEC + denoise from capture path, keep AGC only (incremental approach)
- Fix .so copy target: libwzp_android.so not libwzp.so

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-05 17:45:00 +00:00
Claude
b3e56ecbd8 feat: add AGC to capture + playout paths, add server UI, DNS resolve
- Wire AutoGainControl on both capture (mic → encode) and playout
  (decode → speaker) paths to normalize volume levels
- Add server list with add/remove custom server dialog
- Add IPv4/IPv6 preference toggle for DNS resolution
- Resolve DNS hostnames to IP in Kotlin before passing to Rust engine
- Revert to IP addresses for default servers (DNS still broken on QUIC)

AGC confirmed working — voice levels noticeably improved in testing.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-05 14:02:33 +00:00
Claude
2fa07286c3 feat: wakelock for background calls, server selector UI
- Partial wake lock + WiFi high-perf lock during calls — audio
  continues when screen is off / phone is locked
- Server selector: toggle between LAN (172.16.81.175) and Pangolin
  (pangolin.manko.yoga) before connecting
- Room name editable in idle screen

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-05 12:54:02 +00:00
Claude
bf91cf25bd feat: add real audio pipeline with Opus + RaptorQ FEC
- AudioPipeline: Kotlin AudioRecord/AudioTrack on JVM threads, PCM
  shuttled to Rust via lock-free ring buffers + JNI
- FEC: RaptorQ fountain codes on encode (5 frames/block, 20% repair
  ratio for GOOD profile), decoder feeds repair symbols for recovery
- Real audio level meter from mic RMS (replaces fake animation)
- Room name editable in UI (default: "android")
- Relay changed to pangolin.manko.yoga:4433
- Stats overlay shows FEC recovered count
- CallState now synced from polled stats (fixes "Connecting" stuck bug)

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-05 12:33:59 +00:00
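The GOOD-profile FEC numbers above (5 source frames per block, 20% repair ratio) reduce to simple arithmetic for the repair-symbol count; the helper name and the round-up convention are assumptions:

```rust
// Repair symbols per FEC block: ceil(source_frames * ratio). With 5 frames
// and 20% repair, each block carries one repair symbol.
fn repair_symbols(source_frames: u32, repair_ratio_pct: u32) -> u32 {
    (source_frames * repair_ratio_pct + 99) / 100 // integer ceiling
}

fn main() {
    println!(
        "GOOD profile: {} repair symbol(s) per 5-frame block",
        repair_symbols(5, 20)
    );
}
```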
Claude
81c756c076 chore: switch relay to 172.16.81.175:4433 for testing
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-05 12:01:51 +00:00
Claude
af85a49e86 fix: eliminate all native thread creation — run everything single-threaded
pthread_create crashes on Android due to static bionic __init_tcb stubs
in the Rust std prebuilt rlibs. This is unfixable without rebuilding std.

Solution: run the entire call (QUIC connect, handshake, media send/recv)
on a single tokio current_thread runtime. The JNI startCall() now blocks,
so Kotlin dispatches it to Dispatchers.IO (JVM thread, not pthread).

Audio pipeline temporarily simplified to silence frames — will restore
once threading is solved (either via Java Thread or rebuilding std).

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-05 09:52:28 +00:00
Claude
bae03365da fix: restore getauxval_fix.c + current_thread tokio — both needed
The getauxval override (dlsym wrapper) fixes SIGSEGV in
init_have_lse_atomics at library load time. The current_thread
tokio runtime avoids SEGV_ACCERR in pthread_create/__init_tcb.
Both fixes are required together.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-05 09:37:57 +00:00
Claude
9d9ce4706d fix: use current_thread tokio runtime — avoid pthread_create SEGV on Android
Multi-thread tokio runtime crashes with SEGV_ACCERR in __init_tcb
during pthread_create on Android (static bionic stubs from CRT).
Switch to current_thread runtime which runs network I/O on the
calling thread without spawning additional OS threads.

Also: clean up build.rs — use only libc++_shared.so (dynamic),
remove getauxval_fix.c hack, remove static c++/c++abi linking.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-05 09:27:46 +00:00
Claude
9098e28a1f fix: SIGSEGV in getauxval — override broken CRT stub with dlsym wrapper
compiler-rt's init_have_lse_atomics calls getauxval(AT_HWCAP) at
library load time. The static getauxval from the CRT reads from
__libc_auxv which is NULL in shared libraries → SIGSEGV at 0x0.

Fix: compile getauxval_fix.c that provides a getauxval() which uses
dlsym(RTLD_DEFAULT) to find the real bionic getauxval at runtime.
Also switch to libc++_shared.so (bundled in APK) to avoid pulling
in static libc stubs.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-05 08:39:57 +00:00
Claude
f6d51fce61 fix: target API 26 in ELF — pthread_atfork blocked by bionic at API 21
The .note.android.ident ELF section had API level 0x15 (21), causing
Android's bionic linker to block pthread_atfork (used by rand crate).
Fix: pass -P 26 to cargo-ndk and set linker to android26-clang.
Verified: ELF now shows 0x1a (26).

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-05 06:05:44 +00:00
Claude
a8dd0c2f57 fix: also link libc++abi for RTTI — resolve missing __class_type_info vtable
- Compile all 62 Oboe source files (was headers-only, missing symbols)
- Link libc++_static + libc++abi with NDK sysroot search path
- Bump linker target from android21 to android26 (fixes pthread_atfork)
- Link liblog + libOpenSLES for Oboe runtime deps

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-05 05:48:49 +00:00
Claude
64566e9acb fix: logcat-server.py SyntaxError — global declaration after use
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-05 05:12:28 +00:00
Claude
10eb19cd24 feat: add logcat HTTP server for remote crash debugging
Simple Python script that captures adb logcat and serves it over HTTP.
Run on laptop, read from anywhere via curl/browser.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-05 05:11:19 +00:00
Claude
778f4dd428 fix: link libc++ statically — crash on launch due to missing libc++_shared.so
- Set cpp_link_stdlib(None) to suppress cc crate's automatic linking
- Explicitly link both c++_static and c++abi with NDK sysroot search path
- Fixes RTTI vtable symbol (_ZTVN10__cxxabiv117__class_type_infoE) error
- Verified: only liblog.so remains as dynamic dependency

Closes #001

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-05 05:07:25 +00:00
Siavash Sameni
622fdee51f fix: also link libc++abi for RTTI — resolve missing __class_type_info vtable
Previous fix linked c++_static but not c++abi. Android NDK splits the
static C++ runtime into two archives: libc++_static.a (STL) and
libc++abi.a (RTTI/exceptions). Without c++abi, dlopen fails on
_ZTVN10__cxxabiv117__class_type_infoE.

Now using cpp_link_stdlib(None) to suppress cc crate auto-linking, then
explicitly linking both c++_static and c++abi via cargo:rustc-link-lib.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-05 09:00:14 +04:00
Claude
b204213a01 build: rebuild APK with static libc++ linking (fixes #001)
libc++_shared.so is no longer a runtime dependency — verified
via llvm-readelf. Only system libs (libdl, liblog, libm, libc) remain.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-05 04:56:35 +00:00
Siavash Sameni
e751af7e38 fix: link libc++ statically — crash on launch due to missing libc++_shared.so
The app crashed immediately when loading libwzp_android.so because the
cc crate's default dynamic linking produced a runtime dependency on
libc++_shared.so, which was never packaged into the APK.

Adding .cpp_link_stdlib(Some("c++_static")) to build.rs bakes the C++
runtime into libwzp_android.so directly, eliminating the missing .so.

See issues/001-libc++-shared-crash.md for full diagnosis and logcat trace.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-05 08:52:55 +04:00
Claude
8d5f6fe044 feat: wire QUIC transport, JNI bridge, connect UI + add docs
- Replace raw FFI with proper `jni` crate for string marshalling
- Wire QUIC transport in engine: connect to relay, crypto handshake
  (CallOffer/CallAnswer, X25519+Ed25519), send/recv MediaPackets
- Feed received packets into jitter buffer (was previously ignored)
- Add connect screen UI with CALL button (idle state) and in-call
  controls (mute, speaker, hang up, live stats)
- Hardcode relay 172.16.81.125:4433, room "android"
- Add comprehensive docs in docs/android/:
  architecture.md (8 mermaid diagrams), build-guide.md,
  debugging.md, maintenance.md, roadmap.md

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-05 04:43:49 +00:00
Claude
780309fede fix: crash on launch — don't auto-start call, handle null JNI strings, remove stdout tracing
- CallActivity no longer auto-starts a call on launch
- CallViewModel lazily inits engine only when startCall() is called
- nativeGetStats nullable return handled safely in Kotlin
- Removed tracing_subscriber::fmt() which panics on Android (no stdout)
- All JNI calls wrapped in try/catch on Kotlin side

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-05 02:04:23 +00:00
Claude
73ebcdd869 build: Android APK builds working — debug (8.9MB) and release (2.0MB)
- Fix C++ std::std:: double namespace in oboe_bridge.cpp
- Auto-fetch Oboe headers from GitHub in build.rs
- Configure cargo cross-compilation (.cargo/config.toml) with NDK linkers
- Fix Gradle settings (dependencyResolutionManagement), signing configs,
  Compose LinearProgressIndicator API, and Android manifest theme
- Add Gradle wrapper, .gitignore for build artifacts
- arm64-v8a only (raptorq crate incompatible with armv7 32-bit)
- Release APK: 2.0MB signed with wzp-release key
- Debug APK: 8.9MB signed with wzp-debug key

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-04 19:37:08 +00:00
Claude
e7b1c3372a feat: Android VoIP client — Phase 2 (JNI bridge, Compose UI, AEC pipeline wiring)
- JNI bridge with 8 extern functions (init, startCall, stopCall, setMute,
  setSpeaker, getStats, forceProfile, destroy) with panic catching
- Kotlin engine layer: WzpEngine JNI wrapper, WzpCallback interface,
  CallStats data class with JSON deserialization
- Jetpack Compose UI: InCallScreen with quality indicator (green/yellow/red),
  mute/speaker/hangup buttons, stats overlay, duration timer
- CallActivity with RECORD_AUDIO permission handling, Material3 theme
- CallService foreground service with WakeLock, WiFi lock, notification
- AudioRouteManager for speaker/earpiece/Bluetooth SCO switching
- AEC wired into CallEncoder pipeline: AEC → AGC → denoise → silence → encode
- AEC farend reference fed from decode path to encode path in pipeline
- Engine exposes set_aec_enabled/set_agc_enabled via AtomicBool flags

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-04 18:16:38 +00:00
Claude
26e9c55f1f feat: Android VoIP client — Phase 1 (audio quality, network adaptation, crate skeleton)
- New wzp-android crate with Oboe C++ backend, lock-free SPSC ring buffers,
  engine orchestrator, codec pipeline, and Android Gradle project structure
- AEC (NLMS adaptive filter), AGC (two-stage with fast attack/slow release),
  windowed-sinc FIR resampler replacing linear interpolation (wzp-codec)
- Opus encoder tuning: complexity 7 default, set_expected_loss support
- Mobile jitter buffer: asymmetric EMA (fast up/slow down), handoff spike
  detection with 2s cooldown, configurable safety margin
- Network-aware quality control: cellular-specific thresholds, faster
  downgrade on cellular, proactive tier drop on WiFi→cellular handoff,
  FEC ratio boost during network transitions
- Handoff detection in PathMonitor via RTT jitter spike analysis

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-04 18:07:55 +00:00
109 changed files with 22452 additions and 577 deletions

.cargo/config.toml

@@ -0,0 +1,5 @@
[target.aarch64-linux-android]
linker = "aarch64-linux-android26-clang"
[target.armv7-linux-androideabi]
linker = "armv7a-linux-androideabi26-clang"


@@ -2,187 +2,57 @@ name: Build Release Binaries
 on:
   push:
-    branches:
-      - main
-      - 'feat/*'
     tags:
       - 'v*'
-    paths-ignore:
-      - '.gitea/**'
   workflow_dispatch:
-    inputs:
-      targets:
-        description: 'Targets to build (comma-separated: amd64,arm64,armv7,mac-arm64)'
-        required: false
-        default: 'amd64'
 env:
   CARGO_TERM_COLOR: always
 jobs:
-  # Always builds on push tags. On manual dispatch, reads inputs.
   build-amd64:
-    if: >-
-      github.event_name == 'push' ||
-      contains(github.event.inputs.targets, 'amd64')
     runs-on: ubuntu-latest
     container:
-      image: rust:1-bookworm
+      image: catthehacker/ubuntu:act-latest
     steps:
       - uses: actions/checkout@v4
-      - name: Install dependencies
-        run: apt-get update && apt-get install -y cmake pkg-config libasound2-dev
-      - name: Cache cargo
-        uses: actions/cache@v4
-        with:
-          path: |
-            ~/.cargo/registry
-            ~/.cargo/git
-            target
-          key: cargo-amd64-${{ hashFiles('Cargo.lock') }}
-          restore-keys: cargo-amd64-
-      - name: Build headless binaries
-        run: cargo build --release --bin wzp-relay --bin wzp-client --bin wzp-bench --bin wzp-web
-      - name: Build audio client
-        run: |
-          cargo build --release --bin wzp-client --features audio
-          cp target/release/wzp-client target/release/wzp-client-audio
-          cargo build --release --bin wzp-client
+      - name: Init submodules
+        run: |
+          git config --global url."https://git.manko.yoga/".insteadOf "ssh://git@git.manko.yoga:222/"
+          git submodule update --init --recursive
+      - name: Install Rust + dependencies
+        run: |
+          curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh -s -- -y
+          source "$HOME/.cargo/env"
+          apt-get update && apt-get install -y cmake pkg-config libasound2-dev ninja-build
+          rustc --version
+      - name: Build relay + tools
+        run: |
+          source "$HOME/.cargo/env"
+          cargo build --release --bin wzp-relay --bin wzp-client --bin wzp-bench --bin wzp-web
       - name: Run tests
-        run: cargo test --workspace --lib
+        run: |
+          source "$HOME/.cargo/env"
+          cargo test --workspace --lib
+      - name: Upload to rustypaste
+        env:
+          PASTE_AUTH: ${{ secrets.PASTE_AUTH }}
+          PASTE_URL: ${{ secrets.PASTE_URL }}
+        run: |
+          tar czf /tmp/wzp-linux-amd64.tar.gz \
+            -C target/release wzp-relay wzp-client wzp-web wzp-bench
+          ls -lh /tmp/wzp-linux-amd64.tar.gz
+          LINK=$(curl -sF "file=@/tmp/wzp-linux-amd64.tar.gz" \
+            -H "Authorization: ${PASTE_AUTH}" \
+            "https://${PASTE_URL}")
+          echo "Download: ${LINK}"
-      - name: Package
-        run: |
-          mkdir -p dist/wzp-linux-amd64
-          cp target/release/wzp-relay dist/wzp-linux-amd64/
-          cp target/release/wzp-client dist/wzp-linux-amd64/
-          cp target/release/wzp-client-audio dist/wzp-linux-amd64/
-          cp target/release/wzp-web dist/wzp-linux-amd64/
-          cp target/release/wzp-bench dist/wzp-linux-amd64/
-          cp -r crates/wzp-web/static dist/wzp-linux-amd64/
-          cd dist && tar czf wzp-linux-amd64.tar.gz wzp-linux-amd64/
-      - name: Upload artifact
-        uses: actions/upload-artifact@v4
-        with:
-          name: wzp-linux-amd64
-          path: dist/wzp-linux-amd64.tar.gz
-  build-arm64:
-    if: >-
-      github.event_name == 'push' ||
-      contains(github.event.inputs.targets, 'arm64')
-    runs-on: ubuntu-latest
-    container:
-      image: rust:1-bookworm
-    steps:
-      - uses: actions/checkout@v4
-      - name: Install cross-compilation tools
-        run: |
-          dpkg --add-architecture arm64
-          apt-get update
-          apt-get install -y cmake pkg-config gcc-aarch64-linux-gnu libc6-dev-arm64-cross
-          rustup target add aarch64-unknown-linux-gnu
-      - name: Cache cargo
-        uses: actions/cache@v4
-        with:
-          path: |
-            ~/.cargo/registry
-            ~/.cargo/git
-            target
-          key: cargo-arm64-${{ hashFiles('Cargo.lock') }}
-          restore-keys: cargo-arm64-
-      - name: Build
-        env:
-          CARGO_TARGET_AARCH64_UNKNOWN_LINUX_GNU_LINKER: aarch64-linux-gnu-gcc
-          CC_aarch64_unknown_linux_gnu: aarch64-linux-gnu-gcc
-        run: |
-          cargo build --release --target aarch64-unknown-linux-gnu \
-            --bin wzp-relay --bin wzp-client --bin wzp-bench --bin wzp-web
-      - name: Package
-        run: |
-          mkdir -p dist/wzp-linux-arm64
-          cp target/aarch64-unknown-linux-gnu/release/wzp-relay dist/wzp-linux-arm64/
-          cp target/aarch64-unknown-linux-gnu/release/wzp-client dist/wzp-linux-arm64/
-          cp target/aarch64-unknown-linux-gnu/release/wzp-web dist/wzp-linux-arm64/
-          cp target/aarch64-unknown-linux-gnu/release/wzp-bench dist/wzp-linux-arm64/
-          cp -r crates/wzp-web/static dist/wzp-linux-arm64/
-          cd dist && tar czf wzp-linux-arm64.tar.gz wzp-linux-arm64/
-      - name: Upload artifact
-        uses: actions/upload-artifact@v4
-        with:
-          name: wzp-linux-arm64
-          path: dist/wzp-linux-arm64.tar.gz
-  build-armv7:
-    if: >-
-      github.event_name == 'push' ||
-      contains(github.event.inputs.targets, 'armv7')
-    runs-on: ubuntu-latest
-    container:
-      image: rust:1-bookworm
-    steps:
-      - uses: actions/checkout@v4
-      - name: Install cross-compilation tools
-        run: |
-          dpkg --add-architecture armhf
-          apt-get update
-          apt-get install -y cmake pkg-config gcc-arm-linux-gnueabihf libc6-dev-armhf-cross
-          rustup target add armv7-unknown-linux-gnueabihf
-      - name: Cache cargo
-        uses: actions/cache@v4
-        with:
-          path: |
-            ~/.cargo/registry
-            ~/.cargo/git
-            target
-          key: cargo-armv7-${{ hashFiles('Cargo.lock') }}
-          restore-keys: cargo-armv7-
-      - name: Build
-        env:
-          CARGO_TARGET_ARMV7_UNKNOWN_LINUX_GNUEABIHF_LINKER: arm-linux-gnueabihf-gcc
-          CC_armv7_unknown_linux_gnueabihf: arm-linux-gnueabihf-gcc
-        run: |
-          cargo build --release --target armv7-unknown-linux-gnueabihf \
-            --bin wzp-relay --bin wzp-client --bin wzp-bench --bin wzp-web
-      - name: Package
-        run: |
-          mkdir -p dist/wzp-linux-armv7
-          cp target/armv7-unknown-linux-gnueabihf/release/wzp-relay dist/wzp-linux-armv7/
-          cp target/armv7-unknown-linux-gnueabihf/release/wzp-client dist/wzp-linux-armv7/
-          cp target/armv7-unknown-linux-gnueabihf/release/wzp-web dist/wzp-linux-armv7/
-          cp target/armv7-unknown-linux-gnueabihf/release/wzp-bench dist/wzp-linux-armv7/
-          cp -r crates/wzp-web/static dist/wzp-linux-armv7/
-          cd dist && tar czf wzp-linux-armv7.tar.gz wzp-linux-armv7/
-      - name: Upload artifact
-        uses: actions/upload-artifact@v4
-        with:
-          name: wzp-linux-armv7
-          path: dist/wzp-linux-armv7.tar.gz
-  # Release job — creates a release with all artifacts when a tag is pushed
-  release:
-    if: startsWith(github.ref, 'refs/tags/v')
-    needs: [build-amd64]
-    runs-on: ubuntu-latest
-    steps:
-      - name: Download all artifacts
-        uses: actions/download-artifact@v4
-        with:
-          path: artifacts
-      - name: Create release
-        uses: softprops/action-gh-release@v2
-        with:
-          files: artifacts/**/*.tar.gz
-          generate_release_notes: true


@@ -0,0 +1,43 @@
name: Mirror to GitHub
on:
  push:
    branches:
      - main
      - 'feat/*'
      - 'feature/*'
    tags:
      - '*'
jobs:
  mirror:
    runs-on: ubuntu-latest
    container:
      image: catthehacker/ubuntu:act-latest
    steps:
      - uses: actions/checkout@v4
        with:
          fetch-depth: 0
      - name: Push to GitHub
        env:
          GH_SSH_KEY: ${{ secrets.GH_SSH_KEY }}
        run: |
          mkdir -p ~/.ssh
          echo "${GH_SSH_KEY}" > ~/.ssh/id_ed25519
          chmod 600 ~/.ssh/id_ed25519
          ssh-keyscan github.com >> ~/.ssh/known_hosts 2>/dev/null
          git remote add github git@github.com:manawenuz/wzp.git
          # Push the current branch
          BRANCH="${GITHUB_REF#refs/heads/}"
          TAG="${GITHUB_REF#refs/tags/}"
          if [ "${GITHUB_REF}" != "${GITHUB_REF#refs/tags/}" ]; then
            echo "Pushing tag: ${TAG}"
            git push github "refs/tags/${TAG}" --force
          else
            echo "Pushing branch: ${BRANCH}"
            git push github "HEAD:refs/heads/${BRANCH}" --force
          fi

Cargo.lock (generated)

File diff suppressed because it is too large.


@@ -9,6 +9,8 @@ members = [
   "crates/wzp-relay",
   "crates/wzp-client",
   "crates/wzp-web",
+  "crates/wzp-android",
+  "desktop/src-tauri",
 ]
 [workspace.package]
@@ -52,3 +54,24 @@ wzp-fec = { path = "crates/wzp-fec" }
 wzp-crypto = { path = "crates/wzp-crypto" }
 wzp-transport = { path = "crates/wzp-transport" }
 wzp-client = { path = "crates/wzp-client" }
+
+# Fast dev profile: optimized but with debug info and incremental compilation.
+# Use with: cargo run --profile dev-fast
+[profile.dev-fast]
+inherits = "dev"
+opt-level = 2
+
+# Optimize heavy compute deps even in debug builds —
+# real-time audio needs < 20ms per frame, impossible unoptimized.
+[profile.dev.package.nnnoiseless]
+opt-level = 3
+[profile.dev.package.audiopus_sys]
+opt-level = 3
+[profile.dev.package.audiopus]
+opt-level = 3
+[profile.dev.package.raptorq]
+opt-level = 3
+[profile.dev.package.wzp-codec]
+opt-level = 3
+[profile.dev.package.wzp-fec]
+opt-level = 3

android/.gitignore

@@ -0,0 +1,6 @@
.gradle/
build/
app/build/
app/src/main/jniLibs/
local.properties
keystore/*.jks


@@ -0,0 +1,85 @@
plugins {
id("com.android.application")
id("org.jetbrains.kotlin.android")
}
android {
namespace = "com.wzp.phone"
compileSdk = 34
defaultConfig {
applicationId = "com.wzp.phone"
minSdk = 26 // AAudio requires API 26
targetSdk = 34
versionCode = 1
versionName = "0.1.0"
ndk { abiFilters += listOf("arm64-v8a") }
}
signingConfigs {
create("release") {
storeFile = file("${project.rootDir}/keystore/wzp-release.jks")
storePassword = "wzphone2024"
keyAlias = "wzp-release"
keyPassword = "wzphone2024"
}
getByName("debug") {
storeFile = file("${project.rootDir}/keystore/wzp-debug.jks")
storePassword = "android"
keyAlias = "wzp-debug"
keyPassword = "android"
}
}
buildTypes {
debug {
signingConfig = signingConfigs.getByName("debug")
isDebuggable = true
}
release {
signingConfig = signingConfigs.getByName("release")
isMinifyEnabled = false
proguardFiles(
getDefaultProguardFile("proguard-android-optimize.txt"),
"proguard-rules.pro"
)
}
}
compileOptions {
sourceCompatibility = JavaVersion.VERSION_1_8
targetCompatibility = JavaVersion.VERSION_1_8
}
kotlinOptions {
jvmTarget = "1.8"
}
buildFeatures { compose = true }
composeOptions { kotlinCompilerExtensionVersion = "1.5.8" }
ndkVersion = "26.1.10909125"
}
// cargo-ndk integration: build the Rust native library for Android targets
tasks.register<Exec>("cargoNdkBuild") {
workingDir = file("${project.rootDir}/..")
commandLine(
"cargo", "ndk",
"-t", "arm64-v8a",
"-o", "${project.projectDir}/src/main/jniLibs",
"build", "--release", "-p", "wzp-android"
)
}
// Skip cargo-ndk in CI/Docker — .so is pre-built into jniLibs
// tasks.named("preBuild") { dependsOn("cargoNdkBuild") }
dependencies {
implementation("androidx.core:core-ktx:1.12.0")
implementation("androidx.lifecycle:lifecycle-runtime-ktx:2.7.0")
implementation("androidx.activity:activity-compose:1.8.2")
implementation(platform("androidx.compose:compose-bom:2024.01.00"))
implementation("androidx.compose.ui:ui")
implementation("androidx.compose.material3:material3")
}

android/app/proguard-rules.pro

@@ -0,0 +1,9 @@
# WZPhone ProGuard rules
# Keep JNI native methods
-keepclasseswithmembernames class * {
native <methods>;
}
# Keep the WZP engine bridge class
-keep class com.wzp.phone.engine.** { *; }


@@ -0,0 +1,43 @@
<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android">
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.RECORD_AUDIO" />
<uses-permission android:name="android.permission.FOREGROUND_SERVICE" />
<uses-permission android:name="android.permission.FOREGROUND_SERVICE_MICROPHONE" />
<uses-permission android:name="android.permission.WAKE_LOCK" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
<uses-permission android:name="android.permission.BLUETOOTH_CONNECT" />
<uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS" />
<application
android:name="com.wzp.WzpApplication"
android:label="WZ Phone"
android:supportsRtl="true"
android:theme="@android:style/Theme.Material.Light.NoActionBar">
<activity
android:name="com.wzp.ui.call.CallActivity"
android:exported="true"
android:launchMode="singleTask">
<intent-filter>
<action android:name="android.intent.action.MAIN" />
<category android:name="android.intent.category.LAUNCHER" />
</intent-filter>
</activity>
<service
android:name="com.wzp.service.CallService"
android:foregroundServiceType="microphone"
android:exported="false" />
<provider
android:name="androidx.core.content.FileProvider"
android:authorities="${applicationId}.fileprovider"
android:exported="false"
android:grantUriPermissions="true">
<meta-data
android:name="android.support.FILE_PROVIDER_PATHS"
android:resource="@xml/file_paths" />
</provider>
</application>
</manifest>


@@ -0,0 +1,38 @@
package com.wzp
import android.app.Application
import android.app.NotificationChannel
import android.app.NotificationManager
import android.os.Build
/**
* Application entry point for WarzonePhone.
*
* Creates the notification channel required for the foreground [com.wzp.service.CallService].
*/
class WzpApplication : Application() {
override fun onCreate() {
super.onCreate()
createNotificationChannel()
}
private fun createNotificationChannel() {
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.O) {
val channel = NotificationChannel(
CHANNEL_ID,
"Active Call",
NotificationManager.IMPORTANCE_LOW
).apply {
description = "Shown while a VoIP call is in progress"
setShowBadge(false)
}
val nm = getSystemService(NotificationManager::class.java)
nm.createNotificationChannel(channel)
}
}
companion object {
const val CHANNEL_ID = "wzp_call_channel"
}
}


@@ -0,0 +1,324 @@
package com.wzp.audio
import android.Manifest
import android.content.Context
import android.content.pm.PackageManager
import android.media.AudioAttributes
import android.media.AudioFormat
import android.media.AudioRecord
import android.media.AudioTrack
import android.media.MediaRecorder
import android.media.audiofx.AcousticEchoCanceler
import android.media.audiofx.NoiseSuppressor
import android.util.Log
import androidx.core.content.ContextCompat
import com.wzp.engine.WzpEngine
import java.io.BufferedOutputStream
import java.io.File
import java.io.FileOutputStream
import java.io.OutputStreamWriter
import java.nio.ByteBuffer
import java.nio.ByteOrder
import kotlin.math.pow
import kotlin.math.sqrt
/**
* Audio pipeline that captures mic audio and plays received audio using
* Android AudioRecord/AudioTrack APIs running on JVM threads.
*
* PCM samples are shuttled to/from the Rust engine via JNI ring buffers:
* - Capture: AudioRecord → WzpEngine.writeAudio() → Rust encoder → network
* - Playout: network → Rust decoder → WzpEngine.readAudio() → AudioTrack
*
* All audio is 48kHz, mono, 16-bit PCM (matching Opus codec requirements).
*/
class AudioPipeline(private val context: Context) {
companion object {
private const val TAG = "AudioPipeline"
private const val SAMPLE_RATE = 48000
private const val CHANNEL_IN = AudioFormat.CHANNEL_IN_MONO
private const val CHANNEL_OUT = AudioFormat.CHANNEL_OUT_MONO
private const val ENCODING = AudioFormat.ENCODING_PCM_16BIT
/** 20ms frame at 48kHz = 960 samples */
private const val FRAME_SAMPLES = 960
}
@Volatile
private var running = false
/** Playout (incoming voice) gain in dB. 0 = unity. */
@Volatile
var playoutGainDb: Float = 0f
/** Capture (mic) gain in dB. 0 = unity. */
@Volatile
var captureGainDb: Float = 0f
/** Whether to attach hardware AEC. Must be set before start(). */
var aecEnabled: Boolean = true
/** Enable debug recording of PCM + RMS histogram to cache dir. */
var debugRecording: Boolean = true
private var captureThread: Thread? = null
private var playoutThread: Thread? = null
private val debugDir: File by lazy {
File(context.cacheDir, "wzp_debug").also { it.mkdirs() }
}
fun start(engine: WzpEngine) {
if (running) return
running = true
captureThread = Thread({
runCapture(engine)
// Park thread forever — exiting triggers a libcrypto TLS destructor
// crash (SIGSEGV in OPENSSL_free) on Android when a JNI-calling thread exits.
parkThread()
}, "wzp-capture").apply {
isDaemon = true
priority = Thread.MAX_PRIORITY
start()
}
playoutThread = Thread({
runPlayout(engine)
parkThread()
}, "wzp-playout").apply {
isDaemon = true
priority = Thread.MAX_PRIORITY
start()
}
Log.i(TAG, "audio pipeline started")
}
fun stop() {
running = false
// Don't join — threads are parked as daemons to avoid native TLS crash
captureThread = null
playoutThread = null
Log.i(TAG, "audio pipeline stopped")
}
private fun applyGain(pcm: ShortArray, count: Int, db: Float) {
if (db == 0f) return
val linear = 10f.pow(db / 20f)
for (i in 0 until count) {
pcm[i] = (pcm[i] * linear).toInt().coerceIn(-32000, 32000).toShort()
}
}
private fun computeRms(pcm: ShortArray, count: Int): Int {
var sumSq = 0.0
for (i in 0 until count) {
val s = pcm[i].toDouble()
sumSq += s * s
}
return sqrt(sumSq / count).toInt()
}
private fun parkThread() {
try {
Thread.sleep(Long.MAX_VALUE)
} catch (_: InterruptedException) {
// process exiting
}
}
private fun runCapture(engine: WzpEngine) {
if (ContextCompat.checkSelfPermission(context, Manifest.permission.RECORD_AUDIO)
!= PackageManager.PERMISSION_GRANTED
) {
Log.e(TAG, "RECORD_AUDIO permission not granted, capture disabled")
return
}
val minBuf = AudioRecord.getMinBufferSize(SAMPLE_RATE, CHANNEL_IN, ENCODING)
val bufSize = maxOf(minBuf, FRAME_SAMPLES * 2 * 4) // at least 4 frames
val recorder = try {
AudioRecord(
MediaRecorder.AudioSource.VOICE_COMMUNICATION,
SAMPLE_RATE,
CHANNEL_IN,
ENCODING,
bufSize
)
} catch (e: SecurityException) {
Log.e(TAG, "AudioRecord SecurityException: ${e.message}")
return
}
if (recorder.state != AudioRecord.STATE_INITIALIZED) {
Log.e(TAG, "AudioRecord failed to initialize")
recorder.release()
return
}
// Attach hardware AEC if available and enabled in settings
var aec: AcousticEchoCanceler? = null
var ns: NoiseSuppressor? = null
if (aecEnabled) {
if (AcousticEchoCanceler.isAvailable()) {
try {
aec = AcousticEchoCanceler.create(recorder.audioSessionId)
aec?.enabled = true
Log.i(TAG, "AEC enabled (session=${recorder.audioSessionId})")
} catch (e: Exception) {
Log.w(TAG, "AEC init failed: ${e.message}")
}
} else {
Log.w(TAG, "AEC not available on this device")
}
// Attach hardware noise suppressor if available
if (NoiseSuppressor.isAvailable()) {
try {
ns = NoiseSuppressor.create(recorder.audioSessionId)
ns?.enabled = true
Log.i(TAG, "NoiseSuppressor enabled")
} catch (e: Exception) {
Log.w(TAG, "NoiseSuppressor init failed: ${e.message}")
}
}
} else {
Log.i(TAG, "AEC disabled by user setting")
}
recorder.startRecording()
Log.i(TAG, "capture started: ${SAMPLE_RATE}Hz mono, buf=$bufSize, aec=${aec?.enabled}, ns=${ns?.enabled}")
val pcm = ShortArray(FRAME_SAMPLES)
// Debug: PCM file + RMS CSV
var pcmOut: BufferedOutputStream? = null
var rmsCsv: OutputStreamWriter? = null
val byteConv = ByteBuffer.allocate(FRAME_SAMPLES * 2).order(ByteOrder.LITTLE_ENDIAN)
var frameIdx = 0L
if (debugRecording) {
try {
pcmOut = BufferedOutputStream(FileOutputStream(File(debugDir, "capture.pcm")), 65536)
rmsCsv = OutputStreamWriter(FileOutputStream(File(debugDir, "capture_rms.csv")))
rmsCsv.write("frame,time_ms,rms\n")
} catch (e: Exception) {
Log.w(TAG, "debug recording init failed: ${e.message}")
}
}
try {
while (running) {
val read = recorder.read(pcm, 0, FRAME_SAMPLES)
if (read > 0) {
applyGain(pcm, read, captureGainDb)
engine.writeAudio(pcm)
// Debug: write raw PCM + RMS
if (pcmOut != null) {
byteConv.clear()
for (i in 0 until read) byteConv.putShort(pcm[i])
pcmOut.write(byteConv.array(), 0, read * 2)
}
if (rmsCsv != null) {
val rms = computeRms(pcm, read)
val timeMs = frameIdx * FRAME_SAMPLES * 1000L / SAMPLE_RATE
rmsCsv.write("$frameIdx,$timeMs,$rms\n")
}
frameIdx++
} else if (read < 0) {
Log.e(TAG, "AudioRecord.read error: $read")
break
}
}
} finally {
pcmOut?.close()
rmsCsv?.close()
recorder.stop()
aec?.release()
ns?.release()
recorder.release()
Log.i(TAG, "capture stopped (frames=$frameIdx)")
}
}
private fun runPlayout(engine: WzpEngine) {
val minBuf = AudioTrack.getMinBufferSize(SAMPLE_RATE, CHANNEL_OUT, ENCODING)
val bufSize = maxOf(minBuf, FRAME_SAMPLES * 2 * 4)
val track = AudioTrack.Builder()
.setAudioAttributes(
AudioAttributes.Builder()
.setUsage(AudioAttributes.USAGE_VOICE_COMMUNICATION)
.setContentType(AudioAttributes.CONTENT_TYPE_SPEECH)
.build()
)
.setAudioFormat(
AudioFormat.Builder()
.setSampleRate(SAMPLE_RATE)
.setChannelMask(CHANNEL_OUT)
.setEncoding(ENCODING)
.build()
)
.setBufferSizeInBytes(bufSize)
.setTransferMode(AudioTrack.MODE_STREAM)
.build()
if (track.state != AudioTrack.STATE_INITIALIZED) {
Log.e(TAG, "AudioTrack failed to initialize")
track.release()
return
}
track.play()
Log.i(TAG, "playout started: ${SAMPLE_RATE}Hz mono, buf=$bufSize")
val pcm = ShortArray(FRAME_SAMPLES)
val silence = ShortArray(FRAME_SAMPLES)
// Debug: PCM file + RMS CSV for playout
var pcmOut: BufferedOutputStream? = null
var rmsCsv: OutputStreamWriter? = null
val byteConv = ByteBuffer.allocate(FRAME_SAMPLES * 2).order(ByteOrder.LITTLE_ENDIAN)
var frameIdx = 0L
if (debugRecording) {
try {
pcmOut = BufferedOutputStream(FileOutputStream(File(debugDir, "playout.pcm")), 65536)
rmsCsv = OutputStreamWriter(FileOutputStream(File(debugDir, "playout_rms.csv")))
rmsCsv.write("frame,time_ms,rms\n")
} catch (e: Exception) {
Log.w(TAG, "debug playout recording init failed: ${e.message}")
}
}
try {
while (running) {
val read = engine.readAudio(pcm)
if (read >= FRAME_SAMPLES) {
applyGain(pcm, read, playoutGainDb)
track.write(pcm, 0, read)
// Debug: write raw PCM + RMS
if (pcmOut != null) {
byteConv.clear()
for (i in 0 until read) byteConv.putShort(pcm[i])
pcmOut.write(byteConv.array(), 0, read * 2)
}
if (rmsCsv != null) {
val rms = computeRms(pcm, read)
val timeMs = frameIdx * FRAME_SAMPLES * 1000L / SAMPLE_RATE
rmsCsv.write("$frameIdx,$timeMs,$rms\n")
}
frameIdx++
} else {
track.write(silence, 0, FRAME_SAMPLES)
// Log silence frames to RMS as 0
if (rmsCsv != null) {
val timeMs = frameIdx * FRAME_SAMPLES * 1000L / SAMPLE_RATE
rmsCsv.write("$frameIdx,$timeMs,0\n")
}
frameIdx++
Thread.sleep(5)
}
}
} finally {
pcmOut?.close()
rmsCsv?.close()
track.stop()
track.release()
Log.i(TAG, "playout stopped (frames=$frameIdx)")
}
}
}


@@ -0,0 +1,142 @@
package com.wzp.audio
import android.content.Context
import android.media.AudioDeviceCallback
import android.media.AudioDeviceInfo
import android.media.AudioManager
import android.os.Handler
import android.os.Looper
/**
* Manages audio routing between earpiece, speaker, and Bluetooth devices.
*
* Wraps [AudioManager] operations and listens for device connection changes
* via [AudioDeviceCallback] (API 23+).
*
* Usage:
* 1. Call [register] when the call starts
* 2. Use [setSpeaker] and [setBluetoothSco] to switch routes
* 3. Call [unregister] when the call ends
*/
class AudioRouteManager(context: Context) {
private val audioManager = context.getSystemService(Context.AUDIO_SERVICE) as AudioManager
private val mainHandler = Handler(Looper.getMainLooper())
/** Listener for audio route changes. */
var onRouteChanged: ((AudioRoute) -> Unit)? = null
/** Current active route. */
var currentRoute: AudioRoute = AudioRoute.EARPIECE
private set
// -- Device callback (API 23+) -------------------------------------------
private val deviceCallback = object : AudioDeviceCallback() {
override fun onAudioDevicesAdded(addedDevices: Array<out AudioDeviceInfo>) {
for (device in addedDevices) {
if (device.type == AudioDeviceInfo.TYPE_BLUETOOTH_SCO) {
// A Bluetooth headset was connected — optionally auto-switch
onRouteChanged?.invoke(AudioRoute.BLUETOOTH)
}
}
}
override fun onAudioDevicesRemoved(removedDevices: Array<out AudioDeviceInfo>) {
for (device in removedDevices) {
if (device.type == AudioDeviceInfo.TYPE_BLUETOOTH_SCO) {
// Bluetooth disconnected — fall back to earpiece or speaker
val fallback = if (audioManager.isSpeakerphoneOn) {
AudioRoute.SPEAKER
} else {
AudioRoute.EARPIECE
}
currentRoute = fallback
onRouteChanged?.invoke(fallback)
}
}
}
}
// -- Public API -----------------------------------------------------------
/** Register the device callback. Call when a call starts. */
fun register() {
audioManager.registerAudioDeviceCallback(deviceCallback, mainHandler)
}
/** Unregister the device callback and release Bluetooth SCO. Call when the call ends. */
fun unregister() {
audioManager.unregisterAudioDeviceCallback(deviceCallback)
stopBluetoothSco()
}
/**
* Enable or disable the loudspeaker.
*
* When enabling speaker, Bluetooth SCO is disconnected.
*/
@Suppress("DEPRECATION")
fun setSpeaker(enabled: Boolean) {
if (enabled) {
stopBluetoothSco()
}
audioManager.isSpeakerphoneOn = enabled
currentRoute = if (enabled) AudioRoute.SPEAKER else AudioRoute.EARPIECE
onRouteChanged?.invoke(currentRoute)
}
/**
* Enable or disable Bluetooth SCO (Synchronous Connection Oriented) audio.
*
* When enabling Bluetooth, the speaker is turned off.
*/
@Suppress("DEPRECATION")
fun setBluetoothSco(enabled: Boolean) {
if (enabled) {
audioManager.isSpeakerphoneOn = false
audioManager.startBluetoothSco()
audioManager.isBluetoothScoOn = true
currentRoute = AudioRoute.BLUETOOTH
} else {
stopBluetoothSco()
currentRoute = AudioRoute.EARPIECE
}
onRouteChanged?.invoke(currentRoute)
}
/** Check whether a Bluetooth SCO device is currently connected. */
fun isBluetoothAvailable(): Boolean {
val devices = audioManager.getDevices(AudioManager.GET_DEVICES_OUTPUTS)
return devices.any { it.type == AudioDeviceInfo.TYPE_BLUETOOTH_SCO }
}
/** List available output audio routes. */
fun availableRoutes(): List<AudioRoute> {
val routes = mutableListOf(AudioRoute.EARPIECE, AudioRoute.SPEAKER)
if (isBluetoothAvailable()) {
routes.add(AudioRoute.BLUETOOTH)
}
return routes
}
// -- Internal -------------------------------------------------------------
@Suppress("DEPRECATION")
private fun stopBluetoothSco() {
if (audioManager.isBluetoothScoOn) {
audioManager.isBluetoothScoOn = false
audioManager.stopBluetoothSco()
}
}
}
/** Audio output route. */
enum class AudioRoute {
/** Phone earpiece (default for calls). */
EARPIECE,
/** Built-in loudspeaker. */
SPEAKER,
/** Bluetooth SCO headset/headphones. */
BLUETOOTH
}
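The mutual-exclusion rules in `setSpeaker` and `setBluetoothSco` can be modelled as a pure function over `AudioRoute`. This is a standalone sketch (the enum is mirrored here so the snippet is self-contained), not part of the manager itself:

```kotlin
enum class AudioRoute { EARPIECE, SPEAKER, BLUETOOTH }

// Pure model of the routing rules: speaker and Bluetooth SCO are mutually
// exclusive, and disabling either one falls back to the earpiece.
fun nextRoute(
    current: AudioRoute,
    speakerOn: Boolean? = null,
    bluetoothOn: Boolean? = null
): AudioRoute = when {
    speakerOn == true -> AudioRoute.SPEAKER       // enabling speaker drops SCO
    bluetoothOn == true -> AudioRoute.BLUETOOTH   // enabling SCO drops speaker
    speakerOn == false || bluetoothOn == false -> AudioRoute.EARPIECE
    else -> current                               // no request: route unchanged
}
```

This mirrors the invariant above: enabling one route tears the other down, with the earpiece as the default fallback.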


@@ -0,0 +1,141 @@
package com.wzp.data
import android.content.Context
import android.content.SharedPreferences
import com.wzp.ui.call.ServerEntry
import org.json.JSONArray
import org.json.JSONObject
import java.security.SecureRandom
/**
* Persists user settings via SharedPreferences.
*
* Stores: servers, default server index, room name, alias, gain values,
* IPv6 preference, and the identity seed (hex-encoded 32 bytes).
*/
class SettingsRepository(context: Context) {
private val prefs: SharedPreferences =
context.applicationContext.getSharedPreferences("wzp_settings", Context.MODE_PRIVATE)
companion object {
private const val KEY_SERVERS = "servers_json"
private const val KEY_SELECTED_SERVER = "selected_server"
private const val KEY_ROOM = "room_name"
private const val KEY_ALIAS = "alias"
private const val KEY_PLAYOUT_GAIN = "playout_gain_db"
private const val KEY_CAPTURE_GAIN = "capture_gain_db"
private const val KEY_PREFER_IPV6 = "prefer_ipv6"
private const val KEY_IDENTITY_SEED = "identity_seed_hex"
private const val KEY_AEC_ENABLED = "aec_enabled"
}
// --- Servers ---
fun saveServers(servers: List<ServerEntry>) {
val arr = JSONArray()
servers.forEach { entry ->
arr.put(JSONObject().apply {
put("address", entry.address)
put("label", entry.label)
})
}
prefs.edit().putString(KEY_SERVERS, arr.toString()).apply()
}
fun loadServers(): List<ServerEntry>? {
val json = prefs.getString(KEY_SERVERS, null) ?: return null
return try {
val arr = JSONArray(json)
(0 until arr.length()).map { i ->
val obj = arr.getJSONObject(i)
ServerEntry(obj.getString("address"), obj.getString("label"))
}
} catch (_: Exception) { null }
}
fun saveSelectedServer(index: Int) {
prefs.edit().putInt(KEY_SELECTED_SERVER, index).apply()
}
fun loadSelectedServer(): Int = prefs.getInt(KEY_SELECTED_SERVER, 0)
// --- Room ---
fun saveRoom(name: String) { prefs.edit().putString(KEY_ROOM, name).apply() }
fun loadRoom(): String = prefs.getString(KEY_ROOM, "android") ?: "android"
// --- Alias ---
fun saveAlias(alias: String) { prefs.edit().putString(KEY_ALIAS, alias).apply() }
/**
* Load alias, generating a random name on first launch.
*/
fun getOrCreateAlias(): String {
val existing = prefs.getString(KEY_ALIAS, null)
if (!existing.isNullOrEmpty()) return existing
val name = generateRandomName()
prefs.edit().putString(KEY_ALIAS, name).apply()
return name
}
private fun generateRandomName(): String {
val adjectives = listOf(
"Swift", "Silent", "Brave", "Calm", "Dark", "Fierce", "Ghost",
"Iron", "Lucky", "Noble", "Quick", "Sharp", "Storm", "Wild",
"Cold", "Bright", "Lone", "Red", "Grey", "Frosty", "Dusty",
"Rusty", "Neon", "Void", "Solar", "Lunar", "Cyber", "Pixel",
"Sonic", "Hyper", "Turbo", "Nano", "Mega", "Ultra", "Zinc"
)
val nouns = listOf(
"Wolf", "Hawk", "Fox", "Bear", "Lynx", "Crow", "Viper",
"Cobra", "Tiger", "Eagle", "Shark", "Raven", "Falcon", "Otter",
"Mantis", "Panda", "Jackal", "Badger", "Heron", "Bison",
"Condor", "Coyote", "Gecko", "Hornet", "Marten", "Osprey",
"Parrot", "Puma", "Raptor", "Stork", "Toucan", "Walrus"
)
val adj = adjectives.random()
val noun = nouns.random()
return "$adj $noun"
}
// --- Gain ---
fun savePlayoutGain(db: Float) { prefs.edit().putFloat(KEY_PLAYOUT_GAIN, db).apply() }
fun loadPlayoutGain(): Float = prefs.getFloat(KEY_PLAYOUT_GAIN, 0f)
fun saveCaptureGain(db: Float) { prefs.edit().putFloat(KEY_CAPTURE_GAIN, db).apply() }
fun loadCaptureGain(): Float = prefs.getFloat(KEY_CAPTURE_GAIN, 0f)
// --- IPv6 ---
fun savePreferIPv6(prefer: Boolean) { prefs.edit().putBoolean(KEY_PREFER_IPV6, prefer).apply() }
fun loadPreferIPv6(): Boolean = prefs.getBoolean(KEY_PREFER_IPV6, false)
// --- AEC ---
fun saveAecEnabled(enabled: Boolean) { prefs.edit().putBoolean(KEY_AEC_ENABLED, enabled).apply() }
fun loadAecEnabled(): Boolean = prefs.getBoolean(KEY_AEC_ENABLED, true)
// --- Identity seed ---
/**
* Get or generate the identity seed. On first call, generates a random
* 32-byte seed and persists it. Subsequent calls return the same seed.
*/
fun getOrCreateSeedHex(): String {
val existing = prefs.getString(KEY_IDENTITY_SEED, null)
if (!existing.isNullOrEmpty()) return existing
val seed = ByteArray(32).also { SecureRandom().nextBytes(it) }
val hex = seed.joinToString("") { "%02x".format(it) }
prefs.edit().putString(KEY_IDENTITY_SEED, hex).apply()
return hex
}
fun loadSeedHex(): String = prefs.getString(KEY_IDENTITY_SEED, "") ?: ""
fun saveSeedHex(hex: String) {
prefs.edit().putString(KEY_IDENTITY_SEED, hex).apply()
}
}
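The seed round-trip implied by `getOrCreateSeedHex` can be sketched standalone. `hexToBytes` is a hypothetical helper illustrating how a consumer of the stored hex string (e.g. a native engine binding) would recover the raw 32 bytes; it is not part of the repository class:

```kotlin
import java.security.SecureRandom

// Encode bytes as lowercase hex, the same formatting getOrCreateSeedHex uses.
// "%02x" on a Byte prints its unsigned two's-complement value, so -1 -> "ff".
fun bytesToHex(bytes: ByteArray): String = bytes.joinToString("") { "%02x".format(it) }

// Decode a hex string back to raw bytes (two hex chars per byte).
fun hexToBytes(hex: String): ByteArray =
    ByteArray(hex.length / 2) { i -> hex.substring(i * 2, i * 2 + 2).toInt(16).toByte() }

// A fresh 32-byte seed encodes to exactly 64 hex characters.
fun freshSeedHex(): String = bytesToHex(ByteArray(32).also { SecureRandom().nextBytes(it) })
```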


@@ -0,0 +1,198 @@
package com.wzp.debug
import android.content.Context
import android.util.Log
import kotlinx.coroutines.Dispatchers
import kotlinx.coroutines.withContext
import java.io.BufferedOutputStream
import java.io.File
import java.io.FileInputStream
import java.io.FileOutputStream
import java.nio.ByteBuffer
import java.nio.ByteOrder
import java.text.SimpleDateFormat
import java.util.Date
import java.util.Locale
import java.util.zip.ZipEntry
import java.util.zip.ZipOutputStream
/**
* Collects call debug data (audio recordings, logs, histograms, stats)
* into a zip file for email sharing.
*/
class DebugReporter(private val context: Context) {
companion object {
private const val TAG = "DebugReporter"
private const val SAMPLE_RATE = 48000
}
/**
* Build a zip with all debug data.
* Returns the zip File on success, or null on failure.
*/
suspend fun collectZip(
callDurationSecs: Double,
finalStatsJson: String,
aecEnabled: Boolean,
alias: String,
server: String,
room: String
): File? = withContext(Dispatchers.IO) {
try {
val debugDir = File(context.cacheDir, "wzp_debug")
val timestamp = SimpleDateFormat("yyyyMMdd_HHmmss", Locale.US).format(Date())
val zipFile = File(context.cacheDir, "wzp_debug_${timestamp}.zip")
ZipOutputStream(BufferedOutputStream(FileOutputStream(zipFile))).use { zos ->
// 1. Call metadata
val meta = buildString {
appendLine("=== WZ Phone Debug Report ===")
appendLine("Timestamp: $timestamp")
appendLine("Alias: $alias")
appendLine("Server: $server")
appendLine("Room: $room")
appendLine("Duration: ${"%.1f".format(callDurationSecs)}s")
appendLine("AEC: ${if (aecEnabled) "ON" else "OFF"}")
appendLine("Device: ${android.os.Build.MANUFACTURER} ${android.os.Build.MODEL}")
appendLine("Android: ${android.os.Build.VERSION.RELEASE} (API ${android.os.Build.VERSION.SDK_INT})")
appendLine()
appendLine("=== Final Stats ===")
appendLine(finalStatsJson)
}
addTextEntry(zos, "meta.txt", meta)
// 2. Logcat — WZP-related tags
val logcat = collectLogcat()
addTextEntry(zos, "logcat.txt", logcat)
// 3. Capture audio (mic) → WAV
val captureRaw = File(debugDir, "capture.pcm")
if (captureRaw.exists() && captureRaw.length() > 0) {
addWavEntry(zos, "capture.wav", captureRaw)
Log.i(TAG, "capture.pcm: ${captureRaw.length()} bytes -> WAV")
}
// 4. Playout audio (speaker) → WAV
val playoutRaw = File(debugDir, "playout.pcm")
if (playoutRaw.exists() && playoutRaw.length() > 0) {
addWavEntry(zos, "playout.wav", playoutRaw)
Log.i(TAG, "playout.pcm: ${playoutRaw.length()} bytes -> WAV")
}
// 5. RMS histogram CSV
val captureHist = File(debugDir, "capture_rms.csv")
if (captureHist.exists()) addFileEntry(zos, "capture_rms.csv", captureHist)
val playoutHist = File(debugDir, "playout_rms.csv")
if (playoutHist.exists()) addFileEntry(zos, "playout_rms.csv", playoutHist)
}
Log.i(TAG, "zip created: ${zipFile.length()} bytes (${zipFile.length() / 1024}KB)")
// Clean up raw debug files (keep zip)
debugDir.listFiles()?.forEach { it.delete() }
zipFile
} catch (e: Exception) {
Log.e(TAG, "debug report failed", e)
null
}
}
/** Clean up any leftover debug files from a previous session. */
fun prepareForCall() {
val debugDir = File(context.cacheDir, "wzp_debug")
if (debugDir.exists()) {
debugDir.listFiles()?.forEach { it.delete() }
}
debugDir.mkdirs()
// Also clean up old zip files
context.cacheDir.listFiles()?.filter { it.name.startsWith("wzp_debug_") }?.forEach { it.delete() }
}
private fun collectLogcat(): String {
return try {
val process = Runtime.getRuntime().exec(
arrayOf(
"logcat", "-d",
"-t", "5000",
"--format", "threadtime"
)
)
val output = process.inputStream.bufferedReader().readText()
process.waitFor()
output.lines()
.filter { line ->
line.contains("wzp", ignoreCase = true) ||
line.contains("WzpEngine") ||
line.contains("AudioPipeline") ||
line.contains("WzpCall") ||
line.contains("CallService") ||
line.contains("AudioTrack") ||
line.contains("AudioRecord") ||
line.contains("AcousticEchoCanceler") ||
line.contains("NoiseSuppressor") ||
line.contains("FATAL") ||
line.contains("ANR") ||
line.contains("AudioFlinger") ||
line.contains("DebugReporter") ||
line.contains("QUIC") ||
line.contains("quinn") ||
line.contains("send task") ||
line.contains("recv task") ||
line.contains("send stats") ||
line.contains("recv stats") ||
line.contains("send_media") ||
line.contains("FEC block") ||
line.contains("recv gap") ||
line.contains("frames_dropped") ||
line.contains("opus")
}
.joinToString("\n")
} catch (e: Exception) {
"Failed to collect logcat: ${e.message}"
}
}
private fun addWavEntry(zos: ZipOutputStream, name: String, pcmFile: File) {
val dataSize = pcmFile.length().toInt()
val byteRate = SAMPLE_RATE * 1 * 16 / 8
val blockAlign = 1 * 16 / 8
zos.putNextEntry(ZipEntry(name))
// Write WAV header (44 bytes)
val header = ByteBuffer.allocate(44).order(ByteOrder.LITTLE_ENDIAN)
header.put("RIFF".toByteArray())
header.putInt(36 + dataSize)
header.put("WAVE".toByteArray())
header.put("fmt ".toByteArray())
header.putInt(16)
header.putShort(1) // PCM
header.putShort(1) // mono
header.putInt(SAMPLE_RATE)
header.putInt(byteRate)
header.putShort(blockAlign.toShort())
header.putShort(16) // bits per sample
header.put("data".toByteArray())
header.putInt(dataSize)
zos.write(header.array())
// Stream PCM data directly (avoids loading entire file into memory)
FileInputStream(pcmFile).use { it.copyTo(zos) }
zos.closeEntry()
}
private fun addTextEntry(zos: ZipOutputStream, name: String, content: String) {
zos.putNextEntry(ZipEntry(name))
zos.write(content.toByteArray())
zos.closeEntry()
}
private fun addFileEntry(zos: ZipOutputStream, name: String, file: File) {
zos.putNextEntry(ZipEntry(name))
FileInputStream(file).use { it.copyTo(zos) }
zos.closeEntry()
}
}
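The 44-byte header written by `addWavEntry` follows the canonical PCM WAV layout (mono, 16-bit, little-endian). A standalone sketch that builds the same header so the field offsets can be probed individually:

```kotlin
import java.nio.ByteBuffer
import java.nio.ByteOrder

// Canonical 44-byte PCM WAV header: RIFF chunk, "fmt " subchunk, "data" subchunk.
fun wavHeader(sampleRate: Int, dataSize: Int): ByteArray {
    val byteRate = sampleRate * 2          // 1 channel * 16 bits / 8
    val header = ByteBuffer.allocate(44).order(ByteOrder.LITTLE_ENDIAN)
    header.put("RIFF".toByteArray())
    header.putInt(36 + dataSize)           // RIFF chunk size: total file size - 8
    header.put("WAVE".toByteArray())
    header.put("fmt ".toByteArray())
    header.putInt(16)                      // fmt subchunk size for PCM
    header.putShort(1)                     // audio format 1 = PCM
    header.putShort(1)                     // channels: mono
    header.putInt(sampleRate)
    header.putInt(byteRate)
    header.putShort(2)                     // block align: channels * bytes per sample
    header.putShort(16)                    // bits per sample
    header.put("data".toByteArray())
    header.putInt(dataSize)
    return header.array()
}
```

With `sampleRate = 48000` the sample-rate field sits at byte offset 24 and the data size at offset 40, which is what makes streaming the PCM body directly after the header valid.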


@@ -0,0 +1,97 @@
package com.wzp.engine
import org.json.JSONArray
import org.json.JSONObject
/**
* Snapshot of call statistics, mirroring the Rust `CallStats` struct.
*
* Constructed from the JSON string returned by [WzpEngine.getStats].
*/
data class CallStats(
/** Current call state ordinal (see [CallStateConstants]). */
val state: Int = 0,
/** Call duration in seconds. */
val durationSecs: Double = 0.0,
/** Quality tier: 0 = Good, 1 = Degraded, 2 = Catastrophic. */
val qualityTier: Int = 0,
/** Observed packet loss percentage (0..100). */
val lossPct: Float = 0f,
/** Smoothed round-trip time in milliseconds. */
val rttMs: Int = 0,
/** Jitter in milliseconds. */
val jitterMs: Int = 0,
/** Current jitter buffer depth in packets. */
val jitterBufferDepth: Int = 0,
/** Total frames encoded since call start. */
val framesEncoded: Long = 0,
/** Total frames decoded since call start. */
val framesDecoded: Long = 0,
/** Number of playout underruns (buffer empty when audio was needed). */
val underruns: Long = 0,
/** Frames recovered by FEC. */
val fecRecovered: Long = 0,
/** Current mic audio level (RMS, 0-32767). */
val audioLevel: Int = 0,
/** Number of participants in the room. */
val roomParticipantCount: Int = 0,
/** Participants in the room (fingerprint + optional alias). */
val roomParticipants: List<RoomMember> = emptyList(),
) {
/** Human-readable quality label. */
val qualityLabel: String
get() = when (qualityTier) {
0 -> "Good"
1 -> "Degraded"
2 -> "Catastrophic"
else -> "Unknown"
}
companion object {
private fun parseParticipants(arr: JSONArray?): List<RoomMember> {
if (arr == null) return emptyList()
return (0 until arr.length()).map { i ->
val o = arr.getJSONObject(i)
RoomMember(
fingerprint = o.optString("fingerprint", ""),
alias = if (o.isNull("alias")) null else o.optString("alias", null)
)
}
}
/** Deserialise from the JSON string produced by the native engine. */
fun fromJson(json: String): CallStats {
return try {
val obj = JSONObject(json)
CallStats(
state = obj.optInt("state", 0),
durationSecs = obj.optDouble("duration_secs", 0.0),
qualityTier = obj.optInt("quality_tier", 0),
lossPct = obj.optDouble("loss_pct", 0.0).toFloat(),
rttMs = obj.optInt("rtt_ms", 0),
jitterMs = obj.optInt("jitter_ms", 0),
jitterBufferDepth = obj.optInt("jitter_buffer_depth", 0),
framesEncoded = obj.optLong("frames_encoded", 0),
framesDecoded = obj.optLong("frames_decoded", 0),
underruns = obj.optLong("underruns", 0),
fecRecovered = obj.optLong("fec_recovered", 0),
audioLevel = obj.optInt("audio_level", 0),
roomParticipantCount = obj.optInt("room_participant_count", 0),
roomParticipants = parseParticipants(obj.optJSONArray("room_participants"))
)
} catch (_: Exception) {
CallStats()
}
}
}
}
data class RoomMember(
val fingerprint: String,
val alias: String? = null
) {
/** Short display name: alias if set, otherwise first 8 chars of fingerprint. */
val displayName: String
get() = alias?.takeIf { it.isNotBlank() }
?: fingerprint.take(8).ifEmpty { "unknown" }
}
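The `displayName` fallback can be exercised in isolation. This is a standalone copy of the rule for illustration, not an addition to the class:

```kotlin
// Alias wins when present and non-blank; otherwise the first 8 chars of the
// fingerprint; "unknown" when both are missing.
fun displayName(fingerprint: String, alias: String?): String =
    alias?.takeIf { it.isNotBlank() }
        ?: fingerprint.take(8).ifEmpty { "unknown" }
```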


@@ -0,0 +1,32 @@
package com.wzp.engine
/**
* Callback interface for VoIP engine events.
*
* All callbacks are invoked on the main/UI thread.
*/
interface WzpCallback {
/**
* Called when the call state changes.
*
* @param state one of [CallStateConstants]: IDLE(0), CONNECTING(1), ACTIVE(2),
* RECONNECTING(3), CLOSED(4)
*/
fun onCallStateChanged(state: Int)
/**
* Called when the network quality tier changes.
*
* @param tier 0 = Good, 1 = Degraded, 2 = Catastrophic
*/
fun onQualityTierChanged(tier: Int)
/**
* Called when an error occurs in the native engine.
*
* @param code numeric error code (negative)
* @param message human-readable description
*/
fun onError(code: Int, message: String)
}


@@ -0,0 +1,149 @@
package com.wzp.engine
/**
* Native VoIP engine wrapper. Delegates all work to libwzp_android.so via JNI.
*
* Lifecycle:
* 1. Construct with a [WzpCallback]
* 2. Call [init] to create the native engine
* 3. Call [startCall] to begin a VoIP session
* 4. Use [setMute], [setSpeaker], [getStats], [forceProfile] during the call
* 5. Call [stopCall] to end the session
* 6. Call [destroy] when the engine is no longer needed
*
* Thread safety: all methods must be called from the same thread (typically main).
*/
class WzpEngine(private val callback: WzpCallback) {
/** Opaque pointer to the native EngineHandle. 0 means not initialized. */
private var nativeHandle: Long = 0L
/** Whether the engine has been initialised. */
val isInitialized: Boolean get() = nativeHandle != 0L
/** Create the native engine. Must be called before any other method. */
fun init() {
check(nativeHandle == 0L) { "Engine already initialized" }
nativeHandle = nativeInit()
check(nativeHandle != 0L) { "Native engine creation failed" }
}
/**
* Start a call.
*
* @param relayAddr relay server address (host:port)
* @param room room identifier (used as QUIC SNI)
* @param seedHex 64-char hex-encoded 32-byte identity seed (empty = random)
* @param token authentication token (empty = no auth)
* @param alias display name sent to relay for room participant list
* @return 0 on success, negative error code on failure
*/
fun startCall(relayAddr: String, room: String, seedHex: String = "", token: String = "", alias: String = ""): Int {
check(nativeHandle != 0L) { "Engine not initialized" }
val result = nativeStartCall(nativeHandle, relayAddr, room, seedHex, token, alias)
if (result == 0) {
callback.onCallStateChanged(CallStateConstants.CONNECTING)
} else {
callback.onError(result, "Failed to start call")
}
return result
}
/** Stop the active call. Safe to call when no call is active. */
fun stopCall() {
if (nativeHandle != 0L) {
nativeStopCall(nativeHandle)
callback.onCallStateChanged(CallStateConstants.CLOSED)
}
}
/** Mute or unmute the microphone. */
fun setMute(muted: Boolean) {
if (nativeHandle != 0L) nativeSetMute(nativeHandle, muted)
}
/** Enable or disable loudspeaker mode. */
fun setSpeaker(speaker: Boolean) {
if (nativeHandle != 0L) nativeSetSpeaker(nativeHandle, speaker)
}
/**
* Get current call statistics as a JSON string.
*
* @return JSON-serialized [CallStats], or `"{}"` if the engine is not initialized.
*/
fun getStats(): String {
if (nativeHandle == 0L) return "{}"
return try {
nativeGetStats(nativeHandle) ?: "{}"
} catch (_: Exception) {
"{}"
}
}
/**
* Force a quality profile, overriding adaptive selection.
*
* @param profile 0 = GOOD, 1 = DEGRADED, 2 = CATASTROPHIC
*/
fun forceProfile(profile: Int) {
if (nativeHandle != 0L) nativeForceProfile(nativeHandle, profile)
}
/** Destroy the native engine and free all resources. The instance must not be reused. */
fun destroy() {
if (nativeHandle != 0L) {
nativeDestroy(nativeHandle)
nativeHandle = 0L
}
}
/**
* Write captured PCM samples into the engine's capture ring buffer.
* Called from the AudioRecord capture thread.
*/
fun writeAudio(pcm: ShortArray): Int {
if (nativeHandle == 0L) return 0
return nativeWriteAudio(nativeHandle, pcm)
}
/**
* Read decoded PCM samples from the engine's playout ring buffer.
* Called from the AudioTrack playout thread.
*/
fun readAudio(pcm: ShortArray): Int {
if (nativeHandle == 0L) return 0
return nativeReadAudio(nativeHandle, pcm)
}
// -- JNI native methods --------------------------------------------------
private external fun nativeInit(): Long
private external fun nativeStartCall(
handle: Long, relay: String, room: String, seed: String, token: String, alias: String
): Int
private external fun nativeStopCall(handle: Long)
private external fun nativeSetMute(handle: Long, muted: Boolean)
private external fun nativeSetSpeaker(handle: Long, speaker: Boolean)
private external fun nativeGetStats(handle: Long): String?
private external fun nativeForceProfile(handle: Long, profile: Int)
private external fun nativeWriteAudio(handle: Long, pcm: ShortArray): Int
private external fun nativeReadAudio(handle: Long, pcm: ShortArray): Int
private external fun nativeDestroy(handle: Long)
companion object {
init {
System.loadLibrary("wzp_android")
}
}
}
/** Integer constants matching the Rust [CallState] enum ordinals. */
object CallStateConstants {
const val IDLE = 0
const val CONNECTING = 1
const val ACTIVE = 2
const val RECONNECTING = 3
const val CLOSED = 4
}
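The state constants imply a small call state machine. The transitions below are an assumption inferred from the lifecycle docs (`startCall` drives IDLE to CONNECTING, the engine reports ACTIVE, transient network loss toggles RECONNECTING, `stopCall` ends in CLOSED); they are not confirmed against the Rust `CallState` enum:

```kotlin
// Standalone mirror of CallStateConstants for this sketch.
object CallState {
    const val IDLE = 0
    const val CONNECTING = 1
    const val ACTIVE = 2
    const val RECONNECTING = 3
    const val CLOSED = 4
}

// Assumed legal transitions; any state may end in CLOSED except IDLE,
// which first has to attempt a connection.
fun isLegalTransition(from: Int, to: Int): Boolean = when (from) {
    CallState.IDLE -> to == CallState.CONNECTING
    CallState.CONNECTING -> to == CallState.ACTIVE || to == CallState.CLOSED
    CallState.ACTIVE -> to == CallState.RECONNECTING || to == CallState.CLOSED
    CallState.RECONNECTING -> to == CallState.ACTIVE || to == CallState.CLOSED
    else -> false // CLOSED is terminal
}
```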


@@ -0,0 +1,172 @@
package com.wzp.service
import android.app.Notification
import android.app.PendingIntent
import android.app.Service
import android.content.Context
import android.content.Intent
import android.media.AudioManager
import android.net.wifi.WifiManager
import android.os.IBinder
import android.os.PowerManager
import androidx.core.app.NotificationCompat
import com.wzp.WzpApplication
import com.wzp.ui.call.CallActivity
/**
* Foreground service that keeps the VoIP call alive when the app is backgrounded.
*
* Responsibilities:
* - Shows a persistent notification during the call
* - Acquires a partial wake lock so the CPU stays on
* - Acquires a Wi-Fi lock to prevent Wi-Fi from going to sleep
* - Sets [AudioManager] mode to [AudioManager.MODE_IN_COMMUNICATION]
* - Releases all resources when the call ends
*/
class CallService : Service() {
private var wakeLock: PowerManager.WakeLock? = null
private var wifiLock: WifiManager.WifiLock? = null
private var previousAudioMode: Int = AudioManager.MODE_NORMAL
// -- Lifecycle ------------------------------------------------------------
override fun onCreate() {
super.onCreate()
acquireWakeLock()
acquireWifiLock()
setAudioMode()
}
override fun onStartCommand(intent: Intent?, flags: Int, startId: Int): Int {
when (intent?.action) {
ACTION_STOP -> {
onStopFromNotification?.invoke()
stopSelf()
return START_NOT_STICKY
}
}
startForeground(NOTIFICATION_ID, buildNotification())
return START_STICKY
}
override fun onDestroy() {
restoreAudioMode()
releaseWifiLock()
releaseWakeLock()
super.onDestroy()
}
override fun onBind(intent: Intent?): IBinder? = null
// -- Notification ---------------------------------------------------------
private fun buildNotification(): Notification {
// Tapping the notification returns to the call screen
val contentIntent = PendingIntent.getActivity(
this,
0,
Intent(this, CallActivity::class.java).apply {
flags = Intent.FLAG_ACTIVITY_SINGLE_TOP
},
PendingIntent.FLAG_IMMUTABLE or PendingIntent.FLAG_UPDATE_CURRENT
)
// "End call" action button
val stopIntent = PendingIntent.getService(
this,
1,
Intent(this, CallService::class.java).apply { action = ACTION_STOP },
PendingIntent.FLAG_IMMUTABLE or PendingIntent.FLAG_UPDATE_CURRENT
)
return NotificationCompat.Builder(this, WzpApplication.CHANNEL_ID)
.setContentTitle("WZ Phone")
.setContentText("Call in progress")
.setSmallIcon(android.R.drawable.ic_menu_call)
.setOngoing(true)
.setContentIntent(contentIntent)
.addAction(android.R.drawable.ic_menu_close_clear_cancel, "End Call", stopIntent)
.setCategory(NotificationCompat.CATEGORY_CALL)
.setPriority(NotificationCompat.PRIORITY_LOW)
.build()
}
// -- Wake lock ------------------------------------------------------------
private fun acquireWakeLock() {
val pm = getSystemService(Context.POWER_SERVICE) as PowerManager
wakeLock = pm.newWakeLock(
PowerManager.PARTIAL_WAKE_LOCK,
"wzp:call_wake_lock"
).apply {
acquire(MAX_CALL_DURATION_MS)
}
}
private fun releaseWakeLock() {
wakeLock?.let {
if (it.isHeld) it.release()
}
wakeLock = null
}
// -- Wi-Fi lock -----------------------------------------------------------
@Suppress("DEPRECATION")
private fun acquireWifiLock() {
val wm = applicationContext.getSystemService(Context.WIFI_SERVICE) as WifiManager
wifiLock = wm.createWifiLock(
WifiManager.WIFI_MODE_FULL_HIGH_PERF,
"wzp:call_wifi_lock"
).apply {
acquire()
}
}
private fun releaseWifiLock() {
wifiLock?.let {
if (it.isHeld) it.release()
}
wifiLock = null
}
// -- Audio mode -----------------------------------------------------------
private fun setAudioMode() {
val am = getSystemService(Context.AUDIO_SERVICE) as AudioManager
previousAudioMode = am.mode
am.mode = AudioManager.MODE_IN_COMMUNICATION
}
private fun restoreAudioMode() {
val am = getSystemService(Context.AUDIO_SERVICE) as AudioManager
am.mode = previousAudioMode
}
// -- Static helpers -------------------------------------------------------
companion object {
private const val NOTIFICATION_ID = 1001
private const val ACTION_STOP = "com.wzp.service.STOP"
private const val MAX_CALL_DURATION_MS = 4L * 60 * 60 * 1000 // 4 hours
/** Called when the user taps "End Call" in the notification. */
var onStopFromNotification: (() -> Unit)? = null
/** Start the foreground call service. */
fun start(context: Context) {
val intent = Intent(context, CallService::class.java)
context.startForegroundService(intent)
}
/** Stop the foreground call service. */
fun stop(context: Context) {
val intent = Intent(context, CallService::class.java).apply {
action = ACTION_STOP
}
context.startService(intent)
}
}
}


@@ -0,0 +1,149 @@
package com.wzp.ui.call
import android.Manifest
import android.content.Intent
import android.content.pm.PackageManager
import android.os.Bundle
import android.util.Log
import android.widget.Toast
import androidx.activity.ComponentActivity
import androidx.activity.compose.setContent
import androidx.activity.result.contract.ActivityResultContracts
import androidx.activity.viewModels
import androidx.compose.material3.MaterialTheme
import androidx.compose.material3.darkColorScheme
import androidx.compose.material3.dynamicDarkColorScheme
import androidx.compose.material3.dynamicLightColorScheme
import androidx.compose.material3.lightColorScheme
import androidx.compose.foundation.isSystemInDarkTheme
import androidx.compose.runtime.Composable
import androidx.compose.runtime.getValue
import androidx.compose.runtime.mutableStateOf
import androidx.compose.runtime.remember
import androidx.compose.runtime.setValue
import androidx.compose.ui.platform.LocalContext
import androidx.core.content.ContextCompat
import androidx.core.content.FileProvider
import androidx.lifecycle.Lifecycle
import androidx.lifecycle.lifecycleScope
import androidx.lifecycle.repeatOnLifecycle
import com.wzp.ui.settings.SettingsScreen
import kotlinx.coroutines.launch
/**
* Main activity hosting the in-call Compose UI.
*
* Call lifecycle (wake lock, Wi-Fi lock, audio mode, notification)
* is managed by [com.wzp.service.CallService] foreground service.
*/
class CallActivity : ComponentActivity() {
companion object {
private const val TAG = "CallActivity"
}
private val viewModel: CallViewModel by viewModels()
private val audioPermissionLauncher = registerForActivityResult(
ActivityResultContracts.RequestPermission()
) { granted ->
if (!granted) {
Toast.makeText(this, "Microphone permission is required for calls", Toast.LENGTH_LONG).show()
}
}
override fun onCreate(savedInstanceState: Bundle?) {
super.onCreate(savedInstanceState)
viewModel.setContext(this)
setContent {
WzpTheme {
var showSettings by remember { mutableStateOf(false) }
if (showSettings) {
SettingsScreen(
viewModel = viewModel,
onBack = { showSettings = false }
)
} else {
InCallScreen(
viewModel = viewModel,
onHangUp = { viewModel.stopCall() },
onOpenSettings = { showSettings = true }
)
}
}
}
if (ContextCompat.checkSelfPermission(this, Manifest.permission.RECORD_AUDIO)
!= PackageManager.PERMISSION_GRANTED
) {
audioPermissionLauncher.launch(Manifest.permission.RECORD_AUDIO)
}
// Watch for debug zip ready → launch email intent
lifecycleScope.launch {
repeatOnLifecycle(Lifecycle.State.STARTED) {
viewModel.debugZipReady.collect { zipFile ->
if (zipFile != null && zipFile.exists()) {
Log.i(TAG, "debug zip ready: ${zipFile.absolutePath} (${zipFile.length()} bytes)")
launchEmailIntent(zipFile)
viewModel.onDebugReportSent()
}
}
}
}
}
private fun launchEmailIntent(zipFile: java.io.File) {
try {
val authority = "${applicationContext.packageName}.fileprovider"
Log.i(TAG, "FileProvider authority: $authority, file: ${zipFile.absolutePath}")
val uri = FileProvider.getUriForFile(this, authority, zipFile)
Log.i(TAG, "FileProvider URI: $uri")
val intent = Intent(Intent.ACTION_SEND).apply {
type = "message/rfc822"
putExtra(Intent.EXTRA_EMAIL, arrayOf("manwefarm@gmail.com"))
putExtra(Intent.EXTRA_SUBJECT, "WZ Phone Debug Report - ${zipFile.name}")
putExtra(
Intent.EXTRA_TEXT,
"Debug report attached.\n\nContains: call recordings (WAV), RMS histograms (CSV), logcat, stats."
)
putExtra(Intent.EXTRA_STREAM, uri)
addFlags(Intent.FLAG_GRANT_READ_URI_PERMISSION)
}
startActivity(Intent.createChooser(intent, "Send debug report"))
Log.i(TAG, "email intent launched")
} catch (e: Exception) {
Log.e(TAG, "email intent failed", e)
Toast.makeText(this, "Failed to launch email: ${e.message}", Toast.LENGTH_LONG).show()
}
}
override fun onDestroy() {
super.onDestroy()
if (isFinishing) {
viewModel.stopCall()
}
}
}
@Composable
fun WzpTheme(content: @Composable () -> Unit) {
val darkTheme = isSystemInDarkTheme()
val context = LocalContext.current
val colorScheme = when {
android.os.Build.VERSION.SDK_INT >= android.os.Build.VERSION_CODES.S -> {
if (darkTheme) dynamicDarkColorScheme(context) else dynamicLightColorScheme(context)
}
darkTheme -> darkColorScheme()
else -> lightColorScheme()
}
MaterialTheme(
colorScheme = colorScheme,
content = content
)
}


@@ -0,0 +1,445 @@
package com.wzp.ui.call
import android.content.Context
import android.util.Log
import androidx.lifecycle.ViewModel
import androidx.lifecycle.viewModelScope
import com.wzp.audio.AudioPipeline
import com.wzp.audio.AudioRouteManager
import com.wzp.data.SettingsRepository
import com.wzp.debug.DebugReporter
import com.wzp.engine.CallStats
import com.wzp.service.CallService
import com.wzp.engine.WzpCallback
import com.wzp.engine.WzpEngine
import kotlinx.coroutines.Job
import kotlinx.coroutines.delay
import kotlinx.coroutines.flow.MutableStateFlow
import kotlinx.coroutines.flow.StateFlow
import kotlinx.coroutines.flow.asStateFlow
import kotlinx.coroutines.isActive
import kotlinx.coroutines.launch
import java.io.File
import java.net.Inet4Address
import java.net.Inet6Address
import java.net.InetAddress
data class ServerEntry(val address: String, val label: String)
class CallViewModel : ViewModel(), WzpCallback {
private var engine: WzpEngine? = null
private var engineInitialized = false
private var audioPipeline: AudioPipeline? = null
private var audioRouteManager: AudioRouteManager? = null
private var audioStarted = false
private var appContext: Context? = null
private var settings: SettingsRepository? = null
private var debugReporter: DebugReporter? = null
private var lastStatsJson: String = "{}"
private var lastCallDuration: Double = 0.0
private var lastCallServer: String = ""
private val _callState = MutableStateFlow(0)
val callState: StateFlow<Int> = _callState.asStateFlow()
private val _isMuted = MutableStateFlow(false)
val isMuted: StateFlow<Boolean> = _isMuted.asStateFlow()
private val _isSpeaker = MutableStateFlow(false)
val isSpeaker: StateFlow<Boolean> = _isSpeaker.asStateFlow()
private val _stats = MutableStateFlow(CallStats())
val stats: StateFlow<CallStats> = _stats.asStateFlow()
private val _qualityTier = MutableStateFlow(0)
val qualityTier: StateFlow<Int> = _qualityTier.asStateFlow()
private val _errorMessage = MutableStateFlow<String?>(null)
val errorMessage: StateFlow<String?> = _errorMessage.asStateFlow()
private val _roomName = MutableStateFlow(DEFAULT_ROOM)
val roomName: StateFlow<String> = _roomName.asStateFlow()
private val _selectedServer = MutableStateFlow(0)
val selectedServer: StateFlow<Int> = _selectedServer.asStateFlow()
private val _servers = MutableStateFlow(DEFAULT_SERVERS.toList())
val servers: StateFlow<List<ServerEntry>> = _servers.asStateFlow()
private val _preferIPv6 = MutableStateFlow(false)
val preferIPv6: StateFlow<Boolean> = _preferIPv6.asStateFlow()
private val _playoutGainDb = MutableStateFlow(0f)
val playoutGainDb: StateFlow<Float> = _playoutGainDb.asStateFlow()
private val _captureGainDb = MutableStateFlow(0f)
val captureGainDb: StateFlow<Float> = _captureGainDb.asStateFlow()
private val _alias = MutableStateFlow("")
val alias: StateFlow<String> = _alias.asStateFlow()
private val _seedHex = MutableStateFlow("")
val seedHex: StateFlow<String> = _seedHex.asStateFlow()
private val _aecEnabled = MutableStateFlow(true)
val aecEnabled: StateFlow<Boolean> = _aecEnabled.asStateFlow()
/** True when a call just ended and debug report can be sent. */
private val _debugReportAvailable = MutableStateFlow(false)
val debugReportAvailable: StateFlow<Boolean> = _debugReportAvailable.asStateFlow()
/** Status: null=idle, "Preparing..."=in progress, "ready"=zip ready, "Error:..."=failed */
private val _debugReportStatus = MutableStateFlow<String?>(null)
val debugReportStatus: StateFlow<String?> = _debugReportStatus.asStateFlow()
/** The zip file ready to be emailed. Set by sendDebugReport, consumed by Activity. */
private val _debugZipReady = MutableStateFlow<File?>(null)
val debugZipReady: StateFlow<File?> = _debugZipReady.asStateFlow()
private var statsJob: Job? = null
companion object {
private const val TAG = "WzpCall"
val DEFAULT_SERVERS = listOf(
ServerEntry("172.16.81.175:4433", "LAN (172.16.81.175)"),
ServerEntry("193.180.213.68:4433", "Pangolin (IP)"),
)
const val DEFAULT_ROOM = "android"
}
fun setContext(context: Context) {
val appCtx = context.applicationContext
appContext = appCtx
if (audioPipeline == null) {
audioPipeline = AudioPipeline(appCtx)
}
if (audioRouteManager == null) {
audioRouteManager = AudioRouteManager(appCtx)
}
if (debugReporter == null) {
debugReporter = DebugReporter(appCtx)
}
if (settings == null) {
settings = SettingsRepository(appCtx)
loadSettings()
}
}
private fun loadSettings() {
val s = settings ?: return
s.loadServers()?.let { saved ->
if (saved.isNotEmpty()) _servers.value = saved
}
_selectedServer.value = s.loadSelectedServer().coerceIn(0, _servers.value.lastIndex)
_roomName.value = s.loadRoom()
_alias.value = s.getOrCreateAlias()
_preferIPv6.value = s.loadPreferIPv6()
_playoutGainDb.value = s.loadPlayoutGain()
_captureGainDb.value = s.loadCaptureGain()
_seedHex.value = s.getOrCreateSeedHex()
_aecEnabled.value = s.loadAecEnabled()
}
fun selectServer(index: Int) {
if (index in _servers.value.indices) {
_selectedServer.value = index
settings?.saveSelectedServer(index)
}
}
fun setPreferIPv6(prefer: Boolean) {
_preferIPv6.value = prefer
settings?.savePreferIPv6(prefer)
}
fun addServer(hostPort: String, label: String) {
val current = _servers.value.toMutableList()
current.add(ServerEntry(hostPort, label))
_servers.value = current
settings?.saveServers(current)
}
fun removeServer(index: Int) {
if (index < DEFAULT_SERVERS.size) return // don't remove built-in servers
val current = _servers.value.toMutableList()
if (index in current.indices) {
current.removeAt(index)
_servers.value = current
// Keep the selection pointing at the same entry: shift it down when an
// earlier entry is removed, reset it when the selected entry itself is gone.
when {
_selectedServer.value == index -> _selectedServer.value = 0
_selectedServer.value > index -> _selectedServer.value -= 1
}
settings?.saveServers(current)
settings?.saveSelectedServer(_selectedServer.value)
}
}
/** Batch-apply servers and selection from Settings draft state. */
fun applyServers(servers: List<ServerEntry>, selected: Int) {
_servers.value = servers
_selectedServer.value = selected.coerceIn(0, servers.lastIndex)
settings?.saveServers(servers)
settings?.saveSelectedServer(_selectedServer.value)
}
fun setRoomName(name: String) {
_roomName.value = name
settings?.saveRoom(name)
}
fun setPlayoutGainDb(db: Float) {
_playoutGainDb.value = db
audioPipeline?.playoutGainDb = db
settings?.savePlayoutGain(db)
}
fun setCaptureGainDb(db: Float) {
_captureGainDb.value = db
audioPipeline?.captureGainDb = db
settings?.saveCaptureGain(db)
}
fun setAlias(alias: String) {
_alias.value = alias
settings?.saveAlias(alias)
}
fun restoreSeed(hex: String) {
_seedHex.value = hex
settings?.saveSeedHex(hex)
}
fun setAecEnabled(enabled: Boolean) {
_aecEnabled.value = enabled
settings?.saveAecEnabled(enabled)
}
/**
* Resolve a DNS hostname to an IP address on the Kotlin/Android side,
* since Rust-side DNS resolution may not work on Android.
* Returns "ip:port" ("[ipv6]:port" for IPv6), or the input unchanged when
* it is already an IP literal or resolution fails.
*/
private fun resolveToIp(hostPort: String): String {
val parts = hostPort.split(":")
if (parts.size != 2) return hostPort
val host = parts[0]
val port = parts[1]
// Already an IPv4 literal — return as-is. IPv6 literals never reach this
// point: their extra ':' separators fail the two-part split above.
if (host.matches(Regex("""\d+\.\d+\.\d+\.\d+"""))) return hostPort
return try {
val addresses = InetAddress.getAllByName(host)
val preferV6 = _preferIPv6.value
val picked = if (preferV6) {
addresses.firstOrNull { it is Inet6Address } ?: addresses.firstOrNull { it is Inet4Address }
} else {
addresses.firstOrNull { it is Inet4Address } ?: addresses.firstOrNull { it is Inet6Address }
}
if (picked != null) {
val ip = picked.hostAddress ?: host
val formatted = if (picked is Inet6Address) "[$ip]:$port" else "$ip:$port"
formatted
} else {
hostPort
}
} catch (_: Exception) {
hostPort // resolution failed — pass through and let Rust try
}
}
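// Illustration (hypothetical addresses and DNS results, for documentation only):
//   resolveToIp("relay.example.com:4433") -> "203.0.113.7:4433"    (IPv4 default)
//   resolveToIp("relay.example.com:4433") -> "[2001:db8::7]:4433"  (preferIPv6 set)
//   resolveToIp("10.0.0.1:4433")          -> "10.0.0.1:4433"       (already a literal)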
/** Tear down engine and audio. Pass stopService=true to also stop the foreground service. */
private fun teardown(stopService: Boolean = true) {
Log.i(TAG, "teardown: stopping audio, stopService=$stopService")
val hadCall = audioStarted
CallService.onStopFromNotification = null
stopAudio()
stopStatsPolling()
Log.i(TAG, "teardown: stopping engine")
try { engine?.stopCall() } catch (e: Exception) { Log.w(TAG, "stopCall err: $e") }
try { engine?.destroy() } catch (e: Exception) { Log.w(TAG, "destroy err: $e") }
engine = null
engineInitialized = false
_callState.value = 0
if (hadCall) {
_debugReportAvailable.value = true
}
if (stopService) {
try { appContext?.let { CallService.stop(it) } } catch (_: Exception) {}
}
Log.i(TAG, "teardown: done")
}
fun startCall() {
val serverEntry = _servers.value[_selectedServer.value]
val room = _roomName.value
Log.i(TAG, "startCall: server=${serverEntry.address} room=$room")
_debugReportAvailable.value = false
_debugReportStatus.value = null
lastCallServer = serverEntry.address
debugReporter?.prepareForCall()
try {
// Teardown previous call but don't stop the service (we're about to restart it)
teardown(stopService = false)
Log.i(TAG, "startCall: creating engine")
engine = WzpEngine(this)
engine!!.init()
engineInitialized = true
_callState.value = 1
_errorMessage.value = null
try { appContext?.let { CallService.start(it) } } catch (e: Exception) {
Log.w(TAG, "service start err: $e")
}
startStatsPolling()
viewModelScope.launch(kotlinx.coroutines.Dispatchers.IO) {
try {
val relay = resolveToIp(serverEntry.address)
val seed = _seedHex.value
val name = _alias.value
Log.i(TAG, "startCall: resolved=$relay, alias=$name, calling engine.startCall")
val result = engine?.startCall(relay, room, seedHex = seed, alias = name) ?: -1
Log.i(TAG, "startCall: engine returned $result")
// Only wire up notification callback after engine is running
CallService.onStopFromNotification = { stopCall() }
if (result != 0) {
_callState.value = 0
_errorMessage.value = "Failed to start call (code $result)"
appContext?.let { CallService.stop(it) }
}
} catch (e: Exception) {
Log.e(TAG, "startCall IO error", e)
_callState.value = 0
_errorMessage.value = "Engine error: ${e.message}"
appContext?.let { CallService.stop(it) }
}
}
} catch (e: Exception) {
Log.e(TAG, "startCall error", e)
_callState.value = 0
_errorMessage.value = "Engine error: ${e.message}"
appContext?.let { CallService.stop(it) }
}
}
fun stopCall() {
Log.i(TAG, "stopCall")
teardown()
}
fun toggleMute() {
val newMuted = !_isMuted.value
_isMuted.value = newMuted
try { engine?.setMute(newMuted) } catch (_: Exception) {}
}
fun toggleSpeaker() {
val newSpeaker = !_isSpeaker.value
_isSpeaker.value = newSpeaker
audioRouteManager?.setSpeaker(newSpeaker)
}
fun clearError() { _errorMessage.value = null }
fun sendDebugReport() {
val reporter = debugReporter ?: return
_debugReportStatus.value = "Preparing debug report..."
viewModelScope.launch(kotlinx.coroutines.Dispatchers.IO) {
val zipFile = reporter.collectZip(
callDurationSecs = lastCallDuration,
finalStatsJson = lastStatsJson,
aecEnabled = _aecEnabled.value,
alias = _alias.value,
server = lastCallServer,
room = _roomName.value
)
if (zipFile != null) {
_debugZipReady.value = zipFile
_debugReportStatus.value = "ready"
} else {
_debugReportStatus.value = "Error: failed to create zip"
}
_debugReportAvailable.value = false
}
}
/** Called by Activity after email intent is launched. */
fun onDebugReportSent() {
_debugZipReady.value = null
_debugReportStatus.value = null
}
fun dismissDebugReport() {
_debugReportAvailable.value = false
_debugReportStatus.value = null
_debugZipReady.value = null
}
// WzpCallback
override fun onCallStateChanged(state: Int) { _callState.value = state }
override fun onQualityTierChanged(tier: Int) { _qualityTier.value = tier }
override fun onError(code: Int, message: String) { _errorMessage.value = "Error $code: $message" }
private fun startAudio() {
if (audioStarted) return
val e = engine ?: return
val ctx = appContext ?: return
// Create a fresh pipeline each call to avoid stale threads
audioPipeline = AudioPipeline(ctx).also {
it.playoutGainDb = _playoutGainDb.value
it.captureGainDb = _captureGainDb.value
it.aecEnabled = _aecEnabled.value
it.start(e)
}
audioRouteManager?.register()
audioStarted = true
}
private fun stopAudio() {
if (!audioStarted) return
audioPipeline?.stop()
audioPipeline = null
audioRouteManager?.unregister()
audioRouteManager?.setSpeaker(false)
_isSpeaker.value = false
audioStarted = false
}
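// The stats payload polled below is a JSON string produced by the Rust engine
// and parsed by CallStats.fromJson. A hypothetical example (field names are
// assumptions for illustration, not the actual wire format):
//   {"state": 2, "duration_secs": 12.5, "loss_pct": 0.4, "rtt_ms": 38,
//    "jitter_ms": 6, "frames_encoded": 625, "frames_decoded": 612,
//    "fec_recovered": 3}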
private fun startStatsPolling() {
statsJob?.cancel()
statsJob = viewModelScope.launch {
while (isActive) {
try {
val json = engine?.getStats() ?: "{}"
if (json.isNotEmpty()) {
Log.d(TAG, "raw: $json")
lastStatsJson = json
val s = CallStats.fromJson(json)
lastCallDuration = s.durationSecs
_stats.value = s
if (s.state != 0) {
_callState.value = s.state
}
if (s.state == 2 && !audioStarted) {
startAudio()
}
}
} catch (_: Exception) {}
delay(500L)
}
}
}
private fun stopStatsPolling() {
statsJob?.cancel()
statsJob = null
}
override fun onCleared() {
super.onCleared()
Log.i(TAG, "onCleared")
teardown()
}
}


@@ -0,0 +1,688 @@
package com.wzp.ui.call
import androidx.compose.foundation.background
import androidx.compose.foundation.layout.Arrangement
import androidx.compose.foundation.layout.Box
import androidx.compose.foundation.layout.Column
import androidx.compose.foundation.layout.ExperimentalLayoutApi
import androidx.compose.foundation.layout.FlowRow
import androidx.compose.foundation.layout.Row
import androidx.compose.foundation.layout.Spacer
import androidx.compose.foundation.layout.fillMaxSize
import androidx.compose.foundation.layout.fillMaxWidth
import androidx.compose.foundation.layout.height
import androidx.compose.foundation.layout.padding
import androidx.compose.foundation.layout.size
import androidx.compose.foundation.layout.width
import androidx.compose.foundation.rememberScrollState
import androidx.compose.foundation.shape.CircleShape
import androidx.compose.foundation.shape.RoundedCornerShape
import androidx.compose.foundation.verticalScroll
import androidx.compose.material3.AlertDialog
import androidx.compose.material3.Button
import androidx.compose.material3.ButtonDefaults
import androidx.compose.material3.FilledIconButton
import androidx.compose.material3.FilledTonalIconButton
import androidx.compose.material3.IconButtonDefaults
import androidx.compose.material3.MaterialTheme
import androidx.compose.material3.OutlinedButton
import androidx.compose.material3.OutlinedTextField
import androidx.compose.material3.Slider
import androidx.compose.material3.Surface
import androidx.compose.material3.Switch
import androidx.compose.material3.Text
import androidx.compose.material3.TextButton
import androidx.compose.runtime.Composable
import androidx.compose.runtime.collectAsState
import androidx.compose.runtime.getValue
import androidx.compose.runtime.mutableStateOf
import androidx.compose.runtime.remember
import androidx.compose.runtime.setValue
import androidx.compose.ui.Alignment
import androidx.compose.ui.Modifier
import androidx.compose.ui.draw.clip
import androidx.compose.ui.graphics.Color
import androidx.compose.ui.text.font.FontWeight
import androidx.compose.ui.text.style.TextAlign
import androidx.compose.ui.unit.dp
import androidx.compose.ui.unit.sp
import com.wzp.engine.CallStats
import kotlin.math.roundToInt
@OptIn(ExperimentalLayoutApi::class)
@Composable
fun InCallScreen(
viewModel: CallViewModel,
onHangUp: () -> Unit,
onOpenSettings: () -> Unit = {}
) {
val callState by viewModel.callState.collectAsState()
val isMuted by viewModel.isMuted.collectAsState()
val isSpeaker by viewModel.isSpeaker.collectAsState()
val stats by viewModel.stats.collectAsState()
val qualityTier by viewModel.qualityTier.collectAsState()
val errorMessage by viewModel.errorMessage.collectAsState()
val roomName by viewModel.roomName.collectAsState()
val selectedServer by viewModel.selectedServer.collectAsState()
val servers by viewModel.servers.collectAsState()
val preferIPv6 by viewModel.preferIPv6.collectAsState()
val playoutGainDb by viewModel.playoutGainDb.collectAsState()
val captureGainDb by viewModel.captureGainDb.collectAsState()
val debugReportAvailable by viewModel.debugReportAvailable.collectAsState()
val debugReportStatus by viewModel.debugReportStatus.collectAsState()
var showAddServerDialog by remember { mutableStateOf(false) }
Surface(
modifier = Modifier.fillMaxSize(),
color = MaterialTheme.colorScheme.background
) {
Column(
modifier = Modifier
.fillMaxSize()
.padding(24.dp)
.verticalScroll(rememberScrollState()),
horizontalAlignment = Alignment.CenterHorizontally
) {
// Settings button (top-right)
if (callState == 0) {
Row(modifier = Modifier.fillMaxWidth(), horizontalArrangement = Arrangement.End) {
TextButton(onClick = onOpenSettings) {
Text("Settings")
}
}
}
Spacer(modifier = Modifier.height(if (callState == 0) 16.dp else 48.dp))
Text(
text = "WZ Phone",
style = MaterialTheme.typography.headlineMedium.copy(
fontWeight = FontWeight.Bold
),
color = MaterialTheme.colorScheme.primary
)
Spacer(modifier = Modifier.height(8.dp))
CallStateLabel(callState)
if (callState == 0) {
Spacer(modifier = Modifier.height(32.dp))
// Server selector
Text(
text = "Server",
style = MaterialTheme.typography.labelSmall,
color = MaterialTheme.colorScheme.onSurfaceVariant
)
Spacer(modifier = Modifier.height(4.dp))
FlowRow(
modifier = Modifier.fillMaxWidth(),
horizontalArrangement = Arrangement.Center
) {
servers.forEachIndexed { idx, entry ->
val isSelected = selectedServer == idx
FilledTonalIconButton(
onClick = { viewModel.selectServer(idx) },
modifier = Modifier
.padding(2.dp)
.height(36.dp)
.width(140.dp),
shape = RoundedCornerShape(8.dp),
colors = if (isSelected) {
IconButtonDefaults.filledTonalIconButtonColors(
containerColor = MaterialTheme.colorScheme.primaryContainer,
contentColor = MaterialTheme.colorScheme.onPrimaryContainer
)
} else {
IconButtonDefaults.filledTonalIconButtonColors()
}
) {
Text(
text = entry.label,
style = MaterialTheme.typography.labelSmall,
maxLines = 1
)
}
}
// + Add button
OutlinedButton(
onClick = { showAddServerDialog = true },
modifier = Modifier
.padding(2.dp)
.height(36.dp),
shape = RoundedCornerShape(8.dp)
) {
Text("+", style = MaterialTheme.typography.labelMedium)
}
}
// IPv4/IPv6 preference
Spacer(modifier = Modifier.height(8.dp))
Row(
verticalAlignment = Alignment.CenterVertically,
horizontalArrangement = Arrangement.Center
) {
Text(
text = "IPv4",
style = MaterialTheme.typography.labelSmall,
color = if (!preferIPv6) MaterialTheme.colorScheme.primary
else MaterialTheme.colorScheme.onSurfaceVariant
)
Switch(
checked = preferIPv6,
onCheckedChange = { viewModel.setPreferIPv6(it) },
modifier = Modifier.padding(horizontal = 8.dp)
)
Text(
text = "IPv6",
style = MaterialTheme.typography.labelSmall,
color = if (preferIPv6) MaterialTheme.colorScheme.primary
else MaterialTheme.colorScheme.onSurfaceVariant
)
}
// Selected server address
Spacer(modifier = Modifier.height(4.dp))
Text(
text = servers.getOrNull(selectedServer)?.address ?: "",
style = MaterialTheme.typography.bodySmall,
color = MaterialTheme.colorScheme.onSurfaceVariant
)
Spacer(modifier = Modifier.height(8.dp))
OutlinedTextField(
value = roomName,
onValueChange = { viewModel.setRoomName(it) },
label = { Text("Room") },
singleLine = true,
modifier = Modifier.fillMaxWidth(0.6f)
)
Spacer(modifier = Modifier.height(24.dp))
Button(
onClick = { viewModel.startCall() },
modifier = Modifier
.size(120.dp)
.clip(CircleShape),
shape = CircleShape,
colors = ButtonDefaults.buttonColors(
containerColor = Color(0xFF4CAF50)
)
) {
Text(
text = "CALL",
style = MaterialTheme.typography.titleLarge.copy(
fontWeight = FontWeight.Bold
),
color = Color.White
)
}
errorMessage?.let { err ->
Spacer(modifier = Modifier.height(16.dp))
Text(
text = err,
style = MaterialTheme.typography.bodySmall,
color = MaterialTheme.colorScheme.error
)
}
// Debug report card — shown after call ends
if (debugReportAvailable || debugReportStatus != null) {
Spacer(modifier = Modifier.height(24.dp))
DebugReportCard(
available = debugReportAvailable,
status = debugReportStatus,
onSend = { viewModel.sendDebugReport() },
onDismiss = { viewModel.dismissDebugReport() }
)
}
} else {
// In-call UI
Spacer(modifier = Modifier.height(16.dp))
DurationDisplay(stats.durationSecs)
Spacer(modifier = Modifier.height(24.dp))
QualityIndicator(qualityTier, stats.qualityLabel)
if (stats.roomParticipantCount > 0) {
// Dedup by fingerprint — same key = same person, even if
// relay hasn't cleaned up stale entries yet.
val unique = stats.roomParticipants
.distinctBy { it.fingerprint.ifEmpty { it.displayName } }
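// Illustration (hypothetical entries):
//   [fp="AB12"/alice, fp="AB12"/alice-old, fp=""/bob]
// collapses to two people — the first "AB12" entry plus "bob",
// who has no fingerprint and so is keyed by display name.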
Spacer(modifier = Modifier.height(8.dp))
Text(
text = "${unique.size} in room",
style = MaterialTheme.typography.bodySmall,
color = MaterialTheme.colorScheme.onSurfaceVariant
)
unique.forEach { member ->
Text(
text = member.displayName,
style = MaterialTheme.typography.labelSmall,
color = MaterialTheme.colorScheme.onSurfaceVariant
)
}
}
Spacer(modifier = Modifier.height(32.dp))
AudioLevelBar(stats.audioLevel)
Spacer(modifier = Modifier.height(16.dp))
// Gain sliders
GainSlider(
label = "Voice Volume",
gainDb = playoutGainDb,
onGainChange = { viewModel.setPlayoutGainDb(it) }
)
Spacer(modifier = Modifier.height(4.dp))
GainSlider(
label = "Mic Gain",
gainDb = captureGainDb,
onGainChange = { viewModel.setCaptureGainDb(it) }
)
Spacer(modifier = Modifier.height(32.dp))
ControlRow(
isMuted = isMuted,
isSpeaker = isSpeaker,
onToggleMute = viewModel::toggleMute,
onToggleSpeaker = viewModel::toggleSpeaker,
onHangUp = {
viewModel.stopCall()
}
)
Spacer(modifier = Modifier.height(32.dp))
StatsOverlay(stats)
Spacer(modifier = Modifier.height(16.dp))
}
}
}
if (showAddServerDialog) {
AddServerDialog(
onDismiss = { showAddServerDialog = false },
onAdd = { host, port, label ->
viewModel.addServer("$host:$port", label)
showAddServerDialog = false
}
)
}
}
@Composable
private fun AddServerDialog(
onDismiss: () -> Unit,
onAdd: (host: String, port: String, label: String) -> Unit
) {
var host by remember { mutableStateOf("") }
var port by remember { mutableStateOf("4433") }
var label by remember { mutableStateOf("") }
AlertDialog(
onDismissRequest = onDismiss,
title = { Text("Add Server") },
text = {
Column {
OutlinedTextField(
value = host,
onValueChange = { host = it },
label = { Text("Host (IP or domain)") },
singleLine = true,
modifier = Modifier.fillMaxWidth()
)
Spacer(modifier = Modifier.height(8.dp))
OutlinedTextField(
value = port,
onValueChange = { port = it },
label = { Text("Port") },
singleLine = true,
modifier = Modifier.fillMaxWidth()
)
Spacer(modifier = Modifier.height(8.dp))
OutlinedTextField(
value = label,
onValueChange = { label = it },
label = { Text("Label (optional)") },
singleLine = true,
modifier = Modifier.fillMaxWidth()
)
}
},
confirmButton = {
TextButton(
onClick = {
// Require a host and a sane numeric port before accepting
val portNum = port.trim().toIntOrNull()
if (host.isNotBlank() && portNum != null && portNum in 1..65535) {
val displayLabel = label.ifBlank { host }
onAdd(host.trim(), port.trim(), displayLabel)
}
}
) { Text("Add") }
},
dismissButton = {
TextButton(onClick = onDismiss) { Text("Cancel") }
}
)
}
@Composable
private fun CallStateLabel(state: Int) {
val label = when (state) {
0 -> "Ready to connect"
1 -> "Connecting..."
2 -> "Active"
3 -> "Reconnecting..."
4 -> "Call Ended"
else -> "Unknown"
}
val color = when (state) {
2 -> Color(0xFF4CAF50)
1, 3 -> Color(0xFFFFC107)
else -> MaterialTheme.colorScheme.onSurfaceVariant
}
Text(
text = label,
style = MaterialTheme.typography.titleMedium,
color = color
)
}
@Composable
private fun DurationDisplay(durationSecs: Double) {
val totalSeconds = durationSecs.roundToInt()
val minutes = totalSeconds / 60
val seconds = totalSeconds % 60
Text(
text = "%02d:%02d".format(minutes, seconds),
style = MaterialTheme.typography.displayLarge.copy(
fontWeight = FontWeight.Light,
letterSpacing = 4.sp
),
color = MaterialTheme.colorScheme.onBackground
)
}
@Composable
private fun QualityIndicator(tier: Int, label: String) {
val dotColor = when (tier) {
0 -> Color(0xFF4CAF50) // good (green)
1 -> Color(0xFFFFC107) // degraded (amber)
2 -> Color(0xFFF44336) // poor (red)
else -> Color.Gray
}
Row(
verticalAlignment = Alignment.CenterVertically,
horizontalArrangement = Arrangement.Center
) {
Box(
modifier = Modifier
.size(12.dp)
.clip(CircleShape)
.background(dotColor)
)
Spacer(modifier = Modifier.width(8.dp))
Text(
text = label,
style = MaterialTheme.typography.bodyMedium,
color = MaterialTheme.colorScheme.onSurfaceVariant
)
}
}
@Composable
private fun AudioLevelBar(audioLevel: Int) {
// Map the engine's level metric onto 0..1, with 8000 treated as full
// scale; the 0.02 floor keeps a sliver visible while audio is flowing.
val level = if (audioLevel > 0) {
(audioLevel.toFloat() / 8000f).coerceIn(0.02f, 1f)
} else {
0f
}
Column(horizontalAlignment = Alignment.CenterHorizontally) {
Text(
text = "Audio Level",
style = MaterialTheme.typography.labelSmall,
color = MaterialTheme.colorScheme.onSurfaceVariant
)
Spacer(modifier = Modifier.height(4.dp))
Box(
modifier = Modifier
.fillMaxWidth(0.6f)
.height(6.dp)
.clip(RoundedCornerShape(3.dp))
.background(MaterialTheme.colorScheme.surfaceVariant)
) {
Box(
modifier = Modifier
.fillMaxWidth(level)
.height(6.dp)
.background(MaterialTheme.colorScheme.primary)
)
}
}
}
@Composable
private fun GainSlider(label: String, gainDb: Float, onGainChange: (Float) -> Unit) {
Column(
modifier = Modifier.fillMaxWidth(0.8f),
horizontalAlignment = Alignment.CenterHorizontally
) {
val sign = if (gainDb >= 0) "+" else ""
Text(
text = "$label: ${sign}${"%.0f".format(gainDb)} dB",
style = MaterialTheme.typography.labelSmall,
color = MaterialTheme.colorScheme.onSurfaceVariant
)
Spacer(modifier = Modifier.height(4.dp))
Slider(
value = gainDb,
onValueChange = { onGainChange(it.roundToInt().toFloat()) },
valueRange = -20f..20f,
steps = 0,
modifier = Modifier.fillMaxWidth()
)
}
}
@Composable
private fun ControlRow(
isMuted: Boolean,
isSpeaker: Boolean,
onToggleMute: () -> Unit,
onToggleSpeaker: () -> Unit,
onHangUp: () -> Unit
) {
Row(
modifier = Modifier.fillMaxWidth(),
horizontalArrangement = Arrangement.SpaceEvenly,
verticalAlignment = Alignment.CenterVertically
) {
FilledTonalIconButton(
onClick = onToggleMute,
modifier = Modifier.size(56.dp),
colors = if (isMuted) {
IconButtonDefaults.filledTonalIconButtonColors(
containerColor = MaterialTheme.colorScheme.errorContainer,
contentColor = MaterialTheme.colorScheme.onErrorContainer
)
} else {
IconButtonDefaults.filledTonalIconButtonColors()
}
) {
Text(
text = if (isMuted) "MIC\nOFF" else "MIC",
textAlign = TextAlign.Center,
style = MaterialTheme.typography.labelSmall,
lineHeight = 12.sp
)
}
FilledIconButton(
onClick = onHangUp,
modifier = Modifier.size(72.dp),
shape = CircleShape,
colors = IconButtonDefaults.filledIconButtonColors(
containerColor = Color(0xFFF44336),
contentColor = Color.White
)
) {
Text(
text = "END",
style = MaterialTheme.typography.titleMedium.copy(
fontWeight = FontWeight.Bold
)
)
}
FilledTonalIconButton(
onClick = onToggleSpeaker,
modifier = Modifier.size(56.dp),
colors = if (isSpeaker) {
IconButtonDefaults.filledTonalIconButtonColors(
containerColor = MaterialTheme.colorScheme.primaryContainer,
contentColor = MaterialTheme.colorScheme.onPrimaryContainer
)
} else {
IconButtonDefaults.filledTonalIconButtonColors()
}
) {
Text(
text = if (isSpeaker) "SPK\nON" else "SPK",
textAlign = TextAlign.Center,
style = MaterialTheme.typography.labelSmall,
lineHeight = 12.sp
)
}
}
}
@Composable
private fun StatsOverlay(stats: CallStats) {
Surface(
modifier = Modifier.fillMaxWidth(),
color = MaterialTheme.colorScheme.surfaceVariant.copy(alpha = 0.5f),
shape = RoundedCornerShape(8.dp)
) {
Column(
modifier = Modifier.padding(12.dp),
horizontalAlignment = Alignment.CenterHorizontally
) {
Text(
text = "Stats",
style = MaterialTheme.typography.labelSmall,
color = MaterialTheme.colorScheme.onSurfaceVariant
)
Spacer(modifier = Modifier.height(4.dp))
Row(
modifier = Modifier.fillMaxWidth(),
horizontalArrangement = Arrangement.SpaceEvenly
) {
StatItem("Loss", "%.1f%%".format(stats.lossPct))
StatItem("RTT", "${stats.rttMs}ms")
StatItem("Jitter", "${stats.jitterMs}ms")
}
Spacer(modifier = Modifier.height(4.dp))
Row(
modifier = Modifier.fillMaxWidth(),
horizontalArrangement = Arrangement.SpaceEvenly
) {
StatItem("Sent", "${stats.framesEncoded}")
StatItem("Recv", "${stats.framesDecoded}")
StatItem("FEC", "${stats.fecRecovered}")
}
}
}
}
@Composable
private fun StatItem(label: String, value: String) {
Column(horizontalAlignment = Alignment.CenterHorizontally) {
Text(
text = value,
style = MaterialTheme.typography.bodySmall.copy(fontWeight = FontWeight.Medium),
color = MaterialTheme.colorScheme.onSurface
)
Text(
text = label,
style = MaterialTheme.typography.labelSmall,
color = MaterialTheme.colorScheme.onSurfaceVariant
)
}
}
@Composable
private fun DebugReportCard(
available: Boolean,
status: String?,
onSend: () -> Unit,
onDismiss: () -> Unit
) {
Surface(
modifier = Modifier.fillMaxWidth(),
color = MaterialTheme.colorScheme.surfaceVariant.copy(alpha = 0.7f),
shape = RoundedCornerShape(12.dp)
) {
Column(
modifier = Modifier.padding(16.dp),
horizontalAlignment = Alignment.CenterHorizontally
) {
Text(
text = "Debug Report",
style = MaterialTheme.typography.titleSmall.copy(fontWeight = FontWeight.Bold),
color = MaterialTheme.colorScheme.onSurface
)
Spacer(modifier = Modifier.height(4.dp))
Text(
text = "Email call recordings, logs & stats for analysis",
style = MaterialTheme.typography.bodySmall,
color = MaterialTheme.colorScheme.onSurfaceVariant,
textAlign = TextAlign.Center
)
Spacer(modifier = Modifier.height(12.dp))
when {
status != null && status.startsWith("Error") -> {
Text(
text = status,
style = MaterialTheme.typography.bodySmall,
color = MaterialTheme.colorScheme.error
)
Spacer(modifier = Modifier.height(8.dp))
Row(horizontalArrangement = Arrangement.spacedBy(8.dp)) {
OutlinedButton(onClick = onSend) { Text("Retry") }
TextButton(onClick = onDismiss) { Text("Dismiss") }
}
}
status != null && status != "ready" -> {
// Preparing zip...
Text(
text = status,
style = MaterialTheme.typography.bodySmall,
color = MaterialTheme.colorScheme.onSurfaceVariant
)
}
available -> {
Row(horizontalArrangement = Arrangement.spacedBy(8.dp)) {
Button(onClick = onSend) {
Text("Email Report")
}
TextButton(onClick = onDismiss) {
Text("Skip")
}
}
}
}
}
}
}


@@ -0,0 +1,510 @@
package com.wzp.ui.settings
import android.content.ClipData
import android.content.ClipboardManager
import android.content.Context
import android.widget.Toast
import androidx.compose.foundation.layout.Arrangement
import androidx.compose.foundation.layout.Column
import androidx.compose.foundation.layout.ExperimentalLayoutApi
import androidx.compose.foundation.layout.FlowRow
import androidx.compose.foundation.layout.Row
import androidx.compose.foundation.layout.Spacer
import androidx.compose.foundation.layout.fillMaxSize
import androidx.compose.foundation.layout.fillMaxWidth
import androidx.compose.foundation.layout.height
import androidx.compose.foundation.layout.padding
import androidx.compose.foundation.layout.width
import androidx.compose.foundation.rememberScrollState
import androidx.compose.foundation.shape.RoundedCornerShape
import androidx.compose.foundation.verticalScroll
import androidx.compose.material3.AlertDialog
import androidx.compose.material3.Button
import androidx.compose.material3.ButtonDefaults
import androidx.compose.material3.Divider
import androidx.compose.material3.FilledTonalButton
import androidx.compose.material3.FilledTonalIconButton
import androidx.compose.material3.IconButtonDefaults
import androidx.compose.material3.MaterialTheme
import androidx.compose.material3.OutlinedButton
import androidx.compose.material3.OutlinedTextField
import androidx.compose.material3.Slider
import androidx.compose.material3.Surface
import androidx.compose.material3.Switch
import androidx.compose.material3.Text
import androidx.compose.material3.TextButton
import androidx.compose.runtime.Composable
import androidx.compose.runtime.collectAsState
import androidx.compose.runtime.getValue
import androidx.compose.runtime.mutableFloatStateOf
import androidx.compose.runtime.mutableIntStateOf
import androidx.compose.runtime.mutableStateOf
import androidx.compose.runtime.remember
import androidx.compose.runtime.setValue
import androidx.compose.runtime.toMutableStateList
import androidx.compose.ui.Alignment
import androidx.compose.ui.Modifier
import androidx.compose.ui.graphics.Color
import androidx.compose.ui.platform.LocalContext
import androidx.compose.ui.text.font.FontFamily
import androidx.compose.ui.text.font.FontWeight
import androidx.compose.ui.unit.dp
import com.wzp.ui.call.CallViewModel
import com.wzp.ui.call.ServerEntry
@OptIn(ExperimentalLayoutApi::class)
@Composable
fun SettingsScreen(
viewModel: CallViewModel,
onBack: () -> Unit
) {
val context = LocalContext.current
// Snapshot current values into local draft state
val currentAlias by viewModel.alias.collectAsState()
val currentSeedHex by viewModel.seedHex.collectAsState()
val currentServers by viewModel.servers.collectAsState()
val currentSelectedServer by viewModel.selectedServer.collectAsState()
val currentRoomName by viewModel.roomName.collectAsState()
val currentPreferIPv6 by viewModel.preferIPv6.collectAsState()
val currentPlayoutGain by viewModel.playoutGainDb.collectAsState()
val currentCaptureGain by viewModel.captureGainDb.collectAsState()
val currentAecEnabled by viewModel.aecEnabled.collectAsState()
// Draft state — initialized from current values
var draftAlias by remember { mutableStateOf(currentAlias) }
var draftSeedHex by remember { mutableStateOf(currentSeedHex) }
val draftServers = remember { currentServers.toMutableStateList() }
var draftSelectedServer by remember { mutableIntStateOf(currentSelectedServer) }
var draftRoomName by remember { mutableStateOf(currentRoomName) }
var draftPreferIPv6 by remember { mutableStateOf(currentPreferIPv6) }
var draftPlayoutGain by remember { mutableFloatStateOf(currentPlayoutGain) }
var draftCaptureGain by remember { mutableFloatStateOf(currentCaptureGain) }
var draftAecEnabled by remember { mutableStateOf(currentAecEnabled) }
// Track if anything changed
val hasChanges = draftAlias != currentAlias ||
draftSeedHex != currentSeedHex ||
draftServers.toList() != currentServers ||
draftSelectedServer != currentSelectedServer ||
draftRoomName != currentRoomName ||
draftPreferIPv6 != currentPreferIPv6 ||
draftPlayoutGain != currentPlayoutGain ||
draftCaptureGain != currentCaptureGain ||
draftAecEnabled != currentAecEnabled
var showAddServerDialog by remember { mutableStateOf(false) }
var showRestoreKeyDialog by remember { mutableStateOf(false) }
Surface(
modifier = Modifier.fillMaxSize(),
color = MaterialTheme.colorScheme.background
) {
Column(
modifier = Modifier
.fillMaxSize()
.padding(24.dp)
.verticalScroll(rememberScrollState())
) {
// Header
Row(
modifier = Modifier.fillMaxWidth(),
verticalAlignment = Alignment.CenterVertically
) {
TextButton(onClick = onBack) {
Text("< Back")
}
Spacer(modifier = Modifier.weight(1f))
Text(
text = "Settings",
style = MaterialTheme.typography.headlineSmall.copy(
fontWeight = FontWeight.Bold
),
color = MaterialTheme.colorScheme.primary
)
Spacer(modifier = Modifier.weight(1f))
// Save button — only enabled when changes exist
Button(
onClick = {
viewModel.setAlias(draftAlias)
if (draftSeedHex != currentSeedHex) viewModel.restoreSeed(draftSeedHex)
viewModel.applyServers(draftServers.toList(), draftSelectedServer)
viewModel.setRoomName(draftRoomName)
viewModel.setPreferIPv6(draftPreferIPv6)
viewModel.setPlayoutGainDb(draftPlayoutGain)
viewModel.setCaptureGainDb(draftCaptureGain)
viewModel.setAecEnabled(draftAecEnabled)
Toast.makeText(context, "Settings saved", Toast.LENGTH_SHORT).show()
onBack()
},
enabled = hasChanges
) {
Text("Save")
}
}
Spacer(modifier = Modifier.height(24.dp))
// --- Identity ---
SectionHeader("Identity")
OutlinedTextField(
value = draftAlias,
onValueChange = { draftAlias = it },
label = { Text("Display Name") },
singleLine = true,
modifier = Modifier.fillMaxWidth()
)
Spacer(modifier = Modifier.height(16.dp))
// Fingerprint display
val fingerprint = if (draftSeedHex.length >= 16) {
// Chunk into groups of four only for a real key, so the
// "Not generated" placeholder isn't mangled by chunked().
draftSeedHex.take(16).uppercase().chunked(4).joinToString(" ")
} else {
"Not generated"
}
Text(
text = "Fingerprint",
style = MaterialTheme.typography.labelSmall,
color = MaterialTheme.colorScheme.onSurfaceVariant
)
Text(
text = fingerprint,
style = MaterialTheme.typography.bodyMedium.copy(
fontFamily = FontFamily.Monospace
),
color = MaterialTheme.colorScheme.onSurface
)
Spacer(modifier = Modifier.height(12.dp))
// Key backup/restore
Row(horizontalArrangement = Arrangement.spacedBy(8.dp)) {
FilledTonalButton(onClick = {
val clipboard = context.getSystemService(Context.CLIPBOARD_SERVICE) as ClipboardManager
clipboard.setPrimaryClip(ClipData.newPlainText("WZP Key", draftSeedHex))
Toast.makeText(context, "Key copied to clipboard", Toast.LENGTH_SHORT).show()
}) {
Text("Copy Key")
}
OutlinedButton(onClick = { showRestoreKeyDialog = true }) {
Text("Restore Key")
}
}
Spacer(modifier = Modifier.height(24.dp))
Divider()
Spacer(modifier = Modifier.height(16.dp))
// --- Audio ---
SectionHeader("Audio Defaults")
GainSlider(
label = "Voice Volume",
gainDb = draftPlayoutGain,
onGainChange = { draftPlayoutGain = Math.round(it).toFloat() }
)
Spacer(modifier = Modifier.height(4.dp))
GainSlider(
label = "Mic Gain",
gainDb = draftCaptureGain,
onGainChange = { draftCaptureGain = Math.round(it).toFloat() }
)
Spacer(modifier = Modifier.height(12.dp))
Row(
verticalAlignment = Alignment.CenterVertically,
modifier = Modifier.fillMaxWidth()
) {
Column(modifier = Modifier.weight(1f)) {
Text(
text = "Echo Cancellation (AEC)",
style = MaterialTheme.typography.bodyMedium
)
Text(
text = "Disable if audio sounds distorted",
style = MaterialTheme.typography.bodySmall,
color = MaterialTheme.colorScheme.onSurfaceVariant
)
}
Switch(
checked = draftAecEnabled,
onCheckedChange = { draftAecEnabled = it }
)
}
Spacer(modifier = Modifier.height(24.dp))
Divider()
Spacer(modifier = Modifier.height(16.dp))
// --- Servers ---
SectionHeader("Servers")
FlowRow(
modifier = Modifier.fillMaxWidth(),
horizontalArrangement = Arrangement.Start,
verticalArrangement = Arrangement.spacedBy(4.dp)
) {
draftServers.forEachIndexed { idx, entry ->
val isSelected = draftSelectedServer == idx
Row(verticalAlignment = Alignment.CenterVertically) {
FilledTonalIconButton(
onClick = { draftSelectedServer = idx },
modifier = Modifier
.padding(end = 2.dp)
.height(36.dp)
.width(140.dp),
shape = RoundedCornerShape(8.dp),
colors = if (isSelected) {
IconButtonDefaults.filledTonalIconButtonColors(
containerColor = MaterialTheme.colorScheme.primaryContainer,
contentColor = MaterialTheme.colorScheme.onPrimaryContainer
)
} else {
IconButtonDefaults.filledTonalIconButtonColors()
}
) {
Text(
text = entry.label,
style = MaterialTheme.typography.labelSmall,
maxLines = 1
)
}
// Show remove button for non-default servers
if (idx >= 2) {
TextButton(
onClick = {
draftServers.removeAt(idx)
// Keep the selection pointing at the same server after indices shift
if (draftSelectedServer == idx) {
draftSelectedServer = 0
} else if (draftSelectedServer > idx) {
draftSelectedServer--
}
},
modifier = Modifier.height(36.dp)
) {
Text("X", color = MaterialTheme.colorScheme.error)
}
}
}
}
}
Spacer(modifier = Modifier.height(8.dp))
OutlinedButton(
onClick = { showAddServerDialog = true },
shape = RoundedCornerShape(8.dp)
) {
Text("+ Add Server")
}
// Show selected server address
Spacer(modifier = Modifier.height(8.dp))
Text(
text = "Default: ${draftServers.getOrNull(draftSelectedServer)?.address ?: "none"}",
style = MaterialTheme.typography.bodySmall,
color = MaterialTheme.colorScheme.onSurfaceVariant
)
Spacer(modifier = Modifier.height(24.dp))
Divider()
Spacer(modifier = Modifier.height(16.dp))
// --- Network ---
SectionHeader("Network")
Row(
verticalAlignment = Alignment.CenterVertically,
modifier = Modifier.fillMaxWidth()
) {
Text(
text = "Prefer IPv6",
style = MaterialTheme.typography.bodyMedium,
modifier = Modifier.weight(1f)
)
Switch(
checked = draftPreferIPv6,
onCheckedChange = { draftPreferIPv6 = it }
)
}
Spacer(modifier = Modifier.height(24.dp))
Divider()
Spacer(modifier = Modifier.height(16.dp))
// --- Room ---
SectionHeader("Room")
OutlinedTextField(
value = draftRoomName,
onValueChange = { draftRoomName = it },
label = { Text("Default Room") },
singleLine = true,
modifier = Modifier.fillMaxWidth()
)
Spacer(modifier = Modifier.height(32.dp))
}
}
if (showAddServerDialog) {
AddServerDialog(
onDismiss = { showAddServerDialog = false },
onAdd = { host, port, label ->
draftServers.add(ServerEntry("$host:$port", label))
showAddServerDialog = false
}
)
}
if (showRestoreKeyDialog) {
RestoreKeyDialog(
onDismiss = { showRestoreKeyDialog = false },
onRestore = { hex ->
draftSeedHex = hex
showRestoreKeyDialog = false
Toast.makeText(context, "Key staged — press Save to apply", Toast.LENGTH_SHORT).show()
}
)
}
}
@Composable
private fun SectionHeader(title: String) {
Text(
text = title,
style = MaterialTheme.typography.titleMedium.copy(fontWeight = FontWeight.Bold),
color = MaterialTheme.colorScheme.primary
)
Spacer(modifier = Modifier.height(8.dp))
}
@Composable
private fun GainSlider(label: String, gainDb: Float, onGainChange: (Float) -> Unit) {
Column(
modifier = Modifier.fillMaxWidth(),
horizontalAlignment = Alignment.CenterHorizontally
) {
val sign = if (gainDb >= 0) "+" else ""
Text(
text = "$label: ${sign}${"%.0f".format(gainDb)} dB",
style = MaterialTheme.typography.labelSmall,
color = MaterialTheme.colorScheme.onSurfaceVariant
)
Slider(
value = gainDb,
onValueChange = onGainChange,
valueRange = -20f..20f,
steps = 0,
modifier = Modifier.fillMaxWidth()
)
}
}
@Composable
private fun AddServerDialog(
onDismiss: () -> Unit,
onAdd: (host: String, port: String, label: String) -> Unit
) {
var host by remember { mutableStateOf("") }
var port by remember { mutableStateOf("4433") }
var label by remember { mutableStateOf("") }
AlertDialog(
onDismissRequest = onDismiss,
title = { Text("Add Server") },
text = {
Column {
OutlinedTextField(
value = host,
onValueChange = { host = it },
label = { Text("Host (IP or domain)") },
singleLine = true,
modifier = Modifier.fillMaxWidth()
)
Spacer(modifier = Modifier.height(8.dp))
OutlinedTextField(
value = port,
onValueChange = { port = it },
label = { Text("Port") },
singleLine = true,
modifier = Modifier.fillMaxWidth()
)
Spacer(modifier = Modifier.height(8.dp))
OutlinedTextField(
value = label,
onValueChange = { label = it },
label = { Text("Label (optional)") },
singleLine = true,
modifier = Modifier.fillMaxWidth()
)
}
},
confirmButton = {
TextButton(
onClick = {
val portNum = port.trim().toIntOrNull()
if (host.isNotBlank() && portNum != null && portNum in 1..65535) {
val displayLabel = label.ifBlank { host }
onAdd(host.trim(), port.trim(), displayLabel)
}
}
) { Text("Add") }
},
dismissButton = {
TextButton(onClick = onDismiss) { Text("Cancel") }
}
)
}
@Composable
private fun RestoreKeyDialog(
onDismiss: () -> Unit,
onRestore: (hex: String) -> Unit
) {
var keyInput by remember { mutableStateOf("") }
var error by remember { mutableStateOf<String?>(null) }
AlertDialog(
onDismissRequest = onDismiss,
title = { Text("Restore Identity Key") },
text = {
Column {
Text(
text = "Paste your 64-character hex key below. This will replace your current identity.",
style = MaterialTheme.typography.bodySmall,
color = MaterialTheme.colorScheme.onSurfaceVariant
)
Spacer(modifier = Modifier.height(8.dp))
OutlinedTextField(
value = keyInput,
onValueChange = {
keyInput = it.trim().lowercase()
error = null
},
label = { Text("Identity Key (hex)") },
singleLine = true,
modifier = Modifier.fillMaxWidth(),
isError = error != null
)
error?.let {
Text(
text = it,
style = MaterialTheme.typography.bodySmall,
color = MaterialTheme.colorScheme.error
)
}
}
},
confirmButton = {
TextButton(
onClick = {
val cleaned = keyInput.replace("\\s".toRegex(), "")
if (cleaned.length != 64 || !cleaned.all { it in '0'..'9' || it in 'a'..'f' }) {
error = "Key must be exactly 64 hex characters"
} else {
onRestore(cleaned)
}
}
) { Text("Restore") }
},
dismissButton = {
TextButton(onClick = onDismiss) { Text("Cancel") }
}
)
}
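For reference, the 64-hex-character seed check in RestoreKeyDialog can be modeled as a standalone function. This is an illustrative Rust sketch only (the shipping check is the Kotlin predicate above); `validate_seed_hex` is a hypothetical name, not part of the codebase:

```rust
// Mirrors the dialog's rule: strip whitespace, lowercase, then require
// exactly 64 hex characters (a 32-byte seed encoded as hex).
fn validate_seed_hex(input: &str) -> Result<String, &'static str> {
    let cleaned: String = input
        .to_lowercase()
        .chars()
        .filter(|c| !c.is_whitespace())
        .collect();
    // After lowercasing, is_ascii_hexdigit matches the Kotlin '0'..'9'/'a'..'f' check.
    if cleaned.len() == 64 && cleaned.chars().all(|c| c.is_ascii_hexdigit()) {
        Ok(cleaned)
    } else {
        Err("Key must be exactly 64 hex characters")
    }
}

fn main() {
    assert!(validate_seed_hex(&"AB".repeat(32)).is_ok()); // uppercase input is normalized
    assert!(validate_seed_hex("1234").is_err());          // too short
    println!("seed validation checks pass");
}
```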


@@ -0,0 +1,4 @@
<?xml version="1.0" encoding="utf-8"?>
<paths>
<cache-path name="debug" path="." />
</paths>

android/build.gradle.kts Normal file

@@ -0,0 +1,4 @@
plugins {
id("com.android.application") version "8.2.0" apply false
id("org.jetbrains.kotlin.android") version "1.9.22" apply false
}


@@ -0,0 +1,4 @@
org.gradle.jvmargs=-Xmx2048m -Dfile.encoding=UTF-8
android.useAndroidX=true
kotlin.code.style=official
android.nonTransitiveRClass=true



@@ -0,0 +1,6 @@
distributionBase=GRADLE_USER_HOME
distributionPath=wrapper/dists
distributionUrl=https\://services.gradle.org/distributions/gradle-8.5-bin.zip
networkTimeout=10000
zipStoreBase=GRADLE_USER_HOME
zipStorePath=wrapper/dists

android/gradlew vendored Executable file

@@ -0,0 +1,5 @@
#!/bin/sh
# Gradle wrapper script
APP_HOME=$(cd "$(dirname "$0")" && pwd)
CLASSPATH="$APP_HOME/gradle/wrapper/gradle-wrapper.jar"
exec java -classpath "$CLASSPATH" org.gradle.wrapper.GradleWrapperMain "$@"


@@ -0,0 +1,18 @@
pluginManagement {
repositories {
google()
mavenCentral()
gradlePluginPortal()
}
}
dependencyResolutionManagement {
repositoriesMode.set(RepositoriesMode.FAIL_ON_PROJECT_REPOS)
repositories {
google()
mavenCentral()
}
}
rootProject.name = "WZPhone"
include(":app")


@@ -0,0 +1,34 @@
[package]
name = "wzp-android"
version.workspace = true
edition.workspace = true
license.workspace = true
rust-version.workspace = true
description = "WarzonePhone Android native VoIP engine — Oboe audio, JNI bridge, call pipeline"
[lib]
crate-type = ["cdylib", "rlib"]
[dependencies]
wzp-proto = { workspace = true }
wzp-codec = { workspace = true }
wzp-fec = { workspace = true }
wzp-crypto = { workspace = true }
wzp-transport = { workspace = true }
tokio = { workspace = true }
tracing = { workspace = true }
tracing-subscriber = { workspace = true }
bytes = { workspace = true }
serde = { workspace = true }
serde_json = "1"
thiserror = { workspace = true }
async-trait = { workspace = true }
anyhow = "1"
libc = "0.2"
jni = { version = "0.21", default-features = false }
rand = { workspace = true }
rustls = { version = "0.23", default-features = false, features = ["ring"] }
tracing-android = "0.2"
[build-dependencies]
cc = "1"

crates/wzp-android/build.rs Normal file

@@ -0,0 +1,154 @@
use std::path::PathBuf;
fn main() {
let target = std::env::var("TARGET").unwrap_or_default();
if target.contains("android") {
// Override broken static getauxval from compiler-rt that crashes
// in shared libraries. Must be compiled first to take link priority.
cc::Build::new()
.file("cpp/getauxval_fix.c")
.compile("getauxval_fix");
let oboe_dir = fetch_oboe();
match oboe_dir {
Some(oboe_path) => {
println!("cargo:warning=Building with Oboe from {:?}", oboe_path);
let mut build = cc::Build::new();
build
.cpp(true)
.std("c++17")
// Use shared libc++ — avoids pulling in static libc stubs
// that crash in shared libraries (getauxval, pthread_create, etc.)
.cpp_link_stdlib(Some("c++_shared"))
.include("cpp")
.include(oboe_path.join("include"))
.include(oboe_path.join("src"))
.define("WZP_HAS_OBOE", None)
.file("cpp/oboe_bridge.cpp");
// Compile all Oboe source files
let src_dir = oboe_path.join("src");
add_cpp_files_recursive(&mut build, &src_dir);
build.compile("oboe_bridge");
}
None => {
println!("cargo:warning=Oboe not found, building with stub");
cc::Build::new()
.cpp(true)
.std("c++17")
.cpp_link_stdlib(Some("c++_shared"))
.file("cpp/oboe_stub.cpp")
.include("cpp")
.compile("oboe_bridge");
}
}
// Dynamic C++ runtime — libc++_shared.so must be in jniLibs alongside
// libwzp_android.so. We copy it there from the NDK sysroot.
//
// WHY NOT STATIC: libc++_static.a + libc++abi.a transitively pull in
// object files from libc.a (static libc) which contain broken stubs for
// getauxval, __init_tcb, pthread_create, etc. These stubs only work in
// statically-linked executables. In shared libraries loaded by dlopen(),
// they SIGSEGV because the static libc init hasn't run.
// Google's official recommendation: use libc++_shared.so for native libs.
if let Ok(ndk) = std::env::var("ANDROID_NDK_HOME") {
let arch = if target.contains("aarch64") {
"aarch64-linux-android"
} else if target.contains("armv7") {
"arm-linux-androideabi"
} else if target.contains("x86_64") {
"x86_64-linux-android"
} else {
"aarch64-linux-android"
};
let lib_dir = format!(
"{ndk}/toolchains/llvm/prebuilt/linux-x86_64/sysroot/usr/lib/{arch}"
);
println!("cargo:rustc-link-search=native={lib_dir}");
// Copy libc++_shared.so to the jniLibs directory
let shared_so = format!("{lib_dir}/libc++_shared.so");
if std::path::Path::new(&shared_so).exists() {
let jni_abi = if target.contains("aarch64") {
"arm64-v8a"
} else if target.contains("armv7") {
"armeabi-v7a"
} else {
"arm64-v8a"
};
// Try to copy to the Gradle jniLibs directory
let manifest = std::env::var("CARGO_MANIFEST_DIR").unwrap_or_default();
let jni_dir = format!(
"{manifest}/../../android/app/src/main/jniLibs/{jni_abi}"
);
if std::fs::create_dir_all(&jni_dir).is_ok() {
let _ = std::fs::copy(&shared_so, format!("{jni_dir}/libc++_shared.so"));
println!("cargo:warning=Copied libc++_shared.so to {jni_dir}");
}
}
}
// Oboe needs liblog and libOpenSLES from Android
println!("cargo:rustc-link-lib=log");
println!("cargo:rustc-link-lib=OpenSLES");
} else {
// Non-Android: always use stub
cc::Build::new()
.cpp(true)
.std("c++17")
.file("cpp/oboe_stub.cpp")
.include("cpp")
.compile("oboe_bridge");
}
}
/// Recursively add all .cpp files from a directory to a cc::Build.
fn add_cpp_files_recursive(build: &mut cc::Build, dir: &std::path::Path) {
if !dir.is_dir() {
return;
}
for entry in std::fs::read_dir(dir).unwrap() {
let entry = entry.unwrap();
let path = entry.path();
if path.is_dir() {
add_cpp_files_recursive(build, &path);
} else if path.extension().map_or(false, |e| e == "cpp") {
build.file(&path);
}
}
}
/// Try to find or fetch Oboe headers + source.
fn fetch_oboe() -> Option<PathBuf> {
let out_dir = PathBuf::from(std::env::var("OUT_DIR").unwrap());
let oboe_dir = out_dir.join("oboe");
if oboe_dir.join("include").join("oboe").join("Oboe.h").exists() {
return Some(oboe_dir);
}
let status = std::process::Command::new("git")
.args([
"clone",
"--depth=1",
"--branch=1.8.1",
"https://github.com/google/oboe.git",
oboe_dir.to_str().unwrap(),
])
.status();
match status {
Ok(s) if s.success() => {
if oboe_dir.join("include").join("oboe").join("Oboe.h").exists() {
Some(oboe_dir)
} else {
None
}
}
_ => None,
}
}


@@ -0,0 +1,21 @@
// Override the broken static getauxval from compiler-rt/CRT.
// The static version reads from __libc_auxv which is NULL in shared libs
// loaded via dlopen, causing SIGSEGV in init_have_lse_atomics at load time.
// This version calls the real bionic getauxval via dlsym.
#ifdef __ANDROID__
#include <dlfcn.h>
#include <stdint.h>
typedef unsigned long (*getauxval_fn)(unsigned long);
unsigned long getauxval(unsigned long type) {
static getauxval_fn real_getauxval = (getauxval_fn)0;
if (!real_getauxval) {
// RTLD_NEXT skips this library's own override and resolves bionic's real symbol.
real_getauxval = (getauxval_fn)dlsym(RTLD_NEXT, "getauxval");
if (!real_getauxval) {
return 0;
}
}
return real_getauxval(type);
}
#endif
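The shim's resolve-once-then-cache pattern can be sketched in standalone Rust. This is illustrative only: `lookup` is a stand-in for the `dlsym(RTLD_NEXT, ...)` call, not a real binding to the dynamic linker:

```rust
use std::sync::OnceLock;

// Analog of the C shim: resolve the real function once, cache the pointer,
// and return a safe default (0) when the lookup fails.
static REAL_GETAUXVAL: OnceLock<Option<fn(u64) -> u64>> = OnceLock::new();

fn lookup() -> Option<fn(u64) -> u64> {
    // Pretend this came from the dynamic linker; doubling is an arbitrary stub.
    Some(|ty| ty.wrapping_mul(2))
}

fn getauxval_shim(ty: u64) -> u64 {
    match REAL_GETAUXVAL.get_or_init(lookup) {
        Some(real) => real(ty),
        None => 0, // same fallback as the C shim when dlsym returns NULL
    }
}

fn main() {
    assert_eq!(getauxval_shim(21), 42);
    println!("shim resolves once, then reuses the cached pointer");
}
```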


@@ -0,0 +1,278 @@
// Full Oboe implementation for Android
// This file is compiled only when targeting Android
#include "oboe_bridge.h"
#ifdef __ANDROID__
#include <oboe/Oboe.h>
#include <android/log.h>
#include <cstring>
#include <atomic>
#define LOG_TAG "wzp-oboe"
#define LOGI(...) __android_log_print(ANDROID_LOG_INFO, LOG_TAG, __VA_ARGS__)
#define LOGW(...) __android_log_print(ANDROID_LOG_WARN, LOG_TAG, __VA_ARGS__)
#define LOGE(...) __android_log_print(ANDROID_LOG_ERROR, LOG_TAG, __VA_ARGS__)
// ---------------------------------------------------------------------------
// Ring buffer helpers (SPSC, lock-free)
// ---------------------------------------------------------------------------
static inline int32_t ring_available_read(const wzp_atomic_int* write_idx,
const wzp_atomic_int* read_idx,
int32_t capacity) {
int32_t w = std::atomic_load_explicit(write_idx, std::memory_order_acquire);
int32_t r = std::atomic_load_explicit(read_idx, std::memory_order_relaxed);
int32_t avail = w - r;
if (avail < 0) avail += capacity;
return avail;
}
static inline int32_t ring_available_write(const wzp_atomic_int* write_idx,
const wzp_atomic_int* read_idx,
int32_t capacity) {
return capacity - 1 - ring_available_read(write_idx, read_idx, capacity);
}
static inline void ring_write(int16_t* buf, int32_t capacity,
wzp_atomic_int* write_idx, const wzp_atomic_int* read_idx,
const int16_t* src, int32_t count) {
int32_t w = std::atomic_load_explicit(write_idx, std::memory_order_relaxed);
for (int32_t i = 0; i < count; i++) {
buf[w] = src[i];
w++;
if (w >= capacity) w = 0;
}
std::atomic_store_explicit(write_idx, w, std::memory_order_release);
}
static inline void ring_read(int16_t* buf, int32_t capacity,
const wzp_atomic_int* write_idx, wzp_atomic_int* read_idx,
int16_t* dst, int32_t count) {
int32_t r = std::atomic_load_explicit(read_idx, std::memory_order_relaxed);
for (int32_t i = 0; i < count; i++) {
dst[i] = buf[r];
r++;
if (r >= capacity) r = 0;
}
std::atomic_store_explicit(read_idx, r, std::memory_order_release);
}
// ---------------------------------------------------------------------------
// Global state
// ---------------------------------------------------------------------------
static std::shared_ptr<oboe::AudioStream> g_capture_stream;
static std::shared_ptr<oboe::AudioStream> g_playout_stream;
static const WzpOboeRings* g_rings = nullptr;
static std::atomic<bool> g_running{false};
static std::atomic<float> g_capture_latency_ms{0.0f};
static std::atomic<float> g_playout_latency_ms{0.0f};
// ---------------------------------------------------------------------------
// Capture callback
// ---------------------------------------------------------------------------
class CaptureCallback : public oboe::AudioStreamDataCallback {
public:
oboe::DataCallbackResult onAudioReady(
oboe::AudioStream* stream,
void* audioData,
int32_t numFrames) override {
if (!g_running.load(std::memory_order_relaxed) || !g_rings) {
return oboe::DataCallbackResult::Stop;
}
const int16_t* src = static_cast<const int16_t*>(audioData);
int32_t avail = ring_available_write(g_rings->capture_write_idx,
g_rings->capture_read_idx,
g_rings->capture_capacity);
int32_t to_write = (numFrames < avail) ? numFrames : avail;
if (to_write > 0) {
ring_write(g_rings->capture_buf, g_rings->capture_capacity,
g_rings->capture_write_idx, g_rings->capture_read_idx,
src, to_write);
}
// Update latency estimate
auto result = stream->calculateLatencyMillis();
if (result) {
g_capture_latency_ms.store(static_cast<float>(result.value()),
std::memory_order_relaxed);
}
return oboe::DataCallbackResult::Continue;
}
};
// ---------------------------------------------------------------------------
// Playout callback
// ---------------------------------------------------------------------------
class PlayoutCallback : public oboe::AudioStreamDataCallback {
public:
oboe::DataCallbackResult onAudioReady(
oboe::AudioStream* stream,
void* audioData,
int32_t numFrames) override {
if (!g_running.load(std::memory_order_relaxed) || !g_rings) {
memset(audioData, 0, numFrames * sizeof(int16_t));
return oboe::DataCallbackResult::Stop;
}
int16_t* dst = static_cast<int16_t*>(audioData);
int32_t avail = ring_available_read(g_rings->playout_write_idx,
g_rings->playout_read_idx,
g_rings->playout_capacity);
int32_t to_read = (numFrames < avail) ? numFrames : avail;
if (to_read > 0) {
ring_read(g_rings->playout_buf, g_rings->playout_capacity,
g_rings->playout_write_idx, g_rings->playout_read_idx,
dst, to_read);
}
// Fill remainder with silence on underrun
if (to_read < numFrames) {
memset(dst + to_read, 0, (numFrames - to_read) * sizeof(int16_t));
}
// Update latency estimate
auto result = stream->calculateLatencyMillis();
if (result) {
g_playout_latency_ms.store(static_cast<float>(result.value()),
std::memory_order_relaxed);
}
return oboe::DataCallbackResult::Continue;
}
};
static CaptureCallback g_capture_cb;
static PlayoutCallback g_playout_cb;
// ---------------------------------------------------------------------------
// Public C API
// ---------------------------------------------------------------------------
int wzp_oboe_start(const WzpOboeConfig* config, const WzpOboeRings* rings) {
if (g_running.load(std::memory_order_relaxed)) {
LOGW("wzp_oboe_start: already running");
return -1;
}
g_rings = rings;
// Build capture stream
oboe::AudioStreamBuilder captureBuilder;
captureBuilder.setDirection(oboe::Direction::Input)
->setPerformanceMode(oboe::PerformanceMode::LowLatency)
->setSharingMode(oboe::SharingMode::Exclusive)
->setFormat(oboe::AudioFormat::I16)
->setChannelCount(config->channel_count)
->setSampleRate(config->sample_rate)
->setFramesPerDataCallback(config->frames_per_burst)
->setInputPreset(oboe::InputPreset::VoiceCommunication)
->setDataCallback(&g_capture_cb);
oboe::Result result = captureBuilder.openStream(g_capture_stream);
if (result != oboe::Result::OK) {
LOGE("Failed to open capture stream: %s", oboe::convertToText(result));
return -2;
}
// Build playout stream
oboe::AudioStreamBuilder playoutBuilder;
playoutBuilder.setDirection(oboe::Direction::Output)
->setPerformanceMode(oboe::PerformanceMode::LowLatency)
->setSharingMode(oboe::SharingMode::Exclusive)
->setFormat(oboe::AudioFormat::I16)
->setChannelCount(config->channel_count)
->setSampleRate(config->sample_rate)
->setFramesPerDataCallback(config->frames_per_burst)
->setUsage(oboe::Usage::VoiceCommunication)
->setDataCallback(&g_playout_cb);
result = playoutBuilder.openStream(g_playout_stream);
if (result != oboe::Result::OK) {
LOGE("Failed to open playout stream: %s", oboe::convertToText(result));
g_capture_stream->close();
g_capture_stream.reset();
return -3;
}
g_running.store(true, std::memory_order_release);
// Start both streams
result = g_capture_stream->requestStart();
if (result != oboe::Result::OK) {
LOGE("Failed to start capture: %s", oboe::convertToText(result));
g_running.store(false, std::memory_order_release);
g_capture_stream->close();
g_playout_stream->close();
g_capture_stream.reset();
g_playout_stream.reset();
return -4;
}
result = g_playout_stream->requestStart();
if (result != oboe::Result::OK) {
LOGE("Failed to start playout: %s", oboe::convertToText(result));
g_running.store(false, std::memory_order_release);
g_capture_stream->requestStop();
g_capture_stream->close();
g_playout_stream->close();
g_capture_stream.reset();
g_playout_stream.reset();
return -5;
}
LOGI("Oboe started: sr=%d burst=%d ch=%d",
config->sample_rate, config->frames_per_burst, config->channel_count);
return 0;
}
void wzp_oboe_stop(void) {
g_running.store(false, std::memory_order_release);
if (g_capture_stream) {
g_capture_stream->requestStop();
g_capture_stream->close();
g_capture_stream.reset();
}
if (g_playout_stream) {
g_playout_stream->requestStop();
g_playout_stream->close();
g_playout_stream.reset();
}
g_rings = nullptr;
LOGI("Oboe stopped");
}
float wzp_oboe_capture_latency_ms(void) {
return g_capture_latency_ms.load(std::memory_order_relaxed);
}
float wzp_oboe_playout_latency_ms(void) {
return g_playout_latency_ms.load(std::memory_order_relaxed);
}
int wzp_oboe_is_running(void) {
return g_running.load(std::memory_order_relaxed) ? 1 : 0;
}
#else
// Non-Android fallback — should not be reached; oboe_stub.cpp is used instead.
// Provide empty implementations just in case.
int wzp_oboe_start(const WzpOboeConfig* config, const WzpOboeRings* rings) {
(void)config; (void)rings;
return -99;
}
void wzp_oboe_stop(void) {}
float wzp_oboe_capture_latency_ms(void) { return 0.0f; }
float wzp_oboe_playout_latency_ms(void) { return 0.0f; }
int wzp_oboe_is_running(void) { return 0; }
#endif // __ANDROID__
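The index arithmetic used by the ring helpers above is easy to sanity-check in isolation. A standalone Rust restatement (illustrative only; the shipping implementations are the C++ helpers above and the Rust `RingBuffer`):

```rust
// Indices wrap modulo capacity, and one slot is sacrificed so that
// write_idx == read_idx unambiguously means "empty"
// (usable capacity = capacity - 1).
fn available_read(write_idx: i32, read_idx: i32, capacity: i32) -> i32 {
    let mut avail = write_idx - read_idx;
    if avail < 0 {
        avail += capacity; // producer has wrapped past the end
    }
    avail
}

fn available_write(write_idx: i32, read_idx: i32, capacity: i32) -> i32 {
    capacity - 1 - available_read(write_idx, read_idx, capacity)
}

fn main() {
    assert_eq!(available_read(0, 0, 8), 0);  // empty
    assert_eq!(available_write(0, 0, 8), 7); // one slot reserved
    assert_eq!(available_read(2, 6, 8), 4);  // wrapped producer
    assert_eq!(available_write(2, 6, 8), 3);
    println!("ring index arithmetic checks pass");
}
```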


@@ -0,0 +1,43 @@
#ifndef WZP_OBOE_BRIDGE_H
#define WZP_OBOE_BRIDGE_H
#include <stdint.h>
#ifdef __cplusplus
#include <atomic>
typedef std::atomic<int32_t> wzp_atomic_int;
extern "C" {
#else
#include <stdatomic.h>
typedef atomic_int wzp_atomic_int;
#endif
typedef struct {
int32_t sample_rate;
int32_t frames_per_burst;
int32_t channel_count;
} WzpOboeConfig;
typedef struct {
int16_t* capture_buf;
int32_t capture_capacity;
wzp_atomic_int* capture_write_idx;
wzp_atomic_int* capture_read_idx;
int16_t* playout_buf;
int32_t playout_capacity;
wzp_atomic_int* playout_write_idx;
wzp_atomic_int* playout_read_idx;
} WzpOboeRings;
int wzp_oboe_start(const WzpOboeConfig* config, const WzpOboeRings* rings);
void wzp_oboe_stop(void);
float wzp_oboe_capture_latency_ms(void);
float wzp_oboe_playout_latency_ms(void);
int wzp_oboe_is_running(void);
#ifdef __cplusplus
}
#endif
#endif // WZP_OBOE_BRIDGE_H


@@ -0,0 +1,27 @@
// Stub implementation for non-Android host builds (testing, cargo check, etc.)
#include "oboe_bridge.h"
#include <stdio.h>
int wzp_oboe_start(const WzpOboeConfig* config, const WzpOboeRings* rings) {
(void)config;
(void)rings;
fprintf(stderr, "wzp_oboe_start: stub (not on Android)\n");
return 0;
}
void wzp_oboe_stop(void) {
fprintf(stderr, "wzp_oboe_stop: stub (not on Android)\n");
}
float wzp_oboe_capture_latency_ms(void) {
return 0.0f;
}
float wzp_oboe_playout_latency_ms(void) {
return 0.0f;
}
int wzp_oboe_is_running(void) {
return 0;
}
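The Rust backend that follows shares ring indices across the Oboe audio thread and the codec thread using acquire/release ordering. A minimal, self-contained model of that handoff (illustrative sketch, not from the diff; `handoff` is a hypothetical name):

```rust
use std::sync::atomic::{AtomicI32, Ordering};
use std::sync::Arc;
use std::thread;

// The producer fills a slot, then publishes the index with Release; the
// consumer's Acquire load of the index guarantees it also sees the slot.
fn handoff() -> i32 {
    let slot = Arc::new(AtomicI32::new(0));      // stands in for one buffer cell
    let write_idx = Arc::new(AtomicI32::new(0)); // published index

    let (s, w) = (Arc::clone(&slot), Arc::clone(&write_idx));
    let producer = thread::spawn(move || {
        s.store(1234, Ordering::Relaxed); // fill the slot
        w.store(1, Ordering::Release);    // publish: prior writes become visible
    });

    while write_idx.load(Ordering::Acquire) == 0 {
        std::hint::spin_loop(); // consumer waits for the publish
    }
    let seen = slot.load(Ordering::Relaxed);
    producer.join().unwrap();
    seen
}

fn main() {
    assert_eq!(handoff(), 1234);
    println!("consumer observed the published slot");
}
```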


@@ -0,0 +1,424 @@
//! Lock-free SPSC ring buffer audio backend for Android (Oboe).
//!
//! The ring buffers are shared between Rust and C++: the Oboe callbacks
//! (running on a high-priority audio thread) read/write directly into
//! the buffers via atomic indices, while the Rust codec thread on the
//! other side does the same.
use std::sync::atomic::{AtomicI32, Ordering};
use tracing::info;
#[allow(unused_imports)]
use tracing::warn;
/// Number of samples per 20 ms frame at 48 kHz mono.
pub const FRAME_SAMPLES: usize = 960;
/// Default ring buffer capacity: 8 frames = 160 ms at 48 kHz.
const RING_CAPACITY: usize = 7680;
// ---------------------------------------------------------------------------
// FFI declarations matching oboe_bridge.h
// ---------------------------------------------------------------------------
#[repr(C)]
#[allow(non_snake_case)]
struct WzpOboeConfig {
sample_rate: i32,
frames_per_burst: i32,
channel_count: i32,
}
#[repr(C)]
#[allow(non_snake_case)]
struct WzpOboeRings {
capture_buf: *mut i16,
capture_capacity: i32,
capture_write_idx: *mut AtomicI32,
capture_read_idx: *mut AtomicI32,
playout_buf: *mut i16,
playout_capacity: i32,
playout_write_idx: *mut AtomicI32,
playout_read_idx: *mut AtomicI32,
}
unsafe impl Send for WzpOboeRings {}
unsafe impl Sync for WzpOboeRings {}
unsafe extern "C" {
fn wzp_oboe_start(config: *const WzpOboeConfig, rings: *const WzpOboeRings) -> i32;
fn wzp_oboe_stop();
fn wzp_oboe_capture_latency_ms() -> f32;
fn wzp_oboe_playout_latency_ms() -> f32;
fn wzp_oboe_is_running() -> i32;
}
// ---------------------------------------------------------------------------
// SPSC Ring Buffer
// ---------------------------------------------------------------------------
/// Single-producer single-consumer lock-free ring buffer.
///
/// The producer calls `write()` and the consumer calls `read()`.
/// Atomics use acquire/release ordering to ensure correct visibility
/// across the Oboe audio thread and the Rust codec thread.
pub struct RingBuffer {
buf: Vec<i16>,
capacity: usize,
write_idx: AtomicI32,
read_idx: AtomicI32,
}
impl RingBuffer {
/// Create a new ring buffer with the given capacity (in samples).
///
/// The actual usable capacity is `capacity - 1` to distinguish
/// full from empty.
pub fn new(capacity: usize) -> Self {
Self {
buf: vec![0i16; capacity],
capacity,
write_idx: AtomicI32::new(0),
read_idx: AtomicI32::new(0),
}
}
/// Number of samples available to read.
pub fn available_read(&self) -> usize {
let w = self.write_idx.load(Ordering::Acquire);
let r = self.read_idx.load(Ordering::Relaxed);
let avail = w - r;
if avail < 0 {
(avail + self.capacity as i32) as usize
} else {
avail as usize
}
}
/// Number of samples that can be written before the buffer is full.
pub fn available_write(&self) -> usize {
self.capacity - 1 - self.available_read()
}
/// Write samples into the ring buffer (producer side).
///
/// Returns the number of samples actually written (may be less than
/// `data.len()` if the buffer is nearly full).
pub fn write(&self, data: &[i16]) -> usize {
let avail = self.available_write();
let count = data.len().min(avail);
if count == 0 {
return 0;
}
let mut w = self.write_idx.load(Ordering::Relaxed) as usize;
let cap = self.capacity;
let buf_ptr = self.buf.as_ptr() as *mut i16;
for i in 0..count {
// SAFETY: w is always in [0, capacity) and we are the sole producer.
unsafe {
*buf_ptr.add(w) = data[i];
}
w += 1;
if w >= cap {
w = 0;
}
}
self.write_idx.store(w as i32, Ordering::Release);
count
}
/// Read samples from the ring buffer (consumer side).
///
/// Returns the number of samples actually read (may be less than
/// `out.len()` if the buffer doesn't have enough data).
pub fn read(&self, out: &mut [i16]) -> usize {
let avail = self.available_read();
let count = out.len().min(avail);
if count == 0 {
return 0;
}
let mut r = self.read_idx.load(Ordering::Relaxed) as usize;
let cap = self.capacity;
let buf_ptr = self.buf.as_ptr();
for i in 0..count {
// SAFETY: r is always in [0, capacity) and we are the sole consumer.
unsafe {
out[i] = *buf_ptr.add(r);
}
r += 1;
if r >= cap {
r = 0;
}
}
self.read_idx.store(r as i32, Ordering::Release);
count
}
/// Get a raw pointer to the buffer data (for FFI).
fn buf_ptr(&self) -> *mut i16 {
self.buf.as_ptr() as *mut i16
}
/// Get a raw pointer to the write index atomic (for FFI).
fn write_idx_ptr(&self) -> *mut AtomicI32 {
&self.write_idx as *const AtomicI32 as *mut AtomicI32
}
/// Get a raw pointer to the read index atomic (for FFI).
fn read_idx_ptr(&self) -> *mut AtomicI32 {
&self.read_idx as *const AtomicI32 as *mut AtomicI32
}
}
// SAFETY: The ring buffer is designed for SPSC use where producer and consumer
// are on different threads. The atomic indices provide the synchronization.
unsafe impl Send for RingBuffer {}
unsafe impl Sync for RingBuffer {}
// ---------------------------------------------------------------------------
// Oboe Backend
// ---------------------------------------------------------------------------
/// Oboe-based audio backend for Android.
///
/// Owns two SPSC ring buffers (capture and playout) that are shared with
/// the C++ Oboe callbacks via raw pointers. The Oboe callbacks run on
/// high-priority audio threads managed by the Android audio system.
pub struct OboeBackend {
capture_ring: RingBuffer,
playout_ring: RingBuffer,
started: bool,
}
impl OboeBackend {
/// Create a new backend with default ring buffer sizes (160 ms each).
pub fn new() -> Self {
Self {
capture_ring: RingBuffer::new(RING_CAPACITY),
playout_ring: RingBuffer::new(RING_CAPACITY),
started: false,
}
}
/// Start Oboe audio streams.
///
/// This sets up the ring buffer pointers and calls into the C++ layer
/// to open and start the capture and playout Oboe streams.
pub fn start(&mut self) -> Result<(), anyhow::Error> {
if self.started {
return Ok(());
}
let config = WzpOboeConfig {
sample_rate: 48_000,
frames_per_burst: FRAME_SAMPLES as i32,
channel_count: 1,
};
let rings = WzpOboeRings {
capture_buf: self.capture_ring.buf_ptr(),
capture_capacity: self.capture_ring.capacity as i32,
capture_write_idx: self.capture_ring.write_idx_ptr(),
capture_read_idx: self.capture_ring.read_idx_ptr(),
playout_buf: self.playout_ring.buf_ptr(),
playout_capacity: self.playout_ring.capacity as i32,
playout_write_idx: self.playout_ring.write_idx_ptr(),
playout_read_idx: self.playout_ring.read_idx_ptr(),
};
let ret = unsafe { wzp_oboe_start(&config, &rings) };
if ret != 0 {
return Err(anyhow::anyhow!("wzp_oboe_start failed with code {}", ret));
}
self.started = true;
info!("Oboe backend started");
Ok(())
}
/// Stop Oboe audio streams.
pub fn stop(&mut self) {
if !self.started {
return;
}
unsafe { wzp_oboe_stop() };
self.started = false;
info!("Oboe backend stopped");
}
/// Read captured audio samples from the capture ring buffer.
///
/// Returns the number of samples actually read. The caller should
/// provide a buffer of at least `FRAME_SAMPLES` (960) samples.
pub fn read_capture(&self, out: &mut [i16]) -> usize {
self.capture_ring.read(out)
}
/// Write audio samples to the playout ring buffer.
///
/// Returns the number of samples actually written.
pub fn write_playout(&self, samples: &[i16]) -> usize {
self.playout_ring.write(samples)
}
/// Get the current capture latency in milliseconds (from Oboe).
#[allow(unused)]
pub fn capture_latency_ms(&self) -> f32 {
unsafe { wzp_oboe_capture_latency_ms() }
}
/// Get the current playout latency in milliseconds (from Oboe).
#[allow(unused)]
pub fn playout_latency_ms(&self) -> f32 {
unsafe { wzp_oboe_playout_latency_ms() }
}
/// Check if the Oboe streams are currently running.
#[allow(unused)]
pub fn is_running(&self) -> bool {
unsafe { wzp_oboe_is_running() != 0 }
}
}
impl Drop for OboeBackend {
fn drop(&mut self) {
self.stop();
}
}
// ---------------------------------------------------------------------------
// Thread affinity / priority helpers
// ---------------------------------------------------------------------------
/// Pin the current thread to the highest-numbered CPU cores (big cores on
/// ARM big.LITTLE architectures). Falls back silently on failure.
#[allow(unused)]
pub fn pin_to_big_core() {
#[cfg(target_os = "android")]
{
unsafe {
let num_cpus = libc::sysconf(libc::_SC_NPROCESSORS_ONLN);
if num_cpus <= 0 {
warn!("pin_to_big_core: could not determine CPU count");
return;
}
let num_cpus = num_cpus as usize;
// Target the upper half of CPUs (big cores on most big.LITTLE SoCs)
let start = num_cpus / 2;
let mut set: libc::cpu_set_t = std::mem::zeroed();
libc::CPU_ZERO(&mut set);
for cpu in start..num_cpus {
libc::CPU_SET(cpu, &mut set);
}
let ret = libc::sched_setaffinity(
0, // current thread
std::mem::size_of::<libc::cpu_set_t>(),
&set,
);
if ret != 0 {
warn!("sched_setaffinity failed: {}", std::io::Error::last_os_error());
} else {
info!(start, num_cpus, "pinned to big cores");
}
}
}
#[cfg(not(target_os = "android"))]
{
// No-op on non-Android
}
}
/// Attempt to set SCHED_FIFO real-time priority for the current thread.
/// Falls back silently on failure (requires appropriate permissions on Android).
#[allow(unused)]
pub fn set_realtime_priority() {
#[cfg(target_os = "android")]
{
unsafe {
let param = libc::sched_param {
sched_priority: 2, // Low RT priority — enough for audio, safe
};
let ret = libc::sched_setscheduler(0, libc::SCHED_FIFO, &param);
if ret != 0 {
warn!(
"sched_setscheduler(SCHED_FIFO) failed: {}",
std::io::Error::last_os_error()
);
} else {
info!("set SCHED_FIFO priority 2");
}
}
}
#[cfg(not(target_os = "android"))]
{
// No-op on non-Android
}
}
// ---------------------------------------------------------------------------
// Tests
// ---------------------------------------------------------------------------
#[cfg(test)]
mod tests {
use super::*;
#[test]
fn ring_buffer_write_read() {
let ring = RingBuffer::new(16);
let data = [1i16, 2, 3, 4, 5];
assert_eq!(ring.write(&data), 5);
assert_eq!(ring.available_read(), 5);
let mut out = [0i16; 5];
assert_eq!(ring.read(&mut out), 5);
assert_eq!(out, [1, 2, 3, 4, 5]);
assert_eq!(ring.available_read(), 0);
}
#[test]
fn ring_buffer_wraparound() {
let ring = RingBuffer::new(8);
let data = [10i16, 20, 30, 40, 50, 60]; // 6 samples, capacity 8 (usable 7)
assert_eq!(ring.write(&data), 6);
let mut out = [0i16; 4];
assert_eq!(ring.read(&mut out), 4);
assert_eq!(out, [10, 20, 30, 40]);
// Now write more, which should wrap around
let data2 = [70i16, 80, 90, 100];
assert_eq!(ring.write(&data2), 4);
let mut out2 = [0i16; 6];
assert_eq!(ring.read(&mut out2), 6);
assert_eq!(out2, [50, 60, 70, 80, 90, 100]);
}
#[test]
fn ring_buffer_full() {
let ring = RingBuffer::new(4); // usable capacity = 3
let data = [1i16, 2, 3, 4, 5];
assert_eq!(ring.write(&data), 3); // Only 3 fit
assert_eq!(ring.available_write(), 0);
}
#[test]
fn oboe_backend_stub_start_stop() {
let mut backend = OboeBackend::new();
backend.start().expect("stub start should succeed");
assert!(backend.started);
backend.stop();
assert!(!backend.started);
}
}

@@ -0,0 +1,91 @@
//! Lock-free SPSC ring buffers for audio PCM transfer between
//! Kotlin AudioRecord/AudioTrack threads and the Rust engine.
//!
//! These use a simple wait-free design: the producer writes samples and then
//! advances a write cursor; the consumer reads samples and then advances a
//! read cursor. Both cursors are atomic, so no mutex is needed.
use std::sync::atomic::{AtomicUsize, Ordering};
/// Ring buffer capacity in i16 samples.
/// 960 samples per 20 ms frame * 10 frames = 200 ms of audio at 48 kHz mono.
const RING_CAPACITY: usize = 960 * 10;
/// Lock-free single-producer single-consumer ring buffer for i16 PCM samples.
pub struct AudioRing {
buf: Box<[i16; RING_CAPACITY]>,
write_pos: AtomicUsize,
read_pos: AtomicUsize,
}
// SAFETY: AudioRing is designed for SPSC — one thread writes, one reads.
// The atomics ensure visibility. The buffer itself is never accessed
// from the same index by both threads simultaneously because the
// producer only writes to positions between write_pos and read_pos,
// and the consumer only reads from positions between read_pos and write_pos.
unsafe impl Send for AudioRing {}
unsafe impl Sync for AudioRing {}
impl AudioRing {
pub fn new() -> Self {
Self {
buf: Box::new([0i16; RING_CAPACITY]),
write_pos: AtomicUsize::new(0),
read_pos: AtomicUsize::new(0),
}
}
/// Number of samples available to read.
pub fn available(&self) -> usize {
let w = self.write_pos.load(Ordering::Acquire);
let r = self.read_pos.load(Ordering::Acquire);
w.wrapping_sub(r)
}
/// Number of samples that can be written without overwriting.
pub fn free_space(&self) -> usize {
RING_CAPACITY - self.available()
}
/// Write samples into the ring. Returns number of samples written.
/// Drops oldest samples if the ring is full.
pub fn write(&self, samples: &[i16]) -> usize {
let w = self.write_pos.load(Ordering::Relaxed);
let count = samples.len().min(RING_CAPACITY);
for i in 0..count {
let idx = (w + i) % RING_CAPACITY;
// SAFETY: We're the only writer, and the reader won't read
// past read_pos which we haven't advanced past yet.
unsafe {
let ptr = self.buf.as_ptr() as *mut i16;
*ptr.add(idx) = samples[i];
}
}
self.write_pos.store(w.wrapping_add(count), Ordering::Release);
// If we overwrote unread data, advance read_pos
if self.available() > RING_CAPACITY {
let new_read = self.write_pos.load(Ordering::Relaxed).wrapping_sub(RING_CAPACITY);
self.read_pos.store(new_read, Ordering::Release);
}
count
}
/// Read samples from the ring into `out`. Returns number of samples read.
pub fn read(&self, out: &mut [i16]) -> usize {
let avail = self.available();
let count = out.len().min(avail);
let r = self.read_pos.load(Ordering::Relaxed);
for i in 0..count {
let idx = (r + i) % RING_CAPACITY;
out[i] = unsafe { *self.buf.as_ptr().add(idx) };
}
self.read_pos.store(r.wrapping_add(count), Ordering::Release);
count
}
}

@@ -0,0 +1,15 @@
//! Engine commands sent from the JNI/UI thread to the engine.
use wzp_proto::QualityProfile;
/// Commands that can be sent to the running engine.
pub enum EngineCommand {
/// Mute or unmute the microphone.
SetMute(bool),
/// Enable or disable speaker (loudspeaker) mode.
SetSpeaker(bool),
/// Force a specific quality profile (overrides adaptive logic).
ForceProfile(QualityProfile),
/// Stop the call and shut down the engine.
Stop,
}

@@ -0,0 +1,686 @@
//! Engine orchestrator — manages the call lifecycle.
//!
//! IMPORTANT: On Android, pthread_create crashes in shared libraries due to
//! static bionic stubs in the Rust std prebuilt rlibs. ALL work must happen
//! on the JNI calling thread or via the tokio current_thread runtime.
//! No std::thread::spawn or tokio multi_thread allowed.
//!
//! Audio capture and playout happen on Kotlin JVM threads via AudioRecord
//! and AudioTrack. PCM samples are transferred through lock-free ring buffers.
use std::net::SocketAddr;
use std::sync::atomic::{AtomicBool, AtomicU16, AtomicU32, Ordering};
use std::sync::{Arc, Mutex};
use std::time::Instant;
use bytes::Bytes;
use tracing::{error, info, warn};
use wzp_codec::agc::AutoGainControl;
use wzp_codec::opus_dec::OpusDecoder;
use wzp_codec::opus_enc::OpusEncoder;
use wzp_crypto::{KeyExchange, WarzoneKeyExchange};
use wzp_fec::{RaptorQFecDecoder, RaptorQFecEncoder};
use wzp_proto::{
AudioDecoder, AudioEncoder, CodecId, FecDecoder, FecEncoder,
MediaHeader, MediaPacket, MediaTransport, QualityProfile, SignalMessage,
};
use crate::audio_ring::AudioRing;
use crate::commands::EngineCommand;
use crate::stats::{CallState, CallStats};
/// Opus frame size at 48kHz mono, 20ms = 960 samples.
const FRAME_SAMPLES: usize = 960;
/// Configuration to start a call.
pub struct CallStartConfig {
pub profile: QualityProfile,
pub relay_addr: String,
pub room: String,
pub auth_token: Vec<u8>,
pub identity_seed: [u8; 32],
pub alias: Option<String>,
}
impl Default for CallStartConfig {
fn default() -> Self {
Self {
profile: QualityProfile::GOOD,
relay_addr: String::new(),
room: String::new(),
auth_token: Vec::new(),
identity_seed: [0u8; 32],
alias: None,
}
}
}
pub(crate) struct EngineState {
pub running: AtomicBool,
pub muted: AtomicBool,
pub stats: Mutex<CallStats>,
pub command_tx: std::sync::mpsc::Sender<EngineCommand>,
pub command_rx: Mutex<Option<std::sync::mpsc::Receiver<EngineCommand>>>,
/// Ring buffer: Kotlin AudioRecord → Rust encoder
pub capture_ring: AudioRing,
/// Ring buffer: Rust decoder → Kotlin AudioTrack
pub playout_ring: AudioRing,
/// Current audio level (RMS) for UI display, updated by capture path.
pub audio_level_rms: AtomicU32,
/// QUIC transport handle — stored so stop_call() can close it immediately,
/// triggering relay-side leave + RoomUpdate broadcast.
pub quic_transport: Mutex<Option<Arc<wzp_transport::QuinnTransport>>>,
}
pub struct WzpEngine {
pub(crate) state: Arc<EngineState>,
tokio_runtime: Option<tokio::runtime::Runtime>,
call_start: Option<Instant>,
}
impl WzpEngine {
pub fn new() -> Self {
let (tx, rx) = std::sync::mpsc::channel();
let state = Arc::new(EngineState {
running: AtomicBool::new(false),
muted: AtomicBool::new(false),
stats: Mutex::new(CallStats::default()),
command_tx: tx,
command_rx: Mutex::new(Some(rx)),
capture_ring: AudioRing::new(),
playout_ring: AudioRing::new(),
audio_level_rms: AtomicU32::new(0),
quic_transport: Mutex::new(None),
});
Self {
state,
tokio_runtime: None,
call_start: None,
}
}
pub fn start_call(&mut self, config: CallStartConfig) -> Result<(), anyhow::Error> {
if self.state.running.load(Ordering::Acquire) {
return Err(anyhow::anyhow!("call already active"));
}
{
let mut stats = self.state.stats.lock().unwrap();
*stats = CallStats {
state: CallState::Connecting,
..Default::default()
};
}
let runtime = tokio::runtime::Builder::new_current_thread()
.enable_all()
.build()?;
let relay_addr: SocketAddr = config.relay_addr.parse().map_err(|e| {
anyhow::anyhow!("invalid relay address '{}': {e}", config.relay_addr)
})?;
let room = config.room.clone();
let identity_seed = config.identity_seed;
let profile = config.profile;
let alias = config.alias.clone();
let state = self.state.clone();
self.state.running.store(true, Ordering::Release);
self.call_start = Some(Instant::now());
let state_clone = state.clone();
runtime.block_on(async move {
if let Err(e) = run_call(relay_addr, &room, &identity_seed, profile, alias.as_deref(), state_clone).await
{
error!("call failed: {e}");
}
});
state.running.store(false, Ordering::Release);
{
let mut stats = state.stats.lock().unwrap();
stats.state = CallState::Closed;
}
self.tokio_runtime = Some(runtime);
Ok(())
}
pub fn stop_call(&mut self) {
info!("stop_call: setting running=false");
self.state.running.store(false, Ordering::Release);
// Close QUIC connection — this wakes up all blocked recv/send futures
// inside block_on(run_call(...)) on the JNI thread. run_call will then
// wait up to 500ms for the peer to acknowledge the close before returning.
if let Some(transport) = self.state.quic_transport.lock().unwrap().take() {
info!("stop_call: closing QUIC connection");
transport.close_now();
}
let _ = self.state.command_tx.send(EngineCommand::Stop);
// Note: the runtime is still blocked in block_on(run_call(...)) on the
// start_call thread. Once run_call exits (triggered by running=false +
// connection close above), block_on returns and stores the runtime in
// self.tokio_runtime. We don't need to shut it down here.
if let Some(rt) = self.tokio_runtime.take() {
rt.shutdown_timeout(std::time::Duration::from_millis(100));
}
self.call_start = None;
info!("stop_call: done");
}
pub fn set_mute(&self, muted: bool) {
self.state.muted.store(muted, Ordering::Relaxed);
}
pub fn set_speaker(&self, _enabled: bool) {}
pub fn force_profile(&self, _profile: QualityProfile) {}
pub fn get_stats(&self) -> CallStats {
let mut stats = self.state.stats.lock().unwrap().clone();
if let Some(start) = self.call_start {
stats.duration_secs = start.elapsed().as_secs_f64();
}
stats.audio_level = self.state.audio_level_rms.load(Ordering::Relaxed);
stats
}
pub fn is_active(&self) -> bool {
self.state.running.load(Ordering::Acquire)
}
    pub fn write_audio(&self, samples: &[i16]) -> usize {
        if self.state.muted.load(Ordering::Relaxed) {
            // Report the samples as consumed so the Kotlin capture thread
            // doesn't back up; the send loop's zero-fill is the second
            // mute guard for anything already queued in the ring.
            return samples.len();
        }
// Compute RMS for audio level display
if !samples.is_empty() {
let sum_sq: f64 = samples.iter().map(|&s| (s as f64) * (s as f64)).sum();
let rms = (sum_sq / samples.len() as f64).sqrt() as u32;
self.state.audio_level_rms.store(rms, Ordering::Relaxed);
}
self.state.capture_ring.write(samples)
}
pub fn read_audio(&self, out: &mut [i16]) -> usize {
self.state.playout_ring.read(out)
}
pub fn destroy(mut self) {
self.stop_call();
}
}
impl Drop for WzpEngine {
fn drop(&mut self) {
self.stop_call();
}
}
/// Run the full call lifecycle: connect, handshake, send/recv media with Opus + FEC.
async fn run_call(
relay_addr: SocketAddr,
room: &str,
identity_seed: &[u8; 32],
profile: QualityProfile,
alias: Option<&str>,
state: Arc<EngineState>,
) -> Result<(), anyhow::Error> {
let _ = rustls::crypto::ring::default_provider().install_default();
let bind_addr: SocketAddr = "0.0.0.0:0".parse().unwrap();
let endpoint = wzp_transport::create_endpoint(bind_addr, None)?;
let sni = if room.is_empty() { "android" } else { room };
info!(%relay_addr, sni, "connecting to relay...");
let client_cfg = wzp_transport::client_config();
let conn = wzp_transport::connect(&endpoint, relay_addr, sni, client_cfg).await?;
info!("QUIC connected to relay");
let transport = Arc::new(wzp_transport::QuinnTransport::new(conn));
// Store transport handle so stop_call() can close the connection immediately
*state.quic_transport.lock().unwrap() = Some(transport.clone());
// Crypto handshake
let mut kx = WarzoneKeyExchange::from_identity_seed(identity_seed);
let ephemeral_pub = kx.generate_ephemeral();
let identity_pub = kx.identity_public_key();
let mut sign_data = Vec::with_capacity(42);
sign_data.extend_from_slice(&ephemeral_pub);
sign_data.extend_from_slice(b"call-offer");
let signature = kx.sign(&sign_data);
let offer = SignalMessage::CallOffer {
identity_pub,
ephemeral_pub,
signature,
supported_profiles: vec![
QualityProfile::GOOD,
QualityProfile::DEGRADED,
QualityProfile::CATASTROPHIC,
],
alias: alias.map(|s| s.to_string()),
};
transport.send_signal(&offer).await?;
info!("CallOffer sent, waiting for CallAnswer...");
let answer = transport
.recv_signal()
.await?
.ok_or_else(|| anyhow::anyhow!("connection closed before CallAnswer"))?;
let relay_ephemeral_pub = match answer {
SignalMessage::CallAnswer { ephemeral_pub, .. } => ephemeral_pub,
other => {
return Err(anyhow::anyhow!(
"expected CallAnswer, got {:?}",
std::mem::discriminant(&other)
))
}
};
let _session = kx.derive_session(&relay_ephemeral_pub)?;
info!("handshake complete, call active");
{
let mut stats = state.stats.lock().unwrap();
stats.state = CallState::Active;
}
// Initialize Opus codec
let mut encoder =
OpusEncoder::new(profile).map_err(|e| anyhow::anyhow!("opus encoder init: {e}"))?;
let mut decoder =
OpusDecoder::new(profile).map_err(|e| anyhow::anyhow!("opus decoder init: {e}"))?;
// Initialize FEC encoder/decoder
let mut fec_enc = wzp_fec::create_encoder(&profile);
let mut fec_dec = wzp_fec::create_decoder(&profile);
// AGC: normalize volume on both capture and playout paths
let mut capture_agc = AutoGainControl::new();
let mut playout_agc = AutoGainControl::new();
info!(
fec_ratio = profile.fec_ratio,
frames_per_block = profile.frames_per_block,
"codec + FEC + AGC initialized (48kHz mono, 20ms frames)"
);
let seq = AtomicU16::new(0);
let ts = AtomicU32::new(0);
let transport_recv = transport.clone();
// Pre-allocate buffers
let mut capture_buf = vec![0i16; FRAME_SAMPLES];
let mut encode_buf = vec![0u8; encoder.max_frame_bytes()];
let mut frame_in_block: u8 = 0;
let mut block_id: u8 = 0;
// Send task: capture ring → Opus encode → FEC → MediaPackets
//
// IMPORTANT: send_media() uses quinn's send_datagram() which is
// synchronous and returns Err(Blocked) when the congestion window
// is full. We MUST NOT break on send errors — that would kill the
// entire call. Instead we drop the packet and keep going.
let send_task = async {
info!("send task started (Opus + RaptorQ FEC)");
let mut send_errors: u64 = 0;
let mut last_send_error_log = Instant::now();
let mut last_stats_log = Instant::now();
let mut frames_sent: u64 = 0;
let mut frames_dropped: u64 = 0;
loop {
if !state.running.load(Ordering::Relaxed) {
break;
}
let avail = state.capture_ring.available();
if avail < FRAME_SAMPLES {
tokio::time::sleep(std::time::Duration::from_millis(5)).await;
continue;
}
let read = state.capture_ring.read(&mut capture_buf);
if read < FRAME_SAMPLES {
continue;
}
            // Mute: zero out the buffer so Opus encodes silence. We still
            // drain the ring so samples queued before mute took effect
            // are consumed rather than played late on unmute.
if state.muted.load(Ordering::Relaxed) {
capture_buf.fill(0);
}
// AGC: normalize capture volume before encoding
capture_agc.process_frame(&mut capture_buf);
// Opus encode
let encoded_len = match encoder.encode(&capture_buf, &mut encode_buf) {
Ok(n) => n,
Err(e) => {
warn!("opus encode error: {e}");
continue;
}
};
let encoded = &encode_buf[..encoded_len];
// Build source packet
let s = seq.fetch_add(1, Ordering::Relaxed);
let t = ts.fetch_add(FRAME_SAMPLES as u32, Ordering::Relaxed);
let source_pkt = MediaPacket {
header: MediaHeader {
version: 0,
is_repair: false,
codec_id: profile.codec,
has_quality_report: false,
fec_ratio_encoded: MediaHeader::encode_fec_ratio(profile.fec_ratio),
seq: s,
timestamp: t,
fec_block: block_id,
fec_symbol: frame_in_block,
reserved: 0,
csrc_count: 0,
},
payload: Bytes::copy_from_slice(encoded),
quality_report: None,
};
// Send source packet — drop on error, never break
if let Err(e) = transport.send_media(&source_pkt).await {
send_errors += 1;
frames_dropped += 1;
// Log first few errors, then throttle to once per second
if send_errors <= 3 || last_send_error_log.elapsed().as_secs() >= 1 {
warn!(
seq = s,
send_errors,
frames_dropped,
"send_media error (dropping packet): {e}"
);
last_send_error_log = Instant::now();
}
// Don't feed to FEC either — the source is lost
continue;
}
frames_sent += 1;
// Feed encoded frame to FEC encoder
if let Err(e) = fec_enc.add_source_symbol(encoded) {
warn!("fec add_source error: {e}");
}
frame_in_block += 1;
// When block is full, generate repair packets
if frame_in_block >= profile.frames_per_block {
match fec_enc.generate_repair(profile.fec_ratio) {
Ok(repairs) => {
let repair_count = repairs.len();
for (sym_idx, repair_data) in repairs {
let rs = seq.fetch_add(1, Ordering::Relaxed);
let repair_pkt = MediaPacket {
header: MediaHeader {
version: 0,
is_repair: true,
codec_id: profile.codec,
has_quality_report: false,
fec_ratio_encoded: MediaHeader::encode_fec_ratio(
profile.fec_ratio,
),
seq: rs,
timestamp: t,
fec_block: block_id,
fec_symbol: sym_idx,
reserved: 0,
csrc_count: 0,
},
payload: Bytes::from(repair_data),
quality_report: None,
};
// Drop repair packets on error — never break
if let Err(_e) = transport.send_media(&repair_pkt).await {
send_errors += 1;
frames_dropped += 1;
// Don't log every repair failure — source error log covers it
}
}
if repair_count > 0 && (block_id % 50 == 0 || block_id == 0) {
info!(
block_id,
repair_count,
fec_ratio = profile.fec_ratio,
"FEC block complete"
);
}
}
Err(e) => {
warn!("fec generate_repair error: {e}");
}
}
let _ = fec_enc.finalize_block();
block_id = block_id.wrapping_add(1);
frame_in_block = 0;
}
// Periodic stats every 5 seconds
if last_stats_log.elapsed().as_secs() >= 5 {
info!(
seq = s,
block_id,
frames_sent,
frames_dropped,
send_errors,
ring_avail = state.capture_ring.available(),
"send stats"
);
last_stats_log = Instant::now();
}
}
info!(frames_sent, frames_dropped, send_errors, "send task ended");
};
// Pre-allocate decode buffer
let mut decode_buf = vec![0i16; FRAME_SAMPLES];
// Recv task: MediaPackets → FEC decode → Opus decode → playout ring
let recv_task = async {
let mut frames_decoded: u64 = 0;
let mut fec_recovered: u64 = 0;
let mut recv_errors: u64 = 0;
let mut last_recv_instant = Instant::now();
let mut max_recv_gap_ms: u64 = 0;
let mut last_stats_log = Instant::now();
info!("recv task started (Opus + RaptorQ FEC)");
loop {
if !state.running.load(Ordering::Relaxed) {
break;
}
match transport_recv.recv_media().await {
Ok(Some(pkt)) => {
// Track recv gaps — large gaps indicate network or relay issues
let recv_gap_ms = last_recv_instant.elapsed().as_millis() as u64;
last_recv_instant = Instant::now();
if recv_gap_ms > max_recv_gap_ms {
max_recv_gap_ms = recv_gap_ms;
}
if recv_gap_ms > 500 {
warn!(
recv_gap_ms,
seq = pkt.header.seq,
is_repair = pkt.header.is_repair,
"large recv gap — possible network stall"
);
}
let is_repair = pkt.header.is_repair;
let pkt_block = pkt.header.fec_block;
let pkt_symbol = pkt.header.fec_symbol;
// Feed every packet (source + repair) to FEC decoder
let _ = fec_dec.add_symbol(
pkt_block,
pkt_symbol,
is_repair,
&pkt.payload,
);
// Source packets: decode directly
if !is_repair {
match decoder.decode(&pkt.payload, &mut decode_buf) {
Ok(samples) => {
playout_agc.process_frame(&mut decode_buf[..samples]);
state.playout_ring.write(&decode_buf[..samples]);
frames_decoded += 1;
}
Err(e) => {
warn!("opus decode error: {e}");
if let Ok(samples) = decoder.decode_lost(&mut decode_buf) {
playout_agc.process_frame(&mut decode_buf[..samples]);
state.playout_ring.write(&decode_buf[..samples]);
}
}
}
}
// Try FEC recovery
if let Ok(Some(recovered_frames)) = fec_dec.try_decode(pkt_block) {
fec_recovered += recovered_frames.len() as u64;
if fec_recovered % 50 == 1 {
info!(
fec_recovered,
block = pkt_block,
frames = recovered_frames.len(),
"FEC block recovered"
);
}
}
// Expire old blocks to prevent memory growth
if pkt_block > 3 {
fec_dec.expire_before(pkt_block.wrapping_sub(3));
}
let mut stats = state.stats.lock().unwrap();
stats.frames_decoded = frames_decoded;
stats.fec_recovered = fec_recovered;
drop(stats);
// Periodic stats every 5 seconds
if last_stats_log.elapsed().as_secs() >= 5 {
info!(
frames_decoded,
fec_recovered,
recv_errors,
max_recv_gap_ms,
playout_avail = state.playout_ring.available(),
"recv stats"
);
max_recv_gap_ms = 0;
last_stats_log = Instant::now();
}
}
Ok(None) => {
info!(frames_decoded, fec_recovered, "relay disconnected (stream ended)");
break;
}
Err(e) => {
recv_errors += 1;
// Transient errors: log and keep going
let msg = e.to_string();
if msg.contains("closed") || msg.contains("reset") {
error!(recv_errors, "recv fatal: {e}");
break;
}
// Non-fatal: log throttled
if recv_errors <= 3 || recv_errors % 50 == 0 {
warn!(recv_errors, "recv error (continuing): {e}");
}
}
}
}
info!(frames_decoded, fec_recovered, recv_errors, "recv task ended");
};
// Stats task — polls path quality + quinn RTT every 500ms
let transport_stats = transport.clone();
let stats_task = async {
loop {
if !state.running.load(Ordering::Relaxed) {
break;
}
// Feed quinn's QUIC-level RTT into our path monitor
let quic_rtt_ms = transport_stats.connection().stats().path.rtt.as_millis() as u32;
if quic_rtt_ms > 0 {
transport_stats.feed_rtt(quic_rtt_ms);
}
let pq = transport_stats.path_quality();
{
let mut stats = state.stats.lock().unwrap();
stats.frames_encoded = seq.load(Ordering::Relaxed) as u64;
stats.loss_pct = pq.loss_pct;
stats.rtt_ms = quic_rtt_ms;
stats.jitter_ms = pq.jitter_ms;
}
tokio::time::sleep(std::time::Duration::from_millis(500)).await;
}
};
// Signal recv task — listens for RoomUpdate and other signaling messages
let transport_signal = transport.clone();
let state_signal = state.clone();
let signal_task = async {
loop {
match transport_signal.recv_signal().await {
Ok(Some(SignalMessage::RoomUpdate { count, participants })) => {
info!(count, "RoomUpdate received");
let members: Vec<crate::stats::RoomMember> = participants
.iter()
.map(|p| crate::stats::RoomMember {
fingerprint: p.fingerprint.clone(),
alias: p.alias.clone(),
})
.collect();
let mut stats = state_signal.stats.lock().unwrap();
stats.room_participant_count = count;
stats.room_participants = members;
}
Ok(Some(msg)) => {
info!("signal received: {:?}", std::mem::discriminant(&msg));
}
Ok(None) => {
info!("signal stream closed");
break;
}
Err(e) => {
warn!("signal recv error: {e}");
break;
}
}
}
};
tokio::select! {
_ = send_task => info!("send task ended"),
_ = recv_task => info!("recv task ended"),
_ = stats_task => info!("stats task ended"),
_ = signal_task => info!("signal task ended"),
}
// Send CONNECTION_CLOSE and wait up to 500ms for the peer to acknowledge.
// This ensures the relay sees the close even if the first packet is lost.
info!("closing QUIC connection...");
transport.close_now();
match tokio::time::timeout(
std::time::Duration::from_millis(500),
transport.connection().closed(),
).await {
Ok(_) => info!("QUIC connection closed cleanly"),
Err(_) => info!("QUIC close timed out (relay may not have ack'd)"),
}
Ok(())
}

@@ -0,0 +1,256 @@
//! JNI bridge for Android — thin layer between Kotlin and the WzpEngine.
use std::panic;
use std::sync::Once;
use jni::objects::{JClass, JObject, JString};
use jni::sys::{jboolean, jint, jlong, jstring};
use jni::JNIEnv;
use tracing::{error, info};
use wzp_proto::QualityProfile;
use crate::engine::{CallStartConfig, WzpEngine};
/// Opaque engine handle passed to/from Kotlin as a `jlong`.
struct EngineHandle {
engine: WzpEngine,
}
/// Recover the `EngineHandle` from a raw handle value.
unsafe fn handle_ref(handle: jlong) -> &'static mut EngineHandle {
unsafe { &mut *(handle as *mut EngineHandle) }
}
fn profile_from_int(value: jint) -> QualityProfile {
match value {
1 => QualityProfile::DEGRADED,
2 => QualityProfile::CATASTROPHIC,
_ => QualityProfile::GOOD,
}
}
static INIT_LOGGING: Once = Once::new();
/// Initialize tracing → Android logcat (tag "wzp_android").
/// Safe to call multiple times — only the first call takes effect.
fn init_logging() {
INIT_LOGGING.call_once(|| {
use tracing_subscriber::layer::SubscriberExt;
use tracing_subscriber::util::SubscriberInitExt;
if let Ok(layer) = tracing_android::layer("wzp_android") {
let _ = tracing_subscriber::registry().with(layer).try_init();
}
});
}
#[unsafe(no_mangle)]
pub unsafe extern "system" fn Java_com_wzp_engine_WzpEngine_nativeInit(
_env: JNIEnv,
_class: JClass,
) -> jlong {
let result = panic::catch_unwind(|| {
init_logging();
let handle = Box::new(EngineHandle {
engine: WzpEngine::new(),
});
Box::into_raw(handle) as jlong
});
match result {
Ok(h) => h,
Err(_) => 0,
}
}
#[unsafe(no_mangle)]
pub unsafe extern "system" fn Java_com_wzp_engine_WzpEngine_nativeStartCall(
mut env: JNIEnv,
_class: JClass,
handle: jlong,
relay_addr_j: JString,
room_j: JString,
seed_hex_j: JString,
token_j: JString,
alias_j: JString,
) -> jint {
let result = panic::catch_unwind(panic::AssertUnwindSafe(|| {
let relay_addr: String = env.get_string(&relay_addr_j).map(|s| s.into()).unwrap_or_default();
let room: String = env.get_string(&room_j).map(|s| s.into()).unwrap_or_default();
let seed_hex: String = env.get_string(&seed_hex_j).map(|s| s.into()).unwrap_or_default();
let token: String = env.get_string(&token_j).map(|s| s.into()).unwrap_or_default();
let alias: String = env.get_string(&alias_j).map(|s| s.into()).unwrap_or_default();
let h = unsafe { handle_ref(handle) };
// Parse hex seed
let mut identity_seed = [0u8; 32];
if seed_hex.len() == 64 {
for i in 0..32 {
if let Ok(byte) = u8::from_str_radix(&seed_hex[i * 2..i * 2 + 2], 16) {
identity_seed[i] = byte;
}
}
} else {
// Generate random seed if not provided
use rand::RngCore;
rand::thread_rng().fill_bytes(&mut identity_seed);
}
let config = CallStartConfig {
profile: QualityProfile::GOOD,
relay_addr,
room,
auth_token: if token.is_empty() { Vec::new() } else { token.into_bytes() },
identity_seed,
alias: if alias.is_empty() { None } else { Some(alias) },
};
match h.engine.start_call(config) {
Ok(()) => 0,
Err(e) => {
error!("start_call failed: {e}");
-1
}
}
}));
match result {
Ok(code) => code,
Err(_) => -1,
}
}
#[unsafe(no_mangle)]
pub unsafe extern "system" fn Java_com_wzp_engine_WzpEngine_nativeStopCall(
_env: JNIEnv,
_class: JClass,
handle: jlong,
) {
let _ = panic::catch_unwind(panic::AssertUnwindSafe(|| {
let h = unsafe { handle_ref(handle) };
h.engine.stop_call();
}));
}
#[unsafe(no_mangle)]
pub unsafe extern "system" fn Java_com_wzp_engine_WzpEngine_nativeSetMute(
_env: JNIEnv,
_class: JClass,
handle: jlong,
muted: jboolean,
) {
let _ = panic::catch_unwind(panic::AssertUnwindSafe(|| {
let h = unsafe { handle_ref(handle) };
h.engine.set_mute(muted != 0);
}));
}
#[unsafe(no_mangle)]
pub unsafe extern "system" fn Java_com_wzp_engine_WzpEngine_nativeSetSpeaker(
_env: JNIEnv,
_class: JClass,
handle: jlong,
speaker: jboolean,
) {
let _ = panic::catch_unwind(panic::AssertUnwindSafe(|| {
let h = unsafe { handle_ref(handle) };
h.engine.set_speaker(speaker != 0);
}));
}
#[unsafe(no_mangle)]
pub unsafe extern "system" fn Java_com_wzp_engine_WzpEngine_nativeGetStats<'a>(
mut env: JNIEnv<'a>,
_class: JClass,
handle: jlong,
) -> jstring {
let result = panic::catch_unwind(panic::AssertUnwindSafe(|| {
let h = unsafe { handle_ref(handle) };
let stats = h.engine.get_stats();
serde_json::to_string(&stats).unwrap_or_else(|_| "{}".to_string())
}));
let json = match result {
Ok(s) => s,
Err(_) => "{}".to_string(),
};
env.new_string(&json)
.map(|s| s.into_raw())
.unwrap_or(JObject::null().into_raw())
}
#[unsafe(no_mangle)]
pub unsafe extern "system" fn Java_com_wzp_engine_WzpEngine_nativeForceProfile(
_env: JNIEnv,
_class: JClass,
handle: jlong,
profile: jint,
) {
let _ = panic::catch_unwind(panic::AssertUnwindSafe(|| {
let h = unsafe { handle_ref(handle) };
let qp = profile_from_int(profile);
h.engine.force_profile(qp);
}));
}
/// Write captured PCM samples from Kotlin AudioRecord into the engine's capture ring.
/// pcm is a Java short[] array.
#[unsafe(no_mangle)]
pub unsafe extern "system" fn Java_com_wzp_engine_WzpEngine_nativeWriteAudio(
env: JNIEnv,
_class: JClass,
handle: jlong,
pcm: jni::objects::JShortArray,
) -> jint {
let result = panic::catch_unwind(panic::AssertUnwindSafe(|| {
let h = unsafe { handle_ref(handle) };
let len = env.get_array_length(&pcm).unwrap_or(0) as usize;
if len == 0 {
return 0;
}
let mut buf = vec![0i16; len];
// GetShortArrayRegion copies Java array into our buffer
if env.get_short_array_region(&pcm, 0, &mut buf).is_err() {
return 0;
}
h.engine.write_audio(&buf) as jint
}));
result.unwrap_or(0)
}
/// Read decoded PCM samples from the engine's playout ring for Kotlin AudioTrack.
/// pcm is a Java short[] array to fill. Returns number of samples actually read.
#[unsafe(no_mangle)]
pub unsafe extern "system" fn Java_com_wzp_engine_WzpEngine_nativeReadAudio(
env: JNIEnv,
_class: JClass,
handle: jlong,
pcm: jni::objects::JShortArray,
) -> jint {
let result = panic::catch_unwind(panic::AssertUnwindSafe(|| {
let h = unsafe { handle_ref(handle) };
let len = env.get_array_length(&pcm).unwrap_or(0) as usize;
if len == 0 {
return 0;
}
let mut buf = vec![0i16; len];
let read = h.engine.read_audio(&mut buf);
if read > 0 {
let _ = env.set_short_array_region(&pcm, 0, &buf[..read]);
}
read as jint
}));
result.unwrap_or(0)
}
#[unsafe(no_mangle)]
pub unsafe extern "system" fn Java_com_wzp_engine_WzpEngine_nativeDestroy(
_env: JNIEnv,
_class: JClass,
handle: jlong,
) {
let _ = panic::catch_unwind(panic::AssertUnwindSafe(|| {
let h = unsafe { Box::from_raw(handle as *mut EngineHandle) };
drop(h);
}));
}

@@ -0,0 +1,18 @@
//! WarzonePhone Android native VoIP engine.
//!
//! Provides:
//! - Oboe audio backend with lock-free SPSC ring buffers
//! - Engine orchestrator managing call lifecycle
//! - Codec pipeline thread (encode/decode/FEC/jitter)
//! - Call statistics and command interface
//!
//! On non-Android targets, the Oboe C++ layer compiles as a stub,
//! allowing `cargo check` and unit tests on the host.
pub mod audio_android;
pub mod audio_ring;
pub mod commands;
pub mod engine;
pub mod pipeline;
pub mod stats;
pub mod jni_bridge;


@@ -0,0 +1,262 @@
//! Codec pipeline — encode/decode with FEC and jitter buffer.
//!
//! Runs on a dedicated thread, processing 20 ms frames at 48 kHz.
//! The pipeline is NOT Send/Sync (Opus encoder state) — it is owned
//! exclusively by the codec thread.
use tracing::{debug, warn};
use wzp_codec::{AdaptiveDecoder, AdaptiveEncoder, AutoGainControl, EchoCanceller};
use wzp_fec::{RaptorQFecDecoder, RaptorQFecEncoder};
use wzp_proto::jitter::{JitterBuffer, PlayoutResult};
use wzp_proto::quality::AdaptiveQualityController;
use wzp_proto::traits::{AudioDecoder, AudioEncoder, FecDecoder, FecEncoder};
use wzp_proto::traits::QualityController;
use wzp_proto::{MediaPacket, QualityProfile};
use crate::audio_android::FRAME_SAMPLES;
/// Maximum encoded frame size (Opus worst case at highest bitrate).
const MAX_ENCODED_BYTES: usize = 1275;
/// Pipeline statistics snapshot.
#[derive(Clone, Debug, Default)]
pub struct PipelineStats {
pub frames_encoded: u64,
pub frames_decoded: u64,
pub underruns: u64,
pub jitter_depth: usize,
pub quality_tier: u8,
}
/// The codec pipeline: encode, FEC, jitter buffer, decode.
///
/// This struct is owned by the codec thread and not shared.
pub struct Pipeline {
encoder: AdaptiveEncoder,
decoder: AdaptiveDecoder,
fec_encoder: RaptorQFecEncoder,
fec_decoder: RaptorQFecDecoder,
jitter_buffer: JitterBuffer,
quality_ctrl: AdaptiveQualityController,
/// Acoustic echo canceller applied before encoding.
aec: EchoCanceller,
/// Automatic gain control applied before encoding.
agc: AutoGainControl,
/// Last decoded PCM frame, used as the AEC far-end reference.
last_decoded_farend: Option<Vec<i16>>,
// Pre-allocated scratch buffers
capture_buf: Vec<i16>,
#[allow(dead_code)]
playout_buf: Vec<i16>,
encode_out: Vec<u8>,
// Stats counters
frames_encoded: u64,
frames_decoded: u64,
underruns: u64,
}
impl Pipeline {
/// Create a new pipeline configured for the given quality profile.
pub fn new(profile: QualityProfile) -> Result<Self, anyhow::Error> {
let encoder = AdaptiveEncoder::new(profile)
.map_err(|e| anyhow::anyhow!("encoder init: {e}"))?;
let decoder = AdaptiveDecoder::new(profile)
.map_err(|e| anyhow::anyhow!("decoder init: {e}"))?;
let fec_encoder =
RaptorQFecEncoder::with_defaults(profile.frames_per_block as usize);
let fec_decoder =
RaptorQFecDecoder::with_defaults(profile.frames_per_block as usize);
let jitter_buffer = JitterBuffer::new(10, 250, 3);
let quality_ctrl = AdaptiveQualityController::new();
Ok(Self {
encoder,
decoder,
fec_encoder,
fec_decoder,
jitter_buffer,
quality_ctrl,
aec: EchoCanceller::new(48000, 100), // 100 ms echo tail
agc: AutoGainControl::new(),
last_decoded_farend: None,
capture_buf: vec![0i16; FRAME_SAMPLES],
playout_buf: vec![0i16; FRAME_SAMPLES],
encode_out: vec![0u8; MAX_ENCODED_BYTES],
frames_encoded: 0,
frames_decoded: 0,
underruns: 0,
})
}
/// Encode a PCM frame into a compressed packet.
///
/// If `muted` is true, a silence frame is encoded (all zeros).
/// Returns the encoded bytes, or `None` on encoder error.
pub fn encode_frame(&mut self, pcm: &[i16], muted: bool) -> Option<Vec<u8>> {
let input = if muted {
// Zero the capture buffer for silence
self.capture_buf.fill(0);
&self.capture_buf[..]
} else {
// Feed the last decoded playout as AEC far-end reference.
if let Some(ref farend) = self.last_decoded_farend {
self.aec.feed_farend(farend);
}
// Apply AEC + AGC to the captured PCM.
let len = pcm.len().min(self.capture_buf.len());
self.capture_buf[..len].copy_from_slice(&pcm[..len]);
self.aec.process_frame(&mut self.capture_buf[..len]);
self.agc.process_frame(&mut self.capture_buf[..len]);
&self.capture_buf[..len]
};
match self.encoder.encode(input, &mut self.encode_out) {
Ok(n) => {
self.frames_encoded += 1;
let encoded = self.encode_out[..n].to_vec();
// Feed into FEC encoder
if let Err(e) = self.fec_encoder.add_source_symbol(&encoded) {
warn!("FEC encode error: {e}");
}
Some(encoded)
}
Err(e) => {
warn!("encode error: {e}");
None
}
}
}
/// Feed a received media packet into the jitter buffer.
pub fn feed_packet(&mut self, packet: MediaPacket) {
// Feed FEC symbols if present
let header = &packet.header;
if header.fec_block != 0 || header.fec_symbol != 0 {
let is_repair = header.is_repair;
if let Err(e) = self.fec_decoder.add_symbol(
header.fec_block,
header.fec_symbol,
is_repair,
&packet.payload,
) {
debug!("FEC symbol feed error: {e}");
}
}
self.jitter_buffer.push(packet);
}
/// Decode the next frame from the jitter buffer.
///
/// Returns decoded PCM samples, or `None` if the buffer is not ready.
/// Decoded PCM is also stored as the AEC far-end reference for the next
/// encode cycle.
pub fn decode_frame(&mut self) -> Option<Vec<i16>> {
let result = match self.jitter_buffer.pop() {
PlayoutResult::Packet(pkt) => {
let mut pcm = vec![0i16; FRAME_SAMPLES];
match self.decoder.decode(&pkt.payload, &mut pcm) {
Ok(n) => {
self.frames_decoded += 1;
pcm.truncate(n);
Some(pcm)
}
Err(e) => {
warn!("decode error: {e}");
// Attempt PLC
self.generate_plc()
}
}
}
PlayoutResult::Missing { seq } => {
debug!(seq, "jitter buffer: missing packet, generating PLC");
self.generate_plc()
}
PlayoutResult::NotReady => {
self.underruns += 1;
None
}
};
// Save decoded PCM as far-end reference for AEC.
if let Some(ref pcm) = result {
self.last_decoded_farend = Some(pcm.clone());
}
result
}
/// Generate packet loss concealment output.
fn generate_plc(&mut self) -> Option<Vec<i16>> {
let mut pcm = vec![0i16; FRAME_SAMPLES];
match self.decoder.decode_lost(&mut pcm) {
Ok(n) => {
self.frames_decoded += 1;
pcm.truncate(n);
Some(pcm)
}
Err(e) => {
warn!("PLC error: {e}");
None
}
}
}
/// Feed a quality report into the adaptive quality controller.
///
/// Returns a new profile if a tier transition occurred.
#[allow(unused)]
pub fn observe_quality(
&mut self,
report: &wzp_proto::QualityReport,
) -> Option<QualityProfile> {
let new_profile = self.quality_ctrl.observe(report);
if let Some(ref profile) = new_profile {
if let Err(e) = self.encoder.set_profile(*profile) {
warn!("encoder set_profile error: {e}");
}
if let Err(e) = self.decoder.set_profile(*profile) {
warn!("decoder set_profile error: {e}");
}
}
new_profile
}
/// Force a specific quality profile.
#[allow(unused)]
pub fn force_profile(&mut self, profile: QualityProfile) {
self.quality_ctrl.force_profile(profile);
if let Err(e) = self.encoder.set_profile(profile) {
warn!("encoder set_profile error: {e}");
}
if let Err(e) = self.decoder.set_profile(profile) {
warn!("decoder set_profile error: {e}");
}
}
/// Get current pipeline statistics.
pub fn stats(&self) -> PipelineStats {
PipelineStats {
frames_encoded: self.frames_encoded,
frames_decoded: self.frames_decoded,
underruns: self.underruns,
jitter_depth: self.jitter_buffer.stats().current_depth,
quality_tier: self.quality_ctrl.tier() as u8,
}
}
/// Enable or disable acoustic echo cancellation.
pub fn set_aec_enabled(&mut self, enabled: bool) {
self.aec.set_enabled(enabled);
}
/// Enable or disable automatic gain control.
pub fn set_agc_enabled(&mut self, enabled: bool) {
self.agc.set_enabled(enabled);
}
}
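As a sanity check on the framing constants the pipeline is built around (20 ms frames at 48 kHz mono, Opus's 1275-byte worst case for `MAX_ENCODED_BYTES`), the arithmetic works out as:

```rust
// Framing arithmetic behind FRAME_SAMPLES and MAX_ENCODED_BYTES.
const SAMPLE_RATE_HZ: usize = 48_000;
const FRAME_MS: usize = 20;

fn main() {
    // 48 kHz mono at 20 ms per frame -> 960 samples per frame.
    let frame_samples = SAMPLE_RATE_HZ * FRAME_MS / 1000;
    assert_eq!(frame_samples, 960);

    // 50 frames (and at most 50 media packets) per second of speech.
    let frames_per_sec = 1000 / FRAME_MS;
    assert_eq!(frames_per_sec, 50);

    // Raw PCM per frame: 960 samples * 2 bytes = 1920 bytes, so even the
    // Opus worst case of 1275 bytes stays below the uncompressed size.
    let raw_bytes = frame_samples * std::mem::size_of::<i16>();
    assert_eq!(raw_bytes, 1920);
    assert!(1275 < raw_bytes);
}
```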


@@ -0,0 +1,67 @@
//! Call statistics for the Android engine.
/// State of the call.
/// Serializes as integer for easy parsing on the Kotlin side:
/// 0=Idle, 1=Connecting, 2=Active, 3=Reconnecting, 4=Closed
#[derive(Clone, Debug, Default, PartialEq, Eq)]
pub enum CallState {
#[default]
Idle,
Connecting,
Active,
Reconnecting,
Closed,
}
impl serde::Serialize for CallState {
fn serialize<S: serde::Serializer>(&self, serializer: S) -> Result<S::Ok, S::Error> {
let n: u8 = match self {
CallState::Idle => 0,
CallState::Connecting => 1,
CallState::Active => 2,
CallState::Reconnecting => 3,
CallState::Closed => 4,
};
serializer.serialize_u8(n)
}
}
/// Aggregated call statistics, serializable for JNI bridge.
#[derive(Clone, Debug, Default, serde::Serialize)]
pub struct CallStats {
/// Current call state.
pub state: CallState,
/// Call duration in seconds.
pub duration_secs: f64,
/// Current quality tier (0=GOOD, 1=DEGRADED, 2=CATASTROPHIC).
pub quality_tier: u8,
/// Observed packet loss percentage.
pub loss_pct: f32,
/// Smoothed round-trip time in milliseconds.
pub rtt_ms: u32,
/// Jitter in milliseconds.
pub jitter_ms: u32,
/// Current jitter buffer depth in packets.
pub jitter_buffer_depth: usize,
/// Total frames encoded since call start.
pub frames_encoded: u64,
/// Total frames decoded since call start.
pub frames_decoded: u64,
/// Number of playout underruns (buffer empty when audio needed).
pub underruns: u64,
/// Frames recovered by FEC.
pub fec_recovered: u64,
/// Current mic audio level (RMS of i16 samples, 0-32767).
pub audio_level: u32,
/// Number of participants in the room (from last RoomUpdate).
pub room_participant_count: u32,
/// Participant list (fingerprint + optional alias) serialized as JSON array.
pub room_participants: Vec<RoomMember>,
}
/// A room member entry, serialized into the stats JSON.
#[derive(Clone, Debug, Default, serde::Serialize)]
pub struct RoomMember {
pub fingerprint: String,
pub alias: Option<String>,
}


@@ -23,10 +23,13 @@ serde_json = "1"
 chrono = "0.4"
 rustls = { version = "0.23", default-features = false, features = ["ring", "std"] }
 cpal = { version = "0.15", optional = true }
+coreaudio-rs = { version = "0.11", optional = true }
+libc = "0.2"

 [features]
 default = []
 audio = ["cpal"]
+vpio = ["coreaudio-rs"]

 [[bin]]
 name = "wzp-client"
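The new `vpio` feature gates the optional macOS backend behind `coreaudio-rs`. A hedged sketch of the kind of compile-time selection the client presumably performs (the `backend_name` function is illustrative, not from the repo):

```rust
// Illustrative feature/target gating for the optional VPIO backend.
// With `cargo build --features vpio` on macOS the first arm is compiled;
// everywhere else the portable cpal path is used.
#[cfg(all(target_os = "macos", feature = "vpio"))]
fn backend_name() -> &'static str {
    "vpio"
}

#[cfg(not(all(target_os = "macos", feature = "vpio")))]
fn backend_name() -> &'static str {
    "cpal"
}

fn main() {
    // Compiled without --features vpio, the cpal path is selected.
    assert_eq!(backend_name(), "cpal");
}
```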


@@ -3,12 +3,10 @@
 //! Both structs use 48 kHz, mono, i16 format to match the WarzonePhone codec
 //! pipeline. Frames are 960 samples (20 ms at 48 kHz).
 //!
-//! The cpal `Stream` type is not `Send`, so each struct spawns a dedicated OS
-//! thread that owns the stream. The public API exposes only `Send + Sync`
-//! channel handles.
+//! Audio callbacks are **lock-free**: they read/write directly to an `AudioRing`
+//! (atomic SPSC ring buffer). No Mutex, no channel, no allocation on the hot path.
 use std::sync::atomic::{AtomicBool, Ordering};
-use std::sync::mpsc;
 use std::sync::Arc;
 use anyhow::{anyhow, Context};
@@ -16,6 +14,8 @@ use cpal::traits::{DeviceTrait, HostTrait, StreamTrait};
 use cpal::{SampleFormat, SampleRate, StreamConfig};
 use tracing::{info, warn};
+use crate::audio_ring::AudioRing;
 /// Number of samples per 20 ms frame at 48 kHz mono.
 pub const FRAME_SAMPLES: usize = 960;
@@ -23,22 +23,24 @@ pub const FRAME_SAMPLES: usize = 960;
 // AudioCapture
 // ---------------------------------------------------------------------------
-/// Captures microphone input and yields 960-sample PCM frames.
+/// Captures microphone input via CPAL and writes PCM into a lock-free ring buffer.
 ///
 /// The cpal stream lives on a dedicated OS thread; this handle is `Send + Sync`.
 pub struct AudioCapture {
-    rx: mpsc::Receiver<Vec<i16>>,
+    ring: Arc<AudioRing>,
     running: Arc<AtomicBool>,
 }
 impl AudioCapture {
     /// Create and start capturing from the default input device at 48 kHz mono.
     pub fn start() -> Result<Self, anyhow::Error> {
-        let (tx, rx) = mpsc::sync_channel::<Vec<i16>>(64);
+        let ring = Arc::new(AudioRing::new());
         let running = Arc::new(AtomicBool::new(true));
-        let running_clone = running.clone();
-        let (init_tx, init_rx) = mpsc::sync_channel::<Result<(), String>>(1);
+        let (init_tx, init_rx) = std::sync::mpsc::sync_channel::<Result<(), String>>(1);
+        let ring_cb = ring.clone();
+        let running_clone = running.clone();
         std::thread::Builder::new()
             .name("wzp-audio-capture".into())
@@ -59,53 +61,51 @@ impl AudioCapture {
             let use_f32 = !supports_i16_input(&device)?;
-            let buf = Arc::new(std::sync::Mutex::new(
-                Vec::<i16>::with_capacity(FRAME_SAMPLES),
-            ));
             let err_cb = |e: cpal::StreamError| {
                 warn!("input stream error: {e}");
             };
+            let logged_cb_size = Arc::new(AtomicBool::new(false));
             let stream = if use_f32 {
-                let buf = buf.clone();
-                let tx = tx.clone();
+                let ring = ring_cb.clone();
                 let running = running_clone.clone();
+                let logged = logged_cb_size.clone();
                 device.build_input_stream(
                     &config,
                     move |data: &[f32], _: &cpal::InputCallbackInfo| {
                         if !running.load(Ordering::Relaxed) {
                             return;
                         }
-                        let mut lock = buf.lock().unwrap();
-                        for &s in data {
-                            lock.push(f32_to_i16(s));
-                            if lock.len() == FRAME_SAMPLES {
-                                let frame = lock.drain(..).collect();
-                                let _ = tx.try_send(frame);
-                            }
-                        }
+                        if !logged.swap(true, Ordering::Relaxed) {
+                            eprintln!("[audio] capture callback: {} f32 samples", data.len());
+                        }
+                        let mut tmp = [0i16; FRAME_SAMPLES];
+                        for chunk in data.chunks(FRAME_SAMPLES) {
+                            let n = chunk.len();
+                            for i in 0..n {
+                                tmp[i] = f32_to_i16(chunk[i]);
+                            }
+                            ring.write(&tmp[..n]);
+                        }
                     },
                     err_cb,
                     None,
                 )?
             } else {
-                let buf = buf.clone();
-                let tx = tx.clone();
+                let ring = ring_cb.clone();
                 let running = running_clone.clone();
+                let logged = logged_cb_size.clone();
                 device.build_input_stream(
                     &config,
                     move |data: &[i16], _: &cpal::InputCallbackInfo| {
                         if !running.load(Ordering::Relaxed) {
                             return;
                         }
-                        let mut lock = buf.lock().unwrap();
-                        for &s in data {
-                            lock.push(s);
-                            if lock.len() == FRAME_SAMPLES {
-                                let frame = lock.drain(..).collect();
-                                let _ = tx.try_send(frame);
-                            }
-                        }
+                        if !logged.swap(true, Ordering::Relaxed) {
+                            eprintln!("[audio] capture callback: {} i16 samples", data.len());
+                        }
+                        ring.write(data);
                     },
                     err_cb,
                     None,
@@ -114,7 +114,6 @@ impl AudioCapture {
             stream.play().context("failed to start input stream")?;
-            // Signal success to the caller before parking.
             let _ = init_tx.send(Ok(()));
             // Keep stream alive until stopped.
@@ -135,15 +134,12 @@ impl AudioCapture {
             .map_err(|_| anyhow!("capture thread exited before signaling"))?
             .map_err(|e| anyhow!("{e}"))?;
-        Ok(Self { rx, running })
+        Ok(Self { ring, running })
     }
-    /// Read the next frame of 960 PCM samples (blocking until available).
-    ///
-    /// Returns `None` when the stream has been stopped or the channel is
-    /// disconnected.
-    pub fn read_frame(&self) -> Option<Vec<i16>> {
-        self.rx.recv().ok()
+    /// Get a reference to the capture ring buffer for direct polling.
+    pub fn ring(&self) -> &Arc<AudioRing> {
+        &self.ring
     }
     /// Stop capturing.
@@ -152,26 +148,34 @@ impl AudioCapture {
     }
 }
+impl Drop for AudioCapture {
+    fn drop(&mut self) {
+        self.stop();
+    }
+}
 // ---------------------------------------------------------------------------
 // AudioPlayback
 // ---------------------------------------------------------------------------
-/// Plays PCM frames through the default output device at 48 kHz mono.
+/// Plays PCM through the default output device, reading from a lock-free ring buffer.
 ///
 /// The cpal stream lives on a dedicated OS thread; this handle is `Send + Sync`.
 pub struct AudioPlayback {
-    tx: mpsc::SyncSender<Vec<i16>>,
+    ring: Arc<AudioRing>,
     running: Arc<AtomicBool>,
 }
 impl AudioPlayback {
     /// Create and start playback on the default output device at 48 kHz mono.
     pub fn start() -> Result<Self, anyhow::Error> {
-        let (tx, rx) = mpsc::sync_channel::<Vec<i16>>(64);
+        let ring = Arc::new(AudioRing::new());
         let running = Arc::new(AtomicBool::new(true));
-        let running_clone = running.clone();
-        let (init_tx, init_rx) = mpsc::sync_channel::<Result<(), String>>(1);
+        let (init_tx, init_rx) = std::sync::mpsc::sync_channel::<Result<(), String>>(1);
+        let ring_cb = ring.clone();
+        let running_clone = running.clone();
         std::thread::Builder::new()
             .name("wzp-audio-playback".into())
@@ -192,62 +196,40 @@ impl AudioPlayback {
             let use_f32 = !supports_i16_output(&device)?;
-            // Shared ring of samples the cpal callback drains from.
-            let ring = Arc::new(std::sync::Mutex::new(
-                std::collections::VecDeque::<i16>::with_capacity(FRAME_SAMPLES * 8),
-            ));
-            // Background drainer: moves frames from the mpsc channel into the ring.
-            {
-                let ring = ring.clone();
-                let running = running_clone.clone();
-                std::thread::Builder::new()
-                    .name("wzp-playback-drain".into())
-                    .spawn(move || {
-                        while running.load(Ordering::Relaxed) {
-                            match rx.recv_timeout(std::time::Duration::from_millis(100)) {
-                                Ok(frame) => {
-                                    let mut lock = ring.lock().unwrap();
-                                    lock.extend(frame);
-                                    while lock.len() > FRAME_SAMPLES * 16 {
-                                        lock.pop_front();
-                                    }
-                                }
-                                Err(mpsc::RecvTimeoutError::Timeout) => {}
-                                Err(mpsc::RecvTimeoutError::Disconnected) => break,
-                            }
-                        }
-                    })?;
-            }
             let err_cb = |e: cpal::StreamError| {
                 warn!("output stream error: {e}");
             };
             let stream = if use_f32 {
-                let ring = ring.clone();
+                let ring = ring_cb.clone();
                 device.build_output_stream(
                     &config,
                     move |data: &mut [f32], _: &cpal::OutputCallbackInfo| {
-                        let mut lock = ring.lock().unwrap();
-                        for sample in data.iter_mut() {
-                            *sample = match lock.pop_front() {
-                                Some(s) => i16_to_f32(s),
-                                None => 0.0,
-                            };
+                        let mut tmp = [0i16; FRAME_SAMPLES];
+                        for chunk in data.chunks_mut(FRAME_SAMPLES) {
+                            let n = chunk.len();
+                            let read = ring.read(&mut tmp[..n]);
+                            for i in 0..read {
+                                chunk[i] = i16_to_f32(tmp[i]);
+                            }
+                            // Fill remainder with silence if ring underran
+                            for i in read..n {
+                                chunk[i] = 0.0;
+                            }
                         }
                     },
                     err_cb,
                     None,
                 )?
             } else {
-                let ring = ring.clone();
+                let ring = ring_cb.clone();
                 device.build_output_stream(
                     &config,
                     move |data: &mut [i16], _: &cpal::OutputCallbackInfo| {
-                        let mut lock = ring.lock().unwrap();
-                        for sample in data.iter_mut() {
-                            *sample = lock.pop_front().unwrap_or(0);
+                        let read = ring.read(data);
+                        // Fill remainder with silence if ring underran
+                        for sample in &mut data[read..] {
+                            *sample = 0;
                         }
                     },
                     err_cb,
@@ -257,7 +239,6 @@ impl AudioPlayback {
             stream.play().context("failed to start output stream")?;
-            // Signal success to the caller before parking.
             let _ = init_tx.send(Ok(()));
             // Keep stream alive until stopped.
@@ -278,12 +259,12 @@ impl AudioPlayback {
             .map_err(|_| anyhow!("playback thread exited before signaling"))?
             .map_err(|e| anyhow!("{e}"))?;
-        Ok(Self { tx, running })
+        Ok(Self { ring, running })
     }
-    /// Write a frame of PCM samples for playback.
-    pub fn write_frame(&self, pcm: &[i16]) {
-        let _ = self.tx.try_send(pcm.to_vec());
+    /// Get a reference to the playout ring buffer for direct writing.
+    pub fn ring(&self) -> &Arc<AudioRing> {
+        &self.ring
     }
     /// Stop playback.
@@ -292,11 +273,16 @@ impl AudioPlayback {
     }
 }
+impl Drop for AudioPlayback {
+    fn drop(&mut self) {
+        self.stop();
+    }
+}
 // ---------------------------------------------------------------------------
 // Helpers
 // ---------------------------------------------------------------------------
-/// Check if the input device supports i16 at 48 kHz mono.
 fn supports_i16_input(device: &cpal::Device) -> Result<bool, anyhow::Error> {
     let supported = device
         .supported_input_configs()
@@ -313,7 +299,6 @@ fn supports_i16_input(device: &cpal::Device) -> Result<bool, anyhow::Error> {
     Ok(false)
 }
-/// Check if the output device supports i16 at 48 kHz mono.
 fn supports_i16_output(device: &cpal::Device) -> Result<bool, anyhow::Error> {
     let supported = device
         .supported_output_configs()
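The callbacks above lean on `f32_to_i16` / `i16_to_f32` helpers defined outside this hunk. A plausible sketch, assuming the same clamp-and-scale by `i16::MAX` that the VPIO backend performs inline:

```rust
// Hypothetical reconstruction of the sample-format helpers; the real
// definitions live outside the diff shown above.
fn f32_to_i16(s: f32) -> i16 {
    // Clamp first so out-of-range float input cannot overflow the cast.
    (s.clamp(-1.0, 1.0) * i16::MAX as f32) as i16
}

fn i16_to_f32(s: i16) -> f32 {
    s as f32 / i16::MAX as f32
}

fn main() {
    assert_eq!(f32_to_i16(0.0), 0);
    assert_eq!(f32_to_i16(1.0), i16::MAX);
    assert_eq!(f32_to_i16(2.0), i16::MAX); // out-of-range input is clamped
    assert_eq!(i16_to_f32(i16::MAX), 1.0);
    // The round trip through f32 is exact to within one sample step.
    let x = 12_345i16;
    let rt = f32_to_i16(i16_to_f32(x));
    assert!((rt as i32 - x as i32).abs() <= 1);
}
```

Note the symmetric scaling by `i16::MAX` (32767) means `-1.0` maps to `-32767`, never `i16::MIN`; this avoids the asymmetry of the two's-complement range at the cost of one unused code point.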


@@ -0,0 +1,122 @@
//! Lock-free SPSC ring buffer — "Reader-Detects-Lap" architecture.
//!
//! SPSC invariant: the producer ONLY writes `write_pos`, the consumer
//! ONLY writes `read_pos`. Neither thread touches the other's cursor.
//!
//! On overflow (writer laps the reader), the writer simply overwrites
//! old buffer data. The reader detects the lap via `available() >
//! RING_CAPACITY` and snaps its own `read_pos` forward.
//!
//! Capacity is a power of 2 for bitmask indexing (no modulo).
use std::sync::atomic::{AtomicU64, AtomicUsize, Ordering};
/// Ring buffer capacity — power of 2 for bitmask indexing.
/// 16384 samples = 341.3ms at 48kHz mono.
const RING_CAPACITY: usize = 16384; // 2^14
const RING_MASK: usize = RING_CAPACITY - 1;
/// Lock-free single-producer single-consumer ring buffer for i16 PCM samples.
pub struct AudioRing {
buf: Box<[i16]>,
/// Monotonically increasing write cursor. ONLY written by producer.
write_pos: AtomicUsize,
/// Monotonically increasing read cursor. ONLY written by consumer.
read_pos: AtomicUsize,
/// Incremented by reader when it detects it was lapped (overflow).
overflow_count: AtomicU64,
/// Incremented by reader when ring is empty (underrun).
underrun_count: AtomicU64,
}
// SAFETY: AudioRing is SPSC — one thread writes (producer), one reads (consumer).
// The producer only writes write_pos. The consumer only writes read_pos.
// Neither thread writes the other's cursor. Buffer indices are derived from
// the owning thread's cursor, ensuring no concurrent access to the same index.
unsafe impl Send for AudioRing {}
unsafe impl Sync for AudioRing {}
impl AudioRing {
pub fn new() -> Self {
debug_assert!(RING_CAPACITY.is_power_of_two());
Self {
buf: vec![0i16; RING_CAPACITY].into_boxed_slice(),
write_pos: AtomicUsize::new(0),
read_pos: AtomicUsize::new(0),
overflow_count: AtomicU64::new(0),
underrun_count: AtomicU64::new(0),
}
}
/// Number of samples available to read (clamped to capacity).
pub fn available(&self) -> usize {
let w = self.write_pos.load(Ordering::Acquire);
let r = self.read_pos.load(Ordering::Relaxed);
w.wrapping_sub(r).min(RING_CAPACITY)
}
/// Write samples into the ring. Returns number of samples written.
///
/// If the ring is full, old data is silently overwritten. The reader
/// will detect the lap and self-correct. The writer NEVER touches
/// `read_pos`.
pub fn write(&self, samples: &[i16]) -> usize {
let count = samples.len().min(RING_CAPACITY);
let w = self.write_pos.load(Ordering::Relaxed);
for i in 0..count {
unsafe {
let ptr = self.buf.as_ptr() as *mut i16;
*ptr.add((w + i) & RING_MASK) = samples[i];
}
}
self.write_pos
.store(w.wrapping_add(count), Ordering::Release);
count
}
/// Read samples from the ring into `out`. Returns number of samples read.
///
/// If the writer has lapped the reader (overflow), `read_pos` is snapped
/// forward to the oldest valid data.
pub fn read(&self, out: &mut [i16]) -> usize {
let w = self.write_pos.load(Ordering::Acquire);
let mut r = self.read_pos.load(Ordering::Relaxed);
let mut avail = w.wrapping_sub(r);
// Lap detection: writer has overwritten our unread data.
if avail > RING_CAPACITY {
r = w.wrapping_sub(RING_CAPACITY);
avail = RING_CAPACITY;
self.overflow_count.fetch_add(1, Ordering::Relaxed);
}
let count = out.len().min(avail);
if count == 0 {
if w == r {
self.underrun_count.fetch_add(1, Ordering::Relaxed);
}
return 0;
}
for i in 0..count {
out[i] = unsafe { *self.buf.as_ptr().add((r + i) & RING_MASK) };
}
self.read_pos
.store(r.wrapping_add(count), Ordering::Release);
count
}
/// Number of overflow events (reader was lapped by writer).
pub fn overflow_count(&self) -> u64 {
self.overflow_count.load(Ordering::Relaxed)
}
/// Number of underrun events (reader found empty buffer).
pub fn underrun_count(&self) -> u64 {
self.underrun_count.load(Ordering::Relaxed)
}
}
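The cursor arithmetic above can be exercised with a tiny single-threaded model (capacity 8 instead of 16384, plain integers instead of atomics) to confirm the reader-detects-lap behaviour:

```rust
// Single-threaded model of AudioRing's cursor arithmetic; the atomics are
// dropped since there is only one thread here, the wrapping math is identical.
const CAP: usize = 8;
const MASK: usize = CAP - 1;

struct MiniRing {
    buf: [i16; CAP],
    w: usize,      // monotonically increasing write cursor
    r: usize,      // monotonically increasing read cursor
    overflows: u64,
}

impl MiniRing {
    fn new() -> Self {
        MiniRing { buf: [0; CAP], w: 0, r: 0, overflows: 0 }
    }

    fn write(&mut self, samples: &[i16]) {
        let count = samples.len().min(CAP);
        for (i, &s) in samples[..count].iter().enumerate() {
            self.buf[(self.w + i) & MASK] = s;
        }
        self.w = self.w.wrapping_add(count);
    }

    fn read(&mut self, out: &mut [i16]) -> usize {
        let mut avail = self.w.wrapping_sub(self.r);
        if avail > CAP {
            // Lap detected: snap the read cursor to the oldest valid sample.
            self.r = self.w.wrapping_sub(CAP);
            avail = CAP;
            self.overflows += 1;
        }
        let count = out.len().min(avail);
        for i in 0..count {
            out[i] = self.buf[(self.r + i) & MASK];
        }
        self.r = self.r.wrapping_add(count);
        count
    }
}

fn main() {
    let mut ring = MiniRing::new();
    ring.write(&[1, 2, 3, 4]);
    let mut out = [0i16; 8];
    assert_eq!(ring.read(&mut out), 4);
    assert_eq!(&out[..4], &[1, 2, 3, 4]);

    // Writer laps the reader: 12 unread samples in a capacity-8 ring.
    ring.write(&[10, 11, 12, 13, 14, 15, 16, 17]);
    ring.write(&[20, 21, 22, 23]);
    let n = ring.read(&mut out);
    assert_eq!(ring.overflows, 1);  // reader detected the lap
    assert_eq!(n, 8);               // only the newest CAP samples survive
    assert_eq!(out, [14, 15, 16, 17, 20, 21, 22, 23]);
}
```

The key property demonstrated: after an overflow the reader loses the oldest audio (a glitch) rather than the newest, and neither side ever writes the other's cursor, which is what makes the real atomic version safe without locks.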


@@ -0,0 +1,179 @@
//! macOS Voice Processing I/O — uses Apple's VoiceProcessingIO audio unit
//! for hardware-accelerated echo cancellation, AGC, and noise suppression.
//!
//! VoiceProcessingIO is a combined input+output unit that knows what's going
//! to the speaker, so it can cancel the echo from the mic signal internally.
//! This is the same engine FaceTime and other Apple apps use.
use std::sync::atomic::{AtomicBool, Ordering};
use std::sync::Arc;
use anyhow::Context;
use coreaudio::audio_unit::audio_format::LinearPcmFlags;
use coreaudio::audio_unit::render_callback::{self, data};
use coreaudio::audio_unit::{AudioUnit, Element, IOType, SampleFormat, Scope, StreamFormat};
use coreaudio::sys;
use tracing::info;
use crate::audio_ring::AudioRing;
/// Number of samples per 20 ms frame at 48 kHz mono.
pub const FRAME_SAMPLES: usize = 960;
/// Combined capture + playback via macOS VoiceProcessingIO.
///
/// The OS handles AEC internally — no manual far-end feeding needed.
pub struct VpioAudio {
capture_ring: Arc<AudioRing>,
playout_ring: Arc<AudioRing>,
_audio_unit: AudioUnit,
running: Arc<AtomicBool>,
}
impl VpioAudio {
/// Start VoiceProcessingIO with AEC enabled.
pub fn start() -> Result<Self, anyhow::Error> {
let capture_ring = Arc::new(AudioRing::new());
let playout_ring = Arc::new(AudioRing::new());
let running = Arc::new(AtomicBool::new(true));
let mut au = AudioUnit::new(IOType::VoiceProcessingIO)
.context("failed to create VoiceProcessingIO audio unit")?;
// Must uninitialize before configuring properties.
au.uninitialize()
.context("failed to uninitialize VPIO for configuration")?;
// Enable input (mic) on Element::Input (bus 1).
let enable: u32 = 1;
au.set_property(
sys::kAudioOutputUnitProperty_EnableIO,
Scope::Input,
Element::Input,
Some(&enable),
)
.context("failed to enable VPIO input")?;
// Output (speaker) is enabled by default on VPIO, but be explicit.
au.set_property(
sys::kAudioOutputUnitProperty_EnableIO,
Scope::Output,
Element::Output,
Some(&enable),
)
.context("failed to enable VPIO output")?;
// Configure stream format: 48kHz mono f32 non-interleaved
let stream_format = StreamFormat {
sample_rate: 48_000.0,
sample_format: SampleFormat::F32,
flags: LinearPcmFlags::IS_FLOAT
| LinearPcmFlags::IS_PACKED
| LinearPcmFlags::IS_NON_INTERLEAVED,
channels: 1,
};
let asbd = stream_format.to_asbd();
// Input: set format on Output scope of Input element
// (= the format the AU delivers to us from the mic)
au.set_property(
sys::kAudioUnitProperty_StreamFormat,
Scope::Output,
Element::Input,
Some(&asbd),
)
.context("failed to set input stream format")?;
// Output: set format on Input scope of Output element
// (= the format we feed to the AU for the speaker)
au.set_property(
sys::kAudioUnitProperty_StreamFormat,
Scope::Input,
Element::Output,
Some(&asbd),
)
.context("failed to set output stream format")?;
// Set up input callback (mic capture with AEC applied)
let cap_ring = capture_ring.clone();
let cap_running = running.clone();
let logged = Arc::new(AtomicBool::new(false));
au.set_input_callback(
move |args: render_callback::Args<data::NonInterleaved<f32>>| {
if !cap_running.load(Ordering::Relaxed) {
return Ok(());
}
let mut buffers = args.data.channels();
if let Some(ch) = buffers.next() {
if !logged.swap(true, Ordering::Relaxed) {
eprintln!("[vpio] capture callback: {} f32 samples", ch.len());
}
let mut tmp = [0i16; FRAME_SAMPLES];
for chunk in ch.chunks(FRAME_SAMPLES) {
let n = chunk.len();
for i in 0..n {
tmp[i] = (chunk[i].clamp(-1.0, 1.0) * i16::MAX as f32) as i16;
}
cap_ring.write(&tmp[..n]);
}
}
Ok(())
},
)
.context("failed to set input callback")?;
// Set up output callback (speaker playback — AEC uses this as reference)
let play_ring = playout_ring.clone();
au.set_render_callback(
move |mut args: render_callback::Args<data::NonInterleaved<f32>>| {
let mut buffers = args.data.channels_mut();
if let Some(ch) = buffers.next() {
let mut tmp = [0i16; FRAME_SAMPLES];
for chunk in ch.chunks_mut(FRAME_SAMPLES) {
let n = chunk.len();
let read = play_ring.read(&mut tmp[..n]);
for i in 0..read {
chunk[i] = tmp[i] as f32 / i16::MAX as f32;
}
for i in read..n {
chunk[i] = 0.0;
}
}
}
Ok(())
},
)
.context("failed to set render callback")?;
au.initialize().context("failed to initialize VoiceProcessingIO")?;
au.start().context("failed to start VoiceProcessingIO")?;
info!("VoiceProcessingIO started (OS-level AEC enabled)");
Ok(Self {
capture_ring,
playout_ring,
_audio_unit: au,
running,
})
}
pub fn capture_ring(&self) -> &Arc<AudioRing> {
&self.capture_ring
}
pub fn playout_ring(&self) -> &Arc<AudioRing> {
&self.playout_ring
}
pub fn stop(&self) {
self.running.store(false, Ordering::Relaxed);
}
}
impl Drop for VpioAudio {
fn drop(&mut self) {
self.stop();
}
}


@@ -7,7 +7,7 @@ use std::time::{Duration, Instant};
 use bytes::Bytes;
 use tracing::{debug, info, warn};
-use wzp_codec::{ComfortNoise, NoiseSupressor, SilenceDetector};
+use wzp_codec::{AutoGainControl, ComfortNoise, EchoCanceller, NoiseSupressor, SilenceDetector};
 use wzp_fec::{RaptorQFecDecoder, RaptorQFecEncoder};
 use wzp_proto::jitter::{JitterBuffer, PlayoutResult};
 use wzp_proto::packet::{MediaHeader, MediaPacket, MiniFrameContext};
@@ -42,6 +42,9 @@ pub struct CallConfig {
     /// When enabled, only every 50th frame carries a full 12-byte MediaHeader;
     /// intermediate frames use a compact 4-byte MiniHeader.
     pub mini_frames_enabled: bool,
+    /// AEC far-end delay compensation in milliseconds (default: 40).
+    /// Compensates for the round-trip audio latency from playout to mic capture.
+    pub aec_delay_ms: u32,
     /// Enable adaptive jitter buffer (default: true).
     ///
     /// When true, the jitter buffer target depth is automatically adjusted
@@ -63,6 +66,7 @@ impl Default for CallConfig {
             noise_suppression: true,
             mini_frames_enabled: true,
             adaptive_jitter: true,
+            aec_delay_ms: 40,
         }
     }
 }
@@ -207,6 +211,10 @@ pub struct CallEncoder {
     frame_in_block: u8,
     /// Timestamp counter (ms).
     timestamp_ms: u32,
+    /// Acoustic echo canceller (removes speaker echo from mic signal).
+    aec: EchoCanceller,
+    /// Automatic gain control (normalises mic level).
+    agc: AutoGainControl,
     /// Silence detector for suppression.
     silence_detector: SilenceDetector,
     /// Whether silence suppression is enabled.
@@ -237,6 +245,8 @@ impl CallEncoder {
             block_id: 0,
             frame_in_block: 0,
             timestamp_ms: 0,
+            aec: EchoCanceller::with_delay(48000, 60, config.aec_delay_ms),
+            agc: AutoGainControl::new(),
             silence_detector: SilenceDetector::new(
                 config.silence_threshold_rms,
                 config.silence_hangover_frames,
@@ -274,15 +284,21 @@ impl CallEncoder {
     /// Input: 48kHz mono PCM, frame size depends on profile (960 for 20ms, 1920 for 40ms).
     /// Output: one or more MediaPackets to send.
     pub fn encode_frame(&mut self, pcm: &[i16]) -> Result<Vec<MediaPacket>, anyhow::Error> {
-        // Noise suppression: denoise the PCM before silence detection and encoding.
-        let pcm = if self.denoiser.is_enabled() {
-            let mut buf = pcm.to_vec();
-            self.denoiser.process(&mut buf);
-            buf
-        } else {
-            pcm.to_vec()
-        };
-        let pcm = &pcm[..];
+        // Copy PCM into a mutable buffer for the processing pipeline.
+        let mut pcm_buf = pcm.to_vec();
+
+        // Step 1: Echo cancellation (far-end reference must have been fed already).
+        self.aec.process_frame(&mut pcm_buf);
+
+        // Step 2: Automatic gain control (normalise mic level).
+        self.agc.process_frame(&mut pcm_buf);
+
+        // Step 3: Noise suppression (RNNoise).
+        if self.denoiser.is_enabled() {
+            self.denoiser.process(&mut pcm_buf);
+        }
+        let pcm = &pcm_buf[..];

         // Silence suppression: skip encoding silent frames, periodically send CN.
         if self.suppression_enabled && self.silence_detector.is_silent(pcm) {
@@ -400,6 +416,24 @@ impl CallEncoder {
         self.frame_in_block = 0;
         Ok(())
     }
+
+    /// Feed decoded playout audio as the echo reference signal.
+    ///
+    /// Must be called with each decoded frame BEFORE the corresponding
+    /// microphone frame is processed.
+    pub fn feed_aec_farend(&mut self, farend: &[i16]) {
+        self.aec.feed_farend(farend);
+    }
+
+    /// Enable or disable acoustic echo cancellation.
+    pub fn set_aec_enabled(&mut self, enabled: bool) {
+        self.aec.set_enabled(enabled);
+    }
+
+    /// Enable or disable automatic gain control.
+    pub fn set_agc_enabled(&mut self, enabled: bool) {
+        self.agc.set_enabled(enabled);
+    }
 }

 /// Manages the recv/decode side of a call.
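The `feed_aec_farend` contract above (feed the speaker frame before processing the matching mic frame) can be illustrated with a toy canceller. This is a sketch only: the real `EchoCanceller` is an adaptive filter with delay compensation, not a plain subtractor, and `ToyAec` is an invented name.

```rust
use std::collections::VecDeque;

/// Toy echo canceller illustrating the feed-before-process ordering.
/// (Illustrative only; wzp_codec's EchoCanceller is adaptive, not a subtractor.)
struct ToyAec {
    farend: VecDeque<Vec<i16>>,
}

impl ToyAec {
    fn new() -> Self {
        Self { farend: VecDeque::new() }
    }

    /// Queue one decoded playout frame as the echo reference.
    fn feed_farend(&mut self, frame: &[i16]) {
        self.farend.push_back(frame.to_vec());
    }

    /// Subtract the oldest reference frame from the mic frame, if available.
    fn process_frame(&mut self, mic: &mut [i16]) {
        if let Some(reference) = self.farend.pop_front() {
            for (m, r) in mic.iter_mut().zip(reference.iter()) {
                *m = m.saturating_sub(*r);
            }
        }
    }
}

fn main() {
    let mut aec = ToyAec::new();
    aec.feed_farend(&[100, 200]);
    let mut mic = [150i16, 250];
    aec.process_frame(&mut mic);
    assert_eq!(mic, [50, 50]); // echo estimate removed
    aec.process_frame(&mut mic);
    assert_eq!(mic, [50, 50]); // no reference queued, frame passes through
}
```

If the ordering is reversed, the canceller correlates the mic frame against the wrong (or missing) reference and removes nothing, which is why the send loop drains the far-end ring before each `encode_frame` call.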
@@ -466,6 +500,49 @@ impl CallDecoder {
         }
     }

+    /// Switch the decoder to match an incoming packet's codec if it differs
+    /// from the current profile. This enables cross-codec interop (e.g. one
+    /// client sends Opus, the other sends Codec2).
+    fn switch_decoder_if_needed(&mut self, incoming_codec: CodecId) {
+        if incoming_codec == self.profile.codec || incoming_codec == CodecId::ComfortNoise {
+            return;
+        }
+        let new_profile = Self::profile_for_codec(incoming_codec);
+        info!(
+            from = ?self.profile.codec,
+            to = ?incoming_codec,
+            "decoder switching codec to match incoming packet"
+        );
+        if let Err(e) = self.audio_dec.set_profile(new_profile) {
+            warn!("failed to switch decoder profile: {e}");
+            return;
+        }
+        self.fec_dec = wzp_fec::create_decoder(&new_profile);
+        self.profile = new_profile;
+    }
+
+    /// Map a `CodecId` to a reasonable `QualityProfile` for decoding.
+    fn profile_for_codec(codec: CodecId) -> QualityProfile {
+        match codec {
+            CodecId::Opus24k => QualityProfile::GOOD,
+            CodecId::Opus16k => QualityProfile {
+                codec: CodecId::Opus16k,
+                fec_ratio: 0.3,
+                frame_duration_ms: 20,
+                frames_per_block: 5,
+            },
+            CodecId::Opus6k => QualityProfile::DEGRADED,
+            CodecId::Codec2_3200 => QualityProfile {
+                codec: CodecId::Codec2_3200,
+                fec_ratio: 0.5,
+                frame_duration_ms: 20,
+                frames_per_block: 5,
+            },
+            CodecId::Codec2_1200 => QualityProfile::CATASTROPHIC,
+            CodecId::ComfortNoise => QualityProfile::GOOD,
+        }
+    }

     /// Decode the next audio frame from the jitter buffer.
     ///
     /// Returns PCM samples (48kHz mono) or None if not ready.
@@ -480,6 +557,9 @@ impl CallDecoder {
             return Some(pcm.len());
         }

+        // Auto-switch decoder if incoming codec differs from current.
+        self.switch_decoder_if_needed(pkt.header.codec_id);
+
         self.last_was_cn = false;
         let result = match self.audio_dec.decode(&pkt.payload, pcm) {
             Ok(n) => Some(n),
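The auto-switch logic in `switch_decoder_if_needed` reduces to a single predicate: switch unless the incoming codec already matches, or the packet is comfort noise (which is codec-agnostic and must not thrash the decoder). A self-contained sketch, with a stand-in enum for `wzp_proto`'s `CodecId`:

```rust
/// Stand-in for wzp_proto::CodecId (variants mirror the diff above).
#[derive(Clone, Copy, PartialEq, Debug)]
#[allow(dead_code)]
enum CodecId {
    Opus24k,
    Opus6k,
    Codec2_3200,
    Codec2_1200,
    ComfortNoise,
}

/// True when the decoder should be re-profiled for `incoming`.
/// Comfort-noise packets never trigger a switch.
fn should_switch(current: CodecId, incoming: CodecId) -> bool {
    incoming != current && incoming != CodecId::ComfortNoise
}

fn main() {
    assert!(should_switch(CodecId::Opus24k, CodecId::Codec2_3200));
    assert!(!should_switch(CodecId::Opus24k, CodecId::Opus24k));
    assert!(!should_switch(CodecId::Opus24k, CodecId::ComfortNoise));
}
```

Exempting comfort noise matters because CN frames are interleaved with whatever codec is active; switching on them would reset the FEC decoder mid-stream for no benefit.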

View File

@@ -14,17 +14,23 @@
 use std::net::SocketAddr;
 use std::sync::Arc;
-use tracing::{error, info};
+use tracing::{error, info, warn};
 use wzp_client::call::{CallConfig, CallDecoder, CallEncoder};
 use wzp_proto::MediaTransport;

-const FRAME_SAMPLES: usize = 960; // 20ms @ 48kHz
+const FRAME_SAMPLES_20MS: usize = 960; // 20ms @ 48kHz
+const FRAME_SAMPLES_40MS: usize = 1920; // 40ms @ 48kHz
+
+/// Compute frame samples at 48kHz for a given profile.
+fn frame_samples_for(profile: &wzp_proto::QualityProfile) -> usize {
+    (profile.frame_duration_ms as usize) * 48 // 48000 / 1000
+}

 /// Generate a sine wave tone.
-fn generate_sine_frame(freq_hz: f32, sample_rate: u32, frame_offset: u64) -> Vec<i16> {
-    let start_sample = frame_offset * FRAME_SAMPLES as u64;
-    (0..FRAME_SAMPLES)
+fn generate_sine_frame(freq_hz: f32, sample_rate: u32, frame_offset: u64, frame_samples: usize) -> Vec<i16> {
+    let start_sample = frame_offset * frame_samples as u64;
+    (0..frame_samples)
         .map(|i| {
             let t = (start_sample + i as u64) as f32 / sample_rate as f32;
             (f32::sin(2.0 * std::f32::consts::PI * freq_hz * t) * 16000.0) as i16
@@ -45,12 +51,32 @@ struct CliArgs {
     seed_hex: Option<String>,
     mnemonic: Option<String>,
     room: Option<String>,
+    raw_room: bool,
+    alias: Option<String>,
+    no_denoise: bool,
+    no_aec: bool,
+    no_agc: bool,
+    no_fec: bool,
+    no_silence: bool,
+    direct_playout: bool,
+    aec_delay_ms: Option<u32>,
+    os_aec: bool,
     token: Option<String>,
     _metrics_file: Option<String>,
+    /// Force a quality profile: "good", "degraded", "catastrophic", "codec2-3200"
+    profile_override: Option<String>,
+}
+
+/// Default identity file path: ~/.wzp/identity
+fn default_identity_path() -> std::path::PathBuf {
+    let home = std::env::var("HOME").unwrap_or_else(|_| ".".to_string());
+    std::path::PathBuf::from(home).join(".wzp").join("identity")
 }
 impl CliArgs {
-    /// Resolve the identity seed from --seed, --mnemonic, or generate a new one.
+    /// Resolve the identity seed from --seed, --mnemonic, or persistent file.
+    ///
+    /// Priority: --seed > --mnemonic > ~/.wzp/identity > generate + save.
     pub fn resolve_seed(&self) -> wzp_crypto::Seed {
         if let Some(ref hex_str) = self.seed_hex {
             let seed = wzp_crypto::Seed::from_hex(hex_str).expect("invalid --seed hex");
@@ -65,15 +91,56 @@ impl CliArgs {
             info!(fingerprint = %fp, "identity from --mnemonic");
             seed
         } else {
+            let path = default_identity_path();
+            // Try loading existing identity
+            if path.exists() {
+                if let Ok(hex_str) = std::fs::read_to_string(&path) {
+                    let hex_str = hex_str.trim();
+                    if let Ok(seed) = wzp_crypto::Seed::from_hex(hex_str) {
+                        let id = seed.derive_identity();
+                        let fp = id.public_identity().fingerprint;
+                        info!(fingerprint = %fp, path = %path.display(), "loaded persistent identity");
+                        return seed;
+                    }
+                }
+            }
+            // Generate new and save
             let seed = wzp_crypto::Seed::generate();
             let id = seed.derive_identity();
             let fp = id.public_identity().fingerprint;
-            info!(fingerprint = %fp, "generated ephemeral identity");
+            if let Some(parent) = path.parent() {
+                std::fs::create_dir_all(parent).ok();
+            }
+            // Encode seed as hex manually (avoid dep on `hex` crate in binary)
+            let hex_str: String = seed.0.iter().map(|b| format!("{b:02x}")).collect();
+            std::fs::write(&path, hex_str).ok();
+            info!(fingerprint = %fp, path = %path.display(), "generated and saved new identity");
             seed
         }
     }
 }
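The identity file is written with a hand-rolled hex encoder to avoid a dependency on the `hex` crate. A sketch of that round trip; `to_hex` matches the `format!("{b:02x}")` approach in the diff, while `from_hex` is illustrative (the real decode lives in `wzp_crypto::Seed::from_hex`):

```rust
/// Encode bytes as lowercase hex, as done for the identity file.
fn to_hex(bytes: &[u8]) -> String {
    bytes.iter().map(|b| format!("{b:02x}")).collect()
}

/// Decode lowercase/uppercase hex back to bytes (illustrative counterpart).
fn from_hex(s: &str) -> Option<Vec<u8>> {
    if s.len() % 2 != 0 {
        return None;
    }
    (0..s.len())
        .step_by(2)
        .map(|i| u8::from_str_radix(&s[i..i + 2], 16).ok())
        .collect()
}

fn main() {
    let seed = [0x00u8, 0xab, 0xff];
    let hex = to_hex(&seed);
    assert_eq!(hex, "00abff");
    assert_eq!(from_hex(&hex), Some(seed.to_vec()));
}
```

A 32-byte seed thus serialises to exactly 64 hex characters, which is why the `--seed` help text asks for "64 hex chars".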
+/// Resolve a profile name to a QualityProfile.
+fn resolve_profile(name: &str) -> wzp_proto::QualityProfile {
+    use wzp_proto::{CodecId, QualityProfile};
+    match name.to_lowercase().as_str() {
+        "good" | "opus" | "opus24k" => QualityProfile::GOOD,
+        "degraded" | "opus6k" => QualityProfile::DEGRADED,
+        "catastrophic" | "codec2-1200" | "c2-1200" | "1200" => QualityProfile::CATASTROPHIC,
+        "codec2-3200" | "c2-3200" | "3200" => QualityProfile {
+            codec: CodecId::Codec2_3200,
+            fec_ratio: 0.5,
+            frame_duration_ms: 20,
+            frames_per_block: 5,
+        },
+        other => {
+            eprintln!("unknown profile: {other}");
+            eprintln!("valid: good, degraded, catastrophic, codec2-3200, codec2-1200");
+            std::process::exit(1);
+        }
+    }
+}
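Profile selection drives frame sizing everywhere downstream: at 48 kHz mono, samples per frame is simply the frame duration in milliseconds times 48, which is where the `FRAME_SAMPLES` constants come from (20 ms → 960, 40 ms → 1920). A quick sketch of that arithmetic:

```rust
/// Samples per mono frame at 48 kHz for a given frame duration.
fn frame_samples(frame_duration_ms: usize) -> usize {
    frame_duration_ms * 48_000 / 1_000
}

fn main() {
    assert_eq!(frame_samples(20), 960);  // 20 ms frames
    assert_eq!(frame_samples(40), 1920); // 40 ms frames
}
```

Because the frame size, the pacing interval, and the jitter buffer depth all derive from `frame_duration_ms`, forcing a profile from the CLI is enough to reconfigure the whole pipeline consistently.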
 fn parse_args() -> CliArgs {
     let args: Vec<String> = std::env::args().collect();
     let mut live = false;
@@ -86,8 +153,19 @@ fn parse_args() -> CliArgs {
     let mut seed_hex = None;
     let mut mnemonic = None;
     let mut room = None;
+    let mut raw_room = false;
+    let mut alias = None;
+    let mut no_denoise = false;
+    let mut no_aec = false;
+    let mut no_agc = false;
+    let mut no_fec = false;
+    let mut no_silence = false;
+    let mut direct_playout = false;
+    let mut aec_delay_ms = None;
+    let mut os_aec = false;
     let mut token = None;
     let mut metrics_file = None;
+    let mut profile_override = None;
     let mut relay_str = None;

     let mut i = 1;
@@ -130,6 +208,27 @@ fn parse_args() -> CliArgs {
                 i += 1;
                 room = Some(args.get(i).expect("--room requires a name").to_string());
             }
+            "--raw-room" => raw_room = true,
+            "--no-denoise" => no_denoise = true,
+            "--no-aec" => no_aec = true,
+            "--no-agc" => no_agc = true,
+            "--no-fec" => no_fec = true,
+            "--no-silence" => no_silence = true,
+            "--direct-playout" | "--android" => direct_playout = true,
+            "--os-aec" => os_aec = true,
+            "--aec-delay" => {
+                i += 1;
+                aec_delay_ms = Some(
+                    args.get(i)
+                        .expect("--aec-delay requires milliseconds")
+                        .parse()
+                        .expect("--aec-delay value must be a number"),
+                );
+            }
+            "--alias" => {
+                i += 1;
+                alias = Some(args.get(i).expect("--alias requires a name").to_string());
+            }
             "--token" => {
                 i += 1;
                 token = Some(args.get(i).expect("--token requires a value").to_string());
@@ -168,6 +267,14 @@ fn parse_args() -> CliArgs {
                         .expect("--drift-test value must be a number"),
                 );
             }
+            "--profile" | "--codec" => {
+                i += 1;
+                profile_override = Some(
+                    args.get(i)
+                        .expect("--profile requires a value (good, degraded, catastrophic, codec2-3200)")
+                        .to_string(),
+                );
+            }
             "--sweep" => sweep = true,
             "--help" | "-h" => {
                 eprintln!("Usage: wzp-client [options] [relay-addr]");
@@ -179,14 +286,28 @@ fn parse_args() -> CliArgs {
                 eprintln!(" --record <file.raw> Record received audio to raw PCM file");
                 eprintln!(" --echo-test <secs> Run automated echo quality test");
                 eprintln!(" --drift-test <secs> Run automated clock-drift measurement");
+                eprintln!(" --profile <name> Force quality profile: good, degraded, catastrophic, codec2-3200");
+                eprintln!(" --codec <name> Alias for --profile");
                 eprintln!(" --sweep Run jitter buffer parameter sweep (local, no network)");
                 eprintln!(" --seed <hex> Identity seed (64 hex chars, featherChat compatible)");
                 eprintln!(" --mnemonic <words...> Identity seed as BIP39 mnemonic (24 words)");
                 eprintln!(" --room <name> Room name (hashed for privacy before sending)");
+                eprintln!(" --raw-room Send room name as-is (no hash, for Android compat)");
+                eprintln!(" --alias <name> Display name shown to other participants");
+                eprintln!(" --no-denoise Disable RNNoise noise suppression");
+                eprintln!(" --no-aec Disable acoustic echo cancellation");
+                eprintln!(" --no-agc Disable automatic gain control");
+                eprintln!(" --no-fec Disable forward error correction");
+                eprintln!(" --no-silence Disable silence suppression");
+                eprintln!(" --direct-playout Bypass jitter buffer (decode on recv, like Android)");
+                eprintln!(" --aec-delay <ms> AEC far-end delay compensation (default: 40ms)");
+                eprintln!(" --os-aec Use macOS VoiceProcessingIO for hardware AEC (requires --vpio feature)");
+                eprintln!(" --android Alias for --no-denoise --no-silence --direct-playout");
                 eprintln!(" --token <token> featherChat bearer token for relay auth");
                 eprintln!(" --metrics-file <path> Write JSONL telemetry to file (1 line/sec)");
                 eprintln!(" (48kHz mono s16le, play with ffplay -f s16le -ar 48000 -ch_layout mono file.raw)");
                 eprintln!();
+                eprintln!("Identity is auto-saved to ~/.wzp/identity on first run.");
                 eprintln!("Default relay: 127.0.0.1:4433");
                 std::process::exit(0);
             }
@@ -219,8 +340,19 @@ fn parse_args() -> CliArgs {
         seed_hex,
         mnemonic,
         room,
+        raw_room,
+        alias,
+        no_denoise,
+        no_aec,
+        no_agc,
+        no_fec,
+        no_silence,
+        direct_playout,
+        aec_delay_ms,
+        os_aec,
         token,
         _metrics_file: metrics_file,
+        profile_override,
     }
 }
@@ -241,17 +373,30 @@ async fn main() -> anyhow::Result<()> {
     let seed = cli.resolve_seed();

+    // Resolve profile override
+    let profile = cli.profile_override.as_deref().map(resolve_profile);
+    if let Some(ref p) = profile {
+        info!(codec = ?p.codec, frame_ms = p.frame_duration_ms, fec = p.fec_ratio, "forced profile");
+    }
+
     info!(
         relay = %cli.relay_addr,
         live = cli.live,
         send_tone = ?cli.send_tone_secs,
         record = ?cli.record_file,
         room = ?cli.room,
+        profile = ?cli.profile_override,
         "WarzonePhone client"
     );

-    // Hash room name for SNI privacy (or "default" if none specified)
+    // Compute SNI from room name.
+    // --raw-room sends the name as-is (for Android compat — Android doesn't hash).
+    // Default behaviour hashes for privacy.
     let sni = match &cli.room {
+        Some(name) if cli.raw_room => {
+            info!(room = %name, "using raw room name as SNI (no hash)");
+            name.clone()
+        }
         Some(name) => {
             let hashed = wzp_crypto::hash_room_name(name);
             info!(room = %name, hashed = %hashed, "room name hashed for SNI");
@@ -287,13 +432,25 @@ async fn main() -> anyhow::Result<()> {
     let _crypto_session = wzp_client::handshake::perform_handshake(
         &*transport,
         &seed.0,
+        cli.alias.as_deref(),
     ).await?;
     info!("crypto handshake complete");

     if cli.live {
         #[cfg(feature = "audio")]
         {
-            return run_live(transport).await;
+            let audio_opts = AudioOpts {
+                no_denoise: cli.no_denoise || cli.direct_playout,
+                no_aec: cli.no_aec,
+                no_agc: cli.no_agc,
+                no_fec: cli.no_fec,
+                no_silence: cli.no_silence || cli.direct_playout,
+                direct_playout: cli.direct_playout,
+                aec_delay_ms: cli.aec_delay_ms,
+                os_aec: cli.os_aec,
+                profile_override: profile,
+            };
+            return run_live(transport, audio_opts).await;
         }
         #[cfg(not(feature = "audio"))]
         {
@@ -314,19 +471,23 @@ async fn main() -> anyhow::Result<()> {
         transport.close().await?;
         Ok(())
     } else if cli.send_tone_secs.is_some() || cli.send_file.is_some() || cli.record_file.is_some() {
-        run_file_mode(transport, cli.send_tone_secs, cli.send_file, cli.record_file).await
+        run_file_mode(transport, cli.send_tone_secs, cli.send_file, cli.record_file, profile).await
     } else {
-        run_silence(transport).await
+        run_silence(transport, profile).await
     }
 }

 /// Send silence frames (connectivity test).
-async fn run_silence(transport: Arc<wzp_transport::QuinnTransport>) -> anyhow::Result<()> {
+async fn run_silence(transport: Arc<wzp_transport::QuinnTransport>, profile: Option<wzp_proto::QualityProfile>) -> anyhow::Result<()> {
-    let config = CallConfig::default();
+    let config = match profile {
+        Some(p) => CallConfig::from_profile(p),
+        None => CallConfig::default(),
+    };
+    let frame_samples = frame_samples_for(&config.profile);
     let mut encoder = CallEncoder::new(&config);
-    let frame_duration = tokio::time::Duration::from_millis(20);
-    let pcm = vec![0i16; FRAME_SAMPLES];
+    let frame_duration = tokio::time::Duration::from_millis(config.profile.frame_duration_ms as u64);
+    let pcm = vec![0i16; frame_samples];

     let mut total_source = 0u64;
     let mut total_repair = 0u64;
@@ -342,8 +503,7 @@ async fn run_silence(transport: Arc<wzp_transport::QuinnTransport>) -> anyhow::R
         }
         total_bytes += pkt.payload.len() as u64;
         if let Err(e) = transport.send_media(pkt).await {
-            error!("send error: {e}");
-            break;
+            warn!("send_media error (dropping packet): {e}");
         }
     }
     if (i + 1) % 50 == 0 {
@@ -373,13 +533,20 @@ async fn run_file_mode(
     send_tone_secs: Option<u32>,
     send_file: Option<String>,
     record_file: Option<String>,
+    profile: Option<wzp_proto::QualityProfile>,
 ) -> anyhow::Result<()> {
-    let config = CallConfig::default();
+    let config = match profile {
+        Some(p) => CallConfig::from_profile(p),
+        None => CallConfig::default(),
+    };
+    let frame_samples = frame_samples_for(&config.profile);
+    let frame_duration_ms = config.profile.frame_duration_ms as u64;
     // --- Send task: generate tone or play file ---
     let send_transport = transport.clone();
     let send_handle = tokio::spawn(async move {
         // Load PCM frames from file or generate tone
+        let frames_per_sec = 1000 / frame_duration_ms;
         let pcm_frames: Vec<Vec<i16>> = if let Some(ref path) = send_file {
             // Read raw PCM file (48kHz mono s16le)
             let bytes = match std::fs::read(path) {
@@ -391,14 +558,14 @@ async fn run_file_mode(
                 .collect();
             let duration = samples.len() as f64 / 48_000.0;
             info!(file = %path, duration = format!("{:.1}s", duration), "sending audio file");
-            samples.chunks(FRAME_SAMPLES)
-                .filter(|c| c.len() == FRAME_SAMPLES)
+            samples.chunks(frame_samples)
+                .filter(|c| c.len() == frame_samples)
                 .map(|c| c.to_vec())
                 .collect()
         } else if let Some(secs) = send_tone_secs {
-            let total = (secs as u64) * 50;
-            info!(seconds = secs, frames = total, "sending 440Hz tone");
-            (0..total).map(|i| generate_sine_frame(440.0, 48_000, i)).collect()
+            let total = (secs as u64) * frames_per_sec;
+            info!(seconds = secs, frames = total, frame_samples, frame_ms = frame_duration_ms, "sending 440Hz tone");
+            (0..total).map(|i| generate_sine_frame(440.0, 48_000, i, frame_samples)).collect()
         } else {
             // No sending, just wait
             tokio::signal::ctrl_c().await.ok();
@@ -407,7 +574,7 @@ async fn run_file_mode(
         let mut encoder = CallEncoder::new(&config);
         let _total_frames = pcm_frames.len() as u64;
-        let frame_duration = tokio::time::Duration::from_millis(20);
+        let frame_duration = tokio::time::Duration::from_millis(frame_duration_ms);

         let mut total_source = 0u64;
         let mut total_repair = 0u64;
@@ -428,8 +595,7 @@ async fn run_file_mode(
                 total_source += 1;
             }
             if let Err(e) = send_transport.send_media(pkt).await {
-                error!("send error: {e}");
-                return;
+                warn!("send_media error (dropping packet): {e}");
             }
         }
         if (frame_idx + 1) % 250 == 0 {
@@ -458,8 +624,13 @@ async fn run_file_mode(
             }
         };

-        let mut decoder = CallDecoder::new(&CallConfig::default());
-        let mut pcm_buf = vec![0i16; FRAME_SAMPLES];
+        let recv_config = match profile {
+            Some(p) => CallConfig::from_profile(p),
+            None => CallConfig::default(),
+        };
+        let recv_frame_samples = frame_samples_for(&recv_config.profile);
+        let mut decoder = CallDecoder::new(&recv_config);
+        let mut pcm_buf = vec![0i16; recv_frame_samples.max(FRAME_SAMPLES_40MS)];

         let mut all_pcm: Vec<i16> = Vec::new();
         let mut frames_received = 0u64;
@@ -548,78 +719,534 @@ async fn run_file_mode(
} }
 /// Live mode: capture from mic, encode, send; receive, decode, play.
+///
+/// Architecture (mirrors wzp-android/engine.rs):
+///   CPAL capture callback → AudioRing → send task (5ms poll) → QUIC
+///   QUIC → recv task → jitter buffer → decode tick (20ms) → AudioRing → CPAL playback callback
+///
+/// All lock-free: CPAL callbacks use atomic ring buffers, no Mutex on the audio path.
+
+/// RAII guard for terminal raw mode. Restores on drop.
+struct RawModeGuard {
+    orig: libc::termios,
+}
+
+impl RawModeGuard {
+    fn enter() -> Option<Self> {
+        unsafe {
+            let mut orig: libc::termios = std::mem::zeroed();
+            if libc::tcgetattr(libc::STDIN_FILENO, &mut orig) != 0 {
+                return None;
+            }
+            let mut raw = orig;
+            // ICANON: character-at-a-time input
+            // ECHO: don't echo typed characters
+            // ISIG: let us handle Ctrl+C as a byte
+            raw.c_lflag &= !(libc::ICANON | libc::ECHO | libc::ISIG);
+            // IXON: disable Ctrl+S/Ctrl+Q flow control so we receive them
+            raw.c_iflag &= !libc::IXON;
+            raw.c_cc[libc::VMIN] = 1;
+            raw.c_cc[libc::VTIME] = 0;
+            libc::tcsetattr(libc::STDIN_FILENO, libc::TCSANOW, &raw);
+            Some(Self { orig })
+        }
+    }
+}
+
+impl Drop for RawModeGuard {
+    fn drop(&mut self) {
+        unsafe {
+            libc::tcsetattr(libc::STDIN_FILENO, libc::TCSANOW, &self.orig);
+        }
+    }
+}
+
+struct AudioOpts {
+    no_denoise: bool,
+    no_aec: bool,
+    no_agc: bool,
+    no_fec: bool,
+    no_silence: bool,
+    direct_playout: bool,
+    aec_delay_ms: Option<u32>,
+    os_aec: bool,
+    profile_override: Option<wzp_proto::QualityProfile>,
+}
 #[cfg(feature = "audio")]
-async fn run_live(transport: Arc<wzp_transport::QuinnTransport>) -> anyhow::Result<()> {
+async fn run_live(
+    transport: Arc<wzp_transport::QuinnTransport>,
+    opts: AudioOpts,
+) -> anyhow::Result<()> {
+    use std::sync::Arc as StdArc;
+    use std::sync::atomic::{AtomicBool, Ordering};
     use wzp_client::audio_io::{AudioCapture, AudioPlayback};
+    use wzp_client::audio_ring::AudioRing;
+    use wzp_client::call::JitterTelemetry;

-    let capture = AudioCapture::start()?;
-    let playback = AudioPlayback::start()?;
-    info!("Audio I/O started — press Ctrl+C to stop");
+    // Audio I/O: either VPIO (OS-level AEC) or separate CPAL streams.
+    #[cfg(all(target_os = "macos", feature = "vpio"))]
+    let vpio;
+    let (capture_ring, playout_ring) = if opts.os_aec {
+        #[cfg(all(target_os = "macos", feature = "vpio"))]
+        {
+            vpio = wzp_client::audio_vpio::VpioAudio::start()?;
+            (vpio.capture_ring().clone(), vpio.playout_ring().clone())
+        }
+        #[cfg(all(target_os = "macos", not(feature = "vpio")))]
+        {
+            anyhow::bail!("--os-aec requires the 'vpio' feature (build with: cargo build --features audio,vpio)");
+        }
+        #[cfg(target_os = "windows")]
+        {
+            warn!("--os-aec on Windows is experimental and not yet tested");
+            warn!("Windows Voice Capture DSP (MFT) AEC is not yet implemented");
+            warn!("falling back to CPAL without AEC — please report issues");
+            let capture = AudioCapture::start()?;
+            let playback = AudioPlayback::start()?;
+            let cr = capture.ring().clone();
+            let pr = playback.ring().clone();
+            std::mem::forget(capture);
+            std::mem::forget(playback);
+            (cr, pr)
+        }
+        #[cfg(target_os = "linux")]
+        {
+            warn!("--os-aec on Linux is experimental and not yet tested");
+            warn!("PipeWire/PulseAudio echo-cancel module AEC is not yet implemented");
+            warn!("falling back to CPAL without AEC — please report issues");
+            let capture = AudioCapture::start()?;
+            let playback = AudioPlayback::start()?;
+            let cr = capture.ring().clone();
+            let pr = playback.ring().clone();
+            std::mem::forget(capture);
+            std::mem::forget(playback);
+            (cr, pr)
+        }
+    } else {
+        let capture = AudioCapture::start()?;
+        let playback = AudioPlayback::start()?;
+        let cr = capture.ring().clone();
+        let pr = playback.ring().clone();
+        // Keep handles alive (streams stop when dropped)
+        std::mem::forget(capture);
+        std::mem::forget(playback);
+        (cr, pr)
+    };
+    info!(os_aec = opts.os_aec, "audio I/O started — press Ctrl+C to stop");
+
+    // Far-end reference ring (only used when NOT using OS AEC).
+    let farend_ring = StdArc::new(AudioRing::new());
+
+    let running = StdArc::new(AtomicBool::new(true));
+    let mic_muted = StdArc::new(AtomicBool::new(false));
+    let spk_muted = StdArc::new(AtomicBool::new(false));
+
+    // --- Signal handler: set running=false on first Ctrl+C, force-quit on second ---
+    let signal_running = running.clone();
+    tokio::spawn(async move {
+        tokio::signal::ctrl_c().await.ok();
+        eprintln!(); // newline after ^C
+        info!("Ctrl+C received, shutting down...");
+        signal_running.store(false, Ordering::SeqCst);
+        tokio::signal::ctrl_c().await.ok();
+        eprintln!("\nForce quit");
+        std::process::exit(1);
+    });
+
+    let base_config = match opts.profile_override {
+        Some(p) => CallConfig::from_profile(p),
+        None => CallConfig::default(),
+    };
+    let config = CallConfig {
+        noise_suppression: !opts.no_denoise,
+        suppression_enabled: !opts.no_silence,
+        aec_delay_ms: opts.aec_delay_ms.unwrap_or(40),
+        ..base_config
+    };
+    let frame_samples = frame_samples_for(&config.profile);
+    info!(codec = ?config.profile.codec, frame_samples, frame_ms = config.profile.frame_duration_ms, "call config");
+
+    {
+        let mut flags = Vec::new();
+        if opts.no_denoise { flags.push("denoise"); }
+        if opts.no_aec { flags.push("aec"); }
+        if opts.no_agc { flags.push("agc"); }
+        if opts.no_fec { flags.push("fec"); }
+        if opts.no_silence { flags.push("silence"); }
+        if opts.direct_playout { flags.push("jitter-buffer (direct playout)"); }
+        if !flags.is_empty() {
+            info!(disabled = %flags.join(", "), "audio processing overrides");
+        }
+    }
// --- Send task: poll capture ring → encode → send via async ---
let send_transport = transport.clone(); let send_transport = transport.clone();
let rt_handle = tokio::runtime::Handle::current(); let send_running = running.clone();
let send_handle = std::thread::Builder::new() let send_mic_muted = mic_muted.clone();
.name("wzp-send-loop".into()) let no_aec = opts.no_aec || opts.os_aec; // OS AEC replaces software AEC
.spawn(move || { let no_agc = opts.no_agc;
let config = CallConfig::default(); let _no_fec = opts.no_fec;
let mut encoder = CallEncoder::new(&config); let send_farend = farend_ring.clone();
loop { let send_task = async move {
let frame = match capture.read_frame() { let mut encoder = CallEncoder::new(&config);
Some(f) => f, if no_aec { encoder.set_aec_enabled(false); }
None => break, if no_agc { encoder.set_agc_enabled(false); }
}; let mut capture_buf = vec![0i16; frame_samples];
let packets = match encoder.encode_frame(&frame) { let mut farend_buf = vec![0i16; frame_samples];
Ok(p) => p, let mut frames_sent: u64 = 0;
Err(e) => { let mut frames_dropped: u64 = 0;
error!("encode error: {e}"); let mut send_errors: u64 = 0;
continue; let mut last_send_err = std::time::Instant::now();
} let mut polls: u64 = 0;
}; let mut last_diag = std::time::Instant::now();
for pkt in &packets {
if let Err(e) = rt_handle.block_on(send_transport.send_media(pkt)) { loop {
error!("send error: {e}"); if !send_running.load(Ordering::Relaxed) {
return; break;
}
let avail = capture_ring.available();
if avail < frame_samples {
tokio::time::sleep(std::time::Duration::from_millis(5)).await;
polls += 1;
// Diagnostic every 2 seconds
if last_diag.elapsed().as_secs() >= 2 {
info!(avail, polls, frames_sent, frame_samples, "send: ring starved");
last_diag = std::time::Instant::now();
}
continue;
}
let read = capture_ring.read(&mut capture_buf);
if read < frame_samples {
continue;
}
// Mic mute: zero out capture buffer (still encode + send silence to keep stream alive)
if send_mic_muted.load(Ordering::Relaxed) {
capture_buf.fill(0);
}
// Feed AEC far-end reference: what was played through the speaker.
// Must be called BEFORE encode_frame processes the mic signal.
if !no_aec {
while send_farend.available() >= frame_samples {
send_farend.read(&mut farend_buf);
encoder.feed_aec_farend(&farend_buf);
}
}
let t0 = std::time::Instant::now();
let packets = match encoder.encode_frame(&capture_buf) {
Ok(p) => p,
Err(e) => {
error!("encode error: {e}");
continue;
}
};
let encode_us = t0.elapsed().as_micros();
let mut dropped = false;
for pkt in &packets {
if let Err(e) = send_transport.send_media(pkt).await {
send_errors += 1;
frames_dropped += 1;
dropped = true;
if send_errors <= 3 || last_send_err.elapsed().as_secs() >= 1 {
warn!(send_errors, frames_dropped,
"send_media error (dropping packet): {e}");
last_send_err = std::time::Instant::now();
}
}
}
if !dropped {
send_errors = 0; // reset on success
}
frames_sent += 1;
if frames_sent <= 5 || frames_sent % 500 == 0 {
info!(frames_sent, encode_us, pkts = packets.len(), "send progress");
}
}
};
// --- Recv + playout ---
let recv_transport = transport.clone();
let recv_running = running.clone();
let recv_spk_muted = spk_muted.clone();
let direct_playout = opts.direct_playout;
let recv_profile = opts.profile_override;
let playout_profile = recv_profile; // Copy for playout_task
// Direct playout: decode on recv, write straight to playout ring (like Android).
// Jitter buffer mode: ingest into jitter buffer, decode on 20ms tick.
let recv_task = {
let playout_ring = playout_ring.clone();
let farend_ring = farend_ring.clone();
let config = CallConfig::default();
let decoder = StdArc::new(tokio::sync::Mutex::new(CallDecoder::new(&config)));
let decoder_recv = decoder.clone();
async move {
let mut packets_received: u64 = 0;
let mut recv_errors: u64 = 0;
let mut timeouts: u64 = 0;
// For direct playout: raw codec decoder + AGC
let direct_profile = recv_profile.unwrap_or(wzp_proto::QualityProfile::GOOD);
let mut opus_dec = if direct_playout {
Some(wzp_codec::create_decoder(direct_profile))
} else {
None
};
let mut playout_agc = wzp_codec::AutoGainControl::new();
let mut pcm_buf = vec![0i16; frame_samples.max(FRAME_SAMPLES_40MS)];
loop {
if !recv_running.load(Ordering::Relaxed) {
break;
}
let result = tokio::time::timeout(
std::time::Duration::from_millis(100),
recv_transport.recv_media(),
)
.await;
match result {
Ok(Ok(Some(pkt))) => {
packets_received += 1;
if direct_playout {
// Android path: decode immediately, AGC, write to ring
if !pkt.header.is_repair {
if let Some(ref mut dec) = opus_dec {
match dec.decode(&pkt.payload, &mut pcm_buf) {
Ok(n) => {
if !no_agc {
playout_agc.process_frame(&mut pcm_buf[..n]);
}
// Always feed AEC (even when speaker muted)
farend_ring.write(&pcm_buf[..n]);
// Speaker mute: don't write to playout ring
if !recv_spk_muted.load(Ordering::Relaxed) {
playout_ring.write(&pcm_buf[..n]);
}
}
Err(e) => {
if let Ok(n) = dec.decode_lost(&mut pcm_buf) {
if !recv_spk_muted.load(Ordering::Relaxed) {
playout_ring.write(&pcm_buf[..n]);
}
}
if packets_received < 10 {
warn!("decode error: {e}");
}
}
}
}
}
} else {
// Jitter buffer path
let mut dec = decoder_recv.lock().await;
dec.ingest(pkt);
}
if packets_received == 1 || packets_received % 500 == 0 {
info!(packets_received, direct_playout, "recv progress");
}
timeouts = 0;
}
Ok(Ok(None)) => {
info!("connection closed");
break;
}
Ok(Err(e)) => {
let msg = e.to_string();
if msg.contains("closed") || msg.contains("reset") {
error!("recv fatal: {e}");
break;
}
recv_errors += 1;
if recv_errors <= 3 {
warn!("recv error (continuing): {e}");
}
}
Err(_) => {
timeouts += 1;
if timeouts == 50 {
info!("recv: no media packets received in 5s");
}
}
}
}
}
};
// Playout tick — only used when NOT in direct playout mode
let playout_running = running.clone();
let playout_task = async move {
if direct_playout {
// Direct playout handles everything in recv_task — just park here
loop {
tokio::time::sleep(std::time::Duration::from_secs(1)).await;
if !playout_running.load(Ordering::Relaxed) {
break;
}
}
return;
}
let playout_config = match playout_profile {
Some(p) => CallConfig::from_profile(p),
None => CallConfig::default(),
};
let playout_frame_ms = playout_config.profile.frame_duration_ms as u64;
let playout_frame_samples = frame_samples_for(&playout_config.profile);
let mut decoder = CallDecoder::new(&playout_config);
let mut pcm_buf = vec![0i16; playout_frame_samples.max(FRAME_SAMPLES_40MS)];
let mut interval = tokio::time::interval(std::time::Duration::from_millis(playout_frame_ms));
interval.set_missed_tick_behavior(tokio::time::MissedTickBehavior::Skip);
let mut telemetry = JitterTelemetry::new(5);
loop {
interval.tick().await;
if !playout_running.load(Ordering::Relaxed) {
break;
}
let mut decoded_this_tick = 0;
while let Some(n) = decoder.decode_next(&mut pcm_buf) {
playout_ring.write(&pcm_buf[..n]);
decoded_this_tick += 1;
if decoded_this_tick >= 2 {
break;
}
}
telemetry.maybe_log(decoder.stats());
}
};
// --- Signal task: listen for RoomUpdate and display presence ---
let signal_transport = transport.clone();
let signal_running = running.clone();
let signal_task = async move {
loop {
if !signal_running.load(Ordering::Relaxed) {
break;
}
let result = tokio::time::timeout(
std::time::Duration::from_millis(200),
signal_transport.recv_signal(),
)
.await;
match result {
Ok(Ok(Some(wzp_proto::SignalMessage::RoomUpdate { participants, .. }))) => {
// Dedup by (fingerprint, alias) — same peer may appear multiple times
let mut seen = std::collections::HashSet::new();
let unique: Vec<_> = participants
.iter()
.filter(|p| seen.insert((&p.fingerprint, &p.alias)))
.collect();
info!(count = unique.len(), "room update");
for p in &unique {
let name = p
.alias
.as_deref()
.unwrap_or("(no alias)");
let fp = if p.fingerprint.is_empty() {
"(no fingerprint)"
} else {
&p.fingerprint
};
info!(" participant: {name} [{fp}]");
}
}
Ok(Ok(Some(msg))) => {
info!("signal: {:?}", std::mem::discriminant(&msg));
}
Ok(Ok(None)) => {
info!("signal stream closed");
break;
}
Ok(Err(e)) => {
error!("signal recv error: {e}");
break;
}
Err(_) => {} // timeout — loop and check running flag
}
}
};
// --- Keyboard task: Ctrl+M = toggle mic mute, Ctrl+S = toggle speaker mute ---
let kb_running = running.clone();
let kb_mic = mic_muted.clone();
let kb_spk = spk_muted.clone();
let keyboard_task = async move {
use tokio::io::AsyncReadExt;
// Put terminal in raw mode so we get individual keypresses
let _raw_guard = RawModeGuard::enter();
let mut stdin = tokio::io::stdin();
let mut buf = [0u8; 1];
loop {
if !kb_running.load(Ordering::Relaxed) {
break;
}
match tokio::time::timeout(
std::time::Duration::from_millis(200),
stdin.read(&mut buf),
)
.await
{
Ok(Ok(1)) => match buf[0] {
b'm' | b'M' | 0x0D => {
// 'm' or Ctrl+M
let was = kb_mic.fetch_xor(true, Ordering::SeqCst);
let state = if !was { "MUTED" } else { "unmuted" };
eprintln!("\r[mic {state}]");
}
b's' | b'S' | 0x13 => {
// 's' or Ctrl+S
let was = kb_spk.fetch_xor(true, Ordering::SeqCst);
let state = if !was { "MUTED" } else { "unmuted" };
eprintln!("\r[speaker {state}]");
}
0x03 => {
// Ctrl+C
eprintln!();
info!("Ctrl+C received, shutting down...");
kb_running.store(false, Ordering::SeqCst);
break;
}
b'q' | b'Q' => {
eprintln!("\r[quit]");
kb_running.store(false, Ordering::SeqCst);
break;
}
_ => {}
},
Ok(Ok(_)) | Ok(Err(_)) => break,
Err(_) => {} // timeout
}
}
};
// --- Run all tasks, exit when any finishes (or running flag cleared by Ctrl+C) ---
tokio::select! {
_ = send_task => info!("send task ended"),
_ = recv_task => info!("recv task ended"),
_ = playout_task => info!("playout task ended"),
_ = signal_task => info!("signal task ended"),
_ = keyboard_task => info!("keyboard task ended"),
}
running.store(false, Ordering::SeqCst);
// Audio streams stop when their handles are dropped (via mem::forget above or VPIO drop).
// Give transport 2s to close gracefully, then bail
match tokio::time::timeout(std::time::Duration::from_secs(2), transport.close()).await {
Ok(Ok(())) => info!("done"),
Ok(Err(e)) => info!("close error (non-fatal): {e}"),
Err(_) => info!("close timed out, exiting anyway"),
}
Ok(())
}

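The participant dedup in the RoomUpdate handler above leans on `HashSet::insert` returning `false` for a key it has already seen. A minimal standalone sketch (hypothetical data, not taken from the crate) of the same pattern:

```rust
use std::collections::HashSet;

fn main() {
    // Same peer appearing twice, as the RoomUpdate comment warns.
    let participants = vec![
        ("aa11", Some("alice")),
        ("bb22", None),
        ("aa11", Some("alice")),
    ];
    let mut seen = HashSet::new();
    // `insert` returns false for a duplicate key, so `filter` keeps
    // only the first occurrence while preserving arrival order.
    let unique: Vec<_> = participants.iter().filter(|p| seen.insert(**p)).collect();
    assert_eq!(unique.len(), 2);
    println!("{} unique participants", unique.len());
}
```

The set owns copies of the keys here; the real handler inserts `(&p.fingerprint, &p.alias)` references to avoid cloning strings.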

@@ -109,12 +109,15 @@ pub fn signal_to_call_type(signal: &SignalMessage) -> CallSignalType {
SignalMessage::RouteResponse { .. } => CallSignalType::Offer, // reuse
SignalMessage::SessionForward { .. } => CallSignalType::Offer, // reuse
SignalMessage::SessionForwardAck { .. } => CallSignalType::Offer, // reuse
SignalMessage::RoomUpdate { .. } => CallSignalType::Offer, // reuse
SignalMessage::SetAlias { .. } => CallSignalType::Offer, // reuse
}
}
#[cfg(test)]
mod tests {
use super::*;
use wzp_proto::QualityProfile;
#[test]
fn payload_roundtrip() {
@@ -123,6 +126,7 @@ mod tests {
ephemeral_pub: [2u8; 32],
signature: vec![3u8; 64],
supported_profiles: vec![QualityProfile::GOOD],
alias: None,
};
let encoded = encode_call_payload(&signal, Some("relay.example.com:4433"), Some("myroom"));
@@ -140,6 +144,7 @@ mod tests {
ephemeral_pub: [0; 32],
signature: vec![],
supported_profiles: vec![],
alias: None,
};
assert!(matches!(signal_to_call_type(&offer), CallSignalType::Offer));


@@ -17,6 +17,7 @@ use wzp_proto::{MediaTransport, QualityProfile, SignalMessage};
pub async fn perform_handshake(
transport: &dyn MediaTransport,
seed: &[u8; 32],
alias: Option<&str>,
) -> Result<Box<dyn CryptoSession>, anyhow::Error> {
// 1. Create key exchange from identity seed
let mut kx = WarzoneKeyExchange::from_identity_seed(seed);
@@ -41,6 +42,7 @@ pub async fn perform_handshake(
QualityProfile::DEGRADED,
QualityProfile::CATASTROPHIC,
],
alias: alias.map(|s| s.to_string()),
};
transport.send_signal(&offer).await?;


@@ -8,6 +8,10 @@
#[cfg(feature = "audio")]
pub mod audio_io;
#[cfg(feature = "audio")]
pub mod audio_ring;
#[cfg(feature = "vpio")]
pub mod audio_vpio;
pub mod bench;
pub mod call;
pub mod drift_test;


@@ -14,7 +14,7 @@ use crate::codec2_dec::Codec2Decoder;
use crate::codec2_enc::Codec2Encoder;
use crate::opus_dec::OpusDecoder;
use crate::opus_enc::OpusEncoder;
use crate::resample::{Downsampler48to8, Upsampler8to48};
// ─── Helpers ─────────────────────────────────────────────────────────────────
@@ -54,6 +54,7 @@ pub struct AdaptiveEncoder {
opus: OpusEncoder,
codec2: Codec2Encoder,
active: CodecId,
downsampler: Downsampler48to8,
}
impl AdaptiveEncoder {
@@ -66,6 +67,7 @@ impl AdaptiveEncoder {
opus,
codec2,
active: profile.codec,
downsampler: Downsampler48to8::new(),
})
}
}
@@ -74,7 +76,7 @@ impl AudioEncoder for AdaptiveEncoder {
fn encode(&mut self, pcm: &[i16], out: &mut [u8]) -> Result<usize, CodecError> {
if is_codec2(self.active) {
// Downsample 48 kHz → 8 kHz then encode via Codec2.
let pcm_8k = self.downsampler.process(pcm);
self.codec2.encode(&pcm_8k, out)
} else {
self.opus.encode(pcm, out)
@@ -126,6 +128,7 @@ pub struct AdaptiveDecoder {
opus: OpusDecoder,
codec2: Codec2Decoder,
active: CodecId,
upsampler: Upsampler8to48,
}
impl AdaptiveDecoder {
@@ -138,6 +141,7 @@ impl AdaptiveDecoder {
opus,
codec2,
active: profile.codec,
upsampler: Upsampler8to48::new(),
})
}
}
@@ -149,7 +153,7 @@ impl AudioDecoder for AdaptiveDecoder {
let c2_samples = self.codec2_frame_samples();
let mut buf_8k = vec![0i16; c2_samples];
let n = self.codec2.decode(encoded, &mut buf_8k)?;
let pcm_48k = self.upsampler.process(&buf_8k[..n]);
let out_len = pcm_48k.len().min(pcm.len());
pcm[..out_len].copy_from_slice(&pcm_48k[..out_len]);
Ok(out_len)
@@ -163,7 +167,7 @@ impl AudioDecoder for AdaptiveDecoder {
let c2_samples = self.codec2_frame_samples();
let mut buf_8k = vec![0i16; c2_samples];
let n = self.codec2.decode_lost(&mut buf_8k)?;
let pcm_48k = self.upsampler.process(&buf_8k[..n]);
let out_len = pcm_48k.len().min(pcm.len());
pcm[..out_len].copy_from_slice(&pcm_48k[..out_len]);
Ok(out_len)

crates/wzp-codec/src/aec.rs (new file, 335 lines)

@@ -0,0 +1,335 @@
//! Acoustic Echo Cancellation — delay-compensated leaky NLMS with
//! Geigel double-talk detection.
//!
//! Key insight: on a laptop, the round-trip audio latency (playout → speaker
//! → air → mic → capture) is 30–50 ms. The far-end reference must be delayed
//! by this amount so the adaptive filter models the *echo path*, not the
//! *system delay + echo path*.
//!
//! The leaky coefficient decay prevents the filter from diverging when the
//! echo path changes (e.g. hand near laptop) or when the delay estimate
//! is slightly off.
/// Delay-compensated leaky NLMS echo canceller with Geigel DTD.
pub struct EchoCanceller {
// --- Adaptive filter ---
filter: Vec<f32>,
filter_len: usize,
/// Circular buffer of far-end reference samples (after delay).
far_buf: Vec<f32>,
far_pos: usize,
/// NLMS step size.
mu: f32,
/// Leakage factor: coefficients are multiplied by (1 - leak) each frame.
/// Prevents unbounded growth / divergence. 0.0001 is gentle.
leak: f32,
enabled: bool,
// --- Delay buffer ---
/// Raw far-end samples before delay compensation.
delay_ring: Vec<f32>,
delay_write: usize,
delay_read: usize,
/// Delay in samples (e.g. 1920 = 40ms at 48kHz).
delay_samples: usize,
/// Capacity of the delay ring.
delay_cap: usize,
// --- Double-talk detection (Geigel) ---
/// Peak far-end level over the last filter_len samples.
far_peak: f32,
/// Geigel threshold: if |near| > threshold * far_peak, assume double-talk.
geigel_threshold: f32,
/// Holdover counter: keep DTD active for a few frames after detection.
dtd_holdover: u32,
dtd_hold_frames: u32,
}
impl EchoCanceller {
/// Create a new echo canceller with the default 40 ms far-end delay compensation.
///
/// * `sample_rate` — typically 48000
/// * `filter_ms` — echo-tail length in milliseconds (60ms recommended)
///
/// Use [`with_delay`](Self::with_delay) to set a custom `delay_ms`.
pub fn new(sample_rate: u32, filter_ms: u32) -> Self {
Self::with_delay(sample_rate, filter_ms, 40)
}
pub fn with_delay(sample_rate: u32, filter_ms: u32, delay_ms: u32) -> Self {
let filter_len = (sample_rate as usize) * (filter_ms as usize) / 1000;
let delay_samples = (sample_rate as usize) * (delay_ms as usize) / 1000;
// Delay ring must hold at least delay_samples + one frame (960) of headroom.
let delay_cap = delay_samples + (sample_rate as usize / 10); // +100ms headroom
Self {
filter: vec![0.0; filter_len],
filter_len,
far_buf: vec![0.0; filter_len],
far_pos: 0,
mu: 0.01,
leak: 0.0001,
enabled: true,
delay_ring: vec![0.0; delay_cap],
delay_write: 0,
delay_read: 0,
delay_samples,
delay_cap,
far_peak: 0.0,
geigel_threshold: 0.7,
dtd_holdover: 0,
dtd_hold_frames: 5,
}
}
/// Feed far-end (speaker) samples. These go into the delay buffer first;
/// once enough samples have accumulated, they are released to the filter's
/// circular buffer with the correct delay offset.
pub fn feed_farend(&mut self, farend: &[i16]) {
// Write raw samples into the delay ring.
for &s in farend {
self.delay_ring[self.delay_write % self.delay_cap] = s as f32;
self.delay_write += 1;
}
// Release delayed samples to the filter's far-end buffer.
while self.delay_available() >= 1 {
let sample = self.delay_ring[self.delay_read % self.delay_cap];
self.delay_read += 1;
self.far_buf[self.far_pos] = sample;
self.far_pos = (self.far_pos + 1) % self.filter_len;
// Track peak far-end level for Geigel DTD.
let abs_s = sample.abs();
if abs_s > self.far_peak {
self.far_peak = abs_s;
}
}
// Decay far_peak slowly (avoids stale peak from a loud burst long ago).
self.far_peak *= 0.9995;
}
/// Number of delayed samples available to release.
fn delay_available(&self) -> usize {
let buffered = self.delay_write - self.delay_read;
if buffered > self.delay_samples {
buffered - self.delay_samples
} else {
0
}
}
/// Process a near-end (microphone) frame, removing the estimated echo.
pub fn process_frame(&mut self, nearend: &mut [i16]) -> f32 {
if !self.enabled {
return 1.0;
}
let n = nearend.len();
let fl = self.filter_len;
// --- Geigel double-talk detection ---
// If any near-end sample exceeds threshold * far_peak, assume
// the local speaker is active and freeze adaptation.
let mut is_doubletalk = self.dtd_holdover > 0;
if !is_doubletalk {
let threshold_level = self.geigel_threshold * self.far_peak;
for &s in nearend.iter() {
if (s as f32).abs() > threshold_level && self.far_peak > 100.0 {
is_doubletalk = true;
self.dtd_holdover = self.dtd_hold_frames;
break;
}
}
}
if self.dtd_holdover > 0 {
self.dtd_holdover -= 1;
}
// Check if far-end is active (otherwise nothing to cancel).
let far_active = self.far_peak > 100.0;
// --- Leaky coefficient decay ---
// Applied once per frame for efficiency.
let decay = 1.0 - self.leak;
for c in self.filter.iter_mut() {
*c *= decay;
}
let mut sum_near_sq: f64 = 0.0;
let mut sum_err_sq: f64 = 0.0;
for i in 0..n {
let near_f = nearend[i] as f32;
// Position of far-end "now" for this near-end sample.
let base = (self.far_pos + fl * ((n / fl) + 2) + i - n) % fl;
// --- Echo estimation: dot(filter, far_end_window) ---
let mut echo_est: f32 = 0.0;
let mut power: f32 = 0.0;
for k in 0..fl {
let fe_idx = (base + fl - k) % fl;
let fe = self.far_buf[fe_idx];
echo_est += self.filter[k] * fe;
power += fe * fe;
}
let error = near_f - echo_est;
// --- NLMS adaptation (only when far-end active & no double-talk) ---
if far_active && !is_doubletalk && power > 10.0 {
let step = self.mu * error / (power + 1.0);
for k in 0..fl {
let fe_idx = (base + fl - k) % fl;
self.filter[k] += step * self.far_buf[fe_idx];
}
}
let out = error.clamp(-32768.0, 32767.0);
nearend[i] = out as i16;
sum_near_sq += (near_f as f64).powi(2);
sum_err_sq += (out as f64).powi(2);
}
if sum_err_sq < 1.0 {
100.0
} else {
(sum_near_sq / sum_err_sq).sqrt() as f32
}
}
pub fn set_enabled(&mut self, enabled: bool) {
self.enabled = enabled;
}
pub fn is_enabled(&self) -> bool {
self.enabled
}
pub fn reset(&mut self) {
self.filter.iter_mut().for_each(|c| *c = 0.0);
self.far_buf.iter_mut().for_each(|s| *s = 0.0);
self.far_pos = 0;
self.far_peak = 0.0;
self.delay_ring.iter_mut().for_each(|s| *s = 0.0);
self.delay_write = 0;
self.delay_read = 0;
self.dtd_holdover = 0;
}
}
#[cfg(test)]
mod tests {
use super::*;
#[test]
fn creates_with_correct_sizes() {
let aec = EchoCanceller::with_delay(48000, 60, 40);
assert_eq!(aec.filter_len, 2880); // 60ms @ 48kHz
assert_eq!(aec.delay_samples, 1920); // 40ms @ 48kHz
}
#[test]
fn passthrough_when_disabled() {
let mut aec = EchoCanceller::new(48000, 60);
aec.set_enabled(false);
let original: Vec<i16> = (0..960).map(|i| (i * 10) as i16).collect();
let mut frame = original.clone();
aec.process_frame(&mut frame);
assert_eq!(frame, original);
}
#[test]
fn silence_passthrough() {
let mut aec = EchoCanceller::with_delay(48000, 30, 0);
aec.feed_farend(&vec![0i16; 960]);
let mut frame = vec![0i16; 960];
aec.process_frame(&mut frame);
assert!(frame.iter().all(|&s| s == 0));
}
#[test]
fn reduces_echo_with_no_delay() {
// Simulate: far-end plays, echo arrives at mic attenuated by ~50%
// (realistic — speaker to mic on laptop loses volume).
let mut aec = EchoCanceller::with_delay(48000, 10, 0);
let frame_len = 480;
let make_tone = |offset: usize| -> Vec<i16> {
(0..frame_len)
.map(|i| {
let t = (offset + i) as f64 / 48000.0;
(5000.0 * (2.0 * std::f64::consts::PI * 300.0 * t).sin()) as i16
})
.collect()
};
let mut last_erle = 1.0f32;
for frame_idx in 0..100 {
let farend = make_tone(frame_idx * frame_len);
aec.feed_farend(&farend);
// Near-end = attenuated copy of far-end (echo at ~50% volume).
let mut nearend: Vec<i16> = farend.iter().map(|&s| s / 2).collect();
last_erle = aec.process_frame(&mut nearend);
}
assert!(
last_erle > 1.0,
"expected ERLE > 1.0 after adaptation, got {last_erle}"
);
}
#[test]
fn preserves_nearend_during_doubletalk() {
let mut aec = EchoCanceller::with_delay(48000, 30, 0);
let frame_len = 960;
let nearend: Vec<i16> = (0..frame_len)
.map(|i| {
let t = i as f64 / 48000.0;
(10000.0 * (2.0 * std::f64::consts::PI * 440.0 * t).sin()) as i16
})
.collect();
// Feed silence as far-end (no echo source).
aec.feed_farend(&vec![0i16; frame_len]);
let mut frame = nearend.clone();
aec.process_frame(&mut frame);
let input_energy: f64 = nearend.iter().map(|&s| (s as f64).powi(2)).sum();
let output_energy: f64 = frame.iter().map(|&s| (s as f64).powi(2)).sum();
let ratio = output_energy / input_energy;
assert!(
ratio > 0.8,
"near-end speech should be preserved, energy ratio = {ratio:.3}"
);
}
#[test]
fn delay_buffer_holds_samples() {
let mut aec = EchoCanceller::with_delay(48000, 10, 20);
// 20ms delay = 960 samples @ 48kHz.
// After feeding, feed_farend auto-drains available samples to far_buf.
// So delay_available() is always 0 after feed_farend returns.
// Instead, verify far_pos advances only after the delay is filled.
// Feed 960 samples (= delay amount). No samples released yet.
aec.feed_farend(&vec![1i16; 960]);
// far_buf should still be all zeros (nothing released).
assert!(aec.far_buf.iter().all(|&s| s == 0.0), "nothing should be released yet");
// Feed 480 more. 480 should be released to far_buf.
aec.feed_farend(&vec![2i16; 480]);
let non_zero = aec.far_buf.iter().filter(|&&s| s != 0.0).count();
assert!(non_zero > 0, "samples should have been released to far_buf");
}
}
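The release rule exercised by `delay_buffer_holds_samples` reduces to a small piece of arithmetic. This standalone sketch reimplements `delay_available` as a free function, purely for illustration, to show why the first `delay_samples` fed samples are held back:

```rust
// Mirror of the delay-ring bookkeeping: samples become visible to the
// adaptive filter only once more than `delay_samples` of them have been
// written (write/read are monotonically increasing counters).
fn delay_available(write: usize, read: usize, delay_samples: usize) -> usize {
    let buffered = write - read;
    buffered.saturating_sub(delay_samples)
}

fn main() {
    let delay = 960; // 20 ms at 48 kHz, as in the test above
    // Feeding exactly the delay amount releases nothing.
    assert_eq!(delay_available(960, 0, delay), 0);
    // Feeding 480 more releases exactly those 480 samples.
    assert_eq!(delay_available(1440, 0, delay), 480);
    println!("ok");
}
```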

crates/wzp-codec/src/agc.rs (new file, 219 lines)

@@ -0,0 +1,219 @@
//! Automatic Gain Control (AGC) with two-stage smoothing.
//!
//! Uses a fast attack / slow release envelope follower to keep the
//! output signal near a configurable target RMS level. This prevents
//! both clipping (when the speaker is too loud) and inaudibility (when
//! the speaker is too quiet or far from the mic).
/// Two-stage automatic gain control.
///
/// The gain is adjusted per-frame based on the measured RMS energy,
/// with a fast attack (gain decreases quickly when signal gets louder)
/// and a slow release (gain increases gradually when signal gets quieter).
pub struct AutoGainControl {
target_rms: f64,
current_gain: f64,
min_gain: f64,
max_gain: f64,
attack_alpha: f64,
release_alpha: f64,
enabled: bool,
}
impl AutoGainControl {
/// Create a new AGC with sensible VoIP defaults.
pub fn new() -> Self {
Self {
target_rms: 3000.0, // ~-20 dBFS for i16
current_gain: 1.0,
min_gain: 0.5,
max_gain: 32.0,
attack_alpha: 0.3, // fast attack
release_alpha: 0.02, // slow release
enabled: true,
}
}
/// Process a frame of PCM audio in-place, applying gain adjustment.
pub fn process_frame(&mut self, pcm: &mut [i16]) {
if !self.enabled {
return;
}
// Compute RMS of the frame.
let rms = Self::compute_rms(pcm);
// Don't amplify near-silence — it would just boost noise.
if rms < 10.0 {
return;
}
// Desired instantaneous gain.
let desired_gain = (self.target_rms / rms).clamp(self.min_gain, self.max_gain);
// Smooth the gain transition.
let alpha = if desired_gain < self.current_gain {
// Signal is louder than target → reduce gain quickly (attack).
self.attack_alpha
} else {
// Signal is quieter than target → raise gain slowly (release).
self.release_alpha
};
self.current_gain = self.current_gain * (1.0 - alpha) + desired_gain * alpha;
// Apply gain to each sample with hard limiting at ±31000 (~0.946 * i16::MAX).
const LIMIT: f64 = 31000.0;
let gain = self.current_gain;
for sample in pcm.iter_mut() {
let amplified = (*sample as f64) * gain;
let clamped = amplified.clamp(-LIMIT, LIMIT);
*sample = clamped as i16;
}
}
/// Enable or disable the AGC.
pub fn set_enabled(&mut self, enabled: bool) {
self.enabled = enabled;
}
/// Returns whether the AGC is currently enabled.
pub fn is_enabled(&self) -> bool {
self.enabled
}
/// Current gain expressed in dB.
pub fn current_gain_db(&self) -> f64 {
20.0 * self.current_gain.log10()
}
/// Compute the RMS (root mean square) of a PCM buffer.
fn compute_rms(pcm: &[i16]) -> f64 {
if pcm.is_empty() {
return 0.0;
}
let sum_sq: f64 = pcm.iter().map(|&s| (s as f64) * (s as f64)).sum();
(sum_sq / pcm.len() as f64).sqrt()
}
}
impl Default for AutoGainControl {
fn default() -> Self {
Self::new()
}
}
#[cfg(test)]
mod tests {
use super::*;
#[test]
fn agc_creates_with_defaults() {
let agc = AutoGainControl::new();
assert!(agc.is_enabled());
assert!((agc.current_gain - 1.0).abs() < f64::EPSILON);
}
#[test]
fn agc_passthrough_when_disabled() {
let mut agc = AutoGainControl::new();
agc.set_enabled(false);
let original: Vec<i16> = (0..960).map(|i| (i * 5) as i16).collect();
let mut frame = original.clone();
agc.process_frame(&mut frame);
assert_eq!(frame, original);
}
#[test]
fn agc_does_not_amplify_silence() {
let mut agc = AutoGainControl::new();
let mut frame = vec![0i16; 960];
agc.process_frame(&mut frame);
assert!(frame.iter().all(|&s| s == 0));
// Gain should remain at initial value.
assert!((agc.current_gain - 1.0).abs() < f64::EPSILON);
}
#[test]
fn agc_amplifies_quiet_signal() {
let mut agc = AutoGainControl::new();
// Very quiet signal (RMS ~ 50).
let mut frame: Vec<i16> = (0..960)
.map(|i| {
let t = i as f64 / 48000.0;
(50.0 * (2.0 * std::f64::consts::PI * 440.0 * t).sin()) as i16
})
.collect();
// Process several frames to let the gain ramp up.
for _ in 0..50 {
let mut f = frame.clone();
agc.process_frame(&mut f);
frame = f;
}
// Gain should have increased past 1.0.
assert!(
agc.current_gain > 1.05,
"expected gain > 1.05 for quiet signal, got {}",
agc.current_gain
);
}
#[test]
fn agc_attenuates_loud_signal() {
let mut agc = AutoGainControl::new();
// Loud signal (RMS ~ 20000).
let frame: Vec<i16> = (0..960)
.map(|i| {
let t = i as f64 / 48000.0;
(28000.0 * (2.0 * std::f64::consts::PI * 440.0 * t).sin()) as i16
})
.collect();
// Process several frames.
for _ in 0..20 {
let mut f = frame.clone();
agc.process_frame(&mut f);
}
// Gain should have decreased below 1.0.
assert!(
agc.current_gain < 1.0,
"expected gain < 1.0 for loud signal, got {}",
agc.current_gain
);
}
#[test]
fn agc_output_within_limits() {
let mut agc = AutoGainControl::new();
// Force a high gain by processing many quiet frames first.
for _ in 0..100 {
let mut f: Vec<i16> = vec![100; 960];
agc.process_frame(&mut f);
}
// Now send a louder frame — output should still be within ±31000.
let mut frame: Vec<i16> = vec![20000; 960];
agc.process_frame(&mut frame);
assert!(
frame.iter().all(|&s| s.abs() <= 31000),
"output samples must be within ±31000"
);
}
#[test]
fn agc_gain_db_at_unity() {
let agc = AutoGainControl::new();
let db = agc.current_gain_db();
assert!(
db.abs() < 0.01,
"expected ~0 dB at unity gain, got {db}"
);
}
}
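The attack/release smoothing at the heart of `process_frame` can be checked in isolation. The sketch below reimplements just the gain update with the defaults from `AutoGainControl::new()`; it is a standalone illustration, not part of the crate's API:

```rust
// One gain-update step: the desired gain chases the target RMS, and the
// smoothing alpha depends on direction (fast attack, slow release).
fn next_gain(current: f64, rms: f64) -> f64 {
    let (target_rms, min_gain, max_gain) = (3000.0, 0.5, 32.0);
    let desired = (target_rms / rms).clamp(min_gain, max_gain);
    let alpha = if desired < current { 0.3 } else { 0.02 };
    current * (1.0 - alpha) + desired * alpha
}

fn main() {
    // Loud frame (RMS 20000): desired gain clamps to 0.5; attack moves fast.
    let g = next_gain(1.0, 20000.0);
    assert!((g - 0.85).abs() < 1e-9); // 1.0 * 0.7 + 0.5 * 0.3
    // Quiet frame (RMS 50): desired gain clamps to 32.0; release moves slowly.
    let g2 = next_gain(1.0, 50.0);
    assert!((g2 - 1.62).abs() < 1e-9); // 1.0 * 0.98 + 32.0 * 0.02
    println!("ok");
}
```

The asymmetry is what the `agc_amplifies_quiet_signal` test relies on: dozens of frames are needed before the 0.02 release alpha ramps the gain past 1.05.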


@@ -10,6 +10,8 @@
//! trait-object encoders/decoders that handle adaptive switching internally.
pub mod adaptive;
pub mod aec;
pub mod agc;
pub mod codec2_dec;
pub mod codec2_enc;
pub mod denoise;
@@ -19,6 +21,8 @@ pub mod resample;
pub mod silence;
pub use adaptive::{AdaptiveDecoder, AdaptiveEncoder};
pub use aec::EchoCanceller;
pub use agc::AutoGainControl;
pub use denoise::NoiseSupressor;
pub use silence::{ComfortNoise, SilenceDetector};
pub use wzp_proto::{AudioDecoder, AudioEncoder, CodecId, QualityProfile};


@@ -40,6 +40,11 @@ impl OpusEncoder {
.set_signal(Signal::Voice)
.map_err(|e| CodecError::EncodeFailed(format!("set signal: {e}")))?;
// Default complexity 7 — good quality/CPU trade-off for VoIP
enc.inner
.set_complexity(7)
.map_err(|e| CodecError::EncodeFailed(format!("set complexity: {e}")))?;
Ok(enc)
}
@@ -56,6 +61,21 @@ impl OpusEncoder {
pub fn frame_samples(&self) -> usize {
(48_000 * self.frame_duration_ms as usize) / 1000
}
/// Set the encoder complexity (0-10). Higher values produce better quality
/// at the cost of more CPU. Default is 7.
pub fn set_complexity(&mut self, complexity: i32) {
let c = complexity.clamp(0, 10) as u8;
let _ = self.inner.set_complexity(c);
}
/// Hint the encoder about expected packet loss percentage (0-100).
///
/// Higher values cause the encoder to use more redundancy to survive
/// packet loss, at the expense of slightly higher bitrate.
pub fn set_expected_loss(&mut self, loss_pct: u8) {
let _ = self.inner.set_packet_loss_perc(loss_pct.min(100));
}
}
impl AudioEncoder for OpusEncoder {


@@ -1,55 +1,258 @@
//! Windowed-sinc FIR resampler for 48 kHz <-> 8 kHz conversion.
//!
//! Provides both stateless free functions (backward-compatible) and stateful
//! `Downsampler48to8` / `Upsampler8to48` structs that maintain overlap history
//! between frames for glitch-free streaming.

use std::f64::consts::PI;

// ─── FIR kernel parameters ─────────────────────────────────────────────────

/// Number of FIR taps in the anti-alias / interpolation filter.
const FIR_TAPS: usize = 48;

/// Kaiser window beta parameter — controls sidelobe attenuation.
const KAISER_BETA: f64 = 8.0;

/// Cutoff frequency in Hz for the low-pass filter (just below 4 kHz Nyquist of 8 kHz).
const CUTOFF_HZ: f64 = 3800.0;

/// Working sample rate in Hz.
const SAMPLE_RATE: f64 = 48000.0;

/// Decimation / interpolation ratio between 48 kHz and 8 kHz.
const RATIO: usize = 6;

// ─── Kaiser window helpers ─────────────────────────────────────────────────

/// Zeroth-order modified Bessel function of the first kind, I₀(x).
///
/// Computed via the well-known power-series expansion, converging rapidly
/// for the moderate values of x used in Kaiser window design.
fn bessel_i0(x: f64) -> f64 {
    let mut sum = 1.0f64;
    let mut term = 1.0f64;
    let half_x = x / 2.0;
    for k in 1..=25 {
        term *= (half_x / k as f64) * (half_x / k as f64);
        sum += term;
        if term < 1e-12 * sum {
            break;
        }
    }
    sum
}

/// Build a windowed-sinc low-pass FIR kernel.
///
/// Returns `FIR_TAPS` coefficients normalised so that the DC gain is exactly 1.0.
fn build_fir_kernel() -> [f64; FIR_TAPS] {
    let mut kernel = [0.0f64; FIR_TAPS];
    let m = (FIR_TAPS - 1) as f64;
    let fc = CUTOFF_HZ / SAMPLE_RATE; // normalised cutoff (0..0.5)
    let beta_denom = bessel_i0(KAISER_BETA);

    for i in 0..FIR_TAPS {
        // Sinc
        let n = i as f64 - m / 2.0;
        let sinc = if n.abs() < 1e-12 {
            2.0 * fc
        } else {
            (2.0 * PI * fc * n).sin() / (PI * n)
        };
        // Kaiser window
        let t = 2.0 * i as f64 / m - 1.0; // range [-1, 1]
        let kaiser = bessel_i0(KAISER_BETA * (1.0 - t * t).max(0.0).sqrt()) / beta_denom;
        kernel[i] = sinc * kaiser;
    }

    // Normalise to unity DC gain.
    let sum: f64 = kernel.iter().sum();
    if sum.abs() > 1e-15 {
        for k in kernel.iter_mut() {
            *k /= sum;
        }
    }
    kernel
}
// ─── Stateful Downsampler 48→8 ─────────────────────────────────────────────
/// Stateful polyphase FIR downsampler from 48 kHz to 8 kHz.
///
/// Maintains `FIR_TAPS - 1` samples of history between successive calls to
/// `process()` for seamless frame boundaries.
pub struct Downsampler48to8 {
kernel: [f64; FIR_TAPS],
history: Vec<f64>,
}
impl Downsampler48to8 {
pub fn new() -> Self {
Self {
kernel: build_fir_kernel(),
history: vec![0.0; FIR_TAPS - 1],
}
}
/// Downsample a block of 48 kHz samples to 8 kHz.
///
/// The input length should be a multiple of 6; any trailing samples that
/// don't form a complete output sample are consumed into the history.
pub fn process(&mut self, input: &[i16]) -> Vec<i16> {
let hist_len = self.history.len(); // FIR_TAPS - 1
let total_len = hist_len + input.len();
// Build a working buffer: history ++ input (as f64).
let mut work = Vec::with_capacity(total_len);
work.extend_from_slice(&self.history);
work.extend(input.iter().map(|&s| s as f64));
let out_len = input.len() / RATIO;
let mut output = Vec::with_capacity(out_len);
for i in 0..out_len {
// The centre of the filter for output sample i sits at
// position hist_len + i*RATIO in the work buffer (aligning
// with the first new input sample at decimation phase 0).
let centre = hist_len + i * RATIO;
let start = centre + 1 - FIR_TAPS; // may be 0 for the first few
let mut acc = 0.0f64;
for k in 0..FIR_TAPS {
let idx = start + k;
if idx < work.len() {
acc += work[idx] * self.kernel[k];
}
}
output.push(acc.round().clamp(-32768.0, 32767.0) as i16);
}
// Update history: keep the last (FIR_TAPS - 1) samples from work.
if work.len() >= hist_len {
self.history
.copy_from_slice(&work[work.len() - hist_len..]);
} else {
// Input was shorter than history — shift.
let shift = hist_len - work.len();
self.history.copy_within(shift.., 0);
for (i, &v) in work.iter().enumerate() {
self.history[hist_len - work.len() + i] = v;
}
}
output
}
}
impl Default for Downsampler48to8 {
fn default() -> Self {
Self::new()
}
}
// ─── Stateful Upsampler 8→48 ───────────────────────────────────────────────
/// Stateful FIR upsampler from 8 kHz to 48 kHz.
///
/// Inserts zeros between input samples (zero-stuffing), then applies the
/// low-pass FIR to remove imaging, with gain compensation of `RATIO`.
pub struct Upsampler8to48 {
kernel: [f64; FIR_TAPS],
history: Vec<f64>,
}
impl Upsampler8to48 {
pub fn new() -> Self {
Self {
kernel: build_fir_kernel(),
history: vec![0.0; FIR_TAPS - 1],
}
}
/// Upsample a block of 8 kHz samples to 48 kHz.
pub fn process(&mut self, input: &[i16]) -> Vec<i16> {
let hist_len = self.history.len(); // FIR_TAPS - 1
// Zero-stuff: insert RATIO-1 zeros between each input sample.
let stuffed_len = input.len() * RATIO;
let total_len = hist_len + stuffed_len;
let mut work = Vec::with_capacity(total_len);
work.extend_from_slice(&self.history);
for &s in input {
work.push(s as f64);
for _ in 1..RATIO {
work.push(0.0);
}
}
let out_len = stuffed_len;
let mut output = Vec::with_capacity(out_len);
// The gain factor compensates for the zeros introduced by stuffing.
let gain = RATIO as f64;
for i in 0..out_len {
let centre = hist_len + i;
let start = centre + 1 - FIR_TAPS;
let mut acc = 0.0f64;
for k in 0..FIR_TAPS {
let idx = start + k;
if idx < work.len() {
acc += work[idx] * self.kernel[k];
}
}
acc *= gain;
output.push(acc.round().clamp(-32768.0, 32767.0) as i16);
}
// Update history.
if work.len() >= hist_len {
self.history
.copy_from_slice(&work[work.len() - hist_len..]);
} else {
let shift = hist_len - work.len();
self.history.copy_within(shift.., 0);
for (i, &v) in work.iter().enumerate() {
self.history[hist_len - work.len() + i] = v;
}
}
output
}
}
impl Default for Upsampler8to48 {
fn default() -> Self {
Self::new()
}
}
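The `gain = RATIO` factor in the upsampler compensates for the energy removed by zero-stuffing: inserting five zeros after every sample divides the signal mean (and hence the DC level after low-pass filtering) by six. A minimal self-contained demonstration of that effect:

```rust
// Zero-stuffing by `ratio` divides the signal mean by `ratio`; the FIR
// upsampler therefore multiplies by RATIO after filtering to restore level.
fn stuffed_mean(input: &[i64], ratio: usize) -> i64 {
    let mut stuffed: Vec<i64> = Vec::new();
    for &s in input {
        stuffed.push(s);
        stuffed.extend(std::iter::repeat(0).take(ratio - 1));
    }
    stuffed.iter().sum::<i64>() / stuffed.len() as i64
}

fn main() {
    let input = vec![600i64; 100];
    // Mean drops from 600 to 100 after 1:6 zero-stuffing...
    assert_eq!(stuffed_mean(&input, 6), 100);
    // ...so a gain of RATIO = 6 restores the original level.
    assert_eq!(stuffed_mean(&input, 6) * 6, 600);
}
```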
// ─── Backward-compatible free functions ─────────────────────────────────────
/// Downsample from 48 kHz to 8 kHz (6:1 decimation with FIR anti-alias filter).
///
/// This is a convenience wrapper that creates a temporary [`Downsampler48to8`].
/// For streaming use, prefer the stateful struct to avoid edge artefacts between
/// frames.
pub fn resample_48k_to_8k(input: &[i16]) -> Vec<i16> {
let mut ds = Downsampler48to8::new();
ds.process(input)
}
/// Upsample from 8 kHz to 48 kHz (1:6 interpolation with FIR imaging filter).
///
/// This is a convenience wrapper that creates a temporary [`Upsampler8to48`].
/// For streaming use, prefer the stateful struct to avoid edge artefacts between
/// frames.
pub fn resample_8k_to_48k(input: &[i16]) -> Vec<i16> {
let mut us = Upsampler8to48::new();
us.process(input)
}
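The doc comments above recommend the stateful structs for streaming because they carry `FIR_TAPS - 1` samples of history across calls. The bookkeeping can be illustrated with a much smaller 4-tap moving average (a simplified stand-in, not the crate's filter): processing one long buffer or the same data in two halves yields bit-identical output.

```rust
// Minimal sketch of the overlap-history pattern used by the stateful
// resamplers: carry (taps - 1) samples between calls so frame boundaries
// are seamless.
struct Smoother {
    history: Vec<f64>, // last 3 samples from the previous call
}

impl Smoother {
    fn new() -> Self {
        Self { history: vec![0.0; 3] }
    }

    fn process(&mut self, input: &[f64]) -> Vec<f64> {
        // Working buffer: history ++ input, exactly like the resamplers.
        let mut work = self.history.clone();
        work.extend_from_slice(input);
        let out: Vec<f64> = work.windows(4).map(|w| w.iter().sum::<f64>() / 4.0).collect();
        let n = work.len();
        self.history = work[n - 3..].to_vec();
        out
    }
}

fn main() {
    let data: Vec<f64> = (0..12).map(|i| i as f64).collect();
    // One call over the whole buffer...
    let whole = Smoother::new().process(&data);
    // ...equals two calls over the halves, thanks to the carried history.
    let mut s = Smoother::new();
    let mut split = s.process(&data[..6]);
    split.extend(s.process(&data[6..]));
    assert_eq!(whole, split);
}
```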
// ─── Tests ──────────────────────────────────────────────────────────────────
#[cfg(test)]
mod tests {
    use super::*;
@@ -66,12 +269,28 @@ mod tests {
    #[test]
    fn dc_signal_preserved() {
        // A constant signal should survive resampling (approximately).
        let input = vec![1000i16; 960];
        let down = resample_48k_to_8k(&input);
        // Allow some edge transient — check that the middle samples are close.
        let mid_start = down.len() / 4;
        let mid_end = 3 * down.len() / 4;
        for &s in &down[mid_start..mid_end] {
            assert!(
                (s - 1000).abs() < 50,
                "DC downsampled sample {s} too far from 1000"
            );
        }

        let up = resample_8k_to_48k(&down);
        let mid_start_up = up.len() / 4;
        let mid_end_up = 3 * up.len() / 4;
        for &s in &up[mid_start_up..mid_end_up] {
            assert!(
                (s - 1000).abs() < 100,
                "DC upsampled sample {s} too far from 1000"
            );
        }
    }
    #[test]
@@ -79,4 +298,40 @@ mod tests {
        assert!(resample_48k_to_8k(&[]).is_empty());
        assert!(resample_8k_to_48k(&[]).is_empty());
    }
#[test]
fn stateful_downsampler_produces_correct_length() {
let mut ds = Downsampler48to8::new();
let out = ds.process(&vec![0i16; 960]);
assert_eq!(out.len(), 160);
let out2 = ds.process(&vec![0i16; 960]);
assert_eq!(out2.len(), 160);
}
#[test]
fn stateful_upsampler_produces_correct_length() {
let mut us = Upsampler8to48::new();
let out = us.process(&vec![0i16; 160]);
assert_eq!(out.len(), 960);
let out2 = us.process(&vec![0i16; 160]);
assert_eq!(out2.len(), 960);
}
#[test]
fn fir_kernel_has_unity_dc_gain() {
let kernel = build_fir_kernel();
let sum: f64 = kernel.iter().sum();
assert!(
(sum - 1.0).abs() < 1e-10,
"FIR kernel DC gain should be 1.0, got {sum}"
);
}
#[test]
fn bessel_i0_known_values() {
// I₀(0) = 1
assert!((bessel_i0(0.0) - 1.0).abs() < 1e-12);
// I₀(1) ≈ 1.2660658
assert!((bessel_i0(1.0) - 1.2660658).abs() < 1e-5);
}
}

View File

@@ -1,4 +1,5 @@
use std::collections::BTreeMap;
use std::time::{Duration, Instant};
use crate::packet::MediaPacket;
@@ -20,19 +21,29 @@ pub struct AdaptivePlayoutDelay {
    max_delay: usize,
    /// Exponential moving average of inter-packet arrival jitter (ms).
    jitter_ema: f64,
    /// EMA smoothing factor for jitter increases (fast reaction).
    alpha_up: f64,
    /// EMA smoothing factor for jitter decreases (slow decay).
    alpha_down: f64,
    /// Last packet arrival timestamp (for computing inter-arrival jitter).
    last_arrival_ms: Option<u64>,
    /// Last packet expected timestamp.
    last_expected_ms: Option<u64>,
    /// Safety margin added to jitter-derived target (in packets).
    safety_margin: f64,
    /// Instant when a jitter spike was detected (handoff detection).
    spike_detected_at: Option<Instant>,
    /// Duration to hold max_delay after a spike is detected.
    spike_cooldown: Duration,
    /// Multiplier of jitter_ema that constitutes a spike.
    spike_threshold_multiplier: f64,
}

/// Frame duration in milliseconds (20ms Opus/Codec2 frames).
const FRAME_DURATION_MS: f64 = 20.0;
/// Default safety margin in packets.
const DEFAULT_SAFETY_MARGIN: f64 = 2.0;
/// Default EMA smoothing factor (used for both up/down in non-mobile mode).
const DEFAULT_ALPHA: f64 = 0.05;
impl AdaptivePlayoutDelay {
@@ -46,9 +57,14 @@ impl AdaptivePlayoutDelay {
            min_delay,
            max_delay,
            jitter_ema: 0.0,
            alpha_up: DEFAULT_ALPHA,
            alpha_down: DEFAULT_ALPHA,
            last_arrival_ms: None,
            last_expected_ms: None,
            safety_margin: DEFAULT_SAFETY_MARGIN,
            spike_detected_at: None,
            spike_cooldown: Duration::from_secs(2),
            spike_threshold_multiplier: 3.0,
        }
    }
@@ -64,13 +80,38 @@ impl AdaptivePlayoutDelay {
            let expected_delta = expected_ms as f64 - last_expected as f64;
            let jitter = (actual_delta - expected_delta).abs();

            // Spike detection: check before EMA update
            if self.jitter_ema > 0.0
                && jitter > self.jitter_ema * self.spike_threshold_multiplier
            {
                self.spike_detected_at = Some(Instant::now());
            }

            // Asymmetric EMA update
            let alpha = if jitter > self.jitter_ema {
                self.alpha_up
            } else {
                self.alpha_down
            };
            self.jitter_ema = alpha * jitter + (1.0 - alpha) * self.jitter_ema;

            // Check if spike cooldown has expired
            if let Some(spike_time) = self.spike_detected_at {
                if spike_time.elapsed() >= self.spike_cooldown {
                    self.spike_detected_at = None;
                }
            }

            // If within spike cooldown, return max_delay
            if self.spike_detected_at.is_some() {
                self.target_delay = self.max_delay;
            } else {
                // Convert jitter estimate to target delay in packets
                let raw_target =
                    (self.jitter_ema / FRAME_DURATION_MS).ceil() + self.safety_margin;
                self.target_delay =
                    (raw_target as usize).clamp(self.min_delay, self.max_delay);
            }
        }
        self.last_arrival_ms = Some(arrival_ms);
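Outside the spike cooldown, the hunk above derives the target depth as ceil(jitter EMA / frame duration) plus the safety margin, clamped to the configured bounds. A standalone sketch of that conversion:

```rust
// Sketch of the jitter-to-depth conversion used outside the spike window.
fn target_packets(jitter_ema_ms: f64, frame_ms: f64, margin: f64, min: usize, max: usize) -> usize {
    let raw = (jitter_ema_ms / frame_ms).ceil() + margin;
    (raw as usize).clamp(min, max)
}

fn main() {
    // No jitter: the floor is the minimum depth.
    assert_eq!(target_packets(0.0, 20.0, 2.0, 3, 50), 3);
    // 85 ms jitter over 20 ms frames: ceil(4.25) + 2 = 7 packets.
    assert_eq!(target_packets(85.0, 20.0, 2.0, 3, 50), 7);
    // Extreme jitter is capped at the maximum depth.
    assert_eq!(target_packets(5000.0, 20.0, 2.0, 3, 50), 50);
}
```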
@@ -87,6 +128,28 @@ impl AdaptivePlayoutDelay {
    pub fn jitter_estimate_ms(&self) -> f64 {
        self.jitter_ema
    }
/// Enable or disable mobile mode, adjusting parameters for cellular networks.
///
/// Mobile mode uses:
/// - Asymmetric alpha (fast up=0.3, slow down=0.02) for quicker spike detection
/// - Higher safety margin (3.0 packets) to absorb handoff jitter
/// - Spike detection with 2-second cooldown at 3x threshold
pub fn set_mobile_mode(&mut self, enabled: bool) {
if enabled {
self.safety_margin = 3.0;
self.alpha_up = 0.3;
self.alpha_down = 0.02;
self.spike_threshold_multiplier = 3.0;
self.spike_cooldown = Duration::from_secs(2);
} else {
self.safety_margin = DEFAULT_SAFETY_MARGIN;
self.alpha_up = DEFAULT_ALPHA;
self.alpha_down = DEFAULT_ALPHA;
self.spike_threshold_multiplier = 3.0;
self.spike_cooldown = Duration::from_secs(2);
}
}
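The asymmetric alpha pair documented above means one jitter spike raises the estimate quickly while recovery is gradual. A self-contained sketch of a single EMA step under the mobile-mode constants (0.3 up, 0.02 down):

```rust
// One step of the asymmetric EMA used by AdaptivePlayoutDelay: fast alpha
// when the sample exceeds the estimate, slow alpha when it falls below.
fn ema_step(ema: f64, sample: f64, alpha_up: f64, alpha_down: f64) -> f64 {
    let a = if sample > ema { alpha_up } else { alpha_down };
    a * sample + (1.0 - a) * ema
}

fn main() {
    // Mobile-mode constants: 0.3 up, 0.02 down.
    let mut ema = 10.0;
    ema = ema_step(ema, 100.0, 0.3, 0.02); // spike: jumps straight to 37.0
    assert!((ema - 37.0).abs() < 1e-9);
    ema = ema_step(ema, 10.0, 0.3, 0.02); // recovery: decays only slightly
    assert!(ema > 36.0);
}
```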
}

// ---------------------------------------------------------------------------
@@ -391,6 +454,11 @@ impl JitterBuffer {
        self.adaptive.as_ref()
    }
/// Get a mutable reference to the adaptive playout delay estimator.
pub fn adaptive_delay_mut(&mut self) -> Option<&mut AdaptivePlayoutDelay> {
self.adaptive.as_mut()
}
    /// Adjust target depth based on observed jitter.
    pub fn set_target_depth(&mut self, depth: usize) {
        self.target_depth = depth.min(self.max_depth);
@@ -720,4 +788,29 @@ mod tests {
        let ad = jb.adaptive_delay().unwrap();
        assert_eq!(ad.target_delay(), 3);
    }
// ---------------------------------------------------------------
// Mobile mode tests
// ---------------------------------------------------------------
#[test]
fn mobile_mode_increases_safety_margin() {
let mut apd = AdaptivePlayoutDelay::new(3, 50);
apd.set_mobile_mode(true);
assert_eq!(apd.safety_margin, 3.0);
assert_eq!(apd.alpha_up, 0.3);
assert_eq!(apd.alpha_down, 0.02);
apd.set_mobile_mode(false);
assert_eq!(apd.safety_margin, DEFAULT_SAFETY_MARGIN);
assert_eq!(apd.alpha_up, DEFAULT_ALPHA);
assert_eq!(apd.alpha_down, DEFAULT_ALPHA);
}
#[test]
fn mobile_mode_accessible_via_jitter_buffer() {
let mut jb = JitterBuffer::new_adaptive(3, 50);
jb.adaptive_delay_mut().unwrap().set_mobile_mode(true);
assert_eq!(jb.adaptive_delay().unwrap().safety_margin, 3.0);
}
}

View File

@@ -26,9 +26,9 @@ pub use codec_id::{CodecId, QualityProfile};
pub use error::*;
pub use packet::{
    HangupReason, MediaHeader, MediaPacket, MiniFrameContext, MiniHeader, QualityReport,
    RoomParticipant, SignalMessage, TrunkEntry, TrunkFrame, FRAME_TYPE_FULL, FRAME_TYPE_MINI,
};
pub use bandwidth::{BandwidthEstimator, CongestionState};
pub use quality::{AdaptiveQualityController, NetworkContext, Tier};
pub use session::{Session, SessionEvent, SessionState};
pub use traits::*;

View File

@@ -548,6 +548,9 @@ pub enum SignalMessage {
        signature: Vec<u8>,
        /// Supported quality profiles.
        supported_profiles: Vec<crate::QualityProfile>,
/// Optional display name set by the caller.
#[serde(default)]
alias: Option<String>,
    },

    /// Call acceptance (analogous to Warzone's WireMessage::CallAnswer).
@@ -645,6 +648,28 @@ pub enum SignalMessage {
        session_id: String,
        room_name: String,
    },
/// Room membership update — sent by relay to all participants when someone joins or leaves.
RoomUpdate {
/// Current participant count.
count: u32,
/// List of participants currently in the room.
participants: Vec<RoomParticipant>,
},
/// Set or update the client's display name.
/// Sent by client after joining; relay updates the participant entry and
/// re-broadcasts a RoomUpdate to all participants.
SetAlias { alias: String },
}
/// A participant entry in a RoomUpdate message.
#[derive(Clone, Debug, Serialize, Deserialize)]
pub struct RoomParticipant {
/// Identity fingerprint (hex string, stable across reconnects if seed is persisted).
pub fingerprint: String,
/// Optional display name set by the client.
pub alias: Option<String>,
}

/// Reasons for ending a call.

View File

@@ -1,4 +1,5 @@
use std::collections::VecDeque;
use std::time::{Duration, Instant};
use crate::packet::QualityReport;
use crate::traits::QualityController;
@@ -24,24 +25,71 @@ impl Tier {
        }
    }
    /// Determine which tier a quality report belongs to (default/WiFi thresholds).
    pub fn classify(report: &QualityReport) -> Self {
        Self::classify_with_context(report, NetworkContext::Unknown)
    }

    /// Classify with network-context-aware thresholds.
    pub fn classify_with_context(report: &QualityReport, context: NetworkContext) -> Self {
        let loss = report.loss_percent();
        let rtt = report.rtt_ms();
        match context {
            NetworkContext::CellularLte
            | NetworkContext::Cellular5g
            | NetworkContext::Cellular3g => {
                // Tighter thresholds for cellular networks
                if loss > 25.0 || rtt > 500 {
                    Self::Catastrophic
                } else if loss > 8.0 || rtt > 300 {
                    Self::Degraded
                } else {
                    Self::Good
                }
            }
            NetworkContext::WiFi | NetworkContext::Unknown => {
                // Original thresholds
                if loss > 40.0 || rtt > 600 {
                    Self::Catastrophic
                } else if loss > 10.0 || rtt > 400 {
                    Self::Degraded
                } else {
                    Self::Good
                }
            }
        }
    }

    /// Return the next lower (worse) tier, or None if already at the worst.
    pub fn downgrade(self) -> Option<Tier> {
        match self {
            Self::Good => Some(Self::Degraded),
            Self::Degraded => Some(Self::Catastrophic),
            Self::Catastrophic => None,
        }
    }
}

/// Describes the network transport type for context-aware quality decisions.
#[derive(Clone, Copy, Debug, PartialEq, Eq)]
pub enum NetworkContext {
    WiFi,
    CellularLte,
    Cellular5g,
    Cellular3g,
    Unknown,
}

impl Default for NetworkContext {
    fn default() -> Self {
        Self::Unknown
    }
}
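Collapsed to a pure function over (loss, RTT, is-cellular), the two threshold tables in `classify_with_context` behave like this (a simplified stand-in, keyed on a boolean flag instead of the full enum):

```rust
// Simplified stand-in for Tier::classify_with_context: the same two
// threshold tables, keyed on a boolean `cellular` flag.
#[derive(Debug, PartialEq)]
enum Tier {
    Good,
    Degraded,
    Catastrophic,
}

fn classify(loss: f64, rtt_ms: u32, cellular: bool) -> Tier {
    let (cat_loss, cat_rtt, deg_loss, deg_rtt) = if cellular {
        (25.0, 500, 8.0, 300) // tighter cellular thresholds
    } else {
        (40.0, 600, 10.0, 400) // WiFi / unknown thresholds
    };
    if loss > cat_loss || rtt_ms > cat_rtt {
        Tier::Catastrophic
    } else if loss > deg_loss || rtt_ms > deg_rtt {
        Tier::Degraded
    } else {
        Tier::Good
    }
}

fn main() {
    // 9% loss passes on WiFi but trips the cellular Degraded threshold.
    assert_eq!(classify(9.0, 200, false), Tier::Good);
    assert_eq!(classify(9.0, 200, true), Tier::Degraded);
    // 30% loss is Degraded on WiFi but Catastrophic on cellular.
    assert_eq!(classify(30.0, 200, false), Tier::Degraded);
    assert_eq!(classify(30.0, 200, true), Tier::Catastrophic);
}
```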
/// Adaptive quality controller with hysteresis to prevent tier flapping.
///
/// - Downgrade: 3 consecutive reports in a worse tier (2 on cellular)
/// - Upgrade: 10 consecutive reports in a better tier
pub struct AdaptiveQualityController {
    current_tier: Tier,
@@ -54,14 +102,26 @@ pub struct AdaptiveQualityController {
    history: VecDeque<QualityReport>,
    /// Whether the profile was manually forced (disables adaptive logic).
    forced: bool,
/// Current network context for threshold selection.
network_context: NetworkContext,
/// FEC boost expiry time (set during network handoff).
fec_boost_until: Option<Instant>,
/// FEC boost amount to add during handoff recovery window.
fec_boost_amount: f32,
}

/// Threshold for downgrading (fast reaction to degradation).
const DOWNGRADE_THRESHOLD: u32 = 3;
/// Threshold for downgrading on cellular networks (even faster).
const CELLULAR_DOWNGRADE_THRESHOLD: u32 = 2;
/// Threshold for upgrading (slow, cautious improvement).
const UPGRADE_THRESHOLD: u32 = 10;
/// Maximum history window size.
const HISTORY_SIZE: usize = 20;
/// Default FEC boost amount during handoff recovery.
const DEFAULT_FEC_BOOST: f32 = 0.2;
/// Duration of FEC boost after a network handoff.
const FEC_BOOST_DURATION_SECS: u64 = 10;
impl AdaptiveQualityController {
    pub fn new() -> Self {
@@ -72,6 +132,9 @@ impl AdaptiveQualityController {
            consecutive_down: 0,
            history: VecDeque::with_capacity(HISTORY_SIZE),
            forced: false,
network_context: NetworkContext::default(),
fec_boost_until: None,
fec_boost_amount: DEFAULT_FEC_BOOST,
        }
    }
@@ -80,6 +143,69 @@ impl AdaptiveQualityController {
        self.current_tier
    }
/// Get the current network context.
pub fn network_context(&self) -> NetworkContext {
self.network_context
}
/// Signal a network transport change (e.g., WiFi to cellular handoff).
///
/// When switching from WiFi to any cellular type, this preemptively
/// downgrades one quality tier and activates a temporary FEC boost.
pub fn signal_network_change(&mut self, new_context: NetworkContext) {
let old = self.network_context;
self.network_context = new_context;
let new_is_cellular = matches!(
new_context,
NetworkContext::CellularLte | NetworkContext::Cellular5g | NetworkContext::Cellular3g
);
// If switching from WiFi to cellular, preemptively downgrade one tier
if old == NetworkContext::WiFi && new_is_cellular {
if let Some(lower_tier) = self.current_tier.downgrade() {
self.current_tier = lower_tier;
self.current_profile = lower_tier.profile();
}
// Reset counters to avoid stale hysteresis state
self.consecutive_up = 0;
self.consecutive_down = 0;
// Un-force so adaptive logic resumes
self.forced = false;
}
// Activate FEC boost for any network change
self.fec_boost_until = Some(Instant::now() + Duration::from_secs(FEC_BOOST_DURATION_SECS));
}
/// Returns the FEC boost amount if within the handoff recovery window, 0.0 otherwise.
///
/// Callers should add this to their base FEC ratio during the boost window.
pub fn fec_boost(&self) -> f32 {
if let Some(until) = self.fec_boost_until {
if Instant::now() < until {
return self.fec_boost_amount;
}
}
0.0
}
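The boost check above reduces to comparing `Instant::now()` against a stored expiry. A standalone sketch with the same shape (free function instead of a method):

```rust
use std::time::{Duration, Instant};

// Sketch of the boost-window check: return the boost amount only while the
// stored expiry instant is still in the future, otherwise 0.0.
fn fec_boost(until: Option<Instant>, amount: f32) -> f32 {
    match until {
        Some(t) if Instant::now() < t => amount,
        _ => 0.0,
    }
}

fn main() {
    // No handoff recorded: no boost.
    assert_eq!(fec_boost(None, 0.2), 0.0);
    // Within the recovery window: full boost.
    let until = Instant::now() + Duration::from_secs(10);
    assert_eq!(fec_boost(Some(until), 0.2), 0.2);
}
```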
/// Reset the hysteresis counters.
pub fn reset_counters(&mut self) {
self.consecutive_up = 0;
self.consecutive_down = 0;
}
/// Get the effective downgrade threshold based on network context.
fn downgrade_threshold(&self) -> u32 {
match self.network_context {
NetworkContext::CellularLte
| NetworkContext::Cellular5g
| NetworkContext::Cellular3g => CELLULAR_DOWNGRADE_THRESHOLD,
_ => DOWNGRADE_THRESHOLD,
}
}
    fn try_transition(&mut self, observed_tier: Tier) -> Option<QualityProfile> {
        if observed_tier == self.current_tier {
            self.consecutive_up = 0;
@@ -96,7 +222,7 @@ impl AdaptiveQualityController {
        if is_worse {
            self.consecutive_up = 0;
            self.consecutive_down += 1;
            if self.consecutive_down >= self.downgrade_threshold() {
                self.current_tier = observed_tier;
                self.current_profile = observed_tier.profile();
                self.consecutive_down = 0;
@@ -142,7 +268,7 @@ impl QualityController for AdaptiveQualityController {
            return None;
        }

        let observed = Tier::classify_with_context(report, self.network_context);
        self.try_transition(observed)
    }
@@ -246,4 +372,110 @@ mod tests {
        assert_eq!(Tier::classify(&make_report(50.0, 200)), Tier::Catastrophic);
        assert_eq!(Tier::classify(&make_report(5.0, 700)), Tier::Catastrophic);
    }
// ---------------------------------------------------------------
// Network context tests
// ---------------------------------------------------------------
#[test]
fn cellular_tighter_thresholds() {
// 12% loss: Good on WiFi, Degraded on cellular
let report = make_report(12.0, 200);
assert_eq!(
Tier::classify_with_context(&report, NetworkContext::WiFi),
Tier::Degraded
);
assert_eq!(
Tier::classify_with_context(&report, NetworkContext::CellularLte),
Tier::Degraded
);
// 9% loss: Good on WiFi, Degraded on cellular
let report = make_report(9.0, 200);
assert_eq!(
Tier::classify_with_context(&report, NetworkContext::WiFi),
Tier::Good
);
assert_eq!(
Tier::classify_with_context(&report, NetworkContext::CellularLte),
Tier::Degraded
);
// 30% loss: Degraded on WiFi, Catastrophic on cellular
let report = make_report(30.0, 200);
assert_eq!(
Tier::classify_with_context(&report, NetworkContext::WiFi),
Tier::Degraded
);
assert_eq!(
Tier::classify_with_context(&report, NetworkContext::Cellular3g),
Tier::Catastrophic
);
}
#[test]
fn cellular_rtt_thresholds() {
// RTT 350ms: Good on WiFi, Degraded on cellular
let report = make_report(2.0, 348); // rtt_4ms rounds so use 348
assert_eq!(
Tier::classify_with_context(&report, NetworkContext::WiFi),
Tier::Good
);
assert_eq!(
Tier::classify_with_context(&report, NetworkContext::CellularLte),
Tier::Degraded
);
}
#[test]
fn cellular_faster_downgrade() {
let mut ctrl = AdaptiveQualityController::new();
ctrl.signal_network_change(NetworkContext::CellularLte);
// Reset tier back to Good for testing downgrade threshold
ctrl.current_tier = Tier::Good;
ctrl.current_profile = Tier::Good.profile();
// On cellular, downgrade threshold is 2 instead of 3
let bad = make_report(50.0, 200);
assert!(ctrl.observe(&bad).is_none()); // 1st bad
let result = ctrl.observe(&bad); // 2nd bad — should trigger on cellular
assert!(result.is_some());
}
#[test]
fn signal_network_change_preemptive_downgrade() {
let mut ctrl = AdaptiveQualityController::new();
assert_eq!(ctrl.tier(), Tier::Good);
// Switch from WiFi to cellular
ctrl.network_context = NetworkContext::WiFi;
ctrl.signal_network_change(NetworkContext::CellularLte);
// Should have downgraded one tier: Good -> Degraded
assert_eq!(ctrl.tier(), Tier::Degraded);
}
#[test]
fn signal_network_change_fec_boost() {
let mut ctrl = AdaptiveQualityController::new();
assert_eq!(ctrl.fec_boost(), 0.0);
ctrl.signal_network_change(NetworkContext::CellularLte);
// FEC boost should be active
assert!(ctrl.fec_boost() > 0.0);
assert_eq!(ctrl.fec_boost(), DEFAULT_FEC_BOOST);
}
#[test]
fn tier_downgrade() {
assert_eq!(Tier::Good.downgrade(), Some(Tier::Degraded));
assert_eq!(Tier::Degraded.downgrade(), Some(Tier::Catastrophic));
assert_eq!(Tier::Catastrophic.downgrade(), None);
}
#[test]
fn network_context_default() {
assert_eq!(NetworkContext::default(), NetworkContext::Unknown);
}
}

View File

@@ -28,6 +28,7 @@ prometheus = "0.13"
axum = { version = "0.7", default-features = false, features = ["tokio", "http1", "ws"] }
tower-http = { version = "0.6", features = ["fs"] }
futures-util = "0.3"
dirs = "6"
[[bin]]
name = "wzp-relay"

View File

@@ -15,25 +15,27 @@ use wzp_proto::{MediaTransport, QualityProfile, SignalMessage};
/// 5. Derive shared ChaCha20-Poly1305 session
/// 6. Send `CallAnswer` back
///
/// Returns the derived `CryptoSession`, the chosen `QualityProfile`, the caller's
/// fingerprint, and the caller's alias (if provided in the CallOffer).
pub async fn accept_handshake(
    transport: &dyn MediaTransport,
    seed: &[u8; 32],
) -> Result<(Box<dyn CryptoSession>, QualityProfile, String, Option<String>), anyhow::Error> {
    // 1. Receive CallOffer
    let offer = transport
        .recv_signal()
        .await?
        .ok_or_else(|| anyhow::anyhow!("connection closed before receiving CallOffer"))?;

    let (caller_identity_pub, caller_ephemeral_pub, caller_signature, supported_profiles, caller_alias) =
        match offer {
            SignalMessage::CallOffer {
                identity_pub,
                ephemeral_pub,
                signature,
                supported_profiles,
                alias,
            } => (identity_pub, ephemeral_pub, signature, supported_profiles, alias),
            other => {
                return Err(anyhow::anyhow!(
                    "expected CallOffer, got {:?}",
@@ -76,7 +78,13 @@ pub async fn accept_handshake(
    };
    transport.send_signal(&answer).await?;

    // Derive caller fingerprint from their identity public key (first 8 bytes as hex)
    let caller_fp = caller_identity_pub[..8]
        .iter()
        .map(|b| format!("{b:02x}"))
        .collect::<String>();

    Ok((session, chosen_profile, caller_fp, caller_alias))
}
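The fingerprint returned above is simply the first 8 bytes of the identity key, hex-encoded into a 16-character string. In isolation:

```rust
// The same derivation as accept_handshake: first 8 key bytes, lowercase hex.
fn fingerprint(identity_pub: &[u8]) -> String {
    identity_pub[..8].iter().map(|b| format!("{b:02x}")).collect()
}

fn main() {
    let key = [0xABu8, 0x01, 0xFF, 0x10, 0x20, 0x30, 0x40, 0x55, 0x99];
    // Trailing bytes beyond the first 8 are ignored.
    assert_eq!(fingerprint(&key), "ab01ff1020304055");
}
```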
/// Select the best quality profile from those the caller supports.

View File

@@ -13,7 +13,7 @@ use std::sync::Arc;
use std::time::Duration;
use tokio::sync::Mutex;
use tracing::{error, info, warn};

use wzp_proto::MediaTransport;
use wzp_relay::config::RelayConfig;
@@ -207,8 +207,39 @@ async fn main() -> anyhow::Result<()> {
         tokio::spawn(wzp_relay::metrics::serve_metrics(port, m, p, rr));
     }

-    // Generate ephemeral relay identity for crypto handshake
-    let relay_seed = wzp_crypto::Seed::generate();
+    // Load or generate relay identity — persisted in ~/.wzp/relay-identity
+    let relay_seed = {
+        let config_dir = dirs::home_dir()
+            .unwrap_or_else(|| std::path::PathBuf::from("."))
+            .join(".wzp");
+        let identity_path = config_dir.join("relay-identity");
+        if identity_path.exists() {
+            if let Ok(hex) = std::fs::read_to_string(&identity_path) {
+                if let Ok(s) = wzp_crypto::Seed::from_hex(hex.trim()) {
+                    info!("loaded relay identity from {}", identity_path.display());
+                    s
+                } else {
+                    warn!("corrupt relay identity file, generating new");
+                    let s = wzp_crypto::Seed::generate();
+                    let hex: String = s.0.iter().map(|b| format!("{b:02x}")).collect();
+                    let _ = std::fs::write(&identity_path, &hex);
+                    s
+                }
+            } else {
+                let s = wzp_crypto::Seed::generate();
+                let hex: String = s.0.iter().map(|b| format!("{b:02x}")).collect();
+                let _ = std::fs::write(&identity_path, &hex);
+                s
+            }
+        } else {
+            let s = wzp_crypto::Seed::generate();
+            let _ = std::fs::create_dir_all(&config_dir);
+            let hex: String = s.0.iter().map(|b| format!("{b:02x}")).collect();
+            let _ = std::fs::write(&identity_path, &hex);
+            info!("generated relay identity at {}", identity_path.display());
+            s
+        }
+    };
     let relay_fp = relay_seed.derive_identity().public_identity().fingerprint;
     info!(addr = %config.listen_addr, fingerprint = %relay_fp, "WarzonePhone relay starting");
@@ -299,6 +330,13 @@ async fn main() -> anyhow::Result<()> {
         let transport = Arc::new(wzp_transport::QuinnTransport::new(connection));

+        // Ping connections: client just measures QUIC connect RTT.
+        // No handshake, no streams — client closes immediately after connecting.
+        if room_name == "ping" {
+            info!(%addr, "ping connection (RTT probe)");
+            return;
+        }
+
         // Probe connections use SNI "_probe" to identify themselves.
         // They skip auth + handshake and just do Ping->Pong + presence gossip.
         if room_name == "_probe" {
@@ -431,7 +469,7 @@ async fn main() -> anyhow::Result<()> {
         // Crypto handshake: verify client identity + negotiate quality profile
         let handshake_start = std::time::Instant::now();
-        let (_crypto_session, _chosen_profile) = match wzp_relay::handshake::accept_handshake(
+        let (_crypto_session, _chosen_profile, caller_fp, caller_alias) = match wzp_relay::handshake::accept_handshake(
             &*transport,
             &relay_seed_bytes,
         ).await {
@@ -448,10 +486,13 @@ async fn main() -> anyhow::Result<()> {
             }
         };

+        // Use the caller's identity fingerprint from the handshake
+        let participant_fp = authenticated_fp.clone().unwrap_or(caller_fp);
+
         // Register in presence registry
-        if let Some(ref fp) = authenticated_fp {
+        {
             let mut reg = presence.lock().await;
-            reg.register_local(fp, None, Some(room_name.clone()));
+            reg.register_local(&participant_fp, None, Some(room_name.clone()));
         }

         info!(%addr, room = %room_name, "client joining");
@@ -502,14 +543,21 @@ async fn main() -> anyhow::Result<()> {
         let participant_id = {
             let mut mgr = room_mgr.lock().await;
-            match mgr.join(&room_name, addr, room::ParticipantSender::Quic(transport.clone()), authenticated_fp.as_deref()) {
-                Ok(id) => {
+            match mgr.join(
+                &room_name,
+                addr,
+                room::ParticipantSender::Quic(transport.clone()),
+                Some(&participant_fp),
+                caller_alias.as_deref(),
+            ) {
+                Ok((id, update, senders)) => {
                     metrics.active_rooms.set(mgr.list().len() as i64);
+                    drop(mgr); // release lock before async broadcast
+                    room::broadcast_signal(&senders, &update).await;
                     id
                 }
                 Err(e) => {
                     error!(%addr, room = %room_name, "room join denied: {e}");
+                    // Clean up the session we just created
                     metrics.active_sessions.dec();
                     let mut smgr = session_mgr.lock().await;
                     smgr.remove_session(session_id);


@@ -10,7 +10,7 @@ use std::time::Duration;
 use bytes::Bytes;
 use tokio::sync::Mutex;
-use tracing::{error, info, warn};
+use tracing::{debug, error, info, trace, warn};
 use wzp_proto::packet::TrunkFrame;
 use wzp_proto::MediaTransport;
@@ -67,11 +67,24 @@ impl ParticipantSender {
     }
 }

+/// Broadcast a signal message to a list of participant senders.
+pub async fn broadcast_signal(senders: &[ParticipantSender], msg: &wzp_proto::SignalMessage) {
+    for sender in senders {
+        if let ParticipantSender::Quic(t) = sender {
+            if let Err(e) = t.send_signal(msg).await {
+                warn!("broadcast_signal error: {e}");
+            }
+        }
+    }
+}
+
 /// A participant in a room.
 struct Participant {
     id: ParticipantId,
     _addr: std::net::SocketAddr,
     sender: ParticipantSender,
+    fingerprint: Option<String>,
+    alias: Option<String>,
 }
 /// A room holding multiple participants.
@@ -86,10 +99,16 @@ impl Room {
         }
     }

-    fn add(&mut self, addr: std::net::SocketAddr, sender: ParticipantSender) -> ParticipantId {
+    fn add(
+        &mut self,
+        addr: std::net::SocketAddr,
+        sender: ParticipantSender,
+        fingerprint: Option<String>,
+        alias: Option<String>,
+    ) -> ParticipantId {
         let id = next_id();
         info!(room_size = self.participants.len() + 1, participant = id, %addr, "joined room");
-        self.participants.push(Participant { id, _addr: addr, sender });
+        self.participants.push(Participant { id, _addr: addr, sender, fingerprint, alias });
         id
     }
@@ -106,6 +125,33 @@ impl Room {
             .collect()
     }

+    /// Build a RoomUpdate participant list.
+    fn participant_list(&self) -> Vec<wzp_proto::packet::RoomParticipant> {
+        self.participants
+            .iter()
+            .map(|p| wzp_proto::packet::RoomParticipant {
+                fingerprint: p.fingerprint.clone().unwrap_or_default(),
+                alias: p.alias.clone(),
+            })
+            .collect()
+    }
+
+    /// Get all senders (for broadcasting to everyone including the joiner).
+    fn all_senders(&self) -> Vec<ParticipantSender> {
+        self.participants.iter().map(|p| p.sender.clone()).collect()
+    }
+
+    /// Update a participant's alias. Returns true if the participant was found.
+    fn set_alias(&mut self, id: ParticipantId, alias: String) -> bool {
+        if let Some(p) = self.participants.iter_mut().find(|p| p.id == id) {
+            info!(participant = id, %alias, "alias updated");
+            p.alias = Some(alias);
+            true
+        } else {
+            false
+        }
+    }
+
     fn is_empty(&self) -> bool {
         self.participants.is_empty()
     }
@@ -165,20 +211,27 @@ impl RoomManager {
         }
     }

-    /// Join a room. Returns the participant ID or an error if unauthorized.
+    /// Join a room. Returns (participant_id, room_update_msg, all_senders) for broadcasting.
     pub fn join(
         &mut self,
         room_name: &str,
         addr: std::net::SocketAddr,
         sender: ParticipantSender,
         fingerprint: Option<&str>,
-    ) -> Result<ParticipantId, String> {
+        alias: Option<&str>,
+    ) -> Result<(ParticipantId, wzp_proto::SignalMessage, Vec<ParticipantSender>), String> {
         if !self.is_authorized(room_name, fingerprint) {
             warn!(room = room_name, fingerprint = ?fingerprint, "unauthorized room join attempt");
             return Err("not authorized for this room".to_string());
         }
         let room = self.rooms.entry(room_name.to_string()).or_insert_with(Room::new);
-        Ok(room.add(addr, sender))
+        let id = room.add(addr, sender, fingerprint.map(|s| s.to_string()), alias.map(|s| s.to_string()));
+        let update = wzp_proto::SignalMessage::RoomUpdate {
+            count: room.len() as u32,
+            participants: room.participant_list(),
+        };
+        let senders = room.all_senders();
+        Ok((id, update, senders))
     }

     /// Join a room via WebSocket. Convenience wrapper around `join()`.
@@ -189,18 +242,48 @@ impl RoomManager {
         sender: tokio::sync::mpsc::Sender<Bytes>,
         fingerprint: Option<&str>,
     ) -> Result<ParticipantId, String> {
-        self.join(room_name, addr, ParticipantSender::WebSocket(sender), fingerprint)
+        let (id, _update, _senders) = self.join(room_name, addr, ParticipantSender::WebSocket(sender), fingerprint, None)?;
+        Ok(id)
     }

-    /// Leave a room. Removes the room if empty.
-    pub fn leave(&mut self, room_name: &str, participant_id: ParticipantId) {
+    /// Leave a room. Returns (room_update_msg, remaining_senders) for broadcasting, or None if room is now empty.
+    pub fn leave(&mut self, room_name: &str, participant_id: ParticipantId) -> Option<(wzp_proto::SignalMessage, Vec<ParticipantSender>)> {
         if let Some(room) = self.rooms.get_mut(room_name) {
             room.remove(participant_id);
             if room.is_empty() {
                 self.rooms.remove(room_name);
                 info!(room = room_name, "room closed (empty)");
+                return None;
             }
+            let update = wzp_proto::SignalMessage::RoomUpdate {
+                count: room.len() as u32,
+                participants: room.participant_list(),
+            };
+            let senders = room.all_senders();
+            Some((update, senders))
+        } else {
+            None
         }
     }
+
+    /// Update a participant's alias and return a RoomUpdate + senders for broadcasting.
+    pub fn set_alias(
+        &mut self,
+        room_name: &str,
+        participant_id: ParticipantId,
+        alias: String,
+    ) -> Option<(wzp_proto::SignalMessage, Vec<ParticipantSender>)> {
+        if let Some(room) = self.rooms.get_mut(room_name) {
+            if room.set_alias(participant_id, alias) {
+                let update = wzp_proto::SignalMessage::RoomUpdate {
+                    count: room.len() as u32,
+                    participants: room.participant_list(),
+                };
+                let senders = room.all_senders();
+                return Some((update, senders));
+            }
+        }
+        None
+    }

     /// Get senders for all OTHER participants in a room.
@@ -322,73 +405,174 @@ async fn run_participant_plain(
     session_id: &str,
 ) {
     let addr = transport.connection().remote_address();
-    let mut packets_forwarded = 0u64;
-
-    loop {
-        let pkt = match transport.recv_media().await {
-            Ok(Some(pkt)) => pkt,
-            Ok(None) => {
-                info!(%addr, participant = participant_id, "disconnected");
-                break;
-            }
-            Err(e) => {
-                let msg = e.to_string();
-                if msg.contains("timed out") || msg.contains("reset") || msg.contains("closed") {
-                    info!(%addr, participant = participant_id, "connection closed: {e}");
-                } else {
-                    error!(%addr, participant = participant_id, "recv error: {e}");
-                }
-                break;
-            }
-        };
-
-        // Update per-session quality metrics if a quality report is present
-        if let Some(ref report) = pkt.quality_report {
-            metrics.update_session_quality(session_id, report);
-        }
-
-        // Get current list of other participants
-        let others = {
-            let mgr = room_mgr.lock().await;
-            mgr.others(&room_name, participant_id)
-        };
-
-        // Forward to all others
-        let pkt_bytes = pkt.payload.len() as u64;
-        for other in &others {
-            match other {
-                ParticipantSender::Quic(t) => {
-                    let _ = t.send_media(&pkt).await;
-                }
-                ParticipantSender::WebSocket(_) => {
-                    // WS clients receive raw payload bytes
-                    let _ = other.send_raw(&pkt.payload).await;
-                }
-            }
-        }
-        let fan_out = others.len() as u64;
-        metrics.packets_forwarded.inc_by(fan_out);
-        metrics.bytes_forwarded.inc_by(pkt_bytes * fan_out);
-        packets_forwarded += 1;
-
-        if packets_forwarded % 500 == 0 {
-            let room_size = {
-                let mgr = room_mgr.lock().await;
-                mgr.room_size(&room_name)
-            };
-            info!(
-                room = %room_name,
-                participant = participant_id,
-                forwarded = packets_forwarded,
-                room_size,
-                "participant stats"
-            );
-        }
-    }
-
-    // Clean up
-    let mut mgr = room_mgr.lock().await;
-    mgr.leave(&room_name, participant_id);
+    // Media forwarding task (with debug logging from Android fixes)
+    let media_room_mgr = room_mgr.clone();
+    let media_room_name = room_name.clone();
+    let media_transport = transport.clone();
+    let media_metrics = metrics.clone();
+    let media_session_id = session_id.to_string();
+    let media_task = async move {
+        let mut packets_forwarded = 0u64;
+        let mut last_recv_instant = std::time::Instant::now();
+        let mut max_recv_gap_ms = 0u64;
+        let mut max_forward_ms = 0u64;
+        let mut send_errors = 0u64;
+        let mut last_log_instant = std::time::Instant::now();
+
+        info!(
+            room = %media_room_name,
+            participant = participant_id,
+            %addr,
+            session = %media_session_id,
+            "forwarding loop started (plain)"
+        );
+
+        loop {
+            let pkt = match media_transport.recv_media().await {
+                Ok(Some(pkt)) => pkt,
+                Ok(None) => {
+                    info!(%addr, participant = participant_id, forwarded = packets_forwarded, "disconnected (stream ended)");
+                    break;
+                }
+                Err(e) => {
+                    let msg = e.to_string();
+                    if msg.contains("timed out") || msg.contains("reset") || msg.contains("closed") {
+                        info!(%addr, participant = participant_id, forwarded = packets_forwarded, "connection closed: {e}");
+                    } else {
+                        error!(%addr, participant = participant_id, forwarded = packets_forwarded, "recv error: {e}");
+                    }
+                    break;
+                }
+            };
+
+            let recv_gap_ms = last_recv_instant.elapsed().as_millis() as u64;
+            last_recv_instant = std::time::Instant::now();
+            if recv_gap_ms > max_recv_gap_ms {
+                max_recv_gap_ms = recv_gap_ms;
+            }
+            if recv_gap_ms > 200 {
+                warn!(
+                    room = %media_room_name,
+                    participant = participant_id,
+                    recv_gap_ms,
+                    seq = pkt.header.seq,
+                    "large recv gap"
+                );
+            }
+
+            if let Some(ref report) = pkt.quality_report {
+                media_metrics.update_session_quality(&media_session_id, report);
+            }
+
+            let lock_start = std::time::Instant::now();
+            let others = {
+                let mgr = media_room_mgr.lock().await;
+                mgr.others(&media_room_name, participant_id)
+            };
+            let lock_ms = lock_start.elapsed().as_millis() as u64;
+            if lock_ms > 10 {
+                warn!(room = %media_room_name, participant = participant_id, lock_ms, "slow room_mgr lock");
+            }
+
+            let fwd_start = std::time::Instant::now();
+            let pkt_bytes = pkt.payload.len() as u64;
+            for other in &others {
+                match other {
+                    ParticipantSender::Quic(t) => {
+                        if let Err(e) = t.send_media(&pkt).await {
+                            send_errors += 1;
+                            if send_errors <= 5 || send_errors % 100 == 0 {
+                                warn!(
+                                    room = %media_room_name,
+                                    participant = participant_id,
+                                    peer = %t.connection().remote_address(),
+                                    total_send_errors = send_errors,
+                                    "send_media error: {e}"
+                                );
+                            }
+                        }
+                    }
+                    ParticipantSender::WebSocket(_) => {
+                        let _ = other.send_raw(&pkt.payload).await;
+                    }
+                }
+            }
+            let fwd_ms = fwd_start.elapsed().as_millis() as u64;
+            if fwd_ms > max_forward_ms { max_forward_ms = fwd_ms; }
+            if fwd_ms > 50 {
+                warn!(room = %media_room_name, participant = participant_id, fwd_ms, fan_out = others.len(), "slow forward");
+            }
+
+            let fan_out = others.len() as u64;
+            media_metrics.packets_forwarded.inc_by(fan_out);
+            media_metrics.bytes_forwarded.inc_by(pkt_bytes * fan_out);
+            packets_forwarded += 1;
+
+            if last_log_instant.elapsed() >= Duration::from_secs(5) {
+                let room_size = {
+                    let mgr = media_room_mgr.lock().await;
+                    mgr.room_size(&media_room_name)
+                };
+                info!(
+                    room = %media_room_name,
+                    participant = participant_id,
+                    forwarded = packets_forwarded,
+                    room_size, fan_out, max_recv_gap_ms, max_forward_ms, send_errors,
+                    "participant stats"
+                );
+                max_recv_gap_ms = 0;
+                max_forward_ms = 0;
+                last_log_instant = std::time::Instant::now();
+            }
+        }
+    };
+
+    // Signal handling task — processes SetAlias and other in-call signals
+    let signal_room_mgr = room_mgr.clone();
+    let signal_room_name = room_name.clone();
+    let signal_transport = transport.clone();
+    let signal_task = async move {
+        loop {
+            match signal_transport.recv_signal().await {
+                Ok(Some(wzp_proto::SignalMessage::SetAlias { alias })) => {
+                    info!(%addr, participant = participant_id, %alias, "SetAlias received");
+                    let mut mgr = signal_room_mgr.lock().await;
+                    if let Some((update, senders)) =
+                        mgr.set_alias(&signal_room_name, participant_id, alias)
+                    {
+                        drop(mgr);
+                        broadcast_signal(&senders, &update).await;
+                    }
+                }
+                Ok(Some(wzp_proto::SignalMessage::Hangup { .. })) => {
+                    info!(%addr, participant = participant_id, "hangup received");
+                    break;
+                }
+                Ok(Some(msg)) => {
+                    info!(%addr, participant = participant_id, "signal: {:?}", std::mem::discriminant(&msg));
+                }
+                Ok(None) => break,
+                Err(e) => {
+                    warn!(%addr, participant = participant_id, "signal recv error: {e}");
+                    break;
+                }
+            }
+        }
+    };
+
+    // Run both in parallel — exit when either finishes (disconnection)
+    tokio::select! {
+        _ = media_task => {}
+        _ = signal_task => {}
+    }
+
+    // Clean up — leave room and broadcast update to remaining participants
+    let mut mgr = room_mgr.lock().await;
+    if let Some((update, senders)) = mgr.leave(&room_name, participant_id) {
+        drop(mgr); // release lock before async broadcast
+        broadcast_signal(&senders, &update).await;
+    }
 }
 /// Trunked forwarding loop — batches outgoing packets per peer.
@@ -404,6 +588,19 @@ async fn run_participant_trunked(
     let addr = transport.connection().remote_address();
     let mut packets_forwarded = 0u64;
+    let mut last_recv_instant = std::time::Instant::now();
+    let mut max_recv_gap_ms = 0u64;
+    let mut max_forward_ms = 0u64;
+    let mut send_errors = 0u64;
+    let mut last_log_instant = std::time::Instant::now();
+
+    info!(
+        room = %room_name,
+        participant = participant_id,
+        %addr,
+        session = session_id,
+        "forwarding loop started (trunked)"
+    );

     // Per-peer TrunkedForwarders, keyed by the raw pointer of the peer
     // transport (stable for the Arc's lifetime). We use the remote address
@@ -425,24 +622,50 @@ async fn run_participant_trunked(
                 let pkt = match result {
                     Ok(Some(pkt)) => pkt,
                     Ok(None) => {
-                        info!(%addr, participant = participant_id, "disconnected");
+                        info!(%addr, participant = participant_id, forwarded = packets_forwarded, "disconnected (stream ended)");
                         break;
                     }
                     Err(e) => {
-                        error!(%addr, participant = participant_id, "recv error: {e}");
+                        error!(%addr, participant = participant_id, forwarded = packets_forwarded, "recv error: {e}");
                         break;
                     }
                 };

+                let recv_gap_ms = last_recv_instant.elapsed().as_millis() as u64;
+                last_recv_instant = std::time::Instant::now();
+                if recv_gap_ms > max_recv_gap_ms {
+                    max_recv_gap_ms = recv_gap_ms;
+                }
+                if recv_gap_ms > 200 {
+                    warn!(
+                        room = %room_name,
+                        participant = participant_id,
+                        recv_gap_ms,
+                        seq = pkt.header.seq,
+                        "large recv gap (trunked)"
+                    );
+                }
+
                 if let Some(ref report) = pkt.quality_report {
                     metrics.update_session_quality(session_id, report);
                 }

+                let lock_start = std::time::Instant::now();
                 let others = {
                     let mgr = room_mgr.lock().await;
                     mgr.others(&room_name, participant_id)
                 };
+                let lock_ms = lock_start.elapsed().as_millis() as u64;
+                if lock_ms > 10 {
+                    warn!(
+                        room = %room_name,
+                        participant = participant_id,
+                        lock_ms,
+                        "slow room_mgr lock (trunked)"
+                    );
+                }

+                let fwd_start = std::time::Instant::now();
                 let pkt_bytes = pkt.payload.len() as u64;
                 for other in &others {
                     match other {
@@ -452,21 +675,44 @@ async fn run_participant_trunked(
                                 .entry(peer_addr)
                                 .or_insert_with(|| TrunkedForwarder::new(t.clone(), sid_bytes));
                             if let Err(e) = fwd.send(&pkt).await {
-                                let _ = e;
+                                send_errors += 1;
+                                if send_errors <= 5 || send_errors % 100 == 0 {
+                                    warn!(
+                                        room = %room_name,
+                                        participant = participant_id,
+                                        peer = %peer_addr,
+                                        total_send_errors = send_errors,
+                                        "trunked send error: {e}"
+                                    );
+                                }
                             }
                         }
                         ParticipantSender::WebSocket(_) => {
+                            // WS clients bypass trunking — send raw payload directly
                             let _ = other.send_raw(&pkt.payload).await;
                         }
                     }
                 }
+                let fwd_ms = fwd_start.elapsed().as_millis() as u64;
+                if fwd_ms > max_forward_ms {
+                    max_forward_ms = fwd_ms;
+                }
+                if fwd_ms > 50 {
+                    warn!(
+                        room = %room_name,
+                        participant = participant_id,
+                        fwd_ms,
+                        fan_out = others.len(),
+                        "slow forward (trunked)"
+                    );
+                }
                 let fan_out = others.len() as u64;
                 metrics.packets_forwarded.inc_by(fan_out);
                 metrics.bytes_forwarded.inc_by(pkt_bytes * fan_out);
                 packets_forwarded += 1;

-                if packets_forwarded % 500 == 0 {
+                // Periodic stats every 5 seconds
+                if last_log_instant.elapsed() >= Duration::from_secs(5) {
                     let room_size = {
                         let mgr = room_mgr.lock().await;
                         mgr.room_size(&room_name)
@@ -476,15 +722,30 @@ async fn run_participant_trunked(
                         participant = participant_id,
                         forwarded = packets_forwarded,
                         room_size,
+                        fan_out,
+                        max_recv_gap_ms,
+                        max_forward_ms,
+                        send_errors,
                         "participant stats (trunked)"
                     );
+                    max_recv_gap_ms = 0;
+                    max_forward_ms = 0;
+                    last_log_instant = std::time::Instant::now();
                 }
             }
             _ = flush_interval.tick() => {
                 for fwd in forwarders.values_mut() {
                     if let Err(e) = fwd.flush().await {
-                        let _ = e;
+                        send_errors += 1;
+                        if send_errors <= 5 || send_errors % 100 == 0 {
+                            warn!(
+                                room = %room_name,
+                                participant = participant_id,
+                                total_send_errors = send_errors,
+                                "trunk flush error: {e}"
+                            );
+                        }
                     }
                 }
             }
@@ -497,7 +758,10 @@ async fn run_participant_trunked(
     }

     let mut mgr = room_mgr.lock().await;
-    mgr.leave(&room_name, participant_id);
+    if let Some((update, senders)) = mgr.leave(&room_name, participant_id) {
+        drop(mgr);
+        broadcast_signal(&senders, &update).await;
+    }
 }

 /// Parse up to the first 2 bytes of a hex session-id string into `[u8; 2]`.


@@ -136,6 +136,11 @@ impl PathMonitor {
         }
     }

+    /// Get raw packet counts for debugging.
+    pub fn counts(&self) -> (u64, u64) {
+        (self.total_sent, self.total_received)
+    }
+
     /// Estimate bandwidth in kbps from bytes received over time.
     fn estimate_bandwidth_kbps(&self) -> u32 {
         if let (Some(first), Some(last)) = (self.first_recv_time_ms, self.last_recv_time_ms) {
@@ -149,6 +154,27 @@ impl PathMonitor {
         }
         0
     }

+    /// Detect whether a network handoff likely occurred.
+    ///
+    /// Returns `true` if the most recent instantaneous RTT jitter exceeds 3x
+    /// the EWMA-smoothed jitter average, which is characteristic of a cellular
+    /// network handoff (tower switch, WiFi-to-cellular transition, etc.).
+    pub fn detect_handoff(&self) -> bool {
+        // We need at least one RTT observation for a meaningful jitter value,
+        // and the jitter EWMA must be positive for the threshold to make sense.
+        if self.jitter_ewma <= 0.0 {
+            return false;
+        }
+        if let Some(last_rtt) = self.last_rtt_ms {
+            // Most recent instantaneous jitter (RTT deviation from the EWMA)
+            let instant_jitter = (last_rtt - self.rtt_ewma).abs();
+            instant_jitter > self.jitter_ewma * 3.0
+        } else {
+            false
+        }
+    }
 }

 impl Default for PathMonitor {


@@ -33,6 +33,22 @@ impl QuinnTransport {
         &self.connection
     }

+    /// Close the QUIC connection immediately (synchronous, no async needed).
+    /// The relay will detect the close and remove this participant from the room.
+    pub fn close_now(&self) {
+        self.connection.close(quinn::VarInt::from_u32(0), b"hangup");
+    }
+
+    /// Feed an external RTT observation (e.g. from QUIC path stats) into the path monitor.
+    pub fn feed_rtt(&self, rtt_ms: u32) {
+        self.path_monitor.lock().unwrap().observe_rtt(rtt_ms);
+    }
+
+    /// Get raw packet counts from path monitor (sent, received).
+    pub fn monitor_counts(&self) -> (u64, u64) {
+        self.path_monitor.lock().unwrap().counts()
+    }
+
     /// Get the maximum datagram payload size, if datagrams are supported.
     pub fn max_datagram_size(&self) -> Option<usize> {
         datagram::max_datagram_payload(&self.connection)


@@ -272,7 +272,7 @@ async fn handle_ws(socket: WebSocket, room: String, state: AppState) {
     // Crypto handshake with relay
     let handshake_start = std::time::Instant::now();
     let bridge_seed = wzp_crypto::Seed::generate();
-    match wzp_client::handshake::perform_handshake(&*transport, &bridge_seed.0).await {
+    match wzp_client::handshake::perform_handshake(&*transport, &bridge_seed.0, None).await {
         Ok(_session) => {
             let elapsed = handshake_start.elapsed().as_secs_f64();
             state.metrics.handshake_latency.observe(elapsed);


@@ -0,0 +1,16 @@
{
"name": "wzp-wasm",
"type": "module",
"description": "WarzonePhone WASM bindings — FEC (RaptorQ) + crypto (ChaCha20-Poly1305, X25519)",
"version": "0.1.0",
"files": [
"wzp_wasm_bg.wasm",
"wzp_wasm.js",
"wzp_wasm.d.ts"
],
"main": "wzp_wasm.js",
"types": "wzp_wasm.d.ts",
"sideEffects": [
"./snippets/*"
]
}

crates/wzp-web/static/wasm/wzp_wasm.d.ts (vendored, new file, 169 lines)

@@ -0,0 +1,169 @@
/* tslint:disable */
/* eslint-disable */
/**
* Symmetric encryption session using ChaCha20-Poly1305.
*
* Mirrors `wzp-crypto::session::ChaChaSession` for WASM. Nonce derivation
* and key setup are identical so WASM and native peers interoperate.
*/
export class WzpCryptoSession {
free(): void;
[Symbol.dispose](): void;
/**
* Decrypt a media payload with AAD.
*
* Returns plaintext on success, or throws on auth failure.
*/
decrypt(header_aad: Uint8Array, ciphertext: Uint8Array): Uint8Array;
/**
* Encrypt a media payload with AAD (typically the 12-byte MediaHeader).
*
* Returns `ciphertext || poly1305_tag` (plaintext.len() + 16 bytes).
*/
encrypt(header_aad: Uint8Array, plaintext: Uint8Array): Uint8Array;
/**
* Create from a 32-byte shared secret (output of `WzpKeyExchange.derive_shared_secret`).
*/
constructor(shared_secret: Uint8Array);
/**
* Current receive sequence number (for diagnostics / UI stats).
*/
recv_seq(): number;
/**
* Current send sequence number (for diagnostics / UI stats).
*/
send_seq(): number;
}
export class WzpFecDecoder {
free(): void;
[Symbol.dispose](): void;
/**
* Feed a received symbol.
*
* Returns the decoded block (concatenated original frames, unpadded) if
* enough symbols have been received to recover the block, or `undefined`.
*/
add_symbol(block_id: number, symbol_idx: number, _is_repair: boolean, data: Uint8Array): Uint8Array | undefined;
/**
* Create a new FEC decoder.
*
* * `block_size` — expected number of source symbols per block.
* * `symbol_size` — padded byte size of each symbol (must match encoder).
*/
constructor(block_size: number, symbol_size: number);
}
export class WzpFecEncoder {
free(): void;
[Symbol.dispose](): void;
/**
* Add a source symbol (audio frame).
*
* Returns encoded packets (all source + repair) when the block is complete,
* or `undefined` if the block is still accumulating.
*
* Each returned packet carries the 3-byte header:
* `[block_id][symbol_idx][is_repair]` followed by `symbol_size` bytes.
*/
add_symbol(data: Uint8Array): Uint8Array | undefined;
/**
* Force-flush the current (possibly partial) block.
*
* Returns all source + repair symbols with headers, or empty vec if no
* symbols have been accumulated.
*/
flush(): Uint8Array;
/**
* Create a new FEC encoder.
*
* * `block_size` — number of source symbols (audio frames) per FEC block.
* * `symbol_size` — padded byte size of each symbol (default 256).
*/
constructor(block_size: number, symbol_size: number);
}
/**
* X25519 key exchange: generate ephemeral keypair and derive shared secret.
*
* Usage from JS:
* ```js
* const kx = new WzpKeyExchange();
* const ourPub = kx.public_key(); // Uint8Array(32)
* // ... send ourPub to peer, receive peerPub ...
* const secret = kx.derive_shared_secret(peerPub); // Uint8Array(32)
* const session = new WzpCryptoSession(secret);
* ```
*/
export class WzpKeyExchange {
free(): void;
[Symbol.dispose](): void;
/**
* Derive a 32-byte session key from the peer's public key.
*
* Raw DH output is expanded via HKDF-SHA256 with info="warzone-session-key",
* matching `wzp-crypto::handshake::WarzoneKeyExchange::derive_session`.
*/
derive_shared_secret(peer_public: Uint8Array): Uint8Array;
/**
* Generate a new random X25519 keypair.
*/
constructor();
/**
* Our public key (32 bytes).
*/
public_key(): Uint8Array;
}
export type InitInput = RequestInfo | URL | Response | BufferSource | WebAssembly.Module;
export interface InitOutput {
readonly memory: WebAssembly.Memory;
readonly __wbg_wzpcryptosession_free: (a: number, b: number) => void;
readonly __wbg_wzpfecdecoder_free: (a: number, b: number) => void;
readonly __wbg_wzpfecencoder_free: (a: number, b: number) => void;
readonly __wbg_wzpkeyexchange_free: (a: number, b: number) => void;
readonly wzpcryptosession_decrypt: (a: number, b: number, c: number, d: number, e: number) => [number, number, number, number];
readonly wzpcryptosession_encrypt: (a: number, b: number, c: number, d: number, e: number) => [number, number, number, number];
readonly wzpcryptosession_new: (a: number, b: number) => [number, number, number];
readonly wzpcryptosession_recv_seq: (a: number) => number;
readonly wzpcryptosession_send_seq: (a: number) => number;
readonly wzpfecdecoder_add_symbol: (a: number, b: number, c: number, d: number, e: number, f: number) => [number, number];
readonly wzpfecdecoder_new: (a: number, b: number) => number;
readonly wzpfecencoder_add_symbol: (a: number, b: number, c: number) => [number, number];
readonly wzpfecencoder_flush: (a: number) => [number, number];
readonly wzpfecencoder_new: (a: number, b: number) => number;
readonly wzpkeyexchange_derive_shared_secret: (a: number, b: number, c: number) => [number, number, number, number];
readonly wzpkeyexchange_new: () => number;
readonly wzpkeyexchange_public_key: (a: number) => [number, number];
readonly __wbindgen_exn_store: (a: number) => void;
readonly __externref_table_alloc: () => number;
readonly __wbindgen_externrefs: WebAssembly.Table;
readonly __wbindgen_malloc: (a: number, b: number) => number;
readonly __externref_table_dealloc: (a: number) => void;
readonly __wbindgen_free: (a: number, b: number, c: number) => void;
readonly __wbindgen_start: () => void;
}
export type SyncInitInput = BufferSource | WebAssembly.Module;
/**
* Instantiates the given `module`, which can either be bytes or
* a precompiled `WebAssembly.Module`.
*
* @param {{ module: SyncInitInput }} module - Passing `SyncInitInput` directly is deprecated.
*
* @returns {InitOutput}
*/
export function initSync(module: { module: SyncInitInput } | SyncInitInput): InitOutput;
/**
* If `module_or_path` is {RequestInfo} or {URL}, makes a request and
* for everything else, calls `WebAssembly.instantiate` directly.
*
* @param {{ module_or_path: InitInput | Promise<InitInput> }} module_or_path - Passing `InitInput` directly is deprecated.
*
* @returns {Promise<InitOutput>}
*/
export default function __wbg_init (module_or_path?: { module_or_path: InitInput | Promise<InitInput> } | InitInput | Promise<InitInput>): Promise<InitOutput>;


@@ -0,0 +1,27 @@
/* tslint:disable */
/* eslint-disable */
export const memory: WebAssembly.Memory;
export const __wbg_wzpcryptosession_free: (a: number, b: number) => void;
export const __wbg_wzpfecdecoder_free: (a: number, b: number) => void;
export const __wbg_wzpfecencoder_free: (a: number, b: number) => void;
export const __wbg_wzpkeyexchange_free: (a: number, b: number) => void;
export const wzpcryptosession_decrypt: (a: number, b: number, c: number, d: number, e: number) => [number, number, number, number];
export const wzpcryptosession_encrypt: (a: number, b: number, c: number, d: number, e: number) => [number, number, number, number];
export const wzpcryptosession_new: (a: number, b: number) => [number, number, number];
export const wzpcryptosession_recv_seq: (a: number) => number;
export const wzpcryptosession_send_seq: (a: number) => number;
export const wzpfecdecoder_add_symbol: (a: number, b: number, c: number, d: number, e: number, f: number) => [number, number];
export const wzpfecdecoder_new: (a: number, b: number) => number;
export const wzpfecencoder_add_symbol: (a: number, b: number, c: number) => [number, number];
export const wzpfecencoder_flush: (a: number) => [number, number];
export const wzpfecencoder_new: (a: number, b: number) => number;
export const wzpkeyexchange_derive_shared_secret: (a: number, b: number, c: number) => [number, number, number, number];
export const wzpkeyexchange_new: () => number;
export const wzpkeyexchange_public_key: (a: number) => [number, number];
export const __wbindgen_exn_store: (a: number) => void;
export const __externref_table_alloc: () => number;
export const __wbindgen_externrefs: WebAssembly.Table;
export const __wbindgen_malloc: (a: number, b: number) => number;
export const __externref_table_dealloc: (a: number) => void;
export const __wbindgen_free: (a: number, b: number, c: number) => void;
export const __wbindgen_start: () => void;


@@ -0,0 +1,166 @@
# Incident Report: Playout Ring Buffer Cursor Desync — Bidirectional Audio Loss
**Date:** 2026-04-06
**Severity:** Critical — causes 10-16 seconds of complete bidirectional silence mid-call
**Status:** Root-caused, fix pending
**Affects:** All clients using `AudioRing` (Android, potentially desktop)
## Summary
Both participants in a call experience simultaneous, prolonged audio silence (10-16 seconds) despite the QUIC transport, relay, and Rust codec pipeline all functioning normally. The root cause is a cursor desynchronization in the lock-free SPSC ring buffer (`AudioRing`) that transfers decoded PCM from the Rust recv task to the Kotlin AudioTrack playout thread.
## How We Know It's the Ring Buffer
### Evidence that eliminates other components
| Component | Evidence it's healthy | Source |
|-----------|----------------------|--------|
| **QUIC send path** | `frames_dropped=0, send_errors=0` on both clients | Engine send stats log |
| **QUIC recv path** | `max_recv_gap_ms=82, recv_errors=0` — no gaps >82ms | Engine recv stats log |
| **Relay forwarding** | `max_forward_ms=0, send_errors=0` in previous relay-instrumented test | Relay debug logging |
| **Opus codec** | `frames_decoded=2442` over 51.9s = 47 frames/sec (correct for 20ms) | Final stats JSON |
| **FEC** | `fec_recovered=4870` — FEC working normally | Final stats JSON |
| **Audio capture** | Pixel 6 capture has 0% silence; Nothing has gaps but those are expected mic pauses | capture_rms.csv |
### Evidence pointing to the ring buffer
1. **Both clients go silent at the exact same wall-clock moment (26.66s into call)** — rules out per-device issues; the common factor is the relay, but the relay was proven healthy in prior tests.
2. **`playout_avail=8640` at stats dump time** — the playout ring reports 8640 samples available (180ms, nearly full at the 9600 capacity). The recv task believes it has successfully written data into the ring. But the AudioTrack playout thread is reading silence (RMS=0 for 12+ seconds).
3. **Recv task continued receiving packets with no gaps** — `max_recv_gap_ms=82` across the entire call. The decoded audio was written to the ring continuously.
4. **Silence starts and ends cleanly** — the transition from audio → silence happens within a single 20ms frame (frame 1332: rms=101, frame 1333: rms=0). This is not network degradation (which shows gradual quality loss). It's a discrete state change — the reader suddenly stops seeing data.
5. **Recovery is also discrete** — at ~38.8s (Sharp Hawk) and ~42.7s (Pixel 6), audio snaps back with high-energy frames (rms=3296+). Not a gradual reconnection.
## The Ring Buffer Code
**File:** `crates/wzp-android/src/audio_ring.rs`
```rust
const RING_CAPACITY: usize = 960 * 10; // 9600 samples = 200ms at 48kHz
pub struct AudioRing {
buf: Box<[i16; RING_CAPACITY]>,
write_pos: AtomicUsize, // monotonically increasing, wraps at usize::MAX
read_pos: AtomicUsize, // monotonically increasing, wraps at usize::MAX
}
```
### `available()` — how many samples can be read
```rust
pub fn available(&self) -> usize {
let w = self.write_pos.load(Ordering::Acquire);
let r = self.read_pos.load(Ordering::Acquire);
w.wrapping_sub(r) // relies on usize wrapping arithmetic
}
```
### `write()` — producer (Rust recv task thread, inside tokio block_on)
```rust
pub fn write(&self, samples: &[i16]) -> usize {
let w = self.write_pos.load(Ordering::Relaxed);
let count = samples.len().min(RING_CAPACITY);
// ... write samples at (w + i) % RING_CAPACITY ...
self.write_pos.store(w.wrapping_add(count), Ordering::Release);
// If we overwrote unread data, advance read_pos
if self.available() > RING_CAPACITY {
let new_read = self.write_pos.load(Ordering::Relaxed).wrapping_sub(RING_CAPACITY);
self.read_pos.store(new_read, Ordering::Release);
    }
    count
}
```
### `read()` — consumer (Kotlin AudioTrack JVM thread, via JNI)
```rust
pub fn read(&self, out: &mut [i16]) -> usize {
let avail = self.available();
let count = out.len().min(avail);
let r = self.read_pos.load(Ordering::Relaxed);
// ... read samples at (r + i) % RING_CAPACITY ...
self.read_pos.store(r.wrapping_add(count), Ordering::Release);
count
}
```
## Suspected Failure Modes
### 1. Writer advances `read_pos` while reader is mid-read (data race)
The `write()` method at lines 68-72 modifies `read_pos` from the writer thread when it detects overflow. But the `read()` method on the consumer thread also modifies `read_pos`. This violates the SPSC contract — `read_pos` is supposed to be owned by the consumer.
**Scenario:**
1. Reader loads `read_pos = R` (line 82)
2. Writer detects overflow, stores `read_pos = R'` (line 71) where `R' > R`
3. Reader finishes reading, stores `read_pos = R + count` (line 88) — **overwrites** the writer's `R'` with a stale, smaller value
After step 3, the ring's `read_pos` has gone backwards. Now `available()` returns `write_pos.wrapping_sub(old_read_pos)` which is larger than `RING_CAPACITY`. Every subsequent `write()` call hits the overflow branch and keeps advancing `read_pos`, but the reader keeps overwriting it back. The ring is in a corrupted state where the reader and writer are fighting over `read_pos`.
### 2. `wrapping_sub` returns astronomically large values
`available()` uses `w.wrapping_sub(r)`. On a 64-bit platform, if due to the race above `r > w`, `wrapping_sub` returns `usize::MAX - (r - w) + 1` — an enormous number. The `read()` method caps this with `out.len().min(avail)` so it reads `out.len()` samples. But those samples are from indices calculated as `(r + i) % RING_CAPACITY` which wraps correctly. The samples read would be whatever was in the buffer at those positions — potentially stale/old data, or zeros from initialization.
However, the playout RMS CSV shows clean zeros (RMS=0), not garbage. This suggests the ring is returning the zeroed-out initial buffer contents, meaning `read_pos` has jumped far ahead of `write_pos`, pointing to memory that was never written to (or was written long ago and has since been zeroed by the overflow advance logic).
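The pathological `available()` value can be reproduced in isolation. This is a standalone sketch of the same wrapping arithmetic, not the real `AudioRing`:

```rust
// Standalone demonstration of the wrapping-cursor arithmetic described above.
// `available` here is the same computation as AudioRing::available(), lifted
// out so the two cursor states can be compared directly.
const RING_CAPACITY: usize = 9600;

fn available(write_pos: usize, read_pos: usize) -> usize {
    write_pos.wrapping_sub(read_pos)
}

fn main() {
    // Healthy state: writer is one 20 ms frame (960 samples) ahead.
    assert_eq!(available(1_200_960, 1_200_000), 960);

    // Desynced state: a stale reader store left read_pos PAST write_pos.
    // wrapping_sub underflows to an astronomically large value.
    let avail = available(1_200_000, 1_200_960);
    assert!(avail > RING_CAPACITY);
    assert_eq!(avail, usize::MAX - 959);
    println!("desynced available() = {avail}");
}
```

Every call site that trusts `available() <= RING_CAPACITY` misbehaves once the cursors cross; the `min()` cap in `read()` merely hides the corruption rather than repairing it.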
### 3. Why the silence lasts 12-16 seconds
After the desync, each `write()` call (every 20ms when a packet is decoded) enters the overflow branch and resets `read_pos`. But the reader immediately overwrites it back in its next `read()` call. This tug-of-war continues until one of two things happens:
- The cursors happen to realign through wrapping arithmetic
- A timing coincidence where the writer's store to `read_pos` happens to "win" the race
The 12-16 second duration is non-deterministic and depends on exact thread scheduling and cursor values.
## Reproduction Pattern
The bug manifests after roughly 25-30 seconds of a call. This timing is suspicious:
- At 48kHz mono, 20ms frames = 50 frames/sec
- Each decoded frame writes 960 samples to the ring
- After 25 seconds: `write_pos ≈ 25 * 50 * 960 = 1,200,000`
- The ring capacity is 9600, so `write_pos` has wrapped around `RING_CAPACITY` about 125 times
The wrapping of the monotonic cursors past certain thresholds, combined with the reader/writer `read_pos` race, likely triggers the desync at this scale.
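The arithmetic above can be checked directly (a pure-arithmetic sketch of the figures in this section; `cursor_stats` is an illustrative helper, not part of the codebase):

```rust
/// For a call of `seconds` duration, return the monotonic write cursor value
/// and how many times it has lapped the physical ring.
fn cursor_stats(seconds: usize) -> (usize, usize) {
    const RING_CAPACITY: usize = 9600;
    let frames_per_sec = 48_000 / 960; // 960-sample frames = 20 ms at 48 kHz
    let write_pos = seconds * frames_per_sec * 960;
    (write_pos, write_pos / RING_CAPACITY)
}

fn main() {
    let (write_pos, laps) = cursor_stats(25);
    assert_eq!(write_pos, 1_200_000); // matches the estimate above
    assert_eq!(laps, 125);            // ~125 wraps of the physical buffer
    println!("write_pos = {write_pos}, laps = {laps}");
}
```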
## Data Files
All data from two independent test sessions (3 calls total) is in `/workspace/wzp/debug/`:
| File | Contents |
|------|----------|
| `wzp_debug_20260406_120546.zip` | Sharp Hawk (Nothing A059) — 51.9s call |
| `wzp_debug_20260406_120549.zip` | Bright Viper (Pixel 6) — 51.9s call |
| `wzp_debug_20260406_111733.zip` | Sharp Hawk — earlier 72.0s call, same pattern |
| `wzp_debug_20260406_111735.zip` | Bright Viper — earlier 72.0s call, same pattern |
| `wzp_debug_20260406_105858.zip` | First session (pre-logging fix), 39.8s call |
| `wzp_debug_20260406_105900.zip` | First session, 39.7s call |
### Key fields in each zip
- `meta.txt` — device, duration, final stats JSON
- `playout_rms.csv` — per-frame (20ms) RMS of AudioTrack output; silence = RMS 0
- `capture_rms.csv` — per-frame RMS of AudioRecord input
- `logcat.txt` — Android logcat filtered to WZP + audio tags
### How to reproduce the analysis
```python
import csv
with open("playout_rms.csv") as f:
for row in csv.DictReader(f):
if int(row['rms']) == 0 and int(row['time_ms']) > 2000:
print(f"SILENCE at {row['time_ms']}ms")
```
## Affected Code
- `crates/wzp-android/src/audio_ring.rs` — the `AudioRing` struct, specifically the `write()` method's overflow handling that mutates `read_pos` from the producer thread
- Any client using `AudioRing` for playout (currently only Android; desktop uses `cpal` directly)
## Constraints for the Fix
1. Must remain lock-free — AudioTrack thread runs at `Thread.MAX_PRIORITY` and cannot block
2. Must handle overflow gracefully — if the reader falls behind, old audio should be dropped, not cause a desync
3. The writer (Rust recv task) and reader (Kotlin AudioTrack via JNI) run on different threads with different scheduling priorities
4. Ring capacity is 200ms which is tight — any fix must not increase latency significantly
5. The `write_pos` and `read_pos` are the only synchronization mechanism (no mutex, no condvar)
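One direction that satisfies these constraints (a sketch under the constraints above, not a committed fix): keep both cursors monotonic, but publish every `read_pos` update with a compare-exchange so a stale store can never move the cursor backwards. Sample storage is elided; only the cursor protocol is shown:

```rust
use std::sync::atomic::{AtomicUsize, Ordering};

const RING_CAPACITY: usize = 9600;

/// Cursor protocol only — the i16 sample buffer is elided for brevity.
struct Cursors {
    write_pos: AtomicUsize,
    read_pos: AtomicUsize,
}

impl Cursors {
    fn new() -> Self {
        Self { write_pos: AtomicUsize::new(0), read_pos: AtomicUsize::new(0) }
    }

    fn available(&self) -> usize {
        self.write_pos.load(Ordering::Acquire)
            .wrapping_sub(self.read_pos.load(Ordering::Acquire))
    }

    /// Writer: advance write_pos; on overflow, drag read_pos forward with a
    /// CAS so a concurrent reader update is never silently overwritten.
    fn commit_write(&self, count: usize) {
        let w = self.write_pos.load(Ordering::Relaxed).wrapping_add(count);
        self.write_pos.store(w, Ordering::Release);
        loop {
            let r = self.read_pos.load(Ordering::Acquire);
            if w.wrapping_sub(r) <= RING_CAPACITY {
                break; // no overflow, or the reader already caught up
            }
            let new_r = w.wrapping_sub(RING_CAPACITY);
            if self.read_pos
                .compare_exchange(r, new_r, Ordering::AcqRel, Ordering::Acquire)
                .is_ok()
            {
                break;
            }
            // CAS lost to the reader — re-check with the fresh cursor.
        }
    }

    /// Reader: claim up to `count` samples. If the writer's overflow advance
    /// moved read_pos under us, the CAS fails and we retry with the new value
    /// instead of storing a stale, smaller cursor over it.
    fn commit_read(&self, count: usize) -> usize {
        loop {
            let r = self.read_pos.load(Ordering::Acquire);
            let w = self.write_pos.load(Ordering::Acquire);
            let n = count.min(w.wrapping_sub(r));
            if n == 0 {
                return 0;
            }
            if self.read_pos
                .compare_exchange(r, r.wrapping_add(n), Ordering::AcqRel, Ordering::Acquire)
                .is_ok()
            {
                return n;
            }
        }
    }
}

fn main() {
    let c = Cursors::new();
    c.commit_write(960);
    assert_eq!(c.available(), 960);
    assert_eq!(c.commit_read(960), 960);
    // Writer floods the ring: read_pos is dragged forward, never backwards.
    c.commit_write(RING_CAPACITY * 3);
    assert_eq!(c.available(), RING_CAPACITY);
    println!("available after overflow: {}", c.available());
}
```

This stays lock-free (CAS loops only, no blocking), and the cursors can no longer move backwards, which removes the tug-of-war described in failure mode 1. It does not by itself prevent the reader from consuming samples the writer is about to overwrite; a full fix would still need to validate the claimed range after copying.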


@@ -0,0 +1,123 @@
# Incident Report: Send Task Fatal Exit on QUIC Congestion
**Date:** 2026-04-06
**Severity:** High — causes complete audio loss mid-call
**Status:** Fixed in Android client, **pending fix in desktop client and web client**
## Summary
A QUIC congestion event causes `send_datagram()` to return `Err(Blocked)`. The send task treats this as a fatal error and exits, which kills the entire call via `tokio::select!`. Audio becomes one-way (recv still works briefly) then dies completely.
## Root Cause
In the engine's send loop (`run_call` function), `transport.send_media()` errors were handled with `break`:
```rust
// BEFORE (broken)
if let Err(e) = transport.send_media(&source_pkt).await {
error!("send error: {e}");
break; // <-- kills send task, which kills everything
}
```
Quinn's `send_datagram()` is synchronous and returns `Err(SendDatagramError::Blocked)` when the QUIC congestion window is full. This is a **transient condition** — the window opens again once ACKs arrive. But the `break` kills the send task, and since all tasks run under `tokio::select!`, the recv task, stats task, and signal task all die too.
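The difference between the two policies can be illustrated with a toy simulation (pure illustration — `SendResult` is a stand-in enum, not Quinn's actual error type):

```rust
#[derive(Clone, Copy, PartialEq)]
enum SendResult { Ok, Blocked } // toy stand-in for a transient congestion error

/// Break policy (the old behavior): the first error ends the send loop,
/// and with it — via tokio::select! — the whole call.
fn frames_sent_break(results: &[SendResult]) -> usize {
    let mut sent = 0;
    for r in results {
        if *r == SendResult::Blocked { break; }
        sent += 1;
    }
    sent
}

/// Continue policy (the fix): drop the blocked packet and keep going.
fn frames_sent_continue(results: &[SendResult]) -> usize {
    results.iter().filter(|r| **r == SendResult::Ok).count()
}

fn main() {
    use SendResult::*;
    // 100 frames (2 seconds of audio) with one congestion blip at frame 10.
    let mut stream = vec![Ok; 100];
    stream[10] = Blocked;
    assert_eq!(frames_sent_break(&stream), 10);    // call effectively dies
    assert_eq!(frames_sent_continue(&stream), 99); // one 20 ms glitch
    println!("break: 10 frames, continue: 99 frames");
}
```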
### Why it manifests as "intermittent disconnections"
- Mobile networks have brief congestion spikes (cell tower handoff, WiFi interference)
- A single spike fills the QUIC congestion window
- One `Blocked` error → send task exits → `select!` cancels recv → complete silence
- The QUIC connection stays open (no error logged), so stats polling continues showing stale data
- From the user's perspective: audio drops for 5-20 seconds then "maybe comes back" (it doesn't — they're hearing the cached playout ring contents drain)
### Evidence from debug reports
**Relay logs** confirmed the relay was healthy:
- `max_forward_ms=0` — relay forwards instantly
- `send_errors=0` — no relay-side failures
- The relay saw `large recv gap` warnings on participant 1 (Nothing A059): 722ms → 814ms → 1778ms → 3500ms → 6091ms — the client progressively stopped sending
**Client stats** confirmed:
- `frames_encoded` kept incrementing (Opus encoder running)
- `frames_decoded` froze at a fixed value (recv task died)
- `fec_recovered` froze simultaneously
- RTT, loss, jitter all frozen (stats task died)
## Fix Applied
### Android client (`crates/wzp-android/src/engine.rs`)
```rust
// AFTER (fixed)
if let Err(e) = transport.send_media(&source_pkt).await {
send_errors += 1;
frames_dropped += 1;
if send_errors <= 3 || last_send_error_log.elapsed().as_secs() >= 1 {
warn!(seq = s, send_errors, frames_dropped,
"send_media error (dropping packet): {e}");
last_send_error_log = Instant::now();
}
continue; // <-- drop packet, keep going
}
```
Same pattern applied to FEC repair packet sends.
Recv task also hardened: transient errors (non-closed/reset) are now logged and survived rather than causing exit.
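That classification can be sketched as a small predicate (hypothetical helper name; the shipped code matches on the error's display string inline):

```rust
/// Hypothetical helper mirroring the recv-task hardening described above:
/// only connection-terminating errors ("closed"/"reset") should end the task;
/// everything else is logged and survived.
fn is_fatal_recv_error(msg: &str) -> bool {
    msg.contains("closed") || msg.contains("reset")
}

fn main() {
    assert!(is_fatal_recv_error("connection closed"));
    assert!(is_fatal_recv_error("stream reset by peer"));
    assert!(!is_fatal_recv_error("datagram dropped")); // transient: keep going
    println!("classifier ok");
}
```

Matching on display strings is fragile; classifying on the transport error enum itself would be more robust where the type is available.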
Added periodic health logging to both tasks (5-second intervals):
- Send: `frames_sent`, `frames_dropped`, `send_errors`, `ring_avail`
- Recv: `frames_decoded`, `fec_recovered`, `recv_errors`, `max_recv_gap_ms`, `playout_avail`
### Relay (`crates/wzp-relay/src/room.rs`)
Added debug logging to both plain and trunked forwarding loops:
- Per-recv gap tracking (warns on >200ms gaps)
- Room manager lock contention tracking (warns on >10ms)
- Forward latency tracking (warns on >50ms)
- Send error counting with peer identification
- 5-second periodic stats with all above metrics
## Affected Clients — FIX REQUIRED
### Desktop client (`crates/wzp-client/src/cli.rs`)
**Lines 345-348:**
```rust
if let Err(e) = transport.send_media(pkt).await {
error!("send error: {e}");
break; // <-- SAME BUG
}
```
**Lines 431-434:**
```rust
if let Err(e) = send_transport.send_media(pkt).await {
error!("send error: {e}");
return; // <-- SAME BUG
}
```
Both need the same continue-on-error pattern.
### Web client (`crates/wzp-web/src/main.rs`)
Needs audit — the WebSocket transport may have different error semantics, but the same pattern should be checked.
## Testing
After fix, a congestion event will:
1. Log warnings with packet counts: `send_media error (dropping packet): Blocked`
2. Drop affected packets (brief audio glitch — ~20-100ms)
3. Resume normal sending once congestion window opens
4. FEC on the receiver side will recover most dropped packets
5. Call continues uninterrupted
## Timeline
- 10:37 — First crash observed (LinearProgressIndicator compose bug masked investigation)
- 10:58 — Debug reports collected, decoded stall pattern identified
- 11:16 — Relay debug logging deployed, confirmed relay is clean
- 11:17 — Second debug reports collected, send gaps correlated with relay recv gaps
- 11:30 — Root cause identified: `break` on `send_media` error in send task
- 11:45 — Fix applied and deployed

desktop/.gitignore vendored Normal file

@@ -0,0 +1,2 @@
node_modules/
dist/


@@ -0,0 +1,8 @@
{
"hash": "9046c0bf",
"configHash": "ef0fc96f",
"lockfileHash": "d66891b1",
"browserHash": "8171ed59",
"optimized": {},
"chunks": {}
}


@@ -0,0 +1,3 @@
{
"type": "module"
}

desktop/index.html Normal file

@@ -0,0 +1,143 @@
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8" />
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
<title>WarzonePhone</title>
<link rel="stylesheet" href="/src/style.css" />
</head>
<body>
<div id="app">
<!-- Connect screen -->
<div id="connect-screen">
<h1>WarzonePhone</h1>
<p class="subtitle">Encrypted Voice</p>
<div class="form">
<label>Relay
<button id="relay-selected" class="relay-selected" type="button">
<span id="relay-dot" class="dot"></span>
<span id="relay-label">Select relay...</span>
<span class="arrow">&#9881;</span>
</button>
</label>
<label>Room
<input id="room" type="text" value="android" />
</label>
<label>Alias
<input id="alias" type="text" placeholder="your name" />
</label>
<div class="form-row">
<label class="checkbox">
<input id="os-aec" type="checkbox" checked />
OS Echo Cancel
</label>
<button id="settings-btn-home" class="icon-btn" title="Settings (Cmd+,)">&#9881;</button>
</div>
<button id="connect-btn" class="primary">Connect</button>
<p id="connect-error" class="error"></p>
</div>
<div class="identity-info">
<span id="my-identicon"></span>
<span id="my-fingerprint" class="fp-display"></span>
</div>
<div class="recent-rooms" id="recent-rooms"></div>
</div>
<!-- In-call screen -->
<div id="call-screen" class="hidden">
<div class="call-header">
<div class="call-header-row">
<div id="room-name" class="room-name"></div>
<button id="settings-btn-call" class="icon-btn small" title="Settings (Cmd+,)">&#9881;</button>
</div>
<div class="call-meta">
<span id="call-status" class="status-dot"></span>
<span id="call-timer" class="call-timer">0:00</span>
</div>
</div>
<div class="level-meter">
<div id="level-bar" class="level-bar-fill"></div>
</div>
<div id="participants" class="participants"></div>
<div class="controls">
<button id="mic-btn" class="control-btn" title="Toggle Mic (m)">
<span class="icon" id="mic-icon">Mic</span>
</button>
<button id="hangup-btn" class="control-btn hangup" title="Hang Up (q)">
<span class="icon">End</span>
</button>
<button id="spk-btn" class="control-btn" title="Toggle Speaker (s)">
<span class="icon" id="spk-icon">Spk</span>
</button>
</div>
<div id="stats" class="stats"></div>
</div>
<!-- Settings panel -->
<div id="settings-panel" class="hidden">
<div class="settings-card">
<div class="settings-header">
<h2>Settings</h2>
<button id="settings-close" class="icon-btn">&times;</button>
</div>
<div class="settings-section">
<h3>Connection</h3>
<label>Default Room
<input id="s-room" type="text" />
</label>
<label>Alias
<input id="s-alias" type="text" />
</label>
</div>
<div class="settings-section">
<h3>Audio</h3>
<label class="checkbox">
<input id="s-os-aec" type="checkbox" />
OS Echo Cancellation (macOS VoiceProcessingIO)
</label>
<label class="checkbox">
<input id="s-agc" type="checkbox" checked />
Automatic Gain Control
</label>
</div>
<div class="settings-section">
<h3>Identity</h3>
<div class="setting-row">
<span class="setting-label">Fingerprint</span>
<span id="s-fingerprint" class="fp-display-large"></span>
</div>
<div class="setting-row">
<span class="setting-label">Identity file</span>
<span class="fp-display">~/.wzp/identity</span>
</div>
</div>
<div class="settings-section">
<h3>Recent Rooms</h3>
<div id="s-recent-rooms" class="recent-rooms-list"></div>
<button id="s-clear-recent" class="secondary-btn">Clear History</button>
</div>
<button id="settings-save" class="primary">Save</button>
</div>
</div>
<!-- Manage Relays dialog -->
<div id="relay-dialog" class="hidden">
<div class="settings-card relay-dialog-card">
<div class="settings-header">
<h2>Manage Relays</h2>
<button id="relay-dialog-close" class="icon-btn">&times;</button>
</div>
<div id="relay-dialog-list" class="relay-dialog-list"></div>
<div class="relay-add-row">
<div class="relay-add-inputs">
<input id="relay-add-name" type="text" placeholder="Name" />
<input id="relay-add-addr" type="text" placeholder="host:port" />
</div>
<button id="relay-add-btn" class="primary">Add Relay</button>
</div>
</div>
</div>
</div>
<script type="module" src="/src/main.ts"></script>
</body>
</html>

desktop/package-lock.json generated Normal file

File diff suppressed because it is too large

desktop/package.json Normal file

@@ -0,0 +1,19 @@
{
"name": "wzp-desktop",
"private": true,
"version": "0.1.0",
"type": "module",
"scripts": {
"dev": "vite",
"build": "vite build",
"tauri": "tauri"
},
"dependencies": {
"@tauri-apps/api": "^2"
},
"devDependencies": {
"typescript": "^5",
"vite": "^6",
"@tauri-apps/cli": "^2"
}
}


@@ -0,0 +1,36 @@
[package]
name = "wzp-desktop"
version = "0.1.0"
edition = "2024"
description = "WarzonePhone Desktop — encrypted VoIP client"
default-run = "wzp-desktop"
[build-dependencies]
tauri-build = { version = "2", features = [] }
[dependencies]
tauri = { version = "2", features = [] }
tauri-plugin-shell = "2"
serde = { version = "1", features = ["derive"] }
serde_json = "1"
tokio = { version = "1", features = ["full"] }
tracing = "0.1"
tracing-subscriber = "0.3"
anyhow = "1"
rustls = { version = "0.23", default-features = false, features = ["ring", "std"] }
# WarzonePhone crates
wzp-proto = { path = "../../crates/wzp-proto" }
wzp-codec = { path = "../../crates/wzp-codec" }
wzp-fec = { path = "../../crates/wzp-fec" }
wzp-crypto = { path = "../../crates/wzp-crypto" }
wzp-transport = { path = "../../crates/wzp-transport" }
wzp-client = { path = "../../crates/wzp-client", features = ["audio", "vpio"] }
# Platform-specific
[target.'cfg(target_os = "macos")'.dependencies]
coreaudio-rs = "0.11"
[features]
default = ["custom-protocol"]
custom-protocol = ["tauri/custom-protocol"]


@@ -0,0 +1,3 @@
fn main() {
tauri_build::build()
}

File diff suppressed because one or more lines are too long


@@ -0,0 +1 @@
{}

File diff suppressed because it is too large

File diff suppressed because it is too large

Binary file not shown (image, 104 B).


@@ -0,0 +1,365 @@
//! Call engine for the desktop app — wraps wzp-client audio + transport
//! into a clean async interface for Tauri commands.
use std::net::SocketAddr;
use std::sync::atomic::{AtomicBool, AtomicU32, AtomicU64, Ordering};
use std::sync::Arc;
use std::time::Instant;
use tokio::sync::Mutex;
use tracing::{error, info};
use wzp_client::audio_io::{AudioCapture, AudioPlayback};
use wzp_client::call::{CallConfig, CallEncoder};
use wzp_proto::MediaTransport;
const FRAME_SAMPLES: usize = 960;
/// Wrapper to make non-Sync audio handles safe to store in shared state.
/// The audio handle is only accessed from the thread that created it (drop),
/// never shared across threads — Sync is safe.
#[allow(dead_code)]
struct SyncWrapper(Box<dyn std::any::Any + Send>);
unsafe impl Sync for SyncWrapper {}
pub struct ParticipantInfo {
pub fingerprint: String,
pub alias: Option<String>,
}
pub struct EngineStatus {
pub mic_muted: bool,
pub spk_muted: bool,
pub participants: Vec<ParticipantInfo>,
pub frames_sent: u64,
pub frames_received: u64,
pub audio_level: u32,
pub call_duration_secs: f64,
pub fingerprint: String,
}
pub struct CallEngine {
running: Arc<AtomicBool>,
mic_muted: Arc<AtomicBool>,
spk_muted: Arc<AtomicBool>,
participants: Arc<Mutex<Vec<ParticipantInfo>>>,
frames_sent: Arc<AtomicU64>,
frames_received: Arc<AtomicU64>,
audio_level: Arc<AtomicU32>,
transport: Arc<wzp_transport::QuinnTransport>,
start_time: Instant,
fingerprint: String,
/// Keep audio handles alive for the duration of the call.
/// Wrapped in SyncWrapper because AudioUnit isn't Sync.
_audio_handle: SyncWrapper,
}
impl CallEngine {
pub async fn start<F>(
relay: String,
room: String,
alias: String,
_os_aec: bool,
event_cb: F,
) -> Result<Self, anyhow::Error>
where
F: Fn(&str, &str) + Send + Sync + 'static,
{
let _ = rustls::crypto::ring::default_provider().install_default();
let relay_addr: SocketAddr = relay.parse()?;
// Load or generate identity
let seed = {
let path = {
let home = std::env::var("HOME").unwrap_or_else(|_| ".".into());
std::path::PathBuf::from(home).join(".wzp").join("identity")
};
if path.exists() {
if let Ok(hex) = std::fs::read_to_string(&path) {
if let Ok(s) = wzp_crypto::Seed::from_hex(hex.trim()) {
s
} else {
wzp_crypto::Seed::generate()
}
} else {
wzp_crypto::Seed::generate()
}
} else {
let s = wzp_crypto::Seed::generate();
if let Some(p) = path.parent() {
std::fs::create_dir_all(p).ok();
}
let hex: String = s.0.iter().map(|b| format!("{b:02x}")).collect();
std::fs::write(&path, hex).ok();
s
}
};
let fp = seed.derive_identity().public_identity().fingerprint;
let fingerprint = fp.to_string();
info!(%fp, "identity loaded");
// Connect
let bind_addr: SocketAddr = "0.0.0.0:0".parse().unwrap();
let endpoint = wzp_transport::create_endpoint(bind_addr, None)?;
let client_config = wzp_transport::client_config();
let conn = wzp_transport::connect(&endpoint, relay_addr, &room, client_config).await?;
let transport = Arc::new(wzp_transport::QuinnTransport::new(conn));
// Handshake
let _session = wzp_client::handshake::perform_handshake(
&*transport,
&seed.0,
Some(&alias),
)
.await?;
info!("connected to relay, handshake complete");
event_cb("connected", &format!("joined room {room}"));
// Audio I/O — VPIO (OS AEC) on macOS, plain CPAL otherwise.
// The audio handle must be stored in CallEngine to keep streams alive.
let (capture_ring, playout_ring, audio_handle): (_, _, Box<dyn std::any::Any + Send>) =
if _os_aec {
#[cfg(target_os = "macos")]
{
match wzp_client::audio_vpio::VpioAudio::start() {
Ok(v) => {
let cr = v.capture_ring().clone();
let pr = v.playout_ring().clone();
info!("using VoiceProcessingIO (OS AEC)");
(cr, pr, Box::new(v))
}
Err(e) => {
info!("VPIO failed ({e}), falling back to CPAL");
let capture = AudioCapture::start()?;
let playback = AudioPlayback::start()?;
let cr = capture.ring().clone();
let pr = playback.ring().clone();
(cr, pr, Box::new((capture, playback)))
}
}
}
#[cfg(not(target_os = "macos"))]
{
info!("OS AEC not available on this platform, using CPAL");
let capture = AudioCapture::start()?;
let playback = AudioPlayback::start()?;
let cr = capture.ring().clone();
let pr = playback.ring().clone();
(cr, pr, Box::new((capture, playback)))
}
} else {
let capture = AudioCapture::start()?;
let playback = AudioPlayback::start()?;
let cr = capture.ring().clone();
let pr = playback.ring().clone();
(cr, pr, Box::new((capture, playback)))
};
let running = Arc::new(AtomicBool::new(true));
let mic_muted = Arc::new(AtomicBool::new(false));
let spk_muted = Arc::new(AtomicBool::new(false));
let participants: Arc<Mutex<Vec<ParticipantInfo>>> = Arc::new(Mutex::new(vec![]));
let frames_sent = Arc::new(AtomicU64::new(0));
let frames_received = Arc::new(AtomicU64::new(0));
let audio_level = Arc::new(AtomicU32::new(0));
// Send task
let send_t = transport.clone();
let send_r = running.clone();
let send_mic = mic_muted.clone();
let send_fs = frames_sent.clone();
let send_level = audio_level.clone();
let send_drops = Arc::new(AtomicU64::new(0));
tokio::spawn(async move {
let config = CallConfig {
noise_suppression: false,
suppression_enabled: false,
..CallConfig::default()
};
let mut encoder = CallEncoder::new(&config);
encoder.set_aec_enabled(false); // OS AEC or none
let mut buf = vec![0i16; FRAME_SAMPLES];
loop {
if !send_r.load(Ordering::Relaxed) {
break;
}
if capture_ring.available() < FRAME_SAMPLES {
tokio::time::sleep(std::time::Duration::from_millis(5)).await;
continue;
}
capture_ring.read(&mut buf);
// Compute RMS audio level for UI meter
if !buf.is_empty() {
let sum_sq: f64 = buf.iter().map(|&s| (s as f64) * (s as f64)).sum();
let rms = (sum_sq / buf.len() as f64).sqrt() as u32;
send_level.store(rms, Ordering::Relaxed);
}
if send_mic.load(Ordering::Relaxed) {
buf.fill(0);
}
match encoder.encode_frame(&buf) {
Ok(pkts) => {
for pkt in &pkts {
if let Err(e) = send_t.send_media(pkt).await {
// Transient congestion (Blocked) — drop packet, keep going
send_drops.fetch_add(1, Ordering::Relaxed);
if send_drops.load(Ordering::Relaxed) <= 3 {
tracing::warn!("send_media error (dropping packet): {e}");
}
}
}
send_fs.fetch_add(1, Ordering::Relaxed);
}
Err(e) => error!("encode: {e}"),
}
}
});
// Recv task (direct playout)
let recv_t = transport.clone();
let recv_r = running.clone();
let recv_spk = spk_muted.clone();
let recv_fr = frames_received.clone();
tokio::spawn(async move {
let mut opus_dec = wzp_codec::create_decoder(wzp_proto::QualityProfile::GOOD);
let mut agc = wzp_codec::AutoGainControl::new();
let mut pcm = vec![0i16; FRAME_SAMPLES];
loop {
if !recv_r.load(Ordering::Relaxed) {
break;
}
match tokio::time::timeout(
std::time::Duration::from_millis(100),
recv_t.recv_media(),
)
.await
{
Ok(Ok(Some(pkt))) => {
if !pkt.header.is_repair {
if let Ok(n) = opus_dec.decode(&pkt.payload, &mut pcm) {
agc.process_frame(&mut pcm[..n]);
if !recv_spk.load(Ordering::Relaxed) {
playout_ring.write(&pcm[..n]);
}
}
}
recv_fr.fetch_add(1, Ordering::Relaxed);
}
Ok(Ok(None)) => break,
Ok(Err(e)) => {
let msg = e.to_string();
if msg.contains("closed") || msg.contains("reset") {
error!("recv fatal: {e}");
break;
}
// Transient error — continue
}
Err(_) => {}
}
}
});
// Signal task (presence)
let sig_t = transport.clone();
let sig_r = running.clone();
let sig_p = participants.clone();
let event_cb = Arc::new(event_cb);
let sig_cb = event_cb.clone();
tokio::spawn(async move {
loop {
if !sig_r.load(Ordering::Relaxed) {
break;
}
match tokio::time::timeout(
std::time::Duration::from_millis(200),
sig_t.recv_signal(),
)
.await
{
Ok(Ok(Some(wzp_proto::SignalMessage::RoomUpdate {
participants: parts,
..
}))) => {
let mut seen = std::collections::HashSet::new();
let unique: Vec<ParticipantInfo> = parts
.into_iter()
.filter(|p| seen.insert((p.fingerprint.clone(), p.alias.clone())))
.map(|p| ParticipantInfo {
fingerprint: p.fingerprint,
alias: p.alias,
})
.collect();
let count = unique.len();
*sig_p.lock().await = unique;
sig_cb("room-update", &format!("{count} participants"));
}
Ok(Ok(Some(_))) => {}
Ok(Ok(None)) => break,
Ok(Err(_)) => break,
Err(_) => {}
}
}
});
Ok(Self {
running,
mic_muted,
spk_muted,
participants,
frames_sent,
frames_received,
audio_level,
transport,
start_time: Instant::now(),
fingerprint,
_audio_handle: SyncWrapper(audio_handle),
})
}
pub fn toggle_mic(&self) -> bool {
let was = self.mic_muted.load(Ordering::Relaxed);
self.mic_muted.store(!was, Ordering::Relaxed);
!was
}
pub fn toggle_speaker(&self) -> bool {
let was = self.spk_muted.load(Ordering::Relaxed);
self.spk_muted.store(!was, Ordering::Relaxed);
!was
}
pub async fn status(&self) -> EngineStatus {
let participants = {
let parts = self.participants.lock().await;
parts
.iter()
.map(|p| ParticipantInfo {
fingerprint: p.fingerprint.clone(),
alias: p.alias.clone(),
})
.collect()
}; // lock dropped here
EngineStatus {
mic_muted: self.mic_muted.load(Ordering::Relaxed),
spk_muted: self.spk_muted.load(Ordering::Relaxed),
participants,
frames_sent: self.frames_sent.load(Ordering::Relaxed),
frames_received: self.frames_received.load(Ordering::Relaxed),
audio_level: self.audio_level.load(Ordering::Relaxed),
call_duration_secs: self.start_time.elapsed().as_secs_f64(),
fingerprint: self.fingerprint.clone(),
}
}
pub async fn stop(self) {
self.running.store(false, Ordering::SeqCst);
self.transport.close().await.ok();
}
}


@@ -0,0 +1,241 @@
#![cfg_attr(not(debug_assertions), windows_subsystem = "windows")]
mod engine;
use engine::CallEngine;
use serde::Serialize;
use std::sync::Arc;
use tauri::Emitter;
use tokio::sync::Mutex;
#[derive(Clone, Serialize)]
struct CallEvent {
kind: String,
message: String,
}
#[derive(Clone, Serialize)]
struct Participant {
fingerprint: String,
alias: Option<String>,
}
#[derive(Clone, Serialize)]
struct CallStatus {
active: bool,
mic_muted: bool,
spk_muted: bool,
participants: Vec<Participant>,
encode_fps: u64,
recv_fps: u64,
audio_level: u32,
call_duration_secs: f64,
fingerprint: String,
}
struct AppState {
engine: Mutex<Option<CallEngine>>,
}
/// Ping result with RTT and server identity hash.
#[derive(Clone, Serialize)]
struct PingResult {
rtt_ms: u32,
/// Server identity: 64-bit non-cryptographic hash of the QUIC peer certificate, hex-encoded.
server_fingerprint: String,
}
/// Ping a relay to check if it's online, measure RTT, and get server identity.
#[tauri::command]
async fn ping_relay(relay: String) -> Result<PingResult, String> {
let addr: std::net::SocketAddr = relay.parse().map_err(|e| format!("bad address: {e}"))?;
let _ = rustls::crypto::ring::default_provider().install_default();
let bind: std::net::SocketAddr = "0.0.0.0:0".parse().unwrap();
let endpoint = wzp_transport::create_endpoint(bind, None).map_err(|e| format!("{e}"))?;
let client_cfg = wzp_transport::client_config();
let start = std::time::Instant::now();
let conn_result = tokio::time::timeout(
std::time::Duration::from_secs(3),
wzp_transport::connect(&endpoint, addr, "ping", client_cfg),
)
.await;
// Always close endpoint to prevent resource leaks
endpoint.close(0u32.into(), b"done");
match conn_result {
Ok(Ok(conn)) => {
let rtt_ms = start.elapsed().as_millis() as u32;
let server_fingerprint = conn
.peer_identity()
.and_then(|id| id.downcast::<Vec<rustls::pki_types::CertificateDer>>().ok())
.and_then(|certs| certs.first().map(|c| {
// Non-cryptographic 64-bit hash of the DER cert (display/TOFU use only).
use std::hash::{Hash, Hasher};
let mut hasher = std::collections::hash_map::DefaultHasher::new();
c.as_ref().hash(&mut hasher);
let h = hasher.finish();
format!("{h:016x}")
}))
.unwrap_or_else(|| {
// No peer cert available: derive a stable pseudo-fingerprint from the
// address. Wrapping arithmetic avoids the overflow panic that plain
// `*` and `+` would hit in debug builds (len * 0x9e37... exceeds u64).
format!(
"{:x}",
(addr.ip().to_string().len() as u64)
.wrapping_mul(0x9e3779b97f4a7c15)
.wrapping_add(addr.port() as u64)
)
});
conn.close(0u32.into(), b"ping");
Ok(PingResult { rtt_ms, server_fingerprint })
}
Ok(Err(e)) => Err(format!("{e}")),
Err(_) => Err("timeout (3s)".into()),
}
}
/// Read fingerprint from ~/.wzp/identity without connecting.
#[tauri::command]
fn get_identity() -> Result<String, String> {
let home = std::env::var("HOME")
.or_else(|_| std::env::var("USERPROFILE")) // Windows has no HOME
.unwrap_or_else(|_| ".".into());
let path = std::path::PathBuf::from(home).join(".wzp").join("identity");
if path.exists() {
if let Ok(hex) = std::fs::read_to_string(&path) {
if let Ok(seed) = wzp_crypto::Seed::from_hex(hex.trim()) {
let fp = seed.derive_identity().public_identity().fingerprint;
return Ok(fp.to_string());
}
}
}
// No identity yet — generate one so we can show the fingerprint
let seed = wzp_crypto::Seed::generate();
let fp = seed.derive_identity().public_identity().fingerprint;
if let Some(parent) = path.parent() {
std::fs::create_dir_all(parent).ok();
}
let hex: String = seed.0.iter().map(|b| format!("{b:02x}")).collect();
std::fs::write(&path, hex).ok();
Ok(fp.to_string())
}
#[tauri::command]
async fn connect(
state: tauri::State<'_, Arc<AppState>>,
app: tauri::AppHandle,
relay: String,
room: String,
alias: String,
os_aec: bool,
) -> Result<String, String> {
let mut engine_lock = state.engine.lock().await;
if engine_lock.is_some() {
return Err("already connected".into());
}
let app_clone = app.clone();
match CallEngine::start(relay, room, alias, os_aec, move |event_kind, message| {
let _ = app_clone.emit(
"call-event",
CallEvent {
kind: event_kind.to_string(),
message: message.to_string(),
},
);
})
.await
{
Ok(eng) => {
*engine_lock = Some(eng);
Ok("connected".into())
}
Err(e) => Err(format!("{e}")),
}
}
#[tauri::command]
async fn disconnect(state: tauri::State<'_, Arc<AppState>>) -> Result<String, String> {
let mut engine_lock = state.engine.lock().await;
if let Some(engine) = engine_lock.take() {
engine.stop().await;
Ok("disconnected".into())
} else {
Err("not connected".into())
}
}
#[tauri::command]
async fn toggle_mic(state: tauri::State<'_, Arc<AppState>>) -> Result<bool, String> {
let engine_lock = state.engine.lock().await;
if let Some(ref engine) = *engine_lock {
Ok(engine.toggle_mic())
} else {
Err("not connected".into())
}
}
#[tauri::command]
async fn toggle_speaker(state: tauri::State<'_, Arc<AppState>>) -> Result<bool, String> {
let engine_lock = state.engine.lock().await;
if let Some(ref engine) = *engine_lock {
Ok(engine.toggle_speaker())
} else {
Err("not connected".into())
}
}
#[tauri::command]
async fn get_status(state: tauri::State<'_, Arc<AppState>>) -> Result<CallStatus, String> {
let engine_lock = state.engine.lock().await;
if let Some(ref engine) = *engine_lock {
let status = engine.status().await;
Ok(CallStatus {
active: true,
mic_muted: status.mic_muted,
spk_muted: status.spk_muted,
participants: status
.participants
.into_iter()
.map(|p| Participant {
fingerprint: p.fingerprint,
alias: p.alias,
})
.collect(),
encode_fps: status.frames_sent,
recv_fps: status.frames_received,
audio_level: status.audio_level,
call_duration_secs: status.call_duration_secs,
fingerprint: status.fingerprint,
})
} else {
Ok(CallStatus {
active: false,
mic_muted: false,
spk_muted: false,
participants: vec![],
encode_fps: 0,
recv_fps: 0,
audio_level: 0,
call_duration_secs: 0.0,
fingerprint: String::new(),
})
}
}
fn main() {
tracing_subscriber::fmt().init();
let state = Arc::new(AppState {
engine: Mutex::new(None),
});
tauri::Builder::default()
.plugin(tauri_plugin_shell::init())
.manage(state)
.invoke_handler(tauri::generate_handler![
ping_relay,
get_identity,
connect,
disconnect,
toggle_mic,
toggle_speaker,
get_status,
])
.run(tauri::generate_context!())
.expect("error while running WarzonePhone Desktop");
}
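
The `server_fingerprint` that `ping_relay` returns feeds a trust-on-first-use check in the desktop frontend. A minimal sketch of that decision, with a hypothetical `lockFor` helper (the shipped logic lives in `main.ts` as `lockStatus`):

```typescript
// Sketch of the TOFU decision made with ping_relay's server_fingerprint.
// "lockFor" is a hypothetical name; the frontend's real function is lockStatus.
type LockState = "verified" | "new" | "changed" | "offline" | "unknown";
function lockFor(
  rtt: number | null | undefined,
  seen: string | null, // fingerprint from the latest ping
  known: string | null // fingerprint saved on first contact
): LockState {
  if (rtt === undefined || rtt === null) return "unknown"; // never pinged
  if (rtt < 0) return "offline";
  if (!seen || !known) return "new"; // first contact: trust and save
  return seen === known ? "verified" : "changed"; // changed => warn user
}
```

A "changed" result is the only state that blocks silently proceeding: the UI shows an accept/reject dialog before connecting.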


@@ -0,0 +1,33 @@
{
"productName": "WarzonePhone",
"version": "0.1.0",
"identifier": "com.wzp.desktop",
"build": {
"frontendDist": "../dist",
"devUrl": "http://localhost:1420",
"beforeDevCommand": "npm run dev",
"beforeBuildCommand": "npm run build"
},
"app": {
"windows": [
{
"title": "WarzonePhone",
"width": 400,
"height": 640,
"resizable": true,
"minWidth": 360,
"minHeight": 500
}
],
"security": {
"csp": null
}
},
"bundle": {
"active": true,
"targets": "all",
"icon": [
"icons/icon.png"
]
}
}

desktop/src/identicon.ts Normal file

@@ -0,0 +1,110 @@
/**
* Deterministic identicon generator — creates a unique symmetric pattern
* from a hex fingerprint string, similar to MetaMask's Jazzicon / Ethereum blockies.
*
* Returns an SVG data URL that can be used as an <img> src.
*/
function hashBytes(hex: string): number[] {
const clean = hex.replace(/[^0-9a-fA-F]/g, "");
const bytes: number[] = [];
for (let i = 0; i < clean.length; i += 2) {
bytes.push(parseInt(clean.substring(i, i + 2), 16));
}
// Pad to at least 16 bytes
while (bytes.length < 16) bytes.push(0);
return bytes;
}
function hslToRgb(h: number, s: number, l: number): [number, number, number] {
s /= 100;
l /= 100;
const k = (n: number) => (n + h / 30) % 12;
const a = s * Math.min(l, 1 - l);
const f = (n: number) =>
l - a * Math.max(-1, Math.min(k(n) - 3, Math.min(9 - k(n), 1)));
return [
Math.round(f(0) * 255),
Math.round(f(8) * 255),
Math.round(f(4) * 255),
];
}
export function generateIdenticon(
fingerprint: string,
size: number = 36
): string {
const bytes = hashBytes(fingerprint);
// Derive colors from first bytes
const hue1 = (bytes[0] * 360) / 256;
const hue2 = ((bytes[1] * 360) / 256 + 120) % 360;
const [r1, g1, b1] = hslToRgb(hue1, 65, 35); // dark bg
const [r2, g2, b2] = hslToRgb(hue2, 70, 55); // bright fg
const bg = `rgb(${r1},${g1},${b1})`;
const fg = `rgb(${r2},${g2},${b2})`;
// 5x5 grid, left-right symmetric (only need 3 columns)
const grid: boolean[][] = [];
for (let y = 0; y < 5; y++) {
const row: boolean[] = [];
for (let x = 0; x < 3; x++) {
const byteIdx = 2 + y * 3 + x;
row.push(bytes[byteIdx % bytes.length] > 128);
}
// Mirror: col 3 = col 1, col 4 = col 0
grid.push([row[0], row[1], row[2], row[1], row[0]]);
}
// Render SVG
const cellSize = size / 5;
const r = size * 0.12; // border radius
let rects = "";
for (let y = 0; y < 5; y++) {
for (let x = 0; x < 5; x++) {
if (grid[y][x]) {
rects += `<rect x="${x * cellSize}" y="${y * cellSize}" width="${cellSize}" height="${cellSize}" fill="${fg}"/>`;
}
}
}
const svg = `<svg xmlns="http://www.w3.org/2000/svg" width="${size}" height="${size}" viewBox="0 0 ${size} ${size}">
<rect width="${size}" height="${size}" rx="${r}" fill="${bg}"/>
${rects}
</svg>`;
return `data:image/svg+xml,${encodeURIComponent(svg)}`;
}
/**
* Create an <img> element with the identicon.
* Click copies the fingerprint to clipboard.
*/
export function createIdenticonEl(
fingerprint: string,
size: number = 36,
clickToCopy: boolean = true
): HTMLImageElement {
const img = document.createElement("img");
img.src = generateIdenticon(fingerprint, size);
img.width = size;
img.height = size;
img.style.borderRadius = `${size * 0.12}px`;
img.style.cursor = clickToCopy ? "pointer" : "default";
img.title = fingerprint;
if (clickToCopy && fingerprint) {
img.addEventListener("click", (e) => {
e.stopPropagation();
navigator.clipboard.writeText(fingerprint).then(() => {
img.style.outline = "2px solid #4ade80";
setTimeout(() => {
img.style.outline = "";
}, 600);
});
});
}
return img;
}
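
The construction above is deterministic: the same fingerprint always yields the same pattern. A minimal sketch of the byte-to-grid step, with hypothetical `hexToBytes`/`buildGrid` helpers (the module itself only exports `generateIdenticon` and `createIdenticonEl`):

```typescript
// Sketch of the identicon grid construction; helper names are hypothetical.
function hexToBytes(hex: string): number[] {
  const clean = hex.replace(/[^0-9a-fA-F]/g, "");
  const bytes: number[] = [];
  for (let i = 0; i < clean.length; i += 2) {
    bytes.push(parseInt(clean.substring(i, i + 2), 16));
  }
  while (bytes.length < 16) bytes.push(0); // pad short fingerprints
  return bytes;
}

// 5x5 grid, left-right symmetric: three generated columns, two mirrored.
function buildGrid(bytes: number[]): boolean[][] {
  const grid: boolean[][] = [];
  for (let y = 0; y < 5; y++) {
    const row: boolean[] = [];
    for (let x = 0; x < 3; x++) {
      row.push(bytes[(2 + y * 3 + x) % bytes.length] > 128);
    }
    grid.push([row[0], row[1], row[2], row[1], row[0]]);
  }
  return grid;
}
```

Mirroring columns 0 and 1 onto columns 4 and 3 is what makes every identicon bilaterally symmetric, which humans find easier to recognize at a glance.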

desktop/src/main.ts Normal file

@@ -0,0 +1,591 @@
import { invoke } from "@tauri-apps/api/core";
import { listen } from "@tauri-apps/api/event";
import { generateIdenticon, createIdenticonEl } from "./identicon";
// ── Elements ──
const connectScreen = document.getElementById("connect-screen")!;
const callScreen = document.getElementById("call-screen")!;
const roomInput = document.getElementById("room") as HTMLInputElement;
const aliasInput = document.getElementById("alias") as HTMLInputElement;
const osAecCheckbox = document.getElementById("os-aec") as HTMLInputElement;
const connectBtn = document.getElementById("connect-btn") as HTMLButtonElement;
const connectError = document.getElementById("connect-error")!;
const roomName = document.getElementById("room-name")!;
const callTimer = document.getElementById("call-timer")!;
const callStatus = document.getElementById("call-status")!;
const levelBar = document.getElementById("level-bar")!;
const participantsDiv = document.getElementById("participants")!;
const micBtn = document.getElementById("mic-btn")!;
const micIcon = document.getElementById("mic-icon")!;
const spkBtn = document.getElementById("spk-btn")!;
const spkIcon = document.getElementById("spk-icon")!;
const hangupBtn = document.getElementById("hangup-btn")!;
const statsDiv = document.getElementById("stats")!;
const myFingerprintEl = document.getElementById("my-fingerprint")!;
const myIdenticonEl = document.getElementById("my-identicon")!;
const recentRoomsDiv = document.getElementById("recent-rooms")!;
// Relay button
const relaySelected = document.getElementById("relay-selected")!;
const relayDot = document.getElementById("relay-dot")!;
const relayLabel = document.getElementById("relay-label")!;
// Relay dialog
const relayDialog = document.getElementById("relay-dialog")!;
const relayDialogClose = document.getElementById("relay-dialog-close")!;
const relayDialogList = document.getElementById("relay-dialog-list")!;
const relayAddName = document.getElementById("relay-add-name") as HTMLInputElement;
const relayAddAddr = document.getElementById("relay-add-addr") as HTMLInputElement;
const relayAddBtn = document.getElementById("relay-add-btn")!;
// Settings
const settingsPanel = document.getElementById("settings-panel")!;
const settingsClose = document.getElementById("settings-close")!;
const settingsSave = document.getElementById("settings-save")!;
const settingsBtnHome = document.getElementById("settings-btn-home")!;
const settingsBtnCall = document.getElementById("settings-btn-call")!;
const sRoom = document.getElementById("s-room") as HTMLInputElement;
const sAlias = document.getElementById("s-alias") as HTMLInputElement;
const sOsAec = document.getElementById("s-os-aec") as HTMLInputElement;
const sAgc = document.getElementById("s-agc") as HTMLInputElement;
const sFingerprint = document.getElementById("s-fingerprint")!;
const sRecentRooms = document.getElementById("s-recent-rooms")!;
const sClearRecent = document.getElementById("s-clear-recent")!;
let statusInterval: number | null = null;
let myFingerprint = "";
let userDisconnected = false;
// ── Data types ──
interface RelayServer {
name: string;
address: string;
rtt?: number | null;
serverFingerprint?: string | null; // from ping
knownFingerprint?: string | null; // saved TOFU fingerprint
}
interface RecentRoom { relay: string; room: string; }
interface Settings {
relays: RelayServer[];
selectedRelay: number;
room: string;
alias: string;
osAec: boolean;
agc: boolean;
recentRooms: RecentRoom[];
}
function loadSettings(): Settings {
const defaults: Settings = {
relays: [{ name: "Default", address: "193.180.213.68:4433" }],
selectedRelay: 0, room: "android", alias: "",
osAec: true, agc: true, recentRooms: [],
};
try {
const raw = localStorage.getItem("wzp-settings");
if (raw) {
const parsed = JSON.parse(raw);
if (parsed.relay && !parsed.relays) {
parsed.relays = [{ name: "Default", address: parsed.relay }];
parsed.selectedRelay = 0;
delete parsed.relay;
}
if (parsed.recentRooms?.length > 0 && typeof parsed.recentRooms[0] === "string") {
const addr = parsed.relays?.[0]?.address || defaults.relays[0].address;
parsed.recentRooms = parsed.recentRooms.map((r: string) => ({ relay: addr, room: r }));
}
return { ...defaults, ...parsed };
}
} catch {}
return defaults;
}
function saveSettingsObj(s: Settings) {
localStorage.setItem("wzp-settings", JSON.stringify(s));
}
function getSelectedRelay(): RelayServer | undefined {
const s = loadSettings();
return s.relays[s.selectedRelay];
}
// ── Helpers ──
// Escapes &, <, >, and both quote styles so the result is also safe inside
// attribute values (the textContent/innerHTML trick leaves quotes unescaped).
function escapeHtml(s: string): string {
return s
.replace(/&/g, "&amp;")
.replace(/</g, "&lt;")
.replace(/>/g, "&gt;")
.replace(/"/g, "&quot;")
.replace(/'/g, "&#39;");
}
// ── Lock status ──
type LockStatus = "verified" | "new" | "changed" | "offline" | "unknown";
function lockStatus(relay: RelayServer): LockStatus {
if (relay.rtt === undefined || relay.rtt === null) return "unknown";
if (relay.rtt < 0) return "offline";
if (!relay.serverFingerprint) return "new";
if (!relay.knownFingerprint) return "new"; // first time
if (relay.serverFingerprint === relay.knownFingerprint) return "verified";
return "changed";
}
function lockIcon(status: LockStatus): string {
switch (status) {
case "verified": return "🔒";
case "new": return "🔓";
case "changed": return "⚠️";
case "offline": return "🔴";
case "unknown": return "⚪";
}
}
function lockColor(status: LockStatus): string {
switch (status) {
case "verified": return "var(--green)";
case "new": return "var(--yellow)";
case "changed": return "var(--red)";
case "offline": return "var(--red)";
case "unknown": return "var(--text-dim)";
}
}
// ── Apply settings ──
function applySettings() {
const s = loadSettings();
roomInput.value = s.room;
aliasInput.value = s.alias;
osAecCheckbox.checked = s.osAec;
renderRecentRooms(s.recentRooms);
renderRelayButton();
}
// ── Relay button ──
function renderRelayButton() {
const s = loadSettings();
const sel = s.relays[s.selectedRelay];
if (sel) {
const ls = lockStatus(sel);
relayDot.textContent = lockIcon(ls);
relayDot.className = "relay-lock";
relayLabel.textContent = `${sel.name} (${sel.address})`;
} else {
relayDot.textContent = "⚪";
relayDot.className = "relay-lock";
relayLabel.textContent = "No relay configured";
}
}
relaySelected.addEventListener("click", () => openRelayDialog());
// ── Relay dialog ──
function openRelayDialog() {
renderRelayDialogList();
relayAddName.value = "";
relayAddAddr.value = "";
relayDialog.classList.remove("hidden");
}
function closeRelayDialog() {
relayDialog.classList.add("hidden");
renderRelayButton();
}
function renderRelayDialogList() {
const s = loadSettings();
relayDialogList.innerHTML = "";
s.relays.forEach((r, i) => {
const item = document.createElement("div");
item.className = `relay-dialog-item ${i === s.selectedRelay ? "selected" : ""}`;
const ls = lockStatus(r);
const fp = r.serverFingerprint || r.address;
// Identicon
const icon = createIdenticonEl(fp, 32, true);
icon.title = r.serverFingerprint
? `Server: ${r.serverFingerprint}\nClick to copy`
: `No fingerprint yet`;
item.appendChild(icon);
// Info
const info = document.createElement("div");
info.className = "relay-info";
info.innerHTML = `
<div class="relay-name">${escapeHtml(r.name)}</div>
<div class="relay-addr">${escapeHtml(r.address)}</div>
`;
item.appendChild(info);
// Lock + RTT
const meta = document.createElement("div");
meta.className = "relay-meta";
const rttStr = r.rtt !== undefined && r.rtt !== null
? (r.rtt < 0 ? "offline" : `${r.rtt}ms`)
: "";
meta.innerHTML = `
<span class="relay-lock-icon" style="color:${lockColor(ls)}">${lockIcon(ls)}</span>
<span class="relay-rtt">${rttStr}</span>
`;
item.appendChild(meta);
// Delete button
const del = document.createElement("button");
del.className = "remove";
del.textContent = "×";
del.addEventListener("click", (e) => {
e.stopPropagation();
const s = loadSettings();
s.relays.splice(i, 1);
if (s.selectedRelay >= s.relays.length) s.selectedRelay = Math.max(0, s.relays.length - 1);
saveSettingsObj(s);
renderRelayDialogList();
renderRelayButton();
});
item.appendChild(del);
// Click to select
item.addEventListener("click", () => {
const s = loadSettings();
s.selectedRelay = i;
// TOFU: if first time seeing this server, trust its fingerprint
if (r.serverFingerprint && !r.knownFingerprint) {
s.relays[i].knownFingerprint = r.serverFingerprint;
}
saveSettingsObj(s);
renderRelayDialogList();
renderRelayButton();
});
relayDialogList.appendChild(item);
});
}
relayAddBtn.addEventListener("click", () => {
const name = relayAddName.value.trim();
const addr = relayAddAddr.value.trim();
if (!addr) return;
const s = loadSettings();
s.relays.push({ name: name || addr, address: addr });
saveSettingsObj(s);
relayAddName.value = "";
relayAddAddr.value = "";
renderRelayDialogList();
pingAllRelays();
});
relayDialogClose.addEventListener("click", closeRelayDialog);
relayDialog.addEventListener("click", (e) => { if (e.target === relayDialog) closeRelayDialog(); });
// ── Ping ──
interface PingResult { rtt_ms: number; server_fingerprint: string; }
async function pingAllRelays() {
const s = loadSettings();
for (let i = 0; i < s.relays.length; i++) {
const r = s.relays[i];
try {
const result: PingResult = await invoke("ping_relay", { relay: r.address });
r.rtt = result.rtt_ms;
r.serverFingerprint = result.server_fingerprint;
// TOFU: auto-save fingerprint on first contact
if (!r.knownFingerprint) {
r.knownFingerprint = result.server_fingerprint;
}
} catch {
r.rtt = -1;
}
}
saveSettingsObj(s);
renderRelayButton();
if (!relayDialog.classList.contains("hidden")) renderRelayDialogList();
}
// ── Recent rooms ──
function renderRecentRooms(rooms: RecentRoom[]) {
recentRoomsDiv.innerHTML = rooms
.map((r) => `<span class="recent-room" data-relay="${escapeHtml(r.relay)}" data-room="${escapeHtml(r.room)}">${escapeHtml(r.room)}</span>`)
.join("");
recentRoomsDiv.querySelectorAll(".recent-room").forEach((el) => {
el.addEventListener("click", () => {
const ds = (el as HTMLElement).dataset;
roomInput.value = ds.room || "";
const s = loadSettings();
const idx = s.relays.findIndex((r) => r.address === ds.relay);
if (idx >= 0) { s.selectedRelay = idx; saveSettingsObj(s); renderRelayButton(); }
});
});
}
// ── Init ──
applySettings();
setTimeout(pingAllRelays, 300);
// Load fingerprint + render identicon
(async () => {
try {
const fp: string = await invoke("get_identity");
myFingerprint = fp;
myFingerprintEl.textContent = fp;
myFingerprintEl.style.cursor = "pointer";
myFingerprintEl.addEventListener("click", () => {
navigator.clipboard.writeText(fp).then(() => {
const orig = myFingerprintEl.textContent;
myFingerprintEl.textContent = "Copied!";
setTimeout(() => { myFingerprintEl.textContent = orig; }, 1000);
});
});
// Identicon next to fingerprint
const icon = createIdenticonEl(fp, 28, true);
myIdenticonEl.innerHTML = "";
myIdenticonEl.appendChild(icon);
} catch {}
})();
// ── Connect ──
connectBtn.addEventListener("click", doConnect);
[roomInput, aliasInput].forEach((el) =>
el.addEventListener("keydown", (e) => { if (e.key === "Enter") doConnect(); })
);
async function doConnect() {
const relay = getSelectedRelay();
if (!relay) { connectError.textContent = "No relay selected"; return; }
// Warn on fingerprint mismatch
const ls = lockStatus(relay);
if (ls === "changed") {
if (!confirm(`Server fingerprint has changed!\n\nKnown: ${relay.knownFingerprint}\nNew: ${relay.serverFingerprint}\n\nThis could indicate a man-in-the-middle attack. Continue?`)) {
return;
}
// User accepted — update known fingerprint
const s = loadSettings();
s.relays[s.selectedRelay].knownFingerprint = relay.serverFingerprint;
saveSettingsObj(s);
}
// Don't block connect on offline — ping may have failed transiently
connectError.textContent = "";
connectBtn.disabled = true;
connectBtn.textContent = "Connecting...";
userDisconnected = false;
const s = loadSettings();
s.room = roomInput.value; s.alias = aliasInput.value; s.osAec = osAecCheckbox.checked;
const room = roomInput.value.trim();
if (room) {
const entry: RecentRoom = { relay: relay.address, room };
s.recentRooms = [entry, ...s.recentRooms.filter((r) => !(r.relay === relay.address && r.room === room))].slice(0, 5);
}
saveSettingsObj(s);
try {
await invoke("connect", {
relay: relay.address, room: roomInput.value,
alias: aliasInput.value, osAec: osAecCheckbox.checked,
});
showCallScreen();
} catch (e: any) {
connectError.textContent = String(e);
connectBtn.disabled = false;
connectBtn.textContent = "Connect";
}
}
function showCallScreen() {
connectScreen.classList.add("hidden");
callScreen.classList.remove("hidden");
roomName.textContent = roomInput.value;
callStatus.className = "status-dot";
statusInterval = window.setInterval(pollStatus, 250);
}
function showConnectScreen() {
callScreen.classList.add("hidden");
connectScreen.classList.remove("hidden");
connectBtn.disabled = false;
connectBtn.textContent = "Connect";
levelBar.style.width = "0%";
if (statusInterval) { clearInterval(statusInterval); statusInterval = null; }
}
// ── Mute / hangup ──
micBtn.addEventListener("click", async () => {
try { const m: boolean = await invoke("toggle_mic"); micBtn.classList.toggle("muted", m); micIcon.textContent = m ? "Mic Off" : "Mic"; } catch {}
});
spkBtn.addEventListener("click", async () => {
try { const m: boolean = await invoke("toggle_speaker"); spkBtn.classList.toggle("muted", m); spkIcon.textContent = m ? "Spk Off" : "Spk"; } catch {}
});
hangupBtn.addEventListener("click", async () => {
userDisconnected = true;
try { await invoke("disconnect"); } catch {}
showConnectScreen();
});
document.addEventListener("keydown", (e) => {
if (callScreen.classList.contains("hidden")) return;
if ((e.target as HTMLElement).tagName === "INPUT") return;
if (e.key === "m") micBtn.click();
if (e.key === "s") spkBtn.click();
if (e.key === "q") hangupBtn.click();
});
// ── Status polling ──
interface CallStatusI {
active: boolean; mic_muted: boolean; spk_muted: boolean;
participants: { fingerprint: string; alias: string | null }[];
encode_fps: number; recv_fps: number; audio_level: number;
call_duration_secs: number; fingerprint: string;
}
function formatDuration(secs: number): string {
const m = Math.floor(secs / 60);
const s = Math.floor(secs % 60);
return `${m}:${s.toString().padStart(2, "0")}`;
}
let reconnectAttempts = 0;
async function pollStatus() {
try {
const st: CallStatusI = await invoke("get_status");
if (!st.active) {
if (!userDisconnected && reconnectAttempts < 5) {
reconnectAttempts++;
callStatus.className = "status-dot reconnecting";
statsDiv.textContent = `Reconnecting (${reconnectAttempts}/5)...`;
const relay = getSelectedRelay();
if (relay) {
const delay = Math.min(1000 * Math.pow(2, reconnectAttempts - 1), 10000);
setTimeout(async () => {
try {
await invoke("connect", { relay: relay.address, room: roomInput.value, alias: aliasInput.value, osAec: osAecCheckbox.checked });
reconnectAttempts = 0; callStatus.className = "status-dot";
} catch {}
}, delay);
}
return;
}
reconnectAttempts = 0; showConnectScreen(); return;
}
reconnectAttempts = 0;
if (st.fingerprint) myFingerprint = st.fingerprint;
micBtn.classList.toggle("muted", st.mic_muted);
micIcon.textContent = st.mic_muted ? "Mic Off" : "Mic";
spkBtn.classList.toggle("muted", st.spk_muted);
spkIcon.textContent = st.spk_muted ? "Spk Off" : "Spk";
callTimer.textContent = formatDuration(st.call_duration_secs);
const rms = st.audio_level;
const pct = rms > 0 ? Math.min(100, (Math.log(rms) / Math.log(32767)) * 100) : 0;
levelBar.style.width = `${pct}%`;
// Participants with identicons
if (st.participants.length === 0) {
participantsDiv.innerHTML = '<div class="participants-empty">Waiting for participants...</div>';
} else {
participantsDiv.innerHTML = "";
st.participants.forEach((p) => {
const name = p.alias || "Anonymous";
const fp = p.fingerprint || "";
const isMe = fp && myFingerprint.includes(fp);
const row = document.createElement("div");
row.className = "participant";
// Identicon avatar
const icon = createIdenticonEl(fp || name, 36, true);
if (isMe) icon.style.outline = "2px solid var(--accent)";
row.appendChild(icon);
const info = document.createElement("div");
info.className = "info";
info.innerHTML = `
<div class="name">${escapeHtml(name)} ${isMe ? '<span class="you-badge">you</span>' : ""}</div>
<div class="fp">${escapeHtml(fp ? fp.substring(0, 16) : "")}</div>
`;
row.appendChild(info);
participantsDiv.appendChild(row);
});
}
statsDiv.textContent = `TX: ${st.encode_fps} | RX: ${st.recv_fps}`;
} catch {}
}
listen("call-event", (event: any) => {
const { kind } = event.payload;
if (kind === "room-update") pollStatus();
if (kind === "disconnected" && !userDisconnected) pollStatus();
});
// ── Settings ──
function openSettings() {
const s = loadSettings();
sRoom.value = s.room; sAlias.value = s.alias; sOsAec.checked = s.osAec; sAgc.checked = s.agc;
sFingerprint.textContent = myFingerprint || "(loading...)";
renderSettingsRecentRooms(s.recentRooms);
settingsPanel.classList.remove("hidden");
}
function closeSettings() { settingsPanel.classList.add("hidden"); }
function renderSettingsRecentRooms(rooms: RecentRoom[]) {
if (rooms.length === 0) {
sRecentRooms.innerHTML = '<span style="color:var(--text-dim);font-size:12px">No recent rooms</span>';
return;
}
sRecentRooms.innerHTML = rooms.map((r, i) => `
<div class="recent-room-item">
<span>${escapeHtml(r.room)} <small style="color:var(--text-dim)">${escapeHtml(r.relay)}</small></span>
<button class="remove" data-idx="${i}">×</button>
</div>`).join("");
sRecentRooms.querySelectorAll(".remove").forEach((btn) => {
btn.addEventListener("click", () => {
const idx = parseInt((btn as HTMLElement).dataset.idx || "0", 10);
const s = loadSettings();
s.recentRooms.splice(idx, 1);
saveSettingsObj(s);
renderSettingsRecentRooms(s.recentRooms);
});
});
}
settingsBtnHome.addEventListener("click", openSettings);
settingsBtnCall.addEventListener("click", openSettings);
settingsClose.addEventListener("click", closeSettings);
settingsPanel.addEventListener("click", (e) => { if (e.target === settingsPanel) closeSettings(); });
settingsSave.addEventListener("click", () => {
const s = loadSettings();
s.room = sRoom.value; s.alias = sAlias.value; s.osAec = sOsAec.checked; s.agc = sAgc.checked;
saveSettingsObj(s);
roomInput.value = s.room; aliasInput.value = s.alias; osAecCheckbox.checked = s.osAec;
renderRecentRooms(s.recentRooms);
closeSettings();
});
sClearRecent.addEventListener("click", () => {
const s = loadSettings();
s.recentRooms = [];
saveSettingsObj(s);
renderSettingsRecentRooms([]);
renderRecentRooms([]);
});
document.addEventListener("keydown", (e) => {
if ((e.metaKey || e.ctrlKey) && e.key === ",") {
e.preventDefault();
if (settingsPanel.classList.contains("hidden")) openSettings();
else closeSettings();
}
if (e.key === "Escape") {
if (!relayDialog.classList.contains("hidden")) closeRelayDialog();
else if (!settingsPanel.classList.contains("hidden")) closeSettings();
}
});
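
The reconnect path above retries at most five times with exponential backoff capped at 10 s. The schedule, as a small sketch (hypothetical `reconnectDelayMs` helper; the shipped code computes this inline in `pollStatus`):

```typescript
// Backoff schedule used by the reconnect loop:
// 1s, 2s, 4s, 8s, then capped at 10s, for attempts 1..5.
function reconnectDelayMs(attempt: number): number {
  return Math.min(1000 * Math.pow(2, attempt - 1), 10000);
}
```

Capping the delay keeps the worst-case wait between attempts bounded while still backing off quickly enough to avoid hammering a relay that is actually down.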

desktop/src/style.css Normal file

@@ -0,0 +1,653 @@
:root {
--bg: #0f0f1a;
--surface: #1a1a2e;
--surface2: #222244;
--primary: #0f3460;
--accent: #e94560;
--text: #eee;
--text-dim: #777;
--green: #4ade80;
--red: #ef4444;
--yellow: #facc15;
--radius: 12px;
}
* { margin: 0; padding: 0; box-sizing: border-box; }
body {
font-family: -apple-system, BlinkMacSystemFont, "Segoe UI", Roboto, sans-serif;
background: var(--bg);
color: var(--text);
min-height: 100vh;
user-select: none;
-webkit-user-select: none;
}
#app {
display: flex;
flex-direction: column;
min-height: 100vh;
padding: 20px;
}
.hidden { display: none !important; }
/* ── Connect screen ── */
#connect-screen {
display: flex;
flex-direction: column;
align-items: center;
justify-content: center;
flex: 1;
gap: 20px;
}
#connect-screen h1 {
font-size: 26px;
font-weight: 700;
letter-spacing: 1px;
}
.subtitle {
font-size: 13px;
color: var(--text-dim);
margin-top: -12px;
letter-spacing: 2px;
text-transform: uppercase;
}
.form {
display: flex;
flex-direction: column;
gap: 12px;
width: 100%;
max-width: 320px;
}
.form label {
display: flex;
flex-direction: column;
gap: 4px;
font-size: 11px;
color: var(--text-dim);
text-transform: uppercase;
letter-spacing: 0.5px;
}
.form input[type="text"] {
background: var(--surface);
border: 1px solid #333;
border-radius: 8px;
padding: 10px 12px;
color: var(--text);
font-size: 15px;
outline: none;
transition: border-color 0.2s;
}
.form input[type="text"]:focus {
border-color: var(--accent);
}
/* ── Relay button ── */
.relay-selected {
display: flex;
align-items: center;
gap: 8px;
width: 100%;
background: var(--surface);
border: 1px solid #333;
border-radius: 8px;
padding: 10px 12px;
color: var(--text);
font-size: 14px;
cursor: pointer;
text-align: left;
transition: border-color 0.2s;
}
.relay-selected:hover { border-color: var(--accent); }
.relay-lock {
font-size: 14px;
flex-shrink: 0;
}
.relay-selected .arrow {
margin-left: auto;
font-size: 10px;
color: var(--text-dim);
}
.dot.green { background: var(--green); }
.dot.yellow { background: var(--yellow); }
.dot.red { background: var(--red); }
.dot.gray { background: #555; }
/* ── Relay dialog ── */
#relay-dialog {
position: fixed;
inset: 0;
background: rgba(0,0,0,0.6);
backdrop-filter: blur(4px);
display: flex;
align-items: center;
justify-content: center;
z-index: 200;
padding: 20px;
}
.relay-dialog-card {
max-width: 360px;
width: 100%;
}
.relay-dialog-list {
display: flex;
flex-direction: column;
gap: 6px;
max-height: 300px;
overflow-y: auto;
}
.relay-dialog-item {
display: flex;
align-items: center;
gap: 8px;
background: var(--surface);
border-radius: 8px;
padding: 8px 12px;
}
.relay-dialog-item .dot { width: 8px; height: 8px; border-radius: 50%; flex-shrink: 0; }
.relay-dialog-item { cursor: pointer; transition: background 0.1s; }
.relay-dialog-item:hover { background: var(--surface2); }
.relay-dialog-item.selected { background: var(--primary); border: 1px solid var(--accent); }
.relay-dialog-item .relay-info { flex: 1; min-width: 0; overflow: hidden; }
.relay-dialog-item .relay-name { font-size: 13px; font-weight: 500; overflow: hidden; text-overflow: ellipsis; white-space: nowrap; }
.relay-dialog-item .relay-addr { font-size: 11px; color: var(--text-dim); font-family: monospace; overflow: hidden; text-overflow: ellipsis; }
.relay-dialog-item .relay-rtt { font-size: 11px; color: var(--text-dim); margin-right: 4px; }
.relay-meta {
display: flex;
flex-direction: column;
align-items: center;
gap: 2px;
flex-shrink: 0;
}
.relay-lock-icon { font-size: 16px; }
.relay-meta .relay-rtt { font-size: 10px; color: var(--text-dim); }
.relay-dialog-item .remove {
background: none;
border: none;
color: var(--text-dim);
cursor: pointer;
font-size: 16px;
padding: 0 4px;
}
.relay-dialog-item .remove:hover { color: var(--red); }
.relay-add-row {
display: flex;
flex-direction: column;
gap: 8px;
margin-top: 12px;
border-top: 1px solid #333;
padding-top: 12px;
}
.relay-add-inputs {
display: flex;
gap: 6px;
}
.relay-add-row input {
background: var(--surface);
border: 1px solid #333;
border-radius: 8px;
padding: 8px 10px;
color: var(--text);
font-size: 13px;
outline: none;
flex: 1;
min-width: 0;
}
.relay-add-row input:focus { border-color: var(--accent); }
.relay-add-row .primary {
padding: 10px;
font-size: 14px;
}
.form-row {
display: flex;
gap: 16px;
align-items: center;
}
.checkbox {
flex-direction: row !important;
align-items: center;
gap: 8px !important;
cursor: pointer;
font-size: 13px !important;
}
.checkbox input { width: 16px; height: 16px; }
button.primary {
background: var(--accent);
color: white;
border: none;
border-radius: 8px;
padding: 12px;
font-size: 16px;
font-weight: 600;
cursor: pointer;
transition: opacity 0.2s;
margin-top: 4px;
}
button.primary:hover { opacity: 0.9; }
button.primary:disabled { opacity: 0.5; cursor: not-allowed; }
.error {
color: var(--red);
font-size: 13px;
min-height: 18px;
}
.identity-info {
display: flex;
align-items: center;
justify-content: center;
gap: 8px;
}
.fp-display {
font-family: monospace;
font-size: 11px;
color: var(--text-dim);
}
.recent-rooms {
display: flex;
flex-wrap: wrap;
gap: 8px;
justify-content: center;
max-width: 320px;
}
.recent-room {
background: var(--surface);
border: 1px solid #333;
border-radius: 16px;
padding: 4px 12px;
font-size: 12px;
color: var(--text-dim);
cursor: pointer;
transition: all 0.2s;
}
.recent-room:hover {
border-color: var(--accent);
color: var(--text);
}
/* ── Call screen ── */
#call-screen {
display: flex;
flex-direction: column;
flex: 1;
gap: 16px;
}
.call-header {
text-align: center;
padding: 8px;
}
.room-name {
font-size: 20px;
font-weight: 600;
}
.call-meta {
display: flex;
align-items: center;
justify-content: center;
gap: 8px;
margin-top: 4px;
}
.status-dot {
width: 8px;
height: 8px;
border-radius: 50%;
background: var(--green);
display: inline-block;
animation: pulse 2s infinite;
}
@keyframes pulse {
0%, 100% { opacity: 1; }
50% { opacity: 0.4; }
}
.status-dot.reconnecting {
background: var(--yellow);
animation: blink 0.5s infinite;
}
@keyframes blink {
0%, 100% { opacity: 1; }
50% { opacity: 0.1; }
}
.call-timer {
font-size: 14px;
color: var(--text-dim);
font-variant-numeric: tabular-nums;
}
/* ── Audio level meter ── */
.level-meter {
height: 4px;
background: var(--surface);
border-radius: 2px;
overflow: hidden;
}
.level-bar-fill {
height: 100%;
width: 0%;
background: linear-gradient(90deg, var(--green) 0%, var(--yellow) 60%, var(--red) 100%);
border-radius: 2px;
transition: width 0.1s ease-out;
}
/* ── Participants ── */
.participants {
background: var(--surface);
border-radius: var(--radius);
padding: 12px 16px;
flex: 1;
overflow-y: auto;
min-height: 80px;
}
.participants-empty {
color: var(--text-dim);
font-size: 13px;
text-align: center;
padding: 20px 0;
}
.participant {
display: flex;
align-items: center;
gap: 10px;
padding: 8px 0;
border-bottom: 1px solid #ffffff08;
}
.participant:last-child { border-bottom: none; }
.participant .avatar {
width: 36px;
height: 36px;
border-radius: 50%;
background: var(--primary);
display: flex;
align-items: center;
justify-content: center;
font-size: 14px;
font-weight: 600;
flex-shrink: 0;
}
.participant .avatar.me {
background: var(--accent);
}
.participant .info { flex: 1; min-width: 0; }
.participant .name {
font-size: 14px;
font-weight: 500;
overflow: hidden;
text-overflow: ellipsis;
white-space: nowrap;
}
.participant .fp {
font-size: 10px;
color: var(--text-dim);
font-family: monospace;
overflow: hidden;
text-overflow: ellipsis;
}
.participant .you-badge {
font-size: 10px;
color: var(--accent);
background: #e9456020;
padding: 1px 6px;
border-radius: 8px;
}
/* ── Controls ── */
.controls {
display: flex;
justify-content: center;
gap: 24px;
padding: 12px;
}
.control-btn {
display: flex;
align-items: center;
justify-content: center;
background: var(--surface2);
color: var(--text);
border: none;
border-radius: 50%;
width: 56px;
height: 56px;
cursor: pointer;
transition: all 0.15s;
font-size: 13px;
font-weight: 600;
}
.control-btn:hover { background: var(--primary); }
.control-btn.muted {
background: var(--red);
color: white;
}
.control-btn.hangup {
background: var(--red);
color: white;
width: 64px;
height: 64px;
font-size: 14px;
}
.control-btn.hangup:hover { opacity: 0.85; }
/* ── Stats ── */
.stats {
text-align: center;
font-size: 10px;
color: var(--text-dim);
font-family: monospace;
padding: 4px;
}
/* ── Icon button ── */
.icon-btn {
background: none;
border: 1px solid #444;
border-radius: 8px;
color: var(--text-dim);
font-size: 18px;
width: 36px;
height: 36px;
cursor: pointer;
display: flex;
align-items: center;
justify-content: center;
transition: all 0.15s;
}
.icon-btn:hover { border-color: var(--accent); color: var(--text); }
.icon-btn.small { width: 28px; height: 28px; font-size: 14px; }
.call-header-row {
display: flex;
align-items: center;
justify-content: center;
gap: 8px;
}
/* ── Settings panel ── */
#settings-panel {
position: fixed;
inset: 0;
background: rgba(0, 0, 0, 0.6);
backdrop-filter: blur(4px);
display: flex;
align-items: center;
justify-content: center;
z-index: 100;
padding: 20px;
}
.settings-card {
background: var(--bg);
border: 1px solid #333;
border-radius: 16px;
padding: 24px;
width: 100%;
max-width: 380px;
max-height: 90vh;
overflow-y: auto;
display: flex;
flex-direction: column;
gap: 20px;
}
.settings-header {
display: flex;
align-items: center;
justify-content: space-between;
}
.settings-header h2 {
font-size: 18px;
font-weight: 600;
}
.settings-section {
display: flex;
flex-direction: column;
gap: 10px;
}
.settings-section h3 {
font-size: 12px;
text-transform: uppercase;
letter-spacing: 1px;
color: var(--text-dim);
border-bottom: 1px solid #333;
padding-bottom: 4px;
}
.settings-section label {
display: flex;
flex-direction: column;
gap: 4px;
font-size: 11px;
color: var(--text-dim);
text-transform: uppercase;
letter-spacing: 0.5px;
}
.settings-section input[type="text"] {
background: var(--surface);
border: 1px solid #333;
border-radius: 8px;
padding: 8px 10px;
color: var(--text);
font-size: 14px;
outline: none;
}
.settings-section input[type="text"]:focus {
border-color: var(--accent);
}
.setting-row {
display: flex;
justify-content: space-between;
align-items: center;
padding: 4px 0;
}
.setting-label {
font-size: 12px;
color: var(--text-dim);
}
.fp-display-large {
font-family: monospace;
font-size: 12px;
color: var(--text);
word-break: break-all;
}
.recent-rooms-list {
display: flex;
flex-direction: column;
gap: 4px;
}
.recent-room-item {
display: flex;
justify-content: space-between;
align-items: center;
background: var(--surface);
border-radius: 8px;
padding: 6px 10px;
font-size: 13px;
}
.recent-room-item .remove {
background: none;
border: none;
color: var(--text-dim);
cursor: pointer;
font-size: 16px;
}
.recent-room-item .remove:hover { color: var(--red); }
.secondary-btn {
background: var(--surface);
border: 1px solid #444;
border-radius: 8px;
padding: 8px;
color: var(--text-dim);
font-size: 13px;
cursor: pointer;
transition: all 0.15s;
}
.secondary-btn:hover { border-color: var(--accent); color: var(--text); }

desktop/tsconfig.json
{
"compilerOptions": {
"target": "ESNext",
"module": "ESNext",
"moduleResolution": "bundler",
"strict": true,
"esModuleInterop": true,
"skipLibCheck": true,
"forceConsistentCasingInFileNames": true,
"resolveJsonModule": true,
"allowImportingTsExtensions": true,
"noEmit": true
},
"include": ["src"]
}

desktop/vite.config.ts
import { defineConfig } from "vite";
export default defineConfig({
clearScreen: false,
server: {
port: 1420,
strictPort: true,
},
envPrefix: ["VITE_", "TAURI_"],
build: {
target: "esnext",
minify: !process.env.TAURI_DEBUG ? "esbuild" : false,
sourcemap: !!process.env.TAURI_DEBUG,
},
});

docs/android/README.md
# WarzonePhone Android Client
The WZP Android client is a native VoIP application built with Kotlin/Jetpack Compose on top of a Rust audio engine. It connects to WZP relay servers over QUIC, providing encrypted voice calls with adaptive quality, forward error correction, and acoustic echo cancellation.
## Quick Start
1. **Build**: `cd android && ./gradlew assembleRelease` (requires NDK 26.1, cargo-ndk)
2. **Install**: `adb install app/build/outputs/apk/release/app-release.apk`
3. **Run**: Open "WZ Phone", tap **CALL** to connect to the hardcoded relay
4. **Relay**: Must be running at the configured address (default `172.16.81.125:4433`)
## Current State (April 2025)
| Feature | Status |
|---------|--------|
| QUIC transport to relay | Working |
| Crypto handshake (X25519 + Ed25519) | Working |
| Opus 24k encoding/decoding | Working |
| Oboe audio I/O (48kHz mono) | Working |
| AEC / AGC signal processing | Working |
| RaptorQ FEC | Wired (repair symbols not sent yet) |
| Jitter buffer | Working |
| Adaptive quality switching | Codec-ready, not network-driven yet |
| Authentication (featherChat) | Skipped (relay has no --auth-url) |
| Media encryption (ChaCha20-Poly1305) | Session derived but not applied to packets |
| Foreground service / wake locks | Implemented, not started from UI |
## Documentation Index
- [Architecture](architecture.md) - System design, data flow diagrams, thread model
- [Build Guide](build-guide.md) - Build environment setup, dependencies, signing
- [Debugging](debugging.md) - Crash diagnosis, logcat filters, common issues
- [Maintenance](maintenance.md) - Code map, dependency management, upgrade paths
- [Roadmap](roadmap.md) - Planned work and known gaps
## Key Design Decisions
- **Rust native engine**: All audio processing, codecs, FEC, crypto, and networking run in Rust. Kotlin is UI-only.
- **Lock-free audio**: SPSC ring buffers with atomic ordering between Oboe C++ callbacks and the Rust codec thread. No mutexes in the audio path.
- **cargo-ndk**: The native library (`libwzp_android.so`) is cross-compiled for `arm64-v8a` using cargo-ndk, invoked automatically by Gradle's `cargoNdkBuild` task.
- **Single-activity Compose**: One `CallActivity` hosts all UI via Jetpack Compose with `CallViewModel` as the state holder.
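The lock-free audio path described above can be sketched as a minimal SPSC ring buffer. This is an illustrative sketch, not the actual `audio_android.rs` implementation: the `SpscRing` name and the monotonic-counter full/empty test are assumptions, and the real buffers move whole frames rather than single samples.

```rust
use std::cell::UnsafeCell;
use std::sync::atomic::{AtomicUsize, Ordering};

/// Minimal single-producer/single-consumer ring buffer sketch.
/// Capacity must be a power of two; head/tail are monotonic counters,
/// so the buffer is full exactly when tail - head == capacity.
pub struct SpscRing {
    buf: Vec<UnsafeCell<i16>>,
    mask: usize,
    head: AtomicUsize, // next read index (consumer-owned)
    tail: AtomicUsize, // next write index (producer-owned)
}

// Safe for exactly one producer thread and one consumer thread.
unsafe impl Sync for SpscRing {}

impl SpscRing {
    pub fn new(capacity_pow2: usize) -> Self {
        assert!(capacity_pow2.is_power_of_two());
        SpscRing {
            buf: (0..capacity_pow2).map(|_| UnsafeCell::new(0)).collect(),
            mask: capacity_pow2 - 1,
            head: AtomicUsize::new(0),
            tail: AtomicUsize::new(0),
        }
    }

    /// Producer side (e.g. the Oboe capture callback).
    pub fn push(&self, sample: i16) -> bool {
        let tail = self.tail.load(Ordering::Relaxed);
        let head = self.head.load(Ordering::Acquire);
        if tail.wrapping_sub(head) == self.buf.len() {
            return false; // full: a real-time caller drops, never blocks
        }
        unsafe { *self.buf[tail & self.mask].get() = sample };
        self.tail.store(tail.wrapping_add(1), Ordering::Release);
        true
    }

    /// Consumer side (e.g. the codec thread's 20ms loop).
    pub fn pop(&self) -> Option<i16> {
        let head = self.head.load(Ordering::Relaxed);
        let tail = self.tail.load(Ordering::Acquire);
        if head == tail {
            return None; // empty
        }
        let sample = unsafe { *self.buf[head & self.mask].get() };
        self.head.store(head.wrapping_add(1), Ordering::Release);
        Some(sample)
    }
}
```

The Acquire/Release pairing is what lets the Oboe callback and the codec thread share the buffer without a mutex: each side publishes its index only after the data write, and observes the other side's index before reading.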

docs/android/architecture.md
# Architecture
## System Overview
The Android client is a four-layer stack: Kotlin UI, JNI bridge, Rust engine, and C++ audio I/O. Each layer communicates through well-defined interfaces with minimal coupling.
```mermaid
graph TB
subgraph "Kotlin (Main Thread)"
CA[CallActivity]
VM[CallViewModel]
UI[InCallScreen<br/>Compose UI]
CA --> VM
VM --> UI
end
subgraph "JNI Bridge"
JB[jni_bridge.rs<br/>panic-safe FFI]
end
subgraph "Rust Engine"
ENG[WzpEngine<br/>Orchestrator]
CT[Codec Thread<br/>20ms real-time loop]
NET[Tokio Runtime<br/>2 async workers]
PIPE[Pipeline<br/>Encode/Decode/FEC/Jitter]
end
subgraph "C++ Audio"
OBOE[Oboe Bridge<br/>Capture + Playout callbacks]
RB[Ring Buffers<br/>Lock-free SPSC]
end
subgraph "Network"
QUIC[QUIC Connection<br/>quinn]
RELAY[WZP Relay<br/>SFU Room]
end
VM <-->|"JNI calls<br/>+ JSON stats"| JB
JB <--> ENG
ENG --> CT
ENG --> NET
CT <--> PIPE
CT <-->|"Atomic R/W"| RB
OBOE <-->|"Atomic R/W"| RB
CT <-->|"mpsc channels"| NET
NET <-->|"QUIC datagrams<br/>+ streams"| QUIC
QUIC <--> RELAY
```
## Thread Model
The engine uses four distinct thread contexts, each with specific responsibilities and real-time constraints.
```mermaid
graph LR
subgraph "Android Main Thread"
UI_T["UI + JNI calls<br/>startCall / stopCall / getStats"]
end
subgraph "Oboe Audio Thread (system)"
AUD["Capture callback: mic → ring buf<br/>Playout callback: ring buf → speaker<br/>⚡ Highest priority, no allocations"]
end
subgraph "Codec Thread (wzp-codec)"
COD["20ms loop:<br/>1. Read capture ring buf<br/>2. AEC → AGC → Encode<br/>3. Send to network channel<br/>4. Recv from network channel<br/>5. FEC → Jitter → Decode<br/>6. Write playout ring buf<br/>⚡ Pinned to big core, RT priority"]
end
subgraph "Tokio Runtime (2 workers)"
NET_S["Send task:<br/>Channel → MediaPacket → QUIC datagram"]
NET_R["Recv task:<br/>QUIC datagram → MediaPacket → Channel"]
HS["Handshake:<br/>CallOffer → CallAnswer"]
end
UI_T -->|"mpsc command channel"| COD
COD -->|"tokio::mpsc send_tx"| NET_S
NET_R -->|"tokio::mpsc recv_tx"| COD
AUD <-->|"Atomic ring buffers"| COD
```
### Thread Priorities and Constraints
| Thread | Priority | Allocations | Blocking | Lock-free |
|--------|----------|-------------|----------|-----------|
| Oboe audio | SCHED_FIFO (system) | None | Never | Yes |
| Codec | RT priority, big core | Pre-allocated buffers | sleep(remainder of 20ms) | Ring buf: yes, Stats: Mutex |
| Tokio workers | Normal | Allowed | Async only | N/A |
| Main/JNI | Normal | Allowed | Allowed | N/A |
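The codec thread's "sleep(remainder of 20ms)" budget can be sketched as follows. `run_paced` is a hypothetical helper, and anchoring each deadline to the loop start (so timer error does not accumulate across frames) is an assumption about the real loop, not a description of `engine.rs`.

```rust
use std::time::{Duration, Instant};

const FRAME: Duration = Duration::from_millis(20);

/// Run `tick` once per 20ms frame until it returns false.
/// Each iteration does its work, then sleeps for the remainder of the
/// frame budget, measured against the loop start rather than "now".
fn run_paced<F: FnMut() -> bool>(mut tick: F) {
    let start = Instant::now();
    let mut frame_no: u32 = 0;
    while tick() {
        frame_no += 1;
        let next_deadline = start + FRAME * frame_no;
        let now = Instant::now();
        if next_deadline > now {
            std::thread::sleep(next_deadline - now);
        } // else: this frame overran its budget; start the next one immediately
    }
}
```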
## Call Lifecycle
```mermaid
sequenceDiagram
participant User
participant UI as InCallScreen
participant VM as CallViewModel
participant ENG as WzpEngine (JNI)
participant NET as Tokio Network
participant RELAY as WZP Relay
User->>UI: Tap CALL
UI->>VM: startCall()
VM->>ENG: init() + startCall(relay, room)
ENG->>ENG: Create tokio runtime
ENG->>NET: Spawn network task
NET->>RELAY: QUIC connect (SNI = room name)
RELAY-->>NET: Connection established
Note over NET,RELAY: Crypto Handshake
NET->>RELAY: CallOffer {identity_pub, ephemeral_pub, signature, profiles}
RELAY-->>NET: CallAnswer {ephemeral_pub, chosen_profile, signature}
NET->>NET: Derive ChaCha20-Poly1305 session
ENG->>ENG: Spawn codec thread
Note over ENG: State → Active
loop Every 20ms
ENG->>ENG: Read mic → AEC → AGC → Encode
ENG->>NET: Encoded frame via channel
NET->>RELAY: MediaPacket via QUIC DATAGRAM
RELAY->>NET: MediaPacket from other peer
NET->>ENG: MediaPacket via channel
ENG->>ENG: FEC → Jitter → Decode → Speaker
end
User->>UI: Tap END
UI->>VM: stopCall()
VM->>ENG: stopCall()
ENG->>ENG: Set running=false, send Stop command
ENG->>ENG: Join codec thread
ENG->>NET: Drop tokio runtime
NET->>RELAY: Connection close
```
## Audio Pipeline Detail
```mermaid
graph LR
subgraph "Capture Path"
MIC[Microphone] -->|"48kHz i16"| OBOE_C[Oboe Capture<br/>Callback]
OBOE_C -->|"ring_write()"| RB_C[Capture<br/>Ring Buffer]
RB_C -->|"read_capture()"| AEC[Echo<br/>Canceller]
AEC --> AGC[Auto Gain<br/>Control]
AGC --> ENC[AdaptiveEncoder<br/>Opus 24k]
ENC -->|"Vec u8"| FEC_E[RaptorQ<br/>FEC Encoder]
FEC_E -->|"send_tx"| CHAN_S[Send Channel]
end
subgraph "Network"
CHAN_S --> PKT_S[MediaPacket<br/>Header + Payload]
PKT_S -->|"QUIC DATAGRAM"| RELAY[Relay SFU]
RELAY -->|"QUIC DATAGRAM"| PKT_R[MediaPacket<br/>Deserialize]
PKT_R -->|"recv_tx"| CHAN_R[Recv Channel]
end
subgraph "Playout Path"
CHAN_R --> FEC_D[RaptorQ<br/>FEC Decoder]
FEC_D --> JB[Jitter Buffer<br/>10-250 pkts]
JB --> DEC[AdaptiveDecoder<br/>Opus 24k]
DEC -->|"48kHz i16"| AEC_REF[AEC Far-End<br/>Reference]
DEC -->|"write_playout()"| RB_P[Playout<br/>Ring Buffer]
RB_P -->|"ring_read()"| OBOE_P[Oboe Playout<br/>Callback]
OBOE_P --> SPK[Speaker]
end
```
### Audio Parameters
| Parameter | Value | Notes |
|-----------|-------|-------|
| Sample rate | 48,000 Hz | Opus native rate |
| Channels | 1 (mono) | VoIP only |
| Frame size | 960 samples | 20ms at 48kHz |
| Ring buffer | 7,680 samples | 160ms (8 frames) |
| Bit depth | 16-bit signed int | PCM format |
| AEC tail | 100ms | Echo canceller filter length |
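As a quick consistency check, every size in the table above is derived from the 48 kHz sample rate and the 20 ms frame duration (constant names here are illustrative):

```rust
// Relationships between the audio parameters in the table.
const SAMPLE_RATE_HZ: u32 = 48_000; // Opus native rate
const FRAME_MS: u32 = 20;
const RING_FRAMES: u32 = 8;

/// Samples of mono PCM covering `ms` milliseconds at 48 kHz.
fn samples_for(ms: u32) -> u32 {
    SAMPLE_RATE_HZ * ms / 1000
}

/// Ring buffer capacity: 8 frames of headroom.
fn ring_capacity_samples() -> u32 {
    samples_for(FRAME_MS) * RING_FRAMES
}

/// Capacity in bytes for 16-bit signed PCM.
fn ring_capacity_bytes() -> u32 {
    ring_capacity_samples() * 2
}
```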
## Crypto Handshake
```mermaid
sequenceDiagram
participant Client as Android Client
participant Relay as WZP Relay
Note over Client: Identity seed (32 bytes, random per launch)
Note over Client: HKDF → Ed25519 signing key + X25519 static key
Client->>Client: Generate ephemeral X25519 keypair
Client->>Client: Sign(ephemeral_pub || "call-offer") with Ed25519
Client->>Relay: SignalMessage::CallOffer<br/>{identity_pub, ephemeral_pub, signature, [GOOD, DEGRADED, CATASTROPHIC]}
Relay->>Relay: Verify Ed25519 signature
Relay->>Relay: Generate own ephemeral X25519
Relay->>Relay: Sign(ephemeral_pub || "call-answer")
Relay->>Relay: DH(relay_ephemeral, client_ephemeral) → shared secret
Relay->>Relay: HKDF(shared_secret) → ChaCha20-Poly1305 key
Relay->>Client: SignalMessage::CallAnswer<br/>{identity_pub, ephemeral_pub, signature, chosen_profile=GOOD}
Client->>Client: Verify relay signature
Client->>Client: DH(client_ephemeral, relay_ephemeral) → same shared secret
Client->>Client: HKDF(shared_secret) → same ChaCha20-Poly1305 key
Note over Client,Relay: Both sides now have identical session key
Note over Client,Relay: Media packets can be encrypted (not yet applied)
```
### Key Derivation Chain
```
Identity Seed (32 bytes, random)
├── HKDF(seed, info="warzone-ed25519") → Ed25519 signing key
│ └── Public key = identity_pub (32 bytes)
│ └── SHA-256(identity_pub)[:16] = fingerprint (16 bytes)
└── HKDF(seed, info="warzone-x25519") → X25519 static key (unused currently)
Per-Call Ephemeral:
Random X25519 keypair → ephemeral_pub (sent in CallOffer)
Session Key:
DH(our_ephemeral_secret, peer_ephemeral_pub) → shared_secret
HKDF(shared_secret, info="warzone-session-key") → ChaCha20-Poly1305 key (32 bytes)
```
## QUIC Transport
```mermaid
graph TB
subgraph "QUIC Connection"
EP[Client Endpoint<br/>0.0.0.0:0 UDP]
CONN[Connection to Relay<br/>SNI = room name]
subgraph "Unreliable Channel"
DG_S[Send DATAGRAM<br/>MediaPacket serialized]
DG_R[Recv DATAGRAM<br/>MediaPacket deserialized]
end
subgraph "Reliable Channel"
ST_S[Open bidi stream<br/>JSON length-prefixed<br/>SignalMessage]
ST_R[Accept bidi stream<br/>JSON length-prefixed<br/>SignalMessage]
end
EP --> CONN
CONN --> DG_S
CONN --> DG_R
CONN --> ST_S
CONN --> ST_R
end
```
### QUIC Configuration (VoIP-tuned)
| Setting | Value | Rationale |
|---------|-------|-----------|
| ALPN | `wzp` | Protocol identification |
| Idle timeout | 30s | Keep connection alive during silence |
| Keep-alive | 5s | Prevent NAT timeout |
| Datagram receive buffer | 65 KB | Buffer for burst arrivals |
| Flow control (recv) | 256 KB | Conservative for VoIP |
| Flow control (send) | 128 KB | Prevent bufferbloat |
| TLS | Self-signed certs | Development mode |
| Certificate verification | Disabled | Client accepts any cert |
## MediaPacket Wire Format
```
12-byte header:
┌─────────────────────────────────────────────────┐
│ Byte 0: V(1) T(1) CodecID(4) Q(1) FecHi(1) │
│ Byte 1: FecLo(6) unused(2) │
│ Byte 2-3: Sequence number (u16 BE) │
│ Byte 4-7: Timestamp ms (u32 BE) │
│ Byte 8: FEC block ID │
│ Byte 9: FEC symbol index │
│ Byte 10: Reserved │
│ Byte 11: CSRC count │
├─────────────────────────────────────────────────┤
│ Payload: Opus-encoded audio frame │
├─────────────────────────────────────────────────┤
│ Optional: QualityReport (4 bytes, if Q=1) │
│ loss_pct(u8) rtt_4ms(u8) jitter_ms(u8) │
│ bitrate_cap_kbps(u8) │
└─────────────────────────────────────────────────┘
```
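A sketch of packing and parsing the 12-byte header above. The field order follows the diagram, but the MSB-first bit layout within bytes 0-1 and the reading of `T` as an audio-type flag are assumptions; `MediaHeader` is illustrative, not the actual wzp-proto type.

```rust
/// Illustrative 12-byte MediaPacket header, per the diagram above.
#[derive(Debug, PartialEq)]
struct MediaHeader {
    version: bool,     // V (1 bit)
    is_audio: bool,    // T (1 bit; meaning assumed)
    codec_id: u8,      // 4 bits
    has_quality: bool, // Q: QualityReport trailer present
    fec_ratio: u8,     // 7 bits total: FecHi (byte 0) + FecLo (byte 1)
    seq: u16,
    timestamp_ms: u32,
    fec_block: u8,
    fec_symbol: u8,
    csrc_count: u8,
}

impl MediaHeader {
    fn to_bytes(&self) -> [u8; 12] {
        let mut b = [0u8; 12];
        b[0] = (self.version as u8) << 7
            | (self.is_audio as u8) << 6
            | (self.codec_id & 0x0F) << 2
            | (self.has_quality as u8) << 1
            | (self.fec_ratio >> 6) & 1; // FecHi
        b[1] = (self.fec_ratio & 0x3F) << 2; // FecLo + 2 unused bits
        b[2..4].copy_from_slice(&self.seq.to_be_bytes());
        b[4..8].copy_from_slice(&self.timestamp_ms.to_be_bytes());
        b[8] = self.fec_block;
        b[9] = self.fec_symbol;
        // b[10] is reserved
        b[11] = self.csrc_count;
        b
    }

    fn parse(b: &[u8; 12]) -> MediaHeader {
        MediaHeader {
            version: (b[0] >> 7) & 1 == 1,
            is_audio: (b[0] >> 6) & 1 == 1,
            codec_id: (b[0] >> 2) & 0x0F,
            has_quality: (b[0] >> 1) & 1 == 1,
            fec_ratio: ((b[0] & 1) << 6) | (b[1] >> 2),
            seq: u16::from_be_bytes([b[2], b[3]]),
            timestamp_ms: u32::from_be_bytes([b[4], b[5], b[6], b[7]]),
            fec_block: b[8],
            fec_symbol: b[9],
            csrc_count: b[11],
        }
    }
}
```

The multi-byte fields are big-endian per the diagram, so the header is portable across the Android client and the relay regardless of host byte order.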
## Relay Room Mode (SFU)
```mermaid
graph LR
subgraph "Room: android"
P1[Phone A<br/>QUIC conn] -->|MediaPacket| RELAY[Relay SFU]
RELAY -->|MediaPacket| P2[Phone B<br/>QUIC conn]
P2 -->|MediaPacket| RELAY
RELAY -->|MediaPacket| P1
end
Note1["Room name from QUIC TLS SNI<br/>No auth required<br/>Packets forwarded to all others"]
```
The relay operates as a Selective Forwarding Unit:
1. Client connects via QUIC, room name extracted from TLS SNI
2. Crypto handshake completes (relay has its own ephemeral identity)
3. Client joins named room
4. All received media packets are forwarded to every other participant in the room
5. Signaling messages are not forwarded (point-to-point with relay)
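Step 4 above is the entire forwarding rule. A toy in-memory model (the `Room`/`PeerId` types are illustrative, not the wzp-relay API):

```rust
use std::collections::HashMap;

type PeerId = u32;

/// Toy SFU room: a packet from one participant is queued to the
/// outbox of every other participant in the same room.
#[derive(Default)]
struct Room {
    outboxes: HashMap<PeerId, Vec<Vec<u8>>>,
}

impl Room {
    fn join(&mut self, peer: PeerId) {
        self.outboxes.entry(peer).or_default();
    }

    /// Forward `packet` from `sender` to all other room members.
    /// The sender never receives its own media back.
    fn forward(&mut self, sender: PeerId, packet: &[u8]) {
        for (peer, outbox) in self.outboxes.iter_mut() {
            if *peer != sender {
                outbox.push(packet.to_vec());
            }
        }
    }
}
```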
## Adaptive Quality System
```mermaid
graph TD
QR[QualityReport<br/>loss%, RTT, jitter] --> AQC[AdaptiveQualityController]
AQC -->|"loss<10%, RTT<400ms"| GOOD[GOOD<br/>Opus 24kbps<br/>FEC 20%<br/>20ms frames]
AQC -->|"loss 10-40%<br/>RTT 400-600ms"| DEG[DEGRADED<br/>Opus 6kbps<br/>FEC 50%<br/>40ms frames]
AQC -->|"loss>40%<br/>RTT>600ms"| CAT[CATASTROPHIC<br/>Codec2 1.2kbps<br/>FEC 100%<br/>40ms frames]
GOOD -->|"Hysteresis:<br/>sustained degradation"| DEG
DEG -->|"Sustained improvement"| GOOD
DEG -->|"Further degradation"| CAT
CAT -->|"Improvement"| DEG
```
| Profile | Codec | Bitrate | FEC Ratio | Frame Size | FEC Block |
|---------|-------|---------|-----------|------------|-----------|
| GOOD | Opus 24k | 24 kbps | 20% | 20ms | 5 frames |
| DEGRADED | Opus 6k | 6 kbps | 50% | 40ms | 10 frames |
| CATASTROPHIC | Codec2 1.2k | 1.2 kbps | 100% | 40ms | 8 frames |
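The hysteresis edges in the diagram can be sketched as requiring the raw loss/RTT classification to persist for several consecutive reports before the controller actually switches. Thresholds are taken from the diagram; `HOLD = 3` is an illustrative value, not the engine's real window.

```rust
#[derive(Clone, Copy, PartialEq, Debug)]
enum Profile { Good, Degraded, Catastrophic }

/// Consecutive reports a new classification must persist before switching.
const HOLD: u32 = 3;

struct QualityController {
    current: Profile,
    pending: Profile,
    streak: u32,
}

impl QualityController {
    fn new() -> Self {
        QualityController { current: Profile::Good, pending: Profile::Good, streak: 0 }
    }

    /// Raw classification from the diagram's thresholds.
    fn classify(loss_pct: u8, rtt_ms: u32) -> Profile {
        if loss_pct > 40 || rtt_ms > 600 {
            Profile::Catastrophic
        } else if loss_pct >= 10 || rtt_ms >= 400 {
            Profile::Degraded
        } else {
            Profile::Good
        }
    }

    /// Feed one QualityReport; returns the (possibly unchanged) profile.
    fn on_report(&mut self, loss_pct: u8, rtt_ms: u32) -> Profile {
        let target = Self::classify(loss_pct, rtt_ms);
        if target == self.current {
            self.pending = target; // back to steady state
            self.streak = 0;
        } else if target == self.pending {
            self.streak += 1;
            if self.streak >= HOLD {
                self.current = target; // sustained change: switch
                self.streak = 0;
            }
        } else {
            self.pending = target; // new candidate: restart the count
            self.streak = 1;
        }
        self.current
    }
}
```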
## Module Dependency Graph
```mermaid
graph BT
PROTO[wzp-proto<br/>Types, traits, jitter,<br/>quality, session]
CODEC[wzp-codec<br/>Opus, Codec2, AEC,<br/>AGC, resampling]
FEC[wzp-fec<br/>RaptorQ fountain codes]
CRYPTO[wzp-crypto<br/>Ed25519, X25519,<br/>ChaCha20-Poly1305]
TRANSPORT[wzp-transport<br/>QUIC, datagrams,<br/>signaling streams]
ANDROID[wzp-android<br/>Engine, JNI bridge,<br/>Oboe audio, pipeline]
RELAY[wzp-relay<br/>SFU, rooms, auth,<br/>metrics, probes]
CODEC --> PROTO
FEC --> PROTO
CRYPTO --> PROTO
TRANSPORT --> PROTO
ANDROID --> PROTO
ANDROID --> CODEC
ANDROID --> FEC
ANDROID --> CRYPTO
ANDROID --> TRANSPORT
RELAY --> PROTO
RELAY --> CRYPTO
RELAY --> TRANSPORT
```
## File Map
### Kotlin (`android/app/src/main/java/com/wzp/`)
| File | Purpose |
|------|---------|
| `WzpApplication.kt` | App entry, notification channel creation |
| `engine/WzpEngine.kt` | JNI wrapper for native engine |
| `engine/WzpCallback.kt` | Callback interface for engine events |
| `engine/CallStats.kt` | Stats data class with JSON deserialization |
| `ui/call/CallActivity.kt` | Activity host, permissions, theme |
| `ui/call/CallViewModel.kt` | MVVM state holder, stats polling |
| `ui/call/InCallScreen.kt` | Compose UI (idle + in-call states) |
| `service/CallService.kt` | Foreground service, wake/wifi locks |
| `audio/AudioRouteManager.kt` | Speaker/earpiece/Bluetooth routing |
### Rust (`crates/wzp-android/src/`)
| File | Purpose |
|------|---------|
| `lib.rs` | Module declarations |
| `jni_bridge.rs` | JNI FFI (panic-safe, proper jni crate) |
| `engine.rs` | Call orchestrator (threads, channels, lifecycle) |
| `pipeline.rs` | Codec pipeline (AEC, AGC, encode, FEC, jitter, decode) |
| `audio_android.rs` | Oboe backend, SPSC ring buffers, RT scheduling |
| `commands.rs` | Engine command enum |
| `stats.rs` | CallState/CallStats types (serde) |
### C++ (`crates/wzp-android/cpp/`)
| File | Purpose |
|------|---------|
| `oboe_bridge.h` | FFI header for Rust-C++ audio interface |
| `oboe_bridge.cpp` | Oboe capture/playout callbacks, ring buffer I/O |
| `oboe_stub.cpp` | No-op stub for non-Android builds |
### Build
| File | Purpose |
|------|---------|
| `android/app/build.gradle.kts` | Android build config, cargo-ndk task |
| `crates/wzp-android/Cargo.toml` | Rust dependencies (cdylib output) |
| `crates/wzp-android/build.rs` | C++ compilation, Oboe fetch |

docs/android/build-guide.md
# Build Guide
## Prerequisites
| Tool | Version | Purpose |
|------|---------|---------|
| JDK | 17 | Android Gradle builds |
| Android SDK | 34 | Compile SDK |
| Android NDK | 26.1.10909125 | Native C++/Rust compilation |
| Rust | 1.85+ | Native engine (edition 2024) |
| cargo-ndk | latest | Cross-compile Rust → Android |
| `aarch64-linux-android` target | - | Rust target for ARM64 |
### Install Rust Android target
```bash
rustup target add aarch64-linux-android
cargo install cargo-ndk
```
### Environment Variables
```bash
export JAVA_HOME="/usr/lib/jvm/java-17-openjdk-amd64"
export ANDROID_HOME="$HOME/android-sdk"
export ANDROID_NDK_HOME="$ANDROID_HOME/ndk/26.1.10909125"
# For manual cargo-ndk builds (Gradle sets these automatically):
export CC_aarch64_linux_android="$ANDROID_NDK_HOME/toolchains/llvm/prebuilt/linux-x86_64/bin/aarch64-linux-android21-clang"
export CXX_aarch64_linux_android="$ANDROID_NDK_HOME/toolchains/llvm/prebuilt/linux-x86_64/bin/aarch64-linux-android21-clang++"
export AR_aarch64_linux_android="$ANDROID_NDK_HOME/toolchains/llvm/prebuilt/linux-x86_64/bin/llvm-ar"
```
## Build Commands
### Full Build (Gradle drives everything)
```bash
cd android
./gradlew assembleRelease
```
This runs:
1. `cargoNdkBuild` task: invokes `cargo ndk -t arm64-v8a -o app/src/main/jniLibs build --release -p wzp-android`
2. Compiles Kotlin/Compose code
3. Packages APK with signing
### Native Library Only
```bash
cargo ndk -t arm64-v8a -o android/app/src/main/jniLibs build --release -p wzp-android
```
Output: `android/app/src/main/jniLibs/arm64-v8a/libwzp_android.so`
### Skip Native Rebuild
If the `.so` hasn't changed:
```bash
cd android
./gradlew assembleRelease -x cargoNdkBuild
```
### Debug Build
```bash
cd android
./gradlew assembleDebug
```
Debug APK is ~8.9 MB (unstripped `.so`), release is ~6.9 MB.
## Signing
### Debug
```
Keystore: android/keystore/wzp-debug.jks
Password: android
Key alias: wzp-debug
```
### Release
```
Keystore: android/keystore/wzp-release.jks
Password: wzphone2024
Key alias: wzp-release
```
Both keystores are checked into the repo for development convenience. For production, replace with proper key management.
## Build Artifacts
| Artifact | Path | Size |
|----------|------|------|
| Debug APK | `android/app/build/outputs/apk/debug/app-debug.apk` | ~8.9 MB |
| Release APK | `android/app/build/outputs/apk/release/app-release.apk` | ~6.9 MB |
| Native lib | `android/app/src/main/jniLibs/arm64-v8a/libwzp_android.so` | ~5 MB |
## ABI Support
Currently only `arm64-v8a` (ARM64) is built. This covers 95%+ of modern Android devices.
To add more ABIs, edit `build.gradle.kts`:
```kotlin
ndk { abiFilters += listOf("arm64-v8a", "armeabi-v7a") }
```
And update the cargo-ndk command in `cargoNdkBuild` task:
```kotlin
commandLine("cargo", "ndk", "-t", "arm64-v8a", "-t", "armeabi-v7a", ...)
```
## Oboe Dependency
The Oboe C++ audio library is fetched at build time by `build.rs`:
1. Attempts `git clone` of Oboe 1.8.1 into `$OUT_DIR/oboe`
2. If successful, compiles `oboe_bridge.cpp` with Oboe headers
3. If clone fails (no network), falls back to `oboe_stub.cpp` (no-op audio)
This means the **first build requires internet access** to fetch Oboe. Subsequent builds reuse the cached checkout.
## Common Build Issues
### `cargo ndk` not found
```bash
cargo install cargo-ndk
```
### Missing Android target
```bash
rustup target add aarch64-linux-android
```
### NDK not found
Ensure `ANDROID_NDK_HOME` points to the NDK directory containing `toolchains/llvm/`.
### C++ compilation errors
Check that `CXX_aarch64_linux_android` points to a valid clang++ from the NDK.
### Gradle daemon issues
```bash
./gradlew --stop
./gradlew assembleRelease --no-daemon
```
