IMU-Based Infrasound Capture
Categories: Internal-only. Cross-platform guidance for Project Peregrine. Stack: Expo SDK 54 + React Native 0.81 + expo-sensors 15 + react-native-nitro-modules. Applies to both iOS and Android. Cross-references: Sensors overview and Architecture.
TL;DR
Peregrine is built on Expo + React Native and ships to iOS and Android. If a developer wants to capture infrasound (sub-20 Hz pressure events — wind, HVAC drone, vehicle resonance, structural shake, distant thunder, large machinery): do not use the microphone on either platform. Use the IMU.
The current useSensorSidecar hook uses expo-sensors DeviceMotion at 100 ms intervals (10 Hz). That’s a 5 Hz Nyquist ceiling — below the infrasound band, on both platforms. The fix has two layers:
- JS layer (immediate, ≤50 Hz): keep `expo-sensors`; lower the interval and read `acceleration` (gravity-compensated). Works on iOS and Android, capped by the React Native JS bridge.
- Native layer (target, 200 Hz): add a custom `react-native-nitro-modules` module that reads Core Motion on iOS and `SensorManager` on Android, batches samples, and pushes them via JSI without bridge serialization.
Report values in m/s² (or acceleration PSD). Convert to dB SPL only with explicit calibration.
Why the Smartphone Microphone Cannot Capture Infrasound
Three independent barriers kill mic-based infrasound capture. They apply identically to iOS and Android — the physics doesn’t change with the OS.
Barrier 1 — Platform DSP rolls off below the speech band
- iOS: `AVAudioSession` in the default `Voice`/`VideoRecording` modes applies a steep high-pass filter near 250 Hz at ~24 dB/octave. Mitigated only by `AVAudioSession.Mode.measurement`.
- Android: the equivalent `AudioSource.MIC` and `VOICE_RECOGNITION` paths apply OEM-specific high-pass and noise-suppression DSP. `AudioSource.UNPROCESSED` (API 24+) bypasses most of it but is not honoured uniformly across vendors.
Either way, the default capture path of a React Native `vision-camera` recording produces audio with the infrasound content already discarded by the time it reaches the encoder.
Barrier 2 — MEMS microphone physics
Smartphone MEMS capsules — both iOS and Android — are typically −3 dB at ~60–100 Hz with steep mechanical rolloff below. The vendor of the capsule does not change the floor by more than a few Hz.
Barrier 3 — Wavelength vs. membrane size mismatch
A 10 Hz pressure wave has a wavelength of ~34 m in air. The MEMS membrane is ~1 mm. The microphone is acoustically tiny relative to the wavelength — geometric coupling efficiency collapses. This is geometry, not signal processing. No software setting on either OS can fix it.
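The mismatch is a one-line computation. A sketch (343 m/s is the assumed speed of sound at roughly 20 °C; names are illustrative):

```typescript
// λ = c / f: wavelength of a pressure wave in air.
const SPEED_OF_SOUND_M_S = 343; // ~20 °C, at sea level
const wavelengthM = (freqHz: number): number => SPEED_OF_SOUND_M_S / freqHz;

const MEMS_MEMBRANE_M = 0.001; // ~1 mm capsule membrane
console.log(wavelengthM(10));                   // 34.3 m
console.log(wavelengthM(10) / MEMS_MEMBRANE_M); // ≈ 34,300: membrane is four orders of magnitude too small
```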
Bonus failure mode: clipping under loud sources
iPhone built-in mics clip near 104 dB SPL. Android phones vary but most consumer hardware clips between 100–115 dB SPL. Real loud infrasound (near-source thunder, large diesel, structural events) runs 100–140 dB. The encoder receives clipped, harmonically-distorted output where the actual fundamental is buried in intermodulation products inside the audible band. Louder is not better for the microphone path.
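To see how far over the clip point loud infrasound runs, dB SPL converts to RMS pressure via the standard reference of 20 µPa. A minimal sketch (the 104/140 dB figures are the ones quoted above):

```typescript
// p = p0 · 10^(dB/20), with p0 = 20 µPa (the dB SPL reference pressure).
const dbSplToPa = (db: number): number => 20e-6 * Math.pow(10, db / 20);

console.log(dbSplToPa(104).toFixed(2)); // "3.17" Pa: typical iPhone mic clip point
console.log(dbSplToPa(140).toFixed(0)); // "200" Pa: loud structural infrasound, ~63× over the clip point
```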
Why the IMU Is the Correct Sensor (on both platforms)
Loud infrasound couples physically into the phone chassis. The accelerometer reads that coupling directly. Both platforms expose this sensor through a roughly equivalent API surface, accessed in Peregrine via expo-sensors.
| Property | Microphone (raw audio path) | IMU (expo-sensors → native) |
|---|---|---|
| Low-frequency response | Rolls off below ~60 Hz, dies below ~20 Hz | Flat to DC (0 Hz) on both iOS and Android |
| Platform DSP interference | iOS HPF/AGC, Android OEM noise suppression | None — raw inertial data |
| Self-noise vs. loud infrasound | Clips at ~100–115 dB SPL | Noise floor ≈ 100–200 µg/√Hz; ceiling ±2 g to ±16 g |
| Couples loud pressure waves? | Membrane physically too small for sub-20 Hz wavelengths | Whole device is the transducer — pressure events shake the chassis efficiently |
iOS exposes CMMotionManager.deviceMotion (Core Motion). Android exposes SensorManager with Sensor.TYPE_LINEAR_ACCELERATION (gravity-compensated) and Sensor.TYPE_ACCELEROMETER (raw). expo-sensors DeviceMotion wraps both.
Sample-Rate Math
Peregrine’s useSensorSidecar.ts currently sets SAMPLE_INTERVAL_MS = 100 and registers a DeviceMotion.addListener. That’s 10 Hz — Nyquist 5 Hz — below the infrasound band ceiling of 20 Hz. The current data is aliased for any infrasound use case.
| Target band | Minimum sample rate | DeviceMotion.setUpdateInterval | Notes |
|---|---|---|---|
| 0–10 Hz (deep infrasound only) | 25 Hz | 40 ms | 2.5× the existing 10 Hz rate |
| 0–20 Hz (full infrasound) | 50 Hz | 20 ms | Recommended floor — reachable in pure JS |
| 0–50 Hz (infrasound + low-bass overlap) | 125 Hz | 8 ms | JS bridge starts to drop samples; needs native batching |
| 0–100 Hz (full audible-overlap region) | 200 Hz | 5 ms | Recommended target — requires custom Nitro module on both platforms |
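The interval column is just `1000 / rate`, and each row is a Nyquist check. A small helper makes that explicit (function names are illustrative, not from the codebase):

```typescript
// intervalMs → sample rate → Nyquist ceiling; coversBand() checks a target band.
function sampleRateHz(intervalMs: number): number {
  return 1000 / intervalMs;
}
function nyquistHz(intervalMs: number): number {
  return sampleRateHz(intervalMs) / 2;
}
function coversBand(intervalMs: number, bandCeilingHz: number): boolean {
  return nyquistHz(intervalMs) >= bandCeilingHz;
}

console.log(nyquistHz(100));     // 5: the current 100 ms interval sits below the 20 Hz band
console.log(coversBand(20, 20)); // true: 20 ms (50 Hz) captures full infrasound
console.log(coversBand(5, 100)); // true: 5 ms (200 Hz) reaches the 0–100 Hz target
```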
The JS bridge in React Native serialises every sensor sample as JSON and ships it across the JS↔native boundary. At ≤50 Hz this is fine; above that, sustained throughput becomes unreliable across both platforms even though the underlying hardware is capable. The fix is to capture at native rate and push samples through JSI (Nitro Modules) without bridge serialisation.
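To make "native batching" concrete: the native side can accumulate samples and invoke one JSI callback at a fixed cadence, so the JS thread sees a handful of calls per second instead of hundreds of serialized messages. The batch-size arithmetic, with illustrative numbers:

```typescript
// At sampleRateHz, delivering batches every batchMs means each JSI callback
// carries sampleRateHz · batchMs / 1000 samples; e.g. 10 callbacks/s instead
// of 200 individual bridge messages at the 200 Hz target.
function samplesPerBatch(sampleRateHz: number, batchMs: number): number {
  return Math.round(sampleRateHz * (batchMs / 1000));
}

console.log(samplesPerBatch(200, 100)); // 20 samples per batch, 10 callbacks/s
console.log(samplesPerBatch(200, 50));  // 10 samples per batch, 20 callbacks/s
```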
Implementation
Path 1 — JS layer with expo-sensors (works on iOS + Android, ≤50 Hz)
This is the lowest-risk change. Patch useSensorSidecar.ts:
import { DeviceMotion } from "expo-sensors"
// Was: const SAMPLE_INTERVAL_MS = 100 // 10 Hz
const SAMPLE_INTERVAL_MS = 20 // 50 Hz — captures the full infrasound band
DeviceMotion.setUpdateInterval(SAMPLE_INTERVAL_MS)
const sub = DeviceMotion.addListener((data) => {
// data.acceleration is gravity-compensated (m/s²), available on iOS and Android
// (data.accelerationIncludingGravity exists if you need the raw sensor reading)
const { x, y, z } = data.acceleration ?? { x: 0, y: 0, z: 0 }
appendSample({ timestampMs: now(), ax: x, ay: y, az: z })
})
// clean up with sub.remove() when the component unmounts
Critical: use data.acceleration (gravity-compensated by the underlying platform fusion), not data.accelerationIncludingGravity. The “including gravity” variant carries the static 1 g vector that dominates the FFT at DC and forces unnecessary high-pass filtering downstream. expo-sensors exposes both fields uniformly across iOS and Android.
This path is sufficient for the full infrasound band (0–20 Hz). Above 50 Hz it gets unreliable.
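The aliasing failure at the old 10 Hz rate can be shown numerically: a tone above Nyquist folds back in-band to a plausible-looking lower frequency. A sketch (helper name is illustrative):

```typescript
// Frequency that a real tone folds to after sampling at fs:
// fAlias = |f − fs · round(f / fs)|. Anything above fs/2 lands back in-band.
function aliasedFreqHz(fHz: number, fsHz: number): number {
  return Math.abs(fHz - fsHz * Math.round(fHz / fsHz));
}

console.log(aliasedFreqHz(7, 10)); // 3: a 7 Hz HVAC drone shows up as a fake 3 Hz line at the old rate
console.log(aliasedFreqHz(7, 50)); // 7: at 50 Hz sampling the tone is captured faithfully
```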
Path 2 — Native layer via react-native-nitro-modules (200 Hz target, both platforms)
PhenomApp already depends on react-native-nitro-modules ^0.33.7 — the infrastructure is in place. Add a new module (e.g. react-native-imu-highrate) that exposes:
// JS-side TypeScript spec
export interface IMUStream {
start(intervalMs: number): void // 5 ms = 200 Hz
stop(): void
// Native pushes batches of samples via a JSI callback,
// bypassing the JS bridge serialisation entirely.
onBatch(handler: (samples: Sample[]) => void): void
}
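One way the JS side might consume those batches is to accumulate them into a fixed-capacity history buffer. This is a sketch against the spec above; the `Sample` shape and `ImuRing` class are assumptions, not a shipped API:

```typescript
// Assumed per-sample shape: timestamp plus gravity-compensated axes in m/s².
interface Sample { t: number; ax: number; ay: number; az: number }

// Fixed-capacity history: old samples fall off the front as batches arrive.
class ImuRing {
  private buf: Sample[] = [];
  constructor(private capacity: number) {}
  push(batch: Sample[]): void {
    this.buf.push(...batch);
    if (this.buf.length > this.capacity) {
      this.buf.splice(0, this.buf.length - this.capacity); // drop oldest
    }
  }
  snapshot(): Sample[] { return [...this.buf]; }
}

// e.g. 10 s of history at 200 Hz:
const ring = new ImuRing(200 * 10);
// stream.onBatch((samples) => ring.push(samples));
// stream.start(5); // 5 ms → 200 Hz
```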
iOS (Swift) — Core Motion
let mm = CMMotionManager()
mm.deviceMotionUpdateInterval = intervalMs / 1000.0 // 0.005 = 200 Hz
mm.startDeviceMotionUpdates(to: queue) { motion, _ in
guard let m = motion else { return }
// userAcceleration is gravity-compensated (m/s²)
let s = (m.userAcceleration.x, m.userAcceleration.y, m.userAcceleration.z, m.timestamp)
// batch and forward to JS via Nitro hybrid object
}
Use userAcceleration, not accelerometerData.acceleration. userAcceleration is the gravity-compensated AC component you actually want. Raw accelerometer readings include the static 1 g gravity vector.
Android (Kotlin) — SensorManager
val sm = ctx.getSystemService(SENSOR_SERVICE) as SensorManager
val linearAccel = sm.getDefaultSensor(Sensor.TYPE_LINEAR_ACCELERATION)
sm.registerListener(object : SensorEventListener {
override fun onSensorChanged(e: SensorEvent) {
// values[0..2] are x, y, z in m/s² with gravity already removed
// e.timestamp is monotonic nanoseconds
enqueue(e.values[0], e.values[1], e.values[2], e.timestamp)
}
override fun onAccuracyChanged(s: Sensor, accuracy: Int) {}
}, linearAccel, SensorManager.SENSOR_DELAY_FASTEST)
// SENSOR_DELAY_FASTEST = 0 µs hint = "as fast as the hardware supports"
// On flagship Android the accelerometer typically delivers 200–1000 Hz
Use Sensor.TYPE_LINEAR_ACCELERATION (gravity-compensated, analogous to iOS userAcceleration), not Sensor.TYPE_ACCELEROMETER (raw, includes gravity). SENSOR_DELAY_FASTEST is a hint — Android may down-sample under battery-save modes; for foreground recording this is rarely a problem, but consider the SensorDirectChannel API (Android 8+) for guaranteed continuous high-rate streams.
Where this lives in the existing pipeline
Peregrine’s iOS post-processor (react-native-video-postprocessor, in Ext/) generates two SRT tracks during background processing:
- Track 0: human-readable telemetry (10 Hz)
- Track 1: raw JSON telemetry (10 Hz)
iOS storage — Option A (third SRT track)
Add Track 2: high-rate IMU @ 200 Hz at mux time. Same C2PA-signing path as Tracks 0 and 1. Embedded in the .mov. Rendering engine ignores it unless explicitly requested. Recommended on iOS for forensic-grade phenomena because the high-rate IMU stays inside the C2PA-authenticated container.
Android storage — sidecar JSON until postprocessor parity
Per the project’s known constraints, the Android HybridVideoPostprocessor.kt is currently a stub of its iOS equivalent — full background SRT-mux + C2PA-signing parity has not landed. Until that ships:
- Write a sidecar JSON file (e.g. `<videoId>.imu.json`) keyed to the same start timestamp as the recording, alongside the `.mp4`/`.mov`.
- Hash the sidecar separately and reference the hash in any downstream provenance manifest you do produce.
- When `HybridVideoPostprocessor.kt` reaches parity, migrate Android to the third-SRT-track layout.
This split is honest about the platform gap rather than pretending feature parity exists.
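A sketch of what the sidecar and its hash could look like. The field names are suggestions, not a shipped schema; hashing is shown with Node's `crypto` for illustration, while in the app you would use an on-device digest (e.g. expo-crypto):

```typescript
import { createHash } from "node:crypto";

// Proposed sidecar layout (illustrative): one flat array of [tMs, ax, ay, az]
// tuples in m/s², keyed to the recording's start timestamp.
interface ImuSidecar {
  videoId: string;
  startTimestampMs: number; // same clock origin as the recording
  sampleRateHz: number;
  samples: Array<[number, number, number, number]>;
}

function serializeSidecar(s: ImuSidecar): { json: string; sha256: string } {
  const json = JSON.stringify(s);
  const sha256 = createHash("sha256").update(json).digest("hex");
  // write `json` to <videoId>.imu.json; reference `sha256` in the provenance manifest
  return { json, sha256 };
}
```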
Storage volume
At 200 Hz with three axes and double-precision floats, raw output is ~4.8 KB/s before encoding. A 60-second clip ≈ 290 KB. Within the existing post-processing budget on iOS and well within sidecar-file budgets on Android.
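The arithmetic behind those figures, as a sketch (helper name is illustrative):

```typescript
// Raw payload: rateHz samples/s × 3 axes × 8 bytes (float64), before encoding.
// Timestamps and container overhead are excluded, matching the estimate above.
function rawBytesPerSecond(rateHz: number, axes = 3, bytesPerValue = 8): number {
  return rateHz * axes * bytesPerValue;
}

console.log(rawBytesPerSecond(200));      // 4800 B/s ≈ 4.8 KB/s
console.log(rawBytesPerSecond(200) * 60); // 288000 B ≈ 290 KB for a 60 s clip
```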
Reporting Units — Honesty Discipline
The captured stream is acceleration (m/s²), not pressure (Pa) or sound pressure level (dB SPL). Do not silently convert.
- Default report unit: `m/s²` per axis, or magnitude `√(ax² + ay² + az²)`.
- Frequency-domain views: acceleration power spectral density in `(m/s²)²/Hz` or `g²/Hz`.
- dB SPL conversion is a calibrated derivation, not a free transform. The relationship between incident sound pressure (Pa) and observed chassis acceleration (m/s²) depends on phone model, OS, grip stiffness, mounting, and source incidence angle. Calibration must be redone per device class — an iPhone 17 Pro and a Samsung S25 will not share a constant. Without a documented constant tied to a specific physical setup, any “dB SPL” number derived from the IMU is fiction. If the UI surfaces dB SPL, label it clearly as “derived (uncalibrated)” and document the constant in use.
This matches the Peregrine principle of forensic integrity: report what was measured, not what looks impressive.
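A minimal helper that keeps the report in measured units (the function name is illustrative):

```typescript
// Magnitude of the gravity-compensated acceleration vector, in m/s².
// This is the default report unit; no SPL conversion is performed here.
function accelMagnitude(ax: number, ay: number, az: number): number {
  return Math.sqrt(ax * ax + ay * ay + az * az);
}

console.log(accelMagnitude(3, 4, 0)); // 5: reported as 5 m/s², never as an uncalibrated "N dB SPL"
```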
What This Guide Explicitly Does Not Recommend
- Do not use the microphone for sub-20 Hz capture on iOS or Android. The geometric barrier (Barrier 3) cannot be cleared with software.
- Do not ship Swift-only or iOS-only sensor code paths. Peregrine targets both platforms; both need parity.
- Do not claim 200 Hz is reliably reachable via pure-JS `expo-sensors`. The JS bridge caps real-world sustained throughput well below that. 200 Hz needs the Nitro path.
- Do not publish absolute dB SPL values from IMU data without a documented calibration procedure — and remember that calibration is per-device-class.
- Do not keep the existing 10 Hz `SAMPLE_INTERVAL_MS = 100` if the infrasound use case ships. Aliasing is silent corruption — the data will look plausible while being wrong.
See Also
- Sensors overview — Expo-level sensor inventory; this guide replaces its accelerometer entry for the infrasound use case
- Architecture — where the post-processor sits in the overall pipeline
- PRD — original product requirements
Changelog
- 2026-05-10 — v2: Rewritten for Expo + React Native cross-platform reality. iOS-only Swift framing replaced with `expo-sensors` JS layer + Nitro Module upgrade path covering iOS Core Motion and Android `SensorManager`. Storage section split per platform to acknowledge the Android `HybridVideoPostprocessor.kt` stub.
- 2026-05-10 — v1: Initial draft (iOS-only Swift framing). Superseded.