Research Analysis & Strategy Validation

Summary of research findings from user interviews and surveys, validating the core design strategy and identifying critical gaps for the Phenom App.
This document summarizes the **Research Analysis** conducted on August 7th, 2025. The findings confirm our core strategy while revealing critical gaps that require adjustments to our development priorities.

What We Did

  • 2 user interviews conducted.
  • 2 survey results analyzed.
  • Research Quality: 4 expert users, including a MUFON investigator and a content creator.
  • Outcome: The small sample revealed 3 critical retention barriers.
  • Validation: Confirms the overall strategy direction while revealing critical gaps.

Key Finding

Users consistently prioritize recording readiness and content quality over social features, confirming the scientific approach.


Major Strategy Validations ✅

✅ Professional Scientific Identity is Core

  • All 4 research participants prefer professional credibility over entertainment.
  • Both survey respondents ranked community chat dead last (6th of 6 features).
  • Interview participants emphasize technical accuracy and scientific methodology.
  • Validation: The “Feel Like a Scientist” emotional job-to-be-done is absolutely correct.

✅ Quality Over Quantity Preference Confirmed

  • Kristi: Won’t sacrifice iPhone quality for app convenience.
  • Survey A: “Remove uploads that are basically noise.”
  • Mark: Even though he prefers quantity, he wants strong filtering capabilities.
  • Validation: C2PA verification is correctly positioned as a P0 (critical priority) feature.

✅ Technical Sensor Data is Highly Valued

  • Mark: “The great thing about this app is with those technical…barometric pressure and magnetometer.”
  • Kristi: Appreciates detailed sensor information and wants anomaly indicators.
  • Survey users: Value both sensor data and verification features.
  • Validation: Enhanced sensor display is correctly positioned as a P0 feature.

Gaps Identified ❌

🟡 Object Identification is Under-Prioritized

  • Current Strategy: 12 stars (Enhanced, P1 priority).
  • Research Reality: Ranked #1 by BOTH survey respondents.
  • User Need: Essential for distinguishing known vs. unknown objects.
  • Required Change: Elevate to P0 priority.

🟡 App Launch Speed is Missing from Performance Matrix

  • Current Strategy: App performance focuses on crashes and stability.
  • User Reality: “Couldn’t get the phone out fast enough to capture.”
  • Impact: Users miss documentation opportunities due to slow launch.
  • Required Change: Add “instant launch capability” to P0 technical requirements.

🟡 2 Distinct User Paths Identified

  1. Unexpected Sighting: Instant app launch → One-tap recording → AR object identification.
  2. Research / Expected Activity: Full sensor data → Quality recording → Analysis tools.
  • Required Change: The progressive disclosure strategy is correct, but interface complexity needs to be user-selectable: power users should be able to reach full functionality immediately, while the “unexpected sighting” path keeps simple defaults. A minimal sketch of the two capture modes follows this list.
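One way to read this change, sketched in Swift below, is that the app would expose two explicit capture modes instead of a single progressive path. The mode names, defaults, and behaviors here are illustrative assumptions, not findings from the research or decisions from the strategy.

```swift
/// Hypothetical capture modes reflecting the two user paths above.
/// Names and defaults are assumptions for illustration only.
enum CaptureMode {
    case unexpectedSighting   // instant launch, one-tap recording, AR identification
    case researchSession      // full sensor panel, quality controls, analysis tools

    /// Whether the full sensor panel is shown by default in this mode.
    var showsFullSensorPanel: Bool {
        switch self {
        case .unexpectedSighting: return false
        case .researchSession:    return true
        }
    }

    /// Whether opening the app should drop straight into a ready-to-record state.
    var opensReadyToRecord: Bool {
        switch self {
        case .unexpectedSighting: return true
        case .researchSession:    return false
        }
    }
}
```

Making the mode an explicit, user-selectable setting (rather than inferring it) is what lets power users reach full functionality immediately while the default path stays simple.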

Milestones (Design & Development Ready)

Milestone 1: Users can launch the app instantly and capture basic video with sensor data

DESIGN FIRST: As a user, I want…

  • the camera to be immediately ready for recording when the app opens, so I can capture phenomena without delay.
  • to tap a single record button and start capturing video with sensor data, so documentation is immediate and complete.
  • all recordings saved as drafts automatically, so I never lose documentation due to crashes or mistakes.
  • to see sensor data during recording, so I can assess the technical quality immediately.
  • confirmation that my recording was successful, so I know the evidence was captured.
  • visual indicators when sensor readings are anomalous, so I know when something unusual is detected (a minimal detection sketch follows this list).
  • a glossary of sensor terms accessible during recording, so I understand what I’m capturing.
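As a rough illustration of the “sensor data during recording” and “anomaly indicator” stories above, the sketch below streams magnetometer readings through Core Motion and flags large deviations from a running baseline. The 30 µT threshold and the simple moving baseline are placeholder assumptions, not validated anomaly criteria.

```swift
import CoreMotion

/// Minimal sketch: stream magnetometer readings and flag large deviations
/// from a slowly tracked baseline. Threshold and smoothing are placeholders.
final class SensorAnomalyMonitor {
    private let motion = CMMotionManager()
    private var baselineMagnitude: Double?

    /// Calls `onUpdate` with the latest field magnitude (µT) and an anomaly flag.
    func start(onUpdate: @escaping (_ microtesla: Double, _ isAnomalous: Bool) -> Void) {
        guard motion.isMagnetometerAvailable else { return }
        motion.magnetometerUpdateInterval = 0.1
        motion.startMagnetometerUpdates(to: .main) { [weak self] data, _ in
            guard let self = self, let field = data?.magneticField else { return }
            let magnitude = (field.x * field.x + field.y * field.y + field.z * field.z).squareRoot()

            // Track the ambient field slowly so a sudden jump stands out.
            let baseline = self.baselineMagnitude ?? magnitude
            self.baselineMagnitude = baseline * 0.95 + magnitude * 0.05

            let isAnomalous = abs(magnitude - baseline) > 30 // µT; placeholder threshold
            onUpdate(magnitude, isAnomalous)
        }
    }

    func stop() { motion.stopMagnetometerUpdates() }
}
```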

DEVELOPMENT READY: As a user, I want…

  • the app to launch in under 2 seconds when tapped, so I don’t miss unexpected sightings (a camera pre-warm sketch follows this list).
  • recording to never fail or crash, so I don’t lose critical documentation.
  • video quality that matches iPhone camera performance, so I don’t compromise on evidence quality.
  • zoom functionality that doesn’t cause excessive shakiness, so distant objects remain documentable.
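Meeting the 2-second launch target is largely a question of when the capture pipeline is configured. A common iOS approach, sketched below, is to configure and start the AVCaptureSession on a background queue as early in the launch sequence as possible; authorization checks and error handling are omitted, and the class name is an assumption.

```swift
import AVFoundation

/// Minimal sketch: pre-warm the capture session off the main thread at launch
/// so the camera preview is live as soon as the recording UI appears.
final class CaptureSessionWarmer {
    let session = AVCaptureSession()
    private let sessionQueue = DispatchQueue(label: "capture.session.queue")

    func warmUp() {
        sessionQueue.async {
            self.session.beginConfiguration()
            self.session.sessionPreset = .high

            if let camera = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                    for: .video,
                                                    position: .back),
               let input = try? AVCaptureDeviceInput(device: camera),
               self.session.canAddInput(input) {
                self.session.addInput(input)
            }

            self.session.commitConfiguration()
            // startRunning() blocks, which is why this work stays off the main thread.
            self.session.startRunning()
        }
    }
}
```

Calling `warmUp()` during app startup, rather than when the camera screen first appears, is what buys the perceived instant readiness.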

Milestone 2: Real-time object identification functional with verification

DESIGN FIRST: As a user, I want…

  • to see known objects identified in real-time during recording, so I can focus on genuine unknowns.
  • to toggle object identification on/off during recording, so I can switch between “clean” and “analyzed” modes.
  • identified objects to show basic info (flight number, satellite name), so I have immediate reference data.
  • cryptographic proof that my videos haven’t been altered, so skeptics can’t dismiss evidence as fake (a tamper-evidence sketch appears at the end of this milestone).
  • to know the verification status of my recording, so I understand its credibility level.

DEVELOPMENT READY:

  • (To be defined)
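The development tasks for this milestone are still to be defined. As one possible starting point for the “cryptographic proof” story above: C2PA is a full content-provenance standard (signed manifests, usually integrated through an SDK) rather than something to hand-roll, but a minimal tamper-evidence baseline can be sketched with CryptoKit by hashing the recording together with its sensor metadata. The type and function names below are assumptions.

```swift
import CryptoKit
import Foundation

/// Minimal sketch: derive a SHA-256 digest over the recorded video file and
/// its sensor metadata so later tampering is detectable. This is NOT C2PA;
/// a real implementation would attach a signed provenance manifest instead.
struct RecordingDigest {
    /// Hex-encoded SHA-256 of the video bytes followed by the metadata bytes.
    static func make(videoURL: URL, sensorMetadata: Data) throws -> String {
        let videoData = try Data(contentsOf: videoURL) // fine for a sketch; stream in production
        var hasher = SHA256()
        hasher.update(data: videoData)
        hasher.update(data: sensorMetadata)
        return hasher.finalize().map { String(format: "%02x", $0) }.joined()
    }
}
```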

Milestone 3: Complete recording-to-sharing workflow addressing critical adoption barrier

DESIGN FIRST: As a user, I want…

  • an interactive tutorial that teaches proper recording technique, so I create quality documentation.
  • a widget or shortcut to bypass app launch entirely, so I can start recording from the locked screen.
  • recording tips that appear when I’m shaking the camera, so I improve documentation quality (a shake-detection sketch appears at the end of this milestone).
  • to record reference terrain automatically, so viewers have context for scale and location.
  • notifications when new reports appear near my location, so I can correlate or investigate.

DEVELOPMENT READY:

  • (To be defined)
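Development tasks here are also still to be defined, but the “recording tips when I’m shaking the camera” story above lends itself to a simple heuristic: watch Core Motion’s user acceleration and surface a tip when it stays high for a sustained stretch. The 0.3 g threshold and 1-second window below are placeholder assumptions, not tuned values.

```swift
import CoreMotion

/// Minimal sketch: flag sustained camera shake from device motion so the UI
/// can surface a stabilization tip. Threshold and window are placeholders.
final class ShakeDetector {
    private let motion = CMMotionManager()
    private var shakyStreak = 0

    func start(onSustainedShake: @escaping () -> Void) {
        guard motion.isDeviceMotionAvailable else { return }
        motion.deviceMotionUpdateInterval = 0.1
        motion.startDeviceMotionUpdates(to: .main) { [weak self] data, _ in
            guard let self = self, let a = data?.userAcceleration else { return }
            let magnitude = (a.x * a.x + a.y * a.y + a.z * a.z).squareRoot()

            // Count consecutive shaky samples; 10 samples ≈ 1 second at 10 Hz.
            self.shakyStreak = magnitude > 0.3 ? self.shakyStreak + 1 : 0
            if self.shakyStreak == 10 { onSustainedShake() }
        }
    }

    func stop() { motion.stopDeviceMotionUpdates() }
}
```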

(Note: Milestone 4 in the presentation is a duplicate of Milestone 3 and has been omitted for clarity.)


Research Impact

Small research investment revealed retention barriers

  • Users abandon the app due to slow launch ➡️ Launch speed optimization added
  • Content ownership concerns block adoption ➡️ Video sharing capabilities added
  • Missing #1 user priority (AR object identification) ➡️ Feature moved to P0
  • Investigators avoid the core workflow ➡️ Filtering and search flow moved to P1

Projected Retention Impact

  • 2x more users publishing verified recordings monthly.
  • 40% reduction in uninstalls within the first 30 days after release.
  • Professional user adoption: MUFON investigators and content creators.

How will we know retention is changing?

  • Installs-to-uninstalls ratio from app analytics.
  • Number of users who publish 2+ verified recordings per month.
  • Performance monitoring (launch time, crash rate).

From Research to Implementation

Next Steps:

  • Prototyping: Create a prototype for the Core Recording System.
  • Design Validation: Test the prototype with current users.