AiGet

AI-Assisted Hidden Knowledge Discovery on Smart Glasses

Paper | Code

Highlight

🔍 Have you ever noticed the hidden surprises around you? 📸

—— How Smart Glasses Reshape Your Curiosity ——

Our latest research, AiGet, will be presented at CHI 2025! ✨

👉 Transform AI into an "environment detective" through proactive perception with smart glasses:

  • Real-time analysis of your gaze and surroundings
  • Providing unexpected but insightful knowledge
  • Personalized recommendations relevant to daily life

Interaction Flow

We support a mixed-initiative method that enables both self-directed (user-initiated) and unintentional (AI-initiated) informal learning. The interaction flow consists of five stages.

Demo clips (first-person view, FPV):

  • User's view with vs. without glasses
  • Mixed-initiative LLM trigger
  • AI-initiated query via constant sensing (FPV over time; trigger point marked)
  • AI-initiated query via implicit gaze (gaze fixation)
  • User-initiated query (ring mouse + verbal query)
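The three trigger paths described above (explicit user input, implicit gaze fixation, and constant background sensing) can be sketched as a small dispatcher. This is a minimal illustration, not the paper's implementation: the threshold values, field names, and the `Frame`/`trigger` helpers are all hypothetical.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical thresholds; the actual values used by AiGet are not given here.
GAZE_FIXATION_SEC = 2.0   # implicit-gaze trigger: dwell time on one object
SCENE_CHANGE_SEC = 30.0   # constant-sensing trigger: min gap between AI queries

@dataclass
class Frame:
    gaze_fixation_sec: float     # how long gaze has rested on one object
    sec_since_last_query: float  # time since the last delivered knowledge item
    scene_changed: bool          # FPV content differs from the last query's scene
    user_pressed_ring: bool      # explicit ring-mouse press (with verbal query)

def trigger(frame: Frame) -> Optional[str]:
    """Mixed-initiative trigger: explicit user input wins, then implicit
    gaze, then periodic constant sensing; otherwise stay silent."""
    if frame.user_pressed_ring:
        return "user-initiated"
    if frame.gaze_fixation_sec >= GAZE_FIXATION_SEC:
        return "ai-implicit-gaze"
    if frame.scene_changed and frame.sec_since_last_query >= SCENE_CHANGE_SEC:
        return "ai-constant-sensing"
    return None  # no query this frame

# Example: a long gaze fixation fires the implicit-gaze path.
print(trigger(Frame(2.5, 10.0, False, False)))  # → ai-implicit-gaze
```

Ordering the checks this way keeps the user in control: an explicit query always preempts AI-initiated suggestions, and the constant-sensing path is rate-limited so the system does not interrupt too often.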

Comparison with Baselines

We conducted an ablation study comparing the AiGet pipeline with two baselines:

(1) Baseline w/o R: a baseline that removes the Rules we used while keeping the User Profile.

(2) Baseline w/o RP: a baseline that removes both the Rules and the User Profile.

We demonstrate two scenarios below to show why rules and the user profile are needed to 1) balance real-time intention with long-term interest, and 2) provide "surprises" at suitable moments.
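The three conditions differ only in which context segments reach the LLM, which can be sketched as a prompt-assembly function. The segment texts, function name, and prompt wording below are placeholders for illustration, not the paper's actual prompts.

```python
from typing import Optional

def build_prompt(scene: str, rules: Optional[str], profile: Optional[str]) -> str:
    """Assemble the LLM prompt; dropping a segment yields a baseline condition."""
    parts = ["Scene description: " + scene]
    if profile is not None:
        parts.append("User profile: " + profile)
    if rules is not None:
        parts.append("Rules: " + rules)
    parts.append("Generate a brief, surprising piece of knowledge.")
    return "\n".join(parts)

# Placeholder segment contents (hypothetical):
RULES = "Prioritize unfamiliar facts; time suggestions to natural pauses."
PROFILE = "Interests: alcohol, Japanese culture."
SCENE = "sake bottles on a shop shelf"

full_pipeline  = build_prompt(SCENE, RULES, PROFILE)  # AiGet
baseline_wo_r  = build_prompt(SCENE, None, PROFILE)   # w/o R: no Rules
baseline_wo_rp = build_prompt(SCENE, None, None)      # w/o RP: no Rules, no Profile
```

This makes the ablation contrast concrete: w/o R still personalizes via the profile but loses the rule-based timing and selection, while w/o RP generates from the scene alone.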

First-Person View (FPV)

User interests: alcohol & Japanese culture