## Video stream session closes after ~30 seconds regardless of configuration

### Environment
- Device: Android (API 29+)
- SDK version: 0.4.0
- Glasses: Meta Ray-Ban Smart Glasses

### Description
Video stream sessions initiated via `startStreamSession` consistently close after approximately 30 seconds of streaming, regardless of `StreamConfiguration` settings. The session goes through the expected state transitions (STOPPED → STARTING → STARTED → STREAMING) and delivers frames successfully, but then abruptly transitions to STOPPED → CLOSED after ~30 seconds. The device briefly disconnects (`activeDevice` becomes null), then immediately reconnects.

### Observed behavior
```
17:54:07.140 Stream state changed: STREAMING
17:54:26.435 Received video frame: 360x640
17:54:27.406 Received video frame: 360x640
...frames arriving normally at ~2fps...
17:54:32.936 Received video frame: 360x640
17:54:49.669 Stream state changed: STOPPED
17:54:49.678 Active device: null
17:54:49.685 Stream state changed: CLOSED
17:54:50.420 Active device: 2f4c18db2c729d4d2f68364e4e951078
```

### What I've tested
- VideoQuality.MEDIUM at 24fps → disconnects after ~30s
- VideoQuality.LOW at 5fps → disconnects after ~30s
- The stream is being consumed via a Kotlin Flow (`session.videoStream.collect`)
- The device reconnects immediately after each disconnect, suggesting the Bluetooth connection itself is fine
- There is also a notable ~19 second gap between the STREAMING state and the first frame arriving

### Expected behavior
The stream session should remain open as long as the app is actively collecting frames and the glasses are connected.
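The timing claims in the report can be checked directly against the log excerpt. Below is a small, self-contained Kotlin sketch with no SDK dependency; `secondsBetween` is my own helper, and the timestamps are copied from the logs above:

```kotlin
import java.time.Duration
import java.time.LocalTime

// Hypothetical helper (not part of the DAT SDK): computes the gap in
// seconds between two "HH:mm:ss.SSS" logcat timestamps.
fun secondsBetween(start: String, end: String): Double =
    Duration.between(LocalTime.parse(start), LocalTime.parse(end)).toMillis() / 1000.0

fun main() {
    // Timestamps copied from the log excerpt in this report.
    val streaming = "17:54:07.140"  // Stream state changed: STREAMING
    val firstFrame = "17:54:26.435" // first "Received video frame"
    val stopped = "17:54:49.669"    // Stream state changed: STOPPED

    println("Gap before first frame: " + secondsBetween(streaming, firstFrame) + " s")
    println("STREAMING -> STOPPED:   " + secondsBetween(streaming, stopped) + " s")
}
```

This confirms a first-frame gap of roughly 19.3 s, consistent with the "~19 second gap" noted above.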
Clicking the buttons in the Mock DeviceKit Activity gives no indication that the action was executed. For example, pressing Don/Doff does not update the UI state, so there is no way to tell what state the Mock Device is in. All the buttons appear enabled at all times. https://github.com/user-attachments/assets/362a68b3-dc62-49ba-8423-d1188065fda4 Can this be addressed, please?
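One possible shape of a fix, sketched without the actual Mock DeviceKit types (all names below are hypothetical, not from the SDK): keep the mock state in a small holder and derive the button label from it, so every press visibly changes the UI:

```kotlin
// Hypothetical sketch: a tiny state holder for the Mock DeviceKit screen
// so the Don/Doff button reflects the current mock state.
enum class WearState { DONNED, DOFFED }

class MockDeviceUiState {
    var wearState: WearState = WearState.DOFFED
        private set

    // Called from the Don/Doff button handler; returns the label the
    // button should show next, so the UI always mirrors the mock state.
    fun toggleDonDoff(): String {
        wearState = if (wearState == WearState.DONNED) WearState.DOFFED else WearState.DONNED
        return if (wearState == WearState.DONNED) "Doff" else "Don"
    }
}
```

Even just changing the button label (or disabling the action that no longer applies) would make the mock state observable.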
Repository: facebook/meta-wearables-dat-android. Description: Meta Wearables Device Access Toolkit for Android Stars: 193, Forks: 47. Open PRs: 3, open issues: 3. Last activity: 3w ago. Community health: 75%. Top contributors: facebook-github-bot.
Hi Meta Wearables team,

Thank you for providing this SDK and making it available to developers. I'm excited to build with the Ray-Ban Meta glasses! I've been working on integrating the camera streaming feature into my Android app, but I'm encountering a persistent issue with the Bluetooth Classic (RFCOMM) connection. I've done extensive debugging and verified all the prerequisites mentioned in the documentation, but the problem persists. I would really appreciate any guidance or insights you could provide. Thank you in advance for your help!

### What I was trying to do
I was trying to stream video from my Ray-Ban Meta glasses to my Android app using the Meta Wearables Device Access Toolkit (DAT SDK version 0.3.0). I followed the official CameraAccess sample app and configured it with my registered APPLICATION_ID from the Wearables Developer Center.

### What happened instead
When I tap "Start Streaming", the app shows a loading spinner indefinitely. The stream never starts: it stays stuck in StreamSessionState.STARTING and never transitions to STREAMING. Additionally, the glasses become very hot (overheating) during these failed connection attempts.
### Steps to reproduce

1. Setup:
   - Phone: Solana Saga (Android 14)
   - Glasses: Ray-Ban Meta (firmware up to date)
   - SDK: Meta Wearables DAT 0.3.0
   - Sample app: CameraAccess from the official GitHub repository
2. Configuration:
   - Added APPLICATION_ID in AndroidManifest.xml using a string resource (to avoid the Integer/String type-casting issue)
   - Developer Mode enabled in the Meta AI app
   - Glasses paired via Bluetooth and connected to Meta AI
3. Steps:
   - Launch the CameraAccess app
   - Tap "Connect my glasses" → successfully redirected to Meta AI and authorized
   - Return to CameraAccess → the "Start Streaming" button is enabled
   - Tap "Start Streaming"
   - Result: infinite loading spinner, no video displayed
4. Observed in logs (adb logcat):
5. What works:
   - App registration with Meta AI ✅
   - Device detection (glasses appear as connected) ✅
   - Data synchronization (RLDrive sync messages) ✅
   - BTC lease creation ("BTC lease created for ACDC app") ✅
6. What fails:
   - RFCOMM/Bluetooth Classic socket connection for video streaming ❌

### Additional notes
The glasses overheat significantly during the failed connection attempts. The RFCOMM connection fails repeatedly even after:
1. Restarting the glasses (fold/unfold, power cycle)
2. Restarting both apps (Meta AI and CameraAccess)
3. Re-pairing Bluetooth

The low-speed BLE connection works fine (sync data flows normally); only the high-speed BTC/RFCOMM connection for video streaming fails.

### Expected behavior
After tapping "Start Streaming", the app should transition to StreamSessionState.STREAMING and display the live video feed from the glasses camera.
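Independent of the root cause, the app can at least avoid the indefinite spinner by bounding the wait on the state transition. This is a hypothetical, SDK-free sketch: the enum mirrors the state names from this report, and `awaitStreaming` is my own helper, not an SDK API:

```kotlin
enum class StreamSessionState { STOPPED, STARTING, STARTED, STREAMING, CLOSED }

// Hypothetical watchdog: polls a state supplier and gives up if STARTING
// never becomes STREAMING, instead of spinning forever.
fun awaitStreaming(
    timeoutMs: Long,
    pollMs: Long = 100,
    now: () -> Long = System::currentTimeMillis,
    state: () -> StreamSessionState
): Boolean {
    val deadline = now() + timeoutMs
    while (now() < deadline) {
        when (state()) {
            StreamSessionState.STREAMING -> return true
            StreamSessionState.CLOSED -> return false
            else -> Thread.sleep(pollMs) // still STARTING/STARTED: keep waiting
        }
    }
    return false // stuck in STARTING: surface an error instead of an endless spinner
}
```

On a `false` result the app could tear the session down and prompt the user to retry, which would also stop the repeated (and apparently overheating) connection attempts.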
### Environment
- Device: Pixel 9a
- Android version: 16
- Meta Glasses SDK version: 0.3.0
- Meta Glasses model: Ray-Ban Meta Smart Glasses

### Description
The Meta SDK video stream experiences severe FPS drops (from ~24 fps to ~1-3 fps) whenever any video or audio encoder is active on the device, regardless of whether it is hardware or software encoding, and even when the encoder processes completely different video sources (not Meta frames). This makes it impossible to implement real-time streaming applications (RTMP, WebRTC, etc.) using Meta Glasses video while encoding the stream.

### Expected Behavior
The Meta SDK should deliver video frames at a consistent ~24 fps, even when other video/audio encoders are active on the device, as long as system resources are available.

### Actual Behavior
As soon as any video encoder is initialized and started (even before it processes any frames), the Meta SDK video stream slows down to ~1-3 fps.

### Steps to Reproduce
- Baseline (no encoder): works normally → 5-24 FPS (good performance) ✅
- With an H264 hardware encoder: → 0.5-3 FPS (completely broken) ❌
- With an H264 SOFTWARE encoder: → 1-3 FPS (still broken, even with the software encoder) ❌

### Analysis

What we tested:
1. Hardware H264 encoder → Meta throttles to ~1 fps
2. Software H264 encoder → Meta throttles to ~1 fps
3. Audio AAC encoder → Meta throttles to ~1 fps
4. Multiple RTMP libraries (HaishinKit, RootEncoder) → same issue with both
5. Separate threads with different priorities → no improvement
6. Reduced encoder parameters (480p @ 10fps, 500kbps) → no improvement

What doesn't cause the issue:
- Hardware encoder resource contention (the software encoder has the same issue)
- CPU load (encoders run on separate low-priority threads)
- A specific RTMP library implementation (it happens with all libraries)
- Frame processing time (Meta frames process in …)

```kotlin
// Collect Meta frames and measure FPS
session.videoStream.collect { frame ->
    val fps = calculateFPS() // Measures ~20+ fps initially
    Log.d("Meta", "FPS: $fps")
}

// Now initialize any encoder (it doesn't even need to process frames)
val encoder = MediaCodec.createEncoderByType(MediaFormat.MIMETYPE_VIDEO_AVC)
val format = MediaFormat.createVideoFormat(MediaFormat.MIMETYPE_VIDEO_AVC, 640, 480)
format.setInteger(MediaFormat.KEY_BIT_RATE, 1000000)
format.setInteger(MediaFormat.KEY_FRAME_RATE, 24)
encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE)
encoder.start() // FPS immediately drops to ~1-3 fps
```

### Additional Context
This issue has been verified across:
- Multiple devices
- Multiple encoding libraries (HaishinKit, RootEncoder)
- Both hardware and software encoders
- Both video and audio encoders

The consistent behavior across all these scenarios strongly suggests this is a Meta SDK limitation rather than an implementation issue. Thank you for investigating this issue!
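The `calculateFPS()` helper referenced in the snippet is not shown in the report; a self-contained rolling-window counter like the hypothetical one below (with an injectable clock, so it is testable off-device) is enough to observe the drop:

```kotlin
// Hypothetical stand-in for calculateFPS(): counts frame timestamps
// inside a sliding window and reports the current frames-per-second.
class FpsCounter(
    private val windowMs: Long = 1000,
    private val now: () -> Long = System::currentTimeMillis
) {
    private val stamps = ArrayDeque<Long>()

    // Call once per received frame; returns the FPS over the last window.
    fun onFrame(): Double {
        val t = now()
        stamps.addLast(t)
        while (stamps.first() < t - windowMs) stamps.removeFirst() // drop stale stamps
        return stamps.size * 1000.0 / windowMs
    }
}
```

Calling `onFrame()` from the frame-collection callback gives a per-second rate that makes the ~24 fps to ~1-3 fps transition easy to log.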
### Environment
- Device: Samsung Galaxy S23
- Meta Glasses SDK version: 0.4.0
- Meta Glasses model: Ray-Ban Meta Smart Glasses
- Our own integration and the demo CameraAccess app

### Actual Behavior
Both in our integration and in the sample CameraAccess app there is a significant initial delay (5-10 seconds) between receiving the first frames from the glasses camera and the preview becoming responsive (e.g. following the movement of my head). After that, the streaming appears to catch up and the frame rate stabilises. Lowering the streaming quality or frame rate doesn't help much. I'm using the default settings:

There is also a significant delay when we invoke … at the beginning of streaming.

### Expected Behavior
The streaming should deliver a stable frame rate from the beginning, without delay. This is crucial when the user's goal is to take a picture.
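If the initial unresponsiveness is caused by the renderer draining a backlog of buffered frames, one consumer-side mitigation to try is rendering only the newest frame and discarding the rest. A minimal, SDK-independent sketch (all names below are mine, not SDK APIs):

```kotlin
import java.util.concurrent.atomic.AtomicReference

// Hypothetical workaround: a single-slot mailbox between the frame
// producer and the renderer. The producer overwrites, never queues, so
// the preview always shows "now" instead of replaying a stale backlog.
class LatestFrameHolder<T : Any> {
    private val latest = AtomicReference<T?>(null)

    fun publish(frame: T) { latest.set(frame) }   // producer: overwrite the slot
    fun take(): T? = latest.getAndSet(null)       // renderer: consume newest, drop the rest
}
```

The same effect can be had with a conflated Kotlin Flow on the collection side; either way, stale frames are dropped rather than rendered, which should shorten the catch-up period even if the initial delivery delay itself is in the SDK.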