Facebook360 Depth Estimation Pipeline - https://facebook.github.io/facebook360_dep
by facebook
Title: Feature/coarsest level batching

Summary
Adds an option to process the first N coarsest pyramid levels in a single worker job, reducing S3 download/upload overhead during AWS renders. The trade-off is that temporal filtering is skipped for the batched levels, which is often acceptable and can significantly speed up renders.
- New flag: (default 0)
- The pipeline batches the coarsest N levels, then processes the remaining levels as usual (with optional temporal filtering)
- Worker updated to handle multi-level runs and to upload outputs for all levels in the batch
- Adds a test to validate that runs succeed over a multi-level range

Changelog
[Render] [Feature] - Batch coarsest N pyramid levels per worker to reduce I/O; skip temporal filtering for batched levels

Test Plan
- Unit tests: added in to run with and ensure multi-level execution works.
- Manual validation (example): run with batching enabled (e.g., batch the 6 coarsest levels). Verify that outputs exist for all batched levels in and that temporal filtering is only applied to the subsequent finer levels (if ).

Files modified:
- (new flag)
- (batching logic + skipping temporal filtering for batched levels)
- (multi-level input/output handling)
- (new multi-level test)
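The level-partitioning logic described above could be sketched as follows. This is an illustrative sketch only, assuming levels are ordered coarsest to finest; `partition_levels` is a hypothetical name, not the actual pipeline API.

```python
def partition_levels(levels, batch_coarse_levels):
    """Split pyramid levels (ordered coarsest -> finest) into one batched
    job covering the first N coarsest levels, plus one job per finer level.

    batch_coarse_levels: value of the new flag; 0 preserves old behavior.
    """
    if batch_coarse_levels <= 0:
        # Old behavior: one worker job per level.
        return [[lvl] for lvl in levels]
    # Single multi-level job for the N coarsest levels (no temporal filtering).
    batched = levels[:batch_coarse_levels]
    # Remaining finer levels run individually, as before.
    remaining = [[lvl] for lvl in levels[batch_coarse_levels:]]
    return ([batched] if batched else []) + remaining

# Example: 10 pyramid levels, batching the 6 coarsest.
jobs = partition_levels(list(range(9, -1, -1)), 6)
# First job covers levels 9..4 in one worker; levels 3..0 run individually
# and can still use temporal filtering.
```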
Hi, first of all thanks for providing this software. The software license (BSD) is clear to me, but it is not clearly stated whether the example data has the same license. Could this be clarified? There is an active standardization project within MPEG called MPEG Immersive Video (MIV) [ https://gitlab.com/mpeg-i-visual/tmiv ], and I was wondering if the example data that you have provided could be used at least for "academic and standardization purposes".
Repository: facebook/facebook360_dep. Description: Facebook360 Depth Estimation Pipeline - https://facebook.github.io/facebook360_dep Stars: 257, Forks: 52. Primary language: HTML. Languages: HTML (88.4%), C++ (5.8%), Python (3.8%), JavaScript (1.2%), CSS (0.5%). Open PRs: 4, open issues: 18. Last activity: 1mo ago. Community health: 87%. Top contributors: aparrapo, h-friederich, bkcabral, tschrager, amyreese, yfeldblum, zertosh, bowiechen, 8Keep, nlutsenko and others.
Summary
This change introduces a new --batch_levels flag in download_meshes.py that allows specifying multiple coarse pyramid levels to download from S3 in a single sync operation. Previously, each level was downloaded individually, which caused repeated downloads and increased processing time at coarse levels. With this update, workers can fetch multiple levels at once, optimizing S3 data transfer without breaking existing functionality. When --batch_levels is not provided, the script behaves exactly as before.

Changelog
[ENHANCEMENT] [FEATURE] - Added --batch_levels flag to optimize S3 downloads for coarse pyramid levels in download_meshes.py

Test Plan
- Verified that download_meshes.py successfully downloads multiple levels when --batch_levels=9,8,7,6,5,4 is used.
- Confirmed that the existing single-level download still works when --batch_levels is not specified.
- Tested tar extraction and watch functionality; all files are correctly unpacked and observer events handled.
- Monitored logs to ensure no errors occur during batch sync.
- Introduced flag in download_meshes.py to specify multiple coarse pyramid levels (e.g., 9,8,7,6,5,4) for a single S3 sync.
- Modified the S3 download logic to include all specified levels in one aws_util.s3_sync call, reducing repeated downloads and speeding up processing at coarse levels.
- Preserved existing behavior when is not provided.
- Ensured tar extraction and watch functionality continue to work as before.

This addresses the initial GitHub issue by optimizing S3 data transfer for coarse levels in the rendering pipeline.
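The flag parsing and single-sync behavior described above could look roughly like this. A minimal sketch, assuming a comma-separated flag value and an include/exclude filter scheme for the sync; `parse_batch_levels`, `build_sync_args`, and the `level_N/` prefix convention are illustrative assumptions, not the actual download_meshes.py implementation.

```python
import argparse

def parse_batch_levels(value):
    """Parse '9,8,7,6,5,4' into a list of ints; empty/None means old behavior."""
    if not value:
        return []
    return [int(v) for v in value.split(",")]

def build_sync_args(levels):
    """Build filter arguments so a single sync call fetches all requested levels.

    Mirrors the AWS CLI convention: exclude everything, then re-include each
    level's prefix, so one sync covers the whole batch.
    """
    args = ["--exclude", "*"]
    for lvl in levels:
        args += ["--include", f"level_{lvl}/*"]
    return args

parser = argparse.ArgumentParser()
parser.add_argument("--batch_levels", default="")
opts = parser.parse_args(["--batch_levels", "9,8,7"])
levels = parse_batch_levels(opts.batch_levels)
sync_args = build_sync_args(levels)
# sync_args would then be passed into the single aws_util.s3_sync call.
```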
On Windows, image file names that are symlinks give 'Missing file' errors. It would be good to test whether this is also the case on *nix. While there is a good chance that part of the fault is Boost filesystem's known difficulties with Windows' symlink format, my reading of the project code suggests that symlinks might not even work on Linux. For example, these lines in imageUtil.cpp::verifyImagePaths() would likely reject a symlink:

It is actually possible to read from Windows symlinks with Boost, but this requires some platform-specific code. If the Linux builds already handle symlinks correctly, I would suggest giving this issue a low priority; if not, a high one.

Steps To Reproduce
1. Copy a calibration setup where MatchCorners runs correctly.
2. Replace the image files with symlinks to files stored elsewhere.
3. Run MatchCorners.
4. Expect a "Missing file" failure.
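A symlink-aware version of the path check could be sketched as below. This is a hedged illustration, not the project's C++ code: `verify_image_paths` is a hypothetical Python stand-in for imageUtil.cpp::verifyImagePaths(), showing that resolving a link to its target before testing for a regular file avoids rejecting symlinked images.

```python
import os

def verify_image_paths(paths):
    """Return the subset of paths that are missing, following symlinks.

    os.path.realpath() resolves symlinks (where the platform supports them),
    so a symlinked image whose target exists is accepted rather than
    triggering a 'Missing file' error.
    """
    missing = []
    for p in paths:
        target = os.path.realpath(p)  # resolve any symlink to its target
        if not os.path.isfile(target):
            missing.append(p)
    return missing
```

Note that a stricter check on the link itself (e.g. requiring a regular, non-symlink file) would reproduce the failure described in this issue, which is why resolving first matters here.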