TikTok and HKU have released Depth Anything V2, a new monocular depth estimation model. It is trained on 595,000 synthetic labeled images and over 62 million real unlabeled images. The pipeline involves fine-tuning a DINOv2-G-based model for depth estimation on synthetic data, using it as a teacher to generate pseudo-labels on the real images, and then training a student model on those pseudo-labels. Depth Anything V2 runs real-time depth estimation directly in the browser with Transformers.js and WebGPU acceleration; the smallest model is approximately 50MB, making it suitable for on-device usage. The project includes code, checkpoints, and a demo.
Depth Anything V2 just released, enabling real-time depth estimation directly in your browser with 🤗 Transformers.js and WebGPU acceleration! ⚡️ The smallest model is only ~50MB (@ fp16), making it perfect for on-device usage! Check out the demo (+ source code) 👇 https://t.co/WKVqrxztRy
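The ~50MB-at-fp16 figure quoted above is easy to sanity-check: at 2 bytes per parameter it implies roughly 25M parameters, which lines up with the smallest (ViT-S) Depth Anything V2 variant. A quick back-of-the-envelope check (the 24.8M parameter count is an assumption from the model family, not stated in the tweet):

```python
# Sanity check: file size at fp16 vs. parameter count.
# Assumption (not in the tweet): the Small variant has ~24.8M parameters.
params = 24.8e6
bytes_per_param_fp16 = 2  # fp16 stores each weight in 2 bytes

size_mb = params * bytes_per_param_fp16 / 1e6
print(f"~{size_mb:.1f} MB")  # close to the quoted ~50MB
```

The same arithmetic explains why fp16 export halves the download size relative to fp32 checkpoints.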
Depth Anything 2 🔥 A monocular depth estimation model from HKU and TikTok. Model: https://t.co/USg2ffjERa Demo: https://t.co/wzUxYZo8M7 Paper: https://t.co/U531gTiyjv ✨ Enhancing depth prediction with synthetic images, larger teacher models, and pseudo-labeled real images…
DepthAnythingV2 is up w/ code, ckpts and excellent performance. tl;dr pipeline: finetune DinoV2-G for depth estimation on synthetic data (595k images) -> use it as teacher to generate pseudo-labels on 62M real images -> train student model on pseudo-labels https://t.co/KQRyIdWRTe https://t.co/UuLAjt5uK8
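The three-step recipe in that tl;dr can be sketched with a toy stand-in: a 1-D least-squares fit plays the role of both the DINOv2-G teacher and the student network, and the lists of numbers stand in for the synthetic and real image pools. Everything here is illustrative, not the paper's actual training code:

```python
# Toy sketch of the teacher -> pseudo-label -> student recipe.
# A 1-D linear fit is a stand-in for training a depth network.

def fit_linear(xs, ys):
    """Least-squares fit of y = a*x + b (stand-in for 'train a model')."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

# Step 1: "finetune" the teacher on synthetic data with exact labels.
synthetic_x = [0.0, 1.0, 2.0, 3.0]
synthetic_y = [2 * x + 1 for x in synthetic_x]  # ground-truth "depth"
teacher_a, teacher_b = fit_linear(synthetic_x, synthetic_y)

# Step 2: the teacher pseudo-labels a larger pool of unlabeled inputs.
unlabeled_x = [0.5, 1.5, 2.5, 4.0, 5.0]
pseudo_y = [teacher_a * x + teacher_b for x in unlabeled_x]

# Step 3: train the student only on the pseudo-labeled "real" data.
student = fit_linear(unlabeled_x, pseudo_y)
print(student)  # the student recovers the teacher's mapping
```

The point of the real recipe is the same shape at scale: precise synthetic labels give the teacher its accuracy, while the 62M pseudo-labeled real images give the student robustness to real-world imagery.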
TikTok presents Depth Anything V2. Trained on 595K synthetic labeled images and 62M+ real unlabeled images, providing the most capable monocular depth estimation model. proj: https://t.co/KaOQauiOST abs: https://t.co/9HxIpsPWJJ https://t.co/aj9S1SKjzN
Depth Anything V2 — This work presents Depth Anything V2. Without pursuing fancy techniques, we aim to reveal crucial findings to pave the way towards building a powerful monocular depth estimation model. Notably, compared with V1, this version produces much finer and more… https://t.co/s7TNDJIDvQ