---
license: cc-by-nc-4.0
pipeline_tag: video-to-3d
---

# Trace Anything: Representing Any Video in 4D via Trajectory Fields

This repository contains the official implementation of the paper [Trace Anything: Representing Any Video in 4D via Trajectory Fields](https://huggingface.co/papers/2510.13802).

Trace Anything proposes a novel approach that represents any video as a Trajectory Field: a dense mapping that assigns each pixel in every frame a continuous 3D trajectory as a function of time. The model predicts the entire trajectory field in a single feed-forward pass, enabling applications such as goal-conditioned manipulation, motion forecasting, and spatio-temporal fusion.
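
To make the idea concrete, the sketch below evaluates one pixel's 3D position at an arbitrary time from a small set of trajectory control points. The Bézier parametrization, tensor names, and shapes here are illustrative assumptions chosen for simplicity, not the model's exact interface; see the paper and repository for the actual trajectory representation.

```python
import torch
from math import comb

def eval_bezier(control_points: torch.Tensor, t: float) -> torch.Tensor:
    """Evaluate a Bezier curve defined by `control_points` at time t in [0, 1].

    control_points: (K, 3) tensor of 3D control points for a single pixel
    returns: (3,) tensor, the pixel's 3D position at time t
    """
    K = control_points.shape[0]
    # Bernstein basis weights for a degree-(K-1) curve
    weights = torch.tensor(
        [comb(K - 1, k) * (1 - t) ** (K - 1 - k) * t ** k for k in range(K)],
        dtype=control_points.dtype,
    )
    return (weights[:, None] * control_points).sum(dim=0)

# Hypothetical pixel trajectory described by 4 control points (x, y, z).
ctrl = torch.tensor([[0.0, 0.0, 1.0],
                     [0.1, 0.0, 1.1],
                     [0.2, 0.1, 1.2],
                     [0.3, 0.1, 1.3]])
print(eval_bezier(ctrl, t=0.5))  # 3D position halfway through the clip
```

Because the trajectory is a continuous function of time, positions can be queried at any timestamp, not only at the video's frame times.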

- Project Page: [https://trace-anything.github.io/](https://trace-anything.github.io/)
- Code: [https://github.com/ByteDance-Seed/TraceAnything](https://github.com/ByteDance-Seed/TraceAnything)

## Overview
<div align="center">
  <img src="https://trace-anything.github.io/static/images/teaser.png" width="100%"/>
</div>

## Installation
For detailed installation instructions, please refer to the [GitHub repository](https://github.com/ByteDance-Seed/TraceAnything#setup).

## Sample Usage
To run inference with the Trace Anything model, first download the pretrained weights (see the GitHub repository for details), then run the provided script:

```bash
# Download the model weights to checkpoints/trace_anything.pt
# Place your input video/image sequence in examples/input/<scene_name>/

python scripts/infer.py \
  --input_dir examples/input \
  --output_dir examples/output \
  --ckpt checkpoints/trace_anything.pt
```
Results, including 3D control points and confidence maps, will be saved to `<output_dir>/<scene>/output.pt`.
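
As a minimal sketch of how to inspect a saved result, the snippet below loads `output.pt` and prints the shape of each stored tensor. The key names and tensor layouts are not specified in this card, so treat them as unknowns to be discovered this way; replace `<scene>` with your scene name.

```python
import torch

# Load the per-scene prediction written by scripts/infer.py.
# The exact output schema (key names, shapes) is defined by the repository;
# this loop simply enumerates whatever the file contains.
out = torch.load("examples/output/<scene>/output.pt", map_location="cpu")

for name, value in out.items():
    if torch.is_tensor(value):
        print(f"{name}: shape={tuple(value.shape)}, dtype={value.dtype}")
    else:
        print(f"{name}: {type(value).__name__}")
```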

## Interactive Visualization
An interactive 3D viewer is available to explore the generated trajectory fields. Run it using:
```bash
python scripts/view.py --output examples/output/<scene>/output.pt
```
For more options and remote usage, check the [GitHub repository](https://github.com/ByteDance-Seed/TraceAnything#interactive-visualization-%EF%B8%8F).

## Citation
If you find this work useful, please consider citing the paper:
```bibtex
@misc{liu2025traceanythingrepresentingvideo,
      title={Trace Anything: Representing Any Video in 4D via Trajectory Fields}, 
      author={Xinhang Liu and Yuxi Xiao and Donny Y. Chen and Jiashi Feng and Yu-Wing Tai and Chi-Keung Tang and Bingyi Kang},
      year={2025},
      eprint={2510.13802},
      archivePrefix={arXiv},
      primaryClass={cs.CV},
      url={https://arxiv.org/abs/2510.13802}, 
}
```