diff --git a/README.md b/README.md
index 9aedf6f..75387f2 100644
--- a/README.md
+++ b/README.md
@@ -1,12 +1,41 @@
 # 🌊 SAMURAI: Adapting Segment Anything Model for Zero-Shot Visual Tracking with Motion-Aware Memory
-[[Arxiv]]()[[Raw Results]]()
+[[Arxiv]]() [[Project Page]]() [[Raw Results]]()
 
 This repository is the official implementation of SAMURAI: Adapting Segment Anything Model for Zero-Shot Visual Tracking with Motion-Aware Memory
 
-## Code & Installation
-## Main Results
+## Getting Started
+
+#### SAMURAI Installation
+
+SAM 2 needs to be installed first. The code requires `python>=3.10`, as well as `torch>=2.3.1` and `torchvision>=0.18.1`. Please follow the instructions [here](https://github.com/facebookresearch/sam2?tab=readme-ov-file) to install both PyTorch and TorchVision dependencies. You can install **the SAMURAI version** of SAM 2 on a GPU machine using:
+```
+cd sam2
+pip install -e .
+pip install -e ".[notebooks]"
+```
+
+Please see [INSTALL.md](https://github.com/facebookresearch/sam2/blob/main/INSTALL.md) from the original SAM 2 repository for FAQs on potential issues and solutions.
+
+Then install the remaining SAMURAI dependencies:
+```
+pip install -r requirements.txt
+```
+
+#### SAM 2.1 Checkpoint Download
+
+```
+cd checkpoints && \
+./download_ckpts.sh && \
+cd ..
+```
+
+#### Dataset Preparation
+
+#### Run SAMURAI
+
+
 ## Acknowledgment
diff --git a/assets/samurai_demo.mp4 b/assets/samurai_demo.mp4
new file mode 100644
index 0000000..89fb9c2
Binary files /dev/null and b/assets/samurai_demo.mp4 differ
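The installation section above pins `python>=3.10`, `torch>=2.3.1`, and `torchvision>=0.18.1`. A minimal sketch of the version-floor check those constraints imply; `meets_minimum` is a hypothetical helper for illustration, not part of SAMURAI or SAM 2:

```python
def meets_minimum(installed: str, required: str) -> bool:
    """Compare dotted version strings numerically (illustrative helper,
    not part of SAMURAI). Plain string comparison would wrongly rank
    '0.9' above '0.18', so each component is compared as an integer."""
    to_tuple = lambda v: tuple(int(part) for part in v.split("."))
    return to_tuple(installed) >= to_tuple(required)

# torchvision 0.17.0 falls below the required 0.18.1 floor.
print(meets_minimum("2.3.1", "2.3.1"))    # True
print(meets_minimum("0.17.0", "0.18.1"))  # False
```

In practice, real tools delegate this to `packaging.version.Version`, which also handles pre-release and post-release suffixes that a plain tuple comparison does not.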