Commit Graph

7 Commits

Haitham Khedr
aa9b8722d0 SAM2.1
SAM2.1 checkpoints + training code + Demo
2024-09-29 05:49:56 +00:00
Ronghang Hu
7e1596c0b6 open README.md with UTF-8 encoding (to support the Hugging Face emoji); fix various typos (#218)
(close #217, #66, #67, #69, #91, #126, #127, #145)
2024-08-14 09:06:25 -07:00
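The README fix above is essentially an encoding issue: Python's built-in `open` uses the platform default encoding, which is not UTF-8 everywhere. A minimal sketch of the pattern (assuming the README is read in `setup.py` for the package's long description; the variable name is illustrative):

```python
# Read README.md with an explicit UTF-8 encoding so non-ASCII characters
# such as the 🤗 emoji don't raise UnicodeDecodeError on platforms whose
# default encoding is not UTF-8 (e.g. cp1252 on Windows).
with open("README.md", encoding="utf-8") as f:
    long_description = f.read()  # illustrative name, not necessarily the repo's
```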
Ronghang Hu
1034ee2a1a better support for non-CUDA devices (CPU, MPS) (#192)
2024-08-12 10:46:50 -07:00
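For non-CUDA device support, a common selection pattern looks like the sketch below. This is a generic PyTorch idiom, not necessarily the exact code from #192: prefer CUDA, then Apple's MPS backend, then CPU.

```python
import torch

# Pick the best available device: CUDA first, then Apple-silicon MPS, else CPU.
if torch.cuda.is_available():
    device = torch.device("cuda")
elif torch.backends.mps.is_available():
    device = torch.device("mps")
else:
    device = torch.device("cpu")

x = torch.zeros(1, device=device)  # tensors/models are then moved to `device`
print(device)
```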
Arun
46945a2122 Update hieradet.py
Fixed ufmt formatting.
2024-08-09 11:14:11 +05:30
Arun
6ec8560436 Update hieradet.py
Removed unused lines:

    head_dim = dim_out // num_heads
    self.scale = head_dim**-0.5

F.scaled_dot_product_attention takes care of this scaling automatically.
2024-08-07 11:35:46 +05:30
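To see why the deleted lines were redundant: `F.scaled_dot_product_attention` already scales the attention logits by `1 / sqrt(head_dim)` internally. A small self-contained check (shapes and names are illustrative):

```python
import torch
import torch.nn.functional as F

# Shapes: (batch, num_heads, seq_len, head_dim)
q, k, v = (torch.randn(2, 8, 16, 64) for _ in range(3))

# What the removed code computed by hand:
scale = q.size(-1) ** -0.5  # head_dim**-0.5
manual = ((q @ k.transpose(-2, -1)) * scale).softmax(dim=-1) @ v

# The fused op applies the same 1/sqrt(head_dim) scaling by default.
fused = F.scaled_dot_product_attention(q, k, v)

print(torch.allclose(manual, fused, atol=1e-5))  # True
```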
Ronghang Hu
6f7e700c37 Make it optional to build the CUDA extension for SAM 2; also fall back to all available kernels if Flash Attention fails (#155)
In this PR, we make building the SAM 2 CUDA extension optional, since many users encounter difficulties with the CUDA compilation step.
1. During installation, we catch build errors and print a warning message. We also allow explicitly turning off CUDA extension building with `SAM2_BUILD_CUDA=0`.
2. At runtime, we catch CUDA kernel errors from connected components and print a warning that the post-processing step is being skipped.

We also fall back to all available kernels if the Flash Attention kernel fails.
2024-08-06 10:52:01 -07:00
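The runtime kernel fallback can be sketched as follows. This is a simplified version of the idea, assuming PyTorch's `torch.backends.cuda.sdp_kernel` context manager; the helper name is hypothetical, not the repo's actual code:

```python
import torch
import torch.nn.functional as F

def attention_with_fallback(q, k, v):
    """Try the Flash Attention kernel first; if it fails for these inputs,
    retry with all available kernels (math / memory-efficient) enabled."""
    try:
        with torch.backends.cuda.sdp_kernel(
            enable_flash=True, enable_math=False, enable_mem_efficient=False
        ):
            return F.scaled_dot_product_attention(q, k, v)
    except RuntimeError as e:
        print(f"Flash Attention failed ({e}); falling back to all kernels.")
        with torch.backends.cuda.sdp_kernel(
            enable_flash=True, enable_math=True, enable_mem_efficient=True
        ):
            return F.scaled_dot_product_attention(q, k, v)
```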
Haitham Khedr
0c5f8c5432 Initial commit
2024-07-29 21:54:20 +00:00