We only have V100s, which don’t support bf16. Could you release f16/f32 versions?
You can set `torch_dtype` to `torch.float16` when loading the model, so the bf16 weights are cast to fp16 at load time — see the sketch below.
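A minimal sketch of that, assuming a standard `transformers` causal-LM checkpoint; the repo id `your-org/your-model` is a placeholder, not the actual model name:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-org/your-model"  # placeholder repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)

# torch_dtype=torch.float16 casts the stored (bf16) weights to fp16 on load,
# which V100s support natively.
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
).to("cuda")
```

Note that bf16 and fp16 have different dynamic ranges, so a straight cast can occasionally overflow to inf in fp16; if you see NaNs during inference, fp32 (`torch_dtype=torch.float32`) is the safe fallback on V100.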