---
title: FastSAM & CLIP Inference with Gradio
emoji: 🏃
colorFrom: red
colorTo: blue
sdk: gradio
sdk_version: 3.47.1
app_file: app.py
pinned: false
license: mit
---
**GitHub Link:** https://github.com/RaviNaik/ERA-SESSION19/blob/main/README.md
### Tasks
1. :heavy_check_mark: Build a CLIP or FastSAM application on Gradio/Spaces using open-source models.
2. :heavy_check_mark: Share the GitHub and Spaces links on the assignment page.
### FastSAM (A Variant of the Segment Anything Model, SAM)
**Architecture:**
![image](https://github.com/RaviNaik/ERA-SESSION19/assets/23289802/044d2f07-f8de-478f-b189-219dc0b0b52a)
#### Gradio Results
![image](https://github.com/RaviNaik/ERA-SESSION19/blob/main/sam_gradio.png)
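FastSAM produces per-object binary masks, and a Gradio demo like the one above typically alpha-blends each mask over the input image before display. A minimal NumPy sketch of that blending step is below; the mask here is synthetic and stands in for a real FastSAM prediction, and `overlay_mask` is an illustrative helper, not part of the FastSAM API.

```python
import numpy as np

def overlay_mask(image, mask, color=(255, 0, 0), alpha=0.5):
    """Alpha-blend a boolean mask onto an RGB uint8 image."""
    out = image.astype(np.float32).copy()
    color = np.array(color, dtype=np.float32)
    # Only the masked pixels are blended toward the overlay color.
    out[mask] = (1 - alpha) * out[mask] + alpha * color
    return out.astype(np.uint8)

# Synthetic stand-ins: a black image and a square "segmented" region.
image = np.zeros((64, 64, 3), dtype=np.uint8)
mask = np.zeros((64, 64), dtype=bool)
mask[16:48, 16:48] = True

result = overlay_mask(image, mask)
print(result[32, 32])  # blended pixel inside the mask
print(result[0, 0])    # untouched pixel outside the mask
```

In the real app the masks would come from a FastSAM forward pass on the uploaded image, with one color per detected object.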
### CLIP (Contrastive Language–Image Pre-training)
**Architecture:**
![image](https://github.com/RaviNaik/ERA-SESSION19/assets/23289802/03758a42-8464-4663-849f-b90c8cf0c03f)
#### Gradio Results
![image](https://github.com/RaviNaik/ERA-SESSION19/blob/main/clip_gradio.png)
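As the architecture diagram shows, CLIP scores an image against a set of text prompts by embedding both into a shared space, comparing cosine similarities, scaling them by a learned temperature, and applying a softmax. The sketch below reproduces just that scoring step with random vectors standing in for real CLIP embeddings (loading the actual model is omitted); the logit scale of 100 matches the value commonly reported for the released CLIP checkpoints.

```python
import numpy as np

rng = np.random.default_rng(0)

def l2_normalize(x, axis=-1):
    """Project embeddings onto the unit sphere so dot products are cosines."""
    return x / np.linalg.norm(x, axis=axis, keepdims=True)

# Stand-ins for CLIP's image and text encoder outputs (512-d embeddings).
image_emb = l2_normalize(rng.normal(size=(1, 512)))   # one image
text_embs = l2_normalize(rng.normal(size=(3, 512)))   # three candidate prompts

logit_scale = 100.0  # learned temperature, exp'd; ~100 in released CLIP models
logits = logit_scale * image_emb @ text_embs.T        # scaled cosine similarities

# Numerically stable softmax over the prompts gives matching probabilities.
shifted = np.exp(logits - logits.max())
probs = shifted / shifted.sum()
print(probs)  # probability of each prompt matching the image
```

In the Gradio app, these probabilities are what drive the label ranking shown for an uploaded image.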