# Model card for phikon-distil-mobilenet_v2-kather2016
This model is a MobileNet-v2 student distilled from owkin/phikon on the 1aurent/Kather-texture-2016 dataset.
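The card does not spell out the distillation objective used to train the MobileNet-v2 student. As background, a minimal sketch of the standard soft-target distillation loss (temperature-scaled KL divergence between teacher and student class distributions) might look like the following; the function names and logit values are illustrative, not taken from the actual training code:

```python
import math

def softmax(logits, temperature=1.0):
    """Softmax over a list of logits, softened by a temperature."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=4.0):
    """KL divergence between softened teacher and student distributions.

    The usual T^2 factor keeps gradient magnitudes comparable
    across temperatures.
    """
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return kl * temperature ** 2
```

When the student matches the teacher exactly, the loss is zero; it grows as the two predicted distributions diverge.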
## Model Usage
### Image Classification
```python
from transformers import AutoModelForImageClassification, AutoImageProcessor
from urllib.request import urlopen
from PIL import Image

# get an example histology image
img = Image.open(
    urlopen(
        "/static-proxy?url=https%3A%2F%2Fdatasets-server.huggingface.co%2Fassets%2F1aurent%2FKather-texture-2016%2F--%2Fdefault%2Ftrain%2F0%2Fimage%2Fimage.jpg"
    )
)

# load the image processor and model from the hub
model_name = "1aurent/phikon-distil-mobilenet_v2-kather2016"
image_processor = AutoImageProcessor.from_pretrained(model_name)
model = AutoModelForImageClassification.from_pretrained(model_name)

# preprocess the image and run the classifier
inputs = image_processor(img, return_tensors="pt")
outputs = model(**inputs)

# map the highest-scoring logit to its class name
predicted_id = outputs.logits.argmax(-1).item()
print(model.config.id2label[predicted_id])
```
## Citation
```bibtex
@article{Filiot2023.07.21.23292757,
  author       = {Alexandre Filiot and Ridouane Ghermi and Antoine Olivier and Paul Jacob and Lucas Fidon and Alice Mac Kain and Charlie Saillard and Jean-Baptiste Schiratti},
  title        = {Scaling Self-Supervised Learning for Histopathology with Masked Image Modeling},
  elocation-id = {2023.07.21.23292757},
  year         = {2023},
  doi          = {10.1101/2023.07.21.23292757},
  publisher    = {Cold Spring Harbor Laboratory Press},
  url          = {https://www.medrxiv.org/content/early/2023/09/14/2023.07.21.23292757},
  eprint       = {https://www.medrxiv.org/content/early/2023/09/14/2023.07.21.23292757.full.pdf},
  journal      = {medRxiv}
}
```
## Evaluation results

- accuracy on 1aurent/Kather-texture-2016 (self-reported): 0.928