Sebastian M. Schramm (sebastianschramm)

View GitHub Profile
@sebastianschramm
sebastianschramm / classifier_litserver.py
Created August 28, 2024 08:45
LitServe API for toxicity classifier
import torch
from litserve import LitAPI, LitServer
from pydantic import BaseModel, conint
from pydantic_settings import BaseSettings
from transformers import AutoModelForSequenceClassification, AutoTokenizer


class ToxicitySettings(BaseSettings):
    model_id: str = "s-nlp/roberta_toxicity_classifier"
    port: conint(ge=1024, le=65535) = 8000  # type: ignore