@icedac
Created February 14, 2025 15:17
One-liner to run DeepSeek-llama3.3-Bllossom-70B
#!/bin/bash
set -e
echo "Creating Python virtual environment..."
python3 -m venv venv
source venv/bin/activate
echo "Upgrading pip and installing dependencies..."
pip install --upgrade pip
pip install torch transformers huggingface_hub
echo "Checking dependencies..."
python -c "import torch, transformers, huggingface_hub" || { echo "Dependency check failed."; exit 1; }
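# Added sketch: a 70B fp16 checkpoint is on the order of 140 GB on disk; the
# 150 GB threshold below is an assumption -- adjust for quantized variants.
REQUIRED_KB=$((150 * 1024 * 1024))
FREE_KB=$(df -Pk . | awk 'NR==2 {print $4}')
if [ "${FREE_KB:-0}" -lt "$REQUIRED_KB" ]; then
  echo "Warning: less than ~150 GB free; the model download may not fit."
fi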
echo "Downloading model from HuggingFace..."
python - <<'EOF'
from transformers import AutoModel

# NOTE: from_pretrained both downloads the checkpoint and materializes the
# weights in memory; for a 70B model this requires very large RAM.
print("Downloading model...")
model = AutoModel.from_pretrained("UNIVA-Bllossom/DeepSeek-llama3.3-Bllossom-70B")
print("Model downloaded.")
EOF
echo "Note: the HuggingFace cache path varies by environment; check it yourself."
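# The effective cache location can be printed instead of guessed: HF_HUB_CACHE
# and HF_HOME (default ~/.cache/huggingface) are the standard overrides.
echo "HF hub cache: ${HF_HUB_CACHE:-${HF_HOME:-$HOME/.cache/huggingface}/hub}"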
echo "Converting model to Ollama format..."
rm -rf DeepSeek-llama3.3-Bllossom-70B.ollama
mkdir -p DeepSeek-llama3.3-Bllossom-70B.ollama
cat << 'EOF' > DeepSeek-llama3.3-Bllossom-70B.ollama/model.json
{"name": "DeepSeek-llama3.3-Bllossom-70B", "version": "1.0", "runtime": "python", "entry_point": "server.py"}
EOF
cat << 'EOF' > DeepSeek-llama3.3-Bllossom-70B.ollama/server.py
from transformers import AutoModel

# NOTE: the model is loaded here but handle_request is only a placeholder;
# it does not run inference.
model = AutoModel.from_pretrained("UNIVA-Bllossom/DeepSeek-llama3.3-Bllossom-70B")

def handle_request(inp):
    return "Dummy response for: " + inp

if __name__ == '__main__':
    import sys
    inp = sys.argv[1] if len(sys.argv) > 1 else ""
    print(handle_request(inp))
EOF
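# Added check: byte-compile the generated server to catch syntax errors early.
# py_compile does not execute the file, so the 70B weights are not loaded.
python -m py_compile DeepSeek-llama3.3-Bllossom-70B.ollama/server.py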
# Replace the directory with the archive so a single .ollama artifact remains
# (moving the zip into the still-existing directory would nest it inside instead).
zip -r DeepSeek-llama3.3-Bllossom-70B.ollama.zip DeepSeek-llama3.3-Bllossom-70B.ollama
rm -rf DeepSeek-llama3.3-Bllossom-70B.ollama
mv DeepSeek-llama3.3-Bllossom-70B.ollama.zip DeepSeek-llama3.3-Bllossom-70B.ollama
echo "Verifying Ollama CLI installation..."
if ! command -v ollama >/dev/null 2>&1; then
echo "Ollama CLI not found. Please install it from https://ollama.ai/docs"
deactivate
exit 1
fi
echo "Loading model into Ollama..."
# NOTE: 'ollama load' is not a documented subcommand in current Ollama releases;
# the supported route is 'ollama create <name> -f Modelfile' over GGUF weights,
# so expect this line to fail on a stock install.
ollama load DeepSeek-llama3.3-Bllossom-70B.ollama
echo "Running model via Ollama..."
ollama run DeepSeek-llama3.3-Bllossom-70B
echo "Deactivating Python virtual environment..."
deactivate
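# -----------------------------------------------------------------------------
# Reference (sketch): the zip/model.json layout above is ad-hoc. Current Ollama
# releases import local models from a Modelfile pointing at GGUF weights.
# Converting the HuggingFace checkpoint (e.g. with llama.cpp's
# convert_hf_to_gguf.py) is assumed and not performed by this script; the .gguf
# filename below is hypothetical.
#
#   cat > Modelfile <<'MF'
#   FROM ./DeepSeek-llama3.3-Bllossom-70B.gguf
#   MF
#   ollama create DeepSeek-llama3.3-Bllossom-70B -f Modelfile
#   ollama run DeepSeek-llama3.3-Bllossom-70B
# -----------------------------------------------------------------------------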