@supersonictw
Last active April 5, 2025 18:27
Ollama Model Export Script
#!/bin/bash
# Ollama Model Export Script
# Usage: bash ollama-export.sh vicuna:7b
# SPDX-License-Identifier: MIT (https://ncurl.xyz/s/o_o6DVqIR)
# https://gist.github.com/supersonictw/f6cf5e599377132fe5e180b3d495c553
# Interrupt if any error occurs
set -e
# Declare
echo "Ollama Model Export Script"
echo "License: MIT (https://ncurl.xyz/s/o_o6DVqIR)"
echo ""
# OS-specific
case "$OSTYPE" in
linux*)
HOME="$(echo ~ollama)"
;;
esac
# Define variables
OLLAMA_HOME="${OLLAMA_HOME:="$HOME/.ollama"}"
BLOBS_FILE_BASE_PATH="$OLLAMA_HOME/models/blobs"
MANIFESTS_FILE_BASE_PATH="$OLLAMA_HOME/models/manifests"
# Define constants
SUCCESS_PREFIX="\033[1;32mSuccess:\033[0m"
FAILED_PREFIX="\033[0;31mFailed:\033[0m"
# Read arguments
IFS='/' read -ra NAME_ARGS <<< "${1/://}"
case "${#NAME_ARGS[@]}" in
4)
MANIFESTS_REGISTRY_NAME="${NAME_ARGS[0]}"
MANIFESTS_LIBRARY_NAME="${NAME_ARGS[1]}"
MANIFESTS_MODEL_NAME="${NAME_ARGS[2]}"
MANIFESTS_PARAMS_NAME="${NAME_ARGS[3]}"
;;
3)
MANIFESTS_LIBRARY_NAME="${NAME_ARGS[0]}"
MANIFESTS_MODEL_NAME="${NAME_ARGS[1]}"
MANIFESTS_PARAMS_NAME="${NAME_ARGS[2]}"
;;
2)
MANIFESTS_MODEL_NAME="${NAME_ARGS[0]}"
MANIFESTS_PARAMS_NAME="${NAME_ARGS[1]}"
;;
1)
MANIFESTS_MODEL_NAME="${NAME_ARGS[0]}"
;;
esac
# Define variables
MANIFESTS_REGISTRY_NAME="${MANIFESTS_REGISTRY_NAME:="registry.ollama.ai"}"
MANIFESTS_LIBRARY_NAME="${MANIFESTS_LIBRARY_NAME:="library"}"
MANIFESTS_MODEL_NAME="${MANIFESTS_MODEL_NAME:="vicuna"}"
MANIFESTS_PARAMS_NAME="${MANIFESTS_PARAMS_NAME:="latest"}"
# Echo the model full name
MODEL_FULL_NAME="$MANIFESTS_MODEL_NAME:$MANIFESTS_PARAMS_NAME"
echo "Exporting model \"$MODEL_FULL_NAME\"..."
echo ""
# Make sure manifests exist
MANIFESTS_FILE_PATH="$MANIFESTS_FILE_BASE_PATH/$MANIFESTS_REGISTRY_NAME/$MANIFESTS_LIBRARY_NAME/$MANIFESTS_MODEL_NAME/$MANIFESTS_PARAMS_NAME"
if [ ! -f "$MANIFESTS_FILE_PATH" ]; then
echo -e "$FAILED_PREFIX \"$MANIFESTS_FILE_PATH\" not exists, the model \"$MODEL_FULL_NAME\" you requested is not found."
exit 1
fi
# Make sure dist not exist
EXPORT_DST_BASE_PATH="${EXPORT_DST_BASE_PATH:="$PWD/${MODEL_FULL_NAME/:/-}"}"
if [ -d "$EXPORT_DST_BASE_PATH" ]; then
echo -e "$FAILED_PREFIX \"$EXPORT_DST_BASE_PATH\" already exists, exits for preventing from unexpected operations."
exit 1
fi
# Create dist directory
mkdir -p "$EXPORT_DST_BASE_PATH"
printf "%s" "$MANIFESTS_REGISTRY_NAME/$MANIFESTS_LIBRARY_NAME/$MANIFESTS_MODEL_NAME:$MANIFESTS_PARAMS_NAME" >"$EXPORT_DST_BASE_PATH/source.txt"
# Read manifests and handle them
while IFS= read -r layer; do
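# Blobs are stored on disk as "sha256-<hex>", so replace the ":" in the digest with "-"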
BLOB_FILE_NAME="${layer/:/-}"
BLOB_FILE_PATH="$BLOBS_FILE_BASE_PATH/$BLOB_FILE_NAME"
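# Extract the layer's mediaType and strip the "application/vnd.ollama.image." prefix (model, params, template, ...)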
BLOB_TYPE_NAME=$(jq -r --arg layer "$layer" '.layers[] | select(.digest == $layer) | .mediaType' "$MANIFESTS_FILE_PATH" | sed 's|.*\.ollama\.image\.\(.*\)|\1|')
EXPORT_MODEL_FILE_PATH="$EXPORT_DST_BASE_PATH/Modelfile"
EXPORT_MODEL_BIN_PATH="$EXPORT_DST_BASE_PATH/model.bin"
case "$BLOB_TYPE_NAME" in
model)
cp "$BLOB_FILE_PATH" "$EXPORT_MODEL_BIN_PATH"
printf "%s\n" "FROM ./model.bin" >>"$EXPORT_MODEL_FILE_PATH"
;;
params)
PARAMS_JSON="$(cat "$BLOB_FILE_PATH")"
printf "%s" "$(jq -r 'keys[] as $key | .[$key][] | "PARAMETER \($key) \"\(.)\"" ' <<<"$PARAMS_JSON")" >>"$EXPORT_MODEL_FILE_PATH"
;;
*)
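# Other layer types (template, system, license, ...) become uppercase Modelfile directives with triple-quoted content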
TYPE_NAME="$(echo "$BLOB_TYPE_NAME" | tr '[:lower:]' '[:upper:]')"
FILE_CONTENT="$(cat "$BLOB_FILE_PATH")"
printf "%s\n" "$TYPE_NAME \"\"\"$FILE_CONTENT\"\"\"" >>"$EXPORT_MODEL_FILE_PATH"
;;
esac
done < <(jq -r '.layers[].digest' "${MANIFESTS_FILE_PATH}")
# Echo success message
echo -e "$SUCCESS_PREFIX Model \"$MODEL_FULL_NAME\" has been exported to \"$EXPORT_DST_BASE_PATH\"!"
@supersonictw
Author

@sokovnich
The patch has already been applied!
Linux can recognize its OLLAMA_HOME automatically. 🐧
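
For other setups, the data directory can still be overridden explicitly, since the script only falls back to $HOME/.ollama when OLLAMA_HOME is unset. A hypothetical example (the path is just an illustration):

# Hypothetical: export from a non-default Ollama data directory
OLLAMA_HOME="/usr/share/ollama/.ollama" bash ollama-export.sh vicuna:7b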

@sokovnich

sokovnich commented Sep 16, 2024

@supersonictw thank you

@xuegl

xuegl commented Oct 23, 2024

I got a warning while exporting the llava:34b model:
/ollama-export.sh: line 103: warning: command substitution: ignored null byte in input

I inspected the llava model manifest file; the warning may be caused by the mediaType projector:

{
  "mediaType": "application/vnd.ollama.image.projector",
  "digest": "sha256:83720bd8438ccdc910deba5efbdc3340820b29258d94a7a60d1addc9a1b5f095",
  "size": 699956416
}

There is no blob file corresponding to that sha256 digest under the Ollama models/blobs folder.
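
Maybe the script could skip layers whose blob file is missing, assuming the projector layer is not needed for the export? Just a sketch against the loop:

# Sketch: skip layers whose blob file does not exist on disk (e.g. the projector layer)
if [ ! -f "$BLOB_FILE_PATH" ]; then
echo "Skipping layer $layer: no blob at $BLOB_FILE_PATH"
continue
fi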

@vasanthnagkv

Hi, I can use this script to export a model that I have already downloaded with, for example, ollama pull, right?
I can then use this exported model to fine-tune it with LoRA?
My understanding is that I cannot fine-tune with Ollama itself, so I have to export the model, fine-tune it outside, and then import it back into Ollama to run inference?

(I am totally new to AI models)

@supersonictw
Author

Hello! @vasanthnagkv

This exported model I can then use to fine tune using LoRA?

Yes, if the model is LLaMA-based. It's related to ggml-org/llama.cpp#6680

import it back to Ollama to use it with ollama inference

The script's output works with ollama create for importing; it's compatible.
https://github.com/ollama/ollama/blob/main/docs/import.md
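
As a rough example (model name and paths depend on how the export was run), the re-import on the offline machine could look like:

# Hypothetical re-import of a model exported by this script
cd vicuna-7b                      # directory produced by ollama-export.sh
ollama create vicuna:7b -f Modelfile
ollama run vicuna:7b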

@wingertjp

Hi!
Thanks for your work!

I've got this weird trace just before the success message: jq: error (at <stdin>:1): Cannot iterate over number (8)

It happens with a few models like codellama:34b and nomic-embed-text

I can still import these exported models into my offline Ollama instance, though.

Do you have any idea what might be causing it?

@Rays-Robotics

Am I allowed to add this to my package manager, scriptgrab?

@supersonictw
Author

Am I allowed to add this to my package manager, scriptgrab?

Welcome! 🚀

According to SPDX-License-Identifier: MIT (https://ncurl.xyz/s/o_o6DVqIR), it's allowed.

Please leave this line in the description/source code of the script. @Rays-Robotics

@supersonictw
Author

@xuegl @wingertjp

Sorry for replying late.
I'll check these problems as soon as possible. 🫡
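
In the meantime, a guess for the jq error: it usually appears when a value in the params blob is a bare scalar (e.g. a number) rather than an array, so .[$key][] cannot iterate over it. An untested sketch of a more tolerant params branch:

# Sketch: accept both scalar and array parameter values
PARAMS_JSON="$(cat "$BLOB_FILE_PATH")"
printf "%s" "$(jq -r 'keys[] as $key | (.[$key] | if type == "array" then .[] else . end) | "PARAMETER \($key) \"\(.)\"" ' <<<"$PARAMS_JSON")" >>"$EXPORT_MODEL_FILE_PATH"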

@wingertjp

@xuegl @wingertjp

Sorry for replying late. I'll check these problems as soon as possible. 🫡

Great 👍
