On every machine in the cluster, install openmpi and mlx-lm:

```bash
conda install conda-forge::openmpi
pip install -U mlx-lm
```
Next, download the pipeline-parallel run script to the same path on every machine.
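Once the script is in place, the launch step looks something like the sketch below. It assumes the script was saved as `pipeline_generate.py` (check the mlx-lm repository for the script itself and its current flags); the hostnames and prompt are placeholders:

```bash
# hosts.txt, one entry per machine (hostnames are placeholders):
#   m1.local slots=1
#   m2.local slots=1

# Launch one MPI rank per machine; each rank serves a slice of the model's layers.
mpirun -np 2 --hostfile hosts.txt \
    python pipeline_generate.py --prompt "Hello from a pipeline-parallel cluster"
```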
Andrew Miller (he/him), Cambridge, UK:

> Hi there, for those that haven't come across me yet: I'm very active on the Discord, having joined a couple of years ago; I serve as a moderator and generally help out. I have also authored a Working Group proposal that is almost ready to go live, pending Board approval. Finally, I organise the monthly Django Social in Cambridge.
>
> Perhaps what is most relevant to my nomination for the Steering Council, though, are the blog posts I have written this year. They have been short and snappy, prodding at and explaining different aspects of using Django, the contributing process, and other parts of the community.
>
> I am nominating myself for the Steering Council to ensure that Django has a secure future. Personally, I have used Django for the last 12 years and it has been integral to my software engineering career. The last two and a half years have been the best in terms of getting involved in the community and have increased my passion for improv…
For recent elections, the voter-turnout data is readily available:

| Election | Total voters | Votes in first 7 days | Total votes | % in first 7 days | % total |
|---|---|---|---|---|---|
| 6.x Steering Council | 400 | 215 | 215 | 54% | 54% |
| 2025 DSF Board | 374 | 76 | 204 | 20% | 55% |
| 2024 DSF Board | 287 | 93 | 132 | 32% | 46% |
| 5.x Steering Council | 268 | 74 | 74 | 28% | 28% |
| 2023 DSF Board | 244 | 58 | 91 | 24% | 37% |
A management command to list all templates in the project.

Depending on the template engines and loaders configured in the Django settings, this command lists every template in the project and displays each name in the form you would pass to an {% include "" %} template tag or use when rendering a template in a view. By default, it scans all directories in the TEMPLATES setting for template files.
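As a rough sketch of what such a command can look like, assuming the stock filesystem and app-directories loaders and scanning only `*.html` files (the real command's details will differ):

```python
# management/commands/list_templates.py -- minimal sketch, not the actual package.
from pathlib import Path

from django.conf import settings
from django.core.management.base import BaseCommand
from django.template.utils import get_app_template_dirs


class Command(BaseCommand):
    help = "List every template the configured engines can find."

    def handle(self, *args, **options):
        dirs = set()
        for engine in settings.TEMPLATES:
            # Directories listed explicitly in the engine's DIRS ...
            dirs.update(Path(d) for d in engine.get("DIRS", []))
            # ... plus each installed app's templates/ directory when APP_DIRS is on.
            if engine.get("APP_DIRS"):
                dirs.update(get_app_template_dirs("templates"))

        names = set()
        for directory in dirs:
            for path in directory.rglob("*.html"):
                # Report the name relative to its directory -- the form that
                # {% include %} and get_template() expect.
                names.add(str(path.relative_to(directory)))

        for name in sorted(names):
            self.stdout.write(name)
```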
Short (well, I tried) collection of thoughts and impressions after moving to Django-native template-based form rendering with pretalx, my ~medium-sized(?) Django project.
When Carlton threatened to read my code (shock, horror), I decided to just write up my impressions, and a gist/pastebin/etc seemed the right format, cuz this isn’t polished enough for a blog post, and also way not constructive enough.
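For context, template-based form rendering is the Django 4.0+ mechanism where `{{ form }}` renders through a regular template you control. A minimal sketch, with a hypothetical form and template path:

```python
# forms.py -- minimal sketch of Django's template-based form rendering (4.0+).
from django import forms


class TalkForm(forms.Form):
    # Rendering {{ form }} now goes through this template instead of the
    # built-in layouts; the path is hypothetical.
    template_name = "forms/talk_form.html"

    title = forms.CharField(max_length=200)
    abstract = forms.CharField(widget=forms.Textarea)

# The template receives the context from Form.get_context(): `form`, `fields`,
# `hidden_fields`, and `errors`, so it can loop over the fields and lay them
# out however the project needs.
```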
```bash
# A one-liner to leverage the GPU on a Mac to transcribe audio files
# Inspired by https://simonwillison.net/2024/Aug/13/mlx-whisper/
llm_transcribe_recording () {
    local file_path="$1"
    # Pass the path in via argv rather than interpolating it into the Python
    # source, so filenames containing quotes don't break the snippet.
    python3 -c "
import sys

import mlx_whisper

result = mlx_whisper.transcribe(sys.argv[1], path_or_hf_repo='mlx-community/distil-whisper-large-v3')
print(result['text'])
" "$file_path"
}
```
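Usage is then a single function call; the path below is just an example, and the first run downloads the model from Hugging Face:

```bash
llm_transcribe_recording ~/Desktop/interview.m4a
```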
```python
# /// script
# requires-python = ">=3.12"
# dependencies = [
#     "llm",
#     "textual",
# ]
# ///
# PEP 723 inline script metadata: a runner such as `uv run` reads the block
# above and installs llm and textual automatically.
from textual import on, work
from textual.app import App, ComposeResult
from textual.widgets import Header, Input, Footer, Markdown
```
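The imports suggest a small chat UI built on Textual's worker API. As a rough sketch of how the pieces could fit together (the model ID, widget IDs, and overall structure are guesses, not the original script):

```python
import llm
from textual import on, work
from textual.app import App, ComposeResult
from textual.widgets import Header, Input, Footer, Markdown


class ChatApp(App):
    """Tiny chat UI: type a prompt, render the model's reply as Markdown."""

    def compose(self) -> ComposeResult:
        yield Header()
        yield Markdown(id="response")
        yield Input(placeholder="Ask something...")
        yield Footer()

    @on(Input.Submitted)
    def handle_submitted(self, event: Input.Submitted) -> None:
        event.input.value = ""
        self.ask(event.value)

    @work(thread=True)
    def ask(self, prompt: str) -> None:
        # Blocking API call in a worker thread so the UI stays responsive;
        # assumes an llm model (and API key, if needed) is already configured.
        response = llm.get_model("gpt-4o-mini").prompt(prompt)
        markdown = self.query_one("#response", Markdown)
        self.call_from_thread(markdown.update, response.text())


if __name__ == "__main__":
    ChatApp().run()
```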