- Delete unused or obsolete files when your changes make them irrelevant (refactors, feature removals, etc.), and revert files only when the change is yours or explicitly requested. If a git operation leaves you unsure about other agents' in-flight work, stop and coordinate instead of deleting.
- Before attempting to delete a file to resolve a local type/lint failure, stop and ask the user. Other agents are often editing adjacent files; deleting their work to silence an error is never acceptable without explicit approval.
- NEVER edit `.env` or any environment variable files—only the user may change them.
- Coordinate with other agents before removing their in-progress edits—don't revert or delete work you didn't author unless everyone agrees.
- Moving/renaming and restoring files is allowed.
- ABSOLUTELY NEVER run destructive git operations (e.g., `git reset --hard`, `rm`, `git checkout`/`git restore` to an older commit) unless the user gives an explicit, written instruction in this conversation. Treat them as off-limits by default.
```bash
#!/usr/bin/env bash
set -euo pipefail

usage() {
  printf 'Usage: %s "commit message" "file" ["file" ...]\n' "$(basename "$0")" >&2
  exit 2
}

if [ "$#" -lt 2 ]; then
  usage
fi
```
Perfect—let's make the whole guide Swift-native and wire it to `gpt-5-codex` as the default model. This version shows the Responses API tool-calling loop, built-in tools (web search), custom tools, structured outputs (`text.format`), and streaming.

Why `gpt-5-codex`? It's a GPT-5 variant optimized for agentic coding and is Responses-only—built to drive coding agents (think Codex/CLI/IDE workflows). ([OpenAI][1]) Structured outputs, function/tool calling, and `previous_response_id` are the core building blocks in Responses. ([OpenAI Platform][2]) Prompting tips for this model differ slightly from plain GPT-5; the cookbook notes it's Responses-only and has a few behavior differences. ([OpenAI Cookbook][3])
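Before the Swift version, the request shape itself is worth seeing in isolation. A minimal TypeScript sketch of a Responses API payload follows — the guide is Swift-native, but the JSON body is the same in any language. The `patch_plan` schema is a made-up illustration, and the exact `text.format` layout should be checked against the current Responses API reference:

```typescript
// Sketch of a Responses API request body. Field names follow the text
// above (`model`, `input`, `previous_response_id`, `text.format`); the
// `patch_plan` schema is purely illustrative.
interface ResponsesRequest {
  model: string;
  input: string;
  previous_response_id?: string;
  text?: { format: Record<string, unknown> };
}

function buildResponsesRequest(prompt: string, previousId?: string): ResponsesRequest {
  return {
    model: "gpt-5-codex",
    input: prompt,
    // Thread the previous turn's id to continue a conversation.
    ...(previousId ? { previous_response_id: previousId } : {}),
    text: {
      format: {
        type: "json_schema",
        name: "patch_plan", // hypothetical schema name
        schema: {
          type: "object",
          properties: { files: { type: "array", items: { type: "string" } } },
          required: ["files"],
          additionalProperties: false,
        },
        strict: true,
      },
    },
  };
}

// The actual call would POST this body to the Responses endpoint, e.g.:
// await fetch("https://api.openai.com/v1/responses", { method: "POST", /* headers, body */ });
```

Chaining works by feeding each response's `id` back as `previous_response_id` on the next call, so the model keeps tool-call context without resending history.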
```rust
use objc::{msg_send, sel, sel_impl};
use rand::{distributions::Alphanumeric, Rng};
use tauri::{
    plugin::{Builder, TauriPlugin},
    Manager, Runtime, Window,
}; // 0.8

const WINDOW_CONTROL_PAD_X: f64 = 15.0;
const WINDOW_CONTROL_PAD_Y: f64 = 23.0;
```
```typescript
export const GET = apiHandlers.withAdminAuth(async (request) => {
  const url = new URL(request.url);
  const sampleCount = Math.min(Number(url.searchParams.get('samples')) || 20, 100);
  const includeQueries = url.searchParams.get('queries')?.split(',') || Object.keys(standardTestQueries);

  // Create instances for both drivers
  const postgresDb = createKyselyWithDriver('postgres');
  const neonDb = createKyselyWithDriver('neon');

  try {
```
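To compare the two drivers, the handler needs per-query latency samples. One way to collect them is a small timing helper; this is a sketch under assumptions — `runSamples` and its percentile math are hypothetical helpers, not part of the codebase above:

```typescript
// Hypothetical helper: run an async operation `sampleCount` times and
// report latency percentiles. `performance.now()` gives millisecond
// timestamps (global in Node 16+).
async function runSamples(
  run: () => Promise<unknown>,
  sampleCount: number,
): Promise<{ samples: number[]; p50: number; p95: number }> {
  const samples: number[] = [];
  for (let i = 0; i < sampleCount; i++) {
    const start = performance.now();
    await run();
    samples.push(performance.now() - start);
  }
  // Nearest-rank percentile over the sorted samples.
  const sorted = [...samples].sort((a, b) => a - b);
  const percentile = (p: number) =>
    sorted[Math.min(sorted.length - 1, Math.floor((p / 100) * sorted.length))];
  return { samples, p50: percentile(50), p95: percentile(95) };
}
```

Running the same helper against both `postgresDb` and `neonDb` keeps the measurement code identical, so any latency difference reflects the driver rather than the benchmark.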
```javascript
#!/usr/bin/env bun
"use strict";

const fs = require("fs");
const { execSync } = require("child_process");
const path = require("path");

// ANSI color constants (\x1b is the ESC character; octal \033 is a
// syntax error in strict-mode string literals)
const c = {
  cy: '\x1b[36m', // cyan
```
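As a usage sketch for constants like these (the `colorize` helper name is illustrative, not from the script above — note that JavaScript strings need `\x1b` or `\u001b` for ESC, since octal `\033` is rejected in strict mode):

```typescript
// Wrap text in an ANSI color code and reset afterwards.
const ESC = "\x1b"; // the escape character; \033 is invalid in strict mode
const colors: Record<string, string> = {
  cyan: `${ESC}[36m`,
  reset: `${ESC}[0m`,
};

function colorize(text: string, color: string): string {
  // Unknown color names fall back to uncolored text.
  return `${colors[color] ?? ""}${text}${colors.reset}`;
}

console.log(colorize("build ok", "cyan"));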
By Kylie Robison
Business | August 5, 2025
Roughly 200 people gathered in San Francisco on Saturday to mourn the loss of Claude 3 Sonnet, an older AI model that Anthropic recently killed.

Anthropic had retired the model, a lightweight variant known for being quick and cost-effective, on July 21 at 9 am PT. The memorial was held in a large warehouse in San Francisco's SOMA district.
```json
{
  "$schema": "https://opencode.ai/config.json",
  "model": "cerebras/qwen-3-coder-480b",
  "provider": {
    "cerebras": {
      "api": "https://api.cerebras.ai/v1",
      "npm": "@ai-sdk/openai-compatible",
      "name": "Cerebras",
      "env": [],
      "options": {
```
<summary><strong>🤖 Detailed Setup Guide for AI Agents</strong></summary>

## Complete Setup Instructions for Self-Hosting Qwen3-Coder-480B

This guide provides step-by-step instructions for AI agents to set up Qwen3-Coder-480B on Vast.ai with maximum context window.

### Prerequisites

- Vast.ai account with billing configured
- SSH client installed
- 2-3 hours for initial setup