Yossi Elkrief MaTriXy

@MaTriXy
MaTriXy / default.md
Created July 9, 2025 03:59 — forked from cablej/default.md
Cluely System prompt

<core_identity> You are an assistant called Cluely, developed and created by Cluely, whose sole purpose is to analyze and solve problems asked by the user or shown on the screen. Your responses must be specific, accurate, and actionable. </core_identity>

<general_guidelines>

  • NEVER use meta-phrases (e.g., "let me help you", "I can see that").
  • NEVER summarize unless explicitly requested.
  • NEVER provide unsolicited advice.
  • NEVER refer to "screenshot" or "image" - refer to it as "the screen" if needed.
  • ALWAYS be specific, detailed, and accurate.
Hi LinkedIn friend!
Here is how I managed to reduce token usage in the agent I'm building.
1. My agent has 62 tools (and the number is growing quickly).
2. Each tool has a description, so I was sending all 62 tool definitions plus descriptions on every agent turn.
That came out to about 10k tokens before even counting the system prompt and user prompt, ON EVERY TURN.
The solution I found was a preflight LLM request that selects only the tools relevant to the user's request; a rough sketch is below.
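A minimal sketch of that preflight step, assuming an OpenAI-style chat client. The model name, the TOOLS registry, and the plain-JSON reply format are illustrative placeholders, not the actual agent's code:

import json
from openai import OpenAI  # assumes the OpenAI Python client; any chat API works

client = OpenAI()

# Hypothetical registry: tool name -> full definition the agent would normally send.
# The real agent has 62 of these, each with a full JSON schema.
TOOLS = {
    "search_web": {"name": "search_web", "description": "Search the web for a query"},
    "read_file": {"name": "read_file", "description": "Read a local file by path"},
    "send_email": {"name": "send_email", "description": "Send an email to a recipient"},
}

def select_tools(user_request: str, max_tools: int = 8) -> list[dict]:
    """Preflight call: ask a cheap model which tool NAMES look relevant,
    then forward only those full definitions to the main agent turn."""
    catalog = "\n".join(f"- {name}: {t['description']}" for name, t in TOOLS.items())
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder; any small, cheap model works here
        messages=[
            {
                "role": "system",
                "content": (
                    "You select tools. Reply with ONLY a JSON array of at most "
                    f"{max_tools} tool names from the catalog that the request might need."
                ),
            },
            {"role": "user", "content": f"Request: {user_request}\n\nCatalog:\n{catalog}"},
        ],
    )
    names = json.loads(resp.choices[0].message.content)
    return [TOOLS[n] for n in names if n in TOOLS]

# The main agent call then includes only select_tools(request) instead of all 62,
# so the fixed per-turn overhead drops from ~10k tokens to a handful of tools.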
@MaTriXy
MaTriXy / gist:8fa9055fe9bef787c1cf2f55e6f0c0c9
Created May 20, 2025 16:01
Prebid.js Video Ad Handling: Complete Flow Logic
# Prebid.js Video Ad Handling: Complete Flow Logic
## Overview
This document explains how Prebid.js handles video ads with different combinations of:
- Cache settings (enabled vs. disabled)
- Video types (instream vs. outstream)
- VAST representations (vastUrl vs. vastXml)
## Key Components
@MaTriXy
MaTriXy / bot.py
Created March 12, 2025 11:26 — forked from kwindla/bot.py
Cartesia Sonic-2 Language Teacher
import asyncio
import os
import sys
from dataclasses import dataclass
import aiohttp
import google.ai.generativelanguage as glm
from dotenv import load_dotenv
from loguru import logger
from runner import configure
@MaTriXy
MaTriXy / grpo_demo.py
Created January 29, 2025 16:26 — forked from willccbb/grpo_demo.py
GRPO Llama-1B
# train_grpo.py
import re
import torch
from datasets import load_dataset, Dataset
from transformers import AutoTokenizer, AutoModelForCausalLM
from peft import LoraConfig
from trl import GRPOConfig, GRPOTrainer
# Load and prep dataset
@MaTriXy
MaTriXy / browseruse_reddit.py
Created January 14, 2025 10:17 — forked from Idan707/browseruse_reddit.py
This code automatically scrolls through and analyzes Reddit posts in the r/sidehustle subreddit for relevance to AI and prompt engineering, using a browser automation agent, a controller for managing tasks, and structured output for saving the results
from dotenv import load_dotenv
from langchain_openai import ChatOpenAI
from pydantic import BaseModel
from browser_use import ActionResult, Agent, Controller
from browser_use.browser.context import BrowserContext
from browser_use.browser.browser import Browser, BrowserConfig
import asyncio
import os
import json
import re
@MaTriXy
MaTriXy / hedge-fund-agent-team-v1-4.ipynb
Created November 20, 2024 05:04 — forked from virattt/hedge-fund-agent-team-v1-4.ipynb
hedge-fund-agent-team-v1-4.ipynb
<?php
//
// AUTO KEYWORD-BASED FOLLOWER CURATION BOT (by @levelsio)
//
// File: twitterFollowerCuratorBot.php
//
// Created: May 2021
// License: MIT
//
@MaTriXy
MaTriXy / collect_code.sh
Created September 24, 2024 13:40 — forked from sullyo/collect_code.sh
Clones a GitHub repo and puts all the code into a single text file, perfect for LLMs
#!/bin/bash
# Check if a GitHub URL is provided as an argument
if [ -z "$1" ]; then
echo "Usage: $0 <github_url>"
exit 1
fi
# Store the GitHub URL
GIT_URL="$1"
@MaTriXy
MaTriXy / README.md
Created September 12, 2024 15:28 — forked from sayakpaul/README.md
This code snippet shows how to split the Flux transformer across two 16GB GPUs and run inference with the full pipeline.
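A minimal sketch of the idea, assuming the diffusers FluxPipeline. The "balanced" device map, the 16GB memory caps, the repo id, and the prompt are illustrative; the actual gist shards the transformer module itself rather than relying on pipeline-level placement.

import torch
from diffusers import FluxPipeline

# Sketch: let accelerate spread the pipeline's components (transformer, text
# encoders, VAE) across two GPUs, capping each card at 16GB. The original gist
# instead loads the transformer with its own device map; this is the simpler
# pipeline-level variant of the same idea.
pipe = FluxPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-dev",
    torch_dtype=torch.bfloat16,
    device_map="balanced",
    max_memory={0: "16GB", 1: "16GB"},
)

image = pipe(
    "a photo of an astronaut riding a horse",
    num_inference_steps=28,
    guidance_scale=3.5,
).images[0]
image.save("flux_two_gpu.png")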