@laurencecwj
laurencecwj / HelloWorld.proto
Created June 29, 2025 05:41 — forked from miguelmota/HelloWorld.proto
Golang gRPC protobuf hello world example (also using grpcurl)
syntax = "proto3";
package protos;
service HelloWorld {
rpc SayHello (HelloRequest) returns (HelloResponse) {}
}
message HelloRequest {
string name = 1;
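With a server for this service running, the grpcurl side of the gist's title can be exercised roughly like this (the address assumes a plaintext server on localhost:50051):

grpcurl -plaintext -proto HelloWorld.proto -d '{"name": "world"}' localhost:50051 protos.HelloWorld/SayHello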
input_embeds = self.transformer.wte(input_tokens)  # look up tokens in embedding matrix

# Non-recurrent prelude
for block in self.transformer.prelude:
    input_embeds, attn_map = block(input_embeds)

# Main recurrence; the recurrent state x is assumed here to start as random noise
x = torch.randn_like(input_embeds)
for step in range(num_steps_recurrence):
    x = self.transformer.adapter(torch.cat([x, input_embeds], dim=-1))  # Adapter
    for block in self.transformer.core_block:  # 4 inner layers
        x, attn_map = block(x)
@laurencecwj
laurencecwj / grpo_demo.py
Created February 9, 2025 13:42 — forked from willccbb/grpo_demo.py
GRPO Llama-1B
# train_grpo.py
import re
import torch
from datasets import load_dataset, Dataset
from transformers import AutoTokenizer, AutoModelForCausalLM
from peft import LoraConfig
from trl import GRPOConfig, GRPOTrainer
# Load and prep dataset
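The preview stops before the training setup; a minimal sketch of how trl's GRPO pieces typically fit together (the model id, toy reward, tiny dataset, and config values are illustrative, not the gist's):

def brevity_reward(completions, **kwargs):
    # toy reward: prefer shorter completions
    return [-float(len(c)) for c in completions]

dataset = Dataset.from_dict({"prompt": ["Answer briefly: what is 2 + 2?"]})
config = GRPOConfig(output_dir="grpo-out", num_generations=4, per_device_train_batch_size=4)
trainer = GRPOTrainer(
    model="meta-llama/Llama-3.2-1B-Instruct",  # assumed model id; any causal LM works
    reward_funcs=brevity_reward,
    args=config,
    train_dataset=dataset,
)
trainer.train()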
@laurencecwj
laurencecwj / MoE.py
Created January 30, 2025 12:11 — forked from ruvnet/MoE.py
A PyTorch implementation of a Mixture of Experts (MoE) model resembling the Mixtral 8x7B architecture, with detailed inline comments. This model combines transformer layers with an MoE layer consisting of 8 experts, aiming for high efficiency by activating only 2 experts per token. It's configured with dimensions reflecting the operational effic…
"""
This model integrates the MoE concept within a Transformer architecture. Each token's
representation is processed by a subset of experts, determined by the gating mechanism.
This architecture allows for efficient and specialized handling of different aspects of the
data, aiming for the adaptability and efficiency noted in the Mixtral 8x7B model's design
philosophy. The model activates only a fraction of the available experts for each token,
significantly reducing the computational resources needed compared to activating all experts
for all tokens.
"""
@laurencecwj
laurencecwj / *DeepSeek-uncensored.md
Created January 30, 2025 03:16 — forked from ruvnet/*DeepSeek-uncensored.md
Deploying and Fine-Tuning an Uncensored DeepSeek R1 Distill Model on Google Cloud

DeepSeek R1 Distill: Complete Tutorial for Deployment & Fine-Tuning

This guide shows how to deploy an uncensored DeepSeek R1 Distill model to Google Cloud Run with GPU support and how to perform a basic, functional fine-tuning process. The tutorial is split into the following parts; a minimal sketch of the step 2 server follows the list:

  1. Environment Setup
  2. FastAPI Inference Server
  3. Docker Configuration
  4. Google Cloud Run Deployment
  5. Fine-Tuning Pipeline (Cold Start, Reasoning RL, Data Collection, Final RL Phase)
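A minimal sketch of the FastAPI inference server idea from step 2 (the model id, route, and request fields here are illustrative):

from fastapi import FastAPI
from pydantic import BaseModel
from transformers import pipeline

app = FastAPI()
generator = pipeline("text-generation", model="deepseek-ai/DeepSeek-R1-Distill-Qwen-1.5B")

class Prompt(BaseModel):
    text: str
    max_new_tokens: int = 256

@app.post("/generate")
def generate(req: Prompt):
    out = generator(req.text, max_new_tokens=req.max_new_tokens)
    return {"completion": out[0]["generated_text"]}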
@laurencecwj
laurencecwj / main.go
Created January 7, 2025 02:18 — forked from crosstyan/main.go
A websocket based multiroom chat using golang and gin
package main

import (
    "log"
    "net/http"
    "net/url"

    "github.com/gin-gonic/gin"
    "github.com/google/uuid"
)
@laurencecwj
laurencecwj / a.pyx
Created December 17, 2024 13:57 — forked from kigawas/a.pyx
Cython example of multiple pyx files
cdef int _fib(int n):
    cdef int i
    cdef int a = 0, b = 1
    for i in range(n):
        a, b = a + b, a
    return a

def fib(n):
    return _fib(n)
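To match the gist's multiple-pyx theme, a minimal build sketch (a hypothetical setup.py; the glob picks up the a.pyx above plus any sibling .pyx files):

from setuptools import setup
from Cython.Build import cythonize

setup(ext_modules=cythonize("*.pyx"))

# After `python setup.py build_ext --inplace`:
#   >>> import a
#   >>> a.fib(10)
#   55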
@laurencecwj
laurencecwj / selfsigned.py
Created December 11, 2024 09:56 — forked from bloodearnest/selfsigned.py
Create a self-signed x509 certificate with python cryptography library
# Copyright 2018 Simon Davy
#
# Permission is hereby granted, free of charge, to any person obtaining a copy
# of this software and associated documentation files (the "Software"), to deal
# in the Software without restriction, including without limitation the rights
# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
# copies of the Software, and to permit persons to whom the Software is
# furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included in
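The preview cuts off inside the license header; the gist's core approach with the cryptography library looks roughly like this (key size, lifetime, and hostname are illustrative):

from datetime import datetime, timedelta, timezone
from cryptography import x509
from cryptography.x509.oid import NameOID
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import rsa

key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
name = x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, "localhost")])
cert = (
    x509.CertificateBuilder()
    .subject_name(name)
    .issuer_name(name)  # self-signed: issuer == subject
    .public_key(key.public_key())
    .serial_number(x509.random_serial_number())
    .not_valid_before(datetime.now(timezone.utc))
    .not_valid_after(datetime.now(timezone.utc) + timedelta(days=365))
    .add_extension(x509.SubjectAlternativeName([x509.DNSName("localhost")]), critical=False)
    .sign(key, hashes.SHA256())
)
pem = cert.public_bytes(serialization.Encoding.PEM)  # PEM bytes, ready to write to disk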
package main

import (
    "crypto/aes"
    "crypto/cipher"
    "fmt"
    "reflect"
    "strconv"
    "time"
)