Complete setup guide for local search + cognitive security for AI agents
These useful concepts show up in specific areas of the NN-training literature but can be applied pretty broadly.

Instead of `torch.rand(batch_size)`, you can use `th.randperm(batch_size).add_(th.rand(batch_size)).div_(batch_size)`, which has the same marginal distribution but lower variance across the batch, and therefore trains more stably. This shows up in k-diffusion (https://github.com/crowsonkb/k-diffusion/commit/a2b7b5f1ea0d3711a06661ca9e41b4e6089e5707), but it's applicable whenever you're randomizing data across the batch axis.
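The same trick is easy to check outside of PyTorch. A minimal NumPy sketch (the function name `stratified_uniform` is mine, not from k-diffusion; it mirrors the `randperm + rand` construction above):

```python
import numpy as np

def stratified_uniform(batch_size, rng=None):
    """NumPy equivalent of th.randperm(n).add_(th.rand(n)).div_(n):
    draw one uniform sample from each stratum [i/n, (i+1)/n), with the
    strata assigned to batch elements in random order. Each element is
    still marginally Uniform[0, 1), but the batch as a whole covers
    [0, 1) far more evenly than independent draws would."""
    rng = np.random.default_rng() if rng is None else rng
    return (rng.permutation(batch_size) + rng.random(batch_size)) / batch_size
```

Every stratum [i/n, (i+1)/n) gets exactly one sample, so batch-level statistics (e.g. the mean noise level in a diffusion training step) fluctuate much less from batch to batch.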
A self-contained RGB-to-YUV conversion (full-range BT.601 coefficients):

```python
#!/usr/bin/env python
# coding: utf-8
import numpy as np

# input is an RGB numpy array with shape (height,width,3); it can be uint, int,
# float or double, with values expected in the range 0..255
# output is a double YUV numpy array with shape (height,width,3), values in the
# range 0..255
def RGB2YUV(rgb):
    m = np.array([[ 0.29900, -0.16874,  0.50000],
                  [ 0.58700, -0.33126, -0.41869],
                  [ 0.11400,  0.50000, -0.08131]])
    yuv = np.dot(rgb, m)      # per-pixel 3x3 matrix multiply
    yuv[:, :, 1:] += 128.0    # shift U and V from +/-128 into the 0..255 range
    return yuv
```