Rafal W. (kenorb)

💭 The Consciousness Has Shifted... The Awakening Has Begun
@kenorb
kenorb / minePYTHON36.py
Created June 22, 2025 19:22 — forked from turunut/minePYTHON36.py
Example of how a Bitcoin block is mined by finding a successful nonce
import hashlib, struct, codecs
ver = 2
prev_block = "000000000000000117c80378b8da0e33559b5997f2ad55e2f7d18ec1975b9717"
mrkl_root = "871714dcbae6c8193a2bb9b2a69fe1c0440399f38d94b3a0f1b447275a29978a"
time_ = 0x53058b35 # 2014-02-20 04:57:25
bits = 0x19015f53
# https://en.bitcoin.it/wiki/Difficulty
exp = bits >> 24
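A minimal continuation sketch (my own, not the rest of the gist): the compact bits field expands into the 256-bit target, and the miner brute-forces nonces until the double-SHA256 of the 80-byte header, byte-reversed as Bitcoin displays it, is at or below that target. The names mant, build_header and mine are introduced here for illustration.

mant = bits & 0xffffff
target = mant << (8 * (exp - 3))  # expand the compact difficulty into the full 256-bit target

def build_header(nonce):
    # 80-byte block header: version, previous hash, merkle root, time, bits, nonce
    # (integers little-endian, hashes byte-reversed)
    return (struct.pack('<L', ver)
            + codecs.decode(prev_block, 'hex')[::-1]
            + codecs.decode(mrkl_root, 'hex')[::-1]
            + struct.pack('<LLL', time_, bits, nonce))

def mine(start_nonce=0):
    # Try successive nonces until the double-SHA256 of the header meets the target.
    nonce = start_nonce
    while True:
        h = hashlib.sha256(hashlib.sha256(build_header(nonce)).digest()).digest()
        if int.from_bytes(h[::-1], 'big') <= target:
            return nonce, h[::-1].hex()
        nonce += 1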
@kenorb
kenorb / .gitattributes
Created March 9, 2023 14:09 — forked from nemotoo/.gitattributes
.gitattributes for Unity3D with git-lfs
## Unity ##
*.cs diff=csharp text
*.cginc text
*.shader text
*.mat merge=unityyamlmerge eol=lf
*.anim merge=unityyamlmerge eol=lf
*.unity merge=unityyamlmerge eol=lf
*.prefab merge=unityyamlmerge eol=lf
@kenorb
kenorb / DownloadMixamoByJakeCattrall.js
Last active February 7, 2023 16:02 — forked from krazyjakee/DownloadMixamoByJakeCattrall.js
Downloads all the free Mixamo Animations
function trigger(el, eventType) {
  // Call a native method (e.g. el.click()) when one exists for the event name;
  // otherwise dispatch a synthetic event.
  if (typeof eventType === 'string' && typeof el[eventType] === 'function') {
    el[eventType]();
  } else {
    const event =
      typeof eventType === 'string'
        ? new Event(eventType, {bubbles: true})
        : eventType;
    el.dispatchEvent(event);
  }
}
alizarin
amaranth
amber
amethyst
apricot
aqua
aquamarine
asparagus
auburn
azure
# Show Tensor Images utility function
from torchvision.utils import make_grid
import matplotlib.pyplot as plt

def show_tensor_images(image_tensor, num_images=25, size=(1, 28, 28)):
    '''
    Function for visualizing images: Given a tensor of images, number of images, and
    size per image, plots and prints the images in a uniform grid.
    '''
    # Body reconstructed as a minimal sketch: move to CPU, reshape, arrange into a grid, plot.
    image_unflat = image_tensor.detach().cpu().view(-1, *size)
    image_grid = make_grid(image_unflat[:num_images], nrow=5)
    plt.imshow(image_grid.permute(1, 2, 0).squeeze())
    plt.show()
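A hedged usage sketch; the random batch below is a stand-in for whatever MNIST-shaped tensor the surrounding notebook produces.

import torch
fake_batch = torch.rand(25, 1, 28, 28)  # hypothetical batch of 25 single-channel 28x28 images
show_tensor_images(fake_batch, num_images=25, size=(1, 28, 28))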
def conv_backward(dH, cache):
    '''
    The backward computation for a convolution function

    Arguments:
    dH -- gradient of the cost with respect to output of the conv layer (H), numpy array of shape (n_H, n_W) assuming channels = 1
    cache -- cache of values needed for the conv_backward(), output of conv_forward()

    Returns:
    dX -- gradient of the cost with respect to input of the conv layer (X), numpy array of shape (n_H_prev, n_W_prev) assuming channels = 1
    '''
def conv_forward(X, W):
    '''
    The forward computation for a convolution function

    Arguments:
    X -- output activations of the previous layer, numpy array of shape (n_H_prev, n_W_prev) assuming input channels = 1
    W -- Weights, numpy array of size (f, f) assuming number of filters = 1

    Returns:
    H -- conv output, numpy array of size (n_H, n_W)
    '''
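A minimal NumPy sketch consistent with the two docstrings above (a reconstruction, not the gists' actual bodies): stride 1, no padding, single channel, single filter. The cache returned by conv_forward and the in-place accumulation in conv_backward are assumptions introduced here; the real gists may also return a filter gradient dW, which this sketch omits.

import numpy as np

def conv_forward(X, W):
    # Valid cross-correlation of a single-channel input with a single f x f filter.
    n_H_prev, n_W_prev = X.shape
    f = W.shape[0]
    n_H, n_W = n_H_prev - f + 1, n_W_prev - f + 1
    H = np.zeros((n_H, n_W))
    for i in range(n_H):
        for j in range(n_W):
            H[i, j] = np.sum(X[i:i + f, j:j + f] * W)
    cache = (X, W)  # assumption: stash inputs so conv_backward can reuse them
    return H, cache

def conv_backward(dH, cache):
    # Scatter each upstream gradient dH[i, j] back over the input window it came from.
    X, W = cache
    f = W.shape[0]
    n_H, n_W = dH.shape
    dX = np.zeros_like(X, dtype=float)
    for i in range(n_H):
        for j in range(n_W):
            dX[i:i + f, j:j + f] += W * dH[i, j]
    return dX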