import unittest
import numpy as np
import numpy.linalg as linalg

def makeUnit(x):
    """Normalize entire input to norm 1. Not what you want for 2D arrays!"""
    return x / linalg.norm(x)
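The docstring's warning is worth spelling out: `linalg.norm` with no `axis` argument computes a single norm over the whole array, so a 2D input gets scaled by one scalar rather than normalized row by row. A minimal sketch of the row-wise version (the name `make_unit_rows` is introduced here for illustration):

```python
import numpy as np

def make_unit_rows(x):
    # norm(axis=1) gives one norm per row; keepdims keeps the result
    # as a column vector so it broadcasts back against x.
    return x / np.linalg.norm(x, axis=1, keepdims=True)

x = np.array([[3.0, 4.0], [0.0, 2.0]])
u = make_unit_rows(x)  # each row of u now has norm 1
```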
from __future__ import division
import numpy as np
import pandas as pd
import random

def sample(data):
    # Resample with replacement to the same length as the input (a bootstrap sample).
    # Note: xrange is Python 2 only; range works on both Python 2 and 3.
    return [random.choice(data) for _ in range(len(data))]

def bootstrap_t_test(treatment, control, nboot=1000, direction="less"):
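The body of `bootstrap_t_test` is cut off above. As a rough, self-contained sketch of what a function with this signature typically does — a hedged reconstruction, not the author's original code — assuming a one-sided test on the difference in means under pooled resampling:

```python
import random

def bootstrap_t_test(treatment, control, nboot=1000, direction="less"):
    # Hypothetical reconstruction: compare the observed mean difference
    # with differences from nboot resamples of the pooled (null) data.
    mean = lambda xs: sum(xs) / len(xs)
    observed = mean(treatment) - mean(control)
    pooled = list(treatment) + list(control)
    n = len(treatment)
    count = 0
    for _ in range(nboot):
        boot = [random.choice(pooled) for _ in range(len(pooled))]
        diff = mean(boot[:n]) - mean(boot[n:])
        if direction == "less":
            count += (diff <= observed)
        else:
            count += (diff >= observed)
    return count / nboot
```

With `direction="less"`, the return value estimates how often a null resample produces a difference at least as extreme (as low) as the observed one.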
""" | |
You can use PyCharm's Python console and use Ctrl + C, | |
if you catch the exception that PyCharm raises when Ctrl + C is pressed. | |
I wrote a short function below called `is_keyboard_interrupt` | |
that tells you whether the exception is KeyboardInterrupt, | |
including PyCharm's. | |
If it is not, simply re-raise it. | |
I paste a simplified version of the code below. | |
When it is run: |
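The function itself is not shown in this excerpt. A minimal sketch of what `is_keyboard_interrupt` could look like, assuming PyCharm may deliver the interrupt as an exception that merely shares the name `KeyboardInterrupt` rather than being the builtin type (that assumption is mine, not confirmed by the source):

```python
def is_keyboard_interrupt(exception):
    # True for the builtin KeyboardInterrupt.
    if isinstance(exception, KeyboardInterrupt):
        return True
    # Assumption: PyCharm's console may raise its own exception type whose
    # name matches 'KeyboardInterrupt' without subclassing the builtin.
    return type(exception).__name__ == 'KeyboardInterrupt'

# Usage: catch broadly, re-raise anything that is not an interrupt.
try:
    pass  # long-running work goes here
except BaseException as e:
    if not is_keyboard_interrupt(e):
        raise
```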
# On the PyCharm Debugger console, \r needs to come before the text.
# Otherwise, the text may not appear at all, or may appear inconsistently.
# Tested on PyCharm 2019.3, Python 3.6.
# Modification of https://stackoverflow.com/a/517523/2565317

import time

print('Start.')
for i in range(100):
    time.sleep(0.02)
    # Reconstructed line (cut off in the excerpt): the carriage return
    # precedes the text, as noted in the comments above.
    print('\r%d%%' % (i + 1), end='')
*.pyc |
"""A part of the pylabyk library: numpytorch.py at https://github.com/yulkang/pylabyk""" | |
import torch | |
def block_diag(m): | |
""" | |
Make a block diagonal matrix along dim=-3 | |
EXAMPLE: | |
block_diag(torch.ones(4,3,2)) | |
should give a 12 x 8 matrix with blocks of 3 x 2 ones. | |
Prepend batch dimensions if needed. | |
You can also give a list of matrices. |
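The docstring's shape claim can be cross-checked without a torch dependency. An illustrative NumPy analogue (the name `block_diag_np` is introduced here) that places n blocks of shape r x c on the diagonal of an (n*r) x (n*c) zero matrix:

```python
import numpy as np

def block_diag_np(blocks):
    # blocks: array of shape (n, r, c); block i lands at rows i*r..(i+1)*r.
    n, r, c = blocks.shape
    out = np.zeros((n * r, n * c), dtype=blocks.dtype)
    for i in range(n):
        out[i * r:(i + 1) * r, i * c:(i + 1) * c] = blocks[i]
    return out

m = np.ones((4, 3, 2))
out = block_diag_np(m)  # 12 x 8, matching the docstring example
```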
"""A part of the pylabyk library: numpytorch.py at https://github.com/yulkang/pylabyk""" | |
import torch | |
def kron(a, b): | |
""" | |
Kronecker product of matrices a and b with leading batch dimensions. | |
Batch dimensions are broadcast. The number of them mush | |
:type a: torch.Tensor | |
:type b: torch.Tensor | |
:rtype: torch.Tensor | |
""" |
import torch

def get_jacobian(net, x, noutputs):
    # Repeat the 1D input noutputs times so that a single backward pass with
    # an identity grad_output recovers all rows of the Jacobian at once:
    # row i of x.grad receives d y_i / d x.
    x = x.squeeze()
    n = x.size()[0]
    x = x.repeat(noutputs, 1)
    x.requires_grad_(True)
    y = net(x)
    y.backward(torch.eye(noutputs))
    return x.grad.data
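One way to sanity-check the trick is with a bias-free linear layer, for which the Jacobian of y = x W^T is W itself. A self-contained usage sketch (restating the function so the example runs on its own):

```python
import torch

def get_jacobian(net, x, noutputs):
    x = x.squeeze()
    x = x.repeat(noutputs, 1)        # one copy of x per output row
    x.requires_grad_(True)
    y = net(x)
    y.backward(torch.eye(noutputs))  # identity grad_output -> full Jacobian
    return x.grad.data

lin = torch.nn.Linear(3, 2, bias=False)
J = get_jacobian(lin, torch.randn(3), 2)
# For a linear map, J should equal the weight matrix exactly.
```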