import torch
from torch import nn

from monai.utils import optional_import

xformers, has_xformers = optional_import("xformers", name="xformers")


class SelfAttentionBlock(nn.Module):
    def __init__(
        self,
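The gist is cut off before the constructor body. Below is a minimal, self-contained sketch of what such a block commonly looks like; the hidden size, head count, qkv projection, and the fallback to PyTorch's scaled_dot_product_attention are assumptions, not the gist author's exact implementation.

import torch
from torch import nn
from monai.utils import optional_import

xformers, has_xformers = optional_import("xformers", name="xformers")


class SelfAttentionSketch(nn.Module):
    # Hypothetical self-attention block: qkv projection plus either xformers'
    # memory-efficient attention or plain PyTorch attention as a fallback.
    def __init__(self, hidden_size: int, num_heads: int, use_xformers: bool = False) -> None:
        super().__init__()
        if use_xformers and not has_xformers:
            raise ImportError("use_xformers=True but xformers is not installed")
        self.num_heads = num_heads
        self.head_dim = hidden_size // num_heads
        self.qkv = nn.Linear(hidden_size, hidden_size * 3)
        self.proj = nn.Linear(hidden_size, hidden_size)
        self.use_xformers = use_xformers

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, n, c = x.shape
        # split the joint projection into q, k, v of shape (batch, seq, heads, head_dim)
        q, k, v = self.qkv(x).reshape(b, n, 3, self.num_heads, self.head_dim).unbind(2)
        if self.use_xformers:
            out = xformers.ops.memory_efficient_attention(q, k, v)
        else:
            q, k, v = (t.transpose(1, 2) for t in (q, k, v))  # (batch, heads, seq, head_dim)
            out = nn.functional.scaled_dot_product_attention(q, k, v).transpose(1, 2)
        return self.proj(out.reshape(b, n, c))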
class ResidualBlock(keras.Model):
    """Residual block that composes the PixelCNN.

    Block of layers with 3 convolutional layers and one residual connection.
    Based on Figure 5 from [1], where h indicates the number of filters.

    Refs:
        [1] Oord, A. van den, Kalchbrenner, N., & Kavukcuoglu, K. (2016). Pixel
            recurrent neural networks. arXiv preprint arXiv:1601.06759.
    """
class MaskedConv2D(keras.layers.Layer):
    """Convolutional layer with masks.

    Convolutional layer with a simple implementation of masks of type A and B for
    autoregressive models.

    Arguments:
        mask_type: one of `"A"` or `"B"`.
        filters: Integer, the dimensionality of the output space
            (i.e. the number of output filters in the convolution).
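The argument list is cut off in the gist. A common simple implementation of the masking it describes multiplies a regular Conv2D kernel by a binary mask before each call: everything below the centre row and to the right of the centre pixel is zeroed, and a type-"A" mask also zeroes the centre itself. The sketch below follows that pattern; the padding choice and the tensorflow/keras imports are assumptions rather than the gist's code.

import numpy as np
import tensorflow as tf
from tensorflow import keras


class MaskedConv2DSketch(keras.layers.Layer):
    # Hypothetical masked convolution (types "A" and "B") for autoregressive models.
    def __init__(self, mask_type, filters, kernel_size, **kwargs):
        super().__init__(**kwargs)
        assert mask_type in ("A", "B")
        self.mask_type = mask_type
        self.conv = keras.layers.Conv2D(filters=filters, kernel_size=kernel_size, padding="same")

    def build(self, input_shape):
        self.conv.build(input_shape)
        kh, kw, _, _ = self.conv.kernel.shape
        mask = np.zeros((kh, kw, 1, 1), dtype="float32")
        mask[: kh // 2, ...] = 1.0              # rows strictly above the centre
        mask[kh // 2, : kw // 2, ...] = 1.0     # pixels left of the centre in the centre row
        if self.mask_type == "B":
            mask[kh // 2, kw // 2, ...] = 1.0   # type "B" may also see the centre pixel
        self.mask = tf.constant(mask)

    def call(self, inputs):
        # zero out the "future" weights before every forward pass
        self.conv.kernel.assign(self.conv.kernel * self.mask)
        return self.conv(inputs)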
# -*- coding: utf-8 -*-
"""
This module optimizes the hyper-parameters of a DBN using the hyperopt library.
Check out the library here: https://github.com/hyperopt/hyperopt

Example run:
    python hyperopt_exampleMNIST.py --trainSize=10000 --path=...

REFERENCES
[1] Bergstra, James, Dan Yamins, and David D. Cox. "Hyperopt: A Python library for
    optimizing the hyperparameters of machine learning algorithms." (2013).
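The gist only shows the module docstring; a minimal hyperopt loop of the kind it describes is sketched below. The search space, the DBN objective, and the number of evaluations are placeholders, not the parameters actually tuned in the gist.

from hyperopt import Trials, fmin, hp, tpe


def objective(params):
    # Placeholder: train the DBN with `params` on the training subset and
    # return the validation error to be minimised.
    validation_error = 0.0
    return validation_error


# Hypothetical search space over a few typical DBN hyper-parameters.
space = {
    "learning_rate": hp.loguniform("learning_rate", -6, -2),
    "hidden_units": hp.choice("hidden_units", [256, 512, 1024]),
    "batch_size": hp.choice("batch_size", [32, 64, 128]),
}

trials = Trials()
best = fmin(fn=objective, space=space, algo=tpe.suggest, max_evals=50, trials=trials)
print("best hyper-parameters:", best)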