How to Implement ResNet 18 from scratch?

I’ve loaded CIFAR-10 from the datasets (that worked very nicely, thank you!)

I want to use an untrained ResNet on the data.

PL has a pretrained ResNet50 component, but I would like to re-run a Double Descent experiment using ResNet-18 from this paper:

  • Nakkiran, Preetum, Gal Kaplun, Yamini Bansal, Tristan Yang, Boaz Barak, and Ilya Sutskever. ‘Deep Double Descent: Where Bigger Models and More Data Hurt’. arXiv:1912.02292 [cs, stat], 4 December 2019. http://arxiv.org/abs/1912.02292.

(so that I can see Double Descent clearly in PL. I think I have created it in another model, but it’s nice to reproduce known results sometimes.)

In particular, according to the ResNets description in Appendix B (B.1 Models):

ResNets. We define a family of ResNet18s of increasing size as follows. We follow the Preactivation ResNet18 architecture of He et al. (2016), using 4 ResNet blocks, each consisting of two BatchNorm-ReLU-Convolution layers. The layer widths for the 4 blocks are [k, 2k, 4k, 8k] for varying k ∈ ℕ and the strides are [1, 2, 2, 2]. The standard ResNet18 corresponds to k = 64 convolutional channels in the first layer. The scaling of model size with k is shown in Figure 13b. Our implementation is adapted from https://github.com/kuangliu/pytorch-cifar.
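For concreteness, here is my reading of that description as a plain TF2/Keras sketch, so you can see what I’m after. Names like preact_block and make_resnet18 are mine; I’ve followed the kuangliu reference’s two blocks per stage, and details like the final BatchNorm-ReLU are best guesses rather than a verified reproduction:

import tensorflow as tf
from tensorflow.keras import layers

def preact_block(x, filters, stride):
    """One block: two BatchNorm-ReLU-Conv layers plus an identity/projection shortcut."""
    out = layers.BatchNormalization()(x)
    out = layers.ReLU()(out)
    # Projection shortcut where the shape changes (first block of a strided stage)
    if stride != 1 or x.shape[-1] != filters:
        shortcut = layers.Conv2D(filters, 1, strides=stride, use_bias=False)(out)
    else:
        shortcut = x
    out = layers.Conv2D(filters, 3, strides=stride, padding='same', use_bias=False)(out)
    out = layers.BatchNormalization()(out)
    out = layers.ReLU()(out)
    out = layers.Conv2D(filters, 3, strides=1, padding='same', use_bias=False)(out)
    return layers.Add()([out, shortcut])

def make_resnet18(k=64, num_classes=10, input_shape=(32, 32, 3)):
    """Preactivation ResNet18 parameterized by base width k; k = 64 is the standard model."""
    inputs = tf.keras.Input(shape=input_shape)
    x = layers.Conv2D(k, 3, strides=1, padding='same', use_bias=False)(inputs)
    widths  = [k, 2 * k, 4 * k, 8 * k]   # layer widths for the 4 stages
    strides = [1, 2, 2, 2]               # stride of the first block in each stage
    for w, s in zip(widths, strides):
        x = preact_block(x, w, stride=s)  # may downsample
        x = preact_block(x, w, stride=1)  # keeps resolution
    x = layers.ReLU()(layers.BatchNormalization()(x))  # final preactivation
    x = layers.GlobalAveragePooling2D()(x)
    outputs = layers.Dense(num_classes)(x)  # logits
    return tf.keras.Model(inputs, outputs)

model = make_resnet18(k=64)  # sweep k to trace out the model-size axis of double descent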

All input gratefully received - pure PL components or custom.

I think this is the original paper where ResNet-18 was introduced, but the architecture figure only shows the 34-layer version (obviously):

  • He, Kaiming, Xiangyu Zhang, Shaoqing Ren, and Jian Sun. ‘Deep Residual Learning for Image Recognition’. arXiv:1512.03385 [cs], 10 December 2015. http://arxiv.org/abs/1512.03385.

Hi!

There is a TensorFlow 2.x implementation of ResNet18 available on GitHub (acoadmarmon/resnet18-tensorflow):

You can download that .py file and import it in a PerceptiLabs custom layer :slight_smile:

wget 'https://raw.githubusercontent.com/acoadmarmon/resnet18-tensorflow/master/resnet_18.py'

Then start PerceptiLabs from the same directory you’ve downloaded resnet_18.py to.
Now you can import it into a custom layer (full code):

import tensorflow as tf  # PerceptiLabsVisualizer and Tf2xLayer are provided by the PL environment
from resnet_18 import ResNet18  # note the underscore: the downloaded file is resnet_18.py

class LayerCustom_LayerCustom_1Keras(tf.keras.layers.Layer, PerceptiLabsVisualizer):
    def build(self, input_shape):
        input_shape_ = input_shape['input']
        if len(input_shape_) > 3:
            input_shape_ = input_shape_[1:]  # drop the batch dimension
        self.resnet18 = ResNet18(include_top=False,
                                 weights=None,  # untrained, as requested
                                 input_shape=input_shape_,
                                 layer_params=[2, 2, 2, 2],
                                 pooling='max')  # the string 'max', not the Python builtin max

    def call(self, inputs, training=True):
        """Runs the input tensor through the ResNet18 backbone."""
        input_ = inputs['input']
        output = self.resnet18(input_)
        self._outputs = {
            'output': output,
            'preview': output,
        }
        return self._outputs

    def get_config(self):
        """Any variables belonging to this layer that should be rendered in the frontend.

        Returns:
            A dictionary with tensor names for keys and picklables for values.
        """
        return {}

    @property
    def visualized_trainables(self):
        """Returns two tf.Variables (weights, biases) to be visualized in the frontend."""
        return tf.constant(0), tf.constant(0)

class LayerCustom_LayerCustom_1(Tf2xLayer):
    def __init__(self):
        super().__init__(
            keras_class=LayerCustom_LayerCustom_1Keras
        )

To change parameters for the model (e.g. pooling), just edit the “self.resnet18 = ResNet18(include_top…” part of the code.
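For instance, switching to average pooling would look like this (assuming that repo’s ResNet18 accepts Keras-style pooling strings, which I haven’t verified):

        self.resnet18 = ResNet18(include_top=False,
                                 weights=None,
                                 input_shape=input_shape_,
                                 layer_params=[2, 2, 2, 2],
                                 pooling='avg')  # 'avg' instead of 'max' (assumed option)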

Hope that works


@birdstream Thanks for that! I had googled a bit but only found pre-trained versions in my quick search.

I also want to build it from scratch using PL components, but that code provides a proof of principle & a cross-check.

Can you clarify that bit? Because I run PL from a conda environment, I don’t know what it means to run PL from a directory. (Apologies for being extremely literal sometimes!)

I can see that resnet_18 is imported, but I don’t know how paths are handled… this is the environment root… download resnet_18.py to here?

Oh, I keep forgetting that you’re on Windows :see_no_evil:
I wish I could give a straight answer here, as I’ve only used Python on Linux :sweat_smile:

When I use PerceptiLabs, I open up a terminal window:

conda activate perceptilabs
then
perceptilabs

Whatever working directory I’m in when launching perceptilabs (99% of the time it’s my home directory) is where Python will look for resnet_18.py when doing the import.

Can you do that in a similar way on Windows from the command line? :sweat_smile: If not, my bet is that the file should be put in the root of your home directory…
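One OS-independent workaround (my suggestion, untested inside PL) would be to skip the working-directory question entirely and add the file’s folder to Python’s search path at the top of the custom-layer code:

import sys, os

# Hypothetical location: point this at wherever you saved resnet_18.py
sys.path.append(os.path.expanduser('~/Downloads'))

from resnet_18 import ResNet18  # now resolves regardless of PL's launch directory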

Oh, I was thinking of training from scratch :sweat_smile:
With some effort, it should be doable? All the necessary components should be there, I think :thinking:

:rofl: Why does that feel like “Poor guy…”?

I think it should be doable too… we’ll see. The code is a big help though.

I must confess I am spending too much time on ML at the moment (!) but it’s all too interesting: I’m also discussing generalisation on the side with @robertl and trying out some theories on him…

And now that I’ve discovered Loss Landscape Visualisation, I want to do that too…

Noo, I didn’t mean it like that :wink::sweat_smile: I’m really not that versed in Windows nowadays, and I wish I could be of more help here. Hopefully someone else can :slight_smile:

How can one spend too much time on ML? :see_no_evil::rofl:

What is that? I’m thinking 3D plots showing the hills and valleys of some function that one wants to optimize…? :thinking:

Oh yes :slight_smile: Loss landscape visualisation lets you see the landscape over which gradients are being evaluated during backprop.

See Li, Hao, Zheng Xu, Gavin Taylor, Christoph Studer, and Tom Goldstein. ‘Visualizing the Loss Landscape of Neural Nets’. In Advances in Neural Information Processing Systems, Vol. 31. Curran Associates, Inc., 2018.

for this sort of thing: 3D surface plots of the hills and valleys of the loss around a trained minimum.

“We wants it!”
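The core recipe is simple enough to sketch: nudge the trained weights along a random direction, normalized to the scale of the weights, and evaluate the loss at each step. Here is a minimal 1-D version for any Keras model (my own adaptation, using per-tensor rather than the paper’s per-filter normalization):

import numpy as np
import tensorflow as tf

def loss_along_direction(model, loss_fn, x, y, alphas):
    """1-D slice of the loss landscape: perturb the trained weights along a
    random direction scaled to the weight norms, recording the loss at each step."""
    theta = [w.numpy() for w in model.trainable_weights]  # trained parameters
    # Random direction, rescaled per-tensor to match the weight norms
    # (a simplification of Li et al.'s per-filter normalization)
    d = [np.random.randn(*wi.shape).astype(wi.dtype) for wi in theta]
    d = [di * np.linalg.norm(wi) / (np.linalg.norm(di) + 1e-10)
         for di, wi in zip(d, theta)]
    losses = []
    for a in alphas:
        for w, wi, di in zip(model.trainable_weights, theta, d):
            w.assign(wi + a * di)
        losses.append(float(loss_fn(y, model(x, training=False))))
    for w, wi in zip(model.trainable_weights, theta):  # restore the weights
        w.assign(wi)
    return losses

# e.g. losses = loss_along_direction(model, tf.keras.losses.SparseCategoricalCrossentropy(
#     from_logits=True), x_batch, y_batch, np.linspace(-1.0, 1.0, 25))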
