In PerceptiLabs we have the ability to augment our dataset of images with handy options in the dataset settings such as random flip, random crop, and random rotation. These are really useful for combating overfitting and help our model generalize better, especially when there is a limited number of samples to train on. Random horizontal flip alone effectively doubles the number of distinct samples in the dataset, and by enabling more augmentations we can multiply that several times over. In most cases, this is perfectly enough.
But what if we want even more samples? Or want a type of augmentation that isn’t available as an option in PerceptiLabs?
While it would be hard to cover every possible kind of augmentation in a graphical frontend, PerceptiLabs offers custom components that let us write code of our own, and we can harness them to do just that: more augmentation!
We add a custom component to our model and place it between the input and the rest of the model:
Then, with the LayerCustom_1 component selected, we open the custom component’s code by clicking the “open code </>” button in the settings panel on the right.
In the code editor, we first import the experimental.preprocessing module at the top of the file. Then we add the augmentation pipeline to the build() method under the class declaration, like so:
import tensorflow.keras.layers.experimental.preprocessing as pp

class LayerCustom_LayerCustom_1Keras(tf.keras.layers.Layer, PerceptiLabsVisualizer):
    def build(self, input_shape):
        self.preprocess = tf.keras.Sequential([
            pp.Rescaling(1./127.5, -1),
            pp.RandomContrast(0.1),
            pp.RandomZoom(0.1, 0.1),
            pp.RandomTranslation(0.1, 0.1),
            pp.RandomFlip(),
            pp.RandomRotation(0.1)
        ])
Here, we stack a handful of augmentation layers inside a tf.keras.Sequential:
tf.keras.layers.Rescaling(1./127.5, -1)
scales the input from the range [0, 255] to [-1, 1]. Don’t use this if you’re already normalizing the images in the data settings.
tf.keras.layers.RandomContrast(0.1)
randomly adjusts the contrast by up to a factor of 0.1 (i.e. 10%).
tf.keras.layers.RandomZoom(0.1, 0.1)
randomly zooms the image in or out by up to a factor of 0.1.
tf.keras.layers.RandomTranslation(0.1, 0.1)
randomly shifts the image vertically and horizontally by up to a factor of 0.1.
tf.keras.layers.RandomFlip()
randomly flips the image horizontally and/or vertically.
tf.keras.layers.RandomRotation(0.1)
randomly rotates the image by up to 0.1 * 2π rad (i.e. ±36°).
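To get a feel for what this stack does before training, we can run the same layers on a sample batch in a plain Python session outside PerceptiLabs. The sketch below is only illustrative and uses a random tensor as a stand-in for a real image:

import tensorflow as tf
from tensorflow.keras.layers.experimental import preprocessing as pp

# The same augmentation stack as in the custom component
augment = tf.keras.Sequential([
    pp.Rescaling(1./127.5, -1),
    pp.RandomContrast(0.1),
    pp.RandomZoom(0.1, 0.1),
    pp.RandomTranslation(0.1, 0.1),
    pp.RandomFlip(),
    pp.RandomRotation(0.1)
])

# A random stand-in for a batch of one 224x224 RGB image with values in [0, 255]
image = tf.cast(tf.random.uniform((1, 224, 224, 3), 0, 255, dtype=tf.int32), tf.float32)

# training=True makes the Random* layers actually transform the input
augmented = augment(image, training=True)
print(augmented.shape)  # (1, 224, 224, 3), values now roughly in [-1, 1]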
Finally, we just need to apply the preprocessing pipeline in the call() method:
input_ = inputs['input']
# Pass the training flag through so the augmentation only runs during training
output = self.preprocess(input_, training=training)
And that’s all! Here is the entire code for our custom component:
import tensorflow.keras.layers.experimental.preprocessing as pp

class LayerCustom_LayerCustom_1Keras(tf.keras.layers.Layer, PerceptiLabsVisualizer):
    def build(self, input_shape):
        # Stack the augmentation layers into a single preprocessing pipeline
        self.preprocess = tf.keras.Sequential([
            pp.Rescaling(1./127.5, -1),
            pp.RandomContrast(0.1),
            pp.RandomZoom(0.1, 0.1),
            pp.RandomTranslation(0.1, 0.1),
            pp.RandomFlip(),
            pp.RandomRotation(0.1)
        ])

    def call(self, inputs, training=True):
        """ Takes an image tensor and applies the augmentation pipeline to it """
        input_ = inputs['input']
        # Pass the training flag through so the Random* layers only transform
        # the data while training
        output = self.preprocess(input_, training=training)
        self._outputs = {
            'output': output,
            'preview': output,
        }
        return self._outputs

    def get_config(self):
        """Any variables belonging to this layer that should be rendered in the frontend.
        Returns:
            A dictionary with tensor names for keys and picklable for values.
        """
        return {}

    @property
    def visualized_trainables(self):
        """ Returns two tf.Variables (weights, biases) to be visualized in the frontend """
        return tf.constant(0), tf.constant(0)


class LayerCustom_LayerCustom_1(Tf2xLayer):
    def __init__(self):
        super().__init__(
            keras_class=LayerCustom_LayerCustom_1Keras
        )
By doing it this way, new samples are created on the fly and will be different every epoch. So if we want more samples, we simply train for more epochs.
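As a quick sanity check that the augmentation really is re-sampled on every pass, we can (again outside PerceptiLabs, with a random stand-in image) apply the same kind of stack to one tensor twice and compare the results:

import tensorflow as tf
from tensorflow.keras.layers.experimental import preprocessing as pp

augment = tf.keras.Sequential([pp.RandomFlip(), pp.RandomRotation(0.1)])

image = tf.random.uniform((1, 64, 64, 3))  # stand-in for a real sample
first = augment(image, training=True)
second = augment(image, training=True)

# Fresh random parameters are drawn on each call, so the two outputs
# will (almost always) differ
print(bool(tf.reduce_any(tf.not_equal(first, second))))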