Hi all, I am loading a custom model into PerceptiLabs and facing a few errors:
Error 1:
I loaded a new object and used Keras `load_model(path/to/model)`, however I am getting the errors below. I tried loading both .pth and .pt file types. Looking into the code of other objects, it seems I should also add some template code (such as `build`, `get_config`, etc.) to build the object itself, so I did that as well:
```python
class LayerCustom_LayerCustom_1Keras(tf.keras.layers.Layer, PerceptiLabsVisualizer):

    def call(self, inputs, training=True):
        """ Takes a tensor and one-hot encodes it """
        raise TypeError("Missing input connection 'input'")
        input_ = inputs['input']
        input_ = self.preprocess_input(input_)
        # output = preview = input_['input']
        output = self.Model(input_, training=False)
        self._outputs = {
            'output': output,
            'preview': output,
        }
        return self._outputs

    def load_model():
        model = keras.models.load_model('/home/user/Desktop/Models Trained/Model-C10.pth')
        model = get_model()
        return model

    def get_config(self):
        """Any variables belonging to this layer that should be rendered in the frontend.

        Returns:
            A dictionary with tensor names for keys and picklable for values.
        """
        return {}

    @property
    def visualized_trainables(self):
        """ Returns two tf.Variables (weights, biases) to be visualized in the frontend """
        return tf.constant(0), tf.constant(0)


class LayerCustom_LayerCustom_1(Tf2xLayer):
    def __init__(self):
        super().__init__(
            keras_class=LayerCustom_LayerCustom_1Keras
        )
```
Traceback errors:

```
File "perceptilabs/lwcore/strategies/tf2x.py", line 36, in perceptilabs.lwcore.strategies.tf2x.Tf2xInnerStrategy._run_internal
File "perceptilabs/lwcore/strategies/tf2x.py", line 54, in perceptilabs.lwcore.strategies.tf2x.Tf2xInnerStrategy._make_results
File "perceptilabs/lwcore/strategies/tf2x.py", line 45, in perceptilabs.lwcore.strategies.tf2x.Tf2xInnerStrategy._make_results
File "perceptilabs/layers/legacy.py", line 19, in perceptilabs.layers.legacy.Tf2xLayer.__call__
File "/home/tarek/.local/lib/python3.8/site-packages/tensorflow/python/keras/engine/base_layer.py", line 1030, in __call__
  outputs = call_fn(inputs, *args, **kwargs)
File "<rendered-code: 1636785172006 [LayerCustom]>", line 5, in call
```
@robertl advised me to remove this line of code:

> It looks like this line of code still exists in your component, you might want to remove it: `raise TypeError("Missing input connection 'input'")`

I removed that line, checked again, and one error is gone. Thank you @robertl!
Error 2:
However, I am still getting one error:
Error in layer targets. Expected shape (10,) but got None
During training the model is saved via PyTorch; I paste the code below:
```python
model = utils.get_model(self.model)
torch.save({'model_state_dict': model, },
           os.path.join(self.checkpoint, 'model_{}.pth.tar'.format(task_id)))
```
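For reference, my understanding is that a checkpoint written this way would normally be read back with `torch.load` rather than the Keras API. A minimal sketch of what I mean (the path is hypothetical, just following the format string in my save call; the `'model_state_dict'` key mirrors the dict above):

```python
import torch

# Hypothetical checkpoint path, following the format string in my save call
checkpoint_path = 'model_0.pth.tar'

# torch.load reads back the dict that torch.save wrote
checkpoint = torch.load(checkpoint_path, map_location='cpu')

# In my save call the 'model_state_dict' key appears to hold the full
# model object, since that is what was passed to torch.save
model = checkpoint['model_state_dict']
model.eval()
```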
My question is: if I am saving the model in this way, will it affect how I am loading the model (in my case, via Keras)?
As far as I know, if I save the model via the Keras API I should load it via Keras, and if I save it via TensorFlow I should load it via the TensorFlow API. I read this in this documentation:
https://www.tensorflow.org/tutorials/distribute/save_and_load
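Just to make sure I am reading that tutorial right, this is how I understand the two TensorFlow-side save/load pairs (a minimal sketch; the tiny model and the directory are made up for illustration):

```python
import tensorflow as tf
from tensorflow import keras

# Tiny stand-in model, only to illustrate the two APIs
model = keras.Sequential([keras.layers.Dense(10, input_shape=(4,))])

saved_model_dir = 'path/to/saved_model'  # hypothetical directory

# Keras API: save and load back a full Keras model object
model.save(saved_model_dir)
restored_keras = keras.models.load_model(saved_model_dir)

# Lower-level tf.saved_model API: what comes back is a generic
# SavedModel object, not a keras.Model instance
tf.saved_model.save(model, saved_model_dir)
restored_tf = tf.saved_model.load(saved_model_dir)
```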
In your view, is this the reason it is failing to load properly in PerceptiLabs, or is there another reason?
Thank you in advance for your assistance!