
Running the example-folder code and saving the models in SavedModel format: DSSM saves successfully this way, but YoutubeDNN, SDM, and MIND fail with errors #46

Open
Outstandingwinner opened this issue Jan 17, 2021 · 9 comments


Outstandingwinner commented Jan 17, 2021

Python 3.6.5, TensorFlow 2.2
I want to save the models in SavedModel format. DSSM saves successfully, but YoutubeDNN, SDM, and MIND all raise errors. The embedding sub-models are built as in the examples:
user_embedding_model = Model(inputs=model.user_input, outputs=model.user_embedding)
item_embedding_model = Model(inputs=model.item_input, outputs=model.item_embedding)

Saving with keras.models.save_model:

from tensorflow import keras
keras.models.save_model(user_embedding_model,"./models")

The error is:
\Python36\site-packages\tensorflow\python\keras\saving\saved_model\save_impl.py", line 566, in call_and_return_conditional_losses
return layer_call(inputs, *args, **kwargs), layer.get_losses_for(inputs)
TypeError: call() missing 1 required positional argument: 'state'

Saving with tf.saved_model.save:

tf.saved_model.save(user_embedding_model,"./models")
The error is identical:
TypeError: call() missing 1 required positional argument: 'state'

Saving as h5 first, then loading it back and re-saving as a SavedModel:

user_embedding_model.save("./models/models.h5")
pre_model = tf.keras.models.load_model("./models/models.h5")
pre_model.save("./models/output")
The error is:
\Python36\site-packages\tensorflow\python\keras\utils\generic_utils.py", line 321, in class_and_config_for_serialized_keras_object
raise ValueError('Unknown ' + printable_module_name + ': ' + class_name)
ValueError: Unknown layer: NoMask
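
For reference, the h5 reload error can usually be avoided by passing the libraries' registry of custom layers; a minimal sketch, assuming the installed deepmatch version exports deepmatch.layers.custom_objects (recent releases do):

from tensorflow import keras
from deepmatch.layers import custom_objects  # registry that includes NoMask and the other custom layers

# Passing the registry lets Keras resolve NoMask during deserialization.
pre_model = keras.models.load_model("./models/models.h5", custom_objects)
pre_model.save("./models/output")  # may still hit the call()/'state' error above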

In the deepmatch example code, DSSM saves successfully while the other models fail. Online serving requires the model in SavedModel format rather than h5. Could you please look into this problem?

shuDaoNan9 commented Feb 3, 2021

So you ran into this too. Did you try saving each model in pb format one by one? It looks like not many people call these models from Java; most go through TensorFlow Serving for real-time recommendation instead.

@shuDaoNan9

Better to port the code to pure TF2 yourself and then save it directly as a SavedModel. Chinese New Year is almost here, so there's time to work on it.

@Outstandingwinner

> So you ran into this too. Did you try saving each model in pb format one by one? It looks like not many people call these models from Java; most go through TensorFlow Serving for real-time recommendation instead.

Whether the model is called from Java or served with tensorflow-serving, the pb file is needed either way.

@Outstandingwinner

> Better to port the code to pure TF2 yourself and then save it directly as a SavedModel. Chinese New Year is almost here, so there's time to work on it.

This code already runs successfully on TF2 as it is.

@shuDaoNan9

> > So you ran into this too. Did you try saving each model in pb format one by one? It looks like not many people call these models from Java; most go through TensorFlow Serving for real-time recommendation instead.
>
> Whether the model is called from Java or served with tensorflow-serving, the pb file is needed either way.

After training you can export the embedded feature vectors separately, then compute the user-item vector scores and take the top-N in another program yourself; in that case you don't even need to save the model itself. See the sketch below.
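
A minimal sketch of that workaround, following the predict pattern from the DeepMatch example scripts (user_model_input and item_model_input are assumed to be the feature dicts built there):

import numpy as np

# Export the vectors once after training; no model file is needed at serving time.
user_embs = user_embedding_model.predict(user_model_input, batch_size=2 ** 12)
item_embs = item_embedding_model.predict(item_model_input, batch_size=2 ** 12)
np.save("./models/user_embs.npy", user_embs)
np.save("./models/item_embs.npy", item_embs)

# Elsewhere (any language or runtime): score by inner product and take the top-N.
scores = user_embs @ item_embs.T            # shape (n_users, n_items)
topn = 50                                   # must be smaller than n_items
top_items = np.argpartition(-scores, topn, axis=1)[:, :topn]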

@shuDaoNan9

> > Better to port the code to pure TF2 yourself and then save it directly as a SavedModel. Chinese New Year is almost here, so there's time to work on it.
>
> This code already runs successfully on TF2 as it is.

It is not pure TF2; code kept compatible with both TF1 and TF2 is still different from pure TF2.

@chenkejin

This problem is mainly because the layers in the core file cannot be created as a plain Python list; change it to a dict, and the dict keys must be strings. Modify deepctr/layers/core.py:

self.activation_layers = {}
for i in range(len(self.hidden_units)):
    self.activation_layers[str(i)] = activation_layer(self.activation)
# self.activation_layers = [activation_layer(self.activation) for _ in range(len(self.hidden_units))]
if self.output_activation:
    # self.activation_layers[-1] = activation_layer(self.output_activation)
    self.activation_layers[str(len(self.hidden_units) - 1)] = activation_layer(self.output_activation)

guixianjin commented Jun 28, 2022

> [quotes @chenkejin's dict-based fix above]

Confirmed, this works. One addition: also change the corresponding lookups in deepctr/layers/core.py:

                # fc = self.activation_layers[i](fc, training=training)
                fc = self.activation_layers[str(i)](fc, training=training)
                # fc = self.activation_layers[i](fc)
                fc = self.activation_layers[str(i)](fc)

Then the model can be saved in pb format with tf.saved_model.save().
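
For a quick sanity check after the patch, one can reload the export and inspect its default serving signature; a minimal sketch (input and output names depend on your feature columns):

import tensorflow as tf

tf.saved_model.save(user_embedding_model, "./models/output")

# Reload the SavedModel and look at the serving signature before shipping it.
loaded = tf.saved_model.load("./models/output")
infer = loaded.signatures["serving_default"]
print(infer.structured_input_signature)
print(infer.structured_outputs)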

marzooq-unbxd commented Oct 28, 2022

Was this solved for everyone? I still can't fix this for SDM; is the change only required in deepctr? Could this have any impact on DynamicMultiRNN in DeepMatch?

cell_list.append(single_cell_residual)
