tf: Scope model under a model identifier (WIP) #54

Open
wants to merge 1 commit into master
Conversation

julien-c (Contributor) commented Feb 7, 2017

So what I want to be able to do is:

from chatbot import chatbot  # DeepQA's chatbot module

botFoo = chatbot.Chatbot()
botFoo.main(['--modelTag', "foo", '--test', 'daemon'])

botBar = chatbot.Chatbot()
botBar.main(['--modelTag', "bar", '--test', 'daemon'])

This does not work right now because there are collisions between variable names in the TF symbolic graph. This first commit is, I think, a step in the right direction, but it is not enough: in model.py, where we rely on the tf.nn.seq2seq.embedding_rnn_seq2seq implementation, something must still not be properly scoped.
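For illustration, here is a minimal, standalone reproduction of this kind of collision (nothing DeepQA-specific): two models built in the same default graph try to create a variable with the same name, which raises a ValueError unless reuse is explicitly requested.

import tensorflow as tf

# Toy illustration of the name collision: both "models" ask for a variable
# called "embedding" in the same (default) graph.
def build_toy_model():
    return tf.get_variable('embedding', shape=[10, 4])

build_toy_model()       # creates "embedding"
try:
    build_toy_model()   # same name, same graph -> ValueError
except ValueError as e:
    print(e)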

What do you guys think?

eschnou (Contributor) commented Feb 7, 2017

Actually, there is a scope argument on the embedding_rnn_seq2seq method, so that might be what you need to scope your two models properly.

See here in the TF code:
https://github.com/tensorflow/tensorflow/blob/r0.12/tensorflow/python/ops/seq2seq.py#L292

A specific model scope could thus be added here in DeepQA:
https://github.com/Conchylicultor/DeepQA/blob/master/chatbot/model.py#L169

I haven't tried it, but that seems like the right place to look. Also note that the current implementation of loading pre-trained embeddings relies on a fixed scope name, so it would need to be adapted if you change the scoping:
https://github.com/Conchylicultor/DeepQA/blob/master/chatbot/chatbot.py#L390
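If that is indeed the fix, a rough sketch of what the change around model.py#L169 might look like. This is only a hypothetical illustration: the parameter names encoder_inputs, decoder_inputs, cell, vocab_size, embedding_size and model_tag are placeholders rather than DeepQA's actual attributes, and only the scope= keyword comes from the TF r0.12 embedding_rnn_seq2seq signature.

import tensorflow as tf

# Hypothetical sketch: the argument names below are placeholders, not DeepQA's
# real attributes. The point is passing a per-model scope to the seq2seq call.
def build_scoped_seq2seq(encoder_inputs, decoder_inputs, cell, vocab_size,
                         embedding_size, model_tag):
    decoder_outputs, decoder_states = tf.nn.seq2seq.embedding_rnn_seq2seq(
        encoder_inputs,
        decoder_inputs,
        cell,
        num_encoder_symbols=vocab_size,
        num_decoder_symbols=vocab_size,
        embedding_size=embedding_size,
        scope='model_{}'.format(model_tag),  # one variable scope per modelTag
    )
    return decoder_outputs, decoder_states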

julien-c (Contributor, Author) commented Feb 10, 2017

Thanks @eschnou! I will look into it tomorrow and hopefully have a mergeable branch by then.

Conchylicultor (Owner) commented Mar 29, 2017

Since both chatbots would use different sessions, I think they should live on different graphs. If you simply change the variable scope, then because scopes are global to the default graph, the second session would also allocate memory for the first chatbot's variables. I haven't tried it, but it would look more like:

import tensorflow as tf
from chatbot import chatbot  # DeepQA's chatbot module

g1 = tf.Graph()
with g1.as_default():
    botFoo = chatbot.Chatbot()
    botFoo.main(['--modelTag', "foo", '--test', 'daemon'])

g2 = tf.Graph()
with g2.as_default():
    botBar = chatbot.Chatbot()
    botBar.main(['--modelTag', "bar", '--test', 'daemon'])
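As a side note, a tf.Session binds to whichever graph is the default at construction time, so each bot's internal session would attach to its own graph as long as it is created inside the corresponding with block. A standalone toy illustration (the variables here are made up, not DeepQA code):

import tensorflow as tf

g1 = tf.Graph()
with g1.as_default():
    # Variables and the session both attach to g1 while it is the default graph.
    v_foo = tf.Variable(0, name='counter')
    sess_foo = tf.Session(graph=g1)

g2 = tf.Graph()
with g2.as_default():
    v_bar = tf.Variable(0, name='counter')  # same name, no collision: different graph
    sess_bar = tf.Session(graph=g2)

# Each session only sees the variables of its own graph.
sess_foo.run(v_foo.initializer)
sess_bar.run(v_bar.initializer)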
