Inference with dictionary of pytorch tensors #147
-
I'm trying to use FTorch to call a PyTorch model from Fortran 90. The PyTorch model wants a dictionary of torch tensors; for example, inference in Python looks like this:

```python
pos = torch.tensor([[1.0, 1.1, 1.2], ...])
types = torch.tensor([0, 1, 0])
edges = torch.tensor([[0, 0, 1, 1, 2, 2], ...])
input = {"pos": pos, "types": types, "edges": edges}
output = model(input)
```

Can I do this with FTorch? There seem to be various workarounds for building dictionaries or dictionary-like objects in Fortran 90. Thanks
Replies: 4 comments 8 replies
-
Hi @jonathan-booth

At the moment, if you have a model that takes multiple input tensors, we facilitate this by providing them as individual items rather than as a single dictionary of tensors. You can see an example of this in our Multiple inputs and outputs worked example, which takes two input tensors (they don't need to be the same shape or size). In Python these are declared as two separate inputs to the model's `forward` method.

So my advice, unless you are for some reason restricted to using a dict, would be to write your net architecture as taking each input tensor as a distinct input. This would then be possible to call using FTorch as in the above example.

If this helps please let me know, and if not we would be happy to take a closer look at some code.
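As a rough sketch of this advice (the class name, input names, and toy forward body below are hypothetical, not FTorch's or your model's real API), a net can take each tensor as a distinct argument and be saved as TorchScript for FTorch to load:

```python
import torch

# Hypothetical sketch: a net whose forward takes each tensor as a
# distinct argument, so FTorch can pass each Fortran array separately.
class MultiInputNet(torch.nn.Module):
    def forward(self, pos: torch.Tensor, types: torch.Tensor) -> torch.Tensor:
        # Combine the inputs however the real model would; this is a toy body.
        return pos.sum(dim=-1) + types.float()

model = MultiInputNet()
# Trace with example inputs and save as TorchScript for FTorch to load.
scripted = torch.jit.trace(model, (torch.rand(4, 3), torch.zeros(4)))
scripted.save("multi_input_net.pt")
```

On the Fortran side each input then becomes its own `torch_tensor`, passed to the forward call as an array of tensors rather than a dict.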
-
Thanks. I'm calling a model made by the allegro package: https://github.com/mir-group/allegro, so I don't really have control over how it takes its input, although maybe I could modify it so it takes its input as an array of tensors, like in the example you gave. If this doesn't prove to be possible, is there a way to get FTorch to pass a dict of torch tensors to the model? Thanks
-
@jonathan-booth Thanks, that helps. Looking at the allegro code, they have a lot of infrastructure built around a Torch neural net, so you would probably have to do some surgery to extract the net so that you could save it as TorchScript. (I'm looking at https://github.com/mir-group/allegro/blob/main/allegro/nn/_fc.py)

If you do this then I would also take the opportunity to modify it so that you pass the inputs in as individual torch tensors. The beauty of FTorch is that it bypasses the Python runtime to use the more efficient C++ implementation of Torch.

However, it is not clear to me whether you can save the allegro model to TorchScript or not. If this is not possible you might need to look at the more 'traditional' methods for coupling Python to Fortran.
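One possible shape for that surgery (a minimal sketch; `DictNet`, `Wrapper`, and the key names are hypothetical stand-ins, not allegro's real classes) is a thin wrapper module that exposes distinct tensor arguments and rebuilds the dict internally, then scripts the whole thing with `torch.jit.script`:

```python
from typing import Dict

import torch

class DictNet(torch.nn.Module):
    # Stand-in for a model that expects a dict of tensors, as allegro's does.
    def forward(self, data: Dict[str, torch.Tensor]) -> torch.Tensor:
        return data["pos"].sum() + data["types"].float().sum()

class Wrapper(torch.nn.Module):
    # Exposes distinct tensor arguments that FTorch can supply, and
    # rebuilds the dict internally before calling the wrapped model.
    def __init__(self, inner: torch.nn.Module):
        super().__init__()
        self.inner = inner

    def forward(self, pos: torch.Tensor, types: torch.Tensor) -> torch.Tensor:
        return self.inner({"pos": pos, "types": types})

scripted = torch.jit.script(Wrapper(DictNet()))
scripted.save("wrapped_model.pt")
```

Note the `Dict[str, torch.Tensor]` annotation on the inner `forward`: `torch.jit.script` needs it, since unannotated arguments default to `Tensor`.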
-
OK, it all works now; the issue with the output was my error. Thanks a lot for your help!
The method of loading the model and modifying the input has worked. I'm now trying to modify the output as well, because the output is a `Dict[str, Tensor]`. This is proving difficult. Does FTorch work with models that have been scripted with `torch.jit.script`? I think that's the only way I can save my modified model. I do the following: