Suggestion: Add binary data storage options for EGS_ScoringSingle and EGS_ScoringArray #688
Replies: 3 comments
-
This would be a useful addition! Much of the work has already been done, in terms of comparing binary formats - see PR #405.
-
I'd like to add my voice to @MartinMartinov's. I also suspect that I/O is a major bottleneck in certain types of simulations. I too came around to using nbatch = 1, and I have likewise noticed the bottleneck when all parallel jobs write out their results at approximately the same time. This realization made me immediately suspicious of Monte Carlo timing studies (especially GPU vs. CPU) that do not explicitly separate radiation transport from all other tasks (initialization, I/O, etc.).
-
Another mitigation technique is to work in RAM, for example by having all I/O happen in /dev/shm, and then copying only the final results over to a hard drive.
-
I've been running simulations computing 3ddose files in 512x512x117 phantoms, where the actual particle simulation runs pretty quickly, but all the storeState/setState calls (I suspect) bottleneck the simulation significantly. I'm currently setting nbatch to 1 to avoid too many state calls, which works well interactively, but when running several simulations in parallel accessing the same hard drive, it is still a huge efficiency killer when they all write their egsdat files at once (once again, just a suspicion).
Would it be possible to write a parallel set of store/setState methods which save data in binary format (i.e., storeBinState and setBinState) that user codes with massive arrays can use instead? I don't mind giving it a try myself, I'm just not sure how well it would fit into the current architecture, so I wanted to post here first.