NiftyNet Dev Meeting 15th April 2019
Attendees: Wenqi, Felix, Tom Va., Jorge, Pedro, Ben, Carole
- Discussion around ways to cut between modular parts - particular issue of the default activation layer tied to the loss function, and the problem of the associated documentation
- Discussion around the move towards TensorFlow 2.0 - details to be discussed at a dedicated meeting on 10th May 2019, 2pm - input from Tom Va. Changes affect the application driver (no session any more), the collection of variables (no call to get_collection), and the use of tf.layers instead of the current template op (see the sketch after this list).
- Move towards a more abstracted use
- Need to integrate as much as possible of the existing features recently developed
- Need for detailed "How to" pages with examples of actual code implementation for user-dependent features
- Need to find ways to smooth the learning curve for new adopters of NiftyNet
- Need to combine commented contrib systems with recently published applications - ALL to act on this
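
As a rough illustration of the TensorFlow 2.0 changes mentioned above, the sketch below contrasts the current session/collection-based pattern with the eager, layer-object-based style. It is a minimal example only, not NiftyNet code; the model architecture and the `train_step` function are placeholders.

```python
import tensorflow as tf

# TF 1.x style (current driver), shown as comments:
#   logits = build_network(images)                                  # graph construction
#   variables = tf.get_collection(tf.GraphKeys.TRAINABLE_VARIABLES)
#   with tf.Session() as sess:
#       sess.run(train_op)

# TF 2.x style: eager execution, no Session, variables tracked by the
# layer/model objects themselves, and tf.keras.layers replacing template ops.
model = tf.keras.Sequential([
    tf.keras.layers.Conv3D(16, 3, padding='same', activation='relu'),
    tf.keras.layers.Conv3D(1, 1, padding='same'),
])
optimizer = tf.keras.optimizers.Adam(1e-3)

@tf.function
def train_step(images, labels):
    with tf.GradientTape() as tape:
        predictions = model(images, training=True)
        loss = tf.reduce_mean(tf.square(labels - predictions))
    # model.trainable_variables replaces the call to tf.get_collection(...)
    gradients = tape.gradient(loss, model.trainable_variables)
    optimizer.apply_gradients(zip(gradients, model.trainable_variables))
    return loss
```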
Current assignments are based on the main developers of the associated features.
- Hemis - Tom
- VGG - Felix
- Guotai's BRATS network
- Gaussian Sampler - Ben
- Stochastic filter group (Masking kernels) - Felix
- Probability layer (use different probability distribution) - Felix
- Histogram regression to classification - Felix
- MR Augmentation: motion, blurring, noise, RF spike - Richard
- Uniform sampler with pad authorization - Ben
- CSV reader - Carole/Tom
- exists as a module but is still difficult to use directly (requires many modifications)
- Performance-based gradient adjustments (see the callback sketch at the end of these notes)
- Learning rate decay
- Performance-based early stopping
- Reduced learning rate on plateau
- Online update
- Whole volume validation (still needs review and testing)
- Customised application using CSV reader - Pedro / Ze upon publication
- Multi-task - Felix upon publication
- Reinforcement learning - Kerstin upon publication
- Distillation loss - Irme upon publication
- RCNN/object detection - Carole
- Counting - Zach upon publication
- VAE modification - Reuben upon publication
- Quality Driven Loss - Zach
- Cosine similarity gradient enforcement - Irme
- Cosine loss for direction regression - Carole
- Smooth L1 - Carole
- Volume consistency - Carole
- Volume existence - Carole
- Variability loss - Carole
- Weight batch - Carole
- Model introspection - with standalone demo Jupyter notebook - Felix
- Multiple outputs - Demo - Felix/Carole
- Histogram viewer
- Loading a model from a different graph - Tom
- Event handler - Tom
- Multi output - Carole
- Custom placeholders - Tom
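
For the performance-based training adjustments listed above (learning rate decay, early stopping, reduced learning rate on plateau), the sketch below expresses the ideas with standard tf.keras callbacks. This is illustrative only, not the planned NiftyNet implementation; the monitored metric and patience values are assumptions.

```python
import tensorflow as tf

callbacks = [
    # Stop training once the monitored validation loss stops improving.
    tf.keras.callbacks.EarlyStopping(monitor='val_loss', patience=10,
                                     restore_best_weights=True),
    # Halve the learning rate when the validation loss plateaus.
    tf.keras.callbacks.ReduceLROnPlateau(monitor='val_loss', factor=0.5,
                                         patience=5, min_lr=1e-6),
]

# Hypothetical usage with a compiled tf.keras model:
# model.fit(train_dataset, validation_data=val_dataset,
#           epochs=100, callbacks=callbacks)
```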