docs: minor improvements to skeletonization tutorial
schlegelp committed Sep 26, 2024
1 parent b5bb351 commit 5a902c3
Showing 3 changed files with 26 additions and 14 deletions.
Binary file added docs/_static/lm_tut/C1.gif
Binary file modified docs/_static/lm_tut/all_skeletons.png
40 changes: 26 additions & 14 deletions docs/examples/0_io/zzz_tutorial_io_05_skeletonize.py
@@ -63,7 +63,7 @@
5. Go to "Image" -> "Type" -> "8-bit" to convert the image to 8-bit (optional but recommended)
6. Save via "File" -> "Save As" -> "NRRD" and save the file as `neuron.nrrd`
![Z stack](../../../_static/lm_tut/image_stack.png)
![Z stack](../../../_static/lm_tut/C1.gif)
## Extracting the Skeleton
@@ -85,6 +85,7 @@
"neuron.nrrd"
)

# %%
# Next, we need to find some sensible threshold to binarize the image. This is not strictly
# necessary (see the note further down) but at least for starters this is more intuitive.
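#
# A minimal sketch of such a binarization (the variable names `img`/`header` and
# the cutoff of `20` are assumptions for illustration - pick a threshold that
# works for your own data):

import nrrd
import numpy as np

img, header = nrrd.read("neuron.nrrd")  # image data + metadata
mask = (img >= 20).astype(np.uint8)     # 1 = neuron, 0 = background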

@@ -137,27 +138,38 @@
# Collect some statistics
stats = cc3d.statistics(labels)

print("Labeled", N, "connected components")
print("Per-label voxel counts:", stats["voxel_counts"])
print("Total no. of labeled componenents:", N)
print("Per-label voxel counts:", np.sort(stats["voxel_counts"])[::-1])
print("Label IDs:", np.argsort(stats["voxel_counts"])[::-1])

# %%
# Note how the first label has suspiciously many voxels? That's because this is the background label.
# ```
# Total no. of labeled components: 37836
# Per-label voxel counts: [491996140 527374 207632 ... 1 1 1]
# Label IDs: [ 0 6423 6091 ... 22350 22351 18918]
# ```
#
# Note how label `0` has suspiciously many voxels? That's because this is the background label.
# We need to make sure to exclude it from the skeletonization process:
to_skeletonize = np.arange(1, N + 1)
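
# Alternatively, you could pre-select only reasonably sized components instead
# of passing every label - a sketch, assuming a (hypothetical) minimum size of
# 100 voxels:
counts = stats["voxel_counts"]                        # indexed by label ID
to_skeletonize = np.flatnonzero(counts >= 100)        # labels with >= 100 voxels
to_skeletonize = to_skeletonize[to_skeletonize != 0]  # drop the background label
# (This overlaps with the `dust_threshold` parameter explained below.)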


# %%
# Now we can run the actual skeletonization. There are a number of parameters that are worth tweaking:
# Now we can run the actual skeletonization!
#
# !!! note "Skeletonization paramters"
# There are a number of parameters that are worth explaining
# first because you might want to tweak them for your data:
#
# - `scale` & `const` control how detailed your skeleton will be: lower = more detailed but more noise
# - `anisotropy` controls the voxel size - see the `header` dictionary for the voxel size of our image
# - `dust_threshold` controls how small connected components are skipped
# - `object_ids` is a list of labels to process (remember that we skipped the background label)
# - `max_path` if this is set, the algorithm will only process N paths in each skeleton - you can use
# this to finish early (e.g. for testing)
# - `scale` & `const`: control how detailed your skeleton will be: lower = more detailed but more noise
# - `anisotropy`: controls the voxel size - see the `header` dictionary for the voxel size of our image
# - `dust_threshold`: controls how small connected components are skipped
# - `object_ids`: a list of labels to process (remember that we skipped the background label)
# - `max_path`: if this is set, the algorithm will only process N paths in each skeleton - you can use
# this to finish early (e.g. for testing)
#
# See the [`kimimaro` repository](https://github.com/seung-lab/kimimaro) for a detailed explanation
# of the parameters!
#     See the [`kimimaro` repository](https://github.com/seung-lab/kimimaro) for a detailed explanation
#     of the parameters!
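#
# To make the bullet points above concrete: a plausible invocation might look
# like the sketch below. All values here are assumptions for illustration - the
# actual call follows below. Note that in `kimimaro`'s API, `scale` and `const`
# go inside the `teasar_params` dictionary:
#
# ```python
# skels = kimimaro.skeletonize(
#     labels,
#     teasar_params={"scale": 1.5, "const": 300},  # lower = more detail, more noise
#     object_ids=list(to_skeletonize),  # skip the background label
#     dust_threshold=500,               # drop components with < 500 voxels
#     anisotropy=(1, 1, 1),             # replace with the voxel size from `header`
#     parallel=0,                       # <= 0 means use all available cores
#     progress=True,
# )
# ```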

skels = kimimaro.skeletonize(
labels,
@@ -206,7 +218,7 @@
#
# Zooming in on `6091` you will see that it wasn't fully skeletonized: some of the branches are missing
# and others are disconnected. That's either because our threshold for the mask was too high (this neuron
# was fainter than the other) and/or we dropped too many fragments during the skeletonization process
# had a weaker signal than the other) and/or we dropped too many fragments during the skeletonization process
# (see the `dust_threshold` parameter).
#
# ![zoom in](../../../_static/lm_tut/zoom_in.png)
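#
# One way to recover such a neuron is to re-run the skeletonization for just
# that label with a more permissive `dust_threshold` - a sketch (the label ID
# `6091` comes from the output above; the remaining values are assumptions):

skel_6091 = kimimaro.skeletonize(
    labels,
    teasar_params={"scale": 1.5, "const": 300},
    object_ids=[6091],  # only re-process the problematic neuron
    dust_threshold=0,   # keep even the smallest fragments
    progress=True,
)[6091]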
