Commit b3d069d0 authored by Philipp Schubert

more docstrings, including examples for `SuperSegmentationDataset` and `SegmentationDataset`

parent 69d15454
......@@ -22,7 +22,7 @@ If you use parts of this code base in your academic projects, please cite the co
Documentation
-------------
The documentation for the refactored version is still work-in-progress and can be found [here](docs/doc.md). Alternatively see the latest [readthedocs build](https://syconn.readthedocs.io/en/latest/).
The documentation for the refactored version is still work in progress and can be found [here](docs/doc.md). Alternatively, and for API docs, please refer to the latest [readthedocs build](https://syconn.readthedocs.io/en/latest/).
For SyConn v1, please refer to the old [documentation](https://structuralneurobiologylab.github.io/SyConn/documentation/). We also present more general information about SyConn on our [Website](https://structuralneurobiologylab.github.io/SyConn/).
......
......@@ -53,7 +53,7 @@ m2r_parse_relative_links = True
m2r_anonymous_references = False
m2r_disable_inline_math = False
napoleon_include_private_with_doc = True
# napoleon_include_private_with_doc = True
# The encoding of source files.
#
......
# Extracting connectivity
Contact sites are the basis for synaptic classification. Therefore, contact sites need to be
combined with the synapse `SegmentationObjects` to conn `SegmentationObjects` and then further
combined with the synapse ``SegmentationObjects`` into ``conn`` ``SegmentationObjects`` and then further
classified as synaptic or non-synaptic using a Random Forest Classifier (RFC).
The code is in `syconn.extraction.cs_processing_steps`, `syconn.proc.sd_proc` and `syconn.proc.ssd_proc`.
The exection script is located at `SyConn/scrips/syns/syn_gen.py`.
The execution script is located at ``SyConn/scripts/syns/syn_gen.py``.
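Conceptually, the RFC step is ordinary supervised classification; below is a minimal sketch with scikit-learn (the feature vectors and labels are random placeholders, not SyConn's actual feature set):

    # Sketch only, not SyConn's implementation.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    X = np.random.rand(100, 4)        # one feature vector per conn object (placeholder)
    y = np.random.randint(0, 2, 100)  # 1 = synaptic, 0 = non-synaptic (placeholder)
    rfc = RandomForestClassifier(n_estimators=100).fit(X, y)
    pred = rfc.predict(np.random.rand(10, 4))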
## Prerequisites
* SegmentationDataset of [aggregated contact sites (`syn_ssv`)](contact_site_extraction.md)
* [Synapse type](synapse_type.md) predictions (TODO: add here)
* Labelled cellular compartments (see [neuron analysis](neuron_analysis.md)) (WIP)
* SegmentationDataset of type ``syn_ssv``, [see contact site extraction](contact_site_extraction.md)
* [Synapse type](synapse_type.md) predictions
* Labeled cellular compartments (see [neuron analysis](neuron_analysis.md))
## Classifying synapse objects
TODO: re-work this analysis part
*TODO: refactor this part*
The previously generated [`syn_ssv` SegmentationObjects](contact_site_extraction.md) are in the following used to aggregate synaptic properties.
The previously generated [syn_ssv SegmentationObjects](contact_site_extraction.md) are used in the following to aggregate synaptic properties.
Other objects such as vesicle clouds and mitochondria are mapped by proximity.
Mapping these objects helps to improve the features used for classifying the conn objects.
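The actual mapping routines live in the modules named above; as a rough sketch of the proximity idea (coordinates and the 500 nm radius below are made up for illustration):

    # Sketch only: nearest-neighbour mapping with a KD-tree, not SyConn's exact code.
    import numpy as np
    from scipy.spatial import cKDTree

    syn_coords = np.random.rand(50, 3) * 1e4    # hypothetical syn_ssv coordinates in nm
    mito_coords = np.random.rand(200, 3) * 1e4  # hypothetical mitochondrion coordinates in nm
    dists, idx = cKDTree(mito_coords).query(syn_coords, distance_upper_bound=500.)
    mapped = idx[np.isfinite(dists)]  # mitochondria within 500 nm of a synapse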
......@@ -25,24 +25,23 @@ because it can incorporate other relevant features, such as vesicles clouds in p
    cps.create_syn_gt(sd_syn_ssv, path_to_gt_kzip)
creates the ground truth for the RFC and also trains and stores the classifier. Then, the `syn_ssv` `SegmentationObjects` can be classified with
creates the ground truth for the RFC and also trains and stores the classifier. Then, the ``SegmentationObjects`` of type ``syn_ssv`` can be classified with
    cps.classify_conn_objects(working_dir, qsub_pe=my_qsub_pe, n_max_co_processes=100)
    cps.classify_conn_objects(working_dir)
## Collecting directionality information (axoness)
`syn_ssv` `SegmentationObjects` can acquire information about the "axoness" of both partners around the synapse. This allows
a judgement about the direction of the synapse. To collect this information from the `ssv` partners do
## Cellular compartment information
Each ``syn_ssv`` ``SegmentationObject`` will be assigned information about the "axoness" of both cellular partners forming the synapse. To collect this information from the ``ssv`` partners, run
    cps.collect_axoness_from_ssv_partners(wd, qsub_pe=my_qsub_pe, n_max_co_processes=100)
    cps.collect_axoness_from_ssv_partners(wd)
The axoness prediction key used here can currently only be changed in the code directly (see `cps._collect_axoness_from_ssv_partners_thread`).
The axoness prediction key used here can currently only be changed in the code directly (see ``cps._collect_axoness_from_ssv_partners_thread``).
## Writing the connectivity information to the SuperSegmentationDataset
For convenience and efficiency, the connectivity information created in the last step can be written to the `SuperSegmentationDataset`.
For convenience and efficiency, the connectivity information created in the last step can be written to the ``SuperSegmentationDataset``.
    ssd_proc.map_synaptic_conn_objects(ssd, qsub_pe=my_qsub_pe, n_max_co_processes=100)
    ssd_proc.map_synaptic_conn_objects(ssd)
This enables direct look-ups at the level of SSVs, without having to go back to the SV objects, which would add delays.
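For example, a minimal look-up sketch (the working directory follows the example elsewhere in the docs; the SSV ID is a placeholder):

    from syconn.reps.super_segmentation import SuperSegmentationDataset

    wd = '~/SyConn/example_cube1/'  # placeholder working directory
    ssd = SuperSegmentationDataset(working_dir=wd)
    # direct look-up at the SSV level; no detour via the underlying SV objects
    ssv = ssd.get_super_segmentation_object(42)  # placeholder SSV ID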
......
......@@ -83,7 +83,7 @@ attr_dict = sj_obj.attr_dict
```
sj_obj.mesh_caching = False
sj_obj.voxel_caching = False
sj_obj.vskeleton_caching = False
sj_obj.skeleton_caching = False
```
and the cache can be cleared by
......
......@@ -174,9 +174,10 @@ def VoxelStorage(inp, **kwargs):
    obj = VoxelStorageClass(inp, **kwargs)
    if 'meta' in obj._dc_intern:  # TODO: Remove asap as soon as we switch to VoxelStorageDyn
        obj = VoxelStorageDyn(inp, **kwargs)
    else:
        log_backend.warning('VoxelStorage is deprecated. Please switch to'
                            ' VoxelStorageDyn.')
    # # TODO: activate as soon as synapse-detection pipeline is refactored.
    # else:
    #     log_backend.warning('VoxelStorage is deprecated. Please switch to'
    #                         ' VoxelStorageDyn.')
    return obj
......
# -*- coding: utf-8 -*-
# SyConn - Synaptic connectivity inference toolkit
#
# Copyright (c) 2016 - now
# Max-Planck-Institute for Medical Research, Heidelberg, Germany
# Authors: Sven Dorkenwald, Philipp Schubert, Jörgen Kornfeld
import sys
try:
    import cPickle as pkl
except ImportError:
    import pickle as pkl
from syconnfs.representations import super_segmentation_helper as ssh
path_storage_file = sys.argv[1]
path_out_file = sys.argv[2]

with open(path_storage_file, 'rb') as f:
    args = []
    while True:
        try:
            args.append(pkl.load(f))
        except EOFError:
            break

out = ssh.export_skeletons(args)

with open(path_out_file, "wb") as f:
    pkl.dump(out, f)
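The script above collects its arguments as consecutive pickle dumps until hitting EOFError; a minimal sketch of the matching writer side (file name and argument values are placeholders):

    import pickle as pkl

    args_to_send = [[101, 102], "0", {"sv": 0}]  # placeholder argument list
    with open("storage.pkl", "wb") as f:
        for arg in args_to_send:
            pkl.dump(arg, f)  # one dump per argument; the reader loops until EOFError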
......@@ -11,7 +11,8 @@ try:
    import cPickle as pkl
except ImportError:
    import pickle as pkl
from syconn.reps.super_segmentation_dataset import create_sso_skeletons_thread
from syconn.reps.super_segmentation_helper import create_sso_skeletons_thread
from syconn.reps.super_segmentation import SuperSegmentationDataset
path_storage_file = sys.argv[1]
path_out_file = sys.argv[2]
......@@ -24,7 +25,15 @@ with open(path_storage_file, 'rb') as f:
        except EOFError:
            break
create_sso_skeletons_thread(args)
ssv_ids = args[0]
version = args[1]
version_dict = args[2]
working_dir = args[3]
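# re-instantiate the dataset inside the worker process, then load the SSV objects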
ssd = SuperSegmentationDataset(working_dir=working_dir, version=version,
                               version_dict=version_dict)
ssvs = ssd.get_super_segmentation_object(ssv_ids)
create_sso_skeletons_thread(ssvs)
with open(path_out_file, "wb") as f:
......
......@@ -126,8 +126,8 @@ def init_cell_subcell_sds(chunk_size=None, n_folders_fs=10000, n_folders_fs_sc=1
    if cube_of_interest_bb is None:
        cube_of_interest_bb = [np.zeros(3, dtype=np.int), kd.boundary]
    log.info('Generating KnossosDatasets for subcellular structures {}.'
             ''.format(global_params.existing_cell_organelles))
    log.info('Converting predictions of cellular organelles to KnossosDatasets for every '
             'type available: {}.'.format(global_params.existing_cell_organelles))
    start = time.time()
    ps = [Process(target=kd_init, args=[co, chunk_size, transf_func_kd_overlay,
                                        load_cellorganelles_from_kd_overlaycubes,
......
......@@ -465,7 +465,8 @@ def run_neuron_rendering(max_n_jobs=None):
def run_create_neuron_ssd():
"""
Creates SuperSegmentationDataset with `version=0`.
Creates a :class:`~syconn.reps.super_segmentation_dataset.SuperSegmentationDataset` with
`version=0` at the currently active working directory.
"""
log = initialize_logging('create_neuron_ssd', global_params.config.working_dir + '/logs/',
overwrite=False)
......@@ -490,7 +491,8 @@ def run_create_neuron_ssd():
    ssd = SuperSegmentationDataset(working_dir=global_params.config.working_dir, version='0',
                                   ssd_type="ssv", sv_mapping=cc_dict_inv)
    # create cache-arrays for frequently used attributes
    ssd.save_dataset_deep(n_max_co_processes=global_params.NCORE_TOTAL)  # also executes 'ssd.save_dataset_shallow()'
    # also executes 'ssd.save_dataset_shallow()'
    ssd.save_dataset_deep(n_max_co_processes=global_params.NCORE_TOTAL)

    exec_skeleton.run_skeleton_generation()
......
......@@ -151,6 +151,12 @@ class DynConfig(Config):
"""
Enables dynamic and SyConn-wide update of working directory 'wd' and provides an
interface to all working directory dependent parameters.
Examples:
To initialize a working directory at the beginning of your script, run::
from syconn import global_params
global_params.wd = '~/SyConn/example_cube1/'
"""
def __init__(self, wd: Optional[str] = None):
"""
......
......@@ -6,6 +6,7 @@
# Authors: Sven Dorkenwald, Philipp Schubert, Joergen Kornfeld
from scipy import spatial
from typing import List, Optional, Dict, Any
import networkx as nx
import numpy as np
from knossos_utils.skeleton import Skeleton, SkeletonAnnotation, SkeletonNode
......@@ -100,7 +101,8 @@ def chunkify_contiguous(l, n):
        yield l[i:i + n]


def split_subcc_join(g, subgraph_size, lo_first_n=1):
def split_subcc_join(g: nx.Graph, subgraph_size: int,
                     lo_first_n: int = 1) -> List[List[Any]]:
    """
    Creates a subgraph for each node, adding nodes until the maximum number
    of nodes (``subgraph_size``) is reached.
......
......@@ -5,7 +5,7 @@
# Max Planck Institute of Neurobiology, Martinsried, Germany
# Authors: Philipp Schubert, Joergen Kornfeld
# TODO: outsource all skeletonization code, currently not used and methods are spread all over code base
# TODO: outsource all skeletonization code, currently not used
try:
import cPickle as pkl
except ImportError:
......
......@@ -16,8 +16,16 @@ from . import log_reps
from .. import global_params
# TODO: unclear what and when this was used for, refactor and use in current project
def extract_connectivity_thread(args):
"""
Used within :class:`~syconn.reps.connectivity.ConnectivityMatrix`.
Args:
args:
Returns:
"""
sj_obj_ids = args[0]
sj_version = args[1]
ssd_version = args[2]
......
......@@ -16,12 +16,13 @@ from collections import defaultdict
from scipy import spatial
from knossos_utils.skeleton_utils import annotation_to_nx_graph,\
    load_skeleton as load_skeleton_kzip
from collections.abc import Iterable
try:
    from knossos_utils import mergelist_tools
except ImportError:
    from knossos_utils import mergelist_tools_fallback as mergelist_tools
from .rep_helper import assign_rep_values, colorcode_vertices
from .rep_helper import assign_rep_values, colorcode_vertices, surface_samples
from . import segmentation
from .segmentation import SegmentationObject
from .segmentation_helper import load_skeleton, find_missing_sv_views,\
......@@ -33,6 +34,7 @@ from ..reps import log_reps
from .. import global_params
from ..proc.meshes import write_mesh2kzip
from ..proc.rendering import render_sso_coords
from ..proc.graphs import create_graph_from_coords
try:
    from ..proc.in_bounding_boxC import in_bounding_box
except ImportError:
......@@ -676,7 +678,7 @@ def stitch_skel_nx(skel_nx):
def create_sso_skeleton(sso, pruning_thresh=700, sparsify=True):
    """
    Creates the super super voxel skeleton
    Creates the super-supervoxel skeleton.

    Parameters
    ----------
......@@ -712,6 +714,50 @@ def create_sso_skeleton(sso, pruning_thresh=700, sparsify=True):
    return sso
def create_sso_skeletons_thread(ssvs, dest_paths=None):
    if dest_paths is not None:
        if not isinstance(dest_paths, Iterable):
            raise ValueError('Destination paths given but are not iterable.')
    else:
        dest_paths = [None for _ in ssvs]
    for ssv, dest_path in zip(ssvs, dest_paths):
        if not global_params.config.allow_skel_gen:
            # TODO: change to create_sso_skeleton_fast as soon as RAG edges
            #  only connect spatially close SVs
            # This merges existing SV skeletons
            create_sso_skeleton(ssv)
        else:
            # TODO: add parameter to config
            verts = ssv.mesh[1].reshape(-1, 3)
            # choose random subset of surface
            np.random.seed(0)
            ixs = np.arange(len(verts))
            np.random.shuffle(ixs)
            ixs = ixs[:int(0.5 * len(ixs))]
            if global_params.config.use_new_renderings_locs:
                locs = generate_rendering_locs(verts[ixs], 1000)
            else:
                locs = surface_samples(verts[ixs], bin_sizes=(1000, 1000, 1000),
                                       max_nb_samples=10000, r=500)
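            # connect the sampled locations with a minimum spanning tree to obtain the skeleton topology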
            g = create_graph_from_coords(locs, mst=True)
            edge_list = np.array(list(g.edges()))  # robust for any number of edges
            del g
            if edge_list.ndim != 2:
                raise ValueError("Edge list must be a 2D array. Currently: {}\n{}".format(
                    edge_list.ndim, edge_list))
            ssv.skeleton = dict()
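            # nodes are stored in voxel coordinates, hence the division by the dataset scaling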
ssv.skeleton["nodes"] = locs / np.array(ssv.scaling)
ssv.skeleton["edges"] = edge_list
ssv.skeleton["diameters"] = np.ones(len(locs))
ssv.save_skeleton()
if dest_path is not None:
ssv.save_skeleton_to_kzip(dest_path=dest_path)
# New Implementation of skeleton generation which makes use of ssv.rag
def from_netkx_to_arr(skel_nx):
......@@ -1688,7 +1734,7 @@ def semseg2mesh(sso, semseg_key, nb_views=None, dest_path=None, k=1,
        col = None  # set to None, because write_mesh2kzip only supports
        # RGBA colors and no labels
        log_reps.debug('Writing results to kzip.')
        write_mesh2kzip(dest_path, sso.mesh[0], sso.mesh[1], sso.mesh[2],
                        col, ply_fname=semseg_key + ".ply")
        return
    return sso.mesh[0], sso.mesh[1], sso.mesh[2], col
......