Sbachmei/mic 5561/docstring linking #141

Merged: 19 commits, Jan 13, 2025
6 changes: 3 additions & 3 deletions .readthedocs.yml
@@ -1,6 +1,9 @@
# This is the version for the readthedocs configuration. Version 2 ignores
# web-based configuration and uses everything from this file.
version: 2
sphinx:
configuration: docs/source/conf.py
fail_on_warning: true

# Configure the python version and environment construction run before
# docs are built.
@@ -16,6 +19,3 @@ python:
extra_requirements:
- docs

# Doc builds will fail if there are any warnings
sphinx:
fail_on_warning: true
9 changes: 9 additions & 0 deletions docs/source/_static/style.css
@@ -1,3 +1,12 @@
.wy-nav-content {
max-width: 1000px !important;
}
/* make links red and bold */
code.xref {
Collaborator Author:
This makes cross-references (which I think are only things like :class:, :meth:, etc) red instead of black

color: red !important;
font-weight: bold !important;
}
/* make inline code black */
code {
color: black !important;
}
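For context, a minimal sketch of the kind of docstring markup these cross-reference rules affect; the Example class and its method are hypothetical, while the LayeredConfigTree reference mirrors the one used elsewhere in this PR:

class Example:
    """A hypothetical container used only to illustrate cross-references.

    Sphinx renders the roles below as ``code.xref`` elements (now red and
    bold), while plain inline literals like ``Example`` stay black.

    See :class:`~layered_config_tree.LayeredConfigTree` and :meth:`Example.run`.
    """

    def run(self) -> None:
        """Run the hypothetical example."""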
41 changes: 20 additions & 21 deletions src/easylink/configuration.py
@@ -52,9 +52,8 @@
class Config(LayeredConfigTree):
"""A container for configuration information.

A ``Config`` (which inherits from :class:`~layered_config_tree.LayeredConfigTree`)
is a container that includes the combination of the user-provided pipeline,
input data, and computing environment specifications. It is a nested
The ``Config`` contains the user-provided specifications for the pipeline,
input data, and computing environment specifications. It is a nested
dictionary-like object that supports prioritized layers of configuration settings
as well as dot-notation access to its attributes.

@@ -89,9 +88,9 @@ class Config(LayeredConfigTree):
Notes
-----
The requested pipeline is checked against a set of supported
:class:`pipeline schemas <easylink.pipeline_schema.PipelineSchema>`. The first
schema that successfully validates is assumed to be the correct one and is attached
to the ``Config`` object and its :meth:`~easylink.pipeline_schema.PipelineSchema.configure_pipeline`
``PipelineSchemas``. The first schema that successfully validates is assumed
to be the correct one and is attached to the ``Config`` object and its
:meth:`~easylink.pipeline_schema.PipelineSchema.configure_pipeline`
method is called.
"""

@@ -175,26 +174,26 @@ def spark_resources(self) -> dict[str, Any]:
#################

def _get_schema(self, potential_schemas: list[PipelineSchema]) -> PipelineSchema:
"""Returns the first pipeline schema that successfully validates the requested pipeline.
"""Returns the first :class:`~easylink.pipeline_schema.PipelineSchema` that validates the requested pipeline.

Parameters
----------
potential_schemas
Pipeline schemas to validate the pipeline configuration against.
``PipelineSchemas`` to validate the pipeline configuration against.

Returns
-------
The first pipeline schema that successfully validates the requested pipeline.
If no validated pipeline schema is found, `exit()` is called with `errno.EINVAL`
and any validation errors are logged.
The first ``PipelineSchema`` that validates the requested pipeline.
If no validated ``PipelineSchema`` is found, `exit()` is called with
`errno.EINVAL` and any validation errors are logged.

Notes
-----
This acts as the pipeline configuration file's validation method since
we can only find a matching schema if that file is valid.
we can only find a matching ``PipelineSchema`` if that file is valid.

This method returns the first schema that successfully validates and does
not attempt to validate additional ones.
This method returns the *first* ``PipelineSchema`` that validates and does
not attempt to check additional ones.
"""
errors = defaultdict(dict)
# Try each schema until one is validated
@@ -279,16 +278,16 @@ def load_params_from_specification(

This gathers the pipeline, input data, and computing environment specifications
as well as the results directory into a single dictionary for insertion into
the ``Config`` object.
the :class:`Config` object.

Parameters
----------
pipeline_specification
The path to the pipeline specification yaml file.
The path to the pipeline specification file.

Reviewer: Do we say somewhere else what these specification files are supposed to look like?

Collaborator Author: Not yet, no. I'd imagine that will show up in a concept or the quickstart or something at some point. I just didn't feel the need to specify yaml here in case it ever does change or we support different formats.

input_data
The path to the input data yaml file.
The path to the input data file.
computing_environment
The path to the computing environment yaml file.
The path to the computing environment file.
results_dir
The path to the results directory.

@@ -307,7 +306,7 @@ def _load_input_data_paths(
def _load_input_data_paths(
input_data_specification_path: str | Path,
) -> dict[str, list[Path]]:
"""Creates a dictionary of input data paths from the input data yaml file."""
"""Creates a dictionary of input data paths from the input data specification file."""
input_data_paths = load_yaml(input_data_specification_path)
if not isinstance(input_data_paths, dict):
raise TypeError(
@@ -323,13 +322,13 @@ def _load_computing_environment(
def _load_computing_environment(
computing_environment_specification_path: str | None,
) -> dict[Any, Any]:
"""Loads the computing environment yaml file and returns the contents as a dict."""
"""Loads the computing environment specification file and returns the contents as a dict."""
if not computing_environment_specification_path:
return {} # handles empty environment.yaml
elif not Path(computing_environment_specification_path).is_file():
raise FileNotFoundError(
"Computing environment is expected to be a path to an existing"
f" yaml file. Input was: '{computing_environment_specification_path}'"
f" specification file. Input was: '{computing_environment_specification_path}'"
)
else:
return load_yaml(computing_environment_specification_path)
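Taken together, a minimal usage sketch of how these functions might be called; the file paths below are placeholders, and passing the returned dictionary straight to Config is an assumption based on this diff's description, not a documented API:

from pathlib import Path

from easylink.configuration import Config, load_params_from_specification

# Placeholder specification paths; these files are not part of this PR.
params = load_params_from_specification(
    pipeline_specification="pipeline.yaml",
    input_data="input_data.yaml",
    computing_environment="environment.yaml",
    results_dir=Path("results"),
)

# The resulting dictionary is what gets inserted into the Config object
# (assumed constructor usage).
config = Config(params)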