deode package

Package to run the Destination Earth on Demand Extremes system.

class GeneralConstants(*_args, **_kwargs)[source]

Bases: QuasiConstant

General package-related constants.

PACKAGE_DIRECTORY = PosixPath('/home/runner/work/Deode-Workflow/Deode-Workflow/deode')
PACKAGE_NAME = 'deode'
VERSION = '0.11.0'

Subpackages

Submodules

deode.argparse_wrapper module

Wrappers for argparse functionality.

add_keep_def_file(parser_object, help_message='Keep suite definition file after submission')[source]

Add the keep-definition-file argument to the parser object.

Parameters:
  • parser_object (args object) – args object to update

  • help_message (str) – Help text

add_namelist_args(parser_object)[source]

Add namelist args.

Parameters:

parser_object (args object) – args object to update

Returns:

updated args object

Return type:

parser_object (args object)

get_parsed_args(program_name='deode', argv=None)[source]

Get parsed command line arguments.

Parameters:
  • program_name (str) – The name of the program.

  • argv (list) – A list of passed command line args.

Returns:

Parsed command line arguments.

Return type:

argparse.Namespace
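
Example

A minimal usage sketch: argv is passed explicitly instead of letting the function read sys.argv. The subcommand below is a placeholder; the available subcommands depend on the installed version.

>>> from deode.argparse_wrapper import get_parsed_args
>>> args = get_parsed_args(program_name="deode", argv=["show", "config"])  # placeholder argv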

deode.aux_types module

Aux types used in the package.

class BaseMapping(*args, **kwargs)[source]

Bases: Mapping

Immutable mapping that will serve as basis for all config-related classes.

copy(update: Mapping | Callable[[Mapping], Any] | None = None)[source]

Return a copy of the instance, optionally updated according to update.

property data

Return the underlying data stored by the instance.

dict()[source]

Return a dict representation, converting also nested Mapping-type items.

dumps(section='', style: Literal['toml', 'json', 'yaml'] = 'toml', toml_formatting_function: Callable | None = None)[source]

Get a nicely printed version of the container’s contents.

class QuasiConstant(*_args, **_kwargs)[source]

Bases: object

Inheriting from this will make the class’ attributes (almost) immutable.
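
Example

A minimal sketch of the intended usage, mirroring GeneralConstants above: subclass QuasiConstant and define the constants as class attributes.

>>> from deode.aux_types import QuasiConstant
>>> class MyConstants(QuasiConstant):
...     ANSWER = 42  # hypothetical constant
>>> print(MyConstants.ANSWER)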

class QuasiConstantMetaclass(*args, **kwargs)[source]

Bases: type

Metaclass to help making a class behave as if it had quasi-constant attributes.

dict()[source]

Return a dict form of the instance, with nested instances also converted.

property type_conversions

Type conversions to be performed on the attributes.

deode.cleaning module

Clean deode file systems.

class CleanDeode(config, defaults=None, basetime=None)[source]

Bases: object

Clean data.

clean()[source]

Perform the cleaning.

prep_cleaning(choices, basetime=None)[source]

Prepare tasks for cleaning.

Parameters:
  • choices (dict) – Dict with cleaning settings

  • basetime (datetime object) – Reference time

deode.commands_functions module

Implement the package’s commands.

create_exp(args, config)[source]

Implement the ‘case’ command.

Parameters:
  • args (argparse.Namespace) – Parsed command line arguments.

  • config (.config_parser.ParsedConfig) – Parsed config file contents.

doc_config(args, config: ParsedConfig)[source]

Implement the ‘doc_config’ command.

Parameters:
  • args (argparse.Namespace) – Parsed command line arguments.

  • config (ParsedConfig) – Parsed config file contents.

namelist_convert(args, config: ParsedConfig)[source]

Implement the ‘namelist convert’ command.

Parameters:
  • args (argparse.Namespace) – Parsed command line arguments.

  • config (.config_parser.ParsedConfig) – Parsed config file contents.

namelist_format(args, config: ParsedConfig)[source]

Implement the ‘namelist format’ command.

Parameters:
  • args (argparse.Namespace) – Parsed command line arguments.

  • config (.config_parser.ParsedConfig) – Parsed config file contents.

namelist_integrate(args, config)[source]

Implement the ‘namelist integrate’ command.

Parameters:
  • args (argparse.Namespace) – Parsed command line arguments.

  • config (.config_parser.ParsedConfig) – Parsed config file contents.

Raises:

SystemExit

run_task(args, config)[source]

Implement the ‘run’ command.

Parameters:
  • args (argparse.Namespace) – Parsed command line arguments.

  • config (.config_parser.ParsedConfig) – Parsed config file contents.

show_config(args, config)[source]

Implement the ‘show_config’ command.

Parameters:
  • args (argparse.Namespace) – Parsed command line arguments.

  • config (.config_parser.ParsedConfig) – Parsed config file contents.

show_config_schema(args, config)[source]

Implement the show config-schema command.

Parameters:
  • args (argparse.Namespace) – Parsed command line arguments.

  • config (.config_parser.ParsedConfig) – Parsed config file contents.

show_host(args, config)[source]

Implement the show host command.

Parameters:
  • args (argparse.Namespace) – Parsed command line arguments.

  • config (.config_parser.ParsedConfig) – Parsed config file contents.

show_namelist(args, config)[source]

Implement the ‘show_namelist’ command.

Parameters:
  • args (argparse.Namespace) – Parsed command line arguments.

  • config (.config_parser.ParsedConfig) – Parsed config file contents.

show_paths(args, config)[source]

Implement the ‘show_paths’ command.

ssh_cmd(host, user, cmd)[source]

SSH to remote server and execute basic commands.

Parameters:
  • host – Host name or server IP address

  • user – Username (string) to login to server with

  • cmd – Command to be executed

Returns:

Message to notify if failed and why or successfully completed.
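
Example

A minimal usage sketch; the host, user and command below are placeholders.

>>> from deode.commands_functions import ssh_cmd
>>> msg = ssh_cmd("hpc-login.example", "myuser", "hostname")
>>> print(msg)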

start_suite(args, config)[source]

Implement the ‘start suite’ command.

Parameters:
  • args (argparse.Namespace) – Parsed command line arguments.

  • config (.config_parser.ParsedConfig) – Parsed config file contents.

Raises:

SystemExit – If error occurs while transferring files.

deode.config_parser module

Registration and validation of options passed in the config file.

class BasicConfig(*args, _metadata=None, **kwargs)[source]

Bases: BaseMapping

Base class for configs. Arbitrary entries allowed: no validation is performed.

property data

Return the underlying data stored by the instance.

classmethod from_file(path, **kwargs)[source]

Retrieve configs from a file in miscellaneous formats.

Parameters:
  • path (Union[pathlib.Path, str]) – Path to the config file.

  • **kwargs – Arguments passed to the class constructor.

Returns:

Configs retrieved from the specified path.

Return type:

cls

property metadata

Get the metadata associated with the instance.

save_as(config_file)[source]

Save config file.

Parameters:

config_file (str) – Path to config file
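
Example

A minimal sketch, assuming a TOML config file exists at the placeholder path: load it, inspect it as a plain dict, and save a copy.

>>> from deode.config_parser import BasicConfig
>>> config = BasicConfig.from_file("my_config.toml")  # placeholder path
>>> print(config.dict())
>>> config.save_as("my_config_copy.toml")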

exception ConfigFileValidationError[source]

Bases: Exception

Error to be raised when parsing the input config file fails.

class ConfigParserDefaults(*_args, **_kwargs)[source]

Bases: QuasiConstant

Defaults related to the parsing of config files.

CONFIG_DIRECTORY = PosixPath('/home/runner/work/Deode-Workflow/Deode-Workflow/deode/data/config_files')
CONFIG_PATH = PosixPath('/home/runner/work/Deode-Workflow/Deode-Workflow/deode/data/config_files/config.toml')
DATA_DIRECTORY = PosixPath('/home/runner/work/Deode-Workflow/Deode-Workflow/deode/data')
MAIN_CONFIG_JSON_SCHEMA = mappingproxy({'$schema': 'http://json-schema.org/draft-07/schema#', 'title': 'Config', 'description': "Model for validation of `deode`'s main config file.", 'type': 'object', 'additionalProperties': True, 'properties': mappingproxy({'general': mappingproxy({'$ref': '#/definitions/general'}), 'suite_control': mappingproxy({'$ref': '#/definitions/suite_control'}), 'clean_old_data': mappingproxy({'$ref': '#/definitions/clean_old_data'}), 'system': mappingproxy({'$ref': '#/definitions/system'}), 'boundaries': mappingproxy({'$ref': '#/definitions/boundaries'}), 'pgd': mappingproxy({'$ref': '#/definitions/pgd'}), 'fullpos': mappingproxy({'$ref': '#/definitions/fullpos'}), 'troika': mappingproxy({'$ref': '#/definitions/troika'}), 'include': mappingproxy({'$ref': '#/definitions/include'}), 'gribmodify': mappingproxy({'$ref': '#/definitions/gribmodify'})}), 'required': ['general'], 'definitions': mappingproxy({'include': mappingproxy({'title': 'Include Section', 'description': "Sections to be included from other config files. Specified as 'key = val' pairs, where the keys are the section names and the vals are the paths to the files containing them.", 'type': 'object', 'properties': mappingproxy({})}), 'troika': mappingproxy({'title': 'Troika', 'description': 'DEODE system specific path settings', 'type': 'object', 'properties': mappingproxy({'config_file': mappingproxy({'type': 'string', 'description': 'Troika config file', 'default': '@DEODE_HOME@/data/config_files/troika.yml'}), 'troika': mappingproxy({'type': 'string', 'description': 'Troika command, allows to specify troika with full path', 'default': 'troika'})})}), 'plugin_registry': mappingproxy({'title': 'Plug-in registry', 'type': 'object', 'properties': mappingproxy({'plugins': mappingproxy({'title': 'Loaded plugins', 'type': 'object'})})}), 'boundaries': mappingproxy({'title': 'Boundary Section', 'description': 'Settings for control of boundary treatment.', 'type': 'object', 'properties': mappingproxy({'bdint': mappingproxy({'description': 'Boundary interval', 'type': 'string', 'format': 'duration'}), 'bdshift': mappingproxy({'description': "Shift of boundary initial time. E.g. 'PT3H' would mean use 3H old boundaries", 'type': 'string', 'format': 'duration'}), 'max_interpolation_tasks': mappingproxy({'description': 'Number of interpolate tasks run in parallel', 'type': 'number'}), 'bdmodel': mappingproxy({'description': 'Coupling model', 'enum': ['ALARO', 'AROME', 'IFS'], 'type': 'string'}), 'humi_gp': mappingproxy({'type': 'boolean', 'description': 'Value of TFP_Q%LLGP in domain namelist for c903. 
True for gridpoint humidity.'}), 'ifs.selection': mappingproxy({'type': 'string', 'description': 'Selection', 'enum': ['HRES', 'ATOS_DT', 'DT12', 'i5qp', 'i7u4', 'i7ye', 'i8hy', 'IFSENS']}), 'ifs.bdmember': mappingproxy({'type': 'string', 'description': 'The number of members in ensemble if selection is IFSENS, else empty string.'}), 'bd_has_surfex': mappingproxy({'description': 'Set to true if the host model runs with surfex', 'type': 'boolean'})})}), 'clean_old_data': mappingproxy({'title': 'Cleaning old data', 'decription': 'Option for cleaning old data', 'type': 'object', 'properties': mappingproxy({'scratch_data_period': mappingproxy({'description': 'Age for old data in days', 'type': 'string', 'pattern': '^P(?!$)(\\d+Y)?(\\d+M)?(\\d+W)?(\\d+D)?$', 'default': 'P14D'}), 'suites_period': mappingproxy({'description': 'Age for old data in days', 'type': 'string', 'pattern': '^P(?!$)(\\d+Y)?(\\d+M)?(\\d+W)?(\\d+D)?$', 'default': 'P14D'}), 'IFS_period': mappingproxy({'description': 'Age for old data in days', 'type': 'string', 'pattern': '^P(?!$)(\\d+Y)?(\\d+M)?(\\d+W)?(\\d+D)?$', 'default': 'P7D'}), 'ignore': mappingproxy({'decription': 'File not to remove', 'type': 'array', 'default': []}), 'suite_format': mappingproxy({'decription': 'Format of path to ecf files', 'type': 'string', 'default': '/'}), 'scratch_format': mappingproxy({'decription': 'Format of path to scratch', 'type': 'string', 'default': '/'}), 'ifs_format': mappingproxy({'decription': 'Format of path to IFS', 'type': 'string', 'default': '/IFS/(\\d{4})/(0[1-9]|1[0-2])/(0[1-9]|[12]\\d|3[01])/([01]\\d|2[0-3])'}), 'scratch_ext': mappingproxy({'decription': 'Extention of cratche for operational user.', 'type': 'string', 'default': 'deode'})})}), 'pgd': mappingproxy({'title': 'Pgd Section', 'description': 'Settings for control of Pgd selection.', 'type': 'object', 'properties': mappingproxy({'npatch': mappingproxy({'title': 'Number of patches for surfex variables', 'type': 'integer', 'default': 3}), 'one_decade': mappingproxy({'title': 'Option for using one decade', 'type': 'boolean', 'default': True}), 'use_osm': mappingproxy({'title': 'Option to use Open Street map input data', 'type': 'boolean', 'default': False})})}), 'fullpos': mappingproxy({'title': 'Fullpos Section', 'description': 'Settings for control of fullpos selection.', 'type': 'object', 'properties': mappingproxy({'config_path': mappingproxy({'title': 'Path to fullpos config files', 'type': 'string', 'default': '@DEODE_HOME@/data/namelist_generation_input/@CYCLE@/fullpos'}), 'main': mappingproxy({'description': 'List of mandatory rules', 'type': 'array', 'items': mappingproxy({'type': 'string'}), 'default': ['rules', 'namfpc_header']}), 'selection': mappingproxy({'description': 'List of selections to include', 'type': 'object', 'additionalProperties': mappingproxy({'type': 'array', 'items': mappingproxy({'type': 'string'})})}), 'domain_name': mappingproxy({'type': 'string', 'description': 'Name of your domain as used by fullpos', 'maxLength': 20, 'default': 'DOMAIN'})})}), 'suite_control': mappingproxy({'title': 'Suite control', 'description': 'Suite control options', 'type': 'object', 'properties': mappingproxy({'suite_definition': mappingproxy({'description': 'Suite definition to play in EcFlow', 'type': 'string', 'default': 'DeodeSuiteDefinition'}), 'do_soil': mappingproxy({'description': 'Activate preparation of SOILGRID data as input for PGD', 'type': 'boolean'}), 'do_pgd': mappingproxy({'description': 'Activate generation of a PGD file', 'type': 'boolean'}), 
'mode': mappingproxy({'description': 'Choose a strategy to achieve initial/initial_sfx files (start:  start from interpolated files and then propagate the FG between cycles, restart: start from a previous run and then propagate the FG between cycles, cold_start: always start from interpolated files).', 'type': 'string', 'enum': ['start', 'restart', 'cold_start']}), 'interpolate_boundaries': mappingproxy({'description': 'Activates the boundary interpolation', 'type': 'boolean'}), 'do_marsprep': mappingproxy({'description': 'Run marsprep or skip extraction', 'type': 'boolean'}), 'do_extractsqlite': mappingproxy({'description': 'Activates extraction of selected points to SQLite files for verification.', 'type': 'boolean'}), 'create_static_data': mappingproxy({'description': 'Activate the generation of PGD and monthly climate files in the suite', 'type': 'boolean'}), 'create_time_dependent_suite': mappingproxy({'description': 'Activate the time dependent part of the suite', 'type': 'boolean'}), 'split_mars': mappingproxy({'description': 'Do marsprep only for one step before c903', 'type': 'boolean', 'default': False}), 'do_cleaning': mappingproxy({'description': 'Do file cleaning', 'type': 'boolean', 'default': False}), 'n_io_merge': mappingproxy({'description': 'Number of IOmerge tasks', 'type': 'integer', 'default': 1}), 'DeodeCleaningSuiteDefinition': mappingproxy({'decription': 'Settings for Cleaning Suite', 'type': 'object', 'properties': mappingproxy({'do_clean_scratch_data': mappingproxy({'description': 'Activate cleaning the scratch data', 'type': 'boolean', 'default': True}), 'do_clean_suites': mappingproxy({'description': 'Activate cleaning the suites', 'type': 'boolean', 'default': True}), 'do_clean_IFS': mappingproxy({'description': 'Activate cleaning IFS data', 'type': 'boolean', 'default': True}), 'days_in_week': mappingproxy({'description': 'List of days when cleaning suite will run, 0-6 Sunday-Saturday', 'type': 'array', 'items': mappingproxy({'type': 'integer'})}), 'time_to_run': mappingproxy({'description': 'Time to run', 'type': 'string', 'pattern': '^([01]\\d|2[0-3]):[0-5]\\d:[0-5]\\dZ$'})})})})}), 'system': mappingproxy({'title': 'System', 'description': 'DEODE system specific path settings', 'type': 'object', 'properties': mappingproxy({'bdfile_sfx_template': mappingproxy({'type': 'string', 'description': 'Name template for input initial surfex file for prep'}), 'bdfile_template': mappingproxy({'type': 'string', 'description': 'Name template for input boundary files'}), 'bddir_sfx': mappingproxy({'type': 'string', 'description': 'Location of input intial file for surfex'}), 'bddir': mappingproxy({'type': 'string', 'description': 'Location of input boundaries'}), 'intp_bddir': mappingproxy({'type': 'string', 'description': 'Location of boundaries on the current domain, i.e. 
the output directory of e927', 'default': '@WRK@'}), 'climdir': mappingproxy({'type': 'string', 'description': 'Location of generated/used climate files (PGD/monthly files)'}), 'bindir': mappingproxy({'type': 'string', 'description': 'Location of binaries'}), 'bdclimdir': mappingproxy({'type': 'string', 'description': 'Location of host model climate files used for e927/c903'}), 'wrk': mappingproxy({'type': 'string', 'description': 'Location of date specific working directory', 'default': '@CASEDIR@/@YYYY@@MM@@DD@_@HH@@mm@/'}), 'fullpos_config_file': mappingproxy({'type': 'string', 'description': 'Path to the fullpos yml config file'}), 'archive_timestamp': mappingproxy({'type': 'string', 'description': 'Relative path of date specific archive', 'default': '@YYYY@/@MM@/@DD@/@HH@/'}), 'archive': mappingproxy({'type': 'string', 'description': 'Absolute path of date specific archive', 'default': '@ARCHIVE_ROOT@/@ARCHIVE_TIMESTAMP@'}), 'logs': mappingproxy({'type': 'string', 'description': 'Location of collected logfiles'}), 'namelists': mappingproxy({'type': 'string', 'description': 'Path to static namelists', 'default': '@DEODE_HOME@/data/namelists/@CYCLE@'}), 'marsdir': mappingproxy({'type': 'string', 'description': 'Location input files extracted from MARS by Marsprep'}), 'global_sfcdir': mappingproxy({'type': 'string', 'description': 'Path to SFC files needed for c903, which are not in MARS'}), 'prev_case': mappingproxy({'description': 'Name of the previous experiment', 'type': 'string', 'default': '@CASE@'}), 'sfx_input_definition': mappingproxy({'description': 'Path to the surfex json config file', 'type': 'string', 'default': '@CYCLE@/sfx.json'}), 'forecast_input_definition': mappingproxy({'type': 'string', 'description': 'Path to the forecast json config file', 'default': '@CYCLE@/forecast.json'}), 'c903_input_definition': mappingproxy({'type': 'string', 'description': 'Path to the c903 json config file', 'default': '@CYCLE@/c903.json'})})}), 'gribmodify': mappingproxy({'title': 'Section for modify grib', 'description': 'Rules for modify grib to get total prcipitation', 'type': 'object', 'properties': mappingproxy({'3': mappingproxy({'title': 'Total precpipation', 'description': 'shortName for the total precipitation.', 'type': 'object', 'properties': mappingproxy({'input': mappingproxy({'title': 'Input parameters', 'description': 'List of parameters', 'type': 'array', 'items': mappingproxy({'type': 'string'})}), 'operator': mappingproxy({'title': 'Operator', 'description': 'Operator for compute the new field from input parameters', 'type': 'string'}), 'output': mappingproxy({'title': 'Output parameter', 'description': 'shortName of output parameter.', 'type': 'string'})})}), '2': mappingproxy({'title': 'Total snow', 'description': 'shortName for the total precipitation.', 'type': 'object', 'properties': mappingproxy({'input': mappingproxy({'title': 'Input parameters', 'description': 'List of parameters', 'type': 'array', 'items': mappingproxy({'type': 'string'})}), 'operator': mappingproxy({'title': 'Operator', 'description': 'Operator for compute the new field from input parameters', 'type': 'string'}), 'output': mappingproxy({'title': 'Output parameter', 'description': 'shortName of output parameter.', 'type': 'string'})})}), '1': mappingproxy({'title': 'Total rain', 'description': 'shortName for the total precipitation.', 'type': 'object', 'properties': mappingproxy({'input': mappingproxy({'title': 'Input parameters', 'description': 'List of parameters', 'type': 'array', 'items': 
mappingproxy({'type': 'string'})}), 'operator': mappingproxy({'title': 'Operator', 'description': 'Operator for compute the new field from input parameters', 'type': 'string'}), 'output': mappingproxy({'title': 'Output parameter', 'description': 'shortName of output parameter.', 'type': 'string'})})})})}), 'general': mappingproxy({'title': 'General Section', 'description': "Model for the 'general' section.", 'type': 'object', 'properties': mappingproxy({'times': mappingproxy({'title': 'Time-Related Entries', 'description': "Model for the 'general.times' section.", 'type': 'object', 'properties': mappingproxy({'forecast_range': mappingproxy({'description': 'Forecast range', 'type': 'string', 'format': 'duration'}), 'start': mappingproxy({'description': 'Start', 'type': 'string', 'format': 'date-time'}), 'end': mappingproxy({'description': 'End', 'type': 'string', 'format': 'date-time'}), 'list': mappingproxy({'description': 'List', 'type': 'array', 'items': mappingproxy({'type': 'string', 'format': 'date-time'})}), 'cycle_length': mappingproxy({'description': 'Cycle length defined as duration', 'default': 'PT3H', 'type': 'string', 'format': 'duration'}), 'basetime': mappingproxy({'description': 'Initial date and time for the current task.', 'type': 'string', 'format': 'date-time'}), 'validtime': mappingproxy({'description': 'Final date and time for the current task.', 'type': 'string', 'format': 'date-time'})}), 'oneOf': [{'anyOf': [{'required': ['start'], 'not': {'required': ['list']}}, {'required': ['basetime'], 'not': {'anyOf': [{'required': ['end']}, {'required': ['list']}, {'required': ['start']}]}}]}, {'required': ['list'], 'not': {'anyOf': [{'required': ['start']}, {'required': ['end']}]}}]}), 'accept_static_namelists': mappingproxy({'description': 'Allow usage of static namelists as input for the tasks. The namelist should be located in the directory specified by platform.namelists and named with the convention namelist_{kind}_{task} where kind could be surfex or master and task forecast/pgd or similar. For the extraction of namelist see [namelists.md](../docs/namelists.md).', 'type': 'boolean'}), 'case': mappingproxy({'description': 'Experiment name', 'default': 'deode_case', 'type': 'string'}), 'realization': mappingproxy({'description': 'Placeholder for future ensemble or similar need', 'default': '', 'oneOf': [{'type': 'string', 'maxLength': 0}, {'type': 'number', 'minimum': 0}]}), 'cnmexp': mappingproxy({'description': 'Experiment short name', 'default': 'DEOD', 'type': 'string', 'minLength': 4, 'maxLength': 4}), 'loglevel': mappingproxy({'description': 'Logger output level', 'type': 'string', 'enum': ['CRITICAL', 'ERROR', 'DEBUG', 'INFO', 'WARNING']}), 'surfex': mappingproxy({'description': 'SURFEX switch', 'type': 'boolean'}), 'keep_workdirs': mappingproxy({'description': 'Do not remove working directories', 'type': 'boolean'}), 'csc': mappingproxy({'description': 'CSC', 'default': 'AROME', 'type': 'string', 'enum': ['AROME', 'ALARO', 'HARMONIE_AROME']}), 'cycle': mappingproxy({'description': 'IAL cycle', 'type': 'string', 'enum': ['CY49t2', 'CY48t3', 'CY46h1']}), 'event_type': mappingproxy({'description': 'Type of extreme event', 'type': 'string', 'default': 'nwp', 'enum': ['airquality', 'convection', 'flooding', 'heatwave', 'nwp', 'storm']}), 'initfile': mappingproxy({'description': 'Initial file for the forecast object. 
Normally no need to provide unless there are special requirements or the forecast object is used in stand alone mode.', 'type': 'string'}), 'initfile_sfx': mappingproxy({'description': 'Initial surfex file for the forecast object. Normally no need to provide unless there are special requirements or the forecast object is used in stand alone mode.', 'type': 'string'}), 'namelist': mappingproxy({'title': 'Namelist', 'description': 'Namelist settings. Various settings for the forecast namelist that have no other place to go.', 'type': 'object', 'properties': mappingproxy({'nrazts': mappingproxy({'description': 'Reset times for instantaneous fluxes ( e.g. min/max temperature )', 'type': 'string'}), 'radiation_frequency': mappingproxy({'description': 'Time interval for radiation compitations in the model.', 'type': 'string'})})}), 'output_settings': mappingproxy({'title': 'Output Settings', 'description': 'Output settings. Specified as duration for any of interval, endtime:interval, starttime:endtime:interval or a list of these options', 'type': 'object', 'properties': mappingproxy({'history': mappingproxy({'description': 'Output list for history files', '$ref': '#/$output_setting_rules'}), 'surfex': mappingproxy({'description': 'Output list for surfex files', '$ref': '#/$output_setting_rules'}), 'fullpos': mappingproxy({'description': 'Output list for fullpos files', '$ref': '#/$output_setting_rules'})})}), 'remove_sections': mappingproxy({'description': 'Sections to remove prior to config merge', 'type': 'array', 'default': []}), 'windfarm': mappingproxy({'description': 'Switch on wind farm parameterisation', 'type': 'boolean', 'default': False})}), 'required': ['times']})}), '$output_setting_rules': mappingproxy({'anyOf': [{'type': 'string', 'maxLength': 0}, {'type': 'string', 'format': 'duration'}, {'type': 'array', 'items': {'anyOf': [{'type': 'string', 'format': 'duration_slice'}]}}]})})
MAIN_CONFIG_JSON_SCHEMA_PATH = PosixPath('/home/runner/work/Deode-Workflow/Deode-Workflow/deode/data/config_files/config_file_schemas/main_config_schema.json')
PACKAGE_CONFIG_PATH = PosixPath('/home/runner/work/Deode-Workflow/Deode-Workflow/deode/data/config_files/config.toml')
PACKAGE_INCLUDE_DIR = PosixPath('/home/runner/work/Deode-Workflow/Deode-Workflow/deode/data/config_files/include')
SCHEMAS_DIRECTORY = PosixPath('/home/runner/work/Deode-Workflow/Deode-Workflow/deode/data/config_files/config_file_schemas')
class ConfigPaths[source]

Bases: object

Support multiple path search.

CONFIG_DATA_SEARCHPATHS = [PosixPath('/home/runner/work/Deode-Workflow/Deode-Workflow/deode/data')]
erroneous_paths = []
static path_from_subpath(subpath) Path[source]

Interface to find full path given any subpath, by searching ‘searchpaths’.

Parameters:

subpath (str) – Subpath to search for

Returns:

Full path to target

Return type:

(Path)

Raises:

RuntimeError – Various errors

static print(config_file=None, host=None)[source]

Prints the available config directories.

Displays the main config search paths as defined by list_paths in addition to the actual search paths in the config file used.

exception ConflictingValidationSchemasError[source]

Bases: Exception

Error to be raised when more than one schema is defined for a config section.

class JsonSchema(*args, **kwargs)[source]

Bases: BaseMapping

Class to use for JSON schemas. Provides a validate method to validate data.

get_markdown_doc()[source]

Return human-readable doc for the schema in markdown format.

validate(data)[source]

Return a copy of data validated against the stored JSON schema.
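
Example

A minimal sketch, assuming the packaged main config schema mapping can be passed directly to the constructor: validate a toy config fragment that only defines general.times.start.

>>> from deode.config_parser import ConfigParserDefaults, JsonSchema
>>> schema = JsonSchema(ConfigParserDefaults.MAIN_CONFIG_JSON_SCHEMA)
>>> validated = schema.validate({"general": {"times": {"start": "2024-01-01T00:00:00Z"}}})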

class ParsedConfig(*args, json_schema, include_dir=PosixPath('/home/runner/work/Deode-Workflow/Deode-Workflow/deode/data/config_files'), host=None, **kwargs)[source]

Bases: BasicConfig

Object that holds parsed configs validated against a json_schema.

property data

Return the underlying data stored by the instance.

expand_macros()[source]

Expand macros in config recursively.

classmethod from_file(path, include_dir=None, **kwargs)[source]

Do as in BasicConfig. If None, include_dir will become path.parent.

property include_dir

Return the search dir used for sections in the raw config’s include section.

property json_schema

Return the instance’s JSON schema.

deode.custom_validators module

Define custom validators for Pydantic models.

Exports:

validate_durations: Validate the format of a duration string against ISO8601.

import_from_string: Validate if a string can be imported and return the imported object.

import_from_string(value: str) ModuleType | Type[source]

Validate if a string can be imported and return the imported object.

Only supports importing objects specified with full module path, e.g. module.submodule.object.

Parameters:

value – The string to import.

Raises:
  • ImportError – If the module or object cannot be imported.

  • TypeError – If the input is not a string.

Returns:

The imported object.
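
Example

A minimal sketch importing an object from its full dotted path.

>>> from deode.custom_validators import import_from_string
>>> import_from_string("pathlib.Path")
<class 'pathlib.Path'>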

deode.datetime_utils module

Implement helper routines to deal with dates and times.

class DatetimeConstants(*_args, **_kwargs)[source]

Bases: QuasiConstant

Datetime-related constants.

DEFAULT_SHIFT = Timedelta('0 days 00:00:00')
ISO_8601_TIME_DURATION_REGEX = '^P(?!$)(\\d+Y)?(\\d+M)?(\\d+W)?(\\d+D)?(T(?=\\d+[HMS])(\\d+H)?(\\d+M)?(\\d+S)?)?$'
as_datetime(obj)[source]

Convert obj to string, parse into datetime and add UTC timezone iff naive.

as_timedelta(obj)[source]

Convert obj to string and parse into pd.Timedelta.
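
Example

A minimal sketch of the two conversion helpers, assuming ISO 8601 strings (the duration format used throughout the config).

>>> from deode.datetime_utils import as_datetime, as_timedelta
>>> basetime = as_datetime("2024-01-01T00:00:00")  # naive input gets UTC attached
>>> step = as_timedelta("PT3H")                    # parsed into a pd.Timedelta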

check_syntax(output_settings: Tuple[str] | List[str], length: int)[source]

Check syntax of output_settings.

Parameters:
  • output_settings (Union[Tuple[str], List[str]]) – Specifies the output steps

  • length (integer) – length to check on

Raises:

SystemExit – General system handler

cycle_offset(basetime, bdcycle, bdcycle_start=Timedelta('0 days 00:00:00'), bdshift=Timedelta('0 days 00:00:00'))[source]

Calculate the offset from a reference time.

Parameters:
  • basetime (datetime) – Reference time

  • bdcycle (timedelta) – Interval between cycles

  • bdcycle_start (timedelta) – Time of day when bdcycle starts

  • bdshift (timedelta) – shift of boundary usage

Returns:

a timedelta object of the offset

Return type:

timedelta

dt2str(dt)[source]

Convert a timedelta object to a string suitable for file names.

Parameters:

dt (timedelta object) – duration

Returns:

String representation of the duration, suitable for FA files

Return type:

duration (str)

expand_output_settings(output_settings: str | Tuple[str] | List[str], forecast_range: str) List[List[Timedelta]][source]

Expand the output_settings coming from config.

Parameters:
  • output_settings (Union[str, Tuple[str], List[str]]) – Specifies the output steps

  • forecast_range (str) – Forecast range in duration syntax

Raises:

RuntimeError – Handle erroneous time increment

Returns:

List of output subsections.

Can be empty in case of empty output_settings

Return type:

sections (List[List[pd.Timedelta]])

get_decadal_list(dt_start, dt_end) list[source]

Return a list of dates for which decadal pgd files have to be created.

get_decade(dt) str[source]

Return the decade given a datetime object.

get_month_list(start, end) list[source]

Get list of months between two given dates (input as string).

oi2dt_list(output_settings: str | Tuple[str] | List[str], forecast_range: str) List[Timedelta][source]

Build list of output occurrences.

Parameters:
  • output_settings (Union[str, Tuple[str], List[str]]) – Specifies the output steps

  • forecast_range (str) – Forecast range in duration syntax

Returns:

Sorted list of output occurrences

Return type:

dt (List[pd.Timedelta])
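
Example

A minimal sketch, assuming the "endtime:interval" form described under general.output_settings: hourly output up to 6 hours, then 3-hourly output up to the 12-hour forecast range.

>>> from deode.datetime_utils import oi2dt_list
>>> dts = oi2dt_list(["PT6H:PT1H", "PT12H:PT3H"], "PT12H")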

deode.derived_variables module

Derive runtime variables.

check_fullpos_namelist(config, nlgen)[source]

Find existing fullpos select files or generate them.

Parameters:
  • config (deode.ParsedConfig) – Configuration

  • nlgen (dict) – master forecast namelist

Returns:

Possibly updated forecast namelist

Return type:

nlgen (dict)

derived_variables(config, processor_layout=None)[source]

Derive some variables required in the namelists.

Parameters:
  • config (deode.ParsedConfig) – Configuration

  • processor_layout (ProcessorLayout, optional) – Processor layout object

Returns:

Derived config update

Return type:

update (dict)

Raises:

NotImplementedError – For configurations checking

set_times(config)[source]

Set basetime/validtime if not present.

Parameters:

config (.config_parser.ParsedConfig) – Parsed config file contents.

Returns:

Dict of corrected basetime/validtime

Return type:

update (dict)

deode.experiment module

Experiment tools.

class EPSExp(config: ParsedConfig)[source]

Bases: Exp

Setup EPS experiment from config file.

setup_exp()[source]

Generate EPS member settings and update config file.

Only deviations of member settings from general EPS settings are added to the config.

class Exp(config: ParsedConfig, merged_config: dict | None)[source]

Bases: object

Experiment class.

class ExpFromFiles(config: ParsedConfig, exp_dependencies, mod_files: List[Path], host=None, merged_config=None)[source]

Bases: Exp

Generate Exp object from existing files. Use config files from a setup.

static deep_update(source, overrides)[source]

Update a nested dictionary or similar mapping.

Modify source in place.

Parameters:
  • source (dict) – Source

  • overrides (dict) – Updates

Returns:

Updated dictionary

Return type:

dict

static setup_files(output_file, case=None)[source]

Set up the files for an experiment.

Parameters:
  • output_file (str) – Output file

  • case (str, optional) – Experiment name. Defaults to None.

Returns:

Experiment dependencies from setup.

Return type:

exp_dependencies (dict)

static toml_dump(to_dump, fname)[source]

Dump toml to file.

Using tomlkit to preserve structure.

Parameters:
  • to_dump (dict) – Data to save

  • fname (str) – Filename

case_setup(config: ParsedConfig, output_file, mod_files: List[Path], case=None, host=None, expand_config=False)[source]

Do experiment setup.

Parameters:
  • config (.config_parser.ParsedConfig) – Parsed config file contents.

  • output_file (str) – Output config file.

  • mod_files (list) – Modifications. Defaults to None.

  • case (str, optional) – Case identifier. Defaults to None.

  • host (str, optional) – host name. Defaults to None.

  • expand_config (boolean, optional) – Flag for expanding macros in config

Returns:

Output config file.

Return type:

output_file (str)

get_git_info()[source]

Get git information.

deode.formatters module

Define format validators used in the JSON schema validation.

duration_format_validator(duration: str) bool[source]

Validate the format of a duration string against ISO8601.

Parameters:

duration – The duration string to validate.

Returns:

True if the duration string is valid, False otherwise.

duration_slice_format_validator(duration_slice: str) bool[source]

Validate the format of a duration slice string against ISO8601.

Each part of the slice is validated using duration_format_validator.

Parameters:

duration_slice – The duration slice string to validate.

Returns:

True if the duration slice string is valid, False otherwise.
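
Example

A minimal sketch of the duration validator; "PT3H" is a valid ISO8601 duration, "3 hours" is not.

>>> from deode.formatters import duration_format_validator
>>> duration_format_validator("PT3H")
True
>>> duration_format_validator("3 hours")
False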

deode.fullpos module

Fullpos namelist generation.

class Fullpos(domain, fpdir='.', fpfiles=None, fpdict=None, rules=None)[source]

Bases: object

Fullpos namelist generator based on (yaml) dicts.

check_non_instant_fields(selection, time_selection)[source]

Search for non instant fields.

Parameters:
  • selection (dict) – Dict with fullpos settings to be examined

  • time_selection (str) – Which selection time rules to check

Raises:

RuntimeError – Non instant fields found

construct()[source]

Construct the fullpos namelists.

Returns:

namfpc part; selection (dict): xxtddddhhmm part

Return type:

namfpc_out (dict)

Raises:

InvalidSelectionCombination – Invalid combination in selection

expand(v, levtype, levels, domain, i)[source]

Expand fullpos namelists to levels and domains.

Parameters:
  • v (str) – parameter list

  • levtype (str) – type of vertical level in fullpos syntax

  • levels (list) – list of levels

  • domain (str) – domain name

  • i (int) – parameter counter

Returns:

expanded names; i (int): parameter counter

Return type:

d (dict)

load(fpdir, fpfiles)[source]

Load fullpos yaml file.

Parameters:
  • fpdir (str) – Path to fullpos settings

  • fpfiles (list) – List of yml files to read

Returns:

fullpos settings

Return type:

nldict (dict)

replace_rules(vin)[source]

Replace patterns from the rules dict.

Parameters:

vin (str|list) – Input values

Raises:

RuntimeError – Invalid type

Returns:

possibly replaced strings

Return type:

v (str|list)

update_selection(additions_list=None, additions_dict=None)[source]

Add choices to the selection section.

Parameters:
  • additions_list (list) – Additional selection to be read from files

  • additions_dict (dict) – Additional selection as a dictionary

exception InvalidSelectionCombinationError[source]

Bases: ValueError

Custom exception.

flatten_list(li)[source]

Recursively flatten a list of lists (of lists).

deode.general_utils module

General utils for use throughout the package.

expand_dict_key_slice(dict_: Dict[int | str, Any], indices: List[int]) Dict[int, Any][source]

Expand key slices of a Dict.

Handles slices in the form of “start:stop:step”, expands them to individual keys, and assigns the original value to all individual keys. Keys are converted to integers.

Any of the start, stop and step can be omitted. If start is omitted, it is set to the minimum value of indices. If stop is omitted, it is set to the maximum value of indices. If step is omitted, it is set to 1.

Parameters:
  • dict – The dict whose keys shall be expanded.

  • indices – The indices to respect when expanding, i.e. if an expanded index is not in indices, it will not be added to the new dict.

Returns:

New dict with expanded keys.

Return type:

dict
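
Example

A minimal sketch of key-slice expansion: the slice key "0:3:2" expands to the integer keys 0 and 2, both mapped to the original value, while the plain key 5 is kept as is (the values are placeholders).

>>> from deode.general_utils import expand_dict_key_slice
>>> expand_dict_key_slice({"0:3:2": "hourly", 5: "3-hourly"}, indices=[0, 1, 2, 3, 4, 5])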

get_empty_nested_defaultdict()[source]

Return an empty nested (recursive) defaultdict object.
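
Example

A minimal sketch: nested assignment works without pre-creating the intermediate levels (the keys and value below are placeholders).

>>> from deode.general_utils import get_empty_nested_defaultdict
>>> d = get_empty_nested_defaultdict()
>>> d["system"]["archive"]["root"] = "/scratch"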

merge_dicts(dict1: dict, dict2: dict, overwrite: bool = False) dict[source]

Merge two dictionaries with values from dict2 taking precedence.

If values are lists, they are concatenated.

Parameters:
  • dict1 (dict) – Reference dict

  • dict2 (dict) – Update dict

  • overwrite (bool) – Whether to overwrite values in dict1 with values from dict2 if the keys are the same, but the types of the values are not lists or dicts.

Returns:

Merged dict

Return type:

(dict)

Raises:

RuntimeError – Invalid type
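
Example

A minimal sketch: values from the second dict take precedence and list values are concatenated (the keys below are placeholders).

>>> from deode.general_utils import merge_dicts
>>> base = {"general": {"case": "demo", "tags": ["a"]}}
>>> update = {"general": {"tags": ["b"], "surfex": True}}
>>> merged = merge_dicts(base, update, overwrite=True)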

modify_mappings(obj: Mapping, operator: Mapping | Callable[[Mapping], Any])[source]

Descend recursively into obj and modify encountered mappings using operator.

recursive_dict_deviation(base_dict: dict, deviating_dict: dict) dict[source]

Calculate the (recursive) difference between two dicts.

Parameters:
  • base_dict – The base dictionary to calculate deviation from.

  • deviating_dict – The dict to calculate the deviation of w.r.t. the base_dict

Returns:

The deviation as a dictionary.
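
Example

A minimal sketch: only the entries of the deviating dict that differ from the base dict are expected in the result (here, general.case).

>>> from deode.general_utils import recursive_dict_deviation
>>> base = {"general": {"case": "ref", "surfex": True}}
>>> member = {"general": {"case": "member_1", "surfex": True}}
>>> deviation = recursive_dict_deviation(base, member)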

value_from_any_generator(any_: Any | Sequence[Any] | Mapping[int, Any], indices: List[int], default_value: str | None = None) Generator[str, None, None][source]

Yield values from any type.

Parameters:
  • any – The input object to yield values from.

  • indices – The indices to retrieve from the value.

  • default_value – The default value to use if an index is not found in Mapping.

Yields:

The value from the input object.

value_from_mapping_generator(mappable: Mapping[int, Any], keys: List[int], default_value: Any) Generator[Any, None, None][source]

Yield values from a dictionary according to keys.

Parameters:
  • mappable – The mappable to yield values from

  • keys – The keys for which to retrieve corresponding values from the dictionary.

  • default_value – The default value to use if a key is not found.

Yields:

The value corresponding to the key.

value_from_sequence_generator(sequence: Sequence[Any]) Generator[Any, None, None][source]

Yield alternately one of the values from a sequence of values.

The order of the yielded values is determined by the order of the sequence.

Parameters:

sequence – The sequence to yield values from.

Yields:

One of the values from sequence in alternate order.

deode.geo_utils module

Utilities for simple geographic tasks.

class Projection(proj4str)[source]

Bases: object

Projection class.

check_key(key: str, config: dict) bool[source]

Check if key is in config.

Parameters:
  • key (str) – Key to check

  • config (dict) – Configuration

Returns:

True if key is in config

Return type:

bool

Raises:

ValueError – If key is not in config

get_domain_properties(domain_spec: dict) dict[source]

Get domain properties.

Parameters:

domain_spec (dict) – Domain specification

Returns:

Domain properties

Return type:

dict

class Projstring[source]

Bases: object

Proj4 string class.

get_projstring(lon0=0.0, lat0=-90.0) str[source]

Get proj4 string.

Parameters:
  • lon0 (float, optional) – Central longitude. Defaults to 0.0.

  • lat0 (float, optional) – Central latitude. Defaults to -90.0.

Returns:

Proj4 string

Return type:

str
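
Example

A minimal sketch: build a proj4 string for a given projection centre and wrap it in a Projection object (the coordinates are placeholders).

>>> from deode.geo_utils import Projection, Projstring
>>> proj4str = Projstring().get_projstring(lon0=10.0, lat0=60.0)
>>> projection = Projection(proj4str)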

deode.host_actions module

Handle host detection.

class DeodeHost(known_hosts=None, known_hosts_file=None)[source]

Bases: object

DeodeHost object.

detect_deode_host()[source]

Detect deode host by matching various properties.

First check self.deode_host, as set by os.getenv(“DEODE_HOST”); second, use the hosts defined in known_hosts.yml. If no match is found, return the first host defined in known_hosts.yml.

Raises:

RuntimeError – Ambiguous matches

Returns:

mapped hostname

Return type:

deode_host (str)

set_deode_home(config, deode_home=None)[source]

Set deode_home in various ways.

Parameters:
  • config (.config_parser.ParsedConfig) – Parsed config file contents.

  • deode_home (str) – Externally set deode_home

Returns:

deode_home

deode.initial_conditions module

Initial conditions.

class InitialConditions(config)[source]

Bases: object

FirstGuess.

check_if_found()[source]

Check if files are present.

find_initial_files()[source]

Find initial file.

success()[source]

Report of success.

deode.logs module

Logging-related classes, functions and definitions.

class InterceptHandler(level=0)[source]

Bases: Handler

Add logging handler to augment python stdlib logging.

Logs which would otherwise go to stdlib logging are redirected through loguru.

emit(record: LogRecord) None[source]

Emit a record.

Parameters:

record (builtin_logging.LogRecord) – Output record

class LogDefaults(*_args, **_kwargs)[source]

Bases: QuasiConstant

Defaults used for the logging system.

DIRECTORY = PosixPath('/home/runner/.logs/deode')
LEVEL = 'INFO'
RETENTION_TIME = '1 week'
SINKS = mappingproxy({'console': <_io.TextIOWrapper name='<stderr>' mode='w' encoding='utf-8'>})
class LogFormatter(datetime: str = '<green>{time:YYYY-MM-DD HH:mm:ss}</green>', level: str = '<level>{level: <8}</level>', code_location: str = '<cyan>@{name}</cyan>:<cyan>{function}</cyan> <cyan><{file.path}</cyan>:<cyan>{line}>:</cyan>', message: str = '<level>{message}</level>')[source]

Bases: object

Helper class to set up logging without polluting the module’s main scope.

code_location: str = '<cyan>@{name}</cyan>:<cyan>{function}</cyan> <cyan><{file.path}</cyan>:<cyan>{line}>:</cyan>'
datetime: str = '<green>{time:YYYY-MM-DD HH:mm:ss}</green>'
format_string(loglevel: str)[source]

Return the appropriate fmt string according to log level and fmt opts.

level: str = '<level>{level: <8}</level>'
message: str = '<level>{message}</level>'
class LoggerHandlers(default_level: str = 'INFO', **sinks)[source]

Bases: Sequence

Helper class to configure logger handlers when using loguru.logger.configure.

add(name, sink, **configs)[source]

Add handler to instance.

log_elapsed_time(**kwargs)[source]

Return a decorator that logs beginning, exit and elapsed time of function.

deode.namelist module

Namelist handling for MASTERODB w/SURFEX.

exception InvalidNamelistKindError[source]

Bases: ValueError

Custom exception.

exception InvalidNamelistTargetError[source]

Bases: ValueError

Custom exception.

class NamelistComparator(config)[source]

Bases: object

Helper class for namelist generation and integration.

compare_dicts(dbase, dcomp, action)[source]

Compare two dictionaries, recursively if needed.

The dict must only consist of dict, list, str, float, int, bool

Parameters:
  • dbase – base dict

  • dcomp – comparison dict

  • action (str) – one of ‘intersection’, ‘diff’ or ‘union’

Returns:

result dict from the specified comparison action

Return type:

dout

Raises:

SystemExit

compare_lists(libase, licomp, sib, sic, sio, action)[source]

Compare two lists, recursively if needed.

The list must only consist of dict, list, str, float, int, bool

Parameters:
  • libase – base list

  • licomp – comparison list

  • sib – list of start indices for libase

  • sic – list of start indices for licomp

  • sio – list of start indices for liout (modified)

  • action (str) – one of ‘intersection’, ‘diff’ or ‘union’

Returns:

result list from the specified comparison action

Return type:

liout

Raises:

SystemExit

class NamelistConverter[source]

Bases: object

Helper class to convert namelists between cycles, based on thenamelisttool.

static apply_tnt_directives_to_ftn_namelist(tnt_directive_filename, input_ftn)[source]

Apply the tnt directives to a fortran namelist using tnt.

Parameters:
  • tnt_directive_filename – the tnt directive filename

  • input_ftn – the namelist fortran file

Raises:

SystemExit – when conversion failed

static apply_tnt_directives_to_namelist_dict(tnt_directive_filename, namelist_dict)[source]

Apply the tnt directives to a namelist as dictionary.

Parameters:
  • tnt_directive_filename – the tnt directive filename

  • namelist_dict – the namelist dictionary

Returns:

the converted namelist dictionary

Return type:

new_namelist

Raises:

SystemExit – when conversion failed

static convert_ftn(input_ftn, output_ftn, from_cycle, to_cycle)[source]

Convert a namelist in fortran file between two cycles.

Parameters:
  • input_ftn – the input fortran filename

  • output_ftn – the output fortran filename

  • from_cycle – the input cycle

  • to_cycle – the target cycle
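
Example

A minimal sketch of a cycle-to-cycle conversion; the file names are placeholders and the cycle labels are assumed to follow the form used elsewhere in the config (e.g. CY48t3, CY49t2); see get_known_cycles() for the cycles actually handled.

>>> from deode.namelist import NamelistConverter
>>> NamelistConverter.convert_ftn("fort.4.in", "fort.4.out", "CY48t3", "CY49t2")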

static convert_yml(input_yml, output_yml, from_cycle, to_cycle)[source]

Convert a namelist in yml file between two cycles.

Parameters:
  • input_yml – the input yaml filename

  • output_yml – the output yaml filename

  • from_cycle – the input cycle

  • to_cycle – the target cycle

Raises:

SystemExit – when conversion failed

static get_known_cycles()[source]

Return the cycles handled by the converter.

static get_tnt_files_list(from_cycle, to_cycle)[source]

Return the list of tnt directive files required for the conversion.

static get_to_next_version_tnt_filenames()[source]

Return the tnt file names between get_known_cycles().

class NamelistGenerator(config, kind, substitute=True)[source]

Bases: object

Fortran namelist generator based on hierarchical merging of (yaml) dicts.

assemble_namelist(target)[source]

Generate the namelists for ‘target’.

Parameters:

target (str) – task to generate namelists for

Returns:

Assembled namelist

Return type:

nlres (f90nml.Namelist)

fn_config(arg, default=None)[source]

Resolve namelist function cfg.

fn_stepfreq(arg)[source]

Resolve namelist function stepfreq.

fn_steplist(time_intervals)[source]

Resolve namelist function steplist.

generate_namelist(target, output_file)[source]

Generate the namelists for ‘target’.

Parameters:
  • target (str) – task to generate namelists for

  • output_file – where to write the result (fort.4 or EXSEG1.nam typically)

load(target)[source]

Generate the namelists for ‘target’.

Parameters:

target (str) – task to generate namelists for

Raises:

InvalidNamelistTargetError

Returns:

Assembled namelist

Return type:

nlres (dict)

load_user_namelist()[source]

Read user provided namelist.

Returns:

Flag telling whether a namelist was found; nldict: Namelist as dictionary; cndict: Rules for dictionary

Return type:

found (boolean)

resolve_macros(cn)[source]

Resolve all macros in a nested dict (both keys and values!).

update(nldict, cndict_tag)[source]

Update with additional namelist dict.

Parameters:
  • nldict (dict) – additional namelist dict

  • cndict_tag – name to be used for recognition

update_from_config(target)[source]

Update with additional namelist dict from config.

Parameters:

target (str) – task to generate namelists for

write_namelist(nml, output_file)[source]

Write namelist using f90nml.

Parameters:
  • nml (f90nml.Namelist) – namelist to write

  • output_file (str) – namelist file name

class NamelistIntegrator(config)[source]

Bases: object

Helper class to read fortran namelists and store as yaml, in a more compact form.

Reduces duplication if several namelists have similar settings.

static dict2yml(nmldict, ymlfile, ordered_sections=None)[source]

Write dict as yaml file.

ftn2dict(ftnfile)[source]

Read fortran namelist file with f90nml and return as plain dict.

static yml2dict(ymlfile)[source]

Read yaml namelist file and return as dict.

find_value(s)[source]

Purpose: un-quote (list of) numbers and booleans.

flatten_cn(li)[source]

Recursively flatten a list of lists (of lists).

list_set_at_index(li, ix, val)[source]

Handle ‘sparse’ lists (that may contain null values).

represent_ordereddict(dumper, data)[source]

YAML representer, simplifies working with namelists read by f90nml.

to_dict(x)[source]

Recursively modify OrderedDict etc. to normal dict (required for OmegaConf).

write_namelist(nml, output_file)[source]

Write namelist using f90nml.

Parameters:
  • nml (f90nml.Namelist) – namelist to write

  • output_file (str) – namelist file name

deode.os_utils module

Utilities for simple tasks on OS level.

class Search[source]

Bases: object

Search class.

static find_files(directory, prefix='', postfix='', pattern='', recursive=True, onlyfiles=True, fullpath=False, olderthan=None, inorder=False) list[source]

Find files in a directory.

Parameters:
  • directory (str) – Directory to search in.

  • prefix (str, optional) – Only find files with this prefix. Defaults to “”.

  • postfix (str, optional) – Only find files with the postfix. Defaults to “”.

  • pattern (str, optional) – Only find files with matching pattern. Defaults to “”.

  • recursive (bool, optional) – Go into directories recursively. Defaults to True.

  • onlyfiles (bool, optional) – Show only files. Defaults to True.

  • fullpath (bool, optional) – Give full path. Defaults to False. If recursive=True, fullpath is given automatically.

  • olderthan (int, optional) – Match only files older than X seconds from now. Defaults to None.

  • inorder (bool, optional) – Return sorted list of filenames. Defaults to False.

Returns:

List containing the file names that match the criteria

Return type:

list

Examples

>>> files = Search.find_files(
                '/foo/', prefix="", postfix="", recursive=False,
                onlyfiles=True, fullpath=True, olderthan=86400*100
            )

deodemakedirs(path, unixgroup='', exist_ok=True, def_dir_mode=493)[source]

Create directories and change unix group as required.

For a given path, the topmost directory that does not yet exist is located and created, and the unix group is set if required. Permissions are set such that all subdirectories and new files inherit the unix group.

Parameters:
  • path (str) – directory path that should be created if it doesn’t already exist.

  • unixgroup (str, optional) – unix group the newly created dirs should belong to.

  • exist_ok (boolean, optional) – Define whether directories may already exist or whether an error should be raised.

  • def_dir_mode (int, optional) – Default directory permission mode. Defaults to 0o755.

Raises:

OSError – If cannot create the directory.

filepath_iterator(paths, filename_pattern='*')[source]

Return iterator of paths to files given a list of file or directory paths.

Given a path or list of paths, yield Path objects corresponding to them. If a path points to a directory, then the directory is searched recursively and the paths to the files found in this process will be yielded.

Parameters:
  • paths (Union[pathlib.Path, List[pathlib.Path], str, List[str]]) – A single path or a collection of paths.

  • filename_pattern (str, optional) – Pattern used in the recursive glob in order to select the names of the files to be yielded. Defaults to “*”.

Yields:

pathlib.Path – Path to located files.
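
Example

A minimal sketch, iterating over all TOML files found recursively under a placeholder directory.

>>> from deode.os_utils import filepath_iterator
>>> for path in filepath_iterator("./config_dir", filename_pattern="*.toml"):
...     print(path)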

ping(host)[source]

Ping host.

Parameters:

host (str) – Host to ping

Returns:

True if host responded

Return type:

(boolean)

remove_empty_dirs(src, dry_run=False)[source]

Remove directories.

Recursively and permanently removes the specified directory, and all of its empty subdirectories.

Parameters:
  • src (str or Path) – Top search directory

  • dry_run (boolean) – Flag for execution of cleaning or not

Returns:

True if any files found

Return type:

found_files (boolean)

resolve_path_relative_to_package(path: Path, ignore_errors: bool = False) Path[source]

Resolve path relative to package directory.

If the path exists as is, return it. If not, check if it exists in the package directory and return the path relative to the package.

Parameters:
  • path (Path) – Path to resolve.

  • ignore_errors (bool, optional) – Option to ignore errors. Defaults to False.

Raises:
  • FileNotFoundError – If it was impossible to determine path relative to package.

  • FileNotFoundError – If file does not exist locally or in the package directory.

Returns:

Original path (if it exists locally), or resolved path relative to the package directory.

Return type:

Path

strip_off_mount_path(path: str | Path) Path[source]

Strip off the mount path from a given path.

Assumptions:
  • the path contains the user name as a directory.

  • the parent of the user directory is of the format “<new-dir-name>” or “*_<new-dir-name>_*”, where “*” contains no underscore(s) and where the <new-dir-name> will be used as the new parent directory name relative to the user directory.

Parameters:

path (Union[str, Path]) – Path to strip off the mount path from.

Returns:

Path with the mount path stripped off.

Return type:

path

Raises:

ValueError – If the parent of the user directory does not contain 0 or 2 underscores.

Example

>>> strip_off_mount_path("/etc/ecmwf/nfs/dh1_home_b/$USER/Deode-Workflow/deode")
Path("/home/$USER/Deode-Workflow/deode")

deode.plugin module

Plug-in functionality.

class DeodePlugin(name, path)[source]

Bases: object

Deode plugin.

class DeodePluginFromConfigFile(config_file)[source]

Bases: DeodePlugin

Deode plugin.

static get_plugin_config(config_file)[source]

Get the registry config.

class DeodePluginRegistry(config=None)[source]

Bases: object

Registry of plugins.

deode_plugin()[source]

Base DEODE plugin.

get_registry_config()[source]

Get the registry config.

load_plugin(plugin)[source]

Load plugin.

Parameters:

plugin (DeodePlugin) – Deode plugin

load_plugins()[source]

Load all registered plugins.

plugin_exists(plugin)[source]

Check if plugin exists.

Parameters:

plugin (DeodePlugin) – Deode plugin

Returns:

True if already exists in registry.

Return type:

bool

register_plugin(plugin)[source]

Register plugin.

Parameters:

plugin (DeodePlugin) – Deode plugin

save_registry(config_file)[source]

Save registry config.

Parameters:

config_file (str) – Filename

class DeodePluginRegistryFromConfig(config)[source]

Bases: DeodePluginRegistry

Create a registry from a deode config file.

class DeodePluginRegistryFromFile(config_file)[source]

Bases: DeodePluginRegistry

Registry file of plugins.

deode.scheduler module

Scheduler module.

class EcflowClient(server, task)[source]

Bases: object

An ecflow client.

Encapsulate communication with the ecflow server. This will automatically call the child commands init()/complete() on job start/finish. It will also handle exceptions and signals by calling the abort child command. Only one instance of this class should be used; otherwise, zombies will be created.

static at_time()[source]

Generate time stamp.

Returns:

Time stamp.

Return type:

str

signal_handler(signum, extra=None)[source]

Signal handler.

Parameters:
  • signum (_type_) – _description_

  • extra (_type_, optional) – _description_. Defaults to None.

class EcflowLogServer(config)[source]

Bases: object

Ecflow log server.

class EcflowServer(config, start_command=None)[source]

Bases: Server

Ecflow server.

begin_suite(suite_name)[source]

Begin the suite.

Parameters:

suite_name (str) – Name of the suite.

force_aborted(task)[source]

Force the task aborted.

Parameters:

task (scheduler.EcflowTask) – Task to force aborted.

force_complete(task)[source]

Force the task complete.

Parameters:

task (scheduler.EcflowTask) – Task to force complete.

remove_suites(suite_list)[source]

Remove suites selected from a list.

Parameters:

suite_list (list) – Suite names.

replace(suite_name, def_file)[source]

Replace the suite name from def_file.

Parameters:
  • suite_name (str) – Suite name.

  • def_file (str) – Definition file.

Raises:

RuntimeError – If suite cannot be replaced.

start_server()[source]

Start the server.

Raises:

RuntimeError – Server is not running or Could not restart server.

class EcflowTask(ecf_name, ecf_tryno, ecf_pass, ecf_rid, ecf_timeout=20)[source]

Bases: object

Ecflow scheduler task.

class Server(config)[source]

Bases: ABC

Base server/scheduler class.

abstract begin_suite(suite_name)[source]

Begin the suite in a server specific way.

Parameters:

suite_name (str) – Name of the suite

Raises:

NotImplementedError – Must be implemented by the child server object.

abstract replace(suite_name, def_file)[source]

Create or change the suite definition.

Parameters:
  • suite_name (str) – Name of the suite.

  • def_file (str) – Name of the definition file.

Raises:

NotImplementedError – Must be implemented by the child server object.

abstract start_server()[source]

Start the server.

Raises:

NotImplementedError – Must be implemented by the child server object.

start_suite(suite_name, def_file, begin=True)[source]

Start the suite.

All servers implement these methods, so the suite can be started in a server-specific way.

Parameters:
  • suite_name (str) – Name of the suite

  • def_file (str) – Name of the definition file.

  • begin (bool, optional) – If the suite should begin. Defaults to True.
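
As the Raises entries above indicate, a new scheduler backend provides the three abstract methods. The subclass below is a hypothetical sketch, not part of the package.

from deode.scheduler import Server

class DryRunServer(Server):
    """Hypothetical backend that only logs what it would do."""

    def begin_suite(self, suite_name):
        print(f"begin suite {suite_name}")

    def replace(self, suite_name, def_file):
        print(f"replace suite {suite_name} from {def_file}")

    def start_server(self):
        print("start server")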

deode.submission module

Module to handle submissions.

class NoSchedulerSubmission(task_settings)[source]

Bases: object

Create and submit job without a scheduler.

submit(task, config, template_job, task_job, output, troika='troika')[source]

Submit task.

Parameters:
  • task (str) – Task name

  • config (deode.ParsedConfig) – Config

  • template_job (str) – Task template job file

  • task_job (str) – Task job file

  • output (str) – Output file

  • troika (str, optional) – troika binary. Defaults to “troika”.

Raises:

RuntimeError – Submission failure.
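
A sketch of a stand-alone submission, assuming config is a parsed deode configuration and that the task settings come from the TaskSettings class documented below; the task name and file names are placeholders.

from deode.submission import NoSchedulerSubmission, TaskSettings

# Sketch only; the task name and file names are placeholders.
config = ...  # a parsed deode configuration, obtained elsewhere
task_settings = TaskSettings(config)
submission = NoSchedulerSubmission(task_settings)
submission.submit(
    task="Forecast",
    config=config,
    template_job="job_template.py",
    task_job="Forecast.job",
    output="Forecast.log",
)  # raises RuntimeError on submission failure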

class ProcessorLayout(kwargs: dict)[source]

Bases: object

Set processor information.

get_proc_dict()[source]

Generate a processor dict.

get_wrapper()[source]

Get and potentially parse the wrapper.

class TaskSettings(config)[source]

Bases: object

Set the task-specific settings.

get_settings(task)[source]

Get the settings.

Parameters:

task (_type_) – _description_

Returns:

_description_

Return type:

_type_

get_task_settings(task, key=None, variables=None, ecf_micro='%')[source]

Get task settings.

Parameters:
  • task (_type_) – _description_

  • key (_type_, optional) – _description_. Defaults to None.

  • variables (_type_, optional) – _description_. Defaults to None.

  • ecf_micro (str, optional) – _description_. Defaults to “%”.

Returns:

_description_

Return type:

_type_

parse_job(task, config, input_template_job, task_job, variables=None, ecf_micro='%', scheduler='ecflow')[source]

Read the default job and change the interpreter.

Parameters:
  • task (str) – Task name

  • config (deode.config) – The configuration

  • input_template_job (str) – Input container template.

  • task_job (str) – Task container

  • variables (_type_, optional) – _description_. Defaults to None.

  • ecf_micro (str, optional) – _description_.

  • scheduler (str, optional) – Scheduler. Defaults to ecflow.

Raises:

RuntimeError – In case of missing module env file

parse_submission_defs(task)[source]

Parse the submission definitions.

Parameters:

task (str) – The name of the task

Returns:

Parsed settings

Return type:

dict

Raises:

RuntimeError – Undefined submit type

recursive_items(dictionary)[source]

Recursive loop of dict.

Parameters:

dictionary (_type_) – _description_

Yields:

_type_ – _description_

static update_task_setting(dic, upd)[source]

Update task settings dictionary.

Parameters:
  • dic (dict) – Dictionary to update

  • upd (dict) – Values to update

Returns:

updated dictionary

Return type:

dict
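
As an illustration of the call form only (the keys below are made up, and the exact merge behaviour for nested sections is not documented here):

from deode.submission import TaskSettings

# Hypothetical keys; illustrates the call form of the static method only.
defaults = {"WRAPPER": "srun", "NPROC": 64}
overrides = {"NPROC": 128}
merged = TaskSettings.update_task_setting(defaults, overrides)
# merged is the updated dictionary, with NPROC now 128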

deode.toolbox module

Toolbox handling e.g. input/output.

exception ArchiveError[source]

Bases: Exception

Error raised when there are problems archiving data.

class ArchiveProvider(config, pattern, fetch=True)[source]

Bases: Provider

Data from ECFS.

create_resource(resource)[source]

Create the resource.

Parameters:

resource (Resource) – Resource.

Returns:

True if success

Return type:

bool

class ECFS(config, pattern, fetch=True)[source]

Bases: ArchiveProvider

Data from ECFS.

create_resource(resource)[source]

Create the resource.

Parameters:

resource (Resource) – Resource.

Returns:

True if success

Return type:

bool

class FDB(config, pattern, fetch=True)[source]

Bases: ArchiveProvider

Dummy FDB class.

create_resource(resource)[source]

Create the resource.

Parameters:

resource (Resource) – Resource.

Raises:

RuntimeError – If expver not set

Returns:

True if success

Return type:

bool

class FileManager(config)[source]

Bases: object

FileManager class.

Default DEODE provider.

Platform specific.

create_list(basetime, forecast_range, input_template, output_settings)[source]

Create list of files to process.

Parameters:
  • basetime (datetime.datetime) – Base time.

  • forecast_range (datetime.datetime) – Forecast range.

  • input_template (str) – Input template.

  • output_settings (str) – Output settings.

Returns:

Dict of valid times and GRIB files

Return type:

dict

get_input(target, destination, basetime=None, validtime=None, check_archive=False, provider_id='symlink')[source]

Set input data to deode.

Parameters:
  • target (str) – Input file pattern

  • destination (str) – Destination file pattern

  • basetime (datetime.datetime, optional) – Base time. Defaults to None.

  • validtime (datetime.datetime, optional) – Valid time. Defaults to None.

  • check_archive (bool, optional) – Also check archive. Defaults to False.

  • provider_id (str, optional) – Provider ID. Defaults to “symlink”.

Raises:
  • ProviderError – “No provider found for {target}”

  • NotImplementedError – “Checking archive not implemented yet”

Returns:

provider, resource

Return type:

tuple
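
A hedged usage sketch of get_input; the file patterns, macro names and base time are placeholders, and only the signature documented above is assumed.

from datetime import datetime

from deode.toolbox import FileManager

# Sketch only; the patterns, macro names and base time are placeholders.
config = ...  # a parsed deode configuration, obtained elsewhere
fm = FileManager(config)
provider, resource = fm.get_input(
    target="/archive/@YYYY@/@MM@/@DD@/boundary.grib",
    destination="boundary.grib",
    basetime=datetime(2024, 6, 1),
    provider_id="symlink",
)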

get_output(target, destination, basetime=None, validtime=None, archive=False, provider_id='move')[source]

Set output data from deode.

Parameters:
  • target (str) – Input file pattern

  • destination (str) – Destination file pattern

  • basetime (datetime.datetime, optional) – Base time. Defaults to None.

  • validtime (datetime.datetime, optional) – Valid time. Defaults to None.

  • archive (bool, optional) – Also archive data. Defaults to False.

  • provider_id (str, optional) – Provider ID. Defaults to “move”.

Returns:

provider, archive provider, resource

Return type:

tuple

Raises:
  • ArchiveError – Could not archive data

  • NotImplementedError – Archive = True is not implemented

input(target, destination, basetime=None, validtime=None, check_archive=False, provider_id='symlink')[source]

Set input data to deode.

Parameters:
  • target (str) – Input file pattern

  • destination (str) – Destination file pattern

  • basetime (datetime.datetime, optional) – Base time. Defaults to None.

  • validtime (datetime.datetime, optional) – Valid time. Defaults to None.

  • check_archive (bool, optional) – Also check archive. Defaults to False.

  • provider_id (str, optional) – Provider ID. Defaults to “symlink”.

input_data_iterator(input_data_definition, provider_id='symlink')[source]

Handle input data spec dict.

Loop through the defined data types and fetch them.

Parameters:
  • input_data_definition (dict) – Input data spec

  • provider_id (str) – Provider id. Defaults to “symlink”

Raises:

RuntimeError – “Invalid data handle type”

output(target, destination, basetime=None, validtime=None, archive=False, provider_id='move')[source]

Set output data from deode.

Parameters:
  • target (str) – Input file pattern

  • destination (str) – Destination file pattern

  • basetime (datetime.datetime, optional) – Base time. Defaults to None.

  • validtime (datetime.datetime, optional) – Valid time. Defaults to None.

  • archive (bool, optional) – Also archive data. Defaults to False.

  • provider_id (str, optional) – Provider ID. Defaults to “move”.

set_resources_from_dict(res_dict)[source]

Set resources from dict.

Parameters:

res_dict (_type_) – _description_

Raises:

ValueError – If the passed file type is neither ‘input’ nor ‘output’.

class LocalFileOnDisk(config, pattern, basetime=None, validtime=None)[source]

Bases: Resource

Local file on disk.

class LocalFileSystemCopy(config, pattern, fetch=True)[source]

Bases: Provider

Local file system copy.

create_resource(resource)[source]

Create the resource.

Parameters:

resource (Resource) – Resource.

Returns:

True if success

Return type:

bool

class LocalFileSystemMove(config, pattern, fetch=False)[source]

Bases: Provider

Local file system move.

create_resource(resource)[source]

Create the resource.

Parameters:

resource (Resource) – Resource.

Returns:

True if success

Return type:

bool

Bases: Provider

Local file system.

create_resource(resource)[source]

Symlink the resource.

Parameters:

resource (Resource) – Resource.

Returns:

True if success

Return type:

bool

class Platform(config)[source]

Bases: object

Platform.

evaluate(command_string: str, object_: str | object) Any[source]

Evaluate a command string by applying the corresponding command of the object.

Parameters:
  • command_string (str) – Command string to evaluate

  • object_ (Union[str, object]) – Object to apply the command from (if the command is a function of the object). If a str, the object is assumed to be a module. If a class, the command is assumed to be a method of the class.

Raises:
  • ModuleNotFoundError – If module {object_} not found

  • AttributeError – If module/class {object_} has no attribute named {func}

  • TypeError – If object is not a class or a string

  • TypeError – If the command to evaluate is not a function

Returns:

Original command string if it is not a function call, otherwise the result of the function call.

Return type:

any

expand_macros(macros)[source]

Check and expand macros.

Parameters:

macros (dict) – Input macros

Returns:

Stored macros

Return type:

out_macros (dict)

fill_each_macro(macro_config)[source]

Fill each of the macros.

fill_macros()[source]

Fill the macros.

get_gen_macros()[source]

Get the environment macros.

Returns:

Environment macros to be used.

Return type:

dict

get_os_macros()[source]

Get the environment macros.

Returns:

Environment macros to be used.

Return type:

dict

get_platform()[source]

Get the platform.

Returns:

Platform-specific values.

Return type:

dict

get_platform_value(role, alt=None)[source]

Get the platform value.

Parameters:
  • role (str) – Type of variable to substitute

  • alt (str) – Alternative return value

Returns:

Value from platform.[role]

Return type:

str

get_provider(provider_id, target, fetch=True)[source]

Get the needed provider.

Parameters:
  • provider_id (str) – The intent of the provider.

  • target (Resource) – The target.

  • fetch (boolean) – Fetch the file or store it. Default to True.

Returns:

Provider

Return type:

Provider

Raises:

NotImplementedError – If provider not defined.
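
A sketch combining get_provider with a Resource; the file path is a placeholder and only the constructors documented in this module are assumed.

from deode.toolbox import LocalFileOnDisk, Platform

# Sketch only; the file path is a placeholder.
config = ...  # a parsed deode configuration, obtained elsewhere
platform = Platform(config)
target = LocalFileOnDisk(config, "/scratch/deode/input.grib")
provider = platform.get_provider("symlink", target, fetch=True)
# an unknown provider_id raises NotImplementedError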

get_system_macros()[source]

Get the macros.

Returns:

Macros to define.

Return type:

dict

get_system_value(role)[source]

Get the system value.

Parameters:

role (str) – Type of variable to substitute

Returns:

Value from system.[role]

Return type:

str

get_value(setting, alt=None)[source]

Get the config value with substitution.

Parameters:
  • setting (str) – Type of variable to substitute

  • alt (str) – Alternative return value

Returns:

Value from config with substituted variables

Return type:

str

resolve_macros(config_dict, keyval=None)[source]

Resolve all macros in a nested dict (both keys and values!).

store_macro(key, val)[source]

Check and store a macro.

Parameters:
  • key (str) – Macro key

  • val (str) – Macro value

Raises:

RuntimeError – If macro already exists

sub_str_dict(input_dict, basetime=None, validtime=None)[source]

Substitute strings in dictionary.

Parameters:
  • input_dict (dict) – Dict to be parsed

  • basetime (datetime.datetime, optional) – Base time. Defaults to None.

  • validtime (datetime.datetime, optional) – Valid time. Defaults to None.

Returns:

Updated dict

Return type:

d (dict)

sub_value(pattern, key, value, micro='@', ci=True)[source]

Substitute the value case-insensitively.

Parameters:
  • pattern (str) – Input string

  • key (str) – Key to replace

  • value (str) – Value to replace

  • micro (str, optional) – Micro character. Defaults to “@”.

  • ci (bool, optional) – Case insensitive. Defaults to True.

Returns:

Replaced string

Return type:

str
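
For illustration, the call below substitutes one hypothetical macro; it assumes the key appears in the pattern wrapped in the micro character, e.g. @ARCHIVE@.

from deode.toolbox import Platform

# Hypothetical macro names and paths; assumes the key appears in the pattern as @KEY@.
config = ...  # a parsed deode configuration, obtained elsewhere
platform = Platform(config)
path = platform.sub_value("@ARCHIVE@/@CASE@/output", "ARCHIVE", "/scratch/deode")
# expected: "/scratch/deode/@CASE@/output", with @CASE@ left for a later substitution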

substitute(pattern, basetime=None, validtime=None, keyval=None)[source]

Substitute pattern.

Parameters:
  • pattern (str) – _description_

  • basetime (datetime.datetime, optional) – Base time. Defaults to None.

  • validtime (datetime.datetime, optional) – Valid time. Defaults to None.

  • keyval (str) – Key associated with pattern

Returns:

Substituted string.

Return type:

str

Raises:

RuntimeError – In case of erroneous macro

substitute_datetime(pattern, datetime, suffix='')[source]

Substitute datetime related properties.

Parameters:
  • pattern (str) – _description_

  • datetime (DateTime object) – datetime to treat

  • suffix (str) – Add on to key

Returns:

Substituted string.

Return type:

str

class Provider(config, identifier, fetch=True)[source]

Bases: object

Base provider class.

create_missing_dir(target)[source]

Create a directory if missing.

Parameters:

target (str) – Name of target

create_resource(resource)[source]

Create the resource.

Parameters:

resource (Resource) – The resource to be created

Raises:

NotImplementedError – Should be implemented
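
New transfer mechanisms subclass Provider and implement create_resource, as the providers in this module do. The class below is a hypothetical sketch; in particular, the use of self.identifier and resource.identifier as source and target paths, and what create_missing_dir expects, are assumptions based on the constructor signatures, not something stated in this reference.

import shutil

from deode.toolbox import Provider

class CopyProvider(Provider):
    """Hypothetical provider that copies a local file."""

    def create_resource(self, resource):
        # Assumptions: the 'identifier' arguments are stored as paths, and
        # create_missing_dir ensures the directory for the target exists.
        self.create_missing_dir(resource.identifier)
        shutil.copy2(self.identifier, resource.identifier)
        return True  # True if success, as for the other providers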

exception ProviderError[source]

Bases: Exception

Error raised when there are provider-related problems.

class Resource(_config, identifier)[source]

Bases: object

Resource container.

class SCP(config, pattern, fetch=True)[source]

Bases: ArchiveProvider

Transfer data with SCP.

create_resource(resource)[source]

Create the resource.

Parameters:

resource (Resource) – Resource.

Raises:

RuntimeError – If directory is not created

Returns:

True if success

Return type:

bool

compute_georef(domain_config)[source]

Compute georef from domain_config.