It may be useful for users to be able to attach some metadata to a xsimlab.Model object, e.g., a description, comments, package versions, etc., via something like xsimlab.Model({}, metadata={}).
Using the xarray interface, this metadata could then be used to systematically add informative global attributes to the Dataset objects generated with the model.
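For illustration, here is a minimal sketch of that idea using plain xarray. The metadata= keyword and the exact mapping to global attributes are part of the proposal, not the current xsimlab API:

```python
import xarray as xr

# Hypothetical metadata dict, as would be passed to the proposed
# xsimlab.Model({}, metadata={...}) constructor argument.
metadata = {
    "description": "toy advection model",
    "comments": "setup used for the tutorial",
}

# The xarray interface could then copy it into the global attributes of
# the Dataset it generates (e.g., in create_setup), roughly like this:
ds = xr.Dataset()
ds.attrs.update(metadata)

print(ds.attrs)
```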
This raises a couple of questions:
How should metadata be handled by Model.drop_processes() and Model.update_processes()? Blindly copying all the metadata is risky: updating the processes may, for example, make the description invalid. It is probably better not to copy any metadata.
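A sketch of that "don't copy" policy, using a stand-in class rather than the actual xsimlab.Model implementation:

```python
# Stand-in sketch only, not the real xsimlab.Model: models derived via
# drop_processes()/update_processes() start with empty metadata, since
# the original description may no longer apply to the modified model.
class ModelSketch:
    def __init__(self, processes, metadata=None):
        self.processes = dict(processes)
        self.metadata = dict(metadata or {})

    def drop_processes(self, keys):
        remaining = {k: v for k, v in self.processes.items() if k not in keys}
        return ModelSketch(remaining)  # metadata intentionally not copied

    def update_processes(self, processes):
        merged = {**self.processes, **processes}
        return ModelSketch(merged)  # metadata intentionally not copied
```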
Should the metadata be added to the input or the output Dataset? The input Dataset is probably better: if we save the input to disk and load it back later, we at least have a clue about which model was used to create the setup. Edit: considering also methods like Dataset.xsimlab.update_vars or Dataset.xsimlab.filter_vars that generate new Datasets, another option would be to systematically add (and maybe overwrite) the metadata in every method that accepts a Model instance as an argument.
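A sketch of that last option, again with a hypothetical helper (the function name and call sites are assumptions, not existing xsimlab API):

```python
import xarray as xr

def attach_model_metadata(ds, model_metadata):
    """Hypothetical helper: (over)write model metadata as global attrs.

    Every accessor method that accepts a Model instance (create_setup,
    update_vars, filter_vars, ...) could call something like this so the
    attributes always reflect the model actually used.
    """
    out = ds.copy()
    out.attrs.update(model_metadata)
    return out

# Usage sketch: stale attributes get overwritten.
ds = xr.Dataset(attrs={"description": "old model"})
ds = attach_model_metadata(ds, {"description": "updated model"})
print(ds.attrs["description"])  # -> "updated model"
```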