Add tests for config #212

Open · olivierdelree wants to merge 3 commits into main

Conversation

olivierdelree (Contributor)

Closes #211.

This PR provides a first batch of tests for the Config class and related functions.
I've tried to handle most use cases, but I'm more than happy to add other tests if there's anything I forgot.

olivierdelree self-assigned this May 30, 2024
olivierdelree added the "tests" label May 30, 2024
olivierdelree (Contributor, Author)

It seems the automated GitHub tests fail because of the ctime comparisons. Putting an artificial micro sleep in the tests seemed like the easiest way to ensure the ctime would differ when the directories are recreated immediately, but tox doesn't agree and the sleep would need to be larger.
So I'm open to suggestions on how to fix this (or on whether looking at ctime is just a bad idea to begin with).
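
For reference, a self-contained sketch of the ctime approach (hypothetical code, not the PR diff; the delete-and-recreate below stands in for config.clear_cache, and the sleep length is exactly the part tox objects to):

import os
import tempfile
import time
import unittest


class CtimeCheckSketch(unittest.TestCase):
    def test_clear_recreates_directory(self) -> None:
        """Hypothetical illustration: detect recreation via a changed ctime."""
        with tempfile.TemporaryDirectory() as tmp:
            target = os.path.join(tmp, "cache_dir")
            os.mkdir(target)
            before = os.stat(target).st_ctime_ns
            # Micro sleep so the recreated directory gets a later ctime even on
            # filesystems with coarse timestamp resolution.
            time.sleep(0.01)
            # Stand-in for config.clear_cache(): delete and recreate the directory.
            os.rmdir(target)
            os.mkdir(target)
            self.assertNotEqual(before, os.stat(target).st_ctime_ns)


if __name__ == "__main__":
    unittest.main()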

Comment on lines +117 to +127
def test_clear_cache(self) -> None:
"""Test clearing cache does delete and recreate cache directory."""
self.assert_not_equal_ctime("cache_dir", config.clear_cache)

def test_clear_db(self) -> None:
"""Test clearing database does delete and recreate database directory."""
self.assert_not_equal_ctime("db_dir", config.clear_db)

def test_clear_data(self) -> None:
"""Test clearing data does delete and recreate data directory."""
self.assert_not_equal_ctime("data_dir", config.clear_data)
Member

I get what you're doing here, but instead of testing for deletion/recreation, I think a better test might be to check whether these directories are empty: stick a file in, then assert that the directory is empty after clear_data.

The reasoning is that, technically speaking, a clear_anything function ought to result in an empty directory. Whether that's done by deleting and recreating the directory, or by deleting every single file in it, is kind of irrelevant; the latter would produce the same desirable outcome but fail the current test.
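
A rough sketch of what that could look like (hypothetical; the removal loop stands in for whatever config.clear_data actually does):

import os
import tempfile
import unittest


class EmptyAfterClearSketch(unittest.TestCase):
    def test_clear_leaves_directory_empty(self) -> None:
        """Hypothetical illustration: seed a file, clear, then assert emptiness."""
        with tempfile.TemporaryDirectory() as tmp:
            data_dir = os.path.join(tmp, "data_dir")
            os.mkdir(data_dir)
            # Stick a file in first...
            with open(os.path.join(data_dir, "stale.txt"), "w") as fh:
                fh.write("old data")
            # Stand-in for config.clear_data(): any implementation that empties the directory.
            for name in os.listdir(data_dir):
                os.remove(os.path.join(data_dir, name))
            # ...then assert the directory is empty, however the clearing was done.
            self.assertEqual(os.listdir(data_dir), [])


if __name__ == "__main__":
    unittest.main()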

Contributor (Author)

That's what I meant to do at first, but the check for an empty directory doesn't actually work, because data_dir is the parent of cache_dir and db_dir in the default config. That means those two always get recreated before clear_data returns. I could do a recursive check that only empty directories exist in the tree under the directory of interest, but that feels a bit smelly to me.

So checking for emptiness makes sense for db and cache but not for data, and since I needed a different check for data, I figured I might as well use the same logic for the other two.
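
To make the nesting problem concrete, a small hypothetical sketch (assuming the default layout described above, with cache_dir and db_dir inside data_dir):

import os
import tempfile

with tempfile.TemporaryDirectory() as tmp:
    # Even a freshly "cleared" data_dir immediately contains its child
    # directories again, mirroring how clear_data recreates cache_dir and
    # db_dir before it returns.
    data_dir = os.path.join(tmp, "data_dir")
    os.makedirs(os.path.join(data_dir, "cache_dir"))
    os.makedirs(os.path.join(data_dir, "db_dir"))
    # So an emptiness assertion on data_dir can never pass.
    print(sorted(os.listdir(data_dir)))  # ['cache_dir', 'db_dir']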

Contributor (Author)

I guess I could check for emptiness everywhere except cache_dir and db_dir, but that means that any time we add a default directory to the config, we'd have to check whether it sits inside another directory described in the config and, if it does, change the tests.

I'm not too happy with the ctime check either, but I just couldn't find a better option.

import tempfile
import time
import typing
import unittest
Member

Not really an issue, but is there any particular reason to prefer unittest over pytest? I've always used the latter, but I don't have a strong preference; just curious.

Contributor (Author)

Same reason (and it's one fewer pip install / dependency).
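
For comparison, the emptiness-style check from the earlier thread would look roughly like this under pytest, using its tmp_path fixture instead of unittest's setUp machinery (hypothetical sketch, assuming pytest were added as a dev dependency):

def test_clear_leaves_directory_empty(tmp_path) -> None:
    """Hypothetical pytest version of the emptiness check."""
    data_dir = tmp_path / "data_dir"
    data_dir.mkdir()
    (data_dir / "stale.txt").write_text("old data")
    # Stand-in for config.clear_data().
    for entry in data_dir.iterdir():
        entry.unlink()
    assert list(data_dir.iterdir()) == []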
