Dynamically import every module in a given directory/package
I have a file structure like this:
    dir_a
        __init__.py
        mod_1.py
        mod_2.py
    dir_b
        __init__.py
        my_mod.py
I want to dynamically import every module in dir_a in my_mod.py, and in order to do that, I have the following inside the __init__.py of dir_a:
    import os
    import glob

    __all__ = [os.path.basename(f)[:-3]
               for f in glob.glob(os.path.dirname(os.path.abspath(__file__)) + "/*.py")]
But when I do:
from dir_a import *
I get ImportError: No module named dir_a
I wonder what causes this. Is it because the parent directory containing dir_a and dir_b has to be in PYTHONPATH? In addition, the above approach is only dynamic in the sense that my_mod.py picks up whatever modules exist when it is run; if a new module is added to dir_a while the program is running, that module won't be picked up. So is it possible to implement a truly dynamic importing mechanism?
The answer to the first part of your question is a trivial yes: you cannot import dir_a unless the directory containing dir_a is on the module search path (sys.path, which PYTHONPATH feeds into).
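To make the search-path requirement concrete, here is a self-contained sketch. It builds a throwaway dir_a package in a temporary directory purely for demonstration; in your real project you would skip that part and just make sure the parent of dir_a is on sys.path before importing.

```python
import os
import sys
import tempfile

# Build a throwaway dir_a package so the example runs anywhere
# (in a real project, dir_a already exists on disk).
parent = tempfile.mkdtemp()
pkg = os.path.join(parent, "dir_a")
os.makedirs(pkg)
open(os.path.join(pkg, "__init__.py"), "w").close()
with open(os.path.join(pkg, "mod_1.py"), "w") as f:
    f.write("x = 1\n")

# The import only works once the *parent* of dir_a is on sys.path:
sys.path.insert(0, parent)
from dir_a import mod_1
print(mod_1.x)  # 1
```

Equivalently, you can set PYTHONPATH to the parent directory in the environment before starting Python, which prepends it to sys.path for you.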
For the second part, I don't believe it is possible in general. You could listen for 'file created' OS events in dir_a, check whether each new file is a .py, and append its name to dir_a.__all__, but that is complicated, probably expensive, and it doesn't retroactively add the new module to the importer's namespace: from foo import * only sees what is in foo.__all__ at the moment the import runs. Changing that behavior would also be error-prone, since it would allow your namespace to change at any time based on an unpredictable external event. Say you do this:
    from dir_a import *

    bar = 5
And then a bar.py gets added to dir_a. You can't write a "truly dynamic" importer without considering that situation and deciding what you want to happen: should the new module silently replace your variable, or should your variable win?
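The usual compromise is to avoid mutating the caller's namespace at all, and instead rescan the package on demand with importlib, returning the modules explicitly. A hedged sketch (load_all and its parameters are my own naming, not a standard API):

```python
import glob
import importlib
import os

def load_all(package_name, package_path):
    """Re-scan package_path and import every .py module found there.

    Returns a dict of module name -> module object, so the caller
    decides what to do with newly appeared modules instead of having
    them injected into its namespace unannounced.
    """
    modules = {}
    for path in glob.glob(os.path.join(package_path, "*.py")):
        name = os.path.basename(path)[:-3]
        if name == "__init__":
            continue  # the package itself, not a submodule
        modules[name] = importlib.import_module(f"{package_name}.{name}")
    return modules

# Call load_all("dir_a", path_to_dir_a) again at any time to pick up
# files added since the last scan.
```

Because the result is an ordinary dict, a module named bar never collides with a local variable bar: you reach it as modules["bar"], and the ambiguity in the example above simply doesn't arise.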