Python Global Variables In Multiple Files
Solution 1:
It looks like (although you don't say so explicitly) you are running your programs as two completely independent invocations of the Python interpreter.
There is no such magic as you are hoping for: just as with two instances of the same program running, each one has its own instance of every variable (global or otherwise).
If you are performing a simple task, the easier way to go is to have each process write its state to a text file, and have the other process read the information from the file generated by each process it wants to know about (on Unix you could even use named pipes).
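A minimal sketch of the file-based approach: the writer dumps its state to a temporary file and renames it into place (so the reader never sees a half-written file), and the reader polls that file. The file name `worker_status.json` and the JSON layout are just illustrative choices, not anything mandated by the answer.

```python
import json
import os
import tempfile

STATUS_FILE = "worker_status.json"  # hypothetical path both processes agree on

def publish(value, path=STATUS_FILE):
    # Writer process: dump state to a temp file, then rename it over the
    # target. os.replace is atomic, so readers see old or new, never partial.
    fd, tmp = tempfile.mkstemp(dir=os.path.dirname(path) or ".")
    with os.fdopen(fd, "w") as f:
        json.dump({"value": value}, f)
    os.replace(tmp, path)

def read_status(path=STATUS_FILE):
    # Reader process: returns None until the writer has published something.
    try:
        with open(path) as f:
            return json.load(f)["value"]
    except FileNotFoundError:
        return None

publish(42)
print(read_status())  # → 42
```

Each process would call only its own half (`publish` in the writer, `read_status` in a polling loop in the reader); the snippet runs both in one process just to show the round trip.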
The other way is to have a Python script coordinate the starting of your daemons using the multiprocessing
stdlib module, and then create a multiprocessing.Manager object to share variables directly between processes.
This is more complicated to set up at first, but it is the clean thing to do. Check the docs on the Manager class here:
https://docs.python.org/3/library/multiprocessing.html
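A short sketch of the Manager approach, assuming a simple shared counter (the function names and the three-worker setup are illustrative): the parent creates a managed dict, hands the proxy to each child process, and the children all mutate the same underlying object. A manager Lock guards the read-modify-write so increments aren't lost.

```python
from multiprocessing import Manager, Process

def worker(shared, lock):
    # Each child process increments the same shared counter via its proxy.
    with lock:
        shared["count"] += 1

def run_demo(n_workers=3):
    with Manager() as manager:
        shared = manager.dict({"count": 0})  # proxy dict shared across processes
        lock = manager.Lock()                # guards the read-modify-write
        procs = [Process(target=worker, args=(shared, lock))
                 for _ in range(n_workers)]
        for p in procs:
            p.start()
        for p in procs:
            p.join()
        return shared["count"]

if __name__ == "__main__":
    print(run_demo())  # → 3
```

In a real daemon setup the coordinating script would keep the Manager alive for the lifetime of the daemons instead of tearing it down in a `with` block.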
Solution 2:
How do I share global variables across modules?
The canonical way to share information across modules within a single program is to create a special module (often called config or cfg). Just import the config module in all modules of your application; the module then becomes available as a global name. Because there is only one instance of each module, any changes made to the module object get reflected everywhere. For example, with a shared module named glb:
a.py:
import time
import glb

while True:
    glb.t += 1
    time.sleep(3)
    print(glb.t)
b.py:
import glb
import a

while True:
    print(glb.t)
glb.py:
t = 0
Output after starting b.py (note that b.py never reaches its own loop: importing a runs a's infinite loop, whose print produces one line every three seconds):
$ python b.py
1
2
3
4
5
6