Quote Originally Posted by KaptainKarl
We use this method to write server daemons.
We create a private slot that watches for signals that tell the application to exit.
Without the signal, the program continues to run.
So what is the purpose of the single-shot timer? How does the program "continue to run" if, when you leave the slot, you call the application's exit() slot, which quits the event loop? Could you provide a minimal compilable example demonstrating what you mean?