It could be an uninitialized variable which gets initialized to a sane value in debug mode.
J-P Nurmi
This is often called the Heisenberg principle. It originates from physics (see the Wikipedia entry: http://en.wikipedia.org/wiki/Uncertainty_principle). The idea is that when you want to measure a system, you insert a probe to measure it, and thus you change the system. It applies to software as well (read the Popular culture section of the Wikipedia entry for more "implementations" of this principle).
The application crashes due to a race condition; you add a debugging probe (a breakpoint or a print statement), which changes the internal timings (or you compile in debug mode, which might be slower, faster, or just different), different commands get scheduled differently - oops, no problem anymore.
Now the fun part of my response: good luck, you will need it.
I have a similar problem: it works in debug, but not in release. To make it special, this happens on Linux; on Windows it's the other way around. I have yet to find any reason for this behavior. My solution ends up being this:
win32 {
CONFIG += release
}
linux-g++ {
CONFIG += debug
}
I also run strip on the app on Linux to get a smaller binary. It's not always 0s and 1s...
Good luck, but don't waste too much time on this quest. It might not be worth it.
Thanks for your replies guys.
I solved it by basically going back and rewriting the thread more cleanly, so that its behavior under stochastic (i.e. user) input is more robust.
I don't know what the exact problem was, but the code looks prettier now and it works in both debug and release!
-Kaushik