I am having a problem with my X11 Qt5 application running on a Raspberry Pi 3 (OS: Jessie) connected to the official Raspberry Pi 7-inch touchscreen display.
The application works exactly as desired with a mouse, but when the user uses the touchscreen, the event is ALSO picked up and handled by the desktop underneath. In other words, if I press on a location where I know a desktop icon to be (while it is covered by my full-screen application), the system thinks I am trying to open that icon's application, move it, etc.
I found this thread, which describes an identical problem, but unfortunately it was never resolved: http://www.qtcentre.org/threads/6621...erating-system
I'm spinning my wheels trying to figure this one out. Does anyone have any suggestions? I will of course provide any additional information needed; I'm just not sure what is relevant at this point.