Hello dear forumers,

I’m trying to set up a button in QML that reacts as soon as the user presses it, as almost every touch application should. However, when I press my finger on the button, the onPressed handler is only called after I move my finger, as if Qt only adds the touch point once I move my finger while pressed.

This is the test code:
MultiPointTouchArea {
    id: mousearea
    anchors.fill: parent

    // Works correctly: fires as soon as the finger is lifted
    onReleased: {
        container.clicked()
    }

    // Problem: only fires after the finger moves (or on release)
    onPressed: {
        console.log("PRESSED!!!")
    }
}
This snippet only outputs PRESSED if I either move my finger or release it. The onReleased handler works correctly.
Is this normal behaviour?
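
For comparison, here is roughly the same test with a plain MouseArea instead of a MultiPointTouchArea (a minimal, self-contained sketch; the container item and its clicked() signal here are stand-ins for my real code). As far as I understand, a MouseArea's onPressed should fire immediately on touch-down, so if this version logs right away, the delay would seem to be specific to MultiPointTouchArea:

import QtQuick 2.0

Item {
    id: container
    signal clicked()   // stand-in for the signal my real button emits

    MouseArea {
        anchors.fill: parent

        // Expected: logs immediately on touch-down,
        // with no finger movement needed
        onPressed: {
            console.log("MouseArea PRESSED!!!")
        }

        onReleased: {
            container.clicked()
        }
    }
}

If the MouseArea shows the same delay, I'd guess the problem is in how the Windows touch events reach Qt rather than in MultiPointTouchArea itself.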

If I use the Microsoft Surface Input Visualizer, provided with the Microsoft Surface SDK 2.0, I can see the touch point being created the moment I press the screen, even if I don’t move my finger!
If you’d like to test this and don’t have a touch screen device within reach, I suggest using Microsoft’s Surface Input Simulator, also provided with the Surface SDK, which generates Windows touch events for you.

Best regards, I hope someone can help.

Theo