Hello dear forumers,
I’m trying to set up a button in QML that reacts the moment the user presses it, as almost every touch application should. However, when I press my finger on the button, the onPressed handler is only called after I move my finger, as if Qt only registers the touch point once I move my finger while it is pressed.
This is the test code:
MultiPointTouchArea {
    id: mousearea
    anchors.fill: parent
    onReleased: {
        container.clicked()
    }
    onPressed: {
        console.log("PRESSED!!!")
    }
}
This snippet only prints PRESSED if I either move my finger or release it. The onReleased handler works correctly.
Is this normal behaviour?
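In case anyone wants to reproduce this, here is a minimal self-contained version of my test case that can be run directly with qmlscene. Note that the container Rectangle and its clicked() signal are stand-ins I added for this post; in my real code, container is the button component:

import QtQuick 2.0

Rectangle {
    id: container
    width: 320; height: 240
    color: "lightsteelblue"

    // Stand-in for the real button component's activation signal.
    signal clicked()
    onClicked: console.log("clicked() emitted")

    MultiPointTouchArea {
        id: mousearea
        anchors.fill: parent
        // Expected: fires the moment a finger touches the area.
        // Observed: fires only after the finger moves or is lifted.
        onPressed: {
            console.log("PRESSED!!!")
        }
        onReleased: {
            container.clicked()
        }
    }
}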
If I use the Microsoft Surface Input Visualizer, provided with the Microsoft Surface SDK 2.0, I can see the touch point being created as soon as I press the screen, even if I don’t move my finger!
If you’d like to test this and don’t have a touch-screen device at hand, I suggest using Microsoft’s Surface Input Simulator, also provided with the Surface SDK, which generates Windows touch events for you.
Best regards, I hope someone can help.
Theo