
Can I use WPF to make touchscreen buttons react while pressed down?

I want to be able to use my touchscreen (on my Asus laptop) to press buttons in my Visual Basic applications. Now, here's a funny thing about buttons: in general, the methods that handle their "click" events do not get called when the buttons are pressed; those methods are called when the buttons are released. If there is an advantage to this, it may be that you get one last chance to avoid clicking a button you pressed by mistake (a chance you use by dragging the mouse pointer off the button before you release the mouse button).

But, that's not how a real, physical button works. Those tend to trigger whatever action they govern the moment you press them down. There's no real-world action comparable to dragging a mouse pointer off of a button, so there's no sequence of events that would let you press a physical button down yet somehow indicate you didn't want its associated action to occur. In short, by the time you have pressed a physical button down, it's too late to undo it. If there's an advantage to this, it may be that you can be sure the action will happen immediately upon the press, not some time later (for a typical virtual button clicked with a mouse, that later moment is the release).

If you like that on-the-release behavior, then your touchscreen will partly make you happy. That's because, again, the "click" doesn't happen until you lift your finger from the screen. But, I say "partly" because even the mouse-down event doesn't happen until you lift your finger. For whatever reason, the buttons you add to a Visual Basic application's UI with Windows Forms do not "know" you have touched the screen until you untouch the screen (or, curiously, until you move your finger while touching the screen).

I'd like to see immediate feedback and I'd also like the click event to occur upon touching, not upon untouching, the screen, so the screen button and physical button will behave as much like each other as they can. Windows Presentation Foundation lets me do that.

Here's a simple Windows Forms application with handlers that check the appropriate boxes upon the MouseDown and Click events:
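The handlers themselves are trivial; a minimal VB sketch of what they might look like (the control names here are my own invention, since the form's screenshots aren't reproduced):

[code]
' Hypothetical control names standing in for the form shown in the screenshots.
Private Sub PressMeButton_MouseDown(sender As Object, e As MouseEventArgs) _
        Handles PressMeButton.MouseDown
    MouseDownCheckBox.Checked = True   ' record that MouseDown fired
End Sub

Private Sub PressMeButton_Click(sender As Object, e As EventArgs) _
        Handles PressMeButton.Click
    ClickCheckBox.Checked = True       ' record that Click fired
End Sub

Private Sub ClearButton_Click(sender As Object, e As EventArgs) _
        Handles ClearButton.Click
    MouseDownCheckBox.Checked = False  ' reset both indicators
    ClickCheckBox.Checked = False
End Sub
[/code]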


Here's what happens when, after shifting the focus to that "Clear" button, I touch the "Press Me!" button (which has those event handlers connected to it):


That's right. Nothing happens. No events fire. The focus doesn't even move.

If I wiggle my finger on the screen surface a bit, the focus does move and I get the MouseDown event:


Finally, if I lift my finger (with or without wiggling it first), I get both events:


Well, that's not what I want, though it's a bit tantalizing as, somehow, Windows does seem to know I have touched the screen (else how could it react to my wiggling finger?). It's tempting to pursue it in the hope that we could use the MouseDown event to call a handler and get the behavior we want. But, even if we could, that's really a dead-end. Here's why: if you use the keyboard to press the button, you never get a MouseDown event at all. Here's the "Press Me!" button after it has received the focus and while the space bar is subsequently being held down:


Again, no events (although the button does darken in color a bit, providing visual feedback that it is being pressed). Here's the situation after the space bar is released:


We finally do get our Click event, but no MouseDown event is ever fired. Using MouseDown to know when a button has been clicked is a non-starter.

What about Windows Presentation Foundation? WPF solves both our problems, one automatically and one optionally.

Here's the approximately identical application written with WPF (we've again used our XAML skills to make the button change color and seem to recede a bit when pressed):
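The press feedback might be done with something like the following XAML (a sketch, not the actual markup from the screenshots; replacing the button's template is one way to let the pressed look show through, since the default button chrome paints its own pressed state):

[code=xml]
<Button Content="Press Me!" Click="PressMe_Click">
  <Button.Template>
    <ControlTemplate TargetType="Button">
      <Border x:Name="Chrome" Background="LightGray" BorderBrush="Gray"
              BorderThickness="1" Padding="8">
        <ContentPresenter HorizontalAlignment="Center" VerticalAlignment="Center"/>
      </Border>
      <ControlTemplate.Triggers>
        <Trigger Property="IsPressed" Value="True">
          <!-- Darken and inset slightly so the button seems to recede -->
          <Setter TargetName="Chrome" Property="Background" Value="DarkGray"/>
          <Setter TargetName="Chrome" Property="Margin" Value="2,2,0,0"/>
        </Trigger>
      </ControlTemplate.Triggers>
    </ControlTemplate>
  </Button.Template>
</Button>
[/code]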


Note that this is doing everything we want: the button reacts visibly when we touch the screen, and the Click event fires on the "finger down" event, not upon release. The first feature, reacting upon touch, is automatic with WPF. It just works that way. The second, firing the Click event upon press (instead of waiting for release), is an option. All we had to do was add the ClickMode attribute to the Button element in our XAML code (the Content and Click values below stand in for whatever the real markup uses): [code=xml]
<Button Content="Press Me!" ClickMode="Press" Click="PressMe_Click"/>
[/code]