I want to be able to use my touchscreen (on my Asus laptop) to press buttons on my Visual Basic applications. Now, here's a funny thing about buttons: In general, the methods that handle their "click" events do not get called when they are pressed; those methods are called when the buttons are released. If there is an advantage to this, it may be that you get one last chance to avoid clicking a button you may have pressed by mistake (a chance you use by dragging the mouse pointer off of the button before you release the mouse button).
But, that's not how a real, physical button would work. Those tend to generate whatever action they govern when you press them down. There's no real-world action that is comparable to dragging a mouse pointer off of a button, so there's no sequence of events you could follow that would let you press a physical button down, yet somehow indicate you didn't want its associated action to occur. In short, by the time you have pressed a physical button down, it's too late to undo it. If there's an advantage to this, it may be that you can be sure that the action you want will happen immediately upon the press, not some time after (which is immediately upon the release, for a typical virtual button you click with a mouse).
If you like that on-the-release behavior, then your touchscreen will partly make you happy. That's because, again, the "click" doesn't happen until you lift your finger from the screen. But, I say "partly" because even the mouse-down event doesn't happen until you lift your finger. For whatever reason, the buttons you add to a Visual Basic application's UI with Windows Forms do not "know" you have touched the screen until you untouch the screen (or, curiously, until you move your finger while touching the screen).
I'd like to see immediate feedback and I'd also like the click event to occur upon touching, not upon untouching, the screen, so the screen button and physical button will behave as much like each other as they can. Windows Presentation Foundation lets me do that.
Here's a simple Windows Forms application with handlers that check the appropriate boxes upon the MouseDown and Click events:
[img]http://exlumina.com/1369.png[/img]
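For reference, handlers along these lines do the box-checking in that screenshot. This is a minimal sketch; the control names (PressMeButton, ClearButton, MouseDownCheckBox, ClickCheckBox) are my own stand-ins, not necessarily what you'd use:

[code=vb]' Minimal Windows Forms sketch: check a box when each event fires.
' Control names here are assumptions made for illustration.
Public Class Form1

    Private Sub PressMeButton_MouseDown(sender As Object, e As MouseEventArgs) _
            Handles PressMeButton.MouseDown
        MouseDownCheckBox.Checked = True
    End Sub

    Private Sub PressMeButton_Click(sender As Object, e As EventArgs) _
            Handles PressMeButton.Click
        ClickCheckBox.Checked = True
    End Sub

    Private Sub ClearButton_Click(sender As Object, e As EventArgs) _
            Handles ClearButton.Click
        ' Reset both boxes before the next test.
        MouseDownCheckBox.Checked = False
        ClickCheckBox.Checked = False
    End Sub

End Class[/code]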
Here's what happens when, after shifting the focus to that "Clear" button, I touch the "Press Me!" button (which has those event handlers connected to it):
[img]http://exlumina.com/1370.png[/img]
That's right. Nothing happens. No events fire. The focus doesn't even move.
If I wiggle my finger on the screen surface a bit, the focus does move and I get the MouseDown event:
[img]http://exlumina.com/1387.png[/img]
Finally, if I lift my finger (with or without wiggling it first), I get both events:
[img]http://exlumina.com/1371.png[/img]
Well, that's not what I want, though it's a bit tantalizing as, somehow, Windows does seem to know I have touched the screen (else how could it react to my wiggling finger?). It's tempting to pursue it in the hope that we could use the MouseDown event to call a handler and get the behavior we want. But, even if we could, that's really a dead-end. Here's why: if you use the keyboard to press the button, you never get a MouseDown event at all. Here's the "Press Me!" button after it has received the focus and while the space bar is subsequently being held down:
[img]http://exlumina.com/1374.png[/img]
Again, no events (although the button does darken in color a bit, providing visual feedback that it is being pressed). Here's the situation after the space bar is released:
[img]http://exlumina.com/1375.png[/img]
We finally do get our Click event, but no MouseDown event is ever fired. Using MouseDown to know when a button has been clicked is a non-starter.
What about Windows Presentation Foundation? WPF solves both our problems, one automatically, and one optionally.
Here's the approximately identical application written with WPF (we've again used our XAML skills to make the button change color and seem to recede a bit when pressed):
[img]http://exlumina.com/1376.png[/img]
Note that this is doing everything we want: the button reacts visibly when we touch the screen, and the Click event fires on the "finger down" event, not upon release. The first feature, reacting upon touch, is automatic with WPF. It just works that way. The second, firing the Click event upon press (instead of waiting for release), is an option. All we had to do was add the ClickMode attribute to the Button element in our XAML code:
[code=xml]<!-- Names and handler here are illustrative; the essential additions are
     ClickMode="Press" and the newline entities in the Content string. -->
<Button x:Name="PressMeButton"
        ClickMode="Press"
        Content="Press&#x0D;&#x0A;Me!"
        Click="PressMeButton_Click" />[/code]
"ClickMode" is also settable in the Visual Studio property sheet when the button is selected in the designer.
Now I can have touchscreen buttons that behave like real-world buttons do, with visible response upon contact, and immediate action.
By the way, you may have noticed something else about those WPF buttons that you don't tend to see on Windows Forms buttons: multiple lines of text. Look back at that XAML, above. See the Content attribute? That clumsy string that is its value inserts the hoary Windows (nay, DOS) "newline" sequence into the text string. WPF buttons can handle that. Try it as the text of a Windows Forms button, and you'll get a button that says, "Press&#x0D;&#x0A;Me!" on your screen.
Now, that text is left-justified and I admit that's probably not what I want. Centered is probably what everyone would want. But I don't know how to do that yet, so we'll end here with the answer to the question I asked in the first place, and come back another day to ask more about WPF buttons and their content.
"Il y a peu de choses qui me soient impossibles..."