Touchscreen applications with Progress 4GL

simnik

New Member
After reading some articles, I'm still not sure how application development for touchscreens works.

Does every Windows application, including a Progress application, also work on a computer with a touchscreen?

Can I build a program that follows touch UI rules (e.g. large buttons) and have it work on any computer with a touchscreen?

I'm grateful for any feedback on how to start developing touchscreen apps.
 
All the touch screens I have come across emulate the mouse. When the operator touches the screen, Windows sends a MOUSE-MOVE event to move the cursor to the touched position, then a MOUSE-DOWN, and when they lift their finger, a MOUSE-UP.

You should be able to develop your code using just the mouse and everything should work fine on your touch screens. Just remember that operators won't be able to right-click, and that click-and-drag operations aren't easy to do on a touch screen.
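
Touch input therefore needs no special handling at the event level. As a minimal sketch (not from the thread, just for illustration), you can watch this for yourself with a session-wide trigger on the portable mouse event:

/* Every touch tap arrives as an ordinary mouse press. */
ON MOUSE-SELECT-DOWN ANYWHERE
DO:
    MESSAGE "Press at x=" LAST-EVENT:X " y=" LAST-EVENT:Y.
END.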

Another thing that comes up: if you have a "Next" button and the operator is moving through a number of records, they will click the Next button quickly, and some of those clicks (usually every second one) will appear to your application as a DOUBLE-CLICK. You may therefore need to treat DOUBLE-CLICK as an ordinary click for some of your objects.
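
As a sketch of that (the button and frame names here are hypothetical), the portable double-click event can simply be re-applied as a CHOOSE so no press is lost:

/* A fast second tap on the Next button becomes another ordinary press. */
ON MOUSE-SELECT-DBLCLICK OF btnNext IN FRAME fMain
DO:
    APPLY "CHOOSE" TO SELF.
END.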
 
We've developed numerous touchscreen-based applications to run in a factory-floor environment and have found Progress an excellent tool to do so.

Basically, touching the screen raises the LEFT-MOUSE-CLICK event, so you can write your triggers against that.

There are a couple of documents around on designing touchscreen-based apps; a quick Google search will unearth some. From experience, keep the interface simple: large buttons, clear labelling, good grouping of buttons, and so on.
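
As a rough sketch of those rules (all names here are mine, not Paul's), an oversized button wired to a plain CHOOSE trigger handles mouse clicks and touch taps alike:

DEFINE BUTTON btnStart LABEL "START" SIZE 30 BY 3.

DEFINE FRAME fMain
    btnStart AT ROW 2 COL 15
    WITH SIZE 60 BY 8 THREE-D TITLE "Factory Terminal".

/* One trigger covers both input methods, since taps arrive as clicks. */
ON CHOOSE OF btnStart IN FRAME fMain
DO:
    MESSAGE "Starting..." VIEW-AS ALERT-BOX.
END.

ENABLE btnStart WITH FRAME fMain.
WAIT-FOR WINDOW-CLOSE OF CURRENT-WINDOW.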

I also have some code for removing title bars from windows, which has proved very useful, as users had a tendency to try to drag windows around.
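
Paul's exact code isn't shown, but the usual route is the Windows API. A hedged sketch: clear the WS_CAPTION style bits (GWL_STYLE = -16, WS_CAPTION = 12582912). Classic ABL has no bitwise operators, so this simply subtracts the bits, which is only safe while the caption is actually present, as it is on a standard Progress window:

PROCEDURE GetWindowLongA EXTERNAL "user32.dll":
    DEFINE INPUT  PARAMETER hwnd   AS LONG.
    DEFINE INPUT  PARAMETER nIndex AS LONG.
    DEFINE RETURN PARAMETER rStyle AS LONG.
END PROCEDURE.

PROCEDURE SetWindowLongA EXTERNAL "user32.dll":
    DEFINE INPUT  PARAMETER hwnd   AS LONG.
    DEFINE INPUT  PARAMETER nIndex AS LONG.
    DEFINE INPUT  PARAMETER dwNew  AS LONG.
    DEFINE RETURN PARAMETER rOld   AS LONG.
END PROCEDURE.

DEFINE VARIABLE iStyle AS INTEGER NO-UNDO.
DEFINE VARIABLE iOld   AS INTEGER NO-UNDO.

/* Read the current style, drop the caption bits, write it back. */
RUN GetWindowLongA (CURRENT-WINDOW:HWND, -16, OUTPUT iStyle).
RUN SetWindowLongA (CURRENT-WINDOW:HWND, -16, iStyle - 12582912, OUTPUT iOld).

/* Hide and re-show the window to force the frame to redraw. */
CURRENT-WINDOW:VISIBLE = NO.
CURRENT-WINDOW:VISIBLE = YES.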

Hope this helps,
Paul O' Connor.
 

Luke Gardiner

New Member
Just wondering if anyone has found a way to tell the difference between a touchscreen tap and a mouse click. We are looking at using both a mouse and a touchscreen in our app, but in some cases we want to handle each differently. For example, moving a window around with the touchscreen would take one tap on the title bar to pick it up and another tap to put it down somewhere else.
 
If your application runs with one or two specific touch screen devices, you may be able to just look at the running processes or the registry, e.g.:

HKEY_LOCAL_MACHINE\HARDWARE\DEVICEMAP\PointerClass
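
A sketch of reading that key from ABL with the registry-aware environment statements (hedged: the exact value layout under PointerClass varies by driver, so verify on your hardware):

DEFINE VARIABLE cPointer AS CHARACTER NO-UNDO.

/* Point the environment statements at the key above. */
LOAD "HARDWARE\DEVICEMAP" BASE-KEY "HKEY_LOCAL_MACHINE" NO-ERROR.
USE "HARDWARE\DEVICEMAP" NO-ERROR.
GET-KEY-VALUE SECTION "PointerClass" KEY DEFAULT VALUE cPointer.
UNLOAD "HARDWARE\DEVICEMAP" NO-ERROR.
USE "".  /* back to the default environment */

MESSAGE "Pointer device:"
    (IF cPointer = ? THEN "none found" ELSE cPointer)
    VIEW-AS ALERT-BOX.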

This won't work if you have both a mouse and a touch screen, or VNC/PC Anywhere alongside touch screen input.

The best bet might be a "use touch screen" startup parameter, e.g.:

prowin32.exe -pf startapp.pf -param TSCREEN=true,LOG=false
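
The -param string comes back through SESSION:PARAMETER, so a minimal sketch of pulling TSCREEN out of it:

DEFINE VARIABLE lTouch AS LOGICAL   NO-UNDO.
DEFINE VARIABLE cEntry AS CHARACTER NO-UNDO.
DEFINE VARIABLE i      AS INTEGER   NO-UNDO.

/* SESSION:PARAMETER holds "TSCREEN=true,LOG=false". */
DO i = 1 TO NUM-ENTRIES(SESSION:PARAMETER):
    cEntry = ENTRY(i, SESSION:PARAMETER).
    IF ENTRY(1, cEntry, "=") = "TSCREEN" THEN
        lTouch = (ENTRY(2, cEntry, "=") = "true").
END.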

You could then have a RIGHT-CLICK menu option to switch to mouse mode if needed (e.g. when you connect in via VNC).
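
For that switch, a sketch (the menu, frame, and variable names are mine): attach a pop-up menu, since only a real mouse can open it with a right-click:

DEFINE MENU mPopup
    MENU-ITEM miMouseMode LABEL "Switch to mouse mode".

ON CHOOSE OF MENU-ITEM miMouseMode
    lTouch = NO.

/* Right-clicking the frame opens the menu; a touch tap never will. */
FRAME fMain:POPUP-MENU = MENU mPopup:HANDLE.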
 