This function enables the default touch controllers on a widget. To enable touch on a widget canvas, call this function on the root container; children added to the widget canvas after touch has been enabled through the root container will automatically enable touch support on themselves. Touch can also be enabled on a widget other than the root container, but in that case touch events may not route correctly.
int IWidget_EnableTouch(IWidget *pif);
- pif: [in] Pointer to the IWidget interface object.
By default, display transparency has no effect on how containers and widgets respond to touch events. Even a widget drawn as completely transparent will process a touch event that falls within its extent, provided it is at the top of the z-order.
Let's say you're creating a custom user interface containing a series of radio buttons (AEECLSID_RadioWidget) drawn as stars (via IWidget_SetPropImageStrip). When touch is first enabled, the radio buttons use AEEWIDGET_TOUCH_MODE_ALWAYS, in which any touch within the rectangular bounding box of a radio widget counts as a touch on that widget.
If you instead want the radio buttons to react only to touches on the portion of the radio button that contains the star image, change the touch mode to AEEWIDGET_TOUCH_MODE_TRANSPARENCY using IWidget_SetTouchMode. In this mode, whenever the user touches the radio button, the point of the touch is compared against the transparency value of the star image at that point. If it passes the threshold test, the parent routes the touch event to the radio widget. If it fails, the touch instead falls through to the widget behind the radio button (such as the container itself), where the touch-mode tests begin anew, this time using the touch modes of the widgets behind the radio button.