Help Regarding Touch Screen | developer.brewmp.com

Help Regarding Touch Screen

Hi,

I have a problem showing the virtual keypad on touch screen devices.
I am using BREW SDK 3.1.5 and trying to implement it on the LG VX10000 and LG VX9700.
I have read almost all the threads related to touch screen, like handling pointer events in the event handler and handling ITEXTCTL_HandleEvent, but all failed.

Can anyone help me with this?

I have a few more queries:
1> Does the emulator for the LG VX10000 and VX9700 also show the virtual keypad, or does it appear only on the handset?

Don't know about the 10000, but I was able to get the 9700 to work using the past posts on this topic in the forums.
The Simulator does not show the virtual keypad; you'll have to test this on the physical handset. Good luck.

Hi,
Thanks for your reply.
I just wanted to know:
1> Can I see the virtual keypad on any other device (as I don't have an LG VX9700 handset now) by making the same changes in other builds, for example the Moto V9m?
2> The only touch screen device I have is the LG VX10000, but when I try to run my app in touch screen mode it gives the prompt "open flip to use". Is there any way I can run my app on the 10000 in touch screen mode?
Please help me with this!

Hi Abhinav,
Try adding the MIME type "application/x-touchscreen" to your application's MIF and see whether you can launch your application from the external display.
Regards,
Suneel

On the LG 10K you cannot use the touch screen.

Hi Pradeep,
On the LG VX10K the touch screen is enabled for Brew applications. It also works on the LG RD10K. The only difference is that you cannot launch your application from the external display using touch, because the Brew App Manager is not touch enabled. If you want to support the touch screen on the RD10K, don't close your application when the flip is closed; then your application will get pointer events from the external display.
Regards,
Suneel

Will I be able to view my application output on the external screen?

Yes. You can see your application output on the external display if you don't close your app on EVT_FLIP.

suneelkumar, you are right. Thanks!

You CAN launch an application with the clamshell closed on the VX10000. This enables you to use the touch screen for your app without forcing people to open the clamshell.
Inside the MIF file:
MIME extensions
Add MIME type = 'image/display-touchscreen'
Base Class AEECLSID_VIEW

Hi all,
I need to port my application to a touch screen device, but at present I only have the device pack for the LG VX10000. I have read in the various threads about the EVT_POINTER events and EVT_PEN_DOWN, EVT_PEN_UP, etc., but I have no clue how to proceed further (my application is ready, but what changes have to be made and added?) :confused:
As mentioned in the post above, I have added the MIME type image/display-touchscreen in my MIF settings, but I am not getting any output on the simulator (LG VX10000). Please help.
PS: Is it necessary to use the IControl interface for implementing the touch screen events, as my application doesn't have the IControl interface?
Thanks & Regards,
Akshay

You need the latest Simulator pack to support touch: BREW Tools 3.1.5 SP1 or thereabouts.
You should be using the EVT_POINTER events.
Do not rely on the simulator; get a device.
The Simulator will report touch events regardless of the MIF or device.
No, you don't need IControl; touch events just trigger EVT_POINTER events, much like key presses. You don't need to do anything special, just handle those events.
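The advice above (handle EVT_POINTER events in the ordinary event handler, no IControl needed) can be sketched as follows. This is a self-contained sketch: the event codes and types below are stand-ins for the real definitions in the BREW SDK headers, not the actual SDK values.

```c
#include <assert.h>
#include <stdint.h>
#include <stdio.h>

/* Stand-ins for the BREW event codes; the real values come from the
 * SDK headers, so treat these as illustrative only. */
enum {
    EVT_KEY = 0x100,
    EVT_POINTER_DOWN = 0x200,
    EVT_POINTER_UP,
    EVT_POINTER_MOVE,
    EVT_POINTER_STALE_MOVE
};

typedef int boolean;

/* Pointer events arrive through the same HandleEvent path as key
 * presses: match on the event code and return TRUE when handled. */
static boolean App_HandleEvent(uint16_t eCode, uint16_t wParam, uint32_t dwParam)
{
    (void)wParam;
    (void)dwParam;
    switch (eCode) {
    case EVT_KEY:
        printf("key event\n");
        return 1;
    case EVT_POINTER_DOWN:
    case EVT_POINTER_UP:
    case EVT_POINTER_MOVE:
    case EVT_POINTER_STALE_MOVE:
        /* dwParam carries the pointer payload; extract x/y with the
         * AEE_POINTER_GET_X / AEE_POINTER_GET_Y helpers. */
        printf("pointer event\n");
        return 1;
    default:
        return 0; /* unhandled: let BREW process it */
    }
}
```

Returning FALSE for unrecognized events matters, so that BREW's default handling still happens.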

Hello ZeroCool,
Thanks for the reply.
I am getting the touch events via the EVT_POINTER events, but can you please tell me the differences between the events in a non touch screen app and a touch screen one? How do I handle both in one app, and what are the steps involved?
I read somewhere about using touch helper functions like AEE_GET_XY but have no idea how to use them.
Please explain these.
Thanks in advance,
Akshay

Hi all,
I followed these steps for a touch screen application.
For a touch screen enabled application, you have to make a change in the MIF file, i.e. add
---application/x-touch_screen---- in the MIME TYPES of the extensions tab of the MIF file.
Next:
1> Find the location of the point on the screen where you are touching. This can be done using the functions AEE_POINTER_GET_X & AEE_POINTER_GET_Y from the file "AEEPointerHelpers.h":
{
char *co_ordinate_Str = (char *)dwParam;
int touch_x = AEE_POINTER_GET_X(co_ordinate_Str);
int touch_y = AEE_POINTER_GET_Y(co_ordinate_Str);
}
2> Then find where the touch point lies: in the textbox rect, the menu rect, or the softmenu rect. Depending on the location, activate that interface and call its HandleEvent function.
3> If you are handling the textbox properly, then touching the textbox in your application launches the virtual keyboard automatically. Your application will get a suspend event on virtual keypad launch and a resume event on completion of the virtual keypad operation.
4> If you want to get the data entered via the virtual keypad, then don't release the respective textbox control on the application's suspend event, because the data is held by the text box. On the application's resume event, fetch the data with the ITEXTCTL_GetText API.
5> Get the touch point in the following events:
{
case EVT_POINTER_DOWN:
case EVT_POINTER_UP:
case EVT_POINTER_MOVE:
case EVT_POINTER_STALE_MOVE:
}
Hope this will help. I followed the above procedure for both the LG 9700 and the LG 10000.
Thanks
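Step 1 above can be sketched in a self-contained way. On device, AEE_POINTER_GET_X / AEE_POINTER_GET_Y come from AEEPointerHelpers.h and understand the SDK's payload encoding; the stand-in parsers below assume a simplified "x,y,..." string purely for illustration, so always use the real macros in your app.

```c
#include <assert.h>
#include <stdlib.h>
#include <string.h>

/* Simplified stand-ins for AEE_POINTER_GET_X / AEE_POINTER_GET_Y.
 * The assumed "x,y,..." payload format is illustrative only; the real
 * encoding is defined by AEEPointerHelpers.h in the SDK. */
static int Pointer_GetX(const char *payload)
{
    return atoi(payload);  /* parse the leading x field */
}

static int Pointer_GetY(const char *payload)
{
    const char *comma = strchr(payload, ',');
    return comma ? atoi(comma + 1) : -1;  /* parse the y field after the comma */
}

/* In the event handler, dwParam for EVT_POINTER_* points at the payload:
 *
 *   char *co_ordinate_Str = (char *)dwParam;
 *   int touch_x = Pointer_GetX(co_ordinate_Str);
 *   int touch_y = Pointer_GetY(co_ordinate_Str);
 */
```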

Hi Prasant,
Thanks for explaining the steps.
I have added the MIME type = 'image/display-touchscreen' with Base Class AEECLSID_VIEW, but you wrote ---application/x-touch_screen----.
What is the difference between the two?
Also, in the first point you said to find the location of the touch. Where do we have to write this? In the handle event? If yes, then directly under the EVT_POINTER events?
How do I work out where the touch event lies?
I know these seem like stupid questions, but implementing touch in a non touch app is giving me headaches.
Your help would be highly appreciated.
Thanks in advance,
Akshay

Akshay,
I have replied to your private message, please check it.

akshay1384 wrote: Hi Prasant.
Thanks for explaining the steps.
I have added the MIME type = 'image/display-touchscreen' with Base Class AEECLSID_VIEW, but you wrote ---application/x-touch_screen----.
What is the difference between the two?
Also, in the first point you said to find the location of the touch. Where do we have to write this? In the handle event? If yes, then directly under the EVT_POINTER events?
How do I work out where the touch event lies?
I know these seem like stupid questions, but implementing touch in a non touch app is giving me headaches.
Your help would be highly appreciated.
Thanks in advance.
Akshay
Hi Akshay,
I replied to you, but it may not have reached you, sorry.
1>
application/x-touch_screen-----for enabling touch screen
image/x-disprotation-------------for enabling screen rotation
I have no idea about 'image/display-touchscreen'.
2>
You will get the touch location coordinates under the EVT_POINTER events.
3> How to handle the touch event:
Step 1: On the current screen, get the rectangle of every interface, like the menu control, the soft key control, the textbox rect, etc.
Step 2: Manually find which rectangle the point lies inside, using conditional statements.
Step 3: Then activate the respective interface (like IMENUCTL_SetActive(menu_ptr, TRUE)) and call its handle event function.
Hope this will help you.
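Steps 1-3 above amount to plain rectangle hit-testing. The sketch below is self-contained: the Rect type stands in for BREW's AEERect, the rects would really come from your controls, and the activate-and-forward part of step 3 is only indicated in comments because it needs the real interfaces.

```c
#include <assert.h>
#include <stdint.h>

/* Stand-in for BREW's AEERect: origin (x, y) plus size (dx, dy). */
typedef struct {
    int16_t x, y, dx, dy;
} Rect;

static int point_in_rect(int x, int y, const Rect *r)
{
    return x >= r->x && x < r->x + r->dx &&
           y >= r->y && y < r->y + r->dy;
}

typedef enum { HIT_NONE, HIT_TEXTBOX, HIT_MENU, HIT_SOFTKEY } HitTarget;

/* Step 2: decide which control rectangle contains the touch point.
 * Step 3 would then be, e.g., IMENUCTL_SetActive(menu_ptr, TRUE)
 * followed by a call to that control's HandleEvent. */
static HitTarget hit_test(int x, int y, const Rect *textbox,
                          const Rect *menu, const Rect *softkey)
{
    if (point_in_rect(x, y, textbox))
        return HIT_TEXTBOX;
    if (point_in_rect(x, y, menu))
        return HIT_MENU;
    if (point_in_rect(x, y, softkey))
        return HIT_SOFTKEY;
    return HIT_NONE;
}
```

The order of the tests doubles as a priority when control rects overlap, which is a design choice worth making explicit in a real app.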

suneelkumar wrote: yes. You can see your application output on external display if you don't close your app on EVT_FLIP.
Hi suneelkumar,
As you suggested in the post, I have made the required changes in the code, i.e. not closing the app on EVT_FLIP and setting the MIF MIME type to image/display-touchscreen, but the problem is that I am not able to continue my app when I close the clamshell of the LG 10000.
Can you guide me on how to continue my app on the external display after launching it in landscape mode?

Please provide sample code to implement touch screen support in my code.
Thanks in advance,
ami

I've got the VX10000 showing up on the outside display, but I can't figure out how to get it to rotate so that it's showing in portrait mode instead of landscape. I've tried IDISPLAY_SetPrefs(pMe->m_pIDisp, "r:0", 3); but that doesn't seem to do anything.
Also, is it possible to know when you are showing the outside display or the inside display?
Thanks so much!

To get the current orientation, use ISHELL_GetDeviceInfoEx(pMe->a.m_pIShell, AEE_DEVICESTATE_SCR_ORIENTATION, &so, &size);
To set the orientation, use IDISPLAY_SetPrefs(pMe->a.m_pIDisplay, "a:0", STRLEN("a:0"));
You may need to set the clip rect using IDISPLAY_SetClipRect.

Hi all,
I am new to touch applications. I tried the above steps but my debugger is telling me:
error C2065: 'EVT_POINTER_DOWN' : undeclared identifier
What should I do to make my application accept this? I am not able to write anything related to
case EVT_POINTER_DOWN:
case EVT_POINTER_UP:
case EVT_POINTER_MOVE:
case EVT_POINTER_STALE_MOVE:
All show the same error.
Any suggestion will be helpful for me.
Thanks

Where are all the touch events handled?
If someone could provide a code snapshot in reply, it would be of great help.
Thanks

You need to include the "AEEPointerHelpers.h" header file, which has the definitions of all these events.
You can refer to the sample apps (like c_listwidgettouch_app_o.zip) available at https://developer.brewmp.com/resources/family/ui to get an idea.

Hello adhudase,
I have included the "AEEPointerHelpers.h" header file, thanks. Now I am able to get the x and y coordinates wherever I touch, but there is one more problem: it gives me some points outside the screen. :(
My code is as follows:
x and y are uint8;
pMe->x = AEE_POINTER_GET_X((char*)dwParam);
pMe->y = AEE_POINTER_GET_Y((char*)dwParam);
DBGPRINTF("P_UP::X=%u  Y=%u ", pMe->x, pMe->y);
// If my device is 176x200 pixels, it sometimes gives output y values such as 260, 298, or 208, which are of no use.
How can I map these uint8 values to my device's pixels? Is a conversion required?
Any suggestions? Please help me.
Thanks

You mean tapping on the screen gives x/y coordinates outside the screen? That shouldn't happen. Which device is that?
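One thing worth checking in the snippet above is the uint8 declaration: the pointer helper macros yield full-range coordinates, and storing them in a uint8 silently wraps anything above 255, which produces bogus positions. The sketch below is a hypothetical reconstruction of that failure mode, not a diagnosis of the actual device:

```c
#include <assert.h>
#include <stdint.h>

/* Storing a pointer coordinate in a uint8 wraps values above 255,
 * e.g. a y of 300 silently becomes 44.  Keep the result of
 * AEE_POINTER_GET_X / AEE_POINTER_GET_Y in a wider type such as int. */
static unsigned store_coord_as_uint8(int coord)
{
    uint8_t y = (uint8_t)coord;  /* the bug: truncation to 8 bits */
    return y;
}

static int store_coord_as_int(int coord)
{
    return coord;                /* the fix: keep the full value */
}
```

Clamping the parsed coordinates against the device's screen rectangle afterwards is a cheap additional safety net.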
