2010-11-19 00:25:28

by Ping Cheng

Subject: [PATCH] Add BTN_TOOL_BUTTONS to input.h

We "borrowed" BTN_TOOL_FINGER from input/mouse to pass tablet
buttons to the user land. This has not been an issue since the
tablet was not considered a mouse replacement. With the
introduction of hybrid digitizer and touch devices, the tool
type is causing confusion. A new tool type is due for the
well-being of future input device drivers.

Signed-off-by: Ping Cheng <[email protected]>
---
drivers/input/tablet/hanwang.c | 6 +++---
drivers/input/tablet/wacom_wac.c | 24 ++++++++++++------------
include/linux/input.h | 1 +
3 files changed, 16 insertions(+), 15 deletions(-)

diff --git a/drivers/input/tablet/hanwang.c b/drivers/input/tablet/hanwang.c
index 6504b62..c59f78c 100644
--- a/drivers/input/tablet/hanwang.c
+++ b/drivers/input/tablet/hanwang.c
@@ -112,7 +112,7 @@ static const int hw_absevents[] = {

static const int hw_btnevents[] = {
BTN_STYLUS, BTN_STYLUS2, BTN_TOOL_PEN, BTN_TOOL_RUBBER,
- BTN_TOOL_MOUSE, BTN_TOOL_FINGER,
+ BTN_TOOL_MOUSE, BTN_TOOL_BUTTONS,
BTN_0, BTN_1, BTN_2, BTN_3, BTN_4, BTN_5, BTN_6, BTN_7, BTN_8,
};

@@ -202,7 +202,7 @@ static void hanwang_parse_packet(struct hanwang *hanwang)

switch (type) {
case HANWANG_ART_MASTER_III:
- input_report_key(input_dev, BTN_TOOL_FINGER, data[1] ||
+ input_report_key(input_dev, BTN_TOOL_BUTTONS, data[1] ||
data[2] || data[3]);
input_report_abs(input_dev, ABS_WHEEL, data[1]);
input_report_key(input_dev, BTN_0, data[2]);
@@ -212,7 +212,7 @@ static void hanwang_parse_packet(struct hanwang *hanwang)
break;

case HANWANG_ART_MASTER_HD:
- input_report_key(input_dev, BTN_TOOL_FINGER, data[1] ||
+ input_report_key(input_dev, BTN_TOOL_BUTTONS, data[1] ||
data[2] || data[3] || data[4] ||
data[5] || data[6]);
input_report_abs(input_dev, ABS_RX,
diff --git a/drivers/input/tablet/wacom_wac.c b/drivers/input/tablet/wacom_wac.c
index b3252ef..360ce4a 100644
--- a/drivers/input/tablet/wacom_wac.c
+++ b/drivers/input/tablet/wacom_wac.c
@@ -268,7 +268,7 @@ static int wacom_graphire_irq(struct wacom_wac *wacom)
input_report_key(input, BTN_4, (data[7] & 0x80));
rw = ((data[7] & 0x18) >> 3) - ((data[7] & 0x20) >> 3);
input_report_rel(input, REL_WHEEL, rw);
- input_report_key(input, BTN_TOOL_FINGER, 0xf0);
+ input_report_key(input, BTN_TOOL_BUTTONS, 0xf0);
if (!prox)
wacom->id[1] = 0;
input_report_abs(input, ABS_MISC, wacom->id[1]);
@@ -286,7 +286,7 @@ static int wacom_graphire_irq(struct wacom_wac *wacom)
input_report_key(input, BTN_4, (data[7] & 0x10));
input_report_key(input, BTN_5, (data[7] & 0x40));
input_report_abs(input, ABS_WHEEL, (data[8] & 0x7f));
- input_report_key(input, BTN_TOOL_FINGER, 0xf0);
+ input_report_key(input, BTN_TOOL_BUTTONS, 0xf0);
if (!prox)
wacom->id[1] = 0;
input_report_abs(input, ABS_MISC, wacom->id[1]);
@@ -486,8 +486,8 @@ static int wacom_intuos_irq(struct wacom_wac *wacom)
/* pad packets. Works as a second tool and is always in prox */
if (data[0] == WACOM_REPORT_INTUOSPAD) {
/* initiate the pad as a device */
- if (wacom->tool[1] != BTN_TOOL_FINGER)
- wacom->tool[1] = BTN_TOOL_FINGER;
+ if (wacom->tool[1] != BTN_TOOL_BUTTONS)
+ wacom->tool[1] = BTN_TOOL_BUTTONS;

if (features->type >= INTUOS4S && features->type <= INTUOS4L) {
input_report_key(input, BTN_0, (data[2] & 0x01));
@@ -552,11 +552,11 @@ static int wacom_intuos_irq(struct wacom_wac *wacom)
if ((data[5] & 0x1f) | data[6] | (data[1] & 0x1f) |
data[2] | (data[3] & 0x1f) | data[4] | data[8] |
(data[7] & 0x01)) {
- input_report_key(input, wacom->tool[1], 1);
- input_report_abs(input, ABS_MISC, PAD_DEVICE_ID);
+ input_report_key(input, BTN_TOOL_BUTTONS, 1);
+ input_report_abs(input, ABS_MISC, PAD_DEVICE_ID);
} else {
- input_report_key(input, wacom->tool[1], 0);
- input_report_abs(input, ABS_MISC, 0);
+ input_report_key(input, BTN_TOOL_BUTTONS, 0);
+ input_report_abs(input, ABS_MISC, 0);
}
}
input_event(input, EV_MSC, MSC_SERIAL, 0xffffffff);
@@ -1141,7 +1141,7 @@ void wacom_setup_input_capabilities(struct input_dev *input_dev,
case WACOM_G4:
input_set_capability(input_dev, EV_MSC, MSC_SERIAL);

- __set_bit(BTN_TOOL_FINGER, input_dev->keybit);
+ __set_bit(BTN_TOOL_BUTTONS, input_dev->keybit);
__set_bit(BTN_0, input_dev->keybit);
__set_bit(BTN_4, input_dev->keybit);
/* fall through */
@@ -1179,7 +1179,7 @@ void wacom_setup_input_capabilities(struct input_dev *input_dev,
case CINTIQ:
for (i = 0; i < 8; i++)
__set_bit(BTN_0 + i, input_dev->keybit);
- __set_bit(BTN_TOOL_FINGER, input_dev->keybit);
+ __set_bit(BTN_TOOL_BUTTONS, input_dev->keybit);

input_set_abs_params(input_dev, ABS_RX, 0, 4096, 0, 0);
input_set_abs_params(input_dev, ABS_RY, 0, 4096, 0, 0);
@@ -1203,7 +1203,7 @@ void wacom_setup_input_capabilities(struct input_dev *input_dev,
__set_bit(BTN_2, input_dev->keybit);
__set_bit(BTN_3, input_dev->keybit);

- __set_bit(BTN_TOOL_FINGER, input_dev->keybit);
+ __set_bit(BTN_TOOL_BUTTONS, input_dev->keybit);

input_set_abs_params(input_dev, ABS_RX, 0, 4096, 0, 0);
input_set_abs_params(input_dev, ABS_Z, -900, 899, 0, 0);
@@ -1222,7 +1222,7 @@ void wacom_setup_input_capabilities(struct input_dev *input_dev,
case INTUOS4S:
for (i = 0; i < 7; i++)
__set_bit(BTN_0 + i, input_dev->keybit);
- __set_bit(BTN_TOOL_FINGER, input_dev->keybit);
+ __set_bit(BTN_TOOL_BUTTONS, input_dev->keybit);

input_set_abs_params(input_dev, ABS_Z, -900, 899, 0, 0);
wacom_setup_intuos(wacom_wac);
diff --git a/include/linux/input.h b/include/linux/input.h
index 6ef4446..be3c4bf 100644
--- a/include/linux/input.h
+++ b/include/linux/input.h
@@ -468,6 +468,7 @@ struct input_keymap_entry {
#define BTN_TOOL_FINGER 0x145
#define BTN_TOOL_MOUSE 0x146
#define BTN_TOOL_LENS 0x147
+#define BTN_TOOL_BUTTONS 0x148
#define BTN_TOUCH 0x14a
#define BTN_STYLUS 0x14b
#define BTN_STYLUS2 0x14c
--
1.7.2.3


2010-11-22 07:56:04

by Dmitry Torokhov

Subject: Re: [PATCH] Add BTN_TOOL_BUTTONS to input.h

Hi Ping,

On Thu, Nov 18, 2010 at 04:25:35PM -0800, Ping Cheng wrote:
> We "borrowed" BTN_TOOL_FINGER from input/mouse to pass tablet
> buttons to the user land. This has not been an issue since the
> tablet was not considered a mouse replacement. With the
> introduction of hybrid digitizer and touch devices, the tool
> type is causing confusion. A new tool type is due for the
> well-being of future input device drivers.
>

I am sorry but I do not understand the reasoning behind
BTN_TOOL_BUTTONS.

The BTN_TOOL_* were introduced to indicate to the userspace tool that is
currently touching the surface of the device. Buttons are expected to be
always present and can change their state regardless of what tool is
being used at the moment. I.e. The full hardware state (between
EV_SYN/SYN_REPORT) could be, for example,

Pen at 10,20, BTN_0, and BTN_2 (ABS_X 10, ABS_Y 20, BTN_TOOL_PEN, BTN_0,
BTN_2) or

Lens at 20,15 and BTN_1 (ABS_X 20, ABS_Y 15, BTN_TOOL_LENS, BTN_1).

As you can see BTN_* events can accompany either BTN_TOOL_LENS or
BTN_TOOL_PEN or any other BTN_TOOL_*.

Thanks.

--
Dmitry

2010-11-23 17:40:22

by Ping Cheng

Subject: Re: [PATCH] Add BTN_TOOL_BUTTONS to input.h

On Sun, Nov 21, 2010 at 11:55 PM, Dmitry Torokhov
<[email protected]> wrote:
> Hi Ping,
>
> On Thu, Nov 18, 2010 at 04:25:35PM -0800, Ping Cheng wrote:
>> We "borrowed" BTN_TOOL_FINGER from input/mouse to pass tablet
>> buttons to the user land. This has not been an issue since the
>> tablet was not considered a mouse replacement. With the
>> introduction of hybrid digitizer and touch devices, the tool
>> type is causing confusion. A new tool type is due for the
>> well-being of future input device drivers.
>>
>
> I am sorry but I do not understand the reasoning behind
> BTN_TOOL_BUTTONS.

Don't be sorry, Dmitry. Your statement is fair since:

1. I did not explain it clearly;
2. You do not have the physical device (Intuos series) to test with.

I'll explain it again by referring back to the code to see if I can
make it clearer. The code I am going to refer to is under the
Linus tree at git.kernel.org.

> The BTN_TOOL_* were introduced to indicate to the userspace tool that is
> currently touching the surface of the device. Buttons are expected to be
> always present and can change their state regardless of what tool is
> being used at the moment. I.e. The full hardware state (between
> EV_SYN/SYN_REPORT) could be, for example,
>
> Pen at 10,20, BTN_0, and BTN_2 (ABS_X 10, ABS_Y 20, BTN_TOOL_PEN, BTN_0,
> BTN_2) or
>
> Lens at 20,15 and BTN_1 (ABS_X 20, ABS_Y 15, BTN_TOOL_LENS, BTN_1).
>
> As you can see BTN_* events can accompany either BTN_TOOL_LENS or
> BTN_TOOL_PEN or any other BTN_TOOL_*.

You are right. The tablet buttons can go with one of those other
BTN_TOOL_s _if_ they do not define the same event types (BTN_s) as the
tablet buttons.

The new Bamboo MT code sends both BTN_LEFT and BTN_RIGHT events for
Tablet Buttons (refer to line 905 and 908 of wacom_wac.c). However,
BTN_LEFT and BTN_RIGHT are also sent by BTN_TOOL_MOUSE/LENS tool
(refer to wacom_wac.c line 622 to 665).

If we remove BTN_TOOL_FINGER without a BTN_TOOL-something to replace
it, the two LEFT and RIGHT buttons will have a hard time to tell the
user land if they are from the MOUSE/LENS or the Tablet Buttons. The
worst case could be the LEFT/RIGHT sent later overwrites the earlier
ones.

We could do some guesswork in the user land to figure out which
LEFT/RIGHT belongs to which BTN_TOOL_ if the above scenario does not
happen. But, it would be much cheaper and more reliable if we can tell
the user land where those LEFT and RIGHT come from. This is the whole
purpose of the kernel driver, isn't it?
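
To illustrate the overlap, a rough sketch only, not the actual driver code
(the pad_buttons/mouse_buttons variables are placeholders):

	/* Both sources feed the same input device, so userspace only sees
	 * "BTN_LEFT changed" with no way to tell pad from mouse.
	 */
	input_report_key(input, BTN_LEFT, pad_buttons & 0x01);   /* tablet button */
	input_report_key(input, BTN_LEFT, mouse_buttons & 0x01); /* mouse button  */
	/* Within one SYN_REPORT frame, the later report simply replaces the
	 * earlier value of the same key.
	 */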

Ping

2010-11-23 22:25:31

by Dmitry Torokhov

Subject: Re: [PATCH] Add BTN_TOOL_BUTTONS to input.h

On Tue, Nov 23, 2010 at 09:40:19AM -0800, Ping Cheng wrote:
> On Sun, Nov 21, 2010 at 11:55 PM, Dmitry Torokhov
> <[email protected]> wrote:
> > Hi Ping,
> >
> > On Thu, Nov 18, 2010 at 04:25:35PM -0800, Ping Cheng wrote:
> >> We "borrowed" BTN_TOOL_FINGER from input/mouse to pass tablet
> >> buttons to the user land. This has not been an issue since the
> >> tablet was not considered a mouse replacement. With the
> >> introduction of hybrid digitizer and touch devices, the tool
> >> type is causing confusion. A new tool type is due for the
> >> well-being of future input device drivers.
> >>
> >
> > I am sorry but I do not understand the reasoning behind
> > BTN_TOOL_BUTTONS.
>
> Don't be sorry, Dmitry. Your statement is fair since:
>
> 1. I did not explain it clearly;
> 2. You do not have the physical device (Intuos series) to test with.
>
> I'll explain it again by referring back to the code to see if I can
> make it clearer. The code I am going to refer to is under the
> Linus tree at git.kernel.org.
>
> > The BTN_TOOL_* were introduced to indicate to the userspace tool that is
> > currently touching the surface of the device. Buttons are expected to be
> > always present and can change their state regardless of what tool is
> > being used at the moment. I.e. The full hardware state (between
> > EV_SYN/SYN_REPORT) could be, for example,
> >
> > Pen at 10,20, BTN_0, and BTN_2 (ABS_X 10, ABS_Y 20, BTN_TOOL_PEN, BTN_0,
> > BTN_2) or
> >
> > Lens at 20,15 and BTN_1 (ABS_X 20, ABS_Y 15, BTN_TOOL_LENS, BTN_1).
> >
> > As you can see BTN_* events can accompany either BTN_TOOL_LENS or
> > BTN_TOOL_PEN or any other BTN_TOOL_*.
>
> You are right. The tablet buttons can go with one of those other
> BTN_TOOL_s _if_ they do not define the same event types (BTN_s) as the
> tablet buttons.
>
> The new Bamboo MT code sends both BTN_LEFT and BTN_RIGHT events for
> Tablet Buttons (refer to line 905 and 908 of wacom_wac.c). However,
> BTN_LEFT and BTN_RIGHT are also sent by BTN_TOOL_MOUSE/LENS tool
> (refer to wacom_wac.c line 622 to 665).
>
> If we remove BTN_TOOL_FINGER without a BTN_TOOL-something to replace
> it, the two LEFT and RIGHT buttons will have a hard time to tell the
> user land if they are from the MOUSE/LENS or the Tablet Buttons. The
> worst case could be the LEFT/RIGHT sent later overwrites the earlier
> ones.
>
> We could do some guesswork in the user land to figure out which
> LEFT/RIGHT belongs to which BTN_TOOL_ if the above scenario does not
> happen. But, it would be much cheaper and more reliable if we can tell
> the user land where those LEFT and RIGHT come from. This is the whole
> purpose of the kernel driver, isn't it?

Why would userspace want to figure out what physical button was pressed?
Input events convey _actions_, i.e. BTN_LEFT means that user pressed
primary button on the device. It does not matter if it was pressed on
tablet or the mouse/lens; the response should be the same.

If you expect different response, depending on which physical button is
pressed, then they should emit different BTN_* events. If you are
concerned that some users might want to have the same actions while
others want different actions - then please implement key remapping in
the driver.

If you want to treat touch and pen as completely independent devices -
we can do that too but as you remember there are some issues with
"pairing" of the devices.

Thanks.

--
Dmitry

2010-11-24 00:12:16

by Ping Cheng

Subject: Re: [PATCH] Add BTN_TOOL_BUTTONS to input.h

On Tue, Nov 23, 2010 at 2:24 PM, Dmitry Torokhov
<[email protected]> wrote:
>
>> > The BTN_TOOL_* were introduced to indicate to the userspace tool that is
>> > currently touching the surface of the device. Buttons are expected to be
>> > always present and can change their state regardless of what tool is
>> > being used at the moment. I.e. The full hardware state (between
>> > EV_SYN/SYN_REPORT) could be, for example,
>> >
>> > Pen at 10,20, BTN_0, and BTN_2 (ABS_X 10, ABS_Y 20, BTN_TOOL_PEN, BTN_0,
>> > BTN_2) or
>> >
>> > Lens at 20,15 and BTN_1 (ABS_X 20, ABS_Y 15, BTN_TOOL_LENS, BTN_1).
>> >
>> > As you can see BTN_* events can accompany either BTN_TOOL_LENS or
>> > BTN_TOOL_PEN or any other BTN_TOOL_*.
>>
>> You are right. The tablet buttons can go with one of those other
>> BTN_TOOL_s _if_ they do not define the same event types (BTN_s) as the
>> tablet buttons.
>>
>> The new Bamboo MT code sends both BTN_LEFT and BTN_RIGHT events for
>> Tablet Buttons (refer to line 905 and 908 of wacom_wac.c). However,
>> BTN_LEFT and BTN_RIGHT are also sent by BTN_TOOL_MOUSE/LENS tool
>> (refer to wacom_wac.c line 622 to 665).
>>
>> If we remove BTN_TOOL_FINGER without a BTN_TOOL-something to replace
>> it, the two LEFT and RIGHT buttons will have a hard time to tell the
>> user land if they are from the MOUSE/LENS or the Tablet Buttons. The
>> worst case could be the LEFT/RIGHT sent later overwrites the earlier
>> ones.
>>
>> We could do some guesswork in the user land to figure out which
>> LEFT/RIGHT belongs to which BTN_TOOL_ if the above scenario does not
>> happen. But, it would be much cheaper and more reliable if we can tell
>> the user land where those LEFT and RIGHT come from. This is the whole
>> purpose of the kernel driver, isn't it?
>
> Why would userspace want to figure out what physical button was pressed?
> Input events convey _actions_, i.e. BTN_LEFT means that user pressed
> primary button on the device. It does not matter if it was pressed on
> tablet or the mouse/lens; the response should be the same.

You're right, if the user wants a LEFT click. In a lot of cases, they
want to translate it into something else. LEFT is only a default value
that we give them if they do nothing.

> If you expect different response, depending on which physical button is
> pressed, then they should emit different BTN_* events. If you are
> concerned that some users might want to have the same actions while
> others want different actions - then please implement key remapping in
> the driver.

That is exactly what I am trying to convince you. Without being able
to tell one button event from the other, even just logically, how can
I and other clients remap them?

> If you want to treat touch and pen as completely independent devices -
> we can do that too but as you remember there are some issues with
> "pairing" of the devices.

No, this issue has nothing to do with separating pen and touch
devices. This is for BUTTONS only. Those button events can come from
the physical tools (pen, mouse, lens, etc.) or the tablet itself.
Without knowing that the button events are from the ones that
physically sit on the tablet, it may mess up with the buttons that are
from the physical tools.

You might have mixed my BTN_TOOL_TOUCH request with this patch.

Ping

2010-11-24 00:38:43

by Dmitry Torokhov

Subject: Re: [PATCH] Add BTN_TOOL_BUTTONS to input.h

On Tue, Nov 23, 2010 at 04:12:13PM -0800, Ping Cheng wrote:
> On Tue, Nov 23, 2010 at 2:24 PM, Dmitry Torokhov
> <[email protected]> wrote:
> >
> >> > The BTN_TOOL_* were introduced to indicate to the userspace tool that is
> >> > currently touching the surface of the device. Buttons are expected to be
> >> > always present and can change their state regardless of what tool is
> >> > being used at the moment. I.e. The full hardware state (between
> >> > EV_SYN/SYN_REPORT) could be, for example,
> >> >
> >> > Pen at 10,20, BTN_0, and BTN_2 (ABS_X 10, ABS_Y 20, BTN_TOOL_PEN, BTN_0,
> >> > BTN_2) or
> >> >
> >> > Lens at 20,15 and BTN_1 (ABS_X 20, ABS_Y 15, BTN_TOOL_LENS, BTN_1).
> >> >
> >> > As you can see BTN_* events can accompany either BTN_TOOL_LENS or
> >> > BTN_TOOL_PEN or any other BTN_TOOL_*.
> >>
> >> You are right. The tablet buttons can go with one of those other
> >> BTN_TOOL_s _if_ they do not define the same event types (BTN_s) as the
> >> tablet buttons.
> >>
> >> The new Bamboo MT code sends both BTN_LEFT and BTN_RIGHT events for
> >> Tablet Buttons (refer to line 905 and 908 of wacom_wac.c). However,
> >> BTN_LEFT and BTN_RIGHT are also sent by BTN_TOOL_MOUSE/LENS tool
> >> (refer to wacom_wac.c line 622 to 665).
> >>
> >> If we remove BTN_TOOL_FINGER without a BTN_TOOL-something to replace
> >> it, the two LEFT and RIGHT buttons will have a hard time to tell the
> >> user land if they are from the MOUSE/LENS or the Tablet Buttons. The
> >> worst case could be the LEFT/RIGHT sent later overwrites the earlier
> >> ones.
> >>
> >> We could do some guesswork in the user land to figure out which
> >> LEFT/RIGHT belongs to which BTN_TOOL_ if the above scenario does not
> >> happen. But, it would be much cheaper and more reliable if we can tell
> >> the user land where those LEFT and RIGHT come from. This is the whole
> >> purpose of the kernel driver, isn't it?
> >
> > Why would userspace want to figure out what physical button was pressed?
> > Input events convey _actions_, i.e. BTN_LEFT means that user pressed
> > primary button on the device. It does not matter if it was pressed on
> > tablet or the mouse/lens; the response should be the same.
>
> You're right, if the user wants a LEFT click. In a lot of cases, they
> want to translate it into something else. LEFT is only a default value
> that we give them if they do nothing.
>
> > If you expect different response, depending on which physical button is
> > pressed, then they should emit different BTN_* events. If you are
> > concerned that some users might want to have the same actions while
> > others want different actions - then please implement key remapping in
> > the driver.
>
> That is exactly what I am trying to convince you. Without being able
> to tell one button event from the other, even just logically, how can
> I and other clients remap them?

EVIOCSKEYCODE. You just need to wire wacom driver to support this ioctl.
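
A minimal userspace sketch of that remapping path, using the legacy
two-integer form of EVIOCSKEYCODE (the device node, scancode, and target
keycode below are only examples):

	#include <fcntl.h>
	#include <unistd.h>
	#include <sys/ioctl.h>
	#include <linux/input.h>

	/* Remap one driver-defined scancode to a different keycode. */
	int remap_pad_button(const char *evdev_path)
	{
		unsigned int map[2];
		int fd, ret;

		fd = open(evdev_path, O_RDWR);	/* e.g. /dev/input/eventN */
		if (fd < 0)
			return -1;

		map[0] = 1;		/* scancode exposed by the driver (example) */
		map[1] = KEY_FORWARD;	/* keycode the user wants instead (example) */

		ret = ioctl(fd, EVIOCSKEYCODE, map);
		close(fd);
		return ret;
	}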

>
> > If you want to treat touch and pen as completely independent devices -
> > we can do that too but as you remember there are some issues with
> > "pairing" of the devices.
>
> No, this issue has nothing to do with separating pen and touch
> devices. This is for BUTTONS only. Those button events can come from
> the physical tools (pen, mouse, lens, etc.) or the tablet itself.
> Without knowing that the button events are from the ones that
> physically sit on the tablet, it may mess up with the buttons that are
> from the physical tools.
>
> You might have mixed my BTN_TOOL_TOUCH request with this patch.

No, I have not. Again, if actions are different then physical buttons
should emit different events. If actions are the same then events are
the same and it does not matter whether user pushed button on the
touchpad or the mouse.

--
Dmitry

2010-11-24 00:51:13

by Ping Cheng

Subject: Re: [PATCH] Add BTN_TOOL_BUTTONS to input.h

On Tue, Nov 23, 2010 at 4:38 PM, Dmitry Torokhov
<[email protected]> wrote:
> On Tue, Nov 23, 2010 at 04:12:13PM -0800, Ping Cheng wrote:
>> On Tue, Nov 23, 2010 at 2:24 PM, Dmitry Torokhov
>> <[email protected]> wrote:
>> >
>> >> > The BTN_TOOL_* were introduced to indicate to the userspace tool that is
>> >> > currently touching the surface of the device. Buttons are expected to be
>> >> > always present and can change their state regardless of what tool is
>> >> > being used at the moment. I.e. The full hardware state (between
>> >> > EV_SYN/SYN_REPORT) could be, for example,
>> >> >
>> >> > Pen at 10,20, BTN_0, and BTN_2 (ABS_X 10, ABS_Y 20, BTN_TOOL_PEN, BTN_0,
>> >> > BTN_2) or
>> >> >
>> >> > Lens at 20,15 and BTN_1 (ABS_X 20, ABS_Y 15, BTN_TOOL_LENS, BTN_1).
>> >> >
>> >> > As you can see BTN_* events can accompany either BTN_TOOL_LENS or
>> >> > BTN_TOOL_PEN or any other BTN_TOOL_*.
>> >>
>> >> You are right. The tablet buttons can go with one of those other
>> >> BTN_TOOL_s _if_ they do not define the same event types (BTN_s) as the
>> >> tablet buttons.
>> >>
>> >> The new Bamboo MT code sends both BTN_LEFT and BTN_RIGHT events for
>> >> Tablet Buttons (refer to line 905 and 908 of wacom_wac.c). However,
>> >> BTN_LEFT and BTN_RIGHT are also sent by BTN_TOOL_MOUSE/LENS tool
>> >> (refer to wacom_wac.c line 622 to 665).
>> >>
>> >> If we remove BTN_TOOL_FINGER without a BTN_TOOL-something to replace
>> >> it, the two LEFT and RIGHT buttons will have a hard time to tell the
>> >> user land if they are from the MOUSE/LENS or the Tablet Buttons. The
>> >> worst case could be the LEFT/RIGHT sent later overwrites the earlier
>> >> ones.
>> >>
>> >> We could do some guesswork in the user land to figure out which
>> >> LEFT/RIGHT belongs to which BTN_TOOL_ if the above scenario does not
>> >> happen. But, it would be much cheaper and more reliable if we can tell
>> >> the user land where those LEFT and RIGHT come from. This is the whole
>> >> purpose of the kernel driver, isn't it?
>> >
>> > Why would userspace want to figure out what physical button was pressed?
>> > Input events convey _actions_, i.e. BTN_LEFT means that user pressed
>> > primary button on the device. It does not matter if it was pressed on
>> > tablet or the mouse/lens; the response should be the same.
>>
>> You're right, if the user wants a LEFT click. In a lot of cases, they
>> want to translate it into something else. LEFT is only a default value
>> that we give them if they do nothing.
>>
>> > If you expect different response, depending on which physical button is
>> > pressed, then they should emit different BTN_* events. If you are
>> > concerned that some users might want to have the same actions while
>> > others want different actions - then please implement key remapping in
>> > the driver.
>>
>> That is exactly what I am trying to convince you. Without being able
>> to tell one button event from the other, even just logically, how can
>> I and other clients remap them?
>
> EVIOCSKEYCODE. You just need to wire wacom driver to support this ioctl.
>
>>
>> > If you want to treat touch and pen as completely independent devices -
>> > we can do that too but as you remember there are some issues with
>> > "pairing" of the devices.
>>
>> No, this issue has nothing to do with separating pen and touch
>> devices. This is for BUTTONS only. Those button events can come from
>> the physical tools (pen, mouse, lens, etc.) or the tablet itself.
>> Without knowing that the button events are from the ones that
>> physically sit on the tablet, it may mess up with the buttons that are
>> from the physical tools.
>>
>> You might have mixed my BTN_TOOL_TOUCH request with this patch.
>
> No, I have not. Again, if actions are different then physical buttons
> should emit different events. If actions are the same then events are
> the same and it does not matter whether user pushed button on the
> touchpad or the mouse.

All right. I'll submit a patch to close this request. The
BTN_TOOL_TOUCH request is still open though.

Thank you,

Ping

2010-11-24 01:10:54

by Ping Cheng

Subject: Re: [PATCH] Add BTN_TOOL_BUTTONS to input.h

On Tue, Nov 23, 2010 at 4:51 PM, Ping Cheng <[email protected]> wrote:
> On Tue, Nov 23, 2010 at 4:38 PM, Dmitry Torokhov
> <[email protected]> wrote:
>> On Tue, Nov 23, 2010 at 04:12:13PM -0800, Ping Cheng wrote:
>>> On Tue, Nov 23, 2010 at 2:24 PM, Dmitry Torokhov
>>> <[email protected]> wrote:
>>> >
>>> >> > The BTN_TOOL_* were introduced to indicate to the userspace tool that is
>>> >> > currently touching the surface of the device. Buttons are expected to be
>>> >> > always present and can change their state regardless of what tool is
>>> >> > being used at the moment. I.e. The full hardware state (between
>>> >> > EV_SYN/SYN_REPORT) could be, for example,
>>> >> >
>>> >> > Pen at 10,20, BTN_0, and BTN_2 (ABS_X 10, ABS_Y 20, BTN_TOOL_PEN, BTN_0,
>>> >> > BTN_2) or
>>> >> >
>>> >> > Lens at 20,15 and BTN_1 (ABS_X 20, ABS_Y 15, BTN_TOOL_LENS, BTN_1).
>>> >> >
>>> >> > As you can see BTN_* events can accompany either BTN_TOOL_LENS or
>>> >> > BTN_TOOL_PEN or any other BTN_TOOL_*.
>>> >>
>>> >> You are right. The tablet buttons can go with one of those other
>>> >> BTN_TOOL_s _if_ they do not define the same event types (BTN_s) as the
>>> >> tablet buttons.
>>> >>
>>> >> The new Bamboo MT code sends both BTN_LEFT and BTN_RIGHT events for
>>> >> Tablet Buttons (refer to line 905 and 908 of wacom_wac.c). However,
>>> >> BTN_LEFT and BTN_RIGHT are also sent by BTN_TOOL_MOUSE/LENS tool
>>> >> (refer to wacom_wac.c line 622 to 665).
>>> >>
>>> >> If we remove BTN_TOOL_FINGER without a BTN_TOOL-something to replace
>>> >> it, the two LEFT and RIGHT buttons will have a hard time to tell the
>>> >> user land if they are from the MOUSE/LENS or the Tablet Buttons. The
>>> >> worst case could be the LEFT/RIGHT sent later overwrites the earlier
>>> >> ones.
>>> >>
>>> >> We could do some guesswork in the user land to figure out which
>>> >> LEFT/RIGHT belongs to which BTN_TOOL_ if the above scenario does not
>>> >> happen. But, it would be much cheaper and more reliable if we can tell
>>> >> the user land where those LEFT and RIGHT come from. This is the whole
>>> >> purpose of the kernel driver, isn't it?
>>> >
>>> > Why would userspace want to figure out what physical button was pressed?
>>> > Input events convey _actions_, i.e. BTN_LEFT means that user pressed
>>> > primary button on the device. It does not matter if it was pressed on
>>> > tablet or the mouse/lens; the response should be the same.
>>>
>>> You're right, if the user wants a LEFT click. In a lot of cases, they
>>> want to translate it into something else. LEFT is only a default value
>>> that we give them if they do nothing.
>>>
>>> > If you expect different response, depending on which physical button is
>>> > pressed, then they should emit different BTN_* events. If you are
>>> > concerned that some users might want to have the same actions while
>>> > others want different actions - then please implement key remapping in
>>> > the driver.
>>>
>>> That is exactly what I am trying to convince you. Without being able
>>> to tell one button event from the other, even just logically, how can
>>> I and other clients remap them?
>>
>> EVIOCSKEYCODE. You just need to wire wacom driver to support this ioctl.

Hold on. I was too concentrated on the buttons then. There are touch
rings (reported as ABS_WHEEL) on the tablet. How do we pass the raw
ring data to the user land and tell if that ABS_WHEEL is from the ring
or from a stylus' wheel? Should we add an ABS_RING then?

Also, if there is no tool on the tablet, which BTN_TOOL_* should we
use to report those buttons and strips/rings? They are not PEN, not
MOUSE, and not TOUCH. They are in fact an independent tool, like it or
not.

Ping

2010-11-24 02:44:35

by Dmitry Torokhov

Subject: Re: [PATCH] Add BTN_TOOL_BUTTONS to input.h

On Tue, Nov 23, 2010 at 05:10:50PM -0800, Ping Cheng wrote:
> On Tue, Nov 23, 2010 at 4:51 PM, Ping Cheng <[email protected]> wrote:
> > On Tue, Nov 23, 2010 at 4:38 PM, Dmitry Torokhov
> > <[email protected]> wrote:
> >> On Tue, Nov 23, 2010 at 04:12:13PM -0800, Ping Cheng wrote:
> >>> On Tue, Nov 23, 2010 at 2:24 PM, Dmitry Torokhov
> >>> <[email protected]> wrote:
> >>> >
> >>> >> > The BTN_TOOL_* were introduced to indicate to the userspace tool that is
> >>> >> > currently touching the surface of the device. Buttons are expected to be
> >>> >> > always present and can change their state regardless of what tool is
> >>> >> > being used at the moment. I.e. The full hardware state (between
> >>> >> > EV_SYN/SYN_REPORT) could be, for example,
> >>> >> >
> >>> >> > Pen at 10,20, BTN_0, and BTN_2 (ABS_X 10, ABS_Y 20, BTN_TOOL_PEN, BTN_0,
> >>> >> > BTN_2) or
> >>> >> >
> >>> >> > Lens at 20,15 and BTN_1 (ABS_X 20, ABS_Y 15, BTN_TOOL_LENS, BTN_1).
> >>> >> >
> >>> >> > As you can see BTN_* events can accompany either BTN_TOOL_LENS or
> >>> >> > BTN_TOOL_PEN or any other BTN_TOOL_*.
> >>> >>
> >>> >> You are right. The tablet buttons can go with one of those other
> >>> >> BTN_TOOL_s _if_ they do not define the same event types (BTN_s) as the
> >>> >> tablet buttons.
> >>> >>
> >>> >> The new Bamboo MT code sends both BTN_LEFT and BTN_RIGHT events for
> >>> >> Tablet Buttons (refer to line 905 and 908 of wacom_wac.c). However,
> >>> >> BTN_LEFT and BTN_RIGHT are also sent by BTN_TOOL_MOUSE/LENS tool
> >>> >> (refer to wacom_wac.c line 622 to 665).
> >>> >>
> >>> >> If we remove BTN_TOOL_FINGER without a BTN_TOOL-something to replace
> >>> >> it, the two LEFT and RIGHT buttons will have a hard time to tell the
> >>> >> user land if they are from the MOUSE/LENS or the Tablet Buttons. The
> >>> >> worst case could be the LEFT/RIGHT sent later overwrites the earlier
> >>> >> ones.
> >>> >>
> >>> >> We could do some guesswork in the user land to figure out which
> >>> >> LEFT/RIGHT belongs to which BTN_TOOL_ if the above scenario does not
> >>> >> happen. But, it would be much cheaper and more reliable if we can tell
> >>> >> the user land where those LEFT and RIGHT come from. This is the whole
> >>> >> purpose of the kernel driver, isn't it?
> >>> >
> >>> > Why would userspace want to figure out what physical button was pressed?
> >>> > Input events convey _actions_, i.e. BTN_LEFT means that user pressed
> >>> > primary button on the device. It does not matter if it was pressed on
> >>> > tablet or the mouse/lens; the response should be the same.
> >>>
> >>> You're right, if the user wants a LEFT click. In a lot of cases, they
> >>> want to translate it into something else. LEFT is only a default value
> >>> that we give them if they do nothing.
> >>>
> >>> > If you expect different response, depending on which physical button is
> >>> > pressed, then they should emit different BTN_* events. If you are
> >>> > concerned that some users might want to have the same actions while
> >>> > others want different actions - then please implement key remapping in
> >>> > the driver.
> >>>
> >>> That is exactly what I am trying to convince you. Without being able
> >>> to tell one button event from the other, even just logically, how can
> >>> I and other clients remap them?
> >>
> >> EVIOCSKEYCODE. You just need to wire wacom driver to support this ioctl.
>
> Hold on. I was too concentrated on the buttons then. There are touch
> rings (reported as ABS_WHEEL) on the tablet. How do we pass the raw
> ring data to the user land and tell if that ABS_WHEEL is from the ring
> or from a stylus' wheel? Should we add an ABS_RING then?

May be. Could you please describe exactly what it is? What is the
default application? Is it really used for scrolling the work area up
and down?

>
> Also, if there is no tool on the tablet, which BTN_TOOL_* should we
> use to report those buttons and strips/rings? They are not PEN, not
> MOUSE, and not TOUCH. They are in fact an independent tool, like it or
> not.

No, the buttons on the device are not independent tool but rather fixed
features that are applicable to all tools and none in particular.

Considering that proper use of input protocol means that you describe
_entire_ state of the device how would BTN_TOOL_BUTTONS help you do that
if a button is pressed on both the touchpad and the mouse? What about pressed on
the tablet and released on mouse? Note that there should be no
ordering dependencies; userspace is expected to accumulate all data and
maybe cancel out opposites until it gets EV_SYN/SYN_REPORT.
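
For reference, a small userspace sketch of that accumulate-until-SYN_REPORT
pattern (the state table and the already-open fd are assumed):

	#include <unistd.h>
	#include <linux/input.h>

	/* Read one complete device state: collect events until EV_SYN/SYN_REPORT,
	 * then let the caller act on the accumulated state.
	 */
	static void read_one_frame(int fd)
	{
		struct input_event ev;

		while (read(fd, &ev, sizeof(ev)) == sizeof(ev)) {
			if (ev.type == EV_SYN && ev.code == SYN_REPORT)
				break;		/* frame is complete */
			/* record ev.type / ev.code / ev.value in the state table */
		}
	}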

I was thinking that while having pen and touch as separate devices
might not be the best idea having mouse and maybe lens as separate
input devices might make more sense. I'll try to find some time and
play with my Graphire...

--
Dmitry

2010-11-24 17:46:52

by Ping Cheng

Subject: Re: [PATCH] Add BTN_TOOL_BUTTONS to input.h

On Tue, Nov 23, 2010 at 6:44 PM, Dmitry Torokhov
<[email protected]> wrote:
>
>> >>> That is exactly what I am trying to convince you. Without being able
>> >>> to tell one button event from the other, even just logically, how can
>> >>> I and other clients remap them?
>> >>
>> >> EVIOCSKEYCODE. You just need to wire wacom driver to support this ioctl.
>>
>> Hold on. I was too concentrated on the buttons then. There are touch
>> rings (reported as ABS_WHEEL) on the tablet. How do we pass the raw
>> ring data to the user land and tell if that ABS_WHEEL is from the ring
>> or from a stylus' wheel? Should we add an ABS_RING then?
>
> May be. Could you please describe exactly what it is?

Take a look at this page: http://www.wacom.com/intuos/. The ring is
the one on the center of the left edge. We also have two strips on
both edges of some tablets:
http://www.wacom.com/cintiq/cintiq-12wx.php. I was not suggesting to
add ABS_RING. Actually, adding ABS_RING is overkill since we may
add other stuff on the tablet which would require a new ABS_something.
One BTN_TOOL_ groups everything on the tablet, which is much cleaner
than adding more ABS_*.

> What is the default application? Is it really used for scrolling the work area up
> and down?

It can be used for scrolling, zooming, or any application specific
functions. For example, it can be translated into some painting
effect, such as shadowing, dimming....

>> Also, if there is no tool on the tablet, which BTN_TOOL_* should we
>> use to report those buttons and strips/rings? They are not PEN, not
>> MOUSE, and not TOUCH. They are in fact an independent tool, like it or
>> not.
>
> No, the buttons on the device are not independent tool but rather fixed
> features that are applicable to all tools and none in particular.

Which tool do you think we can use while no tool is on the tablet? No
matter which one we use, it is confusing. The
strips/ring/wheel/buttons on the tablet do not belong to those tools
at all. They live without any of those tools.

Plus, we are not just talking about buttons here. I started the
argument with buttons and Bamboo since it is the first one that
introduced the BTN_TOOL_FINGER in a different meaning. The need of
BTN_TOOL_BUTTONS is not really for Bamboo. It is for the strips and
rings on the Intuos and Cintiq (see the URLs I referred to above).

> Considering that proper use of input protocol means that you describe
> _entire_ state of the device how would BTN_TOOL_BUTTONS help you do that
> if a button is pressed on both the touchpad and the mouse? What about pressed on

There should be no confusion between pressing a button on a touchpad
or a mouse from a driver's perspective. The driver should be able to
tell user land which button has been pressed/released. It is up to the
user to decide which function they want to assign to the button.

We assign default values to the buttons just to be friendly to most
common apps/users. But, we should not limit user land usability.
Reporting raw strips/ring/wheel values gives room for clients to
develop their own functions. But those raw data do not belong to any
of the other tools.

> What about pressed on
> the tablet and released on mouse? Note that there should be no
> ordering dependencies; userspace is expected to accumulate all data and
> maybe cancel out opposites until it gets EV_SYN/SYN_REPORT.
>
> I was thinking that while having pen and thouch as separate devices
> might not be the best idea having mouse and maybe lens as separate
> input devices might make more sense. I'll try to find some time and
> play with my Graphire...

Yeah, Graphire may prove my point. Do you have a Graphire 4? How are
you going to pass the REL_WHEEL event on the tablet while a mouse,
which also has a REL_WHEEL, is on the tablet?

We are complicating a simple problem/request here.

Ping

2010-11-24 20:16:17

by Dmitry Torokhov

[permalink] [raw]
Subject: Re: [PATCH] Add BTN_TOOL_BUTTONS to input.h

On Wed, Nov 24, 2010 at 09:46:48AM -0800, Ping Cheng wrote:
> On Tue, Nov 23, 2010 at 6:44 PM, Dmitry Torokhov
> <[email protected]> wrote:
> >
> >> >>> That is exactly what I am trying to convince you. Without being able
> >> >>> to tell one button event from the other, even just logically, how can
> >> >>> I and other clients remap them?
> >> >>
> >> >> EVIOCSKEYCODE. You just need to wire wacom driver to support this ioctl.
> >>
> >> Hold on. I was too concentrated on the buttons then. There are touch
> >> rings (reported as ABS_WHEEL) on the tablet. How do we pass the raw
> >> ring data to the user land and tell if that ABS_WHEEL is from the ring
> >> or from a stylus' wheel? Should we add an ABS_RING then?
> >
> > May be. Could you please describe exactly what it is?
>
> Take a look at this page: http://www.wacom.com/intuos/. The ring is
> the one on the center of the left edge. We also have two strips on
> both edges of some tablets:
> http://www.wacom.com/cintiq/cintiq-12wx.php. I was not suggesting to
> add ABS_RING. Actually, adding ABS_RING is overkill since we may
> add other stuff on the tablet which would require a new ABS_something.
> One BTN_TOOL_ groups everything on the tablet, which is much cleaner
> than adding more ABS_*.

What if you get 3 sets of rings? 4? 5? And some more buttons? Will we
need BTN_TOOL_BUTTONS_EXTRA? BTN_TOOL_BUTTONS_EVEN_MORE? This direction
leads to a dead end; it is not sustainable.

>
> > What is the default application? Is it really used for scrolling the work area up
> > and down?
>
> It can be used for scrolling, zooming, or any application specific
> functions. For example, it can be translated into some painting
> effect, such as shadowing, dimming....

Right. However it is nice to know the default application as we use it
to provide default mapping.

>
> >> Also, if there is no tool on the tablet, which BTN_TOOL_* should we
> >> use to report those buttons and strips/rings? They are not PEN, not
> >> MOUSE, and not TOUCH. They are in fact an independent tool, like it or
> >> not.
> >
> > No, the buttons on the device are not independent tool but rather fixed
> > features that are applicable to all tools and none in particular.
>
> Which tool do you think we can use while no tool is on the tablet? No
> matter which one we use, it is confusing. The
> strips/ring/wheel/buttons on the tablet do not belong to those tools
> at all. They live without any of those tools.

Exactly. They are not a tool (BTN_TOOL_* in input protocol conveys that
a certain object touches working surface at certain coordinates with
certain pressure - the pressure is optional), and they do not belong to
any tool. So you should not try to associate them with any tools.

>
> Plus, we are not just talking about buttons here. I started the
> argument with buttons and Bamboo since it is the first one that
> introduced the BTN_TOOL_FINGER in a different meaning. The need of
> BTN_TOOL_BUTTONS is not really for Bamboo. It is for the strips and
> rings on the Intuos and Cintiq (see the URLs I referred to above).
>
> > Considering that proper use of input protocol means that you describe
> > _entire_ state of the device how would BTN_TOOL_BUTTONS help you do that
> > if a button is pressed on both the touchpad and the mouse? What about pressed on
>
> There should be no confusion between pressing a button on a touchpad
> or a mouse from a driver's perspective. The driver should be able to
> tell user land which button has been pressed/released. It is up to the
> user to decide which function they want to assign to the button.

I know. I just asked you to provide an example event stream for the
conditions below using your new BTN_TOOL_BUTTONS, given the constraints I
mentioned.

>
> We assign default values to the buttons just to be friendly to most
> common apps/users. But, we should not limit user land usability.
> Reporting raw strips/ring/wheel values gives room for clients to
> develop their own functions. But those raw data do not belong to any
> of the other tools.
>
> > What about pressed on
> > the tablet and released on mouse? Note that there should be no
> > ordering dependencies; userspace is expected to accumulate all data and
> > maybe cancel out opposites until it gets EV_SYN/SYN_REPORT.
> >
> > I was thinking that while having pen and touch as separate devices
> > might not be the best idea having mouse and maybe lens as separate
> > input devices might make more sense. I'll try to find some time and
> > play with my Graphire...
>
> Yeah, Graphire may prove my point. Do you have a Graphire 4? How are
> you going to pass the REL_WHEEL event on the tablet while a mouse,
> which also has a REL_WHEEL, is on the tablet?

No, it is Graphire 3.

>
> We are complicating a simple problem/request here.

No, we are trying to ram something in that might work for wacom driver
only and only for the short term.

The input protocol is set up so that events of the same type are combined;
there is only one logical BTN_LEFT with its state, only one horizontal and
one vertical wheel, only one KEY_A, and so forth. If several physical
switches are set up to emit the same event, they are by design
indistinguishable from each other. We did relax this for multi-touch
devices, where it is natural to have several independent contacts present
whose number changes over time.

You are trying to devise a scheme to partition the device into several
parts to work around this issue. But if you want to have the buttons
truly independent, then just partition the device explicitly. Split them
off into a separate button/ring/misc device and it should work just fine
without the need for a new "tool" type. And it will continue to work if you
add more buttons, sliders, rings, and so forth.

The only issue with the above is that we need to support more than 32
event devices. That was on my back burner but if someone were to beat me
to it - that would be even better.
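
A rough sketch of that split, registering the pad controls as their own
input device somewhere in the driver's setup path (the device name, button
set, and ring range are assumptions, not the current wacom code):

	/* Give buttons/ring/strips their own input device so they no longer
	 * need to borrow a BTN_TOOL_* type from the pen/mouse tools.
	 */
	struct input_dev *pad = input_allocate_device();
	int error;

	if (!pad)
		return -ENOMEM;

	pad->name = "Wacom Pad";	/* assumed name */
	__set_bit(EV_KEY, pad->evbit);
	__set_bit(EV_ABS, pad->evbit);
	__set_bit(BTN_0, pad->keybit);
	__set_bit(BTN_1, pad->keybit);
	input_set_abs_params(pad, ABS_WHEEL, 0, 71, 0, 0);	/* ring range assumed */

	error = input_register_device(pad);
	if (error) {
		input_free_device(pad);
		return error;
	}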

Thanks.

--
Dmitry