Some touchscreens ship with a physical overlay on top of them that
provides a number of buttons and a resized touchscreen surface.
In order to generate proper key events from the overlay buttons and to
adjust the touch events to the clipped surface, these patches offer a
documented, device-tree-based solution by means of helper functions.
An implementation for a specific touchscreen driver is also included.
The functions in ts-virtobj provide a simple workflow to acquire
physical objects from the device tree, map them into the device driver
structures as virtual objects and generate events according to
the object descriptions.
This solution has been tested with a JT240MHQS-E3 display, which uses
the st1624 as a touchscreen and provides two overlay buttons and a frame
that clips its effective surface.
Javier Carrasco (4):
Input: ts-virtobj - Add touchscreen virtual object handling to the core
dt-bindings: touchscreen: add virtual-touchscreen and virtual-buttons
properties
Input: st1232 - add virtual touchscreen and buttons handling
dt-bindings: input: touchscreen: st1232: add virtual objects to the
example
.../input/touchscreen/sitronix,st1232.yaml | 28 ++
.../input/touchscreen/touchscreen.yaml | 54 ++++
drivers/input/Makefile | 1 +
drivers/input/touchscreen/st1232.c | 41 ++-
drivers/input/ts-virtobj.c | 305 ++++++++++++++++++
include/linux/input/ts-virtobj.h | 59 ++++
6 files changed, 478 insertions(+), 10 deletions(-)
create mode 100644 drivers/input/ts-virtobj.c
create mode 100644 include/linux/input/ts-virtobj.h
--
2.37.2
Some touchscreens provide mechanical overlays with different objects
such as buttons or downsized touchscreen surfaces.
In order to support these objects, add a series of helper functions
to the input core that transform them into virtual objects via device
tree nodes.
These virtual objects consume the raw touch events and report the
expected input events depending on the object properties.
Signed-off-by: Javier Carrasco <[email protected]>
---
drivers/input/Makefile | 1 +
drivers/input/ts-virtobj.c | 305 +++++++++++++++++++++++++++++++
include/linux/input/ts-virtobj.h | 59 ++++++
3 files changed, 365 insertions(+)
create mode 100644 drivers/input/ts-virtobj.c
create mode 100644 include/linux/input/ts-virtobj.h
diff --git a/drivers/input/Makefile b/drivers/input/Makefile
index 2266c7d010ef..9c717e28719e 100644
--- a/drivers/input/Makefile
+++ b/drivers/input/Makefile
@@ -8,6 +8,7 @@
obj-$(CONFIG_INPUT) += input-core.o
input-core-y := input.o input-compat.o input-mt.o input-poller.o ff-core.o
input-core-y += touchscreen.o
+input-core-y += ts-virtobj.o
obj-$(CONFIG_INPUT_FF_MEMLESS) += ff-memless.o
obj-$(CONFIG_INPUT_SPARSEKMAP) += sparse-keymap.o
diff --git a/drivers/input/ts-virtobj.c b/drivers/input/ts-virtobj.c
new file mode 100644
index 000000000000..4bddd8015af3
--- /dev/null
+++ b/drivers/input/ts-virtobj.c
@@ -0,0 +1,305 @@
+// SPDX-License-Identifier: GPL-2.0-only
+/*
+ * Helper functions for virtual objects on touchscreens
+ *
+ * Copyright (c) 2023 Javier Carrasco <[email protected]>
+ */
+
+#include <linux/property.h>
+#include <linux/input.h>
+#include <linux/input/mt.h>
+#include <linux/module.h>
+#include <linux/input/ts-virtobj.h>
+
+static int ts_virtobj_count_buttons(struct device *dev)
+{
+ struct fwnode_handle *child_node;
+ struct fwnode_handle *child_button;
+ int count = 0;
+
+ child_node = device_get_named_child_node(dev, ts_virtobj_names[BUTTON]);
+ if (!child_node)
+ return 0;
+
+ fwnode_for_each_child_node(child_node, child_button)
+ count++;
+ fwnode_handle_put(child_node);
+
+ return count;
+}
+
+static int ts_virtobj_get_shape_properties(struct fwnode_handle *child_node,
+ struct ts_virtobj_shape *shape)
+{
+ int rc;
+
+ rc = fwnode_property_read_u32(child_node, "x-origin", &shape->x_origin);
+ if (rc < 0)
+ return rc;
+
+ rc = fwnode_property_read_u32(child_node, "y-origin", &shape->y_origin);
+ if (rc < 0)
+ return rc;
+
+ rc = fwnode_property_read_u32(child_node, "x-size", &shape->x_size);
+ if (rc < 0)
+ return rc;
+
+ rc = fwnode_property_read_u32(child_node, "y-size", &shape->y_size);
+ if (rc < 0)
+ return rc;
+
+ return 0;
+}
+
+static int ts_virtobj_get_button_properties(struct device *dev,
+ struct fwnode_handle *child_node,
+ struct ts_virtobj_button *btn)
+{
+ struct fwnode_handle *child_btn;
+ int rc;
+ int j = 0;
+
+ fwnode_for_each_child_node(child_node, child_btn) {
+ rc = ts_virtobj_get_shape_properties(child_btn, &btn[j].shape);
+ if (rc < 0) {
+ fwnode_handle_put(child_btn);
+ return rc;
+ }
+
+ rc = fwnode_property_read_u32(child_btn, "linux,code",
+ &btn[j].key);
+ if (rc < 0) {
+ fwnode_handle_put(child_btn);
+ return rc;
+ }
+
+ dev_info(dev, "Added button at (%u, %u), size %ux%u, code=%u\n",
+ btn[j].shape.x_origin, btn[j].shape.y_origin,
+ btn[j].shape.x_size, btn[j].shape.y_size, btn[j].key);
+ j++;
+ }
+
+ return 0;
+}
+
+static void ts_virtobj_set_button_caps(struct ts_virtobj_map *map,
+ struct input_dev *dev)
+{
+ int i;
+
+ for (i = 0; i < map->button_count; i++)
+ input_set_capability(dev, EV_KEY,
+ map->buttons[i].key);
+}
+
+static int ts_virtobj_map_touchscreen(struct device *dev,
+ struct ts_virtobj_map *map)
+{
+ struct fwnode_handle *child;
+ int rc = 0;
+
+ child = device_get_named_child_node(dev, ts_virtobj_names[TOUCHSCREEN]);
+ if (!child)
+ return 0;
+
+ map->touchscreen = devm_kzalloc(dev, sizeof(*map->touchscreen),
+ GFP_KERNEL);
+ if (!map->touchscreen) {
+ fwnode_handle_put(child);
+ return -ENOMEM;
+ }
+ rc = ts_virtobj_get_shape_properties(child, map->touchscreen);
+ if (rc < 0) {
+ devm_kfree(dev, map->touchscreen);
+ fwnode_handle_put(child);
+ return rc;
+ }
+ map->virtual_touchscreen = 1;
+ dev_info(dev, "Added virtual touchscreen at (%u, %u), size %u x %u\n",
+ map->touchscreen->x_origin, map->touchscreen->y_origin,
+ map->touchscreen->x_size, map->touchscreen->y_size);
+
+ fwnode_handle_put(child);
+ return 0;
+}
+
+static int ts_virtobj_map_buttons(struct device *dev,
+ struct ts_virtobj_map *map,
+ struct input_dev *input)
+{
+ struct fwnode_handle *child;
+ u32 button_count;
+ int rc;
+
+ button_count = ts_virtobj_count_buttons(dev);
+ if (button_count) {
+ map->buttons = devm_kcalloc(dev, button_count,
+ sizeof(*map->buttons), GFP_KERNEL);
+ if (!map->buttons)
+ return -ENOMEM;
+
+ child = device_get_named_child_node(dev,
+ ts_virtobj_names[BUTTON]);
+ if (unlikely(!child)) {
+ devm_kfree(dev, map->buttons);
+ return 0;
+ }
+ rc = ts_virtobj_get_button_properties(dev, child, map->buttons);
+ if (rc < 0) {
+ devm_kfree(dev, map->buttons);
+ return rc;
+ }
+ map->button_count = button_count;
+ ts_virtobj_set_button_caps(map, input);
+ }
+
+ return 0;
+}
+
+static bool ts_virtobj_defined_objects(struct device *dev)
+{
+ struct fwnode_handle *child;
+ int i;
+
+ for (i = 0; i < ARRAY_SIZE(ts_virtobj_names); i++) {
+ child = device_get_named_child_node(dev, ts_virtobj_names[i]);
+ if (child) {
+ fwnode_handle_put(child);
+ return true;
+ }
+ fwnode_handle_put(child);
+ }
+
+ return false;
+}
+
+struct ts_virtobj_map *ts_virtobj_map_objects(struct device *dev,
+ struct input_dev *input)
+{
+ struct ts_virtobj_map *map;
+ int rc;
+
+ if (!ts_virtobj_defined_objects(dev))
+ return NULL;
+
+ map = devm_kzalloc(dev, sizeof(*map), GFP_KERNEL);
+ if (!map)
+ return ERR_PTR(-ENOMEM);
+
+ rc = ts_virtobj_map_touchscreen(dev, map);
+ if (rc < 0) {
+ devm_kfree(dev, map);
+ return ERR_PTR(rc);
+ }
+ rc = ts_virtobj_map_buttons(dev, map, input);
+ if (rc < 0) {
+ devm_kfree(dev, map);
+ return ERR_PTR(rc);
+ }
+
+ return map;
+}
+EXPORT_SYMBOL(ts_virtobj_map_objects);
+
+void ts_virtobj_retrieve_abs(struct ts_virtobj_map *map, u16 *x, u16 *y)
+{
+ *x = map->touchscreen->x_size - 1;
+ *y = map->touchscreen->y_size - 1;
+}
+EXPORT_SYMBOL(ts_virtobj_retrieve_abs);
+
+static bool ts_virtobj_event_in_shape_range(struct ts_virtobj_shape *shape,
+ u32 x, u32 y)
+{
+ if (!shape)
+ return false;
+
+ if (x >= shape->x_origin && x < (shape->x_origin + shape->x_size) &&
+ y >= shape->y_origin && y < (shape->y_origin + shape->y_size))
+ return true;
+
+ return false;
+}
+
+bool ts_virtobj_touchscreen_event(struct ts_virtobj_shape *touchscreen,
+ u32 *x, u32 *y)
+{
+ if (ts_virtobj_event_in_shape_range(touchscreen, *x, *y)) {
+ *x -= touchscreen->x_origin;
+ *y -= touchscreen->y_origin;
+ return true;
+ }
+
+ return false;
+}
+EXPORT_SYMBOL(ts_virtobj_touchscreen_event);
+
+bool ts_virtobj_mapped_touchscreen(struct ts_virtobj_map *map)
+{
+ if (!map || !map->virtual_touchscreen)
+ return false;
+
+ return true;
+}
+EXPORT_SYMBOL(ts_virtobj_mapped_touchscreen);
+
+static bool ts_virtobj_mapped_button(struct ts_virtobj_map *map)
+{
+ if (!map || !map->button_count)
+ return false;
+
+ return true;
+}
+
+bool ts_virtobj_mt_on_touchscreen(struct ts_virtobj_map *map, u32 *x, u32 *y)
+{
+ if (!ts_virtobj_mapped_touchscreen(map))
+ return true;
+
+ if (!ts_virtobj_touchscreen_event(map->touchscreen, x, y))
+ return false;
+
+ return true;
+}
+EXPORT_SYMBOL(ts_virtobj_mt_on_touchscreen);
+
+bool ts_virtobj_button_event(struct ts_virtobj_map *map,
+ struct input_dev *input, u32 x, u32 y)
+{
+ int i;
+
+ if (!ts_virtobj_mapped_button(map))
+ return false;
+
+ for (i = 0; i < map->button_count; i++) {
+ if (ts_virtobj_event_in_shape_range(&map->buttons[i].shape, x, y)) {
+ input_report_key(input, map->buttons[i].key, 1);
+ map->buttons[i].pressed = true;
+ return true;
+ }
+ }
+
+ return false;
+}
+EXPORT_SYMBOL(ts_virtobj_button_event);
+
+void ts_virtobj_button_release_pressed(struct ts_virtobj_map *map,
+ struct input_dev *input)
+{
+ int i;
+
+ if (!map || !map->button_count)
+ return;
+
+ for (i = 0; i < map->button_count; i++) {
+ if (map->buttons[i].pressed) {
+ input_report_key(input, map->buttons[i].key, 0);
+ map->buttons[i].pressed = false;
+ }
+ }
+}
+EXPORT_SYMBOL(ts_virtobj_button_release_pressed);
+
+MODULE_LICENSE("GPL");
+MODULE_DESCRIPTION("Helper functions for virtual objects on touchscreens");
diff --git a/include/linux/input/ts-virtobj.h b/include/linux/input/ts-virtobj.h
new file mode 100644
index 000000000000..4b61709a2680
--- /dev/null
+++ b/include/linux/input/ts-virtobj.h
@@ -0,0 +1,59 @@
+/* SPDX-License-Identifier: GPL-2.0-only */
+/*
+ * Copyright (c) 2023 Javier Carrasco <[email protected]>
+ */
+
+#ifndef _TS_VIRTOBJ
+#define _TS_VIRTOBJ
+
+struct input_dev;
+struct device;
+
+enum ts_virtobj_valid_objects {
+ TOUCHSCREEN,
+ BUTTON,
+};
+
+static const char * const ts_virtobj_names[] = {
+ [TOUCHSCREEN] = "virtual-touchscreen",
+ [BUTTON] = "virtual-buttons",
+};
+
+struct ts_virtobj_shape {
+ u32 x_origin;
+ u32 y_origin;
+ u32 x_size;
+ u32 y_size;
+};
+
+struct ts_virtobj_button {
+ struct ts_virtobj_shape shape;
+ u32 key;
+ bool pressed;
+};
+
+struct ts_virtobj_map {
+ struct ts_virtobj_shape *touchscreen;
+ bool virtual_touchscreen;
+ struct ts_virtobj_button *buttons;
+ u32 button_count;
+};
+
+struct ts_virtobj_map *ts_virtobj_map_objects(struct device *dev,
+ struct input_dev *input);
+
+void ts_virtobj_retrieve_abs(struct ts_virtobj_map *map, u16 *x, u16 *y);
+
+bool ts_virtobj_touchscreen_event(struct ts_virtobj_shape *touchscreen,
+ u32 *x, u32 *y);
+
+bool ts_virtobj_mapped_touchscreen(struct ts_virtobj_map *map);
+
+bool ts_virtobj_mt_on_touchscreen(struct ts_virtobj_map *map, u32 *x, u32 *y);
+
+bool ts_virtobj_button_event(struct ts_virtobj_map *map,
+ struct input_dev *input, u32 x, u32 y);
+
+void ts_virtobj_button_release_pressed(struct ts_virtobj_map *map,
+ struct input_dev *input);
+#endif
--
2.37.2
The virtual-touchscreen object defines an area within the touchscreen
where touch events are reported and their coordinates are translated to
the virtual origin. This object avoids reporting events from areas that
are physically hidden by overlay frames.
For touchscreens that provide overlay buttons on the touchscreen
surface, the virtual-buttons object contains a node for every button,
including the key code that should be reported when the button is pressed.
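As an illustrative sketch (the node layout follows the schema added below,
but the property values and the button name are invented for this example),
a clipped touch area with one overlay button could be described as:

```dts
touchscreen {
        /* effective touch area: 240x280, shifted 40 px down */
        virtual-touchscreen {
                x-origin = <0>;
                y-origin = <40>;
                x-size = <240>;
                y-size = <280>;
        };

        virtual-buttons {
                button-home {
                        label = "Home";
                        linux,code = <KEY_HOME>;
                        x-origin = <40>;
                        y-origin = <0>;
                        x-size = <40>;
                        y-size = <40>;
                };
        };
};
```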
Signed-off-by: Javier Carrasco <[email protected]>
---
.../input/touchscreen/touchscreen.yaml | 54 +++++++++++++++++++
1 file changed, 54 insertions(+)
diff --git a/Documentation/devicetree/bindings/input/touchscreen/touchscreen.yaml b/Documentation/devicetree/bindings/input/touchscreen/touchscreen.yaml
index 895592da9626..869be007eb6f 100644
--- a/Documentation/devicetree/bindings/input/touchscreen/touchscreen.yaml
+++ b/Documentation/devicetree/bindings/input/touchscreen/touchscreen.yaml
@@ -80,6 +80,60 @@ properties:
touchscreen-y-plate-ohms:
description: Resistance of the Y-plate in Ohms
+ virtual-touchscreen:
+ description: Clipped touchscreen area
+ type: object
+
+ properties:
+ x-origin:
+ description: horizontal origin of the clipped area
+ $ref: /schemas/types.yaml#/definitions/uint32
+
+ y-origin:
+ description: vertical origin of the clipped area
+ $ref: /schemas/types.yaml#/definitions/uint32
+
+ x-size:
+ description: horizontal resolution of the clipped area
+ $ref: /schemas/types.yaml#/definitions/uint32
+
+ y-size:
+ description: vertical resolution of the clipped area
+ $ref: /schemas/types.yaml#/definitions/uint32
+
+ virtual-buttons:
+ description: list of nodes defining the buttons on the touchscreen
+ type: object
+
+ patternProperties:
+ '^button-':
+ type: object
+ description:
+ Each button (key) is represented as a sub-node.
+
+ properties:
+ label:
+ $ref: /schemas/types.yaml#/definitions/string
+ description: descriptive name of the button
+
+ linux,code: true
+
+ x-origin:
+ description: horizontal origin of the button area
+ $ref: /schemas/types.yaml#/definitions/uint32
+
+ y-origin:
+ description: vertical origin of the button area
+ $ref: /schemas/types.yaml#/definitions/uint32
+
+ x-size:
+ description: horizontal resolution of the button area
+ $ref: /schemas/types.yaml#/definitions/uint32
+
+ y-size:
+ description: vertical resolution of the button area
+ $ref: /schemas/types.yaml#/definitions/uint32
+
dependencies:
touchscreen-size-x: [ touchscreen-size-y ]
touchscreen-size-y: [ touchscreen-size-x ]
--
2.37.2
Use ts-virtobj to support overlay objects such as buttons and resized
frames defined in the device tree.
Signed-off-by: Javier Carrasco <[email protected]>
---
drivers/input/touchscreen/st1232.c | 41 ++++++++++++++++++++++--------
1 file changed, 31 insertions(+), 10 deletions(-)
diff --git a/drivers/input/touchscreen/st1232.c b/drivers/input/touchscreen/st1232.c
index f49566dc96f8..025f43c532d8 100644
--- a/drivers/input/touchscreen/st1232.c
+++ b/drivers/input/touchscreen/st1232.c
@@ -22,6 +22,7 @@
#include <linux/pm_qos.h>
#include <linux/slab.h>
#include <linux/types.h>
+#include <linux/input/ts-virtobj.h>
#define ST1232_TS_NAME "st1232-ts"
#define ST1633_TS_NAME "st1633-ts"
@@ -56,6 +57,7 @@ struct st1232_ts_data {
struct touchscreen_properties prop;
struct dev_pm_qos_request low_latency_req;
struct gpio_desc *reset_gpio;
+ struct ts_virtobj_map *map;
const struct st_chip_info *chip_info;
int read_buf_len;
u8 *read_buf;
@@ -133,6 +135,7 @@ static int st1232_ts_parse_and_report(struct st1232_ts_data *ts)
struct input_mt_pos pos[ST_TS_MAX_FINGERS];
u8 z[ST_TS_MAX_FINGERS];
int slots[ST_TS_MAX_FINGERS];
+ bool button_slot[ST_TS_MAX_FINGERS] = {false};
int n_contacts = 0;
int i;
@@ -143,6 +146,11 @@ static int st1232_ts_parse_and_report(struct st1232_ts_data *ts)
unsigned int x = ((buf[0] & 0x70) << 4) | buf[1];
unsigned int y = ((buf[0] & 0x07) << 8) | buf[2];
+ if (ts_virtobj_button_event(ts->map, ts->input_dev, x, y))
+ button_slot[n_contacts] = true;
+ else if (!ts_virtobj_mt_on_touchscreen(ts->map, &x, &y))
+ continue;
+
touchscreen_set_mt_pos(&pos[n_contacts],
&ts->prop, x, y);
@@ -158,12 +166,16 @@ static int st1232_ts_parse_and_report(struct st1232_ts_data *ts)
for (i = 0; i < n_contacts; i++) {
input_mt_slot(input, slots[i]);
input_mt_report_slot_state(input, MT_TOOL_FINGER, true);
- input_report_abs(input, ABS_MT_POSITION_X, pos[i].x);
- input_report_abs(input, ABS_MT_POSITION_Y, pos[i].y);
+ if (!button_slot[i]) {
+ input_report_abs(input, ABS_MT_POSITION_X, pos[i].x);
+ input_report_abs(input, ABS_MT_POSITION_Y, pos[i].y);
+ }
if (ts->chip_info->have_z)
input_report_abs(input, ABS_MT_TOUCH_MAJOR, z[i]);
}
+ if (!n_contacts)
+ ts_virtobj_button_release_pressed(ts->map, ts->input_dev);
input_mt_sync_frame(input);
input_sync(input);
@@ -266,6 +278,11 @@ static int st1232_ts_probe(struct i2c_client *client)
ts->client = client;
ts->input_dev = input_dev;
+ /* map virtual objects if defined in the device tree */
+ ts->map = ts_virtobj_map_objects(&ts->client->dev, ts->input_dev);
+ if (IS_ERR(ts->map))
+ return PTR_ERR(ts->map);
+
ts->reset_gpio = devm_gpiod_get_optional(&client->dev, NULL,
GPIOD_OUT_HIGH);
if (IS_ERR(ts->reset_gpio)) {
@@ -292,18 +309,22 @@ static int st1232_ts_probe(struct i2c_client *client)
if (error)
return error;
- /* Read resolution from the chip */
- error = st1232_ts_read_resolution(ts, &max_x, &max_y);
- if (error) {
- dev_err(&client->dev,
- "Failed to read resolution: %d\n", error);
- return error;
- }
-
if (ts->chip_info->have_z)
input_set_abs_params(input_dev, ABS_MT_TOUCH_MAJOR, 0,
ts->chip_info->max_area, 0, 0);
+ if (ts_virtobj_mapped_touchscreen(ts->map)) {
+ ts_virtobj_retrieve_abs(ts->map, &max_x, &max_y);
+ } else {
+ /* Read resolution from the chip */
+ error = st1232_ts_read_resolution(ts, &max_x, &max_y);
+ if (error) {
+ dev_err(&client->dev,
+ "Failed to read resolution: %d\n", error);
+ return error;
+ }
+ }
+
input_set_abs_params(input_dev, ABS_MT_POSITION_X,
0, max_x, 0, 0);
input_set_abs_params(input_dev, ABS_MT_POSITION_Y,
--
2.37.2
The st1232 driver supports the virtual-touchscreen and virtual-buttons
objects defined in the generic touchscreen bindings. In order to support
the key code properties within the virtual buttons, add the required
linux-event-codes include as well.
Signed-off-by: Javier Carrasco <[email protected]>
---
.../input/touchscreen/sitronix,st1232.yaml | 28 +++++++++++++++++++
1 file changed, 28 insertions(+)
diff --git a/Documentation/devicetree/bindings/input/touchscreen/sitronix,st1232.yaml b/Documentation/devicetree/bindings/input/touchscreen/sitronix,st1232.yaml
index 1d8ca19fd37a..66b8c85135b1 100644
--- a/Documentation/devicetree/bindings/input/touchscreen/sitronix,st1232.yaml
+++ b/Documentation/devicetree/bindings/input/touchscreen/sitronix,st1232.yaml
@@ -37,6 +37,7 @@ unevaluatedProperties: false
examples:
- |
+ #include <dt-bindings/input/linux-event-codes.h>
i2c {
#address-cells = <1>;
#size-cells = <0>;
@@ -46,5 +47,32 @@ examples:
reg = <0x55>;
interrupts = <2 0>;
gpios = <&gpio1 166 0>;
+
+ virtual-touchscreen {
+ x-origin = <0>;
+ x-size = <240>;
+ y-origin = <40>;
+ y-size = <280>;
+ };
+
+ virtual-buttons {
+ button-light {
+ label = "Camera light";
+ linux,code = <KEY_LIGHTS_TOGGLE>;
+ x-origin = <40>;
+ x-size = <40>;
+ y-origin = <0>;
+ y-size = <40>;
+ };
+
+ button-suspend {
+ label = "Suspend";
+ linux,code = <KEY_SUSPEND>;
+ x-origin = <160>;
+ x-size = <40>;
+ y-origin = <0>;
+ y-size = <40>;
+ };
+ };
};
};
--
2.37.2
Hi Javier,
On 2023-04-25 13:50:45+0200, Javier Carrasco wrote:
> Some touchscreens are shipped with a physical layer on top of them where
> a number of buttons and a resized touchscreen surface might be available.
>
> In order to generate proper key events by overlay buttons and adjust the
> touch events to a clipped surface, these patches offer a documented,
> device-tree-based solution by means of helper functions.
> An implementation for a specific touchscreen driver is also included.
>
> The functions in ts-virtobj provide a simple workflow to acquire
> physical objects from the device tree, map them into the device driver
> structures as virtual objects and generate events according to
> the object descriptions.
>
> This solution has been tested with a JT240MHQS-E3 display, which uses
> the st1624 as a touchscreen and provides two overly buttons and a frame
> that clips its effective surface.
There are quite a few notebooks from Asus that feature a printed
numpad on their touchpad [0]. The mapping from touch events to
numpad events needs to happen in software.
Do you think your solution is general enough to also support this
usecase?
The differences I see are
* not device-tree based
* touchpads instead of touchscreens
> [..]
[0] https://unix.stackexchange.com/q/494400
Hi Thomas,
On Tue, Apr 25, 2023 at 05:29:39PM +0200, Thomas Weißschuh wrote:
> Hi Javier,
>
> On 2023-04-25 13:50:45+0200, Javier Carrasco wrote:
> > Some touchscreens are shipped with a physical layer on top of them where
> > a number of buttons and a resized touchscreen surface might be available.
> >
> > In order to generate proper key events by overlay buttons and adjust the
> > touch events to a clipped surface, these patches offer a documented,
> > device-tree-based solution by means of helper functions.
> > An implementation for a specific touchscreen driver is also included.
> >
> > The functions in ts-virtobj provide a simple workflow to acquire
> > physical objects from the device tree, map them into the device driver
> > structures as virtual objects and generate events according to
> > the object descriptions.
> >
> > This solution has been tested with a JT240MHQS-E3 display, which uses
> > the st1624 as a touchscreen and provides two overly buttons and a frame
> > that clips its effective surface.
>
> There are quite a few of notebooks from Asus that feature a printed
> numpad on their touchpad [0]. The mapping from the touch events to the
> numpad events needs to happen in software.
That example seems like a fringe use-case in my opinion; I think the
gap filled by this RFC is the case where a touchscreen has a printed
overlay with a key that represents a fixed function.
One problem I do see here is something like libinput or multitouch taking
hold of the input device, and swallowing the key presses because it sees
the device as a touchscreen and is not interested in these keys.
Therefore, my first impression is that the virtual keypad may be better
served by registering its own input device.
Great work by the way, Javier!
>
> Do you think your solution is general enough to also support this
> usecase?
>
> The differences I see are
> * not device-tree based
> * touchpads instead of touchscreens
>
> > [..]
>
> [0] https://unix.stackexchange.com/q/494400
Kind regards,
Jeff LaBundy
Hi!
> Some touchscreens are shipped with a physical layer on top of them where
> a number of buttons and a resized touchscreen surface might be
> available.
Yes, it is quite common; for example, the Motorola Droid 4 has 4 virtual
buttons below the touchscreen.
One question is whether this should be handled inside the kernel. It will
make it compatible with existing software, but it will also reduce
flexibility.
Best regards,
Pavel
--
People of Russia, stop Putin before his war on Ukraine escalates.
Hi Pavel,
On 4/27/23 13:04, Pavel Machek wrote:
> Hi!
>
>> Some touchscreens are shipped with a physical layer on top of them where
>> a number of buttons and a resized touchscreen surface might be
>> available.
>
> Yes, it is quite comon, for example Motorola Droid 4 has 4 virtual
> buttons below touchscreen.
Are those buttons configurable in some way? Or do they have a fixed purpose?
How does Android handle those buttons, BTW?
> One question is if this should be handled inside the kernel. It will
> make it compatible with existing software, but it will also reduce
> flexibility.
I would say that it should be described in device tree if the purpose is
fixed. For example, if there is no display behind the touch screen at a
certain point but a printed sheet (e.g., with a home or return symbol)
then it is clear that this button is not going to change. In such a case
I doubt that flexibility is required.
Best regards, Michael
>
> Best regards,
> Pavel
Hi!
> >
> >> Some touchscreens are shipped with a physical layer on top of them where
> >> a number of buttons and a resized touchscreen surface might be
> >> available.
> >
> > Yes, it is quite comon, for example Motorola Droid 4 has 4 virtual
> > buttons below touchscreen.
>
> Are those buttons configurable in some way? Or do they have a fixed purpose?
Fixed.
> How does Android handle those buttons, BTW?
No idea.
> > One question is if this should be handled inside the kernel. It will
> > make it compatible with existing software, but it will also reduce
> > flexibility.
>
> I would say that it should be described in device tree if the purpose is
> fixed. For example, if there is no display behind the touch screen at a
> certain point but a printed sheet (e.g., with a home or return symbol)
> then it is clear that this button is not going to change. In such a case
> I doubt that flexibility is required.
I agree it should be in the device tree.
AFAICT the hardware can do drags between the buttons, and drags between
the buttons and the touchscreen. Turning it into buttons prevents that.
Plus, real buttons can handle simultaneous presses on all of them, while
touchscreens will have problems with that.
Best regards,
Pavel
--
People of Russia, stop Putin before his war on Ukraine escalates.
Hi,
On 25.04.23 18:02, Jeff LaBundy wrote:
> Hi Thomas,
>
> On Tue, Apr 25, 2023 at 05:29:39PM +0200, Thomas Weißschuh wrote:
>> Hi Javier,
>>
>> On 2023-04-25 13:50:45+0200, Javier Carrasco wrote:
>>> Some touchscreens are shipped with a physical layer on top of them where
>>> a number of buttons and a resized touchscreen surface might be available.
>>>
>>> In order to generate proper key events by overlay buttons and adjust the
>>> touch events to a clipped surface, these patches offer a documented,
>>> device-tree-based solution by means of helper functions.
>>> An implementation for a specific touchscreen driver is also included.
>>>
>>> The functions in ts-virtobj provide a simple workflow to acquire
>>> physical objects from the device tree, map them into the device driver
>>> structures as virtual objects and generate events according to
>>> the object descriptions.
>>>
>>> This solution has been tested with a JT240MHQS-E3 display, which uses
>>> the st1624 as a touchscreen and provides two overly buttons and a frame
>>> that clips its effective surface.
>>
>> There are quite a few of notebooks from Asus that feature a printed
>> numpad on their touchpad [0]. The mapping from the touch events to the
>> numpad events needs to happen in software.
>
> That example seems a kind of fringe use-case in my opinion; I think the
> gap filled by this RFC is the case where a touchscreen has a printed
> overlay with a key that represents a fixed function.
Exactly, this RFC addresses exactly such printed overlays.
>
> One problem I do see here is something like libinput or multitouch taking
> hold of the input device, and swallowing the key presses because it sees
> the device as a touchscreen and is not interested in these keys.
Unfortunately I do not know libinput or multitouch and I might be
getting you wrong, but I guess the same would apply to any event
consumer that treats touchscreens as touch event producers and nothing else.
Should they not check the supported events of the device instead of
making such assumptions? This RFC adds key events defined in the device
tree, which are therefore available and published as device
capabilities. That is, for example, how evtest reports the supported
events, and they are then notified accordingly. Is that not the
right way to do it?
Thanks a lot for your feedback!
>
> Therefore, my first impression is that the virtual keypad may be better
> served by registering its own input device.
>
> Great work by the way, Javier!
>
>>
>> Do you think your solution is general enough to also support this
>> usecase?
>>
>> The differences I see are
>> * not device-tree based
>> * touchpads instead of touchscreens
>>
>>> [..]
>>
>> [0] https://unix.stackexchange.com/q/494400
>
> Kind regards,
> Jeff LaBundy
Hi Javier,
On Thu, Apr 27, 2023 at 05:59:42PM +0200, Javier Carrasco wrote:
> Hi,
>
> On 25.04.23 18:02, Jeff LaBundy wrote:
> > Hi Thomas,
> >
> > On Tue, Apr 25, 2023 at 05:29:39PM +0200, Thomas Weißschuh wrote:
> >> Hi Javier,
> >>
> >> On 2023-04-25 13:50:45+0200, Javier Carrasco wrote:
> >>> Some touchscreens are shipped with a physical layer on top of them where
> >>> a number of buttons and a resized touchscreen surface might be available.
> >>>
> >>> In order to generate proper key events by overlay buttons and adjust the
> >>> touch events to a clipped surface, these patches offer a documented,
> >>> device-tree-based solution by means of helper functions.
> >>> An implementation for a specific touchscreen driver is also included.
> >>>
> >>> The functions in ts-virtobj provide a simple workflow to acquire
> >>> physical objects from the device tree, map them into the device driver
> >>> structures as virtual objects and generate events according to
> >>> the object descriptions.
> >>>
> >>> This solution has been tested with a JT240MHQS-E3 display, which uses
> >>> the st1624 as a touchscreen and provides two overly buttons and a frame
> >>> that clips its effective surface.
> >>
> >> There are quite a few of notebooks from Asus that feature a printed
> >> numpad on their touchpad [0]. The mapping from the touch events to the
> >> numpad events needs to happen in software.
> >
> > That example seems a kind of fringe use-case in my opinion; I think the
> > gap filled by this RFC is the case where a touchscreen has a printed
> > overlay with a key that represents a fixed function.
>
> Exactly, this RFC addresses exactly such printed overlays.
> >
> > One problem I do see here is something like libinput or multitouch taking
> > hold of the input device, and swallowing the key presses because it sees
> > the device as a touchscreen and is not interested in these keys.
>
> Unfortunately I do not know libinput or multitouch and I might be
> getting you wrong, but I guess the same would apply to any event
> consumer that takes touchscreens as touch event producers and nothing else.
>
> Should they not check the supported events from the device instead of
> making such assumptions? This RFC adds key events defined in the device
> tree and they are therefore available and published as device
> capabilities. That is for example what evtest does to report the
> supported events and they are then notified accordingly. Is that not the
> right way to do it?
evtest is just that, a test tool. It's handy for ensuring the device emits
the appropriate input events in response to hardware inputs, but it is not
necessarily representative of how the input device may be used in practice.
I would encourage you to test this solution with a simple use-case such as
Raspbian, and the virtual keys mapped to easily recognizable functions like
volume up/down.
Here, you will find that libinput will grab the device and declare it to be
a touchscreen based on the input events it advertises. However, you will not
see the volume up/down keys handled.
If you break out the virtual keypad as a separate input device, however, you
will see libinput additionally recognize it as a keyboard and volume up/down
keys will be handled. It is for this reason that a handful of drivers with
this kind of mixed functionality (e.g. ad714x) already branch out multiple
input devices for each function.
As a matter of principle, I find it to be most flexible for logically separate
functions to be represented as logically separate input devices, even if those
input devices all stem from the same piece of hardware. Not only does it allow
you to attach different handlers to each device (i.e. file descriptor), but it
also allows user space to inhibit one device but not the other, etc.
Maybe the right approach, which your RFC already seems to support, is to simply
let the driver decide whether to pass the touchscreen input_dev or a different
input_dev. The driver would be responsible for allocating and registering the
keypad; your functions simply set the capabilities for, and report events from,
whichever input_dev is passed to them. This is something to consider for your
st1232 example.
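For illustration, the suggestion above could look roughly like this in a driver. This is only a hedged sketch using the standard input-core API; the device name and key codes are made up, and the actual ts-virtobj helpers may expose a different interface:

```c
#include <linux/err.h>
#include <linux/input.h>

/*
 * Rough sketch of the suggestion above: the driver allocates and
 * registers a separate input device for the overlay keypad, and the
 * helper functions would then report events through whichever
 * input_dev they are handed.  Names and key codes are illustrative.
 */
static struct input_dev *example_register_keypad(struct device *dev)
{
	struct input_dev *keypad;
	int error;

	keypad = devm_input_allocate_device(dev);
	if (!keypad)
		return ERR_PTR(-ENOMEM);

	keypad->name = "st1232-overlay-keypad";	/* hypothetical name */
	input_set_capability(keypad, EV_KEY, KEY_VOLUMEUP);
	input_set_capability(keypad, EV_KEY, KEY_VOLUMEDOWN);

	error = input_register_device(keypad);
	if (error)
		return ERR_PTR(error);

	return keypad;
}
```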
>
> Thanks a lot for your feedback!
> >
> > Therefore, my first impression is that the virtual keypad may be better
> > served by registering its own input device.
> >
> > Great work by the way, Javier!
> >
> >>
> >> Do you think your solution is general enough to also support this
> >> usecase?
> >>
> >> The differences I see are
> >> * not device-tree based
> >> * touchpads instead of touchscreens
> >>
> >>> [..]
> >>
> >> [0] https://unix.stackexchange.com/q/494400
> >
> > Kind regards,
> > Jeff LaBundy
Kind regards,
Jeff LaBundy
Hi Pavel,
On Thu, Apr 27, 2023 at 03:15:19PM +0200, Pavel Machek wrote:
> Hi!
>
> > >
> > >> Some touchscreens are shipped with a physical layer on top of them where
> > >> a number of buttons and a resized touchscreen surface might be
> > >> available.
> > >
> > > Yes, it is quite common, for example Motorola Droid 4 has 4 virtual
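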
> > > buttons below touchscreen.
> >
> > Are those buttons configurable in some way? Or do they have a fixed purpose?
>
> Fixed.
>
> > How does Android handle those buttons, BTW?
>
> No idea.
>
> > > One question is if this should be handled inside the kernel. It will
> > > make it compatible with existing software, but it will also reduce
> > > flexibility.
That's a great question; I think there are arguments for both.
On one hand, we generally want the kernel to be responsible for nothing more
than handing off the raw coordinate and touch information to user space. Any
further translation of that represents policy which would not belong here.
On the other hand, the notion of what buttons exist and where is very much a
hardware statement for the use-case targeted by this RFC. It would be ideal
if both the kernel and user space did not need to know information about the
same piece of hardware. So I think it is OK for the driver to give some help
by doing some of its own interpretation, much like some hardware-accelerated
solutions already do.
While there are obviously exceptions in either case, I don't see any reason
to prohibit having a simple option like this in the kernel, especially since
it doesn't preclude having something in user space for more advanced cases.
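For the simple fixed-overlay case, the interpretation the kernel would do is little more than rectangle hit-testing against the device-tree-described objects. A stand-alone sketch of that logic (plain C; names and layout are illustrative, not taken from the actual ts-virtobj patches):

```c
#include <stdbool.h>

/* Illustrative only: rectangles describing the overlay objects on the
 * raw sensor surface, much as a device tree for such a panel might. */
struct rect {
	int x, y, w, h;
};

static bool rect_contains(const struct rect *r, int x, int y)
{
	return x >= r->x && x < r->x + r->w &&
	       y >= r->y && y < r->y + r->h;
}

/*
 * Map a raw contact to an overlay object: returns the index of the hit
 * button (report a key event), -1 for a contact on the clipped screen
 * (report a touch event with *tx/*ty translated into the clipped
 * surface's own origin), or -2 for a contact outside any object (drop).
 */
static int map_contact(const struct rect *buttons, int nbuttons,
		       const struct rect *screen, int x, int y,
		       int *tx, int *ty)
{
	int i;

	for (i = 0; i < nbuttons; i++)
		if (rect_contains(&buttons[i], x, y))
			return i;
	if (!rect_contains(screen, x, y))
		return -2;
	*tx = x - screen->x;
	*ty = y - screen->y;
	return -1;
}
```

This also makes the multitouch point below concrete: the mapping is applied per contact, so one finger on a button and another on the screen are independent events.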
> >
> > I would say that it should be described in device tree if the purpose is
> > fixed. For example, if there is no display behind the touch screen at a
> > certain point but a printed sheet (e.g., with a home or return symbol)
> > then it is clear that this button is not going to change. In such a case
> > I doubt that flexibility is required.
>
> I agree it should be in the device tree.
>
> AFAICT hardware can do drags between the buttons, and drag between the
> buttons and touchscreen. Turning it into buttons prevents that.
>
> Plus, real buttons can do simultaneous presses on all of them,
> touchscreens will have problems with that.
I interpreted the RFC and its example to accommodate multitouch support, so
I don't see any problem here unless the vendor built such a module without
a multitouch panel, which would not make sense. Let me know in case I have
misunderstood the concern.
>
> Best regards,
> Pavel
> --
> People of Russia, stop Putin before his war on Ukraine escalates.
Kind regards,
Jeff LaBundy
Hi Jeff,
On 27.04.23 19:23, Jeff LaBundy wrote:
> Hi Javier,
>
> On Thu, Apr 27, 2023 at 05:59:42PM +0200, Javier Carrasco wrote:
>> Hi,
>>
>> On 25.04.23 18:02, Jeff LaBundy wrote:
>>> Hi Thomas,
>>>
>>> On Tue, Apr 25, 2023 at 05:29:39PM +0200, Thomas Weißschuh wrote:
>>>> Hi Javier,
>>>>
>>>> On 2023-04-25 13:50:45+0200, Javier Carrasco wrote:
>>>>> Some touchscreens are shipped with a physical layer on top of them where
>>>>> a number of buttons and a resized touchscreen surface might be available.
>>>>>
>>>>> In order to generate proper key events by overlay buttons and adjust the
>>>>> touch events to a clipped surface, these patches offer a documented,
>>>>> device-tree-based solution by means of helper functions.
>>>>> An implementation for a specific touchscreen driver is also included.
>>>>>
>>>>> The functions in ts-virtobj provide a simple workflow to acquire
>>>>> physical objects from the device tree, map them into the device driver
>>>>> structures as virtual objects and generate events according to
>>>>> the object descriptions.
>>>>>
>>>>> This solution has been tested with a JT240MHQS-E3 display, which uses
>>>>> the st1624 as a touchscreen and provides two overlay buttons and a frame
>>>>> that clips its effective surface.
>>>>
>>>> There are quite a few of notebooks from Asus that feature a printed
>>>> numpad on their touchpad [0]. The mapping from the touch events to the
>>>> numpad events needs to happen in software.
>>>
>>> That example seems a kind of fringe use-case in my opinion; I think the
>>> gap filled by this RFC is the case where a touchscreen has a printed
>>> overlay with a key that represents a fixed function.
>>
>> Exactly, this RFC addresses exactly such printed overlays.
>>>
>>> One problem I do see here is something like libinput or multitouch taking
>>> hold of the input device, and swallowing the key presses because it sees
>>> the device as a touchscreen and is not interested in these keys.
>>
>> Unfortunately I do not know libinput or multitouch and I might be
>> getting you wrong, but I guess the same would apply to any event
>> consumer that takes touchscreens as touch event producers and nothing else.
>>
>> Should they not check the supported events from the device instead of
>> making such assumptions? This RFC adds key events defined in the device
>> tree and they are therefore available and published as device
>> capabilities. That is for example what evtest does to report the
>> supported events and they are then notified accordingly. Is that not the
>> right way to do it?
>
> evtest is just that, a test tool. It's handy for ensuring the device emits
> the appropriate input events in response to hardware inputs, but it is not
> necessarily representative of how the input device may be used in practice.
You are right. I might have been biased by my use case though, where a
touchscreen with key capabilities is exactly that, and there is no
reason to ignore any event if the capabilities are available.
Well, props to evtest for being representative of at least that
practical use.
>
> I would encourage you to test this solution with a simple use-case such as
> Raspbian, and the virtual keys mapped to easily recognizable functions like
> volume up/down.
>
> Here, you will find that libinput will grab the device and declare it to be
> a touchscreen based on the input events it advertises. However, you will not
> see volume up/down keys are handled.
>
> If you break out the virtual keypad as a separate input device, however, you
> will see libinput additionally recognize it as a keyboard and volume up/down
> keys will be handled. It is for this reason that a handful of drivers with
> this kind of mixed functionality (e.g. ad714x) already branch out multiple
> input devices for each function.
>
> As a matter of principle, I find it to be most flexible for logically separate
> functions to be represented as logically separate input devices, even if those
> input devices all stem from the same piece of hardware. Not only does it allow
> you to attach different handlers to each device (i.e. file descriptor), but it
> also allows user space to inhibit one device but not the other, etc.
I had complex devices in mind where many capabilities are provided (like
a mouse with several buttons, wheels and who knows what else, or a bunch
of other complex pieces of hardware) but are still registered as a
single input device. That makes the whole functionality accessible
within a single object that translates 1:1 to the actual hardware, but
on the other hand it lacks the flexibility you mention.
Nevertheless, in the end this RFC applies to touchscreens, and if the
existing tools do not expect them to have key events, the keys must be
advertised in a different way. And as I want any tool to identify the
touchscreen and the keys properly, I will go for the multi-device solution.
> Maybe the right approach, which your RFC already seems to support, is to simply
> let the driver decide whether to pass the touchscreen input_dev or a different
> input_dev. The driver would be responsible for allocating and registering the
> keypad; your functions simply set the capabilities for, and report events from,
> whichever input_dev is passed to them. This is something to consider for your
> st1232 example.
I would let the drivers register the devices that fit better in each
case according to the objects defined in the device tree and the
hardware configuration. Of course I could include the device
registration too, but that would probably reduce flexibility with no
real gain.
This RFC will not work out of the box with several input devices from a
single driver, because it sets the key capabilities right away and
always assumes there is only one input device. But splitting that part
out is rather trivial, and the rest does not need to change much as it
works with generic input devices.
The st1232 example will need some bigger changes though, so that part
will change a bit in the next version.
>
>>
>> Thanks a lot for your feedback!
>>>
>>> Therefore, my first impression is that the virtual keypad may be better
>>> served by registering its own input device.
>>>
>>> Great work by the way, Javier!
>>>
>>>>
>>>> Do you think your solution is general enough to also support this
>>>> usecase?
>>>>
>>>> The differences I see are
>>>> * not device-tree based
>>>> * touchpads instead of touchscreens
>>>>
>>>>> [..]
>>>>
>>>> [0] https://unix.stackexchange.com/q/494400
>>>
>>> Kind regards,
>>> Jeff LaBundy
>
> Kind regards,
> Jeff LaBundy
Thanks again for your feedback, I will keep your comments in mind for
the next version.
Best regards,
Javier Carrasco
On Tue, Apr 25, 2023 at 1:51 PM Javier Carrasco
<[email protected]> wrote:
> Some touchscreens are shipped with a physical layer on top of them where
> a number of buttons and a resized touchscreen surface might be available.
The APQ8060 DragonBoard even shipped with two different
stickers to be put over the touchscreen: one for Android and
another one for Windows Mobile.
True story!
Yours,
Linus Walleij
On Thu, Apr 27, 2023 at 12:23:14PM -0500, Jeff LaBundy wrote:
> Hi Javier,
>
> On Thu, Apr 27, 2023 at 05:59:42PM +0200, Javier Carrasco wrote:
> > Hi,
> >
> > On 25.04.23 18:02, Jeff LaBundy wrote:
> > > Hi Thomas,
> > >
> > > On Tue, Apr 25, 2023 at 05:29:39PM +0200, Thomas Weißschuh wrote:
> > >> Hi Javier,
> > >>
> > >> On 2023-04-25 13:50:45+0200, Javier Carrasco wrote:
> > >>> Some touchscreens are shipped with a physical layer on top of them where
> > >>> a number of buttons and a resized touchscreen surface might be available.
> > >>>
> > >>> In order to generate proper key events by overlay buttons and adjust the
> > >>> touch events to a clipped surface, these patches offer a documented,
> > >>> device-tree-based solution by means of helper functions.
> > >>> An implementation for a specific touchscreen driver is also included.
> > >>>
> > >>> The functions in ts-virtobj provide a simple workflow to acquire
> > >>> physical objects from the device tree, map them into the device driver
> > >>> structures as virtual objects and generate events according to
> > >>> the object descriptions.
> > >>>
> > >>> This solution has been tested with a JT240MHQS-E3 display, which uses
> > >>> the st1624 as a touchscreen and provides two overlay buttons and a frame
> > >>> that clips its effective surface.
> > >>
> > >> There are quite a few of notebooks from Asus that feature a printed
> > >> numpad on their touchpad [0]. The mapping from the touch events to the
> > >> numpad events needs to happen in software.
> > >
> > > That example seems a kind of fringe use-case in my opinion; I think the
> > > gap filled by this RFC is the case where a touchscreen has a printed
> > > overlay with a key that represents a fixed function.
> >
> > Exactly, this RFC addresses exactly such printed overlays.
> > >
> > > One problem I do see here is something like libinput or multitouch taking
> > > hold of the input device, and swallowing the key presses because it sees
> > > the device as a touchscreen and is not interested in these keys.
> >
> > Unfortunately I do not know libinput or multitouch and I might be
> > getting you wrong, but I guess the same would apply to any event
> > consumer that takes touchscreens as touch event producers and nothing else.
> >
> > Should they not check the supported events from the device instead of
> > making such assumptions? This RFC adds key events defined in the device
> > tree and they are therefore available and published as device
> > capabilities. That is for example what evtest does to report the
> > supported events and they are then notified accordingly. Is that not the
> > right way to do it?
>
> evtest is just that, a test tool. It's handy for ensuring the device emits
> the appropriate input events in response to hardware inputs, but it is not
> necessarily representative of how the input device may be used in practice.
ftr, I strongly recommend "libinput record" over evtest since it can be
replayed. And for libinput testing "libinput debug-events" to see what
comes out of libinput.
> I would encourage you to test this solution with a simple use-case such as
> Raspbian, and the virtual keys mapped to easily recognizable functions like
> volume up/down.
>
> Here, you will find that libinput will grab the device and declare it to be
> a touchscreen based on the input events it advertises. However, you will not
> see volume up/down keys are handled.
that would be a bug in libinput. libinput doesn't classify devices. It
uses *internal* backends but the backend for keyboard and touchscreen
devices is the same. So as long as your device advertises the various
EV_KEY and EV_ABS bit correctly, things should just work. If that's not
the case for a device please file a bug.
It's still "better" to split it up into different event nodes because
a lot of userspace may not be able to handle touchscreen+keyboard
devices but at least at the libinput level this shouldn't be a problem.
And the xf86-input-libinput driver splits up such devices at the X
level, so even where a device is touchscreen + keyboard you would end up
with two X devices with separate capabilities so they fit into the X
"everything is either a pointer or a keyboard" worldview.
Cheers,
Peter
Hi Peter and Javier,
On Thu, May 04, 2023 at 02:29:27PM +1000, Peter Hutterer wrote:
> On Thu, Apr 27, 2023 at 12:23:14PM -0500, Jeff LaBundy wrote:
> > Hi Javier,
> >
> > On Thu, Apr 27, 2023 at 05:59:42PM +0200, Javier Carrasco wrote:
> > > Hi,
> > >
> > > On 25.04.23 18:02, Jeff LaBundy wrote:
> > > > Hi Thomas,
> > > >
> > > > On Tue, Apr 25, 2023 at 05:29:39PM +0200, Thomas Weißschuh wrote:
> > > >> Hi Javier,
> > > >>
> > > >> On 2023-04-25 13:50:45+0200, Javier Carrasco wrote:
> > > >>> Some touchscreens are shipped with a physical layer on top of them where
> > > >>> a number of buttons and a resized touchscreen surface might be available.
> > > >>>
> > > >>> In order to generate proper key events by overlay buttons and adjust the
> > > >>> touch events to a clipped surface, these patches offer a documented,
> > > >>> device-tree-based solution by means of helper functions.
> > > >>> An implementation for a specific touchscreen driver is also included.
> > > >>>
> > > >>> The functions in ts-virtobj provide a simple workflow to acquire
> > > >>> physical objects from the device tree, map them into the device driver
> > > >>> structures as virtual objects and generate events according to
> > > >>> the object descriptions.
> > > >>>
> > > >>> This solution has been tested with a JT240MHQS-E3 display, which uses
> > > >>> the st1624 as a touchscreen and provides two overlay buttons and a frame
> > > >>> that clips its effective surface.
> > > >>
> > > >> There are quite a few of notebooks from Asus that feature a printed
> > > >> numpad on their touchpad [0]. The mapping from the touch events to the
> > > >> numpad events needs to happen in software.
> > > >
> > > > That example seems a kind of fringe use-case in my opinion; I think the
> > > > gap filled by this RFC is the case where a touchscreen has a printed
> > > > overlay with a key that represents a fixed function.
> > >
> > > Exactly, this RFC addresses exactly such printed overlays.
> > > >
> > > > One problem I do see here is something like libinput or multitouch taking
> > > > hold of the input device, and swallowing the key presses because it sees
> > > > the device as a touchscreen and is not interested in these keys.
> > >
> > > Unfortunately I do not know libinput or multitouch and I might be
> > > getting you wrong, but I guess the same would apply to any event
> > > consumer that takes touchscreens as touch event producers and nothing else.
> > >
> > > Should they not check the supported events from the device instead of
> > > making such assumptions? This RFC adds key events defined in the device
> > > tree and they are therefore available and published as device
> > > capabilities. That is for example what evtest does to report the
> > > supported events and they are then notified accordingly. Is that not the
> > > right way to do it?
> >
> > evtest is just that, a test tool. It's handy for ensuring the device emits
> > the appropriate input events in response to hardware inputs, but it is not
> > necessarily representative of how the input device may be used in practice.
>
> ftr, I strongly recommend "libinput record" over evtest since it can be
> replayed. And for libinput testing "libinput debug-events" to see what
> comes out of libinput.
>
> > I would encourage you to test this solution with a simple use-case such as
> > Raspbian, and the virtual keys mapped to easily recognizable functions like
> > volume up/down.
> >
> > Here, you will find that libinput will grab the device and declare it to be
> > a touchscreen based on the input events it advertises. However, you will not
> > see volume up/down keys are handled.
>
> that would be a bug in libinput. libinput doesn't classify devices. It
> uses *internal* backends but the backend for keyboard and touchscreen
> devices is the same. So as long as your device advertises the various
> EV_KEY and EV_ABS bits correctly, things should just work. If that's not
> the case for a device please file a bug.
Please accept my apology for spreading misinformation; the sighting occurred
some time ago and I appear to have mixed up some observations.
I recreated my original issue just now and the problem is actually with LIRC,
which in this case is presenting the hybrid input device to VLC media player
as a remote control.
Prior to launching VLC media player, both touchscreen movement and key events
are handled just fine. Once VLC media player launches and LIRC begins handling
the key events, however, all touchscreen functionality is lost.
Upon closer inspection, it seems that LIRC creates another input device called
"lircd bypass" which relays the "left over" (i.e. touchscreen) events. However,
it seems LIRC does not copy the axis limits, so libinput rightfully rejects
the new device since min ABS_X = max ABS_X = 0.
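The failure mode is easy to reproduce in isolation. A minimal sketch (plain C, with a simplified stand-in for struct input_absinfo) of the check a consumer applies and the copy a relaying device must perform:

```c
#include <stdbool.h>
#include <string.h>

/* Simplified stand-in for struct input_absinfo from <linux/input.h>. */
struct absinfo {
	int minimum;
	int maximum;
	int resolution;
};

/* An axis with an empty range (min == max, as with the "lircd bypass"
 * device where min ABS_X == max ABS_X == 0) carries no positional
 * information, so a consumer like libinput is right to reject it. */
static bool abs_range_valid(const struct absinfo *a)
{
	return a->maximum > a->minimum;
}

/* A device that relays another device's axes must copy the limits
 * verbatim from the source device. */
static void abs_copy_limits(struct absinfo *dst, const struct absinfo *src)
{
	memcpy(dst, src, sizeof(*dst));
}
```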
Therefore, please ignore my sighting with regard to this RFC; it is neither a
bug in libinput nor a valid argument in shaping this RFC. This instead seems
like a possible bug in LIRC, so I will report it there.
>
> It's still "better" to split it up into different event nodes because
> a lot of userspace may not be able to handle touchscreen+keyboard
> devices but at least at the libinput level this shouldn't be a problem.
I still agree; if nothing else, for the ability to inhibit different functions
at a more granular level. Therefore it seems best that patch [1/4] not mandate
the two input devices to be the same, which it doesn't appear to do anyway.
That being said, Javier, feel free to disregard my suggestion that the input
devices in patch [3/4] remain separate. Sorry for the churn; this was still
very helpful for me at least :)
>
> And the xf86-input-libinput driver splits up such devices at the X
> level, so even where a device is touchscreen + keyboard you would end up
> with two X devices with separate capabilities so they fit into the X
> "everything is either a pointer or a keyboard" worldview.
>
> Cheers,
> Peter
>
Kind regards,
Jeff LaBundy