Hi,
While reading the input code, I found the following error in
include/linux/input.h:
--- include/linux/input.h 2003-08-23 01:54:59.000000000 +0200
+++ include/linux/input.h.mod 2003-09-16 21:38:10.000000000 +0200
@@ -751,7 +751,7 @@
 #define LONG(x) ((x)/BITS_PER_LONG)
 #define INPUT_KEYCODE(dev, scancode) ((dev->keycodesize == 1) ? ((u8*)dev->keycode)[scancode] : \
-	((dev->keycodesize == 1) ? ((u16*)dev->keycode)[scancode] : (((u32*)dev->keycode)[scancode])))
+	((dev->keycodesize == 2) ? ((u16*)dev->keycode)[scancode] : (((u32*)dev->keycode)[scancode])))
 #define init_input_dev(dev)	do { INIT_LIST_HEAD(&((dev)->h_list)); INIT_LIST_HEAD(&((dev)->node)); } while (0)
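
To see why the second test matters, here is a minimal standalone sketch
(the stub struct and the sample table are made up for illustration, they
are not kernel code) showing how the corrected macro selects the access
width from keycodesize:

#include <stdio.h>

typedef unsigned char u8;
typedef unsigned short u16;
typedef unsigned int u32;

/* Made-up stand-in for the relevant input_dev fields. */
struct dev_stub {
	int keycodesize;	/* bytes per keycode table entry: 1, 2 or 4 */
	void *keycode;		/* keycode table */
};

/* Corrected macro: the second test must be == 2, not == 1 again. */
#define INPUT_KEYCODE(dev, scancode) ((dev->keycodesize == 1) ? \
	((u8*)dev->keycode)[scancode] : \
	((dev->keycodesize == 2) ? ((u16*)dev->keycode)[scancode] : \
	(((u32*)dev->keycode)[scancode])))

int main(void)
{
	u16 table[] = { 100, 200, 300 };
	struct dev_stub d = { 2, table };
	struct dev_stub *dev = &d;

	/* keycodesize == 2, so the u16 branch is taken and entry 1 is read. */
	printf("%u\n", INPUT_KEYCODE(dev, 1));	/* prints 200 */
	return 0;
}

With the typo, a keycodesize of 2 would never match and the lookup would
fall through to the u32 cast, reading the wrong entry from a 16-bit
keycode table.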
Remi
On Tue, Sep 16, 2003 at 09:42:38PM +0200, Remi Colinet wrote:
> While reading the input code, I found the following error in
> include/linux/input.h:
>
> #define INPUT_KEYCODE(dev, scancode) ((dev->keycodesize == 1) ? ((u8*)dev->keycode)[scancode] : \
> -	((dev->keycodesize == 1) ? ((u16*)dev->keycode)[scancode] : (((u32*)dev->keycode)[scancode])))
> +	((dev->keycodesize == 2) ? ((u16*)dev->keycode)[scancode] : (((u32*)dev->keycode)[scancode])))
Yes.
(But we only use a keycodesize of 1 if I am not mistaken, so there is no change in kernel behaviour.)