Gather round children, it’s story time. Especially for you children who lurk on /r/linux and think you may learn something there. Today, I’ll tell you a horror story. The one where we convert kernel input events into touchpad events, with the subtle subtitle of “friends don’t let friends handle evdev events”.
The question put forward is “why do we need libinput at all”, when, as frequently suggested on the usual websites, it’s sufficient to just read evdev data and there’s really no need for libinput. That is of course true. You can use evdev events from the kernel directly. Did you know that the events the kernel gives you are absolute coordinates? And that not all touchpads have buttons? Or that some touchpads have specific event sequences that need to be filtered? No? Well, boy, are you in for a few surprises! Anyway, let’s go and handle evdev events ourselves and write our own libmyinput.
How do we know something is a touchpad? Well, we look at the exposed evdev bits. We need ABS_X, ABS_Y and BTN_TOOL_FINGER but don’t want INPUT_PROP_DIRECT. If the latter bit is set then we have a touchscreen (probably). We don’t actually care about buttons here, that comes later. ABS_X and ABS_Y give us device-absolute coordinates. On touch down you get the evdev frame of “a finger is down at x/y device units from the top-left”. As you move around, you get the x/y coordinate updates. The data itself is exactly the same as you would get from a touchscreen, but we know it’s a touchpad because we queried the other bits at startup. So your first job is to convert the absolute x/y coordinates to deltas by subtracting the previous position.
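The delta conversion described above can be sketched in a few lines. This is a minimal illustration, not libinput's implementation; note the one subtlety that bites everyone: the previous position must be reset on touch down, or the first event of a new touch produces a giant bogus jump across the screen.

```python
# Sketch: convert absolute touchpad coordinates into motion deltas.
# Positions are in device units as delivered via ABS_X/ABS_Y.

class DeltaTracker:
    def __init__(self):
        self.prev = None  # (x, y) in device units, None while no finger is down

    def touch_down(self, x, y):
        # seed the previous position so the first motion delta is sane
        self.prev = (x, y)

    def touch_up(self):
        self.prev = None

    def update(self, x, y):
        """Return (dx, dy) in device units for a motion event."""
        if self.prev is None:
            return (0, 0)  # motion without a preceding touch down: ignore
        dx = x - self.prev[0]
        dy = y - self.prev[1]
        self.prev = (x, y)
        return (dx, dy)
```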
Touchpads have different resolutions for x and y so a delta of 10/10 does not mean it’s a 45-degree movement. Better check with the resolution to convert this to physical distances to be on the safe side. Oh, btw, the axes aren’t reliable. The min/max ranges and the resolutions are wrong on a large number of touchpads. Luckily systemd fixes this for you with the 60-evdev.hwdb. But I should probably note that hwdb only exists because of libinput… Either way, you don’t have to care about it because the road’s already paved. You’re welcome.
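Normalizing by resolution is a one-liner, but it's the difference between “the pointer moves diagonally when my finger moves diagonally” and angry bug reports. A sketch, assuming the per-axis resolution in units/mm as the kernel reports it for absolute axes:

```python
# Sketch: convert device-unit deltas to physical distance (mm) using
# the per-axis resolution. A 10/10 device-unit delta on a touchpad
# with res_x=12, res_y=20 units/mm is nowhere near a 45-degree move.

def delta_to_mm(dx, dy, res_x, res_y):
    """Convert device-unit deltas to millimetres."""
    return (dx / res_x, dy / res_y)
```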
Oh wait, you do have to care a little because there are touchpads (e.g. HP Stream 11, ZBook Studio G3, …) where bits are missing or wrong. So you better write a device database that tells you when you have to correct the evdev bits. You could implement this as a config option but that’s just saying “I know what’s wrong here, I know how to fix it, but I’m still going to make you google for it and edit a local configuration file to make it work”. You could treat your users this way, but you really shouldn’t.
As you’re happily processing your deltas, you notice that on some touchpads you get motion before you touch the touchpad. Oops, we need a way to tell whether a finger is down. Luckily the kernel gives you BTN_TOUCH for that event, so you switch your implementation to only calculate deltas when BTN_TOUCH is set. But then you realise that is effectively a hardcoded threshold in the kernel and does not match a lot of devices. Some devices require too-hard finger pressure to trigger BTN_TOUCH, others send it on super-light pressure or even while hovering. After grinding some enamel away you find that many touchpads give you ABS_PRESSURE. Awesome, let’s make touches pressure-based instead. Let’s use a threshold, no, I mean a device-specific threshold (because if two touchpads were the same the universe would stop doing whatever a universe does, I clearly haven’t thought this through). Luckily we already have the device database so we just add the thresholds there.
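A pressure-based touch state in its simplest form looks something like the sketch below. The threshold values are invented; in practice they come from the per-device database, and you want two of them (hysteresis) so a finger whose pressure wobbles around a single threshold doesn’t flicker between down and up:

```python
# Sketch: pressure-based touch detection with hysteresis.
# The default thresholds are made-up illustrative values; real ones
# are device-specific and live in a quirks database.

class PressureTouch:
    def __init__(self, upper=30, lower=25):
        self.upper = upper   # pressure above this: finger goes down
        self.lower = lower   # pressure below this: finger goes up
        self.down = False

    def update(self, pressure):
        """Feed an ABS_PRESSURE value, return whether the finger is down."""
        if not self.down and pressure > self.upper:
            self.down = True
        elif self.down and pressure < self.lower:
            self.down = False
        return self.down
```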
Oh, if you want this to run on an Apple touchpad you better implement touch size handling (ABS_MT_TOUCH_MAJOR/ABS_MT_TOUCH_MINOR). These axes give you the size of the touching ellipse which is great. Except that the value is just an arbitrary number range that has no relation to physical properties, so better update your database so you can add those thresholds.
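The size-based variant is the same idea with different inputs. A sketch, with the caveat that the threshold is unitless and per-device, so it too has to come from the database:

```python
# Sketch: touch detection by touch size for devices that report
# ABS_MT_TOUCH_MAJOR/ABS_MT_TOUCH_MINOR. The threshold is an
# arbitrary device-specific number, illustrative here.

def is_touch_by_size(major, minor, threshold):
    # use the mean of the ellipse axes as a crude size measure
    return (major + minor) / 2 > threshold
```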
Ok, now we have single-finger handling in our libnotinput. Let’s add some sophisticated touchpad features like button clicks. Buttons are easy, the kernel gives us BTN_LEFT and BTN_RIGHT and, if you’re lucky, BTN_MIDDLE. Unless you have a clickpad of course in which case you only ever get BTN_LEFT because the whole touchpad can be depressed (much like you, if you continue writing your own evdev handling). Those clickpads are in the majority of laptops these days, so we have to deal with them. The two approaches we have are “software button areas” and “clickfinger”. The former detects where your finger is when you push the touchpad down – if it’s in the bottom right corner we convert the kernel’s BTN_LEFT to a BTN_RIGHT and pass that on. Decide how big the buttons will be (note: some touchpads that need software buttons are only 50mm high, others exceed 100mm height). Whatever size you choose, it’s an invisible line on the touchpad. Do you know yet how you will handle a finger that moves from outside the button area into the button area before the click? Or the other way round? Maybe add this to your todo list for fixing later.
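The software-button remapping itself is simple geometry; all the pain is in the edge cases the paragraph mentions. A sketch, with illustrative dimensions (the button codes match the kernel’s values, everything else is made up):

```python
# Sketch: software button areas on a clickpad. The kernel only ever
# sends BTN_LEFT; we remap based on where the finger sits when the
# pad is physically clicked. Dimensions are illustrative.

BTN_LEFT, BTN_RIGHT = 0x110, 0x111

def software_button(x, y, width, height, button_height):
    """Map a physical click at (x, y) device units to a button code."""
    if y < height - button_height:
        return BTN_LEFT          # main touchpad area: always left
    if x < width / 2:
        return BTN_LEFT          # bottom-left software button
    return BTN_RIGHT             # bottom-right software button
```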
Maybe “clickfinger” is easier? It counts how many fingers are on the touchpad when clicking (1 finger == left click, 2 fingers == right click, 3 fingers == middle click). Much easier, except that so far we only handle one finger. The easy fix is to use BTN_TOOL_DOUBLETAP and BTN_TOOL_TRIPLETAP which are bitflags that tell you when a second/third finger is down. Add that to your libthisisnotlibinput. Incidentally, users often click with their thumb while moving. So you have one finger moving the pointer, then a thumb click. Two fingers down but the user doesn’t perceive it as such, this should be a left click. Oops, we don’t actually know where the second finger is.
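The naive clickfinger logic, before the thumb problem ruins it, is just a finger count derived from the BTN_TOOL_* states plus a lookup. A sketch (button codes are the kernel’s, the rest is illustrative):

```python
# Sketch: clickfinger. The kernel sets at most one of the BTN_TOOL_*
# finger-count flags at a time; derive a count, then map it to a button.

BTN_LEFT, BTN_RIGHT, BTN_MIDDLE = 0x110, 0x111, 0x112

def finger_count(tool_finger, doubletap, tripletap):
    """Derive the finger count from the BTN_TOOL_* flag states."""
    if tripletap:
        return 3
    if doubletap:
        return 2
    if tool_finger:
        return 1
    return 0

def clickfinger_button(nfingers):
    return {1: BTN_LEFT, 2: BTN_RIGHT, 3: BTN_MIDDLE}.get(nfingers, BTN_LEFT)
```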
Let’s switch our libstillnotlibinput to use ABS_MT_POSITION_X and ABS_MT_POSITION_Y because that gives us per-finger position information (once you understand how the kernel’s MT protocol slots work). And when I say “switch” of course I meant “add” because there are still touchpads in use that don’t support multitouch so you get to keep both implementations. There are also a bunch of touchpads that can give you the position of two fingers but not of the third. Wipe that tear away and pencil that into your todo list. I haven’t mentioned semi-mt devices yet that will give you multitouch position data for two fingers but it won’t track them correctly – the first touch position is always the top/left of the bounding box, the second touch is always the bottom/right of the bounding box. Do the right thing for our libwhathaveidone and just pretend semi-mt devices are single-touch touchpads. libinput (the real one) does the same because my sanity is something to be cherished.
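The MT slot bookkeeping alone is a small state machine: ABS_MT_SLOT selects the slot that subsequent events apply to, a non-negative ABS_MT_TRACKING_ID starts a touch, -1 ends it. A simplified sketch (the axis codes match linux/input-event-codes.h; real frames also need SYN_REPORT handling, which is omitted here):

```python
# Sketch: minimal kernel MT protocol type B slot tracking.

ABS_MT_SLOT, ABS_MT_TRACKING_ID = 0x2f, 0x39
ABS_MT_POSITION_X, ABS_MT_POSITION_Y = 0x35, 0x36

class SlotTracker:
    def __init__(self, nslots):
        self.slots = [None] * nslots    # None == no touch in this slot
        self.current = 0

    def event(self, code, value):
        if code == ABS_MT_SLOT:
            self.current = value        # following events target this slot
        elif code == ABS_MT_TRACKING_ID:
            if value == -1:
                self.slots[self.current] = None     # touch ended
            elif self.slots[self.current] is None:
                self.slots[self.current] = [0, 0]   # new touch begins
        elif code == ABS_MT_POSITION_X and self.slots[self.current] is not None:
            self.slots[self.current][0] = value
        elif code == ABS_MT_POSITION_Y and self.slots[self.current] is not None:
            self.slots[self.current][1] = value
```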
Oh, on another note, some touchpads don’t have any buttons (some Wacom tablets are large touchpads). Add that to your todo list. You wanted middle buttons to work? Few touchpads have a middle button (clickpads never do anyway). Better write a middle button emulation system that generates BTN_MIDDLE when both buttons are pressed. Or when a finger is on the left and another finger is on the right software button. Or when a finger is in a virtual middle button area. All these need to be present because if not, you get dissed by users for not implementing their favourite interaction method.
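The both-buttons-pressed variant of middle button emulation reduces to a tiny mapping, sketched below. The hard part, deliberately left out here, is the timing: a real implementation has to briefly delay the first button press to see whether the second one follows.

```python
# Sketch: middle button emulation from two physical buttons.
# Button codes match the kernel's; the timing logic is omitted.

BTN_LEFT, BTN_RIGHT, BTN_MIDDLE = 0x110, 0x111, 0x112

def emulated_button(left_down, right_down):
    """Map the state of both physical buttons to one logical button."""
    if left_down and right_down:
        return BTN_MIDDLE
    if left_down:
        return BTN_LEFT
    if right_down:
        return BTN_RIGHT
    return None
```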
So we’re several paragraphs in and so far we have: finger tracking and some button handling. And a bunch of things on the todo list. We haven’t even started with other fancy features like edge scrolling, two-finger scrolling, pinch/swipe gestures or thumb and palm detection. Oh, and you’re not yet handling any other devices like graphics tablets which are a world of their own. If you think all the other features and devices are any less of a mess… well, an Austrian comedian once said (paraphrased): “optimism is just a fancy word for ignorance”.
All this is just handling features that users have come to expect. Examples for non-features that you’ll have to implement: on some Lenovo series (*50 and newer) you will get a pointer jump after a series of events that only have pressure information. You’ll have to detect and discard that jump. The HP Pavilion DM4 touchpad has random jumps in the slot data. Synaptics PS/2 touchpads may ‘randomly’ end touches and restart them on the next event frame 10ms later. If you don’t handle that you’ll get ghost taps. And so on and so forth.
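The jump-discarding part at least has a simple core: a delta that no human finger could have produced gets dropped. A sketch with an invented threshold; a real implementation would also take the time between events into account:

```python
# Sketch: discard implausible pointer jumps such as the post-pressure
# jump on some Lenovo *50 touchpads. The 20mm threshold is illustrative.

def filter_jump(dx_mm, dy_mm, max_mm=20):
    """Return (0, 0) for deltas too large to be real finger motion."""
    if dx_mm * dx_mm + dy_mm * dy_mm > max_mm * max_mm:
        return (0.0, 0.0)
    return (dx_mm, dy_mm)
```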
So as you, happily or less so, continue writing your libthisismoreworkthanexpected you’ll eventually come to realise that you’re just reimplementing libinput. Congratulations or condolences, whichever applies.
libinput’s raison d’être is that it deals with all the mess above so that compositor authors can be blissfully unaware of all this. That’s the reason why all the major/general-purpose compositors have switched to libinput. That’s the reason most distributions now use libinput with the X server (through the xf86-input-libinput driver). libinput has made some design decisions that you may disagree with but honestly, that’s life. Deal with it. It doesn’t even do all I want and I wrote >90% of it. Suggesting that you can just handle evdev directly is like suggesting you can use GPS coordinates directly to navigate. Sure you can, but there’s a reason why people instead use a TomTom or Google Maps.
Source: fedoraplanet.org.
Original article: Peter Hutterer: Why it’s not a good idea to handle evdev directly.