Enable emulator hotkeys and controller input (when that option is
enabled) when a TAS Input window has focus, as if it were the render
window instead. This allows TASers to use frame advance and the like
without having to switch the focused window or disable Hotkeys Require
Window Focus, which also picks up keypresses while other apps are active.
Cursor updates are disabled when the TAS Input window has focus, as
otherwise the Wii IR widget (and anything else controlled by the mouse)
becomes unusable. The cursor continues to work normally when the render
window has focus.
This specific issue was already addressed by https://github.com/dolphin-emu/dolphin/pull/11635
though I felt there was more we could do, and I wasn't too happy with the
likelihood of device update calls being skipped (due to `m_devices_population_mutex` being locked).
A comment removed by this commit gives two reasons for listening to
slave devices, both of which no longer apply:
- "Only slaves emit raw motion events": perhaps this was true when the
comment was written, but now master devices provide raw motion events
along with the other raw events.
- "Selecting slave keyboards avoids dealing with key focus": we get raw
key events regardless of the focus.
Listening to both master and slave devices results in duplicate raw
events. For button and key events, that's a tiny waste of time setting
the update flag a second time, but for raw mouse events the raw motion
will be processed twice. That makes this commit a user-facing change.
In X, the ButtonPress events generated when a mouse button is pressed
have a special property: if they don't activate an existing passive
grab, the X server automatically activates the "implicit passive grab"
on behalf of the client the event is delivered to. This ensures the
ButtonRelease event is delivered to the same client even if the pointer
moves between windows, but it also causes all events from that pointer
to be delivered exclusively to that client. As a consequence of the
implicit passive grab, for each window, only one client can listen for
ButtonPress events; any further listeners would never receive the event.
XInput 1 made the implicit grab optional and explicit by allowing
clients to listen for DeviceButtonPress events without
DeviceButtonPressGrab events. XInput 2 does not have a separate grab
event class, but multiple clients can listen for XI_ButtonPress on the
same window. When a button is pressed, the X server first tries to
deliver an XI_ButtonPress event; if no clients want it, then the server
tries to deliver a DeviceButtonPress event; if no clients want it, then
the server tries to deliver a ButtonPress event. Once an event has been
delivered, event processing stops and earlier protocol levels are not
considered. The reason for this rule is not obviously documented, but
it is probably because of the implicit passive grab; a client receiving
a ButtonPress event assumes it is the only client receiving that event,
and later protocols maintain that property for backward compatibility.
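To make the delivery rule concrete, here is a sketch (not Dolphin code;
display_a/display_b stand for two separate client connections and
window for a window both clients know about) of two hypothetical
clients selecting button events on the same window:

  // Hypothetical client A: core protocol selection.
  XSelectInput(display_a, window, ButtonPressMask);

  // Hypothetical client B: XInput 2 selection on the same window.
  unsigned char bits[XIMaskLen(XI_ButtonPress)] = {};
  XIEventMask mask;
  mask.deviceid = XIAllMasterDevices;
  mask.mask_len = sizeof(bits);
  mask.mask = bits;
  XISetMask(bits, XI_ButtonPress);
  XISelectEvents(display_b, window, &mask, 1);

  // Per the delivery rule above, a button press is delivered to
  // client B as XI_ButtonPress and processing stops there; client A
  // never sees a core ButtonPress for it.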
Before this commit, Dolphin listened for XI_ButtonPress events on the
root window. This interferes with window managers that expect to
receive ButtonPress events on the root window, such as awesome and
Openbox. In Openbox, applications are often launched from a menu
activated by clicking on the root window, and desktops are switched by
scroll wheel input on the root window. This makes normal use of other
applications difficult when Dolphin is open (though Openbox keyboard
shortcuts still work). Conversely, Dolphin only receives XI_ButtonPress
events for clicks on the root window or on window decorations (title
bars), not for clicks on the content of Dolphin's own windows or on the
render window. In window
managers that use a "virtual root window" covering the actual root
window, such as Mutter running in X, Dolphin and the window manager do
not conflict, but clicks delivered to other applications using XInput2
(for testing, try xinput --test-xi2) are not seen by Dolphin, which is
relevant when background input is enabled.
This commit changes Dolphin to listen for XI_RawButtonPress (and the raw
versions of other events); Dolphin was already listening to XI_RawMotion
for raw mouse movement. Raw events are always and exclusively delivered
to the root window and are delivered to every client listening for them,
so Dolphin will not interfere with (or be interfered with by) other
applications listening for events.
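For illustration, selecting the raw events on the root window looks
roughly like this (a sketch, not Dolphin's exact code; the deviceid
could instead be a specific master pointer obtained from
XIQueryDevice):

  #include <X11/extensions/XInput2.h>

  unsigned char bits[XIMaskLen(XI_LASTEVENT)] = {};
  XIEventMask mask;
  mask.deviceid = XIAllMasterDevices;
  mask.mask_len = sizeof(bits);
  mask.mask = bits;
  XISetMask(bits, XI_RawButtonPress);
  XISetMask(bits, XI_RawButtonRelease);
  XISetMask(bits, XI_RawKeyPress);
  XISetMask(bits, XI_RawKeyRelease);
  XISetMask(bits, XI_RawMotion);
  // Raw events are only ever delivered to the root window.
  XISelectEvents(display, DefaultRootWindow(display), &mask, 1);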
Because the events are raw, their button numbers and keycodes have not
had any mapping applied. If a left-handed user has swapped the left and
right buttons on their mouse, raw events do not reflect that. It is possible
to query the mappings for each device and apply them manually, but that
would require a fair amount of code, including listening for mapping
changes.
Instead, Dolphin now uses the events only to set a "changed" flag, then
queries the current button and key state after processing all events.
Dolphin was already querying the pointer to get its absolute position
and querying the keyboard to filter the key bitmap it created from
events; now Dolphin also uses the button state from the pointer query
and uses the keyboard query directly.
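Sketched in rough Xlib terms (the names are illustrative, not Dolphin's
actual members; pointer_deviceid is the master pointer's device id from
XIQueryDevice):

  bool mouse_changed = false;
  bool keys_changed = false;
  // ... event loop: on XI_RawButtonPress/Release, XI_RawMotion and
  //     XI_RawKeyPress/Release, just set the matching flag ...

  if (mouse_changed)
  {
    Window root, child;
    double root_x, root_y, win_x, win_y;
    XIButtonState buttons;
    XIModifierState mods;
    XIGroupState group;
    // One round trip returns both the absolute position and the full
    // button state, with the user's button mapping already applied.
    XIQueryPointer(display, pointer_deviceid, DefaultRootWindow(display),
                   &root, &child, &root_x, &root_y, &win_x, &win_y,
                   &buttons, &mods, &group);
    // ... read buttons.mask with XIMaskIsSet(), use root_x/root_y ...
    free(buttons.mask);  // libXi allocates the mask for the caller
  }

  if (keys_changed)
  {
    char keys[32];  // one bit per keycode
    XQueryKeymap(display, keys);
    // ... translate the bitmap into pressed keys ...
  }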
Queries have a performance cost because they are synchronous requests to
the X server (Dolphin waits for the result). Commit 2b640a4f made the
pointer query conditional on receiving a motion event to "cut down on
round trips", but commit bbb12a75 added an unconditional keyboard query,
and there have apparently been no performance complaints. This commit
queries the pointer slightly more often (on button events in addition to
motion), but only queries the keyboard after key events, so the total
rate of queries should be substantially reduced.
Fixes: https://bugs.dolphin-emu.org/issues/10668
We need XInput 2.1 to get raw events on the root window even while
another client has a grab. We currently use raw events for relative
mouse input, and upcoming commits will use raw events for buttons and
keys.
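Negotiating that version is a single call; something along these lines
(a sketch, not the exact Dolphin code):

  int major = 2, minor = 1;
  if (XIQueryVersion(display, &major, &minor) != Success ||
      major < 2 || (major == 2 && minor < 1))
  {
    // XInput 2.1 unavailable: raw events would stop arriving while
    // another client holds a grab, so bail out here.
  }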
XInput2 was created to support multiple pointer/keyboard pairs (often
called MPX for multi-pointer X). Dolphin's XInput2 implementation has
always supported MPX by creating a KeyboardMouse object per master
pointer. Since commit bbb12a7, Dolphin's keyboard state is filtered by
the output of XQueryKeymap. As a core X function, XQueryKeymap queries
"the" keyboard, which by default is the first master keyboard. As a
result, Dolphin will ignore keys pressed on other master keyboards
unless the first master is simultaneously pressing the same keys.
XInput2 doesn't provide a function to query the keyboard state. There
is no XIQueryKeymap and the current state is not a member of the
XIKeyClassInfo returned by XIQueryDevice. Instead, XInput2 allows a
master pointer to be nominated as "the" pointer on a per-client basis,
with "the" keyboard automatically becoming the associated master
keyboard. The "documentation" [1] says passing None for the window is
only for debugging purposes, but it is documented in the
XISetClientPointer man page and seems to be the only way to query
keyboards beyond the first.
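The resulting per-keyboard query looks roughly like this (a sketch;
master_pointer_deviceid comes from XIQueryDevice):

  // Nominate this master pointer as "the" pointer for our client.
  // Passing None for the window applies it client-wide, as described
  // above.
  XISetClientPointer(display, None, master_pointer_deviceid);

  // Core requests such as XQueryKeymap now target the master keyboard
  // paired with that pointer.
  char keys[32];
  XQueryKeymap(display, keys);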
With this commit, Dolphin correctly reads keys from keyboards other than
the first master keyboard. To test, use the xinput command-line utility
to create a master pointer and reattach a keyboard to the associated
master keyboard.
[1]: https://who-t.blogspot.com/2009/07/xi2-recipes-part-6.html
(the XInput2 developer's blog)
SPDX standardizes how source code conveys its copyright and licensing
information. See https://spdx.github.io/spdx-spec/1-rationale/ . SPDX
tags are adopted in many large projects, including things like the Linux
kernel.
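In practice, the per-file license boilerplate collapses to a single
comment tag; for a C++ file under Dolphin's GPL-2.0-or-later license it
looks like:

  // SPDX-License-Identifier: GPL-2.0-or-later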
Xlib supports many mouse buttons, though there are 9 standard buttons,
and they aren't arranged like those of other mouse APIs. Using only 5
buttons prevented the use of buttons besides left/right/middle click
and the scroll wheel. Here's what all the standard buttons are:
1. left button
2. middle button (pressing the scroll wheel)
3. right button
4. turn scroll wheel up
5. turn scroll wheel down
6. push scroll wheel left
7. push scroll wheel right
8. 4th button (aka browser backward button)
9. 5th button (aka browser forward button)
The remaining button indices are non-standard and device-specific, and
technically far more than 32 are supported, but 32 seems like a
reasonable limit to avoid cluttering the list with tons of useless
mouse buttons. What mouse has more than 32 buttons anyway?
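For reference, reading all 32 buttons out of the XIButtonState filled
in by XIQueryPointer could look roughly like this (a sketch; button
numbers start at 1, so bit 0 of the mask is unused):

  void ReadButtons(const XIButtonState& buttons, bool (&pressed)[32])
  {
    for (int button = 1; button <= 32; ++button)
    {
      // Guard against devices whose mask is shorter than 32 bits.
      const bool in_range = button / 8 < buttons.mask_len;
      pressed[button - 1] = in_range && XIMaskIsSet(buttons.mask, button);
    }
  }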
The SDL backend crashes when you close a joystick after SDL_Quit has
been called. Some backends don't need to be shut down and
re-initialized every time; we can just ask them to enumerate devices again.
This makes the device ID assignment code common to all backends by
moving it into AddDevice() instead of copy-pasting or replicating
the logic in each backend.
Also, to prepare for hotplugging, the new ID assignment system no
longer relies on a name usage count; it always starts from ID 0 and
assigns the first ID that is not already in use.
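The "first free ID" rule itself is simple; an illustrative helper (not
the exact AddDevice() code) that picks the lowest unused ID among
devices sharing a source and name:

  #include <algorithm>
  #include <vector>

  // Hypothetical helper: returns the lowest ID not present in used_ids.
  int FirstFreeId(const std::vector<int>& used_ids)
  {
    int id = 0;
    while (std::find(used_ids.begin(), used_ids.end(), id) != used_ids.end())
      ++id;
    return id;
  }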
Small cleanup: use std::shared_ptr and get rid of ciface.Devices(),
which just returned m_devices (defeating the point of making m_devices
protected).
Incidentally, this should make the code safer when we have
different threads accessing devices in the future (for hotplug?).
A lot of code uses Device references directly, so there is no easy way
to remove FindDevice() and make those unique_ptrs.
Previously, the devices vector would be passed to all backends, which
would then manually push_back to it to add new devices. This worked,
but it caused issues when trying to add synchronisation.
Instead, backends now call AddDevice() to fill m_devices, which is no
longer accessible from the outside.