
Commit 0e409d4

Improve multi-monitor handling under wayland (#1276)
* fix: improve multi-monitor handling under wayland

When a monitor gets disconnected, the destroy event of all associated windows gets called, and the window gets removed. This patch changes that behavior: the window is still closed, but its configuration is kept using the existing reload mechanism. In addition, a callback is added to listen for new monitors, triggering a reload when a new monitor gets connected.

This logic also reloads already running windows, which has a positive and a negative effect:
- positive: if currently running e.g. on the second monitor specified in the list, the window can get moved to the first monitor
- negative: if reloading starts it on the same monitor, it gets reset (e.g. graphs)

I also had to work around an issue: the monitor model is not yet available immediately when the new-monitor callback triggers. Waiting inside the callback does not help (I tried 10 seconds). However, waiting outside of it, the model always became available after 10ms.

Tested with a dual-monitor setup under KDE through combinations of:
- enabling/disabling individual monitors
- switching between monitors
- specifying a specific monitor in the yuck config
- specifying a list of specific monitors in the yuck config

In all these cases the behavior is as expected, and the widget gets loaded on the first available monitor (or stays unloaded until one becomes available). It also works when opening a window without any of the configured monitors being available.

There is one remaining error from GTK when closing the window:

    GLib-GObject-CRITICAL **: 20:06:05.912: ../gobject/gsignal.c:2684: instance '0x55a4ab4be2d0' has no handler with id '136'

This comes from the `self.gtk_window.disconnect(handler_id)` call. To prevent it, we would have to reset `destroy_event_handler_id`.

* fix: do not call gtk::main_iteration_do while waiting for monitor model

Executors that poll a future cannot be called recursively (in this case glib::main_context_futures::TaskSource::poll), so we cannot call gtk::main_iteration_do here. In some cases doing so led to the future being polled again, which raised a panic of the form:

    thread 'main' panicked at glib/src/main_context_futures.rs:238:56:
    called `Result::unwrap()` on an `Err` value: EnterError

We can simply remove the call, as tokio::time::sleep() already ensures the main thread continues to process (gtk) events during that time.
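The poll-with-timeout pattern described above can be sketched in isolation. This is a minimal, GTK-free sketch: `wait_until_ready` and its closure parameter are illustrative names, not part of eww; the real `wait_for_monitor_model` checks `display.monitor(i).model()` and uses `tokio::time::sleep(...).await` instead of a blocking sleep, which is exactly what lets the GLib main context keep processing events without being re-entered.

```rust
use std::time::{Duration, Instant};

/// Poll `is_ready` every 10ms until it returns true, or give up after 500ms.
/// Returns whether the condition became true before the timeout.
fn wait_until_ready(mut is_ready: impl FnMut() -> bool) -> bool {
    let start = Instant::now();
    loop {
        if is_ready() {
            return true;
        }
        if start.elapsed() > Duration::from_millis(500) {
            // The commit only logs a warning here and proceeds anyway.
            return false;
        }
        // In the async version this is `tokio::time::sleep(...).await`, which
        // yields back to the executor instead of recursively driving the
        // main loop via gtk::main_iteration_do.
        std::thread::sleep(Duration::from_millis(10));
    }
}

fn main() {
    // Becomes "ready" after ~30ms, well inside the 500ms budget.
    let start = Instant::now();
    assert!(wait_until_ready(|| start.elapsed() > Duration::from_millis(30)));
    // Never ready: hits the 500ms timeout and returns false.
    assert!(!wait_until_ready(|| false));
    println!("ok");
}
```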
1 parent 98c2201 commit 0e409d4

4 files changed

Lines changed: 87 additions & 27 deletions


CHANGELOG.md

Lines changed: 1 addition & 0 deletions
@@ -20,6 +20,7 @@ All notable changes to eww will be listed here, starting at changes since versio
 - Fix wayland monitor names support (By: dragonnn)
 - Load systray items that are registered without a path (By: Kage-Yami)
 - `get_locale` now follows POSIX standard for locale selection (By: mirhahn, w-lfchen)
+- Improve multi-monitor handling under wayland (By: bkueng)

 ### Features
 - Add warning and docs for incompatible `:anchor` and `:exclusive` options

crates/eww/src/app.rs

Lines changed: 57 additions & 15 deletions
@@ -68,6 +68,7 @@ pub enum DaemonCommand {
     },
     CloseWindows {
         windows: Vec<String>,
+        auto_reopen: bool,
         sender: DaemonResponseSender,
     },
     KillServer,
@@ -147,16 +148,34 @@ impl<B: DisplayBackend> std::fmt::Debug for App<B> {
     }
 }

+/// Wait until the .model() is available for all monitors (or there is a timeout)
+async fn wait_for_monitor_model() {
+    let display = gdk::Display::default().expect("could not get default display");
+    let start = std::time::Instant::now();
+    loop {
+        let all_monitors_set =
+            (0..display.n_monitors()).all(|i| display.monitor(i).and_then(|monitor| monitor.model()).is_some());
+        if all_monitors_set {
+            break;
+        }
+        tokio::time::sleep(Duration::from_millis(10)).await;
+        if std::time::Instant::now() - start > Duration::from_millis(500) {
+            log::warn!("Timed out waiting for monitor model to be set");
+            break;
+        }
+    }
+}
+
 impl<B: DisplayBackend> App<B> {
     /// Handle a [`DaemonCommand`] event, logging any errors that occur.
-    pub fn handle_command(&mut self, event: DaemonCommand) {
-        if let Err(err) = self.try_handle_command(event) {
+    pub async fn handle_command(&mut self, event: DaemonCommand) {
+        if let Err(err) = self.try_handle_command(event).await {
            error_handling_ctx::print_error(err);
        }
    }

    /// Try to handle a [`DaemonCommand`] event.
-    fn try_handle_command(&mut self, event: DaemonCommand) -> Result<()> {
+    async fn try_handle_command(&mut self, event: DaemonCommand) -> Result<()> {
        log::debug!("Handling event: {:?}", &event);
        match event {
            DaemonCommand::NoOp => {}
@@ -174,6 +193,10 @@ impl<B: DisplayBackend> App<B> {
                }
            }
            DaemonCommand::ReloadConfigAndCss(sender) => {
+                // Wait for all monitor models to be set. When a new monitor gets added, this
+                // might not immediately be the case. And if we were to wait inside the
+                // connect_monitor_added callback, model() never gets set. So instead we wait here.
+                wait_for_monitor_model().await;
                let mut errors = Vec::new();

                let config_result = config::read_from_eww_paths(&self.paths);
@@ -200,7 +223,7 @@ impl<B: DisplayBackend> App<B> {
            DaemonCommand::CloseAll => {
                log::info!("Received close command, closing all windows");
                for window_name in self.open_windows.keys().cloned().collect::<Vec<String>>() {
-                    self.close_window(&window_name)?;
+                    self.close_window(&window_name, false)?;
                }
            }
            DaemonCommand::OpenMany { windows, args, should_toggle, sender } => {
@@ -209,7 +232,7 @@ impl<B: DisplayBackend> App<B> {
                    .map(|w| {
                        let (config_name, id) = w;
                        if should_toggle && self.open_windows.contains_key(id) {
-                            self.close_window(id)
+                            self.close_window(id, false)
                        } else {
                            log::debug!("Config: {}, id: {}", config_name, id);
                            let window_args = args
@@ -240,7 +263,7 @@ impl<B: DisplayBackend> App<B> {
                let is_open = self.open_windows.contains_key(&instance_id);

                let result = if should_toggle && is_open {
-                    self.close_window(&instance_id)
+                    self.close_window(&instance_id, false)
                } else {
                    self.open_window(&WindowArguments {
                        instance_id,
@@ -256,9 +279,10 @@ impl<B: DisplayBackend> App<B> {

                sender.respond_with_result(result)?;
            }
-            DaemonCommand::CloseWindows { windows, sender } => {
-                let errors = windows.iter().map(|window| self.close_window(window)).filter_map(Result::err);
-                sender.respond_with_error_list(errors)?;
+            DaemonCommand::CloseWindows { windows, auto_reopen, sender } => {
+                let errors = windows.iter().map(|window| self.close_window(window, auto_reopen)).filter_map(Result::err);
+                // Ignore sending errors, as the channel might already be closed
+                let _ = sender.respond_with_error_list(errors);
            }
            DaemonCommand::PrintState { all, sender } => {
                let scope_graph = self.scope_graph.borrow();
@@ -360,7 +384,7 @@ impl<B: DisplayBackend> App<B> {
    }

    /// Close a window and do all the required cleanups in the scope_graph and script_var_handler
-    fn close_window(&mut self, instance_id: &str) -> Result<()> {
+    fn close_window(&mut self, instance_id: &str, auto_reopen: bool) -> Result<()> {
        if let Some(old_abort_send) = self.window_close_timer_abort_senders.remove(instance_id) {
            _ = old_abort_send.send(());
        }
@@ -380,7 +404,17 @@ impl<B: DisplayBackend> App<B> {
            self.script_var_handler.stop_for_variable(unused_var.clone());
        }

-        self.instance_id_to_args.remove(instance_id);
+        if auto_reopen {
+            self.failed_windows.insert(instance_id.to_string());
+            // There might be an alternative monitor available already, so try to re-open it immediately.
+            // This can happen for example when a monitor gets disconnected and another connected,
+            // and the connection event happens before the disconnect.
+            if let Some(window_arguments) = self.instance_id_to_args.get(instance_id) {
+                let _ = self.open_window(&window_arguments.clone());
+            }
+        } else {
+            self.instance_id_to_args.remove(instance_id);
+        }

        Ok(())
    }
@@ -392,7 +426,7 @@ impl<B: DisplayBackend> App<B> {

        // if an instance of this is already running, close it
        if self.open_windows.contains_key(instance_id) {
-            self.close_window(instance_id)?;
+            self.close_window(instance_id, false)?;
        }

        self.instance_id_to_args.insert(instance_id.to_string(), window_args.clone());
@@ -445,9 +479,17 @@ impl<B: DisplayBackend> App<B> {
                move |_| {
                    // we don't care about the actual error response from the daemon as this is mostly just a fallback.
                    // Generally, this should get disconnected before the gtk window gets destroyed.
-                    // It serves as a fallback for when the window is closed manually.
+                    // This callback is triggered in 2 cases:
+                    // - When the monitor of this window gets disconnected
+                    // - When the window is closed manually.
+                    // We don't distinguish here and assume the window should be reopened once a monitor
+                    // becomes available again
                    let (response_sender, _) = daemon_response::create_pair();
-                    let command = DaemonCommand::CloseWindows { windows: vec![instance_id.clone()], sender: response_sender };
+                    let command = DaemonCommand::CloseWindows {
+                        windows: vec![instance_id.clone()],
+                        auto_reopen: true,
+                        sender: response_sender,
+                    };
                    if let Err(err) = app_evt_sender.send(command) {
                        log::error!("Error sending close window command to daemon after gtk window destroy event: {}", err);
                    }
@@ -466,7 +508,7 @@ impl<B: DisplayBackend> App<B> {
                tokio::select! {
                    _ = glib::timeout_future(duration) => {
                        let (response_sender, mut response_recv) = daemon_response::create_pair();
-                        let command = DaemonCommand::CloseWindows { windows: vec![instance_id.clone()], sender: response_sender };
+                        let command = DaemonCommand::CloseWindows { windows: vec![instance_id.clone()], auto_reopen: false, sender: response_sender };
                        if let Err(err) = app_evt_sender.send(command) {
                            log::error!("Error sending close window command to daemon after gtk window destroy event: {}", err);
                        }

crates/eww/src/opts.rs

Lines changed: 1 addition & 1 deletion
@@ -292,7 +292,7 @@ impl ActionWithServer {
                })
            }
            ActionWithServer::CloseWindows { windows } => {
-                return with_response_channel(|sender| app::DaemonCommand::CloseWindows { windows, sender });
+                return with_response_channel(|sender| app::DaemonCommand::CloseWindows { windows, auto_reopen: false, sender });
            }
            ActionWithServer::Reload => return with_response_channel(app::DaemonCommand::ReloadConfigAndCss),
            ActionWithServer::ListWindows => return with_response_channel(app::DaemonCommand::ListWindows),

crates/eww/src/server.rs

Lines changed: 28 additions & 11 deletions
@@ -105,13 +105,15 @@ pub fn initialize_server<B: DisplayBackend>(
        }
    }

+    connect_monitor_added(ui_send.clone());
+
    // initialize all the handlers and tasks running asyncronously
    let tokio_handle = init_async_part(app.paths.clone(), ui_send);

    gtk::glib::MainContext::default().spawn_local(async move {
        // if an action was given to the daemon initially, execute it first.
        if let Some(action) = action {
-            app.handle_command(action);
+            app.handle_command(action).await;
        }

        loop {
@@ -120,7 +122,7 @@ pub fn initialize_server<B: DisplayBackend>(
                    app.scope_graph.borrow_mut().handle_scope_graph_event(scope_graph_evt);
                },
                Some(ui_event) = ui_recv.recv() => {
-                    app.handle_command(ui_event);
+                    app.handle_command(ui_event).await;
                }
                else => break,
            }
@@ -136,6 +138,29 @@ pub fn initialize_server<B: DisplayBackend>(
    Ok(ForkResult::Child)
}

+fn connect_monitor_added(ui_send: UnboundedSender<DaemonCommand>) {
+    let display = gtk::gdk::Display::default().expect("could not get default display");
+    display.connect_monitor_added({
+        move |_display: &gtk::gdk::Display, _monitor: &gtk::gdk::Monitor| {
+            log::info!("New monitor connected, reloading configuration");
+            let _ = reload_config_and_css(&ui_send);
+        }
+    });
+}
+
+fn reload_config_and_css(ui_send: &UnboundedSender<DaemonCommand>) -> Result<()> {
+    let (daemon_resp_sender, mut daemon_resp_response) = daemon_response::create_pair();
+    ui_send.send(DaemonCommand::ReloadConfigAndCss(daemon_resp_sender))?;
+    tokio::spawn(async move {
+        match daemon_resp_response.recv().await {
+            Some(daemon_response::DaemonResponse::Success(_)) => log::info!("Reloaded config successfully"),
+            Some(daemon_response::DaemonResponse::Failure(e)) => eprintln!("{}", e),
+            None => log::error!("No response to reload configuration-reload request"),
+        }
+    });
+    Ok(())
+}
+
fn init_async_part(paths: EwwPaths, ui_send: UnboundedSender<app::DaemonCommand>) -> tokio::runtime::Handle {
    let rt = tokio::runtime::Builder::new_multi_thread()
        .thread_name("main-async-runtime")
@@ -216,20 +241,12 @@ async fn run_filewatch<P: AsRef<Path>>(config_dir: P, evt_send: UnboundedSender<
                    debounce_done.store(true, Ordering::SeqCst);
                });

-                let (daemon_resp_sender, mut daemon_resp_response) = daemon_response::create_pair();
                // without this sleep, reading the config file sometimes gives an empty file.
                // This is probably a result of editors not locking the file correctly,
                // and eww being too fast, thus reading the file while it's empty.
                // There should be some cleaner solution for this, but this will do for now.
                tokio::time::sleep(std::time::Duration::from_millis(50)).await;
-                evt_send.send(app::DaemonCommand::ReloadConfigAndCss(daemon_resp_sender))?;
-                tokio::spawn(async move {
-                    match daemon_resp_response.recv().await {
-                        Some(daemon_response::DaemonResponse::Success(_)) => log::info!("Reloaded config successfully"),
-                        Some(daemon_response::DaemonResponse::Failure(e)) => eprintln!("{}", e),
-                        None => log::error!("No response to reload configuration-reload request"),
-                    }
-                });
+                reload_config_and_css(&evt_send)?;
            }
        },
        else => break