mirror of https://github.com/AvengeMedia/DankMaterialShell.git synced 2026-01-24 21:42:51 -05:00

Compare commits


188 Commits

Author SHA1 Message Date
bbedward
9673078a75 fix gui 2025-12-15 16:30:35 -05:00
bbedward
9e8c93bfd7 displays: fix hyprland config saving 2025-12-15 16:04:31 -05:00
bbedward
43d6f4b1d3 displays: add configurator (Beta)
- Position, resolution, refresh, orientation, VRR
- niri, Hyprland, MangoWC
- Rely on wlr-output for reading data, compositors to write output
  configurations
- Re-organize display setting group
2025-12-15 15:55:31 -05:00
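
The commit body above describes a read/write split; the interfaces below are an illustrative Go sketch of that shape (names and fields are assumptions, not the DMS API): one reader backed by wlr-output-management, and one writer per compositor (niri, Hyprland, MangoWC) that persists its own output configuration.

package displays

// Mode is the per-output state the configurator edits.
type Mode struct {
    Name       string // e.g. "DP-1"
    Width      int
    Height     int
    RefreshMHz int
    Transform  string // orientation
    VRR        bool
    X, Y       int // position
}

// Reader reports current outputs; in the description above this data
// comes from wlr-output-management.
type Reader interface {
    Outputs() ([]Mode, error)
}

// Writer persists a desired configuration; each compositor backend writes
// its own config format.
type Writer interface {
    Apply(modes []Mode) error
}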
bbedward
bafe1c5fee niri: handle window urgency event
fixes #1033
2025-12-15 12:16:43 -05:00
bbedward
306d7b2ce0 gamma: guard against redundant application
- QML syncs its desired state with Go (e.g. when settings are changed or
  opened), but Go was applying gamma even when it was unchanged
- Track the last applied gamma to avoid redundant sends
2025-12-15 11:43:16 -05:00
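
A minimal Go sketch of the "track last applied gamma" idea from the commit above; the Controller type and its apply callback are illustrative, not the actual DMS code.

package gamma

// Controller remembers the last value it successfully applied so repeated
// sync requests from the UI do not cause redundant sends.
type Controller struct {
    apply      func(temp int) error // backend call that actually sets gamma
    lastTemp   int
    hasApplied bool
}

// Set applies temp only when it differs from the previously applied value.
func (c *Controller) Set(temp int) error {
    if c.hasApplied && temp == c.lastTemp {
        return nil // unchanged: nothing to send
    }
    if err := c.apply(temp); err != nil {
        return err
    }
    c.lastTemp = temp
    c.hasApplied = true
    return nil
}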
bbedward
e9f6583c60 workspaces: add scroll handler to widget itself 2025-12-15 11:12:27 -05:00
redybcs
42a2835929 Update flake.nix to fix Hash Mismatch (#1035)
Looks like there haven't been any go.mod updates since the workflow that fixes the hash was repaired.
2025-12-15 14:14:29 +01:00
purian23
c2c90c680e distro: OBS edgecase 2025-12-15 01:28:00 -05:00
purian23
cd01f6378c Revise OBS / PPA Workflows 2025-12-15 00:37:42 -05:00
purian23
6033075de6 distro: Revise builds to use API variants 2025-12-15 00:32:40 -05:00
tsukasa
79794d3441 dankmodal: removed backgroundWindow to fix clicking twice (#1030)
* dankmodal: removed backgroundWindow

removed 'backgroundWindow' but combined it with 'contentWindow'

* made single window behavior specific to hyprland

this should keep other compositor behavior the same and fix double
clicking to exit out of Spotlight/ClipboardHist/Powermenu
2025-12-14 19:52:06 -05:00
bbedward
031f86b417 Revert "Fixed having to click twice to exit out of Spotlight/Cliphist/Powermenu (#1022)"
This reverts commit ca5fe6f7db.
2025-12-14 19:09:04 -05:00
bbedward
891f53cf6f battery: fix button group scaling 2025-12-14 17:22:54 -05:00
bbedward
848991cf5b idle: implement screensaver interface
- Mainly used to create the idle inhibitor when an app requests
  screensaver inhibit
2025-12-14 16:49:59 -05:00
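
A hedged sketch of what "implement screensaver interface" can look like: export org.freedesktop.ScreenSaver on the session bus with godbus so that an application's Inhibit() call can drive an idle inhibitor. The inhibitor plumbing itself is omitted and only hinted at in comments; this is not the DMS implementation.

package main

import (
    "sync"

    "github.com/godbus/dbus/v5"
)

// screenSaver backs the org.freedesktop.ScreenSaver methods that apps call
// to suppress idle (e.g. while a video plays).
type screenSaver struct {
    mu      sync.Mutex
    next    uint32
    cookies map[uint32]string
}

// Inhibit hands out a cookie; a real implementation would create the
// compositor-side idle inhibitor here.
func (s *screenSaver) Inhibit(app, reason string) (uint32, *dbus.Error) {
    s.mu.Lock()
    defer s.mu.Unlock()
    s.next++
    s.cookies[s.next] = app + ": " + reason
    return s.next, nil
}

// UnInhibit releases a cookie; once none remain, the inhibitor would be torn down.
func (s *screenSaver) UnInhibit(cookie uint32) *dbus.Error {
    s.mu.Lock()
    defer s.mu.Unlock()
    delete(s.cookies, cookie)
    return nil
}

func main() {
    conn, err := dbus.SessionBus()
    if err != nil {
        panic(err)
    }
    sv := &screenSaver{cookies: map[uint32]string{}}
    if err := conn.Export(sv, "/org/freedesktop/ScreenSaver", "org.freedesktop.ScreenSaver"); err != nil {
        panic(err)
    }
    if _, err := conn.RequestName("org.freedesktop.ScreenSaver", dbus.NameFlagDoNotQueue); err != nil {
        panic(err)
    }
    select {} // keep serving DBus calls
}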
bbedward
d37ddd1d41 vpn: optimize cc and dankbar widget 2025-12-14 16:12:46 -05:00
Pi Home Server
00d12acd5e Add hide option for updater widget (#1028) 2025-12-14 15:55:47 -05:00
bbedward
3bbc78a44f dankbar: make control center widget per-instance not global
fixes #1017
2025-12-14 15:52:46 -05:00
bbedward
b0a6652cc6 ci: simplify changelog handling 2025-12-14 14:23:27 -05:00
bbedward
cb710b2e5f notifications: fix redundant height animation 2025-12-14 13:40:21 -05:00
tsukasa
ca5fe6f7db Fixed having to click twice to exit out of Spotlight/Cliphist/Powermenu (#1022)
There's possibly more, but this fixes the need to click the background twice to close those modals.
2025-12-14 11:16:25 -05:00
bbedward
fb75f4c68b lock/greeter: fix font alignment
fixes #1018
2025-12-14 11:13:48 -05:00
bbedward
5e2a418485 binds: fix to scale with arbitrary font sizes 2025-12-14 10:56:03 -05:00
bbedward
24fe215067 ci: pull changelogs from obs/launchpad APIs
- Get changelog from OBS/Launchpad API endpoints, instead of storing in
  git
2025-12-14 10:42:00 -05:00
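
A rough Go illustration of the approach: fetch changelog text over HTTP at packaging time instead of reading a file tracked in git. The URL below is a placeholder, not the real OBS or Launchpad endpoint, and authentication is omitted.

package main

import (
    "fmt"
    "io"
    "net/http"
)

// fetchChangelog pulls changelog text from a packaging service over HTTP.
func fetchChangelog(url string) (string, error) {
    resp, err := http.Get(url)
    if err != nil {
        return "", err
    }
    defer resp.Body.Close()
    if resp.StatusCode != http.StatusOK {
        return "", fmt.Errorf("changelog fetch failed: %s", resp.Status)
    }
    body, err := io.ReadAll(resp.Body)
    return string(body), err
}

func main() {
    // Placeholder endpoint: the real workflow talks to the OBS/Launchpad APIs.
    text, err := fetchChangelog("https://example.invalid/dms-git/changelog")
    if err != nil {
        panic(err)
    }
    fmt.Println(text)
}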
bbedward
ab2e8875ac runningapps: round icon margin to integer 2025-12-14 10:25:36 -05:00
dms-ci[bot]
dec5740c74 ci: Auto-update PPA packages [dms-git]
🤖 Automated by GitHub Actions
2025-12-14 15:22:12 +00:00
bbedward
208266dfa3 dwl: fix layout popout 2025-12-14 10:17:58 -05:00
dms-ci[bot]
32f218d58c ci: Auto-update OBS packages [dms-git]
🤖 Automated by GitHub Actions
2025-12-14 04:07:07 +00:00
dms-ci[bot]
6fdaab2ccd ci: Auto-update PPA packages [dms-git]
🤖 Automated by GitHub Actions
2025-12-14 03:58:50 +00:00
purian23
d336866f44 distro: Let the workflow run 2025-12-13 22:54:58 -05:00
purian23
b40df5f1c4 distro: Unify options across repos 2025-12-13 22:38:25 -05:00
dms-ci[bot]
3c9886ad1b ci: Auto-update PPA packages [dms-git]
🤖 Automated by GitHub Actions
2025-12-14 01:55:34 +00:00
bbedward
ea205ebd12 wallpaper: pause cycling when locked, clean state when changing modes 2025-12-13 20:29:02 -05:00
bbedward
30dad46c94 dankbar: add scroll wheel behavior configuration 2025-12-13 20:12:21 -05:00
dms-ci[bot]
fbf79e62e9 ci: Auto-update OBS packages [dms-git]
🤖 Automated by GitHub Actions
2025-12-13 21:23:58 +00:00
dms-ci[bot]
efcf72bc08 ci: Auto-update PPA packages [dms-git]
🤖 Automated by GitHub Actions
2025-12-13 21:19:59 +00:00
bbedward
3b511e2f55 i18n: add hungarian 2025-12-13 14:03:49 -05:00
dms-ci[bot]
e4e20fb43a ci: Auto-update PPA packages [dms-git]
🤖 Automated by GitHub Actions
2025-12-13 15:26:22 +00:00
dms-ci[bot]
48ccff67a6 ci: Auto-update OBS packages [dms-git]
🤖 Automated by GitHub Actions
2025-12-13 15:25:01 +00:00
Souyama
a783d6507b Change DPMS off to DPMS toggle in hyprland.conf (#1011) 2025-12-13 10:07:11 -05:00
bbedward
fd94e60797 cava: dont set method/source 2025-12-13 10:04:20 -05:00
bbedward
a1bcb7ea30 vpn: just try and import all types on errors 2025-12-13 10:02:57 -05:00
bbedward
31b67164c7 clipboard: re-add ownership option 2025-12-13 09:45:04 -05:00
bbedward
786c13f892 clipboard: fix mime type selection 2025-12-13 09:35:55 -05:00
bbedward
c652659d54 wallpaper: scale texture to physical pixels
- reverts a regression
2025-12-13 08:43:46 -05:00
dms-ci[bot]
ca39196f13 ci: Auto-update OBS packages [dms,dms-git]
🤖 Automated by GitHub Actions
2025-12-13 07:00:01 +00:00
dms-ci[bot]
f02dd8fd4b ci: Auto-update PPA packages [dms,dms-git,dms-greeter]
🤖 Automated by GitHub Actions
2025-12-13 06:51:47 +00:00
purian23
0f89886ce7 distro: Break the loop 2025-12-13 01:44:20 -05:00
dms-ci[bot]
1118404192 ci: Auto-update PPA packages [dms-git]
🤖 Automated by GitHub Actions
2025-12-13 06:34:59 +00:00
dms-ci[bot]
f011ea6cce ci: Auto-update OBS packages [dms-git]
🤖 Automated by GitHub Actions
2025-12-13 06:30:45 +00:00
dms-ci[bot]
b2ac9c6c1a ci: Auto-update OBS packages [dms,dms-git]
🤖 Automated by GitHub Actions
2025-12-13 06:06:44 +00:00
dms-ci[bot]
fbab41abd6 ci: Auto-update PPA packages [dms,dms-git,dms-greeter]
🤖 Automated by GitHub Actions
2025-12-13 05:58:14 +00:00
bbedward
82f881af5b matugen: scrub the never implemented dynamic contrast palette 2025-12-13 00:51:51 -05:00
purian23
68de9b437d distro: Switch to dms-ci 2025-12-13 00:50:42 -05:00
bbedward
830a715b6d wlcontext: use poll with wake pipe instead of read deadlines 2025-12-13 00:46:30 -05:00
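
A self-contained Go sketch of the poll-plus-wake-pipe pattern named in the commit above (the waker type and the use of stdin as a stand-in fd are illustrative): a blocking poll(2) watches both the protocol fd and a pipe, and writing one byte to the pipe wakes the loop without resorting to read deadlines.

package main

import (
    "fmt"

    "golang.org/x/sys/unix"
)

type waker struct {
    r, w int // pipe read/write ends
}

func newWaker() (*waker, error) {
    var fds [2]int
    if err := unix.Pipe2(fds[:], unix.O_CLOEXEC|unix.O_NONBLOCK); err != nil {
        return nil, err
    }
    return &waker{r: fds[0], w: fds[1]}, nil
}

// Wake interrupts a blocking poll from another goroutine.
func (wk *waker) Wake() { unix.Write(wk.w, []byte{0}) }

// waitReadable blocks until fd is readable or Wake() is called.
func (wk *waker) waitReadable(fd int) (woken bool, err error) {
    pfds := []unix.PollFd{
        {Fd: int32(fd), Events: unix.POLLIN},
        {Fd: int32(wk.r), Events: unix.POLLIN},
    }
    if _, err := unix.Poll(pfds, -1); err != nil {
        return false, err
    }
    if pfds[1].Revents&unix.POLLIN != 0 {
        var buf [8]byte
        unix.Read(wk.r, buf[:]) // drain the wake byte(s)
        return true, nil
    }
    return false, nil
}

func main() {
    wk, err := newWaker()
    if err != nil {
        panic(err)
    }
    go wk.Wake()                   // simulate a shutdown/refresh request
    woken, _ := wk.waitReadable(0) // fd 0 (stdin) stands in for the Wayland fd
    fmt.Println("woken by pipe:", woken)
}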
bbedward
ce4aca9a72 fix shellcheck 2025-12-13 00:29:20 -05:00
bbedward
7641171a01 clipboard: move cl receive to main wlcontext goroutine 2025-12-13 00:16:56 -05:00
purian23
119e084e52 distro: Remove PR tests 2025-12-13 00:15:37 -05:00
bbedward
7c6d52913e niri: fix test 2025-12-12 23:57:50 -05:00
bbedward
f63ab5cf7c ci: add workflow for pushing stable tag 2025-12-12 23:57:50 -05:00
purian23
50f1bc5017 distros: Remove false path dir 2025-12-12 23:52:31 -05:00
bbedward
c3ab409b6a clipboard: scrap persist, optimize mime-type handling 2025-12-12 23:48:07 -05:00
purian23
44f6ab4878 distro: Reformat workflow newlines 2025-12-12 23:35:37 -05:00
bbedward
5fda6e0f12 clipboard: allow configuration even when disabled 2025-12-12 23:17:55 -05:00
purian23
38068e78c9 distros: PR writeback 2025-12-12 23:02:52 -05:00
purian23
66d22727e9 distros: Enhance build automation 2025-12-12 22:41:51 -05:00
Lucas
db2f68e35d nix: fix qt-plugins path (#1005) 2025-12-13 01:34:25 +01:00
Marcus Ramberg
352277ec15 notifications: add ipc call for toggleDoNotDisturb (#1002) 2025-12-12 18:21:00 -05:00
bbedward
d6043e64f2 osd: increase shadow buffer
accounts for percentage view
2025-12-12 18:11:31 -05:00
bbedward
d3f5b8f32e niri: fix gap reactivity 2025-12-12 16:58:07 -05:00
bbedward
6c3c722674 niri: add warnings on auto-generated files 2025-12-12 16:53:52 -05:00
purian23
5b8edb13d8 distro: OBS updates 2025-12-12 15:42:21 -05:00
bbedward
c595727b94 osd: optimize surface damage
fixes #994
2025-12-12 15:37:39 -05:00
bbedward
d46302588a clipboard: add shift+enter to paste from clipboard history modal
fixes #358
2025-12-12 15:29:10 -05:00
bbedward
0ff9fdb365 notifications: add swipe to dismiss functionality
fixes #927
2025-12-12 14:39:51 -05:00
purian23
e95f7ce367 Update Copr specs 2025-12-12 12:30:18 -05:00
Pi Home Server
df1a8f4066 Add lock screen layout settings (#981)
* Add lock screen layout settings

* Update translation keys
2025-12-12 11:45:00 -05:00
bbedward
32e6c1660f wallpaper: clamp max texture size 2025-12-12 11:17:28 -05:00
bbedward
d6b9b72e9b ci: disable pkg builds from main release wf 2025-12-12 10:16:24 -05:00
bbedward
179ad03fa4 ci: switch to dispatch-based release flow 2025-12-12 10:01:44 -05:00
bbedward
c3cb82c84e dankinstall: call add-wants for niri/hyprland with dms service 2025-12-12 09:58:12 -05:00
bbedward
4b52e2ed9e niri: fix keybind handling of cooldown-ms parameter 2025-12-12 09:52:35 -05:00
bbedward
77fd61f81e workspaces: make icons scale with bar size, fix valign of numbers
fixes #990
2025-12-12 00:23:40 -05:00
Lucas
c3ffb7f83b nix: remove wl-clipboard and cliphist dependencies (#991) 2025-12-11 21:44:36 -05:00
Lucas
89dcd72d70 nix: let paths be used instead of only packages in plugins (#988) 2025-12-11 23:57:22 +01:00
bbedward
5c3346aa9d core: fix test 2025-12-11 16:33:31 -05:00
bbedward
7c4b383477 clipboard: persistence off by default
- It's a little risky and messy as a default
2025-12-11 16:28:56 -05:00
bbedward
bdc0e8e0fc clipboard: dont take ownership on nil offers 2025-12-11 15:55:42 -05:00
bbedward
6d66f93565 core: mock wayland context for tests & add i18n guidance to CONTRIBUTING 2025-12-11 14:50:02 -05:00
Lucas
9cac93b724 nix: fix pre-commit hook in dev-shell (#987) 2025-12-11 14:40:19 -05:00
bbedward
0709f263af core: add test coverage for some of the wayland stack
- mostly targeting any race issue detection
2025-12-11 13:47:18 -05:00
Lucas
4e4effd8b1 nix: fix home-manager module plugins (#984) 2025-12-11 19:36:32 +01:00
bbedward
f9632cba61 core: remove hyprpicker remnant 2025-12-11 13:05:07 -05:00
bbedward
38db6a41d5 gamma: fix initial night mode enablement 2025-12-11 12:27:58 -05:00
bbedward
7c6f0432c8 clipboard: add copyEntry (by id) handler 2025-12-11 12:00:47 -05:00
bbedward
56ff9368be matugen: add option to disable DMS templates
fixes #983
2025-12-11 11:48:59 -05:00
bbedward
597e21d44d clipboard: remove wl-copy references 2025-12-11 11:10:27 -05:00
bbedward
5bf54632be media: add option to disable visualizer in bar widget
fixes #978
2025-12-11 10:55:32 -05:00
bbedward
3a8d3ee515 core: use stdlib for xdg dirs 2025-12-11 10:15:23 -05:00
bbedward
1c1cf866e2 settings: make default height screen-aware 2025-12-11 09:51:44 -05:00
bbedward
ccc1df75f1 nix: update vendorHash 2025-12-11 09:50:50 -05:00
bbedward
d2c3f87656 ci: fix nix vendor-hash workflow 2025-12-11 09:46:57 -05:00
bbedward
6d62229b5f clipboard: introduce native clipboard, clip-persist, clip-storage functionality 2025-12-11 09:41:07 -05:00
Marcus Ramberg
7c88865d67 Refactor pre-commit hooks to use prek (#976)
* ci: change to prek for pre-commit

* refactor: fix shellcheck warnings for the scripts

* chore: unify whitespace formatting

* nix: add prek to dev shell
2025-12-11 09:11:12 -05:00
bbedward
c8cfe0cb5a dwl: fix layout popout not opening
fixes #980
2025-12-11 09:05:53 -05:00
Lucas
e573bdba92 nix: add QML dependencies to dms-shell package (#967) 2025-12-11 09:19:43 +01:00
Lucas
d8cd15d361 nix: add plugins in NixOS module (#970)
* nix: remove unnecessary /etc/xdg/quickshell/dms and .config/quickshell/dms

* nix: add plugins in NixOS module
2025-12-11 09:03:22 +01:00
Lucas
1db3907838 nix: fix greeter per-monitor and per-mode wallpapers (#974) 2025-12-11 09:01:14 +01:00
Lucas
72cfd37ab7 nix: fix niri module (#969) 2025-12-10 23:21:52 -05:00
bbedward
1e67ee995e plugins: hide uninstall and update buttons for system plugins 2025-12-10 19:30:58 -05:00
bbedward
6c26b4080c core: fix socket reported CLI version 2025-12-10 16:48:44 -05:00
purian23
0dbd59b223 Manual Changelog versioning 2025-12-10 13:52:29 -05:00
Lucas
b2066c60d1 nix: drop unnecessary dependencies and enable power and accounts daemons (#963) 2025-12-10 19:35:58 +01:00
bbedward
8d7ae324ff Revert "distro: update ppa-build script to ref right version"
This reverts commit c0d3c4f875.
2025-12-10 13:31:15 -05:00
bbedward
c0d3c4f875 distro: update ppa-build script to ref right version 2025-12-10 13:28:34 -05:00
purian23
27a771648a Ubuntu workflow tweak 2025-12-10 12:37:56 -05:00
purian23
86affc7304 Add WorkDIR to build steps 2025-12-10 12:33:41 -05:00
purian23
d939b99628 Workflow build increment logic 2025-12-10 12:27:10 -05:00
purian23
1fcf777f3d Bump OBS spec 2025-12-10 12:13:43 -05:00
purian23
7a8e23faa9 Update build scripts 2025-12-10 12:06:28 -05:00
bbedward
73a4dd3321 change codename 2025-12-10 11:18:08 -05:00
purian23
13ce873a69 Update dms stable systemd & desktop path 2025-12-10 11:16:03 -05:00
bbedward
406dc64aba wf: disable update-versions job 2025-12-10 10:50:47 -05:00
dms-ci[bot]
af5d6a2015 chore: bump version to v1.0.0 2025-12-10 15:43:36 +00:00
bbedward
61c6f509ae i18n: update translations 2025-12-10 09:32:57 -05:00
Marcus Ramberg
98769ecd88 nix: switch to standard nixpkgs rfc formatting (#962) 2025-12-10 04:55:45 -03:00
bbedward
8615950bd6 cc: allow 75 width sliders 2025-12-10 00:48:27 -05:00
bbedward
1bec8dfc48 vpn: make import modal floating variant 2025-12-10 00:30:45 -05:00
bbedward
460486fe25 media: fix media player updates 2025-12-09 23:59:04 -05:00
bbedward
318c50bc6c media: block scrolling media volume in widget when no player vol avail 2025-12-09 23:45:01 -05:00
purian23
3e08bac7f3 distros: Prep dms-git build versioning 2025-12-09 23:25:34 -05:00
bbedward
c3d64ab185 scrollwm: fix keybind provider registration 2025-12-09 20:14:07 -05:00
bbedward
2b73077b50 cc: add small disk usage variant
fixes #958
2025-12-09 16:09:13 -05:00
bbedward
f953bd5488 i18n: update translations 2025-12-09 16:01:05 -05:00
Varshit
f94011cf05 feat: add scroll compositor support (#959)
* added scroll support

* import QuickShell.i3

* update scroll provider registration logic

* improve scroll support for workspace switcher

* update title for scroll keybinds

* add scroll to dms-greeter

* fix: formatting & sway keybind provider

* readme update

---------

Co-authored-by: bbedward <bbedward@gmail.com>
2025-12-09 15:57:46 -05:00
bbedward
aeacf109eb core: add slices, paths, exec utils 2025-12-09 15:34:13 -05:00
purian23
e307de83e2 packages: Update manual changelogs 2025-12-09 14:17:53 -05:00
bbedward
85968ec417 core/server: refactor to use shared params/request structs 2025-12-09 14:13:20 -05:00
bbedward
993f14a31f widgets: make dank icon picker a popup 2025-12-09 13:41:12 -05:00
purian23
566d617508 Re-adjust systemd debian/ubuntu 2025-12-09 13:40:59 -05:00
purian23
542a279fcb Add systemd debian/ubuntu packages 2025-12-09 12:39:56 -05:00
purian23
e784bb89e1 Version lock dms fedora/opensuse packages 2025-12-09 12:39:21 -05:00
bbedward
f680ace258 keybinds: fix dms args for some commands, some XF86 mappings 2025-12-09 12:21:20 -05:00
bbedward
7aa5976e07 media: fix padding issues with long titles 2025-12-09 11:46:50 -05:00
bbedward
f88f1ea951 gamma: display automation state in UI 2025-12-09 11:26:28 -05:00
bbedward
da4561cb35 keybinds: support more keys, allow Super+Alt 2025-12-09 10:41:39 -05:00
bbedward
1f89ae9813 popout: fix sizing on older QT 2025-12-09 09:57:31 -05:00
bbedward
5647323449 gamma: switch to wlsunset-style transitions 2025-12-09 09:44:16 -05:00
Karsten Zeides
bc27253cbf fix(README): fixes documentation link to include trailing slash (#920)
fixes same issue as described in AvengeMedia/DankLinux-Docs#25
2025-12-09 08:13:33 +01:00
Lucas
0672b711f3 nix: fix greeter custom theme (#954) 2025-12-09 07:14:13 +01:00
bbedward
ed9ee6e347 gamma: fix transition on enable 2025-12-09 00:46:49 -05:00
bbedward
7ad23ad4a2 gamma: fix night mode toggling 2025-12-09 00:35:52 -05:00
bbedward
8a83f03cc1 keybinds: fix provider loading via IPC 2025-12-09 00:30:14 -05:00
bbedward
0be9ac4097 keybinds: fix cheatsheet on non niri
- separate read-only logic from read/write logic
2025-12-09 00:03:39 -05:00
bbedward
ba5be6b516 wallpaper: cleanup transitions 2025-12-08 23:53:50 -05:00
bbedward
c4aea6d326 themes: dont handle custom themes in onCompleted
- Defer entirely to FileView
2025-12-08 23:44:04 -05:00
bbedward
858c6407a9 dankinstall: remove keyring file on debian 2025-12-08 23:37:13 -05:00
bbedward
c4313395b5 dankinstall: use gpg batch for deb 2025-12-08 23:36:14 -05:00
bbedward
a32aec3d59 dankinstall: fix other debian sudo cmd 2025-12-08 23:31:08 -05:00
bbedward
696bcfe8fa dankinstall: fix deb sudo command 2025-12-08 23:30:03 -05:00
bbedward
2f3a253c6a wallpaper: fix per-monitor wallpaper in dash 2025-12-08 23:25:02 -05:00
bbedward
e41fbe0188 misc: change transmission icon override 2025-12-08 23:11:17 -05:00
bbedward
ef9d28597b dankinstall: don't fail suse if addrepo fails 2025-12-08 23:03:46 -05:00
bbedward
6f3c4c89ab keybinds: show fallback as action 2025-12-08 22:18:40 -05:00
bbedward
60c577a61e core: hyprland session on all distros, dms setup systemd prompt 2025-12-08 22:04:04 -05:00
bbedward
f3276c3039 notification: fix closing popout from escape
fixes #953
2025-12-08 20:46:22 -05:00
bbedward
37a843323d dankinstall: add hyprland session target, disable hyprland-git variant
universally
2025-12-08 20:40:13 -05:00
bbedward
95c780ca8c Revert "dankinstall: remove systemd path for Hyprland"
This reverts commit 0435a805c7.
2025-12-08 20:24:58 -05:00
bbedward
d60d5b154a dankinstall: switch to yalter/niri copr 2025-12-08 20:04:48 -05:00
bbedward
0435a805c7 dankinstall: remove systemd path for Hyprland 2025-12-08 19:48:07 -05:00
bbedward
f406a977e0 Revert "dankinstall: update hyprland syntax"
This reverts commit 54b253099d.
2025-12-08 19:35:05 -05:00
bbedward
18db1e1ecb dankinstall: update postinstall message 2025-12-08 19:13:32 -05:00
bbedward
6bd1beb719 dankinstall: pin arch to quickshell-git 2025-12-08 19:05:29 -05:00
bbedward
1293aecbca dankinstall: nuke polkit 2025-12-08 19:03:11 -05:00
Marcus Ramberg
8a10c2e112 nixos: fix fprintd unlock (#952)
* nixos: fix fprintd unlock

* ci: this workflow doesn't need a token
2025-12-08 19:14:51 -03:00
bbedward
c21d777269 screenshot: flip bits for RGB888 2025-12-08 15:38:49 -05:00
bbedward
d864094f48 screenshot/colorpicker: handle 24-bit frames from compositor 2025-12-08 14:56:01 -05:00
bbedward
deaac3fdf0 list: improve mouse detection 2025-12-08 14:11:44 -05:00
bbedward
b7062fe40c windows: dont close on esc
fixes #911
2025-12-08 14:02:58 -05:00
bbedward
64d5e99b9d dock: ensure creation after bars
fixes #919
2025-12-08 13:54:44 -05:00
bbedward
f9d8a7d22b greeter: fix weather setting
fixes #921
2025-12-08 13:45:26 -05:00
bbedward
52fcd3ad98 lock: make VPN icon white to be consistent with others
fixes #926
2025-12-08 13:24:53 -05:00
bbedward
9d1e0ee29b fix color picker color space 2025-12-08 12:59:24 -05:00
bbedward
de62f48f50 screenshot: handle transformed displays 2025-12-08 12:45:05 -05:00
bbedward
f47b19274c media: fix position/bar awareness
- shift media control column so it doesnt go off screen
fixes #942
2025-12-08 11:51:40 -05:00
bbedward
bb7f7083b9 meta: transparency fixes
- fixes #949 - transparency not working > 95%
- fixes #947 - dont apply opacity to windows, defer to window-rules
2025-12-08 11:43:29 -05:00
Yuxiang Qiu
cd580090dc evdev: improve capslock detection for no led device (#923)
* evdev: improve capslock detection for no led device

* style: fmt
2025-12-08 11:16:43 -05:00
Marcus Ramberg
ddb74b598d ci: add flake check (#951) 2025-12-08 11:15:35 -05:00
bbedward
29571fc3aa screenshot: use wlr-output-management on DWL for x/y offsets 2025-12-08 10:53:08 -05:00
452 changed files with 35431 additions and 15824 deletions

8
.editorconfig Normal file

@@ -0,0 +1,8 @@
[*.sh]
# like -i=4
indent_style = space
indent_size = 4
[*.nix]
# like -i=4
indent_style = space
indent_size = 4


@@ -1,69 +0,0 @@
#!/usr/bin/env bash
set -euo pipefail
HOOK_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
REPO_ROOT="$(cd "$HOOK_DIR/.." && pwd)"
cd "$REPO_ROOT"
# =============================================================================
# Go CI checks (when core/ files are staged)
# =============================================================================
STAGED_CORE_FILES=$(git diff --cached --name-only --diff-filter=ACMR | grep '^core/' || true)
if [[ -n "$STAGED_CORE_FILES" ]]; then
echo "Go files staged in core/, running CI checks..."
cd "$REPO_ROOT/core"
# Format check
echo " Checking gofmt..."
UNFORMATTED=$(gofmt -s -l . 2>/dev/null || true)
if [[ -n "$UNFORMATTED" ]]; then
echo "The following files are not formatted:"
echo "$UNFORMATTED"
echo ""
echo "Run: cd core && gofmt -s -w ."
exit 1
fi
# golangci-lint
if command -v golangci-lint &>/dev/null; then
echo " Running golangci-lint..."
golangci-lint run ./...
else
echo " Warning: golangci-lint not installed, skipping lint"
echo " Install: go install github.com/golangci/golangci-lint/cmd/golangci-lint@latest"
fi
# Tests
echo " Running tests..."
if ! go test ./... >/dev/null 2>&1; then
echo "Tests failed! Run 'go test ./...' for details."
exit 1
fi
# Build checks
echo " Building..."
mkdir -p bin
go build -buildvcs=false -o bin/dms ./cmd/dms
go build -buildvcs=false -o bin/dms-distro -tags distro_binary ./cmd/dms
go build -buildvcs=false -o bin/dankinstall ./cmd/dankinstall
echo "All Go CI checks passed!"
cd "$REPO_ROOT"
fi
# =============================================================================
# i18n sync check (DISABLED for now)
# =============================================================================
# if [[ -n "${POEDITOR_API_TOKEN:-}" ]] && [[ -n "${POEDITOR_PROJECT_ID:-}" ]]; then
# if command -v python3 &>/dev/null; then
# if ! python3 scripts/i18nsync.py check &>/dev/null; then
# echo "Translations out of sync"
# echo "Run: python3 scripts/i18nsync.py sync"
# exit 1
# fi
# fi
# fi
exit 0


@@ -6,7 +6,7 @@ labels: "bug"
assignees: ""
---
<!-- If your issue is related to ICONS
<!-- If your issue is related to ICONS
- Purple and black checkerboards are QT's way of signalling an icon doesn't exist
- FIX: Configure a QT6 or Icon Pack in DMS Settings that has the icon you want
- Follow the [THEMING](https://danklinux.com/docs/dankmaterialshell/icon-theming) section to ensure your QT environment variable is configured correctly for themes.
@@ -62,4 +62,4 @@ Paste error messages or logs here
## Screenshots/Recordings
<!-- If applicable, add screenshots or screen recordings -->
<!-- If applicable, add screenshots or screen recordings -->


@@ -30,4 +30,4 @@ Is this feature specific to one compositor?
## Alternatives/Existing Solutions
<!-- Include any similar/pre-existing products that solve this problem -->
<!-- Include any similar/pre-existing products that solve this problem -->


@@ -37,4 +37,4 @@ assignees: ""
## Screenshots/Recordings
<!-- If applicable, add screenshots or screen recordings -->
<!-- If applicable, add screenshots or screen recordings -->

383
.github/workflows/backup/run-obs.yml.bak vendored Normal file

@@ -0,0 +1,383 @@
name: Update OBS Packages
on:
workflow_dispatch:
inputs:
package:
description: "Package to update (dms, dms-git, or all)"
required: false
default: "all"
force_upload:
description: "Force upload without version check"
required: false
default: "false"
type: choice
options:
- "false"
- "true"
rebuild_release:
description: "Release number for rebuilds (e.g., 2, 3, 4 to increment spec Release)"
required: false
default: ""
push:
tags:
- "v*"
schedule:
- cron: "0 */3 * * *" # Every 3 hours for dms-git builds
jobs:
check-updates:
name: Check for updates
runs-on: ubuntu-latest
outputs:
has_updates: ${{ steps.check.outputs.has_updates }}
packages: ${{ steps.check.outputs.packages }}
version: ${{ steps.check.outputs.version }}
steps:
- name: Checkout
uses: actions/checkout@v4
with:
fetch-depth: 0
- name: Install OSC
run: |
sudo apt-get update
sudo apt-get install -y osc
mkdir -p ~/.config/osc
cat > ~/.config/osc/oscrc << EOF
[general]
apiurl = https://api.opensuse.org
[https://api.opensuse.org]
user = ${{ secrets.OBS_USERNAME }}
pass = ${{ secrets.OBS_PASSWORD }}
EOF
chmod 600 ~/.config/osc/oscrc
- name: Check for updates
id: check
run: |
if [[ "${{ github.event_name }}" == "push" && "${{ github.ref }}" =~ ^refs/tags/ ]]; then
echo "packages=dms" >> $GITHUB_OUTPUT
VERSION="${GITHUB_REF#refs/tags/}"
echo "version=$VERSION" >> $GITHUB_OUTPUT
echo "has_updates=true" >> $GITHUB_OUTPUT
echo "Triggered by tag: $VERSION (always update)"
elif [[ "${{ github.event_name }}" == "schedule" ]]; then
echo "packages=dms-git" >> $GITHUB_OUTPUT
echo "Checking if dms-git source has changed..."
# Get current commit hash (8 chars to match spec format)
CURRENT_COMMIT=$(git rev-parse --short=8 HEAD)
# Check OBS for last uploaded commit
OBS_BASE="$HOME/.cache/osc-checkouts"
mkdir -p "$OBS_BASE"
OBS_PROJECT="home:AvengeMedia:dms-git"
if [[ -d "$OBS_BASE/$OBS_PROJECT/dms-git" ]]; then
cd "$OBS_BASE/$OBS_PROJECT/dms-git"
osc up -q 2>/dev/null || true
# Extract commit hash from spec Version line; format like 0.6.2+git2264.a679be68
if [[ -f "dms-git.spec" ]]; then
OBS_COMMIT=$(grep "^Version:" "dms-git.spec" | grep -oP '\.[a-f0-9]{8}' | tr -d '.' || echo "")
if [[ -n "$OBS_COMMIT" ]]; then
if [[ "$CURRENT_COMMIT" == "$OBS_COMMIT" ]]; then
echo "has_updates=false" >> $GITHUB_OUTPUT
echo "📋 Commit $CURRENT_COMMIT already uploaded to OBS, skipping"
else
echo "has_updates=true" >> $GITHUB_OUTPUT
echo "📋 New commit detected: $CURRENT_COMMIT (OBS has $OBS_COMMIT)"
fi
else
echo "has_updates=true" >> $GITHUB_OUTPUT
echo "📋 Could not extract OBS commit, proceeding with update"
fi
else
echo "has_updates=true" >> $GITHUB_OUTPUT
echo "📋 No spec file in OBS, proceeding with update"
fi
cd "${{ github.workspace }}"
else
echo "has_updates=true" >> $GITHUB_OUTPUT
echo "📋 First upload to OBS, update needed"
fi
elif [[ "${{ github.event.inputs.force_upload }}" == "true" ]]; then
PKG="${{ github.event.inputs.package }}"
if [[ -z "$PKG" || "$PKG" == "all" ]]; then
echo "packages=all" >> $GITHUB_OUTPUT
echo "has_updates=true" >> $GITHUB_OUTPUT
echo "🚀 Force upload: all packages"
else
echo "packages=$PKG" >> $GITHUB_OUTPUT
echo "has_updates=true" >> $GITHUB_OUTPUT
echo "🚀 Force upload: $PKG"
fi
elif [[ -n "${{ github.event.inputs.package }}" ]]; then
echo "packages=${{ github.event.inputs.package }}" >> $GITHUB_OUTPUT
echo "has_updates=true" >> $GITHUB_OUTPUT
echo "Manual trigger: ${{ github.event.inputs.package }}"
else
echo "packages=all" >> $GITHUB_OUTPUT
echo "has_updates=true" >> $GITHUB_OUTPUT
fi
update-obs:
name: Upload to OBS
needs: check-updates
runs-on: ubuntu-latest
permissions:
contents: write
pull-requests: write
if: |
github.event.inputs.force_upload == 'true' ||
github.event_name == 'workflow_dispatch' ||
needs.check-updates.outputs.has_updates == 'true'
steps:
- name: Generate GitHub App Token
id: generate_token
uses: actions/create-github-app-token@v1
with:
app-id: ${{ secrets.APP_ID }}
private-key: ${{ secrets.APP_PRIVATE_KEY }}
- name: Checkout
uses: actions/checkout@v4
with:
fetch-depth: 0
token: ${{ steps.generate_token.outputs.token }}
- name: Check if last commit was automated
id: check-loop
run: |
LAST_COMMIT_MSG=$(git log -1 --pretty=%B | head -1)
if [[ "$LAST_COMMIT_MSG" == "ci: Auto-update PPA packages"* ]] || [[ "$LAST_COMMIT_MSG" == "ci: Auto-update OBS packages"* ]]; then
echo "⏭️ Last commit was automated ($LAST_COMMIT_MSG), skipping to prevent infinite loop"
echo "skip=true" >> $GITHUB_OUTPUT
else
echo "✅ Last commit was not automated, proceeding"
echo "skip=false" >> $GITHUB_OUTPUT
fi
- name: Determine packages to update
if: steps.check-loop.outputs.skip != 'true'
id: packages
run: |
if [[ "${{ github.event_name }}" == "push" && "${{ github.ref }}" =~ ^refs/tags/ ]]; then
echo "packages=dms" >> $GITHUB_OUTPUT
VERSION="${GITHUB_REF#refs/tags/}"
echo "version=$VERSION" >> $GITHUB_OUTPUT
echo "Triggered by tag: $VERSION"
elif [[ "${{ github.event_name }}" == "schedule" ]]; then
echo "packages=${{ needs.check-updates.outputs.packages }}" >> $GITHUB_OUTPUT
echo "Triggered by schedule: updating git package"
elif [[ -n "${{ github.event.inputs.package }}" ]]; then
echo "packages=${{ github.event.inputs.package }}" >> $GITHUB_OUTPUT
echo "Manual trigger: ${{ github.event.inputs.package }}"
else
echo "packages=${{ needs.check-updates.outputs.packages }}" >> $GITHUB_OUTPUT
fi
- name: Update dms-git spec version
if: steps.check-loop.outputs.skip != 'true' && (contains(steps.packages.outputs.packages, 'dms-git') || steps.packages.outputs.packages == 'all')
run: |
# Get commit info for dms-git versioning
COMMIT_HASH=$(git rev-parse --short=8 HEAD)
COMMIT_COUNT=$(git rev-list --count HEAD)
BASE_VERSION=$(grep -oP '^Version:\s+\K[0-9.]+' distro/opensuse/dms.spec | head -1 || echo "0.6.2")
NEW_VERSION="${BASE_VERSION}+git${COMMIT_COUNT}.${COMMIT_HASH}"
echo "📦 Updating dms-git.spec to version: $NEW_VERSION"
# Update version in spec
sed -i "s/^Version:.*/Version: $NEW_VERSION/" distro/opensuse/dms-git.spec
# Add changelog entry
DATE_STR=$(date "+%a %b %d %Y")
CHANGELOG_ENTRY="* $DATE_STR Avenge Media <AvengeMedia.US@gmail.com> - ${NEW_VERSION}-1\n- Git snapshot (commit $COMMIT_COUNT: $COMMIT_HASH)"
sed -i "/%changelog/a\\$CHANGELOG_ENTRY" distro/opensuse/dms-git.spec
- name: Update Debian dms-git changelog version
if: steps.check-loop.outputs.skip != 'true' && (contains(steps.packages.outputs.packages, 'dms-git') || steps.packages.outputs.packages == 'all')
run: |
# Get commit info for dms-git versioning
COMMIT_HASH=$(git rev-parse --short=8 HEAD)
COMMIT_COUNT=$(git rev-list --count HEAD)
BASE_VERSION=$(grep -oP '^Version:\s+\K[0-9.]+' distro/opensuse/dms.spec | head -1 || echo "0.6.2")
# Debian version format: 0.6.2+git2256.9162e314
NEW_VERSION="${BASE_VERSION}+git${COMMIT_COUNT}.${COMMIT_HASH}"
echo "📦 Updating Debian dms-git changelog to version: $NEW_VERSION"
CHANGELOG_DATE=$(date -R)
CHANGELOG_FILE="distro/debian/dms-git/debian/changelog"
# Get current version from changelog
CURRENT_VERSION=$(head -1 "$CHANGELOG_FILE" | sed 's/.*(\([^)]*\)).*/\1/')
echo "Current Debian version: $CURRENT_VERSION"
echo "New version: $NEW_VERSION"
# Only update if version changed
if [ "$CURRENT_VERSION" != "$NEW_VERSION" ]; then
# Create new changelog entry at top
TEMP_CHANGELOG=$(mktemp)
cat > "$TEMP_CHANGELOG" << EOF
dms-git ($NEW_VERSION) nightly; urgency=medium
* Git snapshot (commit $COMMIT_COUNT: $COMMIT_HASH)
-- Avenge Media <AvengeMedia.US@gmail.com> $CHANGELOG_DATE
EOF
# Prepend to existing changelog
cat "$CHANGELOG_FILE" >> "$TEMP_CHANGELOG"
mv "$TEMP_CHANGELOG" "$CHANGELOG_FILE"
echo "✓ Updated Debian changelog: $CURRENT_VERSION → $NEW_VERSION"
else
echo "✓ Debian changelog already at version $NEW_VERSION"
fi
- name: Update dms stable version
if: steps.check-loop.outputs.skip != 'true' && steps.packages.outputs.version != ''
run: |
VERSION="${{ steps.packages.outputs.version }}"
VERSION_NO_V="${VERSION#v}"
echo "Updating packaging to version $VERSION_NO_V"
# Update openSUSE dms spec (stable only)
sed -i "s/^Version:.*/Version: $VERSION_NO_V/" distro/opensuse/dms.spec
# Update openSUSE spec changelog
DATE_STR=$(date "+%a %b %d %Y")
CHANGELOG_ENTRY="* $DATE_STR AvengeMedia <maintainer@avengemedia.com> - ${VERSION_NO_V}-1\\n- Update to stable $VERSION release\\n- Bug fixes and improvements"
sed -i "/%changelog/a\\$CHANGELOG_ENTRY\\n" distro/opensuse/dms.spec
# Update Debian _service files (both tar_scm and download_url formats)
for service in distro/debian/*/_service; do
if [[ -f "$service" ]]; then
# Update tar_scm revision parameter (for dms-git)
sed -i "s|<param name=\"revision\">v[0-9.]*</param>|<param name=\"revision\">$VERSION</param>|" "$service"
# Update download_url paths (for dms stable)
sed -i "s|/v[0-9.]\+/|/$VERSION/|g" "$service"
sed -i "s|/tags/v[0-9.]\+\.tar\.gz|/tags/$VERSION.tar.gz|g" "$service"
fi
done
# Update Debian changelog for dms stable
if [[ -f "distro/debian/dms/debian/changelog" ]]; then
CHANGELOG_DATE=$(date -R)
TEMP_CHANGELOG=$(mktemp)
cat > "$TEMP_CHANGELOG" << EOF
dms ($VERSION_NO_V) stable; urgency=medium
* Update to $VERSION stable release
* Bug fixes and improvements
-- Avenge Media <AvengeMedia.US@gmail.com> $CHANGELOG_DATE
EOF
cat "distro/debian/dms/debian/changelog" >> "$TEMP_CHANGELOG"
mv "$TEMP_CHANGELOG" "distro/debian/dms/debian/changelog"
echo "✓ Updated Debian changelog to $VERSION_NO_V"
fi
- name: Install Go
if: steps.check-loop.outputs.skip != 'true'
uses: actions/setup-go@v5
with:
go-version: "1.24"
- name: Install OSC
if: steps.check-loop.outputs.skip != 'true'
run: |
sudo apt-get update
sudo apt-get install -y osc
mkdir -p ~/.config/osc
cat > ~/.config/osc/oscrc << EOF
[general]
apiurl = https://api.opensuse.org
[https://api.opensuse.org]
user = ${{ secrets.OBS_USERNAME }}
pass = ${{ secrets.OBS_PASSWORD }}
EOF
chmod 600 ~/.config/osc/oscrc
- name: Upload to OBS
if: steps.check-loop.outputs.skip != 'true'
env:
FORCE_UPLOAD: ${{ github.event.inputs.force_upload }}
REBUILD_RELEASE: ${{ github.event.inputs.rebuild_release }}
run: |
PACKAGES="${{ steps.packages.outputs.packages }}"
MESSAGE="Automated update from GitHub Actions"
if [[ -n "${{ steps.packages.outputs.version }}" ]]; then
MESSAGE="Update to ${{ steps.packages.outputs.version }}"
fi
if [[ "$PACKAGES" == "all" ]]; then
bash distro/scripts/obs-upload.sh dms "$MESSAGE"
bash distro/scripts/obs-upload.sh dms-git "Automated git update"
else
bash distro/scripts/obs-upload.sh "$PACKAGES" "$MESSAGE"
fi
- name: Get changed packages
if: steps.check-loop.outputs.skip != 'true'
id: changed-packages
run: |
# Check if there are any changes to commit
if git diff --exit-code distro/debian/ distro/opensuse/ >/dev/null 2>&1; then
echo "has_changes=false" >> $GITHUB_OUTPUT
echo "📋 No changelog or spec changes to commit"
else
echo "has_changes=true" >> $GITHUB_OUTPUT
# Get list of changed packages for commit message
CHANGED_DEB=$(git diff --name-only distro/debian/ 2>/dev/null | grep 'debian/changelog' | xargs dirname 2>/dev/null | xargs dirname 2>/dev/null | xargs basename 2>/dev/null | tr '\n' ', ' | sed 's/, $//' || echo "")
CHANGED_SUSE=$(git diff --name-only distro/opensuse/ 2>/dev/null | grep '\.spec$' | sed 's|distro/opensuse/||' | sed 's/\.spec$//' | tr '\n' ', ' | sed 's/, $//' || echo "")
PKGS=$(echo "$CHANGED_DEB,$CHANGED_SUSE" | tr ',' '\n' | grep -v '^$' | sort -u | tr '\n' ',' | sed 's/,$//')
echo "packages=$PKGS" >> $GITHUB_OUTPUT
echo "📋 Changed packages: $PKGS"
fi
- name: Commit packaging changes
if: steps.check-loop.outputs.skip != 'true' && steps.changed-packages.outputs.has_changes == 'true'
run: |
git config user.name "dms-ci[bot]"
git config user.email "dms-ci[bot]@users.noreply.github.com"
git add distro/debian/*/debian/changelog distro/opensuse/*.spec
git commit -m "ci: Auto-update OBS packages [${{ steps.changed-packages.outputs.packages }}]" -m "🤖 Automated by GitHub Actions"
git pull --rebase origin master
git push
- name: Summary
run: |
echo "### OBS Package Update Complete" >> $GITHUB_STEP_SUMMARY
echo "" >> $GITHUB_STEP_SUMMARY
echo "- **Packages**: ${{ steps.packages.outputs.packages }}" >> $GITHUB_STEP_SUMMARY
if [[ -n "${{ steps.packages.outputs.version }}" ]]; then
echo "- **Version**: ${{ steps.packages.outputs.version }}" >> $GITHUB_STEP_SUMMARY
fi
if [[ "${{ needs.check-updates.outputs.has_updates }}" == "false" ]]; then
echo "- **Status**: Skipped (no changes detected)" >> $GITHUB_STEP_SUMMARY
fi
echo "- **Project**: https://build.opensuse.org/project/show/home:AvengeMedia" >> $GITHUB_STEP_SUMMARY

298
.github/workflows/backup/run-ppa.yml.bak vendored Normal file

@@ -0,0 +1,298 @@
name: Update PPA Packages
on:
workflow_dispatch:
inputs:
package:
description: "Package to upload (dms, dms-git, dms-greeter, or all)"
required: false
default: "dms-git"
force_upload:
description: "Force upload without version check"
required: false
default: "false"
type: choice
options:
- "false"
- "true"
rebuild_release:
description: "Release number for rebuilds (e.g., 2, 3, 4 for ppa2, ppa3, ppa4)"
required: false
default: ""
schedule:
- cron: "0 */3 * * *" # Every 3 hours for dms-git builds
jobs:
check-updates:
name: Check for updates
runs-on: ubuntu-latest
outputs:
has_updates: ${{ steps.check.outputs.has_updates }}
packages: ${{ steps.check.outputs.packages }}
steps:
- name: Checkout
uses: actions/checkout@v4
with:
fetch-depth: 0
- name: Check for updates
id: check
run: |
if [[ "${{ github.event_name }}" == "schedule" ]]; then
echo "packages=dms-git" >> $GITHUB_OUTPUT
echo "Checking if dms-git source has changed..."
# Get current commit hash (8 chars to match changelog format)
CURRENT_COMMIT=$(git rev-parse --short=8 HEAD)
# Extract commit hash from changelog
# Format: dms-git (0.6.2+git2264.c5c5ce84) questing; urgency=medium
CHANGELOG_FILE="distro/ubuntu/dms-git/debian/changelog"
if [[ -f "$CHANGELOG_FILE" ]]; then
CHANGELOG_COMMIT=$(head -1 "$CHANGELOG_FILE" | grep -oP '\.[a-f0-9]{8}' | tr -d '.' || echo "")
if [[ -n "$CHANGELOG_COMMIT" ]]; then
if [[ "$CURRENT_COMMIT" == "$CHANGELOG_COMMIT" ]]; then
echo "has_updates=false" >> $GITHUB_OUTPUT
echo "📋 Commit $CURRENT_COMMIT already in changelog, skipping upload"
else
echo "has_updates=true" >> $GITHUB_OUTPUT
echo "📋 New commit detected: $CURRENT_COMMIT (changelog has $CHANGELOG_COMMIT)"
fi
else
echo "has_updates=true" >> $GITHUB_OUTPUT
echo "📋 Could not extract commit from changelog, proceeding with upload"
fi
else
echo "has_updates=true" >> $GITHUB_OUTPUT
echo "📋 No changelog file found, proceeding with upload"
fi
elif [[ -n "${{ github.event.inputs.package }}" ]]; then
echo "packages=${{ github.event.inputs.package }}" >> $GITHUB_OUTPUT
echo "has_updates=true" >> $GITHUB_OUTPUT
echo "Manual trigger: ${{ github.event.inputs.package }}"
else
echo "packages=dms-git" >> $GITHUB_OUTPUT
echo "has_updates=true" >> $GITHUB_OUTPUT
fi
upload-ppa:
name: Upload to PPA
needs: check-updates
runs-on: ubuntu-latest
permissions:
contents: write
pull-requests: write
if: |
github.event.inputs.force_upload == 'true' ||
github.event_name == 'workflow_dispatch' ||
needs.check-updates.outputs.has_updates == 'true'
steps:
- name: Generate GitHub App Token
id: generate_token
uses: actions/create-github-app-token@v1
with:
app-id: ${{ secrets.APP_ID }}
private-key: ${{ secrets.APP_PRIVATE_KEY }}
- name: Checkout
uses: actions/checkout@v4
with:
fetch-depth: 0
token: ${{ steps.generate_token.outputs.token }}
- name: Check if last commit was automated
id: check-loop
run: |
LAST_COMMIT_MSG=$(git log -1 --pretty=%B | head -1)
if [[ "$LAST_COMMIT_MSG" == "ci: Auto-update PPA packages"* ]] || [[ "$LAST_COMMIT_MSG" == "ci: Auto-update OBS packages"* ]]; then
echo "⏭️ Last commit was automated ($LAST_COMMIT_MSG), skipping to prevent infinite loop"
echo "skip=true" >> $GITHUB_OUTPUT
else
echo "✅ Last commit was not automated, proceeding"
echo "skip=false" >> $GITHUB_OUTPUT
fi
- name: Set up Go
if: steps.check-loop.outputs.skip != 'true'
uses: actions/setup-go@v5
with:
go-version: "1.24"
cache: false
- name: Install build dependencies
if: steps.check-loop.outputs.skip != 'true'
run: |
sudo apt-get update
sudo apt-get install -y \
debhelper \
devscripts \
dput \
lftp \
build-essential \
fakeroot \
dpkg-dev
- name: Configure GPG
if: steps.check-loop.outputs.skip != 'true'
env:
GPG_KEY: ${{ secrets.GPG_PRIVATE_KEY }}
run: |
echo "$GPG_KEY" | gpg --import
GPG_KEY_ID=$(gpg --list-secret-keys --keyid-format LONG | grep sec | awk '{print $2}' | cut -d'/' -f2)
echo "DEBSIGN_KEYID=$GPG_KEY_ID" >> $GITHUB_ENV
- name: Determine packages to upload
if: steps.check-loop.outputs.skip != 'true'
id: packages
run: |
if [[ "${{ github.event.inputs.force_upload }}" == "true" ]]; then
PKG="${{ github.event.inputs.package }}"
if [[ -z "$PKG" || "$PKG" == "all" ]]; then
echo "packages=all" >> $GITHUB_OUTPUT
echo "🚀 Force upload: all packages"
else
echo "packages=$PKG" >> $GITHUB_OUTPUT
echo "🚀 Force upload: $PKG"
fi
elif [[ "${{ github.event_name }}" == "schedule" ]]; then
echo "packages=${{ needs.check-updates.outputs.packages }}" >> $GITHUB_OUTPUT
echo "Triggered by schedule: uploading git package"
elif [[ -n "${{ github.event.inputs.package }}" ]]; then
# Manual package selection should respect change detection
SELECTED_PKG="${{ github.event.inputs.package }}"
UPDATED_PKG="${{ needs.check-updates.outputs.packages }}"
# Check if manually selected package is in the updated list
if [[ "$UPDATED_PKG" == *"$SELECTED_PKG"* ]] || [[ "$SELECTED_PKG" == "all" ]]; then
echo "packages=$SELECTED_PKG" >> $GITHUB_OUTPUT
echo "📦 Manual selection (has updates): $SELECTED_PKG"
else
echo "packages=" >> $GITHUB_OUTPUT
echo "⚠️ Manual selection '$SELECTED_PKG' has no updates - skipping (use force_upload to override)"
fi
else
echo "packages=${{ needs.check-updates.outputs.packages }}" >> $GITHUB_OUTPUT
fi
- name: Upload to PPA
if: steps.check-loop.outputs.skip != 'true'
run: |
PACKAGES="${{ steps.packages.outputs.packages }}"
REBUILD_RELEASE="${{ github.event.inputs.rebuild_release }}"
if [[ -z "$PACKAGES" ]]; then
echo "No packages selected for upload. Skipping."
exit 0
fi
# Build command arguments
BUILD_ARGS=()
if [[ -n "$REBUILD_RELEASE" ]]; then
BUILD_ARGS+=("$REBUILD_RELEASE")
echo "✓ Using rebuild release number: ppa$REBUILD_RELEASE"
fi
if [[ "$PACKAGES" == "all" ]]; then
echo "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━"
echo "Uploading dms to PPA..."
if [ -n "$REBUILD_RELEASE" ]; then
echo "🔄 Using rebuild release number: ppa$REBUILD_RELEASE"
fi
echo "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━"
bash distro/scripts/ppa-upload.sh dms dms questing "${BUILD_ARGS[@]}"
echo ""
echo "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━"
echo "Uploading dms-git to PPA..."
echo "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━"
bash distro/scripts/ppa-upload.sh dms-git dms-git questing "${BUILD_ARGS[@]}"
echo ""
echo "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━"
echo "Uploading dms-greeter to PPA..."
echo "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━"
bash distro/scripts/ppa-upload.sh dms-greeter danklinux questing "${BUILD_ARGS[@]}"
else
# Map package to PPA name
case "$PACKAGES" in
dms)
PPA_NAME="dms"
;;
dms-git)
PPA_NAME="dms-git"
;;
dms-greeter)
PPA_NAME="danklinux"
;;
*)
PPA_NAME="$PACKAGES"
;;
esac
echo "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━"
echo "Uploading $PACKAGES to PPA..."
echo "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━"
bash distro/scripts/ppa-upload.sh "$PACKAGES" "$PPA_NAME" questing "${BUILD_ARGS[@]}"
fi
- name: Get changed packages
if: steps.check-loop.outputs.skip != 'true'
id: changed-packages
run: |
# Check if there are any changelog changes to commit
if git diff --exit-code distro/ubuntu/ >/dev/null 2>&1; then
echo "has_changes=false" >> $GITHUB_OUTPUT
echo "📋 No changelog changes to commit"
else
echo "has_changes=true" >> $GITHUB_OUTPUT
# Get list of changed packages for commit message (deduplicate)
CHANGED=$(git diff --name-only distro/ubuntu/ | grep 'debian/changelog' | sed 's|/debian/changelog||' | xargs -I{} basename {} | sort -u | tr '\n' ',' | sed 's/,$//')
echo "packages=$CHANGED" >> $GITHUB_OUTPUT
echo "📋 Changed packages: $CHANGED"
echo "📋 Debug - Changed files:"
git diff --name-only distro/ubuntu/ | grep 'debian/changelog' || echo "No changelog files found"
fi
- name: Commit changelog changes
if: steps.check-loop.outputs.skip != 'true' && steps.changed-packages.outputs.has_changes == 'true'
run: |
git config user.name "dms-ci[bot]"
git config user.email "dms-ci[bot]@users.noreply.github.com"
git add distro/ubuntu/*/debian/changelog
git commit -m "ci: Auto-update PPA packages [${{ steps.changed-packages.outputs.packages }}]" -m "🤖 Automated by GitHub Actions"
git pull --rebase origin master
git push
- name: Summary
run: |
echo "### PPA Package Upload Complete" >> $GITHUB_STEP_SUMMARY
echo "" >> $GITHUB_STEP_SUMMARY
echo "- **Packages**: ${{ steps.packages.outputs.packages }}" >> $GITHUB_STEP_SUMMARY
if [[ "${{ needs.check-updates.outputs.has_updates }}" == "false" ]]; then
echo "- **Status**: Skipped (no changes detected)" >> $GITHUB_STEP_SUMMARY
fi
PACKAGES="${{ steps.packages.outputs.packages }}"
if [[ "$PACKAGES" == "all" ]]; then
echo "- **PPA dms**: https://launchpad.net/~avengemedia/+archive/ubuntu/dms/+packages" >> $GITHUB_STEP_SUMMARY
echo "- **PPA dms-git**: https://launchpad.net/~avengemedia/+archive/ubuntu/dms-git/+packages" >> $GITHUB_STEP_SUMMARY
echo "- **PPA danklinux**: https://launchpad.net/~avengemedia/+archive/ubuntu/danklinux/+packages" >> $GITHUB_STEP_SUMMARY
elif [[ "$PACKAGES" == "dms" ]]; then
echo "- **PPA**: https://launchpad.net/~avengemedia/+archive/ubuntu/dms/+packages" >> $GITHUB_STEP_SUMMARY
elif [[ "$PACKAGES" == "dms-git" ]]; then
echo "- **PPA**: https://launchpad.net/~avengemedia/+archive/ubuntu/dms-git/+packages" >> $GITHUB_STEP_SUMMARY
elif [[ "$PACKAGES" == "dms-greeter" ]]; then
echo "- **PPA**: https://launchpad.net/~avengemedia/+archive/ubuntu/danklinux/+packages" >> $GITHUB_STEP_SUMMARY
fi
if [[ -n "${{ steps.packages.outputs.version }}" ]]; then
echo "- **Version**: ${{ steps.packages.outputs.version }}" >> $GITHUB_STEP_SUMMARY
fi
echo "" >> $GITHUB_STEP_SUMMARY
echo "Builds will appear once Launchpad processes the uploads." >> $GITHUB_STEP_SUMMARY


@@ -33,20 +33,6 @@ jobs:
with:
go-version-file: ./core/go.mod
- name: Format check
run: |
if [ "$(gofmt -s -l . | wc -l)" -gt 0 ]; then
echo "The following files are not formatted:"
gofmt -s -l .
exit 1
fi
- name: Run golangci-lint
uses: golangci/golangci-lint-action@v9
with:
version: v2.6
working-directory: core
- name: Test
run: go test -v ./...

23
.github/workflows/nix-pr-check.yml vendored Normal file

@@ -0,0 +1,23 @@
name: Check nix flake
on:
pull_request:
branches: [master, main]
paths:
- "flake.*"
- "distro/nix/**"
jobs:
check-flake:
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: actions/checkout@v4
with:
fetch-depth: 0
- name: Install Nix
uses: cachix/install-nix-action@v31
- name: Check the flake
run: nix flake check

15
.github/workflows/prek.yml vendored Normal file

@@ -0,0 +1,15 @@
name: Pre-commit Checks
on:
push:
pull_request:
branches: [master, main]
jobs:
pre-commit-check:
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: actions/checkout@v4
- name: run pre-commit hooks
uses: j178/prek-action@v1


@@ -1,16 +1,19 @@
name: Release
on:
push:
tags:
- 'v*'
workflow_dispatch:
inputs:
tag:
description: "Tag to release (e.g., v1.0.1)"
required: true
type: string
permissions:
contents: write
actions: write
concurrency:
group: release-${{ github.ref_name }}
group: release-${{ inputs.tag }}
cancel-in-progress: true
jobs:
@@ -24,10 +27,14 @@ jobs:
run:
working-directory: core
env:
TAG: ${{ inputs.tag }}
steps:
- name: Checkout
uses: actions/checkout@v4
with:
ref: ${{ inputs.tag }}
fetch-depth: 0
- name: Set up Go
@@ -54,7 +61,7 @@ jobs:
run: |
set -eux
cd cmd/dankinstall
go build -trimpath -ldflags "-s -w -X main.Version=${GITHUB_REF#refs/tags/}" \
go build -trimpath -ldflags "-s -w -X main.Version=${TAG}" \
-o ../../dankinstall-${{ matrix.arch }}
cd ../..
gzip -9 -k dankinstall-${{ matrix.arch }}
@@ -68,7 +75,7 @@ jobs:
run: |
set -eux
cd cmd/dms
go build -trimpath -ldflags "-s -w -X main.Version=${GITHUB_REF#refs/tags/}" \
go build -trimpath -ldflags "-s -w -X main.Version=${TAG}" \
-o ../../dms-${{ matrix.arch }}
cd ../..
gzip -9 -k dms-${{ matrix.arch }}
@@ -91,7 +98,7 @@ jobs:
run: |
set -eux
cd cmd/dms
go build -trimpath -tags distro_binary -ldflags "-s -w -X main.Version=${GITHUB_REF#refs/tags/}" \
go build -trimpath -tags distro_binary -ldflags "-s -w -X main.Version=${TAG}" \
-o ../../dms-distropkg-${{ matrix.arch }}
cd ../..
gzip -9 -k dms-distropkg-${{ matrix.arch }}
@@ -128,60 +135,61 @@ jobs:
core/completion.zsh
if-no-files-found: error
update-versions:
runs-on: ubuntu-latest
needs: build-core
steps:
- name: Create GitHub App token
id: app_token
uses: actions/create-github-app-token@v1
with:
app-id: ${{ secrets.APP_ID }}
private-key: ${{ secrets.APP_PRIVATE_KEY }}
# update-versions:
# runs-on: ubuntu-latest
# needs: build-core
# steps:
# - name: Create GitHub App token
# id: app_token
# uses: actions/create-github-app-token@v1
# with:
# app-id: ${{ secrets.APP_ID }}
# private-key: ${{ secrets.APP_PRIVATE_KEY }}
- name: Checkout
uses: actions/checkout@v4
with:
token: ${{ steps.app_token.outputs.token }}
fetch-depth: 0
# - name: Checkout
# uses: actions/checkout@v4
# with:
# token: ${{ steps.app_token.outputs.token }}
# fetch-depth: 0
- name: Update VERSION
env:
GH_TOKEN: ${{ steps.app_token.outputs.token }}
run: |
set -euo pipefail
git config user.name "dms-ci[bot]"
git config user.email "dms-ci[bot]@users.noreply.github.com"
# - name: Update VERSION
# env:
# GH_TOKEN: ${{ steps.app_token.outputs.token }}
# run: |
# set -euo pipefail
# git config user.name "dms-ci[bot]"
# git config user.email "dms-ci[bot]@users.noreply.github.com"
version="${GITHUB_REF#refs/tags/}"
echo "Updating to version: $version"
echo "${version}" > quickshell/VERSION
git add quickshell/VERSION
# version="${GITHUB_REF#refs/tags/}"
# echo "Updating to version: $version"
# echo "${version}" > quickshell/VERSION
# git add quickshell/VERSION
if ! git diff --cached --quiet; then
git commit -m "chore: bump version to $version"
git pull --rebase origin master
git push https://x-access-token:${GH_TOKEN}@github.com/${{ github.repository }}.git HEAD:master
fi
# if ! git diff --cached --quiet; then
# git commit -m "chore: bump version to $version"
# git pull --rebase origin master
# git push https://x-access-token:${GH_TOKEN}@github.com/${{ github.repository }}.git HEAD:master
# fi
git tag -f "${version}"
git push -f https://x-access-token:${GH_TOKEN}@github.com/${{ github.repository }}.git "${version}"
# git tag -f "${version}"
# git push -f https://x-access-token:${GH_TOKEN}@github.com/${{ github.repository }}.git "${version}"
release:
runs-on: ubuntu-24.04
needs: [build-core, update-versions]
needs: [build-core] #, update-versions]
env:
TAG: ${{ github.ref_name }}
TAG: ${{ inputs.tag }}
steps:
- name: Checkout
uses: actions/checkout@v4
with:
ref: ${{ inputs.tag }}
fetch-depth: 0
- name: Fetch updated tag after version bump
run: |
git fetch origin --force tag ${{ github.ref_name }}
git checkout ${{ github.ref_name }}
git fetch origin --force tag ${TAG}
git checkout ${TAG}
- name: Download core artifacts
uses: actions/download-artifact@v4
@@ -273,6 +281,9 @@ jobs:
# Copy root LICENSE and CONTRIBUTING.md to quickshell/ for packaging
cp LICENSE CONTRIBUTING.md quickshell/
# Copy root assets directory to quickshell for systemd service and desktop file
cp -r assets quickshell/
# Tar the CONTENTS of quickshell/, not the directory itself
(cd quickshell && tar --exclude='.git' \
--exclude='.github' \
@@ -388,313 +399,296 @@ jobs:
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
trigger-obs-update:
runs-on: ubuntu-latest
needs: release
steps:
- name: Checkout
uses: actions/checkout@v4
- name: Install OSC
run: |
sudo apt-get update
sudo apt-get install -y osc
mkdir -p ~/.config/osc
cat > ~/.config/osc/oscrc << EOF
[general]
apiurl = https://api.opensuse.org
[https://api.opensuse.org]
user = ${{ secrets.OBS_USERNAME }}
pass = ${{ secrets.OBS_PASSWORD }}
EOF
chmod 600 ~/.config/osc/oscrc
- name: Update OBS packages
run: |
VERSION="${{ github.ref_name }}"
cd distro
bash scripts/obs-upload.sh dms "Update to $VERSION"
# trigger-obs-update:
# runs-on: ubuntu-latest
# needs: release
# env:
# TAG: ${{ inputs.tag }}
# steps:
# - name: Checkout
# uses: actions/checkout@v4
# with:
# ref: ${{ inputs.tag }}
trigger-ppa-update:
runs-on: ubuntu-latest
needs: release
steps:
- name: Checkout
uses: actions/checkout@v4
- name: Install build dependencies
run: |
sudo apt-get update
sudo apt-get install -y \
debhelper \
devscripts \
dput \
lftp \
build-essential \
fakeroot \
dpkg-dev
- name: Configure GPG
env:
GPG_KEY: ${{ secrets.GPG_PRIVATE_KEY }}
run: |
echo "$GPG_KEY" | gpg --import
GPG_KEY_ID=$(gpg --list-secret-keys --keyid-format LONG | grep sec | awk '{print $2}' | cut -d'/' -f2)
echo "DEBSIGN_KEYID=$GPG_KEY_ID" >> $GITHUB_ENV
- name: Upload to PPA
run: |
VERSION="${{ github.ref_name }}"
cd distro/ubuntu/ppa
bash create-and-upload.sh ../dms dms questing
# - name: Install OSC
# run: |
# sudo apt-get update
# sudo apt-get install -y osc
copr-build:
runs-on: ubuntu-latest
needs: release
env:
TAG: ${{ github.ref_name }}
# mkdir -p ~/.config/osc
# cat > ~/.config/osc/oscrc << EOF
# [general]
# apiurl = https://api.opensuse.org
steps:
- name: Checkout repository
uses: actions/checkout@v4
# [https://api.opensuse.org]
# user = ${{ secrets.OBS_USERNAME }}
# pass = ${{ secrets.OBS_PASSWORD }}
# EOF
# chmod 600 ~/.config/osc/oscrc
- name: Determine version
id: version
run: |
VERSION="${TAG#v}"
echo "version=$VERSION" >> $GITHUB_OUTPUT
echo "Building DMS stable version: $VERSION"
# - name: Update OBS packages
# run: |
# cd distro
# bash scripts/obs-upload.sh dms "Update to ${TAG}"
- name: Setup build environment
run: |
sudo apt-get update
sudo apt-get install -y rpm wget curl jq gzip
mkdir -p ~/rpmbuild/{BUILD,BUILDROOT,RPMS,SOURCES,SPECS,SRPMS}
# trigger-ppa-update:
# runs-on: ubuntu-latest
# needs: release
# env:
# TAG: ${{ inputs.tag }}
# steps:
# - name: Checkout
# uses: actions/checkout@v4
# with:
# ref: ${{ inputs.tag }}
# - name: Install build dependencies
# run: |
# sudo apt-get update
# sudo apt-get install -y \
# debhelper \
# devscripts \
# dput \
# lftp \
# build-essential \
# fakeroot \
# dpkg-dev
# - name: Configure GPG
# env:
# GPG_KEY: ${{ secrets.GPG_PRIVATE_KEY }}
# run: |
# echo "$GPG_KEY" | gpg --import
# GPG_KEY_ID=$(gpg --list-secret-keys --keyid-format LONG | grep sec | awk '{print $2}' | cut -d'/' -f2)
# echo "DEBSIGN_KEYID=$GPG_KEY_ID" >> $GITHUB_ENV
# - name: Upload to PPA
# run: |
# cd distro/ubuntu/ppa
# bash create-and-upload.sh ../dms dms questing
- name: Download release assets
run: |
VERSION="${{ steps.version.outputs.version }}"
cd ~/rpmbuild/SOURCES
wget "https://github.com/AvengeMedia/DankMaterialShell/releases/download/v${VERSION}/dms-qml.tar.gz" || {
echo "Failed to download dms-qml.tar.gz for v${VERSION}"
exit 1
}
- name: Generate stable spec file
run: |
VERSION="${{ steps.version.outputs.version }}"
CHANGELOG_DATE="$(date '+%a %b %d %Y')"
cat > ~/rpmbuild/SPECS/dms.spec <<'SPECEOF'
# Spec for DMS stable releases - Generated by GitHub Actions
%global debug_package %{nil}
%global version VERSION_PLACEHOLDER
%global pkg_summary DankMaterialShell - Material 3 inspired shell for Wayland compositors
Name: dms
Version: %{version}
Release: 1%{?dist}
Summary: %{pkg_summary}
License: MIT
URL: https://github.com/AvengeMedia/DankMaterialShell
Source0: dms-qml.tar.gz
BuildRequires: gzip
BuildRequires: wget
BuildRequires: systemd-rpm-macros
Requires: (quickshell or quickshell-git)
Requires: accountsservice
Requires: dms-cli
Requires: dgop
Recommends: cava
Recommends: cliphist
Recommends: danksearch
Recommends: matugen
Recommends: wl-clipboard
Recommends: NetworkManager
Recommends: qt6-qtmultimedia
Suggests: qt6ct
%description
DankMaterialShell (DMS) is a modern Wayland desktop shell built with Quickshell
and optimized for the niri and hyprland compositors. Features notifications,
app launcher, wallpaper customization, and fully customizable with plugins.
Includes auto-theming for GTK/Qt apps with matugen, 20+ customizable widgets,
process monitoring, notification center, clipboard history, dock, control center,
lock screen, and comprehensive plugin system.
%package -n dms-cli
Summary: DankMaterialShell CLI tool
License: MIT
URL: https://github.com/AvengeMedia/DankMaterialShell
%description -n dms-cli
Command-line interface for DankMaterialShell configuration and management.
Provides native DBus bindings, NetworkManager integration, and system utilities.
%package -n dgop
Summary: Stateless CPU/GPU monitor for DankMaterialShell
License: MIT
URL: https://github.com/AvengeMedia/dgop
Provides: dgop
%description -n dgop
DGOP is a stateless system monitoring tool that provides CPU, GPU, memory, and
network statistics. Designed for integration with DankMaterialShell but can be
used standalone. This package always includes the latest stable dgop release.
%prep
%setup -q -c -n dms-qml
# Download architecture-specific binaries during build
case "%{_arch}" in
x86_64)
ARCH_SUFFIX="amd64"
;;
aarch64)
ARCH_SUFFIX="arm64"
;;
*)
echo "Unsupported architecture: %{_arch}"
exit 1
;;
esac
wget -O %{_builddir}/dms-cli.gz "https://github.com/AvengeMedia/DankMaterialShell/releases/latest/download/dms-distropkg-${ARCH_SUFFIX}.gz" || {
echo "Failed to download dms-cli for architecture %{_arch}"
exit 1
}
gunzip -c %{_builddir}/dms-cli.gz > %{_builddir}/dms-cli
chmod +x %{_builddir}/dms-cli
wget -O %{_builddir}/dgop.gz "https://github.com/AvengeMedia/dgop/releases/latest/download/dgop-linux-${ARCH_SUFFIX}.gz" || {
echo "Failed to download dgop for architecture %{_arch}"
exit 1
}
gunzip -c %{_builddir}/dgop.gz > %{_builddir}/dgop
chmod +x %{_builddir}/dgop
%build
%install
install -Dm755 %{_builddir}/dms-cli %{buildroot}%{_bindir}/dms
install -Dm755 %{_builddir}/dgop %{buildroot}%{_bindir}/dgop
install -d %{buildroot}%{_datadir}/bash-completion/completions
install -d %{buildroot}%{_datadir}/zsh/site-functions
install -d %{buildroot}%{_datadir}/fish/vendor_completions.d
%{_builddir}/dms-cli completion bash > %{buildroot}%{_datadir}/bash-completion/completions/dms || :
%{_builddir}/dms-cli completion zsh > %{buildroot}%{_datadir}/zsh/site-functions/_dms || :
%{_builddir}/dms-cli completion fish > %{buildroot}%{_datadir}/fish/vendor_completions.d/dms.fish || :
install -Dm644 assets/systemd/dms.service %{buildroot}%{_userunitdir}/dms.service
install -Dm644 assets/dms-open.desktop %{buildroot}%{_datadir}/applications/dms-open.desktop
install -Dm644 assets/danklogo.svg %{buildroot}%{_datadir}/icons/hicolor/scalable/apps/danklogo.svg
install -dm755 %{buildroot}%{_datadir}/quickshell/dms
cp -r %{_builddir}/dms-qml/* %{buildroot}%{_datadir}/quickshell/dms/
rm -rf %{buildroot}%{_datadir}/quickshell/dms/.git*
rm -f %{buildroot}%{_datadir}/quickshell/dms/.gitignore
rm -rf %{buildroot}%{_datadir}/quickshell/dms/.github
rm -rf %{buildroot}%{_datadir}/quickshell/dms/distro
echo "%{version}" > %{buildroot}%{_datadir}/quickshell/dms/VERSION
%posttrans
if [ -d "%{_sysconfdir}/xdg/quickshell/dms" ]; then
rmdir "%{_sysconfdir}/xdg/quickshell/dms" 2>/dev/null || true
rmdir "%{_sysconfdir}/xdg/quickshell" 2>/dev/null || true
rmdir "%{_sysconfdir}/xdg" 2>/dev/null || true
fi
if [ "$1" -ge 2 ]; then
pkill -USR1 -x dms >/dev/null 2>&1 || true
fi
%files
%license LICENSE
%doc README.md CONTRIBUTING.md
%{_datadir}/quickshell/dms/
%{_userunitdir}/dms.service
%{_datadir}/applications/dms-open.desktop
%{_datadir}/icons/hicolor/scalable/apps/danklogo.svg
%files -n dms-cli
%{_bindir}/dms
%{_datadir}/bash-completion/completions/dms
%{_datadir}/zsh/site-functions/_dms
%{_datadir}/fish/vendor_completions.d/dms.fish
%files -n dgop
%{_bindir}/dgop
%changelog
* CHANGELOG_DATE_PLACEHOLDER AvengeMedia <contact@avengemedia.com> - VERSION_PLACEHOLDER-1
- Stable release VERSION_PLACEHOLDER
- Built from GitHub release
- Includes latest dms-cli and dgop binaries
SPECEOF
sed -i "s/VERSION_PLACEHOLDER/${VERSION}/g" ~/rpmbuild/SPECS/dms.spec
sed -i "s/CHANGELOG_DATE_PLACEHOLDER/${CHANGELOG_DATE}/g" ~/rpmbuild/SPECS/dms.spec
- name: Build SRPM
id: build
run: |
cd ~/rpmbuild/SPECS
rpmbuild -bs dms.spec
SRPM=$(ls ~/rpmbuild/SRPMS/*.src.rpm | tail -n 1)
SRPM_NAME=$(basename "$SRPM")
echo "srpm_path=$SRPM" >> $GITHUB_OUTPUT
echo "srpm_name=$SRPM_NAME" >> $GITHUB_OUTPUT
echo "SRPM built: $SRPM_NAME"
- name: Upload SRPM artifact
uses: actions/upload-artifact@v4
with:
name: dms-stable-srpm-${{ steps.version.outputs.version }}
path: ${{ steps.build.outputs.srpm_path }}
retention-days: 90
- name: Install Copr CLI
run: |
sudo apt-get install -y python3-pip
pip3 install copr-cli
mkdir -p ~/.config
cat > ~/.config/copr << EOF
[copr-cli]
login = ${{ secrets.COPR_LOGIN }}
username = avengemedia
token = ${{ secrets.COPR_TOKEN }}
copr_url = https://copr.fedorainfracloud.org
EOF
chmod 600 ~/.config/copr
- name: Upload to Copr
run: |
SRPM="${{ steps.build.outputs.srpm_path }}"
VERSION="${{ steps.version.outputs.version }}"
echo "Uploading SRPM to avengemedia/dms..."
BUILD_OUTPUT=$(copr-cli build avengemedia/dms "$SRPM" --nowait 2>&1)
echo "$BUILD_OUTPUT"
BUILD_ID=$(echo "$BUILD_OUTPUT" | grep -oP 'Build was added to.*\K[0-9]+' || echo "unknown")
if [ "$BUILD_ID" != "unknown" ]; then
echo "Build submitted: https://copr.fedorainfracloud.org/coprs/avengemedia/dms/build/$BUILD_ID/"
fi
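The spec-generation and SRPM steps above can also be reproduced outside CI for local debugging. This is a minimal sketch, assuming `rpmdevtools` (for `rpmdev-setuptree`) and `rpm-build` are installed, that the generated spec has already been written to `~/rpmbuild/SPECS/dms.spec`, and that `1.2.3` stands in for a real release tag; it is not part of the workflow itself.

```bash
# Set up the standard ~/rpmbuild tree (SPECS, SOURCES, SRPMS, ...)
rpmdev-setuptree

# Fetch the same source tarball the workflow downloads ("1.2.3" is a placeholder tag)
VERSION="1.2.3"
wget -O ~/rpmbuild/SOURCES/dms-qml.tar.gz \
  "https://github.com/AvengeMedia/DankMaterialShell/releases/download/v${VERSION}/dms-qml.tar.gz"

# Substitute the placeholders exactly as the workflow does
sed -i "s/VERSION_PLACEHOLDER/${VERSION}/g" ~/rpmbuild/SPECS/dms.spec
sed -i "s/CHANGELOG_DATE_PLACEHOLDER/$(date '+%a %b %d %Y')/g" ~/rpmbuild/SPECS/dms.spec

# Build only the source RPM; the binary build happens on Copr
rpmbuild -bs ~/rpmbuild/SPECS/dms.spec
ls ~/rpmbuild/SRPMS/
```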


@@ -19,7 +19,7 @@ jobs:
steps:
- name: Checkout repository
uses: actions/checkout@v4
- name: Determine version
id: version
run: |
@@ -40,31 +40,31 @@ jobs:
echo "version=$VERSION" >> $GITHUB_OUTPUT
echo "release=$RELEASE" >> $GITHUB_OUTPUT
echo "✅ Building DMS hotfix version: $VERSION-$RELEASE"
- name: Setup build environment
run: |
sudo apt-get update
sudo apt-get install -y rpm wget curl jq gzip
mkdir -p ~/rpmbuild/{BUILD,BUILDROOT,RPMS,SOURCES,SPECS,SRPMS}
echo "✅ RPM build environment ready"
- name: Download release assets
run: |
VERSION="${{ steps.version.outputs.version }}"
cd ~/rpmbuild/SOURCES
echo "📦 Downloading DMS QML source for v${VERSION}..."
# Download DMS QML source
wget "https://github.com/AvengeMedia/DankMaterialShell/releases/download/v${VERSION}/dms-qml.tar.gz" || {
echo "❌ Failed to download dms-qml.tar.gz for v${VERSION}"
exit 1
}
echo "✅ Source downloaded"
echo "Note: dms-cli and dgop binaries will be downloaded during build based on target architecture"
echo "Note: dms-cli binary will be downloaded during build based on target architecture"
ls -lh
- name: Generate stable spec file
run: |
VERSION="${{ steps.version.outputs.version }}"
@@ -94,7 +94,7 @@ jobs:
Requires: (quickshell or quickshell-git)
Requires: accountsservice
Requires: dms-cli
Requires: dms-cli = %{version}-%{release}
Requires: dgop
Recommends: cava
@@ -125,17 +125,6 @@ jobs:
Command-line interface for DankMaterialShell configuration and management.
Provides native DBus bindings, NetworkManager integration, and system utilities.
%package -n dgop
Summary: Stateless CPU/GPU monitor for DankMaterialShell
License: MIT
URL: https://github.com/AvengeMedia/dgop
Provides: dgop
%description -n dgop
DGOP is a stateless system monitoring tool that provides CPU, GPU, memory, and
network statistics. Designed for integration with DankMaterialShell but can be
used standalone. This package always includes the latest stable dgop release.
%prep
%setup -q -c -n dms-qml
@@ -162,19 +151,10 @@ jobs:
gunzip -c %{_builddir}/dms-cli.gz > %{_builddir}/dms-cli
chmod +x %{_builddir}/dms-cli
# Download dgop for target architecture
wget -O %{_builddir}/dgop.gz "https://github.com/AvengeMedia/dgop/releases/latest/download/dgop-linux-${ARCH_SUFFIX}.gz" || {
echo "Failed to download dgop for architecture %{_arch}"
exit 1
}
gunzip -c %{_builddir}/dgop.gz > %{_builddir}/dgop
chmod +x %{_builddir}/dgop
%build
%install
install -Dm755 %{_builddir}/dms-cli %{buildroot}%{_bindir}/dms
install -Dm755 %{_builddir}/dgop %{buildroot}%{_bindir}/dgop
# Shell completions
install -d %{buildroot}%{_datadir}/bash-completion/completions
@@ -202,11 +182,8 @@ jobs:
rmdir "%{_sysconfdir}/xdg/quickshell" 2>/dev/null || true
rmdir "%{_sysconfdir}/xdg" 2>/dev/null || true
fi
# Restart DMS for active users after upgrade
if [ "$1" -ge 2 ]; then
pkill -USR1 -x dms >/dev/null 2>&1 || true
fi
# Signal running DMS instances to reload (harmless if none running)
pkill -USR1 -x dms >/dev/null 2>&1 || :
%files
%license LICENSE
@@ -220,14 +197,10 @@ jobs:
%{_datadir}/zsh/site-functions/_dms
%{_datadir}/fish/vendor_completions.d/dms.fish
%files -n dgop
%{_bindir}/dgop
%changelog
* CHANGELOG_DATE_PLACEHOLDER AvengeMedia <contact@avengemedia.com> - VERSION_PLACEHOLDER-RELEASE_PLACEHOLDER
- Stable release VERSION_PLACEHOLDER
- Built from GitHub release
- Includes latest dms-cli and dgop binaries
SPECEOF
sed -i "s/VERSION_PLACEHOLDER/${VERSION}/g" ~/rpmbuild/SPECS/dms.spec
@@ -238,38 +211,38 @@ jobs:
echo ""
echo "=== Spec file preview ==="
head -40 ~/rpmbuild/SPECS/dms.spec
- name: Build SRPM
id: build
run: |
cd ~/rpmbuild/SPECS
echo "🔨 Building SRPM..."
rpmbuild -bs dms.spec
SRPM=$(ls ~/rpmbuild/SRPMS/*.src.rpm | tail -n 1)
SRPM_NAME=$(basename "$SRPM")
echo "srpm_path=$SRPM" >> $GITHUB_OUTPUT
echo "srpm_name=$SRPM_NAME" >> $GITHUB_OUTPUT
echo "✅ SRPM built: $SRPM_NAME"
echo ""
echo "=== SRPM Info ==="
rpm -qpi "$SRPM"
- name: Upload SRPM artifact
uses: actions/upload-artifact@v4
with:
name: dms-stable-srpm-${{ steps.version.outputs.version }}
path: ${{ steps.build.outputs.srpm_path }}
retention-days: 90
- name: Install Copr CLI
run: |
sudo apt-get install -y python3-pip
pip3 install copr-cli
mkdir -p ~/.config
cat > ~/.config/copr << EOF
[copr-cli]
@@ -279,30 +252,30 @@ jobs:
copr_url = https://copr.fedorainfracloud.org
EOF
chmod 600 ~/.config/copr
echo "✅ Copr CLI configured"
- name: Upload to Copr
run: |
SRPM="${{ steps.build.outputs.srpm_path }}"
VERSION="${{ steps.version.outputs.version }}"
echo "🚀 Uploading SRPM to avengemedia/dms..."
echo " SRPM: $(basename $SRPM)"
echo " Version: $VERSION"
BUILD_OUTPUT=$(copr-cli build avengemedia/dms "$SRPM" --nowait 2>&1)
echo "$BUILD_OUTPUT"
BUILD_ID=$(echo "$BUILD_OUTPUT" | grep -oP 'Build was added to.*\K[0-9]+' || echo "unknown")
if [ "$BUILD_ID" != "unknown" ]; then
echo "✅ Build submitted successfully!"
echo "🔗 https://copr.fedorainfracloud.org/coprs/avengemedia/dms/build/$BUILD_ID/"
else
echo "⚠️ Could not extract build ID, but upload may have succeeded"
fi
- name: Build summary
if: always()
run: |


@@ -4,107 +4,143 @@ on:
workflow_dispatch:
inputs:
package:
description: 'Package to update (dms, dms-git, or all)'
description: "Package to update (dms, dms-git, or all)"
required: false
default: 'all'
default: "all"
rebuild_release:
description: 'Release number for rebuilds (e.g., 2, 3, 4 to increment spec Release)'
description: "Release number for rebuilds (e.g., 2, 3, 4 to increment spec Release)"
required: false
default: ''
default: ""
push:
tags:
- 'v*'
- "v*"
schedule:
- cron: '0 */3 * * *' # Every 3 hours for dms-git builds
- cron: "0 */3 * * *" # Every 3 hours for dms-git builds
jobs:
check-updates:
name: Check for updates
runs-on: ubuntu-latest
outputs:
has_updates: ${{ steps.check.outputs.has_updates }}
packages: ${{ steps.check.outputs.packages }}
version: ${{ steps.check.outputs.version }}
steps:
- name: Checkout
uses: actions/checkout@v4
with:
fetch-depth: 0
- name: Install OSC
run: |
sudo apt-get update
sudo apt-get install -y osc
mkdir -p ~/.config/osc
cat > ~/.config/osc/oscrc << EOF
[general]
apiurl = https://api.opensuse.org
[https://api.opensuse.org]
user = ${{ secrets.OBS_USERNAME }}
pass = ${{ secrets.OBS_PASSWORD }}
EOF
chmod 600 ~/.config/osc/oscrc
- name: Check for updates
id: check
env:
OBS_USERNAME: ${{ secrets.OBS_USERNAME }}
OBS_PASSWORD: ${{ secrets.OBS_PASSWORD }}
run: |
# Helper function to check dms-git commit
check_dms_git() {
local CURRENT_COMMIT=$(git rev-parse --short=8 HEAD)
local OBS_SPEC=$(curl -s -u "$OBS_USERNAME:$OBS_PASSWORD" "https://api.opensuse.org/source/home:AvengeMedia:dms-git/dms-git/dms-git.spec" 2>/dev/null || echo "")
local OBS_COMMIT=$(echo "$OBS_SPEC" | grep "^Version:" | grep -oP '\.[a-f0-9]{8}' | tr -d '.' || echo "")
if [[ -n "$OBS_COMMIT" && "$CURRENT_COMMIT" == "$OBS_COMMIT" ]]; then
echo "📋 dms-git: Commit $CURRENT_COMMIT already exists, skipping"
return 1 # No update needed
else
echo "📋 dms-git: New commit $CURRENT_COMMIT (OBS has ${OBS_COMMIT:-none})"
return 0 # Update needed
fi
}
# Helper function to check dms stable tag
check_dms_stable() {
local LATEST_TAG=$(curl -s https://api.github.com/repos/AvengeMedia/DankMaterialShell/releases/latest | grep '"tag_name"' | sed 's/.*"tag_name": "v\?\([^"]*\)".*/\1/' || echo "")
local OBS_SPEC=$(curl -s -u "$OBS_USERNAME:$OBS_PASSWORD" "https://api.opensuse.org/source/home:AvengeMedia:dms/dms/dms.spec" 2>/dev/null || echo "")
local OBS_VERSION=$(echo "$OBS_SPEC" | grep "^Version:" | awk '{print $2}' | xargs || echo "")
if [[ -n "$LATEST_TAG" && "$LATEST_TAG" == "$OBS_VERSION" ]]; then
echo "📋 dms: Tag $LATEST_TAG already exists, skipping"
return 1 # No update needed
else
echo "📋 dms: New tag ${LATEST_TAG:-unknown} (OBS has ${OBS_VERSION:-none})"
return 0 # Update needed
fi
}
# Main logic
REBUILD="${{ github.event.inputs.rebuild_release }}"
if [[ "${{ github.event_name }}" == "push" && "${{ github.ref }}" =~ ^refs/tags/ ]]; then
# Tag push - always update stable package
echo "packages=dms" >> $GITHUB_OUTPUT
VERSION="${GITHUB_REF#refs/tags/}"
echo "version=$VERSION" >> $GITHUB_OUTPUT
echo "has_updates=true" >> $GITHUB_OUTPUT
echo "Triggered by tag: $VERSION (always update)"
elif [[ "${{ github.event_name }}" == "schedule" ]]; then
# Scheduled run - check dms-git only
echo "packages=dms-git" >> $GITHUB_OUTPUT
echo "Checking if dms-git source has changed..."
if check_dms_git; then
echo "has_updates=true" >> $GITHUB_OUTPUT
else
echo "has_updates=false" >> $GITHUB_OUTPUT
fi
# Get current commit hash (8 chars to match spec format)
CURRENT_COMMIT=$(git rev-parse --short=8 HEAD)
elif [[ -n "${{ github.event.inputs.package }}" ]]; then
# Manual workflow trigger
PKG="${{ github.event.inputs.package }}"
# Check OBS for last uploaded commit
OBS_BASE="$HOME/.cache/osc-checkouts"
mkdir -p "$OBS_BASE"
OBS_PROJECT="home:AvengeMedia:dms-git"
if [[ -n "$REBUILD" ]]; then
# Rebuild requested - always proceed
echo "packages=$PKG" >> $GITHUB_OUTPUT
echo "has_updates=true" >> $GITHUB_OUTPUT
echo "🔄 Manual rebuild requested: $PKG (ppa$REBUILD)"
if [[ -d "$OBS_BASE/$OBS_PROJECT/dms-git" ]]; then
cd "$OBS_BASE/$OBS_PROJECT/dms-git"
osc up -q 2>/dev/null || true
elif [[ "$PKG" == "all" ]]; then
# Check each package and build list of those needing updates
PACKAGES_TO_UPDATE=()
check_dms_git && PACKAGES_TO_UPDATE+=("dms-git")
check_dms_stable && PACKAGES_TO_UPDATE+=("dms")
# Extract commit hash from spec Version line; format looks like 0.6.2+git2264.a679be68
if [[ -f "dms-git.spec" ]]; then
OBS_COMMIT=$(grep "^Version:" "dms-git.spec" | grep -oP '\.[a-f0-9]{8}' | tr -d '.' || echo "")
if [[ -n "$OBS_COMMIT" ]]; then
if [[ "$CURRENT_COMMIT" == "$OBS_COMMIT" ]]; then
echo "has_updates=false" >> $GITHUB_OUTPUT
echo "📋 Commit $CURRENT_COMMIT already uploaded to OBS, skipping"
else
echo "has_updates=true" >> $GITHUB_OUTPUT
echo "📋 New commit detected: $CURRENT_COMMIT (OBS has $OBS_COMMIT)"
fi
else
echo "has_updates=true" >> $GITHUB_OUTPUT
echo "📋 Could not extract OBS commit, proceeding with update"
fi
else
if [[ ${#PACKAGES_TO_UPDATE[@]} -gt 0 ]]; then
echo "packages=${PACKAGES_TO_UPDATE[*]}" >> $GITHUB_OUTPUT
echo "has_updates=true" >> $GITHUB_OUTPUT
echo "📋 No spec file in OBS, proceeding with update"
echo "✓ Packages to update: ${PACKAGES_TO_UPDATE[*]}"
else
echo "packages=" >> $GITHUB_OUTPUT
echo "has_updates=false" >> $GITHUB_OUTPUT
echo "✓ All packages up to date"
fi
elif [[ "$PKG" == "dms-git" ]]; then
if check_dms_git; then
echo "packages=$PKG" >> $GITHUB_OUTPUT
echo "has_updates=true" >> $GITHUB_OUTPUT
else
echo "packages=" >> $GITHUB_OUTPUT
echo "has_updates=false" >> $GITHUB_OUTPUT
fi
elif [[ "$PKG" == "dms" ]]; then
if check_dms_stable; then
echo "packages=$PKG" >> $GITHUB_OUTPUT
echo "has_updates=true" >> $GITHUB_OUTPUT
else
echo "packages=" >> $GITHUB_OUTPUT
echo "has_updates=false" >> $GITHUB_OUTPUT
fi
cd "${{ github.workspace }}"
else
# Unknown package - proceed anyway
echo "packages=$PKG" >> $GITHUB_OUTPUT
echo "has_updates=true" >> $GITHUB_OUTPUT
echo "📋 First upload to OBS, update needed"
echo "Manual trigger: $PKG"
fi
elif [[ -n "${{ github.event.inputs.package }}" ]]; then
echo "packages=${{ github.event.inputs.package }}" >> $GITHUB_OUTPUT
echo "has_updates=true" >> $GITHUB_OUTPUT
echo "Manual trigger: ${{ github.event.inputs.package }}"
else
# Fallback - proceed
echo "packages=all" >> $GITHUB_OUTPUT
echo "has_updates=true" >> $GITHUB_OUTPUT
fi
@@ -113,16 +149,14 @@ jobs:
name: Upload to OBS
needs: check-updates
runs-on: ubuntu-latest
if: |
github.event_name == 'workflow_dispatch' ||
needs.check-updates.outputs.has_updates == 'true'
if: needs.check-updates.outputs.has_updates == 'true'
steps:
- name: Checkout
uses: actions/checkout@v4
with:
fetch-depth: 0
- name: Determine packages to update
id: packages
run: |
@@ -135,75 +169,57 @@ jobs:
echo "packages=${{ needs.check-updates.outputs.packages }}" >> $GITHUB_OUTPUT
echo "Triggered by schedule: updating git package"
elif [[ -n "${{ github.event.inputs.package }}" ]]; then
echo "packages=${{ github.event.inputs.package }}" >> $GITHUB_OUTPUT
echo "Manual trigger: ${{ github.event.inputs.package }}"
# Use filtered packages from check-updates when package="all" and no rebuild requested
if [[ "${{ github.event.inputs.package }}" == "all" ]] && [[ -z "${{ github.event.inputs.rebuild_release }}" ]]; then
echo "packages=${{ needs.check-updates.outputs.packages }}" >> $GITHUB_OUTPUT
echo "Manual trigger: all (filtered to: ${{ needs.check-updates.outputs.packages }})"
else
echo "packages=${{ github.event.inputs.package }}" >> $GITHUB_OUTPUT
echo "Manual trigger: ${{ github.event.inputs.package }}"
fi
else
echo "packages=${{ needs.check-updates.outputs.packages }}" >> $GITHUB_OUTPUT
fi
- name: Update dms-git spec version
if: contains(steps.packages.outputs.packages, 'dms-git') || steps.packages.outputs.packages == 'all'
run: |
# Get commit info for dms-git versioning
COMMIT_HASH=$(git rev-parse --short=8 HEAD)
COMMIT_COUNT=$(git rev-list --count HEAD)
BASE_VERSION=$(grep -oP '^Version:\s+\K[0-9.]+' distro/opensuse/dms.spec | head -1 || echo "0.6.2")
NEW_VERSION="${BASE_VERSION}+git${COMMIT_COUNT}.${COMMIT_HASH}"
echo "📦 Updating dms-git.spec to version: $NEW_VERSION"
# Update version in spec
sed -i "s/^Version:.*/Version: $NEW_VERSION/" distro/opensuse/dms-git.spec
# Add changelog entry
# Single changelog entry (git snapshots don't need history)
DATE_STR=$(date "+%a %b %d %Y")
CHANGELOG_ENTRY="* $DATE_STR Avenge Media <AvengeMedia.US@gmail.com> - ${NEW_VERSION}-1\n- Git snapshot (commit $COMMIT_COUNT: $COMMIT_HASH)"
sed -i "/%changelog/a\\$CHANGELOG_ENTRY" distro/opensuse/dms-git.spec
LOCAL_SPEC_HEAD=$(sed -n '1,/%changelog/{ /%changelog/d; p }' distro/opensuse/dms-git.spec)
{
echo "$LOCAL_SPEC_HEAD"
echo "%changelog"
echo "* $DATE_STR Avenge Media <AvengeMedia.US@gmail.com> - ${NEW_VERSION}-1"
echo "- Git snapshot (commit $COMMIT_COUNT: $COMMIT_HASH)"
} > distro/opensuse/dms-git.spec
- name: Update Debian dms-git changelog version
if: contains(steps.packages.outputs.packages, 'dms-git') || steps.packages.outputs.packages == 'all'
run: |
# Get commit info for dms-git versioning
COMMIT_HASH=$(git rev-parse --short=8 HEAD)
COMMIT_COUNT=$(git rev-list --count HEAD)
BASE_VERSION=$(grep -oP '^Version:\s+\K[0-9.]+' distro/opensuse/dms.spec | head -1 || echo "0.6.2")
# Debian version format: 0.6.2+git2256.9162e314
NEW_VERSION="${BASE_VERSION}+git${COMMIT_COUNT}.${COMMIT_HASH}"
echo "📦 Updating Debian dms-git changelog to version: $NEW_VERSION"
# Single changelog entry (git snapshots don't need history)
CHANGELOG_DATE=$(date -R)
CHANGELOG_FILE="distro/debian/dms-git/debian/changelog"
# Get current version from changelog
CURRENT_VERSION=$(head -1 "$CHANGELOG_FILE" | sed 's/.*(\([^)]*\)).*/\1/')
echo "Current Debian version: $CURRENT_VERSION"
echo "New version: $NEW_VERSION"
# Only update if version changed
if [ "$CURRENT_VERSION" != "$NEW_VERSION" ]; then
# Create new changelog entry at top
TEMP_CHANGELOG=$(mktemp)
cat > "$TEMP_CHANGELOG" << EOF
dms-git ($NEW_VERSION) nightly; urgency=medium
* Git snapshot (commit $COMMIT_COUNT: $COMMIT_HASH)
-- Avenge Media <AvengeMedia.US@gmail.com> $CHANGELOG_DATE
EOF
# Prepend to existing changelog
cat "$CHANGELOG_FILE" >> "$TEMP_CHANGELOG"
mv "$TEMP_CHANGELOG" "$CHANGELOG_FILE"
echo "✓ Updated Debian changelog: $CURRENT_VERSION → $NEW_VERSION"
else
echo "✓ Debian changelog already at version $NEW_VERSION"
fi
{
echo "dms-git ($NEW_VERSION) nightly; urgency=medium"
echo ""
echo " * Git snapshot (commit $COMMIT_COUNT: $COMMIT_HASH)"
echo ""
echo " -- Avenge Media <AvengeMedia.US@gmail.com> $CHANGELOG_DATE"
} > "distro/debian/dms-git/debian/changelog"
- name: Update dms stable version
if: steps.packages.outputs.version != ''
@@ -211,21 +227,48 @@ jobs:
VERSION="${{ steps.packages.outputs.version }}"
VERSION_NO_V="${VERSION#v}"
echo "Updating packaging to version $VERSION_NO_V"
# Update openSUSE dms spec (stable only)
sed -i "s/^Version:.*/Version: $VERSION_NO_V/" distro/opensuse/dms.spec
# Update Debian _service files
# Single changelog entry (full history on OBS website)
DATE_STR=$(date "+%a %b %d %Y")
LOCAL_SPEC_HEAD=$(sed -n '1,/%changelog/{ /%changelog/d; p }' distro/opensuse/dms.spec)
{
echo "$LOCAL_SPEC_HEAD"
echo "%changelog"
echo "* $DATE_STR AvengeMedia <maintainer@avengemedia.com> - ${VERSION_NO_V}-1"
echo "- Update to stable $VERSION release"
} > distro/opensuse/dms.spec
# Update Debian _service files (both tar_scm and download_url formats)
for service in distro/debian/*/_service; do
if [[ -f "$service" ]]; then
# Update tar_scm revision parameter (for dms-git)
sed -i "s|<param name=\"revision\">v[0-9.]*</param>|<param name=\"revision\">$VERSION</param>|" "$service"
# Update download_url paths (for dms stable)
sed -i "s|/v[0-9.]\+/|/$VERSION/|g" "$service"
sed -i "s|/tags/v[0-9.]\+\.tar\.gz|/tags/$VERSION.tar.gz|g" "$service"
fi
done
# Update Debian changelog for dms stable (single entry, history on OBS website)
if [[ -f "distro/debian/dms/debian/changelog" ]]; then
CHANGELOG_DATE=$(date -R)
{
echo "dms ($VERSION_NO_V) stable; urgency=medium"
echo ""
echo " * Update to $VERSION stable release"
echo ""
echo " -- Avenge Media <AvengeMedia.US@gmail.com> $CHANGELOG_DATE"
} > "distro/debian/dms/debian/changelog"
echo "✓ Updated Debian changelog to $VERSION_NO_V"
fi
- name: Install Go
uses: actions/setup-go@v5
with:
go-version: '1.24'
go-version: "1.24"
- name: Install OSC
run: |
@@ -245,32 +288,76 @@ jobs:
- name: Upload to OBS
env:
FORCE_REBUILD: ${{ github.event_name == 'workflow_dispatch' && 'true' || '' }}
REBUILD_RELEASE: ${{ github.event.inputs.rebuild_release }}
run: |
PACKAGES="${{ steps.packages.outputs.packages }}"
if [[ -z "$PACKAGES" ]]; then
echo "✓ No packages need uploading. All up to date!"
exit 0
fi
MESSAGE="Automated update from GitHub Actions"
if [[ -n "${{ steps.packages.outputs.version }}" ]]; then
MESSAGE="Update to ${{ steps.packages.outputs.version }}"
fi
if [[ "$PACKAGES" == "all" ]]; then
bash distro/scripts/obs-upload.sh dms "$MESSAGE"
bash distro/scripts/obs-upload.sh dms-git "Automated git update"
else
bash distro/scripts/obs-upload.sh "$PACKAGES" "$MESSAGE"
fi
# PACKAGES can be space-separated list (e.g., "dms-git dms" from "all" check)
# Loop through each package and upload
for PKG in $PACKAGES; do
echo ""
echo "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━"
echo "Uploading $PKG to OBS..."
if [[ -n "$REBUILD_RELEASE" ]]; then
echo "🔄 Using rebuild release number: ppa$REBUILD_RELEASE"
fi
echo "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━"
if [[ "$PKG" == "dms-git" ]]; then
bash distro/scripts/obs-upload.sh dms-git "Automated git update"
else
bash distro/scripts/obs-upload.sh "$PKG" "$MESSAGE"
fi
done
- name: Summary
if: always()
run: |
echo "### OBS Package Update Complete" >> $GITHUB_STEP_SUMMARY
echo "### OBS Package Upload Summary" >> $GITHUB_STEP_SUMMARY
echo "" >> $GITHUB_STEP_SUMMARY
echo "- **Packages**: ${{ steps.packages.outputs.packages }}" >> $GITHUB_STEP_SUMMARY
if [[ -n "${{ steps.packages.outputs.version }}" ]]; then
echo "- **Version**: ${{ steps.packages.outputs.version }}" >> $GITHUB_STEP_SUMMARY
PACKAGES="${{ steps.packages.outputs.packages }}"
if [[ -z "$PACKAGES" ]]; then
echo "**Status:** ✅ All packages up to date (no uploads needed)" >> $GITHUB_STEP_SUMMARY
echo "" >> $GITHUB_STEP_SUMMARY
echo "All packages are current. Run completed successfully." >> $GITHUB_STEP_SUMMARY
else
echo "**Packages Uploaded:**" >> $GITHUB_STEP_SUMMARY
echo "" >> $GITHUB_STEP_SUMMARY
for PKG in $PACKAGES; do
case "$PKG" in
dms)
echo "- ✅ **dms** → [View builds](https://build.opensuse.org/package/show/home:AvengeMedia:dms/dms)" >> $GITHUB_STEP_SUMMARY
;;
dms-git)
echo "- ✅ **dms-git** → [View builds](https://build.opensuse.org/package/show/home:AvengeMedia:dms-git/dms-git)" >> $GITHUB_STEP_SUMMARY
;;
esac
done
echo "" >> $GITHUB_STEP_SUMMARY
if [[ -n "${{ github.event.inputs.rebuild_release }}" ]]; then
echo "**Rebuild Number:** ppa${{ github.event.inputs.rebuild_release }}" >> $GITHUB_STEP_SUMMARY
echo "" >> $GITHUB_STEP_SUMMARY
fi
if [[ -n "${{ steps.packages.outputs.version }}" ]]; then
echo "**Version:** ${{ steps.packages.outputs.version }}" >> $GITHUB_STEP_SUMMARY
echo "" >> $GITHUB_STEP_SUMMARY
fi
echo "Monitor build progress on [OBS project page](https://build.opensuse.org/project/show/home:AvengeMedia)." >> $GITHUB_STEP_SUMMARY
fi
if [[ "${{ needs.check-updates.outputs.has_updates }}" == "false" ]]; then
echo "- **Status**: Skipped (no changes detected)" >> $GITHUB_STEP_SUMMARY
fi
echo "- **Project**: https://build.opensuse.org/project/show/home:AvengeMedia" >> $GITHUB_STEP_SUMMARY


@@ -4,15 +4,15 @@ on:
workflow_dispatch:
inputs:
package:
description: 'Package to upload (dms, dms-git, dms-greeter, or all)'
description: "Package to upload (dms, dms-git, dms-greeter, or all)"
required: false
default: 'dms-git'
default: "dms-git"
rebuild_release:
description: 'Release number for rebuilds (e.g., 2, 3, 4 for ppa2, ppa3, ppa4)'
description: "Release number for rebuilds (e.g., 2, 3, 4 for ppa2, ppa3, ppa4)"
required: false
default: ''
default: ""
schedule:
- cron: '0 */3 * * *' # Every 3 hours for dms-git builds
- cron: "0 */3 * * *" # Every 3 hours for dms-git builds
jobs:
check-updates:
@@ -32,41 +32,112 @@ jobs:
- name: Check for updates
id: check
run: |
if [[ "${{ github.event_name }}" == "schedule" ]]; then
echo "packages=dms-git" >> $GITHUB_OUTPUT
echo "Checking if dms-git source has changed..."
# Helper function to check dms-git commit
check_dms_git() {
local CURRENT_COMMIT=$(git rev-parse --short=8 HEAD)
local PPA_VERSION=$(curl -s "https://api.launchpad.net/1.0/~avengemedia/+archive/ubuntu/dms-git?ws.op=getPublishedSources&source_name=dms-git&status=Published" | grep -oP '"source_package_version":\s*"\K[^"]+' | head -1 || echo "")
local PPA_COMMIT=$(echo "$PPA_VERSION" | grep -oP '\.[a-f0-9]{8}' | tr -d '.' || echo "")
# Get current commit hash (8 chars to match changelog format)
CURRENT_COMMIT=$(git rev-parse --short=8 HEAD)
# Extract commit hash from changelog
# Format: dms-git (0.6.2+git2264.c5c5ce84) questing; urgency=medium
CHANGELOG_FILE="distro/ubuntu/dms-git/debian/changelog"
if [[ -f "$CHANGELOG_FILE" ]]; then
CHANGELOG_COMMIT=$(head -1 "$CHANGELOG_FILE" | grep -oP '\.[a-f0-9]{8}' | tr -d '.' || echo "")
if [[ -n "$CHANGELOG_COMMIT" ]]; then
if [[ "$CURRENT_COMMIT" == "$CHANGELOG_COMMIT" ]]; then
echo "has_updates=false" >> $GITHUB_OUTPUT
echo "📋 Commit $CURRENT_COMMIT already in changelog, skipping upload"
else
echo "has_updates=true" >> $GITHUB_OUTPUT
echo "📋 New commit detected: $CURRENT_COMMIT (changelog has $CHANGELOG_COMMIT)"
fi
else
echo "has_updates=true" >> $GITHUB_OUTPUT
echo "📋 Could not extract commit from changelog, proceeding with upload"
fi
if [[ -n "$PPA_COMMIT" && "$CURRENT_COMMIT" == "$PPA_COMMIT" ]]; then
echo "📋 dms-git: Commit $CURRENT_COMMIT already exists, skipping"
return 1 # No update needed
else
echo "has_updates=true" >> $GITHUB_OUTPUT
echo "📋 No changelog file found, proceeding with upload"
echo "📋 dms-git: New commit $CURRENT_COMMIT (PPA has ${PPA_COMMIT:-none})"
return 0 # Update needed
fi
}
# Helper function to check stable package tag
check_stable_package() {
local PKG="$1"
local PPA_NAME="$2"
local LATEST_TAG=$(curl -s https://api.github.com/repos/AvengeMedia/DankMaterialShell/releases/latest | grep '"tag_name"' | sed 's/.*"tag_name": "v\?\([^"]*\)".*/\1/' || echo "")
local PPA_VERSION=$(curl -s "https://api.launchpad.net/1.0/~avengemedia/+archive/ubuntu/$PPA_NAME?ws.op=getPublishedSources&source_name=$PKG&status=Published" | grep -oP '"source_package_version":\s*"\K[^"]+' | head -1 || echo "")
local PPA_BASE_VERSION=$(echo "$PPA_VERSION" | sed 's/ppa[0-9]*$//')
if [[ -n "$LATEST_TAG" && "$LATEST_TAG" == "$PPA_BASE_VERSION" ]]; then
echo "📋 $PKG: Tag $LATEST_TAG already exists, skipping"
return 1 # No update needed
else
echo "📋 $PKG: New tag ${LATEST_TAG:-unknown} (PPA has ${PPA_BASE_VERSION:-none})"
return 0 # Update needed
fi
}
# Main logic
REBUILD="${{ github.event.inputs.rebuild_release }}"
if [[ "${{ github.event_name }}" == "schedule" ]]; then
# Scheduled run - check dms-git only
echo "packages=dms-git" >> $GITHUB_OUTPUT
if check_dms_git; then
echo "has_updates=true" >> $GITHUB_OUTPUT
else
echo "has_updates=false" >> $GITHUB_OUTPUT
fi
elif [[ -n "${{ github.event.inputs.package }}" ]]; then
echo "packages=${{ github.event.inputs.package }}" >> $GITHUB_OUTPUT
echo "has_updates=true" >> $GITHUB_OUTPUT
echo "Manual trigger: ${{ github.event.inputs.package }}"
# Manual workflow trigger
PKG="${{ github.event.inputs.package }}"
if [[ -n "$REBUILD" ]]; then
# Rebuild requested - always proceed
echo "packages=$PKG" >> $GITHUB_OUTPUT
echo "has_updates=true" >> $GITHUB_OUTPUT
echo "🔄 Manual rebuild requested: $PKG (ppa$REBUILD)"
elif [[ "$PKG" == "all" ]]; then
# Check each package and build list of those needing updates
PACKAGES_TO_UPDATE=()
check_dms_git && PACKAGES_TO_UPDATE+=("dms-git")
check_stable_package "dms" "dms" && PACKAGES_TO_UPDATE+=("dms")
check_stable_package "dms-greeter" "danklinux" && PACKAGES_TO_UPDATE+=("dms-greeter")
if [[ ${#PACKAGES_TO_UPDATE[@]} -gt 0 ]]; then
echo "packages=${PACKAGES_TO_UPDATE[*]}" >> $GITHUB_OUTPUT
echo "has_updates=true" >> $GITHUB_OUTPUT
echo "✓ Packages to update: ${PACKAGES_TO_UPDATE[*]}"
else
echo "packages=" >> $GITHUB_OUTPUT
echo "has_updates=false" >> $GITHUB_OUTPUT
echo "✓ All packages up to date"
fi
elif [[ "$PKG" == "dms-git" ]]; then
if check_dms_git; then
echo "packages=$PKG" >> $GITHUB_OUTPUT
echo "has_updates=true" >> $GITHUB_OUTPUT
else
echo "packages=" >> $GITHUB_OUTPUT
echo "has_updates=false" >> $GITHUB_OUTPUT
fi
elif [[ "$PKG" == "dms" ]]; then
if check_stable_package "dms" "dms"; then
echo "packages=$PKG" >> $GITHUB_OUTPUT
echo "has_updates=true" >> $GITHUB_OUTPUT
else
echo "packages=" >> $GITHUB_OUTPUT
echo "has_updates=false" >> $GITHUB_OUTPUT
fi
elif [[ "$PKG" == "dms-greeter" ]]; then
if check_stable_package "dms-greeter" "danklinux"; then
echo "packages=$PKG" >> $GITHUB_OUTPUT
echo "has_updates=true" >> $GITHUB_OUTPUT
else
echo "packages=" >> $GITHUB_OUTPUT
echo "has_updates=false" >> $GITHUB_OUTPUT
fi
else
# Unknown package - proceed anyway
echo "packages=$PKG" >> $GITHUB_OUTPUT
echo "has_updates=true" >> $GITHUB_OUTPUT
echo "Manual trigger: $PKG"
fi
else
# Fallback
echo "packages=dms-git" >> $GITHUB_OUTPUT
echo "has_updates=true" >> $GITHUB_OUTPUT
fi
@@ -75,21 +146,19 @@ jobs:
name: Upload to PPA
needs: check-updates
runs-on: ubuntu-latest
if: |
github.event_name == 'workflow_dispatch' ||
needs.check-updates.outputs.has_updates == 'true'
if: needs.check-updates.outputs.has_updates == 'true'
steps:
- name: Checkout
uses: actions/checkout@v4
with:
fetch-depth: 0
- name: Set up Go
uses: actions/setup-go@v5
with:
go-version: '1.24'
cache: false
go-version: "1.24"
cache: false
- name: Install build dependencies
run: |
@@ -102,7 +171,7 @@ jobs:
build-essential \
fakeroot \
dpkg-dev
- name: Configure GPG
env:
GPG_KEY: ${{ secrets.GPG_PRIVATE_KEY }}
@@ -110,79 +179,101 @@ jobs:
echo "$GPG_KEY" | gpg --import
GPG_KEY_ID=$(gpg --list-secret-keys --keyid-format LONG | grep sec | awk '{print $2}' | cut -d'/' -f2)
echo "DEBSIGN_KEYID=$GPG_KEY_ID" >> $GITHUB_ENV
- name: Determine packages to upload
id: packages
run: |
# Use packages determined by check-updates job
echo "packages=${{ needs.check-updates.outputs.packages }}" >> $GITHUB_OUTPUT
if [[ "${{ github.event_name }}" == "schedule" ]]; then
echo "packages=${{ needs.check-updates.outputs.packages }}" >> $GITHUB_OUTPUT
echo "Triggered by schedule: uploading git package"
elif [[ -n "${{ github.event.inputs.package }}" ]]; then
echo "packages=${{ github.event.inputs.package }}" >> $GITHUB_OUTPUT
echo "Manual trigger: ${{ github.event.inputs.package }}"
else
echo "packages=${{ needs.check-updates.outputs.packages }}" >> $GITHUB_OUTPUT
echo "Manual trigger: ${{ needs.check-updates.outputs.packages }}"
fi
- name: Upload to PPA
env:
REBUILD_RELEASE: ${{ github.event.inputs.rebuild_release }}
run: |
PACKAGES="${{ steps.packages.outputs.packages }}"
if [[ "$PACKAGES" == "all" ]]; then
REBUILD_RELEASE="${{ github.event.inputs.rebuild_release }}"
if [[ -z "$PACKAGES" ]]; then
echo "✓ No packages need uploading. All up to date!"
exit 0
fi
# Export REBUILD_RELEASE so ppa-build.sh can use it
if [[ -n "$REBUILD_RELEASE" ]]; then
export REBUILD_RELEASE
echo "✓ Using rebuild release number: ppa$REBUILD_RELEASE"
fi
# PACKAGES can be space-separated list (e.g., "dms-git dms" from "all" check)
# Loop through each package and upload
for PKG in $PACKAGES; do
# Map package to PPA name
case "$PKG" in
dms)
PPA_NAME="dms"
;;
dms-git)
PPA_NAME="dms-git"
;;
dms-greeter)
PPA_NAME="danklinux"
;;
*)
echo "⚠️ Unknown package: $PKG, skipping"
continue
;;
esac
echo ""
echo "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━"
echo "Uploading dms to PPA..."
if [ -n "$REBUILD_RELEASE" ]; then
echo "Uploading $PKG to PPA $PPA_NAME..."
if [[ -n "$REBUILD_RELEASE" ]]; then
echo "🔄 Using rebuild release number: ppa$REBUILD_RELEASE"
fi
echo "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━"
bash distro/scripts/ppa-upload.sh "distro/ubuntu/dms" dms questing
echo ""
echo "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━"
echo "Uploading dms-git to PPA..."
echo "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━"
bash distro/scripts/ppa-upload.sh "distro/ubuntu/dms-git" dms-git questing
echo ""
echo "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━"
echo "Uploading dms-greeter to PPA..."
echo "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━"
bash distro/scripts/ppa-upload.sh "distro/ubuntu/dms-greeter" danklinux questing
else
PPA_NAME="$PACKAGES"
echo "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━"
echo "Uploading $PACKAGES to PPA..."
echo "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━"
bash distro/scripts/ppa-upload.sh "distro/ubuntu/$PACKAGES" "$PPA_NAME" questing
fi
- name: Summary
run: |
echo "### PPA Package Upload Complete" >> $GITHUB_STEP_SUMMARY
echo "" >> $GITHUB_STEP_SUMMARY
echo "- **Packages**: ${{ steps.packages.outputs.packages }}" >> $GITHUB_STEP_SUMMARY
bash distro/scripts/ppa-upload.sh "$PKG" "$PPA_NAME" questing ${REBUILD_RELEASE:+"$REBUILD_RELEASE"}
done
if [[ "${{ needs.check-updates.outputs.has_updates }}" == "false" ]]; then
echo "- **Status**: Skipped (no changes detected)" >> $GITHUB_STEP_SUMMARY
fi
- name: Summary
if: always()
run: |
echo "### PPA Package Upload Summary" >> $GITHUB_STEP_SUMMARY
echo "" >> $GITHUB_STEP_SUMMARY
PACKAGES="${{ steps.packages.outputs.packages }}"
if [[ "$PACKAGES" == "all" ]]; then
echo "- **PPA dms**: https://launchpad.net/~avengemedia/+archive/ubuntu/dms/+packages" >> $GITHUB_STEP_SUMMARY
echo "- **PPA dms-git**: https://launchpad.net/~avengemedia/+archive/ubuntu/dms-git/+packages" >> $GITHUB_STEP_SUMMARY
echo "- **PPA danklinux**: https://launchpad.net/~avengemedia/+archive/ubuntu/danklinux/+packages" >> $GITHUB_STEP_SUMMARY
elif [[ "$PACKAGES" == "dms" ]]; then
echo "- **PPA**: https://launchpad.net/~avengemedia/+archive/ubuntu/dms/+packages" >> $GITHUB_STEP_SUMMARY
elif [[ "$PACKAGES" == "dms-git" ]]; then
echo "- **PPA**: https://launchpad.net/~avengemedia/+archive/ubuntu/dms-git/+packages" >> $GITHUB_STEP_SUMMARY
elif [[ "$PACKAGES" == "dms-greeter" ]]; then
echo "- **PPA**: https://launchpad.net/~avengemedia/+archive/ubuntu/danklinux/+packages" >> $GITHUB_STEP_SUMMARY
if [[ -z "$PACKAGES" ]]; then
echo "**Status:** ✅ All packages up to date (no uploads needed)" >> $GITHUB_STEP_SUMMARY
echo "" >> $GITHUB_STEP_SUMMARY
echo "All packages are current. Run will complete successfully." >> $GITHUB_STEP_SUMMARY
else
echo "**Packages Uploaded:**" >> $GITHUB_STEP_SUMMARY
echo "" >> $GITHUB_STEP_SUMMARY
for PKG in $PACKAGES; do
case "$PKG" in
dms)
echo "- ✅ **dms** → [View builds](https://launchpad.net/~avengemedia/+archive/ubuntu/dms/+packages)" >> $GITHUB_STEP_SUMMARY
;;
dms-git)
echo "- ✅ **dms-git** → [View builds](https://launchpad.net/~avengemedia/+archive/ubuntu/dms-git/+packages)" >> $GITHUB_STEP_SUMMARY
;;
dms-greeter)
echo "- ✅ **dms-greeter** → [View builds](https://launchpad.net/~avengemedia/+archive/ubuntu/danklinux/+packages)" >> $GITHUB_STEP_SUMMARY
;;
esac
done
echo "" >> $GITHUB_STEP_SUMMARY
if [[ -n "${{ github.event.inputs.rebuild_release }}" ]]; then
echo "**Rebuild Number:** ppa${{ github.event.inputs.rebuild_release }}" >> $GITHUB_STEP_SUMMARY
echo "" >> $GITHUB_STEP_SUMMARY
fi
echo "Builds will appear once Launchpad processes the uploads." >> $GITHUB_STEP_SUMMARY
fi
if [[ -n "${{ steps.packages.outputs.version }}" ]]; then
echo "- **Version**: ${{ steps.packages.outputs.version }}" >> $GITHUB_STEP_SUMMARY
fi
echo "" >> $GITHUB_STEP_SUMMARY
echo "Builds will appear once Launchpad processes the uploads." >> $GITHUB_STEP_SUMMARY

.github/workflows/stable.yml

@@ -0,0 +1,19 @@
name: Update stable branch
on:
push:
tags:
- "v*"
jobs:
update-stable:
runs-on: ubuntu-latest
permissions:
contents: write
steps:
- uses: actions/checkout@v4
with:
fetch-depth: 0
- name: Push to stable branch
run: git push origin HEAD:refs/heads/stable --force
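Because this workflow force-pushes `stable` on every tag, a local checkout that tracks the branch cannot simply fast-forward. A small sketch of how a consumer of the repo might follow it (assuming an existing clone; not part of the workflow):

```bash
# The stable branch is rewritten on each release, so replace the local copy outright
git fetch origin stable
git checkout -B stable origin/stable
```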


@@ -36,7 +36,7 @@ jobs:
run: |
set -euo pipefail
echo "Attempting nix build to get new vendorHash..."
if output=$(nix build .#dmsCli 2>&1); then
if output=$(nix build .#dms-shell 2>&1); then
echo "Build succeeded, no hash update needed"
exit 0
fi
@@ -46,7 +46,7 @@ jobs:
[ "$current_hash" = "$new_hash" ] && { echo "vendorHash already up to date"; exit 0; }
sed -i "s|vendorHash = \"$current_hash\"|vendorHash = \"$new_hash\"|" flake.nix
echo "Verifying build with new vendorHash..."
nix build .#dmsCli
nix build .#dms-shell
echo "vendorHash updated successfully!"
- name: Commit and push vendorHash update

.gitignore

@@ -104,12 +104,6 @@ go.work.sum
bin/
# Extracted source trees in Ubuntu package directories
distro/ubuntu/*/dms-git-repo/
distro/ubuntu/*/DankMaterialShell-*/
distro/ubuntu/danklinux/*/dsearch-*/
distro/ubuntu/danklinux/*/dgop-*/
# direnv
.envrc
.direnv/

.pre-commit-config.yaml

@@ -0,0 +1,12 @@
repos:
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v6.0.0
hooks:
- id: trailing-whitespace
- id: check-yaml
- id: end-of-file-fixer
- repo: https://github.com/shellcheck-py/shellcheck-py
rev: v0.10.0.1
hooks:
- id: shellcheck
args: [-e, SC2164, -e, SC2001, -e, SC2012, -e, SC2317]
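The hooks in this config can be exercised locally before pushing. A minimal sketch, assuming the standard `pre-commit` tool is installed (the contributing guide recommends `prek`, which is expected to accept the same commands):

```bash
# Install the git hook once, then run every configured hook against the whole tree
pre-commit install
pre-commit run --all-files

# With prek (as suggested in CONTRIBUTING), the equivalent is expected to be:
# prek install && prek run --all-files
```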

CHANGELOG.MD

@@ -0,0 +1,6 @@
# 1.2.0
- Added clipboard and clipboard history integration
- Added swipe to dismiss notification popups and from center
- Added paste from clipboard history view - requires wtype
- Optimize surface damage of OSD & Toast


@@ -6,10 +6,10 @@ To contribute fork this repository, make your changes, and open a pull request.
## Setup
Enable pre-commit hooks to catch CI failures before pushing:
Install [prek](https://prek.j178.dev/) then activate pre-commit hooks:
```bash
git config core.hooksPath .githooks
prek install
```
### Nix Development Shell
@@ -21,6 +21,7 @@ nix develop
```
This will provide:
- Go 1.24 toolchain (go, gopls, delve, go-tools) and GNU Make
- Quickshell and required QML packages
- Properly configured QML2_IMPORT_PATH
@@ -54,6 +55,20 @@ touch .qmlls.ini
5. Make your changes, test, and open a pull request.
### I18n/Localization
When adding user-facing strings, ensure they are wrapped in `I18n.tr()` with context, for example:
```qml
import qs.Common
Text {
text: I18n.tr("Hello World", "<This is context for the translators, example> Hello world greeting that appears on the lock screen")
}
```
Preferably, keep new terms to a minimum and re-use existing terms where possible; see `quickshell/translations/en.json` for the list of existing terms. (This isn't always possible, but, for example, instead of adding `Auto-connect` you would re-use `Autoconnect`, since it is already translated.)
### GO (`core` directory)
1. Install the [Go Extension](https://code.visualstudio.com/docs/languages/go)


@@ -5,21 +5,21 @@
<img src="assets/danklogo.svg" alt="DankMaterialShell" width="200">
</a>
### A modern desktop shell for Wayland
### A modern desktop shell for Wayland
Built with [Quickshell](https://quickshell.org/) and [Go](https://go.dev/)
Built with [Quickshell](https://quickshell.org/) and [Go](https://go.dev/)
[![Documentation](https://img.shields.io/badge/docs-danklinux.com-9ccbfb?style=for-the-badge&labelColor=101418)](https://danklinux.com/docs)
[![GitHub stars](https://img.shields.io/github/stars/AvengeMedia/DankMaterialShell?style=for-the-badge&labelColor=101418&color=ffd700)](https://github.com/AvengeMedia/DankMaterialShell/stargazers)
[![GitHub License](https://img.shields.io/github/license/AvengeMedia/DankMaterialShell?style=for-the-badge&labelColor=101418&color=b9c8da)](https://github.com/AvengeMedia/DankMaterialShell/blob/master/LICENSE)
[![GitHub release](https://img.shields.io/github/v/release/AvengeMedia/DankMaterialShell?style=for-the-badge&labelColor=101418&color=9ccbfb)](https://github.com/AvengeMedia/DankMaterialShell/releases)
[![AUR version](https://img.shields.io/aur/version/dms-shell-bin?style=for-the-badge&labelColor=101418&color=9ccbfb)](https://aur.archlinux.org/packages/dms-shell-bin)
[![AUR version (git)](https://img.shields.io/aur/version/dms-shell-git?style=for-the-badge&labelColor=101418&color=9ccbfb&label=AUR%20(git))](https://aur.archlinux.org/packages/dms-shell-git)
[![AUR version (git)](<https://img.shields.io/aur/version/dms-shell-git?style=for-the-badge&labelColor=101418&color=9ccbfb&label=AUR%20(git)>)](https://aur.archlinux.org/packages/dms-shell-git)
[![Ko-Fi donate](https://img.shields.io/badge/donate-kofi?style=for-the-badge&logo=ko-fi&logoColor=ffffff&label=ko-fi&labelColor=101418&color=f16061&link=https%3A%2F%2Fko-fi.com%2Fdanklinux)](https://ko-fi.com/danklinux)
</div>
DankMaterialShell is a complete desktop shell for [niri](https://github.com/YaLTeR/niri), [Hyprland](https://hyprland.org/), [MangoWC](https://github.com/DreamMaoMao/mangowc), [Sway](https://swaywm.org), [labwc](https://labwc.github.io/), and other Wayland compositors. It replaces waybar, swaylock, swayidle, mako, fuzzel, polkit, and everything else you'd normally stitch together to make a desktop.
DankMaterialShell is a complete desktop shell for [niri](https://github.com/YaLTeR/niri), [Hyprland](https://hyprland.org/), [MangoWC](https://github.com/DreamMaoMao/mangowc), [Sway](https://swaywm.org), [labwc](https://labwc.github.io/), [Scroll](https://github.com/dawsers/scroll), and other Wayland compositors. It replaces waybar, swaylock, swayidle, mako, fuzzel, polkit, and everything else you'd normally stitch together to make a desktop.
## Repository Structure
@@ -105,7 +105,7 @@ Extend functionality with the [plugin registry](https://plugins.danklinux.com).
## Supported Compositors
Works best with [niri](https://github.com/YaLTeR/niri), [Hyprland](https://hyprland.org/), [Sway](https://swaywm.org/), [MangoWC](https://github.com/DreamMaoMao/mangowc), and [labwc](https://labwc.github.io/) with full workspace switching, overview integration, and monitor management. Other Wayland compositors work with reduced features.
Works best with [niri](https://github.com/YaLTeR/niri), [Hyprland](https://hyprland.org/), [Sway](https://swaywm.org/), [MangoWC](https://github.com/DreamMaoMao/mangowc), [labwc](https://labwc.github.io/), and [Scroll](https://github.com/dawsers/scroll) with full workspace switching, overview integration, and monitor management. Other Wayland compositors work with reduced features.
[Compositor configuration guide](https://danklinux.com/docs/dankmaterialshell/compositors)
@@ -127,7 +127,7 @@ dms plugins search # Browse plugin registry
## Documentation
- **Website:** [danklinux.com](https://danklinux.com)
- **Docs:** [danklinux.com/docs](https://danklinux.com/docs)
- **Docs:** [danklinux.com/docs](https://danklinux.com/docs/)
- **Theming:** [Application themes](https://danklinux.com/docs/dankmaterialshell/application-themes) | [Custom themes](https://danklinux.com/docs/dankmaterialshell/custom-themes)
- **Plugins:** [Development guide](https://danklinux.com/docs/dankmaterialshell/plugins-overview)
- **Support:** [Ko-fi](https://ko-fi.com/avengemediallc)
@@ -143,6 +143,7 @@ See component-specific documentation:
### Building from Source
**Core + Dankinstall:**
```bash
cd core
make # Build dms CLI
@@ -150,11 +151,13 @@ make dankinstall # Build installer
```
**Shell:**
```bash
quickshell -p quickshell/
```
**NixOS:**
```nix
{
inputs.dms.url = "github:AvengeMedia/DankMaterialShell";


@@ -1 +0,0 @@
indentation = "FourSpaces"


@@ -14,4 +14,4 @@ RestartSec=1.23
TimeoutStopSec=10
[Install]
WantedBy=graphical-session.target
WantedBy=graphical-session.target


@@ -22,6 +22,8 @@ linters:
- (*os.Process).Signal
- (*os.Process).Kill
- syscall.Kill
# Seek on memfd (reset position before passing fd)
- syscall.Seek
# DBus cleanup
- (*github.com/godbus/dbus/v5.Conn).RemoveMatchSignal
- (*github.com/godbus/dbus/v5.Conn).RemoveSignal
@@ -100,7 +102,11 @@ linters:
- linters:
- ineffassign
path: internal/proto/
# binary.Write to bytes.Buffer can't fail
# binary.Write/Read to bytes.Buffer can't fail
- linters:
- errcheck
text: "Error return value of `binary\\.Write` is not checked"
text: "Error return value of `binary\\.(Write|Read)` is not checked"
# bytes.Reader.Read can't fail (reads from memory)
- linters:
- errcheck
text: "Error return value of `buf\\.Read` is not checked"


@@ -56,3 +56,15 @@ packages:
outpkg: mocks_version
interfaces:
VersionFetcher:
github.com/AvengeMedia/DankMaterialShell/core/internal/server/wlcontext:
config:
dir: "internal/mocks/wlcontext"
outpkg: mocks_wlcontext
interfaces:
WaylandContext:
github.com/AvengeMedia/DankMaterialShell/core/pkg/go-wayland/wayland/client:
config:
dir: "internal/mocks/wlclient"
outpkg: mocks_wlclient
interfaces:
WaylandDisplay:


@@ -0,0 +1,16 @@
repos:
- repo: https://github.com/golangci/golangci-lint
rev: v2.6.2
hooks:
- id: golangci-lint-fmt
require_serial: true
- id: golangci-lint-full
- id: golangci-lint-config-verify
- repo: local
hooks:
- id: go-test
name: go test
entry: go test ./...
language: system
pass_filenames: false
types: [go]


@@ -139,4 +139,4 @@ Most packages available in standard repos. Minimal building required.
**Gentoo**
Uses Portage with GURU overlay. Automatically configures USE flags. Variable success depending on system configuration.
See installer output for distribution-specific details during installation.
See installer output for distribution-specific details during installation.


@@ -8,7 +8,7 @@
<rect x="0" y="29" width="8" height="8" fill="#CCBEFF"/>
<rect x="20" y="29" width="8" height="8" fill="#CCBEFF"/>
<rect x="0" y="37" width="24" height="8" fill="#CCBEFF"/>
<!-- A -->
<rect x="36" y="5" width="20" height="8" fill="#CCBEFF"/>
<rect x="32" y="13" width="8" height="8" fill="#CCBEFF"/>
@@ -18,7 +18,7 @@
<rect x="52" y="29" width="8" height="8" fill="#CCBEFF"/>
<rect x="32" y="37" width="8" height="8" fill="#CCBEFF"/>
<rect x="52" y="37" width="8" height="8" fill="#CCBEFF"/>
<!-- N -->
<rect x="64" y="5" width="12" height="8" fill="#CCBEFF"/>
<rect x="92" y="5" width="8" height="8" fill="#CCBEFF"/>
@@ -32,7 +32,7 @@
<rect x="92" y="29" width="8" height="8" fill="#CCBEFF"/>
<rect x="64" y="37" width="8" height="8" fill="#CCBEFF"/>
<rect x="84" y="37" width="16" height="8" fill="#CCBEFF"/>
<!-- K -->
<rect x="104" y="5" width="8" height="8" fill="#CCBEFF"/>
<rect x="124" y="5" width="8" height="8" fill="#CCBEFF"/>
@@ -43,4 +43,4 @@
<rect x="120" y="29" width="8" height="8" fill="#CCBEFF"/>
<rect x="104" y="37" width="8" height="8" fill="#CCBEFF"/>
<rect x="124" y="37" width="8" height="8" fill="#CCBEFF"/>
</svg>
</svg>


@@ -12,6 +12,11 @@ import (
var Version = "dev"
func main() {
if os.Getuid() == 0 {
fmt.Fprintln(os.Stderr, "Error: dankinstall must not be run as root")
os.Exit(1)
}
fileLogger, err := log.NewFileLogger()
if err != nil {
fmt.Printf("Warning: Failed to create log file: %v\n", err)


@@ -211,45 +211,13 @@ func runBrightnessSet(cmd *cobra.Command, args []string) {
exponential, _ := cmd.Flags().GetBool("exponential")
exponent, _ := cmd.Flags().GetFloat64("exponent")
// For backlight/leds devices, try logind backend first (requires D-Bus connection)
parts := strings.SplitN(deviceID, ":", 2)
if len(parts) == 2 && (parts[0] == "backlight" || parts[0] == "leds") {
subsystem := parts[0]
name := parts[1]
// Initialize backends needed for logind approach
sysfs, err := brightness.NewSysfsBackend()
if err != nil {
log.Debugf("NewSysfsBackend failed: %v", err)
} else {
logind, err := brightness.NewLogindBackend()
if err != nil {
log.Debugf("NewLogindBackend failed: %v", err)
} else {
defer logind.Close()
// Get device info to convert percent to value
dev, err := sysfs.GetDevice(deviceID)
if err == nil {
// Calculate hardware value using the same logic as Manager.setViaSysfsWithLogind
value := sysfs.PercentToValueWithExponent(percent, dev, exponential, exponent)
// Call logind with hardware value
if err := logind.SetBrightness(subsystem, name, uint32(value)); err == nil {
log.Debugf("set %s to %d%% (%d) via logind", deviceID, percent, value)
fmt.Printf("Set %s to %d%%\n", deviceID, percent)
return
} else {
log.Debugf("logind.SetBrightness failed: %v", err)
}
} else {
log.Debugf("sysfs.GetDeviceByID failed: %v", err)
}
}
if ok := tryLogindBrightness(parts[0], parts[1], deviceID, percent, exponential, exponent); ok {
return
}
}
// Fallback to direct sysfs (requires write permissions)
sysfs, err := brightness.NewSysfsBackend()
if err == nil {
if err := sysfs.SetBrightnessWithExponent(deviceID, percent, exponential, exponent); err == nil {
@@ -280,6 +248,37 @@ func runBrightnessSet(cmd *cobra.Command, args []string) {
log.Fatalf("Failed to set brightness for device: %s", deviceID)
}
func tryLogindBrightness(subsystem, name, deviceID string, percent int, exponential bool, exponent float64) bool {
sysfs, err := brightness.NewSysfsBackend()
if err != nil {
log.Debugf("NewSysfsBackend failed: %v", err)
return false
}
logind, err := brightness.NewLogindBackend()
if err != nil {
log.Debugf("NewLogindBackend failed: %v", err)
return false
}
defer logind.Close()
dev, err := sysfs.GetDevice(deviceID)
if err != nil {
log.Debugf("sysfs.GetDeviceByID failed: %v", err)
return false
}
value := sysfs.PercentToValueWithExponent(percent, dev, exponential, exponent)
if err := logind.SetBrightness(subsystem, name, uint32(value)); err != nil {
log.Debugf("logind.SetBrightness failed: %v", err)
return false
}
log.Debugf("set %s to %d%% (%d) via logind", deviceID, percent, value)
fmt.Printf("Set %s to %d%%\n", deviceID, percent)
return true
}
func getBrightnessDevices(includeDDC bool) []string {
allDevices := getAllBrightnessDevices(includeDDC)

View File

@@ -0,0 +1,628 @@
package main
import (
"context"
"encoding/json"
"fmt"
"io"
"os"
"os/exec"
"os/signal"
"strconv"
"syscall"
"time"
"github.com/AvengeMedia/DankMaterialShell/core/internal/clipboard"
"github.com/AvengeMedia/DankMaterialShell/core/internal/log"
"github.com/AvengeMedia/DankMaterialShell/core/internal/server/models"
"github.com/spf13/cobra"
)
var clipboardCmd = &cobra.Command{
Use: "clipboard",
Aliases: []string{"cl"},
Short: "Manage clipboard",
Long: "Interact with the clipboard manager",
}
var clipCopyCmd = &cobra.Command{
Use: "copy [text]",
Short: "Copy text to clipboard",
Long: "Copy text to clipboard. If no text provided, reads from stdin. Works without server.",
Run: runClipCopy,
}
var (
clipCopyForeground bool
clipCopyPasteOnce bool
clipCopyType string
clipJSONOutput bool
)
var clipPasteCmd = &cobra.Command{
Use: "paste",
Short: "Paste text from clipboard",
Long: "Paste text from clipboard to stdout. Works without server.",
Run: runClipPaste,
}
var clipWatchCmd = &cobra.Command{
Use: "watch [command]",
Short: "Watch clipboard for changes",
Long: `Watch clipboard for changes and optionally execute a command.
Works like wl-paste --watch. Does not require server.
If a command is provided, it will be executed each time the clipboard changes,
with the clipboard content piped to its stdin.
Examples:
dms cl watch # Print clipboard changes to stdout
dms cl watch cat # Same as above
dms cl watch notify-send # Send notification on clipboard change`,
Run: runClipWatch,
}
var clipHistoryCmd = &cobra.Command{
Use: "history",
Short: "Show clipboard history",
Long: "Show clipboard history with previews (requires server)",
Run: runClipHistory,
}
var clipGetCmd = &cobra.Command{
Use: "get <id>",
Short: "Get clipboard entry by ID",
Long: "Get full clipboard entry data by ID (requires server). Use --copy to copy it to clipboard.",
Args: cobra.ExactArgs(1),
Run: runClipGet,
}
var clipGetCopy bool
var clipDeleteCmd = &cobra.Command{
Use: "delete <id>",
Short: "Delete clipboard entry",
Long: "Delete a clipboard history entry by ID (requires server)",
Args: cobra.ExactArgs(1),
Run: runClipDelete,
}
var clipClearCmd = &cobra.Command{
Use: "clear",
Short: "Clear clipboard history",
Long: "Clear all clipboard history (requires server)",
Run: runClipClear,
}
var clipWatchStore bool
var clipSearchCmd = &cobra.Command{
Use: "search [query]",
Short: "Search clipboard history",
Long: "Search clipboard history with filters (requires server)",
Run: runClipSearch,
}
var (
clipSearchLimit int
clipSearchOffset int
clipSearchMimeType string
clipSearchImages bool
clipSearchText bool
)
var clipConfigCmd = &cobra.Command{
Use: "config",
Short: "Manage clipboard config",
Long: "Get or set clipboard configuration (requires server)",
}
var clipConfigGetCmd = &cobra.Command{
Use: "get",
Short: "Get clipboard config",
Run: runClipConfigGet,
}
var clipConfigSetCmd = &cobra.Command{
Use: "set",
Short: "Set clipboard config",
Long: `Set clipboard configuration options.
Examples:
dms cl config set --max-history 200
dms cl config set --auto-clear-days 7
dms cl config set --clear-at-startup`,
Run: runClipConfigSet,
}
var (
clipConfigMaxHistory int
clipConfigAutoClearDays int
clipConfigClearAtStartup bool
clipConfigNoClearStartup bool
clipConfigDisabled bool
clipConfigEnabled bool
clipConfigDisableHistory bool
clipConfigEnableHistory bool
clipConfigDisablePersist bool
clipConfigEnablePersist bool
)
func init() {
clipCopyCmd.Flags().BoolVarP(&clipCopyForeground, "foreground", "f", false, "Stay in foreground instead of forking")
clipCopyCmd.Flags().BoolVarP(&clipCopyPasteOnce, "paste-once", "o", false, "Exit after first paste")
clipCopyCmd.Flags().StringVarP(&clipCopyType, "type", "t", "text/plain;charset=utf-8", "MIME type")
clipWatchCmd.Flags().BoolVar(&clipJSONOutput, "json", false, "Output as JSON")
clipHistoryCmd.Flags().BoolVar(&clipJSONOutput, "json", false, "Output as JSON")
clipGetCmd.Flags().BoolVar(&clipJSONOutput, "json", false, "Output as JSON")
clipGetCmd.Flags().BoolVarP(&clipGetCopy, "copy", "c", false, "Copy entry to clipboard")
clipSearchCmd.Flags().IntVarP(&clipSearchLimit, "limit", "l", 50, "Max results")
clipSearchCmd.Flags().IntVarP(&clipSearchOffset, "offset", "o", 0, "Result offset")
clipSearchCmd.Flags().StringVarP(&clipSearchMimeType, "mime", "m", "", "Filter by MIME type")
clipSearchCmd.Flags().BoolVar(&clipSearchImages, "images", false, "Only images")
clipSearchCmd.Flags().BoolVar(&clipSearchText, "text", false, "Only text")
clipSearchCmd.Flags().BoolVar(&clipJSONOutput, "json", false, "Output as JSON")
clipConfigSetCmd.Flags().IntVar(&clipConfigMaxHistory, "max-history", 0, "Max history entries")
clipConfigSetCmd.Flags().IntVar(&clipConfigAutoClearDays, "auto-clear-days", -1, "Auto-clear entries older than N days (0 to disable)")
clipConfigSetCmd.Flags().BoolVar(&clipConfigClearAtStartup, "clear-at-startup", false, "Clear history on startup")
clipConfigSetCmd.Flags().BoolVar(&clipConfigNoClearStartup, "no-clear-at-startup", false, "Don't clear history on startup")
clipConfigSetCmd.Flags().BoolVar(&clipConfigDisabled, "disable", false, "Disable clipboard manager entirely")
clipConfigSetCmd.Flags().BoolVar(&clipConfigEnabled, "enable", false, "Enable clipboard manager")
clipConfigSetCmd.Flags().BoolVar(&clipConfigDisableHistory, "disable-history", false, "Disable clipboard history persistence")
clipConfigSetCmd.Flags().BoolVar(&clipConfigEnableHistory, "enable-history", false, "Enable clipboard history persistence")
clipConfigSetCmd.Flags().BoolVar(&clipConfigDisablePersist, "disable-persist", false, "Disable clipboard ownership persistence")
clipConfigSetCmd.Flags().BoolVar(&clipConfigEnablePersist, "enable-persist", false, "Enable clipboard ownership persistence")
clipWatchCmd.Flags().BoolVarP(&clipWatchStore, "store", "s", false, "Store clipboard changes to history (no server required)")
clipConfigCmd.AddCommand(clipConfigGetCmd, clipConfigSetCmd)
clipboardCmd.AddCommand(clipCopyCmd, clipPasteCmd, clipWatchCmd, clipHistoryCmd, clipGetCmd, clipDeleteCmd, clipClearCmd, clipSearchCmd, clipConfigCmd)
}
func runClipCopy(cmd *cobra.Command, args []string) {
var data []byte
if len(args) > 0 {
data = []byte(args[0])
} else {
var err error
data, err = io.ReadAll(os.Stdin)
if err != nil {
log.Fatalf("read stdin: %v", err)
}
}
if err := clipboard.CopyOpts(data, clipCopyType, clipCopyForeground, clipCopyPasteOnce); err != nil {
log.Fatalf("copy: %v", err)
}
}
func runClipPaste(cmd *cobra.Command, args []string) {
data, _, err := clipboard.Paste()
if err != nil {
log.Fatalf("paste: %v", err)
}
os.Stdout.Write(data)
}
func runClipWatch(cmd *cobra.Command, args []string) {
ctx, cancel := context.WithCancel(context.Background())
defer cancel()
sigCh := make(chan os.Signal, 1)
signal.Notify(sigCh, syscall.SIGINT, syscall.SIGTERM)
go func() {
<-sigCh
cancel()
}()
switch {
case len(args) > 0:
if err := clipboard.Watch(ctx, func(data []byte, mimeType string) {
runCommand(args, data)
}); err != nil && err != context.Canceled {
log.Fatalf("Watch error: %v", err)
}
case clipWatchStore:
if err := clipboard.Watch(ctx, func(data []byte, mimeType string) {
if err := clipboard.Store(data, mimeType); err != nil {
log.Errorf("store: %v", err)
}
}); err != nil && err != context.Canceled {
log.Fatalf("Watch error: %v", err)
}
case clipJSONOutput:
if err := clipboard.Watch(ctx, func(data []byte, mimeType string) {
out := map[string]any{
"data": string(data),
"mimeType": mimeType,
"timestamp": time.Now().Format(time.RFC3339),
"size": len(data),
}
j, _ := json.Marshal(out)
fmt.Println(string(j))
}); err != nil && err != context.Canceled {
log.Fatalf("Watch error: %v", err)
}
default:
if err := clipboard.Watch(ctx, func(data []byte, mimeType string) {
os.Stdout.Write(data)
os.Stdout.WriteString("\n")
}); err != nil && err != context.Canceled {
log.Fatalf("Watch error: %v", err)
}
}
}
func runCommand(args []string, stdin []byte) {
cmd := exec.Command(args[0], args[1:]...)
cmd.Stdout = os.Stdout
cmd.Stderr = os.Stderr
if len(stdin) == 0 {
cmd.Run()
return
}
r, w, err := os.Pipe()
if err != nil {
cmd.Run()
return
}
cmd.Stdin = r
go func() {
w.Write(stdin)
w.Close()
}()
cmd.Run()
}
func runClipHistory(cmd *cobra.Command, args []string) {
req := models.Request{
ID: 1,
Method: "clipboard.getHistory",
}
resp, err := sendServerRequest(req)
if err != nil {
log.Fatalf("Failed to get clipboard history: %v", err)
}
if resp.Error != "" {
log.Fatalf("Error: %s", resp.Error)
}
if resp.Result == nil {
if clipJSONOutput {
fmt.Println("[]")
} else {
fmt.Println("No clipboard history")
}
return
}
historyList, ok := (*resp.Result).([]any)
if !ok {
log.Fatal("Invalid response format")
}
if clipJSONOutput {
out, _ := json.MarshalIndent(historyList, "", " ")
fmt.Println(string(out))
return
}
if len(historyList) == 0 {
fmt.Println("No clipboard history")
return
}
fmt.Println("Clipboard History:")
fmt.Println()
for _, item := range historyList {
entry, ok := item.(map[string]any)
if !ok {
continue
}
id := uint64(entry["id"].(float64))
preview := entry["preview"].(string)
timestamp := entry["timestamp"].(string)
isImage := entry["isImage"].(bool)
typeStr := "text"
if isImage {
typeStr = "image"
}
fmt.Printf("ID: %d | %s | %s\n", id, typeStr, timestamp)
fmt.Printf(" %s\n", preview)
fmt.Println()
}
}
func runClipGet(cmd *cobra.Command, args []string) {
id, err := strconv.ParseUint(args[0], 10, 64)
if err != nil {
log.Fatalf("Invalid ID: %v", err)
}
if clipGetCopy {
req := models.Request{
ID: 1,
Method: "clipboard.copyEntry",
Params: map[string]any{
"id": id,
},
}
resp, err := sendServerRequest(req)
if err != nil {
log.Fatalf("Failed to copy clipboard entry: %v", err)
}
if resp.Error != "" {
log.Fatalf("Error: %s", resp.Error)
}
fmt.Printf("Copied entry %d to clipboard\n", id)
return
}
req := models.Request{
ID: 1,
Method: "clipboard.getEntry",
Params: map[string]any{
"id": id,
},
}
resp, err := sendServerRequest(req)
if err != nil {
log.Fatalf("Failed to get clipboard entry: %v", err)
}
if resp.Error != "" {
log.Fatalf("Error: %s", resp.Error)
}
if resp.Result == nil {
log.Fatal("Entry not found")
}
entry, ok := (*resp.Result).(map[string]any)
if !ok {
log.Fatal("Invalid response format")
}
switch {
case clipJSONOutput:
output, _ := json.MarshalIndent(entry, "", " ")
fmt.Println(string(output))
default:
if data, ok := entry["data"].(string); ok {
fmt.Print(data)
} else {
output, _ := json.MarshalIndent(entry, "", " ")
fmt.Println(string(output))
}
}
}
func runClipDelete(cmd *cobra.Command, args []string) {
id, err := strconv.ParseUint(args[0], 10, 64)
if err != nil {
log.Fatalf("Invalid ID: %v", err)
}
req := models.Request{
ID: 1,
Method: "clipboard.deleteEntry",
Params: map[string]any{
"id": id,
},
}
resp, err := sendServerRequest(req)
if err != nil {
log.Fatalf("Failed to delete clipboard entry: %v", err)
}
if resp.Error != "" {
log.Fatalf("Error: %s", resp.Error)
}
fmt.Printf("Deleted entry %d\n", id)
}
func runClipClear(cmd *cobra.Command, args []string) {
req := models.Request{
ID: 1,
Method: "clipboard.clearHistory",
}
resp, err := sendServerRequest(req)
if err != nil {
log.Fatalf("Failed to clear clipboard history: %v", err)
}
if resp.Error != "" {
log.Fatalf("Error: %s", resp.Error)
}
fmt.Println("Clipboard history cleared")
}
func runClipSearch(cmd *cobra.Command, args []string) {
params := map[string]any{
"limit": clipSearchLimit,
"offset": clipSearchOffset,
}
if len(args) > 0 {
params["query"] = args[0]
}
if clipSearchMimeType != "" {
params["mimeType"] = clipSearchMimeType
}
if clipSearchImages {
params["isImage"] = true
} else if clipSearchText {
params["isImage"] = false
}
req := models.Request{
ID: 1,
Method: "clipboard.search",
Params: params,
}
resp, err := sendServerRequest(req)
if err != nil {
log.Fatalf("Failed to search clipboard: %v", err)
}
if resp.Error != "" {
log.Fatalf("Error: %s", resp.Error)
}
if resp.Result == nil {
log.Fatal("No results")
}
result, ok := (*resp.Result).(map[string]any)
if !ok {
log.Fatal("Invalid response format")
}
if clipJSONOutput {
out, _ := json.MarshalIndent(result, "", " ")
fmt.Println(string(out))
return
}
entries, _ := result["entries"].([]any)
total := int(result["total"].(float64))
hasMore := result["hasMore"].(bool)
if len(entries) == 0 {
fmt.Println("No results found")
return
}
fmt.Printf("Results: %d of %d\n\n", len(entries), total)
for _, item := range entries {
entry, ok := item.(map[string]any)
if !ok {
continue
}
id := uint64(entry["id"].(float64))
preview := entry["preview"].(string)
timestamp := entry["timestamp"].(string)
isImage := entry["isImage"].(bool)
typeStr := "text"
if isImage {
typeStr = "image"
}
fmt.Printf("ID: %d | %s | %s\n", id, typeStr, timestamp)
fmt.Printf(" %s\n\n", preview)
}
if hasMore {
fmt.Printf("Use --offset %d to see more results\n", clipSearchOffset+clipSearchLimit)
}
}
func runClipConfigGet(cmd *cobra.Command, args []string) {
req := models.Request{
ID: 1,
Method: "clipboard.getConfig",
}
resp, err := sendServerRequest(req)
if err != nil {
log.Fatalf("Failed to get config: %v", err)
}
if resp.Error != "" {
log.Fatalf("Error: %s", resp.Error)
}
if resp.Result == nil {
log.Fatal("No config returned")
}
cfg, ok := (*resp.Result).(map[string]any)
if !ok {
log.Fatal("Invalid response format")
}
output, _ := json.MarshalIndent(cfg, "", " ")
fmt.Println(string(output))
}
func runClipConfigSet(cmd *cobra.Command, args []string) {
params := map[string]any{}
if cmd.Flags().Changed("max-history") {
params["maxHistory"] = clipConfigMaxHistory
}
if cmd.Flags().Changed("auto-clear-days") {
params["autoClearDays"] = clipConfigAutoClearDays
}
if clipConfigClearAtStartup {
params["clearAtStartup"] = true
}
if clipConfigNoClearStartup {
params["clearAtStartup"] = false
}
if clipConfigDisabled {
params["disabled"] = true
}
if clipConfigEnabled {
params["disabled"] = false
}
if clipConfigDisableHistory {
params["disableHistory"] = true
}
if clipConfigEnableHistory {
params["disableHistory"] = false
}
if clipConfigDisablePersist {
params["disablePersist"] = true
}
if clipConfigEnablePersist {
params["disablePersist"] = false
}
if len(params) == 0 {
fmt.Println("No config options specified")
return
}
req := models.Request{
ID: 1,
Method: "clipboard.setConfig",
Params: params,
}
resp, err := sendServerRequest(req)
if err != nil {
log.Fatalf("Failed to set config: %v", err)
}
if resp.Error != "" {
log.Fatalf("Error: %s", resp.Error)
}
fmt.Println("Config updated")
}
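The --json output of the new history command is scriptable: each element carries the id, preview, timestamp and isImage fields read by the handler above. A minimal consumer sketch (the HistoryEntry struct and the call to the dms binary are illustrative; the server may attach additional fields beyond these):

package main

import (
	"encoding/json"
	"fmt"
	"os"
	"os/exec"
)

// HistoryEntry mirrors the fields runClipHistory reads from the server response.
type HistoryEntry struct {
	ID        uint64 `json:"id"`
	Preview   string `json:"preview"`
	Timestamp string `json:"timestamp"`
	IsImage   bool   `json:"isImage"`
}

func main() {
	// Invoke the CLI added in this changeset and parse its JSON output.
	out, err := exec.Command("dms", "cl", "history", "--json").Output()
	if err != nil {
		fmt.Fprintln(os.Stderr, "dms cl history failed:", err)
		os.Exit(1)
	}
	var entries []HistoryEntry
	if err := json.Unmarshal(out, &entries); err != nil {
		fmt.Fprintln(os.Stderr, "unexpected output:", err)
		os.Exit(1)
	}
	for _, e := range entries {
		kind := "text"
		if e.IsImage {
			kind = "image"
		}
		fmt.Printf("%d\t%s\t%s\n", e.ID, kind, e.Preview)
	}
}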

View File

@@ -3,8 +3,8 @@ package main
import (
"fmt"
"os"
"os/exec"
"github.com/AvengeMedia/DankMaterialShell/core/internal/clipboard"
"github.com/AvengeMedia/DankMaterialShell/core/internal/colorpicker"
"github.com/spf13/cobra"
)
@@ -121,13 +121,7 @@ func runColorPick(cmd *cobra.Command, args []string) {
}
func copyToClipboard(text string) {
var cmd *exec.Cmd
if _, err := exec.LookPath("wl-copy"); err == nil {
cmd = exec.Command("wl-copy", text)
} else {
fmt.Fprintln(os.Stderr, "wl-copy not found, cannot copy to clipboard")
return
if err := clipboard.CopyText(text); err != nil {
fmt.Fprintln(os.Stderr, "clipboard copy failed:", err)
}
_ = cmd.Run()
}

View File

@@ -152,6 +152,24 @@ var pluginsUninstallCmd = &cobra.Command{
},
}
var pluginsUpdateCmd = &cobra.Command{
Use: "update <plugin-id>",
Short: "Update a plugin by ID",
Long: "Update an installed DMS plugin using its ID (e.g., 'myPlugin'). Plugin names are also supported.",
Args: cobra.ExactArgs(1),
ValidArgsFunction: func(cmd *cobra.Command, args []string, toComplete string) ([]string, cobra.ShellCompDirective) {
if len(args) != 0 {
return nil, cobra.ShellCompDirectiveNoFileComp
}
return getInstalledPluginIDs(), cobra.ShellCompDirectiveNoFileComp
},
Run: func(cmd *cobra.Command, args []string) {
if err := updatePluginCLI(args[0]); err != nil {
log.Fatalf("Error updating plugin: %v", err)
}
},
}
func runVersion(cmd *cobra.Command, args []string) {
printASCII()
fmt.Printf("%s\n", formatVersion(Version))
@@ -408,49 +426,70 @@ func uninstallPluginCLI(idOrName string) error {
return fmt.Errorf("failed to create registry: %w", err)
}
pluginList, err := registry.List()
if err != nil {
return fmt.Errorf("failed to list plugins: %w", err)
}
pluginList, _ := registry.List()
plugin := plugins.FindByIDOrName(idOrName, pluginList)
// First, try to find by ID (preferred method)
var plugin *plugins.Plugin
for _, p := range pluginList {
if p.ID == idOrName {
plugin = &p
break
if plugin != nil {
installed, err := manager.IsInstalled(*plugin)
if err != nil {
return fmt.Errorf("failed to check install status: %w", err)
}
}
// Fallback to name for backward compatibility
if plugin == nil {
for _, p := range pluginList {
if p.Name == idOrName {
plugin = &p
break
}
if !installed {
return fmt.Errorf("plugin not installed: %s", plugin.Name)
}
fmt.Printf("Uninstalling plugin: %s (ID: %s)\n", plugin.Name, plugin.ID)
if err := manager.Uninstall(*plugin); err != nil {
return fmt.Errorf("failed to uninstall plugin: %w", err)
}
fmt.Printf("Plugin uninstalled successfully: %s\n", plugin.Name)
return nil
}
if plugin == nil {
return fmt.Errorf("plugin not found: %s", idOrName)
fmt.Printf("Uninstalling plugin: %s\n", idOrName)
if err := manager.UninstallByIDOrName(idOrName); err != nil {
return err
}
fmt.Printf("Plugin uninstalled successfully: %s\n", idOrName)
return nil
}
installed, err := manager.IsInstalled(*plugin)
func updatePluginCLI(idOrName string) error {
manager, err := plugins.NewManager()
if err != nil {
return fmt.Errorf("failed to check install status: %w", err)
return fmt.Errorf("failed to create manager: %w", err)
}
if !installed {
return fmt.Errorf("plugin not installed: %s", plugin.Name)
registry, err := plugins.NewRegistry()
if err != nil {
return fmt.Errorf("failed to create registry: %w", err)
}
fmt.Printf("Uninstalling plugin: %s (ID: %s)\n", plugin.Name, plugin.ID)
if err := manager.Uninstall(*plugin); err != nil {
return fmt.Errorf("failed to uninstall plugin: %w", err)
pluginList, _ := registry.List()
plugin := plugins.FindByIDOrName(idOrName, pluginList)
if plugin != nil {
installed, err := manager.IsInstalled(*plugin)
if err != nil {
return fmt.Errorf("failed to check install status: %w", err)
}
if !installed {
return fmt.Errorf("plugin not installed: %s", plugin.Name)
}
fmt.Printf("Updating plugin: %s (ID: %s)\n", plugin.Name, plugin.ID)
if err := manager.Update(*plugin); err != nil {
return fmt.Errorf("failed to update plugin: %w", err)
}
fmt.Printf("Plugin updated successfully: %s\n", plugin.Name)
return nil
}
fmt.Printf("Plugin uninstalled successfully: %s\n", plugin.Name)
fmt.Printf("Updating plugin: %s\n", idOrName)
if err := manager.UpdateByIDOrName(idOrName); err != nil {
return err
}
fmt.Printf("Plugin updated successfully: %s\n", idOrName)
return nil
}
@@ -474,5 +513,6 @@ func getCommonCommands() []*cobra.Command {
screenshotCmd,
notifyActionCmd,
matugenCmd,
clipboardCmd,
}
}

View File

@@ -15,6 +15,7 @@ import (
"github.com/AvengeMedia/DankMaterialShell/core/internal/distros"
"github.com/AvengeMedia/DankMaterialShell/core/internal/errdefs"
"github.com/AvengeMedia/DankMaterialShell/core/internal/log"
"github.com/AvengeMedia/DankMaterialShell/core/internal/utils"
"github.com/AvengeMedia/DankMaterialShell/core/internal/version"
"github.com/spf13/cobra"
)
@@ -121,10 +122,10 @@ func updateArchLinux() error {
var helper string
var updateCmd *exec.Cmd
if commandExists("yay") {
if utils.CommandExists("yay") {
helper = "yay"
updateCmd = exec.Command("yay", "-S", packageName)
} else if commandExists("paru") {
} else if utils.CommandExists("paru") {
helper = "paru"
updateCmd = exec.Command("paru", "-S", packageName)
} else {

View File

@@ -10,6 +10,7 @@ import (
"github.com/AvengeMedia/DankMaterialShell/core/internal/greeter"
"github.com/AvengeMedia/DankMaterialShell/core/internal/log"
"github.com/AvengeMedia/DankMaterialShell/core/internal/utils"
"github.com/spf13/cobra"
"golang.org/x/text/cases"
"golang.org/x/text/language"
@@ -448,7 +449,7 @@ func enableGreeter() error {
fmt.Println("Detecting installed compositors...")
compositors := greeter.DetectCompositors()
if commandExists("sway") {
if utils.CommandExists("sway") {
compositors = append(compositors, "sway")
}

View File

@@ -89,6 +89,11 @@ func initializeProviders() {
log.Warnf("Failed to register MangoWC provider: %v", err)
}
scrollProvider := providers.NewSwayProvider("$HOME/.config/scroll")
if err := registry.Register(scrollProvider); err != nil {
log.Warnf("Failed to register Scroll provider: %v", err)
}
swayProvider := providers.NewSwayProvider("$HOME/.config/sway")
if err := registry.Register(swayProvider); err != nil {
log.Warnf("Failed to register Sway provider: %v", err)
@@ -125,6 +130,8 @@ func makeProviderWithPath(name, path string) keybinds.Provider {
return providers.NewMangoWCProvider(path)
case "sway":
return providers.NewSwayProvider(path)
case "scroll":
return providers.NewSwayProvider(path)
case "niri":
return providers.NewNiriProvider(path)
default:

View File

@@ -2,15 +2,12 @@ package main
import (
"context"
"encoding/json"
"fmt"
"net"
"os"
"time"
"github.com/AvengeMedia/DankMaterialShell/core/internal/log"
"github.com/AvengeMedia/DankMaterialShell/core/internal/matugen"
"github.com/AvengeMedia/DankMaterialShell/core/internal/server"
"github.com/AvengeMedia/DankMaterialShell/core/internal/server/models"
"github.com/spf13/cobra"
)
@@ -49,6 +46,7 @@ func init() {
cmd.Flags().String("stock-colors", "", "Stock theme colors JSON")
cmd.Flags().Bool("sync-mode-with-portal", false, "Sync color scheme with GNOME portal")
cmd.Flags().Bool("terminals-always-dark", false, "Force terminal themes to dark variant")
cmd.Flags().String("skip-templates", "", "Comma-separated list of templates to skip")
}
matugenQueueCmd.Flags().Bool("wait", true, "Wait for completion")
@@ -68,6 +66,7 @@ func buildMatugenOptions(cmd *cobra.Command) matugen.Options {
stockColors, _ := cmd.Flags().GetString("stock-colors")
syncModeWithPortal, _ := cmd.Flags().GetBool("sync-mode-with-portal")
terminalsAlwaysDark, _ := cmd.Flags().GetBool("terminals-always-dark")
skipTemplates, _ := cmd.Flags().GetString("skip-templates")
return matugen.Options{
StateDir: stateDir,
@@ -82,6 +81,7 @@ func buildMatugenOptions(cmd *cobra.Command) matugen.Options {
StockColors: stockColors,
SyncModeWithPortal: syncModeWithPortal,
TerminalsAlwaysDark: terminalsAlwaysDark,
SkipTemplates: skipTemplates,
}
}
@@ -97,33 +97,10 @@ func runMatugenQueue(cmd *cobra.Command, args []string) {
wait, _ := cmd.Flags().GetBool("wait")
timeout, _ := cmd.Flags().GetDuration("timeout")
socketPath := os.Getenv("DMS_SOCKET")
if socketPath == "" {
var err error
socketPath, err = server.FindSocket()
if err != nil {
log.Info("No socket available, running synchronously")
if err := matugen.Run(opts); err != nil {
log.Fatalf("Theme generation failed: %v", err)
}
return
}
}
conn, err := net.Dial("unix", socketPath)
if err != nil {
log.Info("Socket connection failed, running synchronously")
if err := matugen.Run(opts); err != nil {
log.Fatalf("Theme generation failed: %v", err)
}
return
}
defer conn.Close()
request := map[string]any{
"id": 1,
"method": "matugen.queue",
"params": map[string]any{
request := models.Request{
ID: 1,
Method: "matugen.queue",
Params: map[string]any{
"stateDir": opts.StateDir,
"shellDir": opts.ShellDir,
"configDir": opts.ConfigDir,
@@ -136,15 +113,19 @@ func runMatugenQueue(cmd *cobra.Command, args []string) {
"stockColors": opts.StockColors,
"syncModeWithPortal": opts.SyncModeWithPortal,
"terminalsAlwaysDark": opts.TerminalsAlwaysDark,
"skipTemplates": opts.SkipTemplates,
"wait": wait,
},
}
if err := json.NewEncoder(conn).Encode(request); err != nil {
log.Fatalf("Failed to send request: %v", err)
}
if !wait {
if err := sendServerRequestFireAndForget(request); err != nil {
log.Info("Server unavailable, running synchronously")
if err := matugen.Run(opts); err != nil {
log.Fatalf("Theme generation failed: %v", err)
}
return
}
fmt.Println("Theme generation queued")
return
}
@@ -154,17 +135,18 @@ func runMatugenQueue(cmd *cobra.Command, args []string) {
resultCh := make(chan error, 1)
go func() {
var response struct {
ID int `json:"id"`
Result any `json:"result"`
Error string `json:"error"`
}
if err := json.NewDecoder(conn).Decode(&response); err != nil {
resultCh <- fmt.Errorf("failed to read response: %w", err)
resp, ok := tryServerRequest(request)
if !ok {
log.Info("Server unavailable, running synchronously")
if err := matugen.Run(opts); err != nil {
resultCh <- err
return
}
resultCh <- nil
return
}
if response.Error != "" {
resultCh <- fmt.Errorf("server error: %s", response.Error)
if resp.Error != "" {
resultCh <- fmt.Errorf("server error: %s", resp.Error)
return
}
resultCh <- nil

View File

@@ -1,17 +1,14 @@
package main
import (
"encoding/json"
"fmt"
"mime"
"net"
"net/url"
"os"
"path/filepath"
"strings"
"github.com/AvengeMedia/DankMaterialShell/core/internal/log"
"github.com/AvengeMedia/DankMaterialShell/core/internal/server"
"github.com/AvengeMedia/DankMaterialShell/core/internal/server/models"
"github.com/spf13/cobra"
)
@@ -93,32 +90,6 @@ func mimeTypeToCategories(mimeType string) []string {
}
func runOpen(target string) {
socketPath, err := server.FindSocket()
if err != nil {
log.Warnf("DMS socket not found: %v", err)
fmt.Println("DMS is not running. Please start DMS first.")
os.Exit(1)
}
conn, err := net.Dial("unix", socketPath)
if err != nil {
log.Warnf("DMS socket connection failed: %v", err)
fmt.Println("DMS is not running. Please start DMS first.")
os.Exit(1)
}
defer conn.Close()
buf := make([]byte, 1)
for {
_, err := conn.Read(buf)
if err != nil {
return
}
if buf[0] == '\n' {
break
}
}
// Parse file:// URIs to extract the actual file path
actualTarget := target
detectedMimeType := openMimeType
@@ -219,8 +190,9 @@ func runOpen(target string) {
log.Infof("Sending request - Method: %s, Params: %+v", method, params)
if err := json.NewEncoder(conn).Encode(req); err != nil {
log.Fatalf("Failed to send request: %v", err)
if err := sendServerRequestFireAndForget(req); err != nil {
fmt.Println("DMS is not running. Please start DMS first.")
os.Exit(1)
}
log.Infof("Request sent successfully")

View File

@@ -4,10 +4,10 @@ import (
"bytes"
"fmt"
"os"
"os/exec"
"path/filepath"
"strings"
"github.com/AvengeMedia/DankMaterialShell/core/internal/clipboard"
"github.com/AvengeMedia/DankMaterialShell/core/internal/screenshot"
"github.com/spf13/cobra"
)
@@ -257,9 +257,7 @@ func copyImageToClipboard(buf *screenshot.ShmBuffer, format screenshot.Format, q
}
}
cmd := exec.Command("wl-copy", "--type", mimeType)
cmd.Stdin = &data
return cmd.Run()
return clipboard.Copy(data.Bytes(), mimeType)
}
func writeImageToStdout(buf *screenshot.ShmBuffer, format screenshot.Format, quality int, pixelFormat uint32) error {
@@ -295,7 +293,14 @@ func bufferToRGBThumbnail(buf *screenshot.ShmBuffer, maxSize int, pixelFormat ui
data := buf.Data()
rgb := make([]byte, dstW*dstH*3)
swapRB := pixelFormat == uint32(screenshot.FormatARGB8888) || pixelFormat == uint32(screenshot.FormatXRGB8888) || pixelFormat == 0
var swapRB bool
switch pixelFormat {
case uint32(screenshot.FormatABGR8888), uint32(screenshot.FormatXBGR8888):
swapRB = false
default:
swapRB = true
}
for y := 0; y < dstH; y++ {
srcY := int(float64(y) / scale)
@@ -309,16 +314,17 @@ func bufferToRGBThumbnail(buf *screenshot.ShmBuffer, maxSize int, pixelFormat ui
}
si := srcY*buf.Stride + srcX*4
di := (y*dstW + x) * 3
if si+2 < len(data) {
if swapRB {
rgb[di+0] = data[si+2]
rgb[di+1] = data[si+1]
rgb[di+2] = data[si+0]
} else {
rgb[di+0] = data[si+0]
rgb[di+1] = data[si+1]
rgb[di+2] = data[si+2]
}
if si+3 >= len(data) {
continue
}
if swapRB {
rgb[di+0] = data[si+2]
rgb[di+1] = data[si+1]
rgb[di+2] = data[si+0]
} else {
rgb[di+0] = data[si+0]
rgb[di+1] = data[si+1]
rgb[di+2] = data[si+2]
}
}
}
@@ -370,7 +376,37 @@ func runScreenshotList(cmd *cobra.Command, args []string) {
}
for _, o := range outputs {
fmt.Printf("%s: %dx%d+%d+%d (scale: %d)\n",
o.Name, o.Width, o.Height, o.X, o.Y, o.Scale)
scaleStr := fmt.Sprintf("%.2f", o.FractionalScale)
if o.FractionalScale == float64(int(o.FractionalScale)) {
scaleStr = fmt.Sprintf("%d", int(o.FractionalScale))
}
transformStr := transformName(o.Transform)
fmt.Printf("%s: %dx%d+%d+%d scale=%s transform=%s\n",
o.Name, o.Width, o.Height, o.X, o.Y, scaleStr, transformStr)
}
}
func transformName(t int32) string {
switch t {
case 0:
return "normal"
case 1:
return "90"
case 2:
return "180"
case 3:
return "270"
case 4:
return "flipped"
case 5:
return "flipped-90"
case 6:
return "flipped-180"
case 7:
return "flipped-270"
default:
return fmt.Sprintf("%d", t)
}
}

View File

@@ -29,6 +29,7 @@ func runSetup() error {
wm, wmSelected := promptCompositor()
terminal, terminalSelected := promptTerminal()
useSystemd := promptSystemd()
if !wmSelected && !terminalSelected {
fmt.Println("No configurations selected. Exiting.")
@@ -67,14 +68,14 @@ func runSetup() error {
var err error
if wmSelected && terminalSelected {
results, err = deployer.DeployConfigurationsWithTerminal(ctx, wm, terminal)
results, err = deployer.DeployConfigurationsWithSystemd(ctx, wm, terminal, useSystemd)
} else if wmSelected {
results, err = deployer.DeployConfigurationsWithTerminal(ctx, wm, deps.TerminalGhostty)
results, err = deployer.DeployConfigurationsWithSystemd(ctx, wm, deps.TerminalGhostty, useSystemd)
if len(results) > 1 {
results = results[:1]
}
} else if terminalSelected {
results, err = deployer.DeployConfigurationsWithTerminal(ctx, deps.WindowManagerNiri, terminal)
results, err = deployer.DeployConfigurationsWithSystemd(ctx, deps.WindowManagerNiri, terminal, useSystemd)
if len(results) > 0 && results[0].ConfigType == "Niri" {
results = results[1:]
}
@@ -144,6 +145,19 @@ func promptTerminal() (deps.Terminal, bool) {
}
}
func promptSystemd() bool {
fmt.Println("\nUse systemd for session management?")
fmt.Println("1) Yes (recommended for most distros)")
fmt.Println("2) No (standalone, no systemd integration)")
var response string
fmt.Print("\nChoice (1-2): ")
fmt.Scanln(&response)
response = strings.TrimSpace(response)
return response != "2"
}
func checkExistingConfigs(wm deps.WindowManager, wmSelected bool, terminal deps.Terminal, terminalSelected bool) bool {
homeDir := os.Getenv("HOME")
willBackup := false

View File

@@ -23,7 +23,7 @@ func init() {
updateCmd.AddCommand(updateCheckCmd)
// Add subcommands to plugins
pluginsCmd.AddCommand(pluginsBrowseCmd, pluginsListCmd, pluginsInstallCmd, pluginsUninstallCmd)
pluginsCmd.AddCommand(pluginsBrowseCmd, pluginsListCmd, pluginsInstallCmd, pluginsUninstallCmd, pluginsUpdateCmd)
// Add common commands to root
rootCmd.AddCommand(getCommonCommands()...)

View File

@@ -21,7 +21,7 @@ func init() {
greeterCmd.AddCommand(greeterSyncCmd, greeterEnableCmd, greeterStatusCmd)
// Add subcommands to plugins
pluginsCmd.AddCommand(pluginsBrowseCmd, pluginsListCmd, pluginsInstallCmd, pluginsUninstallCmd)
pluginsCmd.AddCommand(pluginsBrowseCmd, pluginsListCmd, pluginsInstallCmd, pluginsUninstallCmd, pluginsUpdateCmd)
// Add common commands to root
rootCmd.AddCommand(getCommonCommands()...)

View File

@@ -0,0 +1,114 @@
package main
import (
"bufio"
"encoding/json"
"fmt"
"net"
"os"
"path/filepath"
"github.com/AvengeMedia/DankMaterialShell/core/internal/server"
"github.com/AvengeMedia/DankMaterialShell/core/internal/server/models"
)
func sendServerRequest(req models.Request) (*models.Response[any], error) {
socketPath := getServerSocketPath()
conn, err := net.Dial("unix", socketPath)
if err != nil {
return nil, fmt.Errorf("failed to connect to server (is it running?): %w", err)
}
defer conn.Close()
scanner := bufio.NewScanner(conn)
scanner.Scan() // discard initial capabilities message
reqData, err := json.Marshal(req)
if err != nil {
return nil, fmt.Errorf("failed to marshal request: %w", err)
}
if _, err := conn.Write(reqData); err != nil {
return nil, fmt.Errorf("failed to write request: %w", err)
}
if _, err := conn.Write([]byte("\n")); err != nil {
return nil, fmt.Errorf("failed to write newline: %w", err)
}
if !scanner.Scan() {
return nil, fmt.Errorf("failed to read response")
}
var resp models.Response[any]
if err := json.Unmarshal(scanner.Bytes(), &resp); err != nil {
return nil, fmt.Errorf("failed to unmarshal response: %w", err)
}
return &resp, nil
}
// sendServerRequestFireAndForget sends a request without waiting for a response.
// Useful for commands that trigger UI or async operations.
func sendServerRequestFireAndForget(req models.Request) error {
socketPath := getServerSocketPath()
conn, err := net.Dial("unix", socketPath)
if err != nil {
return fmt.Errorf("failed to connect to server (is it running?): %w", err)
}
defer conn.Close()
scanner := bufio.NewScanner(conn)
scanner.Scan() // discard initial capabilities message
reqData, err := json.Marshal(req)
if err != nil {
return fmt.Errorf("failed to marshal request: %w", err)
}
if _, err := conn.Write(reqData); err != nil {
return fmt.Errorf("failed to write request: %w", err)
}
if _, err := conn.Write([]byte("\n")); err != nil {
return fmt.Errorf("failed to write newline: %w", err)
}
return nil
}
// tryServerRequest attempts to send a request but returns false if server unavailable.
// Does not log errors - caller can decide what to do on failure.
func tryServerRequest(req models.Request) (*models.Response[any], bool) {
resp, err := sendServerRequest(req)
if err != nil {
return nil, false
}
return resp, true
}
func getServerSocketPath() string {
runtimeDir := os.Getenv("XDG_RUNTIME_DIR")
if runtimeDir == "" {
runtimeDir = os.TempDir()
}
entries, err := os.ReadDir(runtimeDir)
if err != nil {
return filepath.Join(runtimeDir, "danklinux.sock")
}
for _, entry := range entries {
name := entry.Name()
if name == "danklinux.sock" {
return filepath.Join(runtimeDir, name)
}
if len(name) > 10 && name[:10] == "danklinux-" && filepath.Ext(name) == ".sock" {
return filepath.Join(runtimeDir, name)
}
}
return server.GetSocketPath()
}
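The matugen and open commands above are rewritten to go through these helpers, and the clipboard commands use the same pattern. A minimal sketch of the intended call shape for a subcommand (exampleQuery is illustrative only; clipboard.getConfig is one of the methods already used in this changeset):

package main

import (
	"encoding/json"
	"fmt"

	"github.com/AvengeMedia/DankMaterialShell/core/internal/log"
	"github.com/AvengeMedia/DankMaterialShell/core/internal/server/models"
)

// exampleQuery shows the intended pattern: try the running server first and
// degrade gracefully when it is not available.
func exampleQuery() {
	req := models.Request{ID: 1, Method: "clipboard.getConfig"}
	resp, ok := tryServerRequest(req)
	if !ok {
		log.Info("Server unavailable, skipping query")
		return
	}
	if resp.Error != "" {
		log.Fatalf("Error: %s", resp.Error)
	}
	if resp.Result == nil {
		fmt.Println("no result")
		return
	}
	out, _ := json.MarshalIndent(*resp.Result, "", "  ")
	fmt.Println(string(out))
}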

View File

@@ -104,7 +104,6 @@ func getAllDMSPIDs() []int {
continue
}
// Check if the child process is still alive
proc, err := os.FindProcess(childPID)
if err != nil {
os.Remove(pidFile)
@@ -112,18 +111,15 @@ func getAllDMSPIDs() []int {
}
if err := proc.Signal(syscall.Signal(0)); err != nil {
// Process is dead, remove stale PID file
os.Remove(pidFile)
continue
}
pids = append(pids, childPID)
// Also get the parent PID from the filename
parentPIDStr := strings.TrimPrefix(entry.Name(), "danklinux-")
parentPIDStr = strings.TrimSuffix(parentPIDStr, ".pid")
if parentPID, err := strconv.Atoi(parentPIDStr); err == nil {
// Check if parent is still alive
if parentProc, err := os.FindProcess(parentPID); err == nil {
if err := parentProc.Signal(syscall.Signal(0)); err == nil {
pids = append(pids, parentPID)
@@ -159,6 +155,7 @@ func runShellInteractive(session bool) {
errChan <- fmt.Errorf("server panic: %v", r)
}
}()
server.CLIVersion = Version
if err := server.Start(false); err != nil {
errChan <- fmt.Errorf("server error: %w", err)
}
@@ -225,7 +222,6 @@ func runShellInteractive(session bool) {
return
}
// All other signals: clean shutdown
log.Infof("\nReceived signal %v, shutting down...", sig)
cancel()
cmd.Process.Signal(syscall.SIGTERM)
@@ -282,7 +278,6 @@ func restartShell() {
}
func killShell() {
// Get all tracked DMS PIDs from PID files
pids := getAllDMSPIDs()
if len(pids) == 0 {
@@ -293,14 +288,12 @@ func killShell() {
currentPid := os.Getpid()
uniquePids := make(map[int]bool)
// Deduplicate and filter out current process
for _, pid := range pids {
if pid != currentPid {
uniquePids[pid] = true
}
}
// Kill all tracked processes
for pid := range uniquePids {
proc, err := os.FindProcess(pid)
if err != nil {
@@ -308,7 +301,6 @@ func killShell() {
continue
}
// Check if process is still alive before killing
if err := proc.Signal(syscall.Signal(0)); err != nil {
continue
}
@@ -320,7 +312,6 @@ func killShell() {
}
}
// Clean up any remaining PID files
dir := getRuntimeDir()
entries, err := os.ReadDir(dir)
if err != nil {
@@ -337,7 +328,6 @@ func killShell() {
func runShellDaemon(session bool) {
isSessionManaged = session
// Check if this is the daemon child process by looking for the hidden flag
isDaemonChild := false
for _, arg := range os.Args {
if arg == "--daemon-child" {

View File

@@ -17,8 +17,8 @@ func getThemedASCII() string {
logo := `
██████╗ █████╗ ███╗ ██╗██╗ ██╗
██╔══██╗██╔══██╗████╗ ██║██║ ██╔╝
██║ ██║███████║██╔██╗ ██║█████╔╝
██║ ██║██╔══██║██║╚██╗██║██╔═██╗
██║ ██║███████║██╔██╗ ██║█████╔╝
██║ ██║██╔══██║██║╚██╗██║██╔═██╗
██████╔╝██║ ██║██║ ╚████║██║ ██╗
╚═════╝ ╚═╝ ╚═╝╚═╝ ╚═══╝╚═╝ ╚═╝`

View File

@@ -6,12 +6,6 @@ import (
"strings"
)
func commandExists(cmd string) bool {
_, err := exec.LookPath(cmd)
return err == nil
}
// findCommandPath returns the absolute path to a command in PATH
func findCommandPath(cmd string) (string, error) {
path, err := exec.LookPath(cmd)
if err != nil {

View File

@@ -15,7 +15,9 @@ require (
github.com/sblinch/kdl-go v0.0.0-20250930225324-bf4099d4614a
github.com/spf13/cobra v1.10.1
github.com/stretchr/testify v1.11.1
go.etcd.io/bbolt v1.4.3
golang.org/x/exp v0.0.0-20251125195548-87e1e737ad39
golang.org/x/image v0.34.0
)
require (
@@ -65,6 +67,6 @@ require (
github.com/spf13/pflag v1.0.10 // indirect
github.com/xo/terminfo v0.0.0-20220910002029-abceb7e1c41e // indirect
golang.org/x/sys v0.38.0
golang.org/x/text v0.31.0
golang.org/x/text v0.32.0
gopkg.in/yaml.v3 v3.0.1 // indirect
)

View File

@@ -131,20 +131,26 @@ github.com/stretchr/testify v1.11.1 h1:7s2iGBzp5EwR7/aIZr8ao5+dra3wiQyKjjFuvgVKu
github.com/stretchr/testify v1.11.1/go.mod h1:wZwfW3scLgRK+23gO65QZefKpKQRnfz6sD981Nm4B6U=
github.com/xo/terminfo v0.0.0-20220910002029-abceb7e1c41e h1:JVG44RsyaB9T2KIHavMF/ppJZNG9ZpyihvCd0w101no=
github.com/xo/terminfo v0.0.0-20220910002029-abceb7e1c41e/go.mod h1:RbqR21r5mrJuqunuUZ/Dhy/avygyECGrLceyNeo4LiM=
go.etcd.io/bbolt v1.4.3 h1:dEadXpI6G79deX5prL3QRNP6JB8UxVkqo4UPnHaNXJo=
go.etcd.io/bbolt v1.4.3/go.mod h1:tKQlpPaYCVFctUIgFKFnAlvbmB3tpy1vkTnDWohtc0E=
golang.org/x/crypto v0.45.0 h1:jMBrvKuj23MTlT0bQEOBcAE0mjg8mK9RXFhRH6nyF3Q=
golang.org/x/crypto v0.45.0/go.mod h1:XTGrrkGJve7CYK7J8PEww4aY7gM3qMCElcJQ8n8JdX4=
golang.org/x/exp v0.0.0-20251125195548-87e1e737ad39 h1:DHNhtq3sNNzrvduZZIiFyXWOL9IWaDPHqTnLJp+rCBY=
golang.org/x/exp v0.0.0-20251125195548-87e1e737ad39/go.mod h1:46edojNIoXTNOhySWIWdix628clX9ODXwPsQuG6hsK0=
golang.org/x/image v0.34.0 h1:33gCkyw9hmwbZJeZkct8XyR11yH889EQt/QH4VmXMn8=
golang.org/x/image v0.34.0/go.mod h1:2RNFBZRB+vnwwFil8GkMdRvrJOFd1AzdZI6vOY+eJVU=
golang.org/x/net v0.47.0 h1:Mx+4dIFzqraBXUugkia1OOvlD6LemFo1ALMHjrXDOhY=
golang.org/x/net v0.47.0/go.mod h1:/jNxtkgq5yWUGYkaZGqo27cfGZ1c5Nen03aYrrKpVRU=
golang.org/x/sync v0.19.0 h1:vV+1eWNmZ5geRlYjzm2adRgW2/mcpevXNg50YZtPCE4=
golang.org/x/sync v0.19.0/go.mod h1:9KTHXmSnoGruLpwFjVSX0lNNA75CykiMECbovNTZqGI=
golang.org/x/sys v0.0.0-20210809222454-d867a43fc93e/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.6.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.38.0 h1:3yZWxaJjBmCWXqhN1qh02AkOnCQ1poK6oF+a7xWL6Gc=
golang.org/x/sys v0.38.0/go.mod h1:OgkHotnGiDImocRcuBABYBEXf8A9a87e/uXjp9XT3ks=
golang.org/x/term v0.37.0 h1:8EGAD0qCmHYZg6J17DvsMy9/wJ7/D/4pV/wfnld5lTU=
golang.org/x/term v0.37.0/go.mod h1:5pB4lxRNYYVZuTLmy8oR2BH8dflOR+IbTYFD8fi3254=
golang.org/x/text v0.31.0 h1:aC8ghyu4JhP8VojJ2lEHBnochRno1sgL6nEi9WGFGMM=
golang.org/x/text v0.31.0/go.mod h1:tKRAlv61yKIjGGHX/4tP1LTbc13YSec1pxVEWXzfoeM=
golang.org/x/text v0.32.0 h1:ZD01bjUt1FQ9WJ0ClOL5vxgxOI/sVCNgX1YtKwcY0mU=
golang.org/x/text v0.32.0/go.mod h1:o/rUWzghvpD5TXrTIBuJU77MTaN0ljMWE47kxGJQ7jY=
gopkg.in/check.v1 v0.0.0-20161208181325-20d25e280405/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=
gopkg.in/check.v1 v1.0.0-20190902080502-41f04d3bba15/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=
gopkg.in/check.v1 v1.0.0-20201130134442-10cb98267c6c h1:Hei/4ADfdWqJk1ZMxUNpqntNwaWcugrBjAiHlqqRiVk=

View File

@@ -1,4 +1,4 @@
#!/bin/sh
#!/usr/bin/env bash
set -e
@@ -9,8 +9,8 @@ NC='\033[0m' # No Color
# Check for root privileges
if [ "$(id -u)" == "0" ]; then
printf "%bError: This script must not be run as root%b\n" "$RED" "$NC"
exit 1
printf "%bError: This script must not be run as root%b\n" "$RED" "$NC"
exit 1
fi
# Check if running on Linux
@@ -22,17 +22,17 @@ fi
# Detect architecture
ARCH=$(uname -m)
case "$ARCH" in
x86_64)
ARCH="amd64"
;;
aarch64)
ARCH="arm64"
;;
*)
printf "%bError: Unsupported architecture: %s%b\n" "$RED" "$ARCH" "$NC"
printf "This installer only supports x86_64 (amd64) and aarch64 (arm64) architectures\n"
exit 1
;;
x86_64)
ARCH="amd64"
;;
aarch64)
ARCH="arm64"
;;
*)
printf "%bError: Unsupported architecture: %s%b\n" "$RED" "$ARCH" "$NC"
printf "This installer only supports x86_64 (amd64) and aarch64 (arm64) architectures\n"
exit 1
;;
esac
# Get the latest release version
@@ -55,7 +55,7 @@ curl -L "https://github.com/AvengeMedia/DankMaterialShell/releases/download/$LAT
curl -L "https://github.com/AvengeMedia/DankMaterialShell/releases/download/$LATEST_VERSION/dankinstall-$ARCH.gz.sha256" -o "expected.sha256"
# Get the expected checksum
EXPECTED_CHECKSUM=$(cat expected.sha256 | awk '{print $1}')
EXPECTED_CHECKSUM=$(awk '{print $1}' expected.sha256)
# Calculate actual checksum
printf "%bVerifying checksum...%b\n" "$GREEN" "$NC"
@@ -67,7 +67,7 @@ if [ "$EXPECTED_CHECKSUM" != "$ACTUAL_CHECKSUM" ]; then
printf "Expected: %s\n" "$EXPECTED_CHECKSUM"
printf "Got: %s\n" "$ACTUAL_CHECKSUM"
printf "The downloaded file may be corrupted or tampered with\n"
cd - > /dev/null
cd - >/dev/null
rm -rf "$TEMP_DIR"
exit 1
fi
@@ -82,5 +82,5 @@ printf "%bRunning installer...%b\n" "$GREEN" "$NC"
./installer
# Cleanup
cd - > /dev/null
rm -rf "$TEMP_DIR"
cd - >/dev/null
rm -rf "$TEMP_DIR"

View File

@@ -0,0 +1,332 @@
package clipboard
import (
"fmt"
"io"
"os"
"os/exec"
"syscall"
"github.com/AvengeMedia/DankMaterialShell/core/internal/proto/ext_data_control"
wlclient "github.com/AvengeMedia/DankMaterialShell/core/pkg/go-wayland/wayland/client"
)
func Copy(data []byte, mimeType string) error {
return CopyOpts(data, mimeType, false, false)
}
func CopyOpts(data []byte, mimeType string, foreground, pasteOnce bool) error {
if !foreground {
return copyFork(data, mimeType, pasteOnce)
}
return copyServe(data, mimeType, pasteOnce)
}
func copyFork(data []byte, mimeType string, pasteOnce bool) error {
args := []string{os.Args[0], "cl", "copy", "--foreground"}
if pasteOnce {
args = append(args, "--paste-once")
}
args = append(args, "--type", mimeType)
cmd := exec.Command(args[0], args[1:]...)
cmd.Stdin = nil
cmd.Stdout = nil
cmd.Stderr = nil
cmd.SysProcAttr = &syscall.SysProcAttr{Setsid: true}
stdin, err := cmd.StdinPipe()
if err != nil {
return fmt.Errorf("stdin pipe: %w", err)
}
if err := cmd.Start(); err != nil {
return fmt.Errorf("start: %w", err)
}
if _, err := stdin.Write(data); err != nil {
stdin.Close()
return fmt.Errorf("write stdin: %w", err)
}
stdin.Close()
return nil
}
func copyServe(data []byte, mimeType string, pasteOnce bool) error {
display, err := wlclient.Connect("")
if err != nil {
return fmt.Errorf("wayland connect: %w", err)
}
defer display.Destroy()
ctx := display.Context()
registry, err := display.GetRegistry()
if err != nil {
return fmt.Errorf("get registry: %w", err)
}
defer registry.Destroy()
var dataControlMgr *ext_data_control.ExtDataControlManagerV1
var seat *wlclient.Seat
var bindErr error
registry.SetGlobalHandler(func(e wlclient.RegistryGlobalEvent) {
switch e.Interface {
case "ext_data_control_manager_v1":
dataControlMgr = ext_data_control.NewExtDataControlManagerV1(ctx)
if err := registry.Bind(e.Name, e.Interface, e.Version, dataControlMgr); err != nil {
bindErr = err
}
case "wl_seat":
if seat != nil {
return
}
seat = wlclient.NewSeat(ctx)
if err := registry.Bind(e.Name, e.Interface, e.Version, seat); err != nil {
bindErr = err
}
}
})
display.Roundtrip()
display.Roundtrip()
if bindErr != nil {
return fmt.Errorf("registry bind: %w", bindErr)
}
if dataControlMgr == nil {
return fmt.Errorf("compositor does not support ext_data_control_manager_v1")
}
defer dataControlMgr.Destroy()
if seat == nil {
return fmt.Errorf("no seat available")
}
device, err := dataControlMgr.GetDataDevice(seat)
if err != nil {
return fmt.Errorf("get data device: %w", err)
}
defer device.Destroy()
source, err := dataControlMgr.CreateDataSource()
if err != nil {
return fmt.Errorf("create data source: %w", err)
}
if err := source.Offer(mimeType); err != nil {
return fmt.Errorf("offer mime type: %w", err)
}
if mimeType == "text/plain;charset=utf-8" || mimeType == "text/plain" {
if err := source.Offer("text/plain"); err != nil {
return fmt.Errorf("offer text/plain: %w", err)
}
if err := source.Offer("text/plain;charset=utf-8"); err != nil {
return fmt.Errorf("offer text/plain;charset=utf-8: %w", err)
}
if err := source.Offer("UTF8_STRING"); err != nil {
return fmt.Errorf("offer UTF8_STRING: %w", err)
}
if err := source.Offer("STRING"); err != nil {
return fmt.Errorf("offer STRING: %w", err)
}
if err := source.Offer("TEXT"); err != nil {
return fmt.Errorf("offer TEXT: %w", err)
}
}
cancelled := make(chan struct{})
pasted := make(chan struct{}, 1)
source.SetSendHandler(func(e ext_data_control.ExtDataControlSourceV1SendEvent) {
defer syscall.Close(e.Fd)
file := os.NewFile(uintptr(e.Fd), "pipe")
defer file.Close()
file.Write(data)
select {
case pasted <- struct{}{}:
default:
}
})
source.SetCancelledHandler(func(e ext_data_control.ExtDataControlSourceV1CancelledEvent) {
close(cancelled)
})
if err := device.SetSelection(source); err != nil {
return fmt.Errorf("set selection: %w", err)
}
display.Roundtrip()
for {
select {
case <-cancelled:
return nil
case <-pasted:
if pasteOnce {
return nil
}
default:
if err := ctx.Dispatch(); err != nil {
return nil
}
}
}
}
func CopyText(text string) error {
return Copy([]byte(text), "text/plain;charset=utf-8")
}
func Paste() ([]byte, string, error) {
display, err := wlclient.Connect("")
if err != nil {
return nil, "", fmt.Errorf("wayland connect: %w", err)
}
defer display.Destroy()
ctx := display.Context()
registry, err := display.GetRegistry()
if err != nil {
return nil, "", fmt.Errorf("get registry: %w", err)
}
defer registry.Destroy()
var dataControlMgr *ext_data_control.ExtDataControlManagerV1
var seat *wlclient.Seat
var bindErr error
registry.SetGlobalHandler(func(e wlclient.RegistryGlobalEvent) {
switch e.Interface {
case "ext_data_control_manager_v1":
dataControlMgr = ext_data_control.NewExtDataControlManagerV1(ctx)
if err := registry.Bind(e.Name, e.Interface, e.Version, dataControlMgr); err != nil {
bindErr = err
}
case "wl_seat":
if seat != nil {
return
}
seat = wlclient.NewSeat(ctx)
if err := registry.Bind(e.Name, e.Interface, e.Version, seat); err != nil {
bindErr = err
}
}
})
display.Roundtrip()
display.Roundtrip()
if bindErr != nil {
return nil, "", fmt.Errorf("registry bind: %w", bindErr)
}
if dataControlMgr == nil {
return nil, "", fmt.Errorf("compositor does not support ext_data_control_manager_v1")
}
defer dataControlMgr.Destroy()
if seat == nil {
return nil, "", fmt.Errorf("no seat available")
}
device, err := dataControlMgr.GetDataDevice(seat)
if err != nil {
return nil, "", fmt.Errorf("get data device: %w", err)
}
defer device.Destroy()
offerMimeTypes := make(map[*ext_data_control.ExtDataControlOfferV1][]string)
device.SetDataOfferHandler(func(e ext_data_control.ExtDataControlDeviceV1DataOfferEvent) {
if e.Id == nil {
return
}
offerMimeTypes[e.Id] = nil
e.Id.SetOfferHandler(func(me ext_data_control.ExtDataControlOfferV1OfferEvent) {
offerMimeTypes[e.Id] = append(offerMimeTypes[e.Id], me.MimeType)
})
})
var selectionOffer *ext_data_control.ExtDataControlOfferV1
gotSelection := false
device.SetSelectionHandler(func(e ext_data_control.ExtDataControlDeviceV1SelectionEvent) {
selectionOffer = e.Id
gotSelection = true
})
display.Roundtrip()
display.Roundtrip()
if !gotSelection || selectionOffer == nil {
return nil, "", fmt.Errorf("no clipboard data")
}
mimeTypes := offerMimeTypes[selectionOffer]
selectedMime := selectPreferredMimeType(mimeTypes)
if selectedMime == "" {
return nil, "", fmt.Errorf("no supported mime type")
}
r, w, err := os.Pipe()
if err != nil {
return nil, "", fmt.Errorf("create pipe: %w", err)
}
defer r.Close()
if err := selectionOffer.Receive(selectedMime, int(w.Fd())); err != nil {
w.Close()
return nil, "", fmt.Errorf("receive: %w", err)
}
w.Close()
display.Roundtrip()
data, err := io.ReadAll(r)
if err != nil {
return nil, "", fmt.Errorf("read: %w", err)
}
return data, selectedMime, nil
}
func PasteText() (string, error) {
data, _, err := Paste()
if err != nil {
return "", err
}
return string(data), nil
}
func selectPreferredMimeType(mimes []string) string {
preferred := []string{
"text/plain;charset=utf-8",
"text/plain",
"UTF8_STRING",
"STRING",
"TEXT",
"image/png",
"image/jpeg",
}
for _, pref := range preferred {
for _, mime := range mimes {
if mime == pref {
return mime
}
}
}
if len(mimes) > 0 {
return mimes[0]
}
return ""
}
func IsImageMimeType(mime string) bool {
return len(mime) > 6 && mime[:6] == "image/"
}
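For callers inside this module (the package is internal), the exported surface boils down to Copy/CopyOpts/CopyText, Paste/PasteText and the MIME helpers. A minimal round-trip sketch, assuming a compositor that supports ext_data_control_manager_v1:

package main

import (
	"fmt"
	"log"

	"github.com/AvengeMedia/DankMaterialShell/core/internal/clipboard"
)

func main() {
	// CopyText forks a background process that keeps serving the selection,
	// so this call returns immediately; ownership may take a moment to land.
	if err := clipboard.CopyText("hello from dms"); err != nil {
		log.Fatalf("copy: %v", err)
	}

	// Paste reads whatever currently owns the selection and reports the
	// negotiated MIME type.
	data, mime, err := clipboard.Paste()
	if err != nil {
		log.Fatalf("paste: %v", err)
	}
	fmt.Printf("%d bytes of %s\n", len(data), mime)
}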

View File

@@ -0,0 +1,229 @@
package clipboard
import (
"bytes"
"encoding/binary"
"fmt"
"image"
_ "image/gif"
_ "image/jpeg"
_ "image/png"
"os"
"path/filepath"
"strings"
"time"
_ "golang.org/x/image/bmp"
_ "golang.org/x/image/tiff"
"hash/fnv"
bolt "go.etcd.io/bbolt"
)
type StoreConfig struct {
MaxHistory int
MaxEntrySize int64
}
func DefaultStoreConfig() StoreConfig {
return StoreConfig{
MaxHistory: 100,
MaxEntrySize: 5 * 1024 * 1024,
}
}
type Entry struct {
ID uint64
Data []byte
MimeType string
Preview string
Size int
Timestamp time.Time
IsImage bool
Hash uint64
}
func Store(data []byte, mimeType string) error {
return StoreWithConfig(data, mimeType, DefaultStoreConfig())
}
func StoreWithConfig(data []byte, mimeType string, cfg StoreConfig) error {
if len(data) == 0 {
return nil
}
if int64(len(data)) > cfg.MaxEntrySize {
return fmt.Errorf("data too large: %d > %d", len(data), cfg.MaxEntrySize)
}
dbPath, err := getDBPath()
if err != nil {
return fmt.Errorf("get db path: %w", err)
}
db, err := bolt.Open(dbPath, 0644, &bolt.Options{Timeout: 1 * time.Second})
if err != nil {
return fmt.Errorf("open db: %w", err)
}
defer db.Close()
entry := Entry{
Data: data,
MimeType: mimeType,
Size: len(data),
Timestamp: time.Now(),
IsImage: IsImageMimeType(mimeType),
Hash: computeHash(data),
}
switch {
case entry.IsImage:
entry.Preview = imagePreview(data, mimeType)
default:
entry.Preview = textPreview(data)
}
return db.Update(func(tx *bolt.Tx) error {
b, err := tx.CreateBucketIfNotExists([]byte("clipboard"))
if err != nil {
return err
}
if err := deduplicateInTx(b, entry.Hash); err != nil {
return err
}
id, err := b.NextSequence()
if err != nil {
return err
}
entry.ID = id
encoded, err := encodeEntry(entry)
if err != nil {
return err
}
if err := b.Put(itob(id), encoded); err != nil {
return err
}
return trimLengthInTx(b, cfg.MaxHistory)
})
}
func getDBPath() (string, error) {
cacheDir, err := os.UserCacheDir()
if err != nil {
homeDir, err := os.UserHomeDir()
if err != nil {
return "", err
}
cacheDir = filepath.Join(homeDir, ".cache")
}
dbDir := filepath.Join(cacheDir, "dms-clipboard")
if err := os.MkdirAll(dbDir, 0700); err != nil {
return "", err
}
return filepath.Join(dbDir, "db"), nil
}
func deduplicateInTx(b *bolt.Bucket, hash uint64) error {
c := b.Cursor()
for k, v := c.Last(); k != nil; k, v = c.Prev() {
if extractHash(v) != hash {
continue
}
if err := b.Delete(k); err != nil {
return err
}
}
return nil
}
func trimLengthInTx(b *bolt.Bucket, maxHistory int) error {
c := b.Cursor()
var count int
for k, _ := c.Last(); k != nil; k, _ = c.Prev() {
if count < maxHistory {
count++
continue
}
if err := b.Delete(k); err != nil {
return err
}
}
return nil
}
func encodeEntry(e Entry) ([]byte, error) {
buf := new(bytes.Buffer)
binary.Write(buf, binary.BigEndian, e.ID)
binary.Write(buf, binary.BigEndian, uint32(len(e.Data)))
buf.Write(e.Data)
binary.Write(buf, binary.BigEndian, uint32(len(e.MimeType)))
buf.WriteString(e.MimeType)
binary.Write(buf, binary.BigEndian, uint32(len(e.Preview)))
buf.WriteString(e.Preview)
binary.Write(buf, binary.BigEndian, int32(e.Size))
binary.Write(buf, binary.BigEndian, e.Timestamp.Unix())
if e.IsImage {
buf.WriteByte(1)
} else {
buf.WriteByte(0)
}
binary.Write(buf, binary.BigEndian, e.Hash)
return buf.Bytes(), nil
}
func itob(v uint64) []byte {
b := make([]byte, 8)
binary.BigEndian.PutUint64(b, v)
return b
}
func computeHash(data []byte) uint64 {
h := fnv.New64a()
h.Write(data)
return h.Sum64()
}
func extractHash(data []byte) uint64 {
if len(data) < 8 {
return 0
}
return binary.BigEndian.Uint64(data[len(data)-8:])
}
func textPreview(data []byte) string {
text := string(data)
text = strings.TrimSpace(text)
text = strings.Join(strings.Fields(text), " ")
if len(text) > 100 {
return text[:100] + "…"
}
return text
}
func imagePreview(data []byte, format string) string {
config, imgFmt, err := image.DecodeConfig(bytes.NewReader(data))
if err != nil {
return fmt.Sprintf("[[ image %s %s ]]", sizeStr(len(data)), format)
}
return fmt.Sprintf("[[ image %s %s %dx%d ]]", sizeStr(len(data)), imgFmt, config.Width, config.Height)
}
func sizeStr(size int) string {
units := []string{"B", "KiB", "MiB"}
var i int
fsize := float64(size)
for fsize >= 1024 && i < len(units)-1 {
fsize /= 1024
i++
}
return fmt.Sprintf("%.0f %s", fsize, units[i])
}
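encodeEntry fixes the on-disk layout: ID, length-prefixed data/MIME/preview, size, Unix timestamp, image flag, then the FNV hash that extractHash reads from the tail. The decoder is not part of this hunk; a sketch of the counterpart, assuming it sits in the same package so Entry is in scope (decodeEntry is illustrative):

package clipboard

import (
	"bytes"
	"encoding/binary"
	"fmt"
	"io"
	"time"
)

// decodeEntry reverses encodeEntry's big-endian, length-prefixed layout.
func decodeEntry(raw []byte) (Entry, error) {
	var e Entry
	r := bytes.NewReader(raw)

	// readBytes reads a uint32 length prefix followed by that many bytes.
	readBytes := func() ([]byte, error) {
		var n uint32
		if err := binary.Read(r, binary.BigEndian, &n); err != nil {
			return nil, err
		}
		b := make([]byte, n)
		if _, err := io.ReadFull(r, b); err != nil {
			return nil, err
		}
		return b, nil
	}

	if err := binary.Read(r, binary.BigEndian, &e.ID); err != nil {
		return e, fmt.Errorf("id: %w", err)
	}
	var err error
	if e.Data, err = readBytes(); err != nil {
		return e, fmt.Errorf("data: %w", err)
	}
	var b []byte
	if b, err = readBytes(); err != nil {
		return e, fmt.Errorf("mime: %w", err)
	}
	e.MimeType = string(b)
	if b, err = readBytes(); err != nil {
		return e, fmt.Errorf("preview: %w", err)
	}
	e.Preview = string(b)
	var size int32
	if err = binary.Read(r, binary.BigEndian, &size); err != nil {
		return e, fmt.Errorf("size: %w", err)
	}
	e.Size = int(size)
	var unix int64
	if err = binary.Read(r, binary.BigEndian, &unix); err != nil {
		return e, fmt.Errorf("timestamp: %w", err)
	}
	e.Timestamp = time.Unix(unix, 0)
	var isImage uint8
	if err = binary.Read(r, binary.BigEndian, &isImage); err != nil {
		return e, fmt.Errorf("image flag: %w", err)
	}
	e.IsImage = isImage == 1
	if err = binary.Read(r, binary.BigEndian, &e.Hash); err != nil {
		return e, fmt.Errorf("hash: %w", err)
	}
	return e, nil
}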

View File

@@ -0,0 +1,160 @@
package clipboard
import (
"context"
"fmt"
"io"
"os"
"time"
"github.com/AvengeMedia/DankMaterialShell/core/internal/proto/ext_data_control"
wlclient "github.com/AvengeMedia/DankMaterialShell/core/pkg/go-wayland/wayland/client"
)
type ClipboardChange struct {
Data []byte
MimeType string
}
func Watch(ctx context.Context, callback func(data []byte, mimeType string)) error {
display, err := wlclient.Connect("")
if err != nil {
return fmt.Errorf("wayland connect: %w", err)
}
defer display.Destroy()
wlCtx := display.Context()
registry, err := display.GetRegistry()
if err != nil {
return fmt.Errorf("get registry: %w", err)
}
defer registry.Destroy()
var dataControlMgr *ext_data_control.ExtDataControlManagerV1
var seat *wlclient.Seat
var bindErr error
registry.SetGlobalHandler(func(e wlclient.RegistryGlobalEvent) {
switch e.Interface {
case "ext_data_control_manager_v1":
dataControlMgr = ext_data_control.NewExtDataControlManagerV1(wlCtx)
if err := registry.Bind(e.Name, e.Interface, e.Version, dataControlMgr); err != nil {
bindErr = err
}
case "wl_seat":
if seat != nil {
return
}
seat = wlclient.NewSeat(wlCtx)
if err := registry.Bind(e.Name, e.Interface, e.Version, seat); err != nil {
bindErr = err
}
}
})
display.Roundtrip()
display.Roundtrip()
if bindErr != nil {
return fmt.Errorf("registry bind: %w", bindErr)
}
if dataControlMgr == nil {
return fmt.Errorf("compositor does not support ext_data_control_manager_v1")
}
defer dataControlMgr.Destroy()
if seat == nil {
return fmt.Errorf("no seat available")
}
device, err := dataControlMgr.GetDataDevice(seat)
if err != nil {
return fmt.Errorf("get data device: %w", err)
}
defer device.Destroy()
offerMimeTypes := make(map[*ext_data_control.ExtDataControlOfferV1][]string)
device.SetDataOfferHandler(func(e ext_data_control.ExtDataControlDeviceV1DataOfferEvent) {
if e.Id == nil {
return
}
offerMimeTypes[e.Id] = nil
e.Id.SetOfferHandler(func(me ext_data_control.ExtDataControlOfferV1OfferEvent) {
offerMimeTypes[e.Id] = append(offerMimeTypes[e.Id], me.MimeType)
})
})
device.SetSelectionHandler(func(e ext_data_control.ExtDataControlDeviceV1SelectionEvent) {
if e.Id == nil {
return
}
mimes := offerMimeTypes[e.Id]
selectedMime := selectPreferredMimeType(mimes)
if selectedMime == "" {
return
}
r, w, err := os.Pipe()
if err != nil {
return
}
if err := e.Id.Receive(selectedMime, int(w.Fd())); err != nil {
w.Close()
r.Close()
return
}
w.Close()
go func() {
defer r.Close()
data, err := io.ReadAll(r)
if err != nil || len(data) == 0 {
return
}
callback(data, selectedMime)
}()
})
display.Roundtrip()
display.Roundtrip()
for {
select {
case <-ctx.Done():
return ctx.Err()
default:
if err := wlCtx.SetReadDeadline(time.Now().Add(100 * time.Millisecond)); err != nil {
return fmt.Errorf("set read deadline: %w", err)
}
if err := wlCtx.Dispatch(); err != nil && err != os.ErrDeadlineExceeded {
return fmt.Errorf("dispatch: %w", err)
}
}
}
}
func WatchChan(ctx context.Context) (<-chan ClipboardChange, <-chan error) {
ch := make(chan ClipboardChange, 16)
errCh := make(chan error, 1)
go func() {
defer close(ch)
err := Watch(ctx, func(data []byte, mimeType string) {
select {
case ch <- ClipboardChange{Data: data, MimeType: mimeType}:
default:
}
})
if err != nil && err != context.Canceled {
errCh <- err
}
close(errCh)
}()
time.Sleep(50 * time.Millisecond)
return ch, errCh
}
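A minimal consumer sketch for WatchChan (illustrative only; "context", "log", and the clipboard package from this diff are assumed to be imported, so in practice the caller lives inside the same module):
func watchClipboard(ctx context.Context) error {
	changes, errs := clipboard.WatchChan(ctx)
	// the changes channel closes when Watch returns (e.g. on ctx cancellation)
	for change := range changes {
		log.Printf("clipboard: %d bytes (%s)", len(change.Data), change.MimeType)
	}
	// errs is closed before changes, so this read never blocks
	if err, ok := <-errs; ok && err != nil {
		return err
	}
	return nil
}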

View File

@@ -30,6 +30,7 @@ type Output struct {
height int32
scale int32
fractionalScale float64
transform int32
}
type LayerSurface struct {
@@ -276,6 +277,7 @@ func (p *Picker) setupOutputHandlers(name uint32, output *client.Output) {
if o, ok := p.outputs[name]; ok {
o.x = e.X
o.y = e.Y
o.transform = int32(e.Transform)
}
p.outputsMu.Unlock()
})
@@ -485,8 +487,19 @@ func (p *Picker) captureForSurface(ls *LayerSurface) {
frame.SetReadyHandler(func(e wlr_screencopy.ZwlrScreencopyFrameV1ReadyEvent) {
ls.state.OnScreencopyReady()
logicalW, _ := ls.state.LogicalSize()
screenBuf := ls.state.ScreenBuffer()
if screenBuf != nil && ls.output.transform != TransformNormal {
invTransform := InverseTransform(ls.output.transform)
transformed, err := screenBuf.ApplyTransform(invTransform)
if err != nil {
log.Error("apply transform failed", "err", err)
} else if transformed != screenBuf {
ls.state.ReplaceScreenBuffer(transformed)
}
}
logicalW, _ := ls.state.LogicalSize()
screenBuf = ls.state.ScreenBuffer()
if logicalW > 0 && screenBuf != nil {
ls.output.fractionalScale = float64(screenBuf.Width) / float64(logicalW)
}
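The fractional scale computed above is simply buffer width over logical width; an arithmetic sketch with illustrative numbers:
package main

import "fmt"

func main() {
	// Illustrative values only: a 3840 px wide screencopy of an output whose
	// logical width is 2560 px corresponds to a fractional scale of 1.5.
	bufWidth, logicalWidth := 3840, 2560
	fmt.Println(float64(bufWidth) / float64(logicalWidth)) // 1.5
}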

View File

@@ -4,10 +4,25 @@ import "github.com/AvengeMedia/DankMaterialShell/core/internal/wayland/shm"
type ShmBuffer = shm.Buffer
const (
TransformNormal = shm.TransformNormal
Transform90 = shm.Transform90
Transform180 = shm.Transform180
Transform270 = shm.Transform270
TransformFlipped = shm.TransformFlipped
TransformFlipped90 = shm.TransformFlipped90
TransformFlipped180 = shm.TransformFlipped180
TransformFlipped270 = shm.TransformFlipped270
)
func CreateShmBuffer(width, height, stride int) (*ShmBuffer, error) {
return shm.CreateBuffer(width, height, stride)
}
func InverseTransform(transform int32) int32 {
return shm.InverseTransform(transform)
}
func GetPixelColor(buf *ShmBuffer, x, y int) Color {
return GetPixelColorWithFormat(buf, x, y, FormatARGB8888)
}

View File

@@ -1,6 +1,7 @@
package colorpicker
import (
"fmt"
"math"
"strings"
"sync"
@@ -15,6 +16,8 @@ const (
FormatXRGB8888 = shm.FormatXRGB8888
FormatABGR8888 = shm.FormatABGR8888
FormatXBGR8888 = shm.FormatXBGR8888
FormatRGB888 = shm.FormatRGB888
FormatBGR888 = shm.FormatBGR888
)
type SurfaceState struct {
@@ -79,6 +82,11 @@ func (s *SurfaceState) OnScreencopyBuffer(format PixelFormat, width, height, str
s.mu.Lock()
defer s.mu.Unlock()
bpp := format.BytesPerPixel()
if stride < width*bpp {
return fmt.Errorf("invalid stride %d for width %d (bpp=%d)", stride, width, bpp)
}
if s.screenBuf != nil {
s.screenBuf.Close()
s.screenBuf = nil
@@ -90,6 +98,7 @@ func (s *SurfaceState) OnScreencopyBuffer(format PixelFormat, width, height, str
}
s.screenBuf = buf
s.screenBuf.Format = format
s.screenFormat = format
return nil
}
@@ -106,6 +115,20 @@ func (s *SurfaceState) ScreenFormat() PixelFormat {
return s.screenFormat
}
func (s *SurfaceState) ReplaceScreenBuffer(newBuf *ShmBuffer) {
s.mu.Lock()
defer s.mu.Unlock()
if s.screenBuf != nil {
s.screenBuf.Close()
}
s.screenBuf = newBuf
s.screenFormat = newBuf.Format
s.recomputeScale()
s.ensureRenderBuffers()
}
func (s *SurfaceState) OnScreencopyFlags(flags uint32) {
s.mu.Lock()
s.yInverted = (flags & 1) != 0
@@ -120,6 +143,15 @@ func (s *SurfaceState) OnScreencopyReady() {
return
}
if s.screenFormat.Is24Bit() {
converted, newFormat, err := s.screenBuf.ConvertTo32Bit(s.screenFormat)
if err == nil && converted != s.screenBuf {
s.screenBuf.Close()
s.screenBuf = converted
s.screenFormat = newFormat
}
}
s.recomputeScale()
s.ensureRenderBuffers()
s.readyForDisplay = true
@@ -279,10 +311,10 @@ func (s *SurfaceState) Redraw() *ShmBuffer {
drawMagnifierWithInversion(
dst.Data(), dst.Stride, dst.Width, dst.Height,
s.screenBuf.Data(), s.screenBuf.Stride, s.screenBuf.Width, s.screenBuf.Height,
px, py, picked, s.yInverted,
px, py, picked, s.yInverted, s.screenFormat,
)
drawColorPreview(dst.Data(), dst.Stride, dst.Width, dst.Height, px, py, picked, s.displayFormat, s.lowercase)
drawColorPreview(dst.Data(), dst.Stride, dst.Width, dst.Height, px, py, picked, s.displayFormat, s.lowercase, s.screenFormat)
return dst
}
@@ -390,6 +422,7 @@ func drawMagnifierWithInversion(
cx, cy int,
borderColor Color,
yInverted bool,
format PixelFormat,
) {
if dstW <= 0 || dstH <= 0 || srcW <= 0 || srcH <= 0 {
return
@@ -407,6 +440,14 @@ func drawMagnifierWithInversion(
innerRadius := float64(outerRadius - borderThickness)
outerRadiusF := float64(outerRadius)
var rOff, bOff int
switch format {
case FormatABGR8888, FormatXBGR8888:
rOff, bOff = 0, 2
default:
rOff, bOff = 2, 0
}
for dy := -outerRadius - 2; dy <= outerRadius+2; dy++ {
y := cy + dy
if y < 0 || y >= dstH {
@@ -431,9 +472,9 @@ func drawMagnifierWithInversion(
}
bgColor := Color{
B: dst[dstOff+0],
R: dst[dstOff+rOff],
G: dst[dstOff+1],
R: dst[dstOff+2],
B: dst[dstOff+bOff],
A: dst[dstOff+3],
}
@@ -462,7 +503,7 @@ func drawMagnifierWithInversion(
}
srcOff := sy*srcStride + sx*4
if srcOff+4 <= len(src) {
magColor := Color{B: src[srcOff+0], G: src[srcOff+1], R: src[srcOff+2], A: 255}
magColor := Color{R: src[srcOff+rOff], G: src[srcOff+1], B: src[srcOff+bOff], A: 255}
finalColor = blendColors(magColor, borderColor, alpha)
} else {
finalColor = borderColor
@@ -483,24 +524,25 @@ func drawMagnifierWithInversion(
}
srcOff := sy*srcStride + sx*4
if srcOff+4 <= len(src) {
finalColor = Color{B: src[srcOff+0], G: src[srcOff+1], R: src[srcOff+2], A: 255}
finalColor = Color{R: src[srcOff+rOff], G: src[srcOff+1], B: src[srcOff+bOff], A: 255}
} else {
continue
}
}
dst[dstOff+0] = finalColor.B
dst[dstOff+rOff] = finalColor.R
dst[dstOff+1] = finalColor.G
dst[dstOff+2] = finalColor.R
dst[dstOff+bOff] = finalColor.B
dst[dstOff+3] = 255
}
}
drawMagnifierCrosshair(dst, dstStride, dstW, dstH, cx, cy, int(innerRadius), crossThickness, crossInnerRadius)
drawMagnifierCrosshair(dst, dstStride, dstW, dstH, cx, cy, int(innerRadius), crossThickness, crossInnerRadius, format)
}
func drawMagnifierCrosshair(
data []byte, stride, width, height, cx, cy, radius, thickness, innerRadius int,
format PixelFormat,
) {
if width <= 0 || height <= 0 {
return
@@ -998,7 +1040,7 @@ var fontGlyphs = map[rune][fontH]uint8{
},
}
func drawColorPreview(data []byte, stride, width, height int, cx, cy int, c Color, format OutputFormat, lowercase bool) {
func drawColorPreview(data []byte, stride, width, height int, cx, cy int, c Color, format OutputFormat, lowercase bool, pixelFormat PixelFormat) {
text := formatColorForPreview(c, format, lowercase)
if len(text) == 0 {
return
@@ -1033,9 +1075,8 @@ func drawColorPreview(data []byte, stride, width, height int, cx, cy int, c Colo
y = height - boxH
}
drawFilledRect(data, stride, width, height, x, y, boxW, boxH, c)
drawFilledRect(data, stride, width, height, x, y, boxW, boxH, c, pixelFormat)
// Use contrasting text color based on luminance
lum := 0.299*float64(c.R) + 0.587*float64(c.G) + 0.114*float64(c.B)
var fg Color
if lum > 128 {
@@ -1043,7 +1084,7 @@ func drawColorPreview(data []byte, stride, width, height int, cx, cy int, c Colo
} else {
fg = Color{R: 255, G: 255, B: 255, A: 255}
}
drawText(data, stride, width, height, x+paddingX, y+paddingY, text, fg)
drawText(data, stride, width, height, x+paddingX, y+paddingY, text, fg, pixelFormat)
}
func formatColorForPreview(c Color, format OutputFormat, lowercase bool) string {
@@ -1064,7 +1105,7 @@ func formatColorForPreview(c Color, format OutputFormat, lowercase bool) string
}
}
func drawFilledRect(data []byte, stride, width, height, x, y, w, h int, col Color) {
func drawFilledRect(data []byte, stride, width, height, x, y, w, h int, col Color, format PixelFormat) {
if w <= 0 || h <= 0 {
return
}
@@ -1073,6 +1114,14 @@ func drawFilledRect(data []byte, stride, width, height, x, y, w, h int, col Colo
x = clamp(x, 0, width)
y = clamp(y, 0, height)
var rOff, bOff int
switch format {
case FormatABGR8888, FormatXBGR8888:
rOff, bOff = 0, 2
default:
rOff, bOff = 2, 0
}
for yy := y; yy < yEnd; yy++ {
rowOff := yy * stride
for xx := x; xx < xEnd; xx++ {
@@ -1080,26 +1129,34 @@ func drawFilledRect(data []byte, stride, width, height, x, y, w, h int, col Colo
if off+4 > len(data) {
continue
}
data[off+0] = col.B
data[off+rOff] = col.R
data[off+1] = col.G
data[off+2] = col.R
data[off+bOff] = col.B
data[off+3] = 255
}
}
}
func drawText(data []byte, stride, width, height, x, y int, text string, col Color) {
func drawText(data []byte, stride, width, height, x, y int, text string, col Color, format PixelFormat) {
for i, r := range text {
drawGlyph(data, stride, width, height, x+i*(fontW+2), y, r, col)
drawGlyph(data, stride, width, height, x+i*(fontW+2), y, r, col, format)
}
}
func drawGlyph(data []byte, stride, width, height, x, y int, r rune, col Color) {
func drawGlyph(data []byte, stride, width, height, x, y int, r rune, col Color, format PixelFormat) {
g, ok := fontGlyphs[r]
if !ok {
return
}
var rOff, bOff int
switch format {
case FormatABGR8888, FormatXBGR8888:
rOff, bOff = 0, 2
default:
rOff, bOff = 2, 0
}
for row := 0; row < fontH; row++ {
yy := y + row
if yy < 0 || yy >= height {
@@ -1123,9 +1180,9 @@ func drawGlyph(data []byte, stride, width, height, x, y int, r rune, col Color)
continue
}
data[off+0] = col.B
data[off+rOff] = col.R
data[off+1] = col.G
data[off+2] = col.R
data[off+bOff] = col.B
data[off+3] = 255
}
}
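The rOff/bOff switch repeated in these drawing helpers encodes one rule: in little-endian memory, xRGB/ARGB pixels are laid out B,G,R,A while xBGR/ABGR pixels are R,G,B,A. A standalone restatement as a sketch, using the format constants re-exported earlier in this package:
func channelOffsets(format PixelFormat) (rOff, gOff, bOff, aOff int) {
	switch format {
	case FormatABGR8888, FormatXBGR8888:
		return 0, 1, 2, 3 // bytes are R, G, B, A/X
	default: // FormatARGB8888, FormatXRGB8888
		return 2, 1, 0, 3 // bytes are B, G, R, A/X
	}
}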

View File

@@ -0,0 +1,314 @@
package colorpicker
import (
"sync"
"testing"
"github.com/stretchr/testify/assert"
)
func TestSurfaceState_ConcurrentPointerMotion(t *testing.T) {
s := NewSurfaceState(FormatHex, false)
var wg sync.WaitGroup
const goroutines = 50
const iterations = 100
for i := 0; i < goroutines; i++ {
wg.Add(1)
go func(id int) {
defer wg.Done()
for j := 0; j < iterations; j++ {
s.OnPointerMotion(float64(id*10+j), float64(id*10+j))
}
}(i)
}
wg.Wait()
}
func TestSurfaceState_ConcurrentScaleAccess(t *testing.T) {
s := NewSurfaceState(FormatHex, false)
var wg sync.WaitGroup
const goroutines = 30
const iterations = 100
for i := 0; i < goroutines/2; i++ {
wg.Add(1)
go func(id int) {
defer wg.Done()
for j := 0; j < iterations; j++ {
s.SetScale(int32(id%3 + 1))
}
}(i)
}
for i := 0; i < goroutines/2; i++ {
wg.Add(1)
go func() {
defer wg.Done()
for j := 0; j < iterations; j++ {
scale := s.Scale()
assert.GreaterOrEqual(t, scale, int32(1))
}
}()
}
wg.Wait()
}
func TestSurfaceState_ConcurrentLogicalSize(t *testing.T) {
s := NewSurfaceState(FormatHex, false)
var wg sync.WaitGroup
const goroutines = 20
const iterations = 100
for i := 0; i < goroutines/2; i++ {
wg.Add(1)
go func(id int) {
defer wg.Done()
for j := 0; j < iterations; j++ {
_ = s.OnLayerConfigure(1920+id, 1080+j)
}
}(i)
}
for i := 0; i < goroutines/2; i++ {
wg.Add(1)
go func() {
defer wg.Done()
for j := 0; j < iterations; j++ {
w, h := s.LogicalSize()
_ = w
_ = h
}
}()
}
wg.Wait()
}
func TestSurfaceState_ConcurrentIsDone(t *testing.T) {
s := NewSurfaceState(FormatHex, false)
var wg sync.WaitGroup
const goroutines = 30
const iterations = 100
for i := 0; i < goroutines/3; i++ {
wg.Add(1)
go func() {
defer wg.Done()
for j := 0; j < iterations; j++ {
s.OnPointerButton(0x110, 1)
}
}()
}
for i := 0; i < goroutines/3; i++ {
wg.Add(1)
go func() {
defer wg.Done()
for j := 0; j < iterations; j++ {
s.OnKey(1, 1)
}
}()
}
for i := 0; i < goroutines/3; i++ {
wg.Add(1)
go func() {
defer wg.Done()
for j := 0; j < iterations; j++ {
picked, cancelled := s.IsDone()
_ = picked
_ = cancelled
}
}()
}
wg.Wait()
}
func TestSurfaceState_ConcurrentIsReady(t *testing.T) {
s := NewSurfaceState(FormatHex, false)
var wg sync.WaitGroup
const goroutines = 20
const iterations = 100
for i := 0; i < goroutines; i++ {
wg.Add(1)
go func() {
defer wg.Done()
for j := 0; j < iterations; j++ {
_ = s.IsReady()
}
}()
}
wg.Wait()
}
func TestSurfaceState_ConcurrentSwapBuffers(t *testing.T) {
s := NewSurfaceState(FormatHex, false)
var wg sync.WaitGroup
const goroutines = 20
const iterations = 100
for i := 0; i < goroutines; i++ {
wg.Add(1)
go func() {
defer wg.Done()
for j := 0; j < iterations; j++ {
s.SwapBuffers()
}
}()
}
wg.Wait()
}
func TestSurfaceState_ZeroScale(t *testing.T) {
s := NewSurfaceState(FormatHex, false)
s.SetScale(0)
assert.Equal(t, int32(1), s.Scale())
}
func TestSurfaceState_NegativeScale(t *testing.T) {
s := NewSurfaceState(FormatHex, false)
s.SetScale(-5)
assert.Equal(t, int32(1), s.Scale())
}
func TestSurfaceState_ZeroDimensionConfigure(t *testing.T) {
s := NewSurfaceState(FormatHex, false)
err := s.OnLayerConfigure(0, 100)
assert.NoError(t, err)
err = s.OnLayerConfigure(100, 0)
assert.NoError(t, err)
err = s.OnLayerConfigure(-1, 100)
assert.NoError(t, err)
w, h := s.LogicalSize()
assert.Equal(t, 0, w)
assert.Equal(t, 0, h)
}
func TestSurfaceState_PickColorNilBuffer(t *testing.T) {
s := NewSurfaceState(FormatHex, false)
color, ok := s.PickColor()
assert.False(t, ok)
assert.Equal(t, Color{}, color)
}
func TestSurfaceState_RedrawNilBuffer(t *testing.T) {
s := NewSurfaceState(FormatHex, false)
buf := s.Redraw()
assert.Nil(t, buf)
}
func TestSurfaceState_RedrawScreenOnlyNilBuffer(t *testing.T) {
s := NewSurfaceState(FormatHex, false)
buf := s.RedrawScreenOnly()
assert.Nil(t, buf)
}
func TestSurfaceState_FrontRenderBufferNil(t *testing.T) {
s := NewSurfaceState(FormatHex, false)
buf := s.FrontRenderBuffer()
assert.Nil(t, buf)
}
func TestSurfaceState_ScreenBufferNil(t *testing.T) {
s := NewSurfaceState(FormatHex, false)
buf := s.ScreenBuffer()
assert.Nil(t, buf)
}
func TestSurfaceState_DestroyMultipleTimes(t *testing.T) {
s := NewSurfaceState(FormatHex, false)
s.Destroy()
s.Destroy()
}
func TestClamp(t *testing.T) {
tests := []struct {
v, lo, hi, expected int
}{
{5, 0, 10, 5},
{-5, 0, 10, 0},
{15, 0, 10, 10},
{0, 0, 10, 0},
{10, 0, 10, 10},
}
for _, tt := range tests {
result := clamp(tt.v, tt.lo, tt.hi)
assert.Equal(t, tt.expected, result)
}
}
func TestClampF(t *testing.T) {
tests := []struct {
v, lo, hi, expected float64
}{
{5.0, 0.0, 10.0, 5.0},
{-5.0, 0.0, 10.0, 0.0},
{15.0, 0.0, 10.0, 10.0},
{0.0, 0.0, 10.0, 0.0},
{10.0, 0.0, 10.0, 10.0},
}
for _, tt := range tests {
result := clampF(tt.v, tt.lo, tt.hi)
assert.InDelta(t, tt.expected, result, 0.001)
}
}
func TestAbs(t *testing.T) {
tests := []struct {
v, expected int
}{
{5, 5},
{-5, 5},
{0, 0},
}
for _, tt := range tests {
result := abs(tt.v)
assert.Equal(t, tt.expected, result)
}
}
func TestBlendColors(t *testing.T) {
bg := Color{R: 0, G: 0, B: 0, A: 255}
fg := Color{R: 255, G: 255, B: 255, A: 255}
result := blendColors(bg, fg, 0.0)
assert.Equal(t, bg.R, result.R)
assert.Equal(t, bg.G, result.G)
assert.Equal(t, bg.B, result.B)
result = blendColors(bg, fg, 1.0)
assert.Equal(t, fg.R, result.R)
assert.Equal(t, fg.G, result.G)
assert.Equal(t, fg.B, result.B)
result = blendColors(bg, fg, 0.5)
assert.InDelta(t, 127, int(result.R), 1)
assert.InDelta(t, 127, int(result.G), 1)
assert.InDelta(t, 127, int(result.B), 1)
result = blendColors(bg, fg, -1.0)
assert.Equal(t, bg.R, result.R)
result = blendColors(bg, fg, 2.0)
assert.Equal(t, fg.R, result.R)
}

View File

@@ -46,11 +46,20 @@ func (cd *ConfigDeployer) DeployConfigurationsWithTerminal(ctx context.Context,
return cd.DeployConfigurationsSelective(ctx, wm, terminal, nil, nil)
}
// DeployConfigurationsWithSystemd deploys configurations with systemd option
func (cd *ConfigDeployer) DeployConfigurationsWithSystemd(ctx context.Context, wm deps.WindowManager, terminal deps.Terminal, useSystemd bool) ([]DeploymentResult, error) {
return cd.deployConfigurationsInternal(ctx, wm, terminal, nil, nil, nil, useSystemd)
}
func (cd *ConfigDeployer) DeployConfigurationsSelective(ctx context.Context, wm deps.WindowManager, terminal deps.Terminal, installedDeps []deps.Dependency, replaceConfigs map[string]bool) ([]DeploymentResult, error) {
return cd.DeployConfigurationsSelectiveWithReinstalls(ctx, wm, terminal, installedDeps, replaceConfigs, nil)
}
func (cd *ConfigDeployer) DeployConfigurationsSelectiveWithReinstalls(ctx context.Context, wm deps.WindowManager, terminal deps.Terminal, installedDeps []deps.Dependency, replaceConfigs map[string]bool, reinstallItems map[string]bool) ([]DeploymentResult, error) {
return cd.deployConfigurationsInternal(ctx, wm, terminal, installedDeps, replaceConfigs, reinstallItems, true)
}
func (cd *ConfigDeployer) deployConfigurationsInternal(ctx context.Context, wm deps.WindowManager, terminal deps.Terminal, installedDeps []deps.Dependency, replaceConfigs map[string]bool, reinstallItems map[string]bool, useSystemd bool) ([]DeploymentResult, error) {
var results []DeploymentResult
shouldReplaceConfig := func(configType string) bool {
@@ -64,7 +73,7 @@ func (cd *ConfigDeployer) DeployConfigurationsSelectiveWithReinstalls(ctx contex
switch wm {
case deps.WindowManagerNiri:
if shouldReplaceConfig("Niri") {
result, err := cd.deployNiriConfig(terminal)
result, err := cd.deployNiriConfig(terminal, useSystemd)
results = append(results, result)
if err != nil {
return results, fmt.Errorf("failed to deploy Niri config: %w", err)
@@ -72,7 +81,7 @@ func (cd *ConfigDeployer) DeployConfigurationsSelectiveWithReinstalls(ctx contex
}
case deps.WindowManagerHyprland:
if shouldReplaceConfig("Hyprland") {
result, err := cd.deployHyprlandConfig(terminal)
result, err := cd.deployHyprlandConfig(terminal, useSystemd)
results = append(results, result)
if err != nil {
return results, fmt.Errorf("failed to deploy Hyprland config: %w", err)
@@ -110,7 +119,7 @@ func (cd *ConfigDeployer) DeployConfigurationsSelectiveWithReinstalls(ctx contex
return results, nil
}
func (cd *ConfigDeployer) deployNiriConfig(terminal deps.Terminal) (DeploymentResult, error) {
func (cd *ConfigDeployer) deployNiriConfig(terminal deps.Terminal, useSystemd bool) (DeploymentResult, error) {
result := DeploymentResult{
ConfigType: "Niri",
Path: filepath.Join(os.Getenv("HOME"), ".config", "niri", "config.kdl"),
@@ -148,12 +157,6 @@ func (cd *ConfigDeployer) deployNiriConfig(terminal deps.Terminal) (DeploymentRe
cd.log(fmt.Sprintf("Backed up existing config to %s", result.BackupPath))
}
polkitPath, err := cd.detectPolkitAgent()
if err != nil {
cd.log(fmt.Sprintf("Warning: Could not detect polkit agent: %v", err))
polkitPath = "/usr/lib/mate-polkit/polkit-mate-authentication-agent-1"
}
var terminalCommand string
switch terminal {
case deps.TerminalGhostty:
@@ -166,8 +169,11 @@ func (cd *ConfigDeployer) deployNiriConfig(terminal deps.Terminal) (DeploymentRe
terminalCommand = "ghostty"
}
newConfig := strings.ReplaceAll(NiriConfig, "{{POLKIT_AGENT_PATH}}", polkitPath)
newConfig = strings.ReplaceAll(newConfig, "{{TERMINAL_COMMAND}}", terminalCommand)
newConfig := strings.ReplaceAll(NiriConfig, "{{TERMINAL_COMMAND}}", terminalCommand)
if !useSystemd {
newConfig = cd.transformNiriConfigForNonSystemd(newConfig, terminalCommand)
}
if existingConfig != "" {
mergedConfig, err := cd.mergeNiriOutputSections(newConfig, existingConfig)
@@ -404,41 +410,6 @@ func (cd *ConfigDeployer) deployAlacrittyConfig() ([]DeploymentResult, error) {
return results, nil
}
// detectPolkitAgent tries to find the polkit authentication agent on the system
// Prioritizes mate-polkit paths since that's what we install
func (cd *ConfigDeployer) detectPolkitAgent() (string, error) {
// Prioritize mate-polkit paths first
matePaths := []string{
"/usr/libexec/polkit-mate-authentication-agent-1", // Fedora path
"/usr/lib/mate-polkit/polkit-mate-authentication-agent-1",
"/usr/libexec/mate-polkit/polkit-mate-authentication-agent-1",
"/usr/lib/polkit-mate/polkit-mate-authentication-agent-1",
"/usr/lib/x86_64-linux-gnu/mate-polkit/polkit-mate-authentication-agent-1",
}
for _, path := range matePaths {
if _, err := os.Stat(path); err == nil {
cd.log(fmt.Sprintf("Found mate-polkit agent at: %s", path))
return path, nil
}
}
// Fallback to other polkit agents if mate-polkit is not found
fallbackPaths := []string{
"/usr/lib/polkit-gnome/polkit-gnome-authentication-agent-1",
"/usr/libexec/polkit-gnome-authentication-agent-1",
}
for _, path := range fallbackPaths {
if _, err := os.Stat(path); err == nil {
cd.log(fmt.Sprintf("Found fallback polkit agent at: %s", path))
return path, nil
}
}
return "", fmt.Errorf("no polkit agent found in common locations")
}
// mergeNiriOutputSections extracts output sections from existing config and merges them into the new config
func (cd *ConfigDeployer) mergeNiriOutputSections(newConfig, existingConfig string) (string, error) {
// Regular expression to match output sections (including commented ones)
@@ -482,7 +453,7 @@ func (cd *ConfigDeployer) mergeNiriOutputSections(newConfig, existingConfig stri
}
// deployHyprlandConfig handles Hyprland configuration deployment with backup and merging
func (cd *ConfigDeployer) deployHyprlandConfig(terminal deps.Terminal) (DeploymentResult, error) {
func (cd *ConfigDeployer) deployHyprlandConfig(terminal deps.Terminal, useSystemd bool) (DeploymentResult, error) {
result := DeploymentResult{
ConfigType: "Hyprland",
Path: filepath.Join(os.Getenv("HOME"), ".config", "hypr", "hyprland.conf"),
@@ -514,14 +485,6 @@ func (cd *ConfigDeployer) deployHyprlandConfig(terminal deps.Terminal) (Deployme
cd.log(fmt.Sprintf("Backed up existing config to %s", result.BackupPath))
}
// Detect polkit agent path
polkitPath, err := cd.detectPolkitAgent()
if err != nil {
cd.log(fmt.Sprintf("Warning: Could not detect polkit agent: %v", err))
polkitPath = "/usr/lib/mate-polkit/polkit-mate-authentication-agent-1" // fallback
}
// Determine terminal command based on choice
var terminalCommand string
switch terminal {
case deps.TerminalGhostty:
@@ -531,13 +494,15 @@ func (cd *ConfigDeployer) deployHyprlandConfig(terminal deps.Terminal) (Deployme
case deps.TerminalAlacritty:
terminalCommand = "alacritty"
default:
terminalCommand = "ghostty" // fallback to ghostty
terminalCommand = "ghostty"
}
newConfig := strings.ReplaceAll(HyprlandConfig, "{{POLKIT_AGENT_PATH}}", polkitPath)
newConfig = strings.ReplaceAll(newConfig, "{{TERMINAL_COMMAND}}", terminalCommand)
newConfig := strings.ReplaceAll(HyprlandConfig, "{{TERMINAL_COMMAND}}", terminalCommand)
if !useSystemd {
newConfig = cd.transformHyprlandConfigForNonSystemd(newConfig, terminalCommand)
}
// If there was an existing config, merge the monitor sections
if existingConfig != "" {
mergedConfig, err := cd.mergeHyprlandMonitorSections(newConfig, existingConfig)
if err != nil {
@@ -560,24 +525,16 @@ func (cd *ConfigDeployer) deployHyprlandConfig(terminal deps.Terminal) (Deployme
// mergeHyprlandMonitorSections extracts monitor sections from existing config and merges them into the new config
func (cd *ConfigDeployer) mergeHyprlandMonitorSections(newConfig, existingConfig string) (string, error) {
// Regular expression to match monitor lines (including commented ones)
// Matches: monitor = NAME, RESOLUTION, POSITION, SCALE, etc.
// Also matches commented versions: # monitor = ...
monitorRegex := regexp.MustCompile(`(?m)^#?\s*monitor\s*=.*$`)
// Find all monitor lines in the existing config
existingMonitors := monitorRegex.FindAllString(existingConfig, -1)
if len(existingMonitors) == 0 {
// No monitor sections to merge
return newConfig, nil
}
// Remove the example monitor line from the new config
exampleMonitorRegex := regexp.MustCompile(`(?m)^# monitor = eDP-2.*$`)
mergedConfig := exampleMonitorRegex.ReplaceAllString(newConfig, "")
// Find where to insert the monitor sections (after the MONITOR CONFIG header)
monitorHeaderRegex := regexp.MustCompile(`(?m)^# MONITOR CONFIG\n# ==================$`)
headerMatch := monitorHeaderRegex.FindStringIndex(mergedConfig)
@@ -585,8 +542,7 @@ func (cd *ConfigDeployer) mergeHyprlandMonitorSections(newConfig, existingConfig
return "", fmt.Errorf("could not find MONITOR CONFIG section")
}
// Insert after the header
insertPos := headerMatch[1] + 1 // +1 for the newline
insertPos := headerMatch[1] + 1
var builder strings.Builder
builder.WriteString(mergedConfig[:insertPos])
@@ -601,3 +557,70 @@ func (cd *ConfigDeployer) mergeHyprlandMonitorSections(newConfig, existingConfig
return builder.String(), nil
}
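A quick check of the monitor pattern above against the kinds of lines it needs to carry over (test-style sketch; illustrative monitor values, "regexp" and "testing" imports assumed):
func TestMonitorRegexMatchesPlainAndCommentedLines(t *testing.T) {
	monitorRegex := regexp.MustCompile(`(?m)^#?\s*monitor\s*=.*$`)
	existing := "general {\n}\nmonitor = DP-1, 1920x1080@144, 0x0, 1\n# monitor = HDMI-A-1, 3840x2160@60, 1920x0, 1.5\n"
	if got := len(monitorRegex.FindAllString(existing, -1)); got != 2 {
		t.Fatalf("expected 2 monitor lines, got %d", got)
	}
}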
func (cd *ConfigDeployer) transformHyprlandConfigForNonSystemd(config, terminalCommand string) string {
lines := strings.Split(config, "\n")
var result []string
startupSectionFound := false
for _, line := range lines {
trimmed := strings.TrimSpace(line)
if strings.HasPrefix(trimmed, "exec-once = dbus-update-activation-environment") {
continue
}
if strings.HasPrefix(trimmed, "exec-once = systemctl --user start") {
startupSectionFound = true
result = append(result, "exec-once = dms run")
result = append(result, "env = QT_QPA_PLATFORM,wayland")
result = append(result, "env = ELECTRON_OZONE_PLATFORM_HINT,auto")
result = append(result, "env = QT_QPA_PLATFORMTHEME,gtk3")
result = append(result, "env = QT_QPA_PLATFORMTHEME_QT6,gtk3")
result = append(result, fmt.Sprintf("env = TERMINAL,%s", terminalCommand))
continue
}
result = append(result, line)
}
if !startupSectionFound {
for i, line := range result {
if strings.Contains(line, "STARTUP APPS") {
insertLines := []string{
"exec-once = dms run",
"env = QT_QPA_PLATFORM,wayland",
"env = ELECTRON_OZONE_PLATFORM_HINT,auto",
"env = QT_QPA_PLATFORMTHEME,gtk3",
"env = QT_QPA_PLATFORMTHEME_QT6,gtk3",
fmt.Sprintf("env = TERMINAL,%s", terminalCommand),
}
result = append(result[:i+2], append(insertLines, result[i+2:]...)...)
break
}
}
}
return strings.Join(result, "\n")
}
func (cd *ConfigDeployer) transformNiriConfigForNonSystemd(config, terminalCommand string) string {
envVars := fmt.Sprintf(`environment {
XDG_CURRENT_DESKTOP "niri"
QT_QPA_PLATFORM "wayland"
ELECTRON_OZONE_PLATFORM_HINT "auto"
QT_QPA_PLATFORMTHEME "gtk3"
QT_QPA_PLATFORMTHEME_QT6 "gtk3"
TERMINAL "%s"
}`, terminalCommand)
config = regexp.MustCompile(`environment \{[^}]*\}`).ReplaceAllString(config, envVars)
spawnDms := `spawn-at-startup "dms" "run"`
if !strings.Contains(config, spawnDms) {
// Insert spawn-at-startup for dms after the environment block
envBlockEnd := regexp.MustCompile(`environment \{[^}]*\}`)
if loc := envBlockEnd.FindStringIndex(config); loc != nil {
config = config[:loc[1]] + "\n" + spawnDms + config[loc[1]:]
}
}
return config
}
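For reference, with terminalCommand set to "ghostty" the non-systemd transform above is expected to leave a block along these lines in the generated config (sketch derived from the template; exact whitespace illustrative):
// Expected replacement when useSystemd is false and terminalCommand is "ghostty".
const exampleNonSystemdNiriEnv = `environment {
    XDG_CURRENT_DESKTOP "niri"
    QT_QPA_PLATFORM "wayland"
    ELECTRON_OZONE_PLATFORM_HINT "auto"
    QT_QPA_PLATFORMTHEME "gtk3"
    QT_QPA_PLATFORMTHEME_QT6 "gtk3"
    TERMINAL "ghostty"
}
spawn-at-startup "dms" "run"`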

View File

@@ -3,7 +3,6 @@ package config
import (
"os"
"path/filepath"
"strings"
"testing"
"github.com/AvengeMedia/DankMaterialShell/core/internal/deps"
@@ -11,23 +10,6 @@ import (
"github.com/stretchr/testify/require"
)
func TestDetectPolkitAgent(t *testing.T) {
cd := &ConfigDeployer{}
// This test depends on the system having a polkit agent installed
// We'll just test that the function doesn't crash and returns some path or error
path, err := cd.detectPolkitAgent()
if err != nil {
// If no polkit agent is found, that's okay for testing
assert.Contains(t, err.Error(), "no polkit agent found")
} else {
// If found, it should be a valid path
assert.NotEmpty(t, path)
assert.True(t, strings.Contains(path, "polkit"))
}
}
func TestMergeNiriOutputSections(t *testing.T) {
cd := &ConfigDeployer{}
@@ -272,17 +254,6 @@ func getGhosttyPath() string {
return filepath.Join(os.Getenv("HOME"), ".config", "ghostty", "config")
}
func TestPolkitPathInjection(t *testing.T) {
testConfig := `spawn-at-startup "{{POLKIT_AGENT_PATH}}"
other content`
result := strings.Replace(testConfig, "{{POLKIT_AGENT_PATH}}", "/test/polkit/path", 1)
assert.Contains(t, result, `spawn-at-startup "/test/polkit/path"`)
assert.NotContains(t, result, "{{POLKIT_AGENT_PATH}}")
}
func TestMergeHyprlandMonitorSections(t *testing.T) {
cd := &ConfigDeployer{}
@@ -424,7 +395,7 @@ func TestHyprlandConfigDeployment(t *testing.T) {
cd := NewConfigDeployer(logChan)
t.Run("deploy hyprland config to empty directory", func(t *testing.T) {
result, err := cd.deployHyprlandConfig(deps.TerminalGhostty)
result, err := cd.deployHyprlandConfig(deps.TerminalGhostty, true)
require.NoError(t, err)
assert.Equal(t, "Hyprland", result.ConfigType)
@@ -435,7 +406,7 @@ func TestHyprlandConfigDeployment(t *testing.T) {
content, err := os.ReadFile(result.Path)
require.NoError(t, err)
assert.Contains(t, string(content), "# MONITOR CONFIG")
assert.Contains(t, string(content), "bind = $mod, T, exec, $TERMINAL")
assert.Contains(t, string(content), "bind = $mod, T, exec, ghostty")
assert.Contains(t, string(content), "exec-once = ")
})
@@ -454,7 +425,7 @@ general {
err = os.WriteFile(hyprPath, []byte(existingContent), 0644)
require.NoError(t, err)
result, err := cd.deployHyprlandConfig(deps.TerminalKitty)
result, err := cd.deployHyprlandConfig(deps.TerminalKitty, true)
require.NoError(t, err)
assert.Equal(t, "Hyprland", result.ConfigType)
@@ -471,7 +442,7 @@ general {
require.NoError(t, err)
assert.Contains(t, string(newContent), "monitor = DP-1, 1920x1080@144")
assert.Contains(t, string(newContent), "monitor = HDMI-A-1, 3840x2160@60")
assert.Contains(t, string(newContent), "bind = $mod, T, exec, $TERMINAL")
assert.Contains(t, string(newContent), "bind = $mod, T, exec, kitty")
assert.NotContains(t, string(newContent), "monitor = eDP-2")
})
}
@@ -479,7 +450,6 @@ general {
func TestNiriConfigStructure(t *testing.T) {
assert.Contains(t, NiriConfig, "input {")
assert.Contains(t, NiriConfig, "layout {")
assert.Contains(t, NiriConfig, "{{POLKIT_AGENT_PATH}}")
assert.Contains(t, NiriBindsConfig, "binds {")
assert.Contains(t, NiriBindsConfig, `spawn "{{TERMINAL_COMMAND}}"`)
@@ -490,11 +460,9 @@ func TestHyprlandConfigStructure(t *testing.T) {
assert.Contains(t, HyprlandConfig, "# STARTUP APPS")
assert.Contains(t, HyprlandConfig, "# INPUT CONFIG")
assert.Contains(t, HyprlandConfig, "# KEYBINDINGS")
assert.Contains(t, HyprlandConfig, "{{POLKIT_AGENT_PATH}}")
assert.Contains(t, HyprlandConfig, "bind = $mod, T, exec, $TERMINAL")
assert.Contains(t, HyprlandConfig, "bind = $mod, T, exec, {{TERMINAL_COMMAND}}")
assert.Contains(t, HyprlandConfig, "bind = $mod, space, exec, dms ipc call spotlight toggle")
assert.Contains(t, HyprlandConfig, "windowrule {")
assert.Contains(t, HyprlandConfig, "match:class = ^(com\\.mitchellh\\.ghostty)$")
assert.Contains(t, HyprlandConfig, "windowrulev2 = noborder, class:^(com\\.mitchellh\\.ghostty)$")
}
func TestGhosttyConfigStructure(t *testing.T) {

View File

@@ -7,18 +7,11 @@ import (
"strings"
)
// LocateDMSConfig searches for DMS installation following XDG Base Directory specification
func LocateDMSConfig() (string, error) {
var primaryPaths []string
configHome := os.Getenv("XDG_CONFIG_HOME")
if configHome == "" {
if homeDir, err := os.UserHomeDir(); err == nil {
configHome = filepath.Join(homeDir, ".config")
}
}
if configHome != "" {
configHome, err := os.UserConfigDir()
if err == nil && configHome != "" {
primaryPaths = append(primaryPaths, filepath.Join(configHome, "quickshell", "dms"))
}

View File

@@ -10,8 +10,8 @@ monitor = , preferred,auto,auto
# ==================
# STARTUP APPS
# ==================
exec-once = bash -c "wl-paste --watch cliphist store &"
exec-once = {{POLKIT_AGENT_PATH}}
exec-once = dbus-update-activation-environment --systemd --all
exec-once = systemctl --user start hyprland-session.target
# ==================
# INPUT CONFIG
@@ -90,132 +90,36 @@ misc {
# ==================
# WINDOW RULES
# ==================
windowrule {
name = windowrule-1
tile = on
match:class = ^(org\.wezfurlong\.wezterm)$
border_size = 0
}
windowrulev2 = tile, class:^(org\.wezfurlong\.wezterm)$
windowrulev2 = rounding 12, class:^(org\.gnome\.)
windowrulev2 = noborder, class:^(org\.gnome\.)
windowrule {
name = windowrule-2
rounding = 12
match:class = ^(org\.gnome\.)
border_size = 0
}
windowrulev2 = tile, class:^(gnome-control-center)$
windowrulev2 = tile, class:^(pavucontrol)$
windowrulev2 = tile, class:^(nm-connection-editor)$
windowrulev2 = float, class:^(gnome-calculator)$
windowrulev2 = float, class:^(galculator)$
windowrulev2 = float, class:^(blueman-manager)$
windowrulev2 = float, class:^(org\.gnome\.Nautilus)$
windowrulev2 = float, class:^(steam)$
windowrulev2 = float, class:^(xdg-desktop-portal)$
windowrulev2 = noborder, class:^(org\.wezfurlong\.wezterm)$
windowrulev2 = noborder, class:^(Alacritty)$
windowrulev2 = noborder, class:^(zen)$
windowrulev2 = noborder, class:^(com\.mitchellh\.ghostty)$
windowrulev2 = noborder, class:^(kitty)$
windowrule {
name = windowrule-3
tile = on
match:class = ^(gnome-control-center)$
}
windowrulev2 = float, class:^(firefox)$, title:^(Picture-in-Picture)$
windowrulev2 = float, class:^(zoom)$
windowrule {
name = windowrule-4
tile = on
match:class = ^(pavucontrol)$
}
# DMS windows floating by default
windowrulev2 = float, class:^(org.quickshell)$
windowrulev2 = opacity 0.9 0.9, floating:0, focus:0
windowrule {
name = windowrule-5
tile = on
match:class = ^(nm-connection-editor)$
}
windowrule {
name = windowrule-6
float = on
match:class = ^(gnome-calculator)$
}
windowrule {
name = windowrule-7
float = on
match:class = ^(galculator)$
}
windowrule {
name = windowrule-8
float = on
match:class = ^(blueman-manager)$
}
windowrule {
name = windowrule-9
float = on
match:class = ^(org\.gnome\.Nautilus)$
}
windowrule {
name = windowrule-10
float = on
match:class = ^(steam)$
}
windowrule {
name = windowrule-11
float = on
match:class = ^(xdg-desktop-portal)$
}
windowrule {
name = windowrule-12
border_size = 0
match:class = ^(Alacritty)$
}
windowrule {
name = windowrule-13
border_size = 0
match:class = ^(zen)$
}
windowrule {
name = windowrule-14
border_size = 0
match:class = ^(com\.mitchellh\.ghostty)$
}
windowrule {
name = windowrule-15
border_size = 0
match:class = ^(kitty)$
}
windowrule {
name = windowrule-16
float = on
match:class = ^(firefox)$
match:title = ^(Picture-in-Picture)$
}
windowrule {
name = windowrule-17
float = on
match:class = ^(zoom)$
}
windowrule {
name = windowrule-18
opacity = 0.9 0.9
match:float = 0
match:focus = 0
}
layerrule {
name = layerrule-1
no_anim = on
match:namespace = ^(quickshell)$
}
layerrule = noanim, ^(quickshell)$
# ==================
# KEYBINDINGS
@@ -223,7 +127,7 @@ layerrule {
$mod = SUPER
# === Application Launchers ===
bind = $mod, T, exec, $TERMINAL
bind = $mod, T, exec, {{TERMINAL_COMMAND}}
bind = $mod, space, exec, dms ipc call spotlight toggle
bind = $mod, V, exec, dms ipc call clipboard toggle
bind = $mod, M, exec, dms ipc call processlist focusOrToggle
@@ -372,4 +276,4 @@ bind = CTRL, Print, exec, dms screenshot full
bind = ALT, Print, exec, dms screenshot window
# === System Controls ===
bind = $mod SHIFT, P, dpms, off
bind = $mod SHIFT, P, dpms, toggle

View File

@@ -1,3 +1,8 @@
// ! DO NOT EDIT !
// ! AUTO-GENERATED BY DMS !
// ! CHANGES WILL BE OVERWRITTEN !
// ! PLACE YOUR CUSTOM CONFIGURATION ELSEWHERE !
recent-windows {
highlight {
corner-radius 12

View File

@@ -1,3 +1,8 @@
// ! DO NOT EDIT !
// ! AUTO-GENERATED BY DMS !
// ! CHANGES WILL BE OVERWRITTEN !
// ! PLACE YOUR CUSTOM CONFIGURATION ELSEWHERE !
binds {
// === System & Overview ===
Mod+D repeat=false { toggle-overview; }
@@ -192,4 +197,4 @@ binds {
// === System Controls ===
Mod+Escape allow-inhibiting=false { toggle-keyboard-shortcuts-inhibit; }
Mod+Shift+P { power-off-monitors; }
}
}

View File

@@ -1,3 +1,8 @@
// ! DO NOT EDIT !
// ! AUTO-GENERATED BY DMS !
// ! CHANGES WILL BE OVERWRITTEN !
// ! PLACE YOUR CUSTOM CONFIGURATION ELSEWHERE !
layout {
background-color "transparent"
@@ -33,4 +38,4 @@ recent-windows {
active-color "#124a73"
urgent-color "#ffb4ab"
}
}
}

View File

@@ -1,3 +1,8 @@
// ! DO NOT EDIT !
// ! AUTO-GENERATED BY DMS !
// ! CHANGES WILL BE OVERWRITTEN !
// ! PLACE YOUR CUSTOM CONFIGURATION ELSEWHERE !
layout {
gaps 4

View File

@@ -18,15 +18,64 @@ gestures {
input {
keyboard {
xkb {
// You can set rules, model, layout, variant and options.
// For more information, see xkeyboard-config(7).
// For example:
// layout "us,ru"
// options "grp:win_space_toggle,compose:ralt,ctrl:nocaps"
// If this section is empty, niri will fetch xkb settings
// from org.freedesktop.locale1. You can control these using
// localectl set-x11-keymap.
}
// Enable numlock on startup, omitting this setting disables it.
numlock
}
// Next sections include libinput settings.
// Omitting settings disables them, or leaves them at their default values.
// All commented-out settings here are examples, not defaults.
touchpad {
// off
tap
// dwt
// dwtp
// drag false
// drag-lock
natural-scroll
// accel-speed 0.2
// accel-profile "flat"
// scroll-method "two-finger"
// disabled-on-external-mouse
}
mouse {
// off
// natural-scroll
// accel-speed 0.2
// accel-profile "flat"
// scroll-method "no-scroll"
}
trackpoint {
// off
// natural-scroll
// accel-speed 0.2
// accel-profile "flat"
// scroll-method "on-button-down"
// scroll-button 273
// scroll-button-lock
// middle-emulation
}
// Uncomment this to make the mouse warp to the center of newly focused windows.
// warp-mouse-to-focus
// Focus windows and outputs automatically when moving the mouse into them.
// Setting max-scroll-amount="0%" makes it work only on windows already fully on screen.
// focus-follows-mouse max-scroll-amount="0%"
}
// You can configure outputs by their name, which you can find
// by running `niri msg outputs` while inside a niri instance.
@@ -44,7 +93,6 @@ input {
// https://github.com/YaLTeR/niri/wiki/Configuration:-Layout
layout {
// Set gaps around windows in logical pixels.
gaps 5
background-color "transparent"
// When to center a column when changing focus, options are:
// - "never", default behavior, focusing an off-screen column will keep at the left
@@ -87,11 +135,6 @@ layout {
inactive-color "#d0d0d0" // Light gray
urgent-color "#cc4444" // Softer red
}
focus-ring {
width 2
active-color "#808080" // Medium gray
inactive-color "#505050" // Dark gray
}
shadow {
softness 30
spread 5
@@ -115,8 +158,6 @@ overview {
// which may be more convenient to use.
// See the binds section below for more spawn examples.
// This line starts waybar, a commonly used bar for Wayland compositors.
spawn-at-startup "bash" "-c" "wl-paste --watch cliphist store &"
spawn-at-startup "{{POLKIT_AGENT_PATH}}"
environment {
XDG_CURRENT_DESKTOP "niri"
}

View File

@@ -113,13 +113,14 @@ func RGBToHSV(rgb RGB) HSV {
delta := max - min
var h float64
if delta == 0 {
switch {
case delta == 0:
h = 0
} else if max == rgb.R {
case max == rgb.R:
h = math.Mod((rgb.G-rgb.B)/delta, 6.0) / 6.0
} else if max == rgb.G {
case max == rgb.G:
h = ((rgb.B-rgb.R)/delta + 2.0) / 6.0
} else {
default:
h = ((rgb.R-rgb.G)/delta + 4.0) / 6.0
}
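A worked example for the max == G branch above, using illustrative normalized channel values (pure green):
package main

import "fmt"

func main() {
	r, g, b := 0.0, 1.0, 0.0
	delta := g - 0.0               // max = g, min = 0
	h := ((b-r)/delta + 2.0) / 6.0 // (0 + 2) / 6
	fmt.Printf("h = %.3f\n", h)    // 0.333, i.e. 120° on the hue circle
}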

View File

@@ -103,40 +103,18 @@ func (a *ArchDistribution) DetectDependenciesWithTerminal(ctx context.Context, w
dependencies = append(dependencies, a.detectXwaylandSatellite())
}
// Base detections (common across distros)
dependencies = append(dependencies, a.detectMatugen())
dependencies = append(dependencies, a.detectDgop())
dependencies = append(dependencies, a.detectClipboardTools()...)
return dependencies, nil
}
func (a *ArchDistribution) detectXDGPortal() deps.Dependency {
status := deps.StatusMissing
if a.packageInstalled("xdg-desktop-portal-gtk") {
status = deps.StatusInstalled
}
return deps.Dependency{
Name: "xdg-desktop-portal-gtk",
Status: status,
Description: "Desktop integration portal for GTK",
Required: true,
}
return a.detectPackage("xdg-desktop-portal-gtk", "Desktop integration portal for GTK", a.packageInstalled("xdg-desktop-portal-gtk"))
}
func (a *ArchDistribution) detectAccountsService() deps.Dependency {
status := deps.StatusMissing
if a.packageInstalled("accountsservice") {
status = deps.StatusInstalled
}
return deps.Dependency{
Name: "accountsservice",
Status: status,
Description: "D-Bus interface for user account query and manipulation",
Required: true,
}
return a.detectPackage("accountsservice", "D-Bus interface for user account query and manipulation", a.packageInstalled("accountsservice"))
}
func (a *ArchDistribution) packageInstalled(pkg string) bool {
@@ -159,8 +137,6 @@ func (a *ArchDistribution) GetPackageMappingWithVariants(wm deps.WindowManager,
"ghostty": {Name: "ghostty", Repository: RepoTypeSystem},
"kitty": {Name: "kitty", Repository: RepoTypeSystem},
"alacritty": {Name: "alacritty", Repository: RepoTypeSystem},
"cliphist": {Name: "cliphist", Repository: RepoTypeSystem},
"wl-clipboard": {Name: "wl-clipboard", Repository: RepoTypeSystem},
"xdg-desktop-portal-gtk": {Name: "xdg-desktop-portal-gtk", Repository: RepoTypeSystem},
"accountsservice": {Name: "accountsservice", Repository: RepoTypeSystem},
}
@@ -182,13 +158,11 @@ func (a *ArchDistribution) getQuickshellMapping(variant deps.PackageVariant) Pac
if forceQuickshellGit || variant == deps.VariantGit {
return PackageMapping{Name: "quickshell-git", Repository: RepoTypeAUR}
}
return PackageMapping{Name: "quickshell", Repository: RepoTypeSystem}
// ! TODO - for now we're only forcing quickshell-git on ARCH, as other distros use DL repos which pin a newer quickshell
return PackageMapping{Name: "quickshell-git", Repository: RepoTypeAUR}
}
func (a *ArchDistribution) getHyprlandMapping(variant deps.PackageVariant) PackageMapping {
if variant == deps.VariantGit {
return PackageMapping{Name: "hyprland-git", Repository: RepoTypeAUR}
}
func (a *ArchDistribution) getHyprlandMapping(_ deps.PackageVariant) PackageMapping {
return PackageMapping{Name: "hyprland", Repository: RepoTypeSystem}
}
@@ -362,7 +336,11 @@ func (a *ArchDistribution) InstallPackages(ctx context.Context, dependencies []d
a.log(fmt.Sprintf("Warning: failed to write environment config: %v", err))
}
if err := a.EnableDMSService(ctx); err != nil {
if err := a.WriteWindowManagerConfig(wm); err != nil {
a.log(fmt.Sprintf("Warning: failed to write window manager config: %v", err))
}
if err := a.EnableDMSService(ctx, wm); err != nil {
a.log(fmt.Sprintf("Warning: failed to enable dms service: %v", err))
}

View File

@@ -76,47 +76,42 @@ func ExecSudoCommand(ctx context.Context, sudoPassword string, command string) *
return exec.CommandContext(ctx, "bash", "-c", cmdStr)
}
// Common dependency detection methods
func (b *BaseDistribution) detectGit() deps.Dependency {
func (b *BaseDistribution) detectCommand(name, description string) deps.Dependency {
status := deps.StatusMissing
if b.commandExists("git") {
if b.commandExists(name) {
status = deps.StatusInstalled
}
return deps.Dependency{
Name: "git",
Name: name,
Status: status,
Description: "Version control system",
Description: description,
Required: true,
}
}
func (b *BaseDistribution) detectPackage(name, description string, installed bool) deps.Dependency {
status := deps.StatusMissing
if installed {
status = deps.StatusInstalled
}
return deps.Dependency{
Name: name,
Status: status,
Description: description,
Required: true,
}
}
func (b *BaseDistribution) detectGit() deps.Dependency {
return b.detectCommand("git", "Version control system")
}
func (b *BaseDistribution) detectMatugen() deps.Dependency {
status := deps.StatusMissing
if b.commandExists("matugen") {
status = deps.StatusInstalled
}
return deps.Dependency{
Name: "matugen",
Status: status,
Description: "Material Design color generation tool",
Required: true,
}
return b.detectCommand("matugen", "Material Design color generation tool")
}
func (b *BaseDistribution) detectDgop() deps.Dependency {
status := deps.StatusMissing
if b.commandExists("dgop") {
status = deps.StatusInstalled
}
return deps.Dependency{
Name: "dgop",
Status: status,
Description: "Desktop portal management tool",
Required: true,
}
return b.detectCommand("dgop", "Desktop portal management tool")
}
func (b *BaseDistribution) detectDMS() deps.Dependency {
@@ -190,37 +185,6 @@ func (b *BaseDistribution) detectSpecificTerminal(terminal deps.Terminal) deps.D
}
}
func (b *BaseDistribution) detectClipboardTools() []deps.Dependency {
var dependencies []deps.Dependency
cliphist := deps.StatusMissing
if b.commandExists("cliphist") {
cliphist = deps.StatusInstalled
}
wlClipboard := deps.StatusMissing
if b.commandExists("wl-copy") && b.commandExists("wl-paste") {
wlClipboard = deps.StatusInstalled
}
dependencies = append(dependencies,
deps.Dependency{
Name: "cliphist",
Status: cliphist,
Description: "Wayland clipboard manager",
Required: true,
},
deps.Dependency{
Name: "wl-clipboard",
Status: wlClipboard,
Description: "Wayland clipboard utilities",
Required: true,
},
)
return dependencies
}
func (b *BaseDistribution) detectHyprlandTools() []deps.Dependency {
var dependencies []deps.Dependency
@@ -602,12 +566,59 @@ TERMINAL=%s
return nil
}
func (b *BaseDistribution) EnableDMSService(ctx context.Context) error {
func (b *BaseDistribution) EnableDMSService(ctx context.Context, wm deps.WindowManager) error {
cmd := exec.CommandContext(ctx, "systemctl", "--user", "enable", "--now", "dms")
if err := cmd.Run(); err != nil {
return fmt.Errorf("failed to enable dms service: %w", err)
}
b.log("Enabled dms systemd user service")
switch wm {
case deps.WindowManagerNiri:
if err := exec.CommandContext(ctx, "systemctl", "--user", "add-wants", "niri.service", "dms").Run(); err != nil {
b.log("Warning: failed to add dms as a want for niri.service")
}
case deps.WindowManagerHyprland:
if err := exec.CommandContext(ctx, "systemctl", "--user", "add-wants", "hyprland-session.target", "dms").Run(); err != nil {
b.log("Warning: failed to add dms as a want for hyprland-session.target")
}
}
return nil
}
func (b *BaseDistribution) WriteWindowManagerConfig(wm deps.WindowManager) error {
if wm == deps.WindowManagerHyprland {
if err := b.WriteHyprlandSessionTarget(); err != nil {
return fmt.Errorf("failed to write hyprland session target: %w", err)
}
}
return nil
}
func (b *BaseDistribution) WriteHyprlandSessionTarget() error {
homeDir, err := os.UserHomeDir()
if err != nil {
return fmt.Errorf("failed to get home directory: %w", err)
}
targetDir := filepath.Join(homeDir, ".config", "systemd", "user")
if err := os.MkdirAll(targetDir, 0755); err != nil {
return fmt.Errorf("failed to create systemd user directory: %w", err)
}
targetPath := filepath.Join(targetDir, "hyprland-session.target")
content := `[Unit]
Description=Hyprland Session Target
Requires=graphical-session.target
After=graphical-session.target
`
if err := os.WriteFile(targetPath, []byte(content), 0644); err != nil {
return fmt.Errorf("failed to write hyprland-session.target: %w", err)
}
b.log(fmt.Sprintf("Wrote hyprland-session.target to %s", targetPath))
return nil
}
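A small follow-up check one could run after EnableDMSService; it assumes systemd resolves "dms" to dms.service and that add-wants places its symlink under the persistent user unit directory (both assumptions, not taken from this diff; "os" and "path/filepath" imports assumed):
func dmsWantedByHyprlandSession() bool {
	home, err := os.UserHomeDir()
	if err != nil {
		return false
	}
	link := filepath.Join(home, ".config", "systemd", "user",
		"hyprland-session.target.wants", "dms.service")
	_, err = os.Lstat(link)
	return err == nil
}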

View File

@@ -7,6 +7,7 @@ import (
"testing"
"github.com/AvengeMedia/DankMaterialShell/core/internal/deps"
"github.com/AvengeMedia/DankMaterialShell/core/internal/utils"
)
func TestBaseDistribution_detectDMS_NotInstalled(t *testing.T) {
@@ -36,7 +37,7 @@ func TestBaseDistribution_detectDMS_NotInstalled(t *testing.T) {
}
func TestBaseDistribution_detectDMS_Installed(t *testing.T) {
if !commandExists("git") {
if !utils.CommandExists("git") {
t.Skip("git not available")
}
@@ -80,7 +81,7 @@ func TestBaseDistribution_detectDMS_Installed(t *testing.T) {
}
func TestBaseDistribution_detectDMS_NeedsUpdate(t *testing.T) {
if !commandExists("git") {
if !utils.CommandExists("git") {
t.Skip("git not available")
}
@@ -164,11 +165,6 @@ func TestBaseDistribution_NewBaseDistribution(t *testing.T) {
}
}
func commandExists(cmd string) bool {
_, err := exec.LookPath(cmd)
return err == nil
}
func TestBaseDistribution_versionCompare(t *testing.T) {
logChan := make(chan string, 10)
defer close(logChan)

View File

@@ -69,51 +69,20 @@ func (d *DebianDistribution) DetectDependenciesWithTerminal(ctx context.Context,
dependencies = append(dependencies, d.detectMatugen())
dependencies = append(dependencies, d.detectDgop())
dependencies = append(dependencies, d.detectClipboardTools()...)
return dependencies, nil
}
func (d *DebianDistribution) detectXDGPortal() deps.Dependency {
status := deps.StatusMissing
if d.packageInstalled("xdg-desktop-portal-gtk") {
status = deps.StatusInstalled
}
return deps.Dependency{
Name: "xdg-desktop-portal-gtk",
Status: status,
Description: "Desktop integration portal for GTK",
Required: true,
}
return d.detectPackage("xdg-desktop-portal-gtk", "Desktop integration portal for GTK", d.packageInstalled("xdg-desktop-portal-gtk"))
}
func (d *DebianDistribution) detectXwaylandSatellite() deps.Dependency {
status := deps.StatusMissing
if d.commandExists("xwayland-satellite") {
status = deps.StatusInstalled
}
return deps.Dependency{
Name: "xwayland-satellite",
Status: status,
Description: "Xwayland support",
Required: true,
}
return d.detectCommand("xwayland-satellite", "Xwayland support")
}
func (d *DebianDistribution) detectAccountsService() deps.Dependency {
status := deps.StatusMissing
if d.packageInstalled("accountsservice") {
status = deps.StatusInstalled
}
return deps.Dependency{
Name: "accountsservice",
Status: status,
Description: "D-Bus interface for user account query and manipulation",
Required: true,
}
return d.detectPackage("accountsservice", "D-Bus interface for user account query and manipulation", d.packageInstalled("accountsservice"))
}
func (d *DebianDistribution) packageInstalled(pkg string) bool {
@@ -132,7 +101,6 @@ func (d *DebianDistribution) GetPackageMappingWithVariants(wm deps.WindowManager
"git": {Name: "git", Repository: RepoTypeSystem},
"kitty": {Name: "kitty", Repository: RepoTypeSystem},
"alacritty": {Name: "alacritty", Repository: RepoTypeSystem},
"wl-clipboard": {Name: "wl-clipboard", Repository: RepoTypeSystem},
"xdg-desktop-portal-gtk": {Name: "xdg-desktop-portal-gtk", Repository: RepoTypeSystem},
"accountsservice": {Name: "accountsservice", Repository: RepoTypeSystem},
@@ -141,7 +109,6 @@ func (d *DebianDistribution) GetPackageMappingWithVariants(wm deps.WindowManager
"quickshell": d.getQuickshellMapping(variants["quickshell"]),
"matugen": {Name: "matugen", Repository: RepoTypeOBS, RepoURL: "home:AvengeMedia:danklinux"},
"dgop": {Name: "dgop", Repository: RepoTypeOBS, RepoURL: "home:AvengeMedia:danklinux"},
"cliphist": {Name: "cliphist", Repository: RepoTypeOBS, RepoURL: "home:AvengeMedia:danklinux"},
"ghostty": {Name: "ghostty", Repository: RepoTypeOBS, RepoURL: "home:AvengeMedia:danklinux"},
}
@@ -208,7 +175,7 @@ func (d *DebianDistribution) InstallPrerequisites(ctx context.Context, sudoPassw
checkCmd := exec.CommandContext(ctx, "dpkg", "-l", "build-essential")
if err := checkCmd.Run(); err != nil {
cmd := ExecSudoCommand(ctx, sudoPassword, "apt-get install -y build-essential")
cmd := ExecSudoCommand(ctx, sudoPassword, "DEBIAN_FRONTEND=noninteractive apt-get install -y build-essential")
if err := d.runWithProgress(cmd, progressChan, PhasePrerequisites, 0.08, 0.09); err != nil {
return fmt.Errorf("failed to install build-essential: %w", err)
}
@@ -225,7 +192,7 @@ func (d *DebianDistribution) InstallPrerequisites(ctx context.Context, sudoPassw
}
devToolsCmd := ExecSudoCommand(ctx, sudoPassword,
"apt-get install -y curl wget git cmake ninja-build pkg-config libxcb-cursor-dev libglib2.0-dev libpolkit-agent-1-dev libjpeg-dev libpugixml-dev")
"DEBIAN_FRONTEND=noninteractive apt-get install -y curl wget git cmake ninja-build pkg-config libxcb-cursor-dev libglib2.0-dev libpolkit-agent-1-dev libjpeg-dev libpugixml-dev")
if err := d.runWithProgress(devToolsCmd, progressChan, PhasePrerequisites, 0.10, 0.12); err != nil {
return fmt.Errorf("failed to install development tools: %w", err)
}
@@ -338,7 +305,11 @@ func (d *DebianDistribution) InstallPackages(ctx context.Context, dependencies [
d.log(fmt.Sprintf("Warning: failed to write environment config: %v", err))
}
if err := d.EnableDMSService(ctx); err != nil {
if err := d.WriteWindowManagerConfig(wm); err != nil {
d.log(fmt.Sprintf("Warning: failed to write window manager config: %v", err))
}
if err := d.EnableDMSService(ctx, wm); err != nil {
d.log(fmt.Sprintf("Warning: failed to enable dms service: %v", err))
}
@@ -449,7 +420,7 @@ func (d *DebianDistribution) enableOBSRepos(ctx context.Context, obsPkgs []Packa
CommandInfo: fmt.Sprintf("curl & gpg to add key for %s", pkg.RepoURL),
}
keyCmd := fmt.Sprintf("curl -fsSL %s/Release.key | gpg --dearmor -o %s", baseURL, keyringPath)
keyCmd := fmt.Sprintf("bash -c 'rm -f %s && curl -fsSL %s/Release.key | gpg --batch --dearmor -o %s'", keyringPath, baseURL, keyringPath)
cmd := ExecSudoCommand(ctx, sudoPassword, keyCmd)
if err := d.runWithProgress(cmd, progressChan, PhaseSystemPackages, 0.18, 0.20); err != nil {
return fmt.Errorf("failed to add OBS GPG key for %s: %w", pkg.RepoURL, err)
@@ -467,7 +438,7 @@ func (d *DebianDistribution) enableOBSRepos(ctx context.Context, obsPkgs []Packa
}
addRepoCmd := ExecSudoCommand(ctx, sudoPassword,
fmt.Sprintf("echo '%s' | tee %s", repoLine, listFile))
fmt.Sprintf("bash -c \"echo '%s' | tee %s\"", repoLine, listFile))
if err := d.runWithProgress(addRepoCmd, progressChan, PhaseSystemPackages, 0.20, 0.22); err != nil {
return fmt.Errorf("failed to add OBS repo %s: %w", pkg.RepoURL, err)
}
@@ -502,7 +473,7 @@ func (d *DebianDistribution) installAPTPackages(ctx context.Context, packages []
d.log(fmt.Sprintf("Installing APT packages: %s", strings.Join(packages, ", ")))
args := []string{"apt-get", "install", "-y"}
args := []string{"DEBIAN_FRONTEND=noninteractive", "apt-get", "install", "-y"}
args = append(args, packages...)
progressChan <- InstallProgressMsg{
@@ -575,7 +546,7 @@ func (d *DebianDistribution) installBuildDependencies(ctx context.Context, manua
if err := d.installRust(ctx, sudoPassword, progressChan); err != nil {
return fmt.Errorf("failed to install Rust: %w", err)
}
case "cliphist", "dgop":
case "dgop":
if err := d.installGo(ctx, sudoPassword, progressChan); err != nil {
return fmt.Errorf("failed to install Go: %w", err)
}
@@ -612,7 +583,7 @@ func (d *DebianDistribution) installRust(ctx context.Context, sudoPassword strin
CommandInfo: "sudo apt-get install rustup",
}
rustupInstallCmd := ExecSudoCommand(ctx, sudoPassword, "apt-get install -y rustup")
rustupInstallCmd := ExecSudoCommand(ctx, sudoPassword, "DEBIAN_FRONTEND=noninteractive apt-get install -y rustup")
if err := d.runWithProgress(rustupInstallCmd, progressChan, PhaseSystemPackages, 0.82, 0.83); err != nil {
return fmt.Errorf("failed to install rustup: %w", err)
}
@@ -651,7 +622,7 @@ func (d *DebianDistribution) installGo(ctx context.Context, sudoPassword string,
CommandInfo: "sudo apt-get install golang-go",
}
installCmd := ExecSudoCommand(ctx, sudoPassword, "apt-get install -y golang-go")
installCmd := ExecSudoCommand(ctx, sudoPassword, "DEBIAN_FRONTEND=noninteractive apt-get install -y golang-go")
return d.runWithProgress(installCmd, progressChan, PhaseSystemPackages, 0.87, 0.90)
}

View File

@@ -88,26 +88,14 @@ func (f *FedoraDistribution) DetectDependenciesWithTerminal(ctx context.Context,
dependencies = append(dependencies, f.detectXwaylandSatellite())
}
// Base detections (common across distros)
dependencies = append(dependencies, f.detectMatugen())
dependencies = append(dependencies, f.detectDgop())
dependencies = append(dependencies, f.detectClipboardTools()...)
return dependencies, nil
}
func (f *FedoraDistribution) detectXDGPortal() deps.Dependency {
status := deps.StatusMissing
if f.packageInstalled("xdg-desktop-portal-gtk") {
status = deps.StatusInstalled
}
return deps.Dependency{
Name: "xdg-desktop-portal-gtk",
Status: status,
Description: "Desktop integration portal for GTK",
Required: true,
}
return f.detectPackage("xdg-desktop-portal-gtk", "Desktop integration portal for GTK", f.packageInstalled("xdg-desktop-portal-gtk"))
}
func (f *FedoraDistribution) packageInstalled(pkg string) bool {
@@ -127,14 +115,12 @@ func (f *FedoraDistribution) GetPackageMappingWithVariants(wm deps.WindowManager
"ghostty": {Name: "ghostty", Repository: RepoTypeCOPR, RepoURL: "avengemedia/danklinux"},
"kitty": {Name: "kitty", Repository: RepoTypeSystem},
"alacritty": {Name: "alacritty", Repository: RepoTypeSystem},
"wl-clipboard": {Name: "wl-clipboard", Repository: RepoTypeSystem},
"xdg-desktop-portal-gtk": {Name: "xdg-desktop-portal-gtk", Repository: RepoTypeSystem},
"accountsservice": {Name: "accountsservice", Repository: RepoTypeSystem},
// COPR packages
"quickshell": f.getQuickshellMapping(variants["quickshell"]),
"matugen": {Name: "matugen", Repository: RepoTypeCOPR, RepoURL: "avengemedia/danklinux"},
"cliphist": {Name: "cliphist", Repository: RepoTypeCOPR, RepoURL: "avengemedia/danklinux"},
"dms (DankMaterialShell)": f.getDmsMapping(variants["dms (DankMaterialShell)"]),
"dgop": {Name: "dgop", Repository: RepoTypeCOPR, RepoURL: "avengemedia/danklinux"},
}
@@ -166,10 +152,7 @@ func (f *FedoraDistribution) getDmsMapping(variant deps.PackageVariant) PackageM
return PackageMapping{Name: "dms", Repository: RepoTypeCOPR, RepoURL: "avengemedia/dms"}
}
func (f *FedoraDistribution) getHyprlandMapping(variant deps.PackageVariant) PackageMapping {
if variant == deps.VariantGit {
return PackageMapping{Name: "hyprland-git", Repository: RepoTypeCOPR, RepoURL: "solopasha/hyprland"}
}
func (f *FedoraDistribution) getHyprlandMapping(_ deps.PackageVariant) PackageMapping {
return PackageMapping{Name: "hyprland", Repository: RepoTypeCOPR, RepoURL: "solopasha/hyprland"}
}
@@ -177,7 +160,7 @@ func (f *FedoraDistribution) getNiriMapping(variant deps.PackageVariant) Package
if variant == deps.VariantGit {
return PackageMapping{Name: "niri", Repository: RepoTypeCOPR, RepoURL: "yalter/niri-git"}
}
return PackageMapping{Name: "niri", Repository: RepoTypeSystem}
return PackageMapping{Name: "niri", Repository: RepoTypeCOPR, RepoURL: "yalter/niri"}
}
func (f *FedoraDistribution) detectXwaylandSatellite() deps.Dependency {
@@ -362,7 +345,11 @@ func (f *FedoraDistribution) InstallPackages(ctx context.Context, dependencies [
f.log(fmt.Sprintf("Warning: failed to write environment config: %v", err))
}
if err := f.EnableDMSService(ctx); err != nil {
if err := f.WriteWindowManagerConfig(wm); err != nil {
f.log(fmt.Sprintf("Warning: failed to write window manager config: %v", err))
}
if err := f.EnableDMSService(ctx, wm); err != nil {
f.log(fmt.Sprintf("Warning: failed to enable dms service: %v", err))
}

View File

@@ -95,7 +95,6 @@ func (g *GentooDistribution) DetectDependenciesWithTerminal(ctx context.Context,
dependencies = append(dependencies, g.detectWindowManager(wm))
dependencies = append(dependencies, g.detectQuickshell())
dependencies = append(dependencies, g.detectXDGPortal())
dependencies = append(dependencies, g.detectPolkitAgent())
dependencies = append(dependencies, g.detectAccountsService())
if wm == deps.WindowManagerHyprland {
@@ -108,65 +107,20 @@ func (g *GentooDistribution) DetectDependenciesWithTerminal(ctx context.Context,
dependencies = append(dependencies, g.detectMatugen())
dependencies = append(dependencies, g.detectDgop())
dependencies = append(dependencies, g.detectClipboardTools()...)
return dependencies, nil
}
func (g *GentooDistribution) detectXDGPortal() deps.Dependency {
status := deps.StatusMissing
if g.packageInstalled("sys-apps/xdg-desktop-portal-gtk") {
status = deps.StatusInstalled
}
return deps.Dependency{
Name: "xdg-desktop-portal-gtk",
Status: status,
Description: "Desktop integration portal for GTK",
Required: true,
}
}
func (g *GentooDistribution) detectPolkitAgent() deps.Dependency {
status := deps.StatusMissing
if g.packageInstalled("mate-extra/mate-polkit") {
status = deps.StatusInstalled
}
return deps.Dependency{
Name: "mate-polkit",
Status: status,
Description: "PolicyKit authentication agent",
Required: true,
}
return g.detectPackage("xdg-desktop-portal-gtk", "Desktop integration portal for GTK", g.packageInstalled("sys-apps/xdg-desktop-portal-gtk"))
}
func (g *GentooDistribution) detectXwaylandSatellite() deps.Dependency {
status := deps.StatusMissing
if g.packageInstalled("gui-apps/xwayland-satellite") {
status = deps.StatusInstalled
}
return deps.Dependency{
Name: "xwayland-satellite",
Status: status,
Description: "Xwayland support",
Required: true,
}
return g.detectPackage("xwayland-satellite", "Xwayland support", g.packageInstalled("gui-apps/xwayland-satellite"))
}
func (g *GentooDistribution) detectAccountsService() deps.Dependency {
status := deps.StatusMissing
if g.packageInstalled("sys-apps/accountsservice") {
status = deps.StatusInstalled
}
return deps.Dependency{
Name: "accountsservice",
Status: status,
Description: "D-Bus interface for user account query and manipulation",
Required: true,
}
return g.detectPackage("accountsservice", "D-Bus interface for user account query and manipulation", g.packageInstalled("sys-apps/accountsservice"))
}
func (g *GentooDistribution) packageInstalled(pkg string) bool {
@@ -185,9 +139,7 @@ func (g *GentooDistribution) GetPackageMappingWithVariants(wm deps.WindowManager
"git": {Name: "dev-vcs/git", Repository: RepoTypeSystem},
"kitty": {Name: "x11-terms/kitty", Repository: RepoTypeSystem, UseFlags: "X wayland"},
"alacritty": {Name: "x11-terms/alacritty", Repository: RepoTypeSystem, UseFlags: "X wayland"},
"wl-clipboard": {Name: "gui-apps/wl-clipboard", Repository: RepoTypeSystem},
"xdg-desktop-portal-gtk": {Name: "sys-apps/xdg-desktop-portal-gtk", Repository: RepoTypeSystem, UseFlags: "wayland X"},
"mate-polkit": {Name: "mate-extra/mate-polkit", Repository: RepoTypeSystem},
"accountsservice": {Name: "sys-apps/accountsservice", Repository: RepoTypeSystem},
"qtbase": {Name: "dev-qt/qtbase", Repository: RepoTypeSystem, UseFlags: "wayland opengl vulkan widgets"},
@@ -197,7 +149,6 @@ func (g *GentooDistribution) GetPackageMappingWithVariants(wm deps.WindowManager
"quickshell": g.getQuickshellMapping(variants["quickshell"]),
"matugen": {Name: "x11-misc/matugen", Repository: RepoTypeGURU, AcceptKeywords: archKeyword},
"cliphist": {Name: "app-misc/cliphist", Repository: RepoTypeGURU, AcceptKeywords: archKeyword},
"dms (DankMaterialShell)": g.getDmsMapping(variants["dms (DankMaterialShell)"]),
"dgop": {Name: "dgop", Repository: RepoTypeManual, BuildFunc: "installDgop"},
}
@@ -223,12 +174,8 @@ func (g *GentooDistribution) getDmsMapping(_ deps.PackageVariant) PackageMapping
return PackageMapping{Name: "dms", Repository: RepoTypeManual, BuildFunc: "installDankMaterialShell"}
}
func (g *GentooDistribution) getHyprlandMapping(variant deps.PackageVariant) PackageMapping {
archKeyword := g.getArchKeyword()
if variant == deps.VariantGit {
return PackageMapping{Name: "gui-wm/hyprland", Repository: RepoTypeGURU, UseFlags: "X", AcceptKeywords: archKeyword}
}
return PackageMapping{Name: "gui-wm/hyprland", Repository: RepoTypeSystem, UseFlags: "X", AcceptKeywords: archKeyword}
func (g *GentooDistribution) getHyprlandMapping(_ deps.PackageVariant) PackageMapping {
return PackageMapping{Name: "gui-wm/hyprland", Repository: RepoTypeSystem, UseFlags: "X", AcceptKeywords: g.getArchKeyword()}
}
func (g *GentooDistribution) getNiriMapping(_ deps.PackageVariant) PackageMapping {
@@ -456,7 +403,11 @@ func (g *GentooDistribution) InstallPackages(ctx context.Context, dependencies [
g.log(fmt.Sprintf("Warning: failed to write environment config: %v", err))
}
if err := g.EnableDMSService(ctx); err != nil {
if err := g.WriteWindowManagerConfig(wm); err != nil {
g.log(fmt.Sprintf("Warning: failed to write window manager config: %v", err))
}
if err := g.EnableDMSService(ctx, wm); err != nil {
g.log(fmt.Sprintf("Warning: failed to enable dms service: %v", err))
}

View File

@@ -74,10 +74,6 @@ func (m *ManualPackageInstaller) InstallManualPackages(ctx context.Context, pack
if err := m.installHyprland(ctx, sudoPassword, progressChan); err != nil {
return fmt.Errorf("failed to install hyprland: %w", err)
}
case "hyprpicker":
if err := m.installHyprpicker(ctx, sudoPassword, progressChan); err != nil {
return fmt.Errorf("failed to install hyprpicker: %w", err)
}
case "ghostty":
if err := m.installGhostty(ctx, sudoPassword, progressChan); err != nil {
return fmt.Errorf("failed to install ghostty: %w", err)
@@ -86,10 +82,6 @@ func (m *ManualPackageInstaller) InstallManualPackages(ctx context.Context, pack
if err := m.installMatugen(ctx, sudoPassword, progressChan); err != nil {
return fmt.Errorf("failed to install matugen: %w", err)
}
case "cliphist":
if err := m.installCliphist(ctx, sudoPassword, progressChan); err != nil {
return fmt.Errorf("failed to install cliphist: %w", err)
}
case "xwayland-satellite":
if err := m.installXwaylandSatellite(ctx, sudoPassword, progressChan); err != nil {
return fmt.Errorf("failed to install xwayland-satellite: %w", err)
@@ -405,184 +397,6 @@ func (m *ManualPackageInstaller) installHyprland(ctx context.Context, sudoPasswo
return nil
}
func (m *ManualPackageInstaller) installHyprpicker(ctx context.Context, sudoPassword string, progressChan chan<- InstallProgressMsg) error {
m.log("Installing hyprpicker from source...")
homeDir := os.Getenv("HOME")
if homeDir == "" {
return fmt.Errorf("HOME environment variable not set")
}
cacheDir := filepath.Join(homeDir, ".cache", "dankinstall")
if err := os.MkdirAll(cacheDir, 0755); err != nil {
return fmt.Errorf("failed to create cache directory: %w", err)
}
// Install hyprutils first
progressChan <- InstallProgressMsg{
Phase: PhaseSystemPackages,
Progress: 0.05,
Step: "Building hyprutils dependency...",
IsComplete: false,
CommandInfo: "git clone https://github.com/hyprwm/hyprutils.git",
}
hyprutilsDir := filepath.Join(cacheDir, "hyprutils-build")
if err := os.MkdirAll(hyprutilsDir, 0755); err != nil {
return fmt.Errorf("failed to create hyprutils directory: %w", err)
}
defer os.RemoveAll(hyprutilsDir)
cloneUtilsCmd := exec.CommandContext(ctx, "git", "clone", "https://github.com/hyprwm/hyprutils.git", hyprutilsDir)
if err := cloneUtilsCmd.Run(); err != nil {
return fmt.Errorf("failed to clone hyprutils: %w", err)
}
configureUtilsCmd := exec.CommandContext(ctx, "cmake",
"--no-warn-unused-cli",
"-DCMAKE_BUILD_TYPE:STRING=Release",
"-DCMAKE_INSTALL_PREFIX:PATH=/usr",
"-DBUILD_TESTING=off",
"-S", ".",
"-B", "./build")
configureUtilsCmd.Dir = hyprutilsDir
configureUtilsCmd.Env = append(os.Environ(), "TMPDIR="+cacheDir)
if err := m.runWithProgressStep(configureUtilsCmd, progressChan, PhaseSystemPackages, 0.05, 0.1, "Configuring hyprutils..."); err != nil {
return fmt.Errorf("failed to configure hyprutils: %w", err)
}
buildUtilsCmd := exec.CommandContext(ctx, "cmake", "--build", "./build", "--config", "Release", "--target", "all")
buildUtilsCmd.Dir = hyprutilsDir
buildUtilsCmd.Env = append(os.Environ(), "TMPDIR="+cacheDir)
if err := m.runWithProgressStep(buildUtilsCmd, progressChan, PhaseSystemPackages, 0.1, 0.2, "Building hyprutils..."); err != nil {
return fmt.Errorf("failed to build hyprutils: %w", err)
}
installUtilsCmd := ExecSudoCommand(ctx, sudoPassword, "cmake --install ./build")
installUtilsCmd.Dir = hyprutilsDir
if err := installUtilsCmd.Run(); err != nil {
return fmt.Errorf("failed to install hyprutils: %w", err)
}
// Install hyprwayland-scanner
progressChan <- InstallProgressMsg{
Phase: PhaseSystemPackages,
Progress: 0.2,
Step: "Building hyprwayland-scanner dependency...",
IsComplete: false,
CommandInfo: "git clone https://github.com/hyprwm/hyprwayland-scanner.git",
}
scannerDir := filepath.Join(cacheDir, "hyprwayland-scanner-build")
if err := os.MkdirAll(scannerDir, 0755); err != nil {
return fmt.Errorf("failed to create scanner directory: %w", err)
}
defer os.RemoveAll(scannerDir)
cloneScannerCmd := exec.CommandContext(ctx, "git", "clone", "https://github.com/hyprwm/hyprwayland-scanner.git", scannerDir)
if err := cloneScannerCmd.Run(); err != nil {
return fmt.Errorf("failed to clone hyprwayland-scanner: %w", err)
}
configureScannerCmd := exec.CommandContext(ctx, "cmake",
"-DCMAKE_INSTALL_PREFIX=/usr",
"-B", "build")
configureScannerCmd.Dir = scannerDir
configureScannerCmd.Env = append(os.Environ(), "TMPDIR="+cacheDir)
if err := m.runWithProgressStep(configureScannerCmd, progressChan, PhaseSystemPackages, 0.2, 0.25, "Configuring hyprwayland-scanner..."); err != nil {
return fmt.Errorf("failed to configure hyprwayland-scanner: %w", err)
}
buildScannerCmd := exec.CommandContext(ctx, "cmake", "--build", "build", "-j")
buildScannerCmd.Dir = scannerDir
buildScannerCmd.Env = append(os.Environ(), "TMPDIR="+cacheDir)
if err := m.runWithProgressStep(buildScannerCmd, progressChan, PhaseSystemPackages, 0.25, 0.35, "Building hyprwayland-scanner..."); err != nil {
return fmt.Errorf("failed to build hyprwayland-scanner: %w", err)
}
installScannerCmd := ExecSudoCommand(ctx, sudoPassword, "cmake --install build")
installScannerCmd.Dir = scannerDir
if err := installScannerCmd.Run(); err != nil {
return fmt.Errorf("failed to install hyprwayland-scanner: %w", err)
}
// Now build hyprpicker
tmpDir := filepath.Join(cacheDir, "hyprpicker-build")
if err := os.MkdirAll(tmpDir, 0755); err != nil {
return fmt.Errorf("failed to create temp directory: %w", err)
}
defer os.RemoveAll(tmpDir)
progressChan <- InstallProgressMsg{
Phase: PhaseSystemPackages,
Progress: 0.35,
Step: "Cloning hyprpicker repository...",
IsComplete: false,
CommandInfo: "git clone https://github.com/hyprwm/hyprpicker.git",
}
cloneCmd := exec.CommandContext(ctx, "git", "clone", "https://github.com/hyprwm/hyprpicker.git", tmpDir)
if err := cloneCmd.Run(); err != nil {
return fmt.Errorf("failed to clone hyprpicker: %w", err)
}
progressChan <- InstallProgressMsg{
Phase: PhaseSystemPackages,
Progress: 0.45,
Step: "Configuring hyprpicker build...",
IsComplete: false,
CommandInfo: "cmake -B build -S . -DCMAKE_BUILD_TYPE=Release",
}
configureCmd := exec.CommandContext(ctx, "cmake",
"--no-warn-unused-cli",
"-DCMAKE_BUILD_TYPE:STRING=Release",
"-DCMAKE_INSTALL_PREFIX:PATH=/usr",
"-S", ".",
"-B", "./build")
configureCmd.Dir = tmpDir
configureCmd.Env = append(os.Environ(), "TMPDIR="+cacheDir)
output, err := configureCmd.CombinedOutput()
if err != nil {
m.log(fmt.Sprintf("cmake configure failed. Output:\n%s", string(output)))
return fmt.Errorf("failed to configure hyprpicker: %w\nCMake output:\n%s", err, string(output))
}
progressChan <- InstallProgressMsg{
Phase: PhaseSystemPackages,
Progress: 0.55,
Step: "Building hyprpicker...",
IsComplete: false,
CommandInfo: "cmake --build build --target hyprpicker",
}
buildCmd := exec.CommandContext(ctx, "cmake", "--build", "./build", "--config", "Release", "--target", "hyprpicker")
buildCmd.Dir = tmpDir
buildCmd.Env = append(os.Environ(), "TMPDIR="+cacheDir)
if err := m.runWithProgressStep(buildCmd, progressChan, PhaseSystemPackages, 0.55, 0.8, "Building hyprpicker..."); err != nil {
return fmt.Errorf("failed to build hyprpicker: %w", err)
}
progressChan <- InstallProgressMsg{
Phase: PhaseSystemPackages,
Progress: 0.8,
Step: "Installing hyprpicker...",
IsComplete: false,
NeedsSudo: true,
CommandInfo: "sudo cmake --install build",
}
installCmd := ExecSudoCommand(ctx, sudoPassword, "cmake --install ./build")
installCmd.Dir = tmpDir
if err := installCmd.Run(); err != nil {
return fmt.Errorf("failed to install hyprpicker: %w", err)
}
m.log("hyprpicker installed successfully from source")
return nil
}
func (m *ManualPackageInstaller) installGhostty(ctx context.Context, sudoPassword string, progressChan chan<- InstallProgressMsg) error {
m.log("Installing Ghostty from source...")
@@ -803,52 +617,6 @@ func (m *ManualPackageInstaller) installDankMaterialShell(ctx context.Context, v
return nil
}
func (m *ManualPackageInstaller) installCliphist(ctx context.Context, sudoPassword string, progressChan chan<- InstallProgressMsg) error {
m.log("Installing cliphist from source...")
progressChan <- InstallProgressMsg{
Phase: PhaseSystemPackages,
Progress: 0.1,
Step: "Installing cliphist via go install...",
IsComplete: false,
CommandInfo: "go install go.senan.xyz/cliphist@latest",
}
installCmd := exec.CommandContext(ctx, "go", "install", "go.senan.xyz/cliphist@latest")
if err := m.runWithProgressStep(installCmd, progressChan, PhaseSystemPackages, 0.1, 0.7, "Building cliphist..."); err != nil {
return fmt.Errorf("failed to install cliphist: %w", err)
}
homeDir := os.Getenv("HOME")
sourcePath := filepath.Join(homeDir, "go", "bin", "cliphist")
targetPath := "/usr/local/bin/cliphist"
progressChan <- InstallProgressMsg{
Phase: PhaseSystemPackages,
Progress: 0.7,
Step: "Installing cliphist binary to system...",
IsComplete: false,
NeedsSudo: true,
CommandInfo: fmt.Sprintf("sudo cp %s %s", sourcePath, targetPath),
}
copyCmd := exec.CommandContext(ctx, "sudo", "-S", "cp", sourcePath, targetPath)
copyCmd.Stdin = strings.NewReader(sudoPassword + "\n")
if err := copyCmd.Run(); err != nil {
return fmt.Errorf("failed to copy cliphist to /usr/local/bin: %w", err)
}
// Make it executable
chmodCmd := exec.CommandContext(ctx, "sudo", "-S", "chmod", "+x", targetPath)
chmodCmd.Stdin = strings.NewReader(sudoPassword + "\n")
if err := chmodCmd.Run(); err != nil {
return fmt.Errorf("failed to make cliphist executable: %w", err)
}
m.log("cliphist installed successfully from source")
return nil
}
func (m *ManualPackageInstaller) installXwaylandSatellite(ctx context.Context, sudoPassword string, progressChan chan<- InstallProgressMsg) error {
m.log("Installing xwayland-satellite from source...")

View File

@@ -78,26 +78,14 @@ func (o *OpenSUSEDistribution) DetectDependenciesWithTerminal(ctx context.Contex
dependencies = append(dependencies, o.detectXwaylandSatellite())
}
// Base detections (common across distros)
dependencies = append(dependencies, o.detectMatugen())
dependencies = append(dependencies, o.detectDgop())
dependencies = append(dependencies, o.detectClipboardTools()...)
return dependencies, nil
}
func (o *OpenSUSEDistribution) detectXDGPortal() deps.Dependency {
status := deps.StatusMissing
if o.packageInstalled("xdg-desktop-portal-gtk") {
status = deps.StatusInstalled
}
return deps.Dependency{
Name: "xdg-desktop-portal-gtk",
Status: status,
Description: "Desktop integration portal for GTK",
Required: true,
}
return o.detectPackage("xdg-desktop-portal-gtk", "Desktop integration portal for GTK", o.packageInstalled("xdg-desktop-portal-gtk"))
}
func (o *OpenSUSEDistribution) packageInstalled(pkg string) bool {
@@ -117,10 +105,8 @@ func (o *OpenSUSEDistribution) GetPackageMappingWithVariants(wm deps.WindowManag
"ghostty": {Name: "ghostty", Repository: RepoTypeSystem},
"kitty": {Name: "kitty", Repository: RepoTypeSystem},
"alacritty": {Name: "alacritty", Repository: RepoTypeSystem},
"wl-clipboard": {Name: "wl-clipboard", Repository: RepoTypeSystem},
"xdg-desktop-portal-gtk": {Name: "xdg-desktop-portal-gtk", Repository: RepoTypeSystem},
"accountsservice": {Name: "accountsservice", Repository: RepoTypeSystem},
"cliphist": {Name: "cliphist", Repository: RepoTypeSystem},
// DMS packages from OBS
"dms (DankMaterialShell)": o.getDmsMapping(variants["dms (DankMaterialShell)"]),
@@ -377,7 +363,11 @@ func (o *OpenSUSEDistribution) InstallPackages(ctx context.Context, dependencies
o.log(fmt.Sprintf("Warning: failed to write environment config: %v", err))
}
if err := o.EnableDMSService(ctx); err != nil {
if err := o.WriteWindowManagerConfig(wm); err != nil {
o.log(fmt.Sprintf("Warning: failed to write window manager config: %v", err))
}
if err := o.EnableDMSService(ctx, wm); err != nil {
o.log(fmt.Sprintf("Warning: failed to enable dms service: %v", err))
}
@@ -472,7 +462,7 @@ func (o *OpenSUSEDistribution) enableOBSRepos(ctx context.Context, obsPkgs []Pac
cmd := ExecSudoCommand(ctx, sudoPassword,
fmt.Sprintf("zypper addrepo -f %s", repoURL))
if err := o.runWithProgress(cmd, progressChan, PhaseSystemPackages, 0.20, 0.22); err != nil {
return fmt.Errorf("failed to enable OBS repo %s: %w", pkg.RepoURL, err)
o.log(fmt.Sprintf("OBS repo %s add failed (may already exist): %v", pkg.RepoURL, err))
}
enabledRepos[pkg.RepoURL] = true

View File

@@ -76,54 +76,22 @@ func (u *UbuntuDistribution) DetectDependenciesWithTerminal(ctx context.Context,
dependencies = append(dependencies, u.detectXwaylandSatellite())
}
// Base detections (common across distros)
dependencies = append(dependencies, u.detectMatugen())
dependencies = append(dependencies, u.detectDgop())
dependencies = append(dependencies, u.detectClipboardTools()...)
return dependencies, nil
}
func (u *UbuntuDistribution) detectXDGPortal() deps.Dependency {
status := deps.StatusMissing
if u.packageInstalled("xdg-desktop-portal-gtk") {
status = deps.StatusInstalled
}
return deps.Dependency{
Name: "xdg-desktop-portal-gtk",
Status: status,
Description: "Desktop integration portal for GTK",
Required: true,
}
return u.detectPackage("xdg-desktop-portal-gtk", "Desktop integration portal for GTK", u.packageInstalled("xdg-desktop-portal-gtk"))
}
func (u *UbuntuDistribution) detectXwaylandSatellite() deps.Dependency {
status := deps.StatusMissing
if u.commandExists("xwayland-satellite") {
status = deps.StatusInstalled
}
return deps.Dependency{
Name: "xwayland-satellite",
Status: status,
Description: "Xwayland support",
Required: true,
}
return u.detectCommand("xwayland-satellite", "Xwayland support")
}
func (u *UbuntuDistribution) detectAccountsService() deps.Dependency {
status := deps.StatusMissing
if u.packageInstalled("accountsservice") {
status = deps.StatusInstalled
}
return deps.Dependency{
Name: "accountsservice",
Status: status,
Description: "D-Bus interface for user account query and manipulation",
Required: true,
}
return u.detectPackage("accountsservice", "D-Bus interface for user account query and manipulation", u.packageInstalled("accountsservice"))
}
func (u *UbuntuDistribution) packageInstalled(pkg string) bool {
@@ -142,7 +110,6 @@ func (u *UbuntuDistribution) GetPackageMappingWithVariants(wm deps.WindowManager
"git": {Name: "git", Repository: RepoTypeSystem},
"kitty": {Name: "kitty", Repository: RepoTypeSystem},
"alacritty": {Name: "alacritty", Repository: RepoTypeSystem},
"wl-clipboard": {Name: "wl-clipboard", Repository: RepoTypeSystem},
"xdg-desktop-portal-gtk": {Name: "xdg-desktop-portal-gtk", Repository: RepoTypeSystem},
"accountsservice": {Name: "accountsservice", Repository: RepoTypeSystem},
@@ -151,7 +118,6 @@ func (u *UbuntuDistribution) GetPackageMappingWithVariants(wm deps.WindowManager
"quickshell": u.getQuickshellMapping(variants["quickshell"]),
"matugen": {Name: "matugen", Repository: RepoTypePPA, RepoURL: "ppa:avengemedia/danklinux"},
"dgop": {Name: "dgop", Repository: RepoTypePPA, RepoURL: "ppa:avengemedia/danklinux"},
"cliphist": {Name: "cliphist", Repository: RepoTypePPA, RepoURL: "ppa:avengemedia/danklinux"},
"ghostty": {Name: "ghostty", Repository: RepoTypePPA, RepoURL: "ppa:avengemedia/danklinux"},
}
@@ -357,7 +323,11 @@ func (u *UbuntuDistribution) InstallPackages(ctx context.Context, dependencies [
u.log(fmt.Sprintf("Warning: failed to write environment config: %v", err))
}
if err := u.EnableDMSService(ctx); err != nil {
if err := u.WriteWindowManagerConfig(wm); err != nil {
u.log(fmt.Sprintf("Warning: failed to write window manager config: %v", err))
}
if err := u.EnableDMSService(ctx, wm); err != nil {
u.log(fmt.Sprintf("Warning: failed to enable dms service: %v", err))
}
@@ -565,8 +535,6 @@ func (u *UbuntuDistribution) installBuildDependencies(ctx context.Context, manua
buildDeps["libpam0g-dev"] = true
case "matugen":
buildDeps["curl"] = true
case "cliphist":
// Go will be installed separately with PPA
}
}
@@ -576,7 +544,7 @@ func (u *UbuntuDistribution) installBuildDependencies(ctx context.Context, manua
if err := u.installRust(ctx, sudoPassword, progressChan); err != nil {
return fmt.Errorf("failed to install Rust: %w", err)
}
case "cliphist", "dgop":
case "dgop":
if err := u.installGo(ctx, sudoPassword, progressChan); err != nil {
return fmt.Errorf("failed to install Go: %w", err)
}

View File

@@ -286,6 +286,13 @@ func (m Model) Update(msg tea.Msg) (tea.Model, tea.Cmd) {
return m, loadInstalledPlugins
}
return m, nil
case pluginUpdatedMsg:
if msg.err != nil {
m.installedPluginsError = msg.err.Error()
} else {
m.installedPluginsError = ""
}
return m, nil
case pluginInstalledMsg:
if msg.err != nil {
m.pluginsError = msg.err.Error()

View File

@@ -75,14 +75,13 @@ type MenuItem struct {
func NewModel(version string) Model {
detector, _ := NewDetector()
dependencies := detector.GetInstalledComponents()
// Use the proper detection method for both window managers
hyprlandInstalled, niriInstalled, err := detector.GetWindowManagerStatus()
if err != nil {
// Fallback to false if detection fails
hyprlandInstalled = false
niriInstalled = false
var dependencies []DependencyInfo
var hyprlandInstalled, niriInstalled bool
if detector != nil {
dependencies = detector.GetInstalledComponents()
hyprlandInstalled, niriInstalled, _ = detector.GetWindowManagerStatus()
}
m := Model{
@@ -201,6 +200,13 @@ func (m Model) Update(msg tea.Msg) (tea.Model, tea.Cmd) {
return m, loadInstalledPlugins
}
return m, nil
case pluginUpdatedMsg:
if msg.err != nil {
m.installedPluginsError = msg.err.Error()
} else {
m.installedPluginsError = ""
}
return m, nil
case pluginInstalledMsg:
if msg.err != nil {
m.pluginsError = msg.err.Error()

View File

@@ -227,6 +227,11 @@ func (m Model) updatePluginInstalledDetail(msg tea.KeyMsg) (tea.Model, tea.Cmd)
plugin := m.installedPluginsList[m.selectedInstalledIndex]
return m, uninstallPlugin(plugin)
}
case "p":
if m.selectedInstalledIndex < len(m.installedPluginsList) {
plugin := m.installedPluginsList[m.selectedInstalledIndex]
return m, updatePlugin(plugin)
}
}
return m, nil
}
@@ -246,6 +251,11 @@ type pluginInstalledMsg struct {
err error
}
type pluginUpdatedMsg struct {
pluginName string
err error
}
func loadInstalledPlugins() tea.Msg {
manager, err := plugins.NewManager()
if err != nil {
@@ -337,3 +347,31 @@ func uninstallPlugin(plugin pluginInfo) tea.Cmd {
return pluginUninstalledMsg{pluginName: plugin.Name}
}
}
func updatePlugin(plugin pluginInfo) tea.Cmd {
return func() tea.Msg {
manager, err := plugins.NewManager()
if err != nil {
return pluginUpdatedMsg{pluginName: plugin.Name, err: err}
}
p := plugins.Plugin{
ID: plugin.ID,
Name: plugin.Name,
Category: plugin.Category,
Author: plugin.Author,
Description: plugin.Description,
Repo: plugin.Repo,
Path: plugin.Path,
Capabilities: plugin.Capabilities,
Compositors: plugin.Compositors,
Dependencies: plugin.Dependencies,
}
if err := manager.Update(p); err != nil {
return pluginUpdatedMsg{pluginName: plugin.Name, err: err}
}
return pluginUpdatedMsg{pluginName: plugin.Name}
}
}
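Pressing "p" feeds straight into Bubble Tea's command/message loop: updatePlugin does the work off the UI goroutine and reports the outcome as a pluginUpdatedMsg, which Update folds into installedPluginsError. A stripped-down sketch of that round trip, written as a hypothetical standalone package rather than the shell's actual model:

package tuisketch // hypothetical package, for illustration only

import tea "github.com/charmbracelet/bubbletea"

type pluginUpdatedMsg struct {
	pluginName string
	err        error
}

// updateCmd runs doUpdate outside the UI loop and hands the result back to
// Update as a message, mirroring the shape of the updatePlugin command above.
func updateCmd(name string, doUpdate func() error) tea.Cmd {
	return func() tea.Msg {
		return pluginUpdatedMsg{pluginName: name, err: doUpdate()}
	}
}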

View File

@@ -518,7 +518,7 @@ func (m Model) categorizeDependencies() map[string][]DependencyInfo {
categories["Hyprland Components"] = append(categories["Hyprland Components"], dep)
case "niri":
categories["Niri Components"] = append(categories["Niri Components"], dep)
case "kitty", "alacritty", "ghostty", "hyprpicker":
case "kitty", "alacritty", "ghostty":
categories["Shared Components"] = append(categories["Shared Components"], dep)
default:
categories["Shared Components"] = append(categories["Shared Components"], dep)

View File

@@ -11,6 +11,7 @@ import (
"github.com/AvengeMedia/DankMaterialShell/core/internal/config"
"github.com/AvengeMedia/DankMaterialShell/core/internal/distros"
"github.com/AvengeMedia/DankMaterialShell/core/internal/utils"
)
// DetectDMSPath checks for DMS installation following XDG Base Directory specification
@@ -22,10 +23,10 @@ func DetectDMSPath() (string, error) {
func DetectCompositors() []string {
var compositors []string
if commandExists("niri") {
if utils.CommandExists("niri") {
compositors = append(compositors, "niri")
}
if commandExists("Hyprland") {
if utils.CommandExists("Hyprland") {
compositors = append(compositors, "Hyprland")
}
@@ -62,7 +63,7 @@ func PromptCompositorChoice(compositors []string) (string, error) {
// EnsureGreetdInstalled checks if greetd is installed and installs it if not
func EnsureGreetdInstalled(logFunc func(string), sudoPassword string) error {
if commandExists("greetd") {
if utils.CommandExists("greetd") {
logFunc("✓ greetd is already installed")
return nil
}
@@ -144,7 +145,7 @@ func EnsureGreetdInstalled(logFunc func(string), sudoPassword string) error {
// CopyGreeterFiles installs the dms-greeter wrapper and sets up cache directory
func CopyGreeterFiles(dmsPath, compositor string, logFunc func(string), sudoPassword string) error {
// Check if dms-greeter is already in PATH
if commandExists("dms-greeter") {
if utils.CommandExists("dms-greeter") {
logFunc("✓ dms-greeter wrapper already installed")
} else {
// Install the wrapper script
@@ -204,7 +205,7 @@ func CopyGreeterFiles(dmsPath, compositor string, logFunc func(string), sudoPass
// SetupParentDirectoryACLs sets ACLs on parent directories to allow traversal
func SetupParentDirectoryACLs(logFunc func(string), sudoPassword string) error {
if !commandExists("setfacl") {
if !utils.CommandExists("setfacl") {
logFunc("⚠ Warning: setfacl command not found. ACL support may not be available on this filesystem.")
logFunc(" If theme sync doesn't work, you may need to install acl package:")
logFunc(" - Fedora/RHEL: sudo dnf install acl")
@@ -419,7 +420,7 @@ user = "greeter"
// Determine wrapper command path
wrapperCmd := "dms-greeter"
if !commandExists("dms-greeter") {
if !utils.CommandExists("dms-greeter") {
wrapperCmd = "/usr/local/bin/dms-greeter"
}
@@ -486,8 +487,3 @@ func runSudoCmd(sudoPassword string, command string, args ...string) error {
cmd.Stderr = os.Stderr
return cmd.Run()
}
func commandExists(cmd string) bool {
_, err := exec.LookPath(cmd)
return err == nil
}
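The per-file commandExists helpers deleted here (and again in the matugen runner further down) are replaced by utils.CommandExists. A minimal sketch of the shared helper, assuming it simply mirrors the removed implementation inside core/internal/utils:

package utils

import "os/exec"

// CommandExists reports whether cmd resolves to an executable on PATH.
func CommandExists(cmd string) bool {
	_, err := exec.LookPath(cmd)
	return err == nil
}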

View File

@@ -5,6 +5,8 @@ import (
"os"
"path/filepath"
"strings"
"github.com/AvengeMedia/DankMaterialShell/core/internal/utils"
)
type DiscoveryConfig struct {
@@ -14,15 +16,9 @@ type DiscoveryConfig struct {
func DefaultDiscoveryConfig() *DiscoveryConfig {
var searchPaths []string
configHome := os.Getenv("XDG_CONFIG_HOME")
if configHome == "" {
if homeDir, err := os.UserHomeDir(); err == nil {
configHome = filepath.Join(homeDir, ".config")
}
}
if configHome != "" {
searchPaths = append(searchPaths, filepath.Join(configHome, "DankMaterialShell", "cheatsheets"))
configDir, err := os.UserConfigDir()
if err == nil && configDir != "" {
searchPaths = append(searchPaths, filepath.Join(configDir, "DankMaterialShell", "cheatsheets"))
}
configDirs := os.Getenv("XDG_CONFIG_DIRS")
@@ -43,7 +39,7 @@ func (d *DiscoveryConfig) FindJSONFiles() ([]string, error) {
var files []string
for _, searchPath := range d.SearchPaths {
expandedPath, err := expandPath(searchPath)
expandedPath, err := utils.ExpandPath(searchPath)
if err != nil {
continue
}
@@ -74,20 +70,6 @@ func (d *DiscoveryConfig) FindJSONFiles() ([]string, error) {
return files, nil
}
func expandPath(path string) (string, error) {
expandedPath := os.ExpandEnv(path)
if strings.HasPrefix(expandedPath, "~") {
home, err := os.UserHomeDir()
if err != nil {
return "", err
}
expandedPath = filepath.Join(home, expandedPath[1:])
}
return filepath.Clean(expandedPath), nil
}
type JSONProviderFactory func(filePath string) (Provider, error)
var jsonProviderFactory JSONProviderFactory
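The local expandPath helpers removed from the cheatsheet discovery code and the JSON provider are likewise consolidated into utils.ExpandPath, and the same helper replaces the inline expansion logic in the Hyprland, MangoWC, and Sway parsers below. A sketch consistent with the deleted implementations (env expansion, leading-~ expansion, then filepath.Clean); the exported version in core/internal/utils may differ in detail:

package utils

import (
	"os"
	"path/filepath"
	"strings"
)

// ExpandPath expands environment variables and a leading ~, then cleans the path.
func ExpandPath(path string) (string, error) {
	expanded := os.ExpandEnv(path)
	if strings.HasPrefix(expanded, "~") {
		home, err := os.UserHomeDir()
		if err != nil {
			return "", err
		}
		expanded = filepath.Join(home, expanded[1:])
	}
	return filepath.Clean(expanded), nil
}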

View File

@@ -4,6 +4,8 @@ import (
"os"
"path/filepath"
"testing"
"github.com/AvengeMedia/DankMaterialShell/core/internal/utils"
)
func TestDefaultDiscoveryConfig(t *testing.T) {
@@ -272,13 +274,13 @@ func TestExpandPathInDiscovery(t *testing.T) {
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
result, err := expandPath(tt.input)
result, err := utils.ExpandPath(tt.input)
if err != nil {
t.Fatalf("expandPath failed: %v", err)
}
if result != tt.expected {
t.Errorf("expandPath(%q) = %q, want %q", tt.input, result, tt.expected)
t.Errorf("utils.ExpandPath(%q) = %q, want %q", tt.input, result, tt.expected)
}
})
}

View File

@@ -5,6 +5,8 @@ import (
"path/filepath"
"regexp"
"strings"
"github.com/AvengeMedia/DankMaterialShell/core/internal/utils"
)
const (
@@ -42,14 +44,9 @@ func NewHyprlandParser() *HyprlandParser {
}
func (p *HyprlandParser) ReadContent(directory string) error {
expandedDir := os.ExpandEnv(directory)
expandedDir = filepath.Clean(expandedDir)
if strings.HasPrefix(expandedDir, "~") {
home, err := os.UserHomeDir()
if err != nil {
return err
}
expandedDir = filepath.Join(home, expandedDir[1:])
expandedDir, err := utils.ExpandPath(directory)
if err != nil {
return err
}
info, err := os.Stat(expandedDir)

View File

@@ -5,9 +5,9 @@ import (
"fmt"
"os"
"path/filepath"
"strings"
"github.com/AvengeMedia/DankMaterialShell/core/internal/keybinds"
"github.com/AvengeMedia/DankMaterialShell/core/internal/utils"
)
type JSONFileProvider struct {
@@ -20,7 +20,7 @@ func NewJSONFileProvider(filePath string) (*JSONFileProvider, error) {
return nil, fmt.Errorf("file path cannot be empty")
}
expandedPath, err := expandPath(filePath)
expandedPath, err := utils.ExpandPath(filePath)
if err != nil {
return nil, fmt.Errorf("failed to expand path: %w", err)
}
@@ -117,17 +117,3 @@ func (j *JSONFileProvider) GetCheatSheet() (*keybinds.CheatSheet, error) {
Binds: categorizedBinds,
}, nil
}
func expandPath(path string) (string, error) {
expandedPath := os.ExpandEnv(path)
if strings.HasPrefix(expandedPath, "~") {
home, err := os.UserHomeDir()
if err != nil {
return "", err
}
expandedPath = filepath.Join(home, expandedPath[1:])
}
return filepath.Clean(expandedPath), nil
}

View File

@@ -4,6 +4,8 @@ import (
"os"
"path/filepath"
"testing"
"github.com/AvengeMedia/DankMaterialShell/core/internal/utils"
)
func TestNewJSONFileProvider(t *testing.T) {
@@ -266,13 +268,13 @@ func TestExpandPath(t *testing.T) {
for _, tt := range tests {
t.Run(tt.name, func(t *testing.T) {
result, err := expandPath(tt.input)
result, err := utils.ExpandPath(tt.input)
if err != nil {
t.Fatalf("expandPath failed: %v", err)
}
if result != tt.expected {
t.Errorf("expandPath(%q) = %q, want %q", tt.input, result, tt.expected)
t.Errorf("utils.ExpandPath(%q) = %q, want %q", tt.input, result, tt.expected)
}
})
}

View File

@@ -5,6 +5,8 @@ import (
"path/filepath"
"regexp"
"strings"
"github.com/AvengeMedia/DankMaterialShell/core/internal/utils"
)
const (
@@ -34,14 +36,9 @@ func NewMangoWCParser() *MangoWCParser {
}
func (p *MangoWCParser) ReadContent(path string) error {
expandedPath := os.ExpandEnv(path)
expandedPath = filepath.Clean(expandedPath)
if strings.HasPrefix(expandedPath, "~") {
home, err := os.UserHomeDir()
if err != nil {
return err
}
expandedPath = filepath.Join(home, expandedPath[1:])
expandedPath, err := utils.ExpandPath(path)
if err != nil {
return err
}
info, err := os.Stat(expandedPath)

View File

@@ -6,6 +6,7 @@ import (
"os/exec"
"path/filepath"
"sort"
"strconv"
"strings"
"github.com/AvengeMedia/DankMaterialShell/core/internal/keybinds"
@@ -29,15 +30,11 @@ func NewNiriProvider(configDir string) *NiriProvider {
}
func defaultNiriConfigDir() string {
if configHome := os.Getenv("XDG_CONFIG_HOME"); configHome != "" {
return filepath.Join(configHome, "niri")
}
home, err := os.UserHomeDir()
configDir, err := os.UserConfigDir()
if err != nil {
return ""
}
return filepath.Join(home, ".config", "niri")
return filepath.Join(configDir, "niri")
}
func (n *NiriProvider) Name() string {
@@ -154,11 +151,13 @@ func (n *NiriProvider) convertKeybind(kb *NiriKeyBinding, subcategory string, co
}
bind := keybinds.Keybind{
Key: keyStr,
Description: kb.Description,
Action: rawAction,
Subcategory: subcategory,
Source: source,
Key: keyStr,
Description: kb.Description,
Action: rawAction,
Subcategory: subcategory,
Source: source,
HideOnOverlay: kb.HideOnOverlay,
CooldownMs: kb.CooldownMs,
}
if source == "dms" && conflicts != nil {
@@ -316,7 +315,9 @@ func (n *NiriProvider) extractOptions(node *document.Node) map[string]any {
opts["repeat"] = val.String() == "true"
}
if val, ok := node.Properties.Get("cooldown-ms"); ok {
opts["cooldown-ms"] = val.String()
if ms, err := strconv.Atoi(val.String()); err == nil {
opts["cooldown-ms"] = ms
}
}
if val, ok := node.Properties.Get("allow-when-locked"); ok {
opts["allow-when-locked"] = val.String() == "true"
@@ -342,7 +343,14 @@ func (n *NiriProvider) buildBindNode(bind *overrideBind) *document.Node {
node.AddProperty("repeat", false, "")
}
if v, ok := bind.Options["cooldown-ms"]; ok {
node.AddProperty("cooldown-ms", v, "")
switch val := v.(type) {
case int:
node.AddProperty("cooldown-ms", val, "")
case string:
if ms, err := strconv.Atoi(val); err == nil {
node.AddProperty("cooldown-ms", ms, "")
}
}
}
if v, ok := bind.Options["allow-when-locked"]; ok && v == true {
node.AddProperty("allow-when-locked", true, "")
@@ -453,9 +461,16 @@ func (n *NiriProvider) getBindSortPriority(action string) int {
}
}
const dmsWarningHeader = `// ! DO NOT EDIT !
// ! AUTO-GENERATED BY DMS !
// ! CHANGES WILL BE OVERWRITTEN !
// ! PLACE YOUR CUSTOM CONFIGURATION ELSEWHERE !
`
func (n *NiriProvider) generateBindsContent(binds map[string]*overrideBind) string {
if len(binds) == 0 {
return "binds {}\n"
return dmsWarningHeader + "binds {}\n"
}
var regularBinds, recentWindowsBinds []*overrideBind
@@ -482,6 +497,7 @@ func (n *NiriProvider) generateBindsContent(binds map[string]*overrideBind) stri
var sb strings.Builder
sb.WriteString(dmsWarningHeader)
sb.WriteString("binds {\n")
for _, bind := range regularBinds {
n.writeBindNode(&sb, bind, " ")

View File

@@ -4,6 +4,7 @@ import (
"fmt"
"os"
"path/filepath"
"strconv"
"strings"
"github.com/sblinch/kdl-go"
@@ -11,12 +12,14 @@ import (
)
type NiriKeyBinding struct {
Mods []string
Key string
Action string
Args []string
Description string
Source string
Mods []string
Key string
Action string
Args []string
Description string
HideOnOverlay bool
CooldownMs int
Source string
}
type NiriSection struct {
@@ -273,19 +276,31 @@ func (p *NiriParser) parseKeybindNode(node *document.Node, _ string) *NiriKeyBin
}
var description string
var hideOnOverlay bool
var cooldownMs int
if node.Properties != nil {
if val, ok := node.Properties.Get("hotkey-overlay-title"); ok {
description = val.ValueString()
switch val.ValueString() {
case "null", "":
hideOnOverlay = true
default:
description = val.ValueString()
}
}
if val, ok := node.Properties.Get("cooldown-ms"); ok {
cooldownMs, _ = strconv.Atoi(val.String())
}
}
return &NiriKeyBinding{
Mods: mods,
Key: key,
Action: action,
Args: args,
Description: description,
Source: p.currentSource,
Mods: mods,
Key: key,
Action: action,
Args: args,
Description: description,
HideOnOverlay: hideOnOverlay,
CooldownMs: cooldownMs,
Source: p.currentSource,
}
}

View File

@@ -6,6 +6,13 @@ import (
"testing"
)
const testHeader = `// ! DO NOT EDIT !
// ! AUTO-GENERATED BY DMS !
// ! CHANGES WILL BE OVERWRITTEN !
// ! PLACE YOUR CUSTOM CONFIGURATION ELSEWHERE !
`
func TestNiriProviderName(t *testing.T) {
provider := NewNiriProvider("")
if provider.Name() != "niri" {
@@ -197,7 +204,7 @@ func TestNiriGenerateBindsContent(t *testing.T) {
{
name: "empty binds",
binds: map[string]*overrideBind{},
expected: "binds {}\n",
expected: testHeader + "binds {}\n",
},
{
name: "simple spawn bind",
@@ -208,7 +215,7 @@ func TestNiriGenerateBindsContent(t *testing.T) {
Description: "Open Terminal",
},
},
expected: `binds {
expected: testHeader + `binds {
Mod+T hotkey-overlay-title="Open Terminal" { spawn "kitty"; }
}
`,
@@ -222,7 +229,7 @@ func TestNiriGenerateBindsContent(t *testing.T) {
Description: "Application Launcher",
},
},
expected: `binds {
expected: testHeader + `binds {
Mod+Space hotkey-overlay-title="Application Launcher" { spawn "dms" "ipc" "call" "spotlight" "toggle"; }
}
`,
@@ -236,7 +243,7 @@ func TestNiriGenerateBindsContent(t *testing.T) {
Options: map[string]any{"allow-when-locked": true},
},
},
expected: `binds {
expected: testHeader + `binds {
XF86AudioMute allow-when-locked=true { spawn "dms" "ipc" "call" "audio" "mute"; }
}
`,
@@ -250,7 +257,7 @@ func TestNiriGenerateBindsContent(t *testing.T) {
Description: "Close Window",
},
},
expected: `binds {
expected: testHeader + `binds {
Mod+Q hotkey-overlay-title="Close Window" { close-window; }
}
`,
@@ -263,7 +270,7 @@ func TestNiriGenerateBindsContent(t *testing.T) {
Action: "next-window",
},
},
expected: `binds {
expected: testHeader + `binds {
}
recent-windows {
@@ -415,7 +422,7 @@ func TestNiriGenerateBindsContentNumericArgs(t *testing.T) {
Description: "Focus Workspace 1",
},
},
expected: `binds {
expected: testHeader + `binds {
Mod+1 hotkey-overlay-title="Focus Workspace 1" { focus-workspace 1; }
}
`,
@@ -429,7 +436,7 @@ func TestNiriGenerateBindsContentNumericArgs(t *testing.T) {
Description: "Focus Workspace 10",
},
},
expected: `binds {
expected: testHeader + `binds {
Mod+0 hotkey-overlay-title="Focus Workspace 10" { focus-workspace 10; }
}
`,
@@ -443,7 +450,7 @@ func TestNiriGenerateBindsContentNumericArgs(t *testing.T) {
Description: "Adjust Column Width -10%",
},
},
expected: `binds {
expected: testHeader + `binds {
Super+Minus hotkey-overlay-title="Adjust Column Width -10%" { set-column-width "-10%"; }
}
`,
@@ -457,7 +464,7 @@ func TestNiriGenerateBindsContentNumericArgs(t *testing.T) {
Description: "Adjust Column Width +10%",
},
},
expected: `binds {
expected: testHeader + `binds {
Super+Equal hotkey-overlay-title="Adjust Column Width +10%" { set-column-width "+10%"; }
}
`,
@@ -486,7 +493,7 @@ func TestNiriGenerateActionWithUnquotedPercentArg(t *testing.T) {
}
content := provider.generateBindsContent(binds)
expected := `binds {
expected := testHeader + `binds {
Super+Equal hotkey-overlay-title="Adjust Window Height +10%" { set-window-height "+10%"; }
}
`
@@ -507,7 +514,7 @@ func TestNiriGenerateSpawnWithNumericArgs(t *testing.T) {
}
content := provider.generateBindsContent(binds)
expected := `binds {
expected := testHeader + `binds {
XF86AudioLowerVolume allow-when-locked=true { spawn "dms" "ipc" "call" "audio" "decrement" "3"; }
}
`
@@ -528,7 +535,7 @@ func TestNiriGenerateSpawnNumericArgFromCLI(t *testing.T) {
}
content := provider.generateBindsContent(binds)
expected := `binds {
expected := testHeader + `binds {
XF86AudioLowerVolume allow-when-locked=true { spawn "dms" "ipc" "call" "audio" "decrement" "3"; }
}
`

View File

@@ -2,6 +2,7 @@ package providers
import (
"fmt"
"os"
"strings"
"github.com/AvengeMedia/DankMaterialShell/core/internal/keybinds"
@@ -9,18 +10,42 @@ import (
type SwayProvider struct {
configPath string
isScroll bool
}
func NewSwayProvider(configPath string) *SwayProvider {
isScroll := false
_, scrollEnvSet := os.LookupEnv("SCROLLSOCK")
if configPath == "" {
configPath = "$HOME/.config/sway"
if scrollEnvSet {
configPath = "$HOME/.config/scroll"
isScroll = true
} else {
configPath = "$HOME/.config/sway"
}
} else {
// Determine isScroll based on the provided config path
isScroll = strings.Contains(configPath, "scroll")
}
return &SwayProvider{
configPath: configPath,
isScroll: isScroll,
}
}
func (s *SwayProvider) Name() string {
if s != nil && s.isScroll {
return "scroll"
}
if s == nil {
_, ok := os.LookupEnv("SCROLLSOCK")
if ok {
return "scroll"
}
}
return "sway"
}
@@ -33,8 +58,13 @@ func (s *SwayProvider) GetCheatSheet() (*keybinds.CheatSheet, error) {
categorizedBinds := make(map[string][]keybinds.Keybind)
s.convertSection(section, "", categorizedBinds)
cheatSheetTitle := "Sway Keybinds"
if s != nil && s.isScroll {
cheatSheetTitle = "Scroll Keybinds"
}
return &keybinds.CheatSheet{
Title: "Sway Keybinds",
Title: cheatSheetTitle,
Provider: s.Name(),
Binds: categorizedBinds,
}, nil
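Scroll detection keys off the SCROLLSOCK environment variable, or a "scroll" substring when an explicit config path is supplied. A minimal test sketch, assuming it sits in the same providers package as the code above (the socket path is illustrative):

package providers

import "testing"

func TestNewSwayProviderDetectsScroll(t *testing.T) {
	t.Setenv("SCROLLSOCK", "/run/user/1000/scroll.sock") // illustrative value
	p := NewSwayProvider("")
	if p.Name() != "scroll" {
		t.Fatalf("expected scroll provider, got %q", p.Name())
	}
	if p.configPath != "$HOME/.config/scroll" {
		t.Fatalf("unexpected config path %q", p.configPath)
	}
}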

View File

@@ -5,6 +5,8 @@ import (
"path/filepath"
"regexp"
"strings"
"github.com/AvengeMedia/DankMaterialShell/core/internal/utils"
)
const (
@@ -42,14 +44,9 @@ func NewSwayParser() *SwayParser {
}
func (p *SwayParser) ReadContent(path string) error {
expandedPath := os.ExpandEnv(path)
expandedPath = filepath.Clean(expandedPath)
if strings.HasPrefix(expandedPath, "~") {
home, err := os.UserHomeDir()
if err != nil {
return err
}
expandedPath = filepath.Join(home, expandedPath[1:])
expandedPath, err := utils.ExpandPath(path)
if err != nil {
return err
}
info, err := os.Stat(expandedPath)

View File

@@ -1,12 +1,14 @@
package keybinds
type Keybind struct {
Key string `json:"key"`
Description string `json:"desc"`
Action string `json:"action,omitempty"`
Subcategory string `json:"subcat,omitempty"`
Source string `json:"source,omitempty"`
Conflict *Keybind `json:"conflict,omitempty"`
Key string `json:"key"`
Description string `json:"desc"`
Action string `json:"action,omitempty"`
Subcategory string `json:"subcat,omitempty"`
Source string `json:"source,omitempty"`
HideOnOverlay bool `json:"hideOnOverlay,omitempty"`
CooldownMs int `json:"cooldownMs,omitempty"`
Conflict *Keybind `json:"conflict,omitempty"`
}
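With the two new optional fields, a keybind that carries a cooldown serializes as shown below; zero-valued fields (action, subcat, source, hideOnOverlay, conflict) are dropped by omitempty. A hypothetical snippet, assuming it lives somewhere inside the core module since the keybinds package is internal:

package main

import (
	"encoding/json"
	"fmt"

	"github.com/AvengeMedia/DankMaterialShell/core/internal/keybinds"
)

func main() {
	kb := keybinds.Keybind{Key: "Mod+Q", Description: "Close Window", CooldownMs: 250}
	out, _ := json.Marshal(kb)
	fmt.Println(string(out)) // {"key":"Mod+Q","desc":"Close Window","cooldownMs":250}
}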
type DMSBindsStatus struct {

View File

@@ -13,6 +13,7 @@ import (
"github.com/AvengeMedia/DankMaterialShell/core/internal/dank16"
"github.com/AvengeMedia/DankMaterialShell/core/internal/log"
"github.com/AvengeMedia/DankMaterialShell/core/internal/utils"
)
var (
@@ -33,6 +34,7 @@ type Options struct {
StockColors string
SyncModeWithPortal bool
TerminalsAlwaysDark bool
SkipTemplates string
}
type ColorsOutput struct {
@@ -46,6 +48,18 @@ func (o *Options) ColorsOutput() string {
return filepath.Join(o.StateDir, "dms-colors.json")
}
func (o *Options) ShouldSkipTemplate(name string) bool {
if o.SkipTemplates == "" {
return false
}
for _, skip := range strings.Split(o.SkipTemplates, ",") {
if strings.TrimSpace(skip) == name {
return true
}
}
return false
}
func Run(opts Options) error {
if opts.StateDir == "" {
return fmt.Errorf("state-dir is required")
@@ -217,34 +231,66 @@ output_path = '%s'
`, opts.ShellDir, opts.ColorsOutput())
switch opts.Mode {
case "light":
appendConfig(opts, cfgFile, "skip", "gtk3-light.toml")
default:
appendConfig(opts, cfgFile, "skip", "gtk3-dark.toml")
if !opts.ShouldSkipTemplate("gtk") {
switch opts.Mode {
case "light":
appendConfig(opts, cfgFile, "skip", "gtk3-light.toml")
default:
appendConfig(opts, cfgFile, "skip", "gtk3-dark.toml")
}
}
appendConfig(opts, cfgFile, "niri", "niri.toml")
appendConfig(opts, cfgFile, "qt5ct", "qt5ct.toml")
appendConfig(opts, cfgFile, "qt6ct", "qt6ct.toml")
appendConfig(opts, cfgFile, "firefox", "firefox.toml")
appendConfig(opts, cfgFile, "pywalfox", "pywalfox.toml")
appendConfig(opts, cfgFile, "vesktop", "vesktop.toml")
if !opts.ShouldSkipTemplate("niri") {
appendConfig(opts, cfgFile, "niri", "niri.toml")
}
if !opts.ShouldSkipTemplate("qt5ct") {
appendConfig(opts, cfgFile, "qt5ct", "qt5ct.toml")
}
if !opts.ShouldSkipTemplate("qt6ct") {
appendConfig(opts, cfgFile, "qt6ct", "qt6ct.toml")
}
if !opts.ShouldSkipTemplate("firefox") {
appendConfig(opts, cfgFile, "firefox", "firefox.toml")
}
if !opts.ShouldSkipTemplate("pywalfox") {
appendConfig(opts, cfgFile, "pywalfox", "pywalfox.toml")
}
if !opts.ShouldSkipTemplate("vesktop") {
appendConfig(opts, cfgFile, "vesktop", "vesktop.toml")
}
appendTerminalConfig(opts, cfgFile, tmpDir, "ghostty", "ghostty.toml")
appendTerminalConfig(opts, cfgFile, tmpDir, "kitty", "kitty.toml")
appendTerminalConfig(opts, cfgFile, tmpDir, "foot", "foot.toml")
appendTerminalConfig(opts, cfgFile, tmpDir, "alacritty", "alacritty.toml")
appendTerminalConfig(opts, cfgFile, tmpDir, "wezterm", "wezterm.toml")
if !opts.ShouldSkipTemplate("ghostty") {
appendTerminalConfig(opts, cfgFile, tmpDir, "ghostty", "ghostty.toml")
}
if !opts.ShouldSkipTemplate("kitty") {
appendTerminalConfig(opts, cfgFile, tmpDir, "kitty", "kitty.toml")
}
if !opts.ShouldSkipTemplate("foot") {
appendTerminalConfig(opts, cfgFile, tmpDir, "foot", "foot.toml")
}
if !opts.ShouldSkipTemplate("alacritty") {
appendTerminalConfig(opts, cfgFile, tmpDir, "alacritty", "alacritty.toml")
}
if !opts.ShouldSkipTemplate("wezterm") {
appendTerminalConfig(opts, cfgFile, tmpDir, "wezterm", "wezterm.toml")
}
appendConfig(opts, cfgFile, "dgop", "dgop.toml")
if !opts.ShouldSkipTemplate("dgop") {
appendConfig(opts, cfgFile, "dgop", "dgop.toml")
}
homeDir, _ := os.UserHomeDir()
appendVSCodeConfig(cfgFile, "vscode", filepath.Join(homeDir, ".vscode/extensions/local.dynamic-base16-dankshell-0.0.1"), opts.ShellDir)
appendVSCodeConfig(cfgFile, "codium", filepath.Join(homeDir, ".vscode-oss/extensions/local.dynamic-base16-dankshell-0.0.1"), opts.ShellDir)
appendVSCodeConfig(cfgFile, "codeoss", filepath.Join(homeDir, ".config/Code - OSS/extensions/local.dynamic-base16-dankshell-0.0.1"), opts.ShellDir)
appendVSCodeConfig(cfgFile, "cursor", filepath.Join(homeDir, ".cursor/extensions/local.dynamic-base16-dankshell-0.0.1"), opts.ShellDir)
appendVSCodeConfig(cfgFile, "windsurf", filepath.Join(homeDir, ".windsurf/extensions/local.dynamic-base16-dankshell-0.0.1"), opts.ShellDir)
if !opts.ShouldSkipTemplate("kcolorscheme") {
appendConfig(opts, cfgFile, "skip", "kcolorscheme.toml")
}
if !opts.ShouldSkipTemplate("vscode") {
homeDir, _ := os.UserHomeDir()
appendVSCodeConfig(cfgFile, "vscode", filepath.Join(homeDir, ".vscode/extensions/local.dynamic-base16-dankshell-0.0.1"), opts.ShellDir)
appendVSCodeConfig(cfgFile, "codium", filepath.Join(homeDir, ".vscode-oss/extensions/local.dynamic-base16-dankshell-0.0.1"), opts.ShellDir)
appendVSCodeConfig(cfgFile, "codeoss", filepath.Join(homeDir, ".config/Code - OSS/extensions/local.dynamic-base16-dankshell-0.0.1"), opts.ShellDir)
appendVSCodeConfig(cfgFile, "cursor", filepath.Join(homeDir, ".cursor/extensions/local.dynamic-base16-dankshell-0.0.1"), opts.ShellDir)
appendVSCodeConfig(cfgFile, "windsurf", filepath.Join(homeDir, ".windsurf/extensions/local.dynamic-base16-dankshell-0.0.1"), opts.ShellDir)
}
if opts.RunUserTemplates {
if data, err := os.ReadFile(userConfigPath); err == nil {
@@ -277,7 +323,7 @@ func appendConfig(opts *Options, cfgFile *os.File, checkCmd, fileName string) {
if _, err := os.Stat(configPath); err != nil {
return
}
if checkCmd != "skip" && !commandExists(checkCmd) {
if checkCmd != "skip" && !utils.CommandExists(checkCmd) {
return
}
data, err := os.ReadFile(configPath)
@@ -293,7 +339,7 @@ func appendTerminalConfig(opts *Options, cfgFile *os.File, tmpDir, checkCmd, fil
if _, err := os.Stat(configPath); err != nil {
return
}
if checkCmd != "skip" && !commandExists(checkCmd) {
if checkCmd != "skip" && !utils.CommandExists(checkCmd) {
return
}
data, err := os.ReadFile(configPath)
@@ -390,11 +436,6 @@ func extractTOMLSection(content, startMarker, endMarker string) string {
return content[startIdx : startIdx+endIdx]
}
func commandExists(name string) bool {
_, err := exec.LookPath(name)
return err == nil
}
func checkMatugenVersion() {
matugenVersionOnce.Do(func() {
cmd := exec.Command("matugen", "--version")
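SkipTemplates is a comma-separated list, and ShouldSkipTemplate trims whitespace around each entry, so spacing in the flag value does not matter. A quick in-package illustration of the matching behavior (a snippet, not a complete program), using template keys from the code above:

opts := &Options{SkipTemplates: "gtk, firefox, vscode"}
_ = opts.ShouldSkipTemplate("firefox") // true: listed, surrounding spaces are trimmed
_ = opts.ShouldSkipTemplate("gtk")     // true: neither gtk3-light.toml nor gtk3-dark.toml is appended
_ = opts.ShouldSkipTemplate("kitty")   // false: kitty.toml is still appended when kitty is installed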

View File

@@ -0,0 +1,229 @@
// Code generated by mockery v2.53.5. DO NOT EDIT.
package mocks_wlclient
import (
client "github.com/AvengeMedia/DankMaterialShell/core/pkg/go-wayland/wayland/client"
mock "github.com/stretchr/testify/mock"
)
// MockWaylandDisplay is an autogenerated mock type for the WaylandDisplay type
type MockWaylandDisplay struct {
mock.Mock
}
type MockWaylandDisplay_Expecter struct {
mock *mock.Mock
}
func (_m *MockWaylandDisplay) EXPECT() *MockWaylandDisplay_Expecter {
return &MockWaylandDisplay_Expecter{mock: &_m.Mock}
}
// Context provides a mock function with no fields
func (_m *MockWaylandDisplay) Context() *client.Context {
ret := _m.Called()
if len(ret) == 0 {
panic("no return value specified for Context")
}
var r0 *client.Context
if rf, ok := ret.Get(0).(func() *client.Context); ok {
r0 = rf()
} else {
if ret.Get(0) != nil {
r0 = ret.Get(0).(*client.Context)
}
}
return r0
}
// MockWaylandDisplay_Context_Call is a *mock.Call that shadows Run/Return methods with type explicit version for method 'Context'
type MockWaylandDisplay_Context_Call struct {
*mock.Call
}
// Context is a helper method to define mock.On call
func (_e *MockWaylandDisplay_Expecter) Context() *MockWaylandDisplay_Context_Call {
return &MockWaylandDisplay_Context_Call{Call: _e.mock.On("Context")}
}
func (_c *MockWaylandDisplay_Context_Call) Run(run func()) *MockWaylandDisplay_Context_Call {
_c.Call.Run(func(args mock.Arguments) {
run()
})
return _c
}
func (_c *MockWaylandDisplay_Context_Call) Return(_a0 *client.Context) *MockWaylandDisplay_Context_Call {
_c.Call.Return(_a0)
return _c
}
func (_c *MockWaylandDisplay_Context_Call) RunAndReturn(run func() *client.Context) *MockWaylandDisplay_Context_Call {
_c.Call.Return(run)
return _c
}
// Destroy provides a mock function with no fields
func (_m *MockWaylandDisplay) Destroy() error {
ret := _m.Called()
if len(ret) == 0 {
panic("no return value specified for Destroy")
}
var r0 error
if rf, ok := ret.Get(0).(func() error); ok {
r0 = rf()
} else {
r0 = ret.Error(0)
}
return r0
}
// MockWaylandDisplay_Destroy_Call is a *mock.Call that shadows Run/Return methods with type explicit version for method 'Destroy'
type MockWaylandDisplay_Destroy_Call struct {
*mock.Call
}
// Destroy is a helper method to define mock.On call
func (_e *MockWaylandDisplay_Expecter) Destroy() *MockWaylandDisplay_Destroy_Call {
return &MockWaylandDisplay_Destroy_Call{Call: _e.mock.On("Destroy")}
}
func (_c *MockWaylandDisplay_Destroy_Call) Run(run func()) *MockWaylandDisplay_Destroy_Call {
_c.Call.Run(func(args mock.Arguments) {
run()
})
return _c
}
func (_c *MockWaylandDisplay_Destroy_Call) Return(_a0 error) *MockWaylandDisplay_Destroy_Call {
_c.Call.Return(_a0)
return _c
}
func (_c *MockWaylandDisplay_Destroy_Call) RunAndReturn(run func() error) *MockWaylandDisplay_Destroy_Call {
_c.Call.Return(run)
return _c
}
// GetRegistry provides a mock function with no fields
func (_m *MockWaylandDisplay) GetRegistry() (*client.Registry, error) {
ret := _m.Called()
if len(ret) == 0 {
panic("no return value specified for GetRegistry")
}
var r0 *client.Registry
var r1 error
if rf, ok := ret.Get(0).(func() (*client.Registry, error)); ok {
return rf()
}
if rf, ok := ret.Get(0).(func() *client.Registry); ok {
r0 = rf()
} else {
if ret.Get(0) != nil {
r0 = ret.Get(0).(*client.Registry)
}
}
if rf, ok := ret.Get(1).(func() error); ok {
r1 = rf()
} else {
r1 = ret.Error(1)
}
return r0, r1
}
// MockWaylandDisplay_GetRegistry_Call is a *mock.Call that shadows Run/Return methods with type explicit version for method 'GetRegistry'
type MockWaylandDisplay_GetRegistry_Call struct {
*mock.Call
}
// GetRegistry is a helper method to define mock.On call
func (_e *MockWaylandDisplay_Expecter) GetRegistry() *MockWaylandDisplay_GetRegistry_Call {
return &MockWaylandDisplay_GetRegistry_Call{Call: _e.mock.On("GetRegistry")}
}
func (_c *MockWaylandDisplay_GetRegistry_Call) Run(run func()) *MockWaylandDisplay_GetRegistry_Call {
_c.Call.Run(func(args mock.Arguments) {
run()
})
return _c
}
func (_c *MockWaylandDisplay_GetRegistry_Call) Return(_a0 *client.Registry, _a1 error) *MockWaylandDisplay_GetRegistry_Call {
_c.Call.Return(_a0, _a1)
return _c
}
func (_c *MockWaylandDisplay_GetRegistry_Call) RunAndReturn(run func() (*client.Registry, error)) *MockWaylandDisplay_GetRegistry_Call {
_c.Call.Return(run)
return _c
}
// Roundtrip provides a mock function with no fields
func (_m *MockWaylandDisplay) Roundtrip() error {
ret := _m.Called()
if len(ret) == 0 {
panic("no return value specified for Roundtrip")
}
var r0 error
if rf, ok := ret.Get(0).(func() error); ok {
r0 = rf()
} else {
r0 = ret.Error(0)
}
return r0
}
// MockWaylandDisplay_Roundtrip_Call is a *mock.Call that shadows Run/Return methods with type explicit version for method 'Roundtrip'
type MockWaylandDisplay_Roundtrip_Call struct {
*mock.Call
}
// Roundtrip is a helper method to define mock.On call
func (_e *MockWaylandDisplay_Expecter) Roundtrip() *MockWaylandDisplay_Roundtrip_Call {
return &MockWaylandDisplay_Roundtrip_Call{Call: _e.mock.On("Roundtrip")}
}
func (_c *MockWaylandDisplay_Roundtrip_Call) Run(run func()) *MockWaylandDisplay_Roundtrip_Call {
_c.Call.Run(func(args mock.Arguments) {
run()
})
return _c
}
func (_c *MockWaylandDisplay_Roundtrip_Call) Return(_a0 error) *MockWaylandDisplay_Roundtrip_Call {
_c.Call.Return(_a0)
return _c
}
func (_c *MockWaylandDisplay_Roundtrip_Call) RunAndReturn(run func() error) *MockWaylandDisplay_Roundtrip_Call {
_c.Call.Return(run)
return _c
}
// NewMockWaylandDisplay creates a new instance of MockWaylandDisplay. It also registers a testing interface on the mock and a cleanup function to assert the mocks expectations.
// The first argument is typically a *testing.T value.
func NewMockWaylandDisplay(t interface {
mock.TestingT
Cleanup(func())
}) *MockWaylandDisplay {
mock := &MockWaylandDisplay{}
mock.Mock.Test(t)
t.Cleanup(func() { mock.AssertExpectations(t) })
return mock
}
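The generated expecter API lets tests declare calls fluently, and constructing the mock with a *testing.T registers a cleanup that asserts every expectation was met. A hypothetical test sketch, written in the same mocks_wlclient package to avoid guessing the import path:

package mocks_wlclient

import "testing"

func TestRoundtripExpectation(t *testing.T) {
	d := NewMockWaylandDisplay(t)
	d.EXPECT().Roundtrip().Return(nil)

	if err := d.Roundtrip(); err != nil {
		t.Fatalf("unexpected error: %v", err)
	}
	// The cleanup registered by NewMockWaylandDisplay asserts the expectation was met.
}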

View File

@@ -0,0 +1,226 @@
// Code generated by mockery v2.53.5. DO NOT EDIT.
package mocks_wlcontext
import (
client "github.com/AvengeMedia/DankMaterialShell/core/pkg/go-wayland/wayland/client"
mock "github.com/stretchr/testify/mock"
)
// MockWaylandContext is an autogenerated mock type for the WaylandContext type
type MockWaylandContext struct {
mock.Mock
}
type MockWaylandContext_Expecter struct {
mock *mock.Mock
}
func (_m *MockWaylandContext) EXPECT() *MockWaylandContext_Expecter {
return &MockWaylandContext_Expecter{mock: &_m.Mock}
}
// Close provides a mock function with no fields
func (_m *MockWaylandContext) Close() {
_m.Called()
}
// MockWaylandContext_Close_Call is a *mock.Call that shadows Run/Return methods with type explicit version for method 'Close'
type MockWaylandContext_Close_Call struct {
*mock.Call
}
// Close is a helper method to define mock.On call
func (_e *MockWaylandContext_Expecter) Close() *MockWaylandContext_Close_Call {
return &MockWaylandContext_Close_Call{Call: _e.mock.On("Close")}
}
func (_c *MockWaylandContext_Close_Call) Run(run func()) *MockWaylandContext_Close_Call {
_c.Call.Run(func(args mock.Arguments) {
run()
})
return _c
}
func (_c *MockWaylandContext_Close_Call) Return() *MockWaylandContext_Close_Call {
_c.Call.Return()
return _c
}
func (_c *MockWaylandContext_Close_Call) RunAndReturn(run func()) *MockWaylandContext_Close_Call {
_c.Run(run)
return _c
}
// Display provides a mock function with no fields
func (_m *MockWaylandContext) Display() *client.Display {
ret := _m.Called()
if len(ret) == 0 {
panic("no return value specified for Display")
}
var r0 *client.Display
if rf, ok := ret.Get(0).(func() *client.Display); ok {
r0 = rf()
} else {
if ret.Get(0) != nil {
r0 = ret.Get(0).(*client.Display)
}
}
return r0
}
// MockWaylandContext_Display_Call is a *mock.Call that shadows Run/Return methods with type explicit version for method 'Display'
type MockWaylandContext_Display_Call struct {
*mock.Call
}
// Display is a helper method to define mock.On call
func (_e *MockWaylandContext_Expecter) Display() *MockWaylandContext_Display_Call {
return &MockWaylandContext_Display_Call{Call: _e.mock.On("Display")}
}
func (_c *MockWaylandContext_Display_Call) Run(run func()) *MockWaylandContext_Display_Call {
_c.Call.Run(func(args mock.Arguments) {
run()
})
return _c
}
func (_c *MockWaylandContext_Display_Call) Return(_a0 *client.Display) *MockWaylandContext_Display_Call {
_c.Call.Return(_a0)
return _c
}
func (_c *MockWaylandContext_Display_Call) RunAndReturn(run func() *client.Display) *MockWaylandContext_Display_Call {
_c.Call.Return(run)
return _c
}
// FatalError provides a mock function with no fields
func (_m *MockWaylandContext) FatalError() <-chan error {
ret := _m.Called()
if len(ret) == 0 {
panic("no return value specified for FatalError")
}
var r0 <-chan error
if rf, ok := ret.Get(0).(func() <-chan error); ok {
r0 = rf()
} else {
if ret.Get(0) != nil {
r0 = ret.Get(0).(<-chan error)
}
}
return r0
}
// MockWaylandContext_FatalError_Call is a *mock.Call that shadows Run/Return methods with type explicit version for method 'FatalError'
type MockWaylandContext_FatalError_Call struct {
*mock.Call
}
// FatalError is a helper method to define mock.On call
func (_e *MockWaylandContext_Expecter) FatalError() *MockWaylandContext_FatalError_Call {
return &MockWaylandContext_FatalError_Call{Call: _e.mock.On("FatalError")}
}
func (_c *MockWaylandContext_FatalError_Call) Run(run func()) *MockWaylandContext_FatalError_Call {
_c.Call.Run(func(args mock.Arguments) {
run()
})
return _c
}
func (_c *MockWaylandContext_FatalError_Call) Return(_a0 <-chan error) *MockWaylandContext_FatalError_Call {
_c.Call.Return(_a0)
return _c
}
func (_c *MockWaylandContext_FatalError_Call) RunAndReturn(run func() <-chan error) *MockWaylandContext_FatalError_Call {
_c.Call.Return(run)
return _c
}
// Post provides a mock function with given fields: fn
func (_m *MockWaylandContext) Post(fn func()) {
_m.Called(fn)
}
// MockWaylandContext_Post_Call is a *mock.Call that shadows Run/Return methods with type explicit version for method 'Post'
type MockWaylandContext_Post_Call struct {
*mock.Call
}
// Post is a helper method to define mock.On call
// - fn func()
func (_e *MockWaylandContext_Expecter) Post(fn interface{}) *MockWaylandContext_Post_Call {
return &MockWaylandContext_Post_Call{Call: _e.mock.On("Post", fn)}
}
func (_c *MockWaylandContext_Post_Call) Run(run func(fn func())) *MockWaylandContext_Post_Call {
_c.Call.Run(func(args mock.Arguments) {
run(args[0].(func()))
})
return _c
}
func (_c *MockWaylandContext_Post_Call) Return() *MockWaylandContext_Post_Call {
_c.Call.Return()
return _c
}
func (_c *MockWaylandContext_Post_Call) RunAndReturn(run func(func())) *MockWaylandContext_Post_Call {
_c.Run(run)
return _c
}
// Start provides a mock function with no fields
func (_m *MockWaylandContext) Start() {
_m.Called()
}
// MockWaylandContext_Start_Call is a *mock.Call that shadows Run/Return methods with type explicit version for method 'Start'
type MockWaylandContext_Start_Call struct {
*mock.Call
}
// Start is a helper method to define mock.On call
func (_e *MockWaylandContext_Expecter) Start() *MockWaylandContext_Start_Call {
return &MockWaylandContext_Start_Call{Call: _e.mock.On("Start")}
}
func (_c *MockWaylandContext_Start_Call) Run(run func()) *MockWaylandContext_Start_Call {
_c.Call.Run(func(args mock.Arguments) {
run()
})
return _c
}
func (_c *MockWaylandContext_Start_Call) Return() *MockWaylandContext_Start_Call {
_c.Call.Return()
return _c
}
func (_c *MockWaylandContext_Start_Call) RunAndReturn(run func()) *MockWaylandContext_Start_Call {
_c.Run(run)
return _c
}
// NewMockWaylandContext creates a new instance of MockWaylandContext. It also registers a testing interface on the mock and a cleanup function to assert the mock's expectations.
// The first argument is typically a *testing.T value.
func NewMockWaylandContext(t interface {
mock.TestingT
Cleanup(func())
}) *MockWaylandContext {
mock := &MockWaylandContext{}
mock.Mock.Test(t)
t.Cleanup(func() { mock.AssertExpectations(t) })
return mock
}
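A similar sketch for MockWaylandContext, showing how Run can execute a callback handed to Post synchronously in a test; the import path is again an assumption, not taken from this diff.

// Sketch only: import path assumed from the package name above.
package wlcontext_test

import (
	"testing"

	mock "github.com/stretchr/testify/mock"

	mocks "github.com/AvengeMedia/DankMaterialShell/core/internal/mocks/mocks_wlcontext" // assumed path
)

func TestPostRunsCallback(t *testing.T) {
	ctx := mocks.NewMockWaylandContext(t)

	// Run the posted function synchronously instead of queueing it on an event loop.
	ctx.EXPECT().Post(mock.Anything).Run(func(fn func()) { fn() }).Return()

	ran := false
	ctx.Post(func() { ran = true })

	if !ran {
		t.Fatal("posted callback was not executed")
	}
}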

@@ -9,6 +9,7 @@ import (
"path/filepath"
"strings"
"github.com/AvengeMedia/DankMaterialShell/core/internal/log"
"github.com/spf13/afero"
)
@@ -32,33 +33,75 @@ func NewManagerWithFs(fs afero.Fs) (*Manager, error) {
}
func getPluginsDir() string {
configHome := os.Getenv("XDG_CONFIG_HOME")
if configHome == "" {
homeDir, err := os.UserHomeDir()
if err != nil {
return filepath.Join(os.TempDir(), "DankMaterialShell", "plugins")
}
configHome = filepath.Join(homeDir, ".config")
configDir, err := os.UserConfigDir()
if err != nil {
log.Error("failed to get user config dir", "err", err)
return ""
}
return filepath.Join(configHome, "DankMaterialShell", "plugins")
return filepath.Join(configDir, "DankMaterialShell", "plugins")
}
func (m *Manager) IsInstalled(plugin Plugin) (bool, error) {
pluginPath := filepath.Join(m.pluginsDir, plugin.ID)
exists, err := afero.DirExists(m.fs, pluginPath)
path, err := m.findInstalledPath(plugin.ID)
if err != nil {
return false, err
}
if exists {
return true, nil
return path != "", nil
}
func (m *Manager) findInstalledPath(pluginID string) (string, error) {
// Check user plugins directory
path, err := m.findInDir(m.pluginsDir, pluginID)
if err != nil {
return "", err
}
if path != "" {
return path, nil
}
systemPluginPath := filepath.Join("/etc/xdg/quickshell/dms-plugins", plugin.ID)
systemExists, err := afero.DirExists(m.fs, systemPluginPath)
if err != nil {
return false, err
// Check system plugins directory
systemDir := "/etc/xdg/quickshell/dms-plugins"
return m.findInDir(systemDir, pluginID)
}
func (m *Manager) findInDir(dir, pluginID string) (string, error) {
// First, check if folder with exact ID name exists
exactPath := filepath.Join(dir, pluginID)
if exists, _ := afero.DirExists(m.fs, exactPath); exists {
return exactPath, nil
}
return systemExists, nil
// Scan all folders and check plugin.json for matching ID
exists, err := afero.DirExists(m.fs, dir)
if err != nil || !exists {
return "", nil
}
entries, err := afero.ReadDir(m.fs, dir)
if err != nil {
return "", nil
}
for _, entry := range entries {
name := entry.Name()
if name == ".repos" || strings.HasSuffix(name, ".meta") {
continue
}
fullPath := filepath.Join(dir, name)
isPlugin := entry.IsDir() || entry.Mode()&os.ModeSymlink != 0
if !isPlugin {
if info, err := m.fs.Stat(fullPath); err == nil && info.IsDir() {
isPlugin = true
}
}
if isPlugin && m.getPluginID(fullPath) == pluginID {
return fullPath, nil
}
}
return "", nil
}
func (m *Manager) Install(plugin Plugin) error {
@@ -151,25 +194,19 @@ func (m *Manager) createSymlink(source, dest string) error {
}
func (m *Manager) Update(plugin Plugin) error {
pluginPath := filepath.Join(m.pluginsDir, plugin.ID)
exists, err := afero.DirExists(m.fs, pluginPath)
pluginPath, err := m.findInstalledPath(plugin.ID)
if err != nil {
return fmt.Errorf("failed to check if plugin exists: %w", err)
return fmt.Errorf("failed to find plugin: %w", err)
}
if !exists {
systemPluginPath := filepath.Join("/etc/xdg/quickshell/dms-plugins", plugin.ID)
systemExists, err := afero.DirExists(m.fs, systemPluginPath)
if err != nil {
return fmt.Errorf("failed to check if plugin exists: %w", err)
}
if systemExists {
return fmt.Errorf("cannot update system plugin: %s", plugin.Name)
}
if pluginPath == "" {
return fmt.Errorf("plugin not installed: %s", plugin.Name)
}
if strings.HasPrefix(pluginPath, "/etc/xdg/quickshell/dms-plugins") {
return fmt.Errorf("cannot update system plugin: %s", plugin.Name)
}
metaPath := pluginPath + ".meta"
metaExists, err := afero.Exists(m.fs, metaPath)
if err != nil {
@@ -209,25 +246,19 @@ func (m *Manager) Update(plugin Plugin) error {
}
func (m *Manager) Uninstall(plugin Plugin) error {
pluginPath := filepath.Join(m.pluginsDir, plugin.ID)
exists, err := afero.DirExists(m.fs, pluginPath)
pluginPath, err := m.findInstalledPath(plugin.ID)
if err != nil {
return fmt.Errorf("failed to check if plugin exists: %w", err)
return fmt.Errorf("failed to find plugin: %w", err)
}
if !exists {
systemPluginPath := filepath.Join("/etc/xdg/quickshell/dms-plugins", plugin.ID)
systemExists, err := afero.DirExists(m.fs, systemPluginPath)
if err != nil {
return fmt.Errorf("failed to check if plugin exists: %w", err)
}
if systemExists {
return fmt.Errorf("cannot uninstall system plugin: %s", plugin.Name)
}
if pluginPath == "" {
return fmt.Errorf("plugin not installed: %s", plugin.Name)
}
if strings.HasPrefix(pluginPath, "/etc/xdg/quickshell/dms-plugins") {
return fmt.Errorf("cannot uninstall system plugin: %s", plugin.Name)
}
metaPath := pluginPath + ".meta"
metaExists, err := afero.Exists(m.fs, metaPath)
if err != nil {
@@ -369,47 +400,174 @@ func (m *Manager) ListInstalled() ([]string, error) {
// getPluginID reads the plugin.json file and returns the plugin ID
func (m *Manager) getPluginID(pluginPath string) string {
manifest := m.getPluginManifest(pluginPath)
if manifest == nil {
return ""
}
return manifest.ID
}
func (m *Manager) getPluginManifest(pluginPath string) *pluginManifest {
manifestPath := filepath.Join(pluginPath, "plugin.json")
data, err := afero.ReadFile(m.fs, manifestPath)
if err != nil {
return ""
return nil
}
var manifest struct {
ID string `json:"id"`
}
var manifest pluginManifest
if err := json.Unmarshal(data, &manifest); err != nil {
return ""
return nil
}
return manifest.ID
return &manifest
}
type pluginManifest struct {
ID string `json:"id"`
Name string `json:"name"`
}
func (m *Manager) GetPluginsDir() string {
return m.pluginsDir
}
func (m *Manager) HasUpdates(pluginID string, plugin Plugin) (bool, error) {
pluginPath := filepath.Join(m.pluginsDir, pluginID)
exists, err := afero.DirExists(m.fs, pluginPath)
func (m *Manager) UninstallByIDOrName(idOrName string) error {
pluginPath, err := m.findInstalledPathByIDOrName(idOrName)
if err != nil {
return false, fmt.Errorf("failed to check if plugin exists: %w", err)
return err
}
if pluginPath == "" {
return fmt.Errorf("plugin not found: %s", idOrName)
}
if !exists {
systemPluginPath := filepath.Join("/etc/xdg/quickshell/dms-plugins", pluginID)
systemExists, err := afero.DirExists(m.fs, systemPluginPath)
if err != nil {
return false, fmt.Errorf("failed to check system plugin: %w", err)
if strings.HasPrefix(pluginPath, "/etc/xdg/quickshell/dms-plugins") {
return fmt.Errorf("cannot uninstall system plugin: %s", idOrName)
}
metaPath := pluginPath + ".meta"
metaExists, _ := afero.Exists(m.fs, metaPath)
if metaExists {
if err := m.fs.Remove(pluginPath); err != nil {
return fmt.Errorf("failed to remove symlink: %w", err)
}
if systemExists {
return false, nil
if err := m.fs.Remove(metaPath); err != nil {
return fmt.Errorf("failed to remove metadata: %w", err)
}
} else {
if err := m.fs.RemoveAll(pluginPath); err != nil {
return fmt.Errorf("failed to remove plugin: %w", err)
}
}
return nil
}
func (m *Manager) UpdateByIDOrName(idOrName string) error {
pluginPath, err := m.findInstalledPathByIDOrName(idOrName)
if err != nil {
return err
}
if pluginPath == "" {
return fmt.Errorf("plugin not found: %s", idOrName)
}
if strings.HasPrefix(pluginPath, "/etc/xdg/quickshell/dms-plugins") {
return fmt.Errorf("cannot update system plugin: %s", idOrName)
}
metaPath := pluginPath + ".meta"
metaExists, _ := afero.Exists(m.fs, metaPath)
if metaExists {
// Plugin was installed from a monorepo (symlinked, with a .meta file); without
// registry info we don't know the repo URL, so it cannot be updated here
return fmt.Errorf("cannot update monorepo plugin without registry info: %s", idOrName)
}
// Standalone plugin - just pull
if err := m.gitClient.Pull(pluginPath); err != nil {
return fmt.Errorf("failed to update plugin: %w", err)
}
return nil
}
func (m *Manager) findInstalledPathByIDOrName(idOrName string) (string, error) {
path, err := m.findInDirByIDOrName(m.pluginsDir, idOrName)
if err != nil {
return "", err
}
if path != "" {
return path, nil
}
systemDir := "/etc/xdg/quickshell/dms-plugins"
return m.findInDirByIDOrName(systemDir, idOrName)
}
func (m *Manager) findInDirByIDOrName(dir, idOrName string) (string, error) {
// Check exact folder name match first
exactPath := filepath.Join(dir, idOrName)
if exists, _ := afero.DirExists(m.fs, exactPath); exists {
return exactPath, nil
}
exists, err := afero.DirExists(m.fs, dir)
if err != nil || !exists {
return "", nil
}
entries, err := afero.ReadDir(m.fs, dir)
if err != nil {
return "", nil
}
for _, entry := range entries {
name := entry.Name()
if name == ".repos" || strings.HasSuffix(name, ".meta") {
continue
}
fullPath := filepath.Join(dir, name)
isPlugin := entry.IsDir() || entry.Mode()&os.ModeSymlink != 0
if !isPlugin {
if info, err := m.fs.Stat(fullPath); err == nil && info.IsDir() {
isPlugin = true
}
}
if !isPlugin {
continue
}
manifest := m.getPluginManifest(fullPath)
if manifest == nil {
continue
}
if manifest.ID == idOrName || manifest.Name == idOrName {
return fullPath, nil
}
}
return "", nil
}
func (m *Manager) HasUpdates(pluginID string, plugin Plugin) (bool, error) {
pluginPath, err := m.findInstalledPath(pluginID)
if err != nil {
return false, fmt.Errorf("failed to find plugin: %w", err)
}
if pluginPath == "" {
return false, fmt.Errorf("plugin not installed: %s", pluginID)
}
// Check if there's a .meta file (plugin installed from a monorepo)
if strings.HasPrefix(pluginPath, "/etc/xdg/quickshell/dms-plugins") {
return false, nil
}
metaPath := pluginPath + ".meta"
metaExists, err := afero.Exists(m.fs, metaPath)
if err != nil {
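For reference, a sketch of the plugin.json shape that getPluginManifest above decodes when matching folders by ID or name. The field names come from the pluginManifest struct shown in this diff; the values are invented.

// Minimal sketch of the manifest shape findInDir/findInDirByIDOrName rely on.
// Field names mirror the pluginManifest struct above; values are examples only.
package main

import (
	"encoding/json"
	"fmt"
)

func main() {
	data := []byte(`{"id": "example-plugin", "name": "Example Plugin"}`)

	var manifest struct {
		ID   string `json:"id"`
		Name string `json:"name"`
	}
	if err := json.Unmarshal(data, &manifest); err != nil {
		panic(err)
	}

	// A folder named differently from the ID is still matched, because the
	// manager compares manifest.ID (and, for the *ByIDOrName variants, Name).
	fmt.Println(manifest.ID, manifest.Name)
}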

@@ -3,6 +3,8 @@ package plugins
import (
"sort"
"strings"
"github.com/AvengeMedia/DankMaterialShell/core/internal/utils"
)
func FuzzySearch(query string, plugins []Plugin) []Plugin {
@@ -11,18 +13,12 @@ func FuzzySearch(query string, plugins []Plugin) []Plugin {
}
queryLower := strings.ToLower(query)
var results []Plugin
for _, plugin := range plugins {
if fuzzyMatch(queryLower, strings.ToLower(plugin.Name)) ||
fuzzyMatch(queryLower, strings.ToLower(plugin.Category)) ||
fuzzyMatch(queryLower, strings.ToLower(plugin.Description)) ||
fuzzyMatch(queryLower, strings.ToLower(plugin.Author)) {
results = append(results, plugin)
}
}
return results
return utils.Filter(plugins, func(p Plugin) bool {
return fuzzyMatch(queryLower, strings.ToLower(p.Name)) ||
fuzzyMatch(queryLower, strings.ToLower(p.Category)) ||
fuzzyMatch(queryLower, strings.ToLower(p.Description)) ||
fuzzyMatch(queryLower, strings.ToLower(p.Author))
})
}
func fuzzyMatch(query, text string) bool {
@@ -39,57 +35,34 @@ func FilterByCategory(category string, plugins []Plugin) []Plugin {
if category == "" {
return plugins
}
var results []Plugin
categoryLower := strings.ToLower(category)
for _, plugin := range plugins {
if strings.ToLower(plugin.Category) == categoryLower {
results = append(results, plugin)
}
}
return results
return utils.Filter(plugins, func(p Plugin) bool {
return strings.ToLower(p.Category) == categoryLower
})
}
func FilterByCompositor(compositor string, plugins []Plugin) []Plugin {
if compositor == "" {
return plugins
}
var results []Plugin
compositorLower := strings.ToLower(compositor)
for _, plugin := range plugins {
for _, comp := range plugin.Compositors {
if strings.ToLower(comp) == compositorLower {
results = append(results, plugin)
break
}
}
}
return results
return utils.Filter(plugins, func(p Plugin) bool {
return utils.Any(p.Compositors, func(c string) bool {
return strings.ToLower(c) == compositorLower
})
})
}
func FilterByCapability(capability string, plugins []Plugin) []Plugin {
if capability == "" {
return plugins
}
var results []Plugin
capabilityLower := strings.ToLower(capability)
for _, plugin := range plugins {
for _, cap := range plugin.Capabilities {
if strings.ToLower(cap) == capabilityLower {
results = append(results, plugin)
break
}
}
}
return results
return utils.Filter(plugins, func(p Plugin) bool {
return utils.Any(p.Capabilities, func(c string) bool {
return strings.ToLower(c) == capabilityLower
})
})
}
func SortByFirstParty(plugins []Plugin) []Plugin {
@@ -103,3 +76,13 @@ func SortByFirstParty(plugins []Plugin) []Plugin {
})
return plugins
}
func FindByIDOrName(idOrName string, plugins []Plugin) *Plugin {
if p, found := utils.Find(plugins, func(p Plugin) bool { return p.ID == idOrName }); found {
return &p
}
if p, found := utils.Find(plugins, func(p Plugin) bool { return p.Name == idOrName }); found {
return &p
}
return nil
}
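The refactor above leans on generic helpers from core/internal/utils whose definitions are not part of this diff. Plausible minimal shapes, assuming Go 1.18+ generics, might look like the following; the real implementations may differ.

// Plausible shapes for the helpers used above; the actual code in
// core/internal/utils is not shown in this diff and may differ.
package utils

func Filter[T any](items []T, keep func(T) bool) []T {
	var out []T
	for _, it := range items {
		if keep(it) {
			out = append(out, it)
		}
	}
	return out
}

func Any[T any](items []T, pred func(T) bool) bool {
	for _, it := range items {
		if pred(it) {
			return true
		}
	}
	return false
}

func Find[T any](items []T, pred func(T) bool) (T, bool) {
	for _, it := range items {
		if pred(it) {
			return it, true
		}
	}
	var zero T
	return zero, false
}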

@@ -0,0 +1,695 @@
// Generated by go-wayland-scanner
// https://github.com/yaslama/go-wayland/cmd/go-wayland-scanner
// XML file : internal/proto/xml/ext-data-control-v1.xml
//
// ext_data_control_v1 Protocol Copyright:
//
// Copyright © 2018 Simon Ser
// Copyright © 2019 Ivan Molodetskikh
// Copyright © 2024 Neal Gompa
//
// Permission to use, copy, modify, distribute, and sell this
// software and its documentation for any purpose is hereby granted
// without fee, provided that the above copyright notice appear in
// all copies and that both that copyright notice and this permission
// notice appear in supporting documentation, and that the name of
// the copyright holders not be used in advertising or publicity
// pertaining to distribution of the software without specific,
// written prior permission. The copyright holders make no
// representations about the suitability of this software for any
// purpose. It is provided "as is" without express or implied
// warranty.
//
// THE COPYRIGHT HOLDERS DISCLAIM ALL WARRANTIES WITH REGARD TO THIS
// SOFTWARE, INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND
// FITNESS, IN NO EVENT SHALL THE COPYRIGHT HOLDERS BE LIABLE FOR ANY
// SPECIAL, INDIRECT OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES
// WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN
// AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION,
// ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF
// THIS SOFTWARE.
package ext_data_control
import (
"github.com/AvengeMedia/DankMaterialShell/core/pkg/go-wayland/wayland/client"
"golang.org/x/sys/unix"
)
// ExtDataControlManagerV1InterfaceName is the name of the interface as it appears in the [client.Registry].
// It can be used to match the [client.RegistryGlobalEvent.Interface] in the
// [Registry.SetGlobalHandler] and can be used in [Registry.Bind] if this applies.
const ExtDataControlManagerV1InterfaceName = "ext_data_control_manager_v1"
// ExtDataControlManagerV1 : manager to control data devices
//
// This interface is a manager that allows creating per-seat data device
// controls.
type ExtDataControlManagerV1 struct {
client.BaseProxy
}
// NewExtDataControlManagerV1 : manager to control data devices
//
// This interface is a manager that allows creating per-seat data device
// controls.
func NewExtDataControlManagerV1(ctx *client.Context) *ExtDataControlManagerV1 {
extDataControlManagerV1 := &ExtDataControlManagerV1{}
ctx.Register(extDataControlManagerV1)
return extDataControlManagerV1
}
// CreateDataSource : create a new data source
//
// Create a new data source.
func (i *ExtDataControlManagerV1) CreateDataSource() (*ExtDataControlSourceV1, error) {
id := NewExtDataControlSourceV1(i.Context())
const opcode = 0
const _reqBufLen = 8 + 4
var _reqBuf [_reqBufLen]byte
l := 0
client.PutUint32(_reqBuf[l:4], i.ID())
l += 4
client.PutUint32(_reqBuf[l:l+4], uint32(_reqBufLen<<16|opcode&0x0000ffff))
l += 4
client.PutUint32(_reqBuf[l:l+4], id.ID())
l += 4
err := i.Context().WriteMsg(_reqBuf[:], nil)
return id, err
}
// GetDataDevice : get a data device for a seat
//
// Create a data device that can be used to manage a seat's selection.
func (i *ExtDataControlManagerV1) GetDataDevice(seat *client.Seat) (*ExtDataControlDeviceV1, error) {
id := NewExtDataControlDeviceV1(i.Context())
const opcode = 1
const _reqBufLen = 8 + 4 + 4
var _reqBuf [_reqBufLen]byte
l := 0
client.PutUint32(_reqBuf[l:4], i.ID())
l += 4
client.PutUint32(_reqBuf[l:l+4], uint32(_reqBufLen<<16|opcode&0x0000ffff))
l += 4
client.PutUint32(_reqBuf[l:l+4], id.ID())
l += 4
client.PutUint32(_reqBuf[l:l+4], seat.ID())
l += 4
err := i.Context().WriteMsg(_reqBuf[:], nil)
return id, err
}
// GetDataDeviceWithProxy : get a data device for a seat using a pre-created proxy
//
// Like GetDataDevice, but uses a pre-created ExtDataControlDeviceV1 proxy.
// This allows setting up event handlers before the request is sent.
func (i *ExtDataControlManagerV1) GetDataDeviceWithProxy(device *ExtDataControlDeviceV1, seat *client.Seat) error {
const opcode = 1
const _reqBufLen = 8 + 4 + 4
var _reqBuf [_reqBufLen]byte
l := 0
client.PutUint32(_reqBuf[l:4], i.ID())
l += 4
client.PutUint32(_reqBuf[l:l+4], uint32(_reqBufLen<<16|opcode&0x0000ffff))
l += 4
client.PutUint32(_reqBuf[l:l+4], device.ID())
l += 4
client.PutUint32(_reqBuf[l:l+4], seat.ID())
l += 4
return i.Context().WriteMsg(_reqBuf[:], nil)
}
// Destroy : destroy the manager
//
// All objects created by the manager will still remain valid, until their
// appropriate destroy request has been called.
func (i *ExtDataControlManagerV1) Destroy() error {
defer i.MarkZombie()
const opcode = 2
const _reqBufLen = 8
var _reqBuf [_reqBufLen]byte
l := 0
client.PutUint32(_reqBuf[l:4], i.ID())
l += 4
client.PutUint32(_reqBuf[l:l+4], uint32(_reqBufLen<<16|opcode&0x0000ffff))
l += 4
err := i.Context().WriteMsg(_reqBuf[:], nil)
return err
}
// ExtDataControlDeviceV1InterfaceName is the name of the interface as it appears in the [client.Registry].
// It can be used to match the [client.RegistryGlobalEvent.Interface] in the
// [Registry.SetGlobalHandler] and can be used in [Registry.Bind] if this applies.
const ExtDataControlDeviceV1InterfaceName = "ext_data_control_device_v1"
// ExtDataControlDeviceV1 : manage a data device for a seat
//
// This interface allows a client to manage a seat's selection.
//
// When the seat is destroyed, this object becomes inert.
type ExtDataControlDeviceV1 struct {
client.BaseProxy
dataOfferHandler ExtDataControlDeviceV1DataOfferHandlerFunc
selectionHandler ExtDataControlDeviceV1SelectionHandlerFunc
finishedHandler ExtDataControlDeviceV1FinishedHandlerFunc
primarySelectionHandler ExtDataControlDeviceV1PrimarySelectionHandlerFunc
}
// NewExtDataControlDeviceV1 : manage a data device for a seat
//
// This interface allows a client to manage a seat's selection.
//
// When the seat is destroyed, this object becomes inert.
func NewExtDataControlDeviceV1(ctx *client.Context) *ExtDataControlDeviceV1 {
extDataControlDeviceV1 := &ExtDataControlDeviceV1{}
ctx.Register(extDataControlDeviceV1)
return extDataControlDeviceV1
}
// SetSelection : copy data to the selection
//
// This request asks the compositor to set the selection to the data from
// the source on behalf of the client.
//
// The given source may not be used in any further set_selection or
// set_primary_selection requests. Attempting to use a previously used
// source triggers the used_source protocol error.
//
// To unset the selection, set the source to NULL.
func (i *ExtDataControlDeviceV1) SetSelection(source *ExtDataControlSourceV1) error {
const opcode = 0
const _reqBufLen = 8 + 4
var _reqBuf [_reqBufLen]byte
l := 0
client.PutUint32(_reqBuf[l:4], i.ID())
l += 4
client.PutUint32(_reqBuf[l:l+4], uint32(_reqBufLen<<16|opcode&0x0000ffff))
l += 4
if source == nil {
client.PutUint32(_reqBuf[l:l+4], 0)
l += 4
} else {
client.PutUint32(_reqBuf[l:l+4], source.ID())
l += 4
}
err := i.Context().WriteMsg(_reqBuf[:], nil)
return err
}
// Destroy : destroy this data device
//
// Destroys the data device object.
func (i *ExtDataControlDeviceV1) Destroy() error {
defer i.MarkZombie()
const opcode = 1
const _reqBufLen = 8
var _reqBuf [_reqBufLen]byte
l := 0
client.PutUint32(_reqBuf[l:4], i.ID())
l += 4
client.PutUint32(_reqBuf[l:l+4], uint32(_reqBufLen<<16|opcode&0x0000ffff))
l += 4
err := i.Context().WriteMsg(_reqBuf[:], nil)
return err
}
// SetPrimarySelection : copy data to the primary selection
//
// This request asks the compositor to set the primary selection to the
// data from the source on behalf of the client.
//
// The given source may not be used in any further set_selection or
// set_primary_selection requests. Attempting to use a previously used
// source triggers the used_source protocol error.
//
// To unset the primary selection, set the source to NULL.
//
// The compositor will ignore this request if it does not support primary
// selection.
func (i *ExtDataControlDeviceV1) SetPrimarySelection(source *ExtDataControlSourceV1) error {
const opcode = 2
const _reqBufLen = 8 + 4
var _reqBuf [_reqBufLen]byte
l := 0
client.PutUint32(_reqBuf[l:4], i.ID())
l += 4
client.PutUint32(_reqBuf[l:l+4], uint32(_reqBufLen<<16|opcode&0x0000ffff))
l += 4
if source == nil {
client.PutUint32(_reqBuf[l:l+4], 0)
l += 4
} else {
client.PutUint32(_reqBuf[l:l+4], source.ID())
l += 4
}
err := i.Context().WriteMsg(_reqBuf[:], nil)
return err
}
type ExtDataControlDeviceV1Error uint32
// ExtDataControlDeviceV1Error :
const (
// ExtDataControlDeviceV1ErrorUsedSource : source given to set_selection or set_primary_selection was already used before
ExtDataControlDeviceV1ErrorUsedSource ExtDataControlDeviceV1Error = 1
)
func (e ExtDataControlDeviceV1Error) Name() string {
switch e {
case ExtDataControlDeviceV1ErrorUsedSource:
return "used_source"
default:
return ""
}
}
func (e ExtDataControlDeviceV1Error) Value() string {
switch e {
case ExtDataControlDeviceV1ErrorUsedSource:
return "1"
default:
return ""
}
}
func (e ExtDataControlDeviceV1Error) String() string {
return e.Name() + "=" + e.Value()
}
// ExtDataControlDeviceV1DataOfferEvent : introduce a new ext_data_control_offer
//
// The data_offer event introduces a new ext_data_control_offer object,
// which will subsequently be used in either the
// ext_data_control_device.selection event (for the regular clipboard
// selections) or the ext_data_control_device.primary_selection event (for
// the primary clipboard selections). Immediately following the
// ext_data_control_device.data_offer event, the new data_offer object
// will send out ext_data_control_offer.offer events to describe the MIME
// types it offers.
type ExtDataControlDeviceV1DataOfferEvent struct {
Id *ExtDataControlOfferV1
}
type ExtDataControlDeviceV1DataOfferHandlerFunc func(ExtDataControlDeviceV1DataOfferEvent)
// SetDataOfferHandler : sets handler for ExtDataControlDeviceV1DataOfferEvent
func (i *ExtDataControlDeviceV1) SetDataOfferHandler(f ExtDataControlDeviceV1DataOfferHandlerFunc) {
i.dataOfferHandler = f
}
// ExtDataControlDeviceV1SelectionEvent : advertise new selection
//
// The selection event is sent out to notify the client of a new
// ext_data_control_offer for the selection for this device. The
// ext_data_control_device.data_offer and the ext_data_control_offer.offer
// events are sent out immediately before this event to introduce the data
// offer object. The selection event is sent to a client when a new
// selection is set. The ext_data_control_offer is valid until a new
// ext_data_control_offer or NULL is received. The client must destroy the
// previous selection ext_data_control_offer, if any, upon receiving this
// event. Regardless, the previous selection will be ignored once a new
// selection ext_data_control_offer is received.
//
// The first selection event is sent upon binding the
// ext_data_control_device object.
type ExtDataControlDeviceV1SelectionEvent struct {
Id *ExtDataControlOfferV1
OfferId uint32 // Raw object ID for external registry lookups
}
type ExtDataControlDeviceV1SelectionHandlerFunc func(ExtDataControlDeviceV1SelectionEvent)
// SetSelectionHandler : sets handler for ExtDataControlDeviceV1SelectionEvent
func (i *ExtDataControlDeviceV1) SetSelectionHandler(f ExtDataControlDeviceV1SelectionHandlerFunc) {
i.selectionHandler = f
}
// ExtDataControlDeviceV1FinishedEvent : this data control is no longer valid
//
// This data control object is no longer valid and should be destroyed by
// the client.
type ExtDataControlDeviceV1FinishedEvent struct{}
type ExtDataControlDeviceV1FinishedHandlerFunc func(ExtDataControlDeviceV1FinishedEvent)
// SetFinishedHandler : sets handler for ExtDataControlDeviceV1FinishedEvent
func (i *ExtDataControlDeviceV1) SetFinishedHandler(f ExtDataControlDeviceV1FinishedHandlerFunc) {
i.finishedHandler = f
}
// ExtDataControlDeviceV1PrimarySelectionEvent : advertise new primary selection
//
// The primary_selection event is sent out to notify the client of a new
// ext_data_control_offer for the primary selection for this device. The
// ext_data_control_device.data_offer and the ext_data_control_offer.offer
// events are sent out immediately before this event to introduce the data
// offer object. The primary_selection event is sent to a client when a
// new primary selection is set. The ext_data_control_offer is valid until
// a new ext_data_control_offer or NULL is received. The client must
// destroy the previous primary selection ext_data_control_offer, if any,
// upon receiving this event. Regardless, the previous primary selection
// will be ignored once a new primary selection ext_data_control_offer is
// received.
//
// If the compositor supports primary selection, the first
// primary_selection event is sent upon binding the
// ext_data_control_device object.
type ExtDataControlDeviceV1PrimarySelectionEvent struct {
Id *ExtDataControlOfferV1
OfferId uint32 // Raw object ID for external registry lookups
}
type ExtDataControlDeviceV1PrimarySelectionHandlerFunc func(ExtDataControlDeviceV1PrimarySelectionEvent)
// SetPrimarySelectionHandler : sets handler for ExtDataControlDeviceV1PrimarySelectionEvent
func (i *ExtDataControlDeviceV1) SetPrimarySelectionHandler(f ExtDataControlDeviceV1PrimarySelectionHandlerFunc) {
i.primarySelectionHandler = f
}
func (i *ExtDataControlDeviceV1) Dispatch(opcode uint32, fd int, data []byte) {
switch opcode {
case 0:
// data_offer event: server creates a new object (new_id)
if i.dataOfferHandler == nil {
return
}
var e ExtDataControlDeviceV1DataOfferEvent
l := 0
newID := client.Uint32(data[l : l+4])
l += 4
ctx := i.Context()
offer := &ExtDataControlOfferV1{}
offer.SetContext(ctx)
offer.SetID(newID)
ctx.RegisterWithID(offer, newID)
e.Id = offer
i.dataOfferHandler(e)
case 1:
// selection event: nullable object reference
if i.selectionHandler == nil {
return
}
var e ExtDataControlDeviceV1SelectionEvent
l := 0
objID := client.Uint32(data[l : l+4])
l += 4
e.OfferId = objID
if objID != 0 {
if p := i.Context().GetProxy(objID); p != nil {
e.Id = p.(*ExtDataControlOfferV1)
}
}
i.selectionHandler(e)
case 2:
if i.finishedHandler == nil {
return
}
var e ExtDataControlDeviceV1FinishedEvent
i.finishedHandler(e)
case 3:
// primary_selection event: nullable object reference
if i.primarySelectionHandler == nil {
return
}
var e ExtDataControlDeviceV1PrimarySelectionEvent
l := 0
objID := client.Uint32(data[l : l+4])
l += 4
e.OfferId = objID
if objID != 0 {
if p := i.Context().GetProxy(objID); p != nil {
e.Id = p.(*ExtDataControlOfferV1)
}
}
i.primarySelectionHandler(e)
}
}
// ExtDataControlSourceV1InterfaceName is the name of the interface as it appears in the [client.Registry].
// It can be used to match the [client.RegistryGlobalEvent.Interface] in the
// [Registry.SetGlobalHandler] and can be used in [Registry.Bind] if this applies.
const ExtDataControlSourceV1InterfaceName = "ext_data_control_source_v1"
// ExtDataControlSourceV1 : offer to transfer data
//
// The ext_data_control_source object is the source side of a
// ext_data_control_offer. It is created by the source client in a data
// transfer and provides a way to describe the offered data and a way to
// respond to requests to transfer the data.
type ExtDataControlSourceV1 struct {
client.BaseProxy
sendHandler ExtDataControlSourceV1SendHandlerFunc
cancelledHandler ExtDataControlSourceV1CancelledHandlerFunc
}
// NewExtDataControlSourceV1 : offer to transfer data
//
// The ext_data_control_source object is the source side of a
// ext_data_control_offer. It is created by the source client in a data
// transfer and provides a way to describe the offered data and a way to
// respond to requests to transfer the data.
func NewExtDataControlSourceV1(ctx *client.Context) *ExtDataControlSourceV1 {
extDataControlSourceV1 := &ExtDataControlSourceV1{}
ctx.Register(extDataControlSourceV1)
return extDataControlSourceV1
}
// Offer : add an offered MIME type
//
// This request adds a MIME type to the set of MIME types advertised to
// targets. Can be called several times to offer multiple types.
//
// Calling this after ext_data_control_device.set_selection is a protocol
// error.
//
// mimeType: MIME type offered by the data source
func (i *ExtDataControlSourceV1) Offer(mimeType string) error {
const opcode = 0
mimeTypeLen := client.PaddedLen(len(mimeType) + 1)
_reqBufLen := 8 + (4 + mimeTypeLen)
_reqBuf := make([]byte, _reqBufLen)
l := 0
client.PutUint32(_reqBuf[l:4], i.ID())
l += 4
client.PutUint32(_reqBuf[l:l+4], uint32(_reqBufLen<<16|opcode&0x0000ffff))
l += 4
client.PutString(_reqBuf[l:l+(4+mimeTypeLen)], mimeType)
l += (4 + mimeTypeLen)
err := i.Context().WriteMsg(_reqBuf, nil)
return err
}
// Destroy : destroy this source
//
// Destroys the data source object.
func (i *ExtDataControlSourceV1) Destroy() error {
defer i.MarkZombie()
const opcode = 1
const _reqBufLen = 8
var _reqBuf [_reqBufLen]byte
l := 0
client.PutUint32(_reqBuf[l:4], i.ID())
l += 4
client.PutUint32(_reqBuf[l:l+4], uint32(_reqBufLen<<16|opcode&0x0000ffff))
l += 4
err := i.Context().WriteMsg(_reqBuf[:], nil)
return err
}
type ExtDataControlSourceV1Error uint32
// ExtDataControlSourceV1Error :
const (
// ExtDataControlSourceV1ErrorInvalidOffer : offer sent after ext_data_control_device.set_selection
ExtDataControlSourceV1ErrorInvalidOffer ExtDataControlSourceV1Error = 1
)
func (e ExtDataControlSourceV1Error) Name() string {
switch e {
case ExtDataControlSourceV1ErrorInvalidOffer:
return "invalid_offer"
default:
return ""
}
}
func (e ExtDataControlSourceV1Error) Value() string {
switch e {
case ExtDataControlSourceV1ErrorInvalidOffer:
return "1"
default:
return ""
}
}
func (e ExtDataControlSourceV1Error) String() string {
return e.Name() + "=" + e.Value()
}
// ExtDataControlSourceV1SendEvent : send the data
//
// Request for data from the client. Send the data as the specified MIME
// type over the passed file descriptor, then close it.
type ExtDataControlSourceV1SendEvent struct {
MimeType string
Fd int
}
type ExtDataControlSourceV1SendHandlerFunc func(ExtDataControlSourceV1SendEvent)
// SetSendHandler : sets handler for ExtDataControlSourceV1SendEvent
func (i *ExtDataControlSourceV1) SetSendHandler(f ExtDataControlSourceV1SendHandlerFunc) {
i.sendHandler = f
}
// ExtDataControlSourceV1CancelledEvent : selection was cancelled
//
// This data source is no longer valid. The data source has been replaced
// by another data source.
//
// The client should clean up and destroy this data source.
type ExtDataControlSourceV1CancelledEvent struct{}
type ExtDataControlSourceV1CancelledHandlerFunc func(ExtDataControlSourceV1CancelledEvent)
// SetCancelledHandler : sets handler for ExtDataControlSourceV1CancelledEvent
func (i *ExtDataControlSourceV1) SetCancelledHandler(f ExtDataControlSourceV1CancelledHandlerFunc) {
i.cancelledHandler = f
}
func (i *ExtDataControlSourceV1) Dispatch(opcode uint32, fd int, data []byte) {
switch opcode {
case 0:
if i.sendHandler == nil {
if fd != -1 {
unix.Close(fd)
}
return
}
var e ExtDataControlSourceV1SendEvent
l := 0
mimeTypeLen := client.PaddedLen(int(client.Uint32(data[l : l+4])))
l += 4
e.MimeType = client.String(data[l : l+mimeTypeLen])
l += mimeTypeLen
e.Fd = fd
i.sendHandler(e)
case 1:
if i.cancelledHandler == nil {
return
}
var e ExtDataControlSourceV1CancelledEvent
i.cancelledHandler(e)
}
}
// ExtDataControlOfferV1InterfaceName is the name of the interface as it appears in the [client.Registry].
// It can be used to match the [client.RegistryGlobalEvent.Interface] in the
// [Registry.SetGlobalHandler] and can be used in [Registry.Bind] if this applies.
const ExtDataControlOfferV1InterfaceName = "ext_data_control_offer_v1"
// ExtDataControlOfferV1 : offer to transfer data
//
// A ext_data_control_offer represents a piece of data offered for transfer
// by another client (the source client). The offer describes the different
// MIME types that the data can be converted to and provides the mechanism
// for transferring the data directly from the source client.
type ExtDataControlOfferV1 struct {
client.BaseProxy
offerHandler ExtDataControlOfferV1OfferHandlerFunc
}
// NewExtDataControlOfferV1 : offer to transfer data
//
// A ext_data_control_offer represents a piece of data offered for transfer
// by another client (the source client). The offer describes the different
// MIME types that the data can be converted to and provides the mechanism
// for transferring the data directly from the source client.
func NewExtDataControlOfferV1(ctx *client.Context) *ExtDataControlOfferV1 {
extDataControlOfferV1 := &ExtDataControlOfferV1{}
ctx.Register(extDataControlOfferV1)
return extDataControlOfferV1
}
// Receive : request that the data is transferred
//
// To transfer the offered data, the client issues this request and
// indicates the MIME type it wants to receive. The transfer happens
// through the passed file descriptor (typically created with the pipe
// system call). The source client writes the data in the MIME type
// representation requested and then closes the file descriptor.
//
// The receiving client reads from the read end of the pipe until EOF and
// then closes its end, at which point the transfer is complete.
//
// This request may happen multiple times for different MIME types.
//
// mimeType: MIME type desired by receiver
// fd: file descriptor for data transfer
func (i *ExtDataControlOfferV1) Receive(mimeType string, fd int) error {
const opcode = 0
mimeTypeLen := client.PaddedLen(len(mimeType) + 1)
_reqBufLen := 8 + (4 + mimeTypeLen)
_reqBuf := make([]byte, _reqBufLen)
l := 0
client.PutUint32(_reqBuf[l:4], i.ID())
l += 4
client.PutUint32(_reqBuf[l:l+4], uint32(_reqBufLen<<16|opcode&0x0000ffff))
l += 4
client.PutString(_reqBuf[l:l+(4+mimeTypeLen)], mimeType)
l += (4 + mimeTypeLen)
oob := unix.UnixRights(int(fd))
err := i.Context().WriteMsg(_reqBuf, oob)
return err
}
// Destroy : destroy this offer
//
// Destroys the data offer object.
func (i *ExtDataControlOfferV1) Destroy() error {
defer i.MarkZombie()
const opcode = 1
const _reqBufLen = 8
var _reqBuf [_reqBufLen]byte
l := 0
client.PutUint32(_reqBuf[l:4], i.ID())
l += 4
client.PutUint32(_reqBuf[l:l+4], uint32(_reqBufLen<<16|opcode&0x0000ffff))
l += 4
err := i.Context().WriteMsg(_reqBuf[:], nil)
return err
}
// ExtDataControlOfferV1OfferEvent : advertise offered MIME type
//
// Sent immediately after creating the ext_data_control_offer object.
// One event per offered MIME type.
type ExtDataControlOfferV1OfferEvent struct {
MimeType string
}
type ExtDataControlOfferV1OfferHandlerFunc func(ExtDataControlOfferV1OfferEvent)
// SetOfferHandler : sets handler for ExtDataControlOfferV1OfferEvent
func (i *ExtDataControlOfferV1) SetOfferHandler(f ExtDataControlOfferV1OfferHandlerFunc) {
i.offerHandler = f
}
func (i *ExtDataControlOfferV1) Dispatch(opcode uint32, fd int, data []byte) {
switch opcode {
case 0:
if i.offerHandler == nil {
return
}
var e ExtDataControlOfferV1OfferEvent
l := 0
mimeTypeLen := client.PaddedLen(int(client.Uint32(data[l : l+4])))
l += 4
e.MimeType = client.String(data[l : l+mimeTypeLen])
l += mimeTypeLen
i.offerHandler(e)
}
}
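A hedged sketch of how a client might drive the receive side of these bindings: get a data device for a seat, watch selection events, and pull the current selection through a pipe. It assumes an already-bound manager and seat, a running dispatch/flush loop elsewhere, and an import path that is not confirmed by this diff.

// Sketch of the receive side. mgr and seat are assumed to be bound already,
// and the connection's event loop / flushing is assumed to run elsewhere.
package clipboard

import (
	"io"
	"os"

	"github.com/AvengeMedia/DankMaterialShell/core/pkg/go-wayland/wayland/client"
	ext "github.com/AvengeMedia/DankMaterialShell/core/pkg/go-wayland/wayland/ext_data_control" // assumed path
	"golang.org/x/sys/unix"
)

func watchSelection(mgr *ext.ExtDataControlManagerV1, seat *client.Seat, onText func(string)) error {
	device, err := mgr.GetDataDevice(seat)
	if err != nil {
		return err
	}

	device.SetDataOfferHandler(func(e ext.ExtDataControlDeviceV1DataOfferEvent) {
		// MIME types for the new offer arrive as offer events on this object.
		e.Id.SetOfferHandler(func(o ext.ExtDataControlOfferV1OfferEvent) { _ = o.MimeType })
	})

	device.SetSelectionHandler(func(e ext.ExtDataControlDeviceV1SelectionEvent) {
		// Per the protocol docs the previous offer should be destroyed here;
		// omitted for brevity.
		if e.Id == nil {
			return // selection cleared
		}
		p := make([]int, 2)
		if err := unix.Pipe(p); err != nil {
			return
		}
		// Ask the source to write into our pipe, then close the write end so
		// the read side sees EOF once the source finishes.
		if err := e.Id.Receive("text/plain;charset=utf-8", p[1]); err != nil {
			unix.Close(p[0])
			unix.Close(p[1])
			return
		}
		unix.Close(p[1])
		// The Receive request still has to be flushed by the connection's
		// event loop before the source writes anything; read off-thread so the
		// dispatch goroutine is not blocked.
		go func(fd int) {
			r := os.NewFile(uintptr(fd), "selection")
			defer r.Close()
			if data, err := io.ReadAll(r); err == nil {
				onText(string(data))
			}
		}(p[0])
	})
	return nil
}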

@@ -238,9 +238,17 @@ func (i *ZwlrOutputManagerV1) Dispatch(opcode uint32, fd int, data []byte) {
l := 0
objectID := client.Uint32(data[l : l+4])
proxy := i.Context().GetProxy(objectID)
if proxy != nil {
e.Head = proxy.(*ZwlrOutputHeadV1)
if proxy == nil || proxy.IsZombie() {
head := &ZwlrOutputHeadV1{}
head.SetContext(i.Context())
head.SetID(objectID)
registerServerProxy(i.Context(), head, objectID)
e.Head = head
} else if head, ok := proxy.(*ZwlrOutputHeadV1); ok {
e.Head = head
} else {
// Stale proxy of wrong type (can happen after suspend/resume)
// Replace it with the correct type
head := &ZwlrOutputHeadV1{}
head.SetContext(i.Context())
head.SetID(objectID)
@@ -715,9 +723,17 @@ func (i *ZwlrOutputHeadV1) Dispatch(opcode uint32, fd int, data []byte) {
l := 0
objectID := client.Uint32(data[l : l+4])
proxy := i.Context().GetProxy(objectID)
if proxy != nil {
e.Mode = proxy.(*ZwlrOutputModeV1)
if proxy == nil || proxy.IsZombie() {
mode := &ZwlrOutputModeV1{}
mode.SetContext(i.Context())
mode.SetID(objectID)
registerServerProxy(i.Context(), mode, objectID)
e.Mode = mode
} else if mode, ok := proxy.(*ZwlrOutputModeV1); ok {
e.Mode = mode
} else {
// Stale proxy of wrong type (can happen after suspend/resume)
// Replace it with the correct type
mode := &ZwlrOutputModeV1{}
mode.SetContext(i.Context())
mode.SetID(objectID)
@@ -743,7 +759,26 @@ func (i *ZwlrOutputHeadV1) Dispatch(opcode uint32, fd int, data []byte) {
}
var e ZwlrOutputHeadV1CurrentModeEvent
l := 0
e.Mode = i.Context().GetProxy(client.Uint32(data[l : l+4])).(*ZwlrOutputModeV1)
objectID := client.Uint32(data[l : l+4])
proxy := i.Context().GetProxy(objectID)
if proxy == nil || proxy.IsZombie() {
// Mode not yet registered or zombie, create fresh
mode := &ZwlrOutputModeV1{}
mode.SetContext(i.Context())
mode.SetID(objectID)
registerServerProxy(i.Context(), mode, objectID)
e.Mode = mode
} else if mode, ok := proxy.(*ZwlrOutputModeV1); ok {
e.Mode = mode
} else {
// Stale proxy of wrong type (can happen after suspend/resume)
// Replace it with the correct type
mode := &ZwlrOutputModeV1{}
mode.SetContext(i.Context())
mode.SetID(objectID)
registerServerProxy(i.Context(), mode, objectID)
e.Mode = mode
}
l += 4
i.currentModeHandler(e)

@@ -0,0 +1,276 @@
<?xml version="1.0" encoding="UTF-8"?>
<protocol name="ext_data_control_v1">
<copyright>
Copyright © 2018 Simon Ser
Copyright © 2019 Ivan Molodetskikh
Copyright © 2024 Neal Gompa
Permission to use, copy, modify, distribute, and sell this
software and its documentation for any purpose is hereby granted
without fee, provided that the above copyright notice appear in
all copies and that both that copyright notice and this permission
notice appear in supporting documentation, and that the name of
the copyright holders not be used in advertising or publicity
pertaining to distribution of the software without specific,
written prior permission. The copyright holders make no
representations about the suitability of this software for any
purpose. It is provided "as is" without express or implied
warranty.
THE COPYRIGHT HOLDERS DISCLAIM ALL WARRANTIES WITH REGARD TO THIS
SOFTWARE, INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND
FITNESS, IN NO EVENT SHALL THE COPYRIGHT HOLDERS BE LIABLE FOR ANY
SPECIAL, INDIRECT OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES
WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN
AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION,
ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF
THIS SOFTWARE.
</copyright>
<description summary="control data devices">
This protocol allows a privileged client to control data devices. In
particular, the client will be able to manage the current selection and take
the role of a clipboard manager.
Warning! The protocol described in this file is currently in the testing
phase. Backward compatible changes may be added together with the
corresponding interface version bump. Backward incompatible changes can
only be done by creating a new major version of the extension.
</description>
<interface name="ext_data_control_manager_v1" version="1">
<description summary="manager to control data devices">
This interface is a manager that allows creating per-seat data device
controls.
</description>
<request name="create_data_source">
<description summary="create a new data source">
Create a new data source.
</description>
<arg name="id" type="new_id" interface="ext_data_control_source_v1"
summary="data source to create"/>
</request>
<request name="get_data_device">
<description summary="get a data device for a seat">
Create a data device that can be used to manage a seat's selection.
</description>
<arg name="id" type="new_id" interface="ext_data_control_device_v1"/>
<arg name="seat" type="object" interface="wl_seat"/>
</request>
<request name="destroy" type="destructor">
<description summary="destroy the manager">
All objects created by the manager will still remain valid, until their
appropriate destroy request has been called.
</description>
</request>
</interface>
<interface name="ext_data_control_device_v1" version="1">
<description summary="manage a data device for a seat">
This interface allows a client to manage a seat's selection.
When the seat is destroyed, this object becomes inert.
</description>
<request name="set_selection">
<description summary="copy data to the selection">
This request asks the compositor to set the selection to the data from
the source on behalf of the client.
The given source may not be used in any further set_selection or
set_primary_selection requests. Attempting to use a previously used
source triggers the used_source protocol error.
To unset the selection, set the source to NULL.
</description>
<arg name="source" type="object" interface="ext_data_control_source_v1"
allow-null="true"/>
</request>
<request name="destroy" type="destructor">
<description summary="destroy this data device">
Destroys the data device object.
</description>
</request>
<event name="data_offer">
<description summary="introduce a new ext_data_control_offer">
The data_offer event introduces a new ext_data_control_offer object,
which will subsequently be used in either the
ext_data_control_device.selection event (for the regular clipboard
selections) or the ext_data_control_device.primary_selection event (for
the primary clipboard selections). Immediately following the
ext_data_control_device.data_offer event, the new data_offer object
will send out ext_data_control_offer.offer events to describe the MIME
types it offers.
</description>
<arg name="id" type="new_id" interface="ext_data_control_offer_v1"/>
</event>
<event name="selection">
<description summary="advertise new selection">
The selection event is sent out to notify the client of a new
ext_data_control_offer for the selection for this device. The
ext_data_control_device.data_offer and the ext_data_control_offer.offer
events are sent out immediately before this event to introduce the data
offer object. The selection event is sent to a client when a new
selection is set. The ext_data_control_offer is valid until a new
ext_data_control_offer or NULL is received. The client must destroy the
previous selection ext_data_control_offer, if any, upon receiving this
event. Regardless, the previous selection will be ignored once a new
selection ext_data_control_offer is received.
The first selection event is sent upon binding the
ext_data_control_device object.
</description>
<arg name="id" type="object" interface="ext_data_control_offer_v1"
allow-null="true"/>
</event>
<event name="finished">
<description summary="this data control is no longer valid">
This data control object is no longer valid and should be destroyed by
the client.
</description>
</event>
<event name="primary_selection">
<description summary="advertise new primary selection">
The primary_selection event is sent out to notify the client of a new
ext_data_control_offer for the primary selection for this device. The
ext_data_control_device.data_offer and the ext_data_control_offer.offer
events are sent out immediately before this event to introduce the data
offer object. The primary_selection event is sent to a client when a
new primary selection is set. The ext_data_control_offer is valid until
a new ext_data_control_offer or NULL is received. The client must
destroy the previous primary selection ext_data_control_offer, if any,
upon receiving this event. Regardless, the previous primary selection
will be ignored once a new primary selection ext_data_control_offer is
received.
If the compositor supports primary selection, the first
primary_selection event is sent upon binding the
ext_data_control_device object.
</description>
<arg name="id" type="object" interface="ext_data_control_offer_v1"
allow-null="true"/>
</event>
<request name="set_primary_selection">
<description summary="copy data to the primary selection">
This request asks the compositor to set the primary selection to the
data from the source on behalf of the client.
The given source may not be used in any further set_selection or
set_primary_selection requests. Attempting to use a previously used
source triggers the used_source protocol error.
To unset the primary selection, set the source to NULL.
The compositor will ignore this request if it does not support primary
selection.
</description>
<arg name="source" type="object" interface="ext_data_control_source_v1"
allow-null="true"/>
</request>
<enum name="error">
<entry name="used_source" value="1"
summary="source given to set_selection or set_primary_selection was already used before"/>
</enum>
</interface>
<interface name="ext_data_control_source_v1" version="1">
<description summary="offer to transfer data">
The ext_data_control_source object is the source side of a
ext_data_control_offer. It is created by the source client in a data
transfer and provides a way to describe the offered data and a way to
respond to requests to transfer the data.
</description>
<enum name="error">
<entry name="invalid_offer" value="1"
summary="offer sent after ext_data_control_device.set_selection"/>
</enum>
<request name="offer">
<description summary="add an offered MIME type">
This request adds a MIME type to the set of MIME types advertised to
targets. Can be called several times to offer multiple types.
Calling this after ext_data_control_device.set_selection is a protocol
error.
</description>
<arg name="mime_type" type="string"
summary="MIME type offered by the data source"/>
</request>
<request name="destroy" type="destructor">
<description summary="destroy this source">
Destroys the data source object.
</description>
</request>
<event name="send">
<description summary="send the data">
Request for data from the client. Send the data as the specified MIME
type over the passed file descriptor, then close it.
</description>
<arg name="mime_type" type="string" summary="MIME type for the data"/>
<arg name="fd" type="fd" summary="file descriptor for the data"/>
</event>
<event name="cancelled">
<description summary="selection was cancelled">
This data source is no longer valid. The data source has been replaced
by another data source.
The client should clean up and destroy this data source.
</description>
</event>
</interface>
<interface name="ext_data_control_offer_v1" version="1">
<description summary="offer to transfer data">
A ext_data_control_offer represents a piece of data offered for transfer
by another client (the source client). The offer describes the different
MIME types that the data can be converted to and provides the mechanism
for transferring the data directly from the source client.
</description>
<request name="receive">
<description summary="request that the data is transferred">
To transfer the offered data, the client issues this request and
indicates the MIME type it wants to receive. The transfer happens
through the passed file descriptor (typically created with the pipe
system call). The source client writes the data in the MIME type
representation requested and then closes the file descriptor.
The receiving client reads from the read end of the pipe until EOF and
then closes its end, at which point the transfer is complete.
This request may happen multiple times for different MIME types.
</description>
<arg name="mime_type" type="string"
summary="MIME type desired by receiver"/>
<arg name="fd" type="fd" summary="file descriptor for data transfer"/>
</request>
<request name="destroy" type="destructor">
<description summary="destroy this offer">
Destroys the data offer object.
</description>
</request>
<event name="offer">
<description summary="advertise offered MIME type">
Sent immediately after creating the ext_data_control_offer object.
One event per offered MIME type.
</description>
<arg name="mime_type" type="string" summary="offered MIME type"/>
</event>
</interface>
</protocol>
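And a matching sketch of the source side the protocol describes: create a data source, advertise a MIME type, serve send events by writing to the supplied file descriptor, and install it with set_selection. The same caveats apply (assumed import path, event loop and flushing handled elsewhere).

// Sketch of the source side: offer must precede set_selection, and a source
// may only be handed to set_selection/set_primary_selection once.
package clipboard

import (
	ext "github.com/AvengeMedia/DankMaterialShell/core/pkg/go-wayland/wayland/ext_data_control" // assumed path
	"golang.org/x/sys/unix"
)

func offerText(mgr *ext.ExtDataControlManagerV1, device *ext.ExtDataControlDeviceV1, text string) error {
	source, err := mgr.CreateDataSource()
	if err != nil {
		return err
	}

	// Offer the MIME type before set_selection (invalid_offer otherwise).
	if err := source.Offer("text/plain;charset=utf-8"); err != nil {
		return err
	}

	source.SetSendHandler(func(e ext.ExtDataControlSourceV1SendEvent) {
		// Write the data for the requested MIME type, then close the fd.
		unix.Write(e.Fd, []byte(text))
		unix.Close(e.Fd)
	})
	source.SetCancelledHandler(func(ext.ExtDataControlSourceV1CancelledEvent) {
		// Another client replaced the selection; clean up this source.
		_ = source.Destroy()
	})

	return device.SetSelection(source)
}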
