mirror of https://github.com/AvengeMedia/DankMaterialShell.git
synced 2026-01-25 05:52:50 -05:00

Compare commits: bae32e51ff...displaycon (3 commits)

| Author | SHA1 | Date |
|---|---|---|
| | 9673078a75 | |
| | 9e8c93bfd7 | |
| | 43d6f4b1d3 | |
65 .github/ISSUE_TEMPLATE/bug_report.md vendored Normal file
@@ -0,0 +1,65 @@
---
name: Bug Report
about: Crashes or unexpected behaviors
title: ""
labels: "bug"
assignees: ""
---

<!-- If your issue is related to ICONS
- Purple and black checkerboards are QT's way of signalling an icon doesn't exist
- FIX: Configure a QT6 or Icon Pack in DMS Settings that has the icon you want
- Follow the [THEMING](https://danklinux.com/docs/dankmaterialshell/icon-theming) section to ensure your QT environment variable is configured correctly for themes.
- Once done, configure an icon theme - either however you normally do with gtk3 or qt6ct, or through the built-in settings modal. -->

## Compositor

- [ ] niri
- [ ] Hyprland
- [ ] dwl (MangoWC)
- [ ] sway
- [ ] Other (specify)

## Distribution

<!-- Arch, Fedora, Debian, etc. -->

## dms version

<!-- Output of dms version command -->

## Description

<!-- Brief description of the issue -->

## Expected Behavior

<!-- Describe what you expected to happen -->

## Steps to Reproduce

<!-- Please provide detailed steps to reproduce the issue -->

1.
2.
3.

## Error Messages/Logs

<!-- Please include any error messages, stack traces, or relevant logs -->
<!-- you can get a log file with the following steps:
dms kill
mkdir ~/dms_logs
nohup dms run > ~/dms_logs/dms-$(date +%s).txt 2>&1 &

Then trigger your issue, and share the contents of ~/dms_logs/dms-<timestamp>.txt

-->

```
Paste error messages or logs here
```

## Screenshots/Recordings

<!-- If applicable, add screenshots or screen recordings -->
96 .github/ISSUE_TEMPLATE/bug_report.yml vendored
@@ -1,96 +0,0 @@
name: Bug Report
description: Crashes or unexpected behaviors
labels:
- bug
body:
- type: markdown
attributes:
value: |
## DankMaterialShell Bug Report
Limit your report to one issue per submission unless closely related
- type: checkboxes
id: compositor
attributes:
label: Compositor
options:
- label: Niri
- label: Hyprland
- label: MangoWC (dwl)
- label: Sway
validations:
required: true
- type: checkboxes
id: distribution
attributes:
label: Distribution
options:
- label: Arch Linux
- label: CachyOS
- label: Fedora
- label: NixOS
- label: Debian
- label: Ubuntu
- label: Gentoo
- label: OpenSUSE
- label: Other (specify below)
validations:
required: true
- type: input
id: distribution_other
attributes:
label: If Other, please specify
placeholder: e.g., PikaOS, Void Linux, etc.
validations:
required: false
- type: input
id: dms_version
attributes:
label: dms version
description: Output of dms version command
placeholder: e.g., 1.2.3
validations:
required: true
- type: textarea
id: description
attributes:
label: Description
description: Brief description of the issue
placeholder: What happened?
validations:
required: true
- type: textarea
id: expected_behavior
attributes:
label: Expected Behavior
description: What did you expect to happen?
placeholder: Describe the expected behavior
validations:
required: false
- type: textarea
id: steps_to_reproduce
attributes:
label: Steps to Reproduce & Installation Method
description: Please provide detailed steps to reproduce the issue
placeholder: |
1. ...
2. ...
3. ...
validations:
required: true
- type: textarea
id: logs
attributes:
label: Error Messages/Logs
description: Please include any error messages, stack traces, or relevant logs
placeholder: |
Paste error messages or logs here
validations:
required: false
- type: textarea
id: screenshots
attributes:
label: Screenshots/Recordings
description: If applicable, add screenshots or screen recordings
placeholder: Attach images or videos here
validations:
required: false
33 .github/ISSUE_TEMPLATE/feature_request.md vendored Normal file
@@ -0,0 +1,33 @@
---
name: Request a Feature
about: New widgets, new widget behavior, etc.
title: ""
labels: "enhancement"
assignees: ""
---

## Feature Description

<!-- Brief description of the feature requested -->

## Use Case

<!-- Explain the purpose of this feature/why it'd be useful to you -->

## Compositor

Is this feature specific to one compositor?

- [ ] All compositors
- [ ] niri
- [ ] Hyprland
- [ ] dwl (MangoWC)
- [ ] sway

## Proposed Solution

<!-- If you have any ideas for how to implement this, please share! -->

## Alternatives/Existing Solutions

<!-- Include any similar/pre-existing products that solve this problem -->
55 .github/ISSUE_TEMPLATE/feature_request.yml vendored
@@ -1,55 +0,0 @@
name: Feature Request
description: Suggest a new feature or improvement for DMS
labels:
- enhancement
body:
- type: markdown
attributes:
value: |
## DankMaterialShell Feature Request
- type: textarea
id: feature_description
attributes:
label: Feature Description
description: Brief description of the feature requested
placeholder: What feature would you like to see?
validations:
required: true
- type: textarea
id: use_case
attributes:
label: Use Case
description: Explain the purpose of this feature/why it'd be useful to you
placeholder: Why is this feature important?
validations:
required: false
- type: checkboxes
id: compositor
attributes:
label: Compositor(s)
description: Is this feature specific to one or more compositors?
options:
- label: All compositors
- label: Niri
- label: Hyprland
- label: MangoWC (dwl)
- label: Sway
- label: Other (specify below)
validations:
required: false
- type: textarea
id: proposed_solution
attributes:
label: Proposed Solution
description: If you have any ideas for how to implement this, please share!
placeholder: Suggest a solution or approach
validations:
required: false
- type: textarea
id: alternatives
attributes:
label: Alternatives/Existing Solutions
description: Include any similar/pre-existing products that solve this problem
placeholder: List alternatives or existing solutions
validations:
required: false
40 .github/ISSUE_TEMPLATE/support_request.md vendored Normal file
@@ -0,0 +1,40 @@
---
name: Request Assistance or Support
about: Help with installation, usage, or general questions.
title: ""
labels: "support"
assignees: ""
---

## Compositor

- [ ] niri
- [ ] Hyprland
- [ ] dwl (MangoWC)
- [ ] sway
- [ ] other

## Distribution

<!-- Arch, Fedora, Debian, etc. -->

## dms version

<!-- Output of dms version command -->

## Description

<!-- Brief description of the support needed -->

## Solutions Tried

<!-- Describe what you've tried so far -->
<!-- Outlining what you've tried so far helps us make improvements to the user experience and documentation to avoid recurrent issues -->

## Configuration Details

<!-- Include any configuration if relevant -->

## Screenshots/Recordings

<!-- If applicable, add screenshots or screen recordings -->
69 .github/ISSUE_TEMPLATE/support_request.yml vendored
@@ -1,69 +0,0 @@
name: Support Request
description: Help with installation, usage, or general questions about DankMaterialShell
labels:
- support
body:
- type: markdown
attributes:
value: |
## DankMaterialShell Support Request
- type: checkboxes
id: compositor
attributes:
label: Compositor
options:
- label: Niri
- label: Hyprland
- label: MangoWC (dwl)
- label: Sway
- label: Other (specify below)
validations:
required: false
- type: input
id: distribution
attributes:
label: Distribution
description: Which Linux distribution are you using? (e.g., Arch, Fedora, Debian, etc.)
placeholder: Your Linux distribution
validations:
required: false
- type: input
id: dms_version
attributes:
label: dms version
description: Output of dms version command
placeholder: e.g., 1.2.3
validations:
required: false
- type: textarea
id: description
attributes:
label: Description
description: Brief description of the support needed
placeholder: What do you need help with?
validations:
required: true
- type: textarea
id: solutions_tried
attributes:
label: Solutions Tried
description: Describe what you've tried so far (commands, documentation, etc.)
placeholder: List steps or resources you've already tried
validations:
required: false
- type: textarea
id: configuration
attributes:
label: Configuration Details
description: Include any relevant configuration if relevant
placeholder: Add configuration or environment info
validations:
required: false
- type: textarea
id: screenshots
attributes:
label: Screenshots/Recordings
description: If applicable, add screenshots or screen recordings
placeholder: Attach images or videos here
validations:
required: false
294 .github/workflows/release.yml vendored
@@ -398,3 +398,297 @@ jobs:
prerelease: ${{ contains(env.TAG, '-') }}
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}

# trigger-obs-update:
# runs-on: ubuntu-latest
# needs: release
# env:
# TAG: ${{ inputs.tag }}
# steps:
# - name: Checkout
# uses: actions/checkout@v4
# with:
# ref: ${{ inputs.tag }}

# - name: Install OSC
# run: |
# sudo apt-get update
# sudo apt-get install -y osc

# mkdir -p ~/.config/osc
# cat > ~/.config/osc/oscrc << EOF
# [general]
# apiurl = https://api.opensuse.org

# [https://api.opensuse.org]
# user = ${{ secrets.OBS_USERNAME }}
# pass = ${{ secrets.OBS_PASSWORD }}
# EOF
# chmod 600 ~/.config/osc/oscrc

# - name: Update OBS packages
# run: |
# cd distro
# bash scripts/obs-upload.sh dms "Update to ${TAG}"

# trigger-ppa-update:
# runs-on: ubuntu-latest
# needs: release
# env:
# TAG: ${{ inputs.tag }}
# steps:
# - name: Checkout
# uses: actions/checkout@v4
# with:
# ref: ${{ inputs.tag }}

# - name: Install build dependencies
# run: |
# sudo apt-get update
# sudo apt-get install -y \
# debhelper \
# devscripts \
# dput \
# lftp \
# build-essential \
# fakeroot \
# dpkg-dev

# - name: Configure GPG
# env:
# GPG_KEY: ${{ secrets.GPG_PRIVATE_KEY }}
# run: |
# echo "$GPG_KEY" | gpg --import
# GPG_KEY_ID=$(gpg --list-secret-keys --keyid-format LONG | grep sec | awk '{print $2}' | cut -d'/' -f2)
# echo "DEBSIGN_KEYID=$GPG_KEY_ID" >> $GITHUB_ENV

# - name: Upload to PPA
# run: |
# cd distro/ubuntu/ppa
# bash create-and-upload.sh ../dms dms questing

# copr-build:
# runs-on: ubuntu-latest
# needs: release
# env:
# TAG: ${{ inputs.tag }}

# steps:
# - name: Checkout repository
# uses: actions/checkout@v4
# with:
# ref: ${{ inputs.tag }}

# - name: Determine version
# id: version
# run: |
# VERSION="${TAG#v}"
# echo "version=$VERSION" >> $GITHUB_OUTPUT
# echo "Building DMS stable version: $VERSION"

# - name: Setup build environment
# run: |
# sudo apt-get update
# sudo apt-get install -y rpm wget curl jq gzip
# mkdir -p ~/rpmbuild/{BUILD,BUILDROOT,RPMS,SOURCES,SPECS,SRPMS}

# - name: Download release assets
# run: |
# VERSION="${{ steps.version.outputs.version }}"
# cd ~/rpmbuild/SOURCES

# wget "https://github.com/AvengeMedia/DankMaterialShell/releases/download/v${VERSION}/dms-qml.tar.gz" || {
# echo "Failed to download dms-qml.tar.gz for v${VERSION}"
# exit 1
# }

# - name: Generate stable spec file
# run: |
# VERSION="${{ steps.version.outputs.version }}"
# CHANGELOG_DATE="$(date '+%a %b %d %Y')"

# cat > ~/rpmbuild/SPECS/dms.spec <<'SPECEOF'
# # Spec for DMS stable releases - Generated by GitHub Actions

# %global debug_package %{nil}
# %global version VERSION_PLACEHOLDER
# %global pkg_summary DankMaterialShell - Material 3 inspired shell for Wayland compositors

# Name: dms
# Version: %{version}
# Release: 1%{?dist}
# Summary: %{pkg_summary}

# License: MIT
# URL: https://github.com/AvengeMedia/DankMaterialShell

# Source0: dms-qml.tar.gz

# BuildRequires: gzip
# BuildRequires: wget
# BuildRequires: systemd-rpm-macros

# Requires: (quickshell or quickshell-git)
# Requires: accountsservice
# Requires: dms-cli = %{version}-%{release}
# Requires: dgop

# Recommends: cava
# Recommends: cliphist
# Recommends: danksearch
# Recommends: matugen
# Recommends: wl-clipboard
# Recommends: NetworkManager
# Recommends: qt6-qtmultimedia
# Suggests: qt6ct

# %description
# DankMaterialShell (DMS) is a modern Wayland desktop shell built with Quickshell
# and optimized for the niri and hyprland compositors. Features notifications,
# app launcher, wallpaper customization, and fully customizable with plugins.

# Includes auto-theming for GTK/Qt apps with matugen, 20+ customizable widgets,
# process monitoring, notification center, clipboard history, dock, control center,
# lock screen, and comprehensive plugin system.

# %package -n dms-cli
# Summary: DankMaterialShell CLI tool
# License: MIT
# URL: https://github.com/AvengeMedia/DankMaterialShell

# %description -n dms-cli
# Command-line interface for DankMaterialShell configuration and management.
# Provides native DBus bindings, NetworkManager integration, and system utilities.

# %prep
# %setup -q -c -n dms-qml

# # Download architecture-specific binaries during build
# case "%{_arch}" in
# x86_64)
# ARCH_SUFFIX="amd64"
# ;;
# aarch64)
# ARCH_SUFFIX="arm64"
# ;;
# *)
# echo "Unsupported architecture: %{_arch}"
# exit 1
# ;;
# esac

# wget -O %{_builddir}/dms-cli.gz "https://github.com/AvengeMedia/DankMaterialShell/releases/latest/download/dms-distropkg-${ARCH_SUFFIX}.gz" || {
# echo "Failed to download dms-cli for architecture %{_arch}"
# exit 1
# }
# gunzip -c %{_builddir}/dms-cli.gz > %{_builddir}/dms-cli
# chmod +x %{_builddir}/dms-cli

# %build

# %install
# install -Dm755 %{_builddir}/dms-cli %{buildroot}%{_bindir}/dms

# install -d %{buildroot}%{_datadir}/bash-completion/completions
# install -d %{buildroot}%{_datadir}/zsh/site-functions
# install -d %{buildroot}%{_datadir}/fish/vendor_completions.d
# %{_builddir}/dms-cli completion bash > %{buildroot}%{_datadir}/bash-completion/completions/dms || :
# %{_builddir}/dms-cli completion zsh > %{buildroot}%{_datadir}/zsh/site-functions/_dms || :
# %{_builddir}/dms-cli completion fish > %{buildroot}%{_datadir}/fish/vendor_completions.d/dms.fish || :

# install -Dm644 assets/systemd/dms.service %{buildroot}%{_userunitdir}/dms.service

# install -Dm644 assets/dms-open.desktop %{buildroot}%{_datadir}/applications/dms-open.desktop
# install -Dm644 assets/danklogo.svg %{buildroot}%{_datadir}/icons/hicolor/scalable/apps/danklogo.svg

# install -dm755 %{buildroot}%{_datadir}/quickshell/dms
# cp -r %{_builddir}/dms-qml/* %{buildroot}%{_datadir}/quickshell/dms/

# rm -rf %{buildroot}%{_datadir}/quickshell/dms/.git*
# rm -f %{buildroot}%{_datadir}/quickshell/dms/.gitignore
# rm -rf %{buildroot}%{_datadir}/quickshell/dms/.github
# rm -rf %{buildroot}%{_datadir}/quickshell/dms/distro

# echo "%{version}" > %{buildroot}%{_datadir}/quickshell/dms/VERSION

# %posttrans
# if [ -d "%{_sysconfdir}/xdg/quickshell/dms" ]; then
# rmdir "%{_sysconfdir}/xdg/quickshell/dms" 2>/dev/null || true
# rmdir "%{_sysconfdir}/xdg/quickshell" 2>/dev/null || true
# rmdir "%{_sysconfdir}/xdg" 2>/dev/null || true
# fi
# # Signal running DMS instances to reload
# pkill -USR1 -x dms >/dev/null 2>&1 || :

# %files
# %license LICENSE
# %doc README.md CONTRIBUTING.md
# %{_datadir}/quickshell/dms/
# %{_userunitdir}/dms.service
# %{_datadir}/applications/dms-open.desktop
# %{_datadir}/icons/hicolor/scalable/apps/danklogo.svg

# %files -n dms-cli
# %{_bindir}/dms
# %{_datadir}/bash-completion/completions/dms
# %{_datadir}/zsh/site-functions/_dms
# %{_datadir}/fish/vendor_completions.d/dms.fish

# %changelog
# * CHANGELOG_DATE_PLACEHOLDER AvengeMedia <contact@avengemedia.com> - VERSION_PLACEHOLDER-1
# - Stable release VERSION_PLACEHOLDER
# - Built from GitHub release
# SPECEOF

# sed -i "s/VERSION_PLACEHOLDER/${VERSION}/g" ~/rpmbuild/SPECS/dms.spec
# sed -i "s/CHANGELOG_DATE_PLACEHOLDER/${CHANGELOG_DATE}/g" ~/rpmbuild/SPECS/dms.spec

# - name: Build SRPM
# id: build
# run: |
# cd ~/rpmbuild/SPECS
# rpmbuild -bs dms.spec

# SRPM=$(ls ~/rpmbuild/SRPMS/*.src.rpm | tail -n 1)
# SRPM_NAME=$(basename "$SRPM")

# echo "srpm_path=$SRPM" >> $GITHUB_OUTPUT
# echo "srpm_name=$SRPM_NAME" >> $GITHUB_OUTPUT
# echo "SRPM built: $SRPM_NAME"

# - name: Upload SRPM artifact
# uses: actions/upload-artifact@v4
# with:
# name: dms-stable-srpm-${{ steps.version.outputs.version }}
# path: ${{ steps.build.outputs.srpm_path }}
# retention-days: 90

# - name: Install Copr CLI
# run: |
# sudo apt-get install -y python3-pip
# pip3 install copr-cli

# mkdir -p ~/.config
# cat > ~/.config/copr << EOF
# [copr-cli]
# login = ${{ secrets.COPR_LOGIN }}
# username = avengemedia
# token = ${{ secrets.COPR_TOKEN }}
# copr_url = https://copr.fedorainfracloud.org
# EOF
# chmod 600 ~/.config/copr

# - name: Upload to Copr
# run: |
# SRPM="${{ steps.build.outputs.srpm_path }}"
# VERSION="${{ steps.version.outputs.version }}"

# echo "Uploading SRPM to avengemedia/dms..."
# BUILD_OUTPUT=$(copr-cli build avengemedia/dms "$SRPM" --nowait 2>&1)
# echo "$BUILD_OUTPUT"

# BUILD_ID=$(echo "$BUILD_OUTPUT" | grep -oP 'Build was added to.*\K[0-9]+' || echo "unknown")

# if [ "$BUILD_ID" != "unknown" ]; then
# echo "Build submitted: https://copr.fedorainfracloud.org/coprs/avengemedia/dms/build/$BUILD_ID/"
# fi

216 .github/workflows/run-copr.yml vendored
@@ -3,17 +3,8 @@ name: DMS Copr Stable Release
on:
workflow_dispatch:
inputs:
package:
description: 'Package to build (dms, dms-greeter, or both)'
required: false
default: 'dms'
type: choice
options:
- dms
- dms-greeter
- both
version:
description: 'Versioning (e.g., 1.0.3, leave empty for latest release)'
description: 'Versioning (e.g., 0.1.14, leave empty for latest release)'
required: false
default: ''
release:
@@ -22,27 +13,8 @@ on:
default: '1'

jobs:
determine-packages:
runs-on: ubuntu-latest
outputs:
packages: ${{ steps.set-packages.outputs.packages }}
steps:
- name: Set package list
id: set-packages
run: |
PACKAGE_INPUT="${{ github.event.inputs.package || 'dms' }}"
if [ "$PACKAGE_INPUT" = "both" ]; then
echo 'packages=["dms","dms-greeter"]' >> $GITHUB_OUTPUT
else
echo "packages=[\"$PACKAGE_INPUT\"]" >> $GITHUB_OUTPUT
fi

build-and-upload:
needs: determine-packages
runs-on: ubuntu-latest
strategy:
matrix:
package: ${{ fromJSON(needs.determine-packages.outputs.packages) }}

steps:
- name: Checkout repository
@@ -67,7 +39,7 @@ jobs:

echo "version=$VERSION" >> $GITHUB_OUTPUT
echo "release=$RELEASE" >> $GITHUB_OUTPUT
echo "✅ Building ${{ matrix.package }} version: $VERSION-$RELEASE"
echo "✅ Building DMS hotfix version: $VERSION-$RELEASE"

- name: Setup build environment
run: |
@@ -98,31 +70,157 @@ jobs:
VERSION="${{ steps.version.outputs.version }}"
RELEASE="${{ steps.version.outputs.release }}"
CHANGELOG_DATE="$(date '+%a %b %d %Y')"
PACKAGE="${{ matrix.package }}"

# Copy spec file from repository
cp distro/fedora/${PACKAGE}.spec ~/rpmbuild/SPECS/${PACKAGE}.spec
cat > ~/rpmbuild/SPECS/dms.spec <<'SPECEOF'
# Spec for DMS stable releases - Generated by GitHub Actions

# Replace placeholders with actual values
sed -i "s/VERSION_PLACEHOLDER/${VERSION}/g" ~/rpmbuild/SPECS/${PACKAGE}.spec
sed -i "s/RELEASE_PLACEHOLDER/${RELEASE}/g" ~/rpmbuild/SPECS/${PACKAGE}.spec
sed -i "s/CHANGELOG_DATE_PLACEHOLDER/${CHANGELOG_DATE}/g" ~/rpmbuild/SPECS/${PACKAGE}.spec
%global debug_package %{nil}
%global version VERSION_PLACEHOLDER
%global pkg_summary DankMaterialShell - Material 3 inspired shell for Wayland compositors

echo "✅ Spec file generated for ${PACKAGE} v${VERSION}-${RELEASE}"
Name: dms
Version: %{version}
Release: RELEASE_PLACEHOLDER%{?dist}
Summary: %{pkg_summary}

License: MIT
URL: https://github.com/AvengeMedia/DankMaterialShell

Source0: dms-qml.tar.gz

BuildRequires: gzip
BuildRequires: wget
BuildRequires: systemd-rpm-macros

Requires: (quickshell or quickshell-git)
Requires: accountsservice
Requires: dms-cli = %{version}-%{release}
Requires: dgop

Recommends: cava
Recommends: cliphist
Recommends: danksearch
Recommends: hyprpicker
Recommends: matugen
Recommends: wl-clipboard
Recommends: NetworkManager
Recommends: qt6-qtmultimedia
Suggests: qt6ct

%description
DankMaterialShell (DMS) is a modern Wayland desktop shell built with Quickshell
and optimized for the niri and hyprland compositors. Features notifications,
app launcher, wallpaper customization, and fully customizable with plugins.

Includes auto-theming for GTK/Qt apps with matugen, 20+ customizable widgets,
process monitoring, notification center, clipboard history, dock, control center,
lock screen, and comprehensive plugin system.

%package -n dms-cli
Summary: DankMaterialShell CLI tool
License: MIT
URL: https://github.com/AvengeMedia/DankMaterialShell

%description -n dms-cli
Command-line interface for DankMaterialShell configuration and management.
Provides native DBus bindings, NetworkManager integration, and system utilities.

%prep
%setup -q -c -n dms-qml

# Download architecture-specific binaries during build
# This ensures the correct architecture is used for each build target
case "%{_arch}" in
x86_64)
ARCH_SUFFIX="amd64"
;;
aarch64)
ARCH_SUFFIX="arm64"
;;
*)
echo "Unsupported architecture: %{_arch}"
exit 1
;;
esac

# Download dms-cli for target architecture
wget -O %{_builddir}/dms-cli.gz "https://github.com/AvengeMedia/DankMaterialShell/releases/latest/download/dms-distropkg-${ARCH_SUFFIX}.gz" || {
echo "Failed to download dms-cli for architecture %{_arch}"
exit 1
}
gunzip -c %{_builddir}/dms-cli.gz > %{_builddir}/dms-cli
chmod +x %{_builddir}/dms-cli

%build

%install
install -Dm755 %{_builddir}/dms-cli %{buildroot}%{_bindir}/dms

# Shell completions
install -d %{buildroot}%{_datadir}/bash-completion/completions
install -d %{buildroot}%{_datadir}/zsh/site-functions
install -d %{buildroot}%{_datadir}/fish/vendor_completions.d
%{_builddir}/dms-cli completion bash > %{buildroot}%{_datadir}/bash-completion/completions/dms || :
%{_builddir}/dms-cli completion zsh > %{buildroot}%{_datadir}/zsh/site-functions/_dms || :
%{_builddir}/dms-cli completion fish > %{buildroot}%{_datadir}/fish/vendor_completions.d/dms.fish || :

install -Dm644 %{_builddir}/dms-qml/assets/systemd/dms.service %{buildroot}%{_userunitdir}/dms.service

install -dm755 %{buildroot}%{_datadir}/quickshell/dms
cp -r %{_builddir}/dms-qml/* %{buildroot}%{_datadir}/quickshell/dms/

rm -rf %{buildroot}%{_datadir}/quickshell/dms/.git*
rm -f %{buildroot}%{_datadir}/quickshell/dms/.gitignore
rm -rf %{buildroot}%{_datadir}/quickshell/dms/.github
rm -rf %{buildroot}%{_datadir}/quickshell/dms/distro

%posttrans
# Clean up old installation path from previous versions (only if empty)
if [ -d "%{_sysconfdir}/xdg/quickshell/dms" ]; then
# Remove directories only if empty (preserves any user-added files)
rmdir "%{_sysconfdir}/xdg/quickshell/dms" 2>/dev/null || true
rmdir "%{_sysconfdir}/xdg/quickshell" 2>/dev/null || true
rmdir "%{_sysconfdir}/xdg" 2>/dev/null || true
fi
# Signal running DMS instances to reload (harmless if none running)
pkill -USR1 -x dms >/dev/null 2>&1 || :

%files
%license LICENSE
%doc README.md CONTRIBUTING.md
%{_datadir}/quickshell/dms/
%{_userunitdir}/dms.service

%files -n dms-cli
%{_bindir}/dms
%{_datadir}/bash-completion/completions/dms
%{_datadir}/zsh/site-functions/_dms
%{_datadir}/fish/vendor_completions.d/dms.fish

%changelog
* CHANGELOG_DATE_PLACEHOLDER AvengeMedia <contact@avengemedia.com> - VERSION_PLACEHOLDER-RELEASE_PLACEHOLDER
- Stable release VERSION_PLACEHOLDER
- Built from GitHub release
SPECEOF

sed -i "s/VERSION_PLACEHOLDER/${VERSION}/g" ~/rpmbuild/SPECS/dms.spec
sed -i "s/RELEASE_PLACEHOLDER/${RELEASE}/g" ~/rpmbuild/SPECS/dms.spec
sed -i "s/CHANGELOG_DATE_PLACEHOLDER/${CHANGELOG_DATE}/g" ~/rpmbuild/SPECS/dms.spec

echo "✅ Spec file generated for v${VERSION}-${RELEASE}"
echo ""
echo "=== Spec file preview ==="
head -40 ~/rpmbuild/SPECS/${PACKAGE}.spec
head -40 ~/rpmbuild/SPECS/dms.spec

- name: Build SRPM
id: build
run: |
cd ~/rpmbuild/SPECS
PACKAGE="${{ matrix.package }}"

echo "🔨 Building SRPM for ${PACKAGE}..."
rpmbuild -bs ${PACKAGE}.spec
echo "🔨 Building SRPM..."
rpmbuild -bs dms.spec

SRPM=$(ls ~/rpmbuild/SRPMS/${PACKAGE}-*.src.rpm | tail -n 1)
SRPM=$(ls ~/rpmbuild/SRPMS/*.src.rpm | tail -n 1)
SRPM_NAME=$(basename "$SRPM")

echo "srpm_path=$SRPM" >> $GITHUB_OUTPUT
@@ -136,7 +234,7 @@ jobs:
- name: Upload SRPM artifact
uses: actions/upload-artifact@v4
with:
name: ${{ matrix.package }}-stable-srpm-${{ steps.version.outputs.version }}
name: dms-stable-srpm-${{ steps.version.outputs.version }}
path: ${{ steps.build.outputs.srpm_path }}
retention-days: 90

@@ -157,40 +255,23 @@ jobs:

echo "✅ Copr CLI configured"

- name: Determine Copr project
id: copr_project
run: |
PACKAGE="${{ matrix.package }}"
if [ "$PACKAGE" = "dms" ]; then
COPR_PROJECT="avengemedia/dms"
elif [ "$PACKAGE" = "dms-greeter" ]; then
COPR_PROJECT="avengemedia/danklinux"
else
echo "❌ Unknown package: $PACKAGE"
exit 1
fi
echo "copr_project=$COPR_PROJECT" >> $GITHUB_OUTPUT
echo "✅ Copr project: $COPR_PROJECT"

- name: Upload to Copr
run: |
SRPM="${{ steps.build.outputs.srpm_path }}"
VERSION="${{ steps.version.outputs.version }}"
COPR_PROJECT="${{ steps.copr_project.outputs.copr_project }}"
PACKAGE="${{ matrix.package }}"

echo "🚀 Uploading ${PACKAGE} SRPM to ${COPR_PROJECT}..."
echo "🚀 Uploading SRPM to avengemedia/dms..."
echo " SRPM: $(basename $SRPM)"
echo " Version: $VERSION"

BUILD_OUTPUT=$(copr-cli build "$COPR_PROJECT" "$SRPM" --nowait 2>&1)
BUILD_OUTPUT=$(copr-cli build avengemedia/dms "$SRPM" --nowait 2>&1)
echo "$BUILD_OUTPUT"

BUILD_ID=$(echo "$BUILD_OUTPUT" | grep -oP 'Build was added to.*\K[0-9]+' || echo "unknown")

if [ "$BUILD_ID" != "unknown" ]; then
echo "✅ Build submitted successfully!"
echo "🔗 https://copr.fedorainfracloud.org/coprs/${COPR_PROJECT}/build/$BUILD_ID/"
echo "🔗 https://copr.fedorainfracloud.org/coprs/avengemedia/dms/build/$BUILD_ID/"
else
echo "⚠️ Could not extract build ID, but upload may have succeeded"
fi
@@ -198,13 +279,10 @@ jobs:
- name: Build summary
if: always()
run: |
PACKAGE="${{ matrix.package }}"
COPR_PROJECT="${{ steps.copr_project.outputs.copr_project }}"
echo "### 🎉 ${PACKAGE} Stable Build Summary" >> $GITHUB_STEP_SUMMARY
echo "### 🎉 DMS Stable Build Summary" >> $GITHUB_STEP_SUMMARY
echo "" >> $GITHUB_STEP_SUMMARY
echo "- **Package:** ${PACKAGE}" >> $GITHUB_STEP_SUMMARY
echo "- **Version:** ${{ steps.version.outputs.version }}-${{ steps.version.outputs.release }}" >> $GITHUB_STEP_SUMMARY
echo "- **SRPM:** ${{ steps.build.outputs.srpm_name }}" >> $GITHUB_STEP_SUMMARY
echo "- **Project:** https://copr.fedorainfracloud.org/coprs/${COPR_PROJECT}/" >> $GITHUB_STEP_SUMMARY
echo "- **Project:** https://copr.fedorainfracloud.org/coprs/avengemedia/dms/" >> $GITHUB_STEP_SUMMARY
echo "" >> $GITHUB_STEP_SUMMARY
echo "Stable release has been built and uploaded to Copr!" >> $GITHUB_STEP_SUMMARY

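The Copr upload step above submits the SRPM with `--nowait`, so the workflow does not block on the remote build. If a job (or a local shell) needs to wait for the result, something along these lines could work; this is a sketch, assuming `copr-cli` is configured as in the workflow and that the build ID was captured into `BUILD_ID`:

```bash
# Sketch: block until the submitted Copr build finishes (assumes copr-cli is configured).
# copr-cli watch-build polls the build and exits non-zero if it ultimately fails.
if [ "$BUILD_ID" != "unknown" ]; then
  copr-cli watch-build "$BUILD_ID"
else
  echo "No build ID captured; check the Copr web UI manually" >&2
fi
```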
71 .github/workflows/run-obs.yml vendored
@@ -7,14 +7,13 @@ on:
description: "Package to update (dms, dms-git, or all)"
required: false
default: "all"
tag_version:
description: "Specific tag version for dms stable (e.g., v1.0.2). Leave empty to auto-detect latest release."
required: false
default: ""
rebuild_release:
description: "Release number for rebuilds (e.g., 2, 3, 4 to increment spec Release)"
required: false
default: ""
push:
tags:
- "v*"
schedule:
- cron: "0 */3 * * *" # Every 3 hours for dms-git builds

@@ -98,7 +97,7 @@ jobs:
# Rebuild requested - always proceed
echo "packages=$PKG" >> $GITHUB_OUTPUT
echo "has_updates=true" >> $GITHUB_OUTPUT
echo "🔄 Manual rebuild requested: $PKG (db$REBUILD)"
echo "🔄 Manual rebuild requested: $PKG (ppa$REBUILD)"

elif [[ "$PKG" == "all" ]]; then
# Check each package and build list of those needing updates
@@ -162,51 +161,16 @@ jobs:
id: packages
run: |
if [[ "${{ github.event_name }}" == "push" && "${{ github.ref }}" =~ ^refs/tags/ ]]; then
# Tag push event - use the pushed tag
echo "packages=dms" >> $GITHUB_OUTPUT
VERSION="${GITHUB_REF#refs/tags/}"
echo "version=$VERSION" >> $GITHUB_OUTPUT
echo "Triggered by tag: $VERSION"
elif [[ "${{ github.event_name }}" == "schedule" ]]; then
# Scheduled run - dms-git only
echo "packages=${{ needs.check-updates.outputs.packages }}" >> $GITHUB_OUTPUT
echo "Triggered by schedule: updating git package"
elif [[ -n "${{ github.event.inputs.package }}" ]]; then
# Manual workflow dispatch

# Determine version for dms stable
if [[ "${{ github.event.inputs.package }}" == "dms" ]]; then
# For explicit dms selection, require tag_version
if [[ -n "${{ github.event.inputs.tag_version }}" ]]; then
VERSION="${{ github.event.inputs.tag_version }}"
echo "version=$VERSION" >> $GITHUB_OUTPUT
echo "Using specified tag: $VERSION"
else
echo "ERROR: tag_version is required when package=dms"
echo "Please specify a tag version (e.g., v1.0.2) or use package=all for auto-detection"
exit 1
fi
elif [[ "${{ github.event.inputs.package }}" == "all" ]]; then
# For "all", auto-detect if tag_version not specified
if [[ -n "${{ github.event.inputs.tag_version }}" ]]; then
VERSION="${{ github.event.inputs.tag_version }}"
echo "version=$VERSION" >> $GITHUB_OUTPUT
echo "Using specified tag: $VERSION"
else
# Auto-detect latest release for "all"
LATEST_TAG=$(curl -s https://api.github.com/repos/AvengeMedia/DankMaterialShell/releases/latest | grep '"tag_name"' | sed 's/.*"tag_name": "\([^"]*\)".*/\1/' || echo "")
if [[ -n "$LATEST_TAG" ]]; then
echo "version=$LATEST_TAG" >> $GITHUB_OUTPUT
echo "Auto-detected latest release: $LATEST_TAG"
else
echo "ERROR: Could not auto-detect latest release"
exit 1
fi
fi
fi

# Use filtered packages from check-updates when package="all" and no rebuild/tag specified
if [[ "${{ github.event.inputs.package }}" == "all" ]] && [[ -z "${{ github.event.inputs.rebuild_release }}" ]] && [[ -z "${{ github.event.inputs.tag_version }}" ]]; then
# Use filtered packages from check-updates when package="all" and no rebuild requested
if [[ "${{ github.event.inputs.package }}" == "all" ]] && [[ -z "${{ github.event.inputs.rebuild_release }}" ]]; then
echo "packages=${{ needs.check-updates.outputs.packages }}" >> $GITHUB_OUTPUT
echo "Manual trigger: all (filtered to: ${{ needs.check-updates.outputs.packages }})"
else
@@ -222,7 +186,7 @@ jobs:
run: |
COMMIT_HASH=$(git rev-parse --short=8 HEAD)
COMMIT_COUNT=$(git rev-list --count HEAD)
BASE_VERSION=$(grep -oP '^Version:\s+\K[0-9.]+' distro/opensuse/dms.spec | head -1 || echo "1.0.2")
BASE_VERSION=$(grep -oP '^Version:\s+\K[0-9.]+' distro/opensuse/dms.spec | head -1 || echo "0.6.2")
NEW_VERSION="${BASE_VERSION}+git${COMMIT_COUNT}.${COMMIT_HASH}"
echo "📦 Updating dms-git.spec to version: $NEW_VERSION"

@@ -243,14 +207,14 @@ jobs:
run: |
COMMIT_HASH=$(git rev-parse --short=8 HEAD)
COMMIT_COUNT=$(git rev-list --count HEAD)
BASE_VERSION=$(grep -oP '^Version:\s+\K[0-9.]+' distro/opensuse/dms.spec | head -1 || echo "1.0.2")
BASE_VERSION=$(grep -oP '^Version:\s+\K[0-9.]+' distro/opensuse/dms.spec | head -1 || echo "0.6.2")
NEW_VERSION="${BASE_VERSION}+git${COMMIT_COUNT}.${COMMIT_HASH}"
echo "📦 Updating Debian dms-git changelog to version: $NEW_VERSION"

# Single changelog entry (git snapshots don't need history)
CHANGELOG_DATE=$(date -R)
{
echo "dms-git (${NEW_VERSION}db1) nightly; urgency=medium"
echo "dms-git ($NEW_VERSION) nightly; urgency=medium"
echo ""
echo " * Git snapshot (commit $COMMIT_COUNT: $COMMIT_HASH)"
echo ""
@@ -262,15 +226,10 @@ jobs:
run: |
VERSION="${{ steps.packages.outputs.version }}"
VERSION_NO_V="${VERSION#v}"
echo "==> Updating packaging files to version: $VERSION_NO_V"
echo "Updating packaging to version $VERSION_NO_V"

# Update spec file
sed -i "s/^Version:.*/Version: $VERSION_NO_V/" distro/opensuse/dms.spec

# Verify the update
UPDATED_VERSION=$(grep -oP '^Version:\s+\K[0-9.]+' distro/opensuse/dms.spec | head -1)
echo "✓ Spec file now shows Version: $UPDATED_VERSION"

# Single changelog entry (full history on OBS website)
DATE_STR=$(date "+%a %b %d %Y")
LOCAL_SPEC_HEAD=$(sed -n '1,/%changelog/{ /%changelog/d; p }' distro/opensuse/dms.spec)
@@ -297,13 +256,13 @@ jobs:
if [[ -f "distro/debian/dms/debian/changelog" ]]; then
CHANGELOG_DATE=$(date -R)
{
echo "dms (${VERSION_NO_V}db1) stable; urgency=medium"
echo "dms ($VERSION_NO_V) stable; urgency=medium"
echo ""
echo " * Update to $VERSION stable release"
echo ""
echo " -- Avenge Media <AvengeMedia.US@gmail.com> $CHANGELOG_DATE"
} > "distro/debian/dms/debian/changelog"
echo "✓ Updated Debian changelog to ${VERSION_NO_V}db1"
echo "✓ Updated Debian changelog to $VERSION_NO_V"
fi

- name: Install Go
@@ -330,7 +289,6 @@ jobs:
- name: Upload to OBS
env:
REBUILD_RELEASE: ${{ github.event.inputs.rebuild_release }}
TAG_VERSION: ${{ steps.packages.outputs.version }}
run: |
PACKAGES="${{ steps.packages.outputs.packages }}"

@@ -342,7 +300,6 @@ jobs:
MESSAGE="Automated update from GitHub Actions"
if [[ -n "${{ steps.packages.outputs.version }}" ]]; then
MESSAGE="Update to ${{ steps.packages.outputs.version }}"
echo "==> Version being uploaded: ${{ steps.packages.outputs.version }}"
fi

# PACKAGES can be space-separated list (e.g., "dms-git dms" from "all" check)
@@ -352,7 +309,7 @@ jobs:
echo "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━"
echo "Uploading $PKG to OBS..."
if [[ -n "$REBUILD_RELEASE" ]]; then
echo "🔄 Using rebuild release number: db$REBUILD_RELEASE"
echo "🔄 Using rebuild release number: ppa$REBUILD_RELEASE"
fi
echo "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━"

@@ -393,7 +350,7 @@ jobs:
echo "" >> $GITHUB_STEP_SUMMARY

if [[ -n "${{ github.event.inputs.rebuild_release }}" ]]; then
echo "**Rebuild Number:** db${{ github.event.inputs.rebuild_release }}" >> $GITHUB_STEP_SUMMARY
echo "**Rebuild Number:** ppa${{ github.event.inputs.rebuild_release }}" >> $GITHUB_STEP_SUMMARY
echo "" >> $GITHUB_STEP_SUMMARY
fi

3 .github/workflows/run-ppa.yml vendored
@@ -51,8 +51,7 @@ jobs:
check_stable_package() {
local PKG="$1"
local PPA_NAME="$2"
# Use git ls-remote to find the latest tag, sorted by version (descending)
local LATEST_TAG=$(git ls-remote --tags --refs --sort='-v:refname' https://github.com/AvengeMedia/DankMaterialShell.git | head -n1 | awk -F/ '{print $NF}' | sed 's/^v//')
local LATEST_TAG=$(curl -s https://api.github.com/repos/AvengeMedia/DankMaterialShell/releases/latest | grep '"tag_name"' | sed 's/.*"tag_name": "v\?\([^"]*\)".*/\1/' || echo "")
local PPA_VERSION=$(curl -s "https://api.launchpad.net/1.0/~avengemedia/+archive/ubuntu/$PPA_NAME?ws.op=getPublishedSources&source_name=$PKG&status=Published" | grep -oP '"source_package_version":\s*"\K[^"]+' | head -1 || echo "")
local PPA_BASE_VERSION=$(echo "$PPA_VERSION" | sed 's/ppa[0-9]*$//')

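The hunk above swaps latest-tag detection between `git ls-remote` and the GitHub releases API parsed with `grep`/`sed`. Where `jq` is available on the runner, the same lookup can be expressed more robustly; a sketch, with `jq` availability being an assumption:

```bash
# Sketch: fetch the latest release tag with jq instead of grep/sed, stripping a leading "v".
LATEST_TAG=$(curl -s https://api.github.com/repos/AvengeMedia/DankMaterialShell/releases/latest \
  | jq -r '.tag_name // empty' | sed 's/^v//')
echo "Latest release: ${LATEST_TAG:-unknown}"
```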
3 .gitignore vendored
@@ -96,12 +96,11 @@ go.work
go.work.sum

# env file
.env*
.env

# Editor/IDE
# .idea/
# .vscode/
vim/

bin/

11 CHANGELOG.MD
@@ -1,17 +1,6 @@
This file is more of a quick reference so I know what to account for before next releases.

# 1.2.0

- Added clipboard and clipboard history integration
- Added swipe to dismiss notification popups and from center
- Added paste from clipboard history view - requires wtype
- Optimize surface damage of OSD & Toast
- Add monitor configurator (niri, Hyprland, MangoWC)
- **BREAKING** ghostty theme changed to ~/.config/ghostty/themes/danktheme
- requires intervention and doc update
- Added desktop widget plugins
- dev guidance available
- builtin clock & dgop widgets
- new IPC targets
- Initial RTL support/i18n
- Theme registry

@@ -163,7 +163,7 @@ quickshell -p quickshell/
inputs.dms.url = "github:AvengeMedia/DankMaterialShell";

# Use in home-manager or NixOS configuration
imports = [ inputs.dms.homeModules.dank-material-shell ];
imports = [ inputs.dms.homeModules.dankMaterialShell.default ];
}
```

@@ -1,10 +1,10 @@
[Desktop Entry]
Type=Application
Name=DMS
Name=DMS Application Picker
Comment=Select an application to open links and files
Exec=dms open %u
Icon=danklogo
Terminal=false
NoDisplay=true
MimeType=x-scheme-handler/http;x-scheme-handler/https;x-scheme-handler/dms;text/html;application/xhtml+xml;
MimeType=x-scheme-handler/http;x-scheme-handler/https;text/html;application/xhtml+xml;
Categories=Utility;

@@ -9,7 +9,7 @@ Type=dbus
BusName=org.freedesktop.Notifications
ExecStart=/usr/bin/dms run --session
ExecReload=/usr/bin/pkill -USR1 -x dms
Restart=on-failure
Restart=always
RestartSec=1.23
TimeoutStopSec=10

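The unit hunk above toggles between `Restart=on-failure` and `Restart=always`. To confirm which policy the installed user unit actually carries after an upgrade, querying systemd directly is enough; a sketch, assuming the unit is installed as `dms.service` in the user manager:

```bash
# Sketch: inspect the effective restart settings of the installed user unit.
systemctl --user show dms.service --property=Restart,RestartUSec,ExecReload
# Pick up edited unit files and apply them to the running service.
systemctl --user daemon-reload && systemctl --user restart dms.service
```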
@@ -14,63 +14,34 @@ Distribution-aware installer with TUI for deploying DMS and compositor configurations

## System Integration

### Wayland Protocols (Client)
**Wayland Protocols**
- `wlr-gamma-control-unstable-v1` - Night mode and gamma control
- `wlr-screencopy-unstable-v1` - Screen capture for color picker
- `wlr-layer-shell-unstable-v1` - Overlay surfaces for color picker
- `wp-viewporter` - Fractional scaling support
- `dwl-ipc-unstable-v2` - dwl/MangoWC workspace integration
- `ext-workspace-v1` - Workspace protocol support
- `wlr-output-management-unstable-v1` - Display configuration

All Wayland protocols are consumed as a client - connecting to the compositor.
**DBus Interfaces**
- NetworkManager/iwd - Network management
- logind - Session control and inhibit locks
- accountsservice - User account information
- CUPS - Printer management
- Custom IPC via unix socket (JSON API)

| Protocol | Purpose |
|---|---|
| `wlr-gamma-control-unstable-v1` | Night mode color temperature control |
| `wlr-screencopy-unstable-v1` | Screen capture for color picker/screenshot |
| `wlr-layer-shell-unstable-v1` | Overlay surfaces for color picker UI/screenshot |
| `wlr-output-management-unstable-v1` | Display configuration |
| `wlr-output-power-management-unstable-v1` | DPMS on/off CLI |
| `wp-viewporter` | Fractional scaling support (color picker/screenshot UIs) |
| `keyboard-shortcuts-inhibit-unstable-v1` | Inhibit compositor shortcuts during color picker/screenshot |
| `ext-data-control-v1` | Clipboard history and persistence |
| `ext-workspace-v1` | Workspace integration |
| `dwl-ipc-unstable-v2` | dwl/MangoWC IPC for tags, outputs, etc. |

### DBus Interfaces

**Client (consuming external services):**

| Interface | Purpose |
|---|---|
| `org.bluez` | Bluetooth management with pairing agent |
| `org.freedesktop.NetworkManager` | Network management |
| `net.connman.iwd` | iwd Wi-Fi backend |
| `org.freedesktop.network1` | systemd-networkd integration |
| `org.freedesktop.login1` | Session control, sleep inhibitors, brightness |
| `org.freedesktop.Accounts` | User account information |
| `org.freedesktop.portal.Desktop` | Desktop appearance settings (color scheme) |
| CUPS via IPP + D-Bus | Printer management with job notifications |

**Server (implementing interfaces):**

| Interface | Purpose |
|---|---|
| `org.freedesktop.ScreenSaver` | Screensaver inhibit for video playback |

Custom IPC via unix socket (JSON API) for shell communication.

### Hardware Control

| Subsystem | Method | Purpose |
|---|---|---|
| DDC/CI | I2C direct | External monitor brightness |
| Backlight | logind or sysfs | Internal display brightness |
| evdev | `/dev/input/event*` | Keyboard state (caps lock LED) |
| udev | netlink monitor | Backlight device updates (for OSD) |

### Plugin System
**Hardware Control**
- DDC/CI protocol - External monitor brightness control (like `ddcutil`)
- Backlight control - Internal display brightness via `login1` or sysfs
- LED control - Keyboard/device LED management
- evdev input monitoring - Keyboard state tracking (caps lock, etc.)

**Plugin System**
- Plugin registry integration
- Plugin lifecycle management
- Settings persistence

## CLI Commands

- `dms run [-d]` - Start shell (optionally as daemon)
- `dms restart` / `dms kill` - Manage running processes
- `dms ipc <command>` - Send IPC commands (toggle launcher, notifications, etc.)
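A quick usage sketch of the commands listed above; the exact IPC target names are not shown in this hunk, so the help lookup below stands in for a concrete target:

```bash
# Sketch: day-to-day use of the dms CLI as documented above.
dms run -d        # start the shell as a daemon
dms ipc --help    # list available IPC targets (launcher, notifications, etc.)
dms restart       # restart a running shell
dms kill          # stop it
```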
@@ -99,7 +70,6 @@ The on-screen preview displays the selected format. JSON output includes hex, RGB
Requires Go 1.24+

**Development build:**

```bash
make # Build dms CLI
make dankinstall # Build installer
@@ -107,7 +77,6 @@ make test # Run tests
```

**Distribution build:**

```bash
make dist # Build without update/greeter features
```
@@ -115,7 +84,6 @@ make dist # Build without update/greeter features
Produces `bin/dms-linux-amd64` and `bin/dms-linux-arm64`

**Installation:**

```bash
sudo make install # Install to /usr/local/bin/dms
```
@@ -123,7 +91,6 @@ sudo make install # Install to /usr/local/bin/dms
## Development

**Setup pre-commit hooks:**

```bash
git config core.hooksPath .githooks
```
@@ -131,7 +98,6 @@ git config core.hooksPath .githooks
This runs gofmt, golangci-lint, tests, and builds before each commit when `core/` files are staged.

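The project's real hook lives in `.githooks/`; as an illustration of the checks described above, a minimal hook could look like the sketch below. The `core/` paths and module layout are assumptions, not the repository's actual script.

```bash
#!/usr/bin/env sh
# Sketch of a pre-commit hook enforcing the checks described above
# when files under core/ are staged. Paths here are illustrative.
set -e
if git diff --cached --name-only | grep -q '^core/'; then
  test -z "$(gofmt -l core)"      # fail if any file still needs formatting
  golangci-lint run ./core/...    # lint
  go test ./core/...              # tests
  go build ./core/...             # build
fi
```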
**Regenerating Wayland Protocol Bindings:**

```bash
go install github.com/rajveermalviya/go-wayland/cmd/go-wayland-scanner@latest
go-wayland-scanner -i internal/proto/xml/wlr-gamma-control-unstable-v1.xml \
@@ -139,7 +105,6 @@ go-wayland-scanner -i internal/proto/xml/wlr-gamma-control-unstable-v1.xml \
```

**Module Structure:**

- `cmd/` - Binary entrypoints (dms, dankinstall)
- `internal/distros/` - Distribution-specific installation logic
- `internal/proto/` - Wayland protocol bindings

@@ -179,7 +179,7 @@ func runBrightnessList(cmd *cobra.Command, args []string) {
|
||||
fmt.Printf("%-*s %-12s %-*s %s\n", idPad, "Device", "Class", namePad, "Name", "Brightness")
|
||||
|
||||
sepLen := idPad + 2 + 12 + 2 + namePad + 2 + 15
|
||||
for range sepLen {
|
||||
for i := 0; i < sepLen; i++ {
|
||||
fmt.Print("─")
|
||||
}
|
||||
fmt.Println()
|
||||
|
||||
@@ -142,6 +142,10 @@ var (
|
||||
clipConfigNoClearStartup bool
|
||||
clipConfigDisabled bool
|
||||
clipConfigEnabled bool
|
||||
clipConfigDisableHistory bool
|
||||
clipConfigEnableHistory bool
|
||||
clipConfigDisablePersist bool
|
||||
clipConfigEnablePersist bool
|
||||
)
|
||||
|
||||
func init() {
|
||||
@@ -165,8 +169,12 @@ func init() {
|
||||
clipConfigSetCmd.Flags().IntVar(&clipConfigAutoClearDays, "auto-clear-days", -1, "Auto-clear entries older than N days (0 to disable)")
|
||||
clipConfigSetCmd.Flags().BoolVar(&clipConfigClearAtStartup, "clear-at-startup", false, "Clear history on startup")
|
||||
clipConfigSetCmd.Flags().BoolVar(&clipConfigNoClearStartup, "no-clear-at-startup", false, "Don't clear history on startup")
|
||||
clipConfigSetCmd.Flags().BoolVar(&clipConfigDisabled, "disable", false, "Disable clipboard tracking")
|
||||
clipConfigSetCmd.Flags().BoolVar(&clipConfigEnabled, "enable", false, "Enable clipboard tracking")
|
||||
clipConfigSetCmd.Flags().BoolVar(&clipConfigDisabled, "disable", false, "Disable clipboard manager entirely")
|
||||
clipConfigSetCmd.Flags().BoolVar(&clipConfigEnabled, "enable", false, "Enable clipboard manager")
|
||||
clipConfigSetCmd.Flags().BoolVar(&clipConfigDisableHistory, "disable-history", false, "Disable clipboard history persistence")
|
||||
clipConfigSetCmd.Flags().BoolVar(&clipConfigEnableHistory, "enable-history", false, "Enable clipboard history persistence")
|
||||
clipConfigSetCmd.Flags().BoolVar(&clipConfigDisablePersist, "disable-persist", false, "Disable clipboard ownership persistence")
|
||||
clipConfigSetCmd.Flags().BoolVar(&clipConfigEnablePersist, "enable-persist", false, "Enable clipboard ownership persistence")
|
||||
|
||||
clipWatchCmd.Flags().BoolVarP(&clipWatchStore, "store", "s", false, "Store clipboard changes to history (no server required)")
|
||||
|
||||
@@ -583,6 +591,18 @@ func runClipConfigSet(cmd *cobra.Command, args []string) {
|
||||
if clipConfigEnabled {
|
||||
params["disabled"] = false
|
||||
}
|
||||
if clipConfigDisableHistory {
|
||||
params["disableHistory"] = true
|
||||
}
|
||||
if clipConfigEnableHistory {
|
||||
params["disableHistory"] = false
|
||||
}
|
||||
if clipConfigDisablePersist {
|
||||
params["disablePersist"] = true
|
||||
}
|
||||
if clipConfigEnablePersist {
|
||||
params["disablePersist"] = false
|
||||
}
|
||||
|
||||
if len(params) == 0 {
|
||||
fmt.Println("No config options specified")
|
||||
|
||||
@@ -171,6 +171,7 @@ var pluginsUpdateCmd = &cobra.Command{
|
||||
}
|
||||
|
||||
func runVersion(cmd *cobra.Command, args []string) {
|
||||
printASCII()
|
||||
fmt.Printf("%s\n", formatVersion(Version))
|
||||
}
|
||||
|
||||
@@ -219,7 +220,7 @@ func getBaseVersion() string {
|
||||
}
|
||||
|
||||
// Fallback
|
||||
return "1.0.2"
|
||||
return "0.6.2"
|
||||
}
|
||||
|
||||
func startDebugServer() error {
|
||||
|
||||
@@ -22,7 +22,6 @@ func init() {
|
||||
dank16Cmd.Flags().Bool("json", false, "Output in JSON format")
|
||||
dank16Cmd.Flags().Bool("kitty", false, "Output in Kitty terminal format")
|
||||
dank16Cmd.Flags().Bool("foot", false, "Output in Foot terminal format")
|
||||
dank16Cmd.Flags().Bool("neovim", false, "Output in Neovim plugin format")
|
||||
dank16Cmd.Flags().Bool("alacritty", false, "Output in Alacritty terminal format")
|
||||
dank16Cmd.Flags().Bool("ghostty", false, "Output in Ghostty terminal format")
|
||||
dank16Cmd.Flags().Bool("wezterm", false, "Output in Wezterm terminal format")
|
||||
@@ -41,7 +40,6 @@ func runDank16(cmd *cobra.Command, args []string) {
|
||||
isJson, _ := cmd.Flags().GetBool("json")
|
||||
isKitty, _ := cmd.Flags().GetBool("kitty")
|
||||
isFoot, _ := cmd.Flags().GetBool("foot")
|
||||
isNeovim, _ := cmd.Flags().GetBool("neovim")
|
||||
isAlacritty, _ := cmd.Flags().GetBool("alacritty")
|
||||
isGhostty, _ := cmd.Flags().GetBool("ghostty")
|
||||
isWezterm, _ := cmd.Flags().GetBool("wezterm")
|
||||
@@ -118,8 +116,6 @@ func runDank16(cmd *cobra.Command, args []string) {
|
||||
fmt.Print(dank16.GenerateGhosttyTheme(colors))
|
||||
} else if isWezterm {
|
||||
fmt.Print(dank16.GenerateWeztermTheme(colors))
|
||||
} else if isNeovim {
|
||||
fmt.Print(dank16.GenerateNeovimTheme(colors))
|
||||
} else {
|
||||
fmt.Print(dank16.GenerateGhosttyTheme(colors))
|
||||
}
|
||||
|
||||
@@ -377,7 +377,7 @@ func updateDMSBinary() error {
|
||||
}
|
||||
|
||||
version := ""
|
||||
for line := range strings.SplitSeq(string(output), "\n") {
|
||||
for _, line := range strings.Split(string(output), "\n") {
|
||||
if strings.Contains(line, "\"tag_name\"") {
|
||||
parts := strings.Split(line, "\"")
|
||||
if len(parts) >= 4 {
|
||||
@@ -443,7 +443,7 @@ func updateDMSBinary() error {
|
||||
|
||||
decompressedPath := filepath.Join(tempDir, "dms")
|
||||
|
||||
if err := os.Chmod(decompressedPath, 0o755); err != nil {
|
||||
if err := os.Chmod(decompressedPath, 0755); err != nil {
|
||||
return fmt.Errorf("failed to make binary executable: %w", err)
|
||||
}
|
||||
|
||||
|
||||
@@ -211,8 +211,8 @@ func checkGroupExists(groupName string) bool {
return false
}

lines := strings.SplitSeq(string(data), "\n")
for line := range lines {
lines := strings.Split(string(data), "\n")
for _, line := range lines {
if strings.HasPrefix(line, groupName+":") {
return true
}
@@ -521,7 +521,7 @@ func enableGreeter() error {
newConfig := strings.Join(finalLines, "\n")

tmpFile := "/tmp/greetd-config.toml"
if err := os.WriteFile(tmpFile, []byte(newConfig), 0o644); err != nil {
if err := os.WriteFile(tmpFile, []byte(newConfig), 0644); err != nil {
return fmt.Errorf("failed to write temp config: %w", err)
}

@@ -592,8 +592,8 @@ func checkGreeterStatus() error {
if data, err := os.ReadFile(configPath); err == nil {
configContent := string(data)
if strings.Contains(configContent, "dms-greeter") {
lines := strings.SplitSeq(configContent, "\n")
for line := range lines {
lines := strings.Split(configContent, "\n")
for _, line := range lines {
trimmed := strings.TrimSpace(line)
if strings.HasPrefix(trimmed, "command =") || strings.HasPrefix(trimmed, "command=") {
parts := strings.SplitN(trimmed, "=", 2)

@@ -131,12 +131,6 @@ func runOpen(target string) {
detectedRequestType = "url"
}
log.Infof("Detected HTTP(S) URL")
} else if strings.HasPrefix(target, "dms://") {
// Handle DMS internal URLs (theme/plugin install, etc.)
if detectedRequestType == "" {
detectedRequestType = "url"
}
log.Infof("Detected DMS internal URL")
} else if _, err := os.Stat(target); err == nil {
// Handle local file paths directly (not file:// URIs)
// Convert to absolute path
@@ -183,7 +177,7 @@ func runOpen(target string) {
}

method := "apppicker.open"
if detectedMimeType == "" && len(detectedCategories) == 0 && (strings.HasPrefix(target, "http://") || strings.HasPrefix(target, "https://") || strings.HasPrefix(target, "dms://")) {
if detectedMimeType == "" && len(detectedCategories) == 0 && (strings.HasPrefix(target, "http://") || strings.HasPrefix(target, "https://")) {
method = "browser.open"
params["url"] = target
}

@@ -50,18 +50,15 @@ func findConfig(cmd *cobra.Command, args []string) error {

configStateFile := filepath.Join(getRuntimeDir(), "danklinux.path")
if data, readErr := os.ReadFile(configStateFile); readErr == nil {
if len(getAllDMSPIDs()) == 0 {
os.Remove(configStateFile)
} else {
statePath := strings.TrimSpace(string(data))
shellPath := filepath.Join(statePath, "shell.qml")
statePath := strings.TrimSpace(string(data))
shellPath := filepath.Join(statePath, "shell.qml")

if info, statErr := os.Stat(shellPath); statErr == nil && !info.IsDir() {
log.Debug("Using config from active session state file: %s", statePath)
configPath = statePath
log.Debug("Using config from: %s", configPath)
return nil
}
if info, statErr := os.Stat(shellPath); statErr == nil && !info.IsDir() {
log.Debug("Using config from active session state file: %s", statePath)
configPath = statePath
log.Debug("Using config from: %s", configPath)
return nil // <-- Guard statement
} else {
os.Remove(configStateFile)
}
}

@@ -87,14 +87,20 @@ func newDPMSClient() (*dpmsClient, error) {
switch e.Interface {
case wlr_output_power.ZwlrOutputPowerManagerV1InterfaceName:
powerMgr := wlr_output_power.NewZwlrOutputPowerManagerV1(c.ctx)
version := min(e.Version, 1)
version := e.Version
if version > 1 {
version = 1
}
if err := registry.Bind(e.Name, e.Interface, version, powerMgr); err == nil {
c.powerMgr = powerMgr
}

case "wl_output":
output := wlclient.NewOutput(c.ctx)
version := min(e.Version, 4)
version := e.Version
if version > 4 {
version = 4
}
if err := registry.Bind(e.Name, e.Interface, version, output); err == nil {
outputID := fmt.Sprintf("output-%d", output.ID())
state := &outputState{

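Both registry cases cap the version passed to registry.Bind at the highest protocol revision this client implements, written once with the built-in min (Go 1.21+) and once as an explicit clamp. A standalone sketch of the same clamp; wlOutputMaxVersion is an illustrative constant, not a name from the repository:

```go
package main

import "fmt"

// Highest wl_output revision this client implements (illustrative value).
const wlOutputMaxVersion = 4

// clampVersion picks the protocol version to bind: never newer than what we
// support, never newer than what the compositor advertises.
func clampVersion(advertised uint32) uint32 {
	return min(advertised, wlOutputMaxVersion) // built-in min, Go 1.21+
}

// clampVersionCompat is the equivalent spelled out for older toolchains.
func clampVersionCompat(advertised uint32) uint32 {
	v := advertised
	if v > wlOutputMaxVersion {
		v = wlOutputMaxVersion
	}
	return v
}

func main() {
	for _, adv := range []uint32{1, 4, 7} {
		fmt.Println(adv, "->", clampVersion(adv), clampVersionCompat(adv))
	}
}
```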
@@ -7,10 +7,8 @@ import (
"os/exec"
"os/signal"
"path/filepath"
"slices"
"strconv"
"strings"
"sync"
"syscall"
"time"

@@ -20,25 +18,6 @@ import (

type ipcTargets map[string]map[string][]string

// getProcessExitCode returns the exit code from a ProcessState.
// For normal exits, returns the exit code directly.
// For signal termination, returns 128 + signal number (Unix convention).
func getProcessExitCode(state *os.ProcessState) int {
if state == nil {
return 1
}
if code := state.ExitCode(); code != -1 {
return code
}
// Process was killed by signal - extract signal number
if status, ok := state.Sys().(syscall.WaitStatus); ok {
if status.Signaled() {
return 128 + int(status.Signal())
}
}
return 1
}

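As the comment says, getProcessExitCode folds signal deaths into the usual shell convention of 128 plus the signal number, so a child killed by SIGKILL is reported as 137 instead of -1. A small sketch of that behaviour, assuming a Unix host with a sleep binary on PATH; exitCode here mirrors the helper above rather than calling it:

```go
package main

import (
	"fmt"
	"os"
	"os/exec"
	"syscall"
	"time"
)

// exitCode mirrors the convention above: real exit codes pass through,
// signal deaths become 128 + signal number, anything unknown becomes 1.
func exitCode(state *os.ProcessState) int {
	if state == nil {
		return 1
	}
	if code := state.ExitCode(); code != -1 {
		return code
	}
	if status, ok := state.Sys().(syscall.WaitStatus); ok && status.Signaled() {
		return 128 + int(status.Signal())
	}
	return 1
}

func main() {
	cmd := exec.Command("sleep", "10")
	_ = cmd.Start()
	time.Sleep(100 * time.Millisecond)
	_ = cmd.Process.Kill() // delivers SIGKILL on Unix
	_ = cmd.Wait()         // populates cmd.ProcessState
	fmt.Println(exitCode(cmd.ProcessState)) // prints 137 (128 + 9)
}
```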
var isSessionManaged bool

func execDetachedRestart(targetPID int) {
@@ -201,16 +180,6 @@ func runShellInteractive(session bool) {
}
}

if os.Getenv("QT_QPA_PLATFORMTHEME") == "" {
cmd.Env = append(cmd.Env, "QT_QPA_PLATFORMTHEME=gtk3")
}
if os.Getenv("QT_QPA_PLATFORMTHEME_QT6") == "" {
cmd.Env = append(cmd.Env, "QT_QPA_PLATFORMTHEME_QT6=gtk3")
}
if os.Getenv("QT_QPA_PLATFORM") == "" {
cmd.Env = append(cmd.Env, "QT_QPA_PLATFORM=wayland")
}

cmd.Stdin = os.Stdin
cmd.Stdout = os.Stdout
cmd.Stderr = os.Stderr
@@ -245,28 +214,14 @@ func runShellInteractive(session bool) {
for {
select {
case sig := <-sigChan:
if sig == syscall.SIGUSR1 {
if isSessionManaged {
log.Infof("Received SIGUSR1, exiting for systemd restart...")
cancel()
cmd.Process.Signal(syscall.SIGTERM)
os.Remove(socketPath)
os.Exit(1)
}
// Handle SIGUSR1 restart for non-session managed processes
if sig == syscall.SIGUSR1 && !isSessionManaged {
log.Infof("Received SIGUSR1, spawning detached restart process...")
execDetachedRestart(os.Getpid())
// Exit immediately to avoid race conditions with detached restart
return
}

// Check if qs already crashed before we got SIGTERM (systemd sends SIGTERM when D-Bus name is released)
select {
case <-errChan:
cancel()
os.Remove(socketPath)
os.Exit(getProcessExitCode(cmd.ProcessState))
case <-time.After(500 * time.Millisecond):
}

log.Infof("\nReceived signal %v, shutting down...", sig)
cancel()
cmd.Process.Signal(syscall.SIGTERM)
@@ -280,7 +235,7 @@ func runShellInteractive(session bool) {
cmd.Process.Signal(syscall.SIGTERM)
}
os.Remove(socketPath)
os.Exit(getProcessExitCode(cmd.ProcessState))
os.Exit(1)
}
}
}
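The loop above treats SIGUSR1 two ways: under systemd session management it exits non-zero and lets the unit restart it, while a standalone run hands off to a detached restart helper and returns; any other signal falls through to a clean shutdown. A stripped-down sketch of that split, with restartDetached standing in for the real helper:

```go
package main

import (
	"log"
	"os"
	"os/signal"
	"syscall"
)

// restartDetached stands in for the real detached-restart helper.
func restartDetached() { log.Println("spawning detached restart...") }

func handleSignals(sessionManaged bool) {
	sigChan := make(chan os.Signal, 1)
	signal.Notify(sigChan, syscall.SIGUSR1, syscall.SIGTERM, syscall.SIGINT)

	for sig := range sigChan {
		if sig == syscall.SIGUSR1 {
			if sessionManaged {
				// systemd restarts us after a non-zero exit.
				log.Println("SIGUSR1: exiting for systemd restart")
				os.Exit(1)
			}
			// Standalone: restart ourselves and bail out immediately.
			restartDetached()
			return
		}
		// Any other signal: clean shutdown.
		log.Printf("signal %v: shutting down", sig)
		return
	}
}

func main() { handleSignals(false) }
```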
@@ -373,7 +328,13 @@ func killShell() {

func runShellDaemon(session bool) {
isSessionManaged = session
isDaemonChild := slices.Contains(os.Args, "--daemon-child")
isDaemonChild := false
for _, arg := range os.Args {
if arg == "--daemon-child" {
isDaemonChild = true
break
}
}

if !isDaemonChild {
fmt.Fprintf(os.Stderr, "dms %s\n", Version)
@@ -439,16 +400,6 @@ func runShellDaemon(session bool) {
}
}

if os.Getenv("QT_QPA_PLATFORMTHEME") == "" {
cmd.Env = append(cmd.Env, "QT_QPA_PLATFORMTHEME=gtk3")
}
if os.Getenv("QT_QPA_PLATFORMTHEME_QT6") == "" {
cmd.Env = append(cmd.Env, "QT_QPA_PLATFORMTHEME_QT6=gtk3")
}
if os.Getenv("QT_QPA_PLATFORM") == "" {
cmd.Env = append(cmd.Env, "QT_QPA_PLATFORM=wayland")
}

devNull, err := os.OpenFile("/dev/null", os.O_RDWR, 0)
if err != nil {
log.Fatalf("Error opening /dev/null: %v", err)
@@ -489,28 +440,15 @@ func runShellDaemon(session bool) {
for {
select {
case sig := <-sigChan:
if sig == syscall.SIGUSR1 {
if isSessionManaged {
log.Infof("Received SIGUSR1, exiting for systemd restart...")
cancel()
cmd.Process.Signal(syscall.SIGTERM)
os.Remove(socketPath)
os.Exit(1)
}
// Handle SIGUSR1 restart for non-session managed processes
if sig == syscall.SIGUSR1 && !isSessionManaged {
log.Infof("Received SIGUSR1, spawning detached restart process...")
execDetachedRestart(os.Getpid())
// Exit immediately to avoid race conditions with detached restart
return
}

// Check if qs already crashed before we got SIGTERM (systemd sends SIGTERM when D-Bus name is released)
select {
case <-errChan:
cancel()
os.Remove(socketPath)
os.Exit(getProcessExitCode(cmd.ProcessState))
case <-time.After(500 * time.Millisecond):
}

// All other signals: clean shutdown
cancel()
cmd.Process.Signal(syscall.SIGTERM)
os.Remove(socketPath)
@@ -522,25 +460,17 @@ func runShellDaemon(session bool) {
cmd.Process.Signal(syscall.SIGTERM)
}
os.Remove(socketPath)
os.Exit(getProcessExitCode(cmd.ProcessState))
os.Exit(1)
}
}
}

var qsHasAnyDisplay = sync.OnceValue(func() bool {
out, err := exec.Command("qs", "ipc", "--help").Output()
if err != nil {
return false
}
return strings.Contains(string(out), "--any-display")
})

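qsHasAnyDisplay wraps the capability probe in sync.OnceValue (Go 1.21+), so `qs ipc --help` is spawned at most once per process and every later call returns the cached boolean. The same pattern in isolation, assuming only that a qs binary may or may not be on PATH:

```go
package main

import (
	"fmt"
	"os/exec"
	"strings"
	"sync"
)

// supportsAnyDisplay is computed on first call and cached for the rest of
// the process lifetime; concurrent callers share the single probe.
var supportsAnyDisplay = sync.OnceValue(func() bool {
	out, err := exec.Command("qs", "ipc", "--help").Output()
	if err != nil {
		return false // treat a missing or failing binary as "no support"
	}
	return strings.Contains(string(out), "--any-display")
})

func main() {
	// The subprocess runs only for the first call.
	fmt.Println(supportsAnyDisplay(), supportsAnyDisplay())
}
```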
func parseTargetsFromIPCShowOutput(output string) ipcTargets {
|
||||
targets := make(ipcTargets)
|
||||
var currentTarget string
|
||||
for line := range strings.SplitSeq(output, "\n") {
|
||||
if after, ok := strings.CutPrefix(line, "target "); ok {
|
||||
currentTarget = strings.TrimSpace(after)
|
||||
for _, line := range strings.Split(output, "\n") {
|
||||
if strings.HasPrefix(line, "target ") {
|
||||
currentTarget = strings.TrimSpace(strings.TrimPrefix(line, "target "))
|
||||
targets[currentTarget] = make(map[string][]string)
|
||||
}
|
||||
if strings.HasPrefix(line, " function") && currentTarget != "" {
|
||||
@@ -565,11 +495,7 @@ func parseTargetsFromIPCShowOutput(output string) ipcTargets {
|
||||
}
|
||||
|
||||
func getShellIPCCompletions(args []string, _ string) []string {
|
||||
cmdArgs := []string{"ipc"}
|
||||
if qsHasAnyDisplay() {
|
||||
cmdArgs = append(cmdArgs, "--any-display")
|
||||
}
|
||||
cmdArgs = append(cmdArgs, "-p", configPath, "show")
|
||||
cmdArgs := []string{"-p", configPath, "ipc", "show"}
|
||||
cmd := exec.Command("qs", cmdArgs...)
|
||||
var targets ipcTargets
|
||||
|
||||
@@ -623,12 +549,7 @@ func runShellIPCCommand(args []string) {
|
||||
args = append([]string{"call"}, args...)
|
||||
}
|
||||
|
||||
cmdArgs := []string{"ipc"}
|
||||
if qsHasAnyDisplay() {
|
||||
cmdArgs = append(cmdArgs, "--any-display")
|
||||
}
|
||||
cmdArgs = append(cmdArgs, "-p", configPath)
|
||||
cmdArgs = append(cmdArgs, args...)
|
||||
cmdArgs := append([]string{"-p", configPath, "ipc"}, args...)
|
||||
cmd := exec.Command("qs", cmdArgs...)
|
||||
cmd.Stdin = os.Stdin
|
||||
cmd.Stdout = os.Stdout
|
||||
|
||||
@@ -3,7 +3,6 @@ package main
|
||||
import (
|
||||
"fmt"
|
||||
"os/exec"
|
||||
"slices"
|
||||
"strings"
|
||||
)
|
||||
|
||||
@@ -37,7 +36,13 @@ func checkSystemdServiceEnabled(serviceName string) (string, bool, error) {
|
||||
|
||||
if err != nil {
|
||||
knownStates := []string{"disabled", "masked", "masked-runtime", "not-found", "enabled", "enabled-runtime", "static", "indirect", "alias"}
|
||||
isKnownState := slices.Contains(knownStates, stateStr)
|
||||
isKnownState := false
|
||||
for _, known := range knownStates {
|
||||
if stateStr == known {
|
||||
isKnownState = true
|
||||
break
|
||||
}
|
||||
}
|
||||
|
||||
if !isKnownState {
|
||||
return stateStr, false, fmt.Errorf("systemctl is-enabled failed: %w (output: %s)", err, stateStr)
|
||||
|
||||
@@ -221,7 +221,10 @@ func (p *Picker) handleGlobal(e client.RegistryGlobalEvent) {
|
||||
|
||||
case client.OutputInterfaceName:
|
||||
output := client.NewOutput(p.ctx)
|
||||
version := min(e.Version, 4)
|
||||
version := e.Version
|
||||
if version > 4 {
|
||||
version = 4
|
||||
}
|
||||
if err := p.registry.Bind(e.Name, e.Interface, version, output); err == nil {
|
||||
p.outputsMu.Lock()
|
||||
p.outputs[e.Name] = &Output{
|
||||
@@ -236,14 +239,20 @@ func (p *Picker) handleGlobal(e client.RegistryGlobalEvent) {
|
||||
|
||||
case wlr_layer_shell.ZwlrLayerShellV1InterfaceName:
|
||||
layerShell := wlr_layer_shell.NewZwlrLayerShellV1(p.ctx)
|
||||
version := min(e.Version, 4)
|
||||
version := e.Version
|
||||
if version > 4 {
|
||||
version = 4
|
||||
}
|
||||
if err := p.registry.Bind(e.Name, e.Interface, version, layerShell); err == nil {
|
||||
p.layerShell = layerShell
|
||||
}
|
||||
|
||||
case wlr_screencopy.ZwlrScreencopyManagerV1InterfaceName:
|
||||
screencopy := wlr_screencopy.NewZwlrScreencopyManagerV1(p.ctx)
|
||||
version := min(e.Version, 3)
|
||||
version := e.Version
|
||||
if version > 3 {
|
||||
version = 3
|
||||
}
|
||||
if err := p.registry.Bind(e.Name, e.Interface, version, screencopy); err == nil {
|
||||
p.screencopy = screencopy
|
||||
}
|
||||
|
||||
@@ -1157,7 +1157,7 @@ func drawGlyph(data []byte, stride, width, height, x, y int, r rune, col Color,
|
||||
rOff, bOff = 2, 0
|
||||
}
|
||||
|
||||
for row := range fontH {
|
||||
for row := 0; row < fontH; row++ {
|
||||
yy := y + row
|
||||
if yy < 0 || yy >= height {
|
||||
continue
|
||||
@@ -1165,7 +1165,7 @@ func drawGlyph(data []byte, stride, width, height, x, y int, r rune, col Color,
|
||||
rowPattern := g[row]
|
||||
dstRowOff := yy * stride
|
||||
|
||||
for colIdx := range fontW {
|
||||
for colIdx := 0; colIdx < fontW; colIdx++ {
|
||||
if (rowPattern & (1 << (fontW - 1 - colIdx))) == 0 {
|
||||
continue
|
||||
}
|
||||
|
||||
@@ -14,11 +14,11 @@ func TestSurfaceState_ConcurrentPointerMotion(t *testing.T) {
|
||||
const goroutines = 50
|
||||
const iterations = 100
|
||||
|
||||
for i := range goroutines {
|
||||
for i := 0; i < goroutines; i++ {
|
||||
wg.Add(1)
|
||||
go func(id int) {
|
||||
defer wg.Done()
|
||||
for j := range iterations {
|
||||
for j := 0; j < iterations; j++ {
|
||||
s.OnPointerMotion(float64(id*10+j), float64(id*10+j))
|
||||
}
|
||||
}(i)
|
||||
@@ -34,21 +34,21 @@ func TestSurfaceState_ConcurrentScaleAccess(t *testing.T) {
|
||||
const goroutines = 30
|
||||
const iterations = 100
|
||||
|
||||
for i := range goroutines / 2 {
|
||||
for i := 0; i < goroutines/2; i++ {
|
||||
wg.Add(1)
|
||||
go func(id int) {
|
||||
defer wg.Done()
|
||||
for range iterations {
|
||||
for j := 0; j < iterations; j++ {
|
||||
s.SetScale(int32(id%3 + 1))
|
||||
}
|
||||
}(i)
|
||||
}
|
||||
|
||||
for range goroutines / 2 {
|
||||
for i := 0; i < goroutines/2; i++ {
|
||||
wg.Add(1)
|
||||
go func() {
|
||||
defer wg.Done()
|
||||
for range iterations {
|
||||
for j := 0; j < iterations; j++ {
|
||||
scale := s.Scale()
|
||||
assert.GreaterOrEqual(t, scale, int32(1))
|
||||
}
|
||||
@@ -65,21 +65,21 @@ func TestSurfaceState_ConcurrentLogicalSize(t *testing.T) {
|
||||
const goroutines = 20
|
||||
const iterations = 100
|
||||
|
||||
for i := range goroutines / 2 {
|
||||
for i := 0; i < goroutines/2; i++ {
|
||||
wg.Add(1)
|
||||
go func(id int) {
|
||||
defer wg.Done()
|
||||
for j := range iterations {
|
||||
for j := 0; j < iterations; j++ {
|
||||
_ = s.OnLayerConfigure(1920+id, 1080+j)
|
||||
}
|
||||
}(i)
|
||||
}
|
||||
|
||||
for range goroutines / 2 {
|
||||
for i := 0; i < goroutines/2; i++ {
|
||||
wg.Add(1)
|
||||
go func() {
|
||||
defer wg.Done()
|
||||
for range iterations {
|
||||
for j := 0; j < iterations; j++ {
|
||||
w, h := s.LogicalSize()
|
||||
_ = w
|
||||
_ = h
|
||||
@@ -97,31 +97,31 @@ func TestSurfaceState_ConcurrentIsDone(t *testing.T) {
|
||||
const goroutines = 30
|
||||
const iterations = 100
|
||||
|
||||
for range goroutines / 3 {
|
||||
for i := 0; i < goroutines/3; i++ {
|
||||
wg.Add(1)
|
||||
go func() {
|
||||
defer wg.Done()
|
||||
for range iterations {
|
||||
for j := 0; j < iterations; j++ {
|
||||
s.OnPointerButton(0x110, 1)
|
||||
}
|
||||
}()
|
||||
}
|
||||
|
||||
for range goroutines / 3 {
|
||||
for i := 0; i < goroutines/3; i++ {
|
||||
wg.Add(1)
|
||||
go func() {
|
||||
defer wg.Done()
|
||||
for range iterations {
|
||||
for j := 0; j < iterations; j++ {
|
||||
s.OnKey(1, 1)
|
||||
}
|
||||
}()
|
||||
}
|
||||
|
||||
for range goroutines / 3 {
|
||||
for i := 0; i < goroutines/3; i++ {
|
||||
wg.Add(1)
|
||||
go func() {
|
||||
defer wg.Done()
|
||||
for range iterations {
|
||||
for j := 0; j < iterations; j++ {
|
||||
picked, cancelled := s.IsDone()
|
||||
_ = picked
|
||||
_ = cancelled
|
||||
@@ -139,11 +139,11 @@ func TestSurfaceState_ConcurrentIsReady(t *testing.T) {
|
||||
const goroutines = 20
|
||||
const iterations = 100
|
||||
|
||||
for range goroutines {
|
||||
for i := 0; i < goroutines; i++ {
|
||||
wg.Add(1)
|
||||
go func() {
|
||||
defer wg.Done()
|
||||
for range iterations {
|
||||
for j := 0; j < iterations; j++ {
|
||||
_ = s.IsReady()
|
||||
}
|
||||
}()
|
||||
@@ -159,11 +159,11 @@ func TestSurfaceState_ConcurrentSwapBuffers(t *testing.T) {
|
||||
const goroutines = 20
|
||||
const iterations = 100
|
||||
|
||||
for range goroutines {
|
||||
for i := 0; i < goroutines; i++ {
|
||||
wg.Add(1)
|
||||
go func() {
|
||||
defer wg.Done()
|
||||
for range iterations {
|
||||
for j := 0; j < iterations; j++ {
|
||||
s.SwapBuffers()
|
||||
}
|
||||
}()
|
||||
|
||||
@@ -213,11 +213,6 @@ func (cd *ConfigDeployer) deployNiriDmsConfigs(dmsDir, terminalCommand string) e
|
||||
|
||||
for _, cfg := range configs {
|
||||
path := filepath.Join(dmsDir, cfg.name)
|
||||
// Skip if file already exists to preserve user modifications
|
||||
if _, err := os.Stat(path); err == nil {
|
||||
cd.log(fmt.Sprintf("Skipping %s (already exists)", cfg.name))
|
||||
continue
|
||||
}
|
||||
if err := os.WriteFile(path, []byte(cfg.content), 0644); err != nil {
|
||||
return fmt.Errorf("failed to write %s: %w", cfg.name, err)
|
||||
}
|
||||
@@ -270,13 +265,7 @@ func (cd *ConfigDeployer) deployGhosttyConfig() ([]DeploymentResult, error) {
|
||||
|
||||
colorResult := DeploymentResult{
|
||||
ConfigType: "Ghostty Colors",
|
||||
Path: filepath.Join(os.Getenv("HOME"), ".config", "ghostty", "themes", "dankcolors"),
|
||||
}
|
||||
|
||||
themesDir := filepath.Dir(colorResult.Path)
|
||||
if err := os.MkdirAll(themesDir, 0755); err != nil {
|
||||
mainResult.Error = fmt.Errorf("failed to create themes directory: %w", err)
|
||||
return []DeploymentResult{mainResult}, mainResult.Error
|
||||
Path: filepath.Join(os.Getenv("HOME"), ".config", "ghostty", "config-dankcolors"),
|
||||
}
|
||||
|
||||
if err := os.WriteFile(colorResult.Path, []byte(GhosttyColorConfig), 0644); err != nil {
|
||||
|
||||
@@ -462,13 +462,13 @@ func TestHyprlandConfigStructure(t *testing.T) {
|
||||
assert.Contains(t, HyprlandConfig, "# KEYBINDINGS")
|
||||
assert.Contains(t, HyprlandConfig, "bind = $mod, T, exec, {{TERMINAL_COMMAND}}")
|
||||
assert.Contains(t, HyprlandConfig, "bind = $mod, space, exec, dms ipc call spotlight toggle")
|
||||
assert.Contains(t, HyprlandConfig, "windowrule = border_size 0, match:class ^(com\\.mitchellh\\.ghostty)$")
|
||||
assert.Contains(t, HyprlandConfig, "windowrulev2 = noborder, class:^(com\\.mitchellh\\.ghostty)$")
|
||||
}
|
||||
|
||||
func TestGhosttyConfigStructure(t *testing.T) {
|
||||
assert.Contains(t, GhosttyConfig, "window-decoration = false")
|
||||
assert.Contains(t, GhosttyConfig, "background-opacity = 1.0")
|
||||
assert.Contains(t, GhosttyConfig, "theme = dankcolors")
|
||||
assert.Contains(t, GhosttyConfig, "config-file = ./config-dankcolors")
|
||||
}
|
||||
|
||||
func TestGhosttyColorConfigStructure(t *testing.T) {
|
||||
|
||||
@@ -21,7 +21,7 @@ func LocateDMSConfig() (string, error) {
|
||||
dataDirs = "/usr/local/share:/usr/share"
|
||||
}
|
||||
|
||||
for dir := range strings.SplitSeq(dataDirs, ":") {
|
||||
for _, dir := range strings.Split(dataDirs, ":") {
|
||||
if dir != "" {
|
||||
primaryPaths = append(primaryPaths, filepath.Join(dir, "quickshell", "dms"))
|
||||
}
|
||||
@@ -33,7 +33,7 @@ func LocateDMSConfig() (string, error) {
|
||||
configDirs = "/etc/xdg"
|
||||
}
|
||||
|
||||
for dir := range strings.SplitSeq(configDirs, ":") {
|
||||
for _, dir := range strings.Split(configDirs, ":") {
|
||||
if dir != "" {
|
||||
primaryPaths = append(primaryPaths, filepath.Join(dir, "quickshell", "dms"))
|
||||
}
|
||||
|
||||
@@ -48,4 +48,4 @@ keybind = shift+enter=text:\n
|
||||
gtk-single-instance = true
|
||||
|
||||
# Dank color generation
|
||||
theme = dankcolors
|
||||
config-file = ./config-dankcolors
|
||||
|
||||
@@ -90,36 +90,36 @@ misc {
|
||||
# ==================
|
||||
# WINDOW RULES
|
||||
# ==================
|
||||
windowrule = tile on, match:class ^(org\.wezfurlong\.wezterm)$
|
||||
windowrulev2 = tile, class:^(org\.wezfurlong\.wezterm)$
|
||||
|
||||
windowrule = rounding 12, match:class ^(org\.gnome\.)
|
||||
windowrule = border_size 0, match:class ^(org\.gnome\.)
|
||||
windowrulev2 = rounding 12, class:^(org\.gnome\.)
|
||||
windowrulev2 = noborder, class:^(org\.gnome\.)
|
||||
|
||||
windowrule = tile on, match:class ^(gnome-control-center)$
|
||||
windowrule = tile on, match:class ^(pavucontrol)$
|
||||
windowrule = tile on, match:class ^(nm-connection-editor)$
|
||||
windowrulev2 = tile, class:^(gnome-control-center)$
|
||||
windowrulev2 = tile, class:^(pavucontrol)$
|
||||
windowrulev2 = tile, class:^(nm-connection-editor)$
|
||||
|
||||
windowrule = float on, match:class ^(gnome-calculator)$
|
||||
windowrule = float on, match:class ^(galculator)$
|
||||
windowrule = float on, match:class ^(blueman-manager)$
|
||||
windowrule = float on, match:class ^(org\.gnome\.Nautilus)$
|
||||
windowrule = float on, match:class ^(steam)$
|
||||
windowrule = float on, match:class ^(xdg-desktop-portal)$
|
||||
windowrulev2 = float, class:^(gnome-calculator)$
|
||||
windowrulev2 = float, class:^(galculator)$
|
||||
windowrulev2 = float, class:^(blueman-manager)$
|
||||
windowrulev2 = float, class:^(org\.gnome\.Nautilus)$
|
||||
windowrulev2 = float, class:^(steam)$
|
||||
windowrulev2 = float, class:^(xdg-desktop-portal)$
|
||||
|
||||
windowrule = border_size 0, match:class ^(org\.wezfurlong\.wezterm)$
|
||||
windowrule = border_size 0, match:class ^(Alacritty)$
|
||||
windowrule = border_size 0, match:class ^(zen)$
|
||||
windowrule = border_size 0, match:class ^(com\.mitchellh\.ghostty)$
|
||||
windowrule = border_size 0, match:class ^(kitty)$
|
||||
windowrulev2 = noborder, class:^(org\.wezfurlong\.wezterm)$
|
||||
windowrulev2 = noborder, class:^(Alacritty)$
|
||||
windowrulev2 = noborder, class:^(zen)$
|
||||
windowrulev2 = noborder, class:^(com\.mitchellh\.ghostty)$
|
||||
windowrulev2 = noborder, class:^(kitty)$
|
||||
|
||||
windowrule = float on, match:class ^(firefox)$, match:title ^(Picture-in-Picture)$
|
||||
windowrule = float on, match:class ^(zoom)$
|
||||
windowrulev2 = float, class:^(firefox)$, title:^(Picture-in-Picture)$
|
||||
windowrulev2 = float, class:^(zoom)$
|
||||
|
||||
# DMS windows floating by default
|
||||
windowrule = float on, match:class ^(org.quickshell)$
|
||||
windowrule = opacity 0.9 0.9, match:float false, match:focus false
|
||||
windowrulev2 = float, class:^(org.quickshell)$
|
||||
windowrulev2 = opacity 0.9 0.9, floating:0, focus:0
|
||||
|
||||
layerrule = no_anim on, match:namespace ^(quickshell)$
|
||||
layerrule = noanim, ^(quickshell)$
|
||||
|
||||
# ==================
|
||||
# KEYBINDINGS
|
||||
|
||||
@@ -345,7 +345,7 @@ func EnsureContrastDPSLstar(hexColor, hexBg string, minLc float64, isLightMode b
|
||||
}
|
||||
|
||||
step := 0.5
|
||||
for range 120 {
|
||||
for i := 0; i < 120; i++ {
|
||||
Lf = math.Max(0, math.Min(100, Lf+dir*step))
|
||||
cand := labToHex(Lf, af, bf)
|
||||
if DeltaPhiStarContrast(cand, hexBg, isLightMode) >= minLc {
|
||||
|
||||
@@ -658,7 +658,7 @@ func TestContrastAlgorithmComparison(t *testing.T) {
|
||||
}
|
||||
|
||||
differentCount := 0
|
||||
for i := range 16 {
|
||||
for i := 0; i < 16; i++ {
|
||||
if wcagColors[i].Hex != dpsColors[i].Hex {
|
||||
differentCount++
|
||||
}
|
||||
|
||||
@@ -112,24 +112,3 @@ func GenerateWeztermTheme(p Palette) string {
|
||||
p.Color12.Hex, p.Color13.Hex, p.Color14.Hex, p.Color15.Hex)
|
||||
return result.String()
|
||||
}
|
||||
|
||||
func GenerateNeovimTheme(p Palette) string {
|
||||
var result strings.Builder
|
||||
fmt.Fprintf(&result, "vim.g.terminal_color_0 = \"%s\"\n", p.Color0.Hex)
|
||||
fmt.Fprintf(&result, "vim.g.terminal_color_1 = \"%s\"\n", p.Color1.Hex)
|
||||
fmt.Fprintf(&result, "vim.g.terminal_color_2 = \"%s\"\n", p.Color2.Hex)
|
||||
fmt.Fprintf(&result, "vim.g.terminal_color_3 = \"%s\"\n", p.Color3.Hex)
|
||||
fmt.Fprintf(&result, "vim.g.terminal_color_4 = \"%s\"\n", p.Color4.Hex)
|
||||
fmt.Fprintf(&result, "vim.g.terminal_color_5 = \"%s\"\n", p.Color5.Hex)
|
||||
fmt.Fprintf(&result, "vim.g.terminal_color_6 = \"%s\"\n", p.Color6.Hex)
|
||||
fmt.Fprintf(&result, "vim.g.terminal_color_7 = \"%s\"\n", p.Color7.Hex)
|
||||
fmt.Fprintf(&result, "vim.g.terminal_color_8 = \"%s\"\n", p.Color8.Hex)
|
||||
fmt.Fprintf(&result, "vim.g.terminal_color_9 = \"%s\"\n", p.Color9.Hex)
|
||||
fmt.Fprintf(&result, "vim.g.terminal_color_10 = \"%s\"\n", p.Color10.Hex)
|
||||
fmt.Fprintf(&result, "vim.g.terminal_color_11 = \"%s\"\n", p.Color11.Hex)
|
||||
fmt.Fprintf(&result, "vim.g.terminal_color_12 = \"%s\"\n", p.Color12.Hex)
|
||||
fmt.Fprintf(&result, "vim.g.terminal_color_13 = \"%s\"\n", p.Color13.Hex)
|
||||
fmt.Fprintf(&result, "vim.g.terminal_color_14 = \"%s\"\n", p.Color14.Hex)
|
||||
fmt.Fprintf(&result, "vim.g.terminal_color_15 = \"%s\"\n", p.Color15.Hex)
|
||||
return result.String()
|
||||
}
|
||||
|
||||
@@ -7,7 +7,6 @@ import (
|
||||
"os/exec"
|
||||
"path/filepath"
|
||||
"runtime"
|
||||
"slices"
|
||||
"strings"
|
||||
|
||||
"github.com/AvengeMedia/DankMaterialShell/core/internal/deps"
|
||||
@@ -515,9 +514,12 @@ func (a *ArchDistribution) reorderAURPackages(packages []string) []string {
|
||||
dmsShell = append(dmsShell, pkg)
|
||||
} else {
|
||||
isDep := false
|
||||
if slices.Contains(dmsDepencies, pkg) {
|
||||
deps = append(deps, pkg)
|
||||
isDep = true
|
||||
for _, dep := range dmsDepencies {
|
||||
if pkg == dep {
|
||||
deps = append(deps, pkg)
|
||||
isDep = true
|
||||
break
|
||||
}
|
||||
}
|
||||
if !isDep {
|
||||
others = append(others, pkg)
|
||||
@@ -543,7 +545,7 @@ func (a *ArchDistribution) installSingleAURPackage(ctx context.Context, pkg, sud
|
||||
a.log(fmt.Sprintf("Warning: failed to clean existing cache for %s: %v", pkg, err))
|
||||
}
|
||||
|
||||
if err := os.MkdirAll(buildDir, 0o755); err != nil {
|
||||
if err := os.MkdirAll(buildDir, 0755); err != nil {
|
||||
return fmt.Errorf("failed to create build directory: %w", err)
|
||||
}
|
||||
defer func() {
|
||||
|
||||
@@ -550,7 +550,10 @@ func (b *BaseDistribution) WriteEnvironmentConfig(terminal deps.Terminal) error
|
||||
terminalCmd = "ghostty"
|
||||
}
|
||||
|
||||
content := fmt.Sprintf(`ELECTRON_OZONE_PLATFORM_HINT=auto
|
||||
content := fmt.Sprintf(`QT_QPA_PLATFORM=wayland
|
||||
ELECTRON_OZONE_PLATFORM_HINT=auto
|
||||
QT_QPA_PLATFORMTHEME=gtk3
|
||||
QT_QPA_PLATFORMTHEME_QT6=gtk3
|
||||
TERMINAL=%s
|
||||
`, terminalCmd)
|
||||
|
||||
@@ -564,6 +567,12 @@ TERMINAL=%s
|
||||
}
|
||||
|
||||
func (b *BaseDistribution) EnableDMSService(ctx context.Context, wm deps.WindowManager) error {
|
||||
cmd := exec.CommandContext(ctx, "systemctl", "--user", "enable", "--now", "dms")
|
||||
if err := cmd.Run(); err != nil {
|
||||
return fmt.Errorf("failed to enable dms service: %w", err)
|
||||
}
|
||||
b.log("Enabled dms systemd user service")
|
||||
|
||||
switch wm {
|
||||
case deps.WindowManagerNiri:
|
||||
if err := exec.CommandContext(ctx, "systemctl", "--user", "add-wants", "niri.service", "dms").Run(); err != nil {
|
||||
|
||||
@@ -4,7 +4,6 @@ import (
|
||||
"context"
|
||||
"fmt"
|
||||
"os/exec"
|
||||
"runtime"
|
||||
"strings"
|
||||
|
||||
"github.com/AvengeMedia/DankMaterialShell/core/internal/deps"
|
||||
@@ -385,8 +384,6 @@ func (d *DebianDistribution) enableOBSRepos(ctx context.Context, obsPkgs []Packa
|
||||
debianVersion := "Debian_13"
|
||||
if osInfo.VersionID == "testing" {
|
||||
debianVersion = "Debian_Testing"
|
||||
} else if osInfo.VersionCodename == "sid" || osInfo.VersionID == "sid" || strings.Contains(strings.ToLower(osInfo.PrettyName), "sid") || strings.Contains(strings.ToLower(osInfo.PrettyName), "unstable") {
|
||||
debianVersion = "Debian_Unstable"
|
||||
}
|
||||
|
||||
for _, pkg := range obsPkgs {
|
||||
@@ -430,7 +427,7 @@ func (d *DebianDistribution) enableOBSRepos(ctx context.Context, obsPkgs []Packa
|
||||
}
|
||||
|
||||
// Add repository
|
||||
repoLine := fmt.Sprintf("deb [signed-by=%s, arch=%s] %s/ /", keyringPath, runtime.GOARCH, baseURL)
|
||||
repoLine := fmt.Sprintf("deb [signed-by=%s] %s/ /", keyringPath, baseURL)
|
||||
|
||||
progressChan <- InstallProgressMsg{
|
||||
Phase: PhaseSystemPackages,
|
||||
|
||||
@@ -18,8 +18,8 @@ type ManualPackageInstaller struct {
|
||||
|
||||
// parseLatestTagFromGitOutput parses git ls-remote output and returns the latest tag
|
||||
func (m *ManualPackageInstaller) parseLatestTagFromGitOutput(output string) string {
|
||||
lines := strings.SplitSeq(output, "\n")
|
||||
for line := range lines {
|
||||
lines := strings.Split(output, "\n")
|
||||
for _, line := range lines {
|
||||
if strings.Contains(line, "refs/tags/") && !strings.Contains(line, "^{}") {
|
||||
parts := strings.Split(line, "refs/tags/")
|
||||
if len(parts) > 1 {
|
||||
@@ -103,12 +103,12 @@ func (m *ManualPackageInstaller) installDgop(ctx context.Context, sudoPassword s
|
||||
}
|
||||
|
||||
cacheDir := filepath.Join(homeDir, ".cache", "dankinstall")
|
||||
if err := os.MkdirAll(cacheDir, 0o755); err != nil {
|
||||
if err := os.MkdirAll(cacheDir, 0755); err != nil {
|
||||
return fmt.Errorf("failed to create cache directory: %w", err)
|
||||
}
|
||||
|
||||
tmpDir := filepath.Join(cacheDir, "dgop-build")
|
||||
if err := os.MkdirAll(tmpDir, 0o755); err != nil {
|
||||
if err := os.MkdirAll(tmpDir, 0755); err != nil {
|
||||
return fmt.Errorf("failed to create temp directory: %w", err)
|
||||
}
|
||||
defer os.RemoveAll(tmpDir)
|
||||
@@ -160,10 +160,10 @@ func (m *ManualPackageInstaller) installNiri(ctx context.Context, sudoPassword s
|
||||
homeDir, _ := os.UserHomeDir()
|
||||
buildDir := filepath.Join(homeDir, ".cache", "dankinstall", "niri-build")
|
||||
tmpDir := filepath.Join(homeDir, ".cache", "dankinstall", "tmp")
|
||||
if err := os.MkdirAll(buildDir, 0o755); err != nil {
|
||||
if err := os.MkdirAll(buildDir, 0755); err != nil {
|
||||
return fmt.Errorf("failed to create build directory: %w", err)
|
||||
}
|
||||
if err := os.MkdirAll(tmpDir, 0o755); err != nil {
|
||||
if err := os.MkdirAll(tmpDir, 0755); err != nil {
|
||||
return fmt.Errorf("failed to create temp directory: %w", err)
|
||||
}
|
||||
defer func() {
|
||||
@@ -237,12 +237,12 @@ func (m *ManualPackageInstaller) installQuickshell(ctx context.Context, variant
|
||||
}
|
||||
|
||||
cacheDir := filepath.Join(homeDir, ".cache", "dankinstall")
|
||||
if err := os.MkdirAll(cacheDir, 0o755); err != nil {
|
||||
if err := os.MkdirAll(cacheDir, 0755); err != nil {
|
||||
return fmt.Errorf("failed to create cache directory: %w", err)
|
||||
}
|
||||
|
||||
tmpDir := filepath.Join(cacheDir, "quickshell-build")
|
||||
if err := os.MkdirAll(tmpDir, 0o755); err != nil {
|
||||
if err := os.MkdirAll(tmpDir, 0755); err != nil {
|
||||
return fmt.Errorf("failed to create temp directory: %w", err)
|
||||
}
|
||||
defer os.RemoveAll(tmpDir)
|
||||
@@ -273,7 +273,7 @@ func (m *ManualPackageInstaller) installQuickshell(ctx context.Context, variant
|
||||
}
|
||||
|
||||
buildDir := tmpDir + "/build"
|
||||
if err := os.MkdirAll(buildDir, 0o755); err != nil {
|
||||
if err := os.MkdirAll(buildDir, 0755); err != nil {
|
||||
return fmt.Errorf("failed to create build directory: %w", err)
|
||||
}
|
||||
|
||||
@@ -343,12 +343,12 @@ func (m *ManualPackageInstaller) installHyprland(ctx context.Context, sudoPasswo
|
||||
}
|
||||
|
||||
cacheDir := filepath.Join(homeDir, ".cache", "dankinstall")
|
||||
if err := os.MkdirAll(cacheDir, 0o755); err != nil {
|
||||
if err := os.MkdirAll(cacheDir, 0755); err != nil {
|
||||
return fmt.Errorf("failed to create cache directory: %w", err)
|
||||
}
|
||||
|
||||
tmpDir := filepath.Join(cacheDir, "hyprland-build")
|
||||
if err := os.MkdirAll(tmpDir, 0o755); err != nil {
|
||||
if err := os.MkdirAll(tmpDir, 0755); err != nil {
|
||||
return fmt.Errorf("failed to create temp directory: %w", err)
|
||||
}
|
||||
defer os.RemoveAll(tmpDir)
|
||||
@@ -406,12 +406,12 @@ func (m *ManualPackageInstaller) installGhostty(ctx context.Context, sudoPasswor
|
||||
}
|
||||
|
||||
cacheDir := filepath.Join(homeDir, ".cache", "dankinstall")
|
||||
if err := os.MkdirAll(cacheDir, 0o755); err != nil {
|
||||
if err := os.MkdirAll(cacheDir, 0755); err != nil {
|
||||
return fmt.Errorf("failed to create cache directory: %w", err)
|
||||
}
|
||||
|
||||
tmpDir := filepath.Join(cacheDir, "ghostty-build")
|
||||
if err := os.MkdirAll(tmpDir, 0o755); err != nil {
|
||||
if err := os.MkdirAll(tmpDir, 0755); err != nil {
|
||||
return fmt.Errorf("failed to create temp directory: %w", err)
|
||||
}
|
||||
defer os.RemoveAll(tmpDir)
|
||||
@@ -528,7 +528,7 @@ func (m *ManualPackageInstaller) installDankMaterialShell(ctx context.Context, v
|
||||
}
|
||||
|
||||
configDir := filepath.Dir(dmsPath)
|
||||
if err := os.MkdirAll(configDir, 0o755); err != nil {
|
||||
if err := os.MkdirAll(configDir, 0755); err != nil {
|
||||
return fmt.Errorf("failed to create quickshell config directory: %w", err)
|
||||
}
|
||||
|
||||
|
||||
@@ -15,12 +15,6 @@ func init() {
|
||||
Register("opensuse-tumbleweed", "#73BA25", FamilySUSE, func(config DistroConfig, logChan chan<- string) Distribution {
|
||||
return NewOpenSUSEDistribution(config, logChan)
|
||||
})
|
||||
Register("opensuse-leap", "#73BA25", FamilySUSE, func(config DistroConfig, logChan chan<- string) Distribution {
|
||||
return NewOpenSUSEDistribution(config, logChan)
|
||||
})
|
||||
Register("opensuse-slowroll", "#73BA25", FamilySUSE, func(config DistroConfig, logChan chan<- string) Distribution {
|
||||
return NewOpenSUSEDistribution(config, logChan)
|
||||
})
|
||||
}
|
||||
|
||||
type OpenSUSEDistribution struct {
|
||||
@@ -440,19 +434,6 @@ func (o *OpenSUSEDistribution) extractPackageNames(packages []PackageMapping) []
|
||||
func (o *OpenSUSEDistribution) enableOBSRepos(ctx context.Context, obsPkgs []PackageMapping, sudoPassword string, progressChan chan<- InstallProgressMsg) error {
|
||||
enabledRepos := make(map[string]bool)
|
||||
|
||||
osInfo, err := GetOSInfo()
|
||||
if err != nil {
|
||||
return fmt.Errorf("failed to get OS info: %w", err)
|
||||
}
|
||||
|
||||
obsDistroVersion := "openSUSE_Tumbleweed"
|
||||
switch osInfo.Distribution.ID {
|
||||
case "opensuse-leap":
|
||||
obsDistroVersion = fmt.Sprintf("openSUSE_Leap_%s", osInfo.VersionID)
|
||||
case "opensuse-slowroll":
|
||||
obsDistroVersion = "openSUSE_Slowroll"
|
||||
}
|
||||
|
||||
for _, pkg := range obsPkgs {
|
||||
if pkg.RepoURL != "" && !enabledRepos[pkg.RepoURL] {
|
||||
o.log(fmt.Sprintf("Enabling OBS repository: %s", pkg.RepoURL))
|
||||
@@ -460,8 +441,8 @@ func (o *OpenSUSEDistribution) enableOBSRepos(ctx context.Context, obsPkgs []Pac
|
||||
// RepoURL format: "home:AvengeMedia:danklinux"
|
||||
repoPath := strings.ReplaceAll(pkg.RepoURL, ":", ":/")
|
||||
repoName := strings.ReplaceAll(pkg.RepoURL, ":", "-")
|
||||
repoURL := fmt.Sprintf("https://download.opensuse.org/repositories/%s/%s/%s.repo",
|
||||
repoPath, obsDistroVersion, pkg.RepoURL)
|
||||
repoURL := fmt.Sprintf("https://download.opensuse.org/repositories/%s/openSUSE_Tumbleweed/%s.repo",
|
||||
repoPath, pkg.RepoURL)
|
||||
|
||||
checkCmd := exec.CommandContext(ctx, "zypper", "repos", repoName)
|
||||
if checkCmd.Run() == nil {
|
||||
|
||||
@@ -19,12 +19,11 @@ type DistroInfo struct {
|
||||
|
||||
// OSInfo contains complete OS information
|
||||
type OSInfo struct {
|
||||
Distribution DistroInfo
|
||||
Version string
|
||||
VersionID string
|
||||
VersionCodename string
|
||||
PrettyName string
|
||||
Architecture string
|
||||
Distribution DistroInfo
|
||||
Version string
|
||||
VersionID string
|
||||
PrettyName string
|
||||
Architecture string
|
||||
}
|
||||
|
||||
// GetOSInfo detects the current OS and returns information about it
|
||||
@@ -73,8 +72,6 @@ func GetOSInfo() (*OSInfo, error) {
|
||||
info.VersionID = value
|
||||
case "VERSION":
|
||||
info.Version = value
|
||||
case "VERSION_CODENAME":
|
||||
info.VersionCodename = value
|
||||
case "PRETTY_NAME":
|
||||
info.PrettyName = value
|
||||
}
|
||||
@@ -103,10 +100,6 @@ func IsUnsupportedDistro(distroID, versionID string) bool {
|
||||
}
|
||||
|
||||
if distroID == "debian" {
|
||||
// unstable/sid support
|
||||
if versionID == "sid" {
|
||||
return false
|
||||
}
|
||||
if versionID == "" {
|
||||
// debian testing/sid have no version ID
|
||||
return false
|
||||
|
||||
@@ -23,7 +23,7 @@ func DefaultDiscoveryConfig() *DiscoveryConfig {
|
||||
|
||||
configDirs := os.Getenv("XDG_CONFIG_DIRS")
|
||||
if configDirs != "" {
|
||||
for dir := range strings.SplitSeq(configDirs, ":") {
|
||||
for _, dir := range strings.Split(configDirs, ":") {
|
||||
if dir != "" {
|
||||
searchPaths = append(searchPaths, filepath.Join(dir, "DankMaterialShell", "cheatsheets"))
|
||||
}
|
||||
|
||||
@@ -12,7 +12,7 @@ func TestNewJSONFileProvider(t *testing.T) {
|
||||
tmpDir := t.TempDir()
|
||||
testFile := filepath.Join(tmpDir, "test.json")
|
||||
|
||||
if err := os.WriteFile(testFile, []byte("{}"), 0o644); err != nil {
|
||||
if err := os.WriteFile(testFile, []byte("{}"), 0644); err != nil {
|
||||
t.Fatalf("Failed to create test file: %v", err)
|
||||
}
|
||||
|
||||
@@ -81,7 +81,7 @@ func TestJSONFileProviderGetCheatSheet(t *testing.T) {
|
||||
}
|
||||
}`
|
||||
|
||||
if err := os.WriteFile(testFile, []byte(content), 0o644); err != nil {
|
||||
if err := os.WriteFile(testFile, []byte(content), 0644); err != nil {
|
||||
t.Fatalf("Failed to write test file: %v", err)
|
||||
}
|
||||
|
||||
@@ -135,7 +135,7 @@ func TestJSONFileProviderGetCheatSheetNoProvider(t *testing.T) {
|
||||
"binds": {}
|
||||
}`
|
||||
|
||||
if err := os.WriteFile(testFile, []byte(content), 0o644); err != nil {
|
||||
if err := os.WriteFile(testFile, []byte(content), 0644); err != nil {
|
||||
t.Fatalf("Failed to write test file: %v", err)
|
||||
}
|
||||
|
||||
@@ -181,7 +181,7 @@ func TestJSONFileProviderFlatArrayBackwardsCompat(t *testing.T) {
|
||||
]
|
||||
}`
|
||||
|
||||
if err := os.WriteFile(testFile, []byte(content), 0o644); err != nil {
|
||||
if err := os.WriteFile(testFile, []byte(content), 0644); err != nil {
|
||||
t.Fatalf("Failed to write test file: %v", err)
|
||||
}
|
||||
|
||||
@@ -216,7 +216,7 @@ func TestJSONFileProviderInvalidJSON(t *testing.T) {
|
||||
tmpDir := t.TempDir()
|
||||
testFile := filepath.Join(tmpDir, "invalid.json")
|
||||
|
||||
if err := os.WriteFile(testFile, []byte("not valid json"), 0o644); err != nil {
|
||||
if err := os.WriteFile(testFile, []byte("not valid json"), 0644); err != nil {
|
||||
t.Fatalf("Failed to write test file: %v", err)
|
||||
}
|
||||
|
||||
|
||||
@@ -234,61 +234,53 @@ output_path = '%s'
|
||||
if !opts.ShouldSkipTemplate("gtk") {
|
||||
switch opts.Mode {
|
||||
case "light":
|
||||
appendConfig(opts, cfgFile, nil, "gtk3-light.toml")
|
||||
appendConfig(opts, cfgFile, "skip", "gtk3-light.toml")
|
||||
default:
|
||||
appendConfig(opts, cfgFile, nil, "gtk3-dark.toml")
|
||||
appendConfig(opts, cfgFile, "skip", "gtk3-dark.toml")
|
||||
}
|
||||
}
|
||||
|
||||
if !opts.ShouldSkipTemplate("niri") {
|
||||
appendConfig(opts, cfgFile, []string{"niri"}, "niri.toml")
|
||||
appendConfig(opts, cfgFile, "niri", "niri.toml")
|
||||
}
|
||||
if !opts.ShouldSkipTemplate("qt5ct") {
|
||||
appendConfig(opts, cfgFile, []string{"qt5ct"}, "qt5ct.toml")
|
||||
appendConfig(opts, cfgFile, "qt5ct", "qt5ct.toml")
|
||||
}
|
||||
if !opts.ShouldSkipTemplate("qt6ct") {
|
||||
appendConfig(opts, cfgFile, []string{"qt6ct"}, "qt6ct.toml")
|
||||
appendConfig(opts, cfgFile, "qt6ct", "qt6ct.toml")
|
||||
}
|
||||
if !opts.ShouldSkipTemplate("firefox") {
|
||||
appendConfig(opts, cfgFile, []string{"firefox"}, "firefox.toml")
|
||||
appendConfig(opts, cfgFile, "firefox", "firefox.toml")
|
||||
}
|
||||
if !opts.ShouldSkipTemplate("pywalfox") {
|
||||
appendConfig(opts, cfgFile, []string{"pywalfox"}, "pywalfox.toml")
|
||||
}
|
||||
if !opts.ShouldSkipTemplate("zenbrowser") {
|
||||
appendConfig(opts, cfgFile, []string{"zen", "zen-browser"}, "zenbrowser.toml")
|
||||
appendConfig(opts, cfgFile, "pywalfox", "pywalfox.toml")
|
||||
}
|
||||
if !opts.ShouldSkipTemplate("vesktop") {
|
||||
appendConfig(opts, cfgFile, []string{"vesktop"}, "vesktop.toml")
|
||||
}
|
||||
if !opts.ShouldSkipTemplate("equibop") {
|
||||
appendConfig(opts, cfgFile, []string{"equibop"}, "equibop.toml")
|
||||
appendConfig(opts, cfgFile, "vesktop", "vesktop.toml")
|
||||
}
|
||||
|
||||
if !opts.ShouldSkipTemplate("ghostty") {
|
||||
appendTerminalConfig(opts, cfgFile, tmpDir, []string{"ghostty"}, "ghostty.toml")
|
||||
appendTerminalConfig(opts, cfgFile, tmpDir, "ghostty", "ghostty.toml")
|
||||
}
|
||||
if !opts.ShouldSkipTemplate("kitty") {
|
||||
appendTerminalConfig(opts, cfgFile, tmpDir, []string{"kitty"}, "kitty.toml")
|
||||
appendTerminalConfig(opts, cfgFile, tmpDir, "kitty", "kitty.toml")
|
||||
}
|
||||
if !opts.ShouldSkipTemplate("foot") {
|
||||
appendTerminalConfig(opts, cfgFile, tmpDir, []string{"foot"}, "foot.toml")
|
||||
appendTerminalConfig(opts, cfgFile, tmpDir, "foot", "foot.toml")
|
||||
}
|
||||
if !opts.ShouldSkipTemplate("alacritty") {
|
||||
appendTerminalConfig(opts, cfgFile, tmpDir, []string{"alacritty"}, "alacritty.toml")
|
||||
appendTerminalConfig(opts, cfgFile, tmpDir, "alacritty", "alacritty.toml")
|
||||
}
|
||||
if !opts.ShouldSkipTemplate("wezterm") {
|
||||
appendTerminalConfig(opts, cfgFile, tmpDir, []string{"wezterm"}, "wezterm.toml")
|
||||
}
|
||||
if !opts.ShouldSkipTemplate("nvim") {
|
||||
appendTerminalConfig(opts, cfgFile, tmpDir, []string{"nvim"}, "neovim.toml")
|
||||
appendTerminalConfig(opts, cfgFile, tmpDir, "wezterm", "wezterm.toml")
|
||||
}
|
||||
|
||||
if !opts.ShouldSkipTemplate("dgop") {
|
||||
appendConfig(opts, cfgFile, []string{"dgop"}, "dgop.toml")
|
||||
appendConfig(opts, cfgFile, "dgop", "dgop.toml")
|
||||
}
|
||||
|
||||
if !opts.ShouldSkipTemplate("kcolorscheme") {
|
||||
appendConfig(opts, cfgFile, nil, "kcolorscheme.toml")
|
||||
appendConfig(opts, cfgFile, "skip", "kcolorscheme.toml")
|
||||
}
|
||||
|
||||
if !opts.ShouldSkipTemplate("vscode") {
|
||||
@@ -326,12 +318,12 @@ output_path = '%s'
|
||||
return nil
|
||||
}
|
||||
|
||||
func appendConfig(opts *Options, cfgFile *os.File, checkCmd []string, fileName string) {
|
||||
func appendConfig(opts *Options, cfgFile *os.File, checkCmd, fileName string) {
|
||||
configPath := filepath.Join(opts.ShellDir, "matugen", "configs", fileName)
|
||||
if _, err := os.Stat(configPath); err != nil {
|
||||
return
|
||||
}
|
||||
if len(checkCmd) > 0 && !utils.AnyCommandExists(checkCmd...) {
|
||||
if checkCmd != "skip" && !utils.CommandExists(checkCmd) {
|
||||
return
|
||||
}
|
||||
data, err := os.ReadFile(configPath)
|
||||
@@ -342,12 +334,12 @@ func appendConfig(opts *Options, cfgFile *os.File, checkCmd []string, fileName s
|
||||
cfgFile.WriteString("\n")
|
||||
}
|
||||
|
||||
func appendTerminalConfig(opts *Options, cfgFile *os.File, tmpDir string, checkCmd []string, fileName string) {
|
||||
func appendTerminalConfig(opts *Options, cfgFile *os.File, tmpDir, checkCmd, fileName string) {
|
||||
configPath := filepath.Join(opts.ShellDir, "matugen", "configs", fileName)
|
||||
if _, err := os.Stat(configPath); err != nil {
|
||||
return
|
||||
}
|
||||
if len(checkCmd) > 0 && !utils.AnyCommandExists(checkCmd...) {
|
||||
if checkCmd != "skip" && !utils.CommandExists(checkCmd) {
|
||||
return
|
||||
}
|
||||
data, err := os.ReadFile(configPath)
|
||||
|
||||
@@ -141,7 +141,7 @@ func (r *RegionSelector) setupKeyboardHandlers() {
|
||||
for _, os := range r.surfaces {
|
||||
r.redrawSurface(os)
|
||||
}
|
||||
case 28, 57, 96:
|
||||
case 28, 57:
|
||||
if r.selection.hasSelection {
|
||||
r.finishSelection()
|
||||
}
|
||||
|
||||
@@ -205,6 +205,12 @@ func handleSetConfig(conn net.Conn, req models.Request, m *Manager) {
|
||||
if v, ok := req.Params["disabled"].(bool); ok {
|
||||
cfg.Disabled = v
|
||||
}
|
||||
if v, ok := req.Params["disableHistory"].(bool); ok {
|
||||
cfg.DisableHistory = v
|
||||
}
|
||||
if v, ok := req.Params["disablePersist"].(bool); ok {
|
||||
cfg.DisablePersist = v
|
||||
}
|
||||
|
||||
if err := m.SetConfig(cfg); err != nil {
|
||||
models.RespondError(conn, req.ID, err.Error())
|
||||
|
||||
@@ -11,16 +11,14 @@ import (
|
||||
"io"
|
||||
"os"
|
||||
"path/filepath"
|
||||
"slices"
|
||||
"strings"
|
||||
"syscall"
|
||||
"time"
|
||||
|
||||
"hash/fnv"
|
||||
|
||||
"github.com/fsnotify/fsnotify"
|
||||
_ "golang.org/x/image/bmp"
|
||||
_ "golang.org/x/image/tiff"
|
||||
"hash/fnv"
|
||||
|
||||
bolt "go.etcd.io/bbolt"
|
||||
|
||||
@@ -30,12 +28,11 @@ import (
|
||||
wlclient "github.com/AvengeMedia/DankMaterialShell/core/pkg/go-wayland/wayland/client"
|
||||
)
|
||||
|
||||
// These mime types wont be stored in history
|
||||
var sensitiveMimeTypes = []string{
|
||||
"x-kde-passwordManagerHint",
|
||||
}
|
||||
|
||||
func NewManager(wlCtx wlcontext.WaylandContext, config Config) (*Manager, error) {
|
||||
if config.Disabled {
|
||||
return nil, fmt.Errorf("clipboard disabled in config")
|
||||
}
|
||||
|
||||
display := wlCtx.Display()
|
||||
dbPath, err := getDBPath()
|
||||
if err != nil {
|
||||
@@ -57,10 +54,8 @@ func NewManager(wlCtx wlcontext.WaylandContext, config Config) (*Manager, error)
|
||||
dbPath: dbPath,
|
||||
}
|
||||
|
||||
if !config.Disabled {
|
||||
if err := m.setupRegistry(); err != nil {
|
||||
return nil, err
|
||||
}
|
||||
if err := m.setupRegistry(); err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
m.notifierWg.Add(1)
|
||||
@@ -68,17 +63,17 @@ func NewManager(wlCtx wlcontext.WaylandContext, config Config) (*Manager, error)
|
||||
|
||||
go m.watchConfig()
|
||||
|
||||
db, err := openDB(dbPath)
|
||||
if err != nil {
|
||||
return nil, fmt.Errorf("failed to open db: %w", err)
|
||||
}
|
||||
m.db = db
|
||||
if !config.DisableHistory {
|
||||
db, err := openDB(dbPath)
|
||||
if err != nil {
|
||||
return nil, fmt.Errorf("failed to open db: %w", err)
|
||||
}
|
||||
m.db = db
|
||||
|
||||
if err := m.migrateHashes(); err != nil {
|
||||
log.Errorf("Failed to migrate hashes: %v", err)
|
||||
}
|
||||
if err := m.migrateHashes(); err != nil {
|
||||
log.Errorf("Failed to migrate hashes: %v", err)
|
||||
}
|
||||
|
||||
if !config.Disabled {
|
||||
if config.ClearAtStartup {
|
||||
if err := m.clearHistoryInternal(); err != nil {
|
||||
log.Errorf("Failed to clear history at startup: %v", err)
|
||||
@@ -95,7 +90,7 @@ func NewManager(wlCtx wlcontext.WaylandContext, config Config) (*Manager, error)
|
||||
m.alive = true
|
||||
m.updateState()
|
||||
|
||||
if !config.Disabled && m.dataControlMgr != nil && m.seat != nil {
|
||||
if m.dataControlMgr != nil && m.seat != nil {
|
||||
m.setupDataDeviceSync()
|
||||
}
|
||||
|
||||
@@ -258,10 +253,6 @@ func (m *Manager) setupDataDeviceSync() {
|
||||
return
|
||||
}
|
||||
|
||||
if m.hasSensitiveMimeType(mimes) {
|
||||
return
|
||||
}
|
||||
|
||||
preferredMime := m.selectMimeType(mimes)
|
||||
if preferredMime == "" {
|
||||
return
|
||||
@@ -324,10 +315,14 @@ func (m *Manager) readAndStore(r *os.File, mimeType string) {
|
||||
return
|
||||
}
|
||||
|
||||
if !cfg.Disabled && m.db != nil {
|
||||
if !cfg.DisableHistory && m.db != nil {
|
||||
m.storeClipboardEntry(data, mimeType)
|
||||
}
|
||||
|
||||
if !cfg.DisablePersist {
|
||||
m.persistClipboard([]string{mimeType}, map[string][]byte{mimeType: data})
|
||||
}
|
||||
|
||||
m.updateState()
|
||||
m.notifySubscribers()
|
||||
}
|
||||
@@ -353,6 +348,105 @@ func (m *Manager) storeClipboardEntry(data []byte, mimeType string) {
|
||||
}
|
||||
}
|
||||
|
||||
func (m *Manager) persistClipboard(mimeTypes []string, data map[string][]byte) {
|
||||
m.persistMutex.Lock()
|
||||
m.persistMimeTypes = mimeTypes
|
||||
m.persistData = data
|
||||
m.persistMutex.Unlock()
|
||||
|
||||
m.post(func() {
|
||||
m.takePersistOwnership()
|
||||
})
|
||||
}
|
||||
|
||||
func (m *Manager) takePersistOwnership() {
|
||||
if m.dataControlMgr == nil || m.dataDevice == nil {
|
||||
return
|
||||
}
|
||||
|
||||
if m.getConfig().DisablePersist {
|
||||
return
|
||||
}
|
||||
|
||||
m.persistMutex.RLock()
|
||||
mimeTypes := m.persistMimeTypes
|
||||
m.persistMutex.RUnlock()
|
||||
|
||||
if len(mimeTypes) == 0 {
|
||||
return
|
||||
}
|
||||
|
||||
dataMgr := m.dataControlMgr.(*ext_data_control.ExtDataControlManagerV1)
|
||||
|
||||
source, err := dataMgr.CreateDataSource()
|
||||
if err != nil {
|
||||
log.Errorf("Failed to create persist source: %v", err)
|
||||
return
|
||||
}
|
||||
|
||||
for _, mime := range mimeTypes {
|
||||
if err := source.Offer(mime); err != nil {
|
||||
log.Errorf("Failed to offer mime type %s: %v", mime, err)
|
||||
}
|
||||
}
|
||||
|
||||
source.SetSendHandler(func(e ext_data_control.ExtDataControlSourceV1SendEvent) {
|
||||
fd := e.Fd
|
||||
defer syscall.Close(fd)
|
||||
|
||||
m.persistMutex.RLock()
|
||||
d := m.persistData[e.MimeType]
|
||||
m.persistMutex.RUnlock()
|
||||
|
||||
if len(d) == 0 {
|
||||
return
|
||||
}
|
||||
|
||||
file := os.NewFile(uintptr(fd), "clipboard-pipe")
|
||||
defer file.Close()
|
||||
file.Write(d)
|
||||
})
|
||||
|
||||
source.SetCancelledHandler(func(e ext_data_control.ExtDataControlSourceV1CancelledEvent) {
|
||||
m.ownerLock.Lock()
|
||||
m.isOwner = false
|
||||
m.ownerLock.Unlock()
|
||||
})
|
||||
|
||||
if m.currentSource != nil {
|
||||
oldSource := m.currentSource.(*ext_data_control.ExtDataControlSourceV1)
|
||||
oldSource.Destroy()
|
||||
}
|
||||
m.currentSource = source
|
||||
|
||||
device := m.dataDevice.(*ext_data_control.ExtDataControlDeviceV1)
|
||||
if err := device.SetSelection(source); err != nil {
|
||||
log.Errorf("Failed to set persist selection: %v", err)
|
||||
return
|
||||
}
|
||||
|
||||
m.ownerLock.Lock()
|
||||
m.isOwner = true
|
||||
m.ownerLock.Unlock()
|
||||
}
|
||||
|
||||
func (m *Manager) releaseOwnership() {
|
||||
m.ownerLock.Lock()
|
||||
m.isOwner = false
|
||||
m.ownerLock.Unlock()
|
||||
|
||||
m.persistMutex.Lock()
|
||||
m.persistData = nil
|
||||
m.persistMimeTypes = nil
|
||||
m.persistMutex.Unlock()
|
||||
|
||||
if m.currentSource != nil {
|
||||
source := m.currentSource.(*ext_data_control.ExtDataControlSourceV1)
|
||||
source.Destroy()
|
||||
m.currentSource = nil
|
||||
}
|
||||
}
|
||||
|
||||
func (m *Manager) storeEntry(entry Entry) error {
|
||||
if m.db == nil {
|
||||
return fmt.Errorf("database not available")
|
||||
@@ -401,9 +495,6 @@ func (m *Manager) deduplicateInTx(b *bolt.Bucket, hash uint64) error {
|
||||
}
|
||||
|
||||
func (m *Manager) trimLengthInTx(b *bolt.Bucket) error {
|
||||
if m.config.MaxHistory < 0 {
|
||||
return nil
|
||||
}
|
||||
c := b.Cursor()
|
||||
var count int
|
||||
for k, _ := c.Last(); k != nil; k, _ = c.Prev() {
|
||||
@@ -501,12 +592,6 @@ func extractHash(data []byte) uint64 {
|
||||
return binary.BigEndian.Uint64(data[len(data)-8:])
|
||||
}
|
||||
|
||||
func (m *Manager) hasSensitiveMimeType(mimes []string) bool {
|
||||
return slices.ContainsFunc(mimes, func(mime string) bool {
|
||||
return slices.Contains(sensitiveMimeTypes, mime)
|
||||
})
|
||||
}
|
||||
|
||||
func (m *Manager) selectMimeType(mimes []string) string {
|
||||
preferredTypes := []string{
|
||||
"text/plain;charset=utf-8",
|
||||
@@ -1209,13 +1294,29 @@ func (m *Manager) applyConfigChange(newCfg Config) {
|
||||
m.config = newCfg
|
||||
m.configMutex.Unlock()
|
||||
|
||||
switch {
|
||||
case newCfg.Disabled && !oldCfg.Disabled:
|
||||
log.Info("Clipboard tracking disabled")
|
||||
case !newCfg.Disabled && oldCfg.Disabled:
|
||||
log.Info("Clipboard tracking enabled")
|
||||
if newCfg.DisableHistory && !oldCfg.DisableHistory && m.db != nil {
|
||||
log.Info("Clipboard history disabled, closing database")
|
||||
m.db.Close()
|
||||
m.db = nil
|
||||
}
|
||||
|
||||
if !newCfg.DisableHistory && oldCfg.DisableHistory && m.db == nil {
|
||||
log.Info("Clipboard history enabled, opening database")
|
||||
if db, err := openDB(m.dbPath); err == nil {
|
||||
m.db = db
|
||||
} else {
|
||||
log.Errorf("Failed to reopen database: %v", err)
|
||||
}
|
||||
}
|
||||
|
||||
if newCfg.DisablePersist && !oldCfg.DisablePersist {
|
||||
log.Info("Clipboard persist disabled, releasing ownership")
|
||||
m.releaseOwnership()
|
||||
}
|
||||
|
||||
log.Infof("Clipboard config reloaded: disableHistory=%v disablePersist=%v",
|
||||
newCfg.DisableHistory, newCfg.DisablePersist)
|
||||
|
||||
m.updateState()
|
||||
m.notifySubscribers()
|
||||
}
|
||||
@@ -1223,8 +1324,8 @@ func (m *Manager) applyConfigChange(newCfg Config) {
|
||||
func (m *Manager) StoreData(data []byte, mimeType string) error {
|
||||
cfg := m.getConfig()
|
||||
|
||||
if cfg.Disabled {
|
||||
return fmt.Errorf("clipboard tracking disabled")
|
||||
if cfg.DisableHistory {
|
||||
return fmt.Errorf("clipboard history disabled")
|
||||
}
|
||||
|
||||
if m.db == nil {
|
||||
|
||||
@@ -457,6 +457,8 @@ func TestDefaultConfig(t *testing.T) {
|
||||
assert.Equal(t, 0, cfg.AutoClearDays)
|
||||
assert.False(t, cfg.ClearAtStartup)
|
||||
assert.False(t, cfg.Disabled)
|
||||
assert.False(t, cfg.DisableHistory)
|
||||
assert.True(t, cfg.DisablePersist)
|
||||
}
|
||||
|
||||
func TestManager_PostDelegatesToWlContext(t *testing.T) {
|
||||
|
||||
@@ -18,7 +18,10 @@ type Config struct {
|
||||
MaxEntrySize int64 `json:"maxEntrySize"`
|
||||
AutoClearDays int `json:"autoClearDays"`
|
||||
ClearAtStartup bool `json:"clearAtStartup"`
|
||||
Disabled bool `json:"disabled"`
|
||||
|
||||
Disabled bool `json:"disabled"`
|
||||
DisableHistory bool `json:"disableHistory"`
|
||||
DisablePersist bool `json:"disablePersist"`
|
||||
}
|
||||
|
||||
func DefaultConfig() Config {
|
||||
@@ -27,6 +30,7 @@ func DefaultConfig() Config {
|
||||
MaxEntrySize: 5 * 1024 * 1024,
|
||||
AutoClearDays: 0,
|
||||
ClearAtStartup: false,
|
||||
DisablePersist: true,
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
@@ -2,8 +2,6 @@ package cups
|
||||
|
||||
import (
|
||||
"errors"
|
||||
"net"
|
||||
"net/url"
|
||||
"strings"
|
||||
"time"
|
||||
|
||||
@@ -158,42 +156,9 @@ func (m *Manager) PurgeJobs(printerName string) error {
return err
}

func resolveIPFromURI(uri string) string {
parsed, err := url.Parse(uri)
if err != nil {
return ""
}
host := parsed.Hostname()
if host == "" {
return ""
}
if ip := net.ParseIP(host); ip != nil {
return ip.String()
}
addrs, err := net.LookupIP(host)
if err != nil || len(addrs) == 0 {
return ""
}
for _, addr := range addrs {
if v4 := addr.To4(); v4 != nil {
return v4.String()
}
}
return addrs[0].String()
}

func (m *Manager) GetDevices() ([]Device, error) {
if m.pkHelper != nil {
devices, err := m.pkHelper.DevicesGet(10, 0, nil, nil)
if err != nil {
return nil, err
}
for i := range devices {
if devices[i].Class == "network" {
devices[i].IP = resolveIPFromURI(devices[i].URI)
}
}
return devices, nil
return m.pkHelper.DevicesGet(10, 0, nil, nil)
}

deviceAttrs, err := m.client.GetDevices()
@@ -211,9 +176,6 @@ func (m *Manager) GetDevices() ([]Device, error) {
ID: getStringAttr(attrs, "device-id"),
Location: getStringAttr(attrs, "device-location"),
}
if device.Class == "network" {
device.IP = resolveIPFromURI(uri)
}
devices = append(devices, device)
}

@@ -42,7 +42,6 @@ type Device struct {
MakeModel string `json:"makeModel"`
ID string `json:"id"`
Location string `json:"location"`
IP string `json:"ip,omitempty"`
}

type PPD struct {
@@ -233,9 +233,6 @@ func (a *SecretAgent) GetSecrets(
|
||||
if a.manager != nil && connType == "802-11-wireless" && a.manager.WasRecentlyFailed(ssid) {
|
||||
reason = "wrong-password"
|
||||
}
|
||||
if settingName == "vpn" && isPKCS11Auth(conn, vpnSvc) {
|
||||
reason = "pkcs11"
|
||||
}
|
||||
|
||||
var connId, connUuid string
|
||||
if c, ok := conn["connection"]; ok {
|
||||
@@ -252,28 +249,6 @@ func (a *SecretAgent) GetSecrets(
|
||||
}
|
||||
|
||||
if settingName == "vpn" && a.backend != nil {
|
||||
// Check for cached PKCS11 PIN first
|
||||
isPKCS11Request := len(fields) == 1 && fields[0] == "key_pass"
|
||||
if isPKCS11Request {
|
||||
a.backend.cachedPKCS11Mu.Lock()
|
||||
cached := a.backend.cachedPKCS11PIN
|
||||
if cached != nil && cached.ConnectionUUID == connUuid {
|
||||
a.backend.cachedPKCS11PIN = nil
|
||||
a.backend.cachedPKCS11Mu.Unlock()
|
||||
|
||||
log.Infof("[SecretAgent] Using cached PKCS11 PIN")
|
||||
|
||||
out := nmSettingMap{}
|
||||
vpnSec := nmVariantMap{}
|
||||
vpnSec["secrets"] = dbus.MakeVariant(map[string]string{"key_pass": cached.PIN})
|
||||
out[settingName] = vpnSec
|
||||
|
||||
return out, nil
|
||||
}
|
||||
a.backend.cachedPKCS11Mu.Unlock()
|
||||
}
|
||||
|
||||
// Check for cached VPN password
|
||||
a.backend.cachedVPNCredsMu.Lock()
|
||||
cached := a.backend.cachedVPNCreds
|
||||
if cached != nil && cached.ConnectionUUID == connUuid {
|
||||
@@ -283,9 +258,9 @@ func (a *SecretAgent) GetSecrets(
|
||||
log.Infof("[SecretAgent] Using cached password from pre-activation prompt")
|
||||
|
||||
out := nmSettingMap{}
|
||||
vpnSec := nmVariantMap{}
|
||||
vpnSec["secrets"] = dbus.MakeVariant(map[string]string{"password": cached.Password})
|
||||
out[settingName] = vpnSec
|
||||
sec := nmVariantMap{}
|
||||
sec["password"] = dbus.MakeVariant(cached.Password)
|
||||
out[settingName] = sec
|
||||
|
||||
if cached.SavePassword {
|
||||
a.backend.pendingVPNSaveMu.Lock()
|
||||
@@ -389,41 +364,16 @@ func (a *SecretAgent) GetSecrets(
|
||||
}
|
||||
sec[k] = dbus.MakeVariant(v)
|
||||
}
|
||||
|
||||
// Check if this is PKCS11 auth (key_pass)
|
||||
pin, isPKCS11 := reply.Secrets["key_pass"]
|
||||
out[settingName] = sec
|
||||
|
||||
switch settingName {
|
||||
case "vpn":
|
||||
// VPN secrets must be wrapped in a "secrets" key per NM spec
|
||||
secretsDict := make(map[string]string)
|
||||
for k, v := range reply.Secrets {
|
||||
if k != "username" {
|
||||
secretsDict[k] = v
|
||||
}
|
||||
}
|
||||
vpnSec := nmVariantMap{}
|
||||
vpnSec["secrets"] = dbus.MakeVariant(secretsDict)
|
||||
out[settingName] = vpnSec
|
||||
log.Infof("[SecretAgent] Returning VPN secrets with %d fields for %s", len(secretsDict), vpnSvc)
|
||||
|
||||
// Cache PKCS11 PIN in case GetSecrets is called again during activation
|
||||
if isPKCS11 && a.backend != nil {
|
||||
a.backend.cachedPKCS11Mu.Lock()
|
||||
a.backend.cachedPKCS11PIN = &cachedPKCS11PIN{
|
||||
ConnectionUUID: connUuid,
|
||||
PIN: pin,
|
||||
}
|
||||
a.backend.cachedPKCS11Mu.Unlock()
|
||||
log.Infof("[SecretAgent] Cached PKCS11 PIN for potential re-request")
|
||||
}
|
||||
case "802-1x":
|
||||
out[settingName] = sec
|
||||
log.Infof("[SecretAgent] Returning 802-1x enterprise secrets with %d fields", len(sec))
|
||||
default:
|
||||
out[settingName] = sec
|
||||
case "vpn":
|
||||
log.Infof("[SecretAgent] Returning VPN secrets with %d fields for %s", len(sec), vpnSvc)
|
||||
}
|
||||
if settingName == "vpn" && a.backend != nil && !isPKCS11 && (vpnUsername != "" || reply.Save) {
|
||||
|
||||
if settingName == "vpn" && a.backend != nil && (vpnUsername != "" || reply.Save) {
|
||||
pw := reply.Secrets["password"]
|
||||
a.backend.pendingVPNSaveMu.Lock()
|
||||
a.backend.pendingVPNSave = &pendingVPNCredentials{
|
||||
@@ -629,15 +579,6 @@ func inferVPNFields(conn map[string]nmVariantMap, vpnService string) []string {
|
||||
connType := dataMap["connection-type"]
|
||||
|
||||
switch {
|
||||
case strings.Contains(vpnService, "openconnect"):
|
||||
authType := dataMap["authtype"]
|
||||
userCert := dataMap["usercert"]
|
||||
if authType == "cert" && strings.HasPrefix(userCert, "pkcs11:") {
|
||||
return []string{"key_pass"}
|
||||
}
|
||||
if dataMap["username"] == "" {
|
||||
fields = []string{"username", "password"}
|
||||
}
|
||||
case strings.Contains(vpnService, "openvpn"):
|
||||
if connType == "password" || connType == "password-tls" {
|
||||
if dataMap["username"] == "" {
|
||||
@@ -645,7 +586,7 @@ func inferVPNFields(conn map[string]nmVariantMap, vpnService string) []string {
|
||||
}
|
||||
}
|
||||
case strings.Contains(vpnService, "vpnc"), strings.Contains(vpnService, "l2tp"),
|
||||
strings.Contains(vpnService, "pptp"):
|
||||
strings.Contains(vpnService, "pptp"), strings.Contains(vpnService, "openconnect"):
|
||||
if dataMap["username"] == "" {
|
||||
fields = []string{"username", "password"}
|
||||
}
|
||||
@@ -656,8 +597,6 @@ func inferVPNFields(conn map[string]nmVariantMap, vpnService string) []string {
|
||||
|
||||
func vpnFieldMeta(field, vpnService string) (label string, isSecret bool) {
|
||||
switch field {
|
||||
case "key_pass":
|
||||
return "PIN", true
|
||||
case "password":
|
||||
return "Password", true
|
||||
case "Xauth password":
|
||||
@@ -685,25 +624,6 @@ func vpnFieldMeta(field, vpnService string) (label string, isSecret bool) {
|
||||
return titleCaser.String(strings.ReplaceAll(field, "-", " ")), false
|
||||
}
|
||||
|
||||
func isPKCS11Auth(conn map[string]nmVariantMap, vpnService string) bool {
|
||||
if !strings.Contains(vpnService, "openconnect") {
|
||||
return false
|
||||
}
|
||||
vpnSettings, ok := conn["vpn"]
|
||||
if !ok {
|
||||
return false
|
||||
}
|
||||
dataVariant, ok := vpnSettings["data"]
|
||||
if !ok {
|
||||
return false
|
||||
}
|
||||
dataMap, ok := dataVariant.Value().(map[string]string)
|
||||
if !ok {
|
||||
return false
|
||||
}
|
||||
return dataMap["authtype"] == "cert" && strings.HasPrefix(dataMap["usercert"], "pkcs11:")
|
||||
}
|
||||
|
||||
func readVPNPasswordFlags(conn map[string]nmVariantMap, settingName string) uint32 {
|
||||
if settingName != "vpn" {
|
||||
return 0xFFFF
|
||||
|
||||
@@ -72,8 +72,6 @@ type NetworkManagerBackend struct {
|
||||
pendingVPNSaveMu sync.Mutex
|
||||
cachedVPNCreds *cachedVPNCredentials
|
||||
cachedVPNCredsMu sync.Mutex
|
||||
cachedPKCS11PIN *cachedPKCS11PIN
|
||||
cachedPKCS11Mu sync.Mutex
|
||||
|
||||
onStateChange func()
|
||||
}
|
||||
@@ -91,11 +89,6 @@ type cachedVPNCredentials struct {
|
||||
SavePassword bool
|
||||
}
|
||||
|
||||
type cachedPKCS11PIN struct {
|
||||
ConnectionUUID string
|
||||
PIN string
|
||||
}
|
||||
|
||||
func NewNetworkManagerBackend(nmConn ...gonetworkmanager.NetworkManager) (*NetworkManagerBackend, error) {
|
||||
var nm gonetworkmanager.NetworkManager
|
||||
var err error
|
||||
|
||||
@@ -33,7 +33,7 @@ func (b *NetworkManagerBackend) ListVPNProfiles() ([]VPNProfile, error) {
return nil, fmt.Errorf("failed to get connections: %w", err)
}

profiles := []VPNProfile{}
var profiles []VPNProfile
for _, conn := range connections {
settings, err := conn.GetSettings()
if err != nil {
@@ -101,7 +101,7 @@ func (b *NetworkManagerBackend) ListActiveVPN() ([]VPNActive, error) {
return nil, fmt.Errorf("failed to get active connections: %w", err)
}

active := []VPNActive{}
var active []VPNActive
for _, activeConn := range activeConns {
connType, err := activeConn.GetPropertyType()
if err != nil {
@@ -282,26 +282,111 @@ func (b *NetworkManagerBackend) ConnectVPN(uuidOrName string, singleActive bool)
|
||||
}
|
||||
}
|
||||
|
||||
needsUsernamePrePrompt := false
|
||||
var vpnServiceType string
|
||||
var vpnData map[string]string
|
||||
if vpnSettings, ok := targetSettings["vpn"]; ok {
|
||||
if svc, ok := vpnSettings["service-type"].(string); ok {
|
||||
vpnServiceType = svc
|
||||
}
|
||||
if data, ok := vpnSettings["data"].(map[string]string); ok {
|
||||
vpnData = data
|
||||
connType := data["connection-type"]
|
||||
username := data["username"]
|
||||
// OpenVPN password auth needs username in vpn.data
|
||||
if strings.Contains(vpnServiceType, "openvpn") &&
|
||||
(connType == "password" || connType == "password-tls") &&
|
||||
username == "" {
|
||||
needsUsernamePrePrompt = true
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
authAction := detectVPNAuthAction(vpnServiceType, vpnData)
|
||||
// If username is needed but missing, prompt for it before activating
|
||||
if needsUsernamePrePrompt && b.promptBroker != nil {
|
||||
log.Infof("[ConnectVPN] OpenVPN requires username in vpn.data - prompting before activation")
|
||||
|
||||
switch authAction {
|
||||
case "openvpn_username":
|
||||
if b.promptBroker == nil {
|
||||
return fmt.Errorf("OpenVPN password authentication requires interactive prompt")
|
||||
ctx, cancel := context.WithTimeout(context.Background(), 2*time.Minute)
|
||||
defer cancel()
|
||||
|
||||
token, err := b.promptBroker.Ask(ctx, PromptRequest{
|
||||
Name: connName,
|
||||
ConnType: "vpn",
|
||||
VpnService: vpnServiceType,
|
||||
SettingName: "vpn",
|
||||
Fields: []string{"username", "password"},
|
||||
FieldsInfo: []FieldInfo{{Name: "username", Label: "Username", IsSecret: false}, {Name: "password", Label: "Password", IsSecret: true}},
|
||||
Reason: "required",
|
||||
ConnectionId: connName,
|
||||
ConnectionUuid: targetUUID,
|
||||
ConnectionPath: string(targetConn.GetPath()),
|
||||
})
|
||||
if err != nil {
|
||||
return fmt.Errorf("failed to request credentials: %w", err)
|
||||
}
|
||||
if err := b.handleOpenVPNUsernameAuth(targetConn, connName, targetUUID, vpnServiceType); err != nil {
|
||||
return err
|
||||
|
||||
reply, err := b.promptBroker.Wait(ctx, token)
|
||||
if err != nil {
|
||||
return fmt.Errorf("credentials prompt failed: %w", err)
|
||||
}
|
||||
|
||||
username := reply.Secrets["username"]
|
||||
password := reply.Secrets["password"]
|
||||
if username != "" {
|
||||
connObj := b.dbusConn.Object("org.freedesktop.NetworkManager", targetConn.GetPath())
|
||||
var existingSettings map[string]map[string]dbus.Variant
|
||||
if err := connObj.Call("org.freedesktop.NetworkManager.Settings.Connection.GetSettings", 0).Store(&existingSettings); err != nil {
|
||||
return fmt.Errorf("failed to get settings for username save: %w", err)
|
||||
}
|
||||
|
||||
settings := make(map[string]map[string]dbus.Variant)
|
||||
if connSection, ok := existingSettings["connection"]; ok {
|
||||
settings["connection"] = connSection
|
||||
}
|
||||
vpn := existingSettings["vpn"]
|
||||
var data map[string]string
|
||||
if dataVariant, ok := vpn["data"]; ok {
|
||||
if dm, ok := dataVariant.Value().(map[string]string); ok {
|
||||
data = make(map[string]string)
|
||||
for k, v := range dm {
|
||||
data[k] = v
|
||||
}
|
||||
} else {
|
||||
data = make(map[string]string)
|
||||
}
|
||||
} else {
|
||||
data = make(map[string]string)
|
||||
}
|
||||
data["username"] = username
|
||||
|
||||
if reply.Save && password != "" {
|
||||
data["password-flags"] = "0"
|
||||
secs := make(map[string]string)
|
||||
secs["password"] = password
|
||||
vpn["secrets"] = dbus.MakeVariant(secs)
|
||||
log.Infof("[ConnectVPN] Saving username and password to vpn.data")
|
||||
} else {
|
||||
log.Infof("[ConnectVPN] Saving username to vpn.data (password will be prompted)")
|
||||
}
|
||||
|
||||
vpn["data"] = dbus.MakeVariant(data)
|
||||
settings["vpn"] = vpn
|
||||
|
||||
var result map[string]dbus.Variant
|
||||
if err := connObj.Call("org.freedesktop.NetworkManager.Settings.Connection.Update2", 0,
|
||||
settings, uint32(0x1), map[string]dbus.Variant{}).Store(&result); err != nil {
|
||||
return fmt.Errorf("failed to save username: %w", err)
|
||||
}
|
||||
log.Infof("[ConnectVPN] Username saved to connection, now activating")
|
||||
|
||||
if password != "" && !reply.Save {
|
||||
b.cachedVPNCredsMu.Lock()
|
||||
b.cachedVPNCreds = &cachedVPNCredentials{
|
||||
ConnectionUUID: targetUUID,
|
||||
Password: password,
|
||||
SavePassword: reply.Save,
|
||||
}
|
||||
b.cachedVPNCredsMu.Unlock()
|
||||
log.Infof("[ConnectVPN] Cached password for GetSecrets")
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
@@ -332,119 +417,6 @@ func (b *NetworkManagerBackend) ConnectVPN(uuidOrName string, singleActive bool)
|
||||
return nil
|
||||
}
|
||||
|
||||
func detectVPNAuthAction(serviceType string, data map[string]string) string {
|
||||
if data == nil {
|
||||
return ""
|
||||
}
|
||||
|
||||
switch {
|
||||
case strings.Contains(serviceType, "openvpn"):
|
||||
connType := data["connection-type"]
|
||||
username := data["username"]
|
||||
if (connType == "password" || connType == "password-tls") && username == "" {
|
||||
return "openvpn_username"
|
||||
}
|
||||
}
|
||||
return ""
|
||||
}
|
||||
|
||||
func (b *NetworkManagerBackend) handleOpenVPNUsernameAuth(targetConn gonetworkmanager.Connection, connName, targetUUID, vpnServiceType string) error {
|
||||
log.Infof("[ConnectVPN] OpenVPN requires username in vpn.data - prompting before activation")
|
||||
|
||||
ctx, cancel := context.WithTimeout(context.Background(), 2*time.Minute)
|
||||
defer cancel()
|
||||
|
||||
token, err := b.promptBroker.Ask(ctx, PromptRequest{
|
||||
Name: connName,
|
||||
ConnType: "vpn",
|
||||
VpnService: vpnServiceType,
|
||||
SettingName: "vpn",
|
||||
Fields: []string{"username", "password"},
|
||||
FieldsInfo: []FieldInfo{{Name: "username", Label: "Username", IsSecret: false}, {Name: "password", Label: "Password", IsSecret: true}},
|
||||
Reason: "required",
|
||||
ConnectionId: connName,
|
||||
ConnectionUuid: targetUUID,
|
||||
ConnectionPath: string(targetConn.GetPath()),
|
||||
})
|
||||
if err != nil {
|
||||
return fmt.Errorf("failed to request credentials: %w", err)
|
||||
}
|
||||
|
||||
reply, err := b.promptBroker.Wait(ctx, token)
|
||||
if err != nil {
|
||||
return fmt.Errorf("credentials prompt failed: %w", err)
|
||||
}
|
||||
|
||||
if reply.Cancel {
|
||||
return fmt.Errorf("user cancelled authentication")
|
||||
}
|
||||
|
||||
username := reply.Secrets["username"]
|
||||
password := reply.Secrets["password"]
|
||||
if username == "" {
|
||||
return nil
|
||||
}
|
||||
|
||||
connObj := b.dbusConn.Object("org.freedesktop.NetworkManager", targetConn.GetPath())
|
||||
var existingSettings map[string]map[string]dbus.Variant
|
||||
if err := connObj.Call("org.freedesktop.NetworkManager.Settings.Connection.GetSettings", 0).Store(&existingSettings); err != nil {
|
||||
return fmt.Errorf("failed to get settings for username save: %w", err)
|
||||
}
|
||||
|
||||
settings := make(map[string]map[string]dbus.Variant)
|
||||
if connSection, ok := existingSettings["connection"]; ok {
|
||||
settings["connection"] = connSection
|
||||
}
|
||||
vpn := existingSettings["vpn"]
|
||||
var data map[string]string
|
||||
if dataVariant, ok := vpn["data"]; ok {
|
||||
if dm, ok := dataVariant.Value().(map[string]string); ok {
|
||||
data = make(map[string]string)
|
||||
for k, v := range dm {
|
||||
data[k] = v
|
||||
}
|
||||
} else {
|
||||
data = make(map[string]string)
|
||||
}
|
||||
} else {
|
||||
data = make(map[string]string)
|
||||
}
|
||||
data["username"] = username
|
||||
|
||||
if reply.Save && password != "" {
|
||||
data["password-flags"] = "0"
|
||||
secs := make(map[string]string)
|
||||
secs["password"] = password
|
||||
vpn["secrets"] = dbus.MakeVariant(secs)
|
||||
log.Infof("[ConnectVPN] Saving username and password to vpn.data")
|
||||
} else {
|
||||
log.Infof("[ConnectVPN] Saving username to vpn.data (password will be prompted)")
|
||||
}
|
||||
|
||||
vpn["data"] = dbus.MakeVariant(data)
|
||||
settings["vpn"] = vpn
|
||||
|
||||
var result map[string]dbus.Variant
|
||||
if err := connObj.Call("org.freedesktop.NetworkManager.Settings.Connection.Update2", 0,
|
||||
settings, uint32(0x1), map[string]dbus.Variant{}).Store(&result); err != nil {
|
||||
return fmt.Errorf("failed to save username: %w", err)
|
||||
}
|
||||
log.Infof("[ConnectVPN] Username saved to connection")
|
||||
|
||||
if password != "" && !reply.Save {
|
||||
b.cachedVPNCredsMu.Lock()
|
||||
b.cachedVPNCreds = &cachedVPNCredentials{
|
||||
ConnectionUUID: targetUUID,
|
||||
Password: password,
|
||||
SavePassword: reply.Save,
|
||||
}
|
||||
b.cachedVPNCredsMu.Unlock()
|
||||
log.Infof("[ConnectVPN] Cached password for GetSecrets")
|
||||
}
|
||||
|
||||
return nil
|
||||
}
|
||||
|
||||
func (b *NetworkManagerBackend) DisconnectVPN(uuidOrName string) error {
|
||||
nm := b.nmConn.(gonetworkmanager.NetworkManager)
|
||||
|
||||
@@ -683,11 +655,6 @@ func (b *NetworkManagerBackend) updateVPNConnectionState() {
|
||||
b.state.LastError = ""
|
||||
b.stateMutex.Unlock()
|
||||
|
||||
// Clear cached PKCS11 PIN on success
|
||||
b.cachedPKCS11Mu.Lock()
|
||||
b.cachedPKCS11PIN = nil
|
||||
b.cachedPKCS11Mu.Unlock()
|
||||
|
||||
b.pendingVPNSaveMu.Lock()
|
||||
pending := b.pendingVPNSave
|
||||
b.pendingVPNSave = nil
|
||||
@@ -704,11 +671,6 @@ func (b *NetworkManagerBackend) updateVPNConnectionState() {
|
||||
b.state.ConnectingVPNUUID = ""
|
||||
b.state.LastError = "VPN connection failed"
|
||||
b.stateMutex.Unlock()
|
||||
|
||||
// Clear cached PKCS11 PIN on failure
|
||||
b.cachedPKCS11Mu.Lock()
|
||||
b.cachedPKCS11PIN = nil
|
||||
b.cachedPKCS11Mu.Unlock()
|
||||
return
|
||||
}
|
||||
}
|
||||
@@ -721,11 +683,6 @@ func (b *NetworkManagerBackend) updateVPNConnectionState() {
|
||||
b.state.ConnectingVPNUUID = ""
|
||||
b.state.LastError = "VPN connection failed"
|
||||
b.stateMutex.Unlock()
|
||||
|
||||
// Clear cached PKCS11 PIN
|
||||
b.cachedPKCS11Mu.Lock()
|
||||
b.cachedPKCS11PIN = nil
|
||||
b.cachedPKCS11Mu.Unlock()
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
@@ -18,7 +18,6 @@ import (
"github.com/AvengeMedia/DankMaterialShell/core/internal/server/models"
"github.com/AvengeMedia/DankMaterialShell/core/internal/server/network"
serverPlugins "github.com/AvengeMedia/DankMaterialShell/core/internal/server/plugins"
serverThemes "github.com/AvengeMedia/DankMaterialShell/core/internal/server/themes"
"github.com/AvengeMedia/DankMaterialShell/core/internal/server/wayland"
"github.com/AvengeMedia/DankMaterialShell/core/internal/server/wlroutput"
)
@@ -38,11 +37,6 @@ func RouteRequest(conn net.Conn, req models.Request) {
return
}

if strings.HasPrefix(req.Method, "themes.") {
serverThemes.HandleRequest(conn, req)
return
}

if strings.HasPrefix(req.Method, "loginctl.") {
if loginctlManager == nil {
models.RespondError(conn, req.ID, "loginctl manager not initialized")
@@ -207,6 +201,12 @@ func handleClipboardSetConfig(conn net.Conn, req models.Request) {
if v, ok := req.Params["disabled"].(bool); ok {
cfg.Disabled = v
}
if v, ok := req.Params["disableHistory"].(bool); ok {
cfg.DisableHistory = v
}
if v, ok := req.Params["disablePersist"].(bool); ok {
cfg.DisablePersist = v
}

if err := clipboard.SaveConfig(cfg); err != nil {
models.RespondError(conn, req.ID, err.Error())
@@ -1,27 +0,0 @@
|
||||
package themes
|
||||
|
||||
import (
|
||||
"fmt"
|
||||
"net"
|
||||
|
||||
"github.com/AvengeMedia/DankMaterialShell/core/internal/server/models"
|
||||
)
|
||||
|
||||
func HandleRequest(conn net.Conn, req models.Request) {
|
||||
switch req.Method {
|
||||
case "themes.list":
|
||||
HandleList(conn, req)
|
||||
case "themes.listInstalled":
|
||||
HandleListInstalled(conn, req)
|
||||
case "themes.install":
|
||||
HandleInstall(conn, req)
|
||||
case "themes.uninstall":
|
||||
HandleUninstall(conn, req)
|
||||
case "themes.update":
|
||||
HandleUpdate(conn, req)
|
||||
case "themes.search":
|
||||
HandleSearch(conn, req)
|
||||
default:
|
||||
models.RespondError(conn, req.ID, fmt.Sprintf("unknown method: %s", req.Method))
|
||||
}
|
||||
}
|
||||
@@ -1,52 +0,0 @@
|
||||
package themes
|
||||
|
||||
import (
|
||||
"fmt"
|
||||
"net"
|
||||
|
||||
"github.com/AvengeMedia/DankMaterialShell/core/internal/server/models"
|
||||
"github.com/AvengeMedia/DankMaterialShell/core/internal/themes"
|
||||
)
|
||||
|
||||
func HandleInstall(conn net.Conn, req models.Request) {
|
||||
idOrName, ok := req.Params["name"].(string)
|
||||
if !ok {
|
||||
models.RespondError(conn, req.ID, "missing or invalid 'name' parameter")
|
||||
return
|
||||
}
|
||||
|
||||
registry, err := themes.NewRegistry()
|
||||
if err != nil {
|
||||
models.RespondError(conn, req.ID, fmt.Sprintf("failed to create registry: %v", err))
|
||||
return
|
||||
}
|
||||
|
||||
themeList, err := registry.List()
|
||||
if err != nil {
|
||||
models.RespondError(conn, req.ID, fmt.Sprintf("failed to list themes: %v", err))
|
||||
return
|
||||
}
|
||||
|
||||
theme := themes.FindByIDOrName(idOrName, themeList)
|
||||
if theme == nil {
|
||||
models.RespondError(conn, req.ID, fmt.Sprintf("theme not found: %s", idOrName))
|
||||
return
|
||||
}
|
||||
|
||||
manager, err := themes.NewManager()
|
||||
if err != nil {
|
||||
models.RespondError(conn, req.ID, fmt.Sprintf("failed to create manager: %v", err))
|
||||
return
|
||||
}
|
||||
|
||||
registryThemeDir := registry.GetThemeDir(theme.SourceDir)
|
||||
if err := manager.Install(*theme, registryThemeDir); err != nil {
|
||||
models.RespondError(conn, req.ID, fmt.Sprintf("failed to install theme: %v", err))
|
||||
return
|
||||
}
|
||||
|
||||
models.Respond(conn, req.ID, models.SuccessResult{
|
||||
Success: true,
|
||||
Message: fmt.Sprintf("theme installed: %s", theme.Name),
|
||||
})
|
||||
}
|
||||
@@ -1,54 +0,0 @@
|
||||
package themes
|
||||
|
||||
import (
|
||||
"fmt"
|
||||
"net"
|
||||
"strings"
|
||||
|
||||
"github.com/AvengeMedia/DankMaterialShell/core/internal/server/models"
|
||||
"github.com/AvengeMedia/DankMaterialShell/core/internal/themes"
|
||||
)
|
||||
|
||||
func HandleList(conn net.Conn, req models.Request) {
|
||||
registry, err := themes.NewRegistry()
|
||||
if err != nil {
|
||||
models.RespondError(conn, req.ID, fmt.Sprintf("failed to create registry: %v", err))
|
||||
return
|
||||
}
|
||||
|
||||
themeList, err := registry.List()
|
||||
if err != nil {
|
||||
models.RespondError(conn, req.ID, fmt.Sprintf("failed to list themes: %v", err))
|
||||
return
|
||||
}
|
||||
|
||||
manager, err := themes.NewManager()
|
||||
if err != nil {
|
||||
models.RespondError(conn, req.ID, fmt.Sprintf("failed to create manager: %v", err))
|
||||
return
|
||||
}
|
||||
|
||||
result := make([]ThemeInfo, len(themeList))
|
||||
for i, t := range themeList {
|
||||
installed, _ := manager.IsInstalled(t)
|
||||
info := ThemeInfo{
|
||||
ID: t.ID,
|
||||
Name: t.Name,
|
||||
Version: t.Version,
|
||||
Author: t.Author,
|
||||
Description: t.Description,
|
||||
PreviewPath: t.PreviewPath,
|
||||
SourceDir: t.SourceDir,
|
||||
Installed: installed,
|
||||
FirstParty: isFirstParty(t.Author),
|
||||
}
|
||||
addVariantsInfo(&info, t.Variants)
|
||||
result[i] = info
|
||||
}
|
||||
|
||||
models.Respond(conn, req.ID, result)
|
||||
}
|
||||
|
||||
func isFirstParty(author string) bool {
|
||||
return strings.EqualFold(author, "Avenge Media") || strings.EqualFold(author, "AvengeMedia")
|
||||
}
|
||||
@@ -1,157 +0,0 @@
|
||||
package themes
|
||||
|
||||
import (
|
||||
"fmt"
|
||||
"net"
|
||||
|
||||
"github.com/AvengeMedia/DankMaterialShell/core/internal/server/models"
|
||||
"github.com/AvengeMedia/DankMaterialShell/core/internal/themes"
|
||||
)
|
||||
|
||||
func addVariantsInfo(info *ThemeInfo, variants *themes.ThemeVariants) {
|
||||
if variants == nil {
|
||||
return
|
||||
}
|
||||
|
||||
if variants.Type == "multi" {
|
||||
if len(variants.Flavors) == 0 && len(variants.Accents) == 0 {
|
||||
return
|
||||
}
|
||||
info.HasVariants = true
|
||||
info.Variants = &VariantsInfo{
|
||||
Type: "multi",
|
||||
Flavors: make([]FlavorInfo, len(variants.Flavors)),
|
||||
Accents: make([]AccentInfo, len(variants.Accents)),
|
||||
}
|
||||
if variants.Defaults != nil {
|
||||
info.Variants.Defaults = &MultiDefaults{
|
||||
Dark: variants.Defaults.Dark,
|
||||
Light: variants.Defaults.Light,
|
||||
}
|
||||
}
|
||||
for i, f := range variants.Flavors {
|
||||
mode := ""
|
||||
switch {
|
||||
case f.Dark.Primary != "" && f.Light.Primary != "":
|
||||
mode = "both"
|
||||
case f.Dark.Primary != "":
|
||||
mode = "dark"
|
||||
case f.Light.Primary != "":
|
||||
mode = "light"
|
||||
default:
|
||||
if f.Dark.Surface != "" {
|
||||
mode = "dark"
|
||||
} else if f.Light.Surface != "" {
|
||||
mode = "light"
|
||||
}
|
||||
}
|
||||
info.Variants.Flavors[i] = FlavorInfo{ID: f.ID, Name: f.Name, Mode: mode}
|
||||
}
|
||||
for i, a := range variants.Accents {
|
||||
color := ""
|
||||
if colors, ok := a.FlavorColors["mocha"]; ok && colors.Primary != "" {
|
||||
color = colors.Primary
|
||||
} else if colors, ok := a.FlavorColors["latte"]; ok && colors.Primary != "" {
|
||||
color = colors.Primary
|
||||
} else {
|
||||
for _, c := range a.FlavorColors {
|
||||
if c.Primary != "" {
|
||||
color = c.Primary
|
||||
break
|
||||
}
|
||||
}
|
||||
}
|
||||
info.Variants.Accents[i] = AccentInfo{ID: a.ID, Name: a.Name, Color: color}
|
||||
}
|
||||
return
|
||||
}
|
||||
|
||||
if len(variants.Options) == 0 {
|
||||
return
|
||||
}
|
||||
info.HasVariants = true
|
||||
info.Variants = &VariantsInfo{
|
||||
Default: variants.Default,
|
||||
Options: make([]VariantInfo, len(variants.Options)),
|
||||
}
|
||||
for i, v := range variants.Options {
|
||||
info.Variants.Options[i] = VariantInfo{ID: v.ID, Name: v.Name}
|
||||
}
|
||||
}
|
||||
|
||||
func HandleListInstalled(conn net.Conn, req models.Request) {
|
||||
manager, err := themes.NewManager()
|
||||
if err != nil {
|
||||
models.RespondError(conn, req.ID, fmt.Sprintf("failed to create manager: %v", err))
|
||||
return
|
||||
}
|
||||
|
||||
installedIDs, err := manager.ListInstalled()
|
||||
if err != nil {
|
||||
models.RespondError(conn, req.ID, fmt.Sprintf("failed to list installed themes: %v", err))
|
||||
return
|
||||
}
|
||||
|
||||
registry, err := themes.NewRegistry()
|
||||
if err != nil {
|
||||
models.RespondError(conn, req.ID, fmt.Sprintf("failed to create registry: %v", err))
|
||||
return
|
||||
}
|
||||
|
||||
allThemes, err := registry.List()
|
||||
if err != nil {
|
||||
models.RespondError(conn, req.ID, fmt.Sprintf("failed to list themes: %v", err))
|
||||
return
|
||||
}
|
||||
|
||||
themeMap := make(map[string]themes.Theme)
|
||||
for _, t := range allThemes {
|
||||
themeMap[t.ID] = t
|
||||
}
|
||||
|
||||
result := make([]ThemeInfo, 0, len(installedIDs))
|
||||
for _, id := range installedIDs {
|
||||
if theme, ok := themeMap[id]; ok {
|
||||
hasUpdate := false
|
||||
if hasUpdates, err := manager.HasUpdates(id, theme); err == nil {
|
||||
hasUpdate = hasUpdates
|
||||
}
|
||||
|
||||
info := ThemeInfo{
|
||||
ID: theme.ID,
|
||||
Name: theme.Name,
|
||||
Version: theme.Version,
|
||||
Author: theme.Author,
|
||||
Description: theme.Description,
|
||||
SourceDir: id,
|
||||
FirstParty: isFirstParty(theme.Author),
|
||||
HasUpdate: hasUpdate,
|
||||
}
|
||||
addVariantsInfo(&info, theme.Variants)
|
||||
result = append(result, info)
|
||||
} else {
|
||||
installed, err := manager.GetInstalledTheme(id)
|
||||
if err != nil {
|
||||
result = append(result, ThemeInfo{
|
||||
ID: id,
|
||||
Name: id,
|
||||
SourceDir: id,
|
||||
})
|
||||
continue
|
||||
}
|
||||
info := ThemeInfo{
|
||||
ID: installed.ID,
|
||||
Name: installed.Name,
|
||||
Version: installed.Version,
|
||||
Author: installed.Author,
|
||||
Description: installed.Description,
|
||||
SourceDir: id,
|
||||
FirstParty: isFirstParty(installed.Author),
|
||||
}
|
||||
addVariantsInfo(&info, installed.Variants)
|
||||
result = append(result, info)
|
||||
}
|
||||
}
|
||||
|
||||
models.Respond(conn, req.ID, result)
|
||||
}
|
||||
@@ -1,53 +0,0 @@
|
||||
package themes
|
||||
|
||||
import (
|
||||
"fmt"
|
||||
"net"
|
||||
|
||||
"github.com/AvengeMedia/DankMaterialShell/core/internal/server/models"
|
||||
"github.com/AvengeMedia/DankMaterialShell/core/internal/themes"
|
||||
)
|
||||
|
||||
func HandleSearch(conn net.Conn, req models.Request) {
|
||||
query, ok := req.Params["query"].(string)
|
||||
if !ok {
|
||||
models.RespondError(conn, req.ID, "missing or invalid 'query' parameter")
|
||||
return
|
||||
}
|
||||
|
||||
registry, err := themes.NewRegistry()
|
||||
if err != nil {
|
||||
models.RespondError(conn, req.ID, fmt.Sprintf("failed to create registry: %v", err))
|
||||
return
|
||||
}
|
||||
|
||||
themeList, err := registry.List()
|
||||
if err != nil {
|
||||
models.RespondError(conn, req.ID, fmt.Sprintf("failed to list themes: %v", err))
|
||||
return
|
||||
}
|
||||
|
||||
searchResults := themes.FuzzySearch(query, themeList)
|
||||
|
||||
manager, err := themes.NewManager()
|
||||
if err != nil {
|
||||
models.RespondError(conn, req.ID, fmt.Sprintf("failed to create manager: %v", err))
|
||||
return
|
||||
}
|
||||
|
||||
result := make([]ThemeInfo, len(searchResults))
|
||||
for i, t := range searchResults {
|
||||
installed, _ := manager.IsInstalled(t)
|
||||
result[i] = ThemeInfo{
|
||||
ID: t.ID,
|
||||
Name: t.Name,
|
||||
Version: t.Version,
|
||||
Author: t.Author,
|
||||
Description: t.Description,
|
||||
Installed: installed,
|
||||
FirstParty: isFirstParty(t.Author),
|
||||
}
|
||||
}
|
||||
|
||||
models.Respond(conn, req.ID, result)
|
||||
}
|
||||
@@ -1,47 +0,0 @@
|
||||
package themes
|
||||
|
||||
type VariantInfo struct {
|
||||
ID string `json:"id"`
|
||||
Name string `json:"name"`
|
||||
}
|
||||
|
||||
type FlavorInfo struct {
|
||||
ID string `json:"id"`
|
||||
Name string `json:"name"`
|
||||
Mode string `json:"mode,omitempty"`
|
||||
}
|
||||
|
||||
type AccentInfo struct {
|
||||
ID string `json:"id"`
|
||||
Name string `json:"name"`
|
||||
Color string `json:"color,omitempty"`
|
||||
}
|
||||
|
||||
type MultiDefaults struct {
|
||||
Dark map[string]string `json:"dark,omitempty"`
|
||||
Light map[string]string `json:"light,omitempty"`
|
||||
}
|
||||
|
||||
type VariantsInfo struct {
|
||||
Type string `json:"type,omitempty"`
|
||||
Default string `json:"default,omitempty"`
|
||||
Defaults *MultiDefaults `json:"defaults,omitempty"`
|
||||
Options []VariantInfo `json:"options,omitempty"`
|
||||
Flavors []FlavorInfo `json:"flavors,omitempty"`
|
||||
Accents []AccentInfo `json:"accents,omitempty"`
|
||||
}
|
||||
|
||||
type ThemeInfo struct {
|
||||
ID string `json:"id"`
|
||||
Name string `json:"name"`
|
||||
Version string `json:"version"`
|
||||
Author string `json:"author,omitempty"`
|
||||
Description string `json:"description,omitempty"`
|
||||
PreviewPath string `json:"previewPath,omitempty"`
|
||||
SourceDir string `json:"sourceDir,omitempty"`
|
||||
Installed bool `json:"installed,omitempty"`
|
||||
FirstParty bool `json:"firstParty,omitempty"`
|
||||
HasUpdate bool `json:"hasUpdate,omitempty"`
|
||||
HasVariants bool `json:"hasVariants,omitempty"`
|
||||
Variants *VariantsInfo `json:"variants,omitempty"`
|
||||
}
|
||||
@@ -1,63 +0,0 @@
|
||||
package themes
|
||||
|
||||
import (
|
||||
"fmt"
|
||||
"net"
|
||||
|
||||
"github.com/AvengeMedia/DankMaterialShell/core/internal/server/models"
|
||||
"github.com/AvengeMedia/DankMaterialShell/core/internal/themes"
|
||||
)
|
||||
|
||||
func HandleUninstall(conn net.Conn, req models.Request) {
|
||||
idOrName, ok := req.Params["name"].(string)
|
||||
if !ok {
|
||||
models.RespondError(conn, req.ID, "missing or invalid 'name' parameter")
|
||||
return
|
||||
}
|
||||
|
||||
manager, err := themes.NewManager()
|
||||
if err != nil {
|
||||
models.RespondError(conn, req.ID, fmt.Sprintf("failed to create manager: %v", err))
|
||||
return
|
||||
}
|
||||
|
||||
registry, err := themes.NewRegistry()
|
||||
if err != nil {
|
||||
models.RespondError(conn, req.ID, fmt.Sprintf("failed to create registry: %v", err))
|
||||
return
|
||||
}
|
||||
|
||||
themeList, _ := registry.List()
|
||||
theme := themes.FindByIDOrName(idOrName, themeList)
|
||||
|
||||
if theme != nil {
|
||||
installed, err := manager.IsInstalled(*theme)
|
||||
if err != nil {
|
||||
models.RespondError(conn, req.ID, fmt.Sprintf("failed to check if theme is installed: %v", err))
|
||||
return
|
||||
}
|
||||
if !installed {
|
||||
models.RespondError(conn, req.ID, fmt.Sprintf("theme not installed: %s", idOrName))
|
||||
return
|
||||
}
|
||||
if err := manager.Uninstall(*theme); err != nil {
|
||||
models.RespondError(conn, req.ID, fmt.Sprintf("failed to uninstall theme: %v", err))
|
||||
return
|
||||
}
|
||||
models.Respond(conn, req.ID, models.SuccessResult{
|
||||
Success: true,
|
||||
Message: fmt.Sprintf("theme uninstalled: %s", theme.Name),
|
||||
})
|
||||
return
|
||||
}
|
||||
|
||||
if err := manager.UninstallByID(idOrName); err != nil {
|
||||
models.RespondError(conn, req.ID, fmt.Sprintf("theme not found: %s", idOrName))
|
||||
return
|
||||
}
|
||||
|
||||
models.Respond(conn, req.ID, models.SuccessResult{
|
||||
Success: true,
|
||||
Message: fmt.Sprintf("theme uninstalled: %s", idOrName),
|
||||
})
|
||||
}
|
||||
@@ -1,57 +0,0 @@
|
||||
package themes
|
||||
|
||||
import (
|
||||
"fmt"
|
||||
"net"
|
||||
|
||||
"github.com/AvengeMedia/DankMaterialShell/core/internal/server/models"
|
||||
"github.com/AvengeMedia/DankMaterialShell/core/internal/themes"
|
||||
)
|
||||
|
||||
func HandleUpdate(conn net.Conn, req models.Request) {
|
||||
idOrName, ok := req.Params["name"].(string)
|
||||
if !ok {
|
||||
models.RespondError(conn, req.ID, "missing or invalid 'name' parameter")
|
||||
return
|
||||
}
|
||||
|
||||
manager, err := themes.NewManager()
|
||||
if err != nil {
|
||||
models.RespondError(conn, req.ID, fmt.Sprintf("failed to create manager: %v", err))
|
||||
return
|
||||
}
|
||||
|
||||
registry, err := themes.NewRegistry()
|
||||
if err != nil {
|
||||
models.RespondError(conn, req.ID, fmt.Sprintf("failed to create registry: %v", err))
|
||||
return
|
||||
}
|
||||
|
||||
themeList, _ := registry.List()
|
||||
theme := themes.FindByIDOrName(idOrName, themeList)
|
||||
|
||||
if theme == nil {
|
||||
models.RespondError(conn, req.ID, fmt.Sprintf("theme not found in registry: %s", idOrName))
|
||||
return
|
||||
}
|
||||
|
||||
installed, err := manager.IsInstalled(*theme)
|
||||
if err != nil {
|
||||
models.RespondError(conn, req.ID, fmt.Sprintf("failed to check if theme is installed: %v", err))
|
||||
return
|
||||
}
|
||||
if !installed {
|
||||
models.RespondError(conn, req.ID, fmt.Sprintf("theme not installed: %s", idOrName))
|
||||
return
|
||||
}
|
||||
|
||||
if err := manager.Update(*theme); err != nil {
|
||||
models.RespondError(conn, req.ID, fmt.Sprintf("failed to update theme: %v", err))
|
||||
return
|
||||
}
|
||||
|
||||
models.Respond(conn, req.ID, models.SuccessResult{
|
||||
Success: true,
|
||||
Message: fmt.Sprintf("theme updated: %s", theme.Name),
|
||||
})
|
||||
}
|
||||
@@ -241,7 +241,6 @@ func (m *Manager) handleHead(e wlr_output_management.ZwlrOutputManagerV1HeadEvent
handle.SetAdaptiveSyncHandler(func(e wlr_output_management.ZwlrOutputHeadV1AdaptiveSyncEvent) {
log.Debugf("WlrOutput: Head %d adaptive sync: %d", headID, e.State)
head.adaptiveSync = e.State
head.adaptiveSyncSupported = true
m.post(func() {
m.updateState()
})
@@ -361,23 +360,22 @@ func (m *Manager) updateState() {
|
||||
}
|
||||
|
||||
output := Output{
|
||||
Name: head.name,
|
||||
Description: head.description,
|
||||
Make: head.make,
|
||||
Model: head.model,
|
||||
SerialNumber: head.serialNumber,
|
||||
PhysicalWidth: head.physicalWidth,
|
||||
PhysicalHeight: head.physicalHeight,
|
||||
Enabled: head.enabled,
|
||||
X: head.x,
|
||||
Y: head.y,
|
||||
Transform: head.transform,
|
||||
Scale: head.scale,
|
||||
CurrentMode: currentMode,
|
||||
Modes: modes,
|
||||
AdaptiveSync: head.adaptiveSync,
|
||||
AdaptiveSyncSupported: head.adaptiveSyncSupported,
|
||||
ID: head.id,
|
||||
Name: head.name,
|
||||
Description: head.description,
|
||||
Make: head.make,
|
||||
Model: head.model,
|
||||
SerialNumber: head.serialNumber,
|
||||
PhysicalWidth: head.physicalWidth,
|
||||
PhysicalHeight: head.physicalHeight,
|
||||
Enabled: head.enabled,
|
||||
X: head.x,
|
||||
Y: head.y,
|
||||
Transform: head.transform,
|
||||
Scale: head.scale,
|
||||
CurrentMode: currentMode,
|
||||
Modes: modes,
|
||||
AdaptiveSync: head.adaptiveSync,
|
||||
ID: head.id,
|
||||
}
|
||||
outputs = append(outputs, output)
|
||||
return true
|
||||
|
||||
@@ -17,23 +17,22 @@ type OutputMode struct {
|
||||
}
|
||||
|
||||
type Output struct {
|
||||
Name string `json:"name"`
|
||||
Description string `json:"description"`
|
||||
Make string `json:"make"`
|
||||
Model string `json:"model"`
|
||||
SerialNumber string `json:"serialNumber"`
|
||||
PhysicalWidth int32 `json:"physicalWidth"`
|
||||
PhysicalHeight int32 `json:"physicalHeight"`
|
||||
Enabled bool `json:"enabled"`
|
||||
X int32 `json:"x"`
|
||||
Y int32 `json:"y"`
|
||||
Transform int32 `json:"transform"`
|
||||
Scale float64 `json:"scale"`
|
||||
CurrentMode *OutputMode `json:"currentMode"`
|
||||
Modes []OutputMode `json:"modes"`
|
||||
AdaptiveSync uint32 `json:"adaptiveSync"`
|
||||
AdaptiveSyncSupported bool `json:"adaptiveSyncSupported"`
|
||||
ID uint32 `json:"id"`
|
||||
Name string `json:"name"`
|
||||
Description string `json:"description"`
|
||||
Make string `json:"make"`
|
||||
Model string `json:"model"`
|
||||
SerialNumber string `json:"serialNumber"`
|
||||
PhysicalWidth int32 `json:"physicalWidth"`
|
||||
PhysicalHeight int32 `json:"physicalHeight"`
|
||||
Enabled bool `json:"enabled"`
|
||||
X int32 `json:"x"`
|
||||
Y int32 `json:"y"`
|
||||
Transform int32 `json:"transform"`
|
||||
Scale float64 `json:"scale"`
|
||||
CurrentMode *OutputMode `json:"currentMode"`
|
||||
Modes []OutputMode `json:"modes"`
|
||||
AdaptiveSync uint32 `json:"adaptiveSync"`
|
||||
ID uint32 `json:"id"`
|
||||
}
|
||||
|
||||
type State struct {
|
||||
@@ -73,26 +72,25 @@ type Manager struct {
|
||||
}
|
||||
|
||||
type headState struct {
|
||||
id uint32
|
||||
handle *wlr_output_management.ZwlrOutputHeadV1
|
||||
name string
|
||||
description string
|
||||
make string
|
||||
model string
|
||||
serialNumber string
|
||||
physicalWidth int32
|
||||
physicalHeight int32
|
||||
enabled bool
|
||||
x int32
|
||||
y int32
|
||||
transform int32
|
||||
scale float64
|
||||
currentModeID uint32
|
||||
modeIDs []uint32
|
||||
adaptiveSync uint32
|
||||
adaptiveSyncSupported bool
|
||||
finished bool
|
||||
ready bool
|
||||
id uint32
|
||||
handle *wlr_output_management.ZwlrOutputHeadV1
|
||||
name string
|
||||
description string
|
||||
make string
|
||||
model string
|
||||
serialNumber string
|
||||
physicalWidth int32
|
||||
physicalHeight int32
|
||||
enabled bool
|
||||
x int32
|
||||
y int32
|
||||
transform int32
|
||||
scale float64
|
||||
currentModeID uint32
|
||||
modeIDs []uint32
|
||||
adaptiveSync uint32
|
||||
finished bool
|
||||
ready bool
|
||||
}
|
||||
|
||||
type modeState struct {
|
||||
@@ -171,7 +169,7 @@ func stateChanged(old, new *State) bool {
if oldOut.Transform != newOut.Transform || oldOut.Scale != newOut.Scale {
return true
}
if oldOut.AdaptiveSync != newOut.AdaptiveSync || oldOut.AdaptiveSyncSupported != newOut.AdaptiveSyncSupported {
if oldOut.AdaptiveSync != newOut.AdaptiveSync {
return true
}
if (oldOut.CurrentMode == nil) != (newOut.CurrentMode == nil) {
@@ -1,258 +0,0 @@
|
||||
package themes
|
||||
|
||||
import (
|
||||
"encoding/json"
|
||||
"fmt"
|
||||
"os"
|
||||
"path/filepath"
|
||||
"strings"
|
||||
|
||||
"github.com/AvengeMedia/DankMaterialShell/core/internal/log"
|
||||
"github.com/spf13/afero"
|
||||
)
|
||||
|
||||
type Manager struct {
|
||||
fs afero.Fs
|
||||
themesDir string
|
||||
}
|
||||
|
||||
func NewManager() (*Manager, error) {
|
||||
return NewManagerWithFs(afero.NewOsFs())
|
||||
}
|
||||
|
||||
func NewManagerWithFs(fs afero.Fs) (*Manager, error) {
|
||||
themesDir := getThemesDir()
|
||||
return &Manager{
|
||||
fs: fs,
|
||||
themesDir: themesDir,
|
||||
}, nil
|
||||
}
|
||||
|
||||
func getThemesDir() string {
|
||||
configDir, err := os.UserConfigDir()
|
||||
if err != nil {
|
||||
log.Error("failed to get user config dir", "err", err)
|
||||
return ""
|
||||
}
|
||||
return filepath.Join(configDir, "DankMaterialShell", "themes")
|
||||
}
|
||||
|
||||
func (m *Manager) IsInstalled(theme Theme) (bool, error) {
|
||||
path := m.getInstalledPath(theme.ID)
|
||||
exists, err := afero.Exists(m.fs, path)
|
||||
if err != nil {
|
||||
return false, err
|
||||
}
|
||||
return exists, nil
|
||||
}
|
||||
|
||||
func (m *Manager) getInstalledDir(themeID string) string {
|
||||
return filepath.Join(m.themesDir, themeID)
|
||||
}
|
||||
|
||||
func (m *Manager) getInstalledPath(themeID string) string {
|
||||
return filepath.Join(m.getInstalledDir(themeID), "theme.json")
|
||||
}
|
||||
|
||||
func (m *Manager) Install(theme Theme, registryThemeDir string) error {
|
||||
themeDir := m.getInstalledDir(theme.ID)
|
||||
|
||||
exists, err := afero.DirExists(m.fs, themeDir)
|
||||
if err != nil {
|
||||
return fmt.Errorf("failed to check if theme exists: %w", err)
|
||||
}
|
||||
|
||||
if exists {
|
||||
return fmt.Errorf("theme already installed: %s", theme.Name)
|
||||
}
|
||||
|
||||
if err := m.fs.MkdirAll(themeDir, 0755); err != nil {
|
||||
return fmt.Errorf("failed to create theme directory: %w", err)
|
||||
}
|
||||
|
||||
data, err := json.MarshalIndent(theme, "", " ")
|
||||
if err != nil {
|
||||
return fmt.Errorf("failed to marshal theme: %w", err)
|
||||
}
|
||||
|
||||
themePath := filepath.Join(themeDir, "theme.json")
|
||||
if err := afero.WriteFile(m.fs, themePath, data, 0644); err != nil {
|
||||
return fmt.Errorf("failed to write theme file: %w", err)
|
||||
}
|
||||
|
||||
m.copyPreviewFiles(registryThemeDir, themeDir, theme)
|
||||
return nil
|
||||
}
|
||||
|
||||
func (m *Manager) copyPreviewFiles(srcDir, dstDir string, theme Theme) {
|
||||
previews := []string{"preview-dark.svg", "preview-light.svg"}
|
||||
|
||||
if theme.Variants != nil {
|
||||
for _, v := range theme.Variants.Options {
|
||||
previews = append(previews,
|
||||
fmt.Sprintf("preview-%s.svg", v.ID),
|
||||
fmt.Sprintf("preview-%s-dark.svg", v.ID),
|
||||
fmt.Sprintf("preview-%s-light.svg", v.ID),
|
||||
)
|
||||
}
|
||||
}
|
||||
|
||||
for _, preview := range previews {
|
||||
srcPath := filepath.Join(srcDir, preview)
|
||||
if exists, _ := afero.Exists(m.fs, srcPath); !exists {
|
||||
continue
|
||||
}
|
||||
data, err := afero.ReadFile(m.fs, srcPath)
|
||||
if err != nil {
|
||||
continue
|
||||
}
|
||||
dstPath := filepath.Join(dstDir, preview)
|
||||
_ = afero.WriteFile(m.fs, dstPath, data, 0644)
|
||||
}
|
||||
}
|
||||
|
||||
func (m *Manager) InstallFromRegistry(registry *Registry, themeID string) error {
|
||||
theme, err := registry.Get(themeID)
|
||||
if err != nil {
|
||||
return err
|
||||
}
|
||||
|
||||
registryThemeDir := registry.GetThemeDir(theme.SourceDir)
|
||||
return m.Install(*theme, registryThemeDir)
|
||||
}
|
||||
|
||||
func (m *Manager) Update(theme Theme) error {
|
||||
themePath := m.getInstalledPath(theme.ID)
|
||||
|
||||
exists, err := afero.Exists(m.fs, themePath)
|
||||
if err != nil {
|
||||
return fmt.Errorf("failed to check if theme exists: %w", err)
|
||||
}
|
||||
|
||||
if !exists {
|
||||
return fmt.Errorf("theme not installed: %s", theme.Name)
|
||||
}
|
||||
|
||||
data, err := json.MarshalIndent(theme, "", " ")
|
||||
if err != nil {
|
||||
return fmt.Errorf("failed to marshal theme: %w", err)
|
||||
}
|
||||
|
||||
if err := afero.WriteFile(m.fs, themePath, data, 0644); err != nil {
|
||||
return fmt.Errorf("failed to write theme file: %w", err)
|
||||
}
|
||||
|
||||
return nil
|
||||
}
|
||||
|
||||
func (m *Manager) Uninstall(theme Theme) error {
|
||||
return m.UninstallByID(theme.ID)
|
||||
}
|
||||
|
||||
func (m *Manager) UninstallByID(themeID string) error {
|
||||
themeDir := m.getInstalledDir(themeID)
|
||||
|
||||
exists, err := afero.DirExists(m.fs, themeDir)
|
||||
if err != nil {
|
||||
return fmt.Errorf("failed to check if theme exists: %w", err)
|
||||
}
|
||||
|
||||
if !exists {
|
||||
return fmt.Errorf("theme not installed: %s", themeID)
|
||||
}
|
||||
|
||||
if err := m.fs.RemoveAll(themeDir); err != nil {
|
||||
return fmt.Errorf("failed to remove theme: %w", err)
|
||||
}
|
||||
|
||||
return nil
|
||||
}
|
||||
|
||||
func (m *Manager) ListInstalled() ([]string, error) {
|
||||
exists, err := afero.DirExists(m.fs, m.themesDir)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
if !exists {
|
||||
return []string{}, nil
|
||||
}
|
||||
|
||||
entries, err := afero.ReadDir(m.fs, m.themesDir)
|
||||
if err != nil {
|
||||
return nil, fmt.Errorf("failed to read themes directory: %w", err)
|
||||
}
|
||||
|
||||
var installed []string
|
||||
for _, entry := range entries {
|
||||
if !entry.IsDir() {
|
||||
continue
|
||||
}
|
||||
|
||||
themeID := entry.Name()
|
||||
themePath := filepath.Join(m.themesDir, themeID, "theme.json")
|
||||
if exists, _ := afero.Exists(m.fs, themePath); exists {
|
||||
installed = append(installed, themeID)
|
||||
}
|
||||
}
|
||||
|
||||
return installed, nil
|
||||
}
|
||||
|
||||
func (m *Manager) GetInstalledTheme(themeID string) (*Theme, error) {
|
||||
themePath := m.getInstalledPath(themeID)
|
||||
|
||||
data, err := afero.ReadFile(m.fs, themePath)
|
||||
if err != nil {
|
||||
return nil, fmt.Errorf("failed to read theme file: %w", err)
|
||||
}
|
||||
|
||||
var theme Theme
|
||||
if err := json.Unmarshal(data, &theme); err != nil {
|
||||
return nil, fmt.Errorf("failed to parse theme file: %w", err)
|
||||
}
|
||||
|
||||
return &theme, nil
|
||||
}
|
||||
|
||||
func (m *Manager) HasUpdates(themeID string, registryTheme Theme) (bool, error) {
|
||||
installed, err := m.GetInstalledTheme(themeID)
|
||||
if err != nil {
|
||||
return false, err
|
||||
}
|
||||
|
||||
return compareVersions(installed.Version, registryTheme.Version) < 0, nil
|
||||
}
|
||||
|
||||
func compareVersions(installed, registry string) int {
|
||||
installedParts := strings.Split(installed, ".")
|
||||
registryParts := strings.Split(registry, ".")
|
||||
|
||||
maxLen := len(installedParts)
|
||||
if len(registryParts) > maxLen {
|
||||
maxLen = len(registryParts)
|
||||
}
|
||||
|
||||
for i := 0; i < maxLen; i++ {
|
||||
var installedNum, registryNum int
|
||||
if i < len(installedParts) {
|
||||
fmt.Sscanf(installedParts[i], "%d", &installedNum)
|
||||
}
|
||||
if i < len(registryParts) {
|
||||
fmt.Sscanf(registryParts[i], "%d", &registryNum)
|
||||
}
|
||||
|
||||
if installedNum < registryNum {
|
||||
return -1
|
||||
}
|
||||
if installedNum > registryNum {
|
||||
return 1
|
||||
}
|
||||
}
|
||||
|
||||
return 0
|
||||
}
|
||||
|
||||
func (m *Manager) GetThemesDir() string {
|
||||
return m.themesDir
|
||||
}
|
||||
@@ -1,309 +0,0 @@
|
||||
package themes
|
||||
|
||||
import (
|
||||
"encoding/json"
|
||||
"errors"
|
||||
"fmt"
|
||||
"os"
|
||||
"path/filepath"
|
||||
|
||||
"github.com/go-git/go-git/v6"
|
||||
"github.com/spf13/afero"
|
||||
)
|
||||
|
||||
const registryRepo = "https://github.com/AvengeMedia/dms-plugin-registry.git"
|
||||
|
||||
type ColorScheme struct {
|
||||
Primary string `json:"primary,omitempty"`
|
||||
PrimaryText string `json:"primaryText,omitempty"`
|
||||
PrimaryContainer string `json:"primaryContainer,omitempty"`
|
||||
Secondary string `json:"secondary,omitempty"`
|
||||
Surface string `json:"surface,omitempty"`
|
||||
SurfaceText string `json:"surfaceText,omitempty"`
|
||||
SurfaceVariant string `json:"surfaceVariant,omitempty"`
|
||||
SurfaceVariantText string `json:"surfaceVariantText,omitempty"`
|
||||
SurfaceTint string `json:"surfaceTint,omitempty"`
|
||||
Background string `json:"background,omitempty"`
|
||||
BackgroundText string `json:"backgroundText,omitempty"`
|
||||
Outline string `json:"outline,omitempty"`
|
||||
SurfaceContainer string `json:"surfaceContainer,omitempty"`
|
||||
SurfaceContainerHigh string `json:"surfaceContainerHigh,omitempty"`
|
||||
SurfaceContainerHighest string `json:"surfaceContainerHighest,omitempty"`
|
||||
Error string `json:"error,omitempty"`
|
||||
Warning string `json:"warning,omitempty"`
|
||||
Info string `json:"info,omitempty"`
|
||||
}
|
||||
|
||||
type ThemeVariant struct {
|
||||
ID string `json:"id"`
|
||||
Name string `json:"name"`
|
||||
Dark ColorScheme `json:"dark,omitempty"`
|
||||
Light ColorScheme `json:"light,omitempty"`
|
||||
}
|
||||
|
||||
type ThemeFlavor struct {
|
||||
ID string `json:"id"`
|
||||
Name string `json:"name"`
|
||||
Dark ColorScheme `json:"dark,omitempty"`
|
||||
Light ColorScheme `json:"light,omitempty"`
|
||||
}
|
||||
|
||||
type ThemeAccent struct {
|
||||
ID string `json:"id"`
|
||||
Name string `json:"name"`
|
||||
FlavorColors map[string]ColorScheme `json:"-"`
|
||||
}
|
||||
|
||||
func (a *ThemeAccent) UnmarshalJSON(data []byte) error {
|
||||
var raw map[string]json.RawMessage
|
||||
if err := json.Unmarshal(data, &raw); err != nil {
|
||||
return err
|
||||
}
|
||||
a.FlavorColors = make(map[string]ColorScheme)
|
||||
var mErr error
|
||||
for key, value := range raw {
|
||||
switch key {
|
||||
case "id":
|
||||
mErr = errors.Join(mErr, json.Unmarshal(value, &a.ID))
|
||||
case "name":
|
||||
mErr = errors.Join(mErr, json.Unmarshal(value, &a.Name))
|
||||
default:
|
||||
var colors ColorScheme
|
||||
if err := json.Unmarshal(value, &colors); err == nil {
|
||||
a.FlavorColors[key] = colors
|
||||
} else {
|
||||
mErr = errors.Join(mErr, fmt.Errorf("failed to unmarshal flavor colors for key %s: %w", key, err))
|
||||
}
|
||||
}
|
||||
}
|
||||
return mErr
|
||||
}
|
||||
|
||||
func (a ThemeAccent) MarshalJSON() ([]byte, error) {
|
||||
m := map[string]any{
|
||||
"id": a.ID,
|
||||
"name": a.Name,
|
||||
}
|
||||
for k, v := range a.FlavorColors {
|
||||
m[k] = v
|
||||
}
|
||||
return json.Marshal(m)
|
||||
}
|
||||
|
||||
type MultiVariantDefaults struct {
|
||||
Dark map[string]string `json:"dark,omitempty"`
|
||||
Light map[string]string `json:"light,omitempty"`
|
||||
}
|
||||
|
||||
type ThemeVariants struct {
|
||||
Type string `json:"type,omitempty"`
|
||||
Default string `json:"default,omitempty"`
|
||||
Defaults *MultiVariantDefaults `json:"defaults,omitempty"`
|
||||
Options []ThemeVariant `json:"options,omitempty"`
|
||||
Flavors []ThemeFlavor `json:"flavors,omitempty"`
|
||||
Accents []ThemeAccent `json:"accents,omitempty"`
|
||||
}
|
||||
|
||||
type Theme struct {
|
||||
ID string `json:"id"`
|
||||
Name string `json:"name"`
|
||||
Version string `json:"version"`
|
||||
Author string `json:"author"`
|
||||
Description string `json:"description"`
|
||||
Dark ColorScheme `json:"dark"`
|
||||
Light ColorScheme `json:"light"`
|
||||
Variants *ThemeVariants `json:"variants,omitempty"`
|
||||
PreviewPath string `json:"-"`
|
||||
SourceDir string `json:"sourceDir,omitempty"`
|
||||
}
|
||||
|
||||
type GitClient interface {
|
||||
PlainClone(path string, url string) error
|
||||
Pull(path string) error
|
||||
}
|
||||
|
||||
type realGitClient struct{}
|
||||
|
||||
func (g *realGitClient) PlainClone(path string, url string) error {
|
||||
_, err := git.PlainClone(path, &git.CloneOptions{
|
||||
URL: url,
|
||||
Progress: os.Stdout,
|
||||
})
|
||||
return err
|
||||
}
|
||||
|
||||
func (g *realGitClient) Pull(path string) error {
|
||||
repo, err := git.PlainOpen(path)
|
||||
if err != nil {
|
||||
return err
|
||||
}
|
||||
|
||||
worktree, err := repo.Worktree()
|
||||
if err != nil {
|
||||
return err
|
||||
}
|
||||
|
||||
err = worktree.Pull(&git.PullOptions{})
|
||||
if err != nil && err.Error() != "already up-to-date" {
|
||||
return err
|
||||
}
|
||||
|
||||
return nil
|
||||
}
|
||||
|
||||
type Registry struct {
|
||||
fs afero.Fs
|
||||
cacheDir string
|
||||
themes []Theme
|
||||
git GitClient
|
||||
}
|
||||
|
||||
func NewRegistry() (*Registry, error) {
|
||||
return NewRegistryWithFs(afero.NewOsFs())
|
||||
}
|
||||
|
||||
func NewRegistryWithFs(fs afero.Fs) (*Registry, error) {
|
||||
cacheDir := getCacheDir()
|
||||
return &Registry{
|
||||
fs: fs,
|
||||
cacheDir: cacheDir,
|
||||
git: &realGitClient{},
|
||||
}, nil
|
||||
}
|
||||
|
||||
func getCacheDir() string {
|
||||
return filepath.Join(os.TempDir(), "dankdots-plugin-registry")
|
||||
}
|
||||
|
||||
func (r *Registry) Update() error {
|
||||
exists, err := afero.DirExists(r.fs, r.cacheDir)
|
||||
if err != nil {
|
||||
return fmt.Errorf("failed to check cache directory: %w", err)
|
||||
}
|
||||
|
||||
if !exists {
|
||||
if err := r.fs.MkdirAll(filepath.Dir(r.cacheDir), 0755); err != nil {
|
||||
return fmt.Errorf("failed to create cache directory: %w", err)
|
||||
}
|
||||
|
||||
if err := r.git.PlainClone(r.cacheDir, registryRepo); err != nil {
|
||||
return fmt.Errorf("failed to clone registry: %w", err)
|
||||
}
|
||||
} else {
|
||||
if err := r.git.Pull(r.cacheDir); err != nil {
|
||||
if err := r.fs.RemoveAll(r.cacheDir); err != nil {
|
||||
return fmt.Errorf("failed to remove corrupted registry: %w", err)
|
||||
}
|
||||
|
||||
if err := r.fs.MkdirAll(filepath.Dir(r.cacheDir), 0755); err != nil {
|
||||
return fmt.Errorf("failed to create cache directory: %w", err)
|
||||
}
|
||||
|
||||
if err := r.git.PlainClone(r.cacheDir, registryRepo); err != nil {
|
||||
return fmt.Errorf("failed to re-clone registry: %w", err)
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
return r.loadThemes()
|
||||
}
|
||||
|
||||
func (r *Registry) loadThemes() error {
|
||||
themesDir := filepath.Join(r.cacheDir, "themes")
|
||||
|
||||
entries, err := afero.ReadDir(r.fs, themesDir)
|
||||
if err != nil {
|
||||
return fmt.Errorf("failed to read themes directory: %w", err)
|
||||
}
|
||||
|
||||
r.themes = []Theme{}
|
||||
|
||||
for _, entry := range entries {
|
||||
if !entry.IsDir() {
|
||||
continue
|
||||
}
|
||||
|
||||
themeDir := filepath.Join(themesDir, entry.Name())
|
||||
themeFile := filepath.Join(themeDir, "theme.json")
|
||||
|
||||
data, err := afero.ReadFile(r.fs, themeFile)
|
||||
if err != nil {
|
||||
continue
|
||||
}
|
||||
|
||||
var theme Theme
|
||||
if err := json.Unmarshal(data, &theme); err != nil {
|
||||
continue
|
||||
}
|
||||
|
||||
if theme.ID == "" {
|
||||
theme.ID = entry.Name()
|
||||
}
|
||||
theme.SourceDir = entry.Name()
|
||||
|
||||
previewPath := filepath.Join(themeDir, "preview.svg")
|
||||
if exists, _ := afero.Exists(r.fs, previewPath); exists {
|
||||
theme.PreviewPath = previewPath
|
||||
}
|
||||
|
||||
r.themes = append(r.themes, theme)
|
||||
}
|
||||
|
||||
return nil
|
||||
}
|
||||
|
||||
func (r *Registry) List() ([]Theme, error) {
|
||||
if len(r.themes) == 0 {
|
||||
if err := r.Update(); err != nil {
|
||||
return nil, err
|
||||
}
|
||||
}
|
||||
|
||||
return SortByFirstParty(r.themes), nil
|
||||
}
|
||||
|
||||
func (r *Registry) Search(query string) ([]Theme, error) {
|
||||
allThemes, err := r.List()
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
if query == "" {
|
||||
return allThemes, nil
|
||||
}
|
||||
|
||||
return SortByFirstParty(FuzzySearch(query, allThemes)), nil
|
||||
}
|
||||
|
||||
func (r *Registry) Get(idOrName string) (*Theme, error) {
|
||||
themes, err := r.List()
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
for _, t := range themes {
|
||||
if t.ID == idOrName {
|
||||
return &t, nil
|
||||
}
|
||||
}
|
||||
|
||||
for _, t := range themes {
|
||||
if t.Name == idOrName {
|
||||
return &t, nil
|
||||
}
|
||||
}
|
||||
|
||||
return nil, fmt.Errorf("theme not found: %s", idOrName)
|
||||
}
|
||||
|
||||
func (r *Registry) GetThemeSourcePath(themeID string) string {
|
||||
return filepath.Join(r.cacheDir, "themes", themeID, "theme.json")
|
||||
}
|
||||
|
||||
func (r *Registry) GetThemeDir(themeID string) string {
|
||||
return filepath.Join(r.cacheDir, "themes", themeID)
|
||||
}
|
||||
|
||||
func SortByFirstParty(themes []Theme) []Theme {
|
||||
return themes
|
||||
}
|
||||
@@ -1,40 +0,0 @@
package themes

import (
    "strings"

    "github.com/AvengeMedia/DankMaterialShell/core/internal/utils"
)

func FuzzySearch(query string, themes []Theme) []Theme {
    if query == "" {
        return themes
    }

    queryLower := strings.ToLower(query)
    return utils.Filter(themes, func(t Theme) bool {
        return fuzzyMatch(queryLower, strings.ToLower(t.Name)) ||
            fuzzyMatch(queryLower, strings.ToLower(t.Description)) ||
            fuzzyMatch(queryLower, strings.ToLower(t.Author))
    })
}

func fuzzyMatch(query, text string) bool {
    queryIdx := 0
    for _, char := range text {
        if queryIdx < len(query) && char == rune(query[queryIdx]) {
            queryIdx++
        }
    }
    return queryIdx == len(query)
}

func FindByIDOrName(idOrName string, themes []Theme) *Theme {
    if t, found := utils.Find(themes, func(t Theme) bool { return t.ID == idOrName }); found {
        return &t
    }
    if t, found := utils.Find(themes, func(t Theme) bool { return t.Name == idOrName }); found {
        return &t
    }
    return nil
}

@@ -6,12 +6,3 @@ func CommandExists(cmd string) bool {
    _, err := exec.LookPath(cmd)
    return err == nil
}

func AnyCommandExists(cmds ...string) bool {
    for _, cmd := range cmds {
        if CommandExists(cmd) {
            return true
        }
    }
    return false
}

@@ -1,4 +1,4 @@
dms-git (1.0.2+git2528.d336866fdb1) nightly; urgency=medium
dms-git (1.0.2+git2528.d336866f) nightly; urgency=medium

  * Git snapshot (commit 2528: d336866f)

@@ -16,6 +16,23 @@ dms-git (1.0.2+git2518.a783d650) nightly; urgency=medium

 -- Avenge Media <AvengeMedia.US@gmail.com> Sat, 13 Dec 2025 15:11:40 +0000

dms-git (1.0.2+git2510.0f89886c) nightly; urgency=medium

  * Git snapshot (commit 2510: 0f89886c)

 -- Avenge Media <AvengeMedia.US@gmail.com> Sat, 13 Dec 2025 06:46:43 +0000

dms-git (1.0.2+git2507.b2ac9c6c) nightly; urgency=medium

  * Git snapshot (commit 2507: b2ac9c6c)

 -- Avenge Media <AvengeMedia.US@gmail.com> Sat, 13 Dec 2025 06:18:05 +0000

dms-git (1.0.2+git2505.82f881af) nightly; urgency=medium

  * Git snapshot (commit 2505: 82f881af)

 -- Avenge Media <AvengeMedia.US@gmail.com> Sat, 13 Dec 2025 05:55:03 +0000

dms-git (1.0.0+git2419.993f14a3) nightly; urgency=medium

@@ -3,19 +3,19 @@
  <service name="download_url">
    <param name="protocol">https</param>
    <param name="host">github.com</param>
    <param name="path">/AvengeMedia/DankMaterialShell/archive/refs/tags/v1.0.3.tar.gz</param>
    <param name="path">/AvengeMedia/DankMaterialShell/archive/refs/tags/v1.0.2.tar.gz</param>
    <param name="filename">dms-source.tar.gz</param>
  </service>
  <!-- Download amd64 binary -->
  <service name="download_url">
    <param name="protocol">https</param>
    <param name="host">github.com</param>
    <param name="path">/AvengeMedia/DankMaterialShell/releases/download/v1.0.3/dms-distropkg-amd64.gz</param>
    <param name="path">/AvengeMedia/DankMaterialShell/releases/download/v1.0.2/dms-distropkg-amd64.gz</param>
  </service>
  <!-- Download arm64 binary -->
  <service name="download_url">
    <param name="protocol">https</param>
    <param name="host">github.com</param>
    <param name="path">/AvengeMedia/DankMaterialShell/releases/download/v1.0.3/dms-distropkg-arm64.gz</param>
    <param name="path">/AvengeMedia/DankMaterialShell/releases/download/v1.0.2/dms-distropkg-arm64.gz</param>
  </service>
</services>

@@ -1,12 +1,6 @@
dms (1.0.3db1) unstable; urgency=medium
dms (1.0.2ppa6) unstable; urgency=medium

  * Update to v1.0.3 stable release

 -- Avenge Media <AvengeMedia.US@gmail.com> Mon, 16 Dec 2025 10:00:00 +0000

dms (1.0.2db1) unstable; urgency=medium

  * Update to v1.0.2 stable release
  * Rebuild to fix repository metadata issues

 -- Avenge Media <AvengeMedia.US@gmail.com> Sat, 13 Dec 2025 06:47:39 +0000

@@ -11,7 +11,7 @@ Vcs-Git: https://github.com/AvengeMedia/DankMaterialShell.git
Package: dms
Architecture: amd64
Depends: ${misc:Depends},
         quickshell | quickshell-git,
         quickshell-git | quickshell,
         accountsservice,
         cava,
         cliphist,

@@ -1,135 +0,0 @@
|
||||
# Spec for DMS - uses rpkg macros for git builds
|
||||
|
||||
%global debug_package %{nil}
|
||||
%global version {{{ git_repo_version }}}
|
||||
%global pkg_summary DankMaterialShell - Material 3 inspired shell for Wayland compositors
|
||||
|
||||
Name: dms
|
||||
Epoch: 2
|
||||
Version: %{version}
|
||||
Release: 1%{?dist}
|
||||
Summary: %{pkg_summary}
|
||||
|
||||
License: MIT
|
||||
URL: https://github.com/AvengeMedia/DankMaterialShell
|
||||
VCS: {{{ git_repo_vcs }}}
|
||||
Source0: {{{ git_repo_pack }}}
|
||||
|
||||
BuildRequires: git-core
|
||||
BuildRequires: gzip
|
||||
BuildRequires: golang >= 1.24
|
||||
BuildRequires: make
|
||||
BuildRequires: wget
|
||||
BuildRequires: systemd-rpm-macros
|
||||
|
||||
# Core requirements
|
||||
Requires: (quickshell-git or quickshell)
|
||||
Requires: accountsservice
|
||||
Requires: dms-cli = %{epoch}:%{version}-%{release}
|
||||
Requires: dgop
|
||||
|
||||
# Core utilities (Highly recommended for DMS functionality)
|
||||
Recommends: cava
|
||||
Recommends: danksearch
|
||||
Recommends: matugen
|
||||
Recommends: quickshell-git
|
||||
Recommends: wl-clipboard
|
||||
|
||||
# Recommended system packages
|
||||
Recommends: NetworkManager
|
||||
Recommends: qt6-qtmultimedia
|
||||
Suggests: qt6ct
|
||||
|
||||
%description
|
||||
DankMaterialShell (DMS) is a modern Wayland desktop shell built with Quickshell
|
||||
and optimized for the niri, hyprland, sway, and dwl (MangoWC) compositors. Features notifications,
|
||||
app launcher, wallpaper customization, and fully customizable with plugins.
|
||||
|
||||
Includes auto-theming for GTK/Qt apps with matugen, 20+ customizable widgets,
|
||||
process monitoring, notification center, clipboard history, dock, control center,
|
||||
lock screen, and comprehensive plugin system.
|
||||
|
||||
%package -n dms-cli
|
||||
Summary: DankMaterialShell CLI tool
|
||||
License: MIT
|
||||
URL: https://github.com/AvengeMedia/DankMaterialShell
|
||||
|
||||
%description -n dms-cli
|
||||
Command-line interface for DankMaterialShell configuration and management.
|
||||
Provides native DBus bindings, NetworkManager integration, and system utilities.
|
||||
|
||||
%prep
|
||||
{{{ git_repo_setup_macro }}}
|
||||
|
||||
%build
|
||||
# Build DMS CLI from source (core/subdirectory)
|
||||
VERSION="%{version}"
|
||||
COMMIT=$(echo "%{version}" | grep -oP '[a-f0-9]{7,}' | head -n1 || echo "unknown")
|
||||
|
||||
cd core
|
||||
make dist VERSION="$VERSION" COMMIT="$COMMIT"
|
||||
|
||||
%install
|
||||
# Install dms-cli binary (built from source)
|
||||
case "%{_arch}" in
|
||||
x86_64)
|
||||
DMS_BINARY="dms-linux-amd64"
|
||||
;;
|
||||
aarch64)
|
||||
DMS_BINARY="dms-linux-arm64"
|
||||
;;
|
||||
*)
|
||||
echo "Unsupported architecture: %{_arch}"
|
||||
exit 1
|
||||
;;
|
||||
esac
|
||||
|
||||
install -Dm755 core/bin/${DMS_BINARY} %{buildroot}%{_bindir}/dms
|
||||
|
||||
# Shell completions
|
||||
install -d %{buildroot}%{_datadir}/bash-completion/completions
|
||||
install -d %{buildroot}%{_datadir}/zsh/site-functions
|
||||
install -d %{buildroot}%{_datadir}/fish/vendor_completions.d
|
||||
core/bin/${DMS_BINARY} completion bash > %{buildroot}%{_datadir}/bash-completion/completions/dms || :
|
||||
core/bin/${DMS_BINARY} completion zsh > %{buildroot}%{_datadir}/zsh/site-functions/_dms || :
|
||||
core/bin/${DMS_BINARY} completion fish > %{buildroot}%{_datadir}/fish/vendor_completions.d/dms.fish || :
|
||||
|
||||
# Install systemd user service
|
||||
install -Dm644 assets/systemd/dms.service %{buildroot}%{_userunitdir}/dms.service
|
||||
|
||||
install -Dm644 assets/dms-open.desktop %{buildroot}%{_datadir}/applications/dms-open.desktop
|
||||
install -Dm644 assets/danklogo.svg %{buildroot}%{_datadir}/icons/hicolor/scalable/apps/danklogo.svg
|
||||
|
||||
# Install shell files to shared data location
|
||||
install -dm755 %{buildroot}%{_datadir}/quickshell/dms
|
||||
cp -r quickshell/* %{buildroot}%{_datadir}/quickshell/dms/
|
||||
|
||||
# Remove build files
|
||||
rm -rf %{buildroot}%{_datadir}/quickshell/dms/.git*
|
||||
rm -f %{buildroot}%{_datadir}/quickshell/dms/.gitignore
|
||||
rm -rf %{buildroot}%{_datadir}/quickshell/dms/.github
|
||||
rm -rf %{buildroot}%{_datadir}/quickshell/dms/distro
|
||||
|
||||
echo "%{version}" > %{buildroot}%{_datadir}/quickshell/dms/VERSION
|
||||
|
||||
%posttrans
|
||||
# Signal running DMS instances to reload
|
||||
pkill -USR1 -x dms >/dev/null 2>&1 || :
|
||||
|
||||
%files
|
||||
%license LICENSE
|
||||
%doc CONTRIBUTING.md
|
||||
%doc quickshell/README.md
|
||||
%{_datadir}/quickshell/dms/
|
||||
%{_userunitdir}/dms.service
|
||||
%{_datadir}/applications/dms-open.desktop
|
||||
%{_datadir}/icons/hicolor/scalable/apps/danklogo.svg
|
||||
|
||||
%files -n dms-cli
|
||||
%{_bindir}/dms
|
||||
%{_datadir}/bash-completion/completions/dms
|
||||
%{_datadir}/zsh/site-functions/_dms
|
||||
%{_datadir}/fish/vendor_completions.d/dms.fish
|
||||
|
||||
%changelog
|
||||
{{{ git_repo_changelog }}}
|
||||
@@ -1,22 +1,22 @@
|
||||
# Spec for DMS Greeter - Stable releases
|
||||
# Spec for DMS Greeter - Git builds using rpkg macros
|
||||
|
||||
%global debug_package %{nil}
|
||||
%global version VERSION_PLACEHOLDER
|
||||
%global version {{{ git_repo_version }}}
|
||||
%global pkg_summary DankMaterialShell greeter for greetd
|
||||
|
||||
Name: dms-greeter
|
||||
Version: %{version}
|
||||
Release: RELEASE_PLACEHOLDER%{?dist}
|
||||
Release: 0.git%{?dist}
|
||||
Summary: %{pkg_summary}
|
||||
|
||||
License: MIT
|
||||
URL: https://github.com/AvengeMedia/DankMaterialShell
|
||||
VCS: {{{ git_repo_vcs }}}
|
||||
Source0: {{{ git_repo_pack }}}
|
||||
|
||||
Source0: dms-qml.tar.gz
|
||||
|
||||
BuildRequires: gzip
|
||||
BuildRequires: wget
|
||||
BuildRequires: systemd-rpm-macros
|
||||
BuildRequires: git-core
|
||||
# For the _tmpfilesdir macro.
|
||||
BuildRequires: systemd-rpm-macros
|
||||
|
||||
Requires: greetd
|
||||
Requires: (quickshell-git or quickshell)
|
||||
@@ -24,11 +24,14 @@ Requires(post): /usr/sbin/useradd
|
||||
Requires(post): /usr/sbin/groupadd
|
||||
|
||||
Recommends: policycoreutils-python-utils
|
||||
Recommends: acl
|
||||
Recommends: setfacl
|
||||
Suggests: niri
|
||||
Suggests: hyprland
|
||||
Suggests: sway
|
||||
|
||||
# Provides: greetd-dms-greeter = %{version}-%{release}
|
||||
# Conflicts: greetd-dms-greeter
|
||||
|
||||
%description
|
||||
DankMaterialShell greeter for greetd login manager. A modern, Material Design 3
|
||||
inspired greeter interface built with Quickshell for Wayland compositors.
|
||||
@@ -38,26 +41,31 @@ compositor detection and configuration. Features session selection, user
|
||||
authentication, and dynamic theming.
|
||||
|
||||
%prep
|
||||
%setup -q -c -n dms-qml
|
||||
|
||||
%build
|
||||
{{{ git_repo_setup_macro }}}
|
||||
|
||||
%install
|
||||
# Install greeter files to shared data location
|
||||
# Install greeter files to shared data location (from quickshell/ subdirectory)
|
||||
install -dm755 %{buildroot}%{_datadir}/quickshell/dms-greeter
|
||||
cp -r %{_builddir}/dms-qml/* %{buildroot}%{_datadir}/quickshell/dms-greeter/
|
||||
cp -r quickshell/* %{buildroot}%{_datadir}/quickshell/dms-greeter/
|
||||
|
||||
install -Dm755 %{_builddir}/dms-qml/Modules/Greetd/assets/dms-greeter %{buildroot}%{_bindir}/dms-greeter
|
||||
# Install launcher script
|
||||
install -Dm755 quickshell/Modules/Greetd/assets/dms-greeter %{buildroot}%{_bindir}/dms-greeter
|
||||
|
||||
install -Dm644 %{_builddir}/dms-qml/Modules/Greetd/README.md %{buildroot}%{_docdir}/dms-greeter/README.md
|
||||
# Install documentation
|
||||
install -Dm644 quickshell/Modules/Greetd/README.md %{buildroot}%{_docdir}/dms-greeter/README.md
|
||||
|
||||
install -Dpm0644 %{_builddir}/dms-qml/systemd/tmpfiles-dms-greeter.conf %{buildroot}%{_tmpfilesdir}/dms-greeter.conf
|
||||
# Create cache directory for greeter data
|
||||
install -Dpm0644 quickshell/systemd/tmpfiles-dms-greeter.conf %{buildroot}%{_tmpfilesdir}/dms-greeter.conf
|
||||
|
||||
install -Dm644 %{_builddir}/dms-qml/LICENSE %{buildroot}%{_docdir}/dms-greeter/LICENSE
|
||||
# Install LICENSE file
|
||||
install -Dm644 LICENSE %{buildroot}%{_docdir}/dms-greeter/LICENSE
|
||||
|
||||
# Create greeter home directory
|
||||
install -dm755 %{buildroot}%{_sharedstatedir}/greeter
|
||||
|
||||
# Note: We do NOT install a PAM config here to avoid conflicting w/greetd packages
|
||||
# Note: We do NOT install a PAM config here to avoid conflicting with greetd package
|
||||
# Instead, we verify/fix it in %post if needed
|
||||
|
||||
# Remove build and development files
|
||||
rm -rf %{buildroot}%{_datadir}/quickshell/dms-greeter/.git*
|
||||
rm -f %{buildroot}%{_datadir}/quickshell/dms-greeter/.gitignore
|
||||
@@ -65,8 +73,9 @@ rm -rf %{buildroot}%{_datadir}/quickshell/dms-greeter/.github
|
||||
rm -rf %{buildroot}%{_datadir}/quickshell/dms-greeter/distro
|
||||
|
||||
%posttrans
|
||||
# Clean up old installation path from previous versions (only if empty)
|
||||
if [ -d "%{_sysconfdir}/xdg/quickshell/dms-greeter" ]; then
|
||||
# Remove directories & preserves any user-added files
|
||||
# Remove directories only if empty (preserves any user-added files)
|
||||
rmdir "%{_sysconfdir}/xdg/quickshell/dms-greeter" 2>/dev/null || true
|
||||
rmdir "%{_sysconfdir}/xdg/quickshell" 2>/dev/null || true
|
||||
rmdir "%{_sysconfdir}/xdg" 2>/dev/null || true
|
||||
@@ -80,7 +89,7 @@ fi
|
||||
%{_tmpfilesdir}/%{name}.conf
|
||||
|
||||
%pre
|
||||
# Create greeter user/group if they don't exist
|
||||
# Create greeter user/group if they don't exist (greetd expects this)
|
||||
getent group greeter >/dev/null || groupadd -r greeter
|
||||
getent passwd greeter >/dev/null || \
|
||||
useradd -r -g greeter -d %{_sharedstatedir}/greeter -s /bin/bash \
|
||||
@@ -118,6 +127,7 @@ chown -R greeter:greeter %{_sharedstatedir}/greeter 2>/dev/null || true
|
||||
# Verify PAM configuration - only fix if insufficient
|
||||
PAM_CONFIG="/etc/pam.d/greetd"
|
||||
if [ ! -f "$PAM_CONFIG" ]; then
|
||||
# PAM config doesn't exist - create it
|
||||
cat > "$PAM_CONFIG" << 'PAM_EOF'
|
||||
#%PAM-1.0
|
||||
auth substack system-auth
|
||||
@@ -139,6 +149,7 @@ PAM_EOF
|
||||
# Only show message on initial install
|
||||
[ "$1" -eq 1 ] && echo "Created PAM configuration for greetd"
|
||||
elif ! grep -q "pam_systemd\|system-auth" "$PAM_CONFIG"; then
|
||||
# PAM config exists but looks insufficient - back it up and replace
|
||||
cp "$PAM_CONFIG" "$PAM_CONFIG.backup-dms-greeter"
|
||||
cat > "$PAM_CONFIG" << 'PAM_EOF'
|
||||
#%PAM-1.0
|
||||
@@ -187,8 +198,9 @@ command = "/usr/bin/dms-greeter --command COMPOSITOR_PLACEHOLDER"
|
||||
GREETD_EOF
|
||||
sed -i "s|COMPOSITOR_PLACEHOLDER|$COMPOSITOR|" "$GREETD_CONFIG"
|
||||
CONFIG_STATUS="Created new config with $COMPOSITOR ✓"
|
||||
|
||||
# If config exists and doesn't have dms-greeter, update it
|
||||
elif ! grep -q "dms-greeter" "$GREETD_CONFIG"; then
|
||||
# Backup existing config
|
||||
BACKUP_FILE="${GREETD_CONFIG}.backup-$(date +%%Y%%m%%d-%%H%%M%%S)"
|
||||
cp "$GREETD_CONFIG" "$BACKUP_FILE" 2>/dev/null || true
|
||||
|
||||
@@ -255,6 +267,4 @@ if [ "$1" -eq 0 ] && [ -x /usr/sbin/semanage ]; then
|
||||
fi
|
||||
|
||||
%changelog
|
||||
* CHANGELOG_DATE_PLACEHOLDER AvengeMedia <contact@avengemedia.com> - VERSION_PLACEHOLDER-RELEASE_PLACEHOLDER
|
||||
- Stable release VERSION_PLACEHOLDER
|
||||
- Built from GitHub release
|
||||
{{{ git_repo_changelog }}}
|
||||
|
||||
@@ -1,40 +1,49 @@
|
||||
# Feodra spec for DMS stable releases
|
||||
# Spec for DMS - uses rpkg macros for git builds
|
||||
|
||||
%global debug_package %{nil}
|
||||
%global version VERSION_PLACEHOLDER
|
||||
%global version {{{ git_repo_version }}}
|
||||
%global pkg_summary DankMaterialShell - Material 3 inspired shell for Wayland compositors
|
||||
|
||||
Name: dms
|
||||
Epoch: 2
|
||||
Version: %{version}
|
||||
Release: RELEASE_PLACEHOLDER%{?dist}
|
||||
Release: 1%{?dist}
|
||||
Summary: %{pkg_summary}
|
||||
|
||||
License: MIT
|
||||
URL: https://github.com/AvengeMedia/DankMaterialShell
|
||||
VCS: {{{ git_repo_vcs }}}
|
||||
Source0: {{{ git_repo_pack }}}
|
||||
|
||||
Source0: dms-qml.tar.gz
|
||||
|
||||
BuildRequires: git-core
|
||||
BuildRequires: gzip
|
||||
BuildRequires: golang >= 1.24
|
||||
BuildRequires: make
|
||||
BuildRequires: wget
|
||||
BuildRequires: systemd-rpm-macros
|
||||
|
||||
Requires: (quickshell or quickshell-git)
|
||||
# Core requirements
|
||||
Requires: (quickshell-git or quickshell)
|
||||
Requires: accountsservice
|
||||
Requires: dms-cli = %{version}-%{release}
|
||||
Requires: dms-cli = %{epoch}:%{version}-%{release}
|
||||
Requires: dgop
|
||||
|
||||
# Core utilities (Highly recommended for DMS functionality)
|
||||
Recommends: cava
|
||||
Recommends: cliphist
|
||||
Recommends: danksearch
|
||||
Recommends: matugen
|
||||
Recommends: quickshell-git
|
||||
Recommends: wl-clipboard
|
||||
|
||||
# Recommended system packages
|
||||
Recommends: NetworkManager
|
||||
Recommends: qt6-qtmultimedia
|
||||
Suggests: qt6ct
|
||||
|
||||
%description
|
||||
DankMaterialShell (DMS) is a modern Wayland desktop shell built with Quickshell
|
||||
and optimized for the niri and hyprland compositors. Features notifications,
|
||||
and optimized for the niri, hyprland, sway, and dwl (MangoWC) compositors. Features notifications,
|
||||
app launcher, wallpaper customization, and fully customizable with plugins.
|
||||
|
||||
Includes auto-theming for GTK/Qt apps with matugen, 20+ customizable widgets,
|
||||
@@ -51,14 +60,24 @@ Command-line interface for DankMaterialShell configuration and management.
|
||||
Provides native DBus bindings, NetworkManager integration, and system utilities.
|
||||
|
||||
%prep
|
||||
%setup -q -c -n dms-qml
|
||||
{{{ git_repo_setup_macro }}}
|
||||
|
||||
%build
|
||||
# Build DMS CLI from source (core/subdirectory)
|
||||
VERSION="%{version}"
|
||||
COMMIT=$(echo "%{version}" | grep -oP '[a-f0-9]{7,}' | head -n1 || echo "unknown")
|
||||
|
||||
cd core
|
||||
make dist VERSION="$VERSION" COMMIT="$COMMIT"
|
||||
|
||||
%install
|
||||
# Install dms-cli binary (built from source)
|
||||
case "%{_arch}" in
|
||||
x86_64)
|
||||
ARCH_SUFFIX="amd64"
|
||||
DMS_BINARY="dms-linux-amd64"
|
||||
;;
|
||||
aarch64)
|
||||
ARCH_SUFFIX="arm64"
|
||||
DMS_BINARY="dms-linux-arm64"
|
||||
;;
|
||||
*)
|
||||
echo "Unsupported architecture: %{_arch}"
|
||||
@@ -66,35 +85,27 @@ case "%{_arch}" in
|
||||
;;
|
||||
esac
|
||||
|
||||
# Download dms-cli for target architecture
|
||||
wget -O %{_builddir}/dms-cli.gz "https://github.com/AvengeMedia/DankMaterialShell/releases/latest/download/dms-distropkg-${ARCH_SUFFIX}.gz" || {
|
||||
echo "Failed to download dms-cli for architecture %{_arch}"
|
||||
exit 1
|
||||
}
|
||||
gunzip -c %{_builddir}/dms-cli.gz > %{_builddir}/dms-cli
|
||||
chmod +x %{_builddir}/dms-cli
|
||||
|
||||
%build
|
||||
|
||||
%install
|
||||
install -Dm755 %{_builddir}/dms-cli %{buildroot}%{_bindir}/dms
|
||||
install -Dm755 core/bin/${DMS_BINARY} %{buildroot}%{_bindir}/dms
|
||||
|
||||
# Shell completions
|
||||
install -d %{buildroot}%{_datadir}/bash-completion/completions
|
||||
install -d %{buildroot}%{_datadir}/zsh/site-functions
|
||||
install -d %{buildroot}%{_datadir}/fish/vendor_completions.d
|
||||
%{_builddir}/dms-cli completion bash > %{buildroot}%{_datadir}/bash-completion/completions/dms || :
|
||||
%{_builddir}/dms-cli completion zsh > %{buildroot}%{_datadir}/zsh/site-functions/_dms || :
|
||||
%{_builddir}/dms-cli completion fish > %{buildroot}%{_datadir}/fish/vendor_completions.d/dms.fish || :
|
||||
core/bin/${DMS_BINARY} completion bash > %{buildroot}%{_datadir}/bash-completion/completions/dms || :
|
||||
core/bin/${DMS_BINARY} completion zsh > %{buildroot}%{_datadir}/zsh/site-functions/_dms || :
|
||||
core/bin/${DMS_BINARY} completion fish > %{buildroot}%{_datadir}/fish/vendor_completions.d/dms.fish || :
|
||||
|
||||
install -Dm644 %{_builddir}/dms-qml/assets/systemd/dms.service %{buildroot}%{_userunitdir}/dms.service
|
||||
# Install systemd user service
|
||||
install -Dm644 assets/systemd/dms.service %{buildroot}%{_userunitdir}/dms.service
|
||||
|
||||
install -Dm644 %{_builddir}/dms-qml/assets/dms-open.desktop %{buildroot}%{_datadir}/applications/dms-open.desktop
|
||||
install -Dm644 %{_builddir}/dms-qml/assets/danklogo.svg %{buildroot}%{_datadir}/icons/hicolor/scalable/apps/danklogo.svg
|
||||
install -Dm644 assets/dms-open.desktop %{buildroot}%{_datadir}/applications/dms-open.desktop
|
||||
install -Dm644 assets/danklogo.svg %{buildroot}%{_datadir}/icons/hicolor/scalable/apps/danklogo.svg
|
||||
|
||||
# Install shell files to shared data location
|
||||
install -dm755 %{buildroot}%{_datadir}/quickshell/dms
|
||||
cp -r %{_builddir}/dms-qml/* %{buildroot}%{_datadir}/quickshell/dms/
|
||||
cp -r quickshell/* %{buildroot}%{_datadir}/quickshell/dms/
|
||||
|
||||
# Remove build files
|
||||
rm -rf %{buildroot}%{_datadir}/quickshell/dms/.git*
|
||||
rm -f %{buildroot}%{_datadir}/quickshell/dms/.gitignore
|
||||
rm -rf %{buildroot}%{_datadir}/quickshell/dms/.github
|
||||
@@ -108,7 +119,8 @@ pkill -USR1 -x dms >/dev/null 2>&1 || :
|
||||
|
||||
%files
|
||||
%license LICENSE
|
||||
%doc README.md CONTRIBUTING.md
|
||||
%doc CONTRIBUTING.md
|
||||
%doc quickshell/README.md
|
||||
%{_datadir}/quickshell/dms/
|
||||
%{_userunitdir}/dms.service
|
||||
%{_datadir}/applications/dms-open.desktop
|
||||
@@ -121,6 +133,4 @@ pkill -USR1 -x dms >/dev/null 2>&1 || :
|
||||
%{_datadir}/fish/vendor_completions.d/dms.fish
|
||||
|
||||
%changelog
|
||||
* CHANGELOG_DATE_PLACEHOLDER AvengeMedia <contact@avengemedia.com> - VERSION_PLACEHOLDER-RELEASE_PLACEHOLDER
|
||||
- Stable release VERSION_PLACEHOLDER
|
||||
- Built from GitHub release
|
||||
{{{ git_repo_changelog }}}
|
||||
|
||||
@@ -6,13 +6,13 @@
|
||||
...
|
||||
}:
|
||||
let
|
||||
cfg = config.programs.dank-material-shell;
|
||||
cfg = config.programs.dankMaterialShell;
|
||||
in
|
||||
{
|
||||
packages = [
|
||||
dmsPkgs.dms-shell
|
||||
]
|
||||
++ lib.optional cfg.enableSystemMonitoring cfg.dgop.package
|
||||
++ lib.optional cfg.enableSystemMonitoring dmsPkgs.dgop
|
||||
++ lib.optionals cfg.enableVPN [
|
||||
pkgs.glib
|
||||
pkgs.networkmanager
|
||||
|
||||
@@ -1,15 +0,0 @@
|
||||
{ lib, ... }:
|
||||
{
|
||||
imports = [
|
||||
(lib.mkRenamedOptionModule
|
||||
[
|
||||
"programs"
|
||||
"dankMaterialShell"
|
||||
]
|
||||
[
|
||||
"programs"
|
||||
"dank-material-shell"
|
||||
]
|
||||
)
|
||||
];
|
||||
}
|
||||
@@ -7,7 +7,7 @@
|
||||
}:
|
||||
let
|
||||
inherit (lib) types;
|
||||
cfg = config.programs.dank-material-shell.greeter;
|
||||
cfg = config.programs.dankMaterialShell.greeter;
|
||||
|
||||
inherit (config.services.greetd.settings.default_session) user;
|
||||
|
||||
@@ -44,20 +44,19 @@ in
|
||||
{
|
||||
imports =
|
||||
let
|
||||
msg = "The option 'programs.dank-material-shell.greeter.compositor.extraConfig' is deprecated. Please use 'programs.dank-material-shell.greeter.compositor.customConfig' instead.";
|
||||
msg = "The option 'programs.dankMaterialShell.greeter.compositor.extraConfig' is deprecated. Please use 'programs.dankMaterialShell.greeter.compositor.customConfig' instead.";
|
||||
in
|
||||
[
|
||||
(lib.mkRemovedOptionModule [
|
||||
"programs"
|
||||
"dank-material-shell"
|
||||
"dankMaterialShell"
|
||||
"greeter"
|
||||
"compositor"
|
||||
"extraConfig"
|
||||
] msg)
|
||||
./dms-rename.nix
|
||||
];
|
||||
|
||||
options.programs.dank-material-shell.greeter = {
|
||||
options.programs.dankMaterialShell.greeter = {
|
||||
enable = lib.mkEnableOption "DankMaterialShell greeter";
|
||||
compositor.name = lib.mkOption {
|
||||
type = types.enum [
|
||||
@@ -178,7 +177,7 @@ in
|
||||
mv dms-colors.json colors.json || :
|
||||
chown ${user}: * || :
|
||||
'';
|
||||
programs.dank-material-shell.greeter.configFiles = lib.mkIf (cfg.configHome != null) [
|
||||
programs.dankMaterialShell.greeter.configFiles = lib.mkIf (cfg.configHome != null) [
|
||||
"${cfg.configHome}/.config/DankMaterialShell/settings.json"
|
||||
"${cfg.configHome}/.local/state/DankMaterialShell/session.json"
|
||||
"${cfg.configHome}/.cache/DankMaterialShell/dms-colors.json"
|
||||
|
||||
@@ -6,7 +6,7 @@
|
||||
...
|
||||
}@args:
|
||||
let
|
||||
cfg = config.programs.dank-material-shell;
|
||||
cfg = config.programs.dankMaterialShell;
|
||||
jsonFormat = pkgs.formats.json { };
|
||||
common = import ./common.nix {
|
||||
inherit
|
||||
@@ -22,16 +22,16 @@ in
|
||||
(import ./options.nix args)
|
||||
(lib.mkRemovedOptionModule [
|
||||
"programs"
|
||||
"dank-material-shell"
|
||||
"dankMaterialShell"
|
||||
"enableNightMode"
|
||||
] "Night mode is now always available.")
|
||||
(lib.mkRenamedOptionModule
|
||||
[ "programs" "dank-material-shell" "enableSystemd" ]
|
||||
[ "programs" "dank-material-shell" "systemd" "enable" ]
|
||||
[ "programs" "dankMaterialShell" "enableSystemd" ]
|
||||
[ "programs" "dankMaterialShell" "systemd" "enable" ]
|
||||
)
|
||||
];
|
||||
|
||||
options.programs.dank-material-shell = with lib.types; {
|
||||
options.programs.dankMaterialShell = with lib.types; {
|
||||
default = {
|
||||
settings = lib.mkOption {
|
||||
type = jsonFormat.type;
|
||||
|
||||
@@ -4,14 +4,10 @@
|
||||
...
|
||||
}:
|
||||
let
|
||||
cfg = config.programs.dank-material-shell;
|
||||
cfg = config.programs.dankMaterialShell;
|
||||
in
|
||||
{
|
||||
imports = [
|
||||
./dms-rename.nix
|
||||
];
|
||||
|
||||
options.programs.dank-material-shell = {
|
||||
options.programs.dankMaterialShell = {
|
||||
niri = {
|
||||
enableKeybinds = lib.mkEnableOption "DankMaterialShell niri keybinds";
|
||||
enableSpawn = lib.mkEnableOption "DankMaterialShell niri spawn-at-startup";
|
||||
|
||||
@@ -6,7 +6,7 @@
|
||||
...
|
||||
}@args:
|
||||
let
|
||||
cfg = config.programs.dank-material-shell;
|
||||
cfg = config.programs.dankMaterialShell;
|
||||
common = import ./common.nix {
|
||||
inherit
|
||||
config
|
||||
|
||||
@@ -1,14 +1,13 @@
|
||||
{
|
||||
lib,
|
||||
dmsPkgs,
|
||||
pkgs,
|
||||
...
|
||||
}:
|
||||
let
|
||||
inherit (lib) types;
|
||||
path = [
|
||||
"programs"
|
||||
"dank-material-shell"
|
||||
"dankMaterialShell"
|
||||
];
|
||||
|
||||
builtInRemovedMsg = "This is now built-in in DMS and doesn't need additional dependencies.";
|
||||
@@ -21,58 +20,46 @@ in
|
||||
(lib.mkRemovedOptionModule (
|
||||
path ++ [ "enableSystemSound" ]
|
||||
) "qtmultimedia is now included on dms-shell package.")
|
||||
./dms-rename.nix
|
||||
];
|
||||
|
||||
options.programs.dank-material-shell = {
|
||||
options.programs.dankMaterialShell = {
|
||||
enable = lib.mkEnableOption "DankMaterialShell";
|
||||
|
||||
systemd = {
|
||||
enable = lib.mkEnableOption "DankMaterialShell systemd startup";
|
||||
restartIfChanged = lib.mkOption {
|
||||
type = types.bool;
|
||||
default = true;
|
||||
description = "Auto-restart dms.service when dank-material-shell changes";
|
||||
description = "Auto-restart dms.service when dankMaterialShell changes";
|
||||
};
|
||||
};
|
||||
|
||||
dgop = {
|
||||
package = lib.mkPackageOption pkgs "dgop" {};
|
||||
};
|
||||
|
||||
enableSystemMonitoring = lib.mkOption {
|
||||
type = types.bool;
|
||||
default = true;
|
||||
description = "Add needed dependencies to use system monitoring widgets";
|
||||
};
|
||||
|
||||
enableVPN = lib.mkOption {
|
||||
type = types.bool;
|
||||
default = true;
|
||||
description = "Add needed dependencies to use the VPN widget";
|
||||
};
|
||||
|
||||
enableDynamicTheming = lib.mkOption {
|
||||
type = types.bool;
|
||||
default = true;
|
||||
description = "Add needed dependencies to have dynamic theming support";
|
||||
};
|
||||
|
||||
enableAudioWavelength = lib.mkOption {
|
||||
type = types.bool;
|
||||
default = true;
|
||||
description = "Add needed dependencies to have audio wavelength support";
|
||||
};
|
||||
|
||||
enableCalendarEvents = lib.mkOption {
|
||||
type = types.bool;
|
||||
default = true;
|
||||
description = "Add calendar events support via khal";
|
||||
};
|
||||
|
||||
quickshell = {
|
||||
package = lib.mkPackageOption dmsPkgs "quickshell" {
|
||||
extraDescription = "The quickshell package to use (defaults to be built from source, due to unreleased features used by DMS).";
|
||||
extraDescription = "The quickshell package to use (defaults to be built from source, in the commit 26531f due to unreleased features used by DMS).";
|
||||
};
|
||||
};
|
||||
|
||||
|
||||
@@ -3,8 +3,8 @@
|
||||
%global debug_package %{nil}
|
||||
|
||||
Name: dms
|
||||
Version: 1.0.3
|
||||
Release: 1%{?dist}
|
||||
Version: 1.0.2
|
||||
Release: 7%{?dist}
|
||||
Summary: DankMaterialShell - Material 3 inspired shell for Wayland compositors
|
||||
|
||||
License: MIT
|
||||
@@ -17,7 +17,7 @@ BuildRequires: gzip
|
||||
BuildRequires: systemd-rpm-macros
|
||||
|
||||
# Core requirements
|
||||
Requires: (quickshell or quickshell-git)
|
||||
Requires: (quickshell-git or quickshell)
|
||||
Requires: accountsservice
|
||||
Requires: dgop
|
||||
|
||||
@@ -105,9 +105,6 @@ pkill -USR1 -x dms >/dev/null 2>&1 || :
|
||||
%{_datadir}/icons/hicolor/scalable/apps/danklogo.svg
|
||||
|
||||
%changelog
|
||||
* Mon Dec 16 2025 AvengeMedia <maintainer@avengemedia.com> - 1.0.3-1
|
||||
- Update to stable v1.0.3 release
|
||||
|
||||
* Fri Dec 12 2025 AvengeMedia <maintainer@avengemedia.com> - 1.0.2-1
|
||||
- Update to stable v1.0.2 release
|
||||
- Bug fixes and improvements
|
||||
|
||||
@@ -2,121 +2,226 @@
|
||||
set -euo pipefail
|
||||
|
||||
# Build SRPM locally with correct tarball and upload to Copr
|
||||
# Usage: ./copr-upload.sh [PACKAGE] [VERSION] [RELEASE]
|
||||
# Examples:
|
||||
# ./copr-upload.sh dms 1.0.3 1
|
||||
# ./copr-upload.sh dms-greeter 1.0.3 1
|
||||
|
||||
PACKAGE="${1:-dms}"
|
||||
VERSION="${2:-}"
|
||||
RELEASE="${3:-1}"
|
||||
# Usage: ./create-upload-copr.sh VERSION [RELEASE]
|
||||
# Example: ./create-upload-copr.sh 1.0.0 4
|
||||
|
||||
VERSION="${1:-1.0.0}"
|
||||
RELEASE="${2:-1}"
|
||||
SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
|
||||
REPO_ROOT="$(cd "$SCRIPT_DIR/../.." && pwd)"
|
||||
|
||||
# Determine Copr project based on package
|
||||
if [ "$PACKAGE" = "dms" ]; then
|
||||
COPR_PROJECT="avengemedia/dms"
|
||||
elif [ "$PACKAGE" = "dms-greeter" ]; then
|
||||
COPR_PROJECT="avengemedia/danklinux"
|
||||
else
|
||||
echo "❌ Unknown package: $PACKAGE"
|
||||
echo "Supported packages: dms, dms-greeter"
|
||||
exit 1
|
||||
fi
|
||||
|
||||
# Get version from latest release if not provided
|
||||
if [ -z "$VERSION" ]; then
|
||||
echo "📦 Determining latest version..."
|
||||
VERSION=$(curl -s https://api.github.com/repos/AvengeMedia/DankMaterialShell/releases/latest | jq -r '.tag_name' | sed 's/^v//')
|
||||
if [ -z "$VERSION" ] || [ "$VERSION" = "null" ]; then
|
||||
echo "❌ Failed to determine version. Please specify manually."
|
||||
exit 1
|
||||
fi
|
||||
echo "✅ Using latest version: $VERSION"
|
||||
fi
|
||||
|
||||
echo "Building ${PACKAGE} v${VERSION}-${RELEASE} SRPM for Copr..."
|
||||
echo "Building DMS v${VERSION}-${RELEASE} SRPM for Copr..."
|
||||
|
||||
# Setup build directories
|
||||
mkdir -p ~/rpmbuild/{BUILD,BUILDROOT,RPMS,SOURCES,SPECS,SRPMS}
|
||||
cd ~/rpmbuild/SOURCES
|
||||
|
||||
# Download source tarball from GitHub releases
|
||||
echo "📦 Downloading source tarball for v${VERSION}..."
|
||||
if [ ! -f ~/rpmbuild/SOURCES/dms-qml.tar.gz ]; then
|
||||
wget -O ~/rpmbuild/SOURCES/dms-qml.tar.gz "https://github.com/AvengeMedia/DankMaterialShell/releases/download/v${VERSION}/dms-qml.tar.gz" || {
|
||||
echo "❌ Failed to download dms-qml.tar.gz for v${VERSION}"
|
||||
exit 1
|
||||
}
|
||||
echo "✅ Source tarball downloaded"
|
||||
else
|
||||
echo "✅ Source tarball already exists"
|
||||
fi
|
||||
# Create the corrected QML tarball locally
|
||||
echo "Creating QML tarball with assets..."
|
||||
TEMP_DIR=$(mktemp -d)
|
||||
cd "$REPO_ROOT"
|
||||
|
||||
# Copy and prepare spec file
|
||||
echo "📝 Preparing spec file..."
|
||||
SPEC_FILE="$REPO_ROOT/distro/fedora/${PACKAGE}.spec"
|
||||
if [ ! -f "$SPEC_FILE" ]; then
|
||||
echo "❌ Spec file not found: $SPEC_FILE"
|
||||
exit 1
|
||||
fi
|
||||
# Copy quickshell contents to temp
|
||||
cp -r quickshell/* "$TEMP_DIR/"
|
||||
|
||||
cp "$SPEC_FILE" ~/rpmbuild/SPECS/"${PACKAGE}".spec
|
||||
# Copy root LICENSE and CONTRIBUTING.md
|
||||
cp LICENSE CONTRIBUTING.md "$TEMP_DIR/"
|
||||
|
||||
# Replace placeholders in spec file
|
||||
# Copy root assets directory (this is what was missing!)
|
||||
cp -r assets "$TEMP_DIR/"
|
||||
|
||||
# Create tarball
|
||||
cd "$TEMP_DIR"
|
||||
tar --exclude='.git' \
|
||||
--exclude='.github' \
|
||||
--exclude='*.tar.gz' \
|
||||
-czf ~/rpmbuild/SOURCES/dms-qml.tar.gz .
|
||||
|
||||
cd ~/rpmbuild/SOURCES
|
||||
echo "Created dms-qml.tar.gz with md5sum: $(md5sum dms-qml.tar.gz | awk '{print $1}')"
|
||||
rm -rf "$TEMP_DIR"
|
||||
|
||||
# Generate spec file
|
||||
echo "Generating spec file..."
|
||||
CHANGELOG_DATE="$(date '+%a %b %d %Y')"
|
||||
sed -i "s/VERSION_PLACEHOLDER/${VERSION}/g" ~/rpmbuild/SPECS/"${PACKAGE}".spec
|
||||
sed -i "s/RELEASE_PLACEHOLDER/${RELEASE}/g" ~/rpmbuild/SPECS/"${PACKAGE}".spec
|
||||
sed -i "s/CHANGELOG_DATE_PLACEHOLDER/${CHANGELOG_DATE}/g" ~/rpmbuild/SPECS/"${PACKAGE}".spec
|
||||
|
||||
echo "✅ Spec file prepared for ${PACKAGE} v${VERSION}-${RELEASE}"
|
||||
cat >~/rpmbuild/SPECS/dms.spec <<'SPECEOF'
|
||||
# Spec for DMS stable releases - Built locally
|
||||
|
||||
%global debug_package %{nil}
|
||||
%global version VERSION_PLACEHOLDER
|
||||
%global pkg_summary DankMaterialShell - Material 3 inspired shell for Wayland compositors
|
||||
|
||||
Name: dms
|
||||
Version: %{version}
|
||||
Release: RELEASE_PLACEHOLDER%{?dist}
|
||||
Summary: %{pkg_summary}
|
||||
|
||||
License: MIT
|
||||
URL: https://github.com/AvengeMedia/DankMaterialShell
|
||||
|
||||
Source0: dms-qml.tar.gz
|
||||
|
||||
BuildRequires: gzip
|
||||
BuildRequires: wget
|
||||
BuildRequires: systemd-rpm-macros
|
||||
|
||||
Requires: (quickshell or quickshell-git)
|
||||
Requires: accountsservice
|
||||
Requires: dms-cli = %{version}-%{release}
|
||||
Requires: dgop
|
||||
|
||||
Recommends: cava
|
||||
Recommends: cliphist
|
||||
Recommends: danksearch
|
||||
Recommends: matugen
|
||||
Recommends: wl-clipboard
|
||||
Recommends: NetworkManager
|
||||
Recommends: qt6-qtmultimedia
|
||||
Suggests: qt6ct
|
||||
|
||||
%description
|
||||
DankMaterialShell (DMS) is a modern Wayland desktop shell built with Quickshell
|
||||
and optimized for the niri and hyprland compositors. Features notifications,
|
||||
app launcher, wallpaper customization, and fully customizable with plugins.
|
||||
|
||||
Includes auto-theming for GTK/Qt apps with matugen, 20+ customizable widgets,
|
||||
process monitoring, notification center, clipboard history, dock, control center,
|
||||
lock screen, and comprehensive plugin system.
|
||||
|
||||
%package -n dms-cli
|
||||
Summary: DankMaterialShell CLI tool
|
||||
License: MIT
|
||||
URL: https://github.com/AvengeMedia/DankMaterialShell
|
||||
|
||||
%description -n dms-cli
|
||||
Command-line interface for DankMaterialShell configuration and management.
|
||||
Provides native DBus bindings, NetworkManager integration, and system utilities.
|
||||
|
||||
%prep
|
||||
%setup -q -c -n dms-qml
|
||||
|
||||
# Download architecture-specific binaries during build
|
||||
case "%{_arch}" in
|
||||
x86_64)
|
||||
ARCH_SUFFIX="amd64"
|
||||
;;
|
||||
aarch64)
|
||||
ARCH_SUFFIX="arm64"
|
||||
;;
|
||||
*)
|
||||
echo "Unsupported architecture: %{_arch}"
|
||||
exit 1
|
||||
;;
|
||||
esac
|
||||
|
||||
wget -O %{_builddir}/dms-cli.gz "https://github.com/AvengeMedia/DankMaterialShell/releases/download/v%{version}/dms-distropkg-${ARCH_SUFFIX}.gz" || {
|
||||
echo "Failed to download dms-cli for architecture %{_arch}"
|
||||
exit 1
|
||||
}
|
||||
gunzip -c %{_builddir}/dms-cli.gz > %{_builddir}/dms-cli
|
||||
chmod +x %{_builddir}/dms-cli
|
||||
|
||||
%build
|
||||
|
||||
%install
|
||||
install -Dm755 %{_builddir}/dms-cli %{buildroot}%{_bindir}/dms
|
||||
|
||||
install -d %{buildroot}%{_datadir}/bash-completion/completions
|
||||
install -d %{buildroot}%{_datadir}/zsh/site-functions
|
||||
install -d %{buildroot}%{_datadir}/fish/vendor_completions.d
|
||||
%{_builddir}/dms-cli completion bash > %{buildroot}%{_datadir}/bash-completion/completions/dms || :
|
||||
%{_builddir}/dms-cli completion zsh > %{buildroot}%{_datadir}/zsh/site-functions/_dms || :
|
||||
%{_builddir}/dms-cli completion fish > %{buildroot}%{_datadir}/fish/vendor_completions.d/dms.fish || :
|
||||
|
||||
install -Dm644 assets/systemd/dms.service %{buildroot}%{_userunitdir}/dms.service
|
||||
|
||||
install -Dm644 assets/dms-open.desktop %{buildroot}%{_datadir}/applications/dms-open.desktop
|
||||
install -Dm644 assets/danklogo.svg %{buildroot}%{_datadir}/icons/hicolor/scalable/apps/danklogo.svg
|
||||
|
||||
install -dm755 %{buildroot}%{_datadir}/quickshell/dms
|
||||
cp -r %{_builddir}/dms-qml/* %{buildroot}%{_datadir}/quickshell/dms/
|
||||
|
||||
rm -rf %{buildroot}%{_datadir}/quickshell/dms/.git*
|
||||
rm -f %{buildroot}%{_datadir}/quickshell/dms/.gitignore
|
||||
rm -rf %{buildroot}%{_datadir}/quickshell/dms/.github
|
||||
rm -rf %{buildroot}%{_datadir}/quickshell/dms/distro
|
||||
|
||||
echo "%{version}" > %{buildroot}%{_datadir}/quickshell/dms/VERSION
|
||||
|
||||
%posttrans
|
||||
if [ -d "%{_sysconfdir}/xdg/quickshell/dms" ]; then
|
||||
rmdir "%{_sysconfdir}/xdg/quickshell/dms" 2>/dev/null || true
|
||||
rmdir "%{_sysconfdir}/xdg/quickshell" 2>/dev/null || true
|
||||
rmdir "%{_sysconfdir}/xdg" 2>/dev/null || true
|
||||
fi
|
||||
# Signal running DMS instances to reload
|
||||
pkill -USR1 -x dms >/dev/null 2>&1 || :
|
||||
|
||||
%files
|
||||
%license LICENSE
|
||||
%doc README.md CONTRIBUTING.md
|
||||
%{_datadir}/quickshell/dms/
|
||||
%{_userunitdir}/dms.service
|
||||
%{_datadir}/applications/dms-open.desktop
|
||||
%{_datadir}/icons/hicolor/scalable/apps/danklogo.svg
|
||||
|
||||
%files -n dms-cli
|
||||
%{_bindir}/dms
|
||||
%{_datadir}/bash-completion/completions/dms
|
||||
%{_datadir}/zsh/site-functions/_dms
|
||||
%{_datadir}/fish/vendor_completions.d/dms.fish
|
||||
|
||||
%changelog
|
||||
* CHANGELOG_DATE_PLACEHOLDER AvengeMedia <contact@avengemedia.com> - VERSION_PLACEHOLDER-1
|
||||
- Stable release VERSION_PLACEHOLDER
|
||||
- Built locally with corrected tarball
|
||||
SPECEOF
|
||||
|
||||
sed -i "s/VERSION_PLACEHOLDER/${VERSION}/g" ~/rpmbuild/SPECS/dms.spec
|
||||
sed -i "s/RELEASE_PLACEHOLDER/${RELEASE}/g" ~/rpmbuild/SPECS/dms.spec
|
||||
sed -i "s/CHANGELOG_DATE_PLACEHOLDER/${CHANGELOG_DATE}/g" ~/rpmbuild/SPECS/dms.spec
|
||||
|
||||
# Build SRPM
|
||||
echo "🔨 Building SRPM..."
|
||||
echo "Building SRPM..."
|
||||
cd ~/rpmbuild/SPECS
|
||||
rpmbuild -bs "${PACKAGE}".spec
|
||||
rpmbuild -bs dms.spec
|
||||
|
||||
SRPM=$(ls ~/rpmbuild/SRPMS/"${PACKAGE}"-"${VERSION}"-*.src.rpm | tail -n 1)
|
||||
SRPM=$(ls ~/rpmbuild/SRPMS/dms-"${VERSION}"-*.src.rpm | tail -n 1)
|
||||
if [ ! -f "$SRPM" ]; then
|
||||
echo "❌ Error: SRPM not found!"
|
||||
echo "Expected pattern: ${PACKAGE}-${VERSION}-*.src.rpm"
|
||||
ls -la ~/rpmbuild/SRPMS/ || true
|
||||
echo "Error: SRPM not found!"
|
||||
exit 1
|
||||
fi
|
||||
|
||||
echo "✅ SRPM built successfully: $SRPM"
|
||||
echo "SRPM built successfully: $SRPM"
|
||||
|
||||
# Check if copr-cli is installed
|
||||
if ! command -v copr-cli &>/dev/null; then
|
||||
echo ""
|
||||
echo "⚠️ copr-cli is not installed. Install it with:"
|
||||
echo "copr-cli is not installed. Install it with:"
|
||||
echo " pip install copr-cli"
|
||||
echo ""
|
||||
echo "Then configure it with your Copr API token in ~/.config/copr"
|
||||
echo ""
|
||||
echo "SRPM is ready at: $SRPM"
|
||||
echo "Upload manually with: copr-cli build $COPR_PROJECT $SRPM"
|
||||
echo "Upload manually with: copr-cli build avengemedia/dms $SRPM"
|
||||
exit 0
|
||||
fi
|
||||
|
||||
# Upload to Copr
|
||||
echo ""
|
||||
echo "🚀 Uploading to Copr..."
|
||||
if copr-cli build "$COPR_PROJECT" "$SRPM" --nowait; then
|
||||
echo "Uploading to Copr..."
|
||||
if copr-cli build avengemedia/dms "$SRPM" --nowait; then
|
||||
echo ""
|
||||
echo "✅ Build submitted successfully!"
|
||||
echo "📊 Check status at:"
|
||||
echo " https://copr.fedorainfracloud.org/coprs/${COPR_PROJECT}/builds/"
|
||||
echo ""
|
||||
echo "📦 SRPM location: $SRPM"
|
||||
echo "Build submitted successfully! Check status at:"
|
||||
echo "https://copr.fedorainfracloud.org/coprs/avengemedia/dms/builds/"
|
||||
else
|
||||
echo ""
|
||||
echo "❌ Copr upload failed. You can manually upload the SRPM:"
|
||||
echo " copr-cli build $COPR_PROJECT $SRPM"
|
||||
echo "Copr upload failed. You can manually upload the SRPM:"
|
||||
echo " copr-cli build avengemedia/dms $SRPM"
|
||||
echo ""
|
||||
echo "Or upload via web interface:"
|
||||
echo " https://copr.fedorainfracloud.org/coprs/${COPR_PROJECT}/builds/"
|
||||
echo " https://copr.fedorainfracloud.org/coprs/avengemedia/dms/builds/"
|
||||
echo ""
|
||||
echo "SRPM location: $SRPM"
|
||||
exit 1
|
||||
|
||||
@@ -7,8 +7,8 @@
|
||||
# ./distro/scripts/obs-upload.sh dms "Update to v1.0.2"
|
||||
# ./distro/scripts/obs-upload.sh debian dms
|
||||
# ./distro/scripts/obs-upload.sh opensuse dms-git
|
||||
# ./distro/scripts/obs-upload.sh debian dms-git 2 # Rebuild with db2 suffix
|
||||
# ./distro/scripts/obs-upload.sh dms-git --rebuild=2 # Rebuild with db2 suffix (flag syntax)
|
||||
# ./distro/scripts/obs-upload.sh debian dms-git 2 # Rebuild with ppa2 suffix
|
||||
# ./distro/scripts/obs-upload.sh dms-git --rebuild=2 # Rebuild with ppa2 suffix (flag syntax)
|
||||
|
||||
set -e
|
||||
|
||||
@@ -126,8 +126,8 @@ check_obs_version_exists() {
|
||||
OBS_VERSION=$(echo "$OBS_SPEC" | grep "^Version:" | awk '{print $2}' | xargs)
|
||||
# Commit hash check for -git packages
|
||||
if [[ "$CHECK_MODE" == "commit" ]] && [[ "$PACKAGE" == *"-git" ]]; then
|
||||
OBS_COMMIT=$(echo "$OBS_VERSION" | grep -oP '\.([a-f0-9]{8})(db[0-9]+)?$' | grep -oP '[a-f0-9]{8}' || echo "")
|
||||
NEW_COMMIT=$(echo "$VERSION" | grep -oP '\.([a-f0-9]{8})(db[0-9]+)?$' | grep -oP '[a-f0-9]{8}' || echo "")
|
||||
OBS_COMMIT=$(echo "$OBS_VERSION" | grep -oP '\.([a-f0-9]{8})(ppa[0-9]+)?$' | grep -oP '[a-f0-9]{8}' || echo "")
|
||||
NEW_COMMIT=$(echo "$VERSION" | grep -oP '\.([a-f0-9]{8})(ppa[0-9]+)?$' | grep -oP '[a-f0-9]{8}' || echo "")
|
||||
|
||||
if [[ -n "$OBS_COMMIT" && -n "$NEW_COMMIT" && "$OBS_COMMIT" == "$NEW_COMMIT" ]]; then
|
||||
echo "⚠️ Commit $NEW_COMMIT already exists in OBS (current version: $OBS_VERSION)"
|
||||
@@ -279,8 +279,7 @@ if [[ -d "distro/debian/$PACKAGE/debian" ]]; then
|
||||
|
||||
# Apply rebuild suffix if specified (must happen before API check)
|
||||
if [[ -n "$REBUILD_RELEASE" ]] && [[ -n "$CHANGELOG_VERSION" ]]; then
|
||||
BASE_VERSION=$(echo "$CHANGELOG_VERSION" | sed 's/db[0-9]*$//')
|
||||
CHANGELOG_VERSION="${BASE_VERSION}db${REBUILD_RELEASE}"
|
||||
CHANGELOG_VERSION="${CHANGELOG_VERSION}ppa${REBUILD_RELEASE}"
|
||||
echo " - Applied rebuild suffix: $CHANGELOG_VERSION"
|
||||
fi
|
||||
|
||||
@@ -308,16 +307,12 @@ if [[ -d "distro/debian/$PACKAGE/debian" ]]; then
|
||||
else
|
||||
# Rebuild number specified - check if this exact version already exists (exact mode)
|
||||
if check_obs_version_exists "$OBS_PROJECT" "$PACKAGE" "$CHANGELOG_VERSION" "exact"; then
|
||||
echo "==> Version $CHANGELOG_VERSION already exists in OBS"
|
||||
echo " This exact version (including db${REBUILD_RELEASE}) is already uploaded."
|
||||
echo " Skipping upload - nothing to do."
|
||||
echo ""
|
||||
echo " 💡 To rebuild with a different release number, try incrementing:"
|
||||
echo "==> Error: Version $CHANGELOG_VERSION already exists in OBS"
|
||||
echo " This exact version (including ppa${REBUILD_RELEASE}) is already uploaded."
|
||||
echo " To rebuild with a different release number, try incrementing:"
|
||||
NEXT_NUM=$((REBUILD_RELEASE + 1))
|
||||
echo " REBUILD_RELEASE=$NEXT_NUM"
|
||||
echo ""
|
||||
echo "✓ Exiting gracefully (no changes needed)"
|
||||
exit 0
|
||||
echo " ./distro/scripts/obs-upload.sh $PACKAGE $NEXT_NUM"
|
||||
exit 1
|
||||
fi
|
||||
fi
|
||||
fi
|
||||
@@ -516,7 +511,7 @@ if [[ "$UPLOAD_DEBIAN" == true ]] && [[ -d "distro/debian/$PACKAGE/debian" ]]; t
|
||||
|
||||
if [[ -n "$URL_PROTOCOL" && -n "$URL_HOST" && -n "$URL_PATH" ]]; then
|
||||
SOURCE_URL="${URL_PROTOCOL}://${URL_HOST}${URL_PATH}"
|
||||
echo "==> Downloading source from: $SOURCE_URL"
|
||||
echo " Downloading source from: $SOURCE_URL"
|
||||
|
||||
if wget -q -O "$TEMP_DIR/source-archive" "$SOURCE_URL" 2>/dev/null ||
|
||||
curl -L -f -s -o "$TEMP_DIR/source-archive" "$SOURCE_URL" 2>/dev/null; then
|
||||
@@ -539,17 +534,9 @@ if [[ "$UPLOAD_DEBIAN" == true ]] && [[ -d "distro/debian/$PACKAGE/debian" ]]; t
|
||||
fi
|
||||
SOURCE_DIR=$(cd "$SOURCE_DIR" && pwd)
|
||||
cd "$REPO_ROOT"
|
||||
if [[ "$(pwd)" != "$REPO_ROOT" ]]; then
|
||||
echo "ERROR: Failed to return to REPO_ROOT. Expected: $REPO_ROOT, Got: $(pwd)"
|
||||
exit 1
|
||||
fi
|
||||
else
|
||||
echo "ERROR: Failed to download source from $SOURCE_URL"
|
||||
echo "Attempted both wget and curl"
|
||||
echo "Please check:"
|
||||
echo " 1. URL is accessible: $SOURCE_URL"
|
||||
echo " 2. _service file has correct version"
|
||||
echo " 3. GitHub releases are available"
|
||||
echo "Error: Failed to download source from $SOURCE_URL"
|
||||
echo "Tried both wget and curl. Please check the URL and network connectivity."
|
||||
exit 1
|
||||
fi
|
||||
fi
|
||||
@@ -566,7 +553,7 @@ if [[ "$UPLOAD_DEBIAN" == true ]] && [[ -d "distro/debian/$PACKAGE/debian" ]]; t
|
||||
exit 1
|
||||
fi
|
||||
|
||||
echo "==> Found source directory: $SOURCE_DIR"
|
||||
echo " Found source directory: $SOURCE_DIR"
|
||||
|
||||
# Vendor Go dependencies for dms-git
|
||||
if [[ "$PACKAGE" == "dms-git" ]] && [[ -d "$SOURCE_DIR/core" ]]; then
|
||||
@@ -725,10 +712,6 @@ if [[ "$UPLOAD_DEBIAN" == true ]] && [[ -d "distro/debian/$PACKAGE/debian" ]]; t
|
||||
TARBALL_BASE=$(basename "$SOURCE_DIR")
|
||||
tar --sort=name --mtime='2000-01-01 00:00:00' --owner=0 --group=0 -czf "$WORK_DIR/$COMBINED_TARBALL" "$TARBALL_BASE"
|
||||
cd "$REPO_ROOT"
|
||||
if [[ "$(pwd)" != "$REPO_ROOT" ]]; then
|
||||
echo "ERROR: Failed to return to REPO_ROOT after tarball creation"
|
||||
exit 1
|
||||
fi
|
||||
|
||||
if [[ "$PACKAGE" == "dms" ]]; then
|
||||
TARBALL_DIR=$(tar -tzf "$WORK_DIR/$COMBINED_TARBALL" 2>/dev/null | head -1 | cut -d'/' -f1)
|
||||
@@ -740,10 +723,6 @@ if [[ "$UPLOAD_DEBIAN" == true ]] && [[ -d "distro/debian/$PACKAGE/debian" ]]; t
|
||||
rm -f "$WORK_DIR/$COMBINED_TARBALL"
|
||||
tar --sort=name --mtime='2000-01-01 00:00:00' --owner=0 --group=0 -czf "$WORK_DIR/$COMBINED_TARBALL" "$TARBALL_BASE"
|
||||
cd "$REPO_ROOT"
|
||||
if [[ "$(pwd)" != "$REPO_ROOT" ]]; then
|
||||
echo "ERROR: Failed to return to REPO_ROOT after tarball recreation"
|
||||
exit 1
|
||||
fi
|
||||
fi
|
||||
fi
|
||||
|
||||
@@ -817,29 +796,23 @@ EOF
|
||||
fi
|
||||
fi
|
||||
|
||||
echo "==> Ensuring we're in the OSC working directory"
|
||||
cd "$WORK_DIR" || {
|
||||
echo "ERROR: Cannot cd to WORK_DIR: $WORK_DIR"
|
||||
echo "DEBUG: Current directory: $(pwd)"
|
||||
echo "DEBUG: WORK_DIR exists: $(test -d "$WORK_DIR" && echo "yes" || echo "no")"
|
||||
exit 1
|
||||
}
|
||||
echo "DEBUG: Successfully entered WORK_DIR: $(pwd)"
|
||||
cd "$WORK_DIR"
|
||||
|
||||
# Server-side cleanup via API
|
||||
echo "==> Cleaning old tarballs from OBS server (prevents downloading 100+ old versions)"
|
||||
OBS_FILES=$(osc api "/source/$OBS_PROJECT/$PACKAGE" 2>/dev/null || echo "")
|
||||
if [[ -n "$OBS_FILES" ]]; then
|
||||
DELETED_COUNT=0
|
||||
KEEP_CURRENT=""
|
||||
KEEP_PATTERN=""
|
||||
if [[ -n "$CHANGELOG_VERSION" ]]; then
|
||||
KEEP_CURRENT="${PACKAGE}_${CHANGELOG_VERSION}.tar.gz"
|
||||
echo " Keeping only current version: ${KEEP_CURRENT}"
|
||||
BASE_KEEP_VERSION=$(echo "$CHANGELOG_VERSION" | sed 's/ppa[0-9]*$//')
|
||||
KEEP_PATTERN="${PACKAGE}_${BASE_KEEP_VERSION}"
|
||||
echo " Keeping tarballs matching: ${KEEP_PATTERN}*"
|
||||
fi
|
||||
|
||||
for old_file in $(echo "$OBS_FILES" | grep -oP '(?<=name=")[^"]*\.(tar\.gz|tar\.xz|tar\.bz2)(?=")' || true); do
|
||||
if [[ "$old_file" == "$KEEP_CURRENT" ]]; then
|
||||
echo " - Keeping: $old_file"
|
||||
if [[ -n "$KEEP_PATTERN" ]] && [[ "$old_file" == ${KEEP_PATTERN}* ]]; then
|
||||
echo " - Keeping current version: $old_file"
|
||||
continue
|
||||
fi
|
||||
|
||||
@@ -862,11 +835,14 @@ else
|
||||
echo " ⚠️ Could not fetch file list from server, skipping cleanup"
|
||||
fi
|
||||
|
||||
# Update working copy to latest revision (without expanding service files to avoid revision conflicts)
|
||||
# Fallback update with --server-side-source-service-files flag only syncs metadata (spec, dsc, _service)
|
||||
echo "==> Updating working copy"
|
||||
if ! osc up 2>/dev/null; then
|
||||
echo "Error: Failed to update working copy"
|
||||
exit 1
|
||||
if ! osc up --server-side-source-service-files 2>/dev/null; then
|
||||
echo " Note: Using regular update (--server-side-source-service-files not supported)"
|
||||
if ! osc up; then
|
||||
echo "Error: Failed to update working copy"
|
||||
exit 1
|
||||
fi
|
||||
fi
|
||||
|
||||
# Ensure we're in WORK_DIR and it exists
|
||||
@@ -906,15 +882,6 @@ elif [[ "$UPLOAD_OPENSUSE" == true ]]; then
|
||||
fi
|
||||
echo ""
|
||||
|
||||
if [[ "$(pwd)" != "$WORK_DIR" ]]; then
|
||||
echo "ERROR: Lost directory context. Expected: $WORK_DIR, Got: $(pwd)"
|
||||
cd "$WORK_DIR" || {
|
||||
echo "FATAL: Cannot recover - unable to cd to WORK_DIR"
|
||||
exit 1
|
||||
}
|
||||
echo "WARNING: Recovered directory context"
|
||||
fi
|
||||
|
||||
osc addremove 2>&1 | grep -v "Git SCM package" || true
|
||||
|
||||
SOURCE_TARBALL="${PACKAGE}-source.tar.gz"
|
||||
@@ -941,7 +908,7 @@ if ! osc status 2>/dev/null | grep -qE '^[MAD]|^[?]'; then
|
||||
else
|
||||
echo "==> Committing to OBS"
|
||||
set +e
|
||||
osc commit --skip-local-service-run -m "$MESSAGE" 2>&1 | grep -v "Git SCM package" | grep -v "apiurl\|project\|_ObsPrj\|_manifest\|git-obs"
|
||||
osc commit -m "$MESSAGE" 2>&1 | grep -v "Git SCM package" | grep -v "apiurl\|project\|_ObsPrj\|_manifest\|git-obs"
|
||||
COMMIT_EXIT=${PIPESTATUS[0]}
|
||||
set -e
|
||||
if [[ $COMMIT_EXIT -ne 0 ]]; then
|
||||
|
||||
@@ -191,8 +191,19 @@ fi
|
||||
cd "$WORK_PACKAGE_DIR"
|
||||
get_latest_tag() {
|
||||
local repo="$1"
|
||||
# Get the latest tag, sorted by version
|
||||
git ls-remote --tags --refs --sort='-v:refname' "https://github.com/$repo.git" | head -n1 | awk -F/ '{print $NF}' | sed 's/^v//'
|
||||
if command -v curl &>/dev/null; then
|
||||
LATEST_TAG=$(curl -s "https://api.github.com/repos/$repo/releases/latest" 2>/dev/null | grep '"tag_name":' | sed 's/.*"tag_name": "\(.*\)".*/\1/' | head -1)
|
||||
if [ -n "$LATEST_TAG" ]; then
|
||||
echo "$LATEST_TAG" | sed 's/^v//'
|
||||
return
|
||||
fi
|
||||
fi
|
||||
TEMP_REPO=$(mktemp -d "$TEMP_BASE/ppa_tag_XXXXXX")
|
||||
if git clone --depth=1 --quiet "https://github.com/$repo.git" "$TEMP_REPO" 2>/dev/null; then
|
||||
LATEST_TAG=$(cd "$TEMP_REPO" && git describe --tags --abbrev=0 2>/dev/null | sed 's/^v//' || echo "")
|
||||
rm -rf "$TEMP_REPO"
|
||||
echo "$LATEST_TAG"
|
||||
fi
|
||||
}
|
||||
|
||||
IS_GIT_PACKAGE=false
|
||||
@@ -323,17 +334,6 @@ EOF
|
||||
fi
|
||||
fi
|
||||
|
||||
if [ ! -f "dms-distropkg-arm64.gz" ]; then
|
||||
info "Downloading dms binary for arm64..."
|
||||
# Try to download arm64 binary, but don't fail if it doesn't exist (yet)
|
||||
if wget -O dms-distropkg-arm64.gz "https://github.com/AvengeMedia/DankMaterialShell/releases/download/v${VERSION}/dms-distropkg-arm64.gz"; then
|
||||
success "arm64 binary downloaded"
|
||||
else
|
||||
warn "Failed to download dms-distropkg-arm64.gz (skipping)"
|
||||
rm -f dms-distropkg-arm64.gz
|
||||
fi
|
||||
fi
|
||||
|
||||
if [ ! -f "dms-source.tar.gz" ]; then
|
||||
info "Downloading dms source for QML files..."
|
||||
if wget -O dms-source.tar.gz "https://github.com/AvengeMedia/DankMaterialShell/archive/refs/tags/v${VERSION}.tar.gz"; then
|
||||
|
||||
@@ -341,10 +341,6 @@ if [ "$KEEP_BUILDS" = "false" ]; then
|
||||
rm -f "$PACKAGE_DIR/dms-distropkg-amd64.gz"
|
||||
REMOVED=$((REMOVED + 1))
|
||||
fi
|
||||
if [ -f "$PACKAGE_DIR/dms-distropkg-arm64.gz" ]; then
|
||||
rm -f "$PACKAGE_DIR/dms-distropkg-arm64.gz"
|
||||
REMOVED=$((REMOVED + 1))
|
||||
fi
|
||||
if [ -f "$PACKAGE_DIR/dms-source.tar.gz" ]; then
|
||||
rm -f "$PACKAGE_DIR/dms-source.tar.gz"
|
||||
REMOVED=$((REMOVED + 1))
|
||||
|
||||
5
distro/ubuntu/danklinux/danksearch/debian/changelog
Normal file
@@ -0,0 +1,5 @@
danksearch (0.0.7ppa3) questing; urgency=medium

  * Rebuild for packaging fixes (ppa3)

 -- Avenge Media <AvengeMedia.US@gmail.com> Fri, 21 Nov 2025 14:19:58 -0500
24
distro/ubuntu/danklinux/danksearch/debian/control
Normal file
@@ -0,0 +1,24 @@
Source: danksearch
Section: utils
Priority: optional
Maintainer: Avenge Media <AvengeMedia.US@gmail.com>
Build-Depends: debhelper-compat (= 13)
Standards-Version: 4.6.2
Homepage: https://github.com/AvengeMedia/danksearch
Vcs-Browser: https://github.com/AvengeMedia/danksearch
Vcs-Git: https://github.com/AvengeMedia/danksearch.git

Package: danksearch
Architecture: amd64 arm64
Depends: ${misc:Depends}
Description: Fast file search utility for DMS
 DankSearch is a fast file search utility designed for DankMaterialShell.
 It provides efficient file and content search capabilities with minimal
 dependencies. This package contains the pre-built binary from the official
 GitHub release.
 .
 Features include:
 - Fast file searching
 - Lightweight and efficient
 - Designed for DMS integration
 - Minimal resource usage
24
distro/ubuntu/danklinux/danksearch/debian/copyright
Normal file
@@ -0,0 +1,24 @@
|
||||
Format: https://www.debian.org/doc/packaging-manuals/copyright-format/1.0/
|
||||
Upstream-Name: danksearch
|
||||
Upstream-Contact: Avenge Media LLC <AvengeMedia.US@gmail.com>
|
||||
Source: https://github.com/AvengeMedia/danksearch
|
||||
|
||||
Files: *
|
||||
Copyright: 2025 Avenge Media LLC
|
||||
License: GPL-3.0-only
|
||||
|
||||
License: GPL-3.0-only
|
||||
This package is free software; you can redistribute it and/or modify
|
||||
it under the terms of the GNU General Public License version 3 as
|
||||
published by the Free Software Foundation.
|
||||
.
|
||||
This package is distributed in the hope that it will be useful,
|
||||
but WITHOUT ANY WARRANTY; without even the implied warranty of
|
||||
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
|
||||
GNU General Public License for more details.
|
||||
.
|
||||
You should have received a copy of the GNU General Public License
|
||||
along with this program. If not, see <https://www.gnu.org/licenses/>
|
||||
.
|
||||
On Debian systems, the complete text of the GNU General
|
||||
Public License version 3 can be found in "/usr/share/common-licenses/GPL-3".
|
||||
33
distro/ubuntu/danklinux/danksearch/debian/rules
Executable file
@@ -0,0 +1,33 @@
#!/usr/bin/make -f

export DH_VERBOSE = 1

# Detect architecture for selecting correct binary
DEB_HOST_ARCH := $(shell dpkg-architecture -qDEB_HOST_ARCH)

# Map Debian arch to binary filename
ifeq ($(DEB_HOST_ARCH),amd64)
  BINARY_FILE := dsearch-amd64
else ifeq ($(DEB_HOST_ARCH),arm64)
  BINARY_FILE := dsearch-arm64
else
  $(error Unsupported architecture: $(DEB_HOST_ARCH))
endif

%:
	dh $@

override_dh_auto_build:
	# Binary is already included in source package (native format)
	# Downloaded by build-source.sh before upload
	# Just verify it exists and is executable
	test -f $(BINARY_FILE) || (echo "ERROR: $(BINARY_FILE) not found!" && exit 1)
	chmod +x $(BINARY_FILE)

override_dh_auto_install:
	# Install binary as danksearch
	install -Dm755 $(BINARY_FILE) debian/danksearch/usr/bin/danksearch

override_dh_auto_clean:
	# Don't delete binaries - they're part of the source package (native format)
	dh_auto_clean
1
distro/ubuntu/danklinux/danksearch/debian/source/format
Normal file
@@ -0,0 +1 @@
3.0 (native)
BIN
distro/ubuntu/danklinux/danksearch/dsearch-amd64
Executable file
Binary file not shown.
BIN
distro/ubuntu/danklinux/danksearch/dsearch-arm64
Executable file
Binary file not shown.
9
distro/ubuntu/danklinux/dgop/debian/changelog
Normal file
@@ -0,0 +1,9 @@
dgop (0.1.11ppa2) questing; urgency=medium

  * Rebuild for Questing (25.10) - Ubuntu 25.10+ only
  * Stateless CPU/GPU monitoring tool
  * Support for NVIDIA and AMD GPUs
  * JSON output for integration
  * Pre-built binary package for amd64 and arm64

 -- Avenge Media <AvengeMedia.US@gmail.com> Sun, 16 Nov 2025 22:50:00 -0500
27
distro/ubuntu/danklinux/dgop/debian/control
Normal file
@@ -0,0 +1,27 @@
Source: dgop
Section: utils
Priority: optional
Maintainer: Avenge Media <AvengeMedia.US@gmail.com>
Build-Depends: debhelper-compat (= 13),
               wget,
               gzip
Standards-Version: 4.6.2
Homepage: https://github.com/AvengeMedia/dgop
Vcs-Browser: https://github.com/AvengeMedia/dgop
Vcs-Git: https://github.com/AvengeMedia/dgop.git

Package: dgop
Architecture: amd64 arm64
Depends: ${misc:Depends}
Description: Stateless CPU/GPU monitor for DankMaterialShell
 DGOP is a stateless system monitoring tool that provides CPU, GPU,
 memory, and network statistics. Designed for integration with
 DankMaterialShell but can be used standalone.
 .
 Features:
 - CPU usage monitoring
 - GPU usage and temperature (NVIDIA, AMD)
 - Memory and swap statistics
 - Network traffic monitoring
 - Zero-state design (no background processes)
 - JSON output for easy integration
Some files were not shown because too many files have changed in this diff.