Compare commits

...

124 commits
v3.19...edge

Author SHA1 Message Date
396c98208d
backports/caprine: new aport 2024-10-29 09:01:48 -04:00
b345573aa1
forgejo: chose highest version when dealing with multiple downstream_versions 2024-10-29 08:56:16 -04:00
257e019992
forgejo: fix typo in check-community 2024-10-29 07:07:21 -04:00
97ed4992d7
forgejo: Fix is_it_old logics 2024-10-29 07:06:50 -04:00
7814f05e1c
Check every day at 5 am instead of hourly 2024-10-28 08:21:30 -04:00
9bf9771b8c
forgejo: update is_it_old to use new title format 2024-10-27 17:01:04 -04:00
6e698a0974
forgejo: add update check workflows 2024-10-27 15:02:24 -04:00
367a606da2
user/signal-desktop: upgrade to 7.29.0
Some checks failed
/ build-x86_64 (pull_request) Has been cancelled
/ deploy-x86_64 (pull_request) Has been cancelled
/ deploy-aarch64 (pull_request) Has been cancelled
/ build-aarch64 (pull_request) Has been cancelled
/ lint (pull_request) Successful in 31s
2024-10-16 20:38:26 -04:00
26657f4d57
backports/py3-sip: drop due to in community
All checks were successful
/ lint (pull_request) Successful in 28s
/ deploy-x86_64 (pull_request) Successful in 26s
/ build-x86_64 (pull_request) Successful in 42s
/ build-aarch64 (pull_request) Successful in 1m11s
/ deploy-aarch64 (pull_request) Successful in 54s
2024-10-16 20:28:51 -04:00
9e2e00cd44
backports/py3-qt6: drop due to in community 2024-10-16 20:28:26 -04:00
5771d09151
backports/py3-pyqt6-sip: drop due to in community 2024-10-16 20:28:02 -04:00
0414f86242
backports/py3-django-debug-toolbar: drop due to in community 2024-10-16 20:26:36 -04:00
fb7a3fe81b
user/caprine: drop due to move to aports 2024-10-16 20:07:46 -04:00
3ffe64d0d4
user/forgejo-aneksajo: drop due to move to iports 2024-10-16 20:07:07 -04:00
15d01121ba
user/zotero: drop due to in aports
All checks were successful
/ deploy-aarch64 (pull_request) Successful in 55s
/ build-aarch64 (pull_request) Successful in 1m14s
/ lint (pull_request) Successful in 28s
/ deploy-x86_64 (pull_request) Successful in 29s
/ build-x86_64 (pull_request) Successful in 39s
2024-10-16 20:04:33 -04:00
d24323205e backports/signal-desktop: upgrade to 7.28.0 2024-10-16 17:48:59 +00:00
eef1e89d88
backports/signal-desktop: upgrade to 7.28.0
All checks were successful
/ lint (pull_request) Successful in 28s
/ deploy-aarch64 (pull_request) Successful in 1m1s
/ build-aarch64 (pull_request) Successful in 1h4m59s
/ build-x86_64 (pull_request) Successful in 26m33s
/ deploy-x86_64 (pull_request) Successful in 40s
2024-10-09 18:23:19 -04:00
623d98575e user/zotero: upgrade to 7.0.7 2024-10-07 16:24:49 +00:00
b306518289
testing/signal-desktop: upgrade to 7.27.0
All checks were successful
/ lint (pull_request) Successful in 28s
/ deploy-x86_64 (pull_request) Successful in 38s
/ build-x86_64 (pull_request) Successful in 35m39s
/ build-aarch64 (pull_request) Successful in 1h19m18s
/ deploy-aarch64 (pull_request) Successful in 1m0s
2024-10-07 09:37:14 -04:00
63f6a6099f
backports/signal-desktop: import upstream changes 2024-09-27 17:32:59 -04:00
7e21600868
backports/signal-desktop: upgrade to 7.26.0
All checks were successful
/ lint (pull_request) Successful in 27s
/ deploy-x86_64 (pull_request) Successful in 38s
/ build-x86_64 (pull_request) Successful in 32m29s
/ build-aarch64 (pull_request) Successful in 1h8m29s
/ deploy-aarch64 (pull_request) Successful in 59s
2024-09-27 10:13:09 -04:00
714437157c
user/zotero: upgrade to 7.0.6
All checks were successful
/ lint (pull_request) Successful in 37s
/ deploy-x86_64 (pull_request) Successful in 1m2s
/ build-x86_64 (pull_request) Successful in 1h34m59s
/ deploy-aarch64 (pull_request) Successful in 1m7s
/ build-aarch64 (pull_request) Successful in 2h14m29s
2024-09-26 09:16:53 -04:00
f82ac83d0b
backports/signal-desktop: upgrade to 7.25.0
All checks were successful
/ lint (pull_request) Successful in 41s
/ build-x86_64 (pull_request) Successful in 24m19s
/ deploy-x86_64 (pull_request) Successful in 26s
/ deploy-aarch64 (pull_request) Successful in 57s
/ build-aarch64 (pull_request) Successful in 59m21s
2024-09-18 22:59:46 -04:00
b9b609bedf
backports/signal-desktop: upgrade to 7.24.1
All checks were successful
/ lint (pull_request) Successful in 26s
/ deploy-x86_64 (pull_request) Successful in 25s
/ build-x86_64 (pull_request) Successful in 22m30s
/ deploy-aarch64 (pull_request) Successful in 57s
/ build-aarch64 (pull_request) Successful in 57m41s
2024-09-13 11:20:57 -04:00
8ffac41cb8
backports/signal-desktop: upgrade to 7.24.0
All checks were successful
/ lint (pull_request) Successful in 27s
/ deploy-x86_64 (pull_request) Successful in 24s
/ build-x86_64 (pull_request) Successful in 23m6s
/ build-aarch64 (pull_request) Successful in 58m43s
/ deploy-aarch64 (pull_request) Successful in 56s
2024-09-12 13:42:16 -04:00
7ffb4b3105
backports/signal-desktop: upgrade to 7.23.0
All checks were successful
/ lint (pull_request) Successful in 26s
/ deploy-x86_64 (pull_request) Successful in 27s
/ build-x86_64 (pull_request) Successful in 15m33s
/ build-aarch64 (pull_request) Successful in 1h1m41s
/ deploy-aarch64 (pull_request) Successful in 57s
2024-09-09 13:46:08 -04:00
743ceb8dbe
backports/signal-desktop: upgrade to 7.22.2
All checks were successful
/ lint (pull_request) Successful in 38s
/ build-x86_64 (pull_request) Successful in 21m27s
/ deploy-x86_64 (pull_request) Successful in 44s
/ deploy-aarch64 (pull_request) Successful in 1m3s
/ build-aarch64 (pull_request) Successful in 1h3m14s
2024-09-05 16:05:34 -04:00
021b81131e
user/mathjax2: bump pkgrel
All checks were successful
/ lint (pull_request) Successful in 27s
/ deploy-aarch64 (pull_request) Successful in 53s
/ build-aarch64 (pull_request) Successful in 2m8s
/ deploy-x86_64 (pull_request) Successful in 26s
/ build-x86_64 (pull_request) Successful in 19m19s
2024-09-03 12:22:25 -04:00
d00a14e695
forgejo: always create artifacts for build stage
Some checks failed
/ lint (pull_request) Successful in 26s
/ deploy-aarch64 (pull_request) Successful in 51s
/ build-aarch64 (pull_request) Successful in 1m10s
/ deploy-x86_64 (pull_request) Has been skipped
/ build-x86_64 (pull_request) Failing after 2m15s
2024-09-03 12:17:26 -04:00
a6e60edfd9
user/rstudio-desktop: enable build 2024-09-03 12:15:02 -04:00
68130cdf8a user/zotero: upgrade to 7.0.3 2024-08-28 13:42:02 +00:00
888654be5c
user/rmfakecloud: upgrade to 0.0.19
All checks were successful
/ lint (pull_request) Successful in 32s
/ build-x86_64 (pull_request) Successful in 2m31s
/ deploy-x86_64 (pull_request) Successful in 31s
/ deploy-aarch64 (pull_request) Successful in 54s
/ build-aarch64 (pull_request) Successful in 9m20s
2024-08-26 11:04:13 -04:00
d6e00b6395
forgejo-ci: build.sh is now local rather than patched 2024-08-26 11:02:37 -04:00
77dc41c8aa
forgejo-ci: fix double v in repo 2024-08-22 21:42:50 -04:00
1478a9f5c7
forgejo-ci: use new forge repo
Some checks failed
/ deploy-aarch64 (pull_request) Has been cancelled
/ build-aarch64 (pull_request) Has been cancelled
/ build-x86_64 (pull_request) Has been cancelled
/ deploy-x86_64 (pull_request) Has been cancelled
/ lint (pull_request) Successful in 26s
2024-08-21 10:54:38 -04:00
fc3cfbc01c
user/forgejo-aneksajo: upgrade to 8.0.1 2024-08-21 10:19:57 -04:00
659bd20ba1
README: update name 2024-08-12 12:56:21 -04:00
970fd7297f
README: update upstream 2024-08-12 12:55:21 -04:00
135bcd5a89
README: update to use forge repo 2024-08-12 12:53:29 -04:00
e7bef354af
forgejo: initial implementation
Some checks failed
/ lint (pull_request) Successful in 29s
/ deploy-aarch64 (pull_request) Failing after 3m48s
/ build-aarch64 (pull_request) Successful in 58s
/ deploy-x86_64 (pull_request) Failing after 1m40s
/ build-x86_64 (pull_request) Successful in 28s
2024-08-12 12:39:43 -04:00
260b8c3da6
gitlab-ci: drop in favor of forgejo actions 2024-08-12 12:39:38 -04:00
9a81361936
README.md: new repo location 2024-08-10 16:34:15 -04:00
f124e1fd95 gitab-ci: use git-annex instead of git-lfs 2024-08-10 15:33:40 +00:00
3a9141372b user/py3-validators: bump 2024-08-10 15:33:40 +00:00
e0a5952518
Update README 2024-08-10 11:31:00 -04:00
14348459f1 README: update for codeberg migration 2024-08-10 02:44:17 +00:00
92b265a1d0 user/py3-django-rest-framework: drop due to migration to ilot iports 2024-08-10 02:44:17 +00:00
07ce4b2776 user/ruby3.2-take: drop due migration to ilot iports 2024-08-10 02:44:17 +00:00
9b5788e012 user/gitaly: drop due migration to ilot iports 2024-08-10 02:44:17 +00:00
580e136768 user/gitlab-foss: drop due migration to ilot iports 2024-08-10 02:44:17 +00:00
32a461c894 user/gitlab-pages: drop due migration to ilot iports 2024-08-10 02:44:17 +00:00
9320defbce user/gitlab-shell: drop due migration to ilot iports 2024-08-10 02:44:17 +00:00
9c03466cc0 user/mastodon: drop due migration to ilot iports 2024-08-10 02:44:17 +00:00
177efa00c2 user/ruby3.2: drop due migration to ilot iports 2024-08-10 02:44:17 +00:00
438d6d6e3e user/ruby3.2-bundler: drop due migration to ilot iports 2024-08-10 02:44:17 +00:00
638732a089 user/ruby3.2-minitest: drop due migration to ilot iports 2024-08-10 02:44:17 +00:00
d09e518d1f user/ruby3.2-power_assert: drop due migration to ilot iports 2024-08-10 02:44:17 +00:00
ece30e0fbb user/ruby3.2-test-unit: drop due migration to ilot iports 2024-08-10 02:44:17 +00:00
8a2a28342a user/ruby3.2-webrick: drop due migration to ilot iports 2024-08-10 02:44:17 +00:00
9d6bf2f5a2 user/authentik: drop due migration to ilot iports 2024-08-10 02:44:17 +00:00
10821c427a user/freescout: drop due migration to ilot iports 2024-08-10 02:44:17 +00:00
bc63f1ddb8 user/listmonk: drop due migration to ilot iports 2024-08-10 02:44:17 +00:00
187eb88770 user/loomio: drop due migration to ilot iports 2024-08-10 02:44:17 +00:00
60b6bb1f9a user/peertube: drop due migration to ilot iports 2024-08-10 02:44:17 +00:00
5edd40d7f0 user/php82-pecl-inotify: drop due migration to ilot iports 2024-08-10 02:44:17 +00:00
7870ee72dc user/php83-pecl-inotify: drop due migration to ilot iports 2024-08-10 02:44:17 +00:00
2f4998dfb6 user/py3-django-tenants: drop due migration to ilot iports 2024-08-10 02:44:17 +00:00
783a964410 user/py3-scim2-filter-parser: drop due migration to ilot iports 2024-08-10 02:44:17 +00:00
ff94611df0 user/py3-tenant-schemas-celery: drop due migration to ilot iports 2024-08-10 02:44:17 +00:00
bdc0c313c6 user/uptime-kuma: drop due migration to ilot iports 2024-08-10 02:44:17 +00:00
21082688af user/wikijs: drop due migration to ilot iports 2024-08-10 02:44:17 +00:00
a0993c9e31
user/zotero: upgrade to 7.0.0 2024-08-09 10:41:28 -04:00
686e6a6504 backports/py3-html5-parser: new aport 2024-08-08 16:14:05 +00:00
497771fe9e backports/py3-apsw: new aport 2024-08-08 16:14:05 +00:00
0807d658c5 backports/calibre: new aport 2024-08-08 16:14:05 +00:00
7320512e66 backports/freetube: upgrade to 0.21.3 2024-08-07 21:32:22 +00:00
475c43723c user/forgejo-aneksajo: new aport 2024-08-07 19:46:56 +00:00
ad7d9444cc
backports/looking-glass: upgrade to 7b_git20240607 2024-08-07 15:34:26 -04:00
87e02ab716 user/zotero: upgrade to 7.0.0_beta109 2024-08-01 21:25:12 +00:00
7671a9d567
backports/looking-glass: new aport 2024-07-30 23:39:19 -04:00
65f2a53a44 user/gitlab-foss: upgrade to 17.0.4, fix initd 2024-07-27 05:02:39 +00:00
4d473acf9e user/gitlab-pages: upgrade to 17.0.4 2024-07-27 05:02:39 +00:00
59705e3486 user/gitaly: upgrade to 17.0.4 2024-07-27 05:02:39 +00:00
72ad06acd7
backports/py3-pyqt6-sip: upgrade to 13.8.0 2024-07-27 00:51:52 -04:00
e5f83095e3 backports/py3-{pyqt6-sip, qt6, sip}: new aport 2024-07-27 04:49:23 +00:00
eb0374cfba
user/mastodon: fix initd scripts 2024-07-11 07:08:45 -04:00
07ea5d4b03
user/mastodon: upgrade to 4.2.10 2024-07-04 22:53:50 -04:00
091908e87e user/gitlab-shell: upgrade to 14.36.0 2024-07-05 02:17:29 +00:00
0b3331610d user/gitaly: upgrade to 17.0.3 2024-07-05 02:17:29 +00:00
948e244824 user/gitlab-pages: upgrade to 17.0.3 2024-07-05 02:17:29 +00:00
3124031a00 user/gitlab-foss: upgrade to 17.0.3 2024-07-05 02:17:29 +00:00
64d88aa2e1 user/authentik: add custom css to config dir 2024-07-04 03:15:58 +00:00
a5cd9ca969 user/authentik: upgrade to 2024.4.3 2024-07-04 03:15:58 +00:00
ddb15faac6
user/i18nspector: new aport 2024-07-03 16:59:16 -04:00
bdb37e35ee user/wikijs: new aport 2024-06-17 01:20:53 +00:00
1c7eeb3dce
user/ruby3.2-rugged: drop due to not needed 2024-06-14 08:32:52 -04:00
08ee79d032
user/grpc: drop due to not needed 2024-06-14 08:32:51 -04:00
81ac0611da
user/gitlab-pages: upgrade to 17.0.2 2024-06-14 08:32:49 -04:00
f57dc997f7
user/gitaly: upgrade to 17.0.2 2024-06-14 08:32:48 -04:00
7440e781b8
user/gitlab-foss: upgrade to 17.0.2 2024-06-14 08:32:44 -04:00
65a5f41649 user/gitlab-foss: upgrade to 16.11.4 2024-06-13 22:07:41 +00:00
507663db3b user/gitlab-pages: upgrade to 16.11.4 2024-06-13 22:07:41 +00:00
f5bbad0712 user/gitaly: upgrade to 16.11.4 2024-06-13 22:07:41 +00:00
ecc61a2182 user/php82-pecl-inotify: new aport 2024-06-13 20:33:20 +00:00
1f7767fc5b
user/php83-pecl-inotify: new aport 2024-06-13 13:48:22 -04:00
e5237392b1
user/authentik: add missing depends 2024-06-12 23:53:12 -04:00
43905a4a72
user/py3-django-rest-framework: fix to version 3.14.0 2024-06-12 23:53:03 -04:00
2a72f32a3a
user/py3-scim2-filter-parser: new aport 2024-06-12 23:52:52 -04:00
b29ff4dcae
user/py3-tenant-schemas-celery: new aports 2024-06-12 23:52:46 -04:00
d87a333de8
user/py3-django-tenants: new aports 2024-06-12 23:52:40 -04:00
849cc1d7b3 user/mastodon: upgrade to 4.2.9 2024-06-13 00:33:01 +00:00
ae8c40104c
user/gitlab-pages: upgrade to 16.9.8 2024-06-10 09:43:06 -04:00
29d690fe52
user/gitaly: upgrade to 16.9.8 2024-06-10 09:42:59 -04:00
ab3231afda
user/gitlab-shell: upgrade to 14.35.0 2024-06-10 09:42:53 -04:00
8ac74918a3
user/gitlab-foss: upgrade to 16.9.8 2024-06-10 09:42:46 -04:00
d3888ec0a3
user/zotero: add missing git info 2024-06-04 13:31:31 -04:00
c9d0bd0b1e
backports/signal-desktop: upgrade to 7.11.0 2024-06-04 12:15:55 -04:00
e9d494a147 user/peertube: enable build 2024-06-04 13:32:30 +00:00
fddd91b20c unmaintained/firefly-iii-plaid-connector: move from user 2024-06-04 13:22:14 +00:00
104708e9a6 user/firefly-iii: use php83 2024-06-04 13:22:14 +00:00
34e0757763 user/mastodon: enable build 2024-06-04 02:48:20 +00:00
068b275bf0 user/gitlab-foss: enable build 2024-06-04 02:48:11 +00:00
0c00fe0b12 user/zotero: upgrade to 7.0.0_beta83 2024-06-04 02:47:03 +00:00
4c1e2a5ad1 user/zotero: enable build 2024-06-04 02:47:03 +00:00
194 changed files with 1200 additions and 8788 deletions


@@ -1,27 +1,26 @@
#!/bin/sh
# shellcheck disable=SC3043
. $CI_PROJECT_DIR/.gitlab/bin/functions.sh
. /usr/local/lib/functions.sh
# shellcheck disable=SC3040
set -eu -o pipefail
readonly APORTSDIR=$CI_PROJECT_DIR
readonly REPOS="cross backports user testing community"
readonly ALPINE_REPOS="main community"
readonly REPOS="backports user"
readonly ALPINE_REPOS="main community testing"
readonly ARCH=$(apk --print-arch)
# gitlab variables
readonly BASEBRANCH=$CI_MERGE_REQUEST_TARGET_BRANCH_NAME
: "${REPODEST:=$HOME/packages}"
: "${MIRROR:=https://lab.ilot.io/ayakael/repo-apk/-/raw}"
: "${MIRROR:=https://ayakael.net/api/packages/forge/alpine}"
: "${ALPINE_MIRROR:=http://dl-cdn.alpinelinux.org/alpine}"
: "${MAX_ARTIFACT_SIZE:=300000000}" #300M
: "${CI_DEBUG_BUILD:=}"
: "${CI_ALPINE_BUILD_OFFSET:=0}"
: "${CI_ALPINE_BUILD_LIMIT:=9999}"
: "${CI_ALPINE_TARGET_ARCH:=$(uname -m)}"
msg() {
local color=${2:-green}
@@ -71,7 +70,7 @@ report() {
get_release() {
case $BASEBRANCH in
v*) echo "${BASEBRANCH%-*}";;
v*) echo "$BASEBRANCH";;
edge) echo edge;;
*) die "Branch \"$BASEBRANCH\" not supported!"
esac
@@ -80,9 +79,8 @@ get_release() {
build_aport() {
local repo="$1" aport="$2"
cd "$APORTSDIR/$repo/$aport"
export CHOST=$CI_ALPINE_TARGET_ARCH
if abuild -r 2>&1 | report "build-$aport"; then
checkapk | report "checkapk-$aport" || true
checkapk 2>&1 | report "checkapk-$aport" || true
aport_ok="$aport_ok $repo/$aport"
else
aport_ng="$aport_ng $repo/$aport"
@@ -92,12 +90,6 @@ build_aport() {
check_aport() {
local repo="$1" aport="$2"
cd "$APORTSDIR/$repo/$aport"
export CHOST=$CI_ALPINE_TARGET_ARCH
# TODO: this enables crossbuild only on user, this should be cleaner
if [ "$repo" != "user" ] && [ "$repo" != "backports" ] && [ "$CI_ALPINE_TARGET_ARCH" != "$ARCH" ]; then
aport_na="$aport_na $repo/$aport"
return 1
fi
if ! abuild check_arch 2>/dev/null; then
aport_na="$aport_na $repo/$aport"
return 1
@@ -110,16 +102,13 @@ set_repositories_for() {
release=$(get_release)
for repo in $REPOS; do
[ "$repo" = "non-free" ] && continue
[ "$release" == "edge" ] && [ "$repo" == "backports" ] && continue
repos="$repos $MIRROR/$release/$repo $REPODEST/$repo"
[ "$repo" = "$target_repo" ] && break
done
sudo sh -c "printf '%s\n' $repos >> /etc/apk/repositories"
sudo apk update || true
if [ "$CI_ALPINE_TARGET_ARCH" != "$ARCH" ]; then
sudo sh -c "printf '%s\n' $repos >> $HOME/sysroot-$CI_ALPINE_TARGET_ARCH/etc/apk/repositories"
sudo cp -R /etc/apk/keys/* $HOME/sysroot-$CI_ALPINE_TARGET_ARCH/etc/apk/keys/.
sudo apk --root=$HOME/sysroot-$CI_ALPINE_TARGET_ARCH update || true
fi
doas sh -c "printf '%s\n' $repos >> /etc/apk/repositories"
doas apk update
}
apply_offset_limit() {
@@ -139,22 +128,10 @@ setup_system() {
[ "$release" != "edge" ] && [ "$repo" == "testing" ] && continue
repos="$repos $ALPINE_MIRROR/$release/$repo"
done
repos="$repos $MIRROR/$release/cross"
sudo sh -c "printf '%s\n' $repos > /etc/apk/repositories"
sudo apk -U upgrade -a || sudo apk fix || die "Failed to up/downgrade system"
if [ "$CI_ALPINE_TARGET_ARCH" != "$ARCH" ]; then
sudo apk add gcc-$CI_ALPINE_TARGET_ARCH
fi
gitlab_key_to_rsa $ABUILD_KEY rsa-private $HOME/.abuild/$ABUILD_KEY_NAME.rsa
gitlab_key_to_rsa $ABUILD_KEY_PUB rsa-public $HOME/.abuild/$ABUILD_KEY_NAME.rsa.pub
chmod 700 $HOME/.abuild/$ABUILD_KEY_NAME.rsa
echo "PACKAGER_PRIVKEY=$HOME/.abuild/$ABUILD_KEY_NAME.rsa" >> $HOME/.abuild/abuild.conf
sudo cp $HOME/.abuild/$ABUILD_KEY_NAME.rsa.pub /etc/apk/keys/$ABUILD_KEY_NAME.rsa.pub
# patch abuild for crosscompiling
sudo patch -p1 -d / -i $CI_PROJECT_DIR/.gitlab/patches/abuild-cross.patch
sudo sed -i -E 's/export JOBS=[0-9]+$/export JOBS=$(nproc)/' /etc/abuild.conf
doas sh -c "printf '%s\n' $repos > /etc/apk/repositories"
doas apk -U upgrade -a || apk fix || die "Failed to up/downgrade system"
abuild-keygen -ain
doas sed -i -E 's/export JOBS=[0-9]+$/export JOBS=$(nproc)/' /etc/abuild.conf
( . /etc/abuild.conf && echo "Building with $JOBS jobs" )
mkdir -p "$REPODEST"
git config --global init.defaultBranch master
@@ -203,7 +180,7 @@ sysinfo || true
setup_system || die "Failed to setup system"
# git no longer allows to execute in repositories owned by different users
sudo chown -R $USER: .
doas chown -R buildozer: .
fetch_flags="-qn"
debugging && fetch_flags="-v"
@@ -226,7 +203,6 @@ build_start=$CI_ALPINE_BUILD_OFFSET
build_limit=$CI_ALPINE_BUILD_LIMIT
for repo in $(changed_repos); do
mkdir -p "$APORTSDIR"/logs "$APORTSDIR"/packages "$APORTSDIR"/keys
set_repositories_for "$repo"
built_aports=0
changed_aports_in_repo=$(changed_aports "$repo")
@@ -267,7 +243,7 @@ for ok in $aport_ok; do
done
for na in $aport_na; do
msg "$na: disabled for $CI_ALPINE_TARGET_ARCH" yellow
msg "$na: disabled for $ARCH" yellow
done
for ng in $aport_ng; do
@ -281,3 +257,4 @@ if [ "$failed" = true ]; then
elif [ -z "$aport_ok" ]; then
msg "No packages found to be built." yellow
fi

.forgejo/bin/check_ver.sh (new executable file, 31 lines)

@@ -0,0 +1,31 @@
#!/bin/bash
# expects the following env variables:
# downstream: downstream repo
repo=${downstream/*\/}
curl --silent $downstream/x86_64/APKINDEX.tar.gz | tar -O -zx APKINDEX > APKINDEX
owned_by_you=$(awk -v RS= -v ORS="\n\n" '/m:Antoine Martin \(ayakael\) <dev@ayakael.net>/' APKINDEX | awk -F ':' '{if($1=="o"){print $2}}' | sort | uniq)
echo "Found $(printf '%s\n' $owned_by_you | wc -l ) packages owned by you"
rm -f out_of_date not_in_anitya
for pkg in $owned_by_you; do
upstream_version=$(curl --fail -X GET -sS -H 'Content-Type: application/json' "https://release-monitoring.org/api/v2/packages/?name=$pkg&distribution=Alpine" | jq -r '.items.[].stable_version')
downstream_version=$(sed -n "/^P:$pkg$/,/^$/p" APKINDEX | awk -F ':' '{if($1=="V"){print $2}}' | sort -V | tail -n 1)
downstream_version=${downstream_version/-*}
if [ -z "$upstream_version" ]; then
echo "$pkg not in anitya"
echo "$pkg" >> not_in_anitya
elif [ "$downstream_version" != "$(printf '%s\n' $upstream_version $downstream_version | sort -V | head -n 1)" ]; then
echo "$pkg higher downstream"
continue
elif [ "$upstream_version" != "$downstream_version" ]; then
echo "$pkg upstream version $upstream_version does not match downstream version $downstream_version"
echo "$pkg $downstream_version $upstream_version $repo" >> out_of_date
fi
done
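Each line check_ver.sh appends to out_of_date carries four space-separated fields (name, downstream version, upstream version, repo), which create_issue.sh below splits with awk. A purely illustrative sample with placeholder versions:

```
signal-desktop 7.28.0 7.29.0 community
freetube 0.21.3 0.22.0 community
```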

.forgejo/bin/create_issue.sh (new executable file, 165 lines)

@@ -0,0 +1,165 @@
#!/bin/bash
# expects:
# env variable FORGEJO_TOKEN
# file out_of_date
IFS='
'
repo=${downstream/*\/}
does_it_exist() {
name=$1
downstream_version=$2
upstream_version=$3
repo=$4
query="$repo/$name: upgrade to $upstream_version"
query="$(echo $query | sed 's| |%20|g' | sed 's|:|%3A|g' | sed 's|/|%2F|g' )"
result="$(curl --silent -X 'GET' \
"$GITHUB_SERVER_URL/api/v1/repos/$GITHUB_REPOSITORY/issues?state=open&q=$query&type=issues" \
-H 'accept: application/json' \
-H "authorization: Basic $FORGEJO_TOKEN"
)"
if [ "$result" == "[]" ]; then
return 1
fi
}
is_it_old() {
name=$1
downstream_version=$2
upstream_version=$3
repo=$4
query="$repo/$name: upgrade to"
query="$(echo $query | sed 's| |%20|g' | sed 's|:|%3A|g' | sed 's|/|%2F|g' )"
result="$(curl --silent -X 'GET' \
"$GITHUB_SERVER_URL/api/v1/repos/$GITHUB_REPOSITORY/issues?state=open&q=$query&type=issues" \
-H 'accept: application/json' \
-H "authorization: Basic $FORGEJO_TOKEN"
)"
result_title="$(echo $result | jq -r '.[].title' )"
result_id="$(echo $result | jq -r '.[].number' )"
result_upstream_version="$(echo $result_title | awk '{print $4}')"
if [ "$upstream_version" != "$result_upstream_version" ]; then
echo $result_id
else
echo 0
fi
}
update_title() {
name=$1
downstream_version=$2
upstream_version=$3
repo=$4
id=$5
result=$(curl --silent -X 'PATCH' \
"$GITHUB_SERVER_URL/api/v1/repos/$GITHUB_REPOSITORY/issues/$id" \
-H 'accept: application/json' \
-H "authorization: Basic $FORGEJO_TOKEN" \
-H 'Content-Type: application/json' \
-d "{
\"title\": \"$repo/$name: upgrade to $upstream_version\"
}"
)
return 0
}
create_issue() {
name=$1
downstream_version=$2
upstream_version=$3
repo=$4
result=$(curl --silent -X 'POST' \
"$GITHUB_SERVER_URL/api/v1/repos/$GITHUB_REPOSITORY/issues" \
-H 'accept: application/json' \
-H "authorization: Basic $FORGEJO_TOKEN" \
-H 'Content-Type: application/json' \
-d "{
\"title\": \"$repo/$name: upgrade to $upstream_version\",
\"labels\": [
$LABEL_NUMBER
]
}")
return 0
}
if [ -f out_of_date ]; then
out_of_date="$(cat out_of_date)"
echo "Detected $(wc -l out_of_date) out-of-date packages, creating issues"
for pkg in $out_of_date; do
name="$(echo $pkg | awk '{print $1}')"
downstream_version="$(echo $pkg | awk '{print $2}')"
upstream_version="$(echo $pkg | awk '{print $3}')"
repo="$(echo $pkg | awk '{print $4}')"
if does_it_exist $name $downstream_version $upstream_version $repo; then
echo "Issue for $repo/$name already exists"
continue
fi
id=$(is_it_old $name $downstream_version $upstream_version $repo)
if [ "$id" != "0" ] && [ -n "$id" ]; then
echo "Issue for $repo/$name needs updating"
update_title $name $downstream_version $upstream_version $repo $id
continue
fi
echo "Creating issue for $repo/$name"
create_issue $name $downstream_version $upstream_version $repo
done
fi
if [ -f not_in_anitya ]; then
query="Add missing $repo packages to anitya"
query="$(echo $query | sed 's| |%20|g')"
result="$(curl --silent -X 'GET' \
"$GITHUB_SERVER_URL/api/v1/repos/$GITHUB_REPOSITORY/issues?state=open&q=$query&type=issues" \
-H 'accept: application/json' \
-H "authorization: Basic $FORGEJO_TOKEN"
)"
if [ "$result" == "[]" ]; then
echo "Creating anitya issue"
result=$(curl --silent -X 'POST' \
"$GITHUB_SERVER_URL/api/v1/repos/$GITHUB_REPOSITORY/issues" \
-H 'accept: application/json' \
-H "authorization: Basic $FORGEJO_TOKEN" \
-H 'Content-Type: application/json' \
-d "{
\"title\": \"Add missing $repo packages to anitya\",
\"body\": \"- [ ] $(sed '{:q;N;s/\n/\\n- [ ] /g;t q}' not_in_anitya)\",
\"labels\": [
$LABEL_NUMBER
]
}")
else
echo "Updating anitya issue"
result_id="$(echo $result | jq -r '.[].number' )"
result=$(curl --silent -X 'PATCH' \
"$GITHUB_SERVER_URL/api/v1/repos/$GITHUB_REPOSITORY/issues/$result_id" \
-H 'accept: application/json' \
-H "authorization: Basic $FORGEJO_TOKEN" \
-H 'Content-Type: application/json' \
-d "{
\"body\": \"- [ ] $(sed '{:q;N;s/\n/\\n- [ ] /g;t q}' not_in_anitya)\"
}"
)
fi
fi

.forgejo/bin/deploy.sh (new executable file, 26 lines)

@@ -0,0 +1,26 @@
#!/bin/sh
# shellcheck disable=SC3040
set -eu -o pipefail
readonly REPOS="backports user"
readonly BASEBRANCH=$GITHUB_BASE_REF
readonly TARGET_REPO=$CI_ALPINE_REPO
apkgs=$(find package -type f -name "*.apk")
for apk in $apkgs; do
branch=$(echo $apk | awk -F '/' '{print $2}')
arch=$(echo $apk | awk -F '/' '{print $3}')
name=$(echo $apk | awk -F '/' '{print $4}')
echo "Sending $name of arch $arch to $TARGET_REPO/$BASEBRANCH/$branch"
return=$(curl -s --user $FORGE_REPO_USER:$FORGE_REPO_TOKEN --upload-file $apk $TARGET_REPO/$BASEBRANCH/$branch 2>&1)
echo $return
if [ "$return" == "package file already exists" ]; then
echo "Package already exists, refreshing..."
curl -s --user $FORGE_REPO_USER:$FORGE_REPO_TOKEN -X DELETE $TARGET_REPO/$BASEBRANCH/$branch/$arch/$name
curl -s --user $FORGE_REPO_USER:$FORGE_REPO_TOKEN --upload-file $apk $TARGET_REPO/$BASEBRANCH/$branch
fi
done
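A minimal sketch of driving deploy.sh by hand, assuming apks are laid out as package/<branch>/<arch>/<pkg>.apk the way the CI artifact provides them; the user and token values below are placeholders, not real credentials:

```sh
# hypothetical local invocation; in CI these variables come from the deploy jobs defined below
export CI_ALPINE_REPO="https://ayakael.net/api/packages/forge/alpine"
export GITHUB_BASE_REF="edge"
export FORGE_REPO_USER="uploader"    # placeholder
export FORGE_REPO_TOKEN="changeme"   # placeholder
sh .forgejo/bin/deploy.sh
```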


@@ -0,0 +1,52 @@
on:
pull_request:
types: [ assigned, opened, synchronize, reopened ]
jobs:
build-aarch64:
runs-on: aarch64
container:
image: alpinelinux/alpine-gitlab-ci:latest
env:
CI_PROJECT_DIR: ${{ github.workspace }}
CI_DEBUG_BUILD: ${{ runner.debug }}
CI_MERGE_REQUEST_PROJECT_URL: ${{ github.server_url }}/${{ github.repository }}
CI_MERGE_REQUEST_TARGET_BRANCH_NAME: ${{ github.base_ref }}
steps:
- name: Environment setup
run: |
doas apk add nodejs git patch curl
cd /etc/apk/keys
doas curl -JO https://ayakael.net/api/packages/forge/alpine/key
- name: Repo pull
uses: actions/checkout@v4
with:
fetch-depth: 500
- name: Package build
run: |
${{ github.workspace }}/.forgejo/bin/build.sh
touch packages/dummy
- name: Package upload
uses: forgejo/upload-artifact@v3
with:
name: package
path: packages
deploy-aarch64:
needs: [build-aarch64]
runs-on: aarch64
container:
image: alpine:latest
env:
CI_ALPINE_REPO: 'https://ayakael.net/api/packages/forge/alpine'
FORGE_REPO_TOKEN: ${{ secrets.FORGE_REPO_TOKEN }}
FORGE_REPO_USER: ${{ vars.FORGE_REPO_USER }}
steps:
- name: Setting up environment
run: apk add nodejs curl findutils git gawk
- name: Repo pull
uses: actions/checkout@v4
- name: Package download
uses: forgejo/download-artifact@v3
- name: Package deployment
run: ${{ github.workspace }}/.forgejo/bin/deploy.sh


@@ -0,0 +1,52 @@
on:
pull_request:
types: [ assigned, opened, synchronize, reopened ]
jobs:
build-x86_64:
runs-on: x86_64
container:
image: alpinelinux/alpine-gitlab-ci:latest
env:
CI_PROJECT_DIR: ${{ github.workspace }}
CI_DEBUG_BUILD: ${{ runner.debug }}
CI_MERGE_REQUEST_PROJECT_URL: ${{ github.server_url }}/${{ github.repository }}
CI_MERGE_REQUEST_TARGET_BRANCH_NAME: ${{ github.base_ref }}
steps:
- name: Environment setup
run: |
doas apk add nodejs git patch curl
cd /etc/apk/keys
doas curl -JO https://ayakael.net/api/packages/forge/alpine/key
- name: Repo pull
uses: actions/checkout@v4
with:
fetch-depth: 500
- name: Package build
run: |
${{ github.workspace }}/.forgejo/bin/build.sh
touch packages/dummy
- name: Package upload
uses: forgejo/upload-artifact@v3
with:
name: package
path: packages
deploy-x86_64:
needs: [build-x86_64]
runs-on: x86_64
container:
image: alpine:latest
env:
CI_ALPINE_REPO: 'https://ayakael.net/api/packages/forge/alpine'
FORGE_REPO_TOKEN: ${{ secrets.FORGE_REPO_TOKEN }}
FORGE_REPO_USER: ${{ vars.FORGE_REPO_USER }}
steps:
- name: Setting up environment
run: apk add nodejs curl findutils git gawk
- name: Repo pull
uses: actions/checkout@v4
- name: Package download
uses: forgejo/download-artifact@v3
- name: Package deployment
run: ${{ github.workspace }}/.forgejo/bin/deploy.sh


@@ -0,0 +1,27 @@
on:
workflow_dispatch:
schedule:
- cron: '0 5 * * *'
jobs:
check-community:
name: Check community repo
runs-on: x86_64
container:
image: alpine:latest
env:
downstream: https://dl-cdn.alpinelinux.org/alpine/edge/community
FORGEJO_TOKEN: ${{ secrets.forgejo_token }}
LABEL_NUMBER: 4
steps:
- name: Environment setup
run: apk add grep coreutils gawk curl wget bash nodejs git jq sed
- name: Get scripts
uses: actions/checkout@v4
with:
fetch-depth: 1
- name: Check out-of-date packages
run: ${{ github.workspace }}/.forgejo/bin/check_ver.sh
- name: Create issues
run: ${{ github.workspace }}/.forgejo/bin/create_issue.sh


@@ -0,0 +1,27 @@
on:
workflow_dispatch:
schedule:
- cron: '0 5 * * *'
jobs:
check-community:
name: Check testing repo
runs-on: x86_64
container:
image: alpine:latest
env:
downstream: https://dl-cdn.alpinelinux.org/alpine/edge/testing
FORGEJO_TOKEN: ${{ secrets.forgejo_token }}
LABEL_NUMBER: 4
steps:
- name: Environment setup
run: apk add grep coreutils gawk curl wget bash nodejs git jq sed
- name: Get scripts
uses: actions/checkout@v4
with:
fetch-depth: 1
- name: Check out-of-date packages
run: ${{ github.workspace }}/.forgejo/bin/check_ver.sh
- name: Create issues
run: ${{ github.workspace }}/.forgejo/bin/create_issue.sh


@@ -0,0 +1,27 @@
on:
workflow_dispatch:
schedule:
- cron: '0 5 * * *'
jobs:
check-user:
name: Check user repo
runs-on: x86_64
container:
image: alpine:latest
env:
downstream: https://ayakael.net/api/packages/forge/alpine/edge/user
FORGEJO_TOKEN: ${{ secrets.forgejo_token }}
LABEL_NUMBER: 4
steps:
- name: Environment setup
run: apk add grep coreutils gawk curl wget bash nodejs git jq sed
- name: Get scripts
uses: actions/checkout@v4
with:
fetch-depth: 1
- name: Check out-of-date packages
run: ${{ github.workspace }}/.forgejo/bin/check_ver.sh
- name: Create issues
run: ${{ github.workspace }}/.forgejo/bin/create_issue.sh


@@ -0,0 +1,21 @@
on:
pull_request:
types: [ assigned, opened, synchronize, reopened ]
jobs:
lint:
run-name: lint
runs-on: x86_64
container:
image: alpinelinux/apkbuild-lint-tools:latest
env:
CI_PROJECT_DIR: ${{ github.workspace }}
CI_DEBUG_BUILD: ${{ runner.debug }}
CI_MERGE_REQUEST_PROJECT_URL: ${{ github.server_url }}/${{ github.repository }}
CI_MERGE_REQUEST_TARGET_BRANCH_NAME: ${{ github.base_ref }}
steps:
- run: doas apk add nodejs git
- uses: actions/checkout@v4
with:
fetch-depth: 500
- run: lint


@@ -1,109 +0,0 @@
stages:
- verify
- build
- deploy
variables:
GIT_STRATEGY: clone
GIT_DEPTH: "500"
lint:
stage: verify
interruptible: true
script:
- |
sudo apk add shellcheck atools sudo abuild
export PATH="$PATH:$CI_PROJECT_DIR/.gitlab/bin"
lint
allow_failure: true
only:
- merge_requests
tags:
- apk-$CI_MERGE_REQUEST_TARGET_BRANCH_NAME-x86_64
.build:
stage: build
interruptible: true
script:
- |
sudo apk add alpine-sdk lua-aports sudo
sudo addgroup $USER abuild
export PATH="$PATH:$CI_PROJECT_DIR/.gitlab/bin"
sudo -Eu $USER build.sh
artifacts:
paths:
- packages/
- keys/
- logs/
expire_in: 7 days
when: always
only:
- merge_requests
.cross:
stage: build
interruptible: true
script:
- |
sudo apk add alpine-sdk lua-aports sudo gzip xz qemu-$CI_QEMU_TARGET_ARCH
sudo addgroup $USER abuild
export PATH="$PATH:$CI_PROJECT_DIR/.gitlab/bin"
build-rootfs.sh alpine${CI_MERGE_REQUEST_TARGET_BRANCH_NAME/v} $CI_ALPINE_TARGET_ARCH --rootfsdir $HOME/sysroot-$CI_ALPINE_TARGET_ARCH
cp /etc/apk/repositories $HOME/sysroot-$CI_ALPINE_TARGET_ARCH/etc/apk/.
sudo -Eu $USER CHOST=$CI_TARGET_ALPINE_ARCH build.sh
artifacts:
paths:
- packages/
- keys/
- logs/
expire_in: 7 days
when: always
only:
- merge_requests
build-x86_64:
extends: .build
when: always
tags:
- apk-$CI_MERGE_REQUEST_TARGET_BRANCH_NAME-x86_64
build-aarch64:
extends: .build
when: always
tags:
- apk-$CI_MERGE_REQUEST_TARGET_BRANCH_NAME-aarch64
build-ppc64le:
extends: .build
when: manual
tags:
- apk-$CI_MERGE_REQUEST_TARGET_BRANCH_NAME-ppc64le
build-s390x:
extends: .build
when: manual
tags:
- apk-$CI_MERGE_REQUEST_TARGET_BRANCH_NAME-s390x
build-armv7:
extends: .cross
when: manual
tags:
- apk-$CI_MERGE_REQUEST_TARGET_BRANCH_NAME-x86_64
variables:
CI_ALPINE_TARGET_ARCH: armv7
CI_QEMU_TARGET_ARCH: arm
push:
interruptible: true
stage: deploy
script:
- |
sudo apk add abuild git-lfs findutils
export PATH="$PATH:$CI_PROJECT_DIR/.gitlab/bin"
push.sh
rules:
- if: $CI_PIPELINE_SOURCE == "merge_request_event"
when: manual
tags:
- repo


@@ -1,111 +0,0 @@
#!/bin/sh
set -e
arch=
builddir=
checkdepends=
depends=
depends_dev=
depends_doc=
depends_libs=
depends_openrc=
depends_static=
install=
install_if=
langdir=
ldpath=
license=
makedepends=
makedepends_build=
makedepends_host=
md5sums=
options=
patch_args=
pkgbasedir=
pkgdesc=
pkgdir=
pkgname=
pkgrel=
pkgver=
pkggroups=
pkgusers=
provides=
provider_priority=
replaces=
sha256sums=
sha512sums=
sonameprefix=
source=
srcdir=
startdir=
subpackages=
subpkgdir=
subpkgname=
triggers=
url=
# abuild.conf
CFLAGS=
CXXFLAGS=
CPPFLAGS=
LDFLAGS=
JOBS=
MAKEFLAGS=
CMAKE_CROSSOPTS=
. ./APKBUILD
: "$arch"
: "$builddir"
: "$checkdepends"
: "$depends"
: "$depends_dev"
: "$depends_doc"
: "$depends_libs"
: "$depends_openrc"
: "$depends_static"
: "$install"
: "$install_if"
: "$langdir"
: "$ldpath"
: "$license"
: "$makedepends"
: "$makedepends_build"
: "$makedepends_host"
: "$md5sums"
: "$options"
: "$patch_args"
: "$pkgbasedir"
: "$pkgdesc"
: "$pkgdir"
: "$pkgname"
: "$pkgrel"
: "$pkgver"
: "$pkggroups"
: "$pkgusers"
: "$provides"
: "$provider_priority"
: "$replaces"
: "$sha256sums"
: "$sha512sums"
: "$sonameprefix"
: "$source"
: "$srcdir"
: "$startdir"
: "$subpackages"
: "$subpkgdir"
: "$subpkgname"
: "$triggers"
: "$url"
# abuild.conf
: "$CFLAGS"
: "$CXXFLAGS"
: "$CPPFLAGS"
: "$LDFLAGS"
: "$JOBS"
: "$MAKEFLAGS"
: "$CMAKE_CROSSOPTS"


@@ -1,16 +0,0 @@
#!/bin/sh
shellcheck -s ash \
-e SC3043 \
-e SC3057 \
-e SC3060 \
-e SC2016 \
-e SC2086 \
-e SC2169 \
-e SC2155 \
-e SC2100 \
-e SC2209 \
-e SC2030 \
-e SC2031 \
-e SC1090 \
-xa $CI_PROJECT_DIR/.gitlab/bin/APKBUILD_SHIM


@@ -1,556 +0,0 @@
#!/usr/bin/env bash
# Available here: https://lab.ilot.io/dotnet/arcade/-/blob/7f6d9796cc7f594772f798358dbdd8c69b6a97af/eng/common/cross/build-rootfs.sh
# Only modification: qemu-$arch-static becomes qemu-$arch
set -e
usage()
{
echo "Usage: $0 [BuildArch] [CodeName] [lldbx.y] [llvmx[.y]] [--skipunmount] --rootfsdir <directory>]"
echo "BuildArch can be: arm(default), arm64, armel, armv6, ppc64le, riscv64, s390x, x64, x86"
echo "CodeName - optional, Code name for Linux, can be: xenial(default), zesty, bionic, alpine"
echo " for alpine can be specified with version: alpineX.YY or alpineedge"
echo " for FreeBSD can be: freebsd12, freebsd13"
echo " for illumos can be: illumos"
echo " for Haiku can be: haiku."
echo "lldbx.y - optional, LLDB version, can be: lldb3.9(default), lldb4.0, lldb5.0, lldb6.0 no-lldb. Ignored for alpine and FreeBSD"
echo "llvmx[.y] - optional, LLVM version for LLVM related packages."
echo "--skipunmount - optional, will skip the unmount of rootfs folder."
echo "--use-mirror - optional, use mirror URL to fetch resources, when available."
echo "--jobs N - optional, restrict to N jobs."
exit 1
}
__CodeName=xenial
__CrossDir=$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )
__BuildArch=arm
__AlpineArch=armv7
__FreeBSDArch=arm
__FreeBSDMachineArch=armv7
__IllumosArch=arm7
__QEMUArch=arm
__UbuntuArch=armhf
__UbuntuRepo="http://ports.ubuntu.com/"
__LLDB_Package="liblldb-3.9-dev"
__SkipUnmount=0
# base development support
__UbuntuPackages="build-essential"
__AlpinePackages="alpine-base"
__AlpinePackages+=" build-base"
# symlinks fixer
__UbuntuPackages+=" symlinks"
# runtime dependencies
__UbuntuPackages+=" libicu-dev"
__UbuntuPackages+=" liblttng-ust-dev"
__UbuntuPackages+=" libunwind8-dev"
__UbuntuPackages+=" libnuma-dev"
# runtime libraries' dependencies
__UbuntuPackages+=" libcurl4-openssl-dev"
__UbuntuPackages+=" libkrb5-dev"
__UbuntuPackages+=" libssl-dev"
__UbuntuPackages+=" zlib1g-dev"
__FreeBSDBase="12.3-RELEASE"
__FreeBSDPkg="1.17.0"
__FreeBSDABI="12"
__FreeBSDPackages="libunwind"
__FreeBSDPackages+=" icu"
__FreeBSDPackages+=" libinotify"
__FreeBSDPackages+=" openssl"
__FreeBSDPackages+=" krb5"
__FreeBSDPackages+=" terminfo-db"
__IllumosPackages="icu"
__IllumosPackages+=" mit-krb5"
__IllumosPackages+=" openssl"
__IllumosPackages+=" zlib"
__HaikuPackages="gmp"
__HaikuPackages+=" gmp_devel"
__HaikuPackages+=" krb5"
__HaikuPackages+=" krb5_devel"
__HaikuPackages+=" libiconv"
__HaikuPackages+=" libiconv_devel"
__HaikuPackages+=" llvm12_libunwind"
__HaikuPackages+=" llvm12_libunwind_devel"
__HaikuPackages+=" mpfr"
__HaikuPackages+=" mpfr_devel"
# ML.NET dependencies
__UbuntuPackages+=" libomp5"
__UbuntuPackages+=" libomp-dev"
__Keyring=
__UseMirror=0
__UnprocessedBuildArgs=
while :; do
if [[ "$#" -le 0 ]]; then
break
fi
lowerI="$(echo "$1" | tr "[:upper:]" "[:lower:]")"
case $lowerI in
-\?|-h|--help)
usage
exit 1
;;
arm)
__BuildArch=arm
__UbuntuArch=armhf
__AlpineArch=armv7
__QEMUArch=arm
;;
arm64)
__BuildArch=arm64
__UbuntuArch=arm64
__AlpineArch=aarch64
__QEMUArch=aarch64
__FreeBSDArch=arm64
__FreeBSDMachineArch=aarch64
;;
armel)
__BuildArch=armel
__UbuntuArch=armel
__UbuntuRepo="http://ftp.debian.org/debian/"
__CodeName=jessie
;;
armv6)
__BuildArch=armv6
__UbuntuArch=armhf
__QEMUArch=arm
__UbuntuRepo="http://raspbian.raspberrypi.org/raspbian/"
__CodeName=buster
__LLDB_Package="liblldb-6.0-dev"
if [[ -e "/usr/share/keyrings/raspbian-archive-keyring.gpg" ]]; then
__Keyring="--keyring /usr/share/keyrings/raspbian-archive-keyring.gpg"
fi
;;
riscv64)
__BuildArch=riscv64
__AlpineArch=riscv64
__QEMUArch=riscv64
__UbuntuArch=riscv64
__UbuntuRepo="http://deb.debian.org/debian-ports"
__UbuntuPackages="${__UbuntuPackages// libunwind8-dev/}"
unset __LLDB_Package
if [[ -e "/usr/share/keyrings/debian-ports-archive-keyring.gpg" ]]; then
__Keyring="--keyring /usr/share/keyrings/debian-ports-archive-keyring.gpg --include=debian-ports-archive-keyring"
fi
;;
ppc64le)
__BuildArch=ppc64le
__AlpineArch=ppc64le
__QEMUArch=ppc64le
__UbuntuArch=ppc64el
__UbuntuRepo="http://ports.ubuntu.com/ubuntu-ports/"
__UbuntuPackages="${__UbuntuPackages// libunwind8-dev/}"
__UbuntuPackages="${__UbuntuPackages// libomp-dev/}"
__UbuntuPackages="${__UbuntuPackages// libomp5/}"
unset __LLDB_Package
;;
s390x)
__BuildArch=s390x
__AlpineArch=s390x
__QEMUArch=s390x
__UbuntuArch=s390x
__UbuntuRepo="http://ports.ubuntu.com/ubuntu-ports/"
__UbuntuPackages="${__UbuntuPackages// libunwind8-dev/}"
__UbuntuPackages="${__UbuntuPackages// libomp-dev/}"
__UbuntuPackages="${__UbuntuPackages// libomp5/}"
unset __LLDB_Package
;;
x64)
__BuildArch=x64
__AlpineArch=x86_64
__QEMUArch=x86_64
__UbuntuArch=amd64
__FreeBSDArch=amd64
__FreeBSDMachineArch=amd64
__illumosArch=x86_64
__UbuntuRepo=
;;
x86)
__BuildArch=x86
__AlpineArch=i386
__QEMUArch=i386
__UbuntuArch=i386
__AlpineArch=x86
__UbuntuRepo="http://archive.ubuntu.com/ubuntu/"
;;
lldb*)
version="${lowerI/lldb/}"
parts=(${version//./ })
# for versions > 6.0, lldb has dropped the minor version
if [[ "${parts[0]}" -gt 6 ]]; then
version="${parts[0]}"
fi
__LLDB_Package="liblldb-${version}-dev"
;;
no-lldb)
unset __LLDB_Package
;;
llvm*)
version="${lowerI/llvm/}"
parts=(${version//./ })
__LLVM_MajorVersion="${parts[0]}"
__LLVM_MinorVersion="${parts[1]}"
# for versions > 6.0, llvm has dropped the minor version
if [[ -z "$__LLVM_MinorVersion" && "$__LLVM_MajorVersion" -le 6 ]]; then
__LLVM_MinorVersion=0;
fi
;;
xenial) # Ubuntu 16.04
if [[ "$__CodeName" != "jessie" ]]; then
__CodeName=xenial
fi
;;
zesty) # Ubuntu 17.04
if [[ "$__CodeName" != "jessie" ]]; then
__CodeName=zesty
fi
;;
bionic) # Ubuntu 18.04
if [[ "$__CodeName" != "jessie" ]]; then
__CodeName=bionic
fi
;;
focal) # Ubuntu 20.04
if [[ "$__CodeName" != "jessie" ]]; then
__CodeName=focal
fi
;;
jammy) # Ubuntu 22.04
if [[ "$__CodeName" != "jessie" ]]; then
__CodeName=jammy
fi
;;
jessie) # Debian 8
__CodeName=jessie
if [[ -z "$__UbuntuRepo" ]]; then
__UbuntuRepo="http://ftp.debian.org/debian/"
fi
;;
stretch) # Debian 9
__CodeName=stretch
__LLDB_Package="liblldb-6.0-dev"
if [[ -z "$__UbuntuRepo" ]]; then
__UbuntuRepo="http://ftp.debian.org/debian/"
fi
;;
buster) # Debian 10
__CodeName=buster
__LLDB_Package="liblldb-6.0-dev"
if [[ -z "$__UbuntuRepo" ]]; then
__UbuntuRepo="http://ftp.debian.org/debian/"
fi
;;
bullseye) # Debian 11
__CodeName=bullseye
if [[ -z "$__UbuntuRepo" ]]; then
__UbuntuRepo="http://ftp.debian.org/debian/"
fi
;;
sid) # Debian sid
__CodeName=sid
if [[ -z "$__UbuntuRepo" ]]; then
__UbuntuRepo="http://ftp.debian.org/debian/"
fi
;;
tizen)
__CodeName=
__UbuntuRepo=
__Tizen=tizen
;;
alpine*)
__CodeName=alpine
__UbuntuRepo=
version="${lowerI/alpine/}"
if [[ "$version" == "edge" ]]; then
__AlpineVersion=edge
else
parts=(${version//./ })
__AlpineMajorVersion="${parts[0]}"
__AlpineMinoVersion="${parts[1]}"
__AlpineVersion="$__AlpineMajorVersion.$__AlpineMinoVersion"
fi
;;
freebsd12)
__CodeName=freebsd
__SkipUnmount=1
;;
freebsd13)
__CodeName=freebsd
__FreeBSDBase="13.0-RELEASE"
__FreeBSDABI="13"
__SkipUnmount=1
;;
illumos)
__CodeName=illumos
__SkipUnmount=1
;;
haiku)
__CodeName=haiku
__BuildArch=x64
__SkipUnmount=1
;;
--skipunmount)
__SkipUnmount=1
;;
--rootfsdir|-rootfsdir)
shift
__RootfsDir="$1"
;;
--use-mirror)
__UseMirror=1
;;
--use-jobs)
shift
MAXJOBS=$1
;;
*)
__UnprocessedBuildArgs="$__UnprocessedBuildArgs $1"
;;
esac
shift
done
if [[ "$__BuildArch" == "armel" ]]; then
__LLDB_Package="lldb-3.5-dev"
fi
__UbuntuPackages+=" ${__LLDB_Package:-}"
if [[ -n "$__LLVM_MajorVersion" ]]; then
__UbuntuPackages+=" libclang-common-${__LLVM_MajorVersion}${__LLVM_MinorVersion:+.$__LLVM_MinorVersion}-dev"
fi
if [[ -z "$__RootfsDir" && -n "$ROOTFS_DIR" ]]; then
__RootfsDir="$ROOTFS_DIR"
fi
if [[ -z "$__RootfsDir" ]]; then
__RootfsDir="$__CrossDir/../../../.tools/rootfs/$__BuildArch"
fi
if [[ -d "$__RootfsDir" ]]; then
if [[ "$__SkipUnmount" == "0" ]]; then
umount "$__RootfsDir"/* || true
fi
rm -rf "$__RootfsDir"
fi
mkdir -p "$__RootfsDir"
__RootfsDir="$( cd "$__RootfsDir" && pwd )"
if [[ "$__CodeName" == "alpine" ]]; then
__ApkToolsVersion=2.12.11
__ApkToolsDir="$(mktemp -d)"
wget "https://gitlab.alpinelinux.org/api/v4/projects/5/packages/generic//v$__ApkToolsVersion/x86_64/apk.static" -P "$__ApkToolsDir"
chmod +x "$__ApkToolsDir/apk.static"
mkdir -p "$__RootfsDir"/usr/bin
cp -v "/usr/bin/qemu-$__QEMUArch" "$__RootfsDir/usr/bin"
if [[ "$__AlpineVersion" == "edge" ]]; then
version=edge
else
version="v$__AlpineVersion"
fi
# initialize DB
"$__ApkToolsDir/apk.static" \
-X "http://dl-cdn.alpinelinux.org/alpine/$version/main" \
-X "http://dl-cdn.alpinelinux.org/alpine/$version/community" \
-U --allow-untrusted --root "$__RootfsDir" --arch "$__AlpineArch" --initdb add
if [[ "$__AlpineLlvmLibsLookup" == 1 ]]; then
__AlpinePackages+=" $("$__ApkToolsDir/apk.static" \
-X "http://dl-cdn.alpinelinux.org/alpine/$version/main" \
-X "http://dl-cdn.alpinelinux.org/alpine/$version/community" \
-U --allow-untrusted --root "$__RootfsDir" --arch "$__AlpineArch" \
search 'llvm*-libs' | sort | tail -1 | sed 's/-[^-]*//2g')"
fi
# install all packages in one go
"$__ApkToolsDir/apk.static" \
-X "http://dl-cdn.alpinelinux.org/alpine/$version/main" \
-X "http://dl-cdn.alpinelinux.org/alpine/$version/community" \
-U --allow-untrusted --no-scripts --root "$__RootfsDir" --arch "$__AlpineArch" \
add $__AlpinePackages
rm -r "$__ApkToolsDir"
elif [[ "$__CodeName" == "freebsd" ]]; then
mkdir -p "$__RootfsDir"/usr/local/etc
JOBS=${MAXJOBS:="$(getconf _NPROCESSORS_ONLN)"}
wget -O - "https://download.freebsd.org/ftp/releases/${__FreeBSDArch}/${__FreeBSDMachineArch}/${__FreeBSDBase}/base.txz" | tar -C "$__RootfsDir" -Jxf - ./lib ./usr/lib ./usr/libdata ./usr/include ./usr/share/keys ./etc ./bin/freebsd-version
echo "ABI = \"FreeBSD:${__FreeBSDABI}:${__FreeBSDMachineArch}\"; FINGERPRINTS = \"${__RootfsDir}/usr/share/keys\"; REPOS_DIR = [\"${__RootfsDir}/etc/pkg\"]; REPO_AUTOUPDATE = NO; RUN_SCRIPTS = NO;" > "${__RootfsDir}"/usr/local/etc/pkg.conf
echo "FreeBSD: { url: \"pkg+http://pkg.FreeBSD.org/\${ABI}/quarterly\", mirror_type: \"srv\", signature_type: \"fingerprints\", fingerprints: \"${__RootfsDir}/usr/share/keys/pkg\", enabled: yes }" > "${__RootfsDir}"/etc/pkg/FreeBSD.conf
mkdir -p "$__RootfsDir"/tmp
# get and build package manager
wget -O - "https://github.com/freebsd/pkg/archive/${__FreeBSDPkg}.tar.gz" | tar -C "$__RootfsDir"/tmp -zxf -
cd "$__RootfsDir/tmp/pkg-${__FreeBSDPkg}"
# needed for install to succeed
mkdir -p "$__RootfsDir"/host/etc
./autogen.sh && ./configure --prefix="$__RootfsDir"/host && make -j "$JOBS" && make install
rm -rf "$__RootfsDir/tmp/pkg-${__FreeBSDPkg}"
# install packages we need.
INSTALL_AS_USER=$(whoami) "$__RootfsDir"/host/sbin/pkg -r "$__RootfsDir" -C "$__RootfsDir"/usr/local/etc/pkg.conf update
INSTALL_AS_USER=$(whoami) "$__RootfsDir"/host/sbin/pkg -r "$__RootfsDir" -C "$__RootfsDir"/usr/local/etc/pkg.conf install --yes $__FreeBSDPackages
elif [[ "$__CodeName" == "illumos" ]]; then
mkdir "$__RootfsDir/tmp"
pushd "$__RootfsDir/tmp"
JOBS=${MAXJOBS:="$(getconf _NPROCESSORS_ONLN)"}
echo "Downloading sysroot."
wget -O - https://github.com/illumos/sysroot/releases/download/20181213-de6af22ae73b-v1/illumos-sysroot-i386-20181213-de6af22ae73b-v1.tar.gz | tar -C "$__RootfsDir" -xzf -
echo "Building binutils. Please wait.."
wget -O - https://ftp.gnu.org/gnu/binutils/binutils-2.33.1.tar.bz2 | tar -xjf -
mkdir build-binutils && cd build-binutils
../binutils-2.33.1/configure --prefix="$__RootfsDir" --target="${__illumosArch}-sun-solaris2.10" --program-prefix="${__illumosArch}-illumos-" --with-sysroot="$__RootfsDir"
make -j "$JOBS" && make install && cd ..
echo "Building gcc. Please wait.."
wget -O - https://ftp.gnu.org/gnu/gcc/gcc-8.4.0/gcc-8.4.0.tar.xz | tar -xJf -
CFLAGS="-fPIC"
CXXFLAGS="-fPIC"
CXXFLAGS_FOR_TARGET="-fPIC"
CFLAGS_FOR_TARGET="-fPIC"
export CFLAGS CXXFLAGS CXXFLAGS_FOR_TARGET CFLAGS_FOR_TARGET
mkdir build-gcc && cd build-gcc
../gcc-8.4.0/configure --prefix="$__RootfsDir" --target="${__illumosArch}-sun-solaris2.10" --program-prefix="${__illumosArch}-illumos-" --with-sysroot="$__RootfsDir" --with-gnu-as \
--with-gnu-ld --disable-nls --disable-libgomp --disable-libquadmath --disable-libssp --disable-libvtv --disable-libcilkrts --disable-libada --disable-libsanitizer \
--disable-libquadmath-support --disable-shared --enable-tls
make -j "$JOBS" && make install && cd ..
BaseUrl=https://pkgsrc.smartos.org
if [[ "$__UseMirror" == 1 ]]; then
BaseUrl=https://pkgsrc.smartos.skylime.net
fi
BaseUrl="$BaseUrl/packages/SmartOS/trunk/${__illumosArch}/All"
echo "Downloading manifest"
wget "$BaseUrl"
echo "Downloading dependencies."
read -ra array <<<"$__IllumosPackages"
for package in "${array[@]}"; do
echo "Installing '$package'"
# find last occurrence of package in listing and extract its name
package="$(sed -En '/.*href="('"$package"'-[0-9].*).tgz".*/h;$!d;g;s//\1/p' All)"
echo "Resolved name '$package'"
wget "$BaseUrl"/"$package".tgz
ar -x "$package".tgz
tar --skip-old-files -xzf "$package".tmp.tg* -C "$__RootfsDir" 2>/dev/null
done
echo "Cleaning up temporary files."
popd
rm -rf "$__RootfsDir"/{tmp,+*}
mkdir -p "$__RootfsDir"/usr/include/net
mkdir -p "$__RootfsDir"/usr/include/netpacket
wget -P "$__RootfsDir"/usr/include/net https://raw.githubusercontent.com/illumos/illumos-gate/master/usr/src/uts/common/io/bpf/net/bpf.h
wget -P "$__RootfsDir"/usr/include/net https://raw.githubusercontent.com/illumos/illumos-gate/master/usr/src/uts/common/io/bpf/net/dlt.h
wget -P "$__RootfsDir"/usr/include/netpacket https://raw.githubusercontent.com/illumos/illumos-gate/master/usr/src/uts/common/inet/sockmods/netpacket/packet.h
wget -P "$__RootfsDir"/usr/include/sys https://raw.githubusercontent.com/illumos/illumos-gate/master/usr/src/uts/common/sys/sdt.h
elif [[ "$__CodeName" == "haiku" ]]; then
JOBS=${MAXJOBS:="$(getconf _NPROCESSORS_ONLN)"}
echo "Building Haiku sysroot for x86_64"
mkdir -p "$__RootfsDir/tmp"
cd "$__RootfsDir/tmp"
git clone -b hrev56235 https://review.haiku-os.org/haiku
git clone -b btrev43195 https://review.haiku-os.org/buildtools
cd "$__RootfsDir/tmp/buildtools" && git checkout 7487388f5110021d400b9f3b88e1a7f310dc066d
# Fetch some unmerged patches
cd "$__RootfsDir/tmp/haiku"
## Add development build profile (slimmer than nightly)
git fetch origin refs/changes/64/4164/1 && git -c commit.gpgsign=false cherry-pick FETCH_HEAD
# Build jam
cd "$__RootfsDir/tmp/buildtools/jam"
make
# Configure cross tools
echo "Building cross-compiler"
mkdir -p "$__RootfsDir/generated"
cd "$__RootfsDir/generated"
"$__RootfsDir/tmp/haiku/configure" -j"$JOBS" --sysroot "$__RootfsDir" --cross-tools-source "$__RootfsDir/tmp/buildtools" --build-cross-tools x86_64
# Build Haiku packages
echo "Building Haiku"
echo 'HAIKU_BUILD_PROFILE = "development-raw" ;' > UserProfileConfig
"$__RootfsDir/tmp/buildtools/jam/jam0" -j"$JOBS" -q '<build>package' '<repository>Haiku'
BaseUrl="https://depot.haiku-os.org/__api/v2/pkg/get-pkg"
# Download additional packages
echo "Downloading additional required packages"
read -ra array <<<"$__HaikuPackages"
for package in "${array[@]}"; do
echo "Downloading $package..."
# API documented here: https://github.com/haiku/haikudepotserver/blob/master/haikudepotserver-api2/src/main/resources/api2/pkg.yaml#L60
# The schema here: https://github.com/haiku/haikudepotserver/blob/master/haikudepotserver-api2/src/main/resources/api2/pkg.yaml#L598
hpkgDownloadUrl="$(wget -qO- --post-data='{"name":"'"$package"'","repositorySourceCode":"haikuports_x86_64","versionType":"LATEST","naturalLanguageCode":"en"}' \
--header='Content-Type:application/json' "$BaseUrl" | jq -r '.result.versions[].hpkgDownloadURL')"
wget -P "$__RootfsDir/generated/download" "$hpkgDownloadUrl"
done
# Setup the sysroot
echo "Setting up sysroot and extracting needed packages"
mkdir -p "$__RootfsDir/boot/system"
for file in "$__RootfsDir/generated/objects/haiku/x86_64/packaging/packages/"*.hpkg; do
"$__RootfsDir/generated/objects/linux/x86_64/release/tools/package/package" extract -C "$__RootfsDir/boot/system" "$file"
done
for file in "$__RootfsDir/generated/download/"*.hpkg; do
"$__RootfsDir/generated/objects/linux/x86_64/release/tools/package/package" extract -C "$__RootfsDir/boot/system" "$file"
done
# Cleaning up temporary files
echo "Cleaning up temporary files"
rm -rf "$__RootfsDir/tmp"
for name in "$__RootfsDir/generated/"*; do
if [[ "$name" =~ "cross-tools-" ]]; then
: # Keep the cross-compiler
else
rm -rf "$name"
fi
done
elif [[ -n "$__CodeName" ]]; then
qemu-debootstrap $__Keyring --arch "$__UbuntuArch" "$__CodeName" "$__RootfsDir" "$__UbuntuRepo"
cp "$__CrossDir/$__BuildArch/sources.list.$__CodeName" "$__RootfsDir/etc/apt/sources.list"
chroot "$__RootfsDir" apt-get update
chroot "$__RootfsDir" apt-get -f -y install
chroot "$__RootfsDir" apt-get -y install $__UbuntuPackages
chroot "$__RootfsDir" symlinks -cr /usr
chroot "$__RootfsDir" apt-get clean
if [[ "$__SkipUnmount" == "0" ]]; then
umount "$__RootfsDir"/* || true
fi
if [[ "$__BuildArch" == "armel" && "$__CodeName" == "jessie" ]]; then
pushd "$__RootfsDir"
patch -p1 < "$__CrossDir/$__BuildArch/armel.jessie.patch"
popd
fi
elif [[ "$__Tizen" == "tizen" ]]; then
ROOTFS_DIR="$__RootfsDir" "$__CrossDir/tizen-build-rootfs.sh" "$__BuildArch"
else
echo "Unsupported target platform."
usage;
exit 1
fi


@@ -1,20 +0,0 @@
#!/bin/sh
if [ $# -lt 1 ]; then
echo "Usage: $0 <basebranch>"
exit 1
fi
if ! git rev-parse --is-inside-work-tree >/dev/null 2>&1; then
echo "Fatal: not inside a git repository"
exit 2
fi
basebranch=$1
if ! git rev-parse --verify --quiet $basebranch >/dev/null; then
# The base branch does not exist, probably due to a shallow clone
git fetch -v $CI_MERGE_REQUEST_PROJECT_URL.git +refs/heads/$basebranch:refs/heads/$basebranch
fi
git --no-pager diff --diff-filter=ACMR --name-only $basebranch...HEAD -- "*/APKBUILD" | xargs -r -n1 dirname


@@ -1,74 +0,0 @@
# shellcheck disable=SC3043
:
# shellcheck disable=SC3040
set -eu -o pipefail
changed_repos() {
: "${APORTSDIR?APORTSDIR missing}"
: "${BASEBRANCH?BASEBRANCH missing}"
cd "$APORTSDIR"
for repo in $REPOS; do
git diff --diff-filter=ACMR --exit-code "$BASEBRANCH"...HEAD -- "$repo" >/dev/null \
|| echo "$repo"
done
}
changed_aports() {
: "${APORTSDIR?APORTSDIR missing}"
: "${BASEBRANCH?BASEBRANCH missing}"
cd "$APORTSDIR"
local repo="$1"
local aports
aports=$(git diff --name-only --diff-filter=ACMR --relative="$repo" \
"$BASEBRANCH"...HEAD -- "*/APKBUILD" | xargs -rn1 dirname)
# shellcheck disable=2086
ap builddirs -d "$APORTSDIR/$repo" $aports 2>/dev/null | xargs -rn1 basename
}
section_start() {
name=${1?arg 1 name missing}
header=${2?arg 2 header missing}
collapsed=$2
timestamp=$(date +%s)
options=""
case $collapsed in
yes|on|collapsed|true) options="[collapsed=true]";;
esac
printf "\e[0Ksection_start:%d:%s%s\r\e[0K%s\n" "$timestamp" "$name" "$options" "$header"
}
section_end() {
name=$1
timestamp=$(date +%s)
printf "\e[0Ksection_end:%d:%s\r\e[0K" "$timestamp" "$name"
}
gitlab_key_to_rsa() {
KEY=$1
TYPE=$2
TGT=$3
TGT_DIR=${TGT%/*}
if [ "$TGT" == "$TGT_DIR" ]; then
TGT_DIR="./"
fi
if [ ! -d "$TGT_DIR" ]; then
mkdir -p "$TGT_DIR"
fi
case $TYPE in
rsa-public) local type="PUBLIC";;
rsa-private) local type="RSA PRIVATE";;
esac
echo "-----BEGIN $type KEY-----" > "$TGT"
echo $1 | sed 's/.\{64\}/&\
/g' >> "$TGT"
echo "-----END $type KEY-----" >> "$TGT"
}


@@ -1,96 +0,0 @@
#!/bin/sh
BLUE="\e[34m"
MAGENTA="\e[35m"
RESET="\e[0m"
readonly BASEBRANCH=$CI_MERGE_REQUEST_TARGET_BRANCH_NAME
verbose() {
echo "> " "$@"
# shellcheck disable=SC2068
$@
}
debugging() {
[ -n "$CI_DEBUG_BUILD" ]
}
debug() {
if debugging; then
verbose "$@"
fi
}
# git no longer allows to execute in repositories owned by different users
sudo chown -R gitlab-runner: .
fetch_flags="-qn"
debugging && fetch_flags="-v"
git fetch $fetch_flags "$CI_MERGE_REQUEST_PROJECT_URL" \
"+refs/heads/$BASEBRANCH:refs/heads/$BASEBRANCH"
if debugging; then
merge_base=$(git merge-base "$BASEBRANCH" HEAD)
echo "$merge_base"
git --version
git config -l
git tag merge-base "$merge_base" || { echo "Could not determine merge-base"; exit 50; }
git log --oneline --graph --decorate --all
fi
has_problems=0
for PKG in $(changed-aports "$BASEBRANCH"); do
printf "$BLUE==>$RESET Linting $PKG\n"
(
cd "$PKG"
repo=$(basename $(dirname $PKG));
if [ "$repo" == "backports" ]; then
echo "Skipping $PKG as backports (we don't care)"
continue
fi
printf "\n\n"
printf "$BLUE"
printf '======================================================\n'
printf " parse APKBUILD:\n"
printf '======================================================'
printf "$RESET\n\n"
( . ./APKBUILD ) || has_problems=1
printf "\n\n"
printf "$BLUE"
printf '======================================================\n'
printf " abuild sanitycheck:\n"
printf '======================================================'
printf "$RESET\n\n"
abuild sanitycheck || has_problems=1
printf "\n\n"
printf "$BLUE"
printf '======================================================\n'
printf " apkbuild-shellcheck:\n"
printf '======================================================'
printf "$RESET\n"
apkbuild-shellcheck || has_problems=1
printf "\n\n"
printf "$BLUE"
printf '======================================================\n'
printf " apkbuild-lint:\n"
printf '======================================================'
printf "$RESET\n\n"
apkbuild-lint APKBUILD || has_problems=1
return $has_problems
) || has_problems=1
echo
done
exit $has_problems


@@ -1,56 +0,0 @@
#!/bin/sh
# shellcheck disable=SC3043
. $CI_PROJECT_DIR/.gitlab/bin/functions.sh
# shellcheck disable=SC3040
set -eu -o pipefail
readonly APORTSDIR=$CI_PROJECT_DIR
readonly REPOS="backports user"
readonly BASEBRANCH=$CI_MERGE_REQUEST_TARGET_BRANCH_NAME
export GIT_SSH_COMMAND="ssh -o UserKnownHostsFile=/dev/null -o StrictHostKeyChecking=no"
gitlab_key_to_rsa $ABUILD_KEY rsa-private $HOME/.abuild/$ABUILD_KEY_NAME.rsa
gitlab_key_to_rsa $ABUILD_KEY_PUB rsa-public $HOME/.abuild/$ABUILD_KEY_NAME.rsa.pub
gitlab_key_to_rsa $SSH_KEY rsa-private $HOME/.ssh/id_rsa
chmod 700 "$HOME"/.ssh/id_rsa
chmod 700 "$HOME"/.abuild/$ABUILD_KEY_NAME.rsa
echo "PACKAGER_PRIVKEY=$HOME/.abuild/$ABUILD_KEY_NAME.rsa" > $HOME/.abuild/abuild.conf
echo "REPODEST=$HOME/repo-apk" >> $HOME/.abuild/abuild.conf
sudo cp $HOME/.abuild/$ABUILD_KEY_NAME.rsa.pub /etc/apk/keys/.
if [ -d $HOME/repo-apk ]; then
git -C $HOME/repo-apk fetch
git -C $HOME/repo-apk checkout $BASEBRANCH
git -C $HOME/repo-apk pull --rebase
else
git clone git@lab.ilot.io:ayakael/repo-apk -b $BASEBRANCH $HOME/repo-apk
fi
for i in $(find packages -type f -name "*.apk"); do
install -vDm644 $i ${i/packages/$HOME\/repo-apk}
done
fetch_flags="-qn"
git fetch $fetch_flags "$CI_MERGE_REQUEST_PROJECT_URL" \
"+refs/heads/$BASEBRANCH:refs/heads/$BASEBRANCH"
for repo in $(changed_repos); do
rm $HOME/repo-apk/$repo/*/APKINDEX.tar.gz | true
mkdir -p $repo/DUMMY
echo "pkgname=DUMMY" > $repo/DUMMY/APKBUILD
cd $repo/DUMMY
for i in $(find $HOME/repo-apk/$repo -maxdepth 1 -mindepth 1 -printf '%P '); do
CHOST=$i abuild index
done
cd "$CI_PROJECT_DIR"
rm -R $repo/DUMMY
done
git -C $HOME/repo-apk add .
git -C $HOME/repo-apk commit -m "Update from $CI_MERGE_REQUEST_IID - $CI_MERGE_REQUEST_TITLE"
git -C $HOME/repo-apk push

View file

@ -1,17 +0,0 @@
diff --git a/usr/bin/abuild.orig b/usr/bin/abuild
index 71e0681..d4ae3dd 100755
--- a/usr/bin/abuild.orig
+++ b/usr/bin/abuild
@@ -2231,7 +2231,11 @@ calcdeps() {
list_has $i $builddeps && continue
subpackages_has ${i%%[<>=]*} || builddeps="$builddeps $i"
done
- hostdeps="$EXTRADEPENDS_TARGET"
+ for i in $EXTRADEPENDS_HOST $EXTRADEPENDS_TARGET $depends $makedepends; do
+ [ "$pkgname" = "${i%%[<>=]*}" ] && continue
+ list_has $i $hostdeps && continue
+ subpackages_has ${i%%[<>=]*} || hostdeps="$hostdeps $i"
+ done
fi
}

View file

@ -1,32 +1,31 @@
# user-aports
Upstream: https://lab.ilot.io/ayakael/user-aports
# ayaports
Upstream: https://ayakael.net/forge/ayaports
## Description
This repository contains aports that are not yet merged into the official Alpine
Linux repository or don't adhere to Alpine policies. Packages are automatically
built using GitLab CI on my own GitLab instance. Once built, they are deployed
to a git-lfs repository, making them available to apk.
built using CI. Once built, they are deployed to a git-lfs repository, making
them available to apk.
Branches are matched to Alpine releases.
## Repositories
You can browse all the repositories at https://lab.ilot.io/ayakael/repo-apk.
You can browse all the repositories at https://codeberg.org/ayakael/ayaports
Affixed to each repository description is the appropriate link for use in
`/etc/apk/repositories`.
#### Backports
```
https://lab.ilot.io/ayakael/repo-apk/-/raw/edge/backports
https://ayakael.net/api/packages/forge/alpine/edge/backports
```
Aports from the official Alpine repositories backported from edge.
#### User
```
https://lab.ilot.io/ayakael/repo-apk/-/raw/edge/user
https://ayakael.net/api/packages/forge/alpine/edge/user
```
Aports that have yet to be (or may never be) upstreamed to the official
@ -34,11 +33,11 @@ aports.
## How to use
Add security key of the repo-apk repository to your /etc/apk/keys:
Add the security key of the apk repository to your /etc/apk/keys:
```shell
cd /etc/apk/keys
wget https://lab.ilot.io/ayakael/repo-apk/-/raw/edge/antoine.martin@protonmail.com-5b3109ad.rsa.pub
curl -JO https://ayakael.net/api/packages/forge/alpine/key
```
Add the repositories that you want to use (see above) to `/etc/apk/repositories`.
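A minimal sketch of that step, using the edge URLs shown above:

```shell
cat >> /etc/apk/repositories <<EOF
https://ayakael.net/api/packages/forge/alpine/edge/backports
https://ayakael.net/api/packages/forge/alpine/edge/user
EOF
apk update
```

On a stable release the `edge` path segment would presumably be replaced by the matching branch name (an assumption based on the branch note above).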
@ -52,10 +51,10 @@ they will work for you.
## Contribution & bug reports
If you wish to contribute to this aports collection, or wish to report a bug,
you can do so on Alpine's GitLab instance here:
https://gitlab.alpinelinux.org/ayakael/user-aports
you can do so on Codeberg here:
https://codeberg.org/ayakael/ayaports/issues
For packages that are in testing/community, bug reports and merge requests
For packages that are in backports, bug reports and merge requests
should be done on Alpine's aports repo instance:
https://gitlab.alpinelinux.org/alpine/aports

View file

@ -0,0 +1,25 @@
diff --color -Nur calibre-6.17.0.orig/src/calibre/gui2/update.py calibre-6.17.0/src/calibre/gui2/update.py
--- calibre-6.17.0.orig/src/calibre/gui2/update.py 2023-05-06 11:36:35.678461036 -0700
+++ calibre-6.17.0/src/calibre/gui2/update.py 2023-05-06 11:39:10.365134930 -0700
@@ -82,20 +82,6 @@
while not self.shutdown_event.is_set():
calibre_update_version = NO_CALIBRE_UPDATE
plugins_update_found = 0
- try:
- version = get_newest_version()
- if version[:2] > numeric_version[:2]:
- calibre_update_version = version
- except Exception as e:
- prints('Failed to check for calibre update:', as_unicode(e))
- try:
- update_plugins = get_plugin_updates_available(raise_error=True)
- if update_plugins is not None:
- plugins_update_found = len(update_plugins)
- except Exception as e:
- prints('Failed to check for plugin update:', as_unicode(e))
- if calibre_update_version != NO_CALIBRE_UPDATE or plugins_update_found > 0:
- self.signal.update_found.emit(calibre_update_version, plugins_update_found)
self.shutdown_event.wait(self.INTERVAL)
def shutdown(self):

backports/calibre/APKBUILD (new file, 116 lines)
View file

@ -0,0 +1,116 @@
# Maintainer: Cowington Post <cowingtonpost@gmail.com>
pkgname=calibre
pkgver=7.12.0
pkgrel=0
pkgdesc="Ebook management application"
# qt6-webengine
arch="x86_64 aarch64"
url="https://calibre-ebook.com"
license="GPL-3.0-or-later"
depends="
font-liberation
libwmf
mtdev
optipng
poppler
py3-apsw
py3-beautifulsoup4
py3-css-parser
py3-cssselect
py3-dateutil
py3-dnspython
py3-feedparser
py3-fonttools
py3-html2text
py3-html5-parser
py3-html5lib
py3-jeepney
py3-lxml
py3-markdown
py3-mechanize
py3-msgpack
py3-netifaces
py3-pillow
py3-psutil
py3-pycryptodome
py3-pygments
py3-pyqt6-webengine
py3-regex
py3-xxhash
py3-zeroconf
qt6-qtimageformats
qt6-qtsvg
qt6-qtwebengine
udisks2
"
makedepends="
cmake
curl
hunspell-dev
hyphen-dev
libmtp-dev
libstemmer-dev
libusb-dev
podofo-dev
py3-pyqt-builder
py3-pyqt6-sip
py3-sip
python3-dev
qt6-qtbase-dev
uchardet-dev
xdg-utils
"
subpackages="
$pkgname-pyc
$pkgname-doc
$pkgname-bash-completion
$pkgname-zsh-completion
"
source="https://download.calibre-ebook.com/$pkgver/calibre-$pkgver.tar.xz
0001-$pkgname-no-update.patch
"
# net: downloads iso-codes
# !check: no tests ran
options="net !check"
export LANG="en_US.UTF-8"
prepare() {
default_prepare
rm -f resources/calibre-portable.*
}
build() {
python3 setup.py build
python3 setup.py iso639
python3 setup.py iso3166
python3 setup.py liberation_fonts --system-liberation_fonts --path-to-liberation_fonts /usr/share/fonts/liberation
python3 setup.py mathjax
python3 setup.py gui
}
check() {
python3 -m unittest discover
}
package() {
# needed for zsh
mkdir -p "$pkgdir"/usr/share/zsh/site-functions
python3 setup.py install \
--staging-root="$pkgdir"/usr \
--no-compile \
--system-plugins-location=/usr/share/calibre/system-plugins
cp -a man-pages/ "$pkgdir"/usr/share/man
rm -r "$pkgdir"/usr/share/calibre/rapydscript/
python3 -m compileall -fq "$pkgdir"/usr
}
sha512sums="
ee654260d7047f0579a659b8907439a407fb561affcef84141126840452e7b98d10bb5e0a69e0cc809d9ba68729570900a0e7251f18b2056a94b0213880f1363 calibre-7.12.0.tar.xz
eb8e7ce40ff8b8daf6e7e55a5dff8ec4dff06c45744266bb48b3194e92ab1196bc91468203e3c2ca1e5144166a7d6be90e6cf0253513e761b56a4c85be4c2c76 0001-calibre-no-update.patch
"

View file

@ -1,20 +1,19 @@
# Maintainer: Antoine Martin (ayakael) <dev@ayakael.net>
# Contributor: Antoine Martin (ayakael) <dev@ayakael.net>
# Maintainer: Antoine Martin (ayakael) <dev@ayakael.net>
pkgname=caprine
pkgver=2.59.1
pkgrel=0
pkgver=2.60.1
pkgrel=1
pkgdesc="Elegant Facebook Messenger desktop app"
arch="x86_64 aarch64" # bloced by electron
arch="x86_64 aarch64" # blocked by electron
url="https://github.com/sindresorhus/caprine"
license="MIT"
depends="electron"
makedepends="npm findutils coreutils"
options="!check"
options="!check" # No test suite
source="
$pkgname-$pkgver.tar.gz::https://github.com/sindresorhus/caprine/archive/refs/tags/v$pkgver.tar.gz
caprine.desktop
caprine.js
caprine.sh
"
build() {
@ -27,7 +26,7 @@ build() {
}
package() {
local appdir=/usr/lib/$pkgname
local appdir=/usr/lib/caprine
install -d "$pkgdir"$appdir
cp -r ./* "$pkgdir"$appdir
@ -35,13 +34,13 @@ package() {
install -dm755 "$pkgdir/usr/share/pixmaps"
install -m644 build/icon.png "$pkgdir/usr/share/pixmaps/$pkgname.png"
install -Dm755 "$srcdir"/$pkgname.js "$pkgdir"/usr/bin/$pkgname
install -Dm644 "$srcdir"/$pkgname.desktop \
"$pkgdir"/usr/share/applications/$pkgname.desktop
install -Dm755 "$srcdir"/caprine.sh "$pkgdir"/usr/bin/caprine
install -Dm644 "$srcdir"/caprine.desktop \
"$pkgdir"/usr/share/applications/caprine.desktop
install -dm755 "$pkgdir"/usr/share/licenses/$pkgname
ln -s "$(realpath -m --relative-to=/usr/share/licenses/$pkgname $appdir/license)" \
"$pkgdir"/usr/share/licenses/$pkgname
install -dm755 "$pkgdir"/usr/share/licenses/caprine
ln -s "$(realpath -m --relative-to=/usr/share/licenses/caprine $appdir/license)" \
"$pkgdir"/usr/share/licenses/caprine
# Clean up
rm -r "$pkgdir"$appdir/build
@ -49,7 +48,7 @@ package() {
rm -r "$pkgdir"$appdir/tsconfig.json
find "$pkgdir"$appdir \
-name "package.json" \
-exec sed -e "s|$srcdir/$pkgname|$appdir|" \
-exec sed -e "s|$srcdir/caprine|$appdir|" \
-i {} \; \
-or -name ".*" -prune -exec rm -r '{}' \; \
-or -name "bin" -prune -exec rm -r '{}' \; \
@ -60,7 +59,7 @@ package() {
-or -name "test" -prune -exec rm -r '{}' \;
}
sha512sums="
a525bafb6a53dd2dbdfc4b9b3e96d3939d93be950a3287f2a5ef6465d5a6b64ecda79b6d393023d067f939e1a6e85debc35f83bbb1f758011db9d94dd9ff8a72 caprine-2.59.1.tar.gz
0df7f233c91f5a044dcffde94b976c6ad71e6d355518615c48cd825a249c01d63f455de31ece69193a66ca0fd8157506f9b88088da1bd47fc75e9d3800784ed0 caprine-2.60.1.tar.gz
a469e3bea24926119e51642b777ef794c5fa65421107903f967c36d81bbb1adb3d52469ce3a3301b2c890f1aa53ab989ded22a7c6e811fb8cf0a582dbd835e19 caprine.desktop
44280c62ce43bdafa8528729371fccb16b8a0e3db7aca28d5c157ae0144dca5fbb023b8883b561955aa28ab62e967f2674d8c6bcaff186e2cdd0e7ba8beab9ac caprine.js
3ad8994c1a0417e73d622587769e527b4236a32c1a89442ff76413b75b4392d667c9e2908979b453e5926e54db6d94b31625340c5a94e84e91ea77f56feae778 caprine.sh
"

View file

@ -0,0 +1,2 @@
#!/bin/sh
/usr/bin/electron "/usr/lib/caprine"

View file

@ -1,8 +1,8 @@
# Maintainer: Antoine Martin (ayakael) <dev@ayakael.net>
# Contributor: Antoine Martin (ayakael) <dev@ayakael.net>
pkgname=freetube
pkgver=0.20.0
pkgrel=1
pkgver=0.21.3
pkgrel=0
pkgdesc="An open source desktop YouTube player built with privacy in mind."
arch="x86_64 aarch64" # blocked by electron
license="AGPL-3.0-only"
@ -50,7 +50,7 @@ package() {
}
sha512sums="
b82cdaff82d7bd325f3127794160382c97be3b72c5ef4bb3f327a8ada6b609043bc30a1f6af59c38e23237aac7d8b6ea2685c22aa82469c8d08b96cb839a3099 freetube-0.20.0.tar.gz
22e5ab677cd442d50237b2d62534698d8ad73a37e1731003dc23c4ea3da992b3cae936f0bb3a0a86cd4b7fba731c9fa53276cb0a6cd5bab213ff2a6c9006cb05 freetube-0.21.3.tar.gz
2ce2effc794bb663789cefe968b5899122127983dbfa1b240aa33a2be383720b18204e6d01b4a550df72956f02b6636b79c93a58f470a970b09b770f5b8f2fc4 freetube.sh
d27cb896b65a7e8d52ffe86e5f74eed72b6cf976b28e1a13012d34c7eceba5ff6f20298017738dfa93c0336ffa52b8ee4da7e06b02747062898db7e678819526 tasje-dotdash.patch
"

View file

@ -0,0 +1,93 @@
# Contributor: Rogério da Silva Yokomizo <me@ro.ger.io>
# Contributor: Antoine Martin (ayakael) <dev@ayakael.net>
# Maintainer: Rogério da Silva Yokomizo <me@ro.ger.io>
pkgname=looking-glass
_gittag=b7_git20240607
pkgver=7b_git20240607
pkgrel=0
pkgdesc="Allows the use of a KVM configured for VGA PCI Pass-through without an attached physical monitor, keyboard or mouse"
url="https://looking-glass.io/"
arch="x86_64"
license="GPL-2.0-or-later"
makedepends="
cmake
fontconfig-dev
libsamplerate-dev
libx11-dev
libxcursor-dev
libxfixes-dev
libxi-dev
libxinerama-dev
libxkbcommon-dev
libxpresent-dev
libxscrnsaver-dev
nettle-dev
obs-studio-dev
pipewire-dev
pulseaudio-dev
samurai
spice-dev
wayland-dev
wayland-protocols
"
source="$pkgname-$_gittag.tar.gz::https://lab.ilot.io/mirrors/looking-glass/-/releases/$_gittag/downloads/tarball/looking-glass-$_gittag.tar.gz
missing-includes.patch
obs-plugins-lib.patch
werror.patch
"
subpackages="$pkgname-obs $pkgname-module"
builddir="$srcdir/$pkgname-$_gittag"
options="!check" # There are no tests nor --version.
build() {
cmake -S client -B build-client -G Ninja \
-DENABLE_BACKTRACE=OFF \
-DOPTIMIZE_FOR_NATIVE=OFF \
-DCMAKE_INSTALL_PREFIX=/usr
cmake -S obs -B build-obs -G Ninja \
-DENABLE_BACKTRACE=OFF \
-DOPTIMIZE_FOR_NATIVE=OFF \
-DCMAKE_INSTALL_PREFIX=/usr
cmake --build build-client
cmake --build build-obs
}
package() {
DESTDIR="$pkgdir" cmake --install build-client
DESTDIR="$pkgdir" cmake --install build-obs
}
module() {
pkgdesc="Looking Glass kernel module (AKMS)"
depends="akms"
install_if="looking-glass=$pkgver-r$pkgrel"
_modver=$(awk -F "=" '{if($1=="PACKAGE_VERSION"){print $2}}' src/looking-glass-B6/module/dkms.conf | tr -d '"')
install -Dm644 "$builddir"/module/Makefile "$subpkgdir"/usr/src/looking-glass/Makefile
install -Dm644 "$builddir"/module/kvmfr* "$subpkgdir"/usr/src/looking-glass/.
cat ->> "$subpkgdir"/usr/src/looking-glass/AKMBUILD <<EOF
modname=kvmfr
modver=$pkgver
built_modules='kvmfr.ko'
EOF
chmod -R u=rwX,go=rX-w "$subpkgdir"/usr/src/looking-glass
mkdir -p "$subpkgdir"/etc/udev/rules.d
echo 'SUBSYSTEM=="kvmfr", OWNER="root", GROUP="kvm", MODE="0660"' > "$subpkgdir"/etc/udev/rules.d/99-kvmfr.rules
}
obs() {
pkgdesc="$pkgdesc (obs plugin)"
amove usr/lib/obs-plugins
}
sha512sums="
959f49c91dc7bb06dfae890547bfbd1c02bd4154f4ba1c898a12d15a3579658d65fcb9fc4b951c04180e17fc9151e551858e0fb60f20e3f1a72d19b86c7dc3db looking-glass-b7_git20240607.tar.gz
6d2449764a8316dd3c1b5cc0aa552671068f89ed2f95297f3c5256af8529b93e5ec7af8f979bd2e744fd09b11063e8a93f3ed26284f0e49294e467ca10f6e772 missing-includes.patch
33c5463412a16691f47d7833ebf81d7cf20c560a077dca141dcc9f02a5d6dfb676e483835f39a06012b114be9f509dda4614fe253bb1c72a0142e82dc265a5ca obs-plugins-lib.patch
b952d1fd284aed15bcfe7990f160dec3a4565fb5833ce339920f62de6bb46fbc09265a0a79fe80d212eecc6a1813614e1e193a8846c37e2afd18431dc3a89ca4 werror.patch
"

View file

@ -0,0 +1,2 @@
#!/bin/sh
exec /usr/sbin/akms uninstall kvmfr

View file

@ -0,0 +1,92 @@
--- a/repos/PureSpice/src/agent.c
+++ b/repos/PureSpice/src/agent.c
@@ -31,6 +31,7 @@ Place, Suite 330, Boston, MA 02111-1307
#include <unistd.h>
#include <stdio.h>
#include <stdlib.h>
+#include <string.h>
#include <assert.h>
#include <sys/ioctl.h>
--- a/repos/PureSpice/src/channel_cursor.c
+++ b/repos/PureSpice/src/channel_cursor.c
@@ -25,6 +25,7 @@ Place, Suite 330, Boston, MA 02111-1307
#include "channel_cursor.h"
#include <stdlib.h>
+#include <string.h>
#include "messages.h"
--- a/repos/PureSpice/src/channel_display.c
+++ b/repos/PureSpice/src/channel_display.c
@@ -19,6 +19,7 @@ Place, Suite 330, Boston, MA 02111-1307
#include "purespice.h"
#include <stdlib.h>
+#include <string.h>
#include "ps.h"
#include "log.h"
--- a/repos/PureSpice/src/channel_inputs.c
+++ b/repos/PureSpice/src/channel_inputs.c
@@ -25,6 +25,7 @@ Place, Suite 330, Boston, MA 02111-1307
#include "messages.h"
#include <stdlib.h>
+#include <string.h>
const SpiceLinkHeader * channelInputs_getConnectPacket(void)
{
--- a/repos/PureSpice/src/channel_main.c
+++ b/repos/PureSpice/src/channel_main.c
@@ -24,6 +24,7 @@ Place, Suite 330, Boston, MA 02111-1307
#include "messages.h"
#include <stdlib.h>
+#include <string.h>
struct ChannelMain
{
--- a/repos/PureSpice/src/channel_playback.c
+++ b/repos/PureSpice/src/channel_playback.c
@@ -26,6 +26,8 @@ Place, Suite 330, Boston, MA 02111-1307
#include "messages.h"
+#include <string.h>
+
const SpiceLinkHeader * channelPlayback_getConnectPacket(void)
{
typedef struct
--- a/repos/PureSpice/src/channel_record.c
+++ b/repos/PureSpice/src/channel_record.c
@@ -26,6 +26,8 @@ Place, Suite 330, Boston, MA 02111-1307
#include "messages.h"
+#include <string.h>
+
const SpiceLinkHeader * channelRecord_getConnectPacket(void)
{
typedef struct
--- a/repos/PureSpice/src/log.c
+++ b/repos/PureSpice/src/log.c
@@ -25,6 +25,7 @@ Place, Suite 330, Boston, MA 02111-1307
#include <stdarg.h>
#include <stdio.h>
+#include <string.h>
static void log_stdout(const char * file, unsigned int line,
const char * function, const char * format, ...)
--- a/repos/PureSpice/src/ps.c
+++ b/repos/PureSpice/src/ps.c
@@ -37,6 +37,7 @@ Place, Suite 330, Boston, MA 02111-1307
#include <unistd.h>
#include <stdio.h>
#include <stdlib.h>
+#include <string.h>
#include <assert.h>
#include <errno.h>

View file

@ -0,0 +1,25 @@
From e32b292cc1ba089db6ed28e4d5eb0fc8cc4c2235 Mon Sep 17 00:00:00 2001
From: esi <git@esibun.net>
Date: Fri, 12 May 2023 16:28:01 -0400
Subject: [PATCH] [module] Fix build on Linux 6.4 (fixes #1075)
---
module/dkms.conf | 2 +-
module/kvmfr.c | 4 ++++
2 files changed, 5 insertions(+), 1 deletion(-)
diff --git a/module/kvmfr.c b/module/kvmfr.c
index ca0cca685..c711e000e 100644
--- a/module/kvmfr.c
+++ b/module/kvmfr.c
@@ -539,7 +539,11 @@ static int __init kvmfr_module_init(void)
if (kvmfr->major < 0)
goto out_free;
+#if LINUX_VERSION_CODE < KERNEL_VERSION(6, 4, 0)
kvmfr->pClass = class_create(THIS_MODULE, KVMFR_DEV_NAME);
+#else
+ kvmfr->pClass = class_create(KVMFR_DEV_NAME);
+#endif
if (IS_ERR(kvmfr->pClass))
goto out_unreg;

View file

@ -0,0 +1,23 @@
From 7305ce36af211220419eeab302ff28793d515df2 Mon Sep 17 00:00:00 2001
From: Geoffrey McRae <geoff@hostfission.com>
Date: Fri, 7 Jun 2024 19:01:38 +1000
Subject: [PATCH] [module] fix build on linux 6.10
Fixes #1124 - Thanks @pongo1231
---
module/dkms.conf | 2 +-
module/kvmfr.c | 1 +
2 files changed, 2 insertions(+), 1 deletion(-)
diff --git a/module/kvmfr.c b/module/kvmfr.c
index b5acd18de..c99a5d79c 100644
--- a/module/kvmfr.c
+++ b/module/kvmfr.c
@@ -30,6 +30,7 @@
#include <linux/highmem.h>
#include <linux/memremap.h>
#include <linux/version.h>
+#include <linux/vmalloc.h>
#include <asm/io.h>

View file

@ -0,0 +1,12 @@
--- a/obs/CMakeLists.txt
+++ b/obs/CMakeLists.txt
@@ -84,7 +84,8 @@ target_link_libraries(looking-glass-obs
)
install(TARGETS looking-glass-obs
- LIBRARY DESTINATION ${OBS_PLUGIN_PREFIX}/${CMAKE_PROJECT_NAME}/bin/${OBS_PLUGIN_DIR}
+ # LIBRARY DESTINATION ${OBS_PLUGIN_PREFIX}/${CMAKE_PROJECT_NAME}/bin/${OBS_PLUGIN_DIR}
+ LIBRARY DESTINATION /usr/lib/obs-plugins
)
feature_summary(WHAT ENABLED_FEATURES DISABLED_FEATURES)

View file

@ -0,0 +1,24 @@
diff --git a/client/CMakeLists.txt b/client/CMakeLists.txt
index 836f814..7047365 100644
--- a/client/CMakeLists.txt
+++ b/client/CMakeLists.txt
@@ -68,7 +68,6 @@ add_compile_options(
"-Wno-unused-parameter"
"$<$<COMPILE_LANGUAGE:C>:-Wstrict-prototypes>"
"$<$<C_COMPILER_ID:GNU>:-Wimplicit-fallthrough=2>"
- "-Werror"
"-Wfatal-errors"
"-ffast-math"
"-fdata-sections"
diff --git a/obs/CMakeLists.txt b/obs/CMakeLists.txt
index 0491e65..60b37ff 100644
--- a/obs/CMakeLists.txt
+++ b/obs/CMakeLists.txt
@@ -18,7 +18,6 @@ add_feature_info(ENABLE_BACKTRACE ENABLE_BACKTRACE "Backtrace support.")
add_compile_options(
"-Wall"
- "-Werror"
"-Wfatal-errors"
"-ffast-math"
"-fdata-sections"

View file

@ -0,0 +1,46 @@
# Contributor: Francesco Colista <fcolista@alpinelinux.org>
# Maintainer: Francesco Colista <fcolista@alpinelinux.org>
pkgname=py3-apsw
_pkgname=apsw
pkgver=3.45.2.0
pkgrel=1
pkgdesc="Another Python SQLite Wrapper"
url="https://github.com/rogerbinns/apsw"
arch="all"
license="Zlib"
depends="python3"
makedepends="
python3-dev
py3-gpep517
py3-setuptools
py3-wheel
sqlite-dev
"
subpackages="$pkgname-pyc"
source="$pkgname-$pkgver.zip::https://github.com/rogerbinns/apsw/releases/download/$pkgver/apsw-$pkgver.zip
detect-sqlite-config.patch
"
builddir="$srcdir/$_pkgname-$pkgver"
build() {
gpep517 build-wheel \
--wheel-dir .dist \
--output-fd 3 3>&1 >&2
}
check() {
python3 -m venv --clear --without-pip --system-site-packages .testenv
.testenv/bin/python3 -m installer .dist/*.whl
.testenv/bin/python3 setup.py build_test_extension test
}
package() {
python3 -m installer -d "$pkgdir" \
.dist/*.whl
}
sha512sums="
0260f6479d5f1188ad172dfc0dd7e4a03c9d809d2f80c2296e587a19286681bb2ce759b0bd19ec6957e2902f18729b7e79410e4db79dff9918089f57dd510828 py3-apsw-3.45.2.0.zip
8f3957bd6fecb5660a7cab367043e4ccdacd87d8963bbe41cc3d525265de28f08aa207099658d785be29c5c90b818c1418f766995cd780d02b8e36252a389758 detect-sqlite-config.patch
"

View file

@ -0,0 +1,8 @@
diff --git a/setup.apsw b/setup.apsw
index 68dedb9..3ceb10b 100644
--- a/setup.apsw
+++ b/setup.apsw
@@ -1 +1,3 @@
# You can put ini format directives here in addition to command line flags
+[build_ext]
+use_system_sqlite_config = True

View file

@ -1,38 +0,0 @@
# Contributor: Leonardo Arena <rnalrd@alpinelinux.org>
# Maintainer: Will Sinatra <wpsinatra@gmail.com>
pkgname=py3-django-debug-toolbar
_pkgname=django-debug-toolbar
pkgver=4.3
pkgrel=1
pkgdesc="Configurable set of panels that display various debug information about the current request/response"
options="!check" # Requires unpackaged Selenium python3 module
url="https://github.com/jazzband/django-debug-toolbar"
arch="noarch"
license="BSD-3-Clause"
depends="py3-django py3-sqlparse"
makedepends="
py3-gpep517
py3-hatchling
"
# options="!check" #no testsuite
subpackages="$pkgname-pyc"
source="$pkgname-$pkgver.tar.gz::https://github.com/jazzband/$_pkgname/archive/$pkgver.tar.gz"
builddir="$srcdir"/$_pkgname-$pkgver
replaces="py-django-debug-toolbar" # Backwards compatibility
provides="py-django-debug-toolbar=$pkgver-r$pkgrel" # Backwards compatibility
build() {
gpep517 build-wheel \
--wheel-dir .dist \
--output-fd 3 3>&1 >&2
}
package() {
python3 -m installer -d "$pkgdir" \
.dist/*.whl
}
sha512sums="
caa8563d38e8c96305828b7a07006ce2ee0afae099d70d75d332f2196fc3ffcf7f3848440ea22c00f2b918029477672a172e30714f6f73a630404175aef3b925 py3-django-debug-toolbar-4.3.tar.gz
"

View file

@ -0,0 +1,41 @@
# Maintainer: Cowington Post <cowingtonpost@gmail.com>
pkgname=py3-html5-parser
pkgver=0.4.12
pkgrel=1
pkgdesc="Fast C based HTML 5 parsing for python"
url="https://github.com/kovidgoyal/html5-parser"
arch="all"
license="Apache-2.0"
depends="py3-lxml py3-chardet"
makedepends="
libxml2-dev
py3-gpep517
py3-setuptools
py3-wheel
python3-dev
"
checkdepends="py3-beautifulsoup4"
subpackages="$pkgname-pyc"
source="https://github.com/kovidgoyal/html5-parser/archive/v$pkgver/py3-html5-parser-$pkgver.tar.gz"
builddir="$srcdir/html5-parser-$pkgver"
build() {
gpep517 build-wheel \
--wheel-dir .dist \
--output-fd 3 3>&1 >&2
}
check() {
python3 -m venv --clear --without-pip --system-site-packages .testenv
.testenv/bin/python3 -m installer .dist/*.whl
.testenv/bin/python3 setup.py test
}
package() {
python3 -m installer -d "$pkgdir" \
.dist/*.whl
}
sha512sums="
d2c031225b74d01a1ae3455837ac09e9afad8a4ec6ab1b8f66cbea8a86188db271a72570ef06e05ac56d369b41d97fc6f382455e25ca346a1897f62a3696a252 py3-html5-parser-0.4.12.tar.gz
"

View file

@ -1,14 +1,15 @@
# Contributor: lauren n. liberda <lauren@selfisekai.rocks>
# Maintainer: lauren n. liberda <lauren@selfisekai.rocks>
# Contributor: Antoine Martin (ayakael) <dev@ayakael.net>
# Maintainer: Antoine Martin (ayakael) <dev@ayakael.net>
pkgname=signal-desktop
pkgver=7.9.0
pkgver=7.29.0
pkgrel=0
pkgdesc="A messaging app for simple private communication with friends"
url="https://github.com/signalapp/Signal-Desktop/"
# same as electron
# build failure
#arch="aarch64 x86_64"
arch="aarch64 x86_64"
license="AGPL-3.0-only"
_llvmver=18
depends="
electron
font-barlow
@ -24,7 +25,7 @@ makedepends="
cargo
cargo-auditable
cbindgen
clang-dev
clang$_llvmver-dev
cmake
crc32c-dev
dav1d-dev
@ -43,8 +44,9 @@ makedepends="
libsecret-dev
libvpx-dev
libwebp-dev
lld
llvm-dev
libxml2-dev
lld$_llvmver
llvm$_llvmver-dev
mesa-dev
nodejs
npm
@ -63,25 +65,17 @@ makedepends="
"
options="net !check"
# follow signal-desktop package.json -> @signalapp/libsignal-client
_libsignalver=0.45.0
# follow signal-desktop package.json -> @signalapp/ringrtc
_ringrtcver=2.41.0
# follow ringrtc (on version above) -> config/version.properties -> webrtc.version
# downloading tarball generated with abuild snapshot (with gclient dependencies fetched)
_webrtcver=6261g
# follow @signalapp/better-sqlite3 (on version in package.json) -> deps/download.js -> TOKENIZER_VERSION
# last bsqlite version: 8.7.1
# use _check_depends to validate this
_libsignalver=0.58.0
_ringrtcver=2.48.3
_webrtcver=6613c
_stokenizerver=0.2.1
source="
https://github.com/signalapp/Signal-Desktop/archive/refs/tags/v$pkgver/Signal-Desktop-$pkgver.tar.gz
https://github.com/signalapp/libsignal/archive/refs/tags/v$_libsignalver/libsignal-$_libsignalver.tar.gz
https://github.com/signalapp/ringrtc/archive/refs/tags/v$_ringrtcver/ringrtc-$_ringrtcver.tar.gz
https://ab-sn.lnl.gay/webrtc-$_webrtcver.tar.zst
https://ayakael.net/api/packages/mirrors/generic/webrtc/$_webrtcver/webrtc-$_webrtcver.tar.zst
https://github.com/signalapp/Signal-FTS5-Extension/archive/refs/tags/v$_stokenizerver/stokenizer-$_stokenizerver.tar.gz
bettersqlite-use-system-sqlcipher.patch
@ -90,7 +84,7 @@ source="
signal-update-links.patch
signal-show-window-please.patch
ringrtc-webrtc-renamed.patch
webrtc-shared-like-my-wife.patch
webrtc-shared-libs.patch
webrtc-compiler.patch
webrtc-gcc13.patch
@ -118,6 +112,25 @@ export CARGO_PROFILE_RELEASE_STRIP="symbols"
export YARN_CACHE_FOLDER="$srcdir/.yarn"
_check_depends() {
# _libsignalver: follow signal-desktop package.json -> @signalapp/libsignal-client
# _ringrtcver: follow signal-desktop package.json -> @signalapp/ringrtc
# _webrtcver: follow ringrtc (on version above) -> config/version.properties -> webrtc.version
# downloading tarball generated with abuild snapshot (with gclient dependencies fetched)
# _stokenizerver: follow @signalapp/better-sqlite3 (on version in package.json) -> deps/download.js -> TOKENIZER_VERSION
local _libsignalver=$(curl --silent https://raw.githubusercontent.com/signalapp/Signal-Desktop/v$pkgver/package-lock.json | grep "@signalapp/libsignal-client\": \"" | awk '{print $2}' | tr -d ',' | tr -d '"' | head -n 1)
local _ringrtcver=$(curl --silent https://raw.githubusercontent.com/signalapp/Signal-Desktop/v$pkgver/package-lock.json | grep "@signalapp/ringrtc\": \"" | awk '{print $2}' | tr -d ',' | tr -d '"' | head -n 1)
local _bsqlitever=$(curl --silent https://raw.githubusercontent.com/signalapp/Signal-Desktop/v$pkgver/package-lock.json | grep "@signalapp/better-sqlite3\": \"" | awk '{print $2}' | tr -d ',' | tr -d '"' | head -n 1)
local _webrtcver=$(curl --silent https://raw.githubusercontent.com/signalapp/ringrtc/v$_ringrtcver/config/version.properties | awk -F '=' '{if($1 == "webrtc.version"){print $2}}' | head -n 1)
local _stokenizerver=$(curl --silent https://raw.githubusercontent.com/signalapp/better-sqlite3/v$_bsqlitever/deps/download.js | grep "const TOKENIZER_VERSION" | awk '{print $4}' | tr -d "'" | tr -d ';' | head -n 1)
echo _libsignalver=$_libsignalver
echo _ringrtcver=$_ringrtcver
echo _webrtcver=$_webrtcver
echo _stokenizerver=$_stokenizerver
}
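A hedged way to run the helper above from a checkout of this aport (an editor's sketch; assumes curl and network access, since the function queries the upstream package-lock.json and version files):

```shell
# Sketch: source the APKBUILD in a throwaway subshell and print the pinned versions.
( . ./APKBUILD; _check_depends )
```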
# webrtc only, the other dependencies are fine with tarballs
_distbucket="sakamoto/lnl-aports-snapshots/"
snapshot() {
@ -195,8 +208,7 @@ prepare() {
done
msg "Installing signal-desktop JS dependencies"
echo 'ignore-engines true' > .yarnrc
yarn --ignore-scripts --frozen-lockfile
npm ci --ignore-scripts
(
cd "$srcdir"/webrtc-$_webrtcver
@ -239,6 +251,7 @@ prepare() {
\! -path "*third_party/$_lib/google/*" \
\! -path './base/third_party/icu/*' \
\! -path './third_party/libxml/*' \
\! -path './third_party/re2/*' \
\! -path './third_party/pdfium/third_party/freetype/include/pstables.h' \
\! -path './third_party/harfbuzz-ng/utils/hb_scoped.h' \
\! -path './third_party/crashpad/crashpad/third_party/zlib/zlib_crashpad.h' \
@ -283,7 +296,7 @@ prepare() {
cd "$srcdir"/ringrtc-$_ringrtcver/src/node
msg "Installing ringrtc js dependencies"
yarn --frozen-lockfile --ignore-scripts
npm ci --ignore-scripts
)
(
@ -359,10 +372,10 @@ build() {
(
cd "$srcdir"/ringrtc-$_ringrtcver/src/node
msg "Building ringrtc JS glue code"
yarn build
npm run build
msg "Cleaning dev dependencies for ringrtc"
yarn --ignore-scripts --frozen-lockfile --production
npm prune --ignore-scripts --omit=dev
)
# module on npm intentionally unbuildable: https://github.com/signalapp/libsignal/issues/464#issuecomment-1160665052
@ -370,17 +383,17 @@ build() {
cd "$srcdir"/libsignal-$_libsignalver/node
msg "Building libsignal"
yarn node-gyp configure --nodedir=/usr/include/electron/node_headers --build-from-source
yarn node-gyp build --nodedir=/usr/include/electron/node_headers --build-from-source
node-gyp configure --nodedir=/usr/include/electron/node_headers --build-from-source
node-gyp build --nodedir=/usr/include/electron/node_headers --build-from-source
mkdir -p prebuilds/linux-$chromium_arch
mv build/Release/libsignal_client_linux_$chromium_arch.node prebuilds/linux-$chromium_arch/node.napi.node
msg "Building libsignal glue code"
yarn tsc
npm run tsc
msg "Cleaning dev dependencies for libsignal"
yarn --ignore-scripts --frozen-lockfile --production
npm prune --ignore-scripts --omit=dev
)
(
@ -392,8 +405,8 @@ build() {
)
# from package.json postinstall
yarn build:acknowledgments
yarn patch-package
npm run build:acknowledgments
npm exec patch-package
rm -rf node_modules/dtrace-provider
# get esbuild installed (needed for next step)
@ -403,20 +416,10 @@ build() {
NODE_ENV=production \
SIGNAL_ENV=production \
NODE_OPTIONS=--openssl-legacy-provider \
yarn build:dev
npm run build:dev
# purge non-production deps
yarn install --ignore-scripts --frozen-lockfile --production
# XXX: the previous step undoes the patches. and removes the patch applier.
# please force me to just implement packaging without dev modules in tasje. -lnl
for x in patches/*.patch; do
# some of these patches are made for devDependencies
if [ -d "$(grep -Eo 'node_modules/(@[a-z\d_-]+/)?[a-z\d_-]+/' "$x" | head -n1)" ]; then
msg "$x"
patch -Np1 -i ./"$x"
fi
done
npm prune --ignore-scripts --omit=dev
# use our libsignal
rm -rf node_modules/@signalapp/libsignal-client/
@ -451,7 +454,7 @@ build() {
check() {
# tests run against downloaded build of electron for glibc, probably can be patched
yarn test
npm run test
}
package() {
@ -472,18 +475,18 @@ package() {
}
sha512sums="
53ef9feccdbe1c52eee88d2e2ed337746dcaf0fd18ee0d462ba3faaef02b38e9ba7269857e975f241c719d6750ce01fc42b0d90bbd9ef7bbee14b9b4540adbb8 Signal-Desktop-7.9.0.tar.gz
70f2cb7d05e019235cd044c401bcf072a934fdfea4a161ef5be988d2e3932ba5233110b4b06525e6f33ea9cad036def442e70adad15eab883903d9246969896e libsignal-0.45.0.tar.gz
3adccc33d4efa29e003175d0e00cceb169426a73f467ea32406e9cd721c72aeaa45a7816985e484b8adceb2de2a6405f306f6d609b43a25c950b18dd49a14476 ringrtc-2.41.0.tar.gz
6dc8c709986816e724a57d056a165cf72db70644593e8de8e3026e511d1f8f3d6d5e171d500cfabe760309c5a81795b667b4399c9885be30163d326cbc82c1b4 webrtc-6261g.tar.zst
b97155dc2ca70436d6fdf15fff059f905f065738a288679aeee2199d43824206f4c7e4bae0c228b55b4cc76b7e00875b738ee4f7dea3c2a5414acec3e208aa1e Signal-Desktop-7.29.0.tar.gz
6fb62213d8177ac5abe83ea71a18ea4b1c7b323983c41087166658fe9c47c1fd39e5323ca6acefe3db2a9a9376b6f385b5f2c006154da3ab705741d848b28943 libsignal-0.58.0.tar.gz
6777354b60650c6c3d359714f3aff92a315996f3725ba05c74ed054d3c4ba5506406b30c940853b5ba426ac0271cdb4dd930a759c570f486a70e1f5adc5a2aae ringrtc-2.48.3.tar.gz
fe04fcf13f55b124f03ce9d516b1c53fc4f20c6a016819c62eeaa0500eda92c5a0c0d7dc5d1e360a27691dfd404c254e91bed9fb25d0fc40a27795c1b674a82e webrtc-6613c.tar.zst
84a1f2fc29262a12842e94698d124a85b823128e72a493b0be8ea92fbb72c5c268499f4a6827cdedaae06ec73cce4039a39fe5c5d536cbef330e59ba0183da28 stokenizer-0.2.1.tar.gz
be5b4e823543b79175a12314f10c6326d9f0d59f470136962daed4665887006acc05b48b40dc1b67747396d8f6f7d23be298c1e110ccdd35ff9b09d5e6b80bab bettersqlite-use-system-sqlcipher.patch
92de6fc7cc5f2b6d65bedbd74cc733dd86dafc9cbfb9b727c3267aef63a71a07247cde9b163c68fddfeb9096dcd7f554d36d0b2de078d8905e3825645ddbd6eb libsignal-auditable.patch
2e5fadff725f1d62e7134c8929c672ec88cae602b065480f1b799d34160daa0cb1ad0f5511e60676f81464ae8752c3bec7b3d7bc5a432533be004b4d20ac32c2 libsignal-auditable.patch
152435231cdcf52a17a9e24aadf95d77511258e818172941ba074a73a90a541f0136feb58868674f2bcb19191a6d12933fe6cd5baf3ee99e508915c72523163b signal-disable-updates.patch
d50eb5724502df9ea4d795db8cfc27af767c25168d7db2af512e615be7cc2ca290210a9ae78e1abb153c0198677e858ad3d74926c958099d0319295e7d9e7f1d signal-update-links.patch
646d303fe58cae3f0896ae0275a66695b902fae6ddde7c568cc9798157dee9f45ceff907bc951fadc4c511d512a73d114b4e4f7c8914e2311c63929d29e1621a signal-show-window-please.patch
ab51b8fdcda1d8811213d2c5d8cb5d8457b478a02e23ce40f36b38ec56d45a3bd7a2d184720c27046f98a27771551cfad93c1290fe93856cc02695d318b33e47 ringrtc-webrtc-renamed.patch
9d92389637cdda83a0a7039fa6c52516d7bc491d0b1e42d5374b9d1f4fa7b9c930642f2dca896da17a2dc3344fa1bb97434c8dddd0539a4fedfd0dec809fc875 webrtc-shared-like-my-wife.patch
bfc8acdd13aa48d29c7657311733cc9d33c4899782efbd1ef6d25ad1698be4de7cc67e829324bc0309715d69ae37ea9f782cf54887317e817213e110d73d68e7 signal-show-window-please.patch
b11fdd930943ca327650e4738ed85cd6b5eea779455a5895bccebba98e449bafc6b0f09bcf4545f2b2e16644355664e9768dd6d4d62f87619207c430367f72c5 ringrtc-webrtc-renamed.patch
0888673ba687747beda61cd50efbc25095f4a3d26f1dd58bf003e3a0bf1d302c3f2ebd1deecf630fbf04aedb7b8cd409e9efda4d1e6fda63234c9a9b9755bff4 webrtc-shared-libs.patch
e07ae8544988d402aaf0fbd95ea36a64c94c59566c561132578aa6dcf8ff11a34058530e64dc204e5cadc2482f1401e74b32384a144e5e08017c663d0cf7c2fc webrtc-compiler.patch
88515d8b8cc82355c9f9b0f44fac83b7ff149b13e9fb102fd46036ec5234cfb2385fa5ad58a0520ee604b93dc4ddd6ae18a7005978ef207841645724ef7a9749 webrtc-gcc13.patch
87534e7b5ad7365509eab75629e6bd1a9ed61ee92f7e358405a0abaf0df57de14623fb3894eb082f8785422e5c087e1c50f9e2e5cafbb2529591fd7bf447f7f5 signal-desktop

View file

@ -1,11 +1,13 @@
--- ./node/build_node_bridge.py.orig
+++ ./node/build_node_bridge.py
@@ -63,7 +63,7 @@
diff --git a/node/build_node_bridge.py.orig b/node/build_node_bridge.py
index e75c2d0..3bdb328 100755
--- a/node/build_node_bridge.py.orig
+++ b/node/build_node_bridge.py
@@ -97,7 +97,7 @@ def main(args: Optional[List[str]] = None) -> int:
if 'npm_config_libsignal_debug_level_logs' not in os.environ:
features.append('log/release_max_level_info')
out_dir = options.out_dir.strip('"') or os.path.join('build', configuration_name)
- cmdline = ['cargo', 'build', '--target', cargo_target, '-p', 'libsignal-node', '--features', 'testing-fns']
+ cmdline = ['cargo', 'auditable', 'build', '--target', cargo_target, '-p', 'libsignal-node', '--features', 'testing-fns']
- cmdline = ['cargo', 'build', '--target', cargo_target, '-p', 'libsignal-node', '--features', ','.join(features)]
+ cmdline = ['cargo', 'auditable', 'build', '--target', cargo_target, '-p', 'libsignal-node', '--features', ','.join(features)]
if configuration_name == 'Release':
cmdline.append('--release')
print("Running '%s'" % (' '.join(cmdline)))

View file

@ -1,19 +1,23 @@
--- ./src/rust/build.rs.orig
+++ ./src/rust/build.rs
@@ -41,15 +41,15 @@
if cfg!(feature = "native") {
if let Ok(out_dir) = out_dir {
println!(
- "cargo:rustc-link-search=native={}/{}/obj/",
+ "cargo:rustc-link-search=native={}/{}/",
out_dir, build_type,
);
- println!("cargo:rerun-if-changed={}/{}/obj/", out_dir, build_type,);
+ println!("cargo:rerun-if-changed={}/{}/", out_dir, build_type,);
} else {
println!("cargo:warning=No WebRTC output directory (OUTPUT_DIR) defined!");
}
@@ -79,6 +79,7 @@
if cfg!(feature = "native") {
let webrtc_dir = if cfg!(feature = "prebuilt_webrtc") {
+ panic!("trying to download prebuild webrtc");
if let Err(e) = fs::create_dir_all(&out_dir) {
panic!("Failed to create webrtc out directory: {:?}", e);
}
@@ -86,12 +87,12 @@
// Ignore build type since we only have release prebuilts
format!("{}/release/obj/", out_dir)
} else {
- format!("{}/{}/obj", out_dir, build_type)
+ format!("{}/{}", out_dir, build_type)
};
println!("cargo:rerun-if-changed={}", webrtc_dir);
println!("cargo:rerun-if-changed={}", config_dir());
println!("cargo:rustc-link-search=native={}", webrtc_dir);
- println!("cargo:rustc-link-lib=webrtc");
+ println!("cargo:rustc-link-lib=dylib=signaldeswebrtc");

View file

@ -1,11 +1,13 @@
--- ./app/main.ts.orig
+++ ./app/main.ts
@@ -721,7 +721,7 @@
const titleBarOverlay = await getTitleBarOverlay();
diff --git a/app/main.ts.orig b/app/main.ts
index aa1bec8..bd7c1d5 100644
--- a/app/main.ts.orig
+++ b/app/main.ts
@@ -690,7 +690,7 @@ async function createWindow() {
: DEFAULT_HEIGHT;
const windowOptions: Electron.BrowserWindowConstructorOptions = {
- show: false,
+ show: true,
width: DEFAULT_WIDTH,
height: DEFAULT_HEIGHT,
width,
height,
minWidth: MIN_WIDTH,

View file

@ -0,0 +1,64 @@
--- ./BUILD.gn.orig
+++ ./BUILD.gn
@@ -38,7 +38,7 @@
# 'ninja default' and then 'ninja all', the second build should do no work.
group("default") {
testonly = true
- deps = [ ":webrtc" ]
+ deps = [ ":signaldeswebrtc" ]
if (rtc_build_examples) {
deps += [ "examples" ]
}
@@ -464,7 +464,7 @@
if (!build_with_chromium) {
# Target to build all the WebRTC production code.
- rtc_static_library("webrtc") {
+ rtc_shared_library("signaldeswebrtc") {
# Only the root target and the test should depend on this.
visibility = [
"//:default",
@@ -472,7 +472,6 @@
]
sources = []
- complete_static_lib = true
suppressed_configs += [ "//build/config/compiler:thin_archive" ]
defines = []
diff --git a/third_party/googletest/BUILD.gn.orig b/third_party/googletest/BUILD.gn
index 14089f0..b7dc621 100644
--- a/third_party/googletest/BUILD.gn.orig
+++ b/third_party/googletest/BUILD.gn
@@ -48,7 +48,6 @@ config("gtest_config") {
configs = [
"//third_party/abseil-cpp:absl_include_config",
- "//third_party/re2:re2_config",
]
}
diff --git a/third_party/googletest/BUILD.gn.orig b/third_party/googletest/BUILD.gn
index b7dc621..367f929 100644
--- a/third_party/googletest/BUILD.gn.orig
+++ b/third_party/googletest/BUILD.gn
@@ -133,7 +133,6 @@ source_set("gtest") {
# googletest only needs `absl`, but this makes gn check happier.
deps = [ "//third_party/abseil-cpp:absl_full" ]
- public_deps = [ "//third_party/re2" ]
if (is_nacl || !build_with_chromium) {
defines += [ "GTEST_DISABLE_PRINT_STACK_TRACE" ]
sources -= [
diff --git a/third_party/fuzztest/BUILD.gn.orig b/third_party/fuzztest/BUILD.gn
index 57ee790..ba1d297 100644
--- a/third_party/fuzztest/BUILD.gn.orig
+++ b/third_party/fuzztest/BUILD.gn
@@ -309,7 +309,6 @@ source_set("fuzztest_internal") {
# For RE2 mutators. It's questionable whether we want to pull this library
# into every fuzztest target, but this is the approach used in other
# fuzztest contexts so we'll do the same
- "//third_party/re2",
# For protobuf mutators
"$protobuf_target_prefix:protobuf_lite",

View file

@ -1,28 +0,0 @@
--- ./BUILD.gn.orig
+++ ./BUILD.gn
@@ -38,7 +38,7 @@
# 'ninja default' and then 'ninja all', the second build should do no work.
group("default") {
testonly = true
- deps = [ ":webrtc" ]
+ deps = [ ":signaldeswebrtc" ]
if (rtc_build_examples) {
deps += [ "examples" ]
}
@@ -464,7 +464,7 @@
if (!build_with_chromium) {
# Target to build all the WebRTC production code.
- rtc_static_library("webrtc") {
+ rtc_shared_library("signaldeswebrtc") {
# Only the root target and the test should depend on this.
visibility = [
"//:default",
@@ -472,7 +472,6 @@
]
sources = []
- complete_static_lib = true
suppressed_configs += [ "//build/config/compiler:thin_archive" ]
defines = []

View file

@ -1,32 +1,29 @@
# Maintainer: Antoine Martin <dev@ayakael.net>
# Maintainer: Antoine Martin (ayakael) <dev@ayakael.net>
# Contributor: Antoine Martin (ayakael) <dev@ayakael.net>
pkgname=firefly-iii-plaid-connector
pkgver=0.3.1
pkgrel=5
pkgdesc='The Free Software Media System'
arch='i686 x86_64'
url='https://gitlab.com/GeorgeHahn/firefly-plaid-connector'
license='MIT'
options="!strip !check"
#depends='firefly-iii dotnet31-runtime'
makedepends='dotnet31-sdk'
source="firefly-plaid-connector-$pkgver.tar.gz::https://gitlab.com/GeorgeHahn/firefly-plaid-connector/-/archive/v${pkgver}/firefly-plaid-connector-v${pkgver}.tar.gz"
builddir="${srcdir}/firefly-plaid-connector-v${pkgver}"
pkgrel=7
pkgdesc="The Free Software Media System"
#arch="x86_64"
url="https://gitlab.com/GeorgeHahn/firefly-plaid-connector"
license="MIT"
options="!check"
depends="firefly-iii dotnet6-runtime"
makedepends="dotnet6-sdk"
source="firefly-plaid-connector-$pkgver.tar.gz::https://gitlab.com/GeorgeHahn/firefly-plaid-connector/-/archive/v$pkgver/firefly-plaid-connector-v$pkgver.tar.gz net6-support.patch"
builddir="$srcdir/firefly-plaid-connector-v$pkgver"
build(){
cd "${builddir}"
# Disable dotnet telemetry
export DOTNET_CLI_TELEMETRY_OPTOUT=1
# publish app and libraries
dotnet publish --configuration Release --output "$PWD"/publish
build() {
dotnet publish --configuration Release --output ./publish --use-current-runtime --no-self-contained
}
package() {
mkdir -p "${pkgdir}"/var/lib
cp -r "${builddir}"/publish "$pkgdir"/var/lib/firefly-plaid-connector
mkdir -p "$pkgdir"/usr/lib
cp -r "$builddir"/publish "$pkgdir"/usr/lib/firefly-plaid-connector
}
sha512sums="57a64673bf2e8cae00cb215e1dc90eb02bddf50010835a9318f55f83313c00f19d6c8d8af65e2739b0fb6fd4522a2327941bdc7d11cbe59c9537ff6c1575765e firefly-plaid-connector-0.3.1.tar.gz"
sha512sums="
57a64673bf2e8cae00cb215e1dc90eb02bddf50010835a9318f55f83313c00f19d6c8d8af65e2739b0fb6fd4522a2327941bdc7d11cbe59c9537ff6c1575765e firefly-plaid-connector-0.3.1.tar.gz
f795fe58659763082e3f2bba0e6e2a70c4732bc6b402a4e586104bf09525ffca1d3586acda43ccba3b71d15e1a0a62794574f72a2fc6cd3d1905dcb2e8782dc2 net6-support.patch
"

View file

@ -1,33 +0,0 @@
# Contributor: Fabio Ribeiro <fabiorphp@gmail.com>
# Maintainer: Andy Postnikov <apostnikov@gmail.com>
pkgname=php82-pecl-inotify
_extname=inotify
pkgver=3.0.0
pkgrel=0
pkgdesc="Inotify bindings for PHP 8.2"
url="https://pecl.php.net/package/inotify"
arch="all"
license="PHP-3.01"
depends="php82-common"
makedepends="php82-dev"
source="php-pecl-$_extname-$pkgver.tgz::https://pecl.php.net/get/$_extname-$pkgver.tgz"
builddir="$srcdir"/$_extname-$pkgver
build() {
phpize82
./configure --prefix=/usr --with-php-config=php-config82
make
}
check() {
make NO_INTERACTION=1 REPORT_EXIT_STATUS=1 test
}
package() {
make INSTALL_ROOT="$pkgdir" install
local _confdir="$pkgdir"/etc/php82/conf.d
install -d $_confdir
echo "extension=$_extname" > $_confdir/70_$_extname.ini
}
sha512sums="f8b29f8611f16b92136ab8de89181c254bba1abee1e61cac2344440567a3155aae4b9b54b10fdb1b0254fd7a96da8c14b7dc5c9f7f08a03db30ab1645aca1eee php-pecl-inotify-3.0.0.tgz"

View file

@ -1,246 +0,0 @@
# Contributor: Antoine Martin (ayakael) <dev@ayakael.net>
# Maintainer: Antoine Martin (ayakael) <dev@ayakael.net>
pkgname=authentik
pkgver=2024.4.2
pkgrel=0
pkgdesc="An open-source Identity Provider focused on flexibility and versatility"
url="https://github.com/goauthentik/authentik"
# s390x: missing py3-celery py3-flower and py3-kombu
# armhf/armv7/x86: out of memory error when building goauthentik
# ppc64le: not supported by Rollup build
arch="aarch64 x86_64"
license="MIT"
depends="
libcap-setcap
nginx
postgresql
procps
pwgen
py3-aiohttp
py3-aiosignal
py3-amqp
py3-anyio
py3-asgiref
py3-asn1
py3-asn1crypto
py3-async-timeout
py3-attrs
py3-autobahn
py3-automat
py3-bcrypt
py3-billiard
py3-cachetools
py3-cbor2
py3-celery
py3-certifi
py3-cffi
py3-channels
py3-channels_redis
py3-charset-normalizer
py3-click
py3-click-didyoumean
py3-click-plugins
py3-click-repl
py3-codespell
py3-colorama
py3-constantly
py3-cparser
py3-cryptography
py3-dacite
py3-daphne
py3-dateutil
py3-deepmerge
py3-defusedxml
py3-deprecated
py3-dnspython
py3-django
py3-django-filter
py3-django-guardian
py3-django-model-utils
py3-django-otp
py3-django-prometheus
py3-django-redis
py3-django-rest-framework
py3-django-rest-framework-guardian
py3-docker-py
py3-dotenv
py3-dumb-init
py3-duo_client
py3-drf-spectacular
py3-email-validator
py3-facebook-sdk
py3-flower
py3-frozenlist
py3-geoip2
py3-google-auth
py3-gunicorn
py3-h11
py3-httptools
py3-humanize
py3-hyperlink
py3-idna
py3-incremental
py3-inflection
py3-jsonschema
py3-jsonpatch
py3-jwt
py3-kombu
py3-kubernetes
py3-ldap3
py3-lxml
py3-maxminddb
py3-msgpack
py3-multidict
py3-oauthlib
py3-opencontainers
py3-openssl
py3-packaging
py3-paramiko
py3-parsing
py3-prometheus-client
py3-prompt_toolkit
py3-psycopg
py3-psycopg-c
py3-pycryptodome
py3-pydantic-scim
py3-pynacl
py3-pyrsistent
py3-python-jwt
py3-redis
py3-requests
py3-requests-oauthlib
py3-rsa
py3-sentry-sdk
py3-service_identity
py3-setuptools
py3-six
py3-sniffio
py3-sqlparse
py3-structlog
py3-swagger-spec-validator
py3-tornado
py3-twilio
py3-twisted
py3-txaio
py3-typing-extensions
py3-tz
py3-ua-parser
py3-uritemplate
py3-urllib3-secure-extra
py3-uvloop
py3-vine
py3-watchdog
py3-watchfiles
py3-wcwidth
py3-webauthn
py3-websocket-client
py3-websockets
py3-wrapt
py3-wsproto
py3-xmlsec
py3-yaml
py3-yarl
py3-zope-interface
py3-zxcvbn
redis
uvicorn
"
makedepends="go npm"
# checkdepends scooped up by poetry due to number
checkdepends="poetry py3-coverage"
# tests disabled for now
options="!check"
install="$pkgname.post-install $pkgname.post-upgrade $pkgname.pre-install"
source="
$pkgname-$pkgver.tar.gz::https://github.com/goauthentik/authentik/archive/refs/tags/version/$pkgver.tar.gz
authentik.openrc
authentik-worker.openrc
authentik-ldap.openrc
authentik-ldap.conf
authentik-manage.sh
root-settings-csrf_trusted_origins.patch
"
builddir="$srcdir/"authentik-version-$pkgver
subpackages="$pkgname-openrc $pkgname-doc"
pkgusers="authentik"
pkggroups="authentik"
export GOPATH=$srcdir/go
export GOCACHE=$srcdir/go-build
export GOTMPDIR=$srcdir
build() {
msg "Building authentik-ldap"
go build -o ldap cmd/ldap/main.go
msg "Building authentik-proxy"
go build -o proxy cmd/proxy/main.go
msg "Building authentik-radius"
go build -o radius cmd/proxy/main.go
msg "Building authentik-server"
go build -o server cmd/server/*.go
msg "Building authentik-web"
cd web
npm ci --no-audit
npm run build
cd ..
msg "Building website"
cd website
npm ci --no-audit
npm run build
}
package() {
msg "Packaging $pkgname"
mkdir -p "$pkgdir"/usr/share/webapps/authentik/web
mkdir -p "$pkgdir"/usr/share/webapps/authentik/website
mkdir -p "$pkgdir"/var/lib/authentik
mkdir -p "$pkgdir"/usr/share/doc
mkdir -p "$pkgdir"/usr/bin
cp -dr "$builddir"/authentik "$pkgdir"/usr/share/webapps/authentik
cp -dr "$builddir"/web/dist "$pkgdir"/usr/share/webapps/authentik/web/dist
cp -dr "$builddir"/web/authentik "$pkgdir"/usr/share/webapps/authentik/web/authentik
cp -dr "$builddir"/website/build "$pkgdir"/usr/share/doc/authentik
cp -dr "$builddir"/tests "$pkgdir"/usr/share/webapps/authentik/tests
cp -dr "$builddir"/lifecycle "$pkgdir"/usr/share/webapps/authentik/lifecycle
cp -dr "$builddir"/locale "$pkgdir"/usr/share/webapps/authentik/locale
cp -dr "$builddir"/blueprints "$pkgdir"/var/lib/authentik/blueprints
install -Dm755 "$builddir"/manage.py "$pkgdir"/usr/share/webapps/authentik/manage.py
install -Dm755 "$builddir"/server "$pkgdir"/usr/share/webapps/authentik/server
ln -s "/etc/authentik/config.yml" "$pkgdir"/usr/share/webapps/authentik/local.env.yml
install -Dm755 "$builddir"/proxy "$pkgdir"/usr/bin/authentik-proxy
install -Dm755 "$builddir"/ldap "$pkgdir"/usr/bin/authentik-ldap
install -Dm755 "$builddir"/radius "$pkgdir"/usr/bin/authentik-radius
install -Dm755 "$srcdir"/$pkgname.openrc \
"$pkgdir"/etc/init.d/$pkgname
install -Dm755 "$srcdir"/$pkgname-worker.openrc \
"$pkgdir"/etc/init.d/$pkgname-worker
install -Dm755 "$srcdir"/$pkgname-ldap.openrc \
"$pkgdir"/etc/init.d/$pkgname-ldap
install -Dm640 "$srcdir"/$pkgname-ldap.conf \
"$pkgdir"/etc/conf.d/$pkgname-ldap
install -Dm640 "$builddir"/authentik/lib/default.yml \
"$pkgdir"/etc/authentik/config.yml
chown root:www-data "$pkgdir"/etc/authentik/config.yml
sed -i 's|cert_discovery_dir.*|cert_discovery_dir: /var/lib/authentik/certs|' "$pkgdir"/etc/authentik/config.yml
sed -i 's|blueprints_dir.*|blueprints_dir: /var/lib/authentik/blueprints|' "$pkgdir"/etc/authentik/config.yml
sed -i 's|template_dir.*|template_dir: /var/lib/authentik/templates|' "$pkgdir"/etc/authentik/config.yml
printf "\ncsrf:\n trusted_origins: ['auth.example.com']" >> "$pkgdir"/etc/authentik/config.yml
printf "\nsecret_key: '@@SECRET_KEY@@'" >> "$pkgdir"/etc/authentik/config.yml
# Install wrapper script to /usr/bin.
install -m755 -D "$srcdir"/authentik-manage.sh "$pkgdir"/usr/bin/authentik-manage
}
sha512sums="
58642829e320b1480706363712a73c82e55f79ed6451d5db82482c51b9c1ee13b9999caf152da0944ca277344d38c99a5636a7e9e718f858ca558f17ae9da104 authentik-2024.4.2.tar.gz
4defb4fe3a4230f4aa517fbecd5e5b8bcef2a64e1b40615660ae9eec33597310a09df5e126f4d39ce7764bd1716c0a7040637699135c103cbc1879593c6c06f1 authentik.openrc
5d7f28bf5a9f358a0fc3634b2bac6d070c276c3f8181d26fa7e94a17503a4d54556bf7c3207ccd6cb924b81754ed965795d5e2a8aa1af409fd9e32d390ec4cf5 authentik-worker.openrc
351e6920d987861f8bf0d7ab2f942db716a8dbdad1f690ac662a6ef29ac0fd46cf817cf557de08f1c024703503d36bc8b46f0d9eb1ecaeb399dce4c3bb527d17 authentik-ldap.openrc
89ee5f0ffdade1c153f3a56ff75b25a7104aa81d8c7a97802a8f4b0eab34850cee39f874dabe0f3c6da3f71d6a0f938f5e8904169e8cdd34d407c8984adee6b0 authentik-ldap.conf
d2df285e09d05bb78b17cdbf156cb19883764d0ae61d4c8faed599c015277b75c3f51e5fcb35e01fc25d5847f667ff2089d5e6c48b85a3a6b4523278b2eea89d authentik-manage.sh
a50ceddb239851d869212cd5064df117ab977d0e01bf0bc5fa7b5fa6e6428a4af59f802ca223a7e840753f86bfdb0df17d330f9ba4cbaa30a167f51d8aecb9bd root-settings-csrf_trusted_origins.patch
"

View file

@ -1,3 +0,0 @@
AUTHENTIK_HOST=https://example.com
AUTHENTIK_TOKEN=your-authentik-token
AUTHENTIK_INSECURE=true

View file

@ -1,24 +0,0 @@
#!/sbin/openrc-run
name="$RC_SVCNAME"
cfgfile="/etc/conf.d/$RC_SVCNAME"
pidfile="/run/$RC_SVCNAME.pid"
working_directory="/usr/share/webapps/authentik"
command="/usr/bin/authentik-ldap"
command_user="authentik"
command_group="authentik"
start_stop_daemon_args=""
command_background="yes"
output_log="/var/log/authentik/$RC_SVCNAME.log"
error_log="/var/log/authentik/$RC_SVCNAME.err"
depend() {
need authentik
}
start_pre() {
cd "$working_directory"
checkpath --directory --owner $command_user:$command_group --mode 0775 \
/var/log/authentik
export AUTHENTIK_HOST AUTHENTIK_TOKEN AUTHENTIK_INSECURE AUTHENTIK_DEBUG
}

View file

@ -1,12 +0,0 @@
#!/bin/sh
BUNDLE_DIR='/usr/share/webapps/authentik'
cd $BUNDLE_DIR
if [ "$(id -un)" != 'authentik' ]; then
exec su authentik -c '"$0" "$@"' -- ./manage.py "$@"
else
exec ./manage.py "$@"
fi

View file

@ -1,32 +0,0 @@
#!/sbin/openrc-run
name="$RC_SVCNAME"
cfgfile="/etc/conf.d/$RC_SVCNAME.conf"
pidfile="/run/$RC_SVCNAME.pid"
working_directory="/usr/share/webapps/authentik"
command="celery"
command_args="-A authentik.root.celery worker -Ofair --max-tasks-per-child=1 --autoscale 3,1 -E -B -s /tmp/celerybeat-schedule -Q authentik,authentik_scheduled,authentik_events"
command_user="authentik"
command_group="authentik"
start_stop_daemon_args=""
command_background="yes"
output_log="/var/log/authentik/$RC_SVCNAME.log"
error_log="/var/log/authentik/$RC_SVCNAME.err"
depend() {
need redis
need postgresql
}
start_pre() {
cd "$working_directory"
checkpath --directory --owner $command_user:$command_group --mode 0775 \
/var/log/authentik \
/var/lib/authentik/certs \
/var/lib/authentik/blueprints
}
stop_pre() {
ebegin "Killing child processes"
kill $(ps -o pid= --ppid $(cat $pidfile)) || true
}

View file

@ -1,30 +0,0 @@
#!/sbin/openrc-run
name="$RC_SVCNAME"
cfgfile="/etc/conf.d/$RC_SVCNAME.conf"
pidfile="/run/$RC_SVCNAME.pid"
working_directory="/usr/share/webapps/authentik"
command="/usr/share/webapps/authentik/server"
command_user="authentik"
command_group="authentik"
start_stop_daemon_args=""
command_background="yes"
output_log="/var/log/authentik/$RC_SVCNAME.log"
error_log="/var/log/authentik/$RC_SVCNAME.err"
depend() {
need redis
need postgresql
}
start_pre() {
cd "$working_directory"
checkpath --directory --owner $command_user:$command_group --mode 0775 \
/var/log/authentik \
/var/lib/authentik/certs
}
stop_pre() {
ebegin "Killing child processes"
kill $(ps -o pid= --ppid $(cat $pidfile)) || true
}

View file

@ -1,39 +0,0 @@
#!/bin/sh
set -eu
group=authentik
config_file='/etc/authentik/config.yml'
setcap 'cap_net_bind_service=+ep' /usr/share/webapps/authentik/server
if [ $(grep '@@SECRET_KEY@@' "$config_file") ]; then
echo "* Generating random secret in $config_file" >&2
secret_key="$(pwgen -s 50 1)"
sed -i "s|@@SECRET_KEY@@|$secret_key|" "$config_file"
chown root:$group "$config_file"
fi
if [ "${0##*.}" = 'post-upgrade' ]; then
cat >&2 <<-EOF
*
* To finish Authentik upgrade run:
*
* authentik-manage migrate
*
EOF
else
cat >&2 <<-EOF
*
* 1. Adjust settings in /etc/authentik/config.yml.
*
* 2. Create database for Authentik:
*
* psql -c "CREATE ROLE authentik PASSWORD 'top-secret' INHERIT LOGIN;"
* psql -c "CREATE DATABASE authentik OWNER authentik ENCODING 'UTF-8';"
*
* 3. Run "authentik-manage migrate"
* 4. Setup admin user at https://<your server>/if/flow/initial-setup/
*
EOF
fi

View file

@ -1 +0,0 @@
authentik.post-install

View file

@ -1,26 +0,0 @@
#!/bin/sh
# It's very important to set user/group correctly.
authentik_dir='/var/lib/authentik'
if ! getent group authentik 1>/dev/null; then
echo '* Creating group authentik' 1>&2
addgroup -S authentik
fi
if ! id authentik 2>/dev/null 1>&2; then
echo '* Creating user authentik' 1>&2
adduser -DHS -G authentik -h "$authentik_dir" -s /bin/sh \
-g "added by apk for authentik" authentik
passwd -u authentik 1>/dev/null # unlock
fi
if ! id -Gn authentik | grep -Fq redis; then
echo '* Adding user authentik to group redis' 1>&2
addgroup authentik redis
fi
exit 0

View file

@ -1,12 +0,0 @@
diff --git a/authentik/root/settings.py.orig b/authentik/root/settings.py
index ebfc471..ce1ef3b 100644
--- a/authentik/root/settings.py.orig
+++ b/authentik/root/settings.py
@@ -56,6 +56,7 @@ AUTH_USER_MODEL = "authentik_core.User"
CSRF_COOKIE_NAME = "authentik_csrf"
CSRF_HEADER_NAME = "HTTP_X_AUTHENTIK_CSRF"
+CSRF_TRUSTED_ORIGINS = CONFIG.get("csrf.trusted_origins")
LANGUAGE_COOKIE_NAME = "authentik_language"
SESSION_COOKIE_NAME = "authentik_session"
SESSION_COOKIE_DOMAIN = CONFIG.get("cookie_domain", None)

View file

@ -1,29 +0,0 @@
#!/usr/bin/electron
const name = 'caprine';
const {app} = require('electron');
const fs = require('fs');
const path = require('path');
// Change command name.
const fd = fs.openSync('/proc/self/comm', fs.constants.O_WRONLY);
fs.writeSync(fd, name);
fs.closeSync(fd);
// Remove first command line argument (/usr/bin/electron).
process.argv.splice(0, 1);
// Set application paths.
const appPath = path.join(path.dirname(__dirname), 'lib', name);
const packageJson = require(path.join(appPath, 'package.json'));
const productName = packageJson.productName;
app.setAppPath(appPath);
app.setDesktopName(name + '.desktop');
app.setName(productName);
app.setPath('userCache', path.join(app.getPath('cache'), productName));
app.setPath('userData', path.join(app.getPath('appData'), productName));
app.setVersion(packageJson.version);
// Run the application.
require('module')._load(appPath, module, true);

View file

@ -1,29 +0,0 @@
# Maintainer: Antoine Martin (ayakael) <dev@ayakael.net>
# Contributor: Antoine Martin (ayakael) <dev@ayakael.net>
pkgname=firefly-iii-plaid-connector
pkgver=0.3.1
pkgrel=7
pkgdesc="The Free Software Media System"
#arch="x86_64"
url="https://gitlab.com/GeorgeHahn/firefly-plaid-connector"
license="MIT"
options="!check"
depends="firefly-iii dotnet6-runtime"
makedepends="dotnet6-sdk"
source="firefly-plaid-connector-$pkgver.tar.gz::https://gitlab.com/GeorgeHahn/firefly-plaid-connector/-/archive/v$pkgver/firefly-plaid-connector-v$pkgver.tar.gz net6-support.patch"
builddir="$srcdir/firefly-plaid-connector-v$pkgver"
build() {
dotnet publish --configuration Release --output ./publish --use-current-runtime --no-self-contained
}
package() {
mkdir -p "$pkgdir"/usr/lib
cp -r "$builddir"/publish "$pkgdir"/usr/lib/firefly-plaid-connector
}
sha512sums="
57a64673bf2e8cae00cb215e1dc90eb02bddf50010835a9318f55f83313c00f19d6c8d8af65e2739b0fb6fd4522a2327941bdc7d11cbe59c9537ff6c1575765e firefly-plaid-connector-0.3.1.tar.gz
f795fe58659763082e3f2bba0e6e2a70c4732bc6b402a4e586104bf09525ffca1d3586acda43ccba3b71d15e1a0a62794574f72a2fc6cd3d1905dcb2e8782dc2 net6-support.patch
"

View file

@ -5,11 +5,11 @@ pkgname=firefly-iii
pkgver=5.7.18
pkgrel=0
pkgdesc="PHP personal finances manager"
#arch="noarch"
arch="noarch"
url="https://github.com/firefly-iii/firefly-iii"
license="AGPL-3.0-only"
options="!check" # No testsuite
_php=php82
_php=php83
_php_mods="-intl -curl -bcmath -zip -gd -xml -mbstring -ldap -session -fileinfo -simplexml -sodium -tokenizer -xmlwriter -dom -pdo"
depends="$_php ${_php_mods//-/$_php-}"
makedepends="composer"

View file

@ -1,82 +0,0 @@
# Maintainer: Antoine Martin (ayakael) <dev@ayakael.net>
# Contributor: Antoine Martin (ayakael) <dev@ayakael.net>
pkgname=freescout
pkgver=1.8.139
pkgrel=0
pkgdesc="Free self-hosted help desk & shared mailbox"
arch="noarch"
url="freescout.net"
license="AGPL-3.0"
_php=php83
_php_mods="-fpm -mbstring -xml -imap -zip -gd -curl -intl -tokenizer -pdo_pgsql -openssl -session -iconv -fileinfo -dom -pcntl"
depends="$_php ${_php_mods//-/$_php-} nginx postgresql pwgen"
makedepends="composer pcre"
install="$pkgname.post-install $pkgname.post-upgrade $pkgname.pre-install"
source="
$pkgname-$pkgver.tar.gz::https://github.com/freescout-helpdesk/freescout/archive/refs/tags/$pkgver.tar.gz
freescout.nginx
freescout-manage.sh
rename-client-to-membre-fr-en.patch
"
pkgusers="freescout"
pkggroups="freescout"
build() {
composer install --ignore-platform-reqs
}
package() {
local logdir="/var/log/$pkgname"
local datadir="/var/lib/$pkgname"
local wwwdir="/usr/share/webapps/$pkgname"
local confdir="/etc/$pkgname"
# Make directories
install -dm 755 \
"$pkgdir"/$wwwdir \
"$pkgdir"/$confdir \
"$pkgdir"/$logdir \
"$pkgdir"/$datadir
# Copy and ln operations
cp $builddir/* -R "$pkgdir"/$wwwdir/.
for i in storage/app storage/framework bootstrap/cache \
public/css/builds public/js/builds public/modules Modules; do
if [ -d "$pkgdir"$wwwdir/$i ]; then
if [ ! -d "$pkgdir"/$datadir/${i%/*} ]; then
mkdir -p "$pkgdir"/$datadir/${i%/*}
fi
mv "$pkgdir"$wwwdir/$i "$pkgdir"/$datadir/$i
else
mkdir -p "$pkgdir"/$datadir/$i
fi
ln -s $datadir/$i "$pkgdir"/$wwwdir/$i
done
ln -s /etc/freescout/freescout.conf "$pkgdir"/usr/share/webapps/freescout/.env
ln -s $wwwdir/storage/app/public "$pkgdir"/$wwwdir/public/storage
# log dir
rm -R "$pkgdir"/$wwwdir/storage/logs
ln -s "$logdir" "$pkgdir"/$wwwdir/storage/logs
# Permission settings
chown -R freescout:www-data "$pkgdir"/$datadir "$pkgdir"/$logdir
# config files
install -Dm644 "$srcdir"/freescout.nginx \
"$pkgdir"/etc/nginx/http.d/freescout.conf
install -Dm640 "$builddir"/.env.example \
"$pkgdir"/etc/freescout/freescout.conf
sed -i 's|APP_KEY.*|APP_KEY=@@SECRET_KEY@@|' "$pkgdir"/etc/freescout/freescout.conf
chown root:www-data "$pkgdir"/etc/freescout/freescout.conf
# Install wrapper script to /usr/bin.
install -m755 -D "$srcdir"/freescout-manage.sh "$pkgdir"/usr/bin/freescout-manage
}
sha512sums="
11d81fa670bd67a7db9f5bff3a067a1d1cf3c812a34c805a3fc83edc978ded3accc8334581eca1e73cf0ad95f8e289278add57de096528728e2989135b3057a3 freescout-1.8.139.tar.gz
e4af6c85dc12f694bef2a02e4664e31ed50b2c109914d7ffad5001c2bbd764ef25b17ecaa59ff55ef41bccf17169bf910d1a08888364bdedd0ecc54d310e661f freescout.nginx
7ce9b3ee3a979db44f5e6d7daa69431e04a5281f364ae7be23e5a0a0547f96abc858d2a8010346be2fb99bd2355fb529e7030ed20d54f310249e61ed5db4d0ba freescout-manage.sh
3416da98d71aea5a7093913ea34e783e21ff05dca90bdc5ff3d00c548db5889f6d0ec98441cd65ab9f590be5cd59fdd0d7f1c98b5deef7bb3adbc8db435ec9bf rename-client-to-membre-fr-en.patch
"

View file

@@ -1,11 +0,0 @@
#!/bin/sh
BUNDLE_DIR='/usr/share/webapps/freescout'
cd $BUNDLE_DIR
if [ "$(id -un)" != 'freescout' ]; then
exec su freescout -c '"$0" "$@"' -- php artisan "$@"
else
exec php artisan "$@"
fi
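Once installed as /usr/bin/freescout-manage, the wrapper is invoked just like artisan itself; for example, the maintenance commands listed in the post-install notes further down run through it:

# Examples (same commands the post-install message references):
freescout-manage freescout:clear-cache
freescout-manage migrate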

View file

@@ -1,56 +0,0 @@
server {
listen 80;
listen [::]:80;
server_name example.com www.example.com;
root /usr/share/webapps/freescout/public;
index index.php index.html index.htm;
error_log /var/log/freescout/web-server.log;
# Max. attachment size.
# It must be also set in PHP.ini via "upload_max_filesize" and "post_max_size" directives.
client_max_body_size 20M;
location / {
try_files $uri $uri/ /index.php?$query_string;
}
location ~ \.php$ {
fastcgi_split_path_info ^(.+\.php)(/.+)$;
fastcgi_pass unix:/run/php/php8.0-fpm.sock;
fastcgi_index index.php;
fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
include fastcgi_params;
}
# Uncomment this location if you want to improve attachments downloading speed.
# Also make sure to set APP_DOWNLOAD_ATTACHMENTS_VIA=nginx in the .env file.
#location ^~ /storage/app/attachment/ {
# internal;
# alias /var/www/html/storage/app/attachment/;
#}
location ~* ^/storage/attachment/ {
expires 1M;
access_log off;
try_files $uri $uri/ /index.php?$query_string;
}
location ~* ^/(?:css|js)/.*\.(?:css|js)$ {
expires 2d;
access_log off;
add_header Cache-Control "public, must-revalidate";
}
# The list should be in sync with /storage/app/public/uploads/.htaccess and /config/app.php
location ~* ^/storage/.*\.((?!(jpg|jpeg|jfif|pjpeg|pjp|apng|bmp|gif|ico|cur|png|tif|tiff|webp|pdf|txt|diff|patch|json|mp3|wav|ogg|wma)).)*$ {
add_header Content-disposition "attachment; filename=$2";
default_type application/octet-stream;
}
location ~* ^/(?:css|fonts|img|installer|js|modules|[^\\\]+\..*)$ {
expires 1M;
access_log off;
add_header Cache-Control "public";
}
location ~ /\. {
deny all;
}
}

View file

@@ -1,48 +0,0 @@
#!/bin/sh
set -eu
group=www-data
config_file='/etc/freescout/freescout.conf'
if [ $(grep '@@SECRET_KEY@@' "$config_file") ]; then
echo "* Generating random secret in $config_file" >&2
secret_key="$(freescout-manage key:generate --show)"
sed -i "s|@@SECRET_KEY@@|$secret_key|" "$config_file"
fi
if [ "${0##*.}" = 'post-upgrade' ]; then
cat >&2 <<-EOF
*
* To finish Freescout upgrade run:
*
* freescout-manage freescout:after-app-update
*
EOF
else
cat >&2 <<-EOF
*
* 1. Adjust settings in /etc/freescout/freescout.conf
*
* 2. Make sure cgi.fix_pathinfo=0 is set in /etc/php8x/php.ini
*
* 3. Create database for Freescout:
*
* psql -c "CREATE ROLE freescout PASSWORD 'top-secret' INHERIT LOGIN;"
* psql -c "CREATE DATABASE freescout OWNER freescout ENCODING 'UTF-8';"
*
* 4. Clear application cache and apply .env file changes:
*
* freescout-manage freescout:clear-cache
*
* 5. Create tables:
*
* freescout-manage migrate
*
* 6. Create admin user
*
* freescout-manage freescout:create-user
*
EOF
fi
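The ${0##*.} test above distinguishes upgrade from first install because the same script is also shipped as freescout.post-upgrade (see the one-line file below); the expansion strips everything up to the last dot of the script name. A minimal illustration (the path is hypothetical):

_0=/path/to/freescout.post-upgrade
echo "${_0##*.}"   # -> post-upgrade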

View file

@@ -1 +0,0 @@
freescout.post-install

View file

@@ -1,25 +0,0 @@
#!/bin/sh
freescout_dir='/var/lib/freescout'
if ! getent group freescout 1>/dev/null; then
echo '* Creating group freescout' 1>&2
addgroup -S freescout
fi
if ! id freescout 2>/dev/null 1>&2; then
echo '* Creating user freescout' 1>&2
adduser -DHS -G freescout -h "$freescout_dir" -s /bin/sh \
-g "added by apk for freescout" freescout
passwd -u freescout 1>/dev/null # unlock
fi
if ! id -Gn freescout | grep -Fq www-data; then
echo '* Adding user freescout to group www-data' 1>&2
addgroup freescout www-data
fi
exit 0

View file

@@ -1,220 +0,0 @@
diff --git a/resources/lang/en.json b/resources/lang/en.json
new file mode 100644
index 00000000..82d26052
--- /dev/null
+++ b/resources/lang/en.json
@@ -0,0 +1,32 @@
+{
+ ":person changed the customer to :customer": ":person changed the member to :customer",
+ ":person changed the customer to :customer in conversation #:conversation_number": ":person changed the member to :customer in conversation #:conversation_number",
+ "Auto reply to customer": "Auto reply to member",
+ "Change Customer": "Change Member",
+ "Change the customer to :customer_email?": "Change the member to :customer_email?",
+ "Create a new customer": "Create a new member",
+ "Customer": "Member",
+ "Customer Name": "Member Name",
+ "Customer Profile": "Member Profile",
+ "Customer changed": "Member changed",
+ "Customer saved successfully.": "Member saved successfully",
+ "Customer viewed :when": "Member viewed :when",
+ "Customers": "Members",
+ "Customers email this address for help (e.g. support@domain.com)": "Members email this address for help (e.g. support@domain.com)",
+ "Email :tag_email_begin:email:tag_email_end has been moved from another customer: :a_begin:customer:a_end.": "Email :tag_email_begin:email:tag_email_end has been moved from another member: :a_begin:customer:a_end.",
+ "Email to customer": "Email to member",
+ "Emails to Customers": "Emails to Members",
+ "Error sending email to customer": "Error sending email to member",
+ "Message not sent to customer": "Message not sent to member",
+ "Name that will appear in the <strong>From<\/strong> field when a customer views your email.": "Name that will appear in the <strong>From<\/strong> field when a member views your email.",
+ "No customers found": "No members found",
+ "No customers found. Would you like to create one?": "No members found. Would you like to create one?",
+ "Notify :person when a customer replies…": "Notify :person when a member replies…",
+ "Notify me when a customer replies…": "Notify me when a member replies…",
+ "Search for a customer by name or email": "Search for a member by name or email",
+ "Sending emails need to be configured for the mailbox in order to send emails to customers and support agents": "Sending emails need to be configured for the mailbox in order to send emails to members and support agents",
+ "This number is not visible to customers. It is only used to track conversations within :app_name": "This number is not visible to members. It is only used to track conversations within :app_name",
+ "This reply will go to the customer. :%switch_start%Switch to a note:switch_end if you are replying to :user_name.": "This reply will go to the member. :%switch_start%Switch to a note:switch_end if you are replying to :user_name.",
+ "This text will be added to the beginning of each email reply sent to a customer.": "This text will be added to the beginning of each email reply sent to a member.",
+ "When a customer emails this mailbox, application can send an auto reply to the customer immediately.<br\/><br\/>Only one auto reply is sent per new conversation.": "When a member emails this mailbox, application can send an auto reply to the member immediately.<br\/><br\/>Only one auto reply is sent per new conversation."
+}
\ No newline at end of file
diff --git a/resources/lang/fr.json.orig b/resources/lang/fr.json
index ff8d9d4..98d158f 100644
--- a/resources/lang/fr.json.orig
+++ b/resources/lang/fr.json
@@ -26,8 +26,8 @@
":person added a note to conversation #:conversation_number": ":person a ajouté une note à la conversation #:conversation_number",
":person assigned :assignee conversation #:conversation_number": ":person a assigné :assignee à la conversation #:conversation_number",
":person assigned to :assignee": ":person a assigné :assignee",
- ":person changed the customer to :customer": ":person a changé le client en :customer",
- ":person changed the customer to :customer in conversation #:conversation_number": ":person a changé le client en :customer dans la conversation #:conversation_number",
+ ":person changed the customer to :customer": ":person a changé le membre en :customer",
+ ":person changed the customer to :customer in conversation #:conversation_number": ":person a changé le membre en :customer dans la conversation #:conversation_number",
":person created a draft": ":person a créé un brouillon",
":person deleted": ":person supprimée",
":person edited :creator's draft": ":person a modifié brouillon de :creator",
@@ -112,7 +112,7 @@
"Auto Reply": "Réponse Automatique",
"Auto Reply status saved": "Statut de réponse automatique enregistré",
"Auto replies don't include your mailbox signature, so be sure to add your contact information if necessary.": "Les réponses automatiques n'incluent pas la signature de votre boîte aux lettres, assurez-vous d'ajouter vos coordonnées si nécessaire.",
- "Auto reply to customer": "Réponse automatique au client",
+ "Auto reply to customer": "Réponse automatique au membre",
"Back": "Retour",
"Back to folder": "Retour au dossier",
"Background Jobs": "Emplois d'arrière-plan",
@@ -123,10 +123,10 @@
"Cancel": "Annuler",
"Cc": "Cc",
"Change": "Modifier",
- "Change Customer": "Changer de client",
+ "Change Customer": "Changer de membre",
"Change address in mailbox settings": "Modifier l'adresse dans les paramètres de la boîte aux lettres",
"Change default redirect": "Modifier la redirection par défaut",
- "Change the customer to :customer_email?": "Changer le client en :customer_email ?",
+ "Change the customer to :customer_email?": "Changer le membre en :customer_email ?",
"Change your password": "Changer votre mot de passe",
"Chat": "Tchat",
"Check Connection": "Vérifier la connexion",
@@ -182,7 +182,7 @@
"Create a New User": "Créer un nouvel utilisateur",
"Create a Password": "Créer un mot de passe",
"Create a mailbox": "Créer une boîte de réception",
- "Create a new customer": "Créer un nouveau client",
+ "Create a new customer": "Créer un nouveau membre",
"Create symlink manually": "Créer un lien symbolique manuellement",
"Created At": "Créé à",
"Created by :person": "Créé par :person",
@@ -190,14 +190,14 @@
"Current Password": "Mot de passe actuel",
"Custom From Name": "Nom de l'expéditeur personnalisé",
"Custom Name": "Nom personnalisé",
- "Customer": "Client",
- "Customer Name": "Nom du client",
- "Customer Profile": "Profil client",
- "Customer changed": "Client changé",
- "Customer saved successfully.": "Client enregistré avec succès.",
- "Customer viewed :when": "Client vu :when",
- "Customers": "Clients",
- "Customers email this address for help (e.g. support@domain.com)": "Les clients utilisent cette adresse par e-mail pour obtenir de l'aide (par exemple, support@domain.com)",
+ "Customer": "Membre",
+ "Customer Name": "Nom du membre",
+ "Customer Profile": "Profil membre",
+ "Customer changed": "Membre changé",
+ "Customer saved successfully.": "Membre enregistré avec succès.",
+ "Customer viewed :when": "Membre vu :when",
+ "Customers": "Membres",
+ "Customers email this address for help (e.g. support@domain.com)": "Les membres utilisent cette adresse par e-mail pour obtenir de l'aide (par exemple, support@domain.com)",
"Daily": "Quotidien",
"Dashboard": "Tableau de bord",
"Date": "Date",
@@ -247,15 +247,15 @@
"Edit User": "Modifier l'utilisateur",
"Edited by :whom :when": "Édité par :whom :when",
"Email": "Email",
- "Email :tag_email_begin:email:tag_email_end has been moved from another customer: :a_begin:customer:a_end.": "Email :tag_email_begin:email:tag_email_end a été déplacé depuis un autre client : :a_begin:customer:a_end.",
+ "Email :tag_email_begin:email:tag_email_end has been moved from another customer: :a_begin:customer:a_end.": "Email :tag_email_begin:email:tag_email_end a été déplacé depuis un autre membre : :a_begin:customer:a_end.",
"Email Address": "Adresse e-mail",
"Email Alerts For Administrators": "Envoyez des alertes par e-mail aux administrateurs",
"Email Header": "En-tête de l'e-mail",
"Email Signature": "Signature e-mail",
"Email Template": "Modèle d'e-mail",
"Email passed for delivery. If you don't receive a test email, check your mail server logs.": "E-mail transmis pour livraison. Si vous ne recevez pas d'e-mail de test, consultez les journaux de votre serveur de messagerie.",
- "Email to customer": "Courriel au client",
- "Emails to Customers": "Emails aux clients",
+ "Email to customer": "Courriel au membre",
+ "Emails to Customers": "Emails aux membres",
"Empty Trash": "Vider la corbeille",
"Empty license key": "Clé de licence vide",
"Enable Auto Reply": "Activer la réponse automatique",
@@ -276,7 +276,7 @@
"Error occurred. Please try again later.": "Erreur est survenue. Veuillez réessayer plus tard.",
"Error occurred. Please try again or try another :%a_start%update method:%a_end%": "Erreur est survenue. Veuillez réessayer ou en essayer une autre :%a_start% méthode de mise à jour:%a_end%",
"Error sending alert": "Erreur lors de l'envoi de l'alerte",
- "Error sending email to customer": "Erreur lors de l'envoi d'un e-mail au client",
+ "Error sending email to customer": "Erreur lors de l'envoi d'un e-mail au membre",
"Error sending email to the user who replied to notification from wrong email": "Erreur lors de l'envoi d'un e-mail à l'utilisateur qui a répondu à la notification d'un mauvais e-mail",
"Error sending email to user": "Erreur lors de l'envoi d'un e-mail à l'utilisateur",
"Error sending invitation email to user": "Erreur lors de l'envoi d'un e-mail d'invitation à l'utilisateur",
@@ -419,7 +419,7 @@
"Message bounced (:link)": "Message renvoyé (:link)",
"Message cannot be empty": "Le message ne peut pas être vide",
"Message has been already sent. Please discard this draft.": "Le message a déjà été envoyé. Veuillez effacer ce brouillon.",
- "Message not sent to customer": "Message non envoyé au client",
+ "Message not sent to customer": "Message non envoyé au membre",
"Method": "Méthode",
"Migrate DB": "Migrer la base de données",
"Mine": "Mes conversations",
@@ -439,7 +439,7 @@
"My Apps": "Mes Applications",
"My open conversations": "Mes conversations ouvertes",
"Name": "Nom",
- "Name that will appear in the <strong>From<\/strong> field when a customer views your email.": "Nom qui apparaîtra dans le champ <strong>De<\/strong> lorsqu'un client consulte votre e-mail.",
+ "Name that will appear in the <strong>From<\/strong> field when a customer views your email.": "Nom qui apparaîtra dans le champ <strong>De<\/strong> lorsqu'un membre consulte votre e-mail.",
"New Conversation": "Nouvelle conversation",
"New Mailbox": "Nouvelle boîte de réception",
"New Password": "Nouveau mot de passe",
@@ -451,8 +451,8 @@
"Next active conversation": "Conversation active suivante",
"No": "Non",
"No activations left for this license key": "Il ne reste aucune activation pour cette clé de licence",
- "No customers found": "Aucun client trouvé",
- "No customers found. Would you like to create one?": "Aucun client trouvé. Souhaitez-vous en créer un?",
+ "No customers found": "Aucun membre trouvé",
+ "No customers found. Would you like to create one?": "Aucun membre trouvé. Souhaitez-vous en créer un?",
"No invite was found. Please contact your administrator to have a new invite email sent.": "Aucune invitation trouvée. Veuillez contacter votre administrateur pour qu'il envoie une nouvelle invitation par email.",
"Non-writable files found": "Fichiers non-inscriptibles trouvés",
"None": "Aucun",
@@ -471,10 +471,10 @@
"Notifications": "Notifications",
"Notifications saved successfully": "Notifications enregistrées",
"Notifications will start showing up here soon": "Les notifications commenceront bientôt à apparaître ici",
- "Notify :person when a customer replies…": "Avertir :person lorsqu'un client répond…",
+ "Notify :person when a customer replies…": "Avertir :person lorsqu'un membre répond…",
"Notify :person when another :app_name user replies or adds a note…": "Notifier :person quand un autre utilisateur :app_name répond ou ajoute une note…",
"Notify :person when…": "Avertir :person lorsque…",
- "Notify me when a customer replies…": "M'avertir lorsqu'un client répond…",
+ "Notify me when a customer replies…": "M'avertir lorsqu'un membre répond…",
"Notify me when another :app_name user replies or adds a note…": "M'avertir lorsqu'un autre utilisateur :app_name répond ou ajoute une note…",
"Notify me when…": "Prévenez-moi quand…",
"Number": "Numéro",
@@ -587,7 +587,7 @@
"Search": "Recherche",
"Search Conversation by Number": "Rechercher une conversation par identifiant",
"Search Users": "Rechercher des utilisateurs",
- "Search for a customer by name or email": "Rechercher un client par nom ou par e-mail",
+ "Search for a customer by name or email": "Rechercher un membre par nom ou par e-mail",
"See logs": "Voir les journaux",
"Select Mailbox": "Sélectionnez une boîte aux lettres",
"Selected Users have access to this mailbox:": "Les utilisateurs sélectionnés ont accès à cette boîte aux lettres:",
@@ -613,7 +613,7 @@
"Sending": "Envoi en cours",
"Sending Emails": "Sending Emails",
"Sending can not be undone": "L'envoie ne peut être annulé",
- "Sending emails need to be configured for the mailbox in order to send emails to customers and support agents": "L'envoi d'e-mails doit être configuré pour la boîte aux lettres afin d'envoyer des e-mails aux clients et aux agents de support",
+ "Sending emails need to be configured for the mailbox in order to send emails to customers and support agents": "L'envoi d'e-mails doit être configuré pour la boîte aux lettres afin d'envoyer des e-mails aux membre et aux agents de support",
"Sendmail": "Exécutable Sendmail",
"Separate each email with a comma.": "Séparez chaque e-mail par une virgule",
"Server": "Serveur",
@@ -670,11 +670,11 @@
"This is a test mail sent by :app_name. It means that outgoing email settings of your :mailbox mailbox are fine.": "Il s'agit d'un mail de test envoyé par :app_name. Cela signifie que les paramètres de courrier électronique sortant de votre boîte aux lettres :mailbox sont corrects.",
"This is a test system mail sent by :app_name. It means that mail settings are fine.": "Il s'agit d'un e-mail du système de test envoyé par :app_name. Cela signifie que les paramètres de messagerie sont corrects.",
"This may take several minutes": "Cela peut prendre plusieurs minutes",
- "This number is not visible to customers. It is only used to track conversations within :app_name": "Ce numéro n'est pas visible pour les clients. Il est uniquement utilisé pour suivre les conversations dans :app_name",
+ "This number is not visible to customers. It is only used to track conversations within :app_name": "Ce numéro n'est pas visible pour les membres. Il est uniquement utilisé pour suivre les conversations dans :app_name",
"This password is incorrect.": "Ce mot de passe est incorrect.",
- "This reply will go to the customer. :%switch_start%Switch to a note:switch_end if you are replying to :user_name.": "Cette réponse ira au client. :%switch_start%Passez à une note:switch_end si vous répondez à :user_name.",
+ "This reply will go to the customer. :%switch_start%Switch to a note:switch_end if you are replying to :user_name.": "Cette réponse ira au membre. :%switch_start%Passez à une note:switch_end si vous répondez à :user_name.",
"This setting gives you control over what page loads after you perform an action (send a reply, add a note, change conversation status or assignee).": "Ce paramètre vous permet de contrôler la page qui se charge après avoir effectué une action (envoyer une réponse, ajouter une note, etc.).",
- "This text will be added to the beginning of each email reply sent to a customer.": "Ce texte sera ajouté au début de chaque réponse par e-mail envoyée à un client.",
+ "This text will be added to the beginning of each email reply sent to a customer.": "Ce texte sera ajouté au début de chaque réponse par e-mail envoyée à un membre.",
"Thread is not in a draft state": "Le fil n'est pas à l'état de brouillon",
"Thread not found": "Fil non trouvé",
"Time Format": "Format de l'heure",
@@ -751,7 +751,7 @@
"Welcome to :company_name!": "Bienvenue chez :company_name !",
"Welcome to :company_name, :first_name!": "Bienvenue chez :company_name, :first_name!",
"Welcome to the team!": "Bienvenue dans l'équipe !",
- "When a customer emails this mailbox, application can send an auto reply to the customer immediately.<br\/><br\/>Only one auto reply is sent per new conversation.": "Lorsqu'un client envoie un e-mail à cette boîte aux lettres, l'application peut envoyer immédiatement une réponse automatique au client. <br\/> <br\/> Une seule réponse automatique est envoyée par nouvelle conversation.",
+ "When a customer emails this mailbox, application can send an auto reply to the customer immediately.<br\/><br\/>Only one auto reply is sent per new conversation.": "Lorsqu'un membre envoie un e-mail à cette boîte aux lettres, l'application peut envoyer immédiatement une réponse automatique au membre. <br\/> <br\/> Une seule réponse automatique est envoyée par nouvelle conversation.",
"Which mailboxes will user use?": "Quelles boîtes aux lettres l'utilisateur utilisera-t-il?",
"Who Else Will Use This Mailbox": "Qui d'autre utilisera cette boîte aux lettres",
"Work": "Professionnel",

View file

@@ -1,86 +0,0 @@
# Maintainer: Antoine Martin (ayakael) <dev@ayakael.net>
# Contributor: Antoine Martin (ayakael) <dev@ayakael.net>
# Contributor: Jakub Jirutka <jakub@jirutka.cz>
pkgname=gitaly
pkgver=16.9.3
pkgrel=0
pkgdesc="A Git RPC service for handling all the git calls made by GitLab"
url="https://gitlab.com/gitlab-org/gitaly/"
arch="all"
# GPL-2.0-only WITH GCC-exception-2.0: bundled libgit2
license="MIT AND GPL-2.0-only WITH GCC-exception-2.0"
depends="
git>=2.42
"
makedepends="
bash
cmake
go
icu-dev
libssh2-dev
libxml2-dev
libxslt-dev
"
subpackages="
$pkgname-backup
$pkgname-blackbox
$pkgname-praefect
$pkgname-openrc
"
source="https://gitlab.com/gitlab-org/gitaly/-/archive/v$pkgver/gitaly-v$pkgver.tar.gz
config.patch
$pkgname.initd
"
builddir="$srcdir/$pkgname-v$pkgver"
options="!check"
build() {
make V=1 BUILD_TAGS="tracer_static tracer_static_jaeger"
}
package() {
## Go part
make install DESTDIR="$pkgdir" PREFIX=/usr
# Not very useful for us.
rm "$pkgdir"/usr/bin/gitaly-debug
rm "$pkgdir"/usr/bin/gitaly-wrapper
install -m644 -D config.toml.example "$pkgdir"/etc/gitlab/gitaly.toml
install -m644 -D config.praefect.toml.example "$pkgdir"/etc/gitlab/praefect.toml
install -m644 -D cmd/gitaly-blackbox/config.toml.example "$pkgdir"/etc/gitlab/gitaly-blackbox.toml
install -m755 -D "$srcdir"/gitaly.initd "$pkgdir"/etc/init.d/gitlab.gitaly
}
backup() {
pkgdesc="Utility used by the backup Rake task to create/restore repository backups from Gitaly"
depends=""
amove usr/bin/gitaly-backup
}
# TODO: Add init script.
blackbox() {
pkgdesc="Prometheus exporter that measures GitLab server performance by performing a Git HTTP clone"
depends=""
amove etc/gitlab/gitaly-blackbox.toml
amove usr/bin/gitaly-blackbox
}
# TODO: Add init script.
praefect() {
pkgdesc="A reverse-proxy for Gitaly to manage a cluster of Gitaly nodes for HA"
depends=""
amove etc/gitlab/praefect.toml
amove usr/bin/praefect
}
sha512sums="
c3784b7fb692d2e57a484b3a33b719de76d3ee8bfabc95919e7dabd89f5429f06000c615e433c99f18c1a6706ecd389dcf15d55a59ed546f62c10b585e20ad7b gitaly-v16.9.3.tar.gz
7685330e637c3a34db941c9e6b8776d0611ec16297e8be998a3eb4716c455d9f015d433a4d27720c24e520d489dd56bdab7c0e4264f2852b4b0bfd6ecaa7f773 config.patch
c32105d921be16eaf559cf21d6840bc346cd92b5e37974cedecdb5a2d2ca1eb5e8fbb144f5fc8a1289bf9415102b313cf2d61ee510c80f08ab33a799f5ac7122 gitaly.initd
"

View file

@@ -1,91 +0,0 @@
diff --git a/config.toml.example.orig b/config.toml.example
index 82b8502..9982087 100644
--- a/config.toml.example.orig
+++ b/config.toml.example
@@ -2,19 +2,24 @@
# For Gitaly documentation, see https://docs.gitlab.com/ee/administration/gitaly/.
# A path which Gitaly should open a Unix socket.
-socket_path = "/home/git/gitlab/tmp/sockets/private/gitaly.socket"
+socket_path = "/run/gitlab/gitaly.socket"
# Directory containing Gitaly executables.
-bin_dir = "/home/git/gitaly/_build/bin"
+bin_dir = "/usr/bin"
# # Optional. The directory where Gitaly can create all files required to
# # properly operate at runtime. If not set, Gitaly will create a directory in
# # the global temporary directory. This directory must exist.
-# runtime_dir = "/home/git/gitaly/run"
+runtime_dir = "/run/gitaly"
# # Optional if socket_path is set. TCP address for Gitaly to listen on. This is insecure (unencrypted connection).
# listen_addr = "localhost:9999"
+# # Optional: configure where the Gitaly creates the sockets for internal connections. If unset, Gitaly will create a randomly
+# # named temp directory each time it boots.
+# # Non Gitaly clients should never connect to these sockets.
+internal_socket_dir = "/run/gitaly/internal"
+
# # Optional. TCP over TLS address for Gitaly to listen on.
# tls_listen_addr = "localhost:8888"
@@ -35,9 +40,9 @@ bin_dir = "/home/git/gitaly/_build/bin"
# # Gitaly supports TLS encryption. You must bring your own certificates because this isn't provided automatically.
# [tls]
# # Path to the certificate.
-# certificate_path = '/home/git/cert.cert'
+# certificate_path = '/etc/gitlab/ssl/gitaly.crt'
# # Path to the key.
-# key_path = '/home/git/key.pem'
+# key_path = '/etc/gitlab/ssl/gitaly.key'
# # Git settings
# [git]
@@ -58,7 +63,7 @@ bin_dir = "/home/git/gitaly/_build/bin"
# # The name of the storage
name = "default"
# # The path to the storage.
-path = "/home/git/repositories"
+path = "/var/lib/gitlab/repositories"
# # You can optionally configure more storages for this Gitaly instance to serve up
#
@@ -70,12 +75,12 @@ path = "/home/git/repositories"
# # Optional. Configure Gitaly to output JSON-formatted log messages to stdout.
# [logging]
# # Directory where Gitaly stores extra log files.
-dir = "/home/git/gitlab/log"
+dir = "/vat/log/gitlab"
# # Log format. Either 'text' or 'json'.
-# format = "json"
+format = "text"
# # Optional. Set log level to only log entries with that severity or above.
# # Valid values are, in order, 'debug', 'info', 'warn', 'error', 'fatal', and 'panic'. Defaults to 'info'.
-# level = "warn"
+level = "warn"
# # Additionally, exceptions from the Go server can be reported to Sentry. Sentry DSN (Data Source Name)
# # for exception monitoring.
# sentry_dsn = "https://<key>:<secret>@sentry.io/<project>"
@@ -91,18 +96,18 @@ sentry_environment = ""
# # Custom Git hooks that are used to perform tasks based on changes performed in any repository.
[hooks]
# # Directory where custom Git hooks are installed. If left unset, no custom hooks are used.
-custom_hooks_dir = "/home/git/custom_hooks"
+custom_hooks_dir = "/etc/gitlab/custom_hooks"
# # Gitaly must connect to the GitLab application to perform access checks when a user performs a change.
[gitlab]
# # URL of the GitLab server.
-url = "http+unix://%2Fhome%2Fgit%2Fgitlab%2Ftmp%2Fsockets%2Fgitlab-workhorse.socket"
+url = "http+unix://%2Frun%2Fgitlab%2Fworkhorse.socket"
# # 'relative_url_root' is only needed if a UNIX socket is used in 'url' and GitLab is configured to
# # use a relative path. For example, '/gitlab'.
# relative_url_root = '/'
# # Path of the file containing the secret token used to authenticate with GitLab. Use either 'secret_token' or 'secret'
# # but not both.
-secret_file = "/home/git/gitlab-shell/.gitlab_shell_secret"
+secret_file = "/etc/gitlab/gitlab_shell_secret"
# # Secret token used to authenticate with GitLab.
# secret = ""

View file

@@ -1,39 +0,0 @@
#!/sbin/openrc-run
name="Gitaly"
description="A Git RPC service for handling all the git calls made by GitLab"
: ${gitaly_config:="/etc/gitlab/gitaly.toml"}
: ${gitaly_logfile:="/var/log/gitlab/gitaly.log"}
command="/usr/bin/gitaly"
command_args="$gitaly_config"
command_background="yes"
command_user="git"
output_log="$gitaly_logfile"
error_log="$gitaly_logfile"
pidfile="/run/gitaly.pid"
supervise_daemon_args="--env TZ=:/etc/localtime"
start_stop_daemon_args="$supervise_daemon_args"
rc_ulimit="-n 15000"
required_files="$gitaly_config"
depend() {
use net
}
start_pre() {
local socket_path=$(sed -En "s/^\s*socket_path\s*=\s*[\"']([^\"']+)[\"']/\1/p" "$gitaly_config")
local runtime_dir=$(sed -En "s/^\s*runtime_dir\s*=\s*[\"']([^\"']+)[\"']/\1/p" "$gitaly_config")
if [ "$socket_path" ]; then
checkpath -q -d -m 755 -o $command_user "${socket_path%/*}" || return 1
fi
if [ "$runtime_dir" ]; then
checkpath -q -d -m 750 -o $command_user "$runtime_dir" || return 1
fi
checkpath -f -m 640 -o $command_user "$gitaly_logfile"
}

View file

@@ -1,396 +0,0 @@
# Maintainer: Antoine Martin (ayakael) <dev@ayakael.net>
# Contributor: Jakub Jirutka <jakub@jirutka.cz>
# Contributor: Antoine Martin (ayakael) <dev@ayakael.net>
pkgname=gitlab-foss
_pkgname=${pkgname%-foss}
pkgver=16.9.3
_gittag=v$pkgver
pkgrel=0
pkgdesc="A version control for your server"
url="https://gitlab.com/gitlab-org/gitlab-foss"
#arch="x86_64 aarch64"
license="MIT"
# ruby-irb is needed only for Rails console (gitlab-rails console)
depends="
$pkgname-assets=$pkgver-r$pkgrel
ca-certificates
cmd:dpkg-deb
exiftool
git>=2.42.0
gitaly~=16.9
gitlab-shell~=14.34
graphicsmagick
http-parser
procps
py-docutils
python3
redis>=2.8
ruby3.2
ruby3.2-bigdecimal
ruby3.2-bundler
ruby3.2-fiddle
ruby3.2-io-console
ruby3.2-irb
ruby3.2-json
ruby3.2-rake
ruby3.2-rdoc
ruby3.2-webrick
shared-mime-info
tzdata
"
makedepends="
cargo
clang-dev
cmd:chrpath
cmake
file-dev
go
gpgme-dev
icu-dev
libffi-dev
libgcrypt-dev
libpq-dev
libxml2-dev
libxslt-dev
linux-headers
llvm
nodejs
openssl-dev
protobuf-dev
re2-dev
ruby3.2-dev
ruby3.2-grpc~=1.58
ruby3.2-rugged
rust
yarn>=1.2.0
"
pkgusers="git"
pkggroups="git www-data"
install="$pkgname.pre-install $pkgname.post-install $pkgname.post-upgrade"
subpackages="$pkgname-assets::noarch $pkgname-openrc"
source="https://gitlab.com/gitlab-org/gitlab-foss/-/archive/$_gittag/gitlab-foss-$_gittag.tar.gz
database-config.patch
$_pkgname.initd
$_pkgname.mailroom.initd
$_pkgname.rails.initd
$_pkgname.sidekiq.initd
$_pkgname.workhorse.initd
$_pkgname.confd
$_pkgname.logrotate
bin-wrapper.in
downgrade-sys-filesystem-depend.patch
"
builddir="$srcdir/gitlab-foss-$_gittag"
_prefix="usr/lib/bundles/$_pkgname"
export BUNDLE_DEPLOYMENT=true
export BUNDLE_FORCE_RUBY_PLATFORM=true
export BUNDLE_FROZEN=true
# Should be tied to $JOBS, but rust native code fails to build
export BUNDLE_JOBS=1
prepare() {
local sysgemdir=$(ruby -e 'puts Gem.default_dir')
default_prepare
# The default log level is very chatty.
sed -i 's/^\(\s*config.log_level\s*=\).*$/\1 :warn/' \
config/environments/production.rb
# This is not needed, the secret_token is generated by the
# gitlab-shell package. It also makes problems in the build phase.
rm config/initializers/gitlab_shell_secret_token.rb
# Remove all locale files except en.
find locale -type d -mindepth 1 ! -name en -exec rm -rf {} +
# Allow use of any bundler
sed -i -e '/BUNDLED/,+1d' Gemfile.lock
# Some gems are broken, so we copy our fixed version
# instead of installing it from RubyGems using Bundler.
for i in grpc rugged; do
mkdir -p vendor/gems/$i/src/ruby/lib/$i
cp -r "$sysgemdir"/gems/$i-*/* vendor/gems/$i/
cp "$sysgemdir"/specifications/$i-*.gemspec \
vendor/gems/$i/$i.gemspec
cp "$sysgemdir"/extensions/*/*/$i-*/$i/*.so \
vendor/gems/$i/src/ruby/lib/$i/
done
}
build() {
local bundle_without='exclude development kerberos mysql test'
cd "$builddir"/workhorse
make
cd "$builddir"
msg "Installing Ruby gems..."
bundle config --local without "$bundle_without"
bundle config --local build.ffi --enable-system-libffi
bundle config --local build.gpgme --use-system-libraries
bundle config --local build.re2 --enable-system-libraries
bundle config --local build.nokogiri --use-system-libraries \
--with-xml2-include=/usr/include/libxml2 \
--with-xslt-include=/usr/include/libxslt
bundle config --local build.ruby-magic --enable-system-libraries
bundle config --local build.google-protobuf '-- --with-cflags=-D__va_copy=va_copy'
bundle config --local path "vendor/bundle"
bundle install --no-cache
# Replace bundled CA bundle with symlink.
(
cd vendor/bundle/ruby/*/gems/aws-sdk-core-*/
rm ca-bundle.crt
ln -s /etc/ssl/certs/ca-certificates.crt ca-bundle.crt
)
# Remove faulty RPATH.
chrpath -d vendor/bundle/ruby/*/extensions/*/*/ruby-magic-*/magic/magic.so
# cp grpc lib
cp vendor/gems/grpc/src/ruby/lib/grpc/grpc_c.so vendor/bundle/ruby/*/gems/grpc-*/src/ruby/lib/grpc/grpc_c.so
# Patch installed gem gitlab-markup to use python3.
# Option "-S" causes that Python cannot find docutils module.
sed -i 's/python2 -S/python3/g' \
vendor/bundle/ruby/*/gems/gitlab-markup-*/lib/github/markups.rb
# Remove non-sense require of code for tests from top-level module
# (we're gonna delete tests from the package).
sed -i '/require .carrierwave\/test\/matchers./d' \
vendor/bundle/ruby/*/gems/carrierwave-*/lib/carrierwave.rb
msg "Installing npm modules..."
yarn install --production --frozen-lockfile
# Since we have moved assets gems into a group, they are not implicitly
# loaded by default. This will be reverted after compiling assets.
sed -i.bak '/Bundler.require(\*Rails.groups/s/)/, :assets)/' \
config/application.rb
# assets:precompile and gettext:compile bootstraps the app,
# so they needs configs.
cp config/gitlab.yml.example config/gitlab.yml
cp config/database.yml.postgresql config/database.yml
cp config/secrets.yml.example config/secrets.yml
# The configured path is not readable for the user building
# the package, so we must remove it; GitLab will use the default path.
sed -i '/^\s*secret_file:.*/d' config/gitlab.yml
(
export NODE_ENV=production
export RAILS_ENV=production
export SKIP_STORAGE_VALIDATION=true
export USE_DB=false
export NO_SOURCEMAPS=true
export NODE_OPTIONS="--max_old_space_size=3584"
msg "Compiling GetText PO files..."
bundle exec rake gettext:compile
msg "Compiling assets (this will take few minutes)..."
bundle exec rake gitlab:assets:compile
)
# Revert changes.
mv config/application.rb.bak config/application.rb
msg "Cleaning assets gems..."
bundle config --local without 'exclude development kerberos mysql test assets'
bundle clean
# Create executables in bin/*.
# See also https://github.com/bundler/bundler/issues/6149.
bundle binstubs --force bundler gitlab-mail_room puma sidekiq
# Cleanup
rm config/database.yml config/gitlab.yml config/secrets.yml
}
package() {
local destdir="$pkgdir/$_prefix"
local datadir="$pkgdir/var/lib/gitlab"
local file dest
install -d -m755 "$destdir" "$destdir"/bin
install -d -m755 -o git -g git \
"$datadir" \
"$pkgdir"/etc/gitlab \
"$pkgdir"/var/log/gitlab \
"$datadir"/pages
install -d -m700 -o git -g git \
"$datadir"/artifacts \
"$datadir"/builds \
"$datadir"/ci_secure_files \
"$datadir"/dependency_proxy \
"$datadir"/encrypted_settings \
"$datadir"/external-diffs \
"$datadir"/lfs-objects \
"$datadir"/packages \
"$datadir"/pages \
"$datadir"/terraform_state \
"$datadir"/uploads \
"$pkgdir"/var/tmp/gitlab \
"$pkgdir"/var/tmp/gitlab/downloads \
"$pkgdir"/var/tmp/gitlab/backups
install -d -m0750 -o git -g www-data \
"$datadir"/pages
install -d -m02770 -o git -g git \
"$datadir"/repositories
# Install application files.
# Note: *VERSION files and doc directory are required (Help in GitLab
# menu refers to the doc directory).
cp -rl .bundle config.ru Gemfile* INSTALLATION_TYPE Rakefile ./*VERSION \
app data db doc fixtures config lib locale metrics_server public sidekiq_cluster vendor gems \
"$destdir"/
install -m755 -t "$destdir"/bin/ \
bin/bundle \
bin/mail_room \
bin/metrics-server \
bin/rails \
bin/rake \
bin/sidekiq \
bin/sidekiq-cluster \
bin/sidekiqmon \
bin/puma
cd "$destdir"
# Not needed in runtime since we have already compiled all assets.
rm -r app/assets
rm -r vendor/assets
find public/assets -name '*.vue' -delete
find public/assets -type d -exec rmdir --ignore-fail-on-non-empty '{}' \;
# These load gems in the assets group.
rm config/initializers/sprockets.rb
# Remove more stuff not needed in production.
rm -r lib/support
rm -r db/fixtures/development
find lib/tasks -maxdepth 1 -type f ! -name cache.rake ! -name setup.rake -delete
find lib/tasks/gitlab \( -name 'generate_docs.*' \
-o -name 'shell.*' \
-o -name 'test.*' \) -delete
cd "$destdir"/vendor/bundle/ruby/*/
# Remove tests, documentations and other useless files.
find gems/ \( -name 'doc' \
-o -name 'spec' \
-o -name 'test' \) \
-type d -maxdepth 2 -exec rm -fr "{}" +
find gems/ \( -name 'README*' \
-o -name 'CHANGELOG*' \
-o -name 'CONTRIBUT*' \
-o -name '*LICENSE*' \
-o -name 'Rakefile' \
-o -name '.*' \) \
-type f -delete
# Remove bundled libgit2 sources.
rm -r gems/rugged-*/vendor/libgit2
# Remove assets, they are already compiled.
rm -r gems/tanuki_emoji-*/app/assets
# Remove build logs and cache.
rm -rf build_info/ cache/
find extensions/ \( -name gem_make.out -o -name mkmf.log \) -delete
cd "$destdir"
# Install and symlink config files.
for file in cable.yml.example \
database.yml.postgresql \
gitlab.yml.example \
puma.rb.example \
resque.yml.example \
sidekiq.yml.example \
initializers/smtp_settings.rb.sample
do
dest="$(basename "${file%.*}")"
install -m640 -g git -D config/$file "$pkgdir"/etc/gitlab/$dest
ln -sf /etc/gitlab/$dest "$pkgdir"/$_prefix/config/${file%.*}
done
# This file will be generated by the post-install script, just prepare symlink.
ln -sf /etc/gitlab/secrets.yml config/secrets.yml
# These shouldn't be necessary, they are all configurable, but OmniBus
# creates them too, so just to be sure...
ln -sf /etc/gitlab/gitlab_kas_secret .gitlab_kas_secret
ln -sf /etc/gitlab/gitlab_pages_secret .gitlab_pages_secret
ln -sf /etc/gitlab/gitlab_shell_secret .gitlab_shell_secret
ln -sf /etc/gitlab/gitlab_workhorse_secret .gitlab_workhorse_secret
# Some paths are hard-coded in GitLab, so we must make symlinks. :(
ln -sf /var/lib/gitlab/uploads public/uploads
ln -sf /var/log/gitlab log
ln -sf /var/tmp/gitlab tmp
cat > "$datadir"/.profile <<-EOF
export RAILS_ENV=production
export NODE_ENV=production
export EXECJS_RUNTIME=Disabled
EOF
# Install wrapper scripts to /usr/bin.
local name; for name in rake rails; do
sed "s/__COMMAND__/$name/g" "$srcdir"/bin-wrapper.in \
> "$builddir"/gitlab-$name
install -m755 -D "$builddir"/gitlab-$name "$pkgdir"/usr/bin/gitlab-$name
done
cd "$builddir"/workhorse
# Install workhorse.
make install DESTDIR="$pkgdir" PREFIX=/usr
install -m644 config.toml.example "$pkgdir"/etc/gitlab/workhorse.toml
for file in $_pkgname $_pkgname.rails $_pkgname.sidekiq $_pkgname.mailroom $_pkgname.workhorse; do
install -m755 -D "$srcdir"/$file.initd "$pkgdir"/etc/init.d/$file
done
install -m644 -D "$srcdir"/$_pkgname.confd \
"$pkgdir"/etc/conf.d/$_pkgname
install -m644 -D "$srcdir"/$_pkgname.logrotate \
"$pkgdir"/etc/logrotate.d/$_pkgname
}
assets() {
depends=""
amove $_prefix/public/assets
}
sha512sums="
d17ff841977d157965337774ac8ebed409e058bb1617d3fadeb8330d46efe32a091483ba30955c883e654b138d9a3ae7740a528418cd30eb1ed18cced508ddb8 gitlab-foss-v16.9.3.tar.gz
daa496f3d9146f9dbddff62477bf49d5c7bd2f2a4cdbadc70ee51c8230f3ef01dc950ef157154b31c7e7bef0beecc5cbac50fbac65a79d6d9099b27bcba8b2ab database-config.patch
55b0667d3969113ffd6860652ee8bdb9a534c25f413f33b2739e922c886988e7cea72c1c00c7eecf29fcff3682b1324156365605ffc6aae45d1e0ccddf96288b gitlab.initd
1f451b67a5d5e58650b0fe862a2b65cfb8bff5502b37d94ae90619c1ff9affbecf24428303a2849bebce5f94bef37078f0e5710e344bbab616134e910938384a gitlab.mailroom.initd
b6a6d9ba20557e61efa24f2d5a489873fefbb981f7d4465794a857b2971263c08ec29cc001c372522cdc0d48245e59751307c9f44f6ef4d87bf2e3ec5c23fb1c gitlab.rails.initd
cb4ec100f0ea7ffcbb37aead8423e636629e2f4848b2974a7b2468e96cb1081ca732ac336417b08dd943afb961df888c73af1334dcbe054dfd361e74f492fd86 gitlab.sidekiq.initd
85c4e257a030832bd70ad1e257ae7cb568b31e01201fc845abac02d00f02492ca694be1fa2bf743dd8c8623e6a79d36adee3f4de02040134c11158a6001c064b gitlab.workhorse.initd
4dc00b16462f30591297fcb535fc364185d3ed76e9956597f0423a8dfd8a9a351f6ac29d9f0c73052c11324fba4768eb89a21c6bef4da99f15baaea8c9ab8407 gitlab.confd
57f258246925fbef0780caebdf005983c72fe3db1ab3242a1e00137bd322f5ec6c0fd958db7178b8fc22103d071f550d6f71f08422bcd9e859d2a734b2ecef00 gitlab.logrotate
a944c3886388ba1574bf8c96b6de4d9f24ef4a83f553c31a224e17a3b01f2a5c65b60c59b7ed7ca4b25670c60ea8dd41b96a8a623d909d2bb09bdf2520ed7f23 bin-wrapper.in
ab9a09fca6126b18b76e61380990dc217f915162985880e90b905b3210a1fef229af3db1f1ca180177d3cba91ab5fe33798ac685055abf0adc44a1b630f71b39 downgrade-sys-filesystem-depend.patch
"

View file

@@ -1,15 +0,0 @@
#!/bin/sh
BUNDLE_DIR='/usr/lib/bundles/gitlab'
export RAILS_ENV='production'
export NODE_ENV='production'
export EXECJS_RUNTIME='Disabled'
cd $BUNDLE_DIR
install -m 700 -o git -g git -d "$(readlink ./tmp)"
if [ "$(id -un)" != 'git' ]; then
exec su git -c '"$0" "$@"' -- bin/__COMMAND__ "$@"
else
exec bin/__COMMAND__ "$@"
fi
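For context, package() in the gitlab-foss APKBUILD above instantiates this template with sed and installs the result to /usr/bin; those wrappers are what the post-install notes below invoke. A rough sketch:

# Roughly what package() does for each wrapper:
sed 's/__COMMAND__/rake/g' bin-wrapper.in > /usr/bin/gitlab-rake
# Typical use afterwards (command taken from the post-install message):
gitlab-rake gitlab:db:configure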

View file

@@ -1,66 +0,0 @@
diff --git a/config/database.yml.postgresql.orig b/config/database.yml.postgresql
index da9f458..2d6d44e 100644
--- a/config/database.yml.postgresql.orig
+++ b/config/database.yml.postgresql
@@ -26,13 +26,6 @@ production:
username: git
password: "secure password"
host: localhost
- geo:
- adapter: postgresql
- encoding: unicode
- database: gitlabhq_geo_production
- username: git
- password: "secure password"
- host: localhost
#
# Development specific
@@ -57,13 +50,6 @@ development:
host: localhost
variables:
statement_timeout: 15s
- geo:
- adapter: postgresql
- encoding: unicode
- database: gitlabhq_geo_development
- username: postgres
- password: "secure password"
- host: localhost
#
# Staging specific
@@ -84,13 +70,6 @@ staging:
username: git
password: "secure password"
host: localhost
- geo:
- adapter: postgresql
- encoding: unicode
- database: gitlabhq_geo_staging
- username: git
- password: "secure password"
- host: localhost
# Warning: The database defined as "test" will be erased and
# re-generated from your development database when you run "rake".
@@ -119,19 +98,3 @@ test: &test
reaping_frequency: nil
variables:
statement_timeout: 15s
- geo:
- adapter: postgresql
- encoding: unicode
- database: gitlabhq_geo_test
- username: postgres
- password:
- host: localhost
- reaping_frequency: nil
- embedding:
- adapter: postgresql
- encoding: unicode
- database: gitlabhq_embedding_test
- username: postgres
- password:
- host: localhost
- reaping_frequency: nil

View file

@@ -1,37 +0,0 @@
diff --git a/Gemfile.orig b/Gemfile
index c1e9e34..a4448b7 100644
--- a/Gemfile.orig
+++ b/Gemfile
@@ -525,7 +525,7 @@ gem 'health_check', '~> 3.0' # rubocop:todo Gemfile/MissingFeatureCategory
# System information
gem 'vmstat', '~> 2.3.0' # rubocop:todo Gemfile/MissingFeatureCategory
-gem 'sys-filesystem', '~> 1.4.3' # rubocop:todo Gemfile/MissingFeatureCategory
+gem 'sys-filesystem', '= 1.3.2' # rubocop:todo Gemfile/MissingFeatureCategory
# NTP client
gem 'net-ntp' # rubocop:todo Gemfile/MissingFeatureCategory
diff --git a/Gemfile.lock.orig b/Gemfile.lock
index e2ebb91..39b6df3 100644
--- a/Gemfile.lock.orig
+++ b/Gemfile.lock
@@ -1605,8 +1605,8 @@ GEM
attr_required (>= 0.0.5)
httpclient (>= 2.4)
sync (0.5.0)
- sys-filesystem (1.4.3)
- ffi (~> 1.1)
+ sys-filesystem (1.3.2)
+ ffi (>= 0)
sysexits (1.2.0)
table_print (1.5.7)
tanuki_emoji (0.9.0)
@@ -2061,7 +2059,7 @@ DEPENDENCIES
ssh_data (~> 1.3)
stackprof (~> 0.2.25)
state_machines-activerecord (~> 0.8.0)
- sys-filesystem (~> 1.4.3)
+ sys-filesystem (= 1.3.2)
tanuki_emoji (~> 0.9)
telesignenterprise (~> 2.2)
terser (= 1.0.2)

View file

@@ -1,108 +0,0 @@
#!/bin/sh
set -eu
group='git'
data_dir='/var/lib/gitlab'
secrets_file='/etc/gitlab/secrets.yml'
shell_secret_file='/etc/gitlab/gitlab_shell_secret'
workhorse_secret_file='/etc/gitlab/gitlab_workhorse_secret'
kas_secret_file='/etc/gitlab/gitlab_kas_secret'
gen_random_b64() {
local bits="$1"
ruby <<-EOF
require 'securerandom'
require 'base64'
puts Base64.strict_encode64(SecureRandom.random_bytes($bits))
EOF
}
echo "* Checking $secrets_file" >&2
ruby <<-EOF
require 'openssl'
require 'securerandom'
require 'yaml'
secrets_file = '$secrets_file'
changed = false
secrets = YAML.load_file(secrets_file) if File.exist?(secrets_file)
secrets ||= {}
prod = secrets['production'] ||= {}
prod['db_key_base'] ||= ( changed = true; SecureRandom.hex(64) )
prod['secret_key_base'] ||= ( changed = true; SecureRandom.hex(64) )
prod['otp_key_base'] ||= ( changed = true; SecureRandom.hex(64) )
prod['encrypted_settings_key_base'] ||= ( changed = true; SecureRandom.hex(64) )
prod['openid_connect_signing_key'] ||= begin
changed = true
prod.delete('jws_private_key') || OpenSSL::PKey::RSA.new(2048).to_pem
end
# db/fixtures/production/010_settings.rb
prod['ci_jwt_signing_key'] ||= ( changed = true; OpenSSL::PKey::RSA.new(2048).to_pem )
if changed
STDERR.puts "* Generating random secrets into #{secrets_file}"
File.write(secrets_file, YAML.dump(secrets), mode: 'w', perm: 0640)
end
EOF
chown root:$group "$secrets_file"
if [ ! -f "$shell_secret_file" ]; then
echo "* Generating random secret in $shell_secret_file" >&2
head -c 512 /dev/urandom | LC_CTYPE=C tr -cd 'a-zA-Z0-9' | head -c 64 > "$shell_secret_file"
chown root:$group "$shell_secret_file"
chmod 0640 "$shell_secret_file"
fi
if [ ! -f "$workhorse_secret_file" ]; then
echo "* Generating random secret in $workhorse_secret_file" >&2
# Sync with lib/gitlab/workhorse.rb.
gen_random_b64 32 > "$workhorse_secret_file"
chown root:$group "$workhorse_secret_file"
chmod 0640 "$workhorse_secret_file"
fi
if [ ! -f "$kas_secret_file" ]; then
echo "* Generating random secret in $kas_secret_file" >&2
# Sync with lib/gitlab/workhorse.rb.
gen_random_b64 32 > "$kas_secret_file"
chown root:$group "$kas_secret_file"
chmod 0640 "$kas_secret_file"
fi
# NOTE: We create this symlink in post-install script instead of APKBUILD,
# so user can decide to have tmp dir inside $data_dir (e.g. it's on bigger disk).
if [ ! -e "$data_dir"/tmp ]; then
ln -s /var/tmp/gitlab "$data_dir"/tmp
fi
if [ "${0##*.}" = 'post-upgrade' ]; then
cat >&2 <<-EOF
*
* To finish GitLab upgrade run:
*
* gitlab-rake gitlab:db:configure
*
EOF
else
cat >&2 <<-EOF
*
* 1. Adjust settings in /etc/gitlab/database.yml and gitlab.yml.
*
* 2. Create database for GitLab:
*
* psql -c "CREATE ROLE gitlab PASSWORD 'top-secret' INHERIT LOGIN;"
* psql -c "CREATE DATABASE gitlab OWNER gitlab ENCODING 'UTF-8';"
* psql -d gitlab -c "CREATE EXTENSION pg_trgm; CREATE EXTENSION btree_gist;"
*
* 3. Run "gitlab-rake gitlab:setup", or "gitlab-rake gitlab:db:configure" if
* you are updating existing database.
*
EOF
fi

View file

@@ -1 +0,0 @@
gitlab-foss.post-install

View file

@@ -1,53 +0,0 @@
#!/bin/sh
# It's very important to set user/group correctly.
git_dir='/var/lib/gitlab'
if ! getent group git 1>/dev/null; then
echo '* Creating group git' 1>&2
addgroup -S git
fi
if ! id git 2>/dev/null 1>&2; then
echo '* Creating user git' 1>&2
adduser -DHS -G git -h "$git_dir" -s /bin/sh \
-g "added by apk for gitlab-foss" git
passwd -u git 1>/dev/null # unlock
fi
if ! id -Gn git | grep -Fq redis; then
echo '* Adding user git to group redis' 1>&2
addgroup git redis
fi
if [ "$(id -gn git)" != 'git' ]; then
cat >&2 <<-EOF
!!
!! User git has primary group $(id -gn git). We strongly recommend to change
!! git's primary group to git, otherwise GitLab may not work correctly.
!!
EOF
# Add it at least as a supplementary group.
adduser git git
fi
user_home="$(getent passwd git | cut -d: -f6)"
if [ "$user_home" != "$git_dir" ]; then
cat >&2 <<-EOF
!!
!! User git has home directory in $user_home, but this package assumes
!! $git_dir. Although it's possible to use a different directory,
!! it's really not easy.
!!
!! Please change git's home directory to $git_dir, or adjust settings
!! and move files yourself. Otherwise GitLab will not work!
!!
EOF
fi
exit 0

View file

@@ -1,20 +0,0 @@
# Configuration for /etc/init.d/gitlab.rails
# Path to the Puma configuration file.
#puma_config="/etc/gitlab/puma.rb"
# IP address and port for Puma server to listen on.
#puma_listen_tcp="127.0.0.1:8080"
# Absolute path of unix socket for Puma server to listen on.
#puma_listen_unix="/run/gitlab/gitlab.socket"
# Path to the file to redirect stdout from Puma server to.
#puma_stdout_file="/var/log/gitlab/puma_stdout.log"
# Path to the file to redirect stderr from Puma server to.
#puma_stderr_file="/var/log/gitlab/puma_stderr.log"
# Action Cable uses a separate thread pool per Puma worker. This configures
# number of threads in the pool.
#action_cable_worker_pool_size=4

View file

@@ -1,85 +0,0 @@
# Configuration file for /etc/init.d/gitlab and
# /etc/init.d/gitlab.{mailroom,rails,sidekiq,workhorse}
# Path to the base directory for the Prometheus metrics used by Puma and
# Sidekiq.
#metrics_dir=/dev/shm/gitlab
# How many Puma worker processes to create (0 to disable cluster mode).
#puma_workers=3
# IP address and port for Puma server to listen on.
#puma_listen_tcp="127.0.0.1:8080"
# Absolute path of unix socket for Puma server to listen on.
#puma_listen_unix="/run/gitlab/gitlab.socket"
# Action Cable uses a separate thread pool per Puma worker. This configures
# number of threads in the pool.
#action_cable_worker_pool_size=4
# IP address and port, or absolute path of the unix socket, where should
# Workhorse listen on for connections from a web server.
#workhorse_listen="/run/gitlab/workhorse.socket"
# How long to wait for response headers when proxying the request.
#workhorse_proxy_header_timeout="1m0s"
# Number of API requests allowed at single time.
#workhorse_api_limit=
# Maximum queueing duration of requests (default 30s).
#workhorse_api_queue_duration=
# Number of API requests allowed to be queued.
#workhorse_api_queue_limit=
# Long polling duration for job requesting for runners (default 0s - disabled)
#workhorse_ci_long_polling_duration=
# Log format to use: text, json, structured, none. Defaults to "text".
#workhorse_log_format=
# Prometheus listening address.
#workhorse_prometheus_listen=
# Sentry DSN for Workhorse.
#workhorse_sentry_dsn=
# Specify how many processes to create using sidekiq-cluster and which queue
# they should handle. Each whitespace-separated item equates to one additional
# Sidekiq process, and comma-separated values in each item determine the queues
# it works on. The special queue name "*" means all queues.
# Example: "* gitlab_shell process_commit,post_receive"
# See https://docs.gitlab.com/ee/administration/sidekiq/extra_sidekiq_processes.html.
#sidekiq_queue_groups="*"
# Maximum threads to use with Sidekiq (default: 50, 0 to disable).
#sidekiq_max_concurrency=
# Minimum threads to use with Sidekiq (default: 0).
#sidekiq_min_concurrency=
# The number of seconds to wait between worker checks.
#sidekiq_interval=
# Graceful timeout for all running processes.
#sidekiq_shutdown_timeout=
# Run workers for all queues in sidekiq_queues.yml except the given ones.
#sidekiq_negate=no
# Run workers based on the provided selector.
#sidekiq_queue_selector=no
# Memory limit (in MiB) for the Sidekiq process. If the RSS (Resident Set Size)
# of the Sidekiq process exceeds this limit, a delayed shutdown is triggered.
#sidekiq_memkiller_max_rss=2000
# Enable mail_room to handle incoming mails?
#mailroom_enabled="no"

View file

@@ -1,49 +0,0 @@
#!/sbin/openrc-run
name="GitLab"
description="Meta script for starting/stopping all the GitLab components"
: ${mailroom_enabled:="no"}
: ${pages_enabled:="yes"}
subservices="gitlab.rails gitlab.gitaly gitlab.sidekiq gitlab.workhorse"
if yesno "$mailroom_enabled"; then
subservices="$subservices gitlab.mailroom"
fi
if yesno "$pages_enabled" && [ -e /etc/init.d/gitlab.pages ]; then
subservices="$subservices gitlab.pages"
fi
depend() {
use net
}
start() {
local ret=0
ebegin "Starting all GitLab components"
local svc; for svc in $subservices; do
service $svc start || ret=1
done
eend $ret
}
stop() {
local ret=0
ebegin "Stopping all GitLab components"
local svc; for svc in $subservices; do
service $svc stop || ret=1
done
eend $ret
}
status() {
local ret=0
local svc; for svc in $subservices; do
echo "$svc:"
service $svc status || ret=1
done
eend $ret
}

View file

@@ -1,24 +0,0 @@
/var/log/gitlab/workhorse.log {
compress
maxsize 10M
minsize 1M
missingok
postrotate
/etc/init.d/gitlab.workhorse --quiet --ifstarted reopen
endscript
sharedscripts
rotate 5
weekly
}
/var/log/gitlab/*.log {
compress
copytruncate
delaycompress
maxsize 10M
minsize 1M
missingok
sharedscripts
rotate 10
weekly
}

View file

@@ -1,40 +0,0 @@
#!/sbin/openrc-run
supervisor=supervise-daemon
name="GitLab (mailroom)"
description="GitLab service for processing incoming mails."
: ${gitlab_base:="/usr/lib/bundles/gitlab"}
: ${gitlab_config:="/etc/gitlab/gitlab.yml"}
: ${mailroom_logfile:="/var/log/gitlab/mail_room.log"}
: ${mailroom_config:="$gitlab_base/config/mail_room.yml"}
command="$gitlab_base/bin/mail_room"
command_args="-c $mailroom_config"
command_background="yes"
command_user="git"
directory="$gitlab_base"
error_log="$mailroom_logfile"
output_log="$mailroom_logfile"
supervise_daemon_args="
--env RAILS_ENV=production
--env TZ=:/etc/localtime
--env MAIL_ROOM_GITLAB_CONFIG_FILE=$gitlab_config
"
start_stop_daemon_args="--interpreted $supervise_daemon_args"
pidfile="/run/gitlab/mail_room.pid"
required_files="$mailroom_config $gitlab_config"
depend() {
need redis
use net
}
start_pre() {
checkpath -d -m 755 -o $command_user -q "${pidfile%/*}" || return 1
checkpath -f -m 640 -o $command_user "$mailroom_logfile"
}

View file

@@ -1,114 +0,0 @@
#!/sbin/openrc-run
name="GitLab Rails"
description="GitLab application"
extra_started_commands="reload reopen"
description_reload="Reload configuration"
description_reopen="Reopen log files"
: ${gitlab_base:="/usr/lib/bundles/gitlab"}
: ${metrics_dir:="/dev/shm/gitlab"}
: ${action_cable_worker_pool_size:=4}
: ${gitlab_config:="/etc/gitlab/gitlab.yml"}
: ${puma_workers:=3}
: ${puma_listen_unix:="/run/gitlab/gitlab.socket"}
: ${puma_listen_tcp:="127.0.0.1:8080"}
: ${puma_stdout_file:="/var/log/gitlab/puma_stdout.log"}
: ${puma_stderr_file:="/var/log/gitlab/puma_stderr.log"}
: ${puma_config:="/etc/gitlab/puma.rb"}
: ${puma_metrics_dir:="$metrics_dir/puma"}
command="$gitlab_base/bin/puma"
command_args="
--config $puma_config
--workers $puma_workers
--bind tcp://$puma_listen_tcp
--bind unix://$puma_listen_unix
--redirect-stdout $puma_stdout_file
--redirect-stderr $puma_stderr_file
--redirect-append
--state /run/gitlab/puma.state
"
command_background="yes"
command_user="git"
directory="$gitlab_base"
supervise_daemon_args="
--env ACTION_CABLE_WORKER_POOL_SIZE=$action_cable_worker_pool_size
--env RAILS_ENV=production
--env NODE_ENV=production
--env EXECJS_RUNTIME=Disabled
--env GITLAB_BASE=$gitlab_base
--env TZ=:/etc/localtime
--env prometheus_multiproc_dir=$puma_metrics_dir
${supervise_daemon_args:-}
"
start_stop_daemon_args="
--interpreted
$supervise_daemon_args
$start_stop_daemon_args
"
pidfile="/run/gitlab/puma.pid"
required_files="$gitlab_config $puma_config"
depend() {
need redis
want sshd postgresql docker-registry
use net
}
start_pre() {
checkpath -d -m 755 -o $command_user -q "${pidfile%/*}" || return 1
checkpath -d -m 700 -o $command_user -q "$(readlink -f "$gitlab_base"/tmp)" || return 1
checkpath -d -m 700 -o $command_user -q "$metrics_dir" || return 1
checkpath -d -m 700 -o $command_user --directory-truncate "$puma_metrics_dir" || return 1
checkpath -f -m 644 -o $command_user "$puma_stdout_file" || return 1
checkpath -f -m 644 -o $command_user "$puma_stderr_file" || return 1
# Ruby requires sticky bit on TMP directory.
checkpath -d -m 1777 /tmp
local downloads_path="$(_parse_yaml "$gitlab_config" \
production.gitlab.repository_downloads_path)"
if [ -n "$downloads_path" ]; then
checkpath -d -m 700 -o $command_user -q "$downloads_path"
fi
}
reload() {
ebegin "Reloading $name"
if [ "$supervisor" ]; then
$supervisor "$RC_SVCNAME" --signal USR2
else
start-stop-daemon --pidfile "$pidfile" --signal USR2
fi
eend $?
}
reopen() {
ebegin "Telling $name to reopen log files"
if [ "$supervisor" ]; then
$supervisor "$RC_SVCNAME" --signal USR1
else
start-stop-daemon --pidfile "$pidfile" --signal USR1
fi
eend $?
}
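# _parse_yaml <file> <dotted.key> [default] - look up a dotted key (e.g.
# production.gitlab.repository_downloads_path) in a YAML file via Ruby;
# prints the optional default if the lookup fails.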
_parse_yaml() {
local file="$1"
local key="$2"
local default="${3:-}"
local key_path="$(echo "[\"$key\"]" | sed 's/\./"]["/g')"
ruby <<-EOF
require "yaml"
puts YAML.load_file("$file")$key_path rescue puts "$default"
EOF
}

@ -1,76 +0,0 @@
#!/sbin/openrc-run
extra_started_commands="finish"
name="GitLab Sidekiq"
description="GitLab backgroud workers"
description_finish="Stop fetching new jobs and finish current ones"
: ${gitlab_base:="/usr/lib/bundles/gitlab"}
: ${metrics_dir:="/dev/shm/gitlab"}
: ${sidekiq_logfile:="/var/log/gitlab/sidekiq.log"}
: ${sidekiq_memkiller_max_rss:="2000"} # default per Omnibus
: ${sidekiq_metrics_dir:="$metrics_dir/sidekiq"}
: ${sidekiq_negate:="no"}
: ${sidekiq_queue_groups:="*"}
: ${sidekiq_queue_selector:="no"}
command="$gitlab_base/bin/sidekiq-cluster"
# Note: The rest of the options are set in start_pre().
command_args="-r $gitlab_base -e production ${command_args:-}"
command_background="yes"
command_user="git"
directory="$gitlab_base"
error_log="$sidekiq_logfile"
output_log="$sidekiq_logfile"
supervise_daemon_args="
--env RAILS_ENV=production
--env NODE_ENV=production
--env EXECJS_RUNTIME=Disabled
--env TZ=:/etc/localtime
--env SIDEKIQ_MEMORY_KILLER_MAX_RSS=$(( sidekiq_memkiller_max_rss * 1024 ))
--env prometheus_multiproc_dir=$sidekiq_metrics_dir
"
start_stop_daemon_args="--interpreted $supervise_daemon_args"
pidfile="/run/gitlab/sidekiq.pid"
depend() {
need redis
use net postgresql
}
start_pre() {
yesno "$sidekiq_queue_selector" && command_args="$command_args --queue-selector"
command_args="$command_args
$(optif --max-concurrency ${sidekiq_max_concurrency:-})
$(optif --min-concurrency ${sidekiq_min_concurrency:-})
$(optif --interval ${sidekiq_interval:-})
$(optif --timeout ${sidekiq_shutdown_timeout:-})
$(set -f; printf "'%s' " $sidekiq_queue_groups)
"
yesno "$sidekiq_negate" && command_args="$command_args --negate"
checkpath -d -m 755 -o $command_user -q "${pidfile%/*}" || return 1
checkpath -d -m 700 -o $command_user -q "$metrics_dir" || return 1
checkpath -d -m 700 -o $command_user --directory-truncate "$sidekiq_metrics_dir" || return 1
checkpath -f -m 644 -o $command_user "$sidekiq_logfile"
}
finish() {
ebegin "Telling $name to stop fetching new jobs"
if [ "$supervisor" ]; then
$supervisor "$RC_SVCNAME" --signal TSTP
else
start-stop-daemon --pidfile "$pidfile" --signal TSTP
fi
eend $?
}
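# Print "<flag>=<value>" only when the value is non-empty, so unset conf.d
# variables contribute no arguments.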
optif() {
test -n "$2" && printf '%s/n' "$1=$2" || true
}

@ -1,75 +0,0 @@
#!/sbin/openrc-run
extra_started_commands="reopen"
name="GitLab Workhorse"
description="A reverse proxy for GitLab."
description_reopen="Reopen log files"
: ${gitlab_base:="/usr/lib/bundles/gitlab"}
: ${workhorse_logfile:="/var/log/gitlab/workhorse.log"}
: ${workhorse_access_log:="no"}
command="/usr/bin/gitlab-workhorse"
# Note: The rest of the options are set in start_pre().
command_args="
-authBackend=http://${puma_listen_tcp:="127.0.0.1:8080"}
-config=${workhorse_config:="/etc/gitlab/workhorse.toml"}
-documentRoot=${gitlab_public_dir:="$gitlab_base/public"}
-listenAddr=${workhorse_listen:="/run/gitlab/workhorse.socket"}
-listenUmask=${workhorse_listen_umask:="000"}
-logFile=$workhorse_logfile
-secretPath=${workhorse_secret_path:="/etc/gitlab/gitlab_workhorse_secret"}
"
command_background="yes"
command_user="git"
directory="$gitlab_base"
pidfile="/run/gitlab/workhorse.pid"
depend() {
use net
}
start_pre() {
local listen_net="tcp"
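# A listen address starting with '/' is treated as a Unix socket path,
# anything else as a TCP address.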
[ "${workhorse_listen:0:1}" = '/' ] && listen_net="unix"
command_args="$command_args
-listenNetwork=$listen_net
$(optif -apiCiLongPollingDuration "$workhorse_ci_long_polling_duration")
$(optif -apiLimit "$workhorse_api_limit")
$(optif -apiQueueDuration "$workhorse_api_queue_duration")
$(optif -apiQueueLimit "$workhorse_api_queue_limit")
$(optif -authSocket "$puma_listen_unix")
$(optif -logFormat "$workhorse_log_format")
$(optif -prometheusListenAddr "$workhorse_prometheus_listen_addr")
$(optif -proxyHeadersTimeout "$workhorse_proxy_header_timeout")"
# FIXME: not implemented
#yesno "$workhorse_access_log" || command_args="$command_args -disableAccessLog"
start_stop_daemon_args="$start_stop_daemon_args
$(optif '--env GITLAB_WORKHORSE_SENTRY_DSN' "$workhorse_sentry_dns")"
supervise_daemon_args="$supervise_daemon_args
$(optif '--env GITLAB_WORKHORSE_SENTRY_DSN' "$workhorse_sentry_dns")"
checkpath -d -m 755 -o $command_user -q "${pidfile%/*}" || return 1
if [ "$listen_net" = "unix" ]; then
checkpath -d -m 755 -o $command_user -q "${workhorse_listen%/*}" || return 1
fi
checkpath -f -m 640 -o $command_user "$workhorse_logfile"
}
reopen() {
ebegin "Telling $name to reopen log files"
if [ "$supervisor" ]; then
$supervisor "$RC_SVCNAME" --signal HUP
else
start-stop-daemon --pidfile "$pidfile" --signal HUP
fi
eend $?
}
optif() {
test -n "$2" && printf '%s/n' "$1=$2" || true
}

@ -1,35 +0,0 @@
# Maintainer: Antoine Martin (ayakael) <dev@ayakael.net>
# Contributor: Antoine Martin (ayakael) <dev@ayakael.net>
# Contributor: Jakub Jirutka <jakub@jirutka.cz>
pkgname=gitlab-pages
pkgver=16.9.3
_gittag="v$pkgver"
pkgrel=0
pkgdesc="A daemon used to serve static websites for GitLab users"
url="https://gitlab.com/gitlab-org/gitlab-pages/"
arch="all"
license="MIT"
makedepends="go>=1.5"
source="
https://gitlab.com/gitlab-org/gitlab-pages/-/archive/$_gittag/gitlab-pages-$_gittag.tar.gz
ungit-makefile.patch
$pkgname.initd
"
subpackages="$pkgname-openrc"
builddir="$srcdir"/$pkgname-$_gittag
build() {
make VERSION=$pkgver REVISION=$pkgrel GOPATH="$srcdir" CGO_ENABLED=0
}
package() {
install -D -m 755 $pkgname "$pkgdir"/usr/bin/$pkgname
install -m755 -D "$srcdir"/$pkgname.initd \
"$pkgdir"/etc/init.d/gitlab.pages
}
sha512sums="
5a97176d820f787b96cac54dc040a0232c6b0e8a98f7e737af2f5c9a0cff10ce79263a35fdf560c58eb84eaaf1ed109a75121b050f059b4bdf493d05b58861bc gitlab-pages-v16.9.3.tar.gz
710a9b652327e57e620c2bdb02bf912a6f61044eaaf61d36c6612284e9b951d2ac6f5eef77dfea16a0cde328bd4c556d9e47791c560139c27cb9659076f809b1 ungit-makefile.patch
20bc66c1c3548568ed353ca8d584f9108b9688f9375f212a18efc7b8386fdaafb3b2dc9e865f21c7f8fd31ada6e91842a8bb8d397f64851d853bb0de3e0e60bb gitlab-pages.initd
"

@ -1,55 +0,0 @@
#!/sbin/openrc-run
name="GitLab Pages"
description="A daemon used to serve static websites for GitLab users"
: ${pages_user:=${user:-"git"}}
: ${pages_root:="/var/lib/gitlab/pages"}
: ${pages_logfile:="/var/log/gitlab/pages.log"}
command="/usr/bin/gitlab-pages"
# Note: The rest of the options are set in start_pre().
command_args="
-pages-domain=$pages_domain
-pages-root=$pages_root
-redirect-http=${pages_redirect_http:-true}
-use-http2=${pages_use_http2:-true}
"
command_background="yes"
start_stop_daemon_args="
--chdir $pages_root
--user $pages_user
--stdout $pages_logfile
--stderr $pages_logfile"
pidfile="/run/gitlab-pages.pid"
depend() {
use net
}
start_pre() {
local item
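# Each pages_listen_* variable may hold several whitespace-separated
# addresses; emit one flag per entry.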
for item in $pages_listen_http; do
command_args="$command_args -listen-http=$item"
done
for item in $pages_listen_https; do
command_args="$command_args -listen-https=$item"
done
for item in $pages_listen_proxy; do
command_args="$command_args -listen-proxy=$item"
done
command_args="$command_args
$(optif -metrics-address "$pages_metrics_address")
$(optif -root-cert "$pages_root_cert")
$(optif -root-key "$pages_root_key")"
checkpath -m 640 -o $pages_user -f "$pages_logfile"
}
optif() {
test -n "$2" && printf '%s/n' "$1=$2" || true
}

@ -1,18 +0,0 @@
diff --git a/Makefile.internal.mk.orig b/Makefile.internal.mk
index 6dfaa1b..207bdaf 100644
--- a/Makefile.internal.mk.orig
+++ b/Makefile.internal.mk
@@ -1,13 +1,3 @@
-REVISION := $(shell git rev-parse --short HEAD || echo unknown)
-LAST_TAG := $(shell git describe --tags --abbrev=0)
-COMMITS := $(shell echo `git log --oneline $(LAST_TAG)..HEAD | wc -l`)
-VERSION := $(shell cat VERSION)
-BRANCH := $(shell git rev-parse --abbrev-ref HEAD)
-
-ifneq (v$(VERSION),$(LAST_TAG))
- VERSION := $(shell echo $(VERSION)~beta.$(COMMITS).g$(REVISION))
-endif
-
VERSION_FLAGS :=-X "main.VERSION=$(VERSION)" -X "main.REVISION=$(REVISION)"
export GOBIN := $(CURDIR)/bin

@ -1,66 +0,0 @@
# Maintainer: Antoine Martin (ayakael) <dev@ayakael.net>
# Contributor: Antoine Martin (ayakael) <dev@ayakael.net>
# Contributor: Jakub Jirutka <jakub@jirutka.cz>
pkgname=gitlab-shell
pkgver=14.34.0
pkgrel=0
pkgdesc="GitLab Shell handles git SSH sessions for GitLab"
url="https://gitlab.com/gitlab-org/gitlab-shell"
arch="all"
license="MIT"
depends="git openssh"
makedepends="go krb5-dev"
pkgusers="git"
pkggroups="git"
install="$pkgname.pre-install $pkgname.post-install"
# NOTE: user vs system gitconfig, see https://gitlab.com/gitlab-org/omnibus-gitlab/-/merge_requests/6166
source="https://gitlab.com/gitlab-org/gitlab-shell/-/archive/v$pkgver/gitlab-shell-v$pkgver.tar.gz
config.patch
change-config-path.patch
gitconfig
"
builddir="$srcdir/$pkgname-v$pkgver"
options="!check"
build() {
# BUILD_TAGS - build without tracing libs,
# see https://gitlab.com/gitlab-org/labkit/-/merge_requests/2
make build \
VERSION_STRING="$pkgver" \
BUILD_TAGS=""
}
package() {
local datadir="$pkgdir/var/lib/gitlab"
local libdir="$pkgdir/usr/lib/gitlab-shell"
# XXX: I couldn't figure out how/where gitlab-shell is called,
# so I kept /usr/lib/gitlab-shell. It should be changed to /usr.
make install DESTDIR="$pkgdir" PREFIX=/usr/lib/gitlab-shell
install -m644 VERSION "$libdir"/
install -m644 -D config.yml.example "$pkgdir"/etc/gitlab/gitlab-shell.yml
cd "$pkgdir"
rm "$libdir"/bin/gitlab-sshd
install -d -m755 -o git -g git \
"$pkgdir"/var/log/gitlab \
"$datadir"
install -d -m02770 -o git -g git \
"$datadir"/repositories
install -m644 -o git -g git "$srcdir"/gitconfig "$datadir"/.gitconfig
ln -s /etc/gitlab/gitlab-shell.yml "$libdir"/config.yml
ln -s /etc/gitlab/gitlab_shell_secret "$libdir"/.gitlab_shell_secret
}
sha512sums="
703685c8aae6498ad42103a70a65e18b4d2a617687a5488a52bf4c8147cd56a724a109ea27456ca93a723b458499ab09590ad5d1591eb5d3c38d8d33870736eb gitlab-shell-v14.34.0.tar.gz
5123f639de976b83a961f5d0a9f53b0ff7559ceb4e73b25a8029423932ba6249c430b8bb04dae4dce2e13330e95d4a7a88e63376ead2d6369f6adb264fd36d49 config.patch
499b3a46ea94a33a23b01f6a7509d74f5a6781b930619b3b8ae42bdeae8a052cc636578744d7992b4ae4f9b9f72b11ee3d3c0f5e50986fa3f7e35b979b08aada change-config-path.patch
c53da7f145593693392d9fa880ad5a1909bfc7504fd1c93d94a468c3e0f5cc80f712f41ee1dc8bf38105b410c1165658f208bd88a70c4674104c78af33d8d09c gitconfig
"

@ -1,11 +0,0 @@
--- a/support/gitlab_config.rb
+++ b/support/gitlab_config.rb
@@ -4,7 +4,7 @@ class GitlabConfig
attr_reader :config
def initialize
- @config = YAML.load_file(File.join(ROOT_PATH, 'config.yml'))
+ @config = YAML.load_file(ENV.fetch('GITLAB_SHELL_CONFIG', '/etc/gitlab/gitlab-shell.yml'))
end
def home

@ -1,110 +0,0 @@
diff --git a/config.yml.example.orig b/config.yml.example
index 13850e6..98eb0e3 100644
--- a/config.yml.example.orig
+++ b/config.yml.example
@@ -13,7 +13,7 @@ user: git
# only listen on a Unix domain socket. For Unix domain sockets use
# "http+unix://<urlquoted-path-to-socket>", e.g.
# "http+unix://%2Fpath%2Fto%2Fsocket"
-gitlab_url: "http+unix://%2Fhome%2Fgit%2Fgitlab%2Ftmp%2Fsockets%2Fgitlab-workhorse.socket"
+gitlab_url: "http+unix://%2Frun%2Fgitlab%2Fworkhorse.socket"
# When a http+unix:// is used in gitlab_url, this is the relative URL root to GitLab.
# Not used if gitlab_url is http:// or https://.
@@ -29,15 +29,15 @@ http_settings:
#
# File used as authorized_keys for gitlab user
-auth_file: "/home/git/.ssh/authorized_keys"
+auth_file: "/var/lib/gitlab/.ssh/authorized_keys"
# SSL certificate dir where custom certificates can be placed
# https://golang.org/pkg/crypto/x509/
-# ssl_cert_dir: /opt/gitlab/embedded/ssl/certs/
+# ssl_cert_dir: /etc/gitlab/ssl/certs/
# File that contains the secret key for verifying access to GitLab.
# Default is .gitlab_shell_secret in the gitlab-shell directory.
-# secret_file: "/home/git/gitlab-shell/.gitlab_shell_secret"
+secret_file: "/etc/gitlab/gitlab_shell_secret"
#
# The secret field supersedes the secret_file, and if set that
# file will not be read.
@@ -45,13 +45,13 @@ auth_file: "/home/git/.ssh/authorized_keys"
# Log file.
# Default is gitlab-shell.log in the root directory.
-# log_file: "/home/git/gitlab-shell/gitlab-shell.log"
+log_file: "/var/log/gitlab/gitlab-shell.log"
# Log level. INFO by default
-log_level: INFO
+log_level: WARN
# Log format. 'json' by default, can be changed to 'text' if needed
-# log_format: json
+log_format: text
# Audit usernames.
# Set to true to see real usernames in the logs instead of key ids, which is easier to follow, but
@@ -62,60 +62,6 @@ audit_usernames: false
# For more details, visit https://docs.gitlab.com/ee/development/distributed_tracing.html
# gitlab_tracing: opentracing://driver
-# This section configures the built-in SSH server. Ignored when running on OpenSSH.
-sshd:
- # Address which the SSH server listens on. Defaults to [::]:22.
- listen: "[::]:22"
- # Set to true if gitlab-sshd is being fronted by a load balancer that implements
- # the PROXY protocol.
- proxy_protocol: false
- # Proxy protocol policy ("use", "require", "reject", "ignore"), "use" is the default value
- # Values: https://github.com/pires/go-proxyproto/blob/195fedcfbfc1be163f3a0d507fac1709e9d81fed/policy.go#L20
- proxy_policy: "use"
- # Proxy allowed IP addresses. Takes precedent over proxy_policy. Disabled by default.
- # proxy_allowed:
- # - "192.168.0.1"
- # - "192.168.1.0/24"
- # Address which the server listens on HTTP for monitoring/health checks. Defaults to localhost:9122.
- web_listen: "localhost:9122"
- # Maximum number of concurrent sessions allowed on a single SSH connection. Defaults to 10.
- concurrent_sessions_limit: 10
- # Sets an interval after which server will send keepalive message to a client. Defaults to 15s.
- client_alive_interval: 15
- # The server waits for this time for the ongoing connections to complete before shutting down. Defaults to 10s.
- grace_period: 10
- # The server disconnects after this time if the user has not successfully logged in. Defaults to 60s.
- login_grace_time: 60
- # A short timeout to decide to abort the connection if the protocol header is not seen within it. Defaults to 500ms
- proxy_header_timeout: 500ms
- # The endpoint that returns 200 OK if the server is ready to receive incoming connections; otherwise, it returns 503 Service Unavailable. Defaults to "/start".
- readiness_probe: "/start"
- # The endpoint that returns 200 OK if the server is alive. Defaults to "/health".
- liveness_probe: "/health"
- # Specifies the available message authentication code algorithms that are used for protecting data integrity
- macs: [hmac-sha2-256-etm@openssh.com, hmac-sha2-512-etm@openssh.com, hmac-sha2-256, hmac-sha2-512, hmac-sha1]
- # Specifies the available Key Exchange algorithms
- kex_algorithms: [curve25519-sha256, curve25519-sha256@libssh.org, ecdh-sha2-nistp256, ecdh-sha2-nistp384, ecdh-sha2-nistp521, diffie-hellman-group14-sha256, diffie-hellman-group14-sha1]
- # Specified the ciphers allowed
- ciphers: [aes128-gcm@openssh.com, chacha20-poly1305@openssh.com, aes256-gcm@openssh.com, aes128-ctr, aes192-ctr,aes256-ctr]
- # SSH host key files.
- host_key_files:
- - /run/secrets/ssh-hostkeys/ssh_host_rsa_key
- - /run/secrets/ssh-hostkeys/ssh_host_ecdsa_key
- - /run/secrets/ssh-hostkeys/ssh_host_ed25519_key
- host_key_certs:
- - /run/secrets/ssh-hostkeys/ssh_host_rsa_key-cert.pub
- - /run/secrets/ssh-hostkeys/ssh_host_ecdsa_key-cert.pub
- - /run/secrets/ssh-hostkeys/ssh_host_ed25519_key-cert.pub
- # GSSAPI-related settings
- gssapi:
- # Enable the gssapi-with-mic authentication method. Defaults to false.
- enabled: false
- # Keytab path. Defaults to "", system default (usually /etc/krb5.keytab).
- keytab: ""
- # The Kerberos service name to be used by sshd. Defaults to "", accepts any service name in keytab file.
- service_principal_name: ""
-
lfs:
# https://gitlab.com/groups/gitlab-org/-/epics/11872, disabled by default.
pure_ssh_protocol: false

@ -1,17 +0,0 @@
# Based on files/gitlab-cookbooks/gitlab/templates/default/gitconfig.erb
# in omnibus-gitlab.
[user]
name = GitLab
email = gitlab@local.host
[core]
# Needed for the web editor.
autocrlf = input
alternateRefsCommand="exit 0 #"
# This option is unnecessary on journaled file systems and it's not recognized
# by git >= 2.36.
# fsyncObjectFiles = true
[gc]
auto = 0

@ -1,23 +0,0 @@
#!/bin/sh
set -eu
keys_file='/var/lib/gitlab/.ssh/authorized_keys'
if [ ! -f "$keys_file" ]; then
keys_dir="$(dirname "$keys_file")"
echo "* Initializing authorized_keys file in $keys_dir" 1>&2
mkdir -m0700 -p "$keys_dir"
chown git:git "$keys_dir"
touch "$keys_file"
chmod 0600 "$keys_file"
chown git:git "$keys_file"
fi
cat <<EOF >&2
*
* GitLab Shell has been initialized. Read /etc/gitlab/gitlab-shell.yml and
* modify settings as needed.
*
EOF

@ -1,41 +0,0 @@
#!/bin/sh
# It's very important to set user/group correctly.
git_dir='/var/lib/gitlab'
if ! getent group git >/dev/null; then
echo '* Creating group git' >&2
addgroup -S git
fi
if ! id git 2>/dev/null 1>&2; then
echo '* Creating user git' >&2
adduser -DHS -G git -h "$git_dir" -s /bin/sh \
-g "added by apk for gitlab-shell" git
passwd -u git >/dev/null # unlock
fi
if ! id -Gn git | grep -Fq redis; then
echo '* Adding user git to group redis' >&2
addgroup git redis
fi
user_home="$(getent passwd git | cut -d: -f6)"
if [ "$user_home" != "$git_dir" ]; then
cat >&2 <<-EOF
!!
!! User git has home directory in $user_home, but this package and gitlab-ce
!! package assume $git_dir. Although it's possible to use a different
!! directory, it's really not easy.
!!
!! Please change git's home directory to $git_dir, or adjust settings
!! and move files yourself. Otherwise GitLab will not work!
!!
EOF
fi
exit 0

@ -1,10 +0,0 @@
--- a/src/core/ext/transport/chttp2/transport/chttp2_transport.cc
+++ b/src/core/ext/transport/chttp2/transport/chttp2_transport.cc
@@ -978,6 +978,7 @@
} else {
r = grpc_chttp2_begin_write(t);
}
+ #pragma GCC diagnostic ignored "-Wmaybe-uninitialized"
if (r.writing) {
if (r.partial) {
GRPC_STATS_INC_HTTP2_PARTIAL_WRITES();

@ -1,273 +0,0 @@
# Contributor: Keith Maxwell <keith.maxwell@gmail.com>
# Contributor: wener <wenermail@gmail.com>
# Maintainer: wener <wenermail@gmail.com>
pkgname=grpc
pkgver=1.58.0
pkgrel=2
pkgdesc="The C based gRPC"
url="https://grpc.io/"
arch="all"
# BSD-3-Clause: third_party/upb, third_party/address_sorting
# MIT: third_party/upb/third_party/utf8_range
license="Apache-2.0 AND BSD-3-Clause AND MIT"
depends="ca-certificates"
depends_dev="
$pkgname-cpp=$pkgver-r$pkgrel
$pkgname-plugins=$pkgver-r$pkgrel
"
_pythondepends="
cython
python3-dev
py3-setuptools
"
_rubydepends="
$pkgname=$pkgver-r$pkgrel
ruby3.2-google-protobuf>=3.19
"
makedepends="
abseil-cpp-dev
autoconf
automake
benchmark-dev
c-ares-dev
chrpath
cmake
libstdc++
libtool
linux-headers
openssl-dev>3
protobuf-dev
re2-dev
ruby3.2-dev
samurai
yaml-dev
xxhash-dev
zlib-dev
$_pythondepends
$_rubydepends
"
checkdepends="coreutils python3 py3-six"
subpackages="
$pkgname-dev
$pkgname-cpp
$pkgname-plugins
$pkgname-doc
py3-grpcio-pyc
py3-grpcio:grpcio
ruby3.2-grpc:_ruby
libaddress_sorting:lib
libgpr:lib
libgrpc:lib
libgrpc_authorization_provider:lib
libgrpc_unsecure:lib
libupb:lib
"
_googletest_rev=0e402173c97aea7a00749e825b194bfede4f2e45
# ruby-dont-strip-library.patch: abuild will do the strip itself
source="https://github.com/grpc/grpc/archive/v$pkgver/grpc-v$pkgver.tar.gz
googletest-$_googletest_rev.tar.gz::https://github.com/google/googletest/archive/$_googletest_rev.tar.gz
01-chttp2-maybe-uninitialized.patch
find-dependency.patch
ruby-fix-protoc-path.patch
ruby-use-shared-libs.patch
ruby-use-system-certs.patch
makefile-use-system-abseil.patch
cython3.patch
"
options="net !check" # sometimes hang indefinitely on builders
prepare() {
rm -r third_party/googletest
mv "$srcdir"/googletest-$_googletest_rev third_party/googletest
# Remove bundled xxhash.
# Since grpc sets XXH_INCLUDE_ALL wherever it uses xxhash, it is using xxhash
# as a header-only library. This means we can replace it with the system copy
# by doing nothing further; xxhash.h is in the system include path and will be
# found instead, and there are no linker flags to add. See also
# https://github.com/grpc/grpc/issues/25945.
rm -rvf third_party/xxhash/*
# This will be replaced with a symlink to system certs.
echo '' > etc/roots.pem
default_prepare
# Remove some bundled dependencies from the gem's files list.
sed -i \
-e '/etc\/roots.pem/d' \
-e '/third_party\/abseil/d' \
-e '/third_party\/boringssl/d' \
-e '/third_party\/cares/d' \
-e '/third_party\/re2/d' \
-e '/third_party\/xxhash/d' \
-e '/third_party\/zlib/d' \
grpc.gemspec
# Remove unused dependency from gemspec - it's not required anywhere,
# it's just Google pushing their crap everywhere...
sed -i '/add_dependency.*googleapis-common-protos-types/d' \
grpc.gemspec
}
build() {
export CFLAGS="$CFLAGS -flto=auto -DNDEBUG -O2"
export CXXFLAGS="$CXXFLAGS -flto=auto -DNDEBUG -O2"
cmake -B _build -G Ninja \
-DCMAKE_BUILD_TYPE=None \
-DCMAKE_INSTALL_PREFIX=/usr \
-DCMAKE_CXX_STANDARD=17 \
-DBUILD_SHARED_LIBS=True \
-DgRPC_INSTALL=ON \
-DgRPC_CARES_PROVIDER=package \
-DgRPC_PROTOBUF_PROVIDER=package \
-DgRPC_SSL_PROVIDER=package \
-DgRPC_ZLIB_PROVIDER=package \
-DgRPC_ABSL_PROVIDER=package \
-DgRPC_BENCHMARK_PROVIDER=package \
-DgRPC_RE2_PROVIDER=package \
-DgRPC_BACKWARDS_COMPATIBILITY_MODE=OFF \
-DgRPC_BUILD_TESTS="$(want_check && echo ON || echo OFF)"
cmake --build _build
GRPC_PYTHON_CFLAGS="-std=c++17" \
GRPC_PYTHON_DISABLE_LIBC_COMPATIBILITY=1 \
GRPC_PYTHON_BUILD_SYSTEM_CARES=1 \
GRPC_PYTHON_BUILD_SYSTEM_OPENSSL=1 \
GRPC_PYTHON_BUILD_SYSTEM_ZLIB=1 \
GRPC_PYTHON_BUILD_SYSTEM_RE2=1 \
GRPC_PYTHON_BUILD_SYSTEM_ABSL=1 \
python3 setup.py build
# grpcio-tools
cd tools/distrib/python
python3 make_grpcio_tools.py
cd "$builddir"
gem build grpc.gemspec
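# Stage the gem under _build/ruby; TOPDIR lets the patched extconf.rb link
# against the just-built shared libgrpc. The ruby3.2-grpc subpackage copies
# the staged files into place.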
TOPDIR="$PWD/_build" gem install \
--local \
--install-dir _build/ruby \
--ignore-dependencies \
--no-document \
--verbose \
grpc-$pkgver.gem
}
check() {
# delete tests that time out or are broken in CI
rm -f _build/spinlock_test _build/resolve_address_using_ares_resolver_posix_test build/resolve_address_using_native_resolver_posix_test
rm -f _build/flaky_network_test _build/unknown_frame_bad_client_test _build/ssl_transport_security_test _build/httpscli_test
rm -f _build/headers_bad_client_test _build/httpcli_test
case $CARCH in
aarch64|ppc64le) rm -f _build/server_test _build/grpc_tool_test ;;
s390x) rm -f _build/client_lb_end2end_test _build/alts_frame_protector_test _build/alts_iovec_record_protocol_test ;;
armv7) rm -f _build/initial_settings_frame_bad_client_test ;;
x86) rm -f _build/time_jump_test _build/connection_prefix_bad_client_test ;;
esac
# start helper
./tools/run_tests/start_port_server.py &
find build/ -maxdepth 1 -type f -executable -name "*_test" -exec {} \;
# kill helper
pkill -9 python3
}
package() {
DESTDIR="$pkgdir" cmake --install _build
python3 setup.py install --skip-build --root="$pkgdir"
cd doc
find ./ -type f -print -exec install -Dm644 {} "$pkgdir"/usr/share/doc/grpc/{} \;
rm "$pkgdir"/usr/share/doc/grpc/.gitignore
find "$pkgdir" -type f -name roots.pem -exec \
sh -c 'rm $0 && ln -s /etc/ssl/certs/ca-certificates.crt $0' "{}" \;
}
cpp() {
pkgdesc="C++ language bindings for gRPC"
depends="$pkgname=$pkgver-r$pkgrel"
amove usr/lib/libgrpc++*.so.*
amove usr/lib/libgrpc_plugin_support.so.*
amove usr/lib/libgrpcpp*.so.*
}
plugins() {
pkgdesc="Protocol buffers compiler plugins for gRPC"
depends="$pkgname-cpp=$pkgver-r$pkgrel protobuf"
amove usr/bin/grpc_*_plugin
}
cli() {
pkgdesc="gRPC command line tool"
install -Dm644 -t "$subpkgdir"/usr/lib "$builddir"/_build/libgrpc++_test_config.so.$pkgver
install -Dm755 -t "$subpkgdir"/usr/bin "$builddir"/_build/grpc_cli
# Fix "Has /home/... in rpath"
chrpath -d "$subpkgdir"/usr/lib/libgrpc++_test_config.so.$pkgver
chrpath -d "$subpkgdir"/usr/bin/grpc_cli
}
grpcio() {
pkgdesc="gRPC Python HTTP/2-based RPC framework"
depends="py3-six"
amove usr/lib/python3*
}
_ruby() {
pkgdesc="Send RPCs from Ruby using GRPC"
depends="$_rubydepends"
local gemdir="$subpkgdir/$(ruby -e 'puts Gem.default_dir')"
cd "$builddir"/_build/ruby
mkdir -p "$gemdir"
cp -r extensions gems specifications "$gemdir"/
# Remove unnecessary files and rubbish...
cd "$gemdir"/extensions/*/*/grpc-$pkgver
rm gem_make.out mkmf.log || true
cd "$gemdir"/gems/grpc-$pkgver
rm -rf .yardopts \
Makefile \
include/ \
src/core/ \
third_party/
cd src/ruby
rm -rf bin/ \
ext/ \
lib/grpc/*.so \
pb/generate_proto_ruby.sh \
pb/README.md \
pb/src/ \
pb/test/ \
spec/
}
lib() {
pkgdesc="$pkgdesc ($subpkgname library)"
depends="$pkgname=$pkgver-r$pkgrel"
amove usr/lib/$subpkgname.so.*
}
sha512sums="
fb2fd211a22dd777cf4df39a9dd72e5c8014f1546a89d3910b006503aac80a74d5797705e02911e9c07316ed973f71110b94cc0e86225f648d4ff91773748a43 grpc-v1.58.0.tar.gz
5c5eaf6ff9f3c1bca025b7ef0234ba97232ba85b43e6354a92f49b7208f5c47581ebaf18bf58618498e5d264f2620c2b6676e81bb0f7df77112b96ba271ececf googletest-0e402173c97aea7a00749e825b194bfede4f2e45.tar.gz
7fa146ce86ddd4f160bb1ca9ff01cb7aca6b2b8c9aa50e4fa6b84504b9117b104be0d1e31ccb452d846549dfe1e9012ceccfcdc1f2357ed567621d71fb8b08c5 01-chttp2-maybe-uninitialized.patch
6702e39c6a3c065fe4ff5ae48898057135c09bf6851e35fc958cf95ee5d77e9dd34e8c34d978efe60682384e46c4c4b2e51156d546b06a0eb1feed89adcc024b find-dependency.patch
4ea72d2acd8bee9c9022a4412aa0af0477faca7b0810d14decb3ad5d4da044247f51189512323bfee855b9b260a7f82b812310391451e5d8ee718297800d7a73 ruby-fix-protoc-path.patch
7123bf1bbc48ceb303ce1e9820ea45a06dabd25e20e3c1c116ef68e629e80f229cf20314c415d74f0c5c1725f23a00b446656e0cffba3dcd3cc766ae29d8fb2f ruby-use-shared-libs.patch
631af4b9ac29c1ebabb2c88394ea2993e36cec1beda38195e1587dbd9d3c8c9eef75a17d2326d3cd2e682de551401216075ba08fdc501c098b8092d718ded381 ruby-use-system-certs.patch
89e260934da83eb45fa6b73884cba1b1c30f99c0eb883a726e2d36ee4788246f4c6fa1b201077038af956bcb58e625f83bedba4f186c711785ec240373ce4fc5 makefile-use-system-abseil.patch
896d2771fbb726db97efc7a76687a8fddfae18b0492977fc1f7cec4002803f7aed29e8276c94c6b60a06ecfe3ee7795d4ec3f8f90031dd3eda32d3e23dc9c98c cython3.patch
"

@ -1,172 +0,0 @@
From b3277bac1585ddee88a170b0a95c260d909cce9c Mon Sep 17 00:00:00 2001
From: Atri Bhattacharya <A.Bhattacharya@uliege.be>
Date: Sat, 24 Feb 2024 04:06:08 +0530
Subject: [PATCH] [python] Cython 3 compatibility: declare functions noexcept.
In Cython 3, cdef functions that really will not raise exceptions must
be declared as `noexcept`. Fixed by this commit.
Update requirements to `cython >= 3.0` in requirements*.txt and
setup.py.
Fixes issue #33918.
---
requirements.bazel.txt | 2 +-
requirements.txt | 2 +-
setup.py | 2 +-
.../grpcio/grpc/_cython/_cygrpc/aio/callback_common.pxd.pxi | 2 +-
.../grpcio/grpc/_cython/_cygrpc/aio/callback_common.pyx.pxi | 2 +-
src/python/grpcio/grpc/_cython/_cygrpc/credentials.pyx.pxi | 2 +-
src/python/grpcio/grpc/_cython/_cygrpc/fork_posix.pxd.pxi | 6 +++---
src/python/grpcio/grpc/_cython/_cygrpc/fork_posix.pyx.pxi | 6 +++---
src/python/grpcio/grpc/_cython/_cygrpc/vtable.pyx.pxi | 6 +++---
9 files changed, 15 insertions(+), 15 deletions(-)
diff --git a/requirements.bazel.txt b/requirements.bazel.txt
index f46432cc88891..905c092ce4c33 100644
--- a/requirements.bazel.txt
+++ b/requirements.bazel.txt
@@ -1,6 +1,6 @@
# GRPC Python setup requirements
coverage==4.5.4
-cython==0.29.21
+cython==3.0.0
protobuf>=3.5.0.post1, < 4.0dev
wheel==0.38.1
oauth2client==4.1.0
diff --git a/requirements.txt b/requirements.txt
index 05390850559f1..56169434b1b78 100644
--- a/requirements.txt
+++ b/requirements.txt
@@ -1,5 +1,5 @@
# GRPC Python setup requirements
coverage>=4.0
-cython>=0.29.8,<3.0.0rc1
+cython>=3.0.0
protobuf>=4.21.3,<5.0dev
wheel>=0.29
diff --git a/setup.py b/setup.py
index 2ce5fef422316..8b4ce5c16736a 100644
--- a/setup.py
+++ b/setup.py
@@ -539,7 +539,7 @@ def cython_extensions_and_necessity():
sys.stderr.write(
"We could not find Cython. Setup may take 10-20 minutes.\n"
)
- SETUP_REQUIRES += ("cython>=0.23,<3.0.0rc1",)
+ SETUP_REQUIRES += ("cython>=3.0.0",)
COMMAND_CLASS = {
"doc": commands.SphinxDocumentation,
diff --git a/src/python/grpcio/grpc/_cython/_cygrpc/aio/callback_common.pxd.pxi b/src/python/grpcio/grpc/_cython/_cygrpc/aio/callback_common.pxd.pxi
index e54e5107547c1..26edbdb917b10 100644
--- a/src/python/grpcio/grpc/_cython/_cygrpc/aio/callback_common.pxd.pxi
+++ b/src/python/grpcio/grpc/_cython/_cygrpc/aio/callback_common.pxd.pxi
@@ -48,7 +48,7 @@ cdef class CallbackWrapper:
@staticmethod
cdef void functor_run(
grpc_completion_queue_functor* functor,
- int succeed)
+ int succeed) noexcept
cdef grpc_completion_queue_functor *c_functor(self)
diff --git a/src/python/grpcio/grpc/_cython/_cygrpc/aio/callback_common.pyx.pxi b/src/python/grpcio/grpc/_cython/_cygrpc/aio/callback_common.pyx.pxi
index 14a0098fc2041..2b0df0e5ce7f7 100644
--- a/src/python/grpcio/grpc/_cython/_cygrpc/aio/callback_common.pyx.pxi
+++ b/src/python/grpcio/grpc/_cython/_cygrpc/aio/callback_common.pyx.pxi
@@ -50,7 +50,7 @@ cdef class CallbackWrapper:
@staticmethod
cdef void functor_run(
grpc_completion_queue_functor* functor,
- int success):
+ int success) noexcept:
cdef CallbackContext *context = <CallbackContext *>functor
cdef object waiter = <object>context.waiter
if not waiter.cancelled():
diff --git a/src/python/grpcio/grpc/_cython/_cygrpc/credentials.pyx.pxi b/src/python/grpcio/grpc/_cython/_cygrpc/credentials.pyx.pxi
index 74a3f16d72dbb..600c0f304e067 100644
--- a/src/python/grpcio/grpc/_cython/_cygrpc/credentials.pyx.pxi
+++ b/src/python/grpcio/grpc/_cython/_cygrpc/credentials.pyx.pxi
@@ -316,7 +316,7 @@ def server_credentials_ssl_dynamic_cert_config(initial_cert_config,
return credentials
cdef grpc_ssl_certificate_config_reload_status _server_cert_config_fetcher_wrapper(
- void* user_data, grpc_ssl_server_certificate_config **config) with gil:
+ void* user_data, grpc_ssl_server_certificate_config **config) noexcept with gil:
# This is a credentials.ServerCertificateConfig
cdef ServerCertificateConfig cert_config = None
if not user_data:
diff --git a/src/python/grpcio/grpc/_cython/_cygrpc/fork_posix.pxd.pxi b/src/python/grpcio/grpc/_cython/_cygrpc/fork_posix.pxd.pxi
index 13a02434787ba..b300883abae81 100644
--- a/src/python/grpcio/grpc/_cython/_cygrpc/fork_posix.pxd.pxi
+++ b/src/python/grpcio/grpc/_cython/_cygrpc/fork_posix.pxd.pxi
@@ -12,10 +12,10 @@
# See the License for the specific language governing permissions and
# limitations under the License.
-cdef void __prefork() nogil
+cdef void __prefork() noexcept nogil
-cdef void __postfork_parent() nogil
+cdef void __postfork_parent() noexcept nogil
-cdef void __postfork_child() nogil
\ No newline at end of file
+cdef void __postfork_child() noexcept nogil
diff --git a/src/python/grpcio/grpc/_cython/_cygrpc/fork_posix.pyx.pxi b/src/python/grpcio/grpc/_cython/_cygrpc/fork_posix.pyx.pxi
index 565f483b2ae00..d901cfddf4321 100644
--- a/src/python/grpcio/grpc/_cython/_cygrpc/fork_posix.pyx.pxi
+++ b/src/python/grpcio/grpc/_cython/_cygrpc/fork_posix.pyx.pxi
@@ -35,7 +35,7 @@ _GRPC_ENABLE_FORK_SUPPORT = (
_fork_handler_failed = False
-cdef void __prefork() nogil:
+cdef void __prefork() noexcept nogil:
with gil:
global _fork_handler_failed
_fork_handler_failed = False
@@ -49,14 +49,14 @@ cdef void __prefork() nogil:
_fork_handler_failed = True
-cdef void __postfork_parent() nogil:
+cdef void __postfork_parent() noexcept nogil:
with gil:
with _fork_state.fork_in_progress_condition:
_fork_state.fork_in_progress = False
_fork_state.fork_in_progress_condition.notify_all()
-cdef void __postfork_child() nogil:
+cdef void __postfork_child() noexcept nogil:
with gil:
try:
if _fork_handler_failed:
diff --git a/src/python/grpcio/grpc/_cython/_cygrpc/vtable.pyx.pxi b/src/python/grpcio/grpc/_cython/_cygrpc/vtable.pyx.pxi
index da4b81bd97e65..f59410073b736 100644
--- a/src/python/grpcio/grpc/_cython/_cygrpc/vtable.pyx.pxi
+++ b/src/python/grpcio/grpc/_cython/_cygrpc/vtable.pyx.pxi
@@ -13,16 +13,16 @@
# limitations under the License.
# TODO(https://github.com/grpc/grpc/issues/15662): Reform this.
-cdef void* _copy_pointer(void* pointer):
+cdef void* _copy_pointer(void* pointer) noexcept:
return pointer
# TODO(https://github.com/grpc/grpc/issues/15662): Reform this.
-cdef void _destroy_pointer(void* pointer):
+cdef void _destroy_pointer(void* pointer) noexcept:
pass
-cdef int _compare_pointer(void* first_pointer, void* second_pointer):
+cdef int _compare_pointer(void* first_pointer, void* second_pointer) noexcept:
if first_pointer < second_pointer:
return -1
elif first_pointer > second_pointer:

@ -1,13 +0,0 @@
Without this, find_dependency() doesn't exist.
--
diff --git a/cmake/gRPCConfig.cmake.in b/cmake/gRPCConfig.cmake.in
index 98d8c6d..5500ca2 100644
--- a/cmake/gRPCConfig.cmake.in
+++ b/cmake/gRPCConfig.cmake.in
@@ -1,5 +1,6 @@
# Module path
list(APPEND CMAKE_MODULE_PATH ${CMAKE_CURRENT_LIST_DIR}/modules)
+include(CMakeFindDependencyMacro)
# Depend packages
@_gRPC_FIND_ZLIB@

@ -1,22 +0,0 @@
--- a/Makefile
+++ b/Makefile
@@ -575,8 +575,8 @@
# Setup abseil dependency
-GRPC_ABSEIL_DEP = $(LIBDIR)/$(CONFIG)/libgrpc_abseil.a
-GRPC_ABSEIL_MERGE_LIBS = $(LIBDIR)/$(CONFIG)/libgrpc_abseil.a
+GRPC_ABSEIL_DEP = -labsl_base -labsl_int128 -labsl_strings -labsl_time -labsl_bad_optional_access -labsl_throw_delegate -labsl_str_format_internal
+GRPC_ABSEIL_MERGE_LIBS = -labsl_base -labsl_int128 -labsl_strings -labsl_time -labsl_bad_optional_access -labsl_throw_delegate -labsl_str_format_internal
# Setup re2 dependency
@@ -2809,7 +2809,7 @@
third_party/abseil-cpp/absl/types/bad_variant_access.cc \
-LIBGRPC_ABSEIL_OBJS = $(addprefix $(OBJDIR)/$(CONFIG)/, $(addsuffix .o, $(basename $(LIBGRPC_ABSEIL_SRC))))
+LIBGRPC_ABSEIL_OBJS =
$(LIBGRPC_ABSEIL_OBJS): CPPFLAGS += -g -Ithird_party/abseil-cpp

@ -1,25 +0,0 @@
Patch-Source: https://sources.debian.org/src/grpc/1.44.0-3/debian/patches/fix-protoc-path.patch (modified)
--- a/src/ruby/end2end/package_with_underscore_test.rb
+++ b/src/ruby/end2end/package_with_underscore_test.rb
@@ -20,8 +20,8 @@ def main
pb_dir = File.join(root_dir, 'src', 'ruby', 'end2end', 'protos')
- bins_dir = File.join(root_dir, 'cmake', 'build')
+ bins_dir = '/usr/bin'
plugin = File.join(bins_dir, 'grpc_ruby_plugin')
- protoc = File.join(bins_dir, 'third_party', 'protobuf', 'protoc')
+ protoc = File.join(bins_dir, 'protoc')
got = nil
--- a/src/ruby/tools/bin/grpc_tools_ruby_protoc
+++ b/src/ruby/tools/bin/grpc_tools_ruby_protoc
@@ -25,6 +25,5 @@ plugin_name = 'grpc_ruby_plugin' + ext
-protoc_dir = File.join(File.dirname(__FILE__),
- PLATFORM.architecture + '-' + PLATFORM.os_name)
+protoc_dir = '/usr/bin'
protoc_path = File.join(protoc_dir, protoc_name)

@ -1,81 +0,0 @@
From: Jakub Jirutka <jakub@jirutka.cz>
Date: Wed, 24 Aug 2022 21:20:22 +0200
Subject: [PATCH] Link with shared libraries, don't embed anything
- Don't statically link openssl, zlib and cares.
- Don't build and statically link libgrpc, link shared libgrpc.
- Don't statically link libgcc and libstdc++.
diff --git a/src/ruby/ext/grpc/extconf.rb b/src/ruby/ext/grpc/extconf.rb
index 98a8876..808ecfe 100644
--- a/src/ruby/ext/grpc/extconf.rb
+++ b/src/ruby/ext/grpc/extconf.rb
@@ -69,11 +69,11 @@ if apple_toolchain && !cross_compiling
end
# Don't embed on TruffleRuby (constant-time crypto is unsafe with Sulong, slow build times)
-ENV['EMBED_OPENSSL'] = (RUBY_ENGINE != 'truffleruby').to_s
+ENV['EMBED_OPENSSL'] = 'false'
# Don't embed on TruffleRuby (the system zlib is already linked for the zlib C extension, slow build times)
-ENV['EMBED_ZLIB'] = (RUBY_ENGINE != 'truffleruby').to_s
+ENV['EMBED_ZLIB'] = 'false'
-ENV['EMBED_CARES'] = 'true'
+ENV['EMBED_CARES'] = 'false'
ENV['ARCH_FLAGS'] = RbConfig::CONFIG['ARCH_FLAG']
if apple_toolchain && !cross_compiling
@@ -97,32 +97,7 @@
strip_tool = RbConfig::CONFIG['STRIP']
strip_tool += ' -x' if apple_toolchain
-unless windows
- puts 'Building internal gRPC into ' + grpc_lib_dir
- nproc = 4
- nproc = Etc.nprocessors if Etc.respond_to? :nprocessors
- nproc_override = ENV['GRPC_RUBY_BUILD_PROCS']
- unless nproc_override.nil? or nproc_override.size == 0
- nproc = nproc_override
- puts "Overriding make parallelism to #{nproc}"
- end
- make = bsd ? 'gmake' : 'make'
- cmd = "#{make} -j#{nproc} -C #{grpc_root} #{grpc_lib_dir}/libgrpc.a CONFIG=#{grpc_config} Q="
- puts "Building grpc native library: #{cmd}"
- system(cmd)
- exit 1 unless $? == 0
-
- if grpc_config == 'opt'
- rm_obj_cmd = "rm -rf #{File.join(output_dir, 'objs')}"
- puts "Removing grpc object files: #{rm_obj_cmd}"
- system(rm_obj_cmd)
- exit 1 unless $? == 0
- strip_cmd = "#{strip_tool} #{grpc_lib_dir}/*.a"
- puts "Stripping grpc native library: #{strip_cmd}"
- system(strip_cmd)
- exit 1 unless $? == 0
- end
-end
+$LDFLAGS << ' -L' + ENV.fetch('TOPDIR', '.')
$CFLAGS << ' -DGRPC_RUBY_WINDOWS_UCRT' if windows_ucrt
$CFLAGS << ' -I' + File.join(grpc_root, 'include')
@@ -118,7 +103,7 @@ ext_export_file += '-truffleruby' if RUBY_ENGINE == 'truffleruby'
$LDFLAGS << ' -Wl,--version-script="' + ext_export_file + '.gcc"' if linux
$LDFLAGS << ' -Wl,-exported_symbols_list,"' + ext_export_file + '.clang"' if apple_toolchain
-$LDFLAGS << ' ' + File.join(grpc_lib_dir, 'libgrpc.a') unless windows
+$LDFLAGS << ' -Wl,-wrap,memcpy -lgrpc' unless windows
if grpc_config == 'gcov'
$CFLAGS << ' -O0 -fprofile-arcs -ftest-coverage'
$LDFLAGS << ' -fprofile-arcs -ftest-coverage -rdynamic'
@@ -129,10 +114,6 @@ if grpc_config == 'dbg'
end
$LDFLAGS << ' -Wl,-wrap,memcpy' if linux
-# Do not statically link standard libraries on TruffleRuby as this does not work when compiling to bitcode
-if linux && RUBY_ENGINE != 'truffleruby'
- $LDFLAGS << ' -static-libgcc -static-libstdc++'
-end
$LDFLAGS << ' -static' if windows
$CFLAGS << ' -std=c11 '

Some files were not shown because too many files have changed in this diff Show more