Project import generated by Copybara.

GitOrigin-RevId: ae1dc133ea5f1538d035af41e5ddbc2ebcb67b90
Author: Default email
Date:   2022-09-22 14:36:57 +02:00
Parent: 18d7f36feb
Commit: b41113241d
1002 changed files with 40522 additions and 30597 deletions


@@ -0,0 +1,32 @@
---
name: Missing or incorrect documentation
about: Help us improve the Nixpkgs and NixOS reference manuals
title: ''
labels: '9.needs: documentation'
assignees: ''
---
## Problem
<!-- describe your problem -->
## Checklist
<!-- make sure this issue is not redundant or obsolete -->
- [ ] checked [latest Nixpkgs manual] \([source][nixpkgs-source]) and [latest NixOS manual] \([source][nixos-source])
- [ ] checked [open documentation issues] for possible duplicates
- [ ] checked [open documentation pull requests] for possible solutions
[latest Nixpkgs manual]: https://nixos.org/manual/nixpkgs/unstable/
[latest NixOS manual]: https://nixos.org/manual/nixos/unstable/
[nixpkgs-source]: https://github.com/NixOS/nixpkgs/tree/master/doc
[nixos-source]: https://github.com/NixOS/nixpkgs/tree/master/nixos/doc/manual
[open documentation issues]: https://github.com/NixOS/nixpkgs/issues?q=is%3Aissue+is%3Aopen+label%3A%229.needs%3A+documentation%22
[open documentation pull requests]: https://github.com/NixOS/nixpkgs/pulls?q=is%3Aopen+is%3Apr+label%3A%228.has%3A+documentation%22%2C%226.topic%3A+documentation%22
## Proposal
<!-- propose a solution -->


@@ -143,6 +143,9 @@
- nixos/modules/programs/neovim.nix
- pkgs/applications/editors/neovim/**/*
"6.topic: vscode":
- pkgs/applications/editors/vscode/**/*
"6.topic: xfce":
- nixos/doc/manual/configuration/xfce.xml
- nixos/modules/services/x11/desktop-managers/xfce.nix


@@ -1,8 +1,8 @@
name: "Update terraform-providers"
on:
-  schedule:
-    - cron: "14 3 * * 0"
+  #schedule:
+  #  - cron: "14 3 * * 0"
workflow_dispatch:
permissions:
@@ -18,6 +18,8 @@ jobs:
steps:
- uses: actions/checkout@v3
- uses: cachix/install-nix-action@v17
with:
nix_path: nixpkgs=channel:nixpkgs-unstable
- name: setup
id: setup
run: |


@@ -6,6 +6,8 @@
.vscode/
outputs/
result-*
result
!pkgs/development/python-modules/result
/doc/NEWS.html
/doc/NEWS.txt
/doc/manual.html


@@ -14,19 +14,25 @@ for example when using an 'old' hash in a fixed-output derivation.
Examples:
```nix
-passthru.tests.version = testVersion { package = hello; };
+passthru.tests.version = testers.testVersion { package = hello; };
-passthru.tests.version = testVersion {
+passthru.tests.version = testers.testVersion {
  package = seaweedfs;
  command = "weed version";
};
-passthru.tests.version = testVersion {
+passthru.tests.version = testers.testVersion {
  package = key;
  command = "KeY --help";
  # Wrong '2.5' version in the code. Drop on next version.
  version = "2.5";
};
passthru.tests.version = testers.testVersion {
  package = ghr;
  # The output needs to contain the 'version' string without any prefix or suffix.
  version = "v${version}";
};
```
## `testEqualDerivation` {#tester-testEqualDerivation}
@@ -42,7 +48,7 @@ Otherwise, the build log explains the difference via `nix-diff`.
Example:
```nix
-testEqualDerivation
+testers.testEqualDerivation
  "The hello package must stay the same when enabling checks."
  hello
  (hello.overrideAttrs(o: { doCheck = true; }))
@@ -73,7 +79,7 @@ fixed output derivation.
Example:
```nix
-tests.fetchgit = invalidateFetcherByDrvHash fetchgit {
+tests.fetchgit = testers.invalidateFetcherByDrvHash fetchgit {
  name = "nix-source";
  url = "https://github.com/NixOS/nix";
  rev = "9d9dbe6ed05854e03811c361a3380e09183f4f4a";


@@ -40,7 +40,7 @@ Exported variables:
Bash-only variables:
-- `postgresqlTestUserOptions`: SQL options to use when creating the `$PGUSER` role, default: `LOGIN`.
+- `postgresqlTestUserOptions`: SQL options to use when creating the `$PGUSER` role, default: `"LOGIN"`. Example: `"LOGIN SUPERUSER"`
- `postgresqlTestSetupSQL`: SQL commands to run as database administrator after startup, default: statements that create `$PGUSER` and `$PGDATABASE`.
- `postgresqlTestSetupCommands`: bash commands to run after database start, defaults to running `$postgresqlTestSetupSQL` as database administrator.
- `postgresqlEnableTCP`: set to `1` to enable TCP listening. Flaky; not recommended.
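A minimal sketch of how these variables might be used from a package expression; the package name, source, and check command are hypothetical, and `postgresqlTestHook` is assumed to be brought into scope via `callPackage`:

```nix
{ stdenv, postgresql, postgresqlTestHook }:

stdenv.mkDerivation {
  pname = "example-db-consumer"; # hypothetical package
  version = "0.1.0";
  src = ./.;                     # hypothetical source

  checkInputs = [ postgresql postgresqlTestHook ];
  doCheck = true;

  # Bash-only variable from the list above: give the test role superuser rights.
  postgresqlTestUserOptions = "LOGIN SUPERUSER";

  checkPhase = ''
    runHook preCheck
    # The hook has already started PostgreSQL and created $PGUSER and $PGDATABASE.
    psql -c 'SELECT 1;'
    runHook postCheck
  '';
}
```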


@@ -88,3 +88,58 @@ with lib; mkCoqDerivation {
};
}
```
## Three ways of overriding Coq packages {#coq-overriding-packages}
There are three distinct ways of changing a Coq package by overriding one of its values: `.override`, `overrideCoqDerivation`, and `.overrideAttrs`. This section explains what sort of values can be overridden with each of these methods.
### `.override` {#coq-override}
`.override` lets you change arguments to a Coq derivation. In the case of the `multinomials` package above, `.override` would let you override arguments like `mkCoqDerivation`, `version`, `coq`, `mathcomp`, `mathcomp-finmap`, etc.
For example, assuming you have a special `mathcomp` dependency you want to use, here is how you could override the `mathcomp` dependency:
```nix
multinomials.override {
mathcomp = my-special-mathcomp;
}
```
In Nixpkgs, all Coq derivations take a `version` argument. This can be overridden in order to easily use a different version:
```nix
coqPackages.multinomials.override {
version = "1.5.1";
}
```
Refer to [](#coq-packages-attribute-sets-coqpackages) for all the different formats that you can potentially pass to `version`, as well as the restrictions.
### `overrideCoqDerivation` {#coq-overrideCoqDerivation}
The `overrideCoqDerivation` function lets you easily change arguments to `mkCoqDerivation`. These arguments are described in [](#coq-packages-attribute-sets-coqpackages).
For example, here is how you could locally add a new release of the `multinomials` library, and set the `defaultVersion` to use this release:
```nix
coqPackages.lib.overrideCoqDerivation
{
defaultVersion = "2.0";
release."2.0".sha256 = "1lq8x86vd3vqqh2yq6hvyagpnhfq5wmk5pg2z0xq7b7dbbbhyfkk";
}
coqPackages.multinomials
```
### `.overrideAttrs` {#coq-overrideAttrs}
`.overrideAttrs` lets you override arguments to the underlying `stdenv.mkDerivation` call. Internally, `mkCoqDerivation` uses `stdenv.mkDerivation` to create derivations for Coq libraries. You can override arguments to `stdenv.mkDerivation` with `.overrideAttrs`.
For instance, here is how you could add some code to be performed in the derivation after installation is complete:
```nix
coqPackages.multinomials.overrideAttrs (oldAttrs: {
postInstall = oldAttrs.postInstall or "" + ''
echo "you can do anything you want here"
'';
})
```


@@ -1780,6 +1780,10 @@ The following rules are desired to be respected:
that characters should be converted to lowercase and `.` and `_` should be
replaced by a single `-` (foo-bar-baz instead of Foo__Bar.baz).
If necessary, `pname` has to be given a different value within `fetchPypi`.
* Packages from sources such as GitHub and GitLab that do not exist on PyPI
should not use a name that is already used on PyPI. When possible, they should
use the repository name prefixed with the owner (e.g. organization) name,
with `-` as the delimiter (see the sketch after this list).
* Attribute names in `python-packages.nix` should be sorted alphanumerically to
avoid merge conflicts and ease locating attributes.
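To make that naming rule concrete, here is a sketch for an entirely hypothetical GitHub-only project `github.com/someorg/sometool`; all names and the hash are placeholders:

```nix
{ lib, buildPythonPackage, fetchFromGitHub }:

# The attribute name and pname follow the "<owner>-<repo>" convention described above.
buildPythonPackage rec {
  pname = "someorg-sometool"; # hypothetical; the project is not published on PyPI
  version = "1.2.3";

  src = fetchFromGitHub {
    owner = "someorg";
    repo = "sometool";
    rev = "v${version}";
    sha256 = lib.fakeSha256;  # placeholder hash
  };
}
```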


@@ -1206,4 +1206,57 @@ runTests {
expr = strings.levenshteinAtMost 3 "hello" "Holla";
expected = true;
};
testTypeDescriptionInt = {
expr = (with types; int).description;
expected = "signed integer";
};
testTypeDescriptionListOfInt = {
expr = (with types; listOf int).description;
expected = "list of signed integer";
};
testTypeDescriptionListOfListOfInt = {
expr = (with types; listOf (listOf int)).description;
expected = "list of list of signed integer";
};
testTypeDescriptionListOfEitherStrOrBool = {
expr = (with types; listOf (either str bool)).description;
expected = "list of (string or boolean)";
};
testTypeDescriptionEitherListOfStrOrBool = {
expr = (with types; either (listOf bool) str).description;
expected = "(list of boolean) or string";
};
testTypeDescriptionEitherStrOrListOfBool = {
expr = (with types; either str (listOf bool)).description;
expected = "string or list of boolean";
};
testTypeDescriptionOneOfListOfStrOrBool = {
expr = (with types; oneOf [ (listOf bool) str ]).description;
expected = "(list of boolean) or string";
};
testTypeDescriptionOneOfListOfStrOrBoolOrNumber = {
expr = (with types; oneOf [ (listOf bool) str number ]).description;
expected = "(list of boolean) or string or signed integer or floating point number";
};
testTypeDescriptionEitherListOfBoolOrEitherStringOrNumber = {
expr = (with types; either (listOf bool) (either str number)).description;
expected = "(list of boolean) or string or signed integer or floating point number";
};
testTypeDescriptionEitherEitherListOfBoolOrStringOrNumber = {
expr = (with types; either (either (listOf bool) str) number).description;
expected = "(list of boolean) or string or signed integer or floating point number";
};
testTypeDescriptionEitherNullOrBoolOrString = {
expr = (with types; either (nullOr bool) str).description;
expected = "null or boolean or string";
};
testTypeDescriptionEitherListOfEitherBoolOrStrOrInt = {
expr = (with types; either (listOf (either bool str)) int).description;
expected = "(list of (boolean or string)) or signed integer";
};
testTypeDescriptionEitherIntOrListOrEitherBoolOrStr = {
expr = (with types; either int (listOf (either bool str))).description;
expected = "signed integer or list of (boolean or string)";
};
}


@@ -113,6 +113,9 @@ rec {
name
, # Description of the type, defined recursively by embedding the wrapped type if any.
description ? null
# A hint for whether or not this description needs parentheses. Possible values:
#  - "noun": a simple noun phrase such as "positive integer"
#  - "conjunction": a phrase with a potentially ambiguous "or" connective.
#  - "composite": a phrase with an "of" connective
# See the `optionDescriptionPhrase` function.
, descriptionClass ? null
, # Function applied to each definition that should return true if
# its type-correct, false otherwise.
check ? (x: true)
@@ -158,10 +164,36 @@
nestedTypes ? {}
}:
{ _type = "option-type";
-inherit name check merge emptyValue getSubOptions getSubModules substSubModules typeMerge functor deprecationMessage nestedTypes;
+inherit
name check merge emptyValue getSubOptions getSubModules substSubModules
typeMerge functor deprecationMessage nestedTypes descriptionClass;
description = if description == null then name else description;
};
# optionDescriptionPhrase :: (str -> bool) -> optionType -> str
#
# Helper function for producing unambiguous but readable natural language
# descriptions of types.
#
# Parameters
#
#    optionDescriptionPhrase unparenthesize optionType
#
# `unparenthesize`: A function from descriptionClass string to boolean.
# It must return true when the class of phrase will fit unambiguously into
# the description of the caller.
#
# `optionType`: The option type to parenthesize or not.
# The option whose description we're returning.
#
# Return value
#
# The description of the `optionType`, with parentheses if there may be an
# ambiguity.
optionDescriptionPhrase = unparenthesize: t:
if unparenthesize (t.descriptionClass or null)
then t.description
else "(${t.description})";
# When adding new types don't forget to document them in
# nixos/doc/manual/development/option-types.xml!
@@ -170,6 +202,7 @@ rec {
raw = mkOptionType rec {
name = "raw";
description = "raw value";
descriptionClass = "noun";
check = value: true;
merge = mergeOneOption;
};
@@ -177,6 +210,7 @@
anything = mkOptionType {
name = "anything";
description = "anything";
descriptionClass = "noun";
check = value: true;
merge = loc: defs:
let
@@ -216,12 +250,14 @@
};
unspecified = mkOptionType {
-name = "unspecified";
+name = "unspecified value";
descriptionClass = "noun";
};
bool = mkOptionType {
name = "bool";
description = "boolean";
descriptionClass = "noun";
check = isBool;
merge = mergeEqualOption;
};
@@ -229,6 +265,7 @@
int = mkOptionType {
name = "int";
description = "signed integer";
descriptionClass = "noun";
check = isInt;
merge = mergeEqualOption;
};
@@ -294,6 +331,7 @@
float = mkOptionType {
name = "float";
description = "floating point number";
descriptionClass = "noun";
check = isFloat;
merge = mergeEqualOption;
};
@@ -325,6 +363,7 @@
str = mkOptionType {
name = "str";
description = "string";
descriptionClass = "noun";
check = isString;
merge = mergeEqualOption;
};
@@ -332,6 +371,7 @@
nonEmptyStr = mkOptionType {
name = "nonEmptyStr";
description = "non-empty string";
descriptionClass = "noun";
check = x: str.check x && builtins.match "[ \t\n]*" x == null;
inherit (str) merge;
};
@@ -344,6 +384,7 @@
mkOptionType {
name = "singleLineStr";
description = "(optionally newline-terminated) single-line string";
descriptionClass = "noun";
inherit check;
merge = loc: defs:
lib.removeSuffix "\n" (merge loc defs);
@@ -352,6 +393,7 @@
strMatching = pattern: mkOptionType {
name = "strMatching ${escapeNixString pattern}";
description = "string matching the pattern ${pattern}";
descriptionClass = "noun";
check = x: str.check x && builtins.match pattern x != null;
inherit (str) merge;
};
@@ -364,6 +406,7 @@
then "Concatenated string" # for types.string.
else "strings concatenated with ${builtins.toJSON sep}"
;
descriptionClass = "noun";
check = isString;
merge = loc: defs: concatStringsSep sep (getValues defs);
functor = (defaultFunctor name) // {
@@ -387,7 +430,7 @@
passwdEntry = entryType: addCheck entryType (str: !(hasInfix ":" str || hasInfix "\n" str)) // {
name = "passwdEntry ${entryType.name}";
-description = "${entryType.description}, not containing newlines or colons";
+description = "${optionDescriptionPhrase (class: class == "noun") entryType}, not containing newlines or colons";
};
attrs = mkOptionType {
@@ -407,6 +450,7 @@
# ("/nix/store/hash-foo"). These get a context added to them using builtins.storePath.
package = mkOptionType {
name = "package";
descriptionClass = "noun";
check = x: isDerivation x || isStorePath x;
merge = loc: defs:
let res = mergeOneOption loc defs;
@@ -427,7 +471,8 @@
listOf = elemType: mkOptionType rec {
name = "listOf";
-description = "list of ${elemType.description}";
+description = "list of ${optionDescriptionPhrase (class: class == "noun" || class == "composite") elemType}";
descriptionClass = "composite";
check = isList;
merge = loc: defs:
map (x: x.value) (filter (x: x ? value) (concatLists (imap1 (n: def:
@@ -450,13 +495,14 @@
nonEmptyListOf = elemType:
let list = addCheck (types.listOf elemType) (l: l != []);
in list // {
-description = "non-empty " + list.description;
+description = "non-empty ${optionDescriptionPhrase (class: class == "noun") list}";
emptyValue = { }; # no .value attr, meaning unset
};
attrsOf = elemType: mkOptionType rec {
name = "attrsOf";
-description = "attribute set of ${elemType.description}";
+description = "attribute set of ${optionDescriptionPhrase (class: class == "noun" || class == "composite") elemType}";
descriptionClass = "composite";
check = isAttrs;
merge = loc: defs:
mapAttrs (n: v: v.value) (filterAttrs (n: v: v ? value) (zipAttrsWith (name: defs:
@@ -479,7 +525,8 @@
# error that it's not defined. Use only if conditional definitions don't make sense.
lazyAttrsOf = elemType: mkOptionType rec {
name = "lazyAttrsOf";
-description = "lazy attribute set of ${elemType.description}";
+description = "lazy attribute set of ${optionDescriptionPhrase (class: class == "noun" || class == "composite") elemType}";
descriptionClass = "composite";
check = isAttrs;
merge = loc: defs:
zipAttrsWith (name: defs:
@@ -509,7 +556,7 @@
# Value of given type but with no merging (i.e. `uniq list`s are not concatenated).
uniq = elemType: mkOptionType rec {
name = "uniq";
-inherit (elemType) description check;
+inherit (elemType) description descriptionClass check;
merge = mergeOneOption;
emptyValue = elemType.emptyValue;
getSubOptions = elemType.getSubOptions;
@@ -521,7 +568,7 @@
unique = { message }: type: mkOptionType rec {
name = "unique";
-inherit (type) description check;
+inherit (type) description descriptionClass check;
merge = mergeUniqueOption { inherit message; };
emptyValue = type.emptyValue;
getSubOptions = type.getSubOptions;
@@ -534,7 +581,8 @@
# Null or value of ...
nullOr = elemType: mkOptionType rec {
name = "nullOr";
-description = "null or ${elemType.description}";
+description = "null or ${optionDescriptionPhrase (class: class == "noun" || class == "conjunction") elemType}";
descriptionClass = "conjunction";
check = x: x == null || elemType.check x;
merge = loc: defs:
let nrNulls = count (def: def.value == null) defs; in
@@ -552,7 +600,8 @@
functionTo = elemType: mkOptionType {
name = "functionTo";
-description = "function that evaluates to a(n) ${elemType.description}";
+description = "function that evaluates to a(n) ${optionDescriptionPhrase (class: class == "noun" || class == "composite") elemType}";
descriptionClass = "composite";
check = isFunction;
merge = loc: defs:
fnArgs: (mergeDefinitions (loc ++ [ "[function body]" ]) elemType (map (fn: { inherit (fn) file; value = fn.value fnArgs; }) defs)).mergedValue;
@@ -578,6 +627,7 @@
deferredModuleWith = attrs@{ staticModules ? [] }: mkOptionType {
name = "deferredModule";
description = "module";
descriptionClass = "noun";
check = x: isAttrs x || isFunction x || path.check x;
merge = loc: defs: {
imports = staticModules ++ map (def: lib.setDefaultModuleLocation "${def.file}, via option ${showOption loc}" def.value) defs;
@@ -603,6 +653,7 @@
optionType = mkOptionType {
name = "optionType";
description = "optionType";
descriptionClass = "noun";
check = value: value._type or null == "option-type";
merge = loc: defs:
if length defs == 1
@@ -749,6 +800,10 @@
"value ${show (builtins.head values)} (singular enum)"
else
"one of ${concatMapStringsSep ", " show values}";
descriptionClass =
if builtins.length values < 2
then "noun"
else "conjunction";
check = flip elem values;
merge = mergeEqualOption;
functor = (defaultFunctor name) // { payload = values; binOp = a: b: unique (a ++ b); };
@@ -757,7 +812,8 @@
# Either value of type `t1` or `t2`.
either = t1: t2: mkOptionType rec {
name = "either";
-description = "${t1.description} or ${t2.description}";
+description = "${optionDescriptionPhrase (class: class == "noun" || class == "conjunction") t1} or ${optionDescriptionPhrase (class: class == "noun" || class == "conjunction" || class == "composite") t2}";
descriptionClass = "conjunction";
check = x: t1.check x || t2.check x;
merge = loc: defs:
let
@@ -795,7 +851,7 @@
coercedType.description})";
mkOptionType rec {
name = "coercedTo";
-description = "${finalType.description} or ${coercedType.description} convertible to it";
+description = "${optionDescriptionPhrase (class: class == "noun") finalType} or ${optionDescriptionPhrase (class: class == "noun") coercedType} convertible to it";
check = x: (coercedType.check x && finalType.check (coerceFunc x)) || finalType.check x;
merge = loc: defs:
let
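A rough sketch, assuming `lib` from this revision is imported from the checkout root, of how `descriptionClass` and `optionDescriptionPhrase` change the rendered type descriptions; the expected strings mirror the tests added to `lib/tests/misc.nix` above:

```nix
let
  lib = import ./lib;   # assumes evaluation from the root of this nixpkgs checkout
  inherit (lib) types;
in {
  plain   = (types.listOf types.int).description;
  # => "list of signed integer"
  wrapped = (types.either (types.listOf types.bool) types.str).description;
  # => "(list of boolean) or string" (the composite left operand is parenthesized)
  flat    = (types.either types.str (types.listOf types.bool)).description;
  # => "string or list of boolean" (a composite right operand needs no parentheses)
  nested  = (types.either (types.nullOr types.bool) types.str).description;
  # => "null or boolean or string" (conjunctions nest without extra parentheses)
}
```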


@@ -373,6 +373,12 @@
githubId = 10677343;
name = "Eugene";
};
afh = {
email = "surryhill+nix@gmail.com";
github = "afh";
githubId = 16507;
name = "Alexis Hildebrandt";
};
aflatter = {
email = "flatter@fastmail.fm";
github = "aflatter";
@@ -597,6 +603,12 @@
githubId = 782180;
name = "Alex Vorobiev";
};
alexwinter = {
email = "git@alexwinter.net";
github = "lxwntr";
githubId = 50754358;
name = "Alex Winter";
};
alexeyre = {
email = "A.Eyre@sms.ed.ac.uk";
github = "alexeyre";
@@ -2443,6 +2455,12 @@
githubId = 42220376;
name = "Charlotte Van Petegem";
};
ciferkey = {
name = "Matthew Brunelle";
email = "ciferkey@gmail.com";
github = "ciferkey";
githubId = 101422;
};
cigrainger = {
name = "Christopher Grainger";
email = "chris@amplified.ai";
@@ -3739,6 +3757,12 @@
github = "edlimerkaj";
githubId = 71988351;
};
ehllie = {
email = "me@ehllie.xyz";
github = "ehllie";
githubId = 20847625;
name = "Elizabeth Paź";
};
elliottslaughter = {
name = "Elliott Slaughter";
email = "elliottslaughter@gmail.com";
@@ -4764,6 +4788,12 @@
githubId = 16470252;
name = "Gemini Lasswell";
};
gbpdt = {
email = "nix@pdtpartners.com";
github = "gbpdt";
githubId = 25106405;
name = "Graham Bennett";
};
gbtb = {
email = "goodbetterthebeast3@gmail.com";
github = "gbtb";
@@ -8086,6 +8116,13 @@
githubId = 109141;
name = "Georges Dubus";
};
madonius = {
email = "nixos@madoni.us";
github = "madonius";
githubId = 1246752;
name = "madonius";
matrix = "@madonius:entropia.de";
};
Madouura = {
email = "madouura@gmail.com";
github = "Madouura";
@@ -10690,6 +10727,12 @@
fingerprint = "48AD DE10 F27B AFB4 7BB0 CCAF 2D25 95A0 0D08 ACE0";
}];
};
posch = {
email = "tp@fonz.de";
github = "posch";
githubId = 146413;
name = "Tobias Poschwatta";
};
ppenguin = {
name = "Jeroen Versteeg";
email = "hieronymusv@gmail.com";
@@ -13239,6 +13282,12 @@
}];
name = "Karim Vergnes";
};
thetallestjj = {
email = "me+nixpkgs@jeroen-jetten.com";
github = "thetallestjj";
githubId = 6579555;
name = "Jeroen Jetten";
};
theuni = {
email = "ct@flyingcircus.io";
github = "ctheune";
@@ -13383,6 +13432,12 @@
githubId = 1292007;
name = "Sébastien Maccagnoni";
};
tirex = {
email = "szymon@kliniewski.pl";
name = "Szymon Kliniewski";
github = "NoneTirex";
githubId = 26038207;
};
titanous = {
email = "jonathan@titanous.com";
github = "titanous";
@@ -15326,14 +15381,19 @@
githubId = 31394095;
};
cafkafk = {
-email = "cafkafk@cafkafk.com";
+email = "christina@cafkafk.com";
matrix = "@cafkafk:matrix.cafkafk.com";
name = "Christina Sørensen";
github = "cafkafk";
githubId = 89321978;
-keys = [{
-fingerprint = "7B9E E848 D074 AE03 7A0C 651A 8ED4 DEF7 375A 30C8";
-}];
+keys = [
+{
+fingerprint = "7B9E E848 D074 AE03 7A0C 651A 8ED4 DEF7 375A 30C8";
}
{
fingerprint = "208A 2A66 8A2F CDE7 B5D3 8F64 CDDC 792F 6552 51ED";
}
];
};
rb = {
email = "maintainers@cloudposse.com";
@@ -15347,4 +15407,16 @@
githubId = 115711;
name = "bpaulin";
};
zuzuleinen = {
email = "andrey.boar@gmail.com";
name = "Andrei Boar";
github = "zuzuleinen";
githubId = 944919;
};
quasigod-io = {
email = "quasigod-io@protonmail.com";
name = "Michael Belsanti";
github = "quasigod-io";
githubId = 62124625;
};
}


@@ -58,7 +58,14 @@ sed -r \
-e '/ jailbreak-cabal /d' \
-e '/ language-nix /d' \
-e '/ cabal-install /d' \
-e '/ lsp /d' \
-e '/ lsp-types /d' \
-e '/ lsp-test /d' \
-e '/ hie-bios /d' \
< "${tmpfile_new}" >> $stackage_config
# Explanations:
# cabal2nix, distribution-nixpkgs, jailbreak-cabal, language-nix: These are our packages and we know what we are doing.
# lsp, lsp-types, lsp-test, hie-bios: These are tightly coupled to hls which is not in stackage. They have no rdeps in stackage.
if [[ "${1:-}" == "--do-commit" ]]; then
git add $stackage_config


@@ -13,6 +13,9 @@ import tempfile
class CalledProcessError(Exception):
process: asyncio.subprocess.Process
class UpdateFailedException(Exception):
pass
def eprint(*args, **kwargs):
print(*args, file=sys.stderr, **kwargs)
@@ -69,7 +72,7 @@ async def run_update_script(nixpkgs_root: str, merge_lock: asyncio.Lock, temp_di
eprint(f"--- SHOWING ERROR LOG FOR {package['name']} ----------------------")
if not keep_going:
-raise asyncio.exceptions.CancelledError()
+raise UpdateFailedException(f"The update script for {package['name']} failed with exit code {e.process.returncode}")
@contextlib.contextmanager
def make_worktree() -> Generator[Tuple[str, str], None, None]:
@@ -185,9 +188,14 @@ async def start_updates(max_workers: int, keep_going: bool, commit: bool, packag
try:
# Start updater workers.
await updaters
-except asyncio.exceptions.CancelledError as e:
+except asyncio.exceptions.CancelledError:
# When one worker is cancelled, cancel the others too.
updaters.cancel()
except UpdateFailedException as e:
# When one worker fails, cancel the others, as this exception is only thrown when keep_going is false.
updaters.cancel()
eprint(e)
sys.exit(1)
def main(max_workers: int, keep_going: bool, commit: bool, packages_path: str) -> None:
with open(packages_path) as f:


@@ -124,14 +124,14 @@ lib.mkOption {
```nix
lib.mkPackageOption pkgs "GHC" {
  default = [ "ghc" ];
-  example = "pkgs.haskell.packages.ghc924.ghc.withPackages (hkgs: [ hkgs.primes ])";
+  example = "pkgs.haskell.packages.ghc92.ghc.withPackages (hkgs: [ hkgs.primes ])";
}
# is like
lib.mkOption {
  type = lib.types.package;
  default = pkgs.ghc;
  defaultText = lib.literalExpression "pkgs.ghc";
-  example = lib.literalExpression "pkgs.haskell.packages.ghc924.ghc.withPackages (hkgs: [ hkgs.primes ])";
+  example = lib.literalExpression "pkgs.haskell.packages.ghc92.ghc.withPackages (hkgs: [ hkgs.primes ])";
  description = "The GHC package to use.";
}
```


@@ -189,14 +189,14 @@ lib.mkOption {
<programlisting language="bash">
lib.mkPackageOption pkgs &quot;GHC&quot; {
  default = [ &quot;ghc&quot; ];
-  example = &quot;pkgs.haskell.packages.ghc924.ghc.withPackages (hkgs: [ hkgs.primes ])&quot;;
+  example = &quot;pkgs.haskell.packages.ghc92.ghc.withPackages (hkgs: [ hkgs.primes ])&quot;;
}
# is like
lib.mkOption {
  type = lib.types.package;
  default = pkgs.ghc;
  defaultText = lib.literalExpression &quot;pkgs.ghc&quot;;
-  example = lib.literalExpression &quot;pkgs.haskell.packages.ghc924.ghc.withPackages (hkgs: [ hkgs.primes ])&quot;;
+  example = lib.literalExpression &quot;pkgs.haskell.packages.ghc92.ghc.withPackages (hkgs: [ hkgs.primes ])&quot;;
  description = &quot;The GHC package to use.&quot;;
}
</programlisting>


@@ -1469,6 +1469,16 @@ Superuser created successfully.
extent.
</para>
</listitem>
<listitem>
<para>
<literal>pkgs.haskell-language-server</literal> will now by
default be linked dynamically to improve TemplateHaskell
compatibility. To mitigate the increased closure size it will
now by default only support our current default ghc (at the
moment 9.0.2). Add other ghc versions via e.g.
<literal>pkgs.haskell-language-server.override { supportedGhcVersions = [ &quot;90&quot; &quot;92&quot; ]; }</literal>.
</para>
</listitem>
</itemizedlist>
</section>
<section xml:id="sec-release-21.11-notable-changes">
@@ -2087,6 +2097,18 @@
<literal>java-packages.compiler</literal>.
</para>
</listitem>
<listitem>
<para>
The sets <literal>haskell.packages</literal> and
<literal>haskell.compiler</literal> now contain for every ghc
version an attribute with the minor version dropped. E.g. for
<literal>ghc8107</literal> there also now exists
<literal>ghc810</literal>. Those attributes point to the same
compilers and packagesets but have the advantage that e.g.
<literal>ghc92</literal> stays stable when we update from
<literal>ghc924</literal> to <literal>ghc925</literal>.
</para>
</listitem>
</itemizedlist>
</section>
</section>


@@ -123,6 +123,13 @@
PHP now defaults to PHP 8.1, updated from 8.0.
</para>
</listitem>
<listitem>
<para>
Perl has been updated to 5.36, and its core module
<literal>HTTP::Tiny</literal> was patched to verify SSL/TLS
certificates by default.
</para>
</listitem>
<listitem>
<para>
Cinnamon has been updated to 5.4. While at it, the cinnamon
@@ -225,6 +232,13 @@
<link linkend="opt-services.outline.enable">services.outline</link>.
</para>
</listitem>
<listitem>
<para>
<link xlink:href="https://git.sr.ht/~migadu/alps">alps</link>,
a simple and extensible webmail. Available as
<link linkend="opt-services.alps.enable">services.alps</link>.
</para>
</listitem>
<listitem>
<para>
<link xlink:href="https://netbird.io">netbird</link>, a zero
@@ -309,6 +323,13 @@
<link xlink:href="options.html#opt-services.writefreely.enable">services.writefreely</link>.
</para>
</listitem>
<listitem>
<para>
<link xlink:href="https://listmonk.app">Listmonk</link>, a
self-hosted newsletter manager. Enable using
<link xlink:href="options.html#opt-services.listmonk.enable">services.listmonk</link>.
</para>
</listitem>
</itemizedlist>
</section>
<section xml:id="sec-release-22.11-incompatibilities">
@@ -470,6 +491,14 @@
<literal>services.datadog-agent</literal> module.
</para>
</listitem>
<listitem>
<para>
lemmy module option
<literal>services.lemmy.settings.database.createLocally</literal>
moved to
<literal>services.lemmy.database.createLocally</literal>.
</para>
</listitem>
<listitem>
<para>
virtlyst package and <literal>services.virtlyst</literal>
@@ -486,6 +515,14 @@
been removed due to lack of upstream maintenance.
</para>
</listitem>
<listitem>
<para>
The <literal>aws</literal> package has been removed due to
being abandoned by the upstream. It is recommended to use
<literal>awscli</literal> or <literal>awscli2</literal>
instead.
</para>
</listitem>
<listitem>
<para>
The <literal>meta.mainProgram</literal> attribute of packages
@@ -526,6 +563,15 @@
option, and it is enabled by default, for servers.
</para>
</listitem>
<listitem>
<para>
<literal>stylua</literal> no longer accepts
<literal>lua52Support</literal> and
<literal>luauSupport</literal> overrides, use
<literal>features</literal> instead, which defaults to
<literal>[ &quot;lua54&quot; &quot;luau&quot; ]</literal>.
</para>
</listitem>
</itemizedlist>
</section>
<section xml:id="sec-release-22.11-notable-changes">


@@ -437,6 +437,8 @@ In addition to numerous new and upgraded packages, this release has the followin
- `/usr` will always be included in the initial ramdisk. See the `fileSystems.<name>.neededForBoot` option.
If any files exist under `/usr` (which is not typical for NixOS), they will be included in the initial ramdisk, increasing its size to a possibly problematic extent.
- `pkgs.haskell-language-server` will now by default be linked dynamically to improve TemplateHaskell compatibility. To mitigate the increased closure size it will now by default only support our current default ghc (at the moment 9.0.2). Add other ghc versions via e.g. `pkgs.haskell-language-server.override { supportedGhcVersions = [ "90" "92" ]; }`.
## Other Notable Changes {#sec-release-21.11-notable-changes}
@@ -573,3 +575,5 @@ In addition to numerous new and upgraded packages, this release has the followin
- hydrus has been upgraded from version `438` to `463`. Since upgrading between releases this old is advised against, be sure to have a backup of your data before upgrading. For details, see [the hydrus manual](https://hydrusnetwork.github.io/hydrus/help/getting_started_installing.html#big_updates).
- More jdk and jre versions are now exposed via `java-packages.compiler`.
- The sets `haskell.packages` and `haskell.compiler` now contain for every ghc version an attribute with the minor version dropped. E.g. for `ghc8107` there also now exists `ghc810`. Those attributes point to the same compilers and packagesets but have the advantage that e.g. `ghc92` stays stable when we update from `ghc924` to `ghc925`.


@@ -50,6 +50,8 @@ In addition to numerous new and upgraded packages, this release has the followin
- PHP now defaults to PHP 8.1, updated from 8.0.
- Perl has been updated to 5.36, and its core module `HTTP::Tiny` was patched to verify SSL/TLS certificates by default.
- Cinnamon has been updated to 5.4. While at it, the cinnamon module now defaults to
blueman as bluetooth manager and slick-greeter as lightdm greeter to match upstream.
@@ -83,6 +85,8 @@ In addition to numerous new and upgraded packages, this release has the followin
- [Outline](https://www.getoutline.com/), a wiki and knowledge base similar to Notion. Available as [services.outline](#opt-services.outline.enable).
- [alps](https://git.sr.ht/~migadu/alps), a simple and extensible webmail. Available as [services.alps](#opt-services.alps.enable).
- [netbird](https://netbird.io), a zero configuration VPN.
Available as [services.netbird](options.html#opt-services.netbird.enable).
@@ -107,6 +111,8 @@ Available as [services.patroni](options.html#opt-services.patroni.enable).
- [WriteFreely](https://writefreely.org), a simple blogging platform with ActivityPub support. Available as [services.writefreely](options.html#opt-services.writefreely.enable).
- [Listmonk](https://listmonk.app), a self-hosted newsletter manager. Enable using [services.listmonk](options.html#opt-services.listmonk.enable).
<!-- To avoid merge conflicts, consider adding your item at an arbitrary place in the list instead. -->
## Backward Incompatibilities {#sec-release-22.11-incompatibilities}
@@ -162,6 +168,9 @@ Available as [services.patroni](options.html#opt-services.patroni.enable).
- dd-agent package removed along with the `services.dd-agent` module, due to the project being deprecated in favor of `datadog-agent`, which is available via the `services.datadog-agent` module.
- lemmy module option `services.lemmy.settings.database.createLocally`
moved to `services.lemmy.database.createLocally`.
- virtlyst package and `services.virtlyst` module removed, due to lack of maintainers.
- The `services.graphite.api` and `services.graphite.beacon` NixOS options, and
@@ -169,6 +178,8 @@ Available as [services.patroni](options.html#opt-services.patroni.enable).
`python3.pkgs.influxgraph` packages, have been removed due to lack of upstream
maintenance.
- The `aws` package has been removed due to being abandoned by the upstream. It is recommended to use `awscli` or `awscli2` instead.
- The `meta.mainProgram` attribute of packages in `wineWowPackages` now defaults to `"wine64"`.
- The `paperless` module now defaults `PAPERLESS_TIME_ZONE` to your configured system timezone.
@@ -181,6 +192,8 @@ Available as [services.patroni](options.html#opt-services.patroni.enable).
- `k3s` supports `clusterInit` option, and it is enabled by default, for servers.
- `stylua` no longer accepts `lua52Support` and `luauSupport` overrides, use `features` instead, which defaults to `[ "lua54" "luau" ]`.
<!-- To avoid merge conflicts, consider adding your item at an arbitrary place in the list instead. -->
## Other Notable Changes {#sec-release-22.11-notable-changes}


@@ -505,6 +505,7 @@
./services/mail/dovecot.nix
./services/mail/dspam.nix
./services/mail/exim.nix
./services/mail/listmonk.nix
./services/mail/maddy.nix
./services/mail/mail.nix
./services/mail/mailcatcher.nix
@@ -1054,6 +1055,7 @@
./services/video/epgstation/default.nix
./services/video/mirakurun.nix
./services/video/replay-sorcery.nix
./services/web-apps/alps.nix
./services/web-apps/atlassian/confluence.nix
./services/web-apps/atlassian/crowd.nix
./services/web-apps/atlassian/jira.nix
@@ -1266,6 +1268,7 @@
./tasks/network-interfaces-scripted.nix
./tasks/scsi-link-power-management.nix
./tasks/snapraid.nix
./tasks/stratis.nix
./tasks/swraid.nix
./tasks/trackpoint.nix
./tasks/powertop.nix


@@ -79,7 +79,6 @@ in {
"--verbose"
"--debug"
"--unsupported-gpu"
-"--my-next-gpu-wont-be-nvidia"
];
description = lib.mdDoc ''
Command line arguments passed to launch Sway. Please DO NOT report


@@ -34,10 +34,6 @@ in
services.udev.packages = [ pkgs.usbrelayd ];
systemd.packages = [ pkgs.usbrelayd ];
-users.users.usbrelay = {
-isSystemUser = true;
-group = "usbrelay";
-};
users.groups.usbrelay = { };
};


@@ -0,0 +1,222 @@
{ config, lib, pkgs, ... }:
with lib;
let
cfg = config.services.listmonk;
tomlFormat = pkgs.formats.toml { };
cfgFile = tomlFormat.generate "listmonk.toml" cfg.settings;
# Escaping is done according to https://www.postgresql.org/docs/current/sql-syntax-lexical.html#SQL-SYNTAX-CONSTANTS
setDatabaseOption = key: value:
"UPDATE settings SET value = '${
lib.replaceChars [ "'" ] [ "''" ] (builtins.toJSON value)
}' WHERE key = '${key}';";
updateDatabaseConfigSQL = pkgs.writeText "update-database-config.sql"
(concatStringsSep "\n" (mapAttrsToList setDatabaseOption
(if (cfg.database.settings != null) then
cfg.database.settings
else
{ })));
updateDatabaseConfigScript =
pkgs.writeShellScriptBin "update-database-config.sh" ''
${if cfg.database.mutableSettings then ''
if [ ! -f /var/lib/listmonk/.db_settings_initialized ]; then
${pkgs.postgresql}/bin/psql -d listmonk -f ${updateDatabaseConfigSQL} ;
touch /var/lib/listmonk/.db_settings_initialized
fi
'' else
"${pkgs.postgresql}/bin/psql -d listmonk -f ${updateDatabaseConfigSQL}"}
'';
databaseSettingsOpts = with types; {
freeformType =
oneOf [ (listOf str) (listOf (attrsOf anything)) str int bool ];
options = {
"app.notify_emails" = mkOption {
type = listOf str;
default = [ ];
description = lib.mdDoc "Administrator emails for system notifications";
};
"privacy.exportable" = mkOption {
type = listOf str;
default = [ "profile" "subscriptions" "campaign_views" "link_clicks" ];
description = lib.mdDoc
"List of fields which can be exported through an automatic export request";
};
"privacy.domain_blocklist" = mkOption {
type = listOf str;
default = [ ];
description = lib.mdDoc
"E-mail addresses with these domains are disallowed from subscribing.";
};
smtp = mkOption {
type = listOf (submodule {
freeformType = with types; attrsOf (oneOf [ str int bool ]);
options = {
enabled = mkEnableOption (lib.mdDoc "this SMTP server for listmonk");
host = mkOption {
type = types.str;
description = lib.mdDoc "Hostname for the SMTP server";
};
port = mkOption {
type = types.port;
description = lib.mdDoc "Port for the SMTP server";
};
max_conns = mkOption {
type = types.int;
description = lib.mdDoc
"Maximum number of simultaneous connections, defaults to 1";
default = 1;
};
tls_type = mkOption {
type = types.enum [ "none" "STARTTLS" "TLS" ];
description =
lib.mdDoc "Type of TLS authentication with the SMTP server";
};
};
});
description = lib.mdDoc "List of outgoing SMTP servers";
};
# TODO: refine this type based on the smtp one.
"bounce.mailboxes" = mkOption {
type = listOf
(submodule { freeformType = with types; oneOf [ str int bool ]; });
default = [ ];
description = lib.mdDoc "List of bounce mailboxes";
};
messengers = mkOption {
type = listOf str;
default = [ ];
description = lib.mdDoc
"List of messengers, see: <https://github.com/knadh/listmonk/blob/master/models/settings.go#L64-L74> for options.";
};
};
};
in {
###### interface
options = {
services.listmonk = {
enable = mkEnableOption
(lib.mdDoc "Listmonk, this module assumes a reverse proxy to be set");
database = {
createLocally = mkOption {
type = types.bool;
default = false;
description = lib.mdDoc
"Create the PostgreSQL database and database user locally.";
};
settings = mkOption {
default = null;
type = with types; nullOr (submodule databaseSettingsOpts);
description = lib.mdDoc
"Dynamic settings in the PostgreSQL database, set by a SQL script, see <https://github.com/knadh/listmonk/blob/master/schema.sql#L177-L230> for details.";
};
mutableSettings = mkOption {
type = types.bool;
default = true;
description = lib.mdDoc ''
Database settings will be reset to the value set in this module if this is not enabled.
Enable this if you want to persist changes you have done in the application.
'';
};
};
package = mkPackageOption pkgs "listmonk" {};
settings = mkOption {
type = types.submodule { freeformType = tomlFormat.type; };
description = lib.mdDoc ''
Static settings set in the config.toml, see <https://github.com/knadh/listmonk/blob/master/config.toml.sample> for details.
You can set secrets using the secretFile option with environment variables following <https://listmonk.app/docs/configuration/#environment-variables>.
'';
};
secretFile = mkOption {
type = types.nullOr types.str;
default = null;
description = lib.mdDoc
"A file containing secrets as environment variables. See <https://listmonk.app/docs/configuration/#environment-variables> for details on supported values.";
};
};
};
###### implementation
config = mkIf cfg.enable {
# Default parameters from https://github.com/knadh/listmonk/blob/master/config.toml.sample
services.listmonk.settings."app".address = mkDefault "localhost:9000";
services.listmonk.settings."db" = mkMerge [
({
max_open = mkDefault 25;
max_idle = mkDefault 25;
max_lifetime = mkDefault "300s";
})
(mkIf cfg.database.createLocally {
host = mkDefault "/run/postgresql";
port = mkDefault 5432;
user = mkDefault "listmonk";
database = mkDefault "listmonk";
})
];
services.postgresql = mkIf cfg.database.createLocally {
enable = true;
ensureUsers = [{
name = "listmonk";
ensurePermissions = { "DATABASE listmonk" = "ALL PRIVILEGES"; };
}];
ensureDatabases = [ "listmonk" ];
};
systemd.services.listmonk = {
description = "Listmonk - newsletter and mailing list manager";
after = [ "network.target" ]
++ optional cfg.database.createLocally "postgresql.service";
wantedBy = [ "multi-user.target" ];
serviceConfig = {
Type = "exec";
EnvironmentFile = mkIf (cfg.secretFile != null) [ cfg.secretFile ];
ExecStartPre = [
# StateDirectory cannot be used when DynamicUser = true is set this way.
# Indeed, it will try to create all the folders and realize one of them already exist.
# Therefore, we have to create it ourselves.
''${pkgs.coreutils}/bin/mkdir -p "''${STATE_DIRECTORY}/listmonk/uploads"''
"${cfg.package}/bin/listmonk --config ${cfgFile} --idempotent --install --upgrade --yes"
"${updateDatabaseConfigScript}/bin/update-database-config.sh"
];
ExecStart = "${cfg.package}/bin/listmonk --config ${cfgFile}";
Restart = "on-failure";
StateDirectory = [ "listmonk" ];
User = "listmonk";
Group = "listmonk";
DynamicUser = true;
NoNewPrivileges = true;
CapabilityBoundingSet = "";
SystemCallArchitectures = "native";
SystemCallFilter = [ "@system-service" "~@privileged" "@resources" ];
PrivateDevices = true;
ProtectControlGroups = true;
ProtectKernelTunables = true;
ProtectHome = true;
DeviceAllow = false;
RestrictNamespaces = true;
RestrictRealtime = true;
UMask = "0027";
MemoryDenyWriteExecute = true;
LockPersonality = true;
RestrictAddressFamilies = [ "AF_INET" "AF_INET6" "AF_UNIX" ];
ProtectKernelModules = true;
PrivateUsers = true;
};
};
};
}
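A minimal sketch of enabling the module above from a NixOS configuration; the secret file path is hypothetical:

```nix
{
  services.listmonk = {
    enable = true;
    database.createLocally = true;
    # Static config.toml settings; secrets can instead be passed via secretFile
    # as LISTMONK_* environment variables.
    settings.app.address = "localhost:9000";
    secretFile = "/run/secrets/listmonk.env"; # hypothetical path
  };
}
```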


@@ -30,7 +30,7 @@
synapse server for the <literal>example.org</literal> domain, served from
the host <literal>myhostname.example.org</literal>. For more information,
please refer to the
-<link xlink:href="https://github.com/matrix-org/synapse#synapse-installation">
+<link xlink:href="https://matrix-org.github.io/synapse/latest/setup/installation.html">
installation instructions of Synapse </link>.
<programlisting>
{ pkgs, lib, config, ... }:


@@ -227,6 +227,16 @@ in
The hostname of the build machine.
'';
};
protocol = mkOption {
type = types.enum [ "ssh" "ssh-ng" ];
default = "ssh";
example = "ssh-ng";
description = lib.mdDoc ''
The protocol used for communicating with the build machine.
Use `ssh-ng` if your remote builder and your
local Nix version support that improved protocol.
'';
};
system = mkOption { system = mkOption {
type = types.nullOr types.str; type = types.nullOr types.str;
default = null; default = null;
@ -670,7 +680,7 @@ in
concatMapStrings concatMapStrings
(machine: (machine:
(concatStringsSep " " ([ (concatStringsSep " " ([
"${optionalString (machine.sshUser != null) "${machine.sshUser}@"}${machine.hostName}" "${machine.protocol}://${optionalString (machine.sshUser != null) "${machine.sshUser}@"}${machine.hostName}"
(if machine.system != null then machine.system else if machine.systems != [ ] then concatStringsSep "," machine.systems else "-") (if machine.system != null then machine.system else if machine.systems != [ ] then concatStringsSep "," machine.systems else "-")
(if machine.sshKey != null then machine.sshKey else "-") (if machine.sshKey != null then machine.sshKey else "-")
(toString machine.maxJobs) (toString machine.maxJobs)
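
A hedged sketch of a `nix.buildMachines` entry exercising the new `protocol` option; the host name, user, key path and system list are placeholders:

```nix
{
  nix.buildMachines = [{
    hostName = "builder.example.org";     # placeholder
    protocol = "ssh-ng";                  # new option; "ssh" remains the default
    sshUser = "nixbuilder";               # placeholder
    sshKey = "/etc/nix/builder_ed25519";  # placeholder
    systems = [ "x86_64-linux" "aarch64-linux" ];
    maxJobs = 4;
  }];
}
```

With the change above, the generated builders line starts with `ssh-ng://nixbuilder@builder.example.org` instead of a bare host name.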

View file

@ -248,6 +248,8 @@ in
"-/etc/localtime" "-/etc/localtime"
"-/etc/kanidm" "-/etc/kanidm"
"-/etc/static/kanidm" "-/etc/static/kanidm"
"-/etc/ssl"
"-/etc/static/ssl"
]; ];
BindPaths = [ BindPaths = [
# To create the socket # To create the socket

View file

@ -0,0 +1,96 @@
{ lib, pkgs, config, ... }:
with lib;
let
cfg = config.services.alps;
in {
options.services.alps = {
enable = mkEnableOption (lib.mdDoc "alps");
port = mkOption {
type = types.port;
default = 1323;
description = lib.mdDoc ''
TCP port the service should listen on.
'';
};
bindIP = mkOption {
default = "[::]";
type = types.str;
description = lib.mdDoc ''
The IP the service should listen on.
'';
};
theme = mkOption {
type = types.enum [ "alps" "sourcehut" ];
default = "sourcehut";
description = lib.mdDoc ''
The frontend's theme to use.
'';
};
imaps = {
port = mkOption {
type = types.port;
default = 993;
description = lib.mdDoc ''
The IMAPS server port.
'';
};
host = mkOption {
type = types.str;
default = "[::1]";
example = "mail.example.org";
description = lib.mdDoc ''
The IMAPS server address.
'';
};
};
smtps = {
port = mkOption {
type = types.port;
default = 465;
description = lib.mdDoc ''
The SMTPS server port.
'';
};
host = mkOption {
type = types.str;
default = cfg.imaps.host;
defaultText = "services.alps.imaps.host";
example = "mail.example.org";
description = lib.mdDoc ''
The SMTPS server address.
'';
};
};
};
config = mkIf cfg.enable {
systemd.services.alps = {
description = "alps is a simple and extensible webmail.";
documentation = [ "https://git.sr.ht/~migadu/alps" ];
wantedBy = [ "multi-user.target" ];
after = [ "network.target" "network-online.target" ];
serviceConfig = {
ExecStart = ''
${pkgs.alps}/bin/alps \
-addr ${cfg.bindIP}:${toString cfg.port} \
-theme ${cfg.theme} \
imaps://${cfg.imaps.host}:${toString cfg.imaps.port} \
smtps://${cfg.smtps.host}:${toString cfg.smtps.port}
'';
StateDirectory = "alps";
WorkingDirectory = "/var/lib/alps";
DynamicUser = true;
};
};
};
}
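
A hedged example of enabling the module above against an existing mail server; the host name is a placeholder, and `smtps.host` follows `imaps.host` by default:

```nix
{
  services.alps = {
    enable = true;
    theme = "alps";
    # Point the webmail frontend at an existing IMAPS/SMTPS endpoint.
    imaps.host = "mail.example.org";  # placeholder
  };
}
```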

View file

@ -28,6 +28,8 @@ in
caddy.enable = mkEnableOption (lib.mdDoc "exposing lemmy with the caddy reverse proxy"); caddy.enable = mkEnableOption (lib.mdDoc "exposing lemmy with the caddy reverse proxy");
database.createLocally = mkEnableOption (lib.mdDoc "creation of database on the instance");
settings = mkOption { settings = mkOption {
default = { }; default = { };
description = lib.mdDoc "Lemmy configuration"; description = lib.mdDoc "Lemmy configuration";
@ -63,18 +65,12 @@ in
description = lib.mdDoc "The difficultly of the captcha to solve."; description = lib.mdDoc "The difficultly of the captcha to solve.";
}; };
}; };
options.database.createLocally = mkEnableOption (lib.mdDoc "creation of database on the instance");
}; };
}; };
}; };
config = config =
let
localPostgres = (cfg.settings.database.host == "localhost" || cfg.settings.database.host == "/run/postgresql");
in
lib.mkIf cfg.enable { lib.mkIf cfg.enable {
services.lemmy.settings = (mapAttrs (name: mkDefault) services.lemmy.settings = (mapAttrs (name: mkDefault)
{ {
@ -101,8 +97,13 @@ in
}; };
}); });
services.postgresql = mkIf localPostgres { services.postgresql = mkIf cfg.database.createLocally {
enable = mkDefault true; enable = true;
ensureDatabases = [ cfg.settings.database.database ];
ensureUsers = [{
name = cfg.settings.database.user;
ensurePermissions."DATABASE ${cfg.settings.database.database}" = "ALL PRIVILEGES";
}];
}; };
services.pict-rs.enable = true; services.pict-rs.enable = true;
@ -142,7 +143,7 @@ in
}; };
assertions = [{ assertions = [{
assertion = cfg.settings.database.createLocally -> localPostgres; assertion = cfg.database.createLocally -> cfg.settings.database.host == "localhost" || cfg.settings.database.host == "/run/postgresql";
message = "if you want to create the database locally, you need to use a local database"; message = "if you want to create the database locally, you need to use a local database";
}]; }];
@ -163,9 +164,9 @@ in
wantedBy = [ "multi-user.target" ]; wantedBy = [ "multi-user.target" ];
after = [ "pict-rs.service" ] ++ lib.optionals cfg.settings.database.createLocally [ "lemmy-postgresql.service" ]; after = [ "pict-rs.service" ] ++ lib.optionals cfg.database.createLocally [ "postgresql.service" ];
requires = lib.optionals cfg.settings.database.createLocally [ "lemmy-postgresql.service" ]; requires = lib.optionals cfg.database.createLocally [ "postgresql.service" ];
serviceConfig = { serviceConfig = {
DynamicUser = true; DynamicUser = true;
@ -202,27 +203,6 @@ in
ExecStart = "${pkgs.nodejs}/bin/node ${pkgs.lemmy-ui}/dist/js/server.js"; ExecStart = "${pkgs.nodejs}/bin/node ${pkgs.lemmy-ui}/dist/js/server.js";
}; };
}; };
systemd.services.lemmy-postgresql = mkIf cfg.settings.database.createLocally {
description = "Lemmy postgresql db";
after = [ "postgresql.service" ];
partOf = [ "lemmy.service" ];
script = with cfg.settings.database; ''
PSQL() {
${config.services.postgresql.package}/bin/psql --port=${toString cfg.settings.database.port} "$@"
}
# check if the database already exists
if ! PSQL -lqt | ${pkgs.coreutils}/bin/cut -d \| -f 1 | ${pkgs.gnugrep}/bin/grep -qw ${database} ; then
PSQL -tAc "CREATE ROLE ${user} WITH LOGIN;"
PSQL -tAc "CREATE DATABASE ${database} WITH OWNER ${user};"
fi
'';
serviceConfig = {
User = config.services.postgresql.superUser;
Type = "oneshot";
RemainAfterExit = true;
};
};
}; };
} }
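
A hedged sketch of a configuration using the relocated option (it now lives at `services.lemmy.database.createLocally`, outside `settings`); the hostname and port are placeholders, and the NixOS test further below exercises the same shape:

```nix
{
  services.lemmy = {
    enable = true;
    database.createLocally = true;  # provisions the local PostgreSQL role and database
    settings = {
      hostname = "lemmy.example.org";  # placeholder
      port = 8536;                     # placeholder backend port
    };
  };
}
```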

View file

@ -449,6 +449,7 @@ in
--dbuser ${cfg.database.user} \ --dbuser ${cfg.database.user} \
${optionalString (cfg.database.passwordFile != null) "--dbpassfile ${cfg.database.passwordFile}"} \ ${optionalString (cfg.database.passwordFile != null) "--dbpassfile ${cfg.database.passwordFile}"} \
--passfile ${cfg.passwordFile} \ --passfile ${cfg.passwordFile} \
--dbtype ${cfg.database.type} \
${cfg.name} \ ${cfg.name} \
admin admin

View file

@ -127,7 +127,7 @@ in {
WorkingDirectory = "/var/lib/${cfg.stateDirectoryName}"; WorkingDirectory = "/var/lib/${cfg.stateDirectoryName}";
DynamicUser = true; DynamicUser = true;
PrivateTmp = true; PrivateTmp = true;
ExecStart = "${pkgs.nodejs}/bin/node ${pkgs.wiki-js}/server"; ExecStart = "${pkgs.nodejs-16_x}/bin/node ${pkgs.wiki-js}/server";
}; };
}; };
}; };

View file

@ -339,6 +339,8 @@ in
plasma-workspace plasma-workspace
plasma-workspace-wallpapers plasma-workspace-wallpapers
oxygen-sounds
breeze-icons breeze-icons
pkgs.hicolor-icon-theme pkgs.hicolor-icon-theme

View file

@ -46,7 +46,7 @@ in {
haskellPackages = mkOption { haskellPackages = mkOption {
default = pkgs.haskellPackages; default = pkgs.haskellPackages;
defaultText = literalExpression "pkgs.haskellPackages"; defaultText = literalExpression "pkgs.haskellPackages";
example = literalExpression "pkgs.haskell.packages.ghc8107"; example = literalExpression "pkgs.haskell.packages.ghc810";
type = types.attrs; type = types.attrs;
description = lib.mdDoc '' description = lib.mdDoc ''
haskellPackages used to build Xmonad and other packages. haskellPackages used to build Xmonad and other packages.

View file

@ -121,7 +121,7 @@ let
"final.target" "final.target"
"kexec.target" "kexec.target"
"systemd-kexec.service" "systemd-kexec.service"
] ++ lib.optional (cfg.package.withUtmp or true) "systemd-update-utmp.service" ++ [ ] ++ lib.optional cfg.package.withUtmp "systemd-update-utmp.service" ++ [
# Password entry. # Password entry.
"systemd-ask-password-console.path" "systemd-ask-password-console.path"

View file

@ -0,0 +1,18 @@
{ config, lib, pkgs, ... }:
let
cfg = config.services.stratis;
in
{
options.services.stratis = {
enable = lib.mkEnableOption (lib.mdDoc "Stratis Storage - Easy to use local storage management for Linux");
};
config = lib.mkIf cfg.enable {
environment.systemPackages = [ pkgs.stratis-cli ];
systemd.packages = [ pkgs.stratisd ];
services.dbus.packages = [ pkgs.stratisd ];
services.udev.packages = [ pkgs.stratisd ];
systemd.services.stratisd.wantedBy = [ "sysinit.target" ];
};
}
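
A minimal sketch of using the module above; beyond enabling it, pools and filesystems are managed imperatively with the `stratis` CLI (see the NixOS test added further below):

```nix
{
  # Installs stratis-cli and enables stratisd with its D-Bus and udev integration.
  services.stratis.enable = true;
}
```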

View file

@ -293,6 +293,7 @@ in {
lightdm = handleTest ./lightdm.nix {}; lightdm = handleTest ./lightdm.nix {};
lighttpd = handleTest ./lighttpd.nix {}; lighttpd = handleTest ./lighttpd.nix {};
limesurvey = handleTest ./limesurvey.nix {}; limesurvey = handleTest ./limesurvey.nix {};
listmonk = handleTest ./listmonk.nix {};
litestream = handleTest ./litestream.nix {}; litestream = handleTest ./litestream.nix {};
locate = handleTest ./locate.nix {}; locate = handleTest ./locate.nix {};
login = handleTest ./login.nix {}; login = handleTest ./login.nix {};
@ -533,6 +534,7 @@ in {
sssd-ldap = handleTestOn ["x86_64-linux"] ./sssd-ldap.nix {}; sssd-ldap = handleTestOn ["x86_64-linux"] ./sssd-ldap.nix {};
starship = handleTest ./starship.nix {}; starship = handleTest ./starship.nix {};
step-ca = handleTestOn ["x86_64-linux"] ./step-ca.nix {}; step-ca = handleTestOn ["x86_64-linux"] ./step-ca.nix {};
stratis = handleTest ./stratis {};
strongswan-swanctl = handleTest ./strongswan-swanctl.nix {}; strongswan-swanctl = handleTest ./strongswan-swanctl.nix {};
stunnel = handleTest ./stunnel.nix {}; stunnel = handleTest ./stunnel.nix {};
sudo = handleTest ./sudo.nix {}; sudo = handleTest ./sudo.nix {};

View file

@ -424,5 +424,12 @@ import ./make-test-python.nix ({ pkgs, ... }: {
docker.succeed("docker run --rm etc | grep localhost") docker.succeed("docker run --rm etc | grep localhost")
docker.succeed("docker image rm etc:latest") docker.succeed("docker image rm etc:latest")
with subtest("image-with-certs"):
docker.succeed("<${examples.image-with-certs} docker load")
docker.succeed("docker run --rm image-with-certs:latest test -r /etc/ssl/certs/ca-bundle.crt")
docker.succeed("docker run --rm image-with-certs:latest test -r /etc/ssl/certs/ca-certificates.crt")
docker.succeed("docker run --rm image-with-certs:latest test -r /etc/pki/tls/certs/ca-bundle.crt")
docker.succeed("docker image rm image-with-certs:latest")
''; '';
}) })

View file

@ -44,6 +44,12 @@ import ./make-test-python.nix ({ pkgs, ... }:
enableClient = true; enableClient = true;
clientSettings = { clientSettings = {
uri = "https://${serverDomain}"; uri = "https://${serverDomain}";
verify_ca = true;
verify_hostnames = true;
};
enablePam = true;
unixSettings = {
pam_allowed_login_groups = [ "shell" ];
}; };
}; };
@ -67,9 +73,11 @@ import ./make-test-python.nix ({ pkgs, ... }:
start_all() start_all()
server.wait_for_unit("kanidm.service") server.wait_for_unit("kanidm.service")
server.wait_until_succeeds("curl -sf https://${serverDomain} | grep Kanidm") server.wait_until_succeeds("curl -sf https://${serverDomain} | grep Kanidm")
server.wait_until_succeeds("ldapsearch -H ldap://[::1]:636 -b '${ldapBaseDN}' -x '(name=test)'") server.succeed("ldapsearch -H ldap://[::1]:636 -b '${ldapBaseDN}' -x '(name=test)'")
client.wait_until_succeeds("kanidm login -D anonymous && kanidm self whoami | grep anonymous@${serverDomain}") client.succeed("kanidm login -D anonymous && kanidm self whoami | grep anonymous@${serverDomain}")
rv, result = server.execute("kanidmd recover_account -c ${serverConfigFile} idm_admin 2>&1 | rg -o '[A-Za-z0-9]{48}'") rv, result = server.execute("kanidmd recover_account -c ${serverConfigFile} idm_admin 2>&1 | rg -o '[A-Za-z0-9]{48}'")
assert rv == 0 assert rv == 0
client.wait_for_unit("kanidm-unixd.service")
client.succeed("kanidm_unixd_status | grep working!")
''; '';
}) })
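
A hedged sketch of the client-side options exercised by the test above, as they might appear in a host configuration; the server URI and login group are placeholders:

```nix
{
  services.kanidm = {
    enableClient = true;
    clientSettings = {
      uri = "https://idm.example.org";  # placeholder
      verify_ca = true;
      verify_hostnames = true;
    };
    # PAM/NSS integration through kanidm-unixd, as covered by the new test steps.
    enablePam = true;
    unixSettings.pam_allowed_login_groups = [ "wheel" ];  # placeholder group
  };
}
```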

View file

@ -30,7 +30,7 @@ let
linux_5_4_hardened linux_5_4_hardened
linux_5_10_hardened linux_5_10_hardened
linux_5_15_hardened linux_5_15_hardened
linux_5_18_hardened linux_5_19_hardened
linux_testing; linux_testing;
}; };

View file

@ -15,10 +15,10 @@ in
services.lemmy = { services.lemmy = {
enable = true; enable = true;
ui.port = uiPort; ui.port = uiPort;
database.createLocally = true;
settings = { settings = {
hostname = "http://${lemmyNodeName}"; hostname = "http://${lemmyNodeName}";
port = backendPort; port = backendPort;
database.createLocally = true;
# Without setup, the /feeds/* and /nodeinfo/* API endpoints won't return 200 # Without setup, the /feeds/* and /nodeinfo/* API endpoints won't return 200
setup = { setup = {
admin_username = "mightyiam"; admin_username = "mightyiam";

View file

@ -12,6 +12,8 @@ import ./make-test-python.nix ({ pkgs, ... }: {
libvirtd.enable = true; libvirtd.enable = true;
}; };
boot.supportedFilesystems = [ "zfs" ];
networking.hostId = "deadbeef"; # needed for zfs
networking.nameservers = [ "192.168.122.1" ]; networking.nameservers = [ "192.168.122.1" ];
security.polkit.enable = true; security.polkit.enable = true;
environment.systemPackages = with pkgs; [ virt-manager ]; environment.systemPackages = with pkgs; [ virt-manager ];
@ -37,6 +39,13 @@ import ./make-test-python.nix ({ pkgs, ... }: {
virthost.succeed("virsh vol-create-as foo loop0p1 25MB") virthost.succeed("virsh vol-create-as foo loop0p1 25MB")
virthost.succeed("virsh vol-create-as foo loop0p2 50MB") virthost.succeed("virsh vol-create-as foo loop0p2 50MB")
with subtest("check if virsh zfs pools work"):
virthost.succeed("fallocate -l100m /tmp/zfs; losetup /dev/loop1 /tmp/zfs;")
virthost.succeed("zpool create zfs_loop /dev/loop1")
virthost.succeed("virsh pool-define-as --name zfs_storagepool --source-name zfs_loop --type zfs")
virthost.succeed("virsh pool-start zfs_storagepool")
virthost.succeed("virsh vol-create-as zfs_storagepool disk1 25MB")
with subtest("check if nixos install iso boots and network works"): with subtest("check if nixos install iso boots and network works"):
virthost.succeed( virthost.succeed(
"virt-install -n nixos --osinfo=nixos-unstable --ram=1024 --graphics=none --disk=`find ${nixosInstallISO}/iso -type f | head -n1`,readonly=on --import --noautoconsole" "virt-install -n nixos --osinfo=nixos-unstable --ram=1024 --graphics=none --disk=`find ${nixosInstallISO}/iso -type f | head -n1`,readonly=on --import --noautoconsole"

View file

@ -0,0 +1,69 @@
import ./make-test-python.nix ({ lib, ... }: {
name = "listmonk";
meta.maintainers = with lib.maintainers; [ raitobezarius ];
nodes.machine = { pkgs, ... }: {
services.mailhog.enable = true;
services.listmonk = {
enable = true;
settings = {
admin_username = "listmonk";
admin_password = "hunter2";
};
database = {
createLocally = true;
# https://github.com/knadh/listmonk/blob/174a48f252a146d7e69dab42724e3329dbe25ebe/internal/messenger/email/email.go#L18-L27
settings.smtp = [ {
enabled = true;
host = "localhost";
port = 1025;
tls_type = "none";
}];
};
};
};
testScript = ''
import json
start_all()
basic_auth = "listmonk:hunter2"
def generate_listmonk_request(type, url, data=None):
if data is None: data = {}
json_data = json.dumps(data)
return f'curl -u "{basic_auth}" -X {type} "http://localhost:9000/api/{url}" -H "Content-Type: application/json; charset=utf-8" --data-raw \'{json_data}\'''
machine.wait_for_unit("mailhog.service")
machine.wait_for_unit("postgresql.service")
machine.wait_for_unit("listmonk.service")
machine.wait_for_open_port(1025)
machine.wait_for_open_port(8025)
machine.wait_for_open_port(9000)
machine.succeed("[[ -f /var/lib/listmonk/.db_settings_initialized ]]")
# Test transactional endpoint
# subscriber_id=1 is guaranteed to exist at install-time
# template_id=2 is guaranteed to exist at install-time and is a transactional template (1 is a campaign template).
machine.succeed(
generate_listmonk_request('POST', 'tx', data={'subscriber_id': 1, 'template_id': 2})
)
assert 'Welcome John Doe' in machine.succeed(
"curl --fail http://localhost:8025/api/v2/messages"
)
# Test campaign endpoint
# Based on https://github.com/knadh/listmonk/blob/174a48f252a146d7e69dab42724e3329dbe25ebe/cmd/campaigns.go#L549 as docs do not exist.
campaign_data = json.loads(machine.succeed(
generate_listmonk_request('POST', 'campaigns/1/test', data={'template_id': 1, 'subscribers': ['john@example.com'], 'name': 'Test', 'subject': 'NixOS is great', 'lists': [1], 'messenger': 'email'})
))
assert campaign_data['data'] # This is a boolean asserting if the test was successful or not: https://github.com/knadh/listmonk/blob/174a48f252a146d7e69dab42724e3329dbe25ebe/cmd/campaigns.go#L626
messages = json.loads(machine.succeed(
"curl --fail http://localhost:8025/api/v2/messages"
))
assert messages['total'] == 2
'';
})

View file

@ -209,7 +209,7 @@ in {
"curl --fail -L --cacert ${ca_pem} https://localhost:8448/" "curl --fail -L --cacert ${ca_pem} https://localhost:8448/"
) )
serverpostgres.require_unit_state("postgresql.service") serverpostgres.require_unit_state("postgresql.service")
serverpostgres.succeed("register_new_matrix_user -u ${testUser} -p ${testPassword} -a -k ${registrationSharedSecret} ") serverpostgres.succeed("register_new_matrix_user -u ${testUser} -p ${testPassword} -a -k ${registrationSharedSecret} https://localhost:8448/")
serverpostgres.succeed("obtain-token-and-register-email") serverpostgres.succeed("obtain-token-and-register-email")
serversqlite.wait_for_unit("matrix-synapse.service") serversqlite.wait_for_unit("matrix-synapse.service")
serversqlite.wait_until_succeeds( serversqlite.wait_until_succeeds(

View file

@ -0,0 +1,7 @@
{ system ? builtins.currentSystem
, pkgs ? import ../../.. { inherit system; }
}:
{
simple = import ./simple.nix { inherit system pkgs; };
}

View file

@ -0,0 +1,39 @@
import ../make-test-python.nix ({ pkgs, ... }:
{
name = "stratis";
meta = with pkgs.lib.maintainers; {
maintainers = [ nickcao ];
};
nodes.machine = { pkgs, ... }: {
services.stratis.enable = true;
virtualisation.emptyDiskImages = [ 1024 1024 1024 1024 ];
};
testScript = ''
machine.wait_for_unit("stratisd")
# test pool creation
machine.succeed("stratis pool create testpool /dev/vdb")
machine.succeed("stratis pool add-data testpool /dev/vdc")
machine.succeed("stratis pool init-cache testpool /dev/vdd")
machine.succeed("stratis pool add-cache testpool /dev/vde")
# test filesystem creation and rename
machine.succeed("stratis filesystem create testpool testfs0")
machine.succeed("stratis filesystem rename testpool testfs0 testfs1")
# test snapshot
machine.succeed("mkdir -p /mnt/testfs1 /mnt/testfs2")
machine.wait_for_file("/dev/stratis/testpool/testfs1")
machine.succeed("mount /dev/stratis/testpool/testfs1 /mnt/testfs1")
machine.succeed("echo test0 > /mnt/testfs1/test0")
machine.succeed("echo test1 > /mnt/testfs1/test1")
machine.succeed("stratis filesystem snapshot testpool testfs1 testfs2")
machine.succeed("echo test2 > /mnt/testfs1/test1")
machine.wait_for_file("/dev/stratis/testpool/testfs2")
machine.succeed("mount /dev/stratis/testpool/testfs2 /mnt/testfs2")
assert "test0" in machine.succeed("cat /mnt/testfs1/test0")
assert "test0" in machine.succeed("cat /mnt/testfs2/test0")
assert "test2" in machine.succeed("cat /mnt/testfs1/test1")
assert "test1" in machine.succeed("cat /mnt/testfs2/test1")
'';
})

View file

@ -1,36 +0,0 @@
From 5e05ec33520b6531e32db1b1e007ed0ab6362d74 Mon Sep 17 00:00:00 2001
From: Martin Weinelt <hexa@darmstadt.ccc.de>
Date: Thu, 15 Sep 2022 04:55:40 +0200
Subject: [PATCH] python3Packages.img2pdf: apply patch to fix tests
---
pkgs/development/python-modules/img2pdf/default.nix | 9 +++++++++
1 file changed, 9 insertions(+)
diff --git a/pkgs/development/python-modules/img2pdf/default.nix b/pkgs/development/python-modules/img2pdf/default.nix
index 791edcb2fb20e..c749071bab9b0 100644
--- a/pkgs/development/python-modules/img2pdf/default.nix
+++ b/pkgs/development/python-modules/img2pdf/default.nix
@@ -2,6 +2,7 @@
, buildPythonPackage
, isPy27
, fetchPypi
+, fetchpatch
, pikepdf
, pillow
, stdenv
@@ -26,6 +27,14 @@ buildPythonPackage rec {
sha256 = "8ec898a9646523fd3862b154f3f47cd52609c24cc3e2dc1fb5f0168f0cbe793c";
};
+ patches = [
+ (fetchpatch {
+ # https://gitlab.mister-muffin.de/josch/img2pdf/issues/148
+ url = "https://gitlab.mister-muffin.de/josch/img2pdf/commit/57d7e07e6badb252c12015388b58fcb5285d3158.patch";
+ hash = "sha256-H/g55spe/oVJRxO2Vh+F+ZgR6aLoRUrNeu5WnuU7k/k=";
+ })
+ ];
+
propagatedBuildInputs = [
pikepdf
pillow

View file

@ -1 +0,0 @@
pr191265.patch

View file

@ -23,20 +23,14 @@
stdenv.mkDerivation rec { stdenv.mkDerivation rec {
pname = "exaile"; pname = "exaile";
version = "4.1.1"; version = "4.1.2";
src = fetchFromGitHub { src = fetchFromGitHub {
owner = "exaile"; owner = "exaile";
repo = pname; repo = pname;
rev = version; rev = version;
sha256 = "0s29lm0i4slgaw5l5s9a2zx0b83xac43rnil5cvyi210dxm5s048"; sha256 = "sha256-GZyCuPy57NhGwgbLMrRKW5xmc1Udon7WtsrD4upviuQ=";
}; };
patches = [
(fetchpatch {
url = "https://github.com/exaile/exaile/pull/751.patch";
sha256 = "sha256-jCJh85Z3HQcyS4ntQP5HwYJgM7WNHcWzjf0BdNJitsM=";
})
];
nativeBuildInputs = [ nativeBuildInputs = [
gobject-introspection gobject-introspection
@ -76,7 +70,6 @@ stdenv.mkDerivation rec {
++ lib.optional wikipediaSupport webkitgtk; ++ lib.optional wikipediaSupport webkitgtk;
checkInputs = with python3.pkgs; [ checkInputs = with python3.pkgs; [
mox3
pytest pytest
]; ];

View file

@ -8,11 +8,11 @@
stdenv.mkDerivation rec { stdenv.mkDerivation rec {
pname = "headset"; pname = "headset";
version = "4.0.0"; version = "4.2.1";
src = fetchurl { src = fetchurl {
url = "https://github.com/headsetapp/headset-electron/releases/download/v${version}/headset_${version}_amd64.deb"; url = "https://github.com/headsetapp/headset-electron/releases/download/v${version}/headset_${version}_amd64.deb";
hash = "sha256-M1HMZgYczZWFq0EGlCMEGOGUNoUcmq37J8Ycen72PhM="; hash = "sha256-81gsIq74sggauE6g8pM6z05KTmsbe49CZa9aRQEDwMo=";
}; };
dontConfigure = true; dontConfigure = true;

View file

@ -15,14 +15,14 @@
stdenv.mkDerivation rec { stdenv.mkDerivation rec {
pname = "mmlgui"; pname = "mmlgui";
version = "unstable-2022-05-24"; version = "unstable-2022-09-15";
src = fetchFromGitHub { src = fetchFromGitHub {
owner = "superctr"; owner = "superctr";
repo = "mmlgui"; repo = "mmlgui";
rev = "fe2b298c1eddae4cc38096f6c1ba1ccaed562cf1"; rev = "163cc73a7c009d524b0d5aff94f9ed47fe19e3ce";
fetchSubmodules = true; fetchSubmodules = true;
sha256 = "Q34zzZthdThMbduXcc/qMome89mAMrn1Vinr073u4zo="; sha256 = "kKo3ge2wcDK2xU1YCfEwyqw84N+3jcbOEEsJHSpMpzY=";
}; };
postPatch = '' postPatch = ''

View file

@ -10,15 +10,11 @@
, icu , icu
, curl , curl
, outputsSupport ? true # outputs screen , outputsSupport ? true # outputs screen
, visualizerSupport ? false, fftw ? null # visualizer screen , visualizerSupport ? false, fftw # visualizer screen
, clockSupport ? true # clock screen , clockSupport ? true # clock screen
, taglibSupport ? true, taglib ? null # tag editor , taglibSupport ? true, taglib # tag editor
}: }:
assert visualizerSupport -> (fftw != null);
assert taglibSupport -> (taglib != null);
with lib;
stdenv.mkDerivation rec { stdenv.mkDerivation rec {
pname = "ncmpcpp"; pname = "ncmpcpp";
version = "0.9.2"; version = "0.9.2";
@ -29,19 +25,23 @@ stdenv.mkDerivation rec {
}; };
enableParallelBuilding = true; enableParallelBuilding = true;
configureFlags = [ "BOOST_LIB_SUFFIX=" ]
++ optional outputsSupport "--enable-outputs"
++ optional visualizerSupport "--enable-visualizer --with-fftw"
++ optional clockSupport "--enable-clock"
++ optional taglibSupport "--with-taglib";
nativeBuildInputs = [ pkg-config ]; strictDeps = true;
configureFlags = [ "BOOST_LIB_SUFFIX=" ]
++ lib.optional outputsSupport "--enable-outputs"
++ lib.optional visualizerSupport "--enable-visualizer --with-fftw"
++ lib.optional clockSupport "--enable-clock"
++ lib.optional taglibSupport "--with-taglib";
nativeBuildInputs = [ pkg-config ]
++ lib.optional taglibSupport taglib;
buildInputs = [ boost libmpdclient ncurses readline libiconv icu curl ] buildInputs = [ boost libmpdclient ncurses readline libiconv icu curl ]
++ optional visualizerSupport fftw ++ lib.optional visualizerSupport fftw
++ optional taglibSupport taglib; ++ lib.optional taglibSupport taglib;
meta = { meta = with lib; {
description = "A featureful ncurses based MPD client inspired by ncmpc"; description = "A featureful ncurses based MPD client inspired by ncmpc";
homepage = "https://rybczak.net/ncmpcpp/"; homepage = "https://rybczak.net/ncmpcpp/";
changelog = "https://github.com/ncmpcpp/ncmpcpp/blob/${version}/CHANGELOG.md"; changelog = "https://github.com/ncmpcpp/ncmpcpp/blob/${version}/CHANGELOG.md";

View file

@ -7,16 +7,16 @@
rustPlatform.buildRustPackage rec { rustPlatform.buildRustPackage rec {
pname = "ncspot"; pname = "ncspot";
version = "0.11.0"; version = "0.11.1";
src = fetchFromGitHub { src = fetchFromGitHub {
owner = "hrkfdn"; owner = "hrkfdn";
repo = "ncspot"; repo = "ncspot";
rev = "v${version}"; rev = "v${version}";
sha256 = "sha256-mtveGRwadcct9R8CxLWCvT9FamK2PnicpeSvL4iT4oE="; sha256 = "sha256-q4jOfcU2sNKISgO9vX2Rao2JljiYzWwB3WWMIvy8rII=";
}; };
cargoSha256 = "sha256-JqHJY91q2vm0x819zUkBBAObpnXN5aPde8m5UJ2NeNY="; cargoSha256 = "sha256-f8yo60Gi2OdJMNxssMhladh82/ZeZ0ZWV7WmTcQ8jYo=";
nativeBuildInputs = [ pkg-config ]; nativeBuildInputs = [ pkg-config ];

View file

@ -2,12 +2,12 @@
let let
pname = "plexamp"; pname = "plexamp";
version = "4.3.0"; version = "4.4.0";
src = fetchurl { src = fetchurl {
url = "https://plexamp.plex.tv/plexamp.plex.tv/desktop/Plexamp-${version}.AppImage"; url = "https://plexamp.plex.tv/plexamp.plex.tv/desktop/Plexamp-${version}.AppImage";
name="${pname}-${version}.AppImage"; name="${pname}-${version}.AppImage";
sha512 = "c9d2rp7tibb73tZdoFONW7eoy+u+GaUZ0RPhYWCBk5MYwtY81xrsdka64x60xzxOopWZ6JkmGs9AWI1XifqBTQ=="; sha512 = "VYdeZUgVWDce9XGyf5AnwPV/Ja6p2i3IRAcnSj7J7KqTUdgoNsPl4gqs4HcdrSCEX8PfloimJihoBuEKtgXcNA==";
}; };
appimageContents = appimageTools.extractType2 { appimageContents = appimageTools.extractType2 {
@ -33,7 +33,7 @@ in appimageTools.wrapType2 {
meta = with lib; { meta = with lib; {
description = "A beautiful Plex music player for audiophiles, curators, and hipsters"; description = "A beautiful Plex music player for audiophiles, curators, and hipsters";
homepage = "https://plexamp.com/"; homepage = "https://plexamp.com/";
changelog = "https://forums.plex.tv/t/plexamp-release-notes/221280/45"; changelog = "https://forums.plex.tv/t/plexamp-release-notes/221280/46";
license = licenses.unfree; license = licenses.unfree;
maintainers = with maintainers; [ killercup synthetica ]; maintainers = with maintainers; [ killercup synthetica ];
platforms = [ "x86_64-linux" ]; platforms = [ "x86_64-linux" ];

View file

@ -2,13 +2,13 @@
python3Packages.buildPythonApplication rec { python3Packages.buildPythonApplication rec {
pname = "pyradio"; pname = "pyradio";
version = "0.8.9.26"; version = "0.8.9.27";
src = fetchFromGitHub { src = fetchFromGitHub {
owner = "coderholic"; owner = "coderholic";
repo = pname; repo = pname;
rev = "refs/tags/${version}"; rev = "refs/tags/${version}";
sha256 = "sha256-RuQAbmzB8s+YmJLSbzJTQtpiYLr1oFtrxKF8P+MlHeU="; sha256 = "sha256-KqSpyDiRhp7DdbFsPor+munMQg+0vv0qF2VI3gkR04Y=";
}; };
nativeBuildInputs = [ installShellFiles ]; nativeBuildInputs = [ installShellFiles ];

View file

@ -6,13 +6,13 @@
stdenv.mkDerivation rec { stdenv.mkDerivation rec {
pname = "scream"; pname = "scream";
version = "3.9"; version = "4.0";
src = fetchFromGitHub { src = fetchFromGitHub {
owner = "duncanthrax"; owner = "duncanthrax";
repo = pname; repo = pname;
rev = version; rev = version;
sha256 = "sha256-JxDR7UhS4/+oGQ9Fwm4f+yBM9OyX0Srvr9n/vaZVvxQ="; sha256 = "sha256-lP5mdNhZjkEVjgQUEsisPy+KXUqsE6xj6dFWcgD+VGM=";
}; };
buildInputs = lib.optional pulseSupport libpulseaudio buildInputs = lib.optional pulseSupport libpulseaudio

View file

@ -39,13 +39,13 @@
stdenv.mkDerivation rec { stdenv.mkDerivation rec {
pname = "sonic-pi"; pname = "sonic-pi";
version = "4.1.0"; version = "4.2.0";
src = fetchFromGitHub { src = fetchFromGitHub {
owner = "sonic-pi-net"; owner = "sonic-pi-net";
repo = pname; repo = pname;
rev = "v${version}"; rev = "v${version}";
hash = "sha256-kEZNVTAWkiqxyPJHSL4Gismpwxd+PnXiH8CgQCV3+PQ="; hash = "sha256-VRuNhS53okKsCHgKEsJgkpIe9yXFY6d2ghd0nsUQLLM=";
}; };
mixFodDeps = beamPackages.fetchMixDeps { mixFodDeps = beamPackages.fetchMixDeps {

View file

@ -14,15 +14,15 @@
buildGoModule rec { buildGoModule rec {
pname = "ymuse"; pname = "ymuse";
version = "0.20"; version = "0.21";
src = fetchFromGitHub { src = fetchFromGitHub {
owner = "yktoo"; owner = "yktoo";
repo = "ymuse"; repo = "ymuse";
rev = "v${version}"; rev = "v${version}";
sha256 = "sha256-wDQjNBxwxFVFdSswubp4AVD35aXKJ8i0ahk/tgRsDRc="; sha256 = "sha256-3QgBbK7AK9/uQ6Z7DNIJxa1oXrxvvHDQ/Z2QOf7yfS4=";
}; };
vendorSha256 = "sha256-Ap/nf0NT0VkP2k9U1HzEiptDfLjKkBopP5h0czP3vis="; vendorSha256 = "sha256-7oYYZWpvWzeHlp6l9bLeHcLITLZPVY5eZdfHSE+ZHW8=";
nativeBuildInputs = [ nativeBuildInputs = [
pkg-config pkg-config
@ -39,7 +39,8 @@ buildGoModule rec {
]; ];
postInstall = '' postInstall = ''
install -Dm644 ./resources/ymuse.desktop -t $out/share/applications install -Dm644 ./resources/com.yktoo.ymuse.desktop -t $out/share/applications
install -Dm644 ./resources/metainfo/com.yktoo.ymuse.metainfo.xml -t $out/share/metainfo
cp -r ./resources/icons $out/share cp -r ./resources/icons $out/share
app_id="ymuse" app_id="ymuse"

View file

@ -5,6 +5,7 @@
, pkg-config , pkg-config
, util-linux , util-linux
, hexdump , hexdump
, autoSignDarwinBinariesHook
, wrapQtAppsHook ? null , wrapQtAppsHook ? null
, boost , boost
, libevent , libevent
@ -24,19 +25,20 @@
with lib; with lib;
stdenv.mkDerivation rec { stdenv.mkDerivation rec {
pname = if withGui then "elements" else "elementsd"; pname = if withGui then "elements" else "elementsd";
version = "0.21.0.2"; version = "22.0";
src = fetchFromGitHub { src = fetchFromGitHub {
owner = "ElementsProject"; owner = "ElementsProject";
repo = "elements"; repo = "elements";
rev = "elements-${version}"; rev = "elements-${version}";
sha256 = "sha256-5b3wylp9Z2U0ueu2gI9jGeWiiJoddjcjQ/6zkFATyvA="; sha256 = "sha256-n98bz1W9hoJ5JDH34LG7R6igEIY1j4mRbO2PKnV8R2U=";
}; };
nativeBuildInputs = nativeBuildInputs =
[ autoreconfHook pkg-config ] [ autoreconfHook pkg-config ]
++ optionals stdenv.isLinux [ util-linux ] ++ optionals stdenv.isLinux [ util-linux ]
++ optionals stdenv.isDarwin [ hexdump ] ++ optionals stdenv.isDarwin [ hexdump ]
++ optionals (stdenv.isDarwin && stdenv.isAarch64) [ autoSignDarwinBinariesHook ]
++ optionals withGui [ wrapQtAppsHook ]; ++ optionals withGui [ wrapQtAppsHook ];
buildInputs = [ boost libevent miniupnpc zeromq zlib ] buildInputs = [ boost libevent miniupnpc zeromq zlib ]
@ -56,12 +58,16 @@ stdenv.mkDerivation rec {
"--with-qt-bindir=${qtbase.dev}/bin:${qttools.dev}/bin" "--with-qt-bindir=${qtbase.dev}/bin:${qttools.dev}/bin"
]; ];
# fix "Killed: 9 test/test_bitcoin"
# https://github.com/NixOS/nixpkgs/issues/179474
hardeningDisable = lib.optionals (stdenv.isAarch64 && stdenv.isDarwin) [ "fortify" "stackprotector" ];
checkInputs = [ python3 ]; checkInputs = [ python3 ];
doCheck = true; doCheck = true;
checkFlags = checkFlags =
[ "LC_ALL=C.UTF-8" ] [ "LC_ALL=en_US.UTF-8" ]
# QT_PLUGIN_PATH needs to be set when executing QT, which is needed when testing Bitcoin's GUI. # QT_PLUGIN_PATH needs to be set when executing QT, which is needed when testing Bitcoin's GUI.
# See also https://github.com/NixOS/nixpkgs/issues/24256 # See also https://github.com/NixOS/nixpkgs/issues/24256
++ optional withGui "QT_PLUGIN_PATH=${qtbase}/${qtbase.qtPluginPrefix}"; ++ optional withGui "QT_PLUGIN_PATH=${qtbase}/${qtbase.qtPluginPrefix}";

View file

@ -2,11 +2,11 @@
stdenv.mkDerivation rec { stdenv.mkDerivation rec {
pname = "ergo"; pname = "ergo";
version = "4.0.42"; version = "4.0.45";
src = fetchurl { src = fetchurl {
url = "https://github.com/ergoplatform/ergo/releases/download/v${version}/ergo-${version}.jar"; url = "https://github.com/ergoplatform/ergo/releases/download/v${version}/ergo-${version}.jar";
sha256 = "sha256-ZcNV6qgD736KlKz4h6xHAl3ByYzca77YXoETonRaWP8="; sha256 = "sha256-YUcBNGUs7oBiY7zkRSQxT4/t3DfvamcQPVI3h/sonHM=";
}; };
nativeBuildInputs = [ makeWrapper ]; nativeBuildInputs = [ makeWrapper ];

View file

@ -2,17 +2,17 @@
buildGoModule rec { buildGoModule rec {
pname = "erigon"; pname = "erigon";
version = "2022.09.01"; version = "2022.09.03";
src = fetchFromGitHub { src = fetchFromGitHub {
owner = "ledgerwatch"; owner = "ledgerwatch";
repo = pname; repo = pname;
rev = "v${version}"; rev = "v${version}";
sha256 = "sha256-vcppzHJ6yLIqp/5Gl9JIgkTVR1mKKAj1vhWY/bCvbPQ="; sha256 = "sha256-dilsoJw7VPA7SerpAOhYUviE2zt2qMBmSLWaPm0ux2Y=";
fetchSubmodules = true; fetchSubmodules = true;
}; };
vendorSha256 = "sha256-mY8m5bXm09pmq1imCo8uiBBnzPzrVuka8XtZyxL9LWo="; vendorSha256 = "sha256-W8hEMfn2qW/3+V6x/RH1azj49V26fyQ+1y2re3tXsTk=";
proxyVendor = true; proxyVendor = true;
# Build errors in mdbx when format hardening is enabled: # Build errors in mdbx when format hardening is enabled:
@ -30,6 +30,6 @@ buildGoModule rec {
homepage = "https://github.com/ledgerwatch/erigon/"; homepage = "https://github.com/ledgerwatch/erigon/";
description = "Ethereum node implementation focused on scalability and modularity"; description = "Ethereum node implementation focused on scalability and modularity";
license = with licenses; [ lgpl3Plus gpl3Plus ]; license = with licenses; [ lgpl3Plus gpl3Plus ];
maintainers = with maintainers; [ d-xo ]; maintainers = with maintainers; [ d-xo happysalada ];
}; };
} }

View file

@ -11,13 +11,13 @@
stdenv.mkDerivation rec { stdenv.mkDerivation rec {
pname = "fulcrum"; pname = "fulcrum";
version = "1.7.0"; version = "1.8.1";
src = fetchFromGitHub { src = fetchFromGitHub {
owner = "cculianu"; owner = "cculianu";
repo = "Fulcrum"; repo = "Fulcrum";
rev = "v${version}"; rev = "v${version}";
sha256 = "sha256-FIa6eAE6yyJR5UdlCXB2Gx3DqN528POxb0eYOCpVjJk="; sha256 = "sha256-GaXXqIHuMTGn8iLymAhL8i0HzXmaO6RxtvIzgWw6QI0=";
}; };
nativeBuildInputs = [ pkg-config qmake ]; nativeBuildInputs = [ pkg-config qmake ];

View file

@ -9,13 +9,13 @@ let
in buildGoModule rec { in buildGoModule rec {
pname = "go-ethereum"; pname = "go-ethereum";
version = "1.10.23"; version = "1.10.25";
src = fetchFromGitHub { src = fetchFromGitHub {
owner = "ethereum"; owner = "ethereum";
repo = pname; repo = pname;
rev = "v${version}"; rev = "v${version}";
sha256 = "sha256-1fEmtbHKrjuyIVrGr/vTudZ99onkNjEMvyBJt4I8KK4="; sha256 = "sha256-mnf0kMfQEEQMricZJfyF7ZB/2F1dyPBx9iT2v/rGh1U=";
}; };
vendorSha256 = "sha256-Dj+xN8lr98LJyYr2FwJ7yUIJkUeUrr1fkcbj4hShJI0="; vendorSha256 = "sha256-Dj+xN8lr98LJyYr2FwJ7yUIJkUeUrr1fkcbj4hShJI0=";
@ -46,7 +46,7 @@ in buildGoModule rec {
"cmd/utils" "cmd/utils"
]; ];
# Following upstream: https://github.com/ethereum/go-ethereum/blob/v1.10.23/build/ci.go#L218 # Following upstream: https://github.com/ethereum/go-ethereum/blob/v1.10.25/build/ci.go#L218
tags = [ "urfave_cli_no_docs" ]; tags = [ "urfave_cli_no_docs" ];
# Fix for usb-related segmentation faults on darwin # Fix for usb-related segmentation faults on darwin

View file

@ -9,13 +9,13 @@
stdenv.mkDerivation rec { stdenv.mkDerivation rec {
pname = "monero-cli"; pname = "monero-cli";
version = "0.18.1.0"; version = "0.18.1.1";
src = fetchFromGitHub { src = fetchFromGitHub {
owner = "monero-project"; owner = "monero-project";
repo = "monero"; repo = "monero";
rev = "v${version}"; rev = "v${version}";
sha256 = "sha256-xniGiGqZpL1b6alnCxa2MNzuDQxPgMdNjqifOC8h0qM="; sha256 = "sha256-R3ajdsHVgvkUEwaShwMvhIrcbM4YjsXgBk2QGBhxGRQ=";
fetchSubmodules = true; fetchSubmodules = true;
}; };

View file

@ -14,13 +14,13 @@
stdenv.mkDerivation rec { stdenv.mkDerivation rec {
pname = "monero-gui"; pname = "monero-gui";
version = "0.18.1.0"; version = "0.18.1.1";
src = fetchFromGitHub { src = fetchFromGitHub {
owner = "monero-project"; owner = "monero-project";
repo = "monero-gui"; repo = "monero-gui";
rev = "v${version}"; rev = "v${version}";
sha256 = "sha256-XL7DV4YD/U6RiqgdYJf6zFfvJWlOx//4YVmnc51riiE="; sha256 = "sha256-mxbr02Ba/BeUiAZujnBdXgJSaq6a/U4GM7rR7sZzTWc=";
}; };
nativeBuildInputs = [ nativeBuildInputs = [

View file

@ -4,7 +4,7 @@
}: }:
rustPlatform.buildRustPackage rec { rustPlatform.buildRustPackage rec {
pname = "nearcore"; pname = "nearcore";
version = "1.28.1"; version = "1.29.0";
# https://github.com/near/nearcore/tags # https://github.com/near/nearcore/tags
src = fetchFromGitHub { src = fetchFromGitHub {
@ -13,10 +13,10 @@ rustPlatform.buildRustPackage rec {
# there is also a branch for this version number, so we need to be explicit # there is also a branch for this version number, so we need to be explicit
rev = "refs/tags/${version}"; rev = "refs/tags/${version}";
sha256 = "sha256-lAbVcmr8StAZAII++21xiBd4tRcdprefvcGzPLIjl74="; sha256 = "sha256-TOZ6j4CaiOXmNn8kgVGP27SjvLDlGvabAA+PAEyFXIk=";
}; };
cargoSha256 = "sha256-1aoL5fbKZ4XZ1ELVDWNDFHDL2FyNuoX/DVb0h8RWBxI="; cargoSha256 = "sha256-LjBgsQynHIfrQP4Z7j1J8+PLqRZWGAOQ5dJaGOqabVk=";
cargoPatches = [ ./0001-make-near-test-contracts-optional.patch ]; cargoPatches = [ ./0001-make-near-test-contracts-optional.patch ];
postPatch = '' postPatch = ''

View file

@ -13,7 +13,6 @@
, protobuf , protobuf
, clang , clang
, llvm , llvm
, pkgconfig
, openssl , openssl
, libclang , libclang
, rustfmt , rustfmt
@ -74,7 +73,7 @@ rustPlatform.buildRustPackage rec {
"-isystem ${libclang.lib}/lib/clang/${lib.getVersion clang}/include"; "-isystem ${libclang.lib}/lib/clang/${lib.getVersion clang}/include";
LLVM_CONFIG_PATH = "${llvm}/bin/llvm-config"; LLVM_CONFIG_PATH = "${llvm}/bin/llvm-config";
nativeBuildInputs = [ clang llvm pkgconfig protobuf rustfmt perl ]; nativeBuildInputs = [ clang llvm pkg-config protobuf rustfmt perl ];
buildInputs = buildInputs =
[ openssl zlib libclang hidapi ] ++ (lib.optionals stdenv.isLinux [ udev ]); [ openssl zlib libclang hidapi ] ++ (lib.optionals stdenv.isLinux [ udev ]);
strictDeps = true; strictDeps = true;

View file

@ -38,13 +38,13 @@ let
in in
stdenv.mkDerivation rec { stdenv.mkDerivation rec {
pname = "cudatext"; pname = "cudatext";
version = "1.169.2"; version = "1.171.0";
src = fetchFromGitHub { src = fetchFromGitHub {
owner = "Alexey-T"; owner = "Alexey-T";
repo = "CudaText"; repo = "CudaText";
rev = version; rev = version;
hash = "sha256-EQAoKft/L4sbdY8hOvyu+Cy+3I8Lt4g1KTxTlSYALac="; hash = "sha256-+NTxZ5UkmaFDcTYliNi/5c8xGztVu6P8C7Ga99MHSFM=";
}; };
postPatch = '' postPatch = ''

View file

@ -11,18 +11,18 @@
}, },
"ATFlatControls": { "ATFlatControls": {
"owner": "Alexey-T", "owner": "Alexey-T",
"rev": "2022.08.28", "rev": "2022.09.18",
"hash": "sha256-jkVHwPQGPtLeSRy502thPIrDWzkkwvlnyGcTzjgFgIc=" "hash": "sha256-4d27eW4gpJHwGlRZ4iPzuKIw/o/J4utxXbEhglk31Bw="
}, },
"ATSynEdit": { "ATSynEdit": {
"owner": "Alexey-T", "owner": "Alexey-T",
"rev": "2022.08.28", "rev": "2022.09.18",
"hash": "sha256-U/UD3vPnIdQUe/1g/mKgs5yGirsIB/uHTjD0MOouAyI=" "hash": "sha256-HjW4V7MctQoHbDYIlMv7VS0nS7FFG6Qir0sCju+isI0="
}, },
"ATSynEdit_Cmp": { "ATSynEdit_Cmp": {
"owner": "Alexey-T", "owner": "Alexey-T",
"rev": "2022.08.28", "rev": "2022.09.18",
"hash": "sha256-/MWC4BoU/4kflvbly0phh+cfIR8rNwgWFtrXnmxk0Ks=" "hash": "sha256-yIbIRo4hpwbCdH+3fIhjnQPtdvuFmfJSqloKjWqKEuY="
}, },
"EControl": { "EControl": {
"owner": "Alexey-T", "owner": "Alexey-T",
@ -31,8 +31,8 @@
}, },
"ATSynEdit_Ex": { "ATSynEdit_Ex": {
"owner": "Alexey-T", "owner": "Alexey-T",
"rev": "2022.07.20", "rev": "2022.09.03",
"hash": "sha256-f/BdOMcx7NTpKgaFTz4MbK3O0GcUepyMPyRdhnZImjU=" "hash": "sha256-6xzYn9x5tZLUhLAT9mQ4+UmpEemg386tAjlWdK8j/Ew="
}, },
"Python-for-Lazarus": { "Python-for-Lazarus": {
"owner": "Alexey-T", "owner": "Alexey-T",
@ -41,8 +41,8 @@
}, },
"Emmet-Pascal": { "Emmet-Pascal": {
"owner": "Alexey-T", "owner": "Alexey-T",
"rev": "2022.01.17", "rev": "2022.09.18",
"hash": "sha256-5yqxRW7xFJ4MwHjKnxYL8/HrCDLn30a1gyQRjGMx/qw=" "hash": "sha256-Kutl4Jh/+KptGbqakzPJnIYlFtytXVlzKWulKt4Z+/g="
}, },
"CudaText-lexers": { "CudaText-lexers": {
"owner": "Alexey-T", "owner": "Alexey-T",

View file

@ -5,13 +5,13 @@
trivialBuild { trivialBuild {
pname = "bqn-mode"; pname = "bqn-mode";
version = "0.pre+date=2022-01-07"; version = "0.pre+date=2022-09-14";
src = fetchFromGitHub { src = fetchFromGitHub {
owner = "museoa"; owner = "museoa";
repo = "bqn-mode"; repo = "bqn-mode";
rev = "86ef8b4d32d272b2765cd4a6e6e0b70a4f3e99a2"; rev = "3e3d4758c0054b35f047bf6d9e03b1bea425d013";
hash = "sha256-6ygV/iNzzpZ77w+Dh/snHAzUxrbfaU9TxuNOtJK6pNQ="; hash = "sha256:0pz3m4jp4dn8bsmc9n51sxwdk6g52mxb6y6f6a4g4hggb35shy2a";
}; };
meta = with lib; { meta = with lib; {

View file

@ -45,6 +45,18 @@ let
inherit sha256 url; inherit sha256 url;
} }
) {}; ) {};
sourcehut = self.callPackage ({ fetchzip }:
fetchzip {
url = "https://git.sr.ht/~${repo}/archive/${commit}.tar.gz";
inherit sha256;
}
) {};
codeberg = self.callPackage ({ fetchzip }:
fetchzip {
url = "https://codeberg.org/${repo}/archive/${commit}.tar.gz";
inherit sha256;
}
) {};
}; };
in { in {
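
For a hypothetical recipe declaring `fetcher = "sourcehut"` (or `"codeberg"`), the source produced by the snippet above is equivalent to a plain `fetchzip` call; the repo, commit and hash below are placeholders:

```nix
# fetcher = "sourcehut", repo = "someuser/some-pkg", commit = "abc123"
fetchzip {
  url = "https://git.sr.ht/~someuser/some-pkg/archive/abc123.tar.gz";
  sha256 = lib.fakeSha256;  # stand-in; the recipe supplies the real hash
}
# A "codeberg" recipe differs only in the URL shape:
#   https://codeberg.org/someuser/some-pkg/archive/abc123.tar.gz
```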

View file

@ -103,6 +103,10 @@ return Promise to resolve in that process."
(url-hexify-string repo) (url-hexify-string repo)
"/repository/archive.tar.gz?ref=" "/repository/archive.tar.gz?ref="
commit))) commit)))
("sourcehut" (list "nix-prefetch-url"
"--unpack" (concat "https://git.sr.ht/~" repo "/archive/" commit ".tar.gz")))
("codeberg" (list "nix-prefetch-url"
"--unpack" (concat "https://codeberg.org/" repo "/archive/" commit ".tar.gz")))
("bitbucket" (list "nix-prefetch-hg" ("bitbucket" (list "nix-prefetch-hg"
(concat "https://bitbucket.com/" repo) commit)) (concat "https://bitbucket.com/" repo) commit))
("hg" (list "nix-prefetch-hg" ("hg" (list "nix-prefetch-hg"
@ -210,7 +214,9 @@ return Promise to resolve in that process."
`((fetcher . ,fetcher)) `((fetcher . ,fetcher))
(if (or (equal "github" fetcher) (if (or (equal "github" fetcher)
(equal "bitbucket" fetcher) (equal "bitbucket" fetcher)
(equal "gitlab" fetcher)) (equal "gitlab" fetcher)
(equal "sourcehut" fetcher)
(equal "codeberg" fetcher))
`((repo . ,repo)) `((repo . ,repo))
`((url . ,url))) `((url . ,url)))
(when unstable-aprops `((unstable . ,(source-info entry unstable-archive unstable-sha)))) (when unstable-aprops `((unstable . ,(source-info entry unstable-archive unstable-sha))))

View file

@ -136,7 +136,7 @@ let emacs = stdenv.mkDerivation (lib.optionalAttrs nativeComp {
++ lib.optionals (stdenv.isLinux && withX) [ m17n_lib libotf ] ++ lib.optionals (stdenv.isLinux && withX) [ m17n_lib libotf ]
++ lib.optional (withX && withGTK2) gtk2-x11 ++ lib.optional (withX && withGTK2) gtk2-x11
++ lib.optional (withX && withGTK3) gtk3-x11 ++ lib.optional (withX && withGTK3) gtk3-x11
++ lib.optional withGTK3 gsettings-desktop-schemas ++ lib.optional (!stdenv.isDarwin && withGTK3) gsettings-desktop-schemas
++ lib.optional withPgtk gtk3 ++ lib.optional withPgtk gtk3
++ lib.optional (withX && withMotif) motif ++ lib.optional (withX && withMotif) motif
++ lib.optional withSQLite3 sqlite ++ lib.optional withSQLite3 sqlite

View file

@ -22,6 +22,11 @@
, enableWayland ? stdenv.isLinux , enableWayland ? stdenv.isLinux
, wayland , wayland
, xorg , xorg
, xcbuild
, Security
, ApplicationServices
, AppKit
, Carbon
}: }:
rustPlatform.buildRustPackage rec { rustPlatform.buildRustPackage rec {
pname = "neovide"; pname = "neovide";
@ -75,7 +80,7 @@ rustPlatform.buildRustPackage rec {
python2 # skia-bindings python2 # skia-bindings
python3 # rust-xcb python3 # rust-xcb
llvmPackages.clang # skia llvmPackages.clang # skia
]; ] ++ lib.optionals stdenv.isDarwin [ xcbuild ];
# All tests passes but at the end cargo prints for unknown reason: # All tests passes but at the end cargo prints for unknown reason:
# error: test failed, to rerun pass '--bin neovide' # error: test failed, to rerun pass '--bin neovide'
@ -98,7 +103,7 @@ rustPlatform.buildRustPackage rec {
})) }))
]; ];
})) }))
]; ] ++ lib.optionals stdenv.isDarwin [ Security ApplicationServices Carbon AppKit ];
postFixup = let postFixup = let
libPath = lib.makeLibraryPath ([ libPath = lib.makeLibraryPath ([
@ -128,7 +133,7 @@ rustPlatform.buildRustPackage rec {
homepage = "https://github.com/Kethku/neovide"; homepage = "https://github.com/Kethku/neovide";
license = with licenses; [ mit ]; license = with licenses; [ mit ];
maintainers = with maintainers; [ ck3d ]; maintainers = with maintainers; [ ck3d ];
platforms = platforms.linux; platforms = platforms.all;
mainProgram = "neovide"; mainProgram = "neovide";
}; };
} }

View file

@ -222,4 +222,44 @@ rec {
export HOME=$TMPDIR export HOME=$TMPDIR
${nvimWithLuaPackages}/bin/nvim -i NONE --noplugin -es ${nvimWithLuaPackages}/bin/nvim -i NONE --noplugin -es
''; '';
# nixpkgs should install optional packages in the opt folder
nvim_with_opt_plugin = neovim.override {
extraName = "-with-opt-plugin";
configure.packages.opt-plugins = with pkgs.vimPlugins; {
opt = [
(dashboard-nvim.overrideAttrs(old: { pname = old.pname + "-unique-for-tests-please-dont-use-opt"; }))
];
};
configure.customRC = ''
" Load all autoloaded plugins
packloadall
" Try to run Dashboard, and throw if it succeeds
try
Dashboard
echo "Dashboard found, throwing error"
cquit 1
catch /^Vim\%((\a\+)\)\=:E492/
echo "Dashboard not found"
endtry
" Load Dashboard as an optional
packadd dashboard-nvim-unique-for-tests-please-dont-use-opt
" Try to run Dashboard again, and throw if it fails
try
Dashboard
echo "Dashboard found"
catch /^Vim\%((\a\+)\)\=:E492/
echo "Dashboard not found, throwing error"
cquit 1
endtry
'';
};
run_nvim_with_opt_plugin = runTest nvim_with_opt_plugin ''
export HOME=$TMPDIR
${nvim_with_opt_plugin}/bin/nvim -i NONE +quit! -e
'';
}) })
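
A hedged sketch of the user-facing side of what the test above covers: with `configure.packages`, plugins listed under `start` load automatically, while those under `opt` stay dormant until `:packadd`; the plugin choices are only examples:

```nix
neovim.override {
  extraName = "-with-opt-example";
  configure = {
    packages.my-plugins = with pkgs.vimPlugins; {
      start = [ vim-nix ];       # loaded automatically at startup
      opt = [ dashboard-nvim ];  # installed under pack/*/opt, loaded on demand
    };
    customRC = ''
      " Pull in the optional plugin only when it is actually wanted:
      packadd dashboard-nvim
    '';
  };
}
```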

View file

@ -170,8 +170,8 @@ let
throw "The neovim legacy wrapper doesn't support configure.plug anymore, please setup your plugins via 'configure.packages' instead" throw "The neovim legacy wrapper doesn't support configure.plug anymore, please setup your plugins via 'configure.packages' instead"
else else
lib.flatten (lib.mapAttrsToList genPlugin (configure.packages or {})); lib.flatten (lib.mapAttrsToList genPlugin (configure.packages or {}));
genPlugin = packageName: {start ? [], opt?[]}: genPlugin = packageName: {start ? [], opt ? []}:
start ++ opt; start ++ (map (p: { plugin = p; optional = true; }) opt);
res = makeNeovimConfig { res = makeNeovimConfig {
inherit withPython3; inherit withPython3;

View file

@ -100,7 +100,7 @@ let
if ! $out/bin/nvim-wrapper \ if ! $out/bin/nvim-wrapper \
-u ${writeText "manifest.vim" manifestRc} \ -u ${writeText "manifest.vim" manifestRc} \
-i NONE -n \ -i NONE -n \
-E -V1rplugins.log -s \ -V1rplugins.log \
+UpdateRemotePlugins +quit! > outfile 2>&1; then +UpdateRemotePlugins +quit! > outfile 2>&1; then
cat outfile cat outfile
echo -e "\nGenerating rplugin.vim failed!" echo -e "\nGenerating rplugin.vim failed!"

View file

@ -1,8 +1,6 @@
diff --git a/src/cpp/core/libclang/LibClang.cpp b/src/cpp/core/libclang/LibClang.cpp
index 1186f3a..58e8cc7 100644
--- a/src/cpp/core/libclang/LibClang.cpp --- a/src/cpp/core/libclang/LibClang.cpp
+++ b/src/cpp/core/libclang/LibClang.cpp +++ b/src/cpp/core/libclang/LibClang.cpp
@@ -58,7 +58,7 @@ std::vector<std::string> defaultCompileArgs(LibraryVersion version) @@ -62,7 +62,7 @@
// we need to add in the associated libclang headers as // we need to add in the associated libclang headers as
// they are not discovered / used by default during compilation // they are not discovered / used by default during compilation
@ -11,7 +9,7 @@ index 1186f3a..58e8cc7 100644
boost::format fmt("%1%/lib/clang/%2%/include"); boost::format fmt("%1%/lib/clang/%2%/include");
fmt % llvmPath.getAbsolutePath() % version.asString(); fmt % llvmPath.getAbsolutePath() % version.asString();
std::string includePath = fmt.str(); std::string includePath = fmt.str();
@@ -70,46 +70,7 @@ std::vector<std::string> defaultCompileArgs(LibraryVersion version) @@ -74,47 +74,7 @@
std::vector<std::string> systemClangVersions() std::vector<std::string> systemClangVersions()
{ {
@ -55,7 +53,9 @@ index 1186f3a..58e8cc7 100644
- } - }
-#endif -#endif
- -
+ std::vector<std::string> clangVersions = { "@libclang.so@" }; - return clangVersions;
return clangVersions; + return std::vector<std::string> { "@libclang.so@" };
} }

View file

@ -39,17 +39,17 @@
let let
pname = "RStudio"; pname = "RStudio";
version = "2022.02.3+492"; version = "2022.07.1+554";
RSTUDIO_VERSION_MAJOR = "2022"; RSTUDIO_VERSION_MAJOR = "2022";
RSTUDIO_VERSION_MINOR = "02"; RSTUDIO_VERSION_MINOR = "07";
RSTUDIO_VERSION_PATCH = "3"; RSTUDIO_VERSION_PATCH = "1";
RSTUDIO_VERSION_SUFFIX = "+492"; RSTUDIO_VERSION_SUFFIX = "+554";
src = fetchFromGitHub { src = fetchFromGitHub {
owner = "rstudio"; owner = "rstudio";
repo = "rstudio"; repo = "rstudio";
rev = "v${version}"; rev = "v${version}";
sha256 = "1pgbk5rpy47h9ihdrplbfhfc49hrc6242j9099bclq7rqif049wi"; sha256 = "0rmdqxizxqg2vgr3lv066cjmlpjrxjlgi0m97wbh6iyhkfm2rrj1";
}; };
mathJaxSrc = fetchurl { mathJaxSrc = fetchurl {
@ -129,6 +129,8 @@ in
./use-system-node.patch ./use-system-node.patch
./fix-resources-path.patch ./fix-resources-path.patch
./pandoc-nix-path.patch ./pandoc-nix-path.patch
./remove-quarto-from-generator.patch
./do-not-install-pandoc.patch
]; ];
postPatch = '' postPatch = ''
@ -196,7 +198,6 @@ in
done done
rm -r $out/lib/rstudio/{INSTALL,COPYING,NOTICE,README.md,SOURCE,VERSION} rm -r $out/lib/rstudio/{INSTALL,COPYING,NOTICE,README.md,SOURCE,VERSION}
rm -r $out/lib/rstudio/bin/{pandoc/pandoc,pandoc}
''; '';
meta = with lib; { meta = with lib; {

View file

@ -0,0 +1,13 @@
--- a/src/cpp/session/CMakeLists.txt
+++ b/src/cpp/session/CMakeLists.txt
@@ -60,8 +60,7 @@
# validate our dependencies exist
foreach(VAR RSTUDIO_DEPENDENCIES_DICTIONARIES_DIR
- RSTUDIO_DEPENDENCIES_MATHJAX_DIR
- RSTUDIO_DEPENDENCIES_PANDOC_DIR)
+ RSTUDIO_DEPENDENCIES_MATHJAX_DIR)
# validate existence
if(NOT EXISTS "${${VAR}}")

View file

@ -0,0 +1,32 @@
--- a/src/cpp/session/CMakeLists.txt
+++ b/src/cpp/session/CMakeLists.txt
@@ -43,12 +43,6 @@
set(RSTUDIO_DEPENDENCIES_MATHJAX_DIR "${RSTUDIO_DEPENDENCIES_DIR}/mathjax-27")
endif()
- if(EXISTS "${RSTUDIO_TOOLS_ROOT}/quarto")
- set(RSTUDIO_DEPENDENCIES_QUARTO_DIR "${RSTUDIO_TOOLS_ROOT}/quarto")
- else()
- set(RSTUDIO_DEPENDENCIES_QUARTO_DIR "${RSTUDIO_DEPENDENCIES_DIR}/quarto")
- endif()
-
endif()
@@ -67,14 +61,7 @@
# validate our dependencies exist
foreach(VAR RSTUDIO_DEPENDENCIES_DICTIONARIES_DIR
RSTUDIO_DEPENDENCIES_MATHJAX_DIR
- RSTUDIO_DEPENDENCIES_PANDOC_DIR
- RSTUDIO_DEPENDENCIES_QUARTO_DIR)
-
-
- # skip quarto if not enabled
- if("${VAR}" STREQUAL "RSTUDIO_DEPENDENCIES_QUARTO_DIR" AND NOT QUARTO_ENABLED)
- continue()
- endif()
+ RSTUDIO_DEPENDENCIES_PANDOC_DIR)
# validate existence
if(NOT EXISTS "${${VAR}}")

View file

@ -1,11 +1,12 @@
--- a/src/gwt/build.xml --- a/src/gwt/build.xml
+++ b/src/gwt/build.xml +++ b/src/gwt/build.xml
@@ -84,23 +84,7 @@ @@ -83,24 +83,7 @@
<echo>Concatenated acesupport files to 'acesupport.js'</echo>
</target> </target>
<!-- panmirror typescript library --> - <!-- panmirror typescript library -->
- <!-- ensure version matches RSTUDIO_NODE_VERSION --> - <!-- ensure version matches RSTUDIO_NODE_VERSION -->
- <property name="node.version" value="14.17.5"/> - <property name="node.version" value="16.14.0"/>
- <property name="node.dir" value="../../dependencies/common/node/${node.version}"/> - <property name="node.dir" value="../../dependencies/common/node/${node.version}"/>
- <condition property="node.bin" value="../../../${node.dir}/bin/node"> - <condition property="node.bin" value="../../../${node.dir}/bin/node">
- <not> - <not>
@ -21,8 +22,8 @@
- property="node.bin" - property="node.bin"
- value="/opt/rstudio-tools/dependencies/common/node/${node.version}/bin/node" - value="/opt/rstudio-tools/dependencies/common/node/${node.version}/bin/node"
- file="/opt/rstudio-tools/dependencies/common/node/${node.version}/bin/node"/> - file="/opt/rstudio-tools/dependencies/common/node/${node.version}/bin/node"/>
+ <property name="node.bin" value="@node@/bin/node"/> + <property name="node.bin" value="@node@/bin/node"/>
<property name="panmirror.dir" value="./panmirror/src/editor"/> <property name="panmirror.dir" value="./panmirror/src/editor"/>
<property name="panmirror.build.dir" value="./www/js/panmirror"/> <property name="panmirror.build.dir" value="./www/js/panmirror"/>

View file

@ -2,7 +2,7 @@
{ fetchurl, stdenv, lib, xorg, glib, libglvnd, glibcLocales, gtk3, cairo, pango, makeWrapper, wrapGAppsHook { fetchurl, stdenv, lib, xorg, glib, libglvnd, glibcLocales, gtk3, cairo, pango, makeWrapper, wrapGAppsHook
, writeShellScript, common-updater-scripts, curl , writeShellScript, common-updater-scripts, curl
, openssl, bzip2, bash, unzip, zip , openssl_1_1, bzip2, bash, unzip, zip
}: }:
let let
@ -15,7 +15,7 @@ let
versionUrl = "https://download.sublimetext.com/latest/${if dev then "dev" else "stable"}"; versionUrl = "https://download.sublimetext.com/latest/${if dev then "dev" else "stable"}";
versionFile = builtins.toString ./packages.nix; versionFile = builtins.toString ./packages.nix;
libPath = lib.makeLibraryPath [ xorg.libX11 xorg.libXtst glib libglvnd openssl gtk3 cairo pango curl ]; libPath = lib.makeLibraryPath [ xorg.libX11 xorg.libXtst glib libglvnd openssl_1_1 gtk3 cairo pango curl ];
in let in let
binaryPackage = stdenv.mkDerivation rec { binaryPackage = stdenv.mkDerivation rec {
pname = "${pnameBase}-bin"; pname = "${pnameBase}-bin";
@ -65,6 +65,9 @@ in let
installPhase = '' installPhase = ''
runHook preInstall runHook preInstall
# No need to patch these libraries, it works well with our own
rm libcrypto.so.1.1 libssl.so.1.1
mkdir -p $out mkdir -p $out
cp -r * $out/ cp -r * $out/
@ -113,7 +116,7 @@ in stdenv.mkDerivation (rec {
makeWrapper "''$${primaryBinary}/${primaryBinary}" "$out/bin/${primaryBinary}" makeWrapper "''$${primaryBinary}/${primaryBinary}" "$out/bin/${primaryBinary}"
'' + builtins.concatStringsSep "" (map (binaryAlias: "ln -s $out/bin/${primaryBinary} $out/bin/${binaryAlias}\n") primaryBinaryAliases) + '' '' + builtins.concatStringsSep "" (map (binaryAlias: "ln -s $out/bin/${primaryBinary} $out/bin/${binaryAlias}\n") primaryBinaryAliases) + ''
mkdir -p "$out/share/applications" mkdir -p "$out/share/applications"
substitute "''$${primaryBinary}/${primaryBinary}.desktop" "$out/share/applications/${primaryBinary}.desktop" --replace "/opt/${primaryBinary}/${primaryBinary}" "$out/bin/${primaryBinary}" substitute "''$${primaryBinary}/${primaryBinary}.desktop" "$out/share/applications/${primaryBinary}.desktop" --replace "/opt/${primaryBinary}/${primaryBinary}" "${primaryBinary}"
for directory in ''$${primaryBinary}/Icon/*; do for directory in ''$${primaryBinary}/Icon/*; do
size=$(basename $directory) size=$(basename $directory)
mkdir -p "$out/share/icons/hicolor/$size/apps" mkdir -p "$out/share/icons/hicolor/$size/apps"

View file

@ -11,9 +11,9 @@ in
} {}; } {};
sublime4-dev = common { sublime4-dev = common {
buildVersion = "4134"; buildVersion = "4136";
dev = true; dev = true;
x64sha256 = "rd3EG8e13FsPKihSM9qjUMRsEA6joMwVqhj1NZlwIaE="; x64sha256 = "6cSaF8seS3XpXpoaROO5tpVuwJzE+z1kzwxNlOUadj0=";
aarch64sha256 = "gdfEDd2E1sew08sVmcmw21zyil8JuJJMpG2T/9Pi81E="; aarch64sha256 = "Mtib8i29FCFutRXmWPQSp9h/FcLD7+wv+tGAjwf9d3w=";
} {}; } {};
} }

File diff suppressed because it is too large Load diff

View file

@ -984,7 +984,7 @@ self: super: {
libiconv libiconv
]; ];
cargoSha256 = "sha256-QAfHhpXABuOPaHCfQQZYhBERGXMaJPFipWHt/MeSc3c="; cargoSha256 = "sha256-g5yNqDCN1O9x7/HcM8NsZlMwLudDTuPLE5gSpScNQnY=";
}; };
in in
'' ''

View file

@@ -347,6 +347,8 @@ https://github.com/ldelossa/litee-symboltree.nvim/,,
 https://github.com/ldelossa/litee.nvim/,,
 https://github.com/folke/lsp-colors.nvim/,,
 https://github.com/lukas-reineke/lsp-format.nvim/,HEAD,
+https://github.com/lvimuser/lsp-inlayhints.nvim/,HEAD,
+https://github.com/Issafalcon/lsp-overloads.nvim/,main,
 https://github.com/ahmedkhalf/lsp-rooter.nvim/,,
 https://github.com/nvim-lua/lsp-status.nvim/,,
 https://github.com/nvim-lua/lsp_extensions.nvim/,,
@@ -724,6 +726,7 @@ https://github.com/mtikekar/vim-bsv/,,
 https://github.com/jeetsukumaran/vim-buffergator/,,
 https://github.com/bling/vim-bufferline/,,
 https://github.com/qpkorr/vim-bufkill/,,
+https://github.com/isobit/vim-caddyfile/,HEAD,
 https://github.com/tpope/vim-capslock/,,
 https://github.com/kristijanhusak/vim-carbon-now-sh/,,
 https://github.com/m-pilia/vim-ccls/,,
@@ -892,6 +895,7 @@ https://github.com/tiagofumo/vim-nerdtree-syntax-highlight/,,
 https://github.com/jistr/vim-nerdtree-tabs/,,
 https://github.com/nfnty/vim-nftables/,,
 https://github.com/kana/vim-niceblock/,,
+https://github.com/nickel-lang/vim-nickel/,main,
 https://github.com/tommcdo/vim-ninja-feet/,,
 https://github.com/LnL7/vim-nix/,,
 https://github.com/symphorien/vim-nixhash/,,
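Each row in this list is the plugin's repository URL followed by an optional ref and alias; the vim plugin update script regenerates the vimPlugins set from it. Assuming the usual name normalization (dots in repo names become dashes), the new entries should surface roughly as below; the attribute names are expected values, not verified here:

with import <nixpkgs> { };
neovim.override {
  configure.packages.extraPlugins.start = with vimPlugins; [
    lsp-inlayhints-nvim   # lvimuser/lsp-inlayhints.nvim
    vim-caddyfile         # isobit/vim-caddyfile
    vim-nickel            # nickel-lang/vim-nickel
  ];
}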

View file

@@ -8,6 +8,7 @@
 , python3Packages
 , jdk
 , llvmPackages_8
+, llvmPackages_14
 , nixpkgs-fmt
 , protobuf
 , jq
@@ -2551,7 +2552,7 @@ let
     };
   };
-  vadimcn.vscode-lldb = callPackage ./vscode-lldb { };
+  vadimcn.vscode-lldb = callPackage ./vscode-lldb { llvmPackages = llvmPackages_14; };
   valentjn.vscode-ltex = vscode-utils.buildVscodeMarketplaceExtension rec {
     mktplcRef = {

View file

@@ -0,0 +1,17 @@
# This file has been generated by node2nix 1.11.1. Do not edit!
{pkgs ? import <nixpkgs> {
inherit system;
}, system ? builtins.currentSystem, nodejs ? pkgs."nodejs-14_x"}:
let
nodeEnv = import ./node-env.nix {
inherit (pkgs) stdenv lib python2 runCommand writeTextFile writeShellScript;
inherit pkgs nodejs;
libtool = if pkgs.stdenv.isDarwin then pkgs.darwin.cctools else null;
};
in
import ./node-packages.nix {
inherit (pkgs) fetchurl nix-gitignore stdenv lib fetchgit;
inherit nodeEnv;
}
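This generated entry point just wires the node2nix package set to the helpers in node-env.nix. A sketch of how a caller consumes it, mirroring the override done by the vscode-lldb expression later in this commit (the extra inputs are the same ones that expression passes in):

{ pkgs, nodejs, stdenv, src, version, pkg-config, libsecret }:

(import ./build-deps/default.nix {
  inherit pkgs nodejs;
  inherit (stdenv.hostPlatform) system;
}).nodeDependencies.override (old: {
  inherit src version;                 # use the extension's own package.json/package-lock.json
  buildInputs = [ pkg-config libsecret ];
  dontNpmInstall = true;               # only assemble node_modules; skip the npm install step
})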

View file

@@ -0,0 +1,598 @@
# This file originates from node2nix
{lib, stdenv, nodejs, python2, pkgs, libtool, runCommand, writeTextFile, writeShellScript}:
let
# Workaround to cope with utillinux in Nixpkgs 20.09 and util-linux in Nixpkgs master
utillinux = if pkgs ? utillinux then pkgs.utillinux else pkgs.util-linux;
python = if nodejs ? python then nodejs.python else python2;
# Create a tar wrapper that filters all the 'Ignoring unknown extended header keyword' noise
tarWrapper = runCommand "tarWrapper" {} ''
mkdir -p $out/bin
cat > $out/bin/tar <<EOF
#! ${stdenv.shell} -e
$(type -p tar) "\$@" --warning=no-unknown-keyword --delay-directory-restore
EOF
chmod +x $out/bin/tar
'';
# Function that generates a TGZ file from a NPM project
buildNodeSourceDist =
{ name, version, src, ... }:
stdenv.mkDerivation {
name = "node-tarball-${name}-${version}";
inherit src;
buildInputs = [ nodejs ];
buildPhase = ''
export HOME=$TMPDIR
tgzFile=$(npm pack | tail -n 1) # Hooks to the pack command will add output (https://docs.npmjs.com/misc/scripts)
'';
installPhase = ''
mkdir -p $out/tarballs
mv $tgzFile $out/tarballs
mkdir -p $out/nix-support
echo "file source-dist $out/tarballs/$tgzFile" >> $out/nix-support/hydra-build-products
'';
};
# Common shell logic
installPackage = writeShellScript "install-package" ''
installPackage() {
local packageName=$1 src=$2
local strippedName
local DIR=$PWD
cd $TMPDIR
unpackFile $src
# Make the base dir in which the target dependency resides first
mkdir -p "$(dirname "$DIR/$packageName")"
if [ -f "$src" ]
then
# Figure out what directory has been unpacked
packageDir="$(find . -maxdepth 1 -type d | tail -1)"
# Restore write permissions to make building work
find "$packageDir" -type d -exec chmod u+x {} \;
chmod -R u+w "$packageDir"
# Move the extracted tarball into the output folder
mv "$packageDir" "$DIR/$packageName"
elif [ -d "$src" ]
then
# Get a stripped name (without hash) of the source directory.
# On old nixpkgs it's already set internally.
if [ -z "$strippedName" ]
then
strippedName="$(stripHash $src)"
fi
# Restore write permissions to make building work
chmod -R u+w "$strippedName"
# Move the extracted directory into the output folder
mv "$strippedName" "$DIR/$packageName"
fi
# Change to the package directory to install dependencies
cd "$DIR/$packageName"
}
'';
# Bundle the dependencies of the package
#
# Only include dependencies if they don't exist. They may also be bundled in the package.
includeDependencies = {dependencies}:
lib.optionalString (dependencies != []) (
''
mkdir -p node_modules
cd node_modules
''
+ (lib.concatMapStrings (dependency:
''
if [ ! -e "${dependency.packageName}" ]; then
${composePackage dependency}
fi
''
) dependencies)
+ ''
cd ..
''
);
# Recursively composes the dependencies of a package
composePackage = { name, packageName, src, dependencies ? [], ... }@args:
builtins.addErrorContext "while evaluating node package '${packageName}'" ''
installPackage "${packageName}" "${src}"
${includeDependencies { inherit dependencies; }}
cd ..
${lib.optionalString (builtins.substring 0 1 packageName == "@") "cd .."}
'';
pinpointDependencies = {dependencies, production}:
let
pinpointDependenciesFromPackageJSON = writeTextFile {
name = "pinpointDependencies.js";
text = ''
var fs = require('fs');
var path = require('path');
function resolveDependencyVersion(location, name) {
if(location == process.env['NIX_STORE']) {
return null;
} else {
var dependencyPackageJSON = path.join(location, "node_modules", name, "package.json");
if(fs.existsSync(dependencyPackageJSON)) {
var dependencyPackageObj = JSON.parse(fs.readFileSync(dependencyPackageJSON));
if(dependencyPackageObj.name == name) {
return dependencyPackageObj.version;
}
} else {
return resolveDependencyVersion(path.resolve(location, ".."), name);
}
}
}
function replaceDependencies(dependencies) {
if(typeof dependencies == "object" && dependencies !== null) {
for(var dependency in dependencies) {
var resolvedVersion = resolveDependencyVersion(process.cwd(), dependency);
if(resolvedVersion === null) {
process.stderr.write("WARNING: cannot pinpoint dependency: "+dependency+", context: "+process.cwd()+"\n");
} else {
dependencies[dependency] = resolvedVersion;
}
}
}
}
/* Read the package.json configuration */
var packageObj = JSON.parse(fs.readFileSync('./package.json'));
/* Pinpoint all dependencies */
replaceDependencies(packageObj.dependencies);
if(process.argv[2] == "development") {
replaceDependencies(packageObj.devDependencies);
}
replaceDependencies(packageObj.optionalDependencies);
/* Write the fixed package.json file */
fs.writeFileSync("package.json", JSON.stringify(packageObj, null, 2));
'';
};
in
''
node ${pinpointDependenciesFromPackageJSON} ${if production then "production" else "development"}
${lib.optionalString (dependencies != [])
''
if [ -d node_modules ]
then
cd node_modules
${lib.concatMapStrings (dependency: pinpointDependenciesOfPackage dependency) dependencies}
cd ..
fi
''}
'';
# Recursively traverses all dependencies of a package and pinpoints all
# dependencies in the package.json file to the versions that are actually
# being used.
pinpointDependenciesOfPackage = { packageName, dependencies ? [], production ? true, ... }@args:
''
if [ -d "${packageName}" ]
then
cd "${packageName}"
${pinpointDependencies { inherit dependencies production; }}
cd ..
${lib.optionalString (builtins.substring 0 1 packageName == "@") "cd .."}
fi
'';
# Extract the Node.js source code which is used to compile packages with
# native bindings
nodeSources = runCommand "node-sources" {} ''
tar --no-same-owner --no-same-permissions -xf ${nodejs.src}
mv node-* $out
'';
# Script that adds _integrity fields to all package.json files to prevent NPM from consulting the cache (that is empty)
addIntegrityFieldsScript = writeTextFile {
name = "addintegrityfields.js";
text = ''
var fs = require('fs');
var path = require('path');
function augmentDependencies(baseDir, dependencies) {
for(var dependencyName in dependencies) {
var dependency = dependencies[dependencyName];
// Open package.json and augment metadata fields
var packageJSONDir = path.join(baseDir, "node_modules", dependencyName);
var packageJSONPath = path.join(packageJSONDir, "package.json");
if(fs.existsSync(packageJSONPath)) { // Only augment packages that exist. Sometimes we may have production installs in which development dependencies can be ignored
console.log("Adding metadata fields to: "+packageJSONPath);
var packageObj = JSON.parse(fs.readFileSync(packageJSONPath));
if(dependency.integrity) {
packageObj["_integrity"] = dependency.integrity;
} else {
packageObj["_integrity"] = "sha1-000000000000000000000000000="; // When no _integrity string has been provided (e.g. by Git dependencies), add a dummy one. It does not seem to harm and it bypasses downloads.
}
if(dependency.resolved) {
packageObj["_resolved"] = dependency.resolved; // Adopt the resolved property if one has been provided
} else {
packageObj["_resolved"] = dependency.version; // Set the resolved version to the version identifier. This prevents NPM from cloning Git repositories.
}
if(dependency.from !== undefined) { // Adopt from property if one has been provided
packageObj["_from"] = dependency.from;
}
fs.writeFileSync(packageJSONPath, JSON.stringify(packageObj, null, 2));
}
// Augment transitive dependencies
if(dependency.dependencies !== undefined) {
augmentDependencies(packageJSONDir, dependency.dependencies);
}
}
}
if(fs.existsSync("./package-lock.json")) {
var packageLock = JSON.parse(fs.readFileSync("./package-lock.json"));
if(![1, 2].includes(packageLock.lockfileVersion)) {
process.stderr.write("Sorry, I only understand lock file versions 1 and 2!\n");
process.exit(1);
}
if(packageLock.dependencies !== undefined) {
augmentDependencies(".", packageLock.dependencies);
}
}
'';
};
# Reconstructs a package-lock file from the node_modules/ folder structure and package.json files with dummy sha1 hashes
reconstructPackageLock = writeTextFile {
name = "addintegrityfields.js";
text = ''
var fs = require('fs');
var path = require('path');
var packageObj = JSON.parse(fs.readFileSync("package.json"));
var lockObj = {
name: packageObj.name,
version: packageObj.version,
lockfileVersion: 1,
requires: true,
dependencies: {}
};
function augmentPackageJSON(filePath, dependencies) {
var packageJSON = path.join(filePath, "package.json");
if(fs.existsSync(packageJSON)) {
var packageObj = JSON.parse(fs.readFileSync(packageJSON));
dependencies[packageObj.name] = {
version: packageObj.version,
integrity: "sha1-000000000000000000000000000=",
dependencies: {}
};
processDependencies(path.join(filePath, "node_modules"), dependencies[packageObj.name].dependencies);
}
}
function processDependencies(dir, dependencies) {
if(fs.existsSync(dir)) {
var files = fs.readdirSync(dir);
files.forEach(function(entry) {
var filePath = path.join(dir, entry);
var stats = fs.statSync(filePath);
if(stats.isDirectory()) {
if(entry.substr(0, 1) == "@") {
// When we encounter a namespace folder, augment all packages belonging to the scope
var pkgFiles = fs.readdirSync(filePath);
pkgFiles.forEach(function(entry) {
if(stats.isDirectory()) {
var pkgFilePath = path.join(filePath, entry);
augmentPackageJSON(pkgFilePath, dependencies);
}
});
} else {
augmentPackageJSON(filePath, dependencies);
}
}
});
}
}
processDependencies("node_modules", lockObj.dependencies);
fs.writeFileSync("package-lock.json", JSON.stringify(lockObj, null, 2));
'';
};
prepareAndInvokeNPM = {packageName, bypassCache, reconstructLock, npmFlags, production}:
let
forceOfflineFlag = if bypassCache then "--offline" else "--registry http://www.example.com";
in
''
# Pinpoint the versions of all dependencies to the ones that are actually being used
echo "pinpointing versions of dependencies..."
source $pinpointDependenciesScriptPath
# Patch the shebangs of the bundled modules to prevent them from
# calling executables outside the Nix store as much as possible
patchShebangs .
# Deploy the Node.js package by running npm install. Since the
# dependencies have been provided already by ourselves, it should not
# attempt to install them again, which is good, because we want to make
# it Nix's responsibility. If it needs to install any dependencies
# anyway (e.g. because the dependency parameters are
# incomplete/incorrect), it fails.
#
# The other responsibilities of NPM are kept -- version checks, build
# steps, postprocessing etc.
export HOME=$TMPDIR
cd "${packageName}"
runHook preRebuild
${lib.optionalString bypassCache ''
${lib.optionalString reconstructLock ''
if [ -f package-lock.json ]
then
echo "WARNING: Reconstruct lock option enabled, but a lock file already exists!"
echo "This will most likely result in version mismatches! We will remove the lock file and regenerate it!"
rm package-lock.json
else
echo "No package-lock.json file found, reconstructing..."
fi
node ${reconstructPackageLock}
''}
node ${addIntegrityFieldsScript}
''}
npm ${forceOfflineFlag} --nodedir=${nodeSources} ${npmFlags} ${lib.optionalString production "--production"} rebuild
if [ "''${dontNpmInstall-}" != "1" ]
then
# NPM tries to download packages even when they already exist if npm-shrinkwrap is used.
rm -f npm-shrinkwrap.json
npm ${forceOfflineFlag} --nodedir=${nodeSources} ${npmFlags} ${lib.optionalString production "--production"} install
fi
'';
# Builds and composes an NPM package including all its dependencies
buildNodePackage =
{ name
, packageName
, version ? null
, dependencies ? []
, buildInputs ? []
, production ? true
, npmFlags ? ""
, dontNpmInstall ? false
, bypassCache ? false
, reconstructLock ? false
, preRebuild ? ""
, dontStrip ? true
, unpackPhase ? "true"
, buildPhase ? "true"
, meta ? {}
, ... }@args:
let
extraArgs = removeAttrs args [ "name" "dependencies" "buildInputs" "dontStrip" "dontNpmInstall" "preRebuild" "unpackPhase" "buildPhase" "meta" ];
in
stdenv.mkDerivation ({
name = "${name}${if version == null then "" else "-${version}"}";
buildInputs = [ tarWrapper python nodejs ]
++ lib.optional (stdenv.isLinux) utillinux
++ lib.optional (stdenv.isDarwin) libtool
++ buildInputs;
inherit nodejs;
inherit dontStrip; # Stripping may fail a build for some package deployments
inherit dontNpmInstall preRebuild unpackPhase buildPhase;
compositionScript = composePackage args;
pinpointDependenciesScript = pinpointDependenciesOfPackage args;
passAsFile = [ "compositionScript" "pinpointDependenciesScript" ];
installPhase = ''
source ${installPackage}
# Create and enter a root node_modules/ folder
mkdir -p $out/lib/node_modules
cd $out/lib/node_modules
# Compose the package and all its dependencies
source $compositionScriptPath
${prepareAndInvokeNPM { inherit packageName bypassCache reconstructLock npmFlags production; }}
# Create symlink to the deployed executable folder, if applicable
if [ -d "$out/lib/node_modules/.bin" ]
then
ln -s $out/lib/node_modules/.bin $out/bin
# Patch the shebang lines of all the executables
ls $out/bin/* | while read i
do
file="$(readlink -f "$i")"
chmod u+rwx "$file"
patchShebangs "$file"
done
fi
# Create symlinks to the deployed manual page folders, if applicable
if [ -d "$out/lib/node_modules/${packageName}/man" ]
then
mkdir -p $out/share
for dir in "$out/lib/node_modules/${packageName}/man/"*
do
mkdir -p $out/share/man/$(basename "$dir")
for page in "$dir"/*
do
ln -s $page $out/share/man/$(basename "$dir")
done
done
fi
# Run post install hook, if provided
runHook postInstall
'';
meta = {
# default to Node.js' platforms
platforms = nodejs.meta.platforms;
} // meta;
} // extraArgs);
# Builds a node environment (a node_modules folder and a set of binaries)
buildNodeDependencies =
{ name
, packageName
, version ? null
, src
, dependencies ? []
, buildInputs ? []
, production ? true
, npmFlags ? ""
, dontNpmInstall ? false
, bypassCache ? false
, reconstructLock ? false
, dontStrip ? true
, unpackPhase ? "true"
, buildPhase ? "true"
, ... }@args:
let
extraArgs = removeAttrs args [ "name" "dependencies" "buildInputs" ];
in
stdenv.mkDerivation ({
name = "node-dependencies-${name}${if version == null then "" else "-${version}"}";
buildInputs = [ tarWrapper python nodejs ]
++ lib.optional (stdenv.isLinux) utillinux
++ lib.optional (stdenv.isDarwin) libtool
++ buildInputs;
inherit dontStrip; # Stripping may fail a build for some package deployments
inherit dontNpmInstall unpackPhase buildPhase;
includeScript = includeDependencies { inherit dependencies; };
pinpointDependenciesScript = pinpointDependenciesOfPackage args;
passAsFile = [ "includeScript" "pinpointDependenciesScript" ];
installPhase = ''
source ${installPackage}
mkdir -p $out/${packageName}
cd $out/${packageName}
source $includeScriptPath
# Create fake package.json to make the npm commands work properly
cp ${src}/package.json .
chmod 644 package.json
${lib.optionalString bypassCache ''
if [ -f ${src}/package-lock.json ]
then
cp ${src}/package-lock.json .
chmod 644 package-lock.json
fi
''}
# Go to the parent folder to make sure that all packages are pinpointed
cd ..
${lib.optionalString (builtins.substring 0 1 packageName == "@") "cd .."}
${prepareAndInvokeNPM { inherit packageName bypassCache reconstructLock npmFlags production; }}
# Expose the executables that were installed
cd ..
${lib.optionalString (builtins.substring 0 1 packageName == "@") "cd .."}
mv ${packageName} lib
ln -s $out/lib/node_modules/.bin $out/bin
'';
} // extraArgs);
# Builds a development shell
buildNodeShell =
{ name
, packageName
, version ? null
, src
, dependencies ? []
, buildInputs ? []
, production ? true
, npmFlags ? ""
, dontNpmInstall ? false
, bypassCache ? false
, reconstructLock ? false
, dontStrip ? true
, unpackPhase ? "true"
, buildPhase ? "true"
, ... }@args:
let
nodeDependencies = buildNodeDependencies args;
extraArgs = removeAttrs args [ "name" "dependencies" "buildInputs" "dontStrip" "dontNpmInstall" "unpackPhase" "buildPhase" ];
in
stdenv.mkDerivation ({
name = "node-shell-${name}${if version == null then "" else "-${version}"}";
buildInputs = [ python nodejs ] ++ lib.optional (stdenv.isLinux) utillinux ++ buildInputs;
buildCommand = ''
mkdir -p $out/bin
cat > $out/bin/shell <<EOF
#! ${stdenv.shell} -e
$shellHook
exec ${stdenv.shell}
EOF
chmod +x $out/bin/shell
'';
# Provide the dependencies in a development shell through the NODE_PATH environment variable
inherit nodeDependencies;
shellHook = lib.optionalString (dependencies != []) ''
export NODE_PATH=${nodeDependencies}/lib/node_modules
export PATH="${nodeDependencies}/bin:$PATH"
'';
} // extraArgs);
in
{
buildNodeSourceDist = lib.makeOverridable buildNodeSourceDist;
buildNodePackage = lib.makeOverridable buildNodePackage;
buildNodeDependencies = lib.makeOverridable buildNodeDependencies;
buildNodeShell = lib.makeOverridable buildNodeShell;
}
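For orientation: the four builders exported above are normally invoked from the node2nix-generated node-packages.nix rather than by hand, but a direct call looks roughly like this (the package data below is hypothetical):

{ pkgs ? import <nixpkgs> { }, nodejs ? pkgs.nodejs-14_x }:

let
  nodeEnv = import ./node-env.nix {
    inherit (pkgs) stdenv lib python2 runCommand writeTextFile writeShellScript;
    inherit pkgs nodejs;
    libtool = if pkgs.stdenv.isDarwin then pkgs.darwin.cctools else null;
  };
in
nodeEnv.buildNodePackage {
  name = "example-cli";        # hypothetical package
  packageName = "example-cli";
  version = "1.0.0";
  src = ./.;                   # must contain package.json
  dependencies = [ ];          # normally generated by node2nix from the lock file
  production = true;
}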

File diff suppressed because it is too large.

View file

@@ -1,23 +0,0 @@
{
"name": "vscode-lldb",
"version": "1.6.8",
"dependencies": {
"string-argv": "^0.3.1",
"yaml": "^1.10.0",
"yauzl": "^2.10.0",
"@types/vscode": "^1.31.0",
"@types/node": "^8.10.50",
"@types/mocha": "^7.0.1",
"@types/yauzl": "^2.9.0",
"typescript": "^4.2.4",
"mocha": "^8.4.0",
"source-map-support": "^0.5.12",
"memory-streams": "^0.1.3",
"vscode-debugprotocol": "^1.47.0",
"vscode-debugadapter-testsupport": "^1.47.0",
"vsce": "=1.88.0",
"webpack": "^5.37.1",
"webpack-cli": "^4.7.0",
"ts-loader": "^8.0.0"
}
}

View file

@@ -1,5 +1,5 @@
 diff --git a/CMakeLists.txt b/CMakeLists.txt
-index 37745b5..cad11a0 100644
+index 6ae4dfb..519f544 100644
 --- a/CMakeLists.txt
 +++ b/CMakeLists.txt
 @@ -16,13 +16,6 @@ endif()
@@ -13,10 +13,10 @@ index 37745b5..cad11a0 100644
 - message(FATAL_ERROR "LLDB_PACKAGE not set." )
 -endif()
 -
-set(TEST_TIMEOUT 5000 CACHE STRING "Test timeout [ms]")
-
-# General OS-specific definitions
-@@ -87,16 +80,6 @@ configure_file(package.json ${CMAKE_CURRENT_BINARY_DIR}/package.json @ONLY)
+ if (CMAKE_SYSROOT)
+     set(CMAKE_C_FLAGS "--sysroot=${CMAKE_SYSROOT} ${CMAKE_C_FLAGS}")
+     set(CMAKE_CXX_FLAGS "--sysroot=${CMAKE_SYSROOT} ${CMAKE_CXX_FLAGS}")
+@@ -93,16 +86,6 @@ configure_file(package.json ${CMAKE_CURRENT_BINARY_DIR}/package.json @ONLY)
 configure_file(webpack.config.js ${CMAKE_CURRENT_BINARY_DIR}/webpack.config.js @ONLY)
 file(COPY ${CMAKE_CURRENT_SOURCE_DIR}/package-lock.json DESTINATION ${CMAKE_CURRENT_BINARY_DIR})
@@ -33,4 +33,20 @@ index 37745b5..cad11a0 100644
 # Copy it back, so we can commit the lock file.
 file(COPY ${CMAKE_CURRENT_BINARY_DIR}/package-lock.json DESTINATION ${CMAKE_CURRENT_SOURCE_DIR})
+@@ -154,6 +137,7 @@ add_custom_target(tests
+ add_copy_file(PackageFiles ${CMAKE_CURRENT_SOURCE_DIR}/README.md ${CMAKE_CURRENT_BINARY_DIR}/README.md)
+ add_copy_file(PackageFiles ${CMAKE_CURRENT_SOURCE_DIR}/CHANGELOG.md ${CMAKE_CURRENT_BINARY_DIR}/CHANGELOG.md)
++add_copy_file(PackageFiles ${CMAKE_CURRENT_SOURCE_DIR}/LICENSE ${CMAKE_CURRENT_BINARY_DIR}/LICENSE)
+ add_copy_file(PackageFiles ${CMAKE_CURRENT_SOURCE_DIR}/images/lldb.png ${CMAKE_CURRENT_BINARY_DIR}/images/lldb.png)
+ add_copy_file(PackageFiles ${CMAKE_CURRENT_SOURCE_DIR}/images/user.svg ${CMAKE_CURRENT_BINARY_DIR}/images/user.svg)
+ add_copy_file(PackageFiles ${CMAKE_CURRENT_SOURCE_DIR}/images/users.svg ${CMAKE_CURRENT_BINARY_DIR}/images/users.svg)
+@@ -170,6 +154,7 @@ add_custom_target(dev_debugging
+ set(PackagedFilesBootstrap
+     README.md
+     CHANGELOG.md
++    LICENSE
+     extension.js
+     images/*
+     syntaxes/*
View file

@@ -1,11 +1,11 @@
-{ lib, stdenv, fetchFromGitHub, rustPlatform, makeWrapper, callPackage
-, nodePackages, cmake, nodejs, unzip, python3
+{ pkgs, lib, stdenv, fetchFromGitHub, runCommand, rustPlatform, makeWrapper, llvmPackages
+, nodePackages, cmake, nodejs, unzip, python3, pkg-config, libsecret
 }:
 assert lib.versionAtLeast python3.version "3.5";
 let
   publisher = "vadimcn";
   pname = "vscode-lldb";
-  version = "1.6.10";
+  version = "1.7.4";
   vscodeExtUniqueId = "${publisher}.${pname}";
@@ -13,19 +13,17 @@ let
     owner = "vadimcn";
     repo = "vscode-lldb";
     rev = "v${version}";
-    sha256 = "sha256-4PM/818UFHRZekfbdhS/Rz0Pu6HOjJEldi4YuBWECnI=";
+    sha256 = "sha256-yAB0qxeC2sWCQ1EcKG/7LsuUrxV/kbxkcOzRfAotxFc=";
   };
-  lldb = callPackage ./lldb.nix {};
+  # need to build a custom version of lldb and llvm for enhanced rust support
+  lldb = (import ./lldb.nix { inherit fetchFromGitHub runCommand llvmPackages; });
   adapter = rustPlatform.buildRustPackage {
     pname = "${pname}-adapter";
     inherit version src;
-    # It will pollute the build environment of `buildRustPackage`.
-    cargoPatches = [ ./reset-cargo-config.patch ];
-    cargoSha256 = "sha256-Ch1X2vN+p7oCqSs/GIu5IzG+pcSKmQ+VwP2T8ycRhos=";
+    cargoSha256 = "sha256-Ly7yIGB6kLy0c9RzWt8BFuX90dxu2QASocNTEdQA3yo=";
     nativeBuildInputs = [ makeWrapper ];
@@ -42,7 +40,14 @@ let
     doCheck = false;
   };
-  nodeDeps = nodePackages."vscode-lldb-build-deps-../../applications/editors/vscode/extensions/vscode-lldb/build-deps";
+  nodeDeps = ((import ./build-deps/default.nix {
+    inherit pkgs nodejs;
+    inherit (stdenv.hostPlatform) system;
+  }).nodeDependencies.override (old: {
+    inherit src version;
+    buildInputs = [pkg-config libsecret];
+    dontNpmInstall = true;
+  }));
 in stdenv.mkDerivation {
   pname = "vscode-extension-${publisher}-${pname}";
@@ -55,7 +60,7 @@ in stdenv.mkDerivation {
   patches = [ ./cmake-build-extension-only.patch ];
   postConfigure = ''
-    cp -r ${nodeDeps}/lib/node_modules/vscode-lldb/{node_modules,package-lock.json} .
+    cp -r ${nodeDeps}/lib/{node_modules,package-lock.json} .
   '';
   cmakeFlags = [
@@ -92,6 +97,7 @@ in stdenv.mkDerivation {
   passthru = {
     inherit lldb adapter;
+    updateScript = ./update.sh;
   };
   meta = with lib; {
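Once packaged, the extension is consumed like any other vscode-extensions entry, and the patched LLDB and the Rust adapter stay reachable through passthru for debugging the build. A minimal configuration sketch:

with import <nixpkgs> { };
vscode-with-extensions.override {
  vscodeExtensions = [
    vscode-extensions.vadimcn.vscode-lldb
  ];
}

The patched toolchain itself can be inspected separately via the vscode-extensions.vadimcn.vscode-lldb.lldb and .adapter passthru attributes exposed above.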

View file

@@ -0,0 +1,13 @@
diff --git a/bindings/python/CMakeLists.txt b/bindings/python/CMakeLists.txt
index 82a52da89a7e..5127dc1d8f41 100644
--- a/bindings/python/CMakeLists.txt
+++ b/bindings/python/CMakeLists.txt
@@ -160,7 +160,7 @@ function(finish_swig_python swig_target lldb_python_bindings_dir lldb_python_tar
if(LLDB_BUILD_FRAMEWORK)
set(LLDB_PYTHON_INSTALL_PATH ${LLDB_FRAMEWORK_INSTALL_DIR}/LLDB.framework/Versions/${LLDB_FRAMEWORK_VERSION}/Resources/Python)
else()
- set(LLDB_PYTHON_INSTALL_PATH ${LLDB_PYTHON_RELATIVE_PATH})
+ set(LLDB_PYTHON_INSTALL_PATH ${CMAKE_INSTALL_LIBDIR}/../${LLDB_PYTHON_RELATIVE_PATH})
endif()
if (NOT CMAKE_CFG_INTDIR STREQUAL ".")
string(REPLACE ${CMAKE_CFG_INTDIR} "\$\{CMAKE_INSTALL_CONFIG_NAME\}" LLDB_PYTHON_INSTALL_PATH ${LLDB_PYTHON_INSTALL_PATH})

View file

@@ -1,23 +1,35 @@
 # Patched lldb for Rust language support.
-{ lldb_12, fetchFromGitHub }:
+{ fetchFromGitHub, runCommand, llvmPackages }:
 let
   llvmSrc = fetchFromGitHub {
     owner = "vadimcn";
     repo = "llvm-project";
-    rev = "f2e9ff34256cd8c6feaf14359f88ad3f538ed687";
-    sha256 = "sha256-5UsCBu3rtt+l2HZiCswoQJPPh8T6y471TBF4AypdF9I=";
+    # codelldb/14.x branch
+    rev = "4c267c83cbb55fedf2e0b89644dc1db320fdfde7";
+    sha256 = "sha256-jM//ej6AxnRYj+8BAn4QrxHPT6HiDzK5RqHPSg3dCcw=";
   };
-in lldb_12.overrideAttrs (oldAttrs: {
-  src = "${llvmSrc}/lldb";
+in (llvmPackages.lldb.overrideAttrs (oldAttrs: rec {
   passthru = (oldAttrs.passthru or {}) // {
     inherit llvmSrc;
   };
+  patches = oldAttrs.patches ++ [
+    # backport of https://github.com/NixOS/nixpkgs/commit/0d3002334850a819d1a5c8283c39f114af907cd4
+    # remove when https://github.com/NixOS/nixpkgs/issues/166604 fixed
+    ./fix-python-installation.patch
+  ];
   doInstallCheck = true;
-  postInstallCheck = (oldAttrs.postInstallCheck or "") + ''
+  # installCheck for lldb_14 currently broken
+  # https://github.com/NixOS/nixpkgs/issues/166604#issuecomment-1086103692
+  # ignore the oldAttrs installCheck
+  installCheckPhase = ''
     versionOutput="$($out/bin/lldb --version)"
     echo "'lldb --version' returns: $versionOutput"
     echo "$versionOutput" | grep -q 'rust-enabled'
   '';
+})).override({
+  monorepoSrc = llvmSrc;
+  libllvm = llvmPackages.libllvm.override({ monorepoSrc = llvmSrc; });
 })

View file

@@ -1,19 +0,0 @@
diff --git a/.cargo/config b/.cargo/config
index c3c75e4..e69de29 100644
--- a/.cargo/config
+++ b/.cargo/config
@@ -1,14 +0,0 @@
-[build]
-target-dir = "build/target"
-
-[target.armv7-unknown-linux-gnueabihf]
-rustflags = [
- "-C", "link-arg=-fuse-ld=lld",
- "-C", "link-arg=--target=armv7-unknown-linux-gnueabihf",
-]
-
-[target.aarch64-unknown-linux-gnu]
-rustflags = [
- "-C", "link-arg=-fuse-ld=lld",
- "-C", "link-arg=--target=aarch64-unknown-linux-gnu",
-]

View file

@@ -3,10 +3,6 @@
 set -eo pipefail
 cd "$(dirname "${BASH_SOURCE[0]}")"
-if [[ $# -ne 1 ]]; then
-    echo "Usage: ./update.sh <version>"
-    exit 1
-fi
 echo "
 FIXME: This script doesn't update patched lldb. Please manually check branches
@@ -19,28 +15,31 @@ nixFile=./default.nix
 owner=vadimcn
 repo=vscode-lldb
 version="$1"
+if [[ $# -ne 1 ]]; then
+    # no version specified, find the newest one
+    version=$(
+        curl -s "https://api.github.com/repos/$owner/$repo/releases" |
+        jq 'map(select(.prerelease | not)) | .[0].tag_name' --raw-output |
+        sed 's/[\"v]//'
+    )
+fi
+old_version=$(sed -nE 's/.*\bversion = "(.*)".*/\1/p' ./default.nix)
+if grep -q 'cargoSha256 = ""' ./default.nix; then
+    old_version='broken'
+fi
+if [[ "$version" == "$old_version" ]]; then
+    echo "Up to date: $version"
+    exit
+fi
+echo "$old_version -> $version"
+# update hashes
 sed -E 's/\bversion = ".*?"/version = "'$version'"/' --in-place "$nixFile"
 srcHash=$(nix-prefetch fetchFromGitHub --owner vadimcn --repo vscode-lldb --rev "v$version")
 sed -E 's#\bsha256 = ".*?"#sha256 = "'$srcHash'"#' --in-place "$nixFile"
 cargoHash=$(nix-prefetch "{ sha256 }: (import $nixpkgs {}).vscode-extensions.vadimcn.vscode-lldb.adapter.cargoDeps.overrideAttrs (_: { outputHash = sha256; })")
 sed -E 's#\bcargoSha256 = ".*?"#cargoSha256 = "'$cargoHash'"#' --in-place "$nixFile"
+# update node dependencies
 src="$(nix-build $nixpkgs -A vscode-extensions.vadimcn.vscode-lldb.src --no-out-link)"
-oldDeps="$(jq '.dependencies' build-deps/package.json)"
-newDeps="$(jq '.dependencies + .devDependencies' "$src/package.json")"
-jq '{ name, version: $version, dependencies: (.dependencies + .devDependencies) }' \
-    --arg version "$version" \
-    "$src/package.json" \
-    > build-deps/package.json
-if [[ "$oldDeps" == "$newDeps" ]]; then
-    echo "Dependencies not changed"
-    sed '/"vscode-lldb-build-deps-/,+3 s/version = ".*"/version = "'"$version"'"/' \
-        --in-place "$nixpkgs/pkgs/development/node-packages/node-packages.nix"
-else
-    echo "Dependencies changed"
-    # Regenerate nodePackages.
-    cd "$nixpkgs/pkgs/development/node-packages"
-    exec ./generate.sh
-fi
+nix-shell -p node2nix -I nixpkgs=$nixpkgs --run "cd build-deps && ls -R && node2nix -14 -d -i \"$src/package.json\" -l \"$src/package-lock.json\""

View file

@@ -18,17 +18,17 @@ let
   archive_fmt = if stdenv.isDarwin then "zip" else "tar.gz";
   sha256 = {
-    x86_64-linux = "0cnrbjqcnkv7ybj9j7l0lcnfnxq18mddhdkj9797928q643bmj6z";
-    x86_64-darwin = "1d9gb3i2k0c9cn38igg1nm91bfqdi4xg29zlprqsqh98ijwqy25y";
-    aarch64-linux = "1jm8ll8f4m99ly53rv7000ng9a0l8jn4xpc6kfhmqdnf0jqfncsh";
-    aarch64-darwin = "1awmaxkr5nl513c50g6k4r2j3w8p2by1j9i3kw7vkmwn91bk24i4";
-    armv7l-linux = "1d2hl9jy1kfkzn4j7qkp3k8j1qc3r9rpqhvkfrr2axcqrahcrfsd";
+    x86_64-linux = "0ar8gpklaa0aa3k1934jyg2vh65hzncx0awl1f0wz8n4fjasfrpc";
+    x86_64-darwin = "0jkpzyg2pk2d88w2ffrp2lr0qadss7ccycx4vpmjmw62d3sap8n1";
+    aarch64-linux = "1g7lzqghagz63pljg4wy34z706j70vjmk49cl8v27jbnsgnva56a";
+    aarch64-darwin = "132ml95xlyv5c343bfv0gpgr8rmk85xspsy9baninlmhnmy7mivv";
+    armv7l-linux = "04anb6r7hkk3y3vahx32nxj5dz2i66rrnl0561xkcjr4cqvxykiw";
   }.${system} or throwSystem;
 in
   callPackage ./generic.nix rec {
     # Please backport all compatible updates to the stable release.
     # This is important for the extension ecosystem.
-    version = "1.71.0";
+    version = "1.71.2";
     pname = "vscode";
     executableName = "code" + lib.optionalString isInsiders "-insiders";

View file

@@ -15,11 +15,11 @@ let
   archive_fmt = if stdenv.isDarwin then "zip" else "tar.gz";
   sha256 = {
-    x86_64-linux = "03lbfl3azrjhxzkadrz632dpwnv6hyyls10mc8wzspwraz77v1m5";
-    x86_64-darwin = "1fd66fbs414lja7ca38sdgx02nw9w1qfrlxhcb52ijls5xbmbgm4";
-    aarch64-linux = "0hwzx0lvrxrzrpggpsymjzy53dq4msg0j3vrxq82308ydc5ssnzd";
-    aarch64-darwin = "0dqhi6br29bq8a97wgfxgz4d236cg0ydgaqv8j5nqjgvjwp13p9l";
-    armv7l-linux = "07qq0ic9nckl9fkk5rl9dy4gksw3l248jsy7v8ws8f3mq4l8gi49";
+    x86_64-linux = "1ajls31iqvrcnydwdn2fhajz76j60vsqhn343237jgwfbvaklvav";
+    x86_64-darwin = "100p494k1gfzhd86nj9hvh0w73i4wjn2vy6jdpb66rrmswy2hr40";
+    aarch64-linux = "066g825s79hmwl5yl7yl0yf6vzr3nagb44bcqw1zp1iqv54f40c6";
+    aarch64-darwin = "02aln53zcjp689ivq3ypid2gk9pwbqs24n1ay0hibvrpkx3v4y8k";
+    armv7l-linux = "1qvz1233k31baw09p45x67cfadsgm1jnnfc4r8yvrh75iplcspgl";
   }.${system} or throwSystem;
   sourceRoot = if stdenv.isDarwin then "" else ".";
@@ -29,7 +29,7 @@ in
   # Please backport all compatible updates to the stable release.
   # This is important for the extension ecosystem.
-  version = "1.71.0.22245";
+  version = "1.71.2.22258";
   pname = "vscodium";
   executableName = "codium";

View file

@@ -10,13 +10,13 @@
 stdenv.mkDerivation rec {
   pname = "dolphin-emu";
-  version = "5.0-16793";
+  version = "5.0-17269";
   src = fetchFromGitHub {
     owner = "dolphin-emu";
     repo = "dolphin";
-    rev = "3cd82b619388d0877436390093a6edc2319a6904";
-    sha256 = "sha256-0k+kmq/jkCy52wGcmvtwmnLxUfxk3k0mvsr5wfX8p30=";
+    rev = "48c9c224cf9f82f0f9f2690b7cc6283d7448480c";
+    sha256 = "sha256-WC3jukRygZigLx987CzRmOmJ7DeS1atXrMzU98sRzEg=";
     fetchSubmodules = true;
   };
@@ -36,7 +36,6 @@ stdenv.mkDerivation rec {
   cmakeFlags = [
     "-DUSE_SHARED_ENET=ON"
-    "-DENABLE_LTO=ON"
     "-DDOLPHIN_WC_REVISION=${src.rev}"
     "-DDOLPHIN_WC_DESCRIBE=${version}"
     "-DDOLPHIN_WC_BRANCH=master"

Some files were not shown because too many files have changed in this diff.