Compare commits

...

260 Commits

Author SHA1 Message Date
renovate[bot]
f500cebab4 Update dependency tailwindcss to v3.4.7 2024-07-25 20:38:44 +00:00
renovate[bot]
4c786d4893 Update dependency postcss to v8.4.40 2024-07-25 01:37:28 +00:00
renovate[bot]
57a0a52b0b Update alpine Docker tag to v3.20.2 2024-07-23 05:26:11 +00:00
mergify[bot]
c584c397f0
Merge pull request #2708 from eikek/update/mariadb-java-client-3.4.1
Update mariadb-java-client to 3.4.1
2024-07-18 06:21:49 +00:00
eikek-scala-steward
723241f17f Update mariadb-java-client to 3.4.1 2024-07-18 06:13:28 +00:00
renovate[bot]
54afdeb934 Update dependency @fortawesome/fontawesome-free to v6.6.0 2024-07-17 02:11:37 +00:00
renovate[bot]
a7510c02f5 Update dependency tailwindcss to v3.4.6 2024-07-16 22:29:42 +00:00
renovate[bot]
9306467583 Update dependency tailwindcss to v3.4.5 2024-07-15 22:23:29 +00:00
mergify[bot]
68caf20e24
Merge pull request #2703 from eikek/update/circe-yaml-0.15.3
Update circe-yaml to 0.15.3
2024-07-12 06:20:39 +00:00
eikek-scala-steward
e178a7359f Update circe-yaml to 0.15.3 2024-07-12 06:12:25 +00:00
mergify[bot]
b871803415
Merge pull request #2702 from eikek/update/jsoup-1.18.1
Update jsoup to 1.18.1
2024-07-11 06:20:58 +00:00
eikek-scala-steward
191357f249 Update jsoup to 1.18.1 2024-07-11 06:12:47 +00:00
eikek
294b04e590
Merge pull request #2699 from ivanbrennan/nix-secure-config
Nix module: secure config file
2024-07-08 09:59:52 +02:00
dependabot[bot]
f0f8d907df
Bump braces from 3.0.2 to 3.0.3 in /modules/webapp (#2684)
Bumps [braces](https://github.com/micromatch/braces) from 3.0.2 to 3.0.3.
- [Changelog](https://github.com/micromatch/braces/blob/master/CHANGELOG.md)
- [Commits](https://github.com/micromatch/braces/compare/3.0.2...3.0.3)

---
updated-dependencies:
- dependency-name: braces
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-08 09:42:31 +02:00
mergify[bot]
9d6cf21819
Merge pull request #2700 from eikek/update/sbt-1.10.1
Update sbt to 1.10.1
2024-07-08 06:22:04 +00:00
eikek-scala-steward
f1d6b8efb0 Update sbt to 1.10.1 2024-07-08 06:13:35 +00:00
ivanbrennan
baf5c682b0
secure nix config
Stop writing docspell config files to the world-readable nix store,
since they contain sensitive info, e.g. database passwords.

Additionally, provide a `configFile` option so users may point to a file
they've secured using their preferred secret management strategy.
2024-07-05 19:16:53 -04:00
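For illustration, a minimal NixOS sketch of how such a `configFile` option might be used; the option path `services.docspell-restserver.configFile` and the secret location are assumptions, not taken from this changeset:

```nix
# Hypothetical usage sketch -- option path and secret path are assumptions.
{
  services.docspell-restserver = {
    enable = true;
    # Point at a file kept outside the world-readable nix store,
    # e.g. provisioned via sops-nix/agenix or a root-only file.
    configFile = "/run/secrets/docspell-restserver.conf";
  };
}
```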
renovate[bot]
f626ee82e6 Update dependency cssnano to v7.0.4 2024-07-05 12:27:19 +00:00
renovate[bot]
ba06e851bd Update dependency postcss to v8.4.39 2024-06-29 21:51:59 +00:00
mergify[bot]
4549d62142
Merge pull request #2693 from eikek/update/sbt-sonatype-3.11.0
Update sbt-sonatype to 3.11.0
2024-06-27 06:21:23 +00:00
eikek-scala-steward
f4381e8972 Update sbt-sonatype to 3.11.0 2024-06-27 06:12:28 +00:00
mergify[bot]
f62df12de8
Merge pull request #2691 from eikek/update/circe-yaml-0.15.2
Update circe-yaml to 0.15.2
2024-06-25 06:20:17 +00:00
eikek-scala-steward
0228932379 Update circe-yaml to 0.15.2 2024-06-25 06:11:52 +00:00
renovate[bot]
9a9aaa5d8e Update alpine Docker tag to v3.20.1 2024-06-21 00:09:19 +00:00
renovate[bot]
80788708a4 Update dependency cssnano to v7.0.3 2024-06-19 14:57:34 +00:00
mergify[bot]
e0af3d72e9
Merge pull request #2686 from eikek/update/scalafmt-core-3.8.2
Update scalafmt-core to 3.8.2
2024-06-15 06:22:36 +00:00
eikek-scala-steward
5fab35ba0d Add 'Reformat with scalafmt 3.8.2' to .git-blame-ignore-revs 2024-06-15 06:12:35 +00:00
eikek-scala-steward
1c566cd518 Reformat with scalafmt 3.8.2
Executed command: scalafmt --non-interactive
2024-06-15 06:12:35 +00:00
eikek-scala-steward
11c5a3c612 Update scalafmt-core to 3.8.2 2024-06-15 06:12:19 +00:00
mergify[bot]
57e55a52d6
Merge pull request #2685 from eikek/update/testcontainers-scala-mariadb-0.41.4
Update testcontainers-scala-mariadb, ... to 0.41.4
2024-06-14 06:21:27 +00:00
eikek-scala-steward
bfe7ada178 Update testcontainers-scala-mariadb, ... to 0.41.4 2024-06-14 06:13:14 +00:00
renovate[bot]
94026346c4 Update actions/checkout action to v4.1.7 2024-06-12 22:50:21 +00:00
mergify[bot]
4a6412904a
Merge pull request #2680 from eikek/update/pureconfig-0.17.7
Update pureconfig, pureconfig-ip4s to 0.17.7
2024-06-10 06:22:14 +00:00
eikek-scala-steward
021c98c523 Update pureconfig, pureconfig-ip4s to 0.17.7 2024-06-10 06:13:06 +00:00
mergify[bot]
3c3aa103fa
Merge pull request #2679 from eikek/update/imageio-jpeg-3.11.0
Update imageio-jpeg, imageio-tiff to 3.11.0
2024-06-09 06:20:18 +00:00
eikek-scala-steward
cd42f33a6d Update imageio-jpeg, imageio-tiff to 3.11.0 2024-06-09 06:12:12 +00:00
mergify[bot]
5d1f49b279
Merge pull request #2677 from eikek/update/scala-java-time-2.6.0
Update scala-java-time to 2.6.0
2024-06-08 06:19:48 +00:00
eikek-scala-steward
86bbc8298d Update scala-java-time to 2.6.0 2024-06-08 06:11:39 +00:00
renovate[bot]
62f26bbf59 Update dependency tailwindcss to v3.4.4 2024-06-05 22:27:34 +00:00
renovate[bot]
1e2d46c643 Update dependency cssnano to v7.0.2 2024-06-05 12:59:57 +00:00
mergify[bot]
a88e5af64b
Merge pull request #2674 from eikek/update/scribe-3.15.0
Update scribe, scribe-slf4j2 to 3.15.0
2024-06-05 06:21:36 +00:00
eikek-scala-steward
297977f1aa Update scribe, scribe-slf4j2 to 3.15.0 2024-06-05 06:13:23 +00:00
mergify[bot]
81aba411a4
Merge pull request #2671 from eikek/update/scribe-3.14.0
Update scribe, scribe-slf4j2 to 3.14.0
2024-06-03 06:21:35 +00:00
eikek-scala-steward
768c9f71a8 Update scribe, scribe-slf4j2 to 3.14.0 2024-06-03 06:13:14 +00:00
tenpai
67cfcb275c
Adding CJK and Custom Mapping Documentation (#2669) 2024-06-02 20:57:36 +02:00
tenpai
3621d3d9b4
Add Japanese Mapping for OCR Optimization (#2668) 2024-06-02 10:37:20 +02:00
eikek
f991d6018e
Update release-drafter.yml 2024-05-31 09:01:58 +02:00
mergify[bot]
8131b444ff
Merge pull request #2670 from eikek/update/swagger-ui-5.17.14
Update swagger-ui to 5.17.14
2024-05-31 06:21:28 +00:00
eikek-scala-steward
f77142899d Update swagger-ui to 5.17.14 2024-05-31 06:13:09 +00:00
eikek
faff4308bd Add release drafter config 2024-05-30 21:06:54 +02:00
renovate[bot]
2aad27791a Update dependency flag-icons to v7.2.3 2024-05-29 15:05:37 +00:00
mergify[bot]
2fe49fa2b1
Merge pull request #2664 from eikek/update-munit
Update munit to 1.0.0, munit-cats-effect to 2.0.0
2024-05-28 19:22:48 +00:00
mergify[bot]
b7f53c78d8
Merge pull request #2663 from eikek/fix/2629-collect-output-default
Honor default value `true` for `collectOutput`
2024-05-28 19:15:27 +00:00
eikek
9c910d262e Update munit to 1.0.0, munit-cats-effect to 2.0.0 2024-05-28 21:13:44 +02:00
eikek
fd2b897f2f Honor default value true for collectOutput
Fixes #2629
2024-05-28 20:56:18 +02:00
mergify[bot]
9f3f21f627
Merge pull request #2662 from eikek/update/calev-core-0.7.3
Update calev-circe, calev-core, calev-fs2 to 0.7.3
2024-05-28 06:21:38 +00:00
eikek-scala-steward
b20e466e43 Update calev-circe, calev-core, calev-fs2 to 0.7.3 2024-05-28 06:12:52 +00:00
mergify[bot]
2bbeec677a
Merge pull request #2660 from eikek/fix/2650-addon-extract
Fix extracting addons with only a single file
2024-05-27 18:56:56 +00:00
eikek
2ca492d6cb Unwrap single directory after unzip is complete 2024-05-27 20:39:49 +02:00
eikek
62bd9844dd Fix test for non-empty sub directory
Fixes: #2650
2024-05-27 20:39:26 +02:00
mergify[bot]
870bfd9cf0
Merge pull request #2659 from eikek/fix-command-mappings-config
Move arg-mappings underneath `command` section
2024-05-27 16:02:26 +00:00
eikek
172513ce38 Move arg-mappings underneath command section
The argument mappings are part of the command configuration
2024-05-27 17:53:13 +02:00
mergify[bot]
523022988a
Merge pull request #2657 from eikek/update/swagger-ui-5.17.11
Update swagger-ui to 5.17.11
2024-05-24 21:31:44 +00:00
mergify[bot]
7cdef1915f
Merge pull request #2655 from eikek/update/scodec-bits-1.2.0
Update scodec-bits to 1.2.0
2024-05-24 21:26:11 +00:00
mergify[bot]
a3ad4479cf
Merge pull request #2653 from eikek/update/mariadb-java-client-3.4.0
Update mariadb-java-client to 3.4.0
2024-05-24 21:25:54 +00:00
mergify[bot]
c819735de5
Merge pull request #2651 from eikek/update/sourcecode-0.4.2
Update sourcecode to 0.4.2
2024-05-24 21:25:38 +00:00
mergify[bot]
22a55e75a9
Merge pull request #2652 from eikek/update/flyway-core-10.13.0
Update flyway-core, ... to 10.13.0
2024-05-24 21:25:24 +00:00
eikek-scala-steward
e5758847f6 Update swagger-ui to 5.17.11 2024-05-24 21:16:17 +00:00
eikek-scala-steward
ffeffedfbf Update scodec-bits to 1.2.0 2024-05-24 21:16:05 +00:00
eikek-scala-steward
7cd96bc59e Update mariadb-java-client to 3.4.0 2024-05-24 21:15:58 +00:00
eikek-scala-steward
f315ad32c0 Update flyway-core, ... to 10.13.0 2024-05-24 21:15:55 +00:00
eikek-scala-steward
9c8d290fa9 Update sourcecode to 0.4.2 2024-05-24 21:15:51 +00:00
eikek
d0681a12e3
Merge pull request #2646 from VTimofeenko/add-logout-url
Add logout-url option to Nix module
2024-05-24 23:05:46 +02:00
Vladimir Timofeenko
a2ae339870 Add auth.on-account-source-conflict 2024-05-24 13:28:44 -07:00
Vladimir Timofeenko
bec485de0b Add logout-url option to Nix module
Closes #2643
2024-05-22 17:18:18 -07:00
renovate[bot]
3a43ad408c chore(deps): update alpine docker tag to v3.20.0 2024-05-22 23:07:10 +00:00
renovate[bot]
f635181091 chore(deps): update dependency flag-icons to v7.2.2 2024-05-22 07:01:23 +00:00
renovate[bot]
bdf5c54ac9 chore(deps): update actions/checkout action to v4.1.6 2024-05-17 04:30:01 +00:00
renovate[bot]
a90cec4e7b chore(deps): update cachix/install-nix-action action to v27 2024-05-16 02:55:47 +00:00
mergify[bot]
92589ee2ed
Merge pull request #2635 from eikek/update/scribe-3.13.5
Update scribe, scribe-slf4j2 to 3.13.5
2024-05-10 06:34:43 +00:00
eikek-scala-steward
997fb60508 Update scribe, scribe-slf4j2 to 3.13.5 2024-05-10 06:26:37 +00:00
renovate[bot]
021f5a183e chore(deps): update postgres docker tag to v16.3 2024-05-10 04:51:20 +00:00
renovate[bot]
7969e0daa8 chore(deps): update actions/checkout action to v4.1.5 2024-05-09 00:39:19 +00:00
mergify[bot]
c935f6b4fc
Merge pull request #2631 from eikek/update/jwt-circe-10.0.1
Update jwt-circe to 10.0.1
2024-05-07 06:35:58 +00:00
eikek-scala-steward
2834513d92 Update jwt-circe to 10.0.1 2024-05-07 06:27:31 +00:00
eikek
66136b7d0f
Merge pull request #2630 from VTimofeenko/feat-add-package-option-to-nix-modules
Add package option to Nix modules
2024-05-06 19:57:35 +02:00
Vladimir Timofeenko
004add0dd1 Add package option to Nix modules
"Package" option allows specifying the derivation that will be used for
the systemd service. Works the same way as `services.<name>.package` in
NixOS. By default picks the docspell packages from pkgs instance -- same
behavior as prior to this commit

Closes #2627
2024-05-06 10:22:00 -07:00
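A minimal sketch of how the described option could look in use; the option path `services.docspell-joex.package` and the overlay-provided package attribute are assumptions based on the module names in this repository:

```nix
# Hypothetical sketch -- option path and package attribute are assumptions.
{ pkgs, ... }: {
  services.docspell-joex = {
    enable = true;
    # Override the derivation used for the systemd service instead of the
    # default picked from the pkgs instance.
    package = pkgs.docspell-joex;
  };
}
```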
mergify[bot]
adf3a9045e
Merge pull request #2628 from eikek/update/io-1.10.0
Update io, sbt to 1.10.0
2024-05-06 06:36:00 +00:00
eikek-scala-steward
6896ea0866 Update io, sbt to 1.10.0 2024-05-06 06:27:50 +00:00
mergify[bot]
3a73eb7948
Merge pull request #2626 from eikek/update/http4s-circe-0.23.27
Update http4s-circe, http4s-dsl, ... to 0.23.27
2024-05-04 06:34:06 +00:00
eikek-scala-steward
409ea99b17 Update http4s-circe, http4s-dsl, ... to 0.23.27 2024-05-04 06:25:28 +00:00
mergify[bot]
f670c5bace
Merge pull request #2625 from eikek/update/scala-library-2.13.14
Update scala-library to 2.13.14
2024-05-02 06:36:11 +00:00
mergify[bot]
a5c8f5d1a8
Merge pull request #2624 from eikek/update/sbt-scalafix-0.12.1
Update sbt-scalafix to 0.12.1
2024-05-02 06:35:43 +00:00
eikek-scala-steward
d44f4e7cfa Update scala-library to 2.13.14 2024-05-02 06:27:08 +00:00
eikek-scala-steward
775841b7de Update sbt-scalafix to 0.12.1 2024-05-02 06:27:03 +00:00
mergify[bot]
45c4bf84a1
Merge pull request #2622 from eikek/update/flyway-core-10.12.0
Update flyway-core, ... to 10.12.0
2024-04-30 06:35:38 +00:00
eikek-scala-steward
1485e688d3 Update flyway-core, ... to 10.12.0 2024-04-30 06:27:25 +00:00
renovate[bot]
77f186f320 chore(deps): update dependency cssnano to v7.0.1 2024-04-26 17:28:23 +00:00
mergify[bot]
95d4393421
Merge pull request #2619 from eikek/update/swagger-ui-5.17.2
Update swagger-ui to 5.17.2
2024-04-26 06:35:57 +00:00
eikek-scala-steward
2b4975fc08 Update swagger-ui to 5.17.2 2024-04-26 06:27:57 +00:00
renovate[bot]
ed86bfe182 chore(deps): update dependency tailwindcss to v3.4.3 2024-04-25 17:45:56 +00:00
renovate[bot]
f93a6701db chore(deps): update actions/checkout action to v4.1.4 2024-04-25 17:30:15 +00:00
eikek
9ada40c634
Merge pull request #2613 from ChanceHarrison/actual-master
docs(development.md): Fix minor typos
2024-04-25 15:30:10 +02:00
Eike
2f154146cb Merge remote-tracking branch 'origin/current-docs' 2024-04-25 14:12:28 +02:00
ChanceHarrison
08a71b1bad
Add page to website about contributing to docs (#2612) 2024-04-25 14:11:10 +02:00
Chance Harrison
893386b281
docs(development.md): Fix minor typos 2024-04-25 01:43:04 -07:00
mergify[bot]
c9f7c685db
Merge pull request #2611 from eikek/update/scribe-3.13.4
Update scribe, scribe-slf4j2 to 3.13.4
2024-04-25 06:34:35 +00:00
eikek-scala-steward
c9b1720aa5 Update scribe, scribe-slf4j2 to 3.13.4 2024-04-25 06:26:35 +00:00
renovate[bot]
dcc25805fd Update dependency cssnano to v7 2024-04-25 03:38:18 +00:00
mergify[bot]
dfe0d8e7bc
Merge pull request #2606 from eikek/update/swagger-ui-5.17.0
Update swagger-ui to 5.17.0
2024-04-24 06:37:01 +00:00
eikek-scala-steward
e8dada8720 Update swagger-ui to 5.17.0 2024-04-24 06:28:53 +00:00
renovate[bot]
2a5b7fab12 Update actions/checkout action to v4.1.3 2024-04-22 18:06:44 +00:00
renovate[bot]
6f46521a28 Update dependency @fontsource/montserrat to v5.0.18 2024-04-20 21:57:25 +00:00
mergify[bot]
81f386b7c0
Merge pull request #2601 from eikek/update/stanford-corenlp-4.5.7
Update stanford-corenlp to 4.5.7
2024-04-20 06:31:54 +00:00
eikek-scala-steward
788ffab63c Update stanford-corenlp to 4.5.7 2024-04-20 06:24:14 +00:00
mergify[bot]
df370ba221
Merge pull request #2599 from eikek/update/flyway-core-10.11.1
Update flyway-core, ... to 10.11.1
2024-04-19 06:35:25 +00:00
mergify[bot]
ffa55b9e51
Merge pull request #2598 from eikek/update/scribe-3.13.3
Update scribe, scribe-slf4j2 to 3.13.3
2024-04-19 06:34:19 +00:00
eikek-scala-steward
f96a3e6bb9 Update flyway-core, ... to 10.11.1 2024-04-19 06:26:27 +00:00
eikek-scala-steward
1425b7d21a Update scribe, scribe-slf4j2 to 3.13.3 2024-04-19 06:26:23 +00:00
mergify[bot]
230c80cae8
Merge pull request #2597 from eikek/update/swagger-ui-5.15.2-1
Update swagger-ui to 5.15.2-1
2024-04-18 06:34:13 +00:00
eikek-scala-steward
c3a7c1347c Update swagger-ui to 5.15.2-1 2024-04-18 06:26:58 +00:00
mergify[bot]
c16808a3f5
Merge pull request #2595 from eikek/update/icu4j-75.1
Update icu4j to 75.1
2024-04-17 06:37:01 +00:00
mergify[bot]
5ea4b5c6f2
Merge pull request #2596 from eikek/update/swagger-ui-5.15.2
Update swagger-ui to 5.15.2
2024-04-17 06:35:57 +00:00
eikek-scala-steward
21b1590a1d Update swagger-ui to 5.15.2 2024-04-17 06:28:14 +00:00
eikek-scala-steward
0b901ea430 Update icu4j to 75.1 2024-04-17 06:28:07 +00:00
tenpai
e731d822dc
Add Japanese Vertical Support Branch for Tesseract and Ocrmypdf OCR (#2505)
* Add Japanese Vertical Support 
* Adds Japanese Vertical mappings to default configuration.
2024-04-16 20:24:57 +02:00
mergify[bot]
36c00cc9ec
Merge pull request #2593 from eikek/update/sourcecode-0.4.1
Update sourcecode to 0.4.1
2024-04-16 06:34:49 +00:00
eikek-scala-steward
ca32f24804 Update sourcecode to 0.4.1 2024-04-16 06:25:50 +00:00
mergify[bot]
5b699fe99d
Merge pull request #2591 from eikek/update/sourcecode-0.4.0
Update sourcecode to 0.4.0
2024-04-15 06:34:10 +00:00
eikek-scala-steward
fa9c42f4b1 Update sourcecode to 0.4.0 2024-04-15 06:25:10 +00:00
mergify[bot]
76a55bed7b
Merge pull request #2588 from eikek/update/jcl-over-slf4j-2.0.13
Update jcl-over-slf4j to 2.0.13
2024-04-13 06:28:41 +00:00
mergify[bot]
e6d8a0ca83
Merge pull request #2587 from eikek/update/sbt-native-packager-1.10.0
Update sbt-native-packager to 1.10.0
2024-04-13 06:28:33 +00:00
eikek-scala-steward
826930827f Update jcl-over-slf4j to 2.0.13 2024-04-13 06:20:43 +00:00
eikek-scala-steward
15b73be1d7 Update sbt-native-packager to 1.10.0 2024-04-13 06:20:39 +00:00
mergify[bot]
342d4a88df
Merge pull request #2586 from eikek/update/swagger-ui-5.15.1
Update swagger-ui to 5.15.1
2024-04-12 06:33:53 +00:00
eikek-scala-steward
e97dda23a8 Update swagger-ui to 5.15.1 2024-04-12 06:25:59 +00:00
mergify[bot]
59182bc38a
Merge pull request #2585 from eikek/update/swagger-ui-5.15.0
Update swagger-ui to 5.15.0
2024-04-11 06:33:34 +00:00
eikek-scala-steward
9ac3055d20 Update swagger-ui to 5.15.0 2024-04-11 06:25:36 +00:00
mergify[bot]
a6d1d0e29d
Merge pull request #2583 from eikek/update/commons-io-2.16.1
Update commons-io to 2.16.1
2024-04-09 06:35:00 +00:00
eikek-scala-steward
9d072f31e0 Update commons-io to 2.16.1 2024-04-09 06:25:47 +00:00
mergify[bot]
d78b43168f
Merge pull request #2579 from eikek/update/tika-core-2.9.2
Update tika-core to 2.9.2
2024-04-03 06:32:53 +00:00
eikek-scala-steward
84174056ad Update tika-core to 2.9.2 2024-04-03 06:24:52 +00:00
renovate[bot]
76523d77b4 Update dependency @fortawesome/fontawesome-free to v6.5.2 2024-04-02 21:36:08 +00:00
mergify[bot]
7de555c9e6
Merge pull request #2577 from eikek/update/swagger-ui-5.13.0
Update swagger-ui to 5.13.0
2024-03-30 06:35:41 +00:00
mergify[bot]
a48f311227
Merge pull request #2576 from eikek/update/scalafmt-core-3.8.1
Update scalafmt-core to 3.8.1
2024-03-30 06:32:35 +00:00
eikek-scala-steward
231ce0022d Update swagger-ui to 5.13.0 2024-03-30 06:24:48 +00:00
eikek-scala-steward
38b4562dc5 Update scalafmt-core to 3.8.1 2024-03-30 06:24:29 +00:00
mergify[bot]
8001bbde6e
Merge pull request #2575 from eikek/update/commons-io-2.16.0
Update commons-io to 2.16.0
2024-03-29 06:33:05 +00:00
mergify[bot]
e27a4d6662
Merge pull request #2574 from eikek/update/fs2-core-3.10.2
Update fs2-core, fs2-io to 3.10.2
2024-03-29 06:32:54 +00:00
eikek-scala-steward
93a320bb72 Update commons-io to 2.16.0 2024-03-29 06:25:04 +00:00
eikek-scala-steward
cf4f0738da Update fs2-core, fs2-io to 3.10.2 2024-03-29 06:24:59 +00:00
renovate[bot]
53efb79cbc Update dependency tailwindcss to v3.4.3 2024-03-28 01:48:09 +00:00
renovate[bot]
21beefdc39 Update dependency tailwindcss to v3.4.2 2024-03-27 19:32:31 +00:00
mergify[bot]
7fab54b656
Merge pull request #2569 from eikek/update/swagger-ui-5.12.2
Update swagger-ui to 5.12.2
2024-03-27 06:32:51 +00:00
eikek-scala-steward
cdf6a75b4e Update swagger-ui to 5.12.2 2024-03-27 06:24:58 +00:00
renovate[bot]
e8f2bedecd Update dependency flag-icons to v7.2.1 2024-03-27 01:37:37 +00:00
renovate[bot]
0b1c924997 Update dependency cssnano to v6.1.2 2024-03-25 21:19:38 +00:00
mergify[bot]
3132b2af8f
Merge pull request #2565 from eikek/update/fs2-core-3.10.1
Update fs2-core, fs2-io to 3.10.1
2024-03-25 06:34:45 +00:00
eikek-scala-steward
78d8e7c054 Update fs2-core, fs2-io to 3.10.1 2024-03-25 06:27:04 +00:00
mergify[bot]
7f39395c1a
Merge pull request #2563 from eikek/update/scribe-3.13.2
Update scribe, scribe-slf4j2 to 3.13.2
2024-03-22 06:32:54 +00:00
eikek-scala-steward
faf5caffbe Update scribe, scribe-slf4j2 to 3.13.2 2024-03-22 06:25:14 +00:00
mergify[bot]
8b06a34fe6
Merge pull request #2562 from eikek/update/calev-core-0.7.2
Update calev-circe, calev-core, calev-fs2 to 0.7.2
2024-03-21 06:33:32 +00:00
renovate[bot]
aaedb45d96 Update dependency postcss to v8.4.38 2024-03-21 06:25:17 +00:00
eikek-scala-steward
753db5f9e4 Update calev-circe, calev-core, calev-fs2 to 0.7.2 2024-03-21 06:25:02 +00:00
renovate[bot]
2357547e70 Update dependency autoprefixer to v10.4.19 2024-03-21 03:04:59 +00:00
renovate[bot]
fcb986eca2 Update dependency postcss-import to v16.1.0 2024-03-21 02:31:02 +00:00
renovate[bot]
bfa5510442 Update dependency cssnano to v6.1.1 2024-03-20 21:29:53 +00:00
mergify[bot]
c2f8abae94
Merge pull request #2556 from eikek/update/sbt-scalajs-1.16.0
Update sbt-scalajs, scalajs-compiler, ... to 1.16.0
2024-03-20 06:39:19 +00:00
mergify[bot]
8f9e67d2a6
Merge pull request #2555 from eikek/update/scribe-3.13.1
Update scribe, scribe-slf4j2 to 3.13.1
2024-03-20 06:34:38 +00:00
mergify[bot]
3ef0a02ab1
Merge pull request #2554 from eikek/update/sbt-buildinfo-0.12.0
Update sbt-buildinfo to 0.12.0
2024-03-20 06:33:48 +00:00
eikek-scala-steward
2ece300b81 Update sbt-scalajs, scalajs-compiler, ... to 1.16.0 2024-03-20 06:25:58 +00:00
eikek-scala-steward
00ada494c4 Update scribe, scribe-slf4j2 to 3.13.1 2024-03-20 06:25:54 +00:00
eikek-scala-steward
e1e1e39606 Update sbt-buildinfo to 0.12.0 2024-03-20 06:25:48 +00:00
renovate[bot]
cd45407c2d Update dependency postcss to v8.4.37 2024-03-19 22:02:02 +00:00
mergify[bot]
e474257933
Merge pull request #2552 from eikek/update/swagger-ui-5.12.0
Update swagger-ui to 5.12.0
2024-03-19 06:32:40 +00:00
mergify[bot]
d1c5a077f1
Merge pull request #2551 from eikek/update/fs2-core-3.10.0
Update fs2-core, fs2-io to 3.10.0
2024-03-19 06:32:35 +00:00
eikek-scala-steward
2a4f37cc80 Update swagger-ui to 5.12.0 2024-03-19 06:24:49 +00:00
eikek-scala-steward
6db2d25e08 Update fs2-core, fs2-io to 3.10.0 2024-03-19 06:24:44 +00:00
renovate[bot]
cd9b49e4cc Update dependency postcss to v8.4.36 2024-03-18 01:46:46 +00:00
mergify[bot]
50ce96b8b2
Merge pull request #2544 from eikek/update/pdfbox-3.0.2
Update pdfbox to 3.0.2
2024-03-15 19:31:16 +00:00
mergify[bot]
f5514fb707
Merge pull request #2541 from eikek/update/swagger-ui-5.11.10
Update swagger-ui to 5.11.10
2024-03-15 19:31:14 +00:00
mergify[bot]
1af25ce148
Merge pull request #2546 from eikek/update/postgresql-42.7.3
Update postgresql to 42.7.3
2024-03-15 19:31:02 +00:00
mergify[bot]
19eef35b98
Merge pull request #2545 from eikek/update/flyway-core-10.10.0
Update flyway-core, ... to 10.10.0
2024-03-15 19:30:32 +00:00
eikek-scala-steward
1b7ffd4087 Update swagger-ui to 5.11.10 2024-03-15 20:23:26 +01:00
eikek-scala-steward
9253783ef0 Update pdfbox to 3.0.2 2024-03-15 20:23:09 +01:00
eikek-scala-steward
7a27fbf8fb Update flyway-core, ... to 10.10.0 2024-03-15 20:22:45 +01:00
eikek-scala-steward
f0b0906785 Update postgresql to 42.7.3 2024-03-15 20:22:20 +01:00
mergify[bot]
93b5a3ee72
Merge pull request #2547 from eikek/elm-deps
Fix renamed elm package
2024-03-15 08:32:23 +00:00
Eike Kettner
c223ba63aa Fix renamed elm package 2024-03-15 09:23:59 +01:00
eikek
f6d22523d1 Convert to stale action 2024-03-11 12:00:08 +01:00
eikek
67284d1f6a Fix tailwindcss warnings 2024-03-10 21:24:44 +01:00
mergify[bot]
247fc1d4e9
Merge pull request #2525 from eikek/update/sbt-github-pages-0.14.0
Update sbt-github-pages to 0.14.0
2024-03-10 20:15:06 +00:00
eikek-scala-steward
7c2a57966b Update sbt-github-pages to 0.14.0 2024-03-10 21:06:33 +01:00
mergify[bot]
fd927fa1e7
Merge pull request #2540 from eikek/redocly-tailwind-setup
Redocly tailwind setup
2024-03-10 20:05:40 +00:00
eikek
3d93439b28 Lower memory requirement for test-vm 2024-03-10 20:49:56 +01:00
eikek
7c123db1a3 Use tailwindcss standalone cli 2024-03-10 20:13:41 +01:00
eikek
7b53f3699f Update redocly setup 2024-03-10 19:53:36 +01:00
mergify[bot]
5715f60e96
Merge pull request #2539 from eikek/nix-refactor
Extend nix setup, including dev environments
2024-03-10 17:06:08 +00:00
eikek
ba8435c7dc Disable strict external link checking
This is so brittle that it only works sometimes.
2024-03-10 16:58:22 +01:00
eikek
8a41ed3fd3 Github actions use nix 2024-03-10 16:58:22 +01:00
eikek
3aad3b7be4 Remove other now obsolete nix files 2024-03-10 15:38:17 +01:00
eikek
f3f246d798 Rename server -> restserver in nix setup
While I'd like to rename it the other way around, it would be a much
more breaking change. So for now, this way.
2024-03-10 15:37:16 +01:00
eikek
8bcc88ed65 Document flake dev setup 2024-03-10 15:37:16 +01:00
eikek
2e18274803 Extend nix flake setup 2024-03-10 15:37:16 +01:00
renovate[bot]
4167b64e31 Update dependency autoprefixer to v10.4.18 2024-03-09 06:35:51 +00:00
mergify[bot]
55a2d1359e
Merge pull request #2537 from eikek/update/kittens-3.3.0
Update kittens to 3.3.0
2024-03-09 06:20:16 +00:00
eikek-scala-steward
442e389537 Update kittens to 3.3.0 2024-03-09 06:12:48 +00:00
eikek
2ad9e1fa1e
Merge pull request #2511 from eikek/renovate/cssnano-6.x-lockfile
Update dependency cssnano to v6.1.0
2024-03-09 01:30:36 +01:00
eikek
f7eb913994
Merge pull request #2534 from eikek/renovate/cachix-install-nix-action-26.x
Update cachix/install-nix-action action to v26
2024-03-09 01:30:22 +01:00
renovate[bot]
443ba47cfb
Update cachix/install-nix-action action to v26 2024-03-08 22:02:57 +00:00
renovate[bot]
95a28afa69
Update dependency cssnano to v6.1.0 2024-03-08 22:02:53 +00:00
renovate[bot]
9ef934f8b1 Update dependency @fontsource/montserrat to v5.0.17 2024-03-08 22:01:43 +00:00
eikek
ca2a2a32d7 Merge branch 'current-docs' 2024-03-08 21:37:26 +01:00
eikek
8269a73a83
Extend config for external commands (#2536)
Allows configuring external commands and providing different arguments
based on runtime values, like language. It extends the current config
of a command to allow an `arg-mappings` section. An example for
ocrmypdf:

```conf
ocrmypdf = {
  enabled = true
  command = {
    program = "ocrmypdf"
### new arg-mappings
    arg-mappings = {
      "mylang" = {
        value = "{{lang}}"
        mappings = [
          {
            matches = "deu"
            args = [ "-l", "deu", "--pdf-renderer", "sandwich" ]
          },
          {
            matches = ".*"
            args = [ "-l", "{{lang}}" ]
          }
        ]
      }
    }
#### end new arg-mappings
    args = [
      ### will be replaced with corresponding args from "mylang" mapping
      "{{mylang}}", 
      "--skip-text",
      "--deskew",
      "-j", "1",
      "{{infile}}",
      "{{outfile}}"
    ]
    timeout = "5 minutes"
  }
  working-dir = ${java.io.tmpdir}"/docspell-convert"
}
```

The whole section is first processed to replace all `{{…}}`
patterns with their corresponding values. Then `arg-mappings` is looked
at and the first match (value == matches) in its `mappings` array is
used to replace its name in the arguments to the command.
2024-03-08 21:34:42 +01:00
eikek
572afd2dc1 Fix array definition in config.toml 2024-03-08 21:24:40 +01:00
mergify[bot]
9c98f08520
Merge pull request #2533 from eikek/update/flyway-core-10.9.1
Update flyway-core, ... to 10.9.1
2024-03-08 06:18:50 +00:00
eikek-scala-steward
c9f2ed7185 Update flyway-core, ... to 10.9.1 2024-03-08 06:11:34 +00:00
eikek
012ef62b82 Try give more resources to sbt ci jobs 2024-03-07 21:46:14 +01:00
eikek
1691909d8f Try give more resources to ci job 2024-03-07 21:35:42 +01:00
mergify[bot]
500ae92a09
Merge pull request #2527 from eikek/update/http4s-circe-0.23.26
Update http4s-circe, http4s-dsl, ... to 0.23.26
2024-03-06 06:23:18 +00:00
mergify[bot]
57ecea818c
Merge pull request #2528 from eikek/update/cats-effect-3.5.4
Update cats-effect to 3.5.4
2024-03-06 06:22:19 +00:00
mergify[bot]
0e2bb198ae
Merge pull request #2526 from eikek/update/flyway-core-10.9.0
Update flyway-core, ... to 10.9.0
2024-03-06 06:22:16 +00:00
eikek-scala-steward
44bc8ac9ff Update cats-effect to 3.5.4 2024-03-06 06:13:31 +00:00
eikek-scala-steward
7411766ff0 Update http4s-circe, http4s-dsl, ... to 0.23.26 2024-03-06 06:13:26 +00:00
eikek-scala-steward
3d6643e98f Update flyway-core, ... to 10.9.0 2024-03-06 06:13:22 +00:00
mergify[bot]
551f96dd21
Merge pull request #2515 from eikek/update/scala-library-2.13.13
Update scala-library to 2.13.13
2024-03-03 17:52:13 +00:00
eikek
924aaf720e Fix compile warnings after scala update 2024-03-03 18:43:54 +01:00
eikek
1d149119ce Merge branch 'current-docs' 2024-03-03 09:47:40 +01:00
eikek
cea7948c2e Remove stackoverflow from external link check
It returns FORBIDDEN, so it seems it cannot be checked.
2024-03-03 09:46:15 +01:00
eikek
979bdcfeb1
Merge pull request #2523 from tenpai-git/PostgreSQL-Manual-Backup-Documentation
Pushing a minor fix to the bash commands.
2024-03-03 09:45:22 +01:00
John Baumlin
7ea9d2e634 Pushing a minor fix to the bash commands. 2024-03-03 17:25:04 +09:00
mergify[bot]
0d0b150e0f
Merge pull request #2522 from eikek/update/sbt-scalafix-0.12.0
Update sbt-scalafix to 0.12.0
2024-03-03 06:20:34 +00:00
eikek-scala-steward
d30cc73e53 Update sbt-scalafix to 0.12.0 2024-03-03 06:11:56 +00:00
mergify[bot]
7b952f3da6
Merge pull request #2519 from eikek/docker-base-image
Change docker base images to 3.19.1
2024-03-02 07:18:37 +00:00
mergify[bot]
40f4974aca
Merge pull request #2520 from eikek/update/swagger-ui-5.11.8
Update swagger-ui to 5.11.8
2024-03-01 06:20:47 +00:00
eikek-scala-steward
4200edf675 Update swagger-ui to 5.11.8 2024-03-01 06:12:14 +00:00
eikek
0a987f5b66 Change docker base images to 3.19.1
See #2504, alpine edge introduced a version of tesseract that is
problematic to use from within docspell
2024-02-29 21:52:00 +01:00
eikek-scala-steward
3e76385d08 Update scala-library to 2.13.13 2024-02-27 06:11:48 +00:00
eikek
0bba5d8e02 Merge branch 'current-docs' 2024-02-26 17:39:33 +01:00
eikek
d4eeb01c7c Fix link to modheader plugin 2024-02-26 17:36:53 +01:00
eikek
10036cd57b Fix build when bloop plugin is present
When using sbt-bloop, the build doesn't compile anymore. The reason
seems to be incompatible `sbt-io` dependencies pulled in from
`sbt-bloop` and `sbt-native-packager` (as well as `sbt-github-pages`).
Interestingly, the build compiles fine if either one of these plugins
is removed. Only together with `sbt-bloop` does the build fail to compile.
The workaround is to explicitly pull in the io dependency based on the
sbt version in use.
2024-02-26 10:27:56 +01:00
eikek
469fd70959
Merge pull request #2508 from tenpai-git/PostgreSQL-Manual-Backup-Documentation
Add documentation for backup and restore process for PostgreSQL.
2024-02-25 09:40:17 +01:00
eikek
620d97bd06
Merge pull request #2500 from nekrondev/master
fix(webapp): downstream keep-alive events to backend preventing timeout
2024-02-24 00:56:13 +01:00
eikek
1811d6f974
Merge pull request #2493 from eikek/renovate/postgres-16.x
Update postgres Docker tag to v16.2
2024-02-24 00:55:01 +01:00
eikek
b193ecc77a
Merge pull request #2501 from TheAnachronism/master
Fix some Kubernetes Kustomize deployment issues
2024-02-24 00:54:43 +01:00
renovate[bot]
bad82d01a5
Update postgres Docker tag to v16.2 2024-02-23 06:21:58 +00:00
mergify[bot]
0b51337514
Merge pull request #2512 from eikek/update/pureconfig-0.17.6
Update pureconfig, pureconfig-ip4s to 0.17.6
2024-02-23 06:21:03 +00:00
mergify[bot]
4f24625be9
Merge pull request #2513 from eikek/update/sbt-1.9.9
Update sbt to 1.9.9
2024-02-23 06:20:58 +00:00
eikek-scala-steward
21bbe67b09 Update sbt to 1.9.9 2024-02-23 06:12:50 +00:00
eikek-scala-steward
c824962925 Update pureconfig, pureconfig-ip4s to 0.17.6 2024-02-23 06:12:46 +00:00
John Baumlin
063a702a94 Also including a minor documentation update for JpnVert 2024-02-23 00:21:52 +09:00
John Baumlin
3a69bc5ee0 Adds manual backup and restore documentation for PostgreSQL as the recommended database. 2024-02-21 20:24:34 +09:00
mergify[bot]
7574dc2916
Merge pull request #2506 from eikek/update/mariadb-java-client-3.3.3
Update mariadb-java-client to 3.3.3
2024-02-21 06:19:08 +00:00
mergify[bot]
dc2937bc64
Merge pull request #2507 from eikek/update/postgresql-42.7.2
Update postgresql to 42.7.2
2024-02-21 06:19:03 +00:00
eikek-scala-steward
16db17d35c Update postgresql to 42.7.2 2024-02-21 06:10:53 +00:00
eikek-scala-steward
62f3cefc44 Update mariadb-java-client to 3.3.3 2024-02-21 06:10:49 +00:00
eikek
ba14d88f9f
Merge pull request #2496 from eikek/renovate/postcss-import-16.x-lockfile
Update dependency postcss-import to v16.0.1
2024-02-18 15:09:32 +01:00
nekrondev
d29d6adbac
fix(webapp): downstream keep-alive events to backend preventing timeout
The updated http4s component fixed the idleTimeout between backend and proxy / client, which now requires the client to send keep-alive messages to the backend to prevent a timeout after 60s.

This resolves #2497.
2024-02-17 12:07:26 +01:00
renovate[bot]
9220c4205d
Update dependency postcss-import to v16.0.1 2024-02-16 06:20:07 +00:00
eikek
d4c7766f5a Mention minimum MariaDB version 2024-01-31 19:40:02 +01:00
178 changed files with 3319 additions and 4981 deletions

2
.git-blame-ignore-revs Normal file

@@ -0,0 +1,2 @@
# Scala Steward: Reformat with scalafmt 3.8.2
1c566cd5182d41f4cc06040fc347ddb4be617779

42
.github/release-drafter.yml vendored Normal file

@@ -0,0 +1,42 @@
name-template: "$RESOLVED_VERSION"
tag-template: "$RESOLVED_VERSION"
template: |
  ## Whats Changed
  $CHANGES
categories:
  - title: "🚀 Features"
    labels:
      - 'feature'
      - 'enhancement'
  - title: "🐛 Bug Fixes"
    labels:
      - 'fix'
      - 'bug'
  - title: "💚 Maintenance"
    labels:
      - 'chore'
      - 'documentation'
  - title: "🧱 Dependencies"
    labels:
      - 'dependencies'
      - 'type: dependencies'
change-template: '- $TITLE @$AUTHOR (#$NUMBER)'
version-resolver:
  major:
    labels:
      - 'breaking'
  minor:
    labels:
      - 'feature'
      - 'enhancement'
  patch:
    labels:
      - 'chore'
      - 'documentation'
      - 'dependencies'
  default: patch
exclude-labels:
  - 'skip-changelog'

@@ -1,6 +1,6 @@
 {
   "automerge": true,
-  "labels": ["type: dependencies"],
+  "labels": ["dependencies"],
   "packageRules": [
     {
       "matchManagers": [

16
.github/stale.yml vendored

@@ -1,16 +0,0 @@
# Number of days of inactivity before an issue becomes stale
daysUntilStale: 30
# Number of days of inactivity before a stale issue is closed
daysUntilClose: 7
onlyLabels:
  - question
# Label to use when marking an issue as stale
staleLabel: stale
# Comment to post when marking an issue as stale. Set to `false` to disable
markComment: >
  This issue has been automatically marked as stale because it has not
  had recent activity. It will be closed if no further activity
  occurs. This only applies to 'question' issues. Always feel free to
  reopen or create new issues. Thank you!
# Comment to post when closing a stale issue. Set to `false` to disable
closeComment: false

@@ -6,20 +6,13 @@ on:
       - "master"
 jobs:
   check-website:
-    runs-on: ubuntu-22.04
+    runs-on: ubuntu-latest
     steps:
-      - uses: actions/checkout@v4.1.1
+      - uses: actions/checkout@v4.1.7
         with:
           fetch-depth: 0
+      - uses: cachix/install-nix-action@v27
       - name: Set current version
         run: echo "DOCSPELL_VERSION=$(cat version.sbt | grep version | cut -d= -f2 | xargs)" >> $GITHUB_ENV
-      - uses: jorelali/setup-elm@v5
-        with:
-          elm-version: 0.19.1
-      - uses: cachix/install-nix-action@v25
-        with:
-          nix_path: nixpkgs=channel:nixos-23.05
-      - name: Print nixpkgs version
-        run: nix-instantiate --eval -E '(import <nixpkgs> {}).lib.version'
       - name: Build website (${{ env.DOCSPELL_VERSION }})
-        run: nix-shell website/shell.nix --run "sbt make-website"
+        run: nix develop .#ci --command sbt make-website

@@ -5,30 +5,18 @@ on:
       - master
 jobs:
   ci-matrix:
-    runs-on: ubuntu-22.04
+    runs-on: ubuntu-latest
     strategy:
       fail-fast: false
-      matrix:
-        java: [ 'openjdk@1.17' ]
     steps:
-      - uses: actions/checkout@v4.1.1
+      - uses: actions/checkout@v4.1.7
         with:
           fetch-depth: 100
-      - uses: jorelali/setup-elm@v5
-        with:
-          elm-version: 0.19.1
-      - uses: bahmutov/npm-install@v1
-        with:
-          working-directory: modules/webapp
       - name: Fetch tags
         run: git fetch --depth=100 origin +refs/tags/*:refs/tags/*
-      - uses: olafurpg/setup-scala@v14
-        with:
-          java-version: ${{ matrix.java }}
-      # - name: Coursier cache
-      #   uses: coursier/cache-action@v6
+      - uses: cachix/install-nix-action@v27
       - name: sbt ci ${{ github.ref }}
-        run: sbt ci
+        run: nix develop .#ci --command sbt ci
   ci:
     runs-on: ubuntu-22.04
     needs: [ci-matrix]

@@ -4,9 +4,9 @@ on:
   types: [ published ]
 jobs:
   docker-images:
-    runs-on: ubuntu-22.04
+    runs-on: ubuntu-latest
     steps:
-      - uses: actions/checkout@v4.1.1
+      - uses: actions/checkout@v4.1.7
         with:
           fetch-depth: 0
       - name: Set current version

14
.github/workflows/release-drafter.yml vendored Normal file

@@ -0,0 +1,14 @@
name: Release Drafter
on:
  push:
    branches:
      - master
jobs:
  update_release_draft:
    runs-on: ubuntu-latest
    steps:
      - uses: release-drafter/release-drafter@v6
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}

@@ -5,32 +5,20 @@ on:
       - "master"
 jobs:
   release-nightly:
-    runs-on: ubuntu-22.04
+    runs-on: ubuntu-latest
     strategy:
       fail-fast: true
-      matrix:
-        java: [ 'openjdk@1.17' ]
     steps:
-      - uses: actions/checkout@v4.1.1
+      - uses: actions/checkout@v4.1.7
         with:
           fetch-depth: 0
-      - uses: olafurpg/setup-scala@v14
-        with:
-          java-version: ${{ matrix.java }}
-      - uses: jorelali/setup-elm@v5
-        with:
-          elm-version: 0.19.1
-      - uses: bahmutov/npm-install@v1
-        with:
-          working-directory: modules/webapp
-      # - name: Coursier cache
-      #   uses: coursier/cache-action@v6
+      - uses: cachix/install-nix-action@v27
       - name: Set current version
         run: echo "DOCSPELL_VERSION=$(cat version.sbt | grep version | cut -d= -f2 | xargs)" >> $GITHUB_ENV
       - name: sbt ci ${{ github.ref }}
-        run: sbt ci
+        run: nix develop .#ci --command sbt ci
       - name: sbt make-pkg (${{ env.DOCSPELL_VERSION }})
-        run: sbt make-pkg
+        run: nix develop .#ci --command sbt make-pkg
       - uses: "marvinpinto/action-automatic-releases@latest"
         with:
           repo_token: "${{ secrets.GITHUB_TOKEN }}"

@@ -5,30 +5,18 @@ on:
       - 'v*'
 jobs:
   release:
-    runs-on: ubuntu-22.04
+    runs-on: ubuntu-latest
     strategy:
       fail-fast: true
-      matrix:
-        java: [ 'openjdk@1.17' ]
     steps:
-      - uses: actions/checkout@v4.1.1
+      - uses: actions/checkout@v4.1.7
         with:
           fetch-depth: 0
-      - uses: olafurpg/setup-scala@v14
-        with:
-          java-version: ${{ matrix.java }}
-      - uses: jorelali/setup-elm@v5
-        with:
-          elm-version: 0.19.1
-      - uses: bahmutov/npm-install@v1
-        with:
-          working-directory: modules/webapp
-      # - name: Coursier cache
-      #   uses: coursier/cache-action@v6
+      - uses: cachix/install-nix-action@v27
       - name: Set current version
         run: echo "DOCSPELL_VERSION=$(cat version.sbt | grep version | cut -d= -f2 | xargs)" >> $GITHUB_ENV
       - name: sbt make-pkg (${{ env.DOCSPELL_VERSION }})
-        run: sbt make-pkg
+        run: nix develop .#ci --command sbt make-pkg
       - uses: meeDamian/github-release@2.0
         with:
           token: ${{ secrets.GITHUB_TOKEN }}

21
.github/workflows/stale.yml vendored Normal file

@@ -0,0 +1,21 @@
name: 'Handle stale issues'
on:
  schedule:
    - cron: '30 1 * * *'
jobs:
  stale:
    runs-on: ubuntu-latest
    steps:
      # https://github.com/actions/stale
      - uses: actions/stale@v9
        with:
          days-before-stale: 30
          days-before-close: 7
          only-labels: question
          stale-issue-label: stale
          stale-issue-message: >
            This issue has been automatically marked as stale because it has not
            had recent activity. It will be closed if no further activity
            occurs. This only applies to 'question' issues. Always feel free to
            reopen or create new issues. Thank you!

@@ -5,24 +5,17 @@ on:
       - "current-docs"
 jobs:
   publish-website:
-    runs-on: ubuntu-22.04
+    runs-on: ubuntu-latest
     steps:
-      - uses: actions/checkout@v4.1.1
+      - uses: actions/checkout@v4.1.7
         with:
           fetch-depth: 0
+      - uses: cachix/install-nix-action@v27
       - name: Set current version
         run: echo "DOCSPELL_VERSION=$(cat version.sbt | grep version | cut -d= -f2 | xargs)" >> $GITHUB_ENV
-      - uses: jorelali/setup-elm@v5
-        with:
-          elm-version: 0.19.1
-      - uses: cachix/install-nix-action@v25
-        with:
-          nix_path: nixpkgs=channel:nixos-23.05
-      - name: Print nixpkgs version
-        run: nix-instantiate --eval -E '(import <nixpkgs> {}).lib.version'
       - name: Build website (${{ env.DOCSPELL_VERSION }})
-        run: nix-shell website/shell.nix --run "sbt make-website"
+        run: nix develop .#ci --command sbt make-website
       - name: Publish website (${{ env.DOCSPELL_VERSION }})
         env:
           GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
-        run: sbt publish-website
+        run: nix develop .#ci --command sbt publish-website

1
.gitignore vendored

@@ -1,4 +1,5 @@
 #artwork/*.png
+.envrc
 target/
 local/
 node_modules/

@@ -6,7 +6,7 @@ pull_request_rules:
     actions:
       assign:
         users: [eikek]
       label:
-        add: ["type: dependencies"]
+        add: ["dependencies"]
   - name: automatically merge Scala Steward PRs on CI success
     conditions:
       - author=eikek-scala-steward[bot]

@@ -1,4 +1,4 @@
-version = "3.7.17"
+version = "3.8.2"
 preset = default
 align.preset = some

@@ -1020,7 +1020,7 @@ Additionally there are some other minor features and bug fixes.
   to be able to add a request header. Check [this for
   firefox](https://addons.mozilla.org/en-US/firefox/addon/modheader-firefox/)
   or [this for
-  chromium](https://chrome.google.com/webstore/detail/modheader/idgpnmonknjnojddfkpgkljpfnnfcklj)
+  chromium](https://chromewebstore.google.com/detail/modheader-modify-http-hea/idgpnmonknjnojddfkpgkljpfnnfcklj)
   - then add the request header `Docspell-Ui` with value `1`.
     Reloading the page gets you back the old ui.
 - With new Web-UI, certain features and fixes were realized, but not

@@ -15,11 +15,14 @@ val scalafixSettings = Seq(
 val sharedSettings = Seq(
   organization := "com.github.eikek",
-  scalaVersion := "2.13.12",
+  scalaVersion := "2.13.14",
   organizationName := "Eike K. & Contributors",
-  licenses += ("AGPL-3.0-or-later", url(
-    "https://spdx.org/licenses/AGPL-3.0-or-later.html"
-  )),
+  licenses += (
+    "AGPL-3.0-or-later",
+    url(
+      "https://spdx.org/licenses/AGPL-3.0-or-later.html"
+    )
+  ),
   startYear := Some(2020),
   headerLicenseStyle := HeaderLicenseStyle.SpdxSyntax,
   headerSources / excludeFilter := HiddenFileFilter || "*.java" || "StringUtil.scala",
@@ -677,7 +680,11 @@ val restapi = project
     openapiTargetLanguage := Language.Scala,
     openapiPackage := Pkg("docspell.restapi.model"),
     openapiSpec := (Compile / resourceDirectory).value / "docspell-openapi.yml",
-    openapiStaticGen := OpenApiDocGenerator.Redoc
+    openapiStaticGen := OpenApiDocGenerator.Redoc,
+    openapiRedoclyCmd := Seq("redocly-cli"),
+    openapiRedoclyConfig := Some(
+      (LocalRootProject / baseDirectory).value / "project" / "redocly.yml"
+    )
   )
   .dependsOn(common, query.jvm, notificationApi, jsonminiq, addonlib)
@@ -697,7 +704,11 @@ val joexapi = project
     openapiTargetLanguage := Language.Scala,
     openapiPackage := Pkg("docspell.joexapi.model"),
     openapiSpec := (Compile / resourceDirectory).value / "joex-openapi.yml",
-    openapiStaticGen := OpenApiDocGenerator.Redoc
+    openapiStaticGen := OpenApiDocGenerator.Redoc,
+    openapiRedoclyCmd := Seq("redocly-cli"),
+    openapiRedoclyConfig := Some(
+      (LocalRootProject / baseDirectory).value / "project" / "redocly.yml"
+    )
   )
   .dependsOn(common, loggingScribe, addonlib)

@@ -109,7 +109,7 @@ services:
       - restserver
   db:
-    image: postgres:16.1
+    image: postgres:16.3
     container_name: postgres_db
     restart: unless-stopped
     volumes:

@@ -1,4 +1,4 @@
-FROM alpine:20231219
+FROM alpine:3.20.2
 ARG version=
 ARG joex_url=
@@ -77,7 +77,7 @@ RUN \
   wget https://github.com/tesseract-ocr/tessdata/raw/main/khm.traineddata && \
   mv khm.traineddata /usr/share/tessdata
-# Using these data files for japanese, because they work better. See #973
+# Using these data files for japanese, because they work better. Includes vertical data. See #973 and #2445.
 RUN \
   wget https://raw.githubusercontent.com/tesseract-ocr/tessdata_fast/master/jpn_vert.traineddata && \
   wget https://raw.githubusercontent.com/tesseract-ocr/tessdata_fast/master/jpn.traineddata && \

@@ -1,4 +1,4 @@
-FROM alpine:20231219
+FROM alpine:3.20.2
 ARG version=
 ARG restserver_url=

130
flake.lock Normal file

@@ -0,0 +1,130 @@
{
"nodes": {
"devshell-tools": {
"inputs": {
"flake-utils": "flake-utils",
"nixpkgs": "nixpkgs"
},
"locked": {
"lastModified": 1710099997,
"narHash": "sha256-WmBKTLdth6I/D+0//9enbIXohGsBjepbjIAm9pCYj0U=",
"owner": "eikek",
"repo": "devshell-tools",
"rev": "e82faf976d318b3829f6f7f6785db6f3c7b65267",
"type": "github"
},
"original": {
"owner": "eikek",
"repo": "devshell-tools",
"type": "github"
}
},
"flake-utils": {
"inputs": {
"systems": "systems"
},
"locked": {
"lastModified": 1709126324,
"narHash": "sha256-q6EQdSeUZOG26WelxqkmR7kArjgWCdw5sfJVHPH/7j8=",
"owner": "numtide",
"repo": "flake-utils",
"rev": "d465f4819400de7c8d874d50b982301f28a84605",
"type": "github"
},
"original": {
"owner": "numtide",
"repo": "flake-utils",
"type": "github"
}
},
"flake-utils_2": {
"inputs": {
"systems": "systems_2"
},
"locked": {
"lastModified": 1709126324,
"narHash": "sha256-q6EQdSeUZOG26WelxqkmR7kArjgWCdw5sfJVHPH/7j8=",
"owner": "numtide",
"repo": "flake-utils",
"rev": "d465f4819400de7c8d874d50b982301f28a84605",
"type": "github"
},
"original": {
"owner": "numtide",
"repo": "flake-utils",
"type": "github"
}
},
"nixpkgs": {
"locked": {
"lastModified": 1709309926,
"narHash": "sha256-VZFBtXGVD9LWTecGi6eXrE0hJ/mVB3zGUlHImUs2Qak=",
"owner": "NixOS",
"repo": "nixpkgs",
"rev": "79baff8812a0d68e24a836df0a364c678089e2c7",
"type": "github"
},
"original": {
"owner": "NixOS",
"ref": "nixos-23.11",
"repo": "nixpkgs",
"type": "github"
}
},
"nixpkgs_2": {
"locked": {
"lastModified": 1709677081,
"narHash": "sha256-tix36Y7u0rkn6mTm0lA45b45oab2cFLqAzDbJxeXS+c=",
"owner": "NixOS",
"repo": "nixpkgs",
"rev": "880992dcc006a5e00dd0591446fdf723e6a51a64",
"type": "github"
},
"original": {
"owner": "NixOS",
"ref": "nixos-23.11",
"repo": "nixpkgs",
"type": "github"
}
},
"root": {
"inputs": {
"devshell-tools": "devshell-tools",
"flake-utils": "flake-utils_2",
"nixpkgs": "nixpkgs_2"
}
},
"systems": {
"locked": {
"lastModified": 1681028828,
"narHash": "sha256-Vy1rq5AaRuLzOxct8nz4T6wlgyUR7zLU309k9mBC768=",
"owner": "nix-systems",
"repo": "default",
"rev": "da67096a3b9bf56a91d16901293e51ba5b49a27e",
"type": "github"
},
"original": {
"owner": "nix-systems",
"repo": "default",
"type": "github"
}
},
"systems_2": {
"locked": {
"lastModified": 1681028828,
"narHash": "sha256-Vy1rq5AaRuLzOxct8nz4T6wlgyUR7zLU309k9mBC768=",
"owner": "nix-systems",
"repo": "default",
"rev": "da67096a3b9bf56a91d16901293e51ba5b49a27e",
"type": "github"
},
"original": {
"owner": "nix-systems",
"repo": "default",
"type": "github"
}
}
},
"root": "root",
"version": 7
}

193
flake.nix Normal file

@@ -0,0 +1,193 @@
{
description = "Docspell";
inputs = {
nixpkgs.url = "github:NixOS/nixpkgs/nixos-23.11";
devshell-tools.url = "github:eikek/devshell-tools";
flake-utils.url = "github:numtide/flake-utils";
};
outputs = {
self,
nixpkgs,
flake-utils,
devshell-tools,
}:
flake-utils.lib.eachDefaultSystem (system: let
pkgs = nixpkgs.legacyPackages.${system};
sbt17 = pkgs.sbt.override {jre = pkgs.jdk17;};
ciPkgs = with pkgs; [
sbt17
jdk17
dpkg
elmPackages.elm
fakeroot
zola
yarn
nodejs
redocly-cli
tailwindcss
];
devshellPkgs =
ciPkgs
++ (with pkgs; [
jq
scala-cli
netcat
wget
which
inotifyTools
]);
docspellPkgs = pkgs.callPackage (import ./nix/pkg.nix) {};
dockerAmd64 = pkgs.pkgsCross.gnu64.callPackage (import ./nix/docker.nix) {
inherit (docspellPkgs) docspell-restserver docspell-joex;
};
dockerArm64 = pkgs.pkgsCross.aarch64-multiplatform.callPackage (import ./nix/docker.nix) {
inherit (docspellPkgs) docspell-restserver docspell-joex;
};
in {
formatter = pkgs.alejandra;
packages = {
inherit (docspellPkgs) docspell-restserver docspell-joex;
};
legacyPackages = {
docker = {
amd64 = {
inherit (dockerAmd64) docspell-restserver docspell-joex;
};
arm64 = {
inherit (dockerArm64) docspell-restserver docspell-joex;
};
};
};
checks = {
build-server = self.packages.${system}.docspell-restserver;
build-joex = self.packages.${system}.docspell-joex;
test = with import (nixpkgs + "/nixos/lib/testing-python.nix")
{
inherit system;
};
makeTest {
name = "docspell";
nodes = {
machine = {...}: {
nixpkgs.overlays = [self.overlays.default];
imports = [
self.nixosModules.default
./nix/checks
];
};
};
testScript = builtins.readFile ./nix/checks/testScript.py;
};
};
devShells = {
dev-cnt = pkgs.mkShellNoCC {
buildInputs =
(builtins.attrValues devshell-tools.legacyPackages.${system}.cnt-scripts)
++ devshellPkgs;
DOCSPELL_ENV = "dev";
DEV_CONTAINER = "docsp-dev";
SBT_OPTS = "-Xmx2G -Xss4m";
};
dev-vm = pkgs.mkShellNoCC {
buildInputs =
(builtins.attrValues devshell-tools.legacyPackages.${system}.vm-scripts)
++ devshellPkgs;
DOCSPELL_ENV = "dev";
SBT_OPTS = "-Xmx2G -Xss4m";
DEV_VM = "dev-vm";
VM_SSH_PORT = "10022";
};
ci = pkgs.mkShellNoCC {
buildInputs = ciPkgs;
SBT_OPTS = "-Xmx2G -Xss4m";
};
};
})
// {
nixosModules = {
default = {...}: {
imports = [
./nix/modules/server.nix
./nix/modules/joex.nix
];
};
server = import ./nix/modules/server.nix;
joex = import ./nix/modules/joex.nix;
};
overlays.default = final: prev: let
docspellPkgs = final.callPackage (import ./nix/pkg.nix) {};
in {
inherit (docspellPkgs) docspell-restserver docspell-joex;
};
nixosConfigurations = {
test-vm = devshell-tools.lib.mkVm {
system = "x86_64-linux";
modules = [
self.nixosModules.default
{
nixpkgs.overlays = [self.overlays.default];
}
./nix/test-vm.nix
];
};
docsp-dev = devshell-tools.lib.mkContainer {
system = "x86_64-linux";
modules = [
{
services.dev-postgres = {
enable = true;
databases = ["docspell"];
};
services.dev-email.enable = true;
services.dev-minio.enable = true;
services.dev-solr = {
enable = true;
cores = ["docspell"];
};
}
];
};
dev-vm = devshell-tools.lib.mkVm {
system = "x86_64-linux";
modules = [
{
networking.hostName = "dev-vm";
virtualisation.memorySize = 2048;
services.dev-postgres = {
enable = true;
databases = ["docspell"];
};
services.dev-email.enable = true;
services.dev-minio.enable = true;
services.dev-solr = {
enable = true;
cores = ["docspell"];
heap = 512;
};
port-forward.ssh = 10022;
port-forward.dev-postgres = 6534;
port-forward.dev-smtp = 10025;
port-forward.dev-imap = 10143;
port-forward.dev-webmail = 8080;
port-forward.dev-minio-api = 9000;
port-forward.dev-minio-console = 9001;
port-forward.dev-solr = 8983;
}
];
};
};
};
}

@@ -38,9 +38,9 @@ final case class AddonArchive(url: LenientUri, name: String, version: String) {
       Files[F].createDirectories(target) *>
         reader(url)
           .through(Zip[F](logger.some).unzip(glob = glob, targetDir = target.some))
-          .evalTap(_ => Directory.unwrapSingle[F](logger, target))
           .compile
           .drain
+          .flatTap(_ => Directory.unwrapSingle[F](logger, target))
           .as(target)
   }
 }

@@ -110,7 +110,7 @@ private[addons] object RunnerUtil {
   ): F[AddonResult] =
     for {
       stdout <-
-        if (ctx.meta.options.exists(_.collectOutput)) CollectOut.buffer[F]
+        if (ctx.meta.parseResult) CollectOut.buffer[F]
         else CollectOut.none[F].pure[F]
       cmdResult <- SysExec(cmd, logger, ctx.baseDir.some)
         .flatMap(
@@ -135,7 +135,7 @@ private[addons] object RunnerUtil {
       out <- stdout.get
       _ <- logger.debug(s"Addon stdout: $out")
       result = Option
-        .when(ctx.meta.options.exists(_.collectOutput) && out.nonEmpty)(
+        .when(ctx.meta.parseResult && out.nonEmpty)(
           JsonParser
             .decode[AddonOutput](out)
             .fold(AddonResult.decodingError, AddonResult.success)

@@ -9,7 +9,7 @@ package docspell.addons
 import cats.effect._
 import cats.syntax.option._
-import docspell.common.UrlReader
+import docspell.common._
 import docspell.logging.TestLoggingConfig
 import munit._
@@ -42,10 +42,20 @@ class AddonArchiveTest extends CatsEffectSuite with TestLoggingConfig with Fixtu
     } yield ()
   }
+  tempDir.test("read archive from zip with yaml only") { dir =>
+    for {
+      aa <- AddonArchive.read[IO](singleFileAddonUrl, UrlReader.defaultReader[IO], None)
+      _ = assertEquals(aa.version, "0.7.0")
+      path <- aa.extractTo(UrlReader.defaultReader[IO], dir)
+      read <- AddonArchive.read[IO](aa.url, UrlReader.defaultReader[IO], path.some)
+      _ = assertEquals(aa, read)
+    } yield ()
+  }
   tempDir.test("Read generated addon from path") { dir =>
     AddonGenerator.successAddon("mini-addon").use { addon =>
       for {
-        archive <- IO(AddonArchive(addon.url, "", ""))
+        archive <- IO(AddonArchive(addon.url, "test-addon", "0.1.0"))
         path <- archive.extractTo[IO](UrlReader.defaultReader[IO], dir)
         read <- AddonArchive.read[IO](addon.url, UrlReader.defaultReader[IO], path.some)

@@ -142,7 +142,7 @@ class AddonExecutorTest extends CatsEffectSuite with Fixtures with TestLoggingCo
         AddonExecutionResult.executionResultMonoid
           .combine(
             AddonExecutionResult.empty,
-            AddonExecutionResult(Nil, true)
+            AddonExecutionResult(Nil, pure = true)
           )
           .pure
       )

@@ -27,9 +27,9 @@ object AddonGenerator {
   ): Resource[IO, AddonArchive] =
     output match {
       case None =>
-        generate(name, version, false)("exit 0")
+        generate(name, version, collectOutput = false)("exit 0")
       case Some(out) =>
-        generate(name, version, true)(
+        generate(name, version, collectOutput = true)(
           s"""
              |cat <<-EOF
              |${out.asJson.noSpaces}
@@ -77,8 +77,9 @@ object AddonGenerator {
       meta = AddonMeta.Meta(name, version, None),
       triggers = Set(AddonTriggerType.ExistingItem: AddonTriggerType).some,
       args = None,
-      runner =
-        AddonMeta.Runner(None, None, AddonMeta.TrivialRunner(true, "addon.sh").some).some,
+      runner = AddonMeta
+        .Runner(None, None, AddonMeta.TrivialRunner(enable = true, "addon.sh").some)
+        .some,
       options =
         AddonMeta.Options(networking = !collectOutput, collectOutput = collectOutput).some
     )

View File

@ -35,4 +35,13 @@ class AddonMetaTest extends CatsEffectSuite with TestLoggingConfig with Fixtures
_ = assertEquals(meta, dummyAddonMeta) _ = assertEquals(meta, dummyAddonMeta)
} yield () } yield ()
} }
test("parse yaml with defaults") {
val yamlStr = """meta:
| name: "test"
| version: "0.1.0"
|""".stripMargin
val meta = AddonMeta.fromYamlString(yamlStr).fold(throw _, identity)
assert(meta.parseResult)
}
} }

View File

@ -31,6 +31,9 @@ trait Fixtures extends TestLoggingConfig { self: CatsEffectSuite =>
val miniAddonUrl = val miniAddonUrl =
LenientUri.fromJava(getClass.getResource("/minimal-addon.zip")) LenientUri.fromJava(getClass.getResource("/minimal-addon.zip"))
val singleFileAddonUrl =
LenientUri.fromJava(getClass.getResource("/docspell-addon-single-file.zip"))
val dummyAddonMeta = val dummyAddonMeta =
AddonMeta( AddonMeta(
meta = meta =
@ -40,13 +43,13 @@ trait Fixtures extends TestLoggingConfig { self: CatsEffectSuite =>
), ),
None, None,
runner = Runner( runner = Runner(
nix = NixRunner(true).some, nix = NixRunner(enable = true).some,
docker = DockerRunner( docker = DockerRunner(
enable = true, enable = true,
image = None, image = None,
build = "Dockerfile".some build = "Dockerfile".some
).some, ).some,
trivial = TrivialRunner(true, "src/addon.sh").some trivial = TrivialRunner(enable = true, "src/addon.sh").some
).some, ).some,
options = Options(networking = true, collectOutput = true).some options = Options(networking = true, collectOutput = true).some
) )
@ -55,7 +58,7 @@ trait Fixtures extends TestLoggingConfig { self: CatsEffectSuite =>
Path(s"/tmp/target/test-temp") Path(s"/tmp/target/test-temp")
val tempDir = val tempDir =
ResourceFixture[Path]( ResourceFunFixture[Path](
Resource.eval(Files[IO].createDirectories(baseTempDir)) *> Resource.eval(Files[IO].createDirectories(baseTempDir)) *>
Files[IO] Files[IO]
.tempDirectory(baseTempDir.some, "run-", PosixPermissions.fromOctal("777")) .tempDirectory(baseTempDir.some, "run-", PosixPermissions.fromOctal("777"))
@ -65,7 +68,7 @@ trait Fixtures extends TestLoggingConfig { self: CatsEffectSuite =>
runner: RunnerType, runner: RunnerType,
runners: RunnerType* runners: RunnerType*
): AddonExecutorConfig = { ): AddonExecutorConfig = {
val nspawn = NSpawn(false, "sudo", "systemd-nspawn", Duration.millis(100)) val nspawn = NSpawn(enabled = false, "sudo", "systemd-nspawn", Duration.millis(100))
AddonExecutorConfig( AddonExecutorConfig(
runner = runner :: runners.toList, runner = runner :: runners.toList,
runTimeout = Duration.minutes(2), runTimeout = Duration.minutes(2),

View File

@ -125,6 +125,7 @@ object DateFind {
case Language.Dutch => dmy.or(ymd).or(mdy) case Language.Dutch => dmy.or(ymd).or(mdy)
case Language.Latvian => dmy.or(lavLong).or(ymd) case Language.Latvian => dmy.or(lavLong).or(ymd)
case Language.Japanese => ymd case Language.Japanese => ymd
case Language.JpnVert => ymd
case Language.Hebrew => dmy case Language.Hebrew => dmy
case Language.Lithuanian => ymd case Language.Lithuanian => ymd
case Language.Polish => dmy case Language.Polish => dmy

View File

@ -54,6 +54,8 @@ object MonthName {
latvian latvian
case Language.Japanese => case Language.Japanese =>
japanese japanese
case Language.JpnVert =>
japanese
case Language.Hebrew => case Language.Hebrew =>
hebrew hebrew
case Language.Lithuanian => case Language.Lithuanian =>

View File

@ -22,7 +22,7 @@ import munit._
class StanfordNerAnnotatorSuite extends FunSuite with TestLoggingConfig { class StanfordNerAnnotatorSuite extends FunSuite with TestLoggingConfig {
lazy val germanClassifier = lazy val germanClassifier =
new StanfordCoreNLP(Properties.nerGerman(None, false)) new StanfordCoreNLP(Properties.nerGerman(None, highRecall = false))
lazy val englishClassifier = lazy val englishClassifier =
new StanfordCoreNLP(Properties.nerEnglish(None)) new StanfordCoreNLP(Properties.nerEnglish(None))

View File

@ -90,6 +90,6 @@ object Config {
} }
object Addons { object Addons {
val disabled: Addons = val disabled: Addons =
Addons(false, false, UrlMatcher.False, UrlMatcher.True) Addons(enabled = false, allowImpure = false, UrlMatcher.False, UrlMatcher.True)
} }
} }

View File

@ -127,7 +127,7 @@ object Login {
_ <- logF.trace(s"Account lookup: $data") _ <- logF.trace(s"Account lookup: $data")
res <- data match { res <- data match {
case Some(d) if checkNoPassword(d, Set(AccountSource.OpenId)) => case Some(d) if checkNoPassword(d, Set(AccountSource.OpenId)) =>
doLogin(config, d.account, false) doLogin(config, d.account, rememberMe = false)
case Some(d) if checkNoPassword(d, Set(AccountSource.Local)) => case Some(d) if checkNoPassword(d, Set(AccountSource.Local)) =>
config.onAccountSourceConflict match { config.onAccountSourceConflict match {
case OnAccountSourceConflict.Fail => case OnAccountSourceConflict.Fail =>
@ -145,7 +145,7 @@ object Login {
AccountSource.OpenId AccountSource.OpenId
) )
) )
res <- doLogin(config, d.account, false) res <- doLogin(config, d.account, rememberMe = false)
} yield res } yield res
} }
case _ => case _ =>
@ -212,7 +212,12 @@ object Login {
val okResult: F[Result] = val okResult: F[Result] =
for { for {
_ <- store.transact(RUser.updateLogin(sf.token.account)) _ <- store.transact(RUser.updateLogin(sf.token.account))
newToken <- AuthToken.user(sf.token.account, false, config.serverSecret, None) newToken <- AuthToken.user(
sf.token.account,
requireSecondFactor = false,
config.serverSecret,
None
)
rem <- OptionT rem <- OptionT
.whenF(sf.rememberMe && config.rememberMe.enabled)( .whenF(sf.rememberMe && config.rememberMe.enabled)(
insertRememberToken(store, sf.token.account, config) insertRememberToken(store, sf.token.account, config)
@ -239,7 +244,9 @@ object Login {
(for { (for {
_ <- validateToken _ <- validateToken
key <- EitherT.fromOptionF( key <- EitherT.fromOptionF(
store.transact(RTotp.findEnabledByUserId(sf.token.account.userId, true)), store.transact(
RTotp.findEnabledByUserId(sf.token.account.userId, enabled = true)
),
Result.invalidAuth Result.invalidAuth
) )
now <- EitherT.right[Result](Timestamp.current[F]) now <- EitherT.right[Result](Timestamp.current[F])
@ -255,7 +262,12 @@ object Login {
def okResult(acc: AccountInfo) = def okResult(acc: AccountInfo) =
for { for {
_ <- store.transact(RUser.updateLogin(acc)) _ <- store.transact(RUser.updateLogin(acc))
token <- AuthToken.user(acc, false, config.serverSecret, None) token <- AuthToken.user(
acc,
requireSecondFactor = false,
config.serverSecret,
None
)
} yield Result.ok(token, None) } yield Result.ok(token, None)
def rememberedLogin(rid: Ident) = def rememberedLogin(rid: Ident) =

View File

@ -93,7 +93,7 @@ object AddonOps {
AddonResult.executionFailed( AddonResult.executionFailed(
new Exception(s"Addon run config ${id.id} not found.") new Exception(s"Addon run config ${id.id} not found.")
) :: Nil, ) :: Nil,
false pure = false
) :: Nil, ) :: Nil,
Nil Nil
) )

View File

@ -72,7 +72,7 @@ private[joex] class AddonPrepare[F[_]: Sync](store: Store[F]) extends LoggerExte
token <- AuthToken.user( token <- AuthToken.user(
account, account,
false, requireSecondFactor = false,
secret.getOrElse(ByteVector.empty), secret.getOrElse(ByteVector.empty),
tokenValidity.some tokenValidity.some
) )

View File

@ -194,7 +194,14 @@ object OCollective {
id <- Ident.randomId[F] id <- Ident.randomId[F]
settings = sett.emptyTrash.getOrElse(EmptyTrash.default) settings = sett.emptyTrash.getOrElse(EmptyTrash.default)
args = EmptyTrashArgs(cid, settings.minAge) args = EmptyTrashArgs(cid, settings.minAge)
ut = UserTask(id, EmptyTrashArgs.taskName, true, settings.schedule, None, args) ut = UserTask(
id,
EmptyTrashArgs.taskName,
enabled = true,
settings.schedule,
None,
args
)
_ <- uts.updateOneTask(UserTaskScope.collective(cid), args.makeSubject.some, ut) _ <- uts.updateOneTask(UserTaskScope.collective(cid), args.makeSubject.some, ut)
_ <- joex.notifyAllNodes _ <- joex.notifyAllNodes
} yield () } yield ()
@ -220,7 +227,7 @@ object OCollective {
ut = UserTask( ut = UserTask(
id, id,
LearnClassifierArgs.taskName, LearnClassifierArgs.taskName,
true, enabled = true,
CalEvent(WeekdayComponent.All, DateEvent.All, TimeEvent.All), CalEvent(WeekdayComponent.All, DateEvent.All, TimeEvent.All),
None, None,
args args
@ -239,7 +246,7 @@ object OCollective {
ut = UserTask( ut = UserTask(
id, id,
EmptyTrashArgs.taskName, EmptyTrashArgs.taskName,
true, enabled = true,
CalEvent(WeekdayComponent.All, DateEvent.All, TimeEvent.All), CalEvent(WeekdayComponent.All, DateEvent.All, TimeEvent.All),
None, None,
args args

View File

@ -114,14 +114,14 @@ object ONotification {
) )
_ <- notMod.send(logbuf._2.andThen(log), ev, ch) _ <- notMod.send(logbuf._2.andThen(log), ev, ch)
logs <- logbuf._1.get logs <- logbuf._1.get
res = SendTestResult(true, logs) res = SendTestResult(success = true, logs)
} yield res).attempt } yield res).attempt
.map { .map {
case Right(res) => res case Right(res) => res
case Left(ex) => case Left(ex) =>
val ev = val ev =
LogEvent.of(Level.Error, "Failed sending sample event").addError(ex) LogEvent.of(Level.Error, "Failed sending sample event").addError(ex)
SendTestResult(false, Vector(ev)) SendTestResult(success = false, Vector(ev))
} }
def listChannels(userId: Ident): F[Vector[Channel]] = def listChannels(userId: Ident): F[Vector[Channel]] =

View File

@ -120,7 +120,9 @@ object OTotp {
def confirmInit(accountId: AccountInfo, otp: OnetimePassword): F[ConfirmResult] = def confirmInit(accountId: AccountInfo, otp: OnetimePassword): F[ConfirmResult] =
for { for {
_ <- log.info(s"Confirm TOTP setup for account ${accountId.asString}") _ <- log.info(s"Confirm TOTP setup for account ${accountId.asString}")
key <- store.transact(RTotp.findEnabledByUserId(accountId.userId, false)) key <- store.transact(
RTotp.findEnabledByUserId(accountId.userId, enabled = false)
)
now <- Timestamp.current[F] now <- Timestamp.current[F]
res <- key match { res <- key match {
case None => case None =>
@ -129,7 +131,7 @@ object OTotp {
val check = totp.checkPassword(r.secret, otp, now.value) val check = totp.checkPassword(r.secret, otp, now.value)
if (check) if (check)
store store
.transact(RTotp.setEnabled(accountId.userId, true)) .transact(RTotp.setEnabled(accountId.userId, enabled = true))
.map(_ => ConfirmResult.Success) .map(_ => ConfirmResult.Success)
else ConfirmResult.Failed.pure[F] else ConfirmResult.Failed.pure[F]
} }
@ -140,7 +142,7 @@ object OTotp {
case Some(pw) => case Some(pw) =>
for { for {
_ <- log.info(s"Validating TOTP, because it is requested to disable it.") _ <- log.info(s"Validating TOTP, because it is requested to disable it.")
key <- store.transact(RTotp.findEnabledByLogin(accountId, true)) key <- store.transact(RTotp.findEnabledByLogin(accountId, enabled = true))
now <- Timestamp.current[F] now <- Timestamp.current[F]
res <- key match { res <- key match {
case None => case None =>
@ -149,7 +151,7 @@ object OTotp {
val check = totp.checkPassword(r.secret, pw, now.value) val check = totp.checkPassword(r.secret, pw, now.value)
if (check) if (check)
UpdateResult.fromUpdate( UpdateResult.fromUpdate(
store.transact(RTotp.setEnabled(r.userId, false)) store.transact(RTotp.setEnabled(r.userId, enabled = false))
) )
else else
log.info(s"TOTP code was invalid. Not disabling it.") *> UpdateResult log.info(s"TOTP code was invalid. Not disabling it.") *> UpdateResult
@ -160,15 +162,15 @@ object OTotp {
case None => case None =>
UpdateResult.fromUpdate { UpdateResult.fromUpdate {
(for { (for {
key <- OptionT(RTotp.findEnabledByLogin(accountId, true)) key <- OptionT(RTotp.findEnabledByLogin(accountId, enabled = true))
n <- OptionT.liftF(RTotp.setEnabled(key.userId, false)) n <- OptionT.liftF(RTotp.setEnabled(key.userId, enabled = false))
} yield n).mapK(store.transform).getOrElse(0) } yield n).mapK(store.transform).getOrElse(0)
} }
} }
def state(acc: AccountInfo): F[OtpState] = def state(acc: AccountInfo): F[OtpState] =
for { for {
record <- store.transact(RTotp.findEnabledByUserId(acc.userId, true)) record <- store.transact(RTotp.findEnabledByUserId(acc.userId, enabled = true))
result = record match { result = record match {
case Some(r) => case Some(r) =>
OtpState.Enabled(r.created) OtpState.Enabled(r.created)

View File

@ -159,7 +159,7 @@ object OUpload {
data.meta.skipDuplicates, data.meta.skipDuplicates,
data.meta.fileFilter.some, data.meta.fileFilter.some,
data.meta.tags.some, data.meta.tags.some,
false, reprocess = false,
data.meta.attachmentsOnly, data.meta.attachmentsOnly,
data.meta.customData data.meta.customData
) )

View File

@ -32,9 +32,12 @@ class AuthTokenTest extends CatsEffectSuite {
val otherSecret = ByteVector.fromValidHex("16bad") val otherSecret = ByteVector.fromValidHex("16bad")
test("validate") { test("validate") {
val token1 = AuthToken.user[IO](user, false, secret, None).unsafeRunSync() val token1 =
AuthToken.user[IO](user, requireSecondFactor = false, secret, None).unsafeRunSync()
val token2 = val token2 =
AuthToken.user[IO](user, false, secret, Duration.seconds(10).some).unsafeRunSync() AuthToken
.user[IO](user, requireSecondFactor = false, secret, Duration.seconds(10).some)
.unsafeRunSync()
assert(token1.validate(secret, Duration.seconds(5))) assert(token1.validate(secret, Duration.seconds(5)))
assert(!token1.validate(otherSecret, Duration.seconds(5))) assert(!token1.validate(otherSecret, Duration.seconds(5)))
assert(!token1.copy(account = john).validate(secret, Duration.seconds(5))) assert(!token1.copy(account = john).validate(secret, Duration.seconds(5)))
@ -46,9 +49,12 @@ class AuthTokenTest extends CatsEffectSuite {
} }
test("signature") { test("signature") {
val token1 = AuthToken.user[IO](user, false, secret, None).unsafeRunSync() val token1 =
AuthToken.user[IO](user, requireSecondFactor = false, secret, None).unsafeRunSync()
val token2 = val token2 =
AuthToken.user[IO](user, false, secret, Duration.seconds(10).some).unsafeRunSync() AuthToken
.user[IO](user, requireSecondFactor = false, secret, Duration.seconds(10).some)
.unsafeRunSync()
assert(token1.sigValid(secret)) assert(token1.sigValid(secret))
assert(token1.sigInvalid(otherSecret)) assert(token1.sigInvalid(otherSecret))

View File

@ -123,6 +123,11 @@ object Language {
val iso3 = "jpn" val iso3 = "jpn"
} }
/* Not an ISO value, but it must be unique; tesseract needs "jpn_vert" to OCR vertical text, as configured in /etc/docspell-joex/docspell-joex.conf. */
case object JpnVert extends Language {
val iso2 = "ja_vert"
val iso3 = "jpn_vert"
}
case object Hebrew extends Language { case object Hebrew extends Language {
val iso2 = "he" val iso2 = "he"
val iso3 = "heb" val iso3 = "heb"
@ -172,6 +177,7 @@ object Language {
Romanian, Romanian,
Latvian, Latvian,
Japanese, Japanese,
JpnVert,
Hebrew, Hebrew,
Lithuanian, Lithuanian,
Polish, Polish,

View File

@ -78,7 +78,11 @@ case class LenientUri(
.covary[F] .covary[F]
.rethrow .rethrow
.flatMap(url => .flatMap(url =>
fs2.io.readInputStream(Sync[F].delay(url.openStream()), chunkSize, true) fs2.io.readInputStream(
Sync[F].delay(url.openStream()),
chunkSize,
closeAfterUse = true
)
) )
def readText[F[_]: Sync](chunkSize: Int): F[String] = def readText[F[_]: Sync](chunkSize: Int): F[String] =
@ -121,7 +125,7 @@ object LenientUri {
val isRoot = true val isRoot = true
val isEmpty = false val isEmpty = false
def /(seg: String): Path = def /(seg: String): Path =
NonEmptyPath(NonEmptyList.of(seg), false) NonEmptyPath(NonEmptyList.of(seg), trailingSlash = false)
def asString = "/" def asString = "/"
} }
case object EmptyPath extends Path { case object EmptyPath extends Path {
@ -129,7 +133,7 @@ object LenientUri {
val isRoot = false val isRoot = false
val isEmpty = true val isEmpty = true
def /(seg: String): Path = def /(seg: String): Path =
NonEmptyPath(NonEmptyList.of(seg), false) NonEmptyPath(NonEmptyList.of(seg), trailingSlash = false)
def asString = "" def asString = ""
} }
case class NonEmptyPath(segs: NonEmptyList[String], trailingSlash: Boolean) case class NonEmptyPath(segs: NonEmptyList[String], trailingSlash: Boolean)

View File

@ -194,7 +194,7 @@ object MimeType {
val csValueStart = in.substring(n + "charset=".length).trim val csValueStart = in.substring(n + "charset=".length).trim
val csName = csValueStart.indexOf(';') match { val csName = csValueStart.indexOf(';') match {
case -1 => unquote(csValueStart).trim case -1 => unquote(csValueStart).trim
case n => unquote(csValueStart.substring(0, n)).trim case n2 => unquote(csValueStart.substring(0, n2)).trim
} }
if (Charset.isSupported(csName)) Right((Some(Charset.forName(csName)), "")) if (Charset.isSupported(csName)) Right((Some(Charset.forName(csName)), ""))
else Right((None, "")) else Right((None, ""))

View File

@ -1,212 +0,0 @@
/*
* Copyright 2020 Eike K. & Contributors
*
* SPDX-License-Identifier: AGPL-3.0-or-later
*/
package docspell.common
import java.io.InputStream
import java.lang.ProcessBuilder.Redirect
import java.util.concurrent.TimeUnit
import scala.jdk.CollectionConverters._
import cats.effect._
import cats.implicits._
import fs2.io.file.Path
import fs2.{Stream, io, text}
import docspell.common.{exec => newExec}
import docspell.logging.Logger
// better use `SysCmd` and `SysExec`
object SystemCommand {
final case class Config(
program: String,
args: Seq[String],
timeout: Duration,
env: Map[String, String] = Map.empty
) {
def toSysCmd = newExec
.SysCmd(program, newExec.Args(args))
.withTimeout(timeout)
.addEnv(newExec.Env(env))
def mapArgs(f: String => String): Config =
Config(program, args.map(f), timeout)
def replace(repl: Map[String, String]): Config =
mapArgs(s =>
repl.foldLeft(s) { case (res, (k, v)) =>
res.replace(k, v)
}
)
def withEnv(key: String, value: String): Config =
copy(env = env.updated(key, value))
def addEnv(moreEnv: Map[String, String]): Config =
copy(env = env ++ moreEnv)
def appendArgs(extraArgs: Args): Config =
copy(args = args ++ extraArgs.args)
def appendArgs(extraArgs: Seq[String]): Config =
copy(args = args ++ extraArgs)
def toCmd: List[String] =
program :: args.toList
lazy val cmdString: String =
toCmd.mkString(" ")
}
final case class Args(args: Vector[String]) extends Iterable[String] {
override def iterator = args.iterator
def prepend(a: String): Args = Args(a +: args)
def prependWhen(flag: Boolean)(a: String): Args =
prependOption(Option.when(flag)(a))
def prependOption(value: Option[String]): Args =
value.map(prepend).getOrElse(this)
def append(a: String, as: String*): Args =
Args(args ++ (a +: as.toVector))
def appendOption(value: Option[String]): Args =
value.map(append(_)).getOrElse(this)
def appendOptionVal(first: String, second: Option[String]): Args =
second.map(b => append(first, b)).getOrElse(this)
def appendWhen(flag: Boolean)(a: String, as: String*): Args =
if (flag) append(a, as: _*) else this
def appendWhenNot(flag: Boolean)(a: String, as: String*): Args =
if (!flag) append(a, as: _*) else this
def append(p: Path): Args =
append(p.toString)
def append(as: Iterable[String]): Args =
Args(args ++ as.toVector)
}
object Args {
val empty: Args = Args()
def apply(as: String*): Args =
Args(as.toVector)
}
final case class Result(rc: Int, stdout: String, stderr: String)
def exec[F[_]: Sync](
cmd: Config,
logger: Logger[F],
wd: Option[Path] = None,
stdin: Stream[F, Byte] = Stream.empty
): Stream[F, Result] =
startProcess(cmd, wd, logger, stdin) { proc =>
Stream.eval {
for {
_ <- writeToProcess(stdin, proc)
term <- Sync[F].blocking(proc.waitFor(cmd.timeout.seconds, TimeUnit.SECONDS))
_ <-
if (term)
logger.debug(s"Command `${cmd.cmdString}` finished: ${proc.exitValue}")
else
logger.warn(
s"Command `${cmd.cmdString}` did not finish in ${cmd.timeout.formatExact}!"
)
_ <- if (!term) timeoutError(proc, cmd) else Sync[F].pure(())
out <-
if (term) inputStreamToString(proc.getInputStream)
else Sync[F].pure("")
err <-
if (term) inputStreamToString(proc.getErrorStream)
else Sync[F].pure("")
} yield Result(proc.exitValue, out, err)
}
}
def execSuccess[F[_]: Sync](
cmd: Config,
logger: Logger[F],
wd: Option[Path] = None,
stdin: Stream[F, Byte] = Stream.empty
): Stream[F, Result] =
exec(cmd, logger, wd, stdin).flatMap { r =>
if (r.rc != 0)
Stream.raiseError[F](
new Exception(
s"Command `${cmd.cmdString}` returned non-zero exit code ${r.rc}. Stderr: ${r.stderr}"
)
)
else Stream.emit(r)
}
private def startProcess[F[_]: Sync, A](
cmd: Config,
wd: Option[Path],
logger: Logger[F],
stdin: Stream[F, Byte]
)(
f: Process => Stream[F, A]
): Stream[F, A] = {
val log = logger.debug(s"Running external command: ${cmd.cmdString}")
val hasStdin = stdin.take(1).compile.last.map(_.isDefined)
val proc = log *> hasStdin.flatMap(flag =>
Sync[F].blocking {
val pb = new ProcessBuilder(cmd.toCmd.asJava)
.redirectInput(if (flag) Redirect.PIPE else Redirect.INHERIT)
.redirectError(Redirect.PIPE)
.redirectOutput(Redirect.PIPE)
val pbEnv = pb.environment()
cmd.env.foreach { case (key, value) =>
pbEnv.put(key, value)
}
wd.map(_.toNioPath.toFile).foreach(pb.directory)
pb.start()
}
)
Stream
.bracket(proc)(p =>
logger.debug(s"Closing process: `${cmd.cmdString}`").map(_ => p.destroy())
)
.flatMap(f)
}
private def inputStreamToString[F[_]: Sync](in: InputStream): F[String] =
io.readInputStream(Sync[F].pure(in), 16 * 1024, closeAfterUse = false)
.through(text.utf8.decode)
.chunks
.map(_.toVector.mkString)
.fold1(_ + _)
.compile
.last
.map(_.getOrElse(""))
private def writeToProcess[F[_]: Sync](
data: Stream[F, Byte],
proc: Process
): F[Unit] =
data
.through(io.writeOutputStream(Sync[F].blocking(proc.getOutputStream)))
.compile
.drain
private def timeoutError[F[_]: Sync](proc: Process, cmd: Config): F[Unit] =
Sync[F].blocking(proc.destroyForcibly()).attempt *> {
Sync[F].raiseError(
new Exception(
s"Command `${cmd.cmdString}` timed out (${cmd.timeout.formatExact})"
)
)
}
}

View File

@ -62,7 +62,7 @@ object UrlMatcher {
// strip path to only match prefixes // strip path to only match prefixes
val mPath: LenientUri.Path = val mPath: LenientUri.Path =
NonEmptyList.fromList(url.path.segments.take(pathSegmentCount)) match { NonEmptyList.fromList(url.path.segments.take(pathSegmentCount)) match {
case Some(nel) => LenientUri.NonEmptyPath(nel, false) case Some(nel) => LenientUri.NonEmptyPath(nel, trailingSlash = false)
case None => LenientUri.RootPath case None => LenientUri.RootPath
} }

View File

@ -17,6 +17,9 @@ case class Env(values: Map[String, String]) {
def addAll(e: Env): Env = def addAll(e: Env): Env =
Env(values ++ e.values) Env(values ++ e.values)
def modifyValue(f: String => String): Env =
Env(values.view.mapValues(f).toMap)
def ++(e: Env) = addAll(e) def ++(e: Env) = addAll(e)
def foreach(f: (String, String) => Unit): Unit = def foreach(f: (String, String) => Unit): Unit =

View File

@ -0,0 +1,89 @@
/*
* Copyright 2020 Eike K. & Contributors
*
* SPDX-License-Identifier: AGPL-3.0-or-later
*/
package docspell.common.exec
import docspell.common.Duration
import docspell.common.Ident
import docspell.common.exec.Env
import docspell.common.exec.ExternalCommand.ArgMapping
import docspell.common.exec.SysCmd
final case class ExternalCommand(
program: String,
args: Seq[String],
timeout: Duration,
env: Map[String, String] = Map.empty,
argMappings: Map[Ident, ArgMapping] = Map.empty
) {
def withVars(vars: Map[String, String]): ExternalCommand.WithVars =
ExternalCommand.WithVars(this, vars)
import ExternalCommand.pattern
def resolve(vars: Map[String, String]): SysCmd = {
val replace = ExternalCommand.replaceString(vars) _
val resolvedArgMappings =
argMappings.view.mapValues(_.resolve(replace).firstMatch).toMap
val resolvedArgs = args.map(replace).flatMap { arg =>
resolvedArgMappings
.find(e => pattern(e._1.id) == arg)
.map(_._2)
.getOrElse(List(arg))
}
SysCmd(replace(program), resolvedArgs: _*)
.withTimeout(timeout)
.withEnv(_ => Env(env).modifyValue(replace))
}
}
object ExternalCommand {
private val openPattern = "{{"
private val closePattern = "}}"
private def pattern(s: String): String = s"${openPattern}${s}${closePattern}"
def apply(program: String, args: Seq[String], timeout: Duration): ExternalCommand =
ExternalCommand(program, args, timeout, Map.empty, Map.empty)
final case class ArgMapping(
value: String,
mappings: List[ArgMatch]
) {
private[exec] def resolve(replace: String => String): ArgMapping =
ArgMapping(replace(value), mappings.map(_.resolve(replace)))
def firstMatch: List[String] =
mappings.find(am => value.matches(am.matches)).map(_.args).getOrElse(Nil)
}
final case class ArgMatch(
matches: String,
args: List[String]
) {
private[exec] def resolve(replace: String => String): ArgMatch =
ArgMatch(replace(matches), args.map(replace))
}
private def replaceString(vars: Map[String, String])(in: String): String =
vars.foldLeft(in) { case (result, (name, value)) =>
val key = s"{{$name}}"
result.replace(key, value)
}
final case class WithVars(cmd: ExternalCommand, vars: Map[String, String]) {
def resolved: SysCmd = cmd.resolve(vars)
def append(more: (String, String)*): WithVars =
WithVars(cmd, vars ++ more.toMap)
def withVar(key: String, value: String): WithVars =
WithVars(cmd, vars.updated(key, value))
def withVarOption(key: String, value: Option[String]): WithVars =
value.map(withVar(key, _)).getOrElse(this)
}
}

View File
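
A rough sketch of how the new ExternalCommand above is meant to be used (signatures taken from this diff; the argument list is modeled on the wkhtmltopdf commands appearing elsewhere in this change set, with {{encoding}} and the output path added purely for illustration): placeholders of the form {{name}} in program, args and env are substituted from the supplied variables, and the result is a plain SysCmd ready to be handed to SysExec.

import docspell.common.Duration
import docspell.common.exec.{ExternalCommand, SysCmd}

// Command template with {{...}} placeholders.
val wkhtml = ExternalCommand(
  program = "wkhtmltopdf",
  args = Seq("-s", "A4", "--encoding", "{{encoding}}", "-", "{{outfile}}"),
  timeout = Duration.seconds(20)
)

// Bind variables and resolve; every occurrence of {{encoding}} and {{outfile}} is replaced.
val cmd: SysCmd = wkhtml
  .withVars(Map("encoding" -> "UTF-8"))
  .withVar("outfile", "/tmp/out.pdf") // hypothetical path, for illustration only
  .resolved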

@ -38,6 +38,20 @@ trait SysExec[F[_]] {
def waitFor(timeout: Option[Duration] = None): F[Int] def waitFor(timeout: Option[Duration] = None): F[Int]
/** Uses `waitFor` and throws when return code is non-zero. Logs stderr and stdout while
* waiting.
*/
def runToSuccess(logger: Logger[F], timeout: Option[Duration] = None)(implicit
F: Async[F]
): F[Int]
/** Uses `waitFor` and throws when return code is non-zero. Logs stderr while waiting
* and collects stdout once finished successfully.
*/
def runToSuccessStdout(logger: Logger[F], timeout: Option[Duration] = None)(implicit
F: Async[F]
): F[String]
/** Sends a signal to the process to terminate it immediately */ /** Sends a signal to the process to terminate it immediately */
def cancel: F[Unit] def cancel: F[Unit]
@ -75,6 +89,12 @@ object SysExec {
proc <- startProcess(logger, cmd, workdir, stdin) proc <- startProcess(logger, cmd, workdir, stdin)
fibers <- Resource.eval(Ref.of[F, List[F[Unit]]](Nil)) fibers <- Resource.eval(Ref.of[F, List[F[Unit]]](Nil))
} yield new SysExec[F] { } yield new SysExec[F] {
private lazy val basicName: String =
cmd.program.lastIndexOf(java.io.File.separatorChar.toInt) match {
case n if n > 0 => cmd.program.drop(n + 1)
case _ => cmd.program.takeRight(16)
}
def stdout: Stream[F, Byte] = def stdout: Stream[F, Byte] =
fs2.io.readInputStream( fs2.io.readInputStream(
Sync[F].blocking(proc.getInputStream), Sync[F].blocking(proc.getInputStream),
@ -107,6 +127,39 @@ object SysExec {
) )
} }
def runToSuccess(logger: Logger[F], timeout: Option[Duration])(implicit
F: Async[F]
): F[Int] =
logOutputs(logger, basicName).use(_.waitFor(timeout).flatMap {
case rc if rc == 0 => Sync[F].pure(0)
case rc =>
Sync[F].raiseError(
new Exception(s"Command `${cmd.program}` returned non-zero exit code ${rc}")
)
})
def runToSuccessStdout(logger: Logger[F], timeout: Option[Duration])(implicit
F: Async[F]
): F[String] =
F.background(
stderrLines
.through(line => Stream.eval(logger.debug(s"[$basicName (err)]: $line")))
.compile
.drain
).use { f1 =>
waitFor(timeout)
.flatMap {
case rc if rc == 0 => stdout.through(fs2.text.utf8.decode).compile.string
case rc =>
Sync[F].raiseError[String](
new Exception(
s"Command `${cmd.program}` returned non-zero exit code ${rc}"
)
)
}
.flatTap(_ => f1)
}
def consumeOutputs(out: Pipe[F, String, Unit], err: Pipe[F, String, Unit])(implicit def consumeOutputs(out: Pipe[F, String, Unit], err: Pipe[F, String, Unit])(implicit
F: Async[F] F: Async[F]
): Resource[F, SysExec[F]] = ): Resource[F, SysExec[F]] =

View File
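
A minimal usage sketch for the two helpers added above (constructor arguments follow the SysExec call sites visible elsewhere in this diff: a SysCmd, a logger, an optional working directory and optional stdin, yielding a Resource; the input file name is made up):

import cats.effect.IO
import docspell.common.Duration
import docspell.common.exec.{Args, Env, SysCmd, SysExec}

val logger = docspell.logging.getLogger[IO]

// tesseract invocation as in the default OCR config; "page.tif" is a placeholder file.
val cmd = SysCmd(
  "tesseract",
  Args.of("page.tif", "stdout", "-l", "deu"),
  Env.empty,
  Duration.minutes(1)
)

// Logs stderr while waiting and returns the collected stdout when the process
// exits with 0; a non-zero exit code is raised as an error.
val text: IO[String] =
  SysExec(cmd, logger, None).use(_.runToSuccessStdout(logger))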

@ -6,6 +6,7 @@
package docspell.common.util package docspell.common.util
import cats.data.OptionT
import cats.effect._ import cats.effect._
import cats.syntax.all._ import cats.syntax.all._
import cats.{Applicative, Monad} import cats.{Applicative, Monad}
@ -26,10 +27,10 @@ object Directory {
(dir :: dirs.toList).traverse_(Files[F].createDirectories(_)) (dir :: dirs.toList).traverse_(Files[F].createDirectories(_))
def nonEmpty[F[_]: Files: Sync](dir: Path): F[Boolean] = def nonEmpty[F[_]: Files: Sync](dir: Path): F[Boolean] =
List( OptionT
Files[F].isDirectory(dir), .whenM(Files[F].isDirectory(dir))(Files[F].list(dir).take(1).compile.toList)
Files[F].list(dir).take(1).compile.last.map(_.isDefined) .map(_.nonEmpty)
).sequence.map(_.forall(identity)) .isDefined
def isEmpty[F[_]: Files: Sync](dir: Path): F[Boolean] = def isEmpty[F[_]: Files: Sync](dir: Path): F[Boolean] =
nonEmpty(dir).map(b => !b) nonEmpty(dir).map(b => !b)

View File

@ -0,0 +1,74 @@
/*
* Copyright 2020 Eike K. & Contributors
*
* SPDX-License-Identifier: AGPL-3.0-or-later
*/
package docspell.common.exec
import docspell.common.Duration
import docspell.common.Ident
import docspell.common.exec.Args
import docspell.common.exec.Env
import docspell.common.exec.ExternalCommand._
import docspell.common.exec.SysCmd
import munit.FunSuite
class ExternalCommandTest extends FunSuite {
test("resolve") {
val cmd = ExternalCommand(
program = "tesseract",
args = "{{infile}}" :: "{{lang-spec}}" :: "out" :: "pdf" :: "txt" :: Nil,
timeout = Duration.minutes(5),
env = Map.empty,
argMappings = Map(
Ident.unsafe("lang-spec") -> ArgMapping(
value = "{{lang}}",
mappings = List(
ArgMatch(
matches = "jpn_vert",
args = List("-l", "jpn_vert", "-c", "preserve_interword_spaces=1")
),
ArgMatch(
matches = ".*",
args = List("-l", "{{lang}}")
)
)
)
)
)
val varsDe = Map("lang" -> "de", "encoding" -> "UTF_8", "infile" -> "text.jpg")
assertEquals(
cmd.resolve(varsDe),
SysCmd(
"tesseract",
Args.of("text.jpg", "-l", "de", "out", "pdf", "txt"),
Env.empty,
Duration.minutes(5)
)
)
val varsJpnVert = varsDe.updated("lang", "jpn_vert")
assertEquals(
cmd.resolve(varsJpnVert),
SysCmd(
"tesseract",
Args.of(
"text.jpg",
"-l",
"jpn_vert",
"-c",
"preserve_interword_spaces=1",
"out",
"pdf",
"txt"
),
Env.empty,
Duration.minutes(5)
)
)
}
}

View File

@ -16,7 +16,7 @@ import munit.CatsEffectSuite
class DirectoryTest extends CatsEffectSuite with TestLoggingConfig { class DirectoryTest extends CatsEffectSuite with TestLoggingConfig {
val logger = docspell.logging.getLogger[IO] val logger = docspell.logging.getLogger[IO]
val tempDir = ResourceFixture( val tempDir = ResourceFunFixture(
Files[IO].tempDirectory(Path("target").some, "directory-test-", None) Files[IO].tempDirectory(Path("target").some, "directory-test-", None)
) )

View File

@ -11,7 +11,8 @@ import cats.implicits._
import fs2.io.file.{Files, Path} import fs2.io.file.{Files, Path}
import fs2.{Pipe, Stream} import fs2.{Pipe, Stream}
import docspell.common._ import docspell.common.exec.ExternalCommand
import docspell.common.exec.SysExec
import docspell.common.util.File import docspell.common.util.File
import docspell.convert.ConversionResult import docspell.convert.ConversionResult
import docspell.convert.ConversionResult.{Handler, successPdf, successPdfTxt} import docspell.convert.ConversionResult.{Handler, successPdf, successPdfTxt}
@ -21,11 +22,11 @@ private[extern] object ExternConv {
def toPDF[F[_]: Async: Files, A]( def toPDF[F[_]: Async: Files, A](
name: String, name: String,
cmdCfg: SystemCommand.Config, cmdCfg: ExternalCommand.WithVars,
wd: Path, wd: Path,
useStdin: Boolean, useStdin: Boolean,
logger: Logger[F], logger: Logger[F],
reader: (Path, SystemCommand.Result) => F[ConversionResult[F]] reader: (Path, Int) => F[ConversionResult[F]]
)(in: Stream[F, Byte], handler: Handler[F, A]): F[A] = )(in: Stream[F, Byte], handler: Handler[F, A]): F[A] =
Stream Stream
.resource(File.withTempDir[F](wd, s"docspell-$name")) .resource(File.withTempDir[F](wd, s"docspell-$name"))
@ -33,32 +34,21 @@ private[extern] object ExternConv {
val inFile = dir.resolve("infile").absolute.normalize val inFile = dir.resolve("infile").absolute.normalize
val out = dir.resolve("out.pdf").absolute.normalize val out = dir.resolve("out.pdf").absolute.normalize
val sysCfg = val sysCfg =
cmdCfg.replace( cmdCfg
Map( .withVar("outfile", out.toString)
"{{outfile}}" -> out.toString .withVarOption("infile", Option.when(!useStdin)(inFile.toString))
) ++ .resolved
(if (!useStdin) Map("{{infile}}" -> inFile.toString)
else Map.empty)
)
val createInput: Pipe[F, Byte, Unit] = val createInput: Pipe[F, Byte, Unit] =
if (useStdin) _ => Stream.emit(()) if (useStdin) _ => Stream.emit(())
else storeDataToFile(name, logger, inFile) else storeDataToFile(name, logger, inFile)
in.through(createInput).flatMap { _ => in.through(createInput).evalMap { _ =>
SystemCommand SysExec(sysCfg, logger, Some(dir), Option.when(useStdin)(in))
.exec[F]( .flatMap(_.logOutputs(logger, name))
sysCfg, .use { proc =>
logger, proc.waitFor().flatMap(rc => reader(out, rc).flatMap(handler.run))
Some(dir), }
if (useStdin) in
else Stream.empty
)
.evalMap(result =>
logResult(name, result, logger)
.flatMap(_ => reader(out, result))
.flatMap(handler.run)
)
} }
} }
.compile .compile
@ -74,9 +64,9 @@ private[extern] object ExternConv {
def readResult[F[_]: Async: Files]( def readResult[F[_]: Async: Files](
chunkSize: Int, chunkSize: Int,
logger: Logger[F] logger: Logger[F]
)(out: Path, result: SystemCommand.Result): F[ConversionResult[F]] = )(out: Path, result: Int): F[ConversionResult[F]] =
File.existsNonEmpty[F](out).flatMap { File.existsNonEmpty[F](out).flatMap {
case true if result.rc == 0 => case true if result == 0 =>
val outTxt = out.resolveSibling(out.fileName.toString + ".txt") val outTxt = out.resolveSibling(out.fileName.toString + ".txt")
File.existsNonEmpty[F](outTxt).flatMap { File.existsNonEmpty[F](outTxt).flatMap {
case true => case true =>
@ -88,13 +78,13 @@ private[extern] object ExternConv {
successPdf(File.readAll(out, chunkSize)).pure[F] successPdf(File.readAll(out, chunkSize)).pure[F]
} }
case true => case true =>
logger.warn(s"Command not successful (rc=${result.rc}), but file exists.") *> logger.warn(s"Command not successful (rc=${result}), but file exists.") *>
successPdf(File.readAll(out, chunkSize)).pure[F] successPdf(File.readAll(out, chunkSize)).pure[F]
case false => case false =>
ConversionResult ConversionResult
.failure[F]( .failure[F](
new Exception(s"Command result=${result.rc}. No output file found.") new Exception(s"Command result=${result}. No output file found.")
) )
.pure[F] .pure[F]
} }
@ -103,25 +93,25 @@ private[extern] object ExternConv {
outPrefix: String, outPrefix: String,
chunkSize: Int, chunkSize: Int,
logger: Logger[F] logger: Logger[F]
)(out: Path, result: SystemCommand.Result): F[ConversionResult[F]] = { )(out: Path, result: Int): F[ConversionResult[F]] = {
val outPdf = out.resolveSibling(s"$outPrefix.pdf") val outPdf = out.resolveSibling(s"$outPrefix.pdf")
File.existsNonEmpty[F](outPdf).flatMap { File.existsNonEmpty[F](outPdf).flatMap {
case true => case true =>
val outTxt = out.resolveSibling(s"$outPrefix.txt") val outTxt = out.resolveSibling(s"$outPrefix.txt")
File.exists(outTxt).flatMap { txtExists => File.exists(outTxt).flatMap { txtExists =>
val pdfData = File.readAll(out, chunkSize) val pdfData = File.readAll(out, chunkSize)
if (result.rc == 0) if (result == 0)
if (txtExists) successPdfTxt(pdfData, File.readText(outTxt)).pure[F] if (txtExists) successPdfTxt(pdfData, File.readText(outTxt)).pure[F]
else successPdf(pdfData).pure[F] else successPdf(pdfData).pure[F]
else else
logger.warn(s"Command not successful (rc=${result.rc}), but file exists.") *> logger.warn(s"Command not successful (rc=${result}), but file exists.") *>
successPdf(pdfData).pure[F] successPdf(pdfData).pure[F]
} }
case false => case false =>
ConversionResult ConversionResult
.failure[F]( .failure[F](
new Exception(s"Command result=${result.rc}. No output file found.") new Exception(s"Command result=${result}. No output file found.")
) )
.pure[F] .pure[F]
} }
@ -138,14 +128,6 @@ private[extern] object ExternConv {
.drain ++ .drain ++
Stream.eval(storeFile(in, inFile)) Stream.eval(storeFile(in, inFile))
private def logResult[F[_]: Sync](
name: String,
result: SystemCommand.Result,
logger: Logger[F]
): F[Unit] =
logger.debug(s"$name stdout: ${result.stdout}") *>
logger.debug(s"$name stderr: ${result.stderr}")
private def storeFile[F[_]: Async: Files]( private def storeFile[F[_]: Async: Files](
in: Stream[F, Byte], in: Stream[F, Byte],
target: Path target: Path

View File

@ -24,14 +24,16 @@ object OcrMyPdf {
logger: Logger[F] logger: Logger[F]
)(in: Stream[F, Byte], handler: Handler[F, A]): F[A] = )(in: Stream[F, Byte], handler: Handler[F, A]): F[A] =
if (cfg.enabled) { if (cfg.enabled) {
val reader: (Path, SystemCommand.Result) => F[ConversionResult[F]] = val reader: (Path, Int) => F[ConversionResult[F]] =
ExternConv.readResult[F](chunkSize, logger) ExternConv.readResult[F](chunkSize, logger)
val cmd = cfg.command.withVars(Map("lang" -> lang.iso3))
ExternConv.toPDF[F, A]( ExternConv.toPDF[F, A](
"ocrmypdf", "ocrmypdf",
cfg.command.replace(Map("{{lang}}" -> lang.iso3)), cmd,
cfg.workingDir, cfg.workingDir,
false, useStdin = false,
logger, logger,
reader reader
)(in, handler) )(in, handler)

View File

@ -8,10 +8,10 @@ package docspell.convert.extern
import fs2.io.file.Path import fs2.io.file.Path
import docspell.common.SystemCommand import docspell.common.exec.ExternalCommand
case class OcrMyPdfConfig( case class OcrMyPdfConfig(
enabled: Boolean, enabled: Boolean,
command: SystemCommand.Config, command: ExternalCommand,
workingDir: Path workingDir: Path
) )

View File

@ -24,17 +24,18 @@ object Tesseract {
logger: Logger[F] logger: Logger[F]
)(in: Stream[F, Byte], handler: Handler[F, A]): F[A] = { )(in: Stream[F, Byte], handler: Handler[F, A]): F[A] = {
val outBase = cfg.command.args.tail.headOption.getOrElse("out") val outBase = cfg.command.args.tail.headOption.getOrElse("out")
val reader: (Path, SystemCommand.Result) => F[ConversionResult[F]] = val reader: (Path, Int) => F[ConversionResult[F]] =
ExternConv.readResultTesseract[F](outBase, chunkSize, logger) ExternConv.readResultTesseract[F](outBase, chunkSize, logger)
val cmd = cfg.command.withVars(Map("lang" -> lang.iso3))
ExternConv.toPDF[F, A]( ExternConv.toPDF[F, A](
"tesseract", "tesseract",
cfg.command.replace(Map("{{lang}}" -> lang.iso3)), cmd,
cfg.workingDir, cfg.workingDir,
false, useStdin = false,
logger, logger,
reader reader
)(in, handler) )(in, handler)
} }
} }

View File

@ -8,6 +8,6 @@ package docspell.convert.extern
import fs2.io.file.Path import fs2.io.file.Path
import docspell.common.SystemCommand import docspell.common.exec.ExternalCommand
case class TesseractConfig(command: SystemCommand.Config, workingDir: Path) case class TesseractConfig(command: ExternalCommand, workingDir: Path)

View File

@ -10,7 +10,6 @@ import cats.effect._
import fs2.Stream import fs2.Stream
import fs2.io.file.{Files, Path} import fs2.io.file.{Files, Path}
import docspell.common._
import docspell.convert.ConversionResult import docspell.convert.ConversionResult
import docspell.convert.ConversionResult.Handler import docspell.convert.ConversionResult.Handler
import docspell.logging.Logger import docspell.logging.Logger
@ -22,14 +21,15 @@ object Unoconv {
chunkSize: Int, chunkSize: Int,
logger: Logger[F] logger: Logger[F]
)(in: Stream[F, Byte], handler: Handler[F, A]): F[A] = { )(in: Stream[F, Byte], handler: Handler[F, A]): F[A] = {
val reader: (Path, SystemCommand.Result) => F[ConversionResult[F]] = val reader: (Path, Int) => F[ConversionResult[F]] =
ExternConv.readResult[F](chunkSize, logger) ExternConv.readResult[F](chunkSize, logger)
val cmd = cfg.command.withVars(Map.empty)
ExternConv.toPDF[F, A]( ExternConv.toPDF[F, A](
"unoconv", "unoconv",
cfg.command, cmd,
cfg.workingDir, cfg.workingDir,
false, useStdin = false,
logger, logger,
reader reader
)( )(
@ -37,5 +37,4 @@ object Unoconv {
handler handler
) )
} }
} }

View File

@ -8,6 +8,6 @@ package docspell.convert.extern
import fs2.io.file.Path import fs2.io.file.Path
import docspell.common.SystemCommand import docspell.common.exec.ExternalCommand
case class UnoconvConfig(command: SystemCommand.Config, workingDir: Path) case class UnoconvConfig(command: ExternalCommand, workingDir: Path)

View File

@ -27,10 +27,10 @@ object Weasyprint {
sanitizeHtml: SanitizeHtml, sanitizeHtml: SanitizeHtml,
logger: Logger[F] logger: Logger[F]
)(in: Stream[F, Byte], handler: Handler[F, A]): F[A] = { )(in: Stream[F, Byte], handler: Handler[F, A]): F[A] = {
val reader: (Path, SystemCommand.Result) => F[ConversionResult[F]] = val reader: (Path, Int) => F[ConversionResult[F]] =
ExternConv.readResult[F](chunkSize, logger) ExternConv.readResult[F](chunkSize, logger)
val cmdCfg = cfg.command.replace(Map("{{encoding}}" -> charset.name())) val cmdCfg = cfg.command.withVars(Map("encoding" -> charset.name()))
// html sanitize should (among other) remove links to invalid // html sanitize should (among other) remove links to invalid
// protocols like cid: which is not supported by further // protocols like cid: which is not supported by further
@ -51,5 +51,4 @@ object Weasyprint {
handler handler
) )
} }
} }

View File

@ -8,6 +8,6 @@ package docspell.convert.extern
import fs2.io.file.Path import fs2.io.file.Path
import docspell.common.SystemCommand import docspell.common.exec.ExternalCommand
case class WeasyprintConfig(command: SystemCommand.Config, workingDir: Path) case class WeasyprintConfig(command: ExternalCommand, workingDir: Path)

View File

@ -27,10 +27,10 @@ object WkHtmlPdf {
sanitizeHtml: SanitizeHtml, sanitizeHtml: SanitizeHtml,
logger: Logger[F] logger: Logger[F]
)(in: Stream[F, Byte], handler: Handler[F, A]): F[A] = { )(in: Stream[F, Byte], handler: Handler[F, A]): F[A] = {
val reader: (Path, SystemCommand.Result) => F[ConversionResult[F]] = val reader: (Path, Int) => F[ConversionResult[F]] =
ExternConv.readResult[F](chunkSize, logger) ExternConv.readResult[F](chunkSize, logger)
val cmdCfg = cfg.command.replace(Map("{{encoding}}" -> charset.name())) val cmdCfg = cfg.command.withVars(Map("encoding" -> charset.name()))
// html sanitize should (among other) remove links to invalid // html sanitize should (among other) remove links to invalid
// protocols like cid: which is not supported by further // protocols like cid: which is not supported by further
@ -58,5 +58,4 @@ object WkHtmlPdf {
handler handler
) )
} }
} }

View File

@ -8,6 +8,6 @@ package docspell.convert.extern
import fs2.io.file.Path import fs2.io.file.Path
import docspell.common.SystemCommand import docspell.common.exec.ExternalCommand
case class WkHtmlPdfConfig(command: SystemCommand.Config, workingDir: Path) case class WkHtmlPdfConfig(command: ExternalCommand, workingDir: Path)

View File

@ -15,6 +15,7 @@ import cats.implicits._
import fs2.Stream import fs2.Stream
import docspell.common._ import docspell.common._
import docspell.common.exec._
import docspell.common.util.File import docspell.common.util.File
import docspell.convert.ConversionResult.Handler import docspell.convert.ConversionResult.Handler
import docspell.convert.ConvertConfig.HtmlConverter import docspell.convert.ConvertConfig.HtmlConverter
@ -36,7 +37,7 @@ class ConversionTest extends FunSuite with FileChecks with TestLoggingConfig {
3000 * 3000, 3000 * 3000,
MarkdownConfig("body { padding: 2em 5em; }"), MarkdownConfig("body { padding: 2em 5em; }"),
WkHtmlPdfConfig( WkHtmlPdfConfig(
SystemCommand.Config( ExternalCommand(
"wkhtmltopdf", "wkhtmltopdf",
Seq("-s", "A4", "--encoding", "UTF-8", "-", "{{outfile}}"), Seq("-s", "A4", "--encoding", "UTF-8", "-", "{{outfile}}"),
Duration.seconds(20) Duration.seconds(20)
@ -44,7 +45,7 @@ class ConversionTest extends FunSuite with FileChecks with TestLoggingConfig {
target target
), ),
WeasyprintConfig( WeasyprintConfig(
SystemCommand.Config( ExternalCommand(
"weasyprint", "weasyprint",
Seq("--encoding", "UTF-8", "-", "{{outfile}}"), Seq("--encoding", "UTF-8", "-", "{{outfile}}"),
Duration.seconds(20) Duration.seconds(20)
@ -53,7 +54,7 @@ class ConversionTest extends FunSuite with FileChecks with TestLoggingConfig {
), ),
HtmlConverter.Wkhtmltopdf, HtmlConverter.Wkhtmltopdf,
TesseractConfig( TesseractConfig(
SystemCommand.Config( ExternalCommand(
"tesseract", "tesseract",
Seq("{{infile}}", "out", "-l", "deu", "pdf", "txt"), Seq("{{infile}}", "out", "-l", "deu", "pdf", "txt"),
Duration.seconds(20) Duration.seconds(20)
@ -61,7 +62,7 @@ class ConversionTest extends FunSuite with FileChecks with TestLoggingConfig {
target target
), ),
UnoconvConfig( UnoconvConfig(
SystemCommand.Config( ExternalCommand(
"unoconv", "unoconv",
Seq("-f", "pdf", "-o", "{{outfile}}", "{{infile}}"), Seq("-f", "pdf", "-o", "{{outfile}}", "{{infile}}"),
Duration.seconds(20) Duration.seconds(20)
@ -69,8 +70,8 @@ class ConversionTest extends FunSuite with FileChecks with TestLoggingConfig {
target target
), ),
OcrMyPdfConfig( OcrMyPdfConfig(
true, enabled = true,
SystemCommand.Config( ExternalCommand(
"ocrmypdf", "ocrmypdf",
Seq( Seq(
"-l", "-l",
@ -86,7 +87,7 @@ class ConversionTest extends FunSuite with FileChecks with TestLoggingConfig {
), ),
target target
), ),
ConvertConfig.DecryptPdf(true, Nil) ConvertConfig.DecryptPdf(enabled = true, Nil)
) )
val conversion = val conversion =

View File

@ -14,6 +14,7 @@ import cats.effect.unsafe.implicits.global
import fs2.io.file.Path import fs2.io.file.Path
import docspell.common._ import docspell.common._
import docspell.common.exec._
import docspell.common.util.File import docspell.common.util.File
import docspell.convert._ import docspell.convert._
import docspell.files.ExampleFiles import docspell.files.ExampleFiles
@ -27,7 +28,7 @@ class ExternConvTest extends FunSuite with FileChecks with TestLoggingConfig {
val target = File.path(Paths.get("target")) val target = File.path(Paths.get("target"))
test("convert html to pdf") { test("convert html to pdf") {
val cfg = SystemCommand.Config( val cfg = ExternalCommand(
"wkhtmltopdf", "wkhtmltopdf",
Seq("-s", "A4", "--encoding", "UTF-8", "-", "{{outfile}}"), Seq("-s", "A4", "--encoding", "UTF-8", "-", "{{outfile}}"),
Duration.seconds(20) Duration.seconds(20)
@ -53,7 +54,7 @@ class ExternConvTest extends FunSuite with FileChecks with TestLoggingConfig {
} }
test("convert office to pdf") { test("convert office to pdf") {
val cfg = SystemCommand.Config( val cfg = ExternalCommand(
"unoconv", "unoconv",
Seq("-f", "pdf", "-o", "{{outfile}}", "{{infile}}"), Seq("-f", "pdf", "-o", "{{outfile}}", "{{infile}}"),
Duration.seconds(20) Duration.seconds(20)
@ -80,7 +81,7 @@ class ExternConvTest extends FunSuite with FileChecks with TestLoggingConfig {
} }
test("convert image to pdf") { test("convert image to pdf") {
val cfg = SystemCommand.Config( val cfg = ExternalCommand(
"tesseract", "tesseract",
Seq("{{infile}}", "out", "-l", "deu", "pdf", "txt"), Seq("{{infile}}", "out", "-l", "deu", "pdf", "txt"),
Duration.seconds(20) Duration.seconds(20)
@ -105,5 +106,4 @@ class ExternConvTest extends FunSuite with FileChecks with TestLoggingConfig {
) )
.unsafeRunSync() .unsafeRunSync()
} }
} }

View File

@ -10,7 +10,8 @@ import cats.effect._
import fs2.Stream import fs2.Stream
import fs2.io.file.{Files, Path} import fs2.io.file.{Files, Path}
import docspell.common._ import docspell.common.exec.ExternalCommand
import docspell.common.exec.SysExec
import docspell.common.util.File import docspell.common.util.File
import docspell.logging.Logger import docspell.logging.Logger
@ -77,14 +78,17 @@ object Ocr {
else cfg.ghostscript.command.args else cfg.ghostscript.command.args
val cmd = cfg.ghostscript.command val cmd = cfg.ghostscript.command
.copy(args = xargs) .copy(args = xargs)
.replace( .withVars(
Map( Map(
"{{infile}}" -> "-", "infile" -> "-",
"{{outfile}}" -> "%d.tif" "outfile" -> "%d.tif"
) )
) )
SystemCommand .resolved
.execSuccess(cmd, logger, wd = Some(wd), stdin = pdf)
Stream
.resource(SysExec(cmd, logger, Some(wd), Some(pdf)))
.evalMap(_.runToSuccess(logger))
.flatMap(_ => File.listFiles(pathEndsWith(".tif"), wd)) .flatMap(_ => File.listFiles(pathEndsWith(".tif"), wd))
} }
@ -93,18 +97,22 @@ object Ocr {
*/ */
private[extract] def runGhostscriptFile[F[_]: Async: Files]( private[extract] def runGhostscriptFile[F[_]: Async: Files](
pdf: Path, pdf: Path,
ghostscript: SystemCommand.Config, ghostscript: ExternalCommand,
wd: Path, wd: Path,
logger: Logger[F] logger: Logger[F]
): Stream[F, Path] = { ): Stream[F, Path] = {
val cmd = ghostscript.replace( val cmd = ghostscript
Map( .withVars(
"{{infile}}" -> pdf.absolute.toString, Map(
"{{outfile}}" -> "%d.tif" "infile" -> pdf.absolute.toString,
"outfile" -> "%d.tif"
)
) )
) .resolved
SystemCommand
.execSuccess[F](cmd, logger, wd = Some(wd)) Stream
.resource(SysExec(cmd, logger, Some(wd)))
.evalMap(_.runToSuccess(logger))
.flatMap(_ => File.listFiles(pathEndsWith(".tif"), wd)) .flatMap(_ => File.listFiles(pathEndsWith(".tif"), wd))
} }
@ -116,19 +124,23 @@ object Ocr {
*/ */
private[extract] def runUnpaperFile[F[_]: Async]( private[extract] def runUnpaperFile[F[_]: Async](
img: Path, img: Path,
unpaper: SystemCommand.Config, unpaper: ExternalCommand,
wd: Option[Path], wd: Option[Path],
logger: Logger[F] logger: Logger[F]
): Stream[F, Path] = { ): Stream[F, Path] = {
val targetFile = img.resolveSibling("u-" + img.fileName.toString).absolute val targetFile = img.resolveSibling("u-" + img.fileName.toString).absolute
val cmd = unpaper.replace( val cmd = unpaper
Map( .withVars(
"{{infile}}" -> img.absolute.toString, Map(
"{{outfile}}" -> targetFile.toString "infile" -> img.absolute.toString,
"outfile" -> targetFile.toString
)
) )
) .resolved
SystemCommand
.execSuccess[F](cmd, logger, wd = wd) Stream
.resource(SysExec(cmd, logger, wd))
.evalMap(_.runToSuccess(logger))
.map(_ => targetFile) .map(_ => targetFile)
.handleErrorWith { th => .handleErrorWith { th =>
logger logger
@ -150,12 +162,14 @@ object Ocr {
// so use the parent as working dir // so use the parent as working dir
runUnpaperFile(img, config.unpaper.command, img.parent, logger).flatMap { uimg => runUnpaperFile(img, config.unpaper.command, img.parent, logger).flatMap { uimg =>
val cmd = config.tesseract.command val cmd = config.tesseract.command
.replace( .withVars(
Map("{{file}}" -> uimg.fileName.toString, "{{lang}}" -> fixLanguage(lang)) Map("file" -> uimg.fileName.toString, "lang" -> fixLanguage(lang))
) )
SystemCommand .resolved
.execSuccess[F](cmd, logger, wd = uimg.parent)
.map(_.stdout) Stream
.resource(SysExec(cmd, logger, uimg.parent))
.evalMap(_.runToSuccessStdout(logger))
} }
/** Run tesseract on the given image file and return the extracted text. */ /** Run tesseract on the given image file and return the extracted text. */
@ -166,8 +180,12 @@ object Ocr {
config: OcrConfig config: OcrConfig
): Stream[F, String] = { ): Stream[F, String] = {
val cmd = config.tesseract.command val cmd = config.tesseract.command
.replace(Map("{{file}}" -> "stdin", "{{lang}}" -> fixLanguage(lang))) .withVars(Map("file" -> "stdin", "lang" -> fixLanguage(lang)))
SystemCommand.execSuccess(cmd, logger, stdin = img).map(_.stdout) .resolved
Stream
.resource(SysExec(cmd, logger, None, Some(img)))
.evalMap(_.runToSuccessStdout(logger))
} }
private def fixLanguage(lang: String): String = private def fixLanguage(lang: String): String =

View File

@ -6,12 +6,9 @@
package docspell.extract.ocr package docspell.extract.ocr
import java.nio.file.Paths
import fs2.io.file.Path import fs2.io.file.Path
import docspell.common._ import docspell.common.exec.ExternalCommand
import docspell.common.util.File
case class OcrConfig( case class OcrConfig(
maxImageSize: Int, maxImageSize: Int,
@ -25,43 +22,10 @@ object OcrConfig {
case class PageRange(begin: Int) case class PageRange(begin: Int)
case class Ghostscript(command: SystemCommand.Config, workingDir: Path) case class Ghostscript(command: ExternalCommand, workingDir: Path)
case class Tesseract(command: SystemCommand.Config) case class Tesseract(command: ExternalCommand)
case class Unpaper(command: SystemCommand.Config) case class Unpaper(command: ExternalCommand)
val default = OcrConfig(
maxImageSize = 3000 * 3000,
pageRange = PageRange(10),
ghostscript = Ghostscript(
SystemCommand.Config(
"gs",
Seq(
"-dNOPAUSE",
"-dBATCH",
"-dSAFER",
"-sDEVICE=tiffscaled8",
"-sOutputFile={{outfile}}",
"{{infile}}"
),
Duration.seconds(30)
),
File.path(
Paths.get(System.getProperty("java.io.tmpdir")).resolve("docspell-extraction")
)
),
unpaper = Unpaper(
SystemCommand
.Config("unpaper", Seq("{{infile}}", "{{outfile}}"), Duration.seconds(30))
),
tesseract = Tesseract(
SystemCommand
.Config(
"tesseract",
Seq("{{file}}", "stdout", "-l", "{{lang}}"),
Duration.minutes(1)
)
)
)
} }

View File

@ -6,9 +6,14 @@
package docspell.extract.ocr package docspell.extract.ocr
import java.nio.file.Paths
import cats.effect.IO import cats.effect.IO
import cats.effect.unsafe.implicits.global import cats.effect.unsafe.implicits.global
import docspell.common.Duration
import docspell.common.exec.ExternalCommand
import docspell.common.util.File
import docspell.files.TestFiles import docspell.files.TestFiles
import docspell.logging.TestLoggingConfig import docspell.logging.TestLoggingConfig
@ -21,7 +26,7 @@ class TextExtractionSuite extends FunSuite with TestLoggingConfig {
test("extract english pdf".ignore) { test("extract english pdf".ignore) {
val text = TextExtract val text = TextExtract
.extract[IO](letterSourceEN, logger, "eng", OcrConfig.default) .extract[IO](letterSourceEN, logger, "eng", TextExtractionSuite.defaultConfig)
.compile .compile
.lastOrError .lastOrError
.unsafeRunSync() .unsafeRunSync()
@ -31,7 +36,7 @@ class TextExtractionSuite extends FunSuite with TestLoggingConfig {
test("extract german pdf".ignore) { test("extract german pdf".ignore) {
val expect = TestFiles.letterDEText val expect = TestFiles.letterDEText
val extract = TextExtract val extract = TextExtract
.extract[IO](letterSourceDE, logger, "deu", OcrConfig.default) .extract[IO](letterSourceDE, logger, "deu", TextExtractionSuite.defaultConfig)
.compile .compile
.lastOrError .lastOrError
.unsafeRunSync() .unsafeRunSync()
@ -39,3 +44,37 @@ class TextExtractionSuite extends FunSuite with TestLoggingConfig {
assertEquals(extract.value, expect) assertEquals(extract.value, expect)
} }
} }
object TextExtractionSuite {
val defaultConfig = OcrConfig(
maxImageSize = 3000 * 3000,
pageRange = OcrConfig.PageRange(10),
ghostscript = OcrConfig.Ghostscript(
ExternalCommand(
"gs",
Seq(
"-dNOPAUSE",
"-dBATCH",
"-dSAFER",
"-sDEVICE=tiffscaled8",
"-sOutputFile={{outfile}}",
"{{infile}}"
),
Duration.seconds(30)
),
File.path(
Paths.get(System.getProperty("java.io.tmpdir")).resolve("docspell-extraction")
)
),
unpaper = OcrConfig.Unpaper(
ExternalCommand("unpaper", Seq("{{infile}}", "{{outfile}}"), Duration.seconds(30))
),
tesseract = OcrConfig.Tesseract(
ExternalCommand(
"tesseract",
Seq("{{file}}", "stdout", "-l", "{{lang}}"),
Duration.minutes(1)
)
)
)
}

View File

@ -19,7 +19,7 @@ import munit._
class ZipTest extends CatsEffectSuite with TestLoggingConfig { class ZipTest extends CatsEffectSuite with TestLoggingConfig {
val logger = docspell.logging.getLogger[IO] val logger = docspell.logging.getLogger[IO]
val tempDir = ResourceFixture( val tempDir = ResourceFunFixture(
Files[IO].tempDirectory(Path("target").some, "zip-test-", None) Files[IO].tempDirectory(Path("target").some, "zip-test-", None)
) )

View File

@ -201,6 +201,7 @@ object FtsRepository extends DoobieMeta {
case Language.Czech => "simple" case Language.Czech => "simple"
case Language.Latvian => "simple" case Language.Latvian => "simple"
case Language.Japanese => "simple" case Language.Japanese => "simple"
case Language.JpnVert => "simple"
case Language.Hebrew => "simple" case Language.Hebrew => "simple"
case Language.Lithuanian => "simple" case Language.Lithuanian => "simple"
case Language.Polish => "simple" case Language.Polish => "simple"

View File

@ -45,7 +45,7 @@ object SolrMigration {
description, description,
FtsMigration.Result.reIndexAll.pure[F] FtsMigration.Result.reIndexAll.pure[F]
), ),
true dataChangeOnly = true
) )
def indexAll[F[_]: Applicative]( def indexAll[F[_]: Applicative](
@ -59,7 +59,7 @@ object SolrMigration {
description, description,
FtsMigration.Result.indexAll.pure[F] FtsMigration.Result.indexAll.pure[F]
), ),
true dataChangeOnly = true
) )
def apply[F[_]: Functor]( def apply[F[_]: Functor](
@ -74,6 +74,6 @@ object SolrMigration {
description, description,
task.map(_ => FtsMigration.Result.workDone) task.map(_ => FtsMigration.Result.workDone)
), ),
false dataChangeOnly = false
) )
} }

View File

@@ -299,14 +299,22 @@ object SolrSetup {
       Map("add-field" -> body.asJson).asJson
     def string(field: Field): AddField =
-      AddField(field, "string", true, true, false)
+      AddField(field, "string", stored = true, indexed = true, multiValued = false)
     def textGeneral(field: Field): AddField =
-      AddField(field, "text_general", true, true, false)
+      AddField(field, "text_general", stored = true, indexed = true, multiValued = false)
     def textLang(field: Field, lang: Language): AddField =
-      if (lang == Language.Czech) AddField(field, s"text_cz", true, true, false)
-      else AddField(field, s"text_${lang.iso2}", true, true, false)
+      if (lang == Language.Czech)
+        AddField(field, s"text_cz", stored = true, indexed = true, multiValued = false)
+      else
+        AddField(
+          field,
+          s"text_${lang.iso2}",
+          stored = true,
+          indexed = true,
+          multiValued = false
+        )
   }
   case class DeleteField(name: Field)


@@ -595,11 +595,30 @@ Docpell Update Check
       tesseract = {
         command = {
           program = "tesseract"
+          # Custom Language Mappings Below
+          # Japanese Vertical Mapping
+          arg-mappings = {
+            "tesseract_lang" = {
+              value = "{{lang}}"
+              mappings = [
+                {
+                  matches = "jpn_vert"
+                  args = [ "-l", "jpn_vert", "-c", "preserve_interword_spaces=1" ]
+                },
+                # Start Other Custom Language Mappings Here
+                # Default Mapping Below
+                {
+                  matches = ".*"
+                  args = [ "-l", "{{lang}}" ]
+                }
+              ]
+            }
+          }
+          # Default arguments for all processing go below.
           args = [
             "{{infile}}",
             "out",
-            "-l",
-            "{{lang}}",
+            "{{tesseract_lang}}",
             "pdf",
             "txt"
           ]
@@ -651,8 +670,34 @@ Docpell Update Check
       enabled = true
       command = {
         program = "ocrmypdf"
+        # Custom argument mappings for this program.
+        arg-mappings = {
+          "ocr_lang" = {
+            value = "{{lang}}"
+            # Custom Language Mappings Below
+            # Japanese Vertical Mapping
+            mappings = [
+              {
+                matches = "jpn_vert"
+                args = [ "-l", "jpn_vert", "--pdf-renderer", "sandwich", "--tesseract-pagesegmode", "5", "--output-type", "pdf" ]
+              },
+              # Japanese Mapping for OCR Optimization
+              {
+                matches = "jpn"
+                args = [ "-l", "jpn", "--output-type", "pdf" ]
+              },
+              # Start Other Custom Language Mappings Here
+              # Default Mapping Below
+              {
+                matches = ".*"
+                args = [ "-l", "{{lang}}" ]
+              }
+            ]
+          }
+        }
+        # Default arguments for all processing go below.
         args = [
-          "-l", "{{lang}}",
+          "{{ocr_lang}}",
          "--skip-text",
          "--deskew",
          "-j", "1",

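The arg-mappings blocks added above let the OCR commands choose their arguments per document language: the configured value (here "{{lang}}") is matched against each entry's matches pattern, and the args of the first matching entry replace the {{tesseract_lang}} / {{ocr_lang}} placeholder in the default argument list, so jpn_vert gets its special flags while every other language falls through to the ".*" entry. A minimal Scala sketch of that selection idea (illustrative only; ArgMapping and resolveArgs are made-up names, not the actual Docspell implementation):

// Illustrative sketch of first-match argument selection, not Docspell code.
final case class ArgMapping(matches: String, args: List[String])

def resolveArgs(lang: String, mappings: List[ArgMapping]): List[String] =
  mappings
    .find(m => lang.matches(m.matches))           // first regex that matches wins
    .map(_.args.map(_.replace("{{lang}}", lang))) // expand the {{lang}} placeholder
    .getOrElse(List("-l", lang))

// resolveArgs("jpn_vert", mappings) would yield the vertical-Japanese flags,
// while resolveArgs("deu", mappings) falls through to the catch-all ".*" entry.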

@@ -30,7 +30,7 @@ object EmptyTrashTask {
       UserTask(
         args.periodicTaskId,
         EmptyTrashArgs.taskName,
-        true,
+        enabled = true,
         ce,
         None,
         args


@@ -29,23 +29,23 @@ object FileCopyTask {
   case class CopyResult(success: Boolean, message: String, counter: List[Counter])
   object CopyResult {
     def noSourceImpl: CopyResult =
-      CopyResult(false, "No source BinaryStore implementation found!", Nil)
+      CopyResult(success = false, "No source BinaryStore implementation found!", Nil)
     def noTargetImpl: CopyResult =
-      CopyResult(false, "No target BinaryStore implementation found!", Nil)
+      CopyResult(success = false, "No target BinaryStore implementation found!", Nil)
     def noSourceStore(id: Ident): CopyResult =
       CopyResult(
-        false,
+        success = false,
         s"No source file repo found with id: ${id.id}. Make sure it is present in the config.",
         Nil
       )
     def noTargetStore: CopyResult =
-      CopyResult(false, "No target file repositories defined", Nil)
+      CopyResult(success = false, "No target file repositories defined", Nil)
     def success(counter: NonEmptyList[Counter]): CopyResult =
-      CopyResult(true, "Done", counter.toList)
+      CopyResult(success = true, "Done", counter.toList)
     implicit val binaryIdCodec: Codec[BinaryId] =
       Codec.from(
@@ -96,8 +96,10 @@
         .fromList(targets.filter(_ != srcConfig))
         .toRight(CopyResult.noTargetStore)
-      srcRepo = store.createFileRepository(srcConfig, true)
-      targetRepos = trgConfig.map(store.createFileRepository(_, false))
+      srcRepo = store.createFileRepository(srcConfig, withAttributeStore = true)
+      targetRepos = trgConfig.map(
+        store.createFileRepository(_, withAttributeStore = false)
+      )
     } yield (srcRepo, targetRepos)
     data match {


@@ -13,8 +13,8 @@ case class CleanupResult(removed: Int, disabled: Boolean) {
   def asString = if (disabled) "disabled" else s"$removed"
 }
 object CleanupResult {
-  def of(n: Int): CleanupResult = CleanupResult(n, false)
-  def disabled: CleanupResult = CleanupResult(0, true)
+  def of(n: Int): CleanupResult = CleanupResult(n, disabled = false)
+  def disabled: CleanupResult = CleanupResult(0, disabled = true)
   implicit val jsonEncoder: Encoder[CleanupResult] =
     deriveEncoder


@@ -55,7 +55,7 @@ object HouseKeepingTask {
       UserTask(
         periodicId,
         taskName,
-        true,
+        enabled = true,
         ce,
         "Docspell house-keeping".some,
         ()


@@ -222,13 +222,13 @@ object FindProposal {
   def searchExact[F[_]: Sync](ctx: Context[F, Args], store: Store[F]): Finder[F] =
     labels =>
       labels.toList
-        .traverse(nl => search(nl, true, ctx, store))
+        .traverse(nl => search(nl, exact = true, ctx, store))
         .map(MetaProposalList.flatten)
   def searchFuzzy[F[_]: Sync](ctx: Context[F, Args], store: Store[F]): Finder[F] =
     labels =>
       labels.toList
-        .traverse(nl => search(nl, false, ctx, store))
+        .traverse(nl => search(nl, exact = false, ctx, store))
         .map(MetaProposalList.flatten)
 }


@@ -131,10 +131,10 @@ object ReProcessItem {
       data.item.source, // source-id
       None, // folder
       Seq.empty,
-      false,
+      skipDuplicate = false,
       None,
       None,
-      true,
+      reprocess = true,
       None, // attachOnly (not used when reprocessing attachments)
       None // cannot retain customData from an already existing item
     ),


@@ -75,7 +75,7 @@ object TextAnalysis {
       analyser: TextAnalyser[F],
       nerFile: RegexNerFile[F]
   )(rm: RAttachmentMeta): F[(RAttachmentMeta, AttachmentDates)] = {
-    val settings = NlpSettings(ctx.args.meta.language, false, None)
+    val settings = NlpSettings(ctx.args.meta.language, highRecall = false, None)
     for {
       customNer <- nerFile.makeFile(ctx.args.meta.collective)
       sett = settings.copy(regexNer = customNer)


@@ -28,7 +28,7 @@ object JoexRoutes {
         for {
           _ <- app.scheduler.notifyChange
           _ <- app.periodicScheduler.notifyChange
-          resp <- Ok(BasicResult(true, "Schedulers notified."))
+          resp <- Ok(BasicResult(success = true, "Schedulers notified."))
         } yield resp
       case GET -> Root / "running" =>
@@ -43,7 +43,7 @@
           _ <- Async[F].start(
             Temporal[F].sleep(Duration.seconds(1).toScala) *> app.initShutdown
           )
-          resp <- Ok(BasicResult(true, "Shutdown initiated."))
+          resp <- Ok(BasicResult(success = true, "Shutdown initiated."))
         } yield resp
       case GET -> Root / "job" / Ident(id) =>
@@ -54,7 +54,9 @@
             job <- optJob
             log <- optLog
           } yield mkJobLog(job, log)
-          resp <- jAndL.map(Ok(_)).getOrElse(NotFound(BasicResult(false, "Not found")))
+          resp <- jAndL
+            .map(Ok(_))
+            .getOrElse(NotFound(BasicResult(success = false, "Not found")))
         } yield resp
       case POST -> Root / "job" / Ident(id) / "cancel" =>


@@ -323,7 +323,7 @@ object ScanMailboxTask {
         s"mailbox-${ctx.args.account.login.id}",
         args.itemFolder,
         Seq.empty,
-        true,
+        skipDuplicates = true,
         args.fileFilter.getOrElse(Glob.all),
         args.tags.getOrElse(Nil),
         args.language,


@@ -18,6 +18,8 @@ servers:
   - url: /api/v1
     description: Current host
+security: []
 paths:
   /api/info/version:
     get:


@@ -164,7 +164,7 @@ object Event {
       for {
         id1 <- Ident.randomId[F]
         id2 <- Ident.randomId[F]
-      } yield ItemSelection(account, Nel.of(id1, id2), true, baseUrl, None)
+      } yield ItemSelection(account, Nel.of(id1, id2), more = true, baseUrl, None)
   }
   /** Event when a new job is added to the queue */


@@ -20,9 +20,9 @@ import org.typelevel.ci._
 trait Fixtures extends HttpClientOps { self: CatsEffectSuite =>
-  val pubsubEnv = ResourceFixture(Fixtures.envResource("node-1"))
+  val pubsubEnv = ResourceFunFixture(Fixtures.envResource("node-1"))
-  val pubsubT = ResourceFixture {
+  val pubsubT = ResourceFunFixture {
     Fixtures
       .envResource("node-1")
       .flatMap(_.pubSub)


@@ -87,10 +87,10 @@ object ParseFailure {
         SimpleMessage(offset, message)
       case InRange(offset, lower, upper) =>
-        if (lower == upper) ExpectMessage(offset, List(lower.toString), true)
+        if (lower == upper) ExpectMessage(offset, List(lower.toString), exhaustive = true)
         else {
           val expect = s"$lower-$upper"
-          ExpectMessage(offset, List(expect), true)
+          ExpectMessage(offset, List(expect), exhaustive = true)
         }
       case Length(offset, expected, actual) =>
@@ -110,6 +110,10 @@
         ExpectMessage(offset, options.take(7), options.size < 8)
       case WithContext(ctx, expect) =>
-        ExpectMessage(expect.offset, s"Failed to parse near: $ctx" :: Nil, true)
+        ExpectMessage(
+          expect.offset,
+          s"Failed to parse near: $ctx" :: Nil,
+          exhaustive = true
+        )
     }
 }


@@ -27,6 +27,8 @@ servers:
   - url: /api/v1
     description: Current host
+security: []
 paths:
   /api/info/version:
     get:


@@ -329,7 +329,7 @@ trait Conversions {
       sourceName,
       None,
       validFileTypes,
-      false,
+      skipDuplicates = false,
       Glob.all,
       Nil,
       None,
@@ -641,82 +641,86 @@ trait Conversions {
   def basicResult(r: SetValueResult): BasicResult =
     r match {
       case SetValueResult.FieldNotFound =>
-        BasicResult(false, "The given field is unknown")
+        BasicResult(success = false, "The given field is unknown")
       case SetValueResult.ItemNotFound =>
-        BasicResult(false, "The given item is unknown")
+        BasicResult(success = false, "The given item is unknown")
       case SetValueResult.ValueInvalid(msg) =>
-        BasicResult(false, s"The value is invalid: $msg")
+        BasicResult(success = false, s"The value is invalid: $msg")
       case SetValueResult.Success =>
-        BasicResult(true, "Custom field value set successfully.")
+        BasicResult(success = true, "Custom field value set successfully.")
     }
   def basicResult(cr: JobCancelResult): BasicResult =
     cr match {
-      case JobCancelResult.JobNotFound => BasicResult(false, "Job not found")
+      case JobCancelResult.JobNotFound => BasicResult(success = false, "Job not found")
       case JobCancelResult.CancelRequested =>
-        BasicResult(true, "Cancel was requested at the job executor")
+        BasicResult(success = true, "Cancel was requested at the job executor")
       case JobCancelResult.Removed =>
-        BasicResult(true, "The job has been removed from the queue.")
+        BasicResult(success = true, "The job has been removed from the queue.")
     }
   def idResult(ar: AddResult, id: Ident, successMsg: String): IdResult =
     ar match {
-      case AddResult.Success => IdResult(true, successMsg, id)
-      case AddResult.EntityExists(msg) => IdResult(false, msg, Ident.unsafe(""))
+      case AddResult.Success => IdResult(success = true, successMsg, id)
+      case AddResult.EntityExists(msg) => IdResult(success = false, msg, Ident.unsafe(""))
       case AddResult.Failure(ex) =>
-        IdResult(false, s"Internal error: ${ex.getMessage}", Ident.unsafe(""))
+        IdResult(success = false, s"Internal error: ${ex.getMessage}", Ident.unsafe(""))
     }
   def basicResult(ar: AddResult, successMsg: String): BasicResult =
     ar match {
-      case AddResult.Success => BasicResult(true, successMsg)
-      case AddResult.EntityExists(msg) => BasicResult(false, msg)
+      case AddResult.Success => BasicResult(success = true, successMsg)
+      case AddResult.EntityExists(msg) => BasicResult(success = false, msg)
       case AddResult.Failure(ex) =>
-        BasicResult(false, s"Internal error: ${ex.getMessage}")
+        BasicResult(success = false, s"Internal error: ${ex.getMessage}")
     }
   def basicResult(ar: UpdateResult, successMsg: String): BasicResult =
     ar match {
-      case UpdateResult.Success => BasicResult(true, successMsg)
-      case UpdateResult.NotFound => BasicResult(false, "Not found")
+      case UpdateResult.Success => BasicResult(success = true, successMsg)
+      case UpdateResult.NotFound => BasicResult(success = false, "Not found")
       case UpdateResult.Failure(ex) =>
-        BasicResult(false, s"Error: ${ex.getMessage}")
+        BasicResult(success = false, s"Error: ${ex.getMessage}")
     }
   def basicResult(ur: OUpload.UploadResult): BasicResult =
     ur match {
-      case UploadResult.Success => BasicResult(true, "Files submitted.")
-      case UploadResult.NoFiles => BasicResult(false, "There were no files to submit.")
-      case UploadResult.NoSource => BasicResult(false, "The source id is not valid.")
-      case UploadResult.NoItem => BasicResult(false, "The item could not be found.")
+      case UploadResult.Success => BasicResult(success = true, "Files submitted.")
+      case UploadResult.NoFiles =>
+        BasicResult(success = false, "There were no files to submit.")
+      case UploadResult.NoSource =>
+        BasicResult(success = false, "The source id is not valid.")
+      case UploadResult.NoItem =>
+        BasicResult(success = false, "The item could not be found.")
       case UploadResult.NoCollective =>
-        BasicResult(false, "The collective could not be found.")
+        BasicResult(success = false, "The collective could not be found.")
       case UploadResult.StoreFailure(_) =>
         BasicResult(
-          false,
+          success = false,
           "There were errors storing a file! See the server logs for details."
         )
     }
   def basicResult(cr: PassChangeResult): BasicResult =
     cr match {
-      case PassChangeResult.Success => BasicResult(true, "Password changed.")
+      case PassChangeResult.Success => BasicResult(success = true, "Password changed.")
       case PassChangeResult.UpdateFailed =>
-        BasicResult(false, "The database update failed.")
+        BasicResult(success = false, "The database update failed.")
       case PassChangeResult.PasswordMismatch =>
-        BasicResult(false, "The current password is incorrect.")
-      case PassChangeResult.UserNotFound => BasicResult(false, "User not found.")
+        BasicResult(success = false, "The current password is incorrect.")
+      case PassChangeResult.UserNotFound =>
+        BasicResult(success = false, "User not found.")
       case PassChangeResult.InvalidSource(source) =>
         BasicResult(
-          false,
+          success = false,
           s"User has invalid soure: $source. Passwords are managed elsewhere."
         )
     }
   def basicResult(e: Either[Throwable, _], successMsg: String): BasicResult =
     e match {
-      case Right(_) => BasicResult(true, successMsg)
-      case Left(ex) => BasicResult(false, ex.getMessage)
+      case Right(_) => BasicResult(success = true, successMsg)
+      case Left(ex) => BasicResult(success = false, ex.getMessage)
     }
   // MIME Type


@@ -38,7 +38,7 @@ object BinaryUtil {
         if (matches) withResponseHeaders(dsl, NotModified())(data)
         else makeByteResp(dsl)(data)
       }
-      .getOrElse(NotFound(BasicResult(false, "Not found")))
+      .getOrElse(NotFound(BasicResult(success = false, "Not found")))
   }
   def respondHead[F[_]: Async](dsl: Http4sDsl[F])(
@@ -48,7 +48,7 @@
     fileData
       .map(data => withResponseHeaders(dsl, Ok())(data))
-      .getOrElse(NotFound(BasicResult(false, "Not found")))
+      .getOrElse(NotFound(BasicResult(success = false, "Not found")))
   }
   def respondPreview[F[_]: Async](dsl: Http4sDsl[F], req: Request[F])(
@@ -56,7 +56,7 @@
   ): F[Response[F]] = {
     import dsl._
     def notFound =
-      NotFound(BasicResult(false, "Not found"))
+      NotFound(BasicResult(success = false, "Not found"))
     QP.WithFallback.unapply(req.multiParams) match {
       case Some(bool) =>
@@ -75,7 +75,7 @@
         )
       case None =>
-        BadRequest(BasicResult(false, "Invalid query parameter 'withFallback'"))
+        BadRequest(BasicResult(success = false, "Invalid query parameter 'withFallback'"))
     }
   }
@@ -85,7 +85,7 @@
     import dsl._
     fileData
       .map(data => withResponseHeaders(dsl, Ok())(data))
-      .getOrElse(NotFound(BasicResult(false, "Not found")))
+      .getOrElse(NotFound(BasicResult(success = false, "Not found")))
   }
   def withResponseHeaders[F[_]: Sync](dsl: Http4sDsl[F], resp: F[Response[F]])(


@@ -33,10 +33,10 @@ object ThrowableResponseMapper {
     def toResponse(ex: Throwable): F[Response[F]] =
       ex match {
         case _: IllegalArgumentException =>
-          BadRequest(BasicResult(false, ex.getMessage))
+          BadRequest(BasicResult(success = false, ex.getMessage))
         case _ =>
-          InternalServerError(BasicResult(false, ex.getMessage))
+          InternalServerError(BasicResult(success = false, ex.getMessage))
       }
   }
 }


@@ -52,7 +52,7 @@ object AddonArchiveRoutes extends AddonValidationSupport {
       case req @ POST -> Root :? Sync(sync) =>
         def create(r: Option[RAddonArchive]) =
           IdResult(
-            true,
+            success = true,
             r.fold("Addon submitted for installation")(r =>
               s"Addon installed: ${r.id.id}"
             ),
@@ -77,7 +77,7 @@ object AddonArchiveRoutes extends AddonValidationSupport {
       case PUT -> Root / Ident(id) :? Sync(sync) =>
         def create(r: Option[AddonMeta]) =
           BasicResult(
-            true,
+            success = true,
             r.fold("Addon updated in background")(m =>
               s"Addon updated: ${m.nameAndVersion}"
             )
@@ -99,8 +99,8 @@ object AddonArchiveRoutes extends AddonValidationSupport {
         for {
           flag <- backend.addons.deleteAddon(token.account.collectiveId, id)
           resp <-
-            if (flag) Ok(BasicResult(true, "Addon deleted"))
-            else NotFound(BasicResult(false, "Addon not found"))
+            if (flag) Ok(BasicResult(success = true, "Addon deleted"))
+            else NotFound(BasicResult(success = false, "Addon not found"))
         } yield resp
     }
   }
@@ -112,11 +112,11 @@ object AddonArchiveRoutes extends AddonValidationSupport {
     import dsl._
     def failWith(msg: String): F[Response[F]] =
-      Ok(IdResult(false, msg, Ident.unsafe("")))
+      Ok(IdResult(success = false, msg, Ident.unsafe("")))
     e match {
       case AddonValidationError.AddonNotFound =>
-        NotFound(BasicResult(false, "Addon not found."))
+        NotFound(BasicResult(success = false, "Addon not found."))
       case _ =>
         failWith(validationErrorToMessage(e))


@@ -35,5 +35,5 @@ object AddonRoutes {
         "run" -> AddonRunRoutes(backend, token)
       )
     else
-      Responses.notFoundRoute(BasicResult(false, "Addons disabled"))
+      Responses.notFoundRoute(BasicResult(success = false, "Addons disabled"))
 }


@@ -43,8 +43,8 @@ object AddonRunConfigRoutes {
             .map(_.leftMap(_.message))
         )
         resp <- res.fold(
-          msg => Ok(BasicResult(false, msg)),
-          id => Ok(IdResult(true, s"Addon run config added", id))
+          msg => Ok(BasicResult(success = false, msg)),
+          id => Ok(IdResult(success = true, s"Addon run config added", id))
         )
       } yield resp
@@ -58,8 +58,8 @@ object AddonRunConfigRoutes {
             .map(_.leftMap(_.message))
         )
         resp <- res.fold(
-          msg => Ok(BasicResult(false, msg)),
-          id => Ok(IdResult(true, s"Addon run config updated", id))
+          msg => Ok(BasicResult(success = false, msg)),
+          id => Ok(IdResult(success = true, s"Addon run config updated", id))
         )
       } yield resp
@@ -67,8 +67,8 @@ object AddonRunConfigRoutes {
       for {
        flag <- backend.addons.deleteAddonRunConfig(token.account.collectiveId, id)
        resp <-
-         if (flag) Ok(BasicResult(true, "Addon task deleted"))
-         else NotFound(BasicResult(false, "Addon task not found"))
+         if (flag) Ok(BasicResult(success = true, "Addon task deleted"))
+         else NotFound(BasicResult(success = false, "Addon task not found"))
      } yield resp
    }
  }


@@ -35,7 +35,7 @@ object AddonRunRoutes {
           input.addonRunConfigIds.toSet,
           UserTaskScope(token.account)
         )
-        resp <- Ok(BasicResult(true, "Job for running addons submitted."))
+        resp <- Ok(BasicResult(success = true, "Job for running addons submitted."))
       } yield resp
     }
 }


@@ -66,7 +66,7 @@ object AttachmentRoutes {
         resp <-
           fileData
             .map(data => withResponseHeaders(Ok())(data))
-            .getOrElse(NotFound(BasicResult(false, "Not found")))
+            .getOrElse(NotFound(BasicResult(success = false, "Not found")))
       } yield resp
     case req @ GET -> Root / Ident(id) / "original" =>
@@ -83,7 +83,7 @@
             if (matches) withResponseHeaders(NotModified())(data)
             else makeByteResp(data)
           }
-          .getOrElse(NotFound(BasicResult(false, "Not found")))
+          .getOrElse(NotFound(BasicResult(success = false, "Not found")))
       } yield resp
     case HEAD -> Root / Ident(id) / "archive" =>
@@ -93,7 +93,7 @@
         resp <-
           fileData
             .map(data => withResponseHeaders(Ok())(data))
-            .getOrElse(NotFound(BasicResult(false, "Not found")))
+            .getOrElse(NotFound(BasicResult(success = false, "Not found")))
       } yield resp
     case req @ GET -> Root / Ident(id) / "archive" =>
@@ -108,7 +108,7 @@
             if (matches) withResponseHeaders(NotModified())(data)
             else makeByteResp(data)
           }
-          .getOrElse(NotFound(BasicResult(false, "Not found")))
+          .getOrElse(NotFound(BasicResult(success = false, "Not found")))
       } yield resp
     case req @ GET -> Root / Ident(id) / "preview" =>
@@ -148,7 +148,9 @@
       for {
         rm <- backend.itemSearch.findAttachmentMeta(id, user.account.collectiveId)
         md = rm.map(Conversions.mkAttachmentMeta)
-        resp <- md.map(Ok(_)).getOrElse(NotFound(BasicResult(false, "Not found.")))
+        resp <- md
+          .map(Ok(_))
+          .getOrElse(NotFound(BasicResult(success = false, "Not found.")))
       } yield resp
     case req @ POST -> Root / Ident(id) / "name" =>
@@ -169,8 +171,11 @@
             backend.attachment
               .setExtractedText(user.account.collectiveId, itemId, id, newText)
           )
-        resp <- OptionT.liftF(Ok(BasicResult(true, "Extracted text updated.")))
-      } yield resp).getOrElseF(NotFound(BasicResult(false, "Attachment not found")))
+        resp <- OptionT.liftF(
+          Ok(BasicResult(success = true, "Extracted text updated."))
+        )
+      } yield resp)
+        .getOrElseF(NotFound(BasicResult(success = false, "Attachment not found")))
     case DELETE -> Root / Ident(id) / "extracted-text" =>
       (for {
@@ -181,7 +186,9 @@
             backend.attachment
               .setExtractedText(user.account.collectiveId, itemId, id, "".pure[F])
           )
-        resp <- OptionT.liftF(Ok(BasicResult(true, "Extracted text cleared.")))
+        resp <- OptionT.liftF(
+          Ok(BasicResult(success = true, "Extracted text cleared."))
+        )
       } yield resp).getOrElseF(NotFound())
     case GET -> Root / Ident(id) / "extracted-text" =>
@@ -190,14 +197,15 @@
           backend.itemSearch.findAttachmentMeta(id, user.account.collectiveId)
         )
         resp <- OptionT.liftF(Ok(OptionalText(meta.content)))
-      } yield resp).getOrElseF(NotFound(BasicResult(false, "Attachment not found")))
+      } yield resp)
+        .getOrElseF(NotFound(BasicResult(success = false, "Attachment not found")))
     case DELETE -> Root / Ident(id) =>
       for {
        n <- backend.item.deleteAttachment(id, user.account.collectiveId)
        res =
-          if (n == 0) BasicResult(false, "Attachment not found")
-          else BasicResult(true, "Attachment deleted.")
+          if (n == 0) BasicResult(success = false, "Attachment not found")
+          else BasicResult(success = true, "Attachment deleted.")
        resp <- Ok(res)
      } yield resp
   }


@@ -40,9 +40,9 @@ object CalEventCheckRoutes {
             val next = ev
               .nextElapses(now.toUtcDateTime, 2)
               .map(Timestamp.atUtc)
-            CalEventCheckResult(true, "Valid.", ev.some, next)
+            CalEventCheckResult(success = true, "Valid.", ev.some, next)
           case Left(err) =>
-            CalEventCheckResult(false, err, None, Nil)
+            CalEventCheckResult(success = false, err, None, Nil)
         }
     }
 }


@@ -66,7 +66,7 @@ object ClientSettingsRoutes {
         for {
           data <- req.as[Json]
           _ <- backend.clientSettings.saveUser(clientId, user.account.userId, data)
-          res <- Ok(BasicResult(true, "Settings stored"))
+          res <- Ok(BasicResult(success = true, "Settings stored"))
         } yield res
       case GET -> Root / "user" / Ident(clientId) =>
@@ -97,7 +97,7 @@
             user.account.collectiveId,
             data
           )
-          res <- Ok(BasicResult(true, "Settings stored"))
+          res <- Ok(BasicResult(success = true, "Settings stored"))
         } yield res
       case GET -> Root / "collective" / Ident(clientId) =>


@@ -118,7 +118,7 @@ object CollectiveRoutes {
       case POST -> Root / "classifier" / "startonce" =>
         for {
           _ <- backend.collective.startLearnClassifier(user.account.collectiveId)
-          resp <- Ok(BasicResult(true, "Task submitted"))
+          resp <- Ok(BasicResult(success = true, "Task submitted"))
         } yield resp
       case req @ POST -> Root / "emptytrash" / "startonce" =>
@@ -127,7 +127,7 @@
           _ <- backend.collective.startEmptyTrash(
             EmptyTrashArgs(user.account.collectiveId, data.minAge)
           )
-          resp <- Ok(BasicResult(true, "Task submitted"))
+          resp <- Ok(BasicResult(success = true, "Task submitted"))
         } yield resp
     }
   }


@@ -56,7 +56,7 @@ object CustomFieldRoutes {
       (for {
         field <- OptionT(backend.customFields.findById(user.account.collectiveId, id))
         res <- OptionT.liftF(Ok(convertField(field)))
-      } yield res).getOrElseF(NotFound(BasicResult(false, "Not found")))
+      } yield res).getOrElseF(NotFound(BasicResult(success = false, "Not found")))
     case req @ PUT -> Root / Ident(id) =>
       for {

Some files were not shown because too many files have changed in this diff.