Remove obsolete shell scripts

The commandline interface now supersedes these scripts.
This commit is contained in:
eikek
2021-08-20 00:23:37 +02:00
parent 30dec30450
commit 461ae74c28
19 changed files with 49 additions and 2075 deletions

View File

@ -4,10 +4,10 @@ base_url = "https://docspell.org"
# Whether to automatically compile all Sass files in the sass directory
compile_sass = true
[markdown]
# Whether to do syntax highlighting
# Theme can be customised by setting the `highlight_theme` variable to a theme supported by Zola
highlight_code = true
highlight_theme = "gruvbox-dark"
# Whether to build a search index to be used later on by a JavaScript library

View File

@ -82,9 +82,9 @@ documentation, too.
In order to move to a different tool, it is necessary to get the data
out of Docspell in a machine readable/automatic way. Currently, there
is an [export command](@/docs/tools/cli.md#export-data) in the command
line client that can be used to download all your files and item
metadata.
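As described on the CLI page, a full export into the current directory looks like this:

``` shell
dsc export --all --target .
```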
My recommendation is to run periodic database backups and also store
the binaries/docker images. This lets you re-create the current state

View File

@ -7,8 +7,9 @@ To get started, here are some quick links:
- Using [docker and docker-compose](@/docs/install/docker.md). This
sets up everything: all prerequisites, both docspell components and
a container running the [dsc
watch](@/docs/tools/cli.md#watch-a-directory) command to import files
that are dropped in a folder.
- [Download, Unpack and Run](@/docs/install/download_run.md). This
option is also very quick, but you need to check the
[prerequisites](@/docs/install/prereq.md) yourself. Database is
@ -27,9 +28,9 @@ To get started, here are some quick links:
thread](https://forums.unraid.net/topic/103425-docspell-hilfe/) in
the German Unraid forum. Thanks for providing these!
Every [component](@/docs/intro/_index.md#components) (restserver,
joex, dsc watch) can run on different machines and multiple times.
Most of the time running all on one machine is sufficient and also for
simplicity, the docker-compose setup reflects this variant.
While there are many different ways to run docspell, at some point all

View File

@ -167,7 +167,9 @@ directories.
The `watch` subcommand can be used to watch one or more directories
and upload files when they arrive. It uses the `upload` command under
the hood, and therefore most options are also available here. You can
upload via a source url, the integration endpoint or a valid session
(requires logging in).
It detects file creations and skips a rename within a watched folder.
The flag `-r` or `--recursive` is required to recursively watch a
@ -191,6 +193,10 @@ If watching a directory is not possible due to system constraints
use the `upload` subcommand with `--poll` option which periodically
traverses a directory.
When using the integration endpoint, you must specify `-i` and
potentially a secret if the endpoint is protected.
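As a sketch (the folder is a placeholder; consult `dsc watch --help` for the exact option that passes the secret, should your endpoint require one):

``` shell
# watch a folder recursively and upload via the integration endpoint
dsc watch -r -i ~/inbox
```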
## Download files
The `download` command allows to download files that match a given
@ -303,7 +309,8 @@ commands require the [admin
secret](@/docs/configure/_index.md#admin-endpoint) either in the
config file or as an argument.
### Reset user password
``` shell
dsc admin reset-password --account demo
┌─────────┬──────────────┬──────────────────┐
@ -313,7 +320,8 @@ Reset user password:
└─────────┴──────────────┴──────────────────┘
```
### Recreate fulltext index
``` shell
dsc admin --admin-secret admin123 recreate-index
┌─────────┬─────────────────────────────────────┐
@ -323,6 +331,34 @@ Recreate fulltext index:
└─────────┴─────────────────────────────────────┘
```
### Convert all files to PDF
``` shell
dsc admin --admin-secret admin123 convert-all-pdfs
┌─────────┬─────────────────────────────────┐
│ success │ message │
├─────────┼─────────────────────────────────┤
│ true │ Convert all PDFs task submitted │
└─────────┴─────────────────────────────────┘
```
This may be necessary if you disabled PDF conversion before and are
enabling it now.
### Regenerate preview images
``` shell
dsc admin --admin-secret admin123 generate-previews
┌─────────┬───────────────────────────────────────┐
│ success │ message │
├─────────┼───────────────────────────────────────┤
│ true │ Generate all previews task submitted. │
└─────────┴───────────────────────────────────────┘
```
This submits tasks to (re)generate the preview images of all files.
This is necessary if you changed the `preview.dpi` setting in joex's
config.
## Search for items
The `search` command takes a [query](@/docs/query/_index.md) and

View File

@ -1,58 +0,0 @@
+++
title = "Directory Cleaner (⊗)"
description = "Clean directories from files in docspell"
weight = 150
+++
{% infobubble(mode="info", title="⚠ Please note") %}
This script is now obsolete, you can use the [**CLI tool**](../cli/) instead.
Use the `cleanup` or the `upload` command.
{% end %}
# Introduction
This script cleans up the consumption directory used by the
consumedir service (as provided by the docker container), handling
the files that are copied or moved there.
<https://github.com/eikek/docspell/tree/master/tools/consumedir-cleaner>
## How it works
- Checks for every file (in the collective's folder of the given user
name) if it already exists in the collective (using Docspell's API).
- If so, by default those files are moved to an archive folder right
  beside the collective's consumption folder, named `_archive`. The
  archive's files are organized into monthly subfolders by the date
  they've been added to Docspell.
- If set, those files can also be deleted instead of being moved to
the archive. There is no undo function provided for this, so be
careful.
- If a file is found which does not exist in the collective, by
  default nothing happens; that file would be found in every run and
  just ignored.
- If set, those files can also be uploaded to Docspell. Depending on
the setting for files already existing these files would either be
deleted or moved to the archive in the next run.
## Usage (parameters / settings)
Copy the script to your machine and run it with the following
parameters:
1. URL of Docspell, including http(s)
2. Username for Docspell, possibly including the collective (if its
   name differs from the user name)
3. Password for Docspell
4. Path to the directory whose files shall be checked against
   existence in Docspell
Additionally, environment variables can be used to alter the behavior:
- `DS_CC_REMOVE`
- `true` - delete files which already exist in the collective
- `false` (default) - move them to the archive (see above)
- `DS_CC_UPLOAD_MISSING`
- `true` - uploads files which do not exist in the collective
- `false` (default) - ignore them and do nothing
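A hypothetical invocation could look like this (the script name is taken from the `tools/consumedir-cleaner` folder; the account, password and path are placeholders):

``` bash
env DS_CC_REMOVE=false DS_CC_UPLOAD_MISSING=true \
  ./consumedir-cleaner.sh http://localhost:7880 demo secret123 ~/consumedir/demo
```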

View File

@ -1,191 +0,0 @@
+++
title = "Consume Directory (⊗)"
description = "A script to watch a directory for new files and upload them to docspell."
weight = 110
+++
{% infobubble(mode="info", title="⚠ Please note") %}
This script is now obsolete, you can use the [**CLI tool**](../cli/) instead.
You can use the `watch` command, or the `upload` command with `--poll`.
{% end %}
# Introduction
The `consumedir.sh` is a bash script that works in two modes:
- Go through all files in the given directories (recursively, if `-r`
  is specified) and send each to docspell.
- Watch one or more directories for new files and upload them to
docspell.
It can watch or go through one or more directories. Files can be
uploaded to multiple urls.
Run the script with the `-h` or `--help` option, to see a short help
text. The help text will also show the values for any given option.
The script requires `curl` for uploading. It requires the
`inotifywait` command if directories should be watched for new
files.
Example for watching two directories:
``` bash
./tools/consumedir.sh --path ~/Downloads --path ~/pdfs -m -dv \
http://localhost:7880/api/v1/open/upload/item/5DxhjkvWf9S-CkWqF3Kr892-WgoCspFWDo7-XBykwCyAUxQ
```
The script by default watches the given directories. If the `-o` or
`--once` option is used, it will instead go through these directories
and upload all files in there. For directory watching the
`inotifywait` command is used and must be present. Another way is to
use the `--poll` option. It expects the number of seconds to wait
between running itself with `--once`.
Example using active polling (at 5 minutes interval):
``` bash
./tools/consumedir.sh --poll 300 --path ~/Downloads --path ~/pdfs -m -dv \
http://localhost:7880/api/v1/open/upload/item/5DxhjkvWf9S-CkWqF3Kr892-WgoCspFWDo7-XBykwCyAUxQ
```
Example for uploading all immediately (the same as above, only with
`-o` added):
``` bash
$ ./tools/consumedir.sh --once --path ~/Downloads --path ~/pdfs/ -m -dv \
http://localhost:7880/api/v1/open/upload/item/5DxhjkvWf9S-CkWqF3Kr892-WgoCspFWDo7-XBykwCyAUxQ
```
The URL can be any docspell url that accepts uploads without
authentication. This is usually a [source
url](@/docs/webapp/uploading.md#anonymous-upload). It is also possible
to use the script with the [integration
endpoint](@/docs/api/upload.md#integration-endpoint).
The script can be run multiple times and on multiple machines; the
files are transferred via HTTP to the docspell server. For example, it
is convenient to set it up on your workstation, so that you can drop
files into some local folder to be immediately transferred to docspell
(e.g. when downloading something from the browser).
## Integration Endpoint
When given the `-i` or `--integration` option, the script changes its
behaviour slightly to work with the [integration
endpoint](@/docs/api/upload.md#integration-endpoint).
First, if `-i` is given, it implies `-r` so the directories are
watched or traversed recursively. The script then assumes that there
is a subfolder with the collective name. Files must not be placed
directly into a folder given by `-p`, but below a sub-directory that
matches a collective name. In order to know for which collective the
file is, the script uses the first subfolder.
If the endpoint is protected, the credentials can be specified as
arguments `--iuser` and `--iheader`, respectively. The format for
both is `<name>:<value>`, so the username cannot contain a colon
character (but the password can).
Example:
``` bash
$ consumedir.sh -i -iheader 'Docspell-Integration:test123' -m -p ~/Downloads/ http://localhost:7880/api/v1/open/integration/item
```
The url is the integration endpoint url without the collective, as
this is amended by the script.
This watches the folder `~/Downloads`. If a file is placed in this
folder directly, say `~/Downloads/test.pdf` the upload will fail,
because the collective cannot be determined. Create a subfolder below
`~/Downloads` with the name of a collective, for example
`~/Downloads/family` and place files somewhere below this `family`
subfolder, like `~/Downloads/family/test.pdf`.
## Duplicates
With the `-m` option, the script will not upload files that already
exist at docspell. For this the `sha256sum` command is required.
So you can move and rename files in those folders without worrying
about duplicates. This allows you to keep your files organized using
the file-system and have them mirrored into docspell as well.
## Network Filesystems (samba cifs, nfs)
Watching a directory for changes relies on `inotify` subsystem on
linux. This doesn't work on network filesystems like nfs or cifs. Here
are some ideas to get around this limitation:
1. The `consumedir.sh` is just a shell script and doesn't need to run
on the same machine as docspell. (Note that the default docker
setup is mainly for demoing and quickstart, it's not required to
run all of them on one machine). So the best option is to put the
consumedir on the machine that contains the local filesystem. All
files are sent via HTTP to the docspell server anyway, so there is
no need to first transfer them via a network filesystem or rsync.
2. If option 1 is not possible for some reason, and you need to check
a network filesystem, the only option left (that I know) is to
periodically poll this directory. This is also possible with
consumedir, using the `--poll` option (see above). You can also
setup a systemd timer to periodically run this script with the
`--once` option.
3. Copy the files to the machine that runs consumedir, via rsync for
example. Note that this has no advantage over option 1, as you now
need to setup rsync on the other machine to run either periodically
or when some file arrives. Then you can as well run the consumedir
script. But it might be more convenient, if rsync is already
running.
# Systemd
The script can be used with systemd to run as a service. This is an
example unit file:
``` systemd
[Unit]
After=network.target
Description=Docspell Consumedir
[Service]
Environment="PATH=/set/a/path"
ExecStart=/bin/su -s /bin/bash someuser -c "consumedir.sh --path '/a/path/' -m 'http://localhost:7880/api/v1/open/upload/item/5DxhjkvWf9S-CkWqF3Kr892-WgoCspFWDo7-XBykwCyAUxQ'"
```
This unit file is just an example, it needs some fiddling. It assumes
an existing user `someuser` that is used to run this service. The url
`http://localhost:7880/api/v1/open/upload/...` is an anonymous upload
url as described [here](@/docs/webapp/uploading.md#anonymous-upload).
# Docker
The provided docker-compose setup runs this script to watch a single
directory, `./docs` in the current directory, for new files. If a new
file is detected, it is pushed to docspell.
This utilizes the [integration
endpoint](@/docs/api/upload.md#integration-endpoint), which is
enabled in the config file, to allow uploading documents for all
collectives. A subfolder must be created for each registered
collective. The docker containers are configured to use http-header
protection for the integration endpoint. This requires you to provide
a secret, that is shared between the rest-server and the
`consumedir.sh` script. This can be done by defining an environment
variable which gets picked up by the containers defined in
`docker-compose.yml`:
``` bash
export DOCSPELL_HEADER_VALUE="my-secret"
docker-compose up
```
Now you can create a folder `./docs/<collective-name>` and place all
files there that you want to import. Once dropped in this folder, the
`consumedir` container will push them to docspell.

View File

@ -1,59 +0,0 @@
+++
title = "Convert All PDFs (⊗)"
description = "Convert all PDF files using OcrMyPdf."
weight = 160
+++
{% infobubble(mode="info", title="⚠ Please note") %}
This script is now obsolete, you can use the [**CLI tool**](../cli/) instead.
Use the `convert-all-pdfs` admin command, e.g. `dsc admin
convert-all-pdfs`.
{% end %}
# convert-all-pdf.sh
With version 0.9.0 there was support added for another external tool,
[OCRMyPdf](https://github.com/jbarlow83/OCRmyPDF), that can convert
PDF files such that they contain the OCR-ed text layer. This tool is
optional and can be disabled.
In order to convert all previously processed files with this tool,
there is an
[endpoint](/openapi/docspell-openapi.html#api-Item-secItemConvertallpdfsPost)
that submits a task to convert all PDF files not already converted for
your collective.
There is no UI part to trigger this route, so you need to use curl or
the script `convert-all-pdfs.sh` in the `tools/` directory.
# Requirements
It is a bash script that additionally needs
[curl](https://curl.haxx.se/) and
[jq](https://stedolan.github.io/jq/).
# Usage
```
./convert-all-pdfs.sh [docspell-base-url]
```
For example, if docspell is at `http://localhost:7880`:
```
./convert-all-pdfs.sh http://localhost:7880
```
The script asks for your account name and password. It then logs in
and triggers the said endpoint. After this you should see a few tasks
running.
There will be one task per file to convert. All these tasks are
submitted with a low priority. So files uploaded through the webapp or
a [source](@/docs/webapp/uploading.md#anonymous-upload) with a high
priority, will be preferred as [configured in the job
executor](@/docs/joex/intro.md#scheduler-config). This is to not
disturb normal processing when many conversion tasks are being
executed.

View File

@ -1,54 +0,0 @@
+++
title = "Upload CLI (⊗)"
description = "A script to quickly upload files from the command line."
weight = 100
+++
{% infobubble(mode="info", title="⚠ Please note") %}
This script is now obsolete, you can use the [**CLI tool**](../cli/) instead.
Use the `upload` command (or the `up` alias), like `dsc up *.pdf`.
{% end %}
# Introduction
The `tools/ds.sh` is a bash script to quickly upload files from the
command line. It reads a configuration file containing the URLs to
upload to. Then each file given to the script will be uploaded to all
URLs in the config.
The config file is expected in
`$XDG_CONFIG_HOME/docspell/ds.conf`. `$XDG_CONFIG_HOME` defaults to
`~/.config`.
The config file contains lines with key-value pairs, separated by a
`=` sign. Lines starting with `#` are ignored. Example:
```
# Config file
url.1 = http://localhost:7880/api/v1/open/upload/item/5DxhjkvWf9S-CkWqF3Kr892-WgoCspFWDo7-XBykwCyAUxQ
url.2 = http://localhost:7880/api/v1/open/upload/item/6DxhjkvWf9S-CkWqF3Kr892-WgoCspFWDo7-XBykwCyAUxQ
```
The key must start with `url`. The urls should be [anonymous upload
urls](@/docs/webapp/uploading.md#anonymous-upload).
# Usage
- The `-c` option allows specifying a different config file.
- The `-h` option shows a help overview.
- The `-d` option deletes files after a successful upload.
- The `-e` option can be used to check for file existence in docspell.
Instead of uploading, the script only checks whether the file is in
docspell or not.
The script takes a list of files as arguments.
Example:
``` bash
./ds.sh ~/Downloads/*.pdf
```
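The options can be combined with file arguments as well. A sketch using only the flags described above:

``` bash
# check whether the files already exist in docspell
./ds.sh -e ~/Downloads/*.pdf

# upload and delete the local copies on success
./ds.sh -d ~/Downloads/*.pdf
```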

View File

@ -1,215 +0,0 @@
+++
title = "Export Files (⊗)"
description = "Downloads all files from docspell."
weight = 165
+++
{% infobubble(mode="info", title="⚠ Please note") %}
This script is now obsolete, you can use the [**CLI tool**](../cli/) instead.
Use the `export` command, e.g. `dsc export --all --target .`.
{% end %}
# export-files.sh
This script can be used to download all files from docspell that have
been uploaded before and the item metadata.
It downloads the original files as they have been uploaded, not the
converted PDF files.
The item's metadata is stored next to the files to provide more
information about the item: correspondent, tags, dates, custom fields
etc. This contains most of your user supplied data.
This script is intended for having your data outside of and
independent of docspell. Another good idea for a backup strategy is to
take database dumps *and* store the releases of docspell next to this
dump.
Files are stored into the following folder structure (below the given
target directory):
```
- yyyy-mm (item date)
- A3…XY (item id)
- somefile.pdf (attachments with name)
- metadata.json (json file with items metadata)
```
By default, files are not overwritten; the script stops if existing
files are encountered. This and some other things can be changed using
environment variables:
- `DS_USER` the account name for login; it is asked for if not set
- `DS_PASS` the password for login; it is asked for if not set
- `OVERWRITE_FILE=` if `y` then overwriting existing files is ok.
Default is `n`.
- `SKIP_FILE=` if `y` then existing files are skipped (supersedes
`OVERWRITE_FILE`). Default is `n`.
- `DROP_ITEM=` if `y` the item folder is removed before attempting to
download it. If this is set to `y` then the above options don't make
sense, since they operate on the files inside the item folder.
Default is `n`.
Docspell sends the sha256 hash of each file via the ETag header.
This is used to do an integrity check after downloading.
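A rough sketch of that check, assuming a file's download url is given as an argument and `sha256sum` is available (the exact header parsing may differ):

``` bash
url="$1"  # the file's download URL
curl -s -o file.bin "$url"
# extract the sha256 from the ETag header and compare with the local hash
etag=$(curl -sI "$url" | tr -d '\r"' | awk 'tolower($1)=="etag:" {print $2}')
test "$etag" = "$(sha256sum file.bin | cut -d' ' -f1)" && echo "Checksum ok."
```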
# Requirements
It is a bash script that additionally needs
[curl](https://curl.haxx.se/) and [jq](https://stedolan.github.io/jq/)
to be available.
# Usage
```
./export-files.sh <docspell-base-url> <target-directory>
```
For example, if docspell is at `http://localhost:7880`:
```
./export-files.sh http://localhost:7880 /tmp/ds-downloads
```
The script asks for your account name and password. It then logs in
and goes through all items downloading the metadata as json and the
attachments.
# Example Run
``` bash
fish> env SKIP_FILE=y DS_USER=demo DS_PASS=test ./export-files.sh http://localhost:7880 /tmp/download
Login to Docspell.
Using url: http://localhost:7880
Login successful
Downloading 73 items…
Get next items with offset=0, limit=100
Get item 57Znskthf3g-X7RP1fxzE2U-dwr4vM6Yjnn-b7s1PoCznhz
- Download 'something.txt' (8HbeFornAUN-kBCyc8bHSVr-bnLBYDzgRQ7-peMZzyTzM2X)
- Checksum ok.
Get item 94u5Pt39q6N-7vKu3LugoRj-zohGS4ie4jb-68bW5gXU6Jd
- Download 'letter-en.pdf' (6KNNmoyqpew-RAkdwEmQgBT-QDqdY97whZA-4k2rmbssdfQ)
- Checksum ok.
Get item 7L9Fh53RVG4-vGSt2G2YUcY-cvpBKRXQgBn-omYpg6xQXyD
- Download 'mail.html' (A6yTYKrDc7y-xU3whmLB1kB-TGhEAVb12mo-RUw5u9PsYMo)
- Checksum ok.
Get item DCn9UtWUtvF-2qjxB5PXGEG-vqRUUU7JUJH-zBBrmSeGYPe
- Download 'Invoice_7340224.pdf' (6FWdjxJh7yB-CCjY39p6uH9-uVLbmGfm25r-cw6RksrSx4n)
- Checksum ok.
```
The resulting directory then looks like this:
``` bash
├── 2020-08
│   ├── 6t27gQQ4TfW-H4uAmkYyiSe-rBnerFE2v5F-9BdqbGEhMcv
│   │   ├── 52241.pdf
│   │   └── metadata.json
│   └── 9qwT2GuwEvV-s9UuBQ4w7o9-uE8AdMc7PwL-GFDd62gduAm
│   ├── DOC-20191223-155707.jpg
│   └── metadata.json
├── 2020-09
│   ├── 2CM8C9VaVAT-sVJiKyUPCvR-Muqr2Cqvi6v-GXhRtg6eomA
│   │   ├── letter with spaces.pdf
│   │   └── metadata.json
│   ├── 4sXpX2Sc9Ex-QX1M6GtjiXp-DApuDDzGQXR-7pg1QPW9pbs
│   │   ├── analyse.org
│   │   ├── 201703.docx
│   │   ├── 11812_120719.pdf
│   │   ├── letter-de.pdf
│   │   ├── letter-en.pdf
│   │   └── metadata.json
│   ├── 5VhP5Torsy1-15pwJBeRjPi-es8BGnxhWn7-3pBQTJv3zPb
│   │   └── metadata.json
│   ├── 7ePWmK4xCNk-gmvnTDdFwG8-JcN5MDSUNPL-NTZZrho2Jc6
│   │   ├── metadata.json
│   │   └── Rechnung.pdf
```
The `metadata.json` file contains all the item metadata. This may be
useful when importing into other tools.
``` json
{
"id": "AWCNx7tJgUw-SdrNtRouNJB-FGs6Y2VP5bV-218sFN8mjjk",
"direction": "incoming",
"name": "Ruecksendung.pdf",
"source": "integration",
"state": "confirmed",
"created": 1606171810005,
"updated": 1606422917826,
"itemDate": null,
"corrOrg": null,
"corrPerson": null,
"concPerson": null,
"concEquipment": null,
"inReplyTo": null,
"folder": null,
"dueDate": null,
"notes": null,
"attachments": [
{
"id": "4aPmhrjfR9Z-AgknoW6yVoE-YkffioD2KXV-E6Vm6snH17Q",
"name": "Ruecksendung.converted.pdf",
"size": 57777,
"contentType": "application/pdf",
"converted": true
}
],
"sources": [
{
"id": "4aPmhrjfR9Z-AgknoW6yVoE-YkffioD2KXV-E6Vm6snH17Q",
"name": "Ruecksendung.pdf",
"size": 65715,
"contentType": "application/pdf"
}
],
"archives": [],
"tags": [
{
"id": "EQvJ6AHw19Y-Cdg3gF78zZk-BY2zFtNTwes-J95jpXpzhfw",
"name": "Hupe",
"category": "state",
"created": 1606427083171
},
{
"id": "4xyZoeeELdJ-tJ91GiRLinJ-7bdauy3U1jR-Bzr4VS96bGS",
"name": "Invoice",
"category": "doctype",
"created": 1594249709473
}
],
"customfields": [
{
"id": "5tYmDHin3Kx-HomKkeEVtJN-v99oKxQ8ot6-yFVrEmMayoo",
"name": "amount",
"label": "EUR",
"ftype": "money",
"value": "151.55"
},
{
"id": "3jbwbep8rDs-hNJ9ePRE7gv-21nYMbUj3eb-mKRWAr4xSS2",
"name": "invoice-number",
"label": "Invoice-Nr",
"ftype": "text",
"value": "I454602"
},
{
"id": "AH4p4NUCa9Y-EUkH66wLzxE-Rf2wJPxTAYd-DeGDm4AT4Yg",
"name": "number",
"label": "Number",
"ftype": "numeric",
"value": "0.10"
}
]
}
```
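Since the metadata is plain JSON, it can be post-processed with standard tools. For example, to list all tag names across an export (assuming `jq` is installed and the target directory from the example run above):

``` bash
find /tmp/download -name metadata.json -exec jq -r '.tags[].name' {} + | sort -u
```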

View File

@ -1,48 +0,0 @@
+++
title = "Regenerate Preview Images (⊗)"
description = "Re-generates all preview images."
weight = 130
+++
{% infobubble(mode="info", title="⚠ Please note") %}
This script is now obsolete, you can use the [**CLI tool**](../cli/) instead.
Use the `generate-previews` admin command, e.g. `dsc admin generate-previews`.
{% end %}
# regenerate-previews.sh
This is a simple bash script that triggers the endpoint which submits
tasks for generating preview images of your files. This is usually not
needed, but should you change the `preview.dpi` setting in joex's
config file, you need to regenerate the images for it to have any
effect.
# Requirements
It is a bash script that additionally needs
[curl](https://curl.haxx.se/) and
[jq](https://stedolan.github.io/jq/).
# Usage
```
./regenerate-previews.sh [docspell-base-url] [admin-secret]
```
For example, if docspell is at `http://localhost:7880`:
```
./regenerate-previews.sh http://localhost:7880 test123
```
The script asks for the admin secret if it is not given on the
command line. It then logs in and triggers the said endpoint. After
this you should see a few tasks running.
There will be one task per file to convert. All these tasks are
submitted with a low priority. So files uploaded through the webapp or
a [source](@/docs/webapp/uploading.md#anonymous-upload) with a high
priority, will be preferred as [configured in the job
executor](@/docs/joex/intro.md#scheduler-config). This is to not
disturb normal processing when many conversion tasks are being
executed.

View File

@ -1,47 +0,0 @@
+++
title = "Reset Password (⊗)"
description = "Resets a user password."
weight = 120
+++
{% infobubble(mode="info", title="⚠ Please note") %}
This script is now obsolete, you can use the [**CLI tool**](../cli/) instead.
Use the `reset-password` admin command, e.g. `dsc admin reset-password
--account "smith/john"`, where `smith` is the collective id and `john`
the username.
{% end %}
This script can be used to reset a user password. This can be done by
admins, who know the `admin-endpoint.secret` value in the
[configuration](@/docs/configure/_index.md#admin-endpoint) file.
The script is in `/tools/reset-password/reset-password.sh` and it is
only a wrapper around the admin endpoint `/admin/user/resetPassword`.
## Usage
It's very simple:
``` bash
reset-password.sh <base-url> <admin-secret> <account>
```
Three arguments are required: the docspell base url, the admin
secret and the account whose password you want to reset.
After the password has been reset, the user can login using it and
change it again in the webapp.
## Example
``` bash
./tools/reset-password/reset-password.sh http://localhost:7880 123 eike
{
"success": true,
"newPassword": "HjtpG9BFo9y",
"message": "Password updated"
}
```