The empty-trash task was wrongly stored using `RPeriodicTask` directly; the
higher-level `UserTask` must be used instead, because it ensures a
correctly scoped periodic task via the `updateOneTask` method. Since
this is a system task, it can be given a fixed ID, which now makes it
safe even when stored using `RPeriodicTask` directly.
The bug resulted in multiple empty-trash tasks being inserted (one on
each restart).
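A minimal sketch of the idea, with illustrative names rather than the actual Docspell API: storing the system task under a fixed ID turns the insert into an upsert, so repeated restarts cannot pile up duplicates.

```scala
// Sketch only; the real code goes through UserTask/updateOneTask.
final case class PeriodicTask(id: String, name: String, schedule: String)

object TaskStore {
  private val tasks = scala.collection.mutable.Map.empty[String, PeriodicTask]

  // Upserting by a fixed id means calling this on every restart
  // stores at most one empty-trash task.
  def storeSystemTask(task: PeriodicTask): Unit =
    tasks.update(task.id, task)
}

// TaskStore.storeSystemTask(PeriodicTask("empty-trash", "empty-trash", "*-*-* 00:00"))
```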
Refs: #347
In order to keep deleted items for a while, the periodic task can now
use a duration to remove only items of a certain age. This can be used
to ensure that a deleted item stays for at least X days before it is
removed from the database.
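As a sketch of the age check (names like `deletedAt` are assumptions for illustration): only items whose deletion lies further back than the configured duration are purged.

```scala
import java.time.{Duration, Instant}

// Sketch: keep deleted items for at least `minAge` before the
// periodic task removes them from the database.
final case class TrashedItem(id: String, deletedAt: Instant)

def itemsToPurge(items: List[TrashedItem], minAge: Duration, now: Instant): List[TrashedItem] =
  items.filter(item => item.deletedAt.plus(minAge).isBefore(now))
```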
Refs: #347
Before, periodic tasks were run per collective (and not per user) by
making sure that submitter and group have the same value. This is now
encoded in `UserTaskScope`, which makes the intent obvious and reduces
errors when using it.
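Roughly, the scope boils down to a small ADT like the following sketch (simplified, not the actual definition): a task belongs either to a single account or to the whole collective, and the collective case is what previously had to be emulated by setting submitter and group to the same value.

```scala
// Simplified sketch of the idea behind UserTaskScope.
sealed trait TaskScope
object TaskScope {
  final case class Account(collective: String, user: String) extends TaskScope
  final case class Collective(collective: String)            extends TaskScope

  // The storage layer still works with (group, submitter); a collective-wide
  // task uses the collective for both, but callers no longer do this by hand.
  def toGroupAndSubmitter(scope: TaskScope): (String, String) =
    scope match {
      case Account(coll, user) => (coll, user)
      case Collective(coll)    => (coll, coll)
    }
}
```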
It's not very likely that there will be more search modes besides
normal and trashed, but surprises of that kind have happened quite
often, and it's nicer this way anyway.
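The gist is a small, explicit enumeration of search modes, sketched here with illustrative names:

```scala
// Sketch: an explicit search mode, so that another mode later
// only means adding a case.
sealed trait SearchMode
object SearchMode {
  case object Normal  extends SearchMode
  case object Trashed extends SearchMode
}
```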
When deleting items via the HTTP API, they are not deleted anymore;
instead a new status "Deleted" is set. The collective insights now
contain a separate count for deleted items.
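A sketch of the soft delete and the separate count (illustrative types, not the actual schema):

```scala
// Sketch: "deleting" only flips the item state; insights count deleted items separately.
sealed trait ItemState
case object Created   extends ItemState
case object Confirmed extends ItemState
case object Deleted   extends ItemState

final case class Item(id: String, state: ItemState)

def softDelete(item: Item): Item =
  item.copy(state = Deleted)

def insights(items: List[Item]): Map[String, Int] =
  Map(
    "items"   -> items.count(_.state != Deleted),
    "deleted" -> items.count(_.state == Deleted)
  )
```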
When using fulltext-only search, only the index must be searched. This
wasn't working anymore, because the routes added a query to always
select valid items (those not being processed), which led the
downstream code to always consult the database, too. Since the routes
use a "simple-search" interface, that interface now adds the
valid-state condition only if applicable. There are still lower-level
interfaces that can be used when searching should be done differently.
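A rough sketch of the decision (not the actual interfaces): the simple-search layer adds the valid-item condition only when a database query is built, and leaves a pure fulltext query alone.

```scala
// Sketch: the routes no longer force an item-state condition into every
// query; the simple-search layer adds it only where it applies.
sealed trait SimpleQuery
final case class FulltextOnly(text: String)                  extends SimpleQuery
final case class ItemQuery(expr: String, onlyValid: Boolean) extends SimpleQuery

def buildSearch(q: SimpleQuery): String =
  q match {
    case FulltextOnly(text)     => s"index: $text" // only the index is consulted
    case ItemQuery(expr, true)  => s"db: ($expr) AND valid-state"
    case ItemQuery(expr, false) => s"db: $expr"
  }
```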
Closes: #823
This drops fomantic-ui as the CSS toolkit and introduces tailwindcss.
With tailwind there are no predefined components, but it's very easy
to create them. So customizing the look & feel is much simpler; most
of the time no additional CSS is needed.
This requires a complete rewrite of the markup and styles. Luckily,
all logic can be kept as is. The old UI is not removed; it is still
available by using the request header `Docspell-Ui` with a value of
`1` for the old UI and `2` for the new UI.
Another addition is "dev mode", where docspell serves assets with a
no-cache header, to disable browser caching. This makes developing a
lot easier.
Mails are filtered first by an IMAP search and then by some globs that
filter files and subjects. IMAP can search by subject via a
string-contains, but not via globs or patterns (afaik). The subject
filter is applied to all downloaded mail headers. For post-processing
(moving to some target folder or deleting), it can now be chosen
whether to post-process all "seen" mails or only those that matched
all filters.
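The subject glob can be pictured as a simple pattern translated to a regex and applied to the already-fetched headers (a sketch with a simplified glob syntax, not the actual matcher):

```scala
// Sketch: apply a subject glob to mail headers fetched via the IMAP search.
import java.util.regex.Pattern

def globToRegex(glob: String): scala.util.matching.Regex =
  glob.flatMap {
    case '*' => ".*"
    case '?' => "."
    case c   => Pattern.quote(c.toString)
  }.r

def subjectMatches(glob: String, subject: String): Boolean =
  globToRegex(glob).pattern.matcher(subject).matches()

// subjectMatches("*invoice*", "Your invoice 2021-03") == true
```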
Adding a new language without NLP now only requires filling in these
pieces:
- define a list of month names to support date recognition (see the sketch after this list)
- add it to joex' dockerfile so it is available for tesseract
- update the solr migration/field definitions
- update the elm file so it shows up on the client
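A hypothetical sketch of the first point, date recognition driven by a plain month-name list (names and formats are examples only, not the actual implementation):

```scala
// Sketch: find simple dates like "14. januar 2021" using a month-name list.
val monthNames: Map[String, Int] =
  Map("januar" -> 1, "februar" -> 2, "märz" -> 3 /* ... */, "dezember" -> 12)

val datePattern =
  ("""(\d{1,2})\.?\s+(""" + monthNames.keys.mkString("|") + """)\s+(\d{4})""").r

def findDates(text: String): List[(Int, Int, Int)] =
  datePattern
    .findAllMatchIn(text.toLowerCase)
    .map(m => (m.group(1).toInt, monthNames(m.group(2)), m.group(3).toInt))
    .toList

// findDates("Rechnung vom 14. Januar 2021") == List((14, 1, 2021))
```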
Improves and reorganizes how NLP pipelines are set up. Users can now
choose from many options, depending on their hardware and usage
scenario.
This is the basis for using more languages without depending on what
stanford-nlp supports. Support for such languages then involves text
extraction and simple regex-NER processing.
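As a rough sketch, with mode names that are assumptions rather than the exact configuration values: the analysis run per document is picked from a configured mode and from whether stanford-nlp supports the language, falling back to regex-NER-only processing otherwise.

```scala
// Sketch only; the real modes and their names live in Docspell's config.
sealed trait NlpMode
case object Full      extends NlpMode // full stanford-nlp pipeline
case object Basic     extends NlpMode // lighter statistical processing
case object RegexOnly extends NlpMode // regex-NER from known metadata only
case object Disabled  extends NlpMode

def selectAnalysis(mode: NlpMode, nlpSupportsLanguage: Boolean): String =
  (mode, nlpSupportsLanguage) match {
    case (Disabled, _)     => "no text analysis"
    case (_, false)        => "text extraction + regex-NER"
    case (RegexOnly, true) => "text extraction + regex-NER"
    case (Basic, true)     => "basic nlp pipeline"
    case (Full, true)      => "full nlp pipeline"
  }
```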
The scaling factor can be given in the config file. When it changes,
images can be regenerated by POSTing to certain endpoints. It is
possible to regenerate just one attachment preview or all previews
within a collective.
There is a task to generate preview images per attachment. It can
either add them (if not yet present) or overwrite them (e.g. when some
config has changed).
There is also a task that selects all attachments without previews and
submits a task to create one for each. This is submitted automatically
on start to generate previews for all existing attachments.
When `base-url` is the default (i.e. localhost), the cookie is now
configured with the domain of the incoming request, and the webapp is
configured to run requests against the host in the browser's address
bar.
A request to cancel a job was not processed correctly. The cancel
routine of a task must run regardless of its (non-final) state. Now it
works like this: if a job is currently running, it is interrupted and
its cancel routine is invoked; it then enters the "cancelled" state.
If it is stuck, it is loaded and only its cancel routine is run. If it
is in a final state or waiting, it is removed from the queue.
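The resulting decision per job state can be sketched like this (state names simplified):

```scala
// Sketch of the cancellation handling described above.
sealed trait JobState
case object Waiting   extends JobState
case object Running   extends JobState
case object Stuck     extends JobState
case object Cancelled extends JobState // final
case object Failed    extends JobState // final
case object Success   extends JobState // final

def onCancelRequest(state: JobState): String =
  state match {
    case Running => "interrupt the job, run its cancel routine, set state to cancelled"
    case Stuck   => "load the job and run only its cancel routine"
    case Waiting | Cancelled | Failed | Success =>
      "remove the job from the queue"
  }
```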
Invalid items are those that are not ready and are not shown to the
user. When changing metadata, it should only be changed if the item
has not already been shown to the user.
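A minimal sketch of that guard, assuming a `ready` flag stands for "shown to the user" (names are illustrative):

```scala
// Sketch: metadata suggested by (re)processing is only applied while the
// item has not yet been shown to the user.
final case class ItemMeta(id: String, ready: Boolean, correspondent: Option[String])

def applySuggestion(item: ItemMeta, suggestion: String): ItemMeta =
  if (item.ready) item // already visible to the user: keep its metadata
  else item.copy(correspondent = Some(suggestion))
```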