
feat(server): separate face clustering job (#5598)

* separate facial clustering job

* update api

* fixed some tests

* invert clustering

* hdbscan

* update api

* remove commented code

* wip dbscan

* cleanup

removed cluster endpoint

remove commented code

* fixes

updated tests

minor fixes and formatting

fixed queuing

refinements

* scale search range based on library size

* defer non-core faces

* optimizations

removed unused query option

* assign faces individually for correctness

fixed unit tests

remove unused method

* don't select face embedding

update sql

linting

fixed ml typing

* updated job mock

* paginate people query

* select face embeddings because typeorm

* fix setting face detection concurrency

* update sql

formatting

linting

* simplify logic

remove unused imports

* more specific delete signature

* more accurate typing for face stubs

* add migration

formatting

* chore: better typing

* don't select embedding by default

remove unused import

* updated sql

* use normal try/catch

* stricter concurrency typing and enforcement

* update api

* update job concurrency panel to show disabled queues

formatting

* check jobId in queueAll

fix tests

* remove outdated comment

* better facial recognition icon

* wording

wording

formatting

* fixed tests

* fix

* formatting & sql

* try to fix sql check

* more detailed description

* update sql

* formatting

* wording

* update `minFaces` description

---------

Co-authored-by: Jason Rasmussen <jrasm91@gmail.com>
Co-authored-by: Alex Tran <alex.tran1502@gmail.com>
Authored by Mert on 2024-01-18 00:08:48 -05:00; committed by GitHub.
parent 44873b4224
commit 68f52818ae
57 changed files with 1023 additions and 590 deletions


@@ -231,12 +231,12 @@ Immich optionally uses machine learning for several features. However, it can be
### Can I lower CPU and RAM usage?
The initial backup is the most intensive due to the number of jobs running. The most CPU-intensive ones are transcoding and machine learning jobs (Tag Images, Smart Search, Recognize Faces), and to a lesser extent thumbnail generation. Here are some ways to lower their CPU usage:
The initial backup is the most intensive due to the number of jobs running. The most CPU-intensive ones are transcoding and machine learning jobs (Smart Search, Face Detection), and to a lesser extent thumbnail generation. Here are some ways to lower their CPU usage:
- Lower the job concurrency for these jobs to 1.
- Under Settings > Transcoding Settings > Threads, set the number of threads to a low number like 1 or 2.
- Under Settings > Machine Learning Settings > Facial Recognition > Model Name, you can change the facial recognition model to `buffalo_s` instead of `buffalo_l`. The former is a smaller and faster model, albeit not as good.
- You _must_ re-run the Recognize Faces job for all images after this for facial recognition on new images to work properly.
- You _must_ re-run the Face Detection job for all images after this for facial recognition on new images to work properly.
- If these changes are not enough, see [below](/docs/FAQ#how-can-i-disable-machine-learning) for how you can disable machine learning.
### Can I limit the amount of CPU and RAM usage?
@@ -247,10 +247,10 @@ You can look at the [original docker docs](https://docs.docker.com/config/contai
### How can I boost machine learning speed?
:::note
This advice increases throughput, not latency. This is to say that it will make Smart Search jobs process more quickly, but it won't make searching faster.
This advice improves throughput, not latency. This is to say that it will make Smart Search jobs process more quickly, but it won't make searching faster.
:::
You can increase throughput by increasing the job concurrency for machine learning jobs (Smart Search, Recognize Faces). With higher concurrency, the host will work on more assets in parallel. You can do this by navigating to Administration > Settings > Job Settings and increasing concurrency as needed.
You can increase throughput by increasing the job concurrency for machine learning jobs (Smart Search, Face Detection). With higher concurrency, the host will work on more assets in parallel. You can do this by navigating to Administration > Settings > Job Settings and increasing concurrency as needed.
:::danger
On a normal machine, 2 or 3 concurrent jobs can probably max the CPU, so if you're not hitting those maximums with, say, 30 jobs.
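
The concurrency values this FAQ refers to live in the `job` section of the system config, whose generated SDK types are updated later in this diff. A minimal sketch of a low-CPU starting point, assuming `JobSettingsDto` is just `{ concurrency: number }` and with an illustrative import path:

import type { SystemConfigJobDto } from './api'; // generated client from this diff; path illustrative

// Keep the CPU-heavy machine learning queues at one job at a time. There is no
// facialRecognition entry because this PR removes it from the concurrency
// settings entirely (that queue is pinned to a concurrency of 1).
const lowCpuJobSettings: Partial<SystemConfigJobDto> = {
  smartSearch: { concurrency: 1 },
  faceDetection: { concurrency: 1 },
  thumbnailGeneration: { concurrency: 2 },
};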


@@ -79,7 +79,7 @@ The default configuration looks like this:
"modelName": "buffalo_l",
"minScore": 0.7,
"maxDistance": 0.6,
"minFaces": 1
"minFaces": 3
}
},
"map": {


@@ -6,7 +6,7 @@ import threading
import time
from concurrent.futures import ThreadPoolExecutor
from contextlib import asynccontextmanager
from typing import Any, AsyncGenerator, Iterator
from typing import Any, AsyncGenerator, Callable, Iterator
from zipfile import BadZipFile
import orjson
@@ -105,14 +105,14 @@ async def predict(
model = await load(await model_cache.get(model_name, model_type, **kwargs))
model.configure(**kwargs)
outputs = await run(model, inputs)
outputs = await run(model.predict, inputs)
return ORJSONResponse(outputs)
async def run(model: InferenceModel, inputs: Any) -> Any:
async def run(func: Callable[..., Any], inputs: Any) -> Any:
if thread_pool is None:
return model.predict(inputs)
return await asyncio.get_running_loop().run_in_executor(thread_pool, model.predict, inputs)
return func(inputs)
return await asyncio.get_running_loop().run_in_executor(thread_pool, func, inputs)
async def load(model: InferenceModel) -> InferenceModel:


@@ -1,4 +1,4 @@
# This file is automatically @generated by Poetry 1.6.1 and should not be changed by hand.
# This file is automatically @generated by Poetry 1.7.0 and should not be changed by hand.
[[package]]
name = "aiocache"
@@ -588,62 +588,53 @@ tests = ["pytest", "pytest-cov", "pytest-xdist"]
[[package]]
name = "cython"
version = "3.0.7"
description = "The Cython compiler for writing C extensions in the Python language."
version = "0.29.37"
description = "The Cython compiler for writing C extensions for the Python language."
optional = false
python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
python-versions = ">=2.6, !=3.0.*, !=3.1.*, !=3.2.*"
files = [
{file = "Cython-3.0.7-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e3c0e19bb41de6be9d8afc85795159ca16296be81a586cd9588be0400d44a855"},
{file = "Cython-3.0.7-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4e8bf00ec1dd1d92e9ae74d2e6891f087a939e1dfb40c9c7fa5d8d6a26c94f5a"},
{file = "Cython-3.0.7-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:cd6ae43ef2e596c9a88dbf2a8895be2e32cc2f5bc3c8ba2e7753b69068fc0b2d"},
{file = "Cython-3.0.7-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:f674be92673e87dd8ee7cfe553d5960ec4effc5ab15063b9a5e265a51585a31a"},
{file = "Cython-3.0.7-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:861cf254bf5836d47c2aee86aa75dd93d3de00ccd1b077c3c7a2bb22cba358e7"},
{file = "Cython-3.0.7-cp310-cp310-win32.whl", hash = "sha256:f6d8ff62ad55dc0393686438eac4b457a916e4d1118a0b550746bb52b4c756cc"},
{file = "Cython-3.0.7-cp310-cp310-win_amd64.whl", hash = "sha256:e13abb14843397b76d0472c7d33cd260d5f262ab05cc27ed423317e645e29643"},
{file = "Cython-3.0.7-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0c636c9ab92c7838231a1ba769e519d953af8294612f3f772a54d3a5250ff23f"},
{file = "Cython-3.0.7-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:22d2a684122dfb531853d57c8c85c1d5d44be709e12466dca99fa6aee7d8054f"},
{file = "Cython-3.0.7-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e1bdf8a107fdf9e174991aa87a0be7504f60de1ec6bfb1ccfb30e33acac818a0"},
{file = "Cython-3.0.7-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:3a83e04fde663b84905f3a20213a4333d13a07b79434300704b70dc552761f8b"},
{file = "Cython-3.0.7-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:e34b4b08d795ccca920fa26b099558f4f1e4e3f794e4ba8d3433c5bc2454d50a"},
{file = "Cython-3.0.7-cp311-cp311-win32.whl", hash = "sha256:133057ac45b6fa7fe5d7baada9d3545d09339432f75c0545f556e8c6fecc2932"},
{file = "Cython-3.0.7-cp311-cp311-win_amd64.whl", hash = "sha256:b65abca78aa5ebc8675c8480b9a53006f6efea9910ad099cf32c9fb5617ef251"},
{file = "Cython-3.0.7-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:23ceac5315fe899c229e874328742154e331fa41337bb03f6f5264636c351c9e"},
{file = "Cython-3.0.7-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8ea936cf5931297ba07bce121388c4c6266c1b63a9f4d648ae16c92ff090204b"},
{file = "Cython-3.0.7-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:9fcd9a18ee3ac7f460e0841954feb495102ffbdbec0e6c78562f3495cda000dd"},
{file = "Cython-3.0.7-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:7c8d579d13cb81abe704c8b0908d122b81d6e2623265a19c4a6a7377f440debb"},
{file = "Cython-3.0.7-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:ef5bb0268bfe5992da3ef9292463a5a895ed8700b134ed2c00008d5471b3ba6e"},
{file = "Cython-3.0.7-cp312-cp312-win32.whl", hash = "sha256:55f93d3822bc196b37a8bdfa4ec6a35232a399e97f2baa714bd5ed8ea9b0ce68"},
{file = "Cython-3.0.7-cp312-cp312-win_amd64.whl", hash = "sha256:f3845c4506e0d207c5e268fb02813928f3a1e135de954a379f165ef0d581da47"},
{file = "Cython-3.0.7-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8ad7c2303a338b2c0b6c6c68f101a6768725934538756096cf3388a5c07a7525"},
{file = "Cython-3.0.7-cp36-cp36m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fed25959e4025870fdde5f895fcb126196d22affd4f4fad85a2823e0dddc85b0"},
{file = "Cython-3.0.7-cp36-cp36m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:79868ec74e4907a8a6e63effe13547c6157f196a162920b1de066da5849ffb8e"},
{file = "Cython-3.0.7-cp36-cp36m-musllinux_1_1_aarch64.whl", hash = "sha256:5e3a038332973b12e72236e8884dc99601a840334c2c46cfbbb5851cb94166eb"},
{file = "Cython-3.0.7-cp36-cp36m-musllinux_1_1_x86_64.whl", hash = "sha256:f2602a5c97a3d618b3b847514204ef3349fb414c59e1126c0c2c708d2c5680f8"},
{file = "Cython-3.0.7-cp36-cp36m-win32.whl", hash = "sha256:539ad5a21141e6420035cf616bcba48d999bf878839e52692f97fc7e2f16265c"},
{file = "Cython-3.0.7-cp36-cp36m-win_amd64.whl", hash = "sha256:848a28ea49166454c3bff927e5a47629eecf1aa755d6fb3290569cba0fc93766"},
{file = "Cython-3.0.7-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:82f27a0134fc6bb46032ca5f728d8af984f3be94a3cb01cb70ff1224e551b9cf"},
{file = "Cython-3.0.7-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:79f20c61114c7948cf1214585066406cef4b54a9b935160980e0b6e70ada3a69"},
{file = "Cython-3.0.7-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:34d51709e10ad6213b4bf094af7be7ff82bab43216b3c92a07d05b451deeca79"},
{file = "Cython-3.0.7-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:3f02c7240abab48d59f0d5fef7064f18f01a2a204616165fa6367a8abf5a8832"},
{file = "Cython-3.0.7-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:225f8bba6428b8d711ca2d6c738d2e3a4667f6a2ae40f8a7a5256f69f6a3600e"},
{file = "Cython-3.0.7-cp37-cp37m-win32.whl", hash = "sha256:30eb2d2938b9195e2c82951713429aff3ad1be9f104437d1536a04eb0cb3dc0e"},
{file = "Cython-3.0.7-cp37-cp37m-win_amd64.whl", hash = "sha256:167b3f3894dcc697cefefac1d198304fae8eb4d5860a7b8bc2459d572e838470"},
{file = "Cython-3.0.7-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2c67105f2c6ccf5b3adbcfaecf3c5c9fa8940f9f97955c9ad7d2542151d97d93"},
{file = "Cython-3.0.7-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6a1859af761977530df2cd5c36e31d54e8d6708ad2c4656e7125c482364dc216"},
{file = "Cython-3.0.7-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:01b94304aab87496e81d1f546e71abf57b430b39be4269df1cd7da9928d70b5b"},
{file = "Cython-3.0.7-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:931aade65f77cf59f2a702ac1f549a4836ce221107c740502cbad18d6d8e9511"},
{file = "Cython-3.0.7-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:812b193c26553f1f375d4f1c50f805c227b24ed2d595bc9cdaf78c992ecc64a4"},
{file = "Cython-3.0.7-cp38-cp38-win32.whl", hash = "sha256:b227643d8a40b68554dc7d37fcd03fc97b4fb0bd2614aeb5f2e07ab244642d36"},
{file = "Cython-3.0.7-cp38-cp38-win_amd64.whl", hash = "sha256:0d8a98c7d86ac4d05b251c39faf49423780381aab55fbf2e147f6e006a34a58a"},
{file = "Cython-3.0.7-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:816f5285d596062c7ef22790de7d75354b58d4417a9fc64cba914aeeb900db0b"},
{file = "Cython-3.0.7-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b9d0dae6dccd349b8ccf197c10ef2d05c711ca36a649c7eddbab1de2c90b63a1"},
{file = "Cython-3.0.7-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:13211b67b29f6ed8e87c137496c73d93aff0330d97940b4fbed72eae37a4a2a0"},
{file = "Cython-3.0.7-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:b1853bc34ced5ff6473e881fcf6de29da83262552c8f268a0df53b49c2b89e2c"},
{file = "Cython-3.0.7-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:51e8164b1270625ff101e95c3c1c234421520c07a0a3a20ded9e9431d98afce7"},
{file = "Cython-3.0.7-cp39-cp39-win32.whl", hash = "sha256:45319d2471f4dbf19893ca53785a421107266e18b8cccd2054fce1e3f72a85f1"},
{file = "Cython-3.0.7-cp39-cp39-win_amd64.whl", hash = "sha256:612d83fd1eb5aaa5401a755c1f1aafacd9dab404cd350b90d5f404c98b33e4b3"},
{file = "Cython-3.0.7-py2.py3-none-any.whl", hash = "sha256:936ec37b261b226d7404eff23a9aad284098338150d42a53d6a9af12b18d3892"},
{file = "Cython-3.0.7.tar.gz", hash = "sha256:fb299acf3a578573c190c858d49e0cf9d75f4bc49c3f24c5a63804997ef09213"},
{file = "Cython-0.29.37-cp27-cp27m-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:f2d621fe4cb50007446742134a890500b34e3f50abaf7993baaca02634af7e15"},
{file = "Cython-0.29.37-cp27-cp27m-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:d94caf90ae9cb56116ca6d54cdcbccd3c4df6b0cb7233922b2233ee7fe81d05b"},
{file = "Cython-0.29.37-cp27-cp27mu-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:852cd4378cbc9ade02f53709107ff9fdad55019a3a636e8a27663ba6cfce10b6"},
{file = "Cython-0.29.37-cp27-cp27mu-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:bbce388431a2608a81c8ab13cb14c50611473843ca766031b8b24bb1723faf79"},
{file = "Cython-0.29.37-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.manylinux_2_24_aarch64.whl", hash = "sha256:4658499a41255431f6bbdca7e634e9c8d3a4c190bf24b4aa1646dac751d3da4d"},
{file = "Cython-0.29.37-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_24_x86_64.whl", hash = "sha256:12192ab269e7185720f2d2f8894587bf1da4276db1b9b869e4622a093f18cae6"},
{file = "Cython-0.29.37-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_24_i686.whl", hash = "sha256:9450e0766ab65947f8a2a36f9e59079fc879c3807ec936c61725a48c97741a52"},
{file = "Cython-0.29.37-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:177481b0a7e003e5c49e2bf0dda1d6fe610c239f17642a5da9f18c2ad0c5f6b6"},
{file = "Cython-0.29.37-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.manylinux_2_24_aarch64.whl", hash = "sha256:b048354fd380278f2fa096e7526973beb6e0491a9d44d7e4e29df52612d25776"},
{file = "Cython-0.29.37-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_24_x86_64.whl", hash = "sha256:ea6d208be1906c5df25b674777d5905c6d8e9ef0b201b830849e0729ba08caba"},
{file = "Cython-0.29.37-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_24_i686.whl", hash = "sha256:af03854571738307a5f30cc6b724081d72db12f907699e7fdfc04c12c839158e"},
{file = "Cython-0.29.37-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:c33508ede9172a6f6f99d5a6dadc7fee23c840423b411ef8b5a403c04e530297"},
{file = "Cython-0.29.37-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:e8af5975ecfae254d8c0051204fca995dda8f93cf9f0bbf7571e3cda2b0cef4d"},
{file = "Cython-0.29.37-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:29415d8eb2fdc1ea518ca4810c50a2d062b387d4c9fbcfb3352346e93db22c6d"},
{file = "Cython-0.29.37-cp35-cp35m-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:fe0eaf6b1e9ee97c5ee7bfc943f00e36cf59d929db16886cb018352bff8208da"},
{file = "Cython-0.29.37-cp35-cp35m-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:cc1b9ce2b73b9ee8c305e06173b35c7c202d4b82d084a0cd73dcedfd6d310aec"},
{file = "Cython-0.29.37-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.manylinux_2_24_aarch64.whl", hash = "sha256:2618af0b8df26d32ee4e8858d4ad8167546596762620aeade84954ae37194a0e"},
{file = "Cython-0.29.37-cp36-cp36m-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_24_x86_64.whl", hash = "sha256:ac910a28a2fd3d280faf3077b6fe63b97a4b93994ff05647581846f0e4b2f8d1"},
{file = "Cython-0.29.37-cp36-cp36m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_24_i686.whl", hash = "sha256:8bf38373773f967cfd793997a6fb96cf972d41a9fce987ace5767349d6f15572"},
{file = "Cython-0.29.37-cp36-cp36m-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:6cddb567dadb3aa3e280a8a35e5126030915ea744c2812206e9c194b8881475d"},
{file = "Cython-0.29.37-cp36-cp36m-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:79ecfc48694e156402c05561e0adb0e25a6e9d35ac0b41693733a08219d38c58"},
{file = "Cython-0.29.37-cp36-cp36m-musllinux_1_1_x86_64.whl", hash = "sha256:9a455347e20ddfad0c5dfee32a3e855ee96811269e5fd86be622ddc4cb326404"},
{file = "Cython-0.29.37-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.manylinux_2_24_aarch64.whl", hash = "sha256:fa5b6a0f69bf1823c9fd038fa77a2568b78fda2de045a95b48a71dee4d0d578f"},
{file = "Cython-0.29.37-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_24_x86_64.whl", hash = "sha256:a6164a05440dcd9daa760c6488bc91bdac1380c7b4b3aca38cf307ba66042d54"},
{file = "Cython-0.29.37-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_24_i686.whl", hash = "sha256:562f8f911dbd6f1a1b9be8f6cba097125700355688f613994ccd4406f220557a"},
{file = "Cython-0.29.37-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:8c39c2f5a0fe29bb01de9b1fb449bf65bed6f192317c677f181732791c63fe28"},
{file = "Cython-0.29.37-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:0a0a6d5972bb3b8c7363cf19a42a988bb0c0bb5ebd9c736c84eca85113ccfdbe"},
{file = "Cython-0.29.37-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:b82584836e9e7c0d6effee976595e5cd7fa88dbef3e96e900187983c1d4637d1"},
{file = "Cython-0.29.37-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.manylinux_2_24_aarch64.whl", hash = "sha256:b6c48f1032b379135a5b4a31976d6c468e02490688acf9254c6c8ed27bd4cbd4"},
{file = "Cython-0.29.37-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_24_x86_64.whl", hash = "sha256:3f87bef1808d255cf13be378c7ad27ae7c6db6df7732217d32428d1daf4109be"},
{file = "Cython-0.29.37-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_24_i686.whl", hash = "sha256:9e68bafeeb97d5a403fb1f7700bd4a55a1f8989824c323ae02ae8a4fcd88f6a1"},
{file = "Cython-0.29.37-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:e14cd44c830e53cf9d7269c87a6bcc638bb065ec07e24990e338162c7001d3c3"},
{file = "Cython-0.29.37-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:0544f7a3e4437b89b356baa15387494c18214e03f2ffaddada5a2c71c3dfd24b"},
{file = "Cython-0.29.37-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:2de3e729d25f041036e81e2f15683dd129f977dfb5b06267e30e8d7acec43225"},
{file = "Cython-0.29.37-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.manylinux_2_24_aarch64.whl", hash = "sha256:2ad634dc77a6a74022881826099eccac19c9b79153942cc82e754ffac2bec116"},
{file = "Cython-0.29.37-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_24_x86_64.whl", hash = "sha256:e841a8b4f9ceefb2916e32dac4f28a895cd519e8ece71505144da1ee355c548a"},
{file = "Cython-0.29.37-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_24_i686.whl", hash = "sha256:6c672089fba6a8f6690b8d7924a58c04477771401ad101d53171a13405ee12cb"},
{file = "Cython-0.29.37-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:0301d4739c6894e012f1d410052082fdda9e63888c815d9e23e0f7f82fff7d79"},
{file = "Cython-0.29.37-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:af8e7b4397620e2d18259a11f3bfa026eff9846657e397d02616962dd5dd035a"},
{file = "Cython-0.29.37-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:b225d5e2091c224d4ab328165fef224ba3919b3ed44bd9b3241416f523b4d51a"},
{file = "Cython-0.29.37-py2.py3-none-any.whl", hash = "sha256:95f1d6a83ef2729e67b3fa7318c829ce5b07ac64c084cd6af11c228e0364662c"},
{file = "Cython-0.29.37.tar.gz", hash = "sha256:f813d4a6dd94adee5d4ff266191d1d95bf6d4164a4facc535422c021b2504cfb"},
]
[[package]]

8 binary files changed (contents not shown).


@@ -6503,6 +6503,12 @@
"backgroundTask": {
"$ref": "#/components/schemas/JobStatusDto"
},
"faceDetection": {
"$ref": "#/components/schemas/JobStatusDto"
},
"facialRecognition": {
"$ref": "#/components/schemas/JobStatusDto"
},
"library": {
"$ref": "#/components/schemas/JobStatusDto"
},
@@ -6512,9 +6518,6 @@
"migration": {
"$ref": "#/components/schemas/JobStatusDto"
},
"recognizeFaces": {
"$ref": "#/components/schemas/JobStatusDto"
},
"search": {
"$ref": "#/components/schemas/JobStatusDto"
},
@@ -6543,7 +6546,8 @@
"migration",
"backgroundTask",
"search",
"recognizeFaces",
"faceDetection",
"facialRecognition",
"sidecar",
"library"
],
@@ -7831,7 +7835,8 @@
"thumbnailGeneration",
"metadataExtraction",
"videoConversion",
"recognizeFaces",
"faceDetection",
"facialRecognition",
"smartSearch",
"backgroundTask",
"storageTemplateMigration",
@@ -8466,13 +8471,15 @@
"type": "boolean"
},
"maxDistance": {
"type": "integer"
"format": "float",
"type": "number"
},
"minFaces": {
"type": "integer"
},
"minScore": {
"type": "integer"
"format": "float",
"type": "number"
},
"modelName": {
"type": "string"
@@ -9212,6 +9219,9 @@
"backgroundTask": {
"$ref": "#/components/schemas/JobSettingsDto"
},
"faceDetection": {
"$ref": "#/components/schemas/JobSettingsDto"
},
"library": {
"$ref": "#/components/schemas/JobSettingsDto"
},
@@ -9221,9 +9231,6 @@
"migration": {
"$ref": "#/components/schemas/JobSettingsDto"
},
"recognizeFaces": {
"$ref": "#/components/schemas/JobSettingsDto"
},
"search": {
"$ref": "#/components/schemas/JobSettingsDto"
},
@@ -9248,7 +9255,7 @@
"migration",
"backgroundTask",
"search",
"recognizeFaces",
"faceDetection",
"sidecar",
"library"
],


@@ -355,6 +355,18 @@ export interface AllJobStatusResponseDto {
* @memberof AllJobStatusResponseDto
*/
'backgroundTask': JobStatusDto;
/**
*
* @type {JobStatusDto}
* @memberof AllJobStatusResponseDto
*/
'faceDetection': JobStatusDto;
/**
*
* @type {JobStatusDto}
* @memberof AllJobStatusResponseDto
*/
'facialRecognition': JobStatusDto;
/**
*
* @type {JobStatusDto}
@@ -373,12 +385,6 @@ export interface AllJobStatusResponseDto {
* @memberof AllJobStatusResponseDto
*/
'migration': JobStatusDto;
/**
*
* @type {JobStatusDto}
* @memberof AllJobStatusResponseDto
*/
'recognizeFaces': JobStatusDto;
/**
*
* @type {JobStatusDto}
@@ -1982,7 +1988,8 @@ export const JobName = {
ThumbnailGeneration: 'thumbnailGeneration',
MetadataExtraction: 'metadataExtraction',
VideoConversion: 'videoConversion',
RecognizeFaces: 'recognizeFaces',
FaceDetection: 'faceDetection',
FacialRecognition: 'facialRecognition',
SmartSearch: 'smartSearch',
BackgroundTask: 'backgroundTask',
StorageTemplateMigration: 'storageTemplateMigration',
@@ -3774,6 +3781,12 @@ export interface SystemConfigJobDto {
* @memberof SystemConfigJobDto
*/
'backgroundTask': JobSettingsDto;
/**
*
* @type {JobSettingsDto}
* @memberof SystemConfigJobDto
*/
'faceDetection': JobSettingsDto;
/**
*
* @type {JobSettingsDto}
@@ -3792,12 +3805,6 @@ export interface SystemConfigJobDto {
* @memberof SystemConfigJobDto
*/
'migration': JobSettingsDto;
/**
*
* @type {JobSettingsDto}
* @memberof SystemConfigJobDto
*/
'recognizeFaces': JobSettingsDto;
/**
*
* @type {JobSettingsDto}


@@ -86,6 +86,7 @@ export const testApp = {
getJobCounts: jest.fn(),
pause: jest.fn(),
clear: jest.fn(),
waitForQueueCompletion: jest.fn(),
} as IJobRepository)
.compile();


@@ -204,16 +204,20 @@ export class AuditService {
}
}
const people = await this.personRepository.getAll();
for (const { id, thumbnailPath } of people) {
track(thumbnailPath);
const entity = { entityId: id, entityType: PathEntityType.PERSON };
if (thumbnailPath && !hasFile(thumbFiles, thumbnailPath)) {
orphans.push({ ...entity, pathType: PersonPathType.FACE, pathValue: thumbnailPath });
const personPagination = usePagination(JOBS_ASSET_PAGINATION_SIZE, (pagination) =>
this.personRepository.getAll(pagination),
);
for await (const people of personPagination) {
for (const { id, thumbnailPath } of people) {
track(thumbnailPath);
const entity = { entityId: id, entityType: PathEntityType.PERSON };
if (thumbnailPath && !hasFile(thumbFiles, thumbnailPath)) {
orphans.push({ ...entity, pathType: PersonPathType.FACE, pathValue: thumbnailPath });
}
}
}
this.logger.log(`Found ${assetCount} assets, ${users.length} users, ${people.length} people`);
this.logger.log(`Found ${assetCount} assets, ${users.length} users, ${people.length} people`);
}
const extras: string[] = [];
for (const file of allFiles) {


@@ -2,7 +2,8 @@ export enum QueueName {
THUMBNAIL_GENERATION = 'thumbnailGeneration',
METADATA_EXTRACTION = 'metadataExtraction',
VIDEO_CONVERSION = 'videoConversion',
RECOGNIZE_FACES = 'recognizeFaces',
FACE_DETECTION = 'faceDetection',
FACIAL_RECOGNITION = 'facialRecognition',
SMART_SEARCH = 'smartSearch',
BACKGROUND_TASK = 'backgroundTask',
STORAGE_TEMPLATE_MIGRATION = 'storageTemplateMigration',
@@ -12,6 +13,11 @@ export enum QueueName {
LIBRARY = 'library',
}
export type ConcurrentQueueName = Exclude<
QueueName,
QueueName.STORAGE_TEMPLATE_MIGRATION | QueueName.FACIAL_RECOGNITION
>;
export enum JobCommand {
START = 'start',
PAUSE = 'pause',
@@ -57,9 +63,10 @@ export enum JobName {
// facial recognition
PERSON_CLEANUP = 'person-cleanup',
PERSON_DELETE = 'person-delete',
QUEUE_RECOGNIZE_FACES = 'queue-recognize-faces',
RECOGNIZE_FACES = 'recognize-faces',
QUEUE_FACE_DETECTION = 'queue-face-detection',
FACE_DETECTION = 'face-detection',
QUEUE_FACIAL_RECOGNITION = 'queue-facial-recognition',
FACIAL_RECOGNITION = 'facial-recognition',
// library management
LIBRARY_SCAN = 'library-refresh',
@@ -95,7 +102,6 @@ export const JOBS_TO_QUEUE: Record<JobName, QueueName> = {
[JobName.DELETE_FILES]: QueueName.BACKGROUND_TASK,
[JobName.CLEAN_OLD_AUDIT_LOGS]: QueueName.BACKGROUND_TASK,
[JobName.PERSON_CLEANUP]: QueueName.BACKGROUND_TASK,
[JobName.PERSON_DELETE]: QueueName.BACKGROUND_TASK,
[JobName.USER_SYNC_USAGE]: QueueName.BACKGROUND_TASK,
// conversion
@@ -124,8 +130,10 @@ export const JOBS_TO_QUEUE: Record<JobName, QueueName> = {
[JobName.MIGRATE_PERSON]: QueueName.MIGRATION,
// facial recognition
[JobName.QUEUE_RECOGNIZE_FACES]: QueueName.RECOGNIZE_FACES,
[JobName.RECOGNIZE_FACES]: QueueName.RECOGNIZE_FACES,
[JobName.QUEUE_FACE_DETECTION]: QueueName.FACE_DETECTION,
[JobName.FACE_DETECTION]: QueueName.FACE_DETECTION,
[JobName.QUEUE_FACIAL_RECOGNITION]: QueueName.FACIAL_RECOGNITION,
[JobName.FACIAL_RECOGNITION]: QueueName.FACIAL_RECOGNITION,
// clip
[JobName.QUEUE_ENCODE_CLIP]: QueueName.SMART_SEARCH,
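
The new `ConcurrentQueueName` type above is what the commit message means by "stricter concurrency typing and enforcement": configuration keyed on it cannot mention the storage template migration or facial recognition queues. A minimal sketch of that compile-time guarantee (the `JobSettings` shape is assumed for illustration):

import { ConcurrentQueueName, QueueName } from './job.constants'; // the file shown above

type JobSettings = { concurrency: number }; // shape assumed for illustration

// Keys outside ConcurrentQueueName are rejected at compile time:
const queueConcurrency: Partial<Record<ConcurrentQueueName, JobSettings>> = {
  [QueueName.THUMBNAIL_GENERATION]: { concurrency: 5 },
  [QueueName.FACE_DETECTION]: { concurrency: 2 },
  // [QueueName.FACIAL_RECOGNITION]: { concurrency: 2 }, // error: excluded by the type
};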


@@ -75,7 +75,10 @@ export class AllJobStatusResponseDto implements Record<QueueName, JobStatusDto>
[QueueName.SEARCH]!: JobStatusDto;
@ApiProperty({ type: JobStatusDto })
[QueueName.RECOGNIZE_FACES]!: JobStatusDto;
[QueueName.FACE_DETECTION]!: JobStatusDto;
@ApiProperty({ type: JobStatusDto })
[QueueName.FACIAL_RECOGNITION]!: JobStatusDto;
@ApiProperty({ type: JobStatusDto })
[QueueName.SIDECAR]!: JobStatusDto;


@@ -35,3 +35,7 @@ export interface ISidecarWriteJob extends IEntityJob {
latitude?: number;
longitude?: number;
}
export interface IDeferrableJob extends IEntityJob {
deferred?: boolean;
}
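
`IDeferrableJob` carries the flag behind the "defer non-core faces" bullet in the commit message: a face that cannot yet be clustered is pushed to the back of the queue exactly once. A simplified sketch of that pattern, with the neighbor check and assignment step as hypothetical stand-ins for the real service code:

import { IDeferrableJob, JobName } from '../job'; // same import path the service uses

// Stand-ins for the real dependencies; names and signatures are illustrative.
declare const jobRepository: { queue(item: { name: JobName; data: IDeferrableJob }): Promise<void> };
declare function isCorePoint(faceId: string): Promise<boolean>; // "enough close neighbors?" check
declare function assignFace(faceId: string): Promise<boolean>;  // the actual clustering step

// IEntityJob is assumed to carry the entity id.
async function handleFacialRecognition(job: IDeferrableJob & { id: string }): Promise<boolean> {
  if (!(await isCorePoint(job.id)) && !job.deferred) {
    // Re-queue exactly once, so core points are clustered before borderline faces.
    await jobRepository.queue({ name: JobName.FACIAL_RECOGNITION, data: { id: job.id, deferred: true } });
    return true;
  }
  return assignFace(job.id);
}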


@@ -104,7 +104,8 @@ describe(JobService.name, () => {
[QueueName.MIGRATION]: expectedJobStatus,
[QueueName.THUMBNAIL_GENERATION]: expectedJobStatus,
[QueueName.VIDEO_CONVERSION]: expectedJobStatus,
[QueueName.RECOGNIZE_FACES]: expectedJobStatus,
[QueueName.FACE_DETECTION]: expectedJobStatus,
[QueueName.FACIAL_RECOGNITION]: expectedJobStatus,
[QueueName.SIDECAR]: expectedJobStatus,
[QueueName.LIBRARY]: expectedJobStatus,
});
@@ -189,12 +190,20 @@ describe(JobService.name, () => {
expect(jobMock.queue).toHaveBeenCalledWith({ name: JobName.QUEUE_GENERATE_THUMBNAILS, data: { force: false } });
});
it('should handle a start recognize faces command', async () => {
it('should handle a start face detection command', async () => {
jobMock.getQueueStatus.mockResolvedValue({ isActive: false, isPaused: false });
await sut.handleCommand(QueueName.RECOGNIZE_FACES, { command: JobCommand.START, force: false });
await sut.handleCommand(QueueName.FACE_DETECTION, { command: JobCommand.START, force: false });
expect(jobMock.queue).toHaveBeenCalledWith({ name: JobName.QUEUE_RECOGNIZE_FACES, data: { force: false } });
expect(jobMock.queue).toHaveBeenCalledWith({ name: JobName.QUEUE_FACE_DETECTION, data: { force: false } });
});
it('should handle a start facial recognition command', async () => {
jobMock.getQueueStatus.mockResolvedValue({ isActive: false, isPaused: false });
await sut.handleCommand(QueueName.FACIAL_RECOGNITION, { command: JobCommand.START, force: false });
expect(jobMock.queue).toHaveBeenCalledWith({ name: JobName.QUEUE_FACIAL_RECOGNITION, data: { force: false } });
});
it('should throw a bad request when an invalid queue is used', async () => {
@@ -224,7 +233,7 @@ describe(JobService.name, () => {
[QueueName.BACKGROUND_TASK]: { concurrency: 10 },
[QueueName.SMART_SEARCH]: { concurrency: 10 },
[QueueName.METADATA_EXTRACTION]: { concurrency: 10 },
[QueueName.RECOGNIZE_FACES]: { concurrency: 10 },
[QueueName.FACE_DETECTION]: { concurrency: 10 },
[QueueName.SEARCH]: { concurrency: 10 },
[QueueName.SIDECAR]: { concurrency: 10 },
[QueueName.LIBRARY]: { concurrency: 10 },
@@ -237,7 +246,7 @@ describe(JobService.name, () => {
expect(jobMock.setConcurrency).toHaveBeenCalledWith(QueueName.BACKGROUND_TASK, 10);
expect(jobMock.setConcurrency).toHaveBeenCalledWith(QueueName.SMART_SEARCH, 10);
expect(jobMock.setConcurrency).toHaveBeenCalledWith(QueueName.METADATA_EXTRACTION, 10);
expect(jobMock.setConcurrency).toHaveBeenCalledWith(QueueName.RECOGNIZE_FACES, 10);
expect(jobMock.setConcurrency).toHaveBeenCalledWith(QueueName.FACE_DETECTION, 10);
expect(jobMock.setConcurrency).toHaveBeenCalledWith(QueueName.SIDECAR, 10);
expect(jobMock.setConcurrency).toHaveBeenCalledWith(QueueName.LIBRARY, 10);
expect(jobMock.setConcurrency).toHaveBeenCalledWith(QueueName.MIGRATION, 10);
@@ -280,7 +289,7 @@ describe(JobService.name, () => {
JobName.GENERATE_WEBP_THUMBNAIL,
JobName.GENERATE_THUMBHASH_THUMBNAIL,
JobName.ENCODE_CLIP,
JobName.RECOGNIZE_FACES,
JobName.FACE_DETECTION,
],
},
{
@@ -289,7 +298,7 @@ describe(JobService.name, () => {
JobName.GENERATE_WEBP_THUMBNAIL,
JobName.GENERATE_THUMBHASH_THUMBNAIL,
JobName.ENCODE_CLIP,
JobName.RECOGNIZE_FACES,
JobName.FACE_DETECTION,
JobName.VIDEO_CONVERSION,
],
},
@@ -299,7 +308,7 @@ describe(JobService.name, () => {
JobName.GENERATE_WEBP_THUMBNAIL,
JobName.GENERATE_THUMBHASH_THUMBNAIL,
JobName.ENCODE_CLIP,
JobName.RECOGNIZE_FACES,
JobName.FACE_DETECTION,
JobName.VIDEO_CONVERSION,
],
},
@@ -308,7 +317,11 @@ describe(JobService.name, () => {
jobs: [],
},
{
item: { name: JobName.RECOGNIZE_FACES, data: { id: 'asset-1' } },
item: { name: JobName.FACE_DETECTION, data: { id: 'asset-1' } },
jobs: [JobName.QUEUE_FACIAL_RECOGNITION],
},
{
item: { name: JobName.FACIAL_RECOGNITION, data: { id: 'asset-1' } },
jobs: [],
},
];
@@ -355,7 +368,12 @@ describe(JobService.name, () => {
configKey: SystemConfigKey.MACHINE_LEARNING_CLIP_ENABLED,
},
{
queue: QueueName.RECOGNIZE_FACES,
queue: QueueName.FACE_DETECTION,
feature: FeatureFlag.FACIAL_RECOGNITION,
configKey: SystemConfigKey.MACHINE_LEARNING_FACIAL_RECOGNITION_ENABLED,
},
{
queue: QueueName.FACIAL_RECOGNITION,
feature: FeatureFlag.FACIAL_RECOGNITION,
configKey: SystemConfigKey.MACHINE_LEARNING_FACIAL_RECOGNITION_ENABLED,
},


@@ -14,7 +14,7 @@ import {
QueueCleanType,
} from '../repositories';
import { FeatureFlag, SystemConfigCore } from '../system-config/system-config.core';
import { JobCommand, JobName, QueueName } from './job.constants';
import { ConcurrentQueueName, JobCommand, JobName, QueueName } from './job.constants';
import { AllJobStatusResponseDto, JobCommandDto, JobStatusDto } from './job.dto';
@Injectable()
@@ -108,9 +108,13 @@ export class JobService {
case QueueName.THUMBNAIL_GENERATION:
return this.jobRepository.queue({ name: JobName.QUEUE_GENERATE_THUMBNAILS, data: { force } });
case QueueName.RECOGNIZE_FACES:
case QueueName.FACE_DETECTION:
await this.configCore.requireFeature(FeatureFlag.FACIAL_RECOGNITION);
return this.jobRepository.queue({ name: JobName.QUEUE_RECOGNIZE_FACES, data: { force } });
return this.jobRepository.queue({ name: JobName.QUEUE_FACE_DETECTION, data: { force } });
case QueueName.FACIAL_RECOGNITION:
await this.configCore.requireFeature(FeatureFlag.FACIAL_RECOGNITION);
return this.jobRepository.queue({ name: JobName.QUEUE_FACIAL_RECOGNITION, data: { force } });
case QueueName.LIBRARY:
return this.jobRepository.queue({ name: JobName.LIBRARY_QUEUE_SCAN_ALL, data: { force } });
@@ -124,7 +128,8 @@ export class JobService {
const config = await this.configCore.getConfig();
for (const queueName of Object.values(QueueName)) {
let concurrency = 1;
if (queueName !== QueueName.STORAGE_TEMPLATE_MIGRATION) {
if (this.isConcurrentQueue(queueName)) {
concurrency = config.job[queueName].concurrency;
}
@@ -145,10 +150,10 @@
}
this.configCore.config$.subscribe((config) => {
this.logger.log(`Updating queue concurrency settings`);
this.logger.debug(`Updating queue concurrency settings`);
for (const queueName of Object.values(QueueName)) {
let concurrency = 1;
if (queueName !== QueueName.STORAGE_TEMPLATE_MIGRATION) {
if (this.isConcurrentQueue(queueName)) {
concurrency = config.job[queueName].concurrency;
}
this.logger.debug(`Setting ${queueName} concurrency to ${concurrency}`);
@@ -157,6 +162,10 @@
});
}
private isConcurrentQueue(name: QueueName): name is ConcurrentQueueName {
return ![QueueName.FACIAL_RECOGNITION, QueueName.STORAGE_TEMPLATE_MIGRATION].includes(name);
}
async handleNightlyJobs() {
await this.jobRepository.queueAll([
{ name: JobName.ASSET_DELETION_CHECK },
@@ -217,7 +226,7 @@
{ name: JobName.GENERATE_WEBP_THUMBNAIL, data: item.data },
{ name: JobName.GENERATE_THUMBHASH_THUMBNAIL, data: item.data },
{ name: JobName.ENCODE_CLIP, data: item.data },
{ name: JobName.RECOGNIZE_FACES, data: item.data },
{ name: JobName.FACE_DETECTION, data: item.data },
];
const [asset] = await this.assetRepository.getByIds([item.data.id]);
@@ -244,6 +253,12 @@
if (asset && asset.isVisible) {
this.communicationRepository.send(ClientEvent.UPLOAD_SUCCESS, asset.ownerId, mapAsset(asset));
}
break;
}
case JobName.FACE_DETECTION: {
await this.jobRepository.queue({ name: JobName.QUEUE_FACIAL_RECOGNITION, data: item.data });
break;
}
}
}
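
Note the new chaining above: when a FACE_DETECTION job finishes, the service queues QUEUE_FACIAL_RECOGNITION, so clustering only starts after detection output exists. The `waitForQueueCompletion` method added to the job mock earlier suggests the queue handler also waits for detection to drain before clustering; a plausible sketch (the handler body and the method signature are assumptions, not the actual implementation):

import { QueueName } from '../job'; // illustrative import

declare const jobRepository: {
  waitForQueueCompletion(...queues: QueueName[]): Promise<void>; // signature assumed from the new mock
};

async function handleQueueFacialRecognition(): Promise<boolean> {
  // Clustering works best when every face embedding already exists, so wait
  // for the face detection queue to drain before queueing recognition jobs.
  await jobRepository.waitForQueueCompletion(QueueName.FACE_DETECTION);
  // ...then page through unassigned faces and queue one FACIAL_RECOGNITION job per face.
  return true;
}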


@@ -70,7 +70,10 @@ describe(MediaService.name, () => {
items: [assetStub.image],
hasNextPage: false,
});
personMock.getAll.mockResolvedValue([personStub.newThumbnail]);
personMock.getAll.mockResolvedValue({
items: [personStub.newThumbnail],
hasNextPage: false,
});
personMock.getFacesByIds.mockResolvedValue([faceStub.face1]);
await sut.handleQueueGenerateThumbnails({ force: true });
@@ -84,8 +87,7 @@
},
]);
expect(personMock.getAll).toHaveBeenCalled();
expect(personMock.getAllWithoutThumbnail).not.toHaveBeenCalled();
expect(personMock.getAll).toHaveBeenCalledWith({ skip: 0, take: 1000 }, {});
expect(jobMock.queueAll).toHaveBeenCalledWith([
{
name: JobName.GENERATE_PERSON_THUMBNAIL,
@@ -99,7 +101,10 @@
items: [assetStub.image],
hasNextPage: false,
});
personMock.getAllWithoutThumbnail.mockResolvedValue([personStub.noThumbnail]);
personMock.getAll.mockResolvedValue({
items: [personStub.noThumbnail],
hasNextPage: false,
});
personMock.getRandomFace.mockResolvedValue(faceStub.face1);
await sut.handleQueueGenerateThumbnails({ force: false });
@@ -107,8 +112,7 @@
expect(assetMock.getAll).not.toHaveBeenCalled();
expect(assetMock.getWithout).toHaveBeenCalledWith({ skip: 0, take: 1000 }, WithoutProperty.THUMBNAIL);
expect(personMock.getAll).not.toHaveBeenCalled();
expect(personMock.getAllWithoutThumbnail).toHaveBeenCalled();
expect(personMock.getAll).toHaveBeenCalledWith({ skip: 0, take: 1000 }, { where: { thumbnailPath: '' } });
expect(personMock.getRandomFace).toHaveBeenCalled();
expect(jobMock.queueAll).toHaveBeenCalledWith([
{
@@ -125,7 +129,10 @@
items: [assetStub.noResizePath],
hasNextPage: false,
});
personMock.getAllWithoutThumbnail.mockResolvedValue([]);
personMock.getAll.mockResolvedValue({
items: [],
hasNextPage: false,
});
await sut.handleQueueGenerateThumbnails({ force: false });
@@ -138,8 +145,7 @@
},
]);
expect(personMock.getAll).not.toHaveBeenCalled();
expect(personMock.getAllWithoutThumbnail).toHaveBeenCalled();
expect(personMock.getAll).toHaveBeenCalledWith({ skip: 0, take: 1000 }, { where: { thumbnailPath: '' } });
});
it('should queue all assets with missing webp path', async () => {
@@ -147,7 +153,10 @@
items: [assetStub.noWebpPath],
hasNextPage: false,
});
personMock.getAllWithoutThumbnail.mockResolvedValue([]);
personMock.getAll.mockResolvedValue({
items: [],
hasNextPage: false,
});
await sut.handleQueueGenerateThumbnails({ force: false });
@@ -160,8 +169,7 @@
},
]);
expect(personMock.getAll).not.toHaveBeenCalled();
expect(personMock.getAllWithoutThumbnail).toHaveBeenCalled();
expect(personMock.getAll).toHaveBeenCalledWith({ skip: 0, take: 1000 }, { where: { thumbnailPath: '' } });
});
it('should queue all assets with missing thumbhash', async () => {
@@ -169,7 +177,10 @@
items: [assetStub.noThumbhash],
hasNextPage: false,
});
personMock.getAllWithoutThumbnail.mockResolvedValue([]);
personMock.getAll.mockResolvedValue({
items: [],
hasNextPage: false,
});
await sut.handleQueueGenerateThumbnails({ force: false });
@@ -182,8 +193,7 @@
},
]);
expect(personMock.getAll).not.toHaveBeenCalled();
expect(personMock.getAllWithoutThumbnail).toHaveBeenCalled();
expect(personMock.getAll).toHaveBeenCalledWith({ skip: 0, take: 1000 }, { where: { thumbnailPath: '' } });
});
});
@@ -394,7 +404,10 @@
items: [assetStub.video],
hasNextPage: false,
});
personMock.getAll.mockResolvedValue([]);
personMock.getAll.mockResolvedValue({
items: [],
hasNextPage: false,
});
await sut.handleQueueVideoConversion({ force: true });


@@ -93,20 +93,24 @@ export class MediaService {
await this.jobRepository.queueAll(jobs);
}
const people = force ? await this.personRepository.getAll() : await this.personRepository.getAllWithoutThumbnail();
const jobs: JobItem[] = [];
for (const person of people) {
if (!person.faceAssetId) {
const face = await this.personRepository.getRandomFace(person.id);
if (!face) {
continue;
const personPagination = usePagination(JOBS_ASSET_PAGINATION_SIZE, (pagination) =>
this.personRepository.getAll(pagination, { where: force ? undefined : { thumbnailPath: '' } }),
);
for await (const people of personPagination) {
for (const person of people) {
if (!person.faceAssetId) {
const face = await this.personRepository.getRandomFace(person.id);
if (!face) {
continue;
}
await this.personRepository.update({ id: person.id, faceAssetId: face.assetId });
}
await this.personRepository.update({ id: person.id, faceAssetId: face.assetId });
jobs.push({ name: JobName.GENERATE_PERSON_THUMBNAIL, data: { id: person.id } });
}
jobs.push({ name: JobName.GENERATE_PERSON_THUMBNAIL, data: { id: person.id } });
}
await this.jobRepository.queueAll(jobs);
@@ -131,11 +135,16 @@
);
}
const people = await this.personRepository.getAll();
await this.jobRepository.queueAll(
people.map((person) => ({ name: JobName.MIGRATE_PERSON, data: { id: person.id } })),
const personPagination = usePagination(JOBS_ASSET_PAGINATION_SIZE, (pagination) =>
this.personRepository.getAll(pagination),
);
for await (const people of personPagination) {
await this.jobRepository.queueAll(
people.map((person) => ({ name: JobName.MIGRATE_PERSON, data: { id: person.id } })),
);
}
return true;
}
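
`usePagination` already exists in this codebase; the refactors above replace unbounded `getAll()` calls with it so that people are processed in pages of JOBS_ASSET_PAGINATION_SIZE. A minimal sketch of the pattern such a helper implements (the actual implementation may differ):

interface PaginationOptions { skip: number; take: number }
interface Paginated<T> { items: T[]; hasNextPage: boolean }

// Yields one page of items at a time until the source reports no further pages.
async function* usePaginationSketch<T>(
  pageSize: number,
  getPage: (pagination: PaginationOptions) => Promise<Paginated<T>>,
): AsyncGenerator<T[]> {
  for (let skip = 0; ; skip += pageSize) {
    const { items, hasNextPage } = await getPage({ skip, take: pageSize });
    yield items;
    if (!hasNextPage) {
      break;
    }
  }
}

This matches the call sites and tests above: the first page is requested with { skip: 0, take: 1000 }, and iteration stops once hasNextPage is false.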


@@ -18,10 +18,12 @@ import {
newSystemConfigRepositoryMock,
personStub,
} from '@test';
import { IsNull } from 'typeorm';
import { BulkIdErrorReason } from '../asset';
import { CacheControl, ImmichFileResponse } from '../domain.util';
import { JobName } from '../job';
import {
FaceSearchResult,
IAssetRepository,
ICryptoRepository,
IJobRepository,
@@ -120,7 +122,7 @@ describe(PersonService.name, () => {
people: [responseDto],
});
expect(personMock.getAllForUser).toHaveBeenCalledWith(authStub.admin.user.id, {
minimumFaceCount: 1,
minimumFaceCount: 3,
withHidden: false,
});
});
@@ -132,7 +134,7 @@
people: [responseDto],
});
expect(personMock.getAllForUser).toHaveBeenCalledWith(authStub.admin.user.id, {
minimumFaceCount: 1,
minimumFaceCount: 3,
withHidden: false,
});
});
@@ -153,7 +155,7 @@
],
});
expect(personMock.getAllForUser).toHaveBeenCalledWith(authStub.admin.user.id, {
minimumFaceCount: 1,
minimumFaceCount: 3,
withHidden: true,
});
});
@@ -516,51 +518,22 @@
});
});
describe('handlePersonDelete', () => {
it('should stop if a person has not been found', async () => {
personMock.getById.mockResolvedValue(null);
await expect(sut.handlePersonDelete({ id: 'person-1' })).resolves.toBe(false);
expect(personMock.update).not.toHaveBeenCalled();
expect(storageMock.unlink).not.toHaveBeenCalled();
});
it('should delete a person', async () => {
personMock.getById.mockResolvedValue(personStub.primaryPerson);
await expect(sut.handlePersonDelete({ id: 'person-1' })).resolves.toBe(true);
expect(personMock.delete).toHaveBeenCalledWith(personStub.primaryPerson);
expect(storageMock.unlink).toHaveBeenCalledWith(personStub.primaryPerson.thumbnailPath);
});
});
describe('handlePersonDelete', () => {
it('should delete person', async () => {
personMock.getById.mockResolvedValue(personStub.withName);
await sut.handlePersonDelete({ id: personStub.withName.id });
expect(personMock.delete).toHaveBeenCalledWith(personStub.withName);
expect(storageMock.unlink).toHaveBeenCalledWith(personStub.withName.thumbnailPath);
});
});
describe('handlePersonCleanup', () => {
it('should delete people without faces', async () => {
personMock.getAllWithoutFaces.mockResolvedValue([personStub.noName]);
await sut.handlePersonCleanup();
expect(jobMock.queueAll).toHaveBeenCalledWith([
{ name: JobName.PERSON_DELETE, data: { id: personStub.noName.id } },
]);
expect(personMock.delete).toHaveBeenCalledWith([personStub.noName]);
expect(storageMock.unlink).toHaveBeenCalledWith(personStub.noName.thumbnailPath);
});
});
describe('handleQueueRecognizeFaces', () => {
describe('handleQueueDetectFaces', () => {
it('should return if machine learning is disabled', async () => {
configMock.load.mockResolvedValue([{ key: SystemConfigKey.MACHINE_LEARNING_ENABLED, value: false }]);
await expect(sut.handleQueueRecognizeFaces({})).resolves.toBe(true);
await expect(sut.handleQueueDetectFaces({})).resolves.toBe(true);
expect(jobMock.queue).not.toHaveBeenCalled();
expect(jobMock.queueAll).not.toHaveBeenCalled();
expect(configMock.load).toHaveBeenCalled();
@@ -571,12 +544,13 @@
items: [assetStub.image],
hasNextPage: false,
});
await sut.handleQueueRecognizeFaces({});
await sut.handleQueueDetectFaces({});
expect(assetMock.getWithout).toHaveBeenCalledWith({ skip: 0, take: 1000 }, WithoutProperty.FACES);
expect(jobMock.queueAll).toHaveBeenCalledWith([
{
name: JobName.RECOGNIZE_FACES,
name: JobName.FACE_DETECTION,
data: { id: assetStub.image.id },
},
]);
@@ -587,39 +561,133 @@
items: [assetStub.image],
hasNextPage: false,
});
personMock.getAll.mockResolvedValue([personStub.withName]);
personMock.deleteAll.mockResolvedValue(5);
personMock.getAll.mockResolvedValue({
items: [personStub.withName],
hasNextPage: false,
});
await sut.handleQueueRecognizeFaces({ force: true });
await sut.handleQueueDetectFaces({ force: true });
expect(assetMock.getAll).toHaveBeenCalled();
expect(jobMock.queueAll).toHaveBeenCalledWith([
{
name: JobName.RECOGNIZE_FACES,
name: JobName.FACE_DETECTION,
data: { id: assetStub.image.id },
},
]);
});
it('should delete existing people and faces if forced', async () => {
personMock.getAll.mockResolvedValue({
items: [faceStub.face1.person],
hasNextPage: false,
});
personMock.getAllFaces.mockResolvedValue({
items: [faceStub.face1],
hasNextPage: false,
});
assetMock.getAll.mockResolvedValue({
items: [assetStub.image],
hasNextPage: false,
});
await sut.handleQueueDetectFaces({ force: true });
expect(assetMock.getAll).toHaveBeenCalled();
expect(jobMock.queueAll).toHaveBeenCalledWith([
{
name: JobName.PERSON_DELETE,
data: { id: personStub.withName.id },
name: JobName.FACE_DETECTION,
data: { id: assetStub.image.id },
},
]);
expect(personMock.delete).toHaveBeenCalledWith([faceStub.face1.person]);
expect(storageMock.unlink).toHaveBeenCalledWith(faceStub.face1.person.thumbnailPath);
});
});
describe('handleRecognizeFaces', () => {
describe('handleQueueRecognizeFaces', () => {
it('should return if machine learning is disabled', async () => {
configMock.load.mockResolvedValue([{ key: SystemConfigKey.MACHINE_LEARNING_ENABLED, value: false }]);
await expect(sut.handleRecognizeFaces({ id: 'foo' })).resolves.toBe(true);
await expect(sut.handleQueueRecognizeFaces({})).resolves.toBe(true);
expect(jobMock.queue).not.toHaveBeenCalled();
expect(configMock.load).toHaveBeenCalled();
});
it('should queue missing assets', async () => {
personMock.getAllFaces.mockResolvedValue({
items: [faceStub.face1],
hasNextPage: false,
});
await sut.handleQueueRecognizeFaces({});
expect(personMock.getAllFaces).toHaveBeenCalledWith({ skip: 0, take: 1000 }, { where: { personId: IsNull() } });
expect(jobMock.queueAll).toHaveBeenCalledWith([
{
name: JobName.FACIAL_RECOGNITION,
data: { id: faceStub.face1.id, deferred: false },
},
]);
});
it('should queue all assets', async () => {
personMock.getAll.mockResolvedValue({
items: [],
hasNextPage: false,
});
personMock.getAllFaces.mockResolvedValue({
items: [faceStub.face1],
hasNextPage: false,
});
await sut.handleQueueRecognizeFaces({ force: true });
expect(personMock.getAllFaces).toHaveBeenCalledWith({ skip: 0, take: 1000 }, {});
expect(jobMock.queueAll).toHaveBeenCalledWith([
{
name: JobName.FACIAL_RECOGNITION,
data: { id: faceStub.face1.id, deferred: false },
},
]);
});
it('should delete existing people and faces if forced', async () => {
personMock.getAll.mockResolvedValue({
items: [faceStub.face1.person],
hasNextPage: false,
});
personMock.getAllFaces.mockResolvedValue({
items: [faceStub.face1],
hasNextPage: false,
});
await sut.handleQueueRecognizeFaces({ force: true });
expect(personMock.getAllFaces).toHaveBeenCalledWith({ skip: 0, take: 1000 }, {});
expect(jobMock.queueAll).toHaveBeenCalledWith([
{
name: JobName.FACIAL_RECOGNITION,
data: { id: faceStub.face1.id, deferred: false },
},
]);
expect(personMock.delete).toHaveBeenCalledWith([faceStub.face1.person]);
expect(storageMock.unlink).toHaveBeenCalledWith(faceStub.face1.person.thumbnailPath);
});
});
describe('handleDetectFaces', () => {
it('should return if machine learning is disabled', async () => {
configMock.load.mockResolvedValue([{ key: SystemConfigKey.MACHINE_LEARNING_ENABLED, value: false }]);
await expect(sut.handleDetectFaces({ id: 'foo' })).resolves.toBe(true);
expect(assetMock.getByIds).not.toHaveBeenCalled();
expect(configMock.load).toHaveBeenCalled();
});
it('should skip when no resize path', async () => {
assetMock.getByIds.mockResolvedValue([assetStub.noResizePath]);
await sut.handleRecognizeFaces({ id: assetStub.noResizePath.id });
await sut.handleDetectFaces({ id: assetStub.noResizePath.id });
expect(machineLearningMock.detectFaces).not.toHaveBeenCalled();
});
@@ -636,7 +704,7 @@
],
},
]);
await sut.handleRecognizeFaces({ id: assetStub.noResizePath.id });
await sut.handleDetectFaces({ id: assetStub.noResizePath.id });
expect(machineLearningMock.detectFaces).not.toHaveBeenCalled();
});
@@ -645,7 +713,7 @@
machineLearningMock.detectFaces.mockResolvedValue([]);
assetMock.getByIds.mockResolvedValue([assetStub.image]);
await sut.handleRecognizeFaces({ id: assetStub.image.id });
await sut.handleDetectFaces({ id: assetStub.image.id });
expect(machineLearningMock.detectFaces).toHaveBeenCalledWith(
'http://immich-machine-learning:3003',
{
@@ -655,7 +723,7 @@
enabled: true,
maxDistance: 0.6,
minScore: 0.7,
minFaces: 1,
minFaces: 3,
modelName: 'buffalo_l',
},
);
@@ -670,37 +738,13 @@
expect(assetMock.upsertJobStatus.mock.calls[0][0].facesRecognizedAt?.getTime()).toBeGreaterThan(start);
});
it('should match existing people', async () => {
it('should create a face with no person', async () => {
machineLearningMock.detectFaces.mockResolvedValue([detectFaceMock]);
smartInfoMock.searchFaces.mockResolvedValue([faceStub.face1]);
smartInfoMock.searchFaces.mockResolvedValue([{ face: faceStub.face1, distance: 0.7 }]);
assetMock.getByIds.mockResolvedValue([assetStub.image]);
await sut.handleRecognizeFaces({ id: assetStub.image.id });
await sut.handleDetectFaces({ id: assetStub.image.id });
expect(personMock.createFace).toHaveBeenCalledWith({
personId: 'person-1',
assetId: 'asset-id',
embedding: [1, 2, 3, 4],
boundingBoxX1: 100,
boundingBoxY1: 100,
boundingBoxX2: 200,
boundingBoxY2: 200,
imageHeight: 500,
imageWidth: 400,
});
});
it('should create a new person', async () => {
machineLearningMock.detectFaces.mockResolvedValue([detectFaceMock]);
smartInfoMock.searchFaces.mockResolvedValue([]);
personMock.create.mockResolvedValue(personStub.noName);
assetMock.getByIds.mockResolvedValue([assetStub.image]);
personMock.createFace.mockResolvedValue(faceStub.primaryFace1);
await sut.handleRecognizeFaces({ id: assetStub.image.id });
expect(personMock.create).toHaveBeenCalledWith({ ownerId: assetStub.image.ownerId });
expect(personMock.createFace).toHaveBeenCalledWith({
personId: 'person-1',
assetId: 'asset-id',
embedding: [1, 2, 3, 4],
boundingBoxX1: 100,
@@ -710,8 +754,130 @@
imageHeight: 500,
imageWidth: 400,
});
expect(personMock.reassignFace).not.toHaveBeenCalled();
expect(personMock.reassignFaces).not.toHaveBeenCalled();
});
});
describe('handleRecognizeFaces', () => {
it('should return false if face does not exist', async () => {
personMock.getFaceByIdWithAssets.mockResolvedValue(null);
expect(await sut.handleRecognizeFaces({ id: faceStub.face1.id })).toBe(false);
expect(personMock.reassignFaces).not.toHaveBeenCalled();
expect(personMock.create).not.toHaveBeenCalled();
expect(personMock.createFace).not.toHaveBeenCalled();
});
it('should return true if face already has an assigned person', async () => {
personMock.getFaceByIdWithAssets.mockResolvedValue(faceStub.face1);
expect(await sut.handleRecognizeFaces({ id: faceStub.face1.id })).toBe(true);
expect(personMock.reassignFaces).not.toHaveBeenCalled();
expect(personMock.create).not.toHaveBeenCalled();
expect(personMock.createFace).not.toHaveBeenCalled();
});
it('should match existing person', async () => {
if (!faceStub.primaryFace1.person) {
throw new Error('faceStub.primaryFace1.person is null');
}
const faces = [
{ face: faceStub.noPerson1, distance: 0.0 },
{ face: faceStub.primaryFace1, distance: 0.2 },
{ face: faceStub.noPerson2, distance: 0.3 },
{ face: faceStub.face1, distance: 0.4 },
] as FaceSearchResult[];
configMock.load.mockResolvedValue([
{ key: SystemConfigKey.MACHINE_LEARNING_FACIAL_RECOGNITION_MIN_FACES, value: 1 },
]);
smartInfoMock.searchFaces.mockResolvedValue(faces);
personMock.getFaceByIdWithAssets.mockResolvedValue(faceStub.noPerson1);
personMock.create.mockResolvedValue(faceStub.primaryFace1.person);
await sut.handleRecognizeFaces({ id: faceStub.noPerson1.id });
expect(personMock.create).not.toHaveBeenCalled();
expect(personMock.reassignFaces).toHaveBeenCalledTimes(1);
expect(personMock.reassignFaces).toHaveBeenCalledWith({
faceIds: expect.arrayContaining([faceStub.noPerson1.id]),
newPersonId: faceStub.primaryFace1.person.id,
});
expect(personMock.reassignFaces).toHaveBeenCalledWith({
faceIds: expect.not.arrayContaining([faceStub.face1.id]),
newPersonId: faceStub.primaryFace1.person.id,
});
});
it('should create a new person if the face is a core point with no person', async () => {
const faces = [
{ face: faceStub.noPerson1, distance: 0.0 },
{ face: faceStub.noPerson2, distance: 0.3 },
] as FaceSearchResult[];
configMock.load.mockResolvedValue([
{ key: SystemConfigKey.MACHINE_LEARNING_FACIAL_RECOGNITION_MIN_FACES, value: 1 },
]);
smartInfoMock.searchFaces.mockResolvedValue(faces);
personMock.getFaceByIdWithAssets.mockResolvedValue(faceStub.noPerson1);
personMock.create.mockResolvedValue(personStub.withName);
await sut.handleRecognizeFaces({ id: faceStub.noPerson1.id });
expect(personMock.create).toHaveBeenCalledWith({
ownerId: faceStub.noPerson1.asset.ownerId,
faceAssetId: faceStub.noPerson1.id,
});
expect(personMock.reassignFaces).toHaveBeenCalledWith({
faceIds: [faceStub.noPerson1.id],
newPersonId: personStub.withName.id,
});
});
it('should defer non-core faces to end of queue', async () => {
const faces = [{ face: faceStub.noPerson1, distance: 0.0 }] as FaceSearchResult[];
configMock.load.mockResolvedValue([
{ key: SystemConfigKey.MACHINE_LEARNING_FACIAL_RECOGNITION_MIN_FACES, value: 2 },
]);
smartInfoMock.searchFaces.mockResolvedValue(faces);
personMock.getFaceByIdWithAssets.mockResolvedValue(faceStub.noPerson1);
personMock.create.mockResolvedValue(personStub.withName);
await sut.handleRecognizeFaces({ id: faceStub.noPerson1.id });
expect(jobMock.queue).toHaveBeenCalledWith({
name: JobName.FACIAL_RECOGNITION,
data: { id: faceStub.noPerson1.id, deferred: true },
});
expect(smartInfoMock.searchFaces).toHaveBeenCalledTimes(1);
expect(personMock.create).not.toHaveBeenCalled();
expect(personMock.reassignFaces).not.toHaveBeenCalled();
});
it('should not assign person to non-core face with no matching person', async () => {
const faces = [{ face: faceStub.noPerson1, distance: 0.0 }] as FaceSearchResult[];
configMock.load.mockResolvedValue([
{ key: SystemConfigKey.MACHINE_LEARNING_FACIAL_RECOGNITION_MIN_FACES, value: 2 },
]);
smartInfoMock.searchFaces.mockResolvedValueOnce(faces).mockResolvedValueOnce([]);
personMock.getFaceByIdWithAssets.mockResolvedValue(faceStub.noPerson1);
personMock.create.mockResolvedValue(personStub.withName);
await sut.handleRecognizeFaces({ id: faceStub.noPerson1.id, deferred: true });
expect(jobMock.queue).not.toHaveBeenCalled();
expect(smartInfoMock.searchFaces).toHaveBeenCalledTimes(2);
expect(personMock.create).not.toHaveBeenCalled();
expect(personMock.reassignFaces).not.toHaveBeenCalled();
});
});
describe('handleGeneratePersonThumbnail', () => {
it('should return if machine learning is disabled', async () => {
configMock.load.mockResolvedValue([{ key: SystemConfigKey.MACHINE_LEARNING_ENABLED, value: false }]);
@@ -822,7 +988,6 @@
it('should require person.write and person.merge permission', async () => {
personMock.getById.mockResolvedValueOnce(personStub.primaryPerson);
personMock.getById.mockResolvedValueOnce(personStub.mergePerson);
personMock.delete.mockResolvedValue(personStub.mergePerson);
await expect(sut.mergePerson(authStub.admin, 'person-1', { ids: ['person-2'] })).rejects.toBeInstanceOf(
BadRequestException,
@ -837,7 +1002,6 @@ describe(PersonService.name, () => {
it('should merge two people without smart merge', async () => {
personMock.getById.mockResolvedValueOnce(personStub.primaryPerson);
personMock.getById.mockResolvedValueOnce(personStub.mergePerson);
personMock.delete.mockResolvedValue(personStub.mergePerson);
accessMock.person.checkOwnerAccess.mockResolvedValueOnce(new Set(['person-1']));
accessMock.person.checkOwnerAccess.mockResolvedValueOnce(new Set(['person-2']));
@ -852,17 +1016,12 @@ describe(PersonService.name, () => {
expect(personMock.update).not.toHaveBeenCalled();
expect(jobMock.queue).toHaveBeenCalledWith({
name: JobName.PERSON_DELETE,
data: { id: personStub.mergePerson.id },
});
expect(accessMock.person.checkOwnerAccess).toHaveBeenCalledWith(authStub.admin.user.id, new Set(['person-1']));
});
it('should merge two people with smart merge', async () => {
personMock.getById.mockResolvedValueOnce(personStub.randomPerson);
personMock.getById.mockResolvedValueOnce(personStub.primaryPerson);
personMock.delete.mockResolvedValue(personStub.primaryPerson);
personMock.update.mockResolvedValue({ ...personStub.randomPerson, name: personStub.primaryPerson.name });
accessMock.person.checkOwnerAccess.mockResolvedValueOnce(new Set(['person-3']));
accessMock.person.checkOwnerAccess.mockResolvedValueOnce(new Set(['person-1']));
@ -881,10 +1040,7 @@ describe(PersonService.name, () => {
name: personStub.primaryPerson.name,
});
expect(jobMock.queue).toHaveBeenCalledWith({
name: JobName.PERSON_DELETE,
data: { id: personStub.primaryPerson.id },
});
expect(personMock.delete).toHaveBeenCalledWith([personStub.primaryPerson]);
expect(accessMock.person.checkOwnerAccess).toHaveBeenCalledWith(authStub.admin.user.id, new Set(['person-1']));
});
@ -954,7 +1110,7 @@ describe(PersonService.name, () => {
boundingBoxX2: 1,
boundingBoxY1: 0,
boundingBoxY2: 1,
id: 'assetFaceId',
id: faceStub.face1.id,
imageHeight: 1024,
imageWidth: 1024,
person: mapPerson(personStub.withName),


@ -2,12 +2,13 @@ import { PersonEntity } from '@app/infra/entities';
import { PersonPathType } from '@app/infra/entities/move.entity';
import { ImmichLogger } from '@app/infra/logger';
import { BadRequestException, Inject, Injectable, NotFoundException } from '@nestjs/common';
import { IsNull } from 'typeorm';
import { AccessCore, Permission } from '../access';
import { AssetResponseDto, BulkIdErrorReason, BulkIdResponseDto, mapAsset } from '../asset';
import { AuthDto } from '../auth';
import { mimeTypes } from '../domain.constant';
import { CacheControl, ImmichFileResponse, usePagination } from '../domain.util';
import { IBaseJob, IEntityJob, JOBS_ASSET_PAGINATION_SIZE, JobName } from '../job';
import { IBaseJob, IDeferrableJob, IEntityJob, JOBS_ASSET_PAGINATION_SIZE, JobName, QueueName } from '../job';
import { FACE_THUMBNAIL_SIZE } from '../media';
import {
CropOptions,
@ -249,64 +250,63 @@ export class PersonService {
return results;
}
async handlePersonDelete({ id }: IEntityJob) {
const person = await this.repository.getById(id);
if (!person) {
return false;
}
private async delete(people: PersonEntity[]) {
await Promise.all(people.map((person) => this.storageRepository.unlink(person.thumbnailPath)));
await this.repository.delete(people);
this.logger.debug(`Deleted ${people.length} people`);
}
try {
await this.repository.delete(person);
await this.storageRepository.unlink(person.thumbnailPath);
} catch (error: Error | any) {
this.logger.error(`Unable to delete person: ${error}`, error?.stack);
}
private async deleteAllPeople() {
const personPagination = usePagination(JOBS_ASSET_PAGINATION_SIZE, (pagination) =>
this.repository.getAll(pagination),
);
return true;
for await (const people of personPagination) {
await this.delete(people); // deletes thumbnails too
}
}
async handlePersonCleanup() {
const people = await this.repository.getAllWithoutFaces();
for (const person of people) {
this.logger.debug(`Person ${person.name || person.id} no longer has any faces, deleting.`);
}
await this.jobRepository.queueAll(
people.map((person) => ({ name: JobName.PERSON_DELETE, data: { id: person.id } })),
);
await this.delete(people);
return true;
}
async handleQueueRecognizeFaces({ force }: IBaseJob) {
async handleQueueDetectFaces({ force }: IBaseJob) {
const { machineLearning } = await this.configCore.getConfig();
if (!machineLearning.enabled || !machineLearning.facialRecognition.enabled) {
return true;
}
if (force) {
await this.deleteAllPeople();
await this.repository.deleteAllFaces();
}
const assetPagination = usePagination(JOBS_ASSET_PAGINATION_SIZE, (pagination) => {
return force
? this.assetRepository.getAll(pagination, { order: 'DESC' })
? this.assetRepository.getAll(pagination, {
order: 'DESC',
withFaces: true,
withPeople: false,
withSmartInfo: false,
withSmartSearch: false,
withExif: false,
withStacked: false,
})
: this.assetRepository.getWithout(pagination, WithoutProperty.FACES);
});
if (force) {
const people = await this.repository.getAll();
await this.jobRepository.queueAll(
people.map((person) => ({ name: JobName.PERSON_DELETE, data: { id: person.id } })),
);
this.logger.debug(`Deleted ${people.length} people`);
}
for await (const assets of assetPagination) {
await this.jobRepository.queueAll(
assets.map((asset) => ({ name: JobName.RECOGNIZE_FACES, data: { id: asset.id } })),
assets.map((asset) => ({ name: JobName.FACE_DETECTION, data: { id: asset.id } })),
);
}
return true;
}
async handleRecognizeFaces({ id }: IEntityJob) {
async handleDetectFaces({ id }: IEntityJob) {
const { machineLearning } = await this.configCore.getConfig();
if (!machineLearning.enabled || !machineLearning.facialRecognition.enabled) {
return true;
@ -315,7 +315,7 @@ export class PersonService {
const relations = {
exifInfo: true,
faces: {
person: true,
person: false,
},
};
const [asset] = await this.assetRepository.getByIds([id], relations);
@ -332,38 +332,19 @@ export class PersonService {
this.logger.debug(`${faces.length} faces detected in ${asset.resizePath}`);
this.logger.verbose(faces.map((face) => ({ ...face, embedding: `vector(${face.embedding.length})` })));
for (const { embedding, ...rest } of faces) {
const matches = await this.smartInfoRepository.searchFaces({
userIds: [asset.ownerId],
embedding,
numResults: 1,
maxDistance: machineLearning.facialRecognition.maxDistance,
});
let personId = matches[0]?.personId || null;
let newPerson: PersonEntity | null = null;
if (!personId) {
this.logger.debug('No matches, creating a new person.');
newPerson = await this.repository.create({ ownerId: asset.ownerId });
personId = newPerson.id;
}
const face = await this.repository.createFace({
for (const face of faces) {
const mappedFace = {
assetId: asset.id,
personId,
embedding,
imageHeight: rest.imageHeight,
imageWidth: rest.imageWidth,
boundingBoxX1: rest.boundingBox.x1,
boundingBoxX2: rest.boundingBox.x2,
boundingBoxY1: rest.boundingBox.y1,
boundingBoxY2: rest.boundingBox.y2,
});
embedding: face.embedding,
imageHeight: face.imageHeight,
imageWidth: face.imageWidth,
boundingBoxX1: face.boundingBox.x1,
boundingBoxX2: face.boundingBox.x2,
boundingBoxY1: face.boundingBox.y1,
boundingBoxY2: face.boundingBox.y2,
};
if (newPerson) {
await this.repository.update({ id: personId, faceAssetId: face.id });
await this.jobRepository.queue({ name: JobName.GENERATE_PERSON_THUMBNAIL, data: { id: newPerson.id } });
}
await this.repository.createFace(mappedFace);
}
await this.assetRepository.upsertJobStatus({
@ -374,6 +355,98 @@ export class PersonService {
return true;
}
async handleQueueRecognizeFaces({ force }: IBaseJob) {
const { machineLearning } = await this.configCore.getConfig();
if (!machineLearning.enabled || !machineLearning.facialRecognition.enabled) {
return true;
}
await this.jobRepository.waitForQueueCompletion(QueueName.THUMBNAIL_GENERATION, QueueName.FACE_DETECTION);
if (force) {
await this.deleteAllPeople();
}
const facePagination = usePagination(JOBS_ASSET_PAGINATION_SIZE, (pagination) =>
this.repository.getAllFaces(pagination, { where: force ? undefined : { personId: IsNull() } }),
);
for await (const page of facePagination) {
await this.jobRepository.queueAll(
page.map((face) => ({ name: JobName.FACIAL_RECOGNITION, data: { id: face.id, deferred: false } })),
);
}
return true;
}
async handleRecognizeFaces({ id, deferred }: IDeferrableJob) {
const { machineLearning } = await this.configCore.getConfig();
if (!machineLearning.enabled || !machineLearning.facialRecognition.enabled) {
return true;
}
const face = await this.repository.getFaceByIdWithAssets(
id,
{ person: true, asset: true },
{ id: true, personId: true, embedding: true },
);
if (!face) {
this.logger.warn(`Face ${id} not found`);
return false;
}
if (face.personId) {
this.logger.debug(`Face ${id} already has a person assigned`);
return true;
}
const matches = await this.smartInfoRepository.searchFaces({
userIds: [face.asset.ownerId],
embedding: face.embedding,
maxDistance: machineLearning.facialRecognition.maxDistance,
numResults: machineLearning.facialRecognition.minFaces,
});
this.logger.debug(`Face ${id} has ${matches.length} match${matches.length !== 1 ? 'es' : ''}`);
const isCore = matches.length >= machineLearning.facialRecognition.minFaces;
if (!isCore && !deferred) {
this.logger.debug(`Deferring non-core face ${id} for later processing`);
await this.jobRepository.queue({ name: JobName.FACIAL_RECOGNITION, data: { id, deferred: true } });
return true;
}
let personId = matches.find((match) => match.face.personId)?.face.personId; // `matches` also includes the face itself
if (!personId) {
const matchWithPerson = await this.smartInfoRepository.searchFaces({
userIds: [face.asset.ownerId],
embedding: face.embedding,
maxDistance: machineLearning.facialRecognition.maxDistance,
numResults: 1,
hasPerson: true,
});
if (matchWithPerson.length > 0) {
personId = matchWithPerson[0].face.personId;
}
}
if (isCore && !personId) {
this.logger.log(`Creating new person for face ${id}`);
const newPerson = await this.repository.create({ ownerId: face.asset.ownerId, faceAssetId: face.id });
await this.jobRepository.queue({ name: JobName.GENERATE_PERSON_THUMBNAIL, data: { id: newPerson.id } });
personId = newPerson.id;
}
if (personId) {
this.logger.debug(`Assigning face ${id} to person ${personId}`);
await this.repository.reassignFaces({ faceIds: [id], newPersonId: personId });
}
return true;
}
async handlePersonMigration({ id }: IEntityJob) {
const person = await this.repository.getById(id);
if (!person) {
@ -499,7 +572,7 @@ export class PersonService {
this.logger.log(`Merging ${mergeName} into ${primaryName}`);
await this.repository.reassignFaces(mergeData);
await this.jobRepository.queue({ name: JobName.PERSON_DELETE, data: { id: mergePerson.id } });
await this.delete([mergePerson]);
this.logger.log(`Merged ${mergeName} into ${primaryName}`);
results.push({ id: mergeId, success: true });
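
The recognition handler above is an incremental, DBSCAN-flavored clustering step: `minFaces` plays the role of minPts, and only core faces (those with at least `minFaces` neighbors within `maxDistance`, counting themselves) are allowed to create new people. Below is a condensed sketch of that decision flow, with the repository calls reduced to placeholder callbacks; it is an illustration of the logic, not the service code itself.

```ts
interface Match {
  personId: string | null;
  distance: number;
}

// Placeholder callbacks stand in for the smart-info, person, and job repositories.
async function recognizeFace(opts: {
  faceId: string;
  deferred: boolean;
  minFaces: number; // like DBSCAN's minPts; search results include the face itself
  searchFaces: (numResults: number, hasPerson?: boolean) => Promise<Match[]>;
  createPerson: () => Promise<string>;
  assignFaces: (faceIds: string[], personId: string) => Promise<void>;
  requeueDeferred: (faceId: string) => Promise<void>;
}): Promise<void> {
  const { faceId, deferred, minFaces, searchFaces, createPerson, assignFaces, requeueDeferred } = opts;

  const matches = await searchFaces(minFaces);
  const isCore = matches.length >= minFaces;
  if (!isCore && !deferred) {
    // Non-core faces go to the back of the queue so core faces can form people first.
    return requeueDeferred(faceId);
  }

  // Prefer the nearest neighbor that already belongs to a person.
  let personId = matches.find((match) => match.personId)?.personId ?? null;
  if (!personId) {
    const [nearestAssigned] = await searchFaces(1, true); // hasPerson: true
    personId = nearestAssigned?.personId ?? null;
  }

  if (isCore && !personId) {
    personId = await createPerson(); // a core face with no assigned neighbors seeds a new person
  }
  if (personId) {
    await assignFaces([faceId], personId);
  }
}
```

A deferred face that still finds no assigned neighbor simply stays unassigned, which is exactly what the "should not assign person to non-core face" test above asserts.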


@ -1,6 +1,6 @@
import { SearchExploreItem } from '@app/domain';
import { AssetEntity, AssetJobStatusEntity, AssetType, ExifEntity } from '@app/infra/entities';
import { FindOptionsRelations } from 'typeorm';
import { FindOptionsRelations, FindOptionsSelect } from 'typeorm';
import { Paginated, PaginationOptions } from '../domain.util';
export type AssetStats = Record<AssetType, number>;
@ -33,6 +33,9 @@ export interface AssetSearchOptions {
withStacked?: boolean;
withExif?: boolean;
withPeople?: boolean;
withSmartInfo?: boolean;
withSmartSearch?: boolean;
withFaces?: boolean;
createdBefore?: Date;
createdAfter?: Date;
@ -93,6 +96,7 @@ export enum WithoutProperty {
CLIP_ENCODING = 'clip-embedding',
OBJECT_TAGS = 'object-tags',
FACES = 'faces',
PERSON = 'person',
SIDECAR = 'sidecar',
}
@ -168,7 +172,11 @@ export const IAssetRepository = 'IAssetRepository';
export interface IAssetRepository {
create(asset: AssetCreate): Promise<AssetEntity>;
getByDate(ownerId: string, date: Date): Promise<AssetEntity[]>;
getByIds(ids: string[], relations?: FindOptionsRelations<AssetEntity>): Promise<AssetEntity[]>;
getByIds(
ids: string[],
relations?: FindOptionsRelations<AssetEntity>,
select?: FindOptionsSelect<AssetEntity>,
): Promise<AssetEntity[]>;
getByDayOfYear(ownerId: string, monthDay: MonthDay): Promise<AssetEntity[]>;
getByChecksum(userId: string, checksum: Buffer): Promise<AssetEntity | null>;
getByAlbumId(pagination: PaginationOptions, albumId: string): Paginated<AssetEntity>;


@ -3,6 +3,7 @@ import { JobName, QueueName } from '../job/job.constants';
import {
IAssetDeletionJob,
IBaseJob,
IDeferrableJob,
IDeleteFilesJob,
IEntityJob,
ILibraryFileJob,
@ -63,11 +64,12 @@ export type JobItem =
| { name: JobName.SIDECAR_SYNC; data: IEntityJob }
| { name: JobName.SIDECAR_WRITE; data: ISidecarWriteJob }
// Recognize Faces
| { name: JobName.QUEUE_RECOGNIZE_FACES; data: IBaseJob }
| { name: JobName.RECOGNIZE_FACES; data: IEntityJob }
// Facial Recognition
| { name: JobName.QUEUE_FACE_DETECTION; data: IBaseJob }
| { name: JobName.FACE_DETECTION; data: IEntityJob }
| { name: JobName.QUEUE_FACIAL_RECOGNITION; data: IBaseJob }
| { name: JobName.FACIAL_RECOGNITION; data: IDeferrableJob }
| { name: JobName.GENERATE_PERSON_THUMBNAIL; data: IEntityJob }
| { name: JobName.PERSON_DELETE; data: IEntityJob }
// Clip Embedding
| { name: JobName.QUEUE_ENCODE_CLIP; data: IBaseJob }
@ -111,4 +113,5 @@ export interface IJobRepository {
clear(name: QueueName, type: QueueCleanType): Promise<string[]>;
getQueueStatus(name: QueueName): Promise<QueueStatus>;
getJobCounts(name: QueueName): Promise<JobCounts>;
waitForQueueCompletion(...queues: QueueName[]): Promise<void>;
}


@ -1,4 +1,6 @@
import { AssetEntity, AssetFaceEntity, PersonEntity } from '@app/infra/entities';
import { FindManyOptions, FindOptionsRelations, FindOptionsSelect } from 'typeorm';
import { Paginated, PaginationOptions } from '../domain.util';
export const IPersonRepository = 'IPersonRepository';
@ -17,7 +19,8 @@ export interface AssetFaceId {
}
export interface UpdateFacesData {
oldPersonId: string;
oldPersonId?: string;
faceIds?: string[];
newPersonId: string;
}
@ -26,8 +29,7 @@ export interface PersonStatistics {
}
export interface IPersonRepository {
getAll(): Promise<PersonEntity[]>;
getAllWithoutThumbnail(): Promise<PersonEntity[]>;
getAll(pagination: PaginationOptions, options?: FindManyOptions<PersonEntity>): Paginated<PersonEntity>;
getAllForUser(userId: string, options: PersonSearchOptions): Promise<PersonEntity[]>;
getAllWithoutFaces(): Promise<PersonEntity[]>;
getById(personId: string): Promise<PersonEntity | null>;
@ -35,19 +37,23 @@ export interface IPersonRepository {
getAssets(personId: string): Promise<AssetEntity[]>;
reassignFaces(data: UpdateFacesData): Promise<number>;
create(entity: Partial<PersonEntity>): Promise<PersonEntity>;
update(entity: Partial<PersonEntity>): Promise<PersonEntity>;
delete(entity: PersonEntity): Promise<PersonEntity | null>;
deleteAll(): Promise<number>;
getStatistics(personId: string): Promise<PersonStatistics>;
getAllFaces(): Promise<AssetFaceEntity[]>;
createFace(entity: Partial<AssetFaceEntity>): Promise<void>;
delete(entities: PersonEntity[]): Promise<void>;
deleteAll(): Promise<void>;
deleteAllFaces(): Promise<void>;
getAllFaces(pagination: PaginationOptions, options?: FindManyOptions<AssetFaceEntity>): Paginated<AssetFaceEntity>;
getFaceById(id: string): Promise<AssetFaceEntity>;
getFaceByIdWithAssets(
id: string,
relations?: FindOptionsRelations<AssetFaceEntity>,
select?: FindOptionsSelect<AssetFaceEntity>,
): Promise<AssetFaceEntity | null>;
getFaces(assetId: string): Promise<AssetFaceEntity[]>;
getFacesByIds(ids: AssetFaceId[]): Promise<AssetFaceEntity[]>;
getRandomFace(personId: string): Promise<AssetFaceEntity | null>;
createFace(entity: Partial<AssetFaceEntity>): Promise<AssetFaceEntity>;
getFaces(assetId: string): Promise<AssetFaceEntity[]>;
getStatistics(personId: string): Promise<PersonStatistics>;
reassignFace(assetFaceId: string, newPersonId: string): Promise<number>;
getFaceById(id: string): Promise<AssetFaceEntity>;
getFaceByIdWithAssets(id: string): Promise<AssetFaceEntity | null>;
reassignFaces(data: UpdateFacesData): Promise<number>;
update(entity: Partial<PersonEntity>): Promise<PersonEntity>;
}


@ -7,14 +7,23 @@ export type Embedding = number[];
export interface EmbeddingSearch {
userIds: string[];
embedding: Embedding;
numResults: number;
maxDistance?: number;
numResults?: number;
withArchived?: boolean;
}
export interface FaceEmbeddingSearch extends EmbeddingSearch {
maxDistance?: number;
hasPerson?: boolean;
}
export interface FaceSearchResult {
face: AssetFaceEntity;
distance: number;
}
export interface ISmartInfoRepository {
init(modelName: string): Promise<void>;
searchCLIP(search: EmbeddingSearch): Promise<AssetEntity[]>;
searchFaces(search: EmbeddingSearch): Promise<AssetFaceEntity[]>;
searchFaces(search: FaceEmbeddingSearch): Promise<FaceSearchResult[]>;
upsert(smartInfo: Partial<SmartInfoEntity>, embedding?: Embedding): Promise<void>;
}
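
A hedged usage sketch of the new `FaceEmbeddingSearch` options, mirroring the two queries the recognition handler makes; the repository instance and inputs are placeholders, and the numeric values are this PR's defaults:

```ts
import type { FaceEmbeddingSearch, FaceSearchResult } from '@app/domain';

declare const smartInfoRepository: {
  searchFaces(search: FaceEmbeddingSearch): Promise<FaceSearchResult[]>;
};
declare const ownerId: string;
declare const embedding: number[];

async function example() {
  // Core check: fetch up to minFaces neighbors within maxDistance.
  const neighbors = await smartInfoRepository.searchFaces({
    userIds: [ownerId],
    embedding,
    maxDistance: 0.6,
    numResults: 3, // minFaces
  });

  // Fallback: the single nearest face that already has a person assigned.
  const [assigned] = await smartInfoRepository.searchFaces({
    userIds: [ownerId],
    embedding,
    maxDistance: 0.6,
    numResults: 1,
    hasPerson: true,
  });

  return { neighbors, assigned };
}
```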


@ -30,14 +30,14 @@ export class RecognitionConfig extends ModelConfig {
@Min(0)
@Max(1)
@Type(() => Number)
@ApiProperty({ type: 'integer' })
@ApiProperty({ type: 'number', format: 'float' })
minScore!: number;
@IsNumber()
@Min(0)
@Max(2)
@Type(() => Number)
@ApiProperty({ type: 'integer' })
@ApiProperty({ type: 'number', format: 'float' })
maxDistance!: number;
@IsNumber()


@ -1,6 +1,5 @@
import { ImmichLogger } from '@app/infra/logger';
import { Inject, Injectable } from '@nestjs/common';
import { setTimeout } from 'timers/promises';
import { usePagination } from '../domain.util';
import { IBaseJob, IEntityJob, JOBS_ASSET_PAGINATION_SIZE, JobName, QueueName } from '../job';
import {
@ -34,13 +33,7 @@ export class SmartInfoService {
async init() {
await this.jobRepository.pause(QueueName.SMART_SEARCH);
let { isActive } = await this.jobRepository.getQueueStatus(QueueName.SMART_SEARCH);
while (isActive) {
this.logger.verbose('Waiting for CLIP encoding queue to stop...');
await setTimeout(1000).then(async () => {
({ isActive } = await this.jobRepository.getQueueStatus(QueueName.SMART_SEARCH));
});
}
await this.jobRepository.waitForQueueCompletion(QueueName.SMART_SEARCH);
const { machineLearning } = await this.configCore.getConfig();


@ -1,7 +1,7 @@
import { ApiProperty } from '@nestjs/swagger';
import { Type } from 'class-transformer';
import { IsInt, IsObject, IsPositive, ValidateNested } from 'class-validator';
import { QueueName } from '../../job';
import { ConcurrentQueueName, QueueName } from '../../job';
export class JobSettingsDto {
@IsInt()
@ -10,9 +10,7 @@ export class JobSettingsDto {
concurrency!: number;
}
export class SystemConfigJobDto
implements Record<Exclude<QueueName, QueueName.STORAGE_TEMPLATE_MIGRATION>, JobSettingsDto>
{
export class SystemConfigJobDto implements Record<ConcurrentQueueName, JobSettingsDto> {
@ApiProperty({ type: JobSettingsDto })
@ValidateNested()
@IsObject()
@ -59,7 +57,7 @@ export class SystemConfigJobDto
@ValidateNested()
@IsObject()
@Type(() => JobSettingsDto)
[QueueName.RECOGNIZE_FACES]!: JobSettingsDto;
[QueueName.FACE_DETECTION]!: JobSettingsDto;
@ApiProperty({ type: JobSettingsDto })
@ValidateNested()


@ -49,7 +49,7 @@ export const defaults = Object.freeze<SystemConfig>({
[QueueName.BACKGROUND_TASK]: { concurrency: 5 },
[QueueName.SMART_SEARCH]: { concurrency: 2 },
[QueueName.METADATA_EXTRACTION]: { concurrency: 5 },
[QueueName.RECOGNIZE_FACES]: { concurrency: 2 },
[QueueName.FACE_DETECTION]: { concurrency: 2 },
[QueueName.SEARCH]: { concurrency: 5 },
[QueueName.SIDECAR]: { concurrency: 5 },
[QueueName.LIBRARY]: { concurrency: 5 },
@ -73,7 +73,7 @@ export const defaults = Object.freeze<SystemConfig>({
modelName: 'buffalo_l',
minScore: 0.7,
maxDistance: 0.6,
minFaces: 1,
minFaces: 3,
},
},
map: {


@ -30,7 +30,7 @@ const updatedConfig = Object.freeze<SystemConfig>({
[QueueName.BACKGROUND_TASK]: { concurrency: 5 },
[QueueName.SMART_SEARCH]: { concurrency: 2 },
[QueueName.METADATA_EXTRACTION]: { concurrency: 5 },
[QueueName.RECOGNIZE_FACES]: { concurrency: 2 },
[QueueName.FACE_DETECTION]: { concurrency: 2 },
[QueueName.SEARCH]: { concurrency: 5 },
[QueueName.SIDECAR]: { concurrency: 5 },
[QueueName.LIBRARY]: { concurrency: 5 },
@ -73,7 +73,7 @@ const updatedConfig = Object.freeze<SystemConfig>({
modelName: 'buffalo_l',
minScore: 0.7,
maxDistance: 0.6,
minFaces: 1,
minFaces: 3,
},
},
map: {


@ -15,7 +15,7 @@ export class AssetFaceEntity {
personId!: string | null;
@Index('face_index', { synchronize: false })
@Column({ type: 'float4', array: true, select: false })
@Column({ type: 'float4', array: true, select: false, transformer: { from: (v) => JSON.parse(v), to: (v) => v } })
embedding!: number[];
@Column({ default: 0, type: 'int' })
@ -39,6 +39,10 @@ export class AssetFaceEntity {
@ManyToOne(() => AssetEntity, (asset) => asset.faces, { onDelete: 'CASCADE', onUpdate: 'CASCADE' })
asset!: AssetEntity;
@ManyToOne(() => PersonEntity, (person) => person.faces, { onDelete: 'CASCADE', onUpdate: 'CASCADE', nullable: true })
@ManyToOne(() => PersonEntity, (person) => person.faces, {
onDelete: 'SET NULL',
onUpdate: 'CASCADE',
nullable: true,
})
person!: PersonEntity | null;
}
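
Two details above are easy to miss: the person relation now uses `ON DELETE SET NULL` so faces survive person deletion, and the `embedding` column gains a transformer because the driver returns the vector as text. A standalone sketch of that transformer, assuming the wire format is JSON-compatible text like `"[0.1,0.2]"`:

```ts
import { ValueTransformer } from 'typeorm';

// from: parse the "[0.1,0.2,...]" text the driver hands back into number[];
// to: pass the array through unchanged (inserts go through asVector() in the repository).
const embeddingTransformer: ValueTransformer = {
  from: (value: string): number[] => JSON.parse(value),
  to: (value: number[]) => value,
};
```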


@ -1,4 +1,4 @@
import { QueueName } from '@app/domain';
import { ConcurrentQueueName } from '@app/domain';
import { Column, Entity, PrimaryColumn } from 'typeorm';
@Entity('system_config')
@ -35,7 +35,7 @@ export enum SystemConfigKey {
JOB_THUMBNAIL_GENERATION_CONCURRENCY = 'job.thumbnailGeneration.concurrency',
JOB_METADATA_EXTRACTION_CONCURRENCY = 'job.metadataExtraction.concurrency',
JOB_VIDEO_CONVERSION_CONCURRENCY = 'job.videoConversion.concurrency',
JOB_RECOGNIZE_FACES_CONCURRENCY = 'job.recognizeFaces.concurrency',
JOB_FACE_DETECTION_CONCURRENCY = 'job.faceDetection.concurrency',
JOB_CLIP_ENCODING_CONCURRENCY = 'job.smartSearch.concurrency',
JOB_BACKGROUND_TASK_CONCURRENCY = 'job.backgroundTask.concurrency',
JOB_STORAGE_TEMPLATE_MIGRATION_CONCURRENCY = 'job.storageTemplateMigration.concurrency',
@ -176,7 +176,7 @@ export interface SystemConfig {
accel: TranscodeHWAccel;
tonemap: ToneMapping;
};
job: Record<Exclude<QueueName, QueueName.STORAGE_TEMPLATE_MIGRATION>, { concurrency: number }>;
job: Record<ConcurrentQueueName, { concurrency: number }>;
logging: {
enabled: boolean;
level: LogLevel;


@ -1,6 +1,6 @@
import { Paginated, PaginationOptions } from '@app/domain';
import _ from 'lodash';
import { Between, FindOneOptions, LessThanOrEqual, MoreThanOrEqual, ObjectLiteral, Repository } from 'typeorm';
import { Between, FindManyOptions, LessThanOrEqual, MoreThanOrEqual, ObjectLiteral, Repository } from 'typeorm';
import { chunks, setUnion } from '../domain/domain.util';
import { DATABASE_PARAMETER_CHUNK_SIZE } from './infra.util';
@ -21,14 +21,19 @@ export function OptionalBetween<T>(from?: T, to?: T) {
export async function paginate<Entity extends ObjectLiteral>(
repository: Repository<Entity>,
paginationOptions: PaginationOptions,
searchOptions?: FindOneOptions<Entity>,
searchOptions?: FindManyOptions<Entity>,
): Paginated<Entity> {
const items = await repository.find({
...searchOptions,
// Take one more item to check if there's a next page
take: paginationOptions.take + 1,
skip: paginationOptions.skip,
});
const items = await repository.find(
_.omitBy(
{
...searchOptions,
// Take one more item to check if there's a next page
take: paginationOptions.take + 1,
skip: paginationOptions.skip,
},
_.isUndefined,
),
);
const hasNextPage = items.length > paginationOptions.take;
items.splice(paginationOptions.take);
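
The lodash `omitBy` guard keeps explicitly-undefined keys (such as the `where: undefined` that `getAllFaces` can pass) out of the options object handed to TypeORM's `find`. A tiny sketch of the effect, with illustrative values:

```ts
import _ from 'lodash';

const options = { where: undefined, take: 251, skip: 0 };
const cleaned = _.omitBy(options, _.isUndefined);
console.log(cleaned); // => { take: 251, skip: 0 }
```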


@ -0,0 +1,24 @@
import { MigrationInterface, QueryRunner } from "typeorm"
export class SetAssetFaceNullOnPersonDelete1704943345360 implements MigrationInterface {
public async up(queryRunner: QueryRunner): Promise<void> {
await queryRunner.query(`
ALTER TABLE "asset_faces"
DROP CONSTRAINT "FK_95ad7106dd7b484275443f580f9",
ADD CONSTRAINT "FK_95ad7106dd7b484275443f580f9"
FOREIGN KEY ("personId") REFERENCES "person"("id")
ON DELETE SET NULL ON UPDATE CASCADE
`);
}
public async down(queryRunner: QueryRunner): Promise<void> {
await queryRunner.query(`
ALTER TABLE "asset_faces"
DROP CONSTRAINT "FK_95ad7106dd7b484275443f580f9",
ADD CONSTRAINT "FK_95ad7106dd7b484275443f580f9"
FOREIGN KEY ("personId") REFERENCES "person"("id")
ON DELETE CASCADE ON UPDATE CASCADE
`);
}
}
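
With the constraint switched to `ON DELETE SET NULL`, deleting a person no longer cascades to its detected faces; their `personId` is simply cleared. Those unassigned faces are exactly what the "Missing" facial recognition mode re-queues, as in this sketch (assuming a TypeORM repository for `AssetFaceEntity`):

```ts
import { IsNull, Repository } from 'typeorm';
import { AssetFaceEntity } from '@app/infra/entities';

declare const assetFaceRepository: Repository<AssetFaceEntity>;

// Faces that lost (or never had) a person; handleQueueRecognizeFaces pages
// through these when force is false.
const unassigned = assetFaceRepository.find({ where: { personId: IsNull() } });
```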


@ -25,7 +25,18 @@ import { InjectRepository } from '@nestjs/typeorm';
import _ from 'lodash';
import { DateTime } from 'luxon';
import path from 'path';
import { And, Brackets, FindOptionsRelations, FindOptionsWhere, In, IsNull, LessThan, Not, Repository } from 'typeorm';
import {
And,
Brackets,
FindOptionsRelations,
FindOptionsSelect,
FindOptionsWhere,
In,
IsNull,
LessThan,
Not,
Repository,
} from 'typeorm';
import { AssetEntity, AssetJobStatusEntity, AssetType, ExifEntity, SmartInfoEntity } from '../entities';
import { DummyValue, GenerateSql } from '../infra.util';
import { Chunked, ChunkedArray, OptionalBetween, paginate } from '../infra.utils';
@ -103,6 +114,7 @@ export class AssetRepository implements IAssetRepository {
withExif: _withExif,
withStacked,
withPeople,
withSmartInfo,
order,
} = options;
@ -174,6 +186,10 @@ export class AssetRepository implements IAssetRepository {
builder.leftJoinAndSelect('asset.stack', 'stack');
}
if (withSmartInfo) {
builder.leftJoinAndSelect('asset.smartInfo', 'smartInfo');
}
if (withDeleted) {
builder.withDeleted();
}
@ -250,7 +266,11 @@ export class AssetRepository implements IAssetRepository {
@GenerateSql({ params: [[DummyValue.UUID]] })
@ChunkedArray()
getByIds(ids: string[], relations?: FindOptionsRelations<AssetEntity>): Promise<AssetEntity[]> {
getByIds(
ids: string[],
relations?: FindOptionsRelations<AssetEntity>,
select?: FindOptionsSelect<AssetEntity>,
): Promise<AssetEntity[]> {
if (!relations) {
relations = {
exifInfo: true,
@ -262,9 +282,11 @@ export class AssetRepository implements IAssetRepository {
stack: true,
};
}
return this.repository.find({
where: { id: In(ids) },
relations,
select,
withDeleted: true,
});
}
@ -325,12 +347,11 @@ export class AssetRepository implements IAssetRepository {
deletedAt: options.trashedBefore ? And(Not(IsNull()), LessThan(options.trashedBefore)) : undefined,
},
relations: {
exifInfo: true,
smartInfo: true,
tags: true,
faces: {
person: true,
},
exifInfo: options.withExif !== false,
smartInfo: options.withSmartInfo !== false,
tags: options.withSmartInfo !== false,
faces: options.withFaces !== false,
smartSearch: options.withSmartInfo === true,
},
withDeleted: options.withDeleted ?? !!options.trashedBefore,
order: {
@ -519,6 +540,20 @@ export class AssetRepository implements IAssetRepository {
};
break;
case WithoutProperty.PERSON:
relations = {
faces: true,
};
where = {
resizePath: Not(IsNull()),
isVisible: true,
faces: {
assetId: Not(IsNull()),
personId: IsNull(),
},
};
break;
case WithoutProperty.SIDECAR:
where = [
{ sidecarPath: IsNull(), isVisible: true },


@ -64,7 +64,15 @@ export class FilesystemProvider implements IStorageRepository {
}
async unlink(file: string) {
await fs.unlink(file);
try {
await fs.unlink(file);
} catch (err) {
if ((err as NodeJS.ErrnoException)?.code === 'ENOENT') {
this.logger.warn(`File ${file} does not exist.`);
} else {
throw err;
}
}
}
stat = fs.stat;


@ -15,6 +15,7 @@ import { ModuleRef } from '@nestjs/core';
import { SchedulerRegistry } from '@nestjs/schedule';
import { Job, JobsOptions, Processor, Queue, Worker, WorkerOptions } from 'bullmq';
import { CronJob, CronTime } from 'cron';
import { setTimeout } from 'timers/promises';
import { bullConfig } from '../infra.config';
@Injectable()
@ -121,26 +122,47 @@ export class JobRepository implements IJobRepository {
return;
}
const itemsByQueue = items.reduce<Record<string, JobItem[]>>((acc, item) => {
const promises = [];
const itemsByQueue = {} as Record<string, (JobItem & { data: any; options: JobsOptions | undefined })[]>;
for (const item of items) {
const queueName = JOBS_TO_QUEUE[item.name];
acc[queueName] = acc[queueName] || [];
acc[queueName].push(item);
return acc;
}, {});
for (const [queueName, items] of Object.entries(itemsByQueue)) {
const queue = this.getQueue(queueName as QueueName);
const jobs = items.map((item) => ({
const job = {
name: item.name,
data: (item as { data?: any })?.data || {},
data: item.data || {},
options: this.getJobOptions(item) || undefined,
}));
await queue.addBulk(jobs);
} as JobItem & { data: any; options: JobsOptions | undefined };
if (job.options?.jobId) {
// need to use add() instead of addBulk() for jobId deduplication
promises.push(this.getQueue(queueName).add(item.name, item.data, job.options));
} else {
itemsByQueue[queueName] = itemsByQueue[queueName] || [];
itemsByQueue[queueName].push(job);
}
}
for (const [queueName, jobs] of Object.entries(itemsByQueue)) {
const queue = this.getQueue(queueName as QueueName);
promises.push(queue.addBulk(jobs));
}
await Promise.all(promises);
}
async queue(item: JobItem): Promise<void> {
await this.queueAll([item]);
return this.queueAll([item]);
}
async waitForQueueCompletion(...queues: QueueName[]): Promise<void> {
let activeQueue: QueueStatus | undefined;
do {
const statuses = await Promise.all(queues.map((name) => this.getQueueStatus(name)));
activeQueue = statuses.find((status) => status.isActive);
if (activeQueue) {
this.logger.verbose(`Waiting for an active queue to stop...`);
await setTimeout(1000);
}
} while (activeQueue);
}
private getJobOptions(item: JobItem): JobsOptions | null {
@ -149,6 +171,8 @@ export class JobRepository implements IJobRepository {
return { jobId: item.data.id };
case JobName.GENERATE_PERSON_THUMBNAIL:
return { priority: 1 };
case JobName.QUEUE_FACIAL_RECOGNITION:
return { jobId: JobName.QUEUE_FACIAL_RECOGNITION };
default:
return null;


@ -16,7 +16,7 @@ const errorPrefix = 'Machine learning request';
@Injectable()
export class MachineLearningRepository implements IMachineLearningRepository {
private async post<T>(url: string, input: TextModelInput | VisionModelInput, config: ModelConfig): Promise<T> {
private async predict<T>(url: string, input: TextModelInput | VisionModelInput, config: ModelConfig): Promise<T> {
const formData = await this.getFormData(input, config);
const res = await fetch(`${url}/predict`, { method: 'POST', body: formData }).catch((error: Error | any) => {
@ -31,11 +31,11 @@ export class MachineLearningRepository implements IMachineLearningRepository {
}
detectFaces(url: string, input: VisionModelInput, config: RecognitionConfig): Promise<DetectFaceResult[]> {
return this.post<DetectFaceResult[]>(url, input, { ...config, modelType: ModelType.FACIAL_RECOGNITION });
return this.predict<DetectFaceResult[]>(url, input, { ...config, modelType: ModelType.FACIAL_RECOGNITION });
}
encodeImage(url: string, input: VisionModelInput, config: CLIPConfig): Promise<number[]> {
return this.post<number[]>(url, input, {
return this.predict<number[]>(url, input, {
...config,
modelType: ModelType.CLIP,
mode: CLIPMode.VISION,
@ -43,7 +43,11 @@ export class MachineLearningRepository implements IMachineLearningRepository {
}
encodeText(url: string, input: TextModelInput, config: CLIPConfig): Promise<number[]> {
return this.post<number[]>(url, input, { ...config, modelType: ModelType.CLIP, mode: CLIPMode.TEXT } as CLIPConfig);
return this.predict<number[]>(url, input, {
...config,
modelType: ModelType.CLIP,
mode: CLIPMode.TEXT,
} as CLIPConfig);
}
async getFormData(input: TextModelInput | VisionModelInput, config: ModelConfig): Promise<FormData> {


@ -1,16 +1,19 @@
import {
AssetFaceId,
IPersonRepository,
Paginated,
PaginationOptions,
PersonNameSearchOptions,
PersonSearchOptions,
PersonStatistics,
UpdateFacesData,
} from '@app/domain';
import { InjectRepository } from '@nestjs/typeorm';
import { In, Repository } from 'typeorm';
import _ from 'lodash';
import { FindManyOptions, FindOptionsRelations, FindOptionsSelect, In, Repository } from 'typeorm';
import { AssetEntity, AssetFaceEntity, PersonEntity } from '../entities';
import { DummyValue, GenerateSql } from '../infra.util';
import { Chunked, ChunkedArray, asVector } from '../infra.utils';
import { ChunkedArray, asVector, paginate } from '../infra.utils';
export class PersonRepository implements IPersonRepository {
constructor(
@ -19,64 +22,44 @@ export class PersonRepository implements IPersonRepository {
@InjectRepository(AssetFaceEntity) private assetFaceRepository: Repository<AssetFaceEntity>,
) {}
/**
* Before reassigning faces, delete potential key violations
*/
async prepareReassignFaces({ oldPersonId, newPersonId }: UpdateFacesData): Promise<string[]> {
const results = await this.assetFaceRepository
.createQueryBuilder('face')
.select('face."assetId"')
.where(`face."personId" IN (:...ids)`, { ids: [oldPersonId, newPersonId] })
.groupBy('face."assetId"')
.having('COUNT(face."personId") > 1')
.getRawMany();
const assetIds = results.map(({ assetId }) => assetId);
await this.deletePersonFromAssets(oldPersonId, assetIds);
return assetIds;
}
@Chunked({ paramIndex: 1 })
async deletePersonFromAssets(personId: string, assetIds: string[]): Promise<void> {
await this.assetFaceRepository.delete({ personId: personId, assetId: In(assetIds) });
}
@GenerateSql({ params: [{ oldPersonId: DummyValue.UUID, newPersonId: DummyValue.UUID }] })
async reassignFaces({ oldPersonId, newPersonId }: UpdateFacesData): Promise<number> {
async reassignFaces({ oldPersonId, faceIds, newPersonId }: UpdateFacesData): Promise<number> {
const result = await this.assetFaceRepository
.createQueryBuilder()
.update()
.set({ personId: newPersonId })
.where({ personId: oldPersonId })
.where(
_.omitBy(
{ personId: oldPersonId ? oldPersonId : undefined, id: faceIds ? In(faceIds) : undefined },
_.isUndefined,
),
)
.execute();
return result.affected ?? 0;
}
delete(entity: PersonEntity): Promise<PersonEntity | null> {
return this.personRepository.remove(entity);
async delete(entities: PersonEntity[]): Promise<void> {
await this.personRepository.remove(entities);
}
async deleteAll(): Promise<number> {
const people = await this.personRepository.find();
await this.personRepository.remove(people);
return people.length;
async deleteAll(): Promise<void> {
await this.personRepository.delete({});
}
@GenerateSql()
getAllFaces(): Promise<AssetFaceEntity[]> {
return this.assetFaceRepository.find({ relations: { asset: true }, withDeleted: true });
async deleteAllFaces(): Promise<void> {
await this.assetFaceRepository.delete({});
}
@GenerateSql()
getAll(): Promise<PersonEntity[]> {
return this.personRepository.find();
getAllFaces(
pagination: PaginationOptions,
options: FindManyOptions<AssetFaceEntity> = {},
): Paginated<AssetFaceEntity> {
return paginate(this.assetFaceRepository, pagination, options);
}
@GenerateSql()
getAllWithoutThumbnail(): Promise<PersonEntity[]> {
return this.personRepository.findBy({ thumbnailPath: '' });
getAll(pagination: PaginationOptions, options: FindManyOptions<PersonEntity> = {}): Paginated<PersonEntity> {
return paginate(this.personRepository, pagination, options);
}
@GenerateSql({ params: [DummyValue.UUID] })
@ -133,14 +116,25 @@ export class PersonRepository implements IPersonRepository {
}
@GenerateSql({ params: [DummyValue.UUID] })
getFaceByIdWithAssets(id: string): Promise<AssetFaceEntity | null> {
return this.assetFaceRepository.findOne({
where: { id },
relations: {
person: true,
asset: true,
},
});
getFaceByIdWithAssets(
id: string,
relations: FindOptionsRelations<AssetFaceEntity>,
select: FindOptionsSelect<AssetFaceEntity>,
): Promise<AssetFaceEntity | null> {
return this.assetFaceRepository.findOne(
_.omitBy(
{
where: { id },
relations: {
...relations,
person: true,
asset: true,
},
select,
},
_.isUndefined,
),
);
}
@GenerateSql({ params: [DummyValue.UUID, DummyValue.UUID] })
@ -221,15 +215,11 @@ export class PersonRepository implements IPersonRepository {
return this.personRepository.save(entity);
}
async createFace(entity: AssetFaceEntity): Promise<AssetFaceEntity> {
if (!entity.personId) {
throw new Error('Person ID is required to create a face');
}
async createFace(entity: AssetFaceEntity): Promise<void> {
if (!entity.embedding) {
throw new Error('Embedding is required to create a face');
}
await this.assetFaceRepository.insert({ ...entity, embedding: () => asVector(entity.embedding, true) });
return this.assetFaceRepository.findOneByOrFail({ assetId: entity.assetId, personId: entity.personId });
}
async update(entity: Partial<PersonEntity>): Promise<PersonEntity> {


@ -1,4 +1,4 @@
import { Embedding, EmbeddingSearch, ISmartInfoRepository } from '@app/domain';
import { Embedding, EmbeddingSearch, FaceEmbeddingSearch, FaceSearchResult, ISmartInfoRepository } from '@app/domain';
import { getCLIPModelInfo } from '@app/domain/smart-info/smart-info.constant';
import { AssetEntity, AssetFaceEntity, SmartInfoEntity, SmartSearchEntity } from '@app/infra/entities';
import { ImmichLogger } from '@app/infra/logger';
@ -44,32 +44,33 @@ export class SmartInfoRepository implements ISmartInfoRepository {
params: [{ userIds: [DummyValue.UUID], embedding: Array.from({ length: 512 }, Math.random), numResults: 100 }],
})
async searchCLIP({ userIds, embedding, numResults, withArchived }: EmbeddingSearch): Promise<AssetEntity[]> {
if (!isValidInteger(numResults, { min: 1 })) {
throw new Error(`Invalid value for 'numResults': ${numResults}`);
}
let results: AssetEntity[] = [];
await this.assetRepository.manager.transaction(async (manager) => {
await manager.query(`SET LOCAL vectors.k = '${numResults}'`);
await manager.query(`SET LOCAL vectors.enable_prefilter = on`);
const query = manager
let query = manager
.createQueryBuilder(AssetEntity, 'a')
.innerJoin('a.smartSearch', 's')
.leftJoinAndSelect('a.exifInfo', 'e')
.where('a.ownerId IN (:...userIds )')
.andWhere('a.isVisible = true');
.orderBy('s.embedding <=> :embedding')
.setParameters({ userIds, embedding: asVector(embedding) });
if (!withArchived) {
query.andWhere('a.isArchived = false');
}
query.andWhere('a.isVisible = true').andWhere('a.fileCreatedAt < NOW()');
results = await query
.andWhere('a.fileCreatedAt < NOW()')
.leftJoinAndSelect('a.exifInfo', 'e')
.orderBy('s.embedding <=> :embedding')
.setParameters({ userIds, embedding: asVector(embedding) })
.limit(numResults)
.getMany();
if (numResults) {
if (!isValidInteger(numResults, { min: 1 })) {
throw new Error(`Invalid value for 'numResults': ${numResults}`);
}
query = query.limit(numResults);
await manager.query(`SET LOCAL vectors.k = '${numResults}'`);
}
results = await query.getMany();
});
return results;
@ -85,22 +86,38 @@ export class SmartInfoRepository implements ISmartInfoRepository {
},
],
})
async searchFaces({ userIds, embedding, numResults, maxDistance }: EmbeddingSearch): Promise<AssetFaceEntity[]> {
if (!isValidInteger(numResults, { min: 1 })) {
throw new Error(`Invalid value for 'numResults': ${numResults}`);
}
let results: AssetFaceEntity[] = [];
async searchFaces({
userIds,
embedding,
numResults,
maxDistance,
hasPerson,
}: FaceEmbeddingSearch): Promise<FaceSearchResult[]> {
let results: Array<AssetFaceEntity & { distance: number }> = [];
await this.assetRepository.manager.transaction(async (manager) => {
await manager.query(`SET LOCAL vectors.k = '${numResults}'`);
const cte = manager
await manager.query(`SET LOCAL vectors.enable_prefilter = on`);
let cte = manager
.createQueryBuilder(AssetFaceEntity, 'faces')
.select('1 + (faces.embedding <=> :embedding)', 'distance')
.innerJoin('faces.asset', 'asset')
.where('asset.ownerId IN (:...userIds )')
.orderBy('1 + (faces.embedding <=> :embedding)')
.setParameters({ userIds, embedding: asVector(embedding) })
.limit(numResults);
.setParameters({ userIds, embedding: asVector(embedding) });
if (numResults) {
if (!isValidInteger(numResults, { min: 1 })) {
throw new Error(`Invalid value for 'numResults': ${numResults}`);
}
cte = cte.limit(numResults);
if (numResults > 64) {
// setting k too low messes with prefilter recall
await manager.query(`SET LOCAL vectors.k = '${numResults}'`);
}
}
if (hasPerson) {
cte = cte.andWhere('faces."personId" IS NOT NULL');
}
this.faceColumns.forEach((col) => cte.addSelect(`faces.${col}`, col));
@ -113,7 +130,10 @@ export class SmartInfoRepository implements ISmartInfoRepository {
.getRawMany();
});
return this.assetFaceRepository.create(results);
return results.map((row) => ({
face: this.assetFaceRepository.create(row),
distance: row.distance,
}));
}
async upsert(smartInfo: Partial<SmartInfoEntity>, embedding?: Embedding): Promise<void> {


@ -7,80 +7,6 @@ SET
WHERE
"personId" = $2
-- PersonRepository.getAllFaces
SELECT
"AssetFaceEntity"."id" AS "AssetFaceEntity_id",
"AssetFaceEntity"."assetId" AS "AssetFaceEntity_assetId",
"AssetFaceEntity"."personId" AS "AssetFaceEntity_personId",
"AssetFaceEntity"."imageWidth" AS "AssetFaceEntity_imageWidth",
"AssetFaceEntity"."imageHeight" AS "AssetFaceEntity_imageHeight",
"AssetFaceEntity"."boundingBoxX1" AS "AssetFaceEntity_boundingBoxX1",
"AssetFaceEntity"."boundingBoxY1" AS "AssetFaceEntity_boundingBoxY1",
"AssetFaceEntity"."boundingBoxX2" AS "AssetFaceEntity_boundingBoxX2",
"AssetFaceEntity"."boundingBoxY2" AS "AssetFaceEntity_boundingBoxY2",
"AssetFaceEntity__AssetFaceEntity_asset"."id" AS "AssetFaceEntity__AssetFaceEntity_asset_id",
"AssetFaceEntity__AssetFaceEntity_asset"."deviceAssetId" AS "AssetFaceEntity__AssetFaceEntity_asset_deviceAssetId",
"AssetFaceEntity__AssetFaceEntity_asset"."ownerId" AS "AssetFaceEntity__AssetFaceEntity_asset_ownerId",
"AssetFaceEntity__AssetFaceEntity_asset"."libraryId" AS "AssetFaceEntity__AssetFaceEntity_asset_libraryId",
"AssetFaceEntity__AssetFaceEntity_asset"."deviceId" AS "AssetFaceEntity__AssetFaceEntity_asset_deviceId",
"AssetFaceEntity__AssetFaceEntity_asset"."type" AS "AssetFaceEntity__AssetFaceEntity_asset_type",
"AssetFaceEntity__AssetFaceEntity_asset"."originalPath" AS "AssetFaceEntity__AssetFaceEntity_asset_originalPath",
"AssetFaceEntity__AssetFaceEntity_asset"."resizePath" AS "AssetFaceEntity__AssetFaceEntity_asset_resizePath",
"AssetFaceEntity__AssetFaceEntity_asset"."webpPath" AS "AssetFaceEntity__AssetFaceEntity_asset_webpPath",
"AssetFaceEntity__AssetFaceEntity_asset"."thumbhash" AS "AssetFaceEntity__AssetFaceEntity_asset_thumbhash",
"AssetFaceEntity__AssetFaceEntity_asset"."encodedVideoPath" AS "AssetFaceEntity__AssetFaceEntity_asset_encodedVideoPath",
"AssetFaceEntity__AssetFaceEntity_asset"."createdAt" AS "AssetFaceEntity__AssetFaceEntity_asset_createdAt",
"AssetFaceEntity__AssetFaceEntity_asset"."updatedAt" AS "AssetFaceEntity__AssetFaceEntity_asset_updatedAt",
"AssetFaceEntity__AssetFaceEntity_asset"."deletedAt" AS "AssetFaceEntity__AssetFaceEntity_asset_deletedAt",
"AssetFaceEntity__AssetFaceEntity_asset"."fileCreatedAt" AS "AssetFaceEntity__AssetFaceEntity_asset_fileCreatedAt",
"AssetFaceEntity__AssetFaceEntity_asset"."localDateTime" AS "AssetFaceEntity__AssetFaceEntity_asset_localDateTime",
"AssetFaceEntity__AssetFaceEntity_asset"."fileModifiedAt" AS "AssetFaceEntity__AssetFaceEntity_asset_fileModifiedAt",
"AssetFaceEntity__AssetFaceEntity_asset"."isFavorite" AS "AssetFaceEntity__AssetFaceEntity_asset_isFavorite",
"AssetFaceEntity__AssetFaceEntity_asset"."isArchived" AS "AssetFaceEntity__AssetFaceEntity_asset_isArchived",
"AssetFaceEntity__AssetFaceEntity_asset"."isExternal" AS "AssetFaceEntity__AssetFaceEntity_asset_isExternal",
"AssetFaceEntity__AssetFaceEntity_asset"."isReadOnly" AS "AssetFaceEntity__AssetFaceEntity_asset_isReadOnly",
"AssetFaceEntity__AssetFaceEntity_asset"."isOffline" AS "AssetFaceEntity__AssetFaceEntity_asset_isOffline",
"AssetFaceEntity__AssetFaceEntity_asset"."checksum" AS "AssetFaceEntity__AssetFaceEntity_asset_checksum",
"AssetFaceEntity__AssetFaceEntity_asset"."duration" AS "AssetFaceEntity__AssetFaceEntity_asset_duration",
"AssetFaceEntity__AssetFaceEntity_asset"."isVisible" AS "AssetFaceEntity__AssetFaceEntity_asset_isVisible",
"AssetFaceEntity__AssetFaceEntity_asset"."livePhotoVideoId" AS "AssetFaceEntity__AssetFaceEntity_asset_livePhotoVideoId",
"AssetFaceEntity__AssetFaceEntity_asset"."originalFileName" AS "AssetFaceEntity__AssetFaceEntity_asset_originalFileName",
"AssetFaceEntity__AssetFaceEntity_asset"."sidecarPath" AS "AssetFaceEntity__AssetFaceEntity_asset_sidecarPath",
"AssetFaceEntity__AssetFaceEntity_asset"."stackParentId" AS "AssetFaceEntity__AssetFaceEntity_asset_stackParentId"
FROM
"asset_faces" "AssetFaceEntity"
LEFT JOIN "assets" "AssetFaceEntity__AssetFaceEntity_asset" ON "AssetFaceEntity__AssetFaceEntity_asset"."id" = "AssetFaceEntity"."assetId"
-- PersonRepository.getAll
SELECT
"PersonEntity"."id" AS "PersonEntity_id",
"PersonEntity"."createdAt" AS "PersonEntity_createdAt",
"PersonEntity"."updatedAt" AS "PersonEntity_updatedAt",
"PersonEntity"."ownerId" AS "PersonEntity_ownerId",
"PersonEntity"."name" AS "PersonEntity_name",
"PersonEntity"."birthDate" AS "PersonEntity_birthDate",
"PersonEntity"."thumbnailPath" AS "PersonEntity_thumbnailPath",
"PersonEntity"."faceAssetId" AS "PersonEntity_faceAssetId",
"PersonEntity"."isHidden" AS "PersonEntity_isHidden"
FROM
"person" "PersonEntity"
-- PersonRepository.getAllWithoutThumbnail
SELECT
"PersonEntity"."id" AS "PersonEntity_id",
"PersonEntity"."createdAt" AS "PersonEntity_createdAt",
"PersonEntity"."updatedAt" AS "PersonEntity_updatedAt",
"PersonEntity"."ownerId" AS "PersonEntity_ownerId",
"PersonEntity"."name" AS "PersonEntity_name",
"PersonEntity"."birthDate" AS "PersonEntity_birthDate",
"PersonEntity"."thumbnailPath" AS "PersonEntity_thumbnailPath",
"PersonEntity"."faceAssetId" AS "PersonEntity_faceAssetId",
"PersonEntity"."isHidden" AS "PersonEntity_isHidden"
FROM
"person" "PersonEntity"
WHERE
("PersonEntity"."thumbnailPath" = $1)
-- PersonRepository.getAllForUser
SELECT
"person"."id" AS "person_id",


@ -2,10 +2,10 @@
-- SmartInfoRepository.searchCLIP
START TRANSACTION
SET
LOCAL vectors.k = '100'
SET
LOCAL vectors.enable_prefilter = on
SET
LOCAL vectors.k = '100'
SELECT
"a"."id" AS "a_id",
"a"."deviceAssetId" AS "a_deviceAssetId",
@ -70,8 +70,8 @@ FROM
WHERE
(
"a"."ownerId" IN ($1)
AND "a"."isVisible" = true
AND "a"."isArchived" = false
AND "a"."isVisible" = true
AND "a"."fileCreatedAt" < NOW()
)
AND ("a"."deletedAt" IS NULL)
@ -83,6 +83,8 @@ COMMIT
-- SmartInfoRepository.searchFaces
START TRANSACTION
SET
LOCAL vectors.enable_prefilter = on
SET
LOCAL vectors.k = '100'
WITH


@ -62,11 +62,12 @@ export class AppService {
[JobName.QUEUE_METADATA_EXTRACTION]: (data) => this.metadataService.handleQueueMetadataExtraction(data),
[JobName.METADATA_EXTRACTION]: (data) => this.metadataService.handleMetadataExtraction(data),
[JobName.LINK_LIVE_PHOTOS]: (data) => this.metadataService.handleLivePhotoLinking(data),
[JobName.QUEUE_RECOGNIZE_FACES]: (data) => this.personService.handleQueueRecognizeFaces(data),
[JobName.RECOGNIZE_FACES]: (data) => this.personService.handleRecognizeFaces(data),
[JobName.QUEUE_FACE_DETECTION]: (data) => this.personService.handleQueueDetectFaces(data),
[JobName.FACE_DETECTION]: (data) => this.personService.handleDetectFaces(data),
[JobName.QUEUE_FACIAL_RECOGNITION]: (data) => this.personService.handleQueueRecognizeFaces(data),
[JobName.FACIAL_RECOGNITION]: (data) => this.personService.handleRecognizeFaces(data),
[JobName.GENERATE_PERSON_THUMBNAIL]: (data) => this.personService.handleGeneratePersonThumbnail(data),
[JobName.PERSON_CLEANUP]: () => this.personService.handlePersonCleanup(),
[JobName.PERSON_DELETE]: (data) => this.personService.handlePersonDelete(data),
[JobName.QUEUE_SIDECAR]: (data) => this.metadataService.handleQueueSidecar(data),
[JobName.SIDECAR_DISCOVERY]: (data) => this.metadataService.handleSidecarDiscovery(data),
[JobName.SIDECAR_SYNC]: () => this.metadataService.handleSidecarSync(),


@ -2,9 +2,11 @@ import { AssetFaceEntity } from '@app/infra/entities';
import { assetStub } from './asset.stub';
import { personStub } from './person.stub';
type NonNullableProperty<T> = { [P in keyof T]: NonNullable<T[P]> };
export const faceStub = {
face1: Object.freeze<AssetFaceEntity>({
id: 'assetFaceId',
face1: Object.freeze<NonNullableProperty<AssetFaceEntity>>({
id: 'assetFaceId1',
assetId: assetStub.image.id,
asset: assetStub.image,
personId: personStub.withName.id,
@ -17,8 +19,8 @@ export const faceStub = {
imageHeight: 1024,
imageWidth: 1024,
}),
primaryFace1: Object.freeze<AssetFaceEntity>({
id: 'assetFaceId',
primaryFace1: Object.freeze<NonNullableProperty<AssetFaceEntity>>({
id: 'assetFaceId2',
assetId: assetStub.image.id,
asset: assetStub.image,
personId: personStub.primaryPerson.id,
@ -31,8 +33,8 @@ export const faceStub = {
imageHeight: 1024,
imageWidth: 1024,
}),
mergeFace1: Object.freeze<AssetFaceEntity>({
id: 'assetFaceId',
mergeFace1: Object.freeze<NonNullableProperty<AssetFaceEntity>>({
id: 'assetFaceId3',
assetId: assetStub.image.id,
asset: assetStub.image,
personId: personStub.mergePerson.id,
@ -45,8 +47,8 @@ export const faceStub = {
imageHeight: 1024,
imageWidth: 1024,
}),
mergeFace2: Object.freeze<AssetFaceEntity>({
id: 'assetFaceId',
mergeFace2: Object.freeze<NonNullableProperty<AssetFaceEntity>>({
id: 'assetFaceId4',
assetId: assetStub.image1.id,
asset: assetStub.image1,
personId: personStub.mergePerson.id,
@ -59,8 +61,8 @@ export const faceStub = {
imageHeight: 1024,
imageWidth: 1024,
}),
start: Object.freeze<AssetFaceEntity>({
id: 'assetFaceId',
start: Object.freeze<NonNullableProperty<AssetFaceEntity>>({
id: 'assetFaceId5',
assetId: assetStub.image.id,
asset: assetStub.image,
personId: personStub.newThumbnail.id,
@ -73,8 +75,8 @@ export const faceStub = {
imageHeight: 1000,
imageWidth: 1000,
}),
middle: Object.freeze<AssetFaceEntity>({
id: 'assetFaceId',
middle: Object.freeze<NonNullableProperty<AssetFaceEntity>>({
id: 'assetFaceId6',
assetId: assetStub.image.id,
asset: assetStub.image,
personId: personStub.newThumbnail.id,
@ -87,8 +89,8 @@ export const faceStub = {
imageHeight: 500,
imageWidth: 400,
}),
end: Object.freeze<AssetFaceEntity>({
id: 'assetFaceId',
end: Object.freeze<NonNullableProperty<AssetFaceEntity>>({
id: 'assetFaceId7',
assetId: assetStub.image.id,
asset: assetStub.image,
personId: personStub.newThumbnail.id,
@ -101,4 +103,32 @@ export const faceStub = {
imageHeight: 500,
imageWidth: 500,
}),
noPerson1: Object.freeze<AssetFaceEntity>({
id: 'assetFaceId8',
assetId: assetStub.image.id,
asset: assetStub.image,
personId: null,
person: null,
embedding: [1, 2, 3, 4],
boundingBoxX1: 0,
boundingBoxY1: 0,
boundingBoxX2: 1,
boundingBoxY2: 1,
imageHeight: 1024,
imageWidth: 1024,
}),
noPerson2: Object.freeze<AssetFaceEntity>({
id: 'assetFaceId9',
assetId: assetStub.image.id,
asset: assetStub.image,
personId: null,
person: null,
embedding: [1, 2, 3, 4],
boundingBoxX1: 0,
boundingBoxY1: 0,
boundingBoxX2: 1,
boundingBoxY2: 1,
imageHeight: 1024,
imageWidth: 1024,
}),
};


@ -15,5 +15,6 @@ export const newJobRepositoryMock = (): jest.Mocked<IJobRepository> => {
getQueueStatus: jest.fn(),
getJobCounts: jest.fn(),
clear: jest.fn(),
waitForQueueCompletion: jest.fn(),
};
};


@ -4,7 +4,6 @@ export const newPersonRepositoryMock = (): jest.Mocked<IPersonRepository> => {
return {
getById: jest.fn(),
getAll: jest.fn(),
getAllWithoutThumbnail: jest.fn(),
getAllForUser: jest.fn(),
getAssets: jest.fn(),
getAllWithoutFaces: jest.fn(),
@ -15,6 +14,7 @@ export const newPersonRepositoryMock = (): jest.Mocked<IPersonRepository> => {
update: jest.fn(),
deleteAll: jest.fn(),
delete: jest.fn(),
deleteAllFaces: jest.fn(),
getStatistics: jest.fn(),
getAllFaces: jest.fn(),


@ -136,7 +136,8 @@ class ImmichApi {
[JobName.MetadataExtraction]: 'Extract Metadata',
[JobName.Sidecar]: 'Sidecar Metadata',
[JobName.SmartSearch]: 'Smart Search',
[JobName.RecognizeFaces]: 'Recognize Faces',
[JobName.FaceDetection]: 'Face Detection',
[JobName.FacialRecognition]: 'Facial Recognition',
[JobName.VideoConversion]: 'Transcode Videos',
[JobName.StorageTemplateMigration]: 'Storage Template Migration',
[JobName.Migration]: 'Migration',


@ -15,6 +15,7 @@
mdiImageSearch,
mdiLibraryShelves,
mdiTable,
mdiTagFaces,
mdiVideo,
} from '@mdi/js';
import ConfirmDialogue from '../../shared-components/confirm-dialogue.svelte';
@ -35,20 +36,23 @@
handleCommand?: (jobId: JobName, jobCommand: JobCommandDto) => Promise<void>;
}
let faceConfirm = false;
let confirmJob: JobName | null = null;
const handleFaceCommand = async (jobId: JobName, dto: JobCommandDto) => {
const handleConfirmCommand = async (jobId: JobName, dto: JobCommandDto) => {
if (dto.force) {
faceConfirm = true;
confirmJob = jobId;
return;
}
await handleCommand(jobId, dto);
};
const onFaceConfirm = () => {
faceConfirm = false;
handleCommand(JobName.RecognizeFaces, { command: JobCommand.Start, force: true });
const onConfirm = () => {
if (!confirmJob) {
return;
}
handleCommand(confirmJob, { command: JobCommand.Start, force: true });
confirmJob = null;
};
$: jobDetails = <Partial<Record<JobName, JobDetails>>>{
@ -83,11 +87,20 @@
subtitle: 'Run machine learning on assets to support smart search',
disabled: !$featureFlags.clipEncode,
},
[JobName.RecognizeFaces]: {
[JobName.FaceDetection]: {
icon: mdiFaceRecognition,
title: api.getJobName(JobName.RecognizeFaces),
subtitle: 'Run machine learning on assets to recognize faces',
handleCommand: handleFaceCommand,
title: api.getJobName(JobName.FaceDetection),
subtitle:
'Detect the faces in assets using machine learning. For videos, only the thumbnail is considered. "All" (re-)processes all assets. "Missing" queues assets that haven\'t been processed yet. Detected faces will be queued for Facial Recognition after Face Detection is complete, grouping them into existing or new people.',
handleCommand: handleConfirmCommand,
disabled: !$featureFlags.facialRecognition,
},
[JobName.FacialRecognition]: {
icon: mdiTagFaces,
title: api.getJobName(JobName.FacialRecognition),
subtitle:
'Group detected faces into people. This step runs after Face Detection is complete. "All" (re-)clusters all faces. "Missing" queues faces that don\'t have a person assigned.',
handleCommand: handleConfirmCommand,
disabled: !$featureFlags.facialRecognition,
},
[JobName.VideoConversion]: {
@ -131,11 +144,11 @@
}
</script>
{#if faceConfirm}
{#if confirmJob}
<ConfirmDialogue
prompt="Are you sure you want to reprocess all faces? This will also clear named people."
on:confirm={onFaceConfirm}
on:cancel={() => (faceConfirm = false)}
on:confirm={onConfirm}
on:cancel={() => (confirmJob = null)}
/>
{/if}


@ -1,5 +1,5 @@
<script lang="ts">
import { api, JobName, SystemConfigDto } from '@api';
import { api, JobName, SystemConfigDto, SystemConfigJobDto } from '@api';
import { isEqual } from 'lodash-es';
import { fade } from 'svelte/transition';
import SettingButtonsRow from '../setting-buttons-row.svelte';
@ -20,10 +20,16 @@
JobName.Library,
JobName.Sidecar,
JobName.SmartSearch,
JobName.RecognizeFaces,
JobName.FaceDetection,
JobName.FacialRecognition,
JobName.VideoConversion,
JobName.StorageTemplateMigration,
JobName.Migration,
];
function isSystemConfigJobDto(jobName: JobName): jobName is keyof SystemConfigJobDto {
return jobName in config.job;
}
</script>
<div>
@ -31,15 +37,26 @@
<form autocomplete="off" on:submit|preventDefault>
{#each jobNames as jobName}
<div class="ml-4 mt-4 flex flex-col gap-4">
<SettingInputField
inputType={SettingInputFieldType.NUMBER}
{disabled}
label="{api.getJobName(jobName)} Concurrency"
desc=""
bind:value={config.job[jobName].concurrency}
required={true}
isEdited={!(config.job[jobName].concurrency == savedConfig.job[jobName].concurrency)}
/>
{#if isSystemConfigJobDto(jobName)}
<SettingInputField
inputType={SettingInputFieldType.NUMBER}
{disabled}
label="{api.getJobName(jobName)} Concurrency"
desc=""
bind:value={config.job[jobName].concurrency}
required={true}
isEdited={!(config.job[jobName].concurrency == savedConfig.job[jobName].concurrency)}
/>
{:else}
<SettingInputField
inputType={SettingInputFieldType.NUMBER}
label="{api.getJobName(jobName)} Concurrency"
desc=""
value="1"
disabled={true}
title="This job is not concurrency-safe."
/>
{/if}
</div>
{/each}
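
Since `SystemConfigJobDto` is now keyed by `ConcurrentQueueName`, queues like Facial Recognition and Storage Template Migration have no concurrency setting, and `isSystemConfigJobDto` narrows `JobName` so the template can render a disabled field for them. The narrowing pattern in isolation, with illustrative names:

```ts
type ConcurrentJob = 'faceDetection' | 'smartSearch';

const concurrency: Record<ConcurrentJob, { concurrency: number }> = {
  faceDetection: { concurrency: 2 },
  smartSearch: { concurrency: 2 },
};

// `name in concurrency` lets TypeScript narrow string to ConcurrentJob.
function isConcurrent(name: string): name is ConcurrentJob {
  return name in concurrency;
}

const job: string = 'facialRecognition';
if (isConcurrent(job)) {
  console.log(concurrency[job].concurrency);
} else {
  console.log(`${job} is not concurrency-safe`); // this branch runs
}
```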


@ -82,7 +82,7 @@
<SettingSelect
label="FACIAL RECOGNITION MODEL"
desc="Models are listed in descending order of size. Larger models are slower and use more memory, but produce better results. Note that you must re-run the Recognize Faces job for all images upon changing a model."
desc="Models are listed in descending order of size. Larger models are slower and use more memory, but produce better results. Note that you must re-run the Face Detection job for all images upon changing a model."
name="facial-recognition-model"
bind:value={config.machineLearning.facialRecognition.modelName}
options={[
@ -124,8 +124,8 @@
<SettingInputField
inputType={SettingInputFieldType.NUMBER}
label="MIN FACES DETECTED"
desc="The minimum number of faces of a person that must be detected for them to appear in the People tab. Setting this to a value greater than 1 can prevent strangers or blurry faces that are not the main subject of the image from being displayed."
label="MIN RECOGNIZED FACES"
desc="The minimum number of recognized faces for a person to be created. Increasing this makes Facial Recognition more precise at the cost of increasing the chance that a face is not assigned to a person."
bind:value={config.machineLearning.facialRecognition.minFaces}
step="1"
min="1"


@ -18,6 +18,7 @@
export let step = '1';
export let label = '';
export let desc = '';
export let title = '';
export let required = false;
export let disabled = false;
export let isEdited = false;
@ -69,5 +70,6 @@
{value}
on:input={handleInput}
{disabled}
{title}
/>
</div>