Mirror of https://github.com/immich-app/immich.git (synced 2024-12-22 01:47:08 +02:00)
feat(server,web): libraries (#3124)
* feat: libraries Co-authored-by: Jason Rasmussen <jrasm91@gmail.com> Co-authored-by: Alex <alex.tran1502@gmail.com>
This commit is contained in:
parent 816db700e1
commit acdc66413c
1044
cli/src/api/open-api/api.ts
generated
File diff suppressed because it is too large
148
docs/docs/features/libraries.md
Normal file
@ -0,0 +1,148 @@
# Libraries

## Overview

Immich supports the creation of libraries, which are top-level asset containers. Currently there are two types of libraries: traditional upload libraries, which can sync with a mobile device, and external libraries, which keep up to date with files on disk. Libraries differ from albums in that an asset can belong to multiple albums but only one library, and deleting a library deletes all assets contained within it. As of August 2023, this is a new feature, and libraries have a lot of potential for future development beyond what is documented here. This document describes the current state of libraries.

## The Upload Library

Immich comes preconfigured with an upload library for each user. All assets uploaded to Immich are added to this library. This library can be renamed, but not deleted. The upload library is the only library that can be synced with a mobile device. To prevent duplicates, no item in an upload library may have the same SHA-1 hash as another item in the same library.
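
As an illustration of what this checksum-based duplicate check can look like (a minimal sketch, not Immich's actual implementation; the lookup callback is an assumption), the server could hash an upload with SHA-1 and skip it when the same checksum already exists in the library:

```typescript
import { createHash } from 'node:crypto';
import { readFile } from 'node:fs/promises';

// Hypothetical lookup: resolves to true when the library already holds an asset with this checksum.
type ChecksumLookup = (libraryId: string, checksum: Buffer) => Promise<boolean>;

async function isDuplicateUpload(
  libraryId: string,
  filePath: string,
  existsByChecksum: ChecksumLookup,
): Promise<boolean> {
  const data = await readFile(filePath);
  const checksum = createHash('sha1').update(data).digest(); // SHA-1 over the file contents
  return existsByChecksum(libraryId, checksum);
}
```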

## External Libraries

External libraries track assets stored outside of Immich, i.e. in the file system. Immich only reads data from the files and never modifies them. Therefore, the delete button is disabled for external assets. When an external library is scanned, Immich reads the metadata from each image or video file and creates a corresponding asset in the library. These items are then shown in the main timeline, and they look and behave like any other asset, including viewing on the map, adding to albums, etc.

If a file is modified outside of Immich, the changes are not reflected in Immich until the library is scanned again. There are different ways to scan a library depending on the use case (a sketch of the modification-time check follows this list):

- Scan Library Files: This is the default scan method and also the quickest. It scans all files in the library and adds new files to it. It notices if any files are missing (see below) but does not check existing assets.
- Scan All Library Files: Same as above, but also checks each existing asset to see whether its modification time has changed. If it has, the asset is updated. Since it has to check each asset, this is slower than Scan Library Files.
- Force Scan All Library Files: Same as above, but reads each asset from disk regardless of its modification time. This is useful when an asset has been modified externally but its modification time has not changed. Because it reads every asset from disk, this is the slowest scan method.
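
As a rough sketch of how the second and third modes can differ (illustrative only; the field and function names below are assumptions, not Immich's code), re-reading a file can be decided by comparing its modification time on disk with the time recorded at the previous scan:

```typescript
import { stat } from 'node:fs/promises';

interface KnownAsset {
  originalPath: string;
  fileModifiedAt: Date; // modification time recorded at the previous scan
}

// forceRefresh mirrors "Force Scan All Library Files"; otherwise only files with a newer mtime are re-read.
async function shouldRefresh(asset: KnownAsset, forceRefresh: boolean): Promise<boolean> {
  if (forceRefresh) {
    return true;
  }
  const stats = await stat(asset.originalPath);
  return stats.mtime > asset.fileModifiedAt;
}
```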

:::caution

Due to aggressive caching it can take some time for a refreshed asset to appear correctly in the web view. You need to clear your browser cache to see the changes. This is a known issue that will be fixed in a future release. In Chrome, open the developer console with F12, reload the page with F5, then right-click the reload button and select "Empty Cache and Hard Reload".

:::

In external libraries, the file path is used for duplicate detection. This means that if a file is moved to a different location, it is added as a new asset on the next scan. In contrast to upload libraries, two identical files can be added if they are in different locations. This is a deliberate design choice to make Immich reflect the file system as closely as possible. Remember that duplicate detection is only done within the same library, so if you have multiple external libraries, the same file can be added to each of them.

:::caution

If you add assets from an external library to an album and then move them to another location within the library, they are removed from the album upon rescan. This is because an asset is considered new after the move. This is a known issue that will be fixed in a future release.

:::

### Deleted External Assets

In all of the above scan methods, Immich checks whether any files are missing. This can happen if files are deleted, or if they are on a storage location that is currently unavailable, such as a network drive that is not mounted or a USB drive that has been unplugged. To prevent accidental deletion of assets, Immich does not immediately delete an asset from the library when its file is missing. Instead, the asset is internally marked as offline and remains visible in the main timeline. If the file is moved back to its original location and the library is scanned again, the asset is restored.

Finally, files can be deleted from Immich via the `Remove Offline Files` job. Any assets marked as offline are then removed from Immich. Run this job whenever files have been deleted from the file system and you want to remove them from Immich. Note that a library scan must be performed first so that the assets are marked as offline.

### Import Paths

External libraries use import paths to determine which files to scan. Each library can have multiple import paths so that files from different locations can be added to the same library. Import paths are scanned recursively, and if a file is in multiple import paths, it is only added once. If the import paths are edited so that an external file is no longer in any import path, it is removed from the library in the same way a deleted file would be. If the file is moved back into an import path, it is added again as if it were a new file.

### Security Considerations

For security reasons, Immich users are not allowed to add external files by default. This prevents devastating [path traversal attacks](https://owasp.org/www-community/attacks/Path_Traversal). An admin can allow individual users to use the external path feature via the `external path` setting found in the admin panel. Without the external path restriction, a user could import any image or video file on the Immich host filesystem, potentially exposing sensitive data. If you are running Immich as root in your Docker setup (which is the default), all external file reads are done with root privileges. This is particularly dangerous if the Immich host is a shared server.

With the `external path` set, a user is restricted to accessing files or directories within that path. The Immich admin should still be careful not to set the external path too generously. For example, `user1` wants to read their photos in `/home/user1`. A lazy admin sets that user's external path to `/home/` since it "gets the job done". However, that user can then read all photos in `/home/user2/private-photos`, too! Set the external path as specifically as possible. If multiple folders must be added, do this using the Docker volume mount feature described below.
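
To illustrate the kind of containment check this restriction implies (a conceptual sketch under stated assumptions, not Immich's actual code), a requested path can be resolved and verified to lie inside the user's configured external path before any file is read:

```typescript
import path from 'node:path';

// Returns true only when requestedPath resolves to a location inside (or equal to) externalPath.
function isInsideExternalPath(externalPath: string, requestedPath: string): boolean {
  const base = path.resolve(externalPath);
  const target = path.resolve(requestedPath);
  const relative = path.relative(base, target);
  // "" means the paths are identical; a result starting with ".." or an absolute path escapes the base.
  return relative === '' || (!relative.startsWith('..') && !path.isAbsolute(relative));
}

// isInsideExternalPath('/home/user1', '/home/user1/photos/a.jpg')                  -> true
// isInsideExternalPath('/home/user1', '/home/user1/../user2/private-photos/b.jpg') -> false
```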

### Exclusion Patterns and Scan Settings

By default, all files in the import paths are added to the library. If there are files that should not be added, exclusion patterns can be used to exclude them. Exclusion patterns are glob patterns that are matched against the full file path. If a file matches an exclusion pattern, it is not added to the library. Exclusion patterns can be added in the Scan Settings page for each library. Under the hood, Immich uses the [glob](https://www.npmjs.com/package/glob) package to match patterns, so please refer to [their documentation](https://github.com/isaacs/node-glob#glob-primer) to see which patterns are supported. A small sketch follows the examples below.

Some basic examples:

- `*.tif` will exclude all files with the extension `.tif`
- `hidden.jpg` will exclude all files named `hidden.jpg`
- `**/Raw/**` will exclude all files in any directory named `Raw`
- `*.(tif,jpg)` will exclude all files with the extension `.tif` or `.jpg`
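
Here is a minimal sketch of how exclusion patterns can be applied while crawling an import path with the glob package (illustrative only; the exact file-type filter and options Immich passes may differ, and how relative patterns such as `*.tif` anchor depends on the matcher configuration):

```typescript
import { glob } from 'glob';

// Crawl an import path for common image/video files, skipping anything matched by the exclusion patterns.
async function crawlImportPath(importPath: string, exclusionPatterns: string[]): Promise<string[]> {
  return glob(`${importPath}/**/*.{jpg,jpeg,png,heic,mp4,mov}`, {
    absolute: true,            // return full file paths
    nocase: true,              // match extensions case-insensitively
    ignore: exclusionPatterns, // e.g. ['**/Raw/**']
  });
}

// crawlImportPath('/mnt/media/christmas-trip', ['**/Raw/**']).then((files) => console.log(files));
```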

## Usage

Let's walk through a concrete example of adding an existing gallery to Immich. Here, we have the following folders we want to add:

- `/home/user/old-pics`: a folder containing childhood photos.
- `/mnt/nas/christmas-trip`: photos from a Christmas trip. The subfolder `/mnt/nas/christmas-trip/Raw` contains the raw files straight from the DSLR. We don't want to import the raw files into Immich.
- `/mnt/media/videos`: videos from the same Christmas trip.

First, we need to plan how we want to organize the libraries. The Christmas trip photos should belong to their own library since we want to exclude the raw files. The videos and old photos can share a library since we want to import all of their files. We could also add all three folders to the same library if no files in the other folders match the Raw exclusion pattern.

### Mount Docker Volumes

The `immich-server` and `immich-microservices` containers need access to the gallery. Modify your docker compose file as follows:

```diff title="docker-compose.yml"
  immich-server:
    volumes:
      - ${UPLOAD_LOCATION}:/usr/src/app/upload
+     - /mnt/nas/christmas-trip:/mnt/media/christmas-trip:ro
+     - /home/user/old-pics:/mnt/media/old-pics:ro
+     - /mnt/media/videos:/mnt/media/videos:ro

  immich-microservices:
    volumes:
      - ${UPLOAD_LOCATION}:/usr/src/app/upload
+     - /mnt/nas/christmas-trip:/mnt/media/christmas-trip:ro
+     - /home/user/old-pics:/mnt/media/old-pics:ro
+     - /mnt/media/videos:/mnt/media/videos:ro
```

:::tip
The `ro` flag at the end gives the container read-only access to the volume. While Immich does not modify files, mounting read-only is good practice.
:::

_Remember to bring the containers down and back up to register the changes. Make sure you can see the mounted path in the container._

### Set External Path

Only an admin can do this.

- Navigate to the `Administration > Users` page on the web.
- Click the user's edit button.
- Set `/mnt/media` as the external path. This folder only contains the three folders we want to import, so nothing else can be accessed.

### Create External Libraries

- Click on your user name in the top right corner -> Account Settings
- Click on Libraries
- Click on Create External Library
- Click the drop-down menu on the newly created library
- Click on Rename Library and rename it to "Christmas Trip"
- Click Edit Import Paths
- Click on Add Path
- Enter `/mnt/media/christmas-trip` then click Add

NOTE: We have to use the `/mnt/media/christmas-trip` path, not the `/mnt/nas/christmas-trip` path, since all paths must be what the Docker containers see.

Next, we'll add an exclusion pattern to filter out raw files.

- Click the drop-down menu on the newly created Christmas library
- Click on Manage
- Click on Scan Settings
- Click on Add Exclusion Pattern
- Enter `**/Raw/**` and click Save
- Click Save
- Click the drop-down menu on the newly created library
- Click on Scan Library Files

The Christmas trip library will now be scanned in the background. In the meantime, let's add the videos and old photos to another library.

- Click on Create External Library.

:::info Note
If you get an error here, please rename the other external library to something else. This is a bug that will be fixed in a future release.
:::

- Click the drop-down menu on the newly created library
- Click Edit Import Paths
- Click on Add Path
- Enter `/mnt/media/old-pics` then click Add
- Click on Add Path
- Enter `/mnt/media/videos` then click Add
- Click Save
- Click on Scan Library Files

Within seconds, the assets from the old-pics and videos folders should show up in the main timeline.
21
mobile/openapi/.openapi-generator/FILES
generated
@ -46,6 +46,7 @@ doc/CheckExistingAssetsResponseDto.md
|
||||
doc/ClassificationConfig.md
|
||||
doc/Colorspace.md
|
||||
doc/CreateAlbumDto.md
|
||||
doc/CreateLibraryDto.md
|
||||
doc/CreateProfileImageResponseDto.md
|
||||
doc/CreateTagDto.md
|
||||
doc/CreateUserDto.md
|
||||
@ -67,6 +68,10 @@ doc/JobCountsDto.md
|
||||
doc/JobName.md
|
||||
doc/JobSettingsDto.md
|
||||
doc/JobStatusDto.md
|
||||
doc/LibraryApi.md
|
||||
doc/LibraryResponseDto.md
|
||||
doc/LibraryStatsResponseDto.md
|
||||
doc/LibraryType.md
|
||||
doc/LoginCredentialDto.md
|
||||
doc/LoginResponseDto.md
|
||||
doc/LogoutResponseDto.md
|
||||
@ -88,6 +93,7 @@ doc/PersonResponseDto.md
|
||||
doc/PersonUpdateDto.md
|
||||
doc/QueueStatusDto.md
|
||||
doc/RecognitionConfig.md
|
||||
doc/ScanLibraryDto.md
|
||||
doc/SearchAlbumResponseDto.md
|
||||
doc/SearchApi.md
|
||||
doc/SearchAssetDto.md
|
||||
@ -134,6 +140,7 @@ doc/TranscodeHWAccel.md
|
||||
doc/TranscodePolicy.md
|
||||
doc/UpdateAlbumDto.md
|
||||
doc/UpdateAssetDto.md
|
||||
doc/UpdateLibraryDto.md
|
||||
doc/UpdateTagDto.md
|
||||
doc/UpdateUserDto.md
|
||||
doc/UsageByUserDto.md
|
||||
@ -150,6 +157,7 @@ lib/api/asset_api.dart
|
||||
lib/api/audit_api.dart
|
||||
lib/api/authentication_api.dart
|
||||
lib/api/job_api.dart
|
||||
lib/api/library_api.dart
|
||||
lib/api/o_auth_api.dart
|
||||
lib/api/partner_api.dart
|
||||
lib/api/person_api.dart
|
||||
@ -205,6 +213,7 @@ lib/model/clip_mode.dart
|
||||
lib/model/colorspace.dart
|
||||
lib/model/cq_mode.dart
|
||||
lib/model/create_album_dto.dart
|
||||
lib/model/create_library_dto.dart
|
||||
lib/model/create_profile_image_response_dto.dart
|
||||
lib/model/create_tag_dto.dart
|
||||
lib/model/create_user_dto.dart
|
||||
@ -225,6 +234,9 @@ lib/model/job_counts_dto.dart
|
||||
lib/model/job_name.dart
|
||||
lib/model/job_settings_dto.dart
|
||||
lib/model/job_status_dto.dart
|
||||
lib/model/library_response_dto.dart
|
||||
lib/model/library_stats_response_dto.dart
|
||||
lib/model/library_type.dart
|
||||
lib/model/login_credential_dto.dart
|
||||
lib/model/login_response_dto.dart
|
||||
lib/model/logout_response_dto.dart
|
||||
@ -243,6 +255,7 @@ lib/model/person_response_dto.dart
|
||||
lib/model/person_update_dto.dart
|
||||
lib/model/queue_status_dto.dart
|
||||
lib/model/recognition_config.dart
|
||||
lib/model/scan_library_dto.dart
|
||||
lib/model/search_album_response_dto.dart
|
||||
lib/model/search_asset_dto.dart
|
||||
lib/model/search_asset_response_dto.dart
|
||||
@ -284,6 +297,7 @@ lib/model/transcode_hw_accel.dart
|
||||
lib/model/transcode_policy.dart
|
||||
lib/model/update_album_dto.dart
|
||||
lib/model/update_asset_dto.dart
|
||||
lib/model/update_library_dto.dart
|
||||
lib/model/update_tag_dto.dart
|
||||
lib/model/update_user_dto.dart
|
||||
lib/model/usage_by_user_dto.dart
|
||||
@ -335,6 +349,7 @@ test/clip_mode_test.dart
|
||||
test/colorspace_test.dart
|
||||
test/cq_mode_test.dart
|
||||
test/create_album_dto_test.dart
|
||||
test/create_library_dto_test.dart
|
||||
test/create_profile_image_response_dto_test.dart
|
||||
test/create_tag_dto_test.dart
|
||||
test/create_user_dto_test.dart
|
||||
@ -356,6 +371,10 @@ test/job_counts_dto_test.dart
|
||||
test/job_name_test.dart
|
||||
test/job_settings_dto_test.dart
|
||||
test/job_status_dto_test.dart
|
||||
test/library_api_test.dart
|
||||
test/library_response_dto_test.dart
|
||||
test/library_stats_response_dto_test.dart
|
||||
test/library_type_test.dart
|
||||
test/login_credential_dto_test.dart
|
||||
test/login_response_dto_test.dart
|
||||
test/logout_response_dto_test.dart
|
||||
@ -377,6 +396,7 @@ test/person_response_dto_test.dart
|
||||
test/person_update_dto_test.dart
|
||||
test/queue_status_dto_test.dart
|
||||
test/recognition_config_test.dart
|
||||
test/scan_library_dto_test.dart
|
||||
test/search_album_response_dto_test.dart
|
||||
test/search_api_test.dart
|
||||
test/search_asset_dto_test.dart
|
||||
@ -423,6 +443,7 @@ test/transcode_hw_accel_test.dart
|
||||
test/transcode_policy_test.dart
|
||||
test/update_album_dto_test.dart
|
||||
test/update_asset_dto_test.dart
|
||||
test/update_library_dto_test.dart
|
||||
test/update_tag_dto_test.dart
|
||||
test/update_user_dto_test.dart
|
||||
test/usage_by_user_dto_test.dart
|
||||
|
BIN
mobile/openapi/README.md
generated
BIN
mobile/openapi/README.md
generated
Binary file not shown.
BIN
mobile/openapi/doc/AllJobStatusResponseDto.md
generated
BIN
mobile/openapi/doc/AllJobStatusResponseDto.md
generated
Binary file not shown.
BIN
mobile/openapi/doc/AssetApi.md
generated
BIN
mobile/openapi/doc/AssetApi.md
generated
Binary file not shown.
BIN
mobile/openapi/doc/AssetResponseDto.md
generated
BIN
mobile/openapi/doc/AssetResponseDto.md
generated
Binary file not shown.
BIN
mobile/openapi/doc/CreateLibraryDto.md
generated
Normal file
BIN
mobile/openapi/doc/CreateLibraryDto.md
generated
Normal file
Binary file not shown.
BIN
mobile/openapi/doc/ImportAssetDto.md
generated
BIN
mobile/openapi/doc/ImportAssetDto.md
generated
Binary file not shown.
BIN
mobile/openapi/doc/LibraryApi.md
generated
Normal file
BIN
mobile/openapi/doc/LibraryApi.md
generated
Normal file
Binary file not shown.
BIN
mobile/openapi/doc/LibraryResponseDto.md
generated
Normal file
BIN
mobile/openapi/doc/LibraryResponseDto.md
generated
Normal file
Binary file not shown.
BIN
mobile/openapi/doc/LibraryStatsResponseDto.md
generated
Normal file
BIN
mobile/openapi/doc/LibraryStatsResponseDto.md
generated
Normal file
Binary file not shown.
BIN
mobile/openapi/doc/LibraryType.md
generated
Normal file
BIN
mobile/openapi/doc/LibraryType.md
generated
Normal file
Binary file not shown.
BIN
mobile/openapi/doc/ScanLibraryDto.md
generated
Normal file
BIN
mobile/openapi/doc/ScanLibraryDto.md
generated
Normal file
Binary file not shown.
BIN
mobile/openapi/doc/SystemConfigJobDto.md
generated
BIN
mobile/openapi/doc/SystemConfigJobDto.md
generated
Binary file not shown.
BIN
mobile/openapi/doc/UpdateLibraryDto.md
generated
Normal file
BIN
mobile/openapi/doc/UpdateLibraryDto.md
generated
Normal file
Binary file not shown.
BIN
mobile/openapi/lib/api.dart
generated
BIN
mobile/openapi/lib/api.dart
generated
Binary file not shown.
BIN
mobile/openapi/lib/api/asset_api.dart
generated
BIN
mobile/openapi/lib/api/asset_api.dart
generated
Binary file not shown.
BIN
mobile/openapi/lib/api/library_api.dart
generated
Normal file
BIN
mobile/openapi/lib/api/library_api.dart
generated
Normal file
Binary file not shown.
BIN
mobile/openapi/lib/api_client.dart
generated
BIN
mobile/openapi/lib/api_client.dart
generated
Binary file not shown.
BIN
mobile/openapi/lib/api_helper.dart
generated
BIN
mobile/openapi/lib/api_helper.dart
generated
Binary file not shown.
Binary file not shown.
BIN
mobile/openapi/lib/model/asset_response_dto.dart
generated
BIN
mobile/openapi/lib/model/asset_response_dto.dart
generated
Binary file not shown.
BIN
mobile/openapi/lib/model/create_library_dto.dart
generated
Normal file
BIN
mobile/openapi/lib/model/create_library_dto.dart
generated
Normal file
Binary file not shown.
BIN
mobile/openapi/lib/model/import_asset_dto.dart
generated
BIN
mobile/openapi/lib/model/import_asset_dto.dart
generated
Binary file not shown.
BIN
mobile/openapi/lib/model/job_name.dart
generated
BIN
mobile/openapi/lib/model/job_name.dart
generated
Binary file not shown.
BIN
mobile/openapi/lib/model/library_response_dto.dart
generated
Normal file
BIN
mobile/openapi/lib/model/library_response_dto.dart
generated
Normal file
Binary file not shown.
BIN
mobile/openapi/lib/model/library_stats_response_dto.dart
generated
Normal file
BIN
mobile/openapi/lib/model/library_stats_response_dto.dart
generated
Normal file
Binary file not shown.
BIN
mobile/openapi/lib/model/library_type.dart
generated
Normal file
BIN
mobile/openapi/lib/model/library_type.dart
generated
Normal file
Binary file not shown.
BIN
mobile/openapi/lib/model/scan_library_dto.dart
generated
Normal file
BIN
mobile/openapi/lib/model/scan_library_dto.dart
generated
Normal file
Binary file not shown.
BIN
mobile/openapi/lib/model/system_config_job_dto.dart
generated
BIN
mobile/openapi/lib/model/system_config_job_dto.dart
generated
Binary file not shown.
BIN
mobile/openapi/lib/model/update_library_dto.dart
generated
Normal file
BIN
mobile/openapi/lib/model/update_library_dto.dart
generated
Normal file
Binary file not shown.
Binary file not shown.
BIN
mobile/openapi/test/asset_api_test.dart
generated
BIN
mobile/openapi/test/asset_api_test.dart
generated
Binary file not shown.
BIN
mobile/openapi/test/asset_response_dto_test.dart
generated
BIN
mobile/openapi/test/asset_response_dto_test.dart
generated
Binary file not shown.
BIN
mobile/openapi/test/create_library_dto_test.dart
generated
Normal file
BIN
mobile/openapi/test/create_library_dto_test.dart
generated
Normal file
Binary file not shown.
BIN
mobile/openapi/test/import_asset_dto_test.dart
generated
BIN
mobile/openapi/test/import_asset_dto_test.dart
generated
Binary file not shown.
BIN
mobile/openapi/test/library_api_test.dart
generated
Normal file
BIN
mobile/openapi/test/library_api_test.dart
generated
Normal file
Binary file not shown.
BIN
mobile/openapi/test/library_response_dto_test.dart
generated
Normal file
BIN
mobile/openapi/test/library_response_dto_test.dart
generated
Normal file
Binary file not shown.
BIN
mobile/openapi/test/library_stats_response_dto_test.dart
generated
Normal file
BIN
mobile/openapi/test/library_stats_response_dto_test.dart
generated
Normal file
Binary file not shown.
BIN
mobile/openapi/test/library_type_test.dart
generated
Normal file
BIN
mobile/openapi/test/library_type_test.dart
generated
Normal file
Binary file not shown.
BIN
mobile/openapi/test/scan_library_dto_test.dart
generated
Normal file
BIN
mobile/openapi/test/scan_library_dto_test.dart
generated
Normal file
Binary file not shown.
BIN
mobile/openapi/test/system_config_job_dto_test.dart
generated
BIN
mobile/openapi/test/system_config_job_dto_test.dart
generated
Binary file not shown.
BIN
mobile/openapi/test/update_library_dto_test.dart
generated
Normal file
BIN
mobile/openapi/test/update_library_dto_test.dart
generated
Normal file
Binary file not shown.
@ -2476,6 +2476,328 @@
|
||||
]
|
||||
}
|
||||
},
|
||||
"/library": {
|
||||
"get": {
|
||||
"operationId": "getAllForUser",
|
||||
"parameters": [],
|
||||
"responses": {
|
||||
"200": {
|
||||
"content": {
|
||||
"application/json": {
|
||||
"schema": {
|
||||
"items": {
|
||||
"$ref": "#/components/schemas/LibraryResponseDto"
|
||||
},
|
||||
"type": "array"
|
||||
}
|
||||
}
|
||||
},
|
||||
"description": ""
|
||||
}
|
||||
},
|
||||
"security": [
|
||||
{
|
||||
"bearer": []
|
||||
},
|
||||
{
|
||||
"cookie": []
|
||||
},
|
||||
{
|
||||
"api_key": []
|
||||
}
|
||||
],
|
||||
"tags": [
|
||||
"Library"
|
||||
]
|
||||
},
|
||||
"post": {
|
||||
"operationId": "createLibrary",
|
||||
"parameters": [],
|
||||
"requestBody": {
|
||||
"content": {
|
||||
"application/json": {
|
||||
"schema": {
|
||||
"$ref": "#/components/schemas/CreateLibraryDto"
|
||||
}
|
||||
}
|
||||
},
|
||||
"required": true
|
||||
},
|
||||
"responses": {
|
||||
"201": {
|
||||
"content": {
|
||||
"application/json": {
|
||||
"schema": {
|
||||
"$ref": "#/components/schemas/LibraryResponseDto"
|
||||
}
|
||||
}
|
||||
},
|
||||
"description": ""
|
||||
}
|
||||
},
|
||||
"security": [
|
||||
{
|
||||
"bearer": []
|
||||
},
|
||||
{
|
||||
"cookie": []
|
||||
},
|
||||
{
|
||||
"api_key": []
|
||||
}
|
||||
],
|
||||
"tags": [
|
||||
"Library"
|
||||
]
|
||||
}
|
||||
},
|
||||
"/library/{id}": {
|
||||
"delete": {
|
||||
"operationId": "deleteLibrary",
|
||||
"parameters": [
|
||||
{
|
||||
"name": "id",
|
||||
"required": true,
|
||||
"in": "path",
|
||||
"schema": {
|
||||
"format": "uuid",
|
||||
"type": "string"
|
||||
}
|
||||
}
|
||||
],
|
||||
"responses": {
|
||||
"200": {
|
||||
"description": ""
|
||||
}
|
||||
},
|
||||
"security": [
|
||||
{
|
||||
"bearer": []
|
||||
},
|
||||
{
|
||||
"cookie": []
|
||||
},
|
||||
{
|
||||
"api_key": []
|
||||
}
|
||||
],
|
||||
"tags": [
|
||||
"Library"
|
||||
]
|
||||
},
|
||||
"get": {
|
||||
"operationId": "getLibraryInfo",
|
||||
"parameters": [
|
||||
{
|
||||
"name": "id",
|
||||
"required": true,
|
||||
"in": "path",
|
||||
"schema": {
|
||||
"format": "uuid",
|
||||
"type": "string"
|
||||
}
|
||||
}
|
||||
],
|
||||
"responses": {
|
||||
"200": {
|
||||
"content": {
|
||||
"application/json": {
|
||||
"schema": {
|
||||
"$ref": "#/components/schemas/LibraryResponseDto"
|
||||
}
|
||||
}
|
||||
},
|
||||
"description": ""
|
||||
}
|
||||
},
|
||||
"security": [
|
||||
{
|
||||
"bearer": []
|
||||
},
|
||||
{
|
||||
"cookie": []
|
||||
},
|
||||
{
|
||||
"api_key": []
|
||||
}
|
||||
],
|
||||
"tags": [
|
||||
"Library"
|
||||
]
|
||||
},
|
||||
"put": {
|
||||
"operationId": "updateLibrary",
|
||||
"parameters": [
|
||||
{
|
||||
"name": "id",
|
||||
"required": true,
|
||||
"in": "path",
|
||||
"schema": {
|
||||
"format": "uuid",
|
||||
"type": "string"
|
||||
}
|
||||
}
|
||||
],
|
||||
"requestBody": {
|
||||
"content": {
|
||||
"application/json": {
|
||||
"schema": {
|
||||
"$ref": "#/components/schemas/UpdateLibraryDto"
|
||||
}
|
||||
}
|
||||
},
|
||||
"required": true
|
||||
},
|
||||
"responses": {
|
||||
"200": {
|
||||
"content": {
|
||||
"application/json": {
|
||||
"schema": {
|
||||
"$ref": "#/components/schemas/LibraryResponseDto"
|
||||
}
|
||||
}
|
||||
},
|
||||
"description": ""
|
||||
}
|
||||
},
|
||||
"security": [
|
||||
{
|
||||
"bearer": []
|
||||
},
|
||||
{
|
||||
"cookie": []
|
||||
},
|
||||
{
|
||||
"api_key": []
|
||||
}
|
||||
],
|
||||
"tags": [
|
||||
"Library"
|
||||
]
|
||||
}
|
||||
},
|
||||
"/library/{id}/removeOffline": {
|
||||
"post": {
|
||||
"operationId": "removeOfflineFiles",
|
||||
"parameters": [
|
||||
{
|
||||
"name": "id",
|
||||
"required": true,
|
||||
"in": "path",
|
||||
"schema": {
|
||||
"format": "uuid",
|
||||
"type": "string"
|
||||
}
|
||||
}
|
||||
],
|
||||
"responses": {
|
||||
"201": {
|
||||
"description": ""
|
||||
}
|
||||
},
|
||||
"security": [
|
||||
{
|
||||
"bearer": []
|
||||
},
|
||||
{
|
||||
"cookie": []
|
||||
},
|
||||
{
|
||||
"api_key": []
|
||||
}
|
||||
],
|
||||
"tags": [
|
||||
"Library"
|
||||
]
|
||||
}
|
||||
},
|
||||
"/library/{id}/scan": {
|
||||
"post": {
|
||||
"operationId": "scanLibrary",
|
||||
"parameters": [
|
||||
{
|
||||
"name": "id",
|
||||
"required": true,
|
||||
"in": "path",
|
||||
"schema": {
|
||||
"format": "uuid",
|
||||
"type": "string"
|
||||
}
|
||||
}
|
||||
],
|
||||
"requestBody": {
|
||||
"content": {
|
||||
"application/json": {
|
||||
"schema": {
|
||||
"$ref": "#/components/schemas/ScanLibraryDto"
|
||||
}
|
||||
}
|
||||
},
|
||||
"required": true
|
||||
},
|
||||
"responses": {
|
||||
"201": {
|
||||
"description": ""
|
||||
}
|
||||
},
|
||||
"security": [
|
||||
{
|
||||
"bearer": []
|
||||
},
|
||||
{
|
||||
"cookie": []
|
||||
},
|
||||
{
|
||||
"api_key": []
|
||||
}
|
||||
],
|
||||
"tags": [
|
||||
"Library"
|
||||
]
|
||||
}
|
||||
},
|
||||
"/library/{id}/statistics": {
|
||||
"get": {
|
||||
"operationId": "getLibraryStatistics",
|
||||
"parameters": [
|
||||
{
|
||||
"name": "id",
|
||||
"required": true,
|
||||
"in": "path",
|
||||
"schema": {
|
||||
"format": "uuid",
|
||||
"type": "string"
|
||||
}
|
||||
}
|
||||
],
|
||||
"responses": {
|
||||
"200": {
|
||||
"content": {
|
||||
"application/json": {
|
||||
"schema": {
|
||||
"$ref": "#/components/schemas/LibraryStatsResponseDto"
|
||||
}
|
||||
}
|
||||
},
|
||||
"description": ""
|
||||
}
|
||||
},
|
||||
"security": [
|
||||
{
|
||||
"bearer": []
|
||||
},
|
||||
{
|
||||
"cookie": []
|
||||
},
|
||||
{
|
||||
"api_key": []
|
||||
}
|
||||
],
|
||||
"tags": [
|
||||
"Library"
|
||||
]
|
||||
}
|
||||
},
|
||||
"/oauth/authorize": {
|
||||
"post": {
|
||||
"operationId": "authorizeOAuth",
|
||||
@ -4971,6 +5293,9 @@
|
||||
"clipEncoding": {
|
||||
"$ref": "#/components/schemas/JobStatusDto"
|
||||
},
|
||||
"library": {
|
||||
"$ref": "#/components/schemas/JobStatusDto"
|
||||
},
|
||||
"metadataExtraction": {
|
||||
"$ref": "#/components/schemas/JobStatusDto"
|
||||
},
|
||||
@ -5006,7 +5331,8 @@
|
||||
"backgroundTask",
|
||||
"search",
|
||||
"recognizeFaces",
|
||||
"sidecar"
|
||||
"sidecar",
|
||||
"library"
|
||||
],
|
||||
"type": "object"
|
||||
},
|
||||
@ -5216,9 +5542,21 @@
|
||||
"isArchived": {
|
||||
"type": "boolean"
|
||||
},
|
||||
"isExternal": {
|
||||
"type": "boolean"
|
||||
},
|
||||
"isFavorite": {
|
||||
"type": "boolean"
|
||||
},
|
||||
"isOffline": {
|
||||
"type": "boolean"
|
||||
},
|
||||
"isReadOnly": {
|
||||
"type": "boolean"
|
||||
},
|
||||
"libraryId": {
|
||||
"type": "string"
|
||||
},
|
||||
"livePhotoVideoId": {
|
||||
"nullable": true,
|
||||
"type": "string"
|
||||
@ -5272,6 +5610,7 @@
|
||||
"deviceAssetId",
|
||||
"deviceId",
|
||||
"ownerId",
|
||||
"libraryId",
|
||||
"originalPath",
|
||||
"originalFileName",
|
||||
"resized",
|
||||
@ -5281,6 +5620,9 @@
|
||||
"updatedAt",
|
||||
"isFavorite",
|
||||
"isArchived",
|
||||
"isOffline",
|
||||
"isExternal",
|
||||
"isReadOnly",
|
||||
"duration",
|
||||
"checksum"
|
||||
],
|
||||
@ -5607,16 +5949,25 @@
|
||||
"isArchived": {
|
||||
"type": "boolean"
|
||||
},
|
||||
"isExternal": {
|
||||
"type": "boolean"
|
||||
},
|
||||
"isFavorite": {
|
||||
"type": "boolean"
|
||||
},
|
||||
"isOffline": {
|
||||
"type": "boolean"
|
||||
},
|
||||
"isReadOnly": {
|
||||
"default": false,
|
||||
"type": "boolean"
|
||||
},
|
||||
"isVisible": {
|
||||
"type": "boolean"
|
||||
},
|
||||
"libraryId": {
|
||||
"format": "uuid",
|
||||
"type": "string"
|
||||
},
|
||||
"livePhotoData": {
|
||||
"format": "binary",
|
||||
"type": "string"
|
||||
@ -5636,6 +5987,35 @@
|
||||
],
|
||||
"type": "object"
|
||||
},
|
||||
"CreateLibraryDto": {
|
||||
"properties": {
|
||||
"exclusionPatterns": {
|
||||
"items": {
|
||||
"type": "string"
|
||||
},
|
||||
"type": "array"
|
||||
},
|
||||
"importPaths": {
|
||||
"items": {
|
||||
"type": "string"
|
||||
},
|
||||
"type": "array"
|
||||
},
|
||||
"isVisible": {
|
||||
"type": "boolean"
|
||||
},
|
||||
"name": {
|
||||
"type": "string"
|
||||
},
|
||||
"type": {
|
||||
"$ref": "#/components/schemas/LibraryType"
|
||||
}
|
||||
},
|
||||
"required": [
|
||||
"type"
|
||||
],
|
||||
"type": "object"
|
||||
},
|
||||
"CreateProfileImageDto": {
|
||||
"properties": {
|
||||
"file": {
|
||||
@ -6012,9 +6392,15 @@
|
||||
"isArchived": {
|
||||
"type": "boolean"
|
||||
},
|
||||
"isExternal": {
|
||||
"type": "boolean"
|
||||
},
|
||||
"isFavorite": {
|
||||
"type": "boolean"
|
||||
},
|
||||
"isOffline": {
|
||||
"type": "boolean"
|
||||
},
|
||||
"isReadOnly": {
|
||||
"default": true,
|
||||
"type": "boolean"
|
||||
@ -6022,6 +6408,10 @@
|
||||
"isVisible": {
|
||||
"type": "boolean"
|
||||
},
|
||||
"libraryId": {
|
||||
"format": "uuid",
|
||||
"type": "string"
|
||||
},
|
||||
"sidecarPath": {
|
||||
"type": "string"
|
||||
}
|
||||
@ -6102,7 +6492,8 @@
|
||||
"backgroundTask",
|
||||
"storageTemplateMigration",
|
||||
"search",
|
||||
"sidecar"
|
||||
"sidecar",
|
||||
"library"
|
||||
],
|
||||
"type": "string"
|
||||
},
|
||||
@ -6132,6 +6523,98 @@
|
||||
],
|
||||
"type": "object"
|
||||
},
|
||||
"LibraryResponseDto": {
|
||||
"properties": {
|
||||
"assetCount": {
|
||||
"type": "integer"
|
||||
},
|
||||
"createdAt": {
|
||||
"format": "date-time",
|
||||
"type": "string"
|
||||
},
|
||||
"exclusionPatterns": {
|
||||
"items": {
|
||||
"type": "string"
|
||||
},
|
||||
"type": "array"
|
||||
},
|
||||
"id": {
|
||||
"type": "string"
|
||||
},
|
||||
"importPaths": {
|
||||
"items": {
|
||||
"type": "string"
|
||||
},
|
||||
"type": "array"
|
||||
},
|
||||
"name": {
|
||||
"type": "string"
|
||||
},
|
||||
"ownerId": {
|
||||
"type": "string"
|
||||
},
|
||||
"refreshedAt": {
|
||||
"format": "date-time",
|
||||
"nullable": true,
|
||||
"type": "string"
|
||||
},
|
||||
"type": {
|
||||
"$ref": "#/components/schemas/LibraryType"
|
||||
},
|
||||
"updatedAt": {
|
||||
"format": "date-time",
|
||||
"type": "string"
|
||||
}
|
||||
},
|
||||
"required": [
|
||||
"type",
|
||||
"assetCount",
|
||||
"id",
|
||||
"ownerId",
|
||||
"name",
|
||||
"importPaths",
|
||||
"exclusionPatterns",
|
||||
"createdAt",
|
||||
"updatedAt",
|
||||
"refreshedAt"
|
||||
],
|
||||
"type": "object"
|
||||
},
|
||||
"LibraryStatsResponseDto": {
|
||||
"properties": {
|
||||
"photos": {
|
||||
"default": 0,
|
||||
"type": "integer"
|
||||
},
|
||||
"total": {
|
||||
"default": 0,
|
||||
"type": "integer"
|
||||
},
|
||||
"usage": {
|
||||
"default": 0,
|
||||
"format": "int64",
|
||||
"type": "integer"
|
||||
},
|
||||
"videos": {
|
||||
"default": 0,
|
||||
"type": "integer"
|
||||
}
|
||||
},
|
||||
"required": [
|
||||
"photos",
|
||||
"videos",
|
||||
"total",
|
||||
"usage"
|
||||
],
|
||||
"type": "object"
|
||||
},
|
||||
"LibraryType": {
|
||||
"enum": [
|
||||
"UPLOAD",
|
||||
"EXTERNAL"
|
||||
],
|
||||
"type": "string"
|
||||
},
|
||||
"LoginCredentialDto": {
|
||||
"properties": {
|
||||
"email": {
|
||||
@ -6493,6 +6976,18 @@
|
||||
],
|
||||
"type": "object"
|
||||
},
|
||||
"ScanLibraryDto": {
|
||||
"properties": {
|
||||
"refreshAllFiles": {
|
||||
"default": false,
|
||||
"type": "boolean"
|
||||
},
|
||||
"refreshModifiedFiles": {
|
||||
"type": "boolean"
|
||||
}
|
||||
},
|
||||
"type": "object"
|
||||
},
|
||||
"SearchAlbumResponseDto": {
|
||||
"properties": {
|
||||
"count": {
|
||||
@ -7148,6 +7643,9 @@
|
||||
"clipEncoding": {
|
||||
"$ref": "#/components/schemas/JobSettingsDto"
|
||||
},
|
||||
"library": {
|
||||
"$ref": "#/components/schemas/JobSettingsDto"
|
||||
},
|
||||
"metadataExtraction": {
|
||||
"$ref": "#/components/schemas/JobSettingsDto"
|
||||
},
|
||||
@ -7183,7 +7681,8 @@
|
||||
"backgroundTask",
|
||||
"search",
|
||||
"recognizeFaces",
|
||||
"sidecar"
|
||||
"sidecar",
|
||||
"library"
|
||||
],
|
||||
"type": "object"
|
||||
},
|
||||
@ -7497,6 +7996,29 @@
|
||||
},
|
||||
"type": "object"
|
||||
},
|
||||
"UpdateLibraryDto": {
|
||||
"properties": {
|
||||
"exclusionPatterns": {
|
||||
"items": {
|
||||
"type": "string"
|
||||
},
|
||||
"type": "array"
|
||||
},
|
||||
"importPaths": {
|
||||
"items": {
|
||||
"type": "string"
|
||||
},
|
||||
"type": "array"
|
||||
},
|
||||
"isVisible": {
|
||||
"type": "boolean"
|
||||
},
|
||||
"name": {
|
||||
"type": "string"
|
||||
}
|
||||
},
|
||||
"type": "object"
|
||||
},
|
||||
"UpdateTagDto": {
|
||||
"properties": {
|
||||
"name": {
|
||||
|
1260
server/package-lock.json
generated
File diff suppressed because it is too large
@ -61,6 +61,7 @@
|
||||
"exiftool-vendored": "^23.0.0",
|
||||
"exiftool-vendored.pl": "^12.62.0",
|
||||
"fluent-ffmpeg": "^2.1.2",
|
||||
"glob": "^10.3.3",
|
||||
"geo-tz": "^7.0.7",
|
||||
"handlebars": "^4.7.8",
|
||||
"i18n-iso-countries": "^7.6.0",
|
||||
@ -99,6 +100,7 @@
|
||||
"@types/jest": "29.5.4",
|
||||
"@types/jest-when": "^3.5.2",
|
||||
"@types/lodash": "^4.14.197",
|
||||
"@types/mock-fs": "^4.13.1",
|
||||
"@types/multer": "^1.4.7",
|
||||
"@types/mv": "^2.1.2",
|
||||
"@types/node": "^20.5.7",
|
||||
@ -113,6 +115,7 @@
|
||||
"eslint-plugin-prettier": "^5.0.0",
|
||||
"jest": "^29.6.4",
|
||||
"jest-when": "^3.6.0",
|
||||
"mock-fs": "^5.2.0",
|
||||
"prettier": "^3.0.2",
|
||||
"prettier-plugin-organize-imports": "^3.2.3",
|
||||
"rimraf": "^5.0.1",
|
||||
|
@ -21,7 +21,13 @@ export enum Permission {
|
||||
|
||||
ARCHIVE_READ = 'archive.read',
|
||||
|
||||
TIMELINE_READ = 'timeline.read',
|
||||
TIMELINE_DOWNLOAD = 'timeline.download',
|
||||
|
||||
LIBRARY_CREATE = 'library.create',
|
||||
LIBRARY_READ = 'library.read',
|
||||
LIBRARY_UPDATE = 'library.update',
|
||||
LIBRARY_DELETE = 'library.delete',
|
||||
LIBRARY_DOWNLOAD = 'library.download',
|
||||
|
||||
PERSON_READ = 'person.read',
|
||||
@ -165,12 +171,24 @@ export class AccessCore {
|
||||
case Permission.ARCHIVE_READ:
|
||||
return authUser.id === id;
|
||||
|
||||
case Permission.LIBRARY_READ:
|
||||
return authUser.id === id || (await this.repository.library.hasPartnerAccess(authUser.id, id));
|
||||
case Permission.TIMELINE_READ:
|
||||
return authUser.id === id || (await this.repository.timeline.hasPartnerAccess(authUser.id, id));
|
||||
|
||||
case Permission.LIBRARY_DOWNLOAD:
|
||||
case Permission.TIMELINE_DOWNLOAD:
|
||||
return authUser.id === id;
|
||||
|
||||
case Permission.LIBRARY_READ:
|
||||
return (
|
||||
(await this.repository.library.hasOwnerAccess(authUser.id, id)) ||
|
||||
(await this.repository.library.hasPartnerAccess(authUser.id, id))
|
||||
);
|
||||
|
||||
case Permission.LIBRARY_UPDATE:
|
||||
return this.repository.library.hasOwnerAccess(authUser.id, id);
|
||||
|
||||
case Permission.LIBRARY_DELETE:
|
||||
return this.repository.library.hasOwnerAccess(authUser.id, id);
|
||||
|
||||
case Permission.PERSON_READ:
|
||||
return this.repository.person.hasOwnerAccess(authUser.id, id);
|
||||
|
||||
|
@ -15,6 +15,11 @@ export interface IAccessRepository {
|
||||
};
|
||||
|
||||
library: {
|
||||
hasOwnerAccess(userId: string, libraryId: string): Promise<boolean>;
|
||||
hasPartnerAccess(userId: string, partnerId: string): Promise<boolean>;
|
||||
};
|
||||
|
||||
timeline: {
|
||||
hasPartnerAccess(userId: string, partnerId: string): Promise<boolean>;
|
||||
};
|
||||
|
||||
|
@ -45,6 +45,7 @@ export enum WithoutProperty {
|
||||
|
||||
export enum WithProperty {
|
||||
SIDECAR = 'sidecar',
|
||||
IS_OFFLINE = 'isOffline',
|
||||
}
|
||||
|
||||
export enum TimeBucketSize {
|
||||
@ -69,15 +70,18 @@ export interface TimeBucketItem {
|
||||
export const IAssetRepository = 'IAssetRepository';
|
||||
|
||||
export interface IAssetRepository {
|
||||
create(asset: Partial<AssetEntity>): Promise<AssetEntity>;
|
||||
getByDate(ownerId: string, date: Date): Promise<AssetEntity[]>;
|
||||
getByIds(ids: string[]): Promise<AssetEntity[]>;
|
||||
getByChecksum(userId: string, checksum: Buffer): Promise<AssetEntity | null>;
|
||||
getByAlbumId(pagination: PaginationOptions, albumId: string): Paginated<AssetEntity>;
|
||||
getByUserId(pagination: PaginationOptions, userId: string): Paginated<AssetEntity>;
|
||||
getWithout(pagination: PaginationOptions, property: WithoutProperty): Paginated<AssetEntity>;
|
||||
getWith(pagination: PaginationOptions, property: WithProperty): Paginated<AssetEntity>;
|
||||
getWith(pagination: PaginationOptions, property: WithProperty, libraryId?: string): Paginated<AssetEntity>;
|
||||
getFirstAssetForAlbumId(albumId: string): Promise<AssetEntity | null>;
|
||||
getLastUpdatedAssetForAlbumId(albumId: string): Promise<AssetEntity | null>;
|
||||
getByLibraryId(libraryIds: string[]): Promise<AssetEntity[]>;
|
||||
getByLibraryIdAndOriginalPath(libraryId: string, originalPath: string): Promise<AssetEntity | null>;
|
||||
deleteAll(ownerId: string): Promise<void>;
|
||||
getAll(pagination: PaginationOptions, options?: AssetSearchOptions): Paginated<AssetEntity>;
|
||||
updateAll(ids: string[], options: Partial<AssetEntity>): Promise<void>;
|
||||
@ -87,5 +91,7 @@ export interface IAssetRepository {
|
||||
getStatistics(ownerId: string, options: AssetStatsOptions): Promise<AssetStats>;
|
||||
getTimeBuckets(options: TimeBucketOptions): Promise<TimeBucketItem[]>;
|
||||
getByTimeBucket(timeBucket: string, options: TimeBucketOptions): Promise<AssetEntity[]>;
|
||||
remove(asset: AssetEntity): Promise<AssetEntity>;
|
||||
getById(assetId: string): Promise<AssetEntity>;
|
||||
upsertExif(exif: Partial<ExifEntity>): Promise<void>;
|
||||
}
|
||||
|
@ -501,6 +501,7 @@ describe(AssetService.name, () => {
|
||||
});
|
||||
|
||||
it('should return a list of archives (userId)', async () => {
|
||||
accessMock.library.hasOwnerAccess.mockResolvedValue(true);
|
||||
assetMock.getByUserId.mockResolvedValue({
|
||||
items: [assetStub.image, assetStub.video],
|
||||
hasNextPage: false,
|
||||
@ -514,6 +515,8 @@ describe(AssetService.name, () => {
|
||||
});
|
||||
|
||||
it('should split archives by size', async () => {
|
||||
accessMock.library.hasOwnerAccess.mockResolvedValue(true);
|
||||
|
||||
assetMock.getByUserId.mockResolvedValue({
|
||||
items: [
|
||||
{ ...assetStub.image, id: 'asset-1' },
|
||||
|
@ -162,7 +162,7 @@ export class AssetService {
|
||||
if (dto.isArchived !== false) {
|
||||
await this.access.requirePermission(authUser, Permission.ARCHIVE_READ, [dto.userId]);
|
||||
}
|
||||
await this.access.requirePermission(authUser, Permission.LIBRARY_READ, [dto.userId]);
|
||||
await this.access.requirePermission(authUser, Permission.TIMELINE_READ, [dto.userId]);
|
||||
} else {
|
||||
dto.userId = authUser.id;
|
||||
}
|
||||
@ -187,6 +187,10 @@ export class AssetService {
|
||||
throw new BadRequestException('Asset not found');
|
||||
}
|
||||
|
||||
if (asset.isOffline) {
|
||||
throw new BadRequestException('Asset is offline');
|
||||
}
|
||||
|
||||
return this.storageRepository.createReadStream(asset.originalPath, mimeTypes.lookup(asset.originalPath));
|
||||
}
|
||||
|
||||
@ -268,7 +272,7 @@ export class AssetService {
|
||||
|
||||
if (dto.userId) {
|
||||
const userId = dto.userId;
|
||||
await this.access.requirePermission(authUser, Permission.LIBRARY_DOWNLOAD, userId);
|
||||
await this.access.requirePermission(authUser, Permission.TIMELINE_DOWNLOAD, userId);
|
||||
return usePagination(PAGINATION_SIZE, (pagination) => this.assetRepository.getByUserId(pagination, userId));
|
||||
}
|
||||
|
||||
|
@ -12,6 +12,7 @@ export class AssetResponseDto {
|
||||
deviceId!: string;
|
||||
ownerId!: string;
|
||||
owner?: UserResponseDto;
|
||||
libraryId!: string;
|
||||
|
||||
@ApiProperty({ enumName: 'AssetTypeEnum', enum: AssetType })
|
||||
type!: AssetType;
|
||||
@ -25,6 +26,9 @@ export class AssetResponseDto {
|
||||
updatedAt!: Date;
|
||||
isFavorite!: boolean;
|
||||
isArchived!: boolean;
|
||||
isOffline!: boolean;
|
||||
isExternal!: boolean;
|
||||
isReadOnly!: boolean;
|
||||
duration!: string;
|
||||
exifInfo?: ExifResponseDto;
|
||||
smartInfo?: SmartInfoResponseDto;
|
||||
@ -42,6 +46,7 @@ function _map(entity: AssetEntity, withExif: boolean): AssetResponseDto {
|
||||
ownerId: entity.ownerId,
|
||||
owner: entity.owner ? mapUser(entity.owner) : undefined,
|
||||
deviceId: entity.deviceId,
|
||||
libraryId: entity.libraryId,
|
||||
type: entity.type,
|
||||
originalPath: entity.originalPath,
|
||||
originalFileName: entity.originalFileName,
|
||||
@ -59,6 +64,9 @@ function _map(entity: AssetEntity, withExif: boolean): AssetResponseDto {
|
||||
tags: entity.tags?.map(mapTag),
|
||||
people: entity.faces?.map(mapFace).filter((person) => !person.isHidden),
|
||||
checksum: entity.checksum.toString('base64'),
|
||||
isExternal: entity.isExternal,
|
||||
isOffline: entity.isOffline,
|
||||
isReadOnly: entity.isReadOnly,
|
||||
};
|
||||
}
|
||||
|
||||
|
@ -25,7 +25,7 @@ export class AuditService {
|
||||
|
||||
async getDeletes(authUser: AuthUserDto, dto: AuditDeletesDto): Promise<AuditDeletesResponseDto> {
|
||||
const userId = dto.userId || authUser.id;
|
||||
await this.access.requirePermission(authUser, Permission.LIBRARY_READ, userId);
|
||||
await this.access.requirePermission(authUser, Permission.TIMELINE_READ, userId);
|
||||
|
||||
const audits = await this.repository.getAfter(dto.after, {
|
||||
ownerId: userId,
|
||||
|
@ -6,6 +6,7 @@ import {
|
||||
loginResponseStub,
|
||||
newCryptoRepositoryMock,
|
||||
newKeyRepositoryMock,
|
||||
newLibraryRepositoryMock,
|
||||
newSharedLinkRepositoryMock,
|
||||
newSystemConfigRepositoryMock,
|
||||
newUserRepositoryMock,
|
||||
@ -20,6 +21,7 @@ import { Issuer, generators } from 'openid-client';
|
||||
import { Socket } from 'socket.io';
|
||||
import { IKeyRepository } from '../api-key';
|
||||
import { ICryptoRepository } from '../crypto/crypto.repository';
|
||||
import { ILibraryRepository } from '../library';
|
||||
import { ISharedLinkRepository } from '../shared-link';
|
||||
import { ISystemConfigRepository } from '../system-config';
|
||||
import { IUserRepository } from '../user';
|
||||
@ -50,6 +52,7 @@ describe('AuthService', () => {
|
||||
let sut: AuthService;
|
||||
let cryptoMock: jest.Mocked<ICryptoRepository>;
|
||||
let userMock: jest.Mocked<IUserRepository>;
|
||||
let libraryMock: jest.Mocked<ILibraryRepository>;
|
||||
let configMock: jest.Mocked<ISystemConfigRepository>;
|
||||
let userTokenMock: jest.Mocked<IUserTokenRepository>;
|
||||
let shareMock: jest.Mocked<ISharedLinkRepository>;
|
||||
@ -81,12 +84,13 @@ describe('AuthService', () => {
|
||||
|
||||
cryptoMock = newCryptoRepositoryMock();
|
||||
userMock = newUserRepositoryMock();
|
||||
libraryMock = newLibraryRepositoryMock();
|
||||
configMock = newSystemConfigRepositoryMock();
|
||||
userTokenMock = newUserTokenRepositoryMock();
|
||||
shareMock = newSharedLinkRepositoryMock();
|
||||
keyMock = newKeyRepositoryMock();
|
||||
|
||||
sut = new AuthService(cryptoMock, configMock, userMock, userTokenMock, shareMock, keyMock);
|
||||
sut = new AuthService(cryptoMock, configMock, userMock, userTokenMock, libraryMock, shareMock, keyMock);
|
||||
});
|
||||
|
||||
it('should be defined', () => {
|
||||
|
@ -13,6 +13,7 @@ import { DateTime } from 'luxon';
|
||||
import { ClientMetadata, Issuer, UserinfoResponse, custom, generators } from 'openid-client';
|
||||
import { IKeyRepository } from '../api-key';
|
||||
import { ICryptoRepository } from '../crypto/crypto.repository';
|
||||
import { ILibraryRepository } from '../library';
|
||||
import { ISharedLinkRepository } from '../shared-link';
|
||||
import { ISystemConfigRepository } from '../system-config';
|
||||
import { SystemConfigCore } from '../system-config/system-config.core';
|
||||
@ -66,11 +67,12 @@ export class AuthService {
|
||||
@Inject(ISystemConfigRepository) configRepository: ISystemConfigRepository,
|
||||
@Inject(IUserRepository) userRepository: IUserRepository,
|
||||
@Inject(IUserTokenRepository) private userTokenRepository: IUserTokenRepository,
|
||||
@Inject(ILibraryRepository) libraryRepository: ILibraryRepository,
|
||||
@Inject(ISharedLinkRepository) private sharedLinkRepository: ISharedLinkRepository,
|
||||
@Inject(IKeyRepository) private keyRepository: IKeyRepository,
|
||||
) {
|
||||
this.configCore = new SystemConfigCore(configRepository);
|
||||
this.userCore = new UserCore(userRepository, cryptoRepository);
|
||||
this.userCore = new UserCore(userRepository, libraryRepository, cryptoRepository);
|
||||
|
||||
custom.setHttpOptionsDefaults({ timeout: 30000 });
|
||||
}
|
||||
|
@ -102,6 +102,7 @@ export const mimeTypes = {
|
||||
video,
|
||||
|
||||
isAsset: (filename: string) => isType(filename, image) || isType(filename, video),
|
||||
isImage: (filename: string) => isType(filename, image),
|
||||
isProfile: (filename: string) => isType(filename, profile),
|
||||
isSidecar: (filename: string) => isType(filename, sidecar),
|
||||
isVideo: (filename: string) => isType(filename, video),
|
||||
@ -115,4 +116,5 @@ export const mimeTypes = {
|
||||
}
|
||||
return AssetType.OTHER;
|
||||
},
|
||||
getSupportedFileExtensions: () => Object.keys(image).concat(Object.keys(video)),
|
||||
};
|
||||
|
@ -6,6 +6,7 @@ import { AuditService } from './audit';
|
||||
import { AuthService } from './auth';
|
||||
import { FacialRecognitionService } from './facial-recognition';
|
||||
import { JobService } from './job';
|
||||
import { LibraryService } from './library';
|
||||
import { MediaService } from './media';
|
||||
import { MetadataService } from './metadata';
|
||||
import { PartnerService } from './partner';
|
||||
@ -30,6 +31,7 @@ const providers: Provider[] = [
|
||||
JobService,
|
||||
MediaService,
|
||||
MetadataService,
|
||||
LibraryService,
|
||||
PersonService,
|
||||
PartnerService,
|
||||
SearchService,
|
||||
|
@ -12,6 +12,7 @@ export * from './domain.module';
|
||||
export * from './domain.util';
|
||||
export * from './facial-recognition';
|
||||
export * from './job';
|
||||
export * from './library';
|
||||
export * from './media';
|
||||
export * from './metadata';
|
||||
export * from './partner';
|
||||
|
@ -9,6 +9,7 @@ export enum QueueName {
|
||||
STORAGE_TEMPLATE_MIGRATION = 'storageTemplateMigration',
|
||||
SEARCH = 'search',
|
||||
SIDECAR = 'sidecar',
|
||||
LIBRARY = 'library',
|
||||
}
|
||||
|
||||
export enum JobCommand {
|
||||
@ -53,6 +54,15 @@ export enum JobName {
|
||||
RECOGNIZE_FACES = 'recognize-faces',
|
||||
PERSON_CLEANUP = 'person-cleanup',
|
||||
|
||||
// library management
|
||||
LIBRARY_SCAN = 'library-refresh',
|
||||
LIBRARY_SCAN_ASSET = 'library-refresh-asset',
|
||||
LIBRARY_REMOVE_OFFLINE = 'library-remove-offline',
|
||||
LIBRARY_MARK_ASSET_OFFLINE = 'library-mark-asset-offline',
|
||||
LIBRARY_DELETE = 'library-delete',
|
||||
LIBRARY_QUEUE_SCAN_ALL = 'library-queue-all-refresh',
|
||||
LIBRARY_QUEUE_CLEANUP = 'library-queue-cleanup',
|
||||
|
||||
// cleanup
|
||||
DELETE_FILES = 'delete-files',
|
||||
CLEAN_OLD_AUDIT_LOGS = 'clean-old-audit-logs',
|
||||
@ -140,4 +150,13 @@ export const JOBS_TO_QUEUE: Record<JobName, QueueName> = {
|
||||
[JobName.QUEUE_SIDECAR]: QueueName.SIDECAR,
|
||||
[JobName.SIDECAR_DISCOVERY]: QueueName.SIDECAR,
|
||||
[JobName.SIDECAR_SYNC]: QueueName.SIDECAR,
|
||||
|
||||
// Library management
|
||||
[JobName.LIBRARY_SCAN_ASSET]: QueueName.LIBRARY,
|
||||
[JobName.LIBRARY_MARK_ASSET_OFFLINE]: QueueName.LIBRARY,
|
||||
[JobName.LIBRARY_SCAN]: QueueName.LIBRARY,
|
||||
[JobName.LIBRARY_DELETE]: QueueName.LIBRARY,
|
||||
[JobName.LIBRARY_REMOVE_OFFLINE]: QueueName.LIBRARY,
|
||||
[JobName.LIBRARY_QUEUE_SCAN_ALL]: QueueName.LIBRARY,
|
||||
[JobName.LIBRARY_QUEUE_CLEANUP]: QueueName.LIBRARY,
|
||||
};
|
||||
|
@ -79,4 +79,7 @@ export class AllJobStatusResponseDto implements Record<QueueName, JobStatusDto>
|
||||
|
||||
@ApiProperty({ type: JobStatusDto })
|
||||
[QueueName.SIDECAR]!: JobStatusDto;
|
||||
|
||||
@ApiProperty({ type: JobStatusDto })
|
||||
[QueueName.LIBRARY]!: JobStatusDto;
|
||||
}
|
||||
|
@ -22,6 +22,21 @@ export interface IEntityJob extends IBaseJob {
|
||||
source?: 'upload';
|
||||
}
|
||||
|
||||
export interface IOfflineLibraryFileJob extends IEntityJob {
|
||||
assetPath: string;
|
||||
}
|
||||
|
||||
export interface ILibraryFileJob extends IEntityJob {
|
||||
ownerId: string;
|
||||
assetPath: string;
|
||||
forceRefresh: boolean;
|
||||
}
|
||||
|
||||
export interface ILibraryRefreshJob extends IEntityJob {
|
||||
refreshModifiedFiles: boolean;
|
||||
refreshAllFiles: boolean;
|
||||
}
|
||||
|
||||
export interface IBulkEntityJob extends IBaseJob {
|
||||
ids: string[];
|
||||
}
|
||||
|
@ -1,4 +1,5 @@
|
||||
import { JobName, QueueName } from './job.constants';
|
||||
|
||||
import {
|
||||
IAssetFaceJob,
|
||||
IBaseJob,
|
||||
@ -6,6 +7,9 @@ import {
|
||||
IDeleteFilesJob,
|
||||
IEntityJob,
|
||||
IFaceThumbnailJob,
|
||||
ILibraryFileJob,
|
||||
ILibraryRefreshJob,
|
||||
IOfflineLibraryFileJob,
|
||||
} from './job.interface';
|
||||
|
||||
export interface JobCounts {
|
||||
@ -74,6 +78,15 @@ export type JobItem =
|
||||
// Asset Deletion
|
||||
| { name: JobName.PERSON_CLEANUP; data?: IBaseJob }
|
||||
|
||||
// Library Management
|
||||
| { name: JobName.LIBRARY_SCAN_ASSET; data: ILibraryFileJob }
|
||||
| { name: JobName.LIBRARY_MARK_ASSET_OFFLINE; data: IOfflineLibraryFileJob }
|
||||
| { name: JobName.LIBRARY_SCAN; data: ILibraryRefreshJob }
|
||||
| { name: JobName.LIBRARY_REMOVE_OFFLINE; data: IEntityJob }
|
||||
| { name: JobName.LIBRARY_DELETE; data: IEntityJob }
|
||||
| { name: JobName.LIBRARY_QUEUE_SCAN_ALL; data: IBaseJob }
|
||||
| { name: JobName.LIBRARY_QUEUE_CLEANUP; data: IBaseJob }
|
||||
|
||||
// Search
|
||||
| { name: JobName.SEARCH_INDEX_ASSETS; data?: IBaseJob }
|
||||
| { name: JobName.SEARCH_INDEX_ASSET; data: IBulkEntityJob }
|
||||
|
@ -52,6 +52,7 @@ describe(JobService.name, () => {
|
||||
[{ name: JobName.PERSON_CLEANUP }],
|
||||
[{ name: JobName.QUEUE_GENERATE_THUMBNAILS, data: { force: false } }],
|
||||
[{ name: JobName.CLEAN_OLD_AUDIT_LOGS }],
|
||||
[{ name: JobName.LIBRARY_QUEUE_SCAN_ALL, data: { force: false } }],
|
||||
]);
|
||||
});
|
||||
});
|
||||
@ -97,6 +98,7 @@ describe(JobService.name, () => {
|
||||
[QueueName.VIDEO_CONVERSION]: expectedJobStatus,
|
||||
[QueueName.RECOGNIZE_FACES]: expectedJobStatus,
|
||||
[QueueName.SIDECAR]: expectedJobStatus,
|
||||
[QueueName.LIBRARY]: expectedJobStatus,
|
||||
});
|
||||
});
|
||||
});
|
||||
@ -225,6 +227,7 @@ describe(JobService.name, () => {
|
||||
[QueueName.RECOGNIZE_FACES]: { concurrency: 10 },
|
||||
[QueueName.SEARCH]: { concurrency: 10 },
|
||||
[QueueName.SIDECAR]: { concurrency: 10 },
|
||||
[QueueName.LIBRARY]: { concurrency: 10 },
|
||||
[QueueName.STORAGE_TEMPLATE_MIGRATION]: { concurrency: 10 },
|
||||
[QueueName.THUMBNAIL_GENERATION]: { concurrency: 10 },
|
||||
[QueueName.VIDEO_CONVERSION]: { concurrency: 10 },
|
||||
@ -237,6 +240,7 @@ describe(JobService.name, () => {
|
||||
expect(jobMock.setConcurrency).toHaveBeenCalledWith(QueueName.OBJECT_TAGGING, 10);
|
||||
expect(jobMock.setConcurrency).toHaveBeenCalledWith(QueueName.RECOGNIZE_FACES, 10);
|
||||
expect(jobMock.setConcurrency).toHaveBeenCalledWith(QueueName.SIDECAR, 10);
|
||||
expect(jobMock.setConcurrency).toHaveBeenCalledWith(QueueName.LIBRARY, 10);
|
||||
expect(jobMock.setConcurrency).toHaveBeenCalledWith(QueueName.STORAGE_TEMPLATE_MIGRATION, 10);
|
||||
expect(jobMock.setConcurrency).toHaveBeenCalledWith(QueueName.THUMBNAIL_GENERATION, 10);
|
||||
expect(jobMock.setConcurrency).toHaveBeenCalledWith(QueueName.VIDEO_CONVERSION, 10);
|
||||
|
@ -98,6 +98,9 @@ export class JobService {
|
||||
await this.configCore.requireFeature(FeatureFlag.FACIAL_RECOGNITION);
|
||||
return this.jobRepository.queue({ name: JobName.QUEUE_RECOGNIZE_FACES, data: { force } });
|
||||
|
||||
case QueueName.LIBRARY:
|
||||
return this.jobRepository.queue({ name: JobName.LIBRARY_QUEUE_SCAN_ALL, data: { force } });
|
||||
|
||||
default:
|
||||
throw new BadRequestException(`Invalid job name: ${name}`);
|
||||
}
|
||||
@ -118,7 +121,7 @@ export class JobService {
|
||||
await this.onDone(item);
|
||||
}
|
||||
} catch (error: Error | any) {
|
||||
this.logger.error(`Unable to run job handler: ${error}`, error?.stack, data);
|
||||
this.logger.error(`Unable to run job handler (${queueName}/${name}): ${error}`, error?.stack, data);
|
||||
}
|
||||
});
|
||||
}
|
||||
@ -138,6 +141,7 @@ export class JobService {
|
||||
await this.jobRepository.queue({ name: JobName.PERSON_CLEANUP });
|
||||
await this.jobRepository.queue({ name: JobName.QUEUE_GENERATE_THUMBNAILS, data: { force: false } });
|
||||
await this.jobRepository.queue({ name: JobName.CLEAN_OLD_AUDIT_LOGS });
|
||||
await this.jobRepository.queue({ name: JobName.LIBRARY_QUEUE_SCAN_ALL, data: { force: false } });
|
||||
}
|
||||
|
||||
/**
|
||||
|
3
server/src/domain/library/index.ts
Normal file
@ -0,0 +1,3 @@
|
||||
export * from './library.dto';
|
||||
export * from './library.repository';
|
||||
export * from './library.service';
|
124
server/src/domain/library/library.dto.ts
Normal file
@ -0,0 +1,124 @@
|
||||
import { LibraryEntity, LibraryType } from '@app/infra/entities';
|
||||
import { ApiProperty } from '@nestjs/swagger';
|
||||
import { IsBoolean, IsEnum, IsNotEmpty, IsOptional, IsString } from 'class-validator';
|
||||
import { ValidateUUID } from '../domain.util';
|
||||
|
||||
export class CreateLibraryDto {
|
||||
@IsEnum(LibraryType)
|
||||
@ApiProperty({ enumName: 'LibraryType', enum: LibraryType })
|
||||
type!: LibraryType;
|
||||
|
||||
@IsString()
|
||||
@IsOptional()
|
||||
@IsNotEmpty()
|
||||
name?: string;
|
||||
|
||||
@IsOptional()
|
||||
@IsBoolean()
|
||||
isVisible?: boolean;
|
||||
|
||||
@IsOptional()
|
||||
@IsString({ each: true })
|
||||
@IsNotEmpty({ each: true })
|
||||
importPaths?: string[];
|
||||
|
||||
@IsOptional()
|
||||
@IsString({ each: true })
|
||||
@IsNotEmpty({ each: true })
|
||||
exclusionPatterns?: string[];
|
||||
}
|
||||
|
||||
export class UpdateLibraryDto {
|
||||
@IsOptional()
|
||||
@IsString()
|
||||
@IsNotEmpty()
|
||||
name?: string;
|
||||
|
||||
@IsOptional()
|
||||
@IsBoolean()
|
||||
isVisible?: boolean;
|
||||
|
||||
@IsOptional()
|
||||
@IsString({ each: true })
|
||||
@IsNotEmpty({ each: true })
|
||||
importPaths?: string[];
|
||||
|
||||
@IsOptional()
|
||||
@IsNotEmpty({ each: true })
|
||||
@IsString({ each: true })
|
||||
exclusionPatterns?: string[];
|
||||
}
|
||||
|
||||
export class CrawlOptionsDto {
|
||||
pathsToCrawl!: string[];
|
||||
includeHidden? = false;
|
||||
exclusionPatterns?: string[];
|
||||
}
|
||||
|
||||
export class LibrarySearchDto {
|
||||
@ValidateUUID({ optional: true })
|
||||
userId?: string;
|
||||
}
|
||||
|
||||
export class ScanLibraryDto {
|
||||
@IsBoolean()
|
||||
@IsOptional()
|
||||
refreshModifiedFiles?: boolean;
|
||||
|
||||
@IsBoolean()
|
||||
@IsOptional()
|
||||
refreshAllFiles?: boolean = false;
|
||||
}
|
||||
|
||||
export class LibraryResponseDto {
|
||||
id!: string;
|
||||
ownerId!: string;
|
||||
name!: string;
|
||||
|
||||
@ApiProperty({ enumName: 'LibraryType', enum: LibraryType })
|
||||
type!: LibraryType;
|
||||
|
||||
@ApiProperty({ type: 'integer' })
|
||||
assetCount!: number;
|
||||
|
||||
importPaths!: string[];
|
||||
|
||||
exclusionPatterns!: string[];
|
||||
|
||||
createdAt!: Date;
|
||||
updatedAt!: Date;
|
||||
refreshedAt!: Date | null;
|
||||
}
|
||||
|
||||
export class LibraryStatsResponseDto {
|
||||
@ApiProperty({ type: 'integer' })
|
||||
photos = 0;
|
||||
|
||||
@ApiProperty({ type: 'integer' })
|
||||
videos = 0;
|
||||
|
||||
@ApiProperty({ type: 'integer' })
|
||||
total = 0;
|
||||
|
||||
@ApiProperty({ type: 'integer', format: 'int64' })
|
||||
usage = 0;
|
||||
}
|
||||
|
||||
export function mapLibrary(entity: LibraryEntity): LibraryResponseDto {
|
||||
let assetCount = 0;
|
||||
if (entity.assets) {
|
||||
assetCount = entity.assets.length;
|
||||
}
|
||||
return {
|
||||
id: entity.id,
|
||||
ownerId: entity.ownerId,
|
||||
type: entity.type,
|
||||
name: entity.name,
|
||||
createdAt: entity.createdAt,
|
||||
updatedAt: entity.updatedAt,
|
||||
refreshedAt: entity.refreshedAt,
|
||||
assetCount,
|
||||
importPaths: entity.importPaths,
|
||||
exclusionPatterns: entity.exclusionPatterns,
|
||||
};
|
||||
}
|
22
server/src/domain/library/library.repository.ts
Normal file
22
server/src/domain/library/library.repository.ts
Normal file
@ -0,0 +1,22 @@
|
||||
import { LibraryEntity, LibraryType } from '@app/infra/entities';
|
||||
import { LibraryStatsResponseDto } from './library.dto';
|
||||
|
||||
export const ILibraryRepository = 'ILibraryRepository';
|
||||
|
||||
export interface ILibraryRepository {
|
||||
getCountForUser(ownerId: string): Promise<number>;
|
||||
getAllByUserId(userId: string, type?: LibraryType): Promise<LibraryEntity[]>;
|
||||
getAll(withDeleted?: boolean, type?: LibraryType): Promise<LibraryEntity[]>;
|
||||
getAllDeleted(): Promise<LibraryEntity[]>;
|
||||
get(id: string, withDeleted?: boolean): Promise<LibraryEntity | null>;
|
||||
create(library: Partial<LibraryEntity>): Promise<LibraryEntity>;
|
||||
delete(id: string): Promise<void>;
|
||||
softDelete(id: string): Promise<void>;
|
||||
getDefaultUploadLibrary(ownerId: string): Promise<LibraryEntity | null>;
|
||||
getUploadLibraryCount(ownerId: string): Promise<number>;
|
||||
update(library: Partial<LibraryEntity>): Promise<LibraryEntity>;
|
||||
getStatistics(id: string): Promise<LibraryStatsResponseDto>;
|
||||
getOnlineAssetPaths(id: string): Promise<string[]>;
|
||||
getAssetIds(id: string, withDeleted?: boolean): Promise<string[]>;
|
||||
existsByName(name: string, withDeleted?: boolean): Promise<boolean>;
|
||||
}
|
1204
server/src/domain/library/library.service.spec.ts
Normal file
1204
server/src/domain/library/library.service.spec.ts
Normal file
File diff suppressed because it is too large
Load Diff
468
server/src/domain/library/library.service.ts
Normal file
468
server/src/domain/library/library.service.ts
Normal file
@ -0,0 +1,468 @@
|
||||
import { AssetType, LibraryType } from '@app/infra/entities';
|
||||
import { BadRequestException, Inject, Injectable, Logger } from '@nestjs/common';
|
||||
import { R_OK } from 'node:constants';
|
||||
import { Stats } from 'node:fs';
|
||||
import path from 'node:path';
|
||||
import { basename, parse } from 'path';
|
||||
import { AccessCore, IAccessRepository, Permission } from '../access';
|
||||
import { IAssetRepository, WithProperty } from '../asset';
|
||||
import { AuthUserDto } from '../auth';
|
||||
import { usePagination } from '../domain.util';
|
||||
|
||||
import { ICryptoRepository } from '../crypto';
|
||||
import { mimeTypes } from '../domain.constant';
|
||||
import {
|
||||
IBaseJob,
|
||||
IEntityJob,
|
||||
IJobRepository,
|
||||
ILibraryFileJob,
|
||||
ILibraryRefreshJob,
|
||||
IOfflineLibraryFileJob,
|
||||
JobName,
|
||||
JOBS_ASSET_PAGINATION_SIZE,
|
||||
} from '../job';
|
||||
import { IStorageRepository } from '../storage';
|
||||
import { IUserRepository } from '../user';
|
||||
import {
|
||||
CreateLibraryDto,
|
||||
LibraryResponseDto,
|
||||
LibraryStatsResponseDto,
|
||||
mapLibrary,
|
||||
ScanLibraryDto,
|
||||
UpdateLibraryDto,
|
||||
} from './library.dto';
|
||||
import { ILibraryRepository } from './library.repository';
|
||||
|
||||
@Injectable()
|
||||
export class LibraryService {
|
||||
readonly logger = new Logger(LibraryService.name);
|
||||
private access: AccessCore;
|
||||
|
||||
constructor(
|
||||
@Inject(IAccessRepository) accessRepository: IAccessRepository,
|
||||
@Inject(IAssetRepository) private assetRepository: IAssetRepository,
|
||||
@Inject(ICryptoRepository) private cryptoRepository: ICryptoRepository,
|
||||
@Inject(IJobRepository) private jobRepository: IJobRepository,
|
||||
@Inject(ILibraryRepository) private repository: ILibraryRepository,
|
||||
@Inject(IStorageRepository) private storageRepository: IStorageRepository,
|
||||
@Inject(IUserRepository) private userRepository: IUserRepository,
|
||||
) {
|
||||
this.access = new AccessCore(accessRepository);
|
||||
}
|
||||
|
||||
async getStatistics(authUser: AuthUserDto, id: string): Promise<LibraryStatsResponseDto> {
|
||||
await this.access.requirePermission(authUser, Permission.LIBRARY_READ, id);
|
||||
return this.repository.getStatistics(id);
|
||||
}
|
||||
|
||||
async getCount(authUser: AuthUserDto): Promise<number> {
|
||||
return this.repository.getCountForUser(authUser.id);
|
||||
}
|
||||
|
||||
async getAllForUser(authUser: AuthUserDto): Promise<LibraryResponseDto[]> {
|
||||
const libraries = await this.repository.getAllByUserId(authUser.id);
|
||||
return libraries.map((library) => mapLibrary(library));
|
||||
}
|
||||
|
||||
async get(authUser: AuthUserDto, id: string): Promise<LibraryResponseDto> {
|
||||
await this.access.requirePermission(authUser, Permission.LIBRARY_READ, id);
|
||||
const library = await this.findOrFail(id);
|
||||
return mapLibrary(library);
|
||||
}
|
||||
|
||||
async handleQueueCleanup(): Promise<boolean> {
|
||||
this.logger.debug('Cleaning up any pending library deletions');
|
||||
const pendingDeletion = await this.repository.getAllDeleted();
|
||||
for (const libraryToDelete of pendingDeletion) {
|
||||
await this.jobRepository.queue({ name: JobName.LIBRARY_DELETE, data: { id: libraryToDelete.id } });
|
||||
}
|
||||
return true;
|
||||
}
|
||||
|
||||
async create(authUser: AuthUserDto, dto: CreateLibraryDto): Promise<LibraryResponseDto> {
|
||||
switch (dto.type) {
|
||||
case LibraryType.EXTERNAL:
|
||||
if (!dto.name) {
|
||||
dto.name = 'New External Library';
|
||||
}
|
||||
break;
|
||||
case LibraryType.UPLOAD:
|
||||
if (!dto.name) {
|
||||
dto.name = 'New Upload Library';
|
||||
}
|
||||
if (dto.importPaths && dto.importPaths.length > 0) {
|
||||
throw new BadRequestException('Upload libraries cannot have import paths');
|
||||
}
|
||||
if (dto.exclusionPatterns && dto.exclusionPatterns.length > 0) {
|
||||
throw new BadRequestException('Upload libraries cannot have exclusion patterns');
|
||||
}
|
||||
break;
|
||||
}
|
||||
|
||||
const library = await this.repository.create({
|
||||
ownerId: authUser.id,
|
||||
name: dto.name,
|
||||
type: dto.type,
|
||||
importPaths: dto.importPaths ?? [],
|
||||
exclusionPatterns: dto.exclusionPatterns ?? [],
|
||||
isVisible: dto.isVisible ?? true,
|
||||
});
|
||||
|
||||
return mapLibrary(library);
|
||||
}
|
||||
|
||||
async update(authUser: AuthUserDto, id: string, dto: UpdateLibraryDto): Promise<LibraryResponseDto> {
|
||||
await this.access.requirePermission(authUser, Permission.LIBRARY_UPDATE, id);
|
||||
const library = await this.repository.update({ id, ...dto });
|
||||
return mapLibrary(library);
|
||||
}
|
||||
|
||||
async delete(authUser: AuthUserDto, id: string) {
|
||||
await this.access.requirePermission(authUser, Permission.LIBRARY_DELETE, id);
|
||||
|
||||
const library = await this.findOrFail(id);
|
||||
const uploadCount = await this.repository.getUploadLibraryCount(authUser.id);
|
||||
if (library.type === LibraryType.UPLOAD && uploadCount <= 1) {
|
||||
throw new BadRequestException('Cannot delete the last upload library');
|
||||
}
|
||||
|
||||
await this.repository.softDelete(id);
|
||||
await this.jobRepository.queue({ name: JobName.LIBRARY_DELETE, data: { id } });
|
||||
}
|
||||
|
||||
async handleDeleteLibrary(job: IEntityJob): Promise<boolean> {
|
||||
const library = await this.repository.get(job.id, true);
|
||||
if (!library) {
|
||||
return false;
|
||||
}
|
||||
|
||||
// TODO use pagination
|
||||
const assetIds = await this.repository.getAssetIds(job.id);
|
||||
this.logger.debug(`Will delete ${assetIds.length} asset(s) in library ${job.id}`);
|
||||
// TODO queue a job for asset deletion
|
||||
await this.deleteAssets(assetIds);
|
||||
this.logger.log(`Deleting library ${job.id}`);
|
||||
await this.repository.delete(job.id);
|
||||
return true;
|
||||
}
|
||||
|
||||
async handleAssetRefresh(job: ILibraryFileJob) {
|
||||
const assetPath = path.normalize(job.assetPath);
|
||||
|
||||
const user = await this.userRepository.get(job.ownerId);
|
||||
if (!user?.externalPath) {
|
||||
this.logger.warn('User has no external path set, cannot import asset');
|
||||
return false;
|
||||
}
|
||||
|
||||
if (!path.normalize(assetPath).match(new RegExp(`^${user.externalPath}`))) {
|
||||
this.logger.error("Asset must be within the user's external path");
|
||||
return false;
|
||||
}
|
||||
|
||||
const existingAssetEntity = await this.assetRepository.getByLibraryIdAndOriginalPath(job.id, assetPath);
|
||||
|
||||
let stats: Stats;
|
||||
try {
|
||||
stats = await this.storageRepository.stat(assetPath);
|
||||
} catch (error: Error | any) {
|
||||
// Can't access file, probably offline
|
||||
if (existingAssetEntity) {
|
||||
// Mark asset as offline
|
||||
this.logger.debug(`Marking asset as offline: ${assetPath}`);
|
||||
|
||||
await this.assetRepository.save({ id: existingAssetEntity.id, isOffline: true });
|
||||
return true;
|
||||
} else {
|
||||
// File can't be accessed and does not already exist in db
|
||||
throw new BadRequestException("Can't access file", { cause: error });
|
||||
}
|
||||
}
|
||||
|
||||
let doImport = false;
|
||||
let doRefresh = false;
|
||||
|
||||
if (job.forceRefresh) {
|
||||
doRefresh = true;
|
||||
}
|
||||
|
||||
if (!existingAssetEntity) {
|
||||
// This asset is new to us, read it from disk
|
||||
this.logger.debug(`Importing new asset: ${assetPath}`);
|
||||
doImport = true;
|
||||
} else if (stats.mtime.toISOString() !== existingAssetEntity.fileModifiedAt.toISOString()) {
|
||||
// File modification time has changed since last time we checked, re-read from disk
|
||||
this.logger.debug(
|
||||
`File modification time has changed, re-importing asset: ${assetPath}. Old mtime: ${existingAssetEntity.fileModifiedAt}. New mtime: ${stats.mtime}`,
|
||||
);
|
||||
doRefresh = true;
|
||||
} else if (!job.forceRefresh && stats && !existingAssetEntity.isOffline) {
|
||||
// Asset exists on disk and in db and mtime has not changed. Also, we are not forcing refresn. Therefore, do nothing
|
||||
this.logger.debug(`Asset already exists in database and on disk, will not import: ${assetPath}`);
|
||||
}
|
||||
|
||||
if (stats && existingAssetEntity?.isOffline) {
|
||||
// File was previously offline but is now online
|
||||
this.logger.debug(`Marking previously-offline asset as online: ${assetPath}`);
|
||||
await this.assetRepository.save({ id: existingAssetEntity.id, isOffline: false });
|
||||
doRefresh = true;
|
||||
}
|
||||
|
||||
if (!doImport && !doRefresh) {
|
||||
// If we don't import, exit here
|
||||
return true;
|
||||
}
|
||||
|
||||
let assetType: AssetType;
|
||||
|
||||
if (mimeTypes.isImage(assetPath)) {
|
||||
assetType = AssetType.IMAGE;
|
||||
} else if (mimeTypes.isVideo(assetPath)) {
|
||||
assetType = AssetType.VIDEO;
|
||||
} else {
|
||||
throw new BadRequestException(`Unsupported file type ${assetPath}`);
|
||||
}
|
||||
|
||||
// TODO: doesn't xmp replace the file extension? Will need investigation
|
||||
let sidecarPath: string | null = null;
|
||||
if (await this.storageRepository.checkFileExists(`${assetPath}.xmp`, R_OK)) {
|
||||
sidecarPath = `${assetPath}.xmp`;
|
||||
}
|
||||
|
||||
const deviceAssetId = `${basename(assetPath)}`.replace(/\s+/g, '');
|
||||
|
||||
const pathHash = this.cryptoRepository.hashSha1(`path:${assetPath}`);
|
||||
|
||||
let assetId;
|
||||
if (doImport) {
|
||||
const library = await this.repository.get(job.id, true);
|
||||
if (library?.deletedAt) {
|
||||
this.logger.error('Cannot import asset into deleted library');
|
||||
return false;
|
||||
}
|
||||
|
||||
// TODO: In wait of refactoring the domain asset service, this function is just manually written like this
|
||||
const addedAsset = await this.assetRepository.create({
|
||||
ownerId: job.ownerId,
|
||||
libraryId: job.id,
|
||||
checksum: pathHash,
|
||||
originalPath: assetPath,
|
||||
deviceAssetId: deviceAssetId,
|
||||
deviceId: 'Library Import',
|
||||
fileCreatedAt: stats.ctime,
|
||||
fileModifiedAt: stats.mtime,
|
||||
type: assetType,
|
||||
originalFileName: parse(assetPath).name,
|
||||
sidecarPath,
|
||||
isReadOnly: true,
|
||||
isExternal: true,
|
||||
});
|
||||
assetId = addedAsset.id;
|
||||
} else if (doRefresh && existingAssetEntity) {
|
||||
assetId = existingAssetEntity.id;
|
||||
await this.assetRepository.updateAll([existingAssetEntity.id], {
|
||||
fileCreatedAt: stats.ctime,
|
||||
fileModifiedAt: stats.mtime,
|
||||
});
|
||||
} else {
|
||||
// Not importing and not refreshing, do nothing
|
||||
return true;
|
||||
}
|
||||
|
||||
this.logger.debug(`Queuing metadata extraction for: ${assetPath}`);
|
||||
|
||||
await this.jobRepository.queue({ name: JobName.METADATA_EXTRACTION, data: { id: assetId, source: 'upload' } });
|
||||
|
||||
if (assetType === AssetType.VIDEO) {
|
||||
await this.jobRepository.queue({ name: JobName.VIDEO_CONVERSION, data: { id: assetId } });
|
||||
}
|
||||
|
||||
return true;
|
||||
}
|
||||
|
||||
async queueScan(authUser: AuthUserDto, id: string, dto: ScanLibraryDto) {
|
||||
await this.access.requirePermission(authUser, Permission.LIBRARY_UPDATE, id);
|
||||
|
||||
const library = await this.repository.get(id);
|
||||
if (!library || library.type !== LibraryType.EXTERNAL) {
|
||||
throw new BadRequestException('Can only refresh external libraries');
|
||||
}
|
||||
|
||||
await this.jobRepository.queue({
|
||||
name: JobName.LIBRARY_SCAN,
|
||||
data: {
|
||||
id,
|
||||
refreshModifiedFiles: dto.refreshModifiedFiles ?? false,
|
||||
refreshAllFiles: dto.refreshAllFiles ?? false,
|
||||
},
|
||||
});
|
||||
}
|
||||
|
||||
async queueRemoveOffline(authUser: AuthUserDto, id: string) {
|
||||
this.logger.verbose(`Removing offline files from library: ${id}`);
|
||||
await this.access.requirePermission(authUser, Permission.LIBRARY_UPDATE, id);
|
||||
|
||||
await this.jobRepository.queue({
|
||||
name: JobName.LIBRARY_REMOVE_OFFLINE,
|
||||
data: {
|
||||
id,
|
||||
},
|
||||
});
|
||||
}
|
||||
|
||||
async handleQueueAllScan(job: IBaseJob): Promise<boolean> {
|
||||
this.logger.debug(`Refreshing all external libraries: force=${job.force}`);
|
||||
|
||||
// Queue cleanup
|
||||
await this.jobRepository.queue({ name: JobName.LIBRARY_QUEUE_CLEANUP, data: {} });
|
||||
|
||||
// Queue all library refresh
|
||||
const libraries = await this.repository.getAll(true, LibraryType.EXTERNAL);
|
||||
for (const library of libraries) {
|
||||
await this.jobRepository.queue({
|
||||
name: JobName.LIBRARY_SCAN,
|
||||
data: {
|
||||
id: library.id,
|
||||
refreshModifiedFiles: !job.force,
|
||||
refreshAllFiles: job.force ?? false,
|
||||
},
|
||||
});
|
||||
}
|
||||
return true;
|
||||
}
|
||||
|
||||
async handleOfflineRemoval(job: IEntityJob): Promise<boolean> {
|
||||
const assetPagination = usePagination(JOBS_ASSET_PAGINATION_SIZE, (pagination) => {
|
||||
return this.assetRepository.getWith(pagination, WithProperty.IS_OFFLINE, job.id);
|
||||
});
|
||||
|
||||
const assetIds: string[] = [];
|
||||
|
||||
for await (const assets of assetPagination) {
|
||||
for (const asset of assets) {
|
||||
assetIds.push(asset.id);
|
||||
}
|
||||
}
|
||||
|
||||
this.logger.verbose(`Found ${assetIds.length} offline assets to remove`);
|
||||
await this.deleteAssets(assetIds);
|
||||
return true;
|
||||
}
|
||||
|
||||
async handleQueueAssetRefresh(job: ILibraryRefreshJob): Promise<boolean> {
|
||||
const library = await this.repository.get(job.id);
|
||||
if (!library || library.type !== LibraryType.EXTERNAL) {
|
||||
this.logger.warn('Can only refresh external libraries');
|
||||
return false;
|
||||
}
|
||||
|
||||
const user = await this.userRepository.get(library.ownerId);
|
||||
if (!user?.externalPath) {
|
||||
this.logger.warn('User has no external path set, cannot refresh library');
|
||||
return false;
|
||||
}
|
||||
|
||||
this.logger.verbose(`Refreshing library: ${job.id}`);
|
||||
const crawledAssetPaths = (
|
||||
await this.storageRepository.crawl({
|
||||
pathsToCrawl: library.importPaths,
|
||||
exclusionPatterns: library.exclusionPatterns,
|
||||
})
|
||||
)
|
||||
.map(path.normalize)
|
||||
.filter((assetPath) =>
|
||||
// Filter out paths that are not within the user's external path
|
||||
assetPath.match(new RegExp(`^${user.externalPath}`)),
|
||||
);
|
||||
|
||||
this.logger.debug(`Found ${crawledAssetPaths.length} assets when crawling import paths ${library.importPaths}`);
|
||||
const assetsInLibrary = await this.assetRepository.getByLibraryId([job.id]);
|
||||
const offlineAssets = assetsInLibrary.filter((asset) => !crawledAssetPaths.includes(asset.originalPath));
|
||||
this.logger.debug(`${offlineAssets.length} assets in library are not present on disk and will be marked offline`);
|
||||
|
||||
for (const offlineAsset of offlineAssets) {
|
||||
const offlineJobData: IOfflineLibraryFileJob = {
|
||||
id: job.id,
|
||||
assetPath: offlineAsset.originalPath,
|
||||
};
|
||||
|
||||
await this.jobRepository.queue({ name: JobName.LIBRARY_MARK_ASSET_OFFLINE, data: offlineJobData });
|
||||
}
|
||||
|
||||
if (crawledAssetPaths.length > 0) {
|
||||
let filteredPaths: string[] = [];
|
||||
if (job.refreshAllFiles || job.refreshModifiedFiles) {
|
||||
filteredPaths = crawledAssetPaths;
|
||||
} else {
|
||||
const existingPaths = await this.repository.getOnlineAssetPaths(job.id);
|
||||
this.logger.debug(`Found ${existingPaths.length} existing asset(s) in library ${job.id}`);
|
||||
|
||||
filteredPaths = crawledAssetPaths.filter((assetPath) => !existingPaths.includes(assetPath));
|
||||
this.logger.debug(`After db comparison, ${filteredPaths.length} asset(s) remain to be imported`);
|
||||
}
|
||||
|
||||
for (const assetPath of filteredPaths) {
|
||||
const libraryJobData: ILibraryFileJob = {
|
||||
id: job.id,
|
||||
assetPath: path.normalize(assetPath),
|
||||
ownerId: library.ownerId,
|
||||
forceRefresh: job.refreshAllFiles ?? false,
|
||||
};
|
||||
|
||||
await this.jobRepository.queue({ name: JobName.LIBRARY_SCAN_ASSET, data: libraryJobData });
|
||||
}
|
||||
}
|
||||
|
||||
await this.repository.update({ id: job.id, refreshedAt: new Date() });
|
||||
|
||||
return true;
|
||||
}
|
||||
|
||||
async handleOfflineAsset(job: IOfflineLibraryFileJob): Promise<boolean> {
|
||||
const existingAssetEntity = await this.assetRepository.getByLibraryIdAndOriginalPath(job.id, job.assetPath);
|
||||
|
||||
if (existingAssetEntity) {
|
||||
this.logger.verbose(`Marking asset as offline: ${job.assetPath}`);
|
||||
await this.assetRepository.save({ id: existingAssetEntity.id, isOffline: true });
|
||||
}
|
||||
|
||||
return true;
|
||||
}
|
||||
|
||||
private async findOrFail(id: string) {
|
||||
const library = await this.repository.get(id);
|
||||
if (!library) {
|
||||
throw new BadRequestException('Library not found');
|
||||
}
|
||||
return library;
|
||||
}
|
||||
|
||||
private async deleteAssets(assetIds: string[]) {
|
||||
// TODO: this should be refactored to a centralized asset deletion service
|
||||
for (const assetId of assetIds) {
|
||||
const asset = await this.assetRepository.getById(assetId);
|
||||
this.logger.debug(`Removing asset from library: ${asset.originalPath}`);
|
||||
|
||||
if (asset.faces) {
|
||||
await Promise.all(
|
||||
asset.faces.map(({ assetId, personId }) =>
|
||||
this.jobRepository.queue({ name: JobName.SEARCH_REMOVE_FACE, data: { assetId, personId } }),
|
||||
),
|
||||
);
|
||||
}
|
||||
|
||||
await this.assetRepository.remove(asset);
|
||||
await this.jobRepository.queue({ name: JobName.SEARCH_REMOVE_ASSET, data: { ids: [asset.id] } });
|
||||
|
||||
await this.jobRepository.queue({
|
||||
name: JobName.DELETE_FILES,
|
||||
data: { files: [asset.webpPath, asset.resizePath, asset.encodedVideoPath, asset.sidecarPath] },
|
||||
});
|
||||
|
||||
// TODO refactor this to use cascades
|
||||
if (asset.livePhotoVideoId && !assetIds.includes(asset.livePhotoVideoId)) {
|
||||
assetIds.push(asset.livePhotoVideoId);
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
@ -90,8 +90,9 @@ export class StorageTemplateService {
|
||||
}
|
||||
|
||||
async moveAsset(asset: AssetEntity, metadata: MoveAssetMetadata) {
|
||||
if (asset.isReadOnly) {
|
||||
this.logger.verbose(`Not moving read-only asset: ${asset.originalPath}`);
|
||||
if (asset.isReadOnly || asset.isExternal) {
|
||||
// External assets are not affected by storage template
|
||||
// TODO: shouldn't this only apply to external assets?
|
||||
return;
|
||||
}
|
||||
|
||||
|
@ -1,4 +1,6 @@
|
||||
import { Stats } from 'fs';
|
||||
import { Readable } from 'stream';
|
||||
import { CrawlOptionsDto } from '../library';
|
||||
|
||||
export interface ImmichReadStream {
|
||||
stream: Readable;
|
||||
@ -30,4 +32,6 @@ export interface IStorageRepository {
|
||||
mkdirSync(filepath: string): void;
|
||||
checkDiskUsage(folder: string): Promise<DiskUsage>;
|
||||
readdir(folder: string): Promise<string[]>;
|
||||
stat(filepath: string): Promise<Stats>;
|
||||
crawl(crawlOptions: CrawlOptionsDto): Promise<string[]>;
|
||||
}
|
||||
|
@ -70,4 +70,10 @@ export class SystemConfigJobDto implements Record<QueueName, JobSettingsDto> {
|
||||
@IsObject()
|
||||
@Type(() => JobSettingsDto)
|
||||
[QueueName.SIDECAR]!: JobSettingsDto;
|
||||
|
||||
@ApiProperty({ type: JobSettingsDto })
|
||||
@ValidateNested()
|
||||
@IsObject()
|
||||
@Type(() => JobSettingsDto)
|
||||
[QueueName.LIBRARY]!: JobSettingsDto;
|
||||
}
|
||||
|
@ -51,6 +51,7 @@ export const defaults = Object.freeze<SystemConfig>({
|
||||
[QueueName.RECOGNIZE_FACES]: { concurrency: 2 },
|
||||
[QueueName.SEARCH]: { concurrency: 5 },
|
||||
[QueueName.SIDECAR]: { concurrency: 5 },
|
||||
[QueueName.LIBRARY]: { concurrency: 1 },
|
||||
[QueueName.STORAGE_TEMPLATE_MIGRATION]: { concurrency: 5 },
|
||||
[QueueName.THUMBNAIL_GENERATION]: { concurrency: 5 },
|
||||
[QueueName.VIDEO_CONVERSION]: { concurrency: 1 },
|
||||
|
@ -31,6 +31,7 @@ const updatedConfig = Object.freeze<SystemConfig>({
|
||||
[QueueName.RECOGNIZE_FACES]: { concurrency: 2 },
|
||||
[QueueName.SEARCH]: { concurrency: 5 },
|
||||
[QueueName.SIDECAR]: { concurrency: 5 },
|
||||
[QueueName.LIBRARY]: { concurrency: 1 },
|
||||
[QueueName.STORAGE_TEMPLATE_MIGRATION]: { concurrency: 5 },
|
||||
[QueueName.THUMBNAIL_GENERATION]: { concurrency: 5 },
|
||||
[QueueName.VIDEO_CONVERSION]: { concurrency: 1 },
|
||||
|
@ -1,4 +1,4 @@
|
||||
import { UserEntity } from '@app/infra/entities';
|
||||
import { LibraryType, UserEntity } from '@app/infra/entities';
|
||||
import {
|
||||
BadRequestException,
|
||||
ForbiddenException,
|
||||
@ -11,6 +11,7 @@ import fs from 'fs/promises';
|
||||
import sanitize from 'sanitize-filename';
|
||||
import { AuthUserDto } from '../auth';
|
||||
import { ICryptoRepository } from '../crypto';
|
||||
import { ILibraryRepository } from '../library/library.repository';
|
||||
import { IUserRepository, UserListFilter } from './user.repository';
|
||||
|
||||
const SALT_ROUNDS = 10;
|
||||
@ -18,6 +19,7 @@ const SALT_ROUNDS = 10;
|
||||
export class UserCore {
|
||||
constructor(
|
||||
private userRepository: IUserRepository,
|
||||
private libraryRepository: ILibraryRepository,
|
||||
private cryptoRepository: ICryptoRepository,
|
||||
) {}
|
||||
|
||||
@ -91,7 +93,19 @@ export class UserCore {
|
||||
if (payload.storageLabel) {
|
||||
payload.storageLabel = sanitize(payload.storageLabel);
|
||||
}
|
||||
return this.userRepository.create(payload);
|
||||
|
||||
const userEntity = await this.userRepository.create(payload);
|
||||
await this.libraryRepository.create({
|
||||
owner: { id: userEntity.id } as UserEntity,
|
||||
name: 'Default Library',
|
||||
assets: [],
|
||||
type: LibraryType.UPLOAD,
|
||||
importPaths: [],
|
||||
exclusionPatterns: [],
|
||||
isVisible: true,
|
||||
});
|
||||
|
||||
return userEntity;
|
||||
} catch (e) {
|
||||
Logger.error(e, 'Create new user');
|
||||
throw new InternalServerErrorException('Failed to register new user');
|
||||
|
@ -10,6 +10,7 @@ import {
|
||||
newAssetRepositoryMock,
|
||||
newCryptoRepositoryMock,
|
||||
newJobRepositoryMock,
|
||||
newLibraryRepositoryMock,
|
||||
newStorageRepositoryMock,
|
||||
newUserRepositoryMock,
|
||||
userStub,
|
||||
@ -20,6 +21,7 @@ import { IAssetRepository } from '../asset';
|
||||
import { AuthUserDto } from '../auth';
|
||||
import { ICryptoRepository } from '../crypto';
|
||||
import { IJobRepository, JobName } from '../job';
|
||||
import { ILibraryRepository } from '../library';
|
||||
import { IStorageRepository } from '../storage';
|
||||
import { UpdateUserDto } from './dto/update-user.dto';
|
||||
import { UserResponseDto, mapUser } from './response-dto';
|
||||
@ -129,6 +131,7 @@ describe(UserService.name, () => {
|
||||
let albumMock: jest.Mocked<IAlbumRepository>;
|
||||
let assetMock: jest.Mocked<IAssetRepository>;
|
||||
let jobMock: jest.Mocked<IJobRepository>;
|
||||
let libraryMock: jest.Mocked<ILibraryRepository>;
|
||||
let storageMock: jest.Mocked<IStorageRepository>;
|
||||
|
||||
beforeEach(async () => {
|
||||
@ -136,10 +139,11 @@ describe(UserService.name, () => {
|
||||
albumMock = newAlbumRepositoryMock();
|
||||
assetMock = newAssetRepositoryMock();
|
||||
jobMock = newJobRepositoryMock();
|
||||
libraryMock = newLibraryRepositoryMock();
|
||||
storageMock = newStorageRepositoryMock();
|
||||
userMock = newUserRepositoryMock();
|
||||
|
||||
sut = new UserService(userMock, cryptoRepositoryMock, albumMock, assetMock, jobMock, storageMock);
|
||||
sut = new UserService(userMock, cryptoRepositoryMock, libraryMock, albumMock, assetMock, jobMock, storageMock);
|
||||
|
||||
when(userMock.get).calledWith(adminUser.id).mockResolvedValue(adminUser);
|
||||
when(userMock.get).calledWith(adminUser.id, undefined).mockResolvedValue(adminUser);
|
||||
|
@ -7,6 +7,7 @@ import { IAssetRepository } from '../asset/asset.repository';
|
||||
import { AuthUserDto } from '../auth';
|
||||
import { ICryptoRepository } from '../crypto/crypto.repository';
|
||||
import { IEntityJob, IJobRepository, JobName } from '../job';
|
||||
import { ILibraryRepository } from '../library/library.repository';
|
||||
import { StorageCore, StorageFolder } from '../storage';
|
||||
import { IStorageRepository } from '../storage/storage.repository';
|
||||
import { CreateUserDto, UpdateUserDto, UserCountDto } from './dto';
|
||||
@ -30,13 +31,13 @@ export class UserService {
|
||||
constructor(
|
||||
@Inject(IUserRepository) private userRepository: IUserRepository,
|
||||
@Inject(ICryptoRepository) cryptoRepository: ICryptoRepository,
|
||||
|
||||
@Inject(ILibraryRepository) libraryRepository: ILibraryRepository,
|
||||
@Inject(IAlbumRepository) private albumRepository: IAlbumRepository,
|
||||
@Inject(IAssetRepository) private assetRepository: IAssetRepository,
|
||||
@Inject(IJobRepository) private jobRepository: IJobRepository,
|
||||
@Inject(IStorageRepository) private storageRepository: IStorageRepository,
|
||||
) {
|
||||
this.userCore = new UserCore(userRepository, cryptoRepository);
|
||||
this.userCore = new UserCore(userRepository, libraryRepository, cryptoRepository);
|
||||
}
|
||||
|
||||
async getAll(authUser: AuthUserDto, isAll: boolean): Promise<UserResponseDto[]> {
|
||||
|
@ -22,7 +22,7 @@ export interface AssetOwnerCheck extends AssetCheck {
|
||||
export interface IAssetRepository {
|
||||
get(id: string): Promise<AssetEntity | null>;
|
||||
create(
|
||||
asset: Omit<AssetEntity, 'id' | 'createdAt' | 'updatedAt' | 'ownerId' | 'livePhotoVideoId'>,
|
||||
asset: Omit<AssetEntity, 'id' | 'createdAt' | 'updatedAt' | 'ownerId' | 'libraryId' | 'livePhotoVideoId'>,
|
||||
): Promise<AssetEntity>;
|
||||
remove(asset: AssetEntity): Promise<void>;
|
||||
getAllByUserId(userId: string, dto: AssetSearchDto): Promise<AssetEntity[]>;
|
||||
@ -146,6 +146,7 @@ export class AssetRepository implements IAssetRepository {
|
||||
faces: {
|
||||
person: true,
|
||||
},
|
||||
library: true,
|
||||
},
|
||||
});
|
||||
}
|
||||
|
@ -1,5 +1,5 @@
|
||||
import { AuthUserDto, IJobRepository, JobName, mimeTypes, UploadFile } from '@app/domain';
|
||||
import { AssetEntity, UserEntity } from '@app/infra/entities';
|
||||
import { AssetEntity, LibraryEntity, UserEntity } from '@app/infra/entities';
|
||||
import { parse } from 'node:path';
|
||||
import { IAssetRepository } from './asset-repository';
|
||||
import { CreateAssetDto, ImportAssetDto } from './dto/create-asset.dto';
|
||||
@ -19,6 +19,7 @@ export class AssetCore {
|
||||
): Promise<AssetEntity> {
|
||||
const asset = await this.repository.create({
|
||||
owner: { id: authUser.id } as UserEntity,
|
||||
library: { id: dto.libraryId } as LibraryEntity,
|
||||
|
||||
checksum: file.checksum,
|
||||
originalPath: file.originalPath,
|
||||
@ -45,6 +46,8 @@ export class AssetCore {
|
||||
faces: [],
|
||||
sidecarPath: sidecarPath || null,
|
||||
isReadOnly: dto.isReadOnly ?? false,
|
||||
isExternal: dto.isExternal ?? false,
|
||||
isOffline: dto.isOffline ?? false,
|
||||
});
|
||||
|
||||
await this.jobRepository.queue({ name: JobName.METADATA_EXTRACTION, data: { id: asset.id, source: 'upload' } });
|
||||
|
@ -1,14 +1,16 @@
|
||||
import { ICryptoRepository, IJobRepository, IStorageRepository, JobName } from '@app/domain';
|
||||
import { AssetEntity, AssetType, ExifEntity } from '@app/infra/entities';
|
||||
import { ICryptoRepository, IJobRepository, ILibraryRepository, IStorageRepository, JobName } from '@app/domain';
|
||||
import { ASSET_CHECKSUM_CONSTRAINT, AssetEntity, AssetType, ExifEntity } from '@app/infra/entities';
|
||||
import { BadRequestException } from '@nestjs/common';
|
||||
import {
|
||||
IAccessRepositoryMock,
|
||||
assetStub,
|
||||
authStub,
|
||||
fileStub,
|
||||
libraryStub,
|
||||
newAccessRepositoryMock,
|
||||
newCryptoRepositoryMock,
|
||||
newJobRepositoryMock,
|
||||
newLibraryRepositoryMock,
|
||||
newStorageRepositoryMock,
|
||||
} from '@test';
|
||||
import { when } from 'jest-when';
|
||||
@ -27,6 +29,7 @@ const _getCreateAssetDto = (): CreateAssetDto => {
|
||||
createAssetDto.isFavorite = false;
|
||||
createAssetDto.isArchived = false;
|
||||
createAssetDto.duration = '0:00:00.000000';
|
||||
createAssetDto.libraryId = 'libraryId';
|
||||
|
||||
return createAssetDto;
|
||||
};
|
||||
@ -89,6 +92,7 @@ describe('AssetService', () => {
|
||||
let cryptoMock: jest.Mocked<ICryptoRepository>;
|
||||
let jobMock: jest.Mocked<IJobRepository>;
|
||||
let storageMock: jest.Mocked<IStorageRepository>;
|
||||
let libraryMock: jest.Mocked<ILibraryRepository>;
|
||||
|
||||
beforeEach(() => {
|
||||
assetRepositoryMock = {
|
||||
@ -111,8 +115,9 @@ describe('AssetService', () => {
|
||||
cryptoMock = newCryptoRepositoryMock();
|
||||
jobMock = newJobRepositoryMock();
|
||||
storageMock = newStorageRepositoryMock();
|
||||
libraryMock = newLibraryRepositoryMock();
|
||||
|
||||
sut = new AssetService(accessMock, assetRepositoryMock, a, cryptoMock, jobMock, storageMock);
|
||||
sut = new AssetService(accessMock, assetRepositoryMock, a, cryptoMock, jobMock, libraryMock, storageMock);
|
||||
|
||||
when(assetRepositoryMock.get)
|
||||
.calledWith(assetStub.livePhotoStillAsset.id)
|
||||
@ -149,7 +154,7 @@ describe('AssetService', () => {
|
||||
};
|
||||
const dto = _getCreateAssetDto();
|
||||
const error = new QueryFailedError('', [], '');
|
||||
(error as any).constraint = 'UQ_userid_checksum';
|
||||
(error as any).constraint = ASSET_CHECKSUM_CONSTRAINT;
|
||||
|
||||
assetRepositoryMock.create.mockRejectedValue(error);
|
||||
assetRepositoryMock.getAssetsByChecksums.mockResolvedValue([_getAsset_1()]);
|
||||
@ -166,7 +171,7 @@ describe('AssetService', () => {
|
||||
it('should handle a live photo', async () => {
|
||||
const dto = _getCreateAssetDto();
|
||||
const error = new QueryFailedError('', [], '');
|
||||
(error as any).constraint = 'UQ_userid_checksum';
|
||||
(error as any).constraint = ASSET_CHECKSUM_CONSTRAINT;
|
||||
|
||||
assetRepositoryMock.create.mockResolvedValueOnce(assetStub.livePhotoMotionAsset);
|
||||
assetRepositoryMock.create.mockResolvedValueOnce(assetStub.livePhotoStillAsset);
|
||||
@ -217,7 +222,10 @@ describe('AssetService', () => {
|
||||
});
|
||||
|
||||
it('should return failed status a delete fails', async () => {
|
||||
assetRepositoryMock.get.mockResolvedValue({ id: 'asset1' } as AssetEntity);
|
||||
assetRepositoryMock.get.mockResolvedValue({
|
||||
id: 'asset1',
|
||||
library: libraryStub.uploadLibrary1,
|
||||
} as AssetEntity);
|
||||
assetRepositoryMock.remove.mockRejectedValue('delete failed');
|
||||
accessMock.asset.hasOwnerAccess.mockResolvedValue(true);
|
||||
|
||||
@ -261,6 +269,7 @@ describe('AssetService', () => {
|
||||
originalPath: 'original-path-1',
|
||||
resizePath: 'resize-path-1',
|
||||
webpPath: 'web-path-1',
|
||||
library: libraryStub.uploadLibrary1,
|
||||
};
|
||||
|
||||
const asset2 = {
|
||||
@ -269,6 +278,17 @@ describe('AssetService', () => {
|
||||
resizePath: 'resize-path-2',
|
||||
webpPath: 'web-path-2',
|
||||
encodedVideoPath: 'encoded-video-path-2',
|
||||
library: libraryStub.uploadLibrary1,
|
||||
};
|
||||
|
||||
// Can't be deleted since it's external
|
||||
const asset3 = {
|
||||
id: 'asset3',
|
||||
originalPath: 'original-path-3',
|
||||
resizePath: 'resize-path-3',
|
||||
webpPath: 'web-path-3',
|
||||
encodedVideoPath: 'encoded-video-path-2',
|
||||
library: libraryStub.externalLibrary1,
|
||||
};
|
||||
|
||||
when(assetRepositoryMock.get)
|
||||
@ -277,12 +297,16 @@ describe('AssetService', () => {
|
||||
when(assetRepositoryMock.get)
|
||||
.calledWith(asset2.id)
|
||||
.mockResolvedValue(asset2 as AssetEntity);
|
||||
when(assetRepositoryMock.get)
|
||||
.calledWith(asset3.id)
|
||||
.mockResolvedValue(asset3 as AssetEntity);
|
||||
|
||||
accessMock.asset.hasOwnerAccess.mockResolvedValue(true);
|
||||
|
||||
await expect(sut.deleteAll(authStub.user1, { ids: ['asset1', 'asset2'] })).resolves.toEqual([
|
||||
await expect(sut.deleteAll(authStub.user1, { ids: ['asset1', 'asset2', 'asset3'] })).resolves.toEqual([
|
||||
{ id: 'asset1', status: 'SUCCESS' },
|
||||
{ id: 'asset2', status: 'SUCCESS' },
|
||||
{ id: 'asset3', status: 'FAILED' },
|
||||
]);
|
||||
|
||||
expect(jobMock.queue.mock.calls).toEqual([
|
||||
@ -349,6 +373,7 @@ describe('AssetService', () => {
|
||||
..._getCreateAssetDto(),
|
||||
assetPath: '/data/user1/fake_path/asset_1.jpeg',
|
||||
isReadOnly: true,
|
||||
libraryId: 'library-id',
|
||||
}),
|
||||
).resolves.toEqual({ duplicate: false, id: 'asset-id' });
|
||||
|
||||
@ -357,7 +382,7 @@ describe('AssetService', () => {
|
||||
|
||||
it('should handle a duplicate if originalPath already exists', async () => {
|
||||
const error = new QueryFailedError('', [], '');
|
||||
(error as any).constraint = 'UQ_userid_checksum';
|
||||
(error as any).constraint = ASSET_CHECKSUM_CONSTRAINT;
|
||||
|
||||
assetRepositoryMock.create.mockRejectedValue(error);
|
||||
assetRepositoryMock.getAssetsByChecksums.mockResolvedValue([assetStub.image]);
|
||||
@ -369,6 +394,7 @@ describe('AssetService', () => {
|
||||
..._getCreateAssetDto(),
|
||||
assetPath: '/data/user1/fake_path/asset_1.jpeg',
|
||||
isReadOnly: true,
|
||||
libraryId: 'library-id',
|
||||
}),
|
||||
).resolves.toEqual({ duplicate: true, id: 'asset-id' });
|
||||
|
||||
|
@ -6,6 +6,7 @@ import {
|
||||
IAccessRepository,
|
||||
ICryptoRepository,
|
||||
IJobRepository,
|
||||
ILibraryRepository,
|
||||
IStorageRepository,
|
||||
JobName,
|
||||
mapAsset,
|
||||
@ -14,7 +15,7 @@ import {
|
||||
Permission,
|
||||
UploadFile,
|
||||
} from '@app/domain';
|
||||
import { AssetEntity, AssetType } from '@app/infra/entities';
|
||||
import { ASSET_CHECKSUM_CONSTRAINT, AssetEntity, AssetType, LibraryType } from '@app/infra/entities';
|
||||
import {
|
||||
BadRequestException,
|
||||
Inject,
|
||||
@ -65,6 +66,7 @@ export class AssetService {
|
||||
@InjectRepository(AssetEntity) private assetRepository: Repository<AssetEntity>,
|
||||
@Inject(ICryptoRepository) private cryptoRepository: ICryptoRepository,
|
||||
@Inject(IJobRepository) private jobRepository: IJobRepository,
|
||||
@Inject(ILibraryRepository) private libraryRepository: ILibraryRepository,
|
||||
@Inject(IStorageRepository) private storageRepository: IStorageRepository,
|
||||
) {
|
||||
this.assetCore = new AssetCore(_assetRepository, jobRepository);
|
||||
@ -93,6 +95,16 @@ export class AssetService {
|
||||
livePhotoAsset = await this.assetCore.create(authUser, livePhotoDto, livePhotoFile);
|
||||
}
|
||||
|
||||
if (!dto.libraryId) {
|
||||
// No library given, fall back to default upload library
|
||||
const defaultUploadLibrary = await this.libraryRepository.getDefaultUploadLibrary(authUser.id);
|
||||
|
||||
if (!defaultUploadLibrary) {
|
||||
throw new InternalServerErrorException('Cannot find default upload library for user ' + authUser.id);
|
||||
}
|
||||
dto.libraryId = defaultUploadLibrary.id;
|
||||
}
|
||||
|
||||
const asset = await this.assetCore.create(authUser, dto, file, livePhotoAsset?.id, sidecarFile?.originalPath);
|
||||
|
||||
return { id: asset.id, duplicate: false };
|
||||
@ -104,7 +116,7 @@ export class AssetService {
|
||||
});
|
||||
|
||||
// handle duplicates with a success response
|
||||
if (error instanceof QueryFailedError && (error as any).constraint === 'UQ_userid_checksum') {
|
||||
if (error instanceof QueryFailedError && (error as any).constraint === ASSET_CHECKSUM_CONSTRAINT) {
|
||||
const checksums = [file.checksum, livePhotoFile?.checksum].filter((checksum): checksum is Buffer => !!checksum);
|
||||
const [duplicate] = await this._assetRepository.getAssetsByChecksums(authUser.id, checksums);
|
||||
return { id: duplicate.id, duplicate: true };
|
||||
@ -156,22 +168,11 @@ export class AssetService {
|
||||
return { id: asset.id, duplicate: false };
|
||||
} catch (error: QueryFailedError | Error | any) {
|
||||
// handle duplicates with a success response
|
||||
if (error instanceof QueryFailedError && (error as any).constraint === 'UQ_userid_checksum') {
|
||||
if (error instanceof QueryFailedError && (error as any).constraint === ASSET_CHECKSUM_CONSTRAINT) {
|
||||
const [duplicate] = await this._assetRepository.getAssetsByChecksums(authUser.id, [assetFile.checksum]);
|
||||
return { id: duplicate.id, duplicate: true };
|
||||
}
|
||||
|
||||
if (error instanceof QueryFailedError && (error as any).constraint === 'UQ_4ed4f8052685ff5b1e7ca1058ba') {
|
||||
const duplicate = await this._assetRepository.getByOriginalPath(dto.assetPath);
|
||||
if (duplicate) {
|
||||
if (duplicate.ownerId === authUser.id) {
|
||||
return { id: duplicate.id, duplicate: true };
|
||||
}
|
||||
|
||||
throw new BadRequestException('Path in use by another user');
|
||||
}
|
||||
}
|
||||
|
||||
this.logger.error(`Error importing file ${error}`, error?.stack);
|
||||
throw new BadRequestException(`Error importing file`, `${error}`);
|
||||
}
|
||||
@ -183,7 +184,7 @@ export class AssetService {
|
||||
|
||||
public async getAllAssets(authUser: AuthUserDto, dto: AssetSearchDto): Promise<AssetResponseDto[]> {
|
||||
const userId = dto.userId || authUser.id;
|
||||
await this.access.requirePermission(authUser, Permission.LIBRARY_READ, userId);
|
||||
await this.access.requirePermission(authUser, Permission.TIMELINE_READ, userId);
|
||||
const assets = await this._assetRepository.getAllByUserId(userId, dto);
|
||||
return assets.map((asset) => mapAsset(asset));
|
||||
}
|
||||
@ -258,7 +259,8 @@ export class AssetService {
|
||||
}
|
||||
|
||||
const asset = await this._assetRepository.get(id);
|
||||
if (!asset) {
|
||||
if (!asset || !asset.library || asset.library.type === LibraryType.EXTERNAL) {
|
||||
// We don't allow deletions assets belong to an external library
|
||||
result.push({ id, status: DeleteAssetStatusEnum.FAILED });
|
||||
continue;
|
||||
}
|
||||
@ -291,7 +293,8 @@ export class AssetService {
|
||||
if (asset.livePhotoVideoId && !ids.includes(asset.livePhotoVideoId)) {
|
||||
ids.push(asset.livePhotoVideoId);
|
||||
}
|
||||
} catch {
|
||||
} catch (error) {
|
||||
this.logger.error(`Error deleting asset ${id}`, error);
|
||||
result.push({ id, status: DeleteAssetStatusEnum.FAILED });
|
||||
}
|
||||
}
|
||||
|
@ -1,4 +1,4 @@
|
||||
import { Optional, toBoolean, UploadFieldName } from '@app/domain';
|
||||
import { Optional, toBoolean, UploadFieldName, ValidateUUID } from '@app/domain';
|
||||
import { ApiProperty } from '@nestjs/swagger';
|
||||
import { Transform, Type } from 'class-transformer';
|
||||
import { IsBoolean, IsDate, IsNotEmpty, IsString } from 'class-validator';
|
||||
@ -39,13 +39,24 @@ export class CreateAssetBase {
|
||||
@Optional()
|
||||
@IsString()
|
||||
duration?: string;
|
||||
|
||||
@Optional()
|
||||
@IsBoolean()
|
||||
isExternal?: boolean;
|
||||
|
||||
@Optional()
|
||||
@IsBoolean()
|
||||
isOffline?: boolean;
|
||||
}
|
||||
|
||||
export class CreateAssetDto extends CreateAssetBase {
|
||||
@Optional()
|
||||
@IsBoolean()
|
||||
@Transform(toBoolean)
|
||||
isReadOnly?: boolean = false;
|
||||
isReadOnly?: boolean;
|
||||
|
||||
@ValidateUUID({ optional: true })
|
||||
libraryId?: string;
|
||||
|
||||
// The properties below are added to correctly generate the API docs
|
||||
// and client SDKs. Validation should be handled in the controller.
|
||||
@ -65,6 +76,9 @@ export class ImportAssetDto extends CreateAssetBase {
|
||||
@Transform(toBoolean)
|
||||
isReadOnly?: boolean = true;
|
||||
|
||||
@ValidateUUID()
|
||||
libraryId?: string;
|
||||
|
||||
@IsString()
|
||||
@IsNotEmpty()
|
||||
assetPath!: string;
|
||||
|
@ -19,6 +19,7 @@ import {
|
||||
AuditController,
|
||||
AuthController,
|
||||
JobController,
|
||||
LibraryController,
|
||||
OAuthController,
|
||||
PartnerController,
|
||||
PersonController,
|
||||
@ -46,6 +47,7 @@ import {
|
||||
AuditController,
|
||||
AuthController,
|
||||
JobController,
|
||||
LibraryController,
|
||||
OAuthController,
|
||||
PartnerController,
|
||||
SearchController,
|
||||
|
@ -5,6 +5,7 @@ export * from './asset.controller';
|
||||
export * from './audit.controller';
|
||||
export * from './auth.controller';
|
||||
export * from './job.controller';
|
||||
export * from './library.controller';
|
||||
export * from './oauth.controller';
|
||||
export * from './partner.controller';
|
||||
export * from './person.controller';
|
||||
|
69
server/src/immich/controllers/library.controller.ts
Normal file
69
server/src/immich/controllers/library.controller.ts
Normal file
@ -0,0 +1,69 @@
|
||||
import {
|
||||
AuthUserDto,
|
||||
CreateLibraryDto as CreateDto,
|
||||
LibraryService,
|
||||
LibraryStatsResponseDto,
|
||||
LibraryResponseDto as ResponseDto,
|
||||
ScanLibraryDto,
|
||||
UpdateLibraryDto as UpdateDto,
|
||||
} from '@app/domain';
|
||||
import { Body, Controller, Delete, Get, Param, Post, Put } from '@nestjs/common';
|
||||
import { ApiTags } from '@nestjs/swagger';
|
||||
import { AuthUser, Authenticated } from '../app.guard';
|
||||
import { UseValidation } from '../app.utils';
|
||||
import { UUIDParamDto } from './dto/uuid-param.dto';
|
||||
|
||||
@ApiTags('Library')
|
||||
@Controller('library')
|
||||
@Authenticated()
|
||||
@UseValidation()
|
||||
export class LibraryController {
|
||||
constructor(private service: LibraryService) {}
|
||||
|
||||
@Get()
|
||||
getAllForUser(@AuthUser() authUser: AuthUserDto): Promise<ResponseDto[]> {
|
||||
return this.service.getAllForUser(authUser);
|
||||
}
|
||||
|
||||
@Post()
|
||||
createLibrary(@AuthUser() authUser: AuthUserDto, @Body() dto: CreateDto): Promise<ResponseDto> {
|
||||
return this.service.create(authUser, dto);
|
||||
}
|
||||
|
||||
@Put(':id')
|
||||
updateLibrary(
|
||||
@AuthUser() authUser: AuthUserDto,
|
||||
@Param() { id }: UUIDParamDto,
|
||||
@Body() dto: UpdateDto,
|
||||
): Promise<ResponseDto> {
|
||||
return this.service.update(authUser, id, dto);
|
||||
}
|
||||
|
||||
@Get(':id')
|
||||
getLibraryInfo(@AuthUser() authUser: AuthUserDto, @Param() { id }: UUIDParamDto): Promise<ResponseDto> {
|
||||
return this.service.get(authUser, id);
|
||||
}
|
||||
|
||||
@Delete(':id')
|
||||
deleteLibrary(@AuthUser() authUser: AuthUserDto, @Param() { id }: UUIDParamDto): Promise<void> {
|
||||
return this.service.delete(authUser, id);
|
||||
}
|
||||
|
||||
@Get(':id/statistics')
|
||||
getLibraryStatistics(
|
||||
@AuthUser() authUser: AuthUserDto,
|
||||
@Param() { id }: UUIDParamDto,
|
||||
): Promise<LibraryStatsResponseDto> {
|
||||
return this.service.getStatistics(authUser, id);
|
||||
}
|
||||
|
||||
@Post(':id/scan')
|
||||
scanLibrary(@AuthUser() authUser: AuthUserDto, @Param() { id }: UUIDParamDto, @Body() dto: ScanLibraryDto) {
|
||||
return this.service.queueScan(authUser, id, dto);
|
||||
}
|
||||
|
||||
@Post(':id/removeOffline')
|
||||
removeOfflineFiles(@AuthUser() authUser: AuthUserDto, @Param() { id }: UUIDParamDto) {
|
||||
return this.service.queueRemoveOffline(authUser, id);
|
||||
}
|
||||
}
|
@ -10,19 +10,25 @@ import {
|
||||
OneToMany,
|
||||
OneToOne,
|
||||
PrimaryGeneratedColumn,
|
||||
Unique,
|
||||
UpdateDateColumn,
|
||||
} from 'typeorm';
|
||||
import { AlbumEntity } from './album.entity';
|
||||
import { AssetFaceEntity } from './asset-face.entity';
|
||||
import { ExifEntity } from './exif.entity';
|
||||
import { LibraryEntity } from './library.entity';
|
||||
import { SharedLinkEntity } from './shared-link.entity';
|
||||
import { SmartInfoEntity } from './smart-info.entity';
|
||||
import { TagEntity } from './tag.entity';
|
||||
import { UserEntity } from './user.entity';
|
||||
|
||||
export const ASSET_CHECKSUM_CONSTRAINT = 'UQ_assets_owner_library_checksum';
|
||||
|
||||
@Entity('assets')
|
||||
@Unique('UQ_userid_checksum', ['owner', 'checksum'])
|
||||
// Checksums must be unique per user and library
|
||||
@Index(ASSET_CHECKSUM_CONSTRAINT, ['owner', 'library', 'checksum'], {
|
||||
unique: true,
|
||||
})
|
||||
// For all assets, each originalpath must be unique per user and library
|
||||
export class AssetEntity {
|
||||
@PrimaryGeneratedColumn('uuid')
|
||||
id!: string;
|
||||
@ -36,13 +42,19 @@ export class AssetEntity {
|
||||
@Column()
|
||||
ownerId!: string;
|
||||
|
||||
@ManyToOne(() => LibraryEntity, { onDelete: 'CASCADE', onUpdate: 'CASCADE', nullable: false })
|
||||
library!: LibraryEntity;
|
||||
|
||||
@Column()
|
||||
libraryId!: string;
|
||||
|
||||
@Column()
|
||||
deviceId!: string;
|
||||
|
||||
@Column()
|
||||
type!: AssetType;
|
||||
|
||||
@Column({ unique: true })
|
||||
@Column()
|
||||
originalPath!: string;
|
||||
|
||||
@Column({ type: 'varchar', nullable: true })
|
||||
@ -75,9 +87,15 @@ export class AssetEntity {
|
||||
@Column({ type: 'boolean', default: false })
|
||||
isArchived!: boolean;
|
||||
|
||||
@Column({ type: 'boolean', default: false })
|
||||
isExternal!: boolean;
|
||||
|
||||
@Column({ type: 'boolean', default: false })
|
||||
isReadOnly!: boolean;
|
||||
|
||||
@Column({ type: 'boolean', default: false })
|
||||
isOffline!: boolean;
|
||||
|
||||
@Column({ type: 'bytea' })
|
||||
@Index()
|
||||
checksum!: Buffer; // sha1 checksum
|
||||
|
@ -4,6 +4,7 @@ import { AssetFaceEntity } from './asset-face.entity';
|
||||
import { AssetEntity } from './asset.entity';
|
||||
import { AuditEntity } from './audit.entity';
|
||||
import { ExifEntity } from './exif.entity';
|
||||
import { LibraryEntity } from './library.entity';
|
||||
import { PartnerEntity } from './partner.entity';
|
||||
import { PersonEntity } from './person.entity';
|
||||
import { SharedLinkEntity } from './shared-link.entity';
|
||||
@ -19,6 +20,7 @@ export * from './asset-face.entity';
|
||||
export * from './asset.entity';
|
||||
export * from './audit.entity';
|
||||
export * from './exif.entity';
|
||||
export * from './library.entity';
|
||||
export * from './partner.entity';
|
||||
export * from './person.entity';
|
||||
export * from './shared-link.entity';
|
||||
@ -43,4 +45,5 @@ export const databaseEntities = [
|
||||
TagEntity,
|
||||
UserEntity,
|
||||
UserTokenEntity,
|
||||
LibraryEntity,
|
||||
];
|
||||
|
61
server/src/infra/entities/library.entity.ts
Normal file
61
server/src/infra/entities/library.entity.ts
Normal file
@ -0,0 +1,61 @@
|
||||
import {
|
||||
Column,
|
||||
CreateDateColumn,
|
||||
DeleteDateColumn,
|
||||
Entity,
|
||||
JoinTable,
|
||||
ManyToOne,
|
||||
OneToMany,
|
||||
PrimaryGeneratedColumn,
|
||||
UpdateDateColumn,
|
||||
} from 'typeorm';
|
||||
import { AssetEntity } from './asset.entity';
|
||||
import { UserEntity } from './user.entity';
|
||||
|
||||
@Entity('libraries')
|
||||
export class LibraryEntity {
|
||||
@PrimaryGeneratedColumn('uuid')
|
||||
id!: string;
|
||||
|
||||
@Column()
|
||||
name!: string;
|
||||
|
||||
@OneToMany(() => AssetEntity, (asset) => asset.library)
|
||||
@JoinTable()
|
||||
assets!: AssetEntity[];
|
||||
|
||||
@ManyToOne(() => UserEntity, { onDelete: 'CASCADE', onUpdate: 'CASCADE', nullable: false })
|
||||
owner!: UserEntity;
|
||||
|
||||
@Column()
|
||||
ownerId!: string;
|
||||
|
||||
@Column()
|
||||
type!: LibraryType;
|
||||
|
||||
@Column('text', { array: true })
|
||||
importPaths!: string[];
|
||||
|
||||
@Column('text', { array: true })
|
||||
exclusionPatterns!: string[];
|
||||
|
||||
@CreateDateColumn({ type: 'timestamptz' })
|
||||
createdAt!: Date;
|
||||
|
||||
@UpdateDateColumn({ type: 'timestamptz' })
|
||||
updatedAt!: Date;
|
||||
|
||||
@DeleteDateColumn({ type: 'timestamptz' })
|
||||
deletedAt?: Date;
|
||||
|
||||
@Column({ type: 'timestamptz', nullable: true })
|
||||
refreshedAt!: Date | null;
|
||||
|
||||
@Column({ type: 'boolean', default: true })
|
||||
isVisible!: boolean;
|
||||
}
|
||||
|
||||
export enum LibraryType {
|
||||
UPLOAD = 'UPLOAD',
|
||||
EXTERNAL = 'EXTERNAL',
|
||||
}
|
@ -42,6 +42,7 @@ export enum SystemConfigKey {
|
||||
JOB_STORAGE_TEMPLATE_MIGRATION_CONCURRENCY = 'job.storageTemplateMigration.concurrency',
|
||||
JOB_SEARCH_CONCURRENCY = 'job.search.concurrency',
|
||||
JOB_SIDECAR_CONCURRENCY = 'job.sidecar.concurrency',
|
||||
JOB_LIBRARY_CONCURRENCY = 'job.library.concurrency',
|
||||
|
||||
MACHINE_LEARNING_ENABLED = 'machineLearning.enabled',
|
||||
MACHINE_LEARNING_URL = 'machineLearning.url',
|
||||
|
@ -9,6 +9,7 @@ import {
|
||||
IGeocodingRepository,
|
||||
IJobRepository,
|
||||
IKeyRepository,
|
||||
ILibraryRepository,
|
||||
IMachineLearningRepository,
|
||||
IMediaRepository,
|
||||
immichAppConfig,
|
||||
@ -43,6 +44,7 @@ import {
|
||||
FilesystemProvider,
|
||||
GeocodingRepository,
|
||||
JobRepository,
|
||||
LibraryRepository,
|
||||
MachineLearningRepository,
|
||||
MediaRepository,
|
||||
PartnerRepository,
|
||||
@ -66,6 +68,7 @@ const providers: Provider[] = [
|
||||
{ provide: IFaceRepository, useClass: FaceRepository },
|
||||
{ provide: IGeocodingRepository, useClass: GeocodingRepository },
|
||||
{ provide: IJobRepository, useClass: JobRepository },
|
||||
{ provide: ILibraryRepository, useClass: LibraryRepository },
|
||||
{ provide: IKeyRepository, useClass: APIKeyRepository },
|
||||
{ provide: IMachineLearningRepository, useClass: MachineLearningRepository },
|
||||
{ provide: IPartnerRepository, useClass: PartnerRepository },
|
||||
|
57
server/src/infra/migrations/1688392120838-AddLibraryTable.ts
Normal file
57
server/src/infra/migrations/1688392120838-AddLibraryTable.ts
Normal file
@ -0,0 +1,57 @@
import { MigrationInterface, QueryRunner } from 'typeorm';

export class AddLibraries1688392120838 implements MigrationInterface {
  name = 'AddLibraryTable1688392120838';

  public async up(queryRunner: QueryRunner): Promise<void> {
    await queryRunner.query(`ALTER TABLE "assets" DROP CONSTRAINT "UQ_userid_checksum"`);
    await queryRunner.query(
      `CREATE TABLE "libraries" ("id" uuid NOT NULL DEFAULT uuid_generate_v4(), "name" character varying NOT NULL, "ownerId" uuid NOT NULL, "type" character varying NOT NULL, "importPaths" text array NOT NULL, "exclusionPatterns" text array NOT NULL, "createdAt" TIMESTAMP WITH TIME ZONE NOT NULL DEFAULT now(), "updatedAt" TIMESTAMP WITH TIME ZONE NOT NULL DEFAULT now(), "deletedAt" TIMESTAMP WITH TIME ZONE, "refreshedAt" TIMESTAMP WITH TIME ZONE, "isVisible" boolean NOT NULL DEFAULT true, CONSTRAINT "PK_505fedfcad00a09b3734b4223de" PRIMARY KEY ("id"))`,
    );
    await queryRunner.query(`ALTER TABLE "assets" ADD "isOffline" boolean NOT NULL DEFAULT false`);
    await queryRunner.query(`ALTER TABLE "assets" ADD "libraryId" uuid`);
    await queryRunner.query(`ALTER TABLE "assets" DROP CONSTRAINT "UQ_4ed4f8052685ff5b1e7ca1058ba"`);
    await queryRunner.query(`ALTER TABLE "assets" ADD "isExternal" boolean NOT NULL DEFAULT false`);

    await queryRunner.query(
      `CREATE UNIQUE INDEX "UQ_assets_owner_library_checksum" on "assets" ("ownerId", "libraryId", checksum)`,
    );
    await queryRunner.query(
      `ALTER TABLE "libraries" ADD CONSTRAINT "FK_0f6fc2fb195f24d19b0fb0d57c1" FOREIGN KEY ("ownerId") REFERENCES "users"("id") ON DELETE CASCADE ON UPDATE CASCADE`,
    );
    await queryRunner.query(
      `ALTER TABLE "assets" ADD CONSTRAINT "FK_9977c3c1de01c3d848039a6b90c" FOREIGN KEY ("libraryId") REFERENCES "libraries"("id") ON DELETE CASCADE ON UPDATE CASCADE`,
    );

    // Create default library for each user and assign all assets to it
    const userIds: string[] = (await queryRunner.query(`SELECT id FROM "users"`)).map((user: any) => user.id);

    for (const userId of userIds) {
      await queryRunner.query(
        `INSERT INTO "libraries" ("name", "ownerId", "type", "importPaths", "exclusionPatterns") VALUES ('Default Library', '${userId}', 'UPLOAD', '{}', '{}')`,
      );

      await queryRunner.query(
        `UPDATE "assets" SET "libraryId" = (SELECT id FROM "libraries" WHERE "ownerId" = '${userId}' LIMIT 1) WHERE "ownerId" = '${userId}'`,
      );
    }

    await queryRunner.query(`ALTER TABLE "assets" ALTER COLUMN "libraryId" SET NOT NULL`);
  }

  public async down(queryRunner: QueryRunner): Promise<void> {
    await queryRunner.query(`ALTER TABLE "assets" ALTER COLUMN "libraryId" DROP NOT NULL`);
    await queryRunner.query(`ALTER TABLE "assets" DROP CONSTRAINT "FK_9977c3c1de01c3d848039a6b90c"`);
    await queryRunner.query(`ALTER TABLE "libraries" DROP CONSTRAINT "FK_0f6fc2fb195f24d19b0fb0d57c1"`);
    await queryRunner.query(`DROP INDEX "UQ_assets_owner_library_checksum"`);
    await queryRunner.query(`ALTER TABLE "assets" DROP CONSTRAINT "UQ_owner_library_originalpath"`);
    await queryRunner.query(
      `ALTER TABLE "assets" ADD CONSTRAINT "UQ_4ed4f8052685ff5b1e7ca1058ba" UNIQUE ("originalPath")`,
    );
    await queryRunner.query(`ALTER TABLE "assets" DROP COLUMN "libraryId"`);
    await queryRunner.query(`ALTER TABLE "assets" DROP COLUMN "isOffline"`);
    await queryRunner.query(`ALTER TABLE "assets" DROP COLUMN "isExternal"`);
    await queryRunner.query(`DROP TABLE "libraries"`);
    await queryRunner.query(`ALTER TABLE "assets" ADD CONSTRAINT "UQ_userid_checksum" UNIQUE ("ownerId", "checksum")`);
  }
}
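The file above is a standard TypeORM migration: `up()` creates the `libraries` table, the new `assets` columns, and a per-user default upload library, while `down()` restores the previous schema. As a minimal sketch of how such a migration would be applied and reverted, assuming a `DataSource` configured to load the compiled migration files (the connection settings and paths below are assumptions, not part of this commit):

```typescript
import { DataSource } from 'typeorm';

// Assumed configuration for illustration only; the real settings live in the server's database config.
const dataSource = new DataSource({
  type: 'postgres',
  url: process.env.DB_URL,
  entities: ['dist/infra/entities/*.js'],
  migrations: ['dist/infra/migrations/*.js'],
});

async function migrate() {
  await dataSource.initialize();
  await dataSource.runMigrations(); // runs up(): creates "libraries" and backfills a default upload library per user
  // await dataSource.undoLastMigration(); // would run down() and restore the old unique constraints
  await dataSource.destroy();
}

migrate().catch((error) => {
  console.error(error);
  process.exit(1);
});
```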
@ -1,7 +1,7 @@
import { IAccessRepository } from '@app/domain';
import { InjectRepository } from '@nestjs/typeorm';
import { Repository } from 'typeorm';
import { AlbumEntity, AssetEntity, PartnerEntity, PersonEntity, SharedLinkEntity } from '../entities';
import { AlbumEntity, AssetEntity, LibraryEntity, PartnerEntity, PersonEntity, SharedLinkEntity } from '../entities';

export class AccessRepository implements IAccessRepository {
  constructor(
@ -10,9 +10,29 @@ export class AccessRepository implements IAccessRepository {
    @InjectRepository(PartnerEntity) private partnerRepository: Repository<PartnerEntity>,
    @InjectRepository(PersonEntity) private personRepository: Repository<PersonEntity>,
    @InjectRepository(SharedLinkEntity) private sharedLinkRepository: Repository<SharedLinkEntity>,
    @InjectRepository(LibraryEntity) private libraryRepository: Repository<LibraryEntity>,
  ) {}

  library = {
    hasOwnerAccess: (userId: string, libraryId: string): Promise<boolean> => {
      return this.libraryRepository.exist({
        where: {
          id: libraryId,
          ownerId: userId,
        },
      });
    },
    hasPartnerAccess: (userId: string, partnerId: string): Promise<boolean> => {
      return this.partnerRepository.exist({
        where: {
          sharedWithId: userId,
          sharedById: partnerId,
        },
      });
    },
  };

  timeline = {
    hasPartnerAccess: (userId: string, partnerId: string): Promise<boolean> => {
      return this.partnerRepository.exist({
        where: {
@ -38,6 +38,12 @@ export class AssetRepository implements IAssetRepository {
    await this.exifRepository.upsert(exif, { conflictPaths: ['assetId'] });
  }

  create(
    asset: Omit<AssetEntity, 'id' | 'createdAt' | 'updatedAt' | 'ownerId' | 'livePhotoVideoId'>,
  ): Promise<AssetEntity> {
    return this.repository.save(asset);
  }

  getByDate(ownerId: string, date: Date): Promise<AssetEntity[]> {
    // For reference of a correct approach although slower
@ -85,6 +91,7 @@ export class AssetRepository implements IAssetRepository {
      },
    });
  }

  async deleteAll(ownerId: string): Promise<void> {
    await this.repository.delete({ ownerId });
  }
@ -115,6 +122,39 @@ export class AssetRepository implements IAssetRepository {
    });
  }

  getByLibraryId(libraryIds: string[]): Promise<AssetEntity[]> {
    return this.repository.find({
      where: { library: { id: In(libraryIds) } },
    });
  }

  getByLibraryIdAndOriginalPath(libraryId: string, originalPath: string): Promise<AssetEntity | null> {
    return this.repository.findOne({
      where: { library: { id: libraryId }, originalPath: originalPath },
    });
  }

  getById(assetId: string): Promise<AssetEntity> {
    return this.repository.findOneOrFail({
      where: {
        id: assetId,
      },
      relations: {
        exifInfo: true,
        tags: true,
        sharedLinks: true,
        smartInfo: true,
        faces: {
          person: true,
        },
      },
    });
  }

  remove(asset: AssetEntity): Promise<AssetEntity> {
    return this.repository.remove(asset);
  }

  getAll(pagination: PaginationOptions, options: AssetSearchOptions = {}): Paginated<AssetEntity> {
    return paginate(this.repository, pagination, {
      where: {
@ -273,13 +313,19 @@ export class AssetRepository implements IAssetRepository {
    });
  }

  getWith(pagination: PaginationOptions, property: WithProperty): Paginated<AssetEntity> {
  getWith(pagination: PaginationOptions, property: WithProperty, libraryId?: string): Paginated<AssetEntity> {
    let where: FindOptionsWhere<AssetEntity> | FindOptionsWhere<AssetEntity>[] = {};

    switch (property) {
      case WithProperty.SIDECAR:
        where = [{ sidecarPath: Not(IsNull()), isVisible: true }];
        break;
      case WithProperty.IS_OFFLINE:
        if (!libraryId) {
          throw new Error('Library id is required when finding offline assets');
        }
        where = [{ isOffline: true, libraryId: libraryId }];
        break;

      default:
        throw new Error(`Invalid getWith property: ${property}`);
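The updated `getWith` signature above takes an optional `libraryId`, which is required for the new `IS_OFFLINE` case. A usage sketch of paging through one library's offline assets, assuming `Paginated<T>` exposes `items`/`hasNextPage` and `PaginationOptions` accepts `take`/`skip` as they do elsewhere in the codebase (the import paths are likewise assumptions):

```typescript
import { WithProperty } from '@app/domain';
import { AssetRepository } from './asset.repository';

// Collect the ids of all assets currently marked offline in a given library.
async function collectOfflineAssetIds(assets: AssetRepository, libraryId: string): Promise<string[]> {
  const ids: string[] = [];
  const take = 1000;

  for (let skip = 0; ; skip += take) {
    const page = await assets.getWith({ take, skip }, WithProperty.IS_OFFLINE, libraryId);
    ids.push(...page.items.map((asset) => asset.id));
    if (!page.hasNextPage) {
      break;
    }
  }

  return ids;
}
```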
209
server/src/infra/repositories/filesystem.provider.spec.ts
Normal file
@ -0,0 +1,209 @@
import { CrawlOptionsDto } from '@app/domain';
import mockfs from 'mock-fs';
import { FilesystemProvider } from './filesystem.provider';

describe(FilesystemProvider.name, () => {
  const sut: FilesystemProvider = new FilesystemProvider();

  describe('crawl', () => {
    it('should return empty when crawling an empty path list', async () => {
      const options = new CrawlOptionsDto();
      options.pathsToCrawl = [];
      const paths: string[] = await sut.crawl(options);
      expect(paths).toHaveLength(0);
    });

    it('should crawl a single path', async () => {
      mockfs({
        '/photos/image.jpg': '',
      });

      const options = new CrawlOptionsDto();
      options.pathsToCrawl = ['/photos/'];
      const paths: string[] = await sut.crawl(options);
      expect(paths.sort()).toEqual(['/photos/image.jpg'].sort());
    });

    it('should exclude by file extension', async () => {
      mockfs({
        '/photos/image.jpg': '',
        '/photos/image.tif': '',
      });

      const options = new CrawlOptionsDto();
      options.pathsToCrawl = ['/photos/'];
      options.exclusionPatterns = ['**/*.tif'];
      const paths: string[] = await sut.crawl(options);
      expect(paths.sort()).toEqual(['/photos/image.jpg'].sort());
    });

    it('should exclude by file extension without case sensitivity', async () => {
      mockfs({
        '/photos/image.jpg': '',
        '/photos/image.tif': '',
      });

      const options = new CrawlOptionsDto();
      options.pathsToCrawl = ['/photos/'];
      options.exclusionPatterns = ['**/*.TIF'];
      const paths: string[] = await sut.crawl(options);
      expect(paths.sort()).toEqual(['/photos/image.jpg'].sort());
    });

    it('should exclude by folder', async () => {
      mockfs({
        '/photos/image.jpg': '',
        '/photos/raw/image.jpg': '',
        '/photos/raw2/image.jpg': '',
        '/photos/folder/raw/image.jpg': '',
        '/photos/crawl/image.jpg': '',
      });

      const options = new CrawlOptionsDto();
      options.pathsToCrawl = ['/photos/'];
      options.exclusionPatterns = ['**/raw/**'];
      const paths: string[] = await sut.crawl(options);
      expect(paths.sort()).toEqual(['/photos/image.jpg', '/photos/raw2/image.jpg', '/photos/crawl/image.jpg'].sort());
    });

    it('should crawl multiple paths', async () => {
      mockfs({
        '/photos/image1.jpg': '',
        '/images/image2.jpg': '',
        '/albums/image3.jpg': '',
      });
      const options = new CrawlOptionsDto();
      options.pathsToCrawl = ['/photos/', '/images/', '/albums/'];
      const paths: string[] = await sut.crawl(options);
      expect(paths.sort()).toEqual(['/photos/image1.jpg', '/images/image2.jpg', '/albums/image3.jpg'].sort());
    });

    it('should support globbing paths', async () => {
      mockfs({
        '/photos1/image1.jpg': '',
        '/photos2/image2.jpg': '',
        '/images/image3.jpg': '',
      });
      const options = new CrawlOptionsDto();
      options.pathsToCrawl = ['/photos*'];
      const paths: string[] = await sut.crawl(options);
      expect(paths.sort()).toEqual(['/photos1/image1.jpg', '/photos2/image2.jpg'].sort());
    });

    it('should crawl a single path without trailing slash', async () => {
      mockfs({
        '/photos/image.jpg': '',
      });
      const options = new CrawlOptionsDto();
      options.pathsToCrawl = ['/photos'];
      const paths: string[] = await sut.crawl(options);
      expect(paths.sort()).toEqual(['/photos/image.jpg'].sort());
    });

    // TODO: test for hidden paths (not yet implemented)

    it('should crawl a single path', async () => {
      mockfs({
        '/photos/image.jpg': '',
        '/photos/subfolder/image1.jpg': '',
        '/photos/subfolder/image2.jpg': '',
        '/image1.jpg': '',
      });
      const options = new CrawlOptionsDto();
      options.pathsToCrawl = ['/photos/'];
      const paths: string[] = await sut.crawl(options);
      expect(paths.sort()).toEqual(
        ['/photos/image.jpg', '/photos/subfolder/image1.jpg', '/photos/subfolder/image2.jpg'].sort(),
      );
    });

    it('should filter file extensions', async () => {
      mockfs({
        '/photos/image.jpg': '',
        '/photos/image.txt': '',
        '/photos/1': '',
      });
      const options = new CrawlOptionsDto();
      options.pathsToCrawl = ['/photos/'];
      const paths: string[] = await sut.crawl(options);
      expect(paths.sort()).toEqual(['/photos/image.jpg'].sort());
    });

    it('should include photo and video extensions', async () => {
      mockfs({
        '/photos/image.jpg': '',
        '/photos/image.jpeg': '',
        '/photos/image.heic': '',
        '/photos/image.heif': '',
        '/photos/image.png': '',
        '/photos/image.gif': '',
        '/photos/image.tif': '',
        '/photos/image.tiff': '',
        '/photos/image.webp': '',
        '/photos/image.dng': '',
        '/photos/image.nef': '',
        '/videos/video.mp4': '',
        '/videos/video.mov': '',
        '/videos/video.webm': '',
      });

      const options = new CrawlOptionsDto();
      options.pathsToCrawl = ['/photos/', '/videos/'];
      const paths: string[] = await sut.crawl(options);

      expect(paths.sort()).toEqual(
        [
          '/photos/image.jpg',
          '/photos/image.jpeg',
          '/photos/image.heic',
          '/photos/image.heif',
          '/photos/image.png',
          '/photos/image.gif',
          '/photos/image.tif',
          '/photos/image.tiff',
          '/photos/image.webp',
          '/photos/image.dng',
          '/photos/image.nef',
          '/videos/video.mp4',
          '/videos/video.mov',
          '/videos/video.webm',
        ].sort(),
      );
    });

    it('should check file extensions without case sensitivity', async () => {
      mockfs({
        '/photos/image.jpg': '',
        '/photos/image.Jpg': '',
        '/photos/image.jpG': '',
        '/photos/image.JPG': '',
        '/photos/image.jpEg': '',
        '/photos/image.TIFF': '',
        '/photos/image.tif': '',
        '/photos/image.dng': '',
        '/photos/image.NEF': '',
      });

      const options = new CrawlOptionsDto();
      options.pathsToCrawl = ['/photos/'];
      const paths: string[] = await sut.crawl(options);
      expect(paths.sort()).toEqual(
        [
          '/photos/image.jpg',
          '/photos/image.Jpg',
          '/photos/image.jpG',
          '/photos/image.JPG',
          '/photos/image.jpEg',
          '/photos/image.TIFF',
          '/photos/image.tif',
          '/photos/image.dng',
          '/photos/image.NEF',
        ].sort(),
      );
    });

    afterEach(() => {
      mockfs.restore();
    });
  });
});
@ -1,7 +1,15 @@
import { DiskUsage, ImmichReadStream, ImmichZipStream, IStorageRepository } from '@app/domain';
import {
  CrawlOptionsDto,
  DiskUsage,
  ImmichReadStream,
  ImmichZipStream,
  IStorageRepository,
  mimeTypes,
} from '@app/domain';
import archiver from 'archiver';
import { constants, createReadStream, existsSync, mkdirSync } from 'fs';
import fs, { readdir } from 'fs/promises';
import { glob } from 'glob';
import mv from 'mv';
import { promisify } from 'node:util';
import path from 'path';
@ -52,6 +60,8 @@ export class FilesystemProvider implements IStorageRepository {
    await fs.unlink(file);
  }

  stat = fs.stat;

  async unlinkDir(folder: string, options: { recursive?: boolean; force?: boolean }) {
    await fs.rm(folder, options);
  }
@ -93,5 +103,25 @@ export class FilesystemProvider implements IStorageRepository {
    };
  }

  async crawl(crawlOptions: CrawlOptionsDto): Promise<string[]> {
    const pathsToCrawl = crawlOptions.pathsToCrawl;

    let paths: string;
    if (!pathsToCrawl) {
      // No paths to crawl, return empty list
      return [];
    } else if (pathsToCrawl.length === 1) {
      paths = pathsToCrawl[0];
    } else {
      paths = '{' + pathsToCrawl.join(',') + '}';
    }

    paths = paths + '/**/*{' + mimeTypes.getSupportedFileExtensions().join(',') + '}';

    return (await glob(paths, { nocase: true, nodir: true, ignore: crawlOptions.exclusionPatterns })).map((assetPath) =>
      path.normalize(assetPath),
    );
  }

  readdir = readdir;
}
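The `crawl()` implementation above collapses all import paths into a single brace-expanded glob, appends the supported file extensions from `mimeTypes`, and passes the exclusion patterns through to `ignore`. A short usage sketch, following the patterns exercised in the spec file above (the two import paths and the exclusion pattern are illustrative values only):

```typescript
import { CrawlOptionsDto } from '@app/domain';
import { FilesystemProvider } from './filesystem.provider';

// List every supported image/video file under two import paths, skipping RAW subfolders.
async function listLibraryFiles(): Promise<string[]> {
  const storage = new FilesystemProvider();

  const options = new CrawlOptionsDto();
  options.pathsToCrawl = ['/photos', '/videos'];
  options.exclusionPatterns = ['**/raw/**'];

  // Internally this expands to one case-insensitive glob along the lines of
  // '{/photos,/videos}/**/*{.jpg,.jpeg,...}', matched with nocase and nodir.
  return storage.crawl(options);
}
```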
@ -9,6 +9,7 @@ export * from './face.repository';
export * from './filesystem.provider';
export * from './geocoding.repository';
export * from './job.repository';
export * from './library.repository';
export * from './machine-learning.repository';
export * from './media.repository';
export * from './partner.repository';
@ -78,7 +78,7 @@ export class JobRepository implements IJobRepository {
    }
  }

  private getQueue(queue: QueueName) {
  private getQueue(queue: QueueName): Queue {
    return this.moduleRef.get<Queue>(getQueueToken(queue), { strict: false });
  }
}
183
server/src/infra/repositories/library.repository.ts
Normal file
@ -0,0 +1,183 @@
import { ILibraryRepository, LibraryStatsResponseDto } from '@app/domain';
import { Injectable } from '@nestjs/common';
import { InjectRepository } from '@nestjs/typeorm';
import { IsNull, Not } from 'typeorm';
import { Repository } from 'typeorm/repository/Repository';
import { LibraryEntity, LibraryType } from '../entities';

@Injectable()
export class LibraryRepository implements ILibraryRepository {
  constructor(@InjectRepository(LibraryEntity) private repository: Repository<LibraryEntity>) {}

  get(id: string, withDeleted = false): Promise<LibraryEntity | null> {
    return this.repository.findOneOrFail({
      where: {
        id,
      },
      relations: { owner: true },
      withDeleted,
    });
  }

  existsByName(name: string, withDeleted = false): Promise<boolean> {
    return this.repository.exist({
      where: {
        name,
      },
      withDeleted,
    });
  }

  getCountForUser(ownerId: string): Promise<number> {
    return this.repository.countBy({ ownerId });
  }

  getDefaultUploadLibrary(ownerId: string): Promise<LibraryEntity | null> {
    return this.repository.findOne({
      where: {
        ownerId: ownerId,
        type: LibraryType.UPLOAD,
      },
      order: {
        createdAt: 'ASC',
      },
    });
  }

  getUploadLibraryCount(ownerId: string): Promise<number> {
    return this.repository.count({
      where: {
        ownerId: ownerId,
        type: LibraryType.UPLOAD,
      },
    });
  }

  getAllByUserId(ownerId: string, type?: LibraryType): Promise<LibraryEntity[]> {
    return this.repository.find({
      where: {
        ownerId,
        isVisible: true,
        type,
      },
      relations: {
        owner: true,
      },
      order: {
        createdAt: 'ASC',
      },
    });
  }

  getAll(withDeleted = false, type?: LibraryType): Promise<LibraryEntity[]> {
    return this.repository.find({
      where: { type },
      relations: {
        owner: true,
      },
      order: {
        createdAt: 'ASC',
      },
      withDeleted,
    });
  }

  getAllDeleted(): Promise<LibraryEntity[]> {
    return this.repository.find({
      where: {
        isVisible: true,
        deletedAt: Not(IsNull()),
      },
      relations: {
        owner: true,
      },
      order: {
        createdAt: 'ASC',
      },
      withDeleted: true,
    });
  }

  create(library: Omit<LibraryEntity, 'id' | 'createdAt' | 'updatedAt' | 'ownerId'>): Promise<LibraryEntity> {
    return this.repository.save(library);
  }

  async delete(id: string): Promise<void> {
    await this.repository.delete({ id });
  }

  async softDelete(id: string): Promise<void> {
    await this.repository.softDelete({ id });
  }

  async update(library: Partial<LibraryEntity>): Promise<LibraryEntity> {
    return this.save(library);
  }

  async getStatistics(id: string): Promise<LibraryStatsResponseDto> {
    const stats = await this.repository
      .createQueryBuilder('libraries')
      .addSelect(`COUNT(assets.id) FILTER (WHERE assets.type = 'IMAGE' AND assets.isVisible)`, 'photos')
      .addSelect(`COUNT(assets.id) FILTER (WHERE assets.type = 'VIDEO' AND assets.isVisible)`, 'videos')
      .addSelect('COALESCE(SUM(exif.fileSizeInByte), 0)', 'usage')
      .leftJoin('libraries.assets', 'assets')
      .leftJoin('assets.exifInfo', 'exif')
      .groupBy('libraries.id')
      .where('libraries.id = :id', { id })
      .getRawOne();

    return {
      photos: Number(stats.photos),
      videos: Number(stats.videos),
      usage: Number(stats.usage),
      total: Number(stats.photos) + Number(stats.videos),
    };
  }

  async getOnlineAssetPaths(libraryId: string): Promise<string[]> {
    // Return all non-offline asset paths for a given library
    const rawResults = await this.repository
      .createQueryBuilder('library')
      .innerJoinAndSelect('library.assets', 'assets')
      .where('library.id = :id', { id: libraryId })
      .andWhere('assets.isOffline = false')
      .select('assets.originalPath')
      .getRawMany();

    const results: string[] = [];

    for (const rawPath of rawResults) {
      results.push(rawPath.assets_originalPath);
    }

    return results;
  }

  async getAssetIds(libraryId: string, withDeleted = false): Promise<string[]> {
    let query = await this.repository
      .createQueryBuilder('library')
      .innerJoinAndSelect('library.assets', 'assets')
      .where('library.id = :id', { id: libraryId })
      .select('assets.id');

    if (withDeleted) {
      query = query.withDeleted();
    }

    // Return all asset paths for a given library
    const rawResults = await query.getRawMany();

    const results: string[] = [];

    for (const rawPath of rawResults) {
      results.push(rawPath.assets_id);
    }

    return results;
  }

  private async save(library: Partial<LibraryEntity>) {
    const { id } = await this.repository.save(library);
    return this.repository.findOneByOrFail({ id });
  }
}
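`getOnlineAssetPaths()` above returns the `originalPath` of every non-offline asset in a library; together with the filesystem crawler this is the raw material for detecting files that have disappeared from disk. The sketch below is illustrative only, since the actual offline detection lives in `LibraryService`, which this excerpt does not show; the wiring and names here are assumptions:

```typescript
import { CrawlOptionsDto } from '@app/domain';
import { FilesystemProvider } from './filesystem.provider';
import { LibraryRepository } from './library.repository';

// Compare what the library tracks against what is actually on disk.
async function findOfflineCandidates(
  libraries: LibraryRepository,
  storage: FilesystemProvider,
  libraryId: string,
  importPaths: string[],
): Promise<string[]> {
  const options = new CrawlOptionsDto();
  options.pathsToCrawl = importPaths;

  const onDisk = new Set(await storage.crawl(options));
  const tracked = await libraries.getOnlineAssetPaths(libraryId);

  // Paths the library still tracks but that no longer exist on disk could be marked offline.
  return tracked.filter((originalPath) => !onDisk.has(originalPath));
}
```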
@ -4,6 +4,7 @@ import {
  IDeleteFilesJob,
  JobName,
  JobService,
  LibraryService,
  MediaService,
  MetadataService,
  PersonService,
@ -14,6 +15,7 @@ import {
  SystemConfigService,
  UserService,
} from '@app/domain';

import { Injectable, Logger } from '@nestjs/common';
import { MetadataExtractionProcessor } from './processors/metadata-extraction.processor';
@ -37,6 +39,7 @@ export class AppService {
    private systemConfigService: SystemConfigService,
    private userService: UserService,
    private auditService: AuditService,
    private libraryService: LibraryService,
  ) {}

  async init() {
@ -77,6 +80,13 @@ export class AppService {
      [JobName.QUEUE_SIDECAR]: (data) => this.metadataService.handleQueueSidecar(data),
      [JobName.SIDECAR_DISCOVERY]: (data) => this.metadataService.handleSidecarDiscovery(data),
      [JobName.SIDECAR_SYNC]: () => this.metadataService.handleSidecarSync(),
      [JobName.LIBRARY_SCAN_ASSET]: (data) => this.libraryService.handleAssetRefresh(data),
      [JobName.LIBRARY_MARK_ASSET_OFFLINE]: (data) => this.libraryService.handleOfflineAsset(data),
      [JobName.LIBRARY_SCAN]: (data) => this.libraryService.handleQueueAssetRefresh(data),
      [JobName.LIBRARY_DELETE]: (data) => this.libraryService.handleDeleteLibrary(data),
      [JobName.LIBRARY_REMOVE_OFFLINE]: (data) => this.libraryService.handleOfflineRemoval(data),
      [JobName.LIBRARY_QUEUE_SCAN_ALL]: (data) => this.libraryService.handleQueueAllScan(data),
      [JobName.LIBRARY_QUEUE_CLEANUP]: () => this.libraryService.handleQueueCleanup(),
    });

    process.on('uncaughtException', (error: Error | any) => {
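The handler map above wires each of the new `LIBRARY_*` job names to a `LibraryService` method, so any job queued under those names is dispatched by the microservice worker. As a hedged sketch of how a scan might be enqueued, assuming the job repository's `queue` method accepts a `{ name, data }` job item as elsewhere in the codebase (the payload shape for `LIBRARY_SCAN` is defined by the library job DTOs, which are not shown in this excerpt, so `{ id }` is an assumption):

```typescript
import { IJobRepository, JobName } from '@app/domain';

// Enqueue a library scan so the handler map dispatches it to LibraryService.handleQueueAssetRefresh.
async function queueLibraryScan(jobRepository: IJobRepository, libraryId: string): Promise<void> {
  await jobRepository.queue({
    name: JobName.LIBRARY_SCAN,
    data: { id: libraryId },
  } as any); // cast only because the exact job item type is not included in this excerpt
}
```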
Some files were not shown because too many files have changed in this diff.