mirror of https://github.com/laurent22/joplin.git synced 2025-08-27 20:29:45 +02:00

Compare commits


26 Commits

Author SHA1 Message Date
Laurent Cozic  7db7dc4957  multiput  2021-06-18 16:47:57 +01:00
Laurent Cozic  1aa96af4db  fix tests  2021-06-18 16:13:22 +01:00
Laurent Cozic  92dadd7509  tests  2021-06-18 15:39:24 +01:00
Laurent Cozic  000185bfb4  multiput  2021-06-18 15:19:51 +01:00
Laurent Cozic  e81427a1f2  Merge branch 'dev' into sync_batch_upload  2021-06-18 11:50:41 +01:00
Laurent Cozic  3b9c02e92d  Server: Add support for uploading multiple items in one request  2021-06-18 11:50:06 +01:00
Laurent Cozic  d73eab6f82  Fixed tests  2021-06-17 18:32:52 +01:00
Laurent Cozic  d698ea0c12  Server v2.1.1  2021-06-17 18:27:54 +01:00
Laurent Cozic  e04133cfc6  Setup new release 2.1  2021-06-17 18:26:58 +01:00
Laurent Cozic  525ab01b9b  Mobile: Add version number to log  2021-06-17 18:19:32 +01:00
Laurent Cozic  0d33955fcd  All: Mask auth token and password in log  2021-06-17 18:17:23 +01:00
Laurent Cozic  7f0b3fd718  Server: Added account info to dashboard and title to pages  2021-06-17 18:04:35 +01:00
Laurent Cozic  65c3d01cc6  Server: Sort users by name, then email  2021-06-17 17:34:17 +01:00
Laurent Cozic  ac03c08f33  Server: Hide Reset Password button when creating new users  2021-06-17 17:30:45 +01:00
Laurent Cozic  ea1d614f82  Tools: Utility to measure perforemances  2021-06-17 17:27:03 +01:00
Laurent Cozic  c682c8879c  Server: Added way to batch requests (currently disabled)  2021-06-17 16:55:45 +01:00
Laurent Cozic  e8532441bc  Server: Added way to debug slow queries  2021-06-17 16:51:25 +01:00
Laurent Cozic  958e9163b6  All: Batch upload during initial sync  2021-06-17 12:45:34 +01:00
Laurent Cozic  1c597883ef  Chore: Clean up synchronizer code and add types  2021-06-17 12:39:06 +01:00
Laurent Cozic  15ce5cdd6e  All: Optimise first synchronisation, when items have never been synced before  2021-06-17 11:24:02 +01:00
Laurent Cozic  a38958ab7b  Tools: Added scripts to test server performances  2021-06-17 11:21:37 +01:00
Laurent Cozic  232e0c937a  Server v2.0.14  2021-06-17 09:52:40 +01:00
Laurent Cozic  479237d16f  Server: Allow sending reset password email from admin UI  2021-06-17 09:49:56 +01:00
Laurent Cozic  6ae0e84a1a  Server: Tokens would expire too soon  2021-06-17 09:28:45 +01:00
Laurent Cozic  71d567669b  CLI v2.0.1  2021-06-16 20:07:17 +01:00
Laurent Cozic  db39db45c5  Releasing sub-packages  2021-06-16 20:04:45 +01:00
75 changed files with 1172 additions and 253 deletions


@@ -842,6 +842,9 @@ packages/lib/SyncTargetOneDrive.js.map
packages/lib/Synchronizer.d.ts
packages/lib/Synchronizer.js
packages/lib/Synchronizer.js.map
+ packages/lib/TaskQueue.d.ts
+ packages/lib/TaskQueue.js
+ packages/lib/TaskQueue.js.map
packages/lib/commands/historyBackward.d.ts
packages/lib/commands/historyBackward.js
packages/lib/commands/historyBackward.js.map
@@ -869,6 +872,9 @@ packages/lib/eventManager.js.map
packages/lib/file-api-driver-joplinServer.d.ts
packages/lib/file-api-driver-joplinServer.js
packages/lib/file-api-driver-joplinServer.js.map
+ packages/lib/file-api-driver-memory.d.ts
+ packages/lib/file-api-driver-memory.js
+ packages/lib/file-api-driver-memory.js.map
packages/lib/file-api-driver.test.d.ts
packages/lib/file-api-driver.test.js
packages/lib/file-api-driver.test.js.map
@@ -1388,6 +1394,12 @@ packages/lib/services/spellChecker/SpellCheckerService.js.map
packages/lib/services/spellChecker/SpellCheckerServiceDriverBase.d.ts
packages/lib/services/spellChecker/SpellCheckerServiceDriverBase.js
packages/lib/services/spellChecker/SpellCheckerServiceDriverBase.js.map
+ packages/lib/services/synchronizer/ItemUploader.d.ts
+ packages/lib/services/synchronizer/ItemUploader.js
+ packages/lib/services/synchronizer/ItemUploader.js.map
+ packages/lib/services/synchronizer/ItemUploader.test.d.ts
+ packages/lib/services/synchronizer/ItemUploader.test.js
+ packages/lib/services/synchronizer/ItemUploader.test.js.map
packages/lib/services/synchronizer/LockHandler.d.ts
packages/lib/services/synchronizer/LockHandler.js
packages/lib/services/synchronizer/LockHandler.js.map

.gitignore (vendored): 12 changed lines

@@ -828,6 +828,9 @@ packages/lib/SyncTargetOneDrive.js.map
packages/lib/Synchronizer.d.ts
packages/lib/Synchronizer.js
packages/lib/Synchronizer.js.map
+ packages/lib/TaskQueue.d.ts
+ packages/lib/TaskQueue.js
+ packages/lib/TaskQueue.js.map
packages/lib/commands/historyBackward.d.ts
packages/lib/commands/historyBackward.js
packages/lib/commands/historyBackward.js.map
@@ -855,6 +858,9 @@ packages/lib/eventManager.js.map
packages/lib/file-api-driver-joplinServer.d.ts
packages/lib/file-api-driver-joplinServer.js
packages/lib/file-api-driver-joplinServer.js.map
+ packages/lib/file-api-driver-memory.d.ts
+ packages/lib/file-api-driver-memory.js
+ packages/lib/file-api-driver-memory.js.map
packages/lib/file-api-driver.test.d.ts
packages/lib/file-api-driver.test.js
packages/lib/file-api-driver.test.js.map
@@ -1374,6 +1380,12 @@ packages/lib/services/spellChecker/SpellCheckerService.js.map
packages/lib/services/spellChecker/SpellCheckerServiceDriverBase.d.ts
packages/lib/services/spellChecker/SpellCheckerServiceDriverBase.js
packages/lib/services/spellChecker/SpellCheckerServiceDriverBase.js.map
+ packages/lib/services/synchronizer/ItemUploader.d.ts
+ packages/lib/services/synchronizer/ItemUploader.js
+ packages/lib/services/synchronizer/ItemUploader.js.map
+ packages/lib/services/synchronizer/ItemUploader.test.d.ts
+ packages/lib/services/synchronizer/ItemUploader.test.js
+ packages/lib/services/synchronizer/ItemUploader.test.js.map
packages/lib/services/synchronizer/LockHandler.d.ts
packages/lib/services/synchronizer/LockHandler.js
packages/lib/services/synchronizer/LockHandler.js.map


@@ -13,3 +13,11 @@ services:
- POSTGRES_PASSWORD=joplin
- POSTGRES_USER=joplin
- POSTGRES_DB=joplin
+ # Use this to specify additional Postgres
+ # config parameters:
+ #
+ # command:
+ #   - "postgres"
+ #   - "-c"
+ #   - "log_min_duration_statement=0"


@@ -1,6 +1,6 @@
{
"name": "joplin",
- "version": "2.0.0",
+ "version": "2.0.1",
"lockfileVersion": 1,
"requires": true,
"dependencies": {


@@ -27,11 +27,12 @@
2017,
2018,
2019,
- 2020
+ 2020,
+ 2021
],
"owner": "Laurent Cozic"
},
- "version": "2.0.0",
+ "version": "2.1.0",
"bin": {
"joplin": "./main.js"
},
@@ -39,8 +40,8 @@
"node": ">=10.0.0"
},
"dependencies": {
- "@joplin/lib": "1.8",
- "@joplin/renderer": "1.8",
+ "@joplin/lib": "2.0",
+ "@joplin/renderer": "2.0",
"aws-sdk": "^2.588.0",
"chalk": "^4.1.0",
"compare-version": "^0.1.2",
@@ -64,7 +65,7 @@
"yargs-parser": "^7.0.0"
},
"devDependencies": {
- "@joplin/tools": "1.8",
+ "@joplin/tools": "2.0",
"@types/fs-extra": "^9.0.6",
"@types/jest": "^26.0.15",
"@types/node": "^14.14.6",


@@ -1,2 +1,3 @@
test data/
- export/
+ export/
+ support/serverPerformances/testPerfCommands.txt


@@ -0,0 +1,56 @@
#!/bin/bash
set -e
SCRIPT_DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" &> /dev/null && pwd )"
ROOT_DIR="$SCRIPT_DIR/../../../../.."
COMMANDS=($(echo $1 | tr "," "\n"))
PROFILE_DIR=~/.config/joplindev-testperf
CMD_FILE="$SCRIPT_DIR/testPerfCommands.txt"
rm -f "$CMD_FILE"
touch "$CMD_FILE"
for CMD in "${COMMANDS[@]}"
do
	if [[ $CMD == "createUsers" ]]; then
		curl --data '{"action": "createTestUsers"}' -H 'Content-Type: application/json' http://api.joplincloud.local:22300/api/debug
	# elif [[ $CMD == "createData" ]]; then
	# 	echo 'mkbook "shared"' >> "$CMD_FILE"
	# 	echo 'mkbook "other"' >> "$CMD_FILE"
	# 	echo 'use "shared"' >> "$CMD_FILE"
	# 	echo 'mknote "note 1"' >> "$CMD_FILE"
	# 	echo 'mknote "note 2"' >> "$CMD_FILE"
	elif [[ $CMD == "reset" ]]; then
		USER_EMAIL="user1@example.com"
		rm -rf "$PROFILE_DIR"
		echo "config keychain.supported 0" >> "$CMD_FILE"
		echo "config sync.target 9" >> "$CMD_FILE"
		echo "config sync.9.path http://api.joplincloud.local:22300" >> "$CMD_FILE"
		echo "config sync.9.username $USER_EMAIL" >> "$CMD_FILE"
		echo "config sync.9.password 123456" >> "$CMD_FILE"
	# elif [[ $CMD == "e2ee" ]]; then
	# 	echo "e2ee enable --password 111111" >> "$CMD_FILE"
	else
		echo "Unknown command: $CMD"
		exit 1
	fi
done
cd "$ROOT_DIR/packages/app-cli"
npm start -- --profile "$PROFILE_DIR" batch "$CMD_FILE"
npm start -- --profile "$PROFILE_DIR" import ~/Desktop/Joplin_17_06_2021.jex
# npm start -- --profile "$PROFILE_DIR" import ~/Desktop/Tout_18_06_2021.jex
npm start -- --profile "$PROFILE_DIR" sync


@@ -1,7 +1,7 @@
{
"manifest_version": 2,
"name": "Joplin Web Clipper [DEV]",
- "version": "2.0.0",
+ "version": "2.1.0",
"description": "Capture and save web pages and screenshots from your browser to Joplin.",
"homepage_url": "https://joplinapp.org",
"content_security_policy": "script-src 'self'; object-src 'self'",


@@ -1,6 +1,6 @@
{
"name": "@joplin/app-desktop",
- "version": "2.0.11",
+ "version": "2.1.0",
"description": "Joplin for Desktop",
"main": "main.js",
"private": true,


@@ -142,7 +142,7 @@ android {
minSdkVersion rootProject.ext.minSdkVersion
targetSdkVersion rootProject.ext.targetSdkVersion
versionCode 2097635
- versionName "2.0.4"
+ versionName "2.1.0"
ndk {
abiFilters "armeabi-v7a", "x86", "arm64-v8a", "x86_64"
}


@@ -492,7 +492,7 @@
INFOPLIST_FILE = Joplin/Info.plist;
IPHONEOS_DEPLOYMENT_TARGET = 9.0;
LD_RUNPATH_SEARCH_PATHS = "$(inherited) @executable_path/Frameworks";
- MARKETING_VERSION = 12.0.2;
+ MARKETING_VERSION = 12.1.0;
OTHER_LDFLAGS = (
"$(inherited)",
"-ObjC",
@@ -519,7 +519,7 @@
INFOPLIST_FILE = Joplin/Info.plist;
IPHONEOS_DEPLOYMENT_TARGET = 9.0;
LD_RUNPATH_SEARCH_PATHS = "$(inherited) @executable_path/Frameworks";
- MARKETING_VERSION = 12.0.2;
+ MARKETING_VERSION = 12.1.0;
OTHER_LDFLAGS = (
"$(inherited)",
"-ObjC",
@@ -666,7 +666,7 @@
INFOPLIST_FILE = ShareExtension/Info.plist;
IPHONEOS_DEPLOYMENT_TARGET = 9.0;
LD_RUNPATH_SEARCH_PATHS = "$(inherited) @executable_path/Frameworks @executable_path/../../Frameworks";
- MARKETING_VERSION = 12.0.2;
+ MARKETING_VERSION = 12.1.0;
MTL_ENABLE_DEBUG_INFO = INCLUDE_SOURCE;
MTL_FAST_MATH = YES;
PRODUCT_BUNDLE_IDENTIFIER = net.cozic.joplin.ShareExtension;
@@ -697,7 +697,7 @@
INFOPLIST_FILE = ShareExtension/Info.plist;
IPHONEOS_DEPLOYMENT_TARGET = 9.0;
LD_RUNPATH_SEARCH_PATHS = "$(inherited) @executable_path/Frameworks @executable_path/../../Frameworks";
- MARKETING_VERSION = 12.0.2;
+ MARKETING_VERSION = 12.1.0;
MTL_FAST_MATH = YES;
PRODUCT_BUNDLE_IDENTIFIER = net.cozic.joplin.ShareExtension;
PRODUCT_NAME = "$(TARGET_NAME)";


@@ -28,6 +28,7 @@ import SyncTargetJoplinServer from '@joplin/lib/SyncTargetJoplinServer';
import SyncTargetJoplinCloud from '@joplin/lib/SyncTargetJoplinCloud';
import SyncTargetOneDrive from '@joplin/lib/SyncTargetOneDrive';
+ const VersionInfo = require('react-native-version-info').default;
const { AppState, Keyboard, NativeModules, BackHandler, Animated, View, StatusBar, Linking, Platform } = require('react-native');
import NetInfo from '@react-native-community/netinfo';
@@ -426,7 +427,7 @@ async function initialize(dispatch: Function) {
// require('@joplin/lib/ntpDate').setLogger(reg.logger());
reg.logger().info('====================================');
- reg.logger().info(`Starting application ${Setting.value('appId')} (${Setting.value('env')})`);
+ reg.logger().info(`Starting application ${Setting.value('appId')} v${VersionInfo.appVersion} (${Setting.value('env')})`);
const dbLogger = new Logger();
dbLogger.addTarget(TargetType.Database, { database: logDatabase, source: 'm' });


@@ -1,6 +1,6 @@
{
"name": "@joplin/fork-htmlparser2",
- "version": "4.1.26",
+ "version": "4.1.27",
"lockfileVersion": 1,
"requires": true,
"dependencies": {


@@ -1,7 +1,7 @@
{
"name": "@joplin/fork-htmlparser2",
"description": "Fast & forgiving HTML/XML/RSS parser",
- "version": "4.1.26",
+ "version": "4.1.27",
"author": "Felix Boehm <me@feedic.com>",
"publishConfig": {
"access": "public"


@@ -1,6 +1,6 @@
{
"name": "@joplin/fork-sax",
- "version": "1.2.30",
+ "version": "1.2.31",
"lockfileVersion": 1,
"requires": true,
"dependencies": {


@@ -2,7 +2,7 @@
"name": "@joplin/fork-sax",
"description": "An evented streaming XML parser in JavaScript",
"author": "Isaac Z. Schlueter <i@izs.me> (http://blog.izs.me/)",
- "version": "1.2.30",
+ "version": "1.2.31",
"main": "lib/sax.js",
"publishConfig": {
"access": "public"


@@ -1,7 +1,7 @@
{
"manifest_version": 1,
"id": "<%= pluginId %>",
- "app_min_version": "2.0",
+ "app_min_version": "2.1",
"version": "1.0.0",
"name": "<%= pluginName %>",
"description": "<%= pluginDescription %>",


@@ -1,6 +1,6 @@
{
"name": "generator-joplin",
- "version": "2.0.1",
+ "version": "2.1.0",
"description": "Scaffolds out a new Joplin plugin",
"homepage": "https://github.com/laurent22/joplin/tree/dev/packages/generator-joplin",
"author": {
@@ -34,4 +34,4 @@
"repository": "https://github.com/laurent22/generator-joplin",
"license": "MIT",
"private": true
}
}


@@ -91,6 +91,23 @@ export default class JoplinServerApi {
return _('Could not connect to Joplin Server. Please check the Synchronisation options in the config screen. Full error was:\n\n%s', msg);
}
private hidePassword(o: any): any {
if (typeof o === 'string') {
try {
const output = JSON.parse(o);
if (!output) return o;
if (output.password) output.password = '******';
return JSON.stringify(output);
} catch (error) {
return o;
}
} else {
const output = { ...o };
if (output.password) output.password = '******';
return output;
}
}
private requestToCurl_(url: string, options: any) {
const output = [];
output.push('curl');
@@ -99,11 +116,12 @@ export default class JoplinServerApi {
if (options.headers) {
for (const n in options.headers) {
if (!options.headers.hasOwnProperty(n)) continue;
output.push(`${'-H ' + '"'}${n}: ${options.headers[n]}"`);
const headerValue = n === 'X-API-AUTH' ? '******' : options.headers[n];
output.push(`${'-H ' + '"'}${n}: ${headerValue}"`);
}
}
if (options.body) {
const serialized = typeof options.body !== 'string' ? JSON.stringify(options.body) : options.body;
const serialized = typeof options.body !== 'string' ? JSON.stringify(this.hidePassword(options.body)) : this.hidePassword(options.body);
output.push(`${'--data ' + '\''}${serialized}'`);
}
output.push(`'${url}'`);
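The password-masking change above (commit 0d33955fcd) is self-contained enough to sketch in isolation. The following is a simplified standalone copy of the same idea, not the actual JoplinServerApi method; names are kept close to the diff for readability:

```typescript
// Simplified sketch of the body masking added for request logging: any
// `password` field, whether the body is a JSON string or a plain object,
// is replaced with '******' before the value reaches the log.
function hidePassword(o: any): any {
	if (typeof o === 'string') {
		try {
			const parsed = JSON.parse(o);
			if (!parsed) return o;
			if (parsed.password) parsed.password = '******';
			return JSON.stringify(parsed);
		} catch {
			// Not JSON: log the string unchanged.
			return o;
		}
	}
	const output = { ...o };
	if (output.password) output.password = '******';
	return output;
}

console.log(hidePassword({ email: 'user1@example.com', password: 'hunter2' }).password); // '******'
```

The same diff applies the equivalent treatment to the `X-API-AUTH` header, substituting '******' before the debug curl command is assembled.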


@@ -1,7 +1,7 @@
const BaseSyncTarget = require('./BaseSyncTarget').default;
const Setting = require('./models/Setting').default;
const { FileApi } = require('./file-api.js');
- const { FileApiDriverMemory } = require('./file-api-driver-memory.js');
+ const FileApiDriverMemory = require('./file-api-driver-memory').default;
const Synchronizer = require('./Synchronizer').default;
class SyncTargetMemory extends BaseSyncTarget {


@@ -18,8 +18,9 @@ import ResourceService from './services/ResourceService';
import EncryptionService from './services/EncryptionService';
import JoplinError from './JoplinError';
import ShareService from './services/share/ShareService';
import TaskQueue from './TaskQueue';
import ItemUploader from './services/synchronizer/ItemUploader';
const { sprintf } = require('sprintf-js');
const TaskQueue = require('./TaskQueue');
const { Dirnames } = require('./services/synchronizer/utils/types');
interface RemoteItem {
@@ -73,7 +74,7 @@ export default class Synchronizer {
public dispatch: Function;
constructor(db: any, api: any, appType: string) {
public constructor(db: any, api: any, appType: string) {
this.db_ = db;
this.api_ = api;
this.appType_ = appType;
@@ -83,6 +84,8 @@ export default class Synchronizer {
this.progressReport_ = {};
this.dispatch = function() {};
this.apiCall = this.apiCall.bind(this);
}
state() {
@@ -169,7 +172,7 @@ export default class Synchronizer {
if (report.deleteRemote) lines.push(_('Deleted remote items: %d.', report.deleteRemote));
if (report.fetchingTotal && report.fetchingProcessed) lines.push(_('Fetched items: %d/%d.', report.fetchingProcessed, report.fetchingTotal));
if (report.cancelling && !report.completedTime) lines.push(_('Cancelling...'));
if (report.completedTime) lines.push(_('Completed: %s', time.formatMsToLocal(report.completedTime)));
if (report.completedTime) lines.push(_('Completed: %s (%s)', time.formatMsToLocal(report.completedTime), `${Math.round((report.completedTime - report.startTime) / 1000)}s`));
if (this.reportHasErrors(report)) lines.push(_('Last error: %s', report.errors[report.errors.length - 1].toString().substr(0, 500)));
return lines;
@@ -225,6 +228,7 @@ export default class Synchronizer {
if (n == 'starting') continue;
if (n == 'finished') continue;
if (n == 'state') continue;
if (n == 'startTime') continue;
if (n == 'completedTime') continue;
this.logger().info(`${n}: ${report[n] ? report[n] : '-'}`);
}
@@ -299,7 +303,7 @@ export default class Synchronizer {
return '';
}
async apiCall(fnName: string, ...args: any[]) {
private async apiCall(fnName: string, ...args: any[]) {
if (this.syncTargetIsLocked_) throw new JoplinError('Sync target is locked - aborting API call', 'lockError');
try {
@@ -356,6 +360,8 @@ export default class Synchronizer {
const outputContext = Object.assign({}, lastContext);
this.progressReport_.startTime = time.unixMs();
this.dispatch({ type: 'SYNC_STARTED' });
eventManager.emit('syncStart');
@@ -386,6 +392,8 @@ export default class Synchronizer {
// correctly so as to share/unshare the right items.
await Folder.updateAllShareIds();
const itemUploader = new ItemUploader(this.api(), this.apiCall);
let errorToThrow = null;
let syncLock = null;
@@ -437,6 +445,8 @@ export default class Synchronizer {
const result = await BaseItem.itemsThatNeedSync(syncTargetId);
const locals = result.items;
await itemUploader.preUploadItems(result.items.filter((it: any) => result.neverSyncedItemIds.includes(it.id)));
for (let i = 0; i < locals.length; i++) {
if (this.cancelling()) break;
@@ -453,7 +463,7 @@ export default class Synchronizer {
// (by setting an updated_time less than current time).
if (donePaths.indexOf(path) >= 0) throw new JoplinError(sprintf('Processing a path that has already been done: %s. sync_time was not updated? Remote item has an updated_time in the future?', path), 'processingPathTwice');
const remote: RemoteItem = await this.apiCall('stat', path);
const remote: RemoteItem = result.neverSyncedItemIds.includes(local.id) ? null : await this.apiCall('stat', path);
let action = null;
let reason = '';
@@ -561,14 +571,15 @@ export default class Synchronizer {
try {
const remoteContentPath = resourceRemotePath(local.id);
const result = await Resource.fullPathForSyncUpload(local);
local = result.resource;
const resource = result.resource;
local = resource as any;
const localResourceContentPath = result.path;
if (local.size >= 10 * 1000 * 1000) {
this.logger().warn(`Uploading a large resource (resourceId: ${local.id}, size:${local.size} bytes) which may tie up the sync process.`);
if (resource.size >= 10 * 1000 * 1000) {
this.logger().warn(`Uploading a large resource (resourceId: ${local.id}, size:${resource.size} bytes) which may tie up the sync process.`);
}
await this.apiCall('put', remoteContentPath, null, { path: localResourceContentPath, source: 'file', shareId: local.share_id });
await this.apiCall('put', remoteContentPath, null, { path: localResourceContentPath, source: 'file', shareId: resource.share_id });
} catch (error) {
if (isCannotSyncError(error)) {
await handleCannotSyncItem(ItemClass, syncTargetId, local, error.message);
@@ -584,8 +595,7 @@ export default class Synchronizer {
let canSync = true;
try {
if (this.testingHooks_.indexOf('notesRejectedByTarget') >= 0 && local.type_ === BaseModel.TYPE_NOTE) throw new JoplinError('Testing rejectedByTarget', 'rejectedByTarget');
const content = await ItemClass.serializeForSync(local);
await this.apiCall('put', path, content);
await itemUploader.serializeAndUploadItem(ItemClass, path, local);
} catch (error) {
if (error && error.code === 'rejectedByTarget') {
await handleCannotSyncItem(ItemClass, syncTargetId, local, error.message);
@@ -615,7 +625,6 @@ export default class Synchronizer {
// above also doesn't use it because it fetches the whole remote object and read the
// more reliable 'updated_time' property. Basically remote.updated_time is deprecated.
// await this.api().setTimestamp(path, local.updated_time);
await ItemClass.saveSyncTime(syncTargetId, local, local.updated_time);
}
} else if (action == 'itemConflict') {
@@ -784,7 +793,7 @@ export default class Synchronizer {
if (!BaseItem.isSystemPath(remote.path)) continue; // The delta API might return things like the .sync, .resource or the root folder
const loadContent = async () => {
const task = await this.downloadQueue_.waitForResult(path); // await this.apiCall('get', path);
const task = await this.downloadQueue_.waitForResult(path);
if (task.error) throw task.error;
if (!task.result) return null;
return await BaseItem.unserialize(task.result);
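One piece of the synchroniser change above is easy to state on its own: only items that have never been synced are eligible for batch pre-upload, because no remote copy exists for them to conflict with. A hypothetical helper capturing the filter used at the `preUploadItems` call site (`itemsToPreUpload` is an illustrative name, not from the diff):

```typescript
// Hypothetical sketch: pick the items eligible for batch pre-upload, i.e.
// those whose ids appear in neverSyncedItemIds (no remote copy exists yet,
// so no stat() round-trip or conflict handling is needed for them).
interface SyncItem {
	id: string;
}

function itemsToPreUpload(items: SyncItem[], neverSyncedItemIds: string[]): SyncItem[] {
	return items.filter(it => neverSyncedItemIds.includes(it.id));
}
```

This mirrors why the diff also skips the `stat` call for never-synced items and instead passes `null` as the remote.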


@@ -1,5 +1,5 @@
const { setupDatabaseAndSynchronizer, sleep, switchClient } = require('./testing/test-utils.js');
- const TaskQueue = require('./TaskQueue.js');
+ const TaskQueue = require('./TaskQueue').default;
describe('TaskQueue', function() {


@@ -1,23 +1,38 @@
const time = require('./time').default;
const Setting = require('./models/Setting').default;
const Logger = require('./Logger').default;
import time from './time';
import Setting from './models/Setting';
import Logger from './Logger';
class TaskQueue {
constructor(name) {
this.waitingTasks_ = [];
this.processingTasks_ = {};
this.processingQueue_ = false;
this.stopping_ = false;
this.results_ = {};
interface Task {
id: string;
callback: Function;
}
interface TaskResult {
id: string;
result: any;
error?: Error;
}
export default class TaskQueue {
private waitingTasks_: Task[] = [];
private processingTasks_: Record<string, Task> = {};
private processingQueue_ = false;
private stopping_ = false;
private results_: Record<string, TaskResult> = {};
private name_: string;
private logger_: Logger;
constructor(name: string, logger: Logger = null) {
this.name_ = name;
this.logger_ = new Logger();
this.logger_ = logger ? logger : new Logger();
}
concurrency() {
return Setting.value('sync.maxConcurrentConnections');
}
push(id, callback) {
push(id: string, callback: Function) {
if (this.stopping_) throw new Error('Cannot push task when queue is stopping');
this.waitingTasks_.push({
@@ -32,10 +47,10 @@ class TaskQueue {
this.processingQueue_ = true;
const completeTask = (task, result, error) => {
const completeTask = (task: Task, result: any, error: Error) => {
delete this.processingTasks_[task.id];
const r = {
const r: TaskResult = {
id: task.id,
result: result,
};
@@ -55,10 +70,10 @@ class TaskQueue {
task
.callback()
.then(result => {
.then((result: any) => {
completeTask(task, result, null);
})
.catch(error => {
.catch((error: Error) => {
if (!error) error = new Error('Unknown error');
completeTask(task, null, error);
});
@@ -67,29 +82,42 @@ class TaskQueue {
this.processingQueue_ = false;
}
isWaiting(taskId) {
isWaiting(taskId: string) {
return this.waitingTasks_.find(task => task.id === taskId);
}
isProcessing(taskId) {
isProcessing(taskId: string) {
return taskId in this.processingTasks_;
}
isDone(taskId) {
isDone(taskId: string) {
return taskId in this.results_;
}
async waitForResult(taskId) {
if (!this.isWaiting(taskId) && !this.isProcessing(taskId) && !this.isDone(taskId)) throw new Error(`No such task: ${taskId}`);
async waitForAll() {
return new Promise((resolve) => {
const checkIID = setInterval(() => {
if (this.waitingTasks_.length) return;
if (this.processingTasks_.length) return;
clearInterval(checkIID);
resolve(null);
}, 100);
});
}
taskExists(taskId: string) {
return this.isWaiting(taskId) || this.isProcessing(taskId) || this.isDone(taskId);
}
taskResult(taskId: string) {
if (!this.taskExists(taskId)) throw new Error(`No such task: ${taskId}`);
return this.results_[taskId];
}
async waitForResult(taskId: string) {
if (!this.taskExists(taskId)) throw new Error(`No such task: ${taskId}`);
while (true) {
// if (this.stopping_) {
// return {
// id: taskId,
// error: new JoplinError('Queue has been destroyed', 'destroyedQueue'),
// };
// }
const task = this.results_[taskId];
if (task) return task;
await time.sleep(0.1);
@@ -120,7 +148,3 @@ class TaskQueue {
return this.stopping_;
}
}
TaskQueue.CONCURRENCY = 5;
module.exports = TaskQueue;
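The converted TaskQueue keeps its original contract: callers push named async tasks and later poll for their results. A minimal self-contained re-implementation of that contract, for illustration only (it runs tasks immediately and ignores the real class's concurrency limit, stop handling, and logger):

```typescript
interface TaskResult {
	id: string;
	result: any;
	error?: Error;
}

// Minimal illustration of the push()/waitForResult() contract: results and
// errors are recorded per task id, and waitForResult() polls until one exists.
class MiniTaskQueue {
	private results_: Record<string, TaskResult> = {};

	push(id: string, callback: () => Promise<any>) {
		callback()
			.then(result => { this.results_[id] = { id, result }; })
			.catch(error => { this.results_[id] = { id, result: null, error }; });
	}

	async waitForResult(id: string): Promise<TaskResult> {
		while (!(id in this.results_)) await new Promise(r => setTimeout(r, 10));
		return this.results_[id];
	}
}

async function main() {
	const queue = new MiniTaskQueue();
	queue.push('a', async () => 'item body A');
	queue.push('b', async () => { throw new Error('fetch failed'); });
	console.log((await queue.waitForResult('a')).result);
	console.log((await queue.waitForResult('b')).error?.message);
}

main();
```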


@@ -1,3 +1,4 @@
import { MultiPutItem } from './file-api';
import JoplinError from './JoplinError';
import JoplinServerApi from './JoplinServerApi';
import { trimSlashes } from './path-utils';
@@ -31,6 +32,10 @@ export default class FileApiDriverJoplinServer {
return this.api_;
}
public get supportsMultiPut() {
return true;
}
public requestRepeatCount() {
return 3;
}
@@ -174,6 +179,10 @@ export default class FileApiDriverJoplinServer {
}
}
public async multiPut(items: MultiPutItem[], options: any = null) {
return this.api().exec('PUT', 'api/batch_items', null, { items: items }, null, options);
}
public async delete(path: string) {
return this.api().exec('DELETE', this.apiFilePath_(path));
}


@@ -1,14 +1,18 @@
const time = require('./time').default;
import time from './time';
const fs = require('fs-extra');
const { basicDelta } = require('./file-api');
import { basicDelta, MultiPutItem } from './file-api';
export default class FileApiDriverMemory {
private items_: any[];
private deletedItems_: any[];
class FileApiDriverMemory {
constructor() {
this.items_ = [];
this.deletedItems_ = [];
}
encodeContent_(content) {
encodeContent_(content: any) {
if (content instanceof Buffer) {
return content.toString('base64');
} else {
@@ -16,23 +20,27 @@ class FileApiDriverMemory {
}
}
decodeContent_(content) {
public get supportsMultiPut() {
return true;
}
decodeContent_(content: any) {
return Buffer.from(content, 'base64').toString('utf-8');
}
itemIndexByPath(path) {
itemIndexByPath(path: string) {
for (let i = 0; i < this.items_.length; i++) {
if (this.items_[i].path == path) return i;
}
return -1;
}
itemByPath(path) {
itemByPath(path: string) {
const index = this.itemIndexByPath(path);
return index < 0 ? null : this.items_[index];
}
newItem(path, isDir = false) {
newItem(path: string, isDir = false) {
const now = time.unixMs();
return {
path: path,
@@ -43,18 +51,18 @@ class FileApiDriverMemory {
};
}
stat(path) {
stat(path: string) {
const item = this.itemByPath(path);
return Promise.resolve(item ? Object.assign({}, item) : null);
}
async setTimestamp(path, timestampMs) {
async setTimestamp(path: string, timestampMs: number): Promise<any> {
const item = this.itemByPath(path);
if (!item) return Promise.reject(new Error(`File not found: ${path}`));
item.updated_time = timestampMs;
}
async list(path) {
async list(path: string) {
const output = [];
for (let i = 0; i < this.items_.length; i++) {
@@ -77,7 +85,7 @@ class FileApiDriverMemory {
});
}
async get(path, options) {
async get(path: string, options: any) {
const item = this.itemByPath(path);
if (!item) return Promise.resolve(null);
if (item.isDir) return Promise.reject(new Error(`${path} is a directory, not a file`));
@@ -93,13 +101,13 @@ class FileApiDriverMemory {
return output;
}
async mkdir(path) {
async mkdir(path: string) {
const index = this.itemIndexByPath(path);
if (index >= 0) return;
this.items_.push(this.newItem(path, true));
}
async put(path, content, options = null) {
async put(path: string, content: any, options: any = null) {
if (!options) options = {};
if (options.source === 'file') content = await fs.readFile(options.path);
@@ -109,13 +117,38 @@ class FileApiDriverMemory {
const item = this.newItem(path, false);
item.content = this.encodeContent_(content);
this.items_.push(item);
return item;
} else {
this.items_[index].content = this.encodeContent_(content);
this.items_[index].updated_time = time.unixMs();
return this.items_[index];
}
}
async delete(path) {
public async multiPut(items: MultiPutItem[], options: any = null) {
const output: any = {
items: {},
};
for (const item of items) {
try {
const processedItem = await this.put(`/root/${item.name}`, item.body, options);
output.items[item.name] = {
item: processedItem,
error: null,
};
} catch (error) {
output.items[item.name] = {
item: null,
error: error,
};
}
}
return output;
}
async delete(path: string) {
const index = this.itemIndexByPath(path);
if (index >= 0) {
const item = Object.assign({}, this.items_[index]);
@@ -126,10 +159,10 @@ class FileApiDriverMemory {
}
}
async move(oldPath, newPath) {
async move(oldPath: string, newPath: string): Promise<any> {
const sourceItem = this.itemByPath(oldPath);
if (!sourceItem) return Promise.reject(new Error(`Path not found: ${oldPath}`));
this.delete(newPath); // Overwrite if newPath already exists
await this.delete(newPath); // Overwrite if newPath already exists
sourceItem.path = newPath;
}
@@ -137,8 +170,8 @@ class FileApiDriverMemory {
this.items_ = [];
}
async delta(path, options = null) {
const getStatFn = async path => {
async delta(path: string, options: any = null) {
const getStatFn = async (path: string) => {
const output = this.items_.slice();
for (let i = 0; i < output.length; i++) {
const item = Object.assign({}, output[i]);
@@ -156,5 +189,3 @@ class FileApiDriverMemory {
this.items_ = [];
}
}
module.exports = { FileApiDriverMemory };
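The memory driver's new multiPut() also documents the response contract the batch upload relies on: one entry per input item, keyed by name, holding either the stored item or the per-item error. A standalone sketch of that loop (here `put` is passed in as a parameter instead of being a driver method):

```typescript
interface MultiPutItem {
	name: string;
	body: string;
}

// Sketch of the batch-upload response shape: a failure on one item is
// recorded under its name and does not abort the rest of the batch.
async function multiPut(
	items: MultiPutItem[],
	put: (name: string, body: string) => Promise<any>,
) {
	const output: any = { items: {} };
	for (const item of items) {
		try {
			output.items[item.name] = { item: await put(item.name, item.body), error: null };
		} catch (error) {
			output.items[item.name] = { item: null, error };
		}
	}
	return output;
}
```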


@@ -11,6 +11,11 @@ const Mutex = require('async-mutex').Mutex;
const logger = Logger.create('FileApi');
export interface MultiPutItem {
name: string;
body: string;
}
function requestCanBeRepeated(error: any) {
const errorCode = typeof error === 'object' && error.code ? error.code : null;
@@ -81,6 +86,10 @@ class FileApi {
if (this.driver_.initialize) return this.driver_.initialize(this.fullPath(''));
}
public get supportsMultiPut(): boolean {
return !!this.driver().supportsMultiPut;
}
async fetchRemoteDateOffset_() {
const tempFile = `${this.tempDirName()}/timeCheck${Math.round(Math.random() * 1000000)}.txt`;
const startTime = Date.now();
@@ -251,12 +260,6 @@ class FileApi {
if (!output) return output;
output.path = path;
return output;
// return this.driver_.stat(this.fullPath(path)).then((output) => {
// if (!output) return output;
// output.path = path;
// return output;
// });
}
// Returns UTF-8 encoded string by default, or a Response if `options.target = 'file'`
@@ -277,6 +280,11 @@ class FileApi {
return tryAndRepeat(() => this.driver_.put(this.fullPath(path), content, options), this.requestRepeatCount());
}
public async multiPut(items: MultiPutItem[], options: any = null) {
if (!this.driver().supportsMultiPut) throw new Error('Multi PUT not supported');
return tryAndRepeat(() => this.driver_.multiPut(items, options), this.requestRepeatCount());
}
delete(path: string) {
logger.debug(`delete ${this.fullPath(path)}`);
return tryAndRepeat(() => this.driver_.delete(this.fullPath(path)), this.requestRepeatCount());
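FileApi gates the new call on a driver capability flag, so sync targets that cannot batch keep using single put() requests. A reduced sketch of that gate (the `Driver` interface below is a hypothetical minimal shape, not Joplin's full driver type, and the retry wrapper is omitted):

```typescript
interface MultiPutItem {
	name: string;
	body: string;
}

// Hypothetical minimal driver shape: only drivers that declare
// supportsMultiPut receive the batched call; others must be rejected.
interface Driver {
	supportsMultiPut?: boolean;
	multiPut?(items: MultiPutItem[]): Promise<any>;
}

async function multiPut(driver: Driver, items: MultiPutItem[]) {
	if (!driver.supportsMultiPut) throw new Error('Multi PUT not supported');
	return driver.multiPut!(items);
}
```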


@@ -18,6 +18,20 @@ export interface ItemsThatNeedDecryptionResult {
items: any[];
}
export interface ItemThatNeedSync {
id: string;
sync_time: number;
type_: ModelType;
updated_time: number;
encryption_applied: number;
}
export interface ItemsThatNeedSyncResult {
hasMore: boolean;
items: ItemThatNeedSync[];
neverSyncedItemIds: string[];
}
export default class BaseItem extends BaseModel {
public static encryptionService_: any = null;
@@ -389,7 +403,7 @@ export default class BaseItem extends BaseModel {
return this.shareService_;
}
public static async serializeForSync(item: BaseItemEntity) {
public static async serializeForSync(item: BaseItemEntity): Promise<string> {
const ItemClass = this.itemClass(item);
const shownKeys = ItemClass.fieldNames();
shownKeys.push('type_');
@@ -583,7 +597,7 @@ export default class BaseItem extends BaseModel {
throw new Error('Unreachable');
}
static async itemsThatNeedSync(syncTarget: number, limit = 100) {
public static async itemsThatNeedSync(syncTarget: number, limit = 100): Promise<ItemsThatNeedSyncResult> {
const classNames = this.syncItemClassNames();
for (let i = 0; i < classNames.length; i++) {
@@ -660,12 +674,13 @@ export default class BaseItem extends BaseModel {
changedItems = await ItemClass.modelSelectAll(sql);
}
const neverSyncedItemIds = neverSyncedItem.map((it: any) => it.id);
const items = neverSyncedItem.concat(changedItems);
if (i >= classNames.length - 1) {
return { hasMore: items.length >= limit, items: items };
return { hasMore: items.length >= limit, items: items, neverSyncedItemIds };
} else {
if (items.length) return { hasMore: true, items: items };
if (items.length) return { hasMore: true, items: items, neverSyncedItemIds };
}
}

View File

@@ -1,6 +1,6 @@
{
"name": "@joplin/lib",
"version": "2.0.2",
"version": "2.0.3",
"lockfileVersion": 1,
"requires": true,
"dependencies": {

View File

@@ -1,6 +1,6 @@
{
"name": "@joplin/lib",
"version": "2.0.2",
"version": "2.1.0",
"description": "Joplin Core library",
"author": "Laurent Cozic",
"homepage": "",
@@ -25,11 +25,11 @@
"typescript": "^4.0.5"
},
"dependencies": {
"@joplin/fork-htmlparser2": "^4.1.26",
"@joplin/fork-sax": "^1.2.30",
"@joplin/fork-htmlparser2": "^4.1.27",
"@joplin/fork-sax": "^1.2.31",
"@joplin/renderer": "^1.8.2",
"@joplin/turndown": "^4.0.48",
"@joplin/turndown-plugin-gfm": "^1.0.30",
"@joplin/turndown": "^4.0.49",
"@joplin/turndown-plugin-gfm": "^1.0.31",
"async-mutex": "^0.1.3",
"aws-sdk": "^2.588.0",
"base-64": "^0.1.0",

View File

@@ -0,0 +1,167 @@
import { FileApi } from '../../file-api';
import BaseItem from '../../models/BaseItem';
import Note from '../../models/Note';
import { expectNotThrow, expectThrow, setupDatabaseAndSynchronizer, switchClient } from '../../testing/test-utils';
import time from '../../time';
import ItemUploader, { ApiCallFunction } from './ItemUploader';
interface ApiCall {
name: string;
args: any[];
}
function clearArray(a: any[]) {
a.splice(0, a.length);
}
function newFakeApi(): FileApi {
return { supportsMultiPut: true } as any;
}
function newFakeApiCall(callRecorder: ApiCall[], itemBodyCallback: Function = null): ApiCallFunction {
const apiCall = async (callName: string, ...args: any[]): Promise<any> => {
callRecorder.push({ name: callName, args });
if (callName === 'multiPut') {
const [batch] = args;
const output: any = { items: {} };
for (const item of batch) {
if (itemBodyCallback) {
output.items[item.name] = itemBodyCallback(item);
} else {
output.items[item.name] = {
item: item.body,
error: null,
};
}
}
return output;
}
};
return apiCall;
}
describe('synchronizer_ItemUploader', function() {
beforeEach(async (done) => {
await setupDatabaseAndSynchronizer(1);
await setupDatabaseAndSynchronizer(2);
await switchClient(1);
done();
});
it('should batch uploads and use the cache afterwards', (async () => {
const callRecorder: ApiCall[] = [];
const itemUploader = new ItemUploader(newFakeApi(), newFakeApiCall(callRecorder));
const notes = [
await Note.save({ title: '1' }),
await Note.save({ title: '2' }),
];
await itemUploader.preUploadItems(notes);
// There should be only one call to "multiPut" because the items have
// been batched.
expect(callRecorder.length).toBe(1);
expect(callRecorder[0].name).toBe('multiPut');
clearArray(callRecorder);
// Now if we try to upload the item it shouldn't call the API because it
// will use the cached item.
await itemUploader.serializeAndUploadItem(Note, BaseItem.systemPath(notes[0]), notes[0]);
expect(callRecorder.length).toBe(0);
// Now try to process a note that hasn't been cached. In that case, it
// should call "PUT" directly.
const note3 = await Note.save({ title: '3' });
await itemUploader.serializeAndUploadItem(Note, BaseItem.systemPath(note3), note3);
expect(callRecorder.length).toBe(1);
expect(callRecorder[0].name).toBe('put');
}));
it('should not batch upload if the items are over the batch size limit', (async () => {
const callRecorder: ApiCall[] = [];
const itemUploader = new ItemUploader(newFakeApi(), newFakeApiCall(callRecorder));
itemUploader.maxBatchSize = 1;
const notes = [
await Note.save({ title: '1' }),
await Note.save({ title: '2' }),
];
await itemUploader.preUploadItems(notes);
expect(callRecorder.length).toBe(0);
}));
it('should not use the cache if the note has changed since the pre-upload', (async () => {
const callRecorder: ApiCall[] = [];
const itemUploader = new ItemUploader(newFakeApi(), newFakeApiCall(callRecorder));
const notes = [
await Note.save({ title: '1' }),
await Note.save({ title: '2' }),
];
await itemUploader.preUploadItems(notes);
clearArray(callRecorder);
await itemUploader.serializeAndUploadItem(Note, BaseItem.systemPath(notes[0]), notes[0]);
expect(callRecorder.length).toBe(0);
await time.msleep(1);
notes[1] = await Note.save({ title: '22' });
await itemUploader.serializeAndUploadItem(Note, BaseItem.systemPath(notes[1]), notes[1]);
expect(callRecorder.length).toBe(1);
}));
it('should respect the max batch size', (async () => {
const callRecorder: ApiCall[] = [];
const itemUploader = new ItemUploader(newFakeApi(), newFakeApiCall(callRecorder));
const notes = [
await Note.save({ title: '1' }),
await Note.save({ title: '2' }),
await Note.save({ title: '3' }),
];
const noteSize = BaseItem.systemPath(notes[0]).length + (await Note.serializeForSync(notes[0])).length;
itemUploader.maxBatchSize = noteSize * 2;
// It should send two batches - one with two notes, and the second with
// only one note.
await itemUploader.preUploadItems(notes);
expect(callRecorder.length).toBe(2);
expect(callRecorder[0].args[0].length).toBe(2);
expect(callRecorder[1].args[0].length).toBe(1);
}));
it('should rethrow error for items within the batch', (async () => {
const callRecorder: ApiCall[] = [];
const notes = [
await Note.save({ title: '1' }),
await Note.save({ title: '2' }),
await Note.save({ title: '3' }),
];
// Simulates throwing an error on note 2
const itemBodyCallback = (item: any): any => {
if (item.name === BaseItem.systemPath(notes[1])) {
return { error: new Error('Could not save item'), item: null };
} else {
return { error: null, item: item.body };
}
};
const itemUploader = new ItemUploader(newFakeApi(), newFakeApiCall(callRecorder, itemBodyCallback));
await itemUploader.preUploadItems(notes);
await expectNotThrow(async () => itemUploader.serializeAndUploadItem(Note, BaseItem.systemPath(notes[0]), notes[0]));
await expectThrow(async () => itemUploader.serializeAndUploadItem(Note, BaseItem.systemPath(notes[1]), notes[1]));
await expectNotThrow(async () => itemUploader.serializeAndUploadItem(Note, BaseItem.systemPath(notes[2]), notes[2]));
}));
});

View File

@@ -0,0 +1,110 @@
import { ModelType } from '../../BaseModel';
import { FileApi, MultiPutItem } from '../../file-api';
import Logger from '../../Logger';
import BaseItem, { ItemThatNeedSync } from '../../models/BaseItem';
const logger = Logger.create('ItemUploader');
export type ApiCallFunction = (fnName: string, ...args: any[])=> Promise<any>;
interface BatchItem extends MultiPutItem {
localItemUpdatedTime: number;
}
export default class ItemUploader {
private api_: FileApi;
private apiCall_: ApiCallFunction;
private preUploadedItems_: Record<string, any> = {};
private preUploadedItemUpdatedTimes_: Record<string, number> = {};
private maxBatchSize_ = 1 * 1024 * 1024; // 1 MB
public constructor(api: FileApi, apiCall: ApiCallFunction) {
this.api_ = api;
this.apiCall_ = apiCall;
}
public get maxBatchSize() {
return this.maxBatchSize_;
}
public set maxBatchSize(v: number) {
this.maxBatchSize_ = v;
}
public async serializeAndUploadItem(ItemClass: any, path: string, local: ItemThatNeedSync) {
const preUploadItem = this.preUploadedItems_[path];
if (preUploadItem) {
if (this.preUploadedItemUpdatedTimes_[path] !== local.updated_time) {
// Normally this should be rare as it can only happen if the
// item has been changed between the moment it was pre-uploaded
// and the moment where it's being processed by the
// synchronizer. It could happen for example for a note being
// edited just at the same time. In that case, we proceed with
// the regular upload.
logger.warn(`Pre-uploaded item updated_time has changed. It is going to be re-uploaded: ${path} (From ${this.preUploadedItemUpdatedTimes_[path]} to ${local.updated_time})`);
} else {
if (preUploadItem.error) throw new Error(preUploadItem.error.message ? preUploadItem.error.message : 'Unknown pre-upload error');
return;
}
}
const content = await ItemClass.serializeForSync(local);
await this.apiCall_('put', path, content);
}
public async preUploadItems(items: ItemThatNeedSync[]) {
if (!this.api_.supportsMultiPut) return;
const itemsToUpload: BatchItem[] = [];
for (const local of items) {
// For resources, additional logic is necessary - in particular the blob
// should be uploaded before the metadata, so we can't batch process.
if (local.type_ === ModelType.Resource) continue;
const ItemClass = BaseItem.itemClass(local);
itemsToUpload.push({
name: BaseItem.systemPath(local),
body: await ItemClass.serializeForSync(local),
localItemUpdatedTime: local.updated_time,
});
}
let batchSize = 0;
let currentBatch: BatchItem[] = [];
const uploadBatch = async (batch: BatchItem[]) => {
for (const batchItem of batch) {
this.preUploadedItemUpdatedTimes_[batchItem.name] = batchItem.localItemUpdatedTime;
}
const response = await this.apiCall_('multiPut', batch);
this.preUploadedItems_ = {
...this.preUploadedItems_,
...response.items,
};
};
while (itemsToUpload.length) {
const itemToUpload = itemsToUpload.pop();
const itemSize = itemToUpload.name.length + itemToUpload.body.length;
// Although it should be rare, if the item itself is above the
// batch max size, we skip it. In that case it will be uploaded the
// regular way when the synchronizer calls `serializeAndUploadItem()`.
if (itemSize > this.maxBatchSize) continue;
if (batchSize + itemSize > this.maxBatchSize) {
await uploadBatch(currentBatch);
batchSize = itemSize;
currentBatch = [itemToUpload];
} else {
batchSize += itemSize;
currentBatch.push(itemToUpload);
}
}
if (currentBatch.length) await uploadBatch(currentBatch);
}
}
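The grouping logic in `preUploadItems()` can be isolated as a small pure function. This is a simplified, order-preserving sketch (the original drains the queue with `pop()`): items are packed into batches that stay under `maxBatchSize`, and any single item larger than the limit is skipped so the synchronizer uploads it individually later.

```typescript
interface QueuedItem {
	name: string;
	body: string;
}

// Greedy batching: same size accounting as ItemUploader (name + body length).
function makeBatches(items: QueuedItem[], maxBatchSize: number): QueuedItem[][] {
	const batches: QueuedItem[][] = [];
	let currentBatch: QueuedItem[] = [];
	let batchSize = 0;
	for (const item of items) {
		const itemSize = item.name.length + item.body.length;
		// Oversized items are left to the regular one-by-one upload path.
		if (itemSize > maxBatchSize) continue;
		if (batchSize + itemSize > maxBatchSize && currentBatch.length) {
			batches.push(currentBatch);
			currentBatch = [];
			batchSize = 0;
		}
		currentBatch.push(item);
		batchSize += itemSize;
	}
	if (currentBatch.length) batches.push(currentBatch);
	return batches;
}
```

With three items of equal size and a limit that fits exactly two, this produces one batch of two and one of one — the scenario exercised by the "should respect the max batch size" test.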

View File

@@ -29,7 +29,7 @@ import Revision from '../models/Revision';
import MasterKey from '../models/MasterKey';
import BaseItem from '../models/BaseItem';
const { FileApi } = require('../file-api.js');
const { FileApiDriverMemory } = require('../file-api-driver-memory.js');
const FileApiDriverMemory = require('../file-api-driver-memory').default;
const { FileApiDriverLocal } = require('../file-api-driver-local.js');
const { FileApiDriverWebDav } = require('../file-api-driver-webdav.js');
const { FileApiDriverDropbox } = require('../file-api-driver-dropbox.js');

View File

@@ -1,6 +1,6 @@
{
"name": "@joplin/plugin-repo-cli",
"version": "2.0.2",
"version": "2.0.3",
"lockfileVersion": 1,
"requires": true,
"dependencies": {

View File

@@ -1,6 +1,6 @@
{
"name": "@joplin/plugin-repo-cli",
"version": "2.0.2",
"version": "2.1.0",
"description": "",
"main": "index.js",
"bin": {

View File

@@ -1,6 +1,6 @@
{
"name": "@joplin/renderer",
"version": "2.0.2",
"version": "2.0.3",
"lockfileVersion": 1,
"requires": true,
"dependencies": {

View File

@@ -1,6 +1,6 @@
{
"name": "@joplin/renderer",
"version": "2.0.2",
"version": "2.1.0",
"description": "The Joplin note renderer, used by the mobile and desktop applications",
"repository": "https://github.com/laurent22/joplin/tree/dev/packages/renderer",
"main": "index.js",
@@ -24,7 +24,7 @@
"typescript": "^4.0.5"
},
"dependencies": {
"@joplin/fork-htmlparser2": "^4.1.26",
"@joplin/fork-htmlparser2": "^4.1.27",
"font-awesome-filetypes": "^2.1.0",
"fs-extra": "^8.1.0",
"highlight.js": "^10.2.1",

View File

@@ -1,6 +1,6 @@
{
"name": "@joplin/server",
"version": "2.0.13",
"version": "2.1.1",
"lockfileVersion": 1,
"requires": true,
"dependencies": {

View File

@@ -1,9 +1,12 @@
{
"name": "@joplin/server",
"version": "2.0.13",
"version": "2.1.1",
"private": true,
"scripts": {
"start-dev": "nodemon --config nodemon.json --ext ts,js,mustache,css,tsx dist/app.js --env dev",
"devCreateDb": "node dist/app.js --env dev --create-db",
"devDropTables": "node dist/app.js --env dev --drop-tables",
"devDropDb": "node dist/app.js --env dev --drop-db",
"start": "node dist/app.js",
"generateTypes": "rm -f db-buildTypes.sqlite && npm run start -- --migrate-db --env buildTypes && node dist/tools/generateTypes.js && mv db-buildTypes.sqlite schema.sqlite",
"tsc": "tsc --project tsconfig.json",

View File

@@ -27,6 +27,11 @@ const env: Env = argv.env as Env || Env.Prod;
const defaultEnvVariables: Record<Env, EnvVariables> = {
dev: {
// To test with the Postgres database, uncomment DB_CLIENT below and
// comment out SQLITE_DATABASE. Then start the Postgres server using
// `docker-compose --file docker-compose.db-dev.yml up`
// DB_CLIENT: 'pg',
SQLITE_DATABASE: `${sqliteDefaultDir}/db-dev.sqlite`,
},
buildTypes: {

View File

@@ -94,7 +94,27 @@ export async function waitForConnection(dbConfig: DatabaseConfig): Promise<Conne
}
export async function connectDb(dbConfig: DatabaseConfig): Promise<DbConnection> {
return knex(makeKnexConfig(dbConfig));
const connection = knex(makeKnexConfig(dbConfig));
const debugSlowQueries = false;
if (debugSlowQueries) {
const startTimes: Record<string, number> = {};
const slowQueryDuration = 10;
connection.on('query', (data) => {
startTimes[data.__knexQueryUid] = Date.now();
});
connection.on('query-response', (_response, data) => {
const duration = Date.now() - startTimes[data.__knexQueryUid];
if (duration < slowQueryDuration) return;
console.info(`SQL: ${data.sql} (${duration}ms)`);
});
}
return connection;
}
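The slow-query debugging above correlates Knex's `query` and `query-response` events through `__knexQueryUid`. The same pattern works against any event emitter; here is a standalone sketch (using Node's `EventEmitter` rather than a real Knex connection) that records a start time per query id and logs only queries over the threshold:

```typescript
import { EventEmitter } from 'events';

// Attach slow-query logging to an emitter that fires Knex-style
// 'query' / 'query-response' events carrying a __knexQueryUid.
function watchSlowQueries(
	connection: EventEmitter,
	slowQueryDuration = 10,
	log: (msg: string) => void = console.info
) {
	const startTimes: Record<string, number> = {};
	connection.on('query', (data: any) => {
		startTimes[data.__knexQueryUid] = Date.now();
	});
	connection.on('query-response', (_response: any, data: any) => {
		const duration = Date.now() - startTimes[data.__knexQueryUid];
		delete startTimes[data.__knexQueryUid]; // avoid unbounded growth
		if (duration < slowQueryDuration) return;
		log(`SQL: ${data.sql} (${duration}ms)`);
	});
}
```

Setting `slowQueryDuration` to 0 logs every query, which is handy when first profiling an endpoint.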
export async function disconnectDb(db: DbConnection) {

View File

@@ -49,6 +49,7 @@ export default async function(ctx: AppContext) {
stack: config().showErrorStackTraces ? error.stack : '',
owner: ctx.owner,
},
title: 'Error',
};
ctx.response.body = await ctx.services.mustache.renderView(view);
} else { // JSON

View File

@@ -12,6 +12,18 @@ const mimeUtils = require('@joplin/lib/mime-utils.js').mime;
// Converts "root:/myfile.txt:" to "myfile.txt"
const extractNameRegex = /^root:\/(.*):$/;
export interface SaveFromRawContentItem {
name: string;
body: Buffer;
}
export interface SaveFromRawContentResultItem {
item: Item;
error: any;
}
export type SaveFromRawContentResult = Record<string, SaveFromRawContentResultItem>;
export interface PaginatedItems extends PaginatedResults {
items: Item[];
}
@@ -282,62 +294,122 @@ export default class ItemModel extends BaseModel<Item> {
return this.itemToJoplinItem(raw);
}
public async saveFromRawContent(user: User, name: string, buffer: Buffer, options: ItemSaveOption = null): Promise<Item> {
public async saveFromRawContent(user: User, rawContentItems: SaveFromRawContentItem[], options: ItemSaveOption = null): Promise<SaveFromRawContentResult> {
options = options || {};
const existingItem = await this.loadByName(user.id, name);
// In this function, first we process the input items, which may be
// serialized Joplin items or actual buffers (for resources) and convert
// them to database items. Once it's done those db items are saved in
// batch at the end.
const isJoplinItem = isJoplinItemName(name);
let isNote = false;
const item: Item = {
name,
};
let joplinItem: any = null;
let resourceIds: string[] = [];
if (isJoplinItem) {
joplinItem = await unserializeJoplinItem(buffer.toString());
isNote = joplinItem.type_ === ModelType.Note;
resourceIds = isNote ? linkedResourceIds(joplinItem.body) : [];
item.jop_id = joplinItem.id;
item.jop_parent_id = joplinItem.parent_id || '';
item.jop_type = joplinItem.type_;
item.jop_encryption_applied = joplinItem.encryption_applied || 0;
item.jop_share_id = joplinItem.share_id || '';
const joplinItemToSave = { ...joplinItem };
delete joplinItemToSave.id;
delete joplinItemToSave.parent_id;
delete joplinItemToSave.share_id;
delete joplinItemToSave.type_;
delete joplinItemToSave.encryption_applied;
item.content = Buffer.from(JSON.stringify(joplinItemToSave));
} else {
item.content = buffer;
interface ItemToProcess {
item: Item;
error: Error;
resourceIds?: string[];
isNote?: boolean;
joplinItem?: any;
}
if (existingItem) item.id = existingItem.id;
const existingItems = await this.loadByNames(user.id, rawContentItems.map(i => i.name));
const itemsToProcess: Record<string, ItemToProcess> = {};
if (options.shareId) item.jop_share_id = options.shareId;
for (const rawItem of rawContentItems) {
try {
const isJoplinItem = isJoplinItemName(rawItem.name);
let isNote = false;
await this.models().user().checkMaxItemSizeLimit(user, buffer, item, joplinItem);
const item: Item = {
name: rawItem.name,
};
return this.withTransaction<Item>(async () => {
const savedItem = await this.saveForUser(user.id, item);
let joplinItem: any = null;
if (isNote) {
await this.models().itemResource().deleteByItemId(savedItem.id);
await this.models().itemResource().addResourceIds(savedItem.id, resourceIds);
let resourceIds: string[] = [];
if (isJoplinItem) {
joplinItem = await unserializeJoplinItem(rawItem.body.toString());
isNote = joplinItem.type_ === ModelType.Note;
resourceIds = isNote ? linkedResourceIds(joplinItem.body) : [];
item.jop_id = joplinItem.id;
item.jop_parent_id = joplinItem.parent_id || '';
item.jop_type = joplinItem.type_;
item.jop_encryption_applied = joplinItem.encryption_applied || 0;
item.jop_share_id = joplinItem.share_id || '';
const joplinItemToSave = { ...joplinItem };
delete joplinItemToSave.id;
delete joplinItemToSave.parent_id;
delete joplinItemToSave.share_id;
delete joplinItemToSave.type_;
delete joplinItemToSave.encryption_applied;
item.content = Buffer.from(JSON.stringify(joplinItemToSave));
} else {
item.content = rawItem.body;
}
const existingItem = existingItems.find(i => i.name === rawItem.name);
if (existingItem) item.id = existingItem.id;
if (options.shareId) item.jop_share_id = options.shareId;
await this.models().user().checkMaxItemSizeLimit(user, rawItem.body, item, joplinItem);
itemsToProcess[rawItem.name] = {
item: item,
error: null,
resourceIds,
isNote,
joplinItem,
};
} catch (error) {
itemsToProcess[rawItem.name] = {
item: null,
error: error,
};
}
}
return savedItem;
const output: SaveFromRawContentResult = {};
await this.withTransaction(async () => {
for (const name of Object.keys(itemsToProcess)) {
const o = itemsToProcess[name];
if (o.error) {
output[name] = {
item: null,
error: o.error,
};
continue;
}
const itemToSave = o.item;
try {
const savedItem = await this.saveForUser(user.id, itemToSave);
if (o.isNote) {
await this.models().itemResource().deleteByItemId(savedItem.id);
await this.models().itemResource().addResourceIds(savedItem.id, o.resourceIds);
}
output[name] = {
item: savedItem,
error: null,
};
} catch (error) {
output[name] = {
item: null,
error: error,
};
}
}
});
return output;
}
protected async validate(item: Item, options: ValidateOptions = {}): Promise<Item> {

View File

@@ -5,7 +5,7 @@ import BaseModel from './BaseModel';
export default class TokenModel extends BaseModel<Token> {
private tokenTtl_: number = 7 * 24 * 60 * 1000;
private tokenTtl_: number = 7 * 24 * 60 * 60 * 1000;
public get tableName(): string {
return 'tokens';

View File

@@ -51,19 +51,26 @@ export function accountTypeOptions(): AccountTypeSelectOptions[] {
return [
{
value: AccountType.Default,
label: 'Default',
label: accountTypeToString(AccountType.Default),
},
{
value: AccountType.Basic,
label: 'Basic',
label: accountTypeToString(AccountType.Basic),
},
{
value: AccountType.Pro,
label: 'Pro',
label: accountTypeToString(AccountType.Pro),
},
];
}
export function accountTypeToString(accountType: AccountType): string {
if (accountType === AccountType.Default) return 'Default';
if (accountType === AccountType.Basic) return 'Basic';
if (accountType === AccountType.Pro) return 'Pro';
throw new Error(`Invalid type: ${accountType}`);
}
export default class UserModel extends BaseModel<User> {
public get tableName(): string {
@@ -225,16 +232,19 @@ export default class UserModel extends BaseModel<User> {
await this.save({ id: user.id, email_confirmed: 1 });
}
// public async saveWithAccountType(accountType:AccountType, user: User, options: SaveOptions = {}): Promise<User> {
// if (accountType !== AccountType.Default) {
// user = {
// ...user,
// ...accountTypeProperties(accountType),
// };
// }
public async sendAccountConfirmationEmail(user: User) {
const validationToken = await this.models().token().generate(user.id);
const confirmUrl = encodeURI(this.confirmUrl(user.id, validationToken));
// return this.save(user, options);
// }
await this.models().email().push({
sender_id: EmailSender.NoReply,
recipient_id: user.id,
recipient_email: user.email,
recipient_name: user.full_name || '',
subject: `Please setup your ${this.appName} account`,
body: `Your new ${this.appName} account is almost ready to use!\n\nPlease click on the following link to finish setting up your account:\n\n[Complete your account](${confirmUrl})`,
});
}
// Note that when the "password" property is provided, it is going to be
// hashed automatically. It means that it is not safe to do:
@@ -254,17 +264,7 @@ export default class UserModel extends BaseModel<User> {
const savedUser = await super.save(user, options);
if (isNew) {
const validationToken = await this.models().token().generate(savedUser.id);
const confirmUrl = encodeURI(this.confirmUrl(savedUser.id, validationToken));
await this.models().email().push({
sender_id: EmailSender.NoReply,
recipient_id: savedUser.id,
recipient_email: savedUser.email,
recipient_name: savedUser.full_name || '',
subject: `Please setup your ${this.appName} account`,
body: `Your new ${this.appName} account has been created!\n\nPlease click on the following link to complete the creation of your account:\n\n[Complete your account](${confirmUrl})`,
});
await this.sendAccountConfirmationEmail(savedUser);
}
UserModel.eventEmitter.emit('created');

View File

@@ -0,0 +1,92 @@
import { bodyFields } from '../../utils/requestUtils';
import { SubPath } from '../../utils/routeUtils';
import Router from '../../utils/Router';
import { HttpMethod, RouteType } from '../../utils/types';
import { AppContext } from '../../utils/types';
import routeHandler from '../../middleware/routeHandler';
import config from '../../config';
import { ErrorBadRequest } from '../../utils/errors';
const router = new Router(RouteType.Api);
const maxSubRequests = 50;
interface SubRequest {
method: HttpMethod;
url: string;
headers: Record<string, string>;
body: any;
}
type SubRequests = Record<string, SubRequest>;
interface SubRequestResponse {
status: number;
body: any;
header: Record<string, any>;
}
type BatchResponse = Record<string, SubRequestResponse>;
function createSubRequestContext(ctx: AppContext, subRequest: SubRequest): AppContext {
const fullUrl = `${config().apiBaseUrl}/${subRequest.url.trim()}`;
const newContext: AppContext = {
...ctx,
URL: new URL(fullUrl),
request: {
...ctx.request,
method: subRequest.method,
},
method: subRequest.method,
headers: {
...ctx.headers,
...subRequest.headers,
},
body: subRequest.body,
appLogger: ctx.appLogger,
path: `/${subRequest.url}`,
url: fullUrl,
services: ctx.services,
db: ctx.db,
models: ctx.models,
routes: ctx.routes,
};
return newContext;
}
function validateRequest(request: SubRequest): SubRequest {
const output = { ...request };
if (!output.method) output.method = HttpMethod.GET;
if (!output.url) throw new Error('"url" is required');
return output;
}
router.post('api/batch', async (_path: SubPath, ctx: AppContext) => {
throw new Error('Not enabled');
// eslint-disable-next-line no-unreachable
const subRequests = await bodyFields<SubRequests>(ctx.req);
if (Object.keys(subRequests).length > maxSubRequests) throw new ErrorBadRequest(`Can only process up to ${maxSubRequests} requests`);
const response: BatchResponse = {};
for (const subRequestId of Object.keys(subRequests)) {
const subRequest = validateRequest(subRequests[subRequestId]);
const subRequestContext = createSubRequestContext(ctx, subRequest);
await routeHandler(subRequestContext);
const r = subRequestContext.response;
response[subRequestId] = {
status: r.status,
body: typeof r.body === 'object' ? { ...r.body } : r.body,
header: r.header ? { ...r.header } : {},
};
}
return response;
});
export default router;

View File

@@ -0,0 +1,19 @@
import { SubPath } from '../../utils/routeUtils';
import Router from '../../utils/Router';
import { RouteType } from '../../utils/types';
import { AppContext } from '../../utils/types';
import { putItemContents } from './items';
import { PaginatedResults } from '../../models/utils/pagination';
const router = new Router(RouteType.Api);
router.put('api/batch_items', async (path: SubPath, ctx: AppContext) => {
const output: PaginatedResults = {
items: await putItemContents(path, ctx, true) as any,
has_more: false,
};
return output;
});
export default router;

View File

@@ -3,10 +3,11 @@ import { NoteEntity } from '@joplin/lib/services/database/types';
import { ModelType } from '@joplin/lib/BaseModel';
import { deleteApi, getApi, putApi } from '../../utils/testing/apiUtils';
import { Item } from '../../db';
import { PaginatedItems } from '../../models/ItemModel';
import { PaginatedItems, SaveFromRawContentResult } from '../../models/ItemModel';
import { shareFolderWithUser } from '../../utils/testing/shareApiUtils';
import { resourceBlobPath } from '../../utils/joplinUtils';
import { ErrorForbidden, ErrorPayloadTooLarge } from '../../utils/errors';
import { PaginatedResults } from '../../models/utils/pagination';
describe('api_items', function() {
@@ -149,6 +150,56 @@ describe('api_items', function() {
expect(result.name).toBe(`${noteId}.md`);
});
test('should batch upload items', async function() {
const { session: session1 } = await createUserAndSession(1, false);
const result: PaginatedResults = await putApi(session1.id, 'batch_items', {
items: [
{
name: '00000000000000000000000000000001.md',
body: makeNoteSerializedBody({ id: '00000000000000000000000000000001' }),
},
{
name: '00000000000000000000000000000002.md',
body: makeNoteSerializedBody({ id: '00000000000000000000000000000002' }),
},
],
});
expect(Object.keys(result.items).length).toBe(2);
expect(Object.keys(result.items).sort()).toEqual(['00000000000000000000000000000001.md', '00000000000000000000000000000002.md']);
});
test('should report errors when batch uploading', async function() {
const { user: user1, session: session1 } = await createUserAndSession(1, false);
const note1 = makeNoteSerializedBody({ id: '00000000000000000000000000000001' });
await models().user().save({ id: user1.id, max_item_size: note1.length });
const result: PaginatedResults = await putApi(session1.id, 'batch_items', {
items: [
{
name: '00000000000000000000000000000001.md',
body: note1,
},
{
name: '00000000000000000000000000000002.md',
body: makeNoteSerializedBody({ id: '00000000000000000000000000000002', body: 'too large' }),
},
],
});
const items: SaveFromRawContentResult = result.items as any;
expect(Object.keys(items).length).toBe(2);
expect(Object.keys(items).sort()).toEqual(['00000000000000000000000000000001.md', '00000000000000000000000000000002.md']);
expect(items['00000000000000000000000000000001.md'].item).toBeTruthy();
expect(items['00000000000000000000000000000001.md'].error).toBeFalsy();
expect(items['00000000000000000000000000000002.md'].item).toBeFalsy();
expect(items['00000000000000000000000000000002.md'].error.httpCode).toBe(ErrorPayloadTooLarge.httpCode);
});
test('should list children', async function() {
const { session } = await createUserAndSession(1, true);

View File

@@ -5,14 +5,71 @@ import Router from '../../utils/Router';
import { RouteType } from '../../utils/types';
import { AppContext } from '../../utils/types';
import * as fs from 'fs-extra';
import { ErrorForbidden, ErrorMethodNotAllowed, ErrorNotFound } from '../../utils/errors';
import ItemModel, { ItemSaveOption } from '../../models/ItemModel';
import { ErrorForbidden, ErrorMethodNotAllowed, ErrorNotFound, ErrorPayloadTooLarge } from '../../utils/errors';
import ItemModel, { ItemSaveOption, SaveFromRawContentItem } from '../../models/ItemModel';
import { requestDeltaPagination, requestPagination } from '../../models/utils/pagination';
import { AclAction } from '../../models/BaseModel';
import { safeRemove } from '../../utils/fileUtils';
import { formatBytes, MB } from '../../utils/bytes';
const router = new Router(RouteType.Api);
const batchMaxSize = 1 * MB;
export async function putItemContents(path: SubPath, ctx: AppContext, isBatch: boolean) {
if (!ctx.owner.can_upload) throw new ErrorForbidden('Uploading content is disabled');
const parsedBody = await formParse(ctx.req);
const bodyFields = parsedBody.fields;
const saveOptions: ItemSaveOption = {};
let items: SaveFromRawContentItem[] = [];
if (isBatch) {
let totalSize = 0;
items = bodyFields.items.map((item: any) => {
totalSize += item.name.length + (item.body ? item.body.length : 0);
return {
name: item.name,
body: item.body ? Buffer.from(item.body, 'utf8') : Buffer.alloc(0),
};
});
if (totalSize > batchMaxSize) throw new ErrorPayloadTooLarge(`Size of items (${formatBytes(totalSize)}) is over the limit (${formatBytes(batchMaxSize)})`);
} else {
const filePath = parsedBody?.files?.file ? parsedBody.files.file.path : null;
try {
const buffer = filePath ? await fs.readFile(filePath) : Buffer.alloc(0);
// This end point can optionally set the associated jop_share_id field. It
// is only useful when uploading resource blob (under .resource folder)
// since they can't have metadata. Note, Folder and Resource items all
// include the "share_id" field property so it doesn't need to be set via
// query parameter.
if (ctx.query['share_id']) {
saveOptions.shareId = ctx.query['share_id'];
await ctx.models.item().checkIfAllowed(ctx.owner, AclAction.Create, { jop_share_id: saveOptions.shareId });
}
items = [
{
name: ctx.models.item().pathToName(path.id),
body: buffer,
},
];
} finally {
if (filePath) await safeRemove(filePath);
}
}
const output = await ctx.models.item().saveFromRawContent(ctx.owner, items, saveOptions);
for (const [name] of Object.entries(output)) {
if (output[name].item) output[name].item = ctx.models.item().toApiOutput(output[name].item) as Item;
}
return output;
}
// Note about access control:
//
// - All these calls are scoped to a user, which is derived from the session
@@ -66,36 +123,10 @@ router.get('api/items/:id/content', async (path: SubPath, ctx: AppContext) => {
});
router.put('api/items/:id/content', async (path: SubPath, ctx: AppContext) => {
if (!ctx.owner.can_upload) throw new ErrorForbidden('Uploading content is disabled');
const itemModel = ctx.models.item();
const name = itemModel.pathToName(path.id);
const parsedBody = await formParse(ctx.req);
const filePath = parsedBody?.files?.file ? parsedBody.files.file.path : null;
let outputItem: Item = null;
try {
const buffer = filePath ? await fs.readFile(filePath) : Buffer.alloc(0);
const saveOptions: ItemSaveOption = {};
// This end point can optionally set the associated jop_share_id field. It
// is only useful when uploading resource blob (under .resource folder)
// since they can't have metadata. Note, Folder and Resource items all
// include the "share_id" field property so it doesn't need to be set via
// query parameter.
if (ctx.query['share_id']) {
saveOptions.shareId = ctx.query['share_id'];
await itemModel.checkIfAllowed(ctx.owner, AclAction.Create, { jop_share_id: saveOptions.shareId });
}
const item = await itemModel.saveFromRawContent(ctx.owner, name, buffer, saveOptions);
outputItem = itemModel.toApiOutput(item) as Item;
} finally {
if (filePath) await safeRemove(filePath);
}
return outputItem;
const results = await putItemContents(path, ctx, false);
const result = results[Object.keys(results)[0]];
if (result.error) throw result.error;
return result.item;
});
router.get('api/items/:id/delta', async (_path: SubPath, ctx: AppContext) => {

View File

@@ -56,7 +56,7 @@ router.get('changes', async (_path: SubPath, ctx: AppContext) => {
}),
};
const view: View = defaultView('changes');
const view: View = defaultView('changes', 'Log');
view.content.changeTable = makeTableView(table),
view.cssFiles = ['index/changes'];
return view;

View File

@@ -5,6 +5,9 @@ import { AppContext } from '../../utils/types';
import { contextSessionId } from '../../utils/requestUtils';
import { ErrorMethodNotAllowed } from '../../utils/errors';
import defaultView from '../../utils/defaultView';
import { accountTypeProperties, accountTypeToString } from '../../models/UserModel';
import { formatBytes } from '../../utils/bytes';
import { yesOrNo } from '../../utils/strings';
const router: Router = new Router(RouteType.Web);
@@ -12,7 +15,35 @@ router.get('home', async (_path: SubPath, ctx: AppContext) => {
contextSessionId(ctx);
if (ctx.method === 'GET') {
return defaultView('home');
const accountProps = accountTypeProperties(ctx.owner.account_type);
const view = defaultView('home', 'Home');
view.content = {
userProps: [
{
label: 'Account Type',
value: accountTypeToString(accountProps.account_type),
},
{
label: 'Is Admin',
value: yesOrNo(ctx.owner.is_admin),
},
{
label: 'Max Item Size',
value: accountProps.max_item_size ? formatBytes(accountProps.max_item_size) : '∞',
},
{
label: 'Can Share Note',
value: yesOrNo(true),
},
{
label: 'Can Share Notebook',
value: yesOrNo(accountProps.can_share),
},
],
};
return view;
}
throw new ErrorMethodNotAllowed();

View File

@@ -64,7 +64,7 @@ router.get('items', async (_path: SubPath, ctx: AppContext) => {
}),
};
const view: View = defaultView('items');
const view: View = defaultView('items', 'Items');
view.content.itemTable = makeTableView(table),
view.content.postUrl = `${config().baseUrl}/items`;
view.cssFiles = ['index/items'];

View File

@@ -8,7 +8,7 @@ import defaultView from '../../utils/defaultView';
import { View } from '../../services/MustacheService';
function makeView(error: any = null): View {
const view = defaultView('login');
const view = defaultView('login', 'Login');
view.content = {
error,
signupUrl: config().signupEnabled ? makeUrl(UrlType.Signup) : '',

View File

@@ -12,7 +12,7 @@ import { AccountType, accountTypeProperties } from '../../models/UserModel';
import { ErrorForbidden } from '../../utils/errors';
function makeView(error: Error = null): View {
const view = defaultView('signup');
const view = defaultView('signup', 'Sign Up');
view.content = {
error,
postUrl: makeUrl(UrlType.Signup),

View File

@@ -4,7 +4,7 @@ import { RouteType } from '../../utils/types';
import { AppContext, HttpMethod } from '../../utils/types';
import { bodyFields, formParse } from '../../utils/requestUtils';
import { ErrorForbidden, ErrorUnprocessableEntity } from '../../utils/errors';
import { User } from '../../db';
import { User, Uuid } from '../../db';
import config from '../../config';
import { View } from '../../services/MustacheService';
import defaultView from '../../utils/defaultView';
@@ -83,7 +83,14 @@ router.get('users', async (_path: SubPath, ctx: AppContext) => {
const users = await userModel.all();
const view: View = defaultView('users');
users.sort((u1: User, u2: User) => {
if (u1.full_name && u2.full_name) return u1.full_name.toLowerCase() < u2.full_name.toLowerCase() ? -1 : +1;
if (u1.full_name && !u2.full_name) return +1;
if (!u1.full_name && u2.full_name) return -1;
return u1.email.toLowerCase() < u2.email.toLowerCase() ? -1 : +1;
});
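The name-then-email comparator above can be exercised on its own. A self-contained sketch with an illustrative `User` shape (reduced to just the fields the comparator reads); note that, as written, users without a full name sort before users that have one:

```typescript
// Minimal user shape for the sketch - only what the comparator reads.
interface User {
	full_name?: string;
	email: string;
}

// Same ordering as the route: by full name when both have one,
// otherwise nameless users first, falling back to email.
function compareUsers(u1: User, u2: User): number {
	if (u1.full_name && u2.full_name) return u1.full_name.toLowerCase() < u2.full_name.toLowerCase() ? -1 : +1;
	if (u1.full_name && !u2.full_name) return +1;
	if (!u1.full_name && u2.full_name) return -1;
	return u1.email.toLowerCase() < u2.email.toLowerCase() ? -1 : +1;
}

const users: User[] = [
	{ full_name: 'Bob', email: 'bob@example.com' },
	{ email: 'anon@example.com' },
	{ full_name: 'Alice', email: 'alice@example.com' },
];

// Sorts to: anon@example.com, Alice, Bob.
users.sort(compareUsers);
```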
const view: View = defaultView('users', 'Users');
view.content.users = users.map(user => {
return {
...user,
@@ -115,13 +122,14 @@ router.get('users/:id', async (path: SubPath, ctx: AppContext, user: User = null
postUrl = `${config().baseUrl}/users/${user.id}`;
}
const view: View = defaultView('user');
const view: View = defaultView('user', 'Profile');
view.content.user = user;
view.content.isNew = isNew;
view.content.buttonTitle = isNew ? 'Create user' : 'Update profile';
view.content.error = error;
view.content.postUrl = postUrl;
view.content.showDeleteButton = !isNew && !!owner.is_admin && owner.id !== user.id;
view.content.showResetPasswordButton = !isNew && owner.is_admin;
if (config().accountTypesEnabled) {
view.content.showAccountTypes = true;
@@ -145,7 +153,7 @@ router.get('users/:id/confirm', async (path: SubPath, ctx: AppContext, error: Er
if (user.must_set_password) {
const view: View = {
...defaultView('users/confirm'),
...defaultView('users/confirm', 'Confirmation'),
content: {
user,
error,
@@ -200,13 +208,20 @@ router.post('users/:id/confirm', async (path: SubPath, ctx: AppContext) => {
router.alias(HttpMethod.POST, 'users/:id', 'users');
interface FormFields {
id: Uuid;
post_button: string;
delete_button: string;
send_reset_password_email: string;
}
router.post('users', async (path: SubPath, ctx: AppContext) => {
let user: User = {};
const userId = userIsMe(path) ? ctx.owner.id : path.id;
try {
const body = await formParse(ctx.req);
const fields = body.fields;
const fields = body.fields as FormFields;
const isNew = userIsNew(path);
if (userIsMe(path)) fields.id = userId;
user = makeUser(isNew, fields);
@@ -226,6 +241,10 @@ router.post('users', async (path: SubPath, ctx: AppContext) => {
const user = await userModel.load(path.id);
await userModel.checkIfAllowed(ctx.owner, AclAction.Delete, user);
await userModel.delete(path.id);
} else if (fields.send_reset_password_email) {
const user = await userModel.load(path.id);
await userModel.save({ id: user.id, must_set_password: 1 });
await userModel.sendAccountConfirmationEmail(user);
} else {
throw new Error('Invalid form button');
}

View File

@@ -1,7 +1,9 @@
import { Routers } from '../utils/routeUtils';
import apiBatch from './api/batch';
import apiDebug from './api/debug';
import apiEvents from './api/events';
import apiBatchItems from './api/batch_items';
import apiItems from './api/items';
import apiPing from './api/ping';
import apiSessions from './api/sessions';
@@ -25,6 +27,8 @@ import indexPrivacy from './index/privacy';
import defaultRoute from './default';
const routes: Routers = {
'api/batch': apiBatch,
'api/batch_items': apiBatchItems,
'api/debug': apiDebug,
'api/events': apiEvents,
'api/items': apiItems,

View File

@@ -14,6 +14,7 @@ export interface RenderOptions {
export interface View {
name: string;
title: string;
path: string;
navbar?: boolean;
content?: any;
@@ -33,6 +34,7 @@ interface GlobalParams {
termsUrl?: string;
privacyUrl?: string;
showErrorStackTraces?: boolean;
userDisplayName?: string;
}
export function isView(o: any): boolean {
@@ -102,6 +104,13 @@ export default class MustacheService {
return output;
}
private userDisplayName(owner: User): string {
if (!owner) return '';
if (owner.full_name) return owner.full_name;
if (owner.email) return owner.email;
return '';
}
public async renderView(view: View, globalParams: GlobalParams = null): Promise<string> {
const cssFiles = this.resolvesFilePaths('css', view.cssFiles || []);
const jsFiles = this.resolvesFilePaths('js', view.jsFiles || []);
@@ -110,6 +119,7 @@ export default class MustacheService {
globalParams = {
...this.defaultLayoutOptions,
...globalParams,
userDisplayName: this.userDisplayName(globalParams ? globalParams.owner : null),
};
const contentHtml = Mustache.render(
@@ -124,6 +134,7 @@ export default class MustacheService {
const layoutView: any = {
global: globalParams,
pageName: view.name,
pageTitle: `${config().appName} - ${view.title}`,
contentHtml: contentHtml,
cssFiles: cssFiles,
jsFiles: jsFiles,

View File

@@ -1,11 +1,12 @@
import { View } from '../services/MustacheService';
// Populate a View object with some good defaults.
export default function(name: string): View {
export default function(name: string, title: string): View {
return {
name: name,
path: `index/${name}`,
content: {},
navbar: true,
title: title,
};
}

View File

@@ -196,6 +196,7 @@ async function renderNote(share: Share, note: NoteEntity, resourceInfos: Resourc
cssFiles: ['items/note'],
jsFiles: ['items/note'],
name: 'note',
title: 'Note',
path: 'index/items/note',
content: {
note: {

View File

@@ -186,7 +186,7 @@ export async function execRequest(routes: Routers, ctx: AppContext) {
if (!match) throw new ErrorNotFound();
const endPoint = match.route.findEndPoint(ctx.request.method as HttpMethod, match.subPath.schema);
if (ctx.URL && !isValidOrigin(ctx.URL.origin, baseUrl(endPoint.type), endPoint.type)) throw new ErrorNotFound('Invalid origin', 'invalidOrigin');
if (ctx.URL && !isValidOrigin(ctx.URL.origin, baseUrl(endPoint.type), endPoint.type)) throw new ErrorNotFound(`Invalid origin: ${ctx.URL.origin}`, 'invalidOrigin');
// This is a generic catch-all for all private end points - if we
// couldn't get a valid session, we exit now. Individual end points

View File

@@ -0,0 +1,7 @@
export function yesOrNo(value: any): string {
return value ? 'yes' : 'no';
}
export function nothing() {
return '';
}

View File

@@ -60,7 +60,8 @@ async function createItemTree3(sessionId: Uuid, userId: Uuid, parentFolderId: st
}
}
const newItem = await models().item().saveFromRawContent(user, `${jopItem.id}.md`, Buffer.from(serializedBody));
const result = await models().item().saveFromRawContent(user, [{ name: `${jopItem.id}.md`, body: Buffer.from(serializedBody) }]);
const newItem = result[`${jopItem.id}.md`].item;
if (isFolder && jopItem.children.length) await createItemTree3(sessionId, userId, newItem.jop_id, shareId, jopItem.children);
}
}

View File

@@ -275,19 +275,20 @@ export async function createItemTree(userId: Uuid, parentFolderId: string, tree:
}
}
export async function createItemTree2(userId: Uuid, parentFolderId: string, tree: any[]): Promise<void> {
const itemModel = models().item();
const user = await models().user().load(userId);
// export async function createItemTree2(userId: Uuid, parentFolderId: string, tree: any[]): Promise<void> {
// const itemModel = models().item();
// const user = await models().user().load(userId);
for (const jopItem of tree) {
const isFolder = !!jopItem.children;
const serializedBody = isFolder ?
makeFolderSerializedBody({ ...jopItem, parent_id: parentFolderId }) :
makeNoteSerializedBody({ ...jopItem, parent_id: parentFolderId });
const newItem = await itemModel.saveFromRawContent(user, `${jopItem.id}.md`, Buffer.from(serializedBody));
if (isFolder && jopItem.children.length) await createItemTree2(userId, newItem.jop_id, jopItem.children);
}
}
// for (const jopItem of tree) {
// const isFolder = !!jopItem.children;
// const serializedBody = isFolder ?
// makeFolderSerializedBody({ ...jopItem, parent_id: parentFolderId }) :
// makeNoteSerializedBody({ ...jopItem, parent_id: parentFolderId });
// const result = await itemModel.saveFromRawContent(user, [{ name: `${jopItem.id}.md`, body: Buffer.from(serializedBody) }]);
// const newItem = result[`${jopItem.id}.md`].item;
// if (isFolder && jopItem.children.length) await createItemTree2(userId, newItem.jop_id, jopItem.children);
// }
// }
export async function createItemTree3(userId: Uuid, parentFolderId: string, shareId: Uuid, tree: any[]): Promise<void> {
const itemModel = models().item();
@@ -298,7 +299,8 @@ export async function createItemTree3(userId: Uuid, parentFolderId: string, shar
const serializedBody = isFolder ?
makeFolderSerializedBody({ ...jopItem, parent_id: parentFolderId, share_id: shareId }) :
makeNoteSerializedBody({ ...jopItem, parent_id: parentFolderId, share_id: shareId });
const newItem = await itemModel.saveFromRawContent(user, `${jopItem.id}.md`, Buffer.from(serializedBody));
const result = await itemModel.saveFromRawContent(user, [{ name: `${jopItem.id}.md`, body: Buffer.from(serializedBody) }]);
const newItem = result[`${jopItem.id}.md`].item;
if (isFolder && jopItem.children.length) await createItemTree3(userId, newItem.jop_id, shareId, jopItem.children);
}
}

View File

@@ -11,3 +11,23 @@ export function msleep(ms: number) {
export function formatDateTime(ms: number): string {
return dayjs(ms).format('D MMM YY HH:mm:ss');
}
// Use the utility functions below to easily measure performance of a block or
// line of code.
interface PerfTimer {
name: string;
startTime: number;
}
const perfTimers_: PerfTimer[] = [];
export function timerPush(name: string) {
perfTimers_.push({ name, startTime: Date.now() });
}
export function timerPop() {
const t = perfTimers_.pop();
console.info(`Time: ${t.name}: ${Date.now() - t.startTime}`);
}
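The timer helpers above form a simple LIFO stack, so nested measurements pair up naturally. A self-contained sketch of the same pattern (re-implemented here so it runs standalone; returning the elapsed milliseconds from `timerPop()` is an addition for illustration, the original helper only logs it):

```typescript
interface PerfTimer {
	name: string;
	startTime: number;
}

const timers: PerfTimer[] = [];

function timerPush(name: string) {
	timers.push({ name, startTime: Date.now() });
}

// Pops the most recent timer, logs its elapsed time, and also returns
// it (the return value is an addition for this sketch).
function timerPop(): number {
	const t = timers.pop();
	if (!t) throw new Error('timerPop() without a matching timerPush()');
	const elapsed = Date.now() - t.startTime;
	console.info(`Time: ${t.name}: ${elapsed}`);
	return elapsed;
}

// Nested timers pop in reverse order of pushing.
timerPush('outer');
timerPush('inner');
timerPop(); // pops 'inner'
timerPop(); // pops 'outer'
```

Because the stack is module-level state, unbalanced push/pop calls would mismeasure or underflow, which is why the sketch guards the pop; for quick local profiling like the original utility, balanced calls around the code under test are all that is needed.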

View File

@@ -1 +1,17 @@
Welcome {{global.owner.email}}
<h1 class="title">Welcome to {{global.appName}}</h1>
<p class="subtitle">Logged in as <strong>{{global.userDisplayName}}</strong></p>
<table class="table is-hoverable">
<tbody>
{{#userProps}}
<tr>
<td>
<strong>{{label}}</strong>
</td>
<td>
{{value}}
</td>
</tr>
{{/userProps}}
</tbody>
</table>

View File

@@ -59,6 +59,9 @@
</div>
<div class="control">
<input type="submit" name="post_button" class="button is-primary" value="{{buttonTitle}}" />
{{#showResetPasswordButton}}
<input type="submit" name="send_reset_password_email" class="button is-warning" value="Send reset password email" />
{{/showResetPasswordButton}}
{{#showDeleteButton}}
<input type="submit" name="delete_button" class="button is-danger" value="Delete" />
{{/showDeleteButton}}

View File

@@ -1,6 +1,7 @@
<!doctype html>
<html>
<head>
<title>{{pageTitle}}</title>
<meta charset="utf-8">
<link rel="stylesheet" href="{{{global.baseUrl}}}/css/bulma.min.css" crossorigin="anonymous">
{{#global.prefersDarkEnabled}}

View File

@@ -18,8 +18,7 @@
<a class="navbar-item" href="{{{global.baseUrl}}}/changes">Log</a>
</div>
<div class="navbar-end">
<div class="navbar-item">{{global.owner.email}}</div>
<a class="navbar-item" href="{{{global.baseUrl}}}/users/me">Profile</a>
<a class="navbar-item" href="{{{global.baseUrl}}}/users/me">{{global.userDisplayName}}</a>
<div class="navbar-item">
<form method="post" action="{{{global.baseUrl}}}/logout">
<button class="button is-primary">Logout</button>

View File

@@ -1,6 +1,6 @@
{
"name": "@joplin/tools",
"version": "2.0.2",
"version": "2.0.3",
"lockfileVersion": 1,
"requires": true,
"dependencies": {

View File

@@ -1,6 +1,6 @@
{
"name": "@joplin/tools",
"version": "2.0.2",
"version": "2.1.0",
"description": "Various tools for Joplin",
"main": "index.js",
"author": "Laurent Cozic",

View File

@@ -1,6 +1,6 @@
{
"name": "@joplin/turndown-plugin-gfm",
"version": "1.0.30",
"version": "1.0.31",
"lockfileVersion": 1,
"requires": true,
"dependencies": {

View File

@@ -4,7 +4,7 @@
"publishConfig": {
"access": "public"
},
"version": "1.0.30",
"version": "1.0.31",
"author": "Dom Christie",
"main": "lib/turndown-plugin-gfm.cjs.js",
"module": "lib/turndown-plugin-gfm.es.js",

View File

@@ -1,6 +1,6 @@
{
"name": "@joplin/turndown",
"version": "4.0.48",
"version": "4.0.49",
"lockfileVersion": 1,
"requires": true,
"dependencies": {

View File

@@ -1,7 +1,7 @@
{
"name": "@joplin/turndown",
"description": "A library that converts HTML to Markdown",
"version": "4.0.48",
"version": "4.0.49",
"author": "Dom Christie",
"main": "lib/turndown.cjs.js",
"module": "lib/turndown.es.js",

View File

@@ -1,5 +1,19 @@
# Joplin terminal app changelog
## [cli-v2.0.1](https://github.com/laurent22/joplin/releases/tag/cli-v2.0.1) - 2021-06-16T19:06:28Z
- New: Add new date format YYMMDD (#4954 by Helmut K. C. Tessarek)
- New: Add support for sharing notebooks with Joplin Server (#4772)
- Improved: Allow setting up E2EE without having to confirm the password (c5b0529)
- Improved: Conflict notes will now populate a new field with the ID of the conflict note. (#5049 by [@Ahmad45123](https://github.com/Ahmad45123))
- Improved: Import SVG as images when importing ENEX files (#4968)
- Improved: Improve search with Asian scripts (#5018) (#4613 by [@mablin7](https://github.com/mablin7))
- Improved: Prevent sync process from being stuck when the download state of a resource is invalid (5c6fd93)
- Fixed: Fixed possible crash when trying to delete corrupted revision in revision service (#4845)
- Fixed: Fixed user content URLs when sharing note via Joplin Server (2cf7067)
- Fixed: Improved importing Evernote notes that contain codeblocks (#4965)
- Fixed: Items are filtered in the API search (#5017) (#5007 by [@JackGruber](https://github.com/JackGruber))
## [cli-v1.8.1](https://github.com/laurent22/joplin/releases/tag/cli-v1.8.1) - 2021-05-10T09:38:05Z
- New: Add "id" and "due" search filters (#4898 by [@JackGruber](https://github.com/JackGruber))

View File

@@ -1,5 +1,18 @@
# Joplin Server Changelog
## [server-v2.1.1](https://github.com/laurent22/joplin/releases/tag/server-v2.1.1) - 2021-06-17T17:27:29Z
- New: Added account info to dashboard and title to pages (7f0b3fd)
- New: Added way to batch requests (currently disabled) (c682c88)
- New: Added way to debug slow queries (e853244)
- Improved: Hide Reset Password button when creating new users (ac03c08)
- Improved: Sort users by name, then email (65c3d01)
## [server-v2.0.14](https://github.com/laurent22/joplin/releases/tag/server-v2.0.14) - 2021-06-17T08:52:26Z
- Improved: Allow sending reset password email from admin UI (479237d)
- Improved: Tokens would expire too soon (6ae0e84)
## [server-v2.0.13](https://github.com/laurent22/joplin/releases/tag/server-v2.0.13) - 2021-06-16T14:28:20Z
- Improved: Allow creating a new user with no password, which must be set via email confirmation (1896549)