Merge branch 'master' into bwindels/typescript-observable-2

This commit is contained in:
Bruno Windels 2021-11-30 16:37:43 +01:00
commit fc3eb7f57f
127 changed files with 3931 additions and 1632 deletions

44
.github/workflows/docker-publish.yml vendored Normal file

@ -0,0 +1,44 @@
name: Container Image
on:
push:
branches: [ master ]
tags: [ 'v*' ]
pull_request:
branches: [ master ]
env:
IMAGE_NAME: ${{ github.repository }}
REGISTRY: ghcr.io
jobs:
push:
runs-on: ubuntu-latest
permissions:
contents: read
packages: write
steps:
- name: Checkout repository
uses: actions/checkout@v2
- name: Log into registry ${{ env.REGISTRY }}
uses: docker/login-action@v1
with:
registry: ${{ env.REGISTRY }}
username: ${{ github.actor }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: Extract Docker metadata
id: meta
uses: docker/metadata-action@v3
with:
images: ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}
- name: Build and push Docker image
uses: docker/build-push-action@v2
with:
push: ${{ github.event_name != 'pull_request' }}
tags: ${{ steps.meta.outputs.tags }}
labels: ${{ steps.meta.outputs.labels }}


@ -28,7 +28,7 @@ You can only verify by comparing keys manually currently. In Element, go to your
## I want to host my own Hydrogen, how do I do that?
There are no published builds at this point. You need to checkout the version you want to build, or master if you want to run bleeding edge, and run `yarn install` and then `yarn build` in a console (and install nodejs > 14 and yarn if you haven't yet). Now you should find all the files needed to host Hydrogen in the `target/` folder, just copy them all over to your server. As always, don't host your client on the same [origin](https://web.dev/same-origin-policy/#what's-considered-same-origin) as your homeserver.
Published builds can be found at https://github.com/vector-im/hydrogen-web/releases. For building your own, you need to checkout the version you want to build, or master if you want to run bleeding edge, and run `yarn install` and then `yarn build` in a console (and install nodejs > 14 and yarn if you haven't yet). Now you should find all the files needed to host Hydrogen in the `target/` folder, just copy them all over to your server. As always, don't host your client on the same [origin](https://web.dev/same-origin-policy/#what's-considered-same-origin) as your homeserver.
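For reference, a minimal sketch of those steps for a bleeding-edge build (assuming git, nodejs >= 14 and yarn are already installed):
```sh
git clone https://github.com/vector-im/hydrogen-web.git
cd hydrogen-web
git checkout master   # or the tag of the release you want to build
yarn install
yarn build
# target/ now contains the files to copy over to your server
```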
## I want to embed Hydrogen in my website, how should I do that?


@ -2,6 +2,8 @@
If you want to use end-to-end encryption, it is recommended to use a [supported build system](../src/sdk/paths/) (currently only vite) to be able to locate the olm library files.
**NOTE**: For now, these instructions will only work at development time; support when building (e.g. `vite build`) is being worked on and tracked in [#529](https://github.com/vector-im/hydrogen-web/issues/529).
You can create a project using the following commands:
```sh


@ -1,7 +1,9 @@
# Typescript migration
# Typescript style guide
## Introduce `abstract` & `override`
## Use `type` rather than `interface` for named parameters and POJO return values.
- find all methods and getters that throw or are empty in base classes and turn them into abstract methods, or, if all methods are abstract, into an interface.
- change child impls to not call super.method and to add override
- don't allow implicit override in ts config
`type` and `interface` can be used somewhat interchangeably, but let's use `type` to describe data and `interface` to describe (polymorphic) behaviour.
Good examples of data are option objects used for named parameters, and POJOs (plain old javascript objects) without any methods, just fields.
Also see [this playground](https://www.typescriptlang.org/play?#code/C4TwDgpgBACghgJwgO2AeTMAlge2QZygF4oBvAKCiqmTgFsIAuKfYBLZAcwG5LqATCABs4IAPzNkAVzoAjCAl4BfcuVCQoAYQAWWIfwzY8hEvCSpDuAlABkZPlQDGOITgTNW7LstWOR+QjMUYHtqKGcCNilHYDcAChxMK3xmIIsk4wBKewcoFRVyPzgArV19KAgAD2AUfkDEYNDqCM9o2IQEjIJmHT0DLvxsijCw-ClIDsSjAkzeEebjEIYAuE5oEgADABJSKeSAOloGJSgsQh29433nVwQlDbnqfKA)
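As an illustrative sketch of this rule (the names below are made up, not taken from the codebase):
```ts
// data: an options object giving named parameters, a POJO with only fields
type SendOptions = {
    encrypt?: boolean;
    retries?: number;
};

// behaviour: something polymorphic that several classes can implement
interface Sender {
    send(body: string, options?: SendOptions): Promise<void>;
}
```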

97
doc/impl-thoughts/SDK.md Normal file

@ -0,0 +1,97 @@
SDK:
- we need to compile src/lib.ts to javascript, with a d.ts file generated as well. We need to compile to javascript once for cjs and once for es modules. The package.json looks like this:
we don't need to bundle for the sdk case! we might need to do some transpilation to just plain ES6 (e.g. don't assume ?. and ??); we could use a browserslist query for this, e.g. `node 14`. esbuild seems to support this as well; tldraw uses esbuild for their build.
one advantage of not bundling the files for the sdk is that you can still use import overrides in the consuming project build settings. is that an idiomatic way of doing things though?
```
"main": "./dist/index.cjs",
"exports": {
"import": "./dist/index.mjs",
"require": "./dist/index.cjs"
},
"types": "dist/index.d.ts",
```
this way we will support typescript, non-esm javascript and esm javascript using libhydrogen as an SDK
got this from https://medium.com/dazn-tech/publishing-npm-packages-as-native-es-modules-41ffbc0a9dea
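with that setup a consumer could then do either of the following (the `Platform` export is an assumption here; the package name is taken from the import examples further down):
```js
// from an ES module or typescript (resolves to dist/index.mjs, with types from dist/index.d.ts):
import {Platform} from "@matrix-org/hydrogen-sdk";

// or from CommonJS (resolves to dist/index.cjs):
// const {Platform} = require("@matrix-org/hydrogen-sdk");
```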
how about the assets?
we also need to build the app
we need to be able to version libhydrogen independently from hydrogen the app, as any api breaking changes will need a major version increase. we probably want to end up with a monorepo where the app uses the sdk as well and we just use the local code with yarn link?
## Assets
we want to provide scss/sass files, but also css that can be included
https://github.com/webpack/webpack/issues/7353 seems to imply that we just need to include the assets in the published files and from there on it is the consumer of libhydrogen's problem.
how does all of this tie in with vite?
we want to have hydrogenapp be a consumer of libhydrogen, potentially as two packages in a monorepo ... but we want the SDK to expose views and stylesheets... without having an index.html (which would be in hydrogenapp). this seems a bit odd...?
what would be in hydrogenapp actually? just an index.html file?
I'm not sure it makes sense to have them be 2 different packages in a monorepo, they should really be two artifacts from the same directory.
the stylesheets included in libhydrogen are from the same main.css file as is used in the app
https://www.freecodecamp.org/news/build-a-css-library-with-vitejs/
basically, we import the sass file from src/lib.ts so it is included in the assets there too, and we also create a plugin that emits a file for every sass file as suggested in the link above?
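a rough sketch of what such a plugin could look like (hypothetical, not the actual implementation; the plugin name and the use of the `sass` package here are assumptions):
```js
import {compile} from "sass";

// emits a standalone css asset for every sass entry passed in,
// in addition to whatever the importing bundle already contains
export function emitCssPerSassEntry(sassEntries) {
    return {
        name: "emit-css-per-sass-entry",
        generateBundle() {
            for (const entry of sassEntries) {
                this.emitFile({
                    type: "asset",
                    fileName: entry.replace(/\.s[ac]ss$/, ".css"),
                    source: compile(entry).css,
                });
            }
        }
    };
}
```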
we probably want two different build commands for the app and the sdk though, we could have a parent vite config that both build configs extend from?
### Dependency assets
our dependencies should not be bundled for the SDK case. So if we import aesjs, it would be up to the build system of the consuming project to make that import work.
the paths.ts thingy ... we want to make it easy for people to set up the assets for our dependencies (olm), some assets are also part of the sdk itself. it might make sense to make all of the assets there part of the sdk (e.g. bundle olm.wasm and friends?) although shipping crypto, etc ...
perhaps we should have an include file per build system that treats own assets and dep assets the same by including the package name as well for our own deps:
```js
import _downloadSandboxPath from "@matrix-org/hydrogen-sdk/download-sandbox.html?url";
import _serviceWorkerPath from "@matrix-org/hydrogen-sdk/sw.js?url"; // not yet sure this is the way to do it
import olmWasmPath from "@matrix-org/olm/olm.wasm?url";
import olmJsPath from "@matrix-org/olm/olm.js?url";
import olmLegacyJsPath from "@matrix-org/olm/olm_legacy.js?url";
export const olmPaths = {
wasm: olmWasmPath,
legacyBundle: olmLegacyJsPath,
wasmBundle: olmJsPath,
};
export const downloadSandboxPath = _downloadSandboxPath;
```
we could put this file per build system, as ESM, in dist as well so you can include it to get the paths
## Tooling
- `vite`: a more high-level build tool that takes your index.html and turns it into optimized assets that you can host for production, as well as a very fast dev server. It gives us good default settings for our tools, typescript support, and also deals with asset compiling. Would be nice to have the same tool for dev and prod. vite has good support for using `import` for anything that is not javascript, where we had an issue with `snowpack` (to get the prod path of an asset).
- `rollup`: inlines ES modules into bundles (used by `vite` for production builds)
- `lerna` is used to handle multi-package monorepos
- `esbuild`: a js/ts build tool that we could use for building the lower level sdk where no other assets are involved, `vite` uses it for fast dev builds (`rollup` for prod). For now we won't extract a lower level sdk though.
## TODO
- finish vite app build (without IE11 for now?)
- create vite config to build src/lib.ts in cjs and esm, inheriting from a common base config with the app config
- this will create a dist folder with
- the whole source tree in es and cjs format
an es file to import to get the asset paths as they are expected by Platform, per build system
- assets from hydrogen itself:
- css files and any resource used therein
- download-sandbox.html
- a type declaration file (index.d.ts)


@ -1,6 +1,6 @@
{
"name": "hydrogen-web",
"version": "0.2.16",
"version": "0.2.22",
"description": "A javascript matrix client prototype, trying to minize RAM usage by offloading as much as possible to IndexedDB",
"main": "src/lib.ts",
"directories": {
@ -10,7 +10,7 @@
"lint": "eslint --cache src/",
"lint-ts": "eslint src/ -c .ts-eslintrc.js --ext .ts",
"lint-ci": "eslint src/",
"test": "impunity --entry-point src/main.js --force-esm-dirs lib/ src/",
"test": "impunity --entry-point src/main.js src/platform/web/Platform.js --force-esm-dirs lib/ src/",
"start": "snowpack dev --port 3000",
"build": "node --experimental-modules scripts/build.mjs",
"postinstall": "node ./scripts/post-install.js"
@ -39,9 +39,8 @@
"eslint": "^7.32.0",
"fake-indexeddb": "^3.1.2",
"finalhandler": "^1.1.1",
"impunity": "^1.0.1",
"impunity": "^1.0.9",
"mdn-polyfills": "^5.20.0",
"node-html-parser": "^4.0.0",
"postcss": "^8.1.1",
"postcss-css-variables": "^0.17.0",
"postcss-flexbugs-fixes": "^4.2.1",
@ -56,16 +55,17 @@
},
"dependencies": {
"@matrix-org/olm": "https://gitlab.matrix.org/api/v4/projects/27/packages/npm/@matrix-org/olm/-/@matrix-org/olm-3.2.3.tgz",
"@rollup/plugin-commonjs": "^15.0.0",
"@rollup/plugin-json": "^4.1.0",
"@rollup/plugin-node-resolve": "^9.0.0",
"aes-js": "^3.1.2",
"another-json": "^0.2.0",
"base64-arraybuffer": "^0.2.0",
"bs58": "^4.0.1",
"dompurify": "^2.3.0",
"es6-promise": "https://github.com/bwindels/es6-promise.git#bwindels/expose-flush",
"text-encoding": "^0.7.0",
"@rollup/plugin-commonjs": "^15.0.0",
"@rollup/plugin-json": "^4.1.0",
"@rollup/plugin-node-resolve": "^9.0.0",
"rollup": "^2.26.4"
"node-html-parser": "^4.0.0",
"rollup": "^2.26.4",
"text-encoding": "^0.7.0"
}
}


@ -0,0 +1,136 @@
/*
Copyright 2021 The Matrix.org Foundation C.I.C.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
*/
import {ViewModel} from "./ViewModel.js";
import {KeyType} from "../matrix/ssss/index.js";
import {Status} from "./session/settings/SessionBackupViewModel.js";
export class AccountSetupViewModel extends ViewModel {
constructor(accountSetup) {
super();
this._accountSetup = accountSetup;
this._dehydratedDevice = undefined;
this._decryptDehydratedDeviceViewModel = undefined;
if (this._accountSetup.encryptedDehydratedDevice) {
this._decryptDehydratedDeviceViewModel = new DecryptDehydratedDeviceViewModel(this, dehydratedDevice => {
this._dehydratedDevice = dehydratedDevice;
this._decryptDehydratedDeviceViewModel = undefined;
this.emitChange("deviceDecrypted");
});
}
}
get decryptDehydratedDeviceViewModel() {
return this._decryptDehydratedDeviceViewModel;
}
get deviceDecrypted() {
return !!this._dehydratedDevice;
}
get dehydratedDeviceId() {
return this._accountSetup.encryptedDehydratedDevice.deviceId;
}
finish() {
this._accountSetup.finish(this._dehydratedDevice);
}
}
// this vm adopts the same shape as SessionBackupViewModel so the same view can be reused.
class DecryptDehydratedDeviceViewModel extends ViewModel {
constructor(accountSetupViewModel, decryptedCallback) {
super();
this._accountSetupViewModel = accountSetupViewModel;
this._isBusy = false;
this._status = Status.SetupKey;
this._error = undefined;
this._decryptedCallback = decryptedCallback;
}
get decryptAction() {
return this.i18n`Restore`;
}
get purpose() {
return this.i18n`claim your dehydrated device`;
}
get offerDehydratedDeviceSetup() {
return false;
}
get dehydratedDeviceId() {
return this._accountSetupViewModel._dehydratedDevice?.deviceId;
}
get isBusy() {
return this._isBusy;
}
get backupVersion() { return 0; }
get status() {
return this._status;
}
get error() {
return this._error?.message;
}
showPhraseSetup() {
if (this._status === Status.SetupKey) {
this._status = Status.SetupPhrase;
this.emitChange("status");
}
}
showKeySetup() {
if (this._status === Status.SetupPhrase) {
this._status = Status.SetupKey;
this.emitChange("status");
}
}
async _enterCredentials(keyType, credential) {
if (credential) {
try {
this._isBusy = true;
this.emitChange("isBusy");
const {encryptedDehydratedDevice} = this._accountSetupViewModel._accountSetup;
const dehydratedDevice = await encryptedDehydratedDevice.decrypt(keyType, credential);
this._decryptedCallback(dehydratedDevice);
} catch (err) {
console.error(err);
this._error = err;
this.emitChange("error");
} finally {
this._isBusy = false;
this.emitChange("");
}
}
}
enterSecurityPhrase(passphrase) {
this._enterCredentials(KeyType.Passphrase, passphrase);
}
enterSecurityKey(securityKey) {
this._enterCredentials(KeyType.RecoveryKey, securityKey);
}
disable() {}
}


@ -14,6 +14,7 @@ See the License for the specific language governing permissions and
limitations under the License.
*/
import {AccountSetupViewModel} from "./AccountSetupViewModel.js";
import {LoadStatus} from "../matrix/SessionContainer.js";
import {SyncStatus} from "../matrix/Sync.js";
import {ViewModel} from "./ViewModel.js";
@ -29,6 +30,8 @@ export class SessionLoadViewModel extends ViewModel {
this._loading = false;
this._error = null;
this.backUrl = this.urlCreator.urlForSegment("session", true);
this._accountSetupViewModel = undefined;
}
async start() {
@ -39,6 +42,11 @@ export class SessionLoadViewModel extends ViewModel {
this._loading = true;
this.emitChange("loading");
this._waitHandle = this._sessionContainer.loadStatus.waitFor(s => {
if (s === LoadStatus.AccountSetup) {
this._accountSetupViewModel = new AccountSetupViewModel(this._sessionContainer.accountSetup);
} else {
this._accountSetupViewModel = undefined;
}
this.emitChange("loadLabel");
// wait for initial sync, but not catchup sync
const isCatchupSync = s === LoadStatus.FirstSync &&
@ -97,13 +105,16 @@ export class SessionLoadViewModel extends ViewModel {
// to show a spinner or not
get loading() {
const sc = this._sessionContainer;
if (sc && sc.loadStatus.get() === LoadStatus.AccountSetup) {
return false;
}
return this._loading;
}
get loadLabel() {
const sc = this._sessionContainer;
const error = this._error || (sc && sc.loadError);
const error = this._getError();
if (error || (sc && sc.loadStatus.get() === LoadStatus.Error)) {
return `Something went wrong: ${error && error.message}.`;
}
@ -111,6 +122,10 @@ export class SessionLoadViewModel extends ViewModel {
// Statuses related to login are handled by respective login view models
if (sc) {
switch (sc.loadStatus.get()) {
case LoadStatus.QueryAccount:
return `Querying account encryption setup…`;
case LoadStatus.AccountSetup:
return ""; // we'll show a header in the AccountSetupView
case LoadStatus.SessionSetup:
return `Setting up your encryption keys…`;
case LoadStatus.Loading:
@ -124,4 +139,26 @@ export class SessionLoadViewModel extends ViewModel {
return `Preparing…`;
}
_getError() {
return this._error || this._sessionContainer?.loadError;
}
get hasError() {
return !!this._getError();
}
async exportLogs() {
const logExport = await this.logger.export();
this.platform.saveFileAs(logExport.asBlob(), `hydrogen-logs-${this.platform.clock.now()}.json`);
}
async logout() {
await this._sessionContainer.logout();
this.navigation.push("session", true);
}
get accountSetupViewModel() {
return this._accountSetupViewModel;
}
}


@ -33,44 +33,6 @@ class SessionItemViewModel extends ViewModel {
return this._error && this._error.message;
}
async delete() {
this._isDeleting = true;
this.emitChange("isDeleting");
try {
await this._pickerVM.delete(this.id);
} catch(err) {
this._error = err;
console.error(err);
this.emitChange("error");
} finally {
this._isDeleting = false;
this.emitChange("isDeleting");
}
}
async clear() {
this._isClearing = true;
this.emitChange();
try {
await this._pickerVM.clear(this.id);
} catch(err) {
this._error = err;
console.error(err);
this.emitChange("error");
} finally {
this._isClearing = false;
this.emitChange("isClearing");
}
}
get isDeleting() {
return this._isDeleting;
}
get isClearing() {
return this._isClearing;
}
get id() {
return this._sessionInfo.id;
}
@ -96,27 +58,6 @@ class SessionItemViewModel extends ViewModel {
return this._exportDataUrl;
}
async export() {
try {
const data = await this._pickerVM._exportData(this._sessionInfo.id);
const json = JSON.stringify(data, undefined, 2);
const blob = new Blob([json], {type: "application/json"});
this._exportDataUrl = URL.createObjectURL(blob);
this.emitChange("exportDataUrl");
} catch (err) {
alert(err.message);
console.error(err);
}
}
clearExport() {
if (this._exportDataUrl) {
URL.revokeObjectURL(this._exportDataUrl);
this._exportDataUrl = null;
this.emitChange("exportDataUrl");
}
}
get avatarColorNumber() {
return getIdentifierColorNumber(this._sessionInfo.userId);
}
@ -148,43 +89,6 @@ export class SessionPickerViewModel extends ViewModel {
return this._loadViewModel;
}
async _exportData(id) {
const sessionInfo = await this.platform.sessionInfoStorage.get(id);
const stores = await this.logger.run("export", log => {
return this.platform.storageFactory.export(id, log);
});
const data = {sessionInfo, stores};
return data;
}
async import(json) {
try {
const data = JSON.parse(json);
const {sessionInfo} = data;
sessionInfo.comment = `Imported on ${new Date().toLocaleString()} from id ${sessionInfo.id}.`;
sessionInfo.id = this._createSessionContainer().createNewSessionId();
await this.logger.run("import", log => {
return this.platform.storageFactory.import(sessionInfo.id, data.stores, log);
});
await this.platform.sessionInfoStorage.add(sessionInfo);
this._sessions.set(new SessionItemViewModel(sessionInfo, this));
} catch (err) {
alert(err.message);
console.error(err);
}
}
async delete(id) {
const idx = this._sessions.array.findIndex(s => s.id === id);
await this.platform.sessionInfoStorage.delete(id);
await this.platform.storageFactory.delete(id);
this._sessions.remove(idx);
}
async clear(id) {
await this.platform.storageFactory.delete(id);
}
get sessions() {
return this._sessions;
}


@ -19,7 +19,7 @@ limitations under the License.
// we do need to return a disposable from EventEmitter.on, or at least have a method here to easily track a subscription to an EventEmitter
import {EventEmitter} from "../utils/EventEmitter";
import {Disposables} from "../utils/Disposables.js";
import {Disposables} from "../utils/Disposables";
export class ViewModel extends EventEmitter {
constructor(options = {}) {


@ -107,7 +107,7 @@ export class LoginViewModel extends ViewModel {
async attemptLogin(loginMethod) {
this._setBusy(true);
this._sessionContainer.startWithLogin(loginMethod);
this._sessionContainer.startWithLogin(loginMethod, {inspectAccountSetup: true});
const loadStatus = this._sessionContainer.loadStatus;
const handle = loadStatus.waitFor(status => status !== LoadStatus.Login);
await handle.promise;


@ -15,7 +15,7 @@ limitations under the License.
*/
import {ViewModel} from "../ViewModel.js";
import {createEnum} from "../../utils/enum.js";
import {createEnum} from "../../utils/enum";
import {ConnectionStatus} from "../../matrix/net/Reconnector.js";
import {SyncStatus} from "../../matrix/Sync.js";


@ -230,7 +230,7 @@ export class SessionViewModel extends ViewModel {
}
if (settingsOpen) {
this._settingsViewModel = this.track(new SettingsViewModel(this.childOptions({
session: this._sessionContainer.session,
sessionContainer: this._sessionContainer,
})));
this._settingsViewModel.load();
}


@ -184,7 +184,7 @@ import {Clock as MockClock} from "../../../../mocks/Clock.js";
import {createMockStorage} from "../../../../mocks/Storage";
import {ListObserver} from "../../../../mocks/ListObserver.js";
import {createEvent, withTextBody, withContent} from "../../../../mocks/event.js";
import {NullLogItem, NullLogger} from "../../../../logging/NullLogger.js";
import {NullLogItem, NullLogger} from "../../../../logging/NullLogger";
import {HomeServer as MockHomeServer} from "../../../../mocks/HomeServer.js";
// other imports
import {BaseMessageTile} from "./tiles/BaseMessageTile.js";


@ -15,7 +15,7 @@ limitations under the License.
*/
import {BaseObservableList} from "../../../../observable/list/BaseObservableList";
import {sortedIndex} from "../../../../utils/sortedIndex.js";
import {sortedIndex} from "../../../../utils/sortedIndex";
// maps 1..n entries to 0..1 tile. Entries are what is stored in the timeline, either an event or fragmentboundary
// for now, tileCreator should be stable in whether it returns a tile or not.


@ -16,7 +16,7 @@ limitations under the License.
import {BaseMessageTile} from "./BaseMessageTile.js";
import {stringAsBody} from "../MessageBody.js";
import {createEnum} from "../../../../../utils/enum.js";
import {createEnum} from "../../../../../utils/enum";
export const BodyFormat = createEnum("Plain", "Html");


@ -16,7 +16,7 @@ limitations under the License.
*/
import {BaseMessageTile} from "./BaseMessageTile.js";
import {formatSize} from "../../../../../utils/formatSize.js";
import {formatSize} from "../../../../../utils/formatSize";
import {SendStatus} from "../../../../../matrix/room/sending/PendingEvent.js";
export class FileTile extends BaseMessageTile {


@ -23,6 +23,7 @@ export class GapTile extends SimpleTile {
this._loading = false;
this._error = null;
this._isAtTop = true;
this._siblingChanged = false;
}
async fill() {
@ -42,11 +43,21 @@ export class GapTile extends SimpleTile {
this._loading = false;
this.emitChange("isLoading");
}
return true;
}
return false;
}
notifyVisible() {
this.fill();
async notifyVisible() {
// we do (up to 10) backfills while no new tiles have been added to the timeline
// because notifyVisible won't be called again until something gets added to the timeline
let depth = 0;
let canFillMore;
this._siblingChanged = false;
do {
canFillMore = await this.fill();
depth = depth + 1;
} while (depth < 10 && !this._siblingChanged && canFillMore && !this.isDisposed);
}
get isAtTop() {
@ -54,14 +65,20 @@ export class GapTile extends SimpleTile {
}
updatePreviousSibling(prev) {
console.log("GapTile.updatePreviousSibling", prev);
super.updatePreviousSibling(prev);
const isAtTop = !prev;
if (this._isAtTop !== isAtTop) {
this._isAtTop = isAtTop;
console.log("isAtTop", this._isAtTop);
this.emitChange("isAtTop");
}
this._siblingChanged = true;
}
updateNextSibling() {
// if the sibling of the gap changed while calling room.fill(),
// we interpret this as at least one new tile has been added to
// the timeline. See notifyVisible why this is important.
this._siblingChanged = true;
}
updateEntry(entry, params) {


@ -65,6 +65,9 @@ export function tilesCreator(baseOptions) {
case "m.room.member":
return new RoomMemberTile(options);
case "m.room.encrypted":
if (entry.isRedacted) {
return new RedactedTile(options);
}
return new EncryptedEventTile(options);
case "m.room.encryption":
return new EncryptionEnabledTile(options);


@ -15,19 +15,61 @@ limitations under the License.
*/
import {ViewModel} from "../../ViewModel.js";
import {KeyType} from "../../../matrix/ssss/index.js";
import {createEnum} from "../../../utils/enum";
export const Status = createEnum("Enabled", "SetupKey", "SetupPhrase", "Pending");
export class SessionBackupViewModel extends ViewModel {
constructor(options) {
super(options);
this._session = options.session;
this._showKeySetup = true;
this._error = null;
this._isBusy = false;
this._dehydratedDeviceId = undefined;
this._status = undefined;
this._reevaluateStatus();
this.track(this._session.hasSecretStorageKey.subscribe(() => {
this.emitChange("status");
if (this._reevaluateStatus()) {
this.emitChange("status");
}
}));
}
_reevaluateStatus() {
if (this._isBusy) {
return false;
}
let status;
const hasSecretStorageKey = this._session.hasSecretStorageKey.get();
if (hasSecretStorageKey === true) {
status = this._session.sessionBackup ? Status.Enabled : Status.SetupKey;
} else if (hasSecretStorageKey === false) {
status = Status.SetupKey;
} else {
status = Status.Pending;
}
const changed = status !== this._status;
this._status = status;
return changed;
}
get decryptAction() {
return this.i18n`Set up`;
}
get purpose() {
return this.i18n`set up session backup`;
}
offerDehydratedDeviceSetup() {
return true;
}
get dehydratedDeviceId() {
return this._dehydratedDeviceId;
}
get isBusy() {
return this._isBusy;
}
@ -37,15 +79,7 @@ export class SessionBackupViewModel extends ViewModel {
}
get status() {
if (this._session.sessionBackup) {
return "enabled";
} else {
if (this._session.hasSecretStorageKey.get() === false) {
return this._showKeySetup ? "setupKey" : "setupPhrase";
} else {
return "pending";
}
}
return this._status;
}
get error() {
@ -53,46 +87,61 @@ export class SessionBackupViewModel extends ViewModel {
}
showPhraseSetup() {
this._showKeySetup = false;
this.emitChange("status");
if (this._status === Status.SetupKey) {
this._status = Status.SetupPhrase;
this.emitChange("status");
}
}
showKeySetup() {
this._showKeySetup = true;
this.emitChange("status");
if (this._status === Status.SetupPhrase) {
this._status = Status.SetupKey;
this.emitChange("status");
}
}
async enterSecurityPhrase(passphrase) {
if (passphrase) {
async _enterCredentials(keyType, credential, setupDehydratedDevice) {
if (credential) {
try {
this._isBusy = true;
this.emitChange("isBusy");
await this._session.enableSecretStorage("phrase", passphrase);
const key = await this._session.enableSecretStorage(keyType, credential);
if (setupDehydratedDevice) {
this._dehydratedDeviceId = await this._session.setupDehydratedDevice(key);
}
} catch (err) {
console.error(err);
this._error = err;
this.emitChange("error");
} finally {
this._isBusy = false;
this._reevaluateStatus();
this.emitChange("");
}
}
}
async enterSecurityKey(securityKey) {
if (securityKey) {
try {
this._isBusy = true;
this.emitChange("isBusy");
await this._session.enableSecretStorage("key", securityKey);
} catch (err) {
console.error(err);
this._error = err;
this.emitChange("error");
} finally {
this._isBusy = false;
this.emitChange("");
}
enterSecurityPhrase(passphrase, setupDehydratedDevice) {
this._enterCredentials(KeyType.Passphrase, passphrase, setupDehydratedDevice);
}
enterSecurityKey(securityKey, setupDehydratedDevice) {
this._enterCredentials(KeyType.RecoveryKey, securityKey, setupDehydratedDevice);
}
async disable() {
try {
this._isBusy = true;
this.emitChange("isBusy");
await this._session.disableSecretStorage();
} catch (err) {
console.error(err);
this._error = err;
this.emitChange("error");
} finally {
this._isBusy = false;
this._reevaluateStatus();
this.emitChange("");
}
}
}


@ -41,18 +41,31 @@ export class SettingsViewModel extends ViewModel {
constructor(options) {
super(options);
this._updateService = options.updateService;
const session = options.session;
this._session = session;
this._sessionBackupViewModel = this.track(new SessionBackupViewModel(this.childOptions({session})));
const {sessionContainer} = options;
this._sessionContainer = sessionContainer;
this._sessionBackupViewModel = this.track(new SessionBackupViewModel(this.childOptions({session: this._session})));
this._closeUrl = this.urlCreator.urlUntilSegment("session");
this._estimate = null;
this.sentImageSizeLimit = null;
this.minSentImageSizeLimit = 400;
this.maxSentImageSizeLimit = 4000;
this.pushNotifications = new PushNotificationStatus();
this._isLoggingOut = false;
}
get _session() {
return this._sessionContainer.session;
}
async logout() {
this._isLoggingOut = true;
await this._sessionContainer.logout();
this.emitChange("isLoggingOut");
this.navigation.push("session", true);
}
get isLoggingOut() { return this._isLoggingOut; }
setSentImageSizeLimit(size) {
if (size > this.maxSentImageSizeLimit || size < this.minSentImageSizeLimit) {
this.sentImageSizeLimit = null;


@ -15,23 +15,29 @@ See the License for the specific language governing permissions and
limitations under the License.
*/
import {LogItem} from "./LogItem.js";
import {LogLevel, LogFilter} from "./LogFilter.js";
import {LogItem} from "./LogItem";
import {LogLevel, LogFilter} from "./LogFilter";
import type {ILogger, ILogExport, FilterCreator, LabelOrValues, LogCallback, ILogItem, ISerializedItem} from "./types";
import type {Platform} from "../platform/web/Platform.js";
export class BaseLogger {
constructor({platform}) {
this._openItems = new Set();
export abstract class BaseLogger implements ILogger {
protected _openItems: Set<LogItem> = new Set();
protected _platform: Platform;
protected _serializedTransformer: (item: ISerializedItem) => ISerializedItem;
constructor({platform, serializedTransformer = (item: ISerializedItem) => item}) {
this._platform = platform;
this._serializedTransformer = serializedTransformer;
}
log(labelOrValues, logLevel = LogLevel.Info) {
const item = new LogItem(labelOrValues, logLevel, null, this);
item._end = item._start;
this._persistItem(item, null, false);
log(labelOrValues: LabelOrValues, logLevel: LogLevel = LogLevel.Info): void {
const item = new LogItem(labelOrValues, logLevel, this);
item.end = item.start;
this._persistItem(item, undefined, false);
}
/** if item is a log item, wrap the callback in a child of it, otherwise start a new root log item. */
wrapOrRun(item, labelOrValues, callback, logLevel = null, filterCreator = null) {
wrapOrRun<T>(item: ILogItem | undefined, labelOrValues: LabelOrValues, callback: LogCallback<T>, logLevel?: LogLevel, filterCreator?: FilterCreator): T {
if (item) {
return item.wrap(labelOrValues, callback, logLevel, filterCreator);
} else {
@ -43,28 +49,31 @@ export class BaseLogger {
where the (async) result or errors are not propagated but still logged.
Useful to pair with LogItem.refDetached.
@return {LogItem} the log item added, useful to pass to LogItem.refDetached */
runDetached(labelOrValues, callback, logLevel = null, filterCreator = null) {
if (logLevel === null) {
@return {ILogItem} the log item added, useful to pass to LogItem.refDetached */
runDetached<T>(labelOrValues: LabelOrValues, callback: LogCallback<T>, logLevel?: LogLevel, filterCreator?: FilterCreator): ILogItem {
if (!logLevel) {
logLevel = LogLevel.Info;
}
const item = new LogItem(labelOrValues, logLevel, null, this);
this._run(item, callback, logLevel, filterCreator, false /* don't throw, nobody is awaiting */);
const item = new LogItem(labelOrValues, logLevel, this);
this._run(item, callback, logLevel, false /* don't throw, nobody is awaiting */, filterCreator);
return item;
}
/** run a callback wrapped in a log operation.
Errors and duration are transparently logged, also for async operations.
Whatever the callback returns is returned here. */
run(labelOrValues, callback, logLevel = null, filterCreator = null) {
if (logLevel === null) {
run<T>(labelOrValues: LabelOrValues, callback: LogCallback<T>, logLevel?: LogLevel, filterCreator?: FilterCreator): T {
if (logLevel === undefined) {
logLevel = LogLevel.Info;
}
const item = new LogItem(labelOrValues, logLevel, null, this);
return this._run(item, callback, logLevel, filterCreator, true);
const item = new LogItem(labelOrValues, logLevel, this);
return this._run(item, callback, logLevel, true, filterCreator);
}
_run(item, callback, logLevel, filterCreator, shouldThrow) {
_run<T>(item: LogItem, callback: LogCallback<T>, logLevel: LogLevel, wantResult: true, filterCreator?: FilterCreator): T;
// we don't return if we don't throw, as we don't have anything to return when an error is caught but swallowed for the fire-and-forget case.
_run<T>(item: LogItem, callback: LogCallback<T>, logLevel: LogLevel, wantResult: false, filterCreator?: FilterCreator): void;
_run<T>(item: LogItem, callback: LogCallback<T>, logLevel: LogLevel, wantResult: boolean, filterCreator?: FilterCreator): T | void {
this._openItems.add(item);
const finishItem = () => {
@ -88,24 +97,29 @@ export class BaseLogger {
};
try {
const result = item.run(callback);
let result = item.run(callback);
if (result instanceof Promise) {
return result.then(promiseResult => {
result = result.then(promiseResult => {
finishItem();
return promiseResult;
}, err => {
finishItem();
if (shouldThrow) {
if (wantResult) {
throw err;
}
});
}) as unknown as T;
if (wantResult) {
return result;
}
} else {
finishItem();
return result;
if(wantResult) {
return result;
}
}
} catch (err) {
finishItem();
if (shouldThrow) {
if (wantResult) {
throw err;
}
}
@ -127,24 +141,20 @@ export class BaseLogger {
this._openItems.clear();
}
_persistItem() {
throw new Error("not implemented");
}
abstract _persistItem(item: LogItem, filter?: LogFilter, forced?: boolean): void;
async export() {
throw new Error("not implemented");
}
abstract export(): Promise<ILogExport | undefined>;
// expose log level without needing to import it everywhere
get level() {
get level(): typeof LogLevel {
return LogLevel;
}
_now() {
_now(): number {
return this._platform.clock.now();
}
_createRefId() {
_createRefId(): number {
return Math.round(this._platform.random() * Number.MAX_SAFE_INTEGER);
}
}


@ -13,33 +13,35 @@ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
*/
import {BaseLogger} from "./BaseLogger.js";
import {BaseLogger} from "./BaseLogger";
import {LogItem} from "./LogItem";
import type {ILogItem, LogItemValues, ILogExport} from "./types";
export class ConsoleLogger extends BaseLogger {
_persistItem(item) {
_persistItem(item: LogItem): void {
printToConsole(item);
}
async export(): Promise<ILogExport | undefined> {
return undefined;
}
}
const excludedKeysFromTable = ["l", "id"];
function filterValues(values) {
if (!values) {
return null;
}
function filterValues(values: LogItemValues): LogItemValues | null {
return Object.entries(values)
.filter(([key]) => !excludedKeysFromTable.includes(key))
.reduce((obj, [key, value]) => {
.reduce((obj: LogItemValues, [key, value]) => {
obj = obj || {};
obj[key] = value;
return obj;
}, null);
}
function printToConsole(item) {
function printToConsole(item: LogItem): void {
const label = `${itemCaption(item)} (${item.duration}ms)`;
const filteredValues = filterValues(item._values);
const shouldGroup = item._children || filteredValues;
const filteredValues = filterValues(item.values);
const shouldGroup = item.children || filteredValues;
if (shouldGroup) {
if (item.error) {
console.group(label);
@ -59,8 +61,8 @@ function printToConsole(item) {
if (filteredValues) {
console.table(filteredValues);
}
if (item._children) {
for(const c of item._children) {
if (item.children) {
for(const c of item.children) {
printToConsole(c);
}
}
@ -69,18 +71,18 @@ function printToConsole(item) {
}
}
function itemCaption(item) {
if (item._values.t === "network") {
return `${item._values.method} ${item._values.url}`;
} else if (item._values.l && typeof item._values.id !== "undefined") {
return `${item._values.l} ${item._values.id}`;
} else if (item._values.l && typeof item._values.status !== "undefined") {
return `${item._values.l} (${item._values.status})`;
} else if (item._values.l && item.error) {
return `${item._values.l} failed`;
} else if (typeof item._values.ref !== "undefined") {
return `ref ${item._values.ref}`
function itemCaption(item: ILogItem): string {
if (item.values.t === "network") {
return `${item.values.method} ${item.values.url}`;
} else if (item.values.l && typeof item.values.id !== "undefined") {
return `${item.values.l} ${item.values.id}`;
} else if (item.values.l && typeof item.values.status !== "undefined") {
return `${item.values.l} (${item.values.status})`;
} else if (item.values.l && item.error) {
return `${item.values.l} failed`;
} else if (typeof item.values.ref !== "undefined") {
return `ref ${item.values.ref}`;
} else {
return item._values.l || item._values.type;
return item.values.l || item.values.type;
}
}


@ -22,10 +22,25 @@ import {
iterateCursor,
fetchResults,
} from "../matrix/storage/idb/utils";
import {BaseLogger} from "./BaseLogger.js";
import {BaseLogger} from "./BaseLogger";
import type {Interval} from "../platform/web/dom/Clock";
import type {Platform} from "../platform/web/Platform.js";
import type {BlobHandle} from "../platform/web/dom/BlobHandle.js";
import type {ILogItem, ILogExport, ISerializedItem} from "./types";
import type {LogFilter} from "./LogFilter";
type QueuedItem = {
json: string;
id?: number;
}
export class IDBLogger extends BaseLogger {
constructor(options) {
private readonly _name: string;
private readonly _limit: number;
private readonly _flushInterval: Interval;
private _queuedItems: QueuedItem[];
constructor(options: {name: string, flushInterval?: number, limit?: number, platform: Platform, serializedTransformer?: (item: ISerializedItem) => ISerializedItem}) {
super(options);
const {name, flushInterval = 60 * 1000, limit = 3000} = options;
this._name = name;
@ -36,18 +51,19 @@ export class IDBLogger extends BaseLogger {
this._flushInterval = this._platform.clock.createInterval(() => this._tryFlush(), flushInterval);
}
dispose() {
// TODO: move dispose to ILogger, listen to pagehide elsewhere and call dispose from there, which calls _finishAllAndFlush
dispose(): void {
window.removeEventListener("pagehide", this, false);
this._flushInterval.dispose();
}
handleEvent(evt) {
handleEvent(evt: Event): void {
if (evt.type === "pagehide") {
this._finishAllAndFlush();
}
}
async _tryFlush() {
async _tryFlush(): Promise<void> {
const db = await this._openDB();
try {
const txn = db.transaction(["logs"], "readwrite");
@ -77,13 +93,13 @@ export class IDBLogger extends BaseLogger {
}
}
_finishAllAndFlush() {
_finishAllAndFlush(): void {
this._finishOpenItems();
this.log({l: "pagehide, closing logs", t: "navigation"});
this._persistQueuedItems(this._queuedItems);
}
_loadQueuedItems() {
_loadQueuedItems(): QueuedItem[] {
const key = `${this._name}_queuedItems`;
try {
const json = window.localStorage.getItem(key);
@ -97,18 +113,21 @@ export class IDBLogger extends BaseLogger {
return [];
}
_openDB() {
_openDB(): Promise<IDBDatabase> {
return openDatabase(this._name, db => db.createObjectStore("logs", {keyPath: "id", autoIncrement: true}), 1);
}
_persistItem(logItem, filter, forced) {
const serializedItem = logItem.serialize(filter, forced);
this._queuedItems.push({
json: JSON.stringify(serializedItem)
});
_persistItem(logItem: ILogItem, filter: LogFilter, forced: boolean): void {
const serializedItem = logItem.serialize(filter, undefined, forced);
if (serializedItem) {
const transformedSerializedItem = this._serializedTransformer(serializedItem);
this._queuedItems.push({
json: JSON.stringify(transformedSerializedItem)
});
}
}
_persistQueuedItems(items) {
_persistQueuedItems(items: QueuedItem[]): void {
try {
window.localStorage.setItem(`${this._name}_queuedItems`, JSON.stringify(items));
} catch (e) {
@ -116,12 +135,12 @@ export class IDBLogger extends BaseLogger {
}
}
async export() {
async export(): Promise<ILogExport> {
const db = await this._openDB();
try {
const txn = db.transaction(["logs"], "readonly");
const logs = txn.objectStore("logs");
const storedItems = await fetchResults(logs.openCursor(), () => false);
const storedItems: QueuedItem[] = await fetchResults(logs.openCursor(), () => false);
const allItems = storedItems.concat(this._queuedItems);
return new IDBLogExport(allItems, this, this._platform);
} finally {
@ -131,17 +150,20 @@ export class IDBLogger extends BaseLogger {
}
}
async _removeItems(items) {
async _removeItems(items: QueuedItem[]): Promise<void> {
const db = await this._openDB();
try {
const txn = db.transaction(["logs"], "readwrite");
const logs = txn.objectStore("logs");
for (const item of items) {
const queuedIdx = this._queuedItems.findIndex(i => i.id === item.id);
if (queuedIdx === -1) {
if (typeof item.id === "number") {
logs.delete(item.id);
} else {
this._queuedItems.splice(queuedIdx, 1);
// assume the (non-persisted) object in each array will be the same
const queuedIdx = this._queuedItems.indexOf(item);
if (queuedIdx === -1) {
this._queuedItems.splice(queuedIdx, 1);
}
}
}
await txnAsPromise(txn);
@ -153,33 +175,37 @@ export class IDBLogger extends BaseLogger {
}
}
class IDBLogExport {
constructor(items, logger, platform) {
class IDBLogExport implements ILogExport {
private readonly _items: QueuedItem[];
private readonly _logger: IDBLogger;
private readonly _platform: Platform;
constructor(items: QueuedItem[], logger: IDBLogger, platform: Platform) {
this._items = items;
this._logger = logger;
this._platform = platform;
}
get count() {
get count(): number {
return this._items.length;
}
/**
* @return {Promise}
*/
removeFromStore() {
removeFromStore(): Promise<void> {
return this._logger._removeItems(this._items);
}
asBlob() {
asBlob(): BlobHandle {
const log = {
formatVersion: 1,
appVersion: this._platform.updateService?.version,
items: this._items.map(i => JSON.parse(i.json))
};
const json = JSON.stringify(log);
const buffer = this._platform.encoding.utf8.encode(json);
const blob = this._platform.createBlob(buffer, "application/json");
const buffer: Uint8Array = this._platform.encoding.utf8.encode(json);
const blob: BlobHandle = this._platform.createBlob(buffer, "application/json");
return blob;
}
}


@ -14,31 +14,35 @@ See the License for the specific language governing permissions and
limitations under the License.
*/
export const LogLevel = {
All: 1,
Debug: 2,
Detail: 3,
Info: 4,
Warn: 5,
Error: 6,
Fatal: 7,
Off: 8,
import type {ILogItem, ISerializedItem} from "./types";
export enum LogLevel {
All = 1,
Debug,
Detail,
Info,
Warn,
Error,
Fatal,
Off
}
export class LogFilter {
constructor(parentFilter) {
private _min?: LogLevel;
private _parentFilter?: LogFilter;
constructor(parentFilter?: LogFilter) {
this._parentFilter = parentFilter;
this._min = null;
}
filter(item, children) {
filter(item: ILogItem, children: ISerializedItem[] | null): boolean {
if (this._parentFilter) {
if (!this._parentFilter.filter(item, children)) {
return false;
}
}
// neither our children or us have a loglevel high enough, filter out.
if (this._min !== null && !Array.isArray(children) && item.logLevel < this._min) {
if (this._min !== undefined && !Array.isArray(children) && item.logLevel < this._min) {
return false;
} else {
return true;
@ -46,7 +50,7 @@ export class LogFilter {
}
/* methods to build the filter */
minLevel(logLevel) {
minLevel(logLevel: LogLevel): LogFilter {
this._min = logLevel;
return this;
}


@ -15,39 +15,47 @@ See the License for the specific language governing permissions and
limitations under the License.
*/
import {LogLevel, LogFilter} from "./LogFilter.js";
import {LogLevel, LogFilter} from "./LogFilter";
import type {BaseLogger} from "./BaseLogger";
import type {ISerializedItem, ILogItem, LogItemValues, LabelOrValues, FilterCreator, LogCallback} from "./types";
export class LogItem {
constructor(labelOrValues, logLevel, filterCreator, logger) {
export class LogItem implements ILogItem {
public readonly start: number;
public logLevel: LogLevel;
public error?: Error;
public end?: number;
private _values: LogItemValues;
private _logger: BaseLogger;
private _filterCreator?: FilterCreator;
private _children?: Array<LogItem>;
constructor(labelOrValues: LabelOrValues, logLevel: LogLevel, logger: BaseLogger, filterCreator?: FilterCreator) {
this._logger = logger;
this._start = logger._now();
this._end = null;
this.start = logger._now();
// (l)abel
this._values = typeof labelOrValues === "string" ? {l: labelOrValues} : labelOrValues;
this.error = null;
this.logLevel = logLevel;
this._children = null;
this._filterCreator = filterCreator;
}
/** start a new root log item and run it detached mode, see BaseLogger.runDetached */
runDetached(labelOrValues, callback, logLevel, filterCreator) {
runDetached(labelOrValues: LabelOrValues, callback: LogCallback<unknown>, logLevel?: LogLevel, filterCreator?: FilterCreator): ILogItem {
return this._logger.runDetached(labelOrValues, callback, logLevel, filterCreator);
}
/** start a new detached root log item and log a reference to it from this item */
wrapDetached(labelOrValues, callback, logLevel, filterCreator) {
wrapDetached(labelOrValues: LabelOrValues, callback: LogCallback<unknown>, logLevel?: LogLevel, filterCreator?: FilterCreator): void {
this.refDetached(this.runDetached(labelOrValues, callback, logLevel, filterCreator));
}
/** logs a reference to a different log item, usually obtained from runDetached.
This is useful if the referenced operation can't be awaited. */
refDetached(logItem, logLevel = null) {
refDetached(logItem: ILogItem, logLevel?: LogLevel): void {
logItem.ensureRefId();
return this.log({ref: logItem._values.refId}, logLevel);
this.log({ref: logItem.values.refId}, logLevel);
}
ensureRefId() {
ensureRefId(): void {
if (!this._values.refId) {
this.set("refId", this._logger._createRefId());
}
@ -56,29 +64,33 @@ export class LogItem {
/**
* Creates a new child item and runs it in `callback`.
*/
wrap(labelOrValues, callback, logLevel = null, filterCreator = null) {
wrap<T>(labelOrValues: LabelOrValues, callback: LogCallback<T>, logLevel?: LogLevel, filterCreator?: FilterCreator): T {
const item = this.child(labelOrValues, logLevel, filterCreator);
return item.run(callback);
}
get duration() {
if (this._end) {
return this._end - this._start;
get duration(): number | undefined {
if (this.end) {
return this.end - this.start;
} else {
return null;
return undefined;
}
}
durationWithoutType(type) {
return this.duration - this.durationOfType(type);
durationWithoutType(type: string): number | undefined {
const durationOfType = this.durationOfType(type);
if (this.duration && durationOfType) {
return this.duration - durationOfType;
}
}
durationOfType(type) {
durationOfType(type: string): number | undefined {
if (this._values.t === type) {
return this.duration;
} else if (this._children) {
return this._children.reduce((sum, c) => {
return sum + c.durationOfType(type);
const duration = c.durationOfType(type);
return sum + (duration ?? 0);
}, 0);
} else {
return 0;
@ -91,12 +103,12 @@ export class LogItem {
*
* Hence, the child item is not returned.
*/
log(labelOrValues, logLevel = null) {
const item = this.child(labelOrValues, logLevel, null);
item._end = item._start;
log(labelOrValues: LabelOrValues, logLevel?: LogLevel): void {
const item = this.child(labelOrValues, logLevel);
item.end = item.start;
}
set(key, value) {
set(key: string | object, value?: unknown): void {
if(typeof key === "object") {
const values = key;
Object.assign(this._values, values);
@ -105,7 +117,7 @@ export class LogItem {
}
}
serialize(filter, parentStartTime = null, forced) {
serialize(filter: LogFilter, parentStartTime: number | undefined, forced: boolean): ISerializedItem | undefined {
if (this._filterCreator) {
try {
filter = this._filterCreator(new LogFilter(filter), this);
@ -113,10 +125,10 @@ export class LogItem {
console.error("Error creating log filter", err);
}
}
let children;
if (this._children !== null) {
children = this._children.reduce((array, c) => {
const s = c.serialize(filter, this._start, false);
let children: Array<ISerializedItem> | null = null;
if (this._children) {
children = this._children.reduce((array: Array<ISerializedItem>, c) => {
const s = c.serialize(filter, this.start, false);
if (s) {
if (array === null) {
array = [];
@ -127,12 +139,12 @@ export class LogItem {
}, null);
}
if (filter && !filter.filter(this, children)) {
return null;
return;
}
// in (v)alues, (l)abel and (t)ype are also reserved.
const item = {
const item: ISerializedItem = {
// (s)tart
s: parentStartTime === null ? this._start : this._start - parentStartTime,
s: typeof parentStartTime === "number" ? this.start - parentStartTime : this.start,
// (d)uration
d: this.duration,
// (v)alues
@ -171,20 +183,19 @@ export class LogItem {
* @param {Function} callback [description]
* @return {[type]} [description]
*/
run(callback) {
if (this._end !== null) {
run<T>(callback: LogCallback<T>): T {
if (this.end !== undefined) {
console.trace("log item is finished, additional logs will likely not be recorded");
}
let result;
try {
result = callback(this);
const result = callback(this);
if (result instanceof Promise) {
return result.then(promiseResult => {
this.finish();
return promiseResult;
}, err => {
throw this.catch(err);
});
}) as unknown as T;
} else {
this.finish();
return result;
@ -198,45 +209,53 @@ export class LogItem {
* finished the item, recording the end time. After finishing, an item can't be modified anymore as it will be persisted.
* @internal shouldn't typically be called by hand. allows to force finish if a promise is still running when closing the app
*/
finish() {
if (this._end === null) {
if (this._children !== null) {
finish(): void {
if (this.end === undefined) {
if (this._children) {
for(const c of this._children) {
c.finish();
}
}
this._end = this._logger._now();
this.end = this._logger._now();
}
}
// expose log level without needing import everywhere
get level() {
get level(): typeof LogLevel {
return LogLevel;
}
catch(err) {
catch(err: Error): Error {
this.error = err;
this.logLevel = LogLevel.Error;
this.finish();
return err;
}
child(labelOrValues, logLevel, filterCreator) {
if (this._end !== null) {
child(labelOrValues: LabelOrValues, logLevel?: LogLevel, filterCreator?: FilterCreator): LogItem {
if (this.end) {
console.trace("log item is finished, additional logs will likely not be recorded");
}
if (!logLevel) {
logLevel = this.logLevel || LogLevel.Info;
}
const item = new LogItem(labelOrValues, logLevel, filterCreator, this._logger);
if (this._children === null) {
const item = new LogItem(labelOrValues, logLevel, this._logger, filterCreator);
if (!this._children) {
this._children = [];
}
this._children.push(item);
return item;
}
get logger() {
get logger(): BaseLogger {
return this._logger;
}
get values(): LogItemValues {
return this._values;
}
get children(): Array<LogItem> | undefined {
return this._children;
}
}


@ -1,99 +0,0 @@
/*
Copyright 2021 The Matrix.org Foundation C.I.C.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
*/
import {LogLevel} from "./LogFilter.js";
function noop () {}
export class NullLogger {
constructor() {
this.item = new NullLogItem(this);
}
log() {}
run(_, callback) {
return callback(this.item);
}
wrapOrRun(item, _, callback) {
if (item) {
return item.wrap(null, callback);
} else {
return this.run(null, callback);
}
}
runDetached(_, callback) {
new Promise(r => r(callback(this.item))).then(noop, noop);
}
async export() {
return null;
}
get level() {
return LogLevel;
}
}
export class NullLogItem {
constructor(logger) {
this.logger = logger;
}
wrap(_, callback) {
return callback(this);
}
log() {}
set() {}
runDetached(_, callback) {
new Promise(r => r(callback(this))).then(noop, noop);
}
wrapDetached(_, callback) {
return this.refDetached(null, callback);
}
run(callback) {
return callback(this);
}
refDetached() {}
ensureRefId() {}
get level() {
return LogLevel;
}
get duration() {
return 0;
}
catch(err) {
return err;
}
child() {
return this;
}
finish() {}
}
export const Instance = new NullLogger();

106
src/logging/NullLogger.ts Normal file

@ -0,0 +1,106 @@
/*
Copyright 2021 The Matrix.org Foundation C.I.C.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
*/
import {LogLevel} from "./LogFilter";
import type {ILogger, ILogExport, ILogItem, LabelOrValues, LogCallback, LogItemValues} from "./types";
function noop (): void {}
export class NullLogger implements ILogger {
public readonly item: ILogItem = new NullLogItem(this);
log(): void {}
run<T>(_, callback: LogCallback<T>): T {
return callback(this.item);
}
wrapOrRun<T>(item: ILogItem | undefined, _, callback: LogCallback<T>): T {
if (item) {
return item.wrap(_, callback);
} else {
return this.run(_, callback);
}
}
runDetached(_, callback): ILogItem {
new Promise(r => r(callback(this.item))).then(noop, noop);
return this.item;
}
async export(): Promise<ILogExport | undefined> {
return undefined;
}
get level(): typeof LogLevel {
return LogLevel;
}
}
export class NullLogItem implements ILogItem {
public readonly logger: NullLogger;
public readonly logLevel: LogLevel;
public children?: Array<ILogItem>;
public values: LogItemValues;
public error?: Error;
constructor(logger: NullLogger) {
this.logger = logger;
}
wrap<T>(_: LabelOrValues, callback: LogCallback<T>): T {
return callback(this);
}
log(): void {}
set(): void {}
runDetached(_: LabelOrValues, callback: LogCallback<unknown>): ILogItem {
new Promise(r => r(callback(this))).then(noop, noop);
return this;
}
wrapDetached(_: LabelOrValues, _callback: LogCallback<unknown>): void {
return this.refDetached();
}
refDetached(): void {}
ensureRefId(): void {}
get level(): typeof LogLevel {
return LogLevel;
}
get duration(): 0 {
return 0;
}
catch(err: Error): Error {
return err;
}
child(): ILogItem {
return this;
}
finish(): void {}
serialize(): undefined {
return undefined;
}
}
export const Instance = new NullLogger();

82
src/logging/types.ts Normal file

@ -0,0 +1,82 @@
/*
Copyright 2020 Bruno Windels <bruno@windels.cloud>
Copyright 2021 The Matrix.org Foundation C.I.C.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
*/
import {LogLevel, LogFilter} from "./LogFilter";
import type {BaseLogger} from "./BaseLogger";
import type {BlobHandle} from "../platform/web/dom/BlobHandle.js";
export interface ISerializedItem {
s: number;
d?: number;
v: LogItemValues;
l: LogLevel;
e?: {
stack?: string;
name: string;
message: string;
};
f?: boolean;
c?: Array<ISerializedItem>;
};
export interface ILogItem {
logLevel: LogLevel;
error?: Error;
readonly logger: ILogger;
readonly level: typeof LogLevel;
readonly end?: number;
readonly start?: number;
readonly values: LogItemValues;
wrap<T>(labelOrValues: LabelOrValues, callback: LogCallback<T>, logLevel?: LogLevel, filterCreator?: FilterCreator): T;
log(labelOrValues: LabelOrValues, logLevel?: LogLevel): void;
set(key: string | object, value: unknown): void;
runDetached(labelOrValues: LabelOrValues, callback: LogCallback<unknown>, logLevel?: LogLevel, filterCreator?: FilterCreator): ILogItem;
wrapDetached(labelOrValues: LabelOrValues, callback: LogCallback<unknown>, logLevel?: LogLevel, filterCreator?: FilterCreator): void;
refDetached(logItem: ILogItem, logLevel?: LogLevel): void;
ensureRefId(): void;
catch(err: Error): Error;
serialize(filter: LogFilter, parentStartTime: number | undefined, forced: boolean): ISerializedItem | undefined;
}
export interface ILogger {
log(labelOrValues: LabelOrValues, logLevel?: LogLevel): void;
wrapOrRun<T>(item: ILogItem | undefined, labelOrValues: LabelOrValues, callback: LogCallback<T>, logLevel?: LogLevel, filterCreator?: FilterCreator): T;
runDetached<T>(labelOrValues: LabelOrValues, callback: LogCallback<T>, logLevel?: LogLevel, filterCreator?: FilterCreator): ILogItem;
run<T>(labelOrValues: LabelOrValues, callback: LogCallback<T>, logLevel?: LogLevel, filterCreator?: FilterCreator): T;
export(): Promise<ILogExport | undefined>;
get level(): typeof LogLevel;
}
export interface ILogExport {
get count(): number;
removeFromStore(): Promise<void>;
asBlob(): BlobHandle;
}
export type LogItemValues = {
l?: string;
t?: string;
id?: unknown;
status?: string | number;
refId?: number;
ref?: number;
[key: string]: any
}
export type LabelOrValues = string | LogItemValues;
export type FilterCreator = ((filter: LogFilter, item: ILogItem) => LogFilter);
export type LogCallback<T> = (item: ILogItem) => T;
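To illustrate how these types compose, a hedged sketch of a caller; `logRequest` is hypothetical and only relies on the `ILogger`, `ILogItem` and `LabelOrValues` shapes defined above:

```ts
import type {ILogger, ILogItem, LabelOrValues} from "./types";

// LabelOrValues can be a plain label string or a LogItemValues object;
// extra keys like `url` are allowed by the index signature
function logRequest(logger: ILogger, url: string): Promise<Response> {
    const label: LabelOrValues = {l: "request", t: "network", url};
    return logger.run(label, (item: ILogItem) => {
        item.set("status", "pending");
        return fetch(url);
    });
}
```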

View file

@ -1,16 +0,0 @@
// these are helper functions if you can't assume you always have a log item (e.g. some code paths call with one set, others don't)
// if you know you always have a log item, better to use the methods on the log item than these utility functions.
import {Instance as NullLoggerInstance} from "./NullLogger.js";
export function wrapOrRunNullLogger(logItem, labelOrValues, callback, logLevel = null, filterCreator = null) {
if (logItem) {
return logItem.wrap(logItem, labelOrValues, callback, logLevel, filterCreator);
} else {
return NullLoggerInstance.run(null, callback);
}
}
export function ensureLogItem(logItem) {
return logItem || NullLoggerInstance.item;
}

18
src/logging/utils.ts Normal file
View file

@ -0,0 +1,18 @@
// these are helper functions if you can't assume you always have a log item (e.g. some code paths call with one set, others don't)
// if you know you always have a log item, better to use the methods on the log item than these utility functions.
import {Instance as NullLoggerInstance} from "./NullLogger";
import type {FilterCreator, ILogItem, LabelOrValues, LogCallback} from "./types";
import {LogLevel} from "./LogFilter";
export function wrapOrRunNullLogger<T>(logItem: ILogItem | undefined, labelOrValues: LabelOrValues, callback: LogCallback<T>, logLevel?: LogLevel, filterCreator?: FilterCreator): T | Promise<T> {
if (logItem) {
return logItem.wrap(labelOrValues, callback, logLevel, filterCreator);
} else {
return NullLoggerInstance.run(null, callback);
}
}
export function ensureLogItem(logItem: ILogItem): ILogItem {
return logItem || NullLoggerInstance.item;
}
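A short sketch of the intended call pattern; `loadThing` is hypothetical and only exists to show a code path where the caller may or may not pass a log item:

```ts
import {wrapOrRunNullLogger} from "./utils";
import type {ILogItem} from "./types";

// some callers pass a log item, others don't; either way the callback gets an ILogItem
function loadThing(id: string, log?: ILogItem): string | Promise<string> {
    return wrapOrRunNullLogger(log, "loadThing", item => {
        item.set("id", id);
        return id;
    });
}
```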

View file

@ -15,7 +15,7 @@ limitations under the License.
*/
import {OLM_ALGORITHM} from "./e2ee/common.js";
import {countBy, groupBy} from "../utils/groupBy.js";
import {countBy, groupBy} from "../utils/groupBy";
export class DeviceMessageHandler {
constructor({storage}) {
@ -67,12 +67,4 @@ class SyncPreparation {
this.newRoomKeys = newRoomKeys;
this.newKeysByRoom = groupBy(newRoomKeys, r => r.roomId);
}
dispose() {
if (this.newRoomKeys) {
for (const k of this.newRoomKeys) {
k.dispose();
}
}
}
}

View file

@ -24,20 +24,24 @@ import { ObservableMap } from "../observable/index.js";
import {User} from "./User.js";
import {DeviceMessageHandler} from "./DeviceMessageHandler.js";
import {Account as E2EEAccount} from "./e2ee/Account.js";
import {uploadAccountAsDehydratedDevice} from "./e2ee/Dehydration.js";
import {Decryption as OlmDecryption} from "./e2ee/olm/Decryption.js";
import {Encryption as OlmEncryption} from "./e2ee/olm/Encryption.js";
import {Decryption as MegOlmDecryption} from "./e2ee/megolm/Decryption.js";
import {Decryption as MegOlmDecryption} from "./e2ee/megolm/Decryption";
import {KeyLoader as MegOlmKeyLoader} from "./e2ee/megolm/decryption/KeyLoader";
import {SessionBackup} from "./e2ee/megolm/SessionBackup.js";
import {Encryption as MegOlmEncryption} from "./e2ee/megolm/Encryption.js";
import {MEGOLM_ALGORITHM} from "./e2ee/common.js";
import {RoomEncryption} from "./e2ee/RoomEncryption.js";
import {DeviceTracker} from "./e2ee/DeviceTracker.js";
import {LockMap} from "../utils/LockMap.js";
import {groupBy} from "../utils/groupBy.js";
import {LockMap} from "../utils/LockMap";
import {groupBy} from "../utils/groupBy";
import {
keyFromCredential as ssssKeyFromCredential,
readKey as ssssReadKey,
writeKey as ssssWriteKey,
removeKey as ssssRemoveKey,
keyFromDehydratedDeviceKey as createSSSSKeyFromDehydratedDeviceKey
} from "./ssss/index.js";
import {SecretStorage} from "./ssss/SecretStorage.js";
import {ObservableValue, RetainedObservableValue} from "../observable/ObservableValue";
@ -105,6 +109,11 @@ export class Session {
return this._sessionInfo.userId;
}
/** @internal call SessionContainer.logout instead */
async logout(log = undefined) {
await this._hsApi.logout({log}).response();
}
// called once this._e2eeAccount is assigned
_setupEncryption() {
// TODO: this should all go in a wrapper in e2ee/ that is bootstrapped by passing in the account
@ -137,11 +146,8 @@ export class Session {
now: this._platform.clock.now,
ownDeviceId: this._sessionInfo.deviceId,
});
this._megolmDecryption = new MegOlmDecryption({
pickleKey: PICKLE_KEY,
olm: this._olm,
olmWorker: this._olmWorker,
});
const keyLoader = new MegOlmKeyLoader(this._olm, PICKLE_KEY, 20);
this._megolmDecryption = new MegOlmDecryption(keyLoader, this._olmWorker);
this._deviceMessageHandler.enableEncryption({olmDecryption, megolmDecryption: this._megolmDecryption});
}
@ -200,6 +206,12 @@ export class Session {
this._storage.storeNames.accountData,
]);
await this._createSessionBackup(key, readTxn);
await this._writeSSSSKey(key);
this._hasSecretStorageKey.set(true);
return key;
}
async _writeSSSSKey(key) {
// only write the key after having read a secret,
// as we only find out whether it was good once the MAC verification succeeds
const writeTxn = await this._storage.readWriteTxn([
@ -212,7 +224,29 @@ export class Session {
throw err;
}
await writeTxn.complete();
this._hasSecretStorageKey.set(true);
}
async disableSecretStorage() {
const writeTxn = await this._storage.readWriteTxn([
this._storage.storeNames.session,
]);
try {
ssssRemoveKey(writeTxn);
} catch (err) {
writeTxn.abort();
throw err;
}
await writeTxn.complete();
if (this._sessionBackup) {
for (const room of this._rooms.values()) {
if (room.isEncrypted) {
room.enableSessionBackup(undefined);
}
}
this._sessionBackup?.dispose();
this._sessionBackup = undefined;
}
this._hasSecretStorageKey.set(false);
}
async _createSessionBackup(ssssKey, txn) {
@ -245,23 +279,76 @@ export class Session {
async createIdentity(log) {
if (this._olm) {
if (!this._e2eeAccount) {
this._e2eeAccount = await E2EEAccount.create({
hsApi: this._hsApi,
olm: this._olm,
pickleKey: PICKLE_KEY,
userId: this._sessionInfo.userId,
deviceId: this._sessionInfo.deviceId,
olmWorker: this._olmWorker,
storage: this._storage,
});
this._e2eeAccount = await this._createNewAccount(this._sessionInfo.deviceId, this._storage);
log.set("keys", this._e2eeAccount.identityKeys);
this._setupEncryption();
}
await this._e2eeAccount.generateOTKsIfNeeded(this._storage, log);
await log.wrap("uploadKeys", log => this._e2eeAccount.uploadKeys(this._storage, log));
await log.wrap("uploadKeys", log => this._e2eeAccount.uploadKeys(this._storage, false, log));
}
}
/** @internal */
async dehydrateIdentity(dehydratedDevice, log) {
log.set("deviceId", dehydratedDevice.deviceId);
if (!this._olm) {
log.set("no_olm", true);
return false;
}
if (dehydratedDevice.deviceId !== this.deviceId) {
log.set("wrong_device", true);
return false;
}
if (this._e2eeAccount) {
log.set("account_already_setup", true);
return false;
}
if (!await dehydratedDevice.claim(this._hsApi, log)) {
log.set("already_claimed", true);
return false;
}
this._e2eeAccount = await E2EEAccount.adoptDehydratedDevice({
dehydratedDevice,
hsApi: this._hsApi,
olm: this._olm,
pickleKey: PICKLE_KEY,
userId: this._sessionInfo.userId,
olmWorker: this._olmWorker,
deviceId: this.deviceId,
storage: this._storage,
});
log.set("keys", this._e2eeAccount.identityKeys);
this._setupEncryption();
return true;
}
_createNewAccount(deviceId, storage = undefined) {
// storage is optional and if omitted the account won't be persisted (useful for dehydrating devices)
return E2EEAccount.create({
hsApi: this._hsApi,
olm: this._olm,
pickleKey: PICKLE_KEY,
userId: this._sessionInfo.userId,
olmWorker: this._olmWorker,
deviceId,
storage,
});
}
setupDehydratedDevice(key, log = null) {
return this._platform.logger.wrapOrRun(log, "setupDehydratedDevice", async log => {
const dehydrationAccount = await this._createNewAccount("temp-device-id");
try {
const deviceId = await uploadAccountAsDehydratedDevice(
dehydrationAccount, this._hsApi, key, "Dehydrated device", log);
log.set("deviceId", deviceId);
return deviceId;
} finally {
dehydrationAccount.dispose();
}
});
}
/** @internal */
async load(log) {
const txn = await this._storage.readTxn([
@ -318,10 +405,17 @@ export class Session {
dispose() {
this._olmWorker?.dispose();
this._olmWorker = undefined;
this._sessionBackup?.dispose();
this._sessionBackup = undefined;
this._megolmDecryption?.dispose();
this._megolmDecryption = undefined;
this._e2eeAccount?.dispose();
this._e2eeAccount = undefined;
for (const room of this._rooms.values()) {
room.dispose();
}
this._rooms = undefined;
}
/**
@ -330,7 +424,7 @@ export class Session {
* and useful to store so we can later tell what capabilities
* our homeserver has.
*/
async start(lastVersionResponse, log) {
async start(lastVersionResponse, dehydratedDevice, log) {
if (lastVersionResponse) {
// store /versions response
const txn = await this._storage.readWriteTxn([
@ -342,6 +436,15 @@ export class Session {
}
// enable session backup, this requests the latest backup version
if (!this._sessionBackup) {
if (dehydratedDevice) {
await log.wrap("SSSSKeyFromDehydratedDeviceKey", async log => {
const ssssKey = await createSSSSKeyFromDehydratedDeviceKey(dehydratedDevice.key, this._storage, this._platform);
if (ssssKey) {
log.set("success", true);
await this._writeSSSSKey(ssssKey);
}
})
}
const txn = await this._storage.readTxn([
this._storage.storeNames.session,
this._storage.storeNames.accountData,
@ -513,7 +616,7 @@ export class Session {
if (!isCatchupSync) {
const needsToUploadOTKs = await this._e2eeAccount.generateOTKsIfNeeded(this._storage, log);
if (needsToUploadOTKs) {
await log.wrap("uploadKeys", log => this._e2eeAccount.uploadKeys(this._storage, log));
await log.wrap("uploadKeys", log => this._e2eeAccount.uploadKeys(this._storage, false, log));
}
}
}

View file

@ -15,7 +15,7 @@ See the License for the specific language governing permissions and
limitations under the License.
*/
import {createEnum} from "../utils/enum.js";
import {createEnum} from "../utils/enum";
import {lookupHomeserver} from "./well-known.js";
import {AbortableOperation} from "../utils/AbortableOperation";
import {ObservableValue} from "../observable/ObservableValue";
@ -29,14 +29,17 @@ import {Session} from "./Session.js";
import {PasswordLoginMethod} from "./login/PasswordLoginMethod.js";
import {TokenLoginMethod} from "./login/TokenLoginMethod.js";
import {SSOLoginHelper} from "./login/SSOLoginHelper.js";
import {getDehydratedDevice} from "./e2ee/Dehydration.js";
export const LoadStatus = createEnum(
"NotLoading",
"Login",
"LoginFailed",
"QueryAccount", // check for dehydrated device after login
"AccountSetup", // asked to restore from dehydrated device if present, call sc.accountSetup.finish() to progress to the next stage
"Loading",
"SessionSetup", // upload e2ee keys, ...
"Migrating", //not used atm, but would fit here
"Migrating", // not used atm, but would fit here
"FirstSync",
"Error",
"Ready",
@ -63,6 +66,7 @@ export class SessionContainer {
this._requestScheduler = null;
this._olmPromise = olmPromise;
this._workerPromise = workerPromise;
this._accountSetup = undefined;
}
createNewSessionId() {
@ -85,7 +89,7 @@ export class SessionContainer {
if (!sessionInfo) {
throw new Error("Invalid session id: " + sessionId);
}
await this._loadSessionInfo(sessionInfo, false, log);
await this._loadSessionInfo(sessionInfo, null, log);
log.set("status", this._status.get());
} catch (err) {
log.catch(err);
@ -127,7 +131,7 @@ export class SessionContainer {
});
}
async startWithLogin(loginMethod) {
async startWithLogin(loginMethod, {inspectAccountSetup} = {}) {
const currentStatus = this._status.get();
if (currentStatus !== LoadStatus.LoginFailed &&
currentStatus !== LoadStatus.NotLoading &&
@ -154,7 +158,6 @@ export class SessionContainer {
lastUsed: clock.now()
};
log.set("id", sessionId);
await this._platform.sessionInfoStorage.add(sessionInfo);
} catch (err) {
this._error = err;
if (err.name === "HomeServerError") {
@ -173,21 +176,31 @@ export class SessionContainer {
}
return;
}
let dehydratedDevice;
if (inspectAccountSetup) {
dehydratedDevice = await this._inspectAccountAfterLogin(sessionInfo, log);
if (dehydratedDevice) {
sessionInfo.deviceId = dehydratedDevice.deviceId;
}
}
await this._platform.sessionInfoStorage.add(sessionInfo);
// loading the session can only lead to
// LoadStatus.Error in case of an error,
// so separate try/catch
try {
await this._loadSessionInfo(sessionInfo, true, log);
await this._loadSessionInfo(sessionInfo, dehydratedDevice, log);
log.set("status", this._status.get());
} catch (err) {
log.catch(err);
// free olm Account that might be contained
dehydratedDevice?.dispose();
this._error = err;
this._status.set(LoadStatus.Error);
}
});
}
async _loadSessionInfo(sessionInfo, isNewLogin, log) {
async _loadSessionInfo(sessionInfo, dehydratedDevice, log) {
log.set("appVersion", this._platform.version);
const clock = this._platform.clock;
this._sessionStartedByReconnector = false;
@ -233,7 +246,10 @@ export class SessionContainer {
platform: this._platform,
});
await this._session.load(log);
if (!this._session.hasIdentity) {
if (dehydratedDevice) {
await log.wrap("dehydrateIdentity", log => this._session.dehydrateIdentity(dehydratedDevice, log));
await this._session.setupDehydratedDevice(dehydratedDevice.key, log);
} else if (!this._session.hasIdentity) {
this._status.set(LoadStatus.SessionSetup);
await log.wrap("createIdentity", log => this._session.createIdentity(log));
}
@ -247,12 +263,16 @@ export class SessionContainer {
this._requestScheduler.start();
this._sync.start();
this._sessionStartedByReconnector = true;
await log.wrap("session start", log => this._session.start(this._reconnector.lastVersionsResponse, log));
const d = dehydratedDevice;
dehydratedDevice = undefined;
await log.wrap("session start", log => this._session.start(this._reconnector.lastVersionsResponse, d, log));
});
}
});
await log.wrap("wait first sync", () => this._waitForFirstSync());
if (this._isDisposed) {
return;
}
this._status.set(LoadStatus.Ready);
// if the sync failed, and then the reconnector
@ -261,8 +281,13 @@ export class SessionContainer {
// to prevent an extra /versions request
if (!this._sessionStartedByReconnector) {
const lastVersionsResponse = await hsApi.versions({timeout: 10000, log}).response();
if (this._isDisposed) {
return;
}
const d = dehydratedDevice;
dehydratedDevice = undefined;
// log as ref as we don't want to await it
await log.wrap("session start", log => this._session.start(lastVersionsResponse, log));
await log.wrap("session start", log => this._session.start(lastVersionsResponse, d, log));
}
}
@ -295,6 +320,41 @@ export class SessionContainer {
}
}
_inspectAccountAfterLogin(sessionInfo, log) {
return log.wrap("inspectAccount", async log => {
this._status.set(LoadStatus.QueryAccount);
const hsApi = new HomeServerApi({
homeserver: sessionInfo.homeServer,
accessToken: sessionInfo.accessToken,
request: this._platform.request,
});
const olm = await this._olmPromise;
let encryptedDehydratedDevice;
try {
encryptedDehydratedDevice = await getDehydratedDevice(hsApi, olm, this._platform, log);
} catch (err) {
if (err.name === "HomeServerError") {
log.set("not_supported", true);
} else {
throw err;
}
}
if (encryptedDehydratedDevice) {
let resolveStageFinish;
const promiseStageFinish = new Promise(r => resolveStageFinish = r);
this._accountSetup = new AccountSetup(encryptedDehydratedDevice, resolveStageFinish);
this._status.set(LoadStatus.AccountSetup);
await promiseStageFinish;
const dehydratedDevice = this._accountSetup?._dehydratedDevice;
this._accountSetup = null;
return dehydratedDevice;
}
});
}
get accountSetup() {
return this._accountSetup;
}
get loadStatus() {
return this._status;
@ -322,19 +382,36 @@ export class SessionContainer {
return this._reconnector;
}
get _isDisposed() {
return !this._reconnector;
}
logout() {
return this._platform.logger.run("logout", async log => {
try {
await this._session?.logout(log);
} catch (err) {}
await this.deleteSession(log);
});
}
dispose() {
if (this._reconnectSubscription) {
this._reconnectSubscription();
this._reconnectSubscription = null;
}
this._reconnector = null;
if (this._requestScheduler) {
this._requestScheduler.stop();
this._requestScheduler = null;
}
if (this._sync) {
this._sync.stop();
this._sync = null;
}
if (this._session) {
this._session.dispose();
this._session = null;
}
if (this._waitForFirstSyncHandle) {
this._waitForFirstSyncHandle.dispose();
@ -346,13 +423,17 @@ export class SessionContainer {
}
}
async deleteSession() {
async deleteSession(log) {
if (this._sessionId) {
// need to dispose first, so the storage is closed,
// and also first sync finishing won't call Session.start anymore,
// which assumes that the storage works.
this.dispose();
// if one fails, don't block the other from trying
// also, run in parallel
await Promise.all([
this._platform.storageFactory.delete(this._sessionId),
this._platform.sessionInfoStorage.delete(this._sessionId),
log.wrap("storageFactory", () => this._platform.storageFactory.delete(this._sessionId)),
log.wrap("sessionInfoStorage", () => this._platform.sessionInfoStorage.delete(this._sessionId)),
]);
this._sessionId = null;
}
@ -364,3 +445,20 @@ export class SessionContainer {
this._loginFailure = null;
}
}
class AccountSetup {
constructor(encryptedDehydratedDevice, finishStage) {
this._encryptedDehydratedDevice = encryptedDehydratedDevice;
this._dehydratedDevice = undefined;
this._finishStage = finishStage;
}
get encryptedDehydratedDevice() {
return this._encryptedDehydratedDevice;
}
finish(dehydratedDevice) {
this._dehydratedDevice = dehydratedDevice;
this._finishStage();
}
}
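A rough sketch of how an application could drive the new AccountSetup stage; `askUserForKey` is a hypothetical prompt, and the exact `keyType` value accepted by `decrypt` is passed through rather than asserted here:

```ts
import {LoadStatus} from "./SessionContainer.js";

// hypothetical UI prompt returning what the user typed and which kind of key it is
declare function askUserForKey(): Promise<{keyType: string, credential: string}>;

function watchAccountSetup(sessionContainer: any): void {
    sessionContainer.loadStatus.subscribe(async (status: any) => {
        if (status === LoadStatus.AccountSetup) {
            const setup = sessionContainer.accountSetup;
            const {keyType, credential} = await askUserForKey();
            const dehydratedDevice = await setup.encryptedDehydratedDevice.decrypt(keyType, credential);
            setup.finish(dehydratedDevice); // or finish(undefined) to skip restoring the device
        }
    });
}
```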

View file

@ -16,7 +16,7 @@ limitations under the License.
*/
import {ObservableValue} from "../observable/ObservableValue";
import {createEnum} from "../utils/enum.js";
import {createEnum} from "../utils/enum";
const INCREMENTAL_TIMEOUT = 30000;
@ -464,7 +464,6 @@ class SessionSyncProcessState {
dispose() {
this.lock?.release();
this.preparation?.dispose();
}
}

View file

@ -22,6 +22,24 @@ const ACCOUNT_SESSION_KEY = SESSION_E2EE_KEY_PREFIX + "olmAccount";
const DEVICE_KEY_FLAG_SESSION_KEY = SESSION_E2EE_KEY_PREFIX + "areDeviceKeysUploaded";
const SERVER_OTK_COUNT_SESSION_KEY = SESSION_E2EE_KEY_PREFIX + "serverOTKCount";
async function initiallyStoreAccount(account, pickleKey, areDeviceKeysUploaded, serverOTKCount, storage) {
const pickledAccount = account.pickle(pickleKey);
const txn = await storage.readWriteTxn([
storage.storeNames.session
]);
try {
// add will throw if the key already exists
// we would not want to overwrite olmAccount here
txn.session.add(ACCOUNT_SESSION_KEY, pickledAccount);
txn.session.add(DEVICE_KEY_FLAG_SESSION_KEY, areDeviceKeysUploaded);
txn.session.add(SERVER_OTK_COUNT_SESSION_KEY, serverOTKCount);
} catch (err) {
txn.abort();
throw err;
}
await txn.complete();
}
export class Account {
static async load({olm, pickleKey, hsApi, userId, deviceId, olmWorker, txn}) {
const pickledAccount = await txn.session.get(ACCOUNT_SESSION_KEY);
@ -35,6 +53,21 @@ export class Account {
}
}
static async adoptDehydratedDevice({olm, dehydratedDevice, pickleKey, hsApi, userId, olmWorker, storage}) {
const account = dehydratedDevice.adoptUnpickledOlmAccount();
const oneTimeKeys = JSON.parse(account.one_time_keys());
// only one algorithm supported by olm atm, so hardcode its name
const oneTimeKeysEntries = Object.entries(oneTimeKeys.curve25519);
const serverOTKCount = oneTimeKeysEntries.length;
const areDeviceKeysUploaded = true;
await initiallyStoreAccount(account, pickleKey, areDeviceKeysUploaded, serverOTKCount, storage);
return new Account({
pickleKey, hsApi, account, userId,
deviceId: dehydratedDevice.deviceId,
areDeviceKeysUploaded, serverOTKCount, olm, olmWorker
});
}
static async create({olm, pickleKey, hsApi, userId, deviceId, olmWorker, storage}) {
const account = new olm.Account();
if (olmWorker) {
@ -43,24 +76,13 @@ export class Account {
account.create();
account.generate_one_time_keys(account.max_number_of_one_time_keys());
}
const pickledAccount = account.pickle(pickleKey);
const areDeviceKeysUploaded = false;
const txn = await storage.readWriteTxn([
storage.storeNames.session
]);
try {
// add will throw if the key already exists
// we would not want to overwrite olmAccount here
txn.session.add(ACCOUNT_SESSION_KEY, pickledAccount);
txn.session.add(DEVICE_KEY_FLAG_SESSION_KEY, areDeviceKeysUploaded);
txn.session.add(SERVER_OTK_COUNT_SESSION_KEY, 0);
} catch (err) {
txn.abort();
throw err;
const serverOTKCount = 0;
if (storage) {
await initiallyStoreAccount(account, pickleKey, areDeviceKeysUploaded, serverOTKCount, storage);
}
await txn.complete();
return new Account({pickleKey, hsApi, account, userId,
deviceId, areDeviceKeysUploaded, serverOTKCount: 0, olm, olmWorker});
deviceId, areDeviceKeysUploaded, serverOTKCount, olm, olmWorker});
}
constructor({pickleKey, hsApi, account, userId, deviceId, areDeviceKeysUploaded, serverOTKCount, olm, olmWorker}) {
@ -80,7 +102,11 @@ export class Account {
return this._identityKeys;
}
async uploadKeys(storage, log) {
setDeviceId(deviceId) {
this._deviceId = deviceId;
}
async uploadKeys(storage, isDehydratedDevice, log) {
const oneTimeKeys = JSON.parse(this._account.one_time_keys());
// only one algorithm supported by olm atm, so hardcode its name
const oneTimeKeysEntries = Object.entries(oneTimeKeys.curve25519);
@ -95,7 +121,8 @@ export class Account {
log.set("otks", true);
payload.one_time_keys = this._oneTimeKeysPayload(oneTimeKeysEntries);
}
const response = await this._hsApi.uploadKeys(payload, {log}).response();
const dehydratedDeviceId = isDehydratedDevice ? this._deviceId : undefined;
const response = await this._hsApi.uploadKeys(dehydratedDeviceId, payload, {log}).response();
this._serverOTKCount = response?.one_time_key_counts?.signed_curve25519;
log.set("serverOTKCount", this._serverOTKCount);
// TODO: should we not modify this in the txn like we do elsewhere?
@ -105,12 +132,12 @@ export class Account {
await this._updateSessionStorage(storage, sessionStore => {
if (oneTimeKeysEntries.length) {
this._account.mark_keys_as_published();
sessionStore.set(ACCOUNT_SESSION_KEY, this._account.pickle(this._pickleKey));
sessionStore.set(SERVER_OTK_COUNT_SESSION_KEY, this._serverOTKCount);
sessionStore?.set(ACCOUNT_SESSION_KEY, this._account.pickle(this._pickleKey));
sessionStore?.set(SERVER_OTK_COUNT_SESSION_KEY, this._serverOTKCount);
}
if (!this._areDeviceKeysUploaded) {
this._areDeviceKeysUploaded = true;
sessionStore.set(DEVICE_KEY_FLAG_SESSION_KEY, this._areDeviceKeysUploaded);
sessionStore?.set(DEVICE_KEY_FLAG_SESSION_KEY, this._areDeviceKeysUploaded);
}
});
}
@ -246,16 +273,20 @@ export class Account {
}
async _updateSessionStorage(storage, callback) {
const txn = await storage.readWriteTxn([
storage.storeNames.session
]);
try {
await callback(txn.session);
} catch (err) {
txn.abort();
throw err;
if (storage) {
const txn = await storage.readWriteTxn([
storage.storeNames.session
]);
try {
await callback(txn.session);
} catch (err) {
txn.abort();
throw err;
}
await txn.complete();
} else {
await callback(undefined);
}
await txn.complete();
}
signObject(obj) {
@ -273,4 +304,13 @@ export class Account {
obj.unsigned = unsigned;
}
}
pickleWithKey(key) {
return this._account.pickle(key);
}
dispose() {
this._account.free();
this._account = undefined;
}
}
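To show the effect of the now-optional `storage` argument, a hedged sketch; `withThrowawayAccount` is hypothetical and `deps` mirrors the options object used throughout this file:

```ts
import {Account} from "./Account.js";

async function withThrowawayAccount<T>(
    deps: {olm: any, pickleKey: string, hsApi: any, userId: string, olmWorker: any},
    use: (account: any) => Promise<T>
): Promise<T> {
    const account = await Account.create({
        ...deps,
        deviceId: "temp-device-id", // same placeholder id Session.setupDehydratedDevice uses
        storage: undefined,         // without storage, initiallyStoreAccount is skipped entirely
    });
    try {
        return await use(account);
    } finally {
        account.dispose();          // frees the underlying olm.Account
    }
}
```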

View file

@ -29,10 +29,10 @@ limitations under the License.
export class DecryptionResult {
constructor(event, senderCurve25519Key, claimedKeys) {
constructor(event, senderCurve25519Key, claimedEd25519Key) {
this.event = event;
this.senderCurve25519Key = senderCurve25519Key;
this.claimedEd25519Key = claimedKeys.ed25519;
this.claimedEd25519Key = claimedEd25519Key;
this._device = null;
this._roomTracked = true;
}

View file

@ -0,0 +1,115 @@
/*
Copyright 2021 The Matrix.org Foundation C.I.C.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
*/
import {KeyDescription} from "../ssss/common.js";
import {keyFromCredentialAndDescription} from "../ssss/index.js";
const DEHYDRATION_LIBOLM_PICKLE_ALGORITHM = "org.matrix.msc2697.v1.olm.libolm_pickle";
export async function getDehydratedDevice(hsApi, olm, platform, log) {
try {
const response = await hsApi.getDehydratedDevice({log}).response();
if (response.device_data.algorithm === DEHYDRATION_LIBOLM_PICKLE_ALGORITHM) {
return new EncryptedDehydratedDevice(response, olm, platform);
}
} catch (err) {
if (err.name !== "HomeServerError") {
log.error = err;
}
return undefined;
}
}
export async function uploadAccountAsDehydratedDevice(account, hsApi, key, deviceDisplayName, log) {
const response = await hsApi.createDehydratedDevice({
device_data: {
algorithm: DEHYDRATION_LIBOLM_PICKLE_ALGORITHM,
account: account.pickleWithKey(key.binaryKey.slice()),
passphrase: key.description?.passphraseParams || {},
},
initial_device_display_name: deviceDisplayName
}).response();
const deviceId = response.device_id;
account.setDeviceId(deviceId);
await account.uploadKeys(undefined, true, log);
return deviceId;
}
class EncryptedDehydratedDevice {
constructor(dehydratedDevice, olm, platform) {
this._dehydratedDevice = dehydratedDevice;
this._olm = olm;
this._platform = platform;
}
async decrypt(keyType, credential) {
const keyDescription = new KeyDescription("dehydrated_device", this._dehydratedDevice.device_data.passphrase);
const key = await keyFromCredentialAndDescription(keyType, credential, keyDescription, this._platform, this._olm);
const account = new this._olm.Account();
try {
const pickledAccount = this._dehydratedDevice.device_data.account;
account.unpickle(key.binaryKey.slice(), pickledAccount);
return new DehydratedDevice(this._dehydratedDevice, account, key);
} catch (err) {
account.free();
if (err.message === "OLM.BAD_ACCOUNT_KEY") {
return undefined;
} else {
throw err;
}
}
}
get deviceId() {
return this._dehydratedDevice.device_id;
}
}
class DehydratedDevice {
constructor(dehydratedDevice, account, key) {
this._dehydratedDevice = dehydratedDevice;
this._account = account;
this._key = key;
}
async claim(hsApi, log) {
try {
const response = await hsApi.claimDehydratedDevice(this.deviceId, {log}).response();
return response.success;
} catch (err) {
return false;
}
}
// make it clear that ownership is transferred upon calling this
adoptUnpickledOlmAccount() {
const account = this._account;
this._account = undefined;
return account;
}
get deviceId() {
return this._dehydratedDevice.device_id;
}
get key() {
return this._key;
}
dispose() {
this._account?.free();
this._account = undefined;
}
}
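Putting the pieces of this module together, a hedged sketch of the restore path; `tryRestoreDehydratedDevice` is hypothetical, and the `hsApi`, `olm`, `platform`, `log`, `keyType` and `credential` parameters mirror the ones used above:

```ts
import {getDehydratedDevice} from "./Dehydration.js";

async function tryRestoreDehydratedDevice(hsApi: any, olm: any, platform: any, log: any, keyType: string, credential: string) {
    const encrypted = await getDehydratedDevice(hsApi, olm, platform, log);
    if (!encrypted) {
        return undefined;               // nothing published, or an unsupported algorithm
    }
    const device = await encrypted.decrypt(keyType, credential);
    if (!device) {
        return undefined;               // wrong key (OLM.BAD_ACCOUNT_KEY)
    }
    if (!await device.claim(hsApi, log)) {
        device.dispose();               // someone else claimed it first
        return undefined;
    }
    return device;                      // caller can now adoptUnpickledOlmAccount()
}
```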

View file

@ -15,9 +15,9 @@ limitations under the License.
*/
import {MEGOLM_ALGORITHM, DecryptionSource} from "./common.js";
import {groupEventsBySession} from "./megolm/decryption/utils.js";
import {mergeMap} from "../../utils/mergeMap.js";
import {groupBy} from "../../utils/groupBy.js";
import {groupEventsBySession} from "./megolm/decryption/utils";
import {mergeMap} from "../../utils/mergeMap";
import {groupBy} from "../../utils/groupBy";
import {makeTxnId} from "../common.js";
const ENCRYPTED_TYPE = "m.room.encrypted";
@ -36,8 +36,6 @@ export class RoomEncryption {
this._megolmDecryption = megolmDecryption;
// content of the m.room.encryption event
this._encryptionParams = encryptionParams;
this._megolmBackfillCache = this._megolmDecryption.createSessionCache();
this._megolmSyncCache = this._megolmDecryption.createSessionCache(1);
// caches devices to verify events
this._senderDeviceCache = new Map();
this._storage = storage;
@ -51,7 +49,7 @@ export class RoomEncryption {
}
enableSessionBackup(sessionBackup) {
if (this._sessionBackup) {
if (this._sessionBackup && !!sessionBackup) {
return;
}
this._sessionBackup = sessionBackup;
@ -76,9 +74,6 @@ export class RoomEncryption {
}
notifyTimelineClosed() {
// empty the backfill cache when closing the timeline
this._megolmBackfillCache.dispose();
this._megolmBackfillCache = this._megolmDecryption.createSessionCache();
this._senderDeviceCache = new Map(); // purge the sender device cache
}
@ -112,27 +107,8 @@ export class RoomEncryption {
}
validEvents.push(event);
}
let customCache;
let sessionCache;
// we have different caches so we can keep them small but still
// have backfill and sync not invalidate each other
if (source === DecryptionSource.Sync) {
sessionCache = this._megolmSyncCache;
} else if (source === DecryptionSource.Timeline) {
sessionCache = this._megolmBackfillCache;
} else if (source === DecryptionSource.Retry) {
// when retrying, we could have mixed events from at the bottom of the timeline (sync)
// and somewhere else, so create a custom cache we use just for this operation.
customCache = this._megolmDecryption.createSessionCache();
sessionCache = customCache;
} else {
throw new Error("Unknown source: " + source);
}
const preparation = await this._megolmDecryption.prepareDecryptAll(
this._room.id, validEvents, newKeys, sessionCache, txn);
if (customCache) {
customCache.dispose();
}
this._room.id, validEvents, newKeys, txn);
return new DecryptionPreparation(preparation, errors, source, this, events);
}
@ -208,33 +184,27 @@ export class RoomEncryption {
try {
const session = await this._sessionBackup.getSession(this._room.id, sessionId, log);
if (session?.algorithm === MEGOLM_ALGORITHM) {
if (session["sender_key"] !== senderKey) {
log.set("wrong_sender_key", session["sender_key"]);
log.logLevel = log.level.Warn;
return;
}
let roomKey = this._megolmDecryption.roomKeyFromBackup(this._room.id, sessionId, session);
if (roomKey) {
if (roomKey.senderKey !== senderKey) {
log.set("wrong_sender_key", roomKey.senderKey);
log.logLevel = log.level.Warn;
return;
}
let keyIsBestOne = false;
let retryEventIds;
const txn = await this._storage.readWriteTxn([this._storage.storeNames.inboundGroupSessions]);
try {
const txn = await this._storage.readWriteTxn([this._storage.storeNames.inboundGroupSessions]);
try {
keyIsBestOne = await this._megolmDecryption.writeRoomKey(roomKey, txn);
log.set("isBetter", keyIsBestOne);
if (keyIsBestOne) {
retryEventIds = roomKey.eventIds;
}
} catch (err) {
txn.abort();
throw err;
keyIsBestOne = await this._megolmDecryption.writeRoomKey(roomKey, txn);
log.set("isBetter", keyIsBestOne);
if (keyIsBestOne) {
retryEventIds = roomKey.eventIds;
}
await txn.complete();
} finally {
// can still access properties on it afterwards
// this is just clearing the internal sessionInfo
roomKey.dispose();
} catch (err) {
txn.abort();
throw err;
}
await txn.complete();
if (keyIsBestOne) {
await log.wrap("retryDecryption", log => this._room.notifyRoomKey(roomKey, retryEventIds || [], log));
}
@ -466,8 +436,6 @@ export class RoomEncryption {
dispose() {
this._disposed = true;
this._megolmBackfillCache.dispose();
this._megolmSyncCache.dispose();
}
}

View file

@ -15,7 +15,7 @@ limitations under the License.
*/
import anotherjson from "../../../lib/another-json/index.js";
import {createEnum} from "../../utils/enum.js";
import {createEnum} from "../../utils/enum";
export const DecryptionSource = createEnum("Sync", "Timeline", "Retry");

View file

@ -15,23 +15,26 @@ limitations under the License.
*/
import {DecryptionError} from "../common.js";
import * as RoomKey from "./decryption/RoomKey.js";
import {SessionInfo} from "./decryption/SessionInfo.js";
import {DecryptionPreparation} from "./decryption/DecryptionPreparation.js";
import {SessionDecryption} from "./decryption/SessionDecryption.js";
import {SessionCache} from "./decryption/SessionCache.js";
import {SessionDecryption} from "./decryption/SessionDecryption";
import {MEGOLM_ALGORITHM} from "../common.js";
import {validateEvent, groupEventsBySession} from "./decryption/utils.js";
import {validateEvent, groupEventsBySession} from "./decryption/utils";
import {keyFromStorage, keyFromDeviceMessage, keyFromBackup} from "./decryption/RoomKey";
import type {RoomKey, IncomingRoomKey} from "./decryption/RoomKey";
import type {KeyLoader} from "./decryption/KeyLoader";
import type {OlmWorker} from "../OlmWorker";
import type {Transaction} from "../../storage/idb/Transaction";
import type {TimelineEvent} from "../../storage/types";
import type {DecryptionResult} from "../DecryptionResult";
import type {ILogItem} from "../../../logging/types";
export class Decryption {
constructor({pickleKey, olm, olmWorker}) {
this._pickleKey = pickleKey;
this._olm = olm;
this._olmWorker = olmWorker;
}
private keyLoader: KeyLoader;
private olmWorker?: OlmWorker;
createSessionCache(size) {
return new SessionCache(size);
constructor(keyLoader: KeyLoader, olmWorker: OlmWorker | undefined) {
this.keyLoader = keyLoader;
this.olmWorker = olmWorker;
}
async addMissingKeyEventIds(roomId, senderKey, sessionId, eventIds, txn) {
@ -75,9 +78,9 @@ export class Decryption {
* @param {Transaction} txn a storage transaction used to look up keys in inboundGroupSessions
* @return {DecryptionPreparation}
*/
async prepareDecryptAll(roomId, events, newKeys, sessionCache, txn) {
async prepareDecryptAll(roomId: string, events: TimelineEvent[], newKeys: IncomingRoomKey[] | undefined, txn: Transaction) {
const errors = new Map();
const validEvents = [];
const validEvents: TimelineEvent[] = [];
for (const event of events) {
if (validateEvent(event)) {
@ -89,11 +92,11 @@ export class Decryption {
const eventsBySession = groupEventsBySession(validEvents);
const sessionDecryptions = [];
const sessionDecryptions: SessionDecryption[] = [];
await Promise.all(Array.from(eventsBySession.values()).map(async group => {
const sessionInfo = await this._getSessionInfo(roomId, group.senderKey, group.sessionId, newKeys, sessionCache, txn);
if (sessionInfo) {
sessionDecryptions.push(new SessionDecryption(sessionInfo, group.events, this._olmWorker));
const key = await this.getRoomKey(roomId, group.senderKey!, group.sessionId!, newKeys, txn);
if (key) {
sessionDecryptions.push(new SessionDecryption(key, group.events, this.olmWorker, this.keyLoader));
} else {
for (const event of group.events) {
errors.set(event.event_id, new DecryptionError("MEGOLM_NO_SESSION", event));
@ -104,63 +107,43 @@ export class Decryption {
return new DecryptionPreparation(roomId, sessionDecryptions, errors);
}
async _getSessionInfo(roomId, senderKey, sessionId, newKeys, sessionCache, txn) {
let sessionInfo;
private async getRoomKey(roomId: string, senderKey: string, sessionId: string, newKeys: IncomingRoomKey[] | undefined, txn: Transaction): Promise<RoomKey | undefined> {
if (newKeys) {
const key = newKeys.find(k => k.roomId === roomId && k.senderKey === senderKey && k.sessionId === sessionId);
if (key) {
sessionInfo = await key.createSessionInfo(this._olm, this._pickleKey, txn);
if (sessionInfo) {
sessionCache.add(sessionInfo);
}
const key = newKeys.find(k => k.isForSession(roomId, senderKey, sessionId));
if (key && await key.checkBetterThanKeyInStorage(this.keyLoader, txn)) {
return key;
}
}
// look in the cache only after looking into newKeys as it may contain keys that are better
if (!sessionInfo) {
sessionInfo = sessionCache.get(roomId, senderKey, sessionId);
const cachedKey = this.keyLoader.getCachedKey(roomId, senderKey, sessionId);
if (cachedKey) {
return cachedKey;
}
if (!sessionInfo) {
const sessionEntry = await txn.inboundGroupSessions.get(roomId, senderKey, sessionId);
if (sessionEntry && sessionEntry.session) {
let session = new this._olm.InboundGroupSession();
try {
session.unpickle(this._pickleKey, sessionEntry.session);
sessionInfo = new SessionInfo(roomId, senderKey, session, sessionEntry.claimedKeys);
} catch (err) {
session.free();
throw err;
}
sessionCache.add(sessionInfo);
}
const storageKey = await keyFromStorage(roomId, senderKey, sessionId, txn);
if (storageKey && storageKey.serializationKey) {
return storageKey;
}
return sessionInfo;
}
/**
* Writes the key as an inbound group session if there is not already a better key in the store
* @param {RoomKey} key
* @param {Transaction} txn a storage transaction with read/write on inboundGroupSessions
* @return {Promise<boolean>} whether the key was the best for the session id and was written
*/
writeRoomKey(key, txn) {
return key.write(this._olm, this._pickleKey, txn);
writeRoomKey(key: IncomingRoomKey, txn: Transaction): Promise<boolean> {
return key.write(this.keyLoader, txn);
}
/**
* Extracts room keys from decrypted device messages.
* The key won't be persisted yet, you need to call RoomKey.write for that.
*
* @param {Array<OlmDecryptionResult>} decryptionResults, any non-megolm m.room_key messages will be ignored.
* @return {Array<RoomKey>} an array with validated RoomKey's. Note that it is possible we already have a better version of this key in storage though; writing the key will tell you so.
*/
roomKeysFromDeviceMessages(decryptionResults, log) {
let keys = [];
roomKeysFromDeviceMessages(decryptionResults: DecryptionResult[], log: ILogItem): IncomingRoomKey[] {
const keys: IncomingRoomKey[] = [];
for (const dr of decryptionResults) {
if (dr.event?.type !== "m.room_key" || dr.event.content?.algorithm !== MEGOLM_ALGORITHM) {
continue;
}
log.wrap("room_key", log => {
const key = RoomKey.fromDeviceMessage(dr);
const key = keyFromDeviceMessage(dr);
if (key) {
log.set("roomId", key.roomId);
log.set("id", key.sessionId);
@ -174,8 +157,11 @@ export class Decryption {
return keys;
}
roomKeyFromBackup(roomId, sessionId, sessionInfo) {
return RoomKey.fromBackup(roomId, sessionId, sessionInfo);
roomKeyFromBackup(roomId: string, sessionId: string, sessionInfo: string): IncomingRoomKey | undefined {
return keyFromBackup(roomId, sessionId, sessionInfo);
}
dispose() {
this.keyLoader.dispose();
}
}
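Following the writeRoomKey contract documented above, a hedged sketch of persisting keys extracted from to-device messages; `persistIncomingKeys` is hypothetical, and `decryption`, `storage` and `log` stand for the instances used elsewhere:

```ts
async function persistIncomingKeys(decryption: any, storage: any, decryptionResults: any[], log: any): Promise<void> {
    const keys = decryption.roomKeysFromDeviceMessages(decryptionResults, log);
    const txn = await storage.readWriteTxn([storage.storeNames.inboundGroupSessions]);
    try {
        for (const key of keys) {
            // resolves to false if storage already holds a better key for this session
            await decryption.writeRoomKey(key, txn);
        }
    } catch (err) {
        txn.abort();
        throw err;
    }
    await txn.complete();
}
```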

View file

@ -15,7 +15,7 @@ limitations under the License.
*/
import {DecryptionChanges} from "./DecryptionChanges.js";
import {mergeMap} from "../../../../utils/mergeMap.js";
import {mergeMap} from "../../../../utils/mergeMap";
/**
* Class that contains all the state loaded from storage to decrypt the given events

View file

@ -0,0 +1,434 @@
/*
Copyright 2021 The Matrix.org Foundation C.I.C.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
*/
import {isBetterThan, IncomingRoomKey} from "./RoomKey";
import {BaseLRUCache} from "../../../../utils/LRUCache";
import type {RoomKey} from "./RoomKey";
export declare class OlmDecryptionResult {
readonly plaintext: string;
readonly message_index: number;
}
export declare class OlmInboundGroupSession {
constructor();
free(): void;
pickle(key: string | Uint8Array): string;
unpickle(key: string | Uint8Array, pickle: string);
create(session_key: string): string;
import_session(session_key: string): string;
decrypt(message: string): OlmDecryptionResult;
session_id(): string;
first_known_index(): number;
export_session(message_index: number): string;
}
/*
Because Olm only has very limited memory available when compiled to wasm,
we limit the number of sessions held in memory.
*/
export class KeyLoader extends BaseLRUCache<KeyOperation> {
private pickleKey: string;
private olm: any;
private resolveUnusedOperation?: () => void;
private operationBecomesUnusedPromise?: Promise<void>;
constructor(olm: any, pickleKey: string, limit: number) {
super(limit);
this.pickleKey = pickleKey;
this.olm = olm;
}
getCachedKey(roomId: string, senderKey: string, sessionId: string): RoomKey | undefined {
const idx = this.findCachedKeyIndex(roomId, senderKey, sessionId);
if (idx !== -1) {
return this._getByIndexAndMoveUp(idx)!.key;
}
}
async useKey<T>(key: RoomKey, callback: (session: OlmInboundGroupSession, pickleKey: string) => Promise<T> | T): Promise<T> {
const keyOp = await this.allocateOperation(key);
try {
return await callback(keyOp.session, this.pickleKey);
} finally {
this.releaseOperation(keyOp);
}
}
get running() {
return this._entries.some(op => op.refCount !== 0);
}
dispose() {
for (let i = 0; i < this._entries.length; i += 1) {
this._entries[i].dispose();
}
// remove all entries
this._entries.splice(0, this._entries.length);
}
private async allocateOperation(key: RoomKey): Promise<KeyOperation> {
let idx;
while((idx = this.findIndexForAllocation(key)) === -1) {
await this.operationBecomesUnused();
}
if (idx < this.size) {
const op = this._getByIndexAndMoveUp(idx)!;
// cache hit
if (op.isForKey(key)) {
op.refCount += 1;
return op;
} else {
// refCount should be 0 here
op.refCount = 1;
op.key = key;
key.loadInto(op.session, this.pickleKey);
}
return op;
} else {
// create new operation
const session = new this.olm.InboundGroupSession();
key.loadInto(session, this.pickleKey);
const op = new KeyOperation(key, session);
this._set(op);
return op;
}
}
private releaseOperation(op: KeyOperation) {
op.refCount -= 1;
if (op.refCount <= 0 && this.resolveUnusedOperation) {
this.resolveUnusedOperation();
// promise is resolved now, we'll need a new one for next await so clear
this.operationBecomesUnusedPromise = this.resolveUnusedOperation = undefined;
}
}
private operationBecomesUnused(): Promise<void> {
if (!this.operationBecomesUnusedPromise) {
this.operationBecomesUnusedPromise = new Promise(resolve => {
this.resolveUnusedOperation = resolve;
});
}
return this.operationBecomesUnusedPromise;
}
private findIndexForAllocation(key: RoomKey) {
let idx = this.findIndexSameKey(key); // cache hit
if (idx === -1) {
if (this.size < this.limit) {
idx = this.size;
} else {
idx = this.findIndexSameSessionUnused(key);
if (idx === -1) {
idx = this.findIndexOldestUnused();
}
}
}
return idx;
}
private findCachedKeyIndex(roomId: string, senderKey: string, sessionId: string): number {
return this._entries.reduce((bestIdx, op, i, arr) => {
const bestOp = bestIdx === -1 ? undefined : arr[bestIdx];
// only operations that are the "best" for their session can be used, see comment on isBest
if (op.isBest === true && op.isForSameSession(roomId, senderKey, sessionId)) {
if (!bestOp || op.isBetter(bestOp)) {
return i;
}
}
return bestIdx;
}, -1);
}
private findIndexSameKey(key: RoomKey): number {
return this._entries.findIndex(op => {
return op.isForSameSession(key.roomId, key.senderKey, key.sessionId) && op.isForKey(key);
});
}
private findIndexSameSessionUnused(key: RoomKey): number {
return this._entries.reduce((worstIdx, op, i, arr) => {
const worst = worstIdx === -1 ? undefined : arr[worstIdx];
// we try to pick the worst operation to overwrite, so the best one stays in the cache
if (op.refCount === 0 && op.isForSameSession(key.roomId, key.senderKey, key.sessionId)) {
if (!worst || !op.isBetter(worst)) {
return i;
}
}
return worstIdx;
}, -1);
}
private findIndexOldestUnused(): number {
for (let i = this._entries.length - 1; i >= 0; i -= 1) {
const op = this._entries[i];
if (op.refCount === 0) {
return i;
}
}
return -1;
}
}
class KeyOperation {
session: OlmInboundGroupSession;
key: RoomKey;
refCount: number;
constructor(key: RoomKey, session: OlmInboundGroupSession) {
this.key = key;
this.session = session;
this.refCount = 1;
}
isForSameSession(roomId: string, senderKey: string, sessionId: string): boolean {
return this.key.roomId === roomId && this.key.senderKey === senderKey && this.key.sessionId === sessionId;
}
// assumes isForSameSession is true
isBetter(other: KeyOperation) {
return isBetterThan(this.session, other.session);
}
isForKey(key: RoomKey) {
return this.key.serializationKey === key.serializationKey &&
this.key.serializationType === key.serializationType;
}
dispose() {
this.session.free();
this.session = undefined as any;
}
/** returns whether the key for this operation has been checked at some point against storage
* and was determined to be the better key, undefined if it hasn't been checked yet.
* Only keys that are known to be the best for their session can be returned by getCachedKey,
* as a cache hit usually does not check storage for a better session. Also see RoomKey.isBetter. */
get isBest(): boolean | undefined {
return this.key.isBetter;
}
}
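A small hedged sketch of the useKey contract (the tests below exercise it more thoroughly); `exportFirstKnownIndex` is hypothetical:

```ts
import {KeyLoader} from "./KeyLoader";
import type {RoomKey} from "./RoomKey";

// the loader pins the unpickled session for the duration of the callback,
// so all olm calls have to happen inside it
async function exportFirstKnownIndex(loader: KeyLoader, key: RoomKey): Promise<string> {
    return loader.useKey(key, session => session.export_session(session.first_known_index()));
}
```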
export function tests() {
let instances = 0;
class MockRoomKey extends IncomingRoomKey {
private _roomId: string;
private _senderKey: string;
private _sessionId: string;
private _firstKnownIndex: number;
constructor(roomId: string, senderKey: string, sessionId: string, firstKnownIndex: number) {
super();
this._roomId = roomId;
this._senderKey = senderKey;
this._sessionId = sessionId;
this._firstKnownIndex = firstKnownIndex;
}
get roomId(): string { return this._roomId; }
get senderKey(): string { return this._senderKey; }
get sessionId(): string { return this._sessionId; }
get claimedEd25519Key(): string { return "claimedEd25519Key"; }
get serializationKey(): string { return `key-${this.sessionId}-${this._firstKnownIndex}`; }
get serializationType(): string { return "type"; }
get eventIds(): string[] | undefined { return undefined; }
loadInto(session: OlmInboundGroupSession) {
const mockSession = session as MockInboundSession;
mockSession.sessionId = this.sessionId;
mockSession.firstKnownIndex = this._firstKnownIndex;
}
}
class MockInboundSession {
public sessionId: string = "";
public firstKnownIndex: number = 0;
constructor() {
instances += 1;
}
free(): void { instances -= 1; }
pickle(key: string | Uint8Array): string { return `${this.sessionId}-pickled-session`; }
unpickle(key: string | Uint8Array, pickle: string) {}
create(session_key: string): string { return `${this.sessionId}-created-session`; }
import_session(session_key: string): string { return ""; }
decrypt(message: string): OlmDecryptionResult { return {} as OlmDecryptionResult; }
session_id(): string { return this.sessionId; }
first_known_index(): number { return this.firstKnownIndex; }
export_session(message_index: number): string { return `${this.sessionId}-exported-session`; }
}
const PICKLE_KEY = "🥒🔑";
const olm = {InboundGroupSession: MockInboundSession};
const roomId = "!abc:hs.tld";
const aliceSenderKey = "abc";
const bobSenderKey = "def";
const sessionId1 = "s123";
const sessionId2 = "s456";
return {
"load key gives correct session": async assert => {
const loader = new KeyLoader(olm, PICKLE_KEY, 2);
let callback1Called = false;
let callback2Called = false;
const p1 = loader.useKey(new MockRoomKey(roomId, aliceSenderKey, sessionId1, 1), async session => {
callback1Called = true;
assert.equal(session.session_id(), sessionId1);
assert.equal(session.first_known_index(), 1);
await Promise.resolve(); // make sure they are busy in parallel
});
const p2 = loader.useKey(new MockRoomKey(roomId, aliceSenderKey, sessionId2, 2), async session => {
callback2Called = true;
assert.equal(session.session_id(), sessionId2);
assert.equal(session.first_known_index(), 2);
await Promise.resolve(); // make sure they are busy in parallel
});
assert.equal(loader.size, 2);
await Promise.all([p1, p2]);
assert(callback1Called);
assert(callback2Called);
},
"keys with different first index are kept separate": async assert => {
const loader = new KeyLoader(olm, PICKLE_KEY, 2);
let callback1Called = false;
let callback2Called = false;
const p1 = loader.useKey(new MockRoomKey(roomId, aliceSenderKey, sessionId1, 1), async session => {
callback1Called = true;
assert.equal(session.session_id(), sessionId1);
assert.equal(session.first_known_index(), 1);
await Promise.resolve(); // make sure they are busy in parallel
});
const p2 = loader.useKey(new MockRoomKey(roomId, aliceSenderKey, sessionId1, 2), async session => {
callback2Called = true;
assert.equal(session.session_id(), sessionId1);
assert.equal(session.first_known_index(), 2);
await Promise.resolve(); // make sure they are busy in parallel
});
assert.equal(loader.size, 2);
await Promise.all([p1, p2]);
assert(callback1Called);
assert(callback2Called);
},
"useKey blocks as long as no free sessions are available": async assert => {
const loader = new KeyLoader(olm, PICKLE_KEY, 1);
let resolve;
let callbackCalled = false;
loader.useKey(new MockRoomKey(roomId, aliceSenderKey, sessionId1, 1), async session => {
await new Promise(r => resolve = r);
});
await Promise.resolve();
assert.equal(loader.size, 1);
const promise = loader.useKey(new MockRoomKey(roomId, aliceSenderKey, sessionId2, 1), session => {
callbackCalled = true;
});
assert.equal(callbackCalled, false);
resolve();
await promise;
assert.equal(callbackCalled, true);
},
"cache hit while key in use, then replace (check refCount works properly)": async assert => {
const loader = new KeyLoader(olm, PICKLE_KEY, 1);
let resolve1, resolve2;
const key1 = new MockRoomKey(roomId, aliceSenderKey, sessionId1, 1);
const p1 = loader.useKey(key1, async session => {
await new Promise(r => resolve1 = r);
});
const p2 = loader.useKey(key1, async session => {
await new Promise(r => resolve2 = r);
});
await Promise.resolve();
assert.equal(loader.size, 1);
assert.equal(loader.running, true);
resolve1();
await p1;
assert.equal(loader.running, true);
resolve2();
await p2;
assert.equal(loader.running, false);
let callbackCalled = false;
await loader.useKey(new MockRoomKey(roomId, aliceSenderKey, sessionId2, 1), async session => {
callbackCalled = true;
assert.equal(session.session_id(), sessionId2);
assert.equal(session.first_known_index(), 1);
});
assert.equal(loader.size, 1);
assert.equal(callbackCalled, true);
},
"cache hit while key not in use": async assert => {
const loader = new KeyLoader(olm, PICKLE_KEY, 2);
let resolve1, resolve2, invocations = 0;
const key1 = new MockRoomKey(roomId, aliceSenderKey, sessionId1, 1);
await loader.useKey(key1, async session => { invocations += 1; });
key1.isBetter = true;
assert.equal(loader.size, 1);
const cachedKey = loader.getCachedKey(roomId, aliceSenderKey, sessionId1)!;
assert.equal(cachedKey, key1);
await loader.useKey(cachedKey, async session => { invocations += 1; });
assert.equal(loader.size, 1);
assert.equal(invocations, 2);
},
"dispose calls free on all sessions": async assert => {
instances = 0;
const loader = new KeyLoader(olm, PICKLE_KEY, 2);
await loader.useKey(new MockRoomKey(roomId, aliceSenderKey, sessionId1, 1), async session => {});
await loader.useKey(new MockRoomKey(roomId, aliceSenderKey, sessionId2, 1), async session => {});
assert.equal(instances, 2);
assert.equal(loader.size, 2);
loader.dispose();
assert.strictEqual(instances, 0, "instances");
assert.strictEqual(loader.size, 0, "loader.size");
},
"checkBetterThanKeyInStorage false with cache": async assert => {
const loader = new KeyLoader(olm, PICKLE_KEY, 2);
const key1 = new MockRoomKey(roomId, aliceSenderKey, sessionId1, 2);
await loader.useKey(key1, async session => {});
// fake we've checked with storage that this is the best key,
// and as long as it remains the best key with newly added keys,
// it will be returned from getCachedKey (as called from checkBetterThanKeyInStorage)
key1.isBetter = true;
const key2 = new MockRoomKey(roomId, aliceSenderKey, sessionId1, 3);
// this will hit cache of key 1 so we pass in null as txn
const isBetter = await key2.checkBetterThanKeyInStorage(loader, null as any);
assert.strictEqual(isBetter, false);
assert.strictEqual(key2.isBetter, false);
},
"checkBetterThanKeyInStorage true with cache": async assert => {
const loader = new KeyLoader(olm, PICKLE_KEY, 2);
const key1 = new MockRoomKey(roomId, aliceSenderKey, sessionId1, 2);
key1.isBetter = true; // fake that we've checked with storage so far (not including key2) that this is the best key
await loader.useKey(key1, async session => {});
const key2 = new MockRoomKey(roomId, aliceSenderKey, sessionId1, 1);
// this will hit cache of key 1 so we pass in null as txn
const isBetter = await key2.checkBetterThanKeyInStorage(loader, null as any);
assert.strictEqual(isBetter, true);
assert.strictEqual(key2.isBetter, true);
},
"prefer to remove worst key for a session from cache": async assert => {
const loader = new KeyLoader(olm, PICKLE_KEY, 2);
const key1 = new MockRoomKey(roomId, aliceSenderKey, sessionId1, 2);
await loader.useKey(key1, async session => {});
key1.isBetter = true; // set to true just so it gets returned from getCachedKey
const key2 = new MockRoomKey(roomId, aliceSenderKey, sessionId1, 4);
await loader.useKey(key2, async session => {});
const key3 = new MockRoomKey(roomId, aliceSenderKey, sessionId1, 3);
await loader.useKey(key3, async session => {});
assert.strictEqual(loader.getCachedKey(roomId, aliceSenderKey, sessionId1), key1);
},
}
}

View file

@ -14,11 +14,24 @@ See the License for the specific language governing permissions and
limitations under the License.
*/
import type {TimelineEvent} from "../../../storage/types";
export class ReplayDetectionEntry {
constructor(sessionId, messageIndex, event) {
public readonly sessionId: string;
public readonly messageIndex: number;
public readonly event: TimelineEvent;
constructor(sessionId: string, messageIndex: number, event: TimelineEvent) {
this.sessionId = sessionId;
this.messageIndex = messageIndex;
this.eventId = event.event_id;
this.timestamp = event.origin_server_ts;
this.event = event;
}
get eventId(): string {
return this.event.event_id;
}
get timestamp(): number {
return this.event.origin_server_ts;
}
}
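A brief hedged sketch of constructing an entry; `entryFor` is hypothetical and `event` is assumed to be the decrypted timeline event the megolm message belongs to:

```ts
import {ReplayDetectionEntry} from "./ReplayDetectionEntry";
import type {TimelineEvent} from "../../../storage/types";

function entryFor(sessionId: string, messageIndex: number, event: TimelineEvent): ReplayDetectionEntry {
    // eventId and timestamp are now derived from the stored event instead of being copied in the constructor
    return new ReplayDetectionEntry(sessionId, messageIndex, event);
}
```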

View file

@ -1,166 +0,0 @@
import {SessionInfo} from "./SessionInfo.js";
export class BaseRoomKey {
constructor() {
this._sessionInfo = null;
this._isBetter = null;
this._eventIds = null;
}
async createSessionInfo(olm, pickleKey, txn) {
if (this._isBetter === false) {
return;
}
const session = new olm.InboundGroupSession();
try {
this._loadSessionKey(session);
this._isBetter = await this._isBetterThanKnown(session, olm, pickleKey, txn);
if (this._isBetter) {
const claimedKeys = {ed25519: this.claimedEd25519Key};
this._sessionInfo = new SessionInfo(this.roomId, this.senderKey, session, claimedKeys);
// retain the session so we don't have to create a new session during write.
this._sessionInfo.retain();
return this._sessionInfo;
} else {
session.free();
return;
}
} catch (err) {
this._sessionInfo = null;
session.free();
throw err;
}
}
async _isBetterThanKnown(session, olm, pickleKey, txn) {
let isBetter = true;
// TODO: we could potentially have a small speedup here if we looked first in the SessionCache here...
const existingSessionEntry = await txn.inboundGroupSessions.get(this.roomId, this.senderKey, this.sessionId);
if (existingSessionEntry?.session) {
const existingSession = new olm.InboundGroupSession();
try {
existingSession.unpickle(pickleKey, existingSessionEntry.session);
isBetter = session.first_known_index() < existingSession.first_known_index();
} finally {
existingSession.free();
}
}
// store the event ids that can be decrypted with this key
// before we overwrite them if called from `write`.
if (existingSessionEntry?.eventIds) {
this._eventIds = existingSessionEntry.eventIds;
}
return isBetter;
}
async write(olm, pickleKey, txn) {
// we checked already and we had a better session in storage, so don't write
if (this._isBetter === false) {
return false;
}
if (!this._sessionInfo) {
await this.createSessionInfo(olm, pickleKey, txn);
}
if (this._sessionInfo) {
const session = this._sessionInfo.session;
const sessionEntry = {
roomId: this.roomId,
senderKey: this.senderKey,
sessionId: this.sessionId,
session: session.pickle(pickleKey),
claimedKeys: this._sessionInfo.claimedKeys,
};
txn.inboundGroupSessions.set(sessionEntry);
this.dispose();
return true;
}
return false;
}
get eventIds() {
return this._eventIds;
}
dispose() {
if (this._sessionInfo) {
this._sessionInfo.release();
this._sessionInfo = null;
}
}
}
class DeviceMessageRoomKey extends BaseRoomKey {
constructor(decryptionResult) {
super();
this._decryptionResult = decryptionResult;
}
get roomId() { return this._decryptionResult.event.content?.["room_id"]; }
get senderKey() { return this._decryptionResult.senderCurve25519Key; }
get sessionId() { return this._decryptionResult.event.content?.["session_id"]; }
get claimedEd25519Key() { return this._decryptionResult.claimedEd25519Key; }
_loadSessionKey(session) {
const sessionKey = this._decryptionResult.event.content?.["session_key"];
session.create(sessionKey);
}
}
class BackupRoomKey extends BaseRoomKey {
constructor(roomId, sessionId, backupInfo) {
super();
this._roomId = roomId;
this._sessionId = sessionId;
this._backupInfo = backupInfo;
}
get roomId() { return this._roomId; }
get senderKey() { return this._backupInfo["sender_key"]; }
get sessionId() { return this._sessionId; }
get claimedEd25519Key() { return this._backupInfo["sender_claimed_keys"]?.["ed25519"]; }
_loadSessionKey(session) {
const sessionKey = this._backupInfo["session_key"];
session.import_session(sessionKey);
}
}
export function fromDeviceMessage(dr) {
const roomId = dr.event.content?.["room_id"];
const sessionId = dr.event.content?.["session_id"];
const sessionKey = dr.event.content?.["session_key"];
if (
typeof roomId === "string" ||
typeof sessionId === "string" ||
typeof senderKey === "string" ||
typeof sessionKey === "string"
) {
return new DeviceMessageRoomKey(dr);
}
}
/*
sessionInfo is a response from key backup and has the following keys:
algorithm
forwarding_curve25519_key_chain
sender_claimed_keys
sender_key
session_key
*/
export function fromBackup(roomId, sessionId, sessionInfo) {
const sessionKey = sessionInfo["session_key"];
const senderKey = sessionInfo["sender_key"];
// TODO: can we just trust this?
const claimedEd25519Key = sessionInfo["sender_claimed_keys"]?.["ed25519"];
if (
typeof roomId === "string" &&
typeof sessionId === "string" &&
typeof senderKey === "string" &&
typeof sessionKey === "string" &&
typeof claimedEd25519Key === "string"
) {
return new BackupRoomKey(roomId, sessionId, sessionInfo);
}
}

View file

@ -0,0 +1,245 @@
/*
Copyright 2021 The Matrix.org Foundation C.I.C.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
*/
import type {InboundGroupSessionEntry} from "../../../storage/idb/stores/InboundGroupSessionStore";
import type {Transaction} from "../../../storage/idb/Transaction";
import type {DecryptionResult} from "../../DecryptionResult";
import type {KeyLoader, OlmInboundGroupSession} from "./KeyLoader";
export abstract class RoomKey {
private _isBetter: boolean | undefined;
isForSession(roomId: string, senderKey: string, sessionId: string) {
return this.roomId === roomId && this.senderKey === senderKey && this.sessionId === sessionId;
}
abstract get roomId(): string;
abstract get senderKey(): string;
abstract get sessionId(): string;
abstract get claimedEd25519Key(): string;
abstract get serializationKey(): string;
abstract get serializationType(): string;
abstract get eventIds(): string[] | undefined;
abstract loadInto(session: OlmInboundGroupSession, pickleKey: string): void;
/* Whether the key has been checked against storage (or is from storage)
* to be the better key for a given session. Given that all keys are checked to be better
* as part of writing, we can trust that when this returns true, it really is the best key
* available between storage and cached keys in memory. This is why keys with this field set to
* true are used by the key loader to return cached keys. Also see KeyOperation.isBest there. */
get isBetter(): boolean | undefined { return this._isBetter; }
// should only be set in key.checkBetterThanKeyInStorage
set isBetter(value: boolean | undefined) { this._isBetter = value; }
}
export function isBetterThan(newSession: OlmInboundGroupSession, existingSession: OlmInboundGroupSession) {
return newSession.first_known_index() < existingSession.first_known_index();
}
export abstract class IncomingRoomKey extends RoomKey {
private _eventIds?: string[];
checkBetterThanKeyInStorage(loader: KeyLoader, txn: Transaction): Promise<boolean> {
return this._checkBetterThanKeyInStorage(loader, undefined, txn);
}
async write(loader: KeyLoader, txn: Transaction): Promise<boolean> {
// we checked already and we had a better session in storage, so don't write
let pickledSession;
if (this.isBetter === undefined) {
// if this key wasn't used to decrypt any messages in the same sync,
// we haven't checked if this is the best key yet,
// so do that now to not overwrite a better key.
// while we have the key deserialized, also pickle it so we can store it below.
await this._checkBetterThanKeyInStorage(loader, (session, pickleKey) => {
pickledSession = session.pickle(pickleKey);
}, txn);
}
if (this.isBetter === false) {
return false;
}
// before calling write in parallel, we need to check loader.running is false so we are sure our transaction will not be closed
if (!pickledSession) {
pickledSession = await loader.useKey(this, (session, pickleKey) => session.pickle(pickleKey));
}
const sessionEntry = {
roomId: this.roomId,
senderKey: this.senderKey,
sessionId: this.sessionId,
session: pickledSession,
claimedKeys: {"ed25519": this.claimedEd25519Key},
};
txn.inboundGroupSessions.set(sessionEntry);
return true;
}
get eventIds() { return this._eventIds; }
private async _checkBetterThanKeyInStorage(loader: KeyLoader, callback: (((session: OlmInboundGroupSession, pickleKey: string) => void) | undefined), txn: Transaction): Promise<boolean> {
if (this.isBetter !== undefined) {
return this.isBetter;
}
let existingKey = loader.getCachedKey(this.roomId, this.senderKey, this.sessionId);
if (!existingKey) {
const storageKey = await keyFromStorage(this.roomId, this.senderKey, this.sessionId, txn);
// store the event ids that can be decrypted with this key
// before we overwrite them if called from `write`.
if (storageKey) {
if (storageKey.hasSession) {
existingKey = storageKey;
} else if (storageKey.eventIds) {
this._eventIds = storageKey.eventIds;
}
}
}
if (existingKey) {
const key = existingKey;
await loader.useKey(this, async newSession => {
await loader.useKey(key, (existingSession, pickleKey) => {
// set isBetter as soon as possible, on both keys compared,
// as it is used to determine whether a key can be used for the cache
this.isBetter = isBetterThan(newSession, existingSession);
key.isBetter = !this.isBetter;
if (this.isBetter && callback) {
callback(newSession, pickleKey);
}
});
});
} else {
// no previous key, so we're the best \o/
this.isBetter = true;
}
return this.isBetter!;
}
}
class DeviceMessageRoomKey extends IncomingRoomKey {
private _decryptionResult: DecryptionResult;
constructor(decryptionResult: DecryptionResult) {
super();
this._decryptionResult = decryptionResult;
}
get roomId() { return this._decryptionResult.event.content?.["room_id"]; }
get senderKey() { return this._decryptionResult.senderCurve25519Key; }
get sessionId() { return this._decryptionResult.event.content?.["session_id"]; }
get claimedEd25519Key() { return this._decryptionResult.claimedEd25519Key; }
get serializationKey(): string { return this._decryptionResult.event.content?.["session_key"]; }
get serializationType(): string { return "create"; }
loadInto(session) {
session.create(this.serializationKey);
}
}
class BackupRoomKey extends IncomingRoomKey {
private _roomId: string;
private _sessionId: string;
private _backupInfo: string;
constructor(roomId, sessionId, backupInfo) {
super();
this._roomId = roomId;
this._sessionId = sessionId;
this._backupInfo = backupInfo;
}
get roomId() { return this._roomId; }
get senderKey() { return this._backupInfo["sender_key"]; }
get sessionId() { return this._sessionId; }
get claimedEd25519Key() { return this._backupInfo["sender_claimed_keys"]?.["ed25519"]; }
get serializationKey(): string { return this._backupInfo["session_key"]; }
get serializationType(): string { return "import_session"; }
loadInto(session) {
session.import_session(this.serializationKey);
}
}
class StoredRoomKey extends RoomKey {
private storageEntry: InboundGroupSessionEntry;
constructor(storageEntry: InboundGroupSessionEntry) {
super();
this.isBetter = true; // usually the key in storage is the best until checks prove otherwise
this.storageEntry = storageEntry;
}
get roomId() { return this.storageEntry.roomId; }
get senderKey() { return this.storageEntry.senderKey; }
get sessionId() { return this.storageEntry.sessionId; }
get claimedEd25519Key() { return this.storageEntry.claimedKeys!["ed25519"]; }
get eventIds() { return this.storageEntry.eventIds; }
get serializationKey(): string { return this.storageEntry.session || ""; }
get serializationType(): string { return "unpickle"; }
loadInto(session, pickleKey) {
session.unpickle(pickleKey, this.serializationKey);
}
get hasSession() {
// a session entry can be stored before the session itself is received,
// to keep track of the events that need it for decryption.
// This is used to retry decrypting those events once the session arrives.
return !!this.serializationKey;
}
}
export function keyFromDeviceMessage(dr: DecryptionResult): DeviceMessageRoomKey | undefined {
const sessionKey = dr.event.content?.["session_key"];
const key = new DeviceMessageRoomKey(dr);
if (
typeof key.roomId === "string" &&
typeof key.sessionId === "string" &&
typeof key.senderKey === "string" &&
typeof sessionKey === "string"
) {
return key;
}
}
/*
sessionInfo is a response from key backup and has the following keys:
algorithm
forwarding_curve25519_key_chain
sender_claimed_keys
sender_key
session_key
*/
export function keyFromBackup(roomId, sessionId, backupInfo): BackupRoomKey | undefined {
const sessionKey = backupInfo["session_key"];
const senderKey = backupInfo["sender_key"];
// TODO: can we just trust this?
const claimedEd25519Key = backupInfo["sender_claimed_keys"]?.["ed25519"];
if (
typeof roomId === "string" &&
typeof sessionId === "string" &&
typeof senderKey === "string" &&
typeof sessionKey === "string" &&
typeof claimedEd25519Key === "string"
) {
return new BackupRoomKey(roomId, sessionId, backupInfo);
}
}
export async function keyFromStorage(roomId: string, senderKey: string, sessionId: string, txn: Transaction): Promise<StoredRoomKey | undefined> {
const existingSessionEntry = await txn.inboundGroupSessions.get(roomId, senderKey, sessionId);
if (existingSessionEntry) {
return new StoredRoomKey(existingSessionEntry);
}
return;
}
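To illustrate how the pieces above fit together, here is a minimal, hypothetical sketch of handling a key that arrived in a to-device message; the KeyLoader and storage Transaction are assumed to come from the surrounding session/decryption setup:

    // hypothetical sketch; `loader` is a KeyLoader and `txn` a storage
    // Transaction that includes the inboundGroupSessions store
    async function storeKeyFromDeviceMessage(dr: DecryptionResult, loader: KeyLoader, txn: Transaction): Promise<boolean> {
        const key = keyFromDeviceMessage(dr);
        if (!key) {
            return false; // the to-device message was missing required fields
        }
        // write() checks against cached and stored keys first and only persists
        // this key if it has a lower first_known_index than what is already known
        return await key.write(loader, txn);
    }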

View file

@ -1,61 +0,0 @@
/*
Copyright 2020 The Matrix.org Foundation C.I.C.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
*/
import {BaseLRUCache} from "../../../../utils/LRUCache.js";
const DEFAULT_CACHE_SIZE = 10;
/**
* Cache of unpickled inbound megolm session.
*/
export class SessionCache extends BaseLRUCache {
constructor(limit) {
limit = typeof limit === "number" ? limit : DEFAULT_CACHE_SIZE;
super(limit);
}
/**
* @param {string} roomId
* @param {string} senderKey
* @param {string} sessionId
* @return {SessionInfo?}
*/
get(roomId, senderKey, sessionId) {
return this._get(s => {
return s.roomId === roomId &&
s.senderKey === senderKey &&
sessionId === s.sessionId;
});
}
add(sessionInfo) {
sessionInfo.retain();
this._set(sessionInfo, s => {
return s.roomId === sessionInfo.roomId &&
s.senderKey === sessionInfo.senderKey &&
s.sessionId === sessionInfo.sessionId;
});
}
_onEvictEntry(sessionInfo) {
sessionInfo.release();
}
dispose() {
for (const sessionInfo of this._entries) {
sessionInfo.release();
}
}
}

View file

@ -1,90 +0,0 @@
/*
Copyright 2020 The Matrix.org Foundation C.I.C.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
*/
import {DecryptionResult} from "../../DecryptionResult.js";
import {DecryptionError} from "../../common.js";
import {ReplayDetectionEntry} from "./ReplayDetectionEntry.js";
/**
* Does the actual decryption of all events for a given megolm session in a batch
*/
export class SessionDecryption {
constructor(sessionInfo, events, olmWorker) {
sessionInfo.retain();
this._sessionInfo = sessionInfo;
this._events = events;
this._olmWorker = olmWorker;
this._decryptionRequests = olmWorker ? [] : null;
}
async decryptAll() {
const replayEntries = [];
const results = new Map();
let errors;
const roomId = this._sessionInfo.roomId;
await Promise.all(this._events.map(async event => {
try {
const {session} = this._sessionInfo;
const ciphertext = event.content.ciphertext;
let decryptionResult;
if (this._olmWorker) {
const request = this._olmWorker.megolmDecrypt(session, ciphertext);
this._decryptionRequests.push(request);
decryptionResult = await request.response();
} else {
decryptionResult = session.decrypt(ciphertext);
}
const plaintext = decryptionResult.plaintext;
const messageIndex = decryptionResult.message_index;
let payload;
try {
payload = JSON.parse(plaintext);
} catch (err) {
throw new DecryptionError("PLAINTEXT_NOT_JSON", event, {plaintext, err});
}
if (payload.room_id !== roomId) {
throw new DecryptionError("MEGOLM_WRONG_ROOM", event,
{encryptedRoomId: payload.room_id, eventRoomId: roomId});
}
replayEntries.push(new ReplayDetectionEntry(session.session_id(), messageIndex, event));
const result = new DecryptionResult(payload, this._sessionInfo.senderKey, this._sessionInfo.claimedKeys);
results.set(event.event_id, result);
} catch (err) {
// ignore AbortError from cancelling decryption requests in dispose method
if (err.name === "AbortError") {
return;
}
if (!errors) {
errors = new Map();
}
errors.set(event.event_id, err);
}
}));
return {results, errors, replayEntries};
}
dispose() {
if (this._decryptionRequests) {
for (const r of this._decryptionRequests) {
r.abort();
}
}
// TODO: cancel decryptions here
this._sessionInfo.release();
}
}

View file

@ -0,0 +1,103 @@
/*
Copyright 2020 The Matrix.org Foundation C.I.C.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
*/
import {DecryptionResult} from "../../DecryptionResult.js";
import {DecryptionError} from "../../common.js";
import {ReplayDetectionEntry} from "./ReplayDetectionEntry";
import type {RoomKey} from "./RoomKey.js";
import type {KeyLoader, OlmDecryptionResult} from "./KeyLoader";
import type {OlmWorker} from "../../OlmWorker";
import type {TimelineEvent} from "../../../storage/types";
interface DecryptAllResult {
readonly results: Map<string, DecryptionResult>;
readonly errors?: Map<string, Error>;
readonly replayEntries: ReplayDetectionEntry[];
}
/**
* Does the actual decryption of all events for a given megolm session in a batch
*/
export class SessionDecryption {
private key: RoomKey;
private events: TimelineEvent[];
private keyLoader: KeyLoader;
private olmWorker?: OlmWorker;
private decryptionRequests?: any[];
constructor(key: RoomKey, events: TimelineEvent[], olmWorker: OlmWorker | undefined, keyLoader: KeyLoader) {
this.key = key;
this.events = events;
this.olmWorker = olmWorker;
this.keyLoader = keyLoader;
this.decryptionRequests = olmWorker ? [] : undefined;
}
async decryptAll(): Promise<DecryptAllResult> {
const replayEntries: ReplayDetectionEntry[] = [];
const results: Map<string, DecryptionResult> = new Map();
let errors: Map<string, Error> | undefined;
await this.keyLoader.useKey(this.key, async session => {
for (const event of this.events) {
try {
const ciphertext = event.content.ciphertext as string;
let decryptionResult: OlmDecryptionResult | undefined;
// TODO: pass all ciphertexts in one go to the megolm worker and don't deserialize the key until we're in the worker?
if (this.olmWorker) {
const request = this.olmWorker.megolmDecrypt(session, ciphertext);
this.decryptionRequests!.push(request);
decryptionResult = await request.response();
} else {
decryptionResult = session.decrypt(ciphertext);
}
const {plaintext} = decryptionResult!;
let payload;
try {
payload = JSON.parse(plaintext);
} catch (err) {
throw new DecryptionError("PLAINTEXT_NOT_JSON", event, {plaintext, err});
}
if (payload.room_id !== this.key.roomId) {
throw new DecryptionError("MEGOLM_WRONG_ROOM", event,
{encryptedRoomId: payload.room_id, eventRoomId: this.key.roomId});
}
replayEntries.push(new ReplayDetectionEntry(this.key.sessionId, decryptionResult!.message_index, event));
const result = new DecryptionResult(payload, this.key.senderKey, this.key.claimedEd25519Key);
results.set(event.event_id, result);
} catch (err) {
// ignore AbortError from cancelling decryption requests in dispose method
if (err.name === "AbortError") {
return;
}
if (!errors) {
errors = new Map();
}
errors.set(event.event_id, err);
}
}
});
return {results, errors, replayEntries};
}
dispose() {
if (this.decryptionRequests) {
for (const r of this.decryptionRequests) {
r.abort();
}
}
}
}
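A hedged sketch of how this class might be driven from the outside; the key lookup and the KeyLoader/OlmWorker wiring are assumed here:

    // hypothetical sketch: decrypt a batch of events that share one megolm session
    async function decryptBatch(key: RoomKey, events: TimelineEvent[], keyLoader: KeyLoader, olmWorker?: OlmWorker) {
        const sessionDecryption = new SessionDecryption(key, events, olmWorker, keyLoader);
        try {
            // results maps event_id to DecryptionResult, replayEntries feed replay detection
            return await sessionDecryption.decryptAll();
        } finally {
            sessionDecryption.dispose(); // aborts any still-pending worker requests
        }
    }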

View file

@ -1,49 +0,0 @@
/*
Copyright 2020 The Matrix.org Foundation C.I.C.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
*/
/**
* session loaded in memory with everything needed to create DecryptionResults
* and to store/retrieve it in the SessionCache
*/
export class SessionInfo {
constructor(roomId, senderKey, session, claimedKeys) {
this.roomId = roomId;
this.senderKey = senderKey;
this.session = session;
this.claimedKeys = claimedKeys;
this._refCounter = 0;
}
get sessionId() {
return this.session?.session_id();
}
retain() {
this._refCounter += 1;
}
release() {
this._refCounter -= 1;
if (this._refCounter <= 0) {
this.dispose();
}
}
dispose() {
this.session.free();
this.session = null;
}
}

View file

@ -14,44 +14,46 @@ See the License for the specific language governing permissions and
limitations under the License.
*/
import {groupByWithCreator} from "../../../../utils/groupBy.js";
import {groupByWithCreator} from "../../../../utils/groupBy";
import type {TimelineEvent} from "../../../storage/types";
function getSenderKey(event) {
function getSenderKey(event: TimelineEvent): string | undefined {
return event.content?.["sender_key"];
}
function getSessionId(event) {
function getSessionId(event: TimelineEvent): string | undefined {
return event.content?.["session_id"];
}
function getCiphertext(event) {
function getCiphertext(event: TimelineEvent): string | undefined {
return event.content?.ciphertext;
}
export function validateEvent(event) {
export function validateEvent(event: TimelineEvent) {
return typeof getSenderKey(event) === "string" &&
typeof getSessionId(event) === "string" &&
typeof getCiphertext(event) === "string";
}
class SessionKeyGroup {
export class SessionKeyGroup {
public readonly events: TimelineEvent[];
constructor() {
this.events = [];
}
get senderKey() {
return getSenderKey(this.events[0]);
get senderKey(): string | undefined {
return getSenderKey(this.events[0]!);
}
get sessionId() {
return getSessionId(this.events[0]);
get sessionId(): string | undefined {
return getSessionId(this.events[0]!);
}
}
export function groupEventsBySession(events) {
return groupByWithCreator(events,
event => `${getSenderKey(event)}|${getSessionId(event)}`,
export function groupEventsBySession(events: TimelineEvent[]): Map<string, SessionKeyGroup> {
return groupByWithCreator<string, TimelineEvent, SessionKeyGroup>(events,
(event: TimelineEvent) => `${getSenderKey(event)}|${getSessionId(event)}`,
() => new SessionKeyGroup(),
(group, event) => group.events.push(event)
(group: SessionKeyGroup, event: TimelineEvent) => group.events.push(event)
);
}
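For illustration, a brief hypothetical sketch of consuming the grouping; `encryptedEvents` stands in for the megolm events of a sync response:

    // hypothetical usage sketch
    const groups = groupEventsBySession(encryptedEvents.filter(validateEvent));
    for (const group of groups.values()) {
        // every event in a group shares the senderKey and sessionId of its first event,
        // so one key lookup per group is enough
        console.log(group.senderKey, group.sessionId, group.events.length);
    }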

View file

@ -15,8 +15,8 @@ limitations under the License.
*/
import {DecryptionError} from "../common.js";
import {groupBy} from "../../../utils/groupBy.js";
import {MultiLock} from "../../../utils/Lock.js";
import {groupBy} from "../../../utils/groupBy";
import {MultiLock} from "../../../utils/Lock";
import {Session} from "./Session.js";
import {DecryptionResult} from "../DecryptionResult.js";
@ -150,7 +150,7 @@ export class Decryption {
throw new DecryptionError("PLAINTEXT_NOT_JSON", event, {plaintext, error});
}
this._validatePayload(payload, event);
return new DecryptionResult(payload, senderKey, payload.keys);
return new DecryptionResult(payload, senderKey, payload.keys.ed25519);
} else {
throw new DecryptionError("OLM_NO_MATCHING_SESSION", event,
{knownSessionIds: senderKeyDecryption.sessions.map(s => s.id)});

View file

@ -14,7 +14,7 @@ See the License for the specific language governing permissions and
limitations under the License.
*/
import {groupByWithCreator} from "../../../utils/groupBy.js";
import {groupByWithCreator} from "../../../utils/groupBy";
import {verifyEd25519Signature, OLM_ALGORITHM} from "../common.js";
import {createSessionEntry} from "./Session.js";

View file

@ -38,7 +38,7 @@ export class HomeServerError extends Error {
}
}
export {AbortError} from "../utils/error.js";
export {AbortError} from "../utils/error";
export class ConnectionError extends Error {
constructor(message, isTimeout) {

View file

@ -14,7 +14,7 @@ See the License for the specific language governing permissions and
limitations under the License.
*/
import {AbortError} from "../../utils/error.js";
import {AbortError} from "../../utils/error";
export class ExponentialRetryDelay {
constructor(createTimeout) {

View file

@ -18,6 +18,9 @@ limitations under the License.
import {encodeQueryParams, encodeBody} from "./common.js";
import {HomeServerRequest} from "./HomeServerRequest.js";
const CS_R0_PREFIX = "/_matrix/client/r0";
const DEHYDRATION_PREFIX = "/_matrix/client/unstable/org.matrix.msc2697.v2";
export class HomeServerApi {
constructor({homeserver, accessToken, request, reconnector}) {
// store these both in a closure somehow so it's harder to get at in case of XSS?
@ -28,8 +31,8 @@ export class HomeServerApi {
this._reconnector = reconnector;
}
_url(csPath) {
return `${this._homeserver}/_matrix/client/r0${csPath}`;
_url(csPath, prefix = CS_R0_PREFIX) {
return this._homeserver + prefix + csPath;
}
_baseRequest(method, url, queryParams, body, options, accessToken) {
@ -92,15 +95,15 @@ export class HomeServerApi {
}
_post(csPath, queryParams, body, options) {
return this._authedRequest("POST", this._url(csPath), queryParams, body, options);
return this._authedRequest("POST", this._url(csPath, options?.prefix || CS_R0_PREFIX), queryParams, body, options);
}
_put(csPath, queryParams, body, options) {
return this._authedRequest("PUT", this._url(csPath), queryParams, body, options);
return this._authedRequest("PUT", this._url(csPath, options?.prefix || CS_R0_PREFIX), queryParams, body, options);
}
_get(csPath, queryParams, body, options) {
return this._authedRequest("GET", this._url(csPath), queryParams, body, options);
return this._authedRequest("GET", this._url(csPath, options?.prefix || CS_R0_PREFIX), queryParams, body, options);
}
sync(since, filter, timeout, options = null) {
@ -170,8 +173,12 @@ export class HomeServerApi {
return this._unauthedRequest("GET", `${this._homeserver}/_matrix/client/versions`, null, null, options);
}
uploadKeys(payload, options = null) {
return this._post("/keys/upload", null, payload, options);
uploadKeys(dehydratedDeviceId, payload, options = null) {
let path = "/keys/upload";
if (dehydratedDeviceId) {
path = path + `/${encodeURIComponent(dehydratedDeviceId)}`;
}
return this._post(path, null, payload, options);
}
queryKeys(queryRequest, options = null) {
@ -225,6 +232,25 @@ export class HomeServerApi {
forget(roomId, options = null) {
return this._post(`/rooms/${encodeURIComponent(roomId)}/forget`, null, null, options);
}
logout(options = null) {
return this._post(`/logout`, null, null, options);
}
getDehydratedDevice(options = {}) {
options.prefix = DEHYDRATION_PREFIX;
return this._get(`/dehydrated_device`, null, null, options);
}
createDehydratedDevice(payload, options = {}) {
options.prefix = DEHYDRATION_PREFIX;
return this._put(`/dehydrated_device`, null, payload, options);
}
claimDehydratedDevice(deviceId, options = {}) {
options.prefix = DEHYDRATION_PREFIX;
return this._post(`/dehydrated_device/claim`, null, {device_id: deviceId}, options);
}
}
import {Request as MockRequest} from "../../mocks/Request.js";
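As a rough, hypothetical usage sketch (assuming, as elsewhere in this codebase, that these methods return request objects exposing a response() promise, and that the dehydrated device response carries a device_id):

    // hypothetical sketch of claiming a dehydrated device
    const dehydratedDevice = await hsApi.getDehydratedDevice({log}).response();
    if (dehydratedDevice?.device_id) {
        await hsApi.claimDehydratedDevice(dehydratedDevice.device_id, {log}).response();
        // subsequent key uploads can then target the dehydrated device id
        await hsApi.uploadKeys(dehydratedDevice.device_id, keysPayload, {log}).response();
    }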

View file

@ -14,7 +14,7 @@ See the License for the specific language governing permissions and
limitations under the License.
*/
import {createEnum} from "../../utils/enum.js";
import {createEnum} from "../../utils/enum";
import {ObservableValue} from "../../observable/ObservableValue";
export const ConnectionStatus = createEnum(

View file

@ -15,7 +15,7 @@ See the License for the specific language governing permissions and
limitations under the License.
*/
import {AbortError} from "../../utils/error.js";
import {AbortError} from "../../utils/error";
import {HomeServerError} from "../error.js";
import {HomeServerApi} from "./HomeServerApi.js";
import {ExponentialRetryDelay} from "./ExponentialRetryDelay.js";

View file

@ -27,7 +27,7 @@ import {Heroes} from "./members/Heroes.js";
import {EventEntry} from "./timeline/entries/EventEntry.js";
import {ObservedEventMap} from "./ObservedEventMap.js";
import {DecryptionSource} from "../e2ee/common.js";
import {ensureLogItem} from "../../logging/utils.js";
import {ensureLogItem} from "../../logging/utils";
import {PowerLevels} from "./PowerLevels.js";
import {RetainedObservableValue} from "../../observable/ObservableValue";
@ -208,7 +208,7 @@ export class BaseRoom extends EventEmitter {
if (this._summary.data.needsHeroes) {
this._heroes = new Heroes(this._roomId);
const changes = await this._heroes.calculateChanges(this._summary.data.heroes, [], txn);
this._heroes.applyChanges(changes, this._summary.data);
this._heroes.applyChanges(changes, this._summary.data, log);
}
} catch (err) {
throw new WrappedError(`Could not load room ${this._roomId}`, err);
@ -463,7 +463,7 @@ export class BaseRoom extends EventEmitter {
enableSessionBackup(sessionBackup) {
this._roomEncryption?.enableSessionBackup(sessionBackup);
// TODO: do we really want to do this every time you open the app?
if (this._timeline) {
if (this._timeline && sessionBackup) {
this._platform.logger.run("enableSessionBackup", log => {
return this._roomEncryption.restoreMissingSessionsFromBackup(this._timeline.remoteEntries, log);
});

View file

@ -139,7 +139,7 @@ export class Invite extends EventEmitter {
const summaryData = this._createSummaryData(inviteState);
let heroes;
if (!summaryData.name && !summaryData.canonicalAlias) {
heroes = await this._createHeroes(inviteState);
heroes = await this._createHeroes(inviteState, log);
}
const myInvite = this._getMyInvite(inviteState);
if (!myInvite) {
@ -204,7 +204,7 @@ export class Invite extends EventEmitter {
return inviteState.reduce(processStateEvent, new SummaryData(null, this.id));
}
async _createHeroes(inviteState) {
async _createHeroes(inviteState, log) {
const members = inviteState.filter(e => e.type === MEMBER_EVENT_TYPE);
const otherMembers = members.filter(e => e.state_key !== this._user.id);
const memberChanges = otherMembers.reduce((map, e) => {
@ -220,7 +220,7 @@ export class Invite extends EventEmitter {
const countSummary = new SummaryData(null, this.id);
countSummary.joinCount = members.reduce((sum, e) => sum + (e.content?.membership === "join" ? 1 : 0), 0);
countSummary.inviteCount = members.reduce((sum, e) => sum + (e.content?.membership === "invite" ? 1 : 0), 0);
heroes.applyChanges(changes, countSummary);
heroes.applyChanges(changes, countSummary, log);
return heroes;
}
@ -244,7 +244,7 @@ export class Invite extends EventEmitter {
}
}
import {NullLogItem} from "../../logging/NullLogger.js";
import {NullLogItem} from "../../logging/NullLogger";
import {Clock as MockClock} from "../../mocks/Clock.js";
import {default as roomInviteFixture} from "../../fixtures/matrix/invites/room.js";
import {default as dmInviteFixture} from "../../fixtures/matrix/invites/dm.js";

View file

@ -76,6 +76,10 @@ export class Room extends BaseRoom {
let eventsToDecrypt = roomResponse?.timeline?.events || [];
// when new keys arrive, also see if any older events can now be retried to decrypt
if (newKeys) {
// TODO: if a key is considered by roomEncryption.prepareDecryptAll to be used for decryption,
// key.eventIds will be set. We could somehow try to reuse that work, but retrying also needs
// to happen if a key is not needed to decrypt this sync or there are indeed no encrypted messages
// in this sync at all.
retryEntries = await this._getSyncRetryDecryptEntries(newKeys, roomEncryption, txn);
if (retryEntries.length) {
log.set("retry", retryEntries.length);
@ -236,7 +240,7 @@ export class Room extends BaseRoom {
}
if (this._heroes && heroChanges) {
const oldName = this.name;
this._heroes.applyChanges(heroChanges, this._summary.data);
this._heroes.applyChanges(heroChanges, this._summary.data, log);
if (oldName !== this.name) {
emitChange = true;
}

View file

@ -16,7 +16,7 @@ limitations under the License.
import {RoomMember} from "./RoomMember.js";
function calculateRoomName(sortedMembers, summaryData) {
function calculateRoomName(sortedMembers, summaryData, log) {
const countWithoutMe = summaryData.joinCount + summaryData.inviteCount - 1;
if (sortedMembers.length >= countWithoutMe) {
if (sortedMembers.length > 1) {
@ -24,7 +24,13 @@ function calculateRoomName(sortedMembers, summaryData) {
const firstMembers = sortedMembers.slice(0, sortedMembers.length - 1);
return firstMembers.map(m => m.name).join(", ") + " and " + lastMember.name;
} else {
return sortedMembers[0].name;
const otherMember = sortedMembers[0];
if (otherMember) {
return otherMember.name;
} else {
log.log({l: "could get get other member name", length: sortedMembers.length, otherMember: !!otherMember, otherMemberMembership: otherMember?.membership});
return "Unknown DM Name";
}
}
} else if (sortedMembers.length < countWithoutMe) {
return sortedMembers.map(m => m.name).join(", ") + ` and ${countWithoutMe} others`;
@ -74,7 +80,7 @@ export class Heroes {
return {updatedHeroMembers: updatedHeroMembers.values(), removedUserIds};
}
applyChanges({updatedHeroMembers, removedUserIds}, summaryData) {
applyChanges({updatedHeroMembers, removedUserIds}, summaryData, log) {
for (const userId of removedUserIds) {
this._members.delete(userId);
}
@ -82,7 +88,7 @@ export class Heroes {
this._members.set(member.userId, member);
}
const sortedMembers = Array.from(this._members.values()).sort((a, b) => a.name.localeCompare(b.name));
this._roomName = calculateRoomName(sortedMembers, summaryData);
this._roomName = calculateRoomName(sortedMembers, summaryData, log);
}
get roomName() {

View file

@ -15,7 +15,7 @@ limitations under the License.
*/
import {ObservableMap} from "../../../observable/map/ObservableMap.js";
import {RetainedValue} from "../../../utils/RetainedValue.js";
import {RetainedValue} from "../../../utils/RetainedValue";
export class MemberList extends RetainedValue {
constructor({members, closeCallback}) {

View file

@ -13,8 +13,8 @@ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
*/
import {createEnum} from "../../../utils/enum.js";
import {AbortError} from "../../../utils/error.js";
import {createEnum} from "../../../utils/enum";
import {AbortError} from "../../../utils/error";
import {REDACTION_TYPE} from "../common.js";
import {getRelationFromContent, getRelationTarget, setRelationTarget} from "../timeline/relations.js";

View file

@ -353,7 +353,7 @@ export class SendQueue {
import {HomeServer as MockHomeServer} from "../../../mocks/HomeServer.js";
import {createMockStorage} from "../../../mocks/Storage";
import {ListObserver} from "../../../mocks/ListObserver.js";
import {NullLogger, NullLogItem} from "../../../logging/NullLogger.js";
import {NullLogger, NullLogItem} from "../../../logging/NullLogger";
import {createEvent, withTextBody, withTxnId} from "../../../mocks/event.js";
import {poll} from "../../../mocks/poll.js";
import {createAnnotation} from "../timeline/relations.js";

View file

@ -16,7 +16,7 @@ limitations under the License.
*/
import {SortedArray, AsyncMappedList, ConcatList, ObservableArray} from "../../../observable/index.js";
import {Disposables} from "../../../utils/Disposables.js";
import {Disposables} from "../../../utils/Disposables";
import {Direction} from "./Direction";
import {TimelineReader} from "./persistence/TimelineReader.js";
import {PendingEventEntry} from "./entries/PendingEventEntry.js";
@ -346,7 +346,7 @@ import {Clock as MockClock} from "../../../mocks/Clock.js";
import {createMockStorage} from "../../../mocks/Storage";
import {ListObserver} from "../../../mocks/ListObserver.js";
import {createEvent, withTextBody, withContent, withSender} from "../../../mocks/event.js";
import {NullLogItem} from "../../../logging/NullLogger.js";
import {NullLogItem} from "../../../logging/NullLogger";
import {EventEntry} from "./entries/EventEntry.js";
import {User} from "../../User.js";
import {PendingEvent} from "../sending/PendingEvent.js";

View file

@ -205,7 +205,7 @@ import {FragmentIdComparer} from "../FragmentIdComparer.js";
import {RelationWriter} from "./RelationWriter.js";
import {createMockStorage} from "../../../../mocks/Storage";
import {FragmentBoundaryEntry} from "../entries/FragmentBoundaryEntry.js";
import {NullLogItem} from "../../../../logging/NullLogger.js";
import {NullLogItem} from "../../../../logging/NullLogger";
import {TimelineMock, eventIds, eventId} from "../../../../mocks/TimelineMock.ts";
import {SyncWriter} from "./SyncWriter.js";
import {MemberWriter} from "./MemberWriter.js";

View file

@ -15,7 +15,7 @@ limitations under the License.
*/
import {MemberChange, RoomMember, EVENT_TYPE as MEMBER_EVENT_TYPE} from "../../members/RoomMember.js";
import {LRUCache} from "../../../../utils/LRUCache.js";
import {LRUCache} from "../../../../utils/LRUCache";
export class MemberWriter {
constructor(roomId) {

View file

@ -257,7 +257,7 @@ import {createMockStorage} from "../../../../mocks/Storage";
import {createEvent, withTextBody, withRedacts, withContent} from "../../../../mocks/event.js";
import {createAnnotation} from "../relations.js";
import {FragmentIdComparer} from "../FragmentIdComparer.js";
import {NullLogItem} from "../../../../logging/NullLogger.js";
import {NullLogItem} from "../../../../logging/NullLogger";
export function tests() {
const fragmentIdComparer = new FragmentIdComparer([]);

View file

@ -258,7 +258,7 @@ export class SyncWriter {
import {createMockStorage} from "../../../../mocks/Storage";
import {createEvent, withTextBody} from "../../../../mocks/event.js";
import {Instance as nullLogger} from "../../../../logging/NullLogger.js";
import {Instance as nullLogger} from "../../../../logging/NullLogger";
export function tests() {
const roomId = "!abc:hs.tld";
return {

View file

@ -15,9 +15,9 @@ limitations under the License.
*/
export class KeyDescription {
constructor(id, keyAccountData) {
constructor(id, keyDescription) {
this._id = id;
this._keyAccountData = keyAccountData;
this._keyDescription = keyDescription;
}
get id() {
@ -25,11 +25,30 @@ export class KeyDescription {
}
get passphraseParams() {
return this._keyAccountData?.content?.passphrase;
return this._keyDescription?.passphrase;
}
get algorithm() {
return this._keyAccountData?.content?.algorithm;
return this._keyDescription?.algorithm;
}
async isCompatible(key, platform) {
if (this.algorithm === "m.secret_storage.v1.aes-hmac-sha2") {
const kd = this._keyDescription;
if (kd.mac) {
const otherMac = await calculateKeyMac(key.binaryKey, kd.iv, platform);
return kd.mac === otherMac;
} else if (kd.passphrase) {
const kdOther = key.description._keyDescription;
if (!kdOther.passphrase) {
return false;
}
return kd.passphrase.algorithm === kdOther.passphrase.algorithm &&
kd.passphrase.iterations === kdOther.passphrase.iterations &&
kd.passphrase.salt === kdOther.passphrase.salt;
}
}
return false;
}
}
@ -39,6 +58,14 @@ export class Key {
this._binaryKey = binaryKey;
}
withDescription(description) {
return new Key(description, this._binaryKey);
}
get description() {
return this._keyDescription;
}
get id() {
return this._keyDescription.id;
}
@ -51,3 +78,24 @@ export class Key {
return this._keyDescription.algorithm;
}
}
async function calculateKeyMac(key, ivStr, platform) {
const {crypto, encoding} = platform;
const {utf8, base64} = encoding;
const {derive, aes, hmac} = crypto;
const iv = base64.decode(ivStr);
// salt for HKDF, with 8 bytes of zeros
const zerosalt = new Uint8Array(8);
const ZERO_STR = "\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0";
const info = utf8.encode("");
const keybits = await derive.hkdf(key, zerosalt, info, "SHA-256", 512);
const aesKey = keybits.slice(0, 32);
const hmacKey = keybits.slice(32);
const ciphertext = await aes.encryptCTR({key: aesKey, iv, data: utf8.encode(ZERO_STR)});
const mac = await hmac.compute(hmacKey, ciphertext, "SHA-256");
return base64.encode(mac);
}
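For context, a hypothetical sketch of what such a key description looks like and how the check is invoked; the field values are placeholders and `key`/`platform` are assumed to come from the caller:

    // hypothetical sketch of an m.secret_storage.v1.aes-hmac-sha2 description
    const description = new KeyDescription("somekeyid", {
        algorithm: "m.secret_storage.v1.aes-hmac-sha2",
        iv: "<base64 iv placeholder>",
        mac: "<base64 mac placeholder>",
    });
    // true when encrypting 32 zero bytes with keys derived from key.binaryKey
    // reproduces the stored mac (see calculateKeyMac above)
    const compatible = await description.isCompatible(key, platform);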

View file

@ -18,9 +18,12 @@ import {KeyDescription, Key} from "./common.js";
import {keyFromPassphrase} from "./passphrase.js";
import {keyFromRecoveryKey} from "./recoveryKey.js";
import {SESSION_E2EE_KEY_PREFIX} from "../e2ee/common.js";
import {createEnum} from "../../utils/enum";
const SSSS_KEY = `${SESSION_E2EE_KEY_PREFIX}ssssKey`;
export const KeyType = createEnum("RecoveryKey", "Passphrase");
async function readDefaultKeyDescription(storage) {
const txn = await storage.readTxn([
storage.storeNames.accountData
@ -34,7 +37,7 @@ async function readDefaultKeyDescription(storage) {
if (!keyAccountData) {
return;
}
return new KeyDescription(id, keyAccountData);
return new KeyDescription(id, keyAccountData.content);
}
export async function writeKey(key, txn) {
@ -47,7 +50,14 @@ export async function readKey(txn) {
return;
}
const keyAccountData = await txn.accountData.get(`m.secret_storage.key.${keyData.id}`);
return new Key(new KeyDescription(keyData.id, keyAccountData), keyData.binaryKey);
if (keyAccountData) {
return new Key(new KeyDescription(keyData.id, keyAccountData.content), keyData.binaryKey);
}
}
export async function removeKey(txn) {
await txn.session.remove(SSSS_KEY);
}
export async function keyFromCredential(type, credential, storage, platform, olm) {
@ -55,13 +65,24 @@ export async function keyFromCredential(type, credential, storage, platform, olm
if (!keyDescription) {
throw new Error("Could not find a default secret storage key in account data");
}
return await keyFromCredentialAndDescription(type, credential, keyDescription, platform, olm);
}
export async function keyFromCredentialAndDescription(type, credential, keyDescription, platform, olm) {
let key;
if (type === "phrase") {
if (type === KeyType.Passphrase) {
key = await keyFromPassphrase(keyDescription, credential, platform);
} else if (type === "key") {
} else if (type === KeyType.RecoveryKey) {
key = keyFromRecoveryKey(keyDescription, credential, olm, platform);
} else {
throw new Error(`Invalid type: ${type}`);
}
return key;
}
export async function keyFromDehydratedDeviceKey(key, storage, platform) {
const keyDescription = await readDefaultKeyDescription(storage);
if (await keyDescription.isCompatible(key, platform)) {
return key.withDescription(keyDescription);
}
}
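A minimal hypothetical sketch of turning a user-supplied passphrase into a stored SSSS key; `storage`, `platform`, `olm` and the write transaction are assumed to come from the session setup:

    // hypothetical usage sketch; throws if account data has no default key description
    const key = await keyFromCredential(KeyType.Passphrase, passphrase, storage, platform, olm);
    await writeKey(key, txn); // persist the key so later sessions can reuse it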

View file

@ -16,7 +16,7 @@ limitations under the License.
import {iterateCursor, DONE, NOT_DONE, reqAsPromise} from "./utils";
import {StorageError} from "../common";
import {LogItem} from "../../../logging/LogItem.js";
import {ILogItem} from "../../../logging/types";
import {IDBKey} from "./Transaction";
// this is the part of the Transaction class API that is used here and in the Store subclass,
@ -25,7 +25,7 @@ export interface ITransaction {
idbFactory: IDBFactory;
IDBKeyRange: typeof IDBKeyRange;
databaseName: string;
addWriteError(error: StorageError, refItem: LogItem | undefined, operationName: string, keys: IDBKey[] | undefined);
addWriteError(error: StorageError, refItem: ILogItem | undefined, operationName: string, keys: IDBKey[] | undefined);
}
type Reducer<A,B> = (acc: B, val: A) => B
@ -277,7 +277,7 @@ export function tests() {
class MockTransaction extends MockIDBImpl {
get databaseName(): string { return "mockdb"; }
addWriteError(error: StorageError, refItem: LogItem | undefined, operationName: string, keys: IDBKey[] | undefined) {}
addWriteError(error: StorageError, refItem: ILogItem | undefined, operationName: string, keys: IDBKey[] | undefined) {}
}
interface TestEntry {

View file

@ -18,7 +18,7 @@ import {IDOMStorage} from "./types";
import {Transaction} from "./Transaction";
import { STORE_NAMES, StoreNames, StorageError } from "../common";
import { reqAsPromise } from "./utils";
import { BaseLogger } from "../../../logging/BaseLogger.js";
import { ILogger } from "../../../logging/types";
const WEBKITEARLYCLOSETXNBUG_BOGUS_KEY = "782rh281re38-boguskey";
@ -26,13 +26,13 @@ export class Storage {
private _db: IDBDatabase;
private _hasWebkitEarlyCloseTxnBug: boolean;
readonly logger: BaseLogger;
readonly logger: ILogger;
readonly idbFactory: IDBFactory
readonly IDBKeyRange: typeof IDBKeyRange;
readonly storeNames: typeof StoreNames;
readonly localStorage: IDOMStorage;
constructor(idbDatabase: IDBDatabase, idbFactory: IDBFactory, _IDBKeyRange: typeof IDBKeyRange, hasWebkitEarlyCloseTxnBug: boolean, localStorage: IDOMStorage, logger: BaseLogger) {
constructor(idbDatabase: IDBDatabase, idbFactory: IDBFactory, _IDBKeyRange: typeof IDBKeyRange, hasWebkitEarlyCloseTxnBug: boolean, localStorage: IDOMStorage, logger: ILogger) {
this._db = idbDatabase;
this.idbFactory = idbFactory;
this.IDBKeyRange = _IDBKeyRange;

View file

@ -20,11 +20,10 @@ import { openDatabase, reqAsPromise } from "./utils";
import { exportSession, importSession, Export } from "./export";
import { schema } from "./schema";
import { detectWebkitEarlyCloseTxnBug } from "./quirks";
import { BaseLogger } from "../../../logging/BaseLogger.js";
import { LogItem } from "../../../logging/LogItem.js";
import { ILogItem } from "../../../logging/types";
const sessionName = (sessionId: string) => `hydrogen_session_${sessionId}`;
const openDatabaseWithSessionId = function(sessionId: string, idbFactory: IDBFactory, localStorage: IDOMStorage, log: LogItem) {
const openDatabaseWithSessionId = function(sessionId: string, idbFactory: IDBFactory, localStorage: IDOMStorage, log: ILogItem) {
const create = (db, txn, oldVersion, version) => createStores(db, txn, oldVersion, version, localStorage, log);
return openDatabase(sessionName(sessionId), create, schema.length, idbFactory);
}
@ -63,7 +62,7 @@ export class StorageFactory {
this._localStorage = localStorage;
}
async create(sessionId: string, log: LogItem): Promise<Storage> {
async create(sessionId: string, log: ILogItem): Promise<Storage> {
await this._serviceWorkerHandler?.preventConcurrentSessionAccess(sessionId);
requestPersistedStorage().then(persisted => {
// Firefox lies here though, and returns true even if the user denied the request
@ -83,23 +82,25 @@ export class StorageFactory {
return reqAsPromise(req);
}
async export(sessionId: string, log: LogItem): Promise<Export> {
async export(sessionId: string, log: ILogItem): Promise<Export> {
const db = await openDatabaseWithSessionId(sessionId, this._idbFactory, this._localStorage, log);
return await exportSession(db);
}
async import(sessionId: string, data: Export, log: LogItem): Promise<void> {
async import(sessionId: string, data: Export, log: ILogItem): Promise<void> {
const db = await openDatabaseWithSessionId(sessionId, this._idbFactory, this._localStorage, log);
return await importSession(db, data);
}
}
async function createStores(db: IDBDatabase, txn: IDBTransaction, oldVersion: number | null, version: number, localStorage: IDOMStorage, log: LogItem): Promise<void> {
async function createStores(db: IDBDatabase, txn: IDBTransaction, oldVersion: number | null, version: number, localStorage: IDOMStorage, log: ILogItem): Promise<void> {
const startIdx = oldVersion || 0;
return log.wrap({l: "storage migration", oldVersion, version}, async log => {
for(let i = startIdx; i < version; ++i) {
const migrationFunc = schema[i];
await log.wrap(`v${i + 1}`, log => migrationFunc(db, txn, localStorage, log));
}
});
return log.wrap(
{ l: "storage migration", oldVersion, version },
async (log) => {
for (let i = startIdx; i < version; ++i) {
const migrationFunc = schema[i];
await log.wrap(`v${i + 1}`, (log) => migrationFunc(db, txn, localStorage, log));
}
});
}

View file

@ -18,7 +18,7 @@ import {QueryTarget, IDBQuery, ITransaction} from "./QueryTarget";
import {IDBRequestError, IDBRequestAttemptError} from "./error";
import {reqAsPromise} from "./utils";
import {Transaction, IDBKey} from "./Transaction";
import {LogItem} from "../../../logging/LogItem.js";
import {ILogItem} from "../../../logging/types";
const LOG_REQUESTS = false;
@ -145,7 +145,7 @@ export class Store<T> extends QueryTarget<T> {
return new QueryTarget<T>(new QueryTargetWrapper<T>(this._idbStore.index(indexName)), this._transaction);
}
put(value: T, log?: LogItem): void {
put(value: T, log?: ILogItem): void {
// If this request fails, the error will bubble up to the transaction and abort it,
// which is the behaviour we want. Therefore, it is ok to not create a promise for this
// request and await it.
@ -160,13 +160,13 @@ export class Store<T> extends QueryTarget<T> {
this._prepareErrorLog(request, log, "put", undefined, value);
}
add(value: T, log?: LogItem): void {
add(value: T, log?: ILogItem): void {
// ok to not monitor result of request, see comment in `put`.
const request = this._idbStore.add(value);
this._prepareErrorLog(request, log, "add", undefined, value);
}
async tryAdd(value: T, log: LogItem): Promise<boolean> {
async tryAdd(value: T, log: ILogItem): Promise<boolean> {
try {
await reqAsPromise(this._idbStore.add(value));
return true;
@ -181,13 +181,13 @@ export class Store<T> extends QueryTarget<T> {
}
}
delete(keyOrKeyRange: IDBValidKey | IDBKeyRange, log?: LogItem): void {
delete(keyOrKeyRange: IDBValidKey | IDBKeyRange, log?: ILogItem): void {
// ok to not monitor result of request, see comment in `put`.
const request = this._idbStore.delete(keyOrKeyRange);
this._prepareErrorLog(request, log, "delete", keyOrKeyRange, undefined);
}
private _prepareErrorLog(request: IDBRequest, log: LogItem | undefined, operationName: string, key: IDBKey | undefined, value: T | undefined) {
private _prepareErrorLog(request: IDBRequest, log: ILogItem | undefined, operationName: string, key: IDBKey | undefined, value: T | undefined) {
if (log) {
log.ensureRefId();
}

View file

@ -36,15 +36,14 @@ import {OutboundGroupSessionStore} from "./stores/OutboundGroupSessionStore";
import {GroupSessionDecryptionStore} from "./stores/GroupSessionDecryptionStore";
import {OperationStore} from "./stores/OperationStore";
import {AccountDataStore} from "./stores/AccountDataStore";
import {LogItem} from "../../../logging/LogItem.js";
import {BaseLogger} from "../../../logging/BaseLogger.js";
import type {ILogger, ILogItem} from "../../../logging/types";
export type IDBKey = IDBValidKey | IDBKeyRange;
class WriteErrorInfo {
constructor(
public readonly error: StorageError,
public readonly refItem: LogItem | undefined,
public readonly refItem: ILogItem | undefined,
public readonly operationName: string,
public readonly keys: IDBKey[] | undefined,
) {}
@ -77,7 +76,7 @@ export class Transaction {
return this._storage.databaseName;
}
get logger(): BaseLogger {
get logger(): ILogger {
return this._storage.logger;
}
@ -169,7 +168,7 @@ export class Transaction {
return this._store(StoreNames.accountData, idbStore => new AccountDataStore(idbStore));
}
async complete(log?: LogItem): Promise<void> {
async complete(log?: ILogItem): Promise<void> {
try {
await txnAsPromise(this._txn);
} catch (err) {
@ -190,7 +189,7 @@ export class Transaction {
return error;
}
abort(log?: LogItem): void {
abort(log?: ILogItem): void {
// TODO: should we wrap the exception in a StorageError?
try {
this._txn.abort();
@ -202,14 +201,14 @@ export class Transaction {
}
}
addWriteError(error: StorageError, refItem: LogItem | undefined, operationName: string, keys: IDBKey[] | undefined) {
addWriteError(error: StorageError, refItem: ILogItem | undefined, operationName: string, keys: IDBKey[] | undefined) {
// don't log subsequent `AbortError`s
if (error.errcode !== "AbortError" || this._writeErrors.length === 0) {
this._writeErrors.push(new WriteErrorInfo(error, refItem, operationName, keys));
}
}
private _logWriteErrors(parentItem: LogItem | undefined) {
private _logWriteErrors(parentItem: ILogItem | undefined) {
const callback = errorGroupItem => {
// we don't have context when there is no parentItem, so at least log stores
if (!parentItem) {

View file

@ -11,10 +11,10 @@ import {SessionStore} from "./stores/SessionStore";
import {Store} from "./Store";
import {encodeScopeTypeKey} from "./stores/OperationStore";
import {MAX_UNICODE} from "./stores/common";
import {LogItem} from "../../../logging/LogItem.js";
import {ILogItem} from "../../../logging/types";
export type MigrationFunc = (db: IDBDatabase, txn: IDBTransaction, localStorage: IDOMStorage, log: LogItem) => Promise<void> | void;
export type MigrationFunc = (db: IDBDatabase, txn: IDBTransaction, localStorage: IDOMStorage, log: ILogItem) => Promise<void> | void;
// FUNCTIONS SHOULD ONLY BE APPENDED!!
// the index in the array is the database version
export const schema: MigrationFunc[] = [
@ -166,7 +166,7 @@ function createTimelineRelationsStore(db: IDBDatabase) : void {
}
//v11 doesn't change the schema, but ensures all userIdentities have all the roomIds they should (see #470)
async function fixMissingRoomsInUserIdentities(db: IDBDatabase, txn: IDBTransaction, localStorage: IDOMStorage, log: LogItem) {
async function fixMissingRoomsInUserIdentities(db: IDBDatabase, txn: IDBTransaction, localStorage: IDOMStorage, log: ILogItem) {
const roomSummaryStore = txn.objectStore("roomSummary");
const trackedRoomIds: string[] = [];
await iterateCursor<SummaryData>(roomSummaryStore.openCursor(), roomSummary => {
@ -220,7 +220,7 @@ async function changeSSSSKeyPrefix(db: IDBDatabase, txn: IDBTransaction) {
}
}
// v13
async function backupAndRestoreE2EEAccountToLocalStorage(db: IDBDatabase, txn: IDBTransaction, localStorage: IDOMStorage, log: LogItem) {
async function backupAndRestoreE2EEAccountToLocalStorage(db: IDBDatabase, txn: IDBTransaction, localStorage: IDOMStorage, log: ILogItem) {
const session = txn.objectStore("session");
// the Store object gets passed in several things through the Transaction class (a wrapper around IDBTransaction),
// the only thing we should need here is the databaseName though, so we mock it out.

View file

@ -17,24 +17,26 @@ limitations under the License.
import {MIN_UNICODE, MAX_UNICODE} from "./common";
import {Store} from "../Store";
interface InboundGroupSession {
export interface InboundGroupSessionEntry {
roomId: string;
senderKey: string;
sessionId: string;
session?: string;
claimedKeys?: { [algorithm : string] : string };
eventIds?: string[];
key: string;
}
type InboundGroupSessionStorageEntry = InboundGroupSessionEntry & { key: string };
function encodeKey(roomId: string, senderKey: string, sessionId: string): string {
return `${roomId}|${senderKey}|${sessionId}`;
}
export class InboundGroupSessionStore {
private _store: Store<InboundGroupSession>;
private _store: Store<InboundGroupSessionStorageEntry>;
constructor(store: Store<InboundGroupSession>) {
constructor(store: Store<InboundGroupSessionStorageEntry>) {
this._store = store;
}
@ -44,13 +46,14 @@ export class InboundGroupSessionStore {
return key === fetchedKey;
}
get(roomId: string, senderKey: string, sessionId: string): Promise<InboundGroupSession | null> {
get(roomId: string, senderKey: string, sessionId: string): Promise<InboundGroupSessionEntry | null> {
return this._store.get(encodeKey(roomId, senderKey, sessionId));
}
set(session: InboundGroupSession): void {
session.key = encodeKey(session.roomId, session.senderKey, session.sessionId);
this._store.put(session);
set(session: InboundGroupSessionEntry): void {
const storageEntry = session as InboundGroupSessionStorageEntry;
storageEntry.key = encodeKey(session.roomId, session.senderKey, session.sessionId);
this._store.put(storageEntry);
}
removeAllForRoom(roomId: string) {

View file

@ -16,8 +16,8 @@ limitations under the License.
import {Store} from "../Store";
import {IDOMStorage} from "../types";
import {SESSION_E2EE_KEY_PREFIX} from "../../../e2ee/common.js";
import {LogItem} from "../../../../logging/LogItem.js";
import {parse, stringify} from "../../../../utils/typedJSON";
import type {ILogItem} from "../../../../logging/types";
export interface SessionEntry {
key: string;
@ -64,7 +64,7 @@ export class SessionStore {
});
}
async tryRestoreE2EEIdentityFromLocalStorage(log: LogItem): Promise<boolean> {
async tryRestoreE2EEIdentityFromLocalStorage(log: ILogItem): Promise<boolean> {
let success = false;
const lsPrefix = this._localStorageKeyPrefix;
const prefix = lsPrefix + SESSION_E2EE_KEY_PREFIX;

View file

@ -20,7 +20,7 @@ import { encodeUint32, decodeUint32 } from "../utils";
import {KeyLimits} from "../../common";
import {Store} from "../Store";
import {TimelineEvent, StateEvent} from "../../types";
import {LogItem} from "../../../../logging/LogItem.js";
import {ILogItem} from "../../../../logging/types";
interface Annotation {
count: number;
@ -286,7 +286,7 @@ export class TimelineEventStore {
*
* Returns if the event was not yet known and the entry was written.
*/
tryInsert(entry: TimelineEventEntry, log: LogItem): Promise<boolean> {
tryInsert(entry: TimelineEventEntry, log: ILogItem): Promise<boolean> {
(entry as TimelineEventStorageEntry).key = encodeKey(entry.roomId, entry.fragmentId, entry.eventIndex);
(entry as TimelineEventStorageEntry).eventIdKey = encodeEventIdKey(entry.roomId, entry.event.event_id);
return this._timelineStore.tryAdd(entry as TimelineEventStorageEntry, log);
@ -320,7 +320,7 @@ export class TimelineEventStore {
import {createMockStorage} from "../../../../mocks/Storage";
import {createEvent, withTextBody} from "../../../../mocks/event.js";
import {createEventEntry} from "../../../room/timeline/persistence/common.js";
import {Instance as logItem} from "../../../../logging/NullLogger.js";
import {Instance as nullLogger} from "../../../../logging/NullLogger";
export function tests() {
@ -368,7 +368,7 @@ export function tests() {
let eventKey = EventKey.defaultFragmentKey(109);
for (const insertedId of insertedIds) {
const entry = createEventEntry(eventKey.nextKey(), roomId, createEventWithId(insertedId));
assert(await txn.timelineEvents.tryInsert(entry, logItem));
assert(await txn.timelineEvents.tryInsert(entry, nullLogger.item));
eventKey = eventKey.nextKey();
}
const eventKeyMap = await txn.timelineEvents.getEventKeysForIds(roomId, checkedIds);
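
The recurring LogItem-to-ILogItem swap in these hunks is what lets the test above pass `nullLogger.item` where production code passes a real `LogItem`. A rough sketch of the idea, with a deliberately trimmed-down interface (the real `ILogItem` in `logging/types` has many more members):

```ts
// Sketch: code against a narrow logging interface rather than the concrete
// LogItem class. The real ILogItem in src/logging/types has a richer API.
interface ILogItem {
    log(label: string): void;
}

// A no-op item in the spirit of NullLogger's Instance.item, handy in tests.
const nullItem: ILogItem = {
    log(): void {},
};

// Storage-layer code only needs the interface, so tests can pass the null item
// and production code can pass a real LogItem.
async function tryInsertSketch(eventId: string, log: ILogItem): Promise<boolean> {
    log.log(`inserting ${eventId}`);
    return true;
}

tryInsertSketch("$event:hs", nullItem).then(ok => console.log(ok)); // true
```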

View file

@ -17,7 +17,7 @@ limitations under the License.
import { IDBRequestError } from "./error";
import { StorageError } from "../common";
import { AbortError } from "../../../utils/error.js";
import { AbortError } from "../../../utils/error";
let needsSyncPromise = false;

View file

@ -14,7 +14,7 @@ See the License for the specific language governing permissions and
limitations under the License.
*/
import {AbortError} from "../utils/error.js";
import {AbortError} from "../utils/error";
export class BaseRequest {
constructor() {

View file

@ -18,7 +18,7 @@ import {FDBFactory, FDBKeyRange} from "../../lib/fake-indexeddb/index.js";
import {StorageFactory} from "../matrix/storage/idb/StorageFactory";
import {IDOMStorage} from "../matrix/storage/idb/types";
import {Storage} from "../matrix/storage/idb/Storage";
import {Instance as nullLogger} from "../logging/NullLogger.js";
import {Instance as nullLogger} from "../logging/NullLogger";
import {openDatabase, CreateObjectStore} from "../matrix/storage/idb/utils";
export function createMockStorage(): Promise<Storage> {

View file

@ -14,7 +14,7 @@ See the License for the specific language governing permissions and
limitations under the License.
*/
import {AbortError} from "../utils/error.js";
import {AbortError} from "../utils/error";
import {BaseObservable} from "./BaseObservable";
// like an EventEmitter, but doesn't have an event type

View file

@ -46,6 +46,14 @@ export class ObservableArray<T> extends BaseObservableList<T> {
this.emitAdd(idx, item);
}
move(fromIdx: number, toIdx: number): void {
if (fromIdx < this._items.length && toIdx < this._items.length) {
const [item] = this._items.splice(fromIdx, 1);
this._items.splice(toIdx, 0, item);
this.emitMove(fromIdx, toIdx, item);
}
}
update(idx: number, item: T, params: any = null): void {
if (idx < this._items.length) {
this._items[idx] = item;
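
To see what the new `move` does for subscribers, here is a hedged usage sketch. It assumes `ObservableArray` accepts initial values and that `subscribe` returns an unsubscribe function, as used elsewhere in this codebase; the import path assumes the sketch sits next to `ObservableArray.ts`.

```ts
import {ObservableArray} from "./ObservableArray";

const list = new ObservableArray<string>(["a", "b", "c", "d"]);

// Subscribers receive the same onAdd/onRemove/onMove/onUpdate callbacks
// that views like LazyListView implement.
const unsubscribe = list.subscribe({
    onAdd: (idx, value) => console.log("add", idx, value),
    onRemove: (idx, value) => console.log("remove", idx, value),
    onUpdate: (idx, value) => console.log("update", idx, value),
    onMove: (fromIdx, toIdx, value) => console.log("move", value, fromIdx, "to", toIdx),
    onReset: () => console.log("reset"),
});

list.move(3, 0); // logs: move d 3 to 0; list is now ["d", "a", "b", "c"]
list.move(1, 9); // toIdx out of range: no change, no event emitted

unsubscribe();
```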

View file

@ -15,7 +15,7 @@ limitations under the License.
*/
import {BaseObservableList} from "./BaseObservableList";
import {sortedIndex} from "../../utils/sortedIndex.js";
import {sortedIndex} from "../../utils/sortedIndex";
import {findAndUpdateInArray} from "./common";
declare function sortedIndex<T>(array: T[], value: T, comparator: (left: T, right: T) => number): number;

View file

@ -15,7 +15,7 @@ limitations under the License.
*/
import {BaseObservableList} from "./BaseObservableList";
import {sortedIndex} from "../../utils/sortedIndex.js";
import {sortedIndex} from "../../utils/sortedIndex";
/*

View file

@ -15,7 +15,7 @@ limitations under the License.
*/
import aesjs from "../../../lib/aes-js/index.js";
import {hkdf} from "../../utils/crypto/hkdf.js";
import {hkdf} from "../../utils/crypto/hkdf";
import {Platform as ModernPlatform} from "./Platform.js";
export function Platform(container, paths) {

View file

@ -21,8 +21,8 @@ import {SessionInfoStorage} from "../../matrix/sessioninfo/localstorage/SessionI
import {SettingsStorage} from "./dom/SettingsStorage.js";
import {Encoding} from "./utils/Encoding.js";
import {OlmWorker} from "../../matrix/e2ee/OlmWorker.js";
import {IDBLogger} from "../../logging/IDBLogger.js";
import {ConsoleLogger} from "../../logging/ConsoleLogger.js";
import {IDBLogger} from "../../logging/IDBLogger";
import {ConsoleLogger} from "../../logging/ConsoleLogger";
import {RootView} from "./ui/RootView.js";
import {Clock} from "./dom/Clock.js";
import {ServiceWorkerHandler} from "./dom/ServiceWorkerHandler.js";
@ -35,7 +35,7 @@ import {WorkerPool} from "./dom/WorkerPool.js";
import {BlobHandle} from "./dom/BlobHandle.js";
import {hasReadPixelPermission, ImageHandle, VideoHandle} from "./dom/ImageHandle.js";
import {downloadInIframe} from "./dom/download.js";
import {Disposables} from "../../utils/Disposables.js";
import {Disposables} from "../../utils/Disposables";
import {parseHTML} from "./parsehtml.js";
import {handleAvatarError} from "./ui/avatar.js";
@ -132,11 +132,7 @@ export class Platform {
this.clock = new Clock();
this.encoding = new Encoding();
this.random = Math.random;
if (options?.development) {
this.logger = new ConsoleLogger({platform: this});
} else {
this.logger = new IDBLogger({name: "hydrogen_logs", platform: this});
}
this._createLogger(options?.development);
this.history = new History();
this.onlineStatus = new OnlineStatus();
this._serviceWorkerHandler = null;
@ -162,6 +158,21 @@ export class Platform {
this._disposables = new Disposables();
}
_createLogger(isDevelopment) {
// Make sure that loginToken does not end up in the logs
const transformer = (item) => {
if (item.e?.stack) {
item.e.stack = item.e.stack.replace(/(?<=\/\?loginToken=).+/, "<snip>");
}
return item;
};
if (isDevelopment) {
this.logger = new ConsoleLogger({platform: this});
} else {
this.logger = new IDBLogger({name: "hydrogen_logs", platform: this, serializedTransformer: transformer});
}
}
get updateService() {
return this._serviceWorkerHandler;
}
@ -272,3 +283,30 @@ export class Platform {
this._disposables.dispose();
}
}
import {LogItem} from "../../logging/LogItem";
export function tests() {
return {
"loginToken should not be in logs": (assert) => {
const transformer = (item) => {
if (item.e?.stack) {
item.e.stack = item.e.stack.replace(/(?<=\/\?loginToken=).+/, "<snip>");
}
return item;
};
const logger = {
_queuedItems: [],
_serializedTransformer: transformer,
_now: () => {}
};
logger.persist = IDBLogger.prototype._persistItem.bind(logger);
const logItem = new LogItem("test", 1, logger);
logItem.error = new Error();
logItem.error.stack = "main http://localhost:3000/src/main.js:55\n<anonymous> http://localhost:3000/?loginToken=secret:26"
logger.persist(logItem, null, false);
const item = logger._queuedItems.pop();
console.log(item);
assert.strictEqual(item.json.search("secret"), -1);
}
};
}
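
The transformer being tested relies on a lookbehind so only the token value is redacted while the rest of the stack trace is preserved. A standalone sketch of just that redaction step (the stack string is invented):

```ts
// Sketch of the redaction used by the serializedTransformer above: the
// lookbehind keeps "/?loginToken=" in place and snips everything after it
// on that line of the stack trace.
const redactLoginToken = (stack: string): string =>
    stack.replace(/(?<=\/\?loginToken=).+/, "<snip>");

const stack =
    "main http://localhost:3000/src/main.js:55\n" +
    "<anonymous> http://localhost:3000/?loginToken=secret:26";

console.log(redactLoginToken(stack));
// main http://localhost:3000/src/main.js:55
// <anonymous> http://localhost:3000/?loginToken=<snip>
```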

View file

@ -2,6 +2,7 @@
<html>
<head>
<meta charset="utf-8">
<title>Hydrogen Chat</title>
<meta name="viewport" content="width=device-width, user-scalable=no">
<meta name="application-name" content="Hydrogen Chat"/>
<meta name="apple-mobile-web-app-capable" content="yes">
@ -9,6 +10,7 @@
<meta name="apple-mobile-web-app-title" content="Hydrogen Chat">
<meta name="description" content="A matrix chat application">
<link rel="apple-touch-icon" href="assets/icon-maskable.png">
<link rel="icon" type="image/png" href="assets/icon-maskable.png">
<link rel="stylesheet" type="text/css" href="src/platform/web/ui/css/main.css">
<link rel="stylesheet" type="text/css" href="src/platform/web/ui/css/themes/element/theme.css" title="Element Theme">
<link rel="alternate stylesheet" type="text/css" href="src/platform/web/ui/css/themes/bubbles/theme.css" title="Bubbles Theme">

View file

@ -69,7 +69,14 @@ async function purgeOldCaches() {
}
self.addEventListener('fetch', (event) => {
event.respondWith(handleRequest(event.request));
/*
service worker shouldn't handle xhr uploads because otherwise
the progress events won't fire.
This has to do with xhr not being supported in service workers.
*/
if (event.request.method === "GET") {
event.respondWith(handleRequest(event.request));
}
});
function isCacheableThumbnail(url) {
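
To make the comment's reasoning concrete, here is a hedged page-side sketch of the kind of upload the GET-only guard protects: progress comes from the XHR upload object, which a service worker intercepting the POST would bypass. The endpoint and callback are illustrative, not code from this repository.

```ts
// Hypothetical page-side upload; URL and handlers are illustrative only.
function uploadWithProgress(url: string, body: Blob, onProgress: (sentBytes: number) => void): Promise<number> {
    return new Promise((resolve, reject) => {
        const xhr = new XMLHttpRequest();
        // progress fires on the page's XHR upload object; if the service worker
        // called respondWith() for this POST, these events would not fire
        xhr.upload.addEventListener("progress", evt => onProgress(evt.loaded));
        xhr.addEventListener("load", () => resolve(xhr.status));
        xhr.addEventListener("error", () => reject(new Error("upload failed")));
        xhr.open("POST", url);
        xhr.send(body);
    });
}

// uploadWithProgress("/_matrix/media/r0/upload", someBlob, sent => console.log(`${sent} bytes sent`));
```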

View file

@ -14,7 +14,7 @@ See the License for the specific language governing permissions and
limitations under the License.
*/
import {AbortError} from "../../../utils/error.js";
import {AbortError} from "../../../utils/error";
class Timeout {
constructor(ms) {

View file

@ -14,7 +14,7 @@ See the License for the specific language governing permissions and
limitations under the License.
*/
import {AbortError} from "../../../utils/error.js";
import {AbortError} from "../../../utils/error";
class WorkerState {
constructor(worker) {

View file

@ -19,7 +19,7 @@ import {
AbortError,
ConnectionError
} from "../../../../matrix/error.js";
import {abortOnTimeout} from "../../../../utils/timeout.js";
import {abortOnTimeout} from "../../../../utils/timeout";
import {addCacheBuster} from "./common.js";
import {xhrRequest} from "./xhr.js";

View file

@ -37,6 +37,11 @@ class RequestResult {
function createXhr(url, {method, headers, timeout, format, uploadProgress}) {
const xhr = new XMLHttpRequest();
if (uploadProgress) {
xhr.upload.addEventListener("progress", evt => uploadProgress(evt.loaded));
}
xhr.open(method, url);
if (format === "buffer") {
@ -56,10 +61,6 @@ function createXhr(url, {method, headers, timeout, format, uploadProgress}) {
xhr.timeout = timeout;
}
if (uploadProgress) {
xhr.upload.addEventListener("progress", evt => uploadProgress(evt.loaded));
}
return xhr;
}

View file

@ -54,15 +54,12 @@ limitations under the License.
padding: 0 0.4em 0.4em;
}
.SessionLoadStatusView, .LoginView_query-spinner {
.SessionLoadStatusView > .status, .LoginView_query-spinner {
display: flex;
gap: 12px;
}
.SessionLoadStatusView> :not(:first-child), .LoginView_query-spinner> :not(:first-child) {
margin-left: 12px;
}
.SessionLoadStatusView p, .LoginView_query-spinner p {
.SessionLoadStatusView > .status p, .LoginView_query-spinner p {
flex: 1;
margin: 0;
}

View file

@ -502,7 +502,7 @@ a {
}
.MessageComposer_input, .MessageComposer_replyPreview {
padding: 8px 16px;
padding: 8px;
}
.MessageComposer_replyPreview > .replying {
@ -536,14 +536,20 @@ a {
margin-left: 12px;
}
.MessageComposer_input > input {
padding: 0 16px;
.MessageComposer_input > textarea {
border: none;
border-radius: 24px;
background: #F6F6F6;
height: 48px;
font-size: 14px;
font-family: "Inter", sans-serif;
resize: none;
flex: 1;
padding: 14px;
box-sizing: border-box;
overflow: hidden;
max-height: 113px; /* 5 lines */
overflow-y: auto;
overflow-y: overlay;
}
.MessageComposer_input > button.send {
@ -559,6 +565,8 @@ a {
background-image: url('icons/send.svg');
background-repeat: no-repeat;
background-position: center;
align-self: end;
margin-bottom: 8px;
}
.MessageComposer_input > button.sendFile {
@ -575,8 +583,12 @@ a {
background-position: center;
}
.MessageComposer_input > button.send:disabled {
background-color: #E3E8F0;
.MessageComposer.MessageComposer_canSend button.sendFile {
display: none;
}
.MessageComposer:not(.MessageComposer_canSend) button.send {
display: none;
}
.SettingsBody {
@ -629,6 +641,19 @@ a {
.Settings .row .label {
flex: 0 0 200px;
align-self: flex-start;
}
.Settings .row .content p {
margin: 8px 0;
}
.Settings .row .content p:first-child {
margin-top: 0;
}
.Settings .row .content p:last-child {
margin-bottom: 0;
}
.error {

View file

@ -1,280 +0,0 @@
/*
Copyright 2021 The Matrix.org Foundation C.I.C.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
*/
import {el} from "./html";
import {mountView} from "./utils";
import {ListView} from "./ListView";
import {insertAt} from "./utils";
class ItemRange {
constructor(topCount, renderCount, bottomCount) {
this.topCount = topCount;
this.renderCount = renderCount;
this.bottomCount = bottomCount;
}
contains(range) {
// don't contain empty ranges
// as it will prevent clearing the list
// once it is scrolled far enough out of view
if (!range.renderCount && this.renderCount) {
return false;
}
return range.topCount >= this.topCount &&
(range.topCount + range.renderCount) <= (this.topCount + this.renderCount);
}
containsIndex(idx) {
return idx >= this.topCount && idx <= (this.topCount + this.renderCount);
}
expand(amount) {
// don't expand ranges that won't render anything
if (this.renderCount === 0) {
return this;
}
const topGrow = Math.min(amount, this.topCount);
const bottomGrow = Math.min(amount, this.bottomCount);
return new ItemRange(
this.topCount - topGrow,
this.renderCount + topGrow + bottomGrow,
this.bottomCount - bottomGrow,
);
}
totalSize() {
return this.topCount + this.renderCount + this.bottomCount;
}
normalize(idx) {
/*
map index from list to index in rendered range
eg: if the index range of this._list is [0, 200] and we have rendered
elements in range [50, 60] then index 50 in list must map to index 0
in DOM tree/childInstance array.
*/
return idx - this.topCount;
}
}
export class LazyListView extends ListView {
constructor({itemHeight, overflowMargin = 5, overflowItems = 20,...options}, childCreator) {
super(options, childCreator);
this._itemHeight = itemHeight;
this._overflowMargin = overflowMargin;
this._overflowItems = overflowItems;
}
_getVisibleRange() {
const length = this._list ? this._list.length : 0;
const scrollTop = this._parent.scrollTop;
const topCount = Math.min(Math.max(0, Math.floor(scrollTop / this._itemHeight)), length);
const itemsAfterTop = length - topCount;
const visibleItems = this._height !== 0 ? Math.ceil(this._height / this._itemHeight) : 0;
const renderCount = Math.min(visibleItems, itemsAfterTop);
const bottomCount = itemsAfterTop - renderCount;
return new ItemRange(topCount, renderCount, bottomCount);
}
_renderIfNeeded(forceRender = false) {
/*
forceRender only because we don't optimize onAdd/onRemove yet.
Ideally, onAdd/onRemove should only render whatever has changed + update padding + update renderRange
*/
const range = this._getVisibleRange();
const intersectRange = range.expand(this._overflowMargin);
const renderRange = range.expand(this._overflowItems);
// only update render Range if the new range + overflowMargin isn't contained by the old anymore
// or if we are force rendering
if (forceRender || !this._renderRange.contains(intersectRange)) {
this._renderRange = renderRange;
this._renderElementsInRange();
}
}
async _initialRender() {
/*
Wait two frames for the return from mount() to be inserted into DOM.
This should be enough, but if this gives us trouble we can always use
MutationObserver.
*/
await new Promise(r => requestAnimationFrame(r));
await new Promise(r => requestAnimationFrame(r));
this._height = this._parent.clientHeight;
if (this._height === 0) { console.error("LazyListView could not calculate parent height."); }
const range = this._getVisibleRange();
const renderRange = range.expand(this._overflowItems);
this._renderRange = renderRange;
this._renderElementsInRange();
}
_itemsFromList(start, end) {
const array = [];
let i = 0;
for (const item of this._list) {
if (i >= start && i < end) {
array.push(item);
}
i = i + 1;
}
return array;
}
_itemAtIndex(idx) {
let i = 0;
for (const item of this._list) {
if (i === idx) {
return item;
}
i = i + 1;
}
return null;
}
_renderElementsInRange() {
const { topCount, renderCount, bottomCount } = this._renderRange;
const paddingTop = topCount * this._itemHeight;
const paddingBottom = bottomCount * this._itemHeight;
const renderedItems = this._itemsFromList(topCount, topCount + renderCount);
this._root.style.paddingTop = `${paddingTop}px`;
this._root.style.paddingBottom = `${paddingBottom}px`;
for (const child of this._childInstances) {
this._removeChild(child);
}
this._childInstances = [];
const fragment = document.createDocumentFragment();
for (const item of renderedItems) {
const view = this._childCreator(item);
this._childInstances.push(view);
fragment.appendChild(mountView(view, this._mountArgs));
}
this._root.appendChild(fragment);
}
mount() {
const root = super.mount();
this._parent = el("div", {className: "LazyListParent"}, root);
/*
Hooking to scroll events can be expensive.
Do we need to do more (like event throttling)?
*/
this._parent.addEventListener("scroll", () => this._renderIfNeeded());
this._initialRender();
return this._parent;
}
update(attributes) {
this._renderRange = null;
super.update(attributes);
this._initialRender();
}
loadList() {
if (!this._list) { return; }
this._subscription = this._list.subscribe(this);
this._childInstances = [];
/*
super.loadList() would render the entire list at this point.
We instead lazy render a part of the list in _renderIfNeeded
*/
}
_removeChild(child) {
child.root().remove();
child.unmount();
}
// If size of the list changes, re-render
onAdd() {
this._renderIfNeeded(true);
}
onRemove() {
this._renderIfNeeded(true);
}
onUpdate(idx, value, params) {
if (this._renderRange.containsIndex(idx)) {
const normalizedIdx = this._renderRange.normalize(idx);
super.onUpdate(normalizedIdx, value, params);
}
}
recreateItem(idx, value) {
if (this._renderRange.containsIndex(idx)) {
const normalizedIdx = this._renderRange.normalize(idx);
super.recreateItem(normalizedIdx, value)
}
}
/**
* Render additional element from top or bottom to offset the outgoing element
*/
_renderExtraOnMove(fromIdx, toIdx) {
const {topCount, renderCount} = this._renderRange;
if (toIdx < fromIdx) {
// Element is moved up the list, so render element from top boundary
const index = topCount;
const child = this._childCreator(this._itemAtIndex(index));
this._childInstances.unshift(child);
this._root.insertBefore(mountView(child, this._mountArgs), this._root.firstChild);
}
else {
// Element is moved down the list, so render element from bottom boundary
const index = topCount + renderCount - 1;
const child = this._childCreator(this._itemAtIndex(index));
this._childInstances.push(child);
this._root.appendChild(mountView(child, this._mountArgs));
}
}
/**
* Remove an element from top or bottom to make space for the incoming element
*/
_removeElementOnMove(fromIdx, toIdx) {
// If element comes from the bottom, remove element at bottom and vice versa
const child = toIdx < fromIdx ? this._childInstances.pop() : this._childInstances.shift();
this._removeChild(child);
}
onMove(fromIdx, toIdx, value) {
const fromInRange = this._renderRange.containsIndex(fromIdx);
const toInRange = this._renderRange.containsIndex(toIdx);
const normalizedFromIdx = this._renderRange.normalize(fromIdx);
const normalizedToIdx = this._renderRange.normalize(toIdx);
if (fromInRange && toInRange) {
super.onMove(normalizedFromIdx, normalizedToIdx, value);
}
else if (fromInRange && !toInRange) {
this.onBeforeListChanged();
const [child] = this._childInstances.splice(normalizedFromIdx, 1);
this._removeChild(child);
this._renderExtraOnMove(fromIdx, toIdx);
this.onListChanged();
}
else if (!fromInRange && toInRange) {
this.onBeforeListChanged();
const child = this._childCreator(value);
this._removeElementOnMove(fromIdx, toIdx);
this._childInstances.splice(normalizedToIdx, 0, child);
insertAt(this._root, normalizedToIdx, mountView(child, this._mountArgs));
this.onListChanged();
}
}
}

View file

@ -0,0 +1,201 @@
/*
Copyright 2021 The Matrix.org Foundation C.I.C.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
*/
import {tag} from "./html";
import {removeChildren, mountView} from "./utils";
import {ListRange, ResultType, AddRemoveResult} from "./ListRange";
import {ListView, IOptions as IParentOptions} from "./ListView";
import {IView} from "./types";
export interface IOptions<T, V> extends IParentOptions<T, V> {
itemHeight: number;
overflowItems?: number;
}
export class LazyListView<T, V extends IView> extends ListView<T, V> {
private renderRange?: ListRange;
private height?: number;
private itemHeight: number;
private overflowItems: number;
private scrollContainer?: HTMLElement;
constructor(
{itemHeight, overflowItems = 20, ...options}: IOptions<T, V>,
childCreator: (value: T) => V
) {
super(options, childCreator);
this.itemHeight = itemHeight;
this.overflowItems = overflowItems;
}
handleEvent(e: Event) {
if (e.type === "scroll") {
this.handleScroll();
} else {
super.handleEvent(e);
}
}
handleScroll() {
const visibleRange = this._getVisibleRange();
// don't contain empty ranges
// as it will prevent clearing the list
// once it is scrolled far enough out of view
if (visibleRange.length !== 0 && !this.renderRange!.contains(visibleRange)) {
const prevRenderRange = this.renderRange!;
this.renderRange = visibleRange.expand(this.overflowItems);
this.renderUpdate(prevRenderRange, this.renderRange);
}
}
// override
async loadList() {
/*
Wait two frames for the return from mount() to be inserted into DOM.
This should be enough, but if this gives us trouble we can always use
MutationObserver.
*/
await new Promise(r => requestAnimationFrame(r));
await new Promise(r => requestAnimationFrame(r));
if (!this._list) {
return;
}
this._subscription = this._list.subscribe(this);
const visibleRange = this._getVisibleRange();
this.renderRange = visibleRange.expand(this.overflowItems);
this._childInstances = [];
this.reRenderFullRange(this.renderRange);
}
private _getVisibleRange() {
const {clientHeight, scrollTop} = this.root()!;
if (clientHeight === 0) {
throw new Error("LazyListView height is 0");
}
return ListRange.fromViewport(this._list.length, this.itemHeight, clientHeight, scrollTop);
}
private reRenderFullRange(range: ListRange) {
removeChildren(this._listElement!);
const fragment = document.createDocumentFragment();
const it = this._list[Symbol.iterator]();
this._childInstances!.length = 0;
range.forEachInIterator(it, item => {
const child = this._childCreator(item);
this._childInstances!.push(child);
fragment.appendChild(mountView(child, this._mountArgs));
});
this._listElement!.appendChild(fragment);
this.adjustPadding(range);
}
private renderUpdate(prevRange: ListRange, newRange: ListRange) {
if (newRange.intersects(prevRange)) {
// remove children in reverse order so child index isn't affected by previous removals
for (const idxInList of prevRange.reverseIterable()) {
if (!newRange.containsIndex(idxInList)) {
const localIdx = idxInList - prevRange.start;
this.removeChild(localIdx);
}
}
// use forEachInIterator instead of for loop as we need to advance
// the list iterator to the start of the range first
newRange.forEachInIterator(this._list[Symbol.iterator](), (item, idxInList) => {
if (!prevRange.containsIndex(idxInList)) {
const localIdx = idxInList - newRange.start;
this.addChild(localIdx, item);
}
});
this.adjustPadding(newRange);
} else {
this.reRenderFullRange(newRange);
}
}
private adjustPadding(range: ListRange) {
const paddingTop = range.start * this.itemHeight;
const paddingBottom = (range.totalLength - range.end) * this.itemHeight;
const style = this._listElement!.style;
style.paddingTop = `${paddingTop}px`;
style.paddingBottom = `${paddingBottom}px`;
}
mount() {
const listElement = super.mount();
this.scrollContainer = tag.div({className: "LazyListParent"}, listElement) as HTMLElement;
this.scrollContainer.addEventListener("scroll", this);
return this.scrollContainer;
}
unmount() {
this.root()!.removeEventListener("scroll", this);
this.scrollContainer = undefined;
super.unmount();
}
root(): Element | undefined {
return this.scrollContainer;
}
private get _listElement(): HTMLElement | undefined {
return super.root() as HTMLElement | undefined;
}
onAdd(idx: number, value: T) {
const result = this.renderRange!.queryAdd(idx, value, this._list);
this.applyRemoveAddResult(result);
}
onRemove(idx: number, value: T) {
const result = this.renderRange!.queryRemove(idx, this._list);
this.applyRemoveAddResult(result);
}
onMove(fromIdx: number, toIdx: number, value: T) {
const result = this.renderRange!.queryMove(fromIdx, toIdx, value, this._list);
if (result) {
if (result.type === ResultType.Move) {
this.moveChild(
this.renderRange!.toLocalIndex(result.fromIdx),
this.renderRange!.toLocalIndex(result.toIdx)
);
} else {
this.applyRemoveAddResult(result);
}
}
}
onUpdate(i: number, value: T, params: any) {
if (this.renderRange!.containsIndex(i)) {
this.updateChild(this.renderRange!.toLocalIndex(i), value, params);
}
}
private applyRemoveAddResult(result: AddRemoveResult<T>) {
// order is important here, the new range can have a different start
if (result.type === ResultType.Remove || result.type === ResultType.RemoveAndAdd) {
this.removeChild(this.renderRange!.toLocalIndex(result.removeIdx));
}
if (result.newRange) {
this.renderRange = result.newRange;
this.adjustPadding(this.renderRange)
}
if (result.type === ResultType.Add || result.type === ResultType.RemoveAndAdd) {
this.addChild(this.renderRange!.toLocalIndex(result.addIdx), result.value);
}
}
}
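
A rough usage sketch for the rewritten view, not taken from this diff: a fixed-height child view, a child creator, and an observable list mounted into a scrollable container. Option names follow `IOptions` above and its `ListView` parent; `RoomTileView`, the item type and the import paths are placeholders.

```ts
import {LazyListView} from "./LazyListView";
import {ObservableArray} from "../../../../observable/list/ObservableArray";
import {tag} from "./html";

// Minimal fixed-height child view, structurally compatible with IView from ./types.
class RoomTileView {
    private _root?: HTMLElement;
    constructor(private name: string) {}

    mount(): HTMLElement {
        this._root = tag.li({className: "RoomTile"}, this.name) as HTMLElement;
        return this._root;
    }
    root(): HTMLElement | undefined { return this._root; }
    update(): void {}
    unmount(): void { this._root = undefined; }
}

const rooms = new ObservableArray<string>(["Matrix HQ", "Hydrogen", "Design"]);

const view = new LazyListView<string, RoomTileView>(
    {list: rooms, className: "RoomList", itemHeight: 44, overflowItems: 20},
    name => new RoomTileView(name)
);

// mount() returns the scroll container; give it a bounded height before
// loadList() runs, otherwise _getVisibleRange() sees clientHeight === 0 and throws.
document.body.appendChild(view.mount());
```

Note that the viewport maths in `loadList` and `handleScroll` is driven entirely by the fixed `itemHeight`, so variable-height children would need a different approach.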

Some files were not shown because too many files have changed in this diff.