62 Commits

Author SHA1 Message Date
static 614d0e74b4 Update package versions 2026-01-11 16:01:02 +09:00
static efc2b08b1f Merge pull request #18 from kmc7468/add-chunked-upload
Introduce chunked upload
2026-01-11 15:56:35 +09:00
static 80368c3a29 Minor refactoring 2 2026-01-11 15:54:05 +09:00
static 83369f83e3 Store chunked-upload paths in the DB 2026-01-11 15:16:03 +09:00
static 2801eed556 Minor refactoring 2026-01-11 14:35:30 +09:00
static 57c27b76be Switch thumbnail uploads to the new upload method as well 2026-01-11 14:07:32 +09:00
static 3628e6d21a Handle uploads with streaming as well 2026-01-11 13:19:54 +09:00
static 1efcdd68f1 Fix a bug where the download menu was not shown when loading videos via streaming 2026-01-11 09:25:40 +09:00
static 0c295a2ffa Implement streaming file decryption using a Service Worker 2026-01-11 09:06:49 +09:00
static 4b783a36e9 Switch file uploads to a chunking scheme 2026-01-11 04:45:21 +09:00
static b9e6f17b0c Prepend the IV to encrypted files and thumbnails before transmission 2026-01-11 00:29:59 +09:00
static 5d130204a6 Fix minor bugs 2026-01-06 07:46:07 +09:00
static 4997b1f38c Remove unnecessarily separated components 2026-01-06 07:17:58 +09:00
static 1d3704bfad Reduce flickering when navigating directory and category pages 2026-01-06 06:48:35 +09:00
static ae1d34fc6b Fix a bug where loading file, category, and directory info could trigger multiple network requests under certain conditions 2026-01-05 06:49:12 +09:00
static f10a0a2da3 Optimize thumbnail loading logic 2026-01-04 20:01:30 +09:00
static 0eb1d29259 Improve the scheduling logic of the Scheduler class 2026-01-04 17:54:42 +09:00
static cf0f8fe0b9 Add the missing @eslint/js package 2026-01-04 01:50:02 +09:00
static 30c56e0926 Fix a bug where info about deleted files, categories, and directories was not removed from IndexedDB 2026-01-03 00:54:32 +09:00
static 83d595636b Limit the memory used by concurrent file uploads to fix crashes caused by running out of memory 2026-01-02 23:00:25 +09:00
static 008c8ad6ba Fix a bug where no margin appeared at the bottom of the directory page 2026-01-02 18:24:09 +09:00
static 5729af380d Fix unnatural scrolling on the gallery page on mobile 2026-01-02 17:04:08 +09:00
static c0e71993e9 Merge pull request #16 from kmc7468/optimize-networking
Optimize network calls
2026-01-02 15:04:45 +09:00
static 280d46b48d Minor refactoring 2 2026-01-02 14:55:26 +09:00
static d1f9018213 Minor refactoring 2026-01-02 00:31:58 +09:00
static 2e3cd4f8a2 Fix a bug where network call results were not cached in IndexedDB 2026-01-01 23:52:47 +09:00
static d98be331ad Optimize network calls on the home, gallery, cache settings, and thumbnail settings pages 2026-01-01 23:31:01 +09:00
static 841c57e8fc Fix a bug where the cache page never finished loading when a cached copy of a deleted file existed 2026-01-01 21:41:53 +09:00
static 182ec18a2b Minor refactoring 2025-12-31 02:43:07 +09:00
static 7b666cf692 Fix a bug where files disappeared from the download/upload page list immediately after being downloaded/uploaded 2025-12-31 01:32:54 +09:00
static 26323c2d4d Refactor the frontend filesystem module 2025-12-31 00:43:12 +09:00
static e4413ddbf6 Optimize network calls on the file page 2025-12-30 23:30:50 +09:00
static b5522a4c6d Optimize network calls on the category page 2025-12-30 20:53:20 +09:00
static 1e57941f4c Render subdirectories on the directory page with a virtual list as well 2025-12-30 18:44:46 +09:00
static 409ae09f4f Optimize network calls on the directory page 2025-12-30 17:21:54 +09:00
static cdb652cacf Fix a bug where the home page layout broke when there were no photos or videos 2025-12-29 19:43:25 +09:00
static 15b6a53710 Minor refactoring 2 2025-12-29 18:14:42 +09:00
static 174305ca1b Use a virtual list to render file lists efficiently on the file and category pages as well 2025-12-27 23:27:57 +09:00
static 0d13d3baef Minor refactoring 2025-12-27 14:10:33 +09:00
static 576d41da7f Add a go-to-parent-directory button to the directory page 2025-12-27 03:04:09 +09:00
static 9eb67d5877 Add download and go-to-folder menu items to the file page 2025-12-27 02:37:56 +09:00
static a9da8435cb Set a maximum URL length on the tRPC client 2025-12-26 23:54:49 +09:00
static 3e98e3d591 Fix a bug where files were not displayed on the gallery page 2025-12-26 23:29:29 +09:00
static 27a46bcc2e Update the eslint.config.js file 2025-12-26 23:12:37 +09:00
static a1f30ee154 Show only photos and videos on the home and gallery pages 2025-12-26 22:58:09 +09:00
static 6d02178c69 Implement the home page 2025-12-26 22:47:31 +09:00
static ed21a9cd31 Implement the gallery page 2025-12-26 22:29:44 +09:00
static b7a7536461 Merge pull request #14 from kmc7468/migrate-to-trpc
Introduce tRPC
2025-12-26 15:58:24 +09:00
static 3eb7411438 Minor refactoring 3 2025-12-26 15:57:05 +09:00
static c9d4b10356 Minor refactoring 2 2025-12-26 15:45:03 +09:00
static d94d14cf83 Minor refactoring 2025-12-26 15:07:59 +09:00
static 3fc29cf8db Migrate the endpoints under /api/auth to tRPC 2025-12-25 23:44:23 +09:00
static b92b4a0b1b Migrate to Zod 4 2025-12-25 22:53:51 +09:00
static 6d95059450 Migrate most endpoints under /api/category, /api/directory, and /api/file to tRPC 2025-12-25 22:45:55 +09:00
static a08ddf2c09 Change the tRPC endpoint to /api/trpc 2025-12-25 20:22:58 +09:00
static 208252f6b2 Migrate the endpoints under /api/hsk, /api/mek, and /api/user to tRPC 2025-12-25 20:00:15 +09:00
static aa4a1a74ea Migrate the endpoints under /api/client to tRPC 2025-12-25 18:59:41 +09:00
static 640e12d2c3 Implement tRPC authorization middleware 2025-12-25 16:50:41 +09:00
static 7779910949 Initial tRPC setup 2025-11-02 23:09:01 +09:00
static 328baba395 Update package versions 2025-11-02 02:57:18 +09:00
static 4e91cdad95 Fix thumbnails not being shown until the file's DEK had been downloaded from the server 2025-07-20 05:17:38 +09:00
static 9f53874d1d Fix a bug where thumbnail generation never finished for video formats that could not be played 2025-07-17 01:54:58 +09:00
221 changed files with 6632 additions and 5578 deletions


@@ -12,6 +12,7 @@ node_modules
 /data
 /library
 /thumbnails
+/uploads

 # OS
 .DS_Store


@@ -12,3 +12,4 @@ USER_CLIENT_CHALLENGE_EXPIRES=
 SESSION_UPGRADE_CHALLENGE_EXPIRES=
 LIBRARY_PATH=
 THUMBNAILS_PATH=
+UPLOADS_PATH=

.gitignore

@@ -10,6 +10,7 @@ node_modules
 /data
 /library
 /thumbnails
+/uploads

 # OS
 .DS_Store


@@ -2,11 +2,7 @@
 FROM node:22-alpine AS base
 WORKDIR /app
-RUN apk add --no-cache bash curl && \
-    curl -o /usr/local/bin/wait-for-it https://raw.githubusercontent.com/vishnubob/wait-for-it/master/wait-for-it.sh && \
-    chmod +x /usr/local/bin/wait-for-it
-RUN npm install -g pnpm@9
+RUN npm install -g pnpm@10
 COPY pnpm-lock.yaml .

 # Build Stage
@@ -29,4 +25,4 @@ COPY --from=build /app/build ./build
 EXPOSE 3000
 ENV BODY_SIZE_LIMIT=Infinity
-CMD ["bash", "-c", "wait-for-it ${DATABASE_HOST:-localhost}:${DATABASE_PORT:-5432} -- node ./build/index.js"]
+CMD ["node", "./build/index.js"]


@@ -3,7 +3,8 @@ services:
     build: .
     restart: unless-stopped
     depends_on:
-      - database
+      database:
+        condition: service_healthy
     user: ${CONTAINER_UID:-0}:${CONTAINER_GID:-0}
     volumes:
       - ./data/library:/app/data/library
@@ -19,6 +20,7 @@ services:
       - SESSION_UPGRADE_CHALLENGE_EXPIRES
       - LIBRARY_PATH=/app/data/library
       - THUMBNAILS_PATH=/app/data/thumbnails
+      - UPLOADS_PATH=/app/data/uploads
       # SvelteKit
       - ADDRESS_HEADER=${TRUST_PROXY:+X-Forwarded-For}
       - XFF_DEPTH=${TRUST_PROXY:-}
@@ -35,3 +37,8 @@ services:
     environment:
       - POSTGRES_USER=arkvault
       - POSTGRES_PASSWORD=${DATABASE_PASSWORD:?}
+    healthcheck:
+      test: ["CMD-SHELL", "pg_isready -U $${POSTGRES_USER}"]
+      interval: 5s
+      timeout: 5s
+      retries: 5


@@ -1,21 +1,24 @@
-import prettier from "eslint-config-prettier";
-import js from "@eslint/js";
 import { includeIgnoreFile } from "@eslint/compat";
+import js from "@eslint/js";
+import { defineConfig } from "eslint/config";
+import prettier from "eslint-config-prettier";
 import svelte from "eslint-plugin-svelte";
 import tailwind from "eslint-plugin-tailwindcss";
 import globals from "globals";
+import { fileURLToPath } from "node:url";
 import ts from "typescript-eslint";
-import { fileURLToPath } from "url";
 import svelteConfig from "./svelte.config.js";

 const gitignorePath = fileURLToPath(new URL("./.gitignore", import.meta.url));

-export default ts.config(
+export default defineConfig(
   includeIgnoreFile(gitignorePath),
   js.configs.recommended,
   ...ts.configs.recommended,
-  ...svelte.configs["flat/recommended"],
+  ...svelte.configs.recommended,
   ...tailwind.configs["flat/recommended"],
   prettier,
-  ...svelte.configs["flat/prettier"],
+  ...svelte.configs.prettier,
   {
     languageOptions: {
       globals: {
@@ -23,13 +26,18 @@ export default defineConfig(
         ...globals.node,
       },
     },
+    rules: {
+      "no-undef": "off",
+    },
   },
   {
-    files: ["**/*.svelte"],
+    files: ["**/*.svelte", "**/*.svelte.ts", "**/*.svelte.js"],
     languageOptions: {
       parserOptions: {
+        projectService: true,
         extraFileExtensions: [".svelte"],
         parser: ts.parser,
+        svelteConfig,
       },
     },
   },


@@ -1,7 +1,7 @@
 {
   "name": "arkvault",
   "private": true,
-  "version": "0.5.1",
+  "version": "0.8.0",
   "type": "module",
   "scripts": {
     "dev": "vite dev",
@@ -16,53 +16,58 @@
     "db:migrate": "kysely migrate"
   },
   "devDependencies": {
-    "@eslint/compat": "^1.3.1",
-    "@iconify-json/material-symbols": "^1.2.29",
-    "@sveltejs/adapter-node": "^5.2.13",
-    "@sveltejs/kit": "^2.22.5",
-    "@sveltejs/vite-plugin-svelte": "^4.0.4",
+    "@eslint/compat": "^2.0.1",
+    "@eslint/js": "^9.39.2",
+    "@iconify-json/material-symbols": "^1.2.51",
+    "@noble/hashes": "^2.0.1",
+    "@sveltejs/adapter-node": "^5.4.0",
+    "@sveltejs/kit": "^2.49.4",
+    "@sveltejs/vite-plugin-svelte": "^6.2.4",
+    "@tanstack/svelte-virtual": "^3.13.18",
+    "@trpc/client": "^11.8.1",
     "@types/file-saver": "^2.0.7",
     "@types/ms": "^0.7.34",
     "@types/node-schedule": "^2.1.8",
-    "@types/pg": "^8.15.4",
-    "autoprefixer": "^10.4.21",
-    "axios": "^1.10.0",
-    "dexie": "^4.0.11",
-    "eslint": "^9.30.1",
-    "eslint-config-prettier": "^10.1.5",
-    "eslint-plugin-svelte": "^3.10.1",
-    "eslint-plugin-tailwindcss": "^3.18.0",
-    "exifreader": "^4.31.1",
+    "@types/pg": "^8.16.0",
+    "autoprefixer": "^10.4.23",
+    "axios": "^1.13.2",
+    "dexie": "^4.2.1",
+    "eslint": "^9.39.2",
+    "eslint-config-prettier": "^10.1.8",
+    "eslint-plugin-svelte": "^3.14.0",
+    "eslint-plugin-tailwindcss": "^3.18.2",
+    "exifreader": "^4.35.0",
     "file-saver": "^2.0.5",
-    "globals": "^16.3.0",
+    "globals": "^17.0.0",
     "heic2any": "^0.0.4",
-    "kysely-ctl": "^0.13.1",
-    "lru-cache": "^11.1.0",
-    "mime": "^4.0.7",
-    "p-limit": "^6.2.0",
-    "prettier": "^3.6.2",
-    "prettier-plugin-svelte": "^3.4.0",
-    "prettier-plugin-tailwindcss": "^0.6.14",
-    "svelte": "^5.35.6",
-    "svelte-check": "^4.2.2",
-    "tailwindcss": "^3.4.17",
-    "typescript": "^5.8.3",
-    "typescript-eslint": "^8.36.0",
-    "unplugin-icons": "^22.1.0",
-    "vite": "^5.4.19"
+    "kysely-ctl": "^0.19.0",
+    "lru-cache": "^11.2.4",
+    "mime": "^4.1.0",
+    "p-limit": "^7.2.0",
+    "prettier": "^3.7.4",
+    "prettier-plugin-svelte": "^3.4.1",
+    "prettier-plugin-tailwindcss": "^0.7.2",
+    "svelte": "^5.46.1",
+    "svelte-check": "^4.3.5",
+    "tailwindcss": "^3.4.19",
+    "typescript": "^5.9.3",
+    "typescript-eslint": "^8.52.0",
+    "unplugin-icons": "^22.5.0",
+    "vite": "^7.3.1"
   },
   "dependencies": {
     "@fastify/busboy": "^3.1.1",
-    "argon2": "^0.43.0",
-    "kysely": "^0.28.2",
+    "@trpc/server": "^11.8.1",
+    "argon2": "^0.44.0",
+    "kysely": "^0.28.9",
     "ms": "^2.1.3",
     "node-schedule": "^2.1.1",
     "pg": "^8.16.3",
-    "uuid": "^11.1.0",
-    "zod": "^3.25.76"
+    "superjson": "^2.2.6",
+    "uuid": "^13.0.0",
+    "zod": "^4.3.5"
   },
   "engines": {
     "node": "^22.0.0",
-    "pnpm": "^9.0.0"
+    "pnpm": "^10.0.0"
   }
 }

pnpm-lock.yaml: 2234 lines changed (diff suppressed because it is too large)


@@ -1,7 +1,6 @@
 import type { ClientInit } from "@sveltejs/kit";

 import { cleanupDanglingInfos, getClientKey, getMasterKeys, getHmacSecrets } from "$lib/indexedDB";
 import { prepareFileCache } from "$lib/modules/file";
-import { prepareOpfs } from "$lib/modules/opfs";
 import { clientKeyStore, masterKeyStore, hmacSecretStore } from "$lib/stores";

 const requestPersistentStorage = async () => {
@@ -46,7 +45,6 @@ export const init: ClientInit = async () => {
     prepareClientKeyStore(),
     prepareMasterKeyStore(),
     prepareHmacSecretStore(),
-    prepareOpfs(),
   ]);

   cleanupDanglingInfos(); // Intended


@@ -7,6 +7,7 @@ import {
   cleanupExpiredSessions,
   cleanupExpiredSessionUpgradeChallenges,
 } from "$lib/server/db/session";
+import { cleanupExpiredUploadSessions } from "$lib/server/services/upload";
 import { authenticate, setAgentInfo } from "$lib/server/middlewares";

 export const init: ServerInit = async () => {
@@ -16,6 +17,7 @@ export const init: ServerInit = async () => {
     cleanupExpiredUserClientChallenges();
     cleanupExpiredSessions();
     cleanupExpiredSessionUpgradeChallenges();
+    cleanupExpiredUploadSessions();
   });
 };


@@ -0,0 +1,60 @@
<script lang="ts">
import { createWindowVirtualizer } from "@tanstack/svelte-virtual";
import type { Snippet } from "svelte";
import type { ClassValue } from "svelte/elements";
interface Props {
class?: ClassValue;
count: number;
item: Snippet<[index: number]>;
itemHeight: (index: number) => number;
itemGap?: number;
placeholder?: Snippet;
}
let { class: className, count, item, itemHeight, itemGap, placeholder }: Props = $props();
let element: HTMLElement | undefined = $state();
let scrollMargin = $state(0);
let virtualizer = $derived(
createWindowVirtualizer({
count,
estimateSize: itemHeight,
gap: itemGap,
scrollMargin,
}),
);
const measureItem = (node: HTMLElement) => {
$effect(() => $virtualizer.measureElement(node));
};
$effect(() => {
if (!element) return;
const observer = new ResizeObserver(() => {
scrollMargin = Math.round(element!.getBoundingClientRect().top + window.scrollY);
});
observer.observe(element.parentElement!);
return () => observer.disconnect();
});
</script>
<div bind:this={element} class={["relative", className]}>
<div style:height="{$virtualizer.getTotalSize()}px">
{#each $virtualizer.getVirtualItems() as virtualItem (virtualItem.key)}
<div
class="absolute left-0 top-0 w-full"
style:transform="translateY({virtualItem.start - scrollMargin}px)"
data-index={virtualItem.index}
use:measureItem
>
{@render item(virtualItem.index)}
</div>
{/each}
</div>
{#if placeholder && count === 0}
{@render placeholder()}
{/if}
</div>


@@ -0,0 +1,24 @@
<script lang="ts">
import { getFileThumbnail } from "$lib/modules/file";
import type { SummarizedFileInfo } from "$lib/modules/filesystem";
interface Props {
info: SummarizedFileInfo;
onclick?: (file: SummarizedFileInfo) => void;
}
let { info, onclick }: Props = $props();
let thumbnail = $derived(getFileThumbnail(info));
</script>
<button
onclick={onclick && (() => setTimeout(() => onclick(info), 100))}
class="aspect-square overflow-hidden rounded transition active:scale-95 active:brightness-90"
>
{#if $thumbnail}
<img src={$thumbnail} alt={info.name} class="h-full w-full object-cover" />
{:else}
<div class="h-full w-full bg-gray-100"></div>
{/if}
</button>


@@ -1,5 +1,6 @@
 export { default as ActionEntryButton } from "./ActionEntryButton.svelte";
 export { default as Button } from "./Button.svelte";
 export { default as EntryButton } from "./EntryButton.svelte";
+export { default as FileThumbnailButton } from "./FileThumbnailButton.svelte";
 export { default as FloatingButton } from "./FloatingButton.svelte";
 export { default as TextButton } from "./TextButton.svelte";


@@ -3,3 +3,4 @@ export * from "./buttons";
 export * from "./divs";
 export * from "./inputs";
 export { default as Modal } from "./Modal.svelte";
+export { default as RowVirtualizer } from "./RowVirtualizer.svelte";


@@ -0,0 +1,44 @@
<script module lang="ts">
import type { DataKey } from "$lib/modules/filesystem";
export interface SelectedCategory {
id: number;
dataKey?: DataKey;
name: string;
}
</script>
<script lang="ts">
import type { Component } from "svelte";
import type { SvelteHTMLElements } from "svelte/elements";
import { ActionEntryButton } from "$lib/components/atoms";
import { CategoryLabel } from "$lib/components/molecules";
import type { SubCategoryInfo } from "$lib/modules/filesystem";
import { sortEntries } from "$lib/utils";
interface Props {
categories: SubCategoryInfo[];
categoryMenuIcon?: Component<SvelteHTMLElements["svg"]>;
onCategoryClick: (category: SelectedCategory) => void;
onCategoryMenuClick?: (category: SelectedCategory) => void;
}
let { categories, categoryMenuIcon, onCategoryClick, onCategoryMenuClick }: Props = $props();
let categoriesWithName = $derived(sortEntries([...categories]));
</script>
{#if categoriesWithName.length > 0}
<div class="space-y-1">
{#each categoriesWithName as category (category.id)}
<ActionEntryButton
class="h-12"
onclick={() => onCategoryClick(category)}
actionButtonIcon={categoryMenuIcon}
onActionButtonClick={() => onCategoryMenuClick?.(category)}
>
<CategoryLabel name={category.name} />
</ActionEntryButton>
{/each}
</div>
{/if}


@@ -1,63 +0,0 @@
<script lang="ts">
import { untrack, type Component } from "svelte";
import type { SvelteHTMLElements } from "svelte/elements";
import { get, type Writable } from "svelte/store";
import type { CategoryInfo } from "$lib/modules/filesystem";
import { SortBy, sortEntries } from "$lib/modules/util";
import Category from "./Category.svelte";
import type { SelectedCategory } from "./service";
interface Props {
categories: Writable<CategoryInfo | null>[];
categoryMenuIcon?: Component<SvelteHTMLElements["svg"]>;
onCategoryClick: (category: SelectedCategory) => void;
onCategoryMenuClick?: (category: SelectedCategory) => void;
sortBy?: SortBy;
}
let {
categories,
categoryMenuIcon,
onCategoryClick,
onCategoryMenuClick,
sortBy = SortBy.NAME_ASC,
}: Props = $props();
let categoriesWithName: { name?: string; info: Writable<CategoryInfo | null> }[] = $state([]);
$effect(() => {
categoriesWithName = categories.map((category) => ({
name: get(category)?.name,
info: category,
}));
const sort = () => {
sortEntries(categoriesWithName, sortBy);
};
return untrack(() => {
sort();
const unsubscribes = categoriesWithName.map((category) =>
category.info.subscribe((value) => {
if (category.name === value?.name) return;
category.name = value?.name;
sort();
}),
);
return () => unsubscribes.forEach((unsubscribe) => unsubscribe());
});
});
</script>
{#if categoriesWithName.length > 0}
<div class="space-y-1">
{#each categoriesWithName as { info }}
<Category
{info}
menuIcon={categoryMenuIcon}
onclick={onCategoryClick}
onMenuClick={onCategoryMenuClick}
/>
{/each}
</div>
{/if}


@@ -1,43 +0,0 @@
<script lang="ts">
import type { Component } from "svelte";
import type { SvelteHTMLElements } from "svelte/elements";
import type { Writable } from "svelte/store";
import { ActionEntryButton } from "$lib/components/atoms";
import { CategoryLabel } from "$lib/components/molecules";
import type { CategoryInfo } from "$lib/modules/filesystem";
import type { SelectedCategory } from "./service";
interface Props {
info: Writable<CategoryInfo | null>;
menuIcon?: Component<SvelteHTMLElements["svg"]>;
onclick: (category: SelectedCategory) => void;
onMenuClick?: (category: SelectedCategory) => void;
}
let { info, menuIcon, onclick, onMenuClick }: Props = $props();
const openCategory = () => {
const { id, dataKey, dataKeyVersion, name } = $info as CategoryInfo;
if (!dataKey || !dataKeyVersion) return; // TODO: Error handling
onclick({ id, dataKey, dataKeyVersion, name });
};
const openMenu = () => {
const { id, dataKey, dataKeyVersion, name } = $info as CategoryInfo;
if (!dataKey || !dataKeyVersion) return; // TODO: Error handling
onMenuClick!({ id, dataKey, dataKeyVersion, name });
};
</script>
{#if $info}
<ActionEntryButton
class="h-12"
onclick={openCategory}
actionButtonIcon={menuIcon}
onActionButtonClick={openMenu}
>
<CategoryLabel name={$info.name!} />
</ActionEntryButton>
{/if}


@@ -1,2 +0,0 @@
export { default } from "./Categories.svelte";
export * from "./service";


@@ -1,6 +0,0 @@
export interface SelectedCategory {
id: number;
dataKey: CryptoKey;
dataKeyVersion: Date;
name: string;
}


@@ -1,10 +1,8 @@
 <script lang="ts">
   import type { Component } from "svelte";
   import type { ClassValue, SvelteHTMLElements } from "svelte/elements";
-  import type { Writable } from "svelte/store";
   import { Categories, IconEntryButton, type SelectedCategory } from "$lib/components/molecules";
-  import { getCategoryInfo, type CategoryInfo } from "$lib/modules/filesystem";
-  import { masterKeyStore } from "$lib/stores";
+  import type { CategoryInfo } from "$lib/modules/filesystem";

   import IconAddCircle from "~icons/material-symbols/add-circle";
@@ -27,14 +25,6 @@
     subCategoryCreatePosition = "bottom",
     subCategoryMenuIcon,
   }: Props = $props();
-
-  let subCategories: Writable<CategoryInfo | null>[] = $state([]);
-
-  $effect(() => {
-    subCategories = info.subCategoryIds.map((id) =>
-      getCategoryInfo(id, $masterKeyStore?.get(1)?.key!),
-    );
-  });
 </script>

 <div class={["space-y-1", className]}>
@@ -53,14 +43,12 @@
   {#if subCategoryCreatePosition === "top"}
     {@render subCategoryCreate()}
   {/if}
-  {#key info}
-    <Categories
-      categories={subCategories}
-      categoryMenuIcon={subCategoryMenuIcon}
-      onCategoryClick={onSubCategoryClick}
-      onCategoryMenuClick={onSubCategoryMenuClick}
-    />
-  {/key}
+  <Categories
+    categories={info.subCategories}
+    categoryMenuIcon={subCategoryMenuIcon}
+    onCategoryClick={onSubCategoryClick}
+    onCategoryMenuClick={onSubCategoryMenuClick}
+  />
   {#if subCategoryCreatePosition === "bottom"}
     {@render subCategoryCreate()}
   {/if}


@@ -1,7 +1,7 @@
 export * from "./ActionModal.svelte";
 export { default as ActionModal } from "./ActionModal.svelte";
-export * from "./Categories";
-export { default as Categories } from "./Categories";
+export * from "./Categories.svelte";
+export { default as Categories } from "./Categories.svelte";
 export { default as IconEntryButton } from "./IconEntryButton.svelte";
 export * from "./labels";
 export { default as SubCategories } from "./SubCategories.svelte";


@@ -3,6 +3,7 @@
   import { IconLabel } from "$lib/components/molecules";

   import IconFolder from "~icons/material-symbols/folder";
+  import IconDriveFolderUpload from "~icons/material-symbols/drive-folder-upload";
   import IconDraft from "~icons/material-symbols/draft";

   interface Props {
@@ -11,7 +12,7 @@
     subtext?: string;
     textClass?: ClassValue;
     thumbnail?: string;
-    type: "directory" | "file";
+    type: "directory" | "parent-directory" | "file";
   }

   let {
@@ -30,6 +31,8 @@
     <img src={thumbnail} alt={name} loading="lazy" class="aspect-square rounded object-cover" />
   {:else if type === "directory"}
     <IconFolder />
+  {:else if type === "parent-directory"}
+    <IconDriveFolderUpload class="text-yellow-500" />
   {:else}
     <IconDraft class="text-blue-400" />
   {/if}


@@ -1,107 +0,0 @@
<script lang="ts">
import { untrack } from "svelte";
import { get, type Writable } from "svelte/store";
import { CheckBox } from "$lib/components/atoms";
import { SubCategories, type SelectedCategory } from "$lib/components/molecules";
import { getFileInfo, type FileInfo, type CategoryInfo } from "$lib/modules/filesystem";
import { SortBy, sortEntries } from "$lib/modules/util";
import { masterKeyStore } from "$lib/stores";
import File from "./File.svelte";
import type { SelectedFile } from "./service";
import IconMoreVert from "~icons/material-symbols/more-vert";
interface Props {
info: CategoryInfo;
onFileClick: (file: SelectedFile) => void;
onFileRemoveClick: (file: SelectedFile) => void;
onSubCategoryClick: (subCategory: SelectedCategory) => void;
onSubCategoryCreateClick: () => void;
onSubCategoryMenuClick: (subCategory: SelectedCategory) => void;
sortBy?: SortBy;
isFileRecursive: boolean;
}
let {
info,
onFileClick,
onFileRemoveClick,
onSubCategoryClick,
onSubCategoryCreateClick,
onSubCategoryMenuClick,
sortBy = SortBy.NAME_ASC,
isFileRecursive = $bindable(),
}: Props = $props();
let files: { name?: string; info: Writable<FileInfo | null>; isRecursive: boolean }[] = $state(
[],
);
$effect(() => {
files =
info.files
?.filter(({ isRecursive }) => isFileRecursive || !isRecursive)
.map(({ id, isRecursive }) => {
const info = getFileInfo(id, $masterKeyStore?.get(1)?.key!);
return {
name: get(info)?.name,
info,
isRecursive,
};
}) ?? [];
const sort = () => {
sortEntries(files, sortBy);
};
return untrack(() => {
sort();
const unsubscribes = files.map((file) =>
file.info.subscribe((value) => {
if (file.name === value?.name) return;
file.name = value?.name;
sort();
}),
);
return () => unsubscribes.forEach((unsubscribe) => unsubscribe());
});
});
</script>
<div class="space-y-4">
<div class="space-y-4 bg-white p-4">
{#if info.id !== "root"}
<p class="text-lg font-bold text-gray-800">하위 카테고리</p>
{/if}
<SubCategories
{info}
{onSubCategoryClick}
{onSubCategoryCreateClick}
{onSubCategoryMenuClick}
subCategoryMenuIcon={IconMoreVert}
/>
</div>
{#if info.id !== "root"}
<div class="space-y-4 bg-white p-4">
<div class="flex items-center justify-between">
<p class="text-lg font-bold text-gray-800">파일</p>
<CheckBox bind:checked={isFileRecursive}>
<p class="font-medium">하위 카테고리의 파일</p>
</CheckBox>
</div>
<div class="space-y-1">
{#key info}
{#each files as { info, isRecursive }}
<File
{info}
onclick={onFileClick}
onRemoveClick={!isRecursive ? onFileRemoveClick : undefined}
/>
{:else}
<p class="text-gray-500 text-center">이 카테고리에 추가된 파일이 없어요.</p>
{/each}
{/key}
</div>
</div>
{/if}
</div>


@@ -1,59 +0,0 @@
<script lang="ts">
import type { Writable } from "svelte/store";
import { ActionEntryButton } from "$lib/components/atoms";
import { DirectoryEntryLabel } from "$lib/components/molecules";
import type { FileInfo } from "$lib/modules/filesystem";
import { requestFileThumbnailDownload, type SelectedFile } from "./service";
import IconClose from "~icons/material-symbols/close";
interface Props {
info: Writable<FileInfo | null>;
onclick: (selectedFile: SelectedFile) => void;
onRemoveClick?: (selectedFile: SelectedFile) => void;
}
let { info, onclick, onRemoveClick }: Props = $props();
let thumbnail: string | undefined = $state();
const openFile = () => {
const { id, dataKey, dataKeyVersion, name } = $info as FileInfo;
if (!dataKey || !dataKeyVersion) return; // TODO: Error handling
onclick({ id, dataKey, dataKeyVersion, name });
};
const removeFile = () => {
const { id, dataKey, dataKeyVersion, name } = $info as FileInfo;
if (!dataKey || !dataKeyVersion) return; // TODO: Error handling
onRemoveClick!({ id, dataKey, dataKeyVersion, name });
};
$effect(() => {
if ($info?.dataKey) {
requestFileThumbnailDownload($info.id, $info.dataKey)
.then((thumbnailUrl) => {
thumbnail = thumbnailUrl ?? undefined;
})
.catch(() => {
// TODO: Error Handling
thumbnail = undefined;
});
} else {
thumbnail = undefined;
}
});
</script>
{#if $info}
<ActionEntryButton
class="h-12"
onclick={openFile}
actionButtonIcon={onRemoveClick && IconClose}
onActionButtonClick={removeFile}
>
<DirectoryEntryLabel type="file" {thumbnail} name={$info.name} />
</ActionEntryButton>
{/if}


@@ -1,2 +0,0 @@
export { default } from "./Category.svelte";
export * from "./service";


@@ -1,8 +0,0 @@
export { requestFileThumbnailDownload } from "$lib/services/file";
export interface SelectedFile {
id: number;
dataKey: CryptoKey;
dataKeyVersion: Date;
name: string;
}


@@ -1,3 +1 @@
export * from "./Category";
export { default as Category } from "./Category";
export * from "./modals";


@@ -0,0 +1,2 @@
export * from "./serviceWorker";
export * from "./upload";


@@ -0,0 +1 @@
export const DECRYPTED_FILE_URL_PREFIX = "/_internal/decryptedFile/";
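The prefix above is what the Service Worker introduced in commit 0c295a2ffa can use to recognize requests for decrypted files. As a minimal sketch of such URL matching: the prefix comes from the source, but the helper name and the id-extraction logic are assumptions for illustration, not the repository's actual handler.

```typescript
// Hypothetical helper a Service Worker fetch handler could use to decide
// whether a request targets a decrypted-file URL. Only the prefix below is
// taken from the source; everything else is an illustrative assumption.
const DECRYPTED_FILE_URL_PREFIX = "/_internal/decryptedFile/";

// Returns the file id portion of a matching URL, or null for non-matching
// URLs (including a bare prefix with no id).
export const parseDecryptedFileUrl = (url: string): string | null => {
  const { pathname } = new URL(url, "http://localhost");
  if (!pathname.startsWith(DECRYPTED_FILE_URL_PREFIX)) return null;
  const id = pathname.slice(DECRYPTED_FILE_URL_PREFIX.length);
  return id.length > 0 ? id : null;
};
```

A real handler would intercept the `fetch` event, look up the file's key, and stream the decrypted bytes back as the response body.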


@@ -0,0 +1,6 @@
export const AES_GCM_IV_SIZE = 12;
export const AES_GCM_TAG_SIZE = 16;
export const ENCRYPTION_OVERHEAD = AES_GCM_IV_SIZE + AES_GCM_TAG_SIZE;
export const CHUNK_SIZE = 4 * 1024 * 1024; // 4 MiB
export const ENCRYPTED_CHUNK_SIZE = CHUNK_SIZE + ENCRYPTION_OVERHEAD;
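Given these constants, the on-the-wire size of an encrypted file follows directly: each 4 MiB plaintext chunk is encrypted independently, so it gains one 12-byte IV and one 16-byte GCM tag. A small sketch of that arithmetic; the helper names are illustrative and not part of the source:

```typescript
// Mirrors the constants in the file above.
const AES_GCM_IV_SIZE = 12;
const AES_GCM_TAG_SIZE = 16;
const ENCRYPTION_OVERHEAD = AES_GCM_IV_SIZE + AES_GCM_TAG_SIZE; // 28 bytes per chunk
const CHUNK_SIZE = 4 * 1024 * 1024; // 4 MiB of plaintext per chunk

// Number of chunks a plaintext of the given size splits into.
export const chunkCount = (fileSize: number): number => Math.ceil(fileSize / CHUNK_SIZE);

// Total encrypted size: every chunk carries one IV and one GCM tag.
export const encryptedSize = (fileSize: number): number =>
  fileSize + chunkCount(fileSize) * ENCRYPTION_OVERHEAD;
```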


@@ -1,11 +0,0 @@
export const callGetApi = async (input: RequestInfo, fetchInternal = fetch) => {
return await fetchInternal(input);
};
export const callPostApi = async <T>(input: RequestInfo, payload?: T, fetchInternal = fetch) => {
return await fetchInternal(input, {
method: "POST",
headers: { "Content-Type": "application/json" },
body: payload ? JSON.stringify(payload) : undefined,
});
};


@@ -1,2 +0,0 @@
export * from "./callApi";
export * from "./gotoStateful";


@@ -1,7 +1,5 @@
 import { Dexie, type EntityTable } from "dexie";

 export type DirectoryId = "root" | number;

 interface DirectoryInfo {
   id: number;
   parentId: DirectoryId;
@@ -15,17 +13,15 @@ interface FileInfo {
   contentType: string;
   createdAt?: Date;
   lastModifiedAt: Date;
-  categoryIds: number[];
+  categoryIds?: number[];
 }

 export type CategoryId = "root" | number;

 interface CategoryInfo {
   id: number;
   parentId: CategoryId;
   name: string;
-  files: { id: number; isRecursive: boolean }[];
-  isFileRecursive: boolean;
+  files?: { id: number; isRecursive: boolean }[];
+  isFileRecursive?: boolean;
 }

 const filesystem = new Dexie("filesystem") as Dexie & {
@@ -59,13 +55,23 @@ export const getDirectoryInfo = async (id: number) => {
 };

 export const storeDirectoryInfo = async (directoryInfo: DirectoryInfo) => {
-  await filesystem.directory.put(directoryInfo);
+  await filesystem.directory.upsert(directoryInfo.id, { ...directoryInfo });
 };

 export const deleteDirectoryInfo = async (id: number) => {
   await filesystem.directory.delete(id);
 };

+export const deleteDanglingDirectoryInfos = async (
+  parentId: DirectoryId,
+  validIds: Set<number>,
+) => {
+  await filesystem.directory
+    .where({ parentId })
+    .and((directory) => !validIds.has(directory.id))
+    .delete();
+};
+
 export const getAllFileInfos = async () => {
   return await filesystem.file.toArray();
 };
@@ -78,14 +84,29 @@ export const getFileInfo = async (id: number) => {
   return await filesystem.file.get(id);
 };

+export const bulkGetFileInfos = async (ids: number[]) => {
+  return await filesystem.file.bulkGet(ids);
+};
+
 export const storeFileInfo = async (fileInfo: FileInfo) => {
-  await filesystem.file.put(fileInfo);
+  await filesystem.file.upsert(fileInfo.id, { ...fileInfo });
 };

 export const deleteFileInfo = async (id: number) => {
   await filesystem.file.delete(id);
 };

+export const bulkDeleteFileInfos = async (ids: number[]) => {
+  await filesystem.file.bulkDelete(ids);
+};
+
+export const deleteDanglingFileInfos = async (parentId: DirectoryId, validIds: Set<number>) => {
+  await filesystem.file
+    .where({ parentId })
+    .and((file) => !validIds.has(file.id))
+    .delete();
+};
+
 export const getCategoryInfos = async (parentId: CategoryId) => {
   return await filesystem.category.where({ parentId }).toArray();
 };
@@ -95,7 +116,7 @@ export const getCategoryInfo = async (id: number) => {
 };

 export const storeCategoryInfo = async (categoryInfo: CategoryInfo) => {
-  await filesystem.category.put(categoryInfo);
+  await filesystem.category.upsert(categoryInfo.id, { ...categoryInfo });
 };

 export const updateCategoryInfo = async (id: number, changes: { isFileRecursive?: boolean }) => {
@@ -106,6 +127,13 @@ export const deleteCategoryInfo = async (id: number) => {
   await filesystem.category.delete(id);
 };

+export const deleteDanglingCategoryInfos = async (parentId: CategoryId, validIds: Set<number>) => {
+  await filesystem.category
+    .where({ parentId })
+    .and((category) => !validIds.has(category.id))
+    .delete();
+};
+
 export const cleanupDanglingInfos = async () => {
   const validDirectoryIds: number[] = [];
   const validFileIds: number[] = [];
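The `deleteDangling*Infos` helpers above all apply the same rule: among the cached entries under a given parent, anything whose id the server no longer reports is stale and gets deleted. A store-agnostic sketch of that rule (not from the source; the Dexie-backed versions above are the real implementation):

```typescript
// Illustrative model of a cached filesystem entry; the real entries carry
// more fields (name, dataKey, timestamps, ...).
interface CachedEntry {
  id: number;
  parentId: number | "root";
}

// Given the cached entries under one parent and the set of ids the server
// says are valid, return the ids that should be removed from the cache.
export const findDanglingIds = (entries: CachedEntry[], validIds: Set<number>): number[] =>
  entries.filter(({ id }) => !validIds.has(id)).map(({ id }) => id);
```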


@@ -70,12 +70,12 @@ export const storeMasterKeys = async (keys: MasterKey[]) => {
 };

 export const getHmacSecrets = async () => {
-  return await keyStore.hmacSecret.toArray();
+  return (await keyStore.hmacSecret.toArray()).filter(({ secret }) => secret.extractable);
 };

 export const storeHmacSecrets = async (secrets: HmacSecret[]) => {
-  if (secrets.some(({ secret }) => secret.extractable)) {
-    throw new Error("Hmac secrets must be nonextractable");
+  if (secrets.some(({ secret }) => !secret.extractable)) {
+    throw new Error("Hmac secrets must be extractable");
   }
   await keyStore.hmacSecret.bulkPut(secrets);
 };


@@ -1,8 +1,15 @@
-import { encodeString, decodeString, encodeToBase64, decodeFromBase64 } from "./util";
+import { AES_GCM_IV_SIZE } from "$lib/constants";
+import {
+  encodeString,
+  decodeString,
+  encodeToBase64,
+  decodeFromBase64,
+  concatenateBuffers,
+} from "./utils";

 export const generateMasterKey = async () => {
   return {
-    masterKey: await window.crypto.subtle.generateKey(
+    masterKey: await crypto.subtle.generateKey(
       {
         name: "AES-KW",
         length: 256,
@@ -15,7 +22,7 @@ export const generateMasterKey = async () => {

 export const generateDataKey = async () => {
   return {
-    dataKey: await window.crypto.subtle.generateKey(
+    dataKey: await crypto.subtle.generateKey(
       {
         name: "AES-GCM",
         length: 256,
@@ -28,9 +35,9 @@ export const generateDataKey = async () => {
 };

 export const makeAESKeyNonextractable = async (key: CryptoKey) => {
-  return await window.crypto.subtle.importKey(
+  return await crypto.subtle.importKey(
     "raw",
-    await window.crypto.subtle.exportKey("raw", key),
+    await crypto.subtle.exportKey("raw", key),
     key.algorithm,
     false,
     key.usages,
@@ -38,12 +45,12 @@ export const makeAESKeyNonextractable = async (key: CryptoKey) => {
 };

 export const wrapDataKey = async (dataKey: CryptoKey, masterKey: CryptoKey) => {
-  return encodeToBase64(await window.crypto.subtle.wrapKey("raw", dataKey, masterKey, "AES-KW"));
+  return encodeToBase64(await crypto.subtle.wrapKey("raw", dataKey, masterKey, "AES-KW"));
 };

 export const unwrapDataKey = async (dataKeyWrapped: string, masterKey: CryptoKey) => {
   return {
-    dataKey: await window.crypto.subtle.unwrapKey(
+    dataKey: await crypto.subtle.unwrapKey(
       "raw",
       decodeFromBase64(dataKeyWrapped),
       masterKey,
@@ -56,12 +63,12 @@ export const unwrapDataKey = async (dataKeyWrapped: string, masterKey: CryptoKey
 };

 export const wrapHmacSecret = async (hmacSecret: CryptoKey, masterKey: CryptoKey) => {
-  return encodeToBase64(await window.crypto.subtle.wrapKey("raw", hmacSecret, masterKey, "AES-KW"));
+  return encodeToBase64(await crypto.subtle.wrapKey("raw", hmacSecret, masterKey, "AES-KW"));
 };

 export const unwrapHmacSecret = async (hmacSecretWrapped: string, masterKey: CryptoKey) => {
   return {
-    hmacSecret: await window.crypto.subtle.unwrapKey(
+    hmacSecret: await crypto.subtle.unwrapKey(
       "raw",
       decodeFromBase64(hmacSecretWrapped),
       masterKey,
@@ -70,15 +77,15 @@ export const unwrapHmacSecret = async (hmacSecretWrapped: string, masterKey: Cry
         name: "HMAC",
         hash: "SHA-256",
       } satisfies HmacImportParams,
-      false, // Nonextractable
+      true, // Extractable
       ["sign", "verify"],
     ),
   };
 };

 export const encryptData = async (data: BufferSource, dataKey: CryptoKey) => {
-  const iv = window.crypto.getRandomValues(new Uint8Array(12));
-  const ciphertext = await window.crypto.subtle.encrypt(
+  const iv = crypto.getRandomValues(new Uint8Array(12));
+  const ciphertext = await crypto.subtle.encrypt(
     {
       name: "AES-GCM",
       iv,
@@ -86,14 +93,18 @@ export const encryptData = async (data: BufferSource, dataKey: CryptoKey) => {
     dataKey,
     data,
   );

-  return { ciphertext, iv: encodeToBase64(iv.buffer) };
+  return { ciphertext, iv: iv.buffer };
 };

-export const decryptData = async (ciphertext: BufferSource, iv: string, dataKey: CryptoKey) => {
-  return await window.crypto.subtle.decrypt(
+export const decryptData = async (
+  ciphertext: BufferSource,
+  iv: string | BufferSource,
+  dataKey: CryptoKey,
+) => {
+  return await crypto.subtle.decrypt(
{
name: "AES-GCM",
iv: decodeFromBase64(iv),
iv: typeof iv === "string" ? decodeFromBase64(iv) : iv,
} satisfies AesGcmParams,
dataKey,
ciphertext,
@@ -102,9 +113,22 @@ export const decryptData = async (ciphertext: BufferSource, iv: string, dataKey:
export const encryptString = async (plaintext: string, dataKey: CryptoKey) => {
const { ciphertext, iv } = await encryptData(encodeString(plaintext), dataKey);
return { ciphertext: encodeToBase64(ciphertext), iv };
return { ciphertext: encodeToBase64(ciphertext), iv: encodeToBase64(iv) };
};
export const decryptString = async (ciphertext: string, iv: string, dataKey: CryptoKey) => {
return decodeString(await decryptData(decodeFromBase64(ciphertext), iv, dataKey));
};
export const encryptChunk = async (chunk: ArrayBuffer, dataKey: CryptoKey) => {
const { ciphertext, iv } = await encryptData(chunk, dataKey);
return concatenateBuffers(iv, ciphertext).buffer;
};
export const decryptChunk = async (encryptedChunk: ArrayBuffer, dataKey: CryptoKey) => {
return await decryptData(
encryptedChunk.slice(AES_GCM_IV_SIZE),
encryptedChunk.slice(0, AES_GCM_IV_SIZE),
dataKey,
);
};
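The new `encryptChunk`/`decryptChunk` pair frames every chunk as the 12-byte AES-GCM IV followed by the ciphertext, with `AES_GCM_IV_SIZE` marking the split point on the way back. A minimal sketch of just the framing, with `concatenateBuffers` reimplemented here as an assumption about what `./utils` exports:

```typescript
// Sketch of the IV-prefix framing used by encryptChunk/decryptChunk.
// AES_GCM_IV_SIZE is 12 bytes for AES-GCM; concatenateBuffers is assumed
// to behave like the helper exported from ./utils.
const AES_GCM_IV_SIZE = 12;

const concatenateBuffers = (...buffers: ArrayBuffer[]) => {
  const result = new Uint8Array(buffers.reduce((sum, b) => sum + b.byteLength, 0));
  let offset = 0;
  for (const b of buffers) {
    result.set(new Uint8Array(b), offset);
    offset += b.byteLength;
  }
  return result;
};

// Frame: [ 12-byte IV | ciphertext ]
const frameChunk = (iv: ArrayBuffer, ciphertext: ArrayBuffer) =>
  concatenateBuffers(iv, ciphertext).buffer;

// Inverse: split an encrypted chunk back into its IV and ciphertext.
const splitChunk = (encryptedChunk: ArrayBuffer) => ({
  iv: encryptedChunk.slice(0, AES_GCM_IV_SIZE),
  ciphertext: encryptedChunk.slice(AES_GCM_IV_SIZE),
});
```

Because the IV travels inside the chunk itself, the server no longer needs to store a separate base64 IV per file, which is what lets `encryptData` return a raw `ArrayBuffer` IV here.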

View File

@@ -1,4 +1,4 @@
export * from "./aes";
export * from "./rsa";
export * from "./sha";
export * from "./util";
export * from "./utils";

View File

@@ -1,7 +1,7 @@
import { encodeString, encodeToBase64, decodeFromBase64 } from "./util";
import { encodeString, encodeToBase64, decodeFromBase64 } from "./utils";
export const generateEncryptionKeyPair = async () => {
const keyPair = await window.crypto.subtle.generateKey(
const keyPair = await crypto.subtle.generateKey(
{
name: "RSA-OAEP",
modulusLength: 4096,
@@ -18,7 +18,7 @@ export const generateEncryptionKeyPair = async () => {
};
export const generateSigningKeyPair = async () => {
const keyPair = await window.crypto.subtle.generateKey(
const keyPair = await crypto.subtle.generateKey(
{
name: "RSA-PSS",
modulusLength: 4096,
@@ -37,7 +37,7 @@ export const generateSigningKeyPair = async () => {
export const exportRSAKey = async (key: CryptoKey) => {
const format = key.type === "public" ? ("spki" as const) : ("pkcs8" as const);
return {
key: await window.crypto.subtle.exportKey(format, key),
key: await crypto.subtle.exportKey(format, key),
format,
};
};
@@ -54,14 +54,14 @@ export const importEncryptionKeyPairFromBase64 = async (
name: "RSA-OAEP",
hash: "SHA-256",
};
const encryptKey = await window.crypto.subtle.importKey(
const encryptKey = await crypto.subtle.importKey(
"spki",
decodeFromBase64(encryptKeyBase64),
algorithm,
true,
["encrypt", "wrapKey"],
);
const decryptKey = await window.crypto.subtle.importKey(
const decryptKey = await crypto.subtle.importKey(
"pkcs8",
decodeFromBase64(decryptKeyBase64),
algorithm,
@@ -79,14 +79,14 @@ export const importSigningKeyPairFromBase64 = async (
name: "RSA-PSS",
hash: "SHA-256",
};
const signKey = await window.crypto.subtle.importKey(
const signKey = await crypto.subtle.importKey(
"pkcs8",
decodeFromBase64(signKeyBase64),
algorithm,
true,
["sign"],
);
const verifyKey = await window.crypto.subtle.importKey(
const verifyKey = await crypto.subtle.importKey(
"spki",
decodeFromBase64(verifyKeyBase64),
algorithm,
@@ -98,17 +98,11 @@ export const importSigningKeyPairFromBase64 = async (
export const makeRSAKeyNonextractable = async (key: CryptoKey) => {
const { key: exportedKey, format } = await exportRSAKey(key);
return await window.crypto.subtle.importKey(
format,
exportedKey,
key.algorithm,
false,
key.usages,
);
return await crypto.subtle.importKey(format, exportedKey, key.algorithm, false, key.usages);
};
export const decryptChallenge = async (challenge: string, decryptKey: CryptoKey) => {
return await window.crypto.subtle.decrypt(
return await crypto.subtle.decrypt(
{
name: "RSA-OAEP",
} satisfies RsaOaepParams,
@@ -119,7 +113,7 @@ export const decryptChallenge = async (challenge: string, decryptKey: CryptoKey)
export const wrapMasterKey = async (masterKey: CryptoKey, encryptKey: CryptoKey) => {
return encodeToBase64(
await window.crypto.subtle.wrapKey("raw", masterKey, encryptKey, {
await crypto.subtle.wrapKey("raw", masterKey, encryptKey, {
name: "RSA-OAEP",
} satisfies RsaOaepParams),
);
@@ -131,7 +125,7 @@ export const unwrapMasterKey = async (
extractable = false,
) => {
return {
masterKey: await window.crypto.subtle.unwrapKey(
masterKey: await crypto.subtle.unwrapKey(
"raw",
decodeFromBase64(masterKeyWrapped),
decryptKey,
@@ -146,7 +140,7 @@ export const unwrapMasterKey = async (
};
export const signMessageRSA = async (message: BufferSource, signKey: CryptoKey) => {
return await window.crypto.subtle.sign(
return await crypto.subtle.sign(
{
name: "RSA-PSS",
saltLength: 32, // SHA-256
@@ -161,7 +155,7 @@ export const verifySignatureRSA = async (
signature: BufferSource,
verifyKey: CryptoKey,
) => {
return await window.crypto.subtle.verify(
return await crypto.subtle.verify(
{
name: "RSA-PSS",
saltLength: 32, // SHA-256

View File

@@ -1,10 +1,13 @@
import { hmac } from "@noble/hashes/hmac.js";
import { sha256 } from "@noble/hashes/sha2.js";
export const digestMessage = async (message: BufferSource) => {
return await window.crypto.subtle.digest("SHA-256", message);
return await crypto.subtle.digest("SHA-256", message);
};
export const generateHmacSecret = async () => {
return {
hmacSecret: await window.crypto.subtle.generateKey(
hmacSecret: await crypto.subtle.generateKey(
{
name: "HMAC",
hash: "SHA-256",
@@ -15,6 +18,10 @@ export const generateHmacSecret = async () => {
};
};
export const signMessageHmac = async (message: BufferSource, hmacSecret: CryptoKey) => {
return await window.crypto.subtle.sign("HMAC", hmacSecret, message);
export const createHmacStream = async (hmacSecret: CryptoKey) => {
const h = hmac.create(sha256, new Uint8Array(await crypto.subtle.exportKey("raw", hmacSecret)));
return {
update: (data: Uint8Array) => h.update(data),
digest: () => h.digest(),
};
};
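`createHmacStream` replaces the one-shot `signMessageHmac` so the upload path can feed a file through chunk by chunk; an incremental HMAC updated per chunk yields the same digest as hashing the whole buffer at once. The project reaches for `@noble/hashes` because Web Crypto's HMAC has no streaming API; a sketch of the same idea using Node's built-in `createHmac` instead (an assumption, not the code above):

```typescript
import { createHmac } from "node:crypto";

// Incremental HMAC: update() once per chunk, digest() at the end.
// Equivalent to a single HMAC over the concatenation of all chunks.
const createHmacStream = (secret: Uint8Array) => {
  const h = createHmac("sha256", secret);
  return {
    update: (data: Uint8Array) => h.update(data),
    digest: () => new Uint8Array(h.digest()),
  };
};
```

Since each `update` only touches one chunk, the whole file never has to be resident in memory, which is the point of moving the duplicate scan off `file.arrayBuffer()` and onto `file.stream()`.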

View File

@@ -9,8 +9,8 @@ export const decodeString = (data: ArrayBuffer) => {
return textDecoder.decode(data);
};
export const encodeToBase64 = (data: ArrayBuffer) => {
return btoa(String.fromCharCode(...new Uint8Array(data)));
export const encodeToBase64 = (data: ArrayBuffer | Uint8Array) => {
return btoa(String.fromCharCode(...(data instanceof ArrayBuffer ? new Uint8Array(data) : data)));
};
export const decodeFromBase64 = (data: string) => {

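`encodeToBase64` is widened to also accept a `Uint8Array`, so the HMAC stream's digest can be encoded without an extra copy. A sketch of the widened pair; the `decodeFromBase64` body is hypothetical, since the diff truncates it at the signature. Note that spreading into `String.fromCharCode` is fine for small payloads like digests and wrapped keys, but would exceed the argument-count limit on multi-megabyte buffers:

```typescript
// Accept either an ArrayBuffer or a Uint8Array view.
const encodeToBase64 = (data: ArrayBuffer | Uint8Array) => {
  const bytes = data instanceof ArrayBuffer ? new Uint8Array(data) : data;
  return btoa(String.fromCharCode(...bytes));
};

// Hypothetical inverse, not shown in the diff.
const decodeFromBase64 = (data: string) => {
  return Uint8Array.from(atob(data), (c) => c.charCodeAt(0)).buffer;
};
```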
View File

@@ -1,15 +1,12 @@
import { LRUCache } from "lru-cache";
import {
getFileCacheIndex as getFileCacheIndexFromIndexedDB,
storeFileCacheIndex,
deleteFileCacheIndex,
type FileCacheIndex,
} from "$lib/indexedDB";
import { readFile, writeFile, deleteFile, deleteDirectory } from "$lib/modules/opfs";
import { getThumbnailUrl } from "$lib/modules/thumbnail";
import { readFile, writeFile, deleteFile } from "$lib/modules/opfs";
const fileCacheIndex = new Map<number, FileCacheIndex>();
const loadedThumbnails = new LRUCache<number, string>({ max: 100 });
export const prepareFileCache = async () => {
for (const cache of await getFileCacheIndexFromIndexedDB()) {
@@ -51,30 +48,3 @@ export const deleteFileCache = async (fileId: number) => {
await deleteFile(`/cache/${fileId}`);
await deleteFileCacheIndex(fileId);
};
export const getFileThumbnailCache = async (fileId: number) => {
const thumbnail = loadedThumbnails.get(fileId);
if (thumbnail) return thumbnail;
const thumbnailBuffer = await readFile(`/thumbnail/file/${fileId}`);
if (!thumbnailBuffer) return null;
const thumbnailUrl = getThumbnailUrl(thumbnailBuffer);
loadedThumbnails.set(fileId, thumbnailUrl);
return thumbnailUrl;
};
export const storeFileThumbnailCache = async (fileId: number, thumbnailBuffer: ArrayBuffer) => {
await writeFile(`/thumbnail/file/${fileId}`, thumbnailBuffer);
loadedThumbnails.set(fileId, getThumbnailUrl(thumbnailBuffer));
};
export const deleteFileThumbnailCache = async (fileId: number) => {
loadedThumbnails.delete(fileId);
await deleteFile(`/thumbnail/file/${fileId}`);
};
export const deleteAllFileThumbnailCaches = async () => {
loadedThumbnails.clear();
await deleteDirectory("/thumbnail/file");
};

View File

@@ -0,0 +1,110 @@
import axios from "axios";
import { limitFunction } from "p-limit";
import { CHUNK_SIZE, ENCRYPTION_OVERHEAD } from "$lib/constants";
import { decryptChunk, concatenateBuffers } from "$lib/modules/crypto";
export interface FileDownloadState {
id: number;
status:
| "download-pending"
| "downloading"
| "decryption-pending"
| "decrypting"
| "decrypted"
| "canceled"
| "error";
progress?: number;
rate?: number;
estimated?: number;
result?: ArrayBuffer;
}
type LiveFileDownloadState = FileDownloadState & {
status: "download-pending" | "downloading" | "decryption-pending" | "decrypting";
};
let downloadingFiles: FileDownloadState[] = $state([]);
export const isFileDownloading = (
status: FileDownloadState["status"],
): status is LiveFileDownloadState["status"] =>
["download-pending", "downloading", "decryption-pending", "decrypting"].includes(status);
export const getFileDownloadState = (fileId: number) => {
return downloadingFiles.find((file) => file.id === fileId && isFileDownloading(file.status));
};
export const getDownloadingFiles = () => {
return downloadingFiles.filter((file) => isFileDownloading(file.status));
};
export const clearDownloadedFiles = () => {
downloadingFiles = downloadingFiles.filter((file) => isFileDownloading(file.status));
};
const requestFileDownload = limitFunction(
async (state: FileDownloadState, id: number) => {
state.status = "downloading";
const res = await axios.get(`/api/file/${id}/download`, {
responseType: "arraybuffer",
onDownloadProgress: ({ progress, rate, estimated }) => {
state.progress = progress;
state.rate = rate;
state.estimated = estimated;
},
});
const fileEncrypted: ArrayBuffer = res.data;
state.status = "decryption-pending";
return fileEncrypted;
},
{ concurrency: 1 },
);
const decryptFile = limitFunction(
async (
state: FileDownloadState,
fileEncrypted: ArrayBuffer,
encryptedChunkSize: number,
dataKey: CryptoKey,
) => {
state.status = "decrypting";
const chunks: ArrayBuffer[] = [];
let offset = 0;
while (offset < fileEncrypted.byteLength) {
const nextOffset = Math.min(offset + encryptedChunkSize, fileEncrypted.byteLength);
chunks.push(await decryptChunk(fileEncrypted.slice(offset, nextOffset), dataKey));
offset = nextOffset;
}
const fileBuffer = concatenateBuffers(...chunks).buffer;
state.status = "decrypted";
state.result = fileBuffer;
return fileBuffer;
},
{ concurrency: 4 },
);
export const downloadFile = async (id: number, dataKey: CryptoKey, isLegacy: boolean) => {
downloadingFiles.push({
id,
status: "download-pending",
});
const state = downloadingFiles.at(-1)!;
try {
const fileEncrypted = await requestFileDownload(state, id);
return await decryptFile(
state,
fileEncrypted,
isLegacy ? fileEncrypted.byteLength : CHUNK_SIZE + ENCRYPTION_OVERHEAD,
dataKey,
);
} catch (e) {
state.status = "error";
throw e;
}
};
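On the download side, every encrypted chunk is exactly `CHUNK_SIZE + ENCRYPTION_OVERHEAD` bytes except possibly the last, so `decryptFile` can walk fixed offsets; legacy files, encrypted in one piece, are handled by passing the whole byte length as the chunk size. A sketch of the boundary arithmetic, with the constant values as assumptions (the diff doesn't show `$lib/constants`; a 12-byte IV plus a 16-byte GCM tag would give 28 bytes of overhead):

```typescript
// Assumed values for $lib/constants — not shown in the diff.
const CHUNK_SIZE = 4 * 1024 * 1024; // plaintext bytes per chunk
const ENCRYPTION_OVERHEAD = 12 + 16; // AES-GCM IV + auth tag

// The [offset, nextOffset) slices the decrypt loop in decryptFile visits.
const chunkBoundaries = (encryptedByteLength: number, encryptedChunkSize: number) => {
  const boundaries: [number, number][] = [];
  let offset = 0;
  while (offset < encryptedByteLength) {
    const nextOffset = Math.min(offset + encryptedChunkSize, encryptedByteLength);
    boundaries.push([offset, nextOffset]);
    offset = nextOffset;
  }
  return boundaries;
};
```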

View File

@@ -1,84 +0,0 @@
import axios from "axios";
import { limitFunction } from "p-limit";
import { writable, type Writable } from "svelte/store";
import { decryptData } from "$lib/modules/crypto";
import { fileDownloadStatusStore, type FileDownloadStatus } from "$lib/stores";
const requestFileDownload = limitFunction(
async (status: Writable<FileDownloadStatus>, id: number) => {
status.update((value) => {
value.status = "downloading";
return value;
});
const res = await axios.get(`/api/file/${id}/download`, {
responseType: "arraybuffer",
onDownloadProgress: ({ progress, rate, estimated }) => {
status.update((value) => {
value.progress = progress;
value.rate = rate;
value.estimated = estimated;
return value;
});
},
});
const fileEncrypted: ArrayBuffer = res.data;
status.update((value) => {
value.status = "decryption-pending";
return value;
});
return fileEncrypted;
},
{ concurrency: 1 },
);
const decryptFile = limitFunction(
async (
status: Writable<FileDownloadStatus>,
fileEncrypted: ArrayBuffer,
fileEncryptedIv: string,
dataKey: CryptoKey,
) => {
status.update((value) => {
value.status = "decrypting";
return value;
});
const fileBuffer = await decryptData(fileEncrypted, fileEncryptedIv, dataKey);
status.update((value) => {
value.status = "decrypted";
value.result = fileBuffer;
return value;
});
return fileBuffer;
},
{ concurrency: 4 },
);
export const downloadFile = async (id: number, fileEncryptedIv: string, dataKey: CryptoKey) => {
const status = writable<FileDownloadStatus>({
id,
status: "download-pending",
});
fileDownloadStatusStore.update((value) => {
value.push(status);
return value;
});
try {
return await decryptFile(
status,
await requestFileDownload(status, id),
fileEncryptedIv,
dataKey,
);
} catch (e) {
status.update((value) => {
value.status = "error";
return value;
});
throw e;
}
};

View File

@@ -1,3 +1,4 @@
export * from "./cache";
export * from "./download";
export * from "./upload";
export * from "./download.svelte";
export * from "./thumbnail";
export * from "./upload.svelte";

View File

@@ -0,0 +1,82 @@
import { LRUCache } from "lru-cache";
import { writable, type Writable } from "svelte/store";
import { browser } from "$app/environment";
import { decryptData } from "$lib/modules/crypto";
import type { SummarizedFileInfo } from "$lib/modules/filesystem";
import { readFile, writeFile, deleteFile, deleteDirectory } from "$lib/modules/opfs";
import { getThumbnailUrl } from "$lib/modules/thumbnail";
const loadedThumbnails = new LRUCache<number, Writable<string>>({ max: 100 });
const loadingThumbnails = new Map<number, Writable<string | undefined>>();
const fetchFromOpfs = async (fileId: number) => {
const thumbnailBuffer = await readFile(`/thumbnail/file/${fileId}`);
if (thumbnailBuffer) {
return getThumbnailUrl(thumbnailBuffer);
}
};
const fetchFromServer = async (fileId: number, dataKey: CryptoKey) => {
const res = await fetch(`/api/file/${fileId}/thumbnail/download`);
if (!res.ok) return null;
const thumbnailEncrypted = await res.arrayBuffer();
const thumbnailBuffer = await decryptData(
thumbnailEncrypted.slice(12),
thumbnailEncrypted.slice(0, 12),
dataKey,
);
void writeFile(`/thumbnail/file/${fileId}`, thumbnailBuffer);
return getThumbnailUrl(thumbnailBuffer);
};
export const getFileThumbnail = (file: SummarizedFileInfo) => {
if (
!browser ||
!(file.contentType.startsWith("image/") || file.contentType.startsWith("video/"))
) {
return undefined;
}
const thumbnail = loadedThumbnails.get(file.id);
if (thumbnail) return thumbnail;
let loadingThumbnail = loadingThumbnails.get(file.id);
if (loadingThumbnail) return loadingThumbnail;
loadingThumbnail = writable(undefined);
loadingThumbnails.set(file.id, loadingThumbnail);
fetchFromOpfs(file.id)
.then((thumbnail) => thumbnail ?? (file.dataKey && fetchFromServer(file.id, file.dataKey.key)))
.then((thumbnail) => {
if (thumbnail) {
loadingThumbnail.set(thumbnail);
loadedThumbnails.set(file.id, loadingThumbnail as Writable<string>);
}
loadingThumbnails.delete(file.id);
});
return loadingThumbnail;
};
export const storeFileThumbnailCache = async (fileId: number, thumbnailBuffer: ArrayBuffer) => {
await writeFile(`/thumbnail/file/${fileId}`, thumbnailBuffer);
const oldThumbnail = loadedThumbnails.get(fileId);
if (oldThumbnail) {
oldThumbnail.set(getThumbnailUrl(thumbnailBuffer));
} else {
loadedThumbnails.set(fileId, writable(getThumbnailUrl(thumbnailBuffer)));
}
};
export const deleteFileThumbnailCache = async (fileId: number) => {
loadedThumbnails.delete(fileId);
await deleteFile(`/thumbnail/file/${fileId}`);
};
export const deleteAllFileThumbnailCaches = async () => {
loadedThumbnails.clear();
await deleteDirectory(`/thumbnail/file`);
};
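`getFileThumbnail` keeps two maps so that concurrent callers for the same file share one fetch: `loadedThumbnails` holds finished results and `loadingThumbnails` holds requests still in flight, which is how the earlier "network request fires multiple times" bug was avoided. Stripped of Svelte stores and OPFS, the core deduplication pattern might look like this sketch (promise-based rather than store-based, as an illustration):

```typescript
// Deduplicate concurrent loads: a second caller for the same key gets
// the promise already in flight instead of starting a new fetch.
const loaded = new Map<number, string>();
const loading = new Map<number, Promise<string>>();

const getThumbnail = (id: number, fetcher: (id: number) => Promise<string>) => {
  const done = loaded.get(id);
  if (done !== undefined) return Promise.resolve(done);
  const inFlight = loading.get(id);
  if (inFlight) return inFlight;
  const p = fetcher(id).then((url) => {
    loaded.set(id, url); // promote to the finished cache
    loading.delete(id);
    return url;
  });
  loading.set(id, p);
  return p;
};
```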

View File

@@ -0,0 +1,480 @@
import ExifReader from "exifreader";
import pLimit, { limitFunction } from "p-limit";
import { CHUNK_SIZE } from "$lib/constants";
import {
encodeToBase64,
generateDataKey,
wrapDataKey,
encryptData,
encryptString,
encryptChunk,
digestMessage,
createHmacStream,
} from "$lib/modules/crypto";
import { Scheduler } from "$lib/modules/scheduler";
import { generateThumbnail, generateThumbnailFromFile } from "$lib/modules/thumbnail";
import type { MasterKey, HmacSecret } from "$lib/stores";
import { trpc } from "$trpc/client";
import type { RouterInputs } from "$trpc/router.server";
export interface FileUploadState {
name: string;
parentId: DirectoryId;
status:
| "queued"
| "encryption-pending"
| "encrypting"
| "upload-pending"
| "uploading"
| "uploaded"
| "canceled"
| "error";
progress?: number;
rate?: number;
estimated?: number;
}
export type LiveFileUploadState = FileUploadState & {
status: "queued" | "encryption-pending" | "encrypting" | "upload-pending" | "uploading";
};
const scheduler = new Scheduler<
{ fileId: number; fileBuffer?: ArrayBuffer; thumbnailBuffer?: ArrayBuffer } | undefined
>();
let uploadingFiles: FileUploadState[] = $state([]);
const isFileUploading = (status: FileUploadState["status"]) =>
["queued", "encryption-pending", "encrypting", "upload-pending", "uploading"].includes(status);
export const getUploadingFiles = (parentId?: DirectoryId) => {
return uploadingFiles.filter(
(file) =>
(parentId === undefined || file.parentId === parentId) && isFileUploading(file.status),
);
};
export const clearUploadedFiles = () => {
uploadingFiles = uploadingFiles.filter((file) => isFileUploading(file.status));
};
const requestDuplicateFileScan = limitFunction(
async (file: File, hmacSecret: HmacSecret, onDuplicate: () => Promise<boolean>) => {
const hmacStream = await createHmacStream(hmacSecret.secret);
const reader = file.stream().getReader();
while (true) {
const { done, value } = await reader.read();
if (done) break;
hmacStream.update(value);
}
const fileSigned = encodeToBase64(hmacStream.digest());
const files = await trpc().file.listByHash.query({
hskVersion: hmacSecret.version,
contentHmac: fileSigned,
});
if (files.length === 0 || (await onDuplicate())) {
return { fileSigned };
} else {
return {};
}
},
{ concurrency: 1 },
);
const getFileType = (file: File) => {
if (file.type) return file.type;
if (file.name.endsWith(".heic")) return "image/heic";
throw new Error("Unknown file type");
};
const extractExifDateTime = (fileBuffer: ArrayBuffer) => {
const exif = ExifReader.load(fileBuffer);
const dateTimeOriginal = exif["DateTimeOriginal"]?.description;
const offsetTimeOriginal = exif["OffsetTimeOriginal"]?.description;
if (!dateTimeOriginal) return undefined;
const [date, time] = dateTimeOriginal.split(" ");
if (!date || !time) return undefined;
const [year, month, day] = date.split(":").map(Number);
const [hour, minute, second] = time.split(":").map(Number);
if (!year || !month || !day || !hour || !minute || !second) return undefined;
if (!offsetTimeOriginal) {
// No timezone information. Assume local timezone.
return new Date(year, month - 1, day, hour, minute, second);
}
const offsetSign = offsetTimeOriginal[0] === "+" ? 1 : -1;
const [offsetHour, offsetMinute] = offsetTimeOriginal.slice(1).split(":").map(Number);
const utcDate = Date.UTC(year, month - 1, day, hour, minute, second);
const offsetMs = offsetSign * ((offsetHour ?? 0) * 60 + (offsetMinute ?? 0)) * 60 * 1000;
return new Date(utcDate - offsetMs);
};
const encryptChunks = async (fileBuffer: ArrayBuffer, dataKey: CryptoKey) => {
const chunksEncrypted: { chunkEncrypted: ArrayBuffer; chunkEncryptedHash: string }[] = [];
let offset = 0;
while (offset < fileBuffer.byteLength) {
const nextOffset = Math.min(offset + CHUNK_SIZE, fileBuffer.byteLength);
const chunkEncrypted = await encryptChunk(fileBuffer.slice(offset, nextOffset), dataKey);
chunksEncrypted.push({
chunkEncrypted: chunkEncrypted,
chunkEncryptedHash: encodeToBase64(await digestMessage(chunkEncrypted)),
});
offset = nextOffset;
}
return chunksEncrypted;
};
const encryptImageFile = limitFunction(
async (state: FileUploadState, file: File, masterKey: MasterKey) => {
state.status = "encrypting";
const fileBuffer = await file.arrayBuffer();
const createdAt = extractExifDateTime(fileBuffer);
const { dataKey, dataKeyVersion } = await generateDataKey();
const dataKeyWrapped = await wrapDataKey(dataKey, masterKey.key);
const chunksEncrypted = await encryptChunks(fileBuffer, dataKey);
const nameEncrypted = await encryptString(file.name, dataKey);
const createdAtEncrypted =
createdAt && (await encryptString(createdAt.getTime().toString(), dataKey));
const lastModifiedAtEncrypted = await encryptString(file.lastModified.toString(), dataKey);
const thumbnail = await generateThumbnail(fileBuffer, getFileType(file));
const thumbnailBuffer = await thumbnail?.arrayBuffer();
const thumbnailEncrypted = thumbnailBuffer && (await encryptData(thumbnailBuffer, dataKey));
state.status = "upload-pending";
return {
dataKeyWrapped,
dataKeyVersion,
chunksEncrypted,
nameEncrypted,
createdAtEncrypted,
lastModifiedAtEncrypted,
thumbnail: thumbnailEncrypted && { plaintext: thumbnailBuffer, ...thumbnailEncrypted },
};
},
{ concurrency: 4 },
);
const uploadThumbnail = async (
fileId: number,
thumbnailEncrypted: { ciphertext: ArrayBuffer; iv: ArrayBuffer },
dataKeyVersion: Date,
) => {
const { uploadId } = await trpc().upload.startFileThumbnailUpload.mutate({
file: fileId,
dekVersion: dataKeyVersion,
});
const ivAndCiphertext = new Uint8Array(
thumbnailEncrypted.iv.byteLength + thumbnailEncrypted.ciphertext.byteLength,
);
ivAndCiphertext.set(new Uint8Array(thumbnailEncrypted.iv), 0);
ivAndCiphertext.set(
new Uint8Array(thumbnailEncrypted.ciphertext),
thumbnailEncrypted.iv.byteLength,
);
const chunkHash = encodeToBase64(await digestMessage(ivAndCiphertext));
const response = await fetch(`/api/upload/${uploadId}/chunks/0`, {
method: "POST",
headers: {
"Content-Type": "application/octet-stream",
"Content-Digest": `sha-256=:${chunkHash}:`,
},
body: ivAndCiphertext,
});
if (!response.ok) {
throw new Error(`Thumbnail upload failed: ${response.status} ${response.statusText}`);
}
await trpc().upload.completeFileThumbnailUpload.mutate({ uploadId });
};
const requestImageFileUpload = limitFunction(
async (
state: FileUploadState,
metadata: RouterInputs["upload"]["startFileUpload"],
chunksEncrypted: { chunkEncrypted: ArrayBuffer; chunkEncryptedHash: string }[],
fileSigned: string | undefined,
thumbnailData: { ciphertext: ArrayBuffer; iv: ArrayBuffer; plaintext: ArrayBuffer } | null,
dataKeyVersion: Date,
) => {
state.status = "uploading";
const { uploadId } = await trpc().upload.startFileUpload.mutate(metadata);
const totalBytes = chunksEncrypted.reduce((sum, c) => sum + c.chunkEncrypted.byteLength, 0);
let uploadedBytes = 0;
const startTime = Date.now();
for (let i = 0; i < chunksEncrypted.length; i++) {
const { chunkEncrypted, chunkEncryptedHash } = chunksEncrypted[i]!;
const response = await fetch(`/api/upload/${uploadId}/chunks/${i}`, {
method: "POST",
headers: {
"Content-Type": "application/octet-stream",
"Content-Digest": `sha-256=:${chunkEncryptedHash}:`,
},
body: chunkEncrypted,
});
if (!response.ok) {
throw new Error(`Chunk upload failed: ${response.status} ${response.statusText}`);
}
uploadedBytes += chunkEncrypted.byteLength;
const elapsed = (Date.now() - startTime) / 1000;
const rate = uploadedBytes / elapsed;
const remaining = totalBytes - uploadedBytes;
const estimated = rate > 0 ? remaining / rate : undefined;
state.progress = uploadedBytes / totalBytes;
state.rate = rate;
state.estimated = estimated;
}
const { file: fileId } = await trpc().upload.completeFileUpload.mutate({
uploadId,
contentHmac: fileSigned,
});
if (thumbnailData) {
try {
await uploadThumbnail(fileId, thumbnailData, dataKeyVersion);
} catch (e) {
// TODO: Error handling for thumbnail upload
console.error(e);
}
}
state.status = "uploaded";
return { fileId, thumbnailBuffer: thumbnailData?.plaintext };
},
{ concurrency: 1 },
);
const requestFileUpload = async (
state: FileUploadState,
file: File,
masterKey: MasterKey,
hmacSecret: HmacSecret,
fileSigned: string,
parentId: DirectoryId,
) => {
state.status = "uploading";
const fileType = getFileType(file);
const { dataKey, dataKeyVersion } = await generateDataKey();
const dataKeyWrapped = await wrapDataKey(dataKey, masterKey.key);
const nameEncrypted = await encryptString(file.name, dataKey);
const lastModifiedAtEncrypted = await encryptString(file.lastModified.toString(), dataKey);
const totalChunks = Math.ceil(file.size / CHUNK_SIZE);
const metadata = {
chunks: totalChunks,
parent: parentId,
mekVersion: masterKey.version,
dek: dataKeyWrapped,
dekVersion: dataKeyVersion,
hskVersion: hmacSecret.version,
contentType: fileType,
name: nameEncrypted.ciphertext,
nameIv: nameEncrypted.iv,
lastModifiedAt: lastModifiedAtEncrypted.ciphertext,
lastModifiedAtIv: lastModifiedAtEncrypted.iv,
};
const { uploadId } = await trpc().upload.startFileUpload.mutate(metadata);
const reader = file.stream().getReader();
const limit = pLimit(4);
let buffer = new Uint8Array(0);
let chunkIndex = 0;
const uploadPromises: Promise<void>[] = [];
const totalBytes = file.size;
let uploadedBytes = 0;
const startTime = Date.now();
const uploadChunk = async (
index: number,
encryptedChunk: ArrayBuffer,
chunkHash: string,
originalChunkSize: number,
) => {
const response = await fetch(`/api/upload/${uploadId}/chunks/${index}`, {
method: "POST",
headers: {
"Content-Type": "application/octet-stream",
"Content-Digest": `sha-256=:${chunkHash}:`,
},
body: encryptedChunk,
});
if (!response.ok) {
throw new Error(`Chunk upload failed: ${response.status} ${response.statusText}`);
}
uploadedBytes += originalChunkSize;
const elapsed = (Date.now() - startTime) / 1000;
const rate = uploadedBytes / elapsed;
const remaining = totalBytes - uploadedBytes;
const estimated = rate > 0 ? remaining / rate : undefined;
state.progress = uploadedBytes / totalBytes;
state.rate = rate;
state.estimated = estimated;
};
while (true) {
const { done, value } = await reader.read();
if (done && buffer.length === 0) break;
if (value) {
const newBuffer = new Uint8Array(buffer.length + value.length);
newBuffer.set(buffer);
newBuffer.set(value, buffer.length);
buffer = newBuffer;
}
while (buffer.length >= CHUNK_SIZE || (done && buffer.length > 0)) {
const chunkSize = Math.min(CHUNK_SIZE, buffer.length);
const chunk = buffer.slice(0, chunkSize);
buffer = buffer.slice(chunkSize);
const encryptedChunk = await encryptChunk(chunk.buffer.slice(0, chunk.byteLength), dataKey);
const chunkHash = encodeToBase64(await digestMessage(encryptedChunk));
const currentIndex = chunkIndex++;
uploadPromises.push(
limit(() => uploadChunk(currentIndex, encryptedChunk, chunkHash, chunkSize)),
);
}
if (done) break;
}
await Promise.all(uploadPromises);
const { file: fileId } = await trpc().upload.completeFileUpload.mutate({
uploadId,
contentHmac: fileSigned,
});
if (fileType.startsWith("video/")) {
try {
const thumbnail = await generateThumbnailFromFile(file);
if (thumbnail) {
const thumbnailBuffer = await thumbnail.arrayBuffer();
const thumbnailEncrypted = await encryptData(thumbnailBuffer, dataKey);
await uploadThumbnail(fileId, thumbnailEncrypted, dataKeyVersion);
}
} catch (e) {
// Thumbnail upload failure is not critical
console.error(e);
}
}
state.status = "uploaded";
return { fileId };
};
export const uploadFile = async (
file: File,
parentId: "root" | number,
hmacSecret: HmacSecret,
masterKey: MasterKey,
onDuplicate: () => Promise<boolean>,
) => {
uploadingFiles.push({
name: file.name,
parentId,
status: "queued",
});
const state = uploadingFiles.at(-1)!;
return await scheduler.schedule(file.size, async () => {
state.status = "encryption-pending";
try {
const { fileSigned } = await requestDuplicateFileScan(file, hmacSecret, onDuplicate);
if (!fileSigned) {
state.status = "canceled";
uploadingFiles = uploadingFiles.filter((file) => file !== state);
return;
}
const fileType = getFileType(file);
if (fileType.startsWith("image/")) {
const fileBuffer = await file.arrayBuffer();
const {
dataKeyWrapped,
dataKeyVersion,
chunksEncrypted,
nameEncrypted,
createdAtEncrypted,
lastModifiedAtEncrypted,
thumbnail,
} = await encryptImageFile(state, file, masterKey);
const metadata = {
chunks: chunksEncrypted.length,
parent: parentId,
mekVersion: masterKey.version,
dek: dataKeyWrapped,
dekVersion: dataKeyVersion,
hskVersion: hmacSecret.version,
contentType: fileType,
name: nameEncrypted.ciphertext,
nameIv: nameEncrypted.iv,
createdAt: createdAtEncrypted?.ciphertext,
createdAtIv: createdAtEncrypted?.iv,
lastModifiedAt: lastModifiedAtEncrypted.ciphertext,
lastModifiedAtIv: lastModifiedAtEncrypted.iv,
};
const { fileId, thumbnailBuffer } = await requestImageFileUpload(
state,
metadata,
chunksEncrypted,
fileSigned,
thumbnail ?? null,
dataKeyVersion,
);
return { fileId, fileBuffer, thumbnailBuffer };
} else {
const { fileId } = await requestFileUpload(
state,
file,
masterKey,
hmacSecret,
fileSigned,
parentId,
);
return { fileId };
}
} catch (e) {
state.status = "error";
throw e;
}
});
};
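The reader loop in `requestFileUpload` above accumulates arbitrary-sized stream reads into a growing buffer, slices off a fixed `CHUNK_SIZE` chunk as soon as enough bytes are available, and flushes whatever remains when the stream ends. The buffering logic in isolation, as a sketch over an array of reads instead of a `ReadableStream`:

```typescript
// Accumulate arbitrary-sized reads; emit chunkSize-byte chunks plus a
// final partial chunk, mirroring the reader loop in requestFileUpload.
const chunkify = (reads: Uint8Array[], chunkSize: number) => {
  const chunks: Uint8Array[] = [];
  let buffer = new Uint8Array(0);
  for (const value of reads) {
    const newBuffer = new Uint8Array(buffer.length + value.length);
    newBuffer.set(buffer);
    newBuffer.set(value, buffer.length);
    buffer = newBuffer;
    while (buffer.length >= chunkSize) {
      chunks.push(buffer.slice(0, chunkSize));
      buffer = buffer.slice(chunkSize);
    }
  }
  if (buffer.length > 0) chunks.push(buffer); // flush the tail
  return chunks;
};
```

In the real code each emitted chunk is encrypted, hashed, and handed to a `pLimit(4)` pool before the loop reads more, so at most a few chunks are in memory at once regardless of file size.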

View File

@@ -1,267 +0,0 @@
import axios from "axios";
import ExifReader from "exifreader";
import { limitFunction } from "p-limit";
import { writable, type Writable } from "svelte/store";
import {
encodeToBase64,
generateDataKey,
wrapDataKey,
encryptData,
encryptString,
digestMessage,
signMessageHmac,
} from "$lib/modules/crypto";
import { generateThumbnail } from "$lib/modules/thumbnail";
import type {
DuplicateFileScanRequest,
DuplicateFileScanResponse,
FileThumbnailUploadRequest,
FileUploadRequest,
FileUploadResponse,
} from "$lib/server/schemas";
import {
fileUploadStatusStore,
type MasterKey,
type HmacSecret,
type FileUploadStatus,
} from "$lib/stores";
const requestDuplicateFileScan = limitFunction(
async (file: File, hmacSecret: HmacSecret, onDuplicate: () => Promise<boolean>) => {
const fileBuffer = await file.arrayBuffer();
const fileSigned = encodeToBase64(await signMessageHmac(fileBuffer, hmacSecret.secret));
const res = await axios.post("/api/file/scanDuplicates", {
hskVersion: hmacSecret.version,
contentHmac: fileSigned,
} satisfies DuplicateFileScanRequest);
const { files }: DuplicateFileScanResponse = res.data;
if (files.length === 0 || (await onDuplicate())) {
return { fileBuffer, fileSigned };
} else {
return {};
}
},
{ concurrency: 1 },
);
const getFileType = (file: File) => {
if (file.type) return file.type;
if (file.name.endsWith(".heic")) return "image/heic";
throw new Error("Unknown file type");
};
const extractExifDateTime = (fileBuffer: ArrayBuffer) => {
const exif = ExifReader.load(fileBuffer);
const dateTimeOriginal = exif["DateTimeOriginal"]?.description;
const offsetTimeOriginal = exif["OffsetTimeOriginal"]?.description;
if (!dateTimeOriginal) return undefined;
const [date, time] = dateTimeOriginal.split(" ");
if (!date || !time) return undefined;
const [year, month, day] = date.split(":").map(Number);
const [hour, minute, second] = time.split(":").map(Number);
if (!year || !month || !day || !hour || !minute || !second) return undefined;
if (!offsetTimeOriginal) {
// No timezone information; assume the local timezone
return new Date(year, month - 1, day, hour, minute, second);
}
const offsetSign = offsetTimeOriginal[0] === "+" ? 1 : -1;
const [offsetHour, offsetMinute] = offsetTimeOriginal.slice(1).split(":").map(Number);
const utcDate = Date.UTC(year, month - 1, day, hour, minute, second);
const offsetMs = offsetSign * ((offsetHour ?? 0) * 60 + (offsetMinute ?? 0)) * 60 * 1000;
return new Date(utcDate - offsetMs);
};
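The offset handling in `extractExifDateTime` can be isolated as a small sketch: interpret an EXIF-style local timestamp plus an `OffsetTimeOriginal` value as a UTC instant by subtracting the offset. `toUtcDate` is a hypothetical standalone helper for illustration, not part of the module above.

```typescript
// Sketch of the offset arithmetic above: a "+09:00" offset means the wall
// clock is 9 hours ahead of UTC, so subtracting the offset recovers UTC.
const toUtcDate = (dateTime: string, offset: string): Date => {
  const [date, time] = dateTime.split(" ");
  const [year, month, day] = date!.split(":").map(Number);
  const [hour, minute, second] = time!.split(":").map(Number);
  const sign = offset[0] === "+" ? 1 : -1;
  const [offsetHour, offsetMinute] = offset.slice(1).split(":").map(Number);
  const offsetMs = sign * ((offsetHour ?? 0) * 60 + (offsetMinute ?? 0)) * 60 * 1000;
  return new Date(Date.UTC(year!, month! - 1, day!, hour!, minute!, second!) - offsetMs);
};

console.log(toUtcDate("2026:01:11 16:01:02", "+09:00").toISOString());
// → 2026-01-11T07:01:02.000Z
```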
const encryptFile = limitFunction(
async (
status: Writable<FileUploadStatus>,
file: File,
fileBuffer: ArrayBuffer,
masterKey: MasterKey,
) => {
status.update((value) => {
value.status = "encrypting";
return value;
});
const fileType = getFileType(file);
let createdAt;
if (fileType.startsWith("image/")) {
createdAt = extractExifDateTime(fileBuffer);
}
const { dataKey, dataKeyVersion } = await generateDataKey();
const dataKeyWrapped = await wrapDataKey(dataKey, masterKey.key);
const fileEncrypted = await encryptData(fileBuffer, dataKey);
const fileEncryptedHash = encodeToBase64(await digestMessage(fileEncrypted.ciphertext));
const nameEncrypted = await encryptString(file.name, dataKey);
const createdAtEncrypted =
createdAt && (await encryptString(createdAt.getTime().toString(), dataKey));
const lastModifiedAtEncrypted = await encryptString(file.lastModified.toString(), dataKey);
const thumbnail = await generateThumbnail(fileBuffer, fileType);
const thumbnailBuffer = await thumbnail?.arrayBuffer();
const thumbnailEncrypted = thumbnailBuffer && (await encryptData(thumbnailBuffer, dataKey));
status.update((value) => {
value.status = "upload-pending";
return value;
});
return {
dataKeyWrapped,
dataKeyVersion,
fileType,
fileEncrypted,
fileEncryptedHash,
nameEncrypted,
createdAtEncrypted,
lastModifiedAtEncrypted,
thumbnail: thumbnailEncrypted && { plaintext: thumbnailBuffer, ...thumbnailEncrypted },
};
},
{ concurrency: 4 },
);
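`requestDuplicateFileScan` (`concurrency: 1`) and `encryptFile` (`concurrency: 4`) both rely on p-limit's `limitFunction` to cap how many calls run at once. The sketch below shows the idea with a minimal hand-rolled limiter; it is illustrative only, not p-limit's actual implementation.

```typescript
// Wrap an async function so that at most `concurrency` invocations run
// concurrently; excess callers wait in a FIFO queue.
const limitConcurrency = <A extends unknown[], R>(
  fn: (...args: A) => Promise<R>,
  concurrency: number,
) => {
  let active = 0;
  const queue: (() => void)[] = [];
  const acquire = () =>
    new Promise<void>((resolve) => {
      if (active < concurrency) {
        active++;
        resolve();
      } else {
        queue.push(() => {
          active++;
          resolve();
        });
      }
    });
  const release = () => {
    active--;
    queue.shift()?.(); // wake the next waiter, if any
  };
  return async (...args: A): Promise<R> => {
    await acquire();
    try {
      return await fn(...args);
    } finally {
      release();
    }
  };
};
```

With `concurrency: 4`, four files can be encrypted in parallel while further files queue up, which is how the module bounds memory use during bulk uploads.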
const requestFileUpload = limitFunction(
async (status: Writable<FileUploadStatus>, form: FormData, thumbnailForm: FormData | null) => {
status.update((value) => {
value.status = "uploading";
return value;
});
const res = await axios.post("/api/file/upload", form, {
onUploadProgress: ({ progress, rate, estimated }) => {
status.update((value) => {
value.progress = progress;
value.rate = rate;
value.estimated = estimated;
return value;
});
},
});
const { file }: FileUploadResponse = res.data;
if (thumbnailForm) {
try {
await axios.post(`/api/file/${file}/thumbnail/upload`, thumbnailForm);
} catch (e) {
// TODO
console.error(e);
}
}
status.update((value) => {
value.status = "uploaded";
return value;
});
return { fileId: file };
},
{ concurrency: 1 },
);
export const uploadFile = async (
file: File,
parentId: "root" | number,
hmacSecret: HmacSecret,
masterKey: MasterKey,
onDuplicate: () => Promise<boolean>,
): Promise<
{ fileId: number; fileBuffer: ArrayBuffer; thumbnailBuffer?: ArrayBuffer } | undefined
> => {
const status = writable<FileUploadStatus>({
name: file.name,
parentId,
status: "encryption-pending",
});
fileUploadStatusStore.update((value) => {
value.push(status);
return value;
});
try {
const { fileBuffer, fileSigned } = await requestDuplicateFileScan(
file,
hmacSecret,
onDuplicate,
);
if (!fileBuffer || !fileSigned) {
status.update((value) => {
value.status = "canceled";
return value;
});
fileUploadStatusStore.update((value) => {
value = value.filter((v) => v !== status);
return value;
});
return undefined;
}
const {
dataKeyWrapped,
dataKeyVersion,
fileType,
fileEncrypted,
fileEncryptedHash,
nameEncrypted,
createdAtEncrypted,
lastModifiedAtEncrypted,
thumbnail,
} = await encryptFile(status, file, fileBuffer, masterKey);
const form = new FormData();
form.set(
"metadata",
JSON.stringify({
parent: parentId,
mekVersion: masterKey.version,
dek: dataKeyWrapped,
dekVersion: dataKeyVersion.toISOString(),
hskVersion: hmacSecret.version,
contentHmac: fileSigned,
contentType: fileType,
contentIv: fileEncrypted.iv,
name: nameEncrypted.ciphertext,
nameIv: nameEncrypted.iv,
createdAt: createdAtEncrypted?.ciphertext,
createdAtIv: createdAtEncrypted?.iv,
lastModifiedAt: lastModifiedAtEncrypted.ciphertext,
lastModifiedAtIv: lastModifiedAtEncrypted.iv,
} satisfies FileUploadRequest),
);
form.set("content", new Blob([fileEncrypted.ciphertext]));
form.set("checksum", fileEncryptedHash);
let thumbnailForm = null;
if (thumbnail) {
thumbnailForm = new FormData();
thumbnailForm.set(
"metadata",
JSON.stringify({
dekVersion: dataKeyVersion.toISOString(),
contentIv: thumbnail.iv,
} satisfies FileThumbnailUploadRequest),
);
thumbnailForm.set("content", new Blob([thumbnail.ciphertext]));
}
const { fileId } = await requestFileUpload(status, form, thumbnailForm);
return { fileId, fileBuffer, thumbnailBuffer: thumbnail?.plaintext };
} catch (e) {
status.update((value) => {
value.status = "error";
return value;
});
throw e;
}
};


@@ -1,360 +0,0 @@
import { get, writable, type Writable } from "svelte/store";
import { callGetApi } from "$lib/hooks";
import {
getDirectoryInfos as getDirectoryInfosFromIndexedDB,
getDirectoryInfo as getDirectoryInfoFromIndexedDB,
storeDirectoryInfo,
deleteDirectoryInfo,
getFileInfos as getFileInfosFromIndexedDB,
getFileInfo as getFileInfoFromIndexedDB,
storeFileInfo,
deleteFileInfo,
getCategoryInfos as getCategoryInfosFromIndexedDB,
getCategoryInfo as getCategoryInfoFromIndexedDB,
storeCategoryInfo,
updateCategoryInfo as updateCategoryInfoInIndexedDB,
deleteCategoryInfo,
type DirectoryId,
type CategoryId,
} from "$lib/indexedDB";
import { unwrapDataKey, decryptString } from "$lib/modules/crypto";
import type {
CategoryInfoResponse,
CategoryFileListResponse,
DirectoryInfoResponse,
FileInfoResponse,
} from "$lib/server/schemas";
export type DirectoryInfo =
| {
id: "root";
dataKey?: undefined;
dataKeyVersion?: undefined;
name?: undefined;
subDirectoryIds: number[];
fileIds: number[];
}
| {
id: number;
dataKey?: CryptoKey;
dataKeyVersion?: Date;
name: string;
subDirectoryIds: number[];
fileIds: number[];
};
export interface FileInfo {
id: number;
dataKey?: CryptoKey;
dataKeyVersion?: Date;
contentType: string;
contentIv?: string;
name: string;
createdAt?: Date;
lastModifiedAt: Date;
categoryIds: number[];
}
export type CategoryInfo =
| {
id: "root";
dataKey?: undefined;
dataKeyVersion?: undefined;
name?: undefined;
subCategoryIds: number[];
files?: undefined;
isFileRecursive?: undefined;
}
| {
id: number;
dataKey?: CryptoKey;
dataKeyVersion?: Date;
name: string;
subCategoryIds: number[];
files: { id: number; isRecursive: boolean }[];
isFileRecursive: boolean;
};
const directoryInfoStore = new Map<DirectoryId, Writable<DirectoryInfo | null>>();
const fileInfoStore = new Map<number, Writable<FileInfo | null>>();
const categoryInfoStore = new Map<CategoryId, Writable<CategoryInfo | null>>();
const fetchDirectoryInfoFromIndexedDB = async (
id: DirectoryId,
info: Writable<DirectoryInfo | null>,
) => {
if (get(info)) return;
const [directory, subDirectories, files] = await Promise.all([
id !== "root" ? getDirectoryInfoFromIndexedDB(id) : undefined,
getDirectoryInfosFromIndexedDB(id),
getFileInfosFromIndexedDB(id),
]);
const subDirectoryIds = subDirectories.map(({ id }) => id);
const fileIds = files.map(({ id }) => id);
if (id === "root") {
info.set({ id, subDirectoryIds, fileIds });
} else {
if (!directory) return;
info.set({ id, name: directory.name, subDirectoryIds, fileIds });
}
};
const fetchDirectoryInfoFromServer = async (
id: DirectoryId,
info: Writable<DirectoryInfo | null>,
masterKey: CryptoKey,
) => {
const res = await callGetApi(`/api/directory/${id}`);
if (res.status === 404) {
info.set(null);
await deleteDirectoryInfo(id as number);
return;
} else if (!res.ok) {
throw new Error("Failed to fetch directory information");
}
const {
metadata,
subDirectories: subDirectoryIds,
files: fileIds,
}: DirectoryInfoResponse = await res.json();
if (id === "root") {
info.set({ id, subDirectoryIds, fileIds });
} else {
const { dataKey } = await unwrapDataKey(metadata!.dek, masterKey);
const name = await decryptString(metadata!.name, metadata!.nameIv, dataKey);
info.set({
id,
dataKey,
dataKeyVersion: new Date(metadata!.dekVersion),
name,
subDirectoryIds,
fileIds,
});
await storeDirectoryInfo({ id, parentId: metadata!.parent, name });
}
};
const fetchDirectoryInfo = async (
id: DirectoryId,
info: Writable<DirectoryInfo | null>,
masterKey: CryptoKey,
) => {
await fetchDirectoryInfoFromIndexedDB(id, info);
await fetchDirectoryInfoFromServer(id, info, masterKey);
};
export const getDirectoryInfo = (id: DirectoryId, masterKey: CryptoKey) => {
// TODO: MEK rotation
let info = directoryInfoStore.get(id);
if (!info) {
info = writable(null);
directoryInfoStore.set(id, info);
}
fetchDirectoryInfo(id, info, masterKey); // Intentionally not awaited
return info;
};
const fetchFileInfoFromIndexedDB = async (id: number, info: Writable<FileInfo | null>) => {
if (get(info)) return;
const file = await getFileInfoFromIndexedDB(id);
if (!file) return;
info.set(file);
};
const decryptDate = async (ciphertext: string, iv: string, dataKey: CryptoKey) => {
return new Date(parseInt(await decryptString(ciphertext, iv, dataKey), 10));
};
const fetchFileInfoFromServer = async (
id: number,
info: Writable<FileInfo | null>,
masterKey: CryptoKey,
) => {
const res = await callGetApi(`/api/file/${id}`);
if (res.status === 404) {
info.set(null);
await deleteFileInfo(id);
return;
} else if (!res.ok) {
throw new Error("Failed to fetch file information");
}
const metadata: FileInfoResponse = await res.json();
const { dataKey } = await unwrapDataKey(metadata.dek, masterKey);
const name = await decryptString(metadata.name, metadata.nameIv, dataKey);
const createdAt =
metadata.createdAt && metadata.createdAtIv
? await decryptDate(metadata.createdAt, metadata.createdAtIv, dataKey)
: undefined;
const lastModifiedAt = await decryptDate(
metadata.lastModifiedAt,
metadata.lastModifiedAtIv,
dataKey,
);
info.set({
id,
dataKey,
dataKeyVersion: new Date(metadata.dekVersion),
contentType: metadata.contentType,
contentIv: metadata.contentIv,
name,
createdAt,
lastModifiedAt,
categoryIds: metadata.categories,
});
await storeFileInfo({
id,
parentId: metadata.parent,
name,
contentType: metadata.contentType,
createdAt,
lastModifiedAt,
categoryIds: metadata.categories,
});
};
const fetchFileInfo = async (id: number, info: Writable<FileInfo | null>, masterKey: CryptoKey) => {
await fetchFileInfoFromIndexedDB(id, info);
await fetchFileInfoFromServer(id, info, masterKey);
};
export const getFileInfo = (fileId: number, masterKey: CryptoKey) => {
// TODO: MEK rotation
let info = fileInfoStore.get(fileId);
if (!info) {
info = writable(null);
fileInfoStore.set(fileId, info);
}
fetchFileInfo(fileId, info, masterKey); // Intentionally not awaited
return info;
};
const fetchCategoryInfoFromIndexedDB = async (
id: CategoryId,
info: Writable<CategoryInfo | null>,
) => {
if (get(info)) return;
const [category, subCategories] = await Promise.all([
id !== "root" ? getCategoryInfoFromIndexedDB(id) : undefined,
getCategoryInfosFromIndexedDB(id),
]);
const subCategoryIds = subCategories.map(({ id }) => id);
if (id === "root") {
info.set({ id, subCategoryIds });
} else {
if (!category) return;
info.set({
id,
name: category.name,
subCategoryIds,
files: category.files,
isFileRecursive: category.isFileRecursive,
});
}
};
const fetchCategoryInfoFromServer = async (
id: CategoryId,
info: Writable<CategoryInfo | null>,
masterKey: CryptoKey,
) => {
let res = await callGetApi(`/api/category/${id}`);
if (res.status === 404) {
info.set(null);
await deleteCategoryInfo(id as number);
return;
} else if (!res.ok) {
throw new Error("Failed to fetch category information");
}
const { metadata, subCategories }: CategoryInfoResponse = await res.json();
if (id === "root") {
info.set({ id, subCategoryIds: subCategories });
} else {
const { dataKey } = await unwrapDataKey(metadata!.dek, masterKey);
const name = await decryptString(metadata!.name, metadata!.nameIv, dataKey);
res = await callGetApi(`/api/category/${id}/file/list?recurse=true`);
if (!res.ok) {
throw new Error("Failed to fetch category files");
}
const { files }: CategoryFileListResponse = await res.json();
const filesMapped = files.map(({ file, isRecursive }) => ({ id: file, isRecursive }));
let isFileRecursive: boolean | undefined = undefined;
info.update((value) => {
const newValue = {
isFileRecursive: false,
...value,
id,
dataKey,
dataKeyVersion: new Date(metadata!.dekVersion),
name,
subCategoryIds: subCategories,
files: filesMapped,
};
isFileRecursive = newValue.isFileRecursive;
return newValue;
});
await storeCategoryInfo({
id,
parentId: metadata!.parent,
name,
files: filesMapped,
isFileRecursive: isFileRecursive!,
});
}
};
const fetchCategoryInfo = async (
id: CategoryId,
info: Writable<CategoryInfo | null>,
masterKey: CryptoKey,
) => {
await fetchCategoryInfoFromIndexedDB(id, info);
await fetchCategoryInfoFromServer(id, info, masterKey);
};
export const getCategoryInfo = (categoryId: CategoryId, masterKey: CryptoKey) => {
// TODO: MEK rotation
let info = categoryInfoStore.get(categoryId);
if (!info) {
info = writable(null);
categoryInfoStore.set(categoryId, info);
}
fetchCategoryInfo(categoryId, info, masterKey); // Intentionally not awaited
return info;
};
export const updateCategoryInfo = async (
categoryId: number,
changes: { isFileRecursive?: boolean },
) => {
await updateCategoryInfoInIndexedDB(categoryId, changes);
categoryInfoStore.get(categoryId)?.update((value) => {
if (!value) return value;
if (changes.isFileRecursive !== undefined) {
value.isFileRecursive = changes.isFileRecursive;
}
return value;
});
};


@@ -0,0 +1,121 @@
import * as IndexedDB from "$lib/indexedDB";
import { trpc, isTRPCClientError } from "$trpc/client";
import { FilesystemCache, decryptFileMetadata, decryptCategoryMetadata } from "./internal.svelte";
import type { CategoryInfo, MaybeCategoryInfo } from "./types";
const cache = new FilesystemCache<CategoryId, MaybeCategoryInfo>({
async fetchFromIndexedDB(id) {
const [category, subCategories] = await Promise.all([
id !== "root" ? IndexedDB.getCategoryInfo(id) : undefined,
IndexedDB.getCategoryInfos(id),
]);
const files = category?.files
? await Promise.all(
category.files.map(async (file) => {
const fileInfo = await IndexedDB.getFileInfo(file.id);
return fileInfo
? {
id: file.id,
parentId: fileInfo.parentId,
contentType: fileInfo.contentType,
name: fileInfo.name,
createdAt: fileInfo.createdAt,
lastModifiedAt: fileInfo.lastModifiedAt,
isRecursive: file.isRecursive,
}
: undefined;
}),
)
: undefined;
if (id === "root") {
return {
id,
exists: true,
subCategories,
};
} else if (category) {
return {
id,
exists: true,
parentId: category.parentId,
name: category.name,
subCategories,
files: files?.filter((file) => !!file) ?? [],
isFileRecursive: category.isFileRecursive ?? false,
};
}
},
async fetchFromServer(id, cachedInfo, masterKey) {
try {
const category = await trpc().category.get.query({ id, recurse: true });
const [subCategories, files, metadata] = await Promise.all([
Promise.all(
category.subCategories.map(async (category) => ({
id: category.id,
parentId: id,
...(await decryptCategoryMetadata(category, masterKey)),
})),
),
category.files &&
Promise.all(
category.files.map(async (file) => ({
id: file.id,
parentId: file.parent,
contentType: file.contentType,
isRecursive: file.isRecursive,
...(await decryptFileMetadata(file, masterKey)),
})),
),
category.metadata && decryptCategoryMetadata(category.metadata, masterKey),
]);
return storeToIndexedDB(
id !== "root"
? {
id,
parentId: category.metadata!.parent,
subCategories,
files: files!,
isFileRecursive: cachedInfo?.isFileRecursive ?? false,
...metadata!,
}
: { id, subCategories },
);
} catch (e) {
if (isTRPCClientError(e) && e.data?.code === "NOT_FOUND") {
await IndexedDB.deleteCategoryInfo(id as number);
return { id, exists: false };
}
throw e;
}
},
});
const storeToIndexedDB = (info: CategoryInfo) => {
if (info.id !== "root") {
void IndexedDB.storeCategoryInfo(info);
// TODO: Bulk Upsert
new Map(info.files.map((file) => [file.id, file])).forEach((file) => {
void IndexedDB.storeFileInfo(file);
});
}
// TODO: Bulk Upsert
info.subCategories.forEach((category) => {
void IndexedDB.storeCategoryInfo(category);
});
void IndexedDB.deleteDanglingCategoryInfos(
info.id,
new Set(info.subCategories.map(({ id }) => id)),
);
return { ...info, exists: true as const };
};
export const getCategoryInfo = (id: CategoryId, masterKey: CryptoKey) => {
return cache.get(id, masterKey);
};


@@ -0,0 +1,102 @@
import * as IndexedDB from "$lib/indexedDB";
import { trpc, isTRPCClientError } from "$trpc/client";
import { FilesystemCache, decryptDirectoryMetadata, decryptFileMetadata } from "./internal.svelte";
import type { DirectoryInfo, MaybeDirectoryInfo } from "./types";
const cache = new FilesystemCache<DirectoryId, MaybeDirectoryInfo>({
async fetchFromIndexedDB(id) {
const [directory, subDirectories, files] = await Promise.all([
id !== "root" ? IndexedDB.getDirectoryInfo(id) : undefined,
IndexedDB.getDirectoryInfos(id),
IndexedDB.getFileInfos(id),
]);
if (id === "root") {
return {
id,
exists: true,
subDirectories,
files,
};
} else if (directory) {
return {
id,
exists: true,
parentId: directory.parentId,
name: directory.name,
subDirectories,
files,
};
}
},
async fetchFromServer(id, _cachedInfo, masterKey) {
try {
const directory = await trpc().directory.get.query({ id });
const [subDirectories, files, metadata] = await Promise.all([
Promise.all(
directory.subDirectories.map(async (directory) => ({
id: directory.id,
parentId: id,
...(await decryptDirectoryMetadata(directory, masterKey)),
})),
),
Promise.all(
directory.files.map(async (file) => ({
id: file.id,
parentId: id,
contentType: file.contentType,
...(await decryptFileMetadata(file, masterKey)),
})),
),
directory.metadata && decryptDirectoryMetadata(directory.metadata, masterKey),
]);
return storeToIndexedDB(
id !== "root"
? {
id,
parentId: directory.metadata!.parent,
subDirectories,
files,
...metadata!,
}
: { id, subDirectories, files },
);
} catch (e) {
if (isTRPCClientError(e) && e.data?.code === "NOT_FOUND") {
await IndexedDB.deleteDirectoryInfo(id as number);
return { id, exists: false as const };
}
throw e;
}
},
});
const storeToIndexedDB = (info: DirectoryInfo) => {
if (info.id !== "root") {
void IndexedDB.storeDirectoryInfo(info);
}
// TODO: Bulk Upsert
info.subDirectories.forEach((subDirectory) => {
void IndexedDB.storeDirectoryInfo(subDirectory);
});
// TODO: Bulk Upsert
info.files.forEach((file) => {
void IndexedDB.storeFileInfo(file);
});
void IndexedDB.deleteDanglingDirectoryInfos(
info.id,
new Set(info.subDirectories.map(({ id }) => id)),
);
void IndexedDB.deleteDanglingFileInfos(info.id, new Set(info.files.map(({ id }) => id)));
return { ...info, exists: true as const };
};
export const getDirectoryInfo = (id: DirectoryId, masterKey: CryptoKey) => {
return cache.get(id, masterKey);
};


@@ -0,0 +1,177 @@
import * as IndexedDB from "$lib/indexedDB";
import { trpc, isTRPCClientError } from "$trpc/client";
import { FilesystemCache, decryptFileMetadata, decryptCategoryMetadata } from "./internal.svelte";
import type { FileInfo, MaybeFileInfo } from "./types";
const cache = new FilesystemCache<number, MaybeFileInfo>({
async fetchFromIndexedDB(id) {
const file = await IndexedDB.getFileInfo(id);
const categories = file?.categoryIds
? await Promise.all(
file.categoryIds.map(async (categoryId) => {
const category = await IndexedDB.getCategoryInfo(categoryId);
return category
? { id: category.id, parentId: category.parentId, name: category.name }
: undefined;
}),
)
: undefined;
if (file) {
return {
id,
exists: true,
parentId: file.parentId,
contentType: file.contentType,
name: file.name,
createdAt: file.createdAt,
lastModifiedAt: file.lastModifiedAt,
categories: categories?.filter((category) => !!category) ?? [],
};
}
},
async fetchFromServer(id, _cachedInfo, masterKey) {
try {
const file = await trpc().file.get.query({ id });
const [categories, metadata] = await Promise.all([
Promise.all(
file.categories.map(async (category) => ({
id: category.id,
parentId: category.parent,
...(await decryptCategoryMetadata(category, masterKey)),
})),
),
decryptFileMetadata(file, masterKey),
]);
return storeToIndexedDB({
id,
isLegacy: file.isLegacy,
parentId: file.parent,
dataKey: metadata.dataKey,
contentType: file.contentType,
name: metadata.name,
createdAt: metadata.createdAt,
lastModifiedAt: metadata.lastModifiedAt,
categories,
});
} catch (e) {
if (isTRPCClientError(e) && e.data?.code === "NOT_FOUND") {
await IndexedDB.deleteFileInfo(id);
return { id, exists: false as const };
}
throw e;
}
},
async bulkFetchFromIndexedDB(ids) {
const files = await IndexedDB.bulkGetFileInfos([...ids]);
const categories = await Promise.all(
files.map(async (file) =>
file?.categoryIds
? await Promise.all(
file.categoryIds.map(async (categoryId) => {
const category = await IndexedDB.getCategoryInfo(categoryId);
return category
? { id: category.id, parentId: category.parentId, name: category.name }
: undefined;
}),
)
: undefined,
),
);
return new Map(
files
.filter((file) => !!file)
.map((file, index) => [
file.id,
{
...file,
exists: true,
categories: categories[index]?.filter((category) => !!category) ?? [],
},
]),
);
},
async bulkFetchFromServer(ids, masterKey) {
const idsArray = [...ids.keys()];
const filesRaw = await trpc().file.bulkGet.query({ ids: idsArray });
const files = await Promise.all(
filesRaw.map(async ({ id, categories: categoriesRaw, ...metadataRaw }) => {
const [categories, metadata] = await Promise.all([
Promise.all(
categoriesRaw.map(async (category) => ({
id: category.id,
parentId: category.parent,
...(await decryptCategoryMetadata(category, masterKey)),
})),
),
decryptFileMetadata(metadataRaw, masterKey),
]);
return {
id,
exists: true as const,
isLegacy: metadataRaw.isLegacy,
parentId: metadataRaw.parent,
contentType: metadataRaw.contentType,
categories,
...metadata,
};
}),
);
const existingIds = new Set(filesRaw.map(({ id }) => id));
const deletedIds = idsArray.filter((id) => !existingIds.has(id));
void IndexedDB.bulkDeleteFileInfos(deletedIds);
return new Map<number, MaybeFileInfo>([
...bulkStoreToIndexedDB(files),
...deletedIds.map((id) => [id, { id, exists: false }] as const),
]);
},
});
const storeToIndexedDB = (info: FileInfo) => {
void IndexedDB.storeFileInfo({
...info,
categoryIds: info.categories.map(({ id }) => id),
});
info.categories.forEach((category) => {
void IndexedDB.storeCategoryInfo(category);
});
return { ...info, exists: true as const };
};
const bulkStoreToIndexedDB = (infos: FileInfo[]) => {
// TODO: Bulk Upsert
infos.forEach((info) => {
void IndexedDB.storeFileInfo({
...info,
categoryIds: info.categories.map(({ id }) => id),
});
});
// TODO: Bulk Upsert
new Map(
infos.flatMap(({ categories }) => categories).map((category) => [category.id, category]),
).forEach((category) => {
void IndexedDB.storeCategoryInfo(category);
});
return infos.map((info) => [info.id, { ...info, exists: true }] as const);
};
export const getFileInfo = (id: number, masterKey: CryptoKey) => {
return cache.get(id, masterKey);
};
export const bulkGetFileInfo = (ids: number[], masterKey: CryptoKey) => {
return cache.bulkGet(new Set(ids), masterKey);
};


@@ -0,0 +1,4 @@
export * from "./category";
export * from "./directory";
export * from "./file";
export * from "./types";


@@ -0,0 +1,172 @@
import { untrack } from "svelte";
import { unwrapDataKey, decryptString } from "$lib/modules/crypto";
interface FilesystemCacheOptions<K, V> {
fetchFromIndexedDB: (key: K) => Promise<V | undefined>;
fetchFromServer: (key: K, cachedValue: V | undefined, masterKey: CryptoKey) => Promise<V>;
bulkFetchFromIndexedDB?: (keys: Set<K>) => Promise<Map<K, V>>;
bulkFetchFromServer?: (
keys: Map<K, { cachedValue: V | undefined }>,
masterKey: CryptoKey,
) => Promise<Map<K, V>>;
}
export class FilesystemCache<K, V extends object> {
private map = new Map<K, { value?: V; promise?: Promise<V> }>();
constructor(private readonly options: FilesystemCacheOptions<K, V>) {}
get(key: K, masterKey: CryptoKey) {
return untrack(() => {
let state = this.map.get(key);
if (state?.promise) return state.value ?? state.promise;
const { promise: newPromise, resolve } = Promise.withResolvers<V>();
if (!state) {
const newState = $state({});
state = newState;
this.map.set(key, newState);
}
(state.value
? Promise.resolve(state.value)
: this.options.fetchFromIndexedDB(key).then((loadedInfo) => {
if (loadedInfo) {
state.value = loadedInfo;
resolve(state.value);
}
return loadedInfo;
})
)
.then((cachedInfo) => this.options.fetchFromServer(key, cachedInfo, masterKey))
.then((loadedInfo) => {
if (state.value) {
Object.assign(state.value, loadedInfo);
} else {
state.value = loadedInfo;
}
resolve(state.value);
})
.finally(() => {
state.promise = undefined;
});
state.promise = newPromise;
return state.value ?? newPromise;
});
}
bulkGet(keys: Set<K>, masterKey: CryptoKey) {
return untrack(() => {
const newPromises = new Map(
keys
.keys()
.filter((key) => this.map.get(key)?.promise === undefined)
.map((key) => [key, Promise.withResolvers<V>()]),
);
newPromises.forEach(({ promise }, key) => {
const state = this.map.get(key);
if (state) {
state.promise = promise;
} else {
const newState = $state({ promise });
this.map.set(key, newState);
}
});
const resolve = (loadedInfos: Map<K, V>) => {
loadedInfos.forEach((loadedInfo, key) => {
const state = this.map.get(key)!;
if (state.value) {
Object.assign(state.value, loadedInfo);
} else {
state.value = loadedInfo;
}
newPromises.get(key)!.resolve(state.value);
});
return loadedInfos;
};
this.options.bulkFetchFromIndexedDB!(
new Set(newPromises.keys().filter((key) => this.map.get(key)!.value === undefined)),
)
.then(resolve)
.then(() =>
this.options.bulkFetchFromServer!(
new Map(
newPromises.keys().map((key) => [key, { cachedValue: this.map.get(key)!.value }]),
),
masterKey,
),
)
.then(resolve)
.finally(() => {
newPromises.forEach((_, key) => {
this.map.get(key)!.promise = undefined;
});
});
const bottleneckPromises = Array.from(
keys
.keys()
.filter((key) => this.map.get(key)!.value === undefined)
.map((key) => this.map.get(key)!.promise!),
);
const makeResult = () =>
new Map(keys.keys().map((key) => [key, this.map.get(key)!.value!] as const));
return bottleneckPromises.length > 0
? Promise.all(bottleneckPromises).then(makeResult)
: makeResult();
});
}
}
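Stripped of the Svelte reactivity and promise bookkeeping, `FilesystemCache.get` follows a two-tier read pattern: serve a fast local value first (IndexedDB in the real code), then always refresh from the authoritative server. The sketch below shows only that core idea; all names are illustrative and this is not the real `FilesystemCache` API.

```typescript
// Minimal two-tier lookup: in-memory/local value as the stale candidate,
// then an unconditional refresh from the remote tier.
class TwoTierCache<K, V> {
  private values = new Map<K, V>();

  constructor(
    private readonly local: (key: K) => Promise<V | undefined>,
    private readonly remote: (key: K, cached: V | undefined) => Promise<V>,
  ) {}

  async get(key: K): Promise<V> {
    // Prefer an in-memory value, falling back to the local tier.
    const cached = this.values.get(key) ?? (await this.local(key));
    // The remote tier always runs, so callers eventually see fresh data.
    const fresh = await this.remote(key, cached);
    this.values.set(key, fresh);
    return fresh;
  }
}
```

The real class additionally resolves waiters as soon as the local tier answers, so the UI can render stale data while the server round-trip is still in flight.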
export const decryptDirectoryMetadata = async (
metadata: { dek: string; dekVersion: Date; name: string; nameIv: string },
masterKey: CryptoKey,
) => {
const { dataKey } = await unwrapDataKey(metadata.dek, masterKey);
const name = await decryptString(metadata.name, metadata.nameIv, dataKey);
return {
dataKey: { key: dataKey, version: metadata.dekVersion },
name,
};
};
const decryptDate = async (ciphertext: string, iv: string, dataKey: CryptoKey) => {
return new Date(parseInt(await decryptString(ciphertext, iv, dataKey), 10));
};
export const decryptFileMetadata = async (
metadata: {
dek: string;
dekVersion: Date;
name: string;
nameIv: string;
createdAt?: string;
createdAtIv?: string;
lastModifiedAt: string;
lastModifiedAtIv: string;
},
masterKey: CryptoKey,
) => {
const { dataKey } = await unwrapDataKey(metadata.dek, masterKey);
const [name, createdAt, lastModifiedAt] = await Promise.all([
decryptString(metadata.name, metadata.nameIv, dataKey),
metadata.createdAt
? decryptDate(metadata.createdAt, metadata.createdAtIv!, dataKey)
: undefined,
decryptDate(metadata.lastModifiedAt, metadata.lastModifiedAtIv, dataKey),
]);
return {
dataKey: { key: dataKey, version: metadata.dekVersion },
name,
createdAt,
lastModifiedAt,
};
};
export const decryptCategoryMetadata = decryptDirectoryMetadata;


@@ -0,0 +1,77 @@
export type DataKey = { key: CryptoKey; version: Date };
type AllUndefined<T> = { [K in keyof T]?: undefined };
interface LocalDirectoryInfo {
id: number;
parentId: DirectoryId;
dataKey?: DataKey;
name: string;
subDirectories: SubDirectoryInfo[];
files: SummarizedFileInfo[];
}
interface RootDirectoryInfo {
id: "root";
parentId?: undefined;
dataKey?: undefined;
name?: undefined;
subDirectories: SubDirectoryInfo[];
files: SummarizedFileInfo[];
}
export type DirectoryInfo = LocalDirectoryInfo | RootDirectoryInfo;
export type MaybeDirectoryInfo =
| (DirectoryInfo & { exists: true })
| ({ id: DirectoryId; exists: false } & AllUndefined<Omit<DirectoryInfo, "id">>);
export type SubDirectoryInfo = Omit<LocalDirectoryInfo, "subDirectories" | "files">;
export interface FileInfo {
id: number;
isLegacy?: boolean;
parentId: DirectoryId;
dataKey?: DataKey;
contentType: string;
name: string;
createdAt?: Date;
lastModifiedAt: Date;
categories: FileCategoryInfo[];
}
export type MaybeFileInfo =
| (FileInfo & { exists: true })
| ({ id: number; exists: false } & AllUndefined<Omit<FileInfo, "id">>);
export type SummarizedFileInfo = Omit<FileInfo, "categories">;
export type CategoryFileInfo = SummarizedFileInfo & { isRecursive: boolean };
interface LocalCategoryInfo {
id: number;
parentId: DirectoryId;
dataKey?: DataKey;
name: string;
subCategories: SubCategoryInfo[];
files: CategoryFileInfo[];
isFileRecursive: boolean;
}
interface RootCategoryInfo {
id: "root";
parentId?: undefined;
dataKey?: undefined;
name?: undefined;
subCategories: SubCategoryInfo[];
files?: undefined;
isFileRecursive?: undefined;
}
export type CategoryInfo = LocalCategoryInfo | RootCategoryInfo;
export type MaybeCategoryInfo =
| (CategoryInfo & { exists: true })
| ({ id: CategoryId; exists: false } & AllUndefined<Omit<CategoryInfo, "id">>);
export type SubCategoryInfo = Omit<
LocalCategoryInfo,
"subCategories" | "files" | "isFileRecursive"
>;
export type FileCategoryInfo = Omit<SubCategoryInfo, "dataKey">;

src/lib/modules/http.ts Normal file

@@ -0,0 +1,14 @@
export const parseRangeHeader = (rangeHeader: string | null) => {
if (!rangeHeader) return undefined;
const firstRange = rangeHeader.split(",")[0]!.trim();
const parts = firstRange.replace(/bytes=/, "").split("-");
return {
start: parts[0] ? parseInt(parts[0], 10) : undefined,
end: parts[1] ? parseInt(parts[1], 10) : undefined,
};
};
export const getContentRangeHeader = (range?: { start: number; end: number; total: number }) => {
return range && { "Content-Range": `bytes ${range.start}-${range.end}/${range.total}` };
};
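A quick usage sketch of the two helpers above, reproduced verbatim so the example is self-contained and runnable outside the repository:

```typescript
// Parse the first range of an HTTP Range header ("bytes=start-end").
const parseRangeHeader = (rangeHeader: string | null) => {
  if (!rangeHeader) return undefined;
  const firstRange = rangeHeader.split(",")[0]!.trim();
  const parts = firstRange.replace(/bytes=/, "").split("-");
  return {
    start: parts[0] ? parseInt(parts[0], 10) : undefined,
    end: parts[1] ? parseInt(parts[1], 10) : undefined,
  };
};

// Build the matching Content-Range response header.
const getContentRangeHeader = (range?: { start: number; end: number; total: number }) => {
  return range && { "Content-Range": `bytes ${range.start}-${range.end}/${range.total}` };
};

// "bytes=0-499" asks for the first 500 bytes; "bytes=500-" is open-ended.
console.log(parseRangeHeader("bytes=0-499")); // { start: 0, end: 499 }
console.log(parseRangeHeader("bytes=500-")); // { start: 500, end: undefined }
console.log(getContentRangeHeader({ start: 0, end: 499, total: 1000 }));
// { "Content-Range": "bytes 0-499/1000" }
```

Note that only the first range of a multi-range header is honored, which is a common simplification for streaming media endpoints.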


@@ -2,21 +2,21 @@ import { z } from "zod";
 import { storeClientKey } from "$lib/indexedDB";
 import type { ClientKeys } from "$lib/stores";
 
-const serializedClientKeysSchema = z.intersection(
+const SerializedClientKeysSchema = z.intersection(
   z.object({
     generator: z.literal("ArkVault"),
-    exportedAt: z.string().datetime(),
+    exportedAt: z.iso.datetime(),
   }),
   z.object({
     version: z.literal(1),
-    encryptKey: z.string().base64().nonempty(),
-    decryptKey: z.string().base64().nonempty(),
-    signKey: z.string().base64().nonempty(),
-    verifyKey: z.string().base64().nonempty(),
+    encryptKey: z.base64().nonempty(),
+    decryptKey: z.base64().nonempty(),
+    signKey: z.base64().nonempty(),
+    verifyKey: z.base64().nonempty(),
   }),
 );
-type SerializedClientKeys = z.infer<typeof serializedClientKeysSchema>;
+type SerializedClientKeys = z.infer<typeof SerializedClientKeysSchema>;
 
 type DeserializedClientKeys = {
   encryptKeyBase64: string;
@@ -43,7 +43,7 @@ export const serializeClientKeys = ({
 };
 
 export const deserializeClientKeys = (serialized: string) => {
-  const zodRes = serializedClientKeysSchema.safeParse(JSON.parse(serialized));
+  const zodRes = SerializedClientKeysSchema.safeParse(JSON.parse(serialized));
   if (zodRes.success) {
     return {
       encryptKeyBase64: zodRes.data.encryptKey,


@@ -1,13 +1,5 @@
let rootHandle: FileSystemDirectoryHandle | null = null;
export const prepareOpfs = async () => {
rootHandle = await navigator.storage.getDirectory();
};
const getFileHandle = async (path: string, create = true) => {
if (!rootHandle) {
throw new Error("OPFS not prepared");
} else if (path[0] !== "/") {
if (path[0] !== "/") {
throw new Error("Path must be absolute");
}
@@ -17,7 +9,7 @@ const getFileHandle = async (path: string, create = true) => {
}
try {
let directoryHandle = rootHandle;
let directoryHandle = await navigator.storage.getDirectory();
for (const part of parts.slice(0, -1)) {
if (!part) continue;
directoryHandle = await directoryHandle.getDirectoryHandle(part, { create });
@@ -34,12 +26,15 @@ const getFileHandle = async (path: string, create = true) => {
}
};
export const readFile = async (path: string) => {
export const getFile = async (path: string) => {
const { fileHandle } = await getFileHandle(path, false);
if (!fileHandle) return null;
const file = await fileHandle.getFile();
return await file.arrayBuffer();
return await fileHandle.getFile();
};
export const readFile = async (path: string) => {
return (await getFile(path))?.arrayBuffer() ?? null;
};
export const writeFile = async (path: string, data: ArrayBuffer) => {
@@ -61,9 +56,7 @@ export const deleteFile = async (path: string) => {
};
const getDirectoryHandle = async (path: string) => {
if (!rootHandle) {
throw new Error("OPFS not prepared");
} else if (path[0] !== "/") {
if (path[0] !== "/") {
throw new Error("Path must be absolute");
}
@@ -73,7 +66,7 @@ const getDirectoryHandle = async (path: string) => {
}
try {
let directoryHandle = rootHandle;
let directoryHandle = await navigator.storage.getDirectory();
let parentHandle;
for (const part of parts.slice(1)) {
if (!part) continue;


@@ -0,0 +1,48 @@
export class Scheduler<T = void> {
private isEstimating = false;
private memoryUsage = 0;
private queue: (() => void)[] = [];
constructor(public readonly memoryLimit = 100 * 1024 * 1024 /* 100 MiB */) {}
private next() {
if (!this.isEstimating && this.memoryUsage < this.memoryLimit) {
const resolve = this.queue.shift();
if (resolve) {
this.isEstimating = true;
resolve();
}
}
}
async schedule(
estimateMemoryUsage: number | (() => number | Promise<number>),
task: () => Promise<T>,
) {
if (this.isEstimating || this.memoryUsage >= this.memoryLimit) {
await new Promise<void>((resolve) => {
this.queue.push(resolve);
});
} else {
this.isEstimating = true;
}
let taskMemoryUsage = 0;
try {
taskMemoryUsage =
typeof estimateMemoryUsage === "number" ? estimateMemoryUsage : await estimateMemoryUsage();
this.memoryUsage += taskMemoryUsage;
} finally {
this.isEstimating = false;
this.next();
}
try {
return await task();
} finally {
this.memoryUsage -= taskMemoryUsage;
this.next();
}
}
}
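A usage sketch of the memory-capped queue above: tasks whose combined estimated memory would exceed the limit wait until an earlier task's `finally` block releases its budget. The class is restated verbatim so the snippet is self-contained; the file names, sizes, and the 8 MiB limit are made-up values:

```typescript
export class Scheduler<T = void> {
  private isEstimating = false;
  private memoryUsage = 0;
  private queue: (() => void)[] = [];

  constructor(public readonly memoryLimit = 100 * 1024 * 1024 /* 100 MiB */) {}

  private next() {
    if (!this.isEstimating && this.memoryUsage < this.memoryLimit) {
      const resolve = this.queue.shift();
      if (resolve) {
        this.isEstimating = true;
        resolve();
      }
    }
  }

  async schedule(
    estimateMemoryUsage: number | (() => number | Promise<number>),
    task: () => Promise<T>,
  ) {
    // Wait in the queue if another task is estimating or the budget is spent.
    if (this.isEstimating || this.memoryUsage >= this.memoryLimit) {
      await new Promise<void>((resolve) => {
        this.queue.push(resolve);
      });
    } else {
      this.isEstimating = true;
    }
    let taskMemoryUsage = 0;
    try {
      taskMemoryUsage =
        typeof estimateMemoryUsage === "number" ? estimateMemoryUsage : await estimateMemoryUsage();
      this.memoryUsage += taskMemoryUsage;
    } finally {
      this.isEstimating = false;
      this.next();
    }
    try {
      return await task();
    } finally {
      // Release this task's budget and wake the next queued task, if any.
      this.memoryUsage -= taskMemoryUsage;
      this.next();
    }
  }
}

// Hypothetical usage: cap in-flight upload memory at 8 MiB.
const scheduler = new Scheduler<string>(8 * 1024 * 1024);

const upload = (name: string, size: number) =>
  scheduler.schedule(size, async () => {
    // ...read `size` bytes into memory and upload them...
    return name;
  });

// The third task queues until one of the first two releases its 5 MiB.
Promise.all([
  upload("a.bin", 5 * 1024 * 1024),
  upload("b.bin", 5 * 1024 * 1024),
  upload("c.bin", 5 * 1024 * 1024),
]).then((names) => console.log(names)); // → ["a.bin", "b.bin", "c.bin"]
```

Note the design choice: the limit gates *admission* (a task may start while usage is below the limit and push it over), so a single oversized task can still run rather than deadlocking.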


@@ -67,10 +67,15 @@ const generateVideoThumbnail = (videoUrl: string, time = 0) => {
return new Promise<Blob>((resolve, reject) => {
const video = document.createElement("video");
video.onloadedmetadata = () => {
video.currentTime = Math.min(time, video.duration);
video.requestVideoFrameCallback(() => {
if (video.videoWidth === 0 || video.videoHeight === 0) {
return reject();
}
const callbackId = video.requestVideoFrameCallback(() => {
captureVideoThumbnail(video).then(resolve).catch(reject);
video.cancelVideoFrameCallback(callbackId);
});
video.currentTime = Math.min(time, video.duration);
};
video.onerror = reject;
@@ -117,6 +122,22 @@ export const generateThumbnail = async (fileBuffer: ArrayBuffer, fileType: strin
}
};
export const generateThumbnailFromFile = async (file: File) => {
if (!file.type.startsWith("video/")) return null;
let url;
try {
url = URL.createObjectURL(file);
return await generateVideoThumbnail(url);
} catch {
return null;
} finally {
if (url) {
URL.revokeObjectURL(url);
}
}
};
export const getThumbnailUrl = (thumbnailBuffer: ArrayBuffer) => {
return `data:image/webp;base64,${encodeToBase64(thumbnailBuffer)}`;
};


@@ -0,0 +1,4 @@
import { z } from "zod";
export const DirectoryIdSchema = z.union([z.literal("root"), z.int().positive()]);
export const CategoryIdSchema = z.union([z.literal("root"), z.int().positive()]);

src/lib/schemas/index.ts (new file, 1 line)

@@ -0,0 +1 @@
export * from "./filesystem";


@@ -2,8 +2,6 @@ import { IntegrityError } from "./error";
import db from "./kysely";
import type { Ciphertext } from "./schema";
export type CategoryId = "root" | number;
interface Category {
id: number;
parentId: CategoryId;


@@ -98,22 +98,6 @@ export const createUserClient = async (userId: number, clientId: number) => {
}
};
export const getAllUserClients = async (userId: number) => {
const userClients = await db
.selectFrom("user_client")
.selectAll()
.where("user_id", "=", userId)
.execute();
return userClients.map(
({ user_id, client_id, state }) =>
({
userId: user_id,
clientId: client_id,
state,
}) satisfies UserClient,
);
};
export const getUserClient = async (userId: number, clientId: number) => {
const userClient = await db
.selectFrom("user_client")


@@ -1,11 +1,10 @@
import { sql, type NotNull } from "kysely";
import { sql } from "kysely";
import { jsonArrayFrom } from "kysely/helpers/postgres";
import pg from "pg";
import { IntegrityError } from "./error";
import db from "./kysely";
import type { Ciphertext } from "./schema";
export type DirectoryId = "root" | number;
interface Directory {
id: number;
parentId: DirectoryId;
@@ -16,8 +15,6 @@ interface Directory {
encName: Ciphertext;
}
export type NewDirectory = Omit<Directory, "id">;
interface File {
id: number;
parentId: DirectoryId;
@@ -29,16 +26,23 @@ interface File {
hskVersion: number | null;
contentHmac: string | null;
contentType: string;
encContentIv: string;
encContentIv: string | null;
encContentHash: string;
encName: Ciphertext;
encCreatedAt: Ciphertext | null;
encLastModifiedAt: Ciphertext;
}
export type NewFile = Omit<File, "id">;
interface FileCategory {
id: number;
parentId: CategoryId;
mekVersion: number;
encDek: string;
dekVersion: Date;
encName: Ciphertext;
}
export const registerDirectory = async (params: NewDirectory) => {
export const registerDirectory = async (params: Omit<Directory, "id">) => {
await db.transaction().execute(async (trx) => {
const mek = await trx
.selectFrom("master_encryption_key")
@@ -206,69 +210,41 @@ export const unregisterDirectory = async (userId: number, directoryId: number) =
});
};
export const registerFile = async (params: NewFile) => {
export const registerFile = async (trx: typeof db, params: Omit<File, "id">) => {
if ((params.hskVersion && !params.contentHmac) || (!params.hskVersion && params.contentHmac)) {
throw new Error("Invalid arguments");
}
return await db.transaction().execute(async (trx) => {
const mek = await trx
.selectFrom("master_encryption_key")
.select("version")
.where("user_id", "=", params.userId)
.where("state", "=", "active")
.limit(1)
.forUpdate()
.executeTakeFirst();
if (mek?.version !== params.mekVersion) {
throw new IntegrityError("Inactive MEK version");
}
if (params.hskVersion) {
const hsk = await trx
.selectFrom("hmac_secret_key")
.select("version")
.where("user_id", "=", params.userId)
.where("state", "=", "active")
.limit(1)
.forUpdate()
.executeTakeFirst();
if (hsk?.version !== params.hskVersion) {
throw new IntegrityError("Inactive HSK version");
}
}
const { fileId } = await trx
.insertInto("file")
.values({
parent_id: params.parentId !== "root" ? params.parentId : null,
user_id: params.userId,
path: params.path,
master_encryption_key_version: params.mekVersion,
encrypted_data_encryption_key: params.encDek,
data_encryption_key_version: params.dekVersion,
hmac_secret_key_version: params.hskVersion,
content_hmac: params.contentHmac,
content_type: params.contentType,
encrypted_content_iv: params.encContentIv,
encrypted_content_hash: params.encContentHash,
encrypted_name: params.encName,
encrypted_created_at: params.encCreatedAt,
encrypted_last_modified_at: params.encLastModifiedAt,
})
.returning("id as fileId")
.executeTakeFirstOrThrow();
await trx
.insertInto("file_log")
.values({
file_id: fileId,
timestamp: new Date(),
action: "create",
new_name: params.encName,
})
.execute();
return { id: fileId };
});
const { fileId } = await trx
.insertInto("file")
.values({
parent_id: params.parentId !== "root" ? params.parentId : null,
user_id: params.userId,
path: params.path,
master_encryption_key_version: params.mekVersion,
encrypted_data_encryption_key: params.encDek,
data_encryption_key_version: params.dekVersion,
hmac_secret_key_version: params.hskVersion,
content_hmac: params.contentHmac,
content_type: params.contentType,
encrypted_content_iv: params.encContentIv,
encrypted_content_hash: params.encContentHash,
encrypted_name: params.encName,
encrypted_created_at: params.encCreatedAt,
encrypted_last_modified_at: params.encLastModifiedAt,
})
.returning("id as fileId")
.executeTakeFirstOrThrow();
await trx
.insertInto("file_log")
.values({
file_id: fileId,
timestamp: new Date(),
action: "create",
new_name: params.encName,
})
.execute();
return { id: fileId };
};
export const getAllFilesByParent = async (userId: number, parentId: DirectoryId) => {
@@ -306,39 +282,51 @@ export const getAllFilesByCategory = async (
recurse: boolean,
) => {
const files = await db
.withRecursive("cte", (db) =>
.withRecursive("category_tree", (db) =>
db
.selectFrom("category")
.leftJoin("file_category", "category.id", "file_category.category_id")
.select(["id", "parent_id", "user_id", "file_category.file_id"])
.select(sql<number>`0`.as("depth"))
.select(["id", sql<number>`0`.as("depth")])
.where("id", "=", categoryId)
.where("user_id", "=", userId)
.$if(recurse, (qb) =>
qb.unionAll((db) =>
db
.selectFrom("category")
.leftJoin("file_category", "category.id", "file_category.category_id")
.innerJoin("cte", "category.parent_id", "cte.id")
.select([
"category.id",
"category.parent_id",
"category.user_id",
"file_category.file_id",
])
.select(sql<number>`cte.depth + 1`.as("depth")),
.innerJoin("category_tree", "category.parent_id", "category_tree.id")
.select(["category.id", sql<number>`depth + 1`.as("depth")]),
),
),
)
.selectFrom("cte")
.selectFrom("category_tree")
.innerJoin("file_category", "category_tree.id", "file_category.category_id")
.innerJoin("file", "file_category.file_id", "file.id")
.select(["file_id", "depth"])
.selectAll("file")
.distinctOn("file_id")
.where("user_id", "=", userId)
.where("file_id", "is not", null)
.$narrowType<{ file_id: NotNull }>()
.orderBy("file_id")
.orderBy("depth")
.execute();
return files.map(({ file_id, depth }) => ({ id: file_id, isRecursive: depth > 0 }));
return files.map(
(file) =>
({
id: file.file_id,
parentId: file.parent_id ?? "root",
userId: file.user_id,
path: file.path,
mekVersion: file.master_encryption_key_version,
encDek: file.encrypted_data_encryption_key,
dekVersion: file.data_encryption_key_version,
hskVersion: file.hmac_secret_key_version,
contentHmac: file.content_hmac,
contentType: file.content_type,
encContentIv: file.encrypted_content_iv,
encContentHash: file.encrypted_content_hash,
encName: file.encrypted_name,
encCreatedAt: file.encrypted_created_at,
encLastModifiedAt: file.encrypted_last_modified_at,
isRecursive: file.depth > 0,
}) satisfies File & { isRecursive: boolean },
);
};
export const getAllFileIds = async (userId: number) => {
@@ -390,6 +378,52 @@ export const getFile = async (userId: number, fileId: number) => {
: null;
};
export const getFilesWithCategories = async (userId: number, fileIds: number[]) => {
const files = await db
.selectFrom("file")
.selectAll()
.select((eb) =>
jsonArrayFrom(
eb
.selectFrom("file_category")
.innerJoin("category", "file_category.category_id", "category.id")
.where("file_category.file_id", "=", eb.ref("file.id"))
.selectAll("category"),
).as("categories"),
)
.where("id", "=", (eb) => eb.fn.any(eb.val(fileIds)))
.where("user_id", "=", userId)
.execute();
return files.map(
(file) =>
({
id: file.id,
parentId: file.parent_id ?? "root",
userId: file.user_id,
path: file.path,
mekVersion: file.master_encryption_key_version,
encDek: file.encrypted_data_encryption_key,
dekVersion: file.data_encryption_key_version,
hskVersion: file.hmac_secret_key_version,
contentHmac: file.content_hmac,
contentType: file.content_type,
encContentIv: file.encrypted_content_iv,
encContentHash: file.encrypted_content_hash,
encName: file.encrypted_name,
encCreatedAt: file.encrypted_created_at,
encLastModifiedAt: file.encrypted_last_modified_at,
categories: file.categories.map((category) => ({
id: category.id,
parentId: category.parent_id ?? "root",
mekVersion: category.master_encryption_key_version,
encDek: category.encrypted_data_encryption_key,
dekVersion: new Date(category.data_encryption_key_version),
encName: category.encrypted_name,
})),
}) satisfies File & { categories: FileCategory[] },
);
};
export const setFileEncName = async (
userId: number,
fileId: number,
@@ -476,10 +510,21 @@ export const addFileToCategory = async (fileId: number, categoryId: number) => {
export const getAllFileCategories = async (fileId: number) => {
const categories = await db
.selectFrom("file_category")
.select("category_id")
.innerJoin("category", "file_category.category_id", "category.id")
.selectAll("category")
.where("file_id", "=", fileId)
.execute();
return categories.map(({ category_id }) => ({ id: category_id }));
return categories.map(
(category) =>
({
id: category.id,
parentId: category.parent_id ?? "root",
mekVersion: category.master_encryption_key_version,
encDek: category.encrypted_data_encryption_key,
dekVersion: category.data_encryption_key_version,
encName: category.encrypted_name,
}) satisfies FileCategory,
);
};
export const removeFileFromCategory = async (fileId: number, categoryId: number) => {


@@ -0,0 +1,11 @@
export * as CategoryRepo from "./category";
export * as ClientRepo from "./client";
export * as FileRepo from "./file";
export * as HskRepo from "./hsk";
export * as MediaRepo from "./media";
export * as MekRepo from "./mek";
export * as SessionRepo from "./session";
export * as UploadRepo from "./upload";
export * as UserRepo from "./user";
export * from "./error";

View File

@@ -6,7 +6,7 @@ interface Thumbnail {
id: number;
path: string;
updatedAt: Date;
encContentIv: string;
encContentIv: string | null;
}
interface FileThumbnail extends Thumbnail {
@@ -14,54 +14,53 @@ interface FileThumbnail extends Thumbnail {
}
export const updateFileThumbnail = async (
trx: typeof db,
userId: number,
fileId: number,
dekVersion: Date,
path: string,
encContentIv: string,
encContentIv: string | null,
) => {
return await db.transaction().execute(async (trx) => {
const file = await trx
.selectFrom("file")
.select("data_encryption_key_version")
.where("id", "=", fileId)
.where("user_id", "=", userId)
.limit(1)
.forUpdate()
.executeTakeFirst();
if (!file) {
throw new IntegrityError("File not found");
} else if (file.data_encryption_key_version.getTime() !== dekVersion.getTime()) {
throw new IntegrityError("Invalid DEK version");
}
const file = await trx
.selectFrom("file")
.select("data_encryption_key_version")
.where("id", "=", fileId)
.where("user_id", "=", userId)
.limit(1)
.forUpdate()
.executeTakeFirst();
if (!file) {
throw new IntegrityError("File not found");
} else if (file.data_encryption_key_version.getTime() !== dekVersion.getTime()) {
throw new IntegrityError("Invalid DEK version");
}
const thumbnail = await trx
.selectFrom("thumbnail")
.select("path as oldPath")
.where("file_id", "=", fileId)
.limit(1)
.forUpdate()
.executeTakeFirst();
const now = new Date();
const thumbnail = await trx
.selectFrom("thumbnail")
.select("path as oldPath")
.where("file_id", "=", fileId)
.limit(1)
.forUpdate()
.executeTakeFirst();
const now = new Date();
await trx
.insertInto("thumbnail")
.values({
file_id: fileId,
await trx
.insertInto("thumbnail")
.values({
file_id: fileId,
path,
updated_at: now,
encrypted_content_iv: encContentIv,
})
.onConflict((oc) =>
oc.column("file_id").doUpdateSet({
path,
updated_at: now,
encrypted_content_iv: encContentIv,
})
.onConflict((oc) =>
oc.column("file_id").doUpdateSet({
path,
updated_at: now,
encrypted_content_iv: encContentIv,
}),
)
.execute();
return thumbnail?.oldPath ?? null;
});
}),
)
.execute();
return thumbnail?.oldPath ?? null;
};
export const getFileThumbnail = async (userId: number, fileId: number) => {


@@ -60,19 +60,6 @@ export const registerInitialMek = async (
});
};
export const getInitialMek = async (userId: number) => {
const mek = await db
.selectFrom("master_encryption_key")
.selectAll()
.where("user_id", "=", userId)
.where("version", "=", 1)
.limit(1)
.executeTakeFirst();
return mek
? ({ userId: mek.user_id, version: mek.version, state: mek.state } satisfies Mek)
: null;
};
export const getAllValidClientMeks = async (userId: number, clientId: number) => {
const clientMeks = await db
.selectFrom("client_master_encryption_key")


@@ -0,0 +1,63 @@
import { Kysely, sql } from "kysely";
// eslint-disable-next-line @typescript-eslint/no-explicit-any
export const up = async (db: Kysely<any>) => {
// file.ts
await db.schema
.alterTable("file")
.alterColumn("encrypted_content_iv", (col) => col.dropNotNull())
.execute();
// media.ts
await db.schema
.alterTable("thumbnail")
.alterColumn("encrypted_content_iv", (col) => col.dropNotNull())
.execute();
// upload.ts
await db.schema
.createTable("upload_session")
.addColumn("id", "uuid", (col) => col.primaryKey())
.addColumn("type", "text", (col) => col.notNull())
.addColumn("user_id", "integer", (col) => col.references("user.id").notNull())
.addColumn("path", "text", (col) => col.notNull())
.addColumn("total_chunks", "integer", (col) => col.notNull())
.addColumn("uploaded_chunks", sql`integer[]`, (col) => col.notNull().defaultTo(sql`'{}'`))
.addColumn("expires_at", "timestamp(3)", (col) => col.notNull())
.addColumn("parent_id", "integer", (col) => col.references("directory.id"))
.addColumn("master_encryption_key_version", "integer")
.addColumn("encrypted_data_encryption_key", "text")
.addColumn("data_encryption_key_version", "timestamp(3)")
.addColumn("hmac_secret_key_version", "integer")
.addColumn("content_type", "text")
.addColumn("encrypted_name", "json")
.addColumn("encrypted_created_at", "json")
.addColumn("encrypted_last_modified_at", "json")
.addColumn("file_id", "integer", (col) => col.references("file.id"))
.addForeignKeyConstraint(
"upload_session_fk01",
["user_id", "master_encryption_key_version"],
"master_encryption_key",
["user_id", "version"],
)
.addForeignKeyConstraint(
"upload_session_fk02",
["user_id", "hmac_secret_key_version"],
"hmac_secret_key",
["user_id", "version"],
)
.execute();
};
// eslint-disable-next-line @typescript-eslint/no-explicit-any
export const down = async (db: Kysely<any>) => {
await db.schema.dropTable("upload_session").execute();
await db.schema
.alterTable("thumbnail")
.alterColumn("encrypted_content_iv", (col) => col.setNotNull())
.execute();
await db.schema
.alterTable("file")
.alterColumn("encrypted_content_iv", (col) => col.setNotNull())
.execute();
};


@@ -1,9 +1,11 @@
import * as Initial1737357000 from "./1737357000-Initial";
import * as AddFileCategory1737422340 from "./1737422340-AddFileCategory";
import * as AddThumbnail1738409340 from "./1738409340-AddThumbnail";
import * as AddChunkedUpload1768062380 from "./1768062380-AddChunkedUpload";
export default {
"1737357000-Initial": Initial1737357000,
"1737422340-AddFileCategory": AddFileCategory1737422340,
"1738409340-AddThumbnail": AddThumbnail1738409340,
"1768062380-AddChunkedUpload": AddChunkedUpload1768062380,
};


@@ -1,5 +1,5 @@
import type { Generated } from "kysely";
import type { Ciphertext } from "./util";
import type { Ciphertext } from "./utils";
interface CategoryTable {
id: Generated<number>;


@@ -1,5 +1,5 @@
import type { ColumnType, Generated } from "kysely";
import type { Ciphertext } from "./util";
import type { Ciphertext } from "./utils";
interface DirectoryTable {
id: Generated<number>;
@@ -30,7 +30,7 @@ interface FileTable {
hmac_secret_key_version: number | null;
content_hmac: string | null; // Base64
content_type: string;
encrypted_content_iv: string; // Base64
encrypted_content_iv: string | null; // Base64
encrypted_content_hash: string; // Base64
encrypted_name: Ciphertext;
encrypted_created_at: Ciphertext | null;


@@ -5,8 +5,9 @@ export * from "./hsk";
export * from "./media";
export * from "./mek";
export * from "./session";
export * from "./upload";
export * from "./user";
export * from "./util";
export * from "./utils";
// eslint-disable-next-line @typescript-eslint/no-empty-object-type
export interface Database {}


@@ -7,7 +7,7 @@ interface ThumbnailTable {
category_id: number | null;
path: string;
updated_at: Date;
encrypted_content_iv: string; // Base64
encrypted_content_iv: string | null; // Base64
}
declare module "./index" {


@@ -0,0 +1,32 @@
import type { Generated } from "kysely";
import type { Ciphertext } from "./utils";
interface UploadSessionTable {
id: string;
type: "file" | "thumbnail";
user_id: number;
path: string;
total_chunks: number;
uploaded_chunks: Generated<number[]>;
expires_at: Date;
// For file uploads
parent_id: number | null;
master_encryption_key_version: number | null;
encrypted_data_encryption_key: string | null; // Base64
data_encryption_key_version: Date | null;
hmac_secret_key_version: number | null;
content_type: string | null;
encrypted_name: Ciphertext | null;
encrypted_created_at: Ciphertext | null;
encrypted_last_modified_at: Ciphertext | null;
// For thumbnail uploads
file_id: number | null;
}
declare module "./index" {
interface Database {
upload_session: UploadSessionTable;
}
}

src/lib/server/db/upload.ts (new file, 185 lines)

@@ -0,0 +1,185 @@
import { sql } from "kysely";
import { IntegrityError } from "./error";
import db from "./kysely";
import type { Ciphertext } from "./schema";
interface BaseUploadSession {
id: string;
userId: number;
path: string;
totalChunks: number;
uploadedChunks: number[];
expiresAt: Date;
}
interface FileUploadSession extends BaseUploadSession {
type: "file";
parentId: DirectoryId;
mekVersion: number;
encDek: string;
dekVersion: Date;
hskVersion: number | null;
contentType: string;
encName: Ciphertext;
encCreatedAt: Ciphertext | null;
encLastModifiedAt: Ciphertext;
}
interface ThumbnailUploadSession extends BaseUploadSession {
type: "thumbnail";
fileId: number;
dekVersion: Date;
}
export const createFileUploadSession = async (
params: Omit<FileUploadSession, "type" | "uploadedChunks">,
) => {
await db.transaction().execute(async (trx) => {
const mek = await trx
.selectFrom("master_encryption_key")
.select("version")
.where("user_id", "=", params.userId)
.where("state", "=", "active")
.limit(1)
.forUpdate()
.executeTakeFirst();
if (mek?.version !== params.mekVersion) {
throw new IntegrityError("Inactive MEK version");
}
if (params.hskVersion) {
const hsk = await trx
.selectFrom("hmac_secret_key")
.select("version")
.where("user_id", "=", params.userId)
.where("state", "=", "active")
.limit(1)
.forUpdate()
.executeTakeFirst();
if (hsk?.version !== params.hskVersion) {
throw new IntegrityError("Inactive HSK version");
}
}
await trx
.insertInto("upload_session")
.values({
id: params.id,
type: "file",
user_id: params.userId,
path: params.path,
total_chunks: params.totalChunks,
expires_at: params.expiresAt,
parent_id: params.parentId !== "root" ? params.parentId : null,
master_encryption_key_version: params.mekVersion,
encrypted_data_encryption_key: params.encDek,
data_encryption_key_version: params.dekVersion,
hmac_secret_key_version: params.hskVersion,
content_type: params.contentType,
encrypted_name: params.encName,
encrypted_created_at: params.encCreatedAt,
encrypted_last_modified_at: params.encLastModifiedAt,
})
.execute();
});
};
export const createThumbnailUploadSession = async (
params: Omit<ThumbnailUploadSession, "type" | "uploadedChunks">,
) => {
await db.transaction().execute(async (trx) => {
const file = await trx
.selectFrom("file")
.select("data_encryption_key_version")
.where("id", "=", params.fileId)
.where("user_id", "=", params.userId)
.limit(1)
.forUpdate()
.executeTakeFirst();
if (!file) {
throw new IntegrityError("File not found");
} else if (file.data_encryption_key_version.getTime() !== params.dekVersion.getTime()) {
throw new IntegrityError("Invalid DEK version");
}
await trx
.insertInto("upload_session")
.values({
id: params.id,
type: "thumbnail",
user_id: params.userId,
path: params.path,
total_chunks: params.totalChunks,
expires_at: params.expiresAt,
file_id: params.fileId,
data_encryption_key_version: params.dekVersion,
})
.execute();
});
};
export const getUploadSession = async (sessionId: string, userId: number) => {
const session = await db
.selectFrom("upload_session")
.selectAll()
.where("id", "=", sessionId)
.where("user_id", "=", userId)
.where("expires_at", ">", new Date())
.limit(1)
.executeTakeFirst();
if (!session) {
return null;
} else if (session.type === "file") {
return {
type: "file",
id: session.id,
userId: session.user_id,
path: session.path,
totalChunks: session.total_chunks,
uploadedChunks: session.uploaded_chunks,
expiresAt: session.expires_at,
parentId: session.parent_id ?? "root",
mekVersion: session.master_encryption_key_version!,
encDek: session.encrypted_data_encryption_key!,
dekVersion: session.data_encryption_key_version!,
hskVersion: session.hmac_secret_key_version,
contentType: session.content_type!,
encName: session.encrypted_name!,
encCreatedAt: session.encrypted_created_at,
encLastModifiedAt: session.encrypted_last_modified_at!,
} satisfies FileUploadSession;
} else {
return {
type: "thumbnail",
id: session.id,
userId: session.user_id,
path: session.path,
totalChunks: session.total_chunks,
uploadedChunks: session.uploaded_chunks,
expiresAt: session.expires_at,
fileId: session.file_id!,
dekVersion: session.data_encryption_key_version!,
} satisfies ThumbnailUploadSession;
}
};
export const markChunkAsUploaded = async (sessionId: string, chunkIndex: number) => {
await db
.updateTable("upload_session")
.set({ uploaded_chunks: sql`array_append(uploaded_chunks, ${chunkIndex})` })
.where("id", "=", sessionId)
.execute();
};
export const deleteUploadSession = async (trx: typeof db, sessionId: string) => {
await trx.deleteFrom("upload_session").where("id", "=", sessionId).execute();
};
export const cleanupExpiredUploadSessions = async () => {
const sessions = await db
.deleteFrom("upload_session")
.where("expires_at", "<=", new Date())
.returning("path")
.execute();
return sessions.map(({ path }) => path);
};
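The session rows above track `total_chunks` against the `uploaded_chunks` array, which implies the client splits each file into fixed-size chunks before starting a session. A minimal sketch of that split, where `CHUNK_SIZE`, the helper name, and the 4 MiB size are assumptions for illustration and not part of this repository:

```typescript
// Hypothetical client-side chunking helper; the 4 MiB chunk size is an
// assumption, not taken from the repository.
const CHUNK_SIZE = 4 * 1024 * 1024;

const splitIntoChunks = (file: Blob): Blob[] => {
  const chunks: Blob[] = [];
  // Blob.slice is zero-copy, so this does not load the file into memory.
  for (let offset = 0; offset < file.size; offset += CHUNK_SIZE) {
    chunks.push(file.slice(offset, offset + CHUNK_SIZE));
  }
  return chunks;
};

// A 9 MiB blob yields 3 chunks: 4 MiB + 4 MiB + 1 MiB.
const blob = new Blob([new Uint8Array(9 * 1024 * 1024)]);
console.log(splitIntoChunks(blob).length); // → 3
```

`chunks.length` would become the session's `total_chunks`, and each successful chunk upload would call `markChunkAsUploaded` with its index.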


@@ -27,10 +27,6 @@ export const getUserByEmail = async (email: string) => {
return user ? (user satisfies User) : null;
};
export const setUserNickname = async (userId: number, nickname: string) => {
await db.updateTable("user").set({ nickname }).where("id", "=", userId).execute();
};
export const setUserPassword = async (userId: number, password: string) => {
await db.updateTable("user").set({ password }).where("id", "=", userId).execute();
};


@@ -26,4 +26,5 @@ export default {
},
libraryPath: env.LIBRARY_PATH || "library",
thumbnailsPath: env.THUMBNAILS_PATH || "thumbnails",
uploadsPath: env.UPLOADS_PATH || "uploads",
};


@@ -1,13 +1,7 @@
import { error, redirect, type Handle } from "@sveltejs/kit";
import env from "$lib/server/loadenv";
import { authenticate, AuthenticationError } from "$lib/server/modules/auth";
import { cookieOptions, authenticate, AuthenticationError } from "$lib/server/modules/auth";
export const authenticateMiddleware: Handle = async ({ event, resolve }) => {
const { pathname, search } = event.url;
if (pathname === "/api/auth/login") {
return await resolve(event);
}
try {
const sessionIdSigned = event.cookies.get("sessionId");
if (!sessionIdSigned) {
@@ -16,15 +10,11 @@ export const authenticateMiddleware: Handle = async ({ event, resolve }) => {
const { ip, userAgent } = event.locals;
event.locals.session = await authenticate(sessionIdSigned, ip, userAgent);
event.cookies.set("sessionId", sessionIdSigned, {
path: "/",
maxAge: env.session.exp / 1000,
secure: true,
sameSite: "strict",
});
event.cookies.set("sessionId", sessionIdSigned, cookieOptions);
} catch (e) {
if (e instanceof AuthenticationError) {
if (pathname === "/auth/login") {
const { pathname, search } = event.url;
if (pathname === "/auth/login" || pathname.startsWith("/api/trpc")) {
return await resolve(event);
} else if (pathname.startsWith("/api")) {
error(e.status, e.message);


@@ -1,20 +1,25 @@
import { error } from "@sveltejs/kit";
import { getUserClient } from "$lib/server/db/client";
import { IntegrityError } from "$lib/server/db/error";
import { createSession, refreshSession } from "$lib/server/db/session";
import { ClientRepo, SessionRepo, IntegrityError } from "$lib/server/db";
import env from "$lib/server/loadenv";
import { issueSessionId, verifySessionId } from "$lib/server/modules/crypto";
import { verifySessionId } from "$lib/server/modules/crypto";
interface Session {
export interface Session {
sessionId: string;
userId: number;
clientId?: number;
}
interface ClientSession extends Session {
export interface ClientSession extends Session {
clientId: number;
}
export type SessionPermission =
| "any"
| "notClient"
| "anyClient"
| "pendingClient"
| "activeClient";
export class AuthenticationError extends Error {
constructor(
public status: 400 | 401,
@@ -25,11 +30,22 @@ export class AuthenticationError extends Error {
}
}
export const startSession = async (userId: number, ip: string, userAgent: string) => {
const { sessionId, sessionIdSigned } = await issueSessionId(32, env.session.secret);
await createSession(userId, sessionId, ip, userAgent);
return sessionIdSigned;
};
export class AuthorizationError extends Error {
constructor(
public status: 403 | 500,
message: string,
) {
super(message);
this.name = "AuthorizationError";
}
}
export const cookieOptions = {
path: "/",
maxAge: env.session.exp / 1000,
secure: true,
sameSite: "strict",
} as const;
export const authenticate = async (sessionIdSigned: string, ip: string, userAgent: string) => {
const sessionId = verifySessionId(sessionIdSigned, env.session.secret);
@@ -38,7 +54,7 @@ export const authenticate = async (sessionIdSigned: string, ip: string, userAgen
}
try {
const { userId, clientId } = await refreshSession(sessionId, ip, userAgent);
const { userId, clientId } = await SessionRepo.refreshSession(sessionId, ip, userAgent);
return {
id: sessionId,
userId,
@@ -52,34 +68,12 @@ export const authenticate = async (sessionIdSigned: string, ip: string, userAgen
}
};
export const authorizeInternal = async (
locals: App.Locals,
requiredPermission: SessionPermission,
): Promise<Session> => {
if (!locals.session) {
throw new AuthorizationError(500, "Unauthenticated");
}
const { id: sessionId, userId, clientId } = locals.session;
@@ -89,39 +83,63 @@ export async function authorize(
break;
case "notClient":
if (clientId) {
throw new AuthorizationError(403, "Forbidden");
}
break;
case "anyClient":
if (!clientId) {
throw new AuthorizationError(403, "Forbidden");
}
break;
case "pendingClient": {
if (!clientId) {
throw new AuthorizationError(403, "Forbidden");
}
const userClient = await ClientRepo.getUserClient(userId, clientId);
if (!userClient) {
throw new AuthorizationError(500, "Invalid session id");
} else if (userClient.state !== "pending") {
throw new AuthorizationError(403, "Forbidden");
}
break;
}
case "activeClient": {
if (!clientId) {
throw new AuthorizationError(403, "Forbidden");
}
const userClient = await ClientRepo.getUserClient(userId, clientId);
if (!userClient) {
throw new AuthorizationError(500, "Invalid session id");
} else if (userClient.state !== "active") {
throw new AuthorizationError(403, "Forbidden");
}
break;
}
}
return { sessionId, userId, clientId };
};
export async function authorize(
locals: App.Locals,
requiredPermission: "any" | "notClient",
): Promise<Session>;
export async function authorize(
locals: App.Locals,
requiredPermission: "anyClient" | "pendingClient" | "activeClient",
): Promise<ClientSession>;
export async function authorize(
locals: App.Locals,
requiredPermission: SessionPermission,
): Promise<Session> {
try {
return await authorizeInternal(locals, requiredPermission);
} catch (e) {
if (e instanceof AuthorizationError) {
error(e.status, e.message);
}
throw e;
}
}
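The refactor above splits authorization into a framework-agnostic core that throws a typed `AuthorizationError` and a thin `authorize` wrapper that translates it into SvelteKit's `error()` helper. A minimal standalone sketch of that pattern, with simplified stand-in types and no SvelteKit dependency:

```typescript
// Sketch of the throw-then-translate pattern used by authorizeInternal/authorize.
// Types and the boundary translation are simplified stand-ins, not the real ones.
class AuthorizationError extends Error {
  constructor(
    public status: 403 | 500,
    message: string,
  ) {
    super(message);
    this.name = "AuthorizationError";
  }
}

// Framework-agnostic core: throws typed errors, easy to unit-test in isolation.
const requireClient = (clientId: number | null): number => {
  if (!clientId) throw new AuthorizationError(403, "Forbidden");
  return clientId;
};

// Boundary wrapper: catches the typed error and maps it to a response shape
// (the real code calls SvelteKit's error() here instead).
const authorizeClient = (clientId: number | null) => {
  try {
    return { ok: true as const, clientId: requireClient(clientId) };
  } catch (e) {
    if (e instanceof AuthorizationError) {
      return { ok: false as const, status: e.status, message: e.message };
    }
    throw e;
  }
};
```

Keeping the core free of `error()` also makes it callable from contexts (like streaming handlers) where throwing SvelteKit's HTTP error is not appropriate.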

View File

@@ -0,0 +1,13 @@
import { rm, unlink } from "fs/promises";
export const safeRecursiveRm = async (path: string | null | undefined) => {
if (path) {
await rm(path, { recursive: true }).catch(console.error);
}
};
export const safeUnlink = async (path: string | null | undefined) => {
if (path) {
await unlink(path).catch(console.error);
}
};
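A short usage sketch of the helpers above, re-stated inline so it is self-contained. The behavioral point is that deletion failures are logged, never thrown, and nullish paths are ignored:

```typescript
import { mkdtemp, writeFile, unlink, rm, access } from "fs/promises";
import { tmpdir } from "os";
import { join } from "path";

// Inline restatement of safeUnlink from the file above.
const safeUnlink = async (path: string | null | undefined) => {
  if (path) {
    await unlink(path).catch(console.error);
  }
};

const main = async () => {
  const dir = await mkdtemp(join(tmpdir(), "fsutils-"));
  const file = join(dir, "a.txt");
  await writeFile(file, "x");
  await safeUnlink(file); // removes the file
  await safeUnlink(file); // second call logs ENOENT but does not throw
  await safeUnlink(null); // nullish paths are ignored
  const gone = await access(file).then(() => false, () => true);
  await rm(dir, { recursive: true });
  return gone;
};
```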

View File

@@ -1,25 +0,0 @@
import { error } from "@sveltejs/kit";
import { getUserClientWithDetails } from "$lib/server/db/client";
import { getInitialMek } from "$lib/server/db/mek";
import { verifySignature } from "$lib/server/modules/crypto";
export const isInitialMekNeeded = async (userId: number) => {
const initialMek = await getInitialMek(userId);
return !initialMek;
};
export const verifyClientEncMekSig = async (
userId: number,
clientId: number,
version: number,
encMek: string,
encMekSig: string,
) => {
const userClient = await getUserClientWithDetails(userId, clientId);
if (!userClient) {
error(500, "Invalid session id");
}
const data = JSON.stringify({ version, key: encMek });
return verifySignature(Buffer.from(data), encMekSig, userClient.sigPubKey);
};

View File

@@ -1,32 +0,0 @@
import { z } from "zod";
export const passwordChangeRequest = z.object({
oldPassword: z.string().trim().nonempty(),
newPassword: z.string().trim().nonempty(),
});
export type PasswordChangeRequest = z.input<typeof passwordChangeRequest>;
export const loginRequest = z.object({
email: z.string().email(),
password: z.string().trim().nonempty(),
});
export type LoginRequest = z.input<typeof loginRequest>;
export const sessionUpgradeRequest = z.object({
encPubKey: z.string().base64().nonempty(),
sigPubKey: z.string().base64().nonempty(),
});
export type SessionUpgradeRequest = z.input<typeof sessionUpgradeRequest>;
export const sessionUpgradeResponse = z.object({
id: z.number().int().positive(),
challenge: z.string().base64().nonempty(),
});
export type SessionUpgradeResponse = z.output<typeof sessionUpgradeResponse>;
export const sessionUpgradeVerifyRequest = z.object({
id: z.number().int().positive(),
answerSig: z.string().base64().nonempty(),
force: z.boolean().default(false),
});
export type SessionUpgradeVerifyRequest = z.input<typeof sessionUpgradeVerifyRequest>;

View File

@@ -1,55 +0,0 @@
import { z } from "zod";
export const categoryIdSchema = z.union([z.literal("root"), z.number().int().positive()]);
export const categoryInfoResponse = z.object({
metadata: z
.object({
parent: categoryIdSchema,
mekVersion: z.number().int().positive(),
dek: z.string().base64().nonempty(),
dekVersion: z.string().datetime(),
name: z.string().base64().nonempty(),
nameIv: z.string().base64().nonempty(),
})
.optional(),
subCategories: z.number().int().positive().array(),
});
export type CategoryInfoResponse = z.output<typeof categoryInfoResponse>;
export const categoryFileAddRequest = z.object({
file: z.number().int().positive(),
});
export type CategoryFileAddRequest = z.input<typeof categoryFileAddRequest>;
export const categoryFileListResponse = z.object({
files: z.array(
z.object({
file: z.number().int().positive(),
isRecursive: z.boolean(),
}),
),
});
export type CategoryFileListResponse = z.output<typeof categoryFileListResponse>;
export const categoryFileRemoveRequest = z.object({
file: z.number().int().positive(),
});
export type CategoryFileRemoveRequest = z.input<typeof categoryFileRemoveRequest>;
export const categoryRenameRequest = z.object({
dekVersion: z.string().datetime(),
name: z.string().base64().nonempty(),
nameIv: z.string().base64().nonempty(),
});
export type CategoryRenameRequest = z.input<typeof categoryRenameRequest>;
export const categoryCreateRequest = z.object({
parent: categoryIdSchema,
mekVersion: z.number().int().positive(),
dek: z.string().base64().nonempty(),
dekVersion: z.string().datetime(),
name: z.string().base64().nonempty(),
nameIv: z.string().base64().nonempty(),
});
export type CategoryCreateRequest = z.input<typeof categoryCreateRequest>;

View File

@@ -1,36 +0,0 @@
import { z } from "zod";
export const clientListResponse = z.object({
clients: z.array(
z.object({
id: z.number().int().positive(),
state: z.enum(["pending", "active"]),
}),
),
});
export type ClientListResponse = z.output<typeof clientListResponse>;
export const clientRegisterRequest = z.object({
encPubKey: z.string().base64().nonempty(),
sigPubKey: z.string().base64().nonempty(),
});
export type ClientRegisterRequest = z.input<typeof clientRegisterRequest>;
export const clientRegisterResponse = z.object({
id: z.number().int().positive(),
challenge: z.string().base64().nonempty(),
});
export type ClientRegisterResponse = z.output<typeof clientRegisterResponse>;
export const clientRegisterVerifyRequest = z.object({
id: z.number().int().positive(),
answerSig: z.string().base64().nonempty(),
});
export type ClientRegisterVerifyRequest = z.input<typeof clientRegisterVerifyRequest>;
export const clientStatusResponse = z.object({
id: z.number().int().positive(),
state: z.enum(["pending", "active"]),
isInitialMekNeeded: z.boolean(),
});
export type ClientStatusResponse = z.output<typeof clientStatusResponse>;

View File

@@ -1,41 +0,0 @@
import { z } from "zod";
export const directoryIdSchema = z.union([z.literal("root"), z.number().int().positive()]);
export const directoryInfoResponse = z.object({
metadata: z
.object({
parent: directoryIdSchema,
mekVersion: z.number().int().positive(),
dek: z.string().base64().nonempty(),
dekVersion: z.string().datetime(),
name: z.string().base64().nonempty(),
nameIv: z.string().base64().nonempty(),
})
.optional(),
subDirectories: z.number().int().positive().array(),
files: z.number().int().positive().array(),
});
export type DirectoryInfoResponse = z.output<typeof directoryInfoResponse>;
export const directoryDeleteResponse = z.object({
deletedFiles: z.number().int().positive().array(),
});
export type DirectoryDeleteResponse = z.output<typeof directoryDeleteResponse>;
export const directoryRenameRequest = z.object({
dekVersion: z.string().datetime(),
name: z.string().base64().nonempty(),
nameIv: z.string().base64().nonempty(),
});
export type DirectoryRenameRequest = z.input<typeof directoryRenameRequest>;
export const directoryCreateRequest = z.object({
parent: directoryIdSchema,
mekVersion: z.number().int().positive(),
dek: z.string().base64().nonempty(),
dekVersion: z.string().datetime(),
name: z.string().base64().nonempty(),
nameIv: z.string().base64().nonempty(),
});
export type DirectoryCreateRequest = z.input<typeof directoryCreateRequest>;

View File

@@ -1,91 +0,0 @@
import mime from "mime";
import { z } from "zod";
import { directoryIdSchema } from "./directory";
export const fileInfoResponse = z.object({
parent: directoryIdSchema,
mekVersion: z.number().int().positive(),
dek: z.string().base64().nonempty(),
dekVersion: z.string().datetime(),
contentType: z
.string()
.trim()
.nonempty()
.refine((value) => mime.getExtension(value) !== null), // MIME type
contentIv: z.string().base64().nonempty(),
name: z.string().base64().nonempty(),
nameIv: z.string().base64().nonempty(),
createdAt: z.string().base64().nonempty().optional(),
createdAtIv: z.string().base64().nonempty().optional(),
lastModifiedAt: z.string().base64().nonempty(),
lastModifiedAtIv: z.string().base64().nonempty(),
categories: z.number().int().positive().array(),
});
export type FileInfoResponse = z.output<typeof fileInfoResponse>;
export const fileRenameRequest = z.object({
dekVersion: z.string().datetime(),
name: z.string().base64().nonempty(),
nameIv: z.string().base64().nonempty(),
});
export type FileRenameRequest = z.input<typeof fileRenameRequest>;
export const fileThumbnailInfoResponse = z.object({
updatedAt: z.string().datetime(),
contentIv: z.string().base64().nonempty(),
});
export type FileThumbnailInfoResponse = z.output<typeof fileThumbnailInfoResponse>;
export const fileThumbnailUploadRequest = z.object({
dekVersion: z.string().datetime(),
contentIv: z.string().base64().nonempty(),
});
export type FileThumbnailUploadRequest = z.input<typeof fileThumbnailUploadRequest>;
export const fileListResponse = z.object({
files: z.number().int().positive().array(),
});
export type FileListResponse = z.output<typeof fileListResponse>;
export const duplicateFileScanRequest = z.object({
hskVersion: z.number().int().positive(),
contentHmac: z.string().base64().nonempty(),
});
export type DuplicateFileScanRequest = z.input<typeof duplicateFileScanRequest>;
export const duplicateFileScanResponse = z.object({
files: z.number().int().positive().array(),
});
export type DuplicateFileScanResponse = z.output<typeof duplicateFileScanResponse>;
export const missingThumbnailFileScanResponse = z.object({
files: z.number().int().positive().array(),
});
export type MissingThumbnailFileScanResponse = z.output<typeof missingThumbnailFileScanResponse>;
export const fileUploadRequest = z.object({
parent: directoryIdSchema,
mekVersion: z.number().int().positive(),
dek: z.string().base64().nonempty(),
dekVersion: z.string().datetime(),
hskVersion: z.number().int().positive(),
contentHmac: z.string().base64().nonempty(),
contentType: z
.string()
.trim()
.nonempty()
.refine((value) => mime.getExtension(value) !== null), // MIME type
contentIv: z.string().base64().nonempty(),
name: z.string().base64().nonempty(),
nameIv: z.string().base64().nonempty(),
createdAt: z.string().base64().nonempty().optional(),
createdAtIv: z.string().base64().nonempty().optional(),
lastModifiedAt: z.string().base64().nonempty(),
lastModifiedAtIv: z.string().base64().nonempty(),
});
export type FileUploadRequest = z.input<typeof fileUploadRequest>;
export const fileUploadResponse = z.object({
file: z.number().int().positive(),
});
export type FileUploadResponse = z.output<typeof fileUploadResponse>;

View File

@@ -1,19 +0,0 @@
import { z } from "zod";
export const hmacSecretListResponse = z.object({
hsks: z.array(
z.object({
version: z.number().int().positive(),
state: z.enum(["active"]),
mekVersion: z.number().int().positive(),
hsk: z.string().base64().nonempty(),
}),
),
});
export type HmacSecretListResponse = z.output<typeof hmacSecretListResponse>;
export const initialHmacSecretRegisterRequest = z.object({
mekVersion: z.number().int().positive(),
hsk: z.string().base64().nonempty(),
});
export type InitialHmacSecretRegisterRequest = z.input<typeof initialHmacSecretRegisterRequest>;

View File

@@ -1,8 +0,0 @@
export * from "./auth";
export * from "./category";
export * from "./client";
export * from "./directory";
export * from "./file";
export * from "./hsk";
export * from "./mek";
export * from "./user";

View File

@@ -1,19 +0,0 @@
import { z } from "zod";
export const masterKeyListResponse = z.object({
meks: z.array(
z.object({
version: z.number().int().positive(),
state: z.enum(["active", "retired"]),
mek: z.string().base64().nonempty(),
mekSig: z.string().base64().nonempty(),
}),
),
});
export type MasterKeyListResponse = z.output<typeof masterKeyListResponse>;
export const initialMasterKeyRegisterRequest = z.object({
mek: z.string().base64().nonempty(),
mekSig: z.string().base64().nonempty(),
});
export type InitialMasterKeyRegisterRequest = z.input<typeof initialMasterKeyRegisterRequest>;

View File

@@ -1,12 +0,0 @@
import { z } from "zod";
export const userInfoResponse = z.object({
email: z.string().email(),
nickname: z.string().nonempty(),
});
export type UserInfoResponse = z.output<typeof userInfoResponse>;
export const nicknameChangeRequest = z.object({
newNickname: z.string().trim().min(2).max(8),
});
export type NicknameChangeRequest = z.input<typeof nicknameChangeRequest>;

View File

@@ -1,122 +0,0 @@
import { error } from "@sveltejs/kit";
import argon2 from "argon2";
import { getClient, getClientByPubKeys, getUserClient } from "$lib/server/db/client";
import { IntegrityError } from "$lib/server/db/error";
import {
upgradeSession,
deleteSession,
deleteAllOtherSessions,
registerSessionUpgradeChallenge,
consumeSessionUpgradeChallenge,
} from "$lib/server/db/session";
import { getUser, getUserByEmail, setUserPassword } from "$lib/server/db/user";
import env from "$lib/server/loadenv";
import { startSession } from "$lib/server/modules/auth";
import { verifySignature, generateChallenge } from "$lib/server/modules/crypto";
const hashPassword = async (password: string) => {
return await argon2.hash(password);
};
const verifyPassword = async (hash: string, password: string) => {
return await argon2.verify(hash, password);
};
export const changePassword = async (
userId: number,
sessionId: string,
oldPassword: string,
newPassword: string,
) => {
if (oldPassword === newPassword) {
error(400, "Same passwords");
} else if (newPassword.length < 8) {
error(400, "Too short password");
}
const user = await getUser(userId);
if (!user) {
error(500, "Invalid session id");
} else if (!(await verifyPassword(user.password, oldPassword))) {
error(403, "Invalid password");
}
await setUserPassword(userId, await hashPassword(newPassword));
await deleteAllOtherSessions(userId, sessionId);
};
export const login = async (email: string, password: string, ip: string, userAgent: string) => {
const user = await getUserByEmail(email);
if (!user || !(await verifyPassword(user.password, password))) {
error(401, "Invalid email or password");
}
return { sessionIdSigned: await startSession(user.id, ip, userAgent) };
};
export const logout = async (sessionId: string) => {
await deleteSession(sessionId);
};
export const createSessionUpgradeChallenge = async (
sessionId: string,
userId: number,
ip: string,
encPubKey: string,
sigPubKey: string,
) => {
const client = await getClientByPubKeys(encPubKey, sigPubKey);
const userClient = client ? await getUserClient(userId, client.id) : undefined;
if (!client) {
error(401, "Invalid public key(s)");
} else if (!userClient || userClient.state === "challenging") {
error(403, "Unregistered client");
}
const { answer, challenge } = await generateChallenge(32, encPubKey);
const { id } = await registerSessionUpgradeChallenge(
sessionId,
client.id,
answer.toString("base64"),
ip,
new Date(Date.now() + env.challenge.sessionUpgradeExp),
);
return { id, challenge: challenge.toString("base64") };
};
export const verifySessionUpgradeChallenge = async (
sessionId: string,
userId: number,
ip: string,
challengeId: number,
answerSig: string,
force: boolean,
) => {
const challenge = await consumeSessionUpgradeChallenge(challengeId, sessionId, ip);
if (!challenge) {
error(403, "Invalid challenge answer");
}
const client = await getClient(challenge.clientId);
if (!client) {
error(500, "Invalid challenge answer");
} else if (
!verifySignature(Buffer.from(challenge.answer, "base64"), answerSig, client.sigPubKey)
) {
error(403, "Invalid challenge answer signature");
}
try {
await upgradeSession(userId, sessionId, client.id, force);
} catch (e) {
if (e instanceof IntegrityError) {
if (e.message === "Session not found") {
error(500, "Invalid challenge answer");
} else if (!force && e.message === "Session already exists") {
error(409, "Already logged in");
}
}
throw e;
}
};
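The `hashPassword`/`verifyPassword` helpers above wrap the third-party argon2 library. As a dependency-free sketch of the same hash/verify split, here is an equivalent using Node's built-in scrypt; the `salt:hash` storage format is an assumption for illustration, not the project's format:

```typescript
import { randomBytes, scryptSync, timingSafeEqual } from "crypto";

// Sketch only: the real service uses argon2.hash/argon2.verify, which embed
// salt and parameters in the stored string. Here we store "saltHex:hashHex".
const hashPassword = (password: string): string => {
  const salt = randomBytes(16);
  const hash = scryptSync(password, salt, 32);
  return `${salt.toString("hex")}:${hash.toString("hex")}`;
};

const verifyPassword = (stored: string, password: string): boolean => {
  const [saltHex, hashHex] = stored.split(":");
  const candidate = scryptSync(password, Buffer.from(saltHex, "hex"), 32);
  // Constant-time comparison to avoid timing leaks.
  return timingSafeEqual(candidate, Buffer.from(hashHex, "hex"));
};
```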

View File

@@ -1,133 +0,0 @@
import { error } from "@sveltejs/kit";
import {
registerCategory,
getAllCategoriesByParent,
getCategory,
setCategoryEncName,
unregisterCategory,
type CategoryId,
type NewCategory,
} from "$lib/server/db/category";
import { IntegrityError } from "$lib/server/db/error";
import {
getAllFilesByCategory,
getFile,
addFileToCategory,
removeFileFromCategory,
} from "$lib/server/db/file";
import type { Ciphertext } from "$lib/server/db/schema";
export const getCategoryInformation = async (userId: number, categoryId: CategoryId) => {
const category = categoryId !== "root" ? await getCategory(userId, categoryId) : undefined;
if (category === null) {
error(404, "Invalid category id");
}
const categories = await getAllCategoriesByParent(userId, categoryId);
return {
metadata: category && {
parentId: category.parentId ?? ("root" as const),
mekVersion: category.mekVersion,
encDek: category.encDek,
dekVersion: category.dekVersion,
encName: category.encName,
},
categories: categories.map(({ id }) => id),
};
};
export const deleteCategory = async (userId: number, categoryId: number) => {
try {
await unregisterCategory(userId, categoryId);
} catch (e) {
if (e instanceof IntegrityError && e.message === "Category not found") {
error(404, "Invalid category id");
}
throw e;
}
};
export const addCategoryFile = async (userId: number, categoryId: number, fileId: number) => {
const category = await getCategory(userId, categoryId);
const file = await getFile(userId, fileId);
if (!category) {
error(404, "Invalid category id");
} else if (!file) {
error(404, "Invalid file id");
}
try {
await addFileToCategory(fileId, categoryId);
} catch (e) {
if (e instanceof IntegrityError && e.message === "File already added to category") {
error(400, "File already added");
}
throw e;
}
};
export const getCategoryFiles = async (userId: number, categoryId: number, recurse: boolean) => {
const category = await getCategory(userId, categoryId);
if (!category) {
error(404, "Invalid category id");
}
const files = await getAllFilesByCategory(userId, categoryId, recurse);
return { files };
};
export const removeCategoryFile = async (userId: number, categoryId: number, fileId: number) => {
const category = await getCategory(userId, categoryId);
const file = await getFile(userId, fileId);
if (!category) {
error(404, "Invalid category id");
} else if (!file) {
error(404, "Invalid file id");
}
try {
await removeFileFromCategory(fileId, categoryId);
} catch (e) {
if (e instanceof IntegrityError && e.message === "File not found in category") {
error(400, "File not added");
}
throw e;
}
};
export const renameCategory = async (
userId: number,
categoryId: number,
dekVersion: Date,
newEncName: Ciphertext,
) => {
try {
await setCategoryEncName(userId, categoryId, dekVersion, newEncName);
} catch (e) {
if (e instanceof IntegrityError) {
if (e.message === "Category not found") {
error(404, "Invalid category id");
} else if (e.message === "Invalid DEK version") {
error(400, "Invalid DEK version");
}
}
throw e;
}
};
export const createCategory = async (params: NewCategory) => {
const oneMinuteAgo = new Date(Date.now() - 60 * 1000);
const oneMinuteLater = new Date(Date.now() + 60 * 1000);
if (params.dekVersion <= oneMinuteAgo || params.dekVersion >= oneMinuteLater) {
error(400, "Invalid DEK version");
}
try {
await registerCategory(params);
} catch (e) {
if (e instanceof IntegrityError && e.message === "Inactive MEK version") {
error(400, "Inactive MEK version");
}
throw e;
}
};
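`createCategory` (and `createDirectory` below) only accepts a client-supplied `dekVersion` that lies strictly within one minute of server time. The freshness check can be isolated as a pure function:

```typescript
// Pure sketch of the dekVersion freshness window used above: timestamps are
// accepted only strictly inside (now - windowMs, now + windowMs).
const isFreshDekVersion = (dekVersion: Date, now: Date, windowMs = 60 * 1000): boolean => {
  const earliest = now.getTime() - windowMs; // rejected if <= this
  const latest = now.getTime() + windowMs;   // rejected if >= this
  return dekVersion.getTime() > earliest && dekVersion.getTime() < latest;
};
```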

View File

@@ -1,116 +0,0 @@
import { error } from "@sveltejs/kit";
import {
createClient,
getClient,
getClientByPubKeys,
createUserClient,
getAllUserClients,
getUserClient,
setUserClientStateToPending,
registerUserClientChallenge,
consumeUserClientChallenge,
} from "$lib/server/db/client";
import { IntegrityError } from "$lib/server/db/error";
import { verifyPubKey, verifySignature, generateChallenge } from "$lib/server/modules/crypto";
import { isInitialMekNeeded } from "$lib/server/modules/mek";
import env from "$lib/server/loadenv";
export const getUserClientList = async (userId: number) => {
const userClients = await getAllUserClients(userId);
return {
userClients: userClients.map(({ clientId, state }) => ({
id: clientId,
state: state as "pending" | "active",
})),
};
};
const expiresAt = () => new Date(Date.now() + env.challenge.userClientExp);
const createUserClientChallenge = async (
ip: string,
userId: number,
clientId: number,
encPubKey: string,
) => {
const { answer, challenge } = await generateChallenge(32, encPubKey);
const { id } = await registerUserClientChallenge(
userId,
clientId,
answer.toString("base64"),
ip,
expiresAt(),
);
return { id, challenge: challenge.toString("base64") };
};
export const registerUserClient = async (
userId: number,
ip: string,
encPubKey: string,
sigPubKey: string,
) => {
const client = await getClientByPubKeys(encPubKey, sigPubKey);
if (client) {
try {
await createUserClient(userId, client.id);
return await createUserClientChallenge(ip, userId, client.id, encPubKey);
} catch (e) {
if (e instanceof IntegrityError && e.message === "User client already exists") {
error(409, "Client already registered");
}
throw e;
}
} else {
if (encPubKey === sigPubKey) {
error(400, "Same public keys");
} else if (!verifyPubKey(encPubKey) || !verifyPubKey(sigPubKey)) {
error(400, "Invalid public key(s)");
}
try {
const { id: clientId } = await createClient(encPubKey, sigPubKey, userId);
return await createUserClientChallenge(ip, userId, clientId, encPubKey);
} catch (e) {
if (e instanceof IntegrityError && e.message === "Public key(s) already registered") {
error(409, "Public key(s) already used");
}
throw e;
}
}
};
export const verifyUserClient = async (
userId: number,
ip: string,
challengeId: number,
answerSig: string,
) => {
const challenge = await consumeUserClientChallenge(challengeId, userId, ip);
if (!challenge) {
error(403, "Invalid challenge answer");
}
const client = await getClient(challenge.clientId);
if (!client) {
error(500, "Invalid challenge answer");
} else if (
!verifySignature(Buffer.from(challenge.answer, "base64"), answerSig, client.sigPubKey)
) {
error(403, "Invalid challenge answer signature");
}
await setUserClientStateToPending(userId, client.id);
};
export const getUserClientStatus = async (userId: number, clientId: number) => {
const userClient = await getUserClient(userId, clientId);
if (!userClient) {
error(500, "Invalid session id");
}
return {
state: userClient.state as "pending" | "active",
isInitialMekNeeded: await isInitialMekNeeded(userId),
};
};
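The challenge flow above stores a random answer server-side, sends the client a challenge (encrypted to the client's `encPubKey` by `generateChallenge`), and later verifies the client's signature over the answer. A reduced sketch of just the sign/verify step, assuming Ed25519 keys; the real `generateChallenge`/`verifySignature` helpers are not shown in this diff, and the encryption step is omitted here:

```typescript
import { generateKeyPairSync, randomBytes, sign, verify } from "crypto";

// Assumption: Ed25519 signing keys, standing in for the client's sigPubKey pair.
const { publicKey, privateKey } = generateKeyPairSync("ed25519");

const answer = randomBytes(32);                    // server-side secret answer
const answerSig = sign(null, answer, privateKey);  // client signs the answer
const ok = verify(null, answer, publicKey, answerSig); // server checks the signature
```

Only a client holding the registered signing key can produce a valid `answerSig`, which is what `setUserClientStateToPending` relies on before promoting the client.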

View File

@@ -1,96 +0,0 @@
import { error } from "@sveltejs/kit";
import { unlink } from "fs/promises";
import { IntegrityError } from "$lib/server/db/error";
import {
registerDirectory,
getAllDirectoriesByParent,
getDirectory,
setDirectoryEncName,
unregisterDirectory,
getAllFilesByParent,
type DirectoryId,
type NewDirectory,
} from "$lib/server/db/file";
import type { Ciphertext } from "$lib/server/db/schema";
export const getDirectoryInformation = async (userId: number, directoryId: DirectoryId) => {
const directory = directoryId !== "root" ? await getDirectory(userId, directoryId) : undefined;
if (directory === null) {
error(404, "Invalid directory id");
}
const directories = await getAllDirectoriesByParent(userId, directoryId);
const files = await getAllFilesByParent(userId, directoryId);
return {
metadata: directory && {
parentId: directory.parentId ?? ("root" as const),
mekVersion: directory.mekVersion,
encDek: directory.encDek,
dekVersion: directory.dekVersion,
encName: directory.encName,
},
directories: directories.map(({ id }) => id),
files: files.map(({ id }) => id),
};
};
const safeUnlink = async (path: string | null) => {
if (path) {
await unlink(path).catch(console.error);
}
};
export const deleteDirectory = async (userId: number, directoryId: number) => {
try {
const files = await unregisterDirectory(userId, directoryId);
return {
files: files.map(({ id, path, thumbnailPath }) => {
safeUnlink(path); // Intended
safeUnlink(thumbnailPath); // Intended
return id;
}),
};
} catch (e) {
if (e instanceof IntegrityError && e.message === "Directory not found") {
error(404, "Invalid directory id");
}
throw e;
}
};
export const renameDirectory = async (
userId: number,
directoryId: number,
dekVersion: Date,
newEncName: Ciphertext,
) => {
try {
await setDirectoryEncName(userId, directoryId, dekVersion, newEncName);
} catch (e) {
if (e instanceof IntegrityError) {
if (e.message === "Directory not found") {
error(404, "Invalid directory id");
} else if (e.message === "Invalid DEK version") {
error(400, "Invalid DEK version");
}
}
throw e;
}
};
export const createDirectory = async (params: NewDirectory) => {
const oneMinuteAgo = new Date(Date.now() - 60 * 1000);
const oneMinuteLater = new Date(Date.now() + 60 * 1000);
if (params.dekVersion <= oneMinuteAgo || params.dekVersion >= oneMinuteLater) {
error(400, "Invalid DEK version");
}
try {
await registerDirectory(params);
} catch (e) {
if (e instanceof IntegrityError && e.message === "Inactive MEK version") {
error(400, "Invalid MEK version");
}
throw e;
}
};

View File

@@ -1,223 +1,74 @@
import { error } from "@sveltejs/kit";
import { createHash } from "crypto";
import { createReadStream, createWriteStream } from "fs";
import { mkdir, stat, unlink } from "fs/promises";
import { dirname } from "path";
import { createReadStream } from "fs";
import { stat } from "fs/promises";
import { Readable } from "stream";
import { pipeline } from "stream/promises";
import { v4 as uuidv4 } from "uuid";
import { IntegrityError } from "$lib/server/db/error";
import {
registerFile,
getAllFileIds,
getAllFileIdsByContentHmac,
getFile,
setFileEncName,
unregisterFile,
getAllFileCategories,
type NewFile,
} from "$lib/server/db/file";
import {
updateFileThumbnail,
getFileThumbnail,
getMissingFileThumbnails,
} from "$lib/server/db/media";
import type { Ciphertext } from "$lib/server/db/schema";
import env from "$lib/server/loadenv";
import { FileRepo, MediaRepo } from "$lib/server/db";
export const getFileInformation = async (userId: number, fileId: number) => {
const file = await getFile(userId, fileId);
if (!file) {
error(404, "Invalid file id");
const createEncContentStream = async (
path: string,
iv?: Buffer,
range?: { start?: number; end?: number },
) => {
const { size: fileSize } = await stat(path);
const ivSize = iv?.byteLength ?? 0;
const totalSize = fileSize + ivSize;
const start = range?.start ?? 0;
const end = range?.end ?? totalSize - 1;
if (start > end || start < 0 || end >= totalSize) {
error(416, "Invalid range");
}
const categories = await getAllFileCategories(fileId);
return {
parentId: file.parentId ?? ("root" as const),
mekVersion: file.mekVersion,
encDek: file.encDek,
dekVersion: file.dekVersion,
contentType: file.contentType,
encContentIv: file.encContentIv,
encName: file.encName,
encCreatedAt: file.encCreatedAt,
encLastModifiedAt: file.encLastModifiedAt,
categories: categories.map(({ id }) => id),
};
};
const safeUnlink = async (path: string | null) => {
if (path) {
await unlink(path).catch(console.error);
}
};
export const deleteFile = async (userId: number, fileId: number) => {
try {
const { path, thumbnailPath } = await unregisterFile(userId, fileId);
safeUnlink(path); // Intended
safeUnlink(thumbnailPath); // Intended
} catch (e) {
if (e instanceof IntegrityError && e.message === "File not found") {
error(404, "Invalid file id");
}
throw e;
}
};
export const getFileStream = async (userId: number, fileId: number) => {
const file = await getFile(userId, fileId);
if (!file) {
error(404, "Invalid file id");
}
const { size } = await stat(file.path);
return {
encContentStream: Readable.toWeb(createReadStream(file.path)),
encContentSize: size,
};
};
export const renameFile = async (
userId: number,
fileId: number,
dekVersion: Date,
newEncName: Ciphertext,
) => {
try {
await setFileEncName(userId, fileId, dekVersion, newEncName);
} catch (e) {
if (e instanceof IntegrityError) {
if (e.message === "File not found") {
error(404, "Invalid file id");
} else if (e.message === "Invalid DEK version") {
error(400, "Invalid DEK version");
}
}
throw e;
}
};
export const getFileThumbnailInformation = async (userId: number, fileId: number) => {
const thumbnail = await getFileThumbnail(userId, fileId);
if (!thumbnail) {
error(404, "File or its thumbnail not found");
}
return { updatedAt: thumbnail.updatedAt, encContentIv: thumbnail.encContentIv };
};
export const getFileThumbnailStream = async (userId: number, fileId: number) => {
const thumbnail = await getFileThumbnail(userId, fileId);
if (!thumbnail) {
error(404, "File or its thumbnail not found");
}
const { size } = await stat(thumbnail.path);
return {
encContentStream: Readable.toWeb(createReadStream(thumbnail.path)),
encContentSize: size,
};
};
export const uploadFileThumbnail = async (
userId: number,
fileId: number,
dekVersion: Date,
encContentIv: string,
encContentStream: Readable,
) => {
const path = `${env.thumbnailsPath}/${userId}/${uuidv4()}`;
await mkdir(dirname(path), { recursive: true });
try {
await pipeline(encContentStream, createWriteStream(path, { flags: "wx", mode: 0o600 }));
const oldPath = await updateFileThumbnail(userId, fileId, dekVersion, path, encContentIv);
safeUnlink(oldPath); // Intended
} catch (e) {
await safeUnlink(path);
if (e instanceof IntegrityError) {
if (e.message === "File not found") {
error(404, "File not found");
} else if (e.message === "Invalid DEK version") {
error(400, "Mismatched DEK version");
}
}
throw e;
}
};
export const getFileList = async (userId: number) => {
const fileIds = await getAllFileIds(userId);
return { files: fileIds };
};
export const scanDuplicateFiles = async (
userId: number,
hskVersion: number,
contentHmac: string,
) => {
const fileIds = await getAllFileIdsByContentHmac(userId, hskVersion, contentHmac);
return { files: fileIds };
};
export const scanMissingFileThumbnails = async (userId: number) => {
const fileIds = await getMissingFileThumbnails(userId);
return { files: fileIds };
};
export const uploadFile = async (
  params: Omit<NewFile, "path" | "encContentHash">,
  encContentStream: Readable,
  encContentHash: Promise<string>,
) => {
  const oneDayAgo = new Date(Date.now() - 24 * 60 * 60 * 1000);
  const oneMinuteLater = new Date(Date.now() + 60 * 1000);
  if (params.dekVersion <= oneDayAgo || params.dekVersion >= oneMinuteLater) {
    error(400, "Invalid DEK version");
  }

  const path = `${env.libraryPath}/${params.userId}/${uuidv4()}`;
  await mkdir(dirname(path), { recursive: true });

  try {
    const hashStream = createHash("sha256");
    const [, hash] = await Promise.all([
      pipeline(
        encContentStream,
        async function* (source) {
          // Hash the ciphertext while it streams to disk.
          for await (const chunk of source) {
            hashStream.update(chunk);
            yield chunk;
          }
        },
        createWriteStream(path, { flags: "wx", mode: 0o600 }),
      ),
      encContentHash,
    ]);
    if (hashStream.digest("base64") !== hash) {
      throw new Error("Invalid checksum");
    }

    const { id: fileId } = await registerFile({
      ...params,
      path,
      encContentHash: hash,
    });
    return { fileId };
  } catch (e) {
    await safeUnlink(path);
    if (e instanceof IntegrityError && e.message === "Inactive MEK version") {
      error(400, "Invalid MEK version");
    } else if (
      e instanceof Error &&
      (e.message === "Invalid request body" || e.message === "Invalid checksum")
    ) {
      error(400, "Invalid request body");
    }
    throw e;
  }
};

const createEncContentStream = async (
  path: string,
  iv?: Buffer,
  range?: { start?: number; end?: number },
) => {
  // The logical stream is [IV ‖ ciphertext]; a byte range is resolved against
  // the combined size and defaults to the full stream when omitted.
  const ivSize = iv?.length ?? 0;
  const { size: fileSize } = await stat(path);
  const totalSize = ivSize + fileSize;
  const start = range?.start ?? 0;
  const end = range?.end ?? totalSize - 1;

  return {
    encContentStream: Readable.toWeb(
      Readable.from(
        (async function* () {
          if (start < ivSize) {
            yield iv!.subarray(start, Math.min(end + 1, ivSize));
          }
          if (end >= ivSize) {
            yield* createReadStream(path, {
              start: Math.max(0, start - ivSize),
              end: end - ivSize,
            });
          }
        })(),
      ),
    ),
    range: { start, end, total: totalSize },
  };
};
export const getFileStream = async (
  userId: number,
  fileId: number,
  range?: { start?: number; end?: number },
) => {
  const file = await FileRepo.getFile(userId, fileId);
  if (!file) {
    error(404, "Invalid file id");
  }

  return createEncContentStream(
    file.path,
    file.encContentIv ? Buffer.from(file.encContentIv, "base64") : undefined,
    range,
  );
};

export const getFileThumbnailStream = async (
  userId: number,
  fileId: number,
  range?: { start?: number; end?: number },
) => {
  const thumbnail = await MediaRepo.getFileThumbnail(userId, fileId);
  if (!thumbnail) {
    error(404, "File or its thumbnail not found");
  }

  return createEncContentStream(
    thumbnail.path,
    thumbnail.encContentIv ? Buffer.from(thumbnail.encContentIv, "base64") : undefined,
    range,
  );
};
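The streaming endpoints above serve an on-the-fly concatenation of the stored IV and the on-disk ciphertext, so a requested byte range has to be split between the in-memory IV prefix and a file read. The arithmetic can be sketched as a pure function (a hypothetical helper, not part of the diff; `start`/`end` are inclusive byte offsets into the combined stream):

```typescript
// Plan how to serve a byte range over the logical [IV ‖ ciphertext] stream:
// the portion below ivSize comes from the IV buffer, the rest from the file.
interface RangePlan {
  ivSlice?: { start: number; end: number }; // indices into the IV buffer (end exclusive)
  fileRange?: { start: number; end: number }; // offsets into the ciphertext file (inclusive)
}

function planEncContentRange(ivSize: number, start: number, end: number): RangePlan {
  const plan: RangePlan = {};
  if (start < ivSize) {
    // The range begins inside the IV prefix.
    plan.ivSlice = { start, end: Math.min(end + 1, ivSize) };
  }
  if (end >= ivSize) {
    // The range extends into the ciphertext; shift offsets past the IV.
    plan.fileRange = { start: Math.max(0, start - ivSize), end: end - ivSize };
  }
  return plan;
}
```

For example, with a 16-byte IV a request for bytes 10–31 is served from IV bytes 10–15 followed by file bytes 0–15.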

View File

@@ -1,31 +0,0 @@
import { error } from "@sveltejs/kit";
import { IntegrityError } from "$lib/server/db/error";
import { registerInitialHsk, getAllValidHsks } from "$lib/server/db/hsk";

export const getHskList = async (userId: number) => {
  const hsks = await getAllValidHsks(userId);
  return {
    encHsks: hsks.map(({ version, state, mekVersion, encHsk }) => ({
      version,
      state,
      mekVersion,
      encHsk,
    })),
  };
};

export const registerInitialActiveHsk = async (
  userId: number,
  createdBy: number,
  mekVersion: number,
  encHsk: string,
) => {
  try {
    await registerInitialHsk(userId, createdBy, mekVersion, encHsk);
  } catch (e) {
    if (e instanceof IntegrityError && e.message === "HSK already registered") {
      error(409, "Initial HSK already registered");
    }
    throw e;
  }
};

View File

@@ -1,38 +0,0 @@
import { error } from "@sveltejs/kit";
import { setUserClientStateToActive } from "$lib/server/db/client";
import { IntegrityError } from "$lib/server/db/error";
import { registerInitialMek, getAllValidClientMeks } from "$lib/server/db/mek";
import { verifyClientEncMekSig } from "$lib/server/modules/mek";

export const getClientMekList = async (userId: number, clientId: number) => {
  const clientMeks = await getAllValidClientMeks(userId, clientId);
  return {
    encMeks: clientMeks.map(({ version, state, encMek, encMekSig }) => ({
      version,
      state,
      encMek,
      encMekSig,
    })),
  };
};

export const registerInitialActiveMek = async (
  userId: number,
  createdBy: number,
  encMek: string,
  encMekSig: string,
) => {
  if (!(await verifyClientEncMekSig(userId, createdBy, 1, encMek, encMekSig))) {
    error(400, "Invalid signature");
  }

  try {
    await registerInitialMek(userId, createdBy, encMek, encMekSig);
    await setUserClientStateToActive(userId, createdBy);
  } catch (e) {
    if (e instanceof IntegrityError && e.message === "MEK already registered") {
      error(409, "Initial MEK already registered");
    }
    throw e;
  }
};
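`registerInitialActiveMek` verifies the client's signature over the encrypted MEK before persisting anything. The diff does not show how `verifyClientEncMekSig` is implemented, but the general shape of such a check, assuming an Ed25519 client key and a signature over the version plus the encrypted key blob (both the payload format and the helper names here are hypothetical), could look like:

```typescript
import { generateKeyPairSync, sign, verify, KeyObject } from "node:crypto";

// Hypothetical payload: the MEK version bound together with the encrypted MEK,
// so a signature cannot be replayed for a different version.
const signedPayload = (version: number, encMek: string) =>
  Buffer.from(`${version}:${encMek}`);

// Sketch of a verifyClientEncMekSig-style check against the client's
// registered Ed25519 public key (assumption, not the project's actual scheme).
const verifyEncMekSig = (
  clientPublicKey: KeyObject,
  version: number,
  encMek: string,
  encMekSig: string,
) =>
  verify(null, signedPayload(version, encMek), clientPublicKey, Buffer.from(encMekSig, "base64"));
```

With Ed25519 keys, `sign(null, data, privateKey)` produces the raw signature the client would send base64-encoded.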

View File

@@ -0,0 +1,82 @@
import { error } from "@sveltejs/kit";
import { createHash } from "crypto";
import { createWriteStream } from "fs";
import { Readable } from "stream";
import { ENCRYPTION_OVERHEAD, ENCRYPTED_CHUNK_SIZE } from "$lib/constants";
import { UploadRepo } from "$lib/server/db";
import { safeRecursiveRm, safeUnlink } from "$lib/server/modules/filesystem";

const chunkLocks = new Set<string>();

export const uploadChunk = async (
  userId: number,
  sessionId: string,
  chunkIndex: number,
  encChunkStream: Readable,
  encChunkHash: string,
) => {
  const lockKey = `${sessionId}/${chunkIndex}`;
  if (chunkLocks.has(lockKey)) {
    error(409, "Chunk upload already in progress");
  } else {
    chunkLocks.add(lockKey);
  }

  let filePath;
  try {
    const session = await UploadRepo.getUploadSession(sessionId, userId);
    if (!session) {
      error(404, "Invalid upload id");
    } else if (chunkIndex >= session.totalChunks) {
      error(400, "Invalid chunk index");
    } else if (session.uploadedChunks.includes(chunkIndex)) {
      error(409, "Chunk already uploaded");
    }

    const isLastChunk = chunkIndex === session.totalChunks - 1;
    filePath = `${session.path}/${chunkIndex}`;

    const hashStream = createHash("sha256");
    const writeStream = createWriteStream(filePath, { flags: "wx", mode: 0o600 });
    let writtenBytes = 0;
    for await (const chunk of encChunkStream) {
      hashStream.update(chunk);
      writeStream.write(chunk);
      writtenBytes += chunk.length;
    }
    await new Promise<void>((resolve, reject) => {
      writeStream.end((e: any) => (e ? reject(e) : resolve()));
    });

    if (hashStream.digest("base64") !== encChunkHash) {
      throw new Error("Invalid checksum");
    } else if (
      (!isLastChunk && writtenBytes !== ENCRYPTED_CHUNK_SIZE) ||
      (isLastChunk && (writtenBytes <= ENCRYPTION_OVERHEAD || writtenBytes > ENCRYPTED_CHUNK_SIZE))
    ) {
      throw new Error("Invalid chunk size");
    }

    await UploadRepo.markChunkAsUploaded(sessionId, chunkIndex);
  } catch (e) {
    await safeUnlink(filePath);
    if (
      e instanceof Error &&
      (e.message === "Invalid checksum" || e.message === "Invalid chunk size")
    ) {
      error(400, "Invalid request body");
    }
    throw e;
  } finally {
    chunkLocks.delete(lockKey);
  }
};

export const cleanupExpiredUploadSessions = async () => {
  const paths = await UploadRepo.cleanupExpiredUploadSessions();
  await Promise.all(paths.map(safeRecursiveRm));
};
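The size check in `uploadChunk` encodes an invariant worth spelling out: every chunk except the last must be exactly `ENCRYPTED_CHUNK_SIZE` bytes, and the last chunk must still carry at least one ciphertext byte beyond the per-chunk encryption overhead. A sketch of that rule as a standalone predicate (the constant values below are illustrative; the real ones live in `$lib/constants`):

```typescript
// Illustrative constants; the real values come from $lib/constants.
const ENCRYPTION_OVERHEAD = 16; // e.g. an AEAD auth tag per chunk (assumption)
const ENCRYPTED_CHUNK_SIZE = 1024 * 1024 + ENCRYPTION_OVERHEAD;

// True when a chunk of `size` bytes is acceptable at position `index`
// out of `totalChunks`, mirroring the validation in uploadChunk.
function isValidChunkSize(index: number, totalChunks: number, size: number): boolean {
  const isLastChunk = index === totalChunks - 1;
  if (!isLastChunk) return size === ENCRYPTED_CHUNK_SIZE; // interior chunks are full
  return size > ENCRYPTION_OVERHEAD && size <= ENCRYPTED_CHUNK_SIZE; // last chunk is non-empty
}
```

Rejecting undersized interior chunks up front means the server can compute any chunk's byte offset from its index alone when reassembling the file.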

View File

@@ -1,15 +0,0 @@
import { error } from "@sveltejs/kit";
import { getUser, setUserNickname } from "$lib/server/db/user";

export const getUserInformation = async (userId: number) => {
  const user = await getUser(userId);
  if (!user) {
    error(500, "Invalid session id");
  }
  return { email: user.email, nickname: user.nickname };
};

export const changeNickname = async (userId: number, nickname: string) => {
  await setUserNickname(userId, nickname);
};

View File

@@ -0,0 +1,39 @@
import { DECRYPTED_FILE_URL_PREFIX } from "$lib/constants";
import type { FileMetadata, ServiceWorkerMessage, ServiceWorkerResponse } from "./types";

const PREPARE_TIMEOUT_MS = 5000;

const getServiceWorker = async () => {
  const registration = await navigator.serviceWorker.ready;
  const sw = registration.active;
  if (!sw) {
    throw new Error("Service worker not activated");
  }
  return sw;
};

export const prepareFileDecryption = async (id: number, metadata: FileMetadata) => {
  const sw = await getServiceWorker();
  return new Promise<void>((resolve, reject) => {
    const timeout = setTimeout(() => {
      // Remove the listener on timeout as well, so it does not leak.
      navigator.serviceWorker.removeEventListener("message", handler);
      reject(new Error("Service worker timeout"));
    }, PREPARE_TIMEOUT_MS);
    const handler = (event: MessageEvent<ServiceWorkerResponse>) => {
      if (event.data.type === "decryption-ready" && event.data.fileId === id) {
        clearTimeout(timeout);
        navigator.serviceWorker.removeEventListener("message", handler);
        resolve();
      }
    };
    navigator.serviceWorker.addEventListener("message", handler);
    sw.postMessage({
      type: "decryption-prepare",
      fileId: id,
      ...metadata,
    } satisfies ServiceWorkerMessage);
  });
};

export const getDecryptedFileUrl = (id: number) => `${DECRYPTED_FILE_URL_PREFIX}${id}`;
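`prepareFileDecryption` is a one-shot request/response handshake over `postMessage`: register a listener, send the request, resolve on the first matching reply, and give up after a timeout. The pattern generalizes; a sketch with a hypothetical `subscribe` function standing in for `navigator.serviceWorker`'s message events (the listener-registry shape here is an assumption for illustration):

```typescript
// Resolve with the first message satisfying `match`, or reject after
// `timeoutMs`. `subscribe` registers a listener and returns an unsubscribe
// function; either outcome removes the listener exactly once.
function waitForMatch<T>(
  subscribe: (listener: (msg: T) => void) => () => void,
  match: (msg: T) => boolean,
  timeoutMs: number,
): Promise<T> {
  return new Promise<T>((resolve, reject) => {
    let unsubscribe = () => {};
    const timer = setTimeout(() => {
      unsubscribe();
      reject(new Error("Service worker timeout"));
    }, timeoutMs);
    unsubscribe = subscribe((msg) => {
      if (!match(msg)) return; // ignore unrelated messages
      clearTimeout(timer);
      unsubscribe();
      resolve(msg);
    });
  });
}
```

Cleaning up in both branches (match and timeout) is the important part; a listener left behind after a timeout would fire on a stale reply for a file the caller no longer cares about.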

Some files were not shown because too many files have changed in this diff.