Compare commits

...

6 Commits

Author SHA1 Message Date
Sylver b0352724b2 use zod for config validation 2024-01-07 12:55:22 +08:00
Sylver 805ebcd4b4 fix lint errors 2024-01-07 12:31:37 +08:00
Sylver 62b91a8316 fix useQueryState replaceState 2024-01-07 12:26:27 +08:00
Sylver 40cbaaca8b update deps, move to atlas configs 2024-01-07 12:19:16 +08:00
Sylver aec85d03a6 remove v0 -> v1 migration code 2024-01-07 11:33:58 +08:00
Sylver 757e53fba4 update readme 2024-01-07 11:33:19 +08:00
51 changed files with 4113 additions and 5032 deletions

.syncpackrc Normal file

@ -0,0 +1,31 @@
{
"indent": " ",
"sortFirst": [
"name",
"version",
"license",
"repository",
"author",
"type",
"main",
"private",
"types",
"module",
"exports",
"source",
"publishConfig",
"contributors",
"keywords",
"files",
"workspaces",
"engines",
"scripts",
"dependencies",
"devDependencies",
"peerDependencies",
"mikro-orm",
"jest"
]
}


@ -1,15 +0,0 @@
# migrating
## from micro 0.0.x to micro 1.0.0
I've made a best-effort attempt to make migration as painless as possible, mostly for my own sanity. These steps are quite in-depth, but in reality the migration should be fairly simple for most users. If you get stuck at any point, please join the [discord server](https://discord.gg/VDMX6VQRZm) and ask for help.
1. Create a backup of the database and the data directory.
2. Update your `.microrc` with the changes seen in the [example config](example/.microrc.yaml) (your config may be in JSON while the example is now YAML, but the keys are 1:1). A notable change is that `database` is now `databaseUrl`.
3. Change the docker image from `sylver/micro` or `sylver/micro:master` to `sylver/micro:main`
4. Change the port from `8080` to `3000`. If you are using the example config, do this in `Caddyfile` by changing `micro:8080` to `micro:3000`.
5. Start the container. It should exit on startup with an error message saying that there is data that must be migrated. If it does not, you did not update the image tag correctly or it cannot detect data to be migrated.
6. Read the error message, then stop the container and set the `MIGRATE_OLD_DATABASE` environment variable to `true`.
7. Start the container and it will migrate the database automatically.
After that, you should be able to use it as normal. Thumbnails are the only data that is not migrated, as the format changed and it doesn't really matter because they can just be regenerated on demand. If you run into any issues during migration, join the [discord server](https://discord.gg/VDMX6VQRZm) or open an issue on [github](https://github.com/sylv/micro/issues/new).


@ -1,14 +1,28 @@
<p align="center">
<svg xmlns="http://www.w3.org/2000/svg" width="128" height="128" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="mr-2 text-primary">
<path d="M6.13 1L6 16a2 2 0 0 0 2 2h15"></path>
<path d="M1 6.13L16 6a2 2 0 0 1 2 2v15"></path>
</svg>
</p>
<p align="center">
<img src="https://skillicons.dev/icons?i=next,tailwind,nest,typescript,docker,graphql" />
<br/>
<a href="https://discord.gg/VDMX6VQRZm"><kbd>🔵 discord</kbd></a> <a href="https://micro.sylo.digital"><kbd>🟢 hosted instance</kbd></a>
</p>
# micro
An invite-only file sharing service with support for ShareX. You can see a preview at https://micro.sylo.digital
A vanity file sharing service with support for ShareX. You can see a preview at https://micro.sylo.digital
- [micro](#micro)
- [features](#features)
- [screenshots](#screenshots)
- [installation](#installation)
- [configuration](#configuration)
- [updating](#updating)
- [todo](#todo)
- [discord](#discord)
- [support](#support)
## features
@ -51,25 +65,37 @@ An invite-only file sharing service with support for ShareX. You can see a previ
## installation
> [!NOTE]
> If you need help, join the [discord server](https://discord.gg/VDMX6VQRZm). This guide assumes you are on linux with a basic understanding of linux and docker.
> To migrate from micro 0.0.x to 1.0.0, see [MIGRATING.md](MIGRATING.md).
> [!TIP]
> If you are already familiar with docker, you can look at the [compose file](./example/compose.yml) and [config file](./example/.microrc.yaml) to get set up quickly. The below is a more detailed guide for inexperienced users.
1. Install `git`, `docker` and `docker-compose`
1. Install `git` and `docker`
2. Download the files in this repository, `git clone https://github.com/sylv/micro.git`
3. Copy the example configs to the current directory, `cp ./micro/example/* ./`
4. Fill out `.microrc.yaml`, `Caddyfile` and `docker-compose.yml`. **It is extremely important you read through each of the 3 files and make sure you understand what they do.** Specifically, `.microrc.yaml` contains a secret that handles authentication, if it is not a secure random string everyone can sign in as anyone they want without a password.
5. Run `docker-compose up -d` to start the database and micro.
6. Get the startup invite by doing `docker-compose logs micro` and copying the invite URL that should be somewhere towards the end of the log. Go to that URL to create the first account.
4. Fill out `.microrc.yaml`, `Caddyfile` and `compose.yml`. **It is extremely important you read through each of the 3 files and make sure you understand what they do.** Specifically, `.microrc.yaml` contains a secret that handles authentication; if it is not a secure random string, everyone can sign in as anyone they want without a password.
5. Run `docker compose up -d` to start the database and micro.
6. Get the startup invite by doing `docker compose logs micro` and copying the invite URL that should be somewhere towards the end of the log. Go to that URL to create the first account.
Setup is now complete and your instance should be working.
To add another user, sign in, then go to `/api/invite` and copy the URL it gives you. This will be improved in the future.
### configuration
micro uses [venera](https://github.com/sylv/venera) to load configuration files. Configuration files are validated on startup, and may log errors if invalid setups are detected. The venera page has more information, but tl;dr:
- `.microrc.yaml` is the main configuration file.
- You can override any config value with an environment variable. The key `hosts.0.url` would be set as `MICRO_HOSTS__0__URL`
- You can use other file formats, like JSON or TOML.
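To illustrate, a minimal `.microrc.yaml` covering the keys referenced in this guide might look like the following. The values are placeholders loosely based on the example configs in this repo, not recommendations:

```yaml
# Hypothetical minimal config; key names follow the examples in this repo,
# values are illustrative placeholders only.
databaseUrl: postgresql://micro:youshallnotpass@postgres/micro
secret: replace-with-a-long-random-string
inquiries: admin@example.com
uploadLimit: 50MB
storagePath: /data
hosts:
  - url: https://example.com
```

Any of these keys can equally come from the environment using the `MICRO_` prefix convention described above, e.g. `MICRO_UPLOAD_LIMIT`-style overrides for nested keys.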
### updating
You should take a full database backup before updating. Pending database migrations will be applied automatically on startup.
You should take a full database backup before updating, but you won't, will you?
The database will be automatically migrated on startup.
1. `docker-compose pull micro`
2. `docker-compose up -d micro`
1. `docker compose pull micro`
2. `docker compose up -d micro`
## todo
@ -80,7 +106,7 @@ You should take a full database backup before updating. Pending database migrati
- [ ] SQLite support
- [ ] Private email aliases like firefox relay (might be difficult/expensive)
## discord
## support
<a href="https://discord.gg/VDMX6VQRZm" target="__blank">
<img src="https://discordapp.com/api/guilds/778444719553511425/widget.png?style=banner2" alt="sylo.digital"/>


@ -1,6 +0,0 @@
module.exports = {
extends: ["@commitlint/config-conventional"],
rules: {
"subject-case": [0],
},
};


@ -13,8 +13,8 @@ services:
image: postgres:12-alpine
restart: unless-stopped
environment:
# lleaving this as default should be fine as postgres will only ever be exposed to services
# in this docker-compose file, but you might still want to consider changing it to something more secure.
# leaving this as default should be fine as postgres will only ever be exposed to services
# in this compose file, but you might still want to consider changing it to something more secure.
- POSTGRES_PASSWORD=youshallnotpass
- POSTGRES_USER=micro
- POSTGRES_DB=micro


@ -1,22 +1,24 @@
{
"name": "micro",
"version": "1.0.0",
"license": "GPL-3.0",
"repository": "https://github.com/sylv/micro.git",
"author": "Ryan <ryan@sylver.me>",
"license": "GPL-3.0",
"private": true,
"packageManager": "pnpm@7.0.0",
"engines": {
"node": ">=16",
"pnpm": ">=7"
},
"scripts": {
"lint": "turbo run lint",
"build": "turbo run build",
"test": "turbo run test",
"clean": "rm -rf ./packages/*/{tsconfig.tsbuildinfo,lib,dist,yarn-error.log,.next}"
"clean": "rm -rf ./packages/*/{tsconfig.tsbuildinfo,lib,dist,yarn-error.log,.next}",
"lint": "turbo run lint",
"sync": "syncpack format && syncpack fix-mismatches",
"test": "turbo run test"
},
"devDependencies": {
"turbo": "1.10.13"
}
"syncpack": "^12.3.0",
"turbo": "1.11.3"
},
"packageManager": "pnpm@7.0.0"
}


@ -1,11 +1,9 @@
module.exports = {
extends: require.resolve('@sylo-digital/scripts/eslint/base'),
overrides: [
{
files: ['**/*.{entity,embeddable,resolver}.ts'],
rules: {
'@typescript-eslint/no-inferrable-types': 'off',
},
},
],
extends: require.resolve('@atlasbot/configs/eslint/node'),
parserOptions: {
project: './tsconfig.json',
},
rules: {
'unicorn/no-abusive-eslint-disable': 'off',
},
};


@ -1,86 +1,89 @@
{
"name": "@ryanke/micro-api",
"version": "1.0.0",
"license": "GPL-3.0",
"repository": "https://github.com/sylv/micro.git",
"author": "Ryan <ryan@sylver.me>",
"license": "GPL-3.0",
"type": "module",
"private": true,
"engine": {
"node": ">=18"
"engines": {
"node": ">=20"
},
"scripts": {
"watch": "tsup --watch --onSuccess \"node dist/main.js --inspect --inspect-brk\"",
"build": "tsup",
"build": "tsc --noEmit && tsup",
"lint": "eslint src --fix --cache",
"test": "vitest run"
"test": "vitest run",
"watch": "tsup --watch --onSuccess \"node dist/main.js\""
},
"dependencies": {
"@fastify/cookie": "^9.0.4",
"@fastify/helmet": "^11.0.0",
"@fastify/multipart": "^7.7.3",
"@fastify/cookie": "^9.2.0",
"@fastify/helmet": "^11.1.1",
"@fastify/multipart": "^8.1.0",
"@jenyus-org/graphql-utils": "^1.5.0",
"@mercuriusjs/gateway": "^1.2.0",
"@mikro-orm/core": "^5.7.14",
"@mikro-orm/migrations": "^5.7.14",
"@mikro-orm/nestjs": "^5.2.1",
"@mikro-orm/postgresql": "^5.7.14",
"@nestjs/common": "^10.2.2",
"@nestjs/core": "^10.2.2",
"@nestjs/graphql": "^12.0.8",
"@nestjs/jwt": "^10.1.0",
"@nestjs/mercurius": "^12.0.4",
"@nestjs/passport": "^10.0.1",
"@nestjs/platform-fastify": "^10.2.2",
"@nestjs/schedule": "^3.0.3",
"@mercuriusjs/gateway": "^2.2.0",
"@mikro-orm/core": "^5.9.7",
"@mikro-orm/migrations": "^5.9.7",
"@mikro-orm/nestjs": "^5.2.3",
"@mikro-orm/postgresql": "^5.9.7",
"@nestjs/common": "^10.3.0",
"@nestjs/core": "^10.3.0",
"@nestjs/graphql": "^12.0.11",
"@nestjs/jwt": "^10.2.0",
"@nestjs/mercurius": "^12.0.11",
"@nestjs/passport": "^10.0.3",
"@nestjs/platform-fastify": "^10.3.0",
"@nestjs/schedule": "^4.0.0",
"@ryanke/venera": "^1.0.5",
"rxjs": "^7.8.1",
"bcryptjs": "^2.4.3",
"class-transformer": "^0.5.1",
"class-validator": "^0.14.0",
"fastify": "^4.22.0",
"fastify": "^4.25.2",
"fluent-ffmpeg": "^2.1.2",
"graphql": "^16.8.0",
"mercurius": "^13.1.0",
"graphql": "^16.8.1",
"mercurius": "^13.3.3",
"mime-types": "^2.1.35",
"nodemailer": "^6.9.4",
"nodemailer": "^6.9.8",
"otplib": "^12.0.1",
"passport": "^0.6.0",
"passport": "^0.7.0",
"passport-jwt": "^4.0.1",
"sharp": "^0.32.5",
"rxjs": "^7.8.1",
"sharp": "^0.33.1",
"stream-size": "^0.0.6"
},
"devDependencies": {
"@mikro-orm/cli": "^5.7.14",
"@swc/core": "^1.3.80",
"@sylo-digital/scripts": "^1.0.12",
"@types/bcryptjs": "^2.4.3",
"@types/bytes": "^3.1.1",
"@types/dedent": "^0.7.0",
"@types/fluent-ffmpeg": "^2.1.21",
"@types/luxon": "^3.3.1",
"@types/mime-types": "^2.1.1",
"@types/ms": "^0.7.31",
"@types/node": "^18.15.11",
"@types/nodemailer": "^6.4.9",
"@types/passport-jwt": "^3.0.9",
"@atlasbot/configs": "^10.5.14",
"@mikro-orm/cli": "^5.9.7",
"@swc/core": "^1.3.102",
"@types/bcryptjs": "^2.4.6",
"@types/bytes": "^3.1.4",
"@types/dedent": "^0.7.2",
"@types/fluent-ffmpeg": "^2.1.24",
"@types/luxon": "^3.4.0",
"@types/mime-types": "^2.1.4",
"@types/ms": "^0.7.34",
"@types/node": "^20.10.6",
"@types/nodemailer": "^6.4.14",
"@types/passport-jwt": "^4.0.0",
"bytes": "^3.1.2",
"chalk": "^5.3.0",
"content-range": "^2.0.2",
"dedent": "^1.5.1",
"escape-string-regexp": "^5.0.0",
"file-type": "^18.5.0",
"file-type": "^18.7.0",
"handlebars": "^4.7.8",
"istextorbinary": "^6.0.0",
"luxon": "^3.4.2",
"istextorbinary": "^9.5.0",
"luxon": "^3.4.4",
"ms": "^3.0.0-canary.1",
"nanoid": "^4.0.2",
"nanoid": "^5.0.4",
"normalize-url": "^8.0.0",
"pretty-bytes": "^6.1.1",
"reflect-metadata": "^0.1.13",
"ts-node": "^10.9.1",
"tsup": "^7.2.0",
"typescript": "^5.2.2",
"vitest": "^0.34.3"
"reflect-metadata": "^0.2.1",
"ts-node": "^10.9.2",
"tsup": "^8.0.1",
"typescript": "^5.3.3",
"vitest": "^1.1.3",
"zod": "^3.22.4",
"zod-validation-error": "^2.1.0"
},
"mikro-orm": {
"useTsNode": true,


@ -16,7 +16,7 @@ export class ExifTransformer extends Transform {
private static readonly maxMarkerLength = Math.max(
ExifTransformer.exifMarker.length,
ExifTransformer.xmpMarker.length,
ExifTransformer.flirMarker.length
ExifTransformer.flirMarker.length,
);
private remainingBytes?: number;


@ -1,82 +0,0 @@
import bytes from 'bytes';
import { Transform, Type } from 'class-transformer';
import {
IsBoolean,
IsDefined,
IsEmail,
IsMimeType,
IsNumber,
IsOptional,
IsString,
IsUrl,
Max,
NotEquals,
ValidateNested,
} from 'class-validator';
import path from 'path';
import { expandMime } from '../helpers/expand-mime.js';
import { MicroConversion } from './MicroConversion.js';
import { MicroEmail } from './MicroEmail.js';
import { MicroHost } from './MicroHost.js';
import { MicroPurge } from './MicroPurge.js';
export class MicroConfig {
@IsUrl({ require_tld: false, require_protocol: true, protocols: ['postgresql', 'postgres'] })
databaseUrl: string;
@IsString()
@NotEquals('YOU_SHALL_NOT_PASS')
secret: string;
@IsEmail()
inquiries: string;
@IsNumber()
@Transform(({ value }) => bytes.parse(value))
uploadLimit = bytes.parse('50MB');
@IsNumber()
@IsOptional()
@Max(500000)
maxPasteLength = 500000;
@IsMimeType({ each: true })
@IsOptional()
@Transform(({ value }) => {
if (!value) return value;
const clean = expandMime(value);
return new Set(clean);
})
allowTypes?: Set<string>;
@IsString()
@Transform(({ value }) => path.resolve(value))
storagePath: string;
@IsBoolean()
restrictFilesToHost: boolean;
@ValidateNested()
@IsOptional()
@Type(() => MicroPurge)
purge?: MicroPurge;
@ValidateNested()
@IsOptional()
@Type(() => MicroEmail)
email: MicroEmail;
@ValidateNested({ each: true })
@IsOptional()
@Type(() => MicroConversion)
conversions?: MicroConversion[];
@ValidateNested({ each: true })
@IsDefined()
@Type(() => MicroHost)
hosts: MicroHost[];
get rootHost() {
return this.hosts[0]!;
}
}


@ -1,21 +0,0 @@
import bytes from 'bytes';
import { Transform } from 'class-transformer';
import { IsMimeType, IsNumber, IsOptional, IsString } from 'class-validator';
import { expandMime } from '../helpers/expand-mime.js';
export class MicroConversion {
@IsString({ each: true })
@Transform(({ value }) => {
const clean = expandMime(value);
return new Set(clean);
})
from: Set<string>;
@IsMimeType()
to: string;
@IsNumber()
@IsOptional()
@Transform(({ value }) => bytes.parse(value))
minSize?: number;
}


@ -1,9 +0,0 @@
import { IsEmail, IsObject } from 'class-validator';
export class MicroEmail {
@IsEmail()
from: string;
@IsObject()
smtp: Record<string, any>;
}


@ -1,45 +0,0 @@
import { IsOptional, IsString, IsUrl, Matches } from 'class-validator';
import escapeString from 'escape-string-regexp';
import { HostService } from '../modules/host/host.service.js';
export class MicroHost {
constructor(url: string, tags?: string[], redirect?: string) {
this.url = url;
this.tags = tags;
this.redirect = redirect;
}
// https://regex101.com/r/ZR9rpp/1
@Matches(/^https?:\/\/[\d.:A-z{}-]+$/u)
url: string;
@IsString({ each: true })
@IsOptional()
tags?: string[];
@IsUrl({ require_protocol: true })
@IsOptional()
redirect?: string;
get normalised() {
return HostService.normaliseHostUrl(this.url);
}
get isWildcard() {
return this.url.includes('{{username}}');
}
private _pattern?: RegExp;
get pattern() {
if (this._pattern) return this._pattern;
this._pattern = MicroHost.getWildcardPattern(this.url);
return this._pattern;
}
static getWildcardPattern(url: string) {
const normalised = HostService.normaliseHostUrl(url);
const escaped = escapeString(normalised);
const pattern = escaped.replace('\\{\\{username\\}\\}', '(?<username>[a-z0-9-{}]+?)');
return new RegExp(`^(https?:\\/\\/)?${pattern}\\/?`, 'u');
}
}


@ -1,14 +0,0 @@
import bytes from 'bytes';
import { Transform } from 'class-transformer';
import { IsNumber } from 'class-validator';
import ms from 'ms';
export class MicroPurge {
@IsNumber()
@Transform(({ value }) => bytes.parse(value))
overLimit: number;
@IsNumber()
@Transform(({ value }) => ms(value))
afterTime: number;
}


@ -1,20 +1,100 @@
import { loadConfig } from '@ryanke/venera';
import { plainToClass } from 'class-transformer';
import { validateSync } from 'class-validator';
import { MicroConfig } from './classes/MicroConfig.js';
import bytes from 'bytes';
import c from 'chalk';
import { randomBytes } from 'crypto';
import dedent from 'dedent';
import escapeStringRegexp from 'escape-string-regexp';
import ms from 'ms';
import z, { any, array, boolean, number, record, strictObject, string, union } from 'zod';
import { fromZodError } from 'zod-validation-error';
import { expandMime } from './helpers/expand-mime.js';
import { HostService } from './modules/host/host.service.js';
export type MicroHost = ReturnType<typeof enhanceHost>;
const schema = strictObject({
databaseUrl: string().startsWith('postgresql://'),
secret: string().min(6),
inquiries: string().email(),
uploadLimit: string().transform(bytes.parse),
maxPasteLength: number().default(500000),
allowTypes: z
.union([array(string()), string()])
.optional()
.transform((value) => new Set(value ? expandMime(value) : [])),
storagePath: string(),
restrictFilesToHost: boolean().default(true),
purge: strictObject({
overLimit: string().transform(bytes.parse),
afterTime: string().transform(ms),
}).optional(),
email: strictObject({
from: string().email(),
smtp: record(string(), any()),
}).optional(),
conversions: array(
strictObject({
from: union([array(string()), string()]).transform((value) => new Set(expandMime(value))),
to: string(),
minSize: string().transform(bytes.parse).optional(),
}),
).optional(),
hosts: array(
strictObject({
url: z
.string()
.url()
.transform((value) => value.replace(/\/$/, '')),
tags: array(string()).optional(),
redirect: string().url().optional(),
}),
),
});
const data = loadConfig('micro');
const config = plainToClass(MicroConfig, data, { exposeDefaultValues: true });
const errors = validateSync(config, { forbidUnknownValues: true });
if (errors.length > 0) {
const clean = errors.map((error) => error.toString()).join('\n');
console.dir(config, { depth: null });
console.error(clean);
process.exit(1);
const result = schema.safeParse(data);
if (!result.success) {
console.dir({ data, error: result.error }, { depth: null });
const pretty = fromZodError(result.error);
throw new Error(pretty.toString());
}
if (config.rootHost.isWildcard) {
const getWildcardPattern = (url: string) => {
const normalised = HostService.normaliseHostUrl(url);
const escaped = escapeStringRegexp(normalised);
const pattern = escaped.replace('\\{\\{username\\}\\}', '(?<username>[a-z0-9-{}]+?)');
return new RegExp(`^(https?:\\/\\/)?${pattern}\\/?`, 'u');
};
const enhanceHost = (host: z.infer<typeof schema>['hosts'][0]) => {
const isWildcard = host.url.includes('{{username}}');
const normalised = HostService.normaliseHostUrl(host.url);
const pattern = getWildcardPattern(host.url);
return {
...host,
isWildcard,
normalised,
pattern,
};
};
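As a self-contained sketch of the wildcard handling above, with an inline stand-in for the `escape-string-regexp` package and an assumed `normaliseHostUrl` that strips the protocol and trailing slash (the real implementation lives in `HostService`):

```typescript
// Stand-in for the escape-string-regexp package (escapes regex metacharacters).
const escapeRegExp = (s: string): string => s.replace(/[.*+?^${}()|[\]\\]/g, "\\$&");

// Assumed behaviour of HostService.normaliseHostUrl: strip protocol and trailing slash.
const normaliseHostUrl = (url: string): string =>
  url.replace(/^https?:\/\//, "").replace(/\/$/, "");

// Same construction as getWildcardPattern above: the {{username}} placeholder
// becomes a named capture group in the compiled pattern.
const getWildcardPattern = (url: string): RegExp => {
  const normalised = normaliseHostUrl(url);
  const escaped = escapeRegExp(normalised);
  const pattern = escaped.replace("\\{\\{username\\}\\}", "(?<username>[a-z0-9-{}]+?)");
  return new RegExp(`^(https?:\\/\\/)?${pattern}\\/?`, "u");
};

const match = getWildcardPattern("https://{{username}}.example.com").exec("https://sylv.example.com/");
console.log(match?.groups?.username); // "sylv"
```

With a wildcard host like `https://{{username}}.example.com`, the pattern extracts the username from a concrete hostname, which is what `enhanceHost` exposes via `pattern`.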
export const config = result.data as Omit<z.infer<typeof schema>, 'hosts'>;
export const hosts = result.data.hosts.map(enhanceHost);
export const rootHost = hosts[0];
if (rootHost.isWildcard) {
throw new Error(`Root host cannot be a wildcard domain.`);
}
export { config };
const disallowed = new Set(['youshallnotpass', 'you_shall_not_pass', 'secret', 'test']);
if (disallowed.has(config.secret.toLowerCase())) {
const token = randomBytes(24).toString('hex');
throw new Error(
dedent`
${c.redBright.bold('Do not use the default secret.')}
Please generate a random, secure secret or you risk anyone being able to impersonate you.
If you're lazy, here is a random secret: ${c.underline(token)}
`,
);
}
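The error above hands the user a ready-made secret; generating one yourself is the same one-liner (24 random bytes rendered as hex):

```typescript
// Same call used above to suggest a replacement secret: 24 random bytes as hex.
import { randomBytes } from "node:crypto";

const generateSecret = (size = 24): string => randomBytes(size).toString("hex");

console.log(generateSecret()); // 48 hex characters
```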


@ -1,5 +1,5 @@
import { customAlphabet } from 'nanoid';
import blocklist from '../blocklist.json' assert { type: 'json' };
import blocklist from '../blocklist.json';
const contentIdLength = 6;
const paranoidIdLength = 12;


@ -2,8 +2,6 @@ import { fileTypeFromBuffer } from 'file-type';
import * as mimeType from 'mime-types';
import path from 'path';
import type { PassThrough } from 'stream';
// @ts-expect-error see tsconfig.json/paths
import { isBinary } from 'istextorbinary';
const DEFAULT_TYPE = 'application/octet-stream';


@ -1,76 +0,0 @@
import type { Connection, IMigrator } from '@mikro-orm/core';
import type { EntityManager } from '@mikro-orm/postgresql';
import { Logger } from '@nestjs/common';
import dedent from 'dedent';
const logger = new Logger('micro');
const migrationErrorWarning = dedent`
An old database for a previous version of micro was found.
To use this database with the new version, it must be migrated to the new format.
This can be done automatically, but first you need to create a database backup and ensure it works.
I am not kidding, do a backup now and make sure it works. If you skip this step and things go wrong, its on you.
Once you have a backup and you are sure that backup works, you can continue on the migration path.
To get started, start micro with the "MIGRATE_OLD_DATABASE" environment variable set to "true".
On startup the database will be migrated to the new format automatically, then startup will continue as normal.
If anything goes wrong during the migration, create a new issue on GitHub https://github.com/sylv/micro/issues/new immediately.
`;
const legacyMigrationWarning = dedent`
You have set "MIGRATE_OLD_DATABASE" to "true", so the old database will be migrated to the new format.
This may take some time, please be patient.
`;
export async function checkForOldDatabase(connection: Connection) {
logger.debug(`Checking for old database`);
const result = await connection.execute(
`SELECT EXISTS(SELECT FROM information_schema.tables WHERE table_schema = 'public' AND table_name = '_prisma_migrations')`
);
return result[0].exists;
}
// https://tenor.com/vGfQ.gif
export async function migrateOldDatabase(em: EntityManager, migrator: IMigrator) {
logger.debug(`Migrating old database`);
if (process.env.MIGRATE_OLD_DATABASE !== 'true') {
logger.error(migrationErrorWarning);
process.exit(1);
}
logger.warn(legacyMigrationWarning);
await em.transactional(async (em) => {
const trx = em.getTransactionContext();
const execute = (sql: string) =>
em
.createQueryBuilder('files')
.raw(sql)
.transacting(trx)
.then((result) => result);
await execute(`CREATE SCHEMA public_old`);
const tables = ['files', 'users', 'links', 'thumbnails', '_prisma_migrations'];
for (const table of tables) {
await execute(`ALTER TABLE "public"."${table}" SET SCHEMA "public_old"`);
}
await migrator.up({ transaction: trx });
await execute(`UPDATE public_old.users SET tags = array[]::text[] WHERE tags IS NULL`);
await execute(dedent`
INSERT INTO public.users (id, username, permissions, password, secret, tags)
SELECT id, username, permissions, password, secret, tags FROM public_old.users
`);
await execute(dedent`
INSERT INTO public.files (id, host, type, size, hash, name, owner_id, created_at)
SELECT id, host, type, size, hash, name, "ownerId", "createdAt" FROM public_old.files
`);
await execute(dedent`
INSERT INTO public.links (id, destination, host, clicks, created_at, owner_id)
SELECT id, destination, host, clicks, "createdAt", "ownerId" FROM public_old.links
`);
});
}


@ -1,11 +1,11 @@
/* eslint-disable sonarjs/no-duplicate-string */
import type { IdentifiedReference } from '@mikro-orm/core';
import { BeforeCreate, Entity, type EventArgs, Property } from '@mikro-orm/core';
import { BeforeCreate, Entity, Property, type EventArgs } from '@mikro-orm/core';
import { ObjectType } from '@nestjs/graphql';
import type { FastifyRequest } from 'fastify';
import { config } from '../config.js';
import type { ResourceLocations } from '../types/resource-locations.type.js';
import { config, hosts, rootHost } from '../config.js';
import type { User } from '../modules/user/user.entity.js';
import type { ResourceLocations } from '../types/resource-locations.type.js';
import { getHostFromRequest } from './get-host-from-request.js';
@Entity({ abstract: true })
@ -31,10 +31,10 @@ export abstract class Resource {
}
getHost() {
if (!this.hostname) return config.rootHost;
const match = config.hosts.find((host) => host.normalised === this.hostname || host.pattern.test(this.hostname!));
if (!this.hostname) return rootHost;
const match = hosts.find((host) => host.normalised === this.hostname || host.pattern.test(this.hostname!));
if (match) return match;
return config.rootHost;
return rootHost;
}
getBaseUrl() {
@ -42,7 +42,7 @@ export abstract class Resource {
const host = this.getHost();
const hasPlaceholder = host.url.includes('{{username}}');
if (hasPlaceholder) {
if (!owner) return config.rootHost.url;
if (!owner) return rootHost.url;
return host.url.replace('{{username}}', owner.username);
}
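In isolation, the placeholder substitution above behaves like this sketch (the fallback URL here is a hypothetical stand-in for the configured root host):

```typescript
// Mirrors the getBaseUrl logic above: wildcard host URLs substitute the owner's
// username into the {{username}} placeholder; without an owner, fall back to a
// root-host URL (placeholder value here).
const resolveBaseUrl = (
  hostUrl: string,
  username?: string,
  rootUrl = "https://root.example.com",
): string => {
  if (!hostUrl.includes("{{username}}")) return hostUrl;
  if (!username) return rootUrl;
  return hostUrl.replace("{{username}}", username);
};

console.log(resolveBaseUrl("https://{{username}}.example.com", "sylv")); // "https://sylv.example.com"
```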
@ -58,7 +58,7 @@ export abstract class Resource {
if (!config.restrictFilesToHost) return true;
// root host can send all files
if (hostname === config.rootHost.normalised) return true;
if (hostname === rootHost.normalised) return true;
if (this.hostname === hostname) return true;
if (this.hostname?.includes('{{username}}')) {
// old files have {{username}} in the persisted hostname, migrating them


@ -4,7 +4,7 @@ import { config } from '../config.js';
const transport = config.email && nodemailer.createTransport(config.email.smtp);
export const sendMail = (options: Omit<nodemailer.SendMailOptions, 'from'>) => {
if (!transport) {
if (!transport || !config.email) {
throw new Error('No SMTP configuration found');
}


@ -2,26 +2,20 @@ import type { Options } from '@mikro-orm/core';
import { MikroORM } from '@mikro-orm/core';
import type { EntityManager } from '@mikro-orm/postgresql';
import { Logger } from '@nestjs/common';
import { checkForOldDatabase, migrateOldDatabase } from './helpers/migrate-old-database.js';
import mikroOrmConfig, { MIGRATIONS_TABLE_NAME, ORM_LOGGER } from './orm.config.js';
const logger = new Logger('migrate');
export const migrate = async (
config: Options = mikroOrmConfig,
skipLock = process.env.SKIP_MIGRATION_LOCK === 'true'
skipLock = process.env.SKIP_MIGRATION_LOCK === 'true',
) => {
logger.debug(`Checking for and running migrations`);
const orm = await MikroORM.init(config);
const em = orm.em.fork({ clear: true }) as EntityManager;
const connection = em.getConnection();
const migrator = orm.getMigrator();
const oldDatabaseExists = await checkForOldDatabase(connection);
if (oldDatabaseExists) {
await migrateOldDatabase(em, migrator);
return;
}
const migrator = orm.getMigrator();
const executedMigrations = await migrator.getExecutedMigrations();
const pendingMigrations = await migrator.getPendingMigrations();
if (!pendingMigrations[0]) {


@ -1,6 +1,6 @@
import { Controller, Get, Req, UseGuards } from '@nestjs/common';
import type { FastifyRequest } from 'fastify';
import { config } from '../config.js';
import { config, hosts, rootHost } from '../config.js';
import { UserId } from './auth/auth.decorators.js';
import { OptionalJWTAuthGuard } from './auth/guards/optional-jwt.guard.js';
import { UserService } from './user/user.service.js';
@ -26,10 +26,10 @@ export class AppController {
allowTypes: config.allowTypes ? [...config.allowTypes?.values()] : undefined,
email: !!config.email,
rootHost: {
url: config.rootHost.url,
normalised: config.rootHost.normalised,
url: rootHost.url,
normalised: rootHost.normalised,
},
hosts: config.hosts
hosts: hosts
.filter((host) => {
if (!host.tags || !host.tags[0]) return true;
return host.tags.every((tag) => tags.includes(tag));


@ -1,7 +1,6 @@
import { UseGuards } from '@nestjs/common';
import { Query, Resolver } from '@nestjs/graphql';
import { MicroHost } from '../classes/MicroHost.js';
import { config } from '../config.js';
import { config, hosts, rootHost, type MicroHost } from '../config.js';
import type { ConfigHost } from '../types/config.type.js';
import { Config } from '../types/config.type.js';
import { CurrentHost, UserId } from './auth/auth.decorators.js';
@ -28,9 +27,9 @@ export class AppResolver {
uploadLimit: config.uploadLimit,
allowTypes: config.allowTypes ? [...config.allowTypes?.values()] : [],
requireEmails: !!config.email,
rootHost: this.filterHost(config.rootHost),
rootHost: this.filterHost(rootHost),
currentHost: this.filterHost(currentHost),
hosts: config.hosts
hosts: hosts
.filter((host) => {
if (!host.tags || !host.tags[0]) return true;
return host.tags.every((tag) => tags.includes(tag));


@ -4,7 +4,7 @@ import { UseGuards } from '@nestjs/common';
import { Args, Context, Mutation, Resolver } from '@nestjs/graphql';
import type { FastifyReply } from 'fastify';
import ms from 'ms';
import { config } from '../../config.js';
import { rootHost } from '../../config.js';
import { User } from '../user/user.entity.js';
import { UserId } from './auth.decorators.js';
import { AuthService, TokenType } from './auth.service.js';
@ -18,13 +18,13 @@ export class AuthResolver {
private static readonly COOKIE_OPTIONS = {
path: '/',
httpOnly: true,
domain: config.rootHost.normalised.split(':').shift(),
secure: config.rootHost.url.startsWith('https'),
domain: rootHost.normalised.split(':').shift(),
secure: rootHost.url.startsWith('https'),
};
constructor(
@InjectRepository(User) private readonly userRepo: EntityRepository<User>,
private readonly authService: AuthService
private readonly authService: AuthService,
) {}
@Mutation(() => User)
@ -32,7 +32,7 @@ export class AuthResolver {
@Context() ctx: any,
@Args('username') username: string,
@Args('password') password: string,
@Args('otpCode', { nullable: true }) otpCode?: string
@Args('otpCode', { nullable: true }) otpCode?: string,
) {
const reply = ctx.reply as FastifyReply;
const user = await this.authService.authenticateUser(username, password, otpCode);


@ -15,7 +15,7 @@ import {
UseGuards,
} from '@nestjs/common';
import type { FastifyReply, FastifyRequest } from 'fastify';
import { config } from '../../config.js';
import { rootHost } from '../../config.js';
import { UserId } from '../auth/auth.decorators.js';
import { JWTAuthGuard } from '../auth/guards/jwt.guard.js';
import { HostService } from '../host/host.service.js';
@ -31,14 +31,14 @@ export class FileController {
private readonly fileService: FileService,
private readonly userService: UserService,
private readonly hostService: HostService,
private readonly linkService: LinkService
private readonly linkService: LinkService,
) {}
@Get('file/:fileId')
async getFileContent(
@Res() reply: FastifyReply,
@Param('fileId') fileId: string,
@Request() request: FastifyRequest
@Request() request: FastifyRequest,
) {
return this.fileService.sendFile(fileId, request, reply);
}
@ -49,8 +49,8 @@ export class FileController {
@UserId() userId: string,
@Req() request: FastifyRequest,
@Headers('X-Micro-Paste-Shortcut') shortcut: string,
@Headers('x-micro-host') hosts = config.rootHost.url,
@Query('input') input?: string
@Headers('x-micro-host') hosts = rootHost.url,
@Query('input') input?: string,
) {
const user = await this.userService.getUser(userId, true);
const host = this.hostService.resolveUploadHost(hosts, user);


@ -13,8 +13,7 @@ import { DateTime } from 'luxon';
import mime from 'mime-types';
import sharp from 'sharp';
import { PassThrough } from 'stream';
import type { MicroHost } from '../../classes/MicroHost.js';
import { config } from '../../config.js';
import { config, type MicroHost } from '../../config.js';
import { generateContentId } from '../../helpers/generate-content-id.helper.js';
import { getStreamType } from '../../helpers/get-stream-type.helper.js';
import { HostService } from '../host/host.service.js';
@ -29,7 +28,7 @@ export class FileService implements OnApplicationBootstrap {
@InjectRepository('File') private readonly fileRepo: EntityRepository<File>,
private readonly storageService: StorageService,
private readonly hostService: HostService,
protected readonly orm: MikroORM
protected readonly orm: MikroORM,
) {}
async getFile(id: string, request: FastifyRequest) {
@ -45,7 +44,7 @@ export class FileService implements OnApplicationBootstrap {
multipart: MultipartFile,
request: FastifyRequest,
owner: User,
host: MicroHost | undefined
host: MicroHost | undefined,
): Promise<File> {
if (host) this.hostService.checkUserCanUploadTo(host, owner);
if (!request.headers['content-length']) throw new BadRequestException('Missing "Content-Length" header.');
@ -53,7 +52,7 @@ export class FileService implements OnApplicationBootstrap {
if (Number.isNaN(contentLength) || contentLength >= config.uploadLimit) {
const size = bytes.parse(Number(request.headers['content-length']));
this.logger.warn(
`User ${owner.id} tried uploading a ${size} file, which is over the configured upload size limit.`
`User ${owner.id} tried uploading a ${size} file, which is over the configured upload size limit.`,
);
throw new PayloadTooLargeException();


@ -1,6 +1,6 @@
import type { CanActivate, ExecutionContext } from '@nestjs/common';
import { BadRequestException, Injectable } from '@nestjs/common';
import { config } from '../../config.js';
import { hosts } from '../../config.js';
import { getRequest } from '../../helpers/get-request.js';
@Injectable()
@ -9,11 +9,11 @@ export class HostGuard implements CanActivate {
const request = getRequest(context);
const referer = request.headers.referer;
if (!referer) {
request.host = config.hosts[0];
request.host = hosts[0];
return true;
}
const host = config.hosts.find((host) => host.pattern.test(referer));
const host = hosts.find((host) => host.pattern.test(referer));
if (!host) throw new BadRequestException('Invalid host.');
request.host = host;
return true;


@ -1,7 +1,6 @@
import { BadRequestException, ForbiddenException } from '@nestjs/common';
import normalizeUrl from 'normalize-url';
import type { MicroHost } from '../../classes/MicroHost.js';
import { config } from '../../config.js';
import { hosts, rootHost, type MicroHost } from '../../config.js';
import { randomItem } from '../../helpers/random-item.helper.js';
import type { User } from '../user/user.entity.js';
@ -19,9 +18,9 @@ export class HostService {
* @throws if the host could not be resolved.
*/
getHostFrom(url: string | undefined, tags: string[] | null): MicroHost {
if (!url) return config.rootHost;
if (!url) return rootHost;
const normalised = HostService.normaliseHostUrl(url);
for (const host of config.hosts) {
for (const host of hosts) {
if (!host.pattern.test(normalised)) continue;
if (tags && host.tags) {
const hasTags = host.tags.every((tag) => tags.includes(tag));


@ -1,6 +1,6 @@
import { Entity, ManyToOne, OneToOne, OptionalProps, PrimaryKey, Property, type Ref } from '@mikro-orm/core';
import { Field, ID, ObjectType } from '@nestjs/graphql';
import { config } from '../../config.js';
import { rootHost } from '../../config.js';
import { generateDeleteKey } from '../../helpers/generate-delete-key.helper.js';
import { User } from '../user/user.entity.js';
@ -54,7 +54,7 @@ export class Invite {
@Property({ persist: false })
@Field(() => String)
get url() {
const url = new URL(config.rootHost.url);
const url = new URL(rootHost.url);
url.pathname = this.path;
return url;
}


@ -2,8 +2,8 @@ import { InjectRepository } from '@mikro-orm/nestjs';
import { EntityRepository } from '@mikro-orm/postgresql';
import { Injectable, NotFoundException, UnauthorizedException } from '@nestjs/common';
import type { FastifyRequest } from 'fastify';
import type { MicroHost } from '../../classes/MicroHost.js';
import { Link } from './link.entity.js';
import type { MicroHost } from '../../config.js';
@Injectable()
export class LinkService {


@ -1,6 +1,6 @@
import { Field, InputType } from '@nestjs/graphql';
import { IsEmail, IsLowercase, IsNotIn, IsOptional, IsString, MaxLength, MinLength } from 'class-validator';
import blocklist from '../../../blocklist.json' assert { type: 'json' };
import blocklist from '../../../blocklist.json';
@InputType()
export class CreateUserDto {


@ -7,7 +7,7 @@ import dedent from 'dedent';
import handlebars from 'handlebars';
import ms from 'ms';
import { nanoid } from 'nanoid';
import { config } from '../../config.js';
import { config, rootHost } from '../../config.js';
import type { Permission } from '../../constants.js';
import { generateContentId } from '../../helpers/generate-content-id.helper.js';
import { sendMail } from '../../helpers/send-mail.helper.js';
@ -38,7 +38,7 @@ export class UserService {
@InjectRepository(UserVerification) private readonly verificationRepo: EntityRepository<UserVerification>,
@InjectRepository(File) private readonly fileRepo: EntityRepository<File>,
@InjectRepository(Paste) private readonly pasteRepo: EntityRepository<Paste>,
private readonly inviteService: InviteService
private readonly inviteService: InviteService,
) {}
async getUser(id: string, verified: boolean) {
@ -61,7 +61,7 @@ export class UserService {
orderBy: {
createdAt: QueryOrder.DESC,
},
}
},
);
}
@ -76,7 +76,7 @@ export class UserService {
orderBy: {
createdAt: QueryOrder.DESC,
},
}
},
);
}
@ -111,7 +111,7 @@ export class UserService {
});
user.verifications.add(verification);
const verifyUrl = `${config.rootHost.url}/api/user/${verification.user.id}/verify/${verification.id}`;
const verifyUrl = `${rootHost.url}/api/user/${verification.user.id}/verify/${verification.id}`;
const html = UserService.EMAIL_TEMPLATE({ verifyUrl });
await sendMail({
to: user.email,
@ -164,7 +164,7 @@ export class UserService {
$gt: new Date(),
},
},
{ populate: ['user'] }
{ populate: ['user'] },
);
if (!verification) {


@ -1,38 +1,13 @@
{
"include": ["src", "types"],
"extends": "@atlasbot/configs/tsconfig/esm.json",
"include": ["src"],
"compilerOptions": {
// https://www.npmjs.com/package/@tsconfig/node18
// https://github.com/sindresorhus/tsconfig/blob/main/tsconfig.json
"moduleDetection": "force",
"allowSyntheticDefaultImports": true,
"resolveJsonModule": true,
"lib": ["es2022"],
// "module": "CommonJS",
// "moduleResolution": "node",
"module": "NodeNext",
"moduleResolution": "nodenext",
"target": "es2022",
"strict": true,
"esModuleInterop": true,
"skipLibCheck": true,
"forceConsistentCasingInFileNames": true,
"declaration": true,
"pretty": true,
"newLine": "lf",
"stripInternal": true,
"noImplicitOverride": false,
"noUnusedLocals": true,
"noFallthroughCasesInSwitch": true,
"outDir": "dist",
"noUncheckedIndexedAccess": false,
"strictPropertyInitialization": false,
"noImplicitOverride": false,
"emitDecoratorMetadata": true,
"experimentalDecorators": true,
"baseUrl": ".",
"paths": {
// https://github.com/bevry/istextorbinary/issues/270
// voids the bad types or else ncc fails
"istextorbinary": ["*.sink.d.ts"]
}
"lib": ["es2021", "dom"]
}
}


@ -1,12 +1,11 @@
module.exports = {
extends: require.resolve('@sylo-digital/scripts/eslint/react'),
ignorePatterns: ['**/generated/**'],
extends: require.resolve('@atlasbot/configs/eslint/next'),
parserOptions: {
project: './tsconfig.json',
},
rules: {
'@typescript-eslint/consistent-type-assertions': 'off',
'storybook/no-title-property-in-meta': 'off',
'@typescript-eslint/no-floating-promises': 'off',
'jsx-a11y/no-autofocus': 'off',
'jsx-a11y/media-has-caption': 'off',
'@typescript-eslint/no-confusing-void-expression': 'off',
'@typescript-eslint/no-unnecessary-type-assertion': 'off',
'@typescript-eslint/no-misused-promises': 'off',
},
};


@ -1,59 +1,59 @@
{
"name": "@ryanke/micro-web",
"version": "1.0.0",
"license": "GPL-3.0",
"repository": "https://github.com/sylv/micro.git",
"author": "Ryan <ryan@sylver.me>",
"license": "GPL-3.0",
"private": true,
"engine": {
"node": ">=16"
"engines": {
"node": ">=20"
},
"scripts": {
"watch": "NODE_ENV=development concurrently \"next dev\" \"pnpm generate --watch\"",
"build": "NODE_ENV=production next build",
"generate": "graphql-codegen --config codegen.yml",
"lint": "NODE_ENV=production next lint",
"generate": "graphql-codegen --config codegen.yml"
"watch": "NODE_ENV=development concurrently \"next dev\" \"pnpm generate --watch\""
},
"dependencies": {
"@apollo/client": "^3.8.1",
"@apollo/client": "^3.8.8",
"@headlessui/react": "^1.7.17",
"@ryanke/pandora": "^0.0.9",
"@tailwindcss/typography": "^0.5.9",
"autoprefixer": "^10.4.15",
"clsx": "^2.0.0",
"concurrently": "^8.2.1",
"@tailwindcss/typography": "^0.5.10",
"autoprefixer": "^10.4.16",
"clsx": "^2.1.0",
"concurrently": "^8.2.2",
"copy-to-clipboard": "^3.3.3",
"dayjs": "^1.11.9",
"dayjs": "^1.11.10",
"deepmerge": "^4.3.1",
"formik": "^2.4.3",
"formik": "^2.4.5",
"generate-avatar": "1.4.10",
"graphql": "^16.8.0",
"http-status-codes": "^2.2.0",
"graphql": "^16.8.1",
"http-status-codes": "^2.3.0",
"lodash": "^4.17.21",
"nanoid": "^4.0.2",
"next": "13.4.19",
"postcss": "^8.4.29",
"prism-react-renderer": "^2.0.6",
"next": "14.0.4",
"postcss": "^8.4.33",
"prism-react-renderer": "^2.3.1",
"qrcode.react": "^3.1.0",
"react": "18.2.0",
"react-dom": "^18.1.0",
"react-feather": "^2.0.9",
"react-markdown": "^8.0.7",
"remark-gfm": "^3.0.1",
"swr": "^2.2.2",
"tailwindcss": "^3.3.3",
"yup": "^1.2.0"
"react-markdown": "^9.0.1",
"remark-gfm": "^4.0.0",
"swr": "^2.2.4",
"tailwindcss": "^3.4.1",
"yup": "^1.3.3"
},
"devDependencies": {
"@atlasbot/configs": "^10.5.14",
"@graphql-codegen/cli": "^5.0.0",
"@graphql-codegen/typescript": "4.0.1",
"@graphql-codegen/typescript-operations": "4.0.1",
"@graphql-codegen/typescript-react-apollo": "3.3.7",
"@sylo-digital/scripts": "^1.0.12",
"@types/lodash": "^4.14.197",
"@types/node": "^18.15.11",
"@types/react": "^18.2.21",
"prettier": "^3.0.3",
"typescript": "^5.2.2"
"@graphql-codegen/typescript-react-apollo": "4.1.0",
"@parcel/watcher": "^2.3.0",
"@types/lodash": "^4.14.202",
"@types/node": "^20.10.6",
"@types/react": "^18.2.47",
"prettier": "^3.1.1",
"typescript": "^5.3.3"
}
}


@ -14,9 +14,9 @@ let globalClient: ApolloClient<NormalizedCacheObject> | undefined;
const errorLink = onError(({ graphQLErrors, networkError }) => {
if (graphQLErrors) {
graphQLErrors.forEach(({ message, locations, path }) =>
console.log(`[GraphQL error]: Message: ${message}, Location: ${locations}, Path: ${path}`)
);
for (const { message, locations, path } of graphQLErrors) {
console.log(`[GraphQL error]: Message: ${message}, Location: ${locations}, Path: ${path}`);
}
}
if (networkError) {
console.log(`[Network error]: ${networkError.message}`);


@ -1,3 +1,5 @@
/* eslint-disable jsx-a11y/no-static-element-interactions */
/* eslint-disable jsx-a11y/click-events-have-key-events */
import { Menu } from '@headlessui/react';
import clsx from 'clsx';
import type { FC, ReactNode } from 'react';


@ -39,9 +39,11 @@ EmbedImage.embeddable = (data: Embeddable) => {
case 'image/svg+xml':
case 'image/webp':
case 'image/bmp':
case 'image/tiff':
case 'image/tiff': {
return true;
default:
}
default: {
return false;
}
}
};


@ -23,9 +23,11 @@ EmbedVideo.embeddable = (data: Embeddable) => {
switch (data.type) {
case 'video/mp4':
case 'video/webm':
case 'video/ogg':
case 'video/ogg': {
return true;
default:
}
default: {
return false;
}
}
};


@ -1,6 +1,6 @@
import clsx from 'clsx';
import type { Language } from 'prism-react-renderer';
import { Fragment, memo } from 'react';
import { Children, Fragment, memo } from 'react';
import ReactMarkdown from 'react-markdown';
import remarkGfm from 'remark-gfm';
import { SyntaxHighlighter } from './syntax-highlighter/syntax-highlighter';
@ -18,7 +18,7 @@ export const Markdown = memo<{ children: string; className?: string }>(({ childr
'prose-blockquote:font-normal prose-blockquote:not-italic',
// make inline `code` blocks purple
'prose-code:text-primary',
className
className,
);
return (
@ -32,10 +32,15 @@ export const Markdown = memo<{ children: string; className?: string }>(({ childr
// prism syntax highlighter. so this just doesnt render the pre tag.
return <Fragment>{children}</Fragment>;
},
code({ inline, className, children, ...rest }) {
const languageMatch = !inline && className && LANGUAGE_REGEX.exec(className);
const text = languageMatch ? children.filter((child) => typeof child === 'string').join(' ') : null;
if (inline || !languageMatch || !text) {
code({ className, children, ...rest }) {
const languageMatch = className && LANGUAGE_REGEX.exec(className);
const text = languageMatch
? Children.toArray(children)
.filter((child) => typeof child === 'string')
.join(' ')
: null;
if (!languageMatch || !text) {
return (
<code className={className} {...rest}>
{children}
@ -45,7 +50,7 @@ export const Markdown = memo<{ children: string; className?: string }>(({ childr
const language = languageMatch.groups!.language as Language;
return (
<SyntaxHighlighter language={language} className={className} {...rest}>
<SyntaxHighlighter language={language} className={className} {...(rest as any)}>
{text}
</SyntaxHighlighter>
);


@ -24,7 +24,7 @@ export const SyntaxHighlighter = memo<SyntaxHighlighterProps>(
const containerClasses = clsx(
'text-left overflow-x-auto h-full relative',
highlighterClasses,
additionalClasses
additionalClasses,
);
return (
@ -43,5 +43,5 @@ export const SyntaxHighlighter = memo<SyntaxHighlighterProps>(
}}
</Highlight>
);
}
},
);


@ -52,21 +52,17 @@ export const FileList: FC = () => {
{!source.data && <PageLoader />}
{filter === 'files' && (
<div className="grid grid-cols-2 gap-4 md:grid-cols-4 lg:grid-cols-6">
{files.data?.user.files.edges.map(({ node }) => (
<FileCard key={node.id} file={node} />
))}
{files.data?.user.files.edges.map(({ node }) => <FileCard key={node.id} file={node} />)}
</div>
)}
{filter === 'pastes' && (
<div className="grid grid-cols-1 md:grid-cols-2 gap-4">
{pastes.data?.user.pastes.edges.map(({ node }) => (
<PasteCard key={node.id} paste={node} />
))}
{pastes.data?.user.pastes.edges.map(({ node }) => <PasteCard key={node.id} paste={node} />)}
</div>
)}
{!source.loading && !hasContent && (
<Card className="text-gray-500">
You haven't uploaded anything yet. Once you upload something, it will appear here.
You haven&apos;t uploaded anything yet. Once you upload something, it will appear here.
</Card>
)}
</div>


@ -1,5 +1,6 @@
import { gql } from '@apollo/client';
import * as Apollo from '@apollo/client';
export type Maybe<T> = T | null;
export type InputMaybe<T> = Maybe<T>;
export type Exact<T extends { [key: string]: unknown }> = { [K in keyof T]: T[K] };
@ -678,8 +679,15 @@ export function useGetFilesLazyQuery(baseOptions?: Apollo.LazyQueryHookOptions<G
const options = { ...defaultOptions, ...baseOptions };
return Apollo.useLazyQuery<GetFilesQuery, GetFilesQueryVariables>(GetFilesDocument, options);
}
export function useGetFilesSuspenseQuery(
baseOptions?: Apollo.SuspenseQueryHookOptions<GetFilesQuery, GetFilesQueryVariables>,
) {
const options = { ...defaultOptions, ...baseOptions };
return Apollo.useSuspenseQuery<GetFilesQuery, GetFilesQueryVariables>(GetFilesDocument, options);
}
export type GetFilesQueryHookResult = ReturnType<typeof useGetFilesQuery>;
export type GetFilesLazyQueryHookResult = ReturnType<typeof useGetFilesLazyQuery>;
export type GetFilesSuspenseQueryHookResult = ReturnType<typeof useGetFilesSuspenseQuery>;
export type GetFilesQueryResult = Apollo.QueryResult<GetFilesQuery, GetFilesQueryVariables>;
export const GetPastesDocument = gql`
query GetPastes($first: Float, $after: String) {
@ -727,8 +735,15 @@ export function useGetPastesLazyQuery(
const options = { ...defaultOptions, ...baseOptions };
return Apollo.useLazyQuery<GetPastesQuery, GetPastesQueryVariables>(GetPastesDocument, options);
}
export function useGetPastesSuspenseQuery(
baseOptions?: Apollo.SuspenseQueryHookOptions<GetPastesQuery, GetPastesQueryVariables>,
) {
const options = { ...defaultOptions, ...baseOptions };
return Apollo.useSuspenseQuery<GetPastesQuery, GetPastesQueryVariables>(GetPastesDocument, options);
}
export type GetPastesQueryHookResult = ReturnType<typeof useGetPastesQuery>;
export type GetPastesLazyQueryHookResult = ReturnType<typeof useGetPastesLazyQuery>;
export type GetPastesSuspenseQueryHookResult = ReturnType<typeof useGetPastesSuspenseQuery>;
export type GetPastesQueryResult = Apollo.QueryResult<GetPastesQuery, GetPastesQueryVariables>;
export const ConfigDocument = gql`
query Config {
@ -775,8 +790,15 @@ export function useConfigLazyQuery(baseOptions?: Apollo.LazyQueryHookOptions<Con
const options = { ...defaultOptions, ...baseOptions };
return Apollo.useLazyQuery<ConfigQuery, ConfigQueryVariables>(ConfigDocument, options);
}
export function useConfigSuspenseQuery(
baseOptions?: Apollo.SuspenseQueryHookOptions<ConfigQuery, ConfigQueryVariables>,
) {
const options = { ...defaultOptions, ...baseOptions };
return Apollo.useSuspenseQuery<ConfigQuery, ConfigQueryVariables>(ConfigDocument, options);
}
export type ConfigQueryHookResult = ReturnType<typeof useConfigQuery>;
export type ConfigLazyQueryHookResult = ReturnType<typeof useConfigLazyQuery>;
export type ConfigSuspenseQueryHookResult = ReturnType<typeof useConfigSuspenseQuery>;
export type ConfigQueryResult = Apollo.QueryResult<ConfigQuery, ConfigQueryVariables>;
export const GetUserDocument = gql`
query GetUser {
@ -811,8 +833,15 @@ export function useGetUserLazyQuery(baseOptions?: Apollo.LazyQueryHookOptions<Ge
const options = { ...defaultOptions, ...baseOptions };
return Apollo.useLazyQuery<GetUserQuery, GetUserQueryVariables>(GetUserDocument, options);
}
export function useGetUserSuspenseQuery(
baseOptions?: Apollo.SuspenseQueryHookOptions<GetUserQuery, GetUserQueryVariables>,
) {
const options = { ...defaultOptions, ...baseOptions };
return Apollo.useSuspenseQuery<GetUserQuery, GetUserQueryVariables>(GetUserDocument, options);
}
export type GetUserQueryHookResult = ReturnType<typeof useGetUserQuery>;
export type GetUserLazyQueryHookResult = ReturnType<typeof useGetUserLazyQuery>;
export type GetUserSuspenseQueryHookResult = ReturnType<typeof useGetUserSuspenseQuery>;
export type GetUserQueryResult = Apollo.QueryResult<GetUserQuery, GetUserQueryVariables>;
export const LoginDocument = gql`
mutation Login($username: String!, $password: String!, $otp: String) {
@ -1069,8 +1098,15 @@ export function useGetFileLazyQuery(baseOptions?: Apollo.LazyQueryHookOptions<Ge
const options = { ...defaultOptions, ...baseOptions };
return Apollo.useLazyQuery<GetFileQuery, GetFileQueryVariables>(GetFileDocument, options);
}
export function useGetFileSuspenseQuery(
baseOptions?: Apollo.SuspenseQueryHookOptions<GetFileQuery, GetFileQueryVariables>,
) {
const options = { ...defaultOptions, ...baseOptions };
return Apollo.useSuspenseQuery<GetFileQuery, GetFileQueryVariables>(GetFileDocument, options);
}
export type GetFileQueryHookResult = ReturnType<typeof useGetFileQuery>;
export type GetFileLazyQueryHookResult = ReturnType<typeof useGetFileLazyQuery>;
export type GetFileSuspenseQueryHookResult = ReturnType<typeof useGetFileSuspenseQuery>;
export type GetFileQueryResult = Apollo.QueryResult<GetFileQuery, GetFileQueryVariables>;
export const DeleteFileDocument = gql`
mutation DeleteFile($fileId: ID!, $deleteKey: String) {
@ -1141,8 +1177,15 @@ export function useGetInviteLazyQuery(
const options = { ...defaultOptions, ...baseOptions };
return Apollo.useLazyQuery<GetInviteQuery, GetInviteQueryVariables>(GetInviteDocument, options);
}
export function useGetInviteSuspenseQuery(
baseOptions?: Apollo.SuspenseQueryHookOptions<GetInviteQuery, GetInviteQueryVariables>,
) {
const options = { ...defaultOptions, ...baseOptions };
return Apollo.useSuspenseQuery<GetInviteQuery, GetInviteQueryVariables>(GetInviteDocument, options);
}
export type GetInviteQueryHookResult = ReturnType<typeof useGetInviteQuery>;
export type GetInviteLazyQueryHookResult = ReturnType<typeof useGetInviteLazyQuery>;
export type GetInviteSuspenseQueryHookResult = ReturnType<typeof useGetInviteSuspenseQuery>;
export type GetInviteQueryResult = Apollo.QueryResult<GetInviteQuery, GetInviteQueryVariables>;
export const CreateUserDocument = gql`
mutation CreateUser($user: CreateUserDto!) {
@ -1261,8 +1304,15 @@ export function useGetPasteLazyQuery(baseOptions?: Apollo.LazyQueryHookOptions<G
const options = { ...defaultOptions, ...baseOptions };
return Apollo.useLazyQuery<GetPasteQuery, GetPasteQueryVariables>(GetPasteDocument, options);
}
export function useGetPasteSuspenseQuery(
baseOptions?: Apollo.SuspenseQueryHookOptions<GetPasteQuery, GetPasteQueryVariables>,
) {
const options = { ...defaultOptions, ...baseOptions };
return Apollo.useSuspenseQuery<GetPasteQuery, GetPasteQueryVariables>(GetPasteDocument, options);
}
export type GetPasteQueryHookResult = ReturnType<typeof useGetPasteQuery>;
export type GetPasteLazyQueryHookResult = ReturnType<typeof useGetPasteLazyQuery>;
export type GetPasteSuspenseQueryHookResult = ReturnType<typeof useGetPasteSuspenseQuery>;
export type GetPasteQueryResult = Apollo.QueryResult<GetPasteQuery, GetPasteQueryVariables>;
export const ShortenDocument = gql`
mutation Shorten($link: String!, $host: String) {


@ -1,3 +1,4 @@
/* eslint-disable unicorn/prefer-code-point */
const ENCRYPTION_ALGORITHM = 'AES-GCM';
const ENCRYPTION_LENGTH = 256;
@ -36,7 +37,7 @@ export async function encryptContent(content: string): Promise<EncryptionResult>
length: ENCRYPTION_LENGTH,
},
true,
['encrypt', 'decrypt']
['encrypt', 'decrypt'],
);
const encryptedContent = await crypto.subtle.encrypt(
@ -45,7 +46,7 @@ export async function encryptContent(content: string): Promise<EncryptionResult>
iv,
},
key,
new TextEncoder().encode(content)
new TextEncoder().encode(content),
);
const ivString = arrayBufferToBase64(iv);
@ -68,7 +69,7 @@ export async function decryptContent(data: EncryptionResult): Promise<string> {
name: ENCRYPTION_ALGORITHM,
},
true,
['encrypt', 'decrypt']
['encrypt', 'decrypt'],
);
const decryptedContent = await crypto.subtle.decrypt(
@ -77,7 +78,7 @@ export async function decryptContent(data: EncryptionResult): Promise<string> {
iv: base64ToArrayBuffer(iv),
},
key,
base64ToArrayBuffer(encryptedContent)
base64ToArrayBuffer(encryptedContent),
);
return new TextDecoder().decode(decryptedContent);


@ -16,7 +16,7 @@ export const useQueryState = <S>(key: string, initialState?: S, parser?: (input:
const route = new URL(window.location.href);
if (value === initialState) route.searchParams.delete(key);
else route.searchParams.set(key, `${value}`);
history.replaceState(null, '', route.toString());
history.replaceState(window.history.state, '', route.toString());
}, [value, initialState, key]);
return [value, setValue] as const;


@ -120,7 +120,7 @@ export default function Generate() {
such as Google Authenticator and Authy.
</p>
<p className="text-xs text-gray-600">
If you can't scan the QR code, you can enter the code{' '}
If you can&apos;t scan the QR code, you can enter the code{' '}
<code className="text-purple-400">{result.generateOTP.secret}</code> manually.
</p>
</div>
@ -148,7 +148,7 @@ export default function Generate() {
disabled={currentStep === 0}
className={clsx(
`text-gray-400 flex items-center gap-1 hover:underline`,
currentStep === 0 && 'opacity-0 pointer-events-none'
currentStep === 0 && 'opacity-0 pointer-events-none',
)}
>
<ChevronLeft className="h-4 w-4" /> Back


@ -22,13 +22,13 @@ const FileOption: FC<{ children: ReactNode; className?: string; onClick: () => v
}) => {
const classes = clsx(
'flex items-center gap-2 shrink-0 transition-colors duration-100 hover:text-gray-300',
className
className,
);
return (
<span className={classes} onClick={onClick}>
<button className={classes} onClick={onClick}>
{children}
</span>
</button>
);
};


@ -41,7 +41,7 @@ export default function ViewPaste() {
setBurnUnless(burnUnless);
url.searchParams.delete('burn_unless');
window.history.replaceState(null, '', url.href);
window.history.replaceState(window.history.state, '', url.href);
}, [router]);
useEffect(() => {


@ -157,15 +157,15 @@ export default function Paste() {
placeholder="Markdown, code or plain text"
/>
<div className="flex gap-2 justify-end flex-wrap">
<label className="flex gap-2 items-center">
<label className="flex gap-2 items-center" htmlFor="burn">
<Checkbox id="burn" />
<span className="truncate">Destroy after viewing</span>
</label>
<label className="flex gap-2 items-center">
<label className="flex gap-2 items-center" htmlFor="paranoid">
<Checkbox id="paranoid" />
Long ID
</label>
<label className="flex gap-2 items-center">
<label className="flex gap-2 items-center" htmlFor="encrypt">
<Checkbox id="encrypt" />
Encrypt
</label>


@ -133,14 +133,14 @@ export default function Upload() {
</Select>
<Button onClick={handleUpload}>Upload</Button>
</div>
<span
<button
className="mt-4 cursor-pointer text-primary"
onClick={() => {
setFile(null);
}}
>
Cancel
</span>
</button>
</Card>
</Container>
);

File diff suppressed because it is too large