Compare commits


383 Commits

Author SHA1 Message Date
Gauvino 6248783a3f
Merge branch 'dev/0.15.0' into quality-check 2024-03-30 21:25:57 +01:00
Sylvie e026dfdd64
dummy for [docker build] 2023-12-22 15:00:37 -07:00
Sylvie 0b63fd6821
build: update pnpm-lock.yaml (remove package-lock.json) 2023-12-22 15:00:19 -07:00
Sylvie f0dd773c78
Merge pull request #250 from WilliamDavidHarrison/dev/0.15.0 2023-12-21 22:38:17 -07:00
William Harrison f53bb10581 feat: use william.js for discord validation 2023-12-22 10:41:22 +08:00
William Harrison ee32fb6b34 Update utils.ts 2023-12-22 09:18:22 +08:00
William Harrison 2e3fe23e72 chore: single apostrophe 2023-12-22 09:17:54 +08:00
William Harrison ad37dca7a1 chore: remove unused import 2023-12-22 09:16:33 +08:00
William Harrison 26c8e2f5f5 chore: switch to axios 2023-12-22 09:14:55 +08:00
William Harrison 58fd55cfa6 feat: discord webhook support + custom id func 2023-12-22 08:06:17 +08:00
Sylvie 3f685ef067
docs: fix: follow redirect 2023-12-05 13:22:30 -07:00
Sylvie b472ed235e
docs: use new docker deploy method 2023-12-05 13:19:21 -07:00
Sylvie 9dce54d850
build: add docs script to add compose redir 2023-12-05 13:19:08 -07:00
Sylvie 123a397538
docs: fix: incorrect url 2023-12-05 13:17:59 -07:00
Sylvie 3028123467
docs: remove dead links 2023-12-04 16:52:44 -07:00
Sylvie 1bb79c9107
docs: migrate some data from PR 2023-12-04 16:49:11 -07:00
Sylvie 4727913b31
docs: added local install notes 2023-12-04 16:31:38 -07:00
Sylvie e23133b170
docs: added docker install notes 2023-12-04 16:31:33 -07:00
Sylvie 4b0436893b
docs: update wording for alternatives 2023-12-04 16:17:46 -07:00
Sylvie 0befb0fae6
docs: add warning to install page 2023-12-04 16:17:33 -07:00
Sylvie 3e4bef7e17
docs: fix: incorrect markdown syntax 2023-12-04 16:17:16 -07:00
Sylvie 770f78d3ed
docs: mention future alternative install types 2023-12-04 16:08:28 -07:00
Sylvie f4f5997a0b
feat: migrate `check_tool`
Co-authored-by: NotAShelf <raf@notashelf.dev>
2023-12-04 16:02:17 -07:00
Sylvie 9205f18b28
refactor: rename old flameshot script 2023-12-04 15:51:36 -07:00
Sylvie 504f29db87
chore: add .wrangler/ to .gitignore 2023-12-04 15:45:04 -07:00
Sylvie 47627dfa12
docs: update hero actions 2023-12-04 15:44:50 -07:00
Sylvie c07cbad45b
docs: set branch 2023-12-04 15:44:10 -07:00
Sylvie 5536478de3
docs: update template and description 2023-12-04 15:44:02 -07:00
Sylvie 343f0f1c7a
docs: added edit link 2023-12-04 15:42:56 -07:00
Sylvie c3111d296f
docs: added footer 2023-12-04 15:42:15 -07:00
Sylvie e8169e7b71
build: small update 2023-12-04 15:31:11 -07:00
Sylvie 1e84372dff
dummy to fix pages build with new script 2023-12-04 11:37:16 -07:00
Sylvie c19055c190
build: update packages 2023-12-04 11:35:12 -07:00
Sylvie 2545baca0d
build: synchronize pnpm lockfile 2023-12-04 11:34:20 -07:00
Sylvie e350ae3939
build: remove `package-lock.json` (deploy via `pnpm`) 2023-12-04 11:34:01 -07:00
Sylvie 8f326d59b5
build: renamed and rearranged npm scripts; added `dev:docs` script 2023-12-04 11:32:15 -07:00
Sylvie 5c5d20f6a3
build: add wrangler dev dependency (for documentation) 2023-12-04 11:32:10 -07:00
Sylvie 7e15d29e97
docs: add logo and meta 2023-12-04 11:12:34 -07:00
Sylvie 0821954f1f
docs: add Aiven-specific MySQL instructions 2023-12-04 11:04:42 -07:00
Sylvie 2d55579cb7
docs: add Configure links 2023-12-04 11:04:29 -07:00
Sylvie 21095ed5f8
MVP.231204.1 [docker build] 2023-12-04 09:03:31 -07:00
Sylvie dd050b6d37
fix: error handling 2023-12-04 08:54:53 -07:00
Sylvie 7cd56bc4ed
fix: accept missing on initial creation 2023-12-04 08:54:46 -07:00
Sylvie c228a9f6c3
fix: use simpler types & normalize within db implementations 2023-12-04 08:54:21 -07:00
Sylvie fa38f14a0c
refactor: move these types to `types.d.ts` 2023-12-03 17:31:14 -07:00
Sylvie 495bbcf1db
more todo 2023-12-03 17:21:55 -07:00
Sylvie 0abf3a2188
fix: try to reconcile types 2023-12-03 17:21:47 -07:00
Sylvie 778f25761d
todo 2023-12-03 17:10:00 -07:00
Sylvie 2a6e78356f
fix: missing error logs in api 2023-12-03 17:03:13 -07:00
Sylvie 11d6f302f2
fix: mysql not setting `_ready` 2023-12-03 17:00:28 -07:00
Sylvie fb1c2695c8
fix: use `hashtag` instead of `number` 2023-12-03 16:57:59 -07:00
Sylvie c8ee9a2985
fix: log user create error locally 2023-12-03 16:57:49 -07:00
Sylvie 7791af03c1
feat: add port to MySQL config 2023-12-03 16:57:30 -07:00
Sylvie 7126ad957a
docs: update homepage title 2023-12-03 14:48:00 -07:00
Sylvie e83623dc7f
docs: switch to dropdown menu 2023-12-03 14:39:23 -07:00
Sylvie 56219d046a
Merge pull request #248 from tycrek/vitepress-docs 2023-12-03 14:29:13 -07:00
Sylvie bf8a2a2f87
docs: add basic dir structure for install and customize 2023-12-03 14:24:47 -07:00
Sylvie 010318e94e
docs: added index page (and vendor examples) 2023-12-03 14:24:30 -07:00
Sylvie fec5f48822
build: add VitePress scripts 2023-12-03 14:24:04 -07:00
Sylvie f367b3292e
docs: configure VitePress 2023-12-03 14:23:53 -07:00
Sylvie a8cf170242
build: forgot pnpm lockfile 2023-12-03 14:22:36 -07:00
Sylvie a61f3b45f5
chore: update .gitignore 2023-12-03 14:22:21 -07:00
Sylvie 1f3f8982a0
build: add VitePress/vue 2023-12-03 14:22:12 -07:00
Sylvie 2d4a003b00
build: fix `rm` command 2023-11-27 10:41:07 -07:00
Sylvie 5eff78c1da
refactor: update all imports for ESM 2023-11-27 10:40:39 -07:00
Sylvie 0c3939154e
build: switch to ESM mode 2023-11-27 10:32:07 -07:00
Sylvie 1f98b2762c
build: remove unnecessary `npm` engine 2023-11-27 10:31:52 -07:00
Sylvie 36870295c4
build: use `dist/` subdirs for output now 2023-11-27 10:23:27 -07:00
Sylvie 4c68159b70
Merge branch 'master' into dev/0.15.0 2023-11-26 12:14:41 -07:00
Sylvie afd0e2bd62
build: update packages 2023-11-26 11:42:58 -07:00
Josh Moore 35c9374fd7
build: remove legacy scripts 2023-11-05 12:27:48 -07:00
Josh Moore 076f3b82bb
build: no longer use implicit script calls 2023-11-05 12:26:31 -07:00
Josh Moore c1d14702ed
Merge branch 'master' into dev/0.15.0 2023-11-05 12:03:48 -07:00
Josh Moore efdb10b482
fix: look for server.json in .ass-data dir [docker build] 2023-11-03 18:32:23 -06:00
Josh Moore 1fa55dba4e
fix/ci: remove armv7 because it takes years [docker build] 2023-11-02 12:13:20 -06:00
Josh Moore 347b80e5c9
Merge pull request #245 from XWasHere/dev/0.15.0 2023-11-02 11:59:27 -06:00
Josh Moore 41d683be0d
Merge branch 'dev/0.15.0' into dev/0.15.0 2023-11-02 11:56:33 -06:00
X 8dcb966760
Merge pull request #2 from tycrek/x-dev-15
a couple smol things
2023-11-02 07:52:34 -04:00
Josh Moore 1ffcc02ca9
docs: remove old readme sections 2023-11-02 01:23:04 -06:00
Josh Moore 173eaae560
docs: update README 2023-11-02 01:16:07 -06:00
Josh Moore 21f9bef6b2
feat: added helper dev container build script 2023-11-02 01:00:02 -06:00
Josh Moore 9083df9315
chore: make consistent with "Checking data files" 2023-11-02 00:26:53 -06:00
Josh Moore 3e26d4da65
fix: remove unused PATHS variable (moved to other file) 2023-11-01 23:59:54 -06:00
Josh Moore fd578d8359
fix: remove duplicate sections 2023-11-01 23:58:50 -06:00
Josh Moore 4457d32205
fix: move semicolons 2023-11-01 23:49:39 -06:00
Josh Moore ec97d0a3b6
feat: inform host that DB has not been set 2023-11-01 23:46:46 -06:00
Josh Moore c81541b667
fix: only ensure dir on start; files can be done by JSON DB 2023-11-01 23:45:45 -06:00
Josh Moore 6e7ac81ed4
chore: organize imports 2023-11-01 23:39:07 -06:00
xwashere fa322e4166
Merge branch 'dev/0.15.0' of https://github.com/XWasHere/ass into dev/0.15.0 2023-10-28 13:43:18 -04:00
X 387fd5ebfa
Merge pull request #1 from tycrek/x-dev-15
resolves package.json conflicts
2023-10-28 12:29:13 -04:00
Josh Moore df5510d01e fix: update pnpm lock 2023-10-27 17:26:46 -06:00
Josh Moore f2f44bee1f Merge branch 'dev/0.15.0' into x-dev-15 2023-10-27 17:08:20 -06:00
Josh Moore da8494c511 dummy for [docker build] 2023-10-27 16:31:24 -06:00
Josh Moore 942c2c258f chore: use pnpm in workflow 2023-10-27 16:31:02 -06:00
Josh Moore 95a3fea42b build/docker: use pnpm instead of npm 2023-10-27 16:29:34 -06:00
Josh Moore 5bfe91613c build: remove @types/tailwindcss (main package provides own types) 2023-10-27 15:59:15 -06:00
Josh Moore 5aa636f93e dummy for [docker build] 2023-10-27 15:53:47 -06:00
Josh Moore 2fc1ddf618
Merge pull request #244 from Gauvino/patch/docker 2023-10-27 15:43:23 -06:00
Uruk 52339f38bd Rename and change content 2023-10-27 23:41:16 +02:00
Uruk 0e2439a0d0 Removed apk cache command 2023-10-27 22:34:26 +02:00
Uruk b5e8993b8d Merge branch 'dev/0.15.0' into patch/docker 2023-10-27 22:30:07 +02:00
Josh Moore 3a1481e6ef
Merge branch 'dev/0.15.0' into patch/docker 2023-10-27 13:52:20 -06:00
Josh Moore 30346a0e8c build/fix: use safer engines declaration 2023-10-27 13:49:39 -06:00
Josh Moore 1f7108f0ee build: update packages 2023-10-27 13:45:12 -06:00
Josh Moore bb2d11d23a build: remove unused 0.14.x packages 2023-10-27 13:39:30 -06:00
Uruk a8fda8800d Merge branch 'patch/docker' of https://github.com/Gauvino/ass into patch/docker 2023-10-27 19:41:31 +02:00
Uruk c3d7e8645a Superseded: #246 2023-10-27 19:40:57 +02:00
Gauvino 966e1ce1dd
Merge branch 'dev/0.15.0' into patch/docker 2023-10-26 12:20:58 -07:00
Uruk d006073a68 Fix package.json 2023-10-26 21:16:39 +02:00
Uruk e4602f9e79 Optimize docker build and update package 2023-10-26 21:10:48 +02:00
xwashere 3c5094695a
semicolons 2023-10-26 15:01:51 -04:00
xwashere b7ddc96b06
mayb this should be a number 2023-10-26 15:01:01 -04:00
xwashere 0d1172d332
add: postgresql! 2023-10-26 14:52:56 -04:00
xwashere 9c9f2ac768
impr: move sql config to "database", add config for different db variants 2023-10-26 12:26:39 -04:00
xwashere e113cb57ee
Merge remote-tracking branch 'upstream/dev/0.15.0' into dev/0.15.0 2023-10-26 12:00:32 -04:00
Josh Moore e8b70d1257
Merge pull request #242 from XWasHere/crap 2023-10-26 08:53:17 -06:00
xwashere a833f8c585
fix this 2023-10-26 10:47:58 -04:00
xwashere 7bd08debe6
fix: oh god oh fuck oh god oh no oh
cherry pick from xutils/dev/0.15.0
2023-10-26 10:38:55 -04:00
xwashere 26431b2982
fix: oh god oh fuck oh god oh no oh 2023-10-26 10:24:05 -04:00
xwashere d36e5c53fe
nvm 2023-10-25 15:48:09 -04:00
X 66c81503e8
Merge branch 'tycrek:dev/0.15.0' into dev/0.15.0 2023-10-25 15:26:27 -04:00
xwashere ce3ad10281
split json database out of data.ts 2023-10-25 15:20:56 -04:00
xwashere 5620d0bed3
add: database class interface 2023-10-25 14:27:52 -04:00
Josh Moore f276956a13 Merge branch 'master' into dev/0.15.0 2023-10-25 11:14:15 -06:00
Josh Moore 304f499cac fix: use const & specify which config was invaldi 2023-10-25 10:22:32 -06:00
Josh Moore 3709a6958e chore: vsc autoformat 2023-10-25 10:22:12 -06:00
Josh Moore 6927bbdc26
Merge pull request #241 from XWasHere/dev/0.15.0 2023-10-25 10:07:56 -06:00
xwashere c648533469
feat: configurable rate limiting 2023-10-25 10:58:57 -04:00
Josh Moore 3d0a6eb794 build: update packages [docker build] 2023-10-21 23:17:36 -06:00
Josh Moore 2746a3e32a fix: reduce alerts to improve user flow 2023-10-21 23:16:00 -06:00
Josh Moore 1c12615e01 feat: add `requireAdmin` optional param 2023-10-21 23:13:42 -06:00
Josh Moore c84f507dea chore: minor cleanup 2023-10-21 23:03:57 -06:00
Josh Moore 229198294b
Merge pull request #239 from tycrek/feature/15-login [docker build] 2023-10-21 22:41:30 -06:00
Josh Moore c2509388d3 fix: add proper type here (excuse to do a [docker build] ) 2023-10-21 22:38:41 -06:00
Josh Moore 4b68cb7c31 feat: redirect user to the page they originally requested 2023-10-21 22:23:27 -06:00
Josh Moore b46198eb47 feat: added login checker/redirection flow 2023-10-21 22:02:35 -06:00
Josh Moore 04ef991fbc fix: remove unnecessary function & move session setup step to existing middleware 2023-10-21 21:43:48 -06:00
Josh Moore 5d06745ceb refactor: use consistent import ordering
- global modules
- NPM modules (defaults first, then expansions)
- local modules
2023-10-15 18:33:12 -06:00
Josh Moore 5832a696a8 feat: authenticate sessions with bcrypt 2023-10-15 12:06:04 -06:00
Josh Moore 285e5ccc6a feat: added getAll to `data.ts` (MySQL is UNTESTED) 2023-10-15 12:00:51 -06:00
Josh Moore b4ebe8b5d5 feat: added UNTESTED MySQL getAll 2023-10-15 12:00:12 -06:00
Josh Moore c257718410 feat: added login flow to frontend 2023-10-14 22:34:41 -06:00
Josh Moore 69f1776c7a feat: added base implementation of login validation middleware 2023-10-14 22:34:27 -06:00
Josh Moore 49bd572d03 feat: made these functions universal (will be non-duplicate soon) 2023-10-14 22:28:46 -06:00
Josh Moore 1a07edf3cf build: added new NPM script `fresh` for clean runs 2023-10-14 16:37:39 -06:00
Josh Moore 38c235c836 feat: improve setup logs 2023-10-14 16:37:20 -06:00
Josh Moore 5e7016a0c6 feat: move setup POST to /api/setup 2023-10-14 16:31:25 -06:00
Josh Moore 7f04d365b6 feat: made basic page routers universal 2023-10-14 16:25:40 -06:00
Josh Moore b7f79a49fd feat: added session types & set up session storage 2023-10-14 16:02:19 -06:00
Josh Moore b62a1eacf4 build: add session packages 2023-10-14 16:00:53 -06:00
Josh Moore 9d3dc96aec feat: added base implementation of user routing 2023-10-14 15:40:42 -06:00
Josh Moore 72a58295ae feat: added base implementation of admin routing 2023-10-14 15:34:18 -06:00
Josh Moore 701c808f7a feat: set up initial frontend JS 2023-10-14 15:27:07 -06:00
Josh Moore de41cd04ff feat: improved `fix-frontend-js.js` to fix multiple files 2023-10-14 15:11:23 -06:00
Josh Moore 789f91eceb feat: added base login frontend JS 2023-10-14 15:10:56 -06:00
Josh Moore 98826a08be feat: added login router 2023-10-14 15:10:40 -06:00
Josh Moore 8065100a4b feat: added basic login UI (non-functional) 2023-10-14 14:51:18 -06:00
Josh Moore fce65a2992 feat: make util style 2023-10-14 14:40:17 -06:00
Josh Moore 6dc4070810 build: use `@tsconfig/node20` [docker build] 2023-10-14 13:26:39 -06:00
Josh Moore 35fcee6ec1 feat: added ass version to Request metadata (see #223 for info) 2023-10-14 13:15:54 -06:00
Josh Moore d0a2bf7fa2 Merge branch 'master' into dev/0.15.0 2023-10-14 13:04:58 -06:00
Josh Moore 36be2d0bca refactor: delete old Docker deploy scripts (no longer needed) 2023-10-14 12:35:43 -06:00
Josh Moore e94c0bceb8 refactor: delete old Tailwind CSS 2023-10-14 12:35:22 -06:00
Josh Moore fae0d0b69b refactor: added missing explicit semicolons 2023-10-14 12:33:13 -06:00
Josh Moore a559866453 fix: indicate that I'm not a robot 2023-10-14 12:16:22 -06:00
Josh Moore 4bdfa09c60 Merge branch 'master' into dev/0.15.0 2023-10-14 12:12:50 -06:00
Josh Moore e7213f203f
Merge pull request #135 from NotAShelf/master 2023-10-14 11:39:22 -06:00
Josh Moore 0a652227a3 feat: dummy non-functioning login page [skip ci:ts] 2023-10-14 11:26:14 -06:00
Josh Moore e92f91633e fix: forgor the title block 2023-10-14 11:25:31 -06:00
Josh Moore 9e1c8a2ac3 feat: full-height content block 2023-10-14 11:25:06 -06:00
Josh Moore 1da9ce44f0 Merge branch 'master' into dev/0.15.0 2023-10-14 11:05:38 -06:00
NotAShelf 25a32b3879
feat: sanity checks in sample screenshotter 2023-10-14 19:38:26 +03:00
NotAShelf e201da1e55
refactor: modularize screenshot script 2023-10-14 19:23:22 +03:00
NotAShelf afc943c2bb
Merge branch 'tycrek:master' into master 2023-10-14 18:40:33 +03:00
Josh Moore e5d627e8fc chore: update `Dockerfile` and `compose.yaml` [skip ci:ts] 2023-10-14 09:19:57 -06:00
Josh Moore 121a7cd9dc fix: idfk anymore 2023-10-14 00:32:08 -06:00
Josh Moore f915e345b9 docker: add compose volume [skip ci:ts] 2023-10-14 00:26:55 -06:00
Josh Moore fc42b0f9ee docker: use Hub image 2023-10-14 00:03:35 -06:00
Josh Moore 96ae0acf34 fix: oops invalid reference format (fml) [docker build] 2023-10-13 23:25:28 -06:00
Josh Moore 8ca70ce568 fix: using my username in a Secret censored the tag 😭 [docker build] 2023-10-13 23:22:50 -06:00
Josh Moore 6b67f198b6 fix: forget it i dont want a short hash [docker build] ofosdojfha 2023-10-13 23:14:12 -06:00
Josh Moore 13c76f71ad yeah [docker build] 2023-10-13 23:09:35 -06:00
Josh Moore 455a43986d *sighhhhh* 2023-10-13 23:09:05 -06:00
Josh Moore 25f29b497d fix: i cri [docker build] 2023-10-13 23:04:40 -06:00
Josh Moore c8892bfb4c IM NOT COMMIT FARMING I FORGOR [docker build] 2023-10-13 23:01:50 -06:00
Josh Moore 952eb1176f fix: maybe this was correct the first time 🤷 2023-10-13 23:00:24 -06:00
Josh Moore 0bf02e9aed fix: forgot [docker build] so hope for the best 2023-10-13 22:54:03 -06:00
Josh Moore 2801887dee fix: apparently `pull_request` was skipped even when it worked 2023-10-13 22:52:51 -06:00
Josh Moore 1406fd4b13 fix: no clue why this is being skipped [docker build] 2023-10-13 22:49:49 -06:00
Josh Moore b6aa350f2c fix: remove unnecessary line maybe [docker build] 2023-10-13 22:46:48 -06:00
Josh Moore 4ce222dc25 fix: incorrect List format [docker build]
https://github.com/docker/build-push-action#inputs
2023-10-13 22:40:19 -06:00
Josh Moore 92e3efb2a3 fix: ugh [docker build] 2023-10-13 22:38:26 -06:00
Josh Moore 2115a4f009 fix: grr [docker build] 2023-10-13 22:36:45 -06:00
Josh Moore 18cd093dd3 fix: remove invalid line [docker build] 2023-10-13 22:33:35 -06:00
Josh Moore cb14493abb feat: wait for existing test; push to commit hash tag as well 2023-10-13 22:31:19 -06:00
Josh Moore 977c97f3fb chore: remove commented lines (and test a skip flag) [skip ci:ts] 2023-10-13 22:18:05 -06:00
Josh Moore 3efc71b697 fix: probably don't need this either 2023-10-13 22:13:16 -06:00
Josh Moore 829e52a5c8 fix: maybe `push:` isn't required as all things run twice? 2023-10-13 22:09:40 -06:00
Josh Moore e8c3e7a99d fix: rename file 2023-10-13 22:04:10 -06:00
Josh Moore 9f94eee1b1 Merge branch 'master' into dev/0.15.0 2023-10-13 22:01:43 -06:00
Josh Moore 4f49312c12 fix: don't compile every time because it takes forever 2023-10-13 22:01:03 -06:00
Josh Moore a39a13e47b fix: I think I got it 2023-10-13 16:26:54 -06:00
Josh Moore 3384dacb30 fix: on push? 2023-10-13 16:17:28 -06:00
Josh Moore 7bd9a5687e fix: don't attempt to publish if the test failed 2023-10-13 16:17:20 -06:00
Josh Moore 61c84939c3 fix: missed this undistinguished skip flag 2023-10-13 16:16:50 -06:00
Josh Moore 5efb5a5dca chore: distinguish CI skip flags 2023-10-13 16:12:40 -06:00
Josh Moore 977fd222ae
Merge pull request #234 from Gauvino/docker-build 2023-10-13 16:01:17 -06:00
Gauvino 9373a3cd3a
Update docker-build.yml 2023-10-13 23:46:38 +02:00
Josh Moore fa17f097af build: update packages 2023-10-11 23:04:51 -06:00
Josh Moore 7f3a86a790 build: set minimum engines (Node 20 will be LTS in about a week) 2023-10-11 23:03:51 -06:00
Josh Moore 10f4114fc0 Merge branch 'master' into dev/0.15.0 2023-10-11 22:37:38 -06:00
Gauvino 13d395f3f4
Update ts-build.yml 2023-10-12 03:37:44 +02:00
Gauvino 1eff87664f
Update and rename docker-build to docker-build.yml 2023-10-12 03:35:02 +02:00
Uruk 65a08babec Added queries in codeql, change version on ts-build, add docker-build
2023-10-09 15:46:50 +02:00
Josh Moore 059b20e714 feat: nice
basic admin dash route
2023-10-01 22:06:13 -06:00
Josh Moore 6ac59b60cb Merge branch 'master' into dev/0.15.0 2023-09-30 01:08:39 -06:00
Josh Moore 39082f4fb0 refactor: renamed views/ and views2/ 2023-09-30 00:10:03 -06:00
Josh Moore 1562e7af67 refactor: rename Tailwind files 2023-09-30 00:07:06 -06:00
Josh Moore 5f6eccd098 refactor: remove old files 2023-09-30 00:06:09 -06:00
Josh Moore d4d6e869b2 fix: update imports for new tsconfig.json 2023-09-30 00:02:34 -06:00
Josh Moore d91520cb78 feat: use pasted nanoid function (only 130 bytes) to resolve ESM issues 2023-09-30 00:02:23 -06:00
Josh Moore ed6213a76b build: updated tsconfig.json 2023-09-30 00:01:49 -06:00
Josh Moore 9702c39eb1 build: updated regular deps 2023-09-30 00:01:35 -06:00
Josh Moore cc4218856c build: updated devdeps (type defs) 2023-09-29 23:48:47 -06:00
Josh Moore 324cb5248e build: remove old packages 2023-09-29 23:46:59 -06:00
Josh Moore e8e29d2a73 feat: add ability to pass pkg version easily 2023-09-29 23:40:54 -06:00
Josh Moore 37c5836ec6 feat: slightly modularized frontend 2023-09-29 23:35:31 -06:00
Josh Moore 263ab28a13 feat: created `_base_.pug` 2023-09-29 23:28:36 -06:00
Josh Moore 851a8a5255 feat: updated tycrek Shoelace packages 2023-09-29 23:19:11 -06:00
Josh Moore b2869df3ae build: remove old package 2023-09-29 23:11:56 -06:00
Josh Moore d3191c9970 remove: this was re-implemented already in a much simpler way 2023-09-29 23:11:47 -06:00
Josh Moore 49a2e85b93 chore: update license year 2023-09-29 23:08:11 -06:00
Josh Moore 493321bc39 refactor: delete old files 2023-09-29 22:40:15 -06:00
Josh Moore 40469eabcf feat: migrated some utils 2023-09-29 22:40:03 -06:00
Josh Moore 71e2dce315 refactor: migrated and deleted some types 2023-09-29 22:30:19 -06:00
Josh Moore f4d05bdb33 refactor: migrated file operations 2023-09-29 22:29:48 -06:00
Josh Moore 1c0965c510 chore: re-add gfycat data 2023-09-29 22:04:10 -06:00
Josh Moore d04b4f9bb5 refactor: migrated all the generators 2023-09-29 22:03:46 -06:00
Josh Moore 0721510b31 Merge remote-tracking branch 'origin/master' into dev/0.15.0 2023-09-28 09:28:43 -06:00
Josh Moore c4b43ddf68 refactor: delete a lot of old stuff 2023-09-28 09:28:17 -06:00
Josh Moore d5bd01b814 feat: added initial API route for posting a new user 2023-09-25 11:03:30 -06:00
Josh Moore e515849580 feat: added setup panel for admin user 2023-09-25 11:03:17 -06:00
Josh Moore 346aa9b97d fix: set S3 region to `auto` if unset 2023-09-25 10:29:03 -06:00
Josh Moore 8487089209 feat: probably improved single object S3 upload 2023-09-25 10:28:49 -06:00
Josh Moore b852a95dcd feat: added empty API router 2023-07-17 19:40:45 -06:00
Josh Moore 07332009fd feat: added interface for interacting with new user API 2023-07-17 19:39:37 -06:00
Josh Moore 7c4829f349 feat: bit of a cleanup 2023-07-17 18:06:04 -06:00
Josh Moore c9905eb127 feat: added file hashing 2023-07-17 18:05:53 -06:00
Josh Moore f245f06647 feat: attach file size to AssFile 2023-07-17 17:56:20 -06:00
Josh Moore 04acf001e7 fix: setup should be allowed even if SQL not defined 2023-07-16 23:08:36 -06:00
Josh Moore 36de04b483 fix: redirect to index instead of displaying a generic alert 2023-07-16 23:08:11 -06:00
Josh Moore cdc3e19435 feat: enable/disable Submit button 2023-07-16 23:07:53 -06:00
Josh Moore ac1a51ec16 fix: this should be `resolve(false)` 2023-07-16 23:03:10 -06:00
Josh Moore c274e0722b fix: added proper error handler here 2023-07-16 23:02:59 -06:00
Josh Moore 9429fca2f7 feat: implement favicon redirect 2023-07-16 22:58:18 -06:00
Josh Moore 597983f370 fix: unknown crash in prod? 2023-07-16 22:39:48 -06:00
Josh Moore 6532bc2545 feat: implement GET for both SQL and local storage 2023-07-16 22:24:22 -06:00
Josh Moore 858635f64d fix: initial SQL configuration crashed if empty 2023-07-16 22:07:28 -06:00
Josh Moore c0cf6598e3 feat: added existing check for SQL data put 2023-07-16 22:07:07 -06:00
Josh Moore 310a2bb117 feat: improve this log 2023-07-16 21:26:03 -06:00
Josh Moore 9c403a1554 feat: half-implement put to SQL 2023-07-16 21:25:48 -06:00
Josh Moore d0e5711f65 fix: set _ready to true 2023-07-16 21:24:54 -06:00
Josh Moore 2d4db02d22 feat: added `put` to MySql client 2023-07-16 21:24:44 -06:00
Josh Moore adac5a1538 chore: I hate formatting shit 2023-07-16 21:02:59 -06:00
Josh Moore 351195d7b5 feat: configure SQL on app startup if user config is ready 2023-07-16 21:01:04 -06:00
Josh Moore 488f5c15e3 feat: half-implement useSql in data.ts 2023-07-16 21:00:50 -06:00
Josh Moore 821acf0cd8 feat: added _ready/ready vars to MySql 2023-07-16 20:58:34 -06:00
Josh Moore 5e6c918ec2 fix: not sure why this order was weird 2023-07-16 20:58:21 -06:00
Josh Moore 2f883e39be feat: add data Put 2023-07-16 20:50:32 -06:00
Josh Moore 51a7b304f4 feat: run SQL mode switch during setup 2023-07-16 20:45:37 -06:00
Josh Moore 0a45ddcd37 feat: implement SQL mode switch 2023-07-16 20:45:25 -06:00
Josh Moore 56e4f63ffa feat: improve MySQL database creation 2023-07-16 20:44:22 -06:00
Josh Moore 6efe86fe9b feat: adjusted some types 2023-07-16 20:42:38 -06:00
Josh Moore fa40db0662 feat: added initial data file management (no put/get/etc yet) 2023-07-16 19:23:03 -06:00
Josh Moore 88553d223d feat: added new interfaces for auth & JSON data 2023-07-16 19:17:02 -06:00
Josh Moore 8ff1028b1a docs: commented some interfaces 2023-07-16 19:13:38 -06:00
Josh Moore 9de7e20f5f feat: save `userconfig.json` to `.ass-data/` dir 2023-07-16 19:05:01 -06:00
Josh Moore 9aa7df4a48 chore: updated .gitignore 2023-07-16 19:03:26 -06:00
Josh Moore 6db469078d feat: added MySQL configuration options to web setup 2023-07-16 15:28:35 -06:00
Josh Moore 9691893a8a feat: add MySQL checks to parseConfig() 2023-07-16 15:13:53 -06:00
Josh Moore 8d545f1d37 feat: send S3 setup inputs to /setup 2023-07-16 14:26:30 -06:00
Josh Moore 15ee35d996 feat: added more config checks 2023-07-16 14:26:09 -06:00
Josh Moore 97fa3acb1a feat: added basic string checker 2023-07-16 14:25:35 -06:00
Josh Moore 8a17624db2 feat: added `uploads-` prefix 2023-07-16 13:57:35 -06:00
Josh Moore 57872470c3 feat: moved elements into Object 2023-07-16 13:56:44 -06:00
Josh Moore 47753f05b7 feat: improve setup panels 2023-07-16 13:44:35 -06:00
Josh Moore 293f7027c7 fix: make this a password field 2023-07-16 13:35:40 -06:00
Josh Moore 9f317ddb17 feat: setup UI improvements (and add S3 to UI) 2023-07-16 13:33:48 -06:00
Josh Moore 329f9113b8 chore: remove completed todo 2023-07-16 12:59:47 -06:00
Josh Moore eb3e930739 feat: added initial MySQL client (creates tables right now) 2023-07-15 00:00:59 -06:00
Josh Moore 41ed8b1d6f build: update node types 2023-07-14 22:37:49 -06:00
Josh Moore c84b26804d misc: move soon to be deprecated constants 2023-07-14 22:30:45 -06:00
Josh Moore 7c86834d28 feat: allow getting from either S3 or local 2023-07-14 22:07:20 -06:00
Josh Moore 1657200e3d fix: I'm very dumb this needs to be recursive 2023-07-14 21:49:47 -06:00
Josh Moore 30dfd386c6 fix: log errors 2023-07-14 21:46:43 -06:00
Josh Moore fc31a48bbb fix: create uploads dir automatically for docker containers 2023-07-14 21:42:02 -06:00
Josh Moore 234c9b9c0c fix: maybe fix prod issues?? 2023-07-14 21:34:41 -06:00
Josh Moore a975e2964e feat: updated Dockerfile & compose.yaml for 0.15.0 (maybe) 2023-07-14 21:19:44 -06:00
Josh Moore a518f45b87 feat: added cache control header 2023-07-14 20:51:42 -06:00
Josh Moore a989da79d8 feat: added retrieval routes 2023-07-14 20:50:57 -06:00
Josh Moore ab76785e5f feat: temporary file map to test upload retrieval 2023-07-14 20:50:20 -06:00
Josh Moore 0f184c6630 feat: implement S3 get and improve upload 2023-07-14 20:49:50 -06:00
Josh Moore 96c28fa7a9 feat: simplify S3 indication (urls not always required) 2023-07-14 20:49:21 -06:00
Josh Moore a2c2b8e906 fix: imports 2023-07-14 20:48:53 -06:00
Josh Moore d227537575 feat: implement upload files via S3 (still no accessing) 2023-07-14 20:04:20 -06:00
Josh Moore bf6d741cde feat: implemented S3 Object Put & Multipart (kind of) uploads 2023-07-14 20:03:27 -06:00
Josh Moore 204c3cdf3c feat: added fileKey parameter for S3 (and other things?) 2023-07-14 20:02:33 -06:00
Josh Moore be1038b7c4 feat: upgraded S3 client package 2023-07-14 20:02:15 -06:00
Josh Moore f6b9aba0b8 feat: added basic S3 structure kind of 2023-07-14 18:15:20 -06:00
Josh Moore 0d01fb670b feat: sample for oggy 2023-07-14 17:38:11 -06:00
Josh Moore f8930fa299 feat: wrote new upload flow 2023-07-14 14:30:12 -06:00
Josh Moore 26e4f679ce feat: migrated random generator 2023-07-14 14:29:52 -06:00
Josh Moore e0e4acbcf4 feat: print an error here, just in case 2023-07-14 14:29:38 -06:00
Josh Moore 46fabc223f feat: added BusBoy/ass file metadata types 2023-07-14 14:28:52 -06:00
Josh Moore f49fe2e0b3 feat: flameshot script automatically assigns HTTP/HTTPS protocol 2023-07-14 14:28:27 -06:00
Josh Moore ce350d4fd0 build: updated NanoID 2023-07-14 14:22:17 -06:00
Josh Moore 821bb4719b feat: added type alias (?) for NanoID ID's 2023-07-14 14:21:55 -06:00
Josh Moore 963eed7860 fix: set default flameshot script mode to ass 2023-07-14 12:48:51 -06:00
Josh Moore 0eabfa36e6 fix: was using wrong symbol 2023-07-14 12:48:34 -06:00
Josh Moore ca44f60670 build: update tlog 2023-07-13 14:13:25 -06:00
Josh Moore b9f2b40a86 feat: added desktop notification to flameshot script 2023-07-13 13:34:52 -06:00
Josh Moore b95ef137cf build: forgot this 2023-07-13 13:13:34 -06:00
Josh Moore 09c732b56e feat: rewrote flameshot script 2023-07-13 13:13:22 -06:00
Josh Moore 4d72448cf3 fix: remove one-time variable 2023-07-13 02:13:50 -06:00
Josh Moore f166d31b2c refactor: move where this log is 2023-07-13 02:13:12 -06:00
Josh Moore 7c655e466a feat: improve numChecker 2023-07-13 02:12:52 -06:00
Josh Moore 9e0088e155 feat: improved some error logs 2023-07-13 01:51:02 -06:00
Josh Moore b37436c4f2 feat: improved (?) setup flow 2023-07-13 00:54:01 -06:00
Josh Moore b9dff5865c feat: don't print the entire error here 2023-07-13 00:20:52 -06:00
Josh Moore 9f810a8f5c feat: save setup file from web setup 2023-07-13 00:07:05 -06:00
Josh Moore 371d2f8b01 chore: update .gitignore 2023-07-13 00:06:44 -06:00
Josh Moore b3093b23e3 fix: avoid displaying `null` in alerts 2023-07-13 00:06:01 -06:00
Josh Moore c9a274490a feat: add config file saving/reading 2023-07-13 00:05:45 -06:00
Josh Moore bc3e6b86d6 fix: improved number checker 2023-07-13 00:00:50 -06:00
Josh Moore 5cd3f57175 feat: improve setup reliability 2023-07-12 23:23:35 -06:00
Josh Moore cb41aeaa3c feat: added type checkers 2023-07-12 23:22:57 -06:00
Josh Moore fce1b30e3f feat: created UserConfig class 2023-07-12 23:22:19 -06:00
Josh Moore 8b8998ade6 fix: alert the server-provided error message 2023-07-12 23:16:11 -06:00
Josh Moore 352cb2b636 TABS 2023-07-12 22:45:49 -06:00
Josh Moore 83c89a9e25 fix: make userconfig.json checks dynamic 2023-07-12 22:41:57 -06:00
Josh Moore 26c6841dcd feat: added util script to remove `export {};` from frontend JS 2023-07-12 22:31:53 -06:00
Josh Moore 466fc5eafe feat: basic type-safe setup communication configured 2023-07-12 22:27:13 -06:00
Josh Moore 5d1d4a402e feat: migrated files to new backend directory 2023-07-12 22:10:36 -06:00
Josh Moore 2c57a1d80d build: remove funding note (this isn't an npm package) 2023-07-12 22:09:03 -06:00
Josh Moore f7bebf2a14 build: removed old scripts 2023-07-12 22:08:48 -06:00
Josh Moore d505187697 build: changed output directory for backend 2023-07-12 22:08:35 -06:00
Josh Moore 5270b3c7b6 build: wrote new .tsconfigs for back/front-end setup 2023-07-12 22:05:29 -06:00
Josh Moore d0c1d08431 build: improved build scripts 2023-07-12 22:04:32 -06:00
Josh Moore 4c2337cf3c feat: use proper element ID's for setup page 2023-07-12 22:04:03 -06:00
Josh Moore 863e8c6c8a feat: moved types into common dir 2023-07-12 22:03:00 -06:00
Josh Moore 7412b77b4f chore: update .gitignore 2023-07-12 22:02:35 -06:00
Josh Moore 81cafd6b67 build: update tsconfig reference 2023-07-12 21:48:56 -06:00
Josh Moore 91824803a1 feat: startup/routing behavior changes if user conf doesn't exist 2023-07-10 23:09:42 -06:00
Josh Moore 823d2a66a6 build: use fixed tlog 2023-07-07 12:34:25 -06:00
Josh Moore 706e46ad35 feat: match user config to current setup page 2023-07-07 11:18:04 -06:00
Josh Moore f6800f3fb8 feat: improved dummy setup page 2023-07-07 11:17:45 -06:00
Josh Moore 25fe92ffe0 feat: added exit handler 2023-07-07 01:11:01 -06:00
Josh Moore 4169c3525f feat: added basic dummy setup page 2023-07-07 00:36:07 -06:00
Josh Moore e8db8b2275 feat: set up CSS compilation 2023-07-07 00:34:05 -06:00
Josh Moore 2364ba4720 feat: added custom middlware for ass object & hostname 2023-07-07 00:33:30 -06:00
Josh Moore 475362a392 feat: created types file defining `'ass'` module 2023-07-07 00:31:15 -06:00
Josh Moore 82aadc886b build: upgrade TS; add shoelace packages 2023-07-06 23:02:13 -06:00
Josh Moore 3e73653894 feat: added clickable link for dev 2023-07-06 22:30:37 -06:00
Josh Moore a84a3fa016 feat: added basic routers 2023-07-06 22:30:15 -06:00
Josh Moore 751901e76e feat: set new views directory 2023-07-06 22:29:59 -06:00
Josh Moore 12afe72f80 feat: add basic temporary sample views 2023-07-06 22:29:46 -06:00
Josh Moore c2a4a44ea2 feat: create initial app.ts for loading server config & Express 2023-07-06 22:20:18 -06:00
Josh Moore e55bd501d2 feat: add logger 2023-07-06 22:17:07 -06:00
Josh Moore e68de620bf build: bump version to `0.15.0-indev` 2023-07-06 22:16:48 -06:00
Josh Moore 1d5850e261 build: fix start script file 2023-07-06 22:16:23 -06:00
Josh Moore aba450382a build: remove engine-check script 2023-07-06 22:16:00 -06:00
Josh Moore ce8fe80567 build: point tsconfig at `src2/` for now 2023-07-06 22:15:39 -06:00
Josh Moore 570a945e27 feat: remove useless engines definition 2023-07-06 22:15:10 -06:00
Josh Moore fe1a1f5fdd feat: remove old ass-x stuff 2023-07-06 21:16:32 -06:00
NotAShelf 80f385813e
Merge branch 'master' into master 2023-06-28 23:37:05 +03:00
NotAShelf 0571d0600e
Merge branch 'master' into master 2023-01-05 23:50:49 +03:00
NotAShelf 12dbb4a827
Merge branch 'tycrek:master' into master 2022-11-25 17:00:52 +03:00
NotAShelf 961eff3e2f
Merge branch 'tycrek:master' into master 2022-11-17 18:15:38 +03:00
NotAShelf 39906e7f7c
ready for production 2022-05-13 07:48:03 +03:00
NotAShelf 88c391b02b
performance fix 2 2022-05-13 07:31:12 +03:00
NotAShelf bda4800fc8
performance fix for the shell checker 2022-05-13 07:27:13 +03:00
NotAShelf 6b7610a414
implement local screenshots on x11 2022-05-13 07:22:19 +03:00
NotAShelf 6d05beed0b
update flameshot script to be global
- wayland support
- experimental local screenshots if domain & key are left empty
2022-05-13 06:46:53 +03:00
98 changed files with 10509 additions and 7593 deletions

5
.dockerignore Normal file

@@ -0,0 +1,5 @@
# production
.ass-data/
# development
node_modules/

340
.github/README.md vendored

@@ -9,19 +9,7 @@
</div>
**ass** is a self-hosted ShareX upload server written in Node.js. I initially started this project purely out of spite. ass aims to be as **unopinionated** as possible, giving users & hosts alike the ability to modify nearly everything.
By default, ass comes with a resource viewing page, which includes metadata about the resource as well as a download button & inline viewers for images, videos, & audio. It does **not** have a user dashboard or registration system: **this is intentional!** Developers are free to [create their own frontends] using the languages & tools they are most comfortable with. Writing & using these frontends is fully documented below, in the wiki, & in the source code.
### Notice (Sep 2023)
The current release version 0.14.x is now in **maintenance mode**. What this means is I'll only be providing fixes for catastrophic issues.
However! I'm currently working on [a new version](https://github.com/tycrek/ass/pull/220), 0.15.0, which is a lot more stable and organized. I have no ETA but please know that I'm continuing to work on it when I can. Version 0.14.x is still functional, just a bit rough around the edges.
#### Developers 🧡
ass was designed with developers in mind. If you are a developer & want something changed to better suit you, let me know & we'll see what we can do!
**ass** is a self-hosted ShareX upload server written in TypeScript.
[GitHub package.json version]: https://img.shields.io/github/package-json/v/tycrek/ass?color=fd842d&style=for-the-badge
[GitHub license]: https://img.shields.io/github/license/tycrek/ass?color=FD7C21&style=for-the-badge
@@ -29,24 +17,11 @@ ass was designed with developers in mind. If you are a developer & want somethin
[GitHub Repo stars]: https://img.shields.io/github/stars/tycrek/ass?color=F26602&style=for-the-badge
[Discord badge]: https://img.shields.io/discord/848274994375294986?label=Discord&logo=Discord&logoColor=FFF&style=for-the-badge
[Discord invite]: https://discord.gg/wGZYt5fasY
[create their own frontends]: #custom-frontends
## Code quality
| [CodeQL] | [DeepSource] |
| :---------------------------------------: | :----------------------------------: |
| [![CodeQL badge]][CodeQL link] | [![DeepSource Active Issues]][DeepSource Repo] [![DeepSource Resolved Issues]][DeepSource Repo] |
[CodeQL]: https://codeql.github.com/docs/
[DeepSource]: https://deepsource.io/
[CodeQL badge]: https://github.com/tycrek/ass/actions/workflows/codeql-analysis.yml/badge.svg?branch=master
[CodeQL link]: https://github.com/tycrek/ass/actions/workflows/codeql-analysis.yml
[DeepSource Active Issues]: https://deepsource.io/gh/tycrek/ass.svg/?label=active+issues
[DeepSource Resolved Issues]: https://deepsource.io/gh/tycrek/ass.svg/?label=resolved+issues
[DeepSource Repo]: https://deepsource.io/gh/tycrek/ass/?ref=repository-badge
## Features
###### Out of date
#### For users
- Upload images, gifs, videos, audio, & files
@@ -55,44 +30,30 @@ ass was designed with developers in mind. If you are a developer & want somethin
- GPS data automatically removed
- Fully customizable Discord embeds
- Built-in web viewer with video & audio player
- Dashboard to manage your files
- Embed images, gifs, & videos directly in Discord
- Personal upload log using customizable Discord Webhooks
- macOS/Linux support with alternative clients such as [Flameshot] ([script for ass]) & [MagicCap]
- **Multiple URL styles**
- [ZWS]
- Mixed-case alphanumeric
- Gfycat
- Original
- Timestamp
- Original
- ZWS
#### For hosts & developers
- Usage metrics
- Thumbnail support
- Mimetype blocking
- Themeable viewer page
- Basic multi-user support
- Configurable global upload size limit (per-user coming soon)
- Custom pluggable frontends using [Git Submodules]
- Run locally or in a Docker container
- Multi-user support
- Run locally or via Docker
- API for developers to write custom interfaces
- **Multiple file storage methods**
- Local file system
- Amazon S3, including [DigitalOcean Spaces] (more coming soon)
- **Multiple data storage methods** using [data engines]
- **File**
- JSON (default, [papito])
- YAML (soon)
- **Database**
- PostgreSQL ([ass-psql])
- MongoDB ([ass-mongoose][GH AMongoose])
- MySQL (soon)
- S3
- **Multiple data storage methods**
- JSON
- MySQL
- PostgreSQL
[Git Submodules]: https://git-scm.com/book/en/v2/Git-Tools-Submodules
[ZWS]: https://zws.im
[DigitalOcean Spaces]: https://www.digitalocean.com/products/spaces/
[data engines]: #data-engines
[papito]: https://github.com/tycrek/papito
[ass-psql]: https://github.com/tycrek/ass-psql
[Flameshot]: https://flameshot.org/
[script for ass]: #flameshot-users-linux
[MagicCap]: https://magiccap.me/
@@ -101,18 +62,17 @@ ass was designed with developers in mind. If you are a developer & want somethin
| Type | What is it? |
| ---- | ----------- |
| **[Zero-width spaces][ZWS]** | When pasted elsewhere, the URL appears to be *just* your domain name. Some browsers or sites may not recognize these URLs (Discord sadly no longer supports these as of April 2023)<br>![ZWS sample] |
| **Mixed-case alphanumeric** | The "safe" mode. URL's are browser safe as the character set is just letters & numbers. |
| **Gfycat** | Gfycat-style ID's (for example: `https://example.com/unsung-discrete-grub`). Thanks to [Gfycat] for the wordlists |
| **Original** | The "basic" mode. URL matches the same filename as when the file was uploaded. This may be prone to conflicts with files of the same name. |
| **Timestamp** | The quick but dirty mode. URL is a timestamp of when the file was uploaded, in milliseconds. This is the most unique mode, but also potentially the longest (Gfycat could be longer, easily). **Keep in mind this is vulnerable to iteration attacks** |
| **Original** | The "basic" mode. URL matches the same filename as when the file was uploaded. This may be prone to conflicts with files of the same name. |
| **ZWS** | "Zero-width spaces": when pasted elsewhere, the URL appears to be *just* your domain name. Some browsers or sites may not recognize these URLs (Discord sadly no longer supports these as of April 2023) |
[ZWS sample]: https://user-images.githubusercontent.com/29926144/113785625-bf43a480-96f4-11eb-8dd7-7f164f33ada2.png
[Gfycat]: https://gfycat.com
## Installation
ass supports two installation methods: Docker (recommended) & local (manual).
ass supports two installation methods: Docker & local.
### Docker
@@ -120,61 +80,17 @@ ass supports two installation methods: Docker (recommended) & local (manual).
<summary><em>Expand for Docker/Docker Compose installation steps</em></summary>
<br>
[Docker Compose] is the recommended way to install ass. These steps assume you are already familiar with Docker. If not, you should probably use the local installation method. They also assume that you have a working Docker installation with Docker Compose v2 installed.
[Docker Compose] is the recommended way to install ass. These steps assume you already have Docker & Docker Compose v2 installed.
[Docker Compose]: https://docs.docker.com/compose/
#### Install using docker-compose
1. Clone the ass repo using `git clone https://github.com/tycrek/ass.git && cd ass/`
2. Run the command that corresponds to your OS:
- **Linux**: `./install/docker-linux.sh` (uses `#!/bin/bash`)
- **Windows**: `install\docker-windows.bat` (from Command Prompt)
- These scripts are identical using the equivalent commands in each OS.
3. Work through the setup process when prompted.
The upload token will be printed at the end of the setup script prompts. This is the token that you'll need to use to upload resources to ass. It may go by too quickly to copy it, so just scroll back up in your terminal after setup or run `cat auth.json`.
You should now be able to access the ass server at `http://localhost:40115/` (ass-docker will bind to host `0.0.0.0` to allow external access). You can configure a reverse proxy (for example, [Caddy]; also check out [my tutorial]) to make it accessible from the internet with automatic SSL.
#### What is this script doing?
It creates directories & files required for Docker Compose to properly set up volumes. After that, it simply builds the image & container, then launches the setup process.
#### How do I run the npm scripts?
Since all 3 primary data files are bound to the container with Volumes, you can run the scripts in two ways: `docker compose exec` or `npm` on the host.
```bash
# Check the usage metrics
docker compose exec ass npm run metrics
# Run the setup script
docker compose exec ass npm run setup && docker compose restart
# Run npm on the host to run the setup script (also works for metrics)
# (You will have to meet the Node.js & npm requirements on your host for this to work properly)
npm run setup && docker compose restart
```
#### How do I update?
Easy! Just pull the changes & run this one-liner:
```bash
# Pull the latest version of ass & rebuild the image
git pull && docker compose build --no-cache && docker compose up -d
```
#### What else should I be aware of?
Deploying ass with Docker exposes **five** volumes. These volumes let you edit the config, view the auth or data files, or view the `uploads/` folder from your host.
- `uploads/`
- `share/`
- `config.json`
- `auth.json`
- `data.json`
0. This repo comes with a pre-made Compose file.
1. Clone the repo using `git clone https://github.com/tycrek/ass.git && cd ass/`
2. Run `docker compose up`
- You can append `-d` to run in the background.
3. When the logs indicate, visit your installation in your browser to begin the setup.
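If you prefer running the container in the background, a minimal flow (assuming the bundled Compose file, with `docker compose logs` used to watch for the setup prompt) looks like this:

```bash
# Clone the repo and start the container detached
git clone https://github.com/tycrek/ass.git && cd ass/
docker compose up -d

# Follow the logs until they indicate the web setup is ready
docker compose logs -f
```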
</details>
@@ -184,16 +100,17 @@ Deploying ass with Docker exposes **five** volumes. These volumes let you edit t
<summary><em>Expand for local installation steps</em></summary>
<br>
1. You should have **Node.js 16** & **npm 8 or later** installed.
1. You should have **Node.js 20** & **npm 10 or later** installed.
2. Clone this repo using `git clone https://github.com/tycrek/ass.git && cd ass/`
3. Run `npm i --save-dev` to install the required dependencies (`--save-dev` is **required** for compilation)
4. Run `npm run build` to compile the TypeScript files
5. Run `npm start` to start ass.
The first time you run ass, the setup process will automatically be called & you will be shown your first authorization token; save this as you will need it to configure ShareX.
3. Run `pnpm i` or `npm i`
4. Run `npm run build`
5. Run `npm start`
6. When the logs indicate, visit your installation in your browser to begin the setup.
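Condensed into one shell session (a sketch assuming Node.js 20 and pnpm are already installed):

```bash
git clone https://github.com/tycrek/ass.git && cd ass/
pnpm i           # or: npm i
npm run build    # compile the TypeScript sources
npm start        # then follow the logs to the web setup
```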
</details>
# the readme from this point is out of date
## Using HTTPS
For HTTPS support, you must configure a reverse proxy. I recommend [Caddy] but any reverse proxy works fine (such as Apache or Nginx). A sample config for Caddy is provided below:
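(The repo's own sample is not shown in this diff; as a rough sketch, a Caddyfile that proxies to ass on the default port mentioned above might look like the following, with the domain and port adjusted to your deployment.)

```bash
# Hypothetical minimal Caddyfile (assumes ass listens on localhost:40115)
cat > Caddyfile <<'EOF'
ass.example.com {
    reverse_proxy localhost:40115
}
EOF
```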
@@ -246,44 +163,6 @@ If you need to override a specific part of the config to be different from the g
[Luxon]: https://moment.github.io/luxon/#/zones?id=specifying-a-zone
### Fancy embeds
If you primarily share media on Discord, you can add these additional (optional) headers to build embeds:
| Header | Purpose |
| ------ | ------- |
| **`X-Ass-OG-Title`** | Large text shown above your media. Required for embeds to appear on desktop. |
| **`X-Ass-OG-Description`** | Small text shown below the title but above the media (does not show up on videos) |
| **`X-Ass-OG-Author`** | Small text shown above the title |
| **`X-Ass-OG-Author-Url`** | URL to open when the Author is clicked |
| **`X-Ass-OG-Provider`** | Smaller text shown above the author |
| **`X-Ass-OG-Provider-Url`** | URL to open when the Provider is clicked |
| **`X-Ass-OG-Color`** | Colour shown on the left side of the embed. Must be one of `&random`, `&vibrant`, or a hex colour value (for example: `#fe3c29`). Random is a randomly generated hex value & Vibrant is sourced from the image itself |
#### Embed placeholders
You can insert certain metadata into your embeds with these placeholders:
| Placeholder | Result |
| ----------- | ------ |
| **`&size`** | The files size with proper notation rounded to two decimals (example: `7.06 KB`) |
| **`&filename`** | The original filename of the uploaded file |
| **`&timestamp`** | The timestamp of when the file was uploaded (example: `Oct 14, 1983, 1:30 PM`) |
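Purely as an illustration, these headers and placeholders could be attached to an upload with `curl`; the endpoint and form field name below are assumptions (match them to your own ShareX/uploader config), while the header and placeholder names come from the tables above:

```bash
# Hypothetical upload with embed headers (endpoint & form field are assumptions)
curl https://your.domain/ \
  -H "Authorization: YOUR_UPLOAD_TOKEN" \
  -H "X-Ass-OG-Title: &filename (&size)" \
  -H "X-Ass-OG-Description: Uploaded &timestamp" \
  -H "X-Ass-OG-Color: &vibrant" \
  -F "file=@screenshot.png"
```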
#### Server-side embed configuration
You may also specify a default embed config on the server. Keep in mind that if users specify the `X-Ass-OG-Title` header, the server-side config will be ignored. To configure the server-side embed, create a new file in the `share/` directory named `embed.json`. Available options are:
- **`title`**
- `description`
- `author`
- `authorUrl`
- `provider`
- `providerUrl`
- `color`
Their values are equivalent to the headers listed above.
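A minimal `share/embed.json` sketch using the keys above (the values here are placeholders):

```bash
# Write a server-side embed config with the documented keys
cat > share/embed.json <<'EOF'
{
  "title": "My uploads",
  "description": "Shared via ass",
  "author": "your name",
  "authorUrl": "https://your.domain",
  "provider": "ass",
  "providerUrl": "https://github.com/tycrek/ass",
  "color": "#fe3c29"
}
EOF
```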
### Webhooks
You may use Discord webhooks as an easy way to keep track of your uploads. The first step is to [create a new Webhook]. You only need to follow the first section, **Making a Webhook**. Once you are done that, click **Copy Webhook URL**. Finally, add these headers to your custom uploader:
@@ -298,22 +177,6 @@ Webhooks will show the filename, mimetype, size, upload timestamp, thumbnail, & a
[create a new Webhook]: https://support.discord.com/hc/en-us/articles/228383668-Intro-to-Webhooks
## Customizing the viewer
If you want to customize the font or colours of the viewer page, create a file in the `share/` directory called `theme.json`. Available options are:
| Option | Purpose |
| ------ | ------- |
| **`font`** | The font family to use; defaults to `"Josefin Sans"`. Fonts with a space should be surrounded by double quotes. |
| **`bgPage`** | Background colour for the whole page |
| **`bgViewer`** | Background colour for the viewer element |
| **`txtPrimary`** | Primary text colour; this should be your main brand colour. |
| **`txtSecondary`** | Secondary text colour; this is used for the file details. |
| **`linkPrimary`** | Primary link colour |
| **`linkHover`** | Colour of the `hover` effect for links |
| **`linkActive`** | Colour of the `active` effect for links |
| **`borderHover`** | Colour of the `hover` effect for borders; this is used for the underlining links. |
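A hedged example of `share/theme.json` covering the options above (the font and colour values are arbitrary placeholders):

```bash
# Write a viewer theme using the documented options
cat > share/theme.json <<'EOF'
{
  "font": "Ubuntu",
  "bgPage": "#212121",
  "bgViewer": "#151515",
  "txtPrimary": "#fd842d",
  "txtSecondary": "#cccccc",
  "linkPrimary": "#fd842d",
  "linkHover": "#ffa866",
  "linkActive": "#ff7b00",
  "borderHover": "#fd842d"
}
EOF
```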
## Custom index
By default, ass directs the index route `/` to this README. Follow these steps to use a custom index:
@@ -339,148 +202,12 @@ To use a custom 404 page, create a file in the `share/` directory called `404.ht
If there's interest, I may allow making this a function, similar to the custom index.
## File storage
ass supports three methods of file storage: local, S3, or [Skynet].
### Local
Local storage is the simplest option, but relies on you having a lot of disk space to store files, which can be costly.
### S3
Any existing object storage server that's compatible with [Amazon S3] can be used with ass. I personally host my files using Digital Ocean Spaces, which implements S3.
S3 servers are generally very fast & have very good uptime, though this will depend on the hosting provider & plan you choose.
## New user system (v0.14.0)
The user system was overhauled in v0.14.0 to allow more features and flexibility. New fields on users include `admin`, `passhash`, `unid`, and `meta` (these will be documented more once the system is finalized).
New installs will automatically generate a default user. Check the `auth.json` file for the token. You will use this for API requests and to authenticate within ShareX.
ass will automatically convert your old `auth.json` to the new format. **Always backup your `auth.json` and `data.json` before updating**. By default, the original user (named `ass`) will be marked as an admin.
### Adding users
You may add users via the CLI or the API. I'll document the API further in the future.
#### CLI
```bash
npm run cli-adduser <username> <password> [admin] [meta]
```
| Argument | Purpose |
| -------- | ------- |
| **`username`** `string` | The username of the user. |
| **`password`** `string` | The password of the user. |
| **`admin?`** `boolean` | Whether the user is an admin. Defaults to `false`. |
| **`meta?`** `string` | Any additional metadata to store on the user, as a JSON string. |
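For example, creating an admin user with a small metadata blob (all values here are illustrative):

```bash
npm run cli-adduser alice "correct-horse-battery" true '{"note":"first admin"}'
```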
**Things still not added:**
- Modifying/deleting users via the API
## Developer API
ass includes an API (v0.14.0) for frontend developers to easily integrate with. Right now the API is pretty limited but I will expand on it in the future, with frontend developer feedback.
Any endpoints requiring authorization will require an `Authorization` header with the value being the user's upload token. Admin users are a new feature introduced in v0.14.0. Admin users can access all endpoints, while non-admin users can only access those relevant to them.
Other things to note:
- **All endpoints are prefixed with `/api/`**.
- All endpoints will return a JSON object unless otherwise specified.
- Successful endpoints *should* return a `200` status code. Any errors will use the corresponding `4xx` or `5xx` status code (such as `401 Unauthorized`).
- ass's API will try to be as compliant with the HTTP spec as possible. For example, using `POST/PUT` for create/modify, and response codes such as `409 Conflict` for duplicate entries. This compliance may not be 100% perfect, but I will try my best.
### API endpoints
| Endpoint | Purpose | Admin? |
| -------- | ------- | ------ |
| **`GET /user/`** | Returns a list of all users | Yes |
| **`GET /user/:id`** | Returns the user with the given ID | Yes |
| **`GET /user/self`** | Returns the current user | No |
| **`GET /user/token/:token`** | Returns the user with the given token | No |
| **`POST /user/`** | Creates a new user. Request body must be a JSON object including `username` and `password`. You may optionally include `admin` (boolean) or `meta` (object). Returns 400 if fails. | Yes |
| **`POST /user/password/reset/:id`** | Force resets the user's **password**. Request body must be a JSON object including a `password`. | Yes |
| **`DELETE /user/:id`** | Deletes the user with the given ID, as well as all their uploads. | Yes |
| **`PUT /user/meta/:id`** | Updates the user's metadata. Request body must be a JSON object with keys `key` and `value`, with the key/value you want to set in the users metadata. Optionally you may include `force: boolean` to override existing keys. | Yes |
| **`DELETE /user/meta/:id`** | Deletes a key/value from a users metadata. Request body must be a JSON object with a `key` property specifying the key to delete. | Yes |
| **`PUT /user/username/:id`** | Updates the user's username. Request body must be a JSON object with a `username` property. | Yes |
| **`PUT /user/token/:id`** | Regenerates a users upload token | Yes |
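As a hedged illustration of the conventions above (replace the host and tokens with your own), fetching the current user and creating a new one might look like:

```bash
# Get the user associated with an upload token
curl -H "Authorization: YOUR_UPLOAD_TOKEN" https://your.domain/api/user/self

# Create a new user (admin token required); `admin` and `meta` are optional
curl -X POST https://your.domain/api/user/ \
  -H "Authorization: ADMIN_UPLOAD_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"username": "bob", "password": "hunter2", "admin": false}'
```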
## Custom frontends - OUTDATED
**Please be aware that this section is outdated (marked as of 2022-04-15). It will be updated when I overhaul the frontend system.**
**Update 2022-12-24: I plan to overhaul this early in 2023.**
ass is intended to provide a strong backend for developers to build their own frontends around. [Git Submodules] make it easy to create custom frontends. Submodules are their own projects, which means you are free to build the router however you wish, as long as it exports the required items. A custom frontend is really just an [Express.js router].
**For a detailed walkthrough on developing your first frontend, [consult the wiki][ctw1].**
[Git Submodules]: https://git-scm.com/book/en/v2/Git-Tools-Submodules
[Express.js router]: https://expressjs.com/en/guide/routing.html#express-router
[ctw1]: https://github.com/tycrek/ass/wiki/Writing-a-custom-frontend
## Data Engines
[Papito data engines] are responsible for managing your data. "Data" has two parts: an identifier & the actual data itself. With ass, the data is a JSON object representing the uploaded resource. The identifier is the unique ID in the URL returned to the user on upload. **Update August 2022:** I plan to overhaul Papito and how all this works *eventually*. If this comment is still here in a year, ~~kick~~ message me.
[Papito data engines]: https://github.com/tycrek/papito
**Supported data engines:**
| Name | Description | Links |
| :--: | ----------- | :---: |
| **JSON** | JSON-based data storage. On disk, data is stored in a JSON file. In memory, data is stored in a [Map]. This is the default engine. | [GitHub][GH ASE]<br>[npm][npm ASE] |
| **PostgreSQL** | Data storage using a [PostgreSQL] database. [node-postgres] is used for communicating with the database. | [GitHub][GH APSQL]<br>[npm][npm APSQL] |
| **Mongoose** | Data storage using a [MongoDB] database. [mongoose] is used for communicating with the database. Created by [@dylancl] | [GitHub][GH AMongoose]<br>[npm][npm AMongoose] |
[Map]: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Map
[GH ASE]: https://github.com/tycrek/papito/
[npm ASE]: https://www.npmjs.com/package/@tycrek/papito
[PostgreSQL]: https://www.postgresql.org/
[node-postgres]: https://node-postgres.com/
[GH APSQL]: https://github.com/tycrek/ass-psql/
[npm APSQL]: https://www.npmjs.com/package/@tycrek/ass-psql
[MongoDB]: https://www.mongodb.com/
[mongoose]: https://mongoosejs.com/
[GH AMongoose]: https://github.com/dylancl/ass-mongoose
[npm AMongoose]: https://www.npmjs.com/package/ass-mongoose
[@dylancl]: https://github.com/dylancl
A Papito data engine implements support for one type of database (or file, such as JSON or YAML). This lets ass server hosts pick their database of choice, because all they'll have to do is enter the connection/authentication details & ass will handle the rest, using the resource ID as the key.
**~~For a detailed walkthrough on developing engines, [consult the wiki][ctw2].~~ Outdated!**
[`data.js`]: https://github.com/tycrek/ass/blob/master/data.js
[ctw2]: https://github.com/tycrek/ass/wiki/Writing-a-StorageEngine
## npm scripts
ass has a number of pre-made npm scripts for you to use. **All** of these scripts should be run using `npm run <script-name>` (except `start`).
| Script | Description |
| ------ | ----------- |
| **`start`** | Starts the ass server. This is the default script & is run with **`npm start`**. |
| `build` | Compiles the TypeScript files into JavaScript. |
| `dev` | Chains the `build` & `compile` scripts together. |
| `setup` | Starts the easy setup process. Should be run after any updates that introduce new config options. |
| `metrics` | Runs the metrics script. This is a simple script that outputs basic resource statistics. |
| `purge` | Purges all uploads & data associated with them. This does **not** delete any users, however. |
| `engine-check` | Ensures your environment meets the minimum Node & npm version requirements. |
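Typical invocations, per the table above:

```bash
npm start            # run the server
npm run build        # compile the TypeScript files
npm run metrics      # print basic resource statistics
npm run purge        # delete all uploads & their data (users are kept)
```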
[`FORCE_COLOR`]: https://nodejs.org/dist/latest-v16.x/docs/api/cli.html#cli_force_color_1_2_3
## Flameshot users (Linux)
Use [this script]. For the `KEY`, put your token. Thanks to [@ToxicAven] for creating this!
Use [`flameshot-v2.sh`] or [`sample_screenshotter.sh`].
[this script]: https://github.com/tycrek/ass/blob/master/flameshot_example.sh
[@ToxicAven]: https://github.com/ToxicAven
[`flameshot-v2.sh`]: https://github.com/tycrek/ass/blob/dev/0.15.0/flameshot-v2.sh
[`sample_screenshotter.sh`]: https://github.com/tycrek/ass/blob/dev/0.15.0/sample_screenshotter.sh
## Contributing
@@ -493,7 +220,6 @@ Please follow the [Contributing Guidelines] when submitting Issues or Pull Reques
- Thanks to [hlsl#1359] for the logo
- [Gfycat] for their wordlists
- [Aven], for helping kickstart the project
- My spiteful ass for motivating me to actually take this project to new heights
[hlsl#1359]: https://behance.net/zevwolf
[Aven]: https://github.com/ToxicAven

66
.github/workflows/docker-build.yml vendored Normal file

@ -0,0 +1,66 @@
name: "Docker Build"
on:
push:
branches: [ master, dev/0.15.0 ]
jobs:
build_and_push:
name: Build & Publish Docker Images
if: (github.ref == 'refs/heads/master' || github.ref == 'refs/heads/dev/0.15.0') && contains(github.event.head_commit.message, '[docker build]')
runs-on: ubuntu-latest
steps:
- name: Wait for build to succeed
uses: lewagon/wait-on-check-action@master
with:
ref: ${{ github.ref }}
check-name: build
repo-token: ${{ secrets.GH_TOKEN }}
allowed-conclusions: success
- name: Checkout
uses: actions/checkout@v4
- name: Set up QEMU
uses: docker/setup-qemu-action@v3
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
- name: Log in to Docker Hub
uses: docker/login-action@v3
with:
username: ${{ vars.DOCKER_USERNAME }}
password: ${{ secrets.DOCKER_TOKEN }}
- name: Build and push
uses: docker/build-push-action@v5
with:
context: .
file: ./Dockerfile
platforms: linux/amd64,linux/arm64
push: true
build-args: |
COMMIT_TAG=${{ github.sha }}
tags: |
tycrek/ass:latest
tycrek/ass:${{ github.sha }}
discord:
name: Send Discord Notification
needs: build_and_push
if: always() && github.event_name != 'pull_request' && contains(github.event.head_commit.message, '[docker build]')
runs-on: ubuntu-latest
steps:
- name: Get Build Job Status
uses: technote-space/workflow-conclusion-action@v3
- name: Combine Job Status
id: status
run: |
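# Workflow conclusions that should be reported to Discord as a failed build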
failures=(neutral skipped timed_out action_required)
if [[ ${failures[@]} =~ $WORKFLOW_CONCLUSION ]]; then
echo "status=failure" >> $GITHUB_OUTPUT
else
echo "status=$WORKFLOW_CONCLUSION" >> $GITHUB_OUTPUT
fi
- name: Post Status to Discord
uses: sarisia/actions-status-discord@v1
with:
webhook: ${{ secrets.DISCORD_WEBHOOK }}
status: ${{ steps.status.outputs.status }}
title: ${{ github.workflow }}
nofail: true


@ -1,11 +1,16 @@
name: TypeScript Build
concurrency:
group: ${{ github.workflow }}-${{ github.head_ref || github.run_id }}
cancel-in-progress: true
on:
push:
pull_request:
workflow_dispatch:
jobs:
build:
if: "!contains(github.event.head_commit.message, '[skip ci:ts]')"
runs-on: ubuntu-latest
env:
ARCHIVE_NAME: ass-build-${{ github.run_id }}-${{ github.run_number }}
@ -13,19 +18,19 @@ jobs:
# Checkout repo
- uses: actions/checkout@v4
# Set up Node 16
# Set up Node 20
- name: Setup Node.js environment
uses: actions/setup-node@v3
with:
node-version: 16.14.0
node-version: 20
# Install npm 8 & TypeScript
# Install npm 10 & TypeScript
- name: Install global packages
run: npm i -g npm@8 typescript
run: npm i -g npm@10 typescript pnpm
# Install ass dependencies (including types)
- name: Install dependencies
run: npm i --save-dev
run: pnpm i
# Compile the TypeScript files
- name: Run build script

126
.gitignore vendored

@ -1,123 +1,15 @@
# Logs
logs
*.log
npm-debug.log*
yarn-debug.log*
yarn-error.log*
lerna-debug.log*
# Diagnostic reports (https://nodejs.org/api/report.html)
report.[0-9]*.[0-9]*.[0-9]*.[0-9]*.json
# Runtime data
pids
*.pid
*.seed
*.pid.lock
# Directory for instrumented libs generated by jscoverage/JSCover
lib-cov
# Coverage directory used by tools like istanbul
coverage
*.lcov
# nyc test coverage
.nyc_output
# Grunt intermediate storage (https://gruntjs.com/creating-plugins#storing-task-files)
.grunt
# Bower dependency directory (https://bower.io/)
bower_components
# node-waf configuration
.lock-wscript
# Compiled binary addons (https://nodejs.org/api/addons.html)
build/Release
# Dependency directories
node_modules/
jspm_packages/
# TypeScript v1 declaration files
typings/
# build dirs
dist*/
# TypeScript cache
*.tsbuildinfo
# ass data
.ass-data/
# Optional npm cache directory
.npm
# VitePress documentation
docs/.vitepress/dist/
docs/.vitepress/cache/
# Optional eslint cache
.eslintcache
# Microbundle cache
.rpt2_cache/
.rts2_cache_cjs/
.rts2_cache_es/
.rts2_cache_umd/
# Optional REPL history
.node_repl_history
# Output of 'npm pack'
*.tgz
# Yarn Integrity file
.yarn-integrity
# dotenv environment variables file
.env
.env.test
# parcel-bundler cache (https://parceljs.org/)
.cache
# Next.js build output
.next
# Nuxt.js build / generate output
.nuxt
dist
# Gatsby files
.cache/
# Comment in the public line in if your project uses Gatsby and *not* Next.js
# https://nextjs.org/blog/next-9-1#public-directory-support
# public
# vuepress build output
.vuepress/dist
# Serverless directories
.serverless/
# FuseBox cache
.fusebox/
# DynamoDB Local files
.dynamodb/
# TernJS port file
.tern-port
# tokens
auth.json*
auth.*.json
# data
data.json*
# uploads
uploads/
# config
config.json
# certificates
*.crt
# share/ directory
share/
# Wrangler local cache (docs dev server)
.wrangler/


@ -1,27 +1,12 @@
# ass Dockerfile v0.3.1
# ass Dockerfile v0.3.3
# authors:
# - tycrek <t@tycrek.com> (https://tycrek.com/)
# - Zusier <zusier@pm.me> (https://github.com/Zusier)
# Node 16 image
FROM node:16.20.2
# Set working directory
WORKDIR /opt/ass/
# Copy directory files (config.json, source files etc.)
FROM node:20.9.0-alpine
WORKDIR /opt/ass-src/
COPY . ./
# Ensure these directories & files exist for compose volumes
RUN mkdir -p /opt/ass/uploads/thumbnails/ && \
mkdir -p /opt/ass/share/ && \
touch /opt/ass/config.json && \
touch /opt/ass/auth.json && \
touch /opt/ass/data.json
# Install dependencies as rootless user
RUN npm i --save-dev && \
npm run build
# Start ass
CMD npm start
RUN npm i -g pnpm
RUN pnpm i
RUN npm run build
CMD npm start


@ -1,6 +1,6 @@
ISC License
Copyright (c) 2021, tycrek
Copyright (c) 2021-2023, tycrek
Permission to use, copy, modify, and/or distribute this software for any
purpose with or without fee is hereby granted, provided that the above


@ -1,14 +0,0 @@
{
"HTTP": 80,
"HTTPS": 443,
"CODE_OK": 200,
"CODE_NO_CONTENT": 204,
"CODE_BAD_REQUEST": 400,
"CODE_UNAUTHORIZED": 401,
"CODE_NOT_FOUND": 404,
"CODE_CONFLICT": 409,
"CODE_PAYLOAD_TOO_LARGE": 413,
"CODE_UNSUPPORTED_MEDIA_TYPE": 415,
"CODE_INTERNAL_SERVER_ERROR": 500,
"KILOBYTES": 1024
}

184
backend/UserConfig.ts Normal file

@ -0,0 +1,184 @@
import { UserConfiguration, UserConfigTypeChecker, PostgresConfiguration } from 'ass';
import fs from 'fs-extra';
import { path } from '@tycrek/joint';
import { log } from './log.js';
import { validate } from 'william.js';
const FILEPATH = path.join('.ass-data/userconfig.json');
/**
* Returns a boolean if the provided value is a number
*/
const numChecker = (val: any) => {
try { return !isNaN(parseInt(val)) && typeof val !== 'string'; }
catch (err) { return false; }
}
/**
* Returns a boolean if the provided value is a non-empty string
*/
const basicStringChecker = (val: any) => typeof val === 'string' && val.length > 0;
/**
* User-config property type checker functions
*/
const Checkers: UserConfigTypeChecker = {
uploadsDir: (val) => {
try {
fs.pathExistsSync(val)
? fs.accessSync(val)
: fs.mkdirSync(val, { recursive: true });
return true;
}
catch (err) {
log.warn('Cannot access directory', `${val}`);
console.error(err);
return false;
}
},
idType: (val) => ['random', 'original', 'gfycat', 'timestamp', 'zws'].includes(val),
idSize: numChecker,
gfySize: numChecker,
maximumFileSize: numChecker,
discordWebhook: (val) => validate.discord.webhook(val),
s3: {
endpoint: basicStringChecker,
bucket: basicStringChecker,
region: (val) => val == null || basicStringChecker(val),
credentials: {
accessKey: basicStringChecker,
secretKey: basicStringChecker
}
},
sql: {
mySql: {
host: basicStringChecker,
user: basicStringChecker,
password: basicStringChecker,
database: basicStringChecker,
port: (val) => numChecker(val) && val >= 1 && val <= 65535
},
postgres: {
port: (val) => numChecker(val) && val >= 1 && val <= 65535
}
},
rateLimit: {
endpoint: (val) => val == null || (val != null && (numChecker(val.requests) && numChecker(val.duration)))
}
};
export class UserConfig {
private static _config: UserConfiguration;
private static _ready = false;
public static get config() { return UserConfig._config; }
public static get ready() { return UserConfig._ready; }
constructor(config?: any) {
// Typically this would only happen during first-time setup (for now)
if (config != null) {
UserConfig._config = UserConfig.parseConfig(config);
UserConfig._ready = true;
}
}
/**
* Ensures that all config options are valid
*/
private static parseConfig(c: any) {
const config = (typeof c === 'string' ? JSON.parse(c) : c) as UserConfiguration;
// * Base config
if (!Checkers.uploadsDir(config.uploadsDir)) throw new Error(`Unable to access uploads directory: ${config.uploadsDir}`);
if (!Checkers.idType(config.idType)) throw new Error(`Invalid ID type: ${config.idType}`);
if (!Checkers.idSize(config.idSize)) throw new Error('Invalid ID size');
if (!Checkers.gfySize(config.gfySize)) throw new Error('Invalid Gfy size');
if (!Checkers.maximumFileSize(config.maximumFileSize)) throw new Error('Invalid maximum file size');
if (!Checkers.discordWebhook(config.discordWebhook)) throw new Error('Invalid Discord webhook');
// * Optional S3 config
if (config.s3 != null) {
if (!Checkers.s3.endpoint(config.s3.endpoint)) throw new Error('Invalid S3 Endpoint');
if (!Checkers.s3.bucket(config.s3.bucket)) throw new Error('Invalid S3 Bucket');
if (!Checkers.s3.region(config.s3.region)) throw new Error('Invalid S3 Region');
if (!Checkers.s3.credentials.accessKey(config.s3.credentials.accessKey)) throw new Error('Invalid S3 Access key');
if (!Checkers.s3.credentials.secretKey(config.s3.credentials.secretKey)) throw new Error('Invalid S3 Secret key');
}
// * Optional database config(s)
if (config.database != null) {
// these both have the same schema so we can just check both
if (config.database.kind == 'mysql' || config.database.kind == 'postgres') {
if (config.database.options != undefined) {
if (!Checkers.sql.mySql.host(config.database.options.host)) throw new Error('Invalid database host');
if (!Checkers.sql.mySql.user(config.database.options.user)) throw new Error('Invalid database user');
if (!Checkers.sql.mySql.password(config.database.options.password)) throw new Error('Invalid database password');
if (!Checkers.sql.mySql.database(config.database.options.database)) throw new Error('Invalid database');
if (!Checkers.sql.mySql.port(config.database.options.port)) throw new Error('Invalid database port');
if (config.database.kind == 'postgres') {
if (!Checkers.sql.postgres.port((config.database.options as PostgresConfiguration).port)) {
throw new Error('Invalid database port');
}
}
} else throw new Error('Database options missing');
}
}
// * optional rate limit config
if (config.rateLimit != null) {
if (!Checkers.rateLimit.endpoint(config.rateLimit.login)) throw new Error('Invalid Login rate limit configuration');
if (!Checkers.rateLimit.endpoint(config.rateLimit.upload)) throw new Error('Invalid Upload rate limit configuration');
if (!Checkers.rateLimit.endpoint(config.rateLimit.api)) throw new Error('Invalid API rate limit configuration');
}
// All is fine, carry on!
return config;
}
/**
* Save the config file to disk
*/
public static saveConfigFile(): Promise<void> {
return new Promise(async (resolve, reject) => {
try {
// Only save if the config has been parsed
if (!UserConfig._ready) throw new Error('Config not ready to be saved!');
// Write to file
await fs.writeFile(FILEPATH, JSON.stringify(UserConfig._config, null, '\t'));
resolve(void 0);
} catch (err) {
log.error('Failed to save config file!');
reject(err);
}
});
}
/**
* Reads the config file from disk
*/
public static readConfigFile(): Promise<void> {
return new Promise(async (resolve, reject) => {
try {
// Read the file data
const data = (await fs.readFile(FILEPATH)).toString();
// Ensure the config is valid
UserConfig._config = UserConfig.parseConfig(data);
UserConfig._ready = true;
resolve(void 0);
} catch (err) {
log.error('Failed to read config file!');
reject(err);
}
});
}
}

211
backend/app.ts Normal file

@ -0,0 +1,211 @@
import { AssUser, ServerConfiguration } from 'ass';
import fs from 'fs-extra';
import tailwindcss from 'tailwindcss';
import session from 'express-session';
import MemoryStore from 'memorystore';
import express, { Request, Response, NextFunction, RequestHandler, json as BodyParserJson } from 'express';
import { path, isProd } from '@tycrek/joint';
import { epcss } from '@tycrek/express-postcss';
import { log } from './log.js';
import { get } from './data.js';
import { UserConfig } from './UserConfig.js';
import { DBManager } from './sql/database.js';
import { JSONDatabase } from './sql/json.js';
import { MySQLDatabase } from './sql/mysql.js';
import { PostgreSQLDatabase } from './sql/postgres.js';
import { buildFrontendRouter } from './routers/_frontend.js';
/**
* Top-level metadata exports
*/
export const App = {
pkgVersion: ''
};
/**
* Custom middleware to attach the ass object (and construct the `host` property)
*/
const assMetaMiddleware = (port: number, proxied: boolean): RequestHandler =>
(req: Request, _res: Response, next: NextFunction) => {
req.ass = {
host: `${req.protocol}://${req.hostname}${proxied ? '' : `:${port}`}`,
version: App.pkgVersion
};
// Set up Session if required
if (!req.session.ass)
(log.debug('Session missing'), req.session.ass = {});
next();
};
/**
* Custom middleware to verify user access
*/
const loginRedirectMiddleware = (requireAdmin = false): RequestHandler =>
async (req: Request, res: Response, next: NextFunction) => {
// If auth doesn't exist yet, make the user login
if (!req.session.ass?.auth) {
log.warn('User not logged in', req.baseUrl);
// Set pre-login path so user is directed to their requested page
req.session.ass!.preLoginPath = req.baseUrl;
// Redirect
res.redirect('/login');
} else {
const user = (await get('users', req.session.ass.auth.uid)) as AssUser;
// Check if user is admin
if ((requireAdmin || req.baseUrl === '/admin') && !user.admin) {
log.warn('Admin verification failed', user.username, user.id);
res.sendStatus(403);
} else next();
}
};
/**
* Main function.
* Yes I'm using main() in TS, cry about it
*/
async function main() {
// Launch log
const pkg = await fs.readJson(path.join('package.json')) as { name: string, version: string };
log.blank().info(pkg.name, pkg.version).blank();
App.pkgVersion = pkg.version;
// Ensure data directory exists
log.debug('Checking data dir')
await fs.ensureDir(path.join('.ass-data'));
// Set default server configuration
const serverConfig: ServerConfiguration = {
host: '0.0.0.0',
port: 40115,
proxied: isProd()
};
// Replace with user details, if necessary
try {
const exists = await fs.pathExists(path.join('.ass-data/server.json'));
if (exists) {
// Read file
const { host, port, proxied } = await fs.readJson(path.join('.ass-data/server.json')) as { host?: string, port?: number, proxied?: boolean };
// Set details, if available
if (host) serverConfig.host = host;
if (port) serverConfig.port = port;
if (proxied != undefined) serverConfig.proxied = proxied;
log.debug('server.json', `${host ? `host=${host},` : ''}${port ? `port=${port},` : ''}${proxied != undefined ? `proxied=${proxied},` : ''}`);
}
} catch (err) {
log.error('Failed to read server.json');
console.error(err);
throw err;
}
// Attempt to load user configuration
await new Promise((resolve) => UserConfig.readConfigFile().then(() => resolve(void 0))
.catch((err) => (err.code && err.code === 'ENOENT' ? {} : console.error(err), resolve(void 0))));
// If user config is ready, try to configure SQL
if (UserConfig.ready && UserConfig.config.database != null) {
try {
switch (UserConfig.config.database?.kind) {
case 'json':
await DBManager.use(new JSONDatabase());
break;
case 'mysql':
await DBManager.use(new MySQLDatabase());
break;
case 'postgres':
await DBManager.use(new PostgreSQLDatabase());
break;
}
} catch (err) { throw new Error(`Failed to configure SQL`); }
} else { // default to json database
log.debug('DB not set! Defaulting to JSON');
await DBManager.use(new JSONDatabase());
}
// Set up Express
const app = express();
// Configure sessions
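// One day in milliseconds, used for both the session cookie maxAge and the store's prune period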
const DAY = 86_400_000;
app.use(session({
name: 'ass',
resave: true,
saveUninitialized: false,
cookie: { maxAge: DAY, secure: isProd() },
secret: (Math.random() * 100).toString(),
store: new (MemoryStore(session))({ checkPeriod: DAY }) as any,
}));
// Configure Express features
app.enable('case sensitive routing');
app.disable('x-powered-by');
// Set Express variables
app.set('trust proxy', serverConfig.proxied);
app.set('view engine', 'pug');
app.set('views', 'views/');
// Middleware
app.use(log.express());
app.use(BodyParserJson());
app.use(assMetaMiddleware(serverConfig.port, serverConfig.proxied));
// Favicon
app.use('/favicon.ico', (req, res) => res.redirect('https://i.tycrek.dev/ass'));
// CSS
app.use('/.css', epcss({
cssPath: path.join('tailwind.css'),
plugins: [
tailwindcss,
(await import('autoprefixer')).default(),
(await import('cssnano')).default(),
(await import('@tinycreek/postcss-font-magician')).default(),
],
warn: (warning: Error) => log.warn('PostCSS', warning.toString())
}));
// Metadata routes
app.get('/.ass.host', (req, res) => res.type('text').send(req.ass.host));
app.get('/.ass.version', (req, res) => res.type('text').send(req.ass.version));
// Basic page routers
app.use('/setup', buildFrontendRouter('setup', false));
app.use('/login', buildFrontendRouter('login'));
app.use('/admin', loginRedirectMiddleware(), buildFrontendRouter('admin'));
app.use('/user', loginRedirectMiddleware(), buildFrontendRouter('user'));
// Advanced routers
app.use('/api', (await import('./routers/api.js')).router);
app.use('/', (await import('./routers/index.js')).router);
// Host app
app.listen(serverConfig.port, serverConfig.host, () => log[UserConfig.ready ? 'success' : 'warn']('Server listening', UserConfig.ready ? 'Ready for uploads' : 'Setup required', `click http://127.0.0.1:${serverConfig.port}`));
}
// Start program
main().catch((err) => (console.error(err), process.exit(1)));
// Exit tasks
['SIGINT', 'SIGTERM'].forEach((signal) => process.addListener(signal as any, () => {
// Hide ^C in console output
process.stdout.write('\r');
// Log then exit
log.info('Exiting', `received ${signal}`);
process.exit();
}));

54
backend/data.ts Normal file

@ -0,0 +1,54 @@
import { AssFile, AssUser, DatabaseValue, NID } from 'ass';
import { log } from './log.js';
import { UserConfig } from './UserConfig.js';
import { DBManager } from './sql/database.js';
/**
* Switcher type for exported functions
*/
type DataSector = 'files' | 'users';
/**
* database kind -> name mapping
*/
const DBNAMES = {
'mysql': 'MySQL',
'postgres': 'PostgreSQL',
'json': 'JSON'
};
export const put = (sector: DataSector, key: NID, data: AssFile | AssUser): Promise<void> => new Promise(async (resolve, reject) => {
try {
if (sector === 'files') {
// * 1: Save as files (image, video, etc)
await DBManager.put('assfiles', key, data as AssFile);
} else {
// * 2: Save as users
await DBManager.put('assusers', key, data as AssUser);
}
log.info(`PUT ${sector} data`, `using ${DBNAMES[UserConfig.config.database?.kind ?? 'json']}`, key);
resolve(void 0);
} catch (err) {
reject(err);
}
});
export const get = (sector: DataSector, key: NID): Promise<DatabaseValue> => new Promise(async (resolve, reject) => {
try {
const data = await DBManager.get(sector === 'files' ? 'assfiles' : 'assusers', key);
resolve(data);
} catch (err) {
reject(err);
}
});
export const getAll = (sector: DataSector): Promise<DatabaseValue[]> => new Promise(async (resolve, reject) => {
try {
const data = await DBManager.getAll(sector === 'files' ? 'assfiles' : 'assusers');
resolve(data);
} catch (err) {
reject(err);
}
});

50
backend/generators.ts Normal file

@ -0,0 +1,50 @@
import fs from 'fs-extra';
import cryptoRandomString from 'crypto-random-string';
import { randomBytes, getRandomValues } from 'crypto';
import { path } from '@tycrek/joint';
type Length = { length: number, gfyLength?: number };
// todo: load gfy length from config file
const MIN_LENGTH_GFY = 2;
/**
* Random generator
*/
export const random = ({ length }: Length) => cryptoRandomString({ length, type: 'alphanumeric' });
/**
* Timestamp generator
*/
export const timestamp = () => `${Date.now()}`;
/**
* Charset generator
*/
export const charset = ({ length, charset }: { length: number, charset: string[] }): string =>
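// Map each random byte onto the charset, then drop the first character and append the charset's first entry (output stays `length` characters long)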
[...randomBytes(length)].map((byte) => charset[Number(byte) % charset.length]).join('').slice(1).concat(charset[0]);
/**
* ZWS generator
*/
export const zws = ({ length }: Length) => charset({ length, charset: ['\u200B', '\u200C', '\u200D', '\u2060'] });
/**
* Gfycat generator
*/
export const gfycat = ({ gfyLength }: Length) => {
const count = gfyLength ?? MIN_LENGTH_GFY;
const getWord = (list: string[], delim = '') =>
list[Math.floor(Math.random() * list.length)].concat(delim);
const adjectives = fs.readFileSync(path.join('./common/gfycat/adjectives.txt')).toString().split('\n');
const animals = fs.readFileSync(path.join('./common/gfycat/animals.txt')).toString().split('\n');
let gfycat = '';
for (let i = 0; i < (count < MIN_LENGTH_GFY ? MIN_LENGTH_GFY : count); i++)
gfycat += getWord(adjectives, '-');
return gfycat.concat(getWord(animals));
};
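/**
* Nano ID-style generator: maps random bytes onto a 64-character alphabet (0-9, a-z, A-Z, `_`, `-`)
*/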
export const nanoid = (size = 21) => getRandomValues(new Uint8Array(size)).reduce(((t, e) => t += (e &= 63) < 36 ? e.toString(36) : e < 62 ? (e - 26).toString(36).toUpperCase() : e > 62 ? "-" : "_"), "");

2
backend/log.ts Normal file

@ -0,0 +1,2 @@
import { TLog } from '@tycrek/log';
export const log = new TLog('debug');

89
backend/operations.ts Normal file

@ -0,0 +1,89 @@
import fs from 'fs-extra';
import sharp from 'sharp';
import Vibrant from 'node-vibrant';
import ffmpeg from 'ffmpeg-static';
import { exec } from 'child_process';
import { isProd } from '@tycrek/joint';
import { removeLocation } from '@xoi/gps-metadata-remover';
//@ts-ignore
import shell from 'any-shell-escape';
type SrcDest = { src: string, dest: string };
/**
* Strips GPS EXIF data from a file
*/
export const removeGPS = (file: string): Promise<boolean> => new Promise((resolve, reject) =>
fs.open(file, 'r+')
.then((fd) => removeLocation(file,
// Read function
(size: number, offset: number): Promise<Buffer> =>
fs.read(fd, Buffer.alloc(size), 0, size, offset)
.then(({ buffer }) => Promise.resolve(buffer)),
// Write function
(val: string, offset: number, enc: BufferEncoding): Promise<void> =>
fs.write(fd, Buffer.alloc(val.length, val, enc), 0, val.length, offset)
.then(() => Promise.resolve())))
.then(resolve)
.catch(reject));
const VIBRANT = { COLOURS: 256, QUALITY: 3 };
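/**
* Dominant colour extraction: resolves the most-populous Vibrant palette colour of an image; video and webp inputs fall back to a fixed hex colour
*/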
export const vibrant = (file: string, mimetype: string): Promise<string> => new Promise((resolve, reject) =>
// todo: random hex colour
mimetype.includes('video') || mimetype.includes('webp') ? resolve(`#335599`)
: sharp(file).png().toBuffer()
.then((data) => Vibrant.from(data)
.maxColorCount(VIBRANT.COLOURS)
.quality(VIBRANT.QUALITY)
.getPalette())
.then((palettes) => resolve(palettes[Object.keys(palettes).sort((a, b) => palettes[b]!.population - palettes[a]!.population)[0]]!.hex))
.catch((err) => reject(err)));
/**
* Thumbnail operations
*/
export class Thumbnail {
private static readonly THUMBNAIL = {
QUALITY: 75,
WIDTH: 200 * 2,
HEIGHT: 140 * 2,
}
private static getImageThumbnail({ src, dest }: SrcDest) {
return new Promise((resolve, reject) =>
sharp(src)
.resize(this.THUMBNAIL.WIDTH, this.THUMBNAIL.HEIGHT, { kernel: 'cubic' })
.jpeg({ quality: this.THUMBNAIL.QUALITY })
.toFile(dest)
.then(resolve)
.catch(reject));
}
private static getVideoThumbnail({ src, dest }: SrcDest) {
exec(this.getCommand({ src, dest }));
}
private static getCommand({ src, dest }: SrcDest) {
return shell([
ffmpeg, '-y',
'-v', (isProd() ? 'error' : 'debug'), // Log level
'-i', src, // Input file
'-ss', '00:00:01.000', // Timestamp of frame to grab
'-vf', `scale=${this.THUMBNAIL.WIDTH}:${this.THUMBNAIL.HEIGHT}:force_original_aspect_ratio=increase,crop=${this.THUMBNAIL.WIDTH}:${this.THUMBNAIL.HEIGHT}`, // Dimensions of output file
'-frames:v', '1', // Number of frames to grab
dest // Output file
]);
}
// old default
/*
export default (file: FileData): Promise<string> =>
new Promise((resolve, reject) =>
(file.is.video ? getVideoThumbnail : (file.is.image && !file.mimetype.includes('webp')) ? getImageThumbnail : () => Promise.resolve())(file)
.then(() => resolve((file.is.video || file.is.image) ? getNewName(file.randomId) : file.is.audio ? 'views/ass-audio-icon.png' : 'views/ass-file-icon.png'))
.catch(reject));
*/
}

46
backend/ratelimit.ts Normal file

@ -0,0 +1,46 @@
import { EndpointRateLimitConfiguration } from 'ass';
import { NextFunction, Request, Response } from 'express';
import { rateLimit } from 'express-rate-limit';
/**
* map that contains rate limiter middleware for each group
*/
const rateLimiterGroups = new Map<string, (req: Request, res: Response, next: NextFunction) => void>();
export const setRateLimiter = (group: string, config: EndpointRateLimitConfiguration | undefined): (req: Request, res: Response, next: NextFunction) => void => {
if (config == null) { // config might be null if the user doesn't want a rate limit
rateLimiterGroups.set(group, (req, res, next) => {
next();
});
return rateLimiterGroups.get(group)!;
} else {
rateLimiterGroups.set(group, rateLimit({
limit: config.requests,
windowMs: config.duration * 1000,
skipFailedRequests: true,
legacyHeaders: false,
standardHeaders: 'draft-7',
keyGenerator: (req, res) => {
return req.ip || 'disconnected';
},
handler: (req, res) => {
res.status(429);
res.contentType('json');
res.send('{"success":false,"message":"Rate limit exceeded, try again later"}');
}
}));
return rateLimiterGroups.get(group)!;
}
}
/**
* creates middleware for rate limiting
*/
export const rateLimiterMiddleware = (group: string, config: EndpointRateLimitConfiguration | undefined): (req: Request, res: Response, next: NextFunction) => void => {
if (!rateLimiterGroups.has(group)) setRateLimiter(group, config);
return (req, res, next) => {
return rateLimiterGroups.get(group)!(req, res, next);
};
};


@ -0,0 +1,31 @@
import { Router } from 'express';
import { path } from '@tycrek/joint';
import { App } from '../app.js';
import { UserConfig } from '../UserConfig.js';
/**
* Builds a basic router for loading a page with frontend JS
*/
export const buildFrontendRouter = (page: string, onConfigReady = true) => {
// Config readiness checker
const ready = () => (onConfigReady)
? UserConfig.ready
: !UserConfig.ready;
// Set up a router
const router = Router({ caseSensitive: true });
// Render the page
router.get('/', (_req, res) => ready()
? res.render(page, { version: App.pkgVersion })
: res.redirect('/'));
// Load frontend JS
router.get('/ui.js', (_req, res) => ready()
? res.type('text/javascript').sendFile(path.join(`dist/frontend/${page}.mjs`))
: res.sendStatus(403));
return router;
};

142
backend/routers/api.ts Normal file
View File

@ -0,0 +1,142 @@
import { AssUser, AssUserNewReq } from 'ass';
import * as bcrypt from 'bcrypt'
import { Router, json as BodyParserJson, RequestHandler } from 'express';
import * as data from '../data.js';
import { log } from '../log.js';
import { nanoid } from '../generators.js';
import { UserConfig } from '../UserConfig.js';
import { rateLimiterMiddleware, setRateLimiter } from '../ratelimit.js';
import { DBManager } from '../sql/database.js';
import { JSONDatabase } from '../sql/json.js';
import { MySQLDatabase } from '../sql/mysql.js';
import { PostgreSQLDatabase } from '../sql/postgres.js';
const router = Router({ caseSensitive: true });
// Setup route
router.post('/setup', BodyParserJson(), async (req, res) => {
if (UserConfig.ready)
return res.status(409).json({ success: false, message: 'User config already exists' });
log.info('Setup', 'initiated');
try {
// Parse body
new UserConfig(req.body);
// Save config
await UserConfig.saveConfigFile();
// set up new databases
if (UserConfig.config.database) {
switch (UserConfig.config.database.kind) {
case 'json':
await DBManager.use(new JSONDatabase());
break;
case 'mysql':
await DBManager.use(new MySQLDatabase());
break;
case 'postgres':
await DBManager.use(new PostgreSQLDatabase());
break;
}
}
// set rate limits
if (UserConfig.config.rateLimit?.api) setRateLimiter('api', UserConfig.config.rateLimit.api);
if (UserConfig.config.rateLimit?.login) setRateLimiter('login', UserConfig.config.rateLimit.login);
if (UserConfig.config.rateLimit?.upload) setRateLimiter('upload', UserConfig.config.rateLimit.upload);
log.success('Setup', 'completed');
return res.json({ success: true });
} catch (err: any) {
log.error('Setup failed', err);
return res.status(400).json({ success: false, message: err.message });
}
});
// User login
router.post('/login', rateLimiterMiddleware('login', UserConfig.config?.rateLimit?.login), BodyParserJson(), (req, res) => {
const { username, password } = req.body;
data.getAll('users')
.then((users) => {
if (!users) throw new Error('Missing users data');
else return Object.entries(users as AssUser[])
.filter(([_uid, user]: [string, AssUser]) => user.username === username)[0][1]; // [0] is the first item in the filter results, [1] is AssUser
})
.then((user) => Promise.all([bcrypt.compare(password, user.password), user]))
.then(([success, user]) => {
success ? log.success('User logged in', user.username)
: log.warn('User failed to log in', user.username);
// Set up the session information
if (success) req.session.ass!.auth = {
uid: user.id,
token: ''
};
// Respond
res.json({ success, message: `User [${user.username}] ${success ? 'logged' : 'failed to log'} in`, meta: { redirectTo: req.session.ass?.preLoginPath ?? '/user' } });
// Delete the pre-login path after successful login
if (success) delete req.session.ass?.preLoginPath;
})
.catch((err) => log.error(err).callback(() => res.status(400).json({ success: false, message: err.message })));
});
// todo: authenticate API endpoints
router.post('/user', rateLimiterMiddleware('api', UserConfig.config?.rateLimit?.api), BodyParserJson(), async (req, res) => {
if (!UserConfig.ready)
return res.status(409).json({ success: false, message: 'User config not ready' });
const newUser = req.body as AssUserNewReq;
// Run input validation
let issue: false | string = false;
let user: AssUser;
try {
// Username check
if (!newUser.username) issue = 'Missing username';
newUser.username = newUser.username.replaceAll(/[^A-z0-9_-]/g, '');
if (newUser.username === '') issue = 'Invalid username';
// Password check
if (!newUser.password) issue = 'Missing password';
if (newUser.password === '') issue = 'Invalid password';
newUser.password = newUser.password.substring(0, 128);
// todo: figure out how to check admin:boolean and meta:{}
// Create new AssUser object
user = {
id: nanoid(32),
username: newUser.username,
password: await bcrypt.hash(newUser.password, 10),
admin: newUser.admin ?? false,
meta: newUser.meta ?? {},
tokens: [],
files: []
};
log.debug(`Creating ${user.admin ? 'admin' : 'regular'} user`, user.username, user.id);
// todo: also check duplicate usernames
await data.put('users', user.id, user);
} catch (err: any) { issue = `Error: ${err.message}`; }
if (issue) {
log.error('Failed to create user', issue);
return res.status(400).json({ success: false, message: issue });
}
log.debug(`User created`, user!.username);
res.json(({ success: true, message: `User ${user!.username} created` }));
});
export { router };

154
backend/routers/index.ts Normal file

@ -0,0 +1,154 @@
import { BusBoyFile, AssFile } from 'ass';
import axios from 'axios';
import fs from 'fs-extra';
import bb from 'express-busboy';
import crypto from 'crypto';
import { Router } from 'express';
import { Readable } from 'stream';
import * as data from '../data.js';
import { log } from '../log.js';
import { App } from '../app.js';
import { random } from '../generators.js';
import { UserConfig } from '../UserConfig.js';
import { getFileS3, uploadFileS3 } from '../s3.js';
import { rateLimiterMiddleware } from '../ratelimit.js';
const router = Router({ caseSensitive: true });
//@ts-ignore // Required since bb.extends expects express.Application, not a Router (but it still works)
bb.extend(router, {
upload: true,
restrictMultiple: true,
allowedPath: (url: string) => url === '/',
limits: {
fileSize: () => (UserConfig.ready ? UserConfig.config.maximumFileSize : 50) * 1000000 // MB
}
});
// Render or redirect
router.get('/', (req, res) => UserConfig.ready ? res.render('index', { version: App.pkgVersion }) : res.redirect('/setup'));
// Upload flow
router.post('/', rateLimiterMiddleware("upload", UserConfig.config?.rateLimit?.upload), async (req, res) => {
// Check user config
if (!UserConfig.ready) return res.status(500).type('text').send('Configuration missing!');
// Does the file actually exist
if (!req.files || !req.files['file']) return res.status(400).type('text').send('No file was provided!');
else log.debug('Upload request received', `Using ${UserConfig.config.s3 != null ? 'S3' : 'local'} storage`);
// Type-check the file data
const bbFile: BusBoyFile = req.files['file'];
// Prepare file move
const uploads = UserConfig.config.uploadsDir;
const timestamp = Date.now().toString();
const fileKey = `${timestamp}_${bbFile.filename}`;
const destination = `${uploads}${uploads.endsWith('/') ? '' : '/'}${fileKey}`;
// S3 configuration
const s3 = UserConfig.config.s3 != null ? UserConfig.config.s3 : false;
try {
// Get the file size
const size = (await fs.stat(bbFile.file)).size;
// Get the hash
const sha256 = crypto.createHash('sha256').update(await fs.readFile(bbFile.file)).digest('base64');
// * Move the file
if (!s3) await fs.move(bbFile.file, destination);
else await uploadFileS3(await fs.readFile(bbFile.file), fileKey, bbFile.mimetype, size, sha256);
// Build ass metadata
const assFile: AssFile = {
fakeid: random({ length: UserConfig.config.idSize }), // todo: more generators
size,
sha256,
fileKey,
timestamp,
mimetype: bbFile.mimetype,
filename: bbFile.filename,
uploader: '0', // todo: users
save: {},
};
// Set the save location
if (!s3) assFile.save.local = destination;
else {
// Using S3 doesn't move temp file, delete it now
await fs.rm(bbFile.file);
assFile.save.s3 = true;
}
// * Save metadata
await data.put('files', assFile.fakeid, assFile);
log.debug('File saved to', !s3 ? assFile.save.local! : 'S3');
await res.type('json').send({ resource: `${req.ass.host}/${assFile.fakeid}` });
// Send to Discord webhook
try {
// Discord webhooks expect a JSON body with a `content` field; axios serializes the object itself
await axios.post(UserConfig.config.discordWebhook, {
content: `New upload: ${req.ass.host}/${assFile.fakeid}`
});
} catch (err) {
log.warn('Failed to send request to Discord webhook');
console.error(err);
}
} catch (err) {
log.error('Failed to upload file', bbFile.filename);
console.error(err);
return res.status(500).send(err);
}
});
router.get('/:fakeId', (req, res) => res.redirect(`/direct/${req.params.fakeId}`));
router.get('/direct/:fakeId', async (req, res) => {
if (!UserConfig.ready) return res.redirect('/setup');
// Get the ID
const fakeId = req.params.fakeId;
// Get the file metadata
let _data;
try { _data = await data.get('files', fakeId); }
catch (err) {
log.error('Failed to get', fakeId);
console.error(err);
return res.status(500).send();
}
if (!_data) return res.status(404).send();
else {
const meta = _data as AssFile;
// File data can come from either S3 or local filesystem
let output: Readable | NodeJS.ReadableStream;
// Try to retrieve the file
if (!!meta.save.s3) {
const file = await getFileS3(meta.fileKey);
if (!file.Body) return res.status(500).send('Unknown error');
output = file.Body as Readable;
} else output = fs.createReadStream(meta.save.local!);
// Configure response headers
res.type(meta.mimetype)
.header('Content-Disposition', `inline; filename="${meta.filename}"`)
.header('Cache-Control', 'public, max-age=31536000, immutable')
.header('Accept-Ranges', 'bytes');
// Send the file (thanks to https://stackoverflow.com/a/67373050)
output.pipe(res);
}
});
export { router };

177
backend/s3.ts Normal file

@ -0,0 +1,177 @@
import {
S3Client,
S3ClientConfig,
PutObjectCommand,
PutObjectCommandOutput,
GetObjectCommand,
GetObjectCommandOutput,
CreateMultipartUploadCommand,
UploadPartCommand,
CompleteMultipartUploadCommand,
CompleteMultipartUploadCommandOutput,
AbortMultipartUploadCommand,
} from "@aws-sdk/client-s3";
import { log } from './log.js';
import { UserConfig } from './UserConfig.js';
const NYR = 'S3 not ready';
/**
* Helper function to verify if the S3 config has been set
*/
const s3readyCheck = (): boolean => UserConfig.ready && UserConfig.config.s3 != null;
let _s3client: S3Client;
const s3 = (): S3Client | null => {
if (!s3readyCheck()) return null;
// Build the S3 client
if (_s3client == undefined) {
const { endpoint, bucket, credentials, region } = UserConfig.config.s3!;
// Set up base config (without optional region)
const s3config: S3ClientConfig = {
endpoint,
credentials: {
accessKeyId: credentials.accessKey,
secretAccessKey: credentials.secretKey
}
};
// Attach region to config if required
s3config.region = region != null ? region : 'auto';
// Build the new client
_s3client = new S3Client(s3config);
log.debug('S3 client configured', endpoint, bucket);
}
return _s3client;
};
/**
* Basic single file upload
*/
const doObjectUpload = (file: Buffer, fileKey: string, mimetype: string, size: number, sha256: string): Promise<PutObjectCommandOutput> =>
new Promise((resolve, reject) => s3()!.send(new PutObjectCommand({
Bucket: UserConfig.config.s3!.bucket,
Key: fileKey,
ContentType: mimetype,
ContentLength: size,
Body: new Uint8Array(file),
ChecksumSHA256: sha256
})).then(resolve).catch(reject));
/**
* More complicated multipart upload for large files
*/
const doMultipartUpload = (file: Buffer, mimetype: string, fileKey: string): Promise<CompleteMultipartUploadCommandOutput> => new Promise(async (resolve, reject) => {
let uploadId: string | undefined;
try {
// Create multipart upload for S3
const multipartUpload = await s3()!.send(new CreateMultipartUploadCommand({
Bucket: UserConfig.config.s3!.bucket,
Key: fileKey,
ContentType: mimetype
}));
// Get the ID in case we have to abort it later
uploadId = multipartUpload.UploadId;
// Split the buffer into five equal parts (S3 requires at least 5 MB per part, except the last)
const partSize = Math.ceil(file.length / 5);
// Build the upload commands
const uploadParts = [];
for (let i = 0; i < 5; i++) {
const start = i * partSize;
const end = start + partSize;
uploadParts.push(s3()!
.send(new UploadPartCommand({
Bucket: UserConfig.config.s3!.bucket,
Key: fileKey,
UploadId: uploadId,
Body: file.subarray(start, end),
PartNumber: i + 1
}))
.then((d) => (log.debug('S3 Upload', `Part ${i + 1} uploaded`), d)));
}
// Upload all the parts
const uploadResults = await Promise.all(uploadParts);
// Get the URL? who knows
const output = await s3()!.send(
new CompleteMultipartUploadCommand({
Bucket: UserConfig.config.s3!.bucket,
Key: fileKey,
UploadId: uploadId,
MultipartUpload: {
Parts: uploadResults.map(({ ETag }, i) => ({ ETag, PartNumber: i + 1 }))
}
}));
// todo: S3 multipart: clean up/finalize this properly
console.log(output);
resolve(output);
} catch (err) {
if (uploadId) {
reject(err);
await s3()!.send(new AbortMultipartUploadCommand({
Bucket: UserConfig.config.s3!.bucket,
Key: fileKey,
UploadId: uploadId,
}));
}
}
});
/**
* Uploads a file to your configured S3 provider
*/
export const uploadFileS3 = (file: Buffer, fileKey: string, mimetype: string, size: number, sha256: string): Promise<void> => new Promise(async (resolve, reject) => {
if (!s3readyCheck()) return reject(NYR);
try {
// todo: determine when to do multipart uploads
await doObjectUpload(file, fileKey, mimetype, size, sha256);
resolve(void 0);
} catch (err) {
log.error('Failed to upload object to S3', fileKey);
console.error(err);
reject(err);
}
});
/**
* Gets a file from your configured S3 provider
*/
export const getFileS3 = (fileKey: string): Promise<GetObjectCommandOutput> => new Promise(async (resolve, reject) => {
if (!s3readyCheck()) return reject(NYR);
try {
resolve(await s3()!.send(new GetObjectCommand({
Bucket: UserConfig.config.s3!.bucket,
Key: fileKey
})));
} catch (err) {
log.error('Failed to get object from S3', fileKey);
console.error(err);
reject(err);
}
});
/**
* Deletes a file from your configured S3 provider
*/
export const deleteFileS3 = (): Promise<void> => new Promise((resolve, reject) => {
const NYI = 'Not yet implemented';
if (!s3readyCheck()) return reject(NYR);
log.warn('S3 Delete', NYI);
reject(NYI);
});

67
backend/sql/database.ts Normal file

@ -0,0 +1,67 @@
import { NID, Database, DatabaseTable, DatabaseValue } from "ass";
export class DBManager {
private static _db: Database;
private static _dbReady: boolean = false;
public static get ready() {
return this._dbReady;
}
static {
process.on('exit', () => {
if (DBManager._db) DBManager._db.close();
});
}
/**
* activate a database
*/
public static use(db: Database): Promise<void> {
return new Promise(async (resolve, reject) => {
if (this._db != undefined) {
await this._db.close();
this._dbReady = false;
}
this._db = db;
await this._db.open();
await this._db.configure();
this._dbReady = true;
resolve();
});
}
public static configure(): Promise<void> {
if (this._db && this._dbReady) {
return this._db.configure();
} else throw new Error("No database active");
}
/**
* put a value in the database
*/
public static put(table: DatabaseTable, key: NID, data: DatabaseValue): Promise<void> {
if (this._db && this._dbReady) {
return this._db.put(table, key, data);
} else throw new Error("No database active");
}
/**
* get a value from the database
*/
public static get(table: DatabaseTable, key: NID): Promise<DatabaseValue> {
if (this._db && this._dbReady) {
return this._db.get(table, key);
} else throw new Error("No database active");
}
/**
* get all values from the database
*/
public static getAll(table: DatabaseTable): Promise<DatabaseValue[]> {
if (this._db && this._dbReady) {
return this._db.getAll(table);
} else throw new Error("No database active");
}
}

152
backend/sql/json.ts Normal file

@ -0,0 +1,152 @@
import { AssFile, AssUser, FilesSchema, UsersSchema, Database, DatabaseTable, DatabaseValue } from 'ass';
import path, { resolve } from 'path';
import fs from 'fs-extra';
import { log } from '../log.js';
import { nanoid } from '../generators.js';
/**
* Absolute filepaths for JSON data files
*/
const PATHS = {
files: path.join('.ass-data/files.json'),
users: path.join('.ass-data/users.json')
};
/**
* map from tables to paths
*/
const PATHMAP = {
assfiles: PATHS.files,
assusers: PATHS.users
} as { [index: string]: string };
/**
* map from tables to sectors
*/
const SECTORMAP = {
assfiles: 'files',
assusers: 'users'
} as { [index: string]: string };
const bothWriter = async (files: FilesSchema, users: UsersSchema) => {
await fs.writeJson(PATHS.files, files, { spaces: '\t' });
await fs.writeJson(PATHS.users, users, { spaces: '\t' });
};
/**
* Creates a JSON file with a given empty data template
*/
const createEmptyJson = (filepath: string, emptyData: any): Promise<void> => new Promise(async (resolve, reject) => {
try {
if (!(await fs.pathExists(filepath))) {
await fs.ensureFile(filepath);
await fs.writeJson(filepath, emptyData, { spaces: '\t' });
}
resolve(void 0);
} catch (err) {
reject(err);
}
});
/**
* Ensures the data files exist and creates them if required
*/
export const ensureFiles = (): Promise<void> => new Promise(async (resolve, reject) => {
log.debug('Checking data files');
try {
// * Default files.json
await createEmptyJson(PATHS.files, {
files: {},
useSql: false,
meta: {}
} as FilesSchema);
// * Default users.json
await createEmptyJson(PATHS.users, {
tokens: [],
users: {},
cliKey: nanoid(32),
useSql: false,
meta: {}
} as UsersSchema);
log.debug('Data files exist');
resolve();
} catch (err) {
log.error('Failed to verify existence of data files');
reject(err);
}
});
/**
* JSON database. i know json isnt sql, shut up.
*/
export class JSONDatabase implements Database {
public open(): Promise<void> { return Promise.resolve() }
public close(): Promise<void> { return Promise.resolve() }
public configure(): Promise<void> {
return new Promise((resolve, reject) => {
ensureFiles().then(() => resolve()).catch(reject);
});
}
public put(table: DatabaseTable, key: string, data: DatabaseValue): Promise<void> {
return new Promise(async (resolve, reject) => {
if (table == 'assfiles') {
// ? Local JSON
const filesJson = await fs.readJson(PATHS.files) as FilesSchema;
// Check if key already exists
if (filesJson.files[key] != null) return reject(new Error(`File key ${key} already exists`));
// Otherwise add the data
filesJson.files[key] = data as AssFile;
// Also save the key to the users file
const usersJson = await fs.readJson(PATHS.users) as UsersSchema;
// todo: uncomment this once users are implemented
// usersJson.users[data.uploader].files.push(key);
// Save the files
await bothWriter(filesJson, usersJson);
resolve()
} else if (table == 'assusers') {
// ? Local JSON
const usersJson = await fs.readJson(PATHS.users) as UsersSchema;
// Check if key already exists
if (usersJson.users[key] != null) return reject(new Error(`User key ${key} already exists`));
// Otherwise add the data
usersJson.users[key] = data as AssUser;
await fs.writeJson(PATHS.users, usersJson, { spaces: '\t' });
resolve();
}
})
}
public get(table: DatabaseTable, key: string): Promise<DatabaseValue> {
return new Promise(async (resolve, reject) => {
const data = (await fs.readJson(PATHMAP[table]))[SECTORMAP[table]][key];
(!data) ? reject(new Error(`Key '${key}' not found in '${table}'`)) : resolve(data);
});
}
public getAll(table: DatabaseTable): Promise<DatabaseValue[]> {
return new Promise(async (resolve, reject) => {
const data = (await fs.readJson(PATHMAP[table]))[SECTORMAP[table]];
// todo: fix this
(!data) ? resolve(data) : resolve(data);
});
}
}

185
backend/sql/mysql.ts Normal file

@ -0,0 +1,185 @@
import { AssFile, AssUser, NID, UploadToken, Database, DatabaseTable, DatabaseValue } from 'ass';
import mysql, { Pool } from 'mysql2/promise';
import { log } from '../log.js';
import { UserConfig } from '../UserConfig.js';
export class MySQLDatabase implements Database {
private _pool: Pool;
private _ready: boolean = false;
public get ready() { return this._ready; }
/**
* Quick function for creating a simple JSON table
*/
private _tableManager(mode: 'create' | 'drop', name: string, schema = '( NanoID varchar(255), Data JSON )'): Promise<void> {
return new Promise((resolve, reject) =>
this._pool.query(
mode === 'create'
? `CREATE TABLE ${name} ${schema};`
: `DROP TABLE ${name};`)
.then(() => resolve())
.catch((err) => reject(err)));
}
/**
* validate the mysql config
*/
private _validateConfig(): string | undefined {
// make sure the configuration exists
if (!UserConfig.ready) return 'User configuration not ready';
if (typeof UserConfig.config.database != 'object') return 'MySQL configuration missing';
if (UserConfig.config.database.kind != "mysql") return 'Database not set to MySQL, but MySQL is in use, something has gone terribly wrong';
if (typeof UserConfig.config.database.options != 'object') return 'MySQL configuration missing';
let mySqlConf = UserConfig.config.database.options;
// Check the MySQL configuration
const checker = (val: string) => val != null && val !== '';
const issue =
!checker(mySqlConf.host) ? 'Missing MySQL Host'
: !checker(mySqlConf.user) ? 'Missing MySQL User'
: !checker(mySqlConf.password) ? 'Missing MySQL Password'
: !checker(mySqlConf.database) ? 'Missing MySQL Database'
// ! Blame VS Code for this weird indentation
: undefined;
return issue;
}
public open() { return Promise.resolve(); }
public close() { return Promise.resolve(); }
/**
* Build the MySQL client and create the tables
*/
public configure(): Promise<void> {
return new Promise(async (resolve, reject) => {
try {
// Config check
let configError = this._validateConfig();
if (configError) throw new Error(configError);
// Create the pool
this._pool = mysql.createPool(UserConfig.config.database!.options!);
// Check if the pool is usable
const [rowz, _fields] = await this._pool.query(`SHOW FULL TABLES WHERE Table_Type LIKE 'BASE TABLE';`);
const rows_tableData = rowz as unknown as { [key: string]: string }[];
// Create tables if needed
if (rows_tableData.length === 0) {
log.warn('MySQL', 'Tables do not exist, creating');
await Promise.all([
this._tableManager('create', 'assfiles'),
this._tableManager('create', 'assusers'),
this._tableManager('create', 'asstokens')
]);
log.success('MySQL', 'Tables created');
} else {
// There's at least one row, do further checks
const tablesExist = { files: false, users: false, tokens: false };
// Check which tables ACTUALLY do exist
for (let row of rows_tableData) {
const table = row[`Tables_in_${UserConfig.config.database!.options!.database}`] as DatabaseTable;
if (table === 'assfiles') tablesExist.files = true;
if (table === 'assusers') tablesExist.users = true;
if (table === 'asstokens') tablesExist.tokens = true;
// ! Don't use `= table === ''` because this is a loop
}
// Mini-function for creating a one-off table
const createOneTable = async (name: DatabaseTable) => {
log.warn('MySQL', `Table '${name}' missing, creating`);
await this._tableManager('create', name);
log.success('MySQL', `Table '${name}' created`);
}
// Check & create tables
if (!tablesExist.files) await createOneTable('assfiles');
if (!tablesExist.users) await createOneTable('assusers');
if (!tablesExist.tokens) await createOneTable('asstokens');
// ! temp: drop tables for testing
/* await MySql._tableManager('drop', 'assfiles');
await MySql._tableManager('drop', 'assusers');
log.debug('Table dropped'); */
// Hopefully we are ready
if (tablesExist.files && tablesExist.users)
log.info('MySQL', 'Tables exist, ready');
else throw new Error('Table(s) missing!');
}
// We are ready!
this._ready = true;
resolve();
} catch (err) {
log.error('MySQL', 'failed to initialize');
console.error(err);
reject(err);
}
});
}
public put(table: DatabaseTable, key: NID, data: DatabaseValue): Promise<void> {
return new Promise(async (resolve, reject) => {
if (!this._ready) return reject(new Error('MySQL not ready'));
try {
if (await this.get(table, key))
reject(new Error(`${table == 'assfiles' ? 'File' : table == 'assusers' ? 'User' : 'Token'} key ${key} already exists`));
} catch (err: any) {
if (!err.message.includes('not found in'))
reject(err);
}
const query = `
INSERT INTO ${table} ( NanoID, Data )
VALUES ('${key}', '${JSON.stringify(data)}');
`;
return this._pool.query(query)
.then(() => resolve(void 0))
.catch((err) => reject(err));
});
}
public get(table: DatabaseTable, key: NID): Promise<DatabaseValue> {
return new Promise(async (resolve, reject) => {
try {
// Run query
const [rowz, _fields] = await this._pool.query(`SELECT Data FROM ${table} WHERE NanoID = '${key}';`);
// Disgustingly interpret the query results
const rows_tableData = (rowz as unknown as { [key: string]: string }[])[0] as unknown as ({ Data: UploadToken | AssFile | AssUser | undefined });
if (rows_tableData?.Data) resolve(rows_tableData.Data);
else throw new Error(`Key '${key}' not found in '${table}'`);
} catch (err) {
reject(err);
}
});
}
public getAll(table: DatabaseTable): Promise<DatabaseValue[]> {
return new Promise(async (resolve, reject) => {
try {
// Run query
const [rowz, _fields] = await this._pool.query(`SELECT Data FROM ${table}`);
// Interpret results this is pain
const rows = (rowz as unknown as { Data: UploadToken | AssFile | AssUser }[]);
resolve(rows.map((row) => row.Data));
} catch (err) {
reject(err);
}
});
}
}

200
backend/sql/postgres.ts Normal file

@ -0,0 +1,200 @@
import { PostgresConfiguration, Database, DatabaseTable, DatabaseValue } from 'ass';
import pg from 'pg';
import { log } from '../log.js';
import { UserConfig } from '../UserConfig.js';
/**
* database adapter for postgresql
*/
export class PostgreSQLDatabase implements Database {
private _client: pg.Client;
/**
* validate config
*/
private _validateConfig(): string | undefined {
// make sure the configuration exists
if (!UserConfig.ready) return 'User configuration not ready';
if (typeof UserConfig.config.database != 'object') return 'PostgreSQL configuration missing';
if (UserConfig.config.database.kind != "postgres") return 'Database not set to PostgreSQL, but PostgreSQL is in use, something has gone terribly wrong';
if (typeof UserConfig.config.database.options != 'object') return 'PostgreSQL configuration missing';
let config = UserConfig.config.database.options;
// check the postgres config
const checker = (val: string) => val != null && val !== '';
const issue =
!checker(config.host) ? 'Missing PostgreSQL Host'
: !checker(config.user) ? 'Missing PostgreSQL User'
: !checker(config.password) ? 'Missing PostgreSQL Password'
: !checker(config.database) ? 'Missing PostgreSQL Database'
// ! Blame VS Code for this weird indentation
: undefined;
return issue;
}
public open(): Promise<void> {
return new Promise(async (resolve, reject) => {
try {
// config check
let configError = this._validateConfig();
if (configError) throw new Error(configError);
// grab the config
let config = UserConfig.config.database!.options! as PostgresConfiguration;
// set up the client
this._client = new pg.Client({
host: config.host,
port: config.port,
user: config.user,
password: config.password,
database: config.database,
});
// connect to the database
log.info('PostgreSQL', `connecting to ${config.host}:${config.port}`);
await this._client.connect();
log.success('PostgreSQL', 'ok');
resolve();
} catch (err) {
log.error('PostgreSQL', 'failed to connect');
console.error(err);
reject(err);
}
});
}
public close(): Promise<void> {
return new Promise(async (resolve, reject) => {
try {
// gracefully disconnect
await this._client.end();
resolve();
} catch (err) {
log.error('PostgreSQL', 'failed to disconnect');
console.error(err);
reject(err);
}
});
}
public configure(): Promise<void> {
return new Promise(async (resolve, reject) => {
try {
await this._client.query(
`CREATE TABLE IF NOT EXISTS asstables (
name TEXT PRIMARY KEY,
version INT NOT NULL
);`);
log.info('PostgreSQL', 'checking database');
// update tables
let seenRows = new Set<string>();
let versions = await this._client.query('SELECT * FROM asstables;');
for (let row of versions.rows) {
seenRows.add(row.name);
}
const assTableSchema = '(id TEXT PRIMARY KEY, data JSON NOT NULL)'
// add missing tables
if (!seenRows.has('assfiles')) {
log.warn('PostgreSQL', 'assfiles missing, repairing...')
await this._client.query(
`CREATE TABLE assfiles ${assTableSchema};` +
`INSERT INTO asstables (name, version) VALUES ('assfiles', 1);`
);
log.success('PostgreSQL', 'ok');
}
if (!seenRows.has('assusers')) {
log.warn('PostgreSQL', 'assusers missing, repairing...')
await this._client.query(
`CREATE TABLE assusers ${assTableSchema};` +
`INSERT INTO asstables (name, version) VALUES ('assusers', 1);`
);
log.success('PostgreSQL', 'ok');
}
if (!seenRows.has('asstokens')) {
log.warn('PostgreSQL', 'asstokens missing, repairing...')
await this._client.query(
`CREATE TABLE asstokens ${assTableSchema};` +
`INSERT INTO asstables (name, version) VALUES ('asstokens', 1);`
);
log.success('PostgreSQL', 'ok');
}
log.success('PostgreSQL', 'database is ok').callback(() => {
resolve();
});
} catch (err) {
log.error('PostgreSQL', 'failed to set up');
console.error(err);
reject(err);
}
});
}
public put(table: DatabaseTable, key: string, data: DatabaseValue): Promise<void> {
return new Promise(async (resolve, reject) => {
try {
const queries = {
assfiles: 'INSERT INTO assfiles (id, data) VALUES ($1, $2);',
assusers: 'INSERT INTO assusers (id, data) VALUES ($1, $2);',
asstokens: 'INSERT INTO asstokens (id, data) VALUES ($1, $2);'
};
let result = await this._client.query(queries[table], [key, data]);
resolve();
} catch (err) {
reject(err);
}
});
}
public get(table: DatabaseTable, key: string): Promise<DatabaseValue> {
return new Promise(async (resolve, reject) => {
try {
const queries = {
assfiles: 'SELECT data FROM assfiles WHERE id = $1::text;',
assusers: 'SELECT data FROM assusers WHERE id = $1::text;',
asstokens: 'SELECT data FROM asstokens WHERE id = $1::text;'
};
let result = await this._client.query(queries[table], [key]);
resolve(result.rowCount ? result.rows[0].data : void 0);
} catch (err) {
reject(err);
}
});
}
// todo: verify this works
public getAll(table: DatabaseTable): Promise<DatabaseValue[]> {
return new Promise(async (resolve, reject) => {
try {
const queries = {
assfiles: 'SELECT json_object_agg(id, data) AS stuff FROM assfiles;',
assusers: 'SELECT json_object_agg(id, data) AS stuff FROM assusers;',
asstokens: 'SELECT json_object_agg(id, data) AS stuff FROM asstokens;'
};
let result = await this._client.query(queries[table]);
resolve(result.rowCount ? result.rows[0].stuff : void 0);
} catch (err) {
reject(err);
}
});
}
}
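For a sense of how an adapter like this is consumed, here is a minimal sketch driving the `Database` interface from `common/types.d.ts`; the wrapper function and the sample user values are assumptions, and only `open`/`configure`/`put`/`get`/`close` come from the source.

```ts
import type { AssUser, Database } from 'ass';

// Illustrative driver code: connect, ensure tables exist, then write and read back a record.
async function example(db: Database): Promise<void> {
	await db.open();       // connects using the configured host/port/user/password/database
	await db.configure();  // creates asstables/assfiles/assusers/asstokens if missing

	const user: AssUser = {
		id: 'abc123',             // placeholder NID
		username: 'demo',
		password: '<bcrypt hash>',
		admin: false,
		tokens: [],
		files: [],
		meta: {}
	};

	await db.put('assusers', user.id, user);
	const fetched = await db.get('assusers', user.id); // resolves undefined if the key is missing

	console.log(fetched);
	await db.close();
}
```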

11
backend/tsconfig.json Normal file
View File

@ -0,0 +1,11 @@
{
"extends": "@tsconfig/node20/tsconfig.json",
"compilerOptions": {
"outDir": "../dist/backend",
"strictPropertyInitialization": false
},
"include": [
"./**/*.ts",
"../**/common/*.ts"
]
}

22
backend/utils.ts Normal file
View File

@ -0,0 +1,22 @@
import { DateTime } from 'luxon';
import { id } from 'william.js';
export const customId = (length: number, alphabet: string = 'abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789') => id(length, 1, alphabet);
export const randomHexColour = () => { // From: https://www.geeksforgeeks.org/javascript-generate-random-hex-codes-color/
const letters = '0123456789ABCDEF';
let colour = '#';
for (let i = 0; i < 6; i++)
colour += letters[(Math.floor(Math.random() * letters.length))];
return colour;
};
export const formatTimestamp = (timestamp: number, timeoffset: string) =>
DateTime.fromMillis(timestamp).setZone(timeoffset).toLocaleString(DateTime.DATETIME_MED);
export const formatBytes = (bytes: number, decimals = 2) => {
if (bytes === 0) return '0 Bytes';
const sizes = ['Bytes', 'KB', 'MB', 'GB', 'TB', 'PB', 'EB', 'ZB', 'YB'];
const i = Math.floor(Math.log(bytes) / Math.log(1024));
return parseFloat((bytes / Math.pow(1024, i)).toFixed(decimals < 0 ? 0 : decimals)).toString().concat(` ${sizes[i]}`);
};
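A quick sketch of what these helpers produce; the outputs shown are examples only, since most of them are random or locale-formatted.

```ts
import { customId, formatBytes, formatTimestamp, randomHexColour } from './utils';

// Example outputs (illustrative; actual values vary per call)
customId(8);                                    // e.g. 'aZ3kPq0L', 8 chars from the default alphabet
customId(6, 'abcdef0123456789');                // restricted alphabet, e.g. '3fa91c'
randomHexColour();                              // e.g. '#A3F01B'
formatTimestamp(Date.now(), 'America/Denver');  // e.g. 'Dec 4, 2023, 11:32 AM'
formatBytes(1536);                              // '1.5 KB'
formatBytes(1073741824, 0);                     // '1 GB'
```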

28
common/fix-frontend-js.js Normal file
View File

@ -0,0 +1,28 @@
import fs from 'fs-extra';
import { path } from '@tycrek/joint';
import { TLog } from '@tycrek/log';
const log = new TLog();
const FILES = {
prefix: 'dist/frontend',
suffix: '.mjs',
pages: [
'setup',
'login',
'admin',
'user',
]
};
const fixFile = (page) => {
const filePath = path.join(FILES.prefix, `${page}${FILES.suffix}`);
const fixed = fs.readFileSync(filePath).toString().replace('export {};', '');
return fs.writeFile(filePath, fixed);
};
log.info('Fixing frontend JS', `${FILES.pages.length} files`);
Promise.all(FILES.pages.map(fixFile))
.then(() => log.success('Fixed.'))
.catch(console.error);

39
common/global.d.ts vendored Normal file
View File

@ -0,0 +1,39 @@
import { BusBoyFile } from 'ass';
import { Request, Response } from 'express';
declare module 'express-session' {
interface SessionData {
ass: {
auth?: {
uid: string;
token: string;
}
preLoginPath?: string;
}
}
}
declare global {
namespace Express {
interface Request {
/**
* ass-specific request items
*/
ass: {
/**
* Combination of {protocol}://{hostname}
*/
host: string
/**
* ass version
*/
version: string
}
files: { [key: string]: BusBoyFile }
}
}
}
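These are type declarations only; some middleware in the backend has to populate `req.ass` at runtime. A hedged sketch of what that could look like (the factory function and version wiring are assumptions, not code from this repository):

```ts
import { Request, Response, NextFunction } from 'express';

// Hypothetical middleware that fills in the fields declared on Express.Request above.
const assMetadata = (version: string) =>
	(req: Request, _res: Response, next: NextFunction) => {
		req.ass = {
			host: `${req.protocol}://${req.hostname}`, // matches the documented {protocol}://{hostname}
			version
		};
		next();
	};

// Usage (illustrative): app.use(assMetadata('0.15.0-indev'));
```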

306
common/types.d.ts vendored Normal file
View File

@ -0,0 +1,306 @@
declare module 'ass' {
type NID = string;
type IdType = 'random' | 'original' | 'gfycat' | 'timestamp' | 'zws'
export type DatabaseValue = AssFile | AssUser | UploadToken;
export type DatabaseTable = 'assfiles' | 'assusers' | 'asstokens';
/**
* Core Express server config.
* This is separate from the user configuration starting in 0.15.0
*/
interface ServerConfiguration {
host: string;
port: number;
proxied: boolean;
}
/**
* User-defined configuration
*/
interface UserConfiguration {
uploadsDir: string;
idType: IdType;
idSize: number;
gfySize: number;
maximumFileSize: number;
discordWebhook: string;
s3?: S3Configuration;
database?: DatabaseConfiguration;
rateLimit?: RateLimitConfiguration;
}
interface S3Configuration {
/**
* S3 endpoint to use
*/
endpoint: string;
/**
* Bucket to upload to
*/
bucket: string;
/**
* Optional region. Required for some providers
*/
region?: string;
/**
* Access credentials
*/
credentials: {
accessKey: string;
secretKey: string;
}
}
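For illustration, a `UserConfiguration` object satisfying the two interfaces above might look like the following; every value is a placeholder, and the unit of `maximumFileSize` is not specified in this file.

```ts
import type { UserConfiguration } from 'ass';

// Placeholder values only; endpoints, keys, and sizes are hypothetical.
const exampleConfig: UserConfiguration = {
	uploadsDir: './uploads',
	idType: 'random',
	idSize: 8,
	gfySize: 3,
	maximumFileSize: 50,
	discordWebhook: 'https://discord.com/api/webhooks/<id>/<token>',
	s3: {
		endpoint: 's3.example.com',
		bucket: 'ass-uploads',
		region: 'us-east-1',
		credentials: {
			accessKey: '<access key>',
			secretKey: '<secret key>'
		}
	}
};
```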
/**
* interface for database classes
*/
export interface Database {
/**
* perform database initialization tasks
*/
open(): Promise<void>;
/**
* perform database suspension tasks
*/
close(): Promise<void>;
/**
* set up database
*/
configure(): Promise<void>;
/**
* put a value in the database
*/
put(table: DatabaseTable, key: NID, data: DatabaseValue): Promise<void>;
/**
* get a value from the database
*/
get(table: DatabaseTable, key: NID): Promise<DatabaseValue>;
/**
* get all values from the database
*/
getAll(table: DatabaseTable): Promise<DatabaseValue[]>;
}
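To make the contract concrete, here is a purely illustrative in-memory implementation of the interface; it is not one of the adapters shipped with ass.

```ts
import type { Database, DatabaseTable, DatabaseValue } from 'ass';

// Illustrative only: one Map per table, no persistence. Assumes configure() runs before put/get.
class MemoryDatabase implements Database {
	private tables = new Map<DatabaseTable, Map<string, DatabaseValue>>();

	public async open(): Promise<void> { /* nothing to connect to */ }
	public async close(): Promise<void> { this.tables.clear(); }
	public async configure(): Promise<void> {
		(['assfiles', 'assusers', 'asstokens'] as DatabaseTable[])
			.forEach((table) => this.tables.set(table, new Map()));
	}
	public async put(table: DatabaseTable, key: string, data: DatabaseValue): Promise<void> {
		this.tables.get(table)!.set(key, data);
	}
	public async get(table: DatabaseTable, key: string): Promise<DatabaseValue> {
		return this.tables.get(table)!.get(key) as DatabaseValue;
	}
	public async getAll(table: DatabaseTable): Promise<DatabaseValue[]> {
		return [...this.tables.get(table)!.values()];
	}
}
```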
interface DatabaseConfiguration {
kind: 'mysql' | 'postgres' | 'json';
options?: MySQLConfiguration | PostgresConfiguration;
}
interface MySQLConfiguration {
host: string;
port: number;
user: string;
password: string;
database: string;
}
interface PostgresConfiguration {
host: string;
port: number;
user: string;
password: string;
database: string;
}
/**
* rate limiter configuration
* @since 0.15.0
*/
interface RateLimitConfiguration {
/**
* rate limit for the login endpoints
*/
login?: EndpointRateLimitConfiguration;
/**
* rate limit for parts of the api not covered by other rate limits
*/
api?: EndpointRateLimitConfiguration;
/**
* rate limit for file uploads
*/
upload?: EndpointRateLimitConfiguration;
}
/**
* rate limiter per-endpoint configuration
* @since 0.15.0
*/
interface EndpointRateLimitConfiguration {
/**
* maximum number of requests per duration
*/
requests: number;
/**
* rate limiting window in seconds
*/
duration: number;
}
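For example, a configuration that allows 5 login attempts per minute and 100 uploads per hour would look like this; the numbers are arbitrary, and `api` is simply omitted since all three fields are optional.

```ts
import type { RateLimitConfiguration } from 'ass';

// Arbitrary example numbers
const rateLimit: RateLimitConfiguration = {
	login: { requests: 5, duration: 60 },      // 5 requests per 60 seconds
	upload: { requests: 100, duration: 3600 }  // 100 uploads per hour
};
```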
interface UserConfigTypeChecker {
uploadsDir: (val: any) => boolean;
idType: (val: any) => boolean;
idSize: (val: any) => boolean;
gfySize: (val: any) => boolean;
maximumFileSize: (val: any) => boolean;
discordWebhook: (val: any) => boolean;
s3: {
endpoint: (val: any) => boolean;
bucket: (val: any) => boolean;
region: (val: any) => boolean;
credentials: {
accessKey: (val: any) => boolean;
secretKey: (val: any) => boolean;
}
}
sql: {
mySql: {
host: (val: any) => boolean;
port: (val: any) => boolean;
user: (val: any) => boolean;
password: (val: any) => boolean;
database: (val: any) => boolean;
}
postgres: {
port: (val: any) => boolean;
}
}
rateLimit: {
endpoint: (val: any) => boolean;
}
}
/**
* The in-progress structure of a file being uploaded (pre-ass processing)
*/
interface BusBoyFile {
uuid: string;
field: string;
/**
* Absolute path to the temporary file on-disk
*/
file: string;
filename: string;
encoding: string;
mimetype: string;
truncated: boolean;
done: boolean;
}
/**
* Object describing the file as ass handles it (after BusBoy)
*/
interface AssFile {
/**
* Public identifier used in the URL
*/
fakeid: NID;
/**
* Unique-but-human-readable ID. Combination of Epoch and filename.
* This allows users to search for their file while also avoiding conflicts.
*/
fileKey: string;
/**
* The original filename when it was uploaded by the user
*/
filename: string;
mimetype: string;
save: {
local?: string;
s3?: {
privateUrl?: string;
publicUrl?: string;
thumbnailUrl?: string;
} | true;
}
sha256: string;
size: number;
timestamp: string;
uploader: NID;
}
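An illustrative `AssFile` record for a locally-saved upload; all values are made up.

```ts
import type { AssFile } from 'ass';

// Made-up values showing the shape of a locally-saved upload
const file: AssFile = {
	fakeid: 'aBcD1234',
	fileKey: '1701712345678_screenshot.png', // epoch + original filename, per the comment above
	filename: 'screenshot.png',
	mimetype: 'image/png',
	save: { local: './uploads/screenshot.png' },
	sha256: '9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08',
	size: 204800,
	timestamp: '1701712345678',
	uploader: 'abc123'
};
```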
/**
* Structure of a token in 0.15.0, allowing more fancy features, maybe
*/
interface UploadToken {
/**
* Token ID to link it to a user
*/
id: NID;
/**
* The token itself. The user will need this for upload auth.
*/
token: string;
/**
* Helps the user know what this token is used for
*/
hint: string;
}
/**
* Object describing the users of an ass instance
*/
interface AssUser {
id: NID;
username: string;
password: string;
admin: boolean
tokens: NID[];
files: NID[];
meta: { [key: string]: any };
}
interface AssUserNewReq {
username: string;
password: string;
admin?: boolean;
meta?: { [key: string]: any };
}
/**
* JSON schema for files.json
*/
interface FilesSchema {
files: {
[key: NID]: AssFile;
}
meta: { [key: string]: any };
}
/**
* JSON schema for users.json
*/
interface UsersSchema {
tokens: UploadToken[];
users: {
[key: NID]: AssUser;
};
cliKey: string;
meta: { [key: string]: any };
}
}
//#region Dummy modules
declare module '@tinycreek/postcss-font-magician';
//#endregion
// don't commit
/* future UserConfig options:
mediaStrict: boolean;
viewDirect: boolean;
viewDirectDiscord: boolean;
adminWebhook: {}
s3: {}
*/

View File

@ -1,4 +1,4 @@
# ass Docker compose.yaml v0.2.0
# ass Docker compose.yaml v0.3.0
# authors:
# - tycrek <t@tycrek.com> (https://tycrek.com/)
# - Zusier <zusier@pm.me> (https://github.com/Zusier)
@ -9,27 +9,15 @@
services:
ass:
build: .
image: tycrek/ass
container_name: ass-docker
restart: unless-stopped
volumes:
- ./.ass-data:/opt/ass-src/.ass-data
ports:
- "40115:40115"
volumes:
- ./uploads:/opt/ass/uploads
- ./share:/opt/ass/share
- type: bind
source: ./config.json
target: /opt/ass/config.json
- type: bind
source: ./auth.json
target: /opt/ass/auth.json
- type: bind
source: ./data.json
target: /opt/ass/data.json
tmpfs: /tmp # temp files such as uploads are stored here
tmpfs: /tmp
tty: true
environment:
- NODE_ENV=production # for production
- ASS_ENV=docker # docker, local, production (not widely used yet)
- LOG_LEVEL=debug # debug, info, warn, error
- FORCE_COLOR=3 # force color output
- NODE_ENV=production
- FORCE_COLOR=3 # tlog color output

29
docker-dev-container.sh Executable file
View File

@ -0,0 +1,29 @@
#!/bin/bash
denv=FORCE_COLOR=3
volume=$(pwd)/.ass-data:/opt/ass-src/.ass-data
workdir=/opt/ass-src/
port=40115:40115
# container name:tag (tag is unix timestamp)
cname=ass:$(date +%s)
# build image
docker buildx build -t $cname .
# run the new image
docker run -it -e $denv -v $volume -w $workdir -p $port $cname
# wait for exit
echo
echo
echo -e "\033[32m\033[1mTo use this image again, run:\033[0m"
echo
echo " docker run -it \\"
echo " -e $denv \\"
echo " -v \$(pwd)/.ass-data:/opt/ass-src/.ass-data \\"
echo " -w $workdir \\"
echo " -p $port \\"
echo " $cname"
echo

103
docs/.vitepress/config.ts Normal file
View File

@ -0,0 +1,103 @@
import { defineConfig } from 'vitepress';
const LOGO = 'https://i.tycrek.dev/ass';
const GIT_BRANCH = 'dev/0.15.0'
// https://vitepress.dev/reference/site-config
export default defineConfig({
lang: 'en-US',
title: 'ass docs',
titleTemplate: ':title ~ ass docs',
description: 'Documentation for ass, an open-source ShareX server',
cleanUrls: true,
lastUpdated: true,
head: [
['meta', { property: 'og:image', content: LOGO }],
['meta', { property: 'og:type', content: 'website' }],
['meta', { property: 'twitter:domain', content: 'ass.tycrek.dev' }],
['meta', { property: 'twitter:image', content: LOGO }],
['link', { rel: 'icon', href: LOGO }],
],
themeConfig: {
// https://vitepress.dev/reference/default-theme-config
logo: LOGO,
nav: [
{ text: 'Home', link: '/' },
{
text: 'Install', items: [
{ text: 'Docker', link: '/install/docker' },
{ text: 'Local', link: '/install/local' }
]
},
{ text: 'Configure', link: '/configure/' }
],
sidebar: [
{
text: 'Install',
link: '/install/',
items: [
{ text: 'Docker', link: '/install/docker' },
{ text: 'Local', link: '/install/local' }
]
},
{
text: 'Configure',
link: '/configure/',
items: [
{
text: 'SQL',
items: [
{
text: 'MySQL',
link: '/configure/sql/mysql'
},
{
text: 'PostgreSQL',
link: '/configure/sql/postgresql'
}
]
},
{
text: 'Clients',
items: [
{
text: 'ShareX',
link: '/configure/clients/sharex'
},
{
text: 'Flameshot',
link: '/configure/clients/flameshot'
}
]
}
]
},
{
text: 'Customize',
link: '/customize/',
items: [
{ text: 'Colors', link: '/customize/colors' }
]
}
],
editLink: {
pattern: `https://github.com/tycrek/ass/edit/${GIT_BRANCH}/docs/:path`,
text: 'Edit this page on GitHub',
},
footer: {
message: 'Released under the ISC License.',
copyright: 'Copyright © 2023 tycrek & ass contributors',
},
socialLinks: [
{ icon: 'github', link: 'https://github.com/tycrek/ass/' },
{ icon: 'discord', link: 'https://discord.gg/wGZYt5fasY' }
]
}
});

View File

@ -0,0 +1,17 @@
// https://vitepress.dev/guide/custom-theme
import { h } from 'vue'
import type { Theme } from 'vitepress'
import DefaultTheme from 'vitepress/theme'
import './style.css'
export default {
extends: DefaultTheme,
Layout: () => {
return h(DefaultTheme.Layout, null, {
// https://vitepress.dev/guide/extending-default-theme#layout-slots
})
},
enhanceApp({ app, router, siteData }) {
// ...
}
} satisfies Theme

View File

@ -0,0 +1,139 @@
/**
* Customize default theme styling by overriding CSS variables:
* https://github.com/vuejs/vitepress/blob/main/src/client/theme-default/styles/vars.css
*/
/**
* Colors
*
* Each color has the exact same color scale system: 3 levels of solid
* colors with different brightness, and 1 soft color.
*
* - `XXX-1`: The most solid color, used mainly for colored text. It must
* satisfy the contrast ratio when used on top of `XXX-soft`.
*
* - `XXX-2`: The color used mainly for hover state of the button.
*
* - `XXX-3`: The color for solid background, such as bg color of the button.
* It must satisfy the contrast ratio with pure white (#ffffff) text on
* top of it.
*
* - `XXX-soft`: The color used for subtle background such as custom container
* or badges. It must satisfy the contrast ratio when putting `XXX-1` colors
* on top of it.
*
* The soft color must have a semi-transparent alpha channel. This is crucial
* because it allows adding multiple "soft" colors on top of each other
* to create an accent, such as when having an inline code block inside a
* custom container.
*
* - `default`: The color used purely for subtle indication without any
* special meaning attached to it, such as the bg color for the menu hover state.
*
* - `brand`: Used for primary brand colors, such as link text, button with
* brand theme, etc.
*
* - `tip`: Used to indicate useful information. The default theme uses the
* brand color for this by default.
*
* - `warning`: Used to indicate warning to the users. Used in custom
* container, badges, etc.
*
* - `danger`: Used to show error, or dangerous message to the users. Used
* in custom container, badges, etc.
* -------------------------------------------------------------------------- */
:root {
--vp-c-default-1: var(--vp-c-gray-1);
--vp-c-default-2: var(--vp-c-gray-2);
--vp-c-default-3: var(--vp-c-gray-3);
--vp-c-default-soft: var(--vp-c-gray-soft);
--vp-c-brand-1: var(--vp-c-indigo-1);
--vp-c-brand-2: var(--vp-c-indigo-2);
--vp-c-brand-3: var(--vp-c-indigo-3);
--vp-c-brand-soft: var(--vp-c-indigo-soft);
--vp-c-tip-1: var(--vp-c-brand-1);
--vp-c-tip-2: var(--vp-c-brand-2);
--vp-c-tip-3: var(--vp-c-brand-3);
--vp-c-tip-soft: var(--vp-c-brand-soft);
--vp-c-warning-1: var(--vp-c-yellow-1);
--vp-c-warning-2: var(--vp-c-yellow-2);
--vp-c-warning-3: var(--vp-c-yellow-3);
--vp-c-warning-soft: var(--vp-c-yellow-soft);
--vp-c-danger-1: var(--vp-c-red-1);
--vp-c-danger-2: var(--vp-c-red-2);
--vp-c-danger-3: var(--vp-c-red-3);
--vp-c-danger-soft: var(--vp-c-red-soft);
}
/**
* Component: Button
* -------------------------------------------------------------------------- */
:root {
--vp-button-brand-border: transparent;
--vp-button-brand-text: var(--vp-c-white);
--vp-button-brand-bg: var(--vp-c-brand-3);
--vp-button-brand-hover-border: transparent;
--vp-button-brand-hover-text: var(--vp-c-white);
--vp-button-brand-hover-bg: var(--vp-c-brand-2);
--vp-button-brand-active-border: transparent;
--vp-button-brand-active-text: var(--vp-c-white);
--vp-button-brand-active-bg: var(--vp-c-brand-1);
}
/**
* Component: Home
* -------------------------------------------------------------------------- */
:root {
--vp-home-hero-name-color: transparent;
--vp-home-hero-name-background: -webkit-linear-gradient(
120deg,
#bd34fe 30%,
#41d1ff
);
--vp-home-hero-image-background-image: linear-gradient(
-45deg,
#bd34fe 50%,
#47caff 50%
);
--vp-home-hero-image-filter: blur(44px);
}
@media (min-width: 640px) {
:root {
--vp-home-hero-image-filter: blur(56px);
}
}
@media (min-width: 960px) {
:root {
--vp-home-hero-image-filter: blur(68px);
}
}
/**
* Component: Custom Block
* -------------------------------------------------------------------------- */
:root {
--vp-custom-block-tip-border: transparent;
--vp-custom-block-tip-text: var(--vp-c-text-1);
--vp-custom-block-tip-bg: var(--vp-c-brand-soft);
--vp-custom-block-tip-code-bg: var(--vp-c-brand-soft);
}
/**
* Component: Algolia
* -------------------------------------------------------------------------- */
.DocSearch {
--docsearch-primary-color: var(--vp-c-brand-1) !important;
}

49
docs/api-examples.md Normal file
View File

@ -0,0 +1,49 @@
---
outline: deep
---
# Runtime API Examples
This page demonstrates usage of some of the runtime APIs provided by VitePress.
The main `useData()` API can be used to access site, theme, and page data for the current page. It works in both `.md` and `.vue` files:
```md
<script setup>
import { useData } from 'vitepress'
const { theme, page, frontmatter } = useData()
</script>
## Results
### Theme Data
<pre>{{ theme }}</pre>
### Page Data
<pre>{{ page }}</pre>
### Page Frontmatter
<pre>{{ frontmatter }}</pre>
```
<script setup>
import { useData } from 'vitepress'
const { site, theme, page, frontmatter } = useData()
</script>
## Results
### Theme Data
<pre>{{ theme }}</pre>
### Page Data
<pre>{{ page }}</pre>
### Page Frontmatter
<pre>{{ frontmatter }}</pre>
## More
Check out the documentation for the [full list of runtime APIs](https://vitepress.dev/reference/runtime-api#usedata).

View File

@ -0,0 +1,11 @@
# Flameshot
The Flameshot script has been updated to be a lot more dynamic, including support for [cheek](https://github.com/tycrek/cheek#readme), my serverless ShareX upload server. To enable cheek mode, edit [`flameshot-v2.sh`](https://github.com/tycrek/ass/blob/dev/0.15.0/flameshot-v2.sh) and change `MODE=0` to `MODE=1`.
To set your token (not in use yet, so it can be random) and domain for the script, create the following directory and files:
- `~/.ass/` (or `~/.cheek/`)
- `~/.ass/.token`
- `~/.ass/.domain`
For `.domain`, you do **not** need to include `http(s)://`.

View File

@ -0,0 +1,10 @@
# ShareX
| Setting | Value |
| ------- | ----- |
| Request URL | Your server domain (including `http(s)://`) |
| Request Method | `POST` |
| Destination Type | `Image`, `Text`, `File` |
| Body | `multipart/form-data` |
| File Form Name | `file` |
| URL | `{json.resource}` |

25
docs/configure/index.md Normal file
View File

@ -0,0 +1,25 @@
# Configure
Most of the configuration is managed through the administrator dashboard.
## `server.json` overrides
The webserver in ass 0.15 is hosted independently of any user configuration. If you wish to override a specific server setting, you may do so with a `server.json` file.
Place this file in `<root>/.ass-data/`.
| Property | Use | Default |
| -------- | --- | ------- |
| `host` | Local IP to bind to | `0.0.0.0` |
| `port` | Port to listen on | `40115` |
| `proxied` | If ass is behind a reverse proxy | `false`, unless `NODE_ENV=production` is specified, otherwise `true` |
**Example**
```json
{
"host": "127.0.1.2",
"port": 40200,
"proxied": false
}
```

View File

@ -0,0 +1,7 @@
# MySQL
## Provider-specific instructions
### Aiven
In the **Overview** panel, scroll down to **Advanced**, and set `mysql.sql_require_primary_key` to **`Disabled`**.

1
docs/customize/colors.md Normal file
View File

@ -0,0 +1 @@
# Colors

3
docs/customize/index.md Normal file
View File

@ -0,0 +1,3 @@
# Customize
This is coming soon.

34
docs/index.md Normal file
View File

@ -0,0 +1,34 @@
---
# https://vitepress.dev/reference/default-theme-home-page
layout: home
title: Home
hero:
name: ass
text: open-source file hosting server
tagline: Unopinionated, customizable, uniquely yours.
actions:
- theme: brand
text: Get Started
link: /install/
- theme: alt
text: View on GitHub
link: https://github.com/tycrek/ass
image:
src: 'https://i.tycrek.dev/ass-round-square-logo-white-with-text'
alt: ass logo
features:
- icon: 😋
title: sassy
details: Like me.
- icon: 🍔
title: greasy
details: More than a Big Mac.
- icon: ☁️
title: soft
details: Just the way you like it.
---

32
docs/install/docker.md Normal file
View File

@ -0,0 +1,32 @@
# Docker
The Docker method uses [Docker Compose][1] for a quick and easy installation. For a faster deployment, a pre-built image is pulled from [Docker Hub](https://hub.docker.com/r/tycrek/ass).
## Requirements
- Latest [Docker](https://docs.docker.com/engine/install/)
- [Docker Compose][1] v2 plugin
[1]: https://docs.docker.com/compose/
## Install
I provide a pre-made `compose.yaml` file that makes it easier to get started.
```bash
mkdir ass && cd ass/
curl -LO https://ass.tycrek.dev/compose.yaml
docker compose up -d
```
### View logs
Use the following command to view the container logs:
```bash
docker compose logs -n <lines> --follow
```
## Build local image
If you wish to build a Docker image locally for development, you can use the provided [docker-dev-container.sh](https://github.com/tycrek/ass/blob/dev/0.15.0/docker-dev-container.sh) script.

18
docs/install/index.md Normal file
View File

@ -0,0 +1,18 @@
# Installation
You can use either [Docker](docker) (recommended) or your [local Node.js](local) installation.
::: warning ass 0.15.0 is experimental
Branch [`dev/0.15.0`](https://github.com/tycrek/ass/tree/dev/0.15.0/) is a full rewrite of the ass codebase.
At this time, it is working and ready for testing, **but is very incomplete** and is lacking many features currently found in ass 0.14.
**The existing configs, data.json, and auth.json will be abandoned. There is currently no migration path, nor is one planned.**
:::
## Alternatives
These are maintained by the ass community.
- Nix flake (soon)
- Pterodactyl Egg (soon)

20
docs/install/local.md Normal file
View File

@ -0,0 +1,20 @@
# Local install
The local method uses the [Node.js](https://nodejs.org/en) installation found on your system.
## Requirements
- **Node 20** or later
- [pnpm](https://pnpm.io/installation)
## Install
```bash
git clone -b dev/0.15.0 https://github.com/tycrek/ass.git && cd ass/
pnpm i
# or: npm i -D
pnpm run dev
# After ass has been compiled, you can instead use:
pnpm run start
```

85
docs/markdown-examples.md Normal file
View File

@ -0,0 +1,85 @@
# Markdown Extension Examples
This page demonstrates some of the built-in markdown extensions provided by VitePress.
## Syntax Highlighting
VitePress provides Syntax Highlighting powered by [Shikiji](https://github.com/antfu/shikiji), with additional features like line-highlighting:
**Input**
````md
```js{4}
export default {
data () {
return {
msg: 'Highlighted!'
}
}
}
```
````
**Output**
```js{4}
export default {
data () {
return {
msg: 'Highlighted!'
}
}
}
```
## Custom Containers
**Input**
```md
::: info
This is an info box.
:::
::: tip
This is a tip.
:::
::: warning
This is a warning.
:::
::: danger
This is a dangerous warning.
:::
::: details
This is a details block.
:::
```
**Output**
::: info
This is an info box.
:::
::: tip
This is a tip.
:::
::: warning
This is a warning.
:::
::: danger
This is a dangerous warning.
:::
::: details
This is a details block.
:::
## More
Check out the documentation for the [full list of markdown extensions](https://vitepress.dev/guide/markdown).

136
flameshot-v1.1.sh Normal file
View File

@ -0,0 +1,136 @@
#!/usr/bin/env bash
# Script Configuration
# Load configuration file if available
# this is useful if you want to source keys from a secret file
CONFIG_FILE="config.sh"
if [ -f "$CONFIG_FILE" ]; then
# shellcheck disable=1090
source "${CONFIG_FILE}"
fi
LOG_DIR=$(pwd)
if [ ! -d "$LOG_DIR" ]; then
echo "The directory you have specified to save the logs does not exist."
echo "Please create the directory with the following command:"
echo "mkdir -p $LOG_DIR"
echo -en "Or specify a different LOG_DIR\n"
exit 1
fi
IMAGE_PATH="$HOME/Pictures"
if [ ! -d "$IMAGE_PATH" ]; then
echo "The directory you have specified to save the screenshot does not exist."
echo "Please create the directory with the following command:"
echo "mkdir -p $IMAGE_PATH"
echo -en "Or specify a different IMAGE_PATH\n"
exit 1
fi
IMAGE_NAME="ass"
FILE="${IMAGE_PATH}/${IMAGE_NAME}.png"
# Function to check if a tool is installed
check_tool() {
command -v "$1" >/dev/null 2>&1
}
# Function to take Flameshot screenshots
takeFlameshot() {
# check if flameshot tool is installed
REQUIRED_TOOLS=("flameshot")
for tool in "${REQUIRED_TOOLS[@]}"; do
if ! check_tool "$tool"; then
echo "Error: $tool is not installed. Please install it before using this script."
exit 1
fi
done
flameshot config -f "${IMAGE_NAME}"
flameshot gui -r -p "${IMAGE_PATH}" >/dev/null
}
# Function to take Wayland screenshots using grim + slurp
takeGrimshot() {
# check if grim and slurp are installed
REQUIRED_TOOLS=("grim" "slurp")
for tool in "${REQUIRED_TOOLS[@]}"; do
if ! check_tool "$tool"; then
echo "Error: $tool is not installed. Please install it before using this script."
exit 1
fi
done
grim -g "$(slurp)" "${FILE}" >/dev/null
}
# Function to remove the taken screenshot
removeTargetFile() {
echo -en "Process complete.\nRemoving image.\n"
rm -v "${FILE}"
}
# Function to upload target image to your ass instance
uploadScreenshot() {
echo -en "KEY & DOMAIN are set. Attempting to upload to your ass instance.\n"
URL=$(curl -X POST \
-H "Content-Type: multipart/form-data" \
-H "Accept: application/json" \
-H "User-Agent: ShareX/13.4.0" \
-H "Authorization: $KEY" \
-F "file=@${FILE}" "https://$DOMAIN/" | grep -Po '(?<="resource":")[^"]+')
if [[ "${XDG_SESSION_TYPE}" == x11 ]]; then
printf "%s" "$URL" | xclip -sel clip
elif [[ "${XDG_SESSION_TYPE}" == wayland ]]; then
printf "%s" "$URL" | wl-copy
else
echo -en "Invalid desktop session!\nExiting.\n"
exit 1
fi
}
localScreenshot() {
echo -en "KEY & DOMAIN variables are not set. Attempting local screenshot.\n"
if [[ "${XDG_SESSION_TYPE}" == x11 ]]; then
xclip -sel clip -target image/png <"${FILE}"
elif [[ "${XDG_SESSION_TYPE}" == wayland ]]; then
wl-copy <"${FILE}"
else
echo -en "Unknown display backend. Assuming Xorg and using xclip.\n"
xclip -sel clip -target image/png <"${FILE}"
fi
}
# Choose the screenshot tool based on the display backend
if [[ "${XDG_SESSION_TYPE}" == x11 ]]; then
echo -en "Display backend detected as Xorg (x11), using Flameshot\n"
takeFlameshot
elif [[ "${XDG_SESSION_TYPE}" == wayland ]]; then
echo -en "Display backend detected as Wayland, using grim & slurp\n"
takeGrimshot
else
echo -en "Unknown display backend. Assuming Xorg and using Flameshot\n"
takeFlameshot >"${LOG_DIR}/flameshot.log"
echo -en "Done. Make sure you check for any errors and report them.\nLogfile located in '${LOG_DIR}'\n"
fi
# Check if the screenshot file exists before proceeding
if [[ -f "${FILE}" ]]; then
if [[ -n "$KEY" && -n "$DOMAIN" ]]; then
# Upload the file to the ass instance
uploadScreenshot
# Remove image
removeTargetFile
else
# Take a screenshot locally
localScreenshot
# Remove image
removeTargetFile
fi
else
echo -en "Target file ${FILE} was not found. Aborting screenshot.\n"
exit 1
fi

89
flameshot-v2.sh Executable file
View File

@ -0,0 +1,89 @@
#!/bin/bash
## * ass & cheek flameshot script * ##
#
# Required packages: flameshot, curl, xclip, libnotify
#
# Authors:
# - ToxicAven (https://github.com/ToxicAven)
# - tycrek (https://github.com/tycrek)
# - Metacinnabar (https://github.com/Metacinnabar)
# - NotAShelf (https://github.com/NotAShelf)
# ! Upload mode (ass=0,cheek=1)
MODE=0
# Function to check if a tool is installed
check_tool() {
command -v "$1" >/dev/null 2>&1
}
# Mode string switcher
get_mode() {
if [[ $MODE -eq 0 ]];
then echo "ass"
else echo "cheek"
fi
}
# File details
IMGPATH="$HOME/.$(get_mode)"
FILE="$IMGPATH/$(get_mode)-$(date +%s).png"
# ass/cheek configuration (domain should be saved without http(s)://)
TOKEN=$(cat $IMGPATH/.token)
DOMAIN=$(cat $IMGPATH/.domain)
takeScreenshot() {
REQUIRED_TOOLS=("flameshot" "curl" "xclip" "notify-send")
# Check if the proper tools are installed
for tool in "${REQUIRED_TOOLS[@]}"; do
if ! check_tool "$tool"; then
echo "Error: $tool is missing!"
exit 1
fi
done
# Build dynamic Flameshot user-agent
USERAGENT=$(flameshot -v | sed -n -E 's/(Flameshot) (v[0-9]+\.[0-9]+\.[0-9]+) .+/\1-\2/p')
# Take screenshot with Flameshot
flameshot gui -r -p "$FILE" > /dev/null # Append the random gibberish to /dev/null
# Upload file
if [ -f "$FILE" ]; then
echo "Uploading $FILE to $(get_mode)..."
# Configure upload fields
FIELD="$([[ $MODE -eq 0 ]] && echo "file" || echo "image")=@$FILE"
[[ "${DOMAIN%%:*}" = "127.0.0.1" ]] && PROTOCOL="http" || PROTOCOL="https"
POSTTO="$PROTOCOL://$DOMAIN/$([[ $MODE -eq 0 ]] && echo "" || echo "upload")"
# Upload the file
URL=$(curl -sS -X POST \
-H "Content-Type: multipart/form-data" \
-H "Accept: application/json" \
-H "User-Agent: $USERAGENT" \
-H "Authorization: $TOKEN" \
-F $FIELD $POSTTO
)
# Response parser unique to ass
if [[ $MODE -eq 0 ]]; then
URL=$(echo $URL | grep -Po '(?<="resource":")[^"]+')
fi
# Copy the URL to clipboard (using printf instead of echo to avoid a newline)
printf "%s" "$URL" | xclip -sel clip
echo "URL copied: $URL"
notify-send -a $(get_mode) -t 4000 "URL copied to clipboard" "<a href=\"$URL\">View in browser</a>"
# Delete local file
rm "$FILE"
else
echo "Aborted."
fi
}
takeScreenshot

View File

@ -1,27 +0,0 @@
#!/bin/bash
IMAGEPATH="$HOME/Pictures/" # Where to store screenshots before they're deleted
IMAGENAME="ass" # Not really important, tells Flameshot what file to send and delete
KEY="" # Your ass upload token
DOMAIN="" # Your upload domain (without http:// or https://)
flameshot config -f "$IMAGENAME" # Make sure that Flameshot names the file correctly
flameshot gui -r -p "$IMAGEPATH" > /dev/null # Prompt the screenshot GUI, also append the random gibberish to /dev/null
FILE="$IMAGEPATH$IMAGENAME.png" # File path and file name combined
# Check if file exists to handle Curl and rm errors
# then upload the image and copy the response URL
if [ -f "$FILE" ]; then
echo "$FILE exists."
URL=$(curl -X POST \
-H "Content-Type: multipart/form-data" \
-H "Accept: application/json" \
-H "User-Agent: ShareX/13.4.0" \
-H "Authorization: $KEY" \
-F "file=@$IMAGEPATH$IMAGENAME.png" "https://$DOMAIN/" | grep -Po '(?<="resource":")[^"]+')
# printf instead of echo as echo appends a newline
printf "%s" "$URL" | xclip -sel clip
rm "$IMAGEPATH$IMAGENAME.png" # Delete the image locally
else
echo "Aborted."
fi

4
frontend/admin.mts Normal file
View File

@ -0,0 +1,4 @@
import { SlInput, SlButton } from '@shoelace-style/shoelace';
// * Wait for the document to be ready
document.addEventListener('DOMContentLoaded', () => console.log('Admin page loaded'));

46
frontend/login.mts Normal file
View File

@ -0,0 +1,46 @@
import { SlInput, SlButton } from '@shoelace-style/shoelace';
const genericErrorAlert = () => alert('An error occurred, please check the console for details');
const errAlert = (logTitle: string, err: any, stream: 'error' | 'warn' = 'error') => (console[stream](logTitle, err), genericErrorAlert());
const errReset = (message: string, element: SlButton) => (element.disabled = false, alert(message));
// * Wait for the document to be ready
document.addEventListener('DOMContentLoaded', () => {
const Elements = {
usernameInput: document.querySelector('#login-username') as SlInput,
passwordInput: document.querySelector('#login-password') as SlInput,
submitButton: document.querySelector('#login-submit') as SlButton
};
// * Login button click handler
Elements.submitButton.addEventListener('click', async () => {
Elements.submitButton.disabled = true;
// Make sure fields are filled
if (Elements.usernameInput.value == null || Elements.usernameInput.value === '')
return errReset('Username is required!', Elements.submitButton);
if (Elements.passwordInput.value == null || Elements.passwordInput.value === '')
return errReset('Password is required!', Elements.submitButton);
fetch('/api/login', {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({
username: Elements.usernameInput.value,
password: Elements.passwordInput.value
})
})
.then((res) => res.json())
.then((data: {
success: boolean,
message: string,
meta: { redirectTo: string }
}) => {
if (!data.success) alert(data.message);
else window.location.href = data.meta.redirectTo;
})
.catch((err) => errAlert('POST to /api/login failed!', err))
.finally(() => Elements.submitButton.disabled = false);
});
});

195
frontend/setup.mts Normal file
View File

@ -0,0 +1,195 @@
import { SlInput, SlButton, SlTab } from '@shoelace-style/shoelace';
import { IdType, UserConfiguration } from 'ass';
const genericErrorAlert = () => alert('An error occurred, please check the console for details');
const errAlert = (logTitle: string, err: any, stream: 'error' | 'warn' = 'error') => (console[stream](logTitle, err), genericErrorAlert());
const errReset = (message: string, element: SlButton) => (element.disabled = false, alert(message));
const genericRateLimit = (config: object, category: string, submitButton: SlButton, requests: SlInput, time: SlInput) => {
if ((requests.value || time.value) != '') {
if (requests.value == '') {
errReset(`No count for ${category} rate limit`, submitButton);
return true; // this should probably be false but this lets us chain this until we see an error
}
if (time.value == '') {
errReset(`No time for ${category} rate limit`, submitButton);
return true;
}
(config as any)[category] = {
requests: parseInt(requests.value),
duration: parseInt(time.value),
};
}
return false;
};
// * Wait for the document to be ready
document.addEventListener('DOMContentLoaded', () => {
const Elements = {
dirInput: document.querySelector('#uploads-dir') as SlInput,
idTypeInput: document.querySelector('#uploads-idtype') as SlInput,
idSizeInput: document.querySelector('#uploads-idsize') as SlInput,
gfySizeInput: document.querySelector('#uploads-gfysize') as SlInput,
fileSizeInput: document.querySelector('#uploads-filesize') as SlInput,
s3endpoint: document.querySelector('#s3-endpoint') as SlInput,
s3bucket: document.querySelector('#s3-bucket') as SlInput,
s3accessKey: document.querySelector('#s3-accessKey') as SlInput,
s3secretKey: document.querySelector('#s3-secretKey') as SlInput,
s3region: document.querySelector('#s3-region') as SlInput,
jsonTab: document.querySelector('#json-tab') as SlTab,
mySqlTab: document.querySelector('#mysql-tab') as SlTab,
mySqlHost: document.querySelector('#mysql-host') as SlInput,
mySqlPort: document.querySelector('#mysql-port') as SlInput,
mySqlUser: document.querySelector('#mysql-user') as SlInput,
mySqlPassword: document.querySelector('#mysql-password') as SlInput,
mySqlDatabase: document.querySelector('#mysql-database') as SlInput,
pgsqlTab: document.querySelector('#pgsql-tab') as SlTab,
pgsqlHost: document.querySelector('#pgsql-host') as SlInput,
pgsqlPort: document.querySelector('#pgsql-port') as SlInput,
pgsqlUser: document.querySelector('#pgsql-user') as SlInput,
pgsqlPassword: document.querySelector('#pgsql-password') as SlInput,
pgsqlDatabase: document.querySelector('#pgsql-database') as SlInput,
userUsername: document.querySelector('#user-username') as SlInput,
userPassword: document.querySelector('#user-password') as SlInput,
ratelimitLoginRequests: document.querySelector('#ratelimit-login-requests') as SlInput,
ratelimitLoginTime: document.querySelector('#ratelimit-login-time') as SlInput,
ratelimitApiRequests: document.querySelector('#ratelimit-api-requests') as SlInput,
ratelimitApiTime: document.querySelector('#ratelimit-api-time') as SlInput,
ratelimitUploadRequests: document.querySelector('#ratelimit-upload-requests') as SlInput,
ratelimitUploadTime: document.querySelector('#ratelimit-upload-time') as SlInput,
submitButton: document.querySelector('#submit') as SlButton,
};
// * Setup button click handler
Elements.submitButton.addEventListener('click', async () => {
Elements.submitButton.disabled = true;
// Base configuration values
const config: UserConfiguration = {
uploadsDir: Elements.dirInput.value,
idType: Elements.idTypeInput.value as IdType,
idSize: parseInt(Elements.idSizeInput.value),
gfySize: parseInt(Elements.gfySizeInput.value),
maximumFileSize: parseInt(Elements.fileSizeInput.value),
};
// Append S3 to config, if specified
if (Elements.s3endpoint.value != null && Elements.s3endpoint.value !== '') {
config.s3 = {
endpoint: Elements.s3endpoint.value,
bucket: Elements.s3bucket.value,
credentials: {
accessKey: Elements.s3accessKey.value,
secretKey: Elements.s3secretKey.value
}
};
// Also append region, if it was provided
if (Elements.s3region.value != null && Elements.s3region.value !== '')
config.s3.region = Elements.s3region.value;
}
// Append database to config, if specified
if (Elements.jsonTab.active) {
config.database = {
kind: 'json'
};
} else if (Elements.mySqlTab.active) {
if (Elements.mySqlHost.value != null && Elements.mySqlHost.value != '') {
config.database = {
kind: 'mysql',
options: {
host: Elements.mySqlHost.value,
port: parseInt(Elements.mySqlPort.value),
user: Elements.mySqlUser.value,
password: Elements.mySqlPassword.value,
database: Elements.mySqlDatabase.value
}
};
}
} else if (Elements.pgsqlTab.active) {
if (Elements.pgsqlHost.value != null && Elements.pgsqlHost.value != '') {
config.database = {
kind: 'postgres',
options: {
host: Elements.pgsqlHost.value,
port: parseInt(Elements.pgsqlPort.value),
user: Elements.pgsqlUser.value,
password: Elements.pgsqlPassword.value,
database: Elements.pgsqlDatabase.value
}
};
}
}
// append rate limit config, if specified
if ((
Elements.ratelimitLoginRequests.value
|| Elements.ratelimitLoginTime.value
|| Elements.ratelimitUploadRequests.value
|| Elements.ratelimitUploadTime.value
|| Elements.ratelimitApiRequests.value
|| Elements.ratelimitApiTime.value) != ''
) {
if (!config.rateLimit) config.rateLimit = {};
if (
genericRateLimit(config.rateLimit, 'login', Elements.submitButton, Elements.ratelimitLoginRequests, Elements.ratelimitLoginTime)
|| genericRateLimit(config.rateLimit, 'api', Elements.submitButton, Elements.ratelimitApiRequests, Elements.ratelimitApiTime)
|| genericRateLimit(config.rateLimit, 'upload', Elements.submitButton, Elements.ratelimitUploadRequests, Elements.ratelimitUploadTime)
) {
return;
}
}
// ! Make sure the admin user fields are set
if (Elements.userUsername.value == null || Elements.userUsername.value === '')
return errReset('Admin username is required!', Elements.submitButton);
if (Elements.userPassword.value == null || Elements.userPassword.value === '')
return errReset('Admin password is required!', Elements.submitButton);
// Do setup
fetch('/api/setup', {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify(config)
})
.then((res) => res.json())
.then((data: {
success: boolean,
message: string
}) => {
if (!data.success) alert(data.message);
// Create first user (YES I KNOW THIS NESTING IS GROSS)
else return fetch('/api/user', {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({
username: Elements.userUsername.value,
password: Elements.userPassword.value,
admin: true
})
}).then((res) => res.json())
.then((data: {
success: boolean,
message: string
}) => {
if (data.success) window.location.href = '/admin';
else alert(data.message);
});
})
.catch((err) => errAlert('POST to /api/setup failed!', err))
.finally(() => Elements.submitButton.disabled = false);
});
});

15
frontend/tsconfig.json Normal file
View File

@ -0,0 +1,15 @@
{
"extends": "@tsconfig/node20/tsconfig.json",
"compilerOptions": {
"outDir": "../dist/frontend",
"lib": [
"ES2022",
"DOM"
],
"target": "ES2015",
},
"include": [
"./**/*.mts",
"../**/common/*.ts"
]
}

4
frontend/user.mts Normal file
View File

@ -0,0 +1,4 @@
import { SlInput, SlButton } from '@shoelace-style/shoelace';
// * Wait for the document to be ready
document.addEventListener('DOMContentLoaded', () => console.log('User page loaded'));

View File

@ -1,31 +0,0 @@
#!/bin/bash
echo "Installing ass-docker for Linux..."
# Ensure that ./uploads/thumbnails/ exists
mkdir -p ./uploads/thumbnails/
# Ensure that ./share/ exists
mkdir -p ./share/
# Ensure that files config.json, auth.json, & data.json exist
for value in config.json auth.json data.json
do
if [ ! -f $value ]; then
touch $value
fi
done
# Wait for user to confirm
echo "Continuing will run docker compose. Continue? (Press Ctrl+C to abort)"
read -n 1 -s -r -p "Press any key to continue..."
echo Running setup...
# Bring up the container and run the setup
docker compose up -d && docker compose exec ass npm run setup && docker compose restart
# Done!
echo "ass-docker for Linux installed!"
echo "Run the following to view commands:"
echo "$ docker compose logs -f --tail=50 --no-log-prefix ass"

View File

@ -1,28 +0,0 @@
@echo off
ECHO Installing ass-docker for Windows...
REM Ensure that ./uploads/thumbnails/ exists
if not exist "./uploads/thumbnails/" md "./uploads/thumbnails/"
REM Ensure that ./share/ exists
if not exist "./share/" md "./share/"
REM Ensure that files config.json, auth.json, & data.json exist
if not exist "./config.json" echo. >> "./config.json"
if not exist "./auth.json" echo. >> "./auth.json"
if not exist "./data.json" echo. >> "./data.json"
REM Wait for user to confirm
ECHO Continuing will run docker compose. Continue? (Press Ctrl+C to abort)
PAUSE
ECHO Running setup...
REM Bring up the container and run the setup
docker compose up -d && docker compose exec ass npm run setup && docker compose restart
REM Done!
ECHO ass-docker for Windows installed!
ECHO Run the following to view commands:
ECHO > docker compose logs -f --tail=50 --no-log-prefix ass

6253
package-lock.json generated

File diff suppressed because it is too large Load Diff

View File

@ -1,97 +1,82 @@
{
"name": "ass",
"version": "0.14.8",
"version": "0.15.0-indev",
"description": "The simple self-hosted ShareX server",
"main": "ass.js",
"main": "dist/backend/app.js",
"type": "module",
"engines": {
"node": ">=16.14.x",
"npm": ">=8.3.x"
"node": "^20"
},
"scripts": {
"start": "node dist/backend/app.js",
"dev": "npm run build && npm start",
"dev-win": "npm run build-skip-options && npm run start",
"build": "NODE_OPTIONS=\"--max-old-space-size=1024\" tsc",
"build-skip-options": "tsc",
"start": "node dist/ass.js",
"setup": "node dist/setup.js",
"metrics": "node dist/metrics.js",
"engine-check": "node dist/checkEngine.js",
"prestart": "npm run engine-check",
"presetup": "npm run engine-check",
"purge": "node dist/purge.js",
"docker-logs": "docker-compose logs -f --tail=50 --no-log-prefix ass",
"docker-update": "git pull && npm run docker-uplite",
"docker-uplite": "docker-compose up --force-recreate --build -d && docker image prune -f",
"docker-upfull": "npm run docker-update && npm run docker-resetup",
"docker-resetup": "docker-compose exec ass npm run setup && docker-compose restart",
"cli-setpassword": "node dist/tools/script.setpassword.js",
"cli-testpassword": "node dist/tools/script.testpassword.js",
"cli-adduser": "node dist/tools/script.adduser.js"
"dev:docs": "wrangler pages dev --proxy 5173 -- npm run vp:dev",
"build": "rm -dr dist/ ; npm run build:backend && npm run build:frontend && npm run build:fix-frontend",
"build:backend": "tsc -p backend/",
"build:frontend": "tsc -p frontend/",
"build:fix-frontend": "node common/fix-frontend-js.js",
"build:docs": "npm run vp:build && npm run build:compose-redir",
"build:compose-redir": "echo \"/compose.yaml https://raw.githubusercontent.com/tycrek/ass/dev/0.15.0/compose.yaml 302\" > ./docs/.vitepress/dist/_redirects",
"vp:dev": "vitepress dev docs",
"vp:build": "vitepress build docs",
"vp:preview": "vitepress preview docs"
},
"repository": "github:tycrek/ass",
"keywords": [
"sharex",
"sharex-server"
],
"author": "tycrek <t@tycrek.com> (https://tycrek.com/)",
"author": "tycrek <sylvie@tycrek.com> (https://tycrek.com/)",
"license": "ISC",
"bugs": "https://github.com/tycrek/ass/issues",
"homepage": "https://github.com/tycrek/ass#readme",
"funding": {
"type": "patreon",
"url": "https://patreon.com/tycrek"
},
"dependencies": {
"@aws-sdk/client-s3": "^3.465.0",
"@shoelace-style/shoelace": "^2.12.0",
"@tinycreek/postcss-font-magician": "^4.2.0",
"@tsconfig/node16": "^1.0.1",
"@tsconfig/node20": "^20.1.2",
"@tycrek/discord-hookr": "^0.1.0",
"@tycrek/express-postcss": "^0.4.1",
"@tycrek/joint": "^1.0.0-1",
"@tycrek/log": "^0.7.1",
"@tycrek/papito": "^0.3.4",
"@tycrek/joint": "1.0.0-1",
"@tycrek/log": "^0.7.5",
"@xoi/gps-metadata-remover": "^1.1.2",
"any-shell-escape": "^0.1.1",
"autoprefixer": "^10.4.16",
"aws-sdk": "^2.1467.0",
"axios": "^1.6.0",
"axios": "^1.6.2",
"bcrypt": "^5.1.1",
"chalk": "^4.1.2",
"check-node-version": "^4.2.1",
"crypto-random-string": "3.3.1",
"cssnano": "^6.0.1",
"escape-html": "^1.0.3",
"express": "^4.18.2",
"express-busboy": "^10.1.0",
"express-rate-limit": "^7.1.5",
"express-session": "^1.17.3",
"ffmpeg-static": "^5.2.0",
"fs-extra": "^11.1.1",
"helmet": "^7.0.0",
"luxon": "^3.4.3",
"nanoid": "^3.3.4",
"node-fetch": "^2.6.7",
"node-vibrant": "^3.2.1-alpha.1",
"prompt": "^1.3.0",
"fs-extra": "^11.2.0",
"luxon": "^3.4.4",
"memorystore": "^1.6.7",
"mysql2": "^3.6.5",
"node-vibrant": "^3.1.6",
"pg": "^8.11.3",
"pug": "^3.0.2",
"sanitize-filename": "^1.6.3",
"sharp": "^0.32.6",
"stream-to-array": "^2.3.0",
"tailwindcss": "^3.3.3",
"typescript": "^4.9.5",
"uuid": "^8.3.2"
"shoelace-fontawesome-pug": "^6.4.3",
"shoelace-pug-loader": "^2.11.0",
"tailwindcss": "^3.3.6",
"typescript": "^5.3.2",
"william.js": "^1.3.1"
},
"devDependencies": {
"@types/bcrypt": "^5.0.0",
"@types/escape-html": "^1.0.1",
"@types/express": "^4.17.13",
"@types/express-busboy": "^8.0.0",
"@types/ffmpeg-static": "^3.0.0",
"@types/fs-extra": "^9.0.12",
"@types/luxon": "^3.3.0",
"@types/marked": "^3.0.0",
"@types/node": "^16.9.0",
"@types/node-fetch": "^2.5.12",
"@types/sharp": "^0.30.2",
"@types/stream-to-array": "^2.3.0",
"@types/uuid": "^8.3.1",
"@types/ws": "^7.4.7"
"@types/bcrypt": "^5.0.2",
"@types/express": "^4.17.21",
"@types/express-busboy": "^8.0.3",
"@types/express-session": "^1.17.10",
"@types/ffmpeg-static": "^3.0.3",
"@types/fs-extra": "^11.0.4",
"@types/luxon": "^3.3.6",
"@types/node": "^20.10.3",
"@types/pg": "^8.10.9",
"vitepress": "1.0.0-rc.31",
"vue": "^3.3.10",
"wrangler": "^3.18.0"
}
}

6923
pnpm-lock.yaml Normal file

File diff suppressed because it is too large Load Diff

View File

@ -1,133 +0,0 @@
import { ErrWrap } from './types/definitions';
import { Config, MagicNumbers, Package } from 'ass-json';
//#region Imports
import fs from 'fs-extra';
import express, { Request, Response, json as BodyParserJson } from 'express';
import { nofavicon } from '@tycrek/joint';
import { epcss } from '@tycrek/express-postcss';
import tailwindcss from 'tailwindcss';
import helmet from 'helmet';
import { path, log, getTrueHttp, getTrueDomain } from './utils';
import { onStart as ApiOnStart } from './routers/api';
//#endregion
//#region Setup - Run first time setup if using Docker (pseudo-process, setup will be run with docker exec)
import { doSetup } from './setup';
const configPath = path('config.json');
if (!fs.existsSync(configPath) || fs.readFileSync(configPath).toString().length === 0) {
doSetup();
// @ts-ignore
return;
}
//#endregion
// Load the JSON
const { host, port, useSsl, isProxied, s3enabled, frontendName, diskFilePath }: Config = fs.readJsonSync(path('config.json'));
const { CODE_INTERNAL_SERVER_ERROR }: MagicNumbers = fs.readJsonSync(path('MagicNumbers.json'));
const { name, version, homepage }: Package = fs.readJsonSync(path('package.json'));
//#region Local imports
import uploadRouter from './routers/upload';
import resourceRouter from './routers/resource';
//#endregion
// Welcome :D
log.blank().info(`* ${name} v${version} *`).blank();
//#region Variables, module setup
const app = express();
const ROUTERS = {
upload: uploadRouter,
resource: resourceRouter
};
// Read users and data
import { onStart as AuthOnStart, users } from './auth';
import { onStart as DataOnStart, data } from './data';
//#endregion
// Create thumbnails directory
fs.ensureDirSync(path(diskFilePath, 'thumbnails'));
// Enable/disable Express features
app.enable('case sensitive routing');
app.disable('x-powered-by');
// Set Express variables
app.set('trust proxy', isProxied);
app.set('view engine', 'pug');
// Express logger middleware
// app.use(log.middleware());
// Body parser for API POST requests
// (I really don't like this being top level but it does not work inside the API Router as of 2022-12-24)
app.use(BodyParserJson());
// Helmet security middleware
app.use(helmet.noSniff());
app.use(helmet.ieNoOpen());
app.use(helmet.xssFilter());
app.use(helmet.referrerPolicy());
app.use(helmet.dnsPrefetchControl());
useSsl && app.use(helmet.hsts({ preload: true })); // skipcq: JS-0093
// Don't process favicon requests
// todo: this doesn't actually return a 204 properly, it returns a 404
app.use(nofavicon.none());
// Use custom index, otherwise render README.md
type ASS_INDEX_TYPE = 'html' | 'js' | undefined;
const ASS_INDEX: ASS_INDEX_TYPE = fs.existsSync(path('share', 'index.html')) ? 'html' : fs.existsSync(path('share', 'index.js')) ? 'js' : undefined;
app.get('/', (req, res, next) =>
ASS_INDEX === 'html' ? res.sendFile(path('share', 'index.html')) :
ASS_INDEX === 'js' ? require(path('share', 'index.js'))(req, res, next) : // skipcq: JS-0359
res.redirect(homepage))
// Set up custom frontend
const ASS_FRONTEND = { enabled: false }; // ! Disabled in 0.14.7
// Upload router (has to come after custom frontends as express-busboy interferes with all POST calls)
app.use('/', ROUTERS.upload);
// API
app.use('/api', ApiOnStart());
// CSS
app.use('/css', epcss({
cssPath: path('tailwind.css'),
plugins: [
tailwindcss,
require('autoprefixer')(),
require('cssnano')(),
require('@tinycreek/postcss-font-magician')(),
],
warn: (warning: Error) => log.warn('PostCSS', warning.toString())
}));
// '/:resouceId' always needs to be LAST since it's a catch-all route
app.use('/:resourceId', (req, _res, next) => (req.resourceId = req.params.resourceId, next()), ROUTERS.resource); // skipcq: JS-0086, JS-0090
// Error handler
app.use((err: ErrWrap, _req: Request, res: Response) => {
log.error(err.message);
console.error(err);
res.sendStatus(CODE_INTERNAL_SERVER_ERROR);
});
(async function start() {
await AuthOnStart();
await DataOnStart();
if (data() == null) setTimeout(start, 100);
else log
.info('Users', `${users.length}`)
.info('Files', `${data().size}`)
.info('Data engine', data().name, data().type)
.info('Frontend', 'disabled')
.info('Custom index', ASS_INDEX ?? 'disabled')
.blank()
.callback(() => app.listen(port, host, () => log.success('Ready for uploads', `Storing resources ${s3enabled ? 'in S3' : 'on disk'}`)));
})();

View File

@ -1,22 +0,0 @@
const check = require("check-node-version");
const ENGINES = require('../package.json').engines;
const { TLog } = require('@tycrek/log');
const logger = new TLog();
function doCheck() {
return new Promise((resolve, reject) =>
check(ENGINES, (err, { isSatisfied: allSatisfied, versions }) =>
err ? reject(err) : allSatisfied ? resolve('Node & npm version requirements satisfied!')
: reject(Object.entries(versions)
.filter(([, { isSatisfied }]) => (!isSatisfied))
.map(([packageName, { version: current, wanted: minimum }]) =>
`\nInvalid ${packageName} version!\n- Current: ${current}\n- Required: ${minimum}`)
.join('')
.concat('\nPlease update to continue!'))));
}
if (require.main !== module) module.exports = doCheck;
else doCheck()
.then((result) => logger.comment(`Wanted: ${ENGINES.node} (npm ${ENGINES.npm})`)/* .node() */.success(result))
.catch((err) => logger.error(err) && process.exit(1));

View File

@ -1,25 +0,0 @@
/**
* Used for global data management
*/
import fs from 'fs-extra';
import { Config } from 'ass-json';
import { JsonDataEngine } from '@tycrek/papito'
let theData: any;
/**
* Called by ass.ts on startup
* @since v0.14.2
*/
export const onStart = () => new Promise((resolve, reject) => {
// Actual data engine
const { dataEngine }: Config = fs.readJsonSync('config.json');
import(dataEngine)
.then(({ _ENGINE_ }) => theData = _ENGINE_(new JsonDataEngine()))
.then(resolve)
.catch(reject);
});
// Export a self-calling const function returning the data
export const data = ((): any => theData);

View File

@ -1,23 +0,0 @@
import fs from 'fs-extra';
// Don't trigger circular dependency during setup
if (require !== undefined && !require?.main?.filename.includes('setup.js'))
var MIN_LENGTH = require('../setup').gfyIdSize; // skipcq: JS-0239, JS-0102
function getWord(list: string[], delim = '') {
return list[Math.floor(Math.random() * list.length)].concat(delim);
}
function genString(count = MIN_LENGTH) {
// For some reason these 3 lines MUST be inside the function
const { path } = require('../utils');
const adjectives = fs.readFileSync(path('./gfycat/adjectives.txt')).toString().split('\n');
const animals = fs.readFileSync(path('./gfycat/animals.txt')).toString().split('\n');
let gfycat = '';
for (let i = 0; i < (count < MIN_LENGTH ? MIN_LENGTH : count); i++)
gfycat += getWord(adjectives, '-');
return gfycat.concat(getWord(animals));
};
export default ({ gfyLength }: { gfyLength: number }) => genString(gfyLength);

View File

@ -1,2 +0,0 @@
import { randomBytes } from 'crypto';
export default (length: number, charset: string[]): string => [...randomBytes(length)].map((byte) => charset[Number(byte) % charset.length]).join('').slice(1).concat(charset[0]);

View File

@ -1,2 +0,0 @@
import { nanoid } from 'nanoid';
export default ({ length }: { length?: number }) => nanoid(length);

View File

@ -1,2 +0,0 @@
import cryptoRandomString from 'crypto-random-string';
export default ({ length }: { length: number }) => cryptoRandomString({ length, type: 'alphanumeric' });

View File

@ -1 +0,0 @@
export default () => `${Date.now()}`;

View File

@ -1,38 +0,0 @@
import { v4 as uuid } from 'uuid';
import fs from 'fs-extra';
import path from 'path';
import randomGen from './random';
import { TLog } from '@tycrek/log';
const log = new TLog();
const MAX_USERNAME = 20;
export default () => uuid().replace(/-/g, '');
module.exports = () => uuid().replace(/-/g, '');
// If directly called on the command line, generate a new token
if (require.main === module) {
const token = module.exports();
const authPath = path.join(process.cwd(), 'auth.json');
let name = '';
fs.readJson(authPath)
.then((auth) => {
// Generate the user
const username = process.argv[2] ? process.argv[2].replace(/[^\da-z_]/gi, '').substring(0, MAX_USERNAME) : randomGen({ length: 20 }); // skipcq: JS-0074
if (!auth.users) auth.users = {};
if (Object.values(auth.users).findIndex((user: any) => user.username === username) !== -1) {
log.error('Username already exists', username);
process.exit(1);
}
auth.users[token] = { username, count: 0 };
name = auth.users[token].username;
fs.writeJsonSync(authPath, auth, { spaces: 4 });
})
.then(() => log
.comment('A new token has been generated and automatically applied.')
.comment('You do not need to restart \'ass\'.')
.success('Your token', token, `username: ${name}`))
.catch(console.error);
}

View File

@ -1,4 +0,0 @@
import lengthGen from './lengthGen';
const zeroWidthChars = ['\u200B', '\u200C', '\u200D', '\u2060'];
export default ({ length }: { length: number }) => lengthGen(length, zeroWidthChars);
export const checkIfZws = (str: string) => str.split('').every(char => zeroWidthChars.includes(char));

View File

@ -1,16 +0,0 @@
import { FileData } from './types/definitions';
import fs from 'fs-extra';
import crypto from 'crypto';
import toArray from 'stream-to-array';
import { log } from './utils';
/**
* Generates a SHA1 hash for the provided file
*/
export default (file: FileData): Promise<string> =>
new Promise((resolve, reject) =>
toArray((fs.createReadStream(file.path)))
.then((parts: any[]) => Buffer.concat(parts.map((part: any) => (Buffer.isBuffer(part) ? part : Buffer.from(part)))))
.then((buf: Buffer) => crypto.createHash('sha1').update(buf).digest('hex')) // skipcq: JS-D003
.then((hash: string) => log.debug(`Hash for ${file.originalname}`, hash, 'SHA1, hex').callback(() => resolve(hash)))
.catch(reject));

View File

@ -1,10 +0,0 @@
import { TLog } from '@tycrek/log';
import { DateTime } from 'luxon';
// Set up logging
const logger = new TLog(process.env.NODE_ENV === 'production' ? 'info' : 'debug')
.setTimestamp({ preset: DateTime.DATETIME_MED });
// todo: re-enable the Express logger
export default logger;

View File

@ -1,65 +0,0 @@
const fs = require('fs-extra');
const path = require('path');
const { s3enabled } = require('../config.json');
const { formatBytes } = require('./utils');
const { bucketSize } = require('./storage');
const { TLog } = require('@tycrek/log');
const log = new TLog({ level: 'debug', timestamp: { enabled: false } });
/**
* Thank you CoPilot for helping write whatever the fuck this is -tycrek, 2022-04-18
*/
function whileWait(expression, timeout = 1000) {
return new Promise(async (resolve, reject) => {
while (expression())
await new Promise((resolve) => setTimeout(resolve, timeout));
resolve();
});
}
module.exports = () => {
const data = require('./data').data;
const { users } = fs.readJsonSync(path.join(process.cwd(), 'auth.json'));
Object.keys(users).forEach((token) => users[token].count = 0);
let totalSize = 0;
let oldSize = 0;
let d = [];
whileWait(() => data() === undefined)
.then(() => data().get())
.then((D) => (d = D.map(([, resource]) => resource)))
.then(() =>
d.forEach(({ token, size }) => {
try {
totalSize += size;
if (token === undefined) oldSize += size; // skipcq: JS-0127
else {
if (!users[token].size) users[token].size = 0;
users[token].size += size;
users[token].count++;
}
} catch (ex) {
// Silently handle missing tokens from dev environment -tycrek
}
}))
.then(() => bucketSize())
.then((s3size) => {
log.info('---- Usage metrics ----')
.blank()
.info('Users', Object.keys(users).length)
.info('Files', Object.keys(d).length)
.info('S3 size', s3enabled ? s3size : '--')
.blank()
.info('Total size', formatBytes(totalSize))
.info('Old files', formatBytes(oldSize))
.blank();
Object.values(users).forEach(({ username, count, size }) => log.info(`- ${username}`, formatBytes(size), `${count} files`));
process.exit(0);
})
.catch(console.error);
}
if (require.main === module) module.exports();
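
Editor's note: whileWait above is a poll-until-false helper wrapped in a Promise executor. A tidier equivalent, as a sketch only (the original behaviour is kept; the async-executor pattern is simply avoided):

// Poll every `timeout` ms while `expression()` stays truthy, then resolve.
const whileWait = async (expression: () => boolean, timeout = 1000): Promise<void> => {
	while (expression())
		await new Promise((r) => setTimeout(r, timeout));
};

// Example: wait for a lazily initialised value (names hypothetical).
let files: string[] | undefined;
setTimeout(() => (files = ['a.png', 'b.png']), 1500);
whileWait(() => files === undefined).then(() => console.log(files!.length)); // 2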

View File

@ -1,26 +0,0 @@
/**
* This strips GPS EXIF data from files
*/
import { removeLocation } from '@xoi/gps-metadata-remover';
import fs from 'fs-extra';
/**
* This strips GPS EXIF data from files using the @xoi/gps-metadata-remover package
* @returns A Promise that resolves to `true` if GPS data was removed, `false` if not
*/
export const removeGPS = (file: string): Promise<boolean> => {
return new Promise((resolve, reject) =>
fs.open(file, 'r+')
.then((fd) => removeLocation(file,
// Read function
(size: number, offset: number): Promise<Buffer> =>
fs.read(fd, Buffer.alloc(size), 0, size, offset)
.then(({ buffer }) => Promise.resolve(buffer)),
// Write function
(val: string, offset: number, enc: BufferEncoding): Promise<void> =>
fs.write(fd, Buffer.alloc(val.length, val, enc), 0, val.length, offset)
.then(() => Promise.resolve())))
.then(resolve)
.catch(reject));
}
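
Editor's note: a usage sketch for removeGPS, assuming the export above is importable from ./nogps (the path and file name are hypothetical). Per the doc comment, the promise resolves to whether GPS data was actually removed.

import { removeGPS } from './nogps';

// Strip location EXIF from an uploaded photo in place.
removeGPS('/tmp/upload-1234.jpg')
	.then((removed) => console.log(removed ? 'GPS data removed' : 'No GPS data found'))
	.catch(console.error);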

View File

@ -1,16 +0,0 @@
import { TLog } from '@tycrek/log';
import fs from 'fs-extra';
import path from 'path';
const log = new TLog();
const uploadsPath = path.join(process.cwd(), 'uploads/');
const dataPath = path.join(process.cwd(), 'data.json');
if (fs.existsSync(uploadsPath)) {
fs.removeSync(uploadsPath);
log.success('Deleted', uploadsPath);
}
if (fs.existsSync(dataPath)) {
fs.removeSync(dataPath);
log.success('Deleted', dataPath);
}

View File

@ -1,80 +0,0 @@
import { FileData } from './types/definitions';
import { Config } from 'ass-json';
import fs from 'fs-extra';
import ffmpeg from 'ffmpeg-static';
import sharp from 'sharp';
// @ts-ignore
import shell from 'any-shell-escape';
import { exec } from 'child_process';
import { isProd, path } from './utils';
const { diskFilePath }: Config = fs.readJsonSync(path('config.json'));
// Thumbnail parameters
const THUMBNAIL = {
QUALITY: 75,
WIDTH: 200 * 2,
HEIGHT: 140 * 2,
}
/**
* Builds a safe escaped ffmpeg command
*/
function getCommand(src: String, dest: String) {
return shell([
ffmpeg, '-y',
'-v', (isProd ? 'error' : 'debug'), // Log level
'-i', src, // Input file
'-ss', '00:00:01.000', // Timestamp of frame to grab
'-vf', `scale=${THUMBNAIL.WIDTH}:${THUMBNAIL.HEIGHT}:force_original_aspect_ratio=increase,crop=${THUMBNAIL.WIDTH}:${THUMBNAIL.HEIGHT}`, // Dimensions of output file
'-frames:v', '1', // Number of frames to grab
dest // Output file
]);
}
/**
* Builds a thumbnail filename
*/
function getNewName(oldName: String) {
return oldName.concat('.thumbnail.jpg');
}
/**
* Builds a path to the thumbnails
*/
function getNewNamePath(oldName: String) {
return path(diskFilePath, 'thumbnails/', getNewName(oldName));
}
/**
* Extracts an image from a video file to use as a thumbnail, using ffmpeg
*/
function getVideoThumbnail(file: FileData) {
return new Promise((resolve: Function, reject: Function) => exec(
getCommand(file.path, getNewNamePath(file.randomId)),
// @ts-ignore
(err: Error) => (err ? reject(err) : resolve())
));
}
/**
* Generates a thumbnail for the provided image
*/
function getImageThumbnail(file: FileData) {
return new Promise((resolve, reject) =>
sharp(file.path)
.resize(THUMBNAIL.WIDTH, THUMBNAIL.HEIGHT, { kernel: 'cubic' })
.jpeg({ quality: THUMBNAIL.QUALITY })
.toFile(getNewNamePath(file.randomId))
.then(resolve)
.catch(reject));
}
/**
* Generates a thumbnail
*/
export default (file: FileData): Promise<string> =>
new Promise((resolve, reject) =>
(file.is.video ? getVideoThumbnail : (file.is.image && !file.mimetype.includes('webp')) ? getImageThumbnail : () => Promise.resolve())(file)
.then(() => resolve((file.is.video || file.is.image) ? getNewName(file.randomId) : file.is.audio ? 'views/ass-audio-icon.png' : 'views/ass-file-icon.png'))
.catch(reject));
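
Editor's note: a sketch of how the default export above is typically called, with a hand-built FileData-like object. Only fields read by the functions above are filled in; the path, ID, and module location are hypothetical, and ffmpeg-static plus the configured thumbnails directory are assumed to be available.

import generateThumbnail from './thumbnails';

// Minimal FileData-like input for a video upload.
const file: any = {
	path: '/opt/ass/uploads/video.mp4',
	randomId: 'abc123',
	mimetype: 'video/mp4',
	is: { video: true, image: false, audio: false },
};

generateThumbnail(file)
	.then((name) => console.log('Thumbnail written as', name)) // 'abc123.thumbnail.jpg'
	.catch(console.error);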

View File

@ -1,29 +0,0 @@
import path from 'path';
import fs from 'fs-extra';
import axios from 'axios';
import logger from '../logger';
import { User } from '../types/auth';
// Port from config.json
const { port } = fs.readJsonSync(path.join(process.cwd(), 'config.json'));
// CLI key from auth.json
const { cliKey } = fs.readJsonSync(path.join(process.cwd(), 'auth.json'));
if (process.argv.length < 4) {
logger.error('Missing username or password');
logger.error('Usage: node script.adduser.js <username> <password> [admin] [meta]');
process.exit(1);
} else {
const username = process.argv[2];
const password = process.argv[3];
const admin = process.argv[4] ? process.argv[4].toLowerCase() === 'true' : false;
const meta = process.argv[5] ? JSON.parse(process.argv[5]) : {};
axios.post(`http://localhost:${port}/api/user`, { username, password, admin, meta }, { headers: { 'Authorization': cliKey } })
.then((response) => {
const user = response.data as User;
logger.info('User created', `${username} (${user.unid})`, `token: ${user.token}`).callback(() => process.exit(0))
})
.catch((err) => logger.error(err).callback(() => process.exit(1)));
}

View File

@ -1,19 +0,0 @@
import logger from '../logger';
import { onStart, users, setUserPassword } from '../auth';
if (process.argv.length < 4) {
logger.error('Missing username/unid or password');
process.exit(1);
} else {
const id = process.argv[2];
const password = process.argv[3];
onStart(process.argv[4] || 'auth.json')
.then(() => {
const user = users.find((user) => user.unid === id || user.username === id);
if (!user) throw new Error('User not found');
else return setUserPassword(user.unid, password);
})
.then(() => logger.info('Password changed successfully').callback(() => process.exit(0)))
.catch((err) => logger.error(err).callback(() => process.exit(1)));
}

View File

@ -1,20 +0,0 @@
import logger from '../logger';
import { onStart, users } from '../auth';
import { compare } from 'bcrypt';
if (process.argv.length < 4) {
logger.error('Missing username/unid or password');
process.exit(1);
} else {
const id = process.argv[2];
const password = process.argv[3];
onStart(process.argv[4] || 'auth.json')
.then(() => {
const user = users.find((user) => user.unid === id || user.username === id);
if (!user) throw new Error('User not found');
else return compare(password, user.passhash);
})
.then((result) => logger.info('Matches', `${result}`).callback(() => process.exit(0)))
.catch((err) => logger.error(err).callback(() => process.exit(1)));
}

View File

@ -1,5 +1,3 @@
import { Request, Response } from 'express';
declare global {
namespace Express {
interface Request {

View File

@ -1,5 +0,0 @@
declare module './setup' {
export function doSetup(): void;
}
declare module '@tycrek/papito';
declare module '@skynetlabs/skynet-nodejs';

View File

@ -1,30 +1,4 @@
import { Config } from 'ass-json';
import { FileData } from './types/definitions';
import fs from 'fs-extra';
import Path from 'path';
import fetch from 'node-fetch';
import sanitize from 'sanitize-filename';
import { DateTime } from 'luxon';
import token from './generators/token';
import zwsGen from './generators/zws';
import randomGen from './generators/random';
import gfyGen from './generators/gfycat';
import tsGen from './generators/timestamp';
import logger from './logger';
import { Request } from 'express';
import { isProd as ip } from '@tycrek/joint';
const { HTTP, HTTPS, KILOBYTES } = require('../MagicNumbers.json');
// Catch config.json not existing when running setup script
try {
// todo: fix this
const configPath = Path.join(process.cwd(), 'config.json');
if (!fs.existsSync(configPath)) throw new Error('Config file not found');
var { useSsl, port, domain, isProxied, diskFilePath, s3bucket, s3endpoint, s3usePathStyle }: Config = fs.readJsonSync(configPath);
} catch (ex) {
// @ts-ignore
if (ex.code !== 'MODULE_NOT_FOUND' || !ex.toString().includes('Unexpected end')) console.error(ex);
}
const { HTTP, HTTPS } = require('../MagicNumbers.json');
export function getTrueHttp() {
return ('http').concat(useSsl ? 's' : '').concat('://');
@ -42,88 +16,13 @@ export function getDirectUrl(resourceId: string) {
return `${getTrueHttp()}${getTrueDomain()}/${resourceId}/direct`;
}
export function randomHexColour() { // From: https://www.geeksforgeeks.org/javascript-generate-random-hex-codes-color/
const letters = '0123456789ABCDEF';
let colour = '#';
for (let i = 0; i < 6; i++) // skipcq: JS-0074
colour += letters[(Math.floor(Math.random() * letters.length))];
return colour;
}
export function getResourceColor(colorValue: string, vibrantValue: string) {
return (!colorValue || colorValue === '&vibrant') ? vibrantValue : colorValue === '&random' ? randomHexColour() : colorValue;
}
export function formatTimestamp(timestamp: number, timeoffset: string) {
return DateTime.fromMillis(timestamp).setZone(timeoffset).toLocaleString(DateTime.DATETIME_MED);
}
export function formatBytes(bytes: number, decimals = 2) { // skipcq: JS-0074
if (bytes === 0) return '0 Bytes';
const sizes = ['Bytes', 'KB', 'MB', 'GB', 'TB', 'PB', 'EB', 'ZB', 'YB'];
const i = Math.floor(Math.log(bytes) / Math.log(KILOBYTES));
return parseFloat((bytes / Math.pow(KILOBYTES, i)).toFixed(decimals < 0 ? 0 : decimals)).toString().concat(` ${sizes[i]}`);
}
export function replaceholder(data: string, size: number, timestamp: number, timeoffset: string, originalname: string) {
return data
.replace(/&size/g, formatBytes(size))
.replace(/&filename/g, originalname)
.replace(/&timestamp/g, formatTimestamp(timestamp, timeoffset));
}
const idModes = {
zws: 'zws', // Zero-width spaces (see: https://zws.im/)
og: 'original', // Use original uploaded filename
r: 'random', // Use a randomly generated ID with a mixed-case alphanumeric character set
gfy: 'gfycat', // Gfycat-style ID's (https://gfycat.com/unsungdiscretegrub)
ts: 'timestamp', // Timestamp-based ID's
};
const GENERATORS = new Map();
GENERATORS.set(idModes.zws, zwsGen);
GENERATORS.set(idModes.r, randomGen);
GENERATORS.set(idModes.gfy, gfyGen);
GENERATORS.set(idModes.ts, tsGen);
export function generateId(mode: string, length: number, gfyLength: number, originalName: string) {
return (GENERATORS.has(mode) ? GENERATORS.get(mode)({ length, gfyLength }) : originalName);
}
// Set up pathing
export const path = (...paths: string[]) => Path.join(process.cwd(), ...paths);
export const isProd = ip();
module.exports = {
path,
getTrueHttp,
getTrueDomain,
getS3url,
getDirectUrl,
getResourceColor,
formatTimestamp,
formatBytes,
replaceholder,
randomHexColour,
sanitize,
renameFile: (req: Request, newName: string) => new Promise((resolve: Function, reject) => {
try {
const paths = [req.file.destination, newName];
fs.rename(path(req.file.path), path(...paths));
req.file.path = Path.join(...paths);
resolve();
} catch (err) {
reject(err);
}
}),
generateToken: () => token(),
generateId,
downloadTempS3: (file: FileData) => new Promise((resolve: Function, reject) =>
fetch(getS3url(file.randomId, file.ext))
.then((f2) => f2.body!.pipe(fs.createWriteStream(Path.join(__dirname, diskFilePath, sanitize(file.originalname))).on('close', () => resolve())))
.catch(reject)),
}
export const log = logger;
/**
* @type {TLog}
*/
module.exports.log = logger;
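
Editor's note: a short sketch exercising the pure helpers above (formatBytes and generateId), assuming the file is importable as ./utils, that KILOBYTES in MagicNumbers.json is 1024, and that the named exports compile as expected alongside the module.exports block.

import { formatBytes, generateId } from './utils';

console.log(formatBytes(0));      // '0 Bytes'
console.log(formatBytes(1536));   // '1.5 KB' (assuming KILOBYTES === 1024)

// 'random' is registered in GENERATORS, so an 8-character mixed-case ID is produced.
console.log(generateId('random', 8, 2, 'photo.png'));   // e.g. 'aZ3kQ9bX' (random output)

// Modes without a registered generator fall back to the original filename.
console.log(generateId('original', 8, 2, 'photo.png')); // 'photo.png'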

View File

@ -1,26 +0,0 @@
import { FileData } from './types/definitions';
import Vibrant from 'node-vibrant';
import sharp from 'sharp';
import { randomHexColour } from './utils';
// Vibrant parameters
const COLOR_COUNT = 256;
const QUALITY = 3;
/**
* Extracts a prominent colour from the provided image file
*/
function getVibrant(file: FileData, resolve: Function, reject: Function) {
sharp(file.path).png().toBuffer()
.then((data) => Vibrant.from(data)
.maxColorCount(COLOR_COUNT)
.quality(QUALITY)
.getPalette())
.then((palettes) => resolve(palettes[Object.keys(palettes).sort((a, b) => palettes[b]!.population - palettes[a]!.population)[0]]!.hex))
.catch((err) => reject(err));
}
/**
* Extracts a colour from an image file. Returns a random Hex value if provided file is a video
*/
export default (file: FileData): Promise<string> => new Promise((resolve, reject) => (!file.is.image || file.mimetype.includes('webp')) ? resolve(randomHexColour()) : getVibrant(file, resolve, reject)); // skipcq: JS-0229
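
Editor's note: a usage sketch for the colour extractor above (module path ./vibrant is hypothetical). Videos and webp images skip Vibrant entirely and fall back to randomHexColour.

import getVibrantColour from './vibrant';

getVibrantColour({
	path: '/opt/ass/uploads/photo.png',
	mimetype: 'image/png',
	is: { image: true, video: false, audio: false },
} as any)
	.then((hex) => console.log('Dominant colour:', hex)) // e.g. '#a1b2c3'
	.catch(console.error);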

View File

@ -5,29 +5,29 @@
@layer base {}
@layer components {
.res-media {
@apply border-l-4 rounded max-h-half-port;
.setup-text-section-header {
@apply text-2xl font-bold font-mono;
}
.link {
@apply no-underline hover_no-underline active_no-underline visited_no-underline
.setup-text-item-title {
@apply text-stone-300;
}
/* regular, visited */
text-link-primary visited_text-link-primary
border-b-2 visited_border-b-2
border-transparent visited_border-transparent
rounded-sm visited_rounded-sm
.setup-text-optional {
@apply text-stone-400 italic;
}
/* hover */
hover_text-link-hover
hover_border-hover
.setup-panel {
@apply flex flex-col pt-4 w-full max-w-xs;
}
/* active */
active_text-link-active
/* transitions */
ease-linear duration-150 transition-all;
.setup-panel>sl-input {
@apply mb-4;
}
}
@layer utilities {}
@layer utilities {
.flex-center {
@apply items-center justify-center;
}
}

View File

@ -1,20 +0,0 @@
{
"extends": "@tsconfig/node16/tsconfig.json",
"compilerOptions": {
"outDir": "./dist",
"target": "ES2022",
"lib": [
"ES2022",
"DOM"
],
"allowJs": true,
"downlevelIteration": true
},
"include": [
"src/**/*.js",
"src/**/*.ts"
],
"exclude": [
"ass-x"
]
}

View File

Binary image file (6.2 KiB before, 6.2 KiB after)

View File

Binary image file (11 KiB before, 11 KiB after)

24
views/_base_.pug Normal file
View File

@ -0,0 +1,24 @@
doctype html
html.dark.sl-theme-dark(lang='en')
head
meta(charset='UTF-8')
meta(name='viewport', content='width=device-width, initial-scale=1.0')
block title
title ass 🍑
meta(name='theme-color' content='black')
link(rel='stylesheet', href='/.css')
//- Shoelace/Font Awesome mixins
include ../node_modules/shoelace-fontawesome-pug/sl-fa-mixin.pug
include ../node_modules/shoelace-pug-loader/loader.pug
+slTheme('dark')
+slAuto
body.w-screen.h-screen.flex.flex-col
//- Header
.w-full.border-b.border-stone-500.flex.justify-center.items-center.py-3
h1.text-4xl.font-bold.font-mono: block section
span [section]
//- Centering width-fixer
.w-full.flex.justify-center.h-full
.w-full.md_max-w-xl.px-4.pt-16.h-full: block content

9
views/admin.pug Normal file
View File

@ -0,0 +1,9 @@
extends _base_
block title
title ass admin 🍑
block section
span admin
block content
h1.text-3xl Coming soon.
script(src='/admin/ui.js')

5
views/index.pug Normal file
View File

@ -0,0 +1,5 @@
extends _base_
block section
span ass
block content
h1.text-3xl Welcome to ass #{version}, a ShareX server.

11
views/login.pug Normal file
View File

@ -0,0 +1,11 @@
extends _base_
block section
span login
block content
.flex.flex-col.flex-center.h-full: .setup-panel
h3 Username
sl-input#login-username(type='text' placeholder='username' clearable): sl-icon(slot='prefix' name='fas-user' library='fa')
h3 Password
sl-input#login-password(type='password' placeholder='password' clearable): sl-icon(slot='prefix' name='fas-lock' library='fa')
sl-button.mt-4#login-submit(type='primary' submit) Login
script(src='/login/ui.js')

101
views/setup.pug Normal file
View File

@ -0,0 +1,101 @@
extends _base_
block title
title ass setup 🍑
block section
span ass setup
block content
//- Setup panel
.flex.flex-col.items-center
p.text-lg.mb-4 Welcome to ass, your new personal file upload server!
//- * Base config
h2.setup-text-section-header.mt-12 Upload configuration
.setup-panel
h3.setup-text-item-title Uploads directory
sl-input#uploads-dir(type='text' placeholder='/opt/ass/uploads' clearable): sl-icon(slot='prefix' name='fas-folders' library='fa')
h3.setup-text-item-title ID type
sl-input#uploads-idtype(type='text' placeholder='random'): sl-icon(slot='prefix' name='fas-input-text' library='fa')
h3.setup-text-item-title ID size
sl-input#uploads-idsize(type='number' placeholder='8'): sl-icon(slot='prefix' name='fas-hashtag' library='fa')
h3.setup-text-item-title Gfycat size
sl-input#uploads-gfysize(type='number' placeholder='3'): sl-icon(slot='prefix' name='fas-cat' library='fa')
h3.setup-text-item-title Maximum file size (MB)
sl-input#uploads-filesize(type='number' placeholder='50'): sl-icon(slot='prefix' name='fas-file' library='fa')
//- * Admin User
h2.setup-text-section-header.mt-4 Admin User
.setup-panel
h3.setup-text-item-title Username
sl-input#user-username(type='text' placeholder='admin' clearable): sl-icon(slot='prefix' name='fas-user' library='fa')
h3.setup-text-item-title Password
sl-input#user-password(type='password' placeholder='the-most-secure' clearable): sl-icon(slot='prefix' name='fas-lock' library='fa')
//- * Database
h2.setup-text-section-header.mt-4 Database
.setup-panel
sl-tab-group
//- * JSON
sl-tab#json-tab(slot='nav' panel='json') JSON
sl-tab-panel(name='json')
| you all good!
//- * MySQL
sl-tab#mysql-tab(slot='nav' panel='mysql') MySQL
sl-tab-panel(name='mysql')
h3.setup-text-item-title Host
sl-input#mysql-host(type='text' placeholder='mysql.example.com' clearable): sl-icon(slot='prefix' name='fas-server' library='fa')
h3.setup-text-item-title Port
sl-input#mysql-port(type='number' placeholder='3306' min='1' max='65535' no-spin-buttons clearable): sl-icon(slot='prefix' name='fas-hashtag' library='fa')
h3.setup-text-item-title User
sl-input#mysql-user(type='text' placeholder='myassql' clearable): sl-icon(slot='prefix' name='fas-user' library='fa')
h3.setup-text-item-title Password
sl-input#mysql-password(type='password' placeholder='super-secure' clearable): sl-icon(slot='prefix' name='fas-lock' library='fa')
h3.setup-text-item-title Database
sl-input#mysql-database(type='text' placeholder='assdb' clearable): sl-icon(slot='prefix' name='fas-database' library='fa')
//- * PostgreSQL
sl-tab#pgsql-tab(slot='nav' panel='pgsql') PostgreSQL
sl-tab-panel(name='pgsql')
h3.setup-text-item-title Host
sl-input#pgsql-host(type='text' placeholder='postgres.example.com' clearable): sl-icon(slot='prefix' name='fas-server' library='fa')
h3.setup-text-item-title Port
sl-input#pgsql-port(type='number' placeholder='5432' min='1' max='65535' no-spin-buttons clearable): sl-icon(slot='prefix' name='fas-hashtag' library='fa')
h3.setup-text-item-title User
sl-input#pgsql-user(type='text' placeholder='posgrassql' clearable): sl-icon(slot='prefix' name='fas-user' library='fa')
h3.setup-text-item-title Password
sl-input#pgsql-password(type='password' placeholder='super-secure' clearable): sl-icon(slot='prefix' name='fas-lock' library='fa')
h3.setup-text-item-title Database
sl-input#pgsql-database(type='text' placeholder='assdb' clearable): sl-icon(slot='prefix' name='fas-database' library='fa')
//- * S3
h2.setup-text-section-header.mt-4 S3 #[span.setup-text-optional optional]
.setup-panel
h3.setup-text-item-title Endpoint
sl-input#s3-endpoint(type='text' placeholder='https://s3.example.com' clearable): sl-icon(slot='prefix' name='fas-server' library='fa')
h3.setup-text-item-title Bucket
sl-input#s3-bucket(type='text' placeholder='ass-bucket' clearable): sl-icon(slot='prefix' name='fas-bucket' library='fa')
h3.setup-text-item-title Access key
sl-input#s3-accessKey(type='text' placeholder='ABCD1234' clearable): sl-icon(slot='prefix' name='fas-key-skeleton' library='fa')
h3.setup-text-item-title Secret key
sl-input#s3-secretKey(type='password' placeholder='EF56GH78IJ90KL12' clearable): sl-icon(slot='prefix' name='fas-user-secret' library='fa')
h3.setup-text-item-title Region #[span.setup-text-optional optional]
sl-input#s3-region(type='text' placeholder='us-east' clearable): sl-icon(slot='prefix' name='fas-map-location-dot' library='fa')
//- * Rate Limits
h2.setup-text-section-header.mt-4 Rate Limits #[span.setup-text-optional optional]
.setup-panel
h3.setup-text-item-title Generic API - Requests
sl-input#ratelimit-api-requests(type='text' placeholder='120' clearable): sl-icon(slot='prefix' name='fas-hashtag' library='fa')
h3.setup-text-item-title Generic API - Seconds per reset
sl-input#ratelimit-api-time(type='text' placeholder='60' clearable): sl-icon(slot='prefix' name='fas-clock' library='fa')
h3.setup-text-item-title Login - Requests
sl-input#ratelimit-login-requests(type='text' placeholder='5' clearable): sl-icon(slot='prefix' name='fas-hashtag' library='fa')
h3.setup-text-item-title Login - Seconds per reset
sl-input#ratelimit-login-time(type='text' placeholder='30' clearable): sl-icon(slot='prefix' name='fas-clock' library='fa')
h3.setup-text-item-title File upload - Requests
sl-input#ratelimit-upload-requests(type='text' placeholder='120' clearable): sl-icon(slot='prefix' name='fas-hashtag' library='fa')
h3.setup-text-item-title File upload - Seconds per reset
sl-input#ratelimit-upload-time(type='text' placeholder='60' clearable): sl-icon(slot='prefix' name='fas-clock' library='fa')
sl-button.w-32.mt-2.self-center#submit(type='primary' submit) Submit
script(src='/setup/ui.js')

9
views/user.pug Normal file
View File

@ -0,0 +1,9 @@
extends _base_
block title
title ass user 🍑
block section
span user
block content
h1.text-3xl Coming soon.
script(src='/user/ui.js')