pull/220/merge
Sylvie 4 months ago committed by GitHub
commit 85c9d65ac5

@@ -0,0 +1,5 @@
# production
.ass-data/
# development
node_modules/

.github/README.md

@@ -9,19 +9,7 @@
</div>
**ass** is a self-hosted ShareX upload server written in Node.js. I initially started this project purely out of spite. ass aims to be as **unopinionated** as possible, giving users & hosts alike the ability to modify nearly everything.
By default, ass comes with a resource viewing page, which includes metadata about the resource as well as a download button & inline viewers for images, videos, & audio. It does **not** have a user dashboard or registration system: **this is intentional!** Developers are free to [create their own frontends] using the languages & tools they are most comfortable with. Writing & using these frontends is fully documented below, in the wiki, & in the source code.
### Notice (Sep 2023)
The current release, version 0.14.x, is now in **maintenance mode**: I'll only be providing fixes for catastrophic issues.
However! I'm currently working on [a new version](https://github.com/tycrek/ass/pull/220), 0.15.0, which is a lot more stable and organized. I have no ETA but please know that I'm continuing to work on it when I can. Version 0.14.x is still functional, just a bit rough around the edges.
#### Developers 🧡
ass was designed with developers in mind. If you are a developer & want something changed to better suit you, let me know & we'll see what we can do!
**ass** is a self-hosted ShareX upload server written in TypeScript.
[GitHub package.json version]: https://img.shields.io/github/package-json/v/tycrek/ass?color=fd842d&style=for-the-badge
[GitHub license]: https://img.shields.io/github/license/tycrek/ass?color=FD7C21&style=for-the-badge
@@ -29,24 +17,11 @@ ass was designed with developers in mind. If you are a developer & want somethin
[GitHub Repo stars]: https://img.shields.io/github/stars/tycrek/ass?color=F26602&style=for-the-badge
[Discord badge]: https://img.shields.io/discord/848274994375294986?label=Discord&logo=Discord&logoColor=FFF&style=for-the-badge
[Discord invite]: https://discord.gg/wGZYt5fasY
[create their own frontends]: #custom-frontends
## Code quality
| [CodeQL] | [DeepSource] |
| :---------------------------------------: | :----------------------------------: |
| [![CodeQL badge]][CodeQL link] | [![DeepSource Active Issues]][DeepSource Repo] [![DeepSource Resolved Issues]][DeepSource Repo] |
[CodeQL]: https://codeql.github.com/docs/
[DeepSource]: https://deepsource.io/
[CodeQL badge]: https://github.com/tycrek/ass/actions/workflows/codeql-analysis.yml/badge.svg?branch=master
[CodeQL link]: https://github.com/tycrek/ass/actions/workflows/codeql-analysis.yml
[DeepSource Active Issues]: https://deepsource.io/gh/tycrek/ass.svg/?label=active+issues
[DeepSource Resolved Issues]: https://deepsource.io/gh/tycrek/ass.svg/?label=resolved+issues
[DeepSource Repo]: https://deepsource.io/gh/tycrek/ass/?ref=repository-badge
## Features
###### Out of date
#### For users
- Upload images, gifs, videos, audio, & files
@@ -55,44 +30,30 @@ ass was designed with developers in mind. If you are a developer & want somethin
- GPS data automatically removed
- Fully customizable Discord embeds
- Built-in web viewer with video & audio player
- Dashboard to manage your files
- Embed images, gifs, & videos directly in Discord
- Personal upload log using customizable Discord Webhooks
- macOS/Linux support with alternative clients such as [Flameshot] ([script for ass]) & [MagicCap]
- **Multiple URL styles**
- [ZWS]
- Mixed-case alphanumeric
- Gfycat
- Original
- Timestamp
- Original
- ZWS
#### For hosts & developers
- Usage metrics
- Thumbnail support
- Mimetype blocking
- Themeable viewer page
- Basic multi-user support
- Configurable global upload size limit (per-user coming soon)
- Custom pluggable frontends using [Git Submodules]
- Run locally or in a Docker container
- Multi-user support
- Run locally or via Docker
- API for developers to write custom interfaces
- **Multiple file storage methods**
- Local file system
- Amazon S3, including [DigitalOcean Spaces] (more coming soon)
- **Multiple data storage methods** using [data engines]
- **File**
- JSON (default, [papito])
- YAML (soon)
- **Database**
- PostgreSQL ([ass-psql])
- MongoDB ([ass-mongoose][GH AMongoose])
- MySQL (soon)
[Git Submodules]: https://git-scm.com/book/en/v2/Git-Tools-Submodules
[ZWS]: https://zws.im
[DigitalOcean Spaces]: https://www.digitalocean.com/products/spaces/
[data engines]: #data-engines
[papito]: https://github.com/tycrek/papito
[ass-psql]: https://github.com/tycrek/ass-psql
- S3
- **Multiple data storage methods**
- JSON
- MySQL
- PostgreSQL
[Flameshot]: https://flameshot.org/
[script for ass]: #flameshot-users-linux
[MagicCap]: https://magiccap.me/
@@ -101,18 +62,17 @@ ass was designed with developers in mind. If you are a developer & want somethin
| Type | What is it? |
| ---- | ----------- |
| **[Zero-width spaces][ZWS]** | When pasted elsewhere, the URL appears to be *just* your domain name. Some browsers or sites may not recognize these URLs (Discord sadly no longer supports these as of April 2023)<br>![ZWS sample] |
| **Mixed-case alphanumeric** | The "safe" mode. URLs are browser-safe, as the character set is just letters & numbers. |
| **Gfycat** | Gfycat-style IDs (for example: `https://example.com/unsung-discrete-grub`). Thanks to [Gfycat] for the wordlists |
| **Original** | The "basic" mode. URL matches the same filename as when the file was uploaded. This may be prone to conflicts with files of the same name. |
| **Timestamp** | The quick but dirty mode. URL is a timestamp of when the file was uploaded, in milliseconds. This is the most unique mode, but also potentially the longest (a Gfycat URL can easily be longer). **Keep in mind this is vulnerable to iteration attacks** |
| **Original** | The "basic" mode. URL matches the same filename as when the file was uploaded. This may be prone to conflicts with files of the same name. |
| **ZWS** | "Zero-width spaces": when pasted elsewhere, the URL appears to be *just* your domain name. Some browsers or sites may not recognize these URLs (Discord sadly no longer supports these as of April 2023) |
[ZWS sample]: https://user-images.githubusercontent.com/29926144/113785625-bf43a480-96f4-11eb-8dd7-7f164f33ada2.png
[Gfycat]: https://gfycat.com
## Installation
ass supports two installation methods: Docker (recommended) & local (manual).
ass supports two installation methods: Docker & local.
### Docker
@@ -120,61 +80,17 @@ ass supports two installation methods: Docker (recommended) & local (manual).
<summary><em>Expand for Docker/Docker Compose installation steps</em></summary>
<br>
[Docker Compose] is the recommended way to install ass. These steps assume you are already familiar with Docker; if not, you should probably use the local installation method. They also assume that you have a working Docker installation with Docker Compose v2 installed.
[Docker Compose] is the recommended way to install ass. These steps assume you already have Docker & Docker Compose v2 installed.
[Docker Compose]: https://docs.docker.com/compose/
#### Install using docker-compose
1. Clone the ass repo using `git clone https://github.com/tycrek/ass.git && cd ass/`
2. Run the command that corresponds to your OS:
- **Linux**: `./install/docker-linux.sh` (uses `#!/bin/bash`)
- **Windows**: `install\docker-windows.bat` (from Command Prompt)
- These scripts are identical, using the equivalent commands for each OS.
3. Work through the setup process when prompted.
The upload token will be printed at the end of the setup script prompts. This is the token that you'll need to use to upload resources to ass. It may go by too quickly to copy it, so just scroll back up in your terminal after setup or run `cat auth.json`.
You should now be able to access the ass server at `http://localhost:40115/` (ass-docker will bind to host `0.0.0.0` to allow external access). You can configure a reverse proxy (for example, [Caddy]; also check out [my tutorial]) to make it accessible from the internet with automatic SSL.
#### What is this script doing?
It creates directories & files required for Docker Compose to properly set up volumes. After that, it simply builds the image & container, then launches the setup process.
#### How do I run the npm scripts?
Since all 3 primary data files are bound to the container with Volumes, you can run the scripts in two ways: `docker compose exec` or `npm` on the host.
```bash
# Check the usage metrics
docker compose exec ass npm run metrics
# Run the setup script
docker compose exec ass npm run setup && docker compose restart
# Run npm on the host to run the setup script (also works for metrics)
# (You will have to meet the Node.js & npm requirements on your host for this to work properly)
npm run setup && docker compose restart
```
#### How do I update?
Easy! Just pull the changes & run this one-liner:
```bash
# Pull the latest version of ass & rebuild the image
git pull && docker compose build --no-cache && docker compose up -d
```
#### What else should I be aware of?
Deploying ass with Docker exposes **five** volumes. These volumes let you edit the config, view the auth or data files, or view the `uploads/` folder from your host.
- `uploads/`
- `share/`
- `config.json`
- `auth.json`
- `data.json`
0. This repo comes with a pre-made Compose file.
1. Clone the repo using `git clone https://github.com/tycrek/ass.git && cd ass/`
2. Run `docker compose up`
- You can append `-d` to run in the background.
3. When the logs indicate, visit your installation in your browser to begin the setup.
</details>
@@ -184,16 +100,17 @@ Deploying ass with Docker exposes **five** volumes. These volumes let you edit t
<summary><em>Expand for local installation steps</em></summary>
<br>
1. You should have **Node.js 16** & **npm 8 or later** installed.
1. You should have **Node.js 20** & **npm 10 or later** installed.
2. Clone this repo using `git clone https://github.com/tycrek/ass.git && cd ass/`
3. Run `npm i --save-dev` to install the required dependencies (`--save-dev` is **required** for compilation)
4. Run `npm run build` to compile the TypeScript files
5. Run `npm start` to start ass.
The first time you run ass, the setup process will automatically be called & you will be shown your first authorization token; save this as you will need it to configure ShareX.
3. Run `pnpm i` or `npm i`
4. Run `npm run build`
5. Run `npm start`
6. When the logs indicate, visit your installation in your browser to begin the setup.
</details>
# the readme from this point is out of date
## Using HTTPS
For HTTPS support, you must configure a reverse proxy. I recommend [Caddy] but any reverse proxy works fine (such as Apache or Nginx). A sample config for Caddy is provided below:
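A minimal Caddyfile might look like this (a sketch; the hostname is a placeholder & 40115 is ass's default port):
```
ass.example.com {
	reverse_proxy localhost:40115
}
```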
@@ -246,44 +163,6 @@ If you need to override a specific part of the config to be different from the g
[Luxon]: https://moment.github.io/luxon/#/zones?id=specifying-a-zone
### Fancy embeds
If you primarily share media on Discord, you can add these additional (optional) headers to build embeds:
| Header | Purpose |
| ------ | ------- |
| **`X-Ass-OG-Title`** | Large text shown above your media. Required for embeds to appear on desktop. |
| **`X-Ass-OG-Description`** | Small text shown below the title but above the media (does not show up on videos) |
| **`X-Ass-OG-Author`** | Small text shown above the title |
| **`X-Ass-OG-Author-Url`** | URL to open when the Author is clicked |
| **`X-Ass-OG-Provider`** | Smaller text shown above the author |
| **`X-Ass-OG-Provider-Url`** | URL to open when the Provider is clicked |
| **`X-Ass-OG-Color`** | Colour shown on the left side of the embed. Must be one of `&random`, `&vibrant`, or a hex colour value (for example: `#fe3c29`). Random is a randomly generated hex value & Vibrant is sourced from the image itself |
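For illustration, here's roughly what those headers look like in a raw upload request (a sketch; the host, token, & filename are placeholders, and ShareX users would instead set these in their custom uploader config):
```bash
curl https://your.ass.host/ \
  -H "Authorization: YOUR_UPLOAD_TOKEN" \
  -H "X-Ass-OG-Title: Holiday photos" \
  -H "X-Ass-OG-Color: &vibrant" \
  -F "file=@photo.jpg"
```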
#### Embed placeholders
You can insert certain metadata into your embeds with these placeholders:
| Placeholder | Result |
| ----------- | ------ |
| **`&size`** | The files size with proper notation rounded to two decimals (example: `7.06 KB`) |
| **`&filename`** | The original filename of the uploaded file |
| **`&timestamp`** | The timestamp of when the file was uploaded (example: `Oct 14, 1983, 1:30 PM`) |
#### Server-side embed configuration
You may also specify a default embed config on the server. Keep in mind that if users specify the `X-Ass-OG-Title` header, the server-side config will be ignored. To configure the server-side embed, create a new file in the `share/` directory named `embed.json`. Available options are:
- **`title`**
- `description`
- `author`
- `authorUrl`
- `provider`
- `providerUrl`
- `color`
Their values are equivalent to the headers listed above.
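For example, a minimal `share/embed.json` might look like this (values are illustrative, assuming placeholders work in the server-side config the same way they do in headers):
```json
{
	"title": "&filename",
	"description": "Uploaded &timestamp (&size)",
	"color": "&random"
}
```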
### Webhooks
You may use Discord webhooks as an easy way to keep track of your uploads. The first step is to [create a new Webhook]. You only need to follow the first section, **Making a Webhook**. Once you've done that, click **Copy Webhook URL**. Finally, add these headers to your custom uploader:
@@ -298,22 +177,6 @@ Webhooks will show the filename, mimetype, size, upload timestamp, thumbnail, & a
[create a new Webhook]: https://support.discord.com/hc/en-us/articles/228383668-Intro-to-Webhooks
## Customizing the viewer
If you want to customize the font or colours of the viewer page, create a file in the `share/` directory called `theme.json`. Available options are:
| Option | Purpose |
| ------ | ------- |
| **`font`** | The font family to use; defaults to `"Josefin Sans"`. Fonts with a space should be surrounded by double quotes. |
| **`bgPage`** | Background colour for the whole page |
| **`bgViewer`** | Background colour for the viewer element |
| **`txtPrimary`** | Primary text colour; this should be your main brand colour. |
| **`txtSecondary`** | Secondary text colour; this is used for the file details. |
| **`linkPrimary`** | Primary link colour |
| **`linkHover`** | Colour of the `hover` effect for links |
| **`linkActive`** | Colour of the `active` effect for links |
| **`borderHover`** | Colour of the `hover` effect for borders; this is used for the underlining links. |
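For example, a dark theme might look like this (all values illustrative):
```json
{
	"font": "\"Fira Code\"",
	"bgPage": "#212121",
	"bgViewer": "#151515",
	"txtPrimary": "#fd842d",
	"txtSecondary": "#cccccc",
	"linkPrimary": "#fd842d"
}
```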
## Custom index
By default, ass directs the index route `/` to this README. Follow these steps to use a custom index:
@@ -339,148 +202,12 @@ To use a custom 404 page, create a file in the `share/` directory called `404.ht
If there's interest, I may allow making this a function, similar to the custom index.
## File storage
ass supports three methods of file storage: local, S3, or [Skynet].
### Local
Local storage is the simplest option, but relies on you having a lot of disk space to store files, which can be costly.
### S3
Any existing object storage server that's compatible with [Amazon S3] can be used with ass. I personally host my files using DigitalOcean Spaces, which implements the S3 API.
S3 servers are generally very fast & have very good uptime, though this will depend on the hosting provider & plan you choose.
## New user system (v0.14.0)
The user system was overhauled in v0.14.0 to allow more features and flexibility. New fields on users include `admin`, `passhash`, `unid`, and `meta` (these will be documented more once the system is finalized).
New installs will automatically generate a default user. Check the `auth.json` file for the token. You will use this for API requests and to authenticate within ShareX.
ass will automatically convert your old `auth.json` to the new format. **Always backup your `auth.json` and `data.json` before updating**. By default, the original user (named `ass`) will be marked as an admin.
### Adding users
You may add users via the CLI or the API. I'll document the API further in the future.
#### CLI
```bash
npm run cli-adduser <username> <password> [admin] [meta]
```
| Argument | Purpose |
| -------- | ------- |
| **`username`** `string` | The username of the user. |
| **`password`** `string` | The password of the user. |
| **`admin?`** `boolean` | Whether the user is an admin. Defaults to `false`. |
| **`meta?`** `string` | Any additional metadata to store on the user, as a JSON string. |
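For example, creating an admin user with some metadata (credentials illustrative):
```bash
npm run cli-adduser alice hunter2 true '{"displayName":"Alice"}'
```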
**Things still not added:**
- Modifying/deleting users via the API
## Developer API
ass includes an API (v0.14.0) for frontend developers to easily integrate with. Right now the API is pretty limited but I will expand on it in the future, with frontend developer feedback.
Any endpoints requiring authorization will require an `Authorization` header with the value being the user's upload token. Admin users are a new feature introduced in v0.14.0. Admin users can access all endpoints, while non-admin users can only access those relevant to them.
Other things to note:
- **All endpoints are prefixed with `/api/`**.
- All endpoints will return a JSON object unless otherwise specified.
- Successful endpoints *should* return a `200` status code. Any errors will use the corresponding `4xx` or `5xx` status code (such as `401 Unauthorized`).
- ass's API will try to be as compliant with the HTTP spec as possible. For example, using `POST/PUT` for create/modify, and response codes such as `409 Conflict` for duplicate entries. This compliance may not be 100% perfect, but I will try my best.
### API endpoints
| Endpoint | Purpose | Admin? |
| -------- | ------- | ------ |
| **`GET /user/`** | Returns a list of all users | Yes |
| **`GET /user/:id`** | Returns the user with the given ID | Yes |
| **`GET /user/self`** | Returns the current user | No |
| **`GET /user/token/:token`** | Returns the user with the given token | No |
| **`POST /user/`** | Creates a new user. Request body must be a JSON object including `username` and `password`. You may optionally include `admin` (boolean) or `meta` (object). Returns 400 if fails. | Yes |
| **`POST /user/password/reset/:id`** | Force resets the user's **password**. Request body must be a JSON object including a `password`. | Yes |
| **`DELETE /user/:id`** | Deletes the user with the given ID, as well as all their uploads. | Yes |
| **`PUT /user/meta/:id`** | Updates the user's metadata. Request body must be a JSON object with keys `key` and `value`, with the key/value you want to set in the user's metadata. Optionally you may include `force: boolean` to override existing keys. | Yes |
| **`DELETE /user/meta/:id`** | Deletes a key/value from a user's metadata. Request body must be a JSON object with a `key` property specifying the key to delete. | Yes |
| **`PUT /user/username/:id`** | Updates the user's username. Request body must be a JSON object with a `username` property. | Yes |
| **`PUT /user/token/:id`** | Regenerates a user's upload token | Yes |
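As a sketch, creating a user with `curl` might look like this (host & token are placeholders):
```bash
curl -X POST https://your.ass.host/api/user/ \
  -H "Authorization: YOUR_ADMIN_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"username": "alice", "password": "hunter2", "admin": false}'
```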
## Custom frontends - OUTDATED
**Please be aware that this section is outdated (marked as of 2022-04-15). It will be updated when I overhaul the frontend system.**
**Update 2022-12-24: I plan to overhaul this early in 2023.**
ass is intended to provide a strong backend for developers to build their own frontends around. [Git Submodules] make it easy to create custom frontends. Submodules are their own projects, which means you are free to build the router however you wish, as long as it exports the required items. A custom frontend is really just an [Express.js router].
**For a detailed walkthrough on developing your first frontend, [consult the wiki][ctw1].**
[Git Submodules]: https://git-scm.com/book/en/v2/Git-Tools-Submodules
[Express.js router]: https://expressjs.com/en/guide/routing.html#express-router
[ctw1]: https://github.com/tycrek/ass/wiki/Writing-a-custom-frontend
## Data Engines
[Papito data engines] are responsible for managing your data. "Data" has two parts: an identifier & the actual data itself. With ass, the data is a JSON object representing the uploaded resource. The identifier is the unique ID in the URL returned to the user on upload. **Update August 2022:** I plan to overhaul Papito and how all this works *eventually*. If this comment is still here in a year, ~~kick~~ message me.
[Papito data engines]: https://github.com/tycrek/papito
**Supported data engines:**
| Name | Description | Links |
| :--: | ----------- | :---: |
| **JSON** | JSON-based data storage. On disk, data is stored in a JSON file. In memory, data is stored in a [Map]. This is the default engine. | [GitHub][GH ASE]<br>[npm][npm ASE] |
| **PostgreSQL** | Data storage using a [PostgreSQL] database. [node-postgres] is used for communicating with the database. | [GitHub][GH APSQL]<br>[npm][npm APSQL] |
| **Mongoose** | Data storage using a [MongoDB] database. [mongoose] is used for communicating with the database. Created by [@dylancl] | [GitHub][GH AMongoose]<br>[npm][npm AMongoose] |
[Map]: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Map
[GH ASE]: https://github.com/tycrek/papito/
[npm ASE]: https://www.npmjs.com/package/@tycrek/papito
[PostgreSQL]: https://www.postgresql.org/
[node-postgres]: https://node-postgres.com/
[GH APSQL]: https://github.com/tycrek/ass-psql/
[npm APSQL]: https://www.npmjs.com/package/@tycrek/ass-psql
[MongoDB]: https://www.mongodb.com/
[mongoose]: https://mongoosejs.com/
[GH AMongoose]: https://github.com/dylancl/ass-mongoose
[npm AMongoose]: https://www.npmjs.com/package/ass-mongoose
[@dylancl]: https://github.com/dylancl
A Papito data engine implements support for one type of database (or file, such as JSON or YAML). This lets ass server hosts pick their database of choice, because all they'll have to do is enter the connection/authentication details & ass will handle the rest, using the resource ID as the key.
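Conceptually, an engine implements key/value operations keyed by the resource ID. A hypothetical TypeScript shape (an illustration, not the actual papito API):
```ts
interface DataEngine {
	/** Store a resource's JSON data under its unique ID */
	put(key: string, data: Record<string, unknown>): Promise<void>;
	/** Retrieve a resource's data by its ID */
	get(key: string): Promise<Record<string, unknown> | undefined>;
	/** Remove a resource's data */
	del(key: string): Promise<void>;
}
```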
**~~For a detailed walkthrough on developing engines, [consult the wiki][ctw2].~~ Outdated!**
[`data.js`]: https://github.com/tycrek/ass/blob/master/data.js
[ctw2]: https://github.com/tycrek/ass/wiki/Writing-a-StorageEngine
## npm scripts
ass has a number of pre-made npm scripts for you to use. **All** of these scripts should be run using `npm run <script-name>` (except `start`).
| Script | Description |
| ------ | ----------- |
| **`start`** | Starts the ass server. This is the default script & is run with **`npm start`**. |
| `build` | Compiles the TypeScript files into JavaScript. |
| `dev` | Chains the `build` & `compile` scripts together. |
| `setup` | Starts the easy setup process. Should be run after any updates that introduce new config options. |
| `metrics` | Runs the metrics script. This is a simple script that outputs basic resource statistics. |
| `purge` | Purges all uploads & data associated with them. This does **not** delete any users, however. |
| `engine-check` | Ensures your environment meets the minimum Node & npm version requirements. |
[`FORCE_COLOR`]: https://nodejs.org/dist/latest-v16.x/docs/api/cli.html#cli_force_color_1_2_3
## Flameshot users (Linux)
Use [this script]. For the `KEY`, put your token. Thanks to [@ToxicAven] for creating this!
Use [`flameshot-v2.sh`] or [`sample_screenshotter.sh`].
[this script]: https://github.com/tycrek/ass/blob/master/flameshot_example.sh
[@ToxicAven]: https://github.com/ToxicAven
[`flameshot-v2.sh`]: https://github.com/tycrek/ass/blob/dev/0.15.0/flameshot-v2.sh
[`sample_screenshotter.sh`]: https://github.com/tycrek/ass/blob/dev/0.15.0/sample_screenshotter.sh
## Contributing
@@ -493,7 +220,6 @@ Please follow the [Contributing Guidelines] when submitting Issues or Pull Requests
- Thanks to [hlsl#1359] for the logo
- [Gfycat] for their wordlists
- [Aven], for helping kickstart the project
- My spiteful ass for motivating me to actually take this project to new heights
[hlsl#1359]: https://behance.net/zevwolf
[Aven]: https://github.com/ToxicAven

@@ -0,0 +1,66 @@
name: "Docker Build"
on:
push:
branches: [ master, dev/0.15.0 ]
jobs:
build_and_push:
name: Build & Publish Docker Images
if: (github.ref == 'refs/heads/master' || github.ref == 'refs/heads/dev/0.15.0') && contains(github.event.head_commit.message, '[docker build]')
runs-on: ubuntu-latest
steps:
- name: Wait for build to succeed
uses: lewagon/wait-on-check-action@master
with:
ref: ${{ github.ref }}
check-name: build
repo-token: ${{ secrets.GH_TOKEN }}
allowed-conclusions: success
- name: Checkout
uses: actions/checkout@v4
- name: Set up QEMU
uses: docker/setup-qemu-action@v3
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
- name: Log in to Docker Hub
uses: docker/login-action@v3
with:
username: ${{ vars.DOCKER_USERNAME }}
password: ${{ secrets.DOCKER_TOKEN }}
- name: Build and push
uses: docker/build-push-action@v5
with:
context: .
file: ./Dockerfile
platforms: linux/amd64,linux/arm64
push: true
build-args: |
COMMIT_TAG=${{ github.sha }}
tags: |
tycrek/ass:latest
tycrek/ass:${{ github.sha }}
discord:
name: Send Discord Notification
needs: build_and_push
if: always() && github.event_name != 'pull_request' && contains(github.event.head_commit.message, '[docker build]')
runs-on: ubuntu-latest
steps:
- name: Get Build Job Status
uses: technote-space/workflow-conclusion-action@v3
- name: Combine Job Status
id: status
run: |
failures=(neutral skipped timed_out action_required)
if [[ ${failures[@]} =~ $WORKFLOW_CONCLUSION ]]; then
echo "status=failure" >> $GITHUB_OUTPUT
else
echo "status=$WORKFLOW_CONCLUSION" >> $GITHUB_OUTPUT
fi
- name: Post Status to Discord
uses: sarisia/actions-status-discord@v1
with:
webhook: ${{ secrets.DISCORD_WEBHOOK }}
status: ${{ steps.status.outputs.status }}
title: ${{ github.workflow }}
nofail: true

@@ -1,11 +1,16 @@
name: TypeScript Build
concurrency:
group: ${{ github.workflow }}-${{ github.head_ref || github.run_id }}
cancel-in-progress: true
on:
push:
pull_request:
workflow_dispatch:
jobs:
build:
if: "!contains(github.event.head_commit.message, '[skip ci:ts]')"
runs-on: ubuntu-latest
env:
ARCHIVE_NAME: ass-build-${{ github.run_id }}-${{ github.run_number }}
@@ -13,19 +18,19 @@ jobs:
# Checkout repo
- uses: actions/checkout@v4
# Set up Node 16
# Set up Node 20
- name: Setup Node.js environment
uses: actions/setup-node@v3
with:
node-version: 16.14.0
node-version: 20
# Install npm 8 & TypeScript
# Install npm 10, TypeScript, & pnpm
- name: Install global packages
run: npm i -g npm@8 typescript
run: npm i -g npm@10 typescript pnpm
# Install ass dependencies (including types)
- name: Install dependencies
run: npm i --save-dev
run: pnpm i
# Compile the TypeScript files
- name: Run build script

126
.gitignore vendored

@@ -1,123 +1,15 @@
# Logs
logs
*.log
npm-debug.log*
yarn-debug.log*
yarn-error.log*
lerna-debug.log*
# Diagnostic reports (https://nodejs.org/api/report.html)
report.[0-9]*.[0-9]*.[0-9]*.[0-9]*.json
# Runtime data
pids
*.pid
*.seed
*.pid.lock
# Directory for instrumented libs generated by jscoverage/JSCover
lib-cov
# Coverage directory used by tools like istanbul
coverage
*.lcov
# nyc test coverage
.nyc_output
# Grunt intermediate storage (https://gruntjs.com/creating-plugins#storing-task-files)
.grunt
# Bower dependency directory (https://bower.io/)
bower_components
# node-waf configuration
.lock-wscript
# Compiled binary addons (https://nodejs.org/api/addons.html)
build/Release
# Dependency directories
node_modules/
jspm_packages/
# TypeScript v1 declaration files
typings/
# TypeScript cache
*.tsbuildinfo
# Optional npm cache directory
.npm
# Optional eslint cache
.eslintcache
# Microbundle cache
.rpt2_cache/
.rts2_cache_cjs/
.rts2_cache_es/
.rts2_cache_umd/
# Optional REPL history
.node_repl_history
# Output of 'npm pack'
*.tgz
# Yarn Integrity file
.yarn-integrity
# dotenv environment variables file
.env
.env.test
# parcel-bundler cache (https://parceljs.org/)
.cache
# Next.js build output
.next
# Nuxt.js build / generate output
.nuxt
dist
# Gatsby files
.cache/
# Comment in the public line in if your project uses Gatsby and *not* Next.js
# https://nextjs.org/blog/next-9-1#public-directory-support
# public
# vuepress build output
.vuepress/dist
# Serverless directories
.serverless/
# FuseBox cache
.fusebox/
# DynamoDB Local files
.dynamodb/
# TernJS port file
.tern-port
# tokens
auth.json*
auth.*.json
# data
data.json*
# uploads
uploads/
# build dirs
dist*/
# config
config.json
# ass data
.ass-data/
# certificates
*.crt
# VitePress documentation
docs/.vitepress/dist/
docs/.vitepress/cache/
# share/ directory
share/
# Wrangler local cache (docs dev server)
.wrangler/

@@ -1,27 +1,12 @@
# ass Dockerfile v0.3.1
# ass Dockerfile v0.3.3
# authors:
# - tycrek <t@tycrek.com> (https://tycrek.com/)
# - Zusier <zusier@pm.me> (https://github.com/Zusier)
# Node 16 image
FROM node:16.20.2
# Set working directory
WORKDIR /opt/ass/
# Copy directory files (config.json, source files etc.)
FROM node:20.9.0-alpine
WORKDIR /opt/ass-src/
COPY . ./
# Ensure these directories & files exist for compose volumes
RUN mkdir -p /opt/ass/uploads/thumbnails/ && \
mkdir -p /opt/ass/share/ && \
touch /opt/ass/config.json && \
touch /opt/ass/auth.json && \
touch /opt/ass/data.json
# Install dependencies as rootless user
RUN npm i --save-dev && \
npm run build
# Start ass
CMD npm start
RUN npm i -g pnpm
RUN pnpm i
RUN npm run build
CMD npm start

@@ -1,6 +1,6 @@
ISC License
Copyright (c) 2021, tycrek
Copyright (c) 2021-2023, tycrek
Permission to use, copy, modify, and/or distribute this software for any
purpose with or without fee is hereby granted, provided that the above

@@ -1,14 +0,0 @@
{
"HTTP": 80,
"HTTPS": 443,
"CODE_OK": 200,
"CODE_NO_CONTENT": 204,
"CODE_BAD_REQUEST": 400,
"CODE_UNAUTHORIZED": 401,
"CODE_NOT_FOUND": 404,
"CODE_CONFLICT": 409,
"CODE_PAYLOAD_TOO_LARGE": 413,
"CODE_UNSUPPORTED_MEDIA_TYPE": 415,
"CODE_INTERNAL_SERVER_ERROR": 500,
"KILOBYTES": 1024
}

@@ -0,0 +1,184 @@
import { UserConfiguration, UserConfigTypeChecker, PostgresConfiguration } from 'ass';
import fs from 'fs-extra';
import { path } from '@tycrek/joint';
import { log } from './log.js';
import { validate } from 'william.js';
const FILEPATH = path.join('.ass-data/userconfig.json');
/**
* Returns a boolean if the provided value is a number
*/
const numChecker = (val: any) => {
try { return !isNaN(parseInt(val)) && typeof val !== 'string'; }
catch (err) { return false; }
}
/**
* Returns a boolean if the provided value is a non-empty string
*/
const basicStringChecker = (val: any) => typeof val === 'string' && val.length > 0;
/**
* User-config property type checker functions
*/
const Checkers: UserConfigTypeChecker = {
uploadsDir: (val) => {
try {
fs.pathExistsSync(val)
? fs.accessSync(val)
: fs.mkdirSync(val, { recursive: true });
return true;
}
catch (err) {
log.warn('Cannot access directory', `${val}`);
console.error(err);
return false;
}
},
idType: (val) => ['random', 'original', 'gfycat', 'timestamp', 'zws'].includes(val),
idSize: numChecker,
gfySize: numChecker,
maximumFileSize: numChecker,
discordWebhook: (val) => validate.discord.webhook(val),
s3: {
endpoint: basicStringChecker,
bucket: basicStringChecker,
region: (val) => val == null || basicStringChecker(val),
credentials: {
accessKey: basicStringChecker,
secretKey: basicStringChecker
}
},
sql: {
mySql: {
host: basicStringChecker,
user: basicStringChecker,
password: basicStringChecker,
database: basicStringChecker,
port: (val) => numChecker(val) && val >= 1 && val <= 65535
},
postgres: {
port: (val) => numChecker(val) && val >= 1 && val <= 65535
}
},
rateLimit: {
endpoint: (val) => val == null || (val != null && (numChecker(val.requests) && numChecker(val.duration)))
}
};
export class UserConfig {
private static _config: UserConfiguration;
private static _ready = false;
public static get config() { return UserConfig._config; }
public static get ready() { return UserConfig._ready; }
constructor(config?: any) {
// Typically this would only happen during first-time setup (for now)
if (config != null) {
UserConfig._config = UserConfig.parseConfig(config);
UserConfig._ready = true;
}
}
/**
* Ensures that all config options are valid
*/
private static parseConfig(c: any) {
const config = (typeof c === 'string' ? JSON.parse(c) : c) as UserConfiguration;
// * Base config
if (!Checkers.uploadsDir(config.uploadsDir)) throw new Error(`Unable to access uploads directory: ${config.uploadsDir}`);
if (!Checkers.idType(config.idType)) throw new Error(`Invalid ID type: ${config.idType}`);
if (!Checkers.idSize(config.idSize)) throw new Error('Invalid ID size');
if (!Checkers.gfySize(config.gfySize)) throw new Error('Invalid Gfy size');
if (!Checkers.maximumFileSize(config.maximumFileSize)) throw new Error('Invalid maximum file size');
if (!Checkers.discordWebhook(config.discordWebhook)) throw new Error('Invalid Discord webhook');
// * Optional S3 config
if (config.s3 != null) {
if (!Checkers.s3.endpoint(config.s3.endpoint)) throw new Error('Invalid S3 Endpoint');
if (!Checkers.s3.bucket(config.s3.bucket)) throw new Error('Invalid S3 Bucket');
if (!Checkers.s3.region(config.s3.region)) throw new Error('Invalid S3 Region');
if (!Checkers.s3.credentials.accessKey(config.s3.credentials.accessKey)) throw new Error('Invalid S3 Access key');
if (!Checkers.s3.credentials.secretKey(config.s3.credentials.secretKey)) throw new Error('Invalid S3 Secret key');
}
// * Optional database config(s)
if (config.database != null) {
// these both have the same schema so we can just check both
if (config.database.kind == 'mysql' || config.database.kind == 'postgres') {
if (config.database.options != undefined) {
if (!Checkers.sql.mySql.host(config.database.options.host)) throw new Error('Invalid database host');
if (!Checkers.sql.mySql.user(config.database.options.user)) throw new Error('Invalid database user');
if (!Checkers.sql.mySql.password(config.database.options.password)) throw new Error('Invalid database password');
if (!Checkers.sql.mySql.database(config.database.options.database)) throw new Error('Invalid database');
if (!Checkers.sql.mySql.port(config.database.options.port)) throw new Error('Invalid database port');
if (config.database.kind == 'postgres') {
if (!Checkers.sql.postgres.port((config.database.options as PostgresConfiguration).port)) {
throw new Error('Invalid database port');
}
}
} else throw new Error('Database options missing');
}
}
// * optional rate limit config
if (config.rateLimit != null) {
if (!Checkers.rateLimit.endpoint(config.rateLimit.login)) throw new Error('Invalid Login rate limit configuration');
if (!Checkers.rateLimit.endpoint(config.rateLimit.upload)) throw new Error('Invalid Upload rate limit configuration');
if (!Checkers.rateLimit.endpoint(config.rateLimit.api)) throw new Error('Invalid API rate limit configuration');
}
// All is fine, carry on!
return config;
}
/**
* Save the config file to disk
*/
public static saveConfigFile(): Promise<void> {
return new Promise(async (resolve, reject) => {
try {
// Only save if the config has been parsed
if (!UserConfig._ready) throw new Error('Config not ready to be saved!');
// Write to file
await fs.writeFile(FILEPATH, JSON.stringify(UserConfig._config, null, '\t'));
resolve(void 0);
} catch (err) {
log.error('Failed to save config file!');
reject(err);
}
});
}
/**
* Reads the config file from disk
*/
public static readConfigFile(): Promise<void> {
return new Promise(async (resolve, reject) => {
try {
// Read the file data
const data = (await fs.readFile(FILEPATH)).toString();
// Ensure the config is valid
UserConfig._config = UserConfig.parseConfig(data);
UserConfig._ready = true;
resolve(void 0);
} catch (err) {
log.error('Failed to read config file!');
reject(err);
}
});
}
}

@@ -0,0 +1,211 @@
import { AssUser, ServerConfiguration } from 'ass';
import fs from 'fs-extra';
import tailwindcss from 'tailwindcss';
import session from 'express-session';
import MemoryStore from 'memorystore';
import express, { Request, Response, NextFunction, RequestHandler, json as BodyParserJson } from 'express';
import { path, isProd } from '@tycrek/joint';
import { epcss } from '@tycrek/express-postcss';
import { log } from './log.js';
import { get } from './data.js';
import { UserConfig } from './UserConfig.js';
import { DBManager } from './sql/database.js';
import { JSONDatabase } from './sql/json.js';
import { MySQLDatabase } from './sql/mysql.js';
import { PostgreSQLDatabase } from './sql/postgres.js';
import { buildFrontendRouter } from './routers/_frontend.js';
/**
* Top-level metadata exports
*/
export const App = {
pkgVersion: ''
};
/**
* Custom middleware to attach the ass object (and construct the `host` property)
*/
const assMetaMiddleware = (port: number, proxied: boolean): RequestHandler =>
(req: Request, _res: Response, next: NextFunction) => {
req.ass = {
host: `${req.protocol}://${req.hostname}${proxied ? '' : `:${port}`}`,
version: App.pkgVersion
};
// Set up Session if required
if (!req.session.ass)
(log.debug('Session missing'), req.session.ass = {});
next();
};
/**
* Custom middleware to verify user access
*/
const loginRedirectMiddleware = (requireAdmin = false): RequestHandler =>
async (req: Request, res: Response, next: NextFunction) => {
// If auth doesn't exist yet, make the user login
if (!req.session.ass?.auth) {
log.warn('User not logged in', req.baseUrl);
// Set pre-login path so user is directed to their requested page
req.session.ass!.preLoginPath = req.baseUrl;
// Redirect
res.redirect('/login');
} else {
const user = (await get('users', req.session.ass.auth.uid)) as AssUser;
// Check if user is admin
if ((requireAdmin || req.baseUrl === '/admin') && !user.admin) {
log.warn('Admin verification failed', user.username, user.id);
res.sendStatus(403);
} else next();
}
};
/**
* Main function.
* Yes I'm using main() in TS, cry about it
*/
async function main() {
// Launch log
const pkg = await fs.readJson(path.join('package.json')) as { name: string, version: string };
log.blank().info(pkg.name, pkg.version).blank();
App.pkgVersion = pkg.version;
// Ensure data directory exists
log.debug('Checking data dir');
await fs.ensureDir(path.join('.ass-data'));
// Set default server configuration
const serverConfig: ServerConfiguration = {
host: '0.0.0.0',
port: 40115,
proxied: isProd()
};
// Replace with user details, if necessary
try {
const exists = await fs.pathExists(path.join('.ass-data/server.json'));
if (exists) {
// Read file
const { host, port, proxied } = await fs.readJson(path.join('.ass-data/server.json')) as { host?: string, port?: number, proxied?: boolean };
// Set details, if available
if (host) serverConfig.host = host;
if (port) serverConfig.port = port;
if (proxied != undefined) serverConfig.proxied = proxied;
log.debug('server.json', `${host ? `host=${host},` : ''}${port ? `port=${port},` : ''}${proxied != undefined ? `proxied=${proxied},` : ''}`);
}
} catch (err) {
log.error('Failed to read server.json');
console.error(err);
throw err;
}
// Attempt to load user configuration
await new Promise((resolve) => UserConfig.readConfigFile().then(() => resolve(void 0))
.catch((err) => (err.code && err.code === 'ENOENT' ? {} : console.error(err), resolve(void 0))));
// If user config is ready, try to configure SQL
if (UserConfig.ready && UserConfig.config.database != null) {
try {
switch (UserConfig.config.database?.kind) {
case 'json':
await DBManager.use(new JSONDatabase());
break;
case 'mysql':
await DBManager.use(new MySQLDatabase());
break;
case 'postgres':
await DBManager.use(new PostgreSQLDatabase());
break;
}
} catch (err) { throw new Error(`Failed to configure SQL`); }
} else { // default to json database
log.debug('DB not set! Defaulting to JSON');
await DBManager.use(new JSONDatabase());
}
// Set up Express
const app = express();
// Configure sessions
const DAY = 86_400_000;
app.use(session({
name: 'ass',
resave: true,
saveUninitialized: false,
cookie: { maxAge: DAY, secure: isProd() },
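// Note: generating the secret at startup invalidates all sessions whenever the server restarts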
secret: (Math.random() * 100).toString(),
store: new (MemoryStore(session))({ checkPeriod: DAY }) as any,
}));
// Configure Express features
app.enable('case sensitive routing');
app.disable('x-powered-by');
// Set Express variables
app.set('trust proxy', serverConfig.proxied);
app.set('view engine', 'pug');
app.set('views', 'views/');
// Middleware
app.use(log.express());
app.use(BodyParserJson());
app.use(assMetaMiddleware(serverConfig.port, serverConfig.proxied));
// Favicon
app.use('/favicon.ico', (req, res) => res.redirect('https://i.tycrek.dev/ass'));
// CSS
app.use('/.css', epcss({
cssPath: path.join('tailwind.css'),
plugins: [
tailwindcss,
(await import('autoprefixer')).default(),
(await import('cssnano')).default(),
(await import('@tinycreek/postcss-font-magician')).default(),
],
warn: (warning: Error) => log.warn('PostCSS', warning.toString())
}));
// Metadata routes
app.get('/.ass.host', (req, res) => res.type('text').send(req.ass.host));
app.get('/.ass.version', (req, res) => res.type('text').send(req.ass.version));
// Basic page routers
app.use('/setup', buildFrontendRouter('setup', false));
app.use('/login', buildFrontendRouter('login'));
app.use('/admin', loginRedirectMiddleware(), buildFrontendRouter('admin'));
app.use('/user', loginRedirectMiddleware(), buildFrontendRouter('user'));
// Advanced routers
app.use('/api', (await import('./routers/api.js')).router);
app.use('/', (await import('./routers/index.js')).router);
// Host app
app.listen(serverConfig.port, serverConfig.host, () => log[UserConfig.ready ? 'success' : 'warn']('Server listening', UserConfig.ready ? 'Ready for uploads' : 'Setup required', `click http://127.0.0.1:${serverConfig.port}`));
}
// Start program
main().catch((err) => (console.error(err), process.exit(1)));
// Exit tasks
['SIGINT', 'SIGTERM'].forEach((signal) => process.addListener(signal as any, () => {
// Hide ^C in console output
process.stdout.write('\r');
// Log then exit
log.info('Exiting', `received ${signal}`);
process.exit();
}));

@@ -0,0 +1,54 @@
import { AssFile, AssUser, DatabaseValue, NID } from 'ass';
import { log } from './log.js';
import { UserConfig } from './UserConfig.js';
import { DBManager } from './sql/database.js';
/**
* Switcher type for exported functions
*/
type DataSector = 'files' | 'users';
/**
* database kind -> name mapping
*/
const DBNAMES = {
'mysql': 'MySQL',
'postgres': 'PostgreSQL',
'json': 'JSON'
};
export const put = (sector: DataSector, key: NID, data: AssFile | AssUser): Promise<void> => new Promise(async (resolve, reject) => {
try {
if (sector === 'files') {
// * 1: Save as files (image, video, etc)
await DBManager.put('assfiles', key, data as AssFile);
} else {
// * 2: Save as users
await DBManager.put('assusers', key, data as AssUser);
}
log.info(`PUT ${sector} data`, `using ${DBNAMES[UserConfig.config.database?.kind ?? 'json']}`, key);
resolve(void 0);
} catch (err) {
reject(err);
}
});
export const get = (sector: DataSector, key: NID): Promise<DatabaseValue> => new Promise(async (resolve, reject) => {
try {
const data = await DBManager.get(sector === 'files' ? 'assfiles' : 'assusers', key);
resolve(data);
} catch (err) {
reject(err);
}
});
export const getAll = (sector: DataSector): Promise<DatabaseValue[]> => new Promise(async (resolve, reject) => {
try {
const data = await DBManager.getAll(sector === 'files' ? 'assfiles' : 'assusers');
resolve(data);
} catch (err) {
reject(err);
}
});

@@ -0,0 +1,50 @@
import fs from 'fs-extra';
import cryptoRandomString from 'crypto-random-string';
import { randomBytes, getRandomValues } from 'crypto';
import { path } from '@tycrek/joint';
type Length = { length: number, gfyLength?: number };
// todo: load gfy length from config file
const MIN_LENGTH_GFY = 2;
/**
* Random generator
*/
export const random = ({ length }: Length) => cryptoRandomString({ length, type: 'alphanumeric' });
/**
* Timestamp generator
*/
export const timestamp = () => `${Date.now()}`;
/**
* Charset generator
*/
export const charset = ({ length, charset }: { length: number, charset: string[] }): string =>
[...randomBytes(length)].map((byte) => charset[Number(byte) % charset.length]).join('').slice(1).concat(charset[0]);
/**
* ZWS generator
*/
export const zws = ({ length }: Length) => charset({ length, charset: ['\u200B', '\u200C', '\u200D', '\u2060'] });
/**
* Gfycat generator
*/
export const gfycat = ({ gfyLength }: Length) => {
const count = gfyLength ?? MIN_LENGTH_GFY;
const getWord = (list: string[], delim = '') =>
list[Math.floor(Math.random() * list.length)].concat(delim);
const adjectives = fs.readFileSync(path.join('./common/gfycat/adjectives.txt')).toString().split('\n');
const animals = fs.readFileSync(path.join('./common/gfycat/animals.txt')).toString().split('\n');
let gfycat = '';
for (let i = 0; i < (count < MIN_LENGTH_GFY ? MIN_LENGTH_GFY : count); i++)
gfycat += getWord(adjectives, '-');
return gfycat.concat(getWord(animals));
};
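// Maps random bytes onto a 64-symbol alphabet (0-9, a-z, A-Z, "_", "-"), nanoid-style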
export const nanoid = (size = 21) => getRandomValues(new Uint8Array(size)).reduce(((t, e) => t += (e &= 63) < 36 ? e.toString(36) : e < 62 ? (e - 26).toString(36).toUpperCase() : e > 62 ? "-" : "_"), "");

@@ -0,0 +1,2 @@
import { TLog } from '@tycrek/log';
export const log = new TLog('debug');

@@ -0,0 +1,89 @@
import fs from 'fs-extra';
import sharp from 'sharp';
import Vibrant from 'node-vibrant';
import ffmpeg from 'ffmpeg-static';
import { exec } from 'child_process';
import { isProd } from '@tycrek/joint';
import { removeLocation } from '@xoi/gps-metadata-remover';
//@ts-ignore
import shell from 'any-shell-escape';
type SrcDest = { src: string, dest: string };
/**
* Strips GPS EXIF data from a file
*/
export const removeGPS = (file: string): Promise<boolean> => new Promise((resolve, reject) =>
fs.open(file, 'r+')
.then((fd) => removeLocation(file,
// Read function
(size: number, offset: number): Promise<Buffer> =>
fs.read(fd, Buffer.alloc(size), 0, size, offset)
.then(({ buffer }) => Promise.resolve(buffer)),
// Write function
(val: string, offset: number, enc: BufferEncoding): Promise<void> =>
fs.write(fd, Buffer.alloc(val.length, val, enc), 0, val.length, offset)
.then(() => Promise.resolve())))
.then(resolve)
.catch(reject));
const VIBRANT = { COLOURS: 256, QUALITY: 3 };
export const vibrant = (file: string, mimetype: string): Promise<string> => new Promise((resolve, reject) =>
// todo: random hex colour
mimetype.includes('video') || mimetype.includes('webp') ? resolve('#335599')
: sharp(file).png().toBuffer()
.then((data) => Vibrant.from(data)
.maxColorCount(VIBRANT.COLOURS)
.quality(VIBRANT.QUALITY)
.getPalette())
.then((palettes) => resolve(palettes[Object.keys(palettes).sort((a, b) => palettes[b]!.population - palettes[a]!.population)[0]]!.hex))
.catch((err) => reject(err)));
/**
* Thumbnail operations
*/
export class Thumbnail {
private static readonly THUMBNAIL = {
QUALITY: 75,
WIDTH: 200 * 2,
HEIGHT: 140 * 2,
}
private static getImageThumbnail({ src, dest }: SrcDest) {
return new Promise((resolve, reject) =>
sharp(src)
.resize(this.THUMBNAIL.WIDTH, this.THUMBNAIL.HEIGHT, { kernel: 'cubic' })
.jpeg({ quality: this.THUMBNAIL.QUALITY })
.toFile(dest)
.then(resolve)
.catch(reject));
}
private static getVideoThumbnail({ src, dest }: SrcDest) {
// Wrap exec in a Promise so callers can await the ffmpeg run
return new Promise<void>((resolve, reject) =>
exec(this.getCommand({ src, dest }), (err) => err ? reject(err) : resolve()));
}
private static getCommand({ src, dest }: SrcDest) {
return shell([
ffmpeg, '-y',
'-v', (isProd() ? 'error' : 'debug'), // Log level
'-i', src, // Input file
'-ss', '00:00:01.000', // Timestamp of frame to grab
'-vf', `scale=${this.THUMBNAIL.WIDTH}:${this.THUMBNAIL.HEIGHT}:force_original_aspect_ratio=increase,crop=${this.THUMBNAIL.WIDTH}:${this.THUMBNAIL.HEIGHT}`, // Dimensions of output file
'-frames:v', '1', // Number of frames to grab
dest // Output file
]);
}
// old default
/*
export default (file: FileData): Promise<string> =>
new Promise((resolve, reject) =>
(file.is.video ? getVideoThumbnail : (file.is.image && !file.mimetype.includes('webp')) ? getImageThumbnail : () => Promise.resolve())(file)
.then(() => resolve((file.is.video || file.is.image) ? getNewName(file.randomId) : file.is.audio ? 'views/ass-audio-icon.png' : 'views/ass-file-icon.png'))
.catch(reject));
*/
}

@@ -0,0 +1,46 @@
import { EndpointRateLimitConfiguration } from 'ass';
import { NextFunction, Request, Response } from 'express';
import { rateLimit } from 'express-rate-limit';
/**
* map that contains rate limiter middleware for each group
*/
const rateLimiterGroups = new Map<string, (req: Request, res: Response, next: NextFunction) => void>();
export const setRateLimiter = (group: string, config: EndpointRateLimitConfiguration | undefined): (req: Request, res: Response, next: NextFunction) => void => {
if (config == null) { // config might be null if the user doesn't want a rate limit
rateLimiterGroups.set(group, (req, res, next) => {
next();
});
return rateLimiterGroups.get(group)!;
} else {
rateLimiterGroups.set(group, rateLimit({
limit: config.requests,
windowMs: config.duration * 1000,
skipFailedRequests: true,
legacyHeaders: false,
standardHeaders: 'draft-7',
keyGenerator: (req, res) => {
return req.ip || 'disconnected';
},
handler: (req, res) => {
res.status(429);
res.contentType('json');
res.send('{"success":false,"message":"Rate limit exceeded, try again later"}');
}
}));
return rateLimiterGroups.get(group)!;
}
}
/**
* creates middleware for rate limiting
*/
export const rateLimiterMiddleware = (group: string, config: EndpointRateLimitConfiguration | undefined): (req: Request, res: Response, next: NextFunction) => void => {
if (!rateLimiterGroups.has(group)) setRateLimiter(group, config);
return (req, res, next) => {
return rateLimiterGroups.get(group)!(req, res, next);
};
};

@@ -0,0 +1,31 @@
import { Router } from 'express';
import { path } from '@tycrek/joint';
import { App } from '../app.js';
import { UserConfig } from '../UserConfig.js';
/**
* Builds a basic router for loading a page with frontend JS
*/
export const buildFrontendRouter = (page: string, onConfigReady = true) => {
// Config readiness checker
const ready = () => (onConfigReady)
? UserConfig.ready
: !UserConfig.ready;
// Set up a router
const router = Router({ caseSensitive: true });
// Render the page
router.get('/', (_req, res) => ready()
? res.render(page, { version: App.pkgVersion })
: res.redirect('/'));
// Load frontend JS
router.get('/ui.js', (_req, res) => ready()
? res.type('text/javascript').sendFile(path.join(`dist/frontend/${page}.mjs`))
: res.sendStatus(403));
return router;
};

@@ -0,0 +1,142 @@
import { AssUser, AssUserNewReq } from 'ass';
import * as bcrypt from 'bcrypt'
import { Router, json as BodyParserJson, RequestHandler } from 'express';
import * as data from '../data.js';
import { log } from '../log.js';
import { nanoid } from '../generators.js';
import { UserConfig } from '../UserConfig.js';
import { rateLimiterMiddleware, setRateLimiter } from '../ratelimit.js';
import { DBManager } from '../sql/database.js';
import { JSONDatabase } from '../sql/json.js';
import { MySQLDatabase } from '../sql/mysql.js';
import { PostgreSQLDatabase } from '../sql/postgres.js';
const router = Router({ caseSensitive: true });
// Setup route
router.post('/setup', BodyParserJson(), async (req, res) => {
if (UserConfig.ready)
return res.status(409).json({ success: false, message: 'User config already exists' });
log.info('Setup', 'initiated');
try {
// Parse body
new UserConfig(req.body);
// Save config
await UserConfig.saveConfigFile();
// set up new databases
if (UserConfig.config.database) {
switch (UserConfig.config.database.kind) {
case 'json':
await DBManager.use(new JSONDatabase());
break;
case 'mysql':
await DBManager.use(new MySQLDatabase());
break;
case 'postgres':
await DBManager.use(new PostgreSQLDatabase());
break;
}
}
// set rate limits
if (UserConfig.config.rateLimit?.api) setRateLimiter('api', UserConfig.config.rateLimit.api);
if (UserConfig.config.rateLimit?.login) setRateLimiter('login', UserConfig.config.rateLimit.login);
if (UserConfig.config.rateLimit?.upload) setRateLimiter('upload', UserConfig.config.rateLimit.upload);
log.success('Setup', 'completed');
return res.json({ success: true });
} catch (err: any) {
log.error('Setup failed', err);
return res.status(400).json({ success: false, message: err.message });
}
});
// User login
router.post('/login', rateLimiterMiddleware('login', UserConfig.config?.rateLimit?.login), BodyParserJson(), (req, res) => {
const { username, password } = req.body;
data.getAll('users')
.then((users) => {
if (!users) throw new Error('Missing users data');
// find() gives a clear error for unknown usernames instead of a cryptic TypeError
const entry = Object.entries(users as AssUser[])
.find(([_uid, user]: [string, AssUser]) => user.username === username);
if (!entry) throw new Error('User not found');
return entry[1];
})
.then((user) => Promise.all([bcrypt.compare(password, user.password), user]))
.then(([success, user]) => {
success ? log.success('User logged in', user.username)
: log.warn('User failed to log in', user.username);
// Set up the session information
if (success) req.session.ass!.auth = {
uid: user.id,
token: ''
};
// Respond
res.json({ success, message: `User [${user.username}] ${success ? 'logged' : 'failed to log'} in`, meta: { redirectTo: req.session.ass?.preLoginPath ?? '/user' } });
// Delete the pre-login path after successful login
if (success) delete req.session.ass?.preLoginPath;
})
.catch((err) => log.error(err).callback(() => res.status(400).json({ success: false, message: err.message })));
});
// todo: authenticate API endpoints
router.post('/user', rateLimiterMiddleware('api', UserConfig.config?.rateLimit?.api), BodyParserJson(), async (req, res) => {
if (!UserConfig.ready)
return res.status(409).json({ success: false, message: 'User config not ready' });
const newUser = req.body as AssUserNewReq;
// Run input validation
let issue: false | string = false;
let user: AssUser;
try {
// Username check
if (!newUser.username) issue = 'Missing username';
newUser.username = newUser.username.replaceAll(/[^A-Za-z0-9_-]/g, ''); // strip disallowed characters
if (newUser.username === '') issue = 'Invalid username';
// Password check
if (!newUser.password) issue = 'Missing password';
if (newUser.password === '') issue = 'Invalid password';
newUser.password = newUser.password.substring(0, 128);
// todo: figure out how to check admin:boolean and meta:{}
// Create new AssUser object
user = {
id: nanoid(32),
username: newUser.username,
password: await bcrypt.hash(newUser.password, 10),
admin: newUser.admin ?? false,
meta: newUser.meta ?? {},
tokens: [],
files: []
};
log.debug(`Creating ${user.admin ? 'admin' : 'regular'} user`, user.username, user.id);
// todo: also check duplicate usernames
await data.put('users', user.id, user);
} catch (err: any) { issue = `Error: ${err.message}`; }
if (issue) {
log.error('Failed to create user', issue);
return res.status(400).json({ success: false, message: issue });
}
log.debug(`User created`, user!.username);
res.json(({ success: true, message: `User ${user!.username} created` }));
});
export { router };

@@ -0,0 +1,154 @@
import { BusBoyFile, AssFile } from 'ass';
import axios from 'axios';
import fs from 'fs-extra';
import bb from 'express-busboy';
import crypto from 'crypto';
import { Router } from 'express';
import { Readable } from 'stream';
import * as data from '../data.js';
import { log } from '../log.js';
import { App } from '../app.js';
import { random } from '../generators.js';
import { UserConfig } from '../UserConfig.js';
import { getFileS3, uploadFileS3 } from '../s3.js';
import { rateLimiterMiddleware } from '../ratelimit.js';
const router = Router({ caseSensitive: true });
//@ts-ignore // Required since bb.extends expects express.Application, not a Router (but it still works)
bb.extend(router, {
upload: true,
restrictMultiple: true,
allowedPath: (url: string) => url === '/',
limits: {
fileSize: () => (UserConfig.ready ? UserConfig.config.maximumFileSize : 50) * 1000000 // MB
}
});
// Render or redirect
router.get('/', (req, res) => UserConfig.ready ? res.render('index', { version: App.pkgVersion }) : res.redirect('/setup'));
// Upload flow
router.post('/', rateLimiterMiddleware("upload", UserConfig.config?.rateLimit?.upload), async (req, res) => {
// Check user config
if (!UserConfig.ready) return res.status(500).type('text').send('Configuration missing!');
// Does the file actually exist
if (!req.files || !req.files['file']) return res.status(400).type('text').send('No file was provided!');
else log.debug('Upload request received', `Using ${UserConfig.config.s3 != null ? 'S3' : 'local'} storage`);
// Type-check the file data
const bbFile: BusBoyFile = req.files['file'];
// Prepare file move
const uploads = UserConfig.config.uploadsDir;
const timestamp = Date.now().toString();
const fileKey = `${timestamp}_${bbFile.filename}`;
const destination = `${uploads}${uploads.endsWith('/') ? '' : '/'}${fileKey}`;
// S3 configuration
const s3 = UserConfig.config.s3 != null ? UserConfig.config.s3 : false;
try {
// Get the file size
const size = (await fs.stat(bbFile.file)).size;
// Get the hash
const sha256 = crypto.createHash('sha256').update(await fs.readFile(bbFile.file)).digest('base64');
// * Move the file
if (!s3) await fs.move(bbFile.file, destination);
else await uploadFileS3(await fs.readFile(bbFile.file), fileKey, bbFile.mimetype, size, sha256);
// Build ass metadata
const assFile: AssFile = {
fakeid: random({ length: UserConfig.config.idSize }), // todo: more generators
size,
sha256,
fileKey,
timestamp,
mimetype: bbFile.mimetype,
filename: bbFile.filename,
uploader: '0', // todo: users
save: {},
};
// Set the save location
if (!s3) assFile.save.local = destination;
else {
// Using S3 doesn't move temp file, delete it now
await fs.rm(bbFile.file);
assFile.save.s3 = true;
}
// * Save metadata
await data.put('files', assFile.fakeid, assFile);
log.debug('File saved to', !s3 ? assFile.save.local! : 'S3');
await res.type('json').send({ resource: `${req.ass.host}/${assFile.fakeid}` });
// Send to Discord webhook
try {
// axios serializes the second argument as the JSON body itself, so pass the Discord payload directly
await axios.post(UserConfig.config.discordWebhook, {
	content: `New upload: ${req.ass.host}/${assFile.fakeid}`
})
} catch (err) {
log.warn('Failed to send request to Discord webhook');
console.error(err);
}
} catch (err) {
log.error('Failed to upload file', bbFile.filename);
console.error(err);
return res.status(500).send(err);
}
});
router.get('/:fakeId', (req, res) => res.redirect(`/direct/${req.params.fakeId}`));
router.get('/direct/:fakeId', async (req, res) => {
if (!UserConfig.ready) return res.redirect('/setup');
// Get the ID
const fakeId = req.params.fakeId;
// Get the file metadata
let _data;
try { _data = await data.get('files', fakeId); }
catch (err) {
log.error('Failed to get', fakeId);
console.error(err);
return res.status(500).send();
}
if (!_data) return res.status(404).send();
else {
const meta = _data as AssFile;
// File data can come from either S3 or local filesystem
let output: Readable | NodeJS.ReadableStream;
// Try to retrieve the file
if (!!meta.save.s3) {
const file = await getFileS3(meta.fileKey);
if (!file.Body) return res.status(500).send('Unknown error');
output = file.Body as Readable;
} else output = fs.createReadStream(meta.save.local!);
// Configure response headers
res.type(meta.mimetype)
.header('Content-Disposition', `inline; filename="${meta.filename}"`)
.header('Cache-Control', 'public, max-age=31536000, immutable')
.header('Accept-Ranges', 'bytes');
// Send the file (thanks to https://stackoverflow.com/a/67373050)
output.pipe(res);
}
});
export { router };

@ -0,0 +1,177 @@
import {
S3Client,
S3ClientConfig,
PutObjectCommand,
PutObjectCommandOutput,
GetObjectCommand,
GetObjectCommandOutput,
CreateMultipartUploadCommand,
UploadPartCommand,
CompleteMultipartUploadCommand,
CompleteMultipartUploadCommandOutput,
AbortMultipartUploadCommand,
} from "@aws-sdk/client-s3";
import { log } from './log.js';
import { UserConfig } from './UserConfig.js';
const NYR = 'S3 not ready';
/**
* Helper function to verify if the S3 config has been set
*/
const s3readyCheck = (): boolean => UserConfig.ready && UserConfig.config.s3 != null;
let _s3client: S3Client;
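// Lazy accessor: returns null until the S3 config is ready, building the client once on first use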
const s3 = (): S3Client | null => {
if (!s3readyCheck()) return null; // call the check, don't just test the function reference
// Build the S3 client
if (_s3client == undefined) {
const { endpoint, bucket, credentials, region } = UserConfig.config.s3!;
// Set up base config (without optional region)
const s3config: S3ClientConfig = {
endpoint,
credentials: {
accessKeyId: credentials.accessKey,
secretAccessKey: credentials.secretKey
}
};
// Attach region to config if required
s3config.region = region != null ? region : 'auto';
// Build the new client
_s3client = new S3Client(s3config);
log.debug('S3 client configured', endpoint, bucket);
}
return _s3client;
};
/**
* Basic single file upload
*/
const doObjectUpload = (file: Buffer, fileKey: string, mimetype: string, size: number, sha256: string): Promise<PutObjectCommandOutput> =>
new Promise((resolve, reject) => s3()!.send(new PutObjectCommand({
Bucket: UserConfig.config.s3!.bucket,
Key: fileKey,
ContentType: mimetype,
ContentLength: size,
Body: new Uint8Array(file),
ChecksumSHA256: sha256
})).then(resolve).catch(reject));
/**
* More complicated multipart upload for large files
*/
const doMultipartUpload = (file: Buffer, mimetype: string, fileKey: string): Promise<CompleteMultipartUploadCommandOutput> => new Promise(async (resolve, reject) => {
let uploadId: string | undefined;
try {
// Create multipart upload for S3
const multipartUpload = await s3()!.send(new CreateMultipartUploadCommand({
Bucket: UserConfig.config.s3!.bucket,
Key: fileKey,
ContentType: mimetype
}));
// Get the ID in case we have to abort it later
uploadId = multipartUpload.UploadId;
// Split the buffer into 5 equal parts (S3 requires at least 5 MB per part, except the last, so this assumes a reasonably large file)
const partSize = Math.ceil(file.length / 5);
// Build the upload commands
const uploadParts = [];
for (let i = 0; i < 5; i++) {
const start = i * partSize;
const end = start + partSize;
uploadParts.push(s3()!
.send(new UploadPartCommand({
Bucket: UserConfig.config.s3!.bucket,
Key: fileKey,
UploadId: uploadId,
Body: file.subarray(start, end),
PartNumber: i + 1
}))
.then((d) => (log.debug('S3 Upload', `Part ${i + 1} uploaded`), d)));
}
// Upload all the parts
const uploadResults = await Promise.all(uploadParts);
// Complete the upload (the output includes the object's Location)
const output = await s3()!.send(
new CompleteMultipartUploadCommand({
Bucket: UserConfig.config.s3!.bucket,
Key: fileKey,
UploadId: uploadId,
MultipartUpload: {
Parts: uploadResults.map(({ ETag }, i) => ({ ETag, PartNumber: i + 1 }))
}
}));
// todo: S3 multipart: clean up/finalize this properly
console.log(output);
resolve(output);
} catch (err) {
	// Abort the multipart upload if it was started, then reject either way
	if (uploadId) {
		await s3()!.send(new AbortMultipartUploadCommand({
			Bucket: UserConfig.config.s3!.bucket,
			Key: fileKey,
			UploadId: uploadId,
		}));
	}
	reject(err);
}
});
/**
* Uploads a file to your configured S3 provider
*/
export const uploadFileS3 = (file: Buffer, fileKey: string, mimetype: string, size: number, sha256: string): Promise<void> => new Promise(async (resolve, reject) => {
if (!s3readyCheck()) return reject(NYR);
try {
	// todo: determine when to do multipart uploads
await doObjectUpload(file, fileKey, mimetype, size, sha256);
resolve(void 0);
} catch (err) {
log.error('Failed to upload object to S3', fileKey);
console.error(err);
reject(err);
}
});
/**
* Gets a file from your configured S3 provider
*/
export const getFileS3 = (fileKey: string): Promise<GetObjectCommandOutput> => new Promise(async (resolve, reject) => {
if (!s3readyCheck()) return reject(NYR);
try {
resolve(await s3()!.send(new GetObjectCommand({
Bucket: UserConfig.config.s3!.bucket,
Key: fileKey
})));
} catch (err) {
log.error('Failed to get object from S3', fileKey);
console.error(err);
reject(err);
}
});
/**
* Deletes a file from your configured S3 provider
*/
export const deleteFileS3 = (): Promise<void> => new Promise((resolve, reject) => {
const NYI = 'Not yet implemented';
if (!s3readyCheck()) return reject(NYR);
log.warn('S3 Delete', NYI);
reject(NYI);
});

@ -0,0 +1,67 @@
import { NID, Database, DatabaseTable, DatabaseValue } from "ass";
export class DBManager {
private static _db: Database;
private static _dbReady: boolean = false;
public static get ready() {
return this._dbReady;
}
static {
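// Close the active database when the process exits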
process.on('exit', () => {
if (DBManager._db) DBManager._db.close();
});
}
/**
* activate a database
*/
public static use(db: Database): Promise<void> {
return new Promise(async (resolve, reject) => {
	try {
		// Swap out any previously active database first
		if (this._db != undefined) {
			await this._db.close();
			this._dbReady = false;
		}
		this._db = db;
		await this._db.open();
		await this._db.configure();
		this._dbReady = true;
		resolve();
	} catch (err) { reject(err); }
});
}
public static configure(): Promise<void> {
if (this._db && this._dbReady) {
return this._db.configure();
} else throw new Error("No database active");
}
/**
* put a value in the database
*/
public static put(table: DatabaseTable, key: NID, data: DatabaseValue): Promise<void> {
if (this._db && this._dbReady) {
return this._db.put(table, key, data);
} else throw new Error("No database active");
}
/**
* get a value from the database
*/
public static get(table: DatabaseTable, key: NID): Promise<DatabaseValue> {
if (this._db && this._dbReady) {
return this._db.get(table, key);
} else throw new Error("No database active");
}
/**
* get all values from the database
*/
public static getAll(table: DatabaseTable): Promise<DatabaseValue[]> {
if (this._db && this._dbReady) {
return this._db.getAll(table);
} else throw new Error("No database active");
}
}

@ -0,0 +1,152 @@
import { AssFile, AssUser, FilesSchema, UsersSchema, Database, DatabaseTable, DatabaseValue } from 'ass';
import path, { resolve } from 'path';
import fs from 'fs-extra';
import { log } from '../log.js';
import { nanoid } from '../generators.js';
/**
* Absolute filepaths for JSON data files
*/
const PATHS = {
files: path.join('.ass-data/files.json'),
users: path.join('.ass-data/users.json')
};
/**
* map from tables to paths
*/
const PATHMAP = {
assfiles: PATHS.files,
assusers: PATHS.users
} as { [index: string]: string };
/**
* map from tables to sectors
*/
const SECTORMAP = {
assfiles: 'files',
assusers: 'users'
} as { [index: string]: string };
const bothWriter = async (files: FilesSchema, users: UsersSchema) => {
await fs.writeJson(PATHS.files, files, { spaces: '\t' });
await fs.writeJson(PATHS.users, users, { spaces: '\t' });
};
/**
* Creates a JSON file with a given empty data template
*/
const createEmptyJson = (filepath: string, emptyData: any): Promise<void> => new Promise(async (resolve, reject) => {
try {
if (!(await fs.pathExists(filepath))) {
await fs.ensureFile(filepath);
await fs.writeJson(filepath, emptyData, { spaces: '\t' });
}
resolve(void 0);
} catch (err) {
reject(err);
}
});
/**
* Ensures the data files exist and creates them if required
*/
export const ensureFiles = (): Promise<void> => new Promise(async (resolve, reject) => {
log.debug('Checking data files');
try {
// * Default files.json
await createEmptyJson(PATHS.files, {
files: {},
useSql: false,
meta: {}
} as FilesSchema);
// * Default users.json
await createEmptyJson(PATHS.users, {
tokens: [],
users: {},
cliKey: nanoid(32),
useSql: false,
meta: {}
} as UsersSchema);
log.debug('Data files exist');
resolve();
} catch (err) {
log.error('Failed to verify existence of data files');
reject(err);
}
});
/**
* JSON database. I know JSON isn't SQL, shut up.
*/
export class JSONDatabase implements Database {
public open(): Promise<void> { return Promise.resolve() }
public close(): Promise<void> { return Promise.resolve() }
public configure(): Promise<void> {
	return new Promise(async (resolve, reject) => {
		try {
			await ensureFiles();
			resolve();
		} catch (err) { reject(err); }
	});
}
public put(table: DatabaseTable, key: string, data: DatabaseValue): Promise<void> {
return new Promise(async (resolve, reject) => {
if (table == 'assfiles') {
// ? Local JSON
const filesJson = await fs.readJson(PATHS.files) as FilesSchema;
// Check if key already exists
if (filesJson.files[key] != null) return reject(new Error(`File key ${key} already exists`));
// Otherwise add the data
filesJson.files[key] = data as AssFile;
// Also save the key to the users file
const usersJson = await fs.readJson(PATHS.users) as UsersSchema;
// todo: uncomment this once users are implemented
// usersJson.users[data.uploader].files.push(key);
// Save the files
await bothWriter(filesJson, usersJson);
resolve();
} else if (table == 'assusers') {
// ? Local JSON
const usersJson = await fs.readJson(PATHS.users) as UsersSchema;
// Check if key already exists
if (usersJson.users[key] != null) return reject(new Error(`User key ${key} already exists`));
// Otherwise add the data
usersJson.users[key] = data as AssUser;
await fs.writeJson(PATHS.users, usersJson, { spaces: '\t' });
resolve();
} else reject(new Error(`Unknown table '${table}'`)); // otherwise the promise would never settle
});
}
public get(table: DatabaseTable, key: string): Promise<DatabaseValue> {
return new Promise(async (resolve, reject) => {
const data = (await fs.readJson(PATHMAP[table]))[SECTORMAP[table]][key];
(!data) ? reject(new Error(`Key '${key}' not found in '${table}'`)) : resolve(data);
});
}
public getAll(table: DatabaseTable): Promise<DatabaseValue[]> {
return new Promise(async (resolve, reject) => {
const data = (await fs.readJson(PATHMAP[table]))[SECTORMAP[table]];
// Return the sector's contents as an array of values
resolve(data ? Object.values(data) : []);
});
}
}

@ -0,0 +1,185 @@
import { AssFile, AssUser, NID, UploadToken, Database, DatabaseTable, DatabaseValue } from 'ass';
import mysql, { Pool } from 'mysql2/promise';
import { log } from '../log.js';
import { UserConfig } from '../UserConfig.js';
export class MySQLDatabase implements Database {
private _pool: Pool;
private _ready: boolean = false;
public get ready() { return this._ready; }
/**
* Quick function for creating a simple JSON table
*/
private _tableManager(mode: 'create' | 'drop', name: string, schema = '( NanoID varchar(255), Data JSON )'): Promise<void> {
return new Promise((resolve, reject) =>
this._pool.query(
mode === 'create'
? `CREATE TABLE ${name} ${schema};`
: `DROP TABLE ${name};`)
.then(() => resolve())
.catch((err) => reject(err)));
}
/**
* validate the mysql config
*/
private _validateConfig(): string | undefined {
// make sure the configuration exists
if (!UserConfig.ready) return 'User configuration not ready';
if (typeof UserConfig.config.database != 'object') return 'MySQL configuration missing';
if (UserConfig.config.database.kind != "mysql") return 'Database not set to MySQL, but MySQL is in use, something has gone terribly wrong';
if (typeof UserConfig.config.database.options != 'object') return 'MySQL configuration missing';
let mySqlConf = UserConfig.config.database.options;
// Check the MySQL configuration
const checker = (val: string) => val != null && val !== '';
const issue =
!checker(mySqlConf.host) ? 'Missing MySQL Host'
: !checker(mySqlConf.user) ? 'Missing MySQL User'
: !checker(mySqlConf.password) ? 'Missing MySQL Password'
: !checker(mySqlConf.database) ? 'Missing MySQL Database'
// ! Blame VS Code for this weird indentation
: undefined;
return issue;
}
public open() { return Promise.resolve(); }
public close() { return Promise.resolve(); }
/**
* Build the MySQL client and create the tables
*/
public configure(): Promise<void> {
return new Promise(async (resolve, reject) => {
try {
// Config check
let configError = this._validateConfig();
if (configError) throw new Error(configError);
// Create the pool
this._pool = mysql.createPool(UserConfig.config.database!.options!);
// Check if the pool is usable
const [rowz, _fields] = await this._pool.query(`SHOW FULL TABLES WHERE Table_Type LIKE 'BASE TABLE';`);
const rows_tableData = rowz as unknown as { [key: string]: string }[];
// Create tables if needed
if (rows_tableData.length === 0) {
log.warn('MySQL', 'Tables do not exist, creating');
await Promise.all([
this._tableManager('create', 'assfiles'),
this._tableManager('create', 'assusers'),
this._tableManager('create', 'asstokens')
]);
log.success('MySQL', 'Tables created');
} else {
// There's at least one row, do further checks
const tablesExist = { files: false, users: false, tokens: false };
// Check which tables ACTUALLY do exist
for (let row of rows_tableData) {
const table = row[`Tables_in_${UserConfig.config.database!.options!.database}`] as DatabaseTable;
if (table === 'assfiles') tablesExist.files = true;
if (table === 'assusers') tablesExist.users = true;
if (table === 'asstokens') tablesExist.tokens = true;
// ! Don't use `= table === ''` because this is a loop
}
// Mini-function for creating a one-off table
const createOneTable = async (name: DatabaseTable) => {
log.warn('MySQL', `Table '${name}' missing, creating`);
await this._tableManager('create', name);
log.success('MySQL', `Table '${name}' created`);
}
// Check & create tables (marking each as existing once created)
if (!tablesExist.files) { await createOneTable('assfiles'); tablesExist.files = true; }
if (!tablesExist.users) { await createOneTable('assusers'); tablesExist.users = true; }
if (!tablesExist.tokens) { await createOneTable('asstokens'); tablesExist.tokens = true; }
// ! temp: drop tables for testing
/* await MySql._tableManager('drop', 'assfiles');
await MySql._tableManager('drop', 'assusers');
log.debug('Table dropped'); */
// Hopefully we are ready
if (tablesExist.files && tablesExist.users && tablesExist.tokens)
	log.info('MySQL', 'Tables exist, ready');
else throw new Error('Table(s) missing!');
}
// We are ready!
this._ready = true;
resolve();
} catch (err) {
log.error('MySQL', 'failed to initialize');
console.error(err);
reject(err);
}
});
}
public put(table: DatabaseTable, key: NID, data: DatabaseValue): Promise<void> {
return new Promise(async (resolve, reject) => {
if (!this._ready) return reject(new Error('MySQL not ready'));
try {
if (await this.get(table, key))
	return reject(new Error(`${table == 'assfiles' ? 'File' : table == 'assusers' ? 'User' : 'Token'} key ${key} already exists`));
} catch (err: any) {
	if (!err.message.includes('not found in'))
		return reject(err);
}
// Placeholders escape the key & data; the table name is safe since it comes from a fixed union type
const query = `INSERT INTO ${table} ( NanoID, Data ) VALUES (?, ?);`;
return this._pool.query(query, [key, JSON.stringify(data)])
	.then(() => resolve(void 0))
	.catch((err) => reject(err));
});
}
public get(table: DatabaseTable, key: NID): Promise<DatabaseValue> {
return new Promise(async (resolve, reject) => {
try {
// Run query
const [rowz, _fields] = await this._pool.query(`SELECT Data FROM ${table} WHERE NanoID = ?;`, [key]);
// Disgustingly interpret the query results
const rows_tableData = (rowz as unknown as { [key: string]: string }[])[0] as unknown as ({ Data: UploadToken | AssFile | AssUser | undefined });
if (rows_tableData?.Data) resolve(rows_tableData.Data);
else throw new Error(`Key '${key}' not found in '${table}'`);
} catch (err) {
reject(err);
}
});
}
public getAll(table: DatabaseTable): Promise<DatabaseValue[]> {
return new Promise(async (resolve, reject) => {
try {
// Run query
const [rowz, _fields] = await this._pool.query(`SELECT Data FROM ${table}`);
// Interpret results this is pain
const rows = (rowz as unknown as { Data: UploadToken | AssFile | AssUser }[]);
resolve(rows.map((row) => row.Data));
} catch (err) {
reject(err);
}
});
}
}

@ -0,0 +1,200 @@
import { PostgresConfiguration, Database, DatabaseTable, DatabaseValue } from 'ass';
import pg from 'pg';
import { log } from '../log.js';
import { UserConfig } from '../UserConfig.js';
/**
* database adapter for postgresql
*/
export class PostgreSQLDatabase implements Database {
private _client: pg.Client;
/**
* validate config
*/
private _validateConfig(): string | undefined {
// make sure the configuration exists
if (!UserConfig.ready) return 'User configuration not ready';
if (typeof UserConfig.config.database != 'object') return 'PostgreSQL configuration missing';
if (UserConfig.config.database.kind != "postgres") return 'Database not set to PostgreSQL, but PostgreSQL is in use, something has gone terribly wrong';
if (typeof UserConfig.config.database.options != 'object') return 'PostgreSQL configuration missing';
let config = UserConfig.config.database.options;
// check the postgres config
const checker = (val: string) => val != null && val !== '';
const issue =
!checker(config.host) ? 'Missing PostgreSQL Host'
: !checker(config.user) ? 'Missing PostgreSQL User'
: !checker(config.password) ? 'Missing PostgreSQL Password'
: !checker(config.database) ? 'Missing PostgreSQL Database'
// ! Blame VS Code for this weird indentation
: undefined;
return issue;
}
public open(): Promise<void> {
return new Promise(async (resolve, reject) => {
try {
// config check
let configError = this._validateConfig();
if (configError) throw new Error(configError);
// grab the config
let config = UserConfig.config.database!.options! as PostgresConfiguration;
// set up the client
this._client = new pg.Client({
host: config.host,
port: config.port,
user: config.user,
password: config.password,
database: config.database,
});
// connect to the database
log.info('PostgreSQL', `connecting to ${config.host}:${config.port}`);
await this._client.connect();
log.success('PostgreSQL', 'ok');
resolve();
} catch (err) {
log.error('PostgreSQL', 'failed to connect');
console.error(err);
reject(err);
}
});
}
public close(): Promise<void> {
return new Promise(async (resolve, reject) => {
try {
// gracefully disconnect
await this._client.end();
resolve();
} catch (err) {
log.error('PostgreSQL', 'failed to disconnect');
console.error(err);
reject(err);
}
});
}
public configure(): Promise<void> {
return new Promise(async (resolve, reject) => {
try {
await this._client.query(
`CREATE TABLE IF NOT EXISTS asstables (
name TEXT PRIMARY KEY,
version INT NOT NULL
);`);
log.info('PostgreSQL', 'checking database');
// update tables
let seenRows = new Set<string>();
let versions = await this._client.query('SELECT * FROM asstables;');
for (let row of versions.rows) {
seenRows.add(row.name);
}
const assTableSchema = '(id TEXT PRIMARY KEY, data JSON NOT NULL)'
// add missing tables
if (!seenRows.has('assfiles')) {
log.warn('PostgreSQL', 'assfiles missing, repairing...')
await this._client.query(
`CREATE TABLE assfiles ${assTableSchema};` +
`INSERT INTO asstables (name, version) VALUES ('assfiles', 1);`
);
log.success('PostgreSQL', 'ok');
}
if (!seenRows.has('assusers')) {
log.warn('PostgreSQL', 'assusers missing, repairing...')
await this._client.query(
`CREATE TABLE assusers ${assTableSchema};` +
`INSERT INTO asstables (name, version) VALUES ('assusers', 1);`
);
log.success('PostgreSQL', 'ok');
}
if (!seenRows.has('asstokens')) {
log.warn('PostgreSQL', 'asstokens missing, repairing...')
await this._client.query(
`CREATE TABLE asstokens ${assTableSchema};` +
`INSERT INTO asstables (name, version) VALUES ('asstokens', 1);`
);
log.success('PostgreSQL', 'ok');
}
log.success('PostgreSQL', 'database is ok').callback(() => {
resolve();
});
} catch (err) {
log.error('PostgreSQL', 'failed to set up');
console.error(err);
reject(err);
}
});
}
public put(table: DatabaseTable, key: string, data: DatabaseValue): Promise<void> {
return new Promise(async (resolve, reject) => {
try {
const queries = {
assfiles: 'INSERT INTO assfiles (id, data) VALUES ($1, $2);',
assusers: 'INSERT INTO assusers (id, data) VALUES ($1, $2);',
asstokens: 'INSERT INTO asstokens (id, data) VALUES ($1, $2);'
};
let result = await this._client.query(queries[table], [key, data]);
resolve();
} catch (err) {
reject(err);
}
});
}
public get(table: DatabaseTable, key: string): Promise<DatabaseValue> {
return new Promise(async (resolve, reject) => {
try {
const queries = {
assfiles: 'SELECT data FROM assfiles WHERE id = $1::text;',
assusers: 'SELECT data FROM assusers WHERE id = $1::text;',
asstokens: 'SELECT data FROM asstokens WHERE id = $1::text;'
};
let result = await this._client.query(queries[table], [key]);
resolve(result.rowCount ? result.rows[0].data : void 0);
} catch (err) {
reject(err);
}
});
}
// todo: verify this works
public getAll(table: DatabaseTable): Promise<DatabaseValue[]> {
return new Promise(async (resolve, reject) => {
try {
const queries = {
assfiles: 'SELECT json_object_agg(id, data) AS stuff FROM assfiles;',
assusers: 'SELECT json_object_agg(id, data) AS stuff FROM assusers;',
asstokens: 'SELECT json_object_agg(id, data) AS stuff FROM asstokens;'
};
let result = await this._client.query(queries[table]);
resolve(result.rows[0]?.stuff ? Object.values(result.rows[0].stuff) : []); // json_object_agg yields one { id: data } object (or NULL); flatten it to a value array
} catch (err) {
reject(err);
}
});
}
}

@ -0,0 +1,11 @@
{
"extends": "@tsconfig/node20/tsconfig.json",
"compilerOptions": {
"outDir": "../dist/backend",
"strictPropertyInitialization": false
},
"include": [
"./**/*.ts",
"../**/common/*.ts"
]
}

@ -0,0 +1,22 @@
import { DateTime } from 'luxon';
import { id } from 'william.js';
export const customId = (length: number, alphabet: string = 'abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789') => id(length, 1, alphabet);
export const randomHexColour = () => { // From: https://www.geeksforgeeks.org/javascript-generate-random-hex-codes-color/
const letters = '0123456789ABCDEF';
let colour = '#';
for (let i = 0; i < 6; i++)
colour += letters[(Math.floor(Math.random() * letters.length))];
return colour;
};
export const formatTimestamp = (timestamp: number, timeoffset: string) =>
DateTime.fromMillis(timestamp).setZone(timeoffset).toLocaleString(DateTime.DATETIME_MED);
export const formatBytes = (bytes: number, decimals = 2) => {
if (bytes === 0) return '0 Bytes';
const sizes = ['Bytes', 'KB', 'MB', 'GB', 'TB', 'PB', 'EB', 'ZB', 'YB'];
const i = Math.floor(Math.log(bytes) / Math.log(1024));
return parseFloat((bytes / Math.pow(1024, i)).toFixed(decimals < 0 ? 0 : decimals)).toString().concat(` ${sizes[i]}`);
};

@ -0,0 +1,28 @@
import fs from 'fs-extra';
import { path } from '@tycrek/joint';
import { TLog } from '@tycrek/log';
const log = new TLog();
const FILES = {
prefix: 'dist/frontend',
suffix: '.mjs',
pages: [
'setup',
'login',
'admin',
'user',
]
};
const fixFile = (page) => {
const filePath = path.join(FILES.prefix, `${page}${FILES.suffix}`);
const fixed = fs.readFileSync(filePath).toString().replace('export {};', '');
return fs.writeFile(filePath, fixed);
};
log.info('Fixing frontend JS', `${FILES.pages.length} files`);
Promise.all(FILES.pages.map(fixFile))
.then(() => log.success('Fixed.'))
.catch(console.error);

39
common/global.d.ts vendored

@ -0,0 +1,39 @@
import { BusBoyFile } from 'ass';
import { Request, Response } from 'express';
declare module 'express-session' {
interface SessionData {
ass: {
auth?: {
uid: string;
token: string;
}
preLoginPath?: string;
}
}
}
declare global {
namespace Express {
interface Request {
/**
* ass-specific request items
*/
ass: {
/**
* Combination of {protocol}://{hostname}
*/
host: string
/**
* ass version
*/
version: string
}
files: { [key: string]: BusBoyFile }
}
}
}

306
common/types.d.ts vendored

@ -0,0 +1,306 @@
declare module 'ass' {
type NID = string;
type IdType = 'random' | 'original' | 'gfycat' | 'timestamp' | 'zws'
export type DatabaseValue = AssFile | AssUser | UploadToken;
export type DatabaseTable = 'assfiles' | 'assusers' | 'asstokens';
/**
* Core Express server config.
* This is separate from the user configuration starting in 0.15.0
*/
interface ServerConfiguration {
host: string;
port: number;
proxied: boolean;
}
/**
* User-defined configuration
*/
interface UserConfiguration {
uploadsDir: string;
idType: IdType;
idSize: number;
gfySize: number;
maximumFileSize: number;
discordWebhook: string;
s3?: S3Configuration;
database?: DatabaseConfiguration;
rateLimit?: RateLimitConfiguration;
}
interface S3Configuration {
/**
* S3 endpoint to use
*/
endpoint: string;
/**
* Bucket to upload to
*/
bucket: string;
/**
* Optional region. Required for some providers
*/
region?: string;
/**
* Access credentials
*/
credentials: {
accessKey: string;
secretKey: string;
}
}
/**
* interface for database classes
*/
export interface Database {
/**
* perform database initialization tasks
*/
open(): Promise<void>;
/**
* perform database suspension tasks
*/
close(): Promise<void>;
/**
* set up database
*/
configure(): Promise<void>;
/**
* put a value in the database
*/
put(table: DatabaseTable, key: NID, data: DatabaseValue): Promise<void>;
/**
* get a value from the database
*/
get(table: DatabaseTable, key: NID): Promise<DatabaseValue>;
/**
* get all values from the database
*/
getAll(table: DatabaseTable): Promise<DatabaseValue[]>;
}
interface DatabaseConfiguration {
kind: 'mysql' | 'postgres' | 'json';
options?: MySQLConfiguration | PostgresConfiguration;
}
interface MySQLConfiguration {
host: string;
port: number;
user: string;
password: string;
database: string;
}
interface PostgresConfiguration {
host: string;
port: number;
user: string;
password: string;
database: string;
}
/**
* rate limiter configuration
* @since 0.15.0
*/
interface RateLimitConfiguration {
/**
* rate limit for the login endpoints
*/
login?: EndpointRateLimitConfiguration;
/**
* rate limit for parts of the api not covered by other rate limits
*/
api?: EndpointRateLimitConfiguration;
/**
* rate limit for file uploads
*/
upload?: EndpointRateLimitConfiguration;
}
/**
* rate limiter per-endpoint configuration
* @since 0.15.0
*/
interface EndpointRateLimitConfiguration {
/**
* maximum number of requests per duration
*/
requests: number;
/**
* rate limiting window in seconds
*/
duration: number;
}
interface UserConfigTypeChecker {
uploadsDir: (val: any) => boolean;
idType: (val: any) => boolean;
idSize: (val: any) => boolean;
gfySize: (val: any) => boolean;
maximumFileSize: (val: any) => boolean;
discordWebhook: (val: any) => boolean;
s3: {
endpoint: (val: any) => boolean;
bucket: (val: any) => boolean;
region: (val: any) => boolean;
credentials: {
accessKey: (val: any) => boolean;
secretKey: (val: any) => boolean;
}
}
sql: {
mySql: {
host: (val: any) => boolean;
port: (val: any) => boolean;
user: (val: any) => boolean;
password: (val: any) => boolean;
database: (val: any) => boolean;
}
postgres: {
port: (val: any) => boolean;
}
}
rateLimit: {
endpoint: (val: any) => boolean;
}
}
/**
* The in-progress structure of a file being uploaded (pre-ass processing)
*/
interface BusBoyFile {
uuid: string;
field: string;
/**
* Absolute path to the temporary file on-disk
*/
file: string;
filename: string;
encoding: string;
mimetype: string;
truncated: boolean;
done: boolean;
}
/**
* Object describing the file as ass handles it (after BusBoy)
*/
interface AssFile {
/**
* Public identifier used in the URL
*/
fakeid: NID;
/**
* Unique-but-human-readable ID. Combination of Epoch and filename.
* This allows users to search for their file while also avoiding conflicts.
*/
fileKey: string;
/**
* The original filename when it was uploaded by the user
*/
filename: string;
mimetype: string;
save: {
local?: string;
s3?: {
privateUrl?: string;
publicUrl?: string;
thumbnailUrl?: string;
} | true;
}
sha256: string;
size: number;
timestamp: string;
uploader: NID;
}
/**
* Structure of a token in 0.15.0, allowing more fancy features, maybe
*/
interface UploadToken {
/**
* Token ID to link it to a user
*/
id: NID;
/**
* The token itself. The user will need this for upload auth.
*/
token: string;
/**
* Helps the user know what this token is used for
*/
hint: string;
}
/**
* Object describing the users of an ass instance
*/
interface AssUser {
id: NID;
username: string;
password: string;
admin: boolean
tokens: NID[];
files: NID[];
meta: { [key: string]: any };
}
interface AssUserNewReq {
username: string;
password: string;
admin?: boolean;
meta?: { [key: string]: any };
}
/**
* JSON schema for files.json
*/
interface FilesSchema {
files: {
[key: NID]: AssFile;
}
meta: { [key: string]: any };
}
/**
* JSON scheme for users.json
*/
interface UsersSchema {
tokens: UploadToken[];
users: {
[key: NID]: AssUser;
};
cliKey: string;
meta: { [key: string]: any };
}
}
//#region Dummy modules
declare module '@tinycreek/postcss-font-magician';
//#endregion
// don't commit
/* future UserConfig options:
mediaStrict: boolean;
viewDirect: boolean;
viewDirectDiscord: boolean;
adminWebhook: {}
s3: {}
*/

@ -1,4 +1,4 @@
# ass Docker compose.yaml v0.2.0
# ass Docker compose.yaml v0.3.0
# authors:
# - tycrek <t@tycrek.com> (https://tycrek.com/)
# - Zusier <zusier@pm.me> (https://github.com/Zusier)
@ -9,27 +9,15 @@
services:
ass:
build: .
image: tycrek/ass
container_name: ass-docker
restart: unless-stopped
volumes:
- ./.ass-data:/opt/ass-src/.ass-data
ports:
- "40115:40115"
volumes:
- ./uploads:/opt/ass/uploads
- ./share:/opt/ass/share
- type: bind
source: ./config.json
target: /opt/ass/config.json
- type: bind
source: ./auth.json
target: /opt/ass/auth.json
- type: bind
source: ./data.json
target: /opt/ass/data.json
tmpfs: /tmp # temp files such as uploads are stored here
tmpfs: /tmp
tty: true
environment:
- NODE_ENV=production # for production
- ASS_ENV=docker # docker, local, production (not widely used yet)
- LOG_LEVEL=debug # debug, info, warn, error
- FORCE_COLOR=3 # force color output
- NODE_ENV=production
- FORCE_COLOR=3 # tlog color output

@ -0,0 +1,29 @@
#!/bin/bash
denv=FORCE_COLOR=3
volume=$(pwd)/.ass-data:/opt/ass-src/.ass-data
workdir=/opt/ass-src/
port=40115:40115
# container name:tag (tag is unix timestamp)
cname=ass:$(date +%s)
# build image
docker buildx build -t $cname .
# run the new image
docker run -it -e $denv -v $volume -w $workdir -p $port $cname
# wait for exit
echo
echo
echo -e "\033[32m\033[1mTo use this image again, run:\033[0m"
echo
echo " docker run -it \\"
echo " -e $denv \\"
echo " -v \$(pwd)/.ass-data:/opt/ass-src/.ass-data \\"
echo " -w $workdir \\"
echo " -p $port \\"
echo " $cname"
echo

@ -0,0 +1,103 @@
import { defineConfig } from 'vitepress';
const LOGO = 'https://i.tycrek.dev/ass';
const GIT_BRANCH = 'dev/0.15.0'
// https://vitepress.dev/reference/site-config
export default defineConfig({
lang: 'en-US',
title: 'ass docs',
titleTemplate: ':title ~ ass docs',
description: 'Documentation for ass, an open-source ShareX server',
cleanUrls: true,
lastUpdated: true,
head: [
['meta', { property: 'og:image', content: LOGO }],
['meta', { property: 'og:type', content: 'website' }],
['meta', { property: 'twitter:domain', content: 'ass.tycrek.dev' }],
['meta', { property: 'twitter:image', content: LOGO }],
['link', { rel: 'icon', href: LOGO }],
],
themeConfig: {
// https://vitepress.dev/reference/default-theme-config
logo: LOGO,
nav: [
{ text: 'Home', link: '/' },
{
text: 'Install', items: [
{ text: 'Docker', link: '/install/docker' },
{ text: 'Local', link: '/install/local' }
]
},
{ text: 'Configure', link: '/configure/' }
],
sidebar: [
{
text: 'Install',
link: '/install/',
items: [
{ text: 'Docker', link: '/install/docker' },
{ text: 'Local', link: '/install/local' }
]
},
{
text: 'Configure',
link: '/configure/',
items: [
{
text: 'SQL',
items: [
{
text: 'MySQL',
link: '/configure/sql/mysql'
},
{
text: 'PostgreSQL',
link: '/configure/sql/postgresql'
}
]
},
{
text: 'Clients',
items: [
{
text: 'ShareX',
link: '/configure/clients/sharex'
},
{
text: 'Flameshot',
link: '/configure/clients/flameshot'
}
]
}
]
},
{
text: 'Customize',
link: '/customize/',
items: [
{ text: 'Colors', link: '/customize/colors' }
]
}
],
editLink: {
pattern: `https://github.com/tycrek/ass/edit/${GIT_BRANCH}/docs/:path`,
text: 'Edit this page on GitHub',
},
footer: {
message: 'Released under the ISC License.',
copyright: 'Copyright © 2023 tycrek & ass contributors',
},
socialLinks: [
{ icon: 'github', link: 'https://github.com/tycrek/ass/' },
{ icon: 'discord', link: 'https://discord.gg/wGZYt5fasY' }
]
}
});

@ -0,0 +1,17 @@
// https://vitepress.dev/guide/custom-theme
import { h } from 'vue'
import type { Theme } from 'vitepress'
import DefaultTheme from 'vitepress/theme'
import './style.css'
export default {
extends: DefaultTheme,
Layout: () => {
return h(DefaultTheme.Layout, null, {
// https://vitepress.dev/guide/extending-default-theme#layout-slots
})
},
enhanceApp({ app, router, siteData }) {
// ...
}
} satisfies Theme

@ -0,0 +1,139 @@
/**
* Customize default theme styling by overriding CSS variables:
* https://github.com/vuejs/vitepress/blob/main/src/client/theme-default/styles/vars.css
*/
/**
* Colors
*
* Each color has the exact same color scale system with 3 levels of solid
* colors with different brightness, and 1 soft color.
*
* - `XXX-1`: The most solid color used mainly for colored text. It must
* satisfy the contrast ratio against when used on top of `XXX-soft`.
*
* - `XXX-2`: The color used mainly for hover state of the button.
*
* - `XXX-3`: The color for solid background, such as bg color of the button.
* It must satisfy the contrast ratio with pure white (#ffffff) text on
* top of it.
*
* - `XXX-soft`: The color used for subtle background such as custom container
* or badges. It must satisfy the contrast ratio when putting `XXX-1` colors
* on top of it.
*
* The soft color must be semi transparent alpha channel. This is crucial
* because it allows adding multiple "soft" colors on top of each other
* to create an accent, such as when having an inline code block inside
* custom containers.
*
* - `default`: The color used purely for subtle indication without any
* special meanings attached to it, such as bg color for menu hover state.
*
* - `brand`: Used for primary brand colors, such as link text, button with
* brand theme, etc.
*
* - `tip`: Used to indicate useful information. The default theme uses the
* brand color for this by default.
*
* - `warning`: Used to indicate warning to the users. Used in custom
* container, badges, etc.
*
* - `danger`: Used to show error, or dangerous message to the users. Used
* in custom container, badges, etc.
* -------------------------------------------------------------------------- */
:root {
--vp-c-default-1: var(--vp-c-gray-1);
--vp-c-default-2: var(--vp-c-gray-2);
--vp-c-default-3: var(--vp-c-gray-3);
--vp-c-default-soft: var(--vp-c-gray-soft);
--vp-c-brand-1: var(--vp-c-indigo-1);
--vp-c-brand-2: var(--vp-c-indigo-2);
--vp-c-brand-3: var(--vp-c-indigo-3);
--vp-c-brand-soft: var(--vp-c-indigo-soft);
--vp-c-tip-1: var(--vp-c-brand-1);
--vp-c-tip-2: var(--vp-c-brand-2);
--vp-c-tip-3: var(--vp-c-brand-3);
--vp-c-tip-soft: var(--vp-c-brand-soft);
--vp-c-warning-1: var(--vp-c-yellow-1);
--vp-c-warning-2: var(--vp-c-yellow-2);
--vp-c-warning-3: var(--vp-c-yellow-3);
--vp-c-warning-soft: var(--vp-c-yellow-soft);
--vp-c-danger-1: var(--vp-c-red-1);
--vp-c-danger-2: var(--vp-c-red-2);
--vp-c-danger-3: var(--vp-c-red-3);
--vp-c-danger-soft: var(--vp-c-red-soft);
}
/**
* Component: Button
* -------------------------------------------------------------------------- */
:root {
--vp-button-brand-border: transparent;
--vp-button-brand-text: var(--vp-c-white);
--vp-button-brand-bg: var(--vp-c-brand-3);
--vp-button-brand-hover-border: transparent;
--vp-button-brand-hover-text: var(--vp-c-white);
--vp-button-brand-hover-bg: var(--vp-c-brand-2);
--vp-button-brand-active-border: transparent;
--vp-button-brand-active-text: var(--vp-c-white);
--vp-button-brand-active-bg: var(--vp-c-brand-1);
}
/**
* Component: Home
* -------------------------------------------------------------------------- */
:root {
--vp-home-hero-name-color: transparent;
--vp-home-hero-name-background: -webkit-linear-gradient(
120deg,
#bd34fe 30%,
#41d1ff
);
--vp-home-hero-image-background-image: linear-gradient(
-45deg,
#bd34fe 50%,
#47caff 50%
);
--vp-home-hero-image-filter: blur(44px);
}
@media (min-width: 640px) {
:root {
--vp-home-hero-image-filter: blur(56px);
}
}
@media (min-width: 960px) {
:root {
--vp-home-hero-image-filter: blur(68px);
}
}
/**
* Component: Custom Block
* -------------------------------------------------------------------------- */
:root {
--vp-custom-block-tip-border: transparent;
--vp-custom-block-tip-text: var(--vp-c-text-1);
--vp-custom-block-tip-bg: var(--vp-c-brand-soft);
--vp-custom-block-tip-code-bg: var(--vp-c-brand-soft);
}
/**
* Component: Algolia
* -------------------------------------------------------------------------- */
.DocSearch {
--docsearch-primary-color: var(--vp-c-brand-1) !important;
}

@ -0,0 +1,49 @@
---
outline: deep
---
# Runtime API Examples
This page demonstrates usage of some of the runtime APIs provided by VitePress.
The main `useData()` API can be used to access site, theme, and page data for the current page. It works in both `.md` and `.vue` files:
```md
<script setup>
import { useData } from 'vitepress'
const { theme, page, frontmatter } = useData()
</script>
## Results
### Theme Data
<pre>{{ theme }}</pre>
### Page Data
<pre>{{ page }}</pre>
### Page Frontmatter
<pre>{{ frontmatter }}</pre>
```
<script setup>
import { useData } from 'vitepress'
const { site, theme, page, frontmatter } = useData()
</script>
## Results
### Theme Data
<pre>{{ theme }}</pre>
### Page Data
<pre>{{ page }}</pre>
### Page Frontmatter
<pre>{{ frontmatter }}</pre>
## More
Check out the documentation for the [full list of runtime APIs](https://vitepress.dev/reference/runtime-api#usedata).

@ -0,0 +1,11 @@
# Flameshot
The Flameshot script has been updated to be a lot more dynamic, including adding support for [cheek](https://github.com/tycrek/cheek#readme), my serverless ShareX upload server. To enable cheek mode, edit the file [`flameshot-v2.sh`](https://github.com/tycrek/ass/blob/dev/0.15.0/flameshot-v2.sh) and change `MODE=0` to `MODE=1`.
To set your token (not in use yet, so it can be random) and domain for the script, create the following directory and files:
- `~/.ass/` (or `~/.cheek/`)
- `~/.ass/.token`
- `~/.ass/.domain`
For `.domain`, you do **not** need to include `http(s)://`.
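A minimal sketch of setting these up (both values below are placeholders; the token can be anything for now):
```bash
mkdir -p ~/.ass
echo "any-random-token" > ~/.ass/.token
echo "ass.example.com" > ~/.ass/.domain
```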

@ -0,0 +1,10 @@
# ShareX
| Setting | Value |
| ------- | ----- |
| Request URL | Your server domain (including `http(s)://`) |
| Request Method | `POST` |
| Destination Type | `Image`, `Text`, `File` |
| Body | `multipart/form-data` |
| File Form Name | `file` |
| URL | `{json.resource}` |
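For reference, these settings map onto a ShareX custom uploader (`.sxcu`) roughly like the sketch below. The domain is a placeholder, and the exact JSON-path placeholder syntax (`{json:resource}` vs `$json:resource$`) depends on your ShareX version:
```json
{
	"Version": "13.4.0",
	"Name": "ass",
	"DestinationType": "ImageUploader, TextUploader, FileUploader",
	"RequestMethod": "POST",
	"RequestURL": "https://your.domain.example/",
	"Body": "MultipartFormData",
	"FileFormName": "file",
	"URL": "{json:resource}"
}
```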

@ -0,0 +1,25 @@
# Configure
Most of the configuration is managed through the administrator dashboard.
## `server.json` overrides
The webserver in ass 15 is hosted independently of any user configuration. If you wish to set a specific server setting, you may do so with a `server.json` file.
Place this file in `<root>/.ass-data/`.
| Property | Use | Default |
| -------- | --- | ------- |
| `host` | Local IP to bind to | `0.0.0.0` |
| `port` | Port to listen on | `40115` |
| `proxied` | If ass is behind a reverse proxy | `false`, unless `NODE_ENV=production` is specified, otherwise `true` |
**Example**
```json
{
"host": "127.0.1.2",
"port": 40200,
"proxied": false
}
```

@ -0,0 +1,7 @@
# MySQL
## Provider-specific instructions
### Aiven
In the **Overview** panel, scroll down to **Advanced**, and set `mysql.sql_require_primary_key` to **`Disabled`**.
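For reference, the MySQL options you enter during setup follow the `MySQLConfiguration` interface from `common/types.d.ts`; a sketch with placeholder values:
```json
{
	"kind": "mysql",
	"options": {
		"host": "mysql.example.com",
		"port": 3306,
		"user": "ass",
		"password": "change-me",
		"database": "ass"
	}
}
```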

@ -0,0 +1,3 @@
# Customize
This is coming soon.

@ -0,0 +1,34 @@
---
# https://vitepress.dev/reference/default-theme-home-page
layout: home
title: Home
hero:
name: ass
text: open-source file hosting server
tagline: Unopinionated, customizable, uniquely yours.
actions:
- theme: brand
text: Get Started
link: /install/
- theme: alt
text: View on GitHub
link: https://github.com/tycrek/ass
image:
src: 'https://i.tycrek.dev/ass-round-square-logo-white-with-text'
alt: ass logo
features:
- icon: 😋
title: sassy
details: Like me.
- icon: 🍔
title: greasy
details: More than a Big Mac.
- icon: ☁️
title: soft
details: Just the way you like it.
---

@ -0,0 +1,32 @@
# Docker
The Docker method uses [Docker Compose][1] for a quick and easy installation. For a faster deployment, a pre-built image is pulled from [Docker Hub](https://hub.docker.com/r/tycrek/ass).
## Requirements
- Latest [Docker](https://docs.docker.com/engine/install/)
- [Docker Compose][1] v2 plugin
[1]: https://docs.docker.com/compose/
## Install
I provide a pre-made `compose.yaml` file that makes it easier to get started.
```bash
mkdir ass && cd ass/
curl -LO https://ass.tycrek.dev/compose.yaml
docker compose up -d
```
### View logs
Use the following command to view the container logs:
```bash
docker compose logs -n <lines> --follow
```
## Build local image
If you wish to build a Docker image locally for development, you can use the provided [docker-dev-container.sh](https://github.com/tycrek/ass/blob/dev/0.15.0/docker-dev-container.sh) script.

@ -0,0 +1,18 @@
# Installation
You can use either [Docker](docker) (recommended) or your [local Node.js](local) installation.
::: warning ass 0.15.0 is experimental
Branch [`dev/0.15.0`](https://github.com/tycrek/ass/tree/dev/0.15.0/) is a full rewrite of the ass codebase.
At this time, it is working and ready for testing, **but is very incomplete** and is lacking many features currently found in ass 0.14.
**The existing configs, data.json, and auth.json will be abandoned. There is currently no migration path, nor is one planned.**
:::
## Alternatives
These are maintained by the ass community.
- Nix flake (soon)
- Pterodactyl Egg (soon)

@ -0,0 +1,20 @@
# Local install
The local method uses the [Node.js](https://nodejs.org/en) installation found on your system.
## Requirements
- **Node 20** or later
- [pnpm](https://pnpm.io/installation)
## Install
```bash
git clone -b dev/0.15.0 https://github.com/tycrek/ass.git && cd ass/
pnpm i
# or: npm i -D
pnpm run dev
# After ass has been compiled, you can instead use:
pnpm run start
```

@ -0,0 +1,85 @@
# Markdown Extension Examples
This page demonstrates some of the built-in markdown extensions provided by VitePress.
## Syntax Highlighting
VitePress provides Syntax Highlighting powered by [Shikiji](https://github.com/antfu/shikiji), with additional features like line-highlighting:
**Input**
````md
```js{4}
export default {
data () {
return {
msg: 'Highlighted!'
}
}
}
```
````
**Output**
```js{4}
export default {
data () {
return {
msg: 'Highlighted!'
}
}
}
```
## Custom Containers
**Input**
```md
::: info
This is an info box.
:::
::: tip
This is a tip.
:::
::: warning
This is a warning.
:::
::: danger
This is a dangerous warning.
:::
::: details
This is a details block.
:::
```
**Output**
::: info
This is an info box.
:::
::: tip
This is a tip.
:::
::: warning
This is a warning.
:::
::: danger
This is a dangerous warning.
:::
::: details
This is a details block.
:::
## More
Check out the documentation for the [full list of markdown extensions](https://vitepress.dev/guide/markdown).

@ -0,0 +1,136 @@
#!/usr/bin/env bash
# Script Configuration
# Load configuration file if available
# this is useful if you want to source keys from a secret file
CONFIG_FILE="config.sh"
if [ -f "$CONFIG_FILE" ]; then
# shellcheck disable=1090
source "${CONFIG_FILE}"
fi
LOG_DIR=$(pwd)
if [ ! -d "$LOG_DIR" ]; then
echo "The directory you have specified to save the logs does not exist."
echo "Please create the directory with the following command:"
echo "mkdir -p $LOG_DIR"
echo -en "Or specify a different LOG_DIR\n"
exit 1
fi
IMAGE_PATH="$HOME/Pictures"
if [ ! -d "$IMAGE_PATH" ]; then
echo "The directory you have specified to save the screenshot does not exist."
echo "Please create the directory with the following command:"
echo "mkdir -p $IMAGE_PATH"
echo -en "Or specify a different IMAGE_PATH\n"
exit 1
fi
IMAGE_NAME="ass"
FILE="${IMAGE_PATH}/${IMAGE_NAME}.png"
# Function to check if a tool is installed
check_tool() {
command -v "$1" >/dev/null 2>&1
}
# Function to take Flameshot screenshots
takeFlameshot() {
# check if flameshot tool is installed
REQUIRED_TOOLS=("flameshot")
for tool in "${REQUIRED_TOOLS[@]}"; do
if ! check_tool "$tool"; then
echo "Error: $tool is not installed. Please install it before using this script."
exit 1
fi
done
flameshot config -f "${IMAGE_NAME}"
flameshot gui -r -p "${IMAGE_PATH}" >/dev/null
}
# Function to take Wayland screenshots using grim + slurp
takeGrimshot() {
# check if grim and slurp are installed
REQUIRED_TOOLS=("grim" "slurp")
for tool in "${REQUIRED_TOOLS[@]}"; do
if ! check_tool "$tool"; then
echo "Error: $tool is not installed. Please install it before using this script."
exit 1
fi
done
grim -g "$(slurp)" "${FILE}" >/dev/null
}
# Function to remove the taken screenshot
removeTargetFile() {
echo -en "Process complete.\nRemoving image.\n"
rm -v "${FILE}"
}
# Function to upload target image to your ass instance
uploadScreenshot() {
echo -en "KEY & DOMAIN are set. Attempting to upload to your ass instance.\n"
URL=$(curl -X POST \
-H "Content-Type: multipart/form-data" \
-H "Accept: application/json" \
-H "User-Agent: ShareX/13.4.0" \
-H "Authorization: $KEY" \
-F "file=@${FILE}" "https://$DOMAIN/" | grep -Po '(?<="resource":")[^"]+')
if [[ "${XDG_SESSION_TYPE}" == x11 ]]; then
printf "%s" "$URL" | xclip -sel clip
elif [[ "${XDG_SESSION_TYPE}" == wayland ]]; then
printf "%s" "$URL" | wl-copy
else
echo -en "Invalid desktop session!\nExiting.\n"
exit 1
fi
}
localScreenshot() {
echo -en "KEY & DOMAIN variables are not set. Attempting local screenshot.\n"
if [[ "${XDG_SESSION_TYPE}" == x11 ]]; then
xclip -sel clip -target image/png <"${FILE}"
elif [[ "${XDG_SESSION_TYPE}" == wayland ]]; then
wl-copy <"${FILE}"
else
echo -en "Unknown display backend. Assuming Xorg and using xclip.\n"
xclip -sel clip -target image/png <"${FILE}"
fi
}
# Choose the screenshot tool based on the display backend
if [[ "${XDG_SESSION_TYPE}" == x11 ]]; then
echo -en "Display backend detected as Xorg (x11), using Flameshot\n"
takeFlameshot
elif [[ "${XDG_SESSION_TYPE}" == wayland ]]; then
echo -en "Display backend detected as Wayland, using grim & slurp\n"
takeGrimshot
else
echo -en "Unknown display backend. Assuming Xorg and using Flameshot\n"
takeFlameshot >"${LOG_DIR}/flameshot.log"
echo -en "Done. Make sure you check for any errors and report them.\nLogfile located in '${LOG_DIR}'\n"
fi
# Check if the screenshot file exists before proceeding
if [[ -f "${FILE}" ]]; then
if [[ -n "$KEY" && -n "$DOMAIN" ]]; then
# Upload the file to the ass instance
uploadScreenshot
# Remove image
removeTargetFile
else
# Take a screenshot locally
localScreenshot
# Remove image
removeTargetFile
fi
else
echo -en "Target file ${FILE} was not found. Aborting screenshot.\n"
exit 1
fi

@ -0,0 +1,89 @@
#!/bin/bash
## * ass & cheek flameshot script * ##
#
# Required packages: flameshot, curl, xclip, libnotify
#
# Authors:
# - ToxicAven (https://github.com/ToxicAven)
# - tycrek (https://github.com/tycrek)
# - Metacinnabar (https://github.com/Metacinnabar)
# - NotAShelf (https://github.com/NotAShelf)
# ! Upload mode (ass=0,cheek=1)
MODE=0
# Function to check if a tool is installed
check_tool() {
command -v "$1" >/dev/null 2>&1
}
# Mode string switcher
get_mode() {
if [[ $MODE -eq 0 ]];
then echo "ass"
else echo "cheek"
fi
}
# File details
IMGPATH="$HOME/.$(get_mode)"
FILE="$IMGPATH/$(get_mode)-$(date +%s).png"
# ass/cheek configuration (domain should be saved without http(s)://)
TOKEN=$(cat "$IMGPATH/.token")
DOMAIN=$(cat "$IMGPATH/.domain")
takeScreenshot() {
REQUIRED_TOOLS=("flameshot" "curl" "xclip" "notify-send")
# Check if the proper tools are installed
for tool in "${REQUIRED_TOOLS[@]}"; do
if ! check_tool "$tool"; then
echo "Error: $tool is missing!"
exit 1
fi
done
# Build dynamic Flameshot user-agent
USERAGENT=$(flameshot -v | sed -n -E 's/(Flameshot) (v[0-9]+\.[0-9]+\.[0-9]+) .+/\1-\2/p')
# Take screenshot with Flameshot
flameshot gui -r -p "$FILE" > /dev/null # Append the random gibberish to /dev/null
# Upload file
if [ -f "$FILE" ]; then
echo "Uploading $FILE to $(get_mode)..."
# Configure upload fields
FIELD="$([[ $MODE -eq 0 ]] && echo "file" || echo "image")=@$FILE"
[[ "${DOMAIN%%:*}" = "127.0.0.1" ]] && PROTOCOL="http" || PROTOCOL="https"
POSTTO="$PROTOCOL://$DOMAIN/$([[ $MODE -eq 0 ]] && echo "" || echo "upload")"
# Upload the file
URL=$(curl -sS -X POST \
-H "Content-Type: multipart/form-data" \
-H "Accept: application/json" \
-H "User-Agent: $USERAGENT" \
-H "Authorization: $TOKEN" \
-F "$FIELD" "$POSTTO"
)
# Response parser unique to ass
if [[ $MODE -eq 0 ]]; then
URL=$(echo $URL | grep -Po '(?<="resource":")[^"]+')
fi
# Copy the URL to clipboard (using printf instead of echo to avoid a newline)
printf "%s" "$URL" | xclip -sel clip
echo "URL copied: $URL"
notify-send -a $(get_mode) -t 4000 "URL copied to clipboard" "<a href=\"$URL\">View in browser</a>"
# Delete local file
rm "$FILE"
else
echo "Aborted."
fi
}
takeScreenshot

@ -1,27 +0,0 @@
#!/bin/bash
IMAGEPATH="$HOME/Pictures/" # Where to store screenshots before they're deleted
IMAGENAME="ass" # Not really important, tells Flameshot what file to send and delete
KEY="" # Your ass upload token
DOMAIN="" # Your upload domain (without http:// or https://)
flameshot config -f "$IMAGENAME" # Make sure that Flameshot names the file correctly
flameshot gui -r -p "$IMAGEPATH" > /dev/null # Prompt the screenshot GUI, also append the random gibberish to /dev/null
FILE="$IMAGEPATH$IMAGENAME.png" # File path and file name combined
# Check if file exists to handle Curl and rm errors
# then upload the image and copy the response URL
if [ -f "$FILE" ]; then
echo "$FILE exists."
URL=$(curl -X POST \
-H "Content-Type: multipart/form-data" \
-H "Accept: application/json" \
-H "User-Agent: ShareX/13.4.0" \
-H "Authorization: $KEY" \
-F "file=@$IMAGEPATH$IMAGENAME.png" "https://$DOMAIN/" | grep -Po '(?<="resource":")[^"]+')
# printf instead of echo as echo appends a newline
printf "%s" "$URL" | xclip -sel clip
rm "$IMAGEPATH$IMAGENAME.png" # Delete the image locally
else
echo "Aborted."
fi

@ -0,0 +1,4 @@
import { SlInput, SlButton } from '@shoelace-style/shoelace';
// * Wait for the document to be ready
document.addEventListener('DOMContentLoaded', () => console.log('Admin page loaded'));

@ -0,0 +1,46 @@
import { SlInput, SlButton } from '@shoelace-style/shoelace';
const genericErrorAlert = () => alert('An error occurred, please check the console for details');
const errAlert = (logTitle: string, err: any, stream: 'error' | 'warn' = 'error') => (console[stream](logTitle, err), genericErrorAlert());
const errReset = (message: string, element: SlButton) => (element.disabled = false, alert(message));
// * Wait for the document to be ready
document.addEventListener('DOMContentLoaded', () => {
const Elements = {
usernameInput: document.querySelector('#login-username') as SlInput,
passwordInput: document.querySelector('#login-password') as SlInput,
submitButton: document.querySelector('#login-submit') as SlButton
};
// * Login button click handler
Elements.submitButton.addEventListener('click', async () => {
Elements.submitButton.disabled = true;
// Make sure fields are filled
if (Elements.usernameInput.value == null || Elements.usernameInput.value === '')
return errReset('Username is required!', Elements.submitButton);
if (Elements.passwordInput.value == null || Elements.passwordInput.value === '')
return errReset('Password is required!', Elements.submitButton);
fetch('/api/login', {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({
username: Elements.usernameInput.value,
password: Elements.passwordInput.value
})
})
.then((res) => res.json())
.then((data: {
success: boolean,
message: string,
meta: { redirectTo: string }
}) => {
if (!data.success) alert(data.message);
else window.location.href = data.meta.redirectTo;
})
.catch((err) => errAlert('POST to /api/login failed!', err))
.finally(() => Elements.submitButton.disabled = false);
});
});

@ -0,0 +1,195 @@
import { SlInput, SlButton, SlTab } from '@shoelace-style/shoelace';
import { IdType, UserConfiguration } from 'ass';
const genericErrorAlert = () => alert('An error occurred, please check the console for details');
const errAlert = (logTitle: string, err: any, stream: 'error' | 'warn' = 'error') => (console[stream](logTitle, err), genericErrorAlert());
const errReset = (message: string, element: SlButton) => (element.disabled = false, alert(message));
const genericRateLimit = (config: object, category: string, submitButton: SlButton, requests: SlInput, time: SlInput) => {
if ((requests.value || time.value) != '') {
if (requests.value == '') {
errReset(`No count for ${category} rate limit`, submitButton);
return true; // true signals an error: the caller chains these checks with ||, so setup stops at the first failure
}
if (time.value == '') {
errReset(`No time for ${category} rate limit`, submitButton);
return true;
}
(config as any)[category] = {
requests: parseInt(requests.value),
duration: parseInt(time.value),
};
}
return false;
};
// * Wait for the document to be ready
document.addEventListener('DOMContentLoaded', () => {
const Elements = {
dirInput: document.querySelector('#uploads-dir') as SlInput,
idTypeInput: document.querySelector('#uploads-idtype') as SlInput,
idSizeInput: document.querySelector('#uploads-idsize') as SlInput,
gfySizeInput: document.querySelector('#uploads-gfysize') as SlInput,
fileSizeInput: document.querySelector('#uploads-filesize') as SlInput,
s3endpoint: document.querySelector('#s3-endpoint') as SlInput,
s3bucket: document.querySelector('#s3-bucket') as SlInput,
s3accessKey: document.querySelector('#s3-accessKey') as SlInput,
s3secretKey: document.querySelector('#s3-secretKey') as SlInput,
s3region: document.querySelector('#s3-region') as SlInput,
jsonTab: document.querySelector('#json-tab') as SlTab,
mySqlTab: document.querySelector('#mysql-tab') as SlTab,
mySqlHost: document.querySelector('#mysql-host') as SlInput,
mySqlPort: document.querySelector('#mysql-port') as SlInput,
mySqlUser: document.querySelector('#mysql-user') as SlInput,
mySqlPassword: document.querySelector('#mysql-password') as SlInput,
mySqlDatabase: document.querySelector('#mysql-database') as SlInput,
pgsqlTab: document.querySelector('#pgsql-tab') as SlTab,
pgsqlHost: document.querySelector('#pgsql-host') as SlInput,
pgsqlPort: document.querySelector('#pgsql-port') as SlInput,
pgsqlUser: document.querySelector('#pgsql-user') as SlInput,
pgsqlPassword: document.querySelector('#pgsql-password') as SlInput,
pgsqlDatabase: document.querySelector('#pgsql-database') as SlInput,
userUsername: document.querySelector('#user-username') as SlInput,
userPassword: document.querySelector('#user-password') as SlInput,
ratelimitLoginRequests: document.querySelector('#ratelimit-login-requests') as SlInput,
ratelimitLoginTime: document.querySelector('#ratelimit-login-time') as SlInput,
ratelimitApiRequests: document.querySelector('#ratelimit-api-requests') as SlInput,
ratelimitApiTime: document.querySelector('#ratelimit-api-time') as SlInput,
ratelimitUploadRequests: document.querySelector('#ratelimit-upload-requests') as SlInput,
ratelimitUploadTime: document.querySelector('#ratelimit-upload-time') as SlInput,
submitButton: document.querySelector('#submit') as SlButton,
};
// * Setup button click handler
Elements.submitButton.addEventListener('click', async () => {
Elements.submitButton.disabled = true;
// Base configuration values
const config: UserConfiguration = {
uploadsDir: Elements.dirInput.value,
idType: Elements.idTypeInput.value as IdType,
idSize: parseInt(Elements.idSizeInput.value),
gfySize: parseInt(Elements.gfySizeInput.value),
maximumFileSize: parseInt(Elements.fileSizeInput.value),
};
// Append S3 to config, if specified
if (Elements.s3endpoint.value != null && Elements.s3endpoint.value !== '') {
config.s3 = {
endpoint: Elements.s3endpoint.value,
bucket: Elements.s3bucket.value,
credentials: {
accessKey: Elements.s3accessKey.value,
secretKey: Elements.s3secretKey.value
}
};
// Also append region, if it was provided
if (Elements.s3region.value != null && Elements.s3region.value !== '')
config.s3.region = Elements.s3region.value;
}
// Append database to config, if specified
if (Elements.jsonTab.active) {
config.database = {
kind: 'json'
};
} else if (Elements.mySqlTab.active) {
if (Elements.mySqlHost.value != null && Elements.mySqlHost.value !== '') {
config.database = {
kind: 'mysql',
options: {
host: Elements.mySqlHost.value,
port: parseInt(Elements.mySqlPort.value),
user: Elements.mySqlUser.value,
password: Elements.mySqlPassword.value,
database: Elements.mySqlDatabase.value
}
};
}
} else if (Elements.pgsqlTab.active) {
if (Elements.pgsqlHost.value != null && Elements.pgsqlHost.value !== '') {
config.database = {
kind: 'postgres',
options: {
host: Elements.pgsqlHost.value,
port: parseInt(Elements.pgsqlPort.value),
user: Elements.pgsqlUser.value,
password: Elements.pgsqlPassword.value,
database: Elements.pgsqlDatabase.value
}
};
}
}
// append rate limit config, if specified
if ([
Elements.ratelimitLoginRequests.value,
Elements.ratelimitLoginTime.value,
Elements.ratelimitUploadRequests.value,
Elements.ratelimitUploadTime.value,
Elements.ratelimitApiRequests.value,
Elements.ratelimitApiTime.value,
].some((value) => value !== '')) {
if (!config.rateLimit) config.rateLimit = {};
if (
genericRateLimit(config.rateLimit, 'login', Elements.submitButton, Elements.ratelimitLoginRequests, Elements.ratelimitLoginTime)
|| genericRateLimit(config.rateLimit, 'api', Elements.submitButton, Elements.ratelimitApiRequests, Elements.ratelimitApiTime)
|| genericRateLimit(config.rateLimit, 'upload', Elements.submitButton, Elements.ratelimitUploadRequests, Elements.ratelimitUploadTime)
) {
return;
}
}
// ! Make sure the admin user fields are set
if (Elements.userUsername.value == null || Elements.userUsername.value === '')
return errReset('Admin username is required!', Elements.submitButton);
if (Elements.userPassword.value == null || Elements.userPassword.value === '')
return errReset('Admin password is required!', Elements.submitButton);
// Do setup
fetch('/api/setup', {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify(config)
})
.then((res) => res.json())
.then((data: {
success: boolean,
message: string
}) => {
if (!data.success) alert(data.message);
// Create first user (YES I KNOW THIS NESTING IS GROSS)
else return fetch('/api/user', {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({
username: Elements.userUsername.value,
password: Elements.userPassword.value,
admin: true
})
}).then((res) => res.json())
.then((data: {
success: boolean,
message: string
}) => {
if (data.success) window.location.href = '/admin';
else alert(data.message);
});
})
.catch((err) => errAlert('POST to /api/setup failed!', err))
.finally(() => Elements.submitButton.disabled = false);
});
});
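// A flatter async/await sketch of the same setup flow (illustrative; `headers`, `username`,
// `password`, and `config` stand in for the values built above):
//   const setup = await (await fetch('/api/setup', { method: 'POST', headers, body: JSON.stringify(config) })).json();
//   if (!setup.success) return alert(setup.message);
//   const user = await (await fetch('/api/user', { method: 'POST', headers, body: JSON.stringify({ username, password, admin: true }) })).json();
//   user.success ? (window.location.href = '/admin') : alert(user.message);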

@ -0,0 +1,15 @@
{
"extends": "@tsconfig/node20/tsconfig.json",
"compilerOptions": {
"outDir": "../dist/frontend",
"lib": [
"ES2022",
"DOM"
],
"target": "ES2015",
},
"include": [
"./**/*.mts",
"../**/common/*.ts"
]
}

@ -0,0 +1,4 @@
import { SlInput, SlButton } from '@shoelace-style/shoelace';
// * Wait for the document to be ready
document.addEventListener('DOMContentLoaded', () => console.log('User page loaded'));

@ -1,31 +0,0 @@
#!/bin/bash
echo "Installing ass-docker for Linux..."
# Ensure that ./uploads/thumbnails/ exists
mkdir -p ./uploads/thumbnails/
# Ensure that ./share/ exists
mkdir -p ./share/
# Ensure that files config.json, auth.json, & data.json exist
for value in config.json auth.json data.json
do
if [ ! -f "$value" ]; then
touch $value
fi
done
# Wait for user to confirm
echo "Continuing will run docker compose. Continue? (Press Ctrl+C to abort)"
read -n 1 -s -r -p "Press any key to continue..."
echo "Running setup..."
# Bring up the container and run the setup
docker compose up -d && docker compose exec ass npm run setup && docker compose restart
# Done!
echo "ass-docker for Linux installed!"
echo "Run the following to view commands:"
echo "$ docker compose logs -f --tail=50 --no-log-prefix ass"

@ -1,28 +0,0 @@
@echo off
ECHO Installing ass-docker for Windows...
REM Ensure that ./uploads/thumbnails/ exists
if not exist "./uploads/thumbnails/" md "./uploads/thumbnails/"
REM Ensure that ./share/ exists
if not exist "./share/" md "./share/"
REM Ensure that files config.json, auth.json, & data.json exist
if not exist "./config.json" echo. >> "./config.json"
if not exist "./auth.json" echo. >> "./auth.json"
if not exist "./data.json" echo. >> "./data.json"
REM Wait for user to confirm
ECHO Continuing will run docker compose. Continue? (Press Ctrl+C to abort)
PAUSE
ECHO Running setup...
REM Bring up the container and run the setup
docker compose up -d && docker compose exec ass npm run setup && docker compose restart
REM Done!
ECHO ass-docker for Windows installed!
ECHO Run the following to view commands:
ECHO > docker compose logs -f --tail=50 --no-log-prefix ass

package-lock.json (generated; 6253 lines changed)

File diff suppressed because it is too large

@ -1,97 +1,82 @@
{
"name": "ass",
"version": "0.14.8",
"version": "0.15.0-indev",
"description": "The simple self-hosted ShareX server",
"main": "ass.js",
"main": "dist/backend/app.js",
"type": "module",
"engines": {
"node": ">=16.14.x",
"npm": ">=8.3.x"
"node": "^20"
},
"scripts": {
"start": "node dist/backend/app.js",
"dev": "npm run build && npm start",
"dev-win": "npm run build-skip-options && npm run start",
"build": "NODE_OPTIONS=\"--max-old-space-size=1024\" tsc",
"build-skip-options": "tsc",
"start": "node dist/ass.js",
"setup": "node dist/setup.js",
"metrics": "node dist/metrics.js",
"engine-check": "node dist/checkEngine.js",
"prestart": "npm run engine-check",
"presetup": "npm run engine-check",
"purge": "node dist/purge.js",
"docker-logs": "docker-compose logs -f --tail=50 --no-log-prefix ass",
"docker-update": "git pull && npm run docker-uplite",
"docker-uplite": "docker-compose up --force-recreate --build -d && docker image prune -f",
"docker-upfull": "npm run docker-update && npm run docker-resetup",
"docker-resetup": "docker-compose exec ass npm run setup && docker-compose restart",
"cli-setpassword": "node dist/tools/script.setpassword.js",
"cli-testpassword": "node dist/tools/script.testpassword.js",
"cli-adduser": "node dist/tools/script.adduser.js"
"dev:docs": "wrangler pages dev --proxy 5173 -- npm run vp:dev",
"build": "rm -dr dist/ ; npm run build:backend && npm run build:frontend && npm run build:fix-frontend",
"build:backend": "tsc -p backend/",
"build:frontend": "tsc -p frontend/",
"build:fix-frontend": "node common/fix-frontend-js.js",
"build:docs": "npm run vp:build && npm run build:compose-redir",
"build:compose-redir": "echo \"/compose.yaml https://raw.githubusercontent.com/tycrek/ass/dev/0.15.0/compose.yaml 302\" > ./docs/.vitepress/dist/_redirects",
"vp:dev": "vitepress dev docs",
"vp:build": "vitepress build docs",
"vp:preview": "vitepress preview docs"
},
"repository": "github:tycrek/ass",
"keywords": [
"sharex",
"sharex-server"
],
"author": "tycrek <t@tycrek.com> (https://tycrek.com/)",
"author": "tycrek <sylvie@tycrek.com> (https://tycrek.com/)",
"license": "ISC",
"bugs": "https://github.com/tycrek/ass/issues",
"homepage": "https://github.com/tycrek/ass#readme",
"funding": {
"type": "patreon",
"url": "https://patreon.com/tycrek"
},
"dependencies": {
"@aws-sdk/client-s3": "^3.465.0",
"@shoelace-style/shoelace": "^2.12.0",
"@tinycreek/postcss-font-magician": "^4.2.0",
"@tsconfig/node16": "^1.0.1",
"@tsconfig/node20": "^20.1.2",
"@tycrek/discord-hookr": "^0.1.0",
"@tycrek/express-postcss": "^0.4.1",
"@tycrek/joint": "^1.0.0-1",
"@tycrek/log": "^0.7.1",
"@tycrek/papito": "^0.3.4",
"@tycrek/joint": "1.0.0-1",
"@tycrek/log": "^0.7.5",
"@xoi/gps-metadata-remover": "^1.1.2",
"any-shell-escape": "^0.1.1",
"autoprefixer": "^10.4.16",
"aws-sdk": "^2.1467.0",
"axios": "^1.6.0",
"axios": "^1.6.2",
"bcrypt": "^5.1.1",
"chalk": "^4.1.2",
"check-node-version": "^4.2.1",
"crypto-random-string": "3.3.1",
"cssnano": "^6.0.1",
"escape-html": "^1.0.3",
"express": "^4.18.2",
"express-busboy": "^10.1.0",
"express-rate-limit": "^7.1.5",
"express-session": "^1.17.3",
"ffmpeg-static": "^5.2.0",
"fs-extra": "^11.1.1",
"helmet": "^7.0.0",
"luxon": "^3.4.3",
"nanoid": "^3.3.4",
"node-fetch": "^2.6.7",
"node-vibrant": "^3.2.1-alpha.1",
"prompt": "^1.3.0",
"fs-extra": "^11.2.0",
"luxon": "^3.4.4",
"memorystore": "^1.6.7",
"mysql2": "^3.6.5",
"node-vibrant": "^3.1.6",
"pg": "^8.11.3",
"pug": "^3.0.2",
"sanitize-filename": "^1.6.3",
"sharp": "^0.32.6",
"stream-to-array": "^2.3.0",
"tailwindcss": "^3.3.3",
"typescript": "^4.9.5",
"uuid": "^8.3.2"
"shoelace-fontawesome-pug": "^6.4.3",
"shoelace-pug-loader": "^2.11.0",
"tailwindcss": "^3.3.6",
"typescript": "^5.3.2",
"william.js": "^1.3.1"
},
"devDependencies": {
"@types/bcrypt": "^5.0.0",
"@types/escape-html": "^1.0.1",
"@types/express": "^4.17.13",
"@types/express-busboy": "^8.0.0",
"@types/ffmpeg-static": "^3.0.0",
"@types/fs-extra": "^9.0.12",
"@types/luxon": "^3.3.0",
"@types/marked": "^3.0.0",
"@types/node": "^16.9.0",
"@types/node-fetch": "^2.5.12",
"@types/sharp": "^0.30.2",
"@types/stream-to-array": "^2.3.0",
"@types/uuid": "^8.3.1",
"@types/ws": "^7.4.7"
"@types/bcrypt": "^5.0.2",
"@types/express": "^4.17.21",
"@types/express-busboy": "^8.0.3",
"@types/express-session": "^1.17.10",
"@types/ffmpeg-static": "^3.0.3",
"@types/fs-extra": "^11.0.4",
"@types/luxon": "^3.3.6",
"@types/node": "^20.10.3",
"@types/pg": "^8.10.9",
"vitepress": "1.0.0-rc.31",
"vue": "^3.3.10",
"wrangler": "^3.18.0"
}
}

File diff suppressed because it is too large

@ -1,133 +0,0 @@
import { ErrWrap } from './types/definitions';
import { Config, MagicNumbers, Package } from 'ass-json';
//#region Imports
import fs from 'fs-extra';
import express, { Request, Response, json as BodyParserJson } from 'express';
import { nofavicon } from '@tycrek/joint';
import { epcss } from '@tycrek/express-postcss';
import tailwindcss from 'tailwindcss';
import helmet from 'helmet';
import { path, log, getTrueHttp, getTrueDomain } from './utils';
import { onStart as ApiOnStart } from './routers/api';
//#endregion
//#region Setup - Run first time setup if using Docker (pseudo-process, setup will be run with docker exec)
import { doSetup } from './setup';
const configPath = path('config.json');
if (!fs.existsSync(configPath) || fs.readFileSync(configPath).toString().length === 0) {
doSetup();
// @ts-ignore
return;
}
//#endregion
// Load the JSON
const { host, port, useSsl, isProxied, s3enabled, frontendName, diskFilePath }: Config = fs.readJsonSync(path('config.json'));
const { CODE_INTERNAL_SERVER_ERROR }: MagicNumbers = fs.readJsonSync(path('MagicNumbers.json'));
const { name, version, homepage }: Package = fs.readJsonSync(path('package.json'));
//#region Local imports
import uploadRouter from './routers/upload';
import resourceRouter from './routers/resource';
//#endregion
// Welcome :D
log.blank().info(`* ${name} v${version} *`).blank();
//#region Variables, module setup
const app = express();
const ROUTERS = {
upload: uploadRouter,
resource: resourceRouter
};
// Read users and data
import { onStart as AuthOnStart, users } from './auth';
import { onStart as DataOnStart, data } from './data';
//#endregion
// Create thumbnails directory
fs.ensureDirSync(path(diskFilePath, 'thumbnails'));
// Enable/disable Express features
app.enable('case sensitive routing');
app.disable('x-powered-by');
// Set Express variables
app.set('trust proxy', isProxied);
app.set('view engine', 'pug');
// Express logger middleware
// app.use(log.middleware());
// Body parser for API POST requests
// (I really don't like this being top level but it does not work inside the API Router as of 2022-12-24)
app.use(BodyParserJson());
// Helmet security middleware
app.use(helmet.noSniff());
app.use(helmet.ieNoOpen());
app.use(helmet.xssFilter());
app.use(helmet.referrerPolicy());
app.use(helmet.dnsPrefetchControl());
useSsl && app.use(helmet.hsts({ preload: true })); // skipcq: JS-0093
// Don't process favicon requests
// todo: this doesn't actually return a 204 properly, it returns a 404
app.use(nofavicon.none());
// Use custom index, otherwise render README.md
type ASS_INDEX_TYPE = 'html' | 'js' | undefined;
const ASS_INDEX: ASS_INDEX_TYPE = fs.existsSync(path('share', 'index.html')) ? 'html' : fs.existsSync(path('share', 'index.js')) ? 'js' : undefined;
app.get('/', (req, res, next) =>
ASS_INDEX === 'html' ? res.sendFile(path('share', 'index.html')) :
ASS_INDEX === 'js' ? require(path('share', 'index.js'))(req, res, next) : // skipcq: JS-0359
res.redirect(homepage))
// Set up custom frontend
const ASS_FRONTEND = { enabled: false }; // ! Disabled in 0.14.7
// Upload router (has to come after custom frontends as express-busboy interferes with all POST calls)
app.use('/', ROUTERS.upload);
// API
app.use('/api', ApiOnStart());
// CSS
app.use('/css', epcss({
cssPath: path('tailwind.css'),
plugins: [
tailwindcss,
require('autoprefixer')(),
require('cssnano')(),
require('@tinycreek/postcss-font-magician')(),
],
warn: (warning: Error) => log.warn('PostCSS', warning.toString())
}));
// '/:resourceId' always needs to be LAST since it's a catch-all route
app.use('/:resourceId', (req, _res, next) => (req.resourceId = req.params.resourceId, next()), ROUTERS.resource); // skipcq: JS-0086, JS-0090
// Error handler
app.use((err: ErrWrap, _req: Request, res: Response) => {
log.error(err.message);
console.error(err);
res.sendStatus(CODE_INTERNAL_SERVER_ERROR);
});
(async function start() {
await AuthOnStart();
await DataOnStart();
if (data() == null) setTimeout(start, 100);
else log
.info('Users', `${users.length}`)
.info('Files', `${data().size}`)
.info('Data engine', data().name, data().type)
.info('Frontend', 'disabled')
.info('Custom index', ASS_INDEX ?? 'disabled')
.blank()
.callback(() => app.listen(port, host, () => log.success('Ready for uploads', `Storing resources ${s3enabled ? 'in S3' : 'on disk'}`)));
})();

@ -1,22 +0,0 @@
const check = require("check-node-version");
const ENGINES = require('../package.json').engines;
const { TLog } = require('@tycrek/log');
const logger = new TLog();
function doCheck() {
return new Promise((resolve, reject) =>
check(ENGINES, (err, { isSatisfied: allSatisfied, versions }) =>
err ? reject(err) : allSatisfied ? resolve('Node & npm version requirements satisfied!')
: reject(Object.entries(versions)
.filter(([, { isSatisfied }]) => (!isSatisfied))
.map(([packageName, { version: current, wanted: minimum }]) =>
`\nInvalid ${packageName} version!\n- Current: ${current}\n- Required: ${minimum}`)
.join('')
.concat('\nPlease update to continue!'))));
}
if (require.main !== module) module.exports = doCheck;
else doCheck()
.then((result) => logger.comment(`Wanted: ${ENGINES.node} (npm ${ENGINES.npm})`)/* .node() */.success(result))
.catch((err) => logger.error(err) && process.exit(1));
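// Usage sketch (path assumed from the package.json scripts): when required as a module,
//   require('./checkEngine')().then(console.log).catch(console.error);
// resolves with a success message, or rejects listing each unsatisfied engine version.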

@ -1,25 +0,0 @@
/**
* Used for global data management
*/
import fs from 'fs-extra';
import { Config } from 'ass-json';
import { JsonDataEngine } from '@tycrek/papito'
let theData: any;
/**
* Called by ass.ts on startup
* @since v0.14.2
*/
export const onStart = () => new Promise((resolve, reject) => {
// Actual data engine
const { dataEngine }: Config = fs.readJsonSync('config.json');
import(dataEngine)
.then(({ _ENGINE_ }) => theData = _ENGINE_(new JsonDataEngine()))
.then(resolve)
.catch(reject);
});
// Export a self-calling const function returning the data
export const data = ((): any => theData);
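// Note: data() returns undefined until onStart() has resolved; ass.ts polls for this at
// startup (retrying via setTimeout) before reading the file count.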

@ -1,23 +0,0 @@
import fs from 'fs-extra';
// Don't trigger circular dependency during setup
if (require !== undefined && !require?.main?.filename.includes('setup.js'))
var MIN_LENGTH = require('../setup').gfyIdSize; // skipcq: JS-0239, JS-0102
function getWord(list: string[], delim = '') {
return list[Math.floor(Math.random() * list.length)].concat(delim);
}
function genString(count = MIN_LENGTH) {
// For some reason these 3 lines MUST be inside the function
const { path } = require('../utils');
const adjectives = fs.readFileSync(path('./gfycat/adjectives.txt')).toString().split('\n');
const animals = fs.readFileSync(path('./gfycat/animals.txt')).toString().split('\n');
let gfycat = '';
for (let i = 0; i < (count < MIN_LENGTH ? MIN_LENGTH : count); i++)
gfycat += getWord(adjectives, '-');
return gfycat.concat(getWord(animals));
};
export default ({ gfyLength }: { gfyLength: number }) => genString(gfyLength);
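// Example (illustrative): produces gfycat-style IDs such as 'unsung-discrete-grub', i.e.
// hyphen-separated adjectives (at least MIN_LENGTH of them) followed by an animal.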

@ -1,2 +0,0 @@
import { randomBytes } from 'crypto';
export default (length: number, charset: string[]): string => [...randomBytes(length)].map((byte) => charset[Number(byte) % charset.length]).join('').slice(1).concat(charset[0]);
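// Usage sketch: lengthGen(32, '0123456789abcdef'.split('')) returns a 32-character string
// drawn from the charset; note that the final character is always charset[0].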

@ -1,2 +0,0 @@
import { nanoid } from 'nanoid';
export default ({ length }: { length?: number }) => nanoid(length);

@ -1,2 +0,0 @@
import cryptoRandomString from 'crypto-random-string';
export default ({ length }: { length: number }) => cryptoRandomString({ length, type: 'alphanumeric' });

@ -1 +0,0 @@
export default () => `${Date.now()}`;

@ -1,38 +0,0 @@
import { v4 as uuid } from 'uuid';
import fs from 'fs-extra';
import path from 'path';
import randomGen from './random';
import { TLog } from '@tycrek/log';
const log = new TLog();
const MAX_USERNAME = 20;
export default () => uuid().replace(/-/g, '');
module.exports = () => uuid().replace(/-/g, '');
// If directly called on the command line, generate a new token
if (require.main === module) {
const token = module.exports();
const authPath = path.join(process.cwd(), 'auth.json');
let name = '';
fs.readJson(authPath)
.then((auth) => {
// Generate the user
const username = process.argv[2] ? process.argv[2].replace(/[^\da-z_]/gi, '').substring(0, MAX_USERNAME) : randomGen({ length: 20 }); // skipcq: JS-0074
if (!auth.users) auth.users = {};
if (Object.values(auth.users).findIndex((user: any) => user.username === username) !== -1) {
log.error('Username already exists', username);
process.exit(1);
}
auth.users[token] = { username, count: 0 };
name = auth.users[token].username;
fs.writeJsonSync(authPath, auth, { spaces: 4 });
})
.then(() => log
.comment('A new token has been generated and automatically applied.')
.comment('You do not need to restart \'ass\'.')
.success('Your token', token, `username: ${name}`))
.catch(console.error);
}

@ -1,4 +0,0 @@
import lengthGen from './lengthGen';
const zeroWidthChars = ['\u200B', '\u200C', '\u200D', '\u2060'];
export default ({ length }: { length: number }) => lengthGen(length, zeroWidthChars);
export const checkIfZws = (str: string) => str.split('').every(char => zeroWidthChars.includes(char));
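// e.g. checkIfZws('\u200B\u200C') === true, while checkIfZws('abc') === false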

@ -1,16 +0,0 @@
import { FileData } from './types/definitions';
import fs from 'fs-extra';
import crypto from 'crypto';
import toArray from 'stream-to-array';
import { log } from './utils';
/**
* Generates a SHA1 hash for the provided file
*/
export default (file: FileData): Promise<string> =>
new Promise((resolve, reject) =>
toArray((fs.createReadStream(file.path)))
.then((parts: any[]) => Buffer.concat(parts.map((part: any) => (Buffer.isBuffer(part) ? part : Buffer.from(part)))))
.then((buf: Buffer) => crypto.createHash('sha1').update(buf).digest('hex')) // skipcq: JS-D003
.then((hash: string) => log.debug(`Hash for ${file.originalname}`, hash, 'SHA1, hex').callback(() => resolve(hash)))
.catch(reject));
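// For reference: this produces the same digest as `sha1sum <file>` run on the upload's contents.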

@ -1,10 +0,0 @@
import { TLog } from '@tycrek/log';
import { DateTime } from 'luxon';
// Set up logging
const logger = new TLog(process.env.NODE_ENV === 'production' ? 'info' : 'debug')
.setTimestamp({ preset: DateTime.DATETIME_MED });
// todo: re-enable the Express logger
export default logger;

@ -1,65 +0,0 @@
const fs = require('fs-extra');
const path = require('path');
const { s3enabled } = require('../config.json');
const { formatBytes } = require('./utils');
const { bucketSize } = require('./storage');
const { TLog } = require('@tycrek/log');
const log = new TLog({ level: 'debug', timestamp: { enabled: false } });
/**
* Thank you CoPilot for helping write whatever the fuck this is -tycrek, 2022-04-18
*/
function whileWait(expression, timeout = 1000) {
return new Promise(async (resolve, reject) => {
while (expression())
await new Promise((resolve) => setTimeout(resolve, timeout));
resolve();
});
}
module.exports = () => {
const data = require('./data').data;
const { users } = fs.readJsonSync(path.join(process.cwd(), 'auth.json'));
Object.keys(users).forEach((token) => users[token].count = 0);
let totalSize = 0;
let oldSize = 0;
let d = [];
whileWait(() => data() === undefined)
.then(() => data().get())
.then((D) => (d = D.map(([, resource]) => resource)))
.then(() =>
d.forEach(({ token, size }) => {
try {
totalSize += size;
if (token === undefined) oldSize += size; // skipcq: JS-0127
else {
if (!users[token].size) users[token].size = 0;
users[token].size += size;
users[token].count++;
}
} catch (ex) {
// Silently handle missing tokens from dev environment -tycrek
}
}))
.then(() => bucketSize())
.then((s3size) => {
log.info('---- Usage metrics ----')
.blank()
.info('Users', Object.keys(users).length)
.info('Files', Object.keys(d).length)
.info('S3 size', s3enabled ? s3size : '--')
.blank()
.info('Total size', formatBytes(totalSize))
.info('Old files', formatBytes(oldSize))
.blank();
Object.values(users).forEach(({ username, count, size }) => log.info(`- ${username}`, formatBytes(size), `${count} files`));
process.exit(0);
})
.catch(console.error);
}
if (require.main === module) module.exports();

@ -1,26 +0,0 @@
/**
* This strips GPS EXIF data from files
*/
import { removeLocation } from '@xoi/gps-metadata-remover';
import fs from 'fs-extra';
/**
* This strips GPS EXIF data from files using the @xoi/gps-metadata-remover package
* @returns A Promise that resolves to `true` if GPS data was removed, `false` if not
*/
export const removeGPS = (file: string): Promise<boolean> => {
return new Promise((resolve, reject) =>
fs.open(file, 'r+')
.then((fd) => removeLocation(file,
// Read function
(size: number, offset: number): Promise<Buffer> =>
fs.read(fd, Buffer.alloc(size), 0, size, offset)
.then(({ buffer }) => Promise.resolve(buffer)),
// Write function
(val: string, offset: number, enc: BufferEncoding): Promise<void> =>
fs.write(fd, Buffer.alloc(val.length, val, enc), 0, val.length, offset)
.then(() => Promise.resolve())))
.then(resolve)
.catch(reject));
}
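// Usage sketch (hypothetical path): removeGPS('/opt/ass/uploads/photo.jpg')
//   .then((removed) => console.log('GPS data stripped:', removed));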

@ -1,16 +0,0 @@
import { TLog } from '@tycrek/log';
import fs from 'fs-extra';
import path from 'path';
const log = new TLog();
const uploadsPath = path.join(process.cwd(), 'uploads/');
const dataPath = path.join(process.cwd(), 'data.json');
if (fs.existsSync(uploadsPath)) {
fs.removeSync(uploadsPath);
log.success('Deleted', uploadsPath);
}
if (fs.existsSync(dataPath)) {
fs.removeSync(dataPath);
log.success('Deleted', dataPath);
}

@ -1,80 +0,0 @@
import { FileData } from './types/definitions';
import { Config } from 'ass-json';
import fs from 'fs-extra';
import ffmpeg from 'ffmpeg-static';
import sharp from 'sharp';
// @ts-ignore
import shell from 'any-shell-escape';
import { exec } from 'child_process';
import { isProd, path } from './utils';
const { diskFilePath }: Config = fs.readJsonSync(path('config.json'));
// Thumbnail parameters
const THUMBNAIL = {
QUALITY: 75,
WIDTH: 200 * 2,
HEIGHT: 140 * 2,
}
/**
* Builds a safe escaped ffmpeg command
*/
function getCommand(src: String, dest: String) {
return shell([
ffmpeg, '-y',
'-v', (isProd ? 'error' : 'debug'), // Log level
'-i', src, // Input file
'-ss', '00:00:01.000', // Timestamp of frame to grab
'-vf', `scale=${THUMBNAIL.WIDTH}:${THUMBNAIL.HEIGHT}:force_original_aspect_ratio=increase,crop=${THUMBNAIL.WIDTH}:${THUMBNAIL.HEIGHT}`, // Dimensions of output file
'-frames:v', '1', // Number of frames to grab
dest // Output file
]);
}
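// In production this resolves to roughly (paths illustrative):
//   ffmpeg -y -v error -i in.mp4 -ss 00:00:01.000 -vf scale=400:280:force_original_aspect_ratio=increase,crop=400:280 -frames:v 1 out.jpg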
/**
* Builds a thumbnail filename
*/
function getNewName(oldName: String) {
return oldName.concat('.thumbnail.jpg');
}
/**
* Builds a path to the thumbnails
*/
function getNewNamePath(oldName: String) {
return path(diskFilePath, 'thumbnails/', getNewName(oldName));
}
/**
* Extracts an image from a video file to use as a thumbnail, using ffmpeg
*/
function getVideoThumbnail(file: FileData) {
return new Promise((resolve: Function, reject: Function) => exec(
getCommand(file.path, getNewNamePath(file.randomId)),
// @ts-ignore
(err: Error) => (err ? reject(err) : resolve())
));
}
/**
* Generates a thumbnail for the provided image
*/
function getImageThumbnail(file: FileData) {
return new Promise((resolve, reject) =>
sharp(file.path)
.resize(THUMBNAIL.WIDTH, THUMBNAIL.HEIGHT, { kernel: 'cubic' })
.jpeg({ quality: THUMBNAIL.QUALITY })
.toFile(getNewNamePath(file.randomId))
.then(resolve)
.catch(reject));
}
/**
* Generates a thumbnail
*/
export default (file: FileData): Promise<string> =>
new Promise((resolve, reject) =>
(file.is.video ? getVideoThumbnail : (file.is.image && !file.mimetype.includes('webp')) ? getImageThumbnail : () => Promise.resolve())(file)
.then(() => resolve((file.is.video || file.is.image) ? getNewName(file.randomId) : file.is.audio ? 'views/ass-audio-icon.png' : 'views/ass-file-icon.png'))
.catch(reject));

@ -1,29 +0,0 @@
import path from 'path';
import fs from 'fs-extra';
import axios from 'axios';
import logger from '../logger';
import { User } from '../types/auth';
// Port from config.json
const { port } = fs.readJsonSync(path.join(process.cwd(), 'config.json'));
// CLI key from auth.json
const { cliKey } = fs.readJsonSync(path.join(process.cwd(), 'auth.json'));
if (process.argv.length < 4) {
logger.error('Missing username or password');
logger.error('Usage: node script.adduser.js <username> <password> [admin] [meta]');
process.exit(1);
} else {
const username = process.argv[2];
const password = process.argv[3];
const admin = process.argv[4] ? process.argv[4].toLowerCase() === 'true' : false;
const meta = process.argv[5] ? JSON.parse(process.argv[5]) : {};
axios.post(`http://localhost:${port}/api/user`, { username, password, admin, meta }, { headers: { 'Authorization': cliKey } })
.then((response) => {
const user = response.data as User;
logger.info('User created', `${username} (${user.unid})`, `token: ${user.token}`).callback(() => process.exit(0))
})
.catch((err) => logger.error(err).callback(() => process.exit(1)));
}

@ -1,19 +0,0 @@
import logger from '../logger';
import { onStart, users, setUserPassword } from '../auth';
if (process.argv.length < 4) {
logger.error('Missing username/unid or password');
process.exit(1);
} else {
const id = process.argv[2];
const password = process.argv[3];
onStart(process.argv[4] || 'auth.json')
.then(() => {
const user = users.find((user) => user.unid === id || user.username === id);
if (!user) throw new Error('User not found');
else return setUserPassword(user.unid, password);
})
.then(() => logger.info('Password changed successfully').callback(() => process.exit(0)))
.catch((err) => logger.error(err).callback(() => process.exit(1)));
}

@ -1,20 +0,0 @@
import logger from '../logger';
import { onStart, users } from '../auth';
import { compare } from 'bcrypt';
if (process.argv.length < 4) {
logger.error('Missing username/unid or password');
process.exit(1);
} else {
const id = process.argv[2];
const password = process.argv[3];
onStart(process.argv[4] || 'auth.json')
.then(() => {
const user = users.find((user) => user.unid === id || user.username === id);
if (!user) throw new Error('User not found');
else return compare(password, user.passhash);
})
.then((result) => logger.info('Matches', `${result}`).callback(() => process.exit(0)))
.catch((err) => logger.error(err).callback(() => process.exit(1)));
}

@ -1,5 +1,3 @@
import { Request, Response } from 'express';
declare global {
namespace Express {
interface Request {

@ -1,5 +0,0 @@
declare module './setup' {
export function doSetup(): void;
}
declare module '@tycrek/papito';
declare module '@skynetlabs/skynet-nodejs';

@ -1,30 +1,4 @@
import { Config } from 'ass-json';
import { FileData } from './types/definitions';
import fs from 'fs-extra';
import Path from 'path';
import fetch from 'node-fetch';
import sanitize from 'sanitize-filename';
import { DateTime } from 'luxon';
import token from './generators/token';
import zwsGen from './generators/zws';
import randomGen from './generators/random';
import gfyGen from './generators/gfycat';
import tsGen from './generators/timestamp';
import logger from './logger';
import { Request } from 'express';
import { isProd as ip } from '@tycrek/joint';
const { HTTP, HTTPS, KILOBYTES } = require('../MagicNumbers.json');
// Catch config.json not existing when running setup script
try {
// todo: fix this
const configPath = Path.join(process.cwd(), 'config.json');
if (!fs.existsSync(configPath)) throw new Error('Config file not found');
var { useSsl, port, domain, isProxied, diskFilePath, s3bucket, s3endpoint, s3usePathStyle }: Config = fs.readJsonSync(configPath);
} catch (ex) {
// @ts-ignore
if (ex.code !== 'MODULE_NOT_FOUND' || !ex.toString().includes('Unexpected end')) console.error(ex);
}
const { HTTP, HTTPS } = require('../MagicNumbers.json');
export function getTrueHttp() {
return ('http').concat(useSsl ? 's' : '').concat('://');
@ -42,88 +16,13 @@ export function getDirectUrl(resourceId: string) {
return `${getTrueHttp()}${getTrueDomain()}/${resourceId}/direct`;
}
export function randomHexColour() { // From: https://www.geeksforgeeks.org/javascript-generate-random-hex-codes-color/
const letters = '0123456789ABCDEF';
let colour = '#';
for (let i = 0; i < 6; i++) // skipcq: JS-0074
colour += letters[(Math.floor(Math.random() * letters.length))];
return colour;
}
export function getResourceColor(colorValue: string, vibrantValue: string) {
return (!colorValue || colorValue === '&vibrant') ? vibrantValue : colorValue === '&random' ? randomHexColour() : colorValue;
}
export function formatTimestamp(timestamp: number, timeoffset: string) {
return DateTime.fromMillis(timestamp).setZone(timeoffset).toLocaleString(DateTime.DATETIME_MED);
}
export function formatBytes(bytes: number, decimals = 2) { // skipcq: JS-0074
if (bytes === 0) return '0 Bytes';
const sizes = ['Bytes', 'KB', 'MB', 'GB', 'TB', 'PB', 'EB', 'ZB', 'YB'];
const i = Math.floor(Math.log(bytes) / Math.log(KILOBYTES));
return parseFloat((bytes / Math.pow(KILOBYTES, i)).toFixed(decimals < 0 ? 0 : decimals)).toString().concat(` ${sizes[i]}`);
}
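// e.g. formatBytes(0) === '0 Bytes'; assuming KILOBYTES is 1024, formatBytes(1536) === '1.5 KB'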
export function replaceholder(data: string, size: number, timestamp: number, timeoffset: string, originalname: string) {
return data
.replace(/&size/g, formatBytes(size))
.replace(/&filename/g, originalname)
.replace(/&timestamp/g, formatTimestamp(timestamp, timeoffset));
}
const idModes = {
zws: 'zws', // Zero-width spaces (see: https://zws.im/)
og: 'original', // Use original uploaded filename
r: 'random', // Use a randomly generated ID with a mixed-case alphanumeric character set
gfy: 'gfycat', // Gfycat-style ID's (https://gfycat.com/unsungdiscretegrub)
ts: 'timestamp', // Timestamp-based ID's
};
const GENERATORS = new Map();
GENERATORS.set(idModes.zws, zwsGen);
GENERATORS.set(idModes.r, randomGen);
GENERATORS.set(idModes.gfy, gfyGen);
GENERATORS.set(idModes.ts, tsGen);
export function generateId(mode: string, length: number, gfyLength: number, originalName: string) {
return (GENERATORS.has(mode) ? GENERATORS.get(mode)({ length, gfyLength }) : originalName);
}
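// e.g. generateId('zws', 32, 3, 'photo.png') returns 32 zero-width characters, while an
// unregistered mode such as 'original' falls through to the uploaded filename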
// Set up pathing
export const path = (...paths: string[]) => Path.join(process.cwd(), ...paths);
export const isProd = ip();
module.exports = {
path,
getTrueHttp,
getTrueDomain,
getS3url,
getDirectUrl,
getResourceColor,
formatTimestamp,
formatBytes,
replaceholder,
randomHexColour,
sanitize,
renameFile: (req: Request, newName: string) => new Promise((resolve: Function, reject) => {
try {
const paths = [req.file.destination, newName];
fs.rename(path(req.file.path), path(...paths));
req.file.path = Path.join(...paths);
resolve();
} catch (err) {
reject(err);
}
}),
generateToken: () => token(),
generateId,
downloadTempS3: (file: FileData) => new Promise((resolve: Function, reject) =>
fetch(getS3url(file.randomId, file.ext))
.then((f2) => f2.body!.pipe(fs.createWriteStream(Path.join(__dirname, diskFilePath, sanitize(file.originalname))).on('close', () => resolve())))
.catch(reject)),
}
export const log = logger;
/**
* @type {TLog}
*/
module.exports.log = logger;

@ -1,26 +0,0 @@
import { FileData } from './types/definitions';
import Vibrant from 'node-vibrant';
import sharp from 'sharp';
import { randomHexColour } from './utils';
// Vibrant parameters
const COLOR_COUNT = 256;
const QUALITY = 3;
/**
* Extracts a prominent colour from the provided image file
*/
function getVibrant(file: FileData, resolve: Function, reject: Function) {
sharp(file.path).png().toBuffer()
.then((data) => Vibrant.from(data)
.maxColorCount(COLOR_COUNT)
.quality(QUALITY)
.getPalette())
.then((palettes) => resolve(palettes[Object.keys(palettes).sort((a, b) => palettes[b]!.population - palettes[a]!.population)[0]]!.hex))
.catch((err) => reject(err));
}
/**
* Extracts a colour from an image file. Returns a random Hex value if provided file is a video
*/
export default (file: FileData): Promise<string> => new Promise((resolve, reject) => (!file.is.image || file.mimetype.includes('webp')) ? resolve(randomHexColour()) : getVibrant(file, resolve, reject)); // skipcq: JS-0229

@ -5,29 +5,29 @@
@layer base {}
@layer components {
.res-media {
@apply border-l-4 rounded max-h-half-port;
.setup-text-section-header {
@apply text-2xl font-bold font-mono;
}
.link {
@apply no-underline hover_no-underline active_no-underline visited_no-underline
/* regular, visited */
text-link-primary visited_text-link-primary
border-b-2 visited_border-b-2
border-transparent visited_border-transparent
rounded-sm visited_rounded-sm
.setup-text-item-title {
@apply text-stone-300;
}
/* hover */
hover_text-link-hover
hover_border-hover
.setup-text-optional {
@apply text-stone-400 italic;
}
/* active */
active_text-link-active
.setup-panel {
@apply flex flex-col pt-4 w-full max-w-xs;
}
/* transitions */
ease-linear duration-150 transition-all;
.setup-panel>sl-input {
@apply mb-4;
}
}
@layer utilities {}
@layer utilities {
.flex-center {
@apply items-center justify-center;
}
}

@ -1,20 +0,0 @@
{
"extends": "@tsconfig/node16/tsconfig.json",
"compilerOptions": {
"outDir": "./dist",
"target": "ES2022",
"lib": [
"ES2022",
"DOM"
],
"allowJs": true,
"downlevelIteration": true
},
"include": [
"src/**/*.js",
"src/**/*.ts"
],
"exclude": [
"ass-x"
]
}

(two binary image diffs suppressed: sizes unchanged at 6.2 KiB and 11 KiB)

@ -0,0 +1,24 @@
doctype html
html.dark.sl-theme-dark(lang='en')
head
meta(charset='UTF-8')
meta(name='viewport', content='width=device-width, initial-scale=1.0')
block title
title ass 🍑
meta(name='theme-color' content='black')
link(rel='stylesheet', href='/.css')
//- Shoelace/Font Awesome mixins
include ../node_modules/shoelace-fontawesome-pug/sl-fa-mixin.pug
include ../node_modules/shoelace-pug-loader/loader.pug
+slTheme('dark')
+slAuto
body.w-screen.h-screen.flex.flex-col
//- Header
.w-full.border-b.border-stone-500.flex.justify-center.items-center.py-3
h1.text-4xl.font-bold.font-mono: block section
span [section]
//- Centering width-fixer
.w-full.flex.justify-center.h-full
.w-full.md_max-w-xl.px-4.pt-16.h-full: block content

@ -0,0 +1,9 @@
extends _base_
block title
title ass admin 🍑
block section
span admin
block content
h1.text-3xl Coming soon.
script(src='/admin/ui.js')

@ -0,0 +1,5 @@
extends _base_
block section
span ass
block content
h1.text-3xl Welcome to ass #{version}, a ShareX server.

@ -0,0 +1,11 @@
extends _base_
block section
span login
block content
.flex.flex-col.flex-center.h-full: .setup-panel
h3 Username
sl-input#login-username(type='text' placeholder='username' clearable): sl-icon(slot='prefix' name='fas-user' library='fa')
h3 Password
sl-input#login-password(type='password' placeholder='password' clearable): sl-icon(slot='prefix' name='fas-lock' library='fa')
sl-button.mt-4#login-submit(type='primary' submit) Login
script(src='/login/ui.js')

@ -0,0 +1,101 @@
extends _base_
block title
title ass setup 🍑
block section
span ass setup
block content
//- Setup panel
.flex.flex-col.items-center
p.text-lg.mb-4 Welcome to ass, your new personal file upload server!
//- * Base config
h2.setup-text-section-header.mt-12 Upload configuration
.setup-panel
h3.setup-text-item-title Uploads directory
sl-input#uploads-dir(type='text' placeholder='/opt/ass/uploads' clearable): sl-icon(slot='prefix' name='fas-folders' library='fa')
h3.setup-text-item-title ID type
sl-input#uploads-idtype(type='text' placeholder='random'): sl-icon(slot='prefix' name='fas-input-text' library='fa')
h3.setup-text-item-title ID size
sl-input#uploads-idsize(type='number' placeholder='8'): sl-icon(slot='prefix' name='fas-hashtag' library='fa')
h3.setup-text-item-title Gfycat size
sl-input#uploads-gfysize(type='number' placeholder='3'): sl-icon(slot='prefix' name='fas-cat' library='fa')
h3.setup-text-item-title Maximum file size (MB)
sl-input#uploads-filesize(type='number' placeholder='50'): sl-icon(slot='prefix' name='fas-file' library='fa')
//- * Admin User
h2.setup-text-section-header.mt-4 Admin User
.setup-panel
h3.setup-text-item-title Username
sl-input#user-username(type='text' placeholder='admin' clearable): sl-icon(slot='prefix' name='fas-user' library='fa')
h3.setup-text-item-title Password
sl-input#user-password(type='password' placeholder='the-most-secure' clearable): sl-icon(slot='prefix' name='fas-lock' library='fa')
//- * Database
h2.setup-text-section-header.mt-4 Database
.setup-panel
sl-tab-group
//- * JSON
sl-tab#json-tab(slot='nav' panel='json') JSON
sl-tab-panel(name='json')
| you all good!
//- * MySQL
sl-tab#mysql-tab(slot='nav' panel='mysql') MySQL
sl-tab-panel(name='mysql')
h3.setup-text-item-title Host
sl-input#mysql-host(type='text' placeholder='mysql.example.com' clearable): sl-icon(slot='prefix' name='fas-server' library='fa')
h3.setup-text-item-title Port
sl-input#mysql-port(type='number' placeholder='3306' min='1' max='65535' no-spin-buttons clearable): sl-icon(slot='prefix' name='fas-hashtag' library='fa')
h3.setup-text-item-title User
sl-input#mysql-user(type='text' placeholder='myassql' clearable): sl-icon(slot='prefix' name='fas-user' library='fa')
h3.setup-text-item-title Password
sl-input#mysql-password(type='password' placeholder='super-secure' clearable): sl-icon(slot='prefix' name='fas-lock' library='fa')
h3.setup-text-item-title Database
sl-input#mysql-database(type='text' placeholder='assdb' clearable): sl-icon(slot='prefix' name='fas-database' library='fa')
//- * PostgreSQL
sl-tab#pgsql-tab(slot='nav' panel='pgsql') PostgreSQL
sl-tab-panel(name='pgsql')
h3.setup-text-item-title Host
sl-input#pgsql-host(type='text' placeholder='postgres.example.com' clearable): sl-icon(slot='prefix' name='fas-server' library='fa')
h3.setup-text-item-title Port
sl-input#pgsql-port(type='number' placeholder='5432' min='1' max='65535' no-spin-buttons clearable): sl-icon(slot='prefix' name='fas-hashtag' library='fa')
h3.setup-text-item-title User
sl-input#pgsql-user(type='text' placeholder='posgrassql' clearable): sl-icon(slot='prefix' name='fas-user' library='fa')
h3.setup-text-item-title Password
sl-input#pgsql-password(type='password' placeholder='super-secure' clearable): sl-icon(slot='prefix' name='fas-lock' library='fa')
h3.setup-text-item-title Database
sl-input#pgsql-database(type='text' placeholder='assdb' clearable): sl-icon(slot='prefix' name='fas-database' library='fa')
//- * S3
h2.setup-text-section-header.mt-4 S3 #[span.setup-text-optional optional]
.setup-panel
h3.setup-text-item-title Endpoint
sl-input#s3-endpoint(type='text' placeholder='https://s3.example.com' clearable): sl-icon(slot='prefix' name='fas-server' library='fa')
h3.setup-text-item-title Bucket
sl-input#s3-bucket(type='text' placeholder='ass-bucket' clearable): sl-icon(slot='prefix' name='fas-bucket' library='fa')
h3.setup-text-item-title Access key
sl-input#s3-accessKey(type='text' placeholder='ABCD1234' clearable): sl-icon(slot='prefix' name='fas-key-skeleton' library='fa')
h3.setup-text-item-title Secret key
sl-input#s3-secretKey(type='password' placeholder='EF56GH78IJ90KL12' clearable): sl-icon(slot='prefix' name='fas-user-secret' library='fa')
h3.setup-text-item-title Region #[span.setup-text-optional optional]
sl-input#s3-region(type='text' placeholder='us-east' clearable): sl-icon(slot='prefix' name='fas-map-location-dot' library='fa')
//- * Rate Limits
h2.setup-text-section-header.mt-4 Rate Limits #[span.setup-text-optional optional]
.setup-panel
h3.setup-text-item-title Generic API - Requests
sl-input#ratelimit-api-requests(type='text' placeholder='120' clearable): sl-icon(slot='prefix' name='fas-hashtag' library='fa')
h3.setup-text-item-title Generic API - Seconds per reset
sl-input#ratelimit-api-time(type='text' placeholder='60' clearable): sl-icon(slot='prefix' name='fas-clock' library='fa')
h3.setup-text-item-title Login - Requests
sl-input#ratelimit-login-requests(type='text' placeholder='5' clearable): sl-icon(slot='prefix' name='fas-hashtag' library='fa')
h3.setup-text-item-title Login - Seconds per reset
sl-input#ratelimit-login-time(type='text' placeholder='30' clearable): sl-icon(slot='prefix' name='fas-clock' library='fa')
h3.setup-text-item-title File upload - Requests
sl-input#ratelimit-upload-requests(type='text' placeholder='120' clearable): sl-icon(slot='prefix' name='fas-hashtag' library='fa')
h3.setup-text-item-title File upload - Seconds per reset
sl-input#ratelimit-upload-time(type='text' placeholder='60' clearable): sl-icon(slot='prefix' name='fas-clock' library='fa')
sl-button.w-32.mt-2.self-center#submit(type='primary' submit) Submit
script(src='/setup/ui.js')

@ -0,0 +1,9 @@
extends _base_
block title
title ass user 🍑
block section
span user
block content
h1.text-3xl Coming soon.
script(src='/user/ui.js')