Merge branch 'master' into master (pull/126/head)
Commit 57b184d06d by Josh Moore, committed via GitHub (GPG Key ID: 4AEE18F83AFDEB23)

.github/README.md

@@ -65,6 +65,7 @@ ass was designed with developers in mind. If you are a developer & want somethin
 - Usage metrics
 - Thumbnail support
 - Mimetype blocking
+- Themeable viewer page
 - Basic multi-user support
 - Configurable global upload size limit (per-user coming soon)
 - Custom pluggable frontends using [Git Submodules]
@@ -178,7 +179,7 @@ docker-compose up --force-recreate --build -d && docker image prune -f # && dock
 - `docker-compose` exposes **five** volumes. These volumes let you edit the config, view the auth or data files, or view the `uploads/` folder from your host.
   - `uploads/`
-  - `share/` (for future use)
+  - `share/`
   - `config.json`
   - `auth.json`
   - `data.json`
@@ -290,6 +291,21 @@ Webhooks will show the filename, mimetype, size, upload timestamp, thumbail, & a
 [create a new Webhook]: https://support.discord.com/hc/en-us/articles/228383668-Intro-to-Webhooks
+## Customizing the viewer
+If you want to customize the font or colours of the viewer page, create a file in the `share/` directory called `theme.json`. Available options are:
+| Option | Purpose |
+| ------ | ------- |
+| **`font`** | The font family to use; defaults to `"Josefin Sans"`. Fonts with a space should be surrounded by double quotes. |
+| **`bgPage`** | Background colour for the whole page |
+| **`bgViewer`** | Background colour for the viewer element |
+| **`txtPrimary`** | Primary text colour; this should be your main brand colour. |
+| **`txtSecondary`** | Secondary text colour; this is used for the file details. |
+| **`linkHover`** | Colour of the `hover` effect for links |
+| **`linkActive`** | Colour of the `active` effect for links |
+| **`borderHover`** | Colour of the `hover` effect for borders; this is used for underlining links. |
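A minimal `theme.json` using the options above might look like the following. The colour values here are purely illustrative, not project defaults:

```json
{
	"font": "\"Josefin Sans\"",
	"bgPage": "#212121",
	"bgViewer": "#151515",
	"txtPrimary": "#fd842d",
	"txtSecondary": "#bdbdbd",
	"linkHover": "#fd710d",
	"linkActive": "#dc5b02",
	"borderHover": "#fd710d"
}
```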
 ## Custom index
 By default, ass directs the index route `/` to this README. Follow these steps to use a custom index:
@@ -339,7 +355,9 @@ For hosts who are looking for a reliable, always available storage solution with
 [Amazon S3]: https://en.wikipedia.org/wiki/Amazon_S3
 [Skynet Labs]: https://github.com/SkynetLabs
-## Custom frontends
+## Custom frontends - OUTDATED
+**Please be aware that this section is outdated (marked as of 2022-04-15). It will be updated when I overhaul the frontend system.**
 ass is intended to provide a strong backend for developers to build their own frontends around. [Git Submodules] make it easy to create custom frontends. Submodules are their own projects, which means you are free to build the router however you wish, as long as it exports the required items. A custom frontend is really just an [Express.js router].
@@ -394,6 +412,7 @@ ass has a number of pre-made npm scripts for you to use. **All** of these script
 | `dev` | Chains the `build` and `compile` scripts together. |
 | `setup` | Starts the easy setup process. Should be run after any updates that introduce new config options. |
 | `metrics` | Runs the metrics script. This is a simple script that outputs basic resource statistics. |
+| `purge` | Purges all uploads and data associated with them. This does **not** delete any users, however. |
 | `new-token` | Generates a new API token. Accepts one parameter for specifying a username, like `npm run new-token <username>`. ass automatically detects the new token & reloads it, so there's no need to restart the server. |
 | `engine-check` | Ensures your environment meets the minimum Node & npm version requirements. |
 | `docker-logs` | Alias for `docker-compose logs -f --tail=50 --no-log-prefix ass` |

package-lock.json (generated)

File diff suppressed because it is too large

package.json

@@ -1,6 +1,6 @@
 {
   "name": "ass",
-  "version": "0.9.1",
+  "version": "0.11.0-rc.1",
   "description": "The superior self-hosted ShareX server",
   "main": "ass.js",
   "engines": {
@@ -17,6 +17,7 @@
     "engine-check": "node dist/checkEngine.js",
     "prestart": "npm run engine-check",
     "presetup": "npm run engine-check",
+    "purge": "node dist/purge.js",
     "docker-logs": "docker-compose logs -f --tail=50 --no-log-prefix ass",
     "docker-update": "git pull && npm run docker-uplite",
     "docker-uplite": "docker-compose up --force-recreate --build -d && docker image prune -f",
@@ -40,36 +41,35 @@
     "@skynetlabs/skynet-nodejs": "^2.3.0",
     "@tsconfig/node16": "^1.0.1",
     "@tycrek/express-nofavicon": "^1.0.3",
-    "@tycrek/express-postcss": "^0.1.0",
+    "@tycrek/express-postcss": "^0.2.4",
     "@tycrek/isprod": "^2.0.2",
-    "@tycrek/log": "^0.5.9",
+    "@tycrek/log": "^0.6.0-7",
     "@tycrek/papito": "^0.3.4",
     "any-shell-escape": "^0.1.1",
-    "autoprefixer": "^10.3.7",
+    "autoprefixer": "^10.4.4",
-    "aws-sdk": "^2.1008.0",
+    "aws-sdk": "^2.1115.0",
-    "check-node-version": "^4.1.0",
+    "check-node-version": "^4.2.1",
     "crypto-random-string": "3.3.1",
-    "cssnano": "^5.0.8",
+    "cssnano": "^5.1.7",
     "discord-webhook-node": "^1.1.8",
     "escape-html": "^1.0.3",
-    "express": "^4.17.1",
+    "express": "^4.17.3",
-    "express-busboy": "^8.0.0",
+    "express-busboy": "^8.0.2",
-    "express-rate-limit": "^5.5.0",
     "ffmpeg-static": "^4.4.0",
-    "fs-extra": "^10.0.0",
+    "fs-extra": "^10.0.1",
     "helmet": "^4.6.0",
-    "jimp": "^0.16.1",
-    "luxon": "^2.0.2",
+    "luxon": "^2.3.1",
     "node-fetch": "^2.6.7",
     "node-vibrant": "^3.1.6",
     "postcss-font-magician": "^3.0.0",
-    "prompt": "^1.2.0",
+    "prompt": "^1.3.0",
     "pug": "^3.0.2",
     "sanitize-filename": "^1.6.3",
+    "sharp": "^0.30.3",
     "stream-to-array": "^2.3.0",
     "submodule": "^1.2.1",
-    "tailwindcss": "^3.0.23",
+    "tailwindcss": "^3.0.24",
-    "typescript": "^4.4.4",
+    "typescript": "^4.6.3",
     "uuid": "^8.3.2"
   },
   "devDependencies": {
"devDependencies": { "devDependencies": {
@@ -82,6 +82,7 @@
     "@types/marked": "^3.0.0",
     "@types/node": "^16.9.0",
     "@types/node-fetch": "^2.5.12",
+    "@types/sharp": "^0.30.2",
     "@types/stream-to-array": "^2.3.0",
     "@types/tailwindcss": "^3.0.9",
     "@types/uuid": "^8.3.1",

@@ -1,39 +1,39 @@
-import { AssRequest, AssResponse, ErrWrap } from './definitions';
+import { ErrWrap } from './types/definitions';
+import { Config, MagicNumbers, Package } from 'ass-json';
-let doSetup = null;
-try {
-    // Check if config.json exists
-    require('../config.json');
-} catch (err) {
-    doSetup = require('./setup').doSetup;
-}
+//#region Imports
+import fs from 'fs-extra';
+import express, { Request, Response } from 'express';
+import nofavicon from '@tycrek/express-nofavicon';
+import { epcss } from '@tycrek/express-postcss';
+import tailwindcss from 'tailwindcss';
+import helmet from 'helmet';
+import { path, log, getTrueHttp, getTrueDomain } from './utils';
+//#endregion
-// Run first time setup if using Docker (pseudo-process, setup will be run with docker exec)
-if (doSetup) {
+//#region Setup - Run first time setup if using Docker (pseudo-process, setup will be run with docker exec)
+import { doSetup } from './setup';
+const configPath = path('config.json');
+if (!fs.existsSync(configPath)) {
     doSetup();
     // @ts-ignore
     return;
 }
+//#endregion
-// Load the config
-const { host, port, useSsl, isProxied, s3enabled, frontendName, indexFile, useSia } = require('../config.json');
+// Load the JSON
+const { host, port, useSsl, isProxied, s3enabled, frontendName, indexFile, useSia, diskFilePath }: Config = fs.readJsonSync(path('config.json'));
+const { CODE_INTERNAL_SERVER_ERROR }: MagicNumbers = fs.readJsonSync(path('MagicNumbers.json'));
+const { name, version, homepage }: Package = fs.readJsonSync(path('package.json'));
-//#region Imports
-import fs from 'fs-extra';
-import express from 'express';
-const nofavicon = require('@tycrek/express-nofavicon');
-const epcss = require('@tycrek/express-postcss');
-import tailwindcss from 'tailwindcss';
-import helmet from 'helmet';
+//#region Local imports
 import uploadRouter from './routers/upload';
 import resourceRouter from './routers/resource';
-import { path, log, getTrueHttp, getTrueDomain } from './utils';
-const { CODE_INTERNAL_SERVER_ERROR } = require('../MagicNumbers.json');
-const { name: ASS_NAME, version: ASS_VERSION, homepage } = require('../package.json');
 //#endregion
 // Welcome :D
-log.blank().info(`* ${ASS_NAME} v${ASS_VERSION} *`).blank();
+log.blank().info(`* ${name} v${version} *`).blank();
 //#region Variables, module setup
 const app = express();
@@ -47,6 +47,9 @@ import { users } from './auth';
 import { data } from './data';
 //#endregion
+// Create thumbnails directory
+fs.ensureDirSync(path(diskFilePath, 'thumbnails'));
 // Enable/disable Express features
 app.enable('case sensitive routing');
 app.disable('x-powered-by');
@@ -56,7 +59,7 @@ app.set('trust proxy', isProxied);
 app.set('view engine', 'pug');
 // Express logger middleware
-app.use(log.express(true));
+app.use(log.middleware());
 // Helmet security middleware
 app.use(helmet.noSniff());
@@ -96,17 +99,19 @@ app.use('/css', epcss({
 }));
 // '/:resouceId' always needs to be LAST since it's a catch-all route
-app.use('/:resourceId', (req: AssRequest, _res, next) => (req.resourceId = req.params.resourceId, next()), ROUTERS.resource); // skipcq: JS-0086, JS-0090
+app.use('/:resourceId', (req, _res, next) => (req.resourceId = req.params.resourceId, next()), ROUTERS.resource); // skipcq: JS-0086, JS-0090
 // Error handler
-app.use((err: ErrWrap, _req: AssRequest, res: AssResponse, _next: Function) => log.error(err).err(err).callback(() => res.sendStatus(CODE_INTERNAL_SERVER_ERROR))); // skipcq: JS-0128
+app.use((err: ErrWrap, _req: Request, res: Response) => log.error(err.message).err(err).callback(() => res.sendStatus(CODE_INTERNAL_SERVER_ERROR))); // skipcq: JS-0128
 // Host the server
-log
-    .info('Users', `${Object.keys(users).length}`)
-    .info('Files', `${data.size}`)
-    .info('Data engine', data.name, data.type)
-    .info('Frontend', ASS_FRONTEND.enabled ? ASS_FRONTEND.brand : 'disabled', `${ASS_FRONTEND.enabled ? `${getTrueHttp()}${getTrueDomain()}${ASS_FRONTEND.endpoint}` : ''}`)
-    .info('Custom index', ASS_INDEX_ENABLED ? `enabled` : 'disabled')
-    .blank()
-    .express().Host(app, port, host, () => log.success('Ready for uploads', `Storing resources ${s3enabled ? 'in S3' : useSia ? 'on Sia blockchain' : 'on disk'}`));
+(function start() {
+    if (data() == null) setTimeout(start, 100);
+    else log
+        .info('Users', `${Object.keys(users).length}`)
+        .info('Files', `${data().size}`)
+        .info('Data engine', data().name, data().type)
+        .info('Frontend', ASS_FRONTEND.enabled ? ASS_FRONTEND.brand : 'disabled', `${ASS_FRONTEND.enabled ? `${getTrueHttp()}${getTrueDomain()}${ASS_FRONTEND.endpoint}` : ''}`)
+        .info('Custom index', ASS_INDEX_ENABLED ? `enabled` : 'disabled')
+        .blank()
+        .express()!.Host(app, port, host, () => log.success('Ready for uploads', `Storing resources ${s3enabled ? 'in S3' : useSia ? 'on Sia blockchain' : 'on disk'}`));
+})();

@@ -17,4 +17,4 @@ fs.watch(path('auth.json'), { persistent: false },
             log.info('New token added', Object.keys(users)[Object.keys(users).length - 1] || 'No new token');
         }
     })
-    .catch(log.c.error));
+    .catch(console.error));

@@ -1,7 +1,7 @@
 const check = require("check-node-version");
 const ENGINES = require('../package.json').engines;
-const TLog = require('@tycrek/log');
+const { TLog } = require('@tycrek/log');
 const logger = new TLog();
 function doCheck() {

@@ -2,11 +2,17 @@
  * Used for global data management
  */
-// Old data
-const { JsonDataEngine } = require('@tycrek/papito');
+import fs from 'fs-extra';
+import { Config } from 'ass-json';
+import { JsonDataEngine } from '@tycrek/papito'
+let theData: any;
 // Actual data engine
-const { dataEngine } = require('../config.json');
-const { _ENGINE_ } = require(dataEngine);
+const { dataEngine }: Config = fs.readJsonSync('config.json');
+import(dataEngine)
+    .then(({ _ENGINE_ }) => theData = _ENGINE_(new JsonDataEngine()))
+    .catch(err => console.error(err));
-export const data = _ENGINE_(new JsonDataEngine());
+// Export a self-calling const function returning the data
+export const data = ((): any => theData);
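The change above swaps a directly-exported engine for an accessor function, because a module that loads its engine via a dynamic `import()` cannot hand out the engine synchronously. A simplified sketch of the pattern (names like `theEngine` and `ready` are illustrative, not from the project):

```typescript
// The engine slot starts empty; a dynamic import fills it in later.
let theEngine: { size: number } | undefined;

// Stand-in for the project's `import(dataEngine)` call.
const ready = Promise.resolve().then(() => { theEngine = { size: 3 }; });

// Export an accessor instead of the value: callers always read the
// *current* slot, which stays undefined until the import settles.
const data = () => theEngine;

// Callers must wait (or poll) before dereferencing:
ready.then(() => console.log(data()?.size)); // logs 3 once loaded
```

This is why consumers elsewhere in the diff switch from `data.get(...)` to `data().get(...)` and guard against `data()` being undefined at startup.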

@@ -2,7 +2,7 @@ import { v4 as uuid } from 'uuid';
 import fs from 'fs-extra';
 import path from 'path';
 import randomGen from './random';
-const TLog = require('@tycrek/log');
+import { TLog } from '@tycrek/log';
 const log = new TLog();
 const MAX_USERNAME = 20;
@@ -34,5 +34,5 @@ if (require.main === module) {
         .comment('A new token has been generated and automatically applied.')
         .comment('You do not need to restart \'ass\'.')
         .success('Your token', token, `username: ${name}`))
-        .catch(log.c.error);
+        .catch(console.error);
 }

@@ -1,4 +1,4 @@
-import { FileData } from './definitions';
+import { FileData } from './types/definitions';
 import fs from 'fs-extra';
 import crypto from 'crypto';
 import toArray from 'stream-to-array';
@@ -6,13 +6,11 @@ import { log } from './utils';
 /**
  * Generates a SHA1 hash for the provided file
- * @param {*} file The file to hash
- * @returns The SHA1 hash
  */
 export default (file: FileData): Promise<string> =>
     new Promise((resolve, reject) =>
         toArray((fs.createReadStream(file.path)))
             .then((parts: any[]) => Buffer.concat(parts.map((part: any) => (Buffer.isBuffer(part) ? part : Buffer.from(part)))))
             .then((buf: Buffer) => crypto.createHash('sha1').update(buf).digest('hex')) // skipcq: JS-D003
-            .then((hash: String) => log.debug(`Hash for ${file.originalname}`, hash, 'SHA1, hex').callback(resolve, hash))
+            .then((hash: string) => log.debug(`Hash for ${file.originalname}`, hash, 'SHA1, hex').callback(resolve, hash))
             .catch(reject));
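The hasher above buffers the whole file before digesting it. As a hedged alternative sketch (not what the project ships), the same result can be computed by streaming the file through `crypto.Hash`, which is a Transform stream, keeping memory usage constant regardless of file size:

```typescript
import crypto from 'crypto';
import fs from 'fs';

// Stream the file into the hash instead of concatenating buffers first.
const sha1File = (filePath: string): Promise<string> =>
    new Promise((resolve, reject) => {
        const hash = crypto.createHash('sha1');
        fs.createReadStream(filePath)
            .on('error', reject)                              // surface read errors
            .pipe(hash)
            .on('finish', () => resolve(hash.digest('hex'))); // digest once all bytes are written
    });
```

Usage mirrors the original: `sha1File(file.path).then((hash) => ...)`.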

@@ -1,17 +1,29 @@
-const TLog = require('@tycrek/log');
+import { TLog, DateTimePreset } from '@tycrek/log';
 // Set up logging
 const logger = new TLog({
+    // @ts-ignore
     level: process.env.LOG_LEVEL || (process.env.NODE_ENV === 'production' ? 'info' : 'debug'),
     timestamp: {
         enabled: true,
         colour: 'grey',
-        preset: 'DATETIME_MED'
+        preset: DateTimePreset.DATETIME_MED
-    },
+    }
 });
 // Enable the Express logger
-logger.enable.express({ handle500: false }).debug('Plugin enabled', 'Express');
+logger.enable.express({
+    middleware: {
+        excludePaths: ['favicon.ico'],
+    },
+    trim: {
+        enabled: true,
+        maxLength: 80,
+        delim: ': ',
+    },
+    handle404: true,
+    handle500: false
+}).debug('Plugin enabled', 'Express');
 /**
  * @type {TLog}

@@ -4,9 +4,20 @@ const { s3enabled } = require('../config.json');
 const { formatBytes } = require('./utils');
 const { bucketSize } = require('./storage');
-const TLog = require('@tycrek/log');
+const { TLog } = require('@tycrek/log');
 const log = new TLog({ level: 'debug', timestamp: { enabled: false } });
+/**
+ * Thank you CoPilot for helping write whatever the fuck this is -tycrek, 2022-04-18
+ */
+function whileWait(expression, timeout = 1000) {
+    return new Promise(async (resolve, reject) => {
+        while (expression())
+            await new Promise((resolve) => setTimeout(resolve, timeout));
+        resolve();
+    });
+}
 module.exports = () => {
     const data = require('./data').data;
     const { users } = fs.readJsonSync(path.join(process.cwd(), 'auth.json'));
@@ -16,7 +27,8 @@ module.exports = () => {
     let oldSize = 0;
     let d = [];
-    data.get()
+    whileWait(() => data() === undefined)
+        .then(() => data().get())
         .then((D) => (d = D.map(([, resource]) => resource)))
         .then(() =>
             d.forEach(({ token, size }) => {
@@ -47,7 +59,7 @@ module.exports = () => {
     Object.values(users).forEach(({ username, count, size }) => log.info(`- ${username}`, formatBytes(size), `${count} files`));
     process.exit(0);
 })
-    .catch(console.error);
 }
 if (require.main === module) module.exports();
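The `whileWait` helper added above polls an expression until it turns false, but it uses an `async` promise executor, which is generally considered an antipattern (a rejection inside the executor is silently swallowed). A sketch of the same polling behaviour without that hazard, offered as an alternative rather than the project's actual code:

```typescript
// Poll `expression` every `timeout` ms; resolve once it returns false.
function whileWait(expression: () => boolean, timeout = 1000): Promise<void> {
    const poll = (resolve: () => void) => {
        if (!expression()) resolve();                // condition cleared: done
        else setTimeout(() => poll(resolve), timeout); // still waiting: re-check later
    };
    return new Promise(poll);
}
```

Usage is identical to the diff: `whileWait(() => data() === undefined).then(() => data().get())`.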

@@ -0,0 +1,16 @@
+import { TLog } from '@tycrek/log';
+import fs from 'fs-extra';
+import path from 'path';
+const log = new TLog();
+const uploadsPath = path.join(process.cwd(), 'uploads/');
+const dataPath = path.join(process.cwd(), 'data.json');
+if (fs.existsSync(uploadsPath)) {
+    fs.removeSync(uploadsPath);
+    log.success('Deleted', uploadsPath);
+}
+if (fs.existsSync(dataPath)) {
+    fs.removeSync(dataPath);
+    log.success('Deleted', dataPath);
+}

@@ -1,33 +1,39 @@
-import { FileData, IsPossible, AssRequest, AssResponse } from '../definitions';
+import { FileData, IsPossible } from '../types/definitions';
+import { Config, MagicNumbers } from 'ass-json';
 import fs from 'fs-extra';
 import escape from 'escape-html';
-import fetch, { Response } from 'node-fetch';
+import fetch, { Response as FetchResponse } from 'node-fetch';
+import { Request, Response } from 'express';
 import { deleteS3 } from '../storage';
 import { SkynetDelete, SkynetDownload } from '../skynet';
-const { diskFilePath, s3enabled, viewDirect, useSia } = require('../../config.json');
 import { path, log, getTrueHttp, getTrueDomain, formatBytes, formatTimestamp, getS3url, getDirectUrl, getResourceColor, replaceholder } from '../utils';
-const { CODE_UNAUTHORIZED, CODE_NOT_FOUND, } = require('../../MagicNumbers.json');
+const { diskFilePath, s3enabled, viewDirect, useSia }: Config = fs.readJsonSync(path('config.json'));
+const { CODE_UNAUTHORIZED, CODE_NOT_FOUND, }: MagicNumbers = fs.readJsonSync(path('MagicNumbers.json'));
 import { data } from '../data';
 import { users } from '../auth';
 import express from 'express';
 const router = express.Router();
+let theme = {};
+if (fs.existsSync(path('share/', 'theme.json')))
+    theme = fs.readJsonSync(path('share/', 'theme.json'));
 // Middleware for parsing the resource ID and handling 404
-router.use((req: AssRequest, res: AssResponse, next) => {
+router.use((req: Request, res: Response, next) => {
     // Parse the resource ID
     req.ass = { resourceId: escape(req.resourceId || '').split('.')[0] };
     // If the ID is invalid, return 404. Otherwise, continue normally
-    data.has(req.ass.resourceId)
+    data().has(req.ass.resourceId)
         .then((has: boolean) => has ? next() : res.sendStatus(CODE_NOT_FOUND)) // skipcq: JS-0229
         .catch(next);
 });
 // View file
-router.get('/', (req: AssRequest, res: AssResponse, next) => data.get(req.ass?.resourceId).then((fileData: FileData) => {
+router.get('/', (req: Request, res: Response, next) => data().get(req.ass.resourceId).then((fileData: FileData) => {
-    const resourceId = req.ass!.resourceId;
+    const resourceId = req.ass.resourceId;
     // Build OpenGraph meta tags
     const og = fileData.opengraph, ogs = [''];
@@ -55,19 +61,21 @@ router.get('/', (req: AssRequest, res: AssResponse, next) => data.get(req.ass?.r
         ogtype: fileData.is.video ? 'video.other' : fileData.is.image ? 'image' : 'website',
         urlType: `og:${fileData.is.video ? 'video' : fileData.is.audio ? 'audio' : 'image'}`,
         opengraph: replaceholder(ogs.join('\n'), fileData.size, fileData.timestamp, fileData.timeoffset, fileData.originalname),
-        viewDirect
+        viewDirect,
+        //@ts-ignore
+        showAd: theme.showAd ?? true,
     });
 }).catch(next));
 // Direct resource
-router.get('/direct*', (req: AssRequest, res: AssResponse, next) => data.get(req.ass?.resourceId).then((fileData: FileData) => {
+router.get('/direct*', (req: Request, res: Response, next) => data().get(req.ass.resourceId).then((fileData: FileData) => {
     // Send file as an attachement for downloads
     if (req.query.download)
         res.header('Content-Disposition', `attachment; filename="${fileData.originalname}"`);
     // Return the file differently depending on what storage option was used
     const uploaders = {
-        s3: () => fetch(getS3url(fileData.randomId, fileData.ext)).then((file: Response) => {
+        s3: () => fetch(getS3url(fileData.randomId, fileData.ext)).then((file: FetchResponse) => {
             file.headers.forEach((value, header) => res.setHeader(header, value));
             file.body?.pipe(res);
         }),
@@ -86,8 +94,8 @@ router.get('/direct*', (req: AssRequest, res: AssResponse, next) => data.get(req
 }).catch(next));
 // Thumbnail response
-router.get('/thumbnail', (req: AssRequest, res: AssResponse, next) =>
-    data.get(req.ass?.resourceId)
+router.get('/thumbnail', (req: Request, res: Response, next) =>
+    data().get(req.ass.resourceId)
         .then(({ is, thumbnail }: { is: IsPossible, thumbnail: string }) => fs.readFile((!is || (is.image || is.video)) ? path(diskFilePath, 'thumbnails/', thumbnail) : is.audio ? 'views/ass-audio-icon.png' : 'views/ass-file-icon.png'))
         .then((fileData: Buffer) => res.type('jpg').send(fileData))
         .catch(next));
@@ -95,8 +103,8 @@ router.get('/thumbnail', (req: AssRequest, res: AssResponse, next) =>
 // oEmbed response for clickable authors/providers
 // https://oembed.com/
 // https://old.reddit.com/r/discordapp/comments/82p8i6/a_basic_tutorial_on_how_to_get_the_most_out_of/
-router.get('/oembed', (req: AssRequest, res: AssResponse, next) =>
-    data.get(req.ass?.resourceId)
+router.get('/oembed', (req: Request, res: Response, next) =>
+    data().get(req.ass.resourceId)
         .then((fileData: FileData) =>
             res.type('json').send({
                 version: '1.0',
@@ -113,9 +121,9 @@ router.get('/oembed', (req: AssRequest, res: AssResponse, next) =>
         .catch(next));
 // Delete file
-router.get('/delete/:deleteId', (req: AssRequest, res: AssResponse, next) => {
+router.get('/delete/:deleteId', (req: Request, res: Response, next) => {
     let oldName: string, oldType: string; // skipcq: JS-0119
-    data.get(req.ass?.resourceId)
+    data().get(req.ass.resourceId)
         .then((fileData: FileData) => {
             // Extract info for logs
             oldName = fileData.originalname;
@@ -133,7 +141,7 @@ router.get('/delete/:deleteId', (req: AssRequest, res: AssResponse, next) => {
             (!fileData.is || (fileData.is.image || fileData.is.video)) && fs.existsSync(path(diskFilePath, 'thumbnails/', fileData.thumbnail))
                 ? fs.rmSync(path(diskFilePath, 'thumbnails/', fileData.thumbnail)) : () => Promise.resolve()]);
         })
-        .then(() => data.del(req.ass?.resourceId))
+        .then(() => data().del(req.ass.resourceId))
         .then(() => (log.success('Deleted', oldName, oldType), res.type('text').send('File has been deleted!'))) // skipcq: JS-0090
         .catch(next);
 });

@@ -1,4 +1,5 @@
-import { FileData, AssRequest, AssResponse, ErrWrap, User } from "../definitions";
+import { ErrWrap, User } from '../types/definitions';
+import { Config, MagicNumbers } from 'ass-json';
 import fs from 'fs-extra';
 import bb from 'express-busboy';
@@ -6,14 +7,14 @@ import bb from 'express-busboy';
 import { DateTime } from 'luxon';
 import { Webhook, MessageBuilder } from 'discord-webhook-node';
 import { processUploaded } from '../storage';
-const { maxUploadSize, resourceIdSize, gfyIdSize, resourceIdType, spaceReplace } = require('../../config.json');
 import { path, log, verify, getTrueHttp, getTrueDomain, generateId, formatBytes } from '../utils';
-const { CODE_UNAUTHORIZED, CODE_PAYLOAD_TOO_LARGE } = require('../../MagicNumbers.json');
 import { data } from '../data';
 import { users } from '../auth';
+const { maxUploadSize, resourceIdSize, gfyIdSize, resourceIdType, spaceReplace }: Config = fs.readJsonSync(path('config.json'));
+const { CODE_UNAUTHORIZED, CODE_PAYLOAD_TOO_LARGE }: MagicNumbers = fs.readJsonSync(path('MagicNumbers.json'));
 const ASS_LOGO = 'https://cdn.discordapp.com/icons/848274994375294986/8d339d4a2f3f54b2295e5e0ff62bd9e6.png?size=1024';
-import express from 'express';
+import express, { Request, Response } from 'express';
 const router = express.Router();
 // Set up express-busboy
@@ -31,7 +32,7 @@ bb.extend(router, {
 })); */
 // Block unauthorized requests and attempt token sanitization
-router.post('/', (req: AssRequest, res: AssResponse, next: Function) => {
+router.post('/', (req: Request, res: Response, next: Function) => {
     req.headers.authorization = req.headers.authorization || '';
     req.token = req.headers.authorization.replace(/[^\da-z]/gi, ''); // Strip anything that isn't a digit or ASCII letter
     !verify(req, users) ? log.warn('Upload blocked', 'Unauthorized').callback(() => res.sendStatus(CODE_UNAUTHORIZED)) : next(); // skipcq: JS-0093
@ -41,28 +42,28 @@ router.post('/', (req: AssRequest, res: AssResponse, next: Function) => {
router.post('/', processUploaded); router.post('/', processUploaded);
// Max upload size error handling // Max upload size error handling
router.use('/', (err: ErrWrap, _req: AssRequest, res: AssResponse, next: Function) => err.message === 'LIMIT_FILE_SIZE' ? log.warn('Upload blocked', 'File too large').callback(() => res.status(CODE_PAYLOAD_TOO_LARGE).send(`Max upload size: ${maxUploadSize}MB`)) : next(err)); // skipcq: JS-0229 router.use('/', (err: ErrWrap, _req: Request, res: Response, next: Function) => err.message === 'LIMIT_FILE_SIZE' ? log.warn('Upload blocked', 'File too large').callback(() => res.status(CODE_PAYLOAD_TOO_LARGE).send(`Max upload size: ${maxUploadSize}MB`)) : next(err)); // skipcq: JS-0229
// Process uploaded file // Process uploaded file
router.post('/', (req: AssRequest, res: AssResponse, next: Function) => { router.post('/', (req: Request, res: Response, next: Function) => {
// Load overrides // Load overrides
const trueDomain = getTrueDomain(req.headers['x-ass-domain']); const trueDomain = getTrueDomain(req.headers['x-ass-domain']);
const generator = req.headers['x-ass-access'] || resourceIdType; const generator = req.headers['x-ass-access']?.toString() || resourceIdType;
// Save domain with file // Save domain with file
req.file!.domain = `${getTrueHttp()}${trueDomain}`; req.file.domain = `${getTrueHttp()}${trueDomain}`;
// Get the uploaded time in milliseconds // Get the uploaded time in milliseconds
req.file!.timestamp = DateTime.now().toMillis(); req.file.timestamp = DateTime.now().toMillis();
// Save the timezone offset // Save the timezone offset
req.file!.timeoffset = req.headers['x-ass-timeoffset']?.toString() || 'UTC+0'; req.file!.timeoffset = req.headers['x-ass-timeoffset']?.toString() || 'UTC+0';
// Keep track of the token that uploaded the resource // Keep track of the token that uploaded the resource
req.file!.token = req.token ?? ''; req.file.token = req.token ?? '';
// Attach any embed overrides, if necessary // Attach any embed overrides, if necessary
req.file!.opengraph = { req.file.opengraph = {
title: req.headers['x-ass-og-title'], title: req.headers['x-ass-og-title'],
description: req.headers['x-ass-og-description'], description: req.headers['x-ass-og-description'],
author: req.headers['x-ass-og-author'], author: req.headers['x-ass-og-author'],
@ -73,13 +74,13 @@ router.post('/', (req: AssRequest, res: AssResponse, next: Function) => {
}; };
// Fix spaces in originalname // Fix spaces in originalname
req.file!.originalname = req.file!.originalname.replace(/\s/g, spaceReplace === '!' ? '' : spaceReplace); req.file!.originalname = req.file.originalname.replace(/\s/g, spaceReplace === '!' ? '' : spaceReplace);
// Generate a unique resource ID // Generate a unique resource ID
let resourceId = ''; let resourceId = '';
// Function to call to generate a fresh ID. Used for multiple attempts in case an ID is already taken // Function to call to generate a fresh ID. Used for multiple attempts in case an ID is already taken
const gen = () => generateId(generator, resourceIdSize, req.headers['x-ass-gfycat'] || gfyIdSize, req.file!.originalname); const gen = () => generateId(generator, resourceIdSize, parseInt(req.headers['x-ass-gfycat']?.toString() || gfyIdSize.toString()), req.file.originalname);
// Keeps track of the number of attempts in case all ID's are taken // Keeps track of the number of attempts in case all ID's are taken
const attempts = { const attempts = {
@ -91,7 +92,7 @@ router.post('/', (req: AssRequest, res: AssResponse, next: Function) => {
function genCheckId(resolve: Function, reject: Function) { function genCheckId(resolve: Function, reject: Function) {
const uniqueId = gen(); const uniqueId = gen();
attempts.count++; attempts.count++;
data.has(uniqueId) data().has(uniqueId)
.then((exists: boolean) => { .then((exists: boolean) => {
log.debug('ID check', exists ? 'Taken' : 'Available'); log.debug('ID check', exists ? 'Taken' : 'Available');
return attempts.count - 1 >= attempts.max ? reject(new Error('No ID\'s remaining')) : exists ? genCheckId(resolve, reject) : resolve(uniqueId); return attempts.count - 1 >= attempts.max ? reject(new Error('No ID\'s remaining')) : exists ? genCheckId(resolve, reject) : resolve(uniqueId);
@ -105,16 +106,16 @@ router.post('/', (req: AssRequest, res: AssResponse, next: Function) => {
resourceId = uniqueId; resourceId = uniqueId;
log.debug('Saving data', data.name); log.debug('Saving data', data.name);
}) })
.then(() => data.put(resourceId.split('.')[0], req.file)) .then(() => data().put(resourceId.split('.')[0], req.file))
.then(() => { .then(() => {
// Log the upload // Log the upload
const logInfo = `${req.file!.originalname} (${req.file!.mimetype}, ${formatBytes(req.file!.size)})`; const logInfo = `${req.file!.originalname} (${req.file!.mimetype}, ${formatBytes(req.file.size)})`;
log.success('File uploaded', logInfo, `uploaded by ${users[req.token ?? ''] ? users[req.token ?? ''].username : '<token-only>'}`); log.success('File uploaded', logInfo, `uploaded by ${users[req.token ?? ''] ? users[req.token ?? ''].username : '<token-only>'}`);
// Build the URLs // Build the URLs
const resourceUrl = `${getTrueHttp()}${trueDomain}/${resourceId}`; const resourceUrl = `${getTrueHttp()}${trueDomain}/${resourceId}`;
const thumbnailUrl = `${getTrueHttp()}${trueDomain}/${resourceId}/thumbnail`; const thumbnailUrl = `${getTrueHttp()}${trueDomain}/${resourceId}/thumbnail`;
const deleteUrl = `${getTrueHttp()}${trueDomain}/${resourceId}/delete/${req.file!.deleteId}`; const deleteUrl = `${getTrueHttp()}${trueDomain}/${resourceId}/delete/${req.file.deleteId}`;
// Send the response // Send the response
res.type('json').send({ resource: resourceUrl, thumbnail: thumbnailUrl, delete: deleteUrl }) res.type('json').send({ resource: resourceUrl, thumbnail: thumbnailUrl, delete: deleteUrl })
@ -134,9 +135,9 @@ router.post('/', (req: AssRequest, res: AssResponse, next: Function) => {
.setTitle(logInfo) .setTitle(logInfo)
//@ts-ignore //@ts-ignore
.setURL(resourceUrl) .setURL(resourceUrl)
.setDescription(`**Size:** \`${formatBytes(req.file!.size)}\`\n**[Delete](${deleteUrl})**`) .setDescription(`**Size:** \`${formatBytes(req.file.size)}\`\n**[Delete](${deleteUrl})**`)
.setThumbnail(thumbnailUrl) .setThumbnail(thumbnailUrl)
.setColor(req.file!.vibrant) .setColor(req.file.vibrant)
.setTimestamp(); .setTimestamp();
// Send the embed to the webhook, then delete the client after to free resources // Send the embed to the webhook, then delete the client after to free resources
@ -148,7 +149,7 @@ router.post('/', (req: AssRequest, res: AssResponse, next: Function) => {
// Also update the users upload count // Also update the users upload count
if (!users[req.token ?? '']) { if (!users[req.token ?? '']) {
const generateUsername = () => generateId('random', 20, 0, req.file!.size.toString()); // skipcq: JS-0074 const generateUsername = () => generateId('random', 20, 0, req.file.size.toString()); // skipcq: JS-0074
let username: string = generateUsername(); let username: string = generateUsername();
// eslint-disable-next-line @typescript-eslint/ban-ts-comment // eslint-disable-next-line @typescript-eslint/ban-ts-comment
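The `genCheckId` flow above retries ID generation until a free ID turns up or the attempt budget is spent. A minimal standalone sketch of that pattern — `makeId` and `store` are hypothetical stand-ins for `gen()` and `data()`:

```typescript
// Retry-until-unique ID selection, mirroring genCheckId above.
// Rejects once maxAttempts IDs have all come back taken.
function pickUniqueId(
	makeId: () => string,
	store: { has: (id: string) => Promise<boolean> },
	maxAttempts = 50
): Promise<string> {
	let attempts = 0;
	const tryOnce = (resolve: (id: string) => void, reject: (err: Error) => void) => {
		const id = makeId();
		attempts++;
		store.has(id).then((exists) =>
			attempts >= maxAttempts ? reject(new Error('No IDs remaining'))
				: exists ? tryOnce(resolve, reject)
					: resolve(id));
	};
	return new Promise(tryOnce);
}
```

The recursion only happens on a collision, so for short IDs over a large keyspace it almost always resolves on the first attempt.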

@@ -57,7 +57,7 @@ function getConfirmSchema(description) {
 // If directly called on the command line, run setup script
 function doSetup() {
 	const path = (...paths) => require('path').join(process.cwd(), ...paths);
-	const TLog = require('@tycrek/log');
+	const { TLog, getChalk } = require('@tycrek/log');
 	const fs = require('fs-extra');
 	const prompt = require('prompt');
 	const token = require('./generators/token');
@@ -245,7 +245,7 @@ function doSetup() {
 		// Verify information is correct
 		.then(() => log
 			.blank()
-			.info('Please verify your information', '\n'.concat(Object.entries(results).map(([setting, value]) => `${' '}${log.chalk.dim.gray('-->')} ${log.chalk.bold.white(`${setting}:`)} ${log.chalk.white(value)}`).join('\n')))
+			.info('Please verify your information', '\n'.concat(Object.entries(results).map(([setting, value]) => `${' '}${getChalk().dim.gray('-->')} ${getChalk().bold.white(`${setting}:`)} ${getChalk().white(value)}`).join('\n')))
 			.blank())
 		// Apply old configs
// Apply old configs // Apply old configs

@@ -1,7 +1,7 @@
-import { FileData } from "./definitions";
+import { FileData } from './types/definitions';
 import fs, { ReadStream } from 'fs-extra';
 import { path } from './utils';
-const { SkynetClient } = require('@skynetlabs/skynet-nodejs');
+import { SkynetClient } from '@skynetlabs/skynet-nodejs';
 function getFullPath(fileData: FileData) {
 	return path('share', '.skynet', `${fileData.randomId}${fileData.ext}`.replace(/sia\:\/\//gi, ''));

@@ -1,16 +1,18 @@
 // https://docs.digitalocean.com/products/spaces/resources/s3-sdk-examples/
 // https://www.digitalocean.com/community/tutorials/how-to-upload-a-file-to-object-storage-with-node-js
-import { AssRequest, AssResponse, FileData } from './definitions';
+import { FileData } from './types/definitions';
+import { Config, MagicNumbers } from 'ass-json'
 import fs, { Stats } from 'fs-extra';
 import aws from 'aws-sdk';
 import Thumbnail from './thumbnails';
 import Vibrant from './vibrant';
 import Hash from './hash';
-import { generateId, log } from './utils';
+import { path, generateId, log } from './utils';
 import { SkynetUpload } from './skynet';
-const { s3enabled, s3endpoint, s3bucket, s3usePathStyle, s3accessKey, s3secretKey, diskFilePath, saveAsOriginal, saveWithDate, mediaStrict, maxUploadSize, useSia } = require('../config.json');
-const { CODE_UNSUPPORTED_MEDIA_TYPE } = require('../MagicNumbers.json');
+import { Request, Response } from 'express';
+const { s3enabled, s3endpoint, s3bucket, s3usePathStyle, s3accessKey, s3secretKey, diskFilePath, saveAsOriginal, saveWithDate, mediaStrict, maxUploadSize, useSia }: Config = fs.readJsonSync(path('config.json'));
+const { CODE_UNSUPPORTED_MEDIA_TYPE }: MagicNumbers = fs.readJsonSync(path('MagicNumbers.json'));
 const ID_GEN_LENGTH = 32;
 const ALLOWED_MIMETYPES = /(image)|(video)|(audio)\//;
@@ -31,23 +33,32 @@ function getDatedDirname() {
 	return `${diskFilePath}${diskFilePath.endsWith('/') ? '' : '/'}${year}-${`0${month}`.slice(-2)}`; // skipcq: JS-0074
 }
-function getLocalFilename(req: AssRequest) {
-	return `${getDatedDirname()}/${saveAsOriginal ? req.file!.originalname : req.file!.sha1}`;
+function getLocalFilename(req: Request) {
+	let name = `${getDatedDirname()}/${saveAsOriginal ? req.file.originalname : req.file.sha1}`;
+	// Append a number if this file has already been uploaded before
+	let count = 0;
+	while (fs.existsSync(path(name))) {
+		count++
+		name = count == 1 ? name.concat(`-${count}`) : name.substring(0, name.lastIndexOf('-')).concat(`-${count}`);
+	}
+	return name;
 }
-export function processUploaded(req: AssRequest, res: AssResponse, next: Function) { // skipcq: JS-0045
+export function processUploaded(req: Request, res: Response, next: Function) { // skipcq: JS-0045
 	// Fix file object
-	req.file = req.files!.file;
+	req.file = req.files.file;
 	// Other fixes
-	req.file!.ext = '.'.concat((req.file!.filename ?? '').split('.').pop() ?? '');
-	req.file!.originalname = req.file!.filename ?? '';
-	req.file!.path = req.file!.file ?? '';
-	req.file!.randomId = generateId('random', ID_GEN_LENGTH, 0, '');
-	req.file!.deleteId = generateId('random', ID_GEN_LENGTH, 0, '');
+	req.file.ext = '.'.concat((req.file.filename ?? '').split('.').pop() ?? '');
+	req.file.originalname = req.file.filename ?? '';
+	req.file.path = req.file.file ?? '';
+	req.file.randomId = generateId('random', ID_GEN_LENGTH, 0, '');
+	req.file.deleteId = generateId('random', ID_GEN_LENGTH, 0, '');
 	// Set up types
-	req.file!.is = {
+	req.file.is = {
 		image: false,
 		video: false,
 		audio: false,
@@ -55,16 +66,16 @@ export function processUploaded(req: AssRequest, res: AssResponse, next: Functio
 	};
 	// Specify correct type
-	const isType = req.file!.mimetype.includes('image') ? 'image' : req.file!.mimetype.includes('video') ? 'video' : req.file!.mimetype.includes('audio') ? 'audio' : 'other';
-	req.file!.is[isType] = true;
+	const isType = req.file!.mimetype.includes('image') ? 'image' : req.file.mimetype.includes('video') ? 'video' : req.file.mimetype.includes('audio') ? 'audio' : 'other';
+	req.file.is[isType] = true;
 	// Block the resource if the mimetype is not an image or video
-	if (mediaStrict && !ALLOWED_MIMETYPES.test(req.file!.mimetype))
+	if (mediaStrict && !ALLOWED_MIMETYPES.test(req.file.mimetype))
 		return log
-			.warn('Upload blocked', req.file!.originalname, req.file!.mimetype)
+			.warn('Upload blocked', req.file.originalname, req.file.mimetype)
 			.warn('Strict media mode', 'only images, videos, & audio are file permitted')
 			.callback(() =>
-				fs.remove(req.file!.path)
+				fs.remove(req.file.path)
 					.then(() => log
 						.debug('Temp file', 'deleted')
 						.callback(() => res.sendStatus(CODE_UNSUPPORTED_MEDIA_TYPE)))
@@ -73,29 +84,32 @@ export function processUploaded(req: AssRequest, res: AssResponse, next: Functio
 					.callback(next, err)));
 	// Remove unwanted fields
-	delete req.file!.uuid;
-	delete req.file!.field;
-	delete req.file!.file;
-	delete req.file!.filename;
-	delete req.file!.truncated;
-	delete req.file!.done;
+	delete req.file.uuid;
+	delete req.file.field;
+	delete req.file.file;
+	delete req.file.filename;
+	delete req.file.truncated;
+	delete req.file.done;
+	// Temp file name used in case file already exists (long story; just don't touch this)
+	let tempFileName = '';
 	// Operations
 	// @ts-ignore
-	Promise.all([Thumbnail(req.file), Vibrant(req.file), Hash(req.file), fs.stat(req.file!.path)])
+	Promise.all([Thumbnail(req.file), Vibrant(req.file), Hash(req.file), fs.stat(req.file.path)])
 		// skipcq: JS-0086
 		.then(([thumbnail, vibrant, sha1, stat]: [string, string, string, Stats]) => (
-			req.file!.thumbnail = thumbnail, // skipcq: JS-0090
-			req.file!.vibrant = vibrant, // skipcq: JS-0090
-			req.file!.sha1 = sha1, // skipcq: JS-0090
-			req.file!.size = stat.size // skipcq: JS-0090
+			req.file.thumbnail = thumbnail, // skipcq: JS-0090
+			req.file.vibrant = vibrant, // skipcq: JS-0090
+			req.file.sha1 = sha1, // skipcq: JS-0090
+			req.file.size = stat.size // skipcq: JS-0090
 		))
 		// Check if file size is too big
-		.then(() => { if (req.file!.size / Math.pow(1024, 2) > maxUploadSize) throw new Error('LIMIT_FILE_SIZE'); })
+		.then(() => { if (req.file.size / Math.pow(1024, 2) > maxUploadSize) throw new Error('LIMIT_FILE_SIZE'); })
 		// Save file
-		.then(() => log.debug('Saving file', req.file!.originalname, s3enabled ? 'in S3' : useSia ? 'on Sia blockchain' : 'on disk'))
+		.then(() => log.debug('Saving file', req.file.originalname, s3enabled ? 'in S3' : useSia ? 'on Sia blockchain' : 'on disk'))
 		.then(() =>
 			// skipcq: JS-0229
 			new Promise((resolve, reject) => {
@@ -103,31 +117,32 @@ export function processUploaded(req: AssRequest, res: AssResponse, next: Functio
 				// Upload to Amazon S3
 				if (s3enabled) return s3.putObject({
 					Bucket: s3bucket,
-					Key: req.file!.randomId.concat(req.file!.ext),
+					Key: req.file.randomId.concat(req.file.ext),
 					ACL: 'public-read',
-					ContentType: req.file!.mimetype,
-					Body: fs.createReadStream(req.file!.path)
+					ContentType: req.file.mimetype,
+					Body: fs.createReadStream(req.file.path)
 				}).promise().then(resolve).catch(reject);
 				// Use Sia Skynet
-				else if (useSia) return SkynetUpload(req.file!.path)
-					.then((skylink) => req.file!.randomId = skylink)
+				else if (useSia) return SkynetUpload(req.file.path)
+					.then((skylink) => req.file.randomId = skylink)
 					.then(resolve).catch(reject);
 				// Save to local storage
 				else return fs.ensureDir(getDatedDirname())
-					.then(() => fs.copy(req.file!.path, getLocalFilename(req), { preserveTimestamps: true }))
+					.then(() => tempFileName = getLocalFilename(req))
+					.then(() => fs.copy(req.file.path, tempFileName, { preserveTimestamps: true }))
					.then(resolve).catch(reject);
 			}))
-		.then(() => log.debug('File saved', req.file!.originalname, s3enabled ? 'in S3' : useSia ? 'on Sia blockchain' : 'on disk'))
+		.then(() => log.debug('File saved', req.file.originalname, s3enabled ? 'in S3' : useSia ? 'on Sia blockchain' : 'on disk'))
 		.catch((err) => next(err))
 		// Delete the file
-		.then(() => fs.remove(req.file!.path))
+		.then(() => fs.remove(req.file.path))
 		.then(() => log.debug('Temp file', 'deleted'))
 		// Fix the file path
-		.then(() => !s3enabled && (req.file!.path = getLocalFilename(req))) // skipcq: JS-0090
+		.then(() => !s3enabled && (req.file.path = tempFileName)) // skipcq: JS-0090
 		.then(() => next())
 		.catch((err) => next(err));
 }
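The collision handling added to `getLocalFilename` above appends `-1`, `-2`, … until a free name is found. A simplified pure sketch of that loop (the upstream version rewrites the suffix in place with `lastIndexOf('-')`; here `exists` is a hypothetical stand-in for `fs.existsSync`):

```typescript
// Append -1, -2, … to a base name until `exists` reports it free,
// mirroring the while-loop added to getLocalFilename above.
function dedupeName(base: string, exists: (name: string) => boolean): string {
	let name = base;
	let count = 0;
	while (exists(name)) {
		count++;
		name = `${base}-${count}`;
	}
	return name;
}
```

Keeping the predicate injectable makes the naming logic testable without touching the filesystem.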

@@ -1,11 +1,14 @@
-import { FileData } from "./definitions";
+import { FileData } from './types/definitions';
+import { Config } from 'ass-json';
+import fs from 'fs-extra';
 import ffmpeg from 'ffmpeg-static';
-import Jimp from 'jimp';
+import sharp from 'sharp';
 // @ts-ignore
 import shell from 'any-shell-escape';
 import { exec } from 'child_process';
 import { isProd, path } from './utils';
-const { diskFilePath } = require('../config.json');
+const { diskFilePath }: Config = fs.readJsonSync(path('config.json'));
 // Thumbnail parameters
 const THUMBNAIL = {
@@ -16,9 +19,6 @@ const THUMBNAIL = {
 /**
  * Builds a safe escaped ffmpeg command
- * @param {String} src Path to the input file
- * @param {String} dest Path of the output file
- * @returns {String} The command to execute
  */
 function getCommand(src: String, dest: String) {
 	return shell([
@@ -34,8 +34,6 @@ function getCommand(src: String, dest: String) {
 /**
  * Builds a thumbnail filename
- * @param {String} oldName The original filename
- * @returns {String} The filename for the thumbnail
  */
 function getNewName(oldName: String) {
 	return oldName.concat('.thumbnail.jpg');
@@ -43,8 +41,6 @@ function getNewName(oldName: String) {
 /**
  * Builds a path to the thumbnails
- * @param {String} oldName The original filename
- * @returns {String} The path to the thumbnail
 */
 function getNewNamePath(oldName: String) {
 	return path(diskFilePath, 'thumbnails/', getNewName(oldName));
@@ -52,7 +48,6 @@ function getNewNamePath(oldName: String) {
 /**
  * Extracts an image from a video file to use as a thumbnail, using ffmpeg
- * @param {*} file The video file to pull a frame from
 */
 function getVideoThumbnail(file: FileData) {
 	return new Promise((resolve: Function, reject: Function) => exec(
@@ -64,23 +59,19 @@ function getVideoThumbnail(file: FileData) {
 /**
  * Generates a thumbnail for the provided image
- * @param {*} file The file to generate a thumbnail for
 */
 function getImageThumbnail(file: FileData) {
 	return new Promise((resolve, reject) =>
-		Jimp.read(file.path)
-			.then((image) => image
-				.quality(THUMBNAIL.QUALITY)
-				.resize(THUMBNAIL.WIDTH, THUMBNAIL.HEIGHT, Jimp.RESIZE_BICUBIC)
-				.write(getNewNamePath(file.randomId)))
+		sharp(file.path)
+			.resize(THUMBNAIL.WIDTH, THUMBNAIL.HEIGHT, { kernel: 'cubic' })
+			.jpeg({ quality: THUMBNAIL.QUALITY })
+			.toFile(getNewNamePath(file.randomId))
 			.then(resolve)
 			.catch(reject));
 }
 /**
  * Generates a thumbnail
- * @param {*} file The file to generate a thumbnail for
- * @returns The thumbnail filename (NOT the path)
 */
 export default (file: FileData): Promise<string> =>
 	new Promise((resolve, reject) =>

@@ -1,4 +1,16 @@
-import { Request, Response } from "express";
+import { Request, Response } from 'express';
+declare global {
+	namespace Express {
+		interface Request {
+			resourceId: string
+			ass: { resourceId: string }
+			token: string
+			file: FileData
+			files: { [key: string]: any }
+		}
+	}
+}
 export interface User {
 	token: string
@@ -55,18 +67,6 @@ export interface OpenGraphData {
 	color?: string | string[]
 }
-export interface AssRequest extends Request {
-	resourceId?: string
-	ass?: { resourceId: string }
-	token?: string
-	file?: FileData
-	files?: { [key: string]: any }
-}
-export interface AssResponse extends Response {
-}
 export interface ErrWrap extends Error {
 	code?: number | string
 }

@@ -0,0 +1,49 @@
+declare module 'ass-json' {
+	interface Config {
+		host: string
+		port: number
+		domain: string
+		maxUploadSize: number
+		isProxied: boolean
+		useSsl: boolean
+		resourceIdSize: number
+		resourceIdType: string
+		spaceReplace: string
+		gfyIdSize: number
+		mediaStrict: boolean
+		viewDirect: boolean
+		dataEngine: string
+		frontendName: string
+		indexFile: string
+		useSia: boolean
+		s3enabled: boolean
+		s3endpoint: string
+		s3bucket: string
+		s3usePathStyle: boolean
+		s3accessKey: string
+		s3secretKey: string
+		__WARNING__: string
+		diskFilePath: string
+		saveWithDate: boolean
+		saveAsOriginal: boolean
+	}
+	interface MagicNumbers {
+		HTTP: number
+		HTTPS: number
+		CODE_OK: number
+		CODE_NO_CONTENT: number
+		CODE_UNAUTHORIZED: number
+		CODE_NOT_FOUND: number
+		CODE_PAYLOAD_TOO_LARGE: number
+		CODE_UNSUPPORTED_MEDIA_TYPE: number
+		CODE_INTERNAL_SERVER_ERROR: number
+		KILOBYTES: number
+	}
+	interface Package {
+		name: string
+		version: string
+		homepage: string
+	}
+}
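The ambient `ass-json` module above is what lets the `fs.readJsonSync` calls elsewhere in this commit be typed (`const { … }: Config = fs.readJsonSync(…)`). The pattern in miniature, with a hypothetical trimmed-down interface and an inline JSON string standing in for the config file:

```typescript
// Hypothetical trimmed Config; the real interface is the one
// declared in the ass-json module above.
interface ConfigSketch {
	host: string;
	port: number;
	useSsl: boolean;
}

// fs-extra's readJsonSync is essentially JSON.parse over a file,
// so an inline string illustrates the typing just as well.
const raw = '{ "host": "0.0.0.0", "port": 40115, "useSsl": false }';
const config: ConfigSketch = JSON.parse(raw);
```

Note the annotation is an assertion, not validation: nothing checks the JSON actually matches the interface at runtime.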

@@ -0,0 +1,6 @@
+declare module './setup' {
+	export function doSetup(): void;
+}
+declare module '@tycrek/express-nofavicon';
+declare module '@tycrek/papito';
+declare module '@skynetlabs/skynet-nodejs';

@@ -1,4 +1,4 @@
-import { AssRequest, FileData } from './definitions';
+import { FileData } from './types/definitions';
 import fs from 'fs-extra';
 import Path from 'path';
 import fetch from 'node-fetch';
@@ -9,10 +9,12 @@ import zwsGen from './generators/zws';
 import randomGen from './generators/random';
 import gfyGen from './generators/gfycat';
 import logger from './logger';
+import { Request } from 'express';
 const { HTTP, HTTPS, KILOBYTES } = require('../MagicNumbers.json');
 // Catch config.json not existing when running setup script
 try {
+	// todo: fix this
 	var { useSsl, port, domain, isProxied, diskFilePath, s3bucket, s3endpoint, s3usePathStyle } = require('../config.json'); // skipcq: JS-0239, JS-0102
 } catch (ex) {
 	// @ts-ignore
@@ -69,28 +71,26 @@ export function arrayEquals(arr1: any[], arr2: any[]) {
 	return arr1.length === arr2.length && arr1.slice().sort().every((value: string, index: number) => value === arr2.slice().sort()[index])
 };
-export function verify(req: AssRequest, users: JSON) {
+export function verify(req: Request, users: JSON) {
 	return req.headers.authorization && Object.prototype.hasOwnProperty.call(users, req.headers.authorization);
 }
-export function generateId(mode: string, length: number, gfyLength: number, originalName: string) {
-	return (GENERATORS.has(mode) ? GENERATORS.get(mode)({ length, gfyLength }) : originalName);
-}
-// Set up pathing
-export const path = (...paths: string[]) => Path.join(process.cwd(), ...paths);
 const idModes = {
 	zws: 'zws', // Zero-width spaces (see: https://zws.im/)
 	og: 'original', // Use original uploaded filename
 	r: 'random', // Use a randomly generated ID with a mixed-case alphanumeric character set
 	gfy: 'gfycat' // Gfycat-style ID's (https://gfycat.com/unsungdiscretegrub)
 };
 const GENERATORS = new Map();
 GENERATORS.set(idModes.zws, zwsGen);
 GENERATORS.set(idModes.r, randomGen);
 GENERATORS.set(idModes.gfy, gfyGen);
+export function generateId(mode: string, length: number, gfyLength: number, originalName: string) {
+	return (GENERATORS.has(mode) ? GENERATORS.get(mode)({ length, gfyLength }) : originalName);
+}
+// Set up pathing
+export const path = (...paths: string[]) => Path.join(process.cwd(), ...paths);
 export const isProd = require('@tycrek/isprod')();
 module.exports = {
@@ -106,11 +106,11 @@ module.exports = {
 	randomHexColour,
 	sanitize,
 	verify,
-	renameFile: (req: AssRequest, newName: string) => new Promise((resolve: Function, reject) => {
+	renameFile: (req: Request, newName: string) => new Promise((resolve: Function, reject) => {
 		try {
-			const paths = [req.file!.destination, newName];
-			fs.rename(path(req.file!.path), path(...paths));
-			req.file!.path = Path.join(...paths);
+			const paths = [req.file.destination, newName];
+			fs.rename(path(req.file.path), path(...paths));
+			req.file.path = Path.join(...paths);
 			resolve();
 		} catch (err) {
 			reject(err);
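`generateId` above dispatches through the `GENERATORS` map and falls back to the original filename for any unregistered mode (the `og`/`original` case). A toy version of that lookup, with hypothetical stand-ins for the zws/random generator modules:

```typescript
// Toy stand-ins for the zws/random/gfycat generator modules.
const GENERATORS = new Map<string, (opts: { length: number }) => string>();
GENERATORS.set('random', ({ length }) => 'r'.repeat(length));
GENERATORS.set('zws', ({ length }) => '\u200b'.repeat(length));

// Unregistered modes (e.g. 'original') fall through to the uploaded filename.
function generateIdSketch(mode: string, length: number, originalName: string): string {
	return GENERATORS.has(mode) ? GENERATORS.get(mode)!({ length }) : originalName;
}
```

Because `original` is simply absent from the map, adding a new ID scheme is one `GENERATORS.set(…)` call with no change to the dispatch logic.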

@@ -1,5 +1,6 @@
-import { FileData } from './definitions';
+import { FileData } from './types/definitions';
 import Vibrant from 'node-vibrant';
+import sharp from 'sharp';
 import { randomHexColour } from './utils';
 // Vibrant parameters
@@ -8,22 +9,18 @@ const QUALITY = 3;
 /**
  * Extracts a prominent colour from the provided image file
- * @param {*} file The image to extract a colour from
- * @param {*} resolve Runs if Promise was successful
- * @param {*} reject Runs if Promise failed
 */
 function getVibrant(file: FileData, resolve: Function, reject: Function) {
-	Vibrant.from(file.path)
-		.maxColorCount(COLOR_COUNT)
-		.quality(QUALITY)
-		.getPalette()
+	sharp(file.path).png().toBuffer()
+		.then((data) => Vibrant.from(data)
+			.maxColorCount(COLOR_COUNT)
+			.quality(QUALITY)
+			.getPalette())
 		.then((palettes) => resolve(palettes[Object.keys(palettes).sort((a, b) => palettes[b]!.population - palettes[a]!.population)[0]]!.hex))
 		.catch((err) => reject(err));
 }
 /**
  * Extracts a colour from an image file. Returns a random Hex value if provided file is a video
- * @param {*} file The file to get a colour from
- * @returns The Vibrant colour as a Hex value (or random Hex value for videos)
 */
 export default (file: FileData): Promise<string> => new Promise((resolve, reject) => (!file.is.image || file.mimetype.includes('webp')) ? resolve(randomHexColour()) : getVibrant(file, resolve, reject)); // skipcq: JS-0229

@ -1,5 +1,38 @@
const primary = '#FD842D'; const fs = require('fs-extra');
const primaryDim = '#B64D02'; const path = require('path');
const themePath = path.join(process.cwd(), 'share', 'theme.json');
/**
* ! IMPORTANT !
* Do NOT edit this file directly!
*
* Instead, edit the `theme.js` file in the `share` directory.
* For more info, please see the README: https://github.com/tycrek/ass/#customizing-the-viewer
*/
const defaults = {
// Font
font: '"Josefin Sans"',
// Background colours
bgPage: '#212121',
bgViewer: '#151515',
// Text colours
txtPrimary: '#FD842D',
txtSecondary: '#BDBDBD',
// Links
linkHover: '#FD710D',
linkActive: '#DE5E02',
// Other
borderHover: '#B64D02',
};
let theme = {};
if (fs.existsSync(themePath))
theme = fs.readJsonSync(themePath);
module.exports = { module.exports = {
separator: '_', separator: '_',
darkMode: 'class', darkMode: 'class',
@@ -10,23 +43,20 @@ module.exports = {
 	theme: {
 		extend: {
 			fontFamily: {
-				main: ['"Josefin Sans"', 'ui-sans-serif', 'system-ui', 'sans-serif']
+				main: [theme.font || defaults.font, 'ui-sans-serif', 'system-ui', 'sans-serif']
 			},
 			backgroundColor: {
-				'primary': primary,
-				'body': '#212121',
+				'page': theme.bgPage || defaults.bgPage,
+				'viewer': theme.bgViewer || defaults.bgViewer,
 			},
 			colors: {
-				'content-bg': '#151515',
-				'primary': primary,
-				'primary-dim': primaryDim,
-				'primary-dark': '#793301',
-				'link-hover': '#FD710D',
-				'link-active': '#DE5E02',
-				'text-primary': '#BDBDBD',
+				'primary': theme.txtPrimary || defaults.txtPrimary,
+				'secondary': theme.txtSecondary || defaults.txtSecondary,
+				'link-hover': theme.linkHover || defaults.linkHover,
+				'link-active': theme.linkActive || defaults.linkActive,
 			},
 			borderColor: {
-				'primary-dim': primaryDim
+				'hover': theme.borderHover || defaults.borderHover
 			},
 			maxHeight: {
 				'half-port': '50vh'
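The `theme.json` lookup above resolves each Tailwind value by preferring the user's override and falling back to the built-in default. A standalone sketch of that fallback pattern (hypothetical values, not the actual defaults file):

```javascript
// Hypothetical sketch: any key missing from share/theme.json
// falls back to the built-in default via the `||` operator.
const defaults = {
	font: '"Josefin Sans"',
	bgPage: '#212121',
};

// Pretend share/theme.json only overrides the page background
const theme = { bgPage: '#000000' };

const resolved = {
	font: theme.font || defaults.font,       // key absent, uses the default
	bgPage: theme.bgPage || defaults.bgPage, // key present, uses the override
};

console.log(resolved); // { font: '"Josefin Sans"', bgPage: '#000000' }
```

Note that `||` also treats an explicit empty string as "unset", which is harmless here since an empty font or colour value would be invalid anyway.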

@@ -17,7 +17,7 @@
 /* hover */
 hover_text-link-hover
-hover_border-primary-dim
+hover_border-hover
 /* active */
 active_text-link-active

@@ -2,8 +2,9 @@
 	"extends": "@tsconfig/node16/tsconfig.json",
 	"compilerOptions": {
 		"outDir": "./dist",
+		"target": "ES2021",
 		"lib": [
-			"ES2020",
+			"ES2021",
 			"DOM"
 		],
 		"allowJs": true,

@@ -21,10 +21,10 @@ html
 			* { display: none !important; }
 		meta(http-equiv='refresh' content=`0; url='${resourceAttr.src}'`)
-	body.font-main.text-text-primary.bg-body
+	body.font-main.text-secondary.bg-page
 		.w-full.h-full.flex.justify-center.items-center.text-center
-			.bg-content-bg.rounded-24
-				h4.mt-6.mb-4.text-3xl.font-main!=title
+			.bg-viewer.rounded-24
+				h4.mx-4.mt-6.mb-4.text-3xl.font-main!=title
 				figure.block.mx-10.my-4.flex.flex-col.align-items-center
 					if fileIs.video
 						video.res-media(controls loop muted playsinline preload='metadata')&attributes(resourceAttr)
@@ -41,4 +41,5 @@ html
 					span #{timestamp} (#{size})
 					br
 					span: a.link(href='#' onclick=`window.location = '${resourceAttr.src}?download=yes'; return false;` download=title) Download
-			.mx-4.mb-8.text-footer: p Image hosted by #[a.link(href='https://github.com/tycrek/ass' target='_blank'): strong ass], the superior self-hosted ShareX server
+			if showAd
+				.mx-4.mb-8.text-footer: p Image hosted by #[a.link(href='https://github.com/tycrek/ass' target='_blank'): strong ass], the superior self-hosted ShareX server
