0.7.0 - Migrated to new StorageEngine system

Merge pull request #22 from tycrek/storage-engines
pull/29/head releases/0.7.0
Josh Moore 3 years ago committed by GitHub
commit bd176a144d
No known key found for this signature in database
GPG Key ID: 4AEE18F83AFDEB23

@@ -3,7 +3,6 @@
   "HTTPS": 443,
   "CODE_OK": 200,
   "CODE_NO_CONTENT": 204,
-  "CODE_BAD_REQUEST": 400,
   "CODE_UNAUTHORIZED": 401,
   "CODE_NOT_FOUND": 404,
   "CODE_PAYLOAD_TOO_LARGE": 413,

@@ -29,6 +29,7 @@
 - ✔️ Thumbnail support
 - ✔️ Basic multi-user support
 - ✔️ Configurable global upload limit (per-user coming soon!)
+- ✔️ Basic macOS/Linux support using other clients including [Flameshot](https://flameshot.org/) ([ass-compatible Flameshot script](https://github.com/tycrek/ass#flameshot-users-linux)) & [MagicCap](https://magiccap.me/)
 - ✔️ Local storage *or* block-storage support for [Amazon S3](https://aws.amazon.com/s3/) (including [DigitalOcean Spaces](https://www.digitalocean.com/products/spaces/))
 - ✔️ Custom pluggable frontend dashboards using [Git Submodules](https://git-scm.com/book/en/v2/Git-Tools-Submodules)
 - ✔️ Multiple access types
@@ -36,11 +37,14 @@
   - **Mixed-case alphanumeric**
   - **Gfycat**
   - **Original**
-- ❌ Multiple database types
-  - **JSON**
-  - **Mongo** (soon!)
-  - **MySQL** (soon!)
-  - **PostgreSQL** (soon!)
+- ✔️ Multiple storage methods using [ass StorageEngines](https://github.com/tycrek/ass-storage-engine) (JSON by default)
+  - **File**
+    - **JSON**
+    - **YAML** (soon!)
+  - **Databases**
+    - **Mongo** (soon!)
+    - **MySQL** (soon!)
+    - **PostgreSQL** (soon!)

 ### Access types
@@ -117,7 +121,7 @@ If you primarily share media on Discord, you can add these additional (optional)
 | **`X-Ass-OG-Author-Url`** | URL to open when the Author is clicked |
 | **`X-Ass-OG-Provider`** | Smaller text shown above the author |
 | **`X-Ass-OG-Provider-Url`** | URL to open when the Provider is clicked |
-| **`X-Ass-OG-Color`** | Colour shown on the left side of the embed. Must be one of `&random`, `&vibrant`, or a hex colour value (for example: `#fe3c29`). Random is a randomly generated hex value and Vibrant is sourced from the image itself |
+| **`X-Ass-OG-Color`** | Colour shown on the left side of the embed. Must be one of `&random`, `&vibrant`, or a hex colour value (for example: `#fe3c29`). Random is a randomly generated hex value & Vibrant is sourced from the image itself |

 #### Embed placeholders
@@ -178,6 +182,52 @@ Now you should see `My awesome dashboard!` when you navigate to `http://your-ass
**For a detailed walkthrough on developing your first frontend, [consult the wiki](https://github.com/tycrek/ass/wiki/Writing-a-custom-frontend).**
## StorageEngines
[StorageEngines](https://github.com/tycrek/ass-storage-engine) are responsible for managing your data. "Data" has two parts: an identifier & the actual data itself. With ass, the data is a JSON object representing the uploaded resource. The identifier is the unique ID in the URL returned to the user on upload.
ass aims to support these storage methods at a minimum:
- **JSON**
- **Mongo** (soon)
An ass StorageEngine implements support for one type of database (or file, such as JSON or YAML). This lets ass server hosts pick their database of choice: all they have to do is plug in the connection/authentication details, and ass handles the rest, using the resource ID as the key.
The only StorageEngine ass ships with by default is **JSON**. Others will be published to [npm](https://www.npmjs.com/) and listed here. If you find (or create!) a StorageEngine you like, you can use it by installing it with `npm i <package-name>` and then changing the contents of [`data.js`](https://github.com/tycrek/ass/blob/master/data.js). At this time, a modified `data.js` might look like this:
```js
/**
* Used for global data management
*/
//const { JsonStorageEngine } = require('@tycrek/ass-storage-engine');
const { CustomStorageEngine } = require('my-custom-ass-storage-engine');
//const data = new JsonStorageEngine();
// StorageEngines may take no parameters...
const data1 = new CustomStorageEngine();
// multiple parameters...
const data2 = new CustomStorageEngine('Parameters!!', 420);
// or object-based parameters, depending on what the StorageEngine dev decides on.
const data3 = new CustomStorageEngine({ key1: 'value1', key2: { key3: 44 } });
module.exports = data1;
```
As long as the StorageEngine properly implements the `GET`/`PUT`/`DEL`/`HAS` StorageFunctions, replacing the file/database system is just that easy.
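The four StorageFunctions can be sketched as a tiny in-memory engine. This is a hypothetical illustration, not the real `@tycrek/ass-storage-engine` API: the promise-based `get`/`put`/`has`/`del` methods and the `name`/`type`/`size` properties are assumptions inferred from how `data` is used in this commit.

```javascript
// Hypothetical in-memory StorageEngine sketch (illustrative only).
// Every StorageFunction returns a Promise, so route handlers can chain
// .then()/.catch() exactly as they do with the JSON engine in this commit.
class MemoryStorageEngine {
	constructor() {
		this.name = 'Memory';    // shown in the startup log
		this.type = 'in-memory';
		this._store = new Map(); // resource ID -> resource JSON
	}

	// Number of stored resources (used by the "Available files" log line)
	get size() { return this._store.size; }

	get(resourceId) {
		return this._store.has(resourceId)
			? Promise.resolve(this._store.get(resourceId))
			: Promise.reject(new Error(`No resource with ID ${resourceId}`));
	}

	put(resourceId, resourceData) {
		this._store.set(resourceId, resourceData);
		return Promise.resolve();
	}

	has(resourceId) {
		return Promise.resolve(this._store.has(resourceId));
	}

	del(resourceId) {
		this._store.delete(resourceId);
		return Promise.resolve();
	}
}
```

With a class like this published as a package, `data.js` would only need to construct it and export the instance; the routers keep chaining `.then()`/`.catch()` unchanged.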
If you develop & publish an Engine, feel free to [open a PR on this README](https://github.com/tycrek/ass/edit/master/README.md) to add it.
- [`npm publish` docs](https://docs.npmjs.com/cli/v7/commands/npm-publish)
- ["How to publish packages to npm (the way the industry does things)"](https://zellwk.com/blog/publish-to-npm/) ([`@tycrek/ass-storage-engine`](https://www.npmjs.com/package/@tycrek/ass-storage-engine) is published using the software this guide recommends, [np](https://github.com/sindresorhus/np))
**A wiki page on writing a custom StorageEngine is coming soon. Once complete, you can find it [here](https://github.com/tycrek/ass/wiki/Writing-a-StorageEngine).**
## Flameshot users (Linux)
Use [this script](https://github.com/tycrek/ass/blob/master/flameshot_example.sh) kindly provided by [@ToxicAven](https://github.com/ToxicAven). For the `KEY`, put your token.

@@ -1 +1 @@
-Subproject commit 43c8082e78d01d26f2e6c944e73bca67bb1d5197
+Subproject commit 7e27c22ce5ac86e19789d7f94e85ad4225ea0b0a

@@ -34,7 +34,7 @@ const ROUTERS = {

 // Read users and data
 const users = require('./auth');
 const data = require('./data');
-log('Users & data read from filesystem');
+log(`StorageEngine: ${data.name} (${data.type})`);
 //#endregion

 // Create thumbnails directory
@@ -71,4 +71,4 @@ app.use(([err, , res,]) => {
 });

 // Host the server
-app.listen(port, host, () => log(`Server started on [${host}:${port}]\nAuthorized users: ${Object.keys(users).length}\nAvailable files: ${Object.keys(data).length}`));
+app.listen(port, host, () => log(`Server started on [${host}:${port}]\nAuthorized users: ${Object.keys(users).length}\nAvailable files: ${data.size}`));

@@ -2,14 +2,6 @@
  * Used for global data management
  */

-const fs = require('fs-extra');
-const { log, path } = require('./utils');
-
-// Make sure data.json exists
-if (!fs.existsSync(path('data.json'))) {
-	fs.writeJsonSync(path('data.json'), {}, { spaces: 4 });
-	log('File [data.json] created');
-} else log('File [data.json] exists');
-
-const data = require('./data.json');
+const { JsonStorageEngine } = require('@tycrek/ass-storage-engine');
+
+const data = new JsonStorageEngine();

 module.exports = data;

package-lock.json generated

@ -1,14 +1,15 @@
{ {
"name": "ass", "name": "ass",
"version": "0.6.0", "version": "0.7.0",
"lockfileVersion": 2, "lockfileVersion": 2,
"requires": true, "requires": true,
"packages": { "packages": {
"": { "": {
"name": "ass", "name": "ass",
"version": "0.6.0", "version": "0.7.0",
"license": "ISC", "license": "ISC",
"dependencies": { "dependencies": {
"@tycrek/ass-storage-engine": "0.2.5",
"any-shell-escape": "^0.1.1", "any-shell-escape": "^0.1.1",
"aws-sdk": "^2.930.0", "aws-sdk": "^2.930.0",
"check-node-version": "^4.1.0", "check-node-version": "^4.1.0",
@@ -560,6 +561,35 @@
        "regenerator-runtime": "^0.13.3"
      }
    },
"node_modules/@tycrek/ass-storage-engine": {
"version": "0.2.5",
"resolved": "https://registry.npmjs.org/@tycrek/ass-storage-engine/-/ass-storage-engine-0.2.5.tgz",
"integrity": "sha512-D4C4WAtTQIkoN1QH7l9h4gJTIWcrfpqZPd2nrgLc8O400eZdZJWkeA9Tf6UD18V5e5GWun3bwJFmCqfTwmmjWw==",
"dependencies": {
"fs-extra": "^10.0.0"
},
"engines": {
"node": "^14.x.x",
"npm": "^7.x.x"
},
"funding": {
"type": "patreon",
"url": "https://patreon.com/tycrek"
}
},
"node_modules/@tycrek/ass-storage-engine/node_modules/fs-extra": {
"version": "10.0.0",
"resolved": "https://registry.npmjs.org/fs-extra/-/fs-extra-10.0.0.tgz",
"integrity": "sha512-C5owb14u9eJwizKGdchcDUQeFtlSHHthBk8pbX9Vc1PFZrLombudjDnNns88aYslCyF6IY5SUw3Roz6xShcEIQ==",
"dependencies": {
"graceful-fs": "^4.2.0",
"jsonfile": "^6.0.1",
"universalify": "^2.0.0"
},
"engines": {
"node": ">=12"
}
},
    "node_modules/@types/node": {
      "version": "10.17.60",
      "resolved": "https://registry.npmjs.org/@types/node/-/node-10.17.60.tgz",
@@ -3754,6 +3784,26 @@
        "regenerator-runtime": "^0.13.3"
      }
    },
"@tycrek/ass-storage-engine": {
"version": "0.2.5",
"resolved": "https://registry.npmjs.org/@tycrek/ass-storage-engine/-/ass-storage-engine-0.2.5.tgz",
"integrity": "sha512-D4C4WAtTQIkoN1QH7l9h4gJTIWcrfpqZPd2nrgLc8O400eZdZJWkeA9Tf6UD18V5e5GWun3bwJFmCqfTwmmjWw==",
"requires": {
"fs-extra": "^10.0.0"
},
"dependencies": {
"fs-extra": {
"version": "10.0.0",
"resolved": "https://registry.npmjs.org/fs-extra/-/fs-extra-10.0.0.tgz",
"integrity": "sha512-C5owb14u9eJwizKGdchcDUQeFtlSHHthBk8pbX9Vc1PFZrLombudjDnNns88aYslCyF6IY5SUw3Roz6xShcEIQ==",
"requires": {
"graceful-fs": "^4.2.0",
"jsonfile": "^6.0.1",
"universalify": "^2.0.0"
}
}
}
},
    "@types/node": {
      "version": "10.17.60",
      "resolved": "https://registry.npmjs.org/@types/node/-/node-10.17.60.tgz",

@ -1,6 +1,6 @@
{ {
"name": "ass", "name": "ass",
"version": "0.6.0", "version": "0.7.0",
"description": "The superior self-hosted ShareX server", "description": "The superior self-hosted ShareX server",
"main": "ass.js", "main": "ass.js",
"engines": { "engines": {
@@ -34,6 +34,7 @@
     "url": "https://patreon.com/tycrek"
   },
   "dependencies": {
+    "@tycrek/ass-storage-engine": "0.2.5",
     "any-shell-escape": "^0.1.1",
     "aws-sdk": "^2.930.0",
     "check-node-version": "^4.1.0",

@@ -3,8 +3,8 @@ const escape = require('escape-html');
 const fetch = require('node-fetch');
 const { deleteS3 } = require('../storage');
 const { diskFilePath, s3enabled } = require('../config.json');
-const { path, saveData, log, getTrueHttp, getTrueDomain, formatBytes, formatTimestamp, getS3url, getDirectUrl, getSafeExt, getResourceColor, replaceholder } = require('../utils');
-const { CODE_BAD_REQUEST, CODE_UNAUTHORIZED, CODE_NOT_FOUND, } = require('../MagicNumbers.json');
+const { path, log, getTrueHttp, getTrueDomain, formatBytes, formatTimestamp, getS3url, getDirectUrl, getSafeExt, getResourceColor, replaceholder } = require('../utils');
+const { CODE_UNAUTHORIZED, CODE_NOT_FOUND, } = require('../MagicNumbers.json');
 const data = require('../data');
 const users = require('../auth');
@@ -14,16 +14,17 @@ const router = express.Router();

 // Middleware for parsing the resource ID and handling 404
 router.use((req, res, next) => {
 	// Parse the resource ID
-	req.ass = { resourceId: escape(req.resourceId).split('.')[0] };
+	req.ass = { resourceId: escape(req.resourceId || '').split('.')[0] };

-	// If the ID is invalid, return 404. Otherwise, continue normally
-	(!req.ass.resourceId || !data[req.ass.resourceId]) ? res.sendStatus(CODE_NOT_FOUND) : next();
+	// If the ID is invalid, return 404. Otherwise, continue normally // skipcq: JS-0093
+	data.has(req.ass.resourceId)
+		.then((has) => has ? next() : res.sendStatus(CODE_NOT_FOUND)) // skipcq: JS-0229
+		.catch(next);
 });

 // View file
-router.get('/', (req, res) => {
+router.get('/', (req, res, next) => data.get(req.ass.resourceId).then((fileData) => {
 	const { resourceId } = req.ass;
-	const fileData = data[resourceId];
 	const isVideo = fileData.mimetype.includes('video');

 	// Build OpenGraph meta tags
@@ -47,15 +48,12 @@ router.get('/', (req, res) => {
 		oembedUrl: `${getTrueHttp()}${getTrueDomain()}/${resourceId}/oembed`,
 		ogtype: isVideo ? 'video.other' : 'image',
 		urlType: `og:${isVideo ? 'video' : 'image'}`,
-		opengraph: replaceholder(ogs.join('\n'), fileData)
+		opengraph: replaceholder(ogs.join('\n'), fileData.size, fileData.timestamp, fileData.originalname)
 	});
-});
+}).catch(next));

 // Direct resource
-router.get('/direct*', (req, res) => {
-	const { resourceId } = req.ass;
-	const fileData = data[resourceId];
+router.get('/direct*', (req, res, next) => data.get(req.ass.resourceId).then((fileData) => {

 	// Send file as an attachement for downloads
 	if (req.query.download)
 		res.header('Content-Disposition', `attachment; filename="${fileData.originalname}"`);
@@ -73,58 +71,52 @@ router.get('/direct*', (req, res) => {
 	};
 	uploaders[s3enabled ? 's3' : 'local']();
-});
+}).catch(next));

 // Thumbnail response
-router.get('/thumbnail', (req, res) => {
-	const { resourceId } = req.ass;
-
-	// Read the file and send it to the client
-	fs.readFile(path(diskFilePath, 'thumbnails/', data[resourceId].thumbnail))
+router.get('/thumbnail', (req, res, next) =>
+	data.get(req.ass.resourceId)
+		.then(({ thumbnail }) => fs.readFile(path(diskFilePath, 'thumbnails/', thumbnail)))
 		.then((fileData) => res.type('jpg').send(fileData))
-		.catch(console.error);
-});
+		.catch(next));

 // oEmbed response for clickable authors/providers
 // https://oembed.com/
 // https://old.reddit.com/r/discordapp/comments/82p8i6/a_basic_tutorial_on_how_to_get_the_most_out_of/
-router.get('/oembed', (req, res) => {
-	const { resourceId } = req.ass;
-
-	// Build the oEmbed object & send the response
-	const { opengraph, mimetype } = data[resourceId];
-	res.type('json').send({
-		version: '1.0',
-		type: mimetype.includes('video') ? 'video' : 'photo',
-		author_url: opengraph.authorUrl,
-		provider_url: opengraph.providerUrl,
-		author_name: replaceholder(opengraph.author || '', data[resourceId]),
-		provider_name: replaceholder(opengraph.provider || '', data[resourceId])
-	});
-});
+router.get('/oembed', (req, res, next) =>
+	data.get(req.ass.resourceId)
+		.then(({ opengraph, mimetype, size, timestamp, originalname }) =>
+			res.type('json').send({
+				version: '1.0',
+				type: mimetype.includes('video') ? 'video' : 'photo',
+				author_url: opengraph.authorUrl,
+				provider_url: opengraph.providerUrl,
+				author_name: replaceholder(opengraph.author || '', size, timestamp, originalname),
+				provider_name: replaceholder(opengraph.provider || '', size, timestamp, originalname)
+			}))
+		.catch(next));

 // Delete file
-router.get('/delete/:deleteId', (req, res) => {
-	const { resourceId } = req.ass;
-	const deleteId = escape(req.params.deleteId);
-	const fileData = data[resourceId];
-
-	// If the delete ID doesn't match, don't delete the file
-	if (deleteId !== fileData.deleteId) return res.sendStatus(CODE_UNAUTHORIZED);
-
-	// If the ID is invalid, return 400 because we are unable to process the resource
-	if (!resourceId || !fileData) return res.sendStatus(CODE_BAD_REQUEST);
-
-	log(`Deleted: ${fileData.originalname} (${fileData.mimetype})`);
-
-	// Save the file information
-	Promise.all([s3enabled ? deleteS3(fileData) : fs.rmSync(path(fileData.path)), fs.rmSync(path(diskFilePath, 'thumbnails/', fileData.thumbnail))])
-		.then(() => {
-			delete data[resourceId];
-			saveData(data);
-			res.type('text').send('File has been deleted!');
-		})
-		.catch(console.error);
+router.get('/delete/:deleteId', (req, res, next) => {
+	let oldName, oldType; // skipcq: JS-0119
+	data.get(req.ass.resourceId)
+		.then((fileData) => {
+			// Extract info for logs
+			oldName = fileData.originalname;
+			oldType = fileData.mimetype;
+
+			// Clean deleteId
+			const deleteId = escape(req.params.deleteId);
+
+			// If the delete ID doesn't match, don't delete the file
+			if (deleteId !== fileData.deleteId) return res.sendStatus(CODE_UNAUTHORIZED);
+
+			// Save the file information
+			return Promise.all([s3enabled ? deleteS3(fileData) : fs.rmSync(path(fileData.path)), fs.rmSync(path(diskFilePath, 'thumbnails/', fileData.thumbnail))]);
+		})
+		.then(() => data.del(req.ass.resourceId))
+		.then(() => (log(`Deleted: ${oldName} (${oldType})`), res.type('text').send('File has been deleted!'))) // skipcq: JS-0090
+		.catch(next);
 });
module.exports = router;
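The middleware rewrite in this file swaps a synchronous object lookup for a promise: `data.has(id)` resolves to a boolean, and `.catch(next)` forwards engine failures to Express's error handler instead of leaving the request hanging. A dependency-free sketch of that gate (the `data` object here is a stand-in, not the real engine):

```javascript
// Stand-in engine: has() resolves a boolean, like a StorageEngine would
const data = {
	ids: new Set(['abcd']),
	has(id) { return Promise.resolve(this.ids.has(id)); },
};

// Mirrors the 404 middleware: next() on a hit, sendStatus(404) on a miss,
// next(err) if the engine itself fails (e.g. a database connection drops)
function resourceGate(resourceId, next, sendStatus) {
	return data.has(resourceId)
		.then((has) => (has ? next() : sendStatus(404)))
		.catch(next);
}
```

The same shape works for any engine, because the middleware only depends on the promise contract, not on how the data is stored.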

@@ -5,7 +5,7 @@ const { DateTime } = require('luxon');
 const { WebhookClient, MessageEmbed } = require('discord.js');
 const { doUpload, processUploaded } = require('../storage');
 const { maxUploadSize, resourceIdSize, gfyIdSize, resourceIdType } = require('../config.json');
-const { path, saveData, log, verify, getTrueHttp, getTrueDomain, generateId, formatBytes } = require('../utils');
+const { path, log, verify, getTrueHttp, getTrueDomain, generateId, formatBytes } = require('../utils');
 const { CODE_UNAUTHORIZED, CODE_PAYLOAD_TOO_LARGE } = require('../MagicNumbers.json');
 const data = require('../data');
 const users = require('../auth');
@@ -40,7 +40,7 @@ router.post('/', doUpload, processUploaded, ({ next }) => next());
 router.use('/', (err, _req, res, next) => err.code && err.code === 'LIMIT_FILE_SIZE' ? res.status(CODE_PAYLOAD_TOO_LARGE).send(`Max upload size: ${maxUploadSize}MB`) : next(err)); // skipcq: JS-0229

 // Process uploaded file
-router.post('/', (req, res) => {
+router.post('/', (req, res, next) => {
 	// Load overrides
 	const trueDomain = getTrueDomain(req.headers['x-ass-domain']);
 	const generator = req.headers['x-ass-access'] || resourceIdType;
@@ -67,54 +67,53 @@ router.post('/', (req, res, next) => {
 	// Save the file information
 	const resourceId = generateId(generator, resourceIdSize, req.headers['x-ass-gfycat'] || gfyIdSize, req.file.originalname);
-	data[resourceId.split('.')[0]] = req.file;
-	saveData(data);
-
-	// Log the upload
-	const logInfo = `${req.file.originalname} (${req.file.mimetype})`;
-	log(`Uploaded: ${logInfo} (user: ${users[req.token] ? users[req.token].username : '<token-only>'})`);
-
-	// Build the URLs
-	const resourceUrl = `${getTrueHttp()}${trueDomain}/${resourceId}`;
-	const thumbnailUrl = `${getTrueHttp()}${trueDomain}/${resourceId}/thumbnail`;
-	const deleteUrl = `${getTrueHttp()}${trueDomain}/${resourceId}/delete/${req.file.deleteId}`;
-
-	// Send the response
-	res.type('json').send({ resource: resourceUrl, thumbnail: thumbnailUrl, delete: deleteUrl })
-		.on('finish', () => {
-
-			// After we have sent the user the response, also send a Webhook to Discord (if headers are present)
-			if (req.headers['x-ass-webhook-client'] && req.headers['x-ass-webhook-token']) {
-
-				// Build the webhook client & embed
-				const whc = new WebhookClient(req.headers['x-ass-webhook-client'], req.headers['x-ass-webhook-token']);
-				const embed = new MessageEmbed()
-					.setTitle(logInfo)
-					.setURL(resourceUrl)
-					.setDescription(`**Size:** \`${formatBytes(req.file.size)}\`\n**[Delete](${deleteUrl})**`)
-					.setThumbnail(thumbnailUrl)
-					.setColor(req.file.vibrant)
-					.setTimestamp(req.file.timestamp);
-
-				// Send the embed to the webhook, then delete the client after to free resources
-				whc.send(null, {
-					username: req.headers['x-ass-webhook-username'] || 'ass',
-					avatarURL: req.headers['x-ass-webhook-avatar'] || ASS_LOGO,
-					embeds: [embed]
-				}).then(() => whc.destroy());
-			}
-
-			// Also update the users upload count
-			if (!users[req.token]) {
-				const generateUsername = () => generateId('random', 20, null); // skipcq: JS-0074
-				let username = generateUsername();
-				while (Object.values(users).findIndex((user) => user.username === username) !== -1) // skipcq: JS-0073
-					username = generateUsername();
-				users[req.token] = { username, count: 0 };
-			}
-			users[req.token].count += 1;
-			fs.writeJsonSync(path('auth.json'), { users }, { spaces: 4 })
-		});
-});
+	data.put(resourceId.split('.')[0], req.file).then(() => {
+		// Log the upload
+		const logInfo = `${req.file.originalname} (${req.file.mimetype})`;
+		log(`Uploaded: ${logInfo} (user: ${users[req.token] ? users[req.token].username : '<token-only>'})`);
+
+		// Build the URLs
+		const resourceUrl = `${getTrueHttp()}${trueDomain}/${resourceId}`;
+		const thumbnailUrl = `${getTrueHttp()}${trueDomain}/${resourceId}/thumbnail`;
+		const deleteUrl = `${getTrueHttp()}${trueDomain}/${resourceId}/delete/${req.file.deleteId}`;
+
+		// Send the response
+		res.type('json').send({ resource: resourceUrl, thumbnail: thumbnailUrl, delete: deleteUrl })
+			.on('finish', () => {
+
+				// After we have sent the user the response, also send a Webhook to Discord (if headers are present)
+				if (req.headers['x-ass-webhook-client'] && req.headers['x-ass-webhook-token']) {
+
+					// Build the webhook client & embed
+					const whc = new WebhookClient(req.headers['x-ass-webhook-client'], req.headers['x-ass-webhook-token']);
+					const embed = new MessageEmbed()
+						.setTitle(logInfo)
+						.setURL(resourceUrl)
+						.setDescription(`**Size:** \`${formatBytes(req.file.size)}\`\n**[Delete](${deleteUrl})**`)
+						.setThumbnail(thumbnailUrl)
+						.setColor(req.file.vibrant)
+						.setTimestamp(req.file.timestamp);
+
+					// Send the embed to the webhook, then delete the client after to free resources
+					whc.send(null, {
+						username: req.headers['x-ass-webhook-username'] || 'ass',
+						avatarURL: req.headers['x-ass-webhook-avatar'] || ASS_LOGO,
+						embeds: [embed]
+					}).then(() => whc.destroy());
+				}
+
+				// Also update the users upload count
+				if (!users[req.token]) {
+					const generateUsername = () => generateId('random', 20, null); // skipcq: JS-0074
+					let username = generateUsername();
+					while (Object.values(users).findIndex((user) => user.username === username) !== -1) // skipcq: JS-0073
+						username = generateUsername();
+					users[req.token] = { username, count: 0 };
+				}
+				users[req.token].count += 1;
+				fs.writeJsonSync(path('auth.json'), { users }, { spaces: 4 })
+			});
+	}).catch(next);
 });
module.exports = router;

@@ -61,7 +61,7 @@ function formatBytes(bytes, decimals = 2) { // skipcq: JS-0074
 	return parseFloat((bytes / Math.pow(KILOBYTES, i)).toFixed(decimals < 0 ? 0 : decimals)).toString().concat(` ${sizes[i]}`);
 }

-function replaceholder(data, { size, timestamp, originalname }) {
+function replaceholder(data, size, timestamp, originalname) {
 	return data
 		.replace(/&size/g, formatBytes(size))
 		.replace(/&filename/g, originalname)
@@ -105,7 +105,6 @@ module.exports = {
 	randomHexColour,
 	sanitize,
 	log: console.log,
-	saveData: (data) => fs.writeJsonSync(Path.join(__dirname, 'data.json'), data, { spaces: 4 }),
 	verify: (req, users) => req.headers.authorization && Object.prototype.hasOwnProperty.call(users, req.headers.authorization),
 	renameFile: (req, newName) => new Promise((resolve, reject) => {
 		try {
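The `replaceholder` change above flattens the destructured object parameter into positional `size`, `timestamp`, `originalname` arguments. A self-contained sketch of the new signature, with a simplified stand-in for the real `formatBytes` helper (the `&timestamp` placeholder is omitted here):

```javascript
// Simplified stand-in for the real formatBytes in utils.js
function formatBytes(bytes) {
	return `${(bytes / 1024).toFixed(2)} KB`;
}

// New positional signature: replaceholder(data, size, timestamp, originalname)
function replaceholder(data, size, timestamp, originalname) {
	return data
		.replace(/&size/g, formatBytes(size))
		.replace(/&filename/g, originalname);
}

replaceholder('&filename (&size)', 2048, Date.now(), 'cat.png');
// → 'cat.png (2.00 KB)'
```

Positional arguments let callers like the oEmbed route pass fields straight out of a destructured `data.get()` result without rebuilding an object.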
