Compare commits

...

153 Commits

Author SHA1 Message Date
Alex Ling
8a732804ae Merge pull request #232 from hkalexling/rc/0.24.0
v0.24.0
2021-09-25 13:39:48 +08:00
Alex Ling
9df372f784 Merge branch 'dev' into rc/0.24.0 2021-09-23 07:52:58 +00:00
Alex Ling
cf7431b8b6 Merge pull request #236 from hkalexling/fix/shallow-api-fix
Use `depth` instead of `shallow` in API
2021-09-23 15:52:14 +08:00
Alex Ling
974b6cfe9b Use depth instead of shallow in API 2021-09-22 09:17:14 +00:00
Alex Ling
4fbe5b471c Update README 2021-09-20 01:36:31 +00:00
Alex Ling
33e7e31fbc Bump version to 0.24.0 2021-09-20 01:11:26 +00:00
Alex Ling
72fae7f5ed Fix typo cbz -> gz 2021-09-18 12:41:25 +00:00
Alex Ling
f50a7e3b3e Merge branch 'dev' into rc/0.24.0 2021-09-18 12:40:56 +00:00
Alex Ling
66c4037f2b Merge pull request #233 from Leeingnyo/feature/avoid-unnecessary-sort
Avoid an unnecessary sort
2021-09-18 20:40:25 +08:00
Leeingnyo
2c022a07e7 Avoid unnecessary sorts when getting deep percentage
This makes page loading faster
2021-09-18 18:49:29 +09:00
Alex Ling
91362dfc7d Merge branch 'master' into rc/0.24.0 2021-09-18 09:20:59 +00:00
Alex Ling
97168b65d8 Make library cache path configurable 2021-09-18 08:40:08 +00:00
Alex Ling
6e04e249e7 Merge pull request #229 from Leeingnyo/feature/preserve-scanned-titles
Reuse scanned titles on boot and scanning
2021-09-18 16:14:31 +08:00
Alex Ling
16397050dd Update comments 2021-09-18 02:24:50 +00:00
Alex Ling
3f73591dd4 Update comments 2021-09-18 02:14:22 +00:00
Alex Ling
ec25109fa5 Merge branch 'feature/preserve-scanned-titles' of https://github.com/Leeingnyo/Mango into feature/preserve-scanned-titles 2021-09-18 02:04:02 +00:00
Alex Ling
96f1ef3dde Improve comments on examine 2021-09-18 02:00:10 +00:00
Leeingnyo
b56e16e1e1 Remove counter, yield every time 2021-09-18 10:59:43 +09:00
Leeingnyo
9769e760a0 Pass a counter to recursive calls, ignore negative threshold 2021-09-16 07:49:12 +09:00
Leeingnyo
70ab198a33 Add config 'forcely_yield_count'
the default value of 1000 makes a fiber yield roughly every 4 ms on an SSD

Apply yield counter in Dir.contents_signature
Use contents_signature cache in Title.new
2021-09-16 00:16:26 +09:00
Alex Ling
44a6f822cd Simplify Title.new 2021-09-15 09:00:30 +00:00
Alex Ling
2c241a96bb Merge branch 'dev' into feature/preserve-scanned-titles 2021-09-15 08:58:24 +00:00
Alex Ling
219d4446d1 Merge pull request #231 from hkalexling/feature/api-improvements
API Improvements
2021-09-15 16:54:41 +08:00
Alex Ling
d330db131e Simplify mark_unavailable 2021-09-15 08:46:30 +00:00
Leeingnyo
de193906a2 Refactor mark_unavailable 2021-09-15 16:54:55 +09:00
Leeingnyo
d13cfc045f Add a comment 2021-09-15 01:27:05 +09:00
Leeingnyo
a3b2cdd372 Lint 2021-09-15 01:17:44 +09:00
Leeingnyo
f4d7128b59 Mark unavailable only in candidates 2021-09-14 23:30:03 +09:00
Leeingnyo
663c0c0b38 Remove nested title including self 2021-09-14 23:28:28 +09:00
Leeingnyo
57b2f7c625 Get nested ids when title removed 2021-09-14 23:08:07 +09:00
Leeingnyo
9489d6abfd Use reference instead of primitive 2021-09-14 23:07:47 +09:00
Leeingnyo
670cf54957 Apply yield forcibly 2021-09-14 22:51:37 +09:00
Leeingnyo
2e09efbd62 Collect deleted ids 2021-09-14 22:51:25 +09:00
Leeingnyo
523195d649 Define ExamineContext, apply it when scanning 2021-09-14 22:37:30 +09:00
Leeingnyo
be47f309b0 Use cache when calculating contents_signature 2021-09-14 18:11:08 +09:00
Alex Ling
03e044a1aa Improve logging 2021-09-14 07:16:14 +00:00
Alex Ling
4eaf271fa4 Simplify #json_build 2021-09-14 02:30:57 +00:00
Alex Ling
4b464ed361 Allow sorting in /api/book endpoint 2021-09-13 10:58:07 +00:00
Alex Ling
a9520d6f26 Add shallow option to library API endpoints 2021-09-13 10:18:07 +00:00
Leeingnyo
a151ec486d Fix file extension of gzip file 2021-09-12 18:04:41 +09:00
Leeingnyo
8f1383a818 Use Gzip instead of Zip 2021-09-12 18:01:16 +09:00
Leeingnyo
f5933a48d9 Register mime types, scan, and thumbnail jobs when loading instance 2021-09-12 17:40:40 +09:00
Leeingnyo
7734dae138 Remove unnecessary sort 2021-09-12 14:36:17 +09:00
Leeingnyo
8c90b46114 Remove removed titles from title_hash 2021-09-12 13:39:28 +09:00
Leeingnyo
cd48b45f11 Add 'require "yaml"' 2021-09-12 12:45:24 +09:00
Leeingnyo
bdbdf9c94b Fix to pass 'make check', fix comments 2021-09-12 11:09:48 +09:00
Leeingnyo
7e36c91ea7 Remove debug print 2021-09-12 10:47:15 +09:00
Leeingnyo
9309f51df6 Memoization on dir contents_signature 2021-09-12 02:19:49 +09:00
Leeingnyo
a8f729f5c1 Sort entries and titles when they are needed 2021-09-12 02:19:49 +09:00
Leeingnyo
4e8b561f70 Apply contents signature of directories 2021-09-12 02:19:49 +09:00
Leeingnyo
e6214ddc5d Rescan only if instance loaded 2021-09-12 02:19:49 +09:00
Leeingnyo
80e13abc4a Spawn scan job 2021-09-12 02:14:58 +09:00
Leeingnyo
fb43abb950 Enhance the examine method 2021-09-12 02:14:58 +09:00
Leeingnyo
eb3e37b950 Examine titles and recycle them 2021-09-12 02:14:58 +09:00
Leeingnyo
0a90e3b333 Ignore caches 2021-09-12 02:14:58 +09:00
Leeingnyo
4409ed8f45 Implement save_instance, load_instance 2021-09-12 02:14:58 +09:00
Leeingnyo
291a340cdd Add yaml serializer to Library, Title, Entry 2021-09-12 02:14:58 +09:00
Leeingnyo
0667f01471 Measure scan only 2021-09-12 02:14:58 +09:00
Alex Ling
d5847bb105 Merge pull request #224 from hkalexling/fix/sanitize-download-filename
Stricter sanitization rules for download filenames
2021-09-09 08:47:41 +08:00
Alex Ling
3d295e961e Merge branch 'dev' into fix/sanitize-download-filename 2021-09-09 00:31:24 +00:00
Alex Ling
e408398523 Merge pull request #227 from hkalexling/all-contributors/add-lincolnthedev
docs: add lincolnthedev as a contributor for infra
2021-09-09 08:18:22 +08:00
Alex Ling
566cebfcdd Remove all leading dots and spaces 2021-09-09 00:13:58 +00:00
Alex Ling
a190ae3ed6 Merge pull request #226 from lincolnthedev/master
Better .dockerignore
2021-09-08 20:42:12 +08:00
allcontributors[bot]
17d7cefa12 docs: update .all-contributorsrc [skip ci] 2021-09-08 12:42:10 +00:00
allcontributors[bot]
eaef0556fa docs: update README.md [skip ci] 2021-09-08 12:42:09 +00:00
i use arch btw
53226eab61 Forgot .github 2021-09-08 07:58:58 -04:00
Alex Ling
ccf558eaa7 Improve filename sanitization rules 2021-09-08 10:03:05 +00:00
Alex Ling
0305433e46 Merge pull request #225 from hkalexling/feature/support-all-image-types
Support additional image formats (resolves #192)
2021-09-08 11:03:59 +08:00
i use arch btw
d2cad6c496 Update .dockerignore 2021-09-07 21:12:51 -04:00
Alex Ling
371796cce9 Support additional image formats:
- APNG
- AVIF
- GIF
- SVG
2021-09-07 11:04:05 +00:00
Alex Ling
d9adb49c27 Revert "Support all image types (resolves #192)"
This reverts commit f67e4e6cb9.
2021-09-07 10:45:59 +00:00
Alex Ling
f67e4e6cb9 Support all image types (resolves #192) 2021-09-06 13:32:10 +00:00
Alex Ling
60a126024c Stricter sanitization rules for download filenames
Fixes #212
2021-09-06 13:01:05 +00:00
Alex Ling
da8a485087 Merge pull request #222 from Leeingnyo/feature/enhance-loading-library
Improve loading pages (library, titles)
2021-09-06 16:49:19 +08:00
Alex Ling
d809c21ee1 Document CacheEntry 2021-09-06 08:23:54 +00:00
Alex Ling
ca1e221b10 Rename ids2entries -> ids_to_entries 2021-09-06 08:23:31 +00:00
Alex Ling
44d9c51ff9 Fix logging 2021-09-06 08:10:42 +00:00
Alex Ling
15a54f4f23 Add :sorted_entries suffix to gen_key 2021-09-06 08:10:13 +00:00
Alex Ling
51806f18db Rename config fields and improve logging 2021-09-06 03:35:46 +00:00
Alex Ling
79ef7bcd1c Remove unused variable 2021-09-06 03:01:21 +00:00
Leeingnyo
5cb85ea857 Set cached data when changed 2021-09-06 09:41:46 +09:00
Leeingnyo
9807db6ac0 Fix bug on entry_cover_url_cache 2021-09-06 02:32:13 +09:00
Leeingnyo
565a535d22 Remove caching verbosely, add cached_cover_url 2021-09-06 02:32:13 +09:00
Alex Ling
c5b6a8b5b9 Improve instance_size for Tuple 2021-09-05 13:57:20 +00:00
Leeingnyo
c75c71709f make check 2021-09-05 11:21:53 +09:00
Leeingnyo
11976b15f9 Make LRUCache togglable 2021-09-05 03:02:20 +09:00
Leeingnyo
847f516a65 Cache TitleInfo using LRUCache 2021-09-05 02:35:44 +09:00
Leeingnyo
de410f42b8 Replace InfoCache with LRUCache 2021-09-05 02:08:11 +09:00
Leeingnyo
0fd7caef4b Rename 2021-09-05 00:02:05 +09:00
Leeingnyo
5e919d3e19 Make entry generic 2021-09-04 23:56:17 +09:00
Leeingnyo
9e90aa17b9 Move entry specific method 2021-09-04 14:37:05 +09:00
Leeingnyo
0a8fd993e5 Use bytesize and add comments 2021-09-03 11:11:28 +09:00
Leeingnyo
365f71cd1d Change kbs to mbs 2021-08-30 23:10:08 +09:00
Leeingnyo
601346b209 Set cache if enabled 2021-08-30 23:07:59 +09:00
Leeingnyo
e988a8c121 Add config for sorted entries cache
optional
2021-08-30 22:59:23 +09:00
Leeingnyo
bf81a4e48b Implement sorted entries cache
sorted_entries cached
2021-08-30 22:58:40 +09:00
Leeingnyo
4a09aee177 Implement library caching TitleInfo
* Cache sum of entry progress
* Cache cover_url
* Cache display_name
* Cache sort_opt
2021-08-30 11:31:45 +09:00
Leeingnyo
00c9cc1fcd Prevent saving a sort opt unnecessarily 2021-08-30 11:31:45 +09:00
Leeingnyo
51a47b5ddd Cache display_name 2021-08-30 11:31:45 +09:00
Leeingnyo
244f97a68e Cache entries' cover_url 2021-08-30 08:24:40 +09:00
Alex Ling
8d84a3c502 Merge pull request #221 from hkalexling/rc/0.23.0
v0.23.0
2021-08-28 18:44:50 +08:00
Alex Ling
a26b4b3965 Add Discord badge 2021-08-28 10:27:43 +00:00
Alex Ling
f2dd20cdec Bump version to 0.23.0 2021-08-22 08:20:24 +00:00
Alex Ling
64d6cd293c Remove MangaDex from README 2021-08-22 08:20:01 +00:00
Alex Ling
08dc0601e8 Remove page_margin from README 2021-08-22 08:18:28 +00:00
Alex Ling
9c983df7e9 Merge pull request #216 from Leeingnyo/feature/update-crystal-1.0
Update crystal version to 1.0
2021-08-22 15:59:20 +08:00
Alex Ling
efc547f5b2 Fix "Executing query" spam in log 2021-08-22 07:34:29 +00:00
Alex Ling
995ca3b40f Update ARM Dockerfile 2021-08-20 08:26:11 +00:00
Alex Ling
864435d3f9 Upgrade dependencies for Crystal 1.0.0 2021-08-20 07:15:16 +00:00
Alex Ling
64c145cf80 Merge branch 'dev' of https://github.com/hkalexling/Mango into feature/update-crystal-1.0 2021-08-20 00:36:38 +00:00
Alex Ling
6549253ed1 Merge branch 'dev' of https://github.com/hkalexling/Mango into dev 2021-08-20 00:26:59 +00:00
Alex Ling
d9565718a4 Remove MangaDex integration 2021-08-20 00:25:21 +00:00
Alex Ling
400c3024fd Merge pull request #219 from Leeingnyo/feature/enhance-paged-mode
Enhance the paged mode reader
2021-08-20 08:13:39 +08:00
Leeingnyo
a703175b3a Parenthesize to avoid a misbehavior 2021-08-19 15:03:01 +09:00
Leeingnyo
83b122ab75 Implement a controller to toggle flip animation 2021-08-18 23:46:45 +09:00
Leeingnyo
1e7d6ba5b1 Keep the image ratio in paged mode, clamping the image to the screen 2021-08-18 23:46:45 +09:00
Leeingnyo
4d1ad8fb38 Prevent loading images 2021-08-18 23:46:45 +09:00
Leeingnyo
d544252e3e Add preload lookahead controller 2021-08-18 23:46:45 +09:00
Leeingnyo
b02b28d3e3 Preload images when the viewer is in paged mode 2021-08-18 23:07:38 +09:00
Leeingnyo
d7efe1e553 Use yaml-static in Dockerfile 2021-08-18 21:23:28 +09:00
Alex Ling
1973564272 Revert "Subscription manager"
This reverts commit a612500b0f.
2021-08-18 12:09:59 +00:00
Alex Ling
29923f6dc7 Merge pull request #214 from Leeingnyo/feature/http-cache
HTTP cache for thumbnails, pages, dimensions
2021-08-18 07:37:21 +08:00
Leeingnyo
4a261d5ff8 Set Cache-Control header at page, dimensions API 2021-08-18 01:33:22 +09:00
Alex Ling
31d425d462 Document the 304 responses 2021-08-17 07:14:24 +00:00
Leeingnyo
a21681a6d7 Update the container version of the build workflow 2021-08-16 13:23:58 +09:00
Leeingnyo
208019a0b9 Update Docker image 2021-08-16 01:48:18 +09:00
Leeingnyo
54e2a54ecb Update crystal-lang, libraries 2021-08-16 01:48:18 +09:00
Leeingnyo
2426ef05ec Apply cache on dimensions API
Use zip_path and mtime for hashing
It is used for weak validation
2021-08-15 21:41:44 +09:00
Leeingnyo
25b90a8724 Apply cache on page, cover API
Get image data and use it for hashing
2021-08-15 21:33:08 +09:00
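
The two commit bodies above describe the caching scheme: derive a weak validator from cheap metadata (`zip_path` plus `mtime` for dimensions, the image bytes for pages and covers) and answer `304 Not Modified` when the client's `If-None-Match` matches. A minimal sketch of that idea in a Kemal handler — the route, `find_entry`, and `dimensions_json` are hypothetical stand-ins, not Mango's actual code:

```crystal
require "kemal"
require "digest"

get "/api/dimensions/:tid/:eid" do |env|
  entry = find_entry env.params.url["tid"], env.params.url["eid"] # hypothetical lookup
  # Weak ETag: cheap to compute, and it changes whenever the archive file changes
  etag = %(W/"#{Digest::SHA1.hexdigest "#{entry.zip_path}#{entry.mtime.to_unix}"}")
  env.response.headers["ETag"] = etag
  env.response.headers["Cache-Control"] = "max-age=0, must-revalidate"
  if env.request.headers["If-None-Match"]? == etag
    env.response.status_code = 304 # the client's cached copy is still valid
    next
  end
  env.response.content_type = "application/json"
  dimensions_json entry # hypothetical serializer; the returned String is the body
end
```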
Alex Ling
cd8944ed2d Slim option in library and title APIs 2021-04-25 12:41:37 +00:00
Alex Ling
7f0c256fe6 Log login errors 2021-04-25 12:41:29 +00:00
Alex Ling
46e6e41bfe Fix reader buttons stacking on mobile 2021-03-29 00:41:33 +00:00
Alex Ling
c9f55e7a8e Use yaml-static 2021-03-28 12:49:50 +00:00
Alex Ling
741c3a4e20 Update config example in README 2021-03-28 11:56:06 +00:00
Alex Ling
f6da20321d Bump version to 0.22.0 2021-03-28 11:49:49 +00:00
Alex Ling
2764e955b2 Show success alert on plugin download page 2021-03-15 17:07:15 +00:00
Alex Ling
00c15014a1 Document subscription APIs 2021-03-15 07:12:30 +00:00
Alex Ling
c6fdbfd9fd Better format ranges on subscription manager page 2021-03-15 07:12:10 +00:00
Alex Ling
e03bf32358 Show success alerts on the download page 2021-03-14 17:36:43 +00:00
Alex Ling
bbf1520c73 Make in_range? private 2021-03-14 17:36:26 +00:00
Alex Ling
8950c3a1ed Fix downloader stuck on external chapters 2021-03-14 16:27:08 +00:00
Alex Ling
17837d8a29 Add tooltips to download manager 2021-03-14 16:03:37 +00:00
Alex Ling
b4a69425c8 Reverse the queue on download manager 2021-03-14 16:01:29 +00:00
Alex Ling
a612500b0f Subscription manager 2021-03-14 16:01:29 +00:00
Alex Ling
9bb7144479 Fix warning 2021-03-12 15:28:39 +00:00
Alex Ling
ee52c52f46 Fix new linter errors 2021-03-12 15:03:12 +00:00
Alex Ling
daec2bdac6 Update ameba 2021-03-12 14:06:20 +00:00
Alex Ling
e9a490676b Update the mangadex shard 2021-03-12 13:59:11 +00:00
Alex Ling
757f7c8214 Upgrade Crystal to 0.36.1 2021-03-12 13:41:24 +00:00
Alex Ling
eed1a9717e Merge branch 'master' into dev 2021-03-10 16:48:51 +00:00
Alex Ling
0b3e78bcb7 Merge branch 'rc/0.21.0' into dev 2021-03-09 16:45:26 +00:00
Alex Ling
6a275286ea Merge branch 'rc/0.21.0' into dev 2021-03-07 14:14:46 +00:00
Alex Ling
d3f26ecbc9 Move the page margin config to frontend 2021-03-06 15:04:44 +00:00
46 changed files with 1223 additions and 1050 deletions

View File

@@ -104,6 +104,15 @@
"contributions": [
"infra"
]
},
{
"login": "lincolnthedev",
"name": "i use arch btw",
"avatar_url": "https://avatars.githubusercontent.com/u/41193328?v=4",
"profile": "https://lncn.dev",
"contributions": [
"infra"
]
}
],
"contributorsPerLine": 7,

View File

@@ -1,2 +1,9 @@
node_modules
lib
Dockerfile
Dockerfile.arm32v7
Dockerfile.arm64v8
README.md
.all-contributorsrc
env.example
.github/

View File

@@ -12,12 +12,12 @@ jobs:
runs-on: ubuntu-latest
container:
image: crystallang/crystal:0.35.1-alpine
image: crystallang/crystal:1.0.0-alpine
steps:
- uses: actions/checkout@v2
- name: Install dependencies
run: apk add --no-cache yarn yaml sqlite-static libarchive-dev libarchive-static acl-static expat-static zstd-static lz4-static bzip2-static libjpeg-turbo-dev libpng-dev tiff-dev
run: apk add --no-cache yarn yaml-static sqlite-static libarchive-dev libarchive-static acl-static expat-static zstd-static lz4-static bzip2-static libjpeg-turbo-dev libpng-dev tiff-dev
- name: Build
run: make static || make static
- name: Linter

View File

@@ -1,9 +1,9 @@
FROM crystallang/crystal:0.35.1-alpine AS builder
FROM crystallang/crystal:1.0.0-alpine AS builder
WORKDIR /Mango
COPY . .
RUN apk add --no-cache yarn yaml sqlite-static libarchive-dev libarchive-static acl-static expat-static zstd-static lz4-static bzip2-static libjpeg-turbo-dev libpng-dev tiff-dev
RUN apk add --no-cache yarn yaml-static sqlite-static libarchive-dev libarchive-static acl-static expat-static zstd-static lz4-static bzip2-static libjpeg-turbo-dev libpng-dev tiff-dev
RUN make static || make static
FROM library/alpine

View File

@@ -2,10 +2,10 @@ FROM arm32v7/ubuntu:18.04
RUN apt-get update && apt-get install -y wget git make llvm-8 llvm-8-dev g++ libsqlite3-dev libyaml-dev libgc-dev libssl-dev libcrypto++-dev libevent-dev libgmp-dev zlib1g-dev libpcre++-dev pkg-config libarchive-dev libxml2-dev libacl1-dev nettle-dev liblzo2-dev liblzma-dev libbz2-dev libjpeg-turbo8-dev libpng-dev libtiff-dev
RUN git clone https://github.com/crystal-lang/crystal && cd crystal && git checkout 0.35.1 && make deps && cd ..
RUN git clone https://github.com/kostya/myhtml && cd myhtml/src/ext && git checkout v1.5.0 && make && cd ..
RUN git clone https://github.com/jessedoyle/duktape.cr && cd duktape.cr/ext && git checkout v0.20.0 && make && cd ..
RUN git clone https://github.com/hkalexling/image_size.cr && cd image_size.cr && git checkout v0.2.0 && make && cd ..
RUN git clone https://github.com/crystal-lang/crystal && cd crystal && git checkout 1.0.0 && make deps && cd ..
RUN git clone https://github.com/kostya/myhtml && cd myhtml/src/ext && git checkout v1.5.8 && make && cd ..
RUN git clone https://github.com/jessedoyle/duktape.cr && cd duktape.cr/ext && git checkout v1.0.0 && make && cd ..
RUN git clone https://github.com/hkalexling/image_size.cr && cd image_size.cr && git checkout v0.5.0 && make && cd ..
COPY mango-arm32v7.o .

View File

@@ -2,10 +2,10 @@ FROM arm64v8/ubuntu:18.04
RUN apt-get update && apt-get install -y wget git make llvm-8 llvm-8-dev g++ libsqlite3-dev libyaml-dev libgc-dev libssl-dev libcrypto++-dev libevent-dev libgmp-dev zlib1g-dev libpcre++-dev pkg-config libarchive-dev libxml2-dev libacl1-dev nettle-dev liblzo2-dev liblzma-dev libbz2-dev libjpeg-turbo8-dev libpng-dev libtiff-dev
RUN git clone https://github.com/crystal-lang/crystal && cd crystal && git checkout 0.35.1 && make deps && cd ..
RUN git clone https://github.com/kostya/myhtml && cd myhtml/src/ext && git checkout v1.5.0 && make && cd ..
RUN git clone https://github.com/jessedoyle/duktape.cr && cd duktape.cr/ext && git checkout v0.20.0 && make && cd ..
RUN git clone https://github.com/hkalexling/image_size.cr && cd image_size.cr && git checkout v0.2.0 && make && cd ..
RUN git clone https://github.com/crystal-lang/crystal && cd crystal && git checkout 1.0.0 && make deps && cd ..
RUN git clone https://github.com/kostya/myhtml && cd myhtml/src/ext && git checkout v1.5.8 && make && cd ..
RUN git clone https://github.com/jessedoyle/duktape.cr && cd duktape.cr/ext && git checkout v1.0.0 && make && cd ..
RUN git clone https://github.com/hkalexling/image_size.cr && cd image_size.cr && git checkout v0.5.0 && make && cd ..
COPY mango-arm64v8.o .

View File

@@ -2,7 +2,7 @@
# Mango
[![Patreon](https://img.shields.io/badge/support-patreon-brightgreen?link=https://www.patreon.com/hkalexling)](https://www.patreon.com/hkalexling) ![Build](https://github.com/hkalexling/Mango/workflows/Build/badge.svg) [![Gitter](https://badges.gitter.im/mango-cr/mango.svg)](https://gitter.im/mango-cr/mango?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge)
[![Patreon](https://img.shields.io/badge/support-patreon-brightgreen?link=https://www.patreon.com/hkalexling)](https://www.patreon.com/hkalexling) ![Build](https://github.com/hkalexling/Mango/workflows/Build/badge.svg) [![Gitter](https://badges.gitter.im/mango-cr/mango.svg)](https://gitter.im/mango-cr/mango?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge) [![Discord](https://img.shields.io/discord/855633663425118228?label=discord)](http://discord.com/invite/ezKtacCp9Q)
Mango is a self-hosted manga server and reader. Its features include
@@ -13,7 +13,6 @@ Mango is a self-hosted manga server and reader. Its features include
- Supports nested folders in library
- Automatically stores reading progress
- Thumbnail generation
- Built-in [MangaDex](https://mangadex.org/) downloader
- Supports [plugins](https://github.com/hkalexling/mango-plugins) to download from third-party sites
- The web reader is responsive and works well on mobile, so there is no need for a mobile app
- All the static files are embedded in the binary, so the deployment process is easy and painless
@@ -52,7 +51,7 @@ The official docker images are available on [Dockerhub](https://hub.docker.com/r
### CLI
```
Mango - Manga Server and Web Reader. Version 0.21.0
Mango - Manga Server and Web Reader. Version 0.24.0
Usage:
@@ -87,7 +86,10 @@ log_level: info
upload_path: ~/mango/uploads
plugin_path: ~/mango/plugins
download_timeout_seconds: 30
page_margin: 30
library_cache_path: ~/mango/library.yml.gz
cache_enabled: false
cache_size_mbs: 50
cache_log_enabled: true
disable_login: false
default_username: ""
auth_proxy_header_name: ""
@@ -104,6 +106,7 @@ mangadex:
- `scan_interval_minutes`, `thumbnail_generation_interval_hours` and `db_optimization_interval_hours` can be any non-negative integer. Setting them to `0` disables the periodic tasks
- `log_level` can be `debug`, `info`, `warn`, `error`, `fatal` or `off`. Setting it to `off` disables the logging
- You can disable authentication by setting `disable_login` to true. Note that `default_username` must be set to an existing username for this to work.
- By setting `cache_enabled` to `true`, you can enable an experimental feature where Mango caches library metadata to improve page load time. You can further fine-tune the feature with `cache_size_mbs` and `cache_log_enabled`.
### Library Structure
@@ -175,6 +178,7 @@ Please check the [development guideline](https://github.com/hkalexling/Mango/wik
<td align="center"><a href="https://github.com/Leeingnyo"><img src="https://avatars0.githubusercontent.com/u/6760150?v=4?s=100" width="100px;" alt=""/><br /><sub><b>이인용</b></sub></a><br /><a href="https://github.com/hkalexling/Mango/commits?author=Leeingnyo" title="Code">💻</a></td>
<td align="center"><a href="http://h45h74x.eu.org"><img src="https://avatars1.githubusercontent.com/u/27204033?v=4?s=100" width="100px;" alt=""/><br /><sub><b>Simon</b></sub></a><br /><a href="https://github.com/hkalexling/Mango/commits?author=h45h74x" title="Code">💻</a></td>
<td align="center"><a href="https://github.com/davidkna"><img src="https://avatars.githubusercontent.com/u/835177?v=4?s=100" width="100px;" alt=""/><br /><sub><b>David Knaack</b></sub></a><br /><a href="#infra-davidkna" title="Infrastructure (Hosting, Build-Tools, etc)">🚇</a></td>
<td align="center"><a href="https://lncn.dev"><img src="https://avatars.githubusercontent.com/u/41193328?v=4?s=100" width="100px;" alt=""/><br /><sub><b>i use arch btw</b></sub></a><br /><a href="#infra-lincolnthedev" title="Infrastructure (Hosting, Build-Tools, etc)">🚇</a></td>
</tr>
</table>
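
The cache options documented in the README diff above slot into the existing YAML config file. A minimal sketch using the defaults listed in the README (the path is just the documented default, not a requirement):

```yaml
# Excerpt of a Mango config -- cache-related keys added in this release
library_cache_path: ~/mango/library.yml.gz # gzipped YAML snapshot of the scanned library
cache_enabled: true                        # opt in to the experimental metadata cache
cache_size_mbs: 50                         # LRU cache budget in megabytes
cache_log_enabled: true                    # log cache hits/misses at the debug level
```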

View File

@@ -1,287 +0,0 @@
const downloadComponent = () => {
return {
chaptersLimit: 1000,
loading: false,
addingToDownload: false,
searchAvailable: false,
searchInput: '',
data: {},
chapters: [],
mangaAry: undefined, // undefined: not searching; []: searched but no result
candidateManga: {},
langChoice: 'All',
groupChoice: 'All',
chapterRange: '',
volumeRange: '',
get languages() {
const set = new Set();
if (this.data.chapters) {
this.data.chapters.forEach(chp => {
set.add(chp.language);
});
}
const ary = [...set].sort();
ary.unshift('All');
return ary;
},
get groups() {
const set = new Set();
if (this.data.chapters) {
this.data.chapters.forEach(chp => {
Object.keys(chp.groups).forEach(g => {
set.add(g);
});
});
}
const ary = [...set].sort();
ary.unshift('All');
return ary;
},
init() {
const tableObserver = new MutationObserver(() => {
console.log('table mutated');
$("#selectable").selectable({
filter: 'tr'
});
});
tableObserver.observe($('table').get(0), {
childList: true,
subtree: true
});
$.getJSON(`${base_url}api/admin/mangadex/expires`)
.done((data) => {
if (data.error) {
alert('danger', 'Failed to check MangaDex integration status. Error: ' + data.error);
return;
}
if (data.expires && data.expires > Math.floor(Date.now() / 1000))
this.searchAvailable = true;
})
.fail((jqXHR, status) => {
alert('danger', `Failed to check MangaDex integration status. Error: [${jqXHR.status}] ${jqXHR.statusText}`);
})
},
filtersUpdated() {
if (!this.data.chapters)
this.chapters = [];
const filters = {
chapter: this.parseRange(this.chapterRange),
volume: this.parseRange(this.volumeRange),
lang: this.langChoice,
group: this.groupChoice
};
console.log('filters:', filters);
let _chapters = this.data.chapters.slice();
Object.entries(filters).forEach(([k, v]) => {
if (v === 'All') return;
if (k === 'group') {
_chapters = _chapters.filter(c => {
const unescaped_groups = Object.entries(c.groups).map(([g, id]) => this.unescapeHTML(g));
return unescaped_groups.indexOf(v) >= 0;
});
return;
}
if (k === 'lang') {
_chapters = _chapters.filter(c => c.language === v);
return;
}
const lb = parseFloat(v[0]);
const ub = parseFloat(v[1]);
if (isNaN(lb) && isNaN(ub)) return;
_chapters = _chapters.filter(c => {
const val = parseFloat(c[k]);
if (isNaN(val)) return false;
if (isNaN(lb))
return val <= ub;
else if (isNaN(ub))
return val >= lb;
else
return val >= lb && val <= ub;
});
});
console.log('filtered chapters:', _chapters);
this.chapters = _chapters;
},
search() {
if (this.loading || this.searchInput === '') return;
this.data = {};
this.mangaAry = undefined;
var int_id = -1;
try {
const path = new URL(this.searchInput).pathname;
const match = /\/(?:title|manga)\/([0-9]+)/.exec(path);
int_id = parseInt(match[1]);
} catch (e) {
int_id = parseInt(this.searchInput);
}
if (!isNaN(int_id) && int_id > 0) {
// The input is a positive integer. We treat it as an ID.
this.loading = true;
$.getJSON(`${base_url}api/admin/mangadex/manga/${int_id}`)
.done((data) => {
if (data.error) {
alert('danger', 'Failed to get manga info. Error: ' + data.error);
return;
}
this.data = data;
this.chapters = data.chapters;
this.mangaAry = undefined;
})
.fail((jqXHR, status) => {
alert('danger', `Failed to get manga info. Error: [${jqXHR.status}] ${jqXHR.statusText}`);
})
.always(() => {
this.loading = false;
});
} else {
if (!this.searchAvailable) {
alert('danger', 'Please make sure you are using a valid manga ID or manga URL from Mangadex. If you are trying to search MangaDex with a search term, please log in to MangaDex first by going to "Admin -> Connect to MangaDex".');
return;
}
// Search as a search term
this.loading = true;
$.getJSON(`${base_url}api/admin/mangadex/search?${$.param({
query: this.searchInput
})}`)
.done((data) => {
if (data.error) {
alert('danger', `Failed to search MangaDex. Error: ${data.error}`);
return;
}
this.mangaAry = data.manga;
this.data = {};
})
.fail((jqXHR, status) => {
alert('danger', `Failed to search MangaDex. Error: [${jqXHR.status}] ${jqXHR.statusText}`);
})
.always(() => {
this.loading = false;
});
}
},
parseRange(str) {
const regex = /^[\t ]*(?:(?:(<|<=|>|>=)[\t ]*([0-9]+))|(?:([0-9]+))|(?:([0-9]+)[\t ]*-[\t ]*([0-9]+))|(?:[\t ]*))[\t ]*$/m;
const matches = str.match(regex);
var num;
if (!matches) {
return [null, null];
} else if (typeof matches[1] !== 'undefined' && typeof matches[2] !== 'undefined') {
// e.g., <= 30
num = parseInt(matches[2]);
if (isNaN(num)) {
return [null, null];
}
switch (matches[1]) {
case '<':
return [null, num - 1];
case '<=':
return [null, num];
case '>':
return [num + 1, null];
case '>=':
return [num, null];
}
} else if (typeof matches[3] !== 'undefined') {
// a single number
num = parseInt(matches[3]);
if (isNaN(num)) {
return [null, null];
}
return [num, num];
} else if (typeof matches[4] !== 'undefined' && typeof matches[5] !== 'undefined') {
// e.g., 10 - 23
num = parseInt(matches[4]);
const n2 = parseInt(matches[5]);
if (isNaN(num) || isNaN(n2) || num > n2) {
return [null, null];
}
return [num, n2];
} else {
// empty or space only
return [null, null];
}
},
unescapeHTML(str) {
var elt = document.createElement("span");
elt.innerHTML = str;
return elt.innerText;
},
selectAll() {
$('tbody > tr').each((i, e) => {
$(e).addClass('ui-selected');
});
},
clearSelection() {
$('tbody > tr').each((i, e) => {
$(e).removeClass('ui-selected');
});
},
download() {
const selected = $('tbody > tr.ui-selected');
if (selected.length === 0) return;
UIkit.modal.confirm(`Download ${selected.length} selected chapters?`).then(() => {
const ids = selected.map((i, e) => {
return parseInt($(e).find('td').first().text());
}).get();
const chapters = this.chapters.filter(c => ids.indexOf(c.id) >= 0);
console.log(ids);
this.addingToDownload = true;
$.ajax({
type: 'POST',
url: `${base_url}api/admin/mangadex/download`,
data: JSON.stringify({
chapters: chapters
}),
contentType: "application/json",
dataType: 'json'
})
.done(data => {
console.log(data);
if (data.error) {
alert('danger', `Failed to add chapters to the download queue. Error: ${data.error}`);
return;
}
const successCount = parseInt(data.success);
const failCount = parseInt(data.fail);
UIkit.modal.confirm(`${successCount} of ${successCount + failCount} chapters added to the download queue. Proceed to the download manager?`).then(() => {
window.location.href = base_url + 'admin/downloads';
});
})
.fail((jqXHR, status) => {
alert('danger', `Failed to add chapters to the download queue. Error: [${jqXHR.status}] ${jqXHR.statusText}`);
})
.always(() => {
this.addingToDownload = false;
});
});
},
chooseManga(manga) {
this.candidateManga = manga;
UIkit.modal($('#modal').get(0)).show();
},
confirmManga(id) {
UIkit.modal($('#modal').get(0)).hide();
this.searchInput = id;
this.search();
}
};
};

View File

@@ -1,61 +0,0 @@
const component = () => {
return {
username: '',
password: '',
expires: undefined,
loading: true,
loggingIn: false,
init() {
this.loading = true;
$.ajax({
type: 'GET',
url: `${base_url}api/admin/mangadex/expires`,
contentType: "application/json",
})
.done(data => {
console.log(data);
if (data.error) {
alert('danger', `Failed to retrieve MangaDex token status. Error: ${data.error}`);
return;
}
this.expires = data.expires;
this.loading = false;
})
.fail((jqXHR, status) => {
alert('danger', `Failed to retrieve MangaDex token status. Error: [${jqXHR.status}] ${jqXHR.statusText}`);
});
},
login() {
if (!(this.username && this.password)) return;
this.loggingIn = true;
$.ajax({
type: 'POST',
url: `${base_url}api/admin/mangadex/login`,
contentType: "application/json",
dataType: 'json',
data: JSON.stringify({
username: this.username,
password: this.password
})
})
.done(data => {
console.log(data);
if (data.error) {
alert('danger', `Failed to log in. Error: ${data.error}`);
return;
}
this.expires = data.expires;
})
.fail((jqXHR, status) => {
alert('danger', `Failed to log in. Error: [${jqXHR.status}] ${jqXHR.statusText}`);
})
.always(() => {
this.loggingIn = false;
});
},
get expired() {
return this.expires && moment().diff(moment.unix(this.expires)) > 0;
}
};
};

View File

@@ -126,9 +126,7 @@ const download = () => {
}
const successCount = parseInt(data.success);
const failCount = parseInt(data.fail);
UIkit.modal.confirm(`${successCount} of ${successCount + failCount} chapters added to the download queue. Proceed to the download manager?`).then(() => {
window.location.href = base_url + 'admin/downloads';
});
alert('success', `${successCount} of ${successCount + failCount} chapters added to the download queue. You can view and manage your download queue on the <a href="${base_url}admin/downloads">download manager page</a>.`);
})
.fail((jqXHR, status) => {
alert('danger', `Failed to add chapters to the download queue. Error: [${jqXHR.status}] ${jqXHR.statusText}`);

View File

@@ -6,10 +6,13 @@ const readerComponent = () => {
alertClass: 'uk-alert-primary',
items: [],
curItem: {},
enableFlipAnimation: true,
flipAnimation: null,
longPages: false,
lastSavedPage: page,
selectedIndex: 0, // 0: not selected; 1: the first page
margin: 30,
preloadLookahead: 3,
/**
* Initialize the component by fetching the page dimensions
@@ -27,7 +30,6 @@ const readerComponent = () => {
url: `${base_url}api/page/${tid}/${eid}/${i+1}`,
width: d.width,
height: d.height,
style: `margin-top: ${data.margin}px; margin-bottom: ${data.margin}px;`
};
});
@@ -47,6 +49,21 @@ const readerComponent = () => {
const mode = this.mode;
this.updateMode(this.mode, page, nextTick);
$('#mode-select').val(mode);
const savedMargin = localStorage.getItem('margin');
if (savedMargin) {
this.margin = savedMargin;
}
// Preload Images
this.preloadLookahead = +(localStorage.getItem('preloadLookahead') ?? 3);
const limit = Math.min(page + this.preloadLookahead, this.items.length + 1);
for (let idx = page + 1; idx <= limit; idx++) {
this.preloadImage(this.items[idx - 1].url);
}
const savedFlipAnimation = localStorage.getItem('enableFlipAnimation');
this.enableFlipAnimation = savedFlipAnimation === null || savedFlipAnimation === 'true';
})
.catch(e => {
const errMsg = `Failed to get the page dimensions. ${e}`;
@@ -55,6 +72,12 @@ const readerComponent = () => {
this.msg = errMsg;
})
},
/**
* Preload an image, which is expected to be cached
*/
preloadImage(url) {
(new Image()).src = url;
},
/**
* Handles the `change` event for the page selector
*/
@@ -106,12 +129,18 @@ const readerComponent = () => {
if (newIdx <= 0 || newIdx > this.items.length) return;
if (newIdx + this.preloadLookahead < this.items.length + 1) {
this.preloadImage(this.items[newIdx + this.preloadLookahead - 1].url);
}
this.toPage(newIdx);
if (isNext)
this.flipAnimation = 'right';
else
this.flipAnimation = 'left';
if (this.enableFlipAnimation) {
if (isNext)
this.flipAnimation = 'right';
else
this.flipAnimation = 'left';
}
setTimeout(() => {
this.flipAnimation = null;
@@ -277,6 +306,19 @@ const readerComponent = () => {
entryChanged() {
const id = $('#entry-select').val();
this.redirect(`${base_url}reader/${tid}/${id}`);
}
},
marginChanged() {
localStorage.setItem('margin', this.margin);
this.toPage(this.selectedIndex);
},
preloadLookaheadChanged() {
localStorage.setItem('preloadLookahead', this.preloadLookahead);
},
enableFlipAnimationChanged() {
localStorage.setItem('enableFlipAnimation', this.enableFlipAnimation);
},
};
}

public/js/subscription.js Normal file (82 lines added)
View File

@@ -0,0 +1,82 @@
const component = () => {
return {
available: undefined,
subscriptions: [],
init() {
$.getJSON(`${base_url}api/admin/mangadex/expires`)
.done((data) => {
if (data.error) {
alert('danger', 'Failed to check MangaDex integration status. Error: ' + data.error);
return;
}
this.available = Boolean(data.expires && data.expires > Math.floor(Date.now() / 1000));
if (this.available) this.getSubscriptions();
})
.fail((jqXHR, status) => {
alert('danger', `Failed to check MangaDex integration status. Error: [${jqXHR.status}] ${jqXHR.statusText}`);
})
},
getSubscriptions() {
$.getJSON(`${base_url}api/admin/mangadex/subscriptions`)
.done(data => {
if (data.error) {
alert('danger', 'Failed to get subscriptions. Error: ' + data.error);
return;
}
this.subscriptions = data.subscriptions;
})
.fail((jqXHR, status) => {
alert('danger', `Failed to get subscriptions. Error: [${jqXHR.status}] ${jqXHR.statusText}`);
})
},
rm(event) {
const id = event.currentTarget.parentNode.getAttribute('data-id');
$.ajax({
type: 'DELETE',
url: `${base_url}api/admin/mangadex/subscriptions/${id}`,
contentType: 'application/json'
})
.done(data => {
if (data.error) {
alert('danger', `Failed to delete subscription. Error: ${data.error}`);
}
this.getSubscriptions();
})
.fail((jqXHR, status) => {
alert('danger', `Failed to delete subscription. Error: [${jqXHR.status}] ${jqXHR.statusText}`);
});
},
check(event) {
const id = event.currentTarget.parentNode.getAttribute('data-id');
$.ajax({
type: 'POST',
url: `${base_url}api/admin/mangadex/subscriptions/check/${id}`,
contentType: 'application/json'
})
.done(data => {
if (data.error) {
alert('danger', `Failed to check subscription. Error: ${data.error}`);
return;
}
alert('success', 'Mango is now checking the subscription for updates. This might take a while, but you can safely leave the page.');
})
.fail((jqXHR, status) => {
alert('danger', `Failed to check subscription. Error: [${jqXHR.status}] ${jqXHR.statusText}`);
});
},
formatRange(min, max) {
if (!isNaN(min) && isNaN(max)) return `${min}`;
if (isNaN(min) && !isNaN(max)) return `${max}`;
if (isNaN(min) && isNaN(max)) return 'All';
if (min === max) return `= ${min}`;
return `${min} - ${max}`;
}
};
};

View File

@@ -2,81 +2,77 @@ version: 2.0
shards:
ameba:
git: https://github.com/crystal-ameba/ameba.git
version: 0.12.1
version: 0.14.3
archive:
git: https://github.com/hkalexling/archive.cr.git
version: 0.4.0
version: 0.5.0
baked_file_system:
git: https://github.com/schovi/baked_file_system.git
version: 0.9.8+git.commit.fb3091b546797fbec3c25dc0e1e2cff60bb9033b
version: 0.10.0
clim:
git: https://github.com/at-grandpa/clim.git
version: 0.12.0
version: 0.17.1
db:
git: https://github.com/crystal-lang/crystal-db.git
version: 0.9.0
version: 0.10.1
duktape:
git: https://github.com/jessedoyle/duktape.cr.git
version: 0.20.0
version: 1.0.0
exception_page:
git: https://github.com/crystal-loot/exception_page.git
version: 0.1.4
version: 0.1.5
http_proxy:
git: https://github.com/mamantoha/http_proxy.git
version: 0.7.1
version: 0.8.0
image_size:
git: https://github.com/hkalexling/image_size.cr.git
version: 0.4.0
version: 0.5.0
kemal:
git: https://github.com/kemalcr/kemal.git
version: 0.27.0
version: 1.0.0
kemal-session:
git: https://github.com/kemalcr/kemal-session.git
version: 0.12.1
version: 1.0.0
kilt:
git: https://github.com/jeromegn/kilt.git
version: 0.4.0
version: 0.4.1
koa:
git: https://github.com/hkalexling/koa.git
version: 0.7.0
mangadex:
git: https://github.com/hkalexling/mangadex.git
version: 0.8.0+git.commit.24e6fb51afd043721139355854e305b43bf98c43
version: 0.8.0
mg:
git: https://github.com/hkalexling/mg.git
version: 0.3.0+git.commit.a19417abf03eece80039f89569926cff1ce3a1a3
version: 0.5.0+git.commit.697e46e27cde8c3969346e228e372db2455a6264
myhtml:
git: https://github.com/kostya/myhtml.git
version: 1.5.1
version: 1.5.8
open_api:
git: https://github.com/jreinert/open_api.cr.git
version: 1.2.1+git.commit.95e4df2ca10b1fe88b8b35c62a18b06a10267b6c
git: https://github.com/hkalexling/open_api.cr.git
version: 1.2.1+git.commit.1d3c55dd5534c6b0af18964d031858a08515553a
radix:
git: https://github.com/luislavena/radix.git
version: 0.3.9
version: 0.4.1
sqlite3:
git: https://github.com/crystal-lang/crystal-sqlite3.git
version: 0.16.0
version: 0.18.0
tallboy:
git: https://github.com/epoch/tallboy.git
version: 0.9.3
version: 0.9.3+git.commit.9be1510bb0391c95e92f1b288f3afb429a73caa6

View File

@@ -1,5 +1,5 @@
name: mango
version: 0.21.0
version: 0.24.0
authors:
- Alex Ling <hkalexling@gmail.com>
@@ -8,7 +8,7 @@ targets:
mango:
main: src/mango.cr
crystal: 0.35.1
crystal: 1.0.0
license: MIT
@@ -21,7 +21,6 @@ dependencies:
github: crystal-lang/crystal-sqlite3
baked_file_system:
github: schovi/baked_file_system
version: 0.9.8+git.commit.fb3091b546797fbec3c25dc0e1e2cff60bb9033b
archive:
github: hkalexling/archive.cr
ameba:
@@ -30,7 +29,6 @@ dependencies:
github: at-grandpa/clim
duktape:
github: jessedoyle/duktape.cr
version: ~> 0.20.0
myhtml:
github: kostya/myhtml
http_proxy:
@@ -41,7 +39,6 @@ dependencies:
github: hkalexling/koa
tallboy:
github: epoch/tallboy
branch: master
mg:
github: hkalexling/mg
mangadex:
github: hkalexling/mangadex

View File

@@ -8,9 +8,7 @@ describe Storage do
end
it "deletes user" do
with_storage do |storage|
storage.delete_user "admin"
end
with_storage &.delete_user "admin"
end
it "creates new user" do

View File

@@ -21,7 +21,7 @@ describe "compare_numerically" do
it "sorts like the stack exchange post" do
ary = ["2", "12", "200000", "1000000", "a", "a12", "b2", "text2",
"text2a", "text2a2", "text2a12", "text2ab", "text12", "text12a"]
ary.reverse.sort { |a, b|
ary.reverse.sort! { |a, b|
compare_numerically a, b
}.should eq ary
end
@@ -29,7 +29,7 @@ describe "compare_numerically" do
# https://github.com/hkalexling/Mango/issues/22
it "handles numbers larger than Int32" do
ary = ["14410155591588.jpg", "21410155591588.png", "104410155591588.jpg"]
ary.reverse.sort { |a, b|
ary.reverse.sort! { |a, b|
compare_numerically a, b
}.should eq ary
end
@@ -56,8 +56,18 @@ describe "chapter_sort" do
it "sorts correctly" do
ary = ["Vol.1 Ch.01", "Vol.1 Ch.02", "Vol.2 Ch. 2.5", "Ch. 3", "Ch.04"]
sorter = ChapterSorter.new ary
ary.reverse.sort do |a, b|
ary.reverse.sort! do |a, b|
sorter.compare a, b
end.should eq ary
end
end
describe "sanitize_filename" do
it "returns a random string for empty sanitized string" do
sanitize_filename("..").should_not eq sanitize_filename("..")
end
it "sanitizes correctly" do
sanitize_filename(".. \n\v.\rマンゴー/|*()<[1/2] 3.14 hello world ")
.should eq "マンゴー_()[1_2] 3.14 hello world"
end
end

View File

@@ -11,6 +11,8 @@ class Config
property session_secret : String = "mango-session-secret"
property library_path : String = File.expand_path "~/mango/library",
home: true
property library_cache_path = File.expand_path "~/mango/library.yml.gz",
home: true
property db_path : String = File.expand_path "~/mango/mango.db", home: true
property scan_interval_minutes : Int32 = 5
property thumbnail_generation_interval_hours : Int32 = 24
@@ -20,7 +22,9 @@ class Config
property plugin_path : String = File.expand_path "~/mango/plugins",
home: true
property download_timeout_seconds : Int32 = 30
property page_margin : Int32 = 30
property cache_enabled = false
property cache_size_mbs = 50
property cache_log_enabled = true
property disable_login = false
property default_username = ""
property auth_proxy_header_name = ""

src/library/cache.cr Normal file (188 lines added)
View File

@@ -0,0 +1,188 @@
require "digest"
require "./entry"
require "./types"
# Base class for an entry in the LRU cache.
# There are two ways to use it:
# 1. Use it as it is by instantiating with the appropriate `SaveT` and
# `ReturnT`. Note that in this case, `SaveT` and `ReturnT` must be the
# same type. That is, the input value will be stored as it is without
# any transformation.
# 2. You can also subclass it and provide custom implementations for
# `to_save_t` and `to_return_t`. This allows you to transform and store
# the input value to a different type. See `SortedEntriesCacheEntry` as
# an example.
private class CacheEntry(SaveT, ReturnT)
getter key : String, atime : Time
@value : SaveT
def initialize(@key : String, value : ReturnT)
@atime = @ctime = Time.utc
@value = self.class.to_save_t value
end
def value
@atime = Time.utc
self.class.to_return_t @value
end
def self.to_save_t(value : ReturnT)
value
end
def self.to_return_t(value : SaveT)
value
end
def instance_size
instance_sizeof(CacheEntry(SaveT, ReturnT)) + # sizeof itself
instance_sizeof(String) + @key.bytesize + # allocated memory for @key
@value.instance_size
end
end
class SortedEntriesCacheEntry < CacheEntry(Array(String), Array(Entry))
def self.to_save_t(value : Array(Entry))
value.map &.id
end
def self.to_return_t(value : Array(String))
ids_to_entries value
end
private def self.ids_to_entries(ids : Array(String))
e_map = Library.default.deep_entries.to_h { |entry| {entry.id, entry} }
entries = [] of Entry
begin
ids.each do |id|
entries << e_map[id]
end
return entries if ids.size == entries.size
rescue
end
end
def instance_size
instance_sizeof(SortedEntriesCacheEntry) + # sizeof itself
instance_sizeof(String) + @key.bytesize + # allocated memory for @key
@value.size * (instance_sizeof(String) + sizeof(String)) +
@value.sum(&.bytesize) # elements in Array(String)
end
def self.gen_key(book_id : String, username : String,
entries : Array(Entry), opt : SortOptions?)
entries_sig = Digest::SHA1.hexdigest (entries.map &.id).to_s
user_context = opt && opt.method == SortMethod::Progress ? username : ""
sig = Digest::SHA1.hexdigest (book_id + entries_sig + user_context +
(opt ? opt.to_tuple.to_s : "nil"))
"#{sig}:sorted_entries"
end
end
class String
def instance_size
instance_sizeof(String) + bytesize
end
end
struct Tuple(*T)
def instance_size
sizeof(T) + # total size of non-reference types
self.sum do |e|
next 0 unless e.is_a? Reference
if e.responds_to? :instance_size
e.instance_size
else
instance_sizeof(typeof(e))
end
end
end
end
alias CacheableType = Array(Entry) | String | Tuple(String, Int32)
alias CacheEntryType = SortedEntriesCacheEntry |
CacheEntry(String, String) |
CacheEntry(Tuple(String, Int32), Tuple(String, Int32))
def generate_cache_entry(key : String, value : CacheableType)
if value.is_a? Array(Entry)
SortedEntriesCacheEntry.new key, value
else
CacheEntry(typeof(value), typeof(value)).new key, value
end
end
# LRU Cache
class LRUCache
@@limit : Int128 = Int128.new 0
@@should_log = true
# key => entry
@@cache = {} of String => CacheEntryType
def self.enabled
Config.current.cache_enabled
end
def self.init
cache_size = Config.current.cache_size_mbs
@@limit = Int128.new cache_size * 1024 * 1024 if enabled
@@should_log = Config.current.cache_log_enabled
end
def self.get(key : String)
return unless enabled
entry = @@cache[key]?
if @@should_log
Logger.debug "LRUCache #{entry.nil? ? "miss" : "hit"} #{key}"
end
return entry.value unless entry.nil?
end
def self.set(cache_entry : CacheEntryType)
return unless enabled
key = cache_entry.key
@@cache[key] = cache_entry
Logger.debug "LRUCache cached #{key}" if @@should_log
remove_least_recent_access
end
def self.invalidate(key : String)
return unless enabled
@@cache.delete key
end
def self.print
return unless @@should_log
sum = @@cache.sum { |_, entry| entry.instance_size }
Logger.debug "---- LRU Cache ----"
Logger.debug "Size: #{sum} Bytes"
Logger.debug "List:"
@@cache.each do |k, v|
Logger.debug "#{k} | #{v.atime} | #{v.instance_size}"
end
Logger.debug "-------------------"
end
private def self.is_cache_full
sum = @@cache.sum { |_, entry| entry.instance_size }
sum > @@limit
end
private def self.remove_least_recent_access
if @@should_log && is_cache_full
Logger.debug "Removing entries from LRUCache"
end
while is_cache_full && @@cache.size > 0
min_tuple = @@cache.min_by { |_, entry| entry.atime }
min_key = min_tuple[0]
min_entry = min_tuple[1]
Logger.debug " \
Target: #{min_key}, \
Last Access Time: #{min_entry.atime}" if @@should_log
invalidate min_key
end
end
end
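
To show how the pieces in this file fit together, here is a minimal usage sketch. It is a hypothetical caller: `book_id`, `username`, `entries`, `opt`, and `sort_entries` are stand-ins, while `LRUCache.init`, `LRUCache.get`, `LRUCache.set`, `generate_cache_entry`, and `SortedEntriesCacheEntry.gen_key` are the methods defined above:

```crystal
LRUCache.init # reads cache_enabled, cache_size_mbs and cache_log_enabled from Config

key = SortedEntriesCacheEntry.gen_key book_id, username, entries, opt
cached = LRUCache.get key
# A hit returns the stored Array(String) of IDs rehydrated into Array(Entry)
return cached if cached.is_a? Array(Entry)

sorted = sort_entries entries, opt, username # hypothetical sorting helper
# Stores only the entry IDs (see to_save_t) and evicts the least recently
# accessed entries once the configured size budget is exceeded
LRUCache.set generate_cache_entry(key, sorted)
sorted
```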

View File

@@ -1,6 +1,9 @@
require "image_size"
require "yaml"
class Entry
include YAML::Serializable
getter zip_path : String, book : Title, title : String,
size : String, pages : Int32, id : String, encoded_path : String,
encoded_title : String, mtime : Time, err_msg : String?
@@ -46,16 +49,20 @@ class Entry
file.close
end
def to_json(json : JSON::Builder)
json.object do
{% for str in ["zip_path", "title", "size", "id"] %}
def build_json(*, slim = false)
JSON.build do |json|
json.object do
{% for str in ["zip_path", "title", "size", "id"] %}
json.field {{str}}, @{{str.id}}
{% end %}
json.field "title_id", @book.id
json.field "display_name", @book.display_name @title
json.field "cover_url", cover_url
json.field "pages" { json.number @pages }
json.field "mtime" { json.number @mtime.to_unix }
json.field "title_id", @book.id
json.field "pages" { json.number @pages }
unless slim
json.field "display_name", @book.display_name @title
json.field "cover_url", cover_url
json.field "mtime" { json.number @mtime.to_unix }
end
end
end
end
@@ -69,9 +76,17 @@ class Entry
def cover_url
return "#{Config.current.base_url}img/icon.png" if @err_msg
unless @book.entry_cover_url_cache
TitleInfo.new @book.dir do |info|
@book.entry_cover_url_cache = info.entry_cover_url
end
end
entry_cover_url = @book.entry_cover_url_cache
url = "#{Config.current.base_url}api/cover/#{@book.id}/#{@id}"
TitleInfo.new @book.dir do |info|
info_url = info.entry_cover_url[@title]?
if entry_cover_url
info_url = entry_cover_url[@title]?
unless info_url.nil? || info_url.empty?
url = File.join Config.current.base_url, info_url
end
@@ -86,7 +101,7 @@ class Entry
SUPPORTED_IMG_TYPES.includes? \
MIME.from_filename? e.filename
}
.sort { |a, b|
.sort! { |a, b|
compare_numerically a.filename, b.filename
}
yield file, entries
@@ -158,6 +173,16 @@ class Entry
# For backward compatibility with v0.1.0, we save entry titles
# instead of IDs in info.json
def save_progress(username, page)
LRUCache.invalidate "#{@book.id}:#{username}:progress_sum"
@book.parents.each do |parent|
LRUCache.invalidate "#{parent.id}:#{username}:progress_sum"
end
[false, true].each do |ascend|
sorted_entries_cache_key = SortedEntriesCacheEntry.gen_key @book.id,
username, @book.entries, SortOptions.new(SortMethod::Progress, ascend)
LRUCache.invalidate sorted_entries_cache_key
end
TitleInfo.new @book.dir do |info|
if info.progress[username]?.nil?
info.progress[username] = {@title => page}

View File

@@ -1,12 +1,38 @@
class Library
include YAML::Serializable
getter dir : String, title_ids : Array(String),
title_hash : Hash(String, Title)
use_default
def initialize
register_mime_types
def save_instance
path = Config.current.library_cache_path
Logger.debug "Caching library to #{path}"
writer = Compress::Gzip::Writer.new path,
Compress::Gzip::BEST_COMPRESSION
writer.write self.to_yaml.to_slice
writer.close
end
def self.load_instance
path = Config.current.library_cache_path
return unless File.exists? path
Logger.debug "Loading cached library from #{path}"
begin
Compress::Gzip::Reader.open path do |content|
@@default = Library.from_yaml content
end
Library.default.register_jobs
rescue e
Logger.error e
end
end
def initialize
@dir = Config.current.library_path
# explicitly initialize @titles to bypass the compiler check. it will
# be filled with actual Titles in the `scan` call below
@@ -16,6 +42,12 @@ class Library
@entries_count = 0
@thumbnails_count = 0
register_jobs
end
protected def register_jobs
register_mime_types
scan_interval = Config.current.scan_interval_minutes
if scan_interval < 1
scan
@@ -25,7 +57,7 @@ class Library
start = Time.local
scan
ms = (Time.local - start).total_milliseconds
Logger.info "Scanned #{@title_ids.size} titles in #{ms}ms"
Logger.debug "Library initialized in #{ms}ms"
sleep scan_interval.minutes
end
end
@@ -51,11 +83,6 @@ class Library
def sorted_titles(username, opt : SortOptions? = nil)
if opt.nil?
opt = SortOptions.from_info_json @dir, username
else
TitleInfo.new @dir do |info|
info.sort_by[username] = opt.to_tuple
info.save
end
end
# Helper function from src/util/util.cr
@@ -63,14 +90,24 @@ class Library
end
def deep_titles
titles + titles.map { |t| t.deep_titles }.flatten
titles + titles.flat_map &.deep_titles
end
def to_json(json : JSON::Builder)
json.object do
json.field "dir", @dir
json.field "titles" do
json.raw self.titles.to_json
def deep_entries
titles.flat_map &.deep_entries
end
def build_json(*, slim = false, depth = -1)
JSON.build do |json|
json.object do
json.field "dir", @dir
json.field "titles" do
json.array do
self.titles.each do |title|
json.raw title.build_json(slim: slim, depth: depth)
end
end
end
end
end
end
@@ -84,6 +121,7 @@ class Library
end
def scan
start = Time.local
unless Dir.exists? @dir
Logger.info "The library directory #{@dir} does not exist. " \
"Attempting to create it"
@@ -92,14 +130,36 @@ class Library
storage = Storage.new auto_close: false
examine_context : ExamineContext = {
cached_contents_signature: {} of String => String,
deleted_title_ids: [] of String,
deleted_entry_ids: [] of String,
}
@title_ids.select! do |title_id|
title = @title_hash[title_id]
existence = title.examine examine_context
unless existence
examine_context["deleted_title_ids"].concat [title_id] +
title.deep_titles.map &.id
examine_context["deleted_entry_ids"].concat title.deep_entries.map &.id
end
existence
end
remained_title_dirs = @title_ids.map { |id| title_hash[id].dir }
examine_context["deleted_title_ids"].each do |title_id|
@title_hash.delete title_id
end
cache = examine_context["cached_contents_signature"]
(Dir.entries @dir)
.select { |fn| !fn.starts_with? "." }
.map { |fn| File.join @dir, fn }
.select { |path| !(remained_title_dirs.includes? path) }
.select { |path| File.directory? path }
.map { |path| Title.new path, "" }
.map { |path| Title.new path, "", cache }
.select { |title| !(title.entries.empty? && title.titles.empty?) }
.sort { |a, b| a.title <=> b.title }
.tap { |_| @title_ids.clear }
.sort! { |a, b| a.title <=> b.title }
.each do |title|
@title_hash[title.id] = title
@title_ids << title.id
@@ -108,13 +168,20 @@ class Library
storage.bulk_insert_ids
storage.close
Logger.debug "Scan completed"
Storage.default.mark_unavailable
ms = (Time.local - start).total_milliseconds
Logger.info "Scanned #{@title_ids.size} titles in #{ms}ms"
Storage.default.mark_unavailable examine_context["deleted_entry_ids"],
examine_context["deleted_title_ids"]
spawn do
save_instance
end
end
def get_continue_reading_entries(username)
cr_entries = deep_titles
.map { |t| t.get_last_read_entry username }
.map(&.get_last_read_entry username)
# Select elements with type `Entry` from the array and ignore all `Nil`s
.select(Entry)[0...ENTRIES_IN_HOME_SECTIONS]
.map { |e|
@@ -150,14 +217,14 @@ class Library
recently_added = [] of RA
last_date_added = nil
titles.map { |t| t.deep_entries_with_date_added }.flatten
.select { |e| e[:date_added] > 1.month.ago }
.sort { |a, b| b[:date_added] <=> a[:date_added] }
titles.flat_map(&.deep_entries_with_date_added)
.select(&.[:date_added].> 1.month.ago)
.sort! { |a, b| b[:date_added] <=> a[:date_added] }
.each do |e|
break if recently_added.size > 12
last = recently_added.last?
if last && e[:entry].book.id == last[:entry].book.id &&
(e[:date_added] - last_date_added.not_nil!).duration < 1.day
(e[:date_added] - last_date_added.not_nil!).abs < 1.day
# A NamedTuple is immutable, so we have to cast it to a Hash first
last_hash = last.to_h
count = last_hash[:grouped_count].as(Int32)
@@ -188,9 +255,9 @@ class Library
# If we use `deep_titles`, the start reading section might include `Vol. 2`
# when the user hasn't started `Vol. 1` yet
titles
.select { |t| t.load_percentage(username) == 0 }
.select(&.load_percentage(username).== 0)
.sample(ENTRIES_IN_HOME_SECTIONS)
.shuffle
.shuffle!
end
def thumbnail_generation_progress
@@ -205,7 +272,7 @@ class Library
end
Logger.info "Starting thumbnail generation"
entries = deep_titles.map(&.deep_entries).flatten.reject &.err_msg
entries = deep_titles.flat_map(&.deep_entries).reject &.err_msg
@entries_count = entries.size
@thumbnails_count = 0

View File

@@ -1,13 +1,25 @@
require "digest"
require "../archive"
class Title
include YAML::Serializable
getter dir : String, parent_id : String, title_ids : Array(String),
entries : Array(Entry), title : String, id : String,
encoded_title : String, mtime : Time, signature : UInt64
encoded_title : String, mtime : Time, signature : UInt64,
entry_cover_url_cache : Hash(String, String)?
setter entry_cover_url_cache : Hash(String, String)?
@[YAML::Field(ignore: true)]
@entry_display_name_cache : Hash(String, String)?
@[YAML::Field(ignore: true)]
@entry_cover_url_cache : Hash(String, String)?
@[YAML::Field(ignore: true)]
@cached_display_name : String?
@[YAML::Field(ignore: true)]
@cached_cover_url : String?
def initialize(@dir : String, @parent_id)
def initialize(@dir : String, @parent_id, cache = {} of String => String)
storage = Storage.default
@signature = Dir.signature dir
id = storage.get_title_id dir, signature
@@ -20,6 +32,7 @@ class Title
})
end
@id = id
@contents_signature = Dir.contents_signature dir, cache
@title = File.basename dir
@encoded_title = URI.encode @title
@title_ids = [] of String
@@ -30,7 +43,7 @@ class Title
next if fn.starts_with? "."
path = File.join dir, fn
if File.directory? path
title = Title.new path, @id
title = Title.new path, @id, cache
next if title.entries.size == 0 && title.titles.size == 0
Library.default.title_hash[title.id] = title
@title_ids << title.id
@@ -44,40 +57,164 @@ class Title
mtimes = [@mtime]
mtimes += @title_ids.map { |e| Library.default.title_hash[e].mtime }
mtimes += @entries.map { |e| e.mtime }
mtimes += @entries.map &.mtime
@mtime = mtimes.max
@title_ids.sort! do |a, b|
compare_numerically Library.default.title_hash[a].title,
Library.default.title_hash[b].title
end
sorter = ChapterSorter.new @entries.map { |e| e.title }
sorter = ChapterSorter.new @entries.map &.title
@entries.sort! do |a, b|
sorter.compare a.title, b.title
end
end
def to_json(json : JSON::Builder)
json.object do
{% for str in ["dir", "title", "id"] %}
# Utility method used in library rescanning.
# - When the title does not exist on the file system anymore, return false
# and let it be deleted from the library instance
# - When the title exists, but its contents signature is now different from
# the cache, it means some of its content (nested titles or entries)
# has been added, deleted, or renamed. In this case we update its
# contents signature and instance variables
# - When the title exists and its contents signature is still the same, we
# return true so it can be reused without rescanning
def examine(context : ExamineContext) : Bool
return false unless Dir.exists? @dir
contents_signature = Dir.contents_signature @dir,
context["cached_contents_signature"]
return true if @contents_signature == contents_signature
@contents_signature = contents_signature
@signature = Dir.signature @dir
storage = Storage.default
id = storage.get_title_id dir, signature
if id.nil?
id = random_str
storage.insert_title_id({
path: dir,
id: id,
signature: signature.to_s,
})
end
@id = id
@mtime = File.info(@dir).modification_time
previous_titles_size = @title_ids.size
@title_ids.select! do |title_id|
title = Library.default.get_title! title_id
existence = title.examine context
unless existence
context["deleted_title_ids"].concat [title_id] +
title.deep_titles.map &.id
context["deleted_entry_ids"].concat title.deep_entries.map &.id
end
existence
end
remained_title_dirs = @title_ids.map do |title_id|
title = Library.default.get_title! title_id
title.dir
end
previous_entries_size = @entries.size
@entries.select! do |entry|
existence = File.exists? entry.zip_path
Fiber.yield
context["deleted_entry_ids"] << entry.id unless existence
existence
end
remained_entry_zip_paths = @entries.map &.zip_path
is_titles_added = false
is_entries_added = false
Dir.entries(dir).each do |fn|
next if fn.starts_with? "."
path = File.join dir, fn
if File.directory? path
next if remained_title_dirs.includes? path
title = Title.new path, @id, context["cached_contents_signature"]
next if title.entries.size == 0 && title.titles.size == 0
Library.default.title_hash[title.id] = title
@title_ids << title.id
is_titles_added = true
next
end
if is_supported_file path
next if remained_entry_zip_paths.includes? path
entry = Entry.new path, self
if entry.pages > 0 || entry.err_msg
@entries << entry
is_entries_added = true
end
end
end
mtimes = [@mtime]
mtimes += @title_ids.map { |e| Library.default.title_hash[e].mtime }
mtimes += @entries.map &.mtime
@mtime = mtimes.max
if is_titles_added || previous_titles_size != @title_ids.size
@title_ids.sort! do |a, b|
compare_numerically Library.default.title_hash[a].title,
Library.default.title_hash[b].title
end
end
if is_entries_added || previous_entries_size != @entries.size
sorter = ChapterSorter.new @entries.map &.title
@entries.sort! do |a, b|
sorter.compare a.title, b.title
end
end
true
end
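A minimal sketch of how the new `examine` method is meant to be driven during a rescan, assuming a `library` object that exposes the `title_ids` and `title_hash` used above (the `ExamineContext` alias is defined next to `TitleInfo`, and the collected IDs are what `Storage#mark_unavailable` consumes):

context : ExamineContext = {
  cached_contents_signature: {} of String => String,
  deleted_title_ids:         [] of String,
  deleted_entry_ids:         [] of String,
}
# Keep only the top-level titles that still exist; examine recurses into
# nested titles and records everything that has disappeared.
library.title_ids.select! do |tid|
  library.title_hash[tid].examine context
end
# Only the recorded candidates are checked against the DB, which is what
# keeps the rescan fast.
Storage.default.mark_unavailable context["deleted_entry_ids"],
  context["deleted_title_ids"]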
alias SortContext = NamedTuple(username: String, opt: SortOptions)
def build_json(*, slim = false, depth = -1,
sort_context : SortContext? = nil)
JSON.build do |json|
json.object do
{% for str in ["dir", "title", "id"] %}
json.field {{str}}, @{{str.id}}
{% end %}
json.field "signature" { json.number @signature }
json.field "display_name", display_name
json.field "cover_url", cover_url
json.field "mtime" { json.number @mtime.to_unix }
json.field "titles" do
json.raw self.titles.to_json
end
json.field "entries" do
json.raw @entries.to_json
end
json.field "parents" do
json.array do
self.parents.each do |title|
json.object do
json.field "title", title.title
json.field "id", title.id
json.field "signature" { json.number @signature }
unless slim
json.field "display_name", display_name
json.field "cover_url", cover_url
json.field "mtime" { json.number @mtime.to_unix }
end
unless depth == 0
json.field "titles" do
json.array do
self.titles.each do |title|
json.raw title.build_json(slim: slim,
depth: depth > 0 ? depth - 1 : depth)
end
end
end
json.field "entries" do
json.array do
_entries = if sort_context
sorted_entries sort_context[:username],
sort_context[:opt]
else
@entries
end
_entries.each do |entry|
json.raw entry.build_json(slim: slim)
end
end
end
end
json.field "parents" do
json.array do
self.parents.each do |title|
json.object do
json.field "title", title.title
json.field "id", title.id
end
end
end
end
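A hedged example of calling the new `build_json` (the title ID and username are placeholders): serialize one level of nesting, strip the slim fields, and sort entries with the user's options:

title = Library.default.get_title("abcd1234").not_nil! # placeholder ID
json = title.build_json slim: true, depth: 1,
  sort_context: {username: "admin", opt: SortOptions.new}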
@@ -92,12 +229,12 @@ class Title
# Get all entries, including entries in nested titles
def deep_entries
return @entries if title_ids.empty?
@entries + titles.map { |t| t.deep_entries }.flatten
@entries + titles.flat_map &.deep_entries
end
def deep_titles
return [] of Title if titles.empty?
titles + titles.map { |t| t.deep_titles }.flatten
titles + titles.flat_map &.deep_titles
end
def parents
@@ -138,15 +275,19 @@ class Title
end
def get_entry(eid)
@entries.find { |e| e.id == eid }
@entries.find &.id.== eid
end
def display_name
cached_display_name = @cached_display_name
return cached_display_name unless cached_display_name.nil?
dn = @title
TitleInfo.new @dir do |info|
info_dn = info.display_name
dn = info_dn unless info_dn.empty?
end
@cached_display_name = dn
dn
end
@@ -170,6 +311,7 @@ class Title
end
def set_display_name(dn)
@cached_display_name = dn
TitleInfo.new @dir do |info|
info.display_name = dn
info.save
@@ -179,11 +321,15 @@ class Title
def set_display_name(entry_name : String, dn)
TitleInfo.new @dir do |info|
info.entry_display_name[entry_name] = dn
@entry_display_name_cache = info.entry_display_name
info.save
end
end
def cover_url
cached_cover_url = @cached_cover_url
return cached_cover_url unless cached_cover_url.nil?
url = "#{Config.current.base_url}img/icon.png"
readable_entries = @entries.select &.err_msg.nil?
if readable_entries.size > 0
@@ -195,10 +341,12 @@ class Title
url = File.join Config.current.base_url, info_url
end
end
@cached_cover_url = url
url
end
def set_cover_url(url : String)
@cached_cover_url = url
TitleInfo.new @dir do |info|
info.cover_url = url
info.save
@@ -208,6 +356,7 @@ class Title
def set_cover_url(entry_name : String, url : String)
TitleInfo.new @dir do |info|
info.entry_cover_url[entry_name] = url
@entry_cover_url_cache = info.entry_cover_url
info.save
end
end
@@ -217,29 +366,30 @@ class Title
@entries.each do |e|
e.save_progress username, e.pages
end
titles.each do |t|
t.read_all username
end
titles.each &.read_all username
end
# Set the reading progress of all entries and nested libraries to 0%
def unread_all(username)
@entries.each do |e|
e.save_progress username, 0
end
titles.each do |t|
t.unread_all username
end
@entries.each &.save_progress(username, 0)
titles.each &.unread_all username
end
def deep_read_page_count(username) : Int32
load_progress_for_all_entries(username).sum +
titles.map { |t| t.deep_read_page_count username }.flatten.sum
key = "#{@id}:#{username}:progress_sum"
sig = Digest::SHA1.hexdigest (entries.map &.id).to_s
cached_sum = LRUCache.get key
return cached_sum[1] if cached_sum.is_a? Tuple(String, Int32) &&
cached_sum[0] == sig
sum = load_progress_for_all_entries(username, nil, true).sum +
titles.flat_map(&.deep_read_page_count username).sum
LRUCache.set generate_cache_entry key, {sig, sum}
sum
end
def deep_total_page_count : Int32
entries.map { |e| e.pages }.sum +
titles.map { |t| t.deep_total_page_count }.flatten.sum
entries.sum(&.pages) +
titles.flat_map(&.deep_total_page_count).sum
end
def load_percentage(username)
@@ -288,13 +438,12 @@ class Title
# use the default (auto, ascending)
# When `opt` is not nil, it saves the options to info.json
def sorted_entries(username, opt : SortOptions? = nil)
cache_key = SortedEntriesCacheEntry.gen_key @id, username, @entries, opt
cached_entries = LRUCache.get cache_key
return cached_entries if cached_entries.is_a? Array(Entry)
if opt.nil?
opt = SortOptions.from_info_json @dir, username
else
TitleInfo.new @dir do |info|
info.sort_by[username] = opt.to_tuple
info.save
end
end
case opt.not_nil!.method
@@ -311,13 +460,13 @@ class Title
ary = @entries.zip(percentage_ary)
.sort { |a_tp, b_tp| (a_tp[1] <=> b_tp[1]).or \
compare_numerically a_tp[0].title, b_tp[0].title }
.map { |tp| tp[0] }
.map &.[0]
else
unless opt.method.auto?
Logger.warn "Unknown sorting method #{opt.not_nil!.method}. Using " \
"Auto instead"
end
sorter = ChapterSorter.new @entries.map { |e| e.title }
sorter = ChapterSorter.new @entries.map &.title
ary = @entries.sort do |a, b|
sorter.compare(a.title, b.title).or \
compare_numerically a.title, b.title
@@ -326,6 +475,7 @@ class Title
ary.reverse! unless opt.not_nil!.ascend
LRUCache.set generate_cache_entry cache_key, ary
ary
end
@@ -383,13 +533,24 @@ class Title
{entry: e, date_added: da_ary[i]}
end
return zip if title_ids.empty?
zip + titles.map { |t| t.deep_entries_with_date_added }.flatten
zip + titles.flat_map &.deep_entries_with_date_added
end
def bulk_progress(action, ids : Array(String), username)
LRUCache.invalidate "#{@id}:#{username}:progress_sum"
parents.each do |parent|
LRUCache.invalidate "#{parent.id}:#{username}:progress_sum"
end
[false, true].each do |ascend|
sorted_entries_cache_key =
SortedEntriesCacheEntry.gen_key @id, username, @entries,
SortOptions.new(SortMethod::Progress, ascend)
LRUCache.invalidate sorted_entries_cache_key
end
selected_entries = ids
.map { |id|
@entries.find { |e| e.id == id }
@entries.find &.id.==(id)
}
.select(Entry)

View File

@@ -1,4 +1,12 @@
SUPPORTED_IMG_TYPES = ["image/jpeg", "image/png", "image/webp"]
SUPPORTED_IMG_TYPES = %w(
image/jpeg
image/png
image/webp
image/apng
image/avif
image/gif
image/svg+xml
)
enum SortMethod
Auto
@@ -88,6 +96,18 @@ class TitleInfo
@@mutex_hash = {} of String => Mutex
def self.new(dir, &)
key = "#{dir}:info.json"
info = LRUCache.get key
if info.is_a? String
begin
instance = TitleInfo.from_json info
instance.dir = dir
yield instance
return
rescue
end
end
if @@mutex_hash[dir]?
mutex = @@mutex_hash[dir]
else
@@ -101,6 +121,7 @@ class TitleInfo
instance = TitleInfo.from_json File.read json_path
end
instance.dir = dir
LRUCache.set generate_cache_entry key, instance.to_json
yield instance
end
end
@@ -108,5 +129,12 @@ class TitleInfo
def save
json_path = File.join @dir, "info.json"
File.write json_path, self.to_pretty_json
key = "#{@dir}:info.json"
LRUCache.set generate_cache_entry key, self.to_json
end
end
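A small sketch of the caching behavior above (the directory path is a placeholder). The second `TitleInfo.new` block is served from `LRUCache` and does not re-read `info.json` from disk:

TitleInfo.new "/manga/Some Title" do |info|
  info.display_name = "Some Title (EN)"
  info.save # writes info.json and refreshes the cached copy
end
TitleInfo.new "/manga/Some Title" do |info|
  info.display_name # => "Some Title (EN)", served from the cache
end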
alias ExamineContext = NamedTuple(
cached_contents_signature: Hash(String, String),
deleted_title_ids: Array(String),
deleted_entry_ids: Array(String))

View File

@@ -34,7 +34,11 @@ class Logger
end
@backend.formatter = Log::Formatter.new &format_proc
Log.setup @@severity, @backend
Log.setup do |c|
c.bind "*", @@severity, @backend
c.bind "db.*", :error, @backend
end
end
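An illustration of the new bindings, assuming standard `Log` sources (the source names below are placeholders): every source logs at the configured severity, while `db.*` only surfaces errors:

Log.for("db.connection").info { "connection checked out" } # suppressed
Log.for("mango").info { "library scan complete" }          # emitted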
def self.get_severity(level = "") : Log::Severity

View File

@@ -1,169 +0,0 @@
require "mangadex"
require "compress/zip"
require "../rename"
require "./ext"
module MangaDex
class PageJob
property success = false
property url : String
property filename : String
property writer : Compress::Zip::Writer
property tries_remaning : Int32
def initialize(@url, @filename, @writer, @tries_remaning)
end
end
class Downloader < Queue::Downloader
@wait_seconds : Int32 = Config.current.mangadex["download_wait_seconds"]
.to_i32
@retries : Int32 = Config.current.mangadex["download_retries"].to_i32
use_default
def initialize
@client = Client.from_config
super
end
def pop : Queue::Job?
job = nil
MainFiber.run do
DB.open "sqlite3://#{@queue.path}" do |db|
begin
db.query_one "select * from queue where id not like '%-%' " \
"and (status = 0 or status = 1) " \
"order by time limit 1" do |res|
job = Queue::Job.from_query_result res
end
rescue
end
end
end
job
end
private def download(job : Queue::Job)
@downloading = true
@queue.set_status Queue::JobStatus::Downloading, job
begin
chapter = @client.chapter job.id
rescue e
Logger.error e
@queue.set_status Queue::JobStatus::Error, job
unless e.message.nil?
@queue.add_message e.message.not_nil!, job
end
@downloading = false
return
end
@queue.set_pages chapter.pages.size, job
lib_dir = @library_path
rename_rule = Rename::Rule.new \
Config.current.mangadex["manga_rename_rule"].to_s
manga_dir = File.join lib_dir, chapter.manga.rename rename_rule
unless File.exists? manga_dir
Dir.mkdir_p manga_dir
end
zip_path = File.join manga_dir, "#{job.title}.cbz.part"
# Find the number of digits needed to store the number of pages
len = Math.log10(chapter.pages.size).to_i + 1
writer = Compress::Zip::Writer.new zip_path
# Create a buffered channel. It works as a FIFO queue
channel = Channel(PageJob).new chapter.pages.size
spawn do
chapter.pages.each_with_index do |url, i|
fn = Path.new(URI.parse(url).path).basename
ext = File.extname fn
fn = "#{i.to_s.rjust len, '0'}#{ext}"
page_job = PageJob.new url, fn, writer, @retries
Logger.debug "Downloading #{url}"
loop do
sleep @wait_seconds.seconds
download_page page_job
break if page_job.success ||
page_job.tries_remaning <= 0
page_job.tries_remaning -= 1
Logger.warn "Failed to download page #{url}. " \
"Retrying... Remaining retries: " \
"#{page_job.tries_remaning}"
end
channel.send page_job
break unless @queue.exists? job
end
end
spawn do
page_jobs = [] of PageJob
chapter.pages.size.times do
page_job = channel.receive
break unless @queue.exists? job
Logger.debug "[#{page_job.success ? "success" : "failed"}] " \
"#{page_job.url}"
page_jobs << page_job
if page_job.success
@queue.add_success job
else
@queue.add_fail job
msg = "Failed to download page #{page_job.url}"
@queue.add_message msg, job
Logger.error msg
end
end
unless @queue.exists? job
Logger.debug "Download cancelled"
@downloading = false
next
end
fail_count = page_jobs.count { |j| !j.success }
Logger.debug "Download completed. " \
"#{fail_count}/#{page_jobs.size} failed"
writer.close
filename = File.join File.dirname(zip_path), File.basename(zip_path,
".part")
File.rename zip_path, filename
Logger.debug "cbz File created at #{filename}"
zip_exception = validate_archive filename
if !zip_exception.nil?
@queue.add_message "The downloaded archive is corrupted. " \
"Error: #{zip_exception}", job
@queue.set_status Queue::JobStatus::Error, job
elsif fail_count > 0
@queue.set_status Queue::JobStatus::MissingPages, job
else
@queue.set_status Queue::JobStatus::Completed, job
end
@downloading = false
end
end
private def download_page(job : PageJob)
Logger.debug "downloading #{job.url}"
headers = HTTP::Headers{
"User-agent" => "Mangadex.cr",
}
begin
HTTP::Client.get job.url, headers do |res|
unless res.success?
raise "Failed to download page #{job.url}. " \
"[#{res.status_code}] #{res.status_message}"
end
job.writer.add job.filename, res.body_io
end
job.success = true
rescue e
Logger.error e
job.success = false
end
end
end
end

View File

@@ -1,60 +0,0 @@
private macro properties_to_hash(names)
{
{% for name in names %}
"{{name.id}}" => {{name.id}}.to_s,
{% end %}
}
end
# Monkey-patch the structures in the `mangadex` shard to suit our needs
module MangaDex
struct Client
@@group_cache = {} of String => Group
def self.from_config : Client
self.new base_url: Config.current.mangadex["base_url"].to_s,
api_url: Config.current.mangadex["api_url"].to_s
end
end
struct Manga
def rename(rule : Rename::Rule)
rule.render properties_to_hash %w(id title author artist)
end
def to_info_json
hash = JSON.parse(to_json).as_h
_chapters = chapters.map do |c|
JSON.parse c.to_info_json
end
hash["chapters"] = JSON::Any.new _chapters
hash.to_json
end
end
struct Chapter
def rename(rule : Rename::Rule)
hash = properties_to_hash %w(id title volume chapter lang_code language)
hash["groups"] = groups.map(&.name).join ","
rule.render hash
end
def full_title
rule = Rename::Rule.new \
Config.current.mangadex["chapter_rename_rule"].to_s
rename rule
end
def to_info_json
hash = JSON.parse(to_json).as_h
hash["language"] = JSON::Any.new language
_groups = {} of String => JSON::Any
groups.each do |g|
_groups[g.name] = JSON::Any.new g.id
end
hash["groups"] = JSON::Any.new _groups
hash["full_title"] = JSON::Any.new full_title
hash.to_json
end
end
end

View File

@@ -2,13 +2,12 @@ require "./config"
require "./queue"
require "./server"
require "./main_fiber"
require "./mangadex/*"
require "./plugin/*"
require "option_parser"
require "clim"
require "tallboy"
MANGO_VERSION = "0.21.0"
MANGO_VERSION = "0.24.0"
# From http://www.network-science.de/ascii/
BANNER = %{
@@ -56,10 +55,11 @@ class CLI < Clim
Config.load(opts.config).set_current
# Initialize main components
LRUCache.init
Storage.default
Queue.default
Library.load_instance
Library.default
MangaDex::Downloader.default
Plugin::Downloader.default
spawn do

View File

@@ -23,11 +23,6 @@ class Plugin
job
end
private def process_filename(str)
return "_" if str == ".."
str.gsub "/", "_"
end
private def download(job : Queue::Job)
@downloading = true
@queue.set_status Queue::JobStatus::Downloading, job
@@ -42,8 +37,8 @@ class Plugin
pages = info["pages"].as_i
manga_title = process_filename job.manga_title
chapter_title = process_filename info["title"].as_s
manga_title = sanitize_filename job.manga_title
chapter_title = sanitize_filename info["title"].as_s
@queue.set_pages pages, job
lib_dir = @library_path
@@ -68,7 +63,7 @@ class Plugin
while page = plugin.next_page
break unless @queue.exists? job
fn = process_filename page["filename"].as_s
fn = sanitize_filename page["filename"].as_s
url = page["url"].as_s
headers = HTTP::Headers.new

View File

@@ -117,7 +117,7 @@ class Plugin
def initialize(id : String)
Plugin.build_info_ary
@info = @@info_ary.find { |i| i.id == id }
@info = @@info_ary.find &.id.== id
if @info.nil?
raise Error.new "Plugin with ID #{id} not found"
end

View File

@@ -303,12 +303,12 @@ class Queue
end
def pause
@downloaders.each { |d| d.stopped = true }
@downloaders.each &.stopped=(true)
@paused = true
end
def resume
@downloaders.each { |d| d.stopped = false }
@downloaders.each &.stopped=(false)
@paused = false
end

View File

@@ -35,15 +35,15 @@ module Rename
class Group < Base(Pattern | String)
def render(hash : VHash)
return "" if @ary.select(&.is_a? Pattern)
return "" if @ary.select(Pattern)
.any? &.as(Pattern).render(hash).empty?
@ary.map do |e|
@ary.join do |e|
if e.is_a? Pattern
e.render hash
else
e
end
end.join
end
end
end
@@ -129,13 +129,13 @@ module Rename
end
def render(hash : VHash)
str = @ary.map do |e|
str = @ary.join do |e|
if e.is_a? String
e
else
e.render hash
end
end.join.strip
end.strip
post_process str
end

View File

@@ -73,9 +73,5 @@ struct AdminRouter
get "/admin/missing" do |env|
layout "missing-items"
end
get "/admin/mangadex" do |env|
layout "mangadex"
end
end
end

View File

@@ -1,6 +1,6 @@
require "../mangadex/*"
require "../upload"
require "koa"
require "digest"
struct APIRouter
@@api_json : String?
@@ -56,31 +56,20 @@ struct APIRouter
"error" => String?,
}
Koa.schema("mdChapter", {
"id" => Int64,
"group" => {} of String => String,
}.merge(s %w(title volume chapter language full_title time
manga_title manga_id)),
desc: "A MangaDex chapter")
Koa.schema "mdManga", {
"id" => Int64,
"chapters" => ["mdChapter"],
}.merge(s %w(title description author artist cover_url)),
desc: "A MangaDex manga"
Koa.describe "Returns a page in a manga entry"
Koa.path "tid", desc: "Title ID"
Koa.path "eid", desc: "Entry ID"
Koa.path "page", schema: Int32, desc: "The page number to return (starts from 1)"
Koa.response 200, schema: Bytes, media_type: "image/*"
Koa.response 500, "Page not found or not readable"
Koa.response 304, "Page not modified (only available when `If-None-Match` is set)"
Koa.tag "reader"
get "/api/page/:tid/:eid/:page" do |env|
begin
tid = env.params.url["tid"]
eid = env.params.url["eid"]
page = env.params.url["page"].to_i
prev_e_tag = env.request.headers["If-None-Match"]?
title = Library.default.get_title tid
raise "Title ID `#{tid}` not found" if title.nil?
@@ -90,7 +79,15 @@ struct APIRouter
raise "Failed to load page #{page} of " \
"`#{title.title}/#{entry.title}`" if img.nil?
send_img env, img
e_tag = Digest::SHA1.hexdigest img.data
if prev_e_tag == e_tag
env.response.status_code = 304
""
else
env.response.headers["ETag"] = e_tag
env.response.headers["Cache-Control"] = "public, max-age=86400"
send_img env, img
end
rescue e
Logger.error e
env.response.status_code = 500
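A hedged client-side sketch of the new conditional requests (host, IDs, and page number are placeholders); the second request returns 304 with an empty body when the page has not changed:

require "http/client"

url = "http://localhost:9000/api/page/TITLE_ID/ENTRY_ID/1"
first = HTTP::Client.get url
if e_tag = first.headers["ETag"]?
  second = HTTP::Client.get url,
    headers: HTTP::Headers{"If-None-Match" => e_tag}
  second.status_code # => 304
end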
@@ -102,12 +99,14 @@ struct APIRouter
Koa.path "tid", desc: "Title ID"
Koa.path "eid", desc: "Entry ID"
Koa.response 200, schema: Bytes, media_type: "image/*"
Koa.response 304, "Page not modified (only available when `If-None-Match` is set)"
Koa.response 500, "Page not found or not readable"
Koa.tag "library"
get "/api/cover/:tid/:eid" do |env|
begin
tid = env.params.url["tid"]
eid = env.params.url["eid"]
prev_e_tag = env.request.headers["If-None-Match"]?
title = Library.default.get_title tid
raise "Title ID `#{tid}` not found" if title.nil?
@@ -118,7 +117,14 @@ struct APIRouter
raise "Failed to get cover of `#{title.title}/#{entry.title}`" \
if img.nil?
send_img env, img
e_tag = Digest::SHA1.hexdigest img.data
if prev_e_tag == e_tag
env.response.status_code = 304
""
else
env.response.headers["ETag"] = e_tag
send_img env, img
end
rescue e
Logger.error e
env.response.status_code = 500
@@ -126,18 +132,39 @@ struct APIRouter
end
end
Koa.describe "Returns the book with title `tid`"
Koa.describe "Returns the book with title `tid`", <<-MD
- Supply the `slim` query parameter to strip away "display_name", "cover_url", and "mtime" from the returned object to speed up the loading time
- Supply the `depth` query parameter to control the depth of nested titles to return.
- When `depth` is 1, returns the requested title and sub-titles/entries one level in it
- When `depth` is 0, returns the requested title without its sub-titles/entries
- When `depth` is N, returns the requested title and sub-titles/entries N levels in it
- When `depth` is negative, returns the requested title and all sub-titles/entries in it
MD
Koa.path "tid", desc: "Title ID"
Koa.query "slim"
Koa.query "depth"
Koa.query "sort", desc: "Sorting option for entries. Can be one of 'auto', 'title', 'progress', 'time_added' and 'time_modified'"
Koa.query "ascend", desc: "Sorting direction for entries. Set to 0 for the descending order. Doesn't work without specifying 'sort'"
Koa.response 200, schema: "title"
Koa.response 404, "Title not found"
Koa.tag "library"
get "/api/book/:tid" do |env|
begin
username = get_username env
sort_opt = SortOptions.new
get_sort_opt
tid = env.params.url["tid"]
title = Library.default.get_title tid
raise "Title ID `#{tid}` not found" if title.nil?
send_json env, title.to_json
slim = !env.params.query["slim"]?.nil?
depth = env.params.query["depth"]?.try(&.to_i?) || -1
send_json env, title.build_json(slim: slim, depth: depth,
sort_context: {username: username,
opt: sort_opt})
rescue e
Logger.error e
env.response.status_code = 404
@@ -145,14 +172,26 @@ struct APIRouter
end
end
Koa.describe "Returns the entire library with all titles and entries"
Koa.describe "Returns the entire library with all titles and entries", <<-MD
- Supply the `slim` query parameter to strip away "display_name", "cover_url", and "mtime" from the returned object to speed up the loading time
- Supply the `depth` query parameter to control the depth of nested titles to return.
- When `depth` is 1, returns the top-level titles and sub-titles/entries one level in them
- When `depth` is 0, returns the top-level titles without their sub-titles/entries
- When `depth` is N, returns the top-level titles and sub-titles/entries N levels in them
- When `depth` is negative, returns the entire library
MD
Koa.query "slim"
Koa.query "depth"
Koa.response 200, schema: {
"dir" => String,
"titles" => ["title"],
}
Koa.tag "library"
get "/api/library" do |env|
send_json env, Library.default.to_json
slim = !env.params.query["slim"]?.nil?
depth = env.params.query["depth"]?.try(&.to_i?) || -1
send_json env, Library.default.build_json(slim: slim, depth: depth)
end
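Hedged examples of the new query parameters against a local instance (the base URL and title ID are placeholders):

require "http/client"

base = "http://localhost:9000"
# Entire library, one level deep, without the slim-stripped fields:
HTTP::Client.get "#{base}/api/library?slim=1&depth=1"
# One book with its children omitted, entries sorted by progress, descending:
HTTP::Client.get "#{base}/api/book/TITLE_ID?depth=0&sort=progress&ascend=0"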
Koa.describe "Triggers a library scan"
@@ -309,64 +348,12 @@ struct APIRouter
end
end
Koa.describe "Returns a MangaDex manga identified by `id`", <<-MD
On error, returns a JSON that contains the error message in the `error` field.
MD
Koa.tags ["admin", "mangadex"]
Koa.path "id", desc: "A MangaDex manga ID"
Koa.response 200, schema: "mdManga"
get "/api/admin/mangadex/manga/:id" do |env|
begin
id = env.params.url["id"]
manga = MangaDex::Client.from_config.manga id
send_json env, manga.to_info_json
rescue e
Logger.error e
send_json env, {"error" => e.message}.to_json
end
end
Koa.describe "Adds a list of MangaDex chapters to the download queue", <<-MD
On error, returns a JSON that contains the error message in the `error` field.
MD
Koa.tags ["admin", "mangadex", "downloader"]
Koa.body schema: {
"chapters" => ["mdChapter"],
}
Koa.response 200, schema: {
"success" => Int32,
"fail" => Int32,
}
post "/api/admin/mangadex/download" do |env|
begin
chapters = env.params.json["chapters"].as(Array).map { |c| c.as_h }
jobs = chapters.map { |chapter|
Queue::Job.new(
chapter["id"].as_i64.to_s,
chapter["mangaId"].as_i64.to_s,
chapter["full_title"].as_s,
chapter["mangaTitle"].as_s,
Queue::JobStatus::Pending,
Time.unix chapter["timestamp"].as_i64
)
}
inserted_count = Queue.default.push jobs
send_json env, {
"success": inserted_count,
"fail": jobs.size - inserted_count,
}.to_json
rescue e
Logger.error e
send_json env, {"error" => e.message}.to_json
end
end
ws "/api/admin/mangadex/queue" do |socket, env|
interval_raw = env.params.query["interval"]?
interval = (interval_raw.to_i? if interval_raw) || 5
loop do
socket.send({
"jobs" => Queue.default.get_all,
"jobs" => Queue.default.get_all.reverse,
"paused" => Queue.default.paused?,
}.to_json)
sleep interval.seconds
@@ -390,13 +377,13 @@ struct APIRouter
}
get "/api/admin/mangadex/queue" do |env|
begin
jobs = Queue.default.get_all
send_json env, {
"jobs" => jobs,
"jobs" => Queue.default.get_all.reverse,
"paused" => Queue.default.paused?,
"success" => true,
}.to_json
rescue e
Logger.error e
send_json env, {
"success" => false,
"error" => e.message,
@@ -444,6 +431,7 @@ struct APIRouter
send_json env, {"success" => true}.to_json
rescue e
Logger.error e
send_json env, {
"success" => false,
"error" => e.message,
@@ -516,6 +504,7 @@ struct APIRouter
raise "No part with name `file` found"
rescue e
Logger.error e
send_json env, {
"success" => false,
"error" => e.message,
@@ -551,6 +540,7 @@ struct APIRouter
"title" => title,
}.to_json
rescue e
Logger.error e
send_json env, {
"success" => false,
"error" => e.message,
@@ -594,6 +584,7 @@ struct APIRouter
"fail": jobs.size - inserted_count,
}.to_json
rescue e
Logger.error e
send_json env, {
"success" => false,
"error" => e.message,
@@ -612,25 +603,35 @@ struct APIRouter
"width" => Int32,
"height" => Int32,
}],
"margin" => Int32?,
}
Koa.response 304, "Not modified (only available when `If-None-Match` is set)"
get "/api/dimensions/:tid/:eid" do |env|
begin
tid = env.params.url["tid"]
eid = env.params.url["eid"]
prev_e_tag = env.request.headers["If-None-Match"]?
title = Library.default.get_title tid
raise "Title ID `#{tid}` not found" if title.nil?
entry = title.get_entry eid
raise "Entry ID `#{eid}` of `#{title.title}` not found" if entry.nil?
sizes = entry.page_dimensions
send_json env, {
"success" => true,
"dimensions" => sizes,
"margin" => Config.current.page_margin,
}.to_json
file_hash = Digest::SHA1.hexdigest (entry.zip_path + entry.mtime.to_s)
e_tag = "W/#{file_hash}"
if e_tag == prev_e_tag
env.response.status_code = 304
""
else
sizes = entry.page_dimensions
env.response.headers["ETag"] = e_tag
env.response.headers["Cache-Control"] = "public, max-age=86400"
send_json env, {
"success" => true,
"dimensions" => sizes,
}.to_json
end
rescue e
Logger.error e
send_json env, {
"success" => false,
"error" => e.message,
@@ -770,6 +771,7 @@ struct APIRouter
"titles" => Storage.default.missing_titles,
}.to_json
rescue e
Logger.error e
send_json env, {
"success" => false,
"error" => e.message,
@@ -796,6 +798,7 @@ struct APIRouter
"entries" => Storage.default.missing_entries,
}.to_json
rescue e
Logger.error e
send_json env, {
"success" => false,
"error" => e.message,
@@ -814,6 +817,7 @@ struct APIRouter
"error" => nil,
}.to_json
rescue e
Logger.error e
send_json env, {
"success" => false,
"error" => e.message,
@@ -832,6 +836,7 @@ struct APIRouter
"error" => nil,
}.to_json
rescue e
Logger.error e
send_json env, {
"success" => false,
"error" => e.message,
@@ -853,6 +858,7 @@ struct APIRouter
"error" => nil,
}.to_json
rescue e
Logger.error e
send_json env, {
"success" => false,
"error" => e.message,
@@ -873,114 +879,6 @@ struct APIRouter
"success" => true,
"error" => nil,
}.to_json
rescue e
send_json env, {
"success" => false,
"error" => e.message,
}.to_json
end
end
Koa.describe "Logs the current user into their MangaDex account", <<-MD
If successful, returns the expiration date (as a unix timestamp) of the newly created token.
MD
Koa.body schema: {
"username" => String,
"password" => String,
}
Koa.response 200, schema: {
"success" => Bool,
"error" => String?,
"expires" => Int64?,
}
Koa.tags ["admin", "mangadex", "users"]
post "/api/admin/mangadex/login" do |env|
begin
username = env.params.json["username"].as String
password = env.params.json["password"].as String
mango_username = get_username env
client = MangaDex::Client.from_config
client.auth username, password
Storage.default.save_md_token mango_username, client.token.not_nil!,
client.token_expires
send_json env, {
"success" => true,
"error" => nil,
"expires" => client.token_expires.to_unix,
}.to_json
rescue e
Logger.error e
send_json env, {
"success" => false,
"error" => e.message,
}.to_json
end
end
Koa.describe "Returns the expiration date (as a unix timestamp) of the mangadex token if it exists"
Koa.response 200, schema: {
"success" => Bool,
"error" => String?,
"expires" => Int64?,
}
Koa.tags ["admin", "mangadex", "users"]
get "/api/admin/mangadex/expires" do |env|
begin
username = get_username env
_, expires = Storage.default.get_md_token username
send_json env, {
"success" => true,
"error" => nil,
"expires" => expires.try &.to_unix,
}.to_json
rescue e
Logger.error e
send_json env, {
"success" => false,
"error" => e.message,
}.to_json
end
end
Koa.describe "Searches MangaDex for manga matching `query`", <<-MD
Returns an empty list if the current user hasn't logged in to MangaDex.
MD
Koa.query "query"
Koa.response 200, schema: {
"success" => Bool,
"error" => String?,
"manga?" => [{
"id" => Int64,
"title" => String,
"description" => String,
"mainCover" => String,
}],
}
Koa.tags ["admin", "mangadex"]
get "/api/admin/mangadex/search" do |env|
begin
username = get_username env
token, expires = Storage.default.get_md_token username
unless expires && token
raise "No token found for user #{username}"
end
client = MangaDex::Client.from_config
client.token = token
client.token_expires = expires
query = env.params.query["query"]
send_json env, {
"success" => true,
"error" => nil,
"manga" => client.partial_search query,
}.to_json
rescue e
Logger.error e
send_json env, {

View File

@@ -30,7 +30,8 @@ struct MainRouter
else
redirect env, "/"
end
rescue
rescue e
Logger.error e
redirect env, "/login"
end
end
@@ -40,7 +41,7 @@ struct MainRouter
username = get_username env
sort_opt = SortOptions.from_info_json Library.default.dir, username
get_sort_opt
get_and_save_sort_opt Library.default.dir
titles = Library.default.sorted_titles username, sort_opt
percentage = titles.map &.load_percentage username
@@ -58,12 +59,12 @@ struct MainRouter
username = get_username env
sort_opt = SortOptions.from_info_json title.dir, username
get_sort_opt
get_and_save_sort_opt title.dir
entries = title.sorted_entries username, sort_opt
percentage = title.load_percentage_for_all_entries username, sort_opt
title_percentage = title.titles.map &.load_percentage username
layout "title"
rescue e
Logger.error e
@@ -71,11 +72,6 @@ struct MainRouter
end
end
get "/download" do |env|
mangadex_base_url = Config.current.mangadex["base_url"]
layout "download"
end
get "/download/plugins" do |env|
begin
id = env.params.query["plugin"]?
@@ -103,7 +99,7 @@ struct MainRouter
recently_added = Library.default.get_recently_added_entries username
start_reading = Library.default.get_start_reading_titles username
titles = Library.default.titles
new_user = !titles.any? { |t| t.load_percentage(username) > 0 }
new_user = !titles.any? &.load_percentage(username).> 0
empty_library = titles.size == 0
layout "home"
rescue e

View File

@@ -428,12 +428,21 @@ class Storage
end
end
def mark_unavailable
# Mark titles and entries that no longer exist on the file system as
# unavailable. By supplying `ids_candidates` and `titles_candidates`, it
# only checks the existence of the candidate titles/entries to speed up
# the process.
def mark_unavailable(ids_candidates : Array(String)?,
titles_candidates : Array(String)?)
MainFiber.run do
get_db do |db|
# Detect dangling entry IDs
trash_ids = [] of String
db.query "select path, id from ids where unavailable = 0" do |rs|
query = "select path, id from ids where unavailable = 0"
unless ids_candidates.nil?
query += " and id in (#{ids_candidates.join "," { |i| "'#{i}'" }})"
end
db.query query do |rs|
rs.each do
path = rs.read String
fullpath = Path.new(path).expand(Config.current.library_path).to_s
@@ -445,11 +454,15 @@ class Storage
Logger.debug "Marking #{trash_ids.size} entries as unavailable"
end
db.exec "update ids set unavailable = 1 where id in " \
"(#{trash_ids.map { |i| "'#{i}'" }.join ","})"
"(#{trash_ids.join "," { |i| "'#{i}'" }})"
# Detect dangling title IDs
trash_titles = [] of String
db.query "select path, id from titles where unavailable = 0" do |rs|
query = "select path, id from titles where unavailable = 0"
unless titles_candidates.nil?
query += " and id in (#{titles_candidates.join "," { |i| "'#{i}'" }})"
end
db.query query do |rs|
rs.each do
path = rs.read String
fullpath = Path.new(path).expand(Config.current.library_path).to_s
@@ -461,7 +474,7 @@ class Storage
Logger.debug "Marking #{trash_titles.size} titles as unavailable"
end
db.exec "update titles set unavailable = 1 where id in " \
"(#{trash_titles.map { |i| "'#{i}'" }.join ","})"
"(#{trash_titles.join "," { |i| "'#{i}'" }})"
end
end
end

src/subscription.cr (new file, 83 lines)
View File

@@ -0,0 +1,83 @@
require "db"
require "json"
struct Subscription
include DB::Serializable
include JSON::Serializable
getter id : Int64 = 0
getter username : String
getter manga_id : Int64
property language : String?
property group_id : Int64?
property min_volume : Int64?
property max_volume : Int64?
property min_chapter : Int64?
property max_chapter : Int64?
@[DB::Field(key: "last_checked")]
@[JSON::Field(key: "last_checked")]
@raw_last_checked : Int64
@[DB::Field(key: "created_at")]
@[JSON::Field(key: "created_at")]
@raw_created_at : Int64
def last_checked : Time
Time.unix @raw_last_checked
end
def created_at : Time
Time.unix @raw_created_at
end
def initialize(@manga_id, @username)
@raw_created_at = Time.utc.to_unix
@raw_last_checked = Time.utc.to_unix
end
private def in_range?(value : String, lowerbound : Int64?,
upperbound : Int64?) : Bool
lb = lowerbound.try &.to_f64
ub = upperbound.try &.to_f64
return true if lb.nil? && ub.nil?
v = value.to_f64?
return false unless v
if lb.nil?
v <= ub.not_nil!
elsif ub.nil?
v >= lb.not_nil!
else
v >= lb.not_nil! && v <= ub.not_nil!
end
end
def match?(chapter : MangaDex::Chapter) : Bool
if chapter.manga_id != manga_id ||
(language && chapter.language != language) ||
(group_id && !chapter.groups.map(&.id).includes? group_id)
return false
end
in_range?(chapter.volume, min_volume, max_volume) &&
in_range?(chapter.chapter, min_chapter, max_chapter)
end
def check_for_updates : Int32
Logger.debug "Checking updates for subscription with ID #{id}"
jobs = [] of Queue::Job
get_client(username).user.updates_after last_checked do |chapter|
next unless match? chapter
jobs << chapter.to_job
end
Storage.default.update_subscription_last_checked id
count = Queue.default.push jobs
Logger.debug "#{count}/#{jobs.size} of updates added to queue"
count
rescue e
Logger.error "Error occurred when checking updates for " \
"subscription with ID #{id}. #{e}"
0
end
end
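A short sketch of the filter semantics (the manga ID and username are placeholders): unset fields act as wildcards, so the subscription below matches English chapters of manga 7139 numbered 100 or above, from any group:

sub = Subscription.new 7139_i64, "admin"
sub.language = "gb"
sub.min_chapter = 100_i64
# `sub.match? chapter` is true only when the chapter belongs to manga
# 7139, its language is "gb", and its chapter number parses to >= 100.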

View File

@@ -73,7 +73,7 @@ class ChapterSorter
.select do |key|
keys[key].count >= str_ary.size / 2
end
.sort do |a_key, b_key|
.sort! do |a_key, b_key|
a = keys[a_key]
b = keys[b_key]
# Sort keys by the number of times they appear

View File

@@ -11,7 +11,7 @@ end
def split_by_alphanumeric(str)
arr = [] of String
str.scan(/([^\d\n\r]*)(\d*)([^\d\n\r]*)/) do |match|
arr += match.captures.select { |s| s != "" }
arr += match.captures.select &.!= ""
end
arr
end
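For instance (a hedged illustration of how the capture groups split a chapter name):

split_by_alphanumeric "Vol.2 Ch.10" # => ["Vol.", "2", " Ch.", "10"]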

View File

@@ -48,4 +48,32 @@ class Dir
end
Digest::CRC32.checksum(signatures.sort.join).to_u64
end
# Returns the contents signature of the directory at `dirname`, used to
# decide whether a rescan is needed. The signature changes when a file is
# added, moved, removed, or renamed, anywhere in the directory tree.
def self.contents_signature(dirname, cache = {} of String => String) : String
return cache[dirname] if cache[dirname]?
Fiber.yield
signatures = [] of String
self.open dirname do |dir|
dir.entries.sort.each do |fn|
next if fn.starts_with? "."
path = File.join dirname, fn
if File.directory? path
signatures << Dir.contents_signature path, cache
else
# Only add the filename to `signatures` when it is a supported file
signatures << fn if is_supported_file fn
end
Fiber.yield
end
end
hash = Digest::SHA1.hexdigest(signatures.join)
cache[dirname] = hash
hash
end
end
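A minimal sketch of the cache reuse (the path is a placeholder): when the same hash is passed across calls within one scan, each directory is hashed at most once:

cache = {} of String => String
a = Dir.contents_signature "/manga/Some Title", cache
b = Dir.contents_signature "/manga/Some Title", cache # cache hit, no disk I/O
a == b # => true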

View File

@@ -35,6 +35,11 @@ def register_mime_types
# FontAwesome fonts
".woff" => "font/woff",
".woff2" => "font/woff2",
# Supported image formats. JPG, PNG, GIF, WebP, and SVG are already
# defined by Crystal in `MIME.DEFAULT_TYPES`
".apng" => "image/apng",
".avif" => "image/avif",
}.each do |k, v|
MIME.register k, v
end
@@ -114,9 +119,28 @@ class String
def components_similarity(other : String) : Float64
s, l = [self, other]
.map { |str| Path.new(str).parts }
.sort_by &.size
.sort_by! &.size
match = s.reverse.zip(l.reverse).count { |a, b| a == b }
match / s.size
end
end
# Does the following:
# - turns space-like characters into normal whitespace ( )
# - strips and collapses spaces
# - removes ASCII control characters
# - replaces slashes (/) with underscores (_)
# - removes leading dots (.)
# - removes the following special characters: \:*?"<>|
#
# If the sanitized string is empty, returns a random string instead.
def sanitize_filename(str : String) : String
sanitized = str
.gsub(/\s+/, " ")
.strip
.gsub(/\//, "_")
.gsub(/^[\.\s]+/, "")
.gsub(/[\177\000-\031\\:\*\?\"<>\|]/, "")
sanitized.size > 0 ? sanitized : random_str
end
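A few hedged examples of the rules above:

sanitize_filename "  Chapter 1: <Cover?>  " # => "Chapter 1 Cover"
sanitize_filename "../escape"               # => "_escape"
sanitize_filename ".."                      # => a random string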

View File

@@ -72,7 +72,7 @@ def redirect(env, path)
end
def hash_to_query(hash)
hash.map { |k, v| "#{k}=#{v}" }.join("&")
hash.join "&" { |k, v| "#{k}=#{v}" }
end
def request_path_startswith(env, ary)
@@ -107,6 +107,26 @@ macro get_sort_opt
end
end
macro get_and_save_sort_opt(dir)
sort_method = env.params.query["sort"]?
if sort_method
is_ascending = true
ascend = env.params.query["ascend"]?
if ascend && ascend.to_i? == 0
is_ascending = false
end
sort_opt = SortOptions.new sort_method, is_ascending
TitleInfo.new {{dir}} do |info|
info.sort_by[username] = sort_opt.to_tuple
info.save
end
end
end
module HTTP
class Client
private def self.exec(uri : URI, tls : TLSContext = nil)

View File

@@ -33,7 +33,6 @@
<option>System</option>
</select>
</li>
<li><a class="uk-link-reset" href="<%= base_url %>admin/mangadex">Connect to MangaDex</a></li>
</ul>
<hr class="uk-divider-icon">

View File

@@ -49,11 +49,10 @@
</td>
<td x-text="`${job.plugin_id || ''}`"></td>
<td>
<a @click="jobAction('delete', $event)" uk-icon="trash"></a>
<a @click="jobAction('delete', $event)" uk-icon="trash" uk-tooltip="Delete"></a>
<template x-if="job.status_message.length > 0">
<a @click="jobAction('retry', $event)" uk-icon="refresh"></a>
<a @click="jobAction('retry', $event)" uk-icon="refresh" uk-tooltip="Retry"></a>
</template>
</td>
</tr>
@@ -61,6 +60,7 @@
</tbody>
</table>
</div>
</div>
<% content_for "script" do %>
<%= render_component "moment" %>

View File

@@ -1,89 +1,87 @@
<!DOCTYPE html>
<html>
<%= render_component "head" %>
<%= render_component "head" %>
<body>
<div class="uk-offcanvas-content">
<div class="uk-navbar-container uk-navbar-transparent" uk-navbar="uk-navbar">
<div id="mobile-nav" uk-offcanvas="overlay: true">
<div class="uk-offcanvas-bar uk-flex uk-flex-column">
<ul class="uk-nav-parent-icon uk-nav-primary uk-nav-center uk-margin-auto-vertical" uk-nav>
<li><a href="<%= base_url %>">Home</a></li>
<li><a href="<%= base_url %>library">Library</a></li>
<li><a href="<%= base_url %>tags">Tags</a></li>
<% if is_admin %>
<li><a href="<%= base_url %>admin">Admin</a></li>
<li class="uk-parent">
<a href="#">Download</a>
<ul class="uk-nav-sub">
<li><a href="<%= base_url %>download">MangaDex</a></li>
<li><a href="<%= base_url %>download/plugins">Plugins</a></li>
<li><a href="<%= base_url %>admin/downloads">Download Manager</a></li>
</ul>
</li>
<% end %>
<hr uk-divider>
<li><a onclick="toggleTheme()"><i class="fas fa-adjust"></i></a></li>
<li><a href="<%= base_url %>logout">Logout</a></li>
</ul>
</div>
</div>
</div>
<body>
<div class="uk-offcanvas-content">
<div class="uk-navbar-container uk-navbar-transparent" uk-navbar="uk-navbar">
<div id="mobile-nav" uk-offcanvas="overlay: true">
<div class="uk-offcanvas-bar uk-flex uk-flex-column">
<ul class="uk-nav-parent-icon uk-nav-primary uk-nav-center uk-margin-auto-vertical" uk-nav>
<li><a href="<%= base_url %>">Home</a></li>
<li><a href="<%= base_url %>library">Library</a></li>
<li><a href="<%= base_url %>tags">Tags</a></li>
<% if is_admin %>
<li><a href="<%= base_url %>admin">Admin</a></li>
<li class="uk-parent">
<a href="#">Download</a>
<ul class="uk-nav-sub">
<li><a href="<%= base_url %>download/plugins">Plugins</a></li>
<li><a href="<%= base_url %>admin/downloads">Download Manager</a></li>
</ul>
</li>
<% end %>
<hr uk-divider>
<li><a onclick="toggleTheme()"><i class="fas fa-adjust"></i></a></li>
<li><a href="<%= base_url %>logout">Logout</a></li>
</ul>
</div>
</div>
<div class="uk-position-top">
<div class="uk-navbar-container uk-navbar-transparent" uk-navbar="uk-navbar">
<div class="uk-navbar-left uk-hidden@s">
<div class="uk-navbar-toggle" uk-navbar-toggle-icon="uk-navbar-toggle-icon" uk-toggle="target: #mobile-nav"></div>
</div>
<div class="uk-navbar-left uk-visible@s">
<a class="uk-navbar-item uk-logo" href="<%= base_url %>"><img src="<%= base_url %>img/icon.png" style="width:90px;height:90px;"></a>
<ul class="uk-navbar-nav">
<li><a href="<%= base_url %>">Home</a></li>
<li><a href="<%= base_url %>library">Library</a></li>
<li><a href="<%= base_url %>tags">Tags</a></li>
<% if is_admin %>
<li><a href="<%= base_url %>admin">Admin</a></li>
<li>
<a href="#">Download</a>
<div class="uk-navbar-dropdown">
<ul class="uk-nav uk-navbar-dropdown-nav">
<li class="uk-nav-header">Source</li>
<li><a href="<%= base_url %>download">MangaDex</a></li>
<li><a href="<%= base_url %>download/plugins">Plugins</a></li>
<li class="uk-nav-divider"></li>
<li><a href="<%= base_url %>admin/downloads">Download Manager</a></li>
</ul>
</div>
</li>
<% end %>
</ul>
</div>
<div class="uk-navbar-right uk-visible@s">
<ul class="uk-navbar-nav">
<li><a onclick="toggleTheme()"><i class="fas fa-adjust"></i></a></li>
<li><a href="<%= base_url %>logout">Logout</a></li>
</ul>
</div>
</div>
</div>
</div>
<div class="uk-position-top">
<div class="uk-navbar-container uk-navbar-transparent" uk-navbar="uk-navbar">
<div class="uk-navbar-left uk-hidden@s">
<div class="uk-navbar-toggle" uk-navbar-toggle-icon="uk-navbar-toggle-icon" uk-toggle="target: #mobile-nav"></div>
</div>
<div class="uk-section uk-section-small">
</div>
<div class="uk-section uk-section-small" style="position:relative;">
<div class="uk-container uk-container-small">
<div id="alert"></div>
<%= content %>
<div class="uk-visible@m" id="totop-wrapper" x-data="{}" x-show="$('body').height() > 1.5 * $(window).height()">
<a href="#" uk-totop uk-scroll></a>
<div class="uk-navbar-left uk-visible@s">
<a class="uk-navbar-item uk-logo" href="<%= base_url %>"><img src="<%= base_url %>img/icon.png" style="width:90px;height:90px;"></a>
<ul class="uk-navbar-nav">
<li><a href="<%= base_url %>">Home</a></li>
<li><a href="<%= base_url %>library">Library</a></li>
<li><a href="<%= base_url %>tags">Tags</a></li>
<% if is_admin %>
<li><a href="<%= base_url %>admin">Admin</a></li>
<li>
<a href="#">Download</a>
<div class="uk-navbar-dropdown">
<ul class="uk-nav uk-navbar-dropdown-nav">
<li class="uk-nav-header">Source</li>
<li><a href="<%= base_url %>download/plugins">Plugins</a></li>
<li class="uk-nav-divider"></li>
<li><a href="<%= base_url %>admin/downloads">Download Manager</a></li>
</ul>
</div>
</div>
</li>
<% end %>
</ul>
</div>
<script>
setTheme();
const base_url = "<%= base_url %>";
</script>
<%= render_component "uikit" %>
<%= yield_content "script" %>
</body>
<div class="uk-navbar-right uk-visible@s">
<ul class="uk-navbar-nav">
<li><a onclick="toggleTheme()"><i class="fas fa-adjust"></i></a></li>
<li><a href="<%= base_url %>logout">Logout</a></li>
</ul>
</div>
</div>
</div>
<div class="uk-section uk-section-small">
</div>
<div class="uk-section uk-section-small" style="position:relative;">
<div class="uk-container uk-container-small">
<div id="alert"></div>
<%= content %>
<div class="uk-visible@m" id="totop-wrapper" x-data="{}" x-show="$('body').height() > 1.5 * $(window).height()">
<a href="#" uk-totop uk-scroll></a>
</div>
</div>
</div>
<script>
setTheme();
const base_url = "<%= base_url %>";
</script>
<%= render_component "uikit" %>
<%= yield_content "script" %>
</body>
</html>

View File

@@ -56,8 +56,10 @@
<div id="download-spinner" uk-spinner class="uk-margin-left" hidden></div>
</div>
<p class="uk-text-meta">Click on a table row to select the chapter. Drag your mouse over multiple rows to select them all. Hold Ctrl to make multiple non-adjacent selections.</p>
<table class="uk-table uk-table-striped uk-overflow-auto tablesorter">
</table>
<div class="uk-overflow-auto">
<table class="uk-table uk-table-striped tablesorter">
</table>
</div>
</div>
<% end %>

View File

@@ -21,15 +21,15 @@
<div
:class="{'uk-container': true, 'uk-container-small': mode === 'continuous', 'uk-container-expand': mode !== 'continuous'}">
<div x-show="!loading && mode === 'continuous'" x-cloak>
<template x-for="item in items">
<template x-if="!loading && mode === 'continuous'" x-for="item in items">
<img
uk-img
:class="{'uk-align-center': true, 'spine': item.width < 50}"
:style="item.style"
:data-src="item.url"
:width="item.width"
:height="item.height"
:id="item.id"
:style="`margin-top:${margin}px; margin-bottom:${margin}px`"
@click="showControl($event)"
/>
</template>
@@ -50,6 +50,9 @@
width:${mode === 'width' ? '100vw' : 'auto'};
height:${mode === 'height' ? '100vh' : 'auto'};
margin-bottom:0;
max-width:100%;
max-height:100%;
object-fit: contain;
`" />
<div style="position:absolute;z-index:1; top:0;left:0; width:30%;height:100%;" @click="flipPage(false)"></div>
@@ -80,6 +83,7 @@
</select>
</div>
</div>
<div class="uk-margin">
<label class="uk-form-label" for="mode-select">Mode</label>
<div class="uk-form-controls">
@@ -90,6 +94,26 @@
</div>
</div>
<div class="uk-margin" x-show="mode === 'continuous'">
<label class="uk-form-label" for="margin-range" x-text="`Page Margin: ${margin}px`"></label>
<div class="uk-form-controls">
<input id="margin-range" class="uk-range" type="range" min="0" max="50" step="5" x-model="margin" @change="marginChanged()">
</div>
</div>
<div class="uk-margin uk-form-horizontal" x-show="mode !== 'continuous'">
<label class="uk-form-label" for="enable-flip-animation">Enable Flip Animation</label>
<div class="uk-form-controls">
<input id="enable-flip-animation" class="uk-checkbox" type="checkbox" x-model="enableFlipAnimation" @change="enableFlipAnimationChanged()">
</div>
</div>
<div class="uk-margin uk-form-horizontal" x-show="mode !== 'continuous'">
<label class="uk-form-label" for="preload-lookahead" x-text="`Preload Image: ${preloadLookahead} page(s)`"></label>
<div class="uk-form-controls">
<input id="preload-lookahead" class="uk-range" type="range" min="0" max="5" step="1" x-model.number="preloadLookahead" @change="preloadLookaheadChanged()">
</div>
</div>
<hr class="uk-divider-icon">
<div class="uk-margin">
@@ -110,12 +134,12 @@
</div>
<div class="uk-modal-footer uk-text-right">
<% if previous_entry_url %>
<a class="uk-button uk-button-default uk-margin-small-right" href="<%= previous_entry_url %>">Previous Entry</a>
<a class="uk-button uk-button-default uk-margin-small-bottom uk-margin-small-right" href="<%= previous_entry_url %>">Previous Entry</a>
<% end %>
<% if next_entry_url %>
<a class="uk-button uk-button-default uk-margin-small-right" href="<%= next_entry_url %>">Next Entry</a>
<a class="uk-button uk-button-default uk-margin-small-bottom uk-margin-small-right" href="<%= next_entry_url %>">Next Entry</a>
<% end %>
<a class="uk-button uk-button-danger" href="<%= exit_url %>">Exit Reader</a>
<a class="uk-button uk-button-danger uk-margin-small-bottom uk-margin-small-right" href="<%= exit_url %>">Exit Reader</a>
</div>
</div>
</div>

View File

@@ -0,0 +1,54 @@
<h2 class="uk-title">MangaDex Subscription Manager</h2>
<div x-data="component()" x-init="init()">
<p x-show="available === false">The subscription manager uses a MangaDex API that requires authentication. Please <a href="<%= base_url %>admin/mangadex">connect to MangaDex</a> before using this feature.</p>
<p x-show="available && subscriptions.length === 0">No subscription found. Go to the <a href="<%= base_url %>download">MangaDex download page</a> and start subscribing.</p>
<template x-if="subscriptions.length > 0">
<div class="uk-overflow-auto">
<table class="uk-table uk-table-striped">
<thead>
<tr>
<th>Manga ID</th>
<th>Language</th>
<th>Group ID</th>
<th>Volume Range</th>
<th>Chapter Range</th>
<th>Creator</th>
<th>Last Checked</th>
<th>Created At</th>
<th>Actions</th>
</tr>
</thead>
<tbody>
<template x-for="sub in subscriptions" :key="sub">
<tr>
<td><a :href="`<%= mangadex_base_url %>/manga/${sub.manga_id}`" x-text="sub.manga_id"></a></td>
<td x-text="sub.language || 'All'"></td>
<td>
<a x-show="sub.group_id" :href="`<%= mangadex_base_url %>/group/${sub.group_id}`" x-text="sub.group_id"></a>
<span x-show="!sub.group_id">All</span>
</td>
<td x-text="formatRange(sub.min_volume, sub.max_volume)"></td>
<td x-text="formatRange(sub.min_chapter, sub.max_chapter)"></td>
<td x-text="sub.username"></td>
<td x-text="`${moment.unix(sub.last_checked).fromNow()}`"></td>
<td x-text="`${moment.unix(sub.created_at).fromNow()}`"></td>
<td :data-id="sub.id">
<a @click="check($event)" x-show="sub.username === '<%= username %>'" uk-icon="refresh" uk-tooltip="Check for updates"></a>
<a @click="rm($event)" x-show="sub.username === '<%= username %>'" uk-icon="trash" uk-tooltip="Delete"></a>
</td>
</tr>
</template>
</tbody>
</table>
</div>
</template>
</div>
<% content_for "script" do %>
<%= render_component "moment" %>
<script src="<%= base_url %>js/alert.js"></script>
<script src="<%= base_url %>js/subscription.js"></script>
<% end %>