Compare commits

...

254 Commits

Author SHA1 Message Date
Alex Ling
8a732804ae Merge pull request #232 from hkalexling/rc/0.24.0
v0.24.0
2021-09-25 13:39:48 +08:00
Alex Ling
9df372f784 Merge branch 'dev' into rc/0.24.0 2021-09-23 07:52:58 +00:00
Alex Ling
cf7431b8b6 Merge pull request #236 from hkalexling/fix/shallow-api-fix
Use `depth` instead of `shallow` in API
2021-09-23 15:52:14 +08:00
Alex Ling
974b6cfe9b Use depth instead of shallow in API 2021-09-22 09:17:14 +00:00
Alex Ling
4fbe5b471c Update README 2021-09-20 01:36:31 +00:00
Alex Ling
33e7e31fbc Bump version to 0.24.0 2021-09-20 01:11:26 +00:00
Alex Ling
72fae7f5ed Fix typo cbz -> gz 2021-09-18 12:41:25 +00:00
Alex Ling
f50a7e3b3e Merge branch 'dev' into rc/0.24.0 2021-09-18 12:40:56 +00:00
Alex Ling
66c4037f2b Merge pull request #233 from Leeingnyo/feature/avoid-unnecessary-sort
Avoid an unnecessary sort
2021-09-18 20:40:25 +08:00
Leeingnyo
2c022a07e7 Avoid unnecessary sorts when getting deep percentage
This makes page loading faster
2021-09-18 18:49:29 +09:00
Alex Ling
91362dfc7d Merge branch 'master' into rc/0.24.0 2021-09-18 09:20:59 +00:00
Alex Ling
97168b65d8 Make library cache path configurable 2021-09-18 08:40:08 +00:00
Alex Ling
6e04e249e7 Merge pull request #229 from Leeingnyo/feature/preserve-scanned-titles
Reuse scanned titles on boot and scanning
2021-09-18 16:14:31 +08:00
Alex Ling
16397050dd Update comments 2021-09-18 02:24:50 +00:00
Alex Ling
3f73591dd4 Update comments 2021-09-18 02:14:22 +00:00
Alex Ling
ec25109fa5 Merge branch 'feature/preserve-scanned-titles' of https://github.com/Leeingnyo/Mango into feature/preserve-scanned-titles 2021-09-18 02:04:02 +00:00
Alex Ling
96f1ef3dde Improve comments on examine 2021-09-18 02:00:10 +00:00
Leeingnyo
b56e16e1e1 Remove counter, yield everytime 2021-09-18 10:59:43 +09:00
Leeingnyo
9769e760a0 Pass a counter to recursive calls, Ignore negative threshold 2021-09-16 07:49:12 +09:00
Leeingnyo
70ab198a33 Add config 'forcely_yield_count'
The default value of 1000 makes a fiber yield roughly every 4 ms on an SSD

Apply the yield counter in Dir.contents_signature
Use contents_signature cache in Title.new
2021-09-16 00:16:26 +09:00
Alex Ling
44a6f822cd Simplify Title.new 2021-09-15 09:00:30 +00:00
Alex Ling
2c241a96bb Merge branch 'dev' into feature/preserve-scanned-titles 2021-09-15 08:58:24 +00:00
Alex Ling
219d4446d1 Merge pull request #231 from hkalexling/feature/api-improvements
API Improvements
2021-09-15 16:54:41 +08:00
Alex Ling
d330db131e Simplify mark_unavailable 2021-09-15 08:46:30 +00:00
Leeingnyo
de193906a2 Refactor mark_unavailable 2021-09-15 16:54:55 +09:00
Leeingnyo
d13cfc045f Add a comment 2021-09-15 01:27:05 +09:00
Leeingnyo
a3b2cdd372 Lint 2021-09-15 01:17:44 +09:00
Leeingnyo
f4d7128b59 Mark unavailable only in candidates 2021-09-14 23:30:03 +09:00
Leeingnyo
663c0c0b38 Remove nested title including self 2021-09-14 23:28:28 +09:00
Leeingnyo
57b2f7c625 Get nested ids when title removed 2021-09-14 23:08:07 +09:00
Leeingnyo
9489d6abfd Use reference instead of primitive 2021-09-14 23:07:47 +09:00
Leeingnyo
670cf54957 Apply yield forcely 2021-09-14 22:51:37 +09:00
Leeingnyo
2e09efbd62 Collect deleted ids 2021-09-14 22:51:25 +09:00
Leeingnyo
523195d649 Define ExamineContext, apply it when scanning 2021-09-14 22:37:30 +09:00
Leeingnyo
be47f309b0 Use cache when calculating contents_signature 2021-09-14 18:11:08 +09:00
Alex Ling
03e044a1aa Improve logging 2021-09-14 07:16:14 +00:00
Alex Ling
4eaf271fa4 Simplify #json_build 2021-09-14 02:30:57 +00:00
Alex Ling
4b464ed361 Allow sorting in /api/book endpoint 2021-09-13 10:58:07 +00:00
Alex Ling
a9520d6f26 Add shallow option to library API endpoints 2021-09-13 10:18:07 +00:00
Leeingnyo
a151ec486d Fix file extension of gzip file 2021-09-12 18:04:41 +09:00
Leeingnyo
8f1383a818 Use Gzip instead of Zip 2021-09-12 18:01:16 +09:00
Leeingnyo
f5933a48d9 Register mime_type scan, thumbnails when loading instance 2021-09-12 17:40:40 +09:00
Leeingnyo
7734dae138 Remove unnecessary sort 2021-09-12 14:36:17 +09:00
Leeingnyo
8c90b46114 Remove removed titles from title_hash 2021-09-12 13:39:28 +09:00
Leeingnyo
cd48b45f11 Add 'require "yaml"' 2021-09-12 12:45:24 +09:00
Leeingnyo
bdbdf9c94b Fix to pass 'make check', fix comments 2021-09-12 11:09:48 +09:00
Leeingnyo
7e36c91ea7 Remove debug print 2021-09-12 10:47:15 +09:00
Leeingnyo
9309f51df6 Memoization on dir contents_signature 2021-09-12 02:19:49 +09:00
Leeingnyo
a8f729f5c1 Sort entries and titles only when needed 2021-09-12 02:19:49 +09:00
Leeingnyo
4e8b561f70 Apply contents signature of directories 2021-09-12 02:19:49 +09:00
Leeingnyo
e6214ddc5d Rescan only if instance loaded 2021-09-12 02:19:49 +09:00
Leeingnyo
80e13abc4a Spawn scan job 2021-09-12 02:14:58 +09:00
Leeingnyo
fb43abb950 Enhance the examine method 2021-09-12 02:14:58 +09:00
Leeingnyo
eb3e37b950 Examine titles and recycle them 2021-09-12 02:14:58 +09:00
Leeingnyo
0a90e3b333 Ignore caches 2021-09-12 02:14:58 +09:00
Leeingnyo
4409ed8f45 Implement save_instance, load_instance 2021-09-12 02:14:58 +09:00
Leeingnyo
291a340cdd Add yaml serializer to Library, Title, Entry 2021-09-12 02:14:58 +09:00
Leeingnyo
0667f01471 Measure scan only 2021-09-12 02:14:58 +09:00
Alex Ling
d5847bb105 Merge pull request #224 from hkalexling/fix/sanitize-download-filename
Stricter sanitization rules for download filenames
2021-09-09 08:47:41 +08:00
Alex Ling
3d295e961e Merge branch 'dev' into fix/sanitize-download-filename 2021-09-09 00:31:24 +00:00
Alex Ling
e408398523 Merge pull request #227 from hkalexling/all-contributors/add-lincolnthedev
docs: add lincolnthedev as a contributor for infra
2021-09-09 08:18:22 +08:00
Alex Ling
566cebfcdd Remove all leading dots and spaces 2021-09-09 00:13:58 +00:00
Alex Ling
a190ae3ed6 Merge pull request #226 from lincolnthedev/master
Better .dockerignore
2021-09-08 20:42:12 +08:00
allcontributors[bot]
17d7cefa12 docs: update .all-contributorsrc [skip ci] 2021-09-08 12:42:10 +00:00
allcontributors[bot]
eaef0556fa docs: update README.md [skip ci] 2021-09-08 12:42:09 +00:00
i use arch btw
53226eab61 Forgot .github 2021-09-08 07:58:58 -04:00
Alex Ling
ccf558eaa7 Improve filename sanitization rules 2021-09-08 10:03:05 +00:00
Alex Ling
0305433e46 Merge pull request #225 from hkalexling/feature/support-all-image-types
Support additional image formats (resolves #192)
2021-09-08 11:03:59 +08:00
i use arch btw
d2cad6c496 Update .dockerignore 2021-09-07 21:12:51 -04:00
Alex Ling
371796cce9 Support additional image formats:
- APNG
- AVIF
- GIF
- SVG
2021-09-07 11:04:05 +00:00
Alex Ling
d9adb49c27 Revert "Support all image types (resolves #192)"
This reverts commit f67e4e6cb9.
2021-09-07 10:45:59 +00:00
Alex Ling
f67e4e6cb9 Support all image types (resolves #192) 2021-09-06 13:32:10 +00:00
Alex Ling
60a126024c Stricter sanitization rules for download filenames
Fixes #212
2021-09-06 13:01:05 +00:00
Alex Ling
da8a485087 Merge pull request #222 from Leeingnyo/feature/enhance-loading-library
Improve loading pages (library, titles)
2021-09-06 16:49:19 +08:00
Alex Ling
d809c21ee1 Document CacheEntry 2021-09-06 08:23:54 +00:00
Alex Ling
ca1e221b10 Rename ids2entries -> ids_to_entries 2021-09-06 08:23:31 +00:00
Alex Ling
44d9c51ff9 Fix logging 2021-09-06 08:10:42 +00:00
Alex Ling
15a54f4f23 Add :sorted_entries suffix to gen_key 2021-09-06 08:10:13 +00:00
Alex Ling
51806f18db Rename config fields and improve logging 2021-09-06 03:35:46 +00:00
Alex Ling
79ef7bcd1c Remove unused variable 2021-09-06 03:01:21 +00:00
Leeingnyo
5cb85ea857 Set cached data when changed 2021-09-06 09:41:46 +09:00
Leeingnyo
9807db6ac0 Fix bug on entry_cover_url_cache 2021-09-06 02:32:13 +09:00
Leeingnyo
565a535d22 Remove caching verbosely, add cached_cover_url 2021-09-06 02:32:13 +09:00
Alex Ling
c5b6a8b5b9 Improve instance_size for Tuple 2021-09-05 13:57:20 +00:00
Leeingnyo
c75c71709f make check 2021-09-05 11:21:53 +09:00
Leeingnyo
11976b15f9 Make LRUCache togglable 2021-09-05 03:02:20 +09:00
Leeingnyo
847f516a65 Cache TitleInfo using LRUCache 2021-09-05 02:35:44 +09:00
Leeingnyo
de410f42b8 Replace InfoCache with LRUCache 2021-09-05 02:08:11 +09:00
Leeingnyo
0fd7caef4b Rename 2021-09-05 00:02:05 +09:00
Leeingnyo
5e919d3e19 Make entry generic 2021-09-04 23:56:17 +09:00
Leeingnyo
9e90aa17b9 Move entry specific method 2021-09-04 14:37:05 +09:00
Leeingnyo
0a8fd993e5 Use bytesize and add comments 2021-09-03 11:11:28 +09:00
Leeingnyo
365f71cd1d Change kbs to mbs 2021-08-30 23:10:08 +09:00
Leeingnyo
601346b209 Set cache if enabled 2021-08-30 23:07:59 +09:00
Leeingnyo
e988a8c121 Add config for sorted entries cache
optional
2021-08-30 22:59:23 +09:00
Leeingnyo
bf81a4e48b Implement sorted entries cache
sorted_entries cached
2021-08-30 22:58:40 +09:00
Leeingnyo
4a09aee177 Implement library caching TitleInfo
* Cache sum of entry progress
* Cache cover_url
* Cache display_name
* Cache sort_opt
2021-08-30 11:31:45 +09:00
Leeingnyo
00c9cc1fcd Prevent saving a sort opt unnecessarily 2021-08-30 11:31:45 +09:00
Leeingnyo
51a47b5ddd Cache display_name 2021-08-30 11:31:45 +09:00
Leeingnyo
244f97a68e Cache entries' cover_url 2021-08-30 08:24:40 +09:00
Alex Ling
8d84a3c502 Merge pull request #221 from hkalexling/rc/0.23.0
v0.23.0
2021-08-28 18:44:50 +08:00
Alex Ling
a26b4b3965 Add Discord badge 2021-08-28 10:27:43 +00:00
Alex Ling
f2dd20cdec Bump version to 0.23.0 2021-08-22 08:20:24 +00:00
Alex Ling
64d6cd293c Remove MangaDex from README 2021-08-22 08:20:01 +00:00
Alex Ling
08dc0601e8 Remove page_margin from README 2021-08-22 08:18:28 +00:00
Alex Ling
9c983df7e9 Merge pull request #216 from Leeingnyo/feature/update-crystal-1.0
Update crystal version to 1.0
2021-08-22 15:59:20 +08:00
Alex Ling
efc547f5b2 Fix "Executing query" spam in log 2021-08-22 07:34:29 +00:00
Alex Ling
995ca3b40f Update ARM Dockerfile 2021-08-20 08:26:11 +00:00
Alex Ling
864435d3f9 Upgrade dependencies for Crystal 1.0.0 2021-08-20 07:15:16 +00:00
Alex Ling
64c145cf80 Merge branch 'dev' of https://github.com/hkalexling/Mango into feature/update-crystal-1.0 2021-08-20 00:36:38 +00:00
Alex Ling
6549253ed1 Merge branch 'dev' of https://github.com/hkalexling/Mango into dev 2021-08-20 00:26:59 +00:00
Alex Ling
d9565718a4 Remove MangaDex integration 2021-08-20 00:25:21 +00:00
Alex Ling
400c3024fd Merge pull request #219 from Leeingnyo/feature/enhance-paged-mode
Enhance the paged mode reader
2021-08-20 08:13:39 +08:00
Leeingnyo
a703175b3a Parenthesize to avoid a misbehavior 2021-08-19 15:03:01 +09:00
Leeingnyo
83b122ab75 Implement a controller to toggle flip animation 2021-08-18 23:46:45 +09:00
Leeingnyo
1e7d6ba5b1 Keep the image ratio in paged mode, clamping the image to the screen 2021-08-18 23:46:45 +09:00
Leeingnyo
4d1ad8fb38 Prevent loading images 2021-08-18 23:46:45 +09:00
Leeingnyo
d544252e3e Add preload lookahead controller 2021-08-18 23:46:45 +09:00
Leeingnyo
b02b28d3e3 Preload images when the viewer is in paged mode 2021-08-18 23:07:38 +09:00
Leeingnyo
d7efe1e553 Use yaml-static in Dockerfile 2021-08-18 21:23:28 +09:00
Alex Ling
1973564272 Revert "Subscription manager"
This reverts commit a612500b0f.
2021-08-18 12:09:59 +00:00
Alex Ling
29923f6dc7 Merge pull request #214 from Leeingnyo/feature/http-cache
HTTP cache for thumbnails, pages, dimensions
2021-08-18 07:37:21 +08:00
Leeingnyo
4a261d5ff8 Set Cache-Control header at page, dimensions API 2021-08-18 01:33:22 +09:00
Alex Ling
31d425d462 Document the 304 responses 2021-08-17 07:14:24 +00:00
Leeingnyo
a21681a6d7 Update the container version of the build workflow 2021-08-16 13:23:58 +09:00
Leeingnyo
208019a0b9 Update Docker image 2021-08-16 01:48:18 +09:00
Leeingnyo
54e2a54ecb Update crystal-lang and libraries 2021-08-16 01:48:18 +09:00
Leeingnyo
2426ef05ec Apply cache on dimensions api
Use zip_path and mtime for hashing
It is used for weak validation
2021-08-15 21:41:44 +09:00
Leeingnyo
25b90a8724 Apply cache on page, cover api
Get image data and use it for hashing
2021-08-15 21:33:08 +09:00
Alex Ling
cd8944ed2d Slim option in library and title APIs 2021-04-25 12:41:37 +00:00
Alex Ling
7f0c256fe6 Log login errors 2021-04-25 12:41:29 +00:00
Alex Ling
46e6e41bfe Fix reader buttons stacking on mobile 2021-03-29 00:41:33 +00:00
Alex Ling
c9f55e7a8e Use yaml-static 2021-03-28 12:49:50 +00:00
Alex Ling
741c3a4e20 Update config example in README 2021-03-28 11:56:06 +00:00
Alex Ling
f6da20321d Bump version to 0.22.0 2021-03-28 11:49:49 +00:00
Alex Ling
2764e955b2 Show success alert on plugin download page 2021-03-15 17:07:15 +00:00
Alex Ling
00c15014a1 Document subscription APIs 2021-03-15 07:12:30 +00:00
Alex Ling
c6fdbfd9fd Better format ranges on subscription manager page 2021-03-15 07:12:10 +00:00
Alex Ling
e03bf32358 Show success alerts on the download page 2021-03-14 17:36:43 +00:00
Alex Ling
bbf1520c73 Make in_range? private 2021-03-14 17:36:26 +00:00
Alex Ling
8950c3a1ed Fix downloader stuck on external chapters 2021-03-14 16:27:08 +00:00
Alex Ling
17837d8a29 Add tooltips to download manager 2021-03-14 16:03:37 +00:00
Alex Ling
b4a69425c8 Reverse the queue on download manager 2021-03-14 16:01:29 +00:00
Alex Ling
a612500b0f Subscription manager 2021-03-14 16:01:29 +00:00
Alex Ling
9bb7144479 Fix warning 2021-03-12 15:28:39 +00:00
Alex Ling
ee52c52f46 Fix new linter errors 2021-03-12 15:03:12 +00:00
Alex Ling
daec2bdac6 Update ameba 2021-03-12 14:06:20 +00:00
Alex Ling
e9a490676b Update the mangadex shard 2021-03-12 13:59:11 +00:00
Alex Ling
757f7c8214 Upgrade Crystal to 0.36.1 2021-03-12 13:41:24 +00:00
Alex Ling
eed1a9717e Merge branch 'master' into dev 2021-03-10 16:48:51 +00:00
Alex Ling
8829d2e237 Merge pull request #173 from hkalexling/rc/0.21.0 2021-03-11 00:44:49 +08:00
Alex Ling
eec6ec60bf Warn about old API url (#174) 2021-03-10 05:47:25 +00:00
Alex Ling
3a82effa40 Update config in README 2021-03-09 18:01:03 +00:00
Alex Ling
0b3e78bcb7 Merge branch 'rc/0.21.0' into dev 2021-03-09 16:45:26 +00:00
Alex Ling
cb4e4437a6 Update MD API URL (closes #174) 2021-03-09 16:43:46 +00:00
Alex Ling
6a275286ea Merge branch 'rc/0.21.0' into dev 2021-03-07 14:14:46 +00:00
Alex Ling
2743868438 Remove outdated MD API link in warning 2021-03-06 17:03:48 +00:00
Alex Ling
d3f26ecbc9 Move the page margin config to frontend 2021-03-06 15:04:44 +00:00
Alex Ling
f62344806a Bump version to 0.21.0 2021-03-06 06:16:07 +00:00
Alex Ling
b7b7e6f718 Fix typo [skip ci] 2021-03-05 17:04:23 +00:00
Alex Ling
05b4e77fa9 Entry selector on reader page (closes #168) 2021-03-05 17:02:45 +00:00
Alex Ling
8aab113aab Expiration date should be nil when there's no token 2021-03-05 11:01:00 +00:00
Alex Ling
371c8056e7 Wording 2021-03-05 10:57:23 +00:00
Alex Ling
a9a2c9faa8 Finish search for MD 2021-03-05 04:58:56 +00:00
Alex Ling
011768ed1f Rename the dots-scripts component to dots 2021-03-05 04:58:56 +00:00
Alex Ling
c36d2608e8 Make uk-card adaptive to dark/light mode 2021-03-05 04:58:56 +00:00
Alex Ling
1b25a1fa47 Update Koa 2021-03-05 04:58:56 +00:00
Alex Ling
df7e2270a4 Add MangaDex login page 2021-03-05 04:58:56 +00:00
Alex Ling
3c3549a489 Merge pull request #172 from hkalexling/hotfix/bind-localhost 2021-03-04 13:47:59 +08:00
Alex Ling
8160b0a18e Bump version to 0.20.2 2021-03-04 04:49:37 +00:00
Alex Ling
a7eff772be Update example config in README 2021-03-04 04:48:51 +00:00
Alex Ling
bf3900f9a2 Add host to config 2021-03-03 17:35:39 +00:00
Alex Ling
6fa575cf4f Bind localhost when a proxy auth header is set 2021-03-03 16:28:31 +00:00
Alex Ling
604c5d49a6 Merge pull request #166 from hkalexling/dev 2021-02-28 19:38:02 +08:00
Alex Ling
7449d19075 Bump version to 0.20.1 2021-02-26 10:35:34 +00:00
Alex Ling
c5c9305a0b Merge pull request #162 from hkalexling/all-contributors/add-davidkna
docs: add davidkna as a contributor
2021-02-14 23:30:02 +08:00
allcontributors[bot]
fdceab9060 docs: update .all-contributorsrc [skip ci] 2021-02-14 15:28:33 +00:00
allcontributors[bot]
c18591c5cf docs: update README.md [skip ci] 2021-02-14 15:28:32 +00:00
Alex Ling
bb5cb9b94c Merge pull request #161 from davidkna/docker-usr-local
Move binary in docker image to /usr/local
2021-02-14 23:26:38 +08:00
David Knaack
fb499a5caf Move binary in docker image to /usr/local 2021-02-14 11:42:00 +01:00
Alex Ling
154d85e197 Use only woff and woff2 2021-02-11 08:40:24 +00:00
Alex Ling
933617503e Optimize the static files
- Use webfont version of FontAwesome
- Use CDN for UIKit JS files
2021-02-10 16:24:34 +00:00
Alex Ling
31c6893bbb Display book spines in original size (fixes #152) 2021-02-06 13:37:25 +00:00
Alex Ling
171125e8ac Merge pull request #159 from Leeingnyo/fix/favicon-500-error
Fix HTTP 500 Error when accessing the favicon
2021-02-06 16:34:56 +08:00
Leeingnyo
d81334026b Add MIME type for the ico file
The server returned a 500 error when '/favicon.ico' was requested.
The handler worked fine, but send_file failed with
- Missing MIME type for extension ".ico"
so this registers the MIME type for .ico files
2021-02-06 16:58:49 +09:00
Alex Ling
2b3b2eb8ba Fill default configs before pre-processing 2021-02-03 05:27:41 +00:00
Alex Ling
ffd5f4454b Merge branch 'feature/auth-proxy' into dev 2021-02-03 05:23:00 +00:00
Alex Ling
cb25d7ba00 Merge branch 'feature/mangadex-api-upgrade' into dev 2021-02-03 05:22:35 +00:00
Alex Ling
3abd2924d0 Merge pull request #156 from hkalexling/dev
v0.20.0
2021-02-02 12:16:05 +08:00
Alex Ling
21233df754 Fix group filter on the download page 2021-02-01 11:37:00 +00:00
Alex Ling
c61eb7554e Update the mangadex shard 2021-02-01 11:35:16 +00:00
Alex Ling
edd9a2e093 Add MutationObserver polyfill 2021-01-31 15:32:38 +00:00
Alex Ling
1f50785e8f Rewrite MangaDex download page with Alpine 2021-01-31 12:48:37 +00:00
Alex Ling
70d418d1a1 Upgrade to MangaDex API v2 2021-01-30 17:08:04 +00:00
Alex Ling
45e20c94f9 Merge branch 'dev' into feature/auth-proxy 2021-01-30 10:55:27 +00:00
Alex Ling
ca8e9a164e Fix the /api page error when using base URL 2021-01-30 10:54:21 +00:00
Alex Ling
4da263c594 Rewrite auth_handler
Make sure the OPDS pages are accessible without login when login is
disabled
2021-01-30 10:54:03 +00:00
Alex Ling
d67a24809b Allow proxy authentication (#141) 2021-01-30 07:43:02 +00:00
Alex Ling
cd268af9dd Fix tags.css base URL 2021-01-30 07:39:54 +00:00
Alex Ling
135fa9fde6 Update sample config in README [skip ci] 2021-01-29 14:42:58 +00:00
Alex Ling
77333aaafd Bump version to 0.20.0 2021-01-29 10:27:31 +00:00
Alex Ling
1fad530331 Fix admin page theme setting syncing (#155) 2021-01-29 08:40:50 +00:00
Alex Ling
a1bd87098c Escape single quotes in migration 8 2021-01-28 12:41:51 +00:00
Alex Ling
a389fa7178 Allow delete all missing items (#151) 2021-01-28 09:55:41 +00:00
Alex Ling
b5db508005 Fix relative path mismatch (#151) 2021-01-28 04:04:42 +00:00
Alex Ling
30178c42ef Merge branch 'master' into dev 2021-01-27 09:47:49 +00:00
Alex Ling
b712db9e8f Merge pull request #154 from hkalexling/hkalexling-patch-1
Update autoapproval.yml
2021-01-27 16:28:05 +08:00
Alex Ling
dd9c75d1c9 Update autoapproval.yml 2021-01-27 16:17:33 +08:00
Alex Ling
2d150c3bf2 Create autoapproval.yml 2021-01-27 16:14:47 +08:00
Alex Ling
40f74ea375 Merge pull request #153 from hkalexling/hotfix/reader-bg
Fix incorrect background color on reader page
2021-01-27 15:19:04 +08:00
Alex Ling
adf260bc35 Bump version to v0.19.1 2021-01-27 06:33:45 +00:00
Alex Ling
432d6f0cd5 Run CI for hotfix/* branches 2021-01-27 06:33:45 +00:00
Alex Ling
3de314ae9a Fix incorrect background color on reader page 2021-01-27 06:33:45 +00:00
Alex Ling
c1c8cca877 Use Ameba to enforce max line width
Didn't know Ameba supports this!
2021-01-27 04:18:47 +00:00
Alex Ling
07965b98b7 Force File::Info#inode to return UInt64 2021-01-27 03:42:51 +00:00
Alex Ling
5779d225f6 Merge branch 'dev' of https://github.com/hkalexling/Mango into dev 2021-01-27 03:23:08 +00:00
Alex Ling
bf18a14016 Use inode number 2021-01-27 03:19:58 +00:00
Alex Ling
605dc61777 Merge pull request #150 from Leeingnyo/fix/allow-uppercase-extensions
Make file extension check case-insensitive
2021-01-26 19:16:12 +08:00
Alex Ling
def64d9f98 Rename interesting files to supported files 2021-01-26 10:55:50 +00:00
Leeingnyo
0ba2409c9a add tests about is_interesting_file 2021-01-26 04:18:09 +09:00
Leeingnyo
2b0cf41336 add and apply util method is_interesting_file 2021-01-26 04:17:32 +09:00
Leeingnyo
c51cb28df2 Downcase the filename extension for comparison 2021-01-25 23:13:35 +09:00
Alex Ling
2b079c652d Fix duplicating options on the download page 2021-01-20 08:02:07 +00:00
Alex Ling
68050a9025 Fix incorrect dropdown color in dark mode 2021-01-20 05:20:03 +00:00
Alex Ling
54cd15d542 Mark items unavailable and retire DB optimization
This prepares us for moving metadata to the DB in the future
2021-01-19 15:09:38 +00:00
Alex Ling
781de97c68 Make thumbnail generation slower
This reduces the IO stress
2021-01-19 15:06:27 +00:00
Alex Ling
c7be0e0e7c Separate insert_id into titles and entries 2021-01-19 09:08:31 +00:00
Alex Ling
667d390be4 Signature matching 2021-01-19 08:43:45 +00:00
Alex Ling
7f76322377 Merge branch 'dev' into feature/signature 2021-01-18 06:54:38 +00:00
Alex Ling
377c4c6554 Stop the process when the server fails to start 2021-01-18 06:44:10 +00:00
Alex Ling
952aa0c6ca Fix linter 2021-01-17 15:59:42 +00:00
Alex Ling
bd81c2e005 Fix incorrect migration SQL 2021-01-17 15:58:13 +00:00
Alex Ling
b471ed2fa0 Upgrade MG 2021-01-17 15:49:10 +00:00
Alex Ling
7507ab64ad Bump version to v0.19.0 2021-01-17 08:34:35 +00:00
Alex Ling
e4587d36bc Fix linter 2021-01-17 08:25:01 +00:00
Alex Ling
7d6d3640ad Disable the tagging UI for non-admin users 2021-01-17 08:16:40 +00:00
Alex Ling
3071d44e32 Fix admin API bypassing 2021-01-17 08:10:43 +00:00
Alex Ling
7a09c9006a Set up foreign keys 2021-01-17 04:47:06 +00:00
Alex Ling
959560c7a7 Add titles and move insert_ids to class variable
This fixes the bug where the new ids are not saved
2021-01-17 04:45:55 +00:00
Alex Ling
ff679b30d8 Capitalize the UNIQUE keyword 2021-01-17 04:41:05 +00:00
Alex Ling
f7a360c2d8 Proper DB migration 2021-01-16 17:11:57 +00:00
Alex Ling
1065b430e3 Rewrite tagging UI with suggestions (#146) 2021-01-14 13:08:50 +00:00
Alex Ling
5abf7032a5 Use less 2021-01-14 13:04:57 +00:00
Alex Ling
18e8e88c66 Initial work on title signature 2021-01-14 08:23:39 +00:00
Alex Ling
44336c546a Bump version to v0.18.3 2021-01-12 10:14:12 +00:00
Alex Ling
a4c6e6611c Try WSS first, and fallback to WS (#144) 2021-01-12 10:13:06 +00:00
Alex Ling
0b457a2797 Merge branch 'master' of https://github.com/hkalexling/Mango 2021-01-11 15:37:34 +00:00
Alex Ling
653751bede Merge branch 'dev' 2021-01-11 15:37:06 +00:00
Alex Ling
a02bf4a81e Bump version to v0.18.2 2021-01-11 15:22:51 +00:00
Alex Ling
5271d12f4c Respect base URL in WS connections 2021-01-11 15:05:58 +00:00
Alex Ling
c2e2f0b9b3 Merge pull request #143 from hkalexling/all-contributors/add-h45h74x
docs: add h45h74x as a contributor
2021-01-11 19:32:47 +08:00
allcontributors[bot]
72d319902e docs: update .all-contributorsrc [skip ci] 2021-01-11 11:31:21 +00:00
allcontributors[bot]
bbd0fd68cb docs: update README.md [skip ci] 2021-01-11 11:31:20 +00:00
Alex Ling
0fb1e1598d Remove sourcerer.io HoF and use all-contributors
[skip ci]
RIP sourcerer.io https://github.com/sourcerer-io/sourcerer-app/issues/632
2021-01-11 11:28:30 +00:00
90 changed files with 3157 additions and 1670 deletions

.all-contributorsrc (new file, +120)

@@ -0,0 +1,120 @@
{
"projectName": "Mango",
"projectOwner": "hkalexling",
"repoType": "github",
"repoHost": "https://github.com",
"files": [
"README.md"
],
"imageSize": 100,
"commit": false,
"commitConvention": "none",
"contributors": [
{
"login": "hkalexling",
"name": "Alex Ling",
"avatar_url": "https://avatars1.githubusercontent.com/u/7845831?v=4",
"profile": "https://github.com/hkalexling/",
"contributions": [
"code",
"doc",
"infra"
]
},
{
"login": "jaredlt",
"name": "jaredlt",
"avatar_url": "https://avatars1.githubusercontent.com/u/8590311?v=4",
"profile": "https://github.com/jaredlt",
"contributions": [
"code",
"ideas",
"design"
]
},
{
"login": "shincurry",
"name": "ココロ",
"avatar_url": "https://avatars1.githubusercontent.com/u/4946624?v=4",
"profile": "https://windisco.com/",
"contributions": [
"infra"
]
},
{
"login": "noirscape",
"name": "Valentijn",
"avatar_url": "https://avatars0.githubusercontent.com/u/13433513?v=4",
"profile": "https://catgirlsin.space/",
"contributions": [
"infra"
]
},
{
"login": "flying-sausages",
"name": "flying-sausages",
"avatar_url": "https://avatars1.githubusercontent.com/u/23618693?v=4",
"profile": "https://github.com/flying-sausages",
"contributions": [
"doc",
"ideas"
]
},
{
"login": "XavierSchiller",
"name": "Xavier",
"avatar_url": "https://avatars1.githubusercontent.com/u/22575255?v=4",
"profile": "https://github.com/XavierSchiller",
"contributions": [
"infra"
]
},
{
"login": "WROIATE",
"name": "Jarao",
"avatar_url": "https://avatars3.githubusercontent.com/u/44677306?v=4",
"profile": "https://github.com/WROIATE",
"contributions": [
"infra"
]
},
{
"login": "Leeingnyo",
"name": "이인용",
"avatar_url": "https://avatars0.githubusercontent.com/u/6760150?v=4",
"profile": "https://github.com/Leeingnyo",
"contributions": [
"code"
]
},
{
"login": "h45h74x",
"name": "Simon",
"avatar_url": "https://avatars1.githubusercontent.com/u/27204033?v=4",
"profile": "http://h45h74x.eu.org",
"contributions": [
"code"
]
},
{
"login": "davidkna",
"name": "David Knaack",
"avatar_url": "https://avatars.githubusercontent.com/u/835177?v=4",
"profile": "https://github.com/davidkna",
"contributions": [
"infra"
]
},
{
"login": "lincolnthedev",
"name": "i use arch btw",
"avatar_url": "https://avatars.githubusercontent.com/u/41193328?v=4",
"profile": "https://lncn.dev",
"contributions": [
"infra"
]
}
],
"contributorsPerLine": 7,
"skipCi": true
}


@@ -7,3 +7,8 @@ Lint/UnusedArgument:
- src/routes/*
Metrics/CyclomaticComplexity:
Enabled: false
Layout/LineLength:
Enabled: true
MaxLength: 80
Excluded:
- src/routes/api.cr


@@ -1,2 +1,9 @@
node_modules
lib
Dockerfile
Dockerfile.arm32v7
Dockerfile.arm64v8
README.md
.all-contributorsrc
env.example
.github/

.github/autoapproval.yml (new vendored file, +6)

@@ -0,0 +1,6 @@
from_owner:
- hkalexling
required_labels:
- autoapprove
apply_labels:
- autoapproved


@@ -2,7 +2,7 @@ name: Build
on:
push:
branches: [ master, dev ]
branches: [ master, dev, hotfix/* ]
pull_request:
branches: [ master, dev ]
@@ -12,12 +12,12 @@ jobs:
runs-on: ubuntu-latest
container:
image: crystallang/crystal:0.35.1-alpine
image: crystallang/crystal:1.0.0-alpine
steps:
- uses: actions/checkout@v2
- name: Install dependencies
run: apk add --no-cache yarn yaml sqlite-static libarchive-dev libarchive-static acl-static expat-static zstd-static lz4-static bzip2-static libjpeg-turbo-dev libpng-dev tiff-dev
run: apk add --no-cache yarn yaml-static sqlite-static libarchive-dev libarchive-static acl-static expat-static zstd-static lz4-static bzip2-static libjpeg-turbo-dev libpng-dev tiff-dev
- name: Build
run: make static || make static
- name: Linter

.gitignore (vendored, +2)

@@ -12,3 +12,5 @@ mango
public/css/uikit.css
public/img/*.svg
public/js/*.min.js
public/css/*.css
public/webfonts


@@ -1,15 +1,15 @@
FROM crystallang/crystal:0.35.1-alpine AS builder
FROM crystallang/crystal:1.0.0-alpine AS builder
WORKDIR /Mango
COPY . .
RUN apk add --no-cache yarn yaml sqlite-static libarchive-dev libarchive-static acl-static expat-static zstd-static lz4-static bzip2-static libjpeg-turbo-dev libpng-dev tiff-dev
RUN apk add --no-cache yarn yaml-static sqlite-static libarchive-dev libarchive-static acl-static expat-static zstd-static lz4-static bzip2-static libjpeg-turbo-dev libpng-dev tiff-dev
RUN make static || make static
FROM library/alpine
WORKDIR /
COPY --from=builder /Mango/mango .
COPY --from=builder /Mango/mango /usr/local/bin/mango
CMD ["./mango"]
CMD ["/usr/local/bin/mango"]


@@ -2,13 +2,14 @@ FROM arm32v7/ubuntu:18.04
RUN apt-get update && apt-get install -y wget git make llvm-8 llvm-8-dev g++ libsqlite3-dev libyaml-dev libgc-dev libssl-dev libcrypto++-dev libevent-dev libgmp-dev zlib1g-dev libpcre++-dev pkg-config libarchive-dev libxml2-dev libacl1-dev nettle-dev liblzo2-dev liblzma-dev libbz2-dev libjpeg-turbo8-dev libpng-dev libtiff-dev
RUN git clone https://github.com/crystal-lang/crystal && cd crystal && git checkout 0.35.1 && make deps && cd ..
RUN git clone https://github.com/kostya/myhtml && cd myhtml/src/ext && git checkout v1.5.0 && make && cd ..
RUN git clone https://github.com/jessedoyle/duktape.cr && cd duktape.cr/ext && git checkout v0.20.0 && make && cd ..
RUN git clone https://github.com/hkalexling/image_size.cr && cd image_size.cr && git checkout v0.2.0 && make && cd ..
RUN git clone https://github.com/crystal-lang/crystal && cd crystal && git checkout 1.0.0 && make deps && cd ..
RUN git clone https://github.com/kostya/myhtml && cd myhtml/src/ext && git checkout v1.5.8 && make && cd ..
RUN git clone https://github.com/jessedoyle/duktape.cr && cd duktape.cr/ext && git checkout v1.0.0 && make && cd ..
RUN git clone https://github.com/hkalexling/image_size.cr && cd image_size.cr && git checkout v0.5.0 && make && cd ..
COPY mango-arm32v7.o .
RUN cc 'mango-arm32v7.o' -o 'mango' -rdynamic -lxml2 -L/image_size.cr/ext/libwebp -lwebp -L/image_size.cr/ext/stbi -lstbi /myhtml/src/ext/modest-c/lib/libmodest_static.a -L/duktape.cr/src/.build/lib -L/duktape.cr/src/.build/include -lduktape -lm `pkg-config libarchive --libs` -lz `command -v pkg-config > /dev/null && pkg-config --libs --silence-errors libssl || printf %s '-lssl -lcrypto'` `command -v pkg-config > /dev/null && pkg-config --libs --silence-errors libcrypto || printf %s '-lcrypto'` -lgmp -lsqlite3 -lyaml -lpcre -lm /usr/lib/arm-linux-gnueabihf/libgc.so -lpthread /crystal/src/ext/libcrystal.a -levent -lrt -ldl -L/usr/bin/../lib/crystal/lib -L/usr/bin/../lib/crystal/lib
RUN cc 'mango-arm32v7.o' -o '/usr/local/bin/mango' -rdynamic -lxml2 -L/image_size.cr/ext/libwebp -lwebp -L/image_size.cr/ext/stbi -lstbi /myhtml/src/ext/modest-c/lib/libmodest_static.a -L/duktape.cr/src/.build/lib -L/duktape.cr/src/.build/include -lduktape -lm `pkg-config libarchive --libs` -lz `command -v pkg-config > /dev/null && pkg-config --libs --silence-errors libssl || printf %s '-lssl -lcrypto'` `command -v pkg-config > /dev/null && pkg-config --libs --silence-errors libcrypto || printf %s '-lcrypto'` -lgmp -lsqlite3 -lyaml -lpcre -lm /usr/lib/arm-linux-gnueabihf/libgc.so -lpthread /crystal/src/ext/libcrystal.a -levent -lrt -ldl -L/usr/bin/../lib/crystal/lib -L/usr/bin/../lib/crystal/lib
CMD ["/usr/local/bin/mango"]
CMD ["./mango"]


@@ -2,13 +2,13 @@ FROM arm64v8/ubuntu:18.04
RUN apt-get update && apt-get install -y wget git make llvm-8 llvm-8-dev g++ libsqlite3-dev libyaml-dev libgc-dev libssl-dev libcrypto++-dev libevent-dev libgmp-dev zlib1g-dev libpcre++-dev pkg-config libarchive-dev libxml2-dev libacl1-dev nettle-dev liblzo2-dev liblzma-dev libbz2-dev libjpeg-turbo8-dev libpng-dev libtiff-dev
RUN git clone https://github.com/crystal-lang/crystal && cd crystal && git checkout 0.35.1 && make deps && cd ..
RUN git clone https://github.com/kostya/myhtml && cd myhtml/src/ext && git checkout v1.5.0 && make && cd ..
RUN git clone https://github.com/jessedoyle/duktape.cr && cd duktape.cr/ext && git checkout v0.20.0 && make && cd ..
RUN git clone https://github.com/hkalexling/image_size.cr && cd image_size.cr && git checkout v0.2.0 && make && cd ..
RUN git clone https://github.com/crystal-lang/crystal && cd crystal && git checkout 1.0.0 && make deps && cd ..
RUN git clone https://github.com/kostya/myhtml && cd myhtml/src/ext && git checkout v1.5.8 && make && cd ..
RUN git clone https://github.com/jessedoyle/duktape.cr && cd duktape.cr/ext && git checkout v1.0.0 && make && cd ..
RUN git clone https://github.com/hkalexling/image_size.cr && cd image_size.cr && git checkout v0.5.0 && make && cd ..
COPY mango-arm64v8.o .
RUN cc 'mango-arm64v8.o' -o 'mango' -rdynamic -lxml2 -L/image_size.cr/ext/libwebp -lwebp -L/image_size.cr/ext/stbi -lstbi /myhtml/src/ext/modest-c/lib/libmodest_static.a -L/duktape.cr/src/.build/lib -L/duktape.cr/src/.build/include -lduktape -lm `pkg-config libarchive --libs` -lz `command -v pkg-config > /dev/null && pkg-config --libs --silence-errors libssl || printf %s '-lssl -lcrypto'` `command -v pkg-config > /dev/null && pkg-config --libs --silence-errors libcrypto || printf %s '-lcrypto'` -lgmp -lsqlite3 -lyaml -lpcre -lm /usr/lib/aarch64-linux-gnu/libgc.so -lpthread /crystal/src/ext/libcrystal.a -levent -lrt -ldl -L/usr/bin/../lib/crystal/lib -L/usr/bin/../lib/crystal/lib
RUN cc 'mango-arm64v8.o' -o '/usr/local/bin/mango' -rdynamic -lxml2 -L/image_size.cr/ext/libwebp -lwebp -L/image_size.cr/ext/stbi -lstbi /myhtml/src/ext/modest-c/lib/libmodest_static.a -L/duktape.cr/src/.build/lib -L/duktape.cr/src/.build/include -lduktape -lm `pkg-config libarchive --libs` -lz `command -v pkg-config > /dev/null && pkg-config --libs --silence-errors libssl || printf %s '-lssl -lcrypto'` `command -v pkg-config > /dev/null && pkg-config --libs --silence-errors libcrypto || printf %s '-lcrypto'` -lgmp -lsqlite3 -lyaml -lpcre -lm /usr/lib/aarch64-linux-gnu/libgc.so -lpthread /crystal/src/ext/libcrystal.a -levent -lrt -ldl -L/usr/bin/../lib/crystal/lib -L/usr/bin/../lib/crystal/lib
CMD ["./mango"]
CMD ["/usr/local/bin/mango"]


@@ -29,7 +29,6 @@ test:
check:
crystal tool format --check
./bin/ameba
./dev/linewidth.sh
arm32v7:
crystal build src/mango.cr --release --progress --error-trace --cross-compile --target='arm-linux-gnueabihf' -o mango-arm32v7


@@ -2,7 +2,7 @@
# Mango
[![Patreon](https://img.shields.io/badge/support-patreon-brightgreen?link=https://www.patreon.com/hkalexling)](https://www.patreon.com/hkalexling) ![Build](https://github.com/hkalexling/Mango/workflows/Build/badge.svg) [![Gitter](https://badges.gitter.im/mango-cr/mango.svg)](https://gitter.im/mango-cr/mango?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge)
[![Patreon](https://img.shields.io/badge/support-patreon-brightgreen?link=https://www.patreon.com/hkalexling)](https://www.patreon.com/hkalexling) ![Build](https://github.com/hkalexling/Mango/workflows/Build/badge.svg) [![Gitter](https://badges.gitter.im/mango-cr/mango.svg)](https://gitter.im/mango-cr/mango?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge) [![Discord](https://img.shields.io/discord/855633663425118228?label=discord)](http://discord.com/invite/ezKtacCp9Q)
Mango is a self-hosted manga server and reader. Its features include
@@ -13,7 +13,6 @@ Mango is a self-hosted manga server and reader. Its features include
- Supports nested folders in library
- Automatically stores reading progress
- Thumbnail generation
- Built-in [MangaDex](https://mangadex.org/) downloader
- Supports [plugins](https://github.com/hkalexling/mango-plugins) to download from thrid-party sites
- The web reader is responsive and works well on mobile, so there is no need for a mobile app
- All the static files are embedded in the binary, so the deployment process is easy and painless
@@ -52,7 +51,7 @@ The official docker images are available on [Dockerhub](https://hub.docker.com/r
### CLI
```
Mango - Manga Server and Web Reader. Version 0.18.1
Mango - Manga Server and Web Reader. Version 0.24.0
Usage:
@@ -75,6 +74,7 @@ The default config file location is `~/.config/mango/config.yml`. It might be di
```yaml
---
host: 0.0.0.0
port: 9000
base_url: /
session_secret: mango-session-secret
@@ -82,17 +82,20 @@ library_path: ~/mango/library
db_path: ~/mango/mango.db
scan_interval_minutes: 5
thumbnail_generation_interval_hours: 24
db_optimization_interval_hours: 24
log_level: info
upload_path: ~/mango/uploads
plugin_path: ~/mango/plugins
download_timeout_seconds: 30
page_margin: 30
library_cache_path: ~/mango/library.yml.gz
cache_enabled: false
cache_size_mbs: 50
cache_log_enabled: true
disable_login: false
default_username: ""
auth_proxy_header_name: ""
mangadex:
base_url: https://mangadex.org
api_url: https://mangadex.org/api
api_url: https://api.mangadex.org/v2
download_wait_seconds: 5
download_retries: 4
download_queue_db_path: ~/mango/queue.db
@@ -103,6 +106,7 @@ mangadex:
- `scan_interval_minutes`, `thumbnail_generation_interval_hours` and `db_optimization_interval_hours` can be any non-negative integer. Setting them to `0` disables the periodic tasks
- `log_level` can be `debug`, `info`, `warn`, `error`, `fatal` or `off`. Setting it to `off` disables the logging
- You can disable authentication by setting `disable_login` to true. Note that `default_username` must be set to an existing username for this to work.
- By setting `cache_enabled` to `true`, you can enable an experimental feature where Mango caches library metadata to improve page load time. You can further fine-tune the feature with `cache_size_mbs` and `cache_log_enabled`.
### Library Structure
@@ -157,5 +161,28 @@ Mobile UI:
## Contributors
Please check the [development guideline](https://github.com/hkalexling/Mango/wiki/Development) if you are interested in code contributions.
<!-- ALL-CONTRIBUTORS-LIST:START - Do not remove or modify this section -->
<!-- prettier-ignore-start -->
<!-- markdownlint-disable -->
<table>
<tr>
<td align="center"><a href="https://github.com/hkalexling/"><img src="https://avatars1.githubusercontent.com/u/7845831?v=4?s=100" width="100px;" alt=""/><br /><sub><b>Alex Ling</b></sub></a><br /><a href="https://github.com/hkalexling/Mango/commits?author=hkalexling" title="Code">💻</a> <a href="https://github.com/hkalexling/Mango/commits?author=hkalexling" title="Documentation">📖</a> <a href="#infra-hkalexling" title="Infrastructure (Hosting, Build-Tools, etc)">🚇</a></td>
<td align="center"><a href="https://github.com/jaredlt"><img src="https://avatars1.githubusercontent.com/u/8590311?v=4?s=100" width="100px;" alt=""/><br /><sub><b>jaredlt</b></sub></a><br /><a href="https://github.com/hkalexling/Mango/commits?author=jaredlt" title="Code">💻</a> <a href="#ideas-jaredlt" title="Ideas, Planning, & Feedback">🤔</a> <a href="#design-jaredlt" title="Design">🎨</a></td>
<td align="center"><a href="https://windisco.com/"><img src="https://avatars1.githubusercontent.com/u/4946624?v=4?s=100" width="100px;" alt=""/><br /><sub><b>ココロ</b></sub></a><br /><a href="#infra-shincurry" title="Infrastructure (Hosting, Build-Tools, etc)">🚇</a></td>
<td align="center"><a href="https://catgirlsin.space/"><img src="https://avatars0.githubusercontent.com/u/13433513?v=4?s=100" width="100px;" alt=""/><br /><sub><b>Valentijn</b></sub></a><br /><a href="#infra-noirscape" title="Infrastructure (Hosting, Build-Tools, etc)">🚇</a></td>
<td align="center"><a href="https://github.com/flying-sausages"><img src="https://avatars1.githubusercontent.com/u/23618693?v=4?s=100" width="100px;" alt=""/><br /><sub><b>flying-sausages</b></sub></a><br /><a href="https://github.com/hkalexling/Mango/commits?author=flying-sausages" title="Documentation">📖</a> <a href="#ideas-flying-sausages" title="Ideas, Planning, & Feedback">🤔</a></td>
<td align="center"><a href="https://github.com/XavierSchiller"><img src="https://avatars1.githubusercontent.com/u/22575255?v=4?s=100" width="100px;" alt=""/><br /><sub><b>Xavier</b></sub></a><br /><a href="#infra-XavierSchiller" title="Infrastructure (Hosting, Build-Tools, etc)">🚇</a></td>
<td align="center"><a href="https://github.com/WROIATE"><img src="https://avatars3.githubusercontent.com/u/44677306?v=4?s=100" width="100px;" alt=""/><br /><sub><b>Jarao</b></sub></a><br /><a href="#infra-WROIATE" title="Infrastructure (Hosting, Build-Tools, etc)">🚇</a></td>
</tr>
<tr>
<td align="center"><a href="https://github.com/Leeingnyo"><img src="https://avatars0.githubusercontent.com/u/6760150?v=4?s=100" width="100px;" alt=""/><br /><sub><b>이인용</b></sub></a><br /><a href="https://github.com/hkalexling/Mango/commits?author=Leeingnyo" title="Code">💻</a></td>
<td align="center"><a href="http://h45h74x.eu.org"><img src="https://avatars1.githubusercontent.com/u/27204033?v=4?s=100" width="100px;" alt=""/><br /><sub><b>Simon</b></sub></a><br /><a href="https://github.com/hkalexling/Mango/commits?author=h45h74x" title="Code">💻</a></td>
<td align="center"><a href="https://github.com/davidkna"><img src="https://avatars.githubusercontent.com/u/835177?v=4?s=100" width="100px;" alt=""/><br /><sub><b>David Knaack</b></sub></a><br /><a href="#infra-davidkna" title="Infrastructure (Hosting, Build-Tools, etc)">🚇</a></td>
<td align="center"><a href="https://lncn.dev"><img src="https://avatars.githubusercontent.com/u/41193328?v=4?s=100" width="100px;" alt=""/><br /><sub><b>i use arch btw</b></sub></a><br /><a href="#infra-lincolnthedev" title="Infrastructure (Hosting, Build-Tools, etc)">🚇</a></td>
</tr>
</table>
[![](https://sourcerer.io/fame/hkalexling/hkalexling/Mango/images/0)](https://sourcerer.io/fame/hkalexling/hkalexling/Mango/links/0)[![](https://sourcerer.io/fame/hkalexling/hkalexling/Mango/images/1)](https://sourcerer.io/fame/hkalexling/hkalexling/Mango/links/1)[![](https://sourcerer.io/fame/hkalexling/hkalexling/Mango/images/2)](https://sourcerer.io/fame/hkalexling/hkalexling/Mango/links/2)[![](https://sourcerer.io/fame/hkalexling/hkalexling/Mango/images/3)](https://sourcerer.io/fame/hkalexling/hkalexling/Mango/links/3)[![](https://sourcerer.io/fame/hkalexling/hkalexling/Mango/images/4)](https://sourcerer.io/fame/hkalexling/hkalexling/Mango/links/4)[![](https://sourcerer.io/fame/hkalexling/hkalexling/Mango/images/5)](https://sourcerer.io/fame/hkalexling/hkalexling/Mango/links/5)[![](https://sourcerer.io/fame/hkalexling/hkalexling/Mango/images/6)](https://sourcerer.io/fame/hkalexling/hkalexling/Mango/links/6)[![](https://sourcerer.io/fame/hkalexling/hkalexling/Mango/images/7)](https://sourcerer.io/fame/hkalexling/hkalexling/Mango/links/7)
<!-- markdownlint-restore -->
<!-- prettier-ignore-end -->
<!-- ALL-CONTRIBUTORS-LIST:END -->


@@ -1,5 +0,0 @@
#!/bin/sh
[ ! -z "$(grep '.\{80\}' --exclude-dir=lib --include="*.cr" -nr --color=always . | grep -v "routes/api.cr" | tee /dev/tty)" ] \
&& echo "The above lines exceed the 80 characters limit" \
|| exit 0


@@ -4,26 +4,25 @@ const minify = require('gulp-babel-minify');
const minifyCss = require('gulp-minify-css');
const less = require('gulp-less');
// Copy libraries from node_moduels to public/js
gulp.task('copy-js', () => {
return gulp.src([
'node_modules/@fortawesome/fontawesome-free/js/fontawesome.min.js',
'node_modules/@fortawesome/fontawesome-free/js/solid.min.js',
'node_modules/uikit/dist/js/uikit.min.js',
'node_modules/uikit/dist/js/uikit-icons.min.js'
])
.pipe(gulp.dest('public/js'));
});
// Copy UIKit SVG icons to public/img
gulp.task('copy-uikit-icons', () => {
gulp.task('copy-img', () => {
return gulp.src('node_modules/uikit/src/images/backgrounds/*.svg')
.pipe(gulp.dest('public/img'));
});
gulp.task('copy-font', () => {
return gulp.src('node_modules/@fortawesome/fontawesome-free/webfonts/fa-solid-900.woff**')
.pipe(gulp.dest('public/webfonts'));
});
// Copy files from node_modules
gulp.task('node-modules-copy', gulp.parallel('copy-img', 'copy-font'));
// Compile less
gulp.task('less', () => {
return gulp.src('public/css/*.less')
return gulp.src([
'public/css/mango.less',
'public/css/tags.less'
])
.pipe(less())
.pipe(gulp.dest('public/css'));
});
@@ -54,14 +53,19 @@ gulp.task('minify-css', () => {
// Copy static files (includeing images) to dist
gulp.task('copy-files', () => {
return gulp.src(['public/img/*', 'public/*.*', 'public/js/*.min.js'], {
return gulp.src([
'public/*.*',
'public/img/*',
'public/webfonts/*',
'public/js/*.min.js'
], {
base: 'public'
})
.pipe(gulp.dest('dist'));
});
// Set up the public folder for development
gulp.task('dev', gulp.parallel('copy-js', 'copy-uikit-icons', 'less'));
gulp.task('dev', gulp.parallel('node-modules-copy', 'less'));
// Set up the dist folder for deployment
gulp.task('deploy', gulp.parallel('babel', 'minify-css', 'copy-files'));


@@ -0,0 +1,85 @@
class ForeignKeys < MG::Base
def up : String
<<-SQL
-- add foreign key to tags
ALTER TABLE tags RENAME TO tmp;
CREATE TABLE tags (
id TEXT NOT NULL,
tag TEXT NOT NULL,
UNIQUE (id, tag),
FOREIGN KEY (id) REFERENCES titles (id)
ON UPDATE CASCADE
ON DELETE CASCADE
);
INSERT INTO tags
SELECT * FROM tmp;
DROP TABLE tmp;
CREATE INDEX tags_id_idx ON tags (id);
CREATE INDEX tags_tag_idx ON tags (tag);
-- add foreign key to thumbnails
ALTER TABLE thumbnails RENAME TO tmp;
CREATE TABLE thumbnails (
id TEXT NOT NULL,
data BLOB NOT NULL,
filename TEXT NOT NULL,
mime TEXT NOT NULL,
size INTEGER NOT NULL,
FOREIGN KEY (id) REFERENCES ids (id)
ON UPDATE CASCADE
ON DELETE CASCADE
);
INSERT INTO thumbnails
SELECT * FROM tmp;
DROP TABLE tmp;
CREATE UNIQUE INDEX tn_index ON thumbnails (id);
SQL
end
def down : String
<<-SQL
-- remove foreign key from thumbnails
ALTER TABLE thumbnails RENAME TO tmp;
CREATE TABLE thumbnails (
id TEXT NOT NULL,
data BLOB NOT NULL,
filename TEXT NOT NULL,
mime TEXT NOT NULL,
size INTEGER NOT NULL
);
INSERT INTO thumbnails
SELECT * FROM tmp;
DROP TABLE tmp;
CREATE UNIQUE INDEX tn_index ON thumbnails (id);
-- remove foreign key from tags
ALTER TABLE tags RENAME TO tmp;
CREATE TABLE tags (
id TEXT NOT NULL,
tag TEXT NOT NULL,
UNIQUE (id, tag)
);
INSERT INTO tags
SELECT * FROM tmp;
DROP TABLE tmp;
CREATE INDEX tags_id_idx ON tags (id);
CREATE INDEX tags_tag_idx ON tags (tag);
SQL
end
end

migration/ids.2.cr (new file, +19)

@@ -0,0 +1,19 @@
class CreateIds < MG::Base
def up : String
<<-SQL
CREATE TABLE IF NOT EXISTS ids (
path TEXT NOT NULL,
id TEXT NOT NULL,
is_title INTEGER NOT NULL
);
CREATE UNIQUE INDEX IF NOT EXISTS path_idx ON ids (path);
CREATE UNIQUE INDEX IF NOT EXISTS id_idx ON ids (id);
SQL
end
def down : String
<<-SQL
DROP TABLE ids;
SQL
end
end


@@ -0,0 +1,50 @@
class IDSignature < MG::Base
def up : String
<<-SQL
ALTER TABLE ids ADD COLUMN signature TEXT;
SQL
end
def down : String
<<-SQL
-- remove signature column from ids
ALTER TABLE ids RENAME TO tmp;
CREATE TABLE ids (
path TEXT NOT NULL,
id TEXT NOT NULL
);
INSERT INTO ids
SELECT path, id
FROM tmp;
DROP TABLE tmp;
-- recreate the indices
CREATE UNIQUE INDEX path_idx ON ids (path);
CREATE UNIQUE INDEX id_idx ON ids (id);
-- recreate the foreign key constraint on thumbnails
ALTER TABLE thumbnails RENAME TO tmp;
CREATE TABLE thumbnails (
id TEXT NOT NULL,
data BLOB NOT NULL,
filename TEXT NOT NULL,
mime TEXT NOT NULL,
size INTEGER NOT NULL,
FOREIGN KEY (id) REFERENCES ids (id)
ON UPDATE CASCADE
ON DELETE CASCADE
);
INSERT INTO thumbnails
SELECT * FROM tmp;
DROP TABLE tmp;
CREATE UNIQUE INDEX tn_index ON thumbnails (id);
SQL
end
end


@@ -0,0 +1,20 @@
class CreateMangaDexAccount < MG::Base
def up : String
<<-SQL
CREATE TABLE md_account (
username TEXT NOT NULL PRIMARY KEY,
token TEXT NOT NULL,
expire INTEGER NOT NULL,
FOREIGN KEY (username) REFERENCES users (username)
ON UPDATE CASCADE
ON DELETE CASCADE
);
SQL
end
def down : String
<<-SQL
DROP TABLE md_account;
SQL
end
end


@@ -0,0 +1,33 @@
class RelativePath < MG::Base
def up : String
base = Config.current.library_path
# Escape single quotes in case the path contains them, and remove the
# trailing slash (this is a mistake, fixed in DB version 10)
base = base.gsub("'", "''").rstrip "/"
<<-SQL
-- update the path column in ids to relative paths
UPDATE ids
SET path = REPLACE(path, '#{base}', '');
-- update the path column in titles to relative paths
UPDATE titles
SET path = REPLACE(path, '#{base}', '');
SQL
end
def down : String
base = Config.current.library_path
base = base.gsub("'", "''").rstrip "/"
<<-SQL
-- update the path column in ids to absolute paths
UPDATE ids
SET path = '#{base}' || path;
-- update the path column in titles to absolute paths
UPDATE titles
SET path = '#{base}' || path;
SQL
end
end


@@ -0,0 +1,31 @@
# In DB version 8, we replaced the absolute paths in DB with relative paths,
# but we mistakenly left the starting slashes. This migration removes them.
class RelativePathFix < MG::Base
def up : String
<<-SQL
-- remove leading slashes from the paths in ids
UPDATE ids
SET path = SUBSTR(path, 2, LENGTH(path) - 1)
WHERE path LIKE '/%';
-- remove leading slashes from the paths in titles
UPDATE titles
SET path = SUBSTR(path, 2, LENGTH(path) - 1)
WHERE path LIKE '/%';
SQL
end
def down : String
<<-SQL
-- add leading slashes to paths in ids
UPDATE ids
SET path = '/' || path
WHERE path NOT LIKE '/%';
-- add leading slashes to paths in titles
UPDATE titles
SET path = '/' || path
WHERE path NOT LIKE '/%';
SQL
end
end

migration/tags.4.cr (new file, +19)

@@ -0,0 +1,19 @@
class CreateTags < MG::Base
def up : String
<<-SQL
CREATE TABLE IF NOT EXISTS tags (
id TEXT NOT NULL,
tag TEXT NOT NULL,
UNIQUE (id, tag)
);
CREATE INDEX IF NOT EXISTS tags_id_idx ON tags (id);
CREATE INDEX IF NOT EXISTS tags_tag_idx ON tags (tag);
SQL
end
def down : String
<<-SQL
DROP TABLE tags;
SQL
end
end

migration/thumbnails.3.cr (new file, +20)

@@ -0,0 +1,20 @@
class CreateThumbnails < MG::Base
def up : String
<<-SQL
CREATE TABLE IF NOT EXISTS thumbnails (
id TEXT NOT NULL,
data BLOB NOT NULL,
filename TEXT NOT NULL,
mime TEXT NOT NULL,
size INTEGER NOT NULL
);
CREATE UNIQUE INDEX IF NOT EXISTS tn_index ON thumbnails (id);
SQL
end
def down : String
<<-SQL
DROP TABLE thumbnails;
SQL
end
end

migration/titles.5.cr (new file, +56)

@@ -0,0 +1,56 @@
class CreateTitles < MG::Base
def up : String
<<-SQL
-- create titles
CREATE TABLE titles (
id TEXT NOT NULL,
path TEXT NOT NULL,
signature TEXT
);
CREATE UNIQUE INDEX titles_id_idx on titles (id);
CREATE UNIQUE INDEX titles_path_idx on titles (path);
-- migrate data from ids to titles
INSERT INTO titles
SELECT id, path, null
FROM ids
WHERE is_title = 1;
DELETE FROM ids
WHERE is_title = 1;
-- remove the is_title column from ids
ALTER TABLE ids RENAME TO tmp;
CREATE TABLE ids (
path TEXT NOT NULL,
id TEXT NOT NULL
);
INSERT INTO ids
SELECT path, id
FROM tmp;
DROP TABLE tmp;
-- recreate the indices
CREATE UNIQUE INDEX path_idx ON ids (path);
CREATE UNIQUE INDEX id_idx ON ids (id);
SQL
end
def down : String
<<-SQL
-- insert the is_title column
ALTER TABLE ids ADD COLUMN is_title INTEGER NOT NULL DEFAULT 0;
-- migrate data from titles to ids
INSERT INTO ids
SELECT path, id, 1
FROM titles;
-- remove titles
DROP TABLE titles;
SQL
end
end


@@ -0,0 +1,94 @@
class UnavailableIDs < MG::Base
def up : String
<<-SQL
-- add unavailable column to ids
ALTER TABLE ids ADD COLUMN unavailable INTEGER NOT NULL DEFAULT 0;
-- add unavailable column to titles
ALTER TABLE titles ADD COLUMN unavailable INTEGER NOT NULL DEFAULT 0;
SQL
end
def down : String
<<-SQL
-- remove unavailable column from ids
ALTER TABLE ids RENAME TO tmp;
CREATE TABLE ids (
path TEXT NOT NULL,
id TEXT NOT NULL,
signature TEXT
);
INSERT INTO ids
SELECT path, id, signature
FROM tmp;
DROP TABLE tmp;
-- recreate the indices
CREATE UNIQUE INDEX path_idx ON ids (path);
CREATE UNIQUE INDEX id_idx ON ids (id);
-- recreate the foreign key constraint on thumbnails
ALTER TABLE thumbnails RENAME TO tmp;
CREATE TABLE thumbnails (
id TEXT NOT NULL,
data BLOB NOT NULL,
filename TEXT NOT NULL,
mime TEXT NOT NULL,
size INTEGER NOT NULL,
FOREIGN KEY (id) REFERENCES ids (id)
ON UPDATE CASCADE
ON DELETE CASCADE
);
INSERT INTO thumbnails
SELECT * FROM tmp;
DROP TABLE tmp;
CREATE UNIQUE INDEX tn_index ON thumbnails (id);
-- remove unavailable column from titles
ALTER TABLE titles RENAME TO tmp;
CREATE TABLE titles (
id TEXT NOT NULL,
path TEXT NOT NULL,
signature TEXT
);
INSERT INTO titles
SELECT path, id, signature
FROM tmp;
DROP TABLE tmp;
-- recreate the indices
CREATE UNIQUE INDEX titles_id_idx on titles (id);
CREATE UNIQUE INDEX titles_path_idx on titles (path);
-- recreate the foreign key constraint on tags
ALTER TABLE tags RENAME TO tmp;
CREATE TABLE tags (
id TEXT NOT NULL,
tag TEXT NOT NULL,
UNIQUE (id, tag),
FOREIGN KEY (id) REFERENCES titles (id)
ON UPDATE CASCADE
ON DELETE CASCADE
);
INSERT INTO tags
SELECT * FROM tmp;
DROP TABLE tmp;
CREATE INDEX tags_id_idx ON tags (id);
CREATE INDEX tags_tag_idx ON tags (tag);
SQL
end
end

migration/users.1.cr (new file, +20)

@@ -0,0 +1,20 @@
class CreateUsers < MG::Base
def up : String
<<-SQL
CREATE TABLE IF NOT EXISTS users (
username TEXT NOT NULL,
password TEXT NOT NULL,
token TEXT,
admin INTEGER NOT NULL
);
CREATE UNIQUE INDEX IF NOT EXISTS username_idx ON users (username);
CREATE UNIQUE INDEX IF NOT EXISTS token_idx ON users (token);
SQL
end
def down : String
<<-SQL
DROP TABLE users;
SQL
end
end


@@ -7,6 +7,7 @@
"license": "MIT",
"devDependencies": {
"@babel/preset-env": "^7.11.5",
"all-contributors-cli": "^6.19.0",
"gulp": "^4.0.2",
"gulp-babel": "^8.0.0",
"gulp-babel-minify": "^0.5.1",


@@ -1,146 +0,0 @@
.uk-alert-close {
color: black !important;
}
.uk-card-body {
padding: 20px;
}
.uk-card-media-top {
width: 100%;
height: 250px;
}
@media (min-width: 600px) {
.uk-card-media-top {
height: 300px;
}
}
.uk-card-media-top>img {
height: 100%;
width: 100%;
object-fit: cover;
}
.uk-card-title {
max-height: 3em;
}
.acard:hover {
cursor: pointer;
}
.reader-bg {
background-color: black;
}
.break-word {
word-wrap: break-word;
}
.uk-logo>img {
height: 90px;
width: 90px;
}
.uk-search {
width: 100%;
}
#selectable .ui-selecting {
background: #EEE6B9;
}
#selectable .ui-selected {
background: #F4E487;
}
.uk-light #selectable .ui-selecting {
background: #5E5731;
}
.uk-light #selectable .ui-selected {
background: #9D9252;
}
td>.uk-dropdown {
white-space: pre-line;
}
#edit-modal .uk-grid>div {
height: 300px;
}
#edit-modal #cover {
height: 100%;
width: 100%;
object-fit: cover;
}
#edit-modal #cover-upload {
height: 100%;
box-sizing: border-box;
}
#edit-modal .uk-modal-body .uk-inline {
width: 100%;
}
.item .uk-card-title {
font-size: 1rem;
}
.grayscale {
filter: grayscale(100%);
}
.uk-light .uk-navbar-dropdown,
.uk-light .uk-modal-header,
.uk-light .uk-modal-body,
.uk-light .uk-modal-footer {
background: #222;
}
.uk-light .uk-dropdown {
background: #333;
}
.uk-light .uk-navbar-dropdown,
.uk-light .uk-dropdown {
color: #ccc;
}
.uk-light .uk-nav-header,
.uk-light .uk-description-list>dt {
color: #555;
}
[x-cloak] {
display: none;
}
#select-bar-controls a {
transform: scale(1.5, 1.5);
}
#select-bar-controls a:hover {
color: orange;
}
#main-section {
position: relative;
}
#totop-wrapper {
position: absolute;
top: 100vh;
right: 2em;
bottom: 0;
}
#totop-wrapper a {
position: fixed;
position: sticky;
top: calc(100vh - 5em);
}

public/css/mango.less (new file, +139)

@@ -0,0 +1,139 @@
// UIKit
@import "./uikit.less";
// FontAwesome
@import "../../node_modules/@fortawesome/fontawesome-free/less/fontawesome.less";
@import "../../node_modules/@fortawesome/fontawesome-free/less/solid.less";
@font-face {
src: url('@{fa-font-path}/fa-solid-900.woff2');
src: url('@{fa-font-path}/fa-solid-900.woff2') format('woff2'),
url('@{fa-font-path}/fa-solid-900.woff') format('woff');
}
// Item cards
.item .uk-card {
cursor: pointer;
.uk-card-media-top {
width: 100%;
height: 250px;
@media (min-width: 600px) {
height: 300px;
}
img {
height: 100%;
width: 100%;
object-fit: cover;
&.grayscale {
filter: grayscale(100%);
}
}
}
.uk-card-body {
padding: 20px;
.uk-card-title {
font-size: 1rem;
}
.uk-card-title:not(.free-height) {
max-height: 3em;
}
}
}
// jQuery selectable
#selectable {
.ui-selecting {
background: #EEE6B9;
}
.ui-selected {
background: #F4E487;
}
.uk-light & {
.ui-selecting {
background: #5E5731;
}
.ui-selected {
background: #9D9252;
}
}
}
// Edit modal
#edit-modal {
.uk-grid > div {
height: 300px;
}
#cover {
height: 100%;
width: 100%;
object-fit: cover;
}
#cover-upload {
height: 100%;
box-sizing: border-box;
}
.uk-modal-body .uk-inline {
width: 100%;
}
}
// Dark theme
.uk-light {
.uk-modal-header,
.uk-modal-body,
.uk-modal-footer {
background: #222;
}
.uk-navbar-dropdown,
.uk-dropdown {
color: #ccc;
background: #333;
}
.uk-nav-header,
.uk-description-list > dt {
color: #555;
}
}
// Alpine magic
[x-cloak] {
display: none;
}
// Batch select bar on title page
#select-bar-controls {
a {
transform: scale(1.5, 1.5);
&:hover {
color: orange;
}
}
}
// Totop button
#totop-wrapper {
position: absolute;
top: 100vh;
right: 2em;
bottom: 0;
a {
position: fixed;
position: sticky;
top: calc(100vh - 5em);
}
}
// Misc
.uk-alert-close {
color: black !important;
}
.break-word {
word-wrap: break-word;
}
.uk-search {
width: 100%;
}

public/css/tags.less (new file, +58)

@@ -0,0 +1,58 @@
@light-gray: #e5e5e5;
@gray: #666666;
@black: #141414;
@blue: rgb(30, 135, 240);
@white1: rgba(255, 255, 255, .1);
@white2: rgba(255, 255, 255, .2);
@white7: rgba(255, 255, 255, .7);
.select2-container--default {
.select2-selection--multiple {
border: 1px solid @light-gray;
.select2-selection__choice,
.select2-selection__choice__remove,
.select2-selection__choice__remove:hover
{
background-color: @blue;
color: white;
border: none;
border-radius: 2px;
}
}
.select2-dropdown {
.select2-results__option--highlighted.select2-results__option--selectable {
background-color: @blue;
}
.select2-results__option--selected:not(.select2-results__option--highlighted) {
background-color: @light-gray
}
}
}
.uk-light {
.select2-container--default {
.select2-selection {
background-color: @white1;
}
.select2-selection--multiple {
border: 1px solid @white2;
.select2-selection__choice,
.select2-selection__choice__remove,
.select2-selection__choice__remove:hover
{
background-color: white;
color: @gray;
border: none;
}
.select2-search__field {
color: @white7;
}
}
}
.select2-dropdown {
background-color: @black;
.select2-results__option--selected:not(.select2-results__option--highlighted) {
background-color: @white2;
}
}
}

View File

@@ -43,3 +43,22 @@
@internal-list-bullet-image: "../img/list-bullet.svg";
@internal-accordion-open-image: "../img/accordion-open.svg";
@internal-accordion-close-image: "../img/accordion-close.svg";
.hook-card-default() {
.uk-light & {
background: @card-secondary-background;
color: @card-secondary-color;
}
}
.hook-card-default-title() {
.uk-light & {
color: @card-secondary-title-color;
}
}
.hook-card-default-hover() {
.uk-light & {
background-color: @card-secondary-hover-background;
}
}

View File

@@ -117,14 +117,10 @@ const setTheme = (theme) => {
if (theme === 'dark') {
$('html').css('background', 'rgb(20, 20, 20)');
$('body').addClass('uk-light');
$('.uk-card').addClass('uk-card-secondary');
$('.uk-card').removeClass('uk-card-default');
$('.ui-widget-content').addClass('dark');
} else {
$('html').css('background', '');
$('body').removeClass('uk-light');
$('.uk-card').removeClass('uk-card-secondary');
$('.uk-card').addClass('uk-card-default');
$('.ui-widget-content').removeClass('dark');
}
};

View File

@@ -4,21 +4,30 @@ const component = () => {
paused: undefined,
loading: false,
toggling: false,
ws: undefined,
init() {
const ws = new WebSocket(`ws://${location.host}/api/admin/mangadex/queue`);
ws.onmessage = event => {
wsConnect(secure = true) {
const url = `${secure ? 'wss' : 'ws'}://${location.host}${base_url}api/admin/mangadex/queue`;
console.log(`Connecting to ${url}`);
this.ws = new WebSocket(url);
this.ws.onmessage = event => {
const data = JSON.parse(event.data);
this.jobs = data.jobs;
this.paused = data.paused;
};
ws.onerror = err => {
alert('danger', `Socket connection failed. Error: ${err}`);
this.ws.onclose = () => {
if (this.ws.failed)
return this.wsConnect(false);
alert('danger', 'Socket connection closed');
};
ws.onclose = err => {
this.ws.onerror = () => {
if (secure)
return this.ws.failed = true;
alert('danger', 'Socket connection failed');
};
},
init() {
this.wsConnect();
this.load();
},
load() {

View File

@@ -1,305 +0,0 @@
$(() => {
$('#search-input').keypress(event => {
if (event.which === 13) {
search();
}
});
$('.filter-field').each((i, ele) => {
$(ele).change(() => {
buildTable();
});
});
});
const selectAll = () => {
$('tbody > tr').each((i, e) => {
$(e).addClass('ui-selected');
});
};
const unselect = () => {
$('tbody > tr').each((i, e) => {
$(e).removeClass('ui-selected');
});
};
const download = () => {
const selected = $('tbody > tr.ui-selected');
if (selected.length === 0) return;
UIkit.modal.confirm(`Download ${selected.length} selected chapters?`).then(() => {
$('#download-btn').attr('hidden', '');
$('#download-spinner').removeAttr('hidden');
const ids = selected.map((i, e) => {
return $(e).find('td').first().text();
}).get();
const chapters = globalChapters.filter(c => ids.indexOf(c.id) >= 0);
console.log(ids);
$.ajax({
type: 'POST',
url: base_url + 'api/admin/mangadex/download',
data: JSON.stringify({
chapters: chapters
}),
contentType: "application/json",
dataType: 'json'
})
.done(data => {
console.log(data);
if (data.error) {
alert('danger', `Failed to add chapters to the download queue. Error: ${data.error}`);
return;
}
const successCount = parseInt(data.success);
const failCount = parseInt(data.fail);
UIkit.modal.confirm(`${successCount} of ${successCount + failCount} chapters added to the download queue. Proceed to the download manager?`).then(() => {
window.location.href = base_url + 'admin/downloads';
});
})
.fail((jqXHR, status) => {
alert('danger', `Failed to add chapters to the download queue. Error: [${jqXHR.status}] ${jqXHR.statusText}`);
})
.always(() => {
$('#download-spinner').attr('hidden', '');
$('#download-btn').removeAttr('hidden');
});
});
};
const toggleSpinner = () => {
var attr = $('#spinner').attr('hidden');
if (attr) {
$('#spinner').removeAttr('hidden');
$('#search-btn').attr('hidden', '');
} else {
$('#search-btn').removeAttr('hidden');
$('#spinner').attr('hidden', '');
}
searching = !searching;
};
var searching = false;
var globalChapters;
const search = () => {
if (searching) {
return;
}
$('#manga-details').attr('hidden', '');
$('#filter-form').attr('hidden', '');
$('table').attr('hidden', '');
$('#selection-controls').attr('hidden', '');
$('#filter-notification').attr('hidden', '');
toggleSpinner();
const input = $('input').val();
if (input === "") {
toggleSpinner();
return;
}
var int_id = -1;
try {
const path = new URL(input).pathname;
const match = /\/(?:title|manga)\/([0-9]+)/.exec(path);
int_id = parseInt(match[1]);
} catch (e) {
int_id = parseInt(input);
}
if (int_id <= 0 || isNaN(int_id)) {
alert('danger', 'Please make sure you are using a valid manga ID or manga URL from Mangadex.');
toggleSpinner();
return;
}
$.getJSON(`${base_url}api/admin/mangadex/manga/${int_id}`)
.done((data) => {
if (data.error) {
alert('danger', 'Failed to get manga info. Error: ' + data.error);
return;
}
const cover = baseURL + data.cover_url;
$('#cover').attr("src", cover);
$('#title').text("Title: " + data.title);
$('#artist').text("Artist: " + data.artist);
$('#author').text("Author: " + data.author);
$('#manga-details').removeAttr('hidden');
console.log(data.chapters);
globalChapters = data.chapters;
let langs = new Set();
let group_names = new Set();
data.chapters.forEach(chp => {
Object.entries(chp.groups).forEach(([k, v]) => {
group_names.add(k);
});
langs.add(chp.language);
});
const comp = (a, b) => {
var ai;
var bi;
try {
ai = parseFloat(a);
} catch (e) {}
try {
bi = parseFloat(b);
} catch (e) {}
if (typeof ai === 'undefined') return -1;
if (typeof bi === 'undefined') return 1;
if (ai < bi) return 1;
if (ai > bi) return -1;
return 0;
};
langs = [...langs].sort();
group_names = [...group_names].sort();
langs.unshift('All');
group_names.unshift('All');
$('select#lang-select').append(langs.map(e => `<option>${e}</option>`).join(''));
$('select#group-select').append(group_names.map(e => `<option>${e}</option>`).join(''));
$('#filter-form').removeAttr('hidden');
buildTable();
})
.fail((jqXHR, status) => {
alert('danger', `Failed to get manga info. Error: [${jqXHR.status}] ${jqXHR.statusText}`);
})
.always(() => {
toggleSpinner();
});
};
const parseRange = str => {
const regex = /^[\t ]*(?:(?:(<|<=|>|>=)[\t ]*([0-9]+))|(?:([0-9]+))|(?:([0-9]+)[\t ]*-[\t ]*([0-9]+))|(?:[\t ]*))[\t ]*$/m;
const matches = str.match(regex);
var num;
if (!matches) {
alert('danger', `Failed to parse filter input ${str}`);
return [null, null];
} else if (typeof matches[1] !== 'undefined' && typeof matches[2] !== 'undefined') {
// e.g., <= 30
num = parseInt(matches[2]);
if (isNaN(num)) {
alert('danger', `Failed to parse filter input ${str}`);
return [null, null];
}
switch (matches[1]) {
case '<':
return [null, num - 1];
case '<=':
return [null, num];
case '>':
return [num + 1, null];
case '>=':
return [num, null];
}
} else if (typeof matches[3] !== 'undefined') {
// a single number
num = parseInt(matches[3]);
if (isNaN(num)) {
alert('danger', `Failed to parse filter input ${str}`);
return [null, null];
}
return [num, num];
} else if (typeof matches[4] !== 'undefined' && typeof matches[5] !== 'undefined') {
// e.g., 10 - 23
num = parseInt(matches[4]);
const n2 = parseInt(matches[5]);
if (isNaN(num) || isNaN(n2) || num > n2) {
alert('danger', `Failed to parse filter input ${str}`);
return [null, null];
}
return [num, n2];
} else {
// empty or space only
return [null, null];
}
};
const getFilters = () => {
const filters = {};
$('.uk-select').each((i, ele) => {
const id = $(ele).attr('id');
const by = id.split('-')[0];
const choice = $(ele).val();
filters[by] = choice;
});
filters.volume = parseRange($('#volume-range').val());
filters.chapter = parseRange($('#chapter-range').val());
return filters;
};
const buildTable = () => {
$('table').attr('hidden', '');
$('#selection-controls').attr('hidden', '');
$('#filter-notification').attr('hidden', '');
console.log('rebuilding table');
const filters = getFilters();
console.log('filters:', filters);
var chapters = globalChapters.slice();
Object.entries(filters).forEach(([k, v]) => {
if (v === 'All') return;
if (k === 'group') {
chapters = chapters.filter(c => {
unescaped_groups = Object.entries(c.groups).map(([g, id]) => unescapeHTML(g));
return unescaped_groups.indexOf(v) >= 0;
});
return;
}
if (k === 'lang') {
chapters = chapters.filter(c => c.language === v);
return;
}
const lb = parseFloat(v[0]);
const ub = parseFloat(v[1]);
if (isNaN(lb) && isNaN(ub)) return;
chapters = chapters.filter(c => {
const val = parseFloat(c[k]);
if (isNaN(val)) return false;
if (isNaN(lb))
return val <= ub;
else if (isNaN(ub))
return val >= lb;
else
return val >= lb && val <= ub;
});
});
console.log('filtered chapters:', chapters);
$('#count-text').text(`${chapters.length} chapters found`);
const chaptersLimit = 1000;
if (chapters.length > chaptersLimit) {
$('#filter-notification').text(`Mango can only list ${chaptersLimit} chapters, but we found ${chapters.length} chapters in this manga. Please use the filter options above to narrow down your search.`);
$('#filter-notification').removeAttr('hidden');
return;
}
const inner = chapters.map(chp => {
const group_str = Object.entries(chp.groups).map(([k, v]) => {
return `<a href="${baseURL }/group/${v}">${k}</a>`;
}).join(' | ');
return `<tr class="ui-widget-content">
<td><a href="${baseURL}/chapter/${chp.id}">${chp.id}</a></td>
<td>${chp.title}</td>
<td>${chp.language}</td>
<td>${group_str}</td>
<td>${chp.volume}</td>
<td>${chp.chapter}</td>
<td>${moment.unix(chp.time).fromNow()}</td>
</tr>`;
}).join('');
const tbody = `<tbody id="selectable">${inner}</tbody>`;
$('tbody').remove();
$('table').append(tbody);
$('table').removeAttr('hidden');
$("#selectable").selectable({
filter: 'tr'
});
$('#selection-controls').removeAttr('hidden');
};
const unescapeHTML = (str) => {
var elt = document.createElement("span");
elt.innerHTML = str;
return elt.innerText;
};

View File

@@ -0,0 +1,60 @@
const component = () => {
return {
empty: true,
titles: [],
entries: [],
loading: true,
load() {
this.loading = true;
this.request('GET', `${base_url}api/admin/titles/missing`, data => {
this.titles = data.titles;
this.request('GET', `${base_url}api/admin/entries/missing`, data => {
this.entries = data.entries;
this.loading = false;
this.empty = this.entries.length === 0 && this.titles.length === 0;
});
});
},
rm(event) {
const rawID = event.currentTarget.closest('tr').id;
const [type, id] = rawID.split('-');
const url = `${base_url}api/admin/${type === 'title' ? 'titles' : 'entries'}/missing/${id}`;
this.request('DELETE', url, () => {
this.load();
});
},
rmAll() {
UIkit.modal.confirm('Are you sure? All metadata associated with these items, including their tags and thumbnails, will be deleted from the database.', {
labels: {
ok: 'Yes, delete them',
cancel: 'Cancel'
}
}).then(() => {
this.request('DELETE', `${base_url}api/admin/titles/missing`, () => {
this.request('DELETE', `${base_url}api/admin/entries/missing`, () => {
this.load();
});
});
});
},
request(method, url, cb) {
console.log(url);
$.ajax({
type: method,
url: url,
contentType: 'application/json'
})
.done(data => {
if (data.error) {
alert('danger', `Failed to ${method} ${url}. Error: ${data.error}`);
return;
}
if (cb) cb(data);
})
.fail((jqXHR, status) => {
alert('danger', `Failed to ${method} ${url}. Error: [${jqXHR.status}] ${jqXHR.statusText}`);
});
}
};
};

View File

@@ -126,9 +126,7 @@ const download = () => {
}
const successCount = parseInt(data.success);
const failCount = parseInt(data.fail);
UIkit.modal.confirm(`${successCount} of ${successCount + failCount} chapters added to the download queue. Proceed to the download manager?`).then(() => {
window.location.href = base_url + 'admin/downloads';
});
alert('success', `${successCount} of ${successCount + failCount} chapters added to the download queue. You can view and manage your download queue on the <a href="${base_url}admin/downloads">download manager page</a>.`);
})
.fail((jqXHR, status) => {
alert('danger', `Failed to add chapters to the download queue. Error: [${jqXHR.status}] ${jqXHR.statusText}`);

View File

@@ -6,9 +6,13 @@ const readerComponent = () => {
alertClass: 'uk-alert-primary',
items: [],
curItem: {},
enableFlipAnimation: true,
flipAnimation: null,
longPages: false,
lastSavedPage: page,
selectedIndex: 0, // 0: not selected; 1: the first page
margin: 30,
preloadLookahead: 3,
/**
* Initialize the component by fetching the page dimensions
@@ -26,7 +30,6 @@ const readerComponent = () => {
url: `${base_url}api/page/${tid}/${eid}/${i+1}`,
width: d.width,
height: d.height,
style: `margin-top: ${data.margin}px; margin-bottom: ${data.margin}px;`
};
});
@@ -46,6 +49,21 @@ const readerComponent = () => {
const mode = this.mode;
this.updateMode(this.mode, page, nextTick);
$('#mode-select').val(mode);
const savedMargin = localStorage.getItem('margin');
if (savedMargin) {
this.margin = savedMargin;
}
// Preload Images
this.preloadLookahead = +(localStorage.getItem('preloadLookahead') ?? 3);
const limit = Math.min(page + this.preloadLookahead, this.items.length + 1);
for (let idx = page + 1; idx <= limit; idx++) {
this.preloadImage(this.items[idx - 1].url);
}
const savedFlipAnimation = localStorage.getItem('enableFlipAnimation');
this.enableFlipAnimation = savedFlipAnimation === null || savedFlipAnimation === 'true';
})
.catch(e => {
const errMsg = `Failed to get the page dimensions. ${e}`;
@@ -54,6 +72,12 @@ const readerComponent = () => {
this.msg = errMsg;
})
},
/**
* Preload an image, which is expected to be cached
*/
preloadImage(url) {
(new Image()).src = url;
},
/**
* Handles the `change` event for the page selector
*/
@@ -105,12 +129,18 @@ const readerComponent = () => {
if (newIdx <= 0 || newIdx > this.items.length) return;
if (newIdx + this.preloadLookahead < this.items.length + 1) {
this.preloadImage(this.items[newIdx + this.preloadLookahead - 1].url);
}
this.toPage(newIdx);
if (isNext)
this.flipAnimation = 'right';
else
this.flipAnimation = 'left';
if (this.enableFlipAnimation) {
if (isNext)
this.flipAnimation = 'right';
else
this.flipAnimation = 'left';
}
setTimeout(() => {
this.flipAnimation = null;
@@ -221,10 +251,7 @@ const readerComponent = () => {
*/
showControl(event) {
const idx = event.currentTarget.id;
const pageCount = this.items.length;
const progressText = `Progress: ${idx}/${pageCount} (${(idx/pageCount * 100).toFixed(1)}%)`;
$('#progress-label').text(progressText);
$('#page-select').val(idx);
this.selectedIndex = idx;
UIkit.modal($('#modal-sections')).show();
},
/**
@@ -263,19 +290,35 @@ const readerComponent = () => {
});
},
/**
* Exits the reader, and optionally sets the reading progress to 100%
* Exits the reader, and sets the reading progress to 100%
*
* @param {string} exitUrl - The Exit URL
* @param {boolean} [markCompleted] - Whether we should mark the
* reading progress to 100%
*/
exitReader(exitUrl, markCompleted = false) {
if (!markCompleted) {
return this.redirect(exitUrl);
}
exitReader(exitUrl) {
this.saveProgress(this.items.length, () => {
this.redirect(exitUrl);
});
}
},
/**
* Handles the `change` event for the entry selector
*/
entryChanged() {
const id = $('#entry-select').val();
this.redirect(`${base_url}reader/${tid}/${id}`);
},
marginChanged() {
localStorage.setItem('margin', this.margin);
this.toPage(this.selectedIndex);
},
preloadLookaheadChanged() {
localStorage.setItem('preloadLookahead', this.preloadLookahead);
},
enableFlipAnimationChanged() {
localStorage.setItem('enableFlipAnimation', this.enableFlipAnimation);
},
};
}

public/js/subscription.js Normal file
View File

@@ -0,0 +1,82 @@
const component = () => {
return {
available: undefined,
subscriptions: [],
init() {
$.getJSON(`${base_url}api/admin/mangadex/expires`)
.done((data) => {
if (data.error) {
alert('danger', 'Failed to check MangaDex integration status. Error: ' + data.error);
return;
}
this.available = Boolean(data.expires && data.expires > Math.floor(Date.now() / 1000));
if (this.available) this.getSubscriptions();
})
.fail((jqXHR, status) => {
alert('danger', `Failed to check MangaDex integration status. Error: [${jqXHR.status}] ${jqXHR.statusText}`);
})
},
getSubscriptions() {
$.getJSON(`${base_url}api/admin/mangadex/subscriptions`)
.done(data => {
if (data.error) {
alert('danger', 'Failed to get subscriptions. Error: ' + data.error);
return;
}
this.subscriptions = data.subscriptions;
})
.fail((jqXHR, status) => {
alert('danger', `Failed to get subscriptions. Error: [${jqXHR.status}] ${jqXHR.statusText}`);
})
},
rm(event) {
const id = event.currentTarget.parentNode.getAttribute('data-id');
$.ajax({
type: 'DELETE',
url: `${base_url}api/admin/mangadex/subscriptions/${id}`,
contentType: 'application/json'
})
.done(data => {
if (data.error) {
alert('danger', `Failed to delete subscription. Error: ${data.error}`);
}
this.getSubscriptions();
})
.fail((jqXHR, status) => {
alert('danger', `Failed to delete subscription. Error: [${jqXHR.status}] ${jqXHR.statusText}`);
});
},
check(event) {
const id = event.currentTarget.parentNode.getAttribute('data-id');
$.ajax({
type: 'POST',
url: `${base_url}api/admin/mangadex/subscriptions/check/${id}`,
contentType: 'application/json'
})
.done(data => {
if (data.error) {
alert('danger', `Failed to check subscription. Error: ${data.error}`);
return;
}
alert('success', 'Mango is now checking the subscription for updates. This might take a while, but you can safely leave the page.');
})
.fail((jqXHR, status) => {
alert('danger', `Failed to check subscription. Error: [${jqXHR.status}] ${jqXHR.statusText}`);
});
},
formatRange(min, max) {
if (!isNaN(min) && isNaN(max)) return `${min}`;
if (isNaN(min) && !isNaN(max)) return `${max}`;
if (isNaN(min) && isNaN(max)) return 'All';
if (min === max) return `= ${min}`;
return `${min} - ${max}`;
}
};
};

View File

@@ -255,48 +255,65 @@ const bulkProgress = (action, el) => {
const tagsComponent = () => {
return {
loading: true,
isAdmin: false,
tags: [],
newTag: '',
inputShown: false,
tid: $('.upload-field').attr('data-title-id'),
loading: true,
load(admin) {
this.isAdmin = admin;
const url = `${base_url}api/tags/${this.tid}`;
this.request(url, 'GET', (data) => {
this.tags = data.tags;
this.loading = false;
$('.tag-select').select2({
tags: true,
placeholder: this.isAdmin ? 'Tag the title' : 'No tags found',
disabled: !this.isAdmin,
templateSelection(state) {
const a = document.createElement('a');
a.setAttribute('href', `${base_url}tags/${encodeURIComponent(state.text)}`);
a.setAttribute('class', 'uk-link-reset');
a.onclick = event => {
event.stopPropagation();
};
a.innerText = state.text;
return a;
}
});
},
add() {
const tag = this.newTag.trim();
const url = `${base_url}api/admin/tags/${this.tid}/${encodeURIComponent(tag)}`;
this.request(url, 'PUT', () => {
this.tags.push(tag);
this.newTag = '';
});
},
keydown(event) {
if (event.key === 'Enter')
this.add()
},
rm(event) {
const tag = event.currentTarget.id.split('-')[0];
const url = `${base_url}api/admin/tags/${this.tid}/${encodeURIComponent(tag)}`;
this.request(url, 'DELETE', () => {
const idx = this.tags.indexOf(tag);
if (idx < 0) return;
this.tags.splice(idx, 1);
});
},
toggleInput(nextTick) {
this.inputShown = !this.inputShown;
if (this.inputShown) {
nextTick(() => {
$('#tag-input').get(0).focus();
this.request(`${base_url}api/tags`, 'GET', (data) => {
const allTags = data.tags;
const url = `${base_url}api/tags/${this.tid}`;
this.request(url, 'GET', data => {
this.tags = data.tags;
allTags.forEach(t => {
const op = new Option(t, t, false, this.tags.indexOf(t) >= 0);
$('.tag-select').append(op);
});
$('.tag-select').on('select2:select', e => {
this.onAdd(e);
});
$('.tag-select').on('select2:unselect', e => {
this.onDelete(e);
});
$('.tag-select').on('change', () => {
this.onChange();
});
$('.tag-select').trigger('change');
this.loading = false;
});
}
});
},
onChange() {
this.tags = $('.tag-select').select2('data').map(o => o.text);
},
onAdd(event) {
const tag = event.params.data.text;
const url = `${base_url}api/admin/tags/${this.tid}/${encodeURIComponent(tag)}`;
this.request(url, 'PUT');
},
onDelete(event) {
const tag = event.params.data.text;
const url = `${base_url}api/admin/tags/${this.tid}/${encodeURIComponent(tag)}`;
this.request(url, 'DELETE');
},
request(url, method, cb) {
$.ajax({
@@ -305,9 +322,9 @@ const tagsComponent = () => {
dataType: 'json'
})
.done(data => {
if (data.success)
cb(data);
else {
if (data.success) {
if (cb) cb(data);
} else {
alert('danger', data.error);
}
})

View File

@@ -2,73 +2,77 @@ version: 2.0
shards:
ameba:
git: https://github.com/crystal-ameba/ameba.git
version: 0.12.1
version: 0.14.3
archive:
git: https://github.com/hkalexling/archive.cr.git
version: 0.4.0
version: 0.5.0
baked_file_system:
git: https://github.com/schovi/baked_file_system.git
version: 0.9.8+git.commit.fb3091b546797fbec3c25dc0e1e2cff60bb9033b
version: 0.10.0
clim:
git: https://github.com/at-grandpa/clim.git
version: 0.12.0
version: 0.17.1
db:
git: https://github.com/crystal-lang/crystal-db.git
version: 0.9.0
version: 0.10.1
duktape:
git: https://github.com/jessedoyle/duktape.cr.git
version: 0.20.0
version: 1.0.0
exception_page:
git: https://github.com/crystal-loot/exception_page.git
version: 0.1.4
version: 0.1.5
http_proxy:
git: https://github.com/mamantoha/http_proxy.git
version: 0.7.1
version: 0.8.0
image_size:
git: https://github.com/hkalexling/image_size.cr.git
version: 0.4.0
version: 0.5.0
kemal:
git: https://github.com/kemalcr/kemal.git
version: 0.27.0
version: 1.0.0
kemal-session:
git: https://github.com/kemalcr/kemal-session.git
version: 0.12.1
version: 1.0.0
kilt:
git: https://github.com/jeromegn/kilt.git
version: 0.4.0
version: 0.4.1
koa:
git: https://github.com/hkalexling/koa.git
version: 0.5.0
version: 0.8.0
mg:
git: https://github.com/hkalexling/mg.git
version: 0.5.0+git.commit.697e46e27cde8c3969346e228e372db2455a6264
myhtml:
git: https://github.com/kostya/myhtml.git
version: 1.5.1
version: 1.5.8
open_api:
git: https://github.com/jreinert/open_api.cr.git
version: 1.2.1+git.commit.95e4df2ca10b1fe88b8b35c62a18b06a10267b6c
git: https://github.com/hkalexling/open_api.cr.git
version: 1.2.1+git.commit.1d3c55dd5534c6b0af18964d031858a08515553a
radix:
git: https://github.com/luislavena/radix.git
version: 0.3.9
version: 0.4.1
sqlite3:
git: https://github.com/crystal-lang/crystal-sqlite3.git
version: 0.16.0
version: 0.18.0
tallboy:
git: https://github.com/epoch/tallboy.git
version: 0.9.3
version: 0.9.3+git.commit.9be1510bb0391c95e92f1b288f3afb429a73caa6

View File

@@ -1,5 +1,5 @@
name: mango
version: 0.18.1
version: 0.24.0
authors:
- Alex Ling <hkalexling@gmail.com>
@@ -8,7 +8,7 @@ targets:
mango:
main: src/mango.cr
crystal: 0.35.1
crystal: 1.0.0
license: MIT
@@ -21,7 +21,6 @@ dependencies:
github: crystal-lang/crystal-sqlite3
baked_file_system:
github: schovi/baked_file_system
version: 0.9.8+git.commit.fb3091b546797fbec3c25dc0e1e2cff60bb9033b
archive:
github: hkalexling/archive.cr
ameba:
@@ -30,7 +29,6 @@ dependencies:
github: at-grandpa/clim
duktape:
github: jessedoyle/duktape.cr
version: ~> 0.20.0
myhtml:
github: kostya/myhtml
http_proxy:
@@ -41,3 +39,6 @@ dependencies:
github: hkalexling/koa
tallboy:
github: epoch/tallboy
branch: master
mg:
github: hkalexling/mg

View File

@@ -8,9 +8,7 @@ describe Storage do
end
it "deletes user" do
with_storage do |storage|
storage.delete_user "admin"
end
with_storage &.delete_user "admin"
end
it "creates new user" do

View File

@@ -21,7 +21,7 @@ describe "compare_numerically" do
it "sorts like the stack exchange post" do
ary = ["2", "12", "200000", "1000000", "a", "a12", "b2", "text2",
"text2a", "text2a2", "text2a12", "text2ab", "text12", "text12a"]
ary.reverse.sort { |a, b|
ary.reverse.sort! { |a, b|
compare_numerically a, b
}.should eq ary
end
@@ -29,18 +29,45 @@ describe "compare_numerically" do
# https://github.com/hkalexling/Mango/issues/22
it "handles numbers larger than Int32" do
ary = ["14410155591588.jpg", "21410155591588.png", "104410155591588.jpg"]
ary.reverse.sort { |a, b|
ary.reverse.sort! { |a, b|
compare_numerically a, b
}.should eq ary
end
end
describe "is_supported_file" do
it "returns true when the filename has a supported extension" do
filename = "manga.cbz"
is_supported_file(filename).should eq true
end
it "returns true when the filename does not have a supported extension" do
filename = "info.json"
is_supported_file(filename).should eq false
end
it "is case insensitive" do
filename = "manga.ZiP"
is_supported_file(filename).should eq true
end
end
describe "chapter_sort" do
it "sorts correctly" do
ary = ["Vol.1 Ch.01", "Vol.1 Ch.02", "Vol.2 Ch. 2.5", "Ch. 3", "Ch.04"]
sorter = ChapterSorter.new ary
ary.reverse.sort do |a, b|
ary.reverse.sort! do |a, b|
sorter.compare a, b
end.should eq ary
end
end
describe "sanitize_filename" do
it "returns a random string for empty sanitized string" do
sanitize_filename("..").should_not eq sanitize_filename("..")
end
it "sanitizes correctly" do
sanitize_filename(".. \n\v.\rマンゴー/|*()<[1/2] 3.14 hello world ")
.should eq "マンゴー_()[1_2] 3.14 hello world"
end
end

View File

@@ -1,41 +0,0 @@
Arabic,sa
Bengali,bd
Bulgarian,bg
Burmese,mm
Catalan,ct
Chinese (Simp),cn
Chinese (Trad),hk
Czech,cz
Danish,dk
Dutch,nl
English,gb
Filipino,ph
Finnish,fi
French,fr
German,de
Greek,gr
Hebrew,il
Hindi,in
Hungarian,hu
Indonesian,id
Italian,it
Japanese,jp
Korean,kr
Lithuanian,lt
Malay,my
Mongolian,mn
Other,
Persian,ir
Polish,pl
Portuguese (Br),br
Portuguese (Pt),pt
Romanian,ro
Russian,ru
Serbo-Croatian,rs
Spanish (Es),es
Spanish (LATAM),mx
Swedish,se
Thai,th
Turkish,tr
Ukrainian,ua
Vietnames,vn

View File

@@ -5,30 +5,35 @@ class Config
@[YAML::Field(ignore: true)]
property path : String = ""
property host : String = "0.0.0.0"
property port : Int32 = 9000
property base_url : String = "/"
property session_secret : String = "mango-session-secret"
property library_path : String = File.expand_path "~/mango/library",
home: true
property library_cache_path = File.expand_path "~/mango/library.yml.gz",
home: true
property db_path : String = File.expand_path "~/mango/mango.db", home: true
property scan_interval_minutes : Int32 = 5
property thumbnail_generation_interval_hours : Int32 = 24
property db_optimization_interval_hours : Int32 = 24
property log_level : String = "info"
property upload_path : String = File.expand_path "~/mango/uploads",
home: true
property plugin_path : String = File.expand_path "~/mango/plugins",
home: true
property download_timeout_seconds : Int32 = 30
property page_margin : Int32 = 30
property cache_enabled = false
property cache_size_mbs = 50
property cache_log_enabled = true
property disable_login = false
property default_username = ""
property auth_proxy_header_name = ""
property mangadex = Hash(String, String | Int32).new
@[YAML::Field(ignore: true)]
@mangadex_defaults = {
"base_url" => "https://mangadex.org",
"api_url" => "https://mangadex.org/api",
"api_url" => "https://api.mangadex.org/v2",
"download_wait_seconds" => 5,
"download_retries" => 4,
"download_queue_db_path" => File.expand_path("~/mango/queue.db",
@@ -52,9 +57,9 @@ class Config
cfg_path = File.expand_path path, home: true
if File.exists? cfg_path
config = self.from_yaml File.read cfg_path
config.preprocess
config.path = path
config.fill_defaults
config.preprocess
return config
end
puts "The config file #{cfg_path} does not exist. " \
@@ -92,5 +97,24 @@ class Config
raise "Login is disabled, but default username is not set. " \
"Please set a default username"
end
# `Logger.default` is not available yet
Log.setup :debug
unless mangadex["api_url"] =~ /\/v2/
Log.warn { "It looks like you are using the deprecated MangaDex API " \
"v1 in your config file. Please update it to " \
"https://api.mangadex.org/v2 to suppress this warning." }
mangadex["api_url"] = "https://api.mangadex.org/v2"
end
if mangadex["api_url"] =~ /\/api\/v2/
Log.warn { "It looks like you are using the outdated MangaDex API " \
"url (mangadex.org/api/v2) in your config file. Please " \
"update it to https://api.mangadex.org/v2 to suppress this " \
"warning." }
mangadex["api_url"] = "https://api.mangadex.org/v2"
end
mangadex["api_url"] = mangadex["api_url"].to_s.rstrip "/"
mangadex["base_url"] = mangadex["base_url"].to_s.rstrip "/"
end
end
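
The config changes above add cache-related options (library_cache_path, cache_enabled, cache_size_mbs, cache_log_enabled) and teach preprocess to rewrite deprecated MangaDex API URLs to https://api.mangadex.org/v2. A minimal sketch of parsing the new keys, not taken from the repo — the key names come from the properties above, the values are illustrative:

# Sketch only; assumes the Config class above is loaded.
config = Config.from_yaml <<-YAML
  cache_enabled: true
  cache_size_mbs: 50
  cache_log_enabled: true
  library_cache_path: /data/mango/library.yml.gz
  YAML
config.cache_enabled # => true
# Config.load then calls fill_defaults followed by preprocess, which rewrites
# a deprecated mangadex.api_url and strips trailing slashes from the URLs.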

View File

@@ -15,7 +15,11 @@ class AuthHandler < Kemal::Handler
env.response.status_code = 401
env.response.headers["WWW-Authenticate"] = HEADER_LOGIN_REQUIRED
env.response.print AUTH_MESSAGE
call_next env
end
def require_auth(env)
env.session.string "callback", env.request.path
redirect env, "/login"
end
def validate_token(env)
@@ -49,50 +53,50 @@ class AuthHandler < Kemal::Handler
Storage.default.verify_user username, password
end
def handle_opds_auth(env)
if validate_token(env) || validate_auth_header(env)
call_next env
else
env.response.status_code = 401
env.response.headers["WWW-Authenticate"] = HEADER_LOGIN_REQUIRED
env.response.print AUTH_MESSAGE
end
end
def handle_auth(env)
def call(env)
# Skip all authentication if requesting /login, /logout, or a static file
if request_path_startswith(env, ["/login", "/logout"]) ||
requesting_static_file env
return call_next(env)
end
unless validate_token(env) || Config.current.disable_login
env.session.string "callback", env.request.path
return redirect env, "/login"
# Check user is logged in
if validate_token env
# Skip if the request has a valid token
elsif Config.current.disable_login
# Check default username if login is disabled
unless Storage.default.username_exists Config.current.default_username
Logger.warn "Default username #{Config.current.default_username} " \
"does not exist"
return require_auth env
end
elsif !Config.current.auth_proxy_header_name.empty?
# Check auth proxy if present
username = env.request.headers[Config.current.auth_proxy_header_name]?
unless username && Storage.default.username_exists username
Logger.warn "Header #{Config.current.auth_proxy_header_name} unset " \
"or is not a valid username"
return require_auth env
end
elsif request_path_startswith env, ["/opds"]
# Check auth header if requesting an opds page
unless validate_auth_header env
return require_basic_auth env
end
else
return require_auth env
end
if request_path_startswith env, ["/admin", "/api/admin", "/download"]
# The token (if exists) takes precedence over the default user option.
# this is why we check the default username first before checking the
# token.
should_reject = true
if Config.current.disable_login &&
Storage.default.username_is_admin Config.current.default_username
should_reject = false
# Check admin access when requesting an admin page
if request_path_startswith env, %w(/admin /api/admin /download)
unless is_admin? env
env.response.status_code = 403
return send_error_page "HTTP 403: You are not authorized to visit " \
"#{env.request.path}"
end
if env.session.string? "token"
should_reject = !validate_token_admin(env)
end
env.response.status_code = 403 if should_reject
end
# Let the request go through if it passes the above checks
call_next env
end
def call(env)
if request_path_startswith env, ["/opds"]
handle_opds_auth env
else
handle_auth env
end
end
end
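
The rewritten AuthHandler#call above checks, in order: a valid session token, the disable_login default user, the configured auth proxy header, HTTP Basic auth for /opds paths, and otherwise redirects to /login; /admin, /api/admin and /download additionally require an admin user. A minimal sketch of just the auth-proxy branch — the header name "X-Remote-User" is illustrative, and `env` stands for the Kemal request context inside the handler; this is not the repo's exact code:

# Mirrors the auth-proxy check added above.
def allowed_by_auth_proxy?(env)
  header = Config.current.auth_proxy_header_name # e.g. "X-Remote-User" in config.yml
  return false if header.empty?
  username = env.request.headers[header]?        # set by the reverse proxy
  return false unless username
  Storage.default.username_exists username
end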

src/library/cache.cr Normal file
View File

@@ -0,0 +1,188 @@
require "digest"
require "./entry"
require "./types"
# Base class for an entry in the LRU cache.
# There are two ways to use it:
# 1. Use it as it is by instantiating with the appropriate `SaveT` and
# `ReturnT`. Note that in this case, `SaveT` and `ReturnT` must be the
# same type. That is, the input value will be stored as it is without
# any transformation.
# 2. You can also subclass it and provide custom implementations for
# `to_save_t` and `to_return_t`. This allows you to transform and store
# the input value to a different type. See `SortedEntriesCacheEntry` as
# an example.
private class CacheEntry(SaveT, ReturnT)
getter key : String, atime : Time
@value : SaveT
def initialize(@key : String, value : ReturnT)
@atime = @ctime = Time.utc
@value = self.class.to_save_t value
end
def value
@atime = Time.utc
self.class.to_return_t @value
end
def self.to_save_t(value : ReturnT)
value
end
def self.to_return_t(value : SaveT)
value
end
def instance_size
instance_sizeof(CacheEntry(SaveT, ReturnT)) + # sizeof itself
instance_sizeof(String) + @key.bytesize + # allocated memory for @key
@value.instance_size
end
end
class SortedEntriesCacheEntry < CacheEntry(Array(String), Array(Entry))
def self.to_save_t(value : Array(Entry))
value.map &.id
end
def self.to_return_t(value : Array(String))
ids_to_entries value
end
private def self.ids_to_entries(ids : Array(String))
e_map = Library.default.deep_entries.to_h { |entry| {entry.id, entry} }
entries = [] of Entry
begin
ids.each do |id|
entries << e_map[id]
end
return entries if ids.size == entries.size
rescue
end
end
def instance_size
instance_sizeof(SortedEntriesCacheEntry) + # sizeof itself
instance_sizeof(String) + @key.bytesize + # allocated memory for @key
@value.size * (instance_sizeof(String) + sizeof(String)) +
@value.sum(&.bytesize) # elements in Array(String)
end
def self.gen_key(book_id : String, username : String,
entries : Array(Entry), opt : SortOptions?)
entries_sig = Digest::SHA1.hexdigest (entries.map &.id).to_s
user_context = opt && opt.method == SortMethod::Progress ? username : ""
sig = Digest::SHA1.hexdigest (book_id + entries_sig + user_context +
(opt ? opt.to_tuple.to_s : "nil"))
"#{sig}:sorted_entries"
end
end
class String
def instance_size
instance_sizeof(String) + bytesize
end
end
struct Tuple(*T)
def instance_size
sizeof(T) + # total size of non-reference types
self.sum do |e|
next 0 unless e.is_a? Reference
if e.responds_to? :instance_size
e.instance_size
else
instance_sizeof(typeof(e))
end
end
end
end
alias CacheableType = Array(Entry) | String | Tuple(String, Int32)
alias CacheEntryType = SortedEntriesCacheEntry |
CacheEntry(String, String) |
CacheEntry(Tuple(String, Int32), Tuple(String, Int32))
def generate_cache_entry(key : String, value : CacheableType)
if value.is_a? Array(Entry)
SortedEntriesCacheEntry.new key, value
else
CacheEntry(typeof(value), typeof(value)).new key, value
end
end
# LRU Cache
class LRUCache
@@limit : Int128 = Int128.new 0
@@should_log = true
# key => entry
@@cache = {} of String => CacheEntryType
def self.enabled
Config.current.cache_enabled
end
def self.init
cache_size = Config.current.cache_size_mbs
@@limit = Int128.new cache_size * 1024 * 1024 if enabled
@@should_log = Config.current.cache_log_enabled
end
def self.get(key : String)
return unless enabled
entry = @@cache[key]?
if @@should_log
Logger.debug "LRUCache #{entry.nil? ? "miss" : "hit"} #{key}"
end
return entry.value unless entry.nil?
end
def self.set(cache_entry : CacheEntryType)
return unless enabled
key = cache_entry.key
@@cache[key] = cache_entry
Logger.debug "LRUCache cached #{key}" if @@should_log
remove_least_recent_access
end
def self.invalidate(key : String)
return unless enabled
@@cache.delete key
end
def self.print
return unless @@should_log
sum = @@cache.sum { |_, entry| entry.instance_size }
Logger.debug "---- LRU Cache ----"
Logger.debug "Size: #{sum} Bytes"
Logger.debug "List:"
@@cache.each do |k, v|
Logger.debug "#{k} | #{v.atime} | #{v.instance_size}"
end
Logger.debug "-------------------"
end
private def self.is_cache_full
sum = @@cache.sum { |_, entry| entry.instance_size }
sum > @@limit
end
private def self.remove_least_recent_access
if @@should_log && is_cache_full
Logger.debug "Removing entries from LRUCache"
end
while is_cache_full && @@cache.size > 0
min_tuple = @@cache.min_by { |_, entry| entry.atime }
min_key = min_tuple[0]
min_entry = min_tuple[1]
Logger.debug " \
Target: #{min_key}, \
Last Access Time: #{min_entry.atime}" if @@should_log
invalidate min_key
end
end
end
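
A minimal usage sketch of the cache above. The key strings and values are illustrative; every call is a no-op unless cache_enabled is set, and LRUCache.init runs once after the config is loaded:

LRUCache.init

# 1. Store a value as-is (SaveT == ReturnT), e.g. the {signature, sum} tuple
#    used by Title#deep_read_page_count.
LRUCache.set generate_cache_entry "abc123:alice:progress_sum", {"deadbeef", 42}
LRUCache.get "abc123:alice:progress_sum" # => {"deadbeef", 42} on a hit, nil otherwise

# 2. Store through a transforming subclass: sorted entries are saved as an
#    array of IDs and rebuilt into Entry objects on read (SortedEntriesCacheEntry).
#    `title`, `username` and `entries` are assumed to exist inside Title code.
key = SortedEntriesCacheEntry.gen_key title.id, username, entries,
  SortOptions.new(SortMethod::Progress, true)
LRUCache.set generate_cache_entry key, entries
LRUCache.get key # => Array(Entry) while every ID still resolves, nil otherwise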

View File

@@ -1,6 +1,9 @@
require "image_size"
require "yaml"
class Entry
include YAML::Serializable
getter zip_path : String, book : Title, title : String,
size : String, pages : Int32, id : String, encoded_path : String,
encoded_title : String, mtime : Time, err_msg : String?
@@ -11,13 +14,13 @@ class Entry
@title = File.basename @zip_path, File.extname @zip_path
@encoded_title = URI.encode @title
@size = (File.size @zip_path).humanize_bytes
id = storage.get_id @zip_path, false
id = storage.get_entry_id @zip_path, File.signature(@zip_path)
if id.nil?
id = random_str
storage.insert_id({
path: @zip_path,
id: id,
is_title: false,
storage.insert_entry_id({
path: @zip_path,
id: id,
signature: File.signature(@zip_path).to_s,
})
end
@id = id
@@ -46,16 +49,20 @@ class Entry
file.close
end
def to_json(json : JSON::Builder)
json.object do
{% for str in ["zip_path", "title", "size", "id"] %}
def build_json(*, slim = false)
JSON.build do |json|
json.object do
{% for str in ["zip_path", "title", "size", "id"] %}
json.field {{str}}, @{{str.id}}
{% end %}
json.field "title_id", @book.id
json.field "display_name", @book.display_name @title
json.field "cover_url", cover_url
json.field "pages" { json.number @pages }
json.field "mtime" { json.number @mtime.to_unix }
json.field "title_id", @book.id
json.field "pages" { json.number @pages }
unless slim
json.field "display_name", @book.display_name @title
json.field "cover_url", cover_url
json.field "mtime" { json.number @mtime.to_unix }
end
end
end
end
@@ -69,9 +76,17 @@ class Entry
def cover_url
return "#{Config.current.base_url}img/icon.png" if @err_msg
unless @book.entry_cover_url_cache
TitleInfo.new @book.dir do |info|
@book.entry_cover_url_cache = info.entry_cover_url
end
end
entry_cover_url = @book.entry_cover_url_cache
url = "#{Config.current.base_url}api/cover/#{@book.id}/#{@id}"
TitleInfo.new @book.dir do |info|
info_url = info.entry_cover_url[@title]?
if entry_cover_url
info_url = entry_cover_url[@title]?
unless info_url.nil? || info_url.empty?
url = File.join Config.current.base_url, info_url
end
@@ -86,7 +101,7 @@ class Entry
SUPPORTED_IMG_TYPES.includes? \
MIME.from_filename? e.filename
}
.sort { |a, b|
.sort! { |a, b|
compare_numerically a.filename, b.filename
}
yield file, entries
@@ -134,10 +149,11 @@ class Entry
entries[idx + 1]
end
def previous_entry
idx = @book.entries.index self
def previous_entry(username)
entries = @book.sorted_entries username
idx = entries.index self
return nil if idx.nil? || idx == 0
@book.entries[idx - 1]
entries[idx - 1]
end
def date_added
@@ -157,6 +173,16 @@ class Entry
# For backward compatibility with v0.1.0, we save entry titles

# instead of IDs in info.json
def save_progress(username, page)
LRUCache.invalidate "#{@book.id}:#{username}:progress_sum"
@book.parents.each do |parent|
LRUCache.invalidate "#{parent.id}:#{username}:progress_sum"
end
[false, true].each do |ascend|
sorted_entries_cache_key = SortedEntriesCacheEntry.gen_key @book.id,
username, @book.entries, SortOptions.new(SortMethod::Progress, ascend)
LRUCache.invalidate sorted_entries_cache_key
end
TitleInfo.new @book.dir do |info|
if info.progress[username]?.nil?
info.progress[username] = {@title => page}
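
The Entry changes above replace to_json with build_json(slim:), switch to signature-based IDs (get_entry_id/insert_entry_id), cache per-entry cover URLs on the Title, and invalidate the relevant LRU cache keys in save_progress. A short sketch of the new serialization call — `entry` is assumed to be an Entry from a scanned library:

entry.build_json              # zip_path, title, size, id, title_id, pages,
                              # plus display_name, cover_url and mtime
entry.build_json slim: true   # drops display_name, cover_url and mtime
                              # (see the `unless slim` branch above)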

View File

@@ -1,12 +1,38 @@
class Library
include YAML::Serializable
getter dir : String, title_ids : Array(String),
title_hash : Hash(String, Title)
use_default
def initialize
register_mime_types
def save_instance
path = Config.current.library_cache_path
Logger.debug "Caching library to #{path}"
writer = Compress::Gzip::Writer.new path,
Compress::Gzip::BEST_COMPRESSION
writer.write self.to_yaml.to_slice
writer.close
end
def self.load_instance
path = Config.current.library_cache_path
return unless File.exists? path
Logger.debug "Loading cached library from #{path}"
begin
Compress::Gzip::Reader.open path do |content|
@@default = Library.from_yaml content
end
Library.default.register_jobs
rescue e
Logger.error e
end
end
def initialize
@dir = Config.current.library_path
# explicitly initialize @titles to bypass the compiler check. it will
# be filled with actual Titles in the `scan` call below
@@ -16,6 +42,12 @@ class Library
@entries_count = 0
@thumbnails_count = 0
register_jobs
end
protected def register_jobs
register_mime_types
scan_interval = Config.current.scan_interval_minutes
if scan_interval < 1
scan
@@ -25,7 +57,7 @@ class Library
start = Time.local
scan
ms = (Time.local - start).total_milliseconds
Logger.info "Scanned #{@title_ids.size} titles in #{ms}ms"
Logger.debug "Library initialized in #{ms}ms"
sleep scan_interval.minutes
end
end
@@ -42,16 +74,6 @@ class Library
end
end
end
db_interval = Config.current.db_optimization_interval_hours
unless db_interval < 1
spawn do
loop do
Storage.default.optimize
sleep db_interval.hours
end
end
end
end
def titles
@@ -61,11 +83,6 @@ class Library
def sorted_titles(username, opt : SortOptions? = nil)
if opt.nil?
opt = SortOptions.from_info_json @dir, username
else
TitleInfo.new @dir do |info|
info.sort_by[username] = opt.to_tuple
info.save
end
end
# Helper function from src/util/util.cr
@@ -73,14 +90,24 @@ class Library
end
def deep_titles
titles + titles.map { |t| t.deep_titles }.flatten
titles + titles.flat_map &.deep_titles
end
def to_json(json : JSON::Builder)
json.object do
json.field "dir", @dir
json.field "titles" do
json.raw self.titles.to_json
def deep_entries
titles.flat_map &.deep_entries
end
def build_json(*, slim = false, depth = -1)
JSON.build do |json|
json.object do
json.field "dir", @dir
json.field "titles" do
json.array do
self.titles.each do |title|
json.raw title.build_json(slim: slim, depth: depth)
end
end
end
end
end
end
@@ -94,6 +121,7 @@ class Library
end
def scan
start = Time.local
unless Dir.exists? @dir
Logger.info "The library directory #{@dir} does not exist. " \
"Attempting to create it"
@@ -102,14 +130,36 @@ class Library
storage = Storage.new auto_close: false
examine_context : ExamineContext = {
cached_contents_signature: {} of String => String,
deleted_title_ids: [] of String,
deleted_entry_ids: [] of String,
}
@title_ids.select! do |title_id|
title = @title_hash[title_id]
existence = title.examine examine_context
unless existence
examine_context["deleted_title_ids"].concat [title_id] +
title.deep_titles.map &.id
examine_context["deleted_entry_ids"].concat title.deep_entries.map &.id
end
existence
end
remained_title_dirs = @title_ids.map { |id| title_hash[id].dir }
examine_context["deleted_title_ids"].each do |title_id|
@title_hash.delete title_id
end
cache = examine_context["cached_contents_signature"]
(Dir.entries @dir)
.select { |fn| !fn.starts_with? "." }
.map { |fn| File.join @dir, fn }
.select { |path| !(remained_title_dirs.includes? path) }
.select { |path| File.directory? path }
.map { |path| Title.new path, "" }
.map { |path| Title.new path, "", cache }
.select { |title| !(title.entries.empty? && title.titles.empty?) }
.sort { |a, b| a.title <=> b.title }
.tap { |_| @title_ids.clear }
.sort! { |a, b| a.title <=> b.title }
.each do |title|
@title_hash[title.id] = title
@title_ids << title.id
@@ -118,19 +168,27 @@ class Library
storage.bulk_insert_ids
storage.close
Logger.debug "Scan completed"
ms = (Time.local - start).total_milliseconds
Logger.info "Scanned #{@title_ids.size} titles in #{ms}ms"
Storage.default.mark_unavailable examine_context["deleted_entry_ids"],
examine_context["deleted_title_ids"]
spawn do
save_instance
end
end
def get_continue_reading_entries(username)
cr_entries = deep_titles
.map { |t| t.get_last_read_entry username }
.map(&.get_last_read_entry username)
# Select elements with type `Entry` from the array and ignore all `Nil`s
.select(Entry)[0...ENTRIES_IN_HOME_SECTIONS]
.map { |e|
# Get the last read time of the entry. If it hasn't been started, get
# the last read time of the previous entry
last_read = e.load_last_read username
pe = e.previous_entry
pe = e.previous_entry username
if last_read.nil? && pe
last_read = pe.load_last_read username
end
@@ -159,14 +217,14 @@ class Library
recently_added = [] of RA
last_date_added = nil
titles.map { |t| t.deep_entries_with_date_added }.flatten
.select { |e| e[:date_added] > 1.month.ago }
.sort { |a, b| b[:date_added] <=> a[:date_added] }
titles.flat_map(&.deep_entries_with_date_added)
.select(&.[:date_added].> 1.month.ago)
.sort! { |a, b| b[:date_added] <=> a[:date_added] }
.each do |e|
break if recently_added.size > 12
last = recently_added.last?
if last && e[:entry].book.id == last[:entry].book.id &&
(e[:date_added] - last_date_added.not_nil!).duration < 1.day
(e[:date_added] - last_date_added.not_nil!).abs < 1.day
# A NamedTuple is immutable, so we have to cast it to a Hash first
last_hash = last.to_h
count = last_hash[:grouped_count].as(Int32)
@@ -197,9 +255,9 @@ class Library
# If we use `deep_titles`, the start reading section might include `Vol. 2`
# when the user hasn't started `Vol. 1` yet
titles
.select { |t| t.load_percentage(username) == 0 }
.select(&.load_percentage(username).== 0)
.sample(ENTRIES_IN_HOME_SECTIONS)
.shuffle
.shuffle!
end
def thumbnail_generation_progress
@@ -214,7 +272,7 @@ class Library
end
Logger.info "Starting thumbnail generation"
entries = deep_titles.map(&.deep_entries).flatten.reject &.err_msg
entries = deep_titles.flat_map(&.deep_entries).reject &.err_msg
@entries_count = entries.size
@thumbnails_count = 0
@@ -241,7 +299,7 @@ class Library
e.generate_thumbnail
# Sleep after each generation to minimize the impact on disk IO
# and CPU
sleep 0.5.seconds
sleep 1.seconds
end
@thumbnails_count += 1
end
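
Library now persists itself to the path in library_cache_path as gzip-compressed YAML (save_instance) and restores it on boot (load_instance), so a restart can skip a full rescan. A standalone sketch of the same round trip with a toy type — the Point struct and /tmp path are illustrative, not from the repo:

require "yaml"
require "compress/gzip"

struct Point
  include YAML::Serializable
  getter x : Int32, y : Int32

  def initialize(@x, @y)
  end
end

path = "/tmp/point.yml.gz"

writer = Compress::Gzip::Writer.new path, Compress::Gzip::BEST_COMPRESSION
writer.write Point.new(1, 2).to_yaml.to_slice
writer.close

Compress::Gzip::Reader.open path do |content|
  p Point.from_yaml content # => Point(@x=1, @y=2)
end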

View File

@@ -1,24 +1,38 @@
require "digest"
require "../archive"
class Title
include YAML::Serializable
getter dir : String, parent_id : String, title_ids : Array(String),
entries : Array(Entry), title : String, id : String,
encoded_title : String, mtime : Time
encoded_title : String, mtime : Time, signature : UInt64,
entry_cover_url_cache : Hash(String, String)?
setter entry_cover_url_cache : Hash(String, String)?
@[YAML::Field(ignore: true)]
@entry_display_name_cache : Hash(String, String)?
@[YAML::Field(ignore: true)]
@entry_cover_url_cache : Hash(String, String)?
@[YAML::Field(ignore: true)]
@cached_display_name : String?
@[YAML::Field(ignore: true)]
@cached_cover_url : String?
def initialize(@dir : String, @parent_id)
def initialize(@dir : String, @parent_id, cache = {} of String => String)
storage = Storage.default
id = storage.get_id @dir, true
@signature = Dir.signature dir
id = storage.get_title_id dir, signature
if id.nil?
id = random_str
storage.insert_id({
path: @dir,
id: id,
is_title: true,
storage.insert_title_id({
path: dir,
id: id,
signature: signature.to_s,
})
end
@id = id
@contents_signature = Dir.contents_signature dir, cache
@title = File.basename dir
@encoded_title = URI.encode @title
@title_ids = [] of String
@@ -29,13 +43,13 @@ class Title
next if fn.starts_with? "."
path = File.join dir, fn
if File.directory? path
title = Title.new path, @id
title = Title.new path, @id, cache
next if title.entries.size == 0 && title.titles.size == 0
Library.default.title_hash[title.id] = title
@title_ids << title.id
next
end
if [".zip", ".cbz", ".rar", ".cbr"].includes? File.extname path
if is_supported_file path
entry = Entry.new path, self
@entries << entry if entry.pages > 0 || entry.err_msg
end
@@ -43,39 +57,164 @@ class Title
mtimes = [@mtime]
mtimes += @title_ids.map { |e| Library.default.title_hash[e].mtime }
mtimes += @entries.map { |e| e.mtime }
mtimes += @entries.map &.mtime
@mtime = mtimes.max
@title_ids.sort! do |a, b|
compare_numerically Library.default.title_hash[a].title,
Library.default.title_hash[b].title
end
sorter = ChapterSorter.new @entries.map { |e| e.title }
sorter = ChapterSorter.new @entries.map &.title
@entries.sort! do |a, b|
sorter.compare a.title, b.title
end
end
def to_json(json : JSON::Builder)
json.object do
{% for str in ["dir", "title", "id"] %}
# Utility method used in library rescanning.
# - When the title does not exist on the file system anymore, return false
# and let it be deleted from the library instance
# - When the title exists, but its contents signature is now different from
# the cache, it means some of its content (nested titles or entries)
# has been added, deleted, or renamed. In this case we update its
# contents signature and instance variables
# - When the title exists and its contents signature is still the same, we
# return true so it can be reused without rescanning
def examine(context : ExamineContext) : Bool
return false unless Dir.exists? @dir
contents_signature = Dir.contents_signature @dir,
context["cached_contents_signature"]
return true if @contents_signature == contents_signature
@contents_signature = contents_signature
@signature = Dir.signature @dir
storage = Storage.default
id = storage.get_title_id dir, signature
if id.nil?
id = random_str
storage.insert_title_id({
path: dir,
id: id,
signature: signature.to_s,
})
end
@id = id
@mtime = File.info(@dir).modification_time
previous_titles_size = @title_ids.size
@title_ids.select! do |title_id|
title = Library.default.get_title! title_id
existence = title.examine context
unless existence
context["deleted_title_ids"].concat [title_id] +
title.deep_titles.map &.id
context["deleted_entry_ids"].concat title.deep_entries.map &.id
end
existence
end
remained_title_dirs = @title_ids.map do |title_id|
title = Library.default.get_title! title_id
title.dir
end
previous_entries_size = @entries.size
@entries.select! do |entry|
existence = File.exists? entry.zip_path
Fiber.yield
context["deleted_entry_ids"] << entry.id unless existence
existence
end
remained_entry_zip_paths = @entries.map &.zip_path
is_titles_added = false
is_entries_added = false
Dir.entries(dir).each do |fn|
next if fn.starts_with? "."
path = File.join dir, fn
if File.directory? path
next if remained_title_dirs.includes? path
title = Title.new path, @id, context["cached_contents_signature"]
next if title.entries.size == 0 && title.titles.size == 0
Library.default.title_hash[title.id] = title
@title_ids << title.id
is_titles_added = true
next
end
if is_supported_file path
next if remained_entry_zip_paths.includes? path
entry = Entry.new path, self
if entry.pages > 0 || entry.err_msg
@entries << entry
is_entries_added = true
end
end
end
mtimes = [@mtime]
mtimes += @title_ids.map { |e| Library.default.title_hash[e].mtime }
mtimes += @entries.map &.mtime
@mtime = mtimes.max
if is_titles_added || previous_titles_size != @title_ids.size
@title_ids.sort! do |a, b|
compare_numerically Library.default.title_hash[a].title,
Library.default.title_hash[b].title
end
end
if is_entries_added || previous_entries_size != @entries.size
sorter = ChapterSorter.new @entries.map &.title
@entries.sort! do |a, b|
sorter.compare a.title, b.title
end
end
true
end
alias SortContext = NamedTuple(username: String, opt: SortOptions)
def build_json(*, slim = false, depth = -1,
sort_context : SortContext? = nil)
JSON.build do |json|
json.object do
{% for str in ["dir", "title", "id"] %}
json.field {{str}}, @{{str.id}}
{% end %}
json.field "display_name", display_name
json.field "cover_url", cover_url
json.field "mtime" { json.number @mtime.to_unix }
json.field "titles" do
json.raw self.titles.to_json
end
json.field "entries" do
json.raw @entries.to_json
end
json.field "parents" do
json.array do
self.parents.each do |title|
json.object do
json.field "title", title.title
json.field "id", title.id
json.field "signature" { json.number @signature }
unless slim
json.field "display_name", display_name
json.field "cover_url", cover_url
json.field "mtime" { json.number @mtime.to_unix }
end
unless depth == 0
json.field "titles" do
json.array do
self.titles.each do |title|
json.raw title.build_json(slim: slim,
depth: depth > 0 ? depth - 1 : depth)
end
end
end
json.field "entries" do
json.array do
_entries = if sort_context
sorted_entries sort_context[:username],
sort_context[:opt]
else
@entries
end
_entries.each do |entry|
json.raw entry.build_json(slim: slim)
end
end
end
end
json.field "parents" do
json.array do
self.parents.each do |title|
json.object do
json.field "title", title.title
json.field "id", title.id
end
end
end
end
@@ -90,12 +229,12 @@ class Title
# Get all entries, including entries in nested titles
def deep_entries
return @entries if title_ids.empty?
@entries + titles.map { |t| t.deep_entries }.flatten
@entries + titles.flat_map &.deep_entries
end
def deep_titles
return [] of Title if titles.empty?
titles + titles.map { |t| t.deep_titles }.flatten
titles + titles.flat_map &.deep_titles
end
def parents
@@ -136,15 +275,19 @@ class Title
end
def get_entry(eid)
@entries.find { |e| e.id == eid }
@entries.find &.id.== eid
end
def display_name
cached_display_name = @cached_display_name
return cached_display_name unless cached_display_name.nil?
dn = @title
TitleInfo.new @dir do |info|
info_dn = info.display_name
dn = info_dn unless info_dn.empty?
end
@cached_display_name = dn
dn
end
@@ -168,6 +311,7 @@ class Title
end
def set_display_name(dn)
@cached_display_name = dn
TitleInfo.new @dir do |info|
info.display_name = dn
info.save
@@ -177,11 +321,15 @@ class Title
def set_display_name(entry_name : String, dn)
TitleInfo.new @dir do |info|
info.entry_display_name[entry_name] = dn
@entry_display_name_cache = info.entry_display_name
info.save
end
end
def cover_url
cached_cover_url = @cached_cover_url
return cached_cover_url unless cached_cover_url.nil?
url = "#{Config.current.base_url}img/icon.png"
readable_entries = @entries.select &.err_msg.nil?
if readable_entries.size > 0
@@ -193,10 +341,12 @@ class Title
url = File.join Config.current.base_url, info_url
end
end
@cached_cover_url = url
url
end
def set_cover_url(url : String)
@cached_cover_url = url
TitleInfo.new @dir do |info|
info.cover_url = url
info.save
@@ -206,6 +356,7 @@ class Title
def set_cover_url(entry_name : String, url : String)
TitleInfo.new @dir do |info|
info.entry_cover_url[entry_name] = url
@entry_cover_url_cache = info.entry_cover_url
info.save
end
end
@@ -215,29 +366,30 @@ class Title
@entries.each do |e|
e.save_progress username, e.pages
end
titles.each do |t|
t.read_all username
end
titles.each &.read_all username
end
# Set the reading progress of all entries and nested libraries to 0%
def unread_all(username)
@entries.each do |e|
e.save_progress username, 0
end
titles.each do |t|
t.unread_all username
end
@entries.each &.save_progress(username, 0)
titles.each &.unread_all username
end
def deep_read_page_count(username) : Int32
load_progress_for_all_entries(username).sum +
titles.map { |t| t.deep_read_page_count username }.flatten.sum
key = "#{@id}:#{username}:progress_sum"
sig = Digest::SHA1.hexdigest (entries.map &.id).to_s
cached_sum = LRUCache.get key
return cached_sum[1] if cached_sum.is_a? Tuple(String, Int32) &&
cached_sum[0] == sig
sum = load_progress_for_all_entries(username, nil, true).sum +
titles.flat_map(&.deep_read_page_count username).sum
LRUCache.set generate_cache_entry key, {sig, sum}
sum
end
def deep_total_page_count : Int32
entries.map { |e| e.pages }.sum +
titles.map { |t| t.deep_total_page_count }.flatten.sum
entries.sum(&.pages) +
titles.flat_map(&.deep_total_page_count).sum
end
def load_percentage(username)
@@ -286,13 +438,12 @@ class Title
# use the default (auto, ascending)
# When `opt` is not nil, it saves the options to info.json
def sorted_entries(username, opt : SortOptions? = nil)
cache_key = SortedEntriesCacheEntry.gen_key @id, username, @entries, opt
cached_entries = LRUCache.get cache_key
return cached_entries if cached_entries.is_a? Array(Entry)
if opt.nil?
opt = SortOptions.from_info_json @dir, username
else
TitleInfo.new @dir do |info|
info.sort_by[username] = opt.to_tuple
info.save
end
end
case opt.not_nil!.method
@@ -309,13 +460,13 @@ class Title
ary = @entries.zip(percentage_ary)
.sort { |a_tp, b_tp| (a_tp[1] <=> b_tp[1]).or \
compare_numerically a_tp[0].title, b_tp[0].title }
.map { |tp| tp[0] }
.map &.[0]
else
unless opt.method.auto?
Logger.warn "Unknown sorting method #{opt.not_nil!.method}. Using " \
"Auto instead"
end
sorter = ChapterSorter.new @entries.map { |e| e.title }
sorter = ChapterSorter.new @entries.map &.title
ary = @entries.sort do |a, b|
sorter.compare(a.title, b.title).or \
compare_numerically a.title, b.title
@@ -324,6 +475,7 @@ class Title
ary.reverse! unless opt.not_nil!.ascend
LRUCache.set generate_cache_entry cache_key, ary
ary
end
@@ -381,13 +533,24 @@ class Title
{entry: e, date_added: da_ary[i]}
end
return zip if title_ids.empty?
zip + titles.map { |t| t.deep_entries_with_date_added }.flatten
zip + titles.flat_map &.deep_entries_with_date_added
end
def bulk_progress(action, ids : Array(String), username)
LRUCache.invalidate "#{@id}:#{username}:progress_sum"
parents.each do |parent|
LRUCache.invalidate "#{parent.id}:#{username}:progress_sum"
end
[false, true].each do |ascend|
sorted_entries_cache_key =
SortedEntriesCacheEntry.gen_key @id, username, @entries,
SortOptions.new(SortMethod::Progress, ascend)
LRUCache.invalidate sorted_entries_cache_key
end
selected_entries = ids
.map { |id|
@entries.find { |e| e.id == id }
@entries.find &.id.==(id)
}
.select(Entry)

View File

@@ -1,4 +1,12 @@
SUPPORTED_IMG_TYPES = ["image/jpeg", "image/png", "image/webp"]
SUPPORTED_IMG_TYPES = %w(
image/jpeg
image/png
image/webp
image/apng
image/avif
image/gif
image/svg+xml
)
enum SortMethod
Auto
@@ -88,6 +96,18 @@ class TitleInfo
@@mutex_hash = {} of String => Mutex
def self.new(dir, &)
key = "#{dir}:info.json"
info = LRUCache.get key
if info.is_a? String
begin
instance = TitleInfo.from_json info
instance.dir = dir
yield instance
return
rescue
end
end
if @@mutex_hash[dir]?
mutex = @@mutex_hash[dir]
else
@@ -101,6 +121,7 @@ class TitleInfo
instance = TitleInfo.from_json File.read json_path
end
instance.dir = dir
LRUCache.set generate_cache_entry key, instance.to_json
yield instance
end
end
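# Illustrative sketch (not part of this diff; the path is hypothetical): with
# the LRU cache in place, a later call for the same directory deserializes the
# cached JSON instead of reading info.json from disk again.
TitleInfo.new "/library/Some Title" do |info|
info.display_name = "Some Title (EN)"
info.save # writes info.json and refreshes the cached JSON
end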
@@ -108,5 +129,12 @@ class TitleInfo
def save
json_path = File.join @dir, "info.json"
File.write json_path, self.to_pretty_json
key = "#{@dir}:info.json"
LRUCache.set generate_cache_entry key, self.to_json
end
end
alias ExamineContext = NamedTuple(
cached_contents_signature: Hash(String, String),
deleted_title_ids: Array(String),
deleted_entry_ids: Array(String))
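# A minimal sketch of the assumed usage (not part of this diff): one context is
# built per scan and threaded through the recursive examine calls, so contents
# signatures are computed at most once and deletions are collected in one place.
context : ExamineContext = {
cached_contents_signature: {} of String => String,
deleted_title_ids:         [] of String,
deleted_entry_ids:         [] of String,
}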

View File

@@ -6,26 +6,14 @@ class Logger
SEVERITY_IDS = [0, 4, 5, 2, 3]
COLORS = [:light_cyan, :light_red, :red, :light_yellow, :light_magenta]
getter raw_log = Log.for ""
@@severity : Log::Severity = :info
use_default
def initialize
level = Config.current.log_level
{% begin %}
case level.downcase
when "off"
@@severity = :none
{% for lvl, i in LEVELS %}
when {{lvl}}
@@severity = Log::Severity.new SEVERITY_IDS[{{i}}]
{% end %}
else
raise "Unknown log level #{level}"
end
{% end %}
@log = Log.for("")
@@severity = Logger.get_severity
@backend = Log::IOBackend.new
format_proc = ->(entry : Log::Entry, io : IO) do
@@ -46,7 +34,29 @@ class Logger
end
@backend.formatter = Log::Formatter.new &format_proc
Log.setup @@severity, @backend
Log.setup do |c|
c.bind "*", @@severity, @backend
c.bind "db.*", :error, @backend
end
end
def self.get_severity(level = "") : Log::Severity
if level.empty?
level = Config.current.log_level
end
{% begin %}
case level.downcase
when "off"
return Log::Severity::None
{% for lvl, i in LEVELS %}
when {{lvl}}
return Log::Severity.new SEVERITY_IDS[{{i}}]
{% end %}
else
raise "Unknown log level #{level}"
end
{% end %}
end
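# For example (illustrative, inferred from the code above):
# Logger.get_severity "off"   # => Log::Severity::None
# Logger.get_severity "bogus" # raises "Unknown log level bogus"
# Logger.get_severity         # falls back to Config.current.log_level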
# Ignores @@severity and always log msg
@@ -61,7 +71,7 @@ class Logger
{% for lvl in LEVELS %}
def {{lvl.id}}(msg)
@log.{{lvl.id}} { msg }
raw_log.{{lvl.id}} { msg }
end
def self.{{lvl.id}}(msg)
default.not_nil!.{{lvl.id}} msg

View File

@@ -1,217 +0,0 @@
require "json"
require "csv"
require "../rename"
macro string_properties(names)
{% for name in names %}
property {{name.id}} = ""
{% end %}
end
macro parse_strings_from_json(names)
{% for name in names %}
@{{name.id}} = obj[{{name}}].as_s
{% end %}
end
macro properties_to_hash(names)
{
{% for name in names %}
"{{name.id}}" => @{{name.id}}.to_s,
{% end %}
}
end
module MangaDex
class Chapter
string_properties ["lang_code", "title", "volume", "chapter"]
property manga : Manga
property time = Time.local
property id : String
property full_title = ""
property language = ""
property pages = [] of {String, String} # filename, url
property groups = [] of {Int32, String} # group_id, group_name
def initialize(@id, json_obj : JSON::Any, @manga,
lang : Hash(String, String))
self.parse_json json_obj, lang
end
def to_info_json
JSON.build do |json|
json.object do
{% for name in ["id", "title", "volume", "chapter",
"language", "full_title"] %}
json.field {{name}}, @{{name.id}}
{% end %}
json.field "time", @time.to_unix.to_s
json.field "manga_title", @manga.title
json.field "manga_id", @manga.id
json.field "groups" do
json.object do
@groups.each do |gid, gname|
json.field gname, gid
end
end
end
end
end
end
def parse_json(obj, lang)
parse_strings_from_json ["lang_code", "title", "volume",
"chapter"]
language = lang[@lang_code]?
@language = language if language
@time = Time.unix obj["timestamp"].as_i
suffixes = ["", "_2", "_3"]
suffixes.each do |s|
gid = obj["group_id#{s}"].as_i
next if gid == 0
gname = obj["group_name#{s}"].as_s
@groups << {gid, gname}
end
rename_rule = Rename::Rule.new \
Config.current.mangadex["chapter_rename_rule"].to_s
@full_title = rename rename_rule
rescue e
raise "failed to parse json: #{e}"
end
def rename(rule : Rename::Rule)
hash = properties_to_hash ["id", "title", "volume", "chapter",
"lang_code", "language", "pages"]
hash["groups"] = @groups.map { |g| g[1] }.join ","
rule.render hash
end
end
class Manga
string_properties ["cover_url", "description", "title", "author", "artist"]
property chapters = [] of Chapter
property id : String
def initialize(@id, json_obj : JSON::Any)
self.parse_json json_obj
end
def to_info_json(with_chapters = true)
JSON.build do |json|
json.object do
{% for name in ["id", "title", "description", "author", "artist",
"cover_url"] %}
json.field {{name}}, @{{name.id}}
{% end %}
if with_chapters
json.field "chapters" do
json.array do
@chapters.each do |c|
json.raw c.to_info_json
end
end
end
end
end
end
end
def parse_json(obj)
parse_strings_from_json ["cover_url", "description", "title", "author",
"artist"]
rescue e
raise "failed to parse json: #{e}"
end
def rename(rule : Rename::Rule)
rule.render properties_to_hash ["id", "title", "author", "artist"]
end
end
class API
use_default
def initialize
@base_url = Config.current.mangadex["api_url"].to_s ||
"https://mangadex.org/api/"
@lang = {} of String => String
CSV.each_row {{read_file "src/assets/lang_codes.csv"}} do |row|
@lang[row[1]] = row[0]
end
end
def get(url)
headers = HTTP::Headers{
"User-agent" => "Mangadex.cr",
}
res = HTTP::Client.get url, headers
raise "Failed to get #{url}. [#{res.status_code}] " \
"#{res.status_message}" if !res.success?
JSON.parse res.body
end
def get_manga(id)
obj = self.get File.join @base_url, "manga/#{id}"
if obj["status"]? != "OK"
raise "Expecting `OK` in the `status` field. Got `#{obj["status"]?}`"
end
begin
manga = Manga.new id, obj["manga"]
obj["chapter"].as_h.map do |k, v|
chapter = Chapter.new k, v, manga, @lang
manga.chapters << chapter
end
manga
rescue
raise "Failed to parse JSON"
end
end
def get_chapter(chapter : Chapter)
obj = self.get File.join @base_url, "chapter/#{chapter.id}"
if obj["status"]? == "external"
raise "This chapter is hosted on an external site " \
"#{obj["external"]?}, and Mango does not support " \
"external chapters."
end
if obj["status"]? != "OK"
raise "Expecting `OK` in the `status` field. Got `#{obj["status"]?}`"
end
begin
server = obj["server"].as_s
hash = obj["hash"].as_s
chapter.pages = obj["page_array"].as_a.map do |fn|
{
fn.as_s,
"#{server}#{hash}/#{fn.as_s}",
}
end
rescue
raise "Failed to parse JSON"
end
end
def get_chapter(id : String)
obj = self.get File.join @base_url, "chapter/#{id}"
if obj["status"]? == "external"
raise "This chapter is hosted on an external site " \
"#{obj["external"]?}, and Mango does not support " \
"external chapters."
end
if obj["status"]? != "OK"
raise "Expecting `OK` in the `status` field. Got `#{obj["status"]?}`"
end
manga_id = ""
begin
manga_id = obj["manga_id"].as_i.to_s
rescue
raise "Failed to parse JSON"
end
manga = self.get_manga manga_id
chapter = manga.chapters.find { |c| c.id == id }.not_nil!
self.get_chapter chapter
chapter
end
end
end

View File

@@ -1,167 +0,0 @@
require "./api"
require "compress/zip"
module MangaDex
class PageJob
property success = false
property url : String
property filename : String
property writer : Compress::Zip::Writer
property tries_remaning : Int32
def initialize(@url, @filename, @writer, @tries_remaning)
end
end
class Downloader < Queue::Downloader
@wait_seconds : Int32 = Config.current.mangadex["download_wait_seconds"]
.to_i32
@retries : Int32 = Config.current.mangadex["download_retries"].to_i32
use_default
def initialize
@api = API.default
super
end
def pop : Queue::Job?
job = nil
MainFiber.run do
DB.open "sqlite3://#{@queue.path}" do |db|
begin
db.query_one "select * from queue where id not like '%-%' " \
"and (status = 0 or status = 1) " \
"order by time limit 1" do |res|
job = Queue::Job.from_query_result res
end
rescue
end
end
end
job
end
private def download(job : Queue::Job)
@downloading = true
@queue.set_status Queue::JobStatus::Downloading, job
begin
chapter = @api.get_chapter(job.id)
rescue e
Logger.error e
@queue.set_status Queue::JobStatus::Error, job
unless e.message.nil?
@queue.add_message e.message.not_nil!, job
end
@downloading = false
return
end
@queue.set_pages chapter.pages.size, job
lib_dir = @library_path
rename_rule = Rename::Rule.new \
Config.current.mangadex["manga_rename_rule"].to_s
manga_dir = File.join lib_dir, chapter.manga.rename rename_rule
unless File.exists? manga_dir
Dir.mkdir_p manga_dir
end
zip_path = File.join manga_dir, "#{job.title}.cbz.part"
# Find the number of digits needed to store the number of pages
len = Math.log10(chapter.pages.size).to_i + 1
writer = Compress::Zip::Writer.new zip_path
# Create a buffered channel. It works as an FIFO queue
channel = Channel(PageJob).new chapter.pages.size
spawn do
chapter.pages.each_with_index do |tuple, i|
fn, url = tuple
ext = File.extname fn
fn = "#{i.to_s.rjust len, '0'}#{ext}"
page_job = PageJob.new url, fn, writer, @retries
Logger.debug "Downloading #{url}"
loop do
sleep @wait_seconds.seconds
download_page page_job
break if page_job.success ||
page_job.tries_remaning <= 0
page_job.tries_remaning -= 1
Logger.warn "Failed to download page #{url}. " \
"Retrying... Remaining retries: " \
"#{page_job.tries_remaning}"
end
channel.send page_job
break unless @queue.exists? job
end
end
spawn do
page_jobs = [] of PageJob
chapter.pages.size.times do
page_job = channel.receive
break unless @queue.exists? job
Logger.debug "[#{page_job.success ? "success" : "failed"}] " \
"#{page_job.url}"
page_jobs << page_job
if page_job.success
@queue.add_success job
else
@queue.add_fail job
msg = "Failed to download page #{page_job.url}"
@queue.add_message msg, job
Logger.error msg
end
end
unless @queue.exists? job
Logger.debug "Download cancelled"
@downloading = false
next
end
fail_count = page_jobs.count { |j| !j.success }
Logger.debug "Download completed. " \
"#{fail_count}/#{page_jobs.size} failed"
writer.close
filename = File.join File.dirname(zip_path), File.basename(zip_path,
".part")
File.rename zip_path, filename
Logger.debug "cbz File created at #{filename}"
zip_exception = validate_archive filename
if !zip_exception.nil?
@queue.add_message "The downloaded archive is corrupted. " \
"Error: #{zip_exception}", job
@queue.set_status Queue::JobStatus::Error, job
elsif fail_count > 0
@queue.set_status Queue::JobStatus::MissingPages, job
else
@queue.set_status Queue::JobStatus::Completed, job
end
@downloading = false
end
end
private def download_page(job : PageJob)
Logger.debug "downloading #{job.url}"
headers = HTTP::Headers{
"User-agent" => "Mangadex.cr",
}
begin
HTTP::Client.get job.url, headers do |res|
unless res.success?
raise "Failed to download page #{job.url}. " \
"[#{res.status_code}] #{res.status_message}"
end
job.writer.add job.filename, res.body_io
end
job.success = true
rescue e
Logger.error e
job.success = false
end
end
end
end

View File

@@ -2,13 +2,12 @@ require "./config"
require "./queue"
require "./server"
require "./main_fiber"
require "./mangadex/*"
require "./plugin/*"
require "option_parser"
require "clim"
require "tallboy"
MANGO_VERSION = "0.18.1"
MANGO_VERSION = "0.24.0"
# From http://www.network-science.de/ascii/
BANNER = %{
@@ -56,14 +55,20 @@ class CLI < Clim
Config.load(opts.config).set_current
# Initialize main components
LRUCache.init
Storage.default
Queue.default
Library.load_instance
Library.default
MangaDex::Downloader.default
Plugin::Downloader.default
spawn do
Server.new.start
begin
Server.new.start
rescue e
Logger.fatal e
Process.exit 1
end
end
MainFiber.start_and_block

View File

@@ -23,11 +23,6 @@ class Plugin
job
end
private def process_filename(str)
return "_" if str == ".."
str.gsub "/", "_"
end
private def download(job : Queue::Job)
@downloading = true
@queue.set_status Queue::JobStatus::Downloading, job
@@ -42,8 +37,8 @@ class Plugin
pages = info["pages"].as_i
manga_title = process_filename job.manga_title
chapter_title = process_filename info["title"].as_s
manga_title = sanitize_filename job.manga_title
chapter_title = sanitize_filename info["title"].as_s
@queue.set_pages pages, job
lib_dir = @library_path
@@ -68,7 +63,7 @@ class Plugin
while page = plugin.next_page
break unless @queue.exists? job
fn = process_filename page["filename"].as_s
fn = sanitize_filename page["filename"].as_s
url = page["url"].as_s
headers = HTTP::Headers.new

View File

@@ -117,7 +117,7 @@ class Plugin
def initialize(id : String)
Plugin.build_info_ary
@info = @@info_ary.find { |i| i.id == id }
@info = @@info_ary.find &.id.== id
if @info.nil?
raise Error.new "Plugin with ID #{id} not found"
end

View File

@@ -303,12 +303,12 @@ class Queue
end
def pause
@downloaders.each { |d| d.stopped = true }
@downloaders.each &.stopped=(true)
@paused = true
end
def resume
@downloaders.each { |d| d.stopped = false }
@downloaders.each &.stopped=(false)
@paused = false
end

View File

@@ -35,15 +35,15 @@ module Rename
class Group < Base(Pattern | String)
def render(hash : VHash)
return "" if @ary.select(&.is_a? Pattern)
return "" if @ary.select(Pattern)
.any? &.as(Pattern).render(hash).empty?
@ary.map do |e|
@ary.join do |e|
if e.is_a? Pattern
e.render hash
else
e
end
end.join
end
end
end
@@ -129,13 +129,13 @@ module Rename
end
def render(hash : VHash)
str = @ary.map do |e|
str = @ary.join do |e|
if e.is_a? String
e
else
e.render hash
end
end.join.strip
end.strip
post_process str
end

View File

@@ -1,6 +1,9 @@
struct AdminRouter
def initialize
get "/admin" do |env|
storage = Storage.default
missing_count = storage.missing_titles.size +
storage.missing_entries.size
layout "admin"
end
@@ -66,5 +69,9 @@ struct AdminRouter
mangadex_base_url = Config.current.mangadex["base_url"]
layout "download-manager"
end
get "/admin/missing" do |env|
layout "missing-items"
end
end
end

View File

@@ -1,6 +1,6 @@
require "../mangadex/*"
require "../upload"
require "koa"
require "digest"
struct APIRouter
@@api_json : String?
@@ -10,7 +10,7 @@ struct APIRouter
macro s(fields)
{
{% for field in fields %}
{{field}} => "string",
{{field}} => String,
{% end %}
}
end
@@ -33,150 +33,43 @@ struct APIRouter
MD
Koa.cookie_auth "cookie", "mango-sessid-#{Config.current.port}"
Koa.global_tag "admin", desc: <<-MD
Koa.define_tag "admin", desc: <<-MD
These endpoints are only accessible to users with admin access. A non-admin user will get HTTP 403 when calling them.
MD
Koa.binary "binary", desc: "A binary file"
Koa.array "entryAry", "$entry", desc: "An array of entries"
Koa.array "titleAry", "$title", desc: "An array of titles"
Koa.array "strAry", "string", desc: "An array of strings"
Koa.schema "entry", {
"pages" => Int32,
"mtime" => Int64,
}.merge(s %w(zip_path title size id title_id display_name cover_url)),
desc: "An entry in a book"
entry_schema = {
"pages" => "integer",
"mtime" => "integer",
}.merge s %w(zip_path title size id title_id display_name cover_url)
Koa.object "entry", entry_schema, desc: "An entry in a book"
title_schema = {
"mtime" => "integer",
"entries" => "$entryAry",
"titles" => "$titleAry",
"parents" => "$strAry",
}.merge s %w(dir title id display_name cover_url)
Koa.object "title", title_schema,
Koa.schema "title", {
"mtime" => Int64,
"entries" => ["entry"],
"titles" => ["title"],
"parents" => [String],
}.merge(s %w(dir title id display_name cover_url)),
desc: "A manga title (a collection of entries and sub-titles)"
Koa.object "library", {
"dir" => "string",
"titles" => "$titleAry",
}, desc: "A library containing a list of top-level titles"
Koa.object "scanResult", {
"milliseconds" => "integer",
"titles" => "integer",
}
Koa.object "progressResult", {
"progress" => "number",
}
Koa.object "result", {
"success" => "boolean",
"error" => "string?",
}
mc_schema = {
"groups" => "object",
}.merge s %w(id title volume chapter language full_title time manga_title manga_id)
Koa.object "mangadexChapter", mc_schema, desc: "A MangaDex chapter"
Koa.array "chapterAry", "$mangadexChapter"
mm_schema = {
"chapers" => "$chapterAry",
}.merge s %w(id title description author artist cover_url)
Koa.object "mangadexManga", mm_schema, desc: "A MangaDex manga"
Koa.object "chaptersObj", {
"chapters" => "$chapterAry",
}
Koa.object "successFailCount", {
"success" => "integer",
"fail" => "integer",
}
job_schema = {
"pages" => "integer",
"success_count" => "integer",
"fail_count" => "integer",
"time" => "integer",
}.merge s %w(id manga_id title manga_title status_message status)
Koa.object "job", job_schema, desc: "A download job in the queue"
Koa.array "jobAry", "$job"
Koa.object "jobs", {
"success" => "boolean",
"paused" => "boolean",
"jobs" => "$jobAry",
}
Koa.object "binaryUpload", {
"file" => "$binary",
}
Koa.object "pluginListBody", {
"plugin" => "string",
"query" => "string",
}
Koa.object "pluginChapter", {
"id" => "string",
"title" => "string",
}
Koa.array "pluginChapterAry", "$pluginChapter"
Koa.object "pluginList", {
"success" => "boolean",
"chapters" => "$pluginChapterAry?",
"title" => "string?",
"error" => "string?",
}
Koa.object "pluginDownload", {
"plugin" => "string",
"title" => "string",
"chapters" => "$pluginChapterAry",
}
Koa.object "dimension", {
"width" => "integer",
"height" => "integer",
}
Koa.array "dimensionAry", "$dimension"
Koa.object "dimensionResult", {
"success" => "boolean",
"dimensions" => "$dimensionAry?",
"margin" => "number",
"error" => "string?",
}
Koa.object "ids", {
"ids" => "$strAry",
}
Koa.object "tagsResult", {
"success" => "boolean",
"tags" => "$strAry?",
"error" => "string?",
Koa.schema "result", {
"success" => Bool,
"error" => String?,
}
Koa.describe "Returns a page in a manga entry"
Koa.path "tid", desc: "Title ID"
Koa.path "eid", desc: "Entry ID"
Koa.path "page", type: "integer", desc: "The page number to return (starts from 1)"
Koa.response 200, ref: "$binary", media_type: "image/*"
Koa.path "page", schema: Int32, desc: "The page number to return (starts from 1)"
Koa.response 200, schema: Bytes, media_type: "image/*"
Koa.response 500, "Page not found or not readable"
Koa.response 304, "Page not modified (only available when `If-None-Match` is set)"
Koa.tag "reader"
get "/api/page/:tid/:eid/:page" do |env|
begin
tid = env.params.url["tid"]
eid = env.params.url["eid"]
page = env.params.url["page"].to_i
prev_e_tag = env.request.headers["If-None-Match"]?
title = Library.default.get_title tid
raise "Title ID `#{tid}` not found" if title.nil?
@@ -186,7 +79,15 @@ struct APIRouter
raise "Failed to load page #{page} of " \
"`#{title.title}/#{entry.title}`" if img.nil?
send_img env, img
e_tag = Digest::SHA1.hexdigest img.data
if prev_e_tag == e_tag
env.response.status_code = 304
""
else
env.response.headers["ETag"] = e_tag
env.response.headers["Cache-Control"] = "public, max-age=86400"
send_img env, img
end
rescue e
Logger.error e
env.response.status_code = 500
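# Illustrative client-side sketch (not part of this diff; the IDs, host, and
# port are hypothetical, and an authenticated session or disabled login is
# assumed): send the returned ETag back via `If-None-Match` to get a 304
# instead of re-downloading the page image.
require "http/client"
first = HTTP::Client.get "http://localhost:9000/api/page/TID/EID/1"
if etag = first.headers["ETag"]?
again = HTTP::Client.get "http://localhost:9000/api/page/TID/EID/1",
headers: HTTP::Headers{"If-None-Match" => etag}
again.status_code # => 304 when the page has not changed
end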
@@ -197,12 +98,15 @@ struct APIRouter
Koa.describe "Returns the cover image of a manga entry"
Koa.path "tid", desc: "Title ID"
Koa.path "eid", desc: "Entry ID"
Koa.response 200, ref: "$binary", media_type: "image/*"
Koa.response 200, schema: Bytes, media_type: "image/*"
Koa.response 304, "Page not modified (only available when `If-None-Match` is set)"
Koa.response 500, "Page not found or not readable"
Koa.tag "library"
get "/api/cover/:tid/:eid" do |env|
begin
tid = env.params.url["tid"]
eid = env.params.url["eid"]
prev_e_tag = env.request.headers["If-None-Match"]?
title = Library.default.get_title tid
raise "Title ID `#{tid}` not found" if title.nil?
@@ -213,7 +117,14 @@ struct APIRouter
raise "Failed to get cover of `#{title.title}/#{entry.title}`" \
if img.nil?
send_img env, img
e_tag = Digest::SHA1.hexdigest img.data
if prev_e_tag == e_tag
env.response.status_code = 304
""
else
env.response.headers["ETag"] = e_tag
send_img env, img
end
rescue e
Logger.error e
env.response.status_code = 500
@@ -221,17 +132,39 @@ struct APIRouter
end
end
Koa.describe "Returns the book with title `tid`"
Koa.describe "Returns the book with title `tid`", <<-MD
- Supply the `slim` query parameter to strip away "display_name", "cover_url", and "mtime" from the returned object to speed up the loading time
- Supply the `depth` query parameter to control the depth of nested titles to return.
- When `depth` is 1, returns the requested title and sub-titles/entries one level in it
- When `depth` is 0, returns the requested title without its sub-titles/entries
- When `depth` is N, returns the requested title and sub-titles/entries N levels in it
- When `depth` is negative, returns the requested title and all sub-titles/entries in it
MD
Koa.path "tid", desc: "Title ID"
Koa.response 200, ref: "$title"
Koa.query "slim"
Koa.query "depth"
Koa.query "sort", desc: "Sorting option for entries. Can be one of 'auto', 'title', 'progress', 'time_added' and 'time_modified'"
Koa.query "ascend", desc: "Sorting direction for entries. Set to 0 for the descending order. Doesn't work without specifying 'sort'"
Koa.response 200, schema: "title"
Koa.response 404, "Title not found"
Koa.tag "library"
get "/api/book/:tid" do |env|
begin
username = get_username env
sort_opt = SortOptions.new
get_sort_opt
tid = env.params.url["tid"]
title = Library.default.get_title tid
raise "Title ID `#{tid}` not found" if title.nil?
send_json env, title.to_json
slim = !env.params.query["slim"]?.nil?
depth = env.params.query["depth"]?.try(&.to_i?) || -1
send_json env, title.build_json(slim: slim, depth: depth,
sort_context: {username: username,
opt: sort_opt})
rescue e
Logger.error e
env.response.status_code = 404
@@ -239,15 +172,34 @@ struct APIRouter
end
end
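# Illustrative request sketch (not part of this diff; the title ID, host, and
# port are hypothetical, and an authenticated session or disabled login is
# assumed). `slim` strips display_name/cover_url/mtime, `depth=1` limits
# nesting to one level, and `sort=title&ascend=0` sorts entries by title in
# descending order.
require "http/client"
require "json"
res = HTTP::Client.get "http://localhost:9000/api/book/TID?slim&depth=1&sort=title&ascend=0"
book = JSON.parse res.body
book["entries"].as_a.size # number of direct entries returned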
Koa.describe "Returns the entire library with all titles and entries"
Koa.response 200, ref: "$library"
Koa.describe "Returns the entire library with all titles and entries", <<-MD
- Supply the `slim` query parameter to strip away "display_name", "cover_url", and "mtime" from the returned object to speed up the loading time
- Supply the `depth` query parameter to control the depth of nested titles to return.
- When `depth` is 1, returns the top-level titles and sub-titles/entries one level in them
- When `depth` is 0, returns the top-level titles without their sub-titles/entries
- When `depth` is N, returns the top-level titles and sub-titles/entries N levels in them
- When `depth` is negative, returns the entire library
MD
Koa.query "slim"
Koa.query "depth"
Koa.response 200, schema: {
"dir" => String,
"titles" => ["title"],
}
Koa.tag "library"
get "/api/library" do |env|
send_json env, Library.default.to_json
slim = !env.params.query["slim"]?.nil?
depth = env.params.query["depth"]?.try(&.to_i?) || -1
send_json env, Library.default.build_json(slim: slim, depth: depth)
end
Koa.describe "Triggers a library scan"
Koa.tag "admin"
Koa.response 200, ref: "$scanResult"
Koa.tags ["admin", "library"]
Koa.response 200, schema: {
"milliseconds" => Float64,
"titles" => Int32,
}
post "/api/admin/scan" do |env|
start = Time.utc
Library.default.scan
@@ -259,8 +211,10 @@ struct APIRouter
end
Koa.describe "Returns the thumbnail generation progress between 0 and 1"
Koa.tag "admin"
Koa.response 200, ref: "$progressResult"
Koa.tags ["admin", "library"]
Koa.response 200, schema: {
"progress" => Float64,
}
get "/api/admin/thumbnail_progress" do |env|
send_json env, {
"progress" => Library.default.thumbnail_generation_progress,
@@ -268,7 +222,7 @@ struct APIRouter
end
Koa.describe "Triggers a thumbnail generation"
Koa.tag "admin"
Koa.tags ["admin", "library"]
post "/api/admin/generate_thumbnails" do |env|
spawn do
Library.default.generate_thumbnails
@@ -276,8 +230,8 @@ struct APIRouter
end
Koa.describe "Deletes a user with `username`"
Koa.tag "admin"
Koa.response 200, ref: "$result"
Koa.tags ["admin", "users"]
Koa.response 200, schema: "result"
delete "/api/admin/user/delete/:username" do |env|
begin
username = env.params.url["username"]
@@ -304,7 +258,8 @@ struct APIRouter
Koa.path "tid", desc: "Title ID"
Koa.query "eid", desc: "Entry ID", required: false
Koa.path "page", desc: "The new page number indicating the progress"
Koa.response 200, ref: "$result"
Koa.response 200, schema: "result"
Koa.tag "progress"
put "/api/progress/:tid/:page" do |env|
begin
username = get_username env
@@ -335,8 +290,11 @@ struct APIRouter
Koa.describe "Updates the reading progress of multiple entries in a title"
Koa.path "action", desc: "The action to perform. Can be either `read` or `unread`"
Koa.path "tid", desc: "Title ID"
Koa.body ref: "$ids", desc: "An array of entry IDs"
Koa.response 200, ref: "$result"
Koa.body schema: {
"ids" => [String],
}, desc: "An array of entry IDs"
Koa.response 200, schema: "result"
Koa.tag "progress"
put "/api/bulk_progress/:action/:tid" do |env|
begin
username = get_username env
@@ -362,11 +320,11 @@ struct APIRouter
Koa.describe "Sets the display name of a title or an entry", <<-MD
When `eid` is provided, apply the display name to the entry. Otherwise, apply the display name to the title identified by `tid`.
MD
Koa.tag "admin"
Koa.tags ["admin", "library"]
Koa.path "tid", desc: "Title ID"
Koa.query "eid", desc: "Entry ID", required: false
Koa.path "name", desc: "The new display name"
Koa.response 200, ref: "$result"
Koa.response 200, schema: "result"
put "/api/admin/display_name/:tid/:name" do |env|
begin
title = (Library.default.get_title env.params.url["tid"])
@@ -390,60 +348,12 @@ struct APIRouter
end
end
Koa.describe "Returns a MangaDex manga identified by `id`", <<-MD
On error, returns a JSON that contains the error message in the `error` field.
MD
Koa.tag "admin"
Koa.path "id", desc: "A MangaDex manga ID"
Koa.response 200, ref: "$mangadexManga"
get "/api/admin/mangadex/manga/:id" do |env|
begin
id = env.params.url["id"]
api = MangaDex::API.default
manga = api.get_manga id
send_json env, manga.to_info_json
rescue e
Logger.error e
send_json env, {"error" => e.message}.to_json
end
end
Koa.describe "Adds a list of MangaDex chapters to the download queue", <<-MD
On error, returns a JSON that contains the error message in the `error` field.
MD
Koa.tag "admin"
Koa.body ref: "$chaptersObj"
Koa.response 200, ref: "$successFailCount"
post "/api/admin/mangadex/download" do |env|
begin
chapters = env.params.json["chapters"].as(Array).map { |c| c.as_h }
jobs = chapters.map { |chapter|
Queue::Job.new(
chapter["id"].as_s,
chapter["manga_id"].as_s,
chapter["full_title"].as_s,
chapter["manga_title"].as_s,
Queue::JobStatus::Pending,
Time.unix chapter["time"].as_s.to_i
)
}
inserted_count = Queue.default.push jobs
send_json env, {
"success": inserted_count,
"fail": jobs.size - inserted_count,
}.to_json
rescue e
Logger.error e
send_json env, {"error" => e.message}.to_json
end
end
ws "/api/admin/mangadex/queue" do |socket, env|
interval_raw = env.params.query["interval"]?
interval = (interval_raw.to_i? if interval_raw) || 5
loop do
socket.send({
"jobs" => Queue.default.get_all,
"jobs" => Queue.default.get_all.reverse,
"paused" => Queue.default.paused?,
}.to_json)
sleep interval.seconds
@@ -453,17 +363,27 @@ struct APIRouter
Koa.describe "Returns the current download queue", <<-MD
On error, returns a JSON that contains the error message in the `error` field.
MD
Koa.tag "admin"
Koa.response 200, ref: "$jobs"
Koa.tags ["admin", "downloader"]
Koa.response 200, schema: {
"success" => Bool,
"error" => String?,
"paused" => Bool?,
"jobs?" => [{
"pages" => Int32,
"success_count" => Int32,
"fail_count" => Int32,
"time" => Int64,
}.merge(s %w(id manga_id title manga_title status_message status))],
}
get "/api/admin/mangadex/queue" do |env|
begin
jobs = Queue.default.get_all
send_json env, {
"jobs" => jobs,
"jobs" => Queue.default.get_all.reverse,
"paused" => Queue.default.paused?,
"success" => true,
}.to_json
rescue e
Logger.error e
send_json env, {
"success" => false,
"error" => e.message,
@@ -480,10 +400,10 @@ struct APIRouter
When `action` is set to `retry`, the behavior depends on `id`. If `id` is provided, restarts the job identified by the ID. Otherwise, retries all jobs in the `Error` or `MissingPages` status in the queue.
MD
Koa.tag "admin"
Koa.tags ["admin", "downloader"]
Koa.path "action", desc: "The action to perform. It should be one of the followins: `delete`, `retry`, `pause` and `resume`."
Koa.query "id", required: false, desc: "A job ID"
Koa.response 200, ref: "$result"
Koa.response 200, schema: "result"
post "/api/admin/mangadex/queue/:action" do |env|
begin
action = env.params.url["action"]
@@ -511,6 +431,7 @@ struct APIRouter
send_json env, {"success" => true}.to_json
rescue e
Logger.error e
send_json env, {
"success" => false,
"error" => e.message,
@@ -532,8 +453,10 @@ struct APIRouter
When `eid` is omitted, the new cover image will be applied to the title. Otherwise, applies the image to the specified entry.
MD
Koa.tag "admin"
Koa.body type: "multipart/form-data", ref: "$binaryUpload"
Koa.response 200, ref: "$result"
Koa.body media_type: "multipart/form-data", schema: {
"file" => Bytes,
}
Koa.response 200, schema: "result"
post "/api/admin/upload/:target" do |env|
begin
target = env.params.url["target"]
@@ -581,6 +504,7 @@ struct APIRouter
raise "No part with name `file` found"
rescue e
Logger.error e
send_json env, {
"success" => false,
"error" => e.message,
@@ -589,9 +513,18 @@ struct APIRouter
end
Koa.describe "Lists the chapters in a title from a plugin"
Koa.tag "admin"
Koa.body ref: "$pluginListBody"
Koa.response 200, ref: "$pluginList"
Koa.tags ["admin", "downloader"]
Koa.query "plugin", schema: String
Koa.query "query", schema: String
Koa.response 200, schema: {
"success" => Bool,
"error" => String?,
"chapters?" => [{
"id" => String,
"title" => String,
}],
"title" => String?,
}
get "/api/admin/plugin/list" do |env|
begin
query = env.params.query["query"].as String
@@ -607,6 +540,7 @@ struct APIRouter
"title" => title,
}.to_json
rescue e
Logger.error e
send_json env, {
"success" => false,
"error" => e.message,
@@ -615,9 +549,19 @@ struct APIRouter
end
Koa.describe "Adds a list of chapters from a plugin to the download queue"
Koa.tag "admin"
Koa.body ref: "$pluginDownload"
Koa.response 200, ref: "$successFailCount"
Koa.tags ["admin", "downloader"]
Koa.body schema: {
"plugin" => String,
"title" => String,
"chapters" => [{
"id" => String,
"title" => String,
}],
}
Koa.response 200, schema: {
"success" => Int32,
"fail" => Int32,
}
post "/api/admin/plugin/download" do |env|
begin
plugin = Plugin.new env.params.json["plugin"].as String
@@ -640,6 +584,7 @@ struct APIRouter
"fail": jobs.size - inserted_count,
}.to_json
rescue e
Logger.error e
send_json env, {
"success" => false,
"error" => e.message,
@@ -650,24 +595,43 @@ struct APIRouter
Koa.describe "Returns the image dimensions of all pages in an entry"
Koa.path "tid", desc: "A title ID"
Koa.path "eid", desc: "An entry ID"
Koa.response 200, ref: "$dimensionResult"
Koa.tag "reader"
Koa.response 200, schema: {
"success" => Bool,
"error" => String?,
"dimensions?" => [{
"width" => Int32,
"height" => Int32,
}],
}
Koa.response 304, "Not modified (only available when `If-None-Match` is set)"
get "/api/dimensions/:tid/:eid" do |env|
begin
tid = env.params.url["tid"]
eid = env.params.url["eid"]
prev_e_tag = env.request.headers["If-None-Match"]?
title = Library.default.get_title tid
raise "Title ID `#{tid}` not found" if title.nil?
entry = title.get_entry eid
raise "Entry ID `#{eid}` of `#{title.title}` not found" if entry.nil?
sizes = entry.page_dimensions
send_json env, {
"success" => true,
"dimensions" => sizes,
"margin" => Config.current.page_margin,
}.to_json
file_hash = Digest::SHA1.hexdigest (entry.zip_path + entry.mtime.to_s)
e_tag = "W/#{file_hash}"
if e_tag == prev_e_tag
env.response.status_code = 304
""
else
sizes = entry.page_dimensions
env.response.headers["ETag"] = e_tag
env.response.headers["Cache-Control"] = "public, max-age=86400"
send_json env, {
"success" => true,
"dimensions" => sizes,
}.to_json
end
rescue e
Logger.error e
send_json env, {
"success" => false,
"error" => e.message,
@@ -678,8 +642,9 @@ struct APIRouter
Koa.describe "Downloads an entry"
Koa.path "tid", desc: "A title ID"
Koa.path "eid", desc: "An entry ID"
Koa.response 200, ref: "$binary"
Koa.response 200, schema: Bytes
Koa.response 404, "Entry not found"
Koa.tags ["library", "reader"]
get "/api/download/:tid/:eid" do |env|
begin
title = (Library.default.get_title env.params.url["tid"]).not_nil!
@@ -694,7 +659,12 @@ struct APIRouter
Koa.describe "Gets the tags of a title"
Koa.path "tid", desc: "A title ID"
Koa.response 200, ref: "$tagsResult"
Koa.response 200, schema: {
"success" => Bool,
"error" => String?,
"tags" => [String?],
}
Koa.tags ["library", "tags"]
get "/api/tags/:tid" do |env|
begin
title = (Library.default.get_title env.params.url["tid"]).not_nil!
@@ -713,10 +683,33 @@ struct APIRouter
end
end
Koa.describe "Returns all tags"
Koa.response 200, schema: {
"success" => Bool,
"error" => String?,
"tags" => [String?],
}
Koa.tags ["library", "tags"]
get "/api/tags" do |env|
begin
tags = Storage.default.list_tags
send_json env, {
"success" => true,
"tags" => tags,
}.to_json
rescue e
Logger.error e
send_json env, {
"success" => false,
"error" => e.message,
}.to_json
end
end
Koa.describe "Adds a new tag to a title"
Koa.path "tid", desc: "A title ID"
Koa.response 200, ref: "$result"
Koa.tag "admin"
Koa.response 200, schema: "result"
Koa.tags ["admin", "library", "tags"]
put "/api/admin/tags/:tid/:tag" do |env|
begin
title = (Library.default.get_title env.params.url["tid"]).not_nil!
@@ -738,8 +731,8 @@ struct APIRouter
Koa.describe "Deletes a tag from a title"
Koa.path "tid", desc: "A title ID"
Koa.response 200, ref: "$result"
Koa.tag "admin"
Koa.response 200, schema: "result"
Koa.tags ["admin", "library", "tags"]
delete "/api/admin/tags/:tid/:tag" do |env|
begin
title = (Library.default.get_title env.params.url["tid"]).not_nil!
@@ -759,6 +752,142 @@ struct APIRouter
end
end
Koa.describe "Lists all missing titles"
Koa.response 200, schema: {
"success" => Bool,
"error" => String?,
"titles?" => [{
"path" => String,
"id" => String,
"signature" => String,
}],
}
Koa.tags ["admin", "library"]
get "/api/admin/titles/missing" do |env|
begin
send_json env, {
"success" => true,
"error" => nil,
"titles" => Storage.default.missing_titles,
}.to_json
rescue e
Logger.error e
send_json env, {
"success" => false,
"error" => e.message,
}.to_json
end
end
Koa.describe "Lists all missing entries"
Koa.response 200, schema: {
"success" => Bool,
"error" => String?,
"entries?" => [{
"path" => String,
"id" => String,
"signature" => String,
}],
}
Koa.tags ["admin", "library"]
get "/api/admin/entries/missing" do |env|
begin
send_json env, {
"success" => true,
"error" => nil,
"entries" => Storage.default.missing_entries,
}.to_json
rescue e
Logger.error e
send_json env, {
"success" => false,
"error" => e.message,
}.to_json
end
end
Koa.describe "Deletes all missing titles"
Koa.response 200, schema: "result"
Koa.tags ["admin", "library"]
delete "/api/admin/titles/missing" do |env|
begin
Storage.default.delete_missing_title
send_json env, {
"success" => true,
"error" => nil,
}.to_json
rescue e
Logger.error e
send_json env, {
"success" => false,
"error" => e.message,
}.to_json
end
end
Koa.describe "Deletes all missing entries"
Koa.response 200, schema: "result"
Koa.tags ["admin", "library"]
delete "/api/admin/entries/missing" do |env|
begin
Storage.default.delete_missing_entry
send_json env, {
"success" => true,
"error" => nil,
}.to_json
rescue e
Logger.error e
send_json env, {
"success" => false,
"error" => e.message,
}.to_json
end
end
Koa.describe "Deletes a missing title identified by `tid`", <<-MD
Does nothing if the given `tid` is not found or if the title is not missing.
MD
Koa.response 200, schema: "result"
Koa.tags ["admin", "library"]
delete "/api/admin/titles/missing/:tid" do |env|
begin
tid = env.params.url["tid"]
Storage.default.delete_missing_title tid
send_json env, {
"success" => true,
"error" => nil,
}.to_json
rescue e
Logger.error e
send_json env, {
"success" => false,
"error" => e.message,
}.to_json
end
end
Koa.describe "Deletes a missing entry identified by `eid`", <<-MD
Does nothing if the given `eid` is not found or if the entry is not missing.
MD
Koa.response 200, schema: "result"
Koa.tags ["admin", "library"]
delete "/api/admin/entries/missing/:eid" do |env|
begin
eid = env.params.url["eid"]
Storage.default.delete_missing_entry eid
send_json env, {
"success" => true,
"error" => nil,
}.to_json
rescue e
Logger.error e
send_json env, {
"success" => false,
"error" => e.message,
}.to_json
end
end
doc = Koa.generate
@@api_json = doc.to_json if doc

View File

@@ -30,7 +30,8 @@ struct MainRouter
else
redirect env, "/"
end
rescue
rescue e
Logger.error e
redirect env, "/login"
end
end
@@ -40,7 +41,7 @@ struct MainRouter
username = get_username env
sort_opt = SortOptions.from_info_json Library.default.dir, username
get_sort_opt
get_and_save_sort_opt Library.default.dir
titles = Library.default.sorted_titles username, sort_opt
percentage = titles.map &.load_percentage username
@@ -58,12 +59,12 @@ struct MainRouter
username = get_username env
sort_opt = SortOptions.from_info_json title.dir, username
get_sort_opt
get_and_save_sort_opt title.dir
entries = title.sorted_entries username, sort_opt
percentage = title.load_percentage_for_all_entries username, sort_opt
title_percentage = title.titles.map &.load_percentage username
layout "title"
rescue e
Logger.error e
@@ -71,11 +72,6 @@ struct MainRouter
end
end
get "/download" do |env|
mangadex_base_url = Config.current.mangadex["base_url"]
layout "download"
end
get "/download/plugins" do |env|
begin
id = env.params.query["plugin"]?
@@ -103,7 +99,7 @@ struct MainRouter
recently_added = Library.default.get_recently_added_entries username
start_reading = Library.default.get_start_reading_titles username
titles = Library.default.titles
new_user = !titles.any? { |t| t.load_percentage(username) > 0 }
new_user = !titles.any? &.load_percentage(username).> 0
empty_library = titles.size == 0
layout "home"
rescue e
@@ -154,6 +150,7 @@ struct MainRouter
end
get "/api" do |env|
base_url = Config.current.base_url
render "src/views/api.html.ecr"
end
end

View File

@@ -30,6 +30,11 @@ struct ReaderRouter
title = (Library.default.get_title env.params.url["title"]).not_nil!
entry = (title.get_entry env.params.url["entry"]).not_nil!
sort_opt = SortOptions.from_info_json title.dir, username
get_sort_opt
entries = title.sorted_entries username, sort_opt
page_idx = env.params.url["page"].to_i
if page_idx > entry.pages || page_idx <= 0
raise "Page #{page_idx} not found."
@@ -37,10 +42,12 @@ struct ReaderRouter
exit_url = "#{base_url}book/#{title.id}"
next_entry_url = nil
next_entry = entry.next_entry username
unless next_entry.nil?
next_entry_url = "#{base_url}reader/#{title.id}/#{next_entry.id}"
next_entry_url = entry.next_entry(username).try do |e|
"#{base_url}reader/#{title.id}/#{e.id}"
end
previous_entry_url = entry.previous_entry(username).try do |e|
"#{base_url}reader/#{title.id}/#{e.id}"
end
render "src/views/reader.html.ecr"

View File

@@ -7,10 +7,6 @@ require "./routes/*"
class Server
def initialize
error 403 do |env|
message = "HTTP 403: You are not authorized to visit #{env.request.path}"
layout "message"
end
error 404 do |env|
message = "HTTP 404: Mango cannot find the page #{env.request.path}"
layout "message"
@@ -53,6 +49,7 @@ class Server
{% if flag?(:release) %}
Kemal.config.env = "production"
{% end %}
Kemal.config.host_binding = Config.current.host
Kemal.config.port = Config.current.port
Kemal.run
end

View File

@@ -3,6 +3,8 @@ require "crypto/bcrypt"
require "uuid"
require "base64"
require "./util/*"
require "mg"
require "../migration/*"
def hash_password(pw)
Crypto::Bcrypt::Password.create(pw).to_s
@@ -13,13 +15,16 @@ def verify_password(hash, pw)
end
class Storage
@@insert_entry_ids = [] of IDTuple
@@insert_title_ids = [] of IDTuple
@path : String
@db : DB::Database?
@insert_ids = [] of IDTuple
alias IDTuple = NamedTuple(path: String,
alias IDTuple = NamedTuple(
path: String,
id: String,
is_title: Bool)
signature: String?)
use_default
@@ -29,57 +34,20 @@ class Storage
dir = File.dirname @path
unless Dir.exists? dir
Logger.info "The DB directory #{dir} does not exist. " \
"Attepmting to create it"
"Attempting to create it"
Dir.mkdir_p dir
end
MainFiber.run do
DB.open "sqlite3://#{@path}" do |db|
begin
# v0.18.0
db.exec "create table tags (id text, tag text, unique (id, tag))"
db.exec "create index tags_id_idx on tags (id)"
db.exec "create index tags_tag_idx on tags (tag)"
# v0.15.0
db.exec "create table thumbnails " \
"(id text, data blob, filename text, " \
"mime text, size integer)"
db.exec "create unique index tn_index on thumbnails (id)"
# v0.1.1
db.exec "create table ids" \
"(path text, id text, is_title integer)"
db.exec "create unique index path_idx on ids (path)"
db.exec "create unique index id_idx on ids (id)"
# v0.1.0
db.exec "create table users" \
"(username text, password text, token text, admin integer)"
MG::Migration.new(db, log: Logger.default.raw_log).migrate
rescue e
unless e.message.not_nil!.ends_with? "already exists"
Logger.fatal "Error when checking tables in DB: #{e}"
raise e
end
# If the DB is initialized through CLI but no user is added, we need
# to create the admin user when first starting the app
user_count = db.query_one "select count(*) from users", as: Int32
init_admin if init_user && user_count == 0
else
Logger.debug "Creating DB file at #{@path}"
db.exec "create unique index username_idx on users (username)"
db.exec "create unique index token_idx on users (token)"
init_admin if init_user
Logger.fatal "DB migration failed. #{e}"
raise e
end
# Verifies that the default username in config is valid
if Config.current.disable_login
username = Config.current.default_username
unless username_exists username
raise "Default username #{username} does not exist"
end
end
user_count = db.query_one "select count(*) from users", as: Int32
init_admin if init_user && user_count == 0
end
unless @auto_close
@db = DB.open "sqlite3://#{@path}"
@@ -99,9 +67,11 @@ class Storage
private def get_db(&block : DB::Database ->)
if @db.nil?
DB.open "sqlite3://#{@path}" do |db|
db.exec "PRAGMA foreign_keys = 1"
yield db
end
else
@db.not_nil!.exec "PRAGMA foreign_keys = 1"
yield @db.not_nil!
end
end
@@ -254,32 +224,121 @@ class Storage
end
end
def get_id(path, is_title)
def get_title_id(path, signature)
id = nil
path = Path.new(path).relative_to(Config.current.library_path).to_s
MainFiber.run do
get_db do |db|
id = db.query_one? "select id from ids where path = (?)", path,
as: {String}
# First attempt to find the matching title in DB using BOTH path
# and signature
id = db.query_one? "select id from titles where path = (?) and " \
"signature = (?) and unavailable = 0",
path, signature.to_s, as: String
should_update = id.nil?
# If it fails, try to match using the path only. This could happen
# for example when a new entry is added to the title
id ||= db.query_one? "select id from titles where path = (?)", path,
as: String
# If it still fails, we will have to rely on the signature values.
# This could happen when the user moved or renamed the title, or
# a folder containing the title
unless id
# If there are multiple rows with the same signature (this could
# happen simply by bad luck, or when the user copied a title),
# pick the row that has the most similar path to the given path
rows = [] of Tuple(String, String)
db.query "select id, path from titles where signature = (?)",
signature.to_s do |rs|
rs.each do
rows << {rs.read(String), rs.read(String)}
end
end
row = rows.max_by?(&.[1].components_similarity(path))
id = row[0] if row
end
# At this point, `id` would still be nil if there's no row matching
# either the path or the signature
# If we did identify a matching title, save the path and signature
# values back to the DB
if id && should_update
db.exec "update titles set path = (?), signature = (?), " \
"unavailable = 0 where id = (?)", path, signature.to_s, id
end
end
end
id
end
def insert_id(tp : IDTuple)
@insert_ids << tp
# See the comments in `#get_title_id` for how this method works.
def get_entry_id(path, signature)
id = nil
path = Path.new(path).relative_to(Config.current.library_path).to_s
MainFiber.run do
get_db do |db|
id = db.query_one? "select id from ids where path = (?) and " \
"signature = (?) and unavailable = 0",
path, signature.to_s, as: String
should_update = id.nil?
id ||= db.query_one? "select id from ids where path = (?)", path,
as: String
unless id
rows = [] of Tuple(String, String)
db.query "select id, path from ids where signature = (?)",
signature.to_s do |rs|
rs.each do
rows << {rs.read(String), rs.read(String)}
end
end
row = rows.max_by?(&.[1].components_similarity(path))
id = row[0] if row
end
if id && should_update
db.exec "update ids set path = (?), signature = (?), " \
"unavailable = 0 where id = (?)", path, signature.to_s, id
end
end
end
id
end
def insert_entry_id(tp)
@@insert_entry_ids << tp
end
def insert_title_id(tp)
@@insert_title_ids << tp
end
def bulk_insert_ids
MainFiber.run do
get_db do |db|
db.transaction do |tx|
@insert_ids.each do |tp|
tx.connection.exec "insert into ids values (?, ?, ?)", tp[:path],
tp[:id], tp[:is_title] ? 1 : 0
db.transaction do |tran|
conn = tran.connection
@@insert_title_ids.each do |tp|
path = Path.new(tp[:path])
.relative_to(Config.current.library_path).to_s
conn.exec "insert into titles (id, path, signature, " \
"unavailable) values (?, ?, ?, 0)",
tp[:id], path, tp[:signature].to_s
end
@@insert_entry_ids.each do |tp|
path = Path.new(tp[:path])
.relative_to(Config.current.library_path).to_s
conn.exec "insert into ids (id, path, signature, " \
"unavailable) values (?, ?, ?, 0)",
tp[:id], path, tp[:signature].to_s
end
end
end
@insert_ids.clear
@@insert_entry_ids.clear
@@insert_title_ids.clear
end
end
@@ -336,7 +395,8 @@ class Storage
tags = [] of String
MainFiber.run do
get_db do |db|
db.query "select distinct tag from tags" do |rs|
db.query "select distinct tag from tags natural join titles " \
"where unavailable = 0" do |rs|
rs.each do
tags << rs.read String
end
@@ -368,44 +428,134 @@ class Storage
end
end
def optimize
# Mark titles and entries that no longer exist on the file system as
# unavailable. By supplying `ids_candidates` and `titles_candidates`, it
# only checks the existence of the candidate titles/entries to speed up
# the process.
def mark_unavailable(ids_candidates : Array(String)?,
titles_candidates : Array(String)?)
MainFiber.run do
Logger.info "Starting DB optimization"
get_db do |db|
# Detect dangling entry IDs
trash_ids = [] of String
db.query "select path, id from ids" do |rs|
query = "select path, id from ids where unavailable = 0"
unless ids_candidates.nil?
query += " and id in (#{ids_candidates.join "," { |i| "'#{i}'" }})"
end
db.query query do |rs|
rs.each do
path = rs.read String
trash_ids << rs.read String unless File.exists? path
fullpath = Path.new(path).expand(Config.current.library_path).to_s
trash_ids << rs.read String unless File.exists? fullpath
end
end
# Delete dangling IDs
db.exec "delete from ids where id in " \
"(#{trash_ids.map { |i| "'#{i}'" }.join ","})"
Logger.debug "#{trash_ids.size} dangling IDs deleted" \
if trash_ids.size > 0
unless trash_ids.empty?
Logger.debug "Marking #{trash_ids.size} entries as unavailable"
end
db.exec "update ids set unavailable = 1 where id in " \
"(#{trash_ids.join "," { |i| "'#{i}'" }})"
# Delete dangling thumbnails
trash_thumbnails_count = db.query_one "select count(*) from " \
"thumbnails where id not in " \
"(select id from ids)", as: Int32
if trash_thumbnails_count > 0
db.exec "delete from thumbnails where id not in (select id from ids)"
Logger.info "#{trash_thumbnails_count} dangling thumbnails deleted"
# Detect dangling title IDs
trash_titles = [] of String
query = "select path, id from titles where unavailable = 0"
unless titles_candidates.nil?
query += " and id in (#{titles_candidates.join "," { |i| "'#{i}'" }})"
end
db.query query do |rs|
rs.each do
path = rs.read String
fullpath = Path.new(path).expand(Config.current.library_path).to_s
trash_titles << rs.read String unless Dir.exists? fullpath
end
end
# Delete dangling tags
trash_tags_count = db.query_one "select count(*) from tags " \
"where id not in " \
"(select id from ids)", as: Int32
if trash_tags_count > 0
db.exec "delete from tags where id not in (select id from ids)"
Logger.info "#{trash_tags_count} dangling tags deleted"
unless trash_titles.empty?
Logger.debug "Marking #{trash_titles.size} titles as unavailable"
end
db.exec "update titles set unavailable = 1 where id in " \
"(#{trash_titles.join "," { |i| "'#{i}'" }})"
end
end
end
private def get_missing(tablename)
ary = [] of IDTuple
MainFiber.run do
get_db do |db|
db.query "select id, path, signature from #{tablename} " \
"where unavailable = 1" do |rs|
rs.each do
ary << {
id: rs.read(String),
path: rs.read(String),
signature: rs.read(String?),
}
end
end
end
Logger.info "DB optimization finished"
end
ary
end
private def delete_missing(tablename, id : String? = nil)
MainFiber.run do
get_db do |db|
if id
db.exec "delete from #{tablename} where id = (?) " \
"and unavailable = 1", id
else
db.exec "delete from #{tablename} where unavailable = 1"
end
end
end
end
def missing_entries
get_missing "ids"
end
def missing_titles
get_missing "titles"
end
def delete_missing_entry(id = nil)
delete_missing "ids", id
end
def delete_missing_title(id = nil)
delete_missing "titles", id
end
def save_md_token(username : String, token : String, expire : Time)
MainFiber.run do
get_db do |db|
count = db.query_one "select count(*) from md_account where " \
"username = (?)", username, as: Int64
if count == 0
db.exec "insert into md_account values (?, ?, ?)", username, token,
expire.to_unix
else
db.exec "update md_account set token = (?), expire = (?) " \
"where username = (?)", token, expire.to_unix, username
end
end
end
end
def get_md_token(username) : Tuple(String?, Time?)
token = nil
expires = nil
MainFiber.run do
get_db do |db|
db.query_one? "select token, expire from md_account where " \
"username = (?)", username do |res|
token = res.read String
expires = Time.unix res.read Int64
end
end
end
{token, expires}
end
def close

83
src/subscription.cr Normal file
View File

@@ -0,0 +1,83 @@
require "db"
require "json"
struct Subscription
include DB::Serializable
include JSON::Serializable
getter id : Int64 = 0
getter username : String
getter manga_id : Int64
property language : String?
property group_id : Int64?
property min_volume : Int64?
property max_volume : Int64?
property min_chapter : Int64?
property max_chapter : Int64?
@[DB::Field(key: "last_checked")]
@[JSON::Field(key: "last_checked")]
@raw_last_checked : Int64
@[DB::Field(key: "created_at")]
@[JSON::Field(key: "created_at")]
@raw_created_at : Int64
def last_checked : Time
Time.unix @raw_last_checked
end
def created_at : Time
Time.unix @raw_created_at
end
def initialize(@manga_id, @username)
@raw_created_at = Time.utc.to_unix
@raw_last_checked = Time.utc.to_unix
end
private def in_range?(value : String, lowerbound : Int64?,
upperbound : Int64?) : Bool
lb = lowerbound.try &.to_f64
ub = upperbound.try &.to_f64
return true if lb.nil? && ub.nil?
v = value.to_f64?
return false unless v
if lb.nil?
v <= ub.not_nil!
elsif ub.nil?
v >= lb.not_nil!
else
v >= lb.not_nil! && v <= ub.not_nil!
end
end
def match?(chapter : MangaDex::Chapter) : Bool
if chapter.manga_id != manga_id ||
(language && chapter.language != language) ||
(group_id && !chapter.groups.map(&.id).includes? group_id)
return false
end
in_range?(chapter.volume, min_volume, max_volume) &&
in_range?(chapter.chapter, min_chapter, max_chapter)
end
def check_for_updates : Int32
Logger.debug "Checking updates for subscription with ID #{id}"
jobs = [] of Queue::Job
get_client(username).user.updates_after last_checked do |chapter|
next unless match? chapter
jobs << chapter.to_job
end
Storage.default.update_subscription_last_checked id
count = Queue.default.push jobs
Logger.debug "#{count}/#{jobs.size} of updates added to queue"
count
rescue e
Logger.error "Error occurred when checking updates for " \
"subscription with ID #{id}. #{e}"
0
end
end
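# Illustrative construction (not part of this diff; the IDs and values are
# hypothetical): a subscription that only matches English chapters from
# chapter 10 onwards. Bounds left as nil are treated as open-ended by
# `in_range?`.
sub = Subscription.new 12345_i64, "admin"
sub.language = "gb"
sub.min_chapter = 10_i64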

View File

@@ -73,7 +73,7 @@ class ChapterSorter
.select do |key|
keys[key].count >= str_ary.size / 2
end
.sort do |a_key, b_key|
.sort! do |a_key, b_key|
a = keys[a_key]
b = keys[b_key]
# Sort keys by the number of times they appear

View File

@@ -11,7 +11,7 @@ end
def split_by_alphanumeric(str)
arr = [] of String
str.scan(/([^\d\n\r]*)(\d*)([^\d\n\r]*)/) do |match|
arr += match.captures.select { |s| s != "" }
arr += match.captures.select &.!= ""
end
arr
end

79
src/util/signature.cr Normal file
View File

@@ -0,0 +1,79 @@
require "./util"
class File
abstract struct Info
def inode : UInt64
@stat.st_ino.to_u64
end
end
# Returns the signature of the file at filename.
# When it is not a supported file, returns 0. Otherwise, uses the inode
# number as its signature. On most file systems, the inode number is
# preserved even when the file is renamed, moved or edited.
# Some cases that would cause the inode number to change:
# - Reboot/remount on some file systems
# - Replaced with a copied file
# - Moved to a different device
# Since we are also using the relative paths to match ids, we won't lose
# information as long as the above changes do not happen together with
# a file/folder rename, with no library scan in between.
def self.signature(filename) : UInt64
if is_supported_file filename
File.info(filename).inode
else
0u64
end
end
end
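# Illustrative sketch (hypothetical paths): because the signature is the inode
# number, it normally survives a rename, which is what lets the scanner match
# a renamed file back to its existing ID.
old_sig = File.signature "/library/Some Title/ch1.cbz"
File.rename "/library/Some Title/ch1.cbz", "/library/Some Title/chapter-01.cbz"
File.signature("/library/Some Title/chapter-01.cbz") == old_sig # => true on most file systems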
class Dir
# Returns the signature of the directory at dirname. See the comments for
# `File.signature` for more information.
def self.signature(dirname) : UInt64
signatures = [File.info(dirname).inode]
self.open dirname do |dir|
dir.entries.each do |fn|
next if fn.starts_with? "."
path = File.join dirname, fn
if File.directory? path
signatures << Dir.signature path
else
_sig = File.signature path
# Only add its signature value to `signatures` when it is a
# supported file
signatures << _sig if _sig > 0
end
end
end
Digest::CRC32.checksum(signatures.sort.join).to_u64
end
# Returns the contents signature of the directory at dirname, used to check
# whether a rescan is needed.
# A rescan is needed when a file is added, moved, removed, or renamed
# (including files in nested directories).
def self.contents_signature(dirname, cache = {} of String => String) : String
return cache[dirname] if cache[dirname]?
Fiber.yield
signatures = [] of String
self.open dirname do |dir|
dir.entries.sort.each do |fn|
next if fn.starts_with? "."
path = File.join dirname, fn
if File.directory? path
signatures << Dir.contents_signature path, cache
else
# Only add its signature value to `signatures` when it is a
# supported file
signatures << fn if is_supported_file fn
end
Fiber.yield
end
end
hash = Digest::SHA1.hexdigest(signatures.join)
cache[dirname] = hash
hash
end
end
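# Illustrative sketch (hypothetical path): sharing one cache hash across calls
# means each directory's contents signature is computed at most once per scan,
# which is what the scanner relies on when reusing previously scanned titles.
cache = {} of String => String
sig_a = Dir.contents_signature "/library", cache
sig_b = Dir.contents_signature "/library", cache # served from the cache
sig_a == sig_b # => true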

View File

@@ -1,7 +1,8 @@
IMGS_PER_PAGE = 5
ENTRIES_IN_HOME_SECTIONS = 8
UPLOAD_URL_PREFIX = "/uploads"
STATIC_DIRS = ["/css", "/js", "/img", "/favicon.ico"]
STATIC_DIRS = %w(/css /js /img /webfonts /favicon.ico /robots.txt)
SUPPORTED_FILE_EXTNAMES = [".zip", ".cbz", ".rar", ".cbr"]
def random_str
UUID.random.to_s.gsub "-", ""
@@ -22,15 +23,32 @@ end
def register_mime_types
{
# Comic Archives
".zip" => "application/zip",
".rar" => "application/x-rar-compressed",
".cbz" => "application/vnd.comicbook+zip",
".cbr" => "application/vnd.comicbook-rar",
# Favicon
".ico" => "image/x-icon",
# FontAwesome fonts
".woff" => "font/woff",
".woff2" => "font/woff2",
# Supported image formats. JPG, PNG, GIF, WebP, and SVG are already
# defined by Crystal in `MIME.DEFAULT_TYPES`
".apng" => "image/apng",
".avif" => "image/avif",
}.each do |k, v|
MIME.register k, v
end
end
def is_supported_file(path)
SUPPORTED_FILE_EXTNAMES.includes? File.extname(path).downcase
end
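# A small sketch of the helper above ("/manga" is a placeholder path): the
# extension check is case-insensitive because of the `downcase`.
is_supported_file "One Piece Vol.1.CBZ" # => true
is_supported_file "cover.jpg"           # => false
Dir.entries("/manga").select { |fn| is_supported_file fn } # keep only archives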
struct Int
def or(other : Int)
if self == 0
@@ -92,3 +110,37 @@ def sort_titles(titles : Array(Title), opt : SortOptions, username : String)
ary
end
class String
# Returns the similarity (in [0, 1]) of two paths.
# For the two paths, separate them into arrays of components, count the
# number of matching components backwards, and divide the count by the
# number of components of the shorter path.
def components_similarity(other : String) : Float64
s, l = [self, other]
.map { |str| Path.new(str).parts }
.sort_by! &.size
match = s.reverse.zip(l.reverse).count { |a, b| a == b }
match / s.size
end
end
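# A hedged example of the path similarity above (made-up paths): components
# are compared from the end, so moving a title to a different library root
# while keeping its nested layout still scores high.
"/manga/Series A/Vol 1".components_similarity("/library/Series A/Vol 1") # => 0.75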
# Does the following:
# - turns space-like characters into normal whitespace ( )
# - strips and collapses spaces
# - removes ASCII control characters
# - replaces slashes (/) with underscores (_)
# - removes leading dots (.)
# - removes the following special characters: \:*?"<>|
#
# If the sanitized string is empty, returns a random string instead.
def sanitize_filename(str : String) : String
sanitized = str
.gsub(/\s+/, " ")
.strip
.gsub(/\//, "_")
.gsub(/^[\.\s]+/, "")
.gsub(/[\177\000-\031\\:\*\?\"<>\|]/, "")
sanitized.size > 0 ? sanitized : random_str
end
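# A quick sketch of the sanitizer above on a made-up title: spaces are
# collapsed, the slash becomes an underscore, and the leading dot plus the
# listed special characters are dropped.
sanitize_filename "  .Manga: Vol/1 <draft>?  " # => "Manga Vol_1 draft"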

View File

@@ -1,19 +1,24 @@
# Web related helper functions/macros
def is_admin?(env) : Bool
is_admin = false
if !Config.current.auth_proxy_header_name.empty? ||
Config.current.disable_login
is_admin = Storage.default.username_is_admin get_username env
end
# The token (if it exists) takes precedence over other authentication methods.
if token = env.session.string? "token"
is_admin = Storage.default.verify_admin token
end
is_admin
end
macro layout(name)
base_url = Config.current.base_url
is_admin = is_admin? env
begin
is_admin = false
# The token (if it exists) takes precedence over the default user option.
# This is why we check the default username before checking the token.
if Config.current.disable_login
is_admin = Storage.default.
username_is_admin Config.current.default_username
end
if token = env.session.string? "token"
is_admin = Storage.default.verify_admin token
end
page = {{name}}
render "src/views/#{{{name}}}.html.ecr", "src/views/layout.html.ecr"
rescue e
@@ -24,6 +29,15 @@ macro layout(name)
end
end
macro send_error_page(msg)
message = {{msg}}
base_url = Config.current.base_url
is_admin = is_admin? env
page = "Error"
html = render "src/views/message.html.ecr", "src/views/layout.html.ecr"
send_file env, html.to_slice, "text/html"
end
macro send_img(env, img)
send_file {{env}}, {{img}}.data, {{img}}.mime
end
@@ -35,6 +49,8 @@ macro get_username(env)
rescue e
if Config.current.disable_login
Config.current.default_username
elsif (header = Config.current.auth_proxy_header_name) && !header.empty?
env.request.headers[header]
else
raise e
end
@@ -56,7 +72,7 @@ def redirect(env, path)
end
def hash_to_query(hash)
hash.map { |k, v| "#{k}=#{v}" }.join("&")
hash.join "&" { |k, v| "#{k}=#{v}" }
end
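# A tiny usage sketch of hash_to_query with hypothetical parameters: pairs are
# joined with "&" in insertion order; note the values are not URI-escaped here.
hash_to_query({"sort" => "title", "ascend" => "1"}) # => "sort=title&ascend=1"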
def request_path_startswith(env, ary)
@@ -91,6 +107,26 @@ macro get_sort_opt
end
end
macro get_and_save_sort_opt(dir)
sort_method = env.params.query["sort"]?
if sort_method
is_ascending = true
ascend = env.params.query["ascend"]?
if ascend && ascend.to_i? == 0
is_ascending = false
end
sort_opt = SortOptions.new sort_method, is_ascending
TitleInfo.new {{dir}} do |info|
info.sort_by[username] = sort_opt.to_tuple
info.save
end
end
end
module HTTP
class Client
private def self.exec(uri : URI, tls : TLSContext = nil)

View File

@@ -1,5 +1,13 @@
<ul class="uk-list uk-list-large uk-list-divider" x-data="component()" x-init="init()">
<li><a class="uk-link-reset" href="<%= base_url %>admin/user">User Management</a></li>
<li>
<a class="uk-link-reset" href="<%= base_url %>admin/missing">Missing Items</a>
<% if missing_count > 0 %>
<div class="uk-align-right">
<span class="uk-badge"><%= missing_count %></span>
</div>
<% end %>
</li>
<li>
<a class="uk-link-reset" @click="scan()">
<span :style="`${scanning ? 'color:grey' : ''}`">Scan Library Files</span>
@@ -19,7 +27,7 @@
</li>
<li>
<span>Theme</span>
<select id="theme-select" class="uk-select uk-align-right uk-width-1-3@m uk-width-1-2" :val="themeSetting" @change="themeChanged($event)">
<select id="theme-select" class="uk-select uk-align-right uk-width-1-3@m uk-width-1-2" :value="themeSetting" @change="themeChanged($event)">
<option>Dark</option>
<option>Light</option>
<option>System</option>

View File

@@ -8,7 +8,7 @@
<meta name="viewport" content="width=device-width, initial-scale=1">
</head>
<body>
<redoc spec-url="/openapi.json"></redoc>
<redoc spec-url="<%= base_url %>openapi.json"></redoc>
<script src="https://cdn.jsdelivr.net/npm/redoc/bundles/redoc.standalone.js"></script>
</body>
</html>

View File

@@ -1,3 +1,3 @@
<script src="https://cdnjs.cloudflare.com/ajax/libs/jQuery.dotdotdot/4.0.11/dotdotdot.js"></script>
<script src="https://cdnjs.cloudflare.com/ajax/libs/protonet-jquery.inview/1.1.2/jquery.inview.min.js"></script>
<script src="https://cdnjs.cloudflare.com/ajax/libs/jQuery.dotdotdot/4.0.11/dotdotdot.js"></script>
<script src="https://cdnjs.cloudflare.com/ajax/libs/protonet-jquery.inview/1.1.2/jquery.inview.min.js"></script>
<script src="<%= base_url %>js/dots.js"></script>

View File

@@ -4,13 +4,10 @@
<title>Mango - <%= page.split("-").map(&.capitalize).join(" ") %></title>
<meta name="description" content="Mango - Manga Server and Web Reader">
<meta name="viewport" content="width=device-width, initial-scale=1">
<link rel="stylesheet" href="<%= base_url %>css/uikit.css" />
<link rel="stylesheet" href="<%= base_url %>css/mango.css" />
<link rel="icon" href="<%= base_url %>favicon.ico">
<script src="https://polyfill.io/v3/polyfill.min.js?features=matchMedia%2Cdefault&flags=gated"></script>
<script defer src="<%= base_url %>js/fontawesome.min.js"></script>
<script defer src="<%= base_url %>js/solid.min.js"></script>
<script src="https://polyfill.io/v3/polyfill.min.js?features=MutationObserver%2Cdefault%2CmatchMedia&flats=gated"></script>
<script src="https://cdnjs.cloudflare.com/ajax/libs/jquery/3.2.1/jquery.min.js"></script>
<script type="module" src="https://cdn.jsdelivr.net/gh/alpinejs/alpine@v2.8.0/dist/alpine.min.js"></script>
<script nomodule src="https://cdn.jsdelivr.net/gh/alpinejs/alpine@v2.8.0/dist/alpine-ie11.min.js" defer></script>

View File

@@ -0,0 +1 @@
<script src="https://cdnjs.cloudflare.com/ajax/libs/jqueryui/1.12.1/jquery-ui.min.js"></script>

View File

@@ -0,0 +1 @@
<script src="https://cdnjs.cloudflare.com/ajax/libs/moment.js/2.24.0/moment.min.js"></script>

View File

@@ -1,12 +0,0 @@
<div class="uk-margin" x-data="tagsComponent()" x-cloak x-init="load(<%= is_admin %>)">
<p class="uk-text-meta" @selectstart.prevent>
<span style="position:relative; bottom:3px; margin-right:5px;">Tags: </span>
<template x-for="tag in tags" :key="tag">
<span class="uk-label uk-label-primary" style="padding:2px 5px; margin:0 5px 5px 5px; text-transform:none;">
<a class="uk-link-reset" x-show="isAdmin" @click="rm($event)" :id="`${tag}-rm`"><span uk-icon="close" style="margin-right: 5px; position: relative; bottom: 1.5px;"></span></a><a class="uk-link-reset" x-text="tag" :href="`<%= base_url %>tags/${encodeURIComponent(tag)}`"></a>
</span>
</template>
<a class="uk-link-reset" style="position:relative; bottom:3px;" :uk-icon="inputShown ? 'close' : 'plus'" @click="toggleInput($nextTick)" x-show="isAdmin"></a>
</p>
<input id="tag-input" class="uk-input" type="text" placeholder="Type in a new tag and hit enter" x-model="newTag" @keydown="keydown($event)" x-show="inputShown">
</div>

View File

@@ -0,0 +1,2 @@
<script src="https://cdn.jsdelivr.net/npm/uikit@3.5.9/dist/js/uikit.min.js"></script>
<script src="https://cdn.jsdelivr.net/npm/uikit@3.5.9/dist/js/uikit-icons.min.js"></script>

View File

@@ -25,14 +25,14 @@
<td x-text="job.title"></td>
</template>
<template x-if="!job.plugin_id">
<td><a :href="`${'<%= mangadex_base_url %>'.replace(/\/$/, '')}/chapter/${job.id}`" x-text="job.title"></td>
<td><a :href="`<%= mangadex_base_url %>/chapter/${job.id}`" x-text="job.title"></td>
</template>
<template x-if="job.plugin_id">
<td x-text="job.manga_title"></td>
</template>
<template x-if="!job.plugin_id">
<td><a :href="`${'<%= mangadex_base_url %>'.replace(/\/$/, '')}/manga/${job.manga_id}`" x-text="job.manga_title"></td>
<td><a :href="`<%= mangadex_base_url %>/manga/${job.manga_id}`" x-text="job.manga_title"></td>
</template>
<td x-text="`${job.success_count}/${job.pages}`"></td>
@@ -43,17 +43,16 @@
<template x-if="job.status_message.length > 0">
<div class="uk-inline">
<span uk-icon="info"></span>
<div uk-dropdown x-text="job.status_message"></div>
<div uk-dropdown x-text="job.status_message" style="white-space: pre-line;"></div>
</div>
</template>
</td>
<td x-text="`${job.plugin_id || ''}`"></td>
<td>
<a @click="jobAction('delete', $event)" uk-icon="trash"></a>
<a @click="jobAction('delete', $event)" uk-icon="trash" uk-tooltip="Delete"></a>
<template x-if="job.status_message.length > 0">
<a @click="jobAction('retry', $event)" uk-icon="refresh"></a>
<a @click="jobAction('retry', $event)" uk-icon="refresh" uk-tooltip="Retry"></a>
</template>
</td>
</tr>
@@ -61,9 +60,10 @@
</tbody>
</table>
</div>
</div>
<% content_for "script" do %>
<script src="https://cdnjs.cloudflare.com/ajax/libs/moment.js/2.24.0/moment.min.js"></script>
<%= render_component "moment" %>
<script src="<%= base_url %>js/alert.js"></script>
<script src="<%= base_url %>js/download-manager.js"></script>
<% end %>

View File

@@ -1,83 +1,162 @@
<h2 class=uk-title>Download from MangaDex</h2>
<div class="uk-grid-small" uk-grid>
<div class="uk-width-3-4">
<input id="search-input" class="uk-input" type="text" placeholder="MangaDex manga ID or URL">
<div x-data="downloadComponent()" x-init="init()">
<div class="uk-grid-small" uk-grid style="margin-bottom:40px;">
<div class="uk-width-expand">
<input class="uk-input" type="text" :placeholder="searchAvailable ? 'Search MangaDex or enter a manga ID/URL' : 'MangaDex manga ID or URL'" x-model="searchInput" @keydown.enter.debounce="search()">
</div>
<div class="uk-width-auto">
<div uk-spinner class="uk-align-center" x-show="loading" x-cloak></div>
<button class="uk-button uk-button-default" x-show="!loading" @click="search()">Search</button>
</div>
</div>
<div class="uk-width-1-4">
<div id="spinner" uk-spinner class="uk-align-center" hidden></div>
<button id="search-btn" class="uk-button uk-button-default" onclick="search()">Search</button>
</div>
</div>
<div class"uk-grid-small" uk-grid hidden id="manga-details">
<div class="uk-width-1-4@s">
<img id="cover">
</div>
<div class="uk-width-1-4@s">
<p id="title"></p>
<p id="artist"></p>
<p id="author"></p>
</div>
<div id="filter-form" class="uk-form-stacked uk-width-1-2@s" hidden>
<p class="uk-text-lead uk-margin-remove-bottom">Filter Chapters</p>
<p class="uk-text-meta uk-margin-remove-top" id="count-text"></p>
<div class="uk-margin">
<label class="uk-form-label" for="lang-select">Language</label>
<div class="uk-form-controls">
<select class="uk-select filter-field" id="lang-select">
</select>
<template x-if="mangaAry">
<div>
<p x-show="mangaAry.length === 0">No matching manga found.</p>
<div class="uk-child-width-1-4@m uk-child-width-1-2" uk-grid>
<template x-for="manga in mangaAry" :key="manga.id">
<div class="item" :data-id="manga.id" @click="chooseManga(manga)">
<div class="uk-card uk-card-default">
<div class="uk-card-media-top uk-inline">
<img uk-img :data-src="manga.mainCover">
</div>
<div class="uk-card-body">
<h3 class="uk-card-title break-word uk-margin-remove-bottom free-height" x-text="manga.title"></h3>
<p class="uk-text-meta" x-text="`ID: ${manga.id}`"></p>
</div>
</div>
</div>
</template>
</div>
</div>
<div class="uk-margin">
<label class="uk-form-label" for="group-select">Group</label>
<div class="uk-form-controls">
<select class="uk-select filter-field" id="group-select">
</select>
</template>
<div x-show="data && data.chapters" x-cloak>
<div class"uk-grid-small" uk-grid>
<div class="uk-width-1-4@s">
<img :src="data.mainCover">
</div>
<div class="uk-width-1-4@s">
<p>Title: <a :href="`<%= mangadex_base_url %>/manga/${data.id}`" x-text="data.title"></a></p>
<p x-text="`Artist: ${data.artist}`"></p>
<p x-text="`Author: ${data.author}`"></p>
</div>
<div class="uk-form-stacked uk-width-1-2@s" id="filters">
<p class="uk-text-lead uk-margin-remove-bottom">Filter Chapters</p>
<p class="uk-text-meta uk-margin-remove-top" x-text="`${chapters.length} chapters found`"></p>
<div class="uk-margin">
<label class="uk-form-label">Language</label>
<div class="uk-form-controls">
<select class="uk-select filter-field" x-model="langChoice" @change="filtersUpdated()">
<template x-for="lang in languages" :key="lang">
<option x-text="lang"></option>
</template>
</select>
</div>
</div>
<div class="uk-margin">
<label class="uk-form-label">Group</label>
<div class="uk-form-controls">
<select class="uk-select filter-field" x-model="groupChoice" @change="filtersUpdated()">
<template x-for="group in groups" :key="group">
<option x-text="group"></option>
</template>
</select>
</div>
</div>
<div class="uk-margin">
<label class="uk-form-label">Volume</label>
<div class="uk-form-controls">
<input class="uk-input filter-field" type="text" placeholder="e.g., 127, 10-14, >30, <=212, or leave it empty." x-model="volumeRange" @keydown.enter="filtersUpdated()">
</div>
</div>
<div class="uk-margin">
<label class="uk-form-label">Chapter</label>
<div class="uk-form-controls">
<input class="uk-input filter-field" type="text" placeholder="e.g., 127, 10-14, >30, <=212, or leave it empty." x-model="chapterRange" @keydown.enter="filtersUpdated()">
</div>
</div>
</div>
</div>
<div class="uk-margin">
<label class="uk-form-label" for="volume-range">Volume</label>
<div class="uk-form-controls">
<input class="uk-input filter-field" type="text" id="volume-range" placeholder="e.g., 127, 10-14, >30, <=212, or leave it empty.">
<div class="uk-margin">
<button class="uk-button uk-button-default" @click="selectAll()">Select All</button>
<button class="uk-button uk-button-default" @click="clearSelection()">Clear Selections</button>
<button class="uk-button uk-button-primary" @click="download()" x-show="!addingToDownload">Download Selected</button>
<div uk-spinner class="uk-margin-left" x-show="addingToDownload"></div>
</div>
<p class="uk-text-meta">Click on a table row to select the chapter. Drag your mouse over multiple rows to select them all. Hold Ctrl to make multiple non-adjacent selections.</p>
</div>
<div class="uk-margin">
<label class="uk-form-label" for="chapter-range">Chapter</label>
<div class="uk-form-controls">
<input class="uk-input filter-field" type="text" id="chapter-range" placeholder="e.g., 127, 10-14, >30, <=212, or leave it empty.">
<p x-text="`Mango can only list ${chaptersLimit} chapters, but we found ${chapters.length} chapters. Please use the filter options above to narrow down your search.`" x-show="chapters.length > chaptersLimit"></p>
<table class="uk-table uk-table-striped uk-overflow-auto" x-show="chapters.length <= chaptersLimit">
<thead>
<tr>
<th>ID</th>
<th>Title</th>
<th>Language</th>
<th>Group</th>
<th>Volume</th>
<th>Chapter</th>
<th>Timestamp</th>
</tr>
</thead>
<template x-if="chapters.length <= chaptersLimit">
<tbody id="selectable">
<template x-for="chp in chapters" :key="chp">
<tr class="ui-widget-content">
<td><a :href="`<%= mangadex_base_url %>/chapter/${chp.id}`" x-text="chp.id"></a></td>
<td x-text="chp.title"></td>
<td x-text="chp.language"></td>
<td>
<template x-for="grp in Object.entries(chp.groups)">
<div>
<a :href="`<%= mangadex_base_url %>/group/${grp[1]}`" x-text="grp[0]"></a>
</div>
</template>
</td>
<td x-text="chp.volume"></td>
<td x-text="chp.chapter"></td>
<td x-text="`${moment.unix(chp.timestamp).fromNow()}`"></td>
</tr>
</template>
</tbody>
</template>
</table>
</div>
<div id="modal" class="uk-flex-top" uk-modal="container: false">
<div class="uk-modal-dialog uk-margin-auto-vertical">
<button class="uk-modal-close-default" type="button" uk-close></button>
<div class="uk-modal-header">
<h3 class="uk-modal-title break-word" x-text="candidateManga.title"></h3>
</div>
<div class="uk-modal-body">
<div class="uk-grid">
<div class="uk-width-1-3@s">
<img uk-img data-width data-height :src="candidateManga.mainCover" style="width:100%;margin-bottom:10px;">
<a :href="`<%= mangadex_base_url %>/manga/${candidateManga.id}`" x-text="`ID: ${candidateManga.id}`" class="uk-link-muted"></a>
</div>
<div class="uk-width-2-3@s" uk-overflow-auto>
<p x-text="candidateManga.description"></p>
</div>
</div>
</div>
<div class="uk-modal-footer">
<button class="uk-button uk-button-primary" type="button" @click="confirmManga(candidateManga.id)">Choose</button>
</div>
</div>
</div>
</div>
<div id="selection-controls" class="uk-margin" hidden>
<div class="uk-margin">
<button class="uk-button uk-button-default" onclick="selectAll()">Select All</button>
<button class="uk-button uk-button-default" onclick="unselect()">Clear Selections</button>
<button class="uk-button uk-button-primary" id="download-btn" onclick="download()">Download Selected</button>
<div id="download-spinner" uk-spinner class="uk-margin-left" hidden></div>
</div>
<p class="uk-text-meta">Click on a table row to select the chapter. Drag your mouse over multiple rows to select them all. Hold Ctrl to make multiple non-adjacent selections.</p>
</div>
<p id="filter-notification" hidden></p>
<table class="uk-table uk-table-striped uk-overflow-auto" hidden>
<thead>
<tr>
<th>ID</th>
<th>Title</th>
<th>Language</th>
<th>Group</th>
<th>Volume</th>
<th>Chapter</th>
<th>Timestamp</th>
</tr>
</thead>
</table>
<% content_for "script" do %>
<script>
var baseURL = "<%= mangadex_base_url %>".replace(/\/$/, "");
</script>
<script src="https://cdnjs.cloudflare.com/ajax/libs/moment.js/2.24.0/moment.min.js"></script>
<script src="https://cdnjs.cloudflare.com/ajax/libs/jqueryui/1.12.1/jquery-ui.min.js"></script>
<%= render_component "moment" %>
<%= render_component "jquery-ui" %>
<script src="<%= base_url %>js/alert.js"></script>
<script src="<%= base_url %>js/download.js"></script>
<% end %>

View File

@@ -77,7 +77,7 @@
<%- end -%>
<% content_for "script" do %>
<%= render_component "dots-scripts" %>
<%= render_component "dots" %>
<script src="<%= base_url %>js/alert.js"></script>
<script src="<%= base_url %>js/title.js"></script>
<% end %>

View File

@@ -17,7 +17,6 @@
<li class="uk-parent">
<a href="#">Download</a>
<ul class="uk-nav-sub">
<li><a href="<%= base_url %>download">MangaDex</a></li>
<li><a href="<%= base_url %>download/plugins">Plugins</a></li>
<li><a href="<%= base_url %>admin/downloads">Download Manager</a></li>
</ul>
@@ -37,7 +36,7 @@
<div class="uk-navbar-toggle" uk-navbar-toggle-icon="uk-navbar-toggle-icon" uk-toggle="target: #mobile-nav"></div>
</div>
<div class="uk-navbar-left uk-visible@s">
<a class="uk-navbar-item uk-logo" href="<%= base_url %>"><img src="<%= base_url %>img/icon.png"></a>
<a class="uk-navbar-item uk-logo" href="<%= base_url %>"><img src="<%= base_url %>img/icon.png" style="width:90px;height:90px;"></a>
<ul class="uk-navbar-nav">
<li><a href="<%= base_url %>">Home</a></li>
<li><a href="<%= base_url %>library">Library</a></li>
@@ -49,7 +48,6 @@
<div class="uk-navbar-dropdown">
<ul class="uk-nav uk-navbar-dropdown-nav">
<li class="uk-nav-header">Source</li>
<li><a href="<%= base_url %>download">MangaDex</a></li>
<li><a href="<%= base_url %>download/plugins">Plugins</a></li>
<li class="uk-nav-divider"></li>
<li><a href="<%= base_url %>admin/downloads">Download Manager</a></li>
@@ -69,7 +67,7 @@
</div>
<div class="uk-section uk-section-small">
</div>
<div class="uk-section uk-section-small" id="main-section">
<div class="uk-section uk-section-small" style="position:relative;">
<div class="uk-container uk-container-small">
<div id="alert"></div>
<%= content %>
@@ -82,9 +80,7 @@
setTheme();
const base_url = "<%= base_url %>";
</script>
<script src="<%= base_url %>js/uikit.min.js"></script>
<script src="<%= base_url %>js/uikit-icons.min.js"></script>
<%= render_component "uikit" %>
<%= yield_content "script" %>
</body>

View File

@@ -24,7 +24,7 @@
</div>
<% content_for "script" do %>
<%= render_component "dots-scripts" %>
<%= render_component "dots" %>
<script src="<%= base_url %>js/search.js"></script>
<script src="<%= base_url %>js/sort-items.js"></script>
<% end %>

View File

@@ -30,8 +30,7 @@
<script>
setTheme();
</script>
<script src="<%= base_url %>js/uikit.min.js"></script>
<script src="<%= base_url %>js/uikit-icons.min.js"></script>
<%= render_component "uikit" %>
</body>
</html>

View File

@@ -0,0 +1,39 @@
<div x-data="component()" x-init="init()">
<h2 class="uk-title">Connect to MangaDex</h2>
<div class"uk-grid-small" uk-grid x-show="!loading" x-cloak>
<div class="uk-width-1-2@s" x-show="!expires">
<p>This step is optional but highly recommended if you are using the MangaDex downloader. Connecting to MangaDex allows you to:</p>
<ul>
<li>Search MangaDex by search terms in addition to manga IDs</li>
<li>Automatically download new chapters when they are available (coming soon)</li>
</ul>
</div>
<div class="uk-width-1-2@s" x-show="expires">
<p>
<span x-show="!expired">You have logged in to MangaDex!</span>
<span x-show="expired">You have logged in to MangaDex but the token has expired.</span>
The expiration date of your token is <code x-text="moment.unix(expires).format('MMMM Do YYYY, HH:mm:ss')"></code>.
<span x-show="!expired">If the integration is not working, you</span>
<span x-show="expired">You</span>
can log in again and the token will be updated.
</p>
</div>
<div class="uk-width-1-2@s">
<div class="uk-margin">
<div class="uk-inline uk-width-1-1"><span class="uk-form-icon" uk-icon="icon:user"></span><input class="uk-input uk-form-large" type="text" x-model="username" @keydown.enter.debounce="login()"></div>
</div>
<div class="uk-margin">
<div class="uk-inline uk-width-1-1"><span class="uk-form-icon" uk-icon="icon:lock"></span><input class="uk-input uk-form-large" type="password" x-model="password" @keydown.enter.debounce="login()"></div>
</div>
<div class="uk-margin"><button class="uk-button uk-button-primary uk-button-large uk-width-1-1" @click="login()" :disabled="loggingIn">Login to MangaDex</button></div>
</div>
</div>
</div>
<% content_for "script" do %>
<%= render_component "moment" %>
<script src="<%= base_url %>js/alert.js"></script>
<script src="<%= base_url %>js/mangadex.js"></script>
<% end %>

View File

@@ -0,0 +1,40 @@
<div x-data="component()" x-init="load()" x-cloak x-show="!loading">
<p x-show="empty" class="uk-text-lead uk-text-center">No missing items found.</p>
<div x-show="!empty">
<p>The following items were present in your library, but we can't find them anymore. If you deleted them by mistake, try to recover the files or folders, put them back where they were, and rescan the library. Otherwise, you can safely delete them and the associated metadata using the buttons below to free up database space.</p>
<button class="uk-button uk-button-danger" @click="rmAll()">Delete All</button>
<table class="uk-table uk-table-striped uk-overflow-auto">
<thead>
<tr>
<th>Type</th>
<th>Relative Path</th>
<th>ID</th>
<th>Actions</th>
</tr>
</thead>
<tbody>
<template x-for="title in titles" :key="title">
<tr :id="`title-${title.id}`">
<td>Title</td>
<td x-text="title.path"></td>
<td x-text="title.id"></td>
<td><a @click="rm($event)" uk-icon="trash"></a></td>
</tr>
</template>
<template x-for="entry in entries" :key="entry">
<tr :id="`entry-${entry.id}`">
<td>Entry</td>
<td x-text="entry.path"></td>
<td x-text="entry.id"></td>
<td><a @click="rm($event)" uk-icon="trash"></a></td>
</tr>
</template>
</tbody>
</table>
</div>
</div>
<% content_for "script" do %>
<script src="<%= base_url %>js/alert.js"></script>
<script src="<%= base_url %>js/missing-items.js"></script>
<% end %>

View File

@@ -56,8 +56,10 @@
<div id="download-spinner" uk-spinner class="uk-margin-left" hidden></div>
</div>
<p class="uk-text-meta">Click on a table row to select the chapter. Drag your mouse over multiple rows to select them all. Hold Ctrl to make multiple non-adjacent selections.</p>
<table class="uk-table uk-table-striped uk-overflow-auto tablesorter">
</table>
<div class="uk-overflow-auto">
<table class="uk-table uk-table-striped tablesorter">
</table>
</div>
</div>
<% end %>
@@ -68,7 +70,7 @@
var pid = "<%= plugin.not_nil!.info.id %>";
</script>
<% end %>
<script src="https://cdnjs.cloudflare.com/ajax/libs/jqueryui/1.12.1/jquery-ui.min.js"></script>
<%= render_component "jquery-ui" %>
<script src="https://cdnjs.cloudflare.com/ajax/libs/jquery.tablesorter/2.31.3/js/jquery.tablesorter.combined.min.js"></script>
<script src="<%= base_url %>js/alert.js"></script>
<script src="<%= base_url %>js/plugin-download.js"></script>

View File

@@ -21,22 +21,22 @@
<div
:class="{'uk-container': true, 'uk-container-small': mode === 'continuous', 'uk-container-expand': mode !== 'continuous'}">
<div x-show="!loading && mode === 'continuous'" x-cloak>
<template x-for="item in items">
<template x-if="!loading && mode === 'continuous'" x-for="item in items">
<img
uk-img
class="uk-align-center"
:style="item.style"
:class="{'uk-align-center': true, 'spine': item.width < 50}"
:data-src="item.url"
:width="item.width"
:height="item.height"
:id="item.id"
:style="`margin-top:${margin}px; margin-bottom:${margin}px`"
@click="showControl($event)"
/>
</template>
<%- if next_entry_url -%>
<button id="next-btn" class="uk-align-center uk-button uk-button-primary" @click="nextEntry('<%= next_entry_url %>')">Next Entry</button>
<%- else -%>
<button id="next-btn" class="uk-align-center uk-button uk-button-primary" @click="exitReader('<%= exit_url %>', true)">Exit Reader</button>
<button id="next-btn" class="uk-align-center uk-button uk-button-primary" @click="exitReader('<%= exit_url %>')">Exit Reader</button>
<%- end -%>
</div>
@@ -50,6 +50,9 @@
width:${mode === 'width' ? '100vw' : 'auto'};
height:${mode === 'height' ? '100vh' : 'auto'};
margin-bottom:0;
max-width:100%;
max-height:100%;
object-fit: contain;
`" />
<div style="position:absolute;z-index:1; top:0;left:0; width:30%;height:100%;" @click="flipPage(false)"></div>
@@ -68,18 +71,19 @@
</div>
<div class="uk-modal-body">
<div class="uk-margin">
<p id="progress-label"></p>
<p x-text="`Progress: ${selectedIndex}/${items.length} (${(selectedIndex/items.length * 100).toFixed(1)}%)`"></p>
</div>
<div class="uk-margin">
<label class="uk-form-label" for="page-select">Jump to page</label>
<label class="uk-form-label" for="page-select">Jump to Page</label>
<div class="uk-form-controls">
<select id="page-select" class="uk-select" @change="pageChanged()">
<select id="page-select" class="uk-select" @change="pageChanged()" x-model="selectedIndex">
<%- (1..entry.pages).each do |p| -%>
<option value="<%= p %>"><%= p %></option>
<%- end -%>
</select>
</div>
</div>
<div class="uk-margin">
<label class="uk-form-label" for="mode-select">Mode</label>
<div class="uk-form-controls">
@@ -89,9 +93,53 @@
</select>
</div>
</div>
<div class="uk-margin" x-show="mode === 'continuous'">
<label class="uk-form-label" for="margin-range" x-text="`Page Margin: ${margin}px`"></label>
<div class="uk-form-controls">
<input id="margin-range" class="uk-range" type="range" min="0" max="50" step="5" x-model="margin" @change="marginChanged()">
</div>
</div>
<div class="uk-margin uk-form-horizontal" x-show="mode !== 'continuous'">
<label class="uk-form-label" for="enable-flip-animation">Enable Flip Animation</label>
<div class="uk-form-controls">
<input id="enable-flip-animation" class="uk-checkbox" type="checkbox" x-model="enableFlipAnimation" @change="enableFlipAnimationChanged()">
</div>
</div>
<div class="uk-margin uk-form-horizontal" x-show="mode !== 'continuous'">
<label class="uk-form-label" for="preload-lookahead" x-text="`Preload Image: ${preloadLookahead} page(s)`"></label>
<div class="uk-form-controls">
<input id="preload-lookahead" class="uk-range" type="range" min="0" max="5" step="1" x-model.number="preloadLookahead" @change="preloadLookaheadChanged()">
</div>
</div>
<hr class="uk-divider-icon">
<div class="uk-margin">
<label class="uk-form-label" for="entry-select">Jump to Entry</label>
<div class="uk-form-controls">
<select id="entry-select" class="uk-select" @change="entryChanged()">
<% entries.each do |e| %>
<option value="<%= e.id %>"
<% if e.id == entry.id %>
selected
<% end %>>
<%= e.title %>
</option>
<% end %>
</select>
</div>
</div>
</div>
<div class="uk-modal-footer uk-text-right">
<button class="uk-button uk-button-danger" type="button" @click="exitReader('<%= exit_url %>')">Exit Reader</button>
<% if previous_entry_url %>
<a class="uk-button uk-button-default uk-margin-small-bottom uk-margin-small-right" href="<%= previous_entry_url %>">Previous Entry</a>
<% end %>
<% if next_entry_url %>
<a class="uk-button uk-button-default uk-margin-small-bottom uk-margin-small-right" href="<%= next_entry_url %>">Next Entry</a>
<% end %>
<a class="uk-button uk-button-danger uk-margin-small-bottom uk-margin-small-right" href="<%= exit_url %>">Exit Reader</a>
</div>
</div>
</div>
@@ -103,15 +151,15 @@
const eid = "<%= entry.id %>";
</script>
<script src="https://cdnjs.cloudflare.com/ajax/libs/protonet-jquery.inview/1.1.2/jquery.inview.min.js"></script>
<%= render_component "uikit" %>
<script src="<%= base_url %>js/alert.js"></script>
<script src="<%= base_url %>js/uikit.min.js"></script>
<script src="<%= base_url %>js/uikit-icons.min.js"></script>
<script src="<%= base_url %>js/reader.js"></script>
</body>
<style>
img[data-src][src*='data:image'] { background: white; }
img { width: 100%; }
img:not(.spine) { width: 100%; }
.reader-bg { background: black; }
</style>
</html>

View File

@@ -0,0 +1,54 @@
<h2 class="uk-title">MangaDex Subscription Manager</h2>
<div x-data="component()" x-init="init()">
<p x-show="available === false">The subscription manager uses a MangaDex API that requires authentication. Please <a href="<%= base_url %>admin/mangadex">connect to MangaDex</a> before using this feature.</p>
<p x-show="available && subscriptions.length === 0">No subscription found. Go to the <a href="<%= base_url %>download">MangaDex download page</a> and start subscribing.</p>
<template x-if="subscriptions.length > 0">
<div class="uk-overflow-auto">
<table class="uk-table uk-table-striped">
<thead>
<tr>
<th>Manga ID</th>
<th>Language</th>
<th>Group ID</th>
<th>Volume Range</th>
<th>Chapter Range</th>
<th>Creator</th>
<th>Last Checked</th>
<th>Created At</th>
<th>Actions</th>
</tr>
</thead>
<tbody>
<template x-for="sub in subscriptions" :key="sub">
<tr>
<td><a :href="`<%= mangadex_base_url %>/manga/${sub.manga_id}`" x-text="sub.manga_id"></a></td>
<td x-text="sub.language || 'All'"></td>
<td>
<a x-show="sub.group_id" :href="`<%= mangadex_base_url %>/group/${sub.group_id}`" x-text="sub.group_id"></a>
<span x-show="!sub.group_id">All</span>
</td>
<td x-text="formatRange(sub.min_volume, sub.max_volume)"></td>
<td x-text="formatRange(sub.min_chapter, sub.max_chapter)"></td>
<td x-text="sub.username"></td>
<td x-text="`${moment.unix(sub.last_checked).fromNow()}`"></td>
<td x-text="`${moment.unix(sub.created_at).fromNow()}`"></td>
<td :data-id="sub.id">
<a @click="check($event)" x-show="sub.username === '<%= username %>'" uk-icon="refresh" uk-tooltip="Check for updates"></a>
<a @click="rm($event)" x-show="sub.username === '<%= username %>'" uk-icon="trash" uk-tooltip="Delete"></a>
</td>
</tr>
</template>
</tbody>
</table>
</div>
</template>
</div>
<% content_for "script" do %>
<%= render_component "moment" %>
<script src="<%= base_url %>js/alert.js"></script>
<script src="<%= base_url %>js/subscription.js"></script>
<% end %>

View File

@@ -24,7 +24,7 @@
</div>
<% content_for "script" do %>
<%= render_component "dots-scripts" %>
<%= render_component "dots" %>
<script src="<%= base_url %>js/search.js"></script>
<script src="<%= base_url %>js/sort-items.js"></script>
<% end %>

View File

@@ -34,7 +34,10 @@
</ul>
<p class="uk-text-meta"><%= title.content_label %> found</p>
<%= render_component "tags" %>
<div class="uk-margin" x-data="tagsComponent()" x-cloak x-init="load(<%= is_admin %>)" x-show="!loading">
<select class="tag-select" multiple="multiple" style="width:100%">
</select>
</div>
<div class="uk-grid-small" uk-grid>
<div class="uk-margin-bottom uk-width-3-4@s">
@@ -120,7 +123,10 @@
</div>
<% content_for "script" do %>
<%= render_component "dots-scripts" %>
<%= render_component "dots" %>
<link href="https://cdn.jsdelivr.net/npm/select2@4.1.0-beta.1/dist/css/select2.min.css" rel="stylesheet" />
<link href="<%= base_url %>css/tags.css" rel="stylesheet" />
<script src="https://cdn.jsdelivr.net/npm/select2@4.1.0-beta.1/dist/js/select2.min.js"></script>
<script src="<%= base_url %>js/alert.js"></script>
<script src="<%= base_url %>js/title.js"></script>
<script src="<%= base_url %>js/search.js"></script>