Mirror of https://github.com/EstrellaXD/Auto_Bangumi.git — synced 2026-04-15 11:00:01 +08:00
Merge branch '3.1-dev' into self-rss
# Conflicts:
#	.gitignore
#	backend/src/module/database/bangumi.py
#	backend/src/module/database/orm/update.py
#	backend/src/module/models/__init__.py
#	backend/src/module/rss/poller.py
14  .gitattributes  vendored  Normal file
@@ -0,0 +1,14 @@
# Don't allow people to merge changes to these generated files, because the result
# may be invalid. You need to run "rush update" again.
pnpm-lock.yaml merge=binary
shrinkwrap.yaml merge=binary
npm-shrinkwrap.json merge=binary
yarn.lock merge=binary

# Rush's JSON config files use JavaScript-style code comments. The rule below prevents pedantic
# syntax highlighters such as GitHub's from highlighting these comments as errors. Your text editor
# may also require a special configuration to allow comments in JSON.
#
# For more information, see this issue: https://github.com/microsoft/rushstack/issues/1088
#
*.json linguist-language=JSON-with-Comments
18  .github/workflows/build.yml  vendored
@@ -53,13 +53,13 @@ jobs:

      - name: Build
        run: |
          cd webui && pnpm build && zip -r dist.zip dist
          cd webui && pnpm build

      - name: Upload artifact
        uses: actions/upload-artifact@v3
        with:
          name: dist
          path: webui/dist.zip
          path: webui/dist

  build-docker:
    runs-on: ubuntu-latest
@@ -131,7 +131,11 @@ jobs:
        uses: actions/download-artifact@v3
        with:
          name: dist
          path: backend/dist.zip
          path: backend/src/dist

      - name: View files
        run: |
          pwd && ls -al && tree

      - name: Build and push
        uses: docker/build-push-action@v4
@@ -155,11 +159,15 @@ jobs:
      - name: Checkout code
        uses: actions/checkout@v2

      - name: download artifact
      - name: Download artifact
        uses: actions/download-artifact@v3
        with:
          name: dist
          path: webui/dist.zip
          path: webui/dist

      - name: Zip webui
        run: |
          cd webui && ls -al && tree && zip -r dist.zip dist

      - name: Generate Release
        if: ${{ github.event_name == 'pull_request' && github.event.pull_request.merged == true }}
10  .gitignore  vendored
@@ -178,7 +178,7 @@ test.*
/backend/src/config/
/src/debuger.py
/backend/src/dist.zip
/src
/pyrightconfig.json

# webui
logs
@@ -197,8 +197,11 @@ dist-ssr

# Editor directories and files
.vscode/*
!.vscode/extensions.json
!.vscode/settings.json
!.vscode/tasks.json
!.vscode/launch.json
!.vscode/extensions.json

.idea
.DS_Store
*.suo
@@ -206,3 +209,6 @@ dist-ssr
*.njsproj
*.sln
*.sw?

# vitepress
/docs/.vitepress/cache/
3  .gitmodules  vendored
@@ -1,3 +0,0 @@
[submodule "docs/wiki"]
	path = docs/wiki
	url = https://github.com/EstrellaXD/Auto_Bangumi.wiki.git
13  .vscode/settings.json  vendored  Normal file
@@ -0,0 +1,13 @@
{
  "files.associations": {
    "settings.json": "json5",
    "launch.json": "json5",
    "extensions.json": "json5",
    "tsconfig.json": "json5",
    "tsconfig.*.json": "json5",
  },
  "[markdown]": {
    "editor.wordWrap": "off",
  },
  "python.venvPath": "./backend/venv",
}
180  CONTRIBUTING.md  Normal file
@@ -0,0 +1,180 @@
# Contributing

We welcome all contributors to help AutoBangumi better solve the problems people run into.

This guide walks you through contributing feature and fix code to AutoBangumi; please take a few minutes to read it before opening a Pull Request.

What does this document cover?

- [Roadmap](#roadmap)
- [Request for Comments](#request-for-comments)
- [Git Branch](#git-branch)
  - [Version numbers](#version-numbers)
  - [Branch development, trunk release](#branch-development-trunk-release)
  - [Branch lifecycle](#branch-lifecycle)
  - [Git workflow at a glance](#git-workflow-at-a-glance)
- [Pull Request](#pull-request)
- [How releases work](#how-releases-work)


## Roadmap

The AutoBangumi team uses a [GitHub Project](https://github.com/EstrellaXD/Auto_Bangumi/projects?query=is%3Aopen) board to manage planned work, bugs being fixed, and their progress.

This helps you see:
- What is the team currently working on?
- Is something aligned with what you want to contribute, so you can join in on the implementation or optimization?
- Is something already in progress, so you can avoid duplicating work?

On the [Project](https://github.com/EstrellaXD/Auto_Bangumi/projects?query=is%3Aopen) board, besides the usual `[Feature Request]`, `[BUG]`, and small optimization items, there is one more category: **`[RFC]`**.

### Request for Comments

> Existing [AutoBangumi RFCs](https://github.com/EstrellaXD/Auto_Bangumi/issues?q=is%3Aissue+label%3ARFC) can be found through the `RFC` label on issues.

For small optimizations or bug fixes, feel free to adjust the code and open a Pull Request directly; just skim the [Git Branch](#git-branch) section so that your fix is based on the correct version branch, and the [Pull Request](#pull-request) section to see how PRs get merged.

<br/>

If instead you are planning a **larger** feature or refactor, with wide-ranging changes that touch many areas, please first write a short RFC via [Issue: RFC proposal](https://github.com/EstrellaXD/Auto_Bangumi/issues/new?assignees=&labels=RFC&projects=&template=rfc.yml&title=%5BRFC%5D%3A+) outlining "how you intend to do it", to seek discussion and consensus among the developers.

Some approaches may be things the team has already discussed and decided against, and this step can save you from wasting a lot of effort.

> If you only want to discuss whether a feature should be added or improved, rather than "how to implement it", use -> [Issue: feature request](https://github.com/EstrellaXD/Auto_Bangumi/issues/new?labels=feature+request&template=feature_request.yml&title=%5BFeature+Request%5D+)


<br/>

An [RFC](https://github.com/EstrellaXD/Auto_Bangumi/issues?q=is%3Aissue+is%3Aopen+label%3ARFC) is positioned as **"a document for developers to review the technical design of a feature/refactor before its actual development"**.

Its purpose is to let collaborating developers know clearly "what will be done" and "how exactly it will be done", and to let every developer join the discussion openly and transparently,

so that the impact can be evaluated and discussed (overlooked considerations, backward compatibility, conflicts with existing features).

A proposal therefore focuses on describing the **approach, design, and steps** for solving the problem.


## Git Branch

### Version numbers

Git branch usage in the AutoBangumi project is closely tied to the release rules, so the version convention comes first.

AutoBangumi release version numbers follow [Semantic Versioning (SemVer)](https://semver.org/), in the three-part `<Major>.<Minor>.<Patch>` format; a bump in each position means:

- **Major**: a major update, very likely with incompatible config/API changes
- **Minor**: backward-compatible feature additions
- **Patch**: backward-compatible bug fixes / small optimizations

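The version format above can be illustrated with a tiny sketch (illustration only, not project code):

```python
# Toy illustration of the <Major>.<Minor>.<Patch> format (not project code).
major, minor, patch = map(int, "3.1.0".split("."))

# A backward-compatible bug fix bumps Patch: 3.1.0 -> 3.1.1
patch_release = f"{major}.{minor}.{patch + 1}"
# A backward-compatible feature bumps Minor and resets Patch: 3.1.0 -> 3.2.0
minor_release = f"{major}.{minor + 1}.0"

print(patch_release, minor_release)  # 3.1.1 3.2.0
```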
### Branch development, trunk release

The AutoBangumi project uses a "develop on branches, release from trunk" model.

The [**`main`**](https://github.com/EstrellaXD/Auto_Bangumi/commits/main) branch is the stable **"trunk branch"**: it is only used to cut releases, never to develop new features or fixes directly.

Every Minor version has a corresponding **"development branch"** used to develop new features and to fix issues after release.

Development branches are named `<Major>.<Minor>-dev`, e.g. `3.1-dev`, `3.0-dev`, `2.6-dev`; you can [find them under All Branches](https://github.com/EstrellaXD/Auto_Bangumi/branches/all?query=-dev).


### Branch lifecycle

When a Minor development branch (take `3.1-dev` as an example) finishes its feature development and is merged into main for the **first** time:
- the Minor version (e.g. `3.1.0`) is released
- the **next** Minor development branch (`3.2-dev`) is created at the same time, for the next version's features
- the **previous** version's development branch (`3.0-dev`) is archived and no longer maintained
- this Minor branch (`3.1-dev`) enters the maintenance stage: no new features/refactors, only bug fixes
- after a bug fix lands on the maintenance-stage Minor branch (`3.1-dev`), it is merged into main again and a `Patch` version is released

Under this flow, contributors should choose their Git branch as follows:
- for a "bug fix": base the fix on the **currently released** version's Minor branch and open the PR against it
- for a "new feature/refactor": base the work on the **next, not-yet-released** version's Minor branch and open the PR against it

> The "currently released version" is the latest version on the [[Releases page]](https://github.com/EstrellaXD/Auto_Bangumi/releases), which also matches the latest version in the [[GitHub Container Registry]](https://github.com/EstrellaXD/Auto_Bangumi/pkgs/container/auto_bangumi)

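The branch-selection steps above can be simulated in a throwaway local repository; this is a hedged sketch with hypothetical branch and fix names (it requires the `git` binary on PATH and is not project tooling):

```python
# Local simulation of the contributor branch flow (hypothetical names; needs git).
import subprocess
import tempfile

repo = tempfile.mkdtemp()

def git(*args: str) -> None:
    subprocess.run(["git", "-C", repo, *args], check=True, capture_output=True)

git("init", "-q", "-b", "main")
git("-c", "user.name=demo", "-c", "user.email=demo@example.com",
    "commit", "--allow-empty", "-q", "-m", "init")
git("branch", "3.1-dev")  # currently released Minor branch: bug fixes target this
git("branch", "3.2-dev")  # next, unreleased Minor branch: features target this
git("checkout", "-q", "-b", "fix/parser-crash", "3.1-dev")  # bug fix base
git("checkout", "-q", "-b", "feat/self-rss", "3.2-dev")     # new-feature base
```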
### Git workflow at a glance

> The commit timeline in the diagram runs from left to right --->

```mermaid
%%{init: {'theme': 'base', 'gitGraph': {'showCommitLabel': true}}}%%

gitGraph:
    checkout main
    commit id: "."
    branch 3.0-dev
    commit id: "feat 1"
    commit id: "feat 2"
    commit id: "feat 3"

    checkout main
    merge 3.0-dev tag: "3.0.9"
    commit id: ".."

    branch 3.1-dev
    commit id: "feat 4"

    checkout 3.0-dev
    commit id: "PR merge (fix)"
    checkout main
    merge 3.0-dev tag: "3.0.10"

    checkout 3.1-dev
    commit id: "feat 5"
    commit id: "feat 6"

    checkout main
    merge 3.1-dev tag: "3.1.0"
    commit id: "..."

    branch 3.2-dev
    commit id: "feat 7"
    commit id: "feat 8"

    checkout 3.1-dev
    commit id: "PR merge (fix) "
    checkout main
    merge 3.1-dev tag: "3.1.1"

    checkout 3.2-dev
    commit id: "PR merge (feat)"
```


## Pull Request

Please make sure you have picked the correct PR target branch according to the Git branch management section above:
> - for a "bug fix", PR to the **currently released** version's Minor maintenance branch
> - for a "new feature/refactor", PR to the **next version's** Minor development branch

<br/>

- A PR should do exactly one thing and not bring in unrelated changes;

  split separate concerns into separate PRs, so that each review can focus on a single issue.

- In the PR title and description, briefly explain what you changed and why, including the motivation and intent;

  if there are related issues or RFCs, link them in the PR description;

  this helps the team pick up the context as quickly as possible during code review.

- Make sure the "Allow edits from maintainers" option is checked. It lets us make smaller edits/refactors directly and saves a lot of time.

- Make sure the unit tests and the code-style lint pass locally; these are also checked by the PR's GitHub CI.
  - For bug fixes and new features, the team will usually also ask you to add unit-test coverage for the change.


The team will review contributors' PRs at the earliest opportunity and discuss them or approve the merge.

## How releases work

Releases are currently triggered by the team manually merging a dedicated release PR, after which packaging and publishing run automatically.

PRs with bug fixes are usually released soon after merging, typically within a week;

new features take longer and the timing varies. You can follow the progress on our [GitHub Project](https://github.com/EstrellaXD/Auto_Bangumi/projects?query=is%3Aopen) board; a version is released once all of its planned features are complete.

15  Dockerfile
@@ -17,8 +17,8 @@ ENV S6_SERVICES_GRACETIME=30000 \
WORKDIR /app

COPY backend/requirements.txt .
COPY backend/dist.zip .
RUN apk add --no-cache \
RUN set -ex && \
    apk add --no-cache \
    bash \
    ca-certificates \
    coreutils \
@@ -32,22 +32,19 @@ RUN apk add --no-cache \
    s6-overlay \
    shadow \
    tzdata && \
    python3 -m pip install --upgrade pip && \
    python3 -m pip install --no-cache-dir --upgrade pip && \
    sed -i '/bcrypt/d' requirements.txt && \
    pip install --no-cache-dir -r requirements.txt && \
    # Unzip WebUI \
    unzip dist.zip && \
    # Add user
    addgroup -S ab -g 911 && \
    adduser -S ab -G ab -h /ab -s /bin/bash -u 911 && \
    # Clear
    rm -rf \
        /root/.cache \
        /tmp/* \
        /app/dist.zip
        /ab/.cache \
        /tmp/*

COPY --chmod=755 backend/src/. .
COPY --chmod=755 backend/src/docker /
COPY --chmod=755 docker/ /

ENTRYPOINT [ "/init" ]
@@ -1,6 +1,6 @@
<p align="center">
<img src="docs/image/light-icon.png#gh-light-mode-only" width=50%/ alt="">
<img src="docs/image/dark-icon.png#gh-dark-mode-only" width=50%/ alt="">
<img src="docs/image/icons/light-icon.svg#gh-light-mode-only" width=50%/ alt="">
<img src="docs/image/icons/dark-icon.svg#gh-dark-mode-only" width=50%/ alt="">
</p>
<p align="center">
<img title="docker build version" src="https://img.shields.io/docker/v/estrellaxd/auto_bangumi" alt="">
@@ -12,7 +12,7 @@
# About the project

<p align="center">
<img title="AutoBangumi" src="docs/image/window.png" alt="" width=75%>
<img title="AutoBangumi" src="docs/image/preview/window.png" alt="" width=75%>
</p>

This project is a fully automated tool for following, downloading, and organizing anime, built on [Mikan Project](https://mikanani.me) and [qBittorrent](https://qbittorrent.org). Simply subscribe to a show on [Mikan Project](https://mikanani.me) and it is followed fully automatically; the organized names and directories are recognized directly by media-library software such as [Plex]() and [Jellyfin](), with no second scraping pass needed.
8  backend/.vscode/settings.json  vendored  Normal file
@@ -0,0 +1,8 @@
{
  "python.formatting.provider": "none",
  "python.formatting.blackPath": "black",
  "editor.formatOnSave": true,
  "[python]": {
    "editor.defaultFormatter": "ms-python.black-formatter"
  }
}
4  backend/dev.sh  Normal file → Executable file
@@ -1,10 +1,10 @@
#!/bin/bash
#!/usr/bin/env bash

# This script is used to run the development environment.

python3 -m venv venv

./venv/bin/python3 -m pip install -i https://pypi.tuna.tsinghua.edu.cn/simple install -r requirements-dev.txt
./venv/bin/python3 -m pip install -i https://pypi.tuna.tsinghua.edu.cn/simple -r requirements-dev.txt

# install git-hooks for pre-commit before committing.
./venv/bin/pre-commit install
@@ -1,25 +1,26 @@
anyio
bs4
certifi
charset-normalizer
click
fastapi
h11
idna
pydantic
PySocks
qbittorrent-api
requests
six
sniffio
soupsieve
typing_extensions
urllib3
uvicorn
attrdict
jinja2
python-dotenv
python-jose
passlib
bcrypt
python-multipart
anyio==3.7.0
bs4==0.0.1
certifi==2023.5.7
charset-normalizer==3.1.0
click==8.1.3
fastapi==0.97.0
h11==0.14.0
idna==3.4
pydantic~=1.10
PySocks==1.7.1
qbittorrent-api==2023.6.49
requests==2.31.0
six==1.16.0
sniffio==1.3.0
soupsieve==2.4.1
typing_extensions==4.6.3
urllib3==2.0.3
uvicorn==0.22.0
attrdict==2.0.1
Jinja2==3.1.2
python-dotenv==1.0.0
python-jose==3.3.0
passlib==1.7.4
bcrypt==4.0.1
python-multipart==0.0.6
sqlmodel
42  backend/scripts/pip-lock-version.sh  Executable file
@@ -0,0 +1,42 @@
#!/usr/bin/env bash

#
# Usage:
#   `bash scripts/pip-lock-version.sh`
#
# ```prompt
# Lock the library versions in `requirements.txt` to the current ones from `pip freeze` using shell script,
# but don't change any order in `requirements.txt`
# ```
#

# Create a temporary requirements file using pip freeze
pip freeze > pip_freeze.log

# Read the existing requirements.txt line by line
while IFS= read -r line
do
    # Extract the library name without version
    lib_name=$(echo "$line" | cut -d'=' -f1)

    # Find the corresponding library in the temporary requirements file
    lib_line=$(grep "^$lib_name==" pip_freeze.log)

    # If the library is found, update the line
    if [[ $lib_line ]]
    then
        echo "$lib_line"
    else
        echo "$line"
    fi

# Redirect the output to a new requirements file
done < requirements.txt > new_requirements.log

# Remove the temporary requirements file
rm pip_freeze.log

# Replace the old requirements file with the new one
mv new_requirements.log requirements.txt
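For clarity, the same locking idea sketched in Python (illustration only; the project uses the shell script above):

```python
# Pin each requirement to the version found in a `pip freeze` listing,
# keeping the original order and leaving unmatched lines untouched.
def lock_versions(requirements: list[str], freeze: list[str]) -> list[str]:
    pinned = {line.split("==")[0]: line for line in freeze if "==" in line}
    out = []
    for line in requirements:
        name = line.split("=")[0].strip()  # name up to the first '=' (like cut -d'=')
        out.append(pinned.get(name, line))
    return out

print(lock_versions(["fastapi", "pydantic~=1.10"], ["fastapi==0.97.0", "idna==3.4"]))
# ['fastapi==0.97.0', 'pydantic~=1.10']
```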
@@ -1,14 +0,0 @@
#!/usr/bin/with-contenv bash
# shellcheck shell=bash

function __old_compatible {

    umask ${UMASK}

    if [ -f /config/bangumi.json ]; then
        mv /config/bangumi.json /app/data/bangumi.json
    fi

}

__old_compatible 2>&1 | sed "s#^#cont-init: info: $(realpath $0): &#g"
@@ -1,13 +0,0 @@
#!/usr/bin/with-contenv bash
# shellcheck shell=bash

function __fixuser {

    groupmod -o -g "${PGID}" ab
    usermod -o -u "${PUID}" ab

    chown ab:ab -R /app /ab

}

__fixuser 2>&1 | sed "s#^#cont-init: info: $(realpath $0): &#g"
@@ -58,7 +58,7 @@ if VERSION != "DEV_VERSION":

    @app.get("/favicon-light.svg", tags=["html"])
    def favicon_light():
        return FileResponse("dist/favicon-light.svg")


    @app.get("/robots.txt", tags=["html"])
    def robots():
        return FileResponse("dist/robots.txt")
@@ -70,10 +70,12 @@ if VERSION != "DEV_VERSION":
        return templates.TemplateResponse("index.html", context)

else:

    @app.get("/", status_code=302, tags=["html"])
    def index():
        return RedirectResponse("/docs")


if __name__ == "__main__":
    uvicorn.run(
        app,
@@ -2,13 +2,13 @@ from fastapi import APIRouter, Depends, HTTPException, status
from fastapi.responses import JSONResponse

from module.manager import TorrentManager
from module.models import BangumiData
from module.models import Bangumi
from module.security import get_current_user

router = APIRouter(prefix="/bangumi", tags=["bangumi"])


@router.get("/getAll", response_model=list[BangumiData])
@router.get("/getAll", response_model=list[Bangumi])
async def get_all_data(current_user=Depends(get_current_user)):
    if not current_user:
        raise HTTPException(
@@ -18,7 +18,7 @@ async def get_all_data(current_user=Depends(get_current_user)):
    return torrent.search_all()


@router.get("/getData/{bangumi_id}", response_model=BangumiData)
@router.get("/getData/{bangumi_id}", response_model=Bangumi)
async def get_data(bangumi_id: str, current_user=Depends(get_current_user)):
    if not current_user:
        raise HTTPException(
@@ -29,7 +29,7 @@ async def get_data(bangumi_id: str, current_user=Depends(get_current_user)):


@router.post("/updateRule")
async def update_rule(data: BangumiData, current_user=Depends(get_current_user)):
async def update_rule(data: Bangumi, current_user=Depends(get_current_user)):
    if not current_user:
        raise HTTPException(
            status_code=status.HTTP_401_UNAUTHORIZED, detail="invalid token"

@@ -16,7 +16,7 @@ async def get_config(current_user=Depends(get_current_user)):
        raise HTTPException(
            status_code=status.HTTP_401_UNAUTHORIZED, detail="invalid token"
        )
    return settings
    return settings.dict()


@router.post("/updateConfig")

@@ -1,7 +1,7 @@
from fastapi import APIRouter, Depends, HTTPException, status

from module.manager import SeasonCollector
from module.models import BangumiData
from module.models import Bangumi
from module.models.api import RssLink
from module.rss import analyser
from module.security import get_current_user
@@ -23,9 +23,7 @@ async def analysis(link: RssLink, current_user=Depends(get_current_user)):


@router.post("/collection")
async def download_collection(
    data: BangumiData, current_user=Depends(get_current_user)
):
async def download_collection(data: Bangumi, current_user=Depends(get_current_user)):
    if not current_user:
        raise HTTPException(
            status_code=status.HTTP_401_UNAUTHORIZED, detail="invalid token"
@@ -41,7 +39,7 @@ async def download_collection(


@router.post("/subscribe")
async def subscribe(data: BangumiData, current_user=Depends(get_current_user)):
async def subscribe(data: Bangumi, current_user=Depends(get_current_user)):
    if not current_user:
        raise HTTPException(
            status_code=status.HTTP_401_UNAUTHORIZED, detail="invalid token"
@@ -14,7 +14,7 @@ async def get_log(current_user=Depends(get_current_user)):
        raise HTTPException(
            status_code=status.HTTP_401_UNAUTHORIZED, detail="invalid token"
        )
    if os.path.isfile(LOG_PATH):
    if LOG_PATH.exists():
        with open(LOG_PATH, "rb") as f:
            return Response(f.read(), media_type="text/plain")
    else:
@@ -27,9 +27,8 @@ async def clear_log(current_user=Depends(get_current_user)):
        raise HTTPException(
            status_code=status.HTTP_401_UNAUTHORIZED, detail="invalid token"
        )
    if os.path.isfile(LOG_PATH):
        with open(LOG_PATH, "w") as f:
            f.write("")
    if LOG_PATH.exists():
        LOG_PATH.write_text("")
        return {"status": "ok"}
    else:
        return Response("Log file not found", status_code=404)
@@ -1,6 +1,7 @@
import logging
import os
import signal
import sys

from fastapi import APIRouter, Depends, HTTPException, status

@@ -20,6 +21,7 @@ async def startup():
@router.on_event("shutdown")
async def shutdown():
    program.stop()
    sys.exit(0)


@router.get("/restart")
@@ -1,7 +1,10 @@
from pathlib import Path

from .config import VERSION, settings
from .log import LOG_PATH, setup_logger

TMDB_API = "32b19d6a05b512190a056fa4e747cbbc"
DATA_PATH = "data/data.db"
DATA_PATH = "sqlite:///data/data.db"
LEGACY_DATA_PATH = Path("data/data.json")

PLATFORM = "Windows" if "\\" in settings.downloader.path else "Unix"
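The new `DATA_PATH` value is an SQLAlchemy-style database URL rather than a bare file path; a minimal stdlib sketch of what the prefix encodes:

```python
# "sqlite:///<path>" = the sqlite dialect plus a relative file path (stdlib sketch).
DATA_PATH = "sqlite:///data/data.db"

dialect, sep, path = DATA_PATH.partition(":///")
print(dialect, path)  # sqlite data/data.db
```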
@@ -1,6 +1,7 @@
import json
import logging
import os
from pathlib import Path

from dotenv import load_dotenv

@@ -9,25 +10,26 @@ from module.models.config import Config
from .const import ENV_TO_ATTR

logger = logging.getLogger(__name__)
CONFIG_ROOT = Path("config")


try:
    from module.__version__ import VERSION

    if VERSION == "DEV_VERSION":
        logger.info("Can't find version info, use DEV_VERSION instead")
        CONFIG_PATH = "config/config_dev.json"
    else:
        CONFIG_PATH = "config/config.json"
except ImportError:
    logger.info("Can't find version info, use DEV_VERSION instead")
    VERSION = "DEV_VERSION"
    CONFIG_PATH = "config/config_dev.json"

CONFIG_PATH = (
    CONFIG_ROOT / "config_dev.json"
    if VERSION == "DEV_VERSION"
    else CONFIG_ROOT / "config.json"
).resolve()


class Settings(Config):
    def __init__(self):
        super().__init__()
        if os.path.exists(CONFIG_PATH):
        if CONFIG_PATH.exists():
            self.load()
            self.save()
        else:
@@ -1,17 +1,19 @@
import logging
import os
from pathlib import Path

from .config import settings

LOG_PATH = "data/log.txt"
LOG_ROOT = Path("data")
LOG_PATH = LOG_ROOT / "log.txt"


def setup_logger(level: int = logging.INFO, reset: bool = False):
    level = logging.DEBUG if settings.log.debug_enable else level
    if not os.path.isdir("data"):
        os.mkdir("data")
    if reset and os.path.isfile(LOG_PATH):
        os.remove(LOG_PATH)
    LOG_ROOT.mkdir(exist_ok=True)

    if reset and LOG_PATH.exists():
        LOG_PATH.unlink(missing_ok=True)

    logging.addLevelName(logging.DEBUG, "DEBUG:")
    logging.addLevelName(logging.INFO, "INFO:")
    logging.addLevelName(logging.WARNING, "WARNING:")
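The hunk above swaps `os.path`/`os` calls for `pathlib`; a stdlib sketch of the calls the new code relies on (using a temp directory instead of the real `data/` path):

```python
# pathlib equivalents used by the new setup_logger (stdlib only).
import tempfile
from pathlib import Path

LOG_ROOT = Path(tempfile.mkdtemp()) / "data"
LOG_ROOT.mkdir(exist_ok=True)        # like os.mkdir, but no error if it exists
LOG_PATH = LOG_ROOT / "log.txt"

LOG_PATH.write_text("")              # create/truncate, like open(..., "w")
assert LOG_PATH.exists()             # replaces os.path.isfile for this use
LOG_PATH.unlink(missing_ok=True)     # like os.remove, no error if absent (3.8+)
assert not LOG_PATH.exists()
```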
@@ -1,8 +1,8 @@
import asyncio
import os.path
import threading

from module.checker import Checker
from module.conf import LEGACY_DATA_PATH


class ProgramStatus(Checker):
@@ -51,4 +51,4 @@ class ProgramStatus(Checker):

    @property
    def legacy_data(self):
        return os.path.exists("data/data.json")
        return LEGACY_DATA_PATH.exists()
@@ -1,138 +1,102 @@
import logging

from module.database.orm import Connector
from module.models import BangumiData
from module.conf import DATA_PATH
from sqlmodel import Session, select, delete, or_
from sqlalchemy.sql import func
from typing import Optional

from .engine import engine
from module.models import Bangumi

logger = logging.getLogger(__name__)


class BangumiDatabase(Connector):
    def __init__(self, database: str = DATA_PATH):
        super().__init__(
            table_name="bangumi",
            data=self.__data_to_db(BangumiData()),
            database=database,
        )
class BangumiDatabase(Session):
    def __init__(self, _engine=engine):
        super().__init__(_engine)

    def update_table(self):
        self.update.table()

    @staticmethod
    def __data_to_db(data: BangumiData) -> dict:
        db_data = data.dict()
        for key, value in db_data.items():
            if isinstance(value, bool):
                db_data[key] = int(value)
            elif isinstance(value, list):
                db_data[key] = ",".join(value)
        return db_data

    @staticmethod
    def __db_to_data(db_data: dict) -> BangumiData:
        for key, item in db_data.items():
            if isinstance(item, int):
                if key not in ["id", "offset", "season", "year"]:
                    db_data[key] = bool(item)
            elif key in ["filter", "rss_link"]:
                db_data[key] = item.split(",")
        return BangumiData(**db_data)

    def insert_one(self, data: BangumiData):
        db_data = self.__data_to_db(data)
        self.insert.one(db_data)
    def insert_one(self, data: Bangumi):
        self.add(data)
        self.commit()
        logger.debug(f"[Database] Insert {data.official_title} into database.")
        # if self.__check_exist(data):
        #     self.update_one(data)
        # else:
        #     db_data = self.__data_to_db(data)
        #     db_data["id"] = self.gen_id()
        #     self._insert(db_data=db_data, table_name=self.__table_name)
        #     logger.debug(f"[Database] Insert {data.official_title} into database.")

    def insert_list(self, data: list[BangumiData]):
        data_list = [self.__data_to_db(x) for x in data]
        self.insert.many(data_list)
        # _id = self.gen_id()
        # for i, item in enumerate(data):
        #     item.id = _id + i
        # data_list = [self.__data_to_db(x) for x in data]
        # self._insert_list(data_list=data_list, table_name=self.__table_name)
    def insert_list(self, data: list[Bangumi]):
        self.add_all(data)
        logger.debug(f"[Database] Insert {len(data)} bangumi into database.")

    def update_one(self, data: BangumiData) -> bool:
        db_data = self.__data_to_db(data)
        return self.update.one(db_data)
    def update_one(self, data: Bangumi) -> bool:
        db_data = self.get(Bangumi, data.id)
        if not db_data:
            return False
        bangumi_data = data.dict(exclude_unset=True)
        for key, value in bangumi_data.items():
            setattr(db_data, key, value)
        self.add(db_data)
        self.commit()
        self.refresh(db_data)
        logger.debug(f"[Database] Update {data.official_title}")
        return True

    def update_list(self, data: list[BangumiData]):
        data_list = [self.__data_to_db(x) for x in data]
        self.update.many(data_list)
    def update_list(self, datas: list[Bangumi]):
        for data in datas:
            self.update_one(data)

    def update_rss(self, title_raw, rss_set: str):
        # Update rss and added
        location = {"title_raw": title_raw}
        set_value = {"rss_link": rss_set, "added": 0}
        self.update.value(location, set_value)
        # self._cursor.execute(
        #     """
        #     UPDATE bangumi
        #     SET rss_link = :rss_link, added = 0
        #     WHERE title_raw = :title_raw
        #     """,
        #     {"rss_link": rss_set, "title_raw": title_raw},
        # )
        # self._conn.commit()
        statement = select(Bangumi).where(Bangumi.title_raw == title_raw)
        bangumi = self.exec(statement).first()
        bangumi.rss_link = rss_set
        bangumi.added = False
        self.add(bangumi)
        self.commit()
        self.refresh(bangumi)
        logger.debug(f"[Database] Update {title_raw} rss_link to {rss_set}.")
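The commented-out raw SQL in `update_rss` maps to this stdlib `sqlite3` form (a sketch of the same UPDATE; the new code performs it through the session instead):

```python
# Stdlib equivalent of the raw "UPDATE bangumi SET rss_link ..., added = 0" query.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE bangumi (title_raw TEXT, rss_link TEXT, added INTEGER)")
conn.execute("INSERT INTO bangumi VALUES ('Mygo', 'rss1', 1)")
conn.execute(
    "UPDATE bangumi SET rss_link = :rss_link, added = 0 WHERE title_raw = :title_raw",
    {"rss_link": "rss1,rss2", "title_raw": "Mygo"},
)
conn.commit()
print(conn.execute("SELECT rss_link, added FROM bangumi").fetchone())  # ('rss1,rss2', 0)
```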
|
||||
def update_poster(self, title_raw, poster_link: str):
|
||||
location = {"title_raw": title_raw}
|
||||
set_value = {"poster_link": poster_link}
|
||||
self.update.value(location, set_value)
|
||||
statement = select(Bangumi).where(Bangumi.title_raw == title_raw)
|
||||
bangumi = self.exec(statement).first()
|
||||
bangumi.poster_link = poster_link
|
||||
self.add(bangumi)
|
||||
self.commit()
|
||||
self.refresh(bangumi)
|
||||
logger.debug(f"[Database] Update {title_raw} poster_link to {poster_link}.")
|
||||
|
||||
def delete_one(self, _id: int):
|
||||
self.delete.one(_id)
|
||||
statement = select(Bangumi).where(Bangumi.id == _id)
|
||||
bangumi = self.exec(statement).first()
|
||||
self.delete(bangumi)
|
||||
self.commit()
|
||||
logger.debug(f"[Database] Delete bangumi id: {_id}.")
|
||||
|
||||
def delete_all(self):
|
||||
self.delete.all()
|
||||
statement = delete(Bangumi)
|
||||
self.exec(statement)
|
||||
self.commit()
|
||||
|
||||
def search_all(self) -> list[BangumiData]:
|
||||
all_data = self.select.all()
|
||||
return [self.__db_to_data(x) for x in all_data]
|
||||
def search_all(self) -> list[Bangumi]:
|
||||
statement = select(Bangumi)
|
||||
return self.exec(statement).all()
|
||||
|
||||
def search_id(self, _id: int) -> BangumiData | None:
|
||||
dict_data = self.select.one(conditions={"id": _id})
|
||||
if dict_data is None:
|
||||
def search_id(self, _id: int) -> Optional[Bangumi]:
|
||||
statement = select(Bangumi).where(Bangumi.id == _id)
|
||||
bangumi = self.exec(statement).first()
|
||||
if bangumi is None:
|
||||
logger.warning(f"[Database] Cannot find bangumi id: {_id}.")
|
||||
return None
|
||||
logger.debug(f"[Database] Find bangumi id: {_id}.")
|
||||
return self.__db_to_data(dict_data)
|
||||
|
||||
# def search_official_title(self, official_title: str) -> BangumiData | None:
|
||||
# dict_data = self._search_data(
|
||||
# table_name=self.__table_name, condition={"official_title": official_title}
|
||||
# )
|
||||
# if dict_data is None:
|
||||
# return None
|
||||
# return self.__db_to_data(dict_data)
|
||||
else:
|
||||
logger.debug(f"[Database] Find bangumi id: {_id}.")
|
||||
return self.exec(statement).first()
|
||||
|
||||
def match_poster(self, bangumi_name: str) -> str:
|
||||
condition = {"official_title": bangumi_name}
|
||||
keys = ["poster_link"]
|
||||
data = self.select.one(
|
||||
keys=keys,
|
||||
conditions=condition,
|
||||
combine_operator="INSTR",
|
||||
)
|
||||
if not data:
|
||||
# Use like to match
|
||||
statement = select(Bangumi).where(func.instr(bangumi_name, Bangumi.title_raw) > 0)
|
||||
data = self.exec(statement).first()
|
||||
if data:
|
||||
return data.poster_link
|
||||
else:
|
||||
return ""
|
||||
return data.get("poster_link")
|
||||
|
||||
    def match_list(self, torrent_list: list, rss_link: str) -> list:
        # Match title_raw in database
        keys = ["title_raw", "rss_link", "poster_link"]
        match_datas = self.select.column(keys)
        match_datas = self.search_all()
        if not match_datas:
            return torrent_list
        # Match title
@@ -140,50 +104,43 @@ class BangumiDatabase(Connector):
        while i < len(torrent_list):
            torrent = torrent_list[i]
            for match_data in match_datas:
                if match_data.get("title_raw") in torrent.name:
                    if rss_link not in match_data.get("rss_link"):
                        match_data["rss_link"] += f",{rss_link}"
                        self.update_rss(
                            match_data.get("title_raw"), match_data.get("rss_link")
                        )
                    if not match_data.get("poster_link"):
                        self.update_poster(
                            match_data.get("title_raw"), torrent.poster_link
                        )
                if match_data.title_raw in torrent.name:
                    if rss_link not in match_data.rss_link:
                        match_data.rss_link += f",{rss_link}"
                        self.update_rss(match_data.title_raw, match_data.rss_link)
                    if not match_data.poster_link:
                        self.update_poster(match_data.title_raw, torrent.poster_link)
                    torrent_list.pop(i)
                    break
            else:
                i += 1
        return torrent_list

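The `match_list` loop above relies on the while/for-else pop-or-advance pattern: the index only advances when no match fired, because `pop(i)` already shifts the next element into slot `i`. A minimal standalone sketch of that pattern (the titles here are hypothetical, not the project's data):

```python
def filter_matched(items: list[str], known_titles: list[str]) -> list[str]:
    """Remove every item containing a known title; keep the rest.

    Mirrors match_list's structure: only advance the index when the
    inner for-loop ran to completion without popping (the else branch).
    """
    i = 0
    while i < len(items):
        for title in known_titles:
            if title in items[i]:
                items.pop(i)  # next element slides into position i
                break
        else:
            i += 1  # nothing matched; move on
    return items


print(filter_matched(["[Sub] A 01", "[Sub] B 02", "[Sub] C 03"], ["A", "C"]))
# ['[Sub] B 02']
```

A plain `for i in range(len(items))` with `pop(i)` would skip the element after each removal, which is exactly the bug this shape avoids.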
    def not_complete(self) -> list[BangumiData]:
    def not_complete(self) -> list[Bangumi]:
        # Find eps_complete = False
        condition = {"eps_collect": 0}
        dict_data = self.select.many(
            conditions=condition,
        condition = select(Bangumi).where(Bangumi.eps_collect == 0)
        datas = self.exec(condition).all()
        return datas

    def not_added(self) -> list[Bangumi]:
        conditions = select(Bangumi).where(
            or_(
                Bangumi.added == 0,
                Bangumi.rule_name.is_(None),
                Bangumi.save_path.is_(None),
            )
        )
        return [self.__db_to_data(x) for x in dict_data]
        datas = self.exec(conditions).all()
        return datas

    def not_added(self) -> list[BangumiData]:
        conditions = {"added": 0, "rule_name": None, "save_path": None}
        dict_data = self.select.many(conditions=conditions, combine_operator="OR")
        return [self.__db_to_data(x) for x in dict_data]

    def get_rss(self, rss_link: str) -> list[BangumiData]:
        conditions = {"rss_link": rss_link}
        dict_data = self.select.many(conditions=conditions, combine_operator="INSTR")
        return [self.__db_to_data(x) for x in dict_data]

    def match_torrent(self, torrent_name: str, rss_link: str) -> BangumiData | None:
        conditions = {"title_raw": torrent_name, "rss_link": rss_link}
        dict_data = self.select.one(conditions=conditions, combine_operator="INSTR")
        if not dict_data:
            return None
        return self.__db_to_data(dict_data)

    def disable_rule(self, _id: int):
        statement = select(Bangumi).where(Bangumi.id == _id)
        bangumi = self.exec(statement).first()
        bangumi.deleted = True
        self.add(bangumi)
        self.commit()
        self.refresh(bangumi)
        logger.debug(f"[Database] Disable rule {bangumi.title_raw}.")


if __name__ == "__main__":
    with BangumiDatabase() as db:
        db.match_torrent(
            "魔法科高校の劣等生 来訪者編", "https://bangumi.moe/rss/5f6b3e3e4e8c4b0001b2e3a3"
        )
        print(db.not_complete())

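The `not_added` query combines three columns with OR, and the nullable columns need `IS NULL` rather than `= NULL` (comparing against NULL with `=` matches nothing in SQLite). The equivalent raw SQL can be checked with the stdlib alone; the table and rows below are a sketch, not the project's real schema:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE bangumi (id INTEGER PRIMARY KEY, added INTEGER, "
    "rule_name TEXT, save_path TEXT)"
)
conn.executemany(
    "INSERT INTO bangumi VALUES (:id, :added, :rule_name, :save_path)",
    [
        {"id": 1, "added": 1, "rule_name": "r1", "save_path": "/a"},   # fully set up
        {"id": 2, "added": 0, "rule_name": "r2", "save_path": "/b"},   # not added
        {"id": 3, "added": 1, "rule_name": None, "save_path": "/c"},   # missing rule
    ],
)

# OR-combined condition, mirroring not_added(); note IS NULL, not "= NULL"
rows = conn.execute(
    "SELECT id FROM bangumi "
    "WHERE added = 0 OR rule_name IS NULL OR save_path IS NULL"
).fetchall()
print([r[0] for r in rows])  # [2, 3]
```

This is also why the ORM side should express the null checks as `Column.is_(None)` rather than a bare Python `is None`, which evaluates eagerly to `False` and never reaches the database.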
@@ -1,165 +0,0 @@
import logging
import os
import sqlite3

from module.conf import DATA_PATH

logger = logging.getLogger(__name__)


class DataConnector:
    def __init__(self):
        # Create folder if not exists
        if not os.path.exists(os.path.dirname(DATA_PATH)):
            os.makedirs(os.path.dirname(DATA_PATH))
        self._conn = sqlite3.connect(DATA_PATH)
        self._cursor = self._conn.cursor()

    def _update_table(self, table_name: str, db_data: dict):
        columns = ", ".join(
            [
                f"{key} {self.__python_to_sqlite_type(value)}"
                for key, value in db_data.items()
            ]
        )
        create_table_sql = f"CREATE TABLE IF NOT EXISTS {table_name} ({columns});"
        self._cursor.execute(create_table_sql)
        self._cursor.execute(f"PRAGMA table_info({table_name})")
        existing_columns = {
            column_info[1]: column_info for column_info in self._cursor.fetchall()
        }
        for key, value in db_data.items():
            if key not in existing_columns:
                insert_column = self.__python_to_sqlite_type(value)
                if value is None:
                    value = "NULL"
                add_column_sql = f"ALTER TABLE {table_name} ADD COLUMN {key} {insert_column} DEFAULT {value};"
                self._cursor.execute(add_column_sql)
        self._conn.commit()
        logger.debug(f"Create / Update table {table_name}.")

    def _insert(self, table_name: str, db_data: dict):
        columns = ", ".join(db_data.keys())
        values = ", ".join([f":{key}" for key in db_data.keys()])
        self._cursor.execute(
            f"INSERT INTO {table_name} ({columns}) VALUES ({values})", db_data
        )
        self._conn.commit()

    def _insert_list(self, table_name: str, data_list: list[dict]):
        columns = ", ".join(data_list[0].keys())
        values = ", ".join([f":{key}" for key in data_list[0].keys()])
        self._cursor.executemany(
            f"INSERT INTO {table_name} ({columns}) VALUES ({values})", data_list
        )
        self._conn.commit()

    def _select(self, keys: list[str], table_name: str, condition: str = None) -> dict:
        if condition is None:
            self._cursor.execute(f"SELECT {', '.join(keys)} FROM {table_name}")
        else:
            self._cursor.execute(
                f"SELECT {', '.join(keys)} FROM {table_name} WHERE {condition}"
            )
        return dict(zip(keys, self._cursor.fetchone()))

    def _update(self, table_name: str, db_data: dict):
        _id = db_data.get("id")
        if _id is None:
            raise ValueError("No _id in db_data.")
        set_sql = ", ".join([f"{key} = :{key}" for key in db_data.keys()])
        self._cursor.execute(
            f"UPDATE {table_name} SET {set_sql} WHERE id = {_id}", db_data
        )
        self._conn.commit()
        return self._cursor.rowcount == 1

    def _update_list(self, table_name: str, data_list: list[dict]):
        if len(data_list) == 0:
            return
        set_sql = ", ".join(
            [f"{key} = :{key}" for key in data_list[0].keys() if key != "id"]
        )
        self._cursor.executemany(
            f"UPDATE {table_name} SET {set_sql} WHERE id = :id", data_list
        )
        self._conn.commit()

    def _update_section(self, table_name: str, location: dict, update_dict: dict):
        set_sql = ", ".join([f"{key} = :{key}" for key in update_dict.keys()])
        sql_loc = f"{location['key']} = {location['value']}"
        self._cursor.execute(
            f"UPDATE {table_name} SET {set_sql} WHERE {sql_loc}", update_dict
        )
        self._conn.commit()

    def _delete_all(self, table_name: str):
        self._cursor.execute(f"DELETE FROM {table_name}")
        self._conn.commit()

    def _delete(self, table_name: str, condition: dict):
        condition_sql = " AND ".join([f"{key} = :{key}" for key in condition.keys()])
        self._cursor.execute(f"DELETE FROM {table_name} WHERE {condition_sql}", condition)
        self._conn.commit()

    def _search(self, table_name: str, keys: list[str] | None = None, condition: dict = None):
        if keys is None:
            select_sql = "*"
        else:
            select_sql = ", ".join(keys)
        if condition is None:
            self._cursor.execute(f"SELECT {select_sql} FROM {table_name}")
        else:
            custom_condition = condition.pop("_custom_condition", None)
            condition_sql = " AND ".join([f"{key} = :{key}" for key in condition.keys()]) + (
                f" AND {custom_condition}" if custom_condition else ""
            )
            self._cursor.execute(
                f"SELECT {select_sql} FROM {table_name} WHERE {condition_sql}", condition
            )

    def _search_data(self, table_name: str, keys: list[str] | None = None, condition: dict = None) -> dict:
        if keys is None:
            keys = self.__get_table_columns(table_name)
        self._search(table_name, keys, condition)
        return dict(zip(keys, self._cursor.fetchone()))

    def _search_datas(self, table_name: str, keys: list[str] | None = None, condition: dict = None) -> list[dict]:
        if keys is None:
            keys = self.__get_table_columns(table_name)
        self._search(table_name, keys, condition)
        return [dict(zip(keys, row)) for row in self._cursor.fetchall()]

    def _table_exists(self, table_name: str) -> bool:
        self._cursor.execute(
            "SELECT name FROM sqlite_master WHERE type='table' AND name=?;",
            (table_name,),
        )
        return len(self._cursor.fetchall()) == 1

    def __get_table_columns(self, table_name: str) -> list[str]:
        self._cursor.execute(f"PRAGMA table_info({table_name})")
        return [column_info[1] for column_info in self._cursor.fetchall()]

    @staticmethod
    def __python_to_sqlite_type(value) -> str:
        if isinstance(value, int):
            return "INTEGER NOT NULL"
        elif isinstance(value, float):
            return "REAL NOT NULL"
        elif isinstance(value, str):
            return "TEXT NOT NULL"
        elif isinstance(value, bool):
            return "INTEGER NOT NULL"
        elif isinstance(value, list):
            return "TEXT NOT NULL"
        elif value is None:
            return "TEXT"
        else:
            raise ValueError(f"Unsupported data type: {type(value)}")

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc_val, exc_tb):
        self._conn.close()
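The deleted `_update_table` method above migrates a table in place: it diffs `PRAGMA table_info` against the desired columns and issues `ALTER TABLE ... ADD COLUMN` for anything missing. A compact stdlib-only sketch of that migration step (table and columns here are illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE bangumi (id INTEGER PRIMARY KEY, title TEXT)")

# Desired schema; only the missing column should be added.
desired = {"id": "INTEGER PRIMARY KEY", "title": "TEXT", "poster_link": "TEXT"}

# PRAGMA table_info yields one row per column; index 1 is the column name.
existing = {row[1] for row in conn.execute("PRAGMA table_info(bangumi)")}
for name, col_type in desired.items():
    if name not in existing:
        conn.execute(f"ALTER TABLE bangumi ADD COLUMN {name} {col_type}")

cols = [row[1] for row in conn.execute("PRAGMA table_info(bangumi)")]
print(cols)  # ['id', 'title', 'poster_link']
```

Note that SQLite's `ALTER TABLE ADD COLUMN` cannot add a PRIMARY KEY or UNIQUE column, which is one reason the project moved this bookkeeping to `SQLModel.metadata.create_all` in the new code.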
7
backend/src/module/database/engine.py
Normal file
@@ -0,0 +1,7 @@
from sqlmodel import create_engine, Session
from module.conf import DATA_PATH


engine = create_engine(DATA_PATH)

db_session = Session(engine)
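One caveat worth flagging on the new engine module: SQLAlchemy/SQLModel's `create_engine` expects a database URL, not a bare filesystem path, so `DATA_PATH` presumably already carries a `sqlite:///` prefix (an assumption; the constant lives in `module.conf` and is not shown in this diff). The URL shape can be sketched with the stdlib alone:

```python
from pathlib import Path


def sqlite_url(db_path: str) -> str:
    """Build the sqlite URL create_engine expects from a plain file path.

    Hypothetical helper: the real DATA_PATH may already be a full URL.
    Uses as_posix() so backslash Windows paths normalize to slashes.
    """
    return f"sqlite:///{Path(db_path).as_posix()}"


print(sqlite_url("data/data.db"))  # sqlite:///data/data.db
```

If `DATA_PATH` were still the raw `.db` path used by the old `sqlite3.connect` code, `create_engine` would reject it with "Could not parse SQLAlchemy URL".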
@@ -1 +0,0 @@
from .connector import Connector
@@ -1,62 +0,0 @@
import sqlite3

from .delete import Delete
from .insert import Insert
from .select import Select
from .update import Update

from module.conf import DATA_PATH


class Connector:
    def __init__(self, table_name: str, data: dict, database: str = DATA_PATH):
        self._conn = sqlite3.connect(database)
        self._cursor = self._conn.cursor()
        self.update = Update(self, table_name, data)
        self.insert = Insert(self, table_name, data)
        self.select = Select(self, table_name, data)
        self.delete = Delete(self, table_name, data)
        self._columns = self.__get_columns(table_name)

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc_value, traceback):
        self._conn.close()

    def __get_columns(self, table_name: str) -> list[str]:
        self._cursor.execute(f"PRAGMA table_info({table_name})")
        return [x[1] for x in self._cursor.fetchall()]

    def execute(self, sql: str, params: tuple = None):
        if params is None:
            self._cursor.execute(sql)
        else:
            self._cursor.execute(sql, params)
        self._conn.commit()

    def executemany(self, sql: str, params: list[tuple]):
        self._cursor.executemany(sql, params)
        self._conn.commit()

    def fetchall(self, keys: str = None) -> list[dict]:
        datas = self._cursor.fetchall()
        if keys:
            return [dict(zip(keys, data)) for data in datas]
        return [dict(zip(self._columns, data)) for data in datas]

    def fetchone(self, keys: list[str] = None) -> dict:
        data = self._cursor.fetchone()
        if data:
            if keys:
                return dict(zip(keys, data))
            return dict(zip(self._columns, data))

    def fetchmany(self, keys: list[str], size: int) -> list[dict]:
        datas = self._cursor.fetchmany(size)
        if keys:
            return [dict(zip(keys, data)) for data in datas]
        return [dict(zip(self._columns, data)) for data in datas]

    def fetch(self):
        return self._cursor.fetchall()
@@ -1,23 +0,0 @@
class Delete:
    def __init__(self, connector, table_name: str, data: dict):
        self._connector = connector
        self._table_name = table_name
        self._data = data

    def one(self, _id: int) -> bool:
        self._connector.execute(
            f"""
            DELETE FROM {self._table_name}
            WHERE id = :id
            """,
            {"id": _id},
        )
        return True

    def all(self):
        self._connector.execute(
            f"""
            DELETE FROM {self._table_name}
            """,
        )
        return True
@@ -1,33 +0,0 @@
class Insert:
    def __init__(self, connector, table_name: str, data: dict):
        self._connector = connector
        self._table_name = table_name
        self._columns = data.items()

    def __gen_id(self) -> int:
        self._connector.execute(
            f"""
            SELECT MAX(id) FROM {self._table_name}
            """,
        )
        max_id = self._connector.fetchone(keys=["id"]).get("id")
        if max_id is None:
            return 1
        return max_id + 1

    def one(self, data: dict):
        _id = self.__gen_id()
        data["id"] = _id
        columns = ", ".join(data.keys())
        placeholders = ", ".join([f":{key}" for key in data.keys()])
        self._connector.execute(
            f"""
            INSERT INTO {self._table_name} ({columns})
            VALUES ({placeholders})
            """,
            data,
        )

    def many(self, data: list[dict]):
        for item in data:
            self.one(item)
@@ -1,96 +0,0 @@
class Select:
    def __init__(self, connector, table_name: str, data: dict):
        self._connector = connector
        self._table_name = table_name
        self._data = data

    def id(self, _id: int):
        self._connector.execute(
            f"""
            SELECT * FROM {self._table_name}
            WHERE id = :id
            """,
            {"id": _id},
        )
        return self._connector.fetchone()

    def all(self, limit: int = None):
        if limit is None:
            limit = 10000
        self._connector.execute(
            f"""
            SELECT * FROM {self._table_name} LIMIT {limit}
            """,
        )
        return self._connector.fetchall()

    def one(
        self,
        keys: list[str] | None = None,
        conditions: dict = None,
        combine_operator: str = "AND",
    ):
        if keys is None:
            columns = "*"
        else:
            columns = ", ".join(keys)
        condition_sql = self.__select_condition(conditions, combine_operator)
        self._connector.execute(
            f"""
            SELECT {columns} FROM {self._table_name}
            WHERE {condition_sql}
            """,
            conditions,
        )
        return self._connector.fetchone(keys)

    def many(
        self,
        keys: list[str] | None = None,
        conditions: dict = None,
        combine_operator: str = "AND",
        limit: int = None,
    ):
        if keys is None:
            columns = "*"
        else:
            columns = ", ".join(keys)
        if limit is None:
            limit = 10000
        condition_sql = self.__select_condition(conditions, combine_operator)
        self._connector.execute(
            f"""
            SELECT {columns} FROM {self._table_name}
            WHERE {condition_sql}
            LIMIT {limit}
            """,
            conditions,
        )
        return self._connector.fetchall(keys)

    def column(self, keys: list[str]):
        columns = ", ".join(keys)
        self._connector.execute(
            f"""
            SELECT {columns} FROM {self._table_name}
            """,
        )
        return self._connector.fetchall(keys)

    @staticmethod
    def __select_condition(conditions: dict, combine_operator: str = "AND"):
        if not conditions:
            raise ValueError("No conditions provided.")
        if combine_operator not in ["AND", "OR", "INSTR"]:
            raise ValueError(
                "Invalid combine_operator, must be 'AND' or 'OR' or 'INSTR'."
            )
        if combine_operator == "INSTR":
            condition_sql = " AND ".join(
                [f"INSTR({key}, :{key})" for key in conditions.keys()]
            )
        else:
            condition_sql = f" {combine_operator} ".join(
                [f"{key} = :{key}" for key in conditions.keys()]
            )
        return condition_sql
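The `__select_condition` helper above turns a conditions dict into a parameterized WHERE fragment with two modes: equality joined by AND/OR, or substring matching via SQLite's `INSTR`. A standalone sketch of the same logic:

```python
def select_condition(conditions: dict, combine_operator: str = "AND") -> str:
    """Mirror of the deleted ORM helper: equality joins for AND/OR,
    substring matching via SQLite's INSTR(haystack, needle) otherwise.
    Values stay out of the SQL string; :key placeholders are bound later.
    """
    if not conditions:
        raise ValueError("No conditions provided.")
    if combine_operator == "INSTR":
        return " AND ".join(f"INSTR({k}, :{k})" for k in conditions)
    return f" {combine_operator} ".join(f"{k} = :{k}" for k in conditions)


print(select_condition({"title_raw": "x", "rss_link": "y"}, "INSTR"))
# INSTR(title_raw, :title_raw) AND INSTR(rss_link, :rss_link)
print(select_condition({"added": 0, "rule_name": None}, "OR"))
# added = :added OR rule_name = :rule_name
```

The column names are interpolated directly into the SQL, so this shape is only safe while the keys come from trusted code, never from user input.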
@@ -1,100 +0,0 @@
import logging

logger = logging.getLogger(__name__)


class Update:
    def __init__(self, connector, table_name: str, data: dict):
        self._connector = connector
        self._table_name = table_name
        self._example_data = data

    def __table_exists(self) -> bool:
        self._connector.execute(
            f"""
            SELECT name FROM sqlite_master
            WHERE type='table' AND name='{self._table_name}'
            """
        )
        return self._connector.fetch() is not None

    def table(self):
        columns_list = [
            self.__python_to_sqlite_type(key, value)
            for key, value in self._example_data.items()
        ]
        columns = ", ".join(columns_list)
        create_table_sql = f"CREATE TABLE IF NOT EXISTS {self._table_name} ({columns});"
        self._connector.execute(create_table_sql)
        logger.debug(f"Create table {self._table_name}.")
        self._connector.execute(f"PRAGMA table_info({self._table_name})")
        existing_columns = [x[1] for x in self._connector.fetch()]
        for key, value in self._example_data.items():
            if key not in existing_columns:
                insert_column = self.__python_to_sqlite_type(key, value)
                if value is None:
                    value = "NULL"
                add_column_sql = f"ALTER TABLE {self._table_name} ADD COLUMN {insert_column} DEFAULT {value};"
                self._connector.execute(add_column_sql)
        logger.debug(f"Update table {self._table_name}.")

    def one(self, data: dict) -> bool:
        _id = data["id"]
        set_sql = ", ".join([f"{key} = :{key}" for key in data.keys()])
        self._connector.execute(
            f"""
            UPDATE {self._table_name}
            SET {set_sql}
            WHERE id = :id
            """,
            data,
        )
        logger.debug(f"Update {_id} in {self._table_name}.")
        return True

    def many(self, data: list[dict]) -> bool:
        columns = ", ".join([f"{key} = :{key}" for key in data[0].keys()])
        self._connector.executemany(
            f"""
            UPDATE {self._table_name}
            SET {columns}
            WHERE id = :id
            """,
            data,
        )
        logger.debug(f"Update {self._table_name}.")
        return True

    def value(self, location: dict, set_value: dict) -> bool:
        set_sql = ", ".join([f"{key} = :{key}" for key in set_value.keys()])
        params = {**location, **set_value}
        self._connector.execute(
            f"""
            UPDATE {self._table_name}
            SET {set_sql}
            WHERE {location["key"]} = :{location["key"]}
            """,
            params,
        )
        logger.debug(f"Update {self._table_name}.")
        return True

    @staticmethod
    def __python_to_sqlite_type(key, value) -> str:
        if key == "id":
            column = "INTEGER PRIMARY KEY"
        elif isinstance(value, int):
            column = "INTEGER NOT NULL"
        elif isinstance(value, float):
            column = "REAL NOT NULL"
        elif isinstance(value, str):
            column = "TEXT NOT NULL"
        elif isinstance(value, bool):
            column = "INTEGER NOT NULL"
        elif isinstance(value, list):
            column = "TEXT NOT NULL"
        elif value is None:
            column = "TEXT"
        else:
            raise ValueError(f"Unsupported data type: {type(value)}")
        return f"{key} {column}"
@@ -2,72 +2,58 @@ import logging

from fastapi import HTTPException

from module.database.connector import DataConnector
from module.models.user import User
from module.models.user import User, UserUpdate, UserLogin
from module.security.jwt import get_password_hash, verify_password
from module.database.engine import engine
from sqlmodel import Session, select, SQLModel
from sqlalchemy.exc import UnboundExecutionError, OperationalError

logger = logging.getLogger(__name__)


class AuthDB(DataConnector):
class UserDatabase(Session):
    def __init__(self):
        super().__init__()
        self.__table_name = "user"
        if not self._table_exists(self.__table_name):
            self.__update_table()

    def __update_table(self):
        db_data = self.__data_to_db(User())
        self._update_table(self.__table_name, db_data)
        self._insert(self.__table_name, db_data)

    @staticmethod
    def __data_to_db(data: User) -> dict:
        db_data = data.dict()
        db_data["password"] = get_password_hash(db_data["password"])
        return db_data

    @staticmethod
    def __db_to_data(db_data: dict) -> User:
        return User(**db_data)
        super().__init__(engine)
        statement = select(User)
        try:
            self.exec(statement)
        except OperationalError:
            SQLModel.metadata.create_all(engine)
            self.add(User())
            self.commit()

    def get_user(self, username):
        self._cursor.execute(
            f"SELECT * FROM {self.__table_name} WHERE username=?", (username,)
        )
        result = self._cursor.fetchone()
        statement = select(User).where(User.username == username)
        result = self.exec(statement).first()
        if not result:
            return None
        db_data = dict(zip([x[0] for x in self._cursor.description], result))
        return self.__db_to_data(db_data)
            raise HTTPException(status_code=404, detail="User not found")
        return result

    def auth_user(self, username, password) -> bool:
        self._cursor.execute(
            f"SELECT username, password FROM {self.__table_name} WHERE username=?",
            (username,),
        )
        result = self._cursor.fetchone()
    def auth_user(self, user: UserLogin) -> bool:
        statement = select(User).where(User.username == user.username)
        result = self.exec(statement).first()
        if not result:
            raise HTTPException(status_code=401, detail="User not found")
        if not verify_password(password, result[1]):
        if not verify_password(user.password, result.password):
            raise HTTPException(status_code=401, detail="Password error")
        return True

    def update_user(self, username, update_user: User):
    def update_user(self, username, update_user: UserUpdate):
        # Update username and password
        new_username = update_user.username
        new_password = update_user.password
        self._cursor.execute(
            f"""
            UPDATE {self.__table_name}
            SET username = '{new_username}', password = '{get_password_hash(new_password)}'
            WHERE username = '{username}'
            """
        )
        self._conn.commit()
        statement = select(User).where(User.username == username)
        result = self.exec(statement).first()
        if not result:
            raise HTTPException(status_code=404, detail="User not found")
        if update_user.username:
            result.username = update_user.username
        if update_user.password:
            result.password = get_password_hash(update_user.password)
        self.add(result)
        self.commit()
        return result


if __name__ == "__main__":
    with AuthDB() as db:
    with UserDatabase() as db:
        # db.update_user(UserLogin(username="admin", password="adminadmin"), User(username="admin", password="cica1234"))
        db.update_user("admin", User(username="estrella", password="cica1234"))
        db.update_user("admin", UserUpdate(username="estrella", password="cica1234"))

@@ -1,7 +1,7 @@
import logging

from module.conf import settings
from module.models import BangumiData
from module.models import Bangumi

from .path import TorrentPath

@@ -68,7 +68,7 @@ class DownloadClient(TorrentPath):
        prefs = self.client.get_app_prefs()
        settings.downloader.path = self._join_path(prefs["save_path"], "Bangumi")

    def set_rule(self, data: BangumiData):
    def set_rule(self, data: Bangumi):
        data.rule_name = self._rule_name(data)
        data.save_path = self._gen_save_path(data)
        rule = {
@@ -92,7 +92,7 @@ class DownloadClient(TorrentPath):
            f"[Downloader] Add {data.official_title} Season {data.season} to auto download rules."
        )

    def set_rules(self, bangumi_info: list[BangumiData]):
    def set_rules(self, bangumi_info: list[Bangumi]):
        logger.debug("[Downloader] Start adding rules.")
        for info in bangumi_info:
            self.set_rule(info)

@@ -1,13 +1,10 @@
import logging
from os import PathLike
import re
from pathlib import Path

from module.conf import settings
from module.models import BangumiData

if ":\\" in settings.downloader.path:
    import ntpath as path
else:
    import os.path as path
from module.models import Bangumi

logger = logging.getLogger(__name__)

@@ -22,7 +19,7 @@ class TorrentPath:
        subtitle_list = []
        for f in info.files:
            file_name = f.name
            suffix = path.splitext(file_name)[-1]
            suffix = Path(file_name).suffix
            if suffix.lower() in [".mp4", ".mkv"]:
                media_list.append(file_name)
            elif suffix.lower() in [".ass", ".srt"]:
@@ -30,10 +27,10 @@ class TorrentPath:
        return media_list, subtitle_list

    @staticmethod
    def _path_to_bangumi(save_path):
    def _path_to_bangumi(save_path: PathLike[str] | str):
        # Split save path and download path
        save_parts = save_path.split(path.sep)
        download_parts = settings.downloader.path.split(path.sep)
        save_parts = Path(save_path).parts
        download_parts = Path(settings.downloader.path).parts
        # Get bangumi name and season
        bangumi_name = ""
        season = 1
@@ -45,22 +42,22 @@ class TorrentPath:
        return bangumi_name, season

    @staticmethod
    def _file_depth(file_path):
        return len(file_path.split(path.sep))
    def _file_depth(file_path: PathLike[str] | str):
        return len(Path(file_path).parts)

    def is_ep(self, file_path):
    def is_ep(self, file_path: PathLike[str] | str):
        return self._file_depth(file_path) <= 2

    @staticmethod
    def _gen_save_path(data: BangumiData):
    def _gen_save_path(data: Bangumi):
        folder = (
            f"{data.official_title} ({data.year})" if data.year else data.official_title
        )
        save_path = path.join(settings.downloader.path, folder, f"Season {data.season}")
        return save_path
        save_path = Path(settings.downloader.path) / folder / f"Season {data.season}"
        return str(save_path)

    @staticmethod
    def _rule_name(data: BangumiData):
    def _rule_name(data: Bangumi):
        rule_name = (
            f"[{data.group_name}] {data.official_title} S{data.season}"
            if settings.bangumi_manage.group_tag
@@ -70,4 +67,4 @@ class TorrentPath:

    @staticmethod
    def _join_path(*args):
        return path.join(*args)
        return str(Path(*args))

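The path.py change replaces the conditional `ntpath`/`os.path` import with pathlib, so one API handles both separator styles. A simplified sketch of the `_path_to_bangumi` idea (the real loop body is elided in the hunk above, so the part-filtering logic here is an assumption):

```python
from pathlib import PurePosixPath


def path_to_bangumi(save_path: str, download_path: str) -> tuple[str, int]:
    """Strip the download root from a save path and read off
    '<title>/Season <n>'. Simplified sketch of _path_to_bangumi;
    PurePosixPath keeps the example deterministic across platforms.
    """
    download_parts = PurePosixPath(download_path).parts
    bangumi_name, season = "", 1
    for part in PurePosixPath(save_path).parts:
        if part in download_parts:
            continue  # still inside the download root
        if part.startswith("Season"):
            season = int(part.split(" ")[1])
        else:
            bangumi_name = part
    return bangumi_name, season


print(path_to_bangumi("/downloads/Bangumi/Frieren (2023)/Season 2", "/downloads/Bangumi"))
# ('Frieren (2023)', 2)
```

`Path(...).parts` is what makes this clean: it splits on whichever separator the platform uses, which is exactly the case the deleted `":\\" in settings.downloader.path` check was trying to handle by hand.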
@@ -2,14 +2,14 @@ import logging

from module.database import BangumiDatabase
from module.downloader import DownloadClient
from module.models import BangumiData
from module.models import Bangumi
from module.searcher import SearchTorrent

logger = logging.getLogger(__name__)


class SeasonCollector(DownloadClient):
    def add_season_torrents(self, data: BangumiData, torrents, torrent_files=None):
    def add_season_torrents(self, data: Bangumi, torrents, torrent_files=None):
        if torrent_files:
            download_info = {
                "torrent_files": torrent_files,
@@ -23,7 +23,7 @@ class SeasonCollector(DownloadClient):
            }
        return self.add_torrent(download_info)

    def collect_season(self, data: BangumiData, link: str = None, proxy: bool = False):
    def collect_season(self, data: Bangumi, link: str = None, proxy: bool = False):
        logger.info(f"Start collecting {data.official_title} Season {data.season}...")
        with SearchTorrent() as st:
            if not link:
@@ -39,7 +39,7 @@ class SeasonCollector(DownloadClient):
                data=data, torrents=torrents, torrent_files=torrent_files
            )

    def subscribe_season(self, data: BangumiData):
    def subscribe_season(self, data: Bangumi):
        with BangumiDatabase() as db:
            data.added = True
            data.eps_collect = True
@@ -59,5 +59,3 @@ def eps_complete():
                sc.collect_season(data)
                data.eps_collect = True
            bd.update_list(datas)

@@ -4,21 +4,21 @@ from fastapi.responses import JSONResponse

from module.database import BangumiDatabase
from module.downloader import DownloadClient
from module.models import BangumiData
from module.models import Bangumi

logger = logging.getLogger(__name__)


class TorrentManager(BangumiDatabase):
    @staticmethod
    def __match_torrents_list(data: BangumiData) -> list:
    def __match_torrents_list(data: Bangumi) -> list:
        with DownloadClient() as client:
            torrents = client.get_torrent_info(status_filter=None)
        return [
            torrent.hash for torrent in torrents if torrent.save_path == data.save_path
        ]

    def delete_torrents(self, data: BangumiData, client: DownloadClient):
    def delete_torrents(self, data: Bangumi, client: DownloadClient):
        hash_list = self.__match_torrents_list(data)
        if hash_list:
            client.delete_torrent(hash_list)
@@ -29,7 +29,7 @@ class TorrentManager(BangumiDatabase):

    def delete_rule(self, _id: int | str, file: bool = False):
        data = self.search_id(int(_id))
        if isinstance(data, BangumiData):
        if isinstance(data, Bangumi):
            with DownloadClient() as client:
                client.remove_rule(data.rule_name)
                client.remove_rss_feed(data.official_title)
@@ -54,7 +54,7 @@ class TorrentManager(BangumiDatabase):

    def disable_rule(self, _id: str | int, file: bool = False):
        data = self.search_id(int(_id))
        if isinstance(data, BangumiData):
        if isinstance(data, Bangumi):
            with DownloadClient() as client:
                client.remove_rule(data.rule_name)
                data.deleted = True
@@ -81,7 +81,7 @@ class TorrentManager(BangumiDatabase):

    def enable_rule(self, _id: str | int):
        data = self.search_id(int(_id))
        if isinstance(data, BangumiData):
        if isinstance(data, Bangumi):
            data.deleted = False
            self.update_one(data)
            with DownloadClient() as client:
@@ -98,7 +98,7 @@ class TorrentManager(BangumiDatabase):
                status_code=406, content={"msg": f"Can't find bangumi id {_id}"}
            )

    def update_rule(self, data: BangumiData):
    def update_rule(self, data: Bangumi):
        old_data = self.search_id(data.id)
        if not old_data:
            logger.error(f"[Manager] Can't find data with {data.id}")

@@ -1,5 +1,5 @@
|
||||
from .bangumi import BangumiData
|
||||
from .bangumi import Bangumi, Episode
|
||||
from .config import Config
|
||||
from .rss import RSSItem, TorrentData
|
||||
from .rss import RSSTorrents
|
||||
from .torrent import EpisodeFile, SubtitleFile, TorrentBase
|
||||
from .user import UserLogin
|
||||
|
||||
@@ -1,27 +1,50 @@
 from dataclasses import dataclass

-from pydantic import BaseModel, Field
+from pydantic import BaseModel
+from sqlmodel import SQLModel, Field
+from typing import Optional


-class BangumiData(BaseModel):
-    id: int = Field(0, alias="id", title="番剧ID")
-    official_title: str = Field("official_title", alias="official_title", title="番剧中文名")
-    year: str | None = Field(None, alias="year", title="番剧年份")
-    title_raw: str = Field("title_raw", alias="title_raw", title="番剧原名")
-    season: int = Field(1, alias="season", title="番剧季度")
-    season_raw: str | None = Field(None, alias="season_raw", title="番剧季度原名")
-    group_name: str | None = Field(None, alias="group_name", title="字幕组")
-    dpi: str | None = Field(None, alias="dpi", title="分辨率")
-    source: str | None = Field(None, alias="source", title="来源")
-    subtitle: str | None = Field(None, alias="subtitle", title="字幕")
-    eps_collect: bool = Field(False, alias="eps_collect", title="是否已收集")
-    offset: int = Field(0, alias="offset", title="番剧偏移量")
-    filter: list[str] = Field(["720", "\\d+-\\d+"], alias="filter", title="番剧过滤器")
-    rss_link: list[str] = Field([], alias="rss_link", title="番剧RSS链接")
-    poster_link: str | None = Field(None, alias="poster_link", title="番剧海报链接")
-    added: bool = Field(False, alias="added", title="是否已添加")
-    rule_name: str | None = Field(None, alias="rule_name", title="番剧规则名")
-    save_path: str | None = Field(None, alias="save_path", title="番剧保存路径")
+class Bangumi(SQLModel, table=True):
+    id: int = Field(default=None, primary_key=True)
+    official_title: str = Field(
+        default="official_title", alias="official_title", title="番剧中文名"
+    )
+    year: Optional[str] = Field(alias="year", title="番剧年份")
+    title_raw: str = Field(default="title_raw", alias="title_raw", title="番剧原名")
+    season: int = Field(default=1, alias="season", title="番剧季度")
+    season_raw: Optional[str] = Field(alias="season_raw", title="番剧季度原名")
+    group_name: Optional[str] = Field(alias="group_name", title="字幕组")
+    dpi: Optional[str] = Field(alias="dpi", title="分辨率")
+    source: Optional[str] = Field(alias="source", title="来源")
+    subtitle: Optional[str] = Field(alias="subtitle", title="字幕")
+    eps_collect: bool = Field(default=False, alias="eps_collect", title="是否已收集")
+    offset: int = Field(default=0, alias="offset", title="番剧偏移量")
+    filter: str = Field(default="720, \\d+-\\d+", alias="filter", title="番剧过滤器")
+    rss_link: str = Field(default="", alias="rss_link", title="番剧RSS链接")
+    poster_link: Optional[str] = Field(alias="poster_link", title="番剧海报链接")
+    added: bool = Field(default=False, alias="added", title="是否已添加")
+    rule_name: Optional[str] = Field(alias="rule_name", title="番剧规则名")
+    save_path: Optional[str] = Field(alias="save_path", title="番剧保存路径")
+    deleted: bool = Field(False, alias="deleted", title="是否已删除")
+
+
+class BangumiUpdate(SQLModel):
+    official_title: str = Field(
+        default="official_title", alias="official_title", title="番剧中文名"
+    )
+    year: Optional[str] = Field(alias="year", title="番剧年份")
+    season: int = Field(default=1, alias="season", title="番剧季度")
+    season_raw: Optional[str] = Field(alias="season_raw", title="番剧季度原名")
+    group_name: Optional[str] = Field(alias="group_name", title="字幕组")
+    dpi: Optional[str] = Field(alias="dpi", title="分辨率")
+    source: Optional[str] = Field(alias="source", title="来源")
+    subtitle: Optional[str] = Field(alias="subtitle", title="字幕")
+    eps_collect: bool = Field(default=False, alias="eps_collect", title="是否已收集")
+    offset: int = Field(default=0, alias="offset", title="番剧偏移量")
+    filter: str = Field(default="720, \\d+-\\d+", alias="filter", title="番剧过滤器")
+    rss_link: str = Field(default="", alias="rss_link", title="番剧RSS链接")
+    added: bool = Field(default=False, alias="added", title="是否已添加")
+    deleted: bool = Field(False, alias="deleted", title="是否已删除")


@@ -29,14 +52,14 @@ class Notification(BaseModel):
     official_title: str = Field(..., alias="official_title", title="番剧名")
     season: int = Field(..., alias="season", title="番剧季度")
     episode: int = Field(..., alias="episode", title="番剧集数")
-    poster_path: str | None = Field(None, alias="poster_path", title="番剧海报路径")
+    poster_path: Optional[str] = Field(None, alias="poster_path", title="番剧海报路径")


 @dataclass
 class Episode:
-    title_en: str | None
-    title_zh: str | None
-    title_jp: str | None
+    title_en: Optional[str]
+    title_zh: Optional[str]
+    title_jp: Optional[str]
     season: int
     season_raw: str
     episode: int
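The `filter` field in this hunk moves from `list[str]` to a single comma-separated string, since a SQLite-backed SQLModel table has no native list column. A minimal sketch of round-tripping that value (field name and sample value come from the diff; the split logic here is an assumption, not the project's actual helper):

```python
import re

# Value as persisted in a Bangumi row after the schema change.
filter_field = "720, \\d+-\\d+"

# Hypothetical restore of the pre-3.1 list form for matching.
patterns = [p.strip() for p in filter_field.split(",")]
assert patterns == ["720", "\\d+-\\d+"]

# Each entry is still usable as a regex against a torrent title.
assert re.search(patterns[1], "Some Show 01-12 batch") is not None
```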
@@ -1,3 +1,4 @@
+from os.path import expandvars
 from pydantic import BaseModel, Field

 # Sub config
@@ -12,21 +13,35 @@ class Program(BaseModel):
 class Downloader(BaseModel):
     type: str = Field("qbittorrent", description="Downloader type")
     host: str = Field("172.17.0.1:8080", description="Downloader host")
-    username: str = Field("admin", description="Downloader username")
-    password: str = Field("adminadmin", description="Downloader password")
+    username_: str = Field("admin", alias="username", description="Downloader username")
+    password_: str = Field(
+        "adminadmin", alias="password", description="Downloader password"
+    )
     path: str = Field("/downloads/Bangumi", description="Downloader path")
     ssl: bool = Field(False, description="Downloader ssl")

+    @property
+    def username(self):
+        return expandvars(self.username_)
+
+    @property
+    def password(self):
+        return expandvars(self.password_)
+

 class RSSParser(BaseModel):
     enable: bool = Field(True, description="Enable RSS parser")
     type: str = Field("mikan", description="RSS parser type")
-    token: str = Field("token", description="RSS parser token")
+    token_: str = Field("token", alias="token", description="RSS parser token")
     custom_url: str = Field("mikanani.me", description="Custom RSS host url")
     parser_type: str = Field("parser", description="Parser type")
     filter: list[str] = Field(["720", r"\d+-\d"], description="Filter")
     language: str = "zh"
+
+    @property
+    def token(self):
+        return expandvars(self.token_)


 class BangumiManage(BaseModel):
     enable: bool = Field(True, description="Enable bangumi manage")
@@ -45,15 +60,31 @@ class Proxy(BaseModel):
     type: str = Field("http", description="Proxy type")
     host: str = Field("", description="Proxy host")
     port: int = Field(0, description="Proxy port")
-    username: str = Field("", description="Proxy username")
-    password: str = Field("", description="Proxy password")
+    username_: str = Field("", alias="username", description="Proxy username")
+    password_: str = Field("", alias="password", description="Proxy password")
+
+    @property
+    def username(self):
+        return expandvars(self.username_)
+
+    @property
+    def password(self):
+        return expandvars(self.password_)


 class Notification(BaseModel):
     enable: bool = Field(False, description="Enable notification")
     type: str = Field("telegram", description="Notification type")
-    token: str = Field("", description="Notification token")
-    chat_id: str = Field("", description="Notification chat id")
+    token_: str = Field("", alias="token", description="Notification token")
+    chat_id_: str = Field("", alias="chat_id", description="Notification chat id")
+
+    @property
+    def token(self):
+        return expandvars(self.token_)
+
+    @property
+    def chat_id(self):
+        return expandvars(self.chat_id_)


 class Config(BaseModel):
@@ -64,3 +95,6 @@ class Config(BaseModel):
     log: Log = Log()
     proxy: Proxy = Proxy()
     notification: Notification = Notification()
+
+    def dict(self, *args, by_alias=True, **kwargs):
+        return super().dict(*args, by_alias=by_alias, **kwargs)
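The `username_`/`password_`/`token_` fields with `@property` accessors introduced above let a stored config value reference an environment variable that is expanded on read, so secrets need not live in the config file. `os.path.expandvars` does the substitution; a minimal sketch (the variable name is illustrative):

```python
import os
from os.path import expandvars

# Illustrative variable name; any exported name works.
os.environ["AB_DOWNLOADER_PASSWORD"] = "s3cret"

stored = "${AB_DOWNLOADER_PASSWORD}"  # what the config file may contain
assert expandvars(stored) == "s3cret"

# Plain literal values pass through unchanged.
assert expandvars("adminadmin") == "adminadmin"
```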
@@ -1,14 +1,24 @@
-from pydantic import BaseModel, Field
+from pydantic import BaseModel
+from typing import Optional
+from sqlmodel import SQLModel, Field


-class User(BaseModel):
+class User(SQLModel, table=True):
+    id: int = Field(default=None, primary_key=True)
     username: str = Field(
         "admin", min_length=4, max_length=20, regex=r"^[a-zA-Z0-9_]+$"
     )
     password: str = Field("adminadmin", min_length=8)


-class UserLogin(BaseModel):
+class UserUpdate(SQLModel):
+    username: Optional[str] = Field(
+        None, min_length=4, max_length=20, regex=r"^[a-zA-Z0-9_]+$"
+    )
+    password: Optional[str] = Field(None, min_length=8)
+
+
+class UserLogin(SQLModel):
     username: str
     password: str = Field(..., min_length=8)
@@ -31,8 +31,7 @@ class PostNotification:
     def __init__(self):
         Notifier = getClient(settings.notification.type)
         self.notifier = Notifier(
-            token=settings.notification.token,
-            chat_id=settings.notification.chat_id
+            token=settings.notification.token, chat_id=settings.notification.chat_id
         )

     @staticmethod
@@ -27,15 +27,15 @@ class WecomNotification(RequestContent):
         title = "【番剧更新】" + notify.official_title
         msg = self.gen_message(notify)
         picurl = notify.poster_path
-        #Default pic to avoid blank in message. Resolution:1068*455
+        # Default pic to avoid blank in message. Resolution:1068*455
         if picurl == "https://mikanani.me":
             picurl = "https://article.biliimg.com/bfs/article/d8bcd0408bf32594fd82f27de7d2c685829d1b2e.png"
         data = {
-            "key":self.token,
+            "key": self.token,
             "type": "news",
             "title": title,
             "msg": msg,
-            "picurl":picurl
+            "picurl": picurl,
         }
         resp = self.post_data(self.notification_url, data)
         logger.debug(f"Wecom notification: {resp.status_code}")
@@ -181,6 +181,6 @@ def raw_parser(raw: str) -> Episode | None:
     )


-if __name__ == '__main__':
+if __name__ == "__main__":
     title = "[动漫国字幕组&LoliHouse] THE MARGINAL SERVICE - 08 [WebRip 1080p HEVC-10bit AAC][简繁内封字幕]"
     print(raw_parser(title))
@@ -16,14 +16,13 @@ class TMDBInfo:
     year: str


-LANGUAGE = {
-    "zh": "zh-CN",
-    "jp": "ja-JP",
-    "en": "en-US"
-}
+LANGUAGE = {"zh": "zh-CN", "jp": "ja-JP", "en": "en-US"}


 def search_url(e):
     return f"https://api.themoviedb.org/3/search/tv?api_key={TMDB_API}&page=1&query={e}&include_adult=false"


 def info_url(e, key):
     return f"https://api.themoviedb.org/3/tv/{e}?api_key={TMDB_API}&language={LANGUAGE[key]}"

@@ -68,8 +67,9 @@ def tmdb_parser(title, language) -> TMDBInfo | None:
             {
                 "season": s.get("name"),
                 "air_date": s.get("air_date"),
-                "poster_path": s.get("poster_path")
-            } for s in info_content.get("seasons")
+                "poster_path": s.get("poster_path"),
+            }
+            for s in info_content.get("seasons")
         ]
         last_season = get_season(season)
         original_title = info_content.get("original_name")
@@ -81,7 +81,7 @@ def tmdb_parser(title, language) -> TMDBInfo | None:
             original_title,
             season,
             last_season,
-            str(year_number)
+            str(year_number),
         )
     else:
         return None
@@ -1,6 +1,5 @@
 import logging
-import ntpath as win_path
-import os.path as unix_path
+from pathlib import Path
 import re

 from module.models import EpisodeFile, SubtitleFile
@@ -23,11 +22,16 @@ SUBTITLE_LANG = {
 }


-def split_path(torrent_path: str) -> str:
-    if PLATFORM == "Windows":
-        return win_path.split(torrent_path)[-1]
-    else:
-        return unix_path.split(torrent_path)[-1]
+def get_path_basename(torrent_path: str) -> str:
+    """
+    Returns the basename of a path string.
+
+    :param torrent_path: A string representing a path to a file.
+    :type torrent_path: str
+    :return: A string representing the basename of the given path.
+    :rtype: str
+    """
+    return Path(torrent_path).name


 def get_group(group_and_title) -> tuple[str | None, str]:
@@ -64,7 +68,7 @@ def torrent_parser(
     season: int | None = None,
     file_type: str = "media",
 ) -> EpisodeFile | SubtitleFile:
-    media_path = split_path(torrent_path)
+    media_path = get_path_basename(torrent_path)
     for rule in RULES:
         if torrent_name:
             match_obj = re.match(rule, torrent_name, re.I)
@@ -77,7 +81,7 @@ def torrent_parser(
         else:
             title, _ = get_season_and_title(title)
         episode = int(match_obj.group(2))
-        suffix = unix_path.splitext(torrent_path)[-1]
+        suffix = Path(torrent_path).suffix
         if file_type == "media":
             return EpisodeFile(
                 media_path=torrent_path,
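`Path(...).name` replaces the old platform branch, but note that `pathlib` parses separators according to the host OS: on a POSIX host a backslash is an ordinary character, which is why the repo's Windows-path test is skipped off Windows. The pure-path classes make the intent explicit on any platform; a quick sketch:

```python
from pathlib import PurePosixPath, PureWindowsPath

# Forward-slash paths split the same way everywhere.
assert PurePosixPath("/path/to/file.txt").name == "file.txt"

# Backslash paths only split under Windows rules; on a POSIX host,
# a plain Path would keep the backslashes as part of the name.
assert PureWindowsPath("C:\\path\\to\\file.txt").name == "file.txt"
```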
@@ -1,7 +1,7 @@
 import logging

 from module.conf import settings
-from module.models import BangumiData
+from module.models import Bangumi

 from .analyser import raw_parser, tmdb_parser, torrent_parser

@@ -39,7 +39,7 @@ class TitleParser:
         return official_title, tmdb_season, year

     @staticmethod
-    def raw_parser(raw: str, rss_link: str) -> BangumiData | None:
+    def raw_parser(raw: str, rss_link: str) -> Bangumi | None:
         language = settings.rss_parser.language
         try:
             episode = raw_parser(raw)
@@ -60,7 +60,7 @@ class TitleParser:
             else:
                 official_title = title_raw
             _season = episode.season
-            data = BangumiData(
+            data = Bangumi(
                 official_title=official_title,
                 title_raw=title_raw,
                 season=_season,
@@ -3,7 +3,7 @@ import re

 from module.conf import settings
 from module.database import BangumiDatabase
-from module.models import BangumiData
+from module.models import Bangumi
 from module.network import RequestContent, TorrentInfo
 from module.parser import TitleParser

@@ -16,7 +16,7 @@ class RSSAnalyser:
         with BangumiDatabase() as db:
             db.update_table()

-    def official_title_parser(self, data: BangumiData, mikan_title: str):
+    def official_title_parser(self, data: Bangumi, mikan_title: str):
         if settings.rss_parser.parser_type == "mikan":
             data.official_title = mikan_title if mikan_title else data.official_title
         elif settings.rss_parser.parser_type == "tmdb":
@@ -63,7 +63,7 @@ class RSSAnalyser:

     def torrent_to_data(
         self, torrent: TorrentInfo, rss_link: str | None = None
-    ) -> BangumiData:
+    ) -> Bangumi:
         data = self._title_analyser.raw_parser(raw=torrent.name, rss_link=rss_link)
         if data:
             try:
@@ -79,7 +79,7 @@ class RSSAnalyser:

     def rss_to_data(
         self, rss_link: str, database: BangumiDatabase, full_parse: bool = True
-    ) -> list[BangumiData]:
+    ) -> list[Bangumi]:
         rss_torrents = self.get_rss_torrents(rss_link, full_parse)
         torrents_to_add = database.match_list(rss_torrents, rss_link)
         if not torrents_to_add:
@@ -92,7 +92,7 @@ class RSSAnalyser:
         else:
             return []

-    def link_to_data(self, link: str) -> BangumiData:
+    def link_to_data(self, link: str) -> Bangumi:
         torrents = self.get_rss_torrents(link, False)
         for torrent in torrents:
             data = self.torrent_to_data(torrent, link)
@@ -3,7 +3,7 @@ import logging
 from module.conf import settings
 from module.database import BangumiDatabase
 from module.downloader import DownloadClient
-from module.models import BangumiData
+from module.models import Bangumi
 from module.network import RequestContent

 logger = logging.getLogger(__name__)
@@ -14,7 +14,7 @@ def matched(torrent_title: str):
         return db.match_torrent(torrent_title)


-def save_path(data: BangumiData):
+def save_path(data: Bangumi):
     folder = (
         f"{data.official_title}({data.year})" if data.year else f"{data.official_title}"
     )
@@ -1,4 +1,4 @@
-from module.models import BangumiData, TorrentBase
+from module.models import Bangumi, TorrentBase
 from module.network import RequestContent
 from module.searcher.plugin import search_url

@@ -30,7 +30,7 @@ class SearchTorrent(RequestContent):

         return [TorrentBase(**d) for d in to_dict()]

-    def search_season(self, data: BangumiData):
+    def search_season(self, data: Bangumi):
         keywords = [getattr(data, key) for key in SEARCH_KEY if getattr(data, key)]
         torrents = self.search_torrents(keywords)
         return [torrent for torrent in torrents if data.title_raw in torrent.name]
@@ -1,7 +1,7 @@
 from fastapi import Depends, HTTPException, status
 from fastapi.security import OAuth2PasswordBearer

-from module.database.user import AuthDB
+from module.database.user import UserDatabase
 from module.models.user import User

 from .jwt import verify_token
@@ -20,7 +20,7 @@ async def get_current_user(token: str = Depends(oauth2_scheme)):
             status_code=status.HTTP_401_UNAUTHORIZED, detail="invalid token"
         )
     username = payload.get("sub")
-    with AuthDB() as user_db:
+    with UserDatabase() as user_db:
         user = user_db.get_user(username)
     if not user:
         raise HTTPException(
@@ -40,7 +40,7 @@ async def get_token_data(token: str = Depends(oauth2_scheme)):

 def update_user_info(user_data: User, current_user):
     try:
-        with AuthDB() as db:
+        with UserDatabase() as db:
             db.update_user(current_user.username, user_data)
             return True
     except Exception as e:
@@ -48,5 +48,5 @@ def update_user_info(user_data: User, current_user):


 def auth_user(username, password):
-    with AuthDB() as db:
+    with UserDatabase() as db:
         db.auth_user(username, password)
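One pitfall worth flagging in this hunk: `with UserDatabase as db:` (no parentheses) binds the class object itself, and a class object is not an entered context-manager instance, so the statement fails at runtime. A minimal illustration with a stand-in class (the class name here is hypothetical):

```python
class FakeDatabase:
    """Stand-in for a context-managed DB session."""

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc, tb):
        return False


# Correct: instantiate first, then enter the context.
with FakeDatabase() as db:
    assert isinstance(db, FakeDatabase)

# Missing parentheses: the class itself has no usable __enter__,
# so the with-statement raises (TypeError on 3.11+, AttributeError before).
try:
    with FakeDatabase as db:
        pass
except (TypeError, AttributeError):
    pass
else:
    raise AssertionError("expected the bare class to fail as a context manager")
```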
@@ -1,20 +1,22 @@
 import os

+from module.conf import LEGACY_DATA_PATH
 from module.database import BangumiDatabase
-from module.models import BangumiData
+from module.models import Bangumi
 from module.utils import json_config


 def data_migration():
-    if not os.path.isfile("data/data.json"):
+    if not LEGACY_DATA_PATH.exists():
         return False
-    old_data = json_config.load("data/data.json")
+    old_data = json_config.load(LEGACY_DATA_PATH)
     infos = old_data["bangumi_info"]
     rss_link = old_data["rss_link"]
     new_data = []
     for info in infos:
-        new_data.append(BangumiData(**info, rss_link=[rss_link]))
+        new_data.append(Bangumi(**info, rss_link=[rss_link]))
     with BangumiDatabase() as database:
         database.update_table()
         database.insert_list(new_data)
-    os.remove("data/data.json")
+    LEGACY_DATA_PATH.unlink(missing_ok=True)
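The migration swaps `os.remove` for `Path.unlink(missing_ok=True)`, which makes cleanup idempotent: a repeated run, or a run where the legacy file never existed, no longer raises `FileNotFoundError`. A sketch:

```python
import tempfile
from pathlib import Path

# Throwaway stand-in for the legacy data file.
legacy = Path(tempfile.mkdtemp()) / "data.json"
legacy.write_text("{}")

legacy.unlink(missing_ok=True)  # removes the file
legacy.unlink(missing_ok=True)  # second call is a no-op, not an error
assert not legacy.exists()
```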
@@ -1,11 +1,17 @@
+from sqlmodel import create_engine, SQLModel
+from sqlmodel.pool import StaticPool
+
 from module.database import BangumiDatabase
-from module.models import BangumiData
+from module.models import Bangumi


-def test_database():
-    TEST_PATH = "test/test.db"
-    test_data = BangumiData(
-        id=1,
+def test_bangumi_database():
+    # sqlite mock engine
+    engine = create_engine(
+        "sqlite://", connect_args={"check_same_thread": False}, poolclass=StaticPool
+    )
+    SQLModel.metadata.create_all(engine)
+    test_data = Bangumi(
         official_title="test",
         year="2021",
         title_raw="test",
@@ -17,18 +23,15 @@ def test_database():
         subtitle="test",
         eps_collect=False,
         offset=0,
-        filter=["720p", "\\d+-\\d+"],
-        rss_link=["test"],
+        filter="720p,\\d+-\\d+",
+        rss_link="test",
         poster_link="/test/test.jpg",
         added=False,
         rule_name=None,
         save_path=None,
         deleted=False,
     )
-    with BangumiDatabase(database=TEST_PATH) as database:
-        # create table
-        database.update_table()
-    with BangumiDatabase(database=TEST_PATH) as database:
+    with BangumiDatabase(engine) as database:
         # insert
         database.insert_one(test_data)
         assert database.search_id(1) == test_data
@@ -39,7 +42,7 @@ def test_database():
         assert database.search_id(1) == test_data

         # search poster
-        assert database.match_poster("test") == "/test/test.jpg"
+        assert database.match_poster("test2 (2021)") == "/test/test.jpg"

         # delete
         database.delete_one(1)
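The rewritten test replaces the on-disk file with an in-memory SQLite engine pinned to a single connection via `StaticPool`, because an in-memory database disappears as soon as its connection closes. The underlying idea can be sketched with the stdlib driver alone:

```python
import sqlite3

# ":memory:" gives a private, throwaway database that lives only as long
# as this one connection -- hence the single shared pool in the test.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE bangumi (id INTEGER PRIMARY KEY, official_title TEXT)"
)
conn.execute("INSERT INTO bangumi (official_title) VALUES ('test')")
row = conn.execute("SELECT id, official_title FROM bangumi").fetchone()
assert row == (1, "test")
conn.close()
```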
@@ -1,4 +1,8 @@
+import sys
+
+import pytest
 from module.parser.analyser import torrent_parser
+from module.parser.analyser.torrent_parser import get_path_basename


 def test_torrent_parser():
@@ -67,3 +71,18 @@ def test_torrent_parser():
     assert bf.title == "放学后失眠的你-Kimi wa Houkago Insomnia"
     assert bf.season == 1
     assert bf.episode == 6
+
+
+class TestGetPathBasename:
+    def test_regular_path(self):
+        assert get_path_basename("/path/to/file.txt") == "file.txt"
+
+    def test_empty_path(self):
+        assert get_path_basename("") == ""
+
+    def test_path_with_trailing_slash(self):
+        assert get_path_basename("/path/to/folder/") == "folder"
+
+    @pytest.mark.skipif(not sys.platform.startswith("win"), reason="Windows specific")
+    def test_windows_path(self):
+        assert get_path_basename("C:\\path\\to\\file.txt") == "file.txt"
docker/etc/s6-overlay/s6-rc.d/init-fixuser/run (new file)
@@ -0,0 +1,7 @@
+#!/usr/bin/with-contenv bash
+# shellcheck shell=bash
+
+groupmod -o -g "${PGID}" ab
+usermod -o -u "${PUID}" ab
+
+chown ab:ab -R /app /ab

docker/etc/s6-overlay/s6-rc.d/init-fixuser/type (new file)
@@ -0,0 +1 @@
+oneshot

docker/etc/s6-overlay/s6-rc.d/init-fixuser/up (new file)
@@ -0,0 +1 @@
+/etc/s6-overlay/s6-rc.d/init-fixuser/run

docker/etc/s6-overlay/s6-rc.d/init-old-compatible/run (new file)
@@ -0,0 +1,8 @@
+#!/usr/bin/with-contenv bash
+# shellcheck shell=bash
+
+umask ${UMASK}
+
+if [ -f /config/bangumi.json ]; then
+    mv /config/bangumi.json /app/data/bangumi.json
+fi

docker/etc/s6-overlay/s6-rc.d/init-old-compatible/type (new file)
@@ -0,0 +1 @@
+oneshot

docker/etc/s6-overlay/s6-rc.d/init-old-compatible/up (new file)
@@ -0,0 +1 @@
+/etc/s6-overlay/s6-rc.d/init-old-compatible/run

docker/etc/s6-overlay/s6-rc.d/svc-autobangumi/type (new file)
@@ -0,0 +1 @@
+longrun
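The `init-fixuser` oneshot above consumes `PUID`/`PGID` (and `init-old-compatible` reads `UMASK`) from the container environment. A hypothetical invocation showing how those values reach the scripts (image tag and values are illustrative, not prescribed by the diff):

```shell
# Illustrative only: map the container's "ab" user to the invoking host user.
docker run -d \
  -e PUID="$(id -u)" \
  -e PGID="$(id -g)" \
  -e UMASK=022 \
  estrellaxd/auto_bangumi:latest
```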
docs/.vitepress/config.ts (new file)
@@ -0,0 +1,181 @@
+import { defineConfig } from "vitepress";
+import { inject } from '@vercel/analytics';
+
+inject();
+
+const version = `v3.0`
+
+// https://vitepress.dev/reference/site-config
+// @ts-ignore
+export default defineConfig({
+  title: "AutoBangumi",
+  description: "从 Mikan Project 全自动追番下载整理",
+
+  head: [
+    ['link', { rel: 'icon', type: 'image/svg+xml', href: '/light-logo.svg' }],
+    ['meta', { property: 'og:image', content: '/social.png' }],
+    ['meta', { property: 'og:site_name', content: 'AutoBangumi' }],
+    ['meta', { property: 'og:url', content: 'https://www.autobangumi.org' }],
+    ["script", { src: '/_vercel/insights/script.js' }]
+  ],
+
+  themeConfig: {
+    // https://vitepress.dev/reference/default-theme-config
+    logo: {
+      dark: '/dark-logo.svg',
+      light: '/light-logo.svg',
+    },
+
+    editLink: {
+      pattern: 'https://github.com/EstrellaXD/Auto_Bangumi/blob/3.1-dev/docs/:path',
+      text: 'Edit this page',
+    },
+
+    search: {
+      provider: 'local'
+    },
+
+    socialLinks: [
+      { icon: "github", link: "https://github.com/EstrellaXD/Auto_Bangumi" },
+      {
+        icon: {
+          svg: '<svg xmlns="http://www.w3.org/2000/svg" role="img" viewBox="0 0 24 24"><title>Telegram</title><path d="M11.944 0A12 12 0 0 0 0 12a12 12 0 0 0 12 12 12 12 0 0 0 12-12A12 12 0 0 0 12 0a12 12 0 0 0-.056 0zm4.962 7.224c.1-.002.321.023.465.14a.506.506 0 0 1 .171.325c.016.093.036.306.02.472-.18 1.898-.962 6.502-1.36 8.627-.168.9-.499 1.201-.82 1.23-.696.065-1.225-.46-1.9-.902-1.056-.693-1.653-1.124-2.678-1.8-1.185-.78-.417-1.21.258-1.91.177-.184 3.247-2.977 3.307-3.23.007-.032.014-.15-.056-.212s-.174-.041-.249-.024c-.106.024-1.793 1.14-5.061 3.345-.48.33-.913.49-1.302.48-.428-.008-1.252-.241-1.865-.44-.752-.245-1.349-.374-1.297-.789.027-.216.325-.437.893-.663 3.498-1.524 5.83-2.529 6.998-3.014 3.332-1.386 4.025-1.627 4.476-1.635z"/></svg>'
+        },
+        link: "https://t.me/autobangumi"
+      },
+    ],
+
+    nav: [
+      { text: "项目说明", link: "/home/" },
+      { text: "快速开始", link: "/deploy/quick-start" },
+      { text: "排错流程", link: "/faq/排错流程" },
+      { text: "常见问题", link: "/faq/常见问题" },
+    ],
+
+    footer: {
+      message: `AutoBangumi Released under the MIT License. (latest: ${version})`,
+      copyright: 'Copyright © 2021-present @EstrellaXD & AutoBangumi Contributors',
+    },
+
+    sidebar: [
+      {
+        items: [
+          {
+            text: "项目说明",
+            link: "/home/",
+          },
+          {
+            text: "快速开始",
+            link: "/deploy/quick-start",
+          },
+        ],
+      },
+      {
+        text: "部署",
+        items: [
+          {
+            text: "Docker-cli 部署",
+            link: "/deploy/docker-cli",
+          },
+          {
+            text: "Docker-Compose 部署",
+            link: "/deploy/docker-compose",
+          },
+          {
+            text: "群晖NAS",
+            link: "/deploy/dsm",
+          },
+          {
+            text: "WSL",
+            link: "/deploy/wsl",
+          },
+        ],
+      },
+      {
+        text: "源码运行",
+        items: [
+          {
+            text: "Windows 本地部署",
+            link: "/deploy/windows",
+          },
+          {
+            text: "Unix 本地部署",
+            link: "/deploy/unix",
+          },
+        ],
+      },
+      {
+        text: "配置说明",
+        items: [
+          {
+            text: "获取 RSS 订阅链接",
+            link: "/config/rss",
+          },
+          {
+            text: "主程序配置",
+            link: "/config/program",
+          },
+          {
+            text: "下载器配置",
+            link: "/config/downloader",
+          },
+          {
+            text: "解析器配置",
+            link: "/config/parser",
+          },
+          {
+            text: "推送器配置",
+            link: "/config/notifier",
+          },
+          {
+            text: "代理配置",
+            link: "/config/proxy",
+          }
+        ]
+      },
+      {
+        text: "WebUI 使用说明",
+        link: "/usage/webui",
+      },
+      {
+        text: "FAQ",
+        items: [
+          {
+            text: "排错流程",
+            link: "/faq/排错流程",
+          },
+          {
+            text: "常见问题",
+            link: "/faq/常见问题",
+          },
+          {
+            text: "网络问题",
+            link: "/faq/mikan-network",
+          }
+        ],
+      },
+      {
+        text: "更新日志",
+        items: [
+          {
+            text: "3.0 更新说明",
+            link: "/changelog/3.0",
+          },
+          {
+            text: "2.6 更新说明",
+            link: "/changelog/2.6",
+          },
+        ],
+      },
+      {
+        text: "开发者文档",
+        items: [
+          {
+            text: "贡献指南",
+            link: "/dev/",
+          },
+        ]
+      }
+    ],
+  },
+});
docs/.vitepress/theme/components/HomePreviewWebUI.vue (new file)
@@ -0,0 +1,50 @@
+<script setup lang="ts">
+</script>
+
+<template>
+  <div class="container">
+    <img
+      src="/image/preview/window.png"
+      alt="AutoBangumi WebUI Preview"
+      class="webui-preview"
+      data-zoomable
+    />
+  </div>
+</template>
+
+<style scoped>
+
+.container {
+  display: flex;
+  justify-content: center;
+  margin: 0 auto;
+  padding-inline: 24px;
+  padding-block: 60px 20px;
+  /**
+   * same as VPHero.vue
+   * https://github.com/vuejs/vitepress/blob/v1.0.0-beta.5/src/client/theme-default/components/VPHero.vue#L83
+   */
+  max-width: 1280px;
+}
+
+@media (min-width: 640px) {
+  .container {
+    padding-inline: 48px;
+  }
+}
+
+@media (min-width: 960px) {
+  .container {
+    padding-inline: 64px;
+  }
+}
+
+.webui-preview {
+  width: 100%;
+  height: auto;
+  border-radius: 10px;
+
+  box-shadow: none;
+}
+
+</style>
docs/.vitepress/theme/index.ts (new file)
@@ -0,0 +1,45 @@
+// https://vitepress.dev/guide/custom-theme
+import {
+  h,
+  onMounted,
+  watch,
+  nextTick,
+} from 'vue'
+import Theme from 'vitepress/theme'
+import { useRoute } from 'vitepress'
+import mediumZoom from 'medium-zoom'
+import HomePreviewWebUI from './components/HomePreviewWebUI.vue'
+
+import './style.css'
+
+export default {
+  extends: Theme,
+  Layout: () => {
+    return h(Theme.Layout, null, {
+      // https://vitepress.dev/guide/extending-default-theme#layout-slots
+      'home-features-after': () => h(HomePreviewWebUI),
+    })
+  },
+  setup() {
+    const route = useRoute()
+    const initZoom = () => {
+      /**
+       * Allow images to be zoomed in on click
+       * https://github.com/vuejs/vitepress/issues/854
+       */
+      mediumZoom('[data-zoomable]', { background: 'var(--vp-c-bg)' })
+    }
+
+    onMounted(() => {
+      initZoom()
+    })
+
+    watch(
+      () => route.path,
+      () => nextTick(initZoom),
+    )
+  },
+  enhanceApp({ app, router, siteData }) {
+    // ...
+  }
+}
110
docs/.vitepress/theme/style.css
Normal file
@@ -0,0 +1,110 @@
/**
 * Customize default theme styling by overriding CSS variables:
 * https://github.com/vuejs/vitepress/blob/main/src/client/theme-default/styles/vars.css
 */

/**
 * Colors
 * -------------------------------------------------------------------------- */

:root {
  --vp-c-brand: #7B65D6;
  --vp-c-brand-light: #7162AE;
  --vp-c-brand-lighter: #8D7FC2;
  --vp-c-brand-lightest: #8E8A9C;
  --vp-c-brand-dark: #4E3C94;
  --vp-c-brand-darker: #281E52;
  --vp-c-brand-dimm: rgba(100, 108, 255, 0.08);
}

/**
 * Component: Button
 * -------------------------------------------------------------------------- */

:root {
  --vp-button-brand-border: var(--vp-c-brand-light);
  --vp-button-brand-text: var(--vp-c-white);
  --vp-button-brand-bg: var(--vp-c-brand);
  --vp-button-brand-hover-border: var(--vp-c-brand-light);
  --vp-button-brand-hover-text: var(--vp-c-white);
  --vp-button-brand-hover-bg: var(--vp-c-brand-light);
  --vp-button-brand-active-border: var(--vp-c-brand-light);
  --vp-button-brand-active-text: var(--vp-c-white);
  --vp-button-brand-active-bg: var(--vp-button-brand-bg);
}

/**
 * Component: Home
 * -------------------------------------------------------------------------- */

:root {
  --vp-home-hero-name-color: transparent;
  --vp-home-hero-name-background: -webkit-linear-gradient(
    120deg,
    #b42ff1 30%,
    #441bd9
  );

  --vp-home-hero-image-background-image: linear-gradient(
    -45deg,
    #b42ff1bb 50%,
    #4794ffbb 50%
  );
  --vp-home-hero-image-filter: blur(40px);
}

@media (min-width: 640px) {
  :root {
    --vp-home-hero-image-filter: blur(56px);
  }
}

@media (min-width: 960px) {
  :root {
    --vp-home-hero-image-filter: blur(72px);
  }
}

/**
 * Component: Custom Block
 * -------------------------------------------------------------------------- */

:root {
  --vp-custom-block-tip-border: var(--vp-c-brand);
  --vp-custom-block-tip-text: var(--vp-c-brand-darker);
  --vp-custom-block-tip-bg: var(--vp-c-brand-dimm);
}

.dark {
  --vp-custom-block-tip-border: var(--vp-c-brand);
  --vp-custom-block-tip-text: var(--vp-c-brand-lightest);
  --vp-custom-block-tip-bg: var(--vp-c-brand-dimm);
}

/**
 * Component: Algolia
 * -------------------------------------------------------------------------- */

.DocSearch {
  --docsearch-primary-color: var(--vp-c-brand) !important;
}

/**
 * Component: medium-zoom
 * -------------------------------------------------------------------------- */

.medium-zoom--opened .medium-zoom-overlay {
  z-index: 20;
}

.medium-zoom--opened .medium-zoom-image {
  z-index: 21;
}

.vp-doc .ab-shadow-card {
  box-shadow: 0 10px 30px -10px rgba(0,0,0,0.2),
              0 0 2px rgba(0,0,0,0.2),
              0 20px 30px -20px rgba(0,0,0,0.4);
  border-radius: 10px;
}
139
docs/changelog/2.6.md
Normal file
@@ -0,0 +1,139 @@
# 2.6 Release Notes

## Notes Before Upgrading from an Older Version

Starting with version 2.6, the AutoBangumi (AB) configuration moves from environment variables to `config.json`. Please note the following before upgrading.

### Environment Variable Migration

On the first start after upgrading to 2.6, the old environment variables are automatically converted into `config.json`, which is written to the `/app/config` folder.
Once you have mapped the `/app/config` folder, the old environment variables no longer affect AB at runtime, but you can delete `config.json` to regenerate the configuration from the environment variables.

### Container Volume Mapping

From version 2.6 on, the folders that need to be mapped are:

- `/app/config`: the configuration folder, containing `config.json`
- `/app/data`: the data folder, containing `bangumi.json` and other files

### Data Files

This release is a major update, so we do not recommend reusing old data files. AB automatically generates a new data file, `bangumi.json`, in `/app/data`.

Don't worry: qBittorrent will not re-download series it has already downloaded.

### Changing the Configuration Later

AB can now be configured directly in the WebUI; after editing, simply restart the container for the changes to take effect.

## How to Upgrade

### Docker Compose

You can upgrade with your old docker-compose.yml file.

```bash
docker compose stop autobangumi
docker compose pull autobangumi
```

Then edit docker-compose.yml to add the `volumes` mapping.

```yaml
version: "3.8"

services:
  autobangumi:
    image: estrellaxd/auto_bangumi:latest
    container_name: autobangumi
    restart: unless-stopped
    environment:
      - PUID=1000
      - PGID=1000
      - TZ=Asia/Shanghai
    volumes:
      - /path/to/config:/app/config
      - /path/to/data:/app/data
    networks:
      - bridge
    dns:
      - 223.5.5.5
```

Then bring AB back up:

```bash
docker compose up -d autobangumi
```

### Portainer

In `Portainer`, update the volume mappings and click `Recreate` to complete the upgrade.

### What to Do If the Upgrade Fails

Since everyone's configuration differs, the upgrade may leave the program unable to run. In that case, delete all old data and generated configuration files, restart the container, and reconfigure AB in the WebUI.

## New Features

### New Configuration Method

From 2.6 on, the program is configured by editing `config.json` instead of Docker environment variables.
The new WebUI also lets you edit the configuration in the browser: open the AB address and choose `Config` in the left sidebar.
Restart the container after saving your changes.

### Custom Reverse-Proxy URL and AB as a Reverse-Proxy Relay

To cope with [Mikan Project](https://mikanani.me) being unreachable, AB offers three workarounds.

1. HTTP and SOCKS proxies

   Older versions of AB already supported this; after upgrading to 2.6, just verify the proxy settings in the WebUI to access Mikan Project normally.

   However, qBittorrent still cannot reach Mikan Project's RSS and torrent URLs this way, so you also need to configure a proxy in qBittorrent. See #198 for details.

2. Custom reverse-proxy URL

   Version 2.6 adds a `custom_url` option so you can point AB at your own reverse-proxy URL. Set it to a correctly configured reverse proxy, and AB will use that URL to reach Mikan Project; qBittorrent will also download normally.

3. AB as a reverse-proxy relay

   Once AB itself is behind a proxy, it can act as a local reverse-proxy relay, currently for RSS-related endpoints only. Set `custom_url` to `http://abhost:abport`, where `abhost` is AB's IP address and `abport` is AB's port. AB then pushes its own address to qBittorrent, which uses AB as a reverse proxy to reach Mikan Project.

   Note: if you have not put AB itself behind a reverse proxy such as NGINX, include the `http://` prefix so the program works correctly.

**Caveats**

If AB and qBittorrent run in containers on the same host, do not use `127.0.0.1` or `localhost`, as AB and qBittorrent would then be unable to reach each other.
If they share a network, address them by container name, e.g. `http://autobangumi:7892`.

You can also use the Docker gateway address, e.g. `http://172.17.0.1:7892`.

If they run on different hosts, use the host's IP address.

### Collections and In-Folder Renaming

AB can now rename collections and files inside folders, moving the media files in a folder back to the root directory.
Note that AB still relies on the save path to determine season and episode information, so please place collection files according to AB's standard layout.

Since **2.6.4**, AB can also rename subtitles inside folders, though this feature is still being refined. Collections and subtitles are renamed in the `pn` format by default; this is not yet configurable.

**Standard path**

```
/downloads/Bangumi/<series name>/Season 1/xxx
```

### Notification Push

AB can now push a notification via `Telegram` and `ServerChan` when renaming completes.

In the WebUI, turn on the notification switch and fill in the required parameters.

- Telegram requires a Bot Token and Chat ID; see the various tutorials for how to obtain them.
- ServerChan requires a Token; see the various tutorials for how to obtain it.
@@ -1,4 +1,4 @@
## New
# 3.0 Release Notes

### Brand-New WebUI

55
docs/config/downloader.md
Normal file
@@ -0,0 +1,55 @@
# Downloader Settings

## WebUI Settings

{width=500}{class=ab-shadow-card}

<br/>

- **Downloader Type** is the downloader type; currently only qBittorrent is supported and this cannot be changed yet.
- **Host** is the downloader address. [1](#downloader-address)
- **Download path** is the downloader's mapped download path. [2](#download-path-issues)
- **SSL** sets whether the downloader uses SSL.

## FAQ

### Downloader Address

⚠️ **Do not use 127.0.0.1 or localhost directly as the downloader address.**

In the official tutorial, AB runs in Docker in **bridge** mode, so 127.0.0.1 or localhost resolves to AB itself, not to the downloader.

- If your qBittorrent also runs in Docker, we recommend the Docker **gateway address: 172.17.0.1**.
- If your qBittorrent runs on the host, use the host's IP address.

If you run AB in **host** mode, you can use 127.0.0.1 instead of the Docker gateway address.

⚠️ Macvlan isolates the container's network; without extra bridge configuration you cannot reach other containers on the same host, or the host itself.

### Download Path Issues

The path configured in AB is only used to generate the corresponding bangumi file paths; AB does not manage the files under that path directly.

What should **Download path** be set to?

It just needs to match the setting in your **downloader**.

- Docker: if qBittorrent uses `/downloads`, write `/downloads/Bangumi`; `Bangumi` can be any name.
- Linux/macOS: for `/home/usr/downloads` or `/User/UserName/Downloads`, just append `Bangumi` at the end.
- Windows: `D:\Media\` becomes `D:\Media\Bangumi`.

## Options in `config.json`

The corresponding options in the configuration file:

Configuration section: `downloader`

| Option   | Description                     | Type    | WebUI field              | Default            |
|----------|---------------------------------|---------|--------------------------|--------------------|
| type     | Downloader type                 | string  | Downloader type          | qbittorrent        |
| host     | Downloader address              | string  | Downloader address       | 172.17.0.1:8080    |
| username | Downloader username             | string  | Downloader username      | admin              |
| password | Downloader password             | string  | Downloader password      | adminadmin         |
| path     | Downloader download path        | string  | Downloader download path | /downloads/Bangumi |
| ssl      | Whether the downloader uses SSL | boolean | Downloader SSL           | false              |
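Putting the defaults from the table together, a `downloader` section of `config.json` would look like the sketch below (assuming the section name is used as a top-level key; all values are the documented defaults and should be replaced with your own):

```json
{
  "downloader": {
    "type": "qbittorrent",
    "host": "172.17.0.1:8080",
    "username": "admin",
    "password": "adminadmin",
    "path": "/downloads/Bangumi",
    "ssl": false
  }
}
```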
34
docs/config/notifier.md
Normal file
@@ -0,0 +1,34 @@
# Notification Settings

## WebUI Settings

{width=500}{class=ab-shadow-card}

<br/>

- **Enable** toggles notifications; if disabled, the options below have no effect.
- **Type** is the notification type; currently supported:
  - Telegram
  - Wecom
  - Bark
  - ServerChan
- **Chat ID** is only required for `telegram` notifications. [How to get a Telegram Bot Chat ID][1]
- **Wecom**: put the URL of your self-hosted push endpoint in the chat_id field, and add the [news message][2] type on the server side. [Wecomchan setup guide][3]

## Options in `config.json`

The corresponding options in the configuration file:

Configuration section: `notification`

| Option  | Description          | Type    | WebUI field          | Default  |
|---------|----------------------|---------|----------------------|----------|
| enable  | Enable notifications | boolean | Notification         | false    |
| type    | Notification type    | string  | Notification type    | telegram |
| token   | Notification token   | string  | Notification token   |          |
| chat_id | Notification chat ID | string  | Notification chat ID |          |

[1]: https://core.telegram.org/bots#6-botfather
[2]: https://github.com/umbors/wecomchan-alifun
[3]: https://github.com/easychen/wecomchan
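For reference, a `notification` section assembled from the table might look like this sketch (the `token` and `chat_id` values are placeholders, since the table documents no defaults for them):

```json
{
  "notification": {
    "enable": false,
    "type": "telegram",
    "token": "<your-bot-token>",
    "chat_id": "<your-chat-id>"
  }
}
```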
41
docs/config/parser.md
Normal file
@@ -0,0 +1,41 @@
# Parser Settings

AB's parser processes the aggregated RSS feed; when new entries appear, AB parses their titles and generates automatic download rules.

## Parser Settings in the WebUI

{width=500}{class=ab-shadow-card}

<br/>

- **Enable**: whether the RSS parser is enabled.
- **Source** is the RSS parser type; currently `mikan` is supported.
- **Token** is your Mikan Project token. [How to get a Mikan Project token][rss_token]
- **Custom_url** is a custom Mikan Project address. [Custom reverse-proxy address][reproxy]
- **Language** is the parser language; currently `zh`, `jp`, and `en` are supported.
- **Parser_type** is the parser type used for **official titles**; supported types:
  - parser: regular-expression parser that parses titles with regex.
  - mikan: Mikan Project parser that resolves titles via Mikan Project.
  - tmdb: TMDB parser that resolves titles via TMDB.
- **Exclude** is the global RSS filter; enter strings or regular expressions, and AB drops matching entries while parsing the RSS feed.

## Options in `config.json`

The corresponding options in the configuration file:

Configuration section: `rss_parser`

| Option      | Description            | Type    | WebUI field            | Default       |
|-------------|------------------------|---------|------------------------|---------------|
| enable      | Enable the RSS parser  | boolean | RSS parser enabled     | true          |
| type        | RSS parser type        | string  | RSS parser type        | mikan         |
| token       | RSS parser token       | string  | RSS parser token       | token         |
| custom_url  | RSS parser custom URL  | string  | RSS parser custom URL  | mikanime.tv   |
| parser_type | RSS parser parse type  | string  | RSS parser parse type  | parser        |
| filter      | RSS parser filter      | array   | Filter                 | [720,\d+-\d+] |
| language    | RSS parser language    | string  | RSS parser language    | zh            |

[rss_token]: rss
[reproxy]: proxy#reverse-proxy-settings
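Combining the defaults above, an `rss_parser` section might look like this sketch (note that `filter` is an array, so the table's default is written as two entries with JSON-escaped backslashes; replace `token` with your own Mikan token):

```json
{
  "rss_parser": {
    "enable": true,
    "type": "mikan",
    "token": "token",
    "custom_url": "mikanime.tv",
    "parser_type": "parser",
    "filter": ["720", "\\d+-\\d+"],
    "language": "zh"
  }
}
```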
27
docs/config/program.md
Normal file
@@ -0,0 +1,27 @@
# Main Program Settings

## WebUI Settings

{width=500}{class=ab-shadow-card}

<br/>

- Interval Time parameters are in seconds; if you want minutes, convert to seconds yourself.
- RSS is the RSS check interval, which controls how often automatic download rules are generated.
- Rename is the rename check interval; change this parameter to adjust it.
- WebUI Port is the port; note that with a Docker deployment you must remap the port in Docker after changing it.

## Options in `config.json`

The corresponding options in the configuration file:

Configuration section: `program`

| Option      | Description           | Type                | Web
UI field           | Default |
|-------------|-----------------------|---------------------|-----------------------|---------|
| rss_time    | RSS check interval    | integer, in seconds | RSS check interval    | 7200    |
| rename_time | Rename check interval | integer, in seconds | Rename check interval | 60      |
| webui_port  | WebUI port            | integer             | WebUI port            | 7892    |
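Assembled from the defaults above, a `program` section would look like this sketch (both intervals in seconds):

```json
{
  "program": {
    "rss_time": 7200,
    "rename_time": 60,
    "webui_port": 7892
  }
}
```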
54
docs/config/proxy.md
Normal file
@@ -0,0 +1,54 @@
# Proxy and Reverse Proxy

## Proxy

{width=500}{class=ab-shadow-card}

<br/>

AB supports HTTP and SOCKS5 proxies; a proxy can work around certain network problems.

- **Enable**: whether the proxy is enabled.
- **Type** is the proxy type.
- **Host** is the proxy address.
- **Port** is the proxy port.

Note that HTTP mode does not support username/password authentication; if your proxy requires credentials, use **SOCKS5** mode.

## Reverse-Proxy Settings

To cope with [Mikan Project](https://mikanani.me) being unreachable, AB offers three workarounds.

1. HTTP and SOCKS proxies

   Older versions of AB already supported this; after upgrading to 2.6, just verify the proxy settings in the WebUI to access Mikan Project normally.

   However, qBittorrent still cannot reach Mikan Project's RSS and torrent URLs this way, so you also need to configure a proxy in qBittorrent. See: [What to do when Mikan is blocked](../faq/mikan-network.md)

2. Custom reverse-proxy URL

   Version 2.6 adds a `custom_url` option so you can point AB at your own reverse-proxy URL. Set it to a correctly configured reverse proxy, and AB will use that URL to reach Mikan Project; qBittorrent will also download normally.

3. AB as a reverse-proxy relay

   Once AB itself is behind a proxy, it can act as a local reverse-proxy relay, currently for RSS-related endpoints only. Set `custom_url` to `http://abhost:abport`, where `abhost` is AB's IP address and `abport` is AB's port. AB then pushes its own address to qBittorrent, which uses AB as a reverse proxy to reach Mikan Project.

   Note: if you have not put AB itself behind a reverse proxy such as NGINX, include the `http://` prefix so the program works correctly.

## Options in `config.json`

The corresponding options in the configuration file:

Configuration section: `proxy`

| Option   | Description    | Type    | WebUI field    | Default |
|----------|----------------|---------|----------------|---------|
| enable   | Enable proxy   | boolean | Proxy          | false   |
| type     | Proxy type     | string  | Proxy type     | http    |
| host     | Proxy address  | string  | Proxy address  |         |
| port     | Proxy port     | integer | Proxy port     |         |
| username | Proxy username | string  | Proxy username |         |
| password | Proxy password | string  | Proxy password |         |
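For reference, a `proxy` section built from the table might look like this sketch (host, port, and credentials are illustrative placeholders, since the table documents no defaults for them):

```json
{
  "proxy": {
    "enable": false,
    "type": "http",
    "host": "<proxy-host>",
    "port": 1080,
    "username": "",
    "password": ""
  }
}
```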
40
docs/config/rss.md
Normal file
@@ -0,0 +1,40 @@
# Preparing a Mikan RSS Subscription Link

AutoBangumi automatically parses RSS subscription feeds from [Mikan Project][mikan-site], generating download rules from the subtitle group and series name to automate your seasonal downloads.
Before you start, prepare your [Mikan Project][mikan-site] RSS subscription URL.

Note that the main Mikan Project site is currently blocked; if you don't know how to use a proxy, subscribe via this mirror instead:

[Mikan Project CN][mikan-cn-site]

## Getting the Subscription URL

This project works by parsing the RSS feed provided by Mikan Project, so to automate downloads you need to register and obtain a Mikan Project RSS URL:

{data-zoomable}

The resulting RSS URL looks like this:

```txt
https://mikanani.me/RSS/MyBangumi?token=xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
# or
https://mikanime.tv/RSS/MyBangumi?token=xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
```

The `token` to enter in AB is the string after `token=` in the URL above.

## Mikan Project Subscription Tips

Since AutoBangumi parses every entry in the RSS feed, keep the following in mind when subscribing:

{data-zoomable}

- Enable advanced settings in your personal settings.
- Subscribe to only one subtitle group per series: click the series poster on Mikan Project to open the secondary menu and pick a single group.
- If a group releases both Simplified and Traditional Chinese subtitles, Mikan Project usually lets you subscribe to one; pick one.
- If no Simplified/Traditional choice is offered, set a `filter` in AutoBangumi, or filter manually in qBittorrent after the rule is generated.
- OVA and theatrical releases are not yet supported by the parser.

[mikan-site]: https://mikanani.me/
[mikan-cn-site]: https://mikanime.tv/
45
docs/deploy/docker-cli.md
Normal file
@@ -0,0 +1,45 @@
# Deploying with the Docker CLI

## New-Version Notice

Since version 2.6, AutoBangumi can be configured directly in the WebUI, so you can simply start the container and configure it there. Configuration parameters from old environment variables are migrated automatically; configuring via environment variables still works, but only takes effect on first start.

## Create the Data and Config Folders

To keep data and configuration across updates, persist them with a Docker volume or a bind mount.

```shell
# Using Docker volumes
docker volume create AutoBangumi_config
docker volume create AutoBangumi_data

# Using a bind mount
mkdir "AutoBangumi"
cd "AutoBangumi"
mkdir -p $PWD/config
mkdir -p $PWD/data
```

## Deploy AutoBangumi with the Docker CLI

Copy and run the following command.

```shell
docker run -d \
  --name=AutoBangumi \
  -v AutoBangumi_config:/app/config \
  -v AutoBangumi_data:/app/data \
  -p 7892:7892 \
  --network=bridge \
  --dns=8.8.8.8 \
  --restart unless-stopped \
  estrellaxd/auto_bangumi:latest
```

If you use a bind mount, substitute your own paths.

AB's WebUI starts automatically, but the main program stays paused; open `http://abhost:7892` to configure it.

AB then writes the environment variables into `config.json` and starts automatically.

For more advanced deployments, we recommend a Docker manager with a UI, such as _[Portainer](https://www.portainer.io)_.
95
docs/deploy/docker-compose.md
Normal file
@@ -0,0 +1,95 @@
# Deploying AutoBangumi with Docker Compose

**AutoBangumi** can now be deployed in one step with a `docker-compose.yml` file.

## Install Docker Compose

Docker normally ships with `docker-compose`; check the version with:

```bash
docker compose -v
```

If it is not installed, install it with:

```bash
sudo apt-get update
sudo apt-get install docker-compose-plugin
```

## Deploy **AutoBangumi**

### Create the AutoBangumi Folder

```bash
mkdir AutoBangumi
cd AutoBangumi
```

### Option 1: Write Your Own Docker Compose File

```yaml
version: "3.8"

services:
  AutoBangumi:
    image: estrellaxd/auto_bangumi:latest
    container_name: AutoBangumi
    volumes:
      - ./config:/app/config
      - ./data:/app/data
    ports:
      - "7892:7892"
    restart: unless-stopped
    dns:
      - 223.5.5.5

volumes:
  AutoBangumi_config:
    name: AutoBangumi_config
  AutoBangumi_data:
    name: AutoBangumi_data
```

Copy the content above into a `docker-compose.yml` file.

### Option 2: Download a Docker Compose File

If you don't want to write `docker-compose.yml` yourself, the project provides three variants:

- **AutoBangumi** only
  ```bash
  wget https://raw.githubusercontent.com/EstrellaXD/Auto_Bangumi/main/docs/docker-compose/AutoBangumi/docker-compose.yml
  ```
- **qBittorrent** with **AutoBangumi**
  ```bash
  wget https://raw.githubusercontent.com/EstrellaXD/Auto_Bangumi/main/docs/docker-compose/qBittorrent+AutoBangumi/docker-compose.yml
  ```
- **qBittorrent** + **AutoBangumi** + **Plex**
  ```bash
  wget https://raw.githubusercontent.com/EstrellaXD/Auto_Bangumi/main/docs/docker-compose/All-in-one/docker-compose.yml
  ```

Pick the variant you want and **run the corresponding command above** to download the `docker-compose.yml` file; edit the parameters in a text editor if you need to customize it.

### Define Environment Variables

If you use the AB+QB or AB+QB+Plex compose file downloaded above, define the following environment variable:

```shell
export \
QB_PORT=<YOUR_PORT>
```

- `QB_PORT`: the port of your existing qBittorrent deployment, or the port you want to use, e.g. `8080`

### Bring Up Docker Compose

```bash
# If you set the environment variables above, bring it up like this
docker compose up -d
```

44
docs/deploy/dsm.md
Normal file
@@ -0,0 +1,44 @@
# Synology (DSM 7.2) Deployment Notes (same for QNAP)

DSM 7.2 supports Docker Compose, so we recommend deploying this project with Docker Compose in one step.

## Create the Config and Data Folders

## Install the Container Manager (Docker) Package

Open the Package Center and install the Container Manager (Docker) package.

{data-zoomable}

## Install and Configure AB via Docker Compose

Click **Project**, then **Create**, and choose **Docker Compose**.

{data-zoomable}

Copy the following into the **Docker Compose** field.

```yaml
version: "3.8"

services:
  ab:
    image: "ghcr.io/estrellaxd/auto_bangumi:latest"
    container_name: "auto_bangumi"
    restart: unless-stopped
    ports:
      - "7892:7892"
    volumes:
      - "/volume1/docker/ab/config:/app/config"
      - "/volume1/docker/ab/data:/app/data"
      - "/volume1/docker/ab/log:/app/log"
```

Click **Next**, then **Done**.

{data-zoomable}

Once created, open `http://<NAS IP>:7892` to reach AB and configure it.

107
docs/deploy/quick-start.md
Normal file
@@ -0,0 +1,107 @@
# Quick Start

We recommend deploying AutoBangumi in Docker.
Before deploying, make sure [Docker Engine][docker-engine] or [Docker Desktop][docker-desktop] is installed.

## Create the Data and Config Folders

To keep data and configuration across updates, persist them with Docker volumes.

```shell
docker volume create AutoBangumi_config
docker volume create AutoBangumi_data
```

## Deploy AutoBangumi with Docker

### Option 1: Docker CLI

Copy and run the following command.

```shell
docker run -d \
  --name=AutoBangumi \
  -v AutoBangumi_config:/app/config \
  -v AutoBangumi_data:/app/data \
  -p 7892:7892 \
  --network=bridge \
  --dns=8.8.8.8 \
  --restart unless-stopped \
  estrellaxd/auto_bangumi:latest
```

### Option 2: Docker Compose

Copy the following into a `docker-compose.yml` file, then run `docker-compose up -d`.

```yaml
version: "3.8"

services:
  AutoBangumi:
    image: estrellaxd/auto_bangumi:latest
    container_name: AutoBangumi
    volumes:
      - AutoBangumi_config:/app/config
      - AutoBangumi_data:/app/data
    ports:
      - 7892:7892
    restart: unless-stopped
    dns:
      - 223.5.5.5
    network_mode: bridge

volumes:
  AutoBangumi_config:
    name: AutoBangumi_config
  AutoBangumi_data:
    name: AutoBangumi_data
```

## Install qBittorrent

If you don't have qBittorrent yet, install it first.

- [Install qBittorrent in Docker][qbittorrent-docker]
- [Install qBittorrent on Windows/macOS][qbittorrent-desktop]
- [Install qBittorrent-nox on Linux][qbittorrent-nox]

## Get Your Mikan Project RSS Link

Go to [Mikan Project][mikan-project], register and log in, then click the **RSS** button at the bottom right and copy the link.

{data-zoomable}

The resulting RSS URL looks like this:

```txt
https://mikanani.me/RSS/MyBangumi?token=xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
# or
https://mikanime.tv/RSS/MyBangumi?token=xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
```

Copy the part after `token=`.

See [Mikan RSS][config-rss] for detailed steps.

## Configure AutoBangumi

After installation, AB's WebUI starts automatically, but the main program stays paused; open `http://abhost:7892` to configure it.

1. Enter the downloader's address, port, username, and password.

   {width=500}{class=ab-shadow-card}

2. Enter the Mikan RSS token.

   {width=500}{class=ab-shadow-card}

3. Click **Apply** to save; AB restarts, and when the dot in the top-right corner turns green, AB is running normally.

[docker-engine]: https://docs.docker.com/engine/install/
[docker-desktop]: https://www.docker.com/products/docker-desktop
[config-rss]: ../config/rss
[mikan-project]: https://mikanani.me/
[qbittorrent-docker]: https://hub.docker.com/r/superng6/qbittorrent
[qbittorrent-desktop]: https://www.qbittorrent.org/download
[qbittorrent-nox]: https://www.qbittorrent.org/download-nox
41
docs/deploy/unix.md
Normal file
@@ -0,0 +1,41 @@
# Running AutoBangumi Locally

## Clone the Project

```bash
git clone https://github.com/EstrellaXD/Auto_Bangumi.git
```

## Install Dependencies

Make sure `python3.10` or later and the `pip` package manager are installed locally.

```bash
python3 -m pip install -r requirements.txt
```

## Enter the Source Directory and Create the Version Info

```bash
cd backend/src

echo "VERSION = 'local'" > module/__version__.py
```

## Download the WebUI

```bash
wget https://github.com/Rewrite0/Auto_Bangumi_WebUI/releases/download/latest/dist.zip

unzip dist.zip

mv dist templates
```

## Create the Config and Data Folders and Run

```bash
mkdir "config"
mkdir "data"

python3 main.py
```
178
docs/deploy/windows.md
Normal file
@@ -0,0 +1,178 @@
# Windows Local Deployment

1. Clone the AutoBangumi `git` repository and enter it:

   ```powershell
   git clone https://github.com/EstrellaXD/Auto_Bangumi.git
   cd Auto_Bangumi
   ```

2. Create the version info file:

   ```powershell
   echo "VERSION='local'" > backend\src\__version__.py
   ```

3. Create a `python` virtual environment, activate it, and install the dependencies (make sure the version printed by `python -V` matches the requirement in the `Dockerfile`, e.g. `FROM python:3.11-alpine AS APP`):

   ```powershell
   python -m venv env
   .\env\Scripts\Activate.ps1
   python -m pip install -r backend\requirements.txt
   ```

4. Download and install the WebUI:

   ```powershell
   Invoke-WebRequest -Uri "https://github.com/Rewrite0/Auto_Bangumi_WebUI/releases/latest/download/dist.zip" -OutFile "dist.zip"
   Expand-Archive -Path "dist.zip"
   mv dist\* backend\src\templates
   ```

5. Create the `data` and `config` directories and a blank `config_dev.json` (if you need to store these directories elsewhere, linking them with a Junction Directory is recommended):

   ```powershell
   mkdir backend\src\data
   mkdir backend\src\config
   echo "{}" > backend\src\config\config_dev.json
   ```

   By default, PowerShell writes files as `UTF-16LE`; change the encoding of `config_dev.json` to `UTF-8`.

6. Run the program to verify the setup:

   ```powershell
   cd backend\src
   python main.py
   ```

7. Next, install it as a service so it starts on boot, here using `nssm` as an example:

   ```powershell
   nssm install AutoBangumi (Get-Command python).Source
   nssm set AutoBangumi AppParameters (Get-Item .\main.py).FullName
   nssm set AutoBangumi AppDirectory (Get-Item ..).FullName
   nssm set AutoBangumi Start SERVICE_DELAYED_AUTO_START
   ```

8. [Optional] Before version 3.0, AutoBangumi could not edit rules or batch-move download locations, so you might hit cases where the season name is wrong (e.g. "Demon Slayer: Swordsmith Village Arc" treated as a standalone one-season show instead of the third season of the series), or where you want to keep downloading a show but move it out of the library so it stops appearing in new-episode notifications. Consider separating the download directory from the library directory and connecting them with Junction Directories; you can then freely move the links while managing the library without affecting AutoBangumi. For example:

   ```powershell
   ### Configurations
   $downloadDir = "path\to\download_dir"
   $libraryDir = "path\to\library_dir"
   $logFile = $(Join-Path -Path $downloadDir -ChildPath "downloadWatcher.log")
   $subfolderCreationTimeout = 10
   $watcher = New-Object System.IO.FileSystemWatcher
   $watcher.Path = $downloadDir
   $watcher.EnableRaisingEvents = $true

   function CreateJunction(
       # The path to the folder containing junction targets, e.g. $downloadDir\<ShowName>
       # The junction targets are its subfolders e.g. $downloadDir\<ShowName>\<SeasonName>
       $targetRoot
   ) {
       # The basename of $targetRoot, e.g. <ShowName>
       $targetRootName = Split-Path -Path $targetRoot -Leaf
       # The path to the folder where junctions are created, e.g. $libraryDir\<ShowName>
       $junctionRoot = $(Join-Path -Path $libraryDir -ChildPath $targetRootName)
       # Create $junctionRoot if it does not exist
       if (!(Test-Path $junctionRoot)) {
           New-Item -ItemType Directory -Path $junctionRoot
           Add-Content $logFile -Value "[Information] $(Get-Date) New folder created at $junctionRoot mirroring $targetRoot."
       }
       # Wait up to 10 secs for a subfolder to appear in $targetRoot
       # This is because if $targetRoot is newly created the downloader may not have created the subfolder yet
       $junctionTargetList = $(Get-ChildItem -Path $targetRoot -Directory)
       $subfolderWaitCount = 0
       while ($junctionTargetList.Count -eq 0) {
           if ($subfolderWaitCount -ge $subfolderCreationTimeout) {
               Add-Content $logFile -Value "[Warning] $(Get-Date) No subfolders exist in $targetRoot for junctioning, skipping."
               Return
           }
           Start-Sleep -Seconds 1
           try {
               $junctionTargetList = $(Get-ChildItem -Path $targetRoot -Directory)
           }
           # If $targetRoot is removed/renamed during the wait, skip
           catch [System.IO.DirectoryNotFoundException] {
               Add-Content $logFile -Value "[Warning] $(Get-Date) $targetRoot is removed/renamed during the wait, skipping."
               Return
           }
           $subfolderWaitCount++
       }
       Get-ChildItem $junctionRoot | Where-Object {$_.LinkType -eq "Junction"} | ForEach-Object {
           # If a junction target is non-existent, remove it
           if (!(Test-Path $_.Target)) {
               Remove-Item $_.FullName
               Add-Content $logFile -Value "[Information] $(Get-Date) Junction at $($_.FullName) is removed because its target $($_.Target) is non-existent."
           }
           else {
               # Remove a junction target from $junctionTargetList if a junction in $junctionRoot is already pointing to it
               $existingTarget = $_.Target
               $junctionTargetList = $junctionTargetList | Where-Object {$_.FullName -ne $existingTarget}
               Add-Content $logFile -Value "[Debug] $(Get-Date) $($_.FullName) already exists, skipping."
           }
       }
       # Create junctions for each remaining target in $junctionTargetList
       for ($i = 0; $i -lt $junctionTargetList.Count; $i++) {
           $junctionTarget = $junctionTargetList[$i]
           # The default name for the junction is the name of the junction target itself, e.g. <SeasonName>
           $junctionName = $junctionTarget.Name
           # If a junction with the same name already exists, append the current date to the name, e.g. <SeasonName>-yyyy-MM-dd
           if (Test-Path $(Join-Path -Path $junctionRoot -ChildPath $junctionName)) {
               $junctionName = "$junctionName-$(Get-Date -Format "yyyy-MM-dd")"
               # If the new name is still taken, append a random string generated by taking the first 5 chars of New-Guid, e.g. <SeasonName>-yyyy-MM-dd-<RandomString>
               while (Test-Path $(Join-Path -Path $junctionRoot -ChildPath $junctionName)) {
                   $junctionName = "$junctionName-$((New-Guid).ToString().Substring(0, 5))"
               }
           }
           # Create the junction
           New-Item -ItemType Junction -Path $(Join-Path -Path $junctionRoot -ChildPath $junctionName) -Value $junctionTarget.FullName
           Add-Content $logFile -Value "[Information] $(Get-Date) New junction created at $(Join-Path -Path $junctionRoot -ChildPath $junctionName) pointing to $junctionTarget."
       }
   }

   $action = {
       # Event arguments, see https://learn.microsoft.com/en-us/dotnet/api/system.io.filesystemeventargs
       $details = $event.SourceEventArgs
       $path = $details.FullPath # Gets the full path of the affected file or directory.
       $changeType = $details.ChangeType # Gets the change type, e.g. Created, Deleted, Renamed
       Add-Content $logFile -Value "[Debug] $(Get-Date) $changeType event detected at $path."
       if (!(Test-Path $path -PathType Container)) {
           Add-Content $logFile -Value "[Debug] $(Get-Date) $path is not mirrored because it is not a folder."
           Return
       }
       # If the directory contains a .nomirror file, skip
       if (Test-Path $(Join-Path -Path $path -ChildPath ".nomirror")) {
           Add-Content $logFile -Value "[Debug] $(Get-Date) $path is not mirrored because it contains a .nomirror file."
           Return
       }
       # Process the directory as a root of junction targets
       $targetRoot = $path
       # If the directory is renamed, rename its mirror directory
       if ($changeType -eq [System.IO.WatcherChangeTypes]::Renamed) {
           $oldJunctionRoot = $(Join-Path -Path $libraryDir -ChildPath $details.OldName)
           $newJunctionRoot = $(Join-Path -Path $libraryDir -ChildPath $details.Name)
           if (Test-Path $oldJunctionRoot) {
               Rename-Item -Path $oldJunctionRoot -NewName $details.Name
               Add-Content $logFile -Value "[Information] $(Get-Date) $oldJunctionRoot is renamed to $newJunctionRoot."
           }
           else {
               Add-Content $logFile -Value "[Warning] $(Get-Date) Junction at $oldJunctionRoot does not exist, skipping."
           }
       }
       # If a directory is modified or newly created, update/create its mirror directory by creating/updating junctions to point to its subfolders
       if ($changeType -eq [System.IO.WatcherChangeTypes]::Changed -or `
           $changeType -eq [System.IO.WatcherChangeTypes]::Renamed -or `
           $changeType -eq [System.IO.WatcherChangeTypes]::Created) {
           CreateJunction $targetRoot
       }
   }

   Register-ObjectEvent -InputObject $watcher -EventName Created -Action $action
   Register-ObjectEvent -InputObject $watcher -EventName Changed -Action $action
   Register-ObjectEvent -InputObject $watcher -EventName Renamed -Action $action
   while ($true) {Start-Sleep 5}
   ```

   The script above defines a FileSystemWatcher that monitors the download directory for changes and mirrors them into the library directory; it can be installed as a service with `nssm` so it runs automatically. To exclude a directory, just create a file named `.nomirror` inside it.

docs/deploy/wsl.md (new file, 61 lines)
@@ -0,0 +1,61 @@
# WSL Deployment Notes

Thanks to #73 for the contribution.

WSL users can deploy AutoBangumi with the following `docker-compose.yml`:

```yml
version: "3.6"
services:
  qbittorrent:
    container_name: qbittorrent
    image: johngong/qbittorrent:latest
    hostname: qbittorrent
    environment:
      - QB_EE_BIN=false
      - UID=1000 # UID of the current WSL login user; run `id <username>` inside WSL to look it up
      - GID=1000
      - QB_WEBUI_PORT=8989
    ports:
      - "6881:6881"
      - "6881:6881/udp"
      - "8989:8989"
    volumes:
      - qb_config:/config
      - /mnt/g/animation:/Downloads # download path; maps to G:\animation on Windows
    networks:
      - AutoBangumi_network
    restart: unless-stopped

  AutoBangumi:
    image: estrellaxd/auto_bangumi:latest
    container_name: AutoBangumi
    ports:
      - 7892:7892
    depends_on:
      - qbittorrent
    volumes:
      - ./config:/app/config
      - ./data:/app/data
    environment:
      - PUID=1000
      - PGID=1000
      - TZ=Asia/Shanghai
      - AB_DOWNLOADER_HOST=qbittorrent:8989
      - AB_DOWNLOADER_USERNAME=admin
      - AB_DOWNLOADER_PASSWORD=adminadmin
      - AB_NOT_CONTAIN=720|繁体|CHT|JPTC|繁日|BIG5
      - AB_DOWNLOAD_PATH=/Downloads # must match the path mapped in qbittorrent, otherwise downloads may fail
      - AB_RSS=https://mikanani.me/RSS/MyBangumi?token=xxxxxxxx%3d%3d # subscription URL; replace with your own
    networks:
      - AutoBangumi_network
    restart: unless-stopped

networks:
  AutoBangumi_network:
volumes:
  qb_config:
    external: false
  auto_bangumi:
    external: false
```
docs/dev/index.md (new file, 138 lines)
@@ -0,0 +1,138 @@
# Contributing

We welcome all contributors to help AutoBangumi better solve the problems people run into.

This guide shows you how to contribute features and fixes to AutoBangumi; please take a few minutes to read it before opening a Pull Request.

What this article covers:

- [Roadmap](#roadmap)
- [Request for Comments (RFC)](#request-for-comments-rfc)
- [Git Branch Management](#git-branch-management)
- [Version Numbers](#version-numbers)
- [Branch Development, Trunk Release](#branch-development-trunk-release)
- [Branch Lifecycle](#branch-lifecycle)
- [Git Workflow at a Glance](#git-workflow-at-a-glance)
- [Pull Request](#pull-request)
- [Release Process](#release-process)
## Roadmap

The AutoBangumi team uses the [GitHub Project](https://github.com/EstrellaXD/Auto_Bangumi/projects?query=is%3Aopen) board to track planned work, issues being fixed, and their progress.

This helps you see:
- what the development team is currently working on;
- what lines up with the direction you want to contribute to, so you can jump straight into implementing and polishing it;
- what is already in progress, so you avoid duplicating unnecessary work.

On the [Project](https://github.com/EstrellaXD/Auto_Bangumi/projects?query=is%3Aopen) board, besides the usual `[Feature Request]`, `[BUG]`, and small improvement items, there is also a category of **`[RFC]`** items.

### Request for Comments (RFC)

> The existing [AutoBangumi RFCs](https://github.com/EstrellaXD/Auto_Bangumi/issues?q=is%3Aissue+label%3ARFC) can be found via the `RFC` label on issues.

For small improvements or bug fixes, feel free to adjust the code and open a Pull Request directly; just skim the [Git Branch Management](#git-branch-management) section so you base your fix on the correct version branch, and the [Pull Request](#pull-request) section to learn how PRs are merged.

<br/>

If, however, you are planning a **larger** feature or refactor, with a wide scope touching many areas, please first write an RFC via [Issue: 功能提案](https://github.com/EstrellaXD/Auto_Bangumi/issues/new?assignees=&labels=RFC&projects=&template=rfc.yml&title=%5BRFC%5D%3A+) that briefly outlines how you intend to do it, to seek discussion and consensus with the developers.

Some approaches may be things the development team has already discussed and decided against, and this step can save you from wasting a lot of effort.

> If you only want to discuss whether a feature should be added or improved, rather than how to implement it, use -> [Issue: 功能改进](https://github.com/EstrellaXD/Auto_Bangumi/issues/new?labels=feature+request&template=feature_request.yml&title=%5BFeature+Request%5D+)

<br/>

An [RFC](https://github.com/EstrellaXD/Auto_Bangumi/issues?q=is%3Aissue+is%3Aopen+label%3ARFC) is positioned as **"a document for developers to review the technical design of a feature/refactor before its actual development"**.

Its goal is to make it clear to collaborating developers what will be done and how exactly it will be done, and to let all developers take part in the discussion openly and transparently,

so that its impact can be evaluated and discussed (overlooked considerations, backward compatibility, conflicts with existing features).

A proposal therefore focuses on describing the **approach, design, and steps** of the solution.
## Git Branch Management

### Version Numbers

Git branch usage in the AutoBangumi project is closely tied to the release rules, so the versioning convention comes first.

AutoBangumi release numbers follow [Semantic Versioning (SemVer)](https://semver.org/lang/zh-CN/),

using the three-part `<Major>.<Minor>.<Patch>` format, where a bump in each position means:

- **Major**: a major release, very likely with incompatible config/API changes
- **Minor**: backward-compatible feature additions
- **Patch**: backward-compatible bug fixes / small tweaks

### Branch Development, Trunk Release

The AutoBangumi project uses a "develop on branches, release from the trunk" model.

The [**`main`**](https://github.com/EstrellaXD/Auto_Bangumi/commits/main) branch is the stable **"trunk branch"**: it is used only for cutting releases, never for developing new features or fixes directly.

Every Minor version has a corresponding **"development branch"** used to develop new features and to maintain fixes after release.

Development branches are named `<Major>.<Minor>-dev`, e.g. `3.1-dev`, `3.0-dev`, `2.6-dev`; you can find them by [searching the repository's All Branches](https://github.com/EstrellaXD/Auto_Bangumi/branches/all?query=-dev).

### Branch Lifecycle

When a Minor development branch (take `3.1-dev` as an example) finishes its new-feature work and is merged into main for the **first** time:

- the Minor version (e.g. `3.1.0`) is released;
- at the same time, the **next** Minor development branch (`3.2-dev`) is branched off for the next version's feature work;
- the **previous** version's development branch (`3.0-dev`) is archived and no longer maintained;
- this Minor branch (`3.1-dev`) enters maintenance: no new features or refactors, only bug fixes;
- after a bug fix lands on the maintained Minor branch (`3.1-dev`), it is merged into main again and a `Patch` version is released.

From this flow, contributors choose their Git branch as follows:

- for a **bug fix**, base the fix on the **currently released** version's Minor branch, and open the PR against that branch;
- for a **new feature/refactor**, base the work on the **not-yet-released next** Minor branch, and open the PR against that branch.

> The "currently released version" is the latest version on the [[Releases page]](https://github.com/EstrellaXD/Auto_Bangumi/releases), which also matches the latest version in the [[GitHub Container Registry]](https://github.com/EstrellaXD/Auto_Bangumi/pkgs/container/auto_bangumi).
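The branch-selection rule above can be sketched as a tiny helper (an illustration only; the function and its names are not part of AutoBangumi):

```python
def target_branch(released: str, change: str) -> str:
    """Pick the PR target branch from the latest released version.

    released: the latest release, e.g. "3.1.2"
    change:   "fix" for bug fixes, "feature" for new features/refactors
    """
    major, minor, _patch = (int(x) for x in released.split("."))
    if change == "fix":
        # Bug fixes go to the currently released Minor's maintenance branch.
        return f"{major}.{minor}-dev"
    if change == "feature":
        # New features go to the next, not-yet-released Minor branch.
        return f"{major}.{minor + 1}-dev"
    raise ValueError("change must be 'fix' or 'feature'")

print(target_branch("3.1.2", "fix"))      # 3.1-dev
print(target_branch("3.1.2", "feature"))  # 3.2-dev
```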
### Git Workflow at a Glance

> In the diagram, the commit timeline runs from left to right --->

![git-workflow](https://raw.githubusercontent.com/wiki/EstrellaXD/Auto_Bangumi/image/develop/git-workflow.png)
## Pull Request

Please make sure you pick the correct PR target branch according to the Git branch management section above:

> - for a **bug fix**, PR against the **currently released** version's Minor maintenance branch
> - for a **new feature/refactor**, PR against the **next** version's Minor development branch

<br/>

- A PR should do exactly one thing and should not introduce unrelated changes;

  split separate concerns into multiple PRs, which helps the team focus each review on a single problem.

- In the PR title and description, briefly explain what changed, including the reason and intent;

  if there are related issues or RFCs, link them in the PR description:

  this gives the team the fastest path to the context during code review.

- Make sure the `Allow edits from maintainers` option is checked. This lets us make smaller edits/refactors directly and saves a lot of time.

- Make sure the unit tests and code-style lint pass locally; they are also checked in the PR's GitHub CI.
  - For bug fixes and new features, the team will usually also ask you to add unit-test coverage for the change.

The team will review contributors' PRs, discuss them, and approve merges as soon as time allows.
## Release Process

Releases are currently triggered automatically after the team manually merges a dedicated release PR.

Bug-fix PRs are usually released soon after merging, typically within a week;

new-feature releases take longer and follow no fixed schedule. You can watch the progress on our [GitHub Project](https://github.com/EstrellaXD/Auto_Bangumi/projects?query=is%3Aopen) board; a version is released once all the features planned for it are complete.
@@ -1,71 +0,0 @@
version: "3.2"
services:
  qbittorrent:
    container_name: qBittorrent
    environment:
      - TZ=Asia/Shanghai
      - TemPath=/downloads
      - SavePath=/downloads
      - PGID=${GID}
      - PUID=${UID}
      - WEBUI_PORT=${QB_PORT}
    volumes:
      - qb_config:/config
      - ${DOWNLOAD_PATH}:/downloads
    ports:
      - ${QB_PORT}:${QB_PORT}
      - 6881:6881
      - 6881:6881/udp
    networks:
      - auto_bangumi
    restart: unless-stopped
    image: superng6/qbittorrent:latest

  auto_bangumi:
    container_name: AutoBangumi
    environment:
      - TZ=Asia/Shanghai
      - PGID=${GID}
      - PUID=${UID}
      - AB_DOWNLOADER_HOST=qbittorrent:${QB_PORT}
    networks:
      - auto_bangumi
    volumes:
      - /path/to/config:/app/config
      - /path/to/data:/app/data
    ports:
      - 7892:7892
    dns:
      - 8.8.8.8
      - 223.5.5.5
    restart: unless-stopped
    image: estrellaxd/auto_bangumi:latest
    depends_on:
      - qbittorrent

  plex:
    container_name: Plex
    environment:
      - TZ=Asia/Shanghai
      - PUID=${UID}
      - PGID=${GID}
      - VERSION=docker
    networks:
      - auto_bangumi
    ports:
      - 32400:32400
    volumes:
      - plex_config:/config
      - ${DOWNLOAD_PATH}/Bangumi:/tv
    restart: unless-stopped
    image: lscr.io/linuxserver/plex:latest

networks:
  auto_bangumi:

volumes:
  qb_config:
    external: false
  plex_config:
    external: false
@@ -1,26 +0,0 @@
version: '3.2'
services:
  auto_bangumi:
    container_name: AutoBangumi
    environment:
      - TZ=Asia/Shanghai
      - PGID=${GID}
      - PUID=${UID}
    networks:
      - auto_bangumi
    ports:
      - '7892:7892'
    volumes:
      - ./config:/app/config
      - ./data:/app/data
    dns:
      - 8.8.8.8
      - 223.5.5.5
    restart: unless-stopped
    image: estrellaxd/auto_bangumi:latest

networks:
  auto_bangumi:
volumes:
  auto_bangumi:
    external: false
@@ -1,50 +0,0 @@
version: "3.2"
services:
  qbittorrent:
    container_name: qBittorrent
    environment:
      - TZ=Asia/Shanghai
      - TemPath=/downloads
      - SavePath=/downloads
      - PGID=${GID}
      - PUID=${UID}
      - WEBUI_PORT=${QB_PORT}
    volumes:
      - qb_config:/config
      - ${DOWNLOAD_PATH}:/downloads # absolute path of the download directory
    ports:
      - ${QB_PORT}:${QB_PORT}
      - "6881:6881"
      - "6881:6881/udp"
    networks:
      - auto_bangumi
    restart: unless-stopped
    image: superng6/qbittorrent

  auto_bangumi:
    container_name: AutoBangumi
    environment:
      - TZ=Asia/Shanghai
      - PGID=${GID}
      - PUID=${UID}
      - AB_DOWNLOADER_HOST=qbittorrent:${QB_PORT}
    volumes:
      - ./config:/app/config
      - ./data:/app/data
    networks:
      - auto_bangumi
    ports:
      - '7892:7892'
    dns:
      - 8.8.8.8
      - 223.5.5.5
    restart: unless-stopped
    image: estrellaxd/auto_bangumi:latest
    depends_on:
      - qbittorrent

networks:
  auto_bangumi:
volumes:
  qb_config:
    external: false
docs/faq/mikan-network.md (new file, 75 lines)
@@ -0,0 +1,75 @@
# Workarounds for Mikan Network Issues

Because the main Mikan Project site, `https://mikanani.me`, is currently blocked by the GFW, AB may fail to connect to Mikan Project. The following workarounds are recommended:

- [Use Mikan Project's domestic (CN) domain](#mikan-projects-domestic-domain)
- [Use a proxy](#configuring-a-proxy)
- [Use a CloudFlare Worker as a reverse proxy](#cloudflare-workers)

## Mikan Project's Domestic Domain

- Mikan Project has a new domain, `https://mikanime.tv`; use it with AB **without** enabling a proxy.

## Configuring a Proxy

1. AB ships with built-in proxy support. To use it, configure an HTTP or Socks proxy correctly as described in [Proxy Configuration](../config/proxy); once configured, the blocking problem is bypassed.
2. qBittorrent needs a proxy as well; configure it as shown in the screenshot below (Socks works the same way).
<img width="483" alt="image" src="https://user-images.githubusercontent.com/33726646/233681562-cca3957a-a5de-40e2-8fb3-4cc7f57cc139.png">

3. The 2.6 release added two more ways to deal with the blocking:

- set `源站链接` (Custom URL) in the WebUI to a URL that you reverse-proxy yourself;
- with a proxy configured, use AB itself as the reverse-proxy node.

See [Proxy Configuration](../config/proxy) for details.

## CloudFlare Workers

Drawing on the experience of working around the OpenAI block, you can also solve this with a reverse proxy. How to register a domain and bind it to CloudFlare is not covered here.
Add the following code to a Worker, and you can reach Mikan Project through your own domain and resolve and download the torrents in the RSS links.

```javascript
const TELEGRAPH_URL = 'https://mikanani.me';
const MY_DOMAIN = 'https://yourdomain.com'

addEventListener('fetch', event => {
  event.respondWith(handleRequest(event.request))
})

async function handleRequest(request) {
  const url = new URL(request.url);
  url.host = TELEGRAPH_URL.replace(/^https?:\/\//, '');

  const modifiedRequest = new Request(url.toString(), {
    headers: request.headers,
    method: request.method,
    body: request.body,
    redirect: 'manual'
  });

  const response = await fetch(modifiedRequest);
  const contentType = response.headers.get('Content-Type') || '';

  // Only rewrite the body when the content type is RSS
  if (contentType.includes('application/xml')) {
    const text = await response.text();
    const replacedText = text.replace(/https?:\/\/mikanani\.me/g, MY_DOMAIN);
    const modifiedResponse = new Response(replacedText, response);

    // Add a CORS header to allow cross-origin access
    modifiedResponse.headers.set('Access-Control-Allow-Origin', '*');

    return modifiedResponse;
  } else {
    const modifiedResponse = new Response(response.body, response);

    // Add a CORS header to allow cross-origin access
    modifiedResponse.headers.set('Access-Control-Allow-Origin', '*');

    return modifiedResponse;
  }
}
```

After finishing the setup above, put your own domain into AB's **源站链接|Custom URL** setting.
docs/faq/常见问题.md (new file, 191 lines)
@@ -0,0 +1,191 @@
# FAQ

## WebUI

### WebUI address

The default port is 7892; visit `http://serverhost:7892`. If you change the port, remember to update the Docker port mapping as well.

### Default username and password

- Default username: `admin`; default password: `adminadmin`.
- Be sure to change the password after your first login.

### Changing or forgetting the password

- To change the password: after logging in, click `···` in the top-right corner, click `Profile`, and change the username and password.
- There is currently no simple reset flow for a forgotten password; if you forget it, delete the `data/data.db` file and restart.

### Why didn't my configuration change take effect?

- After changing the configuration, click the **Apply** button, then the **Restart** button under `···`; this restarts the main process.
- If **Debug** mode is enabled, click **Shutdown** under `···`; this restarts the container.

### How to tell whether the program is running normally

The new WebUI has a small dot in the top-right corner: green means running normally, red means an error occurred and the program is paused.
## How 3.0 manages bangumi

Since upgrading to 3.0, AB can manage bangumi torrents and download rules with one click in the WebUI; this relies on the torrent's download path and the rule name.
If you manually change a torrent's download path in QB, you may run into problems such as notifications missing their posters or torrent deletion not working.
Please manage bangumi and torrents from within AB whenever possible.
## Downloads and keyword filtering

### Filling in the download path

What exactly should the **download path** be?
- This parameter just has to match the corresponding setting in your qBittorrent.
- Docker: if qB uses `/downloads`, write `/downloads/Bangumi`; `Bangumi` can be any name you like.
- Linux/macOS: for `/home/usr/downloads` or `/User/UserName/Downloads`, just append `Bangumi` at the end.
- Windows: `D:\Media\` becomes `D:\Media\Bangumi`.
### What if downloads don't start automatically?

- Check the AutoBangumi log. If the log looks normal, the problem is in qBittorrent's settings; check:
  - Options >> RSS >> Enable auto downloading of RSS torrents
  - Options >> RSS >> Enable fetching RSS feeds
- Check the qB configuration to see whether it has permission to create folders.

### Downloads not stored in the right directory

- Check whether the [download path](#filling-in-the-download-path) is correct.
- Check qBittorrent's PGID and PUID configuration to see whether it has permission to create folders. Try downloading any torrent manually into a specified directory; if it errors or the directory is not created, it is a folder-permission problem.
- Check qBittorrent's default settings: under Saving Management, choose manual saving ("Saving Management >> Default Torrent Management Mode >> Manual").
### What if lots of unsubscribed bangumi get downloaded?

- Check whether your Mikan subscription subscribes to every fansub group of a show. Subscribe to only one group per show, and preferably enable advanced subscription.
  - Advanced subscription can be enabled in Mikan Project's user settings.
- Your regex filter may be insufficient; see the next section on extending the regular expressions.
- If none of the above applies, please report it with the LOG at [ISSUE][ISSUE].
### How to write filter keywords

AB's filter keywords are regular expressions and are only applied when a rule is created. To extend rules afterwards, versions from 3.0 onward let you configure each bangumi individually in the WebUI.

- Filter keywords are a regular expression; just separate the unwanted keywords with `|`.
- The default rule `720|\d+-\d+` filters out all batch collections and 720P releases. If you want additions, add them before deploying AB; changing the environment variable later only affects newly added rules.
- Common regex keywords (separated with `|`):
  - `720` filters 720, 720P, 720p, etc.
  - `\d+-\d+` filters collections, e.g. [1-12]
  - `[Bb]aha` filters Baha releases
  - `[Bb]ilibili`, `[Bb]-Global` filter Bilibili releases
  - `繁`, `CHT` filter traditional-Chinese subtitles
- To make a keyword match (an include rule), add it in QB's "contains" field in a form like `XXXXX+1080P\+`, where `1080P\+` matches 1080P+ releases.
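As a sketch of how such an exclusion regex behaves, here is a plain Python `re` illustration (the release titles are made up for the example; this is not AB's actual filter code):

```python
import re

# The default exclusion filter from the docs:
# drops 720P releases and batch collections like [01-12].
exclude = re.compile(r"720|\d+-\d+")

titles = [
    "[Lilith-Raws] Kakkou no Iinazuke - 07 [Baha][WEB-DL][1080p]",
    "[Sub] Some Show - 03 [720P]",
    "[Sub] Some Show [01-12] batch",
]

# Keep only titles that the exclusion regex does not match anywhere.
kept = [t for t in titles if not exclude.search(t)]
print(kept)  # only the 1080p single episode survives
```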
### What to do if a botched first deployment downloaded lots of unwanted bangumi?

1. Delete the extra auto-download rules and files in QB.
2. Check your subscription and filter rules.
3. Visit the resetRule API endpoint `http://localhost:7892/api/v1/resetRule` in your browser to reset the rules.
4. Restart AB.
### AB recognizes fewer RSS entries than subscribed

In the new version, AB's filter is also applied to all RSS entries by default; don't pile every filter on at once when setting it up. For fine-grained downloading, configure each bangumi individually in the WebUI.

### Filter keywords not working

- Check whether the **global filter** parameter is set correctly.
- Check QB's RSS auto-download rules: the matched RSS entries are shown on the right. Adjust the download rule and click save to see which keyword is wrong.
## 🎬 Episode completion

### Episode completion not working

- Check whether the **episode completion** parameter is set correctly.
- Old versions of AB could not enable completion temporarily by changing the parameter; clear the data and redeploy.
  - AB 2.5.15 and later support enabling it temporarily.
## 📁 Renaming

### Parse error `NOT match with XXXXX`

- AB does not support parsing collections yet; add collections via the API.
- If it is not a collection, please report the problem in an issue.

### What about `Rename failed` or other renaming failures?

- Check the file path: the standard storage path is `/title/Season/Episode.mp4`; a non-standard path causes naming errors. When this happens, inspect your qbittorrent configuration.
- Otherwise, please report it in an issue.

### What if automatic renaming doesn't happen?

- Check whether the torrent is in the `Bangumi` category in QB.
- AB only renames files that have already been downloaded.

### How to make AB rename bangumi it didn't add

- Just change the torrent's category to `Bangumi`.
- Note that the torrent must be stored under a `Title/Season X/` folder for renaming to trigger properly.

### How to rename a collection

1. Change the collection's category to `Bangumi`.
2. Change the collection's save path to `Title/Season X/`.
3. Wait for the collection to finish downloading; renaming then completes.
## 🌍 Network connectivity

### Cannot connect to qBittorrent

- Check whether AB's **downloader host** parameter is correct.
- If AB and QB are on the same Docker network, try addressing by container name, e.g. `http://qbittorrent:8080`.
- If AB and QB are on the same Docker host, try the Docker gateway address, e.g. `http://172.17.0.1:8080`.
- If AB's network mode is not `host`, do not use `127.0.0.1` to reach QB:
  - different Docker containers cannot reach each other that way; you can instead set up the connection to AB in QB's network connection settings.
- If qBittorrent runs in HTTPS mode, add the `https://` prefix to the **downloader host** parameter.

### `DNS/Connect ERROR`

- Check the network connection; if it is fine, check DNS resolution.
- You can add `dns=8.8.8.8` to AB (ignore this in HOST network mode).
- If DNS resolution is fine, add a proxy.
- A proxy is required when using TMDB.
### How to add a proxy for Mikanani

AB currently offers three proxy modes:

1. HTTP and Socks proxies

   Older versions of AB already had this feature. Since 2.6, simply check the proxy configuration in the WebUI and Mikan Project becomes reachable.

   However, qBittorrent still cannot reach Mikan Project's RSS and torrent URLs in that case, so a proxy must be added in qBittorrent as well. See #198 for details.

2. Custom reverse-proxy URL

   AB 2.6 added a `custom_url` option that lets you set a custom reverse-proxy URL.
   Set it to a reverse proxy you have configured correctly; AB will then use that URL to reach Mikan Project, and QB can download normally too.

3. AB as a reverse-proxy relay

   With a proxy configured, AB itself can act as a local reverse-proxy relay; currently only the RSS-related endpoints are proxied.
   Just set `custom_url` to `http://abhost:abport`, where `abhost` is AB's IP address and `abport` is AB's port.
   AB then pushes its own address to qBittorrent, and qBittorrent uses AB's address as the reverse proxy to reach Mikan Project.

Note: if you have not put AB behind a reverse proxy such as NGINX, make sure to include the `http://` prefix so the program runs correctly.
Second note: do not use container addressing, or a Docker container/gateway address, in `custom_url` while a proxy is also in use; after hopping through the proxy, the RSS may become unreachable.
## 🐬 Docker

### Automatic updates

You can run a `watchtower` daemon in Docker; the daemon updates your containers automatically.

[watchtower](https://containrrr.dev/watchtower) official documentation

### Updating with Docker Compose

If your AB is deployed with Docker Compose, update it with `docker-compose pull`.
After the image is pulled, restart the container with `docker-compose up -d`.

### What to do when an upgrade goes wrong

Since everyone's configuration differs, an upgrade may currently leave the program unable to run. If that happens, delete all old data and generated configuration files, restart the container,
then reconfigure in the WebUI.
If you are upgrading from an old version, read the [upgrade guide](/changelog/2.6) first.

For any problem not covered above, please file a report at [ISSUE](https://github.com/EstrellaXD/Auto_Bangumi/issues) following the bug template.
docs/faq/排错流程.md (new file, 46 lines)
@@ -0,0 +1,46 @@
---
title: Troubleshooting
---

## 💡 General troubleshooting flow

1. If AB cannot start, check that the start command is correct; the correct command for the current version is `/init`. If the command is wrong and you don't know how to change it, try redeploying AB.
2. After deploying AB, check the log first. Output like the following means AB is running normally and is connected to QB:

   ```
   [2022-07-09 21:55:19,164] INFO: _ ____ _
   [2022-07-09 21:55:19,165] INFO: /\ | | | _ \ (_)
   [2022-07-09 21:55:19,166] INFO: / \ _ _| |_ ___ | |_) | __ _ _ __ __ _ _ _ _ __ ___ _
   [2022-07-09 21:55:19,167] INFO: / /\ \| | | | __/ _ \| _ < / _` | '_ \ / _` | | | | '_ ` _ \| |
   [2022-07-09 21:55:19,167] INFO: / ____ \ |_| | || (_) | |_) | (_| | | | | (_| | |_| | | | | | | |
   [2022-07-09 21:55:19,168] INFO: /_/ \_\__,_|\__\___/|____/ \__,_|_| |_|\__, |\__,_|_| |_| |_|_|
   [2022-07-09 21:55:19,169] INFO: __/ |
   [2022-07-09 21:55:19,169] INFO: |___/
   [2022-07-09 21:55:19,170] INFO: Version 3.0.1 Author: EstrellaXD Twitter: https://twitter.com/Estrella_Pan
   [2022-07-09 21:55:19,171] INFO: GitHub: https://github.com/EstrellaXD/Auto_Bangumi/
   [2022-07-09 21:55:19,172] INFO: Starting AutoBangumi...
   [2022-07-09 21:55:20,717] INFO: Add RSS Feed successfully.
   [2022-07-09 21:55:21,761] INFO: Start collecting RSS info.
   [2022-07-09 21:55:23,431] INFO: Finished
   [2022-07-09 21:55:23,432] INFO: Running....
   [2022-07-09 22:01:24,534] INFO: [NC-Raws] 继母的拖油瓶是我的前女友 - 01 (B-Global 1920x1080 HEVC AAC MKV) [0B604F3A].mkv >> 继母的拖油瓶是我的前女友 S01E01.mkv
   ```

3. A LOG like the following means AB cannot connect to qBittorrent; check whether qBittorrent is running normally. If it is, jump to the [network section](/faq/常见问题#🌍-网络链接) to troubleshoot.

   ```
   [2022-07-09 22:01:24,534] WARNING: Cannot connect to qBittorrent, wait 5min and retry
   ```

4. A LOG like the following means AB cannot connect to the Mikan RSS; jump to the [network section](/faq/常见问题#🌍-网络链接) to troubleshoot.

   ```
   [2022-07-09 21:55:21,761] INFO: Start collecting RSS info.
   [2022-07-09 22:01:24,534] WARNING: Connected Failed,please check DNS/Connection
   ```

5. If QB has no download tasks at this point, go to the RSS auto-download rules page and check whether the rules AB created are correct:
   1. Check the RSS feed: a normal RSS icon means the feed is fine; a **warning icon** means the feed is broken, so check connectivity between qBittorrent and the Mikan RSS.
   2. If there are no download rules at all, check whether the RSS feed is empty and whether `filter` is set to too many exclusion values; jump to the [filtering section](/faq/常见问题#下载以及关键词过滤).
   3. Check whether the created rules are correct; if something is wrong, report it at [issue](https://www.github.com/EstrellaXD/Auto_Bangumi/issues).
   4. If a rule exists but nothing downloads, click the rule and check whether any entries are matched on the right; if not, remove some filter values.
   5. Check that "Always pause" is turned off in the auto-download rule.
6. At this point QB should have download tasks:
   1. If a download task has a path problem, check whether QB's "Saving Management >> Default Torrent Management Mode" is set to "Manual"; if not, change it to "Manual".
   2. If every download shows an exclamation mark, or no category folder is created under the download path, check QB's permissions.
7. If none of the checks above helps, try deploying a brand-new qBittorrent.
8. If it still fails, bring the LOG to [issue](https://www.github.com/EstrellaXD/Auto_Bangumi/issues).
docs/home/index.md (new file, 88 lines)
@@ -0,0 +1,88 @@
---
title: About the Project
---

<p align="center">
<picture>
  <source media="(prefers-color-scheme: dark)" srcset="../image/icons/dark-icon.svg">
  <source media="(prefers-color-scheme: light)" srcset="../image/icons/light-icon.svg">
  <img src="../image/icons/light-icon.svg" width=50%>
</picture>
</p>

## About the Project

<p align="center">
<img
  title="AutoBangumi WebUI"
  alt="AutoBangumi WebUI"
  src="../image/preview/window.png"
  width=85%
  data-zoomable
>
</p>

**`AutoBangumi`** is a fully automated tool for following, downloading, and organizing anime from [Mikan Project](https://mikanani.me).
Just subscribe to your shows on [Mikan Project](https://mikanani.me), and it follows, downloads, and organizes the files automatically;
the organized names and directory layout are recognized directly by media-library software such as [Plex]() and [Jellyfin](), with no second scraping pass needed.
## Features

- A single, simple configuration keeps working continuously
- A hands-off `RSS` parser that extracts bangumi info and generates download rules automatically
- Bangumi file organization:

  ```
  Bangumi
  ├── bangumi_A_title
  │   ├── Season 1
  │   │   ├── A S01E01.mp4
  │   │   ├── A S01E02.mp4
  │   │   ├── A S01E03.mp4
  │   │   └── A S01E04.mp4
  │   └── Season 2
  │       ├── A S02E01.mp4
  │       ├── A S02E02.mp4
  │       ├── A S02E03.mp4
  │       └── A S02E04.mp4
  ├── bangumi_B_title
  │   └── Season 1
  ```

- Fully automatic renaming; over 99% of renamed shows can be scraped directly by media-library software

  ```
  [Lilith-Raws] Kakkou no Iinazuke - 07 [Baha][WEB-DL][1080p][AVC AAC][CHT][MP4].mp4
  >>
  Kakkou no Iinazuke S01E07.mp4
  ```

- Custom renaming: all child files can be renamed based on their parent folder
- Subscribing mid-season backfills every episode missed earlier in the season
- Highly customizable options that can be fine-tuned for different media-library software
- Maintenance-free, completely hands-off operation
- Built-in TMDB parser that can generate complete TMDB-format files and bangumi info
- Reverse-proxy support for the Mikan RSS
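The rename shown above can be sketched with a small regex. This is a simplified illustration only, not AutoBangumi's actual parser; the pattern and the `simple_rename` helper are assumptions made for the example:

```python
import re

def simple_rename(raw: str, season: int = 1) -> str:
    # Very simplified sketch: grab "<title> - <episode>" from a typical
    # "[Group] Title - 07 [tags...].mp4" release name.
    m = re.match(r"\[[^\]]+\]\s*(?P<title>.+?)\s*-\s*(?P<ep>\d+)\b", raw)
    if not m:
        raise ValueError(f"NOT match with {raw}")
    ext = raw.rsplit(".", 1)[-1]
    return f"{m['title']} S{season:02d}E{int(m['ep']):02d}.{ext}"

print(simple_rename(
    "[Lilith-Raws] Kakkou no Iinazuke - 07 [Baha][WEB-DL][1080p][AVC AAC][CHT][MP4].mp4"
))  # Kakkou no Iinazuke S01E07.mp4
```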
## Community

- Update feed: [Telegram Channel](https://t.me/autobangumi_update)
- Bug-report group: [Telegram](https://t.me/+yNisOnDGaX5jMTM9)

## Acknowledgements

Thanks to [Sean](https://github.com/findix) for a great deal of help

## Contributing

Issues and PRs are welcome

<a href="https://github.com/EstrellaXD/Auto_Bangumi/graphs/contributors">
  <img src="https://contrib.rocks/image?repo=EstrellaXD/Auto_Bangumi" />
</a>

## Licence

[MIT licence](https://github.com/EstrellaXD/Auto_Bangumi/blob/main/LICENSE)
docs/image/config/downloader.png (new binary file; after: 106 KiB)
docs/image/config/manager.png (new binary file; after: 106 KiB)
docs/image/config/notifier.png (new binary file; after: 96 KiB)

Some files were not shown because too many files have changed in this diff.