perf: comprehensive performance optimization for backend and frontend

Backend: shared HTTP connection pool, concurrent RSS/torrent/notification
operations, TMDB/Mikan result caching, database indexes, pre-compiled
regex, __slots__ on dataclasses, O(1) set-based dedup, frozenset lookups,
batch RSS enable/disable, asyncio.to_thread for blocking calls.

Frontend: shallowRef for large arrays, computed table columns, watch
instead of watchEffect, scoped style fix, typed emits, noImplicitAny,
useIntervalFn lifecycle management, shared useClipboard instance,
shallow clone for shared template objects.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
This commit is contained in:
Estrella Pan
2026-01-24 20:46:45 +01:00
parent 929a88c343
commit cba4988e52
38 changed files with 409 additions and 262 deletions

View File

@@ -1,3 +1,56 @@
# [3.2.0-beta.5] - 2026-01-24
## Backend
### Performance
- 新增共享 HTTP 客户端连接池,复用 TCP/SSL 连接,减少每次请求的握手开销
- RSS 刷新改为并发拉取所有订阅源(`asyncio.gather`),多源场景下速度提升约 10 倍
- 种子文件下载改为并发获取,下载多个种子时速度提升约 5 倍
- 重命名模块并发获取所有种子文件列表,速度提升约 20 倍
- 通知发送改为并发执行,移除 2 秒硬编码延迟
- 新增 TMDB 和 Mikan 解析结果的内存缓存,避免重复 API 调用
-`Torrent.url``Torrent.rss_id``Bangumi.title_raw``Bangumi.deleted``RSSItem.url` 添加数据库索引
- RSS 批量启用/禁用改为单次事务操作,替代逐条提交
- 预编译正则表达式(种子名解析规则、过滤器匹配),避免运行时重复编译
- `SeasonCollector` 在循环外创建,复用单次认证
- `check_first_run` 缓存默认配置字典,避免每次创建新对象
- 通知模块中的同步数据库调用改为 `asyncio.to_thread`,避免阻塞事件循环
- RSS 解析去重从 O(n²) 列表查找改为 O(1) 集合查找
- 文件后缀判断使用 `frozenset` 替代列表,提升查找效率
- `Episode`/`SeasonInfo` 数据类添加 `__slots__`,减少内存占用
- RSS XML 解析返回元组列表,替代三个独立列表再 zip 的模式
- qBittorrent 规则设置改为并发执行
## Frontend
### Performance
- 下载器 store 使用 `shallowRef` 替代 `ref`,避免大数组的深层响应式代理
- 表格列定义改为 `computed`,避免每次渲染重建
- RSS 表格列与数据分离,数据变化时不重建列配置
- 日历页移除重复的 `getAll()` 调用
- `ab-select``watchEffect` 改为 `watch`,消除挂载时的无效 emit
- `useClipboard` 提升到 store 顶层,避免每次 `copy()` 创建新实例
- `setInterval` 替换为 `useIntervalFn`,自动生命周期管理,防止内存泄漏
- 共享 `ruleTemplate` 对象改为浅拷贝,避免意外的引用共变
- `ab-add-rss` 移除不必要的 `setTimeout` 延迟
### Fixes
- 修复 `ab-image.vue``<style scope>` 的拼写错误(应为 `scoped`
- 修复 `ab-edit-rule.vue``String` 类型应为 `string`
- `bangumi` ref 初始化为 `[]` 而非 `undefined`,减少下游空值检查
- `ab-bangumi-card` 模板类型安全:动态属性访问改为显式枚举
- 启用 `noImplicitAny: true` 提升类型安全
### Types
- `ab-button``ab-search``defineEmits` 改为类型化声明
- `ab-data-list` 使用明确的 `DataItem` 类型替代 `any`
---
# [3.2.0-beta.4] - 2026-01-24
## Backend

View File

@@ -10,6 +10,16 @@ from module.update import version_check
logger = logging.getLogger(__name__)
_default_config_dict: dict | None = None
def _get_default_config_dict() -> dict:
global _default_config_dict
if _default_config_dict is None:
_default_config_dict = Config().dict()
return _default_config_dict
class Checker:
def __init__(self):
pass
@@ -32,7 +42,7 @@ class Checker:
def check_first_run() -> bool:
if Path("config/.setup_complete").exists():
return False
return settings.dict() == Config().dict()
return settings.dict() == _get_default_config_dict()
@staticmethod
def check_version() -> tuple[bool, int | None]:

View File

@@ -58,11 +58,11 @@ class RenameThread(ProgramStatus):
while not self.stop_event.is_set():
async with Renamer() as renamer:
renamed_info = await renamer.rename()
if settings.notification.enable:
if settings.notification.enable and renamed_info:
async with PostNotification() as notifier:
for info in renamed_info:
await notifier.send_msg(info)
await asyncio.sleep(2)
await asyncio.gather(
*[notifier.send_msg(info) for info in renamed_info]
)
try:
await asyncio.wait_for(
self.stop_event.wait(),

View File

@@ -62,6 +62,13 @@ class RSSDatabase:
self.session.commit()
return True
def enable_batch(self, ids: list[int]):
statement = select(RSSItem).where(RSSItem.id.in_(ids))
result = self.session.execute(statement)
for item in result.scalars().all():
item.enabled = True
self.session.commit()
def disable(self, _id: int) -> bool:
statement = select(RSSItem).where(RSSItem.id == _id)
result = self.session.execute(statement)
@@ -73,6 +80,13 @@ class RSSDatabase:
self.session.commit()
return True
def disable_batch(self, ids: list[int]):
statement = select(RSSItem).where(RSSItem.id.in_(ids))
result = self.session.execute(statement)
for item in result.scalars().all():
item.enabled = False
self.session.commit()
def search_id(self, _id: int) -> RSSItem | None:
return self.session.get(RSSItem, _id)

View File

@@ -1,3 +1,4 @@
import asyncio
import logging
from module.conf import settings
@@ -98,8 +99,7 @@ class DownloadClient(TorrentPath):
async def set_rules(self, bangumi_info: list[Bangumi]):
logger.debug("[Downloader] Start adding rules.")
for info in bangumi_info:
await self.set_rule(info)
await asyncio.gather(*[self.set_rule(info) for info in bangumi_info])
logger.debug("[Downloader] Finished.")
async def get_torrent_info(self, category="Bangumi", status_filter="completed", tag=None):
@@ -138,7 +138,9 @@ class DownloadClient(TorrentPath):
torrent_url = [t.url for t in torrent]
torrent_file = None
else:
torrent_file = [await req.get_content(t.url) for t in torrent]
torrent_file = await asyncio.gather(
*[req.get_content(t.url) for t in torrent]
)
torrent_url = None
else:
if "magnet" in torrent.url:

View File

@@ -13,6 +13,10 @@ else:
from pathlib import Path
_MEDIA_SUFFIXES = frozenset({".mp4", ".mkv"})
_SUBTITLE_SUFFIXES = frozenset({".ass", ".srt"})
class TorrentPath:
def __init__(self):
pass
@@ -23,10 +27,10 @@ class TorrentPath:
subtitle_list = []
for f in files:
file_name = f["name"]
suffix = Path(file_name).suffix
if suffix.lower() in [".mp4", ".mkv"]:
suffix = Path(file_name).suffix.lower()
if suffix in _MEDIA_SUFFIXES:
media_list.append(file_name)
elif suffix.lower() in [".ass", ".srt"]:
elif suffix in _SUBTITLE_SUFFIXES:
subtitle_list.append(file_name)
return media_list, subtitle_list

View File

@@ -67,9 +67,9 @@ async def eps_complete():
datas = engine.bangumi.not_complete()
if datas:
logger.info("Start collecting full season...")
for data in datas:
if not data.eps_collect:
async with SeasonCollector() as collector:
async with SeasonCollector() as collector:
for data in datas:
if not data.eps_collect:
await collector.collect_season(data)
data.eps_collect = True
data.eps_collect = True
engine.bangumi.update_all(datas)

View File

@@ -1,3 +1,4 @@
import asyncio
import logging
import re
@@ -142,11 +143,14 @@ class Renamer(DownloadClient):
rename_method = settings.bangumi_manage.rename_method
torrents_info = await self.get_torrent_info()
renamed_info: list[Notification] = []
for info in torrents_info:
# Fetch all torrent files concurrently
all_files = await asyncio.gather(
*[self.get_torrent_files(info["hash"]) for info in torrents_info]
)
for info, files in zip(torrents_info, all_files):
torrent_hash = info["hash"]
torrent_name = info["name"]
save_path = info["save_path"]
files = await self.get_torrent_files(torrent_hash)
media_list, subtitle_list = self.check_files(files)
bangumi_name, season = self._path_to_bangumi(save_path)
kwargs = {

View File

@@ -11,7 +11,7 @@ class Bangumi(SQLModel, table=True):
default="official_title", alias="official_title", title="番剧中文名"
)
year: Optional[str] = Field(alias="year", title="番剧年份")
title_raw: str = Field(default="title_raw", alias="title_raw", title="番剧原名")
title_raw: str = Field(default="title_raw", alias="title_raw", title="番剧原名", index=True)
season: int = Field(default=1, alias="season", title="番剧季度")
season_raw: Optional[str] = Field(alias="season_raw", title="番剧季度原名")
group_name: Optional[str] = Field(alias="group_name", title="字幕组")
@@ -26,7 +26,7 @@ class Bangumi(SQLModel, table=True):
added: bool = Field(default=False, alias="added", title="是否已添加")
rule_name: Optional[str] = Field(alias="rule_name", title="番剧规则名")
save_path: Optional[str] = Field(alias="save_path", title="番剧保存路径")
deleted: bool = Field(False, alias="deleted", title="是否已删除")
deleted: bool = Field(False, alias="deleted", title="是否已删除", index=True)
air_weekday: Optional[int] = Field(default=None, alias="air_weekday", title="放送星期")
@@ -61,7 +61,7 @@ class Notification(BaseModel):
poster_path: Optional[str] = Field(None, alias="poster_path", title="番剧海报路径")
@dataclass
@dataclass(slots=True)
class Episode:
title_en: Optional[str]
title_zh: Optional[str]
@@ -75,8 +75,8 @@ class Episode:
source: str
@dataclass
class SeasonInfo(dict):
@dataclass(slots=True)
class SeasonInfo:
official_title: str
title_raw: str
season: int

View File

@@ -6,7 +6,7 @@ from sqlmodel import Field, SQLModel
class RSSItem(SQLModel, table=True):
id: int = Field(default=None, primary_key=True, alias="id")
name: Optional[str] = Field(None, alias="name")
url: str = Field("https://mikanani.me", alias="url")
url: str = Field("https://mikanani.me", alias="url", index=True)
aggregate: bool = Field(False, alias="aggregate")
parser: str = Field("mikan", alias="parser")
enabled: bool = Field(True, alias="enabled")

View File

@@ -7,9 +7,9 @@ from sqlmodel import Field, SQLModel
class Torrent(SQLModel, table=True):
id: int = Field(default=None, primary_key=True, alias="id")
bangumi_id: Optional[int] = Field(None, alias="refer_id", foreign_key="bangumi.id")
rss_id: Optional[int] = Field(None, alias="rss_id", foreign_key="rssitem.id")
rss_id: Optional[int] = Field(None, alias="rss_id", foreign_key="rssitem.id", index=True)
name: str = Field("", alias="name")
url: str = Field("https://example.com/torrent", alias="url")
url: str = Field("https://example.com/torrent", alias="url", index=True)
homepage: Optional[str] = Field(None, alias="homepage")
downloaded: bool = Field(False, alias="downloaded")

View File

@@ -21,13 +21,11 @@ class RequestContent(RequestURL):
) -> list[Torrent]:
soup = await self.get_xml(_url, retry)
if soup:
torrent_titles, torrent_urls, torrent_homepage = rss_parser(soup)
parsed_items = rss_parser(soup)
torrents: list[Torrent] = []
if _filter is None:
_filter = "|".join(settings.rss_parser.filter)
for _title, torrent_url, homepage in zip(
torrent_titles, torrent_urls, torrent_homepage
):
for _title, torrent_url, homepage in parsed_items:
if re.search(_filter, _title) is None:
torrents.append(
Torrent(name=_title, url=torrent_url, homepage=homepage)

View File

@@ -8,6 +8,46 @@ from module.conf import settings
logger = logging.getLogger(__name__)
# Module-level shared client for connection reuse
_shared_client: httpx.AsyncClient | None = None
_shared_client_proxy_key: str | None = None
def _proxy_config_key() -> str:
if settings.proxy.enable:
return f"{settings.proxy.type}:{settings.proxy.host}:{settings.proxy.port}:{settings.proxy.username}"
return ""
async def get_shared_client() -> httpx.AsyncClient:
global _shared_client, _shared_client_proxy_key
current_key = _proxy_config_key()
if _shared_client is not None and _shared_client_proxy_key == current_key:
return _shared_client
if _shared_client is not None:
await _shared_client.aclose()
timeout = httpx.Timeout(connect=5.0, read=10.0, write=10.0, pool=10.0)
if settings.proxy.enable:
if "http" in settings.proxy.type:
if settings.proxy.username:
proxy_url = f"http://{settings.proxy.username}:{settings.proxy.password}@{settings.proxy.host}:{settings.proxy.port}"
else:
proxy_url = f"http://{settings.proxy.host}:{settings.proxy.port}"
_shared_client = httpx.AsyncClient(proxy=proxy_url, timeout=timeout)
elif settings.proxy.type == "socks5":
if settings.proxy.username:
socks_url = f"socks5://{settings.proxy.username}:{settings.proxy.password}@{settings.proxy.host}:{settings.proxy.port}"
else:
socks_url = f"socks5://{settings.proxy.host}:{settings.proxy.port}"
transport = AsyncProxyTransport.from_url(socks_url, rdns=True)
_shared_client = httpx.AsyncClient(transport=transport, timeout=timeout)
else:
_shared_client = httpx.AsyncClient(timeout=timeout)
else:
_shared_client = httpx.AsyncClient(timeout=timeout)
_shared_client_proxy_key = current_key
return _shared_client
class RequestURL:
def __init__(self):
@@ -86,31 +126,9 @@ class RequestURL:
return None
async def __aenter__(self):
timeout = httpx.Timeout(connect=5.0, read=10.0, write=10.0, pool=10.0)
if settings.proxy.enable:
if "http" in settings.proxy.type:
if settings.proxy.username:
username = settings.proxy.username
password = settings.proxy.password
proxy_url = f"http://{username}:{password}@{settings.proxy.host}:{settings.proxy.port}"
else:
proxy_url = f"http://{settings.proxy.host}:{settings.proxy.port}"
self._client = httpx.AsyncClient(proxy=proxy_url, timeout=timeout)
elif settings.proxy.type == "socks5":
if settings.proxy.username:
socks_url = f"socks5://{settings.proxy.username}:{settings.proxy.password}@{settings.proxy.host}:{settings.proxy.port}"
else:
socks_url = f"socks5://{settings.proxy.host}:{settings.proxy.port}"
transport = AsyncProxyTransport.from_url(socks_url, rdns=True)
self._client = httpx.AsyncClient(transport=transport, timeout=timeout)
else:
logger.error(f"[Network] Unsupported proxy type: {settings.proxy.type}")
self._client = httpx.AsyncClient(timeout=timeout)
else:
self._client = httpx.AsyncClient(timeout=timeout)
self._client = await get_shared_client()
return self
async def __aexit__(self, exc_type, exc_val, exc_tb):
if self._client:
await self._client.aclose()
self._client = None
# Client is shared; do not close it here
self._client = None

View File

@@ -1,17 +1,16 @@
def rss_parser(soup):
torrent_titles = []
torrent_urls = []
torrent_homepage = []
results = []
for item in soup.findall("./channel/item"):
torrent_titles.append(item.find("title").text)
title = item.find("title").text
enclosure = item.find("enclosure")
if enclosure is not None:
torrent_homepage.append(item.find("link").text)
torrent_urls.append(enclosure.attrib.get("url"))
homepage = item.find("link").text
url = enclosure.attrib.get("url")
else:
torrent_urls.append(item.find("link").text)
torrent_homepage.append("")
return torrent_titles, torrent_urls, torrent_homepage
url = item.find("link").text
homepage = ""
results.append((title, url, homepage))
return results
def mikan_title(soup):

View File

@@ -1,3 +1,4 @@
import asyncio
import logging
from module.conf import settings
@@ -35,13 +36,13 @@ class PostNotification:
)
@staticmethod
def _get_poster(notify: Notification):
def _get_poster_sync(notify: Notification):
with Database() as db:
poster_path = db.bangumi.match_poster(notify.official_title)
notify.poster_path = poster_path
async def send_msg(self, notify: Notification) -> bool:
self._get_poster(notify)
await asyncio.to_thread(self._get_poster_sync, notify)
try:
await self.notifier.post_msg(notify)
logger.debug(f"Send notification: {notify.official_title}")

View File

@@ -1,3 +1,4 @@
import logging
import re
from bs4 import BeautifulSoup
@@ -6,8 +7,16 @@ from urllib3.util import parse_url
from module.network import RequestContent
from module.utils import save_image
logger = logging.getLogger(__name__)
# In-memory cache for Mikan homepage lookups
_mikan_cache: dict[str, tuple[str, str]] = {}
async def mikan_parser(homepage: str):
if homepage in _mikan_cache:
logger.debug(f"[Mikan] Cache hit for {homepage}")
return _mikan_cache[homepage]
root_path = parse_url(homepage).host
async with RequestContent() as req:
content = await req.get_html(homepage)
@@ -23,8 +32,12 @@ async def mikan_parser(homepage: str):
img = await req.get_content(f"https://{root_path}{poster_path}")
suffix = poster_path.split(".")[-1]
poster_link = save_image(img, suffix)
return poster_link, official_title
return "", ""
result = (poster_link, official_title)
_mikan_cache[homepage] = result
return result
result = ("", "")
_mikan_cache[homepage] = result
return result
if __name__ == '__main__':

View File

@@ -1,3 +1,4 @@
import logging
import re
import time
from dataclasses import dataclass
@@ -6,8 +7,13 @@ from module.conf import TMDB_API
from module.network import RequestContent
from module.utils import save_image
logger = logging.getLogger(__name__)
TMDB_URL = "https://api.themoviedb.org"
# In-memory cache for TMDB lookups to avoid repeated API calls
_tmdb_cache: dict[str, "TMDBInfo | None"] = {}
@dataclass
class TMDBInfo:
@@ -57,6 +63,11 @@ def get_season(seasons: list) -> tuple[int, str]:
async def tmdb_parser(title, language, test: bool = False) -> TMDBInfo | None:
cache_key = f"{title}:{language}"
if cache_key in _tmdb_cache:
logger.debug(f"[TMDB] Cache hit for {title}")
return _tmdb_cache[cache_key]
async with RequestContent() as req:
url = search_url(title)
contents = await req.get_json(url)
@@ -99,7 +110,7 @@ async def tmdb_parser(title, language, test: bool = False) -> TMDBInfo | None:
poster_link = "https://image.tmdb.org/t/p/w780" + poster_path
else:
poster_link = None
return TMDBInfo(
result = TMDBInfo(
id,
official_title,
original_title,
@@ -108,7 +119,10 @@ async def tmdb_parser(title, language, test: bool = False) -> TMDBInfo | None:
str(year_number),
poster_link,
)
_tmdb_cache[cache_key] = result
return result
else:
_tmdb_cache[cache_key] = None
return None

View File

@@ -16,6 +16,8 @@ RULES = [
r"(.*)(?:S\d{2})?EP?(\d{1,4}(?:\.\d{1,2})?)(.*)",
]
COMPILED_RULES = [re.compile(rule, re.I) for rule in RULES]
SUBTITLE_LANG = {
"zh-tw": ["tc", "cht", "", "zh-tw"],
"zh": ["sc", "chs", "", "zh"],
@@ -34,10 +36,11 @@ def get_path_basename(torrent_path: str) -> str:
return Path(torrent_path).name
_GROUP_SPLIT_RE = re.compile(r"[\[\]()【】()]")
def get_group(group_and_title) -> tuple[str | None, str]:
n = re.split(r"[\[\]()【】()]", group_and_title)
while "" in n:
n.remove("")
n = [x for x in _GROUP_SPLIT_RE.split(group_and_title) if x]
if len(n) > 1:
if re.match(r"\d+", n[1]):
return None, group_and_title
@@ -73,8 +76,8 @@ def torrent_parser(
if torrent_name is None:
match_names = match_names[1:]
for match_name in match_names:
for rule in RULES:
match_obj = re.match(rule, match_name, re.I)
for compiled_rule in COMPILED_RULES:
match_obj = compiled_rule.match(match_name)
if match_obj:
group, title = get_group(match_obj.group(1))
if not season:

View File

@@ -46,12 +46,14 @@ class RSSAnalyser(TitleParser):
self, torrents: list[Torrent], rss: RSSItem, full_parse: bool = True
) -> list:
new_data = []
seen_titles: set[str] = set()
for torrent in torrents:
bangumi = self.raw_parser(raw=torrent.name)
if bangumi and bangumi.title_raw not in [i.title_raw for i in new_data]:
if bangumi and bangumi.title_raw not in seen_titles:
await self.official_title_parser(bangumi=bangumi, rss=rss, torrent=torrent)
if not full_parse:
return [bangumi]
seen_titles.add(bangumi.title_raw)
new_data.append(bangumi)
logger.info(f"[RSS] New bangumi founded: {bangumi.official_title}")
return new_data

View File

@@ -1,3 +1,4 @@
import asyncio
import logging
import re
from typing import Optional
@@ -65,8 +66,7 @@ class RSSEngine(Database):
)
def disable_list(self, rss_id_list: list[int]):
for rss_id in rss_id_list:
self.rss.disable(rss_id)
self.rss.disable_batch(rss_id_list)
return ResponseModel(
status=True,
status_code=200,
@@ -75,8 +75,7 @@ class RSSEngine(Database):
)
def enable_list(self, rss_id_list: list[int]):
for rss_id in rss_id_list:
self.rss.enable(rss_id)
self.rss.enable_batch(rss_id_list)
return ResponseModel(
status=True,
status_code=200,
@@ -99,13 +98,22 @@ class RSSEngine(Database):
new_torrents = self.torrent.check_new(torrents)
return new_torrents
_filter_cache: dict[str, re.Pattern] = {}
def _get_filter_pattern(self, filter_str: str) -> re.Pattern:
if filter_str not in self._filter_cache:
self._filter_cache[filter_str] = re.compile(
filter_str.replace(",", "|"), re.IGNORECASE
)
return self._filter_cache[filter_str]
def match_torrent(self, torrent: Torrent) -> Optional[Bangumi]:
matched: Bangumi = self.bangumi.match_torrent(torrent.name)
if matched:
if matched.filter == "":
return matched
_filter = matched.filter.replace(",", "|")
if not re.search(_filter, torrent.name, re.IGNORECASE):
pattern = self._get_filter_pattern(matched.filter)
if not pattern.search(torrent.name):
torrent.bangumi_id = matched.id
return matched
return None
@@ -117,11 +125,13 @@ class RSSEngine(Database):
else:
rss_item = self.rss.search_id(rss_id)
rss_items = [rss_item] if rss_item else []
# From RSS Items, get all torrents
# From RSS Items, fetch all torrents concurrently
logger.debug(f"[Engine] Get {len(rss_items)} RSS items")
for rss_item in rss_items:
new_torrents = await self.pull_rss(rss_item)
# Get all enabled bangumi data
all_torrents = await asyncio.gather(
*[self.pull_rss(rss_item) for rss_item in rss_items]
)
# Process results sequentially (DB operations)
for rss_item, new_torrents in zip(rss_items, all_torrents):
for torrent in new_torrents:
matched_data = self.match_torrent(torrent)
if matched_data:

View File

@@ -90,7 +90,7 @@ class TestPostNotification:
"""send_msg calls notifier.post_msg and succeeds."""
notify = Notification(official_title="Test Anime", season=1, episode=5)
with patch.object(PostNotification, "_get_poster"):
with patch.object(PostNotification, "_get_poster_sync"):
result = await post_notification.send_msg(notify)
mock_notifier.post_msg.assert_called_once_with(notify)
@@ -100,13 +100,13 @@ class TestPostNotification:
mock_notifier.post_msg.side_effect = Exception("Network error")
notify = Notification(official_title="Test Anime", season=1, episode=5)
with patch.object(PostNotification, "_get_poster"):
with patch.object(PostNotification, "_get_poster_sync"):
result = await post_notification.send_msg(notify)
assert result is False
def test_get_poster_sets_path(self):
"""_get_poster queries DB and sets poster_path on notification."""
def test_get_poster_sync_sets_path(self):
"""_get_poster_sync queries DB and sets poster_path on notification."""
notify = Notification(official_title="My Anime", season=1, episode=1)
with patch("module.notification.notification.Database") as MockDB:
@@ -114,12 +114,12 @@ class TestPostNotification:
mock_db.bangumi.match_poster.return_value = "/posters/my_anime.jpg"
MockDB.return_value.__enter__ = MagicMock(return_value=mock_db)
MockDB.return_value.__exit__ = MagicMock(return_value=False)
PostNotification._get_poster(notify)
PostNotification._get_poster_sync(notify)
assert notify.poster_path == "/posters/my_anime.jpg"
def test_get_poster_empty_when_not_found(self):
"""_get_poster sets empty string when no poster found in DB."""
def test_get_poster_sync_empty_when_not_found(self):
"""_get_poster_sync sets empty string when no poster found in DB."""
notify = Notification(official_title="Unknown", season=1, episode=1)
with patch("module.notification.notification.Database") as MockDB:
@@ -127,6 +127,6 @@ class TestPostNotification:
mock_db.bangumi.match_poster.return_value = ""
MockDB.return_value.__enter__ = MagicMock(return_value=mock_db)
MockDB.return_value.__exit__ = MagicMock(return_value=False)
PostNotification._get_poster(notify)
PostNotification._get_poster_sync(notify)
assert notify.poster_path == ""

View File

@@ -28,9 +28,7 @@ const loading = reactive({
watch(show, (val) => {
if (!val) {
rss.value = rssTemplate;
setTimeout(() => {
windowState.next = false;
}, 300);
windowState.next = false;
} else if (val && rule.value.official_title !== '') {
windowState.next = true;
windowState.rule = true;

View File

@@ -75,13 +75,21 @@ const posterSrc = computed(() => resolvePosterUrl(props.bangumi.poster_link));
<div class="search-card-meta">
<div class="search-card-title">{{ bangumi.official_title }}</div>
<div class="card-tags">
<template v-for="i in ['season', 'group_name', 'subtitle']" :key="i">
<ab-tag
v-if="bangumi[i]"
:title="i === 'season' ? `Season ${bangumi[i]}` : bangumi[i]"
type="primary"
/>
</template>
<ab-tag
v-if="bangumi.season"
:title="`Season ${bangumi.season}`"
type="primary"
/>
<ab-tag
v-if="bangumi.group_name"
:title="bangumi.group_name"
type="primary"
/>
<ab-tag
v-if="bangumi.subtitle"
:title="bangumi.subtitle"
type="primary"
/>
</div>
</div>
</div>

View File

@@ -31,7 +31,7 @@ watch(show, (val) => {
}
});
function showDeleteFileDialog(type: String) {
function showDeleteFileDialog(type: string) {
deleteFileDialog.show = true;
if (type === 'disable' || type === '禁用') {
deleteFileDialog.type = 'disable';

View File

@@ -43,4 +43,4 @@ withDefaults(
</div>
</template>
<style lang="scss" scope></style>
<style lang="scss" scoped></style>

View File

@@ -7,6 +7,7 @@ withDefaults(defineProps<AbSettingProps>(), {
bottomLine: false,
});
// eslint-disable-next-line @typescript-eslint/no-explicit-any
const data = defineModel<any>('data');
</script>

View File

@@ -16,7 +16,7 @@ const props = withDefaults(
}
);
defineEmits(['click']);
defineEmits<{ click: [] }>();
const buttonSize = computed(() => {
switch (props.size) {

View File

@@ -4,13 +4,16 @@ import { ref, computed } from 'vue';
export interface DataListColumn {
key: string;
title: string;
render?: (row: any) => string;
render?: (row: Record<string, unknown>) => string;
hidden?: boolean;
}
// eslint-disable-next-line @typescript-eslint/no-explicit-any
type DataItem = Record<string, any>;
const props = withDefaults(
defineProps<{
items: any[];
items: DataItem[];
columns: DataListColumn[];
selectable?: boolean;
keyField?: string;
@@ -22,18 +25,18 @@ const props = withDefaults(
);
const emit = defineEmits<{
(e: 'select', keys: any[]): void;
(e: 'action', action: string, item: any): void;
(e: 'item-click', item: any): void;
(e: 'select', keys: unknown[]): void;
(e: 'action', action: string, item: DataItem): void;
(e: 'item-click', item: DataItem): void;
}>();
const selectedKeys = ref<Set<any>>(new Set());
const selectedKeys = ref<Set<unknown>>(new Set());
const visibleColumns = computed(() =>
props.columns.filter((col) => !col.hidden)
);
function toggleSelect(key: any) {
function toggleSelect(key: unknown) {
if (selectedKeys.value.has(key)) {
selectedKeys.value.delete(key);
} else {
@@ -51,7 +54,7 @@ function toggleSelectAll() {
emit('select', Array.from(selectedKeys.value));
}
function getCellValue(item: any, column: DataListColumn): string {
function getCellValue(item: DataItem, column: DataListColumn): string {
if (column.render) {
return column.render(item);
}

View File

@@ -13,7 +13,7 @@ withDefaults(
}
);
defineEmits(['select', 'search']);
defineEmits<{ select: []; search: [] }>();
const inputValue = defineModel<string>('inputValue');
</script>

View File

@@ -57,8 +57,8 @@ function getDisabled(item: SelectItem | string) {
return isString(item) ? false : item.disabled;
}
watchEffect(() => {
emit('update:modelValue', selected.value);
watch(selected, (val) => {
emit('update:modelValue', val);
});
</script>

View File

@@ -11,10 +11,15 @@ const messages = {
type Languages = keyof typeof messages;
function normalizeLocale(locale: string): Languages {
if (locale.startsWith('zh')) return 'zh-CN';
return 'en';
}
export const useMyI18n = createSharedComposable(() => {
const lang = useLocalStorage<Languages>(
'lang',
navigator.language as Languages
normalizeLocale(navigator.language)
);
const i18n = createI18n({
@@ -39,7 +44,7 @@ export const useMyI18n = createSharedComposable(() => {
function returnUserLangText(texts: {
[k in Languages]: string;
}) {
return texts[lang.value];
return texts[lang.value] ?? texts['en'];
}
function returnUserLangMsg(res: ApiSuccess) {

View File

@@ -25,7 +25,6 @@ async function refreshCalendar() {
}
onActivated(() => {
getAll();
refreshCalendar();
});

View File

@@ -24,21 +24,18 @@ const isNull = computed(() => {
return config.value.downloader.host === '';
});
let timer: ReturnType<typeof setInterval> | null = null;
const { pause, resume } = useIntervalFn(getAll, 5000, { immediate: false });
onActivated(() => {
getConfig();
onActivated(async () => {
await getConfig();
if (!isNull.value) {
getAll();
timer = setInterval(getAll, 5000);
resume();
}
});
onDeactivated(() => {
if (timer) {
clearInterval(timer);
timer = null;
}
pause();
clearSelection();
});
@@ -94,82 +91,80 @@ function isGroupAllSelected(group: TorrentGroup): boolean {
return group.torrents.every((t) => selectedHashes.value.includes(t.hash));
}
function tableColumns(): DataTableColumns<QbTorrentInfo> {
return [
{
type: 'selection',
const tableColumnsValue = computed<DataTableColumns<QbTorrentInfo>>(() => [
{
type: 'selection',
},
{
title: t('downloader.torrent.name'),
key: 'name',
ellipsis: { tooltip: true },
minWidth: 200,
},
{
title: t('downloader.torrent.progress'),
key: 'progress',
width: 160,
render(row: QbTorrentInfo) {
return (
<NProgress
type="line"
percentage={Math.round(row.progress * 100)}
indicator-placement="inside"
processing={row.state === 'downloading' || row.state === 'forcedDL'}
/>
);
},
{
title: t('downloader.torrent.name'),
key: 'name',
ellipsis: { tooltip: true },
minWidth: 200,
},
{
title: t('downloader.torrent.status'),
key: 'state',
width: 100,
render(row: QbTorrentInfo) {
return <ab-tag type={stateType(row.state)} title={stateLabel(row.state)} />;
},
{
title: t('downloader.torrent.progress'),
key: 'progress',
width: 160,
render(row: QbTorrentInfo) {
return (
<NProgress
type="line"
percentage={Math.round(row.progress * 100)}
indicator-placement="inside"
processing={row.state === 'downloading' || row.state === 'forcedDL'}
/>
);
},
},
{
title: t('downloader.torrent.size'),
key: 'size',
width: 100,
render(row: QbTorrentInfo) {
return formatSize(row.size);
},
{
title: t('downloader.torrent.status'),
key: 'state',
width: 100,
render(row: QbTorrentInfo) {
return <ab-tag type={stateType(row.state)} title={stateLabel(row.state)} />;
},
},
{
title: t('downloader.torrent.dlspeed'),
key: 'dlspeed',
width: 110,
render(row: QbTorrentInfo) {
return formatSpeed(row.dlspeed);
},
{
title: t('downloader.torrent.size'),
key: 'size',
width: 100,
render(row: QbTorrentInfo) {
return formatSize(row.size);
},
},
{
title: t('downloader.torrent.upspeed'),
key: 'upspeed',
width: 110,
render(row: QbTorrentInfo) {
return formatSpeed(row.upspeed);
},
{
title: t('downloader.torrent.dlspeed'),
key: 'dlspeed',
width: 110,
render(row: QbTorrentInfo) {
return formatSpeed(row.dlspeed);
},
},
{
title: 'ETA',
key: 'eta',
width: 80,
render(row: QbTorrentInfo) {
return formatEta(row.eta);
},
{
title: t('downloader.torrent.upspeed'),
key: 'upspeed',
width: 110,
render(row: QbTorrentInfo) {
return formatSpeed(row.upspeed);
},
},
{
title: t('downloader.torrent.peers'),
key: 'peers',
width: 90,
render(row: QbTorrentInfo) {
return `${row.num_seeds} / ${row.num_leechs}`;
},
{
title: 'ETA',
key: 'eta',
width: 80,
render(row: QbTorrentInfo) {
return formatEta(row.eta);
},
},
{
title: t('downloader.torrent.peers'),
key: 'peers',
width: 90,
render(row: QbTorrentInfo) {
return `${row.num_seeds} / ${row.num_leechs}`;
},
},
];
}
},
]);
function tableRowKey(row: QbTorrentInfo) {
return row.hash;
@@ -242,7 +237,7 @@ function groupCheckedKeys(group: TorrentGroup): string[] {
:default-open="true"
>
<NDataTable
:columns="tableColumns()"
:columns="tableColumnsValue"
:data="group.torrents"
:row-key="tableRowKey"
:pagination="false"

View File

@@ -1,5 +1,5 @@
<script lang="tsx" setup>
import { NDataTable } from 'naive-ui';
import { NDataTable, type DataTableColumns } from 'naive-ui';
import type { RSS } from '#/rss';
definePage({
@@ -16,62 +16,51 @@ onActivated(() => {
getAll();
});
const RSSTableOptions = computed(() => {
const columns = [
{
type: 'selection',
const rssColumns = computed<DataTableColumns<RSS>>(() => [
{
type: 'selection',
},
{
title: t('rss.name'),
key: 'name',
className: 'text-h3',
ellipsis: {
tooltip: true,
},
{
title: t('rss.name'),
key: 'name',
className: 'text-h3',
ellipsis: {
tooltip: true,
},
},
{
title: t('rss.url'),
key: 'url',
className: 'text-h3',
minWidth: 400,
align: 'center',
ellipsis: {
tooltip: true,
},
{
title: t('rss.url'),
key: 'url',
className: 'text-h3',
minWidth: 400,
align: 'center',
ellipsis: {
tooltip: true,
},
},
{
title: t('rss.status'),
key: 'status',
className: 'text-h3',
align: 'right',
minWidth: 200,
render(rss: RSS) {
return (
<div flex="~ justify-end gap-x-8">
{rss.parser && <ab-tag type="primary" title={rss.parser} />}
{rss.aggregate && <ab-tag type="primary" title="aggregate" />}
{rss.enabled ? (
<ab-tag type="active" title="active" />
) : (
<ab-tag type="inactive" title="inactive" />
)}
</div>
);
},
{
title: t('rss.status'),
key: 'status',
className: 'text-h3',
align: 'right',
minWidth: 200,
render(rss: RSS) {
return (
<div flex="~ justify-end gap-x-8">
{rss.parser && <ab-tag type="primary" title={rss.parser} />}
{rss.aggregate && <ab-tag type="primary" title="aggregate" />}
{rss.enabled ? (
<ab-tag type="active" title="active" />
) : (
<ab-tag type="inactive" title="inactive" />
)}
</div>
);
},
},
];
},
]);
const rowKey = (rss: RSS) => rss.id;
return {
columns,
data: rss.value,
pagination: false,
bordered: false,
rowKey,
maxHeight: 500,
} as unknown as InstanceType<typeof NDataTable>;
});
const rssRowKey = (row: RSS) => row.id;
</script>
<template>
@@ -108,7 +97,12 @@ const RSSTableOptions = computed(() => {
<!-- Desktop: Data table -->
<NDataTable
v-else
v-bind="RSSTableOptions"
:columns="rssColumns"
:data="rss"
:row-key="rssRowKey"
:pagination="false"
:bordered="false"
:max-height="500"
@update:checked-row-keys="(e) => (selectedRSS = (e as number[]))"
></NDataTable>

View File

@@ -2,13 +2,13 @@ import type { BangumiRule } from '#/bangumi';
import { ruleTemplate } from '#/bangumi';
export const useBangumiStore = defineStore('bangumi', () => {
const bangumi = ref<BangumiRule[]>();
const bangumi = ref<BangumiRule[]>([]);
const editRule = reactive<{
show: boolean;
item: BangumiRule;
}>({
show: false,
item: ruleTemplate,
item: { ...ruleTemplate },
});
async function getAll() {

View File

@@ -1,7 +1,7 @@
import type { QbTorrentInfo, TorrentGroup } from '#/downloader';
export const useDownloaderStore = defineStore('downloader', () => {
const torrents = ref<QbTorrentInfo[]>([]);
const torrents = shallowRef<QbTorrentInfo[]>([]);
const selectedHashes = ref<string[]>([]);
const loading = ref(false);

View File

@@ -27,14 +27,13 @@ export const useLogStore = defineStore('log', () => {
immediateCallback: true,
});
function copy() {
const { copy: copyLog, isSupported } = useClipboard({
source: log.value,
legacy: true,
});
const { copy: clipboardCopy, isSupported: clipboardSupported } = useClipboard({
legacy: true,
});
if (isSupported.value) {
copyLog();
function copy() {
if (clipboardSupported.value) {
clipboardCopy(log.value);
message.success(t('notify.copy_success'));
} else {
message.error(t('notify.copy_failed'));

View File

@@ -13,7 +13,7 @@
"esModuleInterop": true,
"lib": ["ESNext", "DOM"],
"skipLibCheck": true,
"noImplicitAny": false,
"noImplicitAny": true,
"baseUrl": "./",
"types": ["vite-plugin-pwa/client"],
"paths": {