Merge pull request #62 from lencx/dev

This commit is contained in:
lencx
2022-12-23 19:20:38 +08:00
committed by GitHub
38 changed files with 1325 additions and 1589 deletions

View File

@@ -22,9 +22,9 @@
**Latest:**
- `Mac`: [ChatGPT_0.5.1_x64.dmg](https://github.com/lencx/ChatGPT/releases/download/v0.5.1/ChatGPT_0.5.1_x64.dmg)
- `Linux`: [chat-gpt_0.5.1_amd64.deb](https://github.com/lencx/ChatGPT/releases/download/v0.5.1/chat-gpt_0.5.1_amd64.deb)
- `Windows`: [ChatGPT_0.5.1_x64_en-US.msi](https://github.com/lencx/ChatGPT/releases/download/v0.5.1/ChatGPT_0.5.1_x64_en-US.msi)
- `Mac`: [ChatGPT_0.6.0_x64.dmg](https://github.com/lencx/ChatGPT/releases/download/v0.6.0/ChatGPT_0.6.0_x64.dmg)
- `Linux`: [chat-gpt_0.6.0_amd64.deb](https://github.com/lencx/ChatGPT/releases/download/v0.6.0/chat-gpt_0.6.0_amd64.deb)
- `Windows`: [ChatGPT_0.6.0_x64_en-US.msi](https://github.com/lencx/ChatGPT/releases/download/v0.6.0/ChatGPT_0.6.0_x64_en-US.msi)
[Other versions...](https://github.com/lencx/ChatGPT/releases)
@@ -34,18 +34,18 @@
Easily install with _[Homebrew](https://brew.sh) ([Cask](https://docs.brew.sh/Cask-Cookbook)):_
~~~ sh
```sh
brew tap lencx/chatgpt https://github.com/lencx/ChatGPT.git
brew install --cask chatgpt --no-quarantine
~~~
```
Also, if you keep a _[Brewfile](https://github.com/Homebrew/homebrew-bundle#usage)_, you can add something like this:
~~~ rb
```rb
repo = "lencx/chatgpt"
tap repo, "https://github.com/#{repo}.git"
cask "chatgpt", args: { "no-quarantine": true }
~~~
```
## 📢 Announcement
@@ -60,11 +60,10 @@ cask "popcorn-time", args: { "no-quarantine": true }
After the data import is complete, you can restart the app to apply the configuration (`Menu -> Preferences -> Restart ChatGPT`).
The project maintains a set of common commands; you can also copy [chat.model.json](https://github.com/lencx/ChatGPT/blob/main/chat.model.json) directly to your local directory `~/.chatgpt/chat.model.json`.
In the ChatGPT text input area, type a string beginning with `/` to bring up the command prompt. Pressing the spacebar fills the input area with the text associated with the command (note: if multiple commands match, only the first one is used as the fill; you can keep typing until the first suggested command is the one you want, then press the spacebar, or click one of the suggestions with the mouse). Once the fill is complete, just press Enter. Under a slash command, use the Tab key to modify the contents of the `{q}` tag (only a single change is supported, [#54](https://github.com/lencx/ChatGPT/issues/54)).
![chatgpt](assets/chatgpt.gif)
![chatgpt-cmd](assets/chatgpt-cmd.gif)
## ✨ Features
@@ -74,6 +73,7 @@ cask "popcorn-time", args: { "no-quarantine": true }
- Rich keyboard shortcuts
- System tray hover window
- Powerful application menu
- Support for slash commands and their configuration (configure manually or sync from a file, [#55](https://github.com/lencx/ChatGPT/issues/55))
### Menu Items
@@ -99,18 +99,68 @@ cask "popcorn-time", args: { "no-quarantine": true }
- `Report Bug`: Report a bug or give feedback.
- `Toggle Developer Tools`: Web debugging tools; may be needed for debugging pages or scripts.
## Application Configuration
| 平台 | 路径 |
| ------- | ------------------------- |
| Linux | `/home/lencx/.chatgpt` |
| macOS | `/Users/lencx/.chatgpt` |
| Windows | `C:\Users\lencx\.chatgpt` |
- `[.chatgpt]` - application configuration root directory
- `chat.conf.json` - application preferences
- `chat.model.json` - ChatGPT input prompts, filled quickly via slash commands; contains three parts:
- `user_custom` - entered manually (**Control Center -> Language Model -> User Custom**)
- `sync_prompts` - synchronizes data from [f/awesome-chatgpt-prompts](https://github.com/f/awesome-chatgpt-prompts) (**Control Center -> Language Model -> Sync Prompts**)
- `sync_custom` - synchronizes custom JSON or CSV file data, local or remote (**Control Center -> Language Model -> Sync Custom**)
- `chat.model.cmd.json` - slash command data after filtering (by enabled state) and sorting
- `[cache_model]` - caches synced or manually entered data
- `chatgpt_prompts.json` - caches `sync_prompts` data
- `user_custom.json` - caches `user_custom` data
- `ae6cf32a6f8541b499d6bfe549dbfca3.json` - randomly generated file name; caches `sync_custom` data
- `4f695d3cfbf8491e9b1f3fab6d85715c.json` - randomly generated file name; caches `sync_custom` data
- `bd1b96f15a1644f7bd647cc53073ff8f.json` - randomly generated file name; caches `sync_custom` data
### Sync Custom
Currently, only JSON and CSV files are supported for syncing custom data, and they must match the following formats; otherwise the application will misbehave.
> JSON format:
```json
[
{
"cmd": "a",
"act": "aa",
"prompt": "aaa aaa aaa"
},
{
"cmd": "b",
"act": "bb",
"prompt": "bbb bbb bbb"
}
]
```
> CSV format:
```csv
"cmd","act","prompt"
"a","aa","aaa aaa aaa"
"b","bb","bbb bbb bbb"
```
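Before syncing, a custom file can be sanity-checked against this schema. A minimal sketch (the `isValidSyncCustom` helper is hypothetical, not part of the app):

```javascript
// Hypothetical validator for a Sync Custom JSON payload.
// Each record must provide the three string fields the app expects:
// `cmd`, `act`, and `prompt`.
function isValidSyncCustom(records) {
  if (!Array.isArray(records)) return false;
  return records.every(
    (r) =>
      r !== null &&
      typeof r === "object" &&
      ["cmd", "act", "prompt"].every((k) => typeof r[k] === "string")
  );
}

// The JSON sample above passes; a record missing `prompt` fails.
console.log(isValidSyncCustom([{ cmd: "a", act: "aa", prompt: "aaa aaa aaa" }])); // true
console.log(isValidSyncCustom([{ cmd: "b", act: "bb" }])); // false
```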
## 👀 Preview
<img width="320" src="./assets/install.png" alt="install"> <img width="320" src="./assets/control-center.png" alt="control center">
<img width="320" src="./assets/export.png" alt="export"> <img width="320" src="./assets/tray.png" alt="tray">
<img width="320" src="./assets/tray-login.png" alt="tray login"> <img width="320" src="./assets/auto-update.png" alt="auto update">
---
<a href="https://www.buymeacoffee.com/lencx" target="_blank"><img src="https://cdn.buymeacoffee.com/buttons/v2/default-blue.png" alt="Buy Me A Coffee" style="height: 60px !important;width: 217px !important;" ></a>
## ❓ FAQ
### Cannot open ChatGPT

View File

@@ -11,6 +11,7 @@
[![ChatGPT downloads](https://img.shields.io/github/downloads/lencx/ChatGPT/total.svg?style=flat-square)](https://github.com/lencx/ChatGPT/releases)
[![chat](https://img.shields.io/badge/chat-discord-blue?style=flat&logo=discord)](https://discord.gg/aPhCRf4zZr)
[![lencx](https://img.shields.io/twitter/follow/lencx_.svg?style=social)](https://twitter.com/lencx_)
<!-- [![中文版 badge](https://img.shields.io/badge/%E4%B8%AD%E6%96%87-Traditional%20Chinese-blue)](./README-ZH.md) -->
[Awesome ChatGPT](./AWESOME.md)
@@ -23,9 +24,9 @@
**Latest:**
- `Mac`: [ChatGPT_0.5.1_x64.dmg](https://github.com/lencx/ChatGPT/releases/download/v0.5.1/ChatGPT_0.5.1_x64.dmg)
- `Linux`: [chat-gpt_0.5.1_amd64.deb](https://github.com/lencx/ChatGPT/releases/download/v0.5.1/chat-gpt_0.5.1_amd64.deb)
- `Windows`: [ChatGPT_0.5.1_x64_en-US.msi](https://github.com/lencx/ChatGPT/releases/download/v0.5.1/ChatGPT_0.5.1_x64_en-US.msi)
- `Mac`: [ChatGPT_0.6.0_x64.dmg](https://github.com/lencx/ChatGPT/releases/download/v0.6.0/ChatGPT_0.6.0_x64.dmg)
- `Linux`: [chat-gpt_0.6.0_amd64.deb](https://github.com/lencx/ChatGPT/releases/download/v0.6.0/chat-gpt_0.6.0_amd64.deb)
- `Windows`: [ChatGPT_0.6.0_x64_en-US.msi](https://github.com/lencx/ChatGPT/releases/download/v0.6.0/ChatGPT_0.6.0_x64_en-US.msi)
[Other version...](https://github.com/lencx/ChatGPT/releases)
@@ -35,18 +36,18 @@
Easily install with _[Homebrew](https://brew.sh) ([Cask](https://docs.brew.sh/Cask-Cookbook)):_
~~~ sh
```sh
brew tap lencx/chatgpt https://github.com/lencx/ChatGPT.git
brew install --cask chatgpt --no-quarantine
~~~
```
Also, if you keep a _[Brewfile](https://github.com/Homebrew/homebrew-bundle#usage)_, you can add something like this:
~~~ rb
```rb
repo = "lencx/chatgpt"
tap repo, "https://github.com/#{repo}.git"
cask "chatgpt", args: { "no-quarantine": true }
~~~
```
## 📢 Announcement
@@ -61,11 +62,10 @@ You can look at [awesome-chatgpt-prompts](https://github.com/f/awesome-chatgpt-p
After the data import is done, you can restart the app to make the configuration take effect (`Menu -> Preferences -> Restart ChatGPT`).
The project maintains a list of common commands, or you can copy [chat.model.json](https://github.com/lencx/ChatGPT/blob/main/chat.model.json) directly to your local directory `~/.chatgpt/chat.model.json`
In the ChatGPT text input area, type a string beginning with `/` to bring up the command prompt. Pressing the spacebar fills the input area with the text associated with the command (note: if multiple commands match, only the first one is used as the fill; you can keep typing until the first suggested command is the one you want, then press the spacebar, or click one of the suggestions with the mouse). Once the fill is complete, just press Enter. Under a slash command, use the Tab key to modify the contents of the `{q}` tag (only a single change is supported, [#54](https://github.com/lencx/ChatGPT/issues/54)).
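The `{q}` tag substitution described here boils down to a single-occurrence regex replace (the same pattern used in `src/assets/cmd.js` below); only the first `{...}` group is matched, which is why only a single change is supported. A simplified sketch:

```javascript
// Simplified sketch of the `{q}` tag substitution used by the slash command:
// a non-global regex replaces only the first {...} group in the prompt.
function fillTag(prompt, userText) {
  return prompt.replace(/\{([^{}]*)\}/, `{${userText.trim()}}`);
}

console.log(fillTag("Translate {q} into French", "good morning"));
// → Translate {good morning} into French
```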
![chatgpt](assets/chatgpt.gif)
![chatgpt-cmd](assets/chatgpt-cmd.gif)
## ✨ Features
@@ -75,7 +75,7 @@ In the chatgpt text input area, type a character starting with `/` to bring up t
- Common shortcut keys
- System tray hover window
- Powerful menu items
- Shortcut command typing chatgpt prompt
- Support for slash commands and their configuration (can be configured manually or synchronized from a file [#55](https://github.com/lencx/ChatGPT/issues/55))
### MenuItem
@@ -101,6 +101,57 @@ In the chatgpt text input area, type a character starting with `/` to bring up t
- `Report Bug`: Report a bug or give feedback.
- `Toggle Developer Tools`: Developer debugging tools.
## Application Configuration
| Platform | Path |
| -------- | ------------------------- |
| Linux | `/home/lencx/.chatgpt` |
| macOS | `/Users/lencx/.chatgpt` |
| Windows | `C:\Users\lencx\.chatgpt` |
- `[.chatgpt]` - application configuration root folder
- `chat.conf.json` - preferences configuration
- `chat.model.json` - prompts configuration; contains three parts:
- `user_custom` - requires manual data entry (**Control Center -> Language Model -> User Custom**)
- `sync_prompts` - synchronizes data from [f/awesome-chatgpt-prompts](https://github.com/f/awesome-chatgpt-prompts) (**Control Center -> Language Model -> Sync Prompts**)
- `sync_custom` - synchronizes custom JSON or CSV file data, local or remote (**Control Center -> Language Model -> Sync Custom**)
- `chat.model.cmd.json` - slash command data after filtering (by enabled state) and sorting
- `[cache_model]` - caches model data
- `chatgpt_prompts.json` - caches `sync_prompts` data
- `user_custom.json` - caches `user_custom` data
- `ae6cf32a6f8541b499d6bfe549dbfca3.json` - randomly generated file name; caches `sync_custom` data
- `4f695d3cfbf8491e9b1f3fab6d85715c.json` - randomly generated file name; caches `sync_custom` data
- `bd1b96f15a1644f7bd647cc53073ff8f.json` - randomly generated file name; caches `sync_custom` data
### Sync Custom
Currently, only JSON and CSV files are supported for syncing custom data, and they must match the following formats; otherwise the application will misbehave.
> JSON format:
```json
[
{
"cmd": "a",
"act": "aa",
"prompt": "aaa aaa aaa"
},
{
"cmd": "b",
"act": "bb",
"prompt": "bbb bbb bbb"
}
]
```
> CSV format:
```csv
"cmd","act","prompt"
"a","aa","aaa aaa aaa"
"b","bb","bbb bbb bbb"
```
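The two formats carry the same records. In the app, CSV is parsed by the Rust `parse_prompt` command; as an illustration only, the CSV sample above maps to the JSON sample like this (the `csvToRecords` helper is hypothetical and assumes simple quoted fields with no embedded commas or newlines):

```javascript
// Illustrative CSV -> JSON conversion for the Sync Custom format.
// Assumes simple quoted fields without embedded commas or newlines;
// the app's real parser is the Rust `parse_prompt` command.
function csvToRecords(csv) {
  const [header, ...rows] = csv.trim().split("\n");
  const keys = header.split(",").map((s) => s.replace(/^"|"$/g, ""));
  return rows.map((row) => {
    const values = row.split(",").map((s) => s.replace(/^"|"$/g, ""));
    return Object.fromEntries(keys.map((k, i) => [k, values[i]]));
  });
}

const csv = `"cmd","act","prompt"
"a","aa","aaa aaa aaa"
"b","bb","bbb bbb bbb"`;
console.log(csvToRecords(csv));
// → [{ cmd: "a", act: "aa", prompt: "aaa aaa aaa" },
//    { cmd: "b", act: "bb", prompt: "bbb bbb bbb" }]
```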
## TODO
- Web access capability ([#20](https://github.com/lencx/ChatGPT/issues/20))

View File

@@ -1,5 +1,17 @@
# UPDATE LOG
## v0.6.0
fix:
- Windows installer showing Chinese text when upgrading
feat:
- optimize the generated PDF file size
- menu added `Sync Prompts`
- `Control Center` added `Sync Custom`
- the slash command can be triggered by the Enter key
- under a slash command, use the Tab key to modify the contents of the `{q}` tag (only a single change is supported, [#54](https://github.com/lencx/ChatGPT/issues/54))
## v0.5.1
some optimization

BIN assets/chatgpt-cmd.gif — new binary file (5.0 MiB, not shown)

File diff suppressed because it is too large

View File

@@ -32,7 +32,7 @@
"dependencies": {
"@ant-design/icons": "^4.8.0",
"@tauri-apps/api": "^1.2.0",
"antd": "^5.0.6",
"antd": "^5.1.0",
"dayjs": "^1.11.7",
"lodash": "^4.17.21",
"react": "^18.2.0",

View File

@@ -22,6 +22,9 @@ tauri-plugin-positioner = { version = "1.0.4", features = ["system-tray"] }
log = "0.4.17"
csv = "1.1.6"
thiserror = "1.0.38"
walkdir = "2.3.2"
# tokio = { version = "1.23.0", features = ["macros"] }
# reqwest = "0.11.13"
[dependencies.tauri-plugin-log]
git = "https://github.com/tauri-apps/tauri-plugin-log"

View File

@@ -66,14 +66,15 @@ pub fn open_file(path: PathBuf) {
}
#[command]
pub fn get_chat_model() -> serde_json::Value {
let path = utils::chat_root().join("chat.model.json");
pub fn get_chat_model_cmd() -> serde_json::Value {
let path = utils::chat_root().join("chat.model.cmd.json");
let content = fs::read_to_string(path).unwrap_or_else(|_| r#"{"data":[]}"#.to_string());
serde_json::from_str(&content).unwrap()
}
#[derive(Debug, serde::Serialize, serde::Deserialize)]
pub struct PromptRecord {
pub cmd: Option<String>,
pub act: String,
pub prompt: String,
}
@@ -88,3 +89,42 @@ pub fn parse_prompt(data: String) -> Vec<PromptRecord> {
}
list
}
#[command]
pub fn window_reload(app: AppHandle, label: &str) {
app.app_handle()
.get_window(label)
.unwrap()
.eval("window.location.reload()")
.unwrap();
}
use walkdir::WalkDir;
use utils::chat_root;
#[derive(serde::Serialize, serde::Deserialize, Debug, Clone)]
pub struct ModelRecord {
pub cmd: String,
pub act: String,
pub prompt: String,
pub tags: Vec<String>,
pub enable: bool,
}
#[command]
pub fn cmd_list() -> Vec<ModelRecord> {
let mut list = vec![];
for entry in WalkDir::new(chat_root().join("cache_model")).into_iter().filter_map(|e| e.ok()) {
let file = fs::read_to_string(entry.path().display().to_string());
if let Ok(v) = file {
let data: Vec<ModelRecord> = serde_json::from_str(&v).unwrap_or_else(|_| vec![]);
let enable_list = data.into_iter()
.filter(|v| v.enable);
list.extend(enable_list)
}
}
// dbg!(&list);
list.sort_by(|a, b| a.cmd.len().cmp(&b.cmd.len()));
list
}
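The `cmd_list` command above merges every cached model file, keeps only enabled records, and sorts by command length so the shortest `cmd` matches first. The same logic as a JavaScript sketch (illustrative only; the shipped implementation is the Rust code above):

```javascript
// Illustrative JavaScript equivalent of the Rust `cmd_list` command:
// merge cached record lists, keep enabled entries, sort by cmd length ascending.
function cmdList(cachedFiles) {
  const list = cachedFiles.flat().filter((record) => record.enable);
  list.sort((a, b) => a.cmd.length - b.cmd.length);
  return list;
}

const merged = cmdList([
  [{ cmd: "translate", act: "t", prompt: "...", tags: [], enable: true }],
  [
    { cmd: "js", act: "j", prompt: "...", tags: [], enable: true },
    { cmd: "off", act: "o", prompt: "...", tags: [], enable: false },
  ],
]);
console.log(merged.map((r) => r.cmd)); // → ["js", "translate"]
```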

View File

@@ -47,6 +47,10 @@ pub fn init() -> Menu {
let preferences_menu = Submenu::new(
"Preferences",
Menu::with_items([
CustomMenuItem::new("control_center".to_string(), "Control Center")
.accelerator("CmdOrCtrl+Shift+P")
.into(),
MenuItem::Separator.into(),
Submenu::new(
"Theme",
Menu::new()
@@ -67,13 +71,11 @@ pub fn init() -> Menu {
titlebar_menu.into(),
#[cfg(target_os = "macos")]
CustomMenuItem::new("hide_dock_icon".to_string(), "Hide Dock Icon").into(),
MenuItem::Separator.into(),
CustomMenuItem::new("inject_script".to_string(), "Inject Script")
.accelerator("CmdOrCtrl+J")
.into(),
CustomMenuItem::new("control_center".to_string(), "Control Center")
.accelerator("CmdOrCtrl+Shift+P")
.into(),
MenuItem::Separator.into(),
CustomMenuItem::new("sync_prompts".to_string(), "Sync Prompts").into(),
MenuItem::Separator.into(),
CustomMenuItem::new("go_conf".to_string(), "Go to Config")
.accelerator("CmdOrCtrl+Shift+G")
@@ -178,6 +180,21 @@ pub fn menu_handler(event: WindowMenuEvent<tauri::Wry>) {
"go_conf" => utils::open_file(utils::chat_root()),
"clear_conf" => utils::clear_conf(&app),
"awesome" => open(&app, conf::AWESOME_URL.to_string()),
"sync_prompts" => {
tauri::api::dialog::ask(
app.get_window("main").as_ref(),
"Sync Prompts",
"Data sync will enable all prompts, are you sure you want to sync?",
move |is_restart| {
if is_restart {
app.get_window("main")
.unwrap()
.eval("window.__sync_prompts && window.__sync_prompts()")
.unwrap()
}
},
);
}
"hide_dock_icon" => {
ChatConfJson::amend(&serde_json::json!({ "hide_dock_icon": true }), Some(app)).unwrap()
}

View File

@@ -61,10 +61,8 @@ function init() {
}
async function cmdTip() {
const chatModelJson = await invoke('get_chat_model') || {};
const user_custom = chatModelJson.user_custom || [];
const sys_sync_prompts = chatModelJson.sys_sync_prompts || [];
const data = [...user_custom, ...sys_sync_prompts];
const chatModelJson = await invoke('get_chat_model_cmd') || {};
const data = chatModelJson.data;
if (data.length <= 0) return;
const modelDom = document.createElement('div');
@@ -82,22 +80,72 @@ async function cmdTip() {
// Enter a command starting with `/` and press a space to automatically fill `chatgpt prompt`.
// If more than one command appears in the search results, the first one will be used by default.
searchInput.addEventListener('keydown', (event) => {
if (!window.__CHAT_MODEL_CMD__) {
if (!window.__CHAT_MODEL_CMD_PROMPT__) {
return;
}
if (event.keyCode === 32) {
searchInput.value = window.__CHAT_MODEL_CMD__;
modelDom.innerHTML = '';
delete window.__CHAT_MODEL_CMD__;
// feat: https://github.com/lencx/ChatGPT/issues/54
if (event.keyCode === 9 && !window.__CHAT_MODEL_STATUS__) {
const strGroup = window.__CHAT_MODEL_CMD_PROMPT__.match(/\{([^{}]*)\}/) || [];
if (strGroup[1]) {
searchInput.value = `/${window.__CHAT_MODEL_CMD__}` + ` {${strGroup[1]}}` + ' |-> ';
window.__CHAT_MODEL_STATUS__ = 1;
}
event.preventDefault();
}
if (event.keyCode === 13) {
if (window.__CHAT_MODEL_STATUS__ === 1 && event.keyCode === 9) {
const data = searchInput.value.split('|->');
if (data[1]?.trim()) {
window.__CHAT_MODEL_CMD_PROMPT__ = window.__CHAT_MODEL_CMD_PROMPT__?.replace(/\{([^{}]*)\}/, `{${data[1]?.trim()}}`);
window.__CHAT_MODEL_STATUS__ = 2;
}
event.preventDefault();
}
// input text
if (window.__CHAT_MODEL_STATUS__ === 2 && event.keyCode === 9) {
searchInput.value = window.__CHAT_MODEL_CMD_PROMPT__;
modelDom.innerHTML = '';
delete window.__CHAT_MODEL_STATUS__;
event.preventDefault();
}
// type in a space to complete the fill
if (event.keyCode === 32) {
searchInput.value = window.__CHAT_MODEL_CMD_PROMPT__;
modelDom.innerHTML = '';
delete window.__CHAT_MODEL_CMD_PROMPT__;
}
// send
if (event.keyCode === 13 && window.__CHAT_MODEL_CMD_PROMPT__) {
const data = searchInput.value.split('|->');
if (data[1]?.trim()) {
window.__CHAT_MODEL_CMD_PROMPT__ = window.__CHAT_MODEL_CMD_PROMPT__?.replace(/\{([^{}]*)\}/, `{${data[1]?.trim()}}`);
}
searchInput.value = window.__CHAT_MODEL_CMD_PROMPT__;
modelDom.innerHTML = '';
delete window.__CHAT_MODEL_CMD_PROMPT__;
delete window.__CHAT_MODEL_CMD__;
delete window.__CHAT_MODEL_STATUS__;
event.preventDefault();
}
});
searchInput.addEventListener('input', (event) => {
if (searchInput.value === '') {
delete window.__CHAT_MODEL_CMD_PROMPT__;
delete window.__CHAT_MODEL_CMD__;
delete window.__CHAT_MODEL_STATUS__;
}
if (window.__CHAT_MODEL_STATUS__) return;
const query = searchInput.value;
if (!query || !/^\//.test(query)) {
modelDom.innerHTML = '';
@@ -106,19 +154,22 @@ async function cmdTip() {
// all cmd result
if (query === '/') {
const result = data.filter(i => i.enable);
modelDom.innerHTML = `<div>${result.map(itemDom).join('')}</div>`;
window.__CHAT_MODEL_CMD__ = result[0]?.prompt.trim();
modelDom.innerHTML = `<div>${data.map(itemDom).join('')}</div>`;
window.__CHAT_MODEL_CMD_PROMPT__ = data[0]?.prompt.trim();
window.__CHAT_MODEL_CMD__ = data[0]?.cmd.trim();
return;
}
const result = data.filter(i => i.enable && new RegExp(query.substring(1)).test(i.cmd));
const result = data.filter(i => new RegExp(query.substring(1)).test(i.cmd));
if (result.length > 0) {
modelDom.innerHTML = `<div>${result.map(itemDom).join('')}</div>`;
window.__CHAT_MODEL_CMD__ = result[0]?.prompt.trim();
window.__CHAT_MODEL_CMD_PROMPT__ = result[0]?.prompt.trim();
window.__CHAT_MODEL_CMD__ = result[0]?.cmd.trim();
} else {
modelDom.innerHTML = '';
delete window.__CHAT_MODEL_CMD_PROMPT__;
delete window.__CHAT_MODEL_CMD__;
delete window.__CHAT_MODEL_STATUS__;
}
}, {
capture: false,
@@ -140,7 +191,7 @@ async function cmdTip() {
const val = decodeURIComponent(item.getAttribute('data-prompt'));
searchInput.value = val;
document.querySelector('form textarea').focus();
window.__CHAT_MODEL_CMD__ = val;
window.__CHAT_MODEL_CMD_PROMPT__ = val;
modelDom.innerHTML = '';
}
}, {

View File

@@ -1,6 +1,7 @@
// *** Core Script - Export ***
// @ref: https://github.com/liady/ChatGPT-pdf
const buttonOuterHTMLFallback = `<button class="btn flex justify-center gap-2 btn-neutral" id="download-png-button">Try Again</button>`;
async function init() {
const chatConf = await invoke('get_chat_conf') || {};
if (window.buttonsInterval) {
@@ -11,14 +12,15 @@ async function init() {
if (!actionsArea) {
return;
}
const buttons = actionsArea.querySelectorAll("button");
const hasTryAgainButton = Array.from(buttons).some((button) => {
return !button.id?.includes("download");
});
if (hasTryAgainButton && buttons.length === 1) {
const TryAgainButton = actionsArea.querySelector("button");
if (shouldAddButtons(actionsArea)) {
let TryAgainButton = actionsArea.querySelector("button");
if (!TryAgainButton) {
const parentNode = document.createElement("div");
parentNode.innerHTML = buttonOuterHTMLFallback;
TryAgainButton = parentNode.querySelector("button");
}
addActionsButtons(actionsArea, TryAgainButton, chatConf);
} else if (!hasTryAgainButton) {
} else if (shouldRemoveButtons()) {
removeButtons();
}
}, 200);
@@ -29,32 +31,42 @@ const Format = {
PDF: "pdf",
};
function addActionsButtons(actionsArea, TryAgainButton, chatConf) {
const downloadButton = TryAgainButton.cloneNode(true);
downloadButton.id = "download-png-button";
downloadButton.innerText = "Generate PNG";
downloadButton.onclick = () => {
downloadThread();
};
actionsArea.appendChild(downloadButton);
const downloadPdfButton = TryAgainButton.cloneNode(true);
downloadPdfButton.id = "download-pdf-button";
downloadPdfButton.innerText = "Download PDF";
downloadPdfButton.onclick = () => {
downloadThread({ as: Format.PDF });
};
actionsArea.appendChild(downloadPdfButton);
if (new RegExp('//chat.openai.com').test(chatConf.origin)) {
const exportHtml = TryAgainButton.cloneNode(true);
exportHtml.id = "download-html-button";
exportHtml.innerText = "Share Link";
exportHtml.onclick = () => {
sendRequest();
};
actionsArea.appendChild(exportHtml);
function shouldRemoveButtons() {
const isOpenScreen = document.querySelector("h1.text-4xl");
if (isOpenScreen) {
return true;
}
const inConversation = document.querySelector("form button>div");
if (inConversation) {
return true;
}
return false;
}
function shouldAddButtons(actionsArea) {
// first, check if there's a "Try Again" button and no other buttons
const buttons = actionsArea.querySelectorAll("button");
const hasTryAgainButton = Array.from(buttons).some((button) => {
return !button.id?.includes("download");
});
if (hasTryAgainButton && buttons.length === 1) {
return true;
}
// otherwise, check if open screen is not visible
const isOpenScreen = document.querySelector("h1.text-4xl");
if (isOpenScreen) {
return false;
}
// check if the conversation is finished and there are no share buttons
const finishedConversation = document.querySelector("form button>svg");
const hasShareButtons = actionsArea.querySelectorAll("button[share-ext]");
if (finishedConversation && !hasShareButtons.length) {
return true;
}
return false;
}
function removeButtons() {
@@ -72,6 +84,33 @@ function removeButtons() {
}
}
function addActionsButtons(actionsArea, TryAgainButton) {
const downloadButton = TryAgainButton.cloneNode(true);
downloadButton.id = "download-png-button";
downloadButton.setAttribute("share-ext", "true");
downloadButton.innerText = "Generate PNG";
downloadButton.onclick = () => {
downloadThread();
};
actionsArea.appendChild(downloadButton);
const downloadPdfButton = TryAgainButton.cloneNode(true);
downloadPdfButton.id = "download-pdf-button";
downloadPdfButton.setAttribute("share-ext", "true");
downloadPdfButton.innerText = "Download PDF";
downloadPdfButton.onclick = () => {
downloadThread({ as: Format.PDF });
};
actionsArea.appendChild(downloadPdfButton);
const exportHtml = TryAgainButton.cloneNode(true);
exportHtml.id = "download-html-button";
exportHtml.setAttribute("share-ext", "true");
exportHtml.innerText = "Share Link";
exportHtml.onclick = () => {
sendRequest();
};
actionsArea.appendChild(exportHtml);
}
function downloadThread({ as = Format.PNG } = {}) {
const elements = new Elements();
elements.fixLocation();
@@ -113,7 +152,7 @@ function handlePdf(imgData, canvas, pixelRatio) {
]);
var pdfWidth = pdf.internal.pageSize.getWidth();
var pdfHeight = pdf.internal.pageSize.getHeight();
pdf.addImage(imgData, "PNG", 0, 0, pdfWidth, pdfHeight);
pdf.addImage(imgData, "PNG", 0, 0, pdfWidth, pdfHeight, '', 'FAST');
const data = pdf.__private__.getArrayBuffer(pdf.__private__.buildDocument());
invoke('download', { name: `chatgpt-${Date.now()}.pdf`, blob: Array.from(new Uint8Array(data)) });

View File

@@ -17,6 +17,8 @@ use tauri_plugin_log::{
fn main() {
ChatConfJson::init();
// If the file does not exist, creating the file will block menu synchronization
utils::create_chatgpt_prompts();
let chat_conf = ChatConfJson::get_chat_conf();
let context = tauri::generate_context!();
let colors = ColoredLevelConfig {
@@ -57,8 +59,10 @@ fn main() {
cmd::form_confirm,
cmd::form_msg,
cmd::open_file,
cmd::get_chat_model,
cmd::get_chat_model_cmd,
cmd::parse_prompt,
cmd::window_reload,
cmd::cmd_list,
fs_extra::metadata,
])
.setup(setup::init)

View File

@@ -30,6 +30,14 @@ pub fn create_file(path: &Path) -> Result<File> {
File::create(path).map_err(Into::into)
}
pub fn create_chatgpt_prompts() {
let sync_file = chat_root().join("cache_model").join("chatgpt_prompts.json");
if !exists(&sync_file) {
create_file(&sync_file).unwrap();
fs::write(&sync_file, "[]").unwrap();
}
}
pub fn script_path() -> PathBuf {
let script_file = chat_root().join("main.js");
if !exists(&script_file) {

View File

@@ -7,14 +7,16 @@
},
"package": {
"productName": "ChatGPT",
"version": "0.5.1"
"version": "0.6.0"
},
"tauri": {
"allowlist": {
"all": true,
"http": {
"all": true,
"scope": [
"https://raw.githubusercontent.com/*"
"https://**",
"http://**"
]
},
"fs": {
@@ -62,12 +64,6 @@
"webviewInstallMode": {
"silent": true,
"type": "embedBootstrapper"
},
"wix": {
"language": [
"zh-CN",
"en-US"
]
}
}
},

View File

@@ -1,23 +1,53 @@
import { useState } from 'react';
import { useState, useEffect } from 'react';
import { clone } from 'lodash';
import { invoke } from '@tauri-apps/api';
import { CHAT_MODEL_JSON, readJSON, writeJSON } from '@/utils';
import { CHAT_MODEL_JSON, CHAT_MODEL_CMD_JSON, readJSON, writeJSON } from '@/utils';
import useInit from '@/hooks/useInit';
export default function useChatModel(key: string) {
export default function useChatModel(key: string, file = CHAT_MODEL_JSON) {
const [modelJson, setModelJson] = useState<Record<string, any>>({});
useInit(async () => {
const data = await readJSON(CHAT_MODEL_JSON, { name: 'ChatGPT Model', [key]: [] });
const data = await readJSON(file, {
defaultVal: { name: 'ChatGPT Model', [key]: null },
});
setModelJson(data);
});
const modelSet = async (data: Record<string, any>[]) => {
const modelSet = async (data: Record<string, any>[]|Record<string, any>) => {
const oData = clone(modelJson);
oData[key] = data;
await writeJSON(CHAT_MODEL_JSON, oData);
await writeJSON(file, oData);
setModelJson(oData);
}
return { modelJson, modelSet, modelData: modelJson?.[key] || [] }
return { modelJson, modelSet, modelData: modelJson?.[key] || [] };
}
export function useCacheModel(file = '') {
const [modelCacheJson, setModelCacheJson] = useState<Record<string, any>[]>([]);
useEffect(() => {
if (!file) return;
(async () => {
const data = await readJSON(file, { isRoot: true, isList: true });
setModelCacheJson(data);
})();
}, [file]);
const modelCacheSet = async (data: Record<string, any>[], newFile = '') => {
await writeJSON(newFile ? newFile : file, data, { isRoot: true });
setModelCacheJson(data);
await modelCacheCmd();
}
const modelCacheCmd = async () => {
// Generate the `chat.model.cmd.json` file and refresh the page for the slash command to take effect.
const list = await invoke('cmd_list');
await writeJSON(CHAT_MODEL_CMD_JSON, { name: 'ChatGPT CMD', last_updated: Date.now(), data: list });
await invoke('window_reload', { label: 'core' });
};
return { modelCacheJson, modelCacheSet, modelCacheCmd };
}

22
src/hooks/useData.ts vendored
View File

@@ -1,7 +1,7 @@
import { useState, useEffect } from 'react';
import { v4 } from 'uuid';
const safeKey = Symbol('chat-id');
export const safeKey = Symbol('chat-id');
export default function useData(oData: any[]) {
const [opData, setData] = useState<any[]>([]);
@@ -17,6 +17,9 @@ export default function useData(oData: any[]) {
};
const opInit = (val: any[] = []) => {
if (!val || !Array.isArray(val)) return;
const nData = val.map(i => ({ [safeKey]: v4(), ...i }));
setData(nData);
};
@@ -35,5 +38,20 @@ export default function useData(oData: any[]) {
return nData;
};
return { opSafeKey: safeKey, opInit, opReplace, opAdd, opRemove, opData };
const opReplaceItems = (ids: string[], data: any) => {
const nData = [...opData];
let count = 0;
for (let i = 0; i < nData.length; i++) {
const v = nData[i];
if (ids.includes(v[safeKey])) {
count++;
nData[i] = { ...v, ...data };
}
if (count === ids.length) break;
}
setData(nData);
return nData;
};
return { opSafeKey: safeKey, opInit, opReplace, opAdd, opRemove, opData, opReplaceItems };
}

34
src/hooks/useEvent.ts vendored Normal file
View File

@@ -0,0 +1,34 @@
import { invoke, path, http, fs, dialog } from '@tauri-apps/api';
import useInit from '@/hooks/useInit';
import useChatModel, { useCacheModel } from '@/hooks/useChatModel';
import { GITHUB_PROMPTS_CSV_URL, chatRoot, genCmd } from '@/utils';
export default function useEvent() {
const { modelSet } = useChatModel('sync_prompts');
const { modelCacheSet } = useCacheModel();
// Using `emit` and `listen` will be triggered multiple times in development mode.
// So here we use `eval` to call `__sync_prompt`
useInit(() => {
(window as any).__sync_prompts = async () => {
const res = await http.fetch(GITHUB_PROMPTS_CSV_URL, {
method: 'GET',
responseType: http.ResponseType.Text,
});
const data = (res.data || '') as string;
if (res.ok) {
const file = await path.join(await chatRoot(), 'cache_model', 'chatgpt_prompts.json');
const list: Record<string, string>[] = await invoke('parse_prompt', { data });
const fmtList = list.map(i => ({ ...i, cmd: i.cmd ? i.cmd : genCmd(i.act), enable: true, tags: ['chatgpt-prompts'] }));
await modelCacheSet(fmtList, file);
modelSet({
id: 'chatgpt_prompts',
last_updated: Date.now(),
});
dialog.message('ChatGPT Prompts data has been synchronized!');
} else {
dialog.message('ChatGPT Prompts data sync failed, please try again!');
}
}
})
}

37
src/hooks/useTable.tsx vendored Normal file
View File

@@ -0,0 +1,37 @@
import React, { useState } from 'react';
import { Table } from 'antd';
import type { TableRowSelection } from 'antd/es/table/interface';
import { safeKey } from '@/hooks/useData';
export default function useTableRowSelection() {
const [selectedRowKeys, setSelectedRowKeys] = useState<React.Key[]>([]);
const [selectedRowIDs, setSelectedRowIDs] = useState<string[]>([]);
const onSelectChange = (newSelectedRowKeys: React.Key[], selectedRows: Record<string | symbol, any>[]) => {
const keys = selectedRows.map((i: any) => i[safeKey]);
setSelectedRowIDs(keys);
setSelectedRowKeys(newSelectedRowKeys);
};
const rowSelection: TableRowSelection<Record<string, any>> = {
selectedRowKeys,
onChange: onSelectChange,
selections: [
Table.SELECTION_ALL,
Table.SELECTION_INVERT,
Table.SELECTION_NONE,
],
};
return { rowSelection, selectedRowIDs };
}
export const TABLE_PAGINATION = {
hideOnSinglePage: true,
showSizeChanger: true,
showQuickJumper: true,
defaultPageSize: 5,
pageSizeOptions: [5, 10, 15, 20],
showTotal: (total: number) => <span>Total {total} items</span>,
};

View File

@@ -39,6 +39,7 @@ const ChatLayout: FC<ChatLayoutProps> = ({ children }) => {
mode="inline"
inlineIndent={12}
items={menuItems}
defaultOpenKeys={['/model']}
onClick={(i) => go(i.key)}
/>
</Sider>

46
src/main.scss vendored
View File

@@ -17,4 +17,50 @@
html, body {
padding: 0;
margin: 0;
}
.ant-table-selection-column {
width: 50px !important;
min-width: 50px !important;
}
.chat-prompts-val {
display: inline-block;
width: 100%;
max-width: 300px;
overflow: hidden;
text-overflow: ellipsis;
display: -webkit-box;
-webkit-line-clamp: 3;
-webkit-box-orient: vertical;
}
.chat-add-btn {
margin-bottom: 5px;
}
.chat-prompts-tags {
.ant-tag {
margin: 2px;
}
}
.chat-sync-path {
font-size: 12px;
font-weight: 500;
color: #888;
margin-bottom: 5px;
line-height: 16px;
span {
display: inline-block;
// background-color: #d8d8d8;
color: #4096ff;
padding: 0 8px;
height: 20px;
line-height: 20px;
border-radius: 4px;
cursor: pointer;
text-decoration: underline;
}
}

14
src/main.tsx vendored
View File

@@ -2,15 +2,23 @@ import { StrictMode, Suspense } from 'react';
import { BrowserRouter } from 'react-router-dom';
import ReactDOM from 'react-dom/client';
import useEvent from '@/hooks/useEvent';
import Layout from '@/layout';
import './main.scss';
const App = () => {
useEvent();
return (
<BrowserRouter>
<Layout/>
</BrowserRouter>
);
}
ReactDOM.createRoot(document.getElementById('root') as HTMLElement).render(
<StrictMode>
<Suspense fallback={null}>
<BrowserRouter>
<Layout/>
</BrowserRouter>
<App />
</Suspense>
</StrictMode>
);

src/routes.tsx

@@ -3,13 +3,16 @@ import {
DesktopOutlined,
BulbOutlined,
SyncOutlined,
FileSyncOutlined,
UserOutlined,
} from '@ant-design/icons';
import type { MenuProps } from 'antd';
import General from '@view/General';
import LanguageModel from '@/view/LanguageModel';
import SyncPrompts from '@/view/SyncPrompts';
import UserCustom from '@/view/model/UserCustom';
import SyncPrompts from '@/view/model/SyncPrompts';
import SyncCustom from '@/view/model/SyncCustom';
import SyncRecord from '@/view/model/SyncRecord';
export type ChatRouteMetaObject = {
label: string;
@@ -19,7 +22,8 @@ export type ChatRouteMetaObject = {
type ChatRouteObject = {
path: string;
element?: JSX.Element;
meta: ChatRouteMetaObject;
hideMenu?: boolean;
meta?: ChatRouteMetaObject;
children?: ChatRouteObject[];
}
@@ -33,7 +37,7 @@ export const routes: Array<ChatRouteObject> = [
},
},
{
path: '/language-model',
path: '/model',
meta: {
label: 'Language Model',
icon: <BulbOutlined />,
@@ -41,7 +45,7 @@ export const routes: Array<ChatRouteObject> = [
children: [
{
path: 'user-custom',
element: <LanguageModel />,
element: <UserCustom />,
meta: {
label: 'User Custom',
icon: <UserOutlined />,
@@ -55,17 +59,33 @@ export const routes: Array<ChatRouteObject> = [
icon: <SyncOutlined />,
},
},
{
path: 'sync-custom',
element: <SyncCustom />,
meta: {
label: 'Sync Custom',
icon: <FileSyncOutlined />,
},
},
{
path: 'sync-custom/:id',
element: <SyncRecord />,
hideMenu: true,
},
]
},
];
type MenuItem = Required<MenuProps>['items'][number];
export const menuItems: MenuItem[] = routes.map(i => ({
...i.meta,
key: i.path || '',
children: i?.children?.map((j) =>
({ ...j.meta, key: `${i.path}/${j.path}` || ''})),
}));
export const menuItems: MenuItem[] = routes
.filter((j) => !j.hideMenu)
.map(i => ({
...i.meta,
key: i.path || '',
children: i?.children
?.filter((j) => !j.hideMenu)
?.map((j) => ({ ...j.meta, key: `${i.path}/${j.path}` || ''})),
}));
export default () => {
return useRoutes(routes);
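The `hideMenu` filtering added above keeps detail routes (like `sync-custom/:id`) registered with the router while dropping them from the sidebar menu. A standalone sketch of that derivation (the `Route` shape here is a simplified assumption, not the project's `ChatRouteObject`):

```typescript
// Simplified sketch of the menuItems derivation with hideMenu filtering.
// Route is an assumed minimal shape, not the real ChatRouteObject.
type Route = {
  path: string;
  hideMenu?: boolean;
  meta?: { label: string };
  children?: Route[];
};

type MenuItem = { label?: string; key: string; children?: MenuItem[] };

const toMenuItems = (routes: Route[]): MenuItem[] =>
  routes
    .filter((r) => !r.hideMenu)
    .map((r) => ({
      ...r.meta,
      key: r.path || '',
      // hidden children are dropped from the menu but would still be
      // matched by useRoutes, since that call receives the raw routes
      children: r.children
        ?.filter((c) => !c.hideMenu)
        .map((c) => ({ ...c.meta, key: `${r.path}/${c.path}` })),
    }));
```

Filtering in `menuItems` rather than in `routes` is what lets the same route table drive both navigation and the menu.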

src/utils.ts

@@ -1,8 +1,9 @@
import { readTextFile, writeTextFile, exists } from '@tauri-apps/api/fs';
import { homeDir, join } from '@tauri-apps/api/path';
import { readTextFile, writeTextFile, exists, createDir, BaseDirectory } from '@tauri-apps/api/fs';
import { homeDir, join, dirname } from '@tauri-apps/api/path';
import dayjs from 'dayjs';
export const CHAT_MODEL_JSON = 'chat.model.json';
export const CHAT_MODEL_CMD_JSON = 'chat.model.cmd.json';
export const CHAT_PROMPTS_CSV = 'chat.prompts.csv';
export const GITHUB_PROMPTS_CSV_URL = 'https://raw.githubusercontent.com/f/awesome-chatgpt-prompts/main/prompts.csv';
export const DISABLE_AUTO_COMPLETE = {
@@ -19,18 +20,25 @@ export const chatModelPath = async (): Promise<string> => {
return join(await chatRoot(), CHAT_MODEL_JSON);
}
// export const chatModelSyncPath = async (): Promise<string> => {
// return join(await chatRoot(), CHAT_MODEL_SYNC_JSON);
// }
export const chatPromptsPath = async (): Promise<string> => {
return join(await chatRoot(), CHAT_PROMPTS_CSV);
}
export const readJSON = async (path: string, defaultVal = {}) => {
type readJSONOpts = { defaultVal?: Record<string, any>, isRoot?: boolean, isList?: boolean };
export const readJSON = async (path: string, opts: readJSONOpts = {}) => {
const { defaultVal = {}, isRoot = false, isList = false } = opts;
const root = await chatRoot();
const file = await join(root, path);
const file = await join(isRoot ? '' : root, path);
if (!await exists(file)) {
writeTextFile(file, JSON.stringify({
await createDir(await dirname(file), { recursive: true });
await writeTextFile(file, isList ? '[]' : JSON.stringify({
name: 'ChatGPT',
link: 'https://github.com/lencx/ChatGPT/blob/main/chat.model.md',
link: 'https://github.com/lencx/ChatGPT',
...defaultVal,
}, null, 2))
}
@@ -42,10 +50,19 @@ export const readJSON = async (path: string, defaultVal = {}) => {
}
}
export const writeJSON = async (path: string, data: Record<string, any>) => {
type writeJSONOpts = { dir?: string, isRoot?: boolean };
export const writeJSON = async (path: string, data: Record<string, any>, opts: writeJSONOpts = {}) => {
const { isRoot = false, dir = '' } = opts;
const root = await chatRoot();
const file = await join(root, path);
const file = await join(isRoot ? '' : root, path);
if (isRoot && !await exists(await dirname(file))) {
await createDir(await join('.chatgpt', dir), { dir: BaseDirectory.Home });
}
await writeTextFile(file, JSON.stringify(data, null, 2));
}
export const fmtDate = (date: any) => dayjs(date).format('YYYY-MM-DD HH:mm:ss');
export const genCmd = (act: string) => act.replace(/\s+|\/+/g, '_').replace(/[^\d\w]/g, '').toLocaleLowerCase();
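The new `genCmd` helper turns an `act` title into a slash-command token. Its behavior in isolation (same transform as the line above):

```typescript
// Whitespace and slashes become underscores, remaining non-word
// characters are dropped, and the result is lower-cased.
const genCmd = (act: string): string =>
  act.replace(/\s+|\/+/g, '_').replace(/[^\d\w]/g, '').toLocaleLowerCase();
```

For example, `genCmd('Linux Terminal')` yields `'linux_terminal'`, so a synced prompt without an explicit `cmd` still gets a usable `/command` name.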


@@ -1,39 +0,0 @@
.chat-prompts-val {
display: inline-block;
width: 100%;
max-width: 300px;
overflow: hidden;
text-overflow: ellipsis;
display: -webkit-box;
-webkit-line-clamp: 3;
-webkit-box-orient: vertical;
}
.chat-prompts-tags {
.ant-tag {
margin: 2px;
}
}
.add-btn {
margin-bottom: 5px;
}
.chat-model-path {
font-size: 12px;
font-weight: bold;
color: #888;
margin-bottom: 5px;
span {
display: inline-block;
// background-color: #d8d8d8;
color: #4096ff;
padding: 0 8px;
height: 20px;
line-height: 20px;
border-radius: 4px;
cursor: pointer;
text-decoration: underline;
}
}


@@ -1,111 +0,0 @@
import { useState, useRef, useEffect } from 'react';
import { Table, Button, Modal, message } from 'antd';
import { invoke } from '@tauri-apps/api';
import useInit from '@/hooks/useInit';
import useData from '@/hooks/useData';
import useChatModel from '@/hooks/useChatModel';
import useColumns from '@/hooks/useColumns';
import { chatModelPath } from '@/utils';
import { modelColumns } from './config';
import LanguageModelForm from './Form';
import './index.scss';
export default function LanguageModel() {
const [isVisible, setVisible] = useState(false);
const [modelPath, setChatModelPath] = useState('');
const { modelData, modelSet } = useChatModel('user_custom');
const { opData, opInit, opAdd, opRemove, opReplace, opSafeKey } = useData([]);
const { columns, ...opInfo } = useColumns(modelColumns());
const formRef = useRef<any>(null);
useEffect(() => {
if (modelData.length <= 0) return;
opInit(modelData);
}, [modelData])
useEffect(() => {
if (!opInfo.opType) return;
if (['edit', 'new'].includes(opInfo.opType)) {
setVisible(true);
}
if (['delete'].includes(opInfo.opType)) {
const data = opRemove(opInfo?.opRecord?.[opSafeKey]);
modelSet(data);
opInfo.resetRecord();
}
}, [opInfo.opType, formRef]);
useEffect(() => {
if (opInfo.opType === 'enable') {
const data = opReplace(opInfo?.opRecord?.[opSafeKey], opInfo?.opRecord);
modelSet(data);
}
}, [opInfo.opTime])
useInit(async () => {
const path = await chatModelPath();
setChatModelPath(path);
})
const hide = () => {
setVisible(false);
opInfo.resetRecord();
};
const handleOk = () => {
formRef.current?.form?.validateFields()
.then((vals: Record<string, any>) => {
if (modelData.map((i: any) => i.cmd).includes(vals.cmd) && opInfo?.opRecord?.cmd !== vals.cmd) {
message.warning(`"cmd: /${vals.cmd}" already exists, please change the "${vals.cmd}" name and resubmit.`);
return;
}
let data = [];
switch (opInfo.opType) {
case 'new': data = opAdd(vals); break;
case 'edit': data = opReplace(opInfo?.opRecord?.[opSafeKey], vals); break;
default: break;
}
modelSet(data)
hide();
})
};
const handleOpenFile = () => {
invoke('open_file', { path: modelPath });
};
const modalTitle = `${({ new: 'Create', edit: 'Edit' })[opInfo.opType]} Language Model`;
return (
<div>
<Button className="add-btn" type="primary" onClick={opInfo.opNew}>Add Model</Button>
<div className="chat-model-path">PATH: <span onClick={handleOpenFile}>{modelPath}</span></div>
<Table
key={opInfo.opTime}
rowKey="cmd"
columns={columns}
scroll={{ x: 'auto' }}
dataSource={opData}
pagination={{
hideOnSinglePage: true,
showSizeChanger: true,
showQuickJumper: true,
defaultPageSize: 5,
pageSizeOptions: [5, 10, 15, 20],
showTotal: (total) => <span>Total {total} items</span>,
}}
/>
<Modal
open={isVisible}
onCancel={hide}
title={modalTitle}
onOk={handleOk}
destroyOnClose
maskClosable={false}
>
<LanguageModelForm record={opInfo?.opRecord} ref={formRef} />
</Modal>
</div>
)
}


@@ -1,28 +0,0 @@
.chat-prompts-tags {
.ant-tag {
margin: 2px;
}
}
.add-btn {
margin-bottom: 5px;
}
.chat-model-path {
font-size: 12px;
font-weight: bold;
color: #888;
margin-bottom: 5px;
span {
display: inline-block;
// background-color: #d8d8d8;
color: #4096ff;
padding: 0 8px;
height: 20px;
line-height: 20px;
border-radius: 4px;
cursor: pointer;
text-decoration: underline;
}
}


@@ -1,91 +0,0 @@
import { useEffect, useState } from 'react';
import { Table, Button, message } from 'antd';
import { invoke } from '@tauri-apps/api';
import { fetch, ResponseType } from '@tauri-apps/api/http';
import { writeTextFile, readTextFile } from '@tauri-apps/api/fs';
import useInit from '@/hooks/useInit';
import useColumns from '@/hooks/useColumns';
import useData from '@/hooks/useData';
import useChatModel from '@/hooks/useChatModel';
import { fmtDate, chatPromptsPath, GITHUB_PROMPTS_CSV_URL } from '@/utils';
import { modelColumns, genCmd } from './config';
import './index.scss';
const promptsURL = 'https://github.com/f/awesome-chatgpt-prompts/blob/main/prompts.csv';
export default function LanguageModel() {
const [loading, setLoading] = useState(false);
const [lastUpdated, setLastUpdated] = useState();
const { modelJson, modelSet } = useChatModel('sys_sync_prompts');
const { opData, opInit, opReplace, opSafeKey } = useData([]);
const { columns, ...opInfo } = useColumns(modelColumns());
// useInit(async () => {
// // const filename = await chatPromptsPath();
// // const data = await readTextFile(filename);
// // const list: Record<string, string>[] = await invoke('parse_prompt', { data });
// // const fileData: Record<string, any> = await invoke('metadata', { path: filename });
// // setLastUpdated(fileData.accessedAtMs);
// // opInit(list);
// console.log('«31» /view/SyncPrompts/index.tsx ~> ', modelJson);
// opInit([]);
// })
useEffect(() => {
if (!modelJson?.sys_sync_prompts) return;
opInit(modelJson?.sys_sync_prompts)
}, [modelJson?.sys_sync_prompts])
const handleSync = async () => {
setLoading(true);
const res = await fetch(GITHUB_PROMPTS_CSV_URL, {
method: 'GET',
responseType: ResponseType.Text,
});
const data = (res.data || '') as string;
if (res.ok) {
// const content = data.replace(/"(\s+)?,(\s+)?"/g, '","');
await writeTextFile(await chatPromptsPath(), data);
const list: Record<string, string>[] = await invoke('parse_prompt', { data });
opInit(list);
modelSet(list.map(i => ({ cmd: genCmd(i.act), enable: true, tags: ['chatgpt-prompts'], ...i })));
setLastUpdated(fmtDate(Date.now()) as any);
message.success('ChatGPT Prompts data has been synchronized!');
} else {
message.error('ChatGPT Prompts data sync failed, please try again!');
}
setLoading(false);
};
useEffect(() => {
if (opInfo.opType === 'enable') {
const data = opReplace(opInfo?.opRecord?.[opSafeKey], opInfo?.opRecord);
modelSet(data);
}
}, [opInfo.opTime]);
return (
<div>
<Button type="primary" loading={loading} onClick={handleSync}>Sync</Button>
{lastUpdated && <span style={{ marginLeft: 10, color: '#999' }}>Last updated on {fmtDate(lastUpdated)}</span>}
<div className="chat-model-path">URL: <a href={promptsURL} target="_blank">{promptsURL}</a></div>
<Table
key={lastUpdated}
rowKey="act"
columns={columns}
scroll={{ x: 'auto' }}
dataSource={opData}
pagination={{
hideOnSinglePage: true,
showSizeChanger: true,
showQuickJumper: true,
defaultPageSize: 5,
pageSizeOptions: [5, 10, 15, 20],
showTotal: (total) => <span>Total {total} items</span>,
}}
/>
</div>
)
}

src/view/model/SyncCustom/Form.tsx

@@ -0,0 +1,105 @@
import { useEffect, useState, ForwardRefRenderFunction, useImperativeHandle, forwardRef } from 'react';
import { Form, Input, Select, Tooltip } from 'antd';
import { v4 } from 'uuid';
import type { FormProps } from 'antd';
import { DISABLE_AUTO_COMPLETE, chatRoot } from '@/utils';
import useInit from '@/hooks/useInit';
interface SyncFormProps {
record?: Record<string|symbol, any> | null;
}
const initFormValue = {
act: '',
enable: true,
tags: [],
prompt: '',
};
const SyncForm: ForwardRefRenderFunction<FormProps, SyncFormProps> = ({ record }, ref) => {
const [form] = Form.useForm();
useImperativeHandle(ref, () => ({ form }));
const [root, setRoot] = useState('');
useInit(async () => {
setRoot(await chatRoot());
});
useEffect(() => {
if (record) {
form.setFieldsValue(record);
}
}, [record]);
const pathOptions = (
<Form.Item noStyle name="protocol" initialValue="https">
<Select>
<Select.Option value="local">{root}</Select.Option>
<Select.Option value="http">http://</Select.Option>
<Select.Option value="https">https://</Select.Option>
</Select>
</Form.Item>
);
const extOptions = (
<Form.Item noStyle name="ext" initialValue="json">
<Select>
<Select.Option value="csv">.csv</Select.Option>
<Select.Option value="json">.json</Select.Option>
</Select>
</Form.Item>
);
const jsonTip = (
<Tooltip
title={<pre>{JSON.stringify([
{ cmd: '', act: '', prompt: '' },
{ cmd: '', act: '', prompt: '' },
], null, 2)}</pre>}
>
<a>JSON</a>
</Tooltip>
);
const csvTip = (
<Tooltip
title={<pre>{`"cmd","act","prompt"
"cmd","act","prompt"
"cmd","act","prompt"
"cmd","act","prompt"`}</pre>}
>
<a>CSV</a>
</Tooltip>
);
return (
<>
<Form
form={form}
labelCol={{ span: 4 }}
initialValues={initFormValue}
>
<Form.Item
label="Name"
name="name"
rules={[{ required: true, message: 'Please input name!' }]}
>
<Input placeholder="Please input name" {...DISABLE_AUTO_COMPLETE} />
</Form.Item>
<Form.Item
label="PATH"
name="path"
rules={[{ required: true, message: 'Please input path!' }]}
>
<Input placeholder="YOUR_PATH" addonBefore={pathOptions} addonAfter={extOptions} {...DISABLE_AUTO_COMPLETE} />
</Form.Item>
<Form.Item style={{ display: 'none' }} name="id" initialValue={v4().replace(/-/g, '')}><input /></Form.Item>
</Form>
<div className="tip">
<p>The file supports only {csvTip} and {jsonTip} formats.</p>
</div>
</>
)
}
export default forwardRef(SyncForm);

src/view/model/SyncCustom/config.tsx

@@ -0,0 +1,81 @@
import { useState } from 'react';
import { Tag, Space, Popconfirm } from 'antd';
import { HistoryOutlined } from '@ant-design/icons';
import { shell, path } from '@tauri-apps/api';
import { Link } from 'react-router-dom';
import useInit from '@/hooks/useInit';
import { chatRoot, fmtDate } from '@/utils';
export const syncColumns = () => [
{
title: 'Name',
dataIndex: 'name',
key: 'name',
width: 100,
},
{
title: 'Protocol',
dataIndex: 'protocol',
key: 'protocol',
width: 80,
render: (v: string) => <Tag>{v}</Tag>,
},
{
title: 'PATH',
dataIndex: 'path',
key: 'path',
width: 180,
render: (_: string, row: any) => <RenderPath row={row} />
},
{
title: 'Last updated',
dataIndex: 'last_updated',
key: 'last_updated',
width: 140,
render: (v: number) => (
<div style={{ textAlign: 'center' }}>
<HistoryOutlined style={{ marginRight: 5, color: v ? '#52c41a' : '#ff4d4f' }} />
{ v ? fmtDate(v) : ''}
</div>
),
},
{
title: 'Action',
fixed: 'right',
width: 150,
render: (_: any, row: any, actions: any) => {
return (
<Space>
<a onClick={() => actions.setRecord(row, 'sync')}>Sync</a>
{row.last_updated && <Link to={`${row.id}`} state={row}>View</Link>}
<a onClick={() => actions.setRecord(row, 'edit')}>Edit</a>
<Popconfirm
title="Are you sure you want to delete this path?"
onConfirm={() => actions.setRecord(row, 'delete')}
okText="Yes"
cancelText="No"
>
<a>Delete</a>
</Popconfirm>
</Space>
)
}
}
];
const RenderPath = ({ row }: any) => {
const [filePath, setFilePath] = useState('');
useInit(async () => {
setFilePath(await getPath(row));
})
return <a onClick={() => shell.open(filePath)}>{filePath}</a>
};
export const getPath = async (row: any) => {
if (!/^http/.test(row.protocol)) {
return await path.join(await chatRoot(), row.path) + `.${row.ext}`;
} else {
return `${row.protocol}://${row.path}.${row.ext}`;
}
}
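`getPath` above branches on the row's protocol: `local` rows resolve under the chat root, while `http`/`https` rows become full URLs. A pure sketch with the root injected explicitly (Tauri-free; `buildPath` and `PathRow` are illustrative names, not part of the codebase):

```typescript
// Pure variant of getPath with the chat root passed in, so the
// protocol branching is testable without the Tauri path API.
type PathRow = { protocol: string; path: string; ext: string };

const buildPath = (root: string, row: PathRow): string =>
  /^http/.test(row.protocol)
    ? `${row.protocol}://${row.path}.${row.ext}`
    : `${root}/${row.path}.${row.ext}`;
```

The extension is always appended from the form's `ext` select, which is why the PATH input in `Form.tsx` asks for the path without a suffix.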

src/view/model/SyncCustom/index.tsx

@@ -0,0 +1,140 @@
import { useState, useRef, useEffect } from 'react';
import { Table, Modal, Button, message } from 'antd';
import { invoke, http, path, fs } from '@tauri-apps/api';
import useData from '@/hooks/useData';
import useChatModel, { useCacheModel } from '@/hooks/useChatModel';
import useColumns from '@/hooks/useColumns';
import { TABLE_PAGINATION } from '@/hooks/useTable';
import { CHAT_MODEL_JSON, chatRoot, readJSON, genCmd } from '@/utils';
import { syncColumns, getPath } from './config';
import SyncForm from './Form';
const setTag = (data: Record<string, any>[]) => data.map((i) => ({ ...i, tags: ['user-sync'], enable: true }))
export default function SyncCustom() {
const [isVisible, setVisible] = useState(false);
const { modelData, modelSet } = useChatModel('sync_custom', CHAT_MODEL_JSON);
const { modelCacheCmd, modelCacheSet } = useCacheModel();
const { opData, opInit, opAdd, opRemove, opReplace, opSafeKey } = useData([]);
const { columns, ...opInfo } = useColumns(syncColumns());
const formRef = useRef<any>(null);
const hide = () => {
setVisible(false);
opInfo.resetRecord();
};
useEffect(() => {
if (modelData.length <= 0) return;
opInit(modelData);
}, [modelData]);
useEffect(() => {
if (!opInfo.opType) return;
if (opInfo.opType === 'sync') {
const filename = `${opInfo?.opRecord?.id}.json`;
handleSync(filename).then(() => {
const data = opReplace(opInfo?.opRecord?.[opSafeKey], { ...opInfo?.opRecord, last_updated: Date.now() });
modelSet(data);
opInfo.resetRecord();
});
}
if (['edit', 'new'].includes(opInfo.opType)) {
setVisible(true);
}
if (['delete'].includes(opInfo.opType)) {
const data = opRemove(opInfo?.opRecord?.[opSafeKey]);
modelSet(data);
opInfo.resetRecord();
}
}, [opInfo.opType, formRef]);
const handleSync = async (filename: string) => {
const record = opInfo?.opRecord;
const isJson = /json$/.test(record?.ext);
const file = await path.join(await chatRoot(), 'cache_model', filename);
const filePath = await getPath(record);
// https or http
if (/^http/.test(record?.protocol)) {
const res = await http.fetch(filePath, {
method: 'GET',
responseType: isJson ? 1 : 2,
});
if (res.ok) {
if (isJson) {
// parse json
await modelCacheSet(setTag(Array.isArray(res?.data) ? res?.data : []), file);
} else {
// parse csv
const list: Record<string, string>[] = await invoke('parse_prompt', { data: res?.data });
const fmtList = list.map(i => ({ ...i, cmd: i.cmd ? i.cmd : genCmd(i.act), enable: true, tags: ['user-sync'] }));
await modelCacheSet(fmtList, file);
}
await modelCacheCmd();
message.success('ChatGPT Prompts data has been synchronized!');
} else {
message.error('ChatGPT Prompts data sync failed, please try again!');
}
return;
}
// local
if (isJson) {
// parse json
const data = await readJSON(filePath, { isRoot: true });
await modelCacheSet(setTag(Array.isArray(data) ? data : []), file);
} else {
// parse csv
const data = await fs.readTextFile(filePath);
const list: Record<string, string>[] = await invoke('parse_prompt', { data });
const fmtList = list.map(i => ({ ...i, cmd: i.cmd ? i.cmd : genCmd(i.act), enable: true, tags: ['user-sync'] }));
await modelCacheSet(fmtList, file);
}
await modelCacheCmd();
};
const handleOk = () => {
formRef.current?.form?.validateFields()
.then((vals: Record<string, any>) => {
let data = [];
switch (opInfo.opType) {
case 'new': data = opAdd(vals); break;
case 'edit': data = opReplace(opInfo?.opRecord?.[opSafeKey], vals); break;
default: break;
}
modelSet(data);
hide();
})
};
return (
<div>
<Button
className="chat-add-btn"
type="primary"
onClick={opInfo.opNew}
>
Add PATH
</Button>
<Table
key="id"
rowKey="name"
columns={columns}
scroll={{ x: 800 }}
dataSource={opData}
pagination={TABLE_PAGINATION}
/>
<Modal
open={isVisible}
onCancel={hide}
title="Model PATH"
onOk={handleOk}
destroyOnClose
maskClosable={false}
>
<SyncForm ref={formRef} record={opInfo?.opRecord} />
</Modal>
</div>
)
}


@@ -1,8 +1,8 @@
import { Switch, Tag, Tooltip } from 'antd';
export const genCmd = (act: string) => act.replace(/\s+|\/+/g, '_').replace(/[^\d\w]/g, '').toLocaleLowerCase();
import { genCmd } from '@/utils';
export const modelColumns = () => [
export const syncColumns = () => [
{
title: '/{cmd}',
dataIndex: 'cmd',

src/view/model/SyncPrompts/index.scss

@@ -0,0 +1,12 @@
.chat-table-tip, .chat-table-btns {
display: flex;
justify-content: space-between;
}
.chat-table-btns {
margin-bottom: 5px;
.num {
margin-left: 10px;
}
}

src/view/model/SyncPrompts/index.tsx

@@ -0,0 +1,109 @@
import { useEffect, useState } from 'react';
import { Table, Button, message, Popconfirm } from 'antd';
import { invoke, http, path, shell } from '@tauri-apps/api';
import useInit from '@/hooks/useInit';
import useData from '@/hooks/useData';
import useColumns from '@/hooks/useColumns';
import useChatModel, { useCacheModel } from '@/hooks/useChatModel';
import useTable, { TABLE_PAGINATION } from '@/hooks/useTable';
import { fmtDate, chatRoot, GITHUB_PROMPTS_CSV_URL, genCmd } from '@/utils';
import { syncColumns } from './config';
import './index.scss';
const promptsURL = 'https://github.com/f/awesome-chatgpt-prompts/blob/main/prompts.csv';
export default function SyncPrompts() {
const { rowSelection, selectedRowIDs } = useTable();
const [jsonPath, setJsonPath] = useState('');
const { modelJson, modelSet } = useChatModel('sync_prompts');
const { modelCacheJson, modelCacheSet } = useCacheModel(jsonPath);
const { opData, opInit, opReplace, opReplaceItems, opSafeKey } = useData([]);
const { columns, ...opInfo } = useColumns(syncColumns());
const lastUpdated = modelJson?.sync_prompts?.last_updated;
const selectedItems = rowSelection.selectedRowKeys || [];
useInit(async () => {
setJsonPath(await path.join(await chatRoot(), 'cache_model', 'chatgpt_prompts.json'));
});
useEffect(() => {
if (modelCacheJson.length <= 0) return;
opInit(modelCacheJson);
}, [modelCacheJson.length]);
const handleSync = async () => {
const res = await http.fetch(GITHUB_PROMPTS_CSV_URL, {
method: 'GET',
responseType: http.ResponseType.Text,
});
const data = (res.data || '') as string;
if (res.ok) {
// const content = data.replace(/"(\s+)?,(\s+)?"/g, '","');
const list: Record<string, string>[] = await invoke('parse_prompt', { data });
const fmtList = list.map(i => ({ ...i, cmd: i.cmd ? i.cmd : genCmd(i.act), enable: true, tags: ['chatgpt-prompts'] }));
await modelCacheSet(fmtList);
opInit(fmtList);
modelSet({
id: 'chatgpt_prompts',
last_updated: Date.now(),
});
message.success('ChatGPT Prompts data has been synchronized!');
} else {
message.error('ChatGPT Prompts data sync failed, please try again!');
}
};
useEffect(() => {
if (opInfo.opType === 'enable') {
const data = opReplace(opInfo?.opRecord?.[opSafeKey], opInfo?.opRecord);
modelCacheSet(data);
}
}, [opInfo.opTime]);
const handleEnable = (isEnable: boolean) => {
const data = opReplaceItems(selectedRowIDs, { enable: isEnable })
modelCacheSet(data);
};
return (
<div>
<div className="chat-table-btns">
<div>
{selectedItems.length > 0 && (
<>
<Button type="primary" onClick={() => handleEnable(true)}>Enable</Button>
<Button onClick={() => handleEnable(false)}>Disable</Button>
<span className="num">Selected {selectedItems.length} items</span>
</>
)}
</div>
<Popconfirm
title={<span>Data sync will enable all prompts,<br/>are you sure you want to sync?</span>}
placement="topLeft"
onConfirm={handleSync}
okText="Yes"
cancelText="No"
>
<Button type="primary">Sync</Button>
</Popconfirm>
</div>
<div className="chat-table-tip">
<div className="chat-sync-path">
<div>PATH: <a onClick={() => shell.open(promptsURL)} target="_blank" title={promptsURL}>f/awesome-chatgpt-prompts/prompts.csv</a></div>
<div>CACHE: <a onClick={() => shell.open(jsonPath)} target="_blank" title={jsonPath}>{jsonPath}</a></div>
</div>
{lastUpdated && <span style={{ marginLeft: 10, color: '#888', fontSize: 12 }}>Last updated on {fmtDate(lastUpdated)}</span>}
</div>
<Table
key={lastUpdated}
rowKey="act"
columns={columns}
scroll={{ x: 'auto' }}
dataSource={opData}
rowSelection={rowSelection}
pagination={TABLE_PAGINATION}
/>
</div>
)
}

src/view/model/SyncRecord/config.tsx

@@ -0,0 +1,47 @@
import { Switch, Tag, Tooltip } from 'antd';
import { genCmd } from '@/utils';
export const syncColumns = () => [
{
title: '/{cmd}',
dataIndex: 'cmd',
fixed: 'left',
// width: 120,
key: 'cmd',
render: (_: string, row: Record<string, string>) => (
<Tag color="#2a2a2a">/{genCmd(row.act)}</Tag>
),
},
{
title: 'Act',
dataIndex: 'act',
key: 'act',
// width: 200,
},
{
title: 'Tags',
dataIndex: 'tags',
key: 'tags',
// width: 150,
render: () => <Tag>chatgpt-prompts</Tag>,
},
{
title: 'Enable',
dataIndex: 'enable',
key: 'enable',
// width: 80,
render: (v: boolean = false, row: Record<string, any>, action: Record<string, any>) => (
<Switch checked={v} onChange={(v) => action.setRecord({ ...row, enable: v }, 'enable')} />
),
},
{
title: 'Prompt',
dataIndex: 'prompt',
key: 'prompt',
// width: 300,
render: (v: string) => (
<Tooltip overlayInnerStyle={{ width: 350 }} title={v}><span className="chat-prompts-val">{v}</span></Tooltip>
),
},
];

src/view/model/SyncRecord/index.tsx

@@ -0,0 +1,85 @@
import { useEffect, useState } from 'react';
import { useLocation } from 'react-router-dom';
import { ArrowLeftOutlined } from '@ant-design/icons';
import { Table, Button } from 'antd';
import { shell, path } from '@tauri-apps/api';
import useColumns from '@/hooks/useColumns';
import useData from '@/hooks/useData';
import { useCacheModel } from '@/hooks/useChatModel';
import useTable, { TABLE_PAGINATION } from '@/hooks/useTable';
import { fmtDate, chatRoot } from '@/utils';
import { getPath } from '@/view/model/SyncCustom/config';
import { syncColumns } from './config';
import useInit from '@/hooks/useInit';
export default function SyncRecord() {
const location = useLocation();
const [filePath, setFilePath] = useState('');
const [jsonPath, setJsonPath] = useState('');
const state = location?.state;
const { rowSelection, selectedRowIDs } = useTable();
const { modelCacheJson, modelCacheSet } = useCacheModel(jsonPath);
const { opData, opInit, opReplace, opReplaceItems, opSafeKey } = useData([]);
const { columns, ...opInfo } = useColumns(syncColumns());
const selectedItems = rowSelection.selectedRowKeys || [];
useInit(async () => {
setFilePath(await getPath(state));
setJsonPath(await path.join(await chatRoot(), 'cache_model', `${state?.id}.json`));
})
useEffect(() => {
if (modelCacheJson.length <= 0) return;
opInit(modelCacheJson);
}, [modelCacheJson.length]);
useEffect(() => {
if (opInfo.opType === 'enable') {
const data = opReplace(opInfo?.opRecord?.[opSafeKey], opInfo?.opRecord);
modelCacheSet(data);
}
}, [opInfo.opTime]);
const handleEnable = (isEnable: boolean) => {
const data = opReplaceItems(selectedRowIDs, { enable: isEnable })
modelCacheSet(data);
};
return (
<div>
<div className="chat-table-btns">
<div>
<Button shape="round" icon={<ArrowLeftOutlined />} onClick={() => history.back()} />
</div>
<div>
{selectedItems.length > 0 && (
<>
<Button type="primary" onClick={() => handleEnable(true)}>Enable</Button>
<Button onClick={() => handleEnable(false)}>Disable</Button>
<span className="num">Selected {selectedItems.length} items</span>
</>
)}
</div>
</div>
<div className="chat-table-tip">
<div className="chat-sync-path">
<div>PATH: <a onClick={() => shell.open(filePath)} target="_blank" title={filePath}>{filePath}</a></div>
<div>CACHE: <a onClick={() => shell.open(jsonPath)} target="_blank" title={jsonPath}>{jsonPath}</a></div>
</div>
{state?.last_updated && <span style={{ marginLeft: 10, color: '#888', fontSize: 12 }}>Last updated on {fmtDate(state?.last_updated)}</span>}
</div>
<Table
key="prompt"
rowKey="act"
columns={columns}
scroll={{ x: 'auto' }}
dataSource={opData}
rowSelection={rowSelection}
pagination={TABLE_PAGINATION}
/>
</div>
)
}


@@ -5,7 +5,7 @@ import type { FormProps } from 'antd';
import Tags from '@comps/Tags';
import { DISABLE_AUTO_COMPLETE } from '@/utils';
interface LanguageModelProps {
interface UserCustomFormProps {
record?: Record<string|symbol, any> | null;
}
@@ -16,7 +16,7 @@ const initFormValue = {
prompt: '',
};
const LanguageModel: ForwardRefRenderFunction<FormProps, LanguageModelProps> = ({ record }, ref) => {
const UserCustomForm: ForwardRefRenderFunction<FormProps, UserCustomFormProps> = ({ record }, ref) => {
const [form] = Form.useForm();
useImperativeHandle(ref, () => ({ form }));
@@ -63,4 +63,4 @@ const LanguageModel: ForwardRefRenderFunction<FormProps, LanguageModelProps> = (
)
}
export default forwardRef(LanguageModel);
export default forwardRef(UserCustomForm);

src/view/model/UserCustom/index.tsx

@@ -0,0 +1,139 @@
import { useState, useRef, useEffect } from 'react';
import { Table, Button, Modal, message } from 'antd';
import { shell, path } from '@tauri-apps/api';
import useInit from '@/hooks/useInit';
import useData from '@/hooks/useData';
import useChatModel, { useCacheModel } from '@/hooks/useChatModel';
import useColumns from '@/hooks/useColumns';
import useTable, { TABLE_PAGINATION } from '@/hooks/useTable';
import { chatRoot, fmtDate } from '@/utils';
import { modelColumns } from './config';
import UserCustomForm from './Form';
export default function LanguageModel() {
const { rowSelection, selectedRowIDs } = useTable();
const [isVisible, setVisible] = useState(false);
const [jsonPath, setJsonPath] = useState('');
const { modelJson, modelSet } = useChatModel('user_custom');
const { modelCacheJson, modelCacheSet } = useCacheModel(jsonPath);
const { opData, opInit, opReplaceItems, opAdd, opRemove, opReplace, opSafeKey } = useData([]);
const { columns, ...opInfo } = useColumns(modelColumns());
const lastUpdated = modelJson?.user_custom?.last_updated;
const selectedItems = rowSelection.selectedRowKeys || [];
const formRef = useRef<any>(null);
useInit(async () => {
setJsonPath(await path.join(await chatRoot(), 'cache_model', 'user_custom.json'));
});
useEffect(() => {
if (modelCacheJson.length <= 0) return;
opInit(modelCacheJson);
}, [modelCacheJson.length]);
useEffect(() => {
if (!opInfo.opType) return;
if (['edit', 'new'].includes(opInfo.opType)) {
setVisible(true);
}
if (['delete'].includes(opInfo.opType)) {
const data = opRemove(opInfo?.opRecord?.[opSafeKey]);
modelCacheSet(data);
opInfo.resetRecord();
}
}, [opInfo.opType, formRef]);
useEffect(() => {
if (opInfo.opType === 'enable') {
const data = opReplace(opInfo?.opRecord?.[opSafeKey], opInfo?.opRecord);
modelCacheSet(data);
}
}, [opInfo.opTime])
const handleEnable = (isEnable: boolean) => {
const data = opReplaceItems(selectedRowIDs, { enable: isEnable })
modelCacheSet(data);
};
const hide = () => {
setVisible(false);
opInfo.resetRecord();
};
const handleOk = () => {
formRef.current?.form?.validateFields()
.then(async (vals: Record<string, any>) => {
if (modelCacheJson.map((i: any) => i.cmd).includes(vals.cmd) && opInfo?.opRecord?.cmd !== vals.cmd) {
message.warning(`"cmd: /${vals.cmd}" already exists, please choose a different name and resubmit.`);
return;
}
let data = [];
switch (opInfo.opType) {
case 'new': data = opAdd(vals); break;
case 'edit': data = opReplace(opInfo?.opRecord?.[opSafeKey], vals); break;
default: break;
}
await modelCacheSet(data);
opInit(data);
modelSet({
id: 'user_custom',
last_updated: Date.now(),
});
hide();
})
};
const modalTitle = `${({ new: 'Create', edit: 'Edit' })[opInfo.opType]} Model`;
return (
<div>
<div className="chat-table-btns">
<Button className="chat-add-btn" type="primary" onClick={opInfo.opNew}>Add Model</Button>
<div>
{selectedItems.length > 0 && (
<>
<Button type="primary" onClick={() => handleEnable(true)}>Enable</Button>
<Button onClick={() => handleEnable(false)}>Disable</Button>
<span className="num">Selected {selectedItems.length} items</span>
</>
)}
</div>
</div>
{/* <div className="chat-model-path">PATH: <span onClick={handleOpenFile}>{modelPath}</span></div> */}
<div className="chat-table-tip">
<div className="chat-sync-path">
<div>CACHE: <a onClick={() => shell.open(jsonPath)} title={jsonPath}>{jsonPath}</a></div>
</div>
{lastUpdated && <span style={{ marginLeft: 10, color: '#888', fontSize: 12 }}>Last updated on {fmtDate(lastUpdated)}</span>}
</div>
<Table
key={lastUpdated}
rowKey="cmd"
columns={columns}
scroll={{ x: 'auto' }}
dataSource={opData}
rowSelection={rowSelection}
pagination={TABLE_PAGINATION}
/>
<Modal
open={isVisible}
onCancel={hide}
title={modalTitle}
onOk={handleOk}
destroyOnClose
maskClosable={false}
>
<UserCustomForm record={opInfo?.opRecord} ref={formRef} />
</Modal>
</div>
)
}