
Commit

v1.0 release (#8)
* Create .gitignore

* add data

* Create banner.png

* add oai_api_demo

* Create merge_llama3_with_chinese_lora_low_mem.py

* Create chat.sh

* add issue template

* init requirements.txt

* add ollama modelfile

* add downstream scripts

* update cmmlu script

* Update README.md

* update downstream scripts

* remove invalid args

* add inference script

* update longbench script

* Update ruozhiba_qa2449_gpt4turbo.json

* initial readme

---------

Co-authored-by: ymcui <[email protected]>
Co-authored-by: iMountTai <[email protected]>
3 people authored Apr 30, 2024
1 parent 5ed732a commit ba95712
Showing 29 changed files with 15,875 additions and 11 deletions.
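
Among the scripts listed in the commit message, merge_llama3_with_chinese_lora_low_mem.py merges a Chinese LoRA adapter into the Llama-3 base weights. The snippet below is only a rough sketch of the standard PEFT merge flow for orientation; it is not the low-memory implementation added in this commit, and the model and adapter paths are placeholders.

```python
# Rough sketch of a standard LoRA merge with PEFT; this is NOT the
# low-memory script added in this commit, and all paths are placeholders.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_path = "meta-llama/Meta-Llama-3-8B"   # placeholder: base model path or HF repo id
lora_path = "path/to/chinese-lora"         # placeholder: LoRA adapter directory
out_path = "path/to/merged-model"          # placeholder: output directory

# Load the base model in half precision to keep memory usage reasonable.
base = AutoModelForCausalLM.from_pretrained(
    base_path, torch_dtype=torch.float16, low_cpu_mem_usage=True
)

# Attach the LoRA adapter, fold its deltas into the base weights, and save.
model = PeftModel.from_pretrained(base, lora_path)
merged = model.merge_and_unload()
merged.save_pretrained(out_path)

# Assumption: the tokenizer is taken from the base model; a project-specific
# merge script may ship its own tokenizer with the adapter instead.
AutoTokenizer.from_pretrained(base_path).save_pretrained(out_path)
```

Downstream inference, evaluation, or quantization scripts can then be pointed at the merged output directory.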
77 changes: 77 additions & 0 deletions .github/ISSUE_TEMPLATE/ISSUE_TEMPLATE_EN.yml
@@ -0,0 +1,77 @@
name: English Issue Template
description: For questions related to this project. We will prioritize issues that include relatively complete information.

body:
  - type: markdown
    attributes:
      value: 💡 For open discussions, please visit the [Discussion Space](https://github.com/ymcui/Chinese-LLaMA-Alpaca-3/discussions). Please do not open a discussion in the Issue section. Thank you.
  - type: checkboxes
    id: mustchecks
    attributes:
      label: Check before submitting issues
      description: Please check the following items before asking a question, and use the search function to find issues related to your problem.
      options:
        - label: Make sure to pull the latest code, as some issues and bugs have already been fixed.
          required: true
        - label: I have read the [Wiki](https://github.com/ymcui/Chinese-LLaMA-Alpaca-3/wiki) and the [FAQ section](https://github.com/ymcui/Chinese-LLaMA-Alpaca-3/wiki/FAQ), searched for similar issues, and did not find a similar problem or solution.
          required: true
        - label: For third-party plugin issues (e.g., [llama.cpp](https://github.com/ggerganov/llama.cpp), [text-generation-webui](https://github.com/oobabooga/text-generation-webui)), we recommend checking the corresponding project for solutions.
          required: true
  - type: dropdown
    id: question-type
    attributes:
      label: Type of Issue
      description: Please select the type of issue that best matches your problem.
      options:
        - Download issue
        - Model conversion and merging
        - Model training and fine-tuning
        - Model inference
        - Model quantization and deployment
        - Performance issue
        - Other issues
  - type: dropdown
    id: model-type
    attributes:
      label: Base Model
      description: Please provide the type of base model. For issues related to multiple models, please select the most appropriate one and specify all models in the main text.
      options:
        - Llama-3-Chinese-8B (Base Model)
        - Llama-3-Chinese-Instruct-8B (Chat Model)
        - Others
  - type: dropdown
    id: operating-system
    attributes:
      label: Operating System
      description: Please provide your operating system.
      options:
        - Windows
        - macOS
        - Linux
  - type: textarea
    id: question-detailed
    attributes:
      label: Describe your issue in detail
      description: Please describe your problem in as much detail as possible. **For code-related issues, please provide the complete command to reproduce the problem.** This will help us locate the issue quickly.
      value: |
        ```
        # Please copy-and-paste your command here.
        ```
  - type: textarea
    id: dependencies
    attributes:
      label: Dependencies (must be provided for code-related issues)
      description: Please provide the versions of common dependencies such as transformers, peft, and torch. Use `pip list | grep -E 'transformers|peft|torch|sentencepiece|bitsandbytes'`.
      value: |
        ```
        # Please copy-and-paste your dependencies here.
        ```
  - type: textarea
    id: logs
    attributes:
      label: Execution logs or screenshots
      description: Please provide logs in text format (upload files if the content is too long), or alternatively, screenshots of the execution record.
      value: |
        ```
        # Please copy-and-paste your logs here.
        ```
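
The dependencies field above collects package versions with a pip one-liner. As a small alternative sketch (not part of this commit), the same information can be gathered from Python using importlib.metadata:

```python
# Alternative to the pip-list one-liner above (not part of this commit):
# print the versions of the packages the issue form asks about.
from importlib.metadata import version, PackageNotFoundError

for pkg in ["transformers", "peft", "torch", "sentencepiece", "bitsandbytes"]:
    try:
        print(f"{pkg}=={version(pkg)}")
    except PackageNotFoundError:
        print(f"{pkg}: not installed")
```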
77 changes: 77 additions & 0 deletions .github/ISSUE_TEMPLATE/ISSUE_TEMPLATE_ZH.yml
@@ -0,0 +1,77 @@
name: Chinese Issue Template
description: For questions related to this project. We will prioritize issues that include relatively complete information.

body:
  - type: markdown
    attributes:
      value: 💡 For open discussions, please visit the [Discussion Space](https://github.com/ymcui/Chinese-LLaMA-Alpaca-3/discussions); please do not ask such questions as issues. Thank you.
  - type: checkboxes
    id: mustchecks
    attributes:
      label: Items to check before submitting
      description: Please check the following items before asking a question, and use the search function to find issues related to your problem.
      options:
        - label: Make sure you are using the latest code from the repository (git pull).
          required: true
        - label: I have read the [project documentation](https://github.com/ymcui/Chinese-LLaMA-Alpaca-3/wiki) and the [FAQ section](https://github.com/ymcui/Chinese-LLaMA-Alpaca-3/wiki/常见问题), searched the existing issues, and did not find a similar problem or solution.
          required: true
        - label: For third-party plugin issues (e.g., [llama.cpp](https://github.com/ggerganov/llama.cpp), [text-generation-webui](https://github.com/oobabooga/text-generation-webui)), we recommend looking for a solution in the corresponding project first.
          required: true
  - type: dropdown
    id: question-type
    attributes:
      label: Type of Issue
      description: Please select the type of issue that best matches your problem.
      options:
        - Download issue
        - Model conversion and merging
        - Model training and fine-tuning
        - Model inference
        - Model quantization and deployment
        - Performance issue
        - Other issues
  - type: dropdown
    id: model-type
    attributes:
      label: Base Model
      description: Please specify the model involved in the issue.
      options:
        - Llama-3-Chinese-8B (base model)
        - Llama-3-Chinese-Instruct-8B (instruct/chat model)
        - Others
  - type: dropdown
    id: operating-system
    attributes:
      label: Operating System
      description: Please provide your operating system.
      options:
        - Windows
        - macOS
        - Linux
  - type: textarea
    id: question-detailed
    attributes:
      label: Describe your issue in detail
      description: Please describe the problem as specifically as possible. **For code-related issues, you must provide the complete command used to run the code**, as this will help us locate the issue quickly.
      value: |
        ```
        # Please paste the command you ran here (inside this code block).
        ```
  - type: textarea
    id: dependencies
    attributes:
      label: Dependencies (must be provided for code-related issues)
      description: Please provide the versions of common dependencies such as transformers, peft, and torch, e.g. via `pip list | grep -E 'transformers|peft|torch|sentencepiece|bitsandbytes'`.
      value: |
        ```
        # Please paste your dependency versions here (inside this code block).
        ```
  - type: textarea
    id: logs
    attributes:
      label: Execution logs or screenshots
      description: Please provide logs in text form where possible (upload a file if the content is too long), pasted inside a markdown code block; screenshots of the run are also acceptable.
      value: |
        ```
        # Please paste your run logs here (inside this code block).
        ```
1 change: 1 addition & 0 deletions .github/ISSUE_TEMPLATE/config.yml
@@ -0,0 +1 @@
blank_issues_enabled: false
31 changes: 31 additions & 0 deletions .github/workflows/stale.yml
@@ -0,0 +1,31 @@
# This workflow warns and then closes issues and PRs that have had no activity for a specified amount of time.
#
# You can adjust the behavior by modifying this file.
# For more information, see:
# https://github.com/actions/stale
name: Mark stale issues and pull requests

on:
  schedule:
    - cron: '0 22 * * *'

jobs:
  stale:

    runs-on: ubuntu-latest
    permissions:
      issues: write
      pull-requests: read

    steps:
      - uses: actions/stale@v8
        with:
          repo-token: ${{ secrets.GITHUB_TOKEN }}
          stale-issue-message: 'This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your consideration.'
          stale-pr-message: 'This PR has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your consideration.'
          stale-issue-label: 'stale'
          stale-pr-label: 'stale'
          operations-per-run: 500
          close-issue-message: 'Closing the issue, since no updates observed. Feel free to re-open if you need any further assistance.'
          days-before-stale: 14
          days-before-close: 7
