Here are the frontend and backend servers.

Please install nginx and configure its settings. A sample file `nginx_setting` is provided.

```shell
sudo apt install nginx
sudo systemctl start nginx
```

A domain name and a corresponding certificate are required for this server. To automatically issue, renew, and install the certificate, tools such as Certbot (an ACME client) are recommended.
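A minimal sketch of the certificate setup with Certbot's nginx plugin, assuming an Ubuntu/Debian host and with `example.com` standing in for your real domain:

```shell
# Install Certbot and its nginx plugin (Ubuntu/Debian)
sudo apt install certbot python3-certbot-nginx

# Issue a certificate via the ACME protocol and let Certbot
# edit the nginx server block to install it
sudo certbot --nginx -d example.com

# Confirm that automatic renewal will work, without touching the real cert
sudo certbot renew --dry-run
```

Certbot sets up a systemd timer (or cron job) for renewal on its own; the dry run only verifies that renewal would succeed.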
Please install MySQL and create an account and the `superb` database.

- Install:

  ```shell
  sudo apt install mysql-server
  sudo mysql_secure_installation
  sudo mysql
  ```

- Create an account:

  ```sql
  CREATE USER '<username>'@'localhost' IDENTIFIED WITH mysql_native_password BY '<password>';
  GRANT ALL PRIVILEGES ON *.* TO '<username>'@'localhost' WITH GRANT OPTION;
  ```

- Create the `superb` database:

  ```sql
  CREATE DATABASE superb;
  ```
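The backend connects to this database with the account created above. As a sanity check of the connection URL format, a sketch (the `superb_admin` name and the `mysql+pymysql` driver prefix are illustrative assumptions, not something the backend requires):

```python
from urllib.parse import quote_plus

def mysql_dsn(user: str, password: str, db: str, host: str = "localhost") -> str:
    """Build a standard MySQL connection URL; the password is
    percent-encoded so special characters survive the URL."""
    return f"mysql+pymysql://{user}:{quote_plus(password)}@{host}/{db}"

print(mysql_dsn("superb_admin", "p@ss/word", "superb"))
# mysql+pymysql://superb_admin:p%40ss%2Fword@localhost/superb
```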
Please refer to `./frontend/`.
Please refer to `./backend/`.
```sql
ALTER TABLE superb.scores
ADD ${column_name} float;
```

- Please mind that `${column_name}` = `${task}_${metric}_${mode}`, e.g. `PR_per_public` and `QbE_mtwv_hidden`.
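The naming rule can be checked programmatically. A minimal sketch using the stdlib `sqlite3` in place of MySQL (the `build_column_name` helper and its validation regex are illustrative, not part of the backend):

```python
import re
import sqlite3

def build_column_name(task: str, metric: str, mode: str) -> str:
    """Compose the score column name as ${task}_${metric}_${mode}."""
    name = f"{task}_{metric}_{mode}"
    # Column names must be plain identifiers (letters, digits, underscores).
    if not re.fullmatch(r"[A-Za-z]\w*", name):
        raise ValueError(f"invalid column name: {name}")
    return name

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE scores (id INTEGER PRIMARY KEY)")

col = build_column_name("PR", "per", "public")   # -> "PR_per_public"
conn.execute(f"ALTER TABLE scores ADD {col} float")
conn.execute(f"INSERT INTO scores (id, {col}) VALUES (1, 3.17)")
print(conn.execute(f"SELECT {col} FROM scores").fetchone()[0])  # 3.17
```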
- Put your ground truth in the `./backend/inference/truth/${task}` folder.
- Modify `./backend/calculate.py` with the following template:

  ```python
  #============================================#
  #                  ${task}                   #
  #============================================#
  # ${task} {public/hidden}
  if os.path.isdir(os.path.join(predict_root, "${task_folder}")):
      if os.path.isfile(os.path.join(predict_root, "${task_folder}", "${gt_file}")):
          if ...:  # what the user uploaded is valid
              print("[${task} ${public/hidden}]", file=output_log_f)
              try:
                  score = ...  # calculation
                  print(f"${task}: ${metric} {score}", file=output_log_f)
                  score_model.${task}_${metric}_${mode} = score
                  session.commit()
              except Exception as e:
                  print(e, file=output_log_f)
  ```
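The template's control flow can be exercised end to end. A self-contained sketch with a temporary upload directory and a dummy object standing in for the real `score_model`/`session` (all names here are illustrative):

```python
import os
import sys
import tempfile
from types import SimpleNamespace

def calculate_task(predict_root, task_folder, gt_file, output_log_f, score_model):
    """Mirror of the template: check the upload exists, compute a score,
    store it on the score model, and log the result."""
    task_dir = os.path.join(predict_root, task_folder)
    if os.path.isdir(task_dir) and os.path.isfile(os.path.join(task_dir, gt_file)):
        print("[demo public]", file=output_log_f)
        try:
            score = 0.42  # stand-in for the real metric calculation
            print(f"demo: metric {score}", file=output_log_f)
            score_model.demo_metric_public = score
        except Exception as e:
            print(e, file=output_log_f)

with tempfile.TemporaryDirectory() as root:
    os.makedirs(os.path.join(root, "demo"))
    open(os.path.join(root, "demo", "predict.txt"), "w").close()
    model = SimpleNamespace(demo_metric_public=None)
    calculate_task(root, "demo", "predict.txt", sys.stdout, model)
    print(model.demo_metric_public)  # 0.42
```

If the folder or file is missing, the function simply does nothing, matching the template's nested `isdir`/`isfile` guards.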
- Modify `./backend/models/score.py`. Add a column like:

  ```python
  ${task}_${metric}_${mode} = db.Column(db.Float)
  ```

- Modify `./backend/models/naive_models.py`. Add a column to the `ScoreModel` class like:

  ```python
  ${task}_${metric}_${mode} = db.Column(db.Float)
  ```
- Modify `./backend/configs.yaml`. Add your new task info to the `SCORE` section of `INDIVIDUAL_SUBMISSION_INFO` and `LEADERBOARD_INFO`.
- (Optional) Add calculated scores of your new task/metric for official models by modifying the `get_leaderboard_default()` function defined in `./backend/utils.py`.
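The exact shape `get_leaderboard_default()` returns is not shown here, so the sketch below only illustrates the idea of pre-filling the new column for official models; every name and value in it is an assumption:

```python
def get_leaderboard_default():
    """Hypothetical sketch: one dict per official model, with the new
    task/metric column pre-filled (None until a score is computed)."""
    official_models = [
        {"name": "model_a", "PR_per_public": 10.0},
        {"name": "model_b", "PR_per_public": 12.5},
    ]
    for model in official_models:
        # New column for the added task/metric; replace None with the
        # score you computed offline, if any.
        model.setdefault("QbE_mtwv_hidden", None)
    return official_models

rows = get_leaderboard_default()
print(all("QbE_mtwv_hidden" in r for r in rows))  # True
```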
- Append the `individual_submission_columnInfo` array in `./frontend/src/Data.js` with:

  ```js
  ${task}_${metric}_${mode}: {
    header: "${task} ${mode}",
    width: 100,
    higherBetter: false,
    isScore: true,
    type: "number",
  },
  ```

- Append the `leaderboard_columnInfo` array in `./frontend/src/Data.js` with:

  ```js
  ${task}_${metric}_${mode}: {
    header: "${task} ${mode}",
    width: 100,
    higherBetter: false,
    isScore: true,
    type: "number",
  },
  ```
```sql
ALTER TABLE superb.files
ADD ${column_name} bigint unsigned default NULL; -- if required: remove "default NULL"
```
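MySQL's `bigint unsigned` holds values in 0..2^64-1, so an out-of-range Python int would only fail at insert time. A small pre-insert guard, as a sketch (`check_file_size` is an illustrative helper, not backend code):

```python
BIGINT_UNSIGNED_MAX = 2**64 - 1  # MySQL bigint unsigned upper bound

def check_file_size(n_bytes):
    """Accept None (the column's default) or a value in range."""
    if n_bytes is None:
        return None
    if not (0 <= n_bytes <= BIGINT_UNSIGNED_MAX):
        raise ValueError(f"out of bigint unsigned range: {n_bytes}")
    return n_bytes

print(check_file_size(3_120_000_000))  # 3120000000
```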
- Modify `./backend/models/file.py`. Add a column like:

  ```python
  from sqlalchemy.dialects.mysql import BIGINT

  ${column_name} = db.Column(BIGINT)
  ```

- Modify `./backend/models/naive_models.py`. Add a column to the `FileModel` class like:

  ```python
  from sqlalchemy.dialects.mysql import BIGINT

  ${column_name} = db.Column(BIGINT)
  ```

- Modify `./backend/configs.yaml`. Add your new task info to the `FILE` section of `INDIVIDUAL_SUBMISSION_INFO` and `LEADERBOARD_INFO`.
- (Optional) Add fields of your new task/metric for official models by modifying the `get_leaderboard_default()` function defined in `./backend/utils.py`.
- Append the `individual_submission_columnInfo` array in `./frontend/src/Data.js` with:

  ```js
  ${column_name}: {
    header: "${column_name}",
    width: 100,
    higherBetter: false,
    isScore: true,
    type: "number",
  },
  ```

- Append the `leaderboard_columnInfo` array in `./frontend/src/Data.js` with:

  ```js
  ${column_name}: {
    header: "${column_name}",
    width: 100,
    higherBetter: false,
    isScore: true,
    type: "number",
  },
  ```