Merge pull request #139 from stanford-crfm/jonathan/1013-weekly-assets
Jonathan/1013 weekly assets
Showing 20 changed files with 680 additions and 0 deletions.
@@ -0,0 +1,88 @@
---
- type: model
  name: Firefly Image 2
  organization: Adobe
  description: Firefly Image 2 is the next generation of generative AI for imaging, bringing significant advancements in creative control and quality, including new Text to Image capabilities now available in the Firefly web app, where 90% of users are new to Adobe products.
  created_date: 2023-10-10
  url: https://firefly.adobe.com/
  model_card: none
  modality: text; image
  analysis: ''
  size: unknown
  dependencies: []
  training_emissions: unknown
  training_time: unknown
  training_hardware: unknown
  quality_control: ''
  access: closed
  license: unknown
  intended_uses: creative generation of digital art and images
  prohibited_uses: AI/ML training; attempting to create abusive, illegal, or confidential content
  monitoring: ''
  feedback: ''

- type: model
  name: Firefly Vector
  organization: Adobe
  description: Firefly Vector is the world's first generative AI model focused on producing vector graphics, bringing Adobe's vector graphic and generative AI expertise directly into Adobe Illustrator workflows with Text to Vector Graphic.
  created_date: 2023-10-10
  url: https://firefly.adobe.com/
  model_card: none
  modality: text; vector graphic
  analysis: ''
  size: unknown
  dependencies: []
  training_emissions: unknown
  training_time: unknown
  training_hardware: unknown
  quality_control: ''
  access: closed
  license: unknown
  intended_uses: creative generation of digital art and images
  prohibited_uses: AI/ML training; attempting to create abusive, illegal, or confidential content
  monitoring: ''
  feedback: ''

- type: model
  name: Firefly Design
  organization: Adobe
  description: Firefly Design powers instant generation of high-quality template designs in Adobe Express with the new Text to Template capability.
  created_date: 2023-10-10
  url: https://firefly.adobe.com/
  model_card: none
  modality: text; template design
  analysis: ''
  size: unknown
  dependencies: []
  training_emissions: unknown
  training_time: unknown
  training_hardware: unknown
  quality_control: ''
  access: closed
  license: unknown
  intended_uses: creative generation of digital art and images
  prohibited_uses: AI/ML training; attempting to create abusive, illegal, or confidential content
  monitoring: ''
  feedback: ''

- type: application
  name: Firefly
  organization: Adobe
  description: Adobe Firefly is a standalone web application that offers new ways to ideate, create, and communicate while significantly improving creative workflows using generative AI.
  created_date: 2023-03-21
  url: https://firefly.adobe.com/
  dependencies: [Firefly Image 2, Firefly Vector, Firefly Design]
  adaptation: ''
  output_space: AI-generated creations
  quality_control: ''
  access: limited
  license: unknown
  terms_of_service: https://www.adobe.com/legal/licenses-terms/adobe-gen-ai-user-guidelines.html
  intended_uses: creative generation of digital art and images
  prohibited_uses: AI/ML training; attempting to create abusive, illegal, or confidential content
  monitoring: ''
  feedback: ''
  monthly_active_users: unknown
  user_distribution: unknown
  failures: unknown
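Since each asset file is a plain YAML list, the entries above can be loaded and sanity-checked with a few lines of Python. A minimal sketch, assuming PyYAML is installed; the file name and the required-field lists are illustrative assumptions, not part of this diff:

import yaml  # PyYAML

# Required fields per asset type, inferred from the entries above.
# This is an illustrative assumption, not a published schema.
REQUIRED = {
    "model": ["name", "organization", "description", "created_date", "url", "access"],
    "application": ["name", "organization", "description", "created_date", "url", "access"],
    "dataset": ["name", "organization", "description", "created_date", "url", "license"],
}

def check_file(path):
    """Load one asset YAML file (a top-level list) and report missing fields."""
    with open(path) as f:
        entries = yaml.safe_load(f)
    for entry in entries:
        required = REQUIRED.get(entry.get("type"), [])
        missing = [field for field in required if field not in entry]
        if missing:
            print(f"{entry.get('name', '?')}: missing {missing}")

check_file("adobe.yaml")  # hypothetical file name for the Firefly entries above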
@@ -0,0 +1,22 @@
---
- type: model
  name: Baichuan 2
  organization: Baichuan Inc.
  description: Baichuan 2 is a series of large-scale multilingual language models containing 7 billion and 13 billion parameters, trained from scratch on 2.6 trillion tokens.
  created_date: 2023-09-20
  url: https://arxiv.org/pdf/2309.10305.pdf
  model_card: none
  modality: text; text
  analysis: Evaluated on public benchmarks like MMLU, CMMLU, GSM8K, and HumanEval.
  size: 13B parameters (dense)
  dependencies: []
  training_emissions: unknown
  training_time: unknown
  training_hardware: 1024 NVIDIA A800 GPUs
  quality_control: ''
  access: open
  license: unknown
  intended_uses: ''
  prohibited_uses: ''
  monitoring: none
  feedback: https://huggingface.co/Xwin-LM/Xwin-LM-70B-V0.1/discussions
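The modality field appears to pack input and output modalities into one string separated by a semicolon (text; text here, text; image for Firefly Image 2 above). A small helper under that assumed convention:

def split_modality(modality: str) -> tuple[list[str], list[str]]:
    """Split an 'inputs; outputs' modality string into its two sides,
    assuming the semicolon convention seen in the entries above."""
    inputs, _, outputs = modality.partition(";")
    def parse(side: str) -> list[str]:
        return [m.strip() for m in side.split(",") if m.strip()]
    return parse(inputs), parse(outputs)

print(split_modality("text; text"))   # (['text'], ['text'])
print(split_modality("text; image"))  # (['text'], ['image'])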
@@ -0,0 +1,64 @@
---
- type: dataset
  name: ToyMix
  organization: Mila - Quebec AI Institute
  description: ToyMix is the smallest of three extensive, meticulously curated multi-label datasets that together cover nearly 100 million molecules and over 3000 sparsely defined tasks.
  created_date: 2023-10-09
  url: https://arxiv.org/pdf/2310.04292.pdf
  datasheet: none
  modality: molecules, tasks
  size: 13B labels of quantum and biological nature
  sample: []
  analysis: Models of size 150k parameters trained on ToyMix and compared to models trained on its dependencies across GNN baselines.
  dependencies: [QM9, TOX21, ZINC12K]
  included: ''
  excluded: ''
  quality_control: ''
  access: open
  license: CC BY-NC-SA 4.0
  intended_uses: The datasets are intended to be used in an academic setting for training molecular GNNs with orders of magnitude more parameters than current large models. Further, the ToyMix dataset is intended to be used in a multi-task setting, meaning that a single model should be trained to predict all of its tasks simultaneously.
  prohibited_uses: none
  monitoring: none
  feedback: none

- type: dataset
  name: LargeMix
  organization: Mila - Quebec AI Institute
  description: LargeMix is the mid-sized of three extensive, meticulously curated multi-label datasets that together cover nearly 100 million molecules and over 3000 sparsely defined tasks.
  created_date: 2023-10-09
  url: https://arxiv.org/pdf/2310.04292.pdf
  datasheet: none
  modality: molecules, tasks
  size: 13B labels of quantum and biological nature
  sample: []
  analysis: Models of size between 4M and 6M parameters trained for 200 epochs on LargeMix and compared to models trained on its dependencies across GNN baselines.
  dependencies: [L1000 VCAP, L1000 MCF7, PCBA1328, PCQM4M_G25_N4]
  included: ''
  excluded: ''
  quality_control: ''
  access: open
  license: CC BY-NC-SA 4.0
  intended_uses: The datasets are intended to be used in an academic setting for training molecular GNNs with orders of magnitude more parameters than current large models. Further, the LargeMix dataset is intended to be used in a multi-task setting, meaning that a single model should be trained to predict all of its tasks simultaneously.
  prohibited_uses: none
  monitoring: none
  feedback: none

- type: dataset
  name: UltraLarge
  organization: Mila - Quebec AI Institute
  description: UltraLarge is the largest of three extensive, meticulously curated multi-label datasets that together cover nearly 100 million molecules and over 3000 sparsely defined tasks.
  created_date: 2023-10-09
  url: https://arxiv.org/pdf/2310.04292.pdf
  datasheet: none
  modality: molecules, tasks
  size: 13B labels of quantum and biological nature
  sample: []
  analysis: Models of size between 4M and 6M parameters trained for 50 epochs on UltraLarge and compared to models trained on its dependencies across GNN baselines.
  dependencies: [PM6_83M]
  included: ''
  excluded: ''
  quality_control: ''
  access: open
  license: CC BY-NC-SA 4.0
  intended_uses: The datasets are intended to be used in an academic setting for training molecular GNNs with orders of magnitude more parameters than current large models.
  prohibited_uses: none
  monitoring: none
  feedback: none
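The dependencies fields are what tie these entries into an ecosystem graph: the Firefly application depends on the three Firefly models, and ToyMix depends on QM9, TOX21, and ZINC12K. A sketch of collecting those edges into an adjacency list, assuming the same PyYAML loading as above and hypothetical file names:

import yaml  # PyYAML

def dependency_graph(paths):
    """Map each asset name to the names it depends on, collected
    across one or more asset YAML files."""
    graph = {}
    for path in paths:
        with open(path) as f:
            for entry in yaml.safe_load(f) or []:
                graph[entry["name"]] = entry.get("dependencies") or []
    return graph

# Hypothetical file names for two of the files in this commit.
graph = dependency_graph(["adobe.yaml", "graphium_datasets.yaml"])
print(graph["Firefly"])  # ['Firefly Image 2', 'Firefly Vector', 'Firefly Design']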