Commit b458b5b

Add code models with blank templates.
1 parent 389206c commit b458b5b

19 files changed: +1900 −0 lines

AMD Llama Code.yaml

Lines changed: 100 additions & 0 deletions
@@ -0,0 +1,100 @@
---
# Thank you for contributing!
# In filling out this yaml file, please follow the criteria as described here:
# https://github.com/opening-up-chatgpt/opening-up-chatgpt.github.io/tree/main/projects#criteria

# You're free to build on this work and reuse the data. It is licensed under CC-BY 4.0, with the
# stipulation that attribution should come in the form of a link to http://opening-up-chatgpt.github.io
# and a citation to the paper in which the initial dataset & criteria were published:

# Liesenfeld, Andreas, Alianda Lopez, and Mark Dingemanse. 2023. “Opening up ChatGPT: Tracking Openness, Transparency, and Accountability in Instruction-Tuned Text Generators.” In CUI '23: Proceedings of the 5th International Conference on Conversational User Interfaces. July 19-21, Eindhoven. doi: 10.1145/3571884.3604316

system:
  name: AMD Llama Code
  link: https://huggingface.co/amd/AMD-Llama-135m-code
  type: code
  performanceclass:
  basemodelname: AMD-Llama-135m
  endmodelname: AMD-Llama-135m-code
  endmodellicense: Apache-2.0
  releasedate:
  notes: Very tiny coder model, useful for speculative decoding.

org:
  name: AMD
  link: https://huggingface.co/amd
  notes:

# availability:
datasources_basemodel:
  class: closed
  link:
  notes:

datasources_endmodel:
  class: closed
  link:
  notes:

weights_basemodel:
  class: closed
  link:
  notes:

weights_endmodel:
  class: closed
  link:
  notes:

trainingcode:
  class: closed
  link:
  notes:

# documentation:
code:
  class: closed
  link:
  notes:

architecture:
  class: closed
  link:
  notes:

preprint:
  class: closed
  link:
  notes:

paper:
  class: closed
  link:
  notes:

modelcard:
  class: closed
  link:
  notes:

datasheet:
  class: closed
  link:
  notes:

# access:
package:
  class: closed
  link:
  notes:

api:
  class: closed
  link:
  notes:
metaprompt: closed

licenses:
  class: closed
  link:
  notes:

AlchemistCoder.yaml

Lines changed: 100 additions & 0 deletions
@@ -0,0 +1,100 @@
---
# Thank you for contributing!
# In filling out this yaml file, please follow the criteria as described here:
# https://github.com/opening-up-chatgpt/opening-up-chatgpt.github.io/tree/main/projects#criteria

# You're free to build on this work and reuse the data. It is licensed under CC-BY 4.0, with the
# stipulation that attribution should come in the form of a link to http://opening-up-chatgpt.github.io
# and a citation to the paper in which the initial dataset & criteria were published:

# Liesenfeld, Andreas, Alianda Lopez, and Mark Dingemanse. 2023. “Opening up ChatGPT: Tracking Openness, Transparency, and Accountability in Instruction-Tuned Text Generators.” In CUI '23: Proceedings of the 5th International Conference on Conversational User Interfaces. July 19-21, Eindhoven. doi: 10.1145/3571884.3604316

system:
  name: AlchemistCoder
  link: https://huggingface.co/internlm/AlchemistCoder-DS-6.7B
  type: code
  performanceclass:
  basemodelname: DeepSeek-Coder-6.7B-Base
  endmodelname: AlchemistCoder-DS-6.7B
  endmodellicense: Apache-2.0
  releasedate:
  notes: Multiple versions exist with different base models.

org:
  name: Shanghai AI Laboratory
  link: https://huggingface.co/internlm
  notes:

# availability:
datasources_basemodel:
  class: closed
  link:
  notes:

datasources_endmodel:
  class: closed
  link:
  notes:

weights_basemodel:
  class: closed
  link:
  notes:

weights_endmodel:
  class: closed
  link:
  notes:

trainingcode:
  class: closed
  link:
  notes:

# documentation:
code:
  class: closed
  link:
  notes:

architecture:
  class: closed
  link:
  notes:

preprint:
  class: closed
  link:
  notes:

paper:
  class: closed
  link:
  notes:

modelcard:
  class: closed
  link:
  notes:

datasheet:
  class: closed
  link:
  notes:

# access:
package:
  class: closed
  link:
  notes:

api:
  class: closed
  link:
  notes:
metaprompt: closed

licenses:
  class: closed
  link:
  notes:

AquilaCode.yaml

Lines changed: 100 additions & 0 deletions
@@ -0,0 +1,100 @@
---
# Thank you for contributing!
# In filling out this yaml file, please follow the criteria as described here:
# https://github.com/opening-up-chatgpt/opening-up-chatgpt.github.io/tree/main/projects#criteria

# You're free to build on this work and reuse the data. It is licensed under CC-BY 4.0, with the
# stipulation that attribution should come in the form of a link to http://opening-up-chatgpt.github.io
# and a citation to the paper in which the initial dataset & criteria were published:

# Liesenfeld, Andreas, Alianda Lopez, and Mark Dingemanse. 2023. “Opening up ChatGPT: Tracking Openness, Transparency, and Accountability in Instruction-Tuned Text Generators.” In CUI '23: Proceedings of the 5th International Conference on Conversational User Interfaces. July 19-21, Eindhoven. doi: 10.1145/3571884.3604316

system:
  name: AquilaCode
  link: https://huggingface.co/BAAI/AquilaCode-multi
  type: code
  performanceclass:
  basemodelname: Aquila-7B
  endmodelname: AquilaCode-multi
  endmodellicense: BAAI Aquila Model Licence Agreement
  releasedate:
  notes: Bilingual (English-Chinese) language model.

org:
  name: Beijing Academy of Artificial Intelligence
  link: https://huggingface.co/BAAI
  notes:

# availability:
datasources_basemodel:
  class: closed
  link:
  notes:

datasources_endmodel:
  class: closed
  link:
  notes:

weights_basemodel:
  class: closed
  link:
  notes:

weights_endmodel:
  class: closed
  link:
  notes:

trainingcode:
  class: closed
  link:
  notes:

# documentation:
code:
  class: closed
  link:
  notes:

architecture:
  class: closed
  link:
  notes:

preprint:
  class: closed
  link:
  notes:

paper:
  class: closed
  link:
  notes:

modelcard:
  class: closed
  link:
  notes:

datasheet:
  class: closed
  link:
  notes:

# access:
package:
  class: closed
  link:
  notes:

api:
  class: closed
  link:
  notes:
metaprompt: closed

licenses:
  class: closed
  link:
  notes:

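All three templates share the same flat, two-level shape (top-level sections such as `system`, `org`, `weights_endmodel`, each holding `key: value` pairs). As a sketch only, not part of the commit: downstream tooling would normally read these files with PyYAML (`yaml.safe_load`), but the simple structure can also be parsed with the standard library alone. The `TEMPLATE` snippet and `parse_template` helper below are illustrative assumptions, handling only this two-level shape (no nested lists, no multi-line values).

```python
# Minimal sketch of reading one of these template files without external
# dependencies. Handles only the flat two-level structure used here:
# top-level "section:" headers with indented "key: value" lines.

TEMPLATE = """\
system:
  name: AMD Llama Code
  type: code
weights_endmodel:
  class: closed
  link:
  notes:
"""

def parse_template(text):
    """Parse top-level sections and their indented key/value pairs."""
    data, section = {}, None
    for line in text.splitlines():
        # Skip blanks, comments, and the YAML document marker.
        if not line.strip() or line.lstrip().startswith("#") or line.strip() == "---":
            continue
        key, _, value = line.partition(":")
        if line.startswith(" "):
            # Indented line: field inside the current section; empty values
            # (blank template fields) become None.
            data[section][key.strip()] = value.strip() or None
        else:
            # Unindented line: start a new top-level section.
            section = key.strip()
            data[section] = {}
    return data

record = parse_template(TEMPLATE)
print(record["system"]["name"])             # AMD Llama Code
print(record["weights_endmodel"]["class"])  # closed
```

For real use, `yaml.safe_load` is the safer choice, since it also covers quoting, nesting, and multi-line values that this hand-rolled reader ignores.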
