Each player can be controlled by a multimodal model or a text-generating model.
### TextRobot
We send the LLM a text description of the screen. The LLM decides on the next moves its character will make. The next moves depend on its previous moves, the moves of its opponents, and its power and health bars (a sketch of such a description is shown below).
- Agent-based
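
As a rough illustration, here is a minimal sketch of what assembling such a text description could look like. The observation keys (`own_health`, `opponent_health`, and so on) are hypothetical placeholders, not the repository's actual game-state fields.

```python
# Hypothetical sketch: turning a game observation into a text prompt.
# The dictionary keys below are illustrative placeholders.
def describe_state(obs: dict) -> str:
    """Build a short textual description of the current frame."""
    return (
        f"Your health: {obs['own_health']}/100. "
        f"Opponent health: {obs['opponent_health']}/100. "
        f"Your power bar: {obs['own_power']}. "
        f"Your previous moves: {', '.join(obs['own_moves'])}. "
        f"Opponent's previous moves: {', '.join(obs['opponent_moves'])}. "
        "What moves do you play next?"
    )


print(describe_state({
    "own_health": 80,
    "opponent_health": 45,
    "own_power": 2,
    "own_moves": ["Move closer", "Medium punch"],
    "opponent_moves": ["Fireball"],
}))
```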
### VisionRobot
We send the LLM a screenshot of the current state of the game, specifying which character it is controlling. Its decision is based only on this visual information.
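
As an example of the general idea (not the repository's actual code), a frame could be encoded as a base64 PNG, which is the format most multimodal chat APIs accept for images:

```python
import base64
import io

import numpy as np
from PIL import Image


def encode_frame(frame: np.ndarray) -> str:
    """Encode an RGB frame as a base64 PNG string for a multimodal model."""
    image = Image.fromarray(frame)
    buffer = io.BytesIO()
    image.save(buffer, format="PNG")
    return base64.b64encode(buffer.getvalue()).decode("utf-8")


# Dummy black frame standing in for a real screenshot.
dummy_frame = np.zeros((224, 384, 3), dtype=np.uint8)
print(encode_frame(dummy_frame)[:32], "...")
```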
# Installation
- Follow the instructions at https://docs.diambra.ai/#installation
By default, it runs mistral against mistral. To use other models, you need to change the model settings in the example below:
```python
from eval.game import Game, Player1, Player2


def main():
    # Environment Settings
    game = Game(
        render=True,
        save_game=True,
        player_1=Player1(
            nickname="Baby",
            model="ollama:mistral",
            robot_type="text",  # vision or text
            temperature=0.7,
        ),
        player_2=Player2(
            nickname="Daddy",
            model="ollama:mistral",
            robot_type="text",
            temperature=0.7,
        ),
    )

    game.run()
    return 0


if __name__ == "__main__":
    main()
```
The convention we use is `model_provider:model_name`. If you want to use a local model other than Mistral, you can use `ollama:some_other_model`.
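
For example, the settings below are plausible variations (the specific model names are illustrative; any model you pick must actually be available from the corresponding provider, e.g. pulled locally with Ollama):

```python
from eval.game import Player1, Player2

# Illustrative slugs following the model_provider:model_name convention.
local_text_player = Player1(
    nickname="Baby",
    model="ollama:llama3",  # another local model served by Ollama
    robot_type="text",
    temperature=0.7,
)

local_vision_player = Player2(
    nickname="Daddy",
    model="ollama:llava",  # a multimodal model, paired with the vision robot
    robot_type="vision",
    temperature=0.7,
)
```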
## How to make my own LLM model play? Can I improve the prompts?
The LLM is called in the `TextRobot.call_llm()` and `VisionRobot.call_llm()` methods of the `agent/robot.py` file.
#### TextRobot method:
```python
def call_llm(
    self,
    max_tokens: int = 50,
    top_p: float = 1.0,
) -> Generator[ChatResponse, None, None]:
    """
    Make an API call to the language model.

    Edit this method to change the behavior of the robot!
    """
```
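
As a rough sketch of the kind of change you might make there, the helper below appends extra instructions to a system prompt before it is sent to the model. The base prompt and helper name are hypothetical placeholders; the repository builds its prompts with its own logic inside `call_llm()`.

```python
# Hypothetical sketch: customizing the instructions sent to the model.
BASE_SYSTEM_PROMPT = "You are an aggressive Street Fighter III player."  # placeholder

EXTRA_INSTRUCTIONS = (
    "If your health is below 30%, prefer defensive moves. "
    "Answer with a short list of move names only."
)


def customize_prompt(base_prompt: str, extra: str) -> str:
    """Return the base system prompt with extra instructions appended."""
    return f"{base_prompt}\n{extra}"


print(customize_prompt(BASE_SYSTEM_PROMPT, EXTRA_INSTRUCTIONS))
```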