that sounds pretty rad… how do you do it? is it just prompts all the way down?
It’s a combination of prompts and rails for the model to follow. The tool keeps a code graph in memory, parsed with tree-sitter, and then I use Prolog to reason about the graph: finding related nodes, checking whether a node is connected to the rest of the graph, and so on. So the model ends up being called as the tool walks the graph mechanically, and when the implementor is invoked it basically works as it normally would. The tool offers it an MCP-like interface where the model can call functions to look at code, run tests, etc., but it never gets to run any system commands itself; it only works against the API the tool exposes to it.
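To make the graph-reasoning part concrete: the original uses tree-sitter for parsing and Prolog for the queries, but the two queries named above (related nodes, connectivity) can be sketched in plain Python over an adjacency map. Everything here is hypothetical illustration — the class name, edge semantics, and example node names are assumptions, not the tool's actual API.

```python
from collections import deque

class CodeGraph:
    """Toy stand-in for the in-memory code graph (assumed structure:
    undirected edges between code entities, e.g. caller/callee)."""

    def __init__(self):
        self.edges = {}  # node -> set of neighbouring nodes

    def add_edge(self, a, b):
        self.edges.setdefault(a, set()).add(b)
        self.edges.setdefault(b, set()).add(a)

    def related(self, node):
        # "finding related nodes": one hop in the graph
        return sorted(self.edges.get(node, set()))

    def connected(self, a, b):
        # "is this node connected to the rest of the graph":
        # plain BFS reachability check
        seen, frontier = {a}, deque([a])
        while frontier:
            cur = frontier.popleft()
            if cur == b:
                return True
            for nxt in self.edges.get(cur, ()):
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append(nxt)
        return False

# Hypothetical example nodes
g = CodeGraph()
g.add_edge("parse_config", "load_file")
g.add_edge("load_file", "open_path")
g.add_edge("render_help", "render_help")  # node with no links to the rest

print(g.related("load_file"))                      # → ['open_path', 'parse_config']
print(g.connected("parse_config", "open_path"))    # → True
print(g.connected("parse_config", "render_help"))  # → False
```

In the actual tool these would presumably be Prolog predicates over facts asserted from the tree-sitter parse (something like `edge/2` plus a recursive `connected/2`); the Python above just shows that the reasoning layer is ordinary graph traversal, which is exactly the kind of mechanical walking the model never has to do itself.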