Microsoft's latest agentic model, Fara-7B, is now integrated into Magentic-UI:
- First install magentic-ui with the fara extras:
```bash
python3 -m venv .venv
source .venv/bin/activate
pip install magentic-ui[fara]
```
- In a separate process, serve the Fara-7B model using vLLM:
```bash
vllm serve "microsoft/Fara-7B" --port 5000 --dtype auto
```
- Then create a `fara_config.yaml` file with the following content:
```yaml
model_config_local_surfer: &client_surfer
  provider: OpenAIChatCompletionClient
  config:
    model: "microsoft/Fara-7B"
    base_url: http://localhost:5000/v1
    api_key: not-needed
    model_info:
      vision: true
      function_calling: true
      json_output: false
      family: "unknown"
      structured_output: false
      multiple_system_messages: false

orchestrator_client: *client_surfer
coder_client: *client_surfer
web_surfer_client: *client_surfer
file_surfer_client: *client_surfer
action_guard_client: *client_surfer
model_client: *client_surfer
```
Note: if you are hosting vLLM on a different port or host, change the `base_url` accordingly.
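The config relies on a YAML anchor (`&client_surfer`) so that every client entry reuses the same local Fara-7B model definition. As a quick sanity check before launching, you can parse the file and confirm the aliases all resolve to one shared object — a minimal sketch, assuming PyYAML is installed (`pip install pyyaml`), with the YAML inlined and abbreviated to two clients for self-containment:

```python
import yaml

# Abbreviated inline copy of fara_config.yaml for a self-contained check.
FARA_CONFIG = """
model_config_local_surfer: &client_surfer
  provider: OpenAIChatCompletionClient
  config:
    model: "microsoft/Fara-7B"
    base_url: http://localhost:5000/v1
    api_key: not-needed

orchestrator_client: *client_surfer
web_surfer_client: *client_surfer
"""

cfg = yaml.safe_load(FARA_CONFIG)

# YAML aliases resolve to the very same mapping as the anchored definition,
# so all clients share a single model configuration.
assert cfg["orchestrator_client"] is cfg["model_config_local_surfer"]
print(cfg["web_surfer_client"]["config"]["base_url"])  # http://localhost:5000/v1
```

Running this against your real file (`yaml.safe_load(open("fara_config.yaml"))`) catches indentation or anchor typos before Magentic-UI tries to load the config.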
- Then launch Magentic-UI with the fara agent:
```bash
magentic-ui --fara --port 8081 --config fara_config.yaml
```
- Finally, navigate to http://localhost:8081 to access the interface!
What's Changed
- Update link for 'Tell me When' feature by @husseinmozannar in #402
- [WIP] Fara-7B in Magentic-UI by @husseinmozannar in #448
Full Changelog: 0.1.5...v0.1.6