Cognee
Open-source knowledge engine that transforms raw data into persistent, dynamic AI memory — combines vector search, graph databases, and relational storage. Includes API server and a direct-mode MCP server for IDE integration.
harbor build cognee
harbor up cognee

harbor config search <query>
A built-in search over config fields and values to supplement the native grep workflow:
$ h config search cache
hf.cache /home/everlier/.cache/huggingface
llamacpp.cache ~/.cache/llama.cpp
ollama.cache ~/.ollama
vllm.cache ~/.cache/vllm
txtai.cache ~/.cache/txtai
nexa.cache ~/.cache/nexa
parllama.cache ~/.parllama
lmeval.cache ./lmeval/cache

harbor config <service>
You can now manage a given service's environment variables directly with harbor config, very similar in functionality to harbor env.
$ h config ollama ls
OLLAMA_CONTEXT_LENGTH 16384
OLLAMA_NUM_PARALLEL 2
OLLAMA_ORIGINS *
$ h config ollama get ollama.context_length
16384

The command supports the same naming convention as harbor config.
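The convention visible in the example above maps an environment variable like OLLAMA_CONTEXT_LENGTH to the config key ollama.context_length. A minimal sketch of that mapping, assuming the first underscore-delimited token is the service prefix (this is an illustrative helper inferred from the examples, not Harbor's actual implementation):

```python
def env_to_config_key(env_name: str) -> str:
    """Sketch of the presumed convention: split on the first underscore,
    treat the left part as the service prefix and the rest as the field,
    lowercase both and join with a dot."""
    service, _, field = env_name.partition("_")
    return f"{service.lower()}.{field.lower()}"

# env_to_config_key("OLLAMA_CONTEXT_LENGTH") -> "ollama.context_length"
```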
Misc
- harbor size: skip opencode workspaces
- harbor up: print URLs after start
- fix "open" in Harbor App on Linux
- CLI suggestions after typo in a command
Full Changelog: v0.4.2...v0.4.3