DAE is an open, vendor-neutral security specification for AI Agents that formally separates reasoning from authority.
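The description above is all the source states about DAE's design; the spec's actual mechanics are not shown here. As a purely hypothetical sketch (not the DAE specification itself, and all names below are invented for illustration), the general idea of separating reasoning from authority can be pictured as a planner that only *proposes* actions, with a distinct policy layer that decides whether each proposal may execute:

```python
# Hypothetical illustration of reasoning/authority separation.
# None of these classes or names come from the DAE spec.
from dataclasses import dataclass

@dataclass(frozen=True)
class ProposedAction:
    name: str
    target: str

class AuthorityPolicy:
    """Allow-list policy: the reasoning layer cannot grant itself new powers."""
    def __init__(self, allowed: set[str]):
        self.allowed = allowed

    def authorize(self, action: ProposedAction) -> bool:
        return action.name in self.allowed

def reasoning_layer(goal: str) -> ProposedAction:
    # Stand-in for an LLM planner: it only proposes actions, it never executes.
    return ProposedAction(name="read_file", target=goal)

def execute(action: ProposedAction, policy: AuthorityPolicy) -> str:
    # Authority is checked outside the reasoning layer before anything runs.
    if not policy.authorize(action):
        return f"DENIED: {action.name}"
    return f"EXECUTED: {action.name} on {action.target}"

policy = AuthorityPolicy(allowed={"read_file"})
print(execute(reasoning_layer("report.txt"), policy))
print(execute(ProposedAction("delete_file", "/"), policy))
```

The design point the sketch illustrates is that the component doing open-ended reasoning never holds execution authority directly; an auditable, deterministic layer sits between proposal and effect.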
Report bugs or request features on the dae issue tracker:
Open GitHub Issues