1 change: 1 addition & 0 deletions README.md
@@ -1416,6 +1416,7 @@ search, and comprehensive file analysis.
- **[Voyp](https://github.com/paulotaylor/voyp-mcp)** - VOYP MCP server for making calls using Artificial Intelligence.
- **[vscode-ai-model-detector](https://github.com/thisis-romar/vscode-ai-model-detector)** - Real-time AI model detection for VS Code Copilot with 100% accuracy. Enables proper git attribution by identifying active models (Claude, GPT, Gemini) via Chat Participant API.
- **[vulnicheck](https://github.com/andrasfe/vulnicheck)** - Real-time Python package vulnerability scanner that checks dependencies against OSV and NVD databases, providing comprehensive security analysis with CVE details, lock file support, and actionable upgrade recommendations.
- **[WAIaaS](https://github.com/minhoyoo-iotrust/WAIaaS)** - Self-hosted wallet daemon for AI agents with 60 MCP tools. Multi-chain crypto operations (EVM + Solana): transfers, DeFi (swap, lend, stake, bridge, perp, yield), NFTs, smart contracts, transaction signing, and x402 payments. Session-based auth with spending policies. [![WAIaaS MCP server](https://glama.ai/mcp/servers/minhoyoo-iotrust/WAIaaS/badges/score.svg)](https://glama.ai/mcp/servers/minhoyoo-iotrust/WAIaaS)
- **[Wanaku MCP Router](https://github.com/wanaku-ai/wanaku/)** - An SSE-based MCP server providing an extensible routing engine for integrating your enterprise systems with AI agents.
- **[weather-mcp-server](https://github.com/devilcoder01/weather-mcp-server)** - Get real-time weather data for any location using weatherapi.
- **[Web Search MCP](https://github.com/mrkrsl/web-search-mcp)** - A server that provides full web search, summaries, and page extraction for use with local LLMs.