An easier way to get local (any) models to "answer from this page instead". Goodbye Copilot Pro.
Not sure what the point of this extension is, given the availability of fully open-source alternatives. The only additional feature here is support for the OpenRouter API, which suggests this extension is affiliated with it. However, after a quick test I found that OpenRouter requires a positive credit balance ($) to operate, even for the free-to-use models.
Started giving me unknown errors after working fine for a week. Can't even figure out how to get to their website to check the FAQ.
Window AI is great!
This is dope!
Building model-agnostic LLM apps is revolutionary and important. On top of that, this extension makes the developer experience of integrating large language models into apps easy and fast.
The user and developer experience is great! As a user, I can try many web apps without worrying about my API keys getting stolen, and can try different AI models. As a developer, I can reference the window.ai global variable to call the user's AI model.
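To illustrate the developer experience the review above describes, here is a minimal sketch of calling the user's model through the `window.ai` global the extension injects. The exact `getCompletion` call shape is assumed for illustration; check the extension's documentation for the current API.

```javascript
// Hedged sketch: ask the user's own AI model via the window.ai global.
// Assumes a getCompletion({ messages }) call shape; treat as illustrative.
async function askUserModel(prompt) {
  // Feature-detect: window.ai only exists when the extension is installed.
  const ai = typeof window !== "undefined" && window.ai ? window.ai : null;
  if (!ai) {
    // No extension present: caller can fall back to its own backend.
    return null;
  }
  const output = await ai.getCompletion({
    messages: [{ role: "user", content: prompt }],
  });
  return output.message.content;
}
```

Because the app never sees an API key, the user's credentials stay inside the extension, which is the security benefit the review points to.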
If we're to have an LLM-connected internet, we can't expect people to paste their keys into insecure text fields, and we can't expect developers to front all inference costs. Window AI is the first real attempt at solving this problem.