this post was submitted on 28 Apr 2024
381 points (96.8% liked)
BecomeMe
805 readers
Social Experiment. Become Me. What I see, you see.
founded 1 year ago
This is the best summary I could come up with:
It's an open secret that Microsoft is gearing up to supercharge Windows 11 this summer with next-gen AI capabilities that will enable the OS to be context-aware across any app and interface, as well as remember everything you do on your PC to enhance productivity and search.
These new capabilities are set to ship as part of a new app internally called "AI Explorer," which I'm told will be unveiled during Microsoft's special Windows event on May 20.
The feature is also said to be exclusive to devices powered by Qualcomm's upcoming Snapdragon X series chips, at least at first, as Intel and AMD play catch-up in the NPU race.
AI Explorer does more than just remember the things you do on your computer; it can also analyze what's currently on-screen and provide contextual suggestions and tasks based on what it sees.
This capability is called Screen Understanding, and I'm told one of the big selling points of AI Explorer is that it's supposed to work across any app, with no developer input required.
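To make the "works across any app" idea concrete, here is a minimal, purely illustrative sketch of that kind of pipeline: grab the screen, OCR whatever is visible, and derive simple suggestions from the text. The library choices (mss, pytesseract) and the suggestion rules are my own assumptions for illustration, not how Microsoft's Screen Understanding actually works.

```python
# Hypothetical sketch: capture the screen, OCR it, and suggest actions
# based on whatever text happens to be visible, regardless of which app drew it.
import re
from mss import mss
from PIL import Image
import pytesseract

def capture_screen() -> Image.Image:
    # Grab the primary monitor and convert the raw pixels into a PIL image.
    with mss() as grabber:
        shot = grabber.grab(grabber.monitors[1])
        return Image.frombytes("RGB", shot.size, shot.rgb)

def suggest_actions(visible_text: str) -> list[str]:
    # Toy rules standing in for a real on-device model's contextual suggestions.
    suggestions = []
    if re.search(r"\b\d{1,2}:\d{2}\s?(AM|PM)?\b", visible_text, re.I):
        suggestions.append("Create a calendar event for the time shown on screen")
    if re.search(r"[\w.+-]+@[\w-]+\.\w+", visible_text):
        suggestions.append("Draft an email to the address shown on screen")
    return suggestions

if __name__ == "__main__":
    text = pytesseract.image_to_string(capture_screen())
    print(suggest_actions(text))
```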
The existence of Rewind.ai proves the concept is viable, and Microsoft is essentially building its own version into Windows 11, offloading the processing required for such a feature onto NPUs to keep the load off the CPU.
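As a rough illustration of what "offloading onto the NPU" can look like in practice, here is a small sketch using ONNX Runtime execution providers to prefer a Qualcomm NPU and fall back to CPU. The model filename is a placeholder, and this is only one plausible way to target an NPU, not a description of Microsoft's actual implementation.

```python
# Hypothetical sketch: route an on-device model to the NPU when available,
# so continuous screen analysis doesn't tie up the CPU.
import onnxruntime as ort

def load_screen_model(model_path: str = "screen_understanding.onnx") -> ort.InferenceSession:
    # Prefer the Qualcomm NPU (QNN), then DirectML, then plain CPU.
    preferred = ["QNNExecutionProvider", "DmlExecutionProvider", "CPUExecutionProvider"]
    available = ort.get_available_providers()
    providers = [p for p in preferred if p in available]
    return ort.InferenceSession(model_path, providers=providers)

session = load_screen_model()
print("Running on:", session.get_providers()[0])
```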
The original article contains 1,076 words, the summary contains 225 words. Saved 79%. I'm a bot and I'm open source!