Chrome Now Lets You Turn AI Prompts Into Repeatable 'Skills'
Google is rolling out a Chrome feature called "Skills" that lets users save Gemini prompts as reusable one-click workflows they can run across multiple tabs. The feature also includes preset Skills from Google. It's launching first for Chrome desktop users whose language is set to US English. The Verge reports: Once you have access to the feature, it can be managed by typing a forward slash (/) in Gemini and clicking the compass icon. AI prompts can be saved as Skills directly from your Gemini chat history on desktop, where they'll then be available to reuse on any other desktop devices signed into the same Google account on Chrome.
The aim is to spare Chrome users from having to manually retype frequently used Gemini prompts or having to copy and paste them over from a saved list. Some of the Skills made by early testers include commands for calculating the nutritional information of online recipes and creating a side-by-side comparison of product specifications while shopping across multiple tabs, according to Google.
The company is also launching a library of preset Skills that you can save and use instead of making your own. These ready-to-use Skills can also be customized to better suit your needs, providing a starting point without requiring you to create your own from scratch.
oh god (Score:5, Insightful)
Re: (Score:2)
That, or a new market of software emerges to mitigate the problem
Re: (Score:2)
Or just use it to vote in those idiotic web-voting systems thousands of times: vote, clear cookies, refresh, vote again, rinse and repeat.
Re: (Score:2)
Hope they've got their LLM tools well-sandboxed off from the rest of the browser.
Re: (Score:2)
And they used the search key (Score:3)
I can remember to hit ^F but / was nicer.
Obligatory: You can ask it anything / Great, ask it to fuck off
Any good use case (Score:3)
Re: (Score:3, Insightful)
food examples
I regularly ask AI for the right temp and time to reheat things in my air fryer, so that seems like a great use of this function and associated compute resources.
Re: Any good use case (Score:1, Insightful)
How about turning copied text into plain ASCII so it posts cleanly here?
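A saved prompt isn't really needed for that particular chore; a minimal sketch of the idea in plain Python (the helper name and punctuation map are my own, not anything Chrome or Gemini provides):

```python
import unicodedata

# Map common "smart" punctuation to ASCII equivalents; anything else
# non-ASCII gets decomposed and dropped below.
PUNCT_MAP = str.maketrans({
    "\u2018": "'", "\u2019": "'",   # curly single quotes
    "\u201c": '"', "\u201d": '"',   # curly double quotes
    "\u2013": "-", "\u2014": "-",   # en and em dashes
    "\u2026": "...",                # ellipsis
    "\u00a0": " ",                  # non-breaking space
})

def to_plain_ascii(text: str) -> str:
    text = text.translate(PUNCT_MAP)
    # NFKD splits accented letters into base letter + combining mark,
    # so the ascii/ignore step keeps the base letter (café -> cafe).
    text = unicodedata.normalize("NFKD", text)
    return text.encode("ascii", "ignore").decode("ascii")
```

A deterministic two-liner like this does the job every time, with no model call at all.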
Re: (Score:2)
I'm struggling to think of any prompts I would need to re-run, much less prompts built into the browser.
IMO, that's one of the biggest issues here. If there were some fancy operation that more than a few people might run regularly (i.e., the included prompts), it'd likely be better served by a traditional application feature. As a dumb example, if there was no feature to search across all tabs, but they added a prompt to search across all tabs, that'd be stupid - just implement search across tabs and save all the LLM overhead.
I'm curious what features would be common enough to be included while also necessitating an LLM.
Re: Any good use case (Score:2)
Anthropic says (Score:2)
Great Idea; Terrible Execution (Score:2)
The idea of saving a "skill" is great. Instead of prompting the system to create a Bash/PowerShell script, you just save the script's output, and there's no need to re-prompt and recalculate everything.
But, this "skill" is just saving the prompt. The clanker still has to reprocess and recalculate everything. It seems very wasteful unless the output is dynamic enough to require recomputing.
Re: (Score:2)
What you are calling a skill is a script produced by an LLM (possibly a skill fed to an LLM), but that is not a skill.
Saving scripts is less useful than you think. The LLM will not, unless asked, try to produce something that works generically. It will create very fine-tuned things to solve the problem based on the example dataset it's got in front of it.
If you want to save off and run scripts, you want plugins.
Behold (Score:2)
Re: Behold (Score:2)
How reliable? (Score:2)
Last I heard, LLMs tended to give a different response each time you queried them.
Re: (Score:2)
Decoding strategies vary from stochastic (sampling) to greedy (deterministic), and are configurable per generated token (though usually just per prompt).
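The distinction above can be sketched in a few lines: at temperature 0 the decoder takes the argmax every time (deterministic), while above 0 it samples from a softmax over the logits. This is a toy illustration, not Gemini's actual decoder:

```python
import math
import random

def sample_token(logits: dict[str, float], temperature: float,
                 rng: random.Random) -> str:
    """Pick the next token; temperature 0 means greedy (deterministic)."""
    if temperature == 0:
        return max(logits, key=logits.get)  # argmax: same answer every time
    # Softmax with temperature, then draw from the distribution.
    scaled = {t: v / temperature for t, v in logits.items()}
    z = sum(math.exp(v) for v in scaled.values())
    probs = {t: math.exp(v) / z for t, v in scaled.items()}
    r = rng.random()
    acc = 0.0
    for tok, p in probs.items():
        acc += p
        if r <= acc:
            return tok
    return tok  # guard against floating-point rounding at the tail
```

Greedy decoding gives repeatable output for a fixed prompt and model; any nonzero temperature reintroduces run-to-run variation, which is why a re-run "Skill" can answer differently each time.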
LLMXSS (Score:4, Insightful)