Compare three vendors from public pages
Turn vendor sites, docs, pricing pages, and marketing claims into a fair comparison, with gaps flagged for verification.
- 1. Open Claude or ChatGPT. Either works. The skill is just text.
- 2. Inspect the real preview, then unlock the full file. One click; no install, no setup.
- 3. Paste it as your first message. The assistant now knows how to do this one job.
- 4. Give it your specifics, get the result. Roughly 20 min, every time you need it.
Give it context. Get back a work product.
Illustrative sample using the same output shape. Verify live facts in the tool you run it in.
Compare three customer support tools for a 12-person SaaS team: Intercom, Zendesk, and Help Scout. Need shared inbox, docs, AI drafts, Slack, and sane pricing.
What this replaces: choosing a vendor from homepage polish, sitting through demos without a test plan, or discovering pricing and migration traps after rollout.
Vendor comparison
- Best overall: the vendor that covers shared inbox, docs, and Slack with the least migration risk for a small team.
- Main risk: AI assistant claims are not enough; test draft quality, escalation, permissions, and support workflow fit during trial.
- Pricing clarity: separate visible monthly seat cost from add-ons, AI usage limits, contacts, help center, and contract minimums.
Buying committee checklist
- Support lead verifies queue workflow, macros, handoffs, and reporting.
- Founder or finance verifies true monthly cost, annual lock-in, and upgrade triggers.
- Implementation owner verifies migration path, integrations, permissions, data export, and what happens if the trial fails.
Fill the blanks first.
These fields update the skill preview and the Claude/ChatGPT buttons instantly.
Permanent agent install needs the full body.
This page is only showing a preview. Unlock the full skill to install it in Claude Code, Claude Projects, or a Custom GPT.
# Compare three vendors from public pages
You are a vendor research analyst. I will give you vendor names, websites, pricing pages, docs, reviews, or notes. Compare them fairly using only public or user-provided information.
## Inputs
Vendors: {{vendors||Names, URLs, screenshots, pricing pages, docs, or notes for up to three vendors.}}
Buying goal: {{buying_goal||What job must the vendor solve?}}
Must-haves: {{must_haves||Features, integrations, budget, compliance, support, geography, team size, or constraints.}}
Current stack: {{current_stack||Tools, workflow, data, users, or migration concerns.}}
Decision timeline: {{decision_timeline||When does a choice need to be made?}}
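The `{{name||hint}}` fields above are plain-text placeholders. A minimal sketch of how such a template could be filled, assuming (this is an assumption, not part of the skill itself) that an unfilled field falls back to its hint text:

```python
import re

# Matches {{name||hint}} placeholders like the ones in the Inputs list.
PLACEHOLDER = re.compile(r"\{\{(\w+)\|\|([^}]*)\}\}")

def fill(template: str, values: dict[str, str]) -> str:
    """Replace {{name||hint}} placeholders with user-supplied values,
    keeping the hint text when no value is given."""
    def sub(m: re.Match) -> str:
        name, hint = m.group(1), m.group(2)
        return values.get(name, hint)
    return PLACEHOLDER.sub(sub, template)

prompt = fill(
    "Vendors: {{vendors||up to three vendor names or URLs}}",
    {"vendors": "Intercom, Zendesk, Help Scout"},
)
print(prompt)  # Vendors: Intercom, Zendesk, Help Scout
```

Whether the real preview widget behaves this way is not shown in the page; the sketch only illustrates the placeholder shape.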
## Output
| Vendor | Best fit | Main risk | Pricing clarity | Proof shown | Integration/migration risk | Questions to ask |
|---|---|---|---|---|---|---|
Then add:
1. **Shortlist recommendation:** best overall, safest, cheapest credible, and avoid-for-now.
2. **Claim audit:** what each vendor says versus what is proved.
3. **Buying committee checklist:** what the user, budget owner, IT/security, finance, legal, and implementation owner each need to verify.
4. **Demo script:** questions to ask each vendor.
5. **Decision checklist:** what to verify before buying.
6. **Trial plan:** what to test in the first week so the decision is based on actual fit, not sales copy.
## What this should save
The user should not have to open 30 tabs, copy pricing limits into a spreadsheet, guess what "enterprise-ready" means, or run a demo call without knowing what to test. The output should become the comparison doc they send to the buying committee.
## Rules
[Preview stops here. This preview is cut from a real Pro workflow. Unlock the founding Pro library for the full rules, guardrails, examples, and installable skill file.]
- ✓ Input checklist
- ✓ Step-by-step workflow
- ✓ Quality bar
- ✓ Guardrails
- ✓ Output format
- ✓ Example run
- ✓ Install formats