Your AI Workshop.
What to Expect & How to Prepare.
This guide is for you and your team to review before the workshop. The more prepared you are, the more we can accomplish together — and the better the prototype we build for you on day two.
What this workshop is.
The AI Workshop is a focused 1–2 day session where we work alongside your team to map out where time is actually being lost, identify which of those problems AI can solve well, and build the first version of something real.
You will not leave with a slide deck or a strategy document. You will leave with a working prototype — something your team can use the following Monday.
The workshop is not about AI in general. It is about your workflows, your tools, your team's specific pain points. Everything we build is specific to you.
What this workshop is not:
- ✕A training on how to use ChatGPT
- ✕A generic AI overview or keynote
- ✕A requirements-gathering exercise with no output
- ✕A commitment to a full implementation project
- ✕A replacement for your existing software or team
What to bring and think about in advance.
Spend 30–60 minutes as a team working through these questions before the workshop. You don't need polished answers — rough notes and honest observations are more useful than a prepared presentation.
1 — Your current tools and integrations
List every software tool your team uses regularly. Don't overthink it — a rough list is fine.
- —CRM or client management (e.g. HubSpot, Salesforce, spreadsheet)
- —Project management (e.g. Asana, Monday, Notion)
- —Communication (e.g. Gmail, Outlook, Slack, Teams)
- —Scheduling (e.g. Calendly, Google Calendar)
- —Accounting / invoicing (e.g. QuickBooks, FreshBooks)
- —Document storage (e.g. Google Drive, SharePoint, Dropbox)
- —Industry-specific software (e.g. practice management, ERP)
- —Forms and intake tools
- —Any automations already in place (Zapier, Make, etc.)
What to bring:
- □A written list of every tool your team uses day-to-day
- □Login access or a demo account for 1–2 key tools
- □Any API documentation or integration guides your software provides
- □Notes on which tools talk to each other (or don't)
2 — Shortcomings and friction points
These are the moments in your workday where you think “there has to be a better way.”
- —Which tasks take longer than they should and feel repetitive?
- —Where does information get lost or have to be re-entered in multiple places?
- —What do new hires struggle to learn that experienced staff just know?
- —Where do errors or mistakes most commonly happen, and why?
- —What do you wish your software did that it doesn't?
- —Which handoffs between people or systems cause delays or confusion?
- —What do you spend time on that doesn't require your expertise?
3 — Your AI opportunity hypothesis
You don't need to be right about this — we'll refine it together. Just write down your gut instinct.
- —If you had to guess, where do you think AI could help your team the most?
- —Is there a specific task you've already tried to use AI for, with or without success?
- —Is there a process where you've thought "this is basically copy-paste work" and wondered if it could be automated?
- —Are there decisions your team makes repeatedly that follow a similar pattern each time?
- —Is there information locked in documents, emails, or someone's head that should be more accessible?
4 — Knowledge base and reference materials
Agents work best when they have access to your specific knowledge — not just generic information. Think about what a new hire would need to read to get up to speed.
- —SOPs (standard operating procedures) for common tasks
- —Pricing sheets, rate cards, or service descriptions
- —Frequently asked questions from clients or customers
- —Email templates or response scripts your team reuses
- —Decision trees or escalation rules ("if X, do Y")
- —Compliance rules, regulations, or policies that govern your work
- —Product or service knowledge documents
- —Common objections and how your team handles them
What to bring:
- □Any existing SOPs or process documents (even rough ones)
- □3–5 example emails or messages your team sends regularly
- □Any scripts, templates, or checklists your team currently uses
- □Sample inputs and outputs for a process you want to automate (e.g., a before/after document)
5 — Who should be in the room
The workshop works best with a small, focused group. You don't need everyone — just the right people.
- —1–2 people who do the actual day-to-day work in the area we're targeting (not just managers)
- —Someone with decision-making authority who can approve the direction we choose
- —Optionally: the person who manages the tools or software your team uses
Examples of A+ workshop outputs.
These are representative examples of what a well-prepared workshop can produce. Your outputs will be specific to your workflows and tools — these are patterns, not templates.
Use case: A roofing company spends 2–3 hours per week answering the same initial questions from prospects before a sales call can even be booked.
Example output: a lead-intake agent with instructions like these:
Your job is to collect the following from new inquiries:
— Property address and type (residential/commercial)
— Nature of the issue (leak, damage, replacement, inspection)
— Urgency (immediate, within 2 weeks, planning ahead)
— How they heard about us
Rules:
— Never quote prices. Say: "We'll confirm pricing on the call."
— If they describe an active leak, flag as urgent and escalate.
— Always end with a Calendly link to book a 15-min consult.
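To show how rules like these translate into something testable, here is a minimal Python sketch. The field names, the leak check, and the Calendly URL are illustrative assumptions, not a real implementation — an actual build would sit behind a form or chat agent.

```python
# Hypothetical sketch: field names, rule logic, and the booking
# link below are illustrative placeholders.
CALENDLY_LINK = "https://calendly.com/example/15min-consult"  # placeholder URL

REQUIRED_FIELDS = ["address", "property_type", "issue", "urgency", "referral_source"]

def triage_inquiry(inquiry: dict) -> dict:
    """Apply the intake rules above to one collected inquiry."""
    missing = [f for f in REQUIRED_FIELDS if not inquiry.get(f)]
    # Rule: an active leak is always urgent and escalates to a human.
    escalate = "leak" in inquiry.get("issue", "").lower()
    urgency = "immediate" if escalate else inquiry.get("urgency", "unknown")
    # Rule: never quote prices; always close with the booking link.
    reply = f"We'll confirm pricing on the call. Book a 15-min consult: {CALENDLY_LINK}"
    return {"missing_fields": missing, "escalate": escalate,
            "urgency": urgency, "reply": reply}

result = triage_inquiry({
    "address": "12 Oak St", "property_type": "residential",
    "issue": "Active leak over the kitchen", "urgency": "immediate",
    "referral_source": "Google",
})
```

The point of the sketch: every rule your team writes down in plain English becomes a line the agent follows every time, with no judgment calls left to chance.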
Use case: Tax preparers spend the first hour of every client file manually extracting information from uploaded documents before they can begin the actual preparation work.
- —Internal checklist of required documents by filing type (W-2 only, self-employed, rental income, etc.)
- —Common data extraction fields by form type (W-2, 1099-NEC, 1099-INT, K-1)
- —Firm-specific rules: which fields get flagged for senior review
- —Client communication templates: "We're missing your [X]" request emails
From those inputs, the workshop output produces:
- —Auto-populated intake summary: client name, filing status, income sources identified
- —Missing document checklist generated per client, with draft request email
- —Data extracted to a structured format that pre-fills the preparer's software
- —Red flags surfaced (e.g., income discrepancy vs. prior year) for preparer review
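To make the missing-document check concrete, here's a rough Python sketch. The filing types, required-document lists, and email wording are illustrative assumptions, not any firm's actual rules.

```python
# Hypothetical checklist of required documents by filing type.
REQUIRED_DOCS = {
    "w2_only": ["W-2"],
    "self_employed": ["1099-NEC", "expense summary"],
    "rental_income": ["rental income statement", "mortgage interest statement"],
}

def missing_documents(filing_type: str, uploaded: list) -> list:
    """Return the required documents not yet uploaded for this filing type."""
    required = REQUIRED_DOCS.get(filing_type, [])
    return [doc for doc in required if doc not in uploaded]

def draft_request_email(client: str, missing: list) -> str:
    """Fill the "We're missing your [X]" template from the knowledge base."""
    items = ", ".join(missing)
    return f"Hi {client}, to finish your return we still need: {items}."
```

In the workshop, your firm's real checklist and real email templates replace these placeholders — that's the "expertise gets encoded" step.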
Use case: A marketing agency has 4 years of client work, internal processes, and brand guidelines scattered across Google Drive folders that no one can find quickly.
What gets indexed into the knowledge base:
- —Past campaign briefs and performance summaries
- —Client brand guidelines and tone-of-voice docs
- —Internal service pricing and scope definitions
- —Onboarding docs and process SOPs
- —Proposal templates and past winning proposals
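The retrieval idea behind this output can be sketched in a few lines. A real build would use embeddings or a proper search index; this toy version just counts overlapping keywords, and all file names and contents are made up.

```python
# Hypothetical index: document name -> a bag of words describing it.
docs = {
    "acme_brand_guidelines.pdf": "acme brand tone of voice playful direct",
    "pricing_2024.docx": "service pricing scope retainer rates",
    "onboarding_sop.gdoc": "client onboarding kickoff checklist process",
}

def search(query: str, top_k: int = 1) -> list:
    """Rank documents by how many query words appear in their description."""
    words = set(query.lower().split())
    scored = sorted(docs, key=lambda d: len(words & set(docs[d].split())),
                    reverse=True)
    return scored[:top_k]
```

The workshop version answers in plain English with the source document attached — the win is that "where's the onboarding SOP?" stops being a ten-minute Drive search.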
What we do in each session.
The workshop runs over 1–2 days depending on scope. Day one is diagnostic and strategic. Day two is hands-on build time. You will be in the room for both.
Workflow mapping.
We walk through your team's daily and weekly workflows end-to-end. We're looking for volume, repetition, and pain. We document every step of the processes that take the most time or cause the most friction.
Opportunity scoring.
We score each opportunity against two axes: how much time it saves and how well-suited it is to AI. We pick the top 1–2 to build during the workshop, and document the rest as a roadmap for a future engagement.
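The scoring step can be sketched directly: rate each opportunity 1–5 on both axes and rank by the product, so weakness on either axis drags the score down. The opportunity names and ratings below are made up for illustration.

```python
# Hypothetical opportunities rated 1-5 on time saved and AI suitability.
opportunities = [
    {"name": "lead intake triage", "time_saved": 4, "ai_fit": 5},
    {"name": "invoice reconciliation", "time_saved": 5, "ai_fit": 2},
    {"name": "FAQ email replies", "time_saved": 3, "ai_fit": 4},
]

# Multiplying the axes penalizes opportunities that are weak on either one.
ranked = sorted(opportunities,
                key=lambda o: o["time_saved"] * o["ai_fit"], reverse=True)
top_two = [o["name"] for o in ranked[:2]]  # built during the workshop
roadmap = [o["name"] for o in ranked[2:]]  # documented for later
```

Note how "invoice reconciliation" saves the most time but ranks last: a big time sink that AI handles poorly is a worse first build than a moderate one AI handles well.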
Agent design.
For each opportunity we're building, we define what the agent needs to know — the rules, the context, the edge cases, and the escalation triggers. This is the session where your team's expertise gets encoded.
Build and test.
We build the prototype while your team is in the room. You test it in real time against actual examples from your work. We refine it based on your feedback until it performs at a level you'd actually use.
Review and next steps.
We review what we built, how to use it starting Monday, and what the path forward looks like — whether that's a full implementation project or extending what we built on your own.
What you leave with.
- —At least one functional AI-assisted workflow your team can use the following Monday — not a demo, not a mockup
- —Documented instructions for every agent we scoped — including the rules, knowledge base references, and edge case handling
- —A clear map of your current processes, where the friction is, and how the AI-assisted version compares
- —A prioritized list of the remaining AI opportunities we identified but didn't build — ready to hand off for a follow-on engagement
- —Instructions for your team on how to use what we built, what to do when it doesn't work as expected, and how to improve it over time
- —Documentation of your current tool stack, what connects to what, and where AI can be stitched in without disrupting existing workflows