Course Builder Prompt Selection and Liquid Templating

Course Builder includes "prompt presets", which work much like a custom GPT inside ChatGPT but with more direct access to the system prompt and workflow code.

The presets live in the CMS, so they can be easily edited, updated, and distributed across the system. They can be simple text prompts or an array of OpenAI chat completion message objects, stored as JSON, that form an initial conversation with Liquid templating enabled.
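To make the JSON flavor concrete, here's a hypothetical preset shape; the type, message contents, and Liquid placeholders are illustrative, not the actual Course Builder schema:

```typescript
// Hypothetical preset shape: an array of OpenAI chat completion
// message objects stored as JSON in the CMS. The {{ ... }} tags are
// Liquid placeholders that get rendered against the current resource
// before the conversation is sent to the model.
type ChatMessage = {role: 'system' | 'user' | 'assistant'; content: string}

const preset: ChatMessage[] = [
  {
    role: 'system',
    content: 'You are a writing assistant for the tip titled "{{ title }}".',
  },
  {
    role: 'user',
    content: 'Summarize the current draft:\n\n{{ body }}',
  },
]
```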

Both the prompt templates and user input messages accept Liquid template formatting, which lets us directly access properties of the resource we are currently editing. Right now we are primarily dealing with articles and videos, but this will eventually include email campaigns, courses, notes, and the various other collections and resources we work with.
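As a rough sketch of that property injection, using a tiny hand-rolled interpolator rather than a real Liquid engine (the actual system uses a full Liquid implementation, and the resource shape here is assumed):

```typescript
// Minimal stand-in for a Liquid engine: it handles only simple
// {{ property }} interpolation, enough to show how resource
// properties flow into a prompt template. A real Liquid engine also
// supports tags, filters, and control flow.
type Resource = Record<string, string>

function renderTemplate(template: string, resource: Resource): string {
  return template.replace(
    /\{\{\s*(\w+)\s*\}\}/g,
    (_, key: string) => resource[key] ?? '',
  )
}

const prompt = renderTemplate(
  'Summarize "{{ title }}" using this transcript:\n{{ transcript }}',
  {title: 'Prompt Presets', transcript: '[00:01] Inside of Course Builder...'},
)
// prompt begins: Summarize "Prompt Presets" using this transcript:
```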

The Liquid templates are awesome and allow a lot of context to be injected into the assistant chat and the rest of the system.

Watch the video for the TMI version.

The current prompt templates are a little wonky because they were originally "workflows", but these custom assistant prompts are likely their own model that can be queried and presented dynamically, CRUD style, so that creators can access their presets and tune them to their preferences and workflows.

"Workflows" might be too generic a name, but it will remain for now.

A few things on the TODO list:

  • [ ] simplify the data model for custom prompts so they can be loaded from the CMS and not hard-coded
  • [ ] create prompt CRUD UI
  • [ ] allow creators to add external resources (Github repo, Gist, ???) and create embeddings from them
  • [ ] links in text content will be followed, scraped, summarized, tagged, and embedded automatically
  • [ ] basic email automations
  • [ ] code and screenshot suggestions
  • [ ] daemons and personalities in the suggestion ai
  • [ ] codemirror integrations to present suggestions
  • [ ] persist assistant chats
  • [ ] multiplayer should feel good and work as expected

Transcript

[00:01] Inside of Course Builder, I'm editing a tip. This one's already been pretty much written. But now I have access to some different assistant templates. So basically this changes the prompt and the workflow in the background. And those are all over here in Sanity.

[00:21] So this particular one that's going to run is the tip chat response. And the only action it takes is the actual assistant prompt. And these take markdown templates, though really anything can be in here. And what's interesting to me, anyway, is the inclusion of Liquid. So whatever this template is, it can be any format.

[00:48] You can actually drop JSON in here and it'll attempt to parse it. So that's a thing too. But in this case, I'm going through and, if there is a Mux playback ID... let's see, that's not it. And you can do things like formatting; apparently that adjusts it. So if there's a Mux playback ID, use this template to create or add a screenshot where it's relevant. Anyway, it uses it down here and adds the title and the body as well. So, oops, if we come back here, open that up. Let's see: summarize this and add a screenshot.

[02:00] So we have all these screenshots here too, and they're pulled in the same basic way as what we just did, but you can see that it adds a screenshot. And I didn't check yet if it's in a good place, but it's interesting. Let's see: rewrite and add the screenshot. Add a screenshot more in the middle of the summary than at the end. So what I'd be curious about in this case is whether this screenshot lines up with what's being spoken about.

[03:08] So that would be something to probably dial in. And I can see the prompt over here. So here is the parsed prompt. And you can see here where it's at screenshots, and it's properly filling in that template. And then down here we get the other stuff, including the time-stamped transcript and all that fun stuff, and then the current saved body, and then the rest of it. And inside of code, let's look at how this is processed. So here's the chat. This is an Inngest function.

[03:50] It's in Inngest functions, tips chat. The tips chat opens up resource chat. So, kind of generically, any resource can be used. In this case it's our tip. And inside of that, this gets a little hairy, but it opens up the workflow, loads that via its slug from Sanity, and goes through and finds the system prompt.

[04:23] I'm not super stoked with the logic flow here because it was simple when I started and is now becoming more complex. So this probably needs to be reshaped a little in terms of how it's loading this and what the workflow means, or whether we even want a workflow and we're just talking about prompt presets, which might make more sense than workflows in this particular circumstance. It would definitely be more representative of how this is being used if it was simply "load the prompt". And then this goes through, and I actually try to parse the content of that. So basically, whatever the content of the system prompt it loads, I parse that as JSON and then send it through Zod.

[05:13] So if it isn't an array of chat completion message objects, it'll actually throw, and then it just assumes that the content is text, which is what we want. One of the things it will also do, if it does find an array of chat completion messages, is go through them one by one and parse the individual contents, so that we can use Liquid in each of those if it was a JSON response. And if this throws any errors at all, it just comes down here and parses it as normal. So you get that same parsing. And this one's actually the one that this example was using.

[06:03] Down here, if messages length is less than or equal to 2... and this is because with each chat message we actually send two messages, the first being a snapshot of the current state. So let's see if I can... It's a valid article. So there's a system message that comes over, and this is an article, but it's the same. It has the current title and the current body as a system message, and this one is unsaved.
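A sketch of that two-messages-per-turn idea; the function name and resource shape are illustrative, not the real code:

```typescript
// Each user turn is preceded by a system message snapshotting the
// resource's current, possibly unsaved, state, so the model always
// sees the latest draft alongside the user's request.
type ChatMessage = {role: 'system' | 'user'; content: string}

function buildTurn(
  resource: {title: string; body: string},
  userInput: string,
): ChatMessage[] {
  return [
    {
      role: 'system',
      content: `Current draft:\nTitle: ${resource.title}\n\n${resource.body}`,
    },
    {role: 'user', content: userInput},
  ]
}
```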

[06:40] You get the saved version at the beginning and then the unsaved version as this goes along. One of the things I was considering is having an initial chat function that actually listens for continuations, and then listens for whatever the end circumstance is, or has it time out after a certain time. It would be nice to persist these, so that inside of our UI over here you'd be able to have a plus or whatever right here that would allow you to basically have threads. This is something that ChatGPT does: the threads would get summarized and you could revisit them and start them back up, that sort of thing. And then also multiple players.

[07:36] So two of us could be working on this, or you could have kind of assistants here that, you know, you could have a stack of them, and these might load dynamically, and you'd have the ability to combo-box and search these or add new drafts. These are kind of like custom GPTs in the ChatGPT sense, but we need some sort of way to persist these, because as of right now, if I refresh this, we come back to the default state and lose all of that chat goodness. Let's see. And then this gets into the workflows. I'm not actually doing anything there in terms of what's next. Here's where it actually answers the user message.

[08:58] I guess the user message could include Liquid templating as well, which is kind of cool. This makes me curious if it's catching this. I don't actually understand, just by looking at it, what's going on. So I was logging that out just to check, and it does actually get that current user message, which is good, but I don't know what this splice is doing here. And I thought this was kind of fun: in my current user message I can throw that down there. And I'm pretty sure I know what that's going to do.

[10:30] We'll look at Inngest. So we have the input, come all the way down. Yeah, you can inject properties from the resource in here in your user message. That's very cool.