Course Builder WIP: Creating Tips With Inngest Workflows and OpenAI

Course Builder is a platform for publishing courses, e-commerce, and related content such as articles, tips, and tutorials. It supports creating tips, each of which combines a video, an article-style write-up, and a transcript.

The upload process is easy: video processing and transcript generation happen automatically. The behind-the-scenes work is coordinated by Inngest, using Mux for video processing and Deepgram for transcription.

An OpenAI-based chat assistant helps you craft content, with real-time editing and collaboration via PartyKit. The system also provides inline, Grammarly-style suggestions for improving your text.

The platform will also eventually automate tasks such as email marketing, making Course Builder a comprehensive solution for course creation and marketing that reduces the need for third-party services.

Transcript

[00:01] So this is Course Builder. It's my publishing platform for courses, eventually e-commerce, and then kind of all the supporting content that you would want to build around your course if you were trying to sell it and grow it as a resource. And for us, the starting places are articles, tips, and tutorials, where articles are as expected and tips are video tips. So they look like this: you come in here and you get a video, basically an article, and a transcript, and that's kind of where it's at right now.

[00:37] With the tips, if I'm logged in I get the ability to create a tip. If I'm not logged in it looks like this, and this is just the consumer view of the same thing. So I'm going to come in here and find something real quick that's relatively small. I'm going to use this film video; that's just a naked video that I grabbed off the rack.

[01:04] UploadThing is happily working for me in the background. It makes uploading stuff super easy, and I don't have to manage any of that myself. I love that. I want to create the draft, and this opens up this window, and in here we can see the video is already ready, so Mux is working in the background to process this video. This is a very short video, so it's almost instantaneous to process it and present it in the browser. If it's longer, if you had an hour-long video, this would obviously need to go through their upload and processing. So that's happening behind the scenes.
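As a rough sketch, the hand-off from the upload to the background work could be wired up with an UploadThing file route that emits an Inngest event when the upload completes. The event name, payload shape, and the `@/inngest/client` import are assumptions for illustration, not Course Builder's actual code:

```ts
// app/api/uploadthing/core.ts
import { createUploadthing, type FileRouter } from "uploadthing/next";
import { inngest } from "@/inngest/client"; // assumed local Inngest client

const f = createUploadthing();

export const uploadRouter = {
  // When a tip video finishes uploading, kick off the background workflow
  // (Mux processing, transcript ordering, etc.) by sending an event.
  tipVideo: f({ video: { maxFileSize: "2GB", maxFileCount: 1 } }).onUploadComplete(
    async ({ file }) => {
      await inngest.send({
        name: "tip/video.uploaded", // hypothetical event name
        data: { videoUrl: file.url, fileKey: file.key },
      });
    }
  ),
} satisfies FileRouter;

export type UploadRouter = typeof uploadRouter;
```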

[01:45] You can see the transcript just popped in, so Deepgram is working behind the scenes for that. What's coordinating all of that back here is Inngest. So in Inngest, you can see the video is uploaded, we know when the asset was created, and we order a transcript.

[02:03] So this is where the Deepgram transcript is ordered. We actually remove the video, so we're not storing that in UploadThing long-term since we've saved it in Mux. When the video is ready, it presents that, and then once the transcript is ready that gets updated too. All of that is actually handled inside of the browser. Where'd we go? I think I'm in another tab. So all of that is handled with PartyKit; that's what makes it go. And now, once you are here and you have the video and you have the transcript, there's a chat assistant over here and you can say things like summarize.
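A minimal sketch of what that coordination could look like as an Inngest function is below. The event names, payload fields, and the choice to call the Mux and Deepgram REST APIs directly (rather than their SDKs) are assumptions for illustration:

```ts
import { Inngest } from "inngest";

const inngest = new Inngest({ id: "course-builder" });

export const processTipVideo = inngest.createFunction(
  { id: "process-tip-video" },
  { event: "tip/video.uploaded" }, // hypothetical event from the upload hook
  async ({ event, step }) => {
    // 1. Hand the uploaded file to Mux for processing and playback.
    const muxAsset = await step.run("create-mux-asset", async () => {
      const res = await fetch("https://api.mux.com/video/v1/assets", {
        method: "POST",
        headers: {
          Authorization: `Basic ${Buffer.from(
            `${process.env.MUX_TOKEN_ID}:${process.env.MUX_TOKEN_SECRET}`
          ).toString("base64")}`,
          "Content-Type": "application/json",
        },
        body: JSON.stringify({
          input: [{ url: event.data.videoUrl }],
          playback_policy: ["public"],
        }),
      });
      return res.json();
    });

    // 2. Order a Deepgram transcript; Deepgram posts the result to a webhook
    //    when it's done, which drives the next part of the workflow.
    await step.run("order-transcript", () =>
      fetch(
        `https://api.deepgram.com/v1/listen?callback=${encodeURIComponent(
          `${process.env.APP_URL}/api/deepgram?fileKey=${event.data.fileKey}`
        )}`,
        {
          method: "POST",
          headers: {
            Authorization: `Token ${process.env.DEEPGRAM_API_KEY}`,
            "Content-Type": "application/json",
          },
          body: JSON.stringify({ url: event.data.videoUrl }),
        }
      )
    );

    // 3. Delete the original upload (e.g. via UploadThing's API) now that Mux
    //    stores the video; omitted here for brevity.

    return { muxAssetId: muxAsset.data.id };
  }
);
```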

[02:48] And this is running OpenAI's GPT-4 with the 128k context in the background. So that takes the full transcript and any of this metadata that we also have; right now, it's only the title. And then our system prompt is what this is running with. And under the hood, this is another kind of cool thing.
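A stripped-down sketch of that request, assuming the official `openai` Node SDK; the prompt assembly and model name here are guesses at the shape, not the exact Course Builder code:

```ts
import OpenAI from "openai";

const openai = new OpenAI(); // reads OPENAI_API_KEY from the environment

// Hypothetical prompt assembly: the system prompt comes from Sanity, and the
// tip's title and transcript are passed in as context.
export async function summarizeTip(opts: {
  systemPrompt: string;
  title: string;
  transcript: string;
}) {
  const completion = await openai.chat.completions.create({
    model: "gpt-4-turbo-preview", // the 128k-context GPT-4 model
    messages: [
      { role: "system", content: opts.systemPrompt },
      {
        role: "user",
        content: `Title: ${opts.title}\n\nTranscript:\n${opts.transcript}\n\nSummarize this tip.`,
      },
    ],
  });

  return completion.choices[0].message.content;
}
```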

[03:12] That's generating a response. I guess I'll do this: come into Sanity. The workflow that is running the OpenAI calls is actually controlled here in Sanity. You can see our system prompt exists here.

And these system prompts are Markdown. And then, actually, we're using Liquid, so you can drop in additional context on the consumer side. So that's the system prompt. You can select different models, so you could even use something like Replicate or Perplexity and call different, maybe open-source, models, whatever you wanted at this point, and decide here which model you will use.
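As a sketch of that, one way to fetch a Markdown-plus-Liquid prompt from Sanity and drop in the tip's context could look like the following. The document type, field names, and the `liquidjs` dependency are assumptions, not the real schema:

```ts
import { createClient } from "@sanity/client";
import { Liquid } from "liquidjs";

// Assumed Sanity schema: a `prompt` document with a Markdown `template`
// field that may contain Liquid variables like {{ title }} and {{ transcript }}.
const sanity = createClient({
  projectId: process.env.SANITY_PROJECT_ID!,
  dataset: "production",
  apiVersion: "2023-10-01",
  useCdn: false,
});

const liquid = new Liquid();

export async function renderSystemPrompt(
  promptTitle: string,
  context: { title: string; transcript: string }
) {
  const prompt = await sanity.fetch(
    `*[_type == "prompt" && title == $title][0]{ template }`,
    { title: promptTitle }
  );
  // Drop the tip's metadata into the Liquid template on the consumer side.
  return liquid.parseAndRender(prompt.template, context);
}
```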

[03:59] And then on the Next.js side, when Inngest is running these functions, you can mix it up. This one has exactly one step, but you could add additional steps. So I think this is an example where you get the system prompt, and then that result pipes into the writer, pipes into the editor, pipes back to the writer, and then finally a JSON formatter. And this will literally run through all of these whenever you make a request. So instead of just this, which is a simple kind of chat like you would expect, say you wanted to execute functions or do things like message in Slack.
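A rough sketch of that kind of multi-step chain as an Inngest function is below; each step's output feeds the next, and Inngest records and retries each step independently. The event name, step names, and the prompts are illustrative assumptions:

```ts
import { Inngest } from "inngest";
import OpenAI from "openai";

const inngest = new Inngest({ id: "course-builder" });
const openai = new OpenAI();

// Small helper: one chat-completion call with a given system prompt.
async function complete(system: string, input: string) {
  const res = await openai.chat.completions.create({
    model: "gpt-4-turbo-preview",
    messages: [
      { role: "system", content: system },
      { role: "user", content: input },
    ],
  });
  return res.choices[0].message.content ?? "";
}

export const draftTipBody = inngest.createFunction(
  { id: "draft-tip-body" },
  { event: "tip/assistant.requested" },
  async ({ event, step }) => {
    const draft = await step.run("writer", () =>
      complete("You are a technical writer.", event.data.transcript)
    );
    const notes = await step.run("editor", () =>
      complete("You are a ruthless editor. Give revision notes.", draft)
    );
    const revised = await step.run("writer-revision", () =>
      complete(`Revise the draft using these notes:\n${notes}`, draft)
    );
    const json = await step.run("json-formatter", () =>
      complete('Return {"title": string, "body": string} as JSON.', revised)
    );
    return json;
  }
);
```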

[04:38] You would be able to do that and add that into this flow, which in practice is incredibly handy. Under the hood, you have this. So you have the chats running. And as this runs, because we're using Inngest and we're executing workflows, there's this disconnection between the function being executed and the browser: Inngest executes the functions on your own server, so it's your compute, your Next.js app, your serverless function that is getting executed by Inngest. But the problem here is that we have this streaming response.

[05:12] And what people expect is for this to flow in as it's written. So usually I can take this. It does a decent job; I've been pretty happy with the summaries. I take this, and then I'd want to edit it, whatever, make use of it.

[05:28] So this is streaming behind the scenes. And when you type something in here, let's say "make it more concise", this is still a work in progress, so I don't have the user messages in here, but we're storing all the messages. So every time you do this, you get full history. So we have whatever the user message was, the system prompt, and then any responses, and all of that feeds back into the context with this too.

[05:59] So when I update this, this new body here actually flows in also. Let's see. So you can see, just a simple "make it more concise" and it was able to do that. So I can come in and, yeah, that's great. So under the hood, what's actually happening is that as these stream in, they get parsed and sent through PartyKit, and PartyKit is what flows it down.
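One way that relay could look, as a sketch: the server-side code streams the completion and forwards each token to the tip's PartyKit room over HTTP, and the room broadcasts it to connected browsers (a matching PartyKit server is sketched after the next paragraph). The host, event shape, and model name are assumptions:

```ts
import OpenAI from "openai";

const openai = new OpenAI();

export async function streamToRoom(
  roomId: string,
  messages: OpenAI.Chat.Completions.ChatCompletionMessageParam[]
) {
  // Stream the completion on the server where the workflow runs...
  const stream = await openai.chat.completions.create({
    model: "gpt-4-turbo-preview",
    messages,
    stream: true,
  });

  for await (const chunk of stream) {
    const token = chunk.choices[0]?.delta?.content;
    if (!token) continue;
    // ...and forward each token to the PartyKit room's HTTP endpoint, which
    // rebroadcasts it to every connected browser over WebSockets.
    await fetch(`https://course-builder.example.partykit.dev/party/${roomId}`, {
      method: "POST",
      body: JSON.stringify({ type: "ai.message.chunk", token }),
    });
  }
}
```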

[06:25] So the WebSockets via PartyKit are what give us this kind of interactivity. And also, interestingly, it gives us the ability to copy this and paste it into a new window (that's an unknown, I needed a new one, there we go), get this in here, and it loads. And we get this, so we have multiplayer and we'd be able to collaborate on the editing of this particular tip, which in practice, for us and our business at Badass Courses, is huge. It's a situation where we want to either work together or work with our expert partners.
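A minimal sketch of the PartyKit side, assuming one room per tip; the room rebroadcasts both the streamed AI tokens (POSTed from the server above) and collaborator messages, and each browser connects with `partysocket`. Host and room names are assumptions:

```ts
// party/index.ts: a minimal PartyKit server for a tip's room.
import type * as Party from "partykit/server";

export default class TipRoom implements Party.Server {
  constructor(readonly room: Party.Room) {}

  // Tokens POSTed from the server-side workflow get rebroadcast to every
  // connected editor in this room.
  async onRequest(req: Party.Request) {
    if (req.method === "POST") {
      this.room.broadcast(await req.text());
      return new Response("ok");
    }
    return new Response("method not allowed", { status: 405 });
  }

  // Edits from one collaborator go out to everyone else in the room.
  onMessage(message: string, sender: Party.Connection) {
    this.room.broadcast(message, [sender.id]);
  }
}
```

```ts
// In the browser, each editor connects to the same room:
import PartySocket from "partysocket";

const socket = new PartySocket({
  host: "course-builder.example.partykit.dev", // assumed host
  room: "tip_123", // assumed room id for the tip being edited
});

socket.addEventListener("message", (event) => {
  const payload = JSON.parse(event.data);
  // e.g. append payload.token to the assistant panel, or apply a remote edit
  console.log(payload);
});
```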

[07:15] And we have this way where we can get in and we can edit our stuff. In the future, this is going to include things like building email automations and all of the kinds of things that go into the process that we use for building badass courses. You'll notice down here, this actually gave me some, I call them Grammarly-like prompts, where it's analyzing what we have to say, again with full context, and then giving some advice about updating this. It's actually really pretty good, and it gives each suggestion a rating: high, medium, sometimes it'll say critical, that sort of thing. And honestly, it gives some pretty good advice.

[08:00] The goal, because this is CodeMirror up top, is to eventually take these and build them into a visual UI, like you would get inside of an IDE. This is kind of like an integrated content environment, and we'll be able to do things like put markers in the gutter over here at specific lines, or highlight specific suggestions inline, and give these different personalities so you're getting different types of feedback. And the goal ultimately is, you know, this assistant box is fine (so let's do "create a title that summarizes under 90 characters"), it's to have this here for what you might want it for, but then also give you this kind of just-in-time, integrated assistant functionality that isn't the chat box, right?
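As a very rough sketch of the gutter-marker idea with CodeMirror 6, assuming the flagged line numbers have already been parsed out of the assistant's feedback (the line numbers, class name, and marker content here are placeholders):

```ts
import { gutter, GutterMarker } from "@codemirror/view";

// Hypothetical: 1-based lines the assistant flagged with a suggestion.
const flaggedLines = new Set([3, 7]);

// A dot in the gutter marking a line that has a suggestion attached.
class SuggestionMarker extends GutterMarker {
  toDOM() {
    const dot = document.createElement("span");
    dot.textContent = "●";
    dot.title = "Suggestion available";
    return dot;
  }
}

// An extension you'd add to the CodeMirror editor's configuration.
export const suggestionGutter = gutter({
  class: "cm-suggestion-gutter",
  lineMarker(view, line) {
    const lineNumber = view.state.doc.lineAt(line.from).number;
    return flaggedLines.has(lineNumber) ? new SuggestionMarker() : null;
  },
});
```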

[09:01] So that works pretty well. And then what's neat to me also is that as this stuff happens, you don't have to be here, because when PartyKit is announcing and Inngest is running, we can have those announcements go to different places. So Inngest allows you to fan out that messaging, and, you know, when a video is done processing I don't have to sit here and watch it; it can tell me in Slack or Discord, or it could send me a Twitter DM, if that's how I wanted to configure it. I can have these announcement channels email me, whatever.
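A small sketch of that fan-out: any number of Inngest functions can subscribe to the same event, so one "video is ready" announcement can go to Slack, email, or anywhere else without the editor waiting on it. The event name and the Slack incoming-webhook approach are assumptions:

```ts
import { Inngest } from "inngest";

const inngest = new Inngest({ id: "course-builder" });

export const announceVideoReadyInSlack = inngest.createFunction(
  { id: "announce-video-ready-slack" },
  { event: "tip/video.ready" }, // hypothetical event name
  async ({ event, step }) => {
    await step.run("post-to-slack", () =>
      fetch(process.env.SLACK_WEBHOOK_URL!, {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({
          text: `Video for tip ${event.data.tipId} finished processing.`,
        }),
      })
    );
  }
);

// A second createFunction listening to the same "tip/video.ready" event could
// send an email or a Discord message instead; Inngest runs them all independently.
```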

[09:35] You're able to send that out. And right now you just save this, but eventually there's gonna be a publish button, and the publish button will then run a workflow itself, and you'll be able to configure what happens after you publish a tip. And because this system is storing emails and that sort of thing, you'll be able to, you know, have it write up an email for you and then send you a message in Slack to say approve, if you like the body text, or edit, and it'll take you to an editing interface, and then send it out to the people that are interested on your email list, directly from the system, instead of using a third-party email marketing platform. Your platform, Course Builder, is your marketing platform. You're not going to have to introduce a third-party service to manage your email sends and that sort of thing. And I'm pretty excited.
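A sketch of what that publish workflow could look like with Inngest, pausing for a human approval before the email goes out. The event names, payload fields, Slack webhook, and the use of Resend as the email sender are all assumptions, not Course Builder's actual design:

```ts
import { Inngest } from "inngest";

const inngest = new Inngest({ id: "course-builder" });

export const onTipPublished = inngest.createFunction(
  { id: "tip-published" },
  { event: "tip/published" },
  async ({ event, step }) => {
    // Ask for a human sign-off in Slack before emailing the list.
    await step.run("request-approval-in-slack", () =>
      fetch(process.env.SLACK_WEBHOOK_URL!, {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({
          text: `New tip published: ${event.data.title}. Approve the announcement email?`,
        }),
      })
    );

    // Pause the workflow (up to 3 days) until an approval event with the same
    // tipId arrives, e.g. sent from a Slack action handler.
    const approval = await step.waitForEvent("wait-for-approval", {
      event: "tip/email.approved",
      timeout: "3d",
      match: "data.tipId",
    });

    if (approval) {
      await step.run("send-announcement-email", () =>
        fetch("https://api.resend.com/emails", {
          method: "POST",
          headers: {
            Authorization: `Bearer ${process.env.RESEND_API_KEY}`,
            "Content-Type": "application/json",
          },
          body: JSON.stringify({
            from: "tips@example.com",
            to: event.data.subscribers, // assumed list of addresses
            subject: event.data.title,
            html: event.data.emailBody,
          }),
        })
      );
    }
  }
);
```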

[10:29] I'm already really enthusiastic about using this personally, and I'm going to convert my blog, I think, to use Course Builder. And then you get over here and you have the presentation of this. Yeah, there are a lot of use cases, and it's going to be fun to see how it grows.