Prompt Stores
Understanding how prompt stores can be used to speed up the generative development process
Why Prompt Stores?
When working on a project with specific tools—for instance, React, Next.js, Chakra UI, Yup, and Axios—it can be tedious to re-specify them every single time you ask for a new component or feature. Prompt stores let you define shorthands or references up front, so you don’t have to restate them repeatedly.
Shorthand Example: You might tell the LLM, “When I say myStack, I mean React, Next.js, Chakra UI, Yup, and Axios.” Then, in subsequent prompts, you can just say “create a sign-up page using myStack” and the LLM will remember the specifics.
How Context Windows Work
Most chat-based LLMs maintain a context window (as discussed in the previous section)—a chunk of recent conversation that the model uses to inform its next response. When you reference a shorthand like myStack in subsequent prompts, the model looks back at that conversation context to recall what myStack was defined to include. However, each LLM has a token limit for how much text it can remember at once (its “window size”).
If your conversation exceeds that limit, older prompts can drop out of context and the model may “forget” them.
You can manage this by reintroducing important information or storing it in a separate reference outside the immediate conversation.
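One way to manage this is to pin important messages (such as your prompt-store definitions) so they always survive trimming, while older turns drop out first. A minimal sketch, approximating token counts as word counts (a real client would use the model’s tokenizer, and the 20-token budget is deliberately tiny for illustration):

```javascript
const MAX_TOKENS = 20; // illustrative budget, far smaller than a real model's

// Rough token estimate: one token per whitespace-separated word.
function approxTokens(message) {
  return message.content.split(/\s+/).length;
}

// Always keep pinned messages, then fill the remaining budget with the
// most recent unpinned turns, newest first.
function trimContext(messages, maxTokens = MAX_TOKENS) {
  const pinned = messages.filter((m) => m.pinned);
  let used = pinned.reduce((sum, m) => sum + approxTokens(m), 0);
  const recent = [];
  for (const m of [...messages].reverse()) {
    if (m.pinned) continue;
    const cost = approxTokens(m);
    if (used + cost > maxTokens) break;
    used += cost;
    recent.unshift(m);
  }
  return [...pinned, ...recent];
}

const history = [
  { pinned: true, content: "myStack means React, Next.js, Chakra UI, Yup, and Axios." },
  { content: "Create a contact form using myStack." },
  { content: "Here is the contact form component you asked for." },
  { content: "Now create a sign-up page using myStack." },
];

// The pinned definition and the newest turn fit the budget; older turns drop.
const window = trimContext(history);
```

With this history and budget, only the pinned definition and the most recent message survive, mirroring how older prompts “fall out” of a real context window.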
Incorporating Vector Databases
To extend the context beyond what the LLM’s built-in window can handle, some workflows integrate vector databases:
Embedding and Storage – You convert text (e.g., your project specs) into vector embeddings and store them in a vector database.
Context Retrieval – When you query your LLM, relevant chunks of text can be pulled from the database and appended to your prompt, effectively refreshing the model’s memory.
Scalable Knowledge Base – This approach can scale well for large projects, where you might have extensive documentation or domain-specific details that need to be referenced repeatedly.
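The three steps above can be sketched with a toy in-memory store. In a real workflow the embeddings would come from an embedding model and live in a dedicated vector database (e.g. Pinecone or pgvector); here the embeddings are stand-in numeric arrays and similarity is plain cosine similarity:

```javascript
// Cosine similarity between two equal-length numeric vectors.
function cosineSimilarity(a, b) {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

class VectorStore {
  constructor() { this.entries = []; }
  // Embedding and Storage: keep each text chunk alongside its embedding.
  add(text, embedding) { this.entries.push({ text, embedding }); }
  // Context Retrieval: return the top-k most similar chunks, which the
  // caller appends to the prompt to refresh the model's memory.
  retrieve(queryEmbedding, k = 1) {
    return [...this.entries]
      .sort((x, y) =>
        cosineSimilarity(y.embedding, queryEmbedding) -
        cosineSimilarity(x.embedding, queryEmbedding))
      .slice(0, k)
      .map((e) => e.text);
  }
}

const store = new VectorStore();
store.add("myStack: React, Next.js, Chakra UI, Yup, Axios", [0.9, 0.1, 0.0]);
store.add("Deployment target: Vercel", [0.1, 0.8, 0.3]);

// A query embedding close to the first entry retrieves the stack definition.
const context = store.retrieve([0.85, 0.15, 0.05], 1);
```

The retrieved chunks never occupy context-window space until they are actually needed, which is what makes this approach scale to large knowledge bases.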
Practical Example
Without Prompt Stores
“Create a contact form using React, Next.js, Chakra UI, Yup, and Axios.”
“Create a sign-up page using React, Next.js, Chakra UI, Yup, and Axios.”
…
You’re repeating the entire stack each time, which can become cumbersome.
With Prompt Stores and Context
Define your stack once:
“When I say myStack, please recall that it’s React, Next.js, Chakra UI, Yup, and Axios.”
Use your shorthand:
“Now, create a sign-up page using myStack.”
The LLM references the conversation’s context window (and potentially a vector database if you’ve set one up) to remember your definition of myStack.
Key Takeaways
Prompt Stores help you establish shorthand references to reduce repetition.
Context Windows enable the LLM to recall recent details in a conversation, but they have token limits.
Vector Databases can scale context management by storing and retrieving important chunks of text outside the LLM’s built-in memory.
Efficiency & Consistency: By combining prompt stores and context windows, you can rapidly iterate on your application’s features without constantly re-describing your tech stack or project constraints.
Example
Create a prompt store to store language and libraries
Prompt
Response
Sure, I'll keep that in mind! When you write bicycle-app-config, I'll assume that you mean a configuration that includes Next.js, Chakra UI, Yup, and Axios.
Discussion
Perfect. Now that we have a prompt store that retains the context of the languages and libraries we use, let's see it in practice.
Prompt
Response
Sure, here's an example of how you can create a contact form in a Next.js app with Chakra UI, Yup, and Axios, using the bicycle-app-config:
In this example, we're using Chakra UI's FormControl, FormErrorMessage, FormLabel, Input, and Textarea components to create a form that includes the fields for name, email, and message. We're also using the useFormik hook from formik to manage the form state and handle form submission.
We're passing the contact form schema to the validationSchema prop of the useFormik hook to validate the form data against the schema. The isInvalid prop of the FormControl component is set to true if the field has been touched and there is a validation error.
Finally, we're using the getFieldProps function from formik to bind each form input to the corresponding formik field, and using the handleSubmit function from formik to handle form submission. We're also using Axios to make an asynchronous POST request to the /api/contact endpoint when the form is submitted.
Note that in order for this example to work, you'll need to create an API route in your Next.js app at /pages/api/contact.js that handles the form submission and sends an email or performs other actions based on the submitted data.
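The generated component itself is not reproduced here, but the checks the contact-form schema performs can be sketched in plain JavaScript. The hand-rolled validators below stand in for Yup, and the field names (name, email, message) follow the description above; the exact rules and messages are illustrative assumptions:

```javascript
// Stand-in for the Yup contact-form schema described above: name and
// message are required, and email must look like an email address.
function validateContactForm({ name = "", email = "", message = "" }) {
  const errors = {};
  if (!name.trim()) errors.name = "Name is required";
  if (!/^[^\s@]+@[^\s@]+\.[^\s@]+$/.test(email)) errors.email = "Invalid email";
  if (!message.trim()) errors.message = "Message is required";
  return errors; // an empty object means the form is valid
}

// A submit handler would only POST to /api/contact when validation passes.
const errors = validateContactForm({
  name: "Ada",
  email: "ada@example.com",
  message: "Hello!",
});
```

In the generated component, formik runs this kind of validation automatically on each change and touch, and the isInvalid prop surfaces the per-field error messages.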
Discussion
Using prompt stores can greatly expedite your development delivery while bringing unity and standardization to your response output. The concept of prompt stores can be applied not only to languages, but also to libraries, UI layouts, business rules, formats, security schemas, user scenarios, or just about any repeatable concept you can think of.