
MinWoo(Daniel) Park | Tech Blog


Prompt | Conversational Prompt Engineering

  • Related Project: Private
  • Category: Paper Review
  • Date: 2024-08-08

Conversational Prompt Engineering

  • url: https://arxiv.org/abs/2408.04560
  • pdf: https://arxiv.org/pdf/2408.04560
  • html: https://arxiv.org/html/2408.04560v1
  • abstract: Prompts are how humans communicate with LLMs. Informative prompts are essential for guiding LLMs to produce the desired output. However, prompt engineering is often tedious and time-consuming, requiring significant expertise, limiting its widespread use. We propose Conversational Prompt Engineering (CPE), a user-friendly tool that helps users create personalized prompts for their specific tasks. CPE uses a chat model to briefly interact with users, helping them articulate their output preferences and integrating these into the prompt. The process includes two main stages: first, the model uses user-provided unlabeled data to generate data-driven questions and utilizes user responses to shape the initial instruction. Then, the model shares the outputs generated by the instruction and uses user feedback to further refine the instruction and the outputs. The final result is a few-shot prompt, where the outputs approved by the user serve as few-shot examples. A user study on summarization tasks demonstrates the value of CPE in creating personalized, high-performing prompts. The results suggest that the zero-shot prompt obtained is comparable to its (much longer) few-shot counterpart, indicating significant savings in scenarios involving repetitive tasks with large text volumes.
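The two-stage flow described in the abstract can be sketched in a few lines. This is a minimal illustration only, not the paper's implementation: the `chat` function below is a hypothetical stub standing in for a real chat-model API call, and `ask_user` / `get_feedback` are hypothetical callbacks standing in for the interactive conversation.

```python
def chat(system, user):
    # Hypothetical stub for a chat-model call; a real system would
    # query an LLM API with the given system and user messages.
    return f"[model reply to: {user[:40]}]"

def stage1_build_instruction(unlabeled_texts, ask_user):
    """Stage 1: generate data-driven questions from user-provided
    unlabeled data, then fold the user's answers into an initial
    instruction."""
    questions = [
        chat("You help users articulate output preferences.",
             f"Given this example input, what should I ask the user?\n{t}")
        for t in unlabeled_texts
    ]
    answers = [ask_user(q) for q in questions]
    instruction = "Summarize the input."  # assumed starting instruction
    for a in answers:
        instruction += f" Preference: {a}."
    return instruction

def stage2_refine(instruction, unlabeled_texts, get_feedback):
    """Stage 2: share outputs generated by the instruction, refine it
    on user feedback, and keep approved outputs as few-shot examples."""
    approved = []
    for t in unlabeled_texts:
        output = chat(instruction, t)
        feedback = get_feedback(t, output)
        if feedback == "approve":
            approved.append((t, output))
        else:
            instruction += f" Also: {feedback}."
    # Final artifact: a few-shot prompt built from approved pairs.
    shots = "\n\n".join(f"Input: {i}\nOutput: {o}" for i, o in approved)
    return f"{instruction}\n\n{shots}"

texts = ["Quarterly report text...", "Meeting notes text..."]
prompt = stage2_refine(
    stage1_build_instruction(texts, ask_user=lambda q: "keep it short"),
    texts,
    get_feedback=lambda t, o: "approve",
)
print(prompt)
```

Under these stub callbacks, every output is approved, so the final prompt is the refined instruction followed by two few-shot examples; with real user feedback, rejected outputs would instead extend the instruction.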

Previous: RAG | Text-to-SQL LLM Next: Self-Taught Evaluators
