In this article, I explained what LLMs are and how to use them to build smart applications.
But just using an LLM isn’t enough. We need to communicate with it clearly and strategically.
That’s where Prompt Engineering comes in.
What is Prompt Engineering?
Prompt engineering is the process of designing clear and structured inputs that guide a language model to produce useful responses.
It's like giving smart instructions to a very powerful assistant. The quality of what we get depends on how we ask.
Why Prompt Engineering
- LLMs don’t read our minds.
- They don’t
understand
like humans. They generate text based on patterns. - If you are vague, the model will be vague 😄.
- If you are clear and specific, the model gives you better results.
Simply put, Better prompts = Better outputs.
📌 Prompting Techniques vs Prompt Engineering
Prompting techniques are styles or strategies for writing prompts. They are part of the craft of telling the model what we want in a single interaction.
Prompt engineering is the entire process of integrating and scaling prompts in a system, not just how we write the prompt. It is how we design, test, monitor, and ship prompt-based features in a real-world application.
☝🏾 Note: Prompting techniques are part of prompt engineering, but prompt engineering is much more than writing a good prompt. It includes:
Prompting techniques: Use strategies like zero-shot, few-shot, chain-of-thought, or role-based prompting to guide the model’s behavior and improve output quality.
Implementation techniques: Combine your prompts with tools like function calling, retrieval-augmented generation (RAG), or dynamic input templating to power real app features.
Automation logic: Add logic to retry failed prompts, stream long responses, or fall back to simpler models when needed, ensuring your system stays responsive and stable.
Monitoring & optimization: Track key metrics like latency, token cost, error rates, and model accuracy to continuously improve both performance and user experience.
Testing & versioning: Maintain and test different versions of your prompts, running regression tests to ensure new changes don’t break your app’s behavior.
So while good prompting is the starting point, full prompt engineering means how you design, test, monitor, and ship prompt-based features in a real-world application.
📌 Types of Prompting Techniques (With Examples)
1. Zero-shot prompting
Ask the model to perform a task with no examples, just instructions.
Translate this sentence to French: "I love programming."
Use this when the task is simple and clear.
2. One-shot prompting
Give one example before asking the model to perform the same task again.
Translate this sentence to French: English: "I love cats." French: "J'aime les chats." English: "I love programming." French:
This is good for moderately complex tasks where one example helps show the pattern.
3. Few-shot prompting
Provide a few examples to help the model understand the expected format or logic.
Convert the following to a formal business email tone: Casual: "Need the report by tomorrow." Formal: "Kindly ensure the report is ready by tomorrow." Casual: "Can't make the meeting." Formal: "Unfortunately, I won’t be able to attend the meeting." Casual: "What's the update on the task?" Formal:
This is great when:
- We need consistency.
- The task involves writing style.
- We want the model to follow a specific structure.
4. Chain-of-thought prompting
Ask the model to think step by step before answering.
Question: If Sarah has 3 apples and buys 4 more, then gives 2 to her friend, how many apples does she have? Answer: Let's think step by step.
This is best for tasks that require reasoning, calculation, or logic.
5. Role-based or Instructional prompting
Give the model a role or identity to respond from.
You are a senior software engineer. Explain the difference between GraphQL and REST to a junior developer.
This is perfect for:
- Customer support bots
- Teaching/educational apps
- Task-specific assistants like lawyer, doctor, manager
- Example usecase is my project - Therabot
📌 How to Choose the Right Prompting Technique
Use Case | Suggested Prompt Type |
---|---|
Simple data transformation | Zero-shot |
Text classification | Few-shot |
Reasoning tasks | Chain-of-thought |
Needs personality or tone | Role-based |
New use cases, no examples | Zero-shot + Instructions |
Task where examples help | Few-shot or One-shot |
📌 Tips for Writing Better Prompts
✅ Be clear and specific.
✅ Use bullet points or steps if possible.
✅ Provide examples if the task has a pattern.
✅ Use delimiters like """
to separate instructions from data.
✅ Break big tasks into smaller ones when needed.
✅ Think like a teacher and don’t assume the model knows what you mean.
✅ Iterate: Test, tweak, test again
📌 Real Example: Job Description Analyzer
I built a project called Job Application Assistant, which helps users understand and respond to job listings. Before I integrated Function Calling, I used Prompting techniques with the OpenAI API to extract structured data from job descriptions.
Here’s how I did it using a combination of Few-shot and Role-based prompting:
const jobDescription = "We need a frontend developer skilled in React, JavaScript, and TailwindCSS. You will build UIs and collaborate with backend teams. 2+ years experience required."; const response = await openai.chat.completions.create({ model: "gpt-4", messages: [ { role: "system", content: "You are an AI assistant that extracts key skills, responsibilities, and experience from job descriptions.", }, { role: "user", content: `Extract the following from this job description:\n 1. Required Skills 2. Responsibilities 3. Required Experience\n\n${jobDescription}`, }, ], max_tokens: 200, }); return response.choices[0]?.message?.content?.trim() || "";
Sample output:
Skills: React, JavaScript, TailwindCSS Responsibilities: Build UIs, collaborate with backend Experience: 2+ years
📌 Some sample projects that illustrates Prompt Engineering
- Soul Sync - A safe space where users can check in emotionally, express themselves, and receive gentle, AI-powered guidance that helps them reconnect with their inner self.
- Therabot - A web app where users can chat with an AI-powered therapist for emotional support.
I also use this guide when prompting LLMs.
Prompting is about collaborating with LLMs, and not just dishing out sentences.
The more intentional your prompt, the more reliable your AI becomes.
Happy coding!
Top comments (13)
can i have meeting with you about career suggestion i need ur suggestion
yeah mam when u free thanks
Please join in: meet.google.com/adv-skos-yes
hi
Are you joining again?
i comment on linkdin u see that
mam i comment on linkdin u see that
voice is not clear i hardly listen ur suggestion
hi joy i need ur more suggestion i feel more stuck right now
Apologies, I waited but did not get a response from you.
I will chat with you on LinkedIn tomorrow please.
I will be stepping out soon
Interesting. Thanks for sharing!
Some comments have been hidden by the post's author - find out more