PromptOwl.ai
Redesigning the prompt creation page to improve usability for non-technical users
Role: 
Product Designer
Team:
2 Designers
4 Engineers
1 Product Manager
Tools:
Figma, Zoom, Mural
Empowering users to create prompts with confidence.
Overview
PromptOwl.ai is a B2B SaaS no-code tool designed for both technical and non-technical users to create, manage, and deploy optimized AI prompts and chatbots. By simplifying generative AI workflows, PromptOwl helps teams put prompt engineering to work efficiently and effectively.

Problem Space
The prompt creation experience lacks the context needed to guide non-technical users toward fully leveraging PromptOwl's no-code platform.
Audience
This redesign focuses on improving the prompt creation experience for non-technical users while still addressing the pain points of technical users.
Solution
The solution combines tooltips, UX writing improvements, and a rearrangement of components that establishes a clear visual hierarchy, creating a more user-friendly experience.
Constraints & Problem Context
Focused Solutions on a Tight Timeline
Due to the tight project timeline and the need to deliver a beta product quickly, we prioritized getting a functional product to market, which limited our ability to deeply explore user needs and validate assumptions early on.
Limited User Research During Ideation
With limited time and resources, we substituted formal user research with rapid iterations and stakeholder feedback loops during development to refine the product.

Heuristic Evaluation
Despite research limitations, we conducted a heuristic evaluation using Jakob Nielsen's 10 Usability Heuristics and identified several usability issues in the existing prompt creation flow.
First Iteration: Wizard Flow
In our first iteration, we introduced a wizard flow to simplify user tasks and reduce the cognitive load of the extensive information required to create a prompt.

Stakeholder Feedback
Product Managers & Engineers said:
Having so many steps across different pages would frustrate technical users who already know their way around prompt engineering terminology and settings.

Only power users and technical experts will adjust LLM settings; most non-technical users will keep the default settings and only change the model.
Second Iteration: Pivot from Wizard
Stakeholder feedback indicated that the wizard flow did not serve our audience's varying levels of technical experience with AI tools. We pivoted away from the wizard flow and implemented targeted changes:

1. Introduced tooltips for contextual guidance and improved UX writing for greater clarity and approachability
2. Enhanced visual hierarchy to streamline the prompt creation process for both novice and advanced users
3. Added confirmation pop-ups to mitigate risks during critical actions
4. Revamped Calls to Action to support progress-saving and clearly communicate status within the user flow
Usability Testing
We conducted usability testing with two users to validate our redesigns and identify remaining issues. Users struggled to differentiate between the input sections that affect prompt performance, confusing "Prompt Description" with "System Context." To address this, we improved the visual separation between descriptor elements and key input fields, clarifying their distinct roles.

Before
After
User Interface Improvements
Final Solutions
01. Tooltips & Default Text
Tooltips and default text provide clear, contextual explanations and examples for complex fields like "Variables", "Blocks", and "Variations", empowering users to understand and confidently interact with the tool without external guidance.
02. Establishing Visual Hierarchy
The Variable input fields are now grouped into the System Context card since they directly affect prompt behavior. Spacing was added between Prompt Descriptors and the System Context to distinguish informational fields from actionable ones. This reorganization makes the interface cleaner and more intuitive for users.
03. LLM Model Selection Redesign
The original model dropdown allowed users to select models they did not have access to. The redesigned dropdown clearly communicates which models users can test right away and which require their own API keys.
Learnings
Cross-Functional Collaboration
Collaborated meaningfully with stakeholders, including product managers, engineers, and marketing teams, and with users through usability testing to ensure that designs aligned with business and user needs.
Jobs to Be Done Framework
Applied the Jobs to Be Done framework to identify users' core tasks and motivations, informing more efficient solutions that enhanced the product's overall usability.
Communicating Design Decisions
Regularly presented the design process to cross-functional teams in weekly standups and proactively sought out feedback.
Agile Development
Collaborated closely with cross-functional teams to iteratively design, test, and refine product features based on user feedback and changing requirements.


