Revolutionising the Query Experience with Natural Language

About Project

As part of our commitment to enhancing user experiences through innovative technology, we embarked on a project to transform the way users interact with our Query feature. The goal was to leverage Generative AI to enable natural language querying, simplifying the process while maintaining accuracy and efficiency. This case study delves into our journey, challenges, solutions, and the positive impact on our users.

Category: Product Design
Client: Robin AI
Release: June 2023
Role: Product Designer
Tools: Figma, Maze & FigJam
Duration: 3 Months

Background

In a world dominated by Generative AI, we recognized the need to leverage this technology to enhance our software offerings at Robin. Our Query feature, which enabled users to retrieve contract data, required navigating through multiple filters, making the process cumbersome and time-consuming. Our vision was to let users search in natural language, mirroring everyday search engine experiences.

Problem Statement

  1. Complex Query Process: Users were burdened with a multitude of clicks to perform detailed queries.
  2. Filter Reliance: Existing queries required users to remember and apply filters.
  3. Incomplete Queries: Users were unable to retrieve data from unlabelled contract sections.

Design Approach

"How Might We" Questions:

  • How might we streamline the query process for users?
  • How might we make the querying experience more intuitive?
  • How might we integrate natural language querying without disrupting the existing workflow?
  • How might we optimize Generative AI for efficient query processing?
  • How might we implement this solution seamlessly?

Research Insights:

  • Limited LLM Handling: LLMs had context-window token limits, constraining how much contract data they could process in a single request (a simplified chunking sketch follows this list).
  • Data Volume Challenges: Large data inputs led to processing delays and inaccurate results.
  • Matching Prompts: Successful natural language queries often aligned with existing filters and values.
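
To make the token-limit constraint concrete, here is a minimal sketch of how contract text could be segmented to fit within a per-request budget. The MAX_TOKENS value, the word-based token estimate, and the segment_contract helper are illustrative assumptions, not Robin AI's actual implementation.

```python
# Illustrative sketch only: an assumed per-request token budget and a rough
# whitespace-based token estimate, not the production segmentation logic.
MAX_TOKENS = 3000  # assumed context budget left for contract data


def estimate_tokens(text: str) -> int:
    # Rough heuristic: roughly 1.3 tokens per whitespace-separated word.
    return int(len(text.split()) * 1.3)


def segment_contract(paragraphs: list[str], budget: int = MAX_TOKENS) -> list[str]:
    """Group contract paragraphs into segments that each fit the token budget."""
    segments: list[str] = []
    current: list[str] = []
    used = 0
    for para in paragraphs:
        cost = estimate_tokens(para)
        if current and used + cost > budget:
            segments.append("\n\n".join(current))
            current, used = [], 0
        current.append(para)
        used += cost
    if current:
        segments.append("\n\n".join(current))
    return segments
```

Each segment can then be sent to the model as a separate request, trading a single oversized prompt for several smaller, faster ones.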

Solution Implementation

Iterative Design:

  1. Token Limit Optimization: We experimented with data segmentation to fit within LLM token limits, improving processing speed while preserving accuracy.
  2. Smart Data Segmentation: Breaking queries down into smaller segments allowed the LLM to provide more accurate responses.
  3. Matching Framework: We developed an algorithm to match natural language prompts to existing filters and values, improving the precision of results (a simplified sketch follows this list).
  4. User Interface Integration: The revamped Query interface seamlessly integrated natural language input, preserving the familiarity of the existing system.
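
As a rough illustration of the matching idea, the sketch below maps terms in a natural language prompt to existing query filters via a keyword and synonym lookup. The filter names, synonym sets, and match_prompt_to_filters helper are hypothetical examples; the production framework is not described in this case study.

```python
import re

# Hypothetical filters and synonyms for illustration; the real filter
# vocabulary and matching algorithm are not shown in this case study.
FILTER_SYNONYMS = {
    "counterparty": {"counterparty", "party", "vendor", "supplier"},
    "governing_law": {"law", "jurisdiction", "governed"},
    "renewal_date": {"renewal", "renew", "expiry", "expiration"},
}


def match_prompt_to_filters(prompt: str) -> list[str]:
    """Return the existing filters whose synonyms appear in the prompt."""
    words = set(re.findall(r"[a-z]+", prompt.lower()))
    return [name for name, synonyms in FILTER_SYNONYMS.items() if words & synonyms]


# Example: only the renewal_date filter matches this prompt.
print(match_prompt_to_filters("Which contracts are up for renewal this year?"))
# -> ['renewal_date']
```

In practice, a lookup like this would sit in front of (or alongside) the LLM, so prompts that clearly map to existing filters can reuse the established query path.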

User Journey Transformation

Critical Points Addressed:

  • Reduced Clicks: The revised system minimized the number of clicks needed for complex queries.

The Golden Path:

  1. User Engagement: Users accessed the Query feature, excited about the enhanced experience.
  2. Simplified Querying: Users articulated their queries naturally, without the need for pre-defined filters.
  3. Natural Language Input: Users used everyday language to describe the properties and values they sought.
  4. Filter and Value Mapping: Our system intelligently linked natural language input to existing filters and values.
  5. Instantaneous Results: Users received accurate results, addressing their query needs effectively.

Positive Outcomes

  1. Enhanced Efficiency: Users experienced reduced effort, resulting in quicker, more efficient queries.
  2. Intuitive Interaction: The new natural language interaction resonated with users, mirroring familiar search engine experiences.
  3. Accurate Responses: Matching prompts with filters and values improved result accuracy, instilling confidence in our system.
  4. Seamless Implementation: Our solution seamlessly integrated within the existing framework, requiring minimal adaptation.

Conclusion

By embracing Generative AI and redefining the Query feature, we revolutionized the querying experience for our users. Through smart design choices, iterative development, and seamless integration, we successfully overcame challenges associated with LLM limitations and extensive data. Our commitment to creating a user-centric solution resulted in reduced clicks, intuitive interactions, and accurate results, marking a significant milestone in enhancing our software's capabilities.

Next Steps

  • Continuously refine the token optimization process for improved efficiency.
  • Expand the natural language querying capabilities to address more complex queries.
  • Implement user feedback loops to ensure ongoing usability enhancements.

By leveraging cutting-edge technology and user-centric design principles, we've paved the way for a more user-friendly and efficient querying experience, setting new standards in software usability and innovation.
