In web development, an efficient and user-friendly search function is essential for a good user experience. While working on the “In Archives” project, I initially cached the entire dataset without pagination, which made performance roughly eight times worse than expected, because pagination was then applied to the whole cached dataset on every request.

To improve this, I set out to enhance the search functionality and faced several challenges. With the help of ChatGPT, I navigated these obstacles and implemented effective solutions. Here’s a summary of my journey and the problems I encountered.

The Initial Challenge

The existing search function was basic and lacked the performance needed for a growing dataset. I realized that proper caching was necessary to improve response times and user access to relevant results.

Problem 1: Caching Strategy

I needed an effective strategy to cache search results based on user queries and categories while keeping the cache manageable.

ChatGPT guided me in structuring my caching logic. It suggested generating unique cache keys based on the search query, category, and page number, allowing me to cache results for specific searches and significantly improve response times.
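
To make the idea concrete, here is a minimal sketch using made-up example values (the query "heart", category 3, page 2); the actual key-building helper and caching code appear in the implementation later in this post:

cache_key = "items_search_results_q_heart_cat_3_page_2"

Rails.cache.fetch(cache_key, expires_in: 30.days) do
  # Only runs on a cache miss; repeat searches for the same query,
  # category, and page are served straight from the cache.
  Item.search_by_title_and_content("heart")
      .where(category_id: 3)
      .page(2).per(Kaminari.config.default_per_page)
      .to_a
end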

Problem 2: Pagination Issues

As I implemented caching, I faced challenges with pagination. My initial setup only cached the first page of results, making it difficult for users to access subsequent pages efficiently.

ChatGPT helped me refine my pagination logic by suggesting that I cache results per page, including the page number in the cache key. This allowed users to access any page without re-querying the database, enhancing performance and user experience.
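
With the page number baked into the key, each page effectively gets its own cache entry, so a given page only hits the database the first time it is requested. Using the same hypothetical example values as above, the keys look like this:

items_search_results_q_heart_cat_3_page_1   # filled on the first visit to page 1
items_search_results_q_heart_cat_3_page_2   # filled on the first visit to page 2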

Problem 3: Total Count for Pagination

I also needed to accurately calculate the total count of items for pagination to ensure that pagination links reflected the correct number of items.

With ChatGPT’s help, I implemented a method to calculate the total count of items before applying pagination. This ensured accurate pagination links while benefiting from cached results.
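
The important detail is that the count is taken on the filtered relation before any page/per is applied, and cached under its own key so it isn't recomputed on every request. A simplified sketch, again with hypothetical example values (the full method appears below):

total_count = Rails.cache.fetch("items_search_results_q_heart_cat_3_count", expires_in: 30.days) do
  Item.search_by_title_and_content("heart")
      .where(category_id: 3)
      .count
end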

The Final Result

After addressing these challenges with ChatGPT’s assistance, I successfully implemented a robust search function that improved performance and user experience. The final search function now includes:

  • Efficient Caching: Results are cached based on user queries, categories, and page numbers for quick access.
  • Seamless Pagination: Users can navigate through multiple pages without performance issues.
  • Accurate Total Count: Pagination links reflect the total number of items matching the search criteria.

Next, I’ll share the search method used in my application. Input sanitization (sanitize_sql_like) is handled within the model, so user input is properly escaped before being used in database queries; a sketch of that model code follows the controller methods. Below is the updated implementation:

def search
  query = params[:q] || ""
  category_id = params[:category_id]
  cache_key = generate_cache_key(query, category_id)

  # Cache the plain array of records for this page (not the Kaminari-wrapped object)
  items = fetch_cached_items(cache_key, query, category_id)

  count_cache_key = generate_count_cache_key(query, category_id)
  @total_count = fetch_cached_count(count_cache_key, query, category_id)

  # Wrap the cached page in a Kaminari collection so the pagination view helpers still work
  @items = Kaminari.paginate_array(items, total_count: @total_count)
                   .page(params[:page])
                   .per(Kaminari.config.default_per_page)
end

private

def generate_cache_key(query, category_id)
  key = "items_search_results_q_#{query}"
  key += "_cat_#{category_id}" if category_id.present?
  key += "_page_#{params[:page]}" if params[:page].present?
  key
end

def fetch_cached_items(cache_key, query, category_id)
  Rails.cache.fetch(cache_key, expires_in: 30.days) do
    # Only executed on a cache miss: run the search, apply the optional
    # category filter, take the requested page, and materialize it with
    # to_a so the cache stores plain records rather than a lazy relation.
    items = Item.search_by_title_and_content(query)
    items = items.where(category_id: category_id) if category_id.present?
    items.page(params[:page]).per(Kaminari.config.default_per_page).to_a
  end
end

def generate_count_cache_key(query, category_id)
  # The total count does not depend on the page number, so it is cached
  # once per query/category combination.
  key = "items_search_results_q_#{query}"
  key += "_cat_#{category_id}" if category_id.present?
  "#{key}_count"
end

def fetch_cached_count(count_cache_key, query, category_id)
  Rails.cache.fetch(count_cache_key, expires_in: 30.days) do
    # Count the full filtered result set before pagination so the
    # pagination links reflect the real number of matches.
    count = Item.search_by_title_and_content(query)
    count = count.where(category_id: category_id) if category_id.present?
    count.count
  end
end
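
For completeness, here is a minimal sketch of what the model side might look like. The scope name matches the controller code above, but the column names (title, content) and the use of PostgreSQL's ILIKE are assumptions for illustration; the key point is that sanitize_sql_like escapes LIKE wildcards in the user's input before it is interpolated into the pattern:

class Item < ApplicationRecord
  # Hypothetical sketch of the search scope referenced in the controller.
  def self.search_by_title_and_content(query)
    return all if query.blank?

    # sanitize_sql_like escapes %, _ and \ so user input can't act as a wildcard
    term = "%#{sanitize_sql_like(query)}%"
    where("title ILIKE :term OR content ILIKE :term", term: term)
  end
end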

Conclusion

My journey to enhance the search functionality was filled with challenges, but with the support of ChatGPT, I was able to overcome these obstacles and implement a solution that meets users' needs. This experience not only improved the application but also demonstrated the value of leveraging AI in the development process. I look forward to continuing to refine the search capabilities and exploring new ways to enhance user experience in the future.

By sharing my journey, I hope to inspire other developers facing similar challenges to seek innovative solutions and consider the potential of AI tools like ChatGPT in their development processes.

If you want to see it in action, take a look at the Medical Dictionary, for example.