
    Maximum Inner Product is Query-Scaled Nearest Neighbor

    A theoretical paper establishing the relationship between Maximum Inner Product Search and query-scaled nearest neighbor search. This connection enables applying NN techniques to MIPS problems with theoretical guarantees.



    Overview

    Posted to arXiv in March 2025 (arXiv:2503.06882) and published in VLDB 2025, this paper establishes an important theoretical connection between MIPS and nearest neighbor search through query scaling.

    Theoretical Contribution

    The paper proves that Maximum Inner Product Search is equivalent to nearest neighbor search against a suitably scaled query. This connection:

    • Provides theoretical foundation for MIPS algorithms
    • Enables applying NN techniques to MIPS problems
    • Offers performance guarantees
    • Guides algorithm design
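
To make the connection concrete, here is a small numerical sketch (illustrative only, not the paper's algorithm): brute-force Euclidean nearest neighbor against a heavily scaled query recovers the exact MIPS answer.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 16))  # database vectors with varying norms
q = rng.normal(size=16)          # query vector

# Exact MIPS answer: index of the vector maximizing <x, q>.
mips_top = int(np.argmax(X @ q))

def nearest_to_scaled_query(X, q, c):
    """Euclidean nearest neighbor to the scaled query c*q.

    ||x - c*q||^2 = ||x||^2 - 2c<x, q> + c^2 ||q||^2.  The last term is
    constant in x, so for large enough c the -2c<x, q> term dominates
    the ||x||^2 variation and the nearest neighbor is the MIPS answer.
    """
    d2 = np.sum((X - c * q) ** 2, axis=1)
    return int(np.argmin(d2))

# With c = 1 this is plain nearest neighbor; with a large scale factor
# the Euclidean answer coincides with the maximum inner product.
assert nearest_to_scaled_query(X, q, c=1e6) == mips_top
```

The snippet only illustrates the intuition; the paper's contribution is making the relationship precise, with guarantees on when a scaled-query nearest neighbor search returns the exact MIPS results.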

    Practical Implications

    Algorithm Design

    NN algorithm advances can be adapted for MIPS with theoretical backing

    Index Structures

    NN index structures (HNSW, IVF) can be used for MIPS with appropriate transformations
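
As one illustration (a hedged sketch using the classic augmentation reduction known from earlier MIPS work, not this paper's query-scaling result), inner-product queries can be answered exactly by a Euclidean index after lifting the data by one dimension:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 8))  # database vectors
q = rng.normal(size=8)         # query vector

# Lift each x to x_hat = (x, sqrt(M^2 - ||x||^2)) with M = max norm,
# and the query to q_hat = (q, 0).  Then
#   ||x_hat - q_hat||^2 = M^2 + ||q||^2 - 2<x, q>,
# which is constant except for the inner-product term, so Euclidean
# NN over the lifted points is exactly MIPS over the originals.
norms2 = np.sum(X ** 2, axis=1)
X_hat = np.hstack([X, np.sqrt(norms2.max() - norms2)[:, None]])
q_hat = np.append(q, 0.0)

nn_lifted = int(np.argmin(np.sum((X_hat - q_hat) ** 2, axis=1)))
assert nn_lifted == int(np.argmax(X @ q))
```

In practice the lifted vectors (or the scaled queries) would be handed to an off-the-shelf L2 index such as HNSW or IVF rather than scanned by brute force.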

    Performance Analysis

    NN theoretical results transfer to MIPS domain

    MIPS Applications Benefiting

    • Recommendation systems at scale
    • Neural network attention mechanisms
    • Embedding-based retrieval
    • Matrix completion
    • Collaborative filtering

    Bridging Two Communities

    The paper bridges:

    • ANN Research: Decades of algorithms and theory
    • MIPS Applications: Practical problems in ML and IR

    This connection accelerates MIPS algorithm development by leveraging NN research.

    Technical Details

    The query-scaled transformation:

    • Scales the query vector by a factor that depends on vector norms
    • Preserves ranking (top-K results unchanged)
    • Enables using distance-based indexes
    • Maintains theoretical guarantees
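
The scaling intuition can be written out (a sketch, with $c > 0$ the scale factor; the paper's exact formulation may differ):

```latex
\|x - cq\|^2 = \|x\|^2 - 2c\,\langle x, q\rangle + c^2\|q\|^2
```

The $c^2\|q\|^2$ term is constant across database vectors, so for a sufficiently large $c$ the ranking by distance to $cq$ is governed by $\langle x, q\rangle$, which is how the top-K inner-product results survive the transformation.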

    Impact on Vector Databases

    Vector databases can now:

    • Support both NN and MIPS with unified infrastructure
    • Apply optimizations from NN domain to MIPS
    • Provide theoretical performance guarantees for both
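
A minimal sketch of such unified infrastructure (hypothetical interface, with a brute-force scan standing in for a real index; `search`, `mode`, and `scale` are illustrative names, not any database's API):

```python
import numpy as np

def search(X, q, k=5, mode="l2", scale=1e6):
    """Serve both query types with a single L2 distance routine.

    mode="l2": ordinary nearest neighbor search.
    mode="ip": MIPS, handled by scaling the query so the Euclidean
    ranking is dominated by the inner-product term (the query-scaling idea).
    """
    if mode == "ip":
        q = scale * q
    d2 = np.sum((X - q) ** 2, axis=1)
    return np.argsort(d2)[:k]

rng = np.random.default_rng(2)
X = rng.normal(size=(2000, 32))
q = rng.normal(size=32)

# The same code path answers both; "ip" mode matches brute-force MIPS.
assert list(search(X, q, mode="ip")) == list(np.argsort(-(X @ q))[:5])
```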

    Availability

    Available as arXiv preprint arXiv:2503.06882 (2025) and published in PVLDB Vol. 18, with full theoretical proofs and experimental validation.


    Information

    Website: arxiv.org
    Published: Mar 20, 2026

    Categories

    Research Papers & Surveys

    Tags

    #Mips #theory #Algorithms #nearest neighbor

    Similar Products

    Faster Maximum Inner Product Search in High Dimensions

    A 2022 research paper presenting algorithms for faster MIPS (Maximum Inner Product Search) in high-dimensional spaces. MIPS is crucial for recommendation systems, neural networks, and various machine learning applications.

    Breaking the Storage-Compute Bottleneck in Billion-Scale ANNS

    A 2025 research paper presenting a GPU-driven asynchronous I/O framework for billion-scale approximate nearest neighbor search. The system addresses the fundamental bottleneck of data movement between storage and compute in large-scale vector search.

    Graph-Based Algorithms for Diverse Similarity Search

    A 2026 research paper presenting graph-based algorithms for diverse similarity search, where results must be both similar to the query and diverse from each other. This addresses the common problem of redundant results in traditional similarity search.

    In-Place Updates of Graph Index

    A 2026 research paper on streaming approximate nearest neighbor search with in-place graph index updates. The approach enables real-time index modifications without expensive rebuilds, crucial for dynamic datasets.

    JAG

    Joint Attribute Graphs for Filtered Nearest Neighbor Search, a research paper that addresses the challenge of combining vector similarity search with attribute filtering. JAG presents a novel index structure that efficiently handles filtered ANN queries common in real-world applications.

    LLMs Meet Isolation Kernel

    A research paper introducing lightweight, learning-free binary embeddings for fast retrieval. The approach uses isolation kernels to generate binary embeddings that dramatically reduce storage requirements (32× compression) while maintaining retrieval quality.

    Copyright © 2025 Awesome Vector Databases. All rights reserved.