7.1: Time complexity and common uses of hash tables


Hash tables are often used to implement associative arrays, sets, and caches. Like arrays, hash tables provide constant-time O(1) lookup on average, regardless of the number of items in the table. The (hopefully rare) worst-case lookup time in most hash table schemes is O(n).[1] Compared to other associative array data structures, hash tables are most useful when we need to store a large number of data records.
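
As a quick illustration of these uses, here is a minimal sketch, assuming Java's standard HashMap and HashSet (the class name and sample data are made up for the example), of a hash table serving as an associative array and as a set, both with average O(1) operations:

    // Sketch: Java's built-in hash table classes used as an associative array and a set.
    // Both offer average O(1) put/get/contains, independent of the number of entries.
    import java.util.HashMap;
    import java.util.HashSet;
    import java.util.Map;
    import java.util.Set;

    public class HashTableUses {
        public static void main(String[] args) {
            // Associative array: a phone book mapping names to numbers (illustrative data).
            Map<String, String> phoneBook = new HashMap<>();
            phoneBook.put("Alice", "555-0100");
            phoneBook.put("Bob", "555-0199");
            System.out.println(phoneBook.get("Alice"));      // average O(1) lookup

            // Set: membership tests without storing associated values.
            Set<String> visited = new HashSet<>();
            visited.add("page-1");
            System.out.println(visited.contains("page-1"));  // average O(1) membership test
        }
    }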

Hash tables may be used as in-memory data structures. Hash tables may also be adopted for use with persistent data structures; database indexes commonly use disk-based data structures based on hash tables.

Hash tables are also used to speed up string searching in many implementations of data compression.

In computer chess, a hash table can be used to implement the transposition table.
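
The following is a deliberately simplified sketch of that idea, not how a real engine stores its table: positions are represented here by a plain long hash (real engines typically use Zobrist hashing and also record depth, bounds, and the best move) and mapped to cached scores.

    // Simplified sketch of a transposition table; the key and value types are illustrative.
    import java.util.HashMap;
    import java.util.Map;

    public class TranspositionTable {
        private final Map<Long, Integer> table = new HashMap<>();  // position hash -> cached score

        public void store(long positionHash, int score) {
            table.put(positionHash, score);
        }

        public Integer lookup(long positionHash) {
            return table.get(positionHash);   // null if this position has not been seen before
        }
    }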

    Footnotes

1. The simplest hash table schemes -- "open addressing with linear probing", "separate chaining with linked lists", etc. -- have O(n) lookup time in the worst case, where (accidentally or maliciously) most keys "collide", that is, most keys are hashed to one or a few buckets. Other hash table schemes -- "cuckoo hashing", "dynamic perfect hashing", etc. -- guarantee O(1) lookup time even in the worst case: when a new key is inserted, such schemes change their hash function whenever necessary to avoid collisions.
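
To make the footnote concrete, here is a minimal separate-chaining table in Java (an illustrative sketch, not a production implementation). With a well-distributed hash the chains stay short and get() is O(1) on average; if every key lands in one bucket, get() degenerates into an O(n) scan.

    // Minimal separate-chaining hash table (illustrative sketch, not production code).
    // Lookup walks one bucket's chain: O(1) on average with a good hash,
    // but O(n) if all keys collide into a single bucket.
    import java.util.ArrayList;
    import java.util.List;

    public class ChainedHashTable<K, V> {
        private static final int BUCKETS = 16;

        private static class Entry<K, V> {
            final K key; V value;
            Entry(K key, V value) { this.key = key; this.value = value; }
        }

        @SuppressWarnings("unchecked")
        private final List<Entry<K, V>>[] buckets = new List[BUCKETS];

        private int indexFor(K key) {
            return Math.floorMod(key.hashCode(), BUCKETS);   // reduce the hash to a bucket index
        }

        public void put(K key, V value) {
            int i = indexFor(key);
            if (buckets[i] == null) buckets[i] = new ArrayList<>();
            for (Entry<K, V> e : buckets[i]) {               // linear scan of the chain
                if (e.key.equals(key)) { e.value = value; return; }
            }
            buckets[i].add(new Entry<>(key, value));
        }

        public V get(K key) {
            int i = indexFor(key);
            if (buckets[i] == null) return null;
            for (Entry<K, V> e : buckets[i]) {               // O(chain length): O(n) worst case
                if (e.key.equals(key)) return e.value;
            }
            return null;
        }
    }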

    FAQs

What is the time complexity of hash tables?

The average time complexity for lookups, insertions, and deletions is O(1). Collisions can slow these operations down to O(n). Space complexity is O(n), because the table must store all of the keys and values, and its size is proportional to the number of entries.

What is the time complexity of hash set?

The time complexity of the basic HashSet operations (add, remove, contains, and size) is O(1), provided the hashing function distributes user-defined objects well. As collisions increase, performance degrades toward O(n). Java 8 introduced balanced trees within heavily collided buckets to better resolve collisions.
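
The degradation is easy to provoke with a user-defined key whose hashCode() is constant; the BadKey class below is invented purely for this sketch:

    // Sketch: a user-defined key whose hashCode() sends every instance to the same bucket.
    // With this class, HashSet/HashMap operations degrade from average O(1) toward O(n).
    // (Java 8+ can convert long chains into balanced trees, which mitigates this,
    // especially when keys are Comparable.)
    import java.util.HashSet;
    import java.util.Set;

    public class BadHashDemo {
        // Hypothetical key type with a deliberately terrible hash function.
        static class BadKey {
            final int id;
            BadKey(int id) { this.id = id; }
            @Override public boolean equals(Object o) {
                return o instanceof BadKey && ((BadKey) o).id == id;
            }
            @Override public int hashCode() { return 42; }   // every key collides
        }

        public static void main(String[] args) {
            Set<BadKey> set = new HashSet<>();
            for (int i = 0; i < 10_000; i++) set.add(new BadKey(i));   // noticeably slower than with a good hash
            System.out.println(set.contains(new BadKey(9_999)));
        }
    }

Replacing hashCode() with one based on id restores the expected constant-time behavior.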

What is the time complexity of hash sort?

The hash sort algorithm has a linear time complexity factor, even in the worst case. Hash sort opens an area for further work and investigation into alternative means of sorting.

What are hash tables and why are they useful?

A hash table is a data structure in which the address (index) of each data element is generated by a hash function applied to its key. This enables very fast data access, because the key leads almost directly to the location of its value.
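
In sketch form (Java, with an illustrative table size and key; reducing the hash with floorMod is one common choice, not the only one), the index is derived like this:

    // Sketch: mapping a key to a bucket index. The capacity (16) and the key are illustrative.
    public class IndexDemo {
        public static void main(String[] args) {
            int capacity = 16;                         // number of buckets
            String key = "alice";
            int hash = key.hashCode();                 // hash function output
            int index = Math.floorMod(hash, capacity); // reduce the hash to a valid array index
            System.out.println("key " + key + " -> bucket " + index);
        }
    }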

What is the time complexity of hash code?

Hash table time complexity, in big O notation:

Operation   Average   Worst case
Search      Θ(1)      O(n)
Insert      Θ(1)      O(n)
Delete      Θ(1)      O(n)
(2 more rows omitted)

What is the complexity of hash table chaining?

Separate chaining combines a linked list with a hash table in order to resolve collisions, at the cost of extra memory. The worst-case time complexity of both searching for and deleting an element is O(n), where n is the number of keys that hash to the same slot.

What is a good hash function complexity?

The time and space complexity of a hash map (or hash table) is not O(n) for every operation. With a well-designed hash function that is cheap to compute and spreads keys evenly across buckets, the typical and desired time complexity for basic operations such as insertion, lookup, and deletion is O(1) on average.
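
Assuming Java, one common way to get a cheap, reasonably well-distributed hash for a user-defined key is to delegate to Objects.hash over the significant fields; the Point class below is illustrative:

    // Sketch: a user-defined key with an inexpensive, reasonably well-distributed hash.
    // Objects.hash combines the fields; equals() must agree with hashCode().
    import java.util.Objects;

    public final class Point {
        private final int x;
        private final int y;

        public Point(int x, int y) { this.x = x; this.y = y; }

        @Override public boolean equals(Object o) {
            if (this == o) return true;
            if (!(o instanceof Point)) return false;
            Point p = (Point) o;
            return x == p.x && y == p.y;
        }

        @Override public int hashCode() {
            return Objects.hash(x, y);   // constant cost, independent of map size
        }
    }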

What is the worst-case time complexity of HashMap?

Even though it is very rare in practice, the time complexity of a HashMap lookup is O(n) in the worst case, when many keys collide into the same bucket. Even so, when analyzing the complexity of a larger function or program, hash map lookups are often treated as O(1) (constant).

What is the time complexity of linked hash set?

For HashSet and LinkedHashSet, the add(), remove(), and contains() operations cost constant O(1) time thanks to the internal HashMap implementation (EnumSet achieves the same bound with a bit-vector representation). By contrast, TreeSet has O(log n) time complexity for the same operations, because it is backed by a TreeMap.

What is the fastest time complexity for sorting?

Which is the best sorting algorithm? The time complexity of Quicksort is O(n log n) in the best and average cases and O(n^2) in the worst case. Because it has the upper hand in the average case for most inputs, Quicksort is generally considered the “fastest” sorting algorithm.

Can a hash table be used for sorting?

Sorting with hash tables and hash functions

Can we use hashing for sorting? Not in the general case: the items we hash need not have total-order semantics. One of the things that makes hashing interesting is that it gives us efficient searching without requiring underlying sorting.
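
A small sketch of that point, assuming Java's standard collections: a hash-based set has no meaningful iteration order, while a tree-based set keeps its elements sorted at O(log n) per operation instead of O(1).

    // Sketch: hash-based containers do not maintain any sorted order; a tree-based one does.
    import java.util.HashSet;
    import java.util.TreeSet;

    public class OrderDemo {
        public static void main(String[] args) {
            HashSet<Integer> hashed = new HashSet<>();
            TreeSet<Integer> sorted = new TreeSet<>();
            for (int v : new int[] {42, 7, 19, 3}) {
                hashed.add(v);
                sorted.add(v);
            }
            System.out.println(hashed);   // iteration order depends on the hash layout
            System.out.println(sorted);   // [3, 7, 19, 42] -- kept sorted by the tree
        }
    }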

Which sort has the worst time complexity?

Worst-case time complexity of common sorting algorithms:

Algorithm        Data structure   Worst case
Heap sort        Array            O(n log n)
Smooth sort      Array            O(n log n)
Bubble sort      Array            O(n^2)
Insertion sort   Array            O(n^2)
(4 more rows omitted)

When should you use hash tables?

Use hash tables when you need to store key-value pairs and perform frequent lookup, insert, or delete operations. Use sets when you need to store unique elements, do not need to keep duplicates, and only need operations such as checking whether an element is present.

What is the time complexity of searching in hash table?

The average complexity to search, insert, and delete data in a hash table is O(1), constant time. On average, a single hash table lookup is sufficient to find the desired bucket, regardless of which of these operations is being performed.

Why not always use hash tables?

Hash tables are vulnerable to denial-of-service (DoS) attacks. One type of DoS attack on hash tables is known as a "collision attack": because hash tables use a hash function to map keys to indices in an array of buckets, an attacker who can predict the hash function can submit many keys that all land in the same bucket, degrading every operation to a linear scan.
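
To see why this matters, note that Java's String.hashCode() is public and easy to invert; for example, "Aa" and "BB" hash to the same value, and such blocks can be concatenated to produce arbitrarily many colliding keys. A small illustrative sketch:

    // Sketch: Java string keys that collide under String.hashCode().
    // An attacker can generate many such keys to force one bucket to hold everything,
    // turning average O(1) lookups into linear scans (a hash-flooding DoS).
    public class CollisionDemo {
        public static void main(String[] args) {
            System.out.println("Aa".hashCode());    // 2112
            System.out.println("BB".hashCode());    // 2112 -- same hash, different key
            System.out.println("AaAa".hashCode() == "BBBB".hashCode());  // true: blocks compose
        }
    }

Languages and libraries mitigate this with randomized hashing or, as in Java 8's HashMap, by converting overloaded buckets into balanced trees.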

Why is lookup in HashMap O(1)?

We call hash map lookups O(1) because, in practice, we imagine the hash map has a key space of a fixed size (even though such a constraint would mean the hash map has a fixed maximum size, and therefore, strictly speaking, no asymptotic behavior, since it cannot scale to infinity).

What is the time complexity of HashMap contains?

HashMap.containsKey() is crucial for performance optimization, especially when dealing with large data sets. Its time complexity is generally O(1), meaning it runs in constant time.

What is the time complexity of hash table in C#?

Hashtables are your go-to when you want rapid retrieval (the complexity of search operations in a Hashtable is O(1), which basically means it is very quick) and when you need to store data in key-value pairs.

Are hash tables slow?

Java's Hashtable class is usually considered slower than other map implementations such as HashMap, primarily because its methods are synchronized.
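
For completeness, a brief sketch of the usual alternatives in Java: a plain HashMap when no synchronization is needed, and ConcurrentHashMap when the map is shared across threads; both are generally preferred over the legacy Hashtable.

    // Sketch: common alternatives to the legacy, fully synchronized Hashtable.
    import java.util.HashMap;
    import java.util.Map;
    import java.util.concurrent.ConcurrentHashMap;

    public class MapChoices {
        public static void main(String[] args) {
            // Single-threaded (or externally synchronized) code: plain HashMap, no locking overhead.
            Map<String, Integer> counts = new HashMap<>();
            counts.merge("apple", 1, Integer::sum);

            // Shared across threads: ConcurrentHashMap locks at a finer granularity than Hashtable.
            Map<String, Integer> shared = new ConcurrentHashMap<>();
            shared.merge("apple", 1, Integer::sum);

            System.out.println(counts + " " + shared);
        }
    }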
