In JDK 1.6, HashMap was implemented as **an array of buckets + linked lists**: collisions are handled by chaining, so all entries whose hashes map to the same bucket are stored in a single linked list. When one bucket holds many entries (i.e., many keys with equal hash values), looking up a value by key degenerates into a linear scan of the list, which is slow. In JDK 1.8 (the version examined here is 1.8.0_25), HashMap is instead implemented as

**an array of buckets + linked lists + red-black trees. When a bucket's list grows past a threshold (8), it is converted into a red-black tree, which greatly reduces lookup time (the search complexity drops from O(n) to O(log n)).**

**1. Data structures involved:** the array of buckets, plus the linked lists and red-black trees that handle hash collisions.

```java
// Node is a singly-linked list node; it implements the Map.Entry interface
static class Node<K,V> implements Map.Entry<K,V> {
    final int hash;
    final K key;
    V value;
    Node<K,V> next;

    // Constructor: hash value, key, value, next node
    Node(int hash, K key, V value, Node<K,V> next) {
        this.hash = hash;
        this.key = key;
        this.value = value;
        this.next = next;
    }

    public final K getKey()        { return key; }
    public final V getValue()      { return value; }
    public final String toString() { return key + "=" + value; }

    public final int hashCode() {
        return Objects.hashCode(key) ^ Objects.hashCode(value);
    }

    public final V setValue(V newValue) {
        V oldValue = value;
        value = newValue;
        return oldValue;
    }

    // Two nodes are equal if both key and value are equal;
    // a node also compares equal to itself
    public final boolean equals(Object o) {
        if (o == this)
            return true;
        if (o instanceof Map.Entry) {
            Map.Entry<?,?> e = (Map.Entry<?,?>) o;
            if (Objects.equals(key, e.getKey()) &&
                Objects.equals(value, e.getValue()))
                return true;
        }
        return false;
    }
}
```

```java
// Red-black tree node
static final class TreeNode<K,V> extends LinkedHashMap.Entry<K,V> {
    TreeNode<K,V> parent;  // parent node
    TreeNode<K,V> left;    // left subtree
    TreeNode<K,V> right;   // right subtree
    TreeNode<K,V> prev;    // needed to unlink next upon deletion
    boolean red;           // color property

    TreeNode(int hash, K key, V val, Node<K,V> next) {
        super(hash, key, val, next);
    }

    // Returns the root of the tree containing this node
    final TreeNode<K,V> root() {
        for (TreeNode<K,V> r = this, p;;) {
            if ((p = r.parent) == null)
                return r;
            r = p;
        }
    }
}

transient Node<K,V>[] table; // the array of buckets
```

With these three data structures in hand, anyone with a little data structures background can roughly picture how HashMap works: essentially an array whose elements each head a linked list (loosely speaking). When you add an element (a key-value pair), the key's hash is computed first to determine the element's position in the bucket array. An element with the same hash may already occupy that position, in which case the new key is compared against the existing one via equals: if the keys are equal, the new entry's value replaces the old one; if they differ, the new entry is appended to the linked list. When the list grows too long, it is converted into a red-black tree, which greatly improves lookup efficiency.
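The collision behavior described above is easy to observe from outside the class. The sketch below uses a hypothetical key type (`BadKey`, my own name) whose hashCode is deliberately constant, so every instance lands in the same bucket; equal keys replace the value, unequal keys chain:

```java
import java.util.HashMap;

public class CollisionDemo {
    // Hypothetical key: constant hashCode forces every instance into one bucket
    static final class BadKey {
        final String name;
        BadKey(String name) { this.name = name; }
        @Override public int hashCode() { return 42; }
        @Override public boolean equals(Object o) {
            return o instanceof BadKey && ((BadKey) o).name.equals(name);
        }
    }

    public static void main(String[] args) {
        HashMap<BadKey, Integer> map = new HashMap<>();
        map.put(new BadKey("a"), 1); // same bucket,
        map.put(new BadKey("b"), 2); // chained in a linked list
        map.put(new BadKey("a"), 3); // equals() match: value replaced
        System.out.println(map.size());               // prints 2
        System.out.println(map.get(new BadKey("a"))); // prints 3
    }
}
```

Real-world keys with well-spread hash codes avoid this degenerate chaining; the demo only makes the equals-based replacement visible.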

**2. HashMap's main fields**

First, the load factor. The default is 0.75: once the number of stored elements exceeds 75% of the allocated capacity, the table is resized. A large load factor means the allocated space is used thoroughly, but lookup efficiency drops because the linked lists get long (though the red-black tree conversion in the latest version mitigates this considerably). HashMap fundamentally trades space for time, so the load factor should not be too large; but if it is too small, space is wasted. If memory matters most, use a slightly larger load factor; if lookup performance matters most, use a slightly smaller one.
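As a quick sanity check of the threshold arithmetic above (a sketch; the class and method names are mine, not part of HashMap's API):

```java
public class LoadFactorDemo {
    // threshold = capacity * loadFactor; a resize is triggered once size exceeds it
    static int threshold(int capacity, float loadFactor) {
        return (int) (capacity * loadFactor);
    }

    public static void main(String[] args) {
        // Default HashMap: capacity 16, load factor 0.75 -> threshold 12
        System.out.println(threshold(16, 0.75f)); // prints 12
        // After one doubling: capacity 32 -> threshold 24
        System.out.println(threshold(32, 0.75f)); // prints 24
    }
}
```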

```java
public class HashMap<K,V> extends AbstractMap<K,V>
        implements Map<K,V>, Cloneable, Serializable {

    private static final long serialVersionUID = 362498820763181265L;

    static final int DEFAULT_INITIAL_CAPACITY = 1 << 4; // aka 16
    static final int MAXIMUM_CAPACITY = 1 << 30;        // maximum capacity
    static final float DEFAULT_LOAD_FACTOR = 0.75f;     // default load factor

    // When an add brings a bucket's list length to 8, the list is converted to a red-black tree
    static final int TREEIFY_THRESHOLD = 8;
    static final int UNTREEIFY_THRESHOLD = 6;
    static final int MIN_TREEIFY_CAPACITY = 64;

    transient Node<K,V>[] table;            // the array storing the elements
    transient Set<Map.Entry<K,V>> entrySet;
    transient int size;                     // number of stored elements
    transient int modCount;                 // modification count, for the fail-fast mechanism
    int threshold;                          // resize threshold: when size exceeds (capacity * load factor), the table is resized
    final float loadFactor;                 // load factor
    // ... (remaining members omitted)
```

**3. Constructors**

HashMap has four constructors. The main parameters involved are a specified initial capacity, a specified load factor, and another Map. Go straight to the code:

```java
/* ---------------- Public operations ---------------- */

// Constructor 1
public HashMap(int initialCapacity, float loadFactor) {
    // The specified initial capacity must be non-negative
    if (initialCapacity < 0)
        throw new IllegalArgumentException("Illegal initial capacity: " +
                                           initialCapacity);
    // If the specified initial capacity exceeds the maximum, clamp it to the maximum
    if (initialCapacity > MAXIMUM_CAPACITY)
        initialCapacity = MAXIMUM_CAPACITY;
    // The load factor must be positive
    if (loadFactor <= 0 || Float.isNaN(loadFactor))
        throw new IllegalArgumentException("Illegal load factor: " +
                                           loadFactor);
    this.loadFactor = loadFactor;
    this.threshold = tableSizeFor(initialCapacity); // new resize threshold
}

// Constructor 2
public HashMap(int initialCapacity) {
    this(initialCapacity, DEFAULT_LOAD_FACTOR);
}

// Constructor 3
public HashMap() {
    this.loadFactor = DEFAULT_LOAD_FACTOR; // all other fields defaulted
}

// Constructor 4: initialize the hash map with the entries of m
public HashMap(Map<? extends K, ? extends V> m) {
    this.loadFactor = DEFAULT_LOAD_FACTOR;
    putMapEntries(m, false);
}
```

**4. The resize (expansion) mechanism**

When the hash table is constructed without a specified initial size, the default size is 16 (i.e., the Node array has length 16). Once the number of elements stored in Node[] reaches (load factor * Node.length), the table is expanded to twice its size.

```java
// Initializes the table, or doubles its size
final Node<K,V>[] resize() {
    Node<K,V>[] oldTab = table;
    int oldCap = (oldTab == null) ? 0 : oldTab.length;
    int oldThr = threshold;
    int newCap, newThr = 0;
    if (oldCap > 0) {
        if (oldCap >= MAXIMUM_CAPACITY) { // already at 1 << 30: cannot grow, only raise the threshold
            threshold = Integer.MAX_VALUE;
            return oldTab;
        }
        else if ((newCap = oldCap << 1) < MAXIMUM_CAPACITY &&
                 oldCap >= DEFAULT_INITIAL_CAPACITY) // new capacity is twice the old; the minimum is 16
            newThr = oldThr << 1; // double the resize threshold
    }
    else if (oldThr > 0)
        newCap = oldThr; // oldCap == 0 and oldThr > 0; newThr stays 0 here
    else { // oldCap == 0, oldThr == 0: initialize with the default load factor and capacity
        newCap = DEFAULT_INITIAL_CAPACITY;
        newThr = (int)(DEFAULT_LOAD_FACTOR * DEFAULT_INITIAL_CAPACITY);
    }
    if (newThr == 0) {
        float ft = (float)newCap * loadFactor;
        newThr = (newCap < MAXIMUM_CAPACITY && ft < (float)MAXIMUM_CAPACITY ?
                  (int)ft : Integer.MAX_VALUE);
    }
    threshold = newThr;
    @SuppressWarnings({"rawtypes","unchecked"})
    Node<K,V>[] newTab = (Node<K,V>[])new Node[newCap];
    table = newTab;
    // Move the entries from the old array into the new one,
    // handling tree bins and list bins separately
    if (oldTab != null) {
        for (int j = 0; j < oldCap; ++j) {
            Node<K,V> e;
            if ((e = oldTab[j]) != null) {
                oldTab[j] = null;
                if (e.next == null)
                    newTab[e.hash & (newCap - 1)] = e;
                else if (e instanceof TreeNode)
                    ((TreeNode<K,V>)e).split(this, newTab, j, oldCap);
                else { // preserve order
                    Node<K,V> loHead = null, loTail = null;
                    Node<K,V> hiHead = null, hiTail = null;
                    Node<K,V> next;
                    do {
                        next = e.next;
                        if ((e.hash & oldCap) == 0) {
                            if (loTail == null)
                                loHead = e;
                            else
                                loTail.next = e;
                            loTail = e;
                        }
                        else {
                            if (hiTail == null)
                                hiHead = e;
                            else
                                hiTail.next = e;
                            hiTail = e;
                        }
                    } while ((e = next) != null);
                    if (loTail != null) {
                        loTail.next = null;
                        newTab[j] = loHead;
                    }
                    if (hiTail != null) {
                        hiTail.next = null;
                        newTab[j + oldCap] = hiHead;
                    }
                }
            }
        }
    }
    return newTab;
}
```

Clearly, because every element in the old array must be moved into the new array, resizing is time-consuming.

**5. Determining an element's position in Node[] for put/get**

```java
static final int hash(Object key) {
    int h;
    return (key == null) ? 0 : (h = key.hashCode()) ^ (h >>> 16);
}

public native int hashCode();
```

First, the key is passed through hash(key) to obtain a hash value h, and h & (length - 1) then yields the array index. Common hashing schemes for hash tables include direct addressing and the division (remainder) method; the goal is a function that is cheap to compute and minimizes collisions.

Hashtable uses the division method to spread its entries, as follows:

```java
int index = (hash & 0x7FFFFFFF) % tab.length;
```

But division is slow. HashMap computes the array index with h & (length - 1) instead of taking the remainder, which is much more efficient.
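The two computations agree for any non-negative hash whenever the table length is a power of two, which is exactly why HashMap can use the cheaper mask. A small sketch (class and method names are mine):

```java
public class IndexDemo {
    // HashMap's index computation, valid when length is a power of two
    static int byMask(int h, int length) { return h & (length - 1); }

    // Hashtable's division-based computation
    static int byMod(int h, int length)  { return (h & 0x7FFFFFFF) % length; }

    public static void main(String[] args) {
        int length = 16; // a power of two, like HashMap's table length
        for (int h : new int[] {0, 5, 17, 31, 123456, Integer.MAX_VALUE}) {
            // For non-negative h and power-of-two length, the results match
            System.out.println(h + " -> " + byMask(h, length)
                                 + " == " + byMod(h, length));
        }
    }
}
```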

In the HashMap implementation you can also see the following code, which replaces the while loop used in JDK 1.6 to keep the table capacity a power of two: shift-and-OR operations take the place of the loop.

```java
// This code guarantees that HashMap's capacity is always a power of two:
// it returns the smallest power of two greater than or equal to cap
static final int tableSizeFor(int cap) {
    int n = cap - 1;
    n |= n >>> 1;
    n |= n >>> 2;
    n |= n >>> 4;
    n |= n >>> 8;
    n |= n >>> 16;
    return (n < 0) ? 1 : (n >= MAXIMUM_CAPACITY) ? MAXIMUM_CAPACITY : n + 1;
}
```

As the source shows, HashMap's constructors call tableSizeFor directly or indirectly. Here is why: when length is a power of two, every low-order bit of length - 1 (in binary) is 1, so the index computed by h & (length - 1) can end in either 0 or 1, preserving the uniformity of the hash. Conversely, if length were odd, the last bit of length - 1 would be 0, so the last bit of h & (length - 1) would always be 0; every index would be even, all the odd array slots would never hold elements, and a great deal of space would be wasted.

*In short: a power-of-two length keeps every bit of the bitwise AND effective, so the hash table is spread more uniformly.*
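The shift-and-OR version of tableSizeFor can be checked in isolation. The sketch below copies the corrected JDK 8 logic into a standalone class (the class name is mine):

```java
public class TableSizeDemo {
    static final int MAXIMUM_CAPACITY = 1 << 30;

    // Smallest power of two >= cap: each shift-OR smears the highest set bit
    // of (cap - 1) into all lower positions, then +1 rounds up
    static int tableSizeFor(int cap) {
        int n = cap - 1;
        n |= n >>> 1;
        n |= n >>> 2;
        n |= n >>> 4;
        n |= n >>> 8;
        n |= n >>> 16;
        return (n < 0) ? 1 : (n >= MAXIMUM_CAPACITY) ? MAXIMUM_CAPACITY : n + 1;
    }

    public static void main(String[] args) {
        System.out.println(tableSizeFor(10)); // prints 16
        System.out.println(tableSizeFor(16)); // prints 16 (already a power of two)
        System.out.println(tableSizeFor(17)); // prints 32
    }
}
```

The `cap - 1` at the start is what makes an exact power of two map to itself rather than to the next power up.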

**6. Analyzing HashMap's most common operations: put and get**

Note that HashMap allows both the key and the value to be null.

Go straight to the code:
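A brief illustration of the null tolerance just mentioned (a null key hashes to 0 and lives in bucket 0; the class name is mine):

```java
import java.util.HashMap;

public class NullKeyDemo {
    public static void main(String[] args) {
        HashMap<String, String> map = new HashMap<>();
        map.put(null, "nullKey");          // a null key is allowed: hash(null) == 0
        map.put("k", null);                // null values are allowed too
        System.out.println(map.get(null)); // prints nullKey
        System.out.println(map.get("k"));  // prints null
        System.out.println(map.size());    // prints 2
    }
}
```

Note that `get("k")` returning null is ambiguous: the key may be absent or mapped to null, which is why containsKey exists.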

```java
/* ------------------------------- get ------------------------------- */

public V get(Object key) {
    Node<K,V> e;
    return (e = getNode(hash(key), key)) == null ? null : e.value;
}

final Node<K,V> getNode(int hash, Object key) {
    Node<K,V>[] tab; Node<K,V> first, e; int n; K k;
    // hash & (length - 1) locates the object's bucket
    if ((tab = table) != null && (n = tab.length) > 0 &&
        (first = tab[(n - 1) & hash]) != null) {
        if (first.hash == hash && // always check first node
            ((k = first.key) == key || (key != null && key.equals(k))))
            return first;
        if ((e = first.next) != null) {
            // If the first node is a TreeNode, this bucket resolves
            // collisions with a red-black tree: search the tree
            if (first instanceof TreeNode)
                return ((TreeNode<K,V>)first).getTreeNode(hash, key);
            // Otherwise walk the linked list
            do {
                if (e.hash == hash &&
                    ((k = e.key) == key || (key != null && key.equals(k))))
                    return e;
            } while ((e = e.next) != null);
        }
    }
    return null;
}
```

```java
/* ------------------------------- put ------------------------------- */

public V put(K key, V value) {
    return putVal(hash(key), key, value, false, true);
}

final V putVal(int hash, K key, V value, boolean onlyIfAbsent,
               boolean evict) {
    Node<K,V>[] tab; Node<K,V> p; int n, i;
    // If tab is null or has length 0, allocate it via resize()
    if ((tab = table) == null || (n = tab.length) == 0)
        n = (tab = resize()).length;
    // (n - 1) & hash finds the put position; if the bucket is empty, put directly
    if ((p = tab[i = (n - 1) & hash]) == null)
        tab[i] = newNode(hash, key, value, null);
    else {
        Node<K,V> e; K k;
        // The first node's hash matches and its key equals the inserted key
        if (p.hash == hash &&
            ((k = p.key) == key || (key != null && key.equals(k))))
            e = p;
        else if (p instanceof TreeNode) // this bucket resolves collisions with a red-black tree
            e = ((TreeNode<K,V>)p).putTreeVal(this, tab, hash, key, value);
        else {
            // this bucket resolves collisions with a linked list
            for (int binCount = 0; ; ++binCount) {
                // p starts at the head of the list and advances node by node
                if ((e = p.next) == null) {
                    // e is null: the end of the list was reached without finding
                    // a node with the same key, so create a new node
                    p.next = newNode(hash, key, value, null);
                    // If adding the node brings the count to the threshold,
                    // convert the list to a red-black tree
                    if (binCount >= TREEIFY_THRESHOLD - 1) // -1 for 1st
                        treeifyBin(tab, hash);
                    break;
                }
                // allow null == null
                if (e.hash == hash &&
                    ((k = e.key) == key || (key != null && key.equals(k))))
                    break;
                p = e; // advance p to the next node
            }
        }
        // A node with the same hash and key exists: update its value
        if (e != null) { // existing mapping for key
            V oldValue = e.value;
            if (!onlyIfAbsent || oldValue == null)
                e.value = value;
            afterNodeAccess(e);
            return oldValue;
        }
    }
    ++modCount;
    if (++size > threshold)
        resize();
    afterNodeInsertion(evict);
    return null;
}
```

To briefly summarize the process of adding a key-value pair with put(key, value) (in fact, the code logic above is clearer):

1. Check whether the array tab[] holding the key-value pairs is null or empty; if so, call resize().
2. Compute the insertion index i from hash(key). If tab[i] == null, create a new node there directly; otherwise go to step 3.
3. Determine whether this bucket resolves hash collisions with a linked list or a red-black tree (just check the type of the first node), and handle each case accordingly.
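One externally visible consequence of the putVal logic above is its return value: when an existing mapping is updated, the old value is returned; when the key was absent, null is returned. A quick sketch (class name is mine):

```java
import java.util.HashMap;

public class PutDemo {
    public static void main(String[] args) {
        HashMap<String, Integer> map = new HashMap<>();
        System.out.println(map.put("a", 1)); // prints null: key was absent
        System.out.println(map.put("a", 2)); // prints 1: old value returned, mapping replaced
        System.out.println(map.get("a"));    // prints 2
    }
}
```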

Reference link: http://www.125135.com/588285.htm

