4.1 Distance
The distance between two nodes is the length of the shortest path between them.
4.2 Breadth-first search
All paths in the breadth-first search tree from the starting vertex s are shortest possible; it is therefore a shortest-path tree.
procedure bfs(G, s)
Input: Graph G = (V, E), directed or undirected; vertex s ∈ V
Output: For all vertices u reachable from s, dist(u) is set to the distance from s to u.

for all u ∈ V:
    dist(u) = ∞
dist(s) = 0
Q = [s]  (queue containing just s)
while Q is not empty:
    u = eject(Q)
    for all edges (u, v) ∈ E:
        if dist(v) = ∞:
            inject(Q, v)
            dist(v) = dist(u) + 1
The overall running time of this algorithm is linear, O(|V| + |E|).
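A minimal Python sketch of the same procedure, assuming the graph is given as an adjacency list `adj` mapping each vertex to a list of its neighbors (the names are illustrative, not from the original pseudocode):

```python
from collections import deque

def bfs(adj, s):
    """Breadth-first search: return dist, the distance from s to every
    reachable vertex (unreachable vertices keep distance infinity)."""
    dist = {u: float("inf") for u in adj}
    dist[s] = 0
    q = deque([s])                 # queue containing just s
    while q:
        u = q.popleft()            # eject
        for v in adj[u]:
            if dist[v] == float("inf"):
                q.append(v)        # inject
                dist[v] = dist[u] + 1
    return dist
```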
comparison | DFS | BFS
---|---|---
search strategy | Depth-first search makes deep incursions into a graph, retreating only when it runs out of new nodes to visit. | Breadth-first search visits vertices in increasing order of their distance from the starting point.
implementation | stack | queue
application | cycle test, connectivity test, SCC (directed), topological sort | cycle test, connectivity test, shortest paths
time complexity | adjacency matrix Θ(|V|²), adjacency list Θ(|V|+|E|) | same as DFS
4.4 Dijkstra’s algorithm
There is a nice article on Zhihu introducing Dijkstra's algorithm; take a look if interested. (Not an advertisement, no ad fee.)
procedure dijkstra(G, l, s)
Input: Graph G = (V, E), directed or undirected; positive edge lengths {l_e : e ∈ E}; vertex s ∈ V
Output: For all vertices u reachable from s, dist(u) is set to the distance from s to u.

for all u ∈ V:
    dist(u) = ∞
    prev(u) = nil
dist(s) = 0
H = makequeue(V)  (using dist-values as keys)
while H is not empty:
    u = deletemin(H)
    for all edges (u, v) ∈ E:
        if dist(v) > dist(u) + l(u, v):
            dist(v) = dist(u) + l(u, v)
            prev(v) = u                  // record the predecessor on the shortest path
            decreasekey(H, v)            // decrease v's key (its dist value) in the queue
Running time
Dijkstra’s algorithm is structurally identical to breadth-first search. However, it is slower, because the priority queue primitives are computationally more demanding than the constant-time eject and inject operations of BFS. In total there are |V| deletemin and |V| + |E| insert/decreasekey operations.
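A sketch in Python, assuming `adj` maps each vertex to a list of `(neighbor, length)` pairs. Python's `heapq` has no decreasekey, so this version pushes duplicate entries and skips stale ones, which is a common substitute with the same asymptotic cost for a binary heap:

```python
import heapq

def dijkstra(adj, s):
    """Single-source shortest paths with non-negative edge lengths.
    adj: dict mapping u -> list of (v, length) pairs."""
    dist = {u: float("inf") for u in adj}
    prev = {u: None for u in adj}
    dist[s] = 0
    heap = [(0, s)]                      # priority queue keyed by dist
    while heap:
        d, u = heapq.heappop(heap)       # deletemin
        if d > dist[u]:                  # stale entry: skip (stand-in for decreasekey)
            continue
        for v, length in adj[u]:
            if dist[v] > dist[u] + length:
                dist[v] = dist[u] + length
                prev[v] = u
                heapq.heappush(heap, (dist[v], v))
    return dist, prev
```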
4.5 Priority queue implementations
4.5.1 Array
The simplest implementation of a priority queue is as an unordered array of key values for all potential elements (the vertices of the graph, in the case of Dijkstra’s algorithm). Initially, these values are set to ∞.
An insert or decreasekey is fast, because it just adjusts a key value, an O(1) operation. A deletemin, on the other hand, requires a linear-time scan of the array.
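A minimal sketch of this array-based queue, assuming the vertices are numbered 0..n−1 (class and method names are illustrative):

```python
class ArrayPQ:
    """Unordered-array priority queue: O(1) insert/decreasekey, O(n) deletemin."""
    def __init__(self, n):
        self.key = [float("inf")] * n    # key[u] = current dist estimate of u
        self.present = [True] * n        # whether u is still in the queue

    def decreasekey(self, u, new_key):
        self.key[u] = new_key            # just overwrite the key: O(1)

    def deletemin(self):
        best = None
        for u in range(len(self.key)):   # linear scan over all vertices
            if self.present[u] and (best is None or self.key[u] < self.key[best]):
                best = u
        if best is not None:
            self.present[best] = False
        return best
```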
4.5.2 Binary heap
- Elements are stored in a complete binary tree:
- each level is filled in from left to right, and must be full before the next level is started;
- the key value of any node of the tree is less than or equal to that of its children.
Therefore, the root always contains the smallest element.
- To insert, place the new element at the bottom of the tree (in the first available position) and let it “bubble up”: if it is smaller than its parent, swap the two and repeat. The number of swaps is at most the height of the tree, which is ⌊log₂ n⌋ when there are n elements.
- To deletemin, take the last node in the tree, place it at the root, and let it “sift down”: if it is bigger than either child, swap it with the smaller child and repeat. Again this takes O(log n) time.
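A Python sketch of a binary min-heap showing bubble up and sift down (a plain heap of comparable keys, without the decreasekey operation Dijkstra’s algorithm also needs):

```python
class BinaryMinHeap:
    """Binary min-heap stored in an array: a[0] is always the smallest element."""
    def __init__(self):
        self.a = []

    def insert(self, x):
        self.a.append(x)                     # place at the first available position
        i = len(self.a) - 1
        while i > 0 and self.a[i] < self.a[(i - 1) // 2]:
            self.a[i], self.a[(i - 1) // 2] = self.a[(i - 1) // 2], self.a[i]
            i = (i - 1) // 2                 # bubble up: at most ⌊log2 n⌋ swaps

    def deletemin(self):
        smallest = self.a[0]
        last = self.a.pop()                  # take the last node in the tree ...
        if self.a:
            self.a[0] = last                 # ... and place it at the root
            i, n = 0, len(self.a)
            while True:
                left, right, child = 2 * i + 1, 2 * i + 2, i
                if left < n and self.a[left] < self.a[child]:
                    child = left
                if right < n and self.a[right] < self.a[child]:
                    child = right
                if child == i:
                    break
                self.a[i], self.a[child] = self.a[child], self.a[i]
                i = child                    # sift down: O(log n)
        return smallest
```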
4.5.3 d-ary heap
A d-ary heap is identical to a binary heap, except that nodes have d children instead of just two. This reduces the height of a tree with n elements to Θ(log_d n) = Θ((log n)/(log d)). Inserts are therefore sped up by a factor of Θ(log d). Deletemin operations, however, take a little longer, namely O(d log_d n).
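Only the index arithmetic changes relative to the binary heap above; a minimal sketch of the parent/child mapping for a d-ary heap laid out in an array (function names are illustrative):

```python
def parent(i, d):
    """Index of the parent of node i in a d-ary heap stored in an array."""
    return (i - 1) // d

def children(i, d):
    """Indices of the (up to d) children of node i."""
    return range(d * i + 1, d * i + d + 1)

# With d children per node the height is Theta(log n / log d): bubble up crosses
# fewer levels, but sift down must compare up to d children per level.
```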
4.6 Shortest paths in the presence of negative edges
Bellman-Ford algorithm
procedure update((u, v) ∈ E)
dist(v) = min{dist(v), dist(u) + l(u, v)}

procedure shortest-paths(G, l, s)
Input: Directed graph G = (V, E); edge lengths {l_e : e ∈ E} with no negative cycles; vertex s ∈ V
Output: For all vertices u reachable from s, dist(u) is set to the distance from s to u.

for all u ∈ V:
    dist(u) = ∞
    prev(u) = nil
dist(s) = 0
repeat |V| − 1 times:
    for all e ∈ E:
        update(e)
Concretely: any shortest path uses at most |V| − 1 edges, so after |V| − 1 rounds in which every edge is updated, all dist values have converged to the true distances.
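A Python sketch of the procedure, assuming `edges` is a list of `(u, v, length)` triples and `vertices` is the vertex set (these names are assumptions for illustration):

```python
def bellman_ford(vertices, edges, s):
    """Single-source shortest paths, allowing negative edge lengths
    (but no negative cycles reachable from s)."""
    dist = {u: float("inf") for u in vertices}
    prev = {u: None for u in vertices}
    dist[s] = 0
    for _ in range(len(vertices) - 1):        # repeat |V| - 1 times
        for u, v, length in edges:            # update every edge
            if dist[u] + length < dist[v]:
                dist[v] = dist[u] + length
                prev[v] = u
    return dist, prev
```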
Negative cycles
The shortest-path problem is ill-posed in graphs with negative cycles. Such a cycle would allow us to endlessly apply rounds of update operations, reducing the dist estimates every time.
To detect this, run one extra, |V|-th round of updates: there is a negative cycle reachable from s if and only if some dist value is reduced during this final round.
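A small extension of the sketch above (same assumed `edges`/`dist` representation) performing this extra round:

```python
def has_negative_cycle(edges, dist):
    """One extra round of updates after bellman_ford: if any dist value can
    still be reduced, a negative cycle is reachable from the source."""
    for u, v, length in edges:
        if dist[u] + length < dist[v]:
            return True
    return False
```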
4.7 Shortest paths in dags
A dag cannot contain negative cycles, since it has no cycles at all, so negative edge lengths are allowed.
procedure dag-shortest-paths(G, l, s)
Input: Dag G = (V, E); edge lengths {l_e : e ∈ E}; vertex s ∈ V
Output: For all vertices u reachable from s, dist(u) is set to the distance from s to u.

for all u ∈ V:
    dist(u) = ∞
    prev(u) = nil
dist(s) = 0
Linearize G
for each u ∈ V, in linearized order:
    for all edges (u, v) ∈ E:
        update(u, v)

procedure update((u, v) ∈ E)
dist(v) = min{dist(v), dist(u) + l(u, v)}
In particular, we can find longest paths in a dag by the same algorithm: just negate all edge lengths.
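A self-contained Python sketch, assuming `adj` maps each vertex to a list of `(neighbor, length)` pairs; the linearization step here uses Kahn's algorithm (repeatedly removing vertices of in-degree 0), which is one possible way to topologically sort the dag:

```python
from collections import deque

def dag_shortest_paths(adj, s):
    """Shortest paths in a dag. adj: dict mapping u -> list of (v, length) pairs.
    Edge lengths may be negative, since a dag has no cycles."""
    # Linearize G: Kahn's algorithm.
    indeg = {u: 0 for u in adj}
    for u in adj:
        for v, _ in adj[u]:
            indeg[v] += 1
    order, q = [], deque(u for u in adj if indeg[u] == 0)
    while q:
        u = q.popleft()
        order.append(u)
        for v, _ in adj[u]:
            indeg[v] -= 1
            if indeg[v] == 0:
                q.append(v)

    # Visit vertices in linearized order, updating each outgoing edge once.
    dist = {u: float("inf") for u in adj}
    dist[s] = 0
    for u in order:
        for v, length in adj[u]:
            if dist[u] + length < dist[v]:
                dist[v] = dist[u] + length
    return dist
```

To find longest paths instead, negate every edge length before calling it (and negate the results back).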
To sum up the shortest-path algorithms:

Algorithm | Features | Time complexity
---|---|---
BFS | single source (s → t), all weights = 1 | O(|V|+|E|)
Dijkstra | single source (s → t), non-negative weights | O((|V|+|E|) log|V|) with a binary heap
Bellman-Ford | single source (s → t), handles negative weights (no negative cycles) | O(|V|·|E|)
Dag shortest paths | single source (s → t), dag | O(|V|+|E|)
Floyd-Warshall | shortest paths between all pairs of vertices | O(|V|³)
Next section preview: MST (minimum spanning trees)
Prim's algorithm
Kruskal's algorithm