1 vote
Suppose that you implement Dijkstra's algorithm using a priority queue that requires O(V) time to initialize, worst-case f(V, E) time for each EXTRACT-MIN operation, and worst-case g(V, E) time for each DECREASE-KEY operation. How much (worst-case) time does it take to run Dijkstra's algorithm on an input graph G = (V, E)?

2 Answers

Best answer
1 vote

O((number of vertices) · (time per EXTRACT-MIN operation) + (number of edges) · (time per DECREASE-KEY operation)), plus the O(V) initialization,

i.e. O(V + V · f(V, E) + E · g(V, E))
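To make the counting concrete, here is a minimal sketch (not from the answers themselves) that uses a simple linear-scan priority queue, so that f(V, E) = O(V) and g(V, E) = O(1); plugging into the formula gives O(V + V·V + E·1) = O(V²). The function name and adjacency-list format are assumptions for illustration; counters confirm that EXTRACT-MIN runs exactly V times and DECREASE-KEY at most once per edge.

```python
import math

def dijkstra(adj, source):
    """Dijkstra's algorithm with a linear-scan priority queue,
    counting the operations that the complexity analysis charges.
    adj: {u: [(v, weight), ...]} adjacency list (hypothetical format)."""
    dist = {v: math.inf for v in adj}   # O(V) initialization
    dist[source] = 0
    in_queue = set(adj)                 # vertices still in the queue
    extract_mins = decrease_keys = 0

    while in_queue:                     # loop body runs exactly V times
        # EXTRACT-MIN: linear scan, so f(V, E) = O(V) here
        u = min(in_queue, key=lambda v: dist[v])
        in_queue.remove(u)
        extract_mins += 1
        for v, w in adj[u]:             # each edge is relaxed at most once
            if v in in_queue and dist[u] + w < dist[v]:
                dist[v] = dist[u] + w   # DECREASE-KEY: g(V, E) = O(1) here
                decrease_keys += 1
    return dist, extract_mins, decrease_keys
```

For example, on the three-vertex graph `{'a': [('b', 1), ('c', 4)], 'b': [('c', 2)], 'c': []}` with source `'a'`, the function performs 3 EXTRACT-MINs (one per vertex) and at most 3 DECREASE-KEYs (one per edge). Swapping in a binary heap would change f to O(log V) and g to O(log V), giving the familiar O((V + E) log V) bound.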

0 votes
Dijkstra's algorithm complexity = O(initialization + (number of vertices) · (EXTRACT-MIN time) + (number of edges) · (DECREASE-KEY time)).

So, for this question, the complexity is O(V + V · f(V, E) + E · g(V, E)).
