A program executes on a non-pipelined processor in time t. {i.e., all the instructions together take time t}
A single instruction takes m*d time to execute {m = number of stages, d = delay of each stage}.
So the number of instructions is n = t/(m*d).
Now the execution time on the pipeline is (n - 1 + k)*d,
where n = number of instructions, k = number of stages, d = stage delay + buffer delay {buffer delay is ignored in this case}. Since the pipelined processor uses the same stages, k = m.
Substituting n = t/(m*d) and k = m gives [t/(m*d) - 1 + m]*d.
Opening the bracket: t/m - d + m*d {this is the execution time on the pipeline}.
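To convince myself the algebra is right, here is a quick numeric sanity check. The values m = 5, d = 2 ns, and t = 1000 ns are just illustrative assumptions, not from any real problem:

    # Sanity check: does the expanded form t/m - d + m*d match
    # the original pipeline formula (n - 1 + k)*d with k = m?
    m = 5     # number of pipeline stages (assumed for illustration)
    d = 2     # per-stage delay in ns (assumed; buffer delay ignored)
    t = 1000  # non-pipelined execution time in ns (assumed)

    n = t / (m * d)                  # number of instructions = 100.0
    pipelined = (n - 1 + m) * d      # (n - 1 + k)*d = 104 * 2 = 208.0
    expanded = t / m - d + m * d     # 200 - 2 + 10 = 208.0

    print(n, pipelined, expanded)    # 100.0 208.0 208.0 -> they agree

Both forms give 208 ns, so the expansion of the bracket checks out.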
If anything seems wrong, please correct me!!