Monday, November 03, 2008

Problem Solving - 52

A boat traveled upstream a distance of 90 miles at an average speed of (v-3) mph and then traveled the same distance downstream at an average speed of (v+3) mph. If the trip upstream took half an hour longer than the trip downstream, how many hours did it take the boat to travel downstream?

(A) 2.5
(B) 2.4
(C) 2.3
(D) 2.2
(E) 2.1

Answer: A

Total time taken by the boat to travel upstream = 90/(v-3) hrs
Total time taken by the boat to travel downstream = 90/(v+3) hrs

It is given that the trip upstream took half an hour longer than the trip downstream:

=> 90/(v-3) - 90/(v+3) = 1/2

=> [90*(v+3) - 90*(v-3)] / (v^2 - 9) = 1/2
=> 90*[(v+3) - (v-3)] / (v^2 - 9) = 1/2
=> 90*6 / (v^2 - 9) = 1/2
=> v^2 - 9 = 2*90*6
=> v^2 = 2*90*6 + 9 = 2*9*10*6 + 9 = 9*(2*10*6 + 1) = 9*121 = 9*11*11
=> v = 3*11 = 33

We are asked for the downstream time = 90/(v+3) = 90/36 = 5/2 = 2.5 hrs, so the answer is (A).
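
As a quick check (this sketch is not part of the original solution and assumes the sympy library is available), the same equation can be solved in Python; the variable names here are illustrative only:

from sympy import symbols, solve, Rational

# Solve 90/(v-3) - 90/(v+3) = 1/2 for the positive speed v,
# then compute the downstream time 90/(v+3).
v = symbols('v', positive=True)
speed = solve(90/(v - 3) - 90/(v + 3) - Rational(1, 2), v)[0]  # v = 33
downstream_time = 90 / (speed + 3)                             # 90/36 = 5/2

print(speed, downstream_time)  # prints: 33 5/2, i.e. 2.5 hours

This confirms v = 33 mph and a downstream time of 2.5 hours, matching choice (A).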