According to the Guinness Book of World Records, the longest home run ever measured was hit by Roy "Dizzy" Carlyle in a minor league game. The ball traveled 188 m (618 ft) before landing on the ground outside the ballpark.
1) Assuming the ball's initial velocity was 45 degrees above the horizontal and ignoring air resistance, what did the initial speed of the ball need to be to produce such a home run if the ball was hit at a point 0.9 m (3.0 ft) above ground level? Assume that the ground was perfectly flat.
2) How far would the ball be above a fence 3.0 m (10 ft) high if the fence was 116 m (380 ft) from home plate?
Let V be the initial speed of the ball. Because the ball was hit at an angle, its initial velocity has two components: one in the x-direction and one in the y-direction. The ball travels at a constant speed horizontally but accelerates vertically.
Break down the initial speed:
Vx = V cos45
Vy = V sin45
Find the time it takes the ball to travel the 188 m horizontally before hitting the ground:
x = vt
188 = V cos45 * t
t = 188 / (V cos45)
Yf = .5at^2 + Vy*t + Yi
Yf = final height (0 m)
Yi = initial height (0.9 m)
a = acceleration due to gravity (-9.8 m/s^2, taking up as positive)
t = time
Vy = speed in the y-direction
We just found the time, t = 188 / (V cos45), so plug it in. Because tan45 = 1, the term V sin45 * t = 188, and the equation becomes 0 = 0.9 + 188 - 4.9t^2, which gives t = sqrt(188.9 / 4.9) = 6.2 s and therefore V = 188 / (cos45 * 6.2 s) = 42.8 m/s.
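The steps above can be checked numerically, and the same kinematics answer part 2. This is a minimal sketch assuming g = 9.8 m/s^2 and the 45-degree launch angle used in the derivation:

```python
import math

g = 9.8          # m/s^2, standard gravity (assumption: 9.8 rather than 9.81)
angle = math.radians(45.0)
x_range = 188.0  # m, horizontal distance to the landing point
y0 = 0.9         # m, height of the ball when hit

# Part 1: substitute t = x_range / (V cos45) into
#   0 = y0 + V sin45 * t - 0.5 g t^2.
# Since tan45 = 1, the term V sin45 * t equals x_range, so
#   0.5 g t^2 = y0 + x_range  ->  t = sqrt(2 (y0 + x_range) / g)
t_land = math.sqrt(2.0 * (y0 + x_range) / g)
v0 = x_range / (math.cos(angle) * t_land)
print(f"initial speed: {v0:.1f} m/s")          # -> ~42.8 m/s

# Part 2: height of the ball when it reaches the fence 116 m away
x_fence = 116.0
fence_height = 3.0
t_fence = x_fence / (v0 * math.cos(angle))
y_fence = y0 + v0 * math.sin(angle) * t_fence - 0.5 * g * t_fence**2
print(f"clearance above fence: {y_fence - fence_height:.1f} m")  # -> ~42.0 m
```

So the ball needs an initial speed of roughly 42.8 m/s, and it passes about 42 m above the top of the 3.0 m fence.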