One of the fastest pitches ever thrown in Major League Baseball was by Aroldis Chapman and had a velocity of 105.1 miles per hour. How many seconds did it take this pitch to travel the 60 feet 6 inches from the pitcher's mound to home plate?
Assuming the ball is launched horizontally and neglecting air resistance, once in the air the ball moves at a constant speed equal to its initial velocity, in this case 105.1 mi/hr.
In order to find the time in seconds, it is convenient to first convert the speed from mi/hr to m/s, as follows:
[tex]105.1\,\frac{mi}{hr} \times \frac{1\,hr}{3600\,s} \times \frac{1609\,m}{1\,mi} = 47.0\,\frac{m}{s} \quad (1)[/tex]
In the same way, it is convenient to convert 60' 6'' (60.5 ft) to meters, as follows:
[tex]60.5\,ft \times \frac{0.3048\,m}{1\,ft} = 18.4\,m \quad (2)[/tex]
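As a quick numerical cross-check of the two unit conversions above, here is a minimal Python sketch (the constant and variable names are mine, not part of the original solution):

```python
# Cross-check of the unit conversions in equations (1) and (2).
MI_TO_M = 1609.0   # meters per mile, as used in equation (1)
FT_TO_M = 0.3048   # meters per foot
HR_TO_S = 3600.0   # seconds per hour

speed_mph = 105.1            # pitch speed in mi/hr
distance_ft = 60 + 6 / 12    # 60 ft 6 in = 60.5 ft

speed_ms = speed_mph * MI_TO_M / HR_TO_S   # ≈ 47.0 m/s, matching (1)
distance_m = distance_ft * FT_TO_M         # ≈ 18.4 m, matching (2)

print(f"{speed_ms:.1f} m/s, {distance_m:.1f} m")
```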
Applying the definition of average velocity, we can find the time it takes the ball to travel from the pitcher's mound to home plate, as follows: