You are looking for how many miles the car went in 1 hour. You know it went 500 miles in 6.7 hours. To find how many miles it went in 1 hour, simply divide 500 miles by 6.7 hours.
500 miles/6.7 hours = ____ miles in 1 hour
The average speed of the winning car in the 1911 Indianapolis 500 race was approximately 74.63 mph. This was calculated by dividing the total distance covered (500 miles) by the total time taken (6.7 hours).
The subject of this question is Mathematics, specifically the calculation of average speed. Average speed is the total distance traveled divided by the total time taken. From the question, we know the car covered a distance of 500 miles in 6.7 hours, so the average speed can be calculated as follows:
The winning car's average speed in miles per hour, rounded to the nearest hundredth, is 74.63 mph.
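The division above can be checked with a short Python snippet (a minimal sketch; the variable names are illustrative):

```python
# Average speed = total distance / total time
distance_miles = 500
time_hours = 6.7

avg_speed = distance_miles / time_hours
# Round to the nearest hundredth, as the question asks
print(round(avg_speed, 2))  # → 74.63
```

Note that 6.7 hours is the total race time; dividing gives miles covered per one hour, which is exactly the definition of miles per hour.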
Answer:
The answer is B.
Step-by-step explanation:
B. 1/12
C. 5/12
D. 1/17
The fraction number 20/100 into a decimal form will be 0.20.
A decimal number combines an integer part with a fractional part; the fractional part is at least zero and less than one.
The fraction number is given below:
⇒ 20 / 100
Convert the fraction number into a decimal form, then we have
⇒ 20 / 100
⇒ 0.20
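The same conversion can be done in Python with the standard library's `fractions` module (a minimal sketch):

```python
from fractions import Fraction

# 20/100 simplifies to 1/5; Fraction reduces it automatically
f = Fraction(20, 100)

# Converting to float gives the decimal form
print(float(f))  # → 0.2
```

Dividing the numerator by the denominator directly (`20 / 100`) gives the same result, 0.20.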