There are two main factors behind the difference between the two flights you’ve listed.
The first is the overhead of take-off, climb, descent and landing. This is roughly equivalent for all flights, so it is essentially a fixed overhead.
For the two flights you’ve listed, the times you’ve given are those published by the airline and thus include the time from gate-to-gate, including taxi/etc.
Picking a LAX-ABQ flight at random, UA6474, FlightAware shows that the average flight time is actually around 1 hour 25 minutes, not 2 hours as you’ve stated. The 2 hours includes taxi time, plus some additional buffer airlines normally add in.
Picking a SJC-ORD flight at random, AA1008, FlightAware shows that the average flight time is actually around 3 hours 45 minutes, not 4 hours as you’ve stated. Again, the 4 hours includes taxi time.
Even then, that’s ignoring the overhead of take-off/climb/descent/landing, which only occurs once per flight and isn’t related to the distance.
The second major factor is the type of plane, and thus the maximum speed flown. LAX-ABQ would normally be flown by a “regional” jet – a smaller, and less powerful plane often with a lower cruise speed. SJC-ORD would be flown by a “narrow-body” (or occasionally for other similar length flights a “wide-body”) jet, which would normally have a higher cruise speed.
This will vary depending on the exact jets flown, but the difference in standard cruise speed could be as much as 10% or more.
Mark already explained the why, but there’s a simple yet rather accurate approximation for jet flight times:
30 min plus 1 hr per 500 mi flown
Consider LAX-ABQ:
0:30 + (670/500) = 0:30 + 1:20 = 1:50
And ORD-SJC:
0:30 + (1859/500) = 0:30 + 3:43 = 4:13
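As a quick illustration (my own sketch, not part of the original answers), the rule of thumb can be written as a tiny Python function:

```python
def estimate_flight_time(distance_mi):
    """Rule-of-thumb jet flight time: 30 min overhead
    plus 1 hour per 500 miles flown. Returns "H:MM"."""
    total_min = round(30 + distance_mi / 500 * 60)
    h, m = divmod(total_min, 60)
    return f"{h}:{m:02d}"

print(estimate_flight_time(670))   # LAX-ABQ -> 1:50
print(estimate_flight_time(1859))  # ORD-SJC -> 4:13
```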
The logic behind this is that, while take-off and landing take around 20 minutes each, that time isn’t entirely wasted flying in circles around the airport. If we assume half of it is spent going in the right direction, at on average half the full speed (since speed is zero on the ground and maximal between take-off and descent), that equates to 50% × 50% = 25% of the time flying in the right direction at full speed. So we lop 25% (5 minutes) off each phase, and ta-da, we only have 30 minutes of entirely wasted time. And the 1 hr per 500 mi is simple: jets normally cruise at about 500 mph.
Obviously this doesn’t account for prevailing wind patterns, taxi times, etc., but it’s good enough to be used for flight-time estimation by e.g. OpenFlights.org and TravelMath.
A lot of the time in air travel is spent on the initial takeoff and climb to altitude, as well as the descent, queueing and slowdown at the end.
During this time you’re not actually travelling that far, but it uses up time.
Every flight has this component, and let’s say for argument’s sake it takes 20 minutes at the start and 20 minutes at the end.
Then the rest of the time, you’re flying at max cruising speed.
As a result, you’re spending a greater percentage of your travel time at higher speeds, and can get further as a result.
In addition, prevailing winds and flying with the spin of the earth makes a difference as well.
Credit: stackoverflow.com