I've had a discussion with my father today about the fuel usage of a vehicle at the same rpm but in a different gear.
The fuel injection system does not inject fuel into the engine at a constant rate.
If you are driving down a hill and ease your foot off the gas pedal a little, the engine will stay at the same rpm but with less fuel passing through the system.
And if you are driving up a hill you need to press the gas pedal down to maintain the same rpm and speed; in other words, you need to push more fuel into the engine.
This is why you will not have a constant fuel consumption at a given rpm.
Then when you remove some load, let's say you reach the top of the hill, you will have a small surplus of "energy released" inside the engine that will push the rpm higher until you ease off the gas pedal and reduce the amount of fuel sent into the engine.
He also claims that the force needed to maintain the speed will be the same across different gears at the same rpm.
The faster you travel, the more wind resistance you will have, and therefore the more power you need to overcome this resistance.
And to get more power out of an engine you need to send in more fuel (or use the amount you already have in there in a more efficient way).
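To put a rough number on that (this is just the standard aerodynamic drag formula; $\rho$, $C_d$ and $A$ are generic air density, drag coefficient and frontal area, not values for any particular car): since drag grows with the square of speed, the power needed to overcome it grows with the cube,
$$P_{\text{drag}} = F_{\text{drag}}\, v \approx \tfrac{1}{2}\rho\, C_d A\, v^2 \cdot v \;\propto\; v^3,$$
so doubling the speed roughly octuples the power, and hence the fuel per second, spent just pushing air aside.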
But why do we need to switch gears up and down then? Why can't we just inject more fuel into the engine?
This is where we get into chemistry, since gasoline only burns within a certain ratio of fuel to air, and the volume inside the engine has a fixed size and therefore a maximum amount of mixture that you can burn per rotation. Send in more fuel than this and the engine will drop in efficiency.
And when you then switch down a gear, you increase the rpm at this speed and thereby increase the amount of fuel you can burn in the engine in a given timeframe, and in most cases you will get more power out of the engine.
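As a rough sketch of why higher rpm raises that ceiling (the displacement, volumetric efficiency and air/fuel numbers below are generic assumptions, not anything measured from a real car), the most fuel a four-stroke engine can burn per second is capped by the air it can pull in, and that cap grows linearly with rpm:

```python
# A rough sketch (generic, assumed numbers): the most fuel a four-stroke engine
# can burn per second is limited by the air it ingests, which scales with rpm.

AIR_DENSITY = 1.2        # kg/m^3, roughly sea-level air
AIR_FUEL_RATIO = 14.7    # kg of air per kg of gasoline (stoichiometric)
DISPLACEMENT = 2.0e-3    # m^3, a hypothetical 2.0 L engine
VOL_EFFICIENCY = 0.85    # assumed fraction of the cylinders actually filled

def max_fuel_per_second(rpm: float) -> float:
    """Upper bound on fuel mass (kg/s) burnable at a given rpm."""
    # A four-stroke engine draws in one displacement of air every 2 revolutions.
    intake_volume = DISPLACEMENT * rpm / (2 * 60)             # m^3 of air per second
    air_mass = intake_volume * AIR_DENSITY * VOL_EFFICIENCY   # kg of air per second
    return air_mass / AIR_FUEL_RATIO

for rpm in (2000, 4000, 6000):
    print(f"{rpm} rpm: at most ~{max_fuel_per_second(rpm) * 1000:.1f} g of fuel per second")
```

At twice the rpm the cap doubles, which is roughly the headroom a downshift buys you.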
But at the end of the day there are a lot of other constraints in a combustion engine that limit how it works besides the simplified version I just described here.
There are two extra things to consider here.
First, even in the absolute simplest case, your car is not just fighting wind resistance (which indeed follows an $F \propto v^2$ law at these velocities) but also various static friction forces, usually following an $F \propto v^0$ law. As you might imagine, some of these forces drop depending on what gear you're in, since some of the static friction is internal to the engine block. You can also read "constant force" as meaning "constant energy expenditure per unit of distance," which clarifies that something like the pistons compressing air which is then vented out while still hot (as it will be) turns out to act as a constant force on average.
Now this force that's proportional to the square of velocity might also pick up a horizontal component coming from a cross-breeze of speed $u,$ which at first looks like it doesn't matter (Pythagorean theorem, $|[u,~v]|^2 = u^2 + v^2$) but actually does (because you also have to project it back onto the direction of motion of the car to calculate work, which involves multiplying by $v/\sqrt{u^2 + v^2}$).
So your actual force equation is probably much closer to $$F = F_0 + k v\sqrt{u^2 + v^2},$$ where $k, u$ are probably approximately constant, but $F_0$ might be much lower at 200 km/hr than at 100 km/hr because you have probably shifted to a higher gear. (Realistically, the next step in adding accuracy to this model might be writing $k = k_0 + \alpha~u/v$ or so, to capture the effect that a cross-breeze takes the drag force over a less-streamlined orientation with respect to the car.)
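A minimal numeric sketch of that force model, with made-up but plausible values for $k$, $u$ and the gear-dependent $F_0$ (none of these numbers come from the question):

```python
# Sketch of the force model F = F_0 + k * v * sqrt(u^2 + v^2), with
# illustrative, assumed values for every constant.
from math import sqrt

K = 0.4    # N/(m/s)^2, lumped aerodynamic drag constant (assumed)
U = 3.0    # m/s, cross-breeze speed (assumed)

def total_force(v_kmh, f0):
    """Resistive force (N) at road speed v_kmh, with static/internal losses f0 (N)."""
    v = v_kmh / 3.6
    return f0 + K * v * sqrt(U**2 + v**2)

# Assume the static term F_0 is smaller in the higher gear used at 200 km/hr.
print(f"100 km/hr, lower gear  (F_0 = 300 N): {total_force(100, 300):6.0f} N")
print(f"200 km/hr, higher gear (F_0 = 150 N): {total_force(200, 150):6.0f} N")
```

With those assumptions the total resistive force only a bit more than doubles between 100 km/hr and 200 km/hr, even though the aerodynamic term alone quadruples, because the static term was cut in the higher gear.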
Second: you are trying to use power $\vec F \cdot \vec v$ to look at fuel consumption per unit distance, but a given amount of fuel gives a certain amount of energy, and power is an energy expenditure per unit time. So when you want fuel consumption per unit distance you need to multiply power by the time it takes to cover a unit of distance -- which is the inverse of the velocity. So fuel consumption per unit distance actually goes like $\vec F \cdot \vec v /|\vec v|$ and should only scale like: $$F_0 + k v \sqrt{u^2 + v^2}.$$
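Spelled out in the simplest limit (no static losses, no cross-breeze; purely illustrative), with $P = \vec F \cdot \vec v = k v^3$ and $E/d = P/v = k v^2$:
$$\frac{P(2v)}{P(v)} = \frac{k\,(2v)^3}{k\,v^3} = 8, \qquad \frac{(E/d)(2v)}{(E/d)(v)} = \frac{k\,(2v)^2}{k\,v^2} = 4 \qquad (F_0 = 0,\ u = 0),$$
so doubling your speed octuples the power draw but only quadruples the energy, and hence fuel, spent per unit distance.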
So in summary, one of these factors of 2 is flat-out wrong for calculating fuel efficiency: the power may go as speed cubed, but the energy per unit distance only goes as speed squared in the limit $F_0 = 0,~u = 0.$ The other missing factor of 2 probably comes from the fact that in the lower gear the drag forces $F_0$ and $k v^2$ are approximately comparable, whereas in the higher gear you've reduced $F_0$ considerably by upshifting -- but some part of it probably also comes from a slight cross-wind that both acts as a linear drag force and redirects the airflow over a less-aerodynamic profile of the car.
Best Answer
Presumably this occurred at highway speeds, where hurry-up-and-slow-down driving is very inefficient. At slower speeds, though, that same style of driving can be amazingly fuel efficient: one alternates between accelerating (at some optimal rate) and then coasting, and the fuel injection system shuts off fuel flow entirely during the coast phase. Another name for this is pulse-and-glide.
The reason hurry-up-and-slow-down driving is inefficient at highway speeds is that your vehicle is already operating beyond its peak-efficiency speed. Internal combustion cars and light trucks get their best mileage somewhere between 35 and 55 mph (roughly 55 and 90 km/h), depending on the vehicle. The lower end is where trucks and SUVs operate; the upper end is where high-end sports cars do. Unless you're stuck in a traffic jam, your freeway speed is faster than your optimal speed.
Accelerating to pass does a number of things to reduce fuel efficiency at these speeds. Drag grows quadratically with velocity, so even a small change in velocity increases drag considerably at highway speeds. Your car doesn't have much oomph at highway speeds if you don't downshift, which makes acceleration rather expensive fuel-wise. If you do downshift, that throws your car into a regime where torque falls with increased engine speed. Your car now accelerates nicely, but at the expense of seeing the fuel needle move.
So how does hurry up and stop ("pulse and coast") work at slow speeds? Internal combustion engines are rather inefficient with regard to producing torque at low engine speeds. Fuel efficiency suffers as a result. A typical car is considerably more fuel efficient when operated at 45 mph as opposed to 30. Suppose you accelerate optimally from 30 mph to 40, then let off the gas completely and coast back down to 30, from which point the cycle starts anew. The increased speed will cost a bit in terms of increased drag, but drag isn't nearly as strong a force at 40 as it is at 60. You'll more than make up for that drag loss with improved engine performance experienced while accelerating to 40, and your car consumes no gas during the coast phase.
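Here is a toy model of that comparison; it is only a sketch, and the car parameters, engine efficiencies and pulse power below are assumed values chosen to be plausible rather than measurements of any real vehicle:

```python
# Toy comparison of steady cruising vs pulse-and-glide. Every number here
# (car parameters, engine efficiencies, pulse power) is an illustrative assumption.

MASS = 1500.0          # kg, hypothetical car
F_ROLL = 150.0         # N, rolling and driveline resistance (assumed constant)
C_DRAG = 0.40          # N/(m/s)^2, lumped aerodynamic drag constant
FUEL_ENERGY = 34.2e6   # J per litre of gasoline (approximate)
EFF_CRUISE = 0.20      # assumed engine efficiency at light, steady throttle
EFF_PULSE = 0.32       # assumed efficiency near the engine's best operating point
PULSE_POWER = 30e3     # W of mechanical power during the pulse phase (assumed)

MPH = 0.44704
V_LOW, V_HIGH, V_CRUISE = 30 * MPH, 40 * MPH, 35 * MPH
DT = 0.01              # s, simulation time step

def drag(v):
    """Total resistive force (N) at speed v (m/s)."""
    return F_ROLL + C_DRAG * v * v

def pulse_and_glide_litres_per_km():
    v, dist, fuel_joules = V_LOW, 0.0, 0.0
    while v < V_HIGH:                      # pulse: engine near its best efficiency
        accel = (PULSE_POWER / v - drag(v)) / MASS
        fuel_joules += PULSE_POWER / EFF_PULSE * DT
        v += accel * DT
        dist += v * DT
    while v > V_LOW:                       # glide: injectors shut off, zero fuel
        v -= drag(v) / MASS * DT
        dist += v * DT
    return fuel_joules / FUEL_ENERGY / (dist / 1000.0)

def steady_cruise_litres_per_km():
    work_per_km = drag(V_CRUISE) * 1000.0  # J of mechanical work per km
    return work_per_km / EFF_CRUISE / FUEL_ENERGY

print(f"steady 35 mph : {steady_cruise_litres_per_km():.3f} L/km")
print(f"pulse & glide : {pulse_and_glide_litres_per_km():.3f} L/km")
```

With these assumptions the pulse-and-glide cycle comes out noticeably cheaper per kilometre, essentially because the engine only ever runs near its efficient operating point and burns nothing at all while coasting.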
There is one minor problem with this pulse and coast technique. Other drivers do not appreciate it at all.