[Math] Why is sum of calculation based on individual percentages different than total percentage


Cost of goods for an entire product line was estimated as a percentage of total revenue. For example, product line XYZ was projected to have revenue of \$1,305. The product line's total cost of goods was estimated at 49.1% of total revenue, based on the prior year, so we predicted the product line's future cost of goods to be \$640.64.

We know that last year product A's costs were 40% of its revenue and product B's costs were 50% of its revenue. We never forecasted the future costs of products A and B individually, but we did forecast each product's future revenue: product A will bring in \$105, and product B will bring in \$1,200.

Now we need to calculate the future costs of the individual products. However, when I calculate each product's cost from its own cost-to-revenue percentage, I get a future cost of \$42 for product A (105 × 40%) and \$600 for product B (1,200 × 50%). The total cost of the product line using the individual percentages is therefore \$642, not \$640.64.


Why are these different? The example I'm using is simplified with small numbers, but in reality we have hundreds of products, and the difference between the aggregate forecast and the sum of the individually calculated costs is millions of dollars. How can we calculate individual costs more accurately so that they tie to our original aggregate prediction when summed?
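The two methods in the question can be reproduced in a few lines. This is a sketch using the figures stated above; the blended ratio 540/1100 (≈49.1%) is the exact prior-year aggregate that the rounded 49.1% stands in for.

```python
# Aggregate method: apply last year's blended cost ratio to total forecast revenue.
forecast_rev = {"A": 105, "B": 1200}
prior_cost_pct = {"A": 0.40, "B": 0.50}

total_rev = sum(forecast_rev.values())      # 1305
blended_pct = 540 / 1100                    # prior-year blended ratio, ~49.09%
aggregate_cost = total_rev * blended_pct    # ~640.64

# Per-product method: apply each product's own prior-year cost ratio.
individual_cost = sum(forecast_rev[p] * prior_cost_pct[p] for p in forecast_rev)

print(round(aggregate_cost, 2), round(individual_cost, 2))  # the two totals differ
```

Running this shows the \$640.64 vs. \$642 gap directly: the blended ratio silently assumes the forecast revenue mix matches last year's mix.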

Best Answer

The problem is that the ratio of the forecast revenues is not the same as the ratio of the prior revenues. If you weight the $40$% and $50$% according to the forecast revenues instead of according to the prior revenues, you get a weighted mean of

$$\frac{105}{1305}\cdot40+\frac{1200}{1305}\cdot 50\approx49.19540229885\tag{1}$$

percent instead of the prior-year blended ratio $\frac{540}{1100}\approx 49.09090909091$ percent (in the example, prior revenues of \$100 and \$1{,}000 give costs of $40+500=540$ on \$1{,}100 of revenue). And if you use $(1)$ to calculate the forecast cost, you do indeed get $\$642$.
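A quick numerical check of the weighted mean in $(1)$, using the figures from the answer:

```python
# Weighted mean of the per-product cost ratios, weighted by FORECAST revenue
# (equation (1)), versus the same ratios weighted by PRIOR-year revenue.
w_forecast = (105 / 1305) * 40 + (1200 / 1305) * 50   # ~49.1954 percent
w_prior = 540 / 1100 * 100                            # ~49.0909 percent

# Applying the forecast-weighted ratio to total forecast revenue recovers
# the sum of the individually calculated costs.
forecast_cost = 1305 * w_forecast / 100               # 642

print(round(w_forecast, 4), round(w_prior, 4), round(forecast_cost, 2))
```

The takeaway for the hundreds-of-products case: re-weight each product's prior cost ratio by its forecast revenue (as in $(1)$), and the aggregate will tie exactly to the sum of the per-product costs.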
