[Math] Utilization difference between a multiple server, single queue and a multiple server, multiple queue system

queueing-theory

I've been studying queueing theory for a while, and I'm interested in finding the solution to the following problem.

Suppose that we have a system with 4 processors, where the arrival rate (λ) is 0.2 tasks per time unit and each processor has a service rate (μ) of 0.2 tasks per time unit.

Now we can have two different types of systems: system A has a single queue shared by all 4 processors, while system B has a separate queue in front of each processor. The following diagrams depict the two systems.

[Diagram: system A — a single queue feeding 4 processors]

[Diagram: system B — a separate queue in front of each of the 4 processors]

I would like to find out:

  1. In which system is the utilization of each server higher?
  2. For system B, what are the average number of tasks in each queue and the average waiting time in each queue?

    For system A the utilization of each server will be $\frac{\lambda}{m\mu} = \frac{0.2}{4 \cdot 0.2} = 0.25$.

    However, what about system B? If I divide the arrival rate λ by 4 and treat each queue-server pair as an independent system, the utilization comes out the same, so there would be no difference in utilization between the two systems.
    Once the utilization (and hence the per-queue arrival rate) is known, finding the average waiting time and the average number of tasks in each queue is straightforward; however, I don't know whether it's correct to say that the utilization is the same in both systems.
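Under the assumption that the arrival stream is split evenly, each queue in system B behaves as an independent M/M/1 queue with arrival rate λ/4 = 0.05 and service rate μ = 0.2. A minimal sketch using the standard M/M/1 formulas ($\rho = \lambda/\mu$, $L_q = \rho^2/(1-\rho)$, and Little's law $W_q = L_q/\lambda$):

```python
# Sketch: per-queue metrics for system B, assuming the total arrival
# stream (rate 0.2) is split evenly across the 4 queues.
lam_total = 0.2   # total arrival rate (tasks per time unit)
mu = 0.2          # service rate of each processor
m = 4             # number of processors / queues

lam_i = lam_total / m      # arrival rate seen by one queue: 0.05
rho = lam_i / mu           # utilization of one server: 0.25

# Standard M/M/1 formulas for one queue-server pair:
Lq = rho**2 / (1 - rho)    # average number of tasks waiting in one queue
Wq = Lq / lam_i            # average waiting time in one queue (Little's law)

print(rho, Lq, Wq)         # 0.25, ~0.0833, ~1.667
```

So each server in system B is also 25% utilized, with about 0.083 tasks waiting per queue on average.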

Thank you in advance.

Best Answer

You're right that the utilization is the same in both systems. Utilization measures the long-term proportion of time that each server is busy. Both of your setups split the work equally among the $N$ servers on average, and all of the servers are homogeneous, so in the long run each server spends the same proportion of time busy.
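As a sanity check on this claim, here is a small discrete-event sketch (not from the original post) that simulates both setups under the stated rates, assuming Poisson arrivals, exponential service, and, for system B, uniformly random routing of each task to one of the queues. Both utilization estimates should land near $\lambda/(m\mu) = 0.25$.

```python
import random

def simulate(num_servers, split_queues, lam, mu, num_jobs, seed=0):
    """Estimate per-server utilization (fraction of time busy).

    split_queues=False: one FCFS queue, jobs go to the earliest-free server.
    split_queues=True:  each job is routed uniformly at random to one
                        dedicated queue-server pair (system B).
    """
    rng = random.Random(seed)
    free = [0.0] * num_servers   # time at which each server's backlog ends
    t = 0.0                      # arrival clock (Poisson process, rate lam)
    busy_time = 0.0              # total service time delivered by all servers
    for _ in range(num_jobs):
        t += rng.expovariate(lam)
        if split_queues:
            s = rng.randrange(num_servers)            # random routing
        else:
            s = min(range(num_servers), key=free.__getitem__)  # FCFS M/M/c
        svc = rng.expovariate(mu)
        free[s] = max(t, free[s]) + svc               # wait, then serve
        busy_time += svc
    horizon = max(free)          # observation window: last departure
    return busy_time / (num_servers * horizon)

util_a = simulate(4, False, lam=0.2, mu=0.2, num_jobs=100_000)
util_b = simulate(4, True, lam=0.2, mu=0.2, num_jobs=100_000)
print(util_a, util_b)            # both close to 0.25
```

Waiting times, of course, do differ between the two setups (the shared queue never leaves a server idle while a task waits elsewhere), but the busy fraction of each server depends only on the offered load per server.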