Consider a program that performs the following steps repeatedly:
1. Use the CPU for 4 milliseconds.
2. By issuing an I/O, use the disk for 14 milliseconds.
3. Use the CPU for 10 milliseconds.
4. By issuing an I/O, use the network for 18 milliseconds.
Assume that each step depends on data obtained from the previous step (e.g., step 3
cannot start before step 2 is completed). Also assume that each resource (CPU, disk,
or network) can be used by only one process at a time.
What are the average utilizations of the CPU, disk, and network over two iterations
of this program? (Note that the "total time" should be the same across all resources:
it runs from when the system starts until all work on every resource has finished.)
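One way to check an answer is to tally each resource's busy time directly. The sketch below assumes a single process, so the steps run strictly one after another and no two resources are ever busy at the same time; the step durations are taken from the list above.

```python
# Per-iteration steps: (resource, duration in ms), executed strictly in order
# because each step depends on the previous one (single process, no overlap).
steps = [("cpu", 4), ("disk", 14), ("cpu", 10), ("network", 18)]
iterations = 2

busy = {"cpu": 0, "disk": 0, "network": 0}
for _ in range(iterations):
    for resource, ms in steps:
        busy[resource] += ms

# With no overlap, total elapsed time is just the sum of all busy times.
total = sum(busy.values())

for resource, ms in sorted(busy.items()):
    print(f"{resource}: {ms}/{total} ms = {ms / total:.1%}")
```

Under these assumptions the total elapsed time is 92 ms, giving CPU 28/92, disk 28/92, and network 36/92. The utilizations sum to 100% precisely because a single process never keeps two resources busy at once.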
