Rate Monotonic Analysis: A Short Historical Overview
Sam Tregar
May 31, 1999

The term Rate Monotonic Analysis was born in January of 1973 with the publication of "Scheduling Algorithms for Multiprogramming in a Hard Real Time Environment" by Liu and Layland. (Liu and Layland 1973) Their paper laid out the basis for a simple test that could be applied to a real-time system to determine whether it could be guaranteed to meet its deadlines. This test was severely limited in the range of systems it could address. In particular, it assumed that all processes in the system are periodic, have deadlines at the end of their periods, and are totally independent of one another. Additionally, the test guaranteed only that the system would be schedulable using their scheduling algorithm, the rate monotonic scheduling algorithm. (Audsley 1993) The theory itself is beguilingly simple. First a utilization bound is determined, U(n) = n(2^(1/n) - 1), where n is the number of processes in the system. Then the utilization of the process set is calculated: the sum, over all processes, of each process's execution time divided by its period. That sum is compared to the utilization bound. If it is below the bound, the process set is guaranteed to be schedulable using rate monotonic scheduling. If it is above the utilization bound but below 1.0, the process set may be schedulable, although this cannot be guaranteed. If it is over 1.0, the process set requires more than 100% of the available processing time to meet its deadlines and thus cannot be schedulable. This region of uncertainty between the utilization bound and 1.0 makes the test "sufficient and not necessary". (Audsley 1993) In other words, it makes the test pessimistic: it may reject systems that are, in fact, schedulable. The search for a "sufficient and necessary" test built on the theory of rate monotonic analysis became the subject of much research through the 1970s and 1980s.
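The utilization-bound test described above can be sketched in a few lines of Python. The function name and the three-task set are illustrative, not from Liu and Layland's paper; each task is given as an (execution time, period) pair.

```python
# A sketch of the Liu and Layland utilization-bound test for rate
# monotonic scheduling. Task set and names are illustrative.

def rm_utilization_test(tasks):
    """tasks: list of (execution_time, period) pairs.
    Returns "schedulable", "inconclusive", or "unschedulable"."""
    n = len(tasks)
    utilization = sum(c / t for c, t in tasks)
    bound = n * (2 ** (1 / n) - 1)    # U(n) = n(2^(1/n) - 1)
    if utilization <= bound:
        return "schedulable"       # guaranteed under RM scheduling
    elif utilization <= 1.0:
        return "inconclusive"      # may be schedulable; the test is only sufficient
    else:
        return "unschedulable"     # demands more than 100% of the CPU

# Three tasks with total utilization 0.65, below U(3) ~ 0.780:
print(rm_utilization_test([(1, 4), (1, 5), (2, 10)]))  # prints "schedulable"
```

The "inconclusive" branch is exactly the region of uncertainty between U(n) and 1.0 that makes the test sufficient but not necessary.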
In 1973, when the theory was introduced, the real-time computing industry was not ready to use it. Real-time systems were dominated by cyclic schedulers, which could easily be proven schedulable so long as their rules were respected, and by "best-effort" scheduling, which was not provably schedulable but performed well in average cases. Over time it was realized that these methods produced systems that were difficult to maintain and develop. Fixed-priority preemptive scheduling offered flexibility and provability within certain strict constraints. Unfortunately, rate monotonic analysis was not yet ready: its assumptions could not be met in the design of real real-time systems. During the 1980s the key work in bringing rate monotonic analysis up to speed with the real-time industry was performed. Projects were begun at many universities to address the deficiencies in rate monotonic analysis. From a practical point of view the most important of these developments was the discovery of a "sufficient and necessary" test for the schedulability of periodic tasks given fixed pre-emptive priorities. (Harter 1984) These tests quantitatively analyze the worst-case response time of a given system of tasks, and they allow the analysis to be carried out on systems using different scheduling policies. Essentially, these tests simulate the execution of a set of tasks in the worst case, taken to be the one in which all tasks are released at the same moment. They are more complicated than a simple utilization test, but they maintain a reasonable degree of simplicity. An important improvement in the theory came in the form of the inclusion of aperiodic events. (Sha, Lehoczky and Rajkumar 1987) They contributed a number of strategies, ranging from simply casting aperiodic events into a pessimistic periodic framework to more elaborate schemes involving the careful rationing of time to incoming aperiodic events.
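The worst-case response-time tests described above can be sketched as a fixed-point iteration: starting from a task's own execution time, repeatedly add the interference from every higher-priority task until the value stops changing. This is a minimal sketch, assuming tasks are listed highest priority first, deadlines equal periods, and all tasks are released together at the critical instant; the task set is illustrative.

```python
import math

# Sketch of a worst-case response-time test for fixed-priority
# pre-emptive scheduling. Tasks are (execution_time, period) pairs,
# listed highest priority first; the worst case assumes all tasks
# are released at the same moment.

def response_time(tasks, i):
    """Return the worst-case response time of task i, or None if it
    cannot meet a deadline taken to equal its period."""
    c_i, t_i = tasks[i]
    r = c_i
    while True:
        # interference: higher-priority task j runs ceil(r / T_j) times
        interference = sum(math.ceil(r / t_j) * c_j for c_j, t_j in tasks[:i])
        r_next = c_i + interference
        if r_next == r:
            return r              # fixed point reached: worst-case response time
        if r_next > t_i:
            return None           # response exceeds the deadline
        r = r_next

tasks = [(1, 4), (1, 5), (2, 10)]
print([response_time(tasks, i) for i in range(3)])  # prints [1, 2, 4]
```

Each task is schedulable exactly when its computed response time does not exceed its deadline, which makes this test both sufficient and necessary under its assumptions.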
In 1992 another vital contribution was made by Tindell, who offered an enhanced version of the response-time tests that allowed for totally arbitrary deadlines, both within and beyond the process' period. (Tindell 1992) This allows for a much more realistic representation of a system's constraints, given that an event may arrive again before a previous instance of its process has finished executing. Tindell's extension of the theory allows for runs of a process that pre-empt other runs of the same process. Finally, rounding out these enhancements are methods for specifying and analyzing interprocess cooperation. The most common case of interprocess cooperation is the one in which more than one process must use a single resource. This requires the use of some kind of resource allocation policy. One glaring problem with the state of the art in resource allocation up to this point was priority inversion. Priority inversion occurs when a high priority task attempts to obtain a lock on a resource held by a low priority task. Since the low priority task is using the resource, the high priority task blocks. However, a middle priority task may then become ready to run and pre-empt the low priority task. This situation effectively reverses the priorities of the high and middle tasks, preventing the highest priority task from meeting its deadline. Several new policies were developed to address this problem, along with the means to factor their use into response-time analysis. Among them, the Priority Inheritance Protocol and the Priority Ceiling Protocol stand out in particular. (Sha, Lehoczky and Rajkumar 1987) This history brings us up to the late 1980s, when the RMARTS (Rate Monotonic Analysis of Real Time Systems) project at Carnegie Mellon University began a push to gain industry acceptance and use of rate monotonic analysis.
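Factoring a resource allocation protocol into response-time analysis amounts to adding a blocking term to the fixed-point iteration: under the Priority Ceiling Protocol, a task can be delayed by at most one critical section of a lower-priority task. The sketch below is an assumed illustration of that idea, not the exact formulation from the cited papers; tasks are listed highest priority first, deadlines equal periods, and the blocking times are hypothetical.

```python
import math

# Sketch: response-time analysis extended with a per-task blocking term
# B_i, the longest lower-priority critical section that can delay task i
# (under the Priority Ceiling Protocol, at most one such section).

def response_time_with_blocking(tasks, blocking, i):
    """tasks: (execution_time, period) pairs, highest priority first.
    blocking: worst-case blocking time B_i for each task."""
    c_i, t_i = tasks[i]
    r = c_i + blocking[i]
    while True:
        interference = sum(math.ceil(r / t_j) * c_j for c_j, t_j in tasks[:i])
        r_next = c_i + blocking[i] + interference
        if r_next == r:
            return r
        if r_next > t_i:
            return None   # misses a deadline taken here to equal the period
        r = r_next

tasks = [(1, 4), (1, 5), (2, 10)]
blocking = [1, 1, 0]   # e.g. a 1-unit critical section held by a lower-priority task
print([response_time_with_blocking(tasks, blocking, i) for i in range(3)])  # prints [2, 3, 4]
```

The appeal of the ceiling protocol is visible here: because the blocking term is bounded by a single critical section rather than a chain of them, the pessimism added to each response time stays small and predictable.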
They formed teams of trained individuals to go out to companies engaged in real-time systems development and demonstrate the effectiveness of rate monotonic analysis. Over time their accomplishments were great: large corporations and governments came over to the RMA camp, establishing RMA as a cornerstone of their analysis regimen. (Sha, Klein, Goodenough 1991) All of this led to incredible growth in the development of Rate Monotonic Analysis. In 1993 some of the most definitive works in the field were written. Among them is the book "A Practitioner's Handbook for Real-Time Analysis." (Klein, Ralya, Pollak, Obenza, and Harbour 1993) This book provides an incredibly detailed and practical guide to performing rate monotonic analysis. It contains formulas and examples for all the permutations of rate monotonic analysis mentioned above and many more. One thing to note about this and other products of the mid 1990s in the area of rate monotonic analysis is that they do not delve far into explanation or theory. In fact, the Handbook only very rarely gives the derivation or justification for the formulas it presents. This is not meant as a critique, but it does speak to the change in direction of rate monotonic analysis. After 1993, most academic development in rate monotonic analysis ceased. One can assume, from the analysis software produced after that date by various companies claiming capabilities beyond those allowed by thus-far published papers, that research is going on, but it is outside of academia and seems unlikely to return. In summary, rate monotonic analysis has gone from being an elegant, yet highly restricted, method of predicting the behavior of one scheduling algorithm to a generalized method capable of analyzing almost any conceivable fixed-priority pre-emptive real-time system. The wide acceptance of this methodology is a testament to the brilliance and persistence of the people I have credited above.

References

[Audsley 1993] Audsley, N.C.
et al. "Fixed Priority Pre-emptive Scheduling: An Historical Perspective." Real Time Systems 8, 2-3 (March - May 1995): 173-98.

[Harter 1984] Harter, P.K. 1984. "Response Times in Level Structured Systems". Department of Computer Science, University of Colorado, USA. CU-CS-269-94.

[Klein, Ralya, Pollak, Obenza, and Harbour 1993] Klein, M., Ralya, T., Pollak, B., Obenza, R., Harbour, M. G. 1993. "A Practitioner's Handbook for Real-Time Analysis: Guide to Rate Monotonic Analysis of Real-Time Systems". Kluwer Academic Publishers.

[Liu and Layland 1973] Liu, C.L., Layland, J.W. 1973. "Scheduling Algorithms for Multiprogramming in a Hard Real-Time Environment". JACM 20(1): 46-61.

[Sha, Klein, Goodenough 1991] Sha, L., Klein, M., Goodenough, J. 1991. "Rate Monotonic Analysis for Real-Time Systems". Software Engineering Institute, Carnegie Mellon University, USA. CMU/SEI-91-TR-6.

[Sha, Lehoczky and Rajkumar 1987] Sha, L., Lehoczky, J. P., Rajkumar, R. 1987. "Priority Inheritance Protocols: An Approach to Real-Time Synchronisation". Computer Science Department, Carnegie Mellon University, USA. CMU-CS-87-181.

[Tindell 1992] Tindell, K. W. 1992. "An Extensible Approach for Analysing Fixed Priority Hard Real-Time Tasks". Department of Computer Science, University of York, UK. YCS 189.