## The M/G/1 Queue: Imbedded Markov Chains

In the previous chapter, we analyzed systems by means of a Markov chain imbedded at fixed intervals. In this chapter, we study a technique whereby a Markov chain is imbedded at randomly spaced points; the terms imbedded Markov chain and semi-Markov are applied to this technique. We apply the technique to the study of the M/G/1 queue. The same approach is then applied to the G/M/1 queue. The challenge in studying the M/G/1 queue by means of Markov chains is to find a sequence of points in time which allows...
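The idea can be sketched in a short simulation. The standard imbedded chain for the M/G/1 queue observes the system at departure epochs: the number left behind satisfies $q_{n+1} = \max(q_n - 1, 0) + a_{n+1}$, where $a_{n+1}$ is the number of Poisson arrivals during the $(n+1)$st service. The deterministic service time and the parameter values below are illustrative choices, not taken from the text:

```python
import math
import random

def simulate_mg1_imbedded(lam, service_time, n_steps, seed=1):
    """Markov chain imbedded at departure epochs of an M/G/1 queue with
    deterministic service: q_{n+1} = max(q_n - 1, 0) + a_{n+1}, where
    a_{n+1} is the Poisson number of arrivals during one service time."""
    rng = random.Random(seed)
    L = math.exp(-lam * service_time)

    def poisson():
        # Knuth's product-of-uniforms Poisson sampler
        k, p = 0, 1.0
        while True:
            p *= rng.random()
            if p <= L:
                return k
            k += 1

    q, total = 0, 0
    for _ in range(n_steps):
        q = max(q - 1, 0) + poisson()
        total += q
    return total / n_steps

# M/D/1 with rho = 0.5; the Pollaczek-Khinchine mean number in system is
# rho + rho^2/(2*(1 - rho)) = 0.75, and the departure-epoch chain matches it
est = simulate_mg1_imbedded(lam=0.5, service_time=1.0, n_steps=200_000)
```

Because the departure-epoch distribution of an M/G/1 queue coincides with the time-stationary one, the long-run average of the chain approaches the Pollaczek-Khinchine mean.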

## $P_B(s)$: The Erlang B Formula

The historic application of the Erlang B formula is in sizing telephone trunks. As stated above, it is a reasonable assumption that the duration of telephone calls is exponentially distributed and that the arrival process is Poisson. The servers in this case are the trunks. The question of interest is to choose the number of trunks for a particular traffic volume so that an arriving call finds a free trunk with sufficiently high probability. If all trunks are busy, blocked calls leave the system. This is...
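The trunk-sizing question can be answered numerically. A minimal sketch (the function names and the sample traffic values are illustrative) uses the standard stable recursion for the Erlang B blocking probability rather than evaluating factorials directly:

```python
def erlang_b(offered_load, servers):
    """Erlang B blocking probability via the numerically stable recursion
    B(0) = 1, B(s) = a*B(s-1) / (s + a*B(s-1)), with offered load a."""
    b = 1.0
    for s in range(1, servers + 1):
        b = offered_load * b / (s + offered_load * b)
    return b

def trunks_needed(offered_load, target_blocking):
    """Smallest number of trunks whose blocking meets the target."""
    s = 1
    while erlang_b(offered_load, s) > target_blocking:
        s += 1
    return s
```

For example, an offered load of 2 erlangs on 4 trunks gives a blocking probability of 2/21, about 9.5 percent, while 7 trunks are needed to bring blocking below 1 percent.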

## $P'(1)$, $A^{(2)}(1)\mathbf{e}$, and $U^{(2)}(1)\mathbf{e}$

[Equation: the higher moments of the queue length, expressed in terms of $U^{(2)}(1)$, $P(1)A^{(2)}(1)$, $P'(1)(1 - A'(1))$, $P'(1)A^{(2)}(1)\mathbf{e}$, $P(1)A^{(3)}(1)\mathbf{e}$, $U^{(3)}(1)\mathbf{e}$, and the factor $3(1 - \rho)$], where we have defined the term $U(z) = -P_0 D^{-1} D(z) A(z)$.

Example 8.8: Once again, we continue the previous example with $m = 0.05$ s. Corresponding to Equation (8.76), we get $P(1) = [0.7701\ \ 0.2299]$. Corresponding to Equations (8.77) and (8.78), we get $P^{(1)}(1)\mathbf{e} = 0.6413$ and $P^{(2)}(1)\mathbf{e} = 0.3564$. Note that in solving for $A$, $A'(1)$, $A^{(2)}(1)$, $A^{(3)}(1)$, we use...

## $F_{XY}(x, y) = F_X(x)F_Y(y)$

Thus, the joint distribution is the product of the marginal distributions, and the random variables are independent. Random variables can be transformed to produce other random variables. Consider, for example, the linear transformation $Y = aX + b$, where $X$ is a random variable with mean $\mu$ and variance $\sigma^2$, and $a$ and $b$ are constants. It is left as an exercise to show that the mean and the variance of $Y$ are, respectively, $a\mu + b$ and $a^2\sigma^2$. Perhaps the most important transformation is the sum of two...
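The effect of the linear transformation on the mean and variance can be checked empirically. The following sketch (the Gaussian choice for $X$ and the constants $a = 5$, $b = 1$ are illustrative) shows that sample moments obey exactly the same rules:

```python
import random

def sample_moments(xs):
    """Sample mean and (n-1)-normalized sample variance."""
    n = len(xs)
    mean = sum(xs) / n
    var = sum((x - mean) ** 2 for x in xs) / (n - 1)
    return mean, var

rng = random.Random(42)
a, b = 5.0, 1.0                                      # constants in Y = aX + b
xs = [rng.gauss(3.0, 2.0) for _ in range(100_000)]   # X: mean 3, variance 4
ys = [a * x + b for x in xs]

mx, vx = sample_moments(xs)
my, vy = sample_moments(ys)
# the sample statistics transform exactly: my = a*mx + b and vy = a^2 * vx,
# and both are close to the theoretical values 16 and 100
```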

## $\lambda_j/(\lambda_j + \mu_j)$

$$(\lambda_j + \mu_j)e^{-(\lambda_j + \mu_j)t_1}\,dt_1 \times \frac{\lambda_j}{\lambda_j + \mu_j} \times (\lambda_{j+1} + \mu_{j+1})e^{-(\lambda_{j+1} + \mu_{j+1})t_2}\,dt_2 \times \frac{\lambda_{j+1}}{\lambda_{j+1} + \mu_{j+1}} \times (\lambda_{j+2} + \mu_{j+2})e^{-(\lambda_{j+2} + \mu_{j+2})t_3}\,dt_3 \times \frac{\mu_{j+2}}{\lambda_{j+2} + \mu_{j+2}} \times (\lambda_{j+1} + \mu_{j+1})e^{-(\lambda_{j+1} + \mu_{j+1})t_4}\,dt_4 \times \frac{\lambda_{j+1}}{\lambda_{j+1} + \mu_{j+1}} \times (\lambda_{j+2} + \mu_{j+2})e^{-(\lambda_{j+2} + \mu_{j+2})t_5}\,dt_5 \times \frac{\mu_{j+2}}{\lambda_{j+2} + \mu_{j+2}} \times (\lambda_{j+1} + \mu_{j+1})e^{-(\lambda_{j+1} + \mu_{j+1})t_6}\,dt_6 \times \frac{\lambda_{j+1}}{\lambda_{j+1} + \mu_{j+1}} \times e^{-(\lambda_{j+2} + \mu_{j+2})t_7}$$

The term $(\lambda_j + \mu_j)e^{-(\lambda_j + \mu_j)t_1}\,dt_1$ is the probability density of the first interval. The probability density of each of the next five intervals is similar. The probability that the process remains in state $j+2$ for at least $t_7$ seconds is given by $e^{-(\lambda_{j+2} + \mu_{j+2})t_7}$...
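Each pair of factors, a sojourn density $(\lambda + \mu)e^{-(\lambda+\mu)t}$ followed by a jump probability $\lambda/(\lambda+\mu)$ or $\mu/(\lambda+\mu)$, is exactly what results from two competing exponential clocks. A minimal sketch (the rate values are illustrative) verifies both facts by simulation:

```python
import math
import random

def jump(rng, lam, mu):
    """One step of a birth-death chain: draw competing exponential
    clocks and return (sojourn time, whether the jump was upward)."""
    t_up = -math.log(1.0 - rng.random()) / lam
    t_down = -math.log(1.0 - rng.random()) / mu
    return min(t_up, t_down), t_up < t_down

rng = random.Random(7)
lam, mu, n = 2.0, 3.0, 200_000
results = [jump(rng, lam, mu) for _ in range(n)]
mean_sojourn = sum(t for t, _ in results) / n    # theory: 1/(lam+mu) = 0.2
p_up = sum(1 for _, up in results if up) / n     # theory: lam/(lam+mu) = 0.4
```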

## $P(z) = \sum_{k=0}^{\infty} e^{-\lambda t}\frac{(\lambda t)^k}{k!}\,z^k = e^{-\lambda t(1-z)}$ (2.25)

Example 2.6: On the accompanying spreadsheet, the Poisson distribution for $\lambda t = 20$ has been plotted. The results are shown in Figure 2.4. We discontinue the calculation after $k = 50$, but it is simple enough to continue beyond this value.

Geometric Distribution: Consider once again Bernoulli trials, where we are interested in the number of failures between successes. The probability of $k$ failures until the first success is given by $P(\text{run of } k \text{ failures followed by a success}) = g(k; P) = (1 - P)^k P$. Once again, the mean...
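The spreadsheet calculation is easy to reproduce. A sketch (the success probability 0.25 for the geometric case is an illustrative value) computes the Poisson terms by the stable recursion $p_k = p_{k-1}\,\lambda t/k$ and the geometric terms directly:

```python
import math

def poisson_pmf(mean, kmax):
    """p_k = e^{-m} m^k / k!, via the recursion p_k = p_{k-1} * m / k."""
    p = [math.exp(-mean)]
    for k in range(1, kmax + 1):
        p.append(p[-1] * mean / k)
    return p

def geometric_pmf(P, kmax):
    """g(k) = (1 - P)^k * P: k failures before the first success."""
    return [(1.0 - P) ** k * P for k in range(kmax + 1)]

pois = poisson_pmf(20.0, 50)        # lambda*t = 20, truncated at k = 50
geo = geometric_pmf(0.25, 200)
geo_mean = sum(k * g for k, g in enumerate(geo))   # theory: (1-P)/P = 3
```

Truncating the Poisson terms at $k = 50$ for $\lambda t = 20$ loses only negligible probability mass, which is why the spreadsheet can stop there.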

## $P(j) = \frac{(\nu \bar{M}_T)^j}{j!}\,e^{-\nu \bar{M}_T}$ (4.121)

Note that $\nu$ is the input rate to the timeout box and $\bar{M}_T$ is the mean processing time of a message in the timeout box. Thus (4.120) and (4.121) show that the distribution of the total number in the timeout box is a function only of the mean, and not of the distribution, of the processing time. These observations demonstrate that the product-form solution holds for this simple network of BCMP queues in the same way that it did for Jackson networks (see Section 4.4.3). The form of the results...
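This insensitivity to the processing-time distribution can be demonstrated by simulation. The sketch below (modeling the timeout box as an infinite-server station with Poisson input; the rate and holding-time values are illustrative) compares exponential and deterministic processing times with the same mean:

```python
import math
import random

def occupancy_at(rng, rate, draw_service, t_end):
    """Infinite-server (timeout) box with Poisson arrivals and i.i.d.
    holding times; returns the number still present at t_end."""
    t, count = 0.0, 0
    while True:
        t += -math.log(1.0 - rng.random()) / rate   # next Poisson arrival
        if t > t_end:
            return count
        if t + draw_service() > t_end:
            count += 1

rng = random.Random(3)
nu, mt, runs = 4.0, 2.0, 5000       # input rate and mean processing time
exp_mean = sum(occupancy_at(rng, nu,
                            lambda: -mt * math.log(1.0 - rng.random()),
                            50.0) for _ in range(runs)) / runs
det_mean = sum(occupancy_at(rng, nu, lambda: mt, 50.0)
               for _ in range(runs)) / runs
# both averages approach nu * M_T = 8, whatever the service distribution
```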

## where $\rho_2 = \lambda/\mu_2$

The second part of Burke's theorem states that the departure process from an M/M/1 queue is independent of the queue contents. This implies that the contents of the second queue are independent of the contents of the first queue; accordingly, the joint distribution in the steady state is simply the product of the marginal distributions

$$P(k_1, k_2) = (1 - \rho_1)(1 - \rho_2)\rho_1^{k_1}\rho_2^{k_2}, \qquad k_1, k_2 = 0, 1, 2, \ldots \tag{4.9}$$

Note that this solution is as though each of the queues existed in isolation. The delays through the queues...
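Equation (4.9) is easy to check numerically. A sketch (the utilization values are illustrative) truncates the infinite sums and confirms that the joint distribution normalizes to 1 and that each marginal mean is $\rho_i/(1 - \rho_i)$, exactly as for an isolated M/M/1 queue:

```python
rho1, rho2 = 0.6, 0.3    # utilizations of the two tandem queues
K = 200                  # truncation point for the infinite sums

def p_joint(k1, k2):
    """Product-form solution (4.9) for two M/M/1 queues in tandem."""
    return (1 - rho1) * (1 - rho2) * rho1 ** k1 * rho2 ** k2

total = sum(p_joint(i, j) for i in range(K) for j in range(K))
mean1 = sum(i * p_joint(i, j) for i in range(K) for j in range(K))
mean2 = sum(j * p_joint(i, j) for i in range(K) for j in range(K))
# each queue behaves as if in isolation: E[k_i] = rho_i / (1 - rho_i)
```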

## Application Of Birth And Death Processes To Queueing Theory

We now turn to the basic elements of queueing models. In order to describe these elements, a convenient notation has been developed. In its simplest form it is written A/R/S, where A designates the arrival process, R the service required by an arriving customer, and S the number of servers. For example, M/M/1 indicates Poisson arrivals of customers, exponentially distributed service times, and a single server. The M may be taken to stand for memoryless (or Markovian) in both...
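The birth-death framework of this chapter yields the steady-state probabilities of such models directly. A minimal sketch (function names and the truncation level are illustrative) implements the standard product formula $p_k = p_0 \prod_{i<k} \lambda_i/\mu_{i+1}$ and specializes it to M/M/1, where the result is geometric:

```python
def birth_death_steady_state(lam, mu, kmax):
    """Stationary distribution of a truncated birth-death chain:
    p_k proportional to prod_{i=0}^{k-1} lam(i)/mu(i+1), normalized.
    lam(k) and mu(k) are the birth and death rates in state k."""
    p = [1.0]
    for k in range(1, kmax + 1):
        p.append(p[-1] * lam(k - 1) / mu(k))
    norm = sum(p)
    return [x / norm for x in p]

# M/M/1 with rho = lambda/mu = 0.5: p_k should equal (1 - rho) * rho^k
p = birth_death_steady_state(lambda k: 0.5, lambda k: 1.0, 400)
```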

## $Q_{11}(0,t)$, $Q_{12}(0,t)$, $Q_{21}(0,t)$, $Q_{22}(0,t)$

[Equations: a recursion expressing the matrix of transition functions $Q_{ij}(1, t)$ in terms of the $Q_{ij}(0, t)$, $i, j = 1, 2$], where the infinitesimal generator matrix¹ is given by

¹In order to conform with usage in the literature, in this chapter the infinitesimal generator matrix is written in such a way that the rows sum to zero. In Chapters 3 and 4, the columns of the infinitesimal generator matrix sum to zero.

Example 8.1: Consider the two-phase...
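The row-sum-to-zero convention can be illustrated with a small computation. The sketch below (a two-state generator with illustrative rates) evaluates the transition matrix $P(t) = e^{Qt}$ by a truncated Taylor series and checks that its rows sum to 1:

```python
import math

def transition_matrix(Q, t, terms=60):
    """P(t) = exp(Qt) for a 2x2 infinitesimal generator whose rows sum
    to zero, via the truncated Taylor series of the matrix exponential."""
    def matmul(A, B):
        return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
                for i in range(2)]
    P = [[1.0, 0.0], [0.0, 1.0]]      # running sum, starts at the identity
    term = [[1.0, 0.0], [0.0, 1.0]]   # current series term (Qt)^n / n!
    Qt = [[q * t for q in row] for row in Q]
    for n in range(1, terms):
        term = [[v / n for v in row] for row in matmul(term, Qt)]
        P = [[P[i][j] + term[i][j] for j in range(2)] for i in range(2)]
    return P

Q = [[-2.0, 2.0], [3.0, -3.0]]        # rows sum to zero
P = transition_matrix(Q, 0.7)
# closed form for a two-state chain: P11(t) = 3/5 + (2/5) e^{-5t}
p11_exact = 0.6 + 0.4 * math.exp(-5.0 * 0.7)
```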


Figure 5.5: Synchronous time-division multiplexing.

We assume a gated or "please wait" strategy, whereby packets arriving during a frame must wait until the next frame even though slots may be empty. The alternative to "please wait" is "come right in," in which newly arrived messages may be transmitted in the same frame. We write the equation for the imbedded Markov chain for the number of packets in the buffer, where, as previously, N is the number of packets in the system at the beginning of...
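The gated discipline can be sketched as a simple recursion at frame boundaries. The form below is an assumption for illustration, not the text's equation: with $S$ slots per frame and Poisson arrivals during a frame, $N_{k+1} = \max(N_k - S, 0) + A_k$, since arrivals during frame $k$ must wait for the next frame:

```python
import math
import random

def gated_tdm(rng, lam_per_frame, slots, n_frames):
    """Imbedded chain at frame boundaries under the gated ("please wait")
    strategy: N_{k+1} = max(N_k - S, 0) + A_k, with A_k Poisson.
    The slot count S and Poisson arrivals are illustrative assumptions."""
    def poisson(mean):
        L, k, p = math.exp(-mean), 0, 1.0
        while True:
            p *= rng.random()
            if p <= L:
                return k
            k += 1

    n, history = 0, []
    for _ in range(n_frames):
        n = max(n - slots, 0) + poisson(lam_per_frame)
        history.append(n)
    return history

rng = random.Random(11)
hist = gated_tdm(rng, 2.5, 3, 50_000)   # 2.5 packets/frame offered, 3 slots
```

With offered load below the slot capacity the buffer stays stable; with no arrivals it drains completely.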

## $\int_0^\infty e^{-\mu t}\frac{(\mu t)^k}{k!}\,a(t)\,dt$

Finally, we cancel like factors and define $A(s)$, the Laplace transform of $a(t)$. We then have the following implicit equation for $\sigma$:

$$\sigma = \int_0^\infty e^{-\mu t(1-\sigma)}\,a(t)\,dt = A(\mu(1-\sigma)) \tag{6.59}$$

In general, it is necessary to use numerical techniques to solve (6.59). The form of (6.56) is that of a geometric distribution; accordingly, the mean number of messages encountered by an arriving message is given by. The average time to service a message is $1/\mu$. The resulting delay is the time required to transmit the...
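A standard numerical technique for (6.59) is fixed-point iteration. The sketch below (function names and the deterministic-interarrival example are illustrative choices) solves $\sigma = A(\mu(1-\sigma))$ for a D/M/1 queue, where $a(t) = \delta(t - T)$ gives $A(s) = e^{-sT}$:

```python
import math

def gm1_sigma(A_of_s, mu, tol=1e-12):
    """Solve sigma = A(mu*(1 - sigma)), equation (6.59), by
    fixed-point iteration starting from sigma = 0.5."""
    sigma = 0.5
    for _ in range(10_000):
        nxt = A_of_s(mu * (1.0 - sigma))
        if abs(nxt - sigma) < tol:
            return nxt
        sigma = nxt
    return sigma

mu, T = 1.5, 1.0    # service rate and (deterministic) interarrival time
sigma = gm1_sigma(lambda s: math.exp(-s * T), mu)
# with the geometric form of (6.56), the mean number an arrival
# encounters is sigma / (1 - sigma)  (standard G/M/1 result)
mean_seen = sigma / (1.0 - sigma)
```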

## $\sum_{i=1}^n (D_i - \bar{D})^2$

$$\sum_{i=1}^n (D_i - \bar{D})^2 = \sum_{i=1}^n D_i^2 - 2\bar{D}\sum_{i=1}^n D_i + n(\bar{D})^2 = \sum_{i=1}^n D_i^2 - n(\bar{D})^2$$

For the estimates of the mean and the variance to be close to the true values, a sufficient number of samples must be taken. This can be seen immediately from (9.3) for the sample mean, where the variance of the estimate of the mean is inversely proportional to the number of samples. Enough samples need to be taken for the sample standard deviation to be sufficiently low for credibility. The Chebyshev inequality, discussed in Section 2.6, would supply...
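The identity and the effect of sample size are both easy to demonstrate. A sketch (exponential delays with mean 10 are an illustrative choice) computes the sample variance by the shortcut form $\left(\sum D_i^2 - n\bar{D}^2\right)/(n-1)$ for small and large samples:

```python
import random

def estimates(rng, n):
    """Sample mean and sample variance of n simulated delays,
    using the shortcut sum-of-squares identity."""
    d = [rng.expovariate(0.1) for _ in range(n)]   # true mean 10, variance 100
    mean = sum(d) / n
    var = (sum(x * x for x in d) - n * mean * mean) / (n - 1)
    return mean, var

rng = random.Random(5)
m_small, v_small = estimates(rng, 10)
m_large, v_large = estimates(rng, 100_000)
# the variance of the sample mean is var/n, so the large sample is far tighter
```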

## Markov Chains Application To Multiplexing And Access

In Section 2.7, Markov chains were introduced and their basic characteristics were outlined. In this chapter, the Markov chain is used to model techniques for multiplexing and access in telecommunications networks. We begin with the treatment of time-division multiplexing (TDM). The first widely deployed example of TDM was the T1 carrier system depicted in Figure 5.1. The line rate is 1.544 Mbps, segmented into frames 125 μs in duration. (Figure 5.1: T1 frame; Pulse Code Modulation (PCM) frame.) 8000...
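The T1 numbers fit together as simple arithmetic: each 125 μs frame carries one 8-bit PCM sample for each of 24 channels plus a single framing bit, and 8000 such frames per second reproduce the 1.544 Mbps line rate:

```python
channels, bits_per_sample, framing_bits = 24, 8, 1
frames_per_second = 8000      # one frame per 125-microsecond sampling interval

bits_per_frame = channels * bits_per_sample + framing_bits   # 193 bits
line_rate = bits_per_frame * frames_per_second               # bits per second
```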

## Delay Components Along the Loop

We should also account for processing at each station along the route around the loop. In each station the packet header must be examined. We allow $(2 + \log_2 N + \cdots)/R$ seconds for this. In the symmetric case, a packet travels halfway around the loop on average, and this factor would add an average delay of $(N/2)(2 + \log_2 N + \cdots)/R$ seconds to the delay in reaching a destination terminal. In applying the results of the preemptive priority analysis, we recognize a particular point that may lead to...
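The halfway-around claim follows from averaging over a uniformly chosen destination. A one-function sketch (the function name is illustrative) confirms that the mean hop count on a unidirectional loop of $N$ stations is exactly $N/2$:

```python
def mean_hops(n_stations):
    """A packet on a unidirectional loop travels k hops to the k-th
    station downstream; averaging over the N-1 possible destinations
    gives sum(1..N-1)/(N-1) = N/2."""
    hops = range(1, n_stations)
    return sum(hops) / (n_stations - 1)
```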

## Equation (3.80)

Substituting (3.75) and canceling terms again, we find [equation]. A repeated application of (3.75) shows that the RHS of (3.80) is 1, and we have a solution. The constant term, $P(0)$, is found from the usual normalization equation. Our interest is in the total number of messages in all stages, that is, the random variable [definition]. We proceed by induction. Assume that $K = 2$. We have $P(k_1 + k_2 = m)$ [expression], $m < N$. An application of the binomial theorem yields [the result]. Assuming that the formula holds for $K - 1$ stages, it is easy to...

## $F_E(y) = 1 - e^{-y},\ y \geq 0$

However, this is an exponentially distributed random variable with mean 1. It can be transformed into an exponentially distributed random variable with mean $1/\mu$ simply by multiplying the random variable by $1/\mu$.

Example 9.8: On the associated Excel spreadsheet we generate 10,000 exponentially distributed random variables with mean 10. As in Example 9.1, the means and the variances are estimated for 10 samples and 100 samples. We have also written a Matlab program, ex98.m, which also generates...
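The same generation method is a few lines in any language. This sketch (sample size and seed are illustrative) inverts $F_E(y) = 1 - e^{-y}$ to obtain a unit-mean exponential and scales it to mean 10, then estimates the moments:

```python
import math
import random

def exponential(rng, mean):
    """Invert F_E(y) = 1 - e^{-y} for a unit-mean exponential, then
    multiply by the desired mean."""
    return -math.log(1.0 - rng.random()) * mean

rng = random.Random(98)
sample = [exponential(rng, 10.0) for _ in range(100_000)]
m = sum(sample) / len(sample)                               # theory: 10
v = sum((x - m) ** 2 for x in sample) / (len(sample) - 1)   # theory: 100
```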

## $f_{R\Theta}(r, \theta) = \frac{r}{2\pi}\,e^{-r^2/2},\quad r \geq 0,\ 0 \leq \theta < 2\pi$

From (9.22), we see that the marginal density functions are, respectively, the density function for the Rayleigh distribution and the $U(0, 2\pi)$ distribution. The probability distribution for the Rayleigh distribution is given by

$$F_R(r) = \int_0^r x\,e^{-x^2/2}\,dx = 1 - e^{-r^2/2}, \qquad r > 0 \tag{9.26}$$

Our objective now is to generate a Rayleigh-distributed random variable. We begin by inverting the transformation $u = 1 - e^{-r^2/2}$. We find that $r = \sqrt{-2\ln(1-u)}$. Now, suppose that a $U(0,1)$ random variable is transformed according to...
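The inversion can be checked directly. A sketch (sample size and seed are illustrative) generates Rayleigh variates by $r = \sqrt{-2\ln(1-u)}$ and compares the empirical moments and CDF against (9.26):

```python
import math
import random

def rayleigh(rng):
    """Invert F_R(r) = 1 - exp(-r^2/2): r = sqrt(-2 ln(1 - u))."""
    return math.sqrt(-2.0 * math.log(1.0 - rng.random()))

rng = random.Random(9)
sample = [rayleigh(rng) for _ in range(100_000)]
m = sum(sample) / len(sample)              # theory: sqrt(pi/2) ~ 1.2533
frac_below_1 = sum(1 for r in sample if r <= 1.0) / len(sample)
# theory: F_R(1) = 1 - e^{-1/2} ~ 0.3935
```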

## Preface

The insinuation of telecommunications into the daily fabric of our lives has been arguably the most important and surprising development of the last 25 years. Before this revolution, telephone service and its place in our lives had been largely stable for more than a generation. The growth was, so to speak, lateral, as the global reach of telecommunications extended and more people got telephone service. The distinction between overseas and domestic calls blurred with the advances in switching...

## $V(t_1, t_2)$

$$V(t_1, t_2) = E\big[(X(t_1) - E[X(t_1)])(X(t_2) - E[X(t_2)])\big] = E[X(t_1)X(t_2)] - E[X(t_1)]\,E[X(t_2)]$$

For the sinusoidal process defined in (2.111), we have

$$R(t_1, t_2) = \frac{1}{2\pi}\int_0^{2\pi} \frac{1}{2}\big[\cos 2\pi f_0(t_1 - t_2) + \cos(2\pi f_0(t_1 + t_2) + 2\theta)\big]\,d\theta = \frac{1}{2}\cos 2\pi f_0(t_1 - t_2)$$

Since the mean of the process is equal to 0, the covariance function is $V(t_1, t_2) = R(t_1, t_2)$. We notice that the mean and the mean-square moments of the sinusoidal process are constant in time. Further, the correlation between samples is a function of the time difference and not of absolute time. Processes with these properties are called...
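The dependence on the time difference alone can be verified by Monte Carlo. The sketch below (unit amplitude and the particular sample instants are illustrative) averages $X(t_1)X(t_2)$ over random phases and compares it to $\frac{1}{2}\cos 2\pi f_0(t_1 - t_2)$:

```python
import math
import random

def corr_estimate(rng, f0, t1, t2, n):
    """Monte Carlo estimate of R(t1, t2) = E[X(t1) X(t2)] for
    X(t) = cos(2*pi*f0*t + theta), theta uniform on (0, 2*pi)."""
    acc = 0.0
    for _ in range(n):
        theta = 2.0 * math.pi * rng.random()
        acc += (math.cos(2 * math.pi * f0 * t1 + theta)
                * math.cos(2 * math.pi * f0 * t2 + theta))
    return acc / n

rng = random.Random(12)
r = corr_estimate(rng, 1.0, 0.3, 0.1, 200_000)
# theory: (1/2) cos(2*pi*f0*(t1 - t2)) with t1 - t2 = 0.2
expected = 0.5 * math.cos(2 * math.pi * 0.2)
```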

## Retrieving Files From The Wiley Ftp And Internet Sites

To download software referred to in the examples and used to generate figures in the book, use an ftp program or a Web browser. If you are using an ftp program, type the following at your ftp prompt: `ftp ftp.wiley.com`. Some programs may provide the first `ftp` for you, in which case type `ftp.wiley.com`. Log in as anonymous (e.g., User ID: `anonymous`). Leave the password blank. After you have connected to the Wiley ftp site, navigate through the directory path: `ftp://ftp.wiley.com/public/sci_tech_med`...