QUEUES
Reading ref:
LN 2.2.1, LN 2.3.1, ULR 2.3.2, ULR 2.3.3
Queues
• The word queue means a waiting line in everyday English
Queues
• Array Implementation
• List Implementation
Queues
Simple Array Implementation

Enqueue(24):  [24][  ][  ][  ][  ]   head = 0, tail = 0
Enqueue(67):  [24][67][  ][  ][  ]   head = 0, tail = 1
Enqueue(32):  [24][67][32][  ][  ]   head = 0, tail = 2
dequeue():    [  ][67][32][  ][  ]   head = 1, tail = 2

(cells indexed 0 1 2 3 4)
Queues
Simple Array Implementation
• A Simple Array-Based Queue Implementation
• We present a simple realization of a queue by
means of an array, Q, of fixed capacity, storing
its elements.
• Since the main rule with the queue ADT is that
we insert and delete objects according to the
FIFO principle, we must decide how we are
going to keep track of the front and rear of the
queue.
Queues
• One option is to let Q[0] be the front of the queue
and then let the queue grow from there.
• This is not an efficient solution, however, for it
requires that we move all the elements forward
one array cell each time we perform a dequeue
operation.
• Such an implementation would therefore take
O(n) time to perform the dequeue method,
where n is the current number of objects in the
queue.
• If we want to achieve constant time for each
queue method, we need a different approach.
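To make the cost of the naive front-at-Q[0] scheme concrete, here is a minimal sketch (the class and method names are illustrative, not from the text): every dequeue must shift all remaining elements forward one cell.

```java
// Sketch of the inefficient scheme: the front always lives at Q[0],
// so dequeue() must shift every remaining element left by one cell.
class ShiftingArrayQueue {
    private final int[] q;
    private int size = 0;

    ShiftingArrayQueue(int capacity) { q = new int[capacity]; }

    void enqueue(int x) { q[size++] = x; }   // O(1): append at the rear

    int dequeue() {
        int front = q[0];
        for (int i = 1; i < size; i++) {     // shift the other n-1 elements
            q[i - 1] = q[i];
        }
        size--;
        return front;                        // total cost: O(n)
    }

    int size() { return size; }
}
```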
Queues
• To avoid moving objects once they are placed in
Q, we define two variables f and r, which have
the following meanings:
• f is an index to the cell of Q storing the first
element of the queue (which is the next
candidate to be removed by a dequeue
operation), unless the queue is empty (in which
case f = r).
• r is an index to the next available array cell in Q.
Queues
• Initially, we assign f = r = 0, which indicates that
the queue is empty. Now, when we remove an
element from the front of the queue, we
increment f to index the next cell.
• Likewise, when we add an element, we store it
in cell Q[r] and increment r to index the next
available cell in Q.
• This scheme allows us to implement methods
front, enqueue, and dequeue in constant time,
that is, O(1) time.
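The f/r scheme above can be sketched as follows (a minimal illustration with int elements and no capacity or wrap-around handling yet; the class name is hypothetical):

```java
// f indexes the first element; r indexes the next free cell.
// f == r means the queue is empty; no elements ever move.
class SimpleArrayQueue {
    private final int[] Q;
    private int f = 0, r = 0;   // initially f = r = 0: empty queue

    SimpleArrayQueue(int capacity) { Q = new int[capacity]; }

    boolean isEmpty() { return f == r; }
    int size() { return r - f; }
    int front() { return Q[f]; }            // next candidate for removal

    void enqueue(int x) { Q[r++] = x; }     // store at Q[r], advance r: O(1)
    int dequeue() { return Q[f++]; }        // read Q[f], advance f: O(1)
}
```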
Operation     Size()   isEmpty()   Front()
enqueue(6)      4        false        5
dequeue()       3        false        7
enqueue(8)      4        false        7
enqueue(9)      5        false        7

[Diagram: the array after each operation, with f (head) and r (tail)
advancing as elements are added and removed.]
Queues
Wrap around

index  value
  9     44    (MaxSize - 1)
  8     21
  7     79
  6     32
  5      6
  4     80
  3     12    <- Front
  2     --
  1     --
  0     63    <- Rear (wrapped around to the start of the array)
Queues
Wrap around
• Insert a few more items. The Rear arrow moves
upward as you’d expect. Notice that after Rear
has wrapped around, it’s now below Front, the
reverse of the original arrangement.
• You can call this a broken sequence: The items
in the queue are in two different sequences in
the array.
• Delete enough items so that the Front arrow also
wraps around. Now you’re back to the original
arrangement, with Front below Rear. The items
are in a single contiguous sequence.
Queues
Circular
• Consider, for example, what happens if we repeatedly
enqueue and dequeue a single element N different
times.
• We would have f = r = N. If we were then to try to insert
the element just one more time, we would get an array-
out-of-bounds error (since the N valid locations in Q are
from Q[0] to Q[N-1]), even though there is plenty of room
in the queue in this case.
• To avoid this problem and be able to utilize all of the
array Q, we let the f and r indices "wrap around" the end
of Q.
• That is, we now view Q as a "circular array" that goes
from Q[0] to Q[N-1] and then immediately back to Q[0]
again.
Queues
• Implementing this circular view of Q is actually pretty
easy. Each time we increment f or r, we compute this
increment as "(f + 1) mod N" or "(r + 1) mod N,"
respectively.
• Recall that operator "mod" is the modulo operator,
which is computed by taking the remainder after an
integral division. For example, 14 divided by 4 is 3 with
remainder 2, so 14 mod 4 = 2.
• By using the modulo operator, we can view Q as a
circular array and implement each queue method in a
constant amount of time (that is, O(1) time).
• A clock is a helpful analogy: on a 12- or 24-hour clock, the hours wrap around to the start in exactly the same way the indices do.
QUEUES
Circular
• Initially, we assign f = r = 0, which
indicates that the queue is empty.
• Now, when we remove an element from
the front of the queue, we increment f to
index the next cell, so f becomes (f + 1) mod N.
• Likewise, when we add an element, we
store it in cell Q[r] and increment r to index
the next available cell, so r becomes (r + 1) mod N.
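Putting the modulo arithmetic together, a circular-array queue can be sketched as below (a minimal illustration; one cell is deliberately left unused so that f == r still unambiguously means "empty"):

```java
// Circular-array sketch: f and r wrap around with "mod N".
class CircularArrayQueue {
    private final int[] Q;
    private final int N;        // array length (capacity + 1 spare cell)
    private int f = 0, r = 0;   // f = r = 0 means the queue is empty

    CircularArrayQueue(int capacity) { N = capacity + 1; Q = new int[N]; }

    boolean isEmpty() { return f == r; }
    int size() { return (N - f + r) % N; }
    int front() { return Q[f]; }

    void enqueue(int x) {
        Q[r] = x;
        r = (r + 1) % N;        // wrap from Q[N-1] back to Q[0]
    }

    int dequeue() {
        int x = Q[f];
        f = (f + 1) % N;        // the front index also wraps around
        return x;
    }
}
```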
Queues
Efficiency/Complexity
• Performance of a queue realized by an array:

  Method    Time
  size      O(1)
  isEmpty   O(1)
  front     O(1)
  enqueue   O(1)
  dequeue   O(1)
Queues
• As with the array-based stack implementation, the only
real disadvantage of the array-based queue
implementation is that we artificially set the capacity of
the queue to be some fixed value. In a real application,
setting the capacity too high could waste memory, while
setting it too low could cause the application to fail when
the queue overflows.
• A queue can be implemented either using an array as
the underlying data structure or a linked list.
• An array implementation of a queue is simple and
efficient but it has a fixed upper bound on the size of the
queue.
Queues
Linked list implementation
• Implementing a Queue with a Generic Linked
List
• We can efficiently implement the queue ADT
using a generic singly linked list.
• For efficiency reasons, we choose the front of
the queue to be at the head of the list, and
• the rear of the queue to be at the tail of the list.
In this way, we remove from the head and insert
at the tail.
QUEUE
Linked list implementation
• Initially the queue is empty.
• Enqueue(5): head and tail both point to 5; 5 points to null.
• Enqueue(6): head points to 5, 5 points to 6, tail points
to 6, and 6 points to null.
• Enqueue(9): head points to 5, 5 points to 6, 6 points to 9,
tail points to 9, and 9 points to null.
• Dequeue(): head points to 6, 6 points to 9, tail points
to 9, and 9 points to null.
[Diagram: the list after each operation —
head -> 5 <- tail
head -> 5 -> 6 <- tail
head -> 5 -> 6 -> 9 <- tail
after dequeue(): head -> 6 -> 9 <- tail]
QUEUE
• Each of the methods of the singly linked list
implementation of the queue ADT runs in O(1) time.
• We also avoid the need to specify a maximum size for
the queue, as was done in the array-based queue
implementation.
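The enqueue/dequeue sequence described above can be sketched with a small singly linked list class (a minimal illustration with int elements; the class and field names are hypothetical): the front sits at the head for O(1) removal and the rear at the tail for O(1) insertion.

```java
// Singly-linked-list queue: remove from the head, insert at the tail.
class LinkedQueue {
    private static class Node {
        int value; Node next;
        Node(int v) { value = v; }
    }
    private Node head, tail;   // both null when the queue is empty
    private int size = 0;

    boolean isEmpty() { return head == null; }
    int size() { return size; }
    int front() { return head.value; }

    void enqueue(int x) {              // O(1): insert at the tail
        Node n = new Node(x);
        if (tail == null) head = n; else tail.next = n;
        tail = n;
        size++;
    }

    int dequeue() {                    // O(1): remove from the head
        int v = head.value;
        head = head.next;
        if (head == null) tail = null; // queue became empty
        size--;
        return v;
    }
}
```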
Queues
Applications
• Scheduler (e.g. in an operating system):
all the processes submitted to the CPU are
first placed in a queue and are then
scheduled according to various algorithms.
• Similarly, on a network, if a number of users
want to access a resource, their requests are
placed in a queue and then processed.
Queues
Applications
• When programming a real-time system
that can be interrupted (e.g., by a mouse
click or wireless connection), it is
necessary to attend to the interrupts
immediately, before proceeding with the
current activity. If the interrupts should be
handled in the same order they arrive,
then a FIFO queue is the appropriate data
structure.
Queues
Applications
• Round Robin Schedulers
• A popular use of the queue data structure is to
implement a round robin scheduler, where we
iterate through a collection of elements in a
circular fashion and "service" each element by
performing a given action on it.
• Such a schedule is used, for example, to fairly
allocate a resource that must be shared by a
collection of clients.
• We can use a round robin scheduler to allocate
a slice of CPU time to various applications
running concurrently on a computer.
QUEUE
Applications
The three iterative steps for using a
queue to implement a round robin scheduler.
QUEUE
Applications
• We can implement a round robin
scheduler using a queue, Q, by repeatedly
performing the following steps :
1. e <- Q.dequeue()
2. Service element e
3. Q.enqueue(e)
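The three steps above can be sketched using the standard java.util.Queue interface, where remove() and add() play the roles of Q.dequeue() and Q.enqueue() (the class name and the "service by recording" action are illustrative only):

```java
import java.util.ArrayDeque;
import java.util.Queue;

// Round-robin sketch: dequeue an element, service it, enqueue it again.
class RoundRobinDemo {
    static String serviceOneRound(Queue<String> q) {
        StringBuilder log = new StringBuilder();
        int n = q.size();
        for (int i = 0; i < n; i++) {
            String e = q.remove();      // 1. e <- Q.dequeue()
            log.append(e).append(' ');  // 2. service element e (record it)
            q.add(e);                   // 3. Q.enqueue(e)
        }
        return log.toString().trim();
    }
}
```

After one full round every client has been serviced exactly once and the queue is back in its original order, which is what makes the allocation fair.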
Queues
Circular queues
• To avoid the problem of not being able to
insert more items into the queue even when it
is not full (because items have been removed
from the Front and nothing else can go in at
the Rear), the Front and Rear arrows wrap
around to the beginning of the array. The
result is a circular queue (sometimes called a
ring buffer).
Queues
deque (double ended queue)
[Figure: a call generator feeds Call 3, Call 2, Call 1 into a queue
served by a customer service agent.]
Queues
Blocking
• The first main enhancement that a blocking
queue offers over a regular queue is that it can
be bounded.
• The blocking queue enables you to set an upper
limit on the size of the queue. Moreover, when
an attempt is made to store an item in a queue
that has reached its limit, the queue will block
the thread until space becomes available
– either by removing an item
– or by calling clear().
• In this way, you guarantee that the queue will
never exceed its predefined bounds.
Queues
Blocking
• The second major feature affects the behavior of
dequeue(). Recall from the implementation of
ListFifoQueue presented earlier that an
EmptyQueueException is thrown when an
attempt is made to retrieve an item from an
empty queue.
• A blocking queue, however, will instead block
the current thread until an item is enqueued.
This makes it ideal for implementing work
queues where multiple concurrent consumers
need to wait until there are more tasks to perform.
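Both behaviors described above are available in the standard library's java.util.concurrent.ArrayBlockingQueue, which can serve as a concrete illustration: put() blocks when the queue is full and take() blocks when it is empty, while the non-blocking offer() lets us observe the bound without blocking the calling thread.

```java
import java.util.concurrent.ArrayBlockingQueue;

// Demonstration of a bounded blocking queue from the standard library.
class BlockingQueueDemo {
    public static void main(String[] args) throws InterruptedException {
        ArrayBlockingQueue<Integer> q = new ArrayBlockingQueue<>(2); // bound = 2
        q.put(1);                        // blocks only if the queue is full
        q.put(2);
        boolean accepted = q.offer(3);   // queue is full: offer returns false
        System.out.println(accepted);    // false
        System.out.println(q.take());    // 1 (FIFO order)
    }
}
```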
Queues
Blocking
• Implementing this means the code must be
designed with multithreading in mind.
• A lock object or a mutual-exclusion
semaphore (mutex) is used to ensure that
only one thread can access the queue at a time.
Queues
Blocking
public Object dequeue() throws EmptyQueueException {
    synchronized (_mutex) {
        // Block the calling thread until another thread enqueues an item.
        while (isEmpty()) {
            waitForNotification();
        }
        Object value = _queue.dequeue();
        _mutex.notifyAll();   // wake any threads waiting for space or items
        return value;
    }
}
[Figure: ECG Call Centre (0302 611611) — incoming calls (Call 3,
Call 2, Call 1) queue up before being answered by a customer
service agent.]
Queues
Priority Queues
• A priority queue is a more specialized data structure than
a stack or a queue. However, it’s a useful tool in a
surprising number of situations.
• Like an ordinary queue, a priority queue has a front and
a rear, and items are removed from the front.
• However, in a priority queue, items are ordered by key
value so that the item with the lowest key (or in some
implementations the highest key) is always at the front.
• Unlike a simple queue, which yields items in the same
order they were added, a priority queue supports
accessing items in key order, for example from largest
to smallest.
• Items are inserted in the proper position to maintain the
order.
Priority Queue
• In a queue, items are removed for processing in
the same order in which they were added to the
queue.
• In a priority queue, however, each item has a
priority, and when it's time to select an item, the
item with the highest priority is the one that is
chosen. The priority might have nothing to do
with the time at which the item was added to the
data structure.
Priority Queue
• Note that a priority queue is not, strictly
speaking, a queue at all, since a queue is a first-
in, first-out data structure and a priority queue is
not.
• Another way to describe a queue is as an
implementation of a “first-come, first served"
policy. The items in the queue wait in line in
order of arrival. This is the right policy when all
the items in the queue are of equal importance.
Priority Queue
• In many situations, however, some items are
more important than others, and in that case the
more important items should be moved up in the
line, ahead of the other items. A priority queue is
an implementation of this idea.
• The priority of an item represents its importance,
so items of higher priority should be selected
before items of lower priority.
Application Priority Queue
• There are many possible applications of priority
queues. For example, a priority queue could be
used to hold packets of data waiting to be
transmitted over a network. Packets that contain
important data or data that must arrive in a
timely way would be given higher priority and
would therefore be transmitted ahead of lower
priority data.
Application Priority Queue
• In an operating system, a priority queue
might hold "jobs," that is, programs that
are waiting for execution. When processing
time becomes available, a job would be
removed from the priority queue for
processing.
Application Priority Queue
• Because of the way priority queues work,
high priority jobs would be executed
before low priority jobs, even if the low
priority jobs have been waiting longer. Or a
priority queue might contain jobs in a more
literal sense, that is, tasks waiting to be
assigned to workers as they become
available.
Application Priority Queue
• A computer help desk, for example, might
use a priority queue to hold requests for
help until someone becomes available to
process them. (A request for help from,
say, a College President might have a
higher priority than a request from a
student.)
Queues
Priority Queues
• a priority queue is like a queue data
structure, but where additionally each
element has a "priority" associated with it.
• In a priority queue, an element with high
priority is served before an element with
low priority.
• If two elements have the same priority,
they are served according to their order in
the queue.
Queues
Priority Queues
• A mail-sorting analogy applies to a priority
queue.
• When you pick up mail from your mailbox,
you sort it in order of priority: the higher the
priority, the higher the position in the pile.
• The top of the pile of letters corresponds
to the front of the priority queue.
Queues
Priority Queues
• When you have time to answer your mail, you
start by taking the letter off the top (the front of
the queue), thus ensuring that the most
important letters are answered first.
• Also, like ordinary queues, priority queues are
used in various ways in certain computer
systems. In a preemptive multitasking operating
system, for example,
• programs may be placed in a priority queue so
the highest-priority program is the next one to
receive a time-slice that allows it to execute.
Queues
Priority Queues
• In many situations you want access to the item with the lowest
key value (which might represent the cheapest or shortest
way to do something). Thus, the item with the smallest key
has the highest priority. There can be situations in which the
highest key has the highest priority.
• Other characteristics that may influence the choice of a
priority queue implementation are memory requirements,
code size, and the possibility of providing additional
operations such as retrieval of an arbitrary element.
• Besides providing quick access to the item with the smallest
key, you also want a priority queue to provide fairly quick
insertion. For this reason, priority queues are often
implemented with a data structure called a heap.
Queues
Priority Queues
• Types of Priority Queues
– an ascending-priority queue, in which the item with
smallest key has the highest priority and is accessed
with remove().
– A descending-priority queue is one in which the
highest-key item is accessed.
• Efficiency of Priority Queues
– In the priority-queue implementation we show here,
insertion runs in O(N) time,
– while deletion takes O(1) time. We’ll see how to
improve insertion time with heaps.
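A sketch matching the complexities quoted above is an ascending-priority queue over a sorted array (the class name is illustrative): insert() costs O(N) because larger items must be shifted to keep the array ordered, while remove() costs O(1) because the smallest key always sits at the top of the array.

```java
// Sorted-array ascending-priority queue: O(N) insert, O(1) remove.
class SortedArrayPriorityQueue {
    private final int[] a;   // kept in descending order: a[n-1] is smallest
    private int n = 0;

    SortedArrayPriorityQueue(int capacity) { a = new int[capacity]; }

    boolean isEmpty() { return n == 0; }

    void insert(int key) {                   // O(N)
        int j = n - 1;
        while (j >= 0 && key > a[j]) {       // shift smaller keys up one cell
            a[j + 1] = a[j];
            j--;
        }
        a[j + 1] = key;                      // drop the new key into its slot
        n++;
    }

    int remove() { return a[--n]; }          // O(1): smallest key, at the top
}
```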
Queues
Implementation work around
• Implementation Without an Item Count
• Exercise: given two queues Q and Z and two stacks A
and B, use the operations above to process the
following input sequence:
PRIO*R**I*T*Y***QUE***U*E