Unit 6
Queue
Once a new element is inserted into the queue, all the elements inserted before it must be removed in order to remove the new element.
The peek() function is often used to return the value of the first element without dequeuing it.
Applications of Queue
1. Queue
A linear list which allows deletion to be performed at one end of the list and insertion at the opposite end is called a queue.
The information in such a list is processed in FIFO (first in first out) or FCFS (first come first served) order.
Front is the end of the queue from which deletion is performed; rear is the end at which insertion takes place.
Working of Queue
Queue operations work as follows:
Enqueue Operation
Dequeue Operation
When dequeuing the last element, reset the values of FRONT and REAR to -1.
Dequeue: O(1)
Size: O(1)
Queue before enqueue:
Index:  0 1 2 3 4 5 6
Queue:  H E L L O
FRONT = 0, REAR = 4
Algorithm to insert an element in a queue:
Step 1: IF REAR = MAX - 1
            Write OVERFLOW
            Go to Step 4
        [END OF IF]
Step 2: IF FRONT = -1 and REAR = -1
            SET FRONT = REAR = 0
        ELSE
            SET REAR = REAR + 1
        [END OF IF]
Step 3: SET QUEUE[REAR] = NUM
Step 4: EXIT
Queue after enqueuing 'G':
Index:  0 1 2 3 4 5 6
Queue:  H E L L O G
FRONT = 0, REAR = 5
Algorithm to delete an element from a queue:
Step 1: IF FRONT = -1 or FRONT > REAR
            Write UNDERFLOW
        ELSE
            SET VAL = QUEUE[FRONT]
            SET FRONT = FRONT + 1
        [END OF IF]
Step 2: EXIT
Queue after dequeuing (removes 'H'):
Index:  0 1 2 3 4 5 6
Queue:    E L L O G
FRONT = 1, REAR = 5
Step 1 - Include all the header files used in the program and declare all the user-defined functions.
Step 2 - Define a 'Node' structure with two members data and next.
Step 3 - Define two Node pointers 'front' and 'rear' and set both to NULL.
1. Enqueue
Queues maintain two data pointers, front and rear. Therefore, queue operations are comparatively more difficult to implement than those of stacks.
The following steps should be taken to enqueue (insert) data into a queue:
Check if the queue is full. If the queue is not full, increment the rear pointer to point to the next empty space.
Add the data element to the queue location where rear is pointing.
Return success.
procedure enqueue(data)
   if queue is full
      return overflow
   end if
   rear ← rear + 1
   queue[rear] ← data
   return true
end procedure
2. Dequeue
Accessing data from the queue is a process of two tasks: access the data where front is pointing, and remove the data after access. The following steps are taken to perform the dequeue operation:
Check if the queue is empty. If the queue is not empty, access the data where front is pointing.
Increment the front pointer to point to the next available data element.
Return success.
procedure dequeue
   if queue is empty
      return underflow
   end if
   data = queue[front]
   front ← front + 1
   return true
end procedure
3. Peek
peek(): This function helps to see the data at the front of the queue without removing it.
Algorithm
procedure peek
   return queue[front]
end procedure
Example:
int peek()
{
   return queue[front];
}
4. Isfull
isfull(): If we are using a single-dimension array to implement the queue, we just check whether the rear pointer has reached MAXSIZE to determine that the queue is full. If we maintain the queue in a circular linked list, the algorithm will differ.
Algorithm
procedure isfull
   if rear equals MAXSIZE - 1
      return true
   else
      return false
   endif
end procedure
Example:
bool isfull()
{
   if(rear == MAXSIZE - 1)
      return true;
   else
      return false;
}
5. Isempty
isempty(): If the value of front is less than MIN or 0, it tells that the queue is not yet initialized, hence empty.
Algorithm
procedure isempty
   if front is less than MIN OR front is greater than rear
      return true
   else
      return false
   endif
end procedure
Example:
bool isempty()
{
   if(front < 0 || front > rear)
      return true;
   else
      return false;
}
In a typical queue arrangement, a re-buffering problem occurs on every dequeue operation. This problem is solved by joining the front and rear ends of the queue to form a circular queue.
In a circular queue the last node is connected back to the first node to form a circle.
Elements are added at the rear and deleted at the front of the queue.
Both the front and the rear pointers initially point to the beginning of the array.
Figure: Circular queue
1) In case of a circular queue, the head pointer always points to the front of the queue, and the tail pointer always points to the end of the queue.
2) Initially, the head and the tail pointers point to the same location; this means that the queue is empty.
3) New data is always added at the location pointed to by the tail pointer, and once the data is added, the tail pointer is incremented to point to the next available location.
4) In a circular queue, data isn't actually removed from the queue. Only the head pointer is incremented by one position when a dequeue is executed. Because the queue data is only the data between head and tail, the data left outside is no longer part of the queue and is hence considered removed.
5) The head and the tail pointers are reinitialized to 0 whenever they reach the end of the queue.
6) Also, the head and the tail pointers can cross each other. In other words, the head pointer can be greater than the tail. This happens when we dequeue the queue a couple of times and the tail pointer gets reinitialized upon reaching the end of the queue.
1) Initialize the queue, with the size of the queue defined (maxSize), and the head and tail pointers.
2) Enqueue: Check if the number of elements in the queue equals maxSize. If No, then add the new data element at the location pointed to by the tail pointer and increment the tail pointer.
3) Dequeue: Check if the number of elements in the queue is zero. If No, then return the element at the head pointer and increment it.
4) Size: If head <= tail, then size = tail - head + 1; but if head > tail, then size = maxSize - (head - tail) + 1.
Ring buffers are common data structures frequently used when the input and output to a data stream occur at different rates.
Memory Management
CPU scheduling
Circular queues offer a fast and clean way to store FIFO data with a maximum size.
They conserve memory, as we only store up to our capacity (as opposed to a plain queue, which could still grow if input outpaces output).
Circular queues can only store the pre-determined maximum number of elements.
6.6 Multi-queues
Studies have shown that a single-line queue leading to multiple servers is more efficient and leads to less variation in the amount of time customers are kept waiting.
Still, the single-line queue can appear overwhelming to customers who fail to realize that one longer line is actually a better bet than taking their chances with one of many lines.
Has shorter average wait time. It can appear as if a single-line queue comes with a longer wait, but compared to multiple lines, people stand in one line for a much shorter time than if they had chosen from many lines. Service points are staggered, so the entire line benefits from one fast cashier, and the agony of one slow customer is spread evenly among all who wait.
Promotes fairness. "First come, first served" is inarguably the fairest way for a line to form. When all customers are standing in the same line, the perception is that there is no question about who was there first and who should be served before others.
Cuts down on stress. Whether customers are in a rush or not, they pretty much always want to make good use of their time, and this can result in stress about selecting the "right" line. The single-line queue takes away this need to choose.
Reduces jockeying. Line switching is frustrating for customers and businesses alike. Some people stand and scope out the multiple queues, trying to judge which one is their best bet for getting through the line as quickly as possible. Others pick a line and keep rubbernecking, watching for a better option, often jumping from line to line in an attempt to get through faster.
Reduces sweethearting. One of the most common forms of employee theft, sweethearting, is when a cashier neglects to scan a couple of items, or only scans the lower-priced items, as a favor to a friend or loved one. In a multiple-line queue, it's easy for a customer to choose the line staffed by their acquaintance. In a single-line queue, the next person in line has to report to the next available cashier. This random selection process dramatically reduces sweethearting.
Creates flexibility. The customer has greater flexibility in a multiple-line queue because they get to pick the line in which they want to stand. Provided they're not of the jockeying nature, having the power to choose can make a customer happier, because they've selected where they want to be and don't feel forced to stand in a single line.
Deters balking. When there is one line, serpentine or straight, long or short, a customer can feel trapped by the thought of being at the mercy of just one waiting option. Multiple-line queues maintain the illusion that there is more service available and, therefore, worth the wait.
Implementation of k queues
Create a data structure kQueues that represents k queues. The implementation of kQueues should use only one array, i.e., the k queues should use the same array for storing elements. The following functions must be supported by kQueues.
enqueue(int x, int qn) –> adds x to queue number ‘qn’ where qn is from 0 to k-1
dequeue(int qn) –> deletes a component from queue number ‘qn’ where qn is from 0 to k-1
A simple way to implement k queues is to divide the array into k slots of size n/k each, and fix the slots for the various queues, i.e., use arr[0] to arr[n/k-1] for the first queue, and arr[n/k] to arr[2n/k-1] for the second queue, where arr[] is the array used to implement the queues and n is its size.
The problem with this method is inefficient use of array space. An enqueue operation may result in overflow even though there is space available in arr[]. For instance, consider k as 2 and array size n as 6. Suppose we enqueue 3 elements into the first queue and don't enqueue anything into the second queue. When we enqueue the 4th element into the first queue, there will be overflow even though we have space for 3 more elements in the array.
The idea is similar to the k stacks implementation; here we need to use three extra arrays. For k stacks we needed two extra arrays; one more array is required because in queues, enqueue() and dequeue() operations are done at different ends.
front[]: This is of size k and stores the indexes of the front elements of all queues.
rear[]: This is of size k and stores the indexes of the rear elements of all queues.
next[]: This is of size n and stores the index of the next item for every item in array arr[].
Together with the k queues, a stack of free slots in arr[] is also maintained. The top of this stack is stored in a variable 'free'.
All entries in front[] are initialized to -1 to indicate that all queues are empty.
All entries next[i] are initialized to i+1, because all slots are free initially and each points to the next slot. The top of the free stack, 'free', is initialized to 0.
1. Insertion at rear
Given F and R, pointers to the front and rear elements of a queue respectively, and queue Q consisting of N elements, this procedure inserts Y at the rear end of the queue.
1. [Overflow?]
   IF R >= N
   Then Write OVERFLOW
        Return
2. [Increment rear pointer]
   R ← R + 1
3. [Insert element]
   Q[R] ← Y
4. [Is front pointer properly set?]
   IF F = 0
   Then F ← 1
   Return
2. Deletion at front
Given F and R, pointers to the front and rear elements of a queue respectively, and queue Q consisting of N elements, this function deletes an element from the front of the queue.
1. [Underflow?]
   IF F = 0
   Then Write UNDERFLOW
        Return (0)
2. [Delete element]
   Y ← Q[F]
3. [Queue empty?]
   IF F = R
   Then F ← R ← 0
   Else F ← F + 1
4. [Return element]
   Return (Y)
3. Insertion in a circular queue
Given F and R, pointers to the front and rear elements of a circular queue respectively, and circular queue Q consisting of N elements, this procedure inserts Y at the rear of the circular queue.
1. [Reset rear pointer?]
   If R = N
   Then R ← 1
   Else R ← R + 1
2. [Overflow?]
   If F = R
   Then Write OVERFLOW
        Return
3. [Insert element]
   Q[R] ← Y
4. [Is front pointer properly set?]
   If F = 0
   Then F ← 1
   Return
4. Deletion in a circular queue
Given F and R, pointers to the front and rear elements of a circular queue respectively, and circular queue Q consisting of N elements, this function deletes an element from the front of the circular queue. Y is a temporary variable.
1. [Underflow?]
   If F = 0
   Then Write UNDERFLOW
        Return (0)
2. [Delete element]
   Y ← Q[F]
3. [Queue empty?]
   If F = R
   Then F ← R ← 0
        Return (Y)
4. [Increment front pointer]
   If F = N
   Then F ← 1
   Else F ← F + 1
   Return (Y)
1. Insertion in a deque
Given F and R, pointers to the front and rear elements of a queue, a queue Q consisting of N elements, and an element Y, this procedure inserts Y at the front of the queue.
1. [Overflow?]
   IF F = 0 or F = 1
   Then Write OVERFLOW
        Return
2. [Decrement front pointer]
   F ← F - 1
3. [Insert element]
   Q[F] ← Y
   Return
2. Deletion in a deque
Function: DQDELETE_REAR (Q, F, R)
Given F and R, pointers to the front and rear elements of a queue, and the queue Q to which they correspond, this function deletes and returns the last element from the rear of the queue. Y is a temporary variable.
1. [Underflow?]
   IF R = 0
   Then Write UNDERFLOW
        Return (0)
2. [Delete element]
   Y ← Q[R]
3. [Queue empty?]
   IF R = F
   Then R ← F ← 0
   Else R ← R - 1
4. [Return element]
   Return (Y)
Display of queue contents
Given F and R, pointers to the front and rear elements of a queue containing N elements, this procedure displays the queue contents.
1. [Queue empty?]
   IF F = 0 or F > R
   Then Return
2. [Display content]
   Repeat for I = F, F+1, ..., R:
       Write (Q[I])
3. [Return statement]
   Return
A double-ended queue (deque) is a more generalized form of the queue data structure which allows insertion and removal of elements from both ends, i.e., front and rear.
In an input-restricted deque, the enqueue operation takes place only at the rear, but the dequeue operation takes place at both the rear and the front:
An input-restricted queue is useful when we need to remove an item from the rear of the queue
In an output-restricted deque, the enqueue operation takes place at both the rear and the front, but the dequeue operation takes place only at the front:
An output-restricted queue is handy when we need to insert an item at the front of the queue
Here we'll implement a double-ended queue using a circular array. It will have the following methods:
1. Insert Elements at Front
First we check if the queue is full. If it's not full, we insert an element at the front by following the given conditions:
If the queue is empty, then initialize front and rear to 0; both will point to the first element.
Else we decrement front and insert the element. Since we are using a circular array, we have to keep in mind that if front is equal to 0 then instead of decreasing it by 1 we make it equal to SIZE-1.
2. Insert Elements at Rear
Again, first we check if the queue is full. If it's not full, we insert an element at the rear by following the given conditions:
If the queue is empty, then initialize front and rear to 0; both will point to the first element.
Else we increment rear and insert the element. Since we are using a circular array, we have to keep in mind that if rear is equal to SIZE-1 then instead of increasing it by 1 we make it equal to 0.
Figure: Insert 21 at the rear
3. Delete Elements from Front
In order to do this, we first check if the queue is empty. If it's not, then we delete the front element by following the given conditions:
If only one element is present, we once more make front and rear equal to -1.
Else we increment front. But we have to keep in mind that if front is equal to SIZE-1 then instead of increasing it by 1 we make it equal to 0.
4. Delete Elements from Rear
In order to do this, we again first check if the queue is empty. If it's not, then we delete the last element by following the given conditions:
If only one element is present, we make front and rear equal to -1.
Else we decrement rear. But we have to keep in mind that if rear is equal to 0 then instead of decreasing it by 1 we make it equal to SIZE-1.
In a priority queue we are able to insert and delete items from any position based on some property (such as the priority of the processing task).
An element with high priority is dequeued before an element with low priority.
If two elements have the same priority, they are served according to their order in the queue.
Figure:-Priority Queue viewed as a single queue with insertion allowed at any position
Priority Queues
A priority queue is an abstract data type that provides a way to maintain a collection of elements, each with an associated value called a key.
There are two kinds of priority queues: a max-priority queue and a min-priority queue. In both kinds, the priority queue stores a set of elements and is always able to provide the most "extreme" element, which is the only way to interact with the priority queue.
Operations
maxElement: return the largest element in the priority queue.
updatePriorities: assume the values of the keys have changed and reorder the internal state of the priority queue.
Implementations
3) Hash table: Although inserting into a hash table takes constant time (given a good hash function), finding the max element takes linear time. Therefore, this would be a poor choice for the underlying data structure.
Ascending priority queue:
Items are entered arbitrarily and only the smallest item may be removed. If apq is an ascending priority queue, the operation pqinsert(apq, x) inserts element x into apq, and pqmindelete(apq) removes the minimum element from apq and returns its value.
Descending priority queue:
Items are entered arbitrarily and only the largest item may be removed. The operations applicable to a descending priority queue, dpq, are pqinsert(dpq, x) and pqmaxdelete(dpq). pqinsert(dpq, x) inserts an element x into dpq and is logically identical to pqinsert for an ascending priority queue. pqmaxdelete(dpq) removes the maximum element from dpq and returns its value.
Using Heaps-
A heap is generally preferred for priority queue implementation because heaps provide better performance compared to arrays or linked lists. In a binary heap, getHighestPriority() can be implemented in O(1) time, insert() in O(log n) time, and deleteHighestPriority() in O(log n) time.
With a Fibonacci heap, insert() and getHighestPriority() can be implemented in O(1) amortized time, and deleteHighestPriority() in O(log n) amortized time.
1) CPU Scheduling