
SOFTWARE REQUIREMENTS SPECIFICATION (SRS) REPORT

ON
DIGITAL HANDWRITTEN ANSWER SHEET EVALUATION SYSTEM

SUBMITTED TO THE SAVITRIBAI PHULE PUNE UNIVERSITY, PUNE


IN PARTIAL FULFILLMENT OF THE REQUIREMENTS
FOR THE AWARD OF THE DEGREE

OF

BACHELOR OF ENGINEERING (COMPUTER ENGINEERING)

CO-ORDINATOR: PROF. NITIN SHIVALE, PROF. SHRISHAIL PATIL

GUIDE NAME: PROF. MADHAVI KULKARNI

SUBMITTED BY:

GAYATRI ADHAV 01
ROHINI CHAVAN 72
VRUSHALI DESHMUKH 14
KUNAL WADILE 70

DEPARTMENT OF COMPUTER ENGINEERING

JSPM BHIVARABAI SAWANT INSTITUTE OF TECHNOLOGY & RESEARCH


GAT NO: 720/1 and 2, NAGAR ROAD, WAGHOLI, PUNE – 412207
SAVITRIBAI PHULE PUNE UNIVERSITY

2023-2024
Table of Contents
• 1. Introduction
– 1.1 System Purpose
– 1.2 System Scope
– 1.3 Definitions, Acronyms, and Abbreviations
– 1.4 References
– 1.5 System Overview
• 2. General System Description
– 2.1 System Context
– 2.2 System Modes and States
– 2.3 Major System Capabilities
– 2.4 Major System Conditions
– 2.5 Major System Constraints
– 2.6 User Characteristics
– 2.7 Assumptions and Dependencies
– 2.8 Operational Scenarios
• 3. System Capabilities, Conditions, and Constraints
– 3.1 Physical
∗ 3.1.1 Construction
∗ 3.1.2 Durability
∗ 3.1.3 Adaptability
∗ 3.1.4 Environmental Conditions
– 3.2 System Performance Characteristics
– 3.3 System Security
– 3.4 Information Management
– 3.5 System Operations
∗ 3.5.1 System Human Factors
∗ 3.5.2 System Maintainability
∗ 3.5.3 System Reliability
– 3.6 Policy and Regulation
– 3.7 System Life Cycle Sustainment
• 4. System Interfaces
1. INTRODUCTION

The manual approach to evaluating subjective answers for technical subjects demands a significant amount of
time and effort from the evaluator. Subjective answers can be assessed against a variety of criteria, including the
question's specific topic and the writing style, which makes their evaluation a critical task. When a human
examines an answer, the quality of evaluation can vary with the examiner's state of mind. Because the same
inference method is applied to all students, evaluation by computers using intelligent algorithms provides
uniformity in grading.

Online evaluation is a much faster and clearer way to apply the relevant marking schemes, and it brings
greater transparency to the present method of answer checking. The answers to all the questions, once extracted,
are stored in a database designed for easy access. This saves considerable effort and time on the teacher's part;
the human effort spent on this repetitive task can instead go into other academic endeavours. Obvious human
mistakes are reduced, yielding an unbiased result, and the system calculates scores and delivers results quickly.
The system can be widely used in academic institutions such as schools, colleges, and coaching institutes for
checking answer sheets.

1.1 System Purpose

The purpose of this Software Requirements Specification (SRS) document is to define the detailed requirements
and specifications for the development of a Digital Handwritten Answer Sheet Evaluation System for Teacher use.

The current approach to verifying handwritten (subjective) papers is inefficient, and evaluating subjective
answers is a critical task. The purpose of a digital handwritten answer sheet evaluation system is to automate the
checking of students' handwritten responses in educational evaluations, exams, and tests. The system analyzes
handwritten data using technologies such as Optical Character Recognition (OCR), Natural Language
Processing (NLP), and machine learning algorithms, resulting in faster and more effective grading.
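As a minimal illustration of the grading step that follows OCR, the sketch below scores a recognized answer against a model answer by textual similarity. The function name and the 0-10 mark scale are assumptions for illustration, not requirements stated in this SRS, and the OCR step itself is assumed to have already produced the recognized text.

```python
# Sketch: once OCR has produced text for an answer, one simple way to score it
# is sequence similarity against a model answer. Names and mark scale are
# illustrative assumptions, not part of this SRS.
from difflib import SequenceMatcher

def grade_answer(recognized_text: str, model_answer: str, max_marks: int = 10) -> float:
    """Return marks proportional to textual similarity with the model answer."""
    ratio = SequenceMatcher(None, recognized_text.lower(), model_answer.lower()).ratio()
    return round(ratio * max_marks, 1)

model = "OCR converts handwritten text into machine-readable form"
student = "ocr converts handwritten text into machine readable form"
print(grade_answer(student, model))
```

A production system would replace raw string similarity with NLP techniques (semantic similarity, keyword matching) as described later in this document, but the overall shape of the grading call remains the same.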

1.2 System Scope:

A handwritten answer sheet evaluation system for education covers the automation of grading procedures
for different tests, providing speed, accuracy, and standardized evaluation.
Such systems reduce educators' workload, provide students with rapid feedback, and improve data security.
The system also improves resource efficiency, accessibility, diversity, and adaptability in learning environments.
This project is intended for use in the education sector.

1.3 Definitions, Acronyms, and Abbreviations:

Definitions:

This section provides clear and concise explanations of key terms and concepts relevant to the project. It helps to
eliminate ambiguity and ensures that everyone involved understands the terminology. Examples of definitions
might include:
• Handwritten Answer Sheet: A physical or digital document where students write their responses to questions
or assignments by hand.
• Automated Evaluation: The process of using technology, such as optical character recognition (OCR) and
machine learning, to assess and grade handwritten answer sheets automatically.
• OCR (Optical Character Recognition): A technology that converts handwritten or printed text into machine-
readable text by recognizing and interpreting characters.
• Grading Criteria: The predefined set of rules, rubrics, or guidelines used to evaluate and assign scores to
answers on the answer sheet.
• Machine Learning: A subset of artificial intelligence that involves training algorithms to recognize patterns
in handwritten text and assign scores based on predefined grading criteria.
• Natural Language Processing (NLP): A field of AI that focuses on the interaction between computers and
human language, used to analyze and understand the content of handwritten answers.
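The "Grading Criteria" definition above can be made concrete with a small rubric-based scorer: marks are awarded for the fraction of required keywords found in the recognized answer. The keyword list and the equal weighting are hypothetical; a real rubric would come from the examiner.

```python
# Sketch of rubric-based grading criteria: award marks for the fraction of
# required keywords present in the recognized answer. Keywords and weighting
# are illustrative assumptions.
def rubric_score(answer: str, keywords: list[str], max_marks: int = 5) -> float:
    text = answer.lower()
    hits = sum(1 for kw in keywords if kw.lower() in text)
    return round(max_marks * hits / len(keywords), 1)

rubric = ["ocr", "machine-readable", "characters"]
answer = "OCR recognizes characters and outputs machine-readable text."
print(rubric_score(answer, rubric))  # all 3 keywords present -> full marks
```

A machine-learning model, as defined above, would learn such criteria from graded examples rather than having them hand-coded, but keyword rubrics are a common and easily inspected baseline.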

Acronyms:

In this section, you list commonly used acronyms along with their full forms. This is particularly useful for
avoiding confusion and ensuring that everyone knows what the abbreviations stand for. For instance:
• DHANSES: Digital Handwritten Answer Sheet Evaluation System

Abbreviations:
Abbreviations are shortened forms of longer terms or phrases used frequently in the project. While they are
similar to acronyms, they may not always be formed from the initial letters of the words. Examples include:
• Data: Referring to various types like Handwritten Answer Sheets, Ground Truth Data, Metadata, etc.
• APIs: Mentioning specific application programming interfaces that are integrated, such as REST
(Representational State Transfer) or SOAP (Simple Object Access Protocol).

1.4 References:

1. Muhammad Farrukh Bashir, Hamza Arshad, Abdul Javed, Natalia Kryvinska, Shahab Band, "Subjective
Answers Evaluation Using Machine Learning and Natural Language Processing", IEEE, December 7, 2021.
2. Sijimol P J, Surekha Mariam Varghese, "Handwritten Short Answer Evaluation System (HSAES)", IJSRST,
February 28, 2018.
3. Pranali Nikam, Mayuri Shinde, Rajashree Mahajan, and Shashikala Kadam, "Automatic Evaluation of
Descriptive Answer Using Pattern Matching Algorithm", IJCSE, January 31, 2015.
4. Senthilkumar K, Aroabinesh J, Gowtham T, Manikandan K, "Automatic Answer Evaluation Using Deep
Learning Algorithms", ECB, March 15, 2023.
5. Mandada Samemi, Tirumala Sai Hareesha, Gudluru Venkata Siva Sai Avan Kumar, Nalluri Pramod,
"Automatic Answer Evaluation Using Machine Learning", UGC, 2022.
6. Madhavi B. Desai, Visarg D. Desai, Rahul S. Gupta, Deep D. Mevada, and Yash S. Mistry, "A Survey on
Automatic Subjective Answer Evaluation", Advances and Applications in Mathematical Sciences, September
11, 2021.

2. GENERAL SYSTEM DESCRIPTION:
2.1 System Context:
• The system context in a digital handwritten answer sheet evaluation system is a comprehensive framework
that encompasses all the crucial elements and interactions defining its operation within its environment.
This context begins with the hardware, which includes the scanning or capturing devices used to digitize
handwritten answer sheets, as well as any infrastructure supporting data processing and storage. The
software component is pivotal, encompassing the evaluation algorithms, data management applications,
and potentially machine learning models for handwriting recognition and grading.
• Data sources comprise the answer sheets themselves, necessitating careful management and metadata
handling. Users, including educators, students, and administrators, play a central role, and their needs must
be understood for system design. Integration with other educational systems, security and privacy
considerations, scalability, reporting and analytics features, ongoing maintenance, and compliance with
institutional policies and regulations all contribute to the intricate system context. Considering these
elements is essential for designing a robust and user-friendly digital handwritten answer sheet evaluation
system.

2.2 System Modes and States:

In a digital handwritten answer sheet evaluation system, "system modes" and "states" refer to the different
operational configurations and conditions the system adopts to fulfill its functions effectively. The principal
system states are:

• Idle State: When the system is not actively processing answer sheets, it’s in an idle state, waiting for new
tasks or input.
• Scanning State: During scanning, the system is in this state, receiving and processing scanned images, and
preparing them for further evaluation.
• Processing State: After scanning, the system transitions to the processing state, where it interprets the
handwritten content and extracts relevant data.
• Reporting State: Once grading is complete, the system enters the reporting state, generating and presenting
evaluation results.
• Maintenance State: This state is essential for system updates, backups, and maintenance tasks to ensure
smooth operation over time.
• Error State: In case of errors, exceptions, or issues, the system may enter an error state, requiring
intervention.
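The states listed above can be modeled as an explicit state machine. The transitions below are inferred from this section's descriptions (e.g., scanning feeds processing, processing feeds reporting) and are an assumption for illustration, not requirements stated elsewhere in the SRS.

```python
# Sketch of the system states as an explicit state machine with guarded
# transitions. The transition table is an inference from this section.
from enum import Enum, auto

class State(Enum):
    IDLE = auto()
    SCANNING = auto()
    PROCESSING = auto()
    REPORTING = auto()
    MAINTENANCE = auto()
    ERROR = auto()

TRANSITIONS = {
    State.IDLE: {State.SCANNING, State.MAINTENANCE},
    State.SCANNING: {State.PROCESSING, State.ERROR},
    State.PROCESSING: {State.REPORTING, State.ERROR},
    State.REPORTING: {State.IDLE},
    State.MAINTENANCE: {State.IDLE},
    State.ERROR: {State.IDLE, State.MAINTENANCE},
}

def transition(current: State, target: State) -> State:
    """Move to target if the transition is allowed, else raise."""
    if target not in TRANSITIONS[current]:
        raise ValueError(f"illegal transition {current.name} -> {target.name}")
    return target

print(transition(State.IDLE, State.SCANNING).name)  # SCANNING
```

Making transitions explicit like this lets the implementation reject impossible sequences (e.g., reporting before processing) instead of failing silently.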

2.3 Major System Capabilities:

The major system capabilities in a digital handwritten answer sheet evaluation system include:
• Handwriting Recognition: The ability to accurately recognize and convert handwritten text and content
into digital format.
• Scanning and Image Processing: Efficiently capturing and enhancing digital images of handwritten answer
sheets for analysis.
• Data Extraction: Extracting relevant information, such as student responses and identification details, from
the answer sheets.

• Grading Algorithms: Employing automated algorithms to assess and assign scores to answers based on
predefined criteria.
• Reporting and Feedback: Generating comprehensive reports, performance analytics, and feedback for
educators and students.
• User Management: Managing user roles, permissions, and access control within the system.

2.4 Major System Conditions:

• Data Availability: Reliable access to digital versions of handwritten answer sheets, which may involve
scanning or image capture processes.
• Handwriting Recognition: The system must be capable of accurately recognizing and interpreting
handwritten text and content.
• Data Integrity: Ensuring that the extracted data and digitized content accurately represent the original
handwritten answers.
• Scalability: The system should be able to handle a large volume of answer sheets during peak evaluation
periods, maintaining performance and responsiveness.
• Security and Privacy: Robust measures to protect sensitive student data and evaluation results, ensuring
compliance with data protection regulations.
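The scalability condition above (handling a large volume of answer sheets during peak periods) can be sketched as batch grading with a worker pool. `grade_sheet` is a placeholder standing in for the real OCR-plus-grading step; the pool size is an illustrative assumption.

```python
# Scalability sketch: grade many sheets concurrently with a worker pool so a
# large batch during exam season is processed in parallel. grade_sheet is a
# stand-in for the real evaluation step.
from concurrent.futures import ThreadPoolExecutor

def grade_sheet(sheet_id: int) -> tuple[int, float]:
    # placeholder for scan + OCR + grading; returns (sheet id, marks)
    return sheet_id, 7.5

def grade_batch(sheet_ids: list[int], workers: int = 4) -> dict[int, float]:
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return dict(pool.map(grade_sheet, sheet_ids))

results = grade_batch(list(range(100)))
print(len(results))  # 100
```

For CPU-bound recognition work a process pool (or a distributed queue) would replace the thread pool, but the batch interface stays the same.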

2.5 Major System Constraints:

• Data Quality and Availability: Obtaining high-quality digital handwritten answer sheets can be
challenging. Ensuring that the system can accurately interpret and evaluate a wide range of handwriting
styles and qualities is crucial.

• Scalability: Handling a large volume of answer sheets, especially during peak evaluation periods, may
require substantial computational resources. Scalability is important to ensure timely evaluation.

• Model Interpretability: Teachers or evaluators need to understand how the system arrived at its scores and
recommendations to trust and accept the results.

• Scoring Consistency: Ensuring that the system provides consistent and fair evaluations across different
evaluators and answer sheets is important for maintaining trust in the system.

2.6 User Characteristics:


• Teachers and Evaluators: These are the primary users responsible for evaluating answer sheets.
• Students: While not directly interacting with the evaluation system, students are the subjects of the
evaluation.
• Administrators: Educational institutions or organizations may have administrators who oversee the
evaluation process. They require tools for managing users, access permissions, and monitoring system
performance.
• Technical Support Staff: In case of technical issues or system errors, technical support staff should have
access to tools for resolving problems quickly to minimize disruptions in the evaluation process.

2.7 Assumptions and Dependencies:

A. Assumptions:

• Availability of Digital Answer Sheets: It is assumed that a sufficient number of digital answer sheets will be
available for evaluation. Insufficient answer sheets could limit the system’s usefulness.
• Quality of Digital Scans: The project assumes that the quality of digital scans or images of handwritten answer
sheets will be good enough for accurate evaluation. Poor-quality scans may lead to errors in recognition and
evaluation.
• Data Privacy and Security: The project assumes that data privacy and security measures are in place to protect
student and evaluator information.
• Technical Infrastructure: It is assumed that the necessary technical infrastructure, including servers,
databases, and network resources, is available to support the system’s operations.

B. Dependencies:

• Image Recognition Software: The system depends on image recognition software to process and interpret
handwritten text.
• Integration with Educational Systems: The system may need to integrate with educational management
systems or databases to access student and course information. Successful integration is a key dependency.
• Software Updates: Regular software updates and maintenance are necessary to keep the system running
smoothly and securely. Dependencies on software updates and their timing should be managed.
• User Feedback and Improvement: User feedback is essential for improving the system over time.

2.8 Operational Scenarios:

A digital handwritten answer sheet evaluation system streamlines the evaluation process. It begins with answer
sheet scanning and data storage. Image recognition software interprets and extracts text and marks, applying
predefined grading criteria to evaluate and score answers. Feedback reports are generated for students and
educators. Teachers access the system via web or mobile interfaces to review and validate automated evaluations,
with the option for manual review when needed.
User feedback informs system updates, ensuring continuous improvement. Data privacy and compliance are
prioritized, safeguarding student and evaluator information. Technical support is available, and the system may
integrate with educational management systems.
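The workflow described above (scan, recognize, grade, report, with manual review when needed) can be composed as a simple pipeline. All function bodies below are placeholders standing in for the real scanning, OCR, and grading components; the review threshold is an illustrative assumption.

```python
# Sketch of the operational pipeline: scan -> recognize -> grade -> report.
# Every stage body is a placeholder for the real component.
def scan(sheet_path: str) -> bytes:
    return b"image-bytes"            # stand-in for scanner/camera capture

def recognize(image: bytes) -> str:
    return "sample recognized text"  # stand-in for OCR

def grade(text: str) -> float:
    return 8.0                       # stand-in for rubric-based grading

def report(sheet_path: str) -> dict:
    """Run the full pipeline and flag low scores for manual teacher review."""
    marks = grade(recognize(scan(sheet_path)))
    return {"sheet": sheet_path, "marks": marks, "needs_review": marks < 4.0}

print(report("student01.png"))
```

Keeping each stage behind a small function boundary is what later allows the maintainability goals in Section 3.5.2: any single stage can be updated or swapped without touching the others.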

3. SYSTEM CAPABILITIES, CONDITIONS, AND CONSTRAINTS:
3.1 Physical:
3.1.1 Construction:

The construction phase in a digital handwritten answer sheet evaluation system involves building the
system’s core infrastructure and creating smart algorithms that can recognize and analyze different handwriting
styles on answer sheets. It also includes designing a user-friendly interface that allows educators to easily upload
answer sheets, initiate evaluations, and view results in a clear format. Accessibility is a key focus, ensuring that
educators with varying tech skills can use the system effectively. This approach enhances the system’s usability
and adoption in educational institutions.

3.1.2 Durability:

The durability of a digital handwritten answer sheet evaluation system refers to its ability to maintain
effectiveness and relevance over time, even in the face of evolving educational practices, technological
advancements, and changing user needs. This entails ongoing updates, maintenance, and adaptability to ensure the
system continues to provide accurate evaluations and remains user-friendly. Durability is essential to ensure the
long-term usefulness and sustainability of the system in educational settings.

3.1.3 Adaptability:

Adaptability in a digital handwritten answer sheet evaluation system involves several crucial elements. It
relies on machine learning algorithms that continuously learn and improve accuracy in recognizing various
handwriting styles. It allows for regular updates and enhancements, keeping the system aligned with changing
educational needs. User feedback plays a vital role in shaping the system, enhancing its usability and satisfaction.
Additionally, the system’s compatibility with diverse answer sheet formats and educational contexts ensures its
relevance and effectiveness as education practices evolve. In essence, adaptability ensures the system remains
modern, precise, and user-friendly in an ever-changing educational landscape.

3.1.4 Environmental Conditions:

Environmental conditions in a digital handwritten answer sheet evaluation system encompass factors such as
internet connectivity, device compatibility, security, and scalability. The system must be adaptable to different
scenarios, devices, and locations while maintaining robust security and accommodating varying user volumes,
ensuring effective performance in diverse educational environments.

3.2 System Performance Characteristics:


a) Dynamic Actions and Changes: This part focuses on how the system handles things like how fast it grades
answer sheets, how quickly it learns to recognize different handwriting styles, how many answer sheets it can
process at once, and how well it deals with background noise during grading.

b) Quantitative Criteria and Endurance: Here, we specify measurable standards for how long the system
should last and how it should perform in different situations. This includes how often it should be used, how long
each grading session should take, and its expected lifespan.

c) Performance Requirements for Different Situations: In this section, we outline what the system needs to
do in various situations. For example, during busy times like exam season, it should still work quickly. It should
also be able to handle different amounts of work and ways of using it, whether grading a batch of answer sheets
all at once or grading them one by one.

3.3 System Security:

• Facility Security: The place where the system’s physical parts are kept should be protected. This means using
locks, cameras, and other measures to stop people from getting to the system’s computers and hardware
without permission.
• Operational Security: To make sure the system works safely, we can do a few things:
• Usernames and Passwords: Only people with the right usernames and passwords can use the system. This
keeps out anyone who shouldn’t be there.
• Data Protection: We make sure that student information is turned into secret code when it’s sent or stored, so
no one else can understand it.
• Access Control: Not everyone should see everything in the system. Teachers might see student results, but
only administrators can change how the system works.
• Data Backup: We make copies of the information in case something bad happens, like a computer breaking.
This way, we don’t lose important data.
• Isolation: In some cases, we keep different parts of the system separate so they can’t talk to each other unless
they’re supposed to.
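The "usernames and passwords" control above implies that passwords should never be stored in plain text. A common approach, shown here as a sketch, is to store only a salted PBKDF2 hash; the iteration count and salt size are illustrative parameters, not values specified in this SRS.

```python
# Sketch of password storage for operational security: keep only a salted
# hash, never the password itself. Parameters are illustrative.
import hashlib
import hmac
import os

def hash_password(password: str, salt: bytes = b"") -> tuple[bytes, bytes]:
    """Return (salt, digest); a fresh random salt is generated if none given."""
    salt = salt or os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    # constant-time comparison prevents timing attacks on the check
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, digest)

salt, digest = hash_password("teacher-secret")
print(verify_password("teacher-secret", salt, digest))  # True
```

The same pattern extends to the "Data Protection" bullet: student data at rest would additionally be encrypted with a proper symmetric cipher rather than hashed, since it must be recoverable.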

3.4 Information Management:

1. Data Collection: Establish clear procedures for collecting digital answer sheets from users, ensuring data
integrity during the submission process. Implement data validation checks to verify the correctness of
uploaded documents.
2. Data Storage: Set up secure and reliable data storage systems to safeguard student answer sheets and
evaluation results. Implement encryption for data at rest to protect it from unauthorized access.
3. Data Processing: Define how the system processes handwritten answers, ensuring accurate grading and
evaluation. Implement algorithms and machine learning models for effective handwriting recognition.
4. User Access Control: Implement user authentication mechanisms to control access to the system. Different
user roles (e.g., teachers, and administrators) should have varying levels of access to data and system
features.
5. User Training: Train system users, including teachers and administrators, on data handling best practices and
security protocols to minimize human errors.
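The user access control point above (different roles with different levels of access) can be sketched as a role-to-permission table. The role and permission names below are illustrative assumptions drawn from the user characteristics in Section 2.6, not an authoritative access model.

```python
# Sketch of role-based access control: each role maps to the set of actions
# it may perform. Role and permission names are illustrative.
PERMISSIONS = {
    "teacher": {"view_results", "validate_evaluation"},
    "administrator": {"view_results", "validate_evaluation",
                      "manage_users", "configure_system"},
    "support": {"view_logs"},
}

def can(role: str, action: str) -> bool:
    """True if the given role is allowed to perform the action."""
    return action in PERMISSIONS.get(role, set())

print(can("teacher", "manage_users"))  # False
```

Centralizing the table makes audits straightforward: to review who can change system configuration, one inspects a single mapping rather than scattered checks.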

3.5 System Operations:


3.5.1 System Human Factors:

In a digital handwritten answer sheet evaluation system, considering human factors is essential. We follow
industry guidelines for user-friendly design, address unique user needs, make user interactions with the system easy
and error-resistant, pay extra attention to critical areas like data security, and conduct usability testing with real
users. This ensures the system is user-friendly, efficient, and safe, and minimizes the risk of errors, particularly in
crucial operations.

3.5.2 System Maintainability:

In the context of a digital handwritten answer sheet evaluation system, system maintainability holds paramount
importance. Given the dynamic nature of education and the continuous evolution of machine learning models, the
ability to swiftly implement updates, address bugs, and enhance model performance is crucial. Therefore, the
software architecture and infrastructure of this system must be meticulously designed with ease of maintenance in
mind. This includes implementing robust version control mechanisms to track changes, automated testing
procedures for early issue detection, and comprehensive documentation to facilitate troubleshooting and system
understanding.

3.5.3 System Reliability:

System reliability for a digital handwritten answer sheet evaluation system refers to the system’s ability to
consistently and accurately assess handwritten answers. This reliability is essential to ensure fair and consistent
grading. Factors contributing to system reliability include robust handwriting recognition algorithms, effective error
detection and correction mechanisms, and regular system maintenance and updates to improve accuracy and
consistency in evaluating handwritten responses. High system reliability minimizes errors and inconsistencies in
grading, ensuring fairness and trust in the assessment process.

3.6 Policy and Regulation:

Digital handwritten response sheet assessment systems are subject to policies and rules that cover privacy
protection, fairness, accessibility, cybersecurity, transparency, responsibility, validation, and ongoing development.
These measures support fairness and integrity in the assessment process by ensuring the secure handling of student
data, preventing unfair treatment, protecting against hacking, disclosing system operations, addressing errors,
validating accuracy, and promoting ongoing system improvements.

3.7 System Life Cycle Sustainment:

• Monitoring and Feedback: Continuous assessment of system performance and user feedback to identify
issues and areas for improvement.
• Model Updates: Adapting machine learning models to evolving educational standards and assessment
criteria, involving retraining with fresh data.
• User Support and Training: Providing ongoing support and training materials for teachers and evaluators to
maximize system utilization.

• Maintenance and Bug Fixes: Regular updates and bug fixes to ensure the system operates smoothly and
reliably.

4. SYSTEM INTERFACES:

4.1 User Interface:
1 User-friendly Interface: The system must provide an intuitive and user-friendly interface for teachers
and educators to interact with answer sheets easily.
2 Answer sheet Upload: Allow users to upload digital answer sheets in various formats for evaluation.
3 Automated Evaluation: Implement automated recognition and grading features with clear grading
criteria.
4 Result Generation: Generate scores for students.
4.2 Software Interface:
1. Application Programming Interface (API): The system may offer APIs to allow integration with external
educational software and platforms, enabling developers to programmatically access evaluation results and
integrate them into broader educational systems.
2. Database Management System (DBMS): The system should interface with DBMS to efficiently store,
manage, and retrieve evaluation data, and student results.
3. Handwriting Recognition Software: Integration with handwriting recognition software is crucial for
accurately converting handwritten answers into digital text for evaluation.
4. Data Processing and Analysis Software: To handle the processing and analysis of handwritten responses,
the system may require data processing and analysis software capable of recognizing and evaluating handwritten
text.
5. Web Browser: The user interface should be compatible with various web browsers, ensuring that
educators and administrators can access the system easily regardless of their platform.
6. Scanning Interface: This interface is used for scanning or capturing handwritten answer sheets in a
digital format. It may involve hardware such as scanners or cameras.
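The DBMS interface described in item 2 can be sketched with SQLite: storing per-question marks and retrieving a student's total. The table and column names are illustrative, not specified in this SRS; a deployment would likely use a server-based DBMS behind the same SQL interface.

```python
# Sketch of the DBMS interface: store evaluation results and query a total.
# Schema and names are illustrative assumptions.
import sqlite3

conn = sqlite3.connect(":memory:")  # in-memory DB for the sketch
conn.execute(
    "CREATE TABLE results (student_id TEXT, question INTEGER, marks REAL)"
)
conn.executemany(
    "INSERT INTO results VALUES (?, ?, ?)",
    [("S01", 1, 7.5), ("S01", 2, 6.0), ("S02", 1, 9.0)],
)
total = conn.execute(
    "SELECT SUM(marks) FROM results WHERE student_id = ?", ("S01",)
).fetchone()[0]
print(total)  # 13.5
```

Parameterized queries, as used above, are also the standard defense against SQL injection, which connects this interface back to the security requirements in Section 3.3.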
