This document discusses types of errors in measurement, precision, accuracy, and rules for significant figures. It defines random error as occurring during a single reading, while systematic error affects all readings due to issues with the measuring system. Precision refers to the closeness of multiple measurements, and accuracy is closeness to the expected value. The document also provides examples of uncertainty ranges for analog and digital measurements and outlines rules for determining significant figures in a number based on zero positions.
Random error: an error that affects a single reading, varying unpredictably from one measurement to the next.
Systematic error: an error in how the measuring system is set up, causing all subsequent readings to be wrong in the same way.
Precision: closeness of repeated measurements to one another.
Accuracy: closeness of a measurement to the accepted/expected value.
Note: report both the uncertainty and the error with every measurement.
Analog instruments: +/- half of the least count.
Digital instruments: +/- one unit in the last displayed digit (e.g. +/- 0.001 on a three-decimal display).
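As a rough sketch, the two uncertainty conventions above can be written in Python. The function names here are illustrative, not from the source:

```python
def analog_uncertainty(least_count: float) -> float:
    """Analog scale: uncertainty is half of the least count."""
    return least_count / 2.0

def digital_uncertainty(least_count: float) -> float:
    """Digital display: uncertainty is one unit in the last displayed digit."""
    return least_count

# Example: a ruler marked in millimetres (least count = 1 mm)
print(f"Analog: +/-{analog_uncertainty(1.0)} mm")     # +/-0.5 mm
# Example: a balance reading to 0.001 g
print(f"Digital: +/-{digital_uncertainty(0.001)} g")  # +/-0.001 g
```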
Rules for significant figures
- Zeros in the middle of a number are significant.
- Zeros at the beginning of a number are not significant.
- Trailing zeros after the decimal point are significant.
- Zeros at the end of a number and before an implied decimal point may or may not be significant.
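The rules above can be sketched as a small Python function that counts significant figures in a numeric string. This is an illustrative implementation, not from the source; for the ambiguous case (trailing zeros with an implied decimal point, e.g. "1500") it takes the conservative reading and does not count them:

```python
def count_sig_figs(s: str) -> int:
    """Count significant figures in a numeric string."""
    s = s.lstrip("+-")
    if "." in s:
        # Explicit decimal point: leading zeros are not significant,
        # but trailing zeros after the point are.
        digits = s.replace(".", "").lstrip("0")
    else:
        # Implied decimal point: trailing zeros are ambiguous,
        # so treat them as not significant here.
        digits = s.lstrip("0").rstrip("0")
    return len(digits) if digits else 1

print(count_sig_figs("105"))      # 3 (zero in the middle is significant)
print(count_sig_figs("0.00520"))  # 3 (leading zeros not significant, trailing one is)
print(count_sig_figs("1500"))     # 2 (trailing zeros treated as not significant)
```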