In this video, we're going to look at the uncertainty involved when you measure a quantity. When you measure something, say the length of a metal rod, there's always some uncertainty involved in the measurement. Even if you take great care with your measurement, it's impossible to know the exact length of the rod. You may be limited by your measuring instrument: perhaps your ruler only goes to millimetres and the rod is not an exact number of millimetres. You may be limited by your eyesight: how well can you read the scale? You may be limited by environmental conditions: that rod is going to expand or contract if its temperature changes.

So one of the most important skills for a good scientist is to understand how much error is involved when she takes a measurement. By error I don't mean a mistake, like having the ruler back to front and reading from the wrong end, although that mistake would certainly lead to a large error in your measurement. The proper definition of measurement error is the difference between the measured value of a quantity and its true or standard value. So you need to be able to identify and understand what sources of error there might be when you take a measurement.

There are two classes of measurement error: systematic error and random error. You may want to pause the video here and read through this table, and then I'll take you through some examples.

Systematic error is when your measurement is consistently, or systematically, bigger or smaller than the true value. For instance, my judo-playing friend has a set of bathroom scales that always show a reading that is 600 grams heavier than his true weight. He knows this because he gets weighed during competitions on scales which are extremely accurate, so when he weighs himself at home he has to subtract 600 grams from whatever the scales say. This is an example of a systematic error and how you might compensate for it.
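The compensation step can be sketched as a tiny calculation (a hypothetical Python sketch: the 600-gram offset comes from the example above, but the function name and the sample reading are mine):

```python
SCALE_OFFSET_G = 600  # the scales consistently read 600 g heavier than the true value

def compensated_weight(reading_g):
    """Correct a reading from the bathroom scales for their known systematic error."""
    return reading_g - SCALE_OFFSET_G

# If the scales show 73,600 g, the true weight is 73,000 g (73 kg):
print(compensated_weight(73600))  # → 73000
```

The point is that a systematic error, once measured, is just a constant you can subtract out.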
Another example would be, say, you were reading the meniscus of a liquid in a measuring cylinder and you were unwittingly but consistently holding your head a little above the meniscus. If you do this, the meniscus appears to line up with graduations that are higher than the true value. This particular kind of error, arising from your eyes not being level with the meniscus, is called parallax error. So systematic errors, once you've recognized them, are predictable. In the case of the scales, the systematic error was measured and could be compensated for in calculations. In the case of the measuring cylinder, once the source of error was identified, the experimenter could improve her technique, ensuring she's always at eye level with the meniscus when she reads it, and thereby eliminate the error.

Random error, on the other hand, is unpredictable. The readings are not consistently bigger or smaller than the true value but will be all over the place. Random errors are caused either by inherent fluctuations in your measuring instrument or perhaps by some inconsistency in the way that you take your measurements. For instance, sometimes when you're weighing something on laboratory scales you'll find the value flickers up and down when you're trying to read it. This could be due to fluctuations within the instrument itself or to random air movements around the balance. Or imagine that you have to measure out a hundred millilitres of water in a measuring cylinder a number of times. Each time you use the measuring cylinder you do your best to get the meniscus right on the hundred millilitre mark, but it's really difficult to get it perfect every single time. Sometimes you'll be a little bit over and sometimes you'll be a little bit under. Both of these are examples of random error. Happily, random error can be minimized by averaging over a number of measurements.
If the error is truly random, then averaging over a sufficient number of values will reduce the error to close to zero, because the high values will cancel out the low values. In the case of reading the meniscus, and in many situations where some skill is required to determine the measurement value, more practice and care will also reduce random error.
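The cancellation described above can be demonstrated with a short simulation (a Python sketch under assumed conditions: each reading of the 100 mL mark is taken to be off by a uniformly random amount of up to 1 mL, and the numbers are illustrative only):

```python
import random

TRUE_VOLUME_ML = 100.0   # the value we are trying to measure
random.seed(0)           # fixed seed so the sketch is reproducible

def read_cylinder():
    # Assume each reading carries a random error of up to +/- 1 mL,
    # sometimes over the mark and sometimes under it.
    return TRUE_VOLUME_ML + random.uniform(-1.0, 1.0)

single_reading = read_cylinder()
average_of_many = sum(read_cylinder() for _ in range(10_000)) / 10_000

# A single reading can be off by up to 1 mL, but in the average the
# high readings cancel the low ones, landing very close to the true value.
print(abs(single_reading - TRUE_VOLUME_ML))
print(abs(average_of_many - TRUE_VOLUME_ML))
```

Note that this averaging trick only helps with random error; it would do nothing for a systematic error like the bathroom scales, since every reading would be shifted the same way.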