What would you think if I told you that, after taking a few measurements of my daughter's height and calculating the average, I reported her height as 139.7834 centimeters? Hopefully you'd pause and think that it really doesn't make much sense to report someone's height to within a micrometer; that's less than the width of a hair. If you did come to that conclusion, you'd be demonstrating that you understand there's no point in reporting too many digits of precision in a quantity. You should only use as many digits as are relevant to the level of uncertainty in your quantity, and this concept is referred to as the number of significant figures.

The number of significant figures in a quantity is simply the number of digits that are meaningful. It's related to the precision of your measurement, and it's a useful bit of scientific jargon that helps when estimating quantities and errors. It's best explained with some examples, so here are some numbers I've written down.

First of all, any zeros at the start of a number are not counted, because they don't contribute to the magnitude of the quantity. So 0.007 has just one significant figure. After the first non-zero digit, all subsequent digits are significant, so 0.107 has three significant figures and 2300.1 has five. All zeros to the right of the decimal point are significant: for example, 6.300 has four significant figures, because you're specifying that the value is 6.300 and not 6.301.

Now, there's always a bit of ambiguity when you have large integer numbers. If I write down 57 000, it's not clear whether I mean two significant figures or a full five. This relates to quantifying your uncertainty: is this plus or minus 100, say, or plus or minus a half? If it's plus or minus 100, then only two or three of those digits are really meaningful; but if it's plus or minus 0.5, then I do have five significant figures.

One way to get around this ambiguity is to specify the uncertainty, as I just did. Another way is to use what's called scientific notation, or standard form. I can write 57 000 as 5.7 × 10⁴, and if I write it like that, I know there are only two significant figures. But if I write it as 5.7000 × 10⁴, then all the zeros I've specifically written down after the decimal point count as significant. So the first form is two sig figs again, and the second is five.
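As a minimal sketch of these ideas (not part of the original lecture, and with illustrative variable names), Python's format specifiers can round a value to a chosen number of significant figures and print it in scientific notation, which makes the significant-figure count unambiguous:

```python
# Illustrative example, assuming Python; names like height_cm are made up.

height_cm = 139.7834           # average of a few height measurements

# The 'g' presentation type keeps a given number of significant figures.
print(f"{height_cm:.3g} cm")   # -> '140 cm'   (3 significant figures)
print(f"{height_cm:.4g} cm")   # -> '139.8 cm' (4 significant figures)

value = 57_000

# Plain integer form is ambiguous: two or five significant figures?
print(f"{value}")              # -> '57000'

# Scientific notation resolves the ambiguity: the 'e' precision is the
# number of digits after the decimal point, so the sig-fig count is explicit.
print(f"{value:.1e}")          # -> '5.7e+04'    (2 significant figures)
print(f"{value:.4e}")          # -> '5.7000e+04' (5 significant figures)
```

The trailing zeros in '5.7000e+04' play exactly the role described above: writing them out is a claim that those digits are known.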