Up to this point, we've only used int to specify integers. The C language standard says that an int is at least 16 bits, which gives a range from negative 32,767 to positive 32,767. That's not a very big range, so C also lets you declare a variable to be a long int, which has at least 32 bits and a much larger range. For really large integers, you can declare a long long int, which is guaranteed to be at least 64 bits. There's also a short int data type, which is guaranteed to be a minimum of 16 bits. Those are the minimums the standard requires, but the actual sizes can vary depending on your operating system and hardware. For example, a Windows 10 computer and a Linux computer may use different numbers of bits for the same type.

For the assignments in this course, int will almost always be enough. But if you do need a long int or long long int constant, make sure you tell the compiler by adding an L or a double L after the number. You can use upper or lower case, but I'd recommend uppercase, because a lowercase L looks a lot like the digit one. For any integer declaration other than plain int, you can drop the word int: long x and long int x are equivalent.

When it comes to printing integers with printf, you have several choices. %i prints the number in base 10, and you can also use %d instead of %i. %x prints the number in base 16. This is used when you want to see the actual bits in a number, so it's useful for low-level debugging and also for things like color codes in HTML and web-based software. %o prints the number in base 8, which is used infrequently nowadays but was very useful back when C was first invented. For short, long, and long long, precede the type character with h, l, or a double l, and these must be in lower case: %hi, %li, and %lli. If there's a mismatch between your format specifier and the variable type, the compiler will give you a warning.
There's one additional specifier you can use when declaring a variable. Unsigned. When you declare an unsigned variable, C interprets the bits in memory to represent non-negative values only. This increases the maximum value the variable can hold. When assigning a constant to an unsigned variable, use the letter u in either upper or lower case to explicitly say you want it unsigned. Use the %u specifier with printf to have the result interpreted as unsigned. And that's a summary of the integer data types in C.