A micrometer resembles a C-clamp, but the part you would turn to tighten the clamp is more elaborate: it is made so that each revolution "tightens the clamp" by a specific, controlled distance. A mechanism then counts the number of turns you make as you tighten it, and many modern ones have an electronic digital display. A micrometer is used to measure the "thickness" of anything you can get a C-clamp around.
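To make the "counted turns times distance per turn" idea concrete, here is a minimal sketch in Python. It assumes a typical metric micrometer with a 0.5 mm spindle pitch and a thimble divided into 50 marks; those numbers are illustrative, not the spec of any particular instrument.

    # Rough sketch of how a mechanical micrometer reading is composed.
    # Assumed (illustrative) geometry: 0.5 mm of spindle travel per full
    # thimble turn, thimble divided into 50 marks (0.01 mm per mark).

    PITCH_MM = 0.5          # spindle travel per full revolution of the thimble
    THIMBLE_DIVISIONS = 50  # marks around the thimble

    def micrometer_reading(sleeve_half_millimeters: int, thimble_mark: int) -> float:
        """Combine the sleeve reading (count of exposed half-millimeter
        graduations) with the thimble mark to get a length in millimeters."""
        sleeve_mm = sleeve_half_millimeters * PITCH_MM
        thimble_mm = thimble_mark * (PITCH_MM / THIMBLE_DIVISIONS)
        return sleeve_mm + thimble_mm

    # Example: 12 half-millimeter graduations exposed on the sleeve (6.0 mm)
    # plus the thimble lined up on mark 37 (0.37 mm) reads 6.37 mm.
    print(micrometer_reading(12, 37))  # 6.37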
A thickness gauge is conceptually similar to a micrometer, but its range of movement is much smaller, so it uses a dial rather than counting turns. It is typically a much more accurate instrument, but it has a much more limited range. Generally speaking, the more range an instrument has, the less accurate it is.
Accuracy is a measure of how correctly an instrument measures whatever it measures against an absolute standard; in other words, how correct the measurement is. Accuracy is typically expressed as an average of multiple measurements.
Precision is a measure of how consistent the instrument is: how much variation there is in the measurements it gives when you measure the same, unchanging thing multiple times.
So an instrument can be very accurate but not very precise if, for example, ten measurements vary among themselves by twenty percent but average to the exact, correct value. An instrument can be very precise but not very accurate if the ten measurements are all exactly the same, but all incorrect. Both types of instrument are useful; you just have to know which type you have and use it accordingly. Of course, an instrument can be both very precise and very accurate, which is the best of both.
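Here is a small Python sketch of that distinction, using made-up numbers: the bias of the average captures accuracy, and the spread of the readings captures precision. The "true value" and both sets of readings are purely illustrative.

    # Illustrative only: accuracy = how far the average is from the true value,
    # precision = how much the individual readings disagree with each other.
    from statistics import mean, stdev

    true_value = 10.00  # mm, what an ideal instrument would report

    # Accurate but not precise: readings scatter widely but average out correctly.
    accurate_not_precise = [9.0, 11.2, 8.8, 10.9, 10.1]

    # Precise but not accurate: readings agree closely but are all offset.
    precise_not_accurate = [10.52, 10.51, 10.53, 10.52, 10.52]

    for name, readings in [("accurate, not precise", accurate_not_precise),
                           ("precise, not accurate", precise_not_accurate)]:
        bias = mean(readings) - true_value  # accuracy
        spread = stdev(readings)            # precision
        print(f"{name}: bias = {bias:+.3f} mm, spread = {spread:.3f} mm")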