"t" means "type" (or to some people, typedef
, which is the command used to create them). size_t
is the type used to specify memory sizes. time_t
on the other hand, is the type
used to specify time spans. They generally happen to refer to the same
underlying type (a 64-bit or 32-bit integer, depending on the platform),
but the label helps keep them straight conceptually so that the
implementation details can be hammered out by the compiler.
For example, time_t
used to be a 32-bit signed integer, meaning
that the clock would roll over in January 2038. But on most 64-bit
architectures, time_t
has been widened to a 64-bit
integer, which means that 64-bit systems won't have a "year-2038"
problem. Because code that deals with Unix timestamps uses the type name time_t
rather than int
to refer to these values, everything will "just work" when you simply recompile the code for your new architecture.