Why is the Objective-C Boolean data type defined as a signed char?

Something that has piqued my interest is Objective-C's BOOL type definition.
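For reference, the definition in <objc/objc.h> boils down to this (lightly paraphrased from the header):

    typedef signed char BOOL;   // a one-byte type, not an int
    #define YES ((BOOL)1)
    #define NO  ((BOOL)0)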

Why is it defined as a signed char, which can cause unexpected behaviour when a value wider than one byte is assigned to it (only the low byte is kept, so an otherwise non-zero value can truncate to zero), rather than as an int, the type C traditionally uses for boolean expressions? An int leaves much less margin for error: zero is false, anything non-zero is true.
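To make the hazard concrete, here is a minimal sketch; the value 256 is just an arbitrary example of an int whose low byte is zero:

    #import <Foundation/Foundation.h>

    int main(void) {
        BOOL flag = 256;       // only the low byte survives: 256 & 0xFF == 0, so flag is NO
        if (flag) {
            NSLog(@"true");
        } else {
            NSLog(@"false");   // this branch runs, even though 256 is "true" by C's rules
        }
        return 0;
    }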

The only reason I can think of is that the Objective-C designers were micro-optimising storage, since a char uses less memory than an int. Can someone please enlighten me?

Answers


Remember that Objective-C was created back in the 1980s, when saving bytes really mattered.

As mentioned in a comment, as long as you stick with the values YES and NO, everything will be fine.
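If you do need to store the result of a wider expression, one common defensive idiom (my suggestion, not something from the original answer) is to normalise it first:

    int bytesRead = 256;               // hypothetical result of some call
    BOOL hasData  = (bytesRead != 0);  // a comparison yields exactly YES or NO
    BOOL hasData2 = !!bytesRead;       // double negation does the same thing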

