Day and night could be modeled as values of a TimeOfDay type. Think of it like a Boolean, where it's either one or the other. Or something like that. :)
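A minimal sketch of that idea in Java, assuming a made-up `TimeOfDay` type (the name and the `isDaytime` method are hypothetical, just for illustration): rather than a raw boolean, an enum names the states explicitly.

```java
// Hypothetical sketch: model time-of-day as named states instead of a bare boolean.
enum TimeOfDay {
    MORNING, AFTERNOON, EVENING, NIGHT;

    // True for the "day" half of the cycle.
    boolean isDaytime() {
        return this == MORNING || this == AFTERNOON;
    }
}
```

The enum buys you more than a boolean would: you can add states later (DAWN, DUSK) without changing every call site.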
But you are right: not everything is an object. Anything physical can be represented as an object, and abstract ideas usually can be too, such as a calendar date being composed of a month, a year, etc. Some things can be modeled as objects but shouldn't be, such as certain parsers, where functional programming would be a little smarter.
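The calendar-date point is just composition: build a bigger value out of smaller named parts. A sketch, with all names (`CalendarDate`, `Month`) made up for illustration:

```java
// Hypothetical composition example: a date is built from named parts,
// not a tangle of bare ints.
record CalendarDate(int year, Month month, int day) {
    enum Month {
        JANUARY, FEBRUARY, MARCH, APRIL, MAY, JUNE,
        JULY, AUGUST, SEPTEMBER, OCTOBER, NOVEMBER, DECEMBER
    }
}
```

A Java record (Java 16+) keeps this short: the accessors and equality come for free.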
Figuring out how things relate isn't hard. It's preparing for the future that's hard. Capturing the simple usage patterns of a current system is quite easy when you know what's going on. It's just tedious. Given the set of functions in a program, it can be divided into many different, overlapping subsets. But solving a real-world problem doesn't require knowledge of all the subsets, just the best ones. The problem is when someone throws a wrench into the mix.
Everything can be done with OO in one form or another. Not always great, not always terrible, but it can be done. Should it be? No, not always.
But the definition of OO is clear. It's simply a description of what something can do and what attributes it has. How those "dos" and "attributes" are defined, passed around, and accessed is both a sugar and a medicine: it makes things easier and enforces "good" patterns of usage. Java doesn't do multiple inheritance, for its own "good reasons". Perl and C++ do. Perl and PHP (until recently) had no access modifiers.
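For the multiple-inheritance point: Java won't let a class extend two classes, but it will let one implement any number of interfaces, which is its workaround. A toy sketch, with all type and method names invented for illustration:

```java
// Java disallows "class Duck extends Swimmer, Flyer", but interfaces
// (with default methods, Java 8+) let one class mix in both behaviors.
interface Swimmer {
    default String swim() { return "swimming"; }
}

interface Flyer {
    default String fly() { return "flying"; }
}

// One class, two inherited behaviors, no diamond-problem ambiguity
// unless two defaults collide (in which case Java forces an override).
class Duck implements Swimmer, Flyer { }
```

That forced-override rule is Java's "good reason": the compiler makes conflicts explicit instead of silently picking a parent, as C++ can.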
OO appeals to organized people. That said, not everyone it appeals to is organized, and some organized people think OO is not that great.
People who write algorithms, like solving sorting problems and whatnot, don't always flock to OOP, since they are describing a process in its simplest form. Even then, the description usually isn't in a concrete language but in functional pseudocode. :)
Play that funky music white boy..