Information, aside from its purely numerical usage in communication theory and its abstruse equivalence to entropy in physics, is usually considered something that is unapproachable in normal intelligible discourse. I submit that such is not the case: a definition can be formulated that reduces to the communication-theory and physics usages in their contexts but which is otherwise quite useful for asserting logical predicates in more mundane discussions.
Information: the correlation between disparate physical entities by which the properties of the carrier entity are asserted to predict some properties of the object entity. The "information" is said to reside with the carrier, being that about it which enables the predictions of the referent.
Note that there are three entities involved in the instantiation of information: the object of such information, the substrate carrier of the information, and the interpreter and presumptive predictor of the properties of the object upon interaction with the carrier. Information does not exist absent the interpretive agent; there are only the separate physicalities of the object and the carrier. Thus DNA is not information; it is chemistry. The predictive correlation between the DNA and the carrying organism is the information. On the other hand, we are not wrong to refer to the "information content" of the DNA as the mechanism of inheritance. There is an observable correspondence in the physical properties of parent and offspring, carried, conveyed, and instantiated by means of the DNA in the seed. From our descriptive viewpoint, the information of the DNA is the accounting of the parent/child physical correlations. But it is the abstraction of the physicality to description that renders the seed DNA properly describable as information. In the material world, the DNA is just chemistry.
Likewise in communication theory, the correlation is the amount of distinguishable separate prediction of the object that can be derived from an encoded carrier substrate. And in physics, it is the amount of substrate needed to predictively describe the behavior of the object said to contain the information. Shannon's definition is not a definition of information but a definition of the measure of information, relating the distinctiveness of the correlate to the entropy.
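To make the measure concrete: in the standard notation, for a source emitting symbols x with probabilities p(x), Shannon's measure of the information carried is the entropy

```latex
H(X) = -\sum_{x} p(x)\,\log_2 p(x)
```

The more distinguishable (less predictable) the carrier's states, the higher the entropy, and the more separate predictions of the object the carrier can support.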
Often philosophers will assert that something isn't knowledge unless it is "true". But that doesn't work for the way information is used in the real world. In the art of computing, for instance, it is quite common to define data as containing "weak references": physical structure in the program for addresses or values which are not known, or do not even exist, at the time of definition. But such structures do "predict" the behavior of the program whether the values are filled in or not, whether they are filled in with a correct correlate of the object of the program or not, or whether filled with an incorrect value. In other words, the *place* for the information is itself information whether it then and there makes a correct prediction or not. Likewise, as terribly fallible human beings, we will often act upon what we *believe* to be correct information even when it later turns out erroneous. We still make the prediction; it simply turns out to be wrong. But the information is the means of that prediction. Ergo, a good definition of information must include provision for it to be wrong even while it remains a *correct* prediction of the predictive behavior of the interpreting agent. In other words, "information" is what predicts the contingent disposition of the interpretive agent with regard to the asserted object of information.
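The "place for information" point can be sketched with Python's weakref module (the Record class here is hypothetical and purely illustrative). A weak reference is a slot whose referent may cease to exist, yet the slot itself continues to predict the program's behavior: correctly while the referent lives, and still determinately (it yields None) after the referent is gone.

```python
import gc
import weakref

class Record:
    # Hypothetical record type, named only for illustration.
    def __init__(self, value):
        self.value = value

target = Record(42)
slot = weakref.ref(target)  # the "place" for the information

# While the referent exists, dereferencing the slot predicts it correctly.
assert slot() is target

# Destroy the referent: the slot (the *place*) persists, and it still
# predicts program behavior -- dereferencing now yields None.
del target
gc.collect()
assert slot() is None
```

The slot never stops being information in the sense defined above: whether it resolves to the intended object or to nothing, it fixes what the interpreting program will do next.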
Upon rereading the above, it occurs to me that while information, as described, is valid, its core meaning, its "essence" in terms of usage, is as the foundational material for a prediction, whether or not such is valid. In somewhat amusing terms of the American legal system, it is "exculpatory" material: stuff which nominally justifies and vets an action with then-known material consequences, even if such turn out erroneous, even with disastrous consequences. The "information", a.k.a. "what he knew and when he knew it", as derived as an interpretation of a substrate from which the prediction was made (a "message"), is the information. The prediction is the assertion of the correlation with some other aspect of reality outside the substrate.
What this means is that, in terms of information process structure, we now have a materially instantiable and meaningful definition of "why".