I have thought about Information for several years now. I entered Theoretical Physics in 1972 with the purpose of contributing to a Theory of Information. After studying Communications and Electronics Engineering, I felt that the twentieth century was the time to establish many elements of such a theory. One can study Statistical Mechanics as the application of Information Theory to the motion of more than two interacting particles. Besides, the working world of data manipulation has started, and is now a full-fledged money-making industry, as Chemical and Electrical Engineering were before.
Jacob Bekenstein started the study of Statistical Mechanics in General Relativity with the concept of black-hole Entropy, proportional to the area of the black hole's horizon. John A. Wheeler established the importance of It from Bit, and Stephen Hawking established the temperature of these objects. Now we know that, very likely, every galaxy has a black hole at its center, and Caleb Scharf has published a great book on his ideas of the role of black holes in making the Universe work, "Gravity's Engines".
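The entropy and temperature mentioned above are the standard Bekenstein–Hawking relations; as a sketch, with $A$ the horizon area, $M$ the black hole mass, and the usual constants $k_B$, $c$, $G$, $\hbar$:

```latex
S_{\mathrm{BH}} = \frac{k_B\, c^3 A}{4\, G \hbar},
\qquad
T_{\mathrm{H}} = \frac{\hbar\, c^3}{8 \pi\, G M k_B}
```

The first formula is Bekenstein's proportionality between entropy and area, with the coefficient fixed by Hawking's calculation; the second is the temperature of the black hole, which decreases as the mass grows.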
I do not claim here any fundamental breakthrough, but obviously the twenty-first century is the time to find out what Information is.
An electron follows instructions, as coded in Dirac's Equation, maybe through a one-dimensional cellular automaton with a simple rule that acts as a Universal Computer. Rule 110 has that property, as proved by Matthew Cook and studied in detail by Harold V. McIntosh.
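Rule 110 itself is simple to state: each cell looks at its left neighbor, itself, and its right neighbor, and the 8-bit binary expansion of the number 110 dictates the new value for each of the eight possible neighborhoods. A minimal sketch in Python (the function name and the fixed-zero boundary are my own choices, not part of any standard library):

```python
def rule110_step(cells):
    """Apply one step of Rule 110 to a list of 0/1 cells (zero boundary)."""
    n = len(cells)
    out = []
    for i in range(n):
        left = cells[i - 1] if i > 0 else 0
        center = cells[i]
        right = cells[i + 1] if i < n - 1 else 0
        # Encode the neighborhood as a 3-bit index, then read that bit of 110.
        idx = (left << 2) | (center << 1) | right
        out.append((110 >> idx) & 1)
    return out

# Evolve a single seed cell for a few steps and print the rows.
row = [0] * 15 + [1]
for _ in range(8):
    print("".join(".#"[c] for c in row))
    row = rule110_step(row)
```

Starting from a single live cell, the printed rows already show the left-growing triangular patterns whose interacting "gliders" underlie the universality proof.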
I do not know how the electron reads this automaton.