Science is the foundation of all human knowledge, and yet scientists are rarely at the center of business and financial decision making. I have been racking my brain over this issue, and I think that if our society is going to keep succeeding, we should make science a central principle that informs everything we do. So why don't we live in a science-centric society?