Why is business, as a discipline, treated (or why does it treat itself) as if it were on a pedestal? Through its culture and its pricing, business education does much to set itself, and what it teaches, apart. It promises to demystify a field that is actually quite simple at its core, and in which there are perhaps no definitive answers. Around the world, millions of individuals are successful in business, yet the vast majority have never set foot in a business school. To survive in the world as we have organised it, we need to work and we need to make money. Shouldn't it therefore be a right to be given the tools and knowledge that help us do that? Shouldn't a basic business degree be provided free of charge to all citizens, at any age?
