A work ethic is the set of rules, generally coming from a moral code, by which we judge the quality of our work, our attention to detail, and whether we care. The reason people mention it is that so many younger employees seem not to care about their jobs at all.
Additionally, we've all had an experience something like this: you go to buy a house, spending a lot of money. To you, $180,000 is a lot of money. To the real estate broker, it seems to be a sum too trivial to be bothered with.
Is that a matter of work ethic? If so, why is the problem so prevalent, and why will so many younger people tell you they feel very responsible about being at their jobs?
What if it's not a moral issue, but instead a "perception" problem? Suppose the whole *concept* of work is changing?
I got to thinking: if people no longer understand that an employER has the option, the choice, to provide work, what do they think work actually is? And so I came up with an idea that many people today think work is just a) something they ought to be doing during the day, b) an extension of "having to go to school," and c) a socially acceptable explanation for why they're not hanging out on a street corner.
Do you think people view *work* as an exchange of effort for money, where work isn't necessarily supposed to be fun? What with all the laws about "right to work" and the lawsuits regarding unfair firing, I wonder.
It would explain why employees hang around talking and don't seem to notice a growing line of customers waiting to pay for things. It would also explain why nobody seems to know where anything is in a store, or why someone told to stock a shelf can't seem to notice that the phone is ringing off the hook.
What if "being hired" is considered a fact of nature, nowadays, like thunderstorms, hot weather, a cold snap, or the changing seasons? It "just happens" (somehow), and there`s no connection between what a person does and why they have a job. Maybe?