I do think that universities have brainwashed society into making too many people go to college, spend lots of money, get into lots of debt, hardly learn anything, and take classes for degrees that no one cares about. To get a good job, related work experience in your desired field matters just as much as the required degree. Many people don't hear about this until their last year of college, as I did.

I felt like I hardly learned anything in four years of college; I learned far more outside of class and after I graduated. In college I was too busy cramming and memorizing for tests. None of us cared about actually understanding anything, because you didn't earn points for that, and when you did need to understand something, you weren't given any extra time, only enough to memorize.

When I started college, no one mentioned to me how critical experience is for getting a good job. There are people with work experience and no degree making just as much money as many college graduates. Later on I realized that 99% of the jobs that require a degree also require several years of experience. You would have to work in a job related to your degree all through college just to meet the "minimum requirements" of employers when you graduate. And if you want to start your own business, school matters even less and experience matters even more.

Cutting back on education is now on the table, since government debt levels are so high.
As Mark Twain once remarked: "I never let schooling interfere with my education."