It’s hard to deny that technology is a significant influence in today’s world. Dominating the entertainment industry, the finance sphere and even the job market, technology has made a profound impact on how nearly every person works and lives.
Despite these advancements, education at nearly every level, from K-12 schools to higher institutions, remains far behind the curve in teaching skills that are both relevant and readily applicable.
For example, computer science degree holders are increasingly finding themselves behind in the job market; big tech companies no longer want to hire anyone with just a sampler platter of coding experience, preferring candidates with people skills and specializations in subfields like data science or cybersecurity.
Students in K-12 schools are also less likely to be exposed to substantial computer science experiences, with just 53% of public schools offering CS or technology courses, and students in marginalized groups being even less likely to take on these courses if they are offered at all.
These social divides must be bridged early, with more resources allotted to computer science programs and more teachers qualified to lead introductory computer science courses.
Ensuring that students understand the technology behind the phones they use, the cars they drive and the websites they visit, while also supporting their personal development, is critical to a well-rounded future.
Why Computer Science?
In the late 1990s, no tech companies would’ve been considered “big.”
The internet was just a fraction of what it would soon become, and in most households it was a place: one office chair next to a computer the size of an air conditioning unit, sprinkled with dust and DVDs.
The “dot-com bubble” was causing the stocks of Internet-based companies to rise as more investors saw massive profit to be made with the new technology.
At that time, those with programming experience were few and far between, and companies were vying for capable programmers to create and maintain groundbreaking advancements.
The salary for these programmers was also more than competitive, with some fresh college graduates earning 10% more than the national average starting salary.
The market was rich, jobs were plentiful and computer nerds were flocking to college Computer Science programs to turn their passions into occupations.
Despite this success, the bubble burst in 2000.
Startup culture didn’t seem as alluring or lucrative as it once was, and college students quickly noticed the decline. Rates of enrollment in Computer Science degree programs fell significantly, and recovery seemed unlikely.
Investors and tech giants became despondent, with a popular Silicon Valley bumper sticker reading: “Please God, just one more bubble.”
However, against all odds, the rise of the internet breathed new life into the tech industry, and Silicon Valley’s prayers were answered; Comp Sci grads had jobs once again.
After the 2008-09 recession, Computer Science became the future of STEM employment.
Today, despite recent layoffs, those with CS degrees remain among the most employable.
Even though Computer Science has had a more than rocky past, and will most likely have an equally tumultuous future, those with tech experience remain at a significant advantage in both their daily and professional lives.
The Reboot
Since 2013, Code.org has made it its mission to educate young people about cybersecurity, programming and other Computer Science topics.
Its comprehensive curricula range in difficulty from simple block-based coding activities to entire AP courses.
Other organizations also offer such resources, yet on a much smaller scale.
Codecademy, the mobile app Mimo and online bootcamps also offer the opportunity for anyone, not just students, to gain certifications in Computer Science subjects.
However, while these specialized services may offer programming knowledge at a lower cost than a college degree, colleges are uniquely positioned to offer something that bootcamps and online courses cannot: teaching programmers how to be good people.
Providing education in ethical, social or language studies is not only beneficial to programmers, but essential in ensuring that future technological innovations are beneficial to society as a whole.
There is a common stereotype that those studying computer science are white, nerdy men with abysmal social skills.
This does not have to be true.
Although white men make up roughly 50% of programmers, universities have sought to recruit from a wide range of backgrounds, aiming to encourage more women and people of color to join computer science programs.
Employers have also increasingly reported that interpersonal, not just technical, expertise is considered a key factor that could make or break a new hire. General education requirements in universities, as well as professional conferences, have helped programmers develop professional skills beyond coding.
Meanwhile, advancements in artificial intelligence, specifically generative AI, have made it clear that those without experience in the arts or other humanities have no business automating the creative process.
Generative AI, without the input of human artists, has become a breeding ground for destructive propaganda, uncanny imagery and copyright infractions by the million.
AI is also well documented to reproduce racial bias, with multiple AI services scaling back operations in order to manage this unintended harm.
In order to avoid a dystopian, auto-generated, minimally diverse future, computer science should be accessible to all and integrated with both the humanities and social sciences.
This time, programmers need to burst their own bubble.