It’s an interesting argument in favor of computational literacy, that is, computing education for everyone. It’s also a fairly accurate description of what happens in first-year undergraduate classes.

Imagine, if you will, a world where Americans don’t teach their children math in elementary school. Imagine that children no longer learn addition in first grade, subtraction in second, or multiplication and division in third and fourth. Imagine instead that children make it all the way through high school without any formal presentation of mathematical concepts. Now imagine that a student is observant enough to realize that adults who have a firm grasp of mathematics have much better problem-solving skills and financial opportunities than adults who don’t. If that student is curious enough to enroll in an undergraduate math class, imagine how frustrating it would be to have the whole of arithmetic, algebra, and statistics thrown at you in your very first term. Wouldn’t it feel overwhelming? Wouldn’t you be discouraged… especially if you noticed that several people in the class already seemed to understand the material fluently? Wouldn’t it be difficult to perceive the subject as one in which you have talent?
This hypothetical may seem ridiculous, but the truth is that a similar situation is playing out in America today with the subject of computer science. For many students, computer science isn’t introduced at the K-12 level, so their first exposure comes in an undergraduate classroom, where they’re forced to absorb all of the basic building blocks of computational thinking at lightning speed before they can begin to fathom programming, design, or engineering. To make matters worse, a handful of students (often boys) will already have skills in these areas, making the newcomers feel deficient, awkward, and behind.