When I was in high school (late 90s), they had kind of a weird curriculum for teaching programming. We had AP classes for Computer Science A/AB, which you had to be at least a sophomore to take. There was also a class called 'Computer Programming' that anyone could take, which I did as a freshman, although the students in the class were actually pretty evenly distributed between all four grades.
Basically we were taught QBASIC and given assignments. The first few weeks seemed pretty mundane as the OP describes, getting into conditionals and loops, etc, but after we learned those, my teacher told us to make something like 'Choose Your Own Adventure' text-based game. I remember loving that assignment and even compiling it as an EXE and sending it to my other friends. This pretty much followed through the whole year, learning some new programming concept (arrays, functions, etc) and then making some sort of game as an assignment. We had the usual "write a program to display all the factors of a number" assignments too, but I just remember loving the game projects. I didn't know anything about Big-O or AVL trees or whatever, just that I could create cool stuff on a computer.
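That "Choose Your Own Adventure" assignment needs surprisingly little machinery. A minimal sketch of the idea in Python (the original assignment was QBASIC, and the scene names here are made up for illustration):

```python
# Minimal "Choose Your Own Adventure" engine: each scene is a prompt
# plus a mapping from the player's typed choice to the next scene.
scenes = {
    "cave": ("You stand at a cave mouth. Go (in) or (home)?",
             {"in": "dragon", "home": "end"}),
    "dragon": ("A dragon blocks the path! (fight) or (run)?",
               {"fight": "end", "run": "cave"}),
    "end": ("The adventure is over.", {}),
}

def next_scene(current, choice):
    """Return the next scene name; unknown choices leave you where you are."""
    _prompt, choices = scenes[current]
    return choices.get(choice, current)

def play():
    """Interactive loop -- call play() to run the game at a prompt."""
    scene = "cave"
    while scene != "end":
        prompt, _ = scenes[scene]
        scene = next_scene(scene, input(prompt + " > ").strip().lower())
    print(scenes["end"][0])
```

Each new concept the class introduced (arrays, functions, etc.) slots naturally into a game like this, which is presumably why the assignment worked so well as a recurring theme.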
In 10th grade I took Computer Science A, and about half the class hadn't taken Computer Programming already. The material was a lot drier, obviously, and I remember a lot of those students switched out. I witnessed the same thing my freshman year of college -- no 'Choose Your Own Adventure' games as assignments, just grueling midterms on polymorphism and inheritance.
So this post really resonated with me, because my thoughts have basically echoed this for a while -- why isn't software engineering taught as a discipline that can let you implement and create, since that's exactly what it is?
> So this post really resonated with me, because my thoughts have basically echoed this for a while -- why isn't software engineering taught as a discipline that can let you implement and create, since that's exactly what it is?
Because most colleges teach "computer science" not "software engineering". Computer science is primarily about abstract concepts, and not so much about necessarily doing anything worthwhile with them. (This is similar to saying, "Why doesn't physics teach me to build X?" Because that's mechanical engineering!) Admittedly, there's hopefully some occasion to apply the concepts, rather than just memorize them, but the focus should be on the science, not the engineering.
In my curriculum, software engineering was one of my possible senior classes (including OSes, compilers, etc., almost all focused on application).
This is a great question, and the answer is that universities serve their own institutional needs, not those of business, or of society, or even of students. I remember the shock when I was in college trying to learn Japanese and I asked why, after the first couple of years of language, all the Japanese classes switched to literature or obscure linguistics. I told them that I wanted classes in advanced language that would teach me how to live and work at a professional level in Japan: present in a business meeting, read a contract, etc. I claimed that going from basic language to advanced, modern, professional Japanese was a lot higher priority than a switch to archaic, literary Japanese or obscure linguistic analysis. They reacted with outrage, claiming that if they were to do what I was suggesting, they "wouldn't have any academic credibility at all!"
Ah, that's what it's about. Not my needs but theirs: not real-world skill but academic credibility.
Of course it would be what I needed if, and only if, I intended to become an academic in the field myself, but that is what universities are---farm leagues for finding and developing future professors.
Since professors are only a small portion of the work force, most people weren't even supposed to attend universities. Most people were supposed to enroll in other educational programs (tech schools, art/cooking/etc. academies, apprenticeships, and so on) if they intended to work for a living.
But these days, no employer wants you without a university degree, so universities can go on serving themselves and you'll pay for it anyway, because the end for non-professors is a general purpose "degree" credential that employers use as a proxy for generic employment qualification.
I'm not saying that computer science isn't important, just that it is a higher priority for academia than for business, and academia serves academia.
True. And CS is still one of the more practical degrees. But in the end faculty are weighted on their research, and producing Phds who cite their research, so post-undergrad employment isn't a big concern.
What you're looking for is a 2-year tech degree in programming. Universities aren't about churning out factory workers; they're about education. That's why, when you go to a good college or university, you'll be studying history, languages, physics, etc alongside your major. It's not just about being a good cog, it's about being a well-educated person.
I'm not convinced of this at all (this is the first time "citation please" popped into my mind, and I hate that response :). Of course, there are some problems that have a strong need for a theoretical background, but those positions tend to require a graduate degree.
On the other hand, I've seen self-taught engineers who have far more passion for developing software, which seems (in my experience) to correlate more with engineer productivity.
I have a CS degree from an institution that has an extremely strong theoretical computer science department. I value my degree and the knowledge I gained to get it, but, beyond getting me in front of hiring managers early in my career, I don't think it particularly made me a better developer. It gave me a little understanding of algorithms and data structures (much of the need for which is obviated by today's VMs and libraries) and some understanding of what is really going on under the hood (again, not as helpful in the VM world).
For me, a degree is a nice-to-have, but you won't get hired if you can't convince me you are a good learner.
I'm not talking about self-taught vs. college educated. Computer science can be self-taught. You can buy all the books at a regular bookstore and you don't need special lab equipment or anything. If you're a self-taught developer with a passion for the profession and you're a good learner, you'll probably end up learning some CS along the way.
I still don't see a correlation (we certainly agree on whether a particular slip of paper is useful), so let's turn it around: what is it about computer science (in the realm of Big-O, lambda calculus, Turing machines, discrete mathematics, and such [obviously I've left a bunch out and intentionally started on the more theoretical. Feel free to ground me in somewhat more useful CS.]) that you think makes for better developers?
I totally agree that an attitude of self-teaching makes better developers. There is a lot in the application of CS, but I would call that Software Engineering.
Well there's breadth of expertise. Part of CS is systems, which includes networking, operating systems, compilers, and so forth. If you're making any of those things or leaning heavily on them, you want to hire people who are grounded in those fields. Same goes for things like AI, machine learning, data mining, and so forth.
More generally, having seen more kinds of software kind of broadens one's way of approaching programming problems, so even if you don't directly use anything you see when you study operating systems or compilers or AI, you can grab vague approaches and ideas from those fields.
If you're interested in writing performant software, you'll care about big-O, algorithms, and data structures. Understanding algorithms and data structures enables you to intelligently choose and apply them even if you don't have to develop them from scratch.
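The point about choosing data structures intelligently shows up even in toy measurements. A hedged sketch (exact numbers will vary by machine): membership testing in a Python list scans every element, while a set hashes straight to the answer.

```python
import timeit

items = list(range(100_000))
as_set = set(items)
needle = 99_999  # worst case for the list: the element is at the very end

# Same question asked of two data structures: O(n) scan vs. O(1) hash lookup.
list_time = timeit.timeit(lambda: needle in items, number=200)
set_time = timeit.timeit(lambda: needle in as_set, number=200)

print(f"list: {list_time:.4f}s   set: {set_time:.4f}s")
```

No algorithm had to be implemented from scratch here; knowing which structure to reach for did all the work.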
Because most colleges teach "computer science" not "software engineering". Computer science is primarily about abstract concepts, and not so much about necessarily doing anything worthwhile with them
A computer science curriculum (even a theoretical one) need not consist only of abstract concepts. And it need not include software if the purpose is to have a theoretical curriculum. I've always been a fan of projects whose primary purpose is to help you learn the concepts. My favorite courses have been of the type where the problem sets are: "read this paper; understand it; implement the algorithms; write a report with results and discussion."
why isn't software engineering taught as a discipline that can let you implement and create?
It is. But not in school, or at least not in most schools. Many working web programmers appear to have gotten their start by picking up PHP, HTML, CSS, and snippets of jQuery via online tutorials and a lot of tinkering.
But this process has little or nothing to do with either CS or OO. My impression is that when teachers say "introductory programming" they usually mean either "introductory computer science" or "introduction to OO in Java". I'm a much bigger fan of the former than the latter, but Java is an overdesigned and slow-to-learn path to actually implementing anything fun, and CS is not really about the pragmatics of implementation, just as mathematics is not really about accountancy or computer graphics.
This pretty much mirrors my experience in college. From 98-00 I took Computer Science (and did well) but eventually switched to MIS. Why? Because I spent two years making terminal-based C++ apps and became so frustrated at not being able to make real-world programs that I just gave up. It was so frustrating to be watching the first web bubble develop and yet be in class learning and writing something that seemed so completely different.
I don't know if it was my fault or their fault for not being able to bridge the gap between theory/learning and practical skills. Probably some of both. If my courses had somehow seemed more relevant or at least had a few web-based projects where I could see the real-world application of what I was learning, my career path would probably be very different today.
Same here. I started out as a CS major but got my degree in CIS b/c the courses and material were much more real-world, and fun. I was doing practical programming in my first course, learning about web apps, creating databases, and I even learned about hardware and how to build my first computer (not in a low-level engineering sense, but piecing them together like modern-day enthusiasts do). For someone coming from a music background with little to no heavy experience in computers prior to that time in my life, CIS resonated with me much more at the time.
However, I always found myself missing the math aspects of CS, and to this day I regret not having the lower level foundation that CS offers over CIS/MIS. In hindsight, I would've stuck w/CS.
Really interesting. My first high school programming class was exactly the same and also during the late 90s. We learned QBasic and created a choose-your-own-adventure game. (My class was actually taught by a math teacher who barely knew QBasic.) I also took the advanced classes later, which were VB and then C++, and did notice a significant drop-off. But I still enjoyed it.
Did we go to the same high school? :) I wonder how common that class structure was back then.
This is stuff educational researchers have known for over 20 years, and theorists have known for 100 years (John Dewey). It's called situated cognition: learning happens in context. You have to give students a reason to learn, not just have them learn something for its own sake. It explains why, for example, a Brazilian street kid may be a whiz at math, and a 6-year-old may have hundreds of Pokémon memorized. See, for example, the work by James Paul Gee or Jean Lave. John Dewey said 100 years ago that we shouldn't educate just to prepare kids for a future they know and care little about; we should educate them for today, teaching them things that are useful and interesting to them today.
An example of my own - at 9 my father attempted teaching me programming with basic - I thought it was pretty boring all that work just to draw a US flag on the screen, especially compared to the videogames I was playing. I was similarly bored with a basic programming class in high school and a survey of programming languages (lisp, C++) class in college (worst class I ever took in my life, actually).
It wasn't until early college when I started making CGI web applications, games, and educational software that I learned real programming and saw the value of it (along with the value of similarly boring calculus/linear algebra stuff).
I disagree with the author of this article's put-down of GameMaker and similar tools. Python was not designed for beginners, and neither was JavaScript, of course. Right now Scratch may be the best tool for teaching kids how to program, although it is very limited and not as great for creating games as other tools are.
I wish I could vote you to the top. It's painful to see people "rediscovering" something so obvious. By not understanding this people are wasting so much time.
I've been programming for 30+ years. It was the enthusiasm of putting sprites on a screen and moving them around in 6502 assembler that kept me awake into the wee hours, poring over books and magazines. The need for loops, branching, subroutines, etc. all came naturally. As programs got bigger, the need for modules arose.
Once you have the "need/desire" to learn a subject, the rest comes so much easier.
Totally agreed, and it was how I got my foot in the door with web development.
The first tutorial wasn't "Getting started: adding and subtracting numbers in a REPL" or some basic Hello World nonsense. It was "How to build a blog with PHP and MySQL," and I came across it when I found out that I didn't have to hand-code all my pages, even in Dreamweaver (at the time).
Once again I wanted to do something more, and came across the concept of frameworks (this was just when RoR was born), and then the famous "make a blog in 20 minutes" trend of tutorials that all frameworks eventually copied. I did CakePHP's, and I learnt about classes and MVC and separating concerns.
My boss summed it up perfectly: "you need to be given something real to be able to learn from it and enjoy it."
He was right, because I'd never have learned how to create an iOS app (and learned Obj-C) if he didn't give me a fortnight to program a design.
This post very eloquently expresses what I've believed for a long time as well.
It really hurts me to see newbie programmers in first year classes going through Java, writing getter and setter methods for a monster Object Oriented cash register or something. The code is spread across 40 different functions in 10 files, each function containing at most 5 or so lines of code. I'm sure it is a nice starter code for an amazing cash register application that can potentially scale to thousands of lines of code correctly, but it is not the way to teach programming. What a disaster.
I've made my own set of tutorials in the past on programming (see the set at http://karpathy.ca/phyces/tutorial4.php, though this set is for intermediate students who are already familiar with coding). I've always felt that coding should be taught in context of doing something concrete, not just by itself. Snippets of code without the whole don't mean anything. I also take a somewhat extreme view in that I think that game programming specifically is the best way to teach, because making games is engaging and students learn all the programming only as a by-product. Once they are comfortable with the basics, they should be introduced to progressively larger code bases, and at some point it becomes obvious why Object Oriented paradigms, for example, are a good idea.
As a last point, I think Udacity recognizes this as well and shares a similar philosophy. Notice their first two classes: "How to build a Search Engine" and "How to build a Self-driving car", not "101: Computer Programming", with Chapters 1. Variables, 2. Control flow, etc.
I tutored a friend who was in one of those OOP, Java, Intro to CS classes. It was her second class of the series, and she was so overwhelmed by all this stuff that was made for professional software engineers that she just dropped the class. It wasn't mainly the course's fault that she left -- she also forgot how to conjure up loops and how to use them -- but knowing the professor she took last quarter (he likes to weed out people by making the class difficult), I can't really blame her for not having the enthusiasm to retain such information. I was in her shoes before, so I was very familiar with the "blank page" syndrome she was going through.
This "sink or swim" teaching mentality has to go. It seems, from this thread and the article, that we are getting a better idea of how people like to learn programming. Let's actually put effort into teaching this way.
I agree. 'Sink or swim' has been the technique of preference in academia for many years. I would think it is the same for any discipline (electronics or chemical engineering). Maybe the fluid abstractions used in learning software make it much harder to train people?
I completely agree. When my friend and I were teaching ourselves how to develop software, we would write simple games. We went through a process of learning the basics of C, Python and Java by writing Scratchcard style games. It was simple enough conceptually, and it covered all the bases you needed.
The summer before I started university, I wrote a text based version of Minesweeper in Java. It helped build up my knowledge and confidence in using Java to the point that I skipped all my programming lectures in first year, and breezed the exams. That did annoy a few people, who didn't see the work I had put in, and just assumed I had it easy.
I have about 200k+ unique visitors a day reading my books, and over a million since I started tracking in May last year. I have comments on nearly every exercise on my books, which means people are actually going through them. I have 2200 people taking my udemy course which is the most popular paid class on the site (last time I checked). My books are used in workshops all over the world, have readers from all over the world, and have taught people from nearly every age group that can read English.
And my books do exactly what this article says you shouldn't do.
I'd say if nobody wants to learn basic programming concepts from you, then it's not the concepts, it's you.
Zed, you're awesome. With that said -- it's who you're teaching that lets you say this.
Right now there are people who know they want to learn programming coming hell or high water, and you do a damn fine job teaching them.
There are also a whole lot of people who might want to write apps, but do not want to go through the pain of learning programming.
The audience of Scratch, from MIT, is the narrow end of this giant wedge of people. They have nowhere to go from there because nobody else does it that way.
I miss the Apple II. It actually had a few on-ramps for people like that. Those on-ramps are pretty much dead now, though I have no idea why.
The guy writing this article isn't aiming at your set of programmers. He's aiming at all the people who aren't programming but would love to be able to build apps of one kind or another. There are a lot of them, and even a lot who could, and who we do a terrible job of teaching.
And no, I'm saying it's a huge myth that nobody wants to learn the "hard stuff". They do, you just have to teach it right. After that you just have people who aren't interested in programming at all and no amount of graphics, games, or cute cartoons is going to make them want to. You can't force or trick enlightenment on people.
But more importantly, I have evidence to back my beliefs. This article has absolutely no evidence.
Hi Zed, I'm Al, the author of the article. I'll admit the title is hyperbole. There are people who want to learn programming for the sake of programming, just like there are people who like learning math even though they don't see immediate practical applications.
I teach programming to a tiny class of 10-12 year olds on Saturday mornings, and have taught a couple of one-off classes at Stanford's Splash program to classes of a dozen students. I've found that having an end goal in mind of what they will create is a much better hook than just explaining what for loops are (or what the idiosyncrasies of the language are, which Python & Ruby are good about keeping to a minimum).
I don't think graphics and cute cartoons are needed to make programming engaging either. My book "Invent with Python" has games that use ascii text up until the last few chapters. There's a pretty low standard for games/programs that are cool enough to show off to other people, but nobody ever shows off code snippets.
I think the best thing about "Learn Python the Hard Way" over a book like O'Reilly's "Learning Python" is that your book cuts it down to the bare concepts: "Type this. This should appear. This is what is happening." That's great compared to most computer books where there are paragraphs upon paragraphs that could be whittled down to a single sentence or cut out entirely. "Learn Python the Hard Way" is barely over 200 pages. That's much more digestible than some 600 page tome.
I just think that many classes where the students aren't CS majors or necessarily dedicated to learning programming forget that it's not just the concepts that people want to learn but what they can make using those concepts. Small games and web apps are great for that. Like in Exercise 22 of your book says, "It’s important when you are doing a boring mindless memorization exercise like this to know why. It helps you focus on a goal and know the purpose of all your efforts."
I'm in the process of teaching myself to program (I'm a civil engineering student). I discovered one of pg's articles somehow, gradually worked my way through them, and began to think that I should probably learn to program myself. I have ideas for things that I presume could be solved through programming, but without learning to program it is impossible to know what is possible -- this is the hardest gap to cross.
I bought Zelle's Python book, started JavaScript with Code Academy, got Graham's ANSI Common Lisp out of the library, and tried Rails for Zombies, but none really got me interested enough to stick at it.
Dan Nguyen's Bastards Book of Ruby, however, has been the most successful. It's pretty much straight in: it actively encourages copying and pasting code early on, with the immediate prospect of making real programs which interact with the wider world (HTML scraping) instead of feeling isolated on your own computer. I could immediately see how HTML scraping could be useful for someone and how the first short programs might be extended.
There are some excellent motivating articles as well which were really key in sucking me in. He also calls his type of programming 'data programming' which to me, a non-programmer who hasn't had the chance to be brainwashed by programming theories, is the most obvious name in the world - all these problems I think could be solved with programming are just manipulating data.
btw: I don't find the prospect of learning to program games motivating at all; there seems to be so little you can learn from most of them, and they seem a dangerous waste of time.
Right, but I think you are confusing causation with correlation. You say, "Nobody wants to learn the boring stuff and they just want to make games." But, they will learn the boring stuff if you teach it right. The concepts aren't what's putting people off, they're just correlated with bad styles of teaching it.
For example, explaining programming to total beginners doesn't work. Like you say, paragraph on paragraph about boolean logic is pointless if they can't even type:
print "hello world"
A beginner can't understand this because they don't have the cognitive tools to comprehend your explanation. But, if you make them do it a bunch, in tiny bites, and build their understanding as you go, then it works. This doesn't work for experts, but it works great for people with no prior knowledge.
So, you are confusing the concept with the delivery. The concept does not cause boredom, it's the delivery that does.
And, I also don't mean you have to deliver it wearing a clown costume, I mean breaking it down in a way that builds the concepts from nothing.
So, I'm walking this edge right now with a few middle schoolers.
At the beginning of the (school) year, we started doing a programming class using scratch. The operative theory was to be able to explore, get the cause/effect thing going, some debugging, and be able to build something that they could distribute and have other people play. It was a success for a while, within a class or two, pretty much everyone had something that was more than just basic behavior. There was some animation, some 'cat chases mouse', and other basic games. But they quickly hit the wall where scratch is just painful to program. There's no abstraction, and if you want to do space invaders, you wind up copy and pasting a hell of a lot of code blocks. And woe to you if you then need to change something. Dragging and dropping is attractive till you need to build equations of motion.
So, beginning this semester, we're working through LPTHW. The class is smaller, some were definitely scared away by the "hard way" and my warning that they'd have to work. But the ones that are there are soaking it up. I think we're at ~ 5 lessons a week, based on one class time when they can do stuff + homework.
They're getting it. They can find bugs. They have seen where we can go (because of Scratch), but they've also seen that there's got to be a better way than drag and drop. My plan is for LPTHW to take a quarter, then we'll pull in pygame and some networking stuff and do a basic networked game. My goal is to be able to translate one of their Scratch games roughly block to line, then kick it up a notch. Seeing what I have now, I should have abandoned Scratch after a quarter rather than a semester. It would still have given them the view of where we can go, but spared them the more frustrating parts of the Scratch environment.
I've now got some home school experience teaching kids from pre-k to middle school. Heavily scripted stuff is awesome, _provided_ the kids see where it's all going at the end. Far better than random or directed exploration. (I've seen this in learning to read, elementary math, programming, and an art class).
I can see how the standard "first class is Scheme, second class is discrete math" scares some away. How big a population actually gets lured into programming later in life?
I have college degrees in Entrepreneurship and Marketing and have been just a "non-tech" guy for too long. I got tired of not being able to build anything myself, so I'm working my way through your book right now.
Of course I have a specific application in mind that I'd like to build, and perhaps this is evidence of my naivete, but it seems like if I were to just learn how to do that, I would be limiting my creativity. The difference between learning how to program only a specific type of application and learning to program in general seems like the difference between learning how to paint a bowl of fruit and studying the art of painting. What happens when you want to paint a bowl of fruit with a landscape behind it?
Again, I may just be one fool, but it is not the allure of being able to program a specific application that I am after, it is the pain of being ignorant that I am trying to assuage.
You present a false dichotomy. You don't have to choose to just learn how to program one kind of program or how to program generally. You can do both. And in my experience, having an idea and executing on it is a great way to start. You get instant gratification in learning one small subset of programming and you move on from there.
Like the students the OP describes, I too started programming hoping to build a game with as little effort as possible. (I wanted to build a massively multiplayer first person game. I know, I know. Hey, it was high school.)
As soon as I hit topics like OpenGL and networking, I realized this was going to be harder than I thought. The only reason I stuck with it is because I found joy in each little success, like sending a single character over a TCP connection, or putting a border around a <div>. And these successes continue to make the struggle worthwhile.
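That first little success -- a single character over a connection -- fits in a few lines of Python. A hedged sketch: `socket.socketpair()` stands in for a real TCP client and server here so the example needs no network setup, but the send/receive shape is the same.

```python
import socket

# socketpair() gives two already-connected endpoints, so the whole
# "send one character" moment is self-contained in one process.
sender, receiver = socket.socketpair()
sender.sendall(b"A")        # the entire payload: one character
data = receiver.recv(1)     # read it back on the other end
print("received:", data.decode())
sender.close()
receiver.close()
```

It's a tiny thing, but watching a byte you sent come out the other side is exactly the kind of small win that keeps the struggle worthwhile.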
So this is totally anecdotal, but my personal experience conforms more with what Zed's saying. I think you won't enjoy programming unless you genuinely like the minute, seemingly boring details. Because those become your day-to-day consciousness far more than glamorous stuff like imagining new game mechanics and plotting out a storyline. (And that's assuming you're doing game development at all.)
The article did make a very generalized statement that "nobody wants to learn programming, they want to learn how to make programs," but your curriculum is an exception. With your rhetoric, you were able to get people to learn by trotting through concepts in their barest form.
Fundamentally, you're both teaching by example and the code is simple, so that the results have a clear correlation to it.
It seems to be an aesthetic choice as to whether you teach concepts outright and make it communicable with proper rhetoric or illustrate concepts by highlighting their use in a simple application. It's like two guitar instructors. One stresses etudes and practices alongside songs and another stresses songs alongside etudes and practices. I can't see either way hurting a student. Whatever the choice, you're packaging this information in a way that makes it digestible to those who want to learn how to program (either the way of rhetoric or application).
I would like to say this, though: at least you guys are not teaching Java by programming within the Eclipse IDE, which isn't meant for beginners.
A few of us have even bought one of your books :-)
My mature students taking basic maths courses in the UK will read the textbook and come back at me with questions that show they have confronted the material and put some time into it. The teenagers need a different approach, games, Web sites with interactive material, carefully chosen written material.
Have you any age data about your participants? Perhaps the video courses?
The age range for my courses is 11 to 65 so far, but I haven't been able to get statistics on the median. It's a touchy subject to ask folks their age, but my guess is the median is about 30, or that it's bimodal with 20 being one hump, and 30 being another.
Another thing to keep in mind is that a computer is already interactive. The idea that you need "graphics" to get students interested is completely unfounded, and the only data comes from experiments run by companies trying to sell graphical programming systems to schools. Even then, these studies were done in the 70s and 80s, when many people just didn't have computers.
In my experience I've found just making simple text adventure games is entertaining enough. Think about it: It's a game that is approachable by anyone, they are fun, they involve writing, their "graphics" are immediate, they don't require any geometry, and the output looks a lot like the program itself. That's why I teach them. They're easy but still fun.
My experience says you don't need crazy amounts of sensory input to keep people interested; in fact, it just distracts and frustrates them. Keep it simple, but make sure it has depth that they can discover.
I would imagine the reason kids are likely to prefer graphics is because the games they play involve a lot of graphics so that's what they expect.
I remember in the early 90s plugging away writing various text games in QBASIC, then excitedly showing them to school friends, whose reaction was mostly "too much reading, not enough guns. Let's play Doom instead".
Of course this was somewhat disheartening and put me off programming for a few years.
The closest thing that I've seen to a comprehensive game creation "kit" is Unity3d but that has a pretty steep learning curve.
One sort of introduction that could potentially work well for teenagers would be to create a mod for an existing game that they already enjoy.
I don't think those kids would enjoy programming a game even with a good kit. They want to play games, not code them, and thus why I said no amount of graphics will get them to.
That is true, although I think one of the main motivations to learning to program (at least for me and I guess many teenagers) is to have something impressive to show off to friends.
Having said that I think you are right to encourage people to use code as a "secret weapon" rather than an end in itself. It is a great "awesomeness multiplier" to other skills.
"If you want to build a ship, don’t drum up the men to gather wood, divide the work and give orders. Instead, teach them to yearn for the vast and endless sea." - Antoine de Saint-Exupery
As I sit here, past 3 in the morning, playing with an Arduino for the first time, I find myself thinking...
GOOD LORD THIS IS FUN.
The Arduino folks have this whole thing nailed. The install process (especially for something that involves, you know, buying resistors) was super-easy. The examples seem endless (they have a UDP implementation? Nifty!). The troubleshooting page is relevant and specific. The default environment is so stripped down it hurts, but you know what? I don't care.
This _feels_ like I'm just getting started again, and I'm totally jazzed about it. This whole system is approachable, understandable, and _fun_.
I've programmed an Atmel microcontroller from scratch. It wasn't fun wiring all those components together just to bootstrap myself. Luckily I'd messed with Arduinos before, so at least I knew what I was doing and was getting down to the bare bones of embedded systems.
Still, I would prefer to develop with the Arduino. It's just such a fun way to get started with microcontrollers.
The cleanest comparison I can think of is this: nobody wants to work out. They want to look good, feel good about themselves, be admired, and have their pick of the opposite sex.
This makes sense, as does the fact that working out can get you there. They don't want to work out, they want to _want to_ work out.
At the University of Pennsylvania, we (CS club) held a Codecademy-style event (hackertrails.com) targeted mainly at business folk. We had 150 folks, mostly MBAs, show up to the first event and maybe 15 to the second.
They want to start companies, and get that programming gets them there. But they don't actually want to learn to program, they _want to_ want to.
Yup, and the approach to getting children interested and motivated is similar; teach small easy pieces that sound like what their peer group is listening to.
I think part of the problem why some aspiring programmers give up is because the bar is too damn high. They get fed a constant stream of beautiful 3D games and iPhone apps, and when they start to make one (from following a book/tutorial, for example), it looks like shit compared to what they/others are expecting and they lose their initial enthusiasm.
Compare that to a few years ago, when a simple 2D or text-based game was already considered pretty cool.
I agree with the general principle behind this. But I don't agree with it in practice. It's a bit like saying "Nobody wants to learn how to drive, they want to learn to get to the store." And strictly speaking, that's true. I mean, nobody wants to know about how to use the steering wheel or blinkers. Now, the first time I got behind the wheel, did I get to drive to the store? No. I got to drive around the parking lot after taking a class about how the steering wheel and blinker work.
Now, clearly this example is different from programming, because it's much more difficult to hurt or kill someone learning to program than learning to drive, but I'm not convinced that the general principle is any different. With programming, you've simply gotta learn about variables and loops, just like you've gotta learn how the steering wheel and blinkers work.
What I'm concerned about with this approach is that it might give newbies a premature sense of accomplishment that might hide just how difficult programming really is. In other words, it's like trying to convince someone that they've driven to the store when all they've done is driven around the parking lot.
The article states "A small amount of programming concepts had to be taught at the start and some single-concept examples are given, but the chapter always centers around a complete game."
But beyond that, on the point of premature successes:
In a way, you need a premature sense of accomplishment to keep a person motivated. Plus, if they ever come across the difficult portions of programming, that motivation might get them to go through the difficulty.
To me, the difficulty in programming is solving a problem using a lot of the different concepts that make it up, but that's true of any profession, and you can't get those concepts into your head unless you learn them one at a time.
What I don't want to see anymore is programming instructors teaching students a lot of concepts at the same time, because that's the difficult way they themselves learned and so, to them, the "right" way to learn. That's just not a proper way to learn. You need to teach one concept at a time, and if that means a premature sense of accomplishment, so be it in the short term.
Nobody wants to work hard (learn to program); everybody wants to show off a coding masterpiece, say Quake or Future Crew's Second Reality, to their friends. A good teacher leverages the latter to motivate students to do the former.
While learning piano, no one wants to sit in a room playing scales over and over. They want to play Clair de Lune. Programming is similar in that it can be an art, but it is first and foremost a skill. You need to struggle with the boring parts until you figure out that you can actually make very cool things... but first you need to learn about loops and variables.
But you can teach new techniques in the context of full pieces, gradually introducing them, to great effect. This is the whole premise behind the Suzuki method of music education, and it's highly effective in cultivating not only technical proficiency but also musicality -- and in young kids, at that (it also substantially improved the musicality in piano playing for my wife, who is obviously an adult).
I taught a high school level intro class in C++ programming. Sadly, it was very short, only 1.5 weeks of 2 hours per day, so we didn't even have time to cover classes.
What I would like is to walk the class through implementing a naïve Brainfuck interpreter, optimizing it, then implementing a Brainfuck-to-C compiler and optimize that.
It would bring a whole lot of understanding I didn't gain until much, much later.
After that, a simple text based adventure game engine to illustrate data driven programming.
That would show the difference between hard-coding behaviour in assignments and creating actually useful applications. It would also show that doing it properly isn't necessarily more difficult, and definitely takes less code.
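For concreteness, a naïve interpreter of the kind described above fits on one screen. A rough Python sketch (30,000-cell tape of wrapping bytes, brackets pre-matched into a jump table; the optimization passes are left as the follow-on exercise):

```python
def run_bf(code, input_bytes=b""):
    """Naive Brainfuck interpreter: one big dispatch loop over the program."""
    tape = [0] * 30000          # the classic 30,000-cell tape
    ptr = pc = in_pos = 0
    out = []
    # Pre-match brackets so loops can jump in O(1).
    jumps, stack = {}, []
    for i, c in enumerate(code):
        if c == "[":
            stack.append(i)
        elif c == "]":
            j = stack.pop()
            jumps[i], jumps[j] = j, i
    while pc < len(code):
        c = code[pc]
        if c == ">":
            ptr += 1
        elif c == "<":
            ptr -= 1
        elif c == "+":
            tape[ptr] = (tape[ptr] + 1) % 256
        elif c == "-":
            tape[ptr] = (tape[ptr] - 1) % 256
        elif c == ".":
            out.append(tape[ptr])
        elif c == ",":
            tape[ptr] = input_bytes[in_pos] if in_pos < len(input_bytes) else 0
            in_pos += 1
        elif c == "[" and tape[ptr] == 0:
            pc = jumps[pc]      # skip the loop body
        elif c == "]" and tape[ptr] != 0:
            pc = jumps[pc]      # jump back to the matching "["
        pc += 1
    return bytes(out)
```

The compiler version is then mostly a matter of emitting the C statement for each of the eight commands instead of executing it.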
> What I would like is to walk the class through implementing a naïve Brainfuck interpreter, optimizing it, then implementing a Brainfuck-to-C compiler and optimize that.
Nowadays, I would find that an awesome assignment, but your high school students would hate you.
I remember having to do a similar compiler assignment in C in college. I think that it was the key assignment for making students believe that programming is absolute drudgery and in no way fun. It was the culmination of the semester, and considering how hard many of the students were working, there was frustration that this was all they could make after a semester. I think that your plan will not go well, and could quite possibly sour your students on programming for life. If you teach the text game first, though, that would make a world of difference.
As someone who learned to program in my 30s (spent my 20s as a professional musician) I can't agree more. The book that was recommended to me was K&R, which I found so dry as to be impossible to read through. The first book I really used was C++ in 21 Days, which has you writing working programs from...well, from day 1.
I always thought the reason for this problem in teaching programming is that what you need to know when switching to a new language is totally different from what you need to know when you don't know how to program. If you already know how to program, the first thing you need to know is the syntax: data types, iterators, etc. It's like being a carpenter and going to a new shop -- what are the tools? But if you don't know how to use the tools, being told the torque on an impact wrench is completely useless information. Not only is it useless, it's easy to look up, and therefore a waste of time to teach.
I now help teach the Boston Python Workshop, a weekend workshop for non-developer women and their friends. We do cover loops and data types on Saturday morning, but then all afternoon is projects -- building programs that cheat at scrabble, access twitter, and draw colored grids.
I totally understand why experienced programmers teach the way they do -- it's what they'd want to know. But the focus should not be on the language, it should be on the skill of how to program regardless of language. And yes, that's really hard to teach.
Perhaps a good way to teach people beginning programming would be to show them how to use a simple web framework.
It would have to be something purpose built to be easy though probably with a dedicated language.
A bit like this:
No libraries or APIs; everything required is built into the standard library.
Database is a simple key/value store that can be used transparently within the language. Something like "Store X = 3" and "Get X" where X is the key name.
Of course you would need some form of lists too, not sure what a friendly syntax for that would be.
Very simple looping syntax, in my experience beginning programmers struggle with this more than anything else.
Something like "do X 3 times", where of course X would be a subroutine; this would make functional programming seem quite natural later.
A WYSIWYG HTML editor that can integrate seamlessly with the language itself. Save teaching HTML/CSS until later.
Very friendly error messages that provide links to simple documentation that explains where they have most likely made the mistake.
Documentation should include many videos as well as text.
Graphical debugging should also be taught through a super simple debugger that just dumps the values of every variable at a breakpoint and allows users to browse the DB.
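To make that concrete, here is a hypothetical sketch in Python of how those commands might be interpreted. The "Store X = 3" / "Get X" / "do X 3 times" syntax is just the proposal above, not an existing language:

```python
# Hypothetical sketch: interpreting the proposed beginner commands in Python.
store = {}  # the transparent key/value "database"

def run(line, subroutines=None):
    """Execute one line of the imagined beginner language."""
    subroutines = subroutines or {}
    words = line.split()
    if words[0] == "Store" and words[2] == "=":
        store[words[1]] = int(words[3])      # Store X = 3
    elif words[0] == "Get":
        return store.get(words[1])           # Get X
    elif words[0] == "do" and words[3] == "times":
        for _ in range(int(words[2])):       # do greet 3 times
            subroutines[words[1]]()
```

Even a toy interpreter like this suggests how little machinery the friendly error messages and the browsable DB would need to sit on top of.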
Is this the same with foreign languages as well? No one really wants to know the rules of a grammar. They just want to impress others, understand anime or foreign TV shows, or curse in another language.
I can say the same about programming languages. I'm reluctant to learn a new programming language because I don't want to spend time learning the library, the syntax, and any other intricacies. All I want is that one cool feature that X language has that Y doesn't.
I think the biggest fallacy these days is that "programming is easy". I bet pilots think flying is easy, and surgeons will repeatedly tell you that cutting someone open and removing that vestigial appendage of the cecum is routine, but if you did it, the patient would bleed to death from the first incision.
My point is that coding might be easy for you, easy for me, and it must be easy for most people here, but the rest of the world?
I'm going to sound like Spock here, but most individuals out there aren't logical. People out there don't think in terms of ¬P -> Q, so for them even the basics of programming are very hard, if not impossible, to understand.
And even if you work around your limitations and learn how to program there's a long way between "hello world" and the engineering behind the kind of stuff that motivates most people to learn how to code, like building the next Google.
That reminds me of how I learned programming. At the beginning of the course I wasn't into it at all -- learning Pascal, printing words into a black window -- but when I started my project, I had a goal, it became very fun, and I got very enthusiastic about it...
I think sometimes the common-sense "right way" of doing things isn't the best way. Another thing that comes to mind: you end up in big projects with very complex design patterns applied, but since nobody besides the creator fully understands the pattern, and turnover on the project is very high, the thing becomes a mess.
Teaching is about making your students enthusiastic, and programming is about making programs with readable code...
But what I really see today is teaching that's about being difficult, full of things that never get used, and programming that's about being as complex as you can...
Plenty of people are fine with learning how to program. It's Learning How To Program that people don't want to do. I blame hiring practices, TBH. If anything, programming should be the ONE field that leaves the bullshit signaling aside and just hires people who can make things work.
Shakespeare had it wrong -- don't kill all of the lawyers first. Leave them on the list, sure, but first kill everyone in human resources. Ask yourself how many times the first question asked in today's interviews is 'Can you do the job?' To my admittedly old-fashioned way of thinking, it should be the first, middle, and last question, particularly since it is the only question that makes a damn bit of difference. Oh -- and then kill all of the lawyers...
Bullshit comes from interviewee and interviewer. The asymmetric information (interviewee knows self better than interviewer) and conflict of interests (interviewee wants job; interviewer wants best person for the job) is what makes it difficult. This is before accounting for the Dunning-Kruger effect, which states that people are useless at evaluating their own abilities.
I believe (and teach) programming the way I learn a new language ('Human' language not a computer language).
It should be segmented in a way that at the end of each segment there is a 'reward' (an application of the learning that is considered beneficial/cool by the student) and a 'desire' to learn more to achieve the bigger reward.
So typical segments for a new language could be:
- bragging/showcasing to your family and friends that you know how to count some numbers and words
- being able to order food in a restaurant in the new language
- communicating with someone who speaks only the new language
- being able to enjoy a TV show/movie in the new language etc
That's why I really like the Scratch tool. I can gradually introduce new capabilities (and, implicitly, new programming concepts).
I disagree that regular expressions should be deferred until later. For professionals who are trying to figure out what value programming has for them, regexes have more use than game concepts.
Randy Pausch helped create Alice, which is a bit like a modern Logo or Mindstorms - helps you create 3d animations and various other cool stuff, while teaching Java syntax.
I'd love to take up that offer, if it's still available. I already know some programming, but I would like to have somebody as a mentor. Send me an email at mocanu_c at yahoo.com if your offer is still available.
Can anyone with kids comment on whether or not computers are even exciting and empowering anymore?
When I was eight, we got a Commodore VIC-20. It plugged into the TV and booted into BASIC. Just running PRINT statements and simple loops was unbelievably cool. I learned key positions (and how easily mistakes are made) by spending hours typing in bytecodes printed in magazines. With any luck, at the end of it, a completely unforeseen game would appear.
Getting to play with Logo on the Apple ][e was awesome, too. Shapes and angles and horribly flickering animations were exciting.
Now, it seems to me that the difference between Dive Into Python (or whatever) and Call of Duty are so immense, that even running Hello World would be a disheartening experience. But maybe I'm just too nostalgic.
Incidentally, I distinctly remember from my first proper high school programming course (Pascal) that the instructor asked why we wanted to learn programming. Every single student responded, "to make games."
I'm 19, so I've never experienced a terminal running BASIC on a TV screen or anything else from those early days. I think my oldest memory of computers is Win 98.
What got me into programming originally was Lego Mindstorms (the original version, the second iteration was no good). My dad and I made robots that could follow black lines on the ground and read barcodes, and it was all with the drag-and-drop interface Mindstorms ran on. ( http://www.bouwvoorbeelden.nl/MindStorms/ShootOut_bestanden/... ) And it was fucking awesome.
It taught the concepts of programming like loops and conditions and variables, but in a watered-down way that took care of the stuff that's not fun for a kid. As you grow you find joy in tackling more complicated things, but as a kid Mindstorms was perfect. Great role-model for any aspiring curriculum in younger education.
Even though it would be way outdated technology now, I feel like Lego should start manufacturing those kits again -- they were really amazing. Maybe that would be a good startup idea? Raspberry Pi for kids?
Meh, I mean they are aiming it at education which I think is great, but I doubt many kids would know what to do with Raspberry Pi if you just handed it to them.
My impression is that the main interest in it has been from older people who like to hack on hardware.
That's true of the current ones, but I think the idea is to develop it into a kit with case, manuals, pre-loaded software, etc. to support some sort of programming or robotics curriculum.
Personally, I think it would be neat if the R-Pi could be made into a graphing calculator, which most kids need for their math classes anyway (actually they don't need them, but are required to have them). That way, it can be a bog-standard calculator by day and a full Linux workstation by night (after plugging in a display/keyboard).
Having just graduated high school a few years ago, I'd say graphing calculators are sort of needed. Most of the time a simpler calculator would do the job, and having a $100 Texas Instruments is just a luxury because they're better laid out and show a history of operations like a terminal. Even when graphing and matrices were part of the curriculum, those things could all be done by hand if you really had a lot of time, but it would be harder to learn that stuff without a reference to compare yourself against.
TBH I think books, calculators, computers, and basically every other tool kids use in school will soon be replaced by the iPad, or a similar device. If someone can make a cheaper alternative with R-Pi that catches on (very doable), well great.
I think something more fun for kids would be (apart from the obvious game platform) some kind of communication device. Something like walkie-talkies.. but maybe communicating via IR LEDs with morse code rather than voice. Another idea is using the R-Pi as a counter for a LaserTag-like game.
I've had this project in my head to build a graphing calculator that's a full Linux workstation for some time. It'd be really nice if the display and keyboard could be in the device to start with. Ya know, so I could walk around and use the thing.
There is a lego store near me and trust me, they still sell mindstorms. They have a few different components now but are mostly the same. They cost several hundred pounds however.
Chiming in at 21 as someone who learned to code at 13-14: I actually found the make-a-game approach somewhat boring. I preferred picking up random projects (programming a styrofoam Roomba with proximity sensors on a BASIC Stamp, building a graph algorithm that would solve a Rubik's cube, etc.).
Maybe it's because I was always more into board games than video games, but I don't think the one-game-fits-all mentality is universally appropriate.
Generalising it to task-oriented (versus skill-oriented) learning may be more precise.
I second this. The whole 'video game' and game-mechanics culture of most intro programming actually kept me out of it for a while. I truly dislike making and playing video games, but I enjoy writing useful software and learning about cool topics like computer vision.
I'm not a kid, but I'm late enough to the game to be considered a youngin'. My first computer was a 160MHz Win98 box.
I started programming in '99, when I was invited to a programming class by my typing teacher. His first year-long class was split between C++ and Visual Basic. Don't cringe yet! The first semester was all C++, to learn what it is to program in a strict language. The second semester was VB because, now that we'd done the hard stuff, he "wanted us to have fun with programming." That's when we made all those cheesy games, like tic-tac-toe and Connect Four. If we wanted to continue the next year, it was all C++ for the rest of the curriculum. I ended up buying a C++ and a Visual Basic compiler just to program at home.
I believe a group of peers really helps bootstrap the learning process. You want to compete to learn more to create cooler stuff to share and get excited when a friend pulls off something pretty spectacular. Excitement and passion are contagious as all hell.
I was also part of the "I wanna make games!" crowd as well. At the time I was playing Half-Life and Starcraft. I think deep down you know that it will take work to make those kind of games, and so the "Hello World" program doesn't seem so bad because you are only learning. I also think games precondition you to learn the basics since most start out with a tutorial and a gentle learning curve that build on things you learned before. In a way, programming is one long protracted game. I would be curious to see if being a gamer today correlates with getting into programming more often, even if you don't end up making games.
I'm not exactly a kid anymore but I never got interested in programming because I found everything somewhat simple that I could actually program myself already existed. All the crazy stuff that I wanted to make was completely unachievable. So I never really had any positive feedback from my work. It was either stupid and redundant or died on the vine. Not much fun.
My six-year-old son is currently learning (fast) to read and write, and he already wants to be a "hacker"; either "white hat" or "black hat", depending on his mood.
I don't think he sees computers as "magical instruments", though. He is excited by them, by the games he can play on them, by the knowledge he can acquire with them, by the code I write on them… of course. But he is certainly more used to them than I was at his age. To the point where, when coming back from a computer lab organised by his school, he says "I'm not very good with Windows, I prefer mom's computer or yours".
I mean, he is only 6 and he already prefers one platform over another.
I'll wait a bit before his first "Hello, World!" (English not being our first language), but the chances that it's going to be a flash game instead of a small command-line executable are pretty big.
Hey, I learned on a Vic-20 too. Loved it and (to my parents' dismay) spent far too much time writing programmes rather than doing "normal" kid things. Then one day I got a book -- something like "How to write adventure games". One of the best books I've ever read. I loved it.
This book showed me how to design and write, from scratch, a real adventure game. How to design a game and draw maps (well, I was pretty accomplished at that already), how mazes worked, how to parse and handle the standard verb-noun input. Everything! I remember writing games and taking them (on cassette) to show my teacher at school. Awesome times.
[Edit: I've spent the last half hour looking, but no joy yet. It's not "Creating Adventure Games on Your Computer" (Tim Hartnell) nor "Write Your Own Adventure Programs" (Jenny Tyler and Les Howarth).]
I have two kids. The time at the computer is deliberately limited, so they aren't into games that much, although they like the online games, etc. Not into programming either, they didn't even pose a question 'how is this program made?'. To them it's more like a whole self contained thing. Like a car. You don't sit in a car and say 'I don't like the curve of this seat, I'll hack this afternoon and see if I can come up with something better'. I hope you get an idea. What they do like about computers, is that they can use them to create something - drawing, presentation, writing. In fact, both are really into writing - trying to write novels and short stories. So computer to them is like a tool, which you can use to create something, but not modify itself.
I started off trying to make games. Along the way I noticed all the other (Way cooler) stuff going on behind the scenes and dropped them.
Anecdotally, this seems to be the same story for the few people I know in real life who claim to know some form of computer programming. (Whose claims I actually believe.)
I can vividly remember typing in stuff from the programming manuals when I wasn't older than 6... freaky stuff, I think my very oldest vivid memories are coding memories.
I agree that involving beginners with OOP is a distraction -- albeit a very important principle. But they must first be able to write their own modules, functions and classes, then be comfortable with exporting those; OOP can come naturally from that. Most software is still highly procedural in nature (while it may consume and refer to objects), and even in languages like Python, where everything is an object, you can still write software that has a clearly defined beginning, middle and end. Once you have a handle on all that, you can then start utilising the objects and their properties, and that is huge fun in Python and Ruby.