
I resent calling Kelly Johnson merely the inventor of the SR-71. He was the father, the project manager, and the organizational hustler who founded the Skunk Works and defined its working culture. He was a legendary aviation engineer, but he also really understood how to get a team of experts to produce results and to cooperate with manufacturing and operations to create feats of engineering.

He founded and ran a lean organization on grit and thriftiness back when the Toyota Production System was taking its baby steps in Japan.

I heartily recommend Ben Rich's 'Skunk Works' to anyone who gets a kick out of a true story of what it actually means, in terms of output, when an innovative engineering team works lean... in hardware.



That's an excellent book.

Kelly had a few other rules. One was that the engineers, the machinists, and the airframe must be physically close, preferably in the same building. That way, if a part wasn't working, the assembler, machinist, and designer could all get together at the airframe and figure out what to do about it. HP also had that approach to work in their glory days.

Another Kelly saying was "Kill problems, don't wound them". If some part is causing problems, don't just make a version that has fewer problems; redesign it so the problem goes away completely.


The "kill problems, don't wound them" idea reminds me of the idea in programming to make invalid states unrepresentable. It's not always possible to redesign problems away entirely, but if it is possible, it is a good thing to do.


Absolutely. A common approach to programming bugs is to try and educate programmers to not make particular mistakes. A more effective approach is to redesign the language so those mistakes are impossible.

For example, the JSF C++ guidelines say not to use 'l' suffixes on numeric literals, because they look like a 1. D simply makes the 'l' suffix illegal; use 'L' instead.
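
For comparison, Rust (which comes up below) sidesteps the same lookalike problem by only accepting type-name suffixes on integer literals. A rough sketch, with names of my own choosing:

    fn main() {
        // In C or C++, `10l` is easy to misread as `101`.
        // Rust has no bare `l`/`L` suffix at all; suffixes are type names.
        let timeout_ms = 10_000i64; // explicitly typed literal
        // let bad = 10l;           // rejected: invalid suffix `l`
        println!("{} ms", timeout_ms);
    }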


A common approach to programming bugs is to try and educate programmers to not make particular mistakes. A more effective approach is to redesign the language so those mistakes are impossible.

Exactly. That's why I'm bothered by Rust being almost protected against memory errors, and Go being almost protected against race conditions. Even if you give up some performance, it's a big win to eliminate entire classes of errors.

(Still, both are way ahead of C/C++ in this area. So far, I don't think there's been a CERT advisory for a Go program. Rust is too new to be attacked yet.)

It has been (7) days since the last CERT security advisory for a buffer overflow in a C/C++ program.


Rust fully protects against memory errors (and data races, for that matter).

Having "unsafe" is necessary for making useful software and, in fact, it is also present in most other safe languages: Python, Java, Haskell, etc.


Unsafe is useful and necessary for many things, but most of the time it is easier to avoid it.

However, it is disingenuous to claim unsafe is necessary for making any useful software.


How so? Without unsafe, you can't do what you need to do. That's the definition of "necessary for making useful software."


Can you define "what you need to do"?

For instance, if I want to download and parse an HTML page, or make some sort of web crawler or bot, why would I need unsafe?
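
For what it's worth, that kind of job looks roughly like this in entirely safe Rust (assuming the third-party reqwest and scraper crates; the exact APIs may vary by version):

    // Cargo.toml (assumed): reqwest with the "blocking" feature, and scraper.
    use scraper::{Html, Selector};

    fn main() -> Result<(), Box<dyn std::error::Error>> {
        // Fetch a page over HTTP; no `unsafe` anywhere in this code.
        let body = reqwest::blocking::get("https://example.com")?.text()?;

        // Parse the HTML and print every link on the page.
        let document = Html::parse_document(&body);
        let links = Selector::parse("a").expect("valid CSS selector");
        for element in document.select(&links) {
            if let Some(href) = element.value().attr("href") {
                println!("{}", href);
            }
        }
        Ok(())
    }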


A common approach to programming bugs is to try and educate programmers to not make particular mistakes. A more effective approach is to redesign the language so those mistakes are impossible.

Depends on what you want. Give me a power tool and I'll give you a powerful result. Give me safety scissors and I'll take much longer, and the result will probably be inferior. And I can say with some certainty that both products will be roughly equal in terms of safety, because I pay careful attention to safety when designing and implementing systems. But the products will differ greatly in terms of feature set, extensibility, and robustness.

It's difficult to add safety without subtracting power.


Both of their autobiographies are worth the read for any engineer, or aspiring engineer. I found Johnson's especially inspiring: he grew up in a poor family and worked very hard to become a highly respected aeronautical engineer, responsible for more innovative projects than any of his peers (before or since).

Some of the stories that stuck with me from his book: his father (a bricklayer in the Midwest) used to have to light a fire in a barrel to heat up bricks so that the mortar would adhere to them in the winter. He also mentions that as a child during the Depression, he and his sister used to walk to the curves on the railroad tracks with their wagon to pick up the coal that fell off trains so they could heat their house. He worked his way through college in the kitchen of one of the fraternities, and soon after contributed to the design of the Electra and then the first American jet fighter to enter service (the P-80).


Seconded. I'd note that while Johnson's memoir (/Kelly/) is more inspiring, I recall there being more meat in Rich's memoir (/Skunk Works/)---more discussion of the technology, organizational dynamics, and interpersonal politics. Both are, as you say, very much worth reading.


For more of this kind of thing, see "Sidewinder: Creative Missile Design at China Lake" by Ron Westrum. China Lake had an even more radically free-wheeling approach.

This is all really close to hacker culture, to the extent that there's a risk of "inscription errors" (where you mistakenly think you have found corroboration for your ideas but are just looking at your own culture). Fred Turner traces a lot of the current philosophy of the Internet to post WWII research cultures: http://c-lab.columbia.edu/0188.html


Another Stewart Brand fan. I had a WELL account. I used to go to the Hacker's Conference. No, computer technology did not come out of the hippie movement. It came out of the people who wore white shirts in the 1960s.

The Sidewinder story is interesting. I used to work for Ford Aerospace, which made the things at another location. The China Lake crowd developed proximity fuzes, designed to turn a near miss into a hit by detonating the warhead as it passes close to the target. If they could do some steering correction, they could turn even more near misses into hits. The pilot using an early Sidewinder still has to get on the tail of the target, but they don't have to get as close. Sidewinders usually either get a direct hit or miss completely, because the heat-seeker gets more accurate as it gets closer.

The competing radar-guided missiles of the era were trying to lock on at much longer ranges and hit the target on their own, like surface-to-air missiles. That's a much harder problem, and the electronics of the era wasn't up to the job.

The "China Lake culture" was simply that there wasn't anything else to do out there but work (it's in the middle of the Mojave Desert), visits from the top brass were rare, and they had all the facilities to build and test weapons without having to outsource fabrication. Plus they had planes, pilots, an airfield, and a bombing range. So a lot of work got done. China Lake was also a joint Navy-university setup with Caltech. (JPL, as a Caltech/NASA operation, is similar.) So they had access to theoreticians when necessary.


I second Ben Rich's book. Kelly's own book, "Kelly: More Than My Share of It All" is worth a read too, if you can find a copy.



