I'm sorry about that. I might have been the one who originally suggested the name "Java", inspired by the cup of Peet's coffee I had in my hand at the time.
To be fair, it would have been Kim Polese, Patrick Naughton, James, and Arthur van Hoff who made the final decision on the name from the suggestions thrown out in that meeting. And furthermore, Yahoo search was dominant then (maybe with AltaVista), and the idea of polluting the search space hadn't occurred to anyone.
Are you telling me I left a random comment about something that happened 25+ years ago and the relevant person saw it within two hours?
Mind blown :o
Anyway, it made sense at the time; it's just unfortunate. The only redeeming part is that in Indonesian it's actually spelled Jawa, so it doesn't affect locals too much.
AWS, like Google, Azure, and every other cloud provider, already does everything it can to minimize energy consumption because it is in their economic interest to do so.
In fact I would think the best thing industry can do to reduce energy consumption is to move their data to a cloud provider. It takes far less energy to cool one large room with servers from a dozen companies than it does to cool a dozen server rooms with private on-prem servers.
This all seems correct, but insufficiently proactive, at least not enough to satisfy activists.
For instance, though it may not currently be economical: how about pre-cooling using renewable energy? Locating data centers next to hydro / geothermal sources? Larger UPSes? Load balancing across DCs with available renewables? Other mitigations to get ahead of the issue?
Levandowski still lists his occupation on LinkedIn as "VP Engineering at Uber ATG".
How the !!@ can that still be true?
Hasn't Uber cut ties with him yet?
Pascal was the programming language we used in college in the late 70s, and although I knew it translated to p-code that was then interpreted, somehow that never became much of a focus. We just used the language like you would any other high-level language, and bitched about the shortcomings or inconsistencies.
I can tell you first hand that when James Gosling was developing Oak/Greentalk which became Java, Pascal never came up. Sure we all knew and had used it, but the focus was much more on C++ and what to keep and what to discard. I think Smalltalk or Self might have even been more of an influence since Dave Ungar was in the Sun research division at the time and was spreading the gospel of generational garbage collection.
> I can tell you first hand that when James Gosling was developing Oak/Greentalk which became Java, Pascal never came up ... the focus was much more on C++ and what to keep and what to discard.
The interesting thing is that at the time Java was released and then aggressively marketed by Sun, the AT&T folks had already come up with Limbo (the successor to Alef, and running on the Dis virtual machine), which was technically quite similar to the Java/JVM solution, and in some ways quite superior. But nobody ever mentions Limbo in connection with Java, or even with Go (which it - along with its predecessor Alef - was a clear influence on). History is written by the winners, and Java was a winning language/platform for quite some time.
Even here on HN, people bring up Plan 9 all the time, forgetting that Inferno and Limbo were actually the end of the road and Plan 9 just a middle step.
Great point about Smalltalk. It had been around a long time by the time of UCSD Pascal, and had a byte code engine from the start.
Along similar lines, Burroughs had a line of machines that swapped in language-specific microcode on a task-switch basis(!!!!!) so the idea of language-specific instruction sets has a long history.
What a shame it was that Gosling didn't understand C++ well enough even to crib from it.
I am astonished, again, whenever I am reminded of what he tried to copy and got horribly wrong, and what else he copied and should have known better than to. None of it would matter if Sun had not dumped a billion dollars into hyping it, so that later generations are now saddled with billions of lines of it.
What a cruel joke to play on posterity, a fitting companion to x86 ISA and BSD sockets.
they focused on C++ as an object-oriented programming language (as did, to be fair, almost everyone in the 1990s, since it was The Big Thing then) without seeing where C++'s real value is, which is:
- value types
- RAII idiom with exceptions
it's a big, coherent package, which means you can write code where a resource's lifetime is tied to a scope, and you can be assured that you won't have null pointers cropping up left and right, with far fewer "new"s to write; your objects are always in a valid state, etc etc.

Java is the opposite: the whole object model revolves around the identity of objects, so any method that looks like `public String whatever()` can return null, and you have to do manual cleanup of your resources pretty much every time you use a type (closing files, streams, soundcards, etc etc), versus every time you define a type in C++.
Java is tremendously influenced by C and C++. Any experienced C++ programmer can learn Java easily because it has a similar syntax but a much simpler palette of language constructs.
Undefined behavior is very useful, and Java doesn't replace it with anything better for high-performance CPU programming.
In particular, some C loops can be optimized only because signed loop counters are assumed not to overflow. Java also doesn't have SIMD, and its very weak form of arrays as value types leads to pointer chasing.
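To illustrate the pointer-chasing point with a sketch of my own (the names here are not from the thread): in C++ an array of structs is one contiguous block that can be streamed through the cache, whereas the Java-style layout is effectively an array of references, so every element access costs an extra indirection.

```cpp
#include <vector>

struct Point { double x, y; };

// Contiguous value-type array: elements sit next to each other in memory,
// so iteration is sequential and cache-friendly.
double sum_values(const std::vector<Point>& pts) {
    double s = 0.0;
    for (const Point& p : pts) s += p.x + p.y;
    return s;
}

// Java-like layout: an array of pointers to heap objects. Every element
// access dereferences a pointer ("pointer chasing"), which can land
// anywhere in memory.
double sum_pointers(const std::vector<const Point*>& pts) {
    double s = 0.0;
    for (const Point* p : pts) s += p->x + p->y;
    return s;
}
```

Both functions compute the same sum; the difference is the memory layout each one forces on the caller.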
And Objective-C, stuff like interfaces, resource bundles, lightweight metaclasses, APIs for distributed computing (J2EE was based on an internal OpenSTEP framework).
Not a chance at our 3-yr-old startup. That's less than a summer project, and nobody expects much from summer interns. Maybe you haven't written highly async Go services in containers orchestrated by Kubernetes. You would spend those two months just ramping up on our code base and customer issues before you could be productive. And if by some miracle you actually produced production code, nobody wants the author of that code to immediately walk out the door. Experienced programmers understand that 80% of the cost is in the maintenance over the lifetime of the code. That is a lot easier to do if the author is still around.
"In this case, Monsanto denied requests by university researchers to study its XtendiMax with VaporGrip for volatility - a measure of its tendency to vaporize and drift across fields.
The researchers interviewed by Reuters - Jason Norsworthy at the University of Arkansas, Kevin Bradley at the University of Missouri and Aaron Hager at the University of Illinois - said Monsanto provided samples of XtendiMax before it was approved by the EPA. However, the samples came with contracts that explicitly forbade volatility testing."
I don't know how hard it will be to land a tech job in general, but for any individuals I know it will be more difficult. It gets much harder in your thirties, and damn near impossible to get a development job by the time you are fifty. You had better have made your millions and retired by then. I recently landed a job at a startup and couldn't be happier, but it was a helluva struggle to overcome pervasive cultural biases that favor less experienced coders right out of college.
That's exactly what I'm talking about. As a developer you're at your most marketable from the age of 20 to 30. Anything past 30 (not ALWAYS) is usually not good.
A natural path would be to go into management, etc. But when you look at the ratio of management to dev roles, it's pretty bad. Not everyone becomes a manager.
I have been laid off enough to know that no job is safe, no matter what job you are in. The best thing you can do is work on things while you have a job that will set you up and secure your future. Get into that mindset and as long as you are working towards it, you have a better chance at future security.
Certainly doesn't copy tweets from https://nitter.net/RonFilipkowski