I got a couple of 2TB WD Greens for $85 apiece and a ZyXEL NSA221 two-bay NAS (some ARM chip running what I think is Debian but I've never bothered to look) for $120. I think that one's discontinued now, though.
"...our understanding of the laws of physics is advanced enough nowadays to describe almost perfectly everything we observe in everyday life." -- nope.
What about the things we can't describe? Some of these effects are very computation-intensive to model, but that doesn't mean their basis isn't already well understood.
This is a gross overstatement, and fallacious. Many of the problems being studied today would take millions, billions, or trillions of years of computation to model... and that's with approximations. Using even the largest clusters on the planet, we can't find the ground state of even the smallest protein ab initio. DFT scales cubically with system size, and DFT is itself an approximation to true first-principles calculations that are impossibly enormous and can never be carried out in full.
The number of systems we can accurately model is minuscule; the number of open problems in computational chemistry and materials science is enormous.
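To put the cubic scaling in concrete terms, here's a back-of-the-envelope sketch (illustrative figures only; the one-hour baseline is an assumption, not a real benchmark):

```python
# DFT cost grows roughly as O(N^3) in system size N, so doubling the
# system multiplies the cost by about 8. Baseline is hypothetical.
base_hours = 1.0  # assumed cost for some reference system
for factor in (1, 2, 4, 8):
    print(f"{factor}x the system size -> ~{base_hours * factor ** 3:g} hours")
```

And that's the *cheap* approximation; the exact first-principles problem grows exponentially, not cubically.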
> impossibly enormous and can never be fully calculated.
I wouldn't go so far as to say that.
A quantum computer can sidestep the fermion sign problem, scaling polynomially with the number of particles instead of exponentially. Estimates of if/when such a practical device will be built vary wildly, but I'd think that possibility, along with new techniques that exploit the redundancy inherent to certain classes of problems to diagonalize the Hamiltonian efficiently, could accelerate the rate at which we can handle larger and larger systems. It's hard to predict what breakthroughs will be made, but I'm staying optimistic.
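The exponential-vs-polynomial gap is easy to see with a little arithmetic: a classical state-vector simulation of n spins stores 2^n complex amplitudes, while a quantum computer would need only n qubits (memory figures assume 16 bytes per complex128 amplitude):

```python
# Exact classical simulation of n two-level particles needs 2**n amplitudes.
# A quantum computer would need only n qubits -- polynomial resources.
for n in (10, 30, 50):
    amplitudes = 2 ** n
    gigabytes = amplitudes * 16 / 1e9  # 16 bytes per complex128 amplitude
    print(f"n = {n:2d}: {amplitudes:.2e} amplitudes, ~{gigabytes:.3g} GB")
```

At n = 50 you're already past ten petabytes, which is why the classical approach hits a wall so fast.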
EDIT: Then again, looking at your posts on here, I suspect you already know all that ;)
Short summary: quantum teleportation works by taking advantage of entanglement. To teleport a quantum state, you prepare a source qubit and a pair of entangled qubits, and send one of the pair to your teleportation destination. You then interact the source qubit with the remaining entangled qubit (the exact interaction depends on the paradigm of quantum computing you're using, e.g. applying a Hadamard gate in a photonic system) and measure both. Sending those measurement results to the destination over an ordinary classical channel lets a simple correction be applied there, after which the destination qubit holds the original state of the source qubit. The destination qubit can be arbitrarily far away, thus "teleportation" has occurred.
The new thing in this study is that they use a "hybrid technique" to increase the efficiency of this process by over 100x. With older techniques, the teleportation only succeeded probabilistically, so transfer fidelity was not always high. The new technique apparently gets past that barrier, so the teleportation comes out right each time.
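The textbook version of the protocol can be sketched as a tiny state-vector simulation (a simplified three-qubit model in NumPy, not the photonic setup from the study; qubit 0 is the source, qubit 1 is the sender's half of the pair, qubit 2 is the destination):

```python
import numpy as np

rng = np.random.default_rng(0)

I = np.eye(2)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

def op(gate, qubit):
    """Lift a single-qubit gate onto one qubit of a 3-qubit register."""
    mats = [I, I, I]
    mats[qubit] = gate
    return np.kron(np.kron(mats[0], mats[1]), mats[2])

def cnot(control, target):
    """CNOT on a 3-qubit register (qubit 0 is the leftmost factor)."""
    U = np.zeros((8, 8), dtype=complex)
    for i in range(8):
        bits = [(i >> (2 - k)) & 1 for k in range(3)]
        if bits[control]:
            bits[target] ^= 1
        U[sum(b << (2 - k) for k, b in enumerate(bits)), i] = 1
    return U

# Arbitrary source state |psi> = a|0> + b|1> on qubit 0.
a, b = 0.6, 0.8j
psi = np.array([a, b], dtype=complex)

# Entangled Bell pair on qubits 1 (sender) and 2 (destination).
bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
state = np.kron(psi, bell)

# Sender interacts the source with her half of the pair, then measures.
state = op(H, 0) @ (cnot(0, 1) @ state)
outcome = rng.choice(8, p=np.abs(state) ** 2)
m0, m1 = (outcome >> 2) & 1, (outcome >> 1) & 1

# Project onto the observed outcome and renormalize.
mask = np.array([((i >> 2) & 1) == m0 and ((i >> 1) & 1) == m1
                 for i in range(8)])
state = np.where(mask, state, 0)
state /= np.linalg.norm(state)

# Destination applies the classically communicated corrections X^m1, Z^m0.
if m1:
    state = op(X, 2) @ state
if m0:
    state = op(Z, 2) @ state

# Qubit 2 now holds |psi>.
idx = m0 * 4 + m1 * 2
bob = state[idx: idx + 2]
```

In this idealized model the correction step succeeds for every measurement outcome; the probabilistic failures the comment mentions come from the physical implementation (e.g. incomplete Bell measurements on photons), not from the protocol itself.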
Your use of the language "passed up to the outside" is interesting... Up and outside imply some sort of intrinsic directionality. Considering that functional programming seems to capture the mathematically inclined, can anyone speak to the concept of topology applied to code structure or interpretation? For instance, does a closure imply some sort of filtration structure on the "code space?" (by filtration here, I mean the concept from algebraic topology)
Oh man, you lost me. I'm definitely not one of the mathematically inclined. My analogies are rough and largely wrong and the only thing I'm trying to relate is the particular way of thinking that at a critical moment helped me understand a little more. I wouldn't put too much credence on the veracity of "up to the outside". If it does hit on some underlying important point, it's entirely by accident, I swear.
For the uninitiated, could someone please explain what we use Ember (and related frameworks like Angular.js) for?
For example, I build a Rails app to handle models, views, and controllers on the backend. Then I can use HTML/CSS/JS to write a frontend to interface with the Rails app. Why do we need another MVC framework on top of Rails?
When writing a single-page app, you only need a RESTful API on the backend (much like when developing for mobile) to sync your models from the client.
Ideally, the models would be shared between client and server without duplicating code. There have been a few steps in that direction in node.js, but it's an ongoing problem that hopefully we'll figure out in the next couple of years.
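As a rough sketch of what "syncing models over a RESTful API" means — illustrative Python with hypothetical names (`Post`, `example.com`); a real client-side framework does the equivalent in JavaScript:

```python
import json

API_BASE = "https://example.com/api"  # hypothetical server endpoint

class Post:
    """A client-side model whose state round-trips to the server as JSON."""

    def __init__(self, id, title, body):
        self.id, self.title, self.body = id, title, body

    def to_json(self):
        return json.dumps({"id": self.id, "title": self.title, "body": self.body})

    @classmethod
    def from_json(cls, payload):
        d = json.loads(payload)
        return cls(d["id"], d["title"], d["body"])

    def sync_url(self):
        # The client would PUT self.to_json() here; the server's only job
        # is to persist it and hand the same JSON back on GET.
        return f"{API_BASE}/posts/{self.id}"

# Round-trip: what the server stores is exactly what the client rebuilds.
original = Post(1, "Hello", "First post")
restored = Post.from_json(original.to_json())
```

The server never renders HTML for this model; it just speaks JSON, which is why the same API can serve a single-page app and a mobile app alike.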
To clarify: when you say RESTful API, do you mean the client-side MVC framework issues API calls to a server, which then updates the backend (the database)?
I guess this model is client-centric while the typical Rails app is server-centric (MVC logic executes on the server and then serves the relevant HTML/CSS to the client).
Any particular advantages to the client-side framework over the server-side framework?
The main advantages are performance, plus maintainability for complex UIs. Navigation can be nearly instant, since you're only making specific requests for data from the server instead of shuffling HTML and reloading dozens of assets all the time.
There are trade-offs, of course: it's a tad more complex than the old way, but that's offset by a simpler implementation on the server. Network round-trips are minimized, but you use a lot more memory and CPU on the client, and you need more extensive testing. It's a great fit for webapps, less so for more content-oriented websites.