I'm not convinced that there's a coherent concept of real food. Most real food advocates I've seen are happy to eat, say, tofu. They'll admit that technically it's processed, in the sense that it's industrially manufactured by pulverizing soybeans beyond recognition then adding chemicals to the big holding vats. But people say it's not highly processed, or it hasn't lost the natural essence of soy, or whatever. I just can't find a way to extract a meaning beyond "the kind of food that real food advocates like".
Tofu is processed but also incredibly ancient[1]. If it was bad for us, we'd probably have found out about it by now. Ditto for seitan[2], tempeh[3], various pickles and other types of preserved foods, and cheeses. "Traditionally" processed foods get a pass on the "real" food scale because they've (mostly, apart from smoked and cured meats and fish) passed the "is it harmful?" test.
Also true. But arguably it was only possible for those dangers to become known recently. Smoked meats cause cancer. Death by cancer was not so frequent in the past, because fewer people lived long enough to develop cancers that could kill them. Or in other words, they were safe enough at the timescales that people previously operated in.
Also, even though the link between smoked meats and cancer is now as incontrovertible as the link between, say, tobacco and cancer, the risk is far lower. So it's not like they're wildly unsafe, just kinda sorta unsafe.
Wouldn't that sort of counter your argument that the dangers of traditional foods would have been discovered over the ages? Traditional foods could have caused cancer without it ever being detected, according to your own argument.
"they were safe enough at the timescales that people previously operated in."
If you were a peasant in the 1800s expecting to live till 50, sausages and smoked meat would probably _increase_ your life expectancy due to the additional protein. You would also not live long enough, or eat so much of it as to have a significantly higher likelihood of getting cancer.
The context in which those foods are consumed is different today than it used to be. People live longer and consume way more meat (of every kind) than they used to. If you eat smoked meats and fish (and really any other meats) at the same rates as peasants of 200 years ago, I doubt they'll increase your cancer risk by all that much.
Doesn't the point still stand, though, that you can't use historical usage to prove safety currently (given the change in how we eat food and how long we live)?
Safety is a spectrum, really. To give a very binary example: we are certain that eating rotten poisonous shark (https://en.wikipedia.org/wiki/H%C3%A1karl) won't kill you, because of the historical record, even though eating rotten fish and eating toxic fish are each separately known to be lethal.
It doesn't tell us if it is optimal health wise to do such a thing.
Really, 'real food' is a latter-day religion/marketing pitch built on what sounds good, validates the buyer's emotions, and sells the premium product.
The /actual/ ancient past was far more concerned with quantity and physiological performance ('can you get big on it?' as a positive, 'how likely is it to kill you?' as a negative) than with any high-minded ideals.
Despite trying to sound like one, 'real food' isn't a real goal or metric, in the same way that dog breeds with no purpose beyond aesthetics, defined by vapid breed standards, rapidly become the canine counterpart to the Hapsburgs. Compare that to a working dog breed's actual de facto requirements. "Smart enough to keep up with and herd livestock, mean enough to chase away predators but not so vicious it kills the livestock itself" is a real set of goals that constrains the breed into a functional space to optimize for, and it can even tolerate some wasteful vanity in appearance selection. Hell, even something overspecialized like "try to be the fastest on the race track without being so badly behaved that they are disqualified" is better.
I am convinced that the underlying issues, be they dog breeds or food, aren't the real problems; the ways of thinking behind them are.
It's OK if there's not a universal definition of every concept. Some things in life are culturally subjective - especially cuisine. However, I think an intelligent person can look at string cheese and say that's fake food.
I've started realizing that there is very little space for nuance and subjectivity in discussing things on HN
String cheese is another great example. Most people would identify it as a clear example of "fake food", but it's literally just mozzarella stretched in a particular way.
I have no objection to nuance and subjectivity. There's nothing wrong with someone saying "I like fresh salads and fish and tofu, but string cheese and Big Macs aren't for me". Certainly I wouldn't say that you must eat string cheese unless you have an ironclad reason to avoid it. What I object to (and what I think HN is particularly sensitive to) is trying to sneak personal preferences under the guise of loaded terms like "real food".
Then I guess I'm guilty of sneaking in my personal preferences. I wanted to have a conversation about the industrial food supply and failed to ask the right questions.
The ingredients for mozzarella are: buffalo milk, salt and rennet (and a bit of leftover whey for the bacteria). I’m sure there is far more in a packet of “cheese strings” than that
rennet = enzymes and the original list left out cultures, but there are undoubtedly cultures in their cheese, so the only difference in the lists is presentation.
Anything frozen & prepared seems to be taboo -- like a frozen pizza. But frozen peas and berries are fair game. Everything else seems to come down to branding.
I've seen premade refrigerated pizza fly off the shelf. I've seen drinks with all kinds of highly processed ingredients sell like hotcakes. I've seen condiments with a list of ingredients as long as a phone book purchased by plenty of people that eat only "real food".
Protein powders and supplements intrigue me to no end. These are some of the most processed and unregulated things sold on shelves. But "real food" advocates literally eat them up.
One of my friends is making millions catering to this group. I've been trying to figure out how these shoppers think. To me, it doesn't seem like there's much in common. It seems to be mostly branding.
consider signaling. if you only eat "real food", you give off a vibe (not proof) that you're some mix of: educated, with disposable income, purveyor of quality
this is largely the same with other quasi-positive labels (esp. political ones). lots of $$ to be made here, as people generally want to reinforce both their own and others' perceptions of them.
This sort of thing always seems super american centric to me personally. It's common in chinese households to have a stash of frozen dumplings prepared in-kitchen, or frozen steamed buns, or various other frozen prepared foods that are kitchen staples. I get that if it's from a restaurant or big processing plant you might find things questionable, but if you prepare your own pizza and decide to freeze it for later I don't think anyone should besmirch you as suddenly less healthy compared to eating the pizza as soon as you made it.
I don't think anyone has a problem with food you cook and freeze. "Frozen foods" in this context are generally understood to mean "packaged frozen food, prepared by the seller" (which excludes the corner case of a local charcuterie preparing and freezing something).
I think many people do have a mistaken impression that frozen produce is less nutritious than fresh. Kind of an inverse version of the "health halo" that attaches itself unjustifiably to products like "pure, natural" cane sugar.
I had that mistaken impression until this very second, and I've been buying a lot fewer vegetables than I should for fear of fresh ones going to waste. Guess I have a good New Year's resolution now.
> Anything frozen & prepared seems to be taboo -- like a frozen pizza
> I've seen premade refrigerated pizza fly off the shelf.
To be fair most frozen pizza is terrible - pre-baked bread that you're reheating, little better than frozen cheese toast. Whereas premade pizzas tend to be freshly made and you're baking the dough. So they taste a lot better.
You're giving the marketing department way too much credit in my opinion.
Pretty much all of these highly processed foods are highly processed in order to reduce cost in some way (e.g. increasing shelf life, decreasing storage requirements, using a substitute ingredient, removing the need for some other product, etc.). All the finicky bits of producing food (e.g. handling ingredients that are much less shelf stable than the final product) get abstracted away to some factory somewhere. All of this seems like you're getting something for nothing if you don't know it's unhealthy.
The technology and processes used to create stuff that is recognizable as "modern industrial food" were mostly developed and matured over the late 1800s and early 1900s. For reasons that should be immediately obvious cheap and shelf stable became highly sought after traits for ingredients in the 1930s and 1940s. Likewise a generation of people grew up seeing their parents shoe-horn products like Crisco into use cases formerly reserved for other more natural ingredients. Considering that they grew up on it it's no surprise they stocked their 1950s and 60s cupboards and pantries with the sorts of products that they were familiar with from their youth.
Of course marketing is icing on the cake but things, like cooking habits, that you generally learn from your parents are generally resistant to fast change without some sort of strong outside motivation.
I think Chris Kimball of America's Test Kitchen spoke on this. One of the other big reasons that Americans went from lard to vegetable shortening was that lard was used in ammunition, so in World War II much of the production of lard went to the war effort, and folks started to use shortening instead.
Thank you for a detailed and nuanced history. No doubt the industrial processed food supply was necessary for the population growth in the latter half of 20th century. The agriculture industry will need some massive overhauls to survive the next century
All makes sense but I'd say that there is also the factor of trying to maximize particular nutritional traits for marketing purposes ("no saturated fat," "no fat," "no sugar added," "no nitrates," whatever).
I think it's a spectrum of realness rather than real or not real - I also really dislike the title of this article as it connotes stuff that isn't literally conveyed. Industrially produced food is what nearly all of us eat, our sausages, our kale, our milk... all produced on an industrial scale.
That said, I think I value food that has fewer steps in preparation as being "more real" so I'd prefer something like a meat and onion skillet to beef wellington or a clumped cream chicken pot pie.
Sorry to clarify - I'm not worried about industrially produced kale - but the kale you buy in the supermarket is industrially produced.
I think the title is terrible for choosing that specific word - I'd much prefer a title like "How Crisco Made Americans Believers in Artificial Food". And in actuality it's more of an issue of the government not preventing wholly unhealthy food from finding the wide markets they have.
If I buy them from the butcher at the end of my block who makes them in house, they're not real food? If the butcher makes them at the store and takes them home, they're not real food?
What if I make them at my house and take them to someone else's home?
It isn't pedantry, though. People talk about 'real' food, but they don't really have an idea of what that means except that it 'feels' real. I think this goes a long way to show that 'real' food is a meaningless statement, and we should stop using it... it doesn't mean we are pedantic.
Apologies. My post was too harsh and impersonal—dealing in generalities when a human whose concerns and actions are particular was involved. Sorry about that. Have a nice holiday season.
It’s not semantics. I’m trying to understand what OP means by “real food,” because I suspect he’s peddling nature woo.[1] Heavily processed foods, like sausages and cured meats, have been a basic part of the human diet for millennia.
Yes. They're also bad for you if eaten regularly. Not all "real food" is healthy. Not all "fake food" (whatever that means) is unhealthy. They're orthogonal concepts.
IMO any processed food that hasn't been around for a couple of generations should be treated with some suspicion.
If you want to define "processing" a loosey-goosey definition might be "anything that can't be done in the average kitchen using ingredients commonly found at a grocery store". By this definition, maybe breakfast cereals aren't processed - you could make Grape Nuts in your kitchen if you were sufficiently determined and masochistic. But it's a decent heuristic otherwise.
Our brains spent millions of years evolving in nature, makes sense that cramming into cities with unnatural lights and disrupting biological rhythm would make them go haywire
"Our bodies spent millions of years evolving in nature, makes sense that taking showers with unnatural temperatures and disrupting biological skin bacteria would make them go haywire"
"Our teeth spent millions of years evolving in nature, makes sense that scrubbing them with unnatural mint-flavored pastes and plastic bristles would make them go haywire"
"Our eyes spent millions of years evolving in nature, makes sense that covering them with unnatural lenses would make them go haywire"
I appreciate your pedantry - the scientific community understands the mechanical physiological body (skin, eyes, hair) on a cause-and-effect level, but hardly understands the mind and mood, because mental health is such a subjective experience that it's nearly impossible to fit into a modern scientific paradigm
Mental health isn't a strictly subjective experience though. You can measure days called-in-sick. You can measure suicide count.
Cause and effect isn't necessary to reduce harm. You with me?
It's nice to have certainly, but it is not required.
Imagine that I am a friend of yours. I tell you that my child is depressed and the doctor suggests pill1, which makes my child more lethargic. We try pill2 and the depression is gone.
I tell you that I am happy with the results of pill2 and that while I am interested in how it works and potential long-term consequences, I'm happy staying on pill2.
Is this imagined parent strictly better than a parent that refuses to try any pill until convinced of its mechanism and success? I would say yes.
Observation is the workhorse of the scientific paradigm, not idealized experimentation. The quantity, and thus value, of inductive evidence is greater than that of deductive evidence.
I suppose I'm advocating for prevention instead of treatment and what the ideal conditions would be for good mental health so that treatment could be avoided in the first place.
C'mon. The comment you're replying to with "makes sense" is about city living vs rural. You weren't commenting that it's unknowable. You stated that "unnatural lights" and "disrupting biological rhythm" (whatever that means) probably cause one's brain to go "haywire" (implication being: and commit suicide).
As if there aren't uncountable other variables that have changed dramatically over millions of years. Or hell, do you even know if suicide rates are higher today than they were millions of years ago? No. You came here to try to suggest city living was inferior to rural living, and used "millions of years of evolution", and some implication that modern rural life is somehow more similar to those millions of years than urban life.
"...makes sense that cramming into cities with unnatural lights and disrupting biological rhythm would make them go haywire"
Then what does this suggest? Why does "unnatural light" (ignoring that humans have built "unnatural light" fires for ~1.7 million years) make a brain go haywire, but not, say, clipping one's toenails? Or reading books, or sitting on chairs, or being blasted by radio waves, or running on treadmills, or traveling over 15mph, or wearing clothes, or flying in airplanes, etc. What are you saying?
As far as I know, rural and suburban dwellers do have lightbulbs, computers, televisions, cell phones, and this time of year, brightly lit christmas trees. Is your position that there are more light bulbs in cities than the countryside? Do you have any evidence that that's meaningful? Seems like a bit of a stretch to me, that the external lights—easily blocked by window curtains—have any meaningful impact on health beyond the artificial light sources we all deliberately use.
I made no implication that contemporary rural life is superior or in any way comparable to historical rural life.
Edit: I guess I'm being pedantic at this point. My personal position is that industrial agriculture is failing rapidly and that the modern world will have to reconcile with the fact that either we will all starve, or we will learn how to farm again. I spend half the year working on an organic farm, and my life is immeasurably better than when I'm in the city (which is, of course, only my personal subjective datum, and I realize how much HN hates anecdote as evidence).
We spend 17.4% of GDP on healthcare. The OECD average is 9.5%, and we don't have better outcomes to show for it. That's ~$1.2-1.7 trillion of waste annually that could be used to improve our society instead of lining crony pockets. Employer-provided healthcare is one layer of the Kafka baklava that obscures the cost of healthcare and prevents better cost controls and price discovery in America. It also makes employees more reliant on their company, literally for their wellbeing. Our economy is less dynamic as a result, so my $1.2-1.7 trillion estimate might understate the problem.
The obesity explanation doesn't pass the sniff test. Canada is 12% less obese than the U.S., but it spends 6% less of its GDP on healthcare. Put differently, if obesity was the reason for excess spending, the U.S. would save $1.1 trillion for every 12% of its population that is cured of obesity. If true, that would peg the marginal cost of obesity at $27,777 per person per year ($1.1 trillion / (0.12 × 330 million)), or 9x the annual salary of a doctor in Cuba. That is beyond the realm of believability, even if I introduce the other population-induced causal factors which you implied but didn't specify.
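As a rough back-of-the-envelope check of that figure (treating the $1.1 trillion, 12%, and 330 million numbers above purely as stated assumptions, not official data), a few lines of Python reproduce it:

    # implied marginal cost per obese person, using only the figures cited above
    spending_gap = 1.1e12     # ~$1.1 trillion of "excess" spending
    obesity_gap = 0.12        # 12-percentage-point obesity difference vs. Canada
    population = 330e6        # rough U.S. population
    obese_people = obesity_gap * population        # ~39.6 million people
    cost_per_person = spending_gap / obese_people  # ~$27,778 per year
    print(f"${cost_per_person:,.0f} per obese person per year")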
Additionally, the government would be more invested in the population's health under a single-payer model. It would actively work to reduce the prevalence of obesity and lower its costs. That would include taxing consumable goods with a negative health externality, commensurate with the magnitude of that externality. That would also include incentivizing the consumption and production of goods with positive health externalities and investing in pro-health infrastructure.
Imagine if a city faced the following math: "A network of bike lanes would cost us $40 million and $10 million to maintain over the next 10 years. It would also save around $50 million in health expenditures every 10 years. After one decade, it will cost $10 million and continue to save us $50 million." All the bike lanes you could dream of would be built overnight, assuming there would be subsidies by a M4A healthcare program. I'm more excited at the prospect of converting roads into pedestrian walkways and scooter highways. That wouldn't seem like such an expensive proposition if the government would recoup the cost in healthcare savings.
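Sketching that hypothetical budget (the dollar amounts are the illustrative ones above, not real estimates):

    # hypothetical bike-lane budget from the example above (illustrative numbers only)
    build_cost = 40e6                 # one-time construction
    maintenance_per_decade = 10e6
    health_savings_per_decade = 50e6
    for decade in range(1, 4):
        cost = maintenance_per_decade + (build_cost if decade == 1 else 0)
        net = health_savings_per_decade - cost
        print(f"Decade {decade}: net savings ${net / 1e6:.0f}M")
    # decade 1 roughly breaks even; every decade after nets ~$40M in savings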
> The obesity explanation doesn't pass the sniff test. Canada is 12% less obese than the U.S., but it spends 6% less of its GDP on healthcare.
It's my blog (RCA). My argument is that obesity substantially explains US health outcomes in relation to other countries. I never claimed obesity is the cause of high national health spending (as in, "inputs"). To the contrary, I have consistently argued US health spending is well explained by its wealth (technically income levels).
To a first approximation, national health spending is entirely explained by the average household income level in the long run. While time, healthcare technology, and other factors are associated with rising spending, these changes are ultimately very well explained by changing income levels.
Amongst high-income countries, a 1% increase in income is robustly associated with a long-run increase in health spending of about 1.8% (it's highly elastic).
The US spends more than Canada because it's still a much richer country (which isn't to say Canada isn't a nice place!).
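To put that elasticity in concrete terms, here's a rough sketch; the 25% income gap below is an illustrative placeholder, not an actual US/Canada figure:

    # income elasticity of ~1.8 implies: spending_ratio ≈ income_ratio ** 1.8
    elasticity = 1.8
    income_ratio = 1.25               # hypothetical: one country 25% richer than another
    spending_ratio = income_ratio ** elasticity
    print(f"Implied health spending ratio: {spending_ratio:.2f}x")  # ≈ 1.49x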
> That is beyond the realm of believability, even if I introduce the other population-induced causal factors which you implied but didn't specify.
Again, I never said this, but other population health risk factors such as age structure, disease rates, and the like are of negligible significance when it comes to long-run aggregate spending. Such factors may be highly predictive within countries and may have some say in the short run (within budgetary constraints), but in the long-run national picture the evidence suggests these factors amount to little more than noise. National household income levels trump everything.
> Additionally, the government would be more invested in the population's health under a single-payer model.
US government programs, namely Medicare and Medicaid, spend more on healthcare than most other high-income countries do in total (even more so comparing public-to-public). Just how much more incentive do we need before these magical effects kick in? Higher health spending predicts higher obesity rates in time series and cross-sectionally (though this is likely ultimately mediated by long-run income levels and by time).
Where is the evidence that these programs have large, sustained effects and are cost effective? Most data indicate these programs have negligible effects in the long run and they almost always cost more than they save (which isn't to say we shouldn't necessarily do it, but the economic rationale is v. weak).
Obesity is not an independent variable - socialized healthcare does a better job of controlling obesity with preventative health measures. As it is now in the U.S., patients only go to medical professionals when there is a problem, making obesity epidemics one of the effects of how U.S. handles healthcare.
Nothing is perfect, but most experts believe this has little to do with healthcare today, because healthcare interventions tend not to be effective causes of long-run weight loss and most countries aren't doing enough of the stuff likely to have large effects (e.g., surgical interventions) to explain much of the variance. Even if you could argue it might explain something, say 0.5 mean BMI points, other factors are clearly highly important. Cultural* and genetic factors are likely to play a significant role amongst high-income countries. Further, obesity rates rise with time and income levels despite higher health spending.
> socialized healthcare does a better job of controlling obesity with preventative health measures
evidence?
> As it is now in the U.S., patients only go to medical professionals when there is a problem
The US spends more on preventive medicine than almost any other country, though preventative medicine generally has very-small-to-modest effects on outcomes and rarely, if ever, saves money (usually quite the other way around)
~ RCA
note: * some of these "cultural" factors may be residual economic influences... the US escaped the Malthusian trap long before almost all other high-income countries, and this may have latent effects on attitudes towards food, diet, etc.
>"And we will have white/black lists for food. I don’t trust the same people who brought us the food pyramid and low fat as a reliable arbiter of what’s good and what’s bad."
It's telling that, whenever the government (rarely) enacts laws that tax or ban consumables with negative externalities, they actually target the right thing. After troves of empirical evidence, they heavily taxed smoking and banned trans fat (I'm aware of the government's misguided early endorsement of trans fat vs. saturated fat, but science has progressed a lot since then). Recently, local governments have tried to tax excess added sugar. That has been less successful, but it's guided by the right thinking. Excess sugar in our food supply is unequivocally, empirically bad. The government has less of a basis to tax it since it's not paying for all our healthcare, but that would change under M4A. Moreover, there would be more money behind nutritional/health research, because that research would have a more tangible payoff: an approximate dollar amount saved in public healthcare expenditures.
And we will have white/black lists for food. I don’t trust the same people who brought us the food pyramid and low fat as a reliable arbiter of what’s good and what’s bad.
The NHS/universal healthcare was introduced in Britain in 1948, after WWII. The NHS isn't perfect, but it's much better than the American system. If we want a more flexible healthcare model, we can have universal healthcare to cover necessary + preventative care, with the option to purchase supplemental insurance for expedited and/or non-medically necessary care. Several other countries have that model.
>"Do you think high-paying healthcare sector and patent R&D in the US creates more incentive for research and development of new medicine & healthcare technology?"
Yes, but there's a limit. Incentives have diminishing marginal returns. At the end of the day, a $2B pharma company is still going to doggedly pursue a 20 year monopoly on a potential $1B drug, even if it otherwise would have been a $1.5B drug if there was no Medicare for All. Moreover, the vast majority of waste in healthcare is with hospitals, administrators, surgeons, insurance companies, and doctors, not the pharmaceutical industry.
Pharma is closer to software in that one company can produce one product with zero marginal cost that can trivially serve everyone on Earth with a given condition. We can even leave Big Pharma as is and still realize hundreds of billions in savings, though I still believe that there should be some single-buyer negotiation for drugs. We can use empirical evidence to negotiate on drug prices without drastically changing the incentive scheme. Ultimately, I believe pharma companies would increase prices abroad if we implement price controls in the U.S. The U.S. is subsidizing the world's healthcare.
>"Every hospital I've been to seemed horribly understaffed.. and I'd be afraid of an underpaid surgeon. Not sure what waste you're referring to"
Doctor/surgeon labor scarcity exists for the following reasons:
- Medical associations lobbied the government to restrict residency positions a long time ago, and continually lobbied to keep them down until just recently when the shortages have become too obvious. They were even warning of an impending doctor surplus in the 90's. Ya, right.
- Medical associations and med schools have been smart about restricting the supply of doctors through our med school network and excessively tedious licensing system.
Additionally, the other OECD countries I've referred to have similar health outcomes for a much lower % of GDP. Clearly, universal healthcare did not worsen their populations' health. If someone wants quicker healthcare, I'm almost certain the U.S. would allow supplemental insurance to get that hip replacement in 2 weeks instead of 6 months.
Thank you for all the replies, I have learned a lot. It seems like there are dozens of issues that need to be fixed in tandem. Meanwhile, I will stay as healthy as I can because that seems like the best plan for the moment
> At the end of the day, a $2B pharma company is still going to doggedly pursue a 20 year monopoly on a potential $1B drug, even if it otherwise would have been a $1.5B drug
The problem is they don't know it's a $1B drug instead of a dud until they've actually done the research, and most of the candidates fail. And since the successes have to cover the failures, if the successes make less money, they can't cover as many failures and you don't get as many attempts.
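A toy portfolio example makes the point; all of the numbers here are hypothetical:

    # toy pharma R&D portfolio (hypothetical numbers)
    candidates = 10
    cost_per_candidate = 100e6   # R&D cost per candidate, successes and failures alike
    successes = 1                # only one candidate becomes a marketable drug
    total_rd = candidates * cost_per_candidate
    breakeven_revenue = total_rd / successes
    print(f"The one success must earn at least ${breakeven_revenue / 1e9:.1f}B "
          f"just to cover the whole portfolio's R&D")
    # cap the winner's revenue, and fewer failed attempts can be financed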
> Moreover, the vast majority of waste in healthcare is with hospitals, administrators, surgeons, insurance companies, and doctors, not the pharmaceutical industry.
It is certainly a multifaceted problem and there is plenty of inefficiency that could be improved. Not just healthcare, but also the plague of zoning rules that inflate real estate costs in cities. Which is where hospitals are for legitimate reasons, but hospitals not only need a lot of real estate, they also then have to pay salaries there that allow their (already expensive) staff to live within reasonable distances.
And the subsidy issue isn't just drugs, it's also technology. A lot of the "hospital" cost goes to equipment, which is the same subsidization of international R&D as drugs -- other countries with price controls not paying their share of the cost.
Which is why cost comparisons to socialized systems in other countries are so uninformative. Not only are they not paying their share of R&D, they typically have lower real estate costs, lower salaries across all industries, lower (and this one surprises a lot of people) taxes if you count "health insurance" as a tax, and it goes on.
There is a lot of pure inefficiency in the US healthcare system -- the level of bureaucracy is madness -- but a lot of its costs are also external to the system itself and symptomatic of healthcare being at the intersection of several independent sources of price inflation that each have to be addressed on their own terms.
Do you think high-paying healthcare sector and patent R&D in the US creates more incentive for research and development of new medicine & healthcare technology?
> We spend 17.4% of GDP on healthcare. The OECD average is 9.5%, and we don't have better outcomes to show for it.
The problem is we do have "allowing drug companies to keep researching new drugs" to show for it. We're subsidizing the rest of the world because they impose price controls on patented medications. We could do the same thing, but then where does the money to do the R&D come from?
People like to point out that they spend more on advertising than research, but the advertising generates more revenue than it costs or they wouldn't do it, which means without the advertising they would have less money for research.
It should come from other countries who have been free riding with price controls, but how do you get them to do that? The status quo is giving them a trillion dollar a year subsidy.
Pharma companies explain a relatively small amount of excess spending in the U.S. We can even change nothing about how we pay for drugs and still reduce a majority of the $1.2-1.7 trillion in healthcare waste (though we should still try, and let drug companies increase prices elsewhere). I discuss this a bit more below, ctrl+f monopoly
I saw your comment. Clearly, there is a lot of waste, and we both agree on that front. As for exogenous costs, the U.S. isn't the only country with expensive real estate. Almost all of the other OECD countries have a modest degree of real estate cost inflation. Similarly, most should exhibit a similar degree of adherence to the Baumol Cost Disease phenomenon. There is no reason that the U.S. healthcare industry should have a 50-70% higher magnitude of exogenous cost disease. Lastly, I already addressed direct R&D funding (my analogy was simplified, but it could be extended to a portfolio of drugs). However, you mentioned that U.S. purchasing of equipment and drugs disproportionately funds R&D activity. I'm sure that's true, but I see it as a problem to be solved rather than a fact of life. We should adopt universal healthcare just like almost every other developed nation, implement measures to mitigate incentive loss, wait for U.S. medical companies to renegotiate pricing with other countries, and, if incentives are still lacking, we can deal with that then. Surely there's enough money among all developed nations to more than pay for an adequate level of medical R&D.
Somewhat related, but Jennifer Doudna, a government employee, co-discovered CRISPR. CRISPR will prove to be one of the biggest step changes in health outcomes in the history of mankind, or, at some point, supermankind. Now hundreds of pharma companies will try to monetize on the government’s discovery: CRISPR for sickle-cell anemia, CRISPR for congenital retinal defects, CRISPR for lactose intolerance, etc... Should we have to reimburse drug companies for the value of the drug, or should we, recognizing the government’s contribution and the immense value of life, put a reasonable cap on reimbursement? I say the latter. A company developing CRISPR drugs is on record saying they plan to charge over $100,000 for their treatment. I’m not convinced that the drug would not have been developed if they stood to make much less than that per person. We trust the government to grant 20 year monopolies on drugs, and I believe we can also trust the government to reasonably modulate drug reimbursement without ruining incentives for development.
> We spend 17.4% of GDP on healthcare. The OECD average is 9.5%
Health spending is almost entirely explained by income levels, especially in the long run. The US spends much more because the US is much richer than most and because health spending is highly elastic at a national level.
> and we don't have better outcomes to show for it
Norway and Luxembourg also spend 2x Spain and Italy and don't have more to show for it either despite the fact that they're also much richer, have larger welfare states, etc.
Countries increase health spending because they can, not necessarily because they need to. Evidence strongly suggests returns to health spending are falling everywhere and the US isn't particularly unique in this regard.
Your findings are not mutually exclusive with other theories about the underlying causes of high U.S. healthcare expenditure, such as a chronic lack of price transparency. The more money is in the pot, the greater the incentives to pilfer it, and, in lieu of adequate controls, the more it will be pilfered. The way I see it, your linear regression is between income and aggregate pilferage across all layers of the healthcare establishment (Kafka baklava :) ). I surmise that the accelerating nature of health expenditures (1.8% increase for every 1% increase in income) is due to the rapid inflation of disposable income relative to overall income at the higher levels. Disposable income and the saved wealth accrued from higher disposable income over time are more readily pilfered. I should say “otherwise disposable,” because the healthcare industry takes a progressively larger chunk of that and makes it de facto indisposable. I say this with no irony: a linear regression between income and amounts extorted during kidnapping, controlling for other variables, would show similar results.
> Your findings are not mutually exclusive with other theories about the underlying causes of high U.S. healthcare expenditure, such as a chronic lack of price transparency.
My findings strongly militate against the notion that high US health spending is a product of idiosyncratic features of our system. Presumably most of these critics believe we'd spend much less if only our system looked more like other countries', but my evidence indicates we'd spend very similar amounts in the long run regardless. Further, we'd likely have similar outcomes and many other similar healthcare attributes (prices, intensity, health worker density, etc). Most of the things about US healthcare people believe to be important and unique (i.e., not explained by income) just aren't.
> The more money is in the pot, the greater the incentives to pilfer it, and, in lieu of adequate controls, the more it will be pilfered
I wouldn't argue there's no "pilfering" or that more money doesn't create more opportunity for this, but the high income elasticity likely has little to do with pilfering. Where are these ill-gotten gains going? I certainly don't think you'd have much success in showing this if you look at, say, the growth in physician incomes, pharma/biotech industry profits, and so on. The data are much more consistent with mundane explanations like rising technological sophistication, higher intensity, and so on (ultimately much of this being driven by some combination of patient/family demand and providers' "spare no expense" approach to caring). I mean, if you look at the economic data it's quite obvious most of the increase can be arithmetically attributed to a swelling of health workers (density or share of workforce) and that most of these workers have lower-to-middle income levels (especially on the margin).
> I surmise that the accelerating nature of health expenditures... is due to the rapid inflation of disposable income relative to overall income at the higher levels.
I'm not sure what you mean by this exactly, but a better, more parsimonious way to understand high income elasticity is that higher-income countries are usually inherently more productive countries. We can spend substantially smaller shares of our income on food, clothing, shelter, and other "necessities" because we are able to produce these things so much more efficiently (or otherwise procure them on the market) than we did decades earlier or than OECD countries of significantly more modest income levels, while still consuming more of these things in real terms. This frees up resources to be spent on higher-order wants like health, education, recreation, culture, and so on. Many of these growth areas, meanwhile, are inherently subject to less productivity growth, meaning prices tend not to fall relative to incomes nearly as quickly as we observe in other sectors.
> I say this with no irony: a linear regression between income and amounts extorted during kidnapping, controlling for other variables, would show similar results.
I doubt that's true, though we're talking about the total spending (kidnapping ransom) per capita here. It's pretty clear the price per transaction (as in, health inflation) explains very little in US time series or cross-sectionally. Now maybe if kidnapping started to become a high amenity affair in developed countries (presuming this sort of thing happened with measurable frequency here) and 25% of the population worked delivering these services.....
There wasn't much to model. Complex health insurance systems were in their infancy during that period. People would sometimes pre-pay a local hospital to reduce costs, etc.
The employer-sponsored model of the US wasn't that bad during the 1940s, but it was clearly obsolete by the 1960s. What keeps it alive, ironically, may be Medicare: it removed the most expensive pool of patients (seniors) from the risk pool. This staved off the government needing to come in and heavily subsidize private insurance and set up nation-wide care provider networks.
Countries with private insurance industries that didn't bifurcate the risk pool in this manner ultimately did a better job of controlling costs for everyone.
Medicare itself is a bit of a compromise: it was supposed to be for everyone, but it ultimately just became for seniors. And while Medicare for All is back in the zeitgeist today, it was a consistent policy plank until the 80s, with the more conservative position being an approach where private companies would just sell Medicare coverage (effectively creating a system like Switzerland or Germany, minus the public care providers).