Porsche Open Source Platform (porsche.com)
169 points by rettichschnidi on Nov 12, 2023 | 144 comments


Even if this is just a small step, and essentially nothing compared to software giants like Google, I consider it a good sign that such traditional manufacturers are taking their first humble steps in a good direction.


Looks like a Vercel error, probably Next.js?


Slightly OT, but Porsche has recently sold itself to me as a great carmaker when they released a new infotainment system for their 20-year-old Cayenne.

Compare that to another Stuttgart carmaker like Mercedes-Benz, where half of the stuff released doesn't work anymore a few years later; it's a night-and-day difference.


Porsche cares quite a lot about heritage. They have a reputation for treating all Porsches, no matter their age, the same. I'm a pretty big fan, though I don't own one of their cars quite yet. Some day!


My daily driver is a 1986 911 Turbo (aka the 930). That's a nearly 40-year-old car, and they make a similar new infotainment system that works for even older cars. The thing runs like a dream, and they tried very hard to make it "fit" in the old interior even though it clearly had completely different design constraints.


Interesting! Wonder what it cost. The amount of non-standard wiring that goes into those is insane, two connectors with an absolute mass of wiring tied into just about every part of the car. I've installed an aftermarket unit in a vehicle like that and it was quite the adventure in undocumented interfaces land. All is well that ends well though.



More info: https://newsroom.porsche.com/en_AU/2023/products/porsche-cla...

> A corresponding product range is now available, with immediate effect, for the sixth-generation 911 generation 997 for model years 2005 to 2008, the early Boxster and Cayman generation 987 for model years 2005 to 2008 and the first-generation Cayenne for model years 2003 to 2008.


Considering it's Porsche, that is surprisingly affordable.


> Slightly OT, but Porsche has recently sold itself to me as a great carmaker when they released a new infotainment system for their 20-year-old Cayenne.

They take their older cars very seriously.

AFAICT Porsche is the only manufacturer in the world to offer an extended manufacturer warranty that can be renewed up to 15 years or 125 000 miles (200 000 km). And it's the mileage at which you last subscribed that counts: if you renew the warranty at 124 000 miles (199 000 km), you're still good to go for a full year (I know: I asked at two different dealerships).

You have to pay for it, but it's a bumper to bumper warranty covering just about everything that could go wrong.

I buy used Porsches that I first make sure qualify for that extended warranty, and then I keep renewing it.

I've had my MY2013 Panamera for five years; it's now 10 years old and has 100 000 miles (160 000 km), and I just renewed the extended manufacturer warranty for another two years.

I pay a bit less than 100 EUR / month for that warranty, and in exchange any warning light that shows up (besides the car being low on fuel) is no longer my problem but the nearest official Porsche dealership's.

I got super unlucky and had the gearbox fail: this happens extremely rarely but my Panamera was from a bad batch. I had the extended manufacturer warranty so I dropped the car at a dealership, got another Porsche from them for 5 days, then got my Panamera back with a new gearbox.

Besides that: a faulty sensor here or there, the wiper motor failing, and a few other silly things like that over the course of 5 years / 100 000 km, all covered by the extended manufacturer warranty. I'd say the car went in under warranty about six times over those years (for tiny stuff besides the gearbox). I'll go in again soon, for the trunk-release button on one of my two remote key fobs isn't working anymore: I'll get a new key.

Recently (and I honestly don't know if I should laugh or cry) a friend of mine had a Peugeot 208 he bought new fail badly after 27 months, and the dealership told him the warranty was two years only and that there was nothing they could do to fix the car. I verified online and it's true: in Belgium, on a brand-new Peugeot, you get only two years of warranty.

I mean: I bring my, what, 120-month-old Porsche, bought used, in under warranty and any issue is fixed for free (well, I pay for the warranty), while my friend brought his 27-month-old Peugeot and they gave him the finger! 120 months vs 27 months. It's both sad and funny.

A 15-year extended manufacturer warranty is just insane.

Next car is another used Porsche, also with that extended Porsche warranty.

P.S.: a simple "mod" on mine, which runs the QNX OS for its infotainment system, is to replace the HDD (!) with an SSD so the system "boots" faster (quicker access to nav and songs when starting the car). But although I could do it, I don't: that would void my extended manufacturer warranty!


Beware of the engines with aluminum stretch bolts. Those are a real problem, and Porsche has been known to stiff people with leaking valve seat covers caused by these, even though it is clearly a manufacturing defect.


Agreed. I'm not a Porsche guy myself, but I really enjoy seeing older ones still running and well maintained.

I guess for a vehicle like a Porsche it's kind of tough to get parts from a pick-and-pull, so it's in the company's best interest to help owners out.

Meanwhile, my VW is easy to get pick-and-pull parts for.


> really enjoy seeing older ones still running and well maintained

I enjoy that too, but it comes at a cost for which you could easily run a more recent vehicle.


Wow. That is extremely impressive.


I like the basic idea, but unless Porsche moves away from manufacturer lock-in for all of the software on board its vehicles, including the inability to inspect and/or repair the underlying hardware, not much of use will come of efforts like this. It's like so many other brands that claim to love Open Source Software for marketing purposes but continue to refuse to open up the key components of the software that they ship.

Porsche could make some real headlines by opening up their ECU code and the code that drives their infotainment systems, would be nice if it was accompanied by schematics and the tools required to read-out and re-program the hardware.

Fat chance that will ever happen because 'safety', 'environment' and a bunch of other fig-leaf excuses.


Neither safety nor environment is something you can easily wave away like that. Also, you're completely missing security and legislative concerns.

I have worked in the automotive embedded software industry since 2009 and ended up on the safety track in my career. It's a strange place to be, because the basics are extremely simple, yet it takes hundreds if not thousands of man-years to get a modern vehicle reasonably safe just in terms of the electrical system (which, in automotive terms, includes the software). There are so many ways to make a mistake that could easily result in an accident. Even the window regulators have non-trivial implementation concerns for anti-pinch. Allowing a random hacker to override this is a terrible idea. Now imagine what kind of mess you could make with brakes and steering...
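To make the anti-pinch point concrete, here is a deliberately oversimplified sketch in C. All names, the threshold, and the signature are invented for illustration; a production implementation also has to handle sensor faults, debounce, thermal drift of the motor current, and per-vehicle calibration:

```c
#include <assert.h>
#include <stdbool.h>

/* Hypothetical anti-pinch check: reverse the window motor when the
 * closing current spikes before the glass has reached the end stop.
 * The threshold and state names are illustrative, not from any real ECU. */
typedef enum { WIN_CLOSING, WIN_OPENING, WIN_IDLE } win_state_t;

#define PINCH_CURRENT_MA 3500 /* illustrative stall-current threshold */

/* Returns true if the regulator should reverse to release a pinch. */
bool anti_pinch_should_reverse(win_state_t state, int motor_current_ma,
                               bool end_stop_reached)
{
    if (state != WIN_CLOSING)
        return false;          /* a pinch only matters while closing      */
    if (end_stop_reached)
        return false;          /* high current at the door seal is normal */
    return motor_current_ma > PINCH_CURRENT_MA;
}
```

Even this toy version shows why the feature is non-trivial: get the end-stop handling wrong and the window either traps a hand at the seal or never fully closes.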

Designing a vehicle to be hackable will very likely lead to an unsafe vehicle.

I believe what I just wrote applies similarly for security too.

Furthermore, before releasing software to the market, extensive testing is carried out by an independent body to ensure that legislation is followed. Even conceivably simple things such as lighting or headlamp beam alignment are a pretty large problem domain by themselves. So are the communication standards for diagnostics alone.

I would say that large changes would be required to transform this industry. In some protected domains there is use of open source, such as Qt/Linux for HMI, but opening the HMI to be fully hackable is unlikely to happen. There is quite some liability in making the HMI non-distracting.


Oh dear, I wonder how I'll ever be able to use the code I wrote over the years that controls uncounted lathes, mills, plasmacutters, lasers and a whole raft of other industrial tools.

Obviously the only people that can be trusted with our safety are the manufacturers, because the people whose lives are on the line are irresponsible madmen.

> Designing a vehicle to be hackable will very likely lead to an unsafe vehicle.

Vehicles are hackable, but they're not documented which makes them more dangerous, not less dangerous. Witness comma.ai and others.


> I wonder how I'll ever be able to use the code I wrote over the years that controls uncounted lathes, mills, plasmacutters, lasers and a whole raft of other industrial tools.

You are knowledgeable enough to make them work. Many aren't. Some can't be. Hacking requires knowledge and skill, and most importantly, being contained. Cutting yourself with your self-programmed hackable laser in your garage is unfortunate, but cutting other people is a disaster you can't afford.

> Vehicles are hackable, but they're not documented which makes them more dangerous, not less dangerous. Witness comma.ai and others.

I see two points here.

1. Security through obscurity is bad. That's true, but we have "business" in the play, so that's how it goes. Maybe push for better regulation.

2. comma.ai, an "autopilot", based on reverse engineering, or as you put it, the base product "not documented", thus makes it "more dangerous". No, it's dangerous not because the base product is not documented, but because there's no real autopilot at the moment, and comma.ai is irresponsibly advertising as being able to "drive for hours without driver action". There are many "black box" products with a ToS that forbid reverse engineering. Does that make the product inherently more dangerous too?

Besides, you seem to suggest that with open products people cannot make things less safe. That's not true. Some don't know what they are doing when they "hack" things.


I'm all for open things, but that's a false equivalence. You don't use those tools on a public road around unsuspecting others.

In the same way you can't just merrily hack about with a plane. The FAA don't really care that much if you die in your experiment. They do care if the burning wreckage falls on someone minding their own business.


No, Jacques and I hand those things off to random unskilled laborers who come work at a robot cell or CNC for $15/hr without any real knowledge of the code and safety standards. "Push the green button, trust us, it's safe."

FAA regulations for experimental or kit aircraft are wide open. So are automotive standards; I just helped my buddy get his sand rail DOT-certified for use on the road. A tube chassis, a '65 Volkswagen rear end and VIN, safety glass, a manually-squeezed washer bottle and manually-swung windshield wipers (for the DOT guy, then never used again), un-boosted non-ABS brakes with a cutting brake in the loop, and it's cleared for use on the Interstate if we were crazy enough to do that. It's only the manufacturers who claim they're protecting the public by locking down their designs.


Road approval looks very different for mass-produced vehicles. It is not an easy thing to pass.


And what makes you think that the current crop of automotive software written in either asm or unsafe C is going to be any better than what you or I would produce? I've had a very recent model Mercedes C-class nearly kill me twice on account of buggy software. So much for that 'stellar' (pun intended) reputation. My current car is as dumb as it possibly could be.

I'd expect that if any ECU software was to be released that we'd finally realize how bad things really are and that there would be a massive amount of work done on making sure these pieces of critical software would be as safe as they could possibly be.

Note that the norm is 'a subset of C deemed to be safe' but that what I've seen of such development would not pass my personal threshold for quality work. In fact, rather the opposite. On the plus side, the hardware people usually know their stuff and realize what is dangerous to pass to the software people so with some luck your vehicle will use an FPGA for any kind of really safety critical stuff (or processors embedded with the relevant hardware, such as ABS and so on).
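The "subset of C deemed to be safe" typically means MISRA-C-style coding rules. As a toy illustration of that flavor (paraphrasing the general style rather than quoting any actual rule, and with an invented function and scale factor): fixed-width types everywhere, no implicit narrowing, explicit range checks, a single effective result path:

```c
#include <assert.h>
#include <stdint.h>
#include <stdbool.h>

/* Illustrative "safe subset" style: convert a raw 32-bit counter value
 * to an rpm reading, with an explicitly checked narrowing conversion
 * instead of a silent truncation. Names and the /4 scaling are made up. */
bool scale_rpm(uint32_t raw_counts, uint16_t *out_rpm)
{
    bool ok = false;
    uint32_t scaled = raw_counts / 4u;   /* raw sensor counts -> rpm */

    if (scaled <= (uint32_t)UINT16_MAX) {
        *out_rpm = (uint16_t)scaled;     /* explicit, range-checked cast */
        ok = true;
    }
    /* single return point, as many such rule sets prefer */
    return ok;
}
```

The point is not that this code is clever; it's that every conversion and failure path is visible to a static analyzer, which is what these rule sets optimize for.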


> I'd expect that if any ECU software was to be released that we'd finally realize how bad things really are and that there would be a massive amount of work done on making sure these pieces of critical software would be as safe as they could possibly be.

toyota/denso michael barr testimony cough

edit: oh, there's even slides now, you don't even have to read the court transcript https://www.safetyresearch.net/Library/BarrSlides_FINAL_SCRU...


Yes, that's one of the better known cases. But I've looked at similar stuff under NDA and I'm more than a little bit uncomfortable with some of the stuff that I've seen.

The scariest kind of developer to me are the ones that believe that they and only they are able to create safe software without outside scrutiny of what they produce.

Of course that scrutiny would have exposed emissions cheating software right on day #1.


This comment chain appears to have a fundamental misconception of what constitutes safe and what does not.

Automotive standards and automotive coding standards approach safety in a different way than most people think (and given your comments I would say this includes you). If you're curious, you can have a look at some rules to evaluate automotive code that are published here: https://github.com/github/codeql-coding-standards

In short, the rules do not aim to eliminate failures or crashes, but rather to make a crash predictable and uniform when it occurs so that it can be dealt with. This is further complicated by where and how the automotive manufacturer chooses to implement safety controls. It is entirely possible to have a bunch of unsafe code running somewhere on a car, and simply have a small safety shim around said code that prevents it from impacting the safe operation of the vehicle.
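A minimal sketch of such a shim, with all names and limits invented for illustration: the qualified wrapper clamps whatever the unqualified inner controller requests to a safe envelope, and substitutes a safe default if the inner code stops responding:

```c
#include <assert.h>
#include <stdbool.h>

/* Hypothetical safety shim around an unqualified torque controller.
 * The envelope limits and heartbeat mechanism are illustrative only;
 * a real shim would be developed and certified to the relevant ASIL. */
#define TORQUE_MAX_NM   250
#define TORQUE_MIN_NM  (-250)
#define SAFE_DEFAULT_NM   0

int shim_torque_request(int requested_nm, bool inner_heartbeat_ok)
{
    if (!inner_heartbeat_ok)
        return SAFE_DEFAULT_NM;   /* inner code unresponsive: safe default */
    if (requested_nm > TORQUE_MAX_NM)
        return TORQUE_MAX_NM;     /* clamp implausibly high requests */
    if (requested_nm < TORQUE_MIN_NM)
        return TORQUE_MIN_NM;     /* clamp implausibly low requests */
    return requested_nm;          /* within envelope: pass through */
}
```

Only the shim has to be qualified; the code behind it can then be as messy (or as open) as the vendor likes, because its outputs can never leave the envelope.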

With that in mind, let's take the example that you use here of emissions cheating software. Emissions is likely not considered safety relevant (it might not even be QM, it just might be some code) and so no safety requirement applies to it. So, no real scrutiny would happen there regardless, at least from a safety perspective. See, validating that software passes a particular safety certification is time and money intensive and manufacturers therefore keep the amount of code that they qualify as safe to a minimum. This means as an example that the infotainment systems of many manufacturers are not safety relevant and no safety function should exist on or interact with them.

A few other things to consider from other threads:

- Tesla doesn't necessarily follow or adhere to safety standards. They (Tesla) are explicitly non-compliant in some cases, and this is partially why there are investigations into their practices.

- Industrial robotics code is just as bad if not worse than most automotive software from what I've seen. As you note, it's just that these robots are not under manual control.

- None of this prevents the software from being open source. There are plenty of safety qualified open source projects. This simply limits who can contribute and how contributions are managed. The main reason why many things in automotive are not open source is that the ECU manufacturer isn't interested in doing so, and the Tier 1/2/3 that does the implementation is even less so.


There is a difference between what I with my layman-user cap on would consider unsafe (say, releasing purposefully harmful software into the environment or software with bugs like phantom braking) and what I with my embedded software programmer cap on (machine control, aerospace related) would consider to be unsafe. I spend a lot of my time reading root cause analysis on software failures and my only conclusion is that this is a very immature industry with a lot of players that should probably not be trusted as much as we do.

As for safety shims: watchdog timers, for instance, are often used for this purpose, to bring a system exhibiting buggy behavior back to behaving more predictably. Personally I would consider any watchdog expiry that was not directly related to a hardware fault (say a bitflip) a failure, and I'd like my car to report such failures.

Tesla being non-compliant is precisely my point: they get away with that because they don't have to open up their code to scrutiny. But I'll bet that if they would that their marketing department would not be able to spin stuff the way they do at present.

All of those would take the relevant context into account, and I think that's where things go off the rails here: embedded software developers do not get to claim the high ground on what counts as safe or unsafe, given the code from that world that I've had my eyes on. If anything, it's amazing that it works as well as it does; the only two industries that get a pass based on my experience so far are medical embedded and avionics. Everybody else has the exact same problems as any other software project and would benefit from opening themselves up to scrutiny.


> Tesla being non-compliant is precisely my point: they get away with that because they don't have to open up their code to scrutiny.

This is an inaccurate assumption. ASIL compliance is something that you can publicly state, and Tesla explicitly does not. Most automotive products that do follow such standards generally state such. (Example: https://www.aubass.com/products/cp_top.html - search for ASIL D)

Making something open source does not in any way make it safe, unless your enforcement plan is having lawyers look into it and in which case you end up with lawsuits and likely endless litigation that results in a closed platform again. Tesla's calculation is that no one will enforce safety controls or standards compliance on them, and to be fair to date they (Tesla) are right.

> Personally I would consider any watchdog error that was not directly related to a hardware fault (say a bitflip) as a failure and I'd like to have my car report such failures.

This is an opinion and does not properly account for the multitude of scenarios that must be dealt with in automotive. Automotive, unlike aerospace, rarely has failover hardware or in some cases even A/B partitions on the disk for failover in software. Add in the need to process signals in real time and you will encounter situations where use of a health check (watchdog timer) is an appropriate response, and the use of one should not need to be reported.

> embedded software developers do not get to claim the high ground around what they consider safe or unsafe given the code from that world that I've had my eyes on.

They (embedded software developers) don't make such claims; when something is safety certified, an external third party does the validation and verification and asserts that an implementation and processes are either safe or not. In the EU this is usually done by TUV (https://www.tuv.com/world/en/) or Horiba-Mira (https://www.horiba-mira.com/).

This gets extremely complex as this is often hard tied to the hardware (support for a safety island, how memory is managed on the SOC, etc) and the overall E/E architecture (selected messaging protocol for the backbone) and layout of the vehicle. Analyzing a system of systems to determine all the possible impacts and make sure that the chance of failure is small enough to be acceptable is a hard problem to solve and not one any single engineer does.


>> Tesla being non-compliant is precisely my point: they get away with that because they don't have to open up their code to scrutiny.

> This is an inaccurate assumption. ASIL compliance is something that you can publicly state, and Tesla explicitly does not.

Sorry, but voluntary standards aren't standards. Tesla is non-compliant with ASIL; if they were compliant they'd definitely say so, so you may as well assume they're not. Personally I think any party that doesn't bother to state compliance should simply not be allowed to ship a vehicle, because consumers are not going to be aware of the differences.

> Making something open source does not in any way make it safe, unless your enforcement plan is having lawyers look into it and in which case you end up with lawsuits and likely endless litigation that results in a closed platform again.

It does not guarantee safety. But it does more or less rest on the assumption that over the years at least some safety-related bugs would be found. If there is anything we've learned from open source over the last couple of decades, it is that if you look long and hard enough at even the most battle-tested codebases, you will uncover an everlasting stream of bugs, with the frequency reducing over time.

>> Personally I would consider any watchdog error that was not directly related to a hardware fault (say a bitflip) as a failure and I'd like to have my car report such failures.

> This is an opinion and does not properly account for the multitude of scenarios that must be dealt with in automotive.

Yes, it is my opinion, as a software developer of many decades, that if you rely on your watchdog timer to keep stuff running outside of exceptional cases, you are doing it wrong. Imagine driving on the highway with one of your mirrors wedged against the guardrail for a close analogy of how I see this kind of 'engineering' practice.

A watchdog timer is the equivalent of Ctrl-Alt-Del for when something stops working. It is better than nothing and should definitely be present, because it is still preferable to a system that is no longer responding at all (which is certainly a safety issue), but it should not be relied on for normal operation.

> Automotive, unlike aerospace, rarely has failover hardware or in some cases even A/B partitions on the disk for failover in software.

That's a cost decision, and with the cost of computation these days it is also absolute nonsense. A case could be made for this in the 80's but with hardware costing pennies this is simply no longer a valid excuse.

> Add in the need to process signals in real time and you will encounter situations where use of a health check (watchdog timer) is an appropriate response, and the use of one should not need to be reported.

I've been writing real time applications for a very long time and I highly doubt that such situations occur regularly but I'm open to having my mind changed, can you please explain exactly what kind of situation you have in mind where you think a watchdog timer expiring is an appropriate response?

For me a watchdog expiry spells: the situation is such that we can no longer function reliably, so the safer option is to start over from a known set of defaults. It says that something unexpected has occurred, causing an operation that should have completed not to complete, and that this is outside the design parameters the software was originally specified with, indicating that most likely the controller itself is at fault (and not the peripherals it is attached to).
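The "watchdog as last resort" discipline described here can be sketched like this: the timer is kicked only when every task in the control cycle has checked in, so an expiry always means something is genuinely stuck. All names are illustrative, and `wdt_kick()` stands in for whatever the MCU's actual watchdog register write would be:

```c
#include <assert.h>
#include <stdbool.h>

#define NUM_TASKS 3

static bool task_done[NUM_TASKS];   /* check-in flags, one per task */

void task_checkin(int id) { task_done[id] = true; }

/* Called at the end of each control cycle. Returns whether the watchdog
 * was kicked, i.e. whether every task checked in this cycle. */
bool end_of_cycle_kick(void)
{
    for (int i = 0; i < NUM_TASKS; i++)
        if (!task_done[i])
            return false;        /* something is stuck: let the dog bite */

    for (int i = 0; i < NUM_TASKS; i++)
        task_done[i] = false;    /* re-arm the flags for the next cycle */

    /* wdt_kick();  hardware-specific register write, omitted here */
    return true;
}
```

With this structure, the watchdog never papers over a stuck task during normal operation; it only ever fires as the genuine last resort the comment argues for.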

>> embedded software developers do not get to claim the high ground around what they consider safe or unsafe given the code from that world that I've had my eyes on.

> They (embedded software developers) don't make such claims; when something is safety certified, an external third party does the validation and verification and asserts that an implementation and processes are either safe or not.

Yes. And that process is anything but perfect. I've seen plenty of code that had passed certification and was so buggy it wasn't even funny. Including automotive. In one extreme case someone thought it perfectly OK to do an OTA update on a vehicle in motion. I kid you not.

So let's not pretend certification is bulletproof. Even if it is useful, it can miss glaringly obvious errors (time pressure, checkbox mentality).

> In the EU this is usually done by TUV (https://www.tuv.com/world/en/) or Horiba-Mira (https://www.horiba-mira.com/).

My experience is limited to the former. Let me recap that: I think their intentions are good, but the bulk of the testing is limited to black-box testing rather than in-depth review and formal guarantees around performance. This has some interesting effects: it concentrates on the external manifestations of whatever makes the box tick, and as long as the test parameters are exhaustive this works very well. But for any device complex enough that the test parameters only cover a fraction of the total parameter space, you may end up with false confidence.

> This gets extremely complex as this is often hard tied to the hardware (support for a safety island, how memory is managed on the SOC, etc) and the overall E/E architecture (selected messaging protocol for the backbone) and layout of the vehicle.

Yes, again, I'm familiar with this and have some (but not complete) insight in how TUV operates when it comes to vehicle and component certification.

> Analyzing a system of systems to determine all the possible impacts and make sure that the chance of failure is small enough to be acceptable is a hard problem to solve and not one any single engineer does.

I think this is fundamentally borked. It will always be time and budget limited. Case in point: I recently reviewed some vehicle related stuff that had already been TUV certified that contained a glaring error in a complex control system, just looking at it from the outside gave me a fair idea of what I had to do to trip it up and sure enough it failed. TUV should have caught that (and the manufacturer too) if they were as safety conscious as they claim to be. I'm not saying that I'm outperforming TUV on a regular basis, I'm just saying that opening up this kind of code to more eyes, especially those that are more creative when it comes to breaking stuff, can - in my opinion - only be beneficial.

Edit: some more thinking about this: I think one of the reasons I'm quite skeptical about, for instance, TUV is that in most countries with large car manufacturers those manufacturers are 'too big to fail', and I would not be surprised at all if TUV (like BaFin) is not in a strong enough position to fail, say, a product line of a major manufacturer even if they find a massive error. It would immediately become a political football, and in practice this gives manufacturers a lot of benefit of the doubt with respect to self-regulation, besides the fact that such oversight entities are usually understaffed. TUV may well have the best of intentions, but the fact is that VW managed to bamboozle them in a way that any serious code audit, including reproducible builds and something to verify that that was indeed what was shipped to customers, should have caught.

But I don't see a massive undertaking to put all VW (and other manufacturers') code through the wringer beyond what was already uncovered, simply because the only effect of uncovering such a scandal would be to discredit the German car industry even further. So I don't think anybody is looking too hard.


So, this will be my last response to this thread as I think it's run its course.

> voluntary standards aren't standards

Most of the world's standards work this way. They are standards, and it is up to various legislative bodies to decide how to enforce them. In automotive, compliance with a standard is generally attested to a government and included in the package that is shared with other governments to allow import or sale of the car in their country. Tesla simply flouts that.

> safety related bugs

This kind of thing isn't a thing if you understand automotive safety, or shouldn't be. You should have sufficient safety controls such that an unsafe condition will not occur. If this is a thing, you're talking about a bug then in the applied safety mechanism that allows an escape.

> watchdog timer expiring is an appropriate response?

Keys for SecOC get out of sync and throw an error. Not a safety problem per se, but as part of your health check (since I consider watchdog timers an implementation of health and state management), you'd trigger a restart of the software to resync the keys.

> pretend certification is bullet proof even if it is useful it can miss glaringly obvious errors

I don't, but when it works it is sufficient. Open-sourcing something adds nothing when it works. Importantly, TUV usually assumes liability for the things they certify (not always, but generally that is how it works).

> limited to black box rather than in depth review and formal guarantees

We get the latter at my place from them, so I would poke at this area more if you think it's black box only. This likely depends on the contractual terms, and on who assumes liability.

> VW managed to bamboozle them in a way

The VW code is likely not safety relevant, so it wasn't reviewed as in-depth. Most ECU code also isn't reproducible even today.

> So I don't think anybody is looking too hard.

On this I generally agree as someone in this space. The amount of money invested in Pwn2Own is small given the barrier for entry: https://www.zerodayinitiative.com/blog/2023/8/28/revealing-t...


> Keys for SecOC get out of sync and throw an error. Not a safety problem per se, but as part of your health check (since I consider watchdog timers an implementation of health and state management), you'd trigger a restart of the software to resync the keys.

OK, agreed in that case, though I'd prefer to see a forced reset rather than relying on the watchdog timer as the mechanism to do it for you. You could just jump to the reset vector instead.
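One way to frame that preference as code, with everything invented for illustration: on a SecOC key desync, retry the resync a few times, then request a deliberate reset instead of letting the watchdog expire. (On a Cortex-M part the "deliberate reset" would idiomatically be an AIRCR SYSRESETREQ write rather than a literal jump to the reset vector; the policy function below only decides, it doesn't touch hardware.)

```c
#include <assert.h>

/* Illustrative recovery policy for the SecOC example: prefer cheap
 * recovery (resync), escalate to an explicit, logged reset only after
 * repeated failures. Threshold and names are hypothetical. */
#define MAX_RESYNC_ATTEMPTS 3

typedef enum { ACT_RESYNC, ACT_RESET } recovery_action_t;

recovery_action_t on_secoc_desync(int failed_attempts)
{
    if (failed_attempts < MAX_RESYNC_ATTEMPTS)
        return ACT_RESYNC;   /* try to resync the keys first           */
    return ACT_RESET;        /* give up: request a deliberate reset    */
}
```

The distinction matters for diagnostics: an explicit reset can be logged with a cause code, whereas a watchdog-induced one just looks like "something timed out".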


I've got a friend who was almost killed by traction control. He was pulling out of his driveway onto a three-lane road when two cars, one overtaking the other, came around the corner very fast. He tried to punch it to get out of the way, but: "computer says no". The longest two seconds of his life, he reckons, waiting for the gas pedal to work again. Both cars had to brake very heavily.

"Safety"


Yes. That sort of thing. I had a 2015 model C-class Mercedes make two fairly serious attempts at killing me before I got rid of it. The first time it could have been a glitch, the second was so obviously borked that I still wonder why they released that stuff on the road. And no way to disable that 'feature'. I'd have loved to analyze what exactly went wrong there and why a bridge and an advertisement by the side of the road were classified as imminent frontal collision.

My present car just does what I tell it to, no gizmos. It may be statistically less safe (I'm willing to believe that) but at least it doesn't actively try to kill me in the name of keeping me safe.


You don’t drive your lathe down the freeway at 80mph. And your lathe probably isn’t surrounded by hundreds of other lathes also going 80mph. And probably not being run by people only half paying attention.

Tell me about the liability laws in place related to you operating your lathe, or the state-required licensing and insurance that each lathe operator holds.

A machining tool is worlds away from a motor vehicle.


Control software for a lathe (or, in fact, any industrial robot) is far more dangerous than a car's. Go visit any metal workshop, take an inventory of the number of people, and then count the number of digits. The fact that they don't cause nearly as many accidents is because they are not normally under manual control. But when they are, better find some (preferably solid) cover.

And meanwhile, Tesla gets to beta-test their self driving stuff in public and I can't look at the source code to my ECU to figure out whether or not a certain behavior is by design or an indication that something is broken.


Cars will do a lot worse than reduce your finger count. In the UK, 68 people died in work-related accidents. Nearly 1700 died on the roads. (The same figures in the US are 5k/40k, go figure).

And I agree that Tesla shouldn't be beta testing on the roads. But neither should anyone else.


The main reason for that is that there are a lot more drivers than there are lathe operators, and that lathe operators usually injure themselves.

There is a relatively low chance that you'll find a lathe on a busy intersection or on a freeway during rush hour. I thought that was obvious.


Not necessarily irresponsible madmen; just curious.

Because I bet you if I buy a new car and discover that I can access its internal components via an API, I will be toying with it.

On any other platform that would never be a problem: found a bug? Just restart the container!

But with a car, this might mean a bug in my code manifesting itself while I'm driving 120 kph. And maybe there's a pedestrian crossing the road and I can't stop in time because the bug makes the brake 60% weaker.

This time however, there's not a restart docker button.

I'm sure if this happens people would be attacking Ferrari viciously, the way they pile on Tesla whenever a douche sleeps at the wheel going 100 kph, even though the company has said before that that's not safe.


Then we should all go back to security-by-obscurity and trust in the man behind the curtain for computer security as well. And we all know that that doesn't work, so why is there this conviction that the embedded programmers at car companies are made from magic?

It's precisely because cars are so dangerous that the code should be open to scrutiny. And of course - at least in the past - the argument has been made that more eyes do not make the bugs more shallow, but in practice if there is an incentive (such as personal safety) people will expend a lot of effort to figure out why stuff goes wrong.

What it would do is take away any kind of excuse that manufacturers have, in those cases where their gear is suspect, to claim that their wares are perfect and that it must have been user error. Because I can pretty much guarantee you that if you were to inspect your average automotive code base you'd find errors, and not just minor ones: from accidental ones like erroneous emergency braking and unintended acceleration to outright malicious ones such as planned-obsolescence drivers, emissions-defeat code and so on.


Open to scrutiny, absolutely. Anything safety-critical should be freely available to those it can harm. Cars, trains, planes, nuclear reactors, lathes, the lot. I hope your code and schematics are fully provided to the workers relying on them being correct. I indeed don't have faith in regulators auditing it properly.

That said I still don't want someone to plonk some GitHub code into the brake controllers, take it for a spin and turn me and mine into meat salsa.

On private land, surrounded by informed and consenting people, sure, go nuts.


> That said I still don't want someone to plonk some GitHub code into the brake controllers, take it for a spin and turn me and mine into meat salsa.

The chances of that happening, versus brake fluid contamination, bad lines, seized rotors, rusted rotors, rotors and pads with grease on them and a thousand other mechanical failures are nil. Because brake controllers are always backed up by a mechanical system and the worst thing about a brake is that it could fail.

The bigger problem is that manufacturers that could barely create functional entertainment systems are now actually creating software and hardware combos that can override driver input to the steering wheel and the brakes and in my own experience they are absolutely not qualified to do this. Car software is crap, you can take my word on that. Very, very few manufacturers have software as a core competency.


Please define car software.

You have user facing functions, and you have engine control functions, ABS, transmission, and so on.

The first one, I agree, is generally crap.

For the second one, in a lot of models, your manufacturer hasn't even written the code, because they buy it from a supplier like Bosch.

And I am pretty sure that Bosch is pretty good at writing this kind of software.


> Please define car software.

The totality of all code running on a particular vehicle that was part of that vehicle when it was sold to the end user.

> You have user facing functions, and you have engine control functions, ABS, transmission, and so on.

Yes.

> The first one, I agree, is generally crap.

Ok.

> For the second one, in a lot of models, your manufacturer haven't even written the code, because they buy it from some OEM manufacturer like Bosch.

That depends. If they buy a whole unit there is a chance it is 'stock', there is a chance that the firmware was modified by the manufacturer or there is a chance that development of the software is insourced. All of that depends on volume, cost, licensing, purpose.

> And I am pretty sure that Bosch is pretty good at writing this kind of software.

Based on what evidence?


The fact that I've been driving cars with ECUs since I was a teenager and never got stranded on the road because of a firmware bug, and neither has anyone else I know.

Compared with Amazon/Google/MSFT, this is a remarkable feat.


That's fair, embedded developers are a notch above the CRUD folks. Mostly because the hardware people tend to keep them sharp and won't accept any finger-pointing unless there is a solid reason for it. But don't overestimate it either: without watchdog timers, embedded stuff usually wouldn't live long.
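Those watchdog timers follow a pattern simple enough to sketch: kick the hardware watchdog only when every task has checked in, so a single hung task forces a clean reset. Everything below (the task names, the bit flags, the function names) is hypothetical, not taken from any real ECU:

```c
#include <assert.h>
#include <stdbool.h>

/* Hypothetical watchdog-service pattern. The hardware watchdog resets
 * the MCU unless it is "kicked" regularly; we only kick it when every
 * task has reported progress, so a stuck task reboots the system. */

enum { TASK_SENSORS = 1 << 0, TASK_CONTROL = 1 << 1, TASK_COMMS = 1 << 2 };
#define ALL_TASKS (TASK_SENSORS | TASK_CONTROL | TASK_COMMS)

static unsigned checkins;

/* each task calls this from its main loop when it makes progress */
void task_checkin(unsigned task_bit) { checkins |= task_bit; }

/* called periodically; returns true if the watchdog would be kicked */
bool watchdog_service(void) {
    if (checkins != ALL_TASKS)
        return false;   /* some task is stuck: let the dog bite */
    checkins = 0;       /* re-arm for the next cycle */
    return true;        /* a real port would call hw_watchdog_kick() here */
}
```

The point of the pattern is that the decision to kick is an AND over all tasks, never a timer in isolation; a watchdog kicked from a timer interrupt proves nothing about the health of the rest of the firmware.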

In general what keeps you safe is assumptions about failure modes that work out as long as everything stays within the set of parameters that define the working envelope of the device. But any combination of inputs that was unforeseen (and therefore not tested) is a possible source of surprises and if and when that happens embedded stuff has very little resilience.

The one piece of code that I wrote myself that had to pass certification for non-flying software that had the potential to crash aircraft and potentially kill people cost me 10x in time from what it would have cost if that were a non-regulated industry. And I'm the first to admit that this was the most humbling experience ever in terms of the number of issues found based on a very simple trick: dual development of the same software by a different set of programmers at some point in the past. Mine was the 'upgrade'. That old software was battle tested but on a platform that was EOL. Before I got mine to the same level of reliability the changelog was many, many pages long compared to my first attempt that I thought would pass the tests. Not so. Not by a long shot.

As for ECUs, they are relatively simple devices in terms of inputs and outputs; they are required to be utterly reliable, because when you press the accelerator to cross an intersection those ponies had better be there. But that's mostly a result of many years of work on the software cores, and just like any other old piece of code, by now we've found most of the bugs.

The older, pre-software engine control units were massive chunks of hardware: look on the right hand side, on the passenger side, of 80s and late-70s era Mercedes and other German vehicles with fuel injection (and some Volvo and Citroën cars) for the kind of control unit that went into these cars. Just the component count and the number of adjustable parameters alone is enough to make you wonder how reliable that stuff is. And yet, work it did.

But the modern ones are far more reliable, not because of software, but in spite of software. And even though they generally work, the question I would like to see answered is how much of that is because of duct tape and how much is because of solid engineering. Some people in the embedded world are of the opinion that it doesn't matter. But I think it does, and I care enough that I'd like to see what makes this stuff tick. On the off-chance that there is a real bug, an unintended acceleration condition, or a way to get the hardware to lock up lurking there that we simply haven't uncovered yet.

Given how prone car manufacturers are to stonewalling on this stuff my guess is: plenty of chance that that's the case.


Yes, now I can see your point. I hadn't considered this in depth before. Maybe because we have such low expectations of software: when something mostly works without visible manifestations of failure, we tend to consider it better quality than it actually was.

But, yes, the absence of failures should not be taken as proof of absence of bugs.


I read a lot of closed source code and quite a bit of it embedded and some of the stuff you find is enough to make you question just about anything.

Here is a nice picture of a pre-digital ECU:

https://www.benzworld.org/attachments/75-djetecu_3-jpg.16183...

It's essentially a hardwired analog computer (note the lack of adjustment points other than 'idle RPM'), as long as the external contacts are clean, the components are good and within tolerance the unit should work. There is nothing to adjust.

Modern ECUs have many more sensors to deal with, so higher data rates and many, many more fault conditions that have to do with emissions and preventing engine damage. These old D-Jetronic units were very impressive for their time; plenty of them are still running today, which given the parts count is a small miracle.

In a way they are so reliable because of what they can't do: as long as the input parameters are roughly where they should be and the crank position input is accurate, the engine will run. But whether it will run efficiently or not is a different matter, and accurately setting these up without the right gear and manuals is next to impossible. So most parties that repair these will do a so-so job, good enough to get the car back on the road but far from optimal. Then again, the old battle-axes that you'll find these units in never were models of fuel efficiency.

In contrast, a modern ECU has exactly zero analog processing beyond digitization; it's all software. And where at all possible that software is not engine-specific (beyond type and pre-set config) or drivetrain-specific (though they do tend to talk to the transmission to get more information about the driving conditions), because that would require a lot of extra work per engine.
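To make "it's all software" concrete, the core fueling math of a speed-density ECU is roughly the sketch below. Everything here is illustrative: the constants, the flat VE "table", and the function names are made up, not taken from any real ECU firmware.

```c
#include <assert.h>

/* Toy speed-density fueling calculation:
 *   pulse width = base fuel * MAP fraction * volumetric efficiency
 *                 * correction factors (only warm-up shown here).   */

static double ve_lookup(int rpm, double map_kpa) {
    /* a real ECU interpolates a calibrated RPM x MAP table;
     * we fake a flat 80% volumetric efficiency */
    (void)rpm; (void)map_kpa;
    return 0.80;
}

double injector_pulse_us(int rpm, double map_kpa, double coolant_c) {
    const double base_us = 10000.0;  /* made-up: fuel at 100 kPa, VE = 1.0 */
    double pw = base_us * (map_kpa / 100.0) * ve_lookup(rpm, map_kpa);
    if (coolant_c < 60.0)
        pw *= 1.2;                   /* cold engine: warm-up enrichment */
    return pw;
}
```

Even this toy version shows why the code can stay generic across engines: the engine-specific knowledge lives in the calibration data (base fuel constant, VE table, enrichment curves), not in the control flow.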

The most that you'll find is that the ECU has some knowledge of the serial numbers of key components (for instance the drivetrain) to give some anti-theft protection. Other than that it's all auto-configuration until some parameter goes out of spec and then that expensive check engine light comes on.


Why do we even need computers in cars? The Woz said never to trust a computer you can't throw out a window. The only computer in my car is the radio.

It's just excessive consumerism and marketing crap. It's not needed.


The initial reason was emissions, but now that we have them all kinds of other stuff gets tacked on.


> I believe what I just wrote applies similarly for security too.

Automotive security is nearly an oxymoron. The reasons for that are simple: the difficulty and expense of attacking a vehicle exceeds the bored grad student/curious tinkerer threshold, and the automotive industry has collectively the worst attitude towards security I've ever encountered.

The depressingly predictable result is that third-party automotive security testing is a sport reserved for people who are extremely uninterested in disclosing their methods to you, aka the actual attackers.


Why would it (legally) be on the car manufacturer if someone hacks his own car and causes an accident because of modifications to the ECU (firmware)?

This doesn't intuitively make sense to me. At the very least there are probably huge differences between countries when it comes to this?

Aside from the fact that some people would likely love to modify their car in every way possible to use it on the racetrack or whatever private property?


Or maybe to make it safer.


I think their point / the general FOSS argument is that those 1,000 man-years would be turned into 10,000 man-years if these things were open sourced. A similar security concern could be leveled at things like OpenSSL, but it seems pretty inarguable that OpenSSL is a net positive for security.


I'm all for open access to the code for the sake of safety. On the other hand, I'm completely against hobbyists accidently bypassing a safety mechanism.

Open access, but secure access to software download could make sense, at least for commodity parts.

When it comes to features with competitive advantage, though, I don't see that OEMs or its suppliers have anything to gain.


> On the other hand, I'm completely against hobbyists accidently bypassing a safety mechanism.

Accidentally.

Besides that: it should be fairly obvious that hobbyists are not going to 'accidentally bypass a safety mechanism', they can cut their brake lines as well and they don't generally do this. What you'd see is that the aftermarket would finally be able to produce stuff without dealers in between and people with the 'right' kind of tooling (authorized by the manufacturer) to get your replacement to be recognized by the firmware. Because of course absolutely none of this would ever be used to protect the bottom line. Right?

Also: if anything open sourcing this stuff would likely result in more rather than less safe vehicles, maybe at the expense of a couple of embarrassments. Because I have absolutely no illusion about the people working on these systems professionally to be somehow magically better than the ones that work on them for themselves, after all, they have a pretty big stake in the outcome.

Imagine that, working on your car in a safety related way... replacing brakes, steering housing components, linkages, suspension components tires and so on is all at least - if not more - risky than working on software.

FWIW one of those 'safety features' tried to kill me twice and caused me to let go of my recent car and switch to a 1997 issue vehicle that has behaved quite predictable compared to that modern one. Whose 'safety features' could not be disabled.


> Besides that: it should be fairly obvious that hobbyists are not going to 'accidentally bypass a safety mechanism', they can cut their brake lines as well and they don't generally do this.

I can already picture the YouTube videos on "how to gain 15% hp" explaining how to "hack" your car, with a one-second "it will severely reduce your engine life expectancy" message at the end. Thousands of people would run this patch without thinking twice.

Also, how would you pass the inspections most countries require every other year? I don't expect the people checking my brake pads to know how to review the random piece of code I deployed to my car.


Well, that's sort of the point: this is already possible, so in that sense nothing would change. Changing the mapping (essentially the amount of fuel injected based on a bunch of parameters) is regularly done by 'tuners' (between quotes because they don't really tune anything, they mostly burn more fuel for questionable gains).

But that's really not what I would care about. I'd like to read that code to figure out what the failure modes are and what might impact my safety in a negative way.


I can go out to my late-80s truck right now, undo the lock nut on the fuel screw and wind that sucker up. It'll be a completely different truck. Will I halve my fuel range? Sure. Will it be fun? Sure will be. If I care about the engine I'll attach an EGT gauge to ensure I don't melt the alloy head.


Hackable does not mean crackable. The best security implementations in the world are free software.

I'm not even a tiny bit convinced that making cars hackable would be a detriment to safety. Give me one example of that happening in literally any other sector.


You might have a point there, but I struggle to find any completely hackable product that is also safety-critical. An airplane, a nuclear reactor or a train, perhaps?


Any old car will do.


And new experimental aircraft, which are owner-built.


Okay, so for the moment leave aside the safety critical bits (only for a moment) - what's the excuse for not opening up the center console? That generally is already segregated and only handles non critical functions.


Center consoles have been used quite successfully as beachheads by hackers to be able to get into more important systems because car manufacturers are typically utterly clueless when it comes to security. So obscurity is a very large part of their security. Of course that doesn't really work with the most motivated parties (car thieves and their captive techies) having a field day with this.

Hyundai and Kia are reportedly so bad that they ended up paying out a large amount of money to compensate owners.

https://www.reuters.com/legal/hyundai-kia-agree-200-million-...

But don't worry, it's been fixed now. Probably.


If it's security critical, it definitely needs to be FOSS and user patchable; obscurity is not a reasonable strategy.


Agreed, but that won't happen until some regulator wises up to this being a way to reduce vehicle theft considerably.


> Even the window regulators have non-trivial implementation concerns for anti-pinch

Tesla just got hit by this a few months back. They had to remove the automatic roll-up of windows when you walk away after parking. Apparently they didn't have the sensors or hardware to do it safely.
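For a sense of what "non-trivial" means here, the usual anti-pinch approach (watch motor current, reverse if it spikes while closing) can be sketched as a small state machine. The thresholds and names below are invented for illustration; real regulators use Hall sensors and calibrated force limits, not a single current threshold:

```c
#include <assert.h>

/* Toy anti-pinch state machine, illustrative only. Reverse the window
 * if motor current jumps while closing (possible obstruction), but
 * allow the current spike when the glass seats against the seal. */

typedef enum { WIN_CLOSING, WIN_REVERSING, WIN_CLOSED } win_state;

win_state antipinch_step(win_state s, double motor_current_a,
                         int position_pct) {
    const double pinch_threshold_a = 8.0;    /* made-up stall current */
    if (s != WIN_CLOSING)
        return s;                            /* only act while closing */
    if (motor_current_a > pinch_threshold_a && position_pct < 100)
        return WIN_REVERSING;                /* obstruction: back off  */
    if (position_pct >= 100)
        return WIN_CLOSED;                   /* seated against the seal */
    return WIN_CLOSING;
}
```

The tricky part the toy version glosses over is exactly the one that bites in practice: telling a pinched arm apart from a stiff seal, ice on the glass, or a worn motor, which is where the sensing hardware matters.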


> Allowing a random hacker to override this is a terrible idea.

It should be a basic right no matter how "terrible" an idea it is. We bought it, we should have full control. Void the warranty or something.


Many things "should be" and "ought to be", but we all sat around a table and decided to make a neat little thing called the law, because at the end of the day we're still just apes, and apes don't always act in the best interest of their peers.


Do you seriously actually believe the government and The Law has your best interests at heart?


Well no, that's my entire point: it's not about what works best for _me_, it's about what somewhat works when scaled to a whole country or continent. It's flawed in many ways, and corrupted in other ways, but it's better than the alternative.

If I really wanted to do what I want whenever I want I'd live in a hut somewhere in the woods. Law is rarely about _individual you_ it's about the society you live in.

Why do I care if you wear your seatbelt or your motorcycle helmet? Because if you get injured you'll cost millions to the community. Same exact principle here; it's part of the social contract: you enjoy a lot of neat things and in return you give up some other things.


I'm thinking through the ethics of this myself, but I think it's a reasonable argument that you can have the freedom to do what you want with things that you own _assuming you don't impact others_. The issue with a dangerous car is that it puts others at danger.


Responsibility. If I modify my car and that's determined to be the causal factor of an accident, it's on me.


That'll make the hypothetical parents of the hypothetical kid you just ran over very happy I'm sure. "I fucked up but it's on me!" and they lived happily ever after


Yes? Society is actually completely fine with that arrangement. Every single year tons of people get drunk, get into accidents, get people actually killed and are held accountable for it. Yet nobody dares to infringe on their freedom to drink. They tried once and it just didn't work out.

This "give up freedom because kids" nonsense is seriously tiresome. I'm not engaging with these arguments anymore.


Somebody please think of the children.

Really, come on: you can do much better than this. The hypothetical kid might be the one that was saved because some random hacker figured out why the latest generation radar based cars keep phantom braking. That argument works both ways, and typically what is fixed stays fixed.

And of course none of these hotshot programmers would ever be seen near the following bit of code[1], which probably impacted a lot of actual children and their health in a very direct way:

     // in case of an emissions test, ensure that we pass
     // with flying colors
     if (emissions_test_detected()) {
         // we're sooo environmentally conscious
         lean_burn();
     } else {
         // roll that coal
         regular_burn();
     }
[1] cribbed from VW/Porsche AG's internal repository ;)


Agreed, and that's an argument for open source, not against.


And someone could respond "Okay, fine, tampering with the onboard software voids the warranty and shifts responsibility to the tinkerer." But that's a liability issue. The safety concern is still there regardless of who is held responsible. A change that seems innocuous may, in fact, be breaking safety regulations. This is a big deal and a matter of public concern.


If you clutch those pearls any harder they'll turn to dust.


The biggest excuse is much more reasonable (albeit annoying, the same reason why most board support firmware isn't open source): ECUs are built using code generation from models (ASCET, LabView/Simulink, etc.) on top of 10 layers of proprietary middleware and compilers, using components supplied by hundreds of consulting firms, so an open-source effort would have to be a paradigm shift in the industry from the ground up. It's not something Porsche could decide to do on their own.


Well, they could dedicate a team to it if they were serious about it and work with open source advocates to make it all work.

I'd rather have a tarball with hard to parse code and weird tooling than nothing at all. But - as I said - I have no illusions about this and see it as a marketing effort.


In my experience it's fairly common even for the manufacturer to not have full access to the code, let alone permission to open source it. I don't think it makes it merely a "marketing effort" if they avoid that.


Yes, I'm aware of that. Which is why my assumption that this would be a red herring was borne out. As long as Freescale and Bosch are in control I don't see any of this changing.


I did some work on a toolchain used in automotive (and other industries) to proof^W prove certain software properties related to safety. This is already a massive effort, expensive to build & maintain - but after all a small cog in the whole machine. (Using Rust over subset-of-C would deprecate maybe half of that work; at most I'd guess). The hypothetical team would take a decade or two to replicate just that cog.

Yes, having the whole machinery available as FOSS would be awesome, and I'd like to live & work in a world in which this is the norm (Hello, Star Trek?). But all this requires very talented people, many of whom studied and racked up student debt, and want something to eat and a place to stay. We first need another paradigm shift to make this kind of niche open work sustainable.

Oh, and if you ever find yourself in a car with critical components not checked with that kind of toolchain: I strongly recommend getting out of there asap ;-) For all the excellent people building that software, there will always be some who'd be better off writing mobile apps instead (or anything else that cannot kill people).

(btw: You asked me about the setpoints of the 12k 3P Deye inverters a while ago: Seems to work nice for me with my basic "just the inverter + PV + battery" setup. Though some people report trouble when using AC coupling with another inverter; in that case the Deye seems to swing more around the setpoint).


The main thing I think manufacturers would try to avoid is being embarrassed by the kind of shenanigans that helped VW pass emissions inspections when in fact they shouldn't have. And that's proof enough to me that manufacturers don't give a rat's ass about safety or the environment if it means more money for them.

I realize how complex those toolchains are (and how many parties would have to sign off on a release) but what amazes me more is that we are willing to trust these manufacturers on their say-so. Because chances are their interests and our interests are not aligned, as the diesel scandal has clearly proven. And I would not be at all surprised if there were more such scandals but we just haven't uncovered them.

Given what the Asahi Linux people and some other very talented individuals have done in the FOSS movement, combined with the fact that half the world is powered by open source, including many mission-critical and safety-critical applications, I'm pretty sure we could handle vehicles as well.

Nice to see your setup works well!


Honestly just documenting the APIs the different components use to talk to one another would be incredibly helpful. There are so many things that could be done in the pursuit of openness that absolutely won't be.


Volkswagen (Porsche's parent company) certainly has the scale to make this happen. Seems like its entirely a corporate culture issue, same as what plagues Volkswagen's EVs and newer cars with terrible infotainment systems.


>Volkswagen (Porsche's parent company) certainly has the scale to make this happen.

Yeah, but VW and Porsche are into selling cars not OSS so their priorities are aligned with that.

Think of it from the perspective of the bean counters who run these companies: why would they invest resources into sharing their SW as OSS if that's not gonna sell more cars?


To commoditize your complements, so that you don't have to pay license fees to your suppliers.

In a way, the German OEMs have been trying to do so, but mostly via different standardization efforts (OpenSCENARIO for example) so that they can easily change suppliers.


Sure, but is the owner's experience really a complement? I don't think it is. It's a huge part of what they are selling.


I think you can make that same argument for many large companies that contribute to OSS though.


Contributing to OSS is one thing. Open sourcing your existing closed-source internal SW is another, and it's hugely risky legally, as that SW could have many faults and bugs that could get them sued if discovered.

Toyota had the unintended acceleration lawsuit during which an external audit discovered several bugs and deficiencies with their SW, testing, and dev process.

Knowing this, why would any car manufacturer air their dirty laundry in public for the sake of OSS? Their lawyers would definitely advise them to never OSS anything internal out of the kindness of their hearts as that could backfire spectacularly.


I strongly doubt this is the main reason. I think it's simpler and just like most hardware: there's no perception that open source adds value, and re-negotiating IP agreements with hundreds of sub-vendors would be unreasonably expensive in and of itself even if the vendors were amenable to open source. We see the same thing in plenty of non-safety critical hardware areas: board support packages, device drivers, graphics stacks, and so on. There's no perception that open source adds value in the hardware industry at large.


The real shocker would be how incredibly crappy it all is. But yes, licensing is a huge part of the problem. But that also happens to help the manufacturers who really wouldn't want to open this stuff up to scrutiny anyway.

If only because there might be a significant number of findings about accidents that turned out to have been caused by a malfunctioning vehicle after all when right now these are attributed to the driver.


Hiding safety flaws? That doesn't sound like a very healthy safety culture.

This sounds like a good reason to have a little government regulation to align incentives with safety interests.


It'll never happen because regulators don't get involved except on the most abstract level (say: a recall with a proposed fix).


>Hiding safety flaws? That doesn't sound like a very healthy safety culture.

Welcome to the real world of corporate profit making. You must be new here.


Just the diesel scandal alone... (different badge, same manufacturer, and Porsche with their Cayenne diesel models was definitely affected).


This probably strikes a lot closer to the real story.


Curious, who are the main consulting firms active here?


Bertrandt, IAV, EDAG, to name a few (link in German):

https://de.wikipedia.org/wiki/Entwicklungsdienstleister#Top_...


>It's like so many other brands that claim to love Open Source Software for marketing purposes but that continue to refuse to open up the key components of the software that they ship.

Nobody is buying a car based on the manufacturer's love of open source software above other factors. "I really liked the size of the X3, but went with a Macan because Porsche loves open source software." Said no one ever.



PIWIS for the people!

I got briefly excited that Porsche was going to make this happen.


All mentioned projects look to be web focused? I would assume most of Porsche’s software value-add is in embedded systems. Can open source make a dent in the car components themselves?


Car companies often don't do the embedded development themselves. They usually rely on suppliers and a vast network of contractors to do most of that work, though things have been shifting back in-house in recent years for velocity/feature/security/bug fixing/institutional knowledge reasons. Some of that is becoming more open source, e.g. I've seen work from Ford and GM being open sourced, but it's relatively uncommon compared to the higher level stack.


Does Porsche even know how to software?

When the Taycan was new, it had horrible software and the system would crash on the freeway. A Googler dug in a bit with the dealer and found it was running Docker / Docker Compose and a bunch of the containers would just die sometimes. He banded together with some other Google Taycan owners (there's probably a group..) and they got their own NHTSA recall. Here's an example of one of the many recalls: https://www.taycanforum.com/forum/threads/wnj8-wnk1-ana6-sof...


Big headline, small impact


So does hashtag mean something sort of like a mini proper noun now?


It’s a shame they don’t open-source their infotainment system. Not that I want to modify my 911, but there’s privacy settings in there that I find concerning, so it’d be good to look at the source.

And, just from a curiosity PoV, I’d like to see the car management code.

I think it’s important that individuals can’t change the code running their cars; that’s just got ‘massive car accident’ written all over it. But being able to see and possibly contribute to the software would be great.


> porsche-design-system/porsche-design-system

Why do companies open-source their design systems? Other people aren't allowed to create a website that looks like Porsche.com because of copyright and trademark laws anyway.

I always assumed it's because properly securing it is too much work for little gain (because of the reasons above), but maybe there's more to it?


I've used their design system for several years. It wasn't always on npm; it used to be on a private npm registry (I am assuming for the reasons you mentioned).

Essentially, there are contractors and third parties that develop stuff for them, and they want it to look the same as the rest of their stuff. Putting it out in the open made things much easier. The private registry was hard to integrate into our CI/CD pipeline. You couldn't just open an issue on GitHub (it existed, but I think no one was looking at it); you had to write some Slack messages, which you needed access for.

It's just more cost-effective this way.


Because they want you to write components that fit their design language if you're integrating into their ecosystem.


I knew I was missing something. Thank you!


"Application error: a client-side exception has occurred (see the browser console for more information)."


It feels like a sort of "green-washing" to me. Let's call it "oss-washing"


Meanwhile, Mazda is sending cease and desist letters to open-source contributors.

Previous discussion: https://news.ycombinator.com/item?id=37874220


Blog post announcing it (2021). How's it been going since?

https://news.ycombinator.com/item?id=28627902


What is this? Just some stuff for designers so that their webpages and brochures match or what?

I was really excited and expecting some info on their CAN bus and OS. Sigh...



Potentially vaguely related, the Eclipse foundation project GM is backing to establish an open source protocol for vehicle app / service communication: https://projects.eclipse.org/projects/automotive.uprotocol


By the way, is there any open ECU/Sensor/Inject solution that someone could buy to retrofit old engines?


Yes, several actually, with more or less drop-in replacements for anything from 3 to 12 cylinders. It mostly depends on how much work you want to do adapting a particular piece of hardware to the sensors and actuators on your engine. That's the hard part; once you have reliable sensor data and ways to create reliable output, it's mostly configuration work and you're good to go.

Off the top of my head, non-free licenses:

- AEM

- Haltech

- MegaSquirt

- Motec

- Profec

Free licenses:

- Speeduino

- RusEFI

And probably many others.

There are also special units designed for the race track folks.


Speeduino is the open-source (as in hardware and software) solution. Megasquirt is most definitely not open.


https://megasquirt.info/ is one (and one of the most well-known).

I've considered doing a retrofit on a classic Mustang V8, but the old-school carb works well enough that it's not been a priority.

Edit: sibling comment correctly identifies the code as not open-source. (I could have sworn it was; perhaps it started that way, or perhaps the amount of open it was when I last looked in detail was sufficient for my plans.)


As mentioned in another answer, MegaSquirt. I used the first-generation one to retrofit injection to my carbureted rally car with a home-made manifold and repurposed injection throttle bodies from a GPZ 1100. Even badly self-tuned, it worked better than the carb did.


They're using Android?

I think they'd be much better off using something like Automotive Grade Linux. Google's car ambitions have been pretty disastrous, including the newer Volvos. Reviews of the Volvo Android Automotive infotainment are just awful. And I don't trust Google to not abandon it.


BMW is also moving from AGL to Android. But they're only using Android as an OS; the user-facing UI isn't going to change. And it's not integrated with Google. That's different from Android Automotive / Volvo, where the basic UI comes from Google and the stack is full of Google integration. BMW's approach of using Android as just the underlying OS seems like a reasonable option to me. Not sure what exactly Porsche is doing here.


Hey Porsche execs: The actual recipe to attract better developers is to raise your salaries.


It took German automakers a decade to go international, and they still hire only "german" speakers for some teams. Hard pass.


German companies are so weird in this respect, and I honestly believe it's why they kind of lost the tech race. The US is very different in this diversity aspect, which honestly seems like a success factor.


There are US tech companies hiring people who don't speak English?


No, but I think it's different when you consider that 90%+ of the Germans on these software teams also speak pretty good English. I'm not saying that employees don't need to learn German, but you can give them a few years to catch up rather than leave willing talent on the table.

The Netherlands has done a much better job in this regard and is why they are booming as a headquarters for EU fintech companies. Sure, speaking Dutch will always open more doors for you as an employee, but most companies will not outright dismiss your CV because you can't speak the language on day 1.


In NL in IT it isn't rare at all to find people who don't speak Dutch, even in management (all the way up to C-level) positions.


You know it's not a fair comparison due to how widespread English is and German is not.


Why did you put the word german in quotes?


As if the only important metric was to speak german.


From the history of FOSS phones, I feel like we need to start much smaller here.

How many lines of code would it take to build a FOSS golf cart?


All of a sudden, the chance I buy a Porsche went from 0% to 10-15%...


Wow, and the documentation is remarkably good.


Porsche is just VW. Why can't VW spearhead this for a much greater impact?


The VW Group is weird. From the outside it looks like they must have the most mental internal politics; it seems unlikely that they'd push an initiative like this through all their brands, which operate almost as distinct companies.


I think I have this right: The Porsche Family owns half of Porsche SE (but 100% of the voting power), which in turn owns the majority of voting shares in VW, which in turn owns the majority of Porsche AG (the car manufacturer). All of which are distinct publicly listed companies. I'm surprised anyone knows who they actually work for.


Application error: a client-side exception has occurred (see the browser console for more information).

Did we kill it already?


I love the idea but boy the use of hashtags is nauseating.



