Scala: Next Steps (scala-lang.org)
95 points by jfb on July 28, 2014 | 112 comments


Some context - the lead developer on the Scala Compiler, Paul Phillips, went slightly AWOL last year, and presented the talk "We're doing it all wrong" https://www.youtube.com/watch?v=TS1lpKBMkgg http://www.slideshare.net/extempore/keynote-pnw-scala-2013

He still seems slightly burnt by the whole thing: https://twitter.com/extempore2/status/493633548994097154

Other critiques of the Scala collections have been addressed by the Typelevel library scalaz (https://github.com/scalaz/scalaz), which aims to fix these issues.

This post, 'Next Steps', seems to have come out of Martin Odersky's response to the situation. I commend him for taking practical, well-thought-out action.


Summary:

Scala 2.12: Focus on leveraging JDK8 features and interop

Scala next: Collections framework simplification

Scala next next: Language improvements


It's a shame they are not focusing on the most lagging bit - CLR/Mono support.


There doesn't seem to be much demand or interest for that overall. They recently discontinued the .NET support. Almost everyone was using it on the JVM. Any specific reason why you'd need CLR instead of the JVM?


I love how they don't take backwards compatibility so seriously. A lax attitude to backwards compatibility means they can actually have the syntax get less bad over time, instead of becoming C++. Scala does have some less-than-obvious things that need to be fixed.


Nice backhanded compliment; the linked document explicitly mentions an automatic source-code migration tool to address backwards compatibility with the new Scala version (which, in terms of breaking changes [new collections], won't be coming until 2017 at the earliest).

re: linking Scala to C++, Stroustrup can only dream of the linked roadmap for C++


Well, I love Scala, but it can get confusing sometimes.


Very relevant is the timing. The first stable version of Scala 2.12 won't be released until Jan 2016.

Also, no source compatibility breakage is considered without a migration tool available, also mentioned in the timeline.


Migration tools aren't a panacea: they don't fix Stack Overflow answers, PDFs, blog posts, existing tutorials, talks, slides, examples, mailing-list posts, or IRC logs.

They also require locking a codebase and making massive changes all at once.

It would be preferable to add them as -Xlint:XXX warnings / -Xlanguage:XXX features that can be opted into, along with a tool that automatically updates code to best practices.

This way no legacy codebase has to undergo the massive cross-cutting refactor required when running a migration tool; instead they can opt into the features they want, and new users copy-pasting code will see that code work.
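This per-file opt-in mechanism partly exists already via SIP-18 language imports. A minimal sketch (the demo names are made up; only `scala.language.implicitConversions` is a real stdlib import):

```scala
// Sketch: opting into a language feature per file via a SIP-18
// language import, instead of migrating the whole codebase at once.
import scala.language.implicitConversions

object OptInDemo {
  // Without the import above, defining this implicit conversion
  // triggers a compiler feature warning.
  implicit def intToLabel(i: Int): String = "value: " + i

  def greet(s: String): String = s

  def main(args: Array[String]): Unit =
    println(greet(42)) // the Int is converted via intToLabel
}
```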


I completely agree. We'll try to make the boundary between old and new versions as flexible as possible and both lint tools and language imports will be an important part of that.

Note also that the rewritings we consider are in their majority rather trivial. E.g. insert a ": Unit = " every time you used procedure syntax, or wrap every XML literal in xml"""...""".
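To illustrate how mechanical the procedure-syntax rewrite is, here is a sketch of the before and after (the names are made up):

```scala
object ProcedureSyntaxDemo {
  // Old procedure syntax (no '=', result type Unit implied):
  //   def log(msg: String) { println(msg) }
  // After the trivial rewrite: insert ": Unit =".
  def log(msg: String): Unit = { println(msg) }

  def main(args: Array[String]): Unit = log("migrated")
}
```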

The one thing that scares me a bit is experimental features. These will not port cleanly, and my advice right now would be: If you want your code to survive transitions without major rewrites, don't use experimental features. They might be tempting, but they can well be a trap down the line.


I kind of disagree with this. As the most fundamental tool in a programmer's toolbox, I think devs have the right to ask for stability from their language/compiler. A new language version may mean new language features, but it's OK for devs to want to incorporate those features at their own pace, not have the language choose for them.

Without a focus on backwards compatibility, this means devs have to choose between bug fixes and perf improvements and new language constructs. This isn't a good place to be in.

At the same time, I understand the realization language designers and compiler devs have when they find out they were doing something wrong all along, or that an old feature was simply a special case of a new feature. Sometimes that means syntax breaking changes. It's hard to deal with, as the owner of a product, when you see something which you know could be better, but is held back by existing installs.


Well, at least in Scala developers have a choice. You can run the latest Scala compiler and still target a JDK6 runtime. Or use a very old Scala compiler and run it on JDK9 builds.

Compare that with e. g. .NET where picking an older compiler potentially locks you into an older .NET version, too. Have a look at all the different CLR, .NET and language versions: It's a huge mess, only increasing with projects like WinRT, and building portable assemblies not being available in all Visual Studio versions until very recently.

Of course, the problem is exacerbated by .NET's lackluster dependency management, and I'm not very optimistic that the .NET community is able to build a decent competitor to NuGet to keep it from stagnating, due to "Microsoft products first"-approach of Microsoft shops and Microsoft's inability to contribute to existing open-source projects.


Although I agree they have a good attitude towards deprecation and backwards compatibility, I do not believe their progressive attitude necessarily implies a lax attitude to backwards compatibility.

Investigation is underway for more sweeping changes where backwards compatibility can be broken on the fundamental level. This is the Dotty project [1], which will influence development of Scala 3.

[1]: https://github.com/lampepfl/dotty


Umm, Scala 3 = Scala "Don Giovanni" = Scala 2.14 (backed by Dotty); that's the gist I got from the linked post (thus the DOT calculus reference in this section of the document)

A ways away, but they're spelling it out now: der Dotty cometh ;-)

Sounds like 2018 if the roadmap plays out as planned. Looks tasty, wish it were more like 2016.


Scala wasn't adopted by a company I worked for because they didn't take backwards compatibility seriously. I believe they were breaking on point releases at one stage.


I feel like this might be a prophetic litmus test of where to work.

Example: Is the company able to handle moving targets, or is the culture so fucked they marry things like JDK 1.4 because they can't handle basic evolution.


I disagree. It's more a question of do you want to spend engineering hours updating all your tools to the latest version, or do you want to spend engineering hours developing features. Different organizations have different priorities. Many times new versions give you features that are worth the effort taken to update. In most cases I'd rather not use technology that is continually breaking my code on every update, and instead focus on solving interesting domain specific problems. It's flawed to say that this means the culture is "fucked up".


Scala simply doesn't move that fast and doesn't break that often. I can say that as someone who has been using it since 2.7. (Some of the third-party Scala libraries, on the other hand, are close to giving me aneurysms ...)

At its current velocity, Scala is hitting a sweet spot between improving language features and dropping cruft. The way you can change language features via imports is awfully nice as well.

Note that most of the major library providers for Scala jump on the new versions of the language while it's still in the milestone stages of development. Examples are scalaz and akka.


It's not just simple stuff either, often major libraries will have specific versions that work with specific releases of the language. No matter how well engineered your code base is, if you have to change apis that you use throughout your code as well as cope with languages changes too, it's going to be a lot of work for relatively little business value.


> It's more a question of do you want to spend engineering hours updating all your tools to the latest version, or do you want to spend engineering hours developing features.

How is this different from e. g. Java? Developers pick a version which works for them and stick with it until it makes business sense to upgrade.

> In most cases I'd rather not use technology that is continually breaking my code on every update, and instead focus on solving interesting domain specific problems.

How is this the case for Scala?


I've been developing Scala full time since 2.7. I have not had a release yet that didn't involve changing code to recompile to each new version.

I also have Java projects that were written in pre 1.0 releases that have updated to 1.7 without a single code change.

This is due to a different set of priorities. Each of those priorities has different costs and benefits but to act like Scala's lack of backwards compatibility doesn't have a cost is unfair.


I think scala is still young. A company that doesn't want to take that risk is probably right to do so.

As the language syntax will stabilize more and more companies will do that switch, it's just a matter of time.


Maybe, or maybe java8 will be good enough?

Scala isn't inevitable in the long term. Not everyone wants all the insanity that Scala (currently) entails.

I mean the long term roadmap admits the collection story needs improvement! As a user of those collections... about time!


What do you mean by insanity?


I think that attitude is great. I always move the code/libs forward alongside writing it, and I'm a solo developer who codes in several languages and maintains code.

This works great, for example, with Obj-C, where things are moving fast and all the changes in some ways cut the size of the code.

In fact, moving the code forward is very cheap in contrast to sticking with old versions for too long, until eventually the pain is too great.

I learned this when trying to move a large codebase from .NET 1 to 2 in one single shot. Months were spent on this (and back then we were a team).

HOWEVER, the idea is to mark things deprecated, build bridges, and do something like Go's gofix, which can rewrite/upgrade the code itself.


On the flip side, if all you were used to were systems such as Scala, it would be bloody amazing how documents that were typeset with tex still typeset. Or, any C program I wrote 20+ years ago can still compile and work.


I had to dig out some 5-year-old Scala source a few weeks ago. Honestly it wasn't hard to port - there's a deprecation cycle and the compiler warns you about anything that's deprecated, all the fixes were pretty mechanical. I think a statically-typed language makes backwards compatibility changes much less dangerous than they otherwise are.
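The deprecation cycle the parent describes is driven by the standard `@deprecated` annotation; a minimal sketch (the method names are made up):

```scala
object DeprecationDemo {
  // Every use site of oldName gets a compiler warning carrying
  // the message and the version in which the deprecation started.
  @deprecated("use newName instead", "2.10.0")
  def oldName(): Int = newName()

  def newName(): Int = 42

  def main(args: Array[String]): Unit =
    println(newName())
}
```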


I dug up some 15 year old cweb code the other day. Not only did it just compile on my first try, but it then succeeded in being among the fastest pieces of code I've ever run. :)

Even better, I wove it into a PDF to read, and I was actually able to understand what was going on. The idioms were ancient, but that is another matter.


I've definitely written C code that took advantage of specialized compilers and OS calls that would most certainly not work on modern commodity hardware. C portability is all about limiting the features you use after all.

If you limit yourself to a small subset of Scala features it has all been backwards compatible as well. (That's not to say, I think that Scala does a good job with backwards compatibility)


How small of a subset are we talking about here? And yeah, I realize it is a bit unfair, as much of the "c language" is not about fancy containers and collections.

This is something I still admire in Knuth's tex, though. Having a typesetting language that is effectively "archival" is impressive. I love that I can load up any of the .tex files from his site and they still work, today.

Even his cweb language snippets benefit from this. It may seem somewhat odd and cryptic by some modern aesthetics, but it also hasn't been touched in years and still works.


To be honest, it would need to be an extremely small set of Scala code. But, more to the point, you are glossing over decades' worth of discussions, fights, and costs associated with C code being compatible from machine to machine, compiler to compiler, and year to year. C got to backwards compatibility via longevity. Scala de-emphasizes backwards compatibility on purpose, so it isn't surprising that it isn't good.

Tex is built for a very small, constrained, and well-known problem space. It isn't surprising that you can get it right faster than a general-purpose language. Yet even Tex has been supplanted, even in its own wheelhouse.


I'm not sure I'm glossing over the point, so much as trying to make sure it isn't lost. The attitude towards backwards compatibility is something that is heralded as amazing by many when they look to Scala. To the point that anyone that speaks ill of it is often chided. (As I seem to be getting here. Note the rhetoric you are using.)

Going further, though, taking your own sentence, one should probably wonder if Scala is sacrificing anything in the longevity realm by having the attitude that they have. I do not doubt that these are deliberate choices, but they are somewhat annoying. I'm incredibly glad that I can still run all of the code in SICP, if I want to study it. Same for TAoCP. Though, the latter is definitely more work than the former.

Regarding tex, it is annoying to see how it is often supplanted as much by people that just refuse to learn tex as it is for any technical reason. Especially in the modern environment where aesthetics are so widely debated and built around.

And no, I don't think this is unique to the programming world. I need only look at how many ways people refuse to learn to cook eggs on the stove to know that we often try many new things that just aren't that fruitful. And, honestly, this is a good thing. Rarely is any effort truly wasted.


Let me be clear. I am not chiding you for pointing out that Scala does not have the backwards compatibility feature set it would need to compete with something like C. Only that, C wasn't designed with backwards compatibility in mind, it just sort of arrived at it. Scala is most certainly sacrificing compatibility in the short term to allow for "improvability" for lack of a better term. I have no idea if that is a good or bad thing, but my intuition is as a new language with low uptake it makes some sense. A major challenge going forward is going to be maintaining the proper balance with regards to compatibility.

I couldn't agree more about SICP and TAoCP but you'll notice that neither is written in a widely adopted language.

I'm surprised that you actually use bare Tex. Even the most ardent users at this point will at least start with LaTex even though it isn't technically as portable...

Finally, how do you cook an egg not on the stove?


I don't actually use Tex that often. That admiration just comes from realizing that all of my old .tex files still work. As do all of Knuth's. Has been educational going through some of them that I can get my hands on.

I also think that the bare Tex that he writes is easily as readable as any of the latex I have ever written. Even the somewhat archaic looking cweb that he writes is actually more approachable than I would have expected.

Egg cooking is just one of the most gadget filled areas of kitchen accomplishments I personally know of. Seems every time I'm in the grocery store I see some new little device that lets me microwave eggs "perfectly." A friend has a toaster that will toast your English muffin at the same time that it steams an egg. Neat and all, but amusing in how little work it actually saves.


As someone who has spent a long time trying and failing (and taking classes) on how to cook eggs correctly, I completely understand the hopeful nature of an egg cooking gadget. That said, I've never found one I preferred to the standards.

For what it's worth, I do prefer Scala to Java ;)


I'm curious on how you are trying to cook your eggs, now. :) I fully acknowledge that there are some ways I don't have the patience to do. The double boiler method, for instance. I also just don't care for cream in my eggs. Plain scrambled works fine for my tastes.

Regardless, I am not trying to deter folks from Scala. I don't necessarily hate Java, though. I have been less of a fan of static typing in recent years than I was before, though. My stance is more on tooling than it is on typing. (Granted, I realize that the typer is just a tool, as well. Seems common in the statically typed world, for all tools to converge into one, though.)

Honestly, I could rewrite your sentence on cooking eggs correctly to "using the typesystem correctly" and it would be about the same for me.


It's a double-edged sword. C++ is C++ partly because the language doesn't change willy-nilly from underneath you. I could probably still compile C++ code from 1996, you can't even compile Scala code from 2012.

To you, that's a feature. To me, it's a sign that the language designers don't know what the heck they're doing and are throwing everything at the wall, watching what sticks and what doesn't.

I have very little faith in Scala because of this. It's a language that doesn't know what it is now, what it wants to be tomorrow, what it wants to look like to act like...I despise C++ but I'd never, in a million years, start a new project in Scala. I'd have to rewrite it in a year.

Maybe once the Scala gurus figure out what they're doing and release some kind of stable "spec", it'd be worth checking out. Until then, it's just a cool toy.

On how that's done, look at Go. Go 1 was released over 2 years ago and there have been no backwards-breaking changes yet, nor will there be. That's how you build a community.

My 2 cents.


Many people here don't seem to understand what is meant by backwards compatibility in the context of the JVM. Scala has very good source compatibility; this means that Scala code from 2012 compiles just fine with the latest Scala compiler.

Compared to Java, Scala has poor binary compatibility between major releases: a library compiled with Scala 2.10 will not work with Scala 2.11. In Java you can use jars created with Java 1.3 in your Java 8 project.

Go doesn't guarantee binary compatibility either; they (just like Scala) guarantee source compatibility.
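This binary-incompatibility is why sbt bakes the Scala binary version into artifact names via its `%%` operator. A build.sbt sketch (the `%%` convention is real sbt; the library and version numbers are just examples):

```scala
// build.sbt sketch: %% appends the Scala binary version suffix,
// so with scalaVersion := "2.11.2" this resolves scalaz-core_2.11,
// not the binary-incompatible scalaz-core_2.10 artifact.
scalaVersion := "2.11.2"

libraryDependencies += "org.scalaz" %% "scalaz-core" % "7.1.0"
```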


> I could probably still compile C++ code from 1996, you can't even compile Scala code from 2012.

You might be surprised. Try compiling PRCS [1], for example.

There are two problems that you're likely to encounter. One, C++ standards compliance has always been a challenge (as a result of overly complex language design), and older compilers often weren't compliant. g++ 3.x in particular comes to mind. The other problem is that people often inadvertently used non-portable language extensions that allowed non-compliant code to compile.

That's not counting the language-independent issue with evolving libraries, where the 1996 version compiles, but is essentially useless, and the 2014 version is subtly (or not so subtly) different (if it hasn't been discontinued).

[1] http://prcs.sourceforge.net/


Did you just make that up, or is there any actual experience backing up your claims?

Sorry, but that just reads like the usual "let's throw some random claims together I have read on the Internet from people who never actually used the language either" which everyone knows by now.

Do you have anything worthwhile to add to this discussion?


>Do you have anything worthwhile to add to this discussion?

Do you?

For one, he mentions what he wrote is just his "2 cents", and he is entitled to his opinion.

Second, he provides factually true statements to support his opinion, e.g. that you need to make syntax changes to compile previous Scala code on later versions of the Scala compiler.

You might agree or not agree with him, but he states what he believes, and supports it with arguments and counter-examples (e.g the Go reference).

OTOH, besides the insult to the parent, your comment is content-free.


>> Do you have anything worthwhile to add to this discussion?

> Do you?

I already have?

> he is entitled to his opinion

This doesn't mean he should be immune to scrutiny when making claims which are intended to drag the discussion into boring flame-wars.

> he provides factually true statements to support his opinion

Like "I'd have to rewrite it in a year."? Have a look at how long Scala releases are supported, both officially and with commercial support. That statement is just factually false.

Apart from that it's the usual comparison with C++ which don't provide any interesting insight except showing that the author has not much clue about the topic.

> release some kind of stable "spec"

Scala, like Java or C++, are evolving languages. While having some kind of mechanically checked spec (which is what Scala developers are working on) would be great, we have to accept that almost no language manages to do that.

All three languages (and most other languages as well) have compilers which differ from the spec in some cases, and specs which leave out important details important to compiler implementations. I don't see the reason for singling out one language.

> Go [...]

Well, it's not that hard to have a stable language (let's just ignore all the changes in Go which broke programs for a minute; Go's runtime and its "GC" are probably the largest offenders) if it could have been rightfully called obsolete in the 1970s.

What's missing in this comparison is the level of usefulness achieved by the language. As an example, Whitespace and Brainfuck have probably been stable right after their creation; it's just that they are not that useful for solving today's problems.

More expressive languages are harder to keep stable, but it's not that you don't gain anything in return.


That also means that people will never pick it for anything that needs long term development/maintenance.


They will if they know what's good for them. If not (and I have no doubt that often this is the case) then that's their problem.


This assumes that past releases are not maintained, updated and stop existing as soon as a new version is released. None of this is true.

Java people do the same thing: They pick a version and stay with it, and migrate if/when it makes sense for their project.


Actually past releases do stop being maintained, updated and realistically can no longer be used for production, or even development due to bugs.

Java is backwards compatible in the syntax, and future compilers have, so far, been able to compile old code as well.

Will that be the thing for Scala?


There's Scala 2.9 (and older) code out there in production. Maintenance for older versions can be had.


Scala 2.9 is 3 years old. That is a trivially short amount of time.


Seems to be on par with the practice of other vendors:

  Java SE releases are updated for the public with bug fixes,
  security fixes, and minor updates for a period of at least
  3 years before the release reaches end-of-public-updates (EoPU).

http://www.oracle.com/technetwork/java/eol-135779.html


You can't possibly be comparing the binary and source compatibility of Java with Scala, can you? I'm not an expert, but I haven't experienced a binary Java incompatibility in the wild, and I've been using Java since before 1.0. As for source compatibility, you can count on one hand the incompatibilities between Java 1.4, released in 2002, and the current release; and Java is not even held up as a paradigm of long-term support!

Instead of trying to shoot down everyone that points out an obvious disadvantage of Scala, and the backwards compatibility issue is an obvious one, a better approach would be to highlight what that trade off provides. Namely, Scala is free from the problems brought on by Java's strict backwards compatibility requirements and can continue to dramatically improve from version to version.


> Instead of trying to shoot down everyone that points out an obvious disadvantage of Scala, and the backwards compatibility issue is an obvious one, a better approach would be to highlight what that trade off provides.

I'd love to do that, but when people are only interested in repeating stuff they read somewhere on the internet, but not discussing interesting things like how union types will change the way people will write code, how collections could be improved, or how the arity-limitation of tuples could be dropped–because that would actually involve thinking and doing some research–it's a hard thing to do. Especially when it's likely that I'm going to be accused of "moving the goal posts" when doing that.

It's kind of sad that there is pretty much no insightful technical discussion happening on HN, but at least I can have some fun with "somebody is wrong on the internet" when people make wrong claims.


"I'd love to do that, but when people are only interested in repeating stuff they read somewhere on the internet..."

"It's kind of sad that there is pretty much no insightful technical discussion happening on HN"

Just as an FYI, in this 1 thread you've dismissively commented on some of the people responsible for the largest user bases of Scala and some of the oldest users of it.

Maybe there would be more insightful technical discussions about Scala on HN if some of us who have actually written Scala in the wild didn't have to put up with derision every time we mention fairly obvious problems that we've encountered and instead could say things like "man useful automated refactorings would be really useful and who cares about tuple arity above 23 f'n parameters"

Another hard thing to do is to deliver relevant business functionality. If your language/community doesn't make that easier, the faster it can be discarded the better.


> and instead could say things [...]

By all means say them, I think that would add something valuable to the debate.

I'm just bored by people who keep complaining that Scala is not Java (which won't change) or make ridiculous comparisons with C++ (because constructive criticism seems to be too much work these days). I think most people heard that stuff 5 years ago, and it's just a waste of time.

From my point of view, 80% of the comments are firmly focused on events in the past, not what can be learned from them for the future or what they would like to see in the next version.

----

So, given the announcement, which parts do you like, which parts do you consider unimportant and which parts are missing in your opinion?

Why do you think removing the arity limit is not interesting? What's your approach when interacting with databases, where tables with tons of columns seem to occur in practice?

What automatic refactorings would you like to see? Which impression do you have regarding ScalaIDE vs. the IntelliJ plugin?

If you had a few wishes about improving Scala in the future, what would they be and how would you implement them?


"So, given the announcement, which parts do you like, which parts do you consider unimportant and which parts are missing in your opinion?"

Things I love: Union/Intersection types. Any simplification of the type system with regards to type members. Any removal of special XML features. Any fixing of the travesty that is the collections library.

Things that are unimportant: "The type system will have theoretical foundations that are given by a minimal core calculus." A type system being theoretically sound doesn't interest me. What does this provide me in practice? Macros/Reflection (not that it isn't actually important, but unless Macros are central to your paradigm like in Lisp, they are a hack, who cares how hacky they are). Parallel collections.

Things that are missing: Structs. Structs. Real f'n Structs (granted this may be a JVM limitation but I'd trade nearly everything for them).

"Why do you think removing the arity limit is not interesting? What's your approach when interacting with databases, where tables with tons of columns seem to occur in practice?"

Because the only reason I can think of for having giant sized tuples is in a code generation scheme most likely for an ORM system. Code generation in an ORM system is generally a bad idea and it bothers me that an ORM system seems to have the ear of the language designers. When I encounter dbs with tons of columns, I fix the db using standard and proven SQL techniques like normalization, views, and stored procedures. I don't need or want a language crutch to get around those standards.

"What automatic refactorings would you like to see? Which impression do you have regarding ScalaIDE vs. the IntelliJ plugin?"

In an ideal world the Scala refactorings would achieve parity with the Java ones. As it stands currently, there are a fraction of the number of refactorings, and they aren't actually safe, even on simple ones like rename or move. Using Eclipse is a non-starter for me. I prefer Vim or IntelliJ. That the Scala maintainers have a de facto standard that goes against my (and many other people's) preference is a real problem. I would much prefer that the de facto standard was removed so that any solutions they came up with in the tooling space were of necessity workflow-agnostic.

"If you had a few wishes about improving Scala in the future, what would they be and how would you implement them?"

The single most important next step for the Scala community is to come up with an idiomatic style. As it stands, if you open a library, go to a new company, or even look at the standard libraries, there is no single "right" way to write Scala code. For instance, lots of effort has been expended to enable type classes in the language, but you can't use standard type classes to enable for syntax (to be honest I'd be fine with the removal of the for syntax entirely, but understand that is a pretty out-there proposition). A decision around what is and is not idiomatic code, and a systematic rewrite of the standard libraries to follow it, is in order. Better yet, a program similar to Go's gofmt that just makes decisions about what is right or wrong is in order. The XLint/future features are a good start (notwithstanding the neutering that happened with procedure syntax) but I would love to see something more aggressive.
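For readers unfamiliar with the type-class machinery alluded to above, a minimal sketch (the `Show` name is a common convention; everything here is illustrative):

```scala
// Type-class pattern: behaviour is supplied through an implicit
// instance rather than through inheritance.
trait Show[A] { def show(a: A): String }

object Show {
  // Instances in the companion object are found via implicit search.
  implicit val intShow: Show[Int] =
    new Show[Int] { def show(a: Int): String = a.toString }
}

object TypeClassDemo {
  def describe[A](a: A)(implicit s: Show[A]): String = s.show(a)

  def main(args: Array[String]): Unit =
    println(describe(42))
}
```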

Finally, Typesafe the company and Scala the community need clearer boundaries. It is troubling to me that the Akka/Play/Sbt/Slicks of the world get obvious preferential treatment when it comes to language prioritization. If you want to do something outside the norm in any of those functional spaces, Scala becomes hard to use (and in the case of Sbt they are actually internally inconsistent). I'd prefer a more hands off approach by the language that let the "market" decide the solution.


I agree with most of your comment. Thanks for writing it down!

First, the agreement:

Yep, structs are a VM thing. We are very excited about Project Valhalla (http://cr.openjdk.java.net/~briangoetz/valhalla/ etc).

Refactoring in a dependently typed language is hard. There's room for improvement. Scala-refactoring is a great example of community contribution.

Scala style checking is on our roadmap for 2.12. I'm procrastinating reviewing the first PR for the tool (abide) right now.

Finally, I would like to clarify we are doing everything we can think of to encourage the community to help shape the design and implementation of Scala. We publish roadmaps, review (and even rework) community PRs, do most of our team comms publicly (we are a distributed team), solicit proposals for language changes (e.g., http://docs.scala-lang.org/sips/pending/42.type.html). We spun out library modules that are now being maintained by the community. I spent last year simplifying the core build, so that we can move to a standard sbt build this year, which should also make it easier to contribute. As a first step towards updating the [spec](http://scala-lang.org/files/archive/spec/2.11/), we converted it to [markdown](https://github.com/scala/scala/tree/2.11.x/spec).

That said, I think it's only natural that the Scala committers on Typesafe's payroll are more inclined to work on customer support issues / internal Typesafe support. We always try to strike a fair balance to give back to the community.

Please let me know how we can do better, here or via contact info in my about.


> It is troubling to me that the Akka/Play/Sbt/Slicks of the world get obvious preferential treatment when it comes to language prioritization.

As a member of the Play core team at Typesafe, I can say it hasn't been my experience that Play gets preferential treatment when it comes to Scala. If you feel that this is the case then please cite concrete instances of where this has occurred.

> Compile times and sbt's incremental compilation support is an obvious example of this

I do not agree. This has been a general issue and not directly associated with Play. Play has provided some use-cases in this regard, but that is all.


Thanks a lot, this is a really interesting perspective!

I'll comment tomorrow!


Ok, so here is my response.

I'll probably focus a bit on the things you think are not as interesting. Not because I dislike your opinion, or disagree with it, but because I think it might be interesting to share my view of how and why even an uninteresting feature X might add value in the end.

> Union/Intersection types. Any simplification of the type system with regards to type members.

Agree. I think it will be very interesting to see how many ugly LUBs will go away due to that. Also, having first-class support for type lambdas will make some people's lives vastly easier.
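To make the "ugly LUB" point concrete, here is a minimal sketch (names are illustrative): without union types, the compiler infers the least upper bound of the two branches, which is rarely the type you actually want.

```scala
object LubDemo {
  // Scala 2 infers the least upper bound of Some[Int] and Left[String, Nothing],
  // which collapses to the unhelpful Product with Serializable.
  def pick(flag: Boolean) =
    if (flag) Some(1) else Left("no")

  // With first-class union types one could instead ascribe the precise type:
  //   def pick(flag: Boolean): Some[Int] | Left[String, Nothing] = ...
}
```

At runtime the values are still what you expect; it is only the inferred static type that degrades.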

> Any removal of special XML features.

I guess it's the right thing to do, but I'll probably miss that feature ... not sure if it is possible to achieve feature parity with string interpolation.

> Any fixing of the travesty that is the collections library.

Yes, I agree that a lot of work could be done there.

> "The type system will have theoretical foundations that are given by a minimal core calculus." A type system being theoretically sound doesn't interest me. What does this provide me in practice?

I think the benefit is that it might free up the time of compiler developers, because they might be able to skip the step of figuring out whether code X should compile or not. And if we know the type system is sound, the chance of having code that throws ClassCastExceptions in unexpected places gets smaller. But I guess the main benefit is easier reasoning, both for language developers and language users, which leads to fewer edge cases and feature interactions down the road.

> Macros/Reflection (not that it isn't actually important, but unless Macros are central to your paradigm like in Lisp, they are a hack, who cares how hacky they are).

I'll come back to this in a minute.

> Things that are missing: Structs. Structs. Real f'n Structs (granted this may be a JVM limitation but I'd trade nearly everything for them).

Absolutely agree on that. But this really depends on the VM, there is not much Scala can do to work around the lack of value types (see the limitations of AnyVal). I really hope Oracle manages to ship those things with JDK 10.
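For context, value classes (AnyVal) are the closest thing Scala currently has to structs; here is a minimal sketch, with illustrative names, of both the benefit and the limitation referred to above:

```scala
// A value class: at most one val parameter, extends AnyVal. In simple,
// monomorphic code like the + below, no wrapper object is allocated.
final class Meters(val value: Double) extends AnyVal {
  def +(other: Meters): Meters = new Meters(value + other.value)
}

object MetersDemo {
  // The limitation: as soon as Meters is used generically (List[Meters],
  // pattern matching, treating it as Any), it gets boxed anyway.
  def total: Double = (new Meters(1.5) + new Meters(2.5)).value
}
```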

> Because the only reason I can think of for having giant sized tuples is in a code generation scheme most likely for an ORM system. Code generation in an ORM system is generally a bad idea and it bothers me that an ORM system seems to have the ear of the language designers.

ORMs are the Vietnam of computer science, but I think that the need to represent properties of a database in code is not restricted to just ORMs.

Especially when you have multiple consumers of databases, it's often hard to correct the database schema without having to touch multiple projects in different languages for different customers.

I don't see that having better tuples would be a crutch to get around bad database design. I think that especially with first-class union and intersection types, better tuples will get vastly more useful and will reduce the current fragmentation in the library space (tuples vs. HLists, etc.).

> [...] I would much prefer that the defacto standard was removed so that any solutions they came up with in the tooling space were of necessity workflow agnostic.

I think this is the huge opportunity macros give us. Eugene Burmako gave a talk recently about scala.meta and one of his examples was showing how people can build IDE-independent rewriting, reformatting and refactoring functionality using Scala's macro features.

> The single most important next step for the Scala community is to come up with an idiomatic style.

Agree. While a lot still needs to happen here, I think there was quite some progress in the last years. From my point of view it seems as if the style of Scala code out there is slowly converging. While the adoption of the recommendations in the style guide was good in my opinion, I think the upcoming framework for configuring compiler warnings and the possibilities of scala.meta to build "go fix" like tools will certainly make an impact.

> Finally, Typesafe the company and Scala the community need clearer boundaries.

Agree. From what I have heard, I think the right people are aware of these concerns now.

> It is troubling to me that the Akka/Play/Sbt/Slicks of the world get obvious preferential treatment when it comes to language prioritization.

Can you expand on that?


By picking tacit standards for the messaging, web, build, and ORM stacks, Typesafe introduces a chilling effect on other projects in those functional spaces. This chilling effect means less choice and competition, which causes quality degradation in the long term. For instance, I don't think sbt would have survived its dreadful beginnings if it weren't for the tacit backing of the core Scala team, and to be honest I think the Scala community would have been better off with sbt dying young.

Further, by assuming that those stacks are the tacit standards it decreases the diversity of opinions for the language designers to hear about how the community is using their language. For instance Play is very opinionated about how web projects should be created. But it isn't actually the only way people are using Scala for web deployments. Further, once the core team starts to assume that people are using those stacks they can fob off problems onto them instead of centrally fixing the root cause problems. Compile times and sbt's incremental compilation support is an obvious example of this.

Finally, just my own opinion is that they've drawn the lines about what will be Typesafe and what will be "the community" in weird places. Refactoring tools seem like a very central requirement of the language yet that is left to the community. An ORM on the other hand has been elevated.


What's the alternative, though? By offering a nicely integrated platform, I would hope the community benefits from the halo effect this has on Scala. Our goal is to make the whole Scala eco-system successful, but we have to pick carefully what we work on ourselves.

The Scala team is absolutely interested in hearing how Scala is used. We had 800 users from all over the world show us their projects in Berlin not long ago. On the more technical side, we build 1M lines of OSS code every night to make sure we don't break anything out there, and we'll be more than happy to include more projects (https://github.com/typesafehub/community-builds).

I see our role as providing the platform for others to build on (examples of extension points: compiler plugins/macros/sbt's client-server-split/abide as an extensible stylechecker). I'd love for us to work on scala-refactoring, for example, but it's not immediately a priority since the community is already doing it, and we have so many other things to do. A tool like "go fix", which is closely related to code formatting, refactoring and style checking, is on our planning horizon (as mentioned in the roadmap that sparked this discussion).

> Compile times and sbt's incremental compilation support is an obvious example of this

I don't understand. You seem to be implying incremental compilation is restricted to sbt: it's not! We expose this as a service through the zinc project, which is used by the IntelliJ plugin (name hashing support added here: https://github.com/JetBrains/intellij-scala/commit/4831c0357...) and many other build tools (e.g., Gradle, Maven).

Also, regarding sbt: it survived because of the awesome sbt community. The core Scala team (which I lead) is shamefully still using ant, although we plan to move to sbt this year.


"What's the alternative, though?"

The easiest answer to this (and what most language teams do) is not have tacit default functional stacks. Typesafe have gone the odd route, not the standard one.

"I would hope the community benefits from the halo effect this has on Scala"

That may or may not happen, but my experience has been the opposite. An example is Play (which I quite like). It is a very opinionated web framework, and many developers have quite rightly decided it didn't match their opinions. By tying the brand of Scala to the brand of Play, you make it so that I, as a Scala advocate, have to fight for both together, even though there are many other Scala web frameworks, and even when I'm not fighting for a web project at all; the negative connotations of Play leak into the discussion either explicitly or implicitly. Think of the Ruby case, which is currently more extreme than the Scala one. If someone says they do Ruby development, it is quite natural to assume they do Rails, even though there are other Ruby web frameworks and Ruby the language can be used for any number of things.

This impact is even worse for something that has a really bad reputation like sbt. For many developers their first impression of Scala came from fighting sbt, and that negative experience colors their whole opinion of Scala. For years I literally said some variation of "just skip sbt and use maven/gradle/ant etc." trying to get people to divorce Scala from sbt in their minds. Sbt has finally reached a point where instead of saying that I just say, "you'll get used to sbt", which is better, but I can't with a straight face say that sbt is not part and parcel of the Scala experience (for instance, you mention some sbt work is part of the Scala core team's responsibilities). If nothing else, the other stacks you may encounter make use of it, so you need to have an understanding.

"You seem to be implying incremental compilation is restricted to sbt: it's not!"

I'm sorry if I implied that, you certainly can use zinc standalone or with other build tools besides sbt but the experience is not very polished outside of the sbt use case.

"Also, regarding sbt: it survived because of the awesome sbt community"

No it didn't. For years it was worse than the other alternatives and the only advantage it had was that it was the "Scala" build tool. It is now at a point where it needn't be actively avoided but when I think of all the sunk cost spent on building sbt and fighting to build with sbt it makes me irrationally angry.

"The core Scala team (which I lead) is shamefully still using ant, although we plan to move to sbt this year."

There is nothing shameful about this at all and I wish you'd stick with Ant for all the reasons I've mentioned above.


Thanks for pointing all this out. I honestly don't think of the Scala brand as being tied to the Play framework. Typesafe actually pushes Play on Java more than it does Play on Scala (even though we think Scala is better suited, obviously). We try to decouple Scala from Typesafe and our technologies in two ways: we provide Java programmers with an excellent experience on akka & play, and we welcome external maintainers for any part of Scala.

We happen to steward the Scala project and fund a lot of Scala core development (along with EPFL and research grants), but we're more than happy to share that responsibility with anyone who steps up, as has recently happened (baby steps) with the Scala modules that we spun out of the core for this reason.

PS: The only things that make me angrier than ant are jenkins and bash. (The build systems you're not angry with are probably the ones you haven't spent enough "quality" time with.) In any case, thanks for persevering with sbt and scala!

PPS: I don't think this is an odd route for a small company; how do you make money by developing just a programming language and nothing else?


I like how most of the sentences in this comment try to present some questionable claims as facts, with the usual comparison with C++ casually thrown in.


I'm not a fan of rigor; I prefer to communicate in a short, simple way and let people fill in the blanks. But here's some better argumentation:

There are non-obviousnesses in the Scala syntax: Watch this talk from Scala Days: http://www.parleys.com/play/53a7d2c7e4b0543940d9e551/chapter...

A language with backwards syntax compatibility will have either constant or deteriorating syntax over time: This requires a cost function for language and library syntax. I define this as being a combination of syntax size, syntax consistency and syntax readability (as measured by learning curve for new developers). Syntax here includes the syntax and interface to features in libraries, not just the core language. Obviously, already existing syntax cannot be improved upon without breaking compatibility. This is bad as long as the existing syntax is not perfect, which is (almost) always. One can add better, additional syntax to mitigate but size and inconsistency will grow and new developers will have to understand the old syntax anyway, so you cannot fix existing bad syntax by adding new replacements. Any new syntax will either adhere to old, bad rules or break consistency with the old, neither of which are optimal. Conclusion: Backwards compatibility means you're fucked.

C++'s syntax is bad: This is straddling the line between subjective and objective. I can't imagine any measure by which this is not true. If you want to claim otherwise please remember that extraordinary claims require extraordinary evidence.


I object to your claim of Scala developers "not taking backward compatibility so seriously" and the "lax attitude".

Just compare Oracle's approach (letting an employee announce on Twitter that they are changing generics in Swing, and that people should download a JDK 9 build, compile their code, and see whether things break) with Scala's approach of automatically and continuously building ~1,000,000 lines of publicly available code against Scala to detect breakages before they happen.

Or compare the tools Java and Scala give you for managing deprecations, both in the library (@deprecated) and in external tooling (the migration manager).
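For readers unfamiliar with the first of those tools, here is a minimal sketch (method names are illustrative) of how @deprecated manages an API change without breaking callers:

```scala
object Api {
  // Callers of oldMethod still compile, but get a deprecation warning
  // carrying the message and the version in which deprecation started.
  @deprecated("use renamedMethod instead", "2.12.0")
  def oldMethod(): Int = renamedMethod()

  def renamedMethod(): Int = 42
}
```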

Scala developers go to great lengths to improve and evolve the language while minimizing the work needed to move between language versions. See the recent improvements in Eclipse for an example.

I also find the comparison with C++ pointless, boring and, to be honest, ridiculous, as well as the remark about "less bad" syntax.


Oh, yeah, I exaggerate sometimes. Not really planning to stop, honestly. I think Java never breaks compatibility in the core language or standard library, right?


They started to do that recently.


How exactly? You can still -target 1.4 and there hasn't been any removed APIs.


Incompatible changes:

  - Ongoing changes to method selection within major and minor releases
  - Various breakages when generics were introduced
  - Various breakages when Swing was generified in Java 5, 7 and 9
  - Change of visibility or removal of methods caused by security issues
  - Deprecation and upcoming removal of various methods due to modularization plans


> ... letting an employee announce on Twitter that they are changing Generics in Swing

I hadn't heard that - could you point me to a reference please?



The #1 item on Aida specifically mentions that breaking backward compatibility is an option.

The #1 item on Don Giovanni is "clean up syntax".


I hate that they don't take backwards compatibility seriously.

There is an enormous cost to a lack of reliability and consistency in your tools; every time a significantly popular language generates bitrot through an incompatible change, hundreds of thousands of man hours are wasted.

I've seen too much perfectly good, well-made, entirely stable code that did not need any changes bitrot, simply due to a lack of platform care for stability.

Ignoring stability is merely a way to optimize for the laziness of the few implementors, as opposed to optimizing for the many users. If you can't support an API over time, then don't ship it; if you're shipping APIs and language design features that require regular churn, the problem is in your design process.

I've got a firm eye on Java 8+ as a route away from Scala.


We do take compatibility seriously. Others have given examples in this thread how we do this technically. I don't think any design process can guarantee perfection at first try, hence we prefer steady improvement over stagnation. When we decide it's important to change something, there's advance warning through deprecation.


If you took it seriously, you wouldn't break the language with every major release, and you'd spend more time not shipping things that are obviously poorly thought out.

No design process can guarantee perfection at first try, but that means you have to invest the effort to maintain what you produce that's imperfect. Simply accepting that you'll produce garbage is how you produce more garbage.

"Stagnation" is what happens to the hundreds of thousands of lines of code in the world that you bitrot with every breaking change. When you invest more effort in craftsmanship, you're not stagnating the language.


I realize I exaggerated by using the word "lax", btw. Not my intention to hurt any feelings. Sorry.


I really wish they'd stop grouping procedure syntax in with things like XML literals and forSome syntax. Procedure syntax is still used the majority of the time when applicable, judging by the commits on GitHub, and it has several benefits, whereas the others are rarely if ever used. Removing it while leaving things like postfix operators in, but behind a -feature flag, is senseless.


Compared to the other features you mentioned, procedures are actively harmful. That's the reason why they are (slowly) going away.


I disagree with this line of reasoning that, because procedures are actively harmful, procedure syntax should be removed from the language. My contentions are: that declaring procedures actively harmful is incorrect; that making procedures look more like functions is counter-productive; and that making them harder to write doesn't make them written less often.

A. It has not been demonstrated that all procedures are harmful.

The most common testing framework for Scala is JUnit, at least among IntelliJ developers[1], and the most common way to write tests in JUnit is, as far as I can tell, with @Test def testSomething {} or def testSomething {}; both of these are procedures. They are good procedures! I don't want to belabor this point because I am not a fan of procedures and I am suspicious of them; I try to zero in on them in code review. But I don't feel that forcing `: Unit =` accomplishes anything meaningful in discouraging their use.

B. Making procedures look more like functions is more harmful than allowing them to have their own syntax.

When I am reviewing code, I will immediately scan for = versus {; procedures written with : Unit = don't stand out as much to me as ones written in procedure syntax. Having read hundreds of thousands of lines of Scala code, I can quickly see the procedures versus the functions. To me, making procedure syntax go away actually makes procedures blend into functions and sends the message that procedures are just functions that return Unit. Which, while true, probably isn't what you want to be thinking when you see a procedure. When you see a procedure, your brain should switch immediately into an interrogative mode: what side effects is this code up to, and should this thing exist? Or, if I am reviewing a good programmer's code: hmm, what made the poor bastard have to write a procedure here, and can I throw him a lifeline and get his code out of this quicksand?

C. Even if procedures are harmful, that doesn't mean that procedure syntax is harmful.

Procedure syntax is the only safe way to use type inference when writing a procedure. Well, it's not really type inference, because it's its own syntax, but that's because there isn't a consistent way to know Unit is going to be inferred. Forcing the programmer to use 10 extra keystrokes every time they write a procedure accomplishes what, exactly? Either they don't understand that the procedure they are writing is harmful, in which case it now looks even more like a normal function to them, or they do understand, but their task requires it, and writing : Unit = doesn't accomplish anything.

Thousands of good, working Scala procedures are written every single day and committed to GitHub. Thousands of terrible ones are also written. And let's not even get started on method-local procedures, which often take the form of @tailrec def loop {}, which is, in my experience, a much better way to write while. To me this whole idea of killing procedure syntax is akin to removing type inference from var because it's bad practice. Sure, using vars is bad practice, but when it's needed, why are we punishing the developer who already has to write some bad code? The poor guy already has to write some vars; now we are really going to stick it to him by forcing him to write like it's Java and do more typing, because someone deemed vars bad and therefore they should be harder to write? In Scala we've come to the consensus that we like type inference and we prefer less code to more; I even remember reading an EPFL paper about how, because Scala has fewer LOC, it is more readable.

As a counterpoint, to me "actively harmful" also describes breaking most examples of procedures that already exist on Stack Overflow, including good old def main(args: Array[String]) {}.

Also, I believe that if the forced type-annotated Unit syntax were superior, developers would already be using it; they aren't stupid. It's considered good practice among many developers (certainly more than oppose procedure syntax) to always type-annotate public methods; should that be required as well? Imho a -language:requirePublicAnnotations flag that could be toggled on/off as desired would actually be much more helpful in producing readable APIs and public Scala code.

Having said all that I don't think I am going to convince the anti-procedure-syntax people. This is some serious bike-shedding territory. So can't we all just get along and compromise? You might not agree with my reasons, but hopefully you agree that I didn't just make them up for no reason and that they are pretty strongly held.

If it is the contention that removing procedure syntax is an improvement to the language, there is a process for improving the language that is supposed to be followed: creating a SIP (Scala Improvement Proposal). Further, rather than invalidating a coding style still actively used by a majority of Scala programmers, why not follow the lead of -language:postfixOps and propose adding a -language:noProcedureSyntax flag to the language? That could be done for 2.12 (or even 2.11.3), codebases could start improving sooner according to the no-procedure-syntax people, and I could start using -Xfuture on my company's codebase again.

1: http://parleys.com/play/53a7d2c5e4b0543940d9e546/chapter4/ag...
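The method-local loop idiom from point C might look like the sketch below (written with an explicit : Unit = rather than procedure syntax; names are illustrative):

```scala
import scala.annotation.tailrec

object LoopDemo {
  // Sums 1..n using a local tail-recursive procedure instead of a while
  // loop; @tailrec makes the compiler verify it compiles down to a loop.
  def sumTo(n: Int): Int = {
    var acc = 0
    @tailrec def loop(i: Int): Unit =
      if (i <= n) { acc += i; loop(i + 1) }
    loop(1)
    acc
  }
}
```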


In what possible way is the procedural syntax harmful?


People want to write methods, but forget the =. In the worst case, the result is silently swallowed and the bug manifests itself only later at the use-site, not at the point of declaration.
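A minimal sketch of that failure mode (names are illustrative). Procedure syntax def doubled(x: Int) { x * 2 } desugars to the first method below, so the computed value is silently discarded:

```scala
object ProcedurePitfall {
  // What the forgotten '=' gives you: the body is evaluated, its result
  // is thrown away, and the method returns Unit with no compile error.
  def doubledOops(x: Int): Unit = { x * 2 }

  // What was intended: with '=', the return type Int is inferred.
  def doubled(x: Int) = x * 2
}
```

The mistake only surfaces at a use site, when doubledOops(21) turns out not to produce a value at all.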


The exact same complaint could be registered about any function that doesn't have an explicit return type, yet there is no move to make that mandatory.


No, but omitting a return type for a non-private or implicit member is definitely a style violation.


No. If the method has no explicit return type, it is inferred. It's really reducing the usefulness of type inference if a trivial error can flip the switch from "the return type is computed from the last expression of the method body" to "the method body is ignored completely, and () is returned instead".

Apart from that, there is a move to make explicit return types mandatory in some places; for instance when defining implicit conversions or when declaring abstract methods in traits.


And yet, if there is an idiomatic Scala (and the lack of one is a huge problem with the ecosystem), it's that nobody uses function syntax for Unit return types. So in order to get rid of a class of bug that is minuscule compared to others, you are going to require hundreds of thousands (millions?) of lines of code to be changed. It's that kind of lack of real-world perspective that causes the Scala community to be derided.

The most recent version of the Xlint/future features are useless because of this blind spot making so much noise in the output.


> And yet, if there is an idiomatic Scala (and the lack of one is a huge problem with the ecosystem) it's that nobody uses function syntax for Unit return types. So in order to get rid of a class of bug that is minuscule compared to others you are going to require hundreds of thousands (millions?) of lines of code to be changed.

If there's an automated conversion tool, who cares?


Will the tool scan github, epfl papers, and the rest of the internet and fix the Scala there, which is still being produced using procedure syntax?


It will scan the code you clone, copy, etc. from those places.


Are the writers of the automated tool going to indemnify me if they introduce bugs?


No, you'll have to review the patch yourself. We also don't indemnify you when we introduce a regression. We do have support contracts for those who want the fastest turnaround in fixing those...


I don't mind if they remove procedure syntax. It's annoying special syntax that confuses newcomers (I know it confused me!) for little gain.


How many LOC are you responsible for maintaining? Of all the confusing things in Scala the procedure vs method syntax seems like a trivial one, yet in deployed code bases most people use the procedure syntax. It has become a defacto standard. That the language designers got it wrong is bad, but not worse than the fix (or if they are going to "fix" this they should do it more slowly and not screw up Xlint/future).


I agree it's not the most confusing thing about Scala. It just doesn't help. It doesn't confuse me now (I remember programming languages where the distinction between function and procedure is important), but it makes little sense and I'm glad this syntax is being removed for future generations of Scala devs.

Scala is a relatively new language that already has made too many concessions to backwards compatibility with Java, resulting in a less elegant language than it could have been. I'd rather they broke compatibility a few more times and got rid of their earlier mistakes.

This is not very important, though. I'd rather they focused on other, more important improvements first.


It's not a minuscule bug. Some older thread where people discussed this showed that it is a huge issue in practice.

The change to method syntax is both binary and source compatible (forward and backward); and as mentioned in the article no one is expected to do it manually.

Additionally, compiler warnings will become configurable, so that developers can choose which warnings add the most benefit to their workflow.


I've written/read/reviewed lots of Scala code and can't remember seeing it be an issue. Certainly not a huge one. I'd like to see a pointer to that thread if you can find it so that I know what to look out for. Maybe include that as justification in the SIP.

If this were such a 'huge' issue, I strongly question how code of such importance would be written without a unit test accompanying it that calls the method at least once. If you are going to write code where having Unit inferred is going to be a 'huge issue', what the %$&# are you doing not unit testing that method? Is it reasonable to assert that we make the unit test harder to write (unit tests are typically procedures) and make this bug even harder to catch in the first place?

If this is done with a -language flag or made an opt-in warning then that sounds very reasonable. It's going to be a lot worse if the first thing a newbie sees when they copy-paste a main method in from an old tutorial is 'Error: Procedure syntax is deprecated'. No matter what, the parser is going to have to parse the procedure syntax anyway, in order to recognize that form and give a reasonable error message for new users copy-pasting in code. So it's not like all that parsing code disappears.

>Additionally, compiler warnings will become configurable, so that developers can choose which warnings add the most benefit to their workflow.

My understanding was/is that this is going to become a compiler error. If it is just going to be an opt-in -Xlint:procedureSyntax thing, then why isn't it being done for 2.11.3? If the plan is just to make an opt-in warning I can rest easy, because I will simply never turn that linter warning on. I would, however, suggest considering a -language flag so that, like postfix ops, it can be disabled/enabled where needed.


The warning is already there: -Xfuture -deprecation. We decided to delay making it a plain deprecation until we have said tool to help you stay deprecation-warning-free. You are addressing deprecation warnings in a timely fashion, yes? :-)


It feels like a faster compiler is still a ways out.

That's really too bad.


I really do not understand this continued complaint about Scala compilation. I've been developing in Scala full time for over 4 years and have never had compile times be the major bottleneck in my workflow. Between incremental compilation and good modularity, test times dominate my "waiting" cycles, and that is on the order of seconds. Hardly noticeable, given my think/type loop is much slower than that.

I have any number of complaints about Scala but compile times is way down on my list. I'm generally interested in what sort of compiler performance is expected.

That said, it might be a generational difference as I've worked on projects where full compiles happen overnight...


How much code are you compiling? At Foursquare (with 100s of thousands of lines in our bigger servers) compilation times are awful. We're solving this by better modularizing, but it's a long journey.


Without significant use of sub-projects (i.e. a highly modular and well thought out application structure), you're basically screwed on the compilation front. The difference is absolute night and day: the compilation hit is reduced by orders of magnitude and development becomes enjoyable.

Will still be a compilation hit for very large projects, but less than with an everything depends on the kitchen sink approach, been through that pain point ;-)

My biggest gripe these days is with the presentation compiler in Eclipse, that can get pretty laggy when working with heavily inferred code, your own, or a 3rd party lib like Shapeless, Scalaz, etc. Not sure if IntelliJ is much better in this area, would be surprised, inference is costly.


While I won't defend the Scala compilation times, your complaints about modular application structure being ultra important for sane compile times is true for every statically typed language.

I completely agree with you with regard to presentation of code. A simpler language would allow for simpler error highlighting in your editor which would be especially valuable. Lots of languages also suffer from this problem though...


Ya, agree with everything you said. Unfortunately untangling a hairball is a bit harder than constructing one. Working on it though!


Admittedly, the largest Scala code base I've worked on was ~200K LOC, which is pretty small. But even there, a full compile/test cycle took minutes in the worst case, dominated by test time. Incremental compile/test cycles were measured in tens of seconds.

What do you mean when you say "awful" and how does that compare to your test cycles?


It's too easy in Scala to create a giant mudball codebase where every file depends on half the project. Managing the compile graph is almost impossible when you have 20 hackers on the same codebase.

Sometimes I wish Scala had a compiler flag that signaled these problems: maybe a restricted mode, or something annotation-driven.


I have not found that attribute to be significantly different in Scala than in other languages. That is, it seems to be a central problem with all of them.


I think you can speed things up a bit by manually annotating the types of objects, in some instances.
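A hedged sketch of what that looks like (the names are made up): annotating public members pins down their signatures, so the compiler and the IDE's presentation compiler don't have to re-run inference at every use site, and incremental compilation has a stable API to diff against.

```scala
// Illustrative only: the annotations don't change behavior, they just
// state types the compiler would otherwise have to infer.
object Inventory {
  // Explicit type on a public val; without it, the same (somewhat
  // complex) type would be inferred at some compile-time cost.
  val index: Map[String, List[(String, Int)]] =
    Map("fruit" -> List(("apple", 3), ("pear", 5)))

  // Explicit return type on a public method keeps its signature
  // fixed even if the body changes.
  def count(category: String): Int =
    index.getOrElse(category, Nil).map(_._2).sum
}
```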



I tried it; it's not flexible enough for my needs.


File issues and describe your use-case! :-) The author usually seems to be quite responsive.


Well, when you work in Ruby, Python, or PHP all day long, almost any compile is annoying. Even Xcode feels fast compared to my experience whenever I periodically try Scala in, say, IntelliJ.

If you've ever tried Go, compiles are basically instant. I don't love Go as a language, but gosh the compile times make it attractive as a daily driver language compared to Scala.

Scala feels like a great language that I should enjoy and I just don't, mostly because of the 5-10 second turnaround that I always feel when I try out Scala every 6 months to a year.


Having spent a fair bit of time working on large projects in dynamic languages, I think compile times are completely overrated as an issue. Much more important are build times, that is, compile/test/package loops. I hardly ever compile in isolation; I'm usually running tests, doing code generation, and building artifacts as well as compiling. As soon as this is true, compilation times tend to be dwarfed by the rest of the process. Hence my point about compile times being low on my priority list (though I won't defend them).

For instance, lots of people mention Go compile times in discussions such as this, and I find that argument uncompelling for the simple reason that on small projects the difference is imperceptible, and on large projects the other factors I mention dwarf compile times in either case.

I will say thanks for the concrete time period, though. A 5-10 second turnaround seems like a trivial time to argue about to me: if I have to look up from my editor for 1 second, it might as well be 30. Anything more than 30 is context-switch time. I haven't had to deal with incremental build/test loops longer than 30 seconds in a long time, and Scala certainly doesn't require them. Again, I'm perfectly willing to concede this could be a generational difference, as I've worked on projects with compile times measured in hours and (automated) test cycles measured in days.


It has improved a lot since 2.9, at least.



