
Wow, 39.9% of respondents mark themselves as "Expert" at CSS. I've been doing front-end in various guises for 12 years and I would only rank myself maybe 5 or 6 out of 10.

As I've gotten more experienced in my career I've become more aware of what I don't know as opposed to what I do know.

It's funny that "Advanced" is mastering animations and transitions whereas "Expert" is being able to develop a full front-end consistently. I know many people, myself included, who can build a front end but are nowhere near mastering anything, especially animations and CSS transitions.

I write this in jest of course. As we all know, self-assessment is about as reliable as...



Half of my income comes from training people in JS and Python. About half of those engagements are requested as "advanced" courses.

The vast majority of the participants are not up to the task. They are paid professionals, but they consistently overestimate their level, skills, and needs. They often are not very good, and of course, they don't know what they don't know.

So I always check for that first and rewrite my course on the fly to include the untold prerequisites I identify. I don't tell them I'm doing this, to avoid making them feel bad. Nobody wants to hear "dude, you wanna learn React and you've never used an arrow function, wtf?" even if being comfy with JS is a listed requirement on the site.

Bottom line: most devs are not experts, and don't have the ability to assess that. But more importantly, it's enough for them to be productive and most industries don't need more.


If I've learnt anything from teaching kids, it's that the ones who are really good at it when we start end up completely dwarfed by the kids who sucked at it.

You can get pretty far without really understanding what's going on, but if you stop and have to learn something really fundamental, the "good" kids just seem to muddle through. Once things pick up, the "slow" ones can excel because they have truly moved on: the previously "good" ones are still copy/pasting from past work or tutorials and getting brackets wrong, whereas the "slow" ones are now perfectly happy writing from scratch and thinking for themselves.


It's not just the people; some topics lead to that. E.g. you can get far with just a few git commands, but one day you will hit a situation that feels like a brick wall, because you have no idea what's going on inside.

Today, most JS devs don't know how "this" works, most Python devs have never used pdb, most git users picture a branch as a diverging chain of commits rather than a moving label, etc.

Again, it's not the people's fault: they are required to be productive very quickly. There is competitive pressure. So they aim for what's good enough to make things that look right. Can you blame them?

Few jobs pay devs to update their skills on company time, and not everyone will spend an hour a day learning new things instead of playing with the kids, exercising, or going out with friends.


> most git users picture a branch like a diverging chain of commits and not a moving label

Ok, I'll bite: can you explain that a little more, please?


A git repository is best thought of as a sort of tree, where the nodes can potentially have multiple parents. This technically makes them a graph ("directed acyclic" to show I do know the technical terms), but our human intuition will mostly work thinking of them as a tree. To understand multiple parents use your intuition about human beings, since we all have multiple parents.

A "branch" doesn't really have any special status in git. All it is is a pointer to a particular commit. Literally, that is all it is, even in the data structures. It has no additional ontological existence beyond that. When you make a commit to a "branch", all that happens is that git moves the branch marker along to the next commit. All of the "specialness" of a branch lies in that behavior.

You can do anything you want with that branch marker, such as yanking it to a completely different part of the tree via "git reset --hard $NEW_COMMIT", and while you may confuse humans, you haven't confused git in the slightest. On the flip side, since branches don't actually "exist", it is perfectly sensible to "merge" with a direct commit hash, because that's all a branch is: a pointer to a hash. You can pick up that pointer and drop it down wherever you want.

(A tag is the same thing as a branch, except that when you're "on" a tag and make a commit, git does not advance the tag. Almost all a tag is is a pointer to a commit. Technically it can carry a bit more data, like a signature, but as far as the core structure is concerned it's just a pointer to a commit.)
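To make this concrete, here is a small sketch you can run in a throwaway directory (assumes git is installed; /tmp/branch-demo is just an example path). It shows that a branch is literally a file holding a commit hash, that committing moves the branch, and that a tag stays put:

```shell
set -e
rm -rf /tmp/branch-demo && mkdir /tmp/branch-demo && cd /tmp/branch-demo
git init -q
git config user.email demo@example.com && git config user.name demo
git commit -q --allow-empty -m "first"
BR=$(git symbolic-ref --short HEAD)   # default branch name (master or main)
cat ".git/refs/heads/$BR"             # the branch: just a file with a commit hash
git tag v1                            # a second label on the same commit
git commit -q --allow-empty -m "second"
# The branch file now holds the new hash; the tag still holds the old one.
[ "$(git rev-parse "$BR")" != "$(git rev-parse v1)" ] && echo "the branch moved, the tag stayed"
```

Nothing here is special-cased by git: both labels are plain refs, one of which the commit machinery happens to advance.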

I am not the original poster, but I have a training I give at work for git, and it is heavily based on making sure this understanding is presented. (The other reason I wrote my own training rather than reusing any of the many other existing ones is mine makes sure to walk the trainees through the confusing states you can get into, then we explain what happened, why, and how to correctly get out of them. They follow along as we all manipulate a repository, and the training actually spends quite a bit of time in a "detached head" state, checking out, merging, and resetting to commits directly.)


> makes sure to walk the trainees through the confusing states you can get into, then we explain what happened, why, and how to correctly get out of them

Is this something you could share?


- detached HEAD: "git checkout existing_branch" or "git checkout -b new_branch"

- you moved somewhere, you don't know where you are, and you can't get back: "git reflog"

- local repo and remote with a different history (e.g. you rebased a published branch): have the whole team sync with the remote except you, then hold. Export your remaining changes as a patch. Re-clone. Apply the patch.

- remote has a different history than the rest of the team (e.g. you force-pushed a different history): delete the remote, recreate it, re-push from one of your teammates, then apply the previous solution.

- you messed up your merge and wish you had never done it: "git reset --merge"

- the last commit is not published and you messed it up: "git commit --amend"

- the last commit is published and you messed it up: "git revert HEAD"

But rather than solving problems, better not to get them in the first place. Always "git status" before anything, always get a clean working copy before checkout/pull, create a fat .gitignore, etc.
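As a runnable sketch of the last two recipes (assuming git is installed; /tmp/fix-demo is just an illustrative path): --amend rewrites an unpublished commit in place, while revert records a new commit that undoes a published one:

```shell
set -e
rm -rf /tmp/fix-demo && mkdir /tmp/fix-demo && cd /tmp/fix-demo
git init -q
git config user.email demo@example.com && git config user.name demo
echo "v1" > file.txt && git add file.txt && git commit -q -m "good commit"
echo "oops" > file.txt && git add file.txt && git commit -q -m "bad commit"
# Not published yet: rewrite the last commit in place
echo "fixed" > file.txt && git add file.txt
git commit -q --amend -m "bad commit, repaired"
# Already published: add a new commit that undoes the last one instead
git revert --no-edit HEAD
git log --oneline    # history keeps both the mistake and its revert
```

The difference matters because --amend changes a commit's hash, which is fine locally but breaks anyone who already pulled it; revert leaves published history intact.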


Not meaningfully. It isn't written as a blog post document; it's a series of commands and presentation notes, designed to be delivered live by me. You can basically obtain what I have in the document by combining A: what I wrote above B: a good git tutorial, see internet and C: some screwing around with using git checkout, reset, and merge with commit hashes directly on a little repo you use locally.


Of course. Someone explained it to me, after all.

In git, a branch is NOT like a wooden branch on the trunk of a tree, although it ends up being at the top of one, which makes the analogy ok.

A git branch is the same thing as a git tag, except it moves automatically when you commit from it.

You can see it as a lightweight label attached to a commit. If your HEAD is itself attached to a branch (HEAD is also a lightweight label, but this one is like a red sticker on a map saying "you are here"), when you do "git commit", the branch label is moved to the newly created commit.

Hence, you can have several branches on a single commit, and you can move branches around: they are just labels. You can also have branches with different names locally and in a remote, or have a branch with the same name, but on different commits locally and in a remote.
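A quick sketch of several labels sitting on one commit, and of moving one of them around (assumes git is installed; /tmp/label-demo is an arbitrary path):

```shell
set -e
rm -rf /tmp/label-demo && mkdir /tmp/label-demo && cd /tmp/label-demo
git init -q
git config user.email demo@example.com && git config user.name demo
git commit -q --allow-empty -m "first"
git commit -q --allow-empty -m "second"
git branch feature              # a second label on the commit HEAD points to
git branch experiment           # a third label on the same commit: labels are free
git branch -f feature HEAD~1    # move the 'feature' label to the older commit
git branch -v                   # three labels, now spread over two commits
```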

A branch is like a reference: it's useful to tell git what place in the history you are talking about without referring to a particular commit. It is a moving reference because the place in history you're talking about is always changing: it's the top of a chain of commits, the ever-changing last thing you worked on for this particular part of the history.

We want branches because they are a convenient way to say "I'm going to do something on the latest commit of this part of the history":

- "git checkout branch_name" => I'm now working on this part of the history, gimme all the latest stuff you got and consider everything I do is now related to it.

- "git checkout branch_name /file/path" => Get the latest version of the file from this part of the history and bring it on my hard drive

Of course, you can put a branch at a commit that is not yet the top of a chain of commits. But soon that commit will become the top of a new chain, because when you commit from this branch, a new commit will be attached to the previous one, diverging from the original chain, and the branch label will be moved to it. You now have a fork of two chains of commits, each with a different branch label at its top:

"git checkout commit_hash && git checkout -b new_branch_name" => I'm going to consider this commit as a new starting point and work from that.

In fact, you can move a branch pretty much anywhere, and create or delete one at any time, including "master", which is not a special case, just a branch created automatically on the very first commit.

This is also why, if you "git checkout commit_hash" instead of "git checkout branch_name", git warns you that you are in a "detached HEAD" (the "you are here" label is attached to a commit directly, not to a branch label). Indeed, a chain of commits (the stuff that looks like a wooden branch on a trunk in graphics) can exist without a branch label. But it won't be convenient to reference: looking up the hash of the latest commit every time you want to mention it is tedious. Besides, git runs "git gc" automatically once in a while, deleting diverging parts of the history with no branch or tag, so you may lose this work.
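Here is a minimal sketch of getting into, and cleanly out of, a detached HEAD (assumes git is installed; /tmp/detach-demo is an example path):

```shell
set -e
rm -rf /tmp/detach-demo && mkdir /tmp/detach-demo && cd /tmp/detach-demo
git init -q
git config user.email demo@example.com && git config user.name demo
git commit -q --allow-empty -m "first"
git commit -q --allow-empty -m "second"
git checkout -q HEAD~1        # the "you are here" sticker now sits on a raw commit
git symbolic-ref -q HEAD || echo "detached HEAD: no branch label under us"
git checkout -q -b rescue     # plant a label here so the spot stays easy to reference
git symbolic-ref --short HEAD # back on a branch: prints "rescue"
```

Planting the "rescue" label also protects any commits made while detached from being garbage-collected later.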

This makes it clear that tags are labels like branches, only they don't move. They serve as a static reference to a fixed point in time: both allowing you to easily talk about it and letting git know you want to keep it around.

All that stuff is way clearer with a visual explanation. For my git training, I bought a fantastic magnetic toy meant for 10-year-olds that lets me build a 3D history tree and use post-its to represent branches and tags. It's instantly obvious. It's really fun: you can move stuff around and show what each command does to the tree. After that, git seems much more accessible, and you just grunt at the terrible UI ergonomics.


> In git, a branch is NOT like a wooden branch on the trunk of a tree, although it ends up being at the top of one, which makes the analogy ok.

So, there's an isomorphism between (1) a commit node; and (2) the chain(s) of commits ending in that node.

Why then is it important to think of "branch" as referring to one rather than the other? As evidenced by the isomorphism, they're the same thing.


It helps to NOT think of a branch as a commit node, nor a chain of commits ending in that node.

A branch is a separate concept; it's not even stored in the history DB but in a different directory. See it as a label attached to a commit, since a commit can have several branches attached to it, or none. A branch will be moved across the history, while a commit will keep its place in it.

A branch is designed to be easily created, deleted, and moved around, freely, from anywhere to anywhere.

A commit is designed to feel immutable, and unmovable, and although this is not strictly true, this is how you will mostly use it.

A chain of commits is like a rail track, it goes in one direction, each commit composing it never changing (again, conceptually), and never moving. You stack commits, you grow the track, piece by piece.

The branch is more like a flag you put somewhere on the rail track (most often at the end), to let the workers know where to put the next piece.

Picturing it that way lets you make the best use of its properties:

- branches cost nothing. They are very lightweight, unlike in SVN; you should create as many as you want.

- moving to a branch is cheap. Apart from moving the files into the working copy, it's just a matter of changing your point of view on the history. Switch branches often; it's fast.

- you can put a branch anywhere you want. You like an old commit and wanna try something new from it? Plant your flag here and start working.

- deleted a branch by mistake? No worries, it's just a label. You can recreate it in a blink.

- this branch is so good it should be master? Sure you can. Just swap the labels. But let everyone know :)

Etc.


> branches cost nothing. They are very lightweight, unlike in SVN, you should create as many as you want.

Ahhhh. So the distinction matters if you bring in a bad assumption from SVN. It had never occurred to me that a labeled branch would be noticeably more expensive than the weight of the commits making up the branch.

> you can put a branch anywhere you want. You like an old commit and wanna try something new from it? Plant your flag here and start working.

Well... yeah. That's how you create branches. You go to the commit you want to branch from, and you run `git branch`. The fact that you can do this is immediately implied by the way you have to do it. I don't understand how someone could believe something different.

> deleted a branch by mistake ? No worry, it's just a label. You can recreate it in a blink.

This one's a little weirder; if you delete a branch, you _can_ recreate it, but it's only blink-of-an-eye easy to do if there's another branch containing the head of the old branch. Otherwise you're mucking around in the internal data.

> this branch is so good it should be master? Sure you can. Just swap the labels.

This is another one where I don't see how you would believe something different. Even if branches were huge, heavyweight objects, they have names, and changing the name of something is generally not so hard to do. There's a command solely for the purpose of renaming branches. (`git branch -m`)

> moving to a branch is cheap. Apart from moving the files to the working copy, it's just a matter of changing point of view on the history. Switch to branches often, it's fast.

And in my mental model, a checkout is indeed a (potentially mass) update to file contents (and, if applicable, file existence). Saying "apart from moving the files to the working copy" sounds -- to my ears -- kind of like the chestnut that "controlled for height, taller men earn no more than short men do". Setting up the working copy is the thing I'm looking to accomplish in a checkout.

I can imagine two types of people who want git training:

- Never used version control.

- Used subversion heavily; trying to update to the new new thing.

In your opinion, how many of these points are "git gotchas" that the first group need to be trained in, and how many are "subversion gotchas" that only really come up for the second group?


> In your opinion, how many of these points are "git gotchas" that the first group need to be trained in, and how many are "subversion gotchas" that only really come up for the second group?

Many years ago, SVN users coming to git were indeed a huge source of head scratching.

Not anymore.

Now we just have a lot of people who don't think in a way that would lead them to say "So, there's an isomorphism between (1) a commit node; and (2) the chain(s) of commits ending in that node." In fact, they don't know what isomorphism means, nor that the word exists.

Personally, I like to picture git as applied graph theory. One command = a bunch of operations on the graph.

But there is an old git joke that says:

"Git gets easier once you understand branches are homeomorphic endofunctors mapping submanifolds of a Hilbert space".

It is nonsensical (it's a meta reference to a monad joke), but I think it makes the point perfectly: we need to bring git to the people who need it, not ask the people to come to git. I don't care what level of abstraction they are comfortable with; I want them to feel like they can trust their ability to be productive with the tool and feel their project is safe.

And I find that this way of explaining branches is the most universal way for people from all sorts of backgrounds to get a decent model of how git works under the hood, so that when they fall into a trap, they can use this model and find a pragmatic way out by themselves.

Being right used to be my objective when I was younger. Now I just want to be helpful.


> This one's a little weirder; if you delete a branch, you _can_ recreate it, but it's only blink-of-an-eye easy to do if there's another branch containing the head of the old branch. Otherwise you're mucking around in the internal data.

git reflog is very useful for this. Also, when you delete a branch, git helpfully prints

Deleted branch foo (was da2bb5d)

so you already have that information, right there.
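A sketch of that recovery, assuming git is installed (/tmp/undelete-demo is an arbitrary path). Here the hash is captured before deletion; it's the same value git prints in the "(was ...)" message:

```shell
set -e
rm -rf /tmp/undelete-demo && mkdir /tmp/undelete-demo && cd /tmp/undelete-demo
git init -q
git config user.email demo@example.com && git config user.name demo
git commit -q --allow-empty -m "base"
BASE=$(git symbolic-ref --short HEAD)   # default branch name (master or main)
git checkout -q -b foo
git commit -q --allow-empty -m "work on foo"
SAVED=$(git rev-parse foo)              # the hash git will print as "(was ...)"
git checkout -q "$BASE"
git branch -D foo                       # prints: Deleted branch foo (was <hash>)
git branch foo "$SAVED"                 # the label is back, nothing was lost
```

The commits were never gone; only the label was deleted, and a label is trivial to recreate from a hash.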


Was going to say that. reflog should be in every tutorial.

When I started using git, I checked out back in time, then ran git log and was baffled not to see my most recent commits in the listing. I panicked, thought I had lost my work.

Git's terrible UI didn't help there: who thought it was a good idea to just hide everything with no hint of how to get back to it?

Of course, I could have just run git checkout master to get back to where I was; I just didn't know it. But that's the point of reflog: if you are not sure what you did or where you are, it's always there for you. It has your back.
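That scenario can be replayed in a throwaway repo (assumes git is installed; /tmp/reflog-demo is just an example path):

```shell
set -e
rm -rf /tmp/reflog-demo && mkdir /tmp/reflog-demo && cd /tmp/reflog-demo
git init -q
git config user.email demo@example.com && git config user.name demo
git commit -q --allow-empty -m "first"
git commit -q --allow-empty -m "second"
git checkout -q HEAD~1    # "back in time": git log no longer shows "second"
git log --oneline         # only "first" is listed, which looks like data loss
git reflog | head -n 3    # but the reflog remembers every move of HEAD
git checkout -q -         # "-" returns to the previous checkout (the branch)
git log --oneline -1      # "second" was there all along
```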


That's ok, but these people need to stop labelling themselves as experts. It makes it difficult for true experts to stand out.


Oh, one can identify experts very easily: they are the ones who understand nuance, see and explain things in terms of cost/benefit instead of good/bad, and have an up-to-date view of their ecosystem of choice while being ok with other techs.

The problem is that they are easy to spot only for somebody with experience in the field. This is why companies sometimes pay good freelancers to help with hiring: they can assess what the company can't.

And once they do, they realize quickly that:

- there are few experts

- most resumes are bs

- experts are expensive

- they probably don't need them for most tasks and the bs resume dev may be ok, given the price and skill constraints

- they had one expert internally all along, doing half the job of the rest of the entire team and they should make sure this expert stays


Maybe experts don't call themselves experts.

I have a friend that had some home remodeling work done, poorly. The friend said to me, "But, they were professionals!" I responded, "Well, that's the problem. A professional is someone who will do the least amount of work in the least amount of time for the most amount of money. If you want someone who actually cares about the work, hire a craftsman instead of a professional."


Well, it is said that a wise man never calls himself a sage. Others do.


Yeah, this expert prefers Guru. /s


> you wanna learn react and you never used an arrow function, wtf

At least in the early days, using React was the first time most people would be exposed to webpack/babel, so it was also their first exposure to the newer JS features.


But you didn't need arrow functions at the time, because createClass autobound "this" and you used methods. Now, with extends and hooks, it would be a pain.

Besides, it's alright to learn ES6 and ES7 with React if you have time, but if your company pays for a $10k React training in a 3-day session, you'd better have the prerequisites right.

And at the very least, don't pretend to be "an expert" in 2019 if you don't know ES6. Every beginner tutorial includes it now. Every doc of every lib uses it.

You don't have to be an expert. It's legitimate. I don't advertise myself as a JS expert despite being pretty good at it.

But it looks better on the resume, and once you are hired on that assumption, you can't go back. Next time the company asks, you say expert again.


But they've been part of the ES standard for nearly 5 years now.


If you don't mind my asking, how did you get into training people in JS and Python, and how do you like it?


When I was at the university I was enraged by the grades the other students were getting: it was not because it was complicated, it was because the topic was poorly explained.

So I asked the uni to borrow a classroom and hold my own courses, a summary once before every exam. The grades went up and I felt useful for the first time in my life.

I kept explaining things to friends and colleagues after that, while being a full-time dev. That's the only thing I was consistently better at than most people. There were better devs, but I could explain Python decorators to a junior like it was the most obvious thing in the world, and I felt proud of it.

I also was still regularly irritated by the terrible state of teaching. Why do people explain React by first installing webpack + babel when you can just drop in a script tag and start playing? Why does nobody tell you what a bloody env var is, as if you were supposed to be born knowing that? Why does no beginner course ever include an intro to a debugger; do you enjoy having your students suffer? Why do they keep using the confusing reverse arrow in git history diagrams, as if technically correct were the best kind of correct?

One day, I decided to do charity work in West Africa. I ended up staying there for 2 years and eventually created my own company. Things were good.

Then the war started. My company went down, my bank account was frozen and I was forced back home, in debt. Fun times.

I looked for missions to make cash quickly, but in France dev work doesn't pay that much. So I dug around and noticed that training professionals was incredibly well paid: thousands of dollars for a few days of work, which for the country is insane. It felt like stealing. It still does sometimes.

But I needed the money, and I thought I was good at it, so I spammed dozens of entities. After a few courses, in which everybody was surprised there were no slides and the program kept changing, the phone started ringing.

Apparently most professional trainers suck too, and companies were astonished that not only did the trainees enjoy the sessions, but they gained productivity afterward. Why did they pay so much money before, then? It's a mystery to me.

But I've kept at it ever since.

I usually do a training once a month: JS, Python, git, a lot of web, a little bit of data analysis, best practices, or design patterns.

But I still dev the rest of the time; otherwise I would always be going from planes to hotels, and that's no life. Besides, a trainer who doesn't code becomes obsolete very quickly.

Yet I try to get remote gigs for American clients: French customers never pay as much, and they are very risk-averse, so the projects are less interesting. I do keep a few, because we know each other very well and working together is a blast: we can have drinks, tell offensive jokes, and still get the stuff done. Money is not everything.

All in all, I like training people. It's frustrating, infuriating even sometimes. And exhausting. But when I'm fed up with computers, I get to see people. And when I'm fed up with people, I go back to my computer.


One thing I miss about university is exams.

I miss being able to focus for just 2 hours on a task that would then give me a clear view on how well I knew the important mental models of a topic.


This is the Dunning-Kruger effect.


The description of "Expert" in the survey probably caused this.

"Expert: Being able to style an entire front-end from scratch following a consistent methodology"

Meanwhile, "Advanced: Mastering animations, interactions, transitions, etc."

I can certainly "style an entire front-end from scratch" without "mastering animations".


This makes little sense. According to this, I'm an expert but not advanced.

CSS architecture (for lack of a better term) is completely different from its more playful aspects. Stacking one on top of the other is just silly.


The "expert" description is far too vague and easily interpreted as less advanced than the "advanced" category. Results for this question are meaningless.


Exactly my reaction. I do think that you can create stunning animations, and for that you need some expertise, but animations are hardly the basis for a complete and successful front-end.


You're absolutely right. I'd bet the CSS experts understand the context of the question and the difficulty involved in "styling an entire front-end app from scratch with consistency." I think a lot of novices looked at that and thought it meant using one of the css grid / framework libraries and making a website. The question hides how difficult that actually is with CSS and how easy it seems if you've never had to do it.


(Survey author) Yes, it seems I made a mistake with the different level descriptions. For me, knowing how to do things like transitions and animations is comparatively "simpler" than being able to architect an entire CSS front-end, including managing a design system, naming, specificity, etc., but I guess many respondents didn't see things that way.


I don’t think you’re wrong on that front. Because I seldom use animations, for example, I tend to have to look up the syntax, but that’s no big deal.

Knowing how to actually build things from scratch in a way that survives the beating of time is something else. It’s much harder to quantify though. Also, these days you have the css in js solutions, which means you can build the app by dropping in the css where you need it, so you get to tick that box, but if you tried to do it the “traditional” way you might not even know where to start.

EDIT: two of my more common phrases at work are "you could just use CSS for that" and "you don't need to use flexbox to do that". I've found there's a lot of using CSS to style things in isolation without understanding the power and performance it brings to the table. I spend more time trying to stop React renders from happening when CSS should have just been used than I do on the micro detail of ordering things a certain way to avoid repaints.


The problem is that you're trying to mush multiple dimensions into one axis, in the same way that political pundits try to shoehorn all debates into an arbitrary left-right axis even though political alignments are actually highly multidimensional. You should either query the multiple dimensions separately or ask only for a single dimension with a more clearly focused definition.


I don't think you made a mistake. This is just one of those topics that is subjective for some and objective for others.

As others have mentioned, getting people to assess themselves accurately is very difficult. We always seem to skew towards the overly confident side, not least because this is what is usually rewarded in the job market.

Although it might be worth considering a simpler heuristic in the future as the distinction between Advanced and Expert is somewhat vague and perhaps ultimately not very informative.


Wrote my first piece of CSS in 1998, been doing it ever since. On a good day I'm a 4 :)


I got a resume a while ago where the applicant rated their skills 1-5 stars. They had rated themselves either 3 or 4 out of 5 in Node.js. When asked what they actually did with Node, they explained they had built some proof of concept once. I got to see the code, and it was maybe 4 weeks of work for a beginner...

At that time I was leading a small Node.js dev team and I would maybe rate myself 3/5 in Node. We didn't end up hiring them.


I don't see that as terribly problematic. Unless a specific absolute scale is identified (as per another comment, e.g., the top of the scale meaning "I wrote the book on it"), I would assume the ratings were all relative to each other and the candidate's general skill/experience.

So 4 - 5 stars in an area just means "this is what I'm most comfortable working with and/or have the most experience with", not "if you rank yourself a 3 then I'm better than you at this" or "I'm a world-renowned expert at this". Otherwise, I'm not sure what you would expect from a fresh graduate who'd decided to use this format; 1 star for everything, or fractions of a star?


My lesson from that experience would be to throw away that useless 1-5 star rating scheme and skip to the interesting question immediately: "Tell us what you actually did with NodeJS."


To be honest the problem here is the question not the answer. It’s a bad question and shouldn’t be asked.


Why is it bad to ask about the actual experience someone had with something that I will be hiring them for? That makes no sense to me.


I'd say that the topic of "node.js" skill covers such a wide range of skills, experience and scenarios, that you'd need to be an expert already to judge how little you know. Someone who'd applied it to a single domain successfully can reasonably feel like they've got a good handle on it.

Your question is essentially a proxy for server-side architectures and applications, not nodejs itself. It invites misrepresentation.


I'm still not grasping how the question "what have you actually built with node?" is "a proxy for server-side architectures and applications, not nodejs itself." and "invites misrepresentation."

I feel like you guys think I asked them to rate themselves 1-5 on the subject. I did not; I got their resume, where they rated themselves on the subject. With my question I was just checking whether that self-assessment was correct. I don't see how that is a bad question... Should I just believe their resume? Should I give them some programming assignments? Should I ask them to write a complex algo on a whiteboard? I feel like just asking is a good way, personally, but maybe I'm wrong.


Maybe it’s like this: Given that I have no experience with Go, I’d still rate myself at least 3 when someone asks me how well I’d be able to work with it because the job they’re asking it for is ‘back-end development’.

Of course this falls apart the moment they ask me any Go question during the interview, but not when I’m actually asked to build a web application in it after being hired.

(The one time this happened, people thought I'd been working with the new language for years, but no, it's just a whole lot of transferable skills.)


The self-rating is useful because it gives you a measure of their perceived proficiency in skills relative to each other. You still have to ask a question like you did, but now that you know that "3/5 stars in Node" means advanced beginner level, you have a good idea what to expect when they rate their SQL skills 5/5 or their CSS 2/5.


Sorry: I misunderstood slightly, thinking it was something you put to them. In our tech interviewing I ignore self-assessed ratings, finding them useless for the reason you do, but I don't hold it against them if they're inaccurate just for the reasons I outlined.


Asking them about their actual experience is a fantastic question. Sounds like you got a good picture of their prior experience doing so.

Asking them to rate themselves is useless, however, as their answer showed.


They said the applicant had rated themselves on the resume. They weren't asked to rate themselves in the interview.

That's how I interpreted the comment and I'm pretty confident that's correct.


I ask candidates to rate themselves so I know where to go with the first few questions.

I ask for a ranking from 0 (never heard of it) to 10 (you made the thing or wrote a book on it). Most people who have worked a couple of years answer no higher than 5 or 6, but recent grads will say 8-9.

Pure anecdata but supports other comments about not knowing what you don’t know until you’ve been around different subjects for a bit.

Example: frontend design is self-rated a 6, but the person cannot articulate the difference between raster and vector graphics (or doesn't know what an SVG is, or when to use one and when not to).


I interview interns from a school that asks all of its students to rate themselves like that.

At first I was put off, but now I interpret it as their own personal scale, with 5 being what they're best at, and not as "I know everything about this".

But even then, the best candidates are usually the ones who rated themselves the lowest.


That makes sense. The more I know about a topic, the more I know about how much I do not know. If I know a topic very well, I also know how very little I really do know, so I'd rate myself much lower.


I'd guess it is because the basis of CSS is pretty simple.

"You have selectors and styles, and I can just can look them up, so I guess I know all there is to know about it."


Second this - I'd have made the requirements for "expert" the "advanced" ones, and skipped the rather small part about knowing how to animate, as it doesn't constitute a "level" in itself so much as being one of the tools you'd probably know if you think you can write a frontend from scratch in CSS.

The "expert" category would rather be "having a deep understanding of what the browser does with styles and how the concepts of display etc actually work" for me.

But it's really hard to draw a line, css allows for awful hacky solutions that end up looking like they work, and many people "able to write a full frontend project" will produce rather bad CSS irl.


It is almost 2020, and browsers still trip on the simplest CSS "tricks."

I think the best CSS compatibility across the board was during the time of ACID series tests, and has declined since.


Probably because a lot of people have been working with it for a long time (like you). I think I started 20 years ago - but I just learned like 40% of css grid this past weekend. (I've been a happy flexbox user for a few years)

But I know what you're saying. I also noticed a few percentage points of people saying they have 20+ years of React experience at this point.


The `20+ years` demographic refers to 20+ years of JavaScript experience, AFAICT.


I had the same misunderstanding, thanks for pointing that out


ahh thanks for that, I misread the survey notes on that.


Probably the Dunning-Kruger effect in action :)

Wikipedia puts it well:

> It's a cognitive bias in which people assess their cognitive ability as greater than it is. It is related to the cognitive bias of illusory superiority and comes from the inability of people to recognize their lack of ability. Without the self-awareness of metacognition, people cannot objectively evaluate their competence or incompetence.

https://en.wikipedia.org/wiki/Dunning%E2%80%93Kruger_effect



I design in Sketchapp and then write my code accordingly. Sketch lets me make a consistent style. I know all of Sketchapp's keyboard shortcuts.


[flagged]


I'm not sure it's worth addressing this rant because it conflates so many things, but...

> "Oh, look, typescript! Finally JS looks like Java!"

TypeScript's structural typing is vastly different from Java. With Java, every type must be declared somewhere. TypeScript types can be declared, defined as classes, or just based on whatever you typed.

At its core, the TS compiler is just paying attention to what you do and making sure you don't forget what you did elsewhere. It's like having bumper lanes on a bowling alley.
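A minimal sketch of what that structural typing looks like in practice (the `Point`/`magnitude` names are made up for illustration). Nothing ever declares that it "implements" the interface; an object is accepted purely because its shape matches:

```typescript
// Structural typing: TypeScript checks shapes, not declared type names.
interface Point {
  x: number;
  y: number;
}

function magnitude(p: Point): number {
  return Math.sqrt(p.x * p.x + p.y * p.y);
}

// No "implements Point" anywhere -- this plain object literal is
// accepted because it happens to have the right shape.
const v = { x: 3, y: 4 };
console.log(magnitude(v)); // 5
```

In Java the equivalent would require a class that explicitly declares the interface; here the compiler just "pays attention to what you do", as described above.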

It's arrogant to think you'd get no use out of it when Microsoft, Facebook, and Google have all decided they need static typing on the back and front ends. If you think the cost is too high, that's a different argument, but I'd wager it's because of a poor IDE setup.

> Which means that silly HTML and CSS is going to be easy!

What does this have to do with TypeScript?

> Buzz off back to the backend, you strong-typed clown who doesn't understand what UX and a11y mean.

This is bizarre gatekeeping. TypeScript was adopted by Angular and, early on, seemed focused on browsers rather than Node.

> I know most people love it. I fucking hate it. I'm using it, because I have to, but I never needed it. It literally offers me nothing of value.

That's fine. You may believe it offers no value to you (although better auto-complete and errors are objectively valuable), but it does offer value to anyone who works on your code. TypeScript is like machine-readable self-documenting code.

If you document your code, it might as well be in a format the compiler can enforce and your coworkers can use without having to look back and forth between files.
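As a rough sketch of that "machine-readable documentation" point (names invented for the example): the same contract you might write as a comment becomes part of the signature, so the compiler and your coworkers' editors both enforce it.

```typescript
// Instead of documenting the expected shape in a comment:
//   /** user must have a name (string) and an age (number) */
// the type annotation IS the documentation, and it is checked.
function greet(user: { name: string; age: number }): string {
  return `Hello ${user.name}, age ${user.age}`;
}

// greet({ name: "Ada" }) would fail to compile: "age" is missing.
console.log(greet({ name: "Ada", age: 36 }));
```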


> I could go on and on about the horror that is typescript

In his talk "Predicting the Future of the Web" at the last ReactiveConf, Richard Feldman (not himself a typescript guy, but an Elm user) quoted an interesting statistic that most developers who try typescript never go back to writing plain javascript. I do not remember where that statistic was from, but can attest that with me it is the case. So much for the horror that is typescript.


> In his talk "Predicting the Future of the Web" at the last ReactiveConf, Richard Feldman (not himself a typescript guy, but an Elm user) quoted an interesting statistic that most developers who try typescript never go back to writing plain javascript. I do not remember where that statistic was from, but can attest that with me it is the case.

Same applies to me, but there's a flip side: Typescript (by its very name) is probably only going to attract the kind of users who appreciate the kind of things that Typescript provides.

People who insist that dynamic typing is best, that compile-time checking of code is wasted effort, etc. are probably never going to try Typescript in the first place (and then go back to JS later on, thus ruining the "never going back" statistic).


I feel like Typescript has been very successful (combined with the slightly earlier switch to Node) in converting those that were holding out against typing.


I also can't remember any technology that I've seen so few people complain about online. That was actually part of the reasons for us to start using it.


The worst thing about typescript (and all of the JS ecosystem right now) is the compile times.


> most of developers who try typescript never go back to writing plain javascript

yes, most developers adhere to RDD. That is, resume driven development. You ever see developers going back to Ruby from Node? Or from any shiny new thing to the slightly older but perfectly fine thing? I predict people will move on from TypeScript within 5 years. Purely because everyone will know TS so the differentiating value of TS will plummet.


This is really condescending. I don’t like writing JavaScript, I find writing TypeScript ok.

I avoid both when possible but if the business needs some front end or someone else has chosen Node I’ll oblige, and I’ll be happier if it’s TypeScript.

Either way it’s not contributing to my resume in a meaningful way as a backend developer.


Backend NodeJS is Typescript? At least if you know what you are doing.


"Most" is a fairly strong quantifier, and some people would probably disagree.

Anything new could be termed as an example of RDD, but if that were always the case, we never would have had JavaScript in the first place. One (IMO) useful skill is determining when something new actually provides ROI or is actually a manifestation of RDD or similar.

Unless a language has really great first-class pattern matching (JavaScript does not), I don't ever want to work on a serious project in a dynamically-typed language. The guarantees provided with static-typing and the benefits when it comes to bugs, testing, and refactoring are huge.

Sure, devs that partake in RDD might move on, but there is a good use case for TS that has nothing to do with resumes, and everything to do with a preference in how to build software.


Even if that is true, there is typically a reason things become popular. Right now only Flow / Elm / PureScript could possibly compete in that specific space. Do you specifically see people migrating en masse to those technologies, or do you think something better is coming... soon?

(Or are you literally saying people will go back to JS? - I don't know too many people excited about the prospect of losing all the contextual help and caught compiler errors - and I don't see that happening because of RDD anyway!)


I'm not above going along with tech I think is misguided or bad rather than fighting it, because I know it's trending up and will look good on my résumé. But ask me to solo-write a Javascript project and tell me delivery must be in Javascript, explicitly not in Typescript, and I'm writing that fucker in Typescript and delivering you the Javascript that tsc outputs, and I don't care if no-one ever knows I did that. It's for my own sanity.


TypeScript is kind of in its heyday though, it's fairly unlikely many teams would be comfortable making a bold decision to "switch back to JS", even if they want to.

Would need to be given more time for such a statistic to be meaningful, if it ever would be.


Not a single developer I know (I was in leadership position in a company with 300+ devs, now I lead a small startup dev team) cares about "heyday". They care about their experience during development being significantly worse with plain JS due to stupid ("we'll use TS because it checks names of properties") but also not so stupid ("we'll use TS because it checks that switches are exhaustive", "we'll use TS because it allows you to check nominal types") mistakes and bugs that TS helps to prevent. For all the devs I know, hearing that they have to code plain JS is the point when they head for exit.
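The exhaustive-switch check mentioned above can be sketched with the common `never` trick (the `Shape` union here is an illustrative example, not from any particular codebase). If someone later adds a variant to the union and forgets a `case`, the assignment to `never` stops compiling:

```typescript
// A discriminated union: each variant is tagged by its "kind" field.
type Shape =
  | { kind: "circle"; radius: number }
  | { kind: "square"; side: number };

function area(s: Shape): number {
  switch (s.kind) {
    case "circle":
      return Math.PI * s.radius * s.radius;
    case "square":
      return s.side * s.side;
    default: {
      // If every variant is handled, s is narrowed to `never` here,
      // so this compiles. Add a new variant without a case above and
      // this line becomes a compile error -- the exhaustiveness check.
      const unreachable: never = s;
      throw new Error(`unhandled shape: ${JSON.stringify(unreachable)}`);
    }
  }
}

console.log(area({ kind: "square", side: 3 })); // 9
```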


I never said people would "choose" TS because it's in its heyday, I said it'd be an unlikely time for most to wish to justify a switch back.


Most of the types you write in TS will be structural


Indeed, but the few ones that are nominal are making the difference.
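A common sketch of how those nominal types are simulated on top of TypeScript's structural system is "branding" (the `UserId`/`OrderId` names are hypothetical): intersect a base type with a phantom tag so that two otherwise-identical types stop being interchangeable.

```typescript
// Branded types: nominal typing simulated structurally.
// The __brand field exists only at the type level; at runtime
// these are plain strings.
type UserId = string & { readonly __brand: "UserId" };
type OrderId = string & { readonly __brand: "OrderId" };

const asUserId = (s: string): UserId => s as UserId;

function loadUser(id: UserId): string {
  return `user:${id}`;
}

const uid = asUserId("42");
console.log(loadUser(uid)); // "user:42"

// loadUser("42" as OrderId) would be rejected at compile time,
// even though both IDs are just strings at runtime.
```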


CSS is just key-value pairs. How hard can it be? /s

There is absolutely an air of arrogance when people approach seemingly simple things such as styling. And then they implement all sorts of atrocities.


Yeah, I think it might be similar to a lot of things actually. Maybe painting is a good example. Like, the techniques of using a brush in different ways can't be that difficult to learn, but putting them to use to create something great is something else entirely.

You can imagine someone learning those brush techniques and becoming quite confident that they could, theoretically, paint a whole painting, without ever trying and learning the difference.


You see this arrogance everywhere and it's really annoying. The classic joke about it is the "..first assume a perfectly spherical frictionless horse.." one.

or https://en.wikipedia.org/wiki/Spherical_cow


This is what happens when front end is marketed as "easy to get into".

I've had 2 acquaintances who work in the medical field ask me how to pivot to frontend web development. When they asked me about it, I told them that a modern frontend developer needs to know at least one of the major frameworks (ReactJS, VueJS, Angular).

They replied to me in surprise. "You mean CSS and HTML, right?"

That one made me mull over if I should get out of frontend web development altogether.


> I could go on and on about the horror that is typescript

As someone who has seen great improvements in delivered productivity and quality across an entire organization, more or less due to Typescript alone, I would be eager to hear what your complaints are.


The saving grace of Typescript is that types are optional. If types were mandatory for everything, it would be very hard for me to get a buy-in.




Guidelines | FAQ | Lists | API | Security | Legal | Apply to YC | Contact

Search: