Personally, it's much easier for me to be invested in a task when what I'm doing is 'interesting' to me.
Almost always, and ironically, when something's interesting to me it's usually more difficult work and thus contains far more unknowns, making estimates that much less reliable.
I'd like to propose an alternative to the alternative posed in the article:
Years ago I was tasked with building dozens of basic web forms. Immediately recognizing the inherent silliness of this task (boring, error-prone, etc.), I built a form creation tool, which all these years later lives on as RackForms.
Over the years the feature set grew organically to accommodate an ever growing set of demands. Yes we still build forms, but we also call into web services, display data, and so on.
Point being, RackForms and any number of other form builders are the blindingly obvious choice for this task -- it would be downright silly to code forms by hand in this era.
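To make the repetition concrete, here's a minimal, hypothetical sketch (in no way RackForms' actual code) of the kind of declarative spec-to-HTML step every form builder automates:

```python
# Illustrative only: a declarative field spec rendered to an HTML form.
# Hand-coding dozens of these is exactly the boring, error-prone work
# a form builder removes.
from html import escape

def render_form(action, fields):
    """Render a list of (name, label, input_type) tuples as an HTML form."""
    rows = []
    for name, label, input_type in fields:
        rows.append(
            f'<label for="{escape(name)}">{escape(label)}</label>'
            f'<input id="{escape(name)}" name="{escape(name)}" type="{input_type}">'
        )
    body = "\n".join(rows)
    return (
        f'<form action="{escape(action)}" method="post">\n'
        f'{body}\n<button type="submit">Send</button>\n</form>'
    )

print(render_form("/contact", [
    ("email", "Email address", "email"),
    ("message", "Your message", "text"),
]))
```

Once the spec is data rather than hand-written markup, generating the 40th form is as cheap as the first.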
And yet that's what we do with the web as a whole.
Once again Microsoft seems to have had the right idea in the good old Web Forms days. Instead of coding apps you'd simply drag and drop components and add wiring code to do what you needed.
Sure, it was never that easy, but it's hard to see how it wouldn't have gotten better given more time.
Fast forward to today, and as much as I adore, say, VS Code, it's really nothing more than a hyper-powerful text editor. It has none of that tooling to allow for drag-and-drop development. I'd propose that's the real solution to this issue: better tooling.
That's great if you're building an internal tool -- in fact we use the Django Admin site for this, and there exist low-code/no-code tools like Retool for exactly this; it's a great type of solution.
Where this breaks down is anything customer facing, like Hey (the product Hotwire was built for), Basecamp, or things like Thread, the product I work on. These products aren't forms or CMSs, and trying to fit them into those sorts of solutions results in a bad or counter-intuitive user experience, even if somewhere deep down these are essentially "CRUD" applications.
I understand your point here, but I think we're a couple of innovations away from dramatically reducing the cost of UI development.
The state of the art for web and mobile app development is a delicate stack of tools that requires a lot of expertise to wield. However, if you pay attention, most of the complexity of modern UI development is accidental, not fundamental.
Figma already has "auto-layout", which is essentially flexbox. Imagine a world where your designer just builds your whole app in Figma with flexbox-like constraints, then hands off to your developer, who just writes a couple of snippets of code to wire it up to a database and makes sure the elements are configured to be SEO friendly.
IMO developers dramatically underestimate the power of GUI tools to usurp their role because we think in terms of such low-level tools.
Oh yeah that is absolutely a direction I can see this going. But that's enabling completely custom UI/UX and behaviours per screen. It's the form-builder style generic tools that I think have limited uses in consumer software.
+1. It's very rare that I run into people on the internet who seem to understand this. We're all so obsessed with getting our delicate stacks of tools to work together that we fail to realize we're building at entirely the wrong abstraction level.
The solution to slow SPAs is higher level frameworks and tools that compile away the slow runtime. Svelte is a step in the right direction. We have many more to go.
Honestly, I believe the role of SPA front-end developer is overvalued at the moment, not because the job is easy, but because the tools are so poor. Once we see platforms that properly balance ease of use with expressiveness and composability, I believe there will be a huge opportunity for development agencies to undercut the market.
I've never seen anyone actually say MS got it RIGHT with webforms.
Webforms was their attempt to bring the VB6 workflow onto the web and it was a terrible idea specifically because the web is not desktop.
I get your point, but as far as I'm concerned any rationale that ends with the conclusion that MS got it right with webforms needs to be seriously reconsidered.
I'm gonna say it, then: it was pretty good tech. It worked similarly to Turbolinks/Hotwire, you could make reusable components like React, it had much better benchmarks than interpreted languages at the time, and it worked very well with or without JavaScript active in the browser. The WYSIWYG was nice, but it would have been alright without it.
What killed it was lack of community best practices. It NEEDED Custom Components and separation of concerns (using something like MVVM) for it to work well. But without a community around it (and with Microsoft's lack of interest), most people ended up writing spaghetti code.
MVC on the other hand had a large community around it developing best practices. MVC code could also be a clusterfuck if you wanted to: you could put all your logic in the view for example. The reason it didn't happen was because there was widespread education around the MVC paradigm. WebForms didn't have this.
Also, it didn't integrate too well with jQuery, which was the new hotness. Today most people are smart enough not to use jQuery to mutate a DOM tree generated by a React component, but back then jQuery was the new kid on the block, and people blamed the problems on WebForms rather than on the fact that they were mixing two incompatible paradigms.
You're confusing webforms with the asp.net framework.
The framework itself was mostly good tech, as long as you didn't use webforms.
Probably the biggest flaw in the framework was not having a way to capture all unhandled exceptions (Application_Error does not catch unhandled asmx exceptions).
No, I'm not confusing anything. What I'm talking about is exactly WebForms. Please don't distort my words.
WebForms was good, as long as you didn't try to use it in a way it wasn't intended, like doing direct DOM manipulation on top of the UpdatePanel, trying to access private state, refusing to use components, or not using separation of concerns. Those are the same things that would also make React apps shitty, but the reason React works is because in React we just don't do it.
WebForms was good, but it needed the same amount of good engineering that a modern React app needs to be good, period. Unfortunately lots of programmers using it back in the day didn't have this knowledge because of lack of good documentation and good practices.
> it had much better benchmarks than interpreted languages at the time
That seems an odd statement for someone not confusing WebForms with the tech stack.
You can build applications in asp.net framework without the use of WebForms (and you could do so before MVC existed). THAT tech stack was pretty reasonable. It was WebForms specifically that was the problem.
But it did have better benchmarks. Whether it was the merit of the language, interpreter, rest of the stack or anything else is immaterial. I'm just demonstrating that it didn't have performance problems. The reason I preemptively mentioned performance not being an issue is because you are not putting forth any arguments for why you dislike it.
While we're here, cherry-picking one sentence of a post and putting it out of context to try to invalidate all my other opinions is not arguing in good faith. You seem to be trying to prove that I'm "ignorant" or something, but you're the only one displaying any ignorance here, since you're not putting forth any argument, but rather just parroting that something is crap and trying to "win an argument" using fallacies. There is nothing to be learned from your posts at all, no different perspective, no justification of your opinion, no insights into why you dislike it. You're just doubling down on a preconception that you have no justification for.
> Webforms was their attempt to bring the VB6 workflow onto the web and it was a terrible idea specifically because the web is not desktop.
As for performance, ViewState was a huge performance issue. That's a large part of what I meant when I said they tried to bring the VB6 workflow onto the web. In order to get there, they had to create the ViewState mechanisms, which created a different set of problems.
They wanted developers to be able to drag and drop, then double-click and write a small bit of code to do a thing, just like in VB6. Only, in order to actually get that to work, they had to come up with things that were actively harmful on a website (and it got worse when they tried to shoehorn Ajax into it).
> There is nothing to be learned from your posts at all, no different perspective, no justification of your opinion, no insights into why you dislike it. You're just doubling down on a preconception that you have no justification for.
this is just an HN dog whistle. You could have asked.
"according to several online customers" - This phrase or a close variant is used in almost every review and it's really distracting. It feels too robotic and repetitive.
In general, it seems like "build quality" is a common complaint, which at these price points is likely always going to be an issue. I guess what I mean is that with cheap prices I'm not expecting build quality to be great, so repeatedly calling it out in a general sense isn't all that useful. Far more useful is when a specific part is called out, which in one case it is: the wire being flimsy. More of that!
In general, it would be nice to have more specifics instead of the constant generalities one associates with low/er-cost goods.
The phrase "noise cancelling" is used about 44 times in the article. From what I recall this may be considered "keyword stuffing", which used to be (and almost certainly still is) a big old no-no. Of course, how do you create an aggregate content page without mentioning the topic repeatedly? Well my friend, that's the Google content paradox! -- talk about something without actually mentioning it by name.
On the general mission: it would be really nice to have the top "pick" better called out.
I read the first review and thought, OK, that sounds like an option I wouldn't have considered -- good job, site! But then I said "Mpow, hmmm... not a brand I've heard of", so I scanned down and came to the JLabs. I said "I've heard of them!" and read the blurb.
That's where the problem starts: for almost the same price, the blurb makes the JLabs sound "better", or at least more attractive, than what I thought was the "top pick / best value".
I think some of this may have to do with the fact that the Mpows have two negative aspects pointed out ("Others say that the battery life and microphone are disappointing"), whereas the JLabs have only one ("pinch their ears"). Mind, this isn't about count so much as what's being called out: one may have bad battery life and the other may actually hurt my head lol.
The reason I say "problem" is that sites like Wirecutter take a stand and proclaim "this is the best", even though in most cases the top 3 or more would all be just fine. As it stands, we have many low-cost options presented, all with what appear to be potentially deal-breaking flaws; the end result is I don't really feel any better prepared than if I had just gone to Amazon and scanned reviews myself.
What's more, if I see something with 20 total reviews at 5 stars and a similar product with 40k 4 stars, I'll usually just go for the one with more reviews. In short, number of reviews is a strong signal for me, and I think / hope it would be trivial to add that as a metric to your site.
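For what it's worth, that intuition can be formalized. One common approach (a sketch with made-up prior values, not something this site necessarily uses) is a Bayesian average that pulls small samples toward a global prior, so a handful of 5-star reviews can't outrank tens of thousands of 4-star ones:

```python
# Rank by a Bayesian average: blend each product's rating with a prior
# mean, weighted as if the prior were `prior_weight` extra reviews.
# prior_mean=3.5 and prior_weight=100 are illustrative assumptions.
def bayesian_average(avg_rating, num_reviews, prior_mean=3.5, prior_weight=100):
    return (prior_weight * prior_mean + num_reviews * avg_rating) / (
        prior_weight + num_reviews
    )

a = bayesian_average(5.0, 20)      # 20 reviews at 5 stars
b = bayesian_average(4.0, 40_000)  # 40k reviews at 4 stars
print(a, b)  # the 40k-review product scores higher, matching the intuition
```

With these priors, the 20-review product scores 3.75 while the 40k-review product scores just under 4.0, so review count ends up doing exactly the work the comment describes.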
On Google: hopefully some of those thoughts are useful in some way, but I did want to close by saying I absolutely feel your pain.
Someone like you posts a traffic drop-off story and we all rush in to say this is why, but what we forget is Rome wasn't built in a damned day. If this traffic penalty sticks, it's pretty safe to say your business and dream are dead. No Google means no traffic means no site. Business is hard, but what we need to agree upon is that the promise of the internet was to democratize and incubate. The early internet certainly did; the modern one, not so much.
I think we can all agree that the best ideas should rise to the top and be rewarded. We should also agree that a single company shouldn't have that power -- the Internet's users should. Legislation is the only cure here, so far as I can see, as this exact scenario plays out time and again, and it's always at our expense, both business owners' and internet users'.
Yuuuuuuup lol, under penalty for almost a decade now.
In my case I had a forum (remember those!) in support of my software product.
Bots would occasionally create accounts and post links to knockoff handbags and watches. I'd tolerate (and swiftly kill!) them because our users really loved having a place to meet. (This was back in 2011; ironically, reCAPTCHA landed in 2012.)
Unbeknownst to me those links were part of a larger spam network where thousands of low-quality links pointed back to my site, presumably to those fake accounts(?).
When the penalty hit, the process of trying to figure out what the heck went wrong and trying to do something about it was identical.
In short, I've been penalized out of existence because of an obvious and, in my humble opinion, easy-to-identify spam campaign. Sadly Google placed the cleanup burden on me, and try as I did, nothing actually helped. The article's mention of "hidden" penalties feels... accurate.
I often tell folks when you perform a Google search you're given worse results than you deserve. My site and goodness knows how many others have been placed so far below the fold that if we're not outright killed, we never reach the users and potential we should.
No biggie if the search market were more diverse, sadly, that is simply not the world we live in.
I can see how that sucked for you, but personally, I don't want Google sending me to forums infested with spam bots. I don't agree with Google's hidden penalties -- I think transparency is crucial -- but the burden of cleaning up your spam-filled website was yours, and I'd expect every search engine to bury you until you managed to get it under control.
Great feedback but yeah, big picture we're talking pretty small numbers.
The 10 (or so) spam accounts posted about 30 times. The longest lasting survived a weekend, after which I enabled manual account activation. From start to finish the spam issue lasted around 3 weeks. Eventually, I got rid of the forum altogether.
Alas, I tried all the usual things, from the disavow tool to Webmaster forums and dozens of site changes; sadly nothing helped. My penalty felt permanent back then and still appears to be.
It may help to know that pre-penalty my rank was quite high: first page for most relevant search terms. (Immediately after the penalty, page 20 or lower; now around page 10.)
Ironically, while the rank was nice to have, I never actually did anything for it.
I simply built a fast, human first(!), site that inadvertently followed Google's site quality guidelines.
For example, my software generates web forms. One common growth tactic my competitors use is placing a link at the bottom of every form back to the parent site.
Me -- I never did that. I strongly felt that under no circumstance should the output of my software be used as a marketing tool. Sure it may harm growth, but it felt right, and that was enough for me.
Years after my penalty I read just that. Google frowns upon and may penalize sites for using such "widely distributed site links".
My focus was and always has been on the user, so of course I'm the one who gets penalized lol.
Anyway, I think the most damning part of the process was not having a reasonable path for knowing what exactly happened and what I could do to help.
If I were crafting legislation that's where I'd start. I can't help that Google has the market-share it does, but it does mean we all have to play within their world.
All I ask is that the rules, whatever they are, be applied fairly, justly, and evenly.
Hey. I don't know much about SEO and have a question. Does a sub-domain (forum.example.com) vs a sub-path (example.com/forum) make a difference? Nowadays, would it be better to isolate anything with user-generated content onto a completely separate domain (example.net)?
This does not work. Many spammers are not exactly the most clever folks and often neither check the effectiveness of their links nor whether they are removed within a few hours anyway. Instead they simply send automated tools at your site, tools which they often did not even create themselves.
In addition to that, there are a couple more reasons why spammers sometimes even actively chase nofollow links. For one, many people believe a certain amount of nofollow links is part of a "healthy" link profile: having 99% follow links might be considered a "bad signal" by Google, because nofollow is just so common that you are expected to have many nofollow links.
I will not judge this theory, but the theory does exist and is followed by some people.
Plus, the fact that it seems Google simply started considering nofollow as a kind of hint, not a decision, did not help the spam situation either.
Regarding whether nofollow helps with at least avoiding the penalty, even if you are still spammed: nobody knows whether this might work. Google is usually tight-lipped about what it does or does not do.
> Plus, the fact that it seems Google simply started considering nofollow as a kind of hint, not a decision, did not help the spam situation either.
When nofollow is used for all user generated content they kind of have to take it as no more than a suggestion. Just ignoring UGC when it comes to ranking would throw away too much of the web.
This suggests spammers care about the quality of the links their bots post rather than the quantity. I suspect that isn't the case. Spam is always worthwhile to post because a forum might change to remove the nofollow in the future, a forum user might follow a link, and it's not worth the effort to check whether a forum is providing 'value' to the network. In other words, it's easier to spam everyone and hope some of it proves useful.
I think they meant specifically in the context of avoiding being penalised by Google; my admittedly lay understanding is that "nofollow" would've helped here -- the spammers would still spam of course, but the penalty would not be as severe? Or apply at all, perhaps? Please correct me if I'm misunderstanding though
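For anyone unfamiliar, the mitigation being discussed is marking user-posted links so search engines can discount them. A deliberately simplified sketch (a real forum would use a proper HTML sanitizer rather than a regex, and the example post is made up):

```python
# Stamp rel="nofollow ugc" onto every anchor tag in user-generated
# content before rendering, so search engines treat the links as
# untrusted. Regex-on-HTML is fragile; this is illustration only.
import re

def mark_ugc_links(fragment):
    """Insert rel="nofollow ugc" into each opening <a ...> tag."""
    return re.sub(r'<a\s', '<a rel="nofollow ugc" ', fragment)

post = '<p>Check out <a href="https://example.com/bags">cheap bags</a>!</p>'
print(mark_ugc_links(post))
```

The spam link still exists and users can still click it; the point is only that the forum no longer passes ranking credit to it, which is why the penalty question above is about whether Google honors that signal.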
Consider the following from personal experience: if you rank highly on Google search, as I once did, you may become the target of malicious actors trying to leech off your ranking.
In my case this leeching appeared to Google as if /we/ were trying to game the system with link farms (we were not). Google penalized us in 2011 and we've yet to recover.
The point: when you use Google search you're often served sub-standard results, as over time generations of ranking penalties have led to lower-quality sites ranking higher.
Interesting. The recent 1903 update has a significant flaw where any disconnection from an RDP session (I work remote) completely resets all window positions. I was saved by this wonderful tool:
It's been absolutely fascinating to watch the YouTube v. community drama play out. I think many forget the exact same situation has played out, and continues to play out, for Alphabet's other major community: website owners.
The difference, of course, is that we don't really have active and vocal "followers" the way YouTube creators do, so for the most part, when a website owner like myself gets penalized for some abstract reason out of my hands, or an algorithm change destroys your business overnight, it happens silently, beyond the headlines.
I can't stress this enough -- Google has been a powerful, often positive force for the wider Internet. But they're also a cruel, heartless, and often maniacal source of pain and sorrow.
Based on EXIF data that image was taken with an iPhone X back in October.
Which means the OS was 12.0 which had a widely reported bug where the image processing was interpolating images incorrectly. This has been fixed in 12.1. Would be interesting to see the photo taken again to see if the quality has improved.
Also, why are you comparing an iPhone X against the Pixel 3? The iPhone XS has had significant improvements to the camera.
Something I've been trying to figure out without much luck is how different the raw images are between the iPhone X and XS. Any idea where I could find a comparison?
The reason I ask is that features such as Smart HDR have shown great promise, but I don't know how much is software and how much is sensor performance.
> A cursory glance shows the Pixel 3's camera the best of the bunch, sometimes by far.
I am surprised that you drew that conclusion from the review. I got the impression that the iPhone and Samsung flagships consistently beat the Pixel 3 in daylight and normal-light shots. They just have better hardware. He stated multiple times that the iPhone is a better representation of the actual scene. However, the night mode is obviously best in class (given nothing in the frame is moving).
Yeah, looking back, I should qualify: in overall presentation, yes.
My comment was more a remark on the de-noise Apple's default camera app applies.
Again, I'd check out that image I linked to. At 100% zoom the ground detail is quite literally erased from the iPhone X image, to the point where it looks like a stylized art filter was applied.
I will agree with you that the iPhone X photo looks poor. But that is last year's flagship. The iPhone XS looks as good as, if not better than, the Pixel. And if you compare the Pixel 3 and iPhone XS at 2x zoom for the same photo, you will see that the iPhone has much more detail (thanks to the telephoto lens). Google just can't compete with the better hardware of the iPhones and Samsung phones, even with their better algorithms.