
Adding my own personal experience.

At the end of last year I wanted to change jobs, but I was paranoid about my raw algorithm knowledge. I've been a consultant for over 15 years, and I spent a decade at my last company. I've developed software from a specification, worked with teams to design applications, led teams, presented to company management, and was even part owner of a company -- but I was never good at rote memorization.

I know my limitations and I know how to find answers.

So, for 4 months I practiced online programming problems, read interview books, and had my wife quiz me nightly. The nightly quizzes were whiteboard answers and I had to explain the solution enough that my wife understood.

In the end, I interviewed at 4 companies: Daugherty Consulting, Google, Amazon, and Target. (For Google this was my second interview in two years. The first interview was a shock: I froze during the preliminary interview, and for two years I contemplated whether I'd ever quit my job.)

Daugherty never had me do whiteboard programming but did ask me some algorithmic questions. These were much easier to answer verbally. In the end I was told I didn't have enough experience in consulting working with large companies. (This was a bit of a shock but whatever.)

With Google, I never got past the first round. I felt very good about my solution, coding in a Google Doc, but they had wanted me to implement Python's bisect_left function myself. Instead I just used it to solve the problem.
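For reference, the behavior the interviewer presumably wanted can be sketched in a few lines -- this is just an illustration of what bisect_left does (leftmost insertion point in a sorted list), not necessarily the exact problem asked:

```python
def bisect_left(a, x):
    """Return the leftmost index where x can be inserted
    into sorted list a while keeping it sorted."""
    lo, hi = 0, len(a)
    while lo < hi:
        mid = (lo + hi) // 2
        if a[mid] < x:
            lo = mid + 1  # x belongs strictly to the right of mid
        else:
            hi = mid      # mid is a candidate; keep searching left
    return lo

# bisect_left([1, 2, 4, 4, 5], 4) → 2 (before the first 4)
```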

At Amazon I made it onsite, but again, I failed to whiteboard a hashing function to their satisfaction. They told me it could have been overlooked if my architecture skills were stronger. They did compliment me highly on my communication skills, which I appreciated. (I had spent two weeks rewriting my accomplishments journal in the STAR[1] format.)

Target (where I work now) was completely different. I was given a choice of real-world-like problems to solve and a couple weeks to code. Two were pretty heavily algorithm/math-focused, but the third was right up my alley -- implement a microservice backed by a data source and a different (potentially flaky) service. I took my time, wrote code I'm proud of, deployed it on Google Cloud, and explained my solution in detail to a Principal Engineer. There were still personality and experience questions (and I think also some algorithm questions) but nothing like my other experiences. It felt much more grounded in reality: are you a solid developer, a good communicator, and a good fit for the company? In the end I didn't get the exact position I applied for, but I'm still extremely happy.

My takeaways:

1. Maintaining an accomplishments journal was more beneficial than I could have imagined. I write down everything I'm proud of -- when I'm proud of it, even if it seems minor. I can always delete it later. Also, the STAR format is actually really good.

2. Don't stagnate in learning. Technology and methodologies are changing all the time. I don't follow every fad or code in my spare time, but I feel strongly that taking some time periodically to maintain a level of expertise is a good investment.

3. Knowing my strengths and weaknesses really helped me focus while preparing for my interviews.

4. Learning from interviews and maintaining confidence was big for me. Immediately after each interview I took notes on what I wanted to work on. I asked for as much feedback as I could get. These notes made it back to my journals and are things I'll revisit from time to time, because I know nothing is a given. Who knows what I'll want in another 15 years.

[1] https://en.wikipedia.org/wiki/Situation,_task,_action,_resul...


Earlier post about SeL4 verification:

https://news.ycombinator.com/item?id=23464187

Also, my previous comment linking to how SeL4 performed the verification:

https://news.ycombinator.com/item?id=23475748

> For seL4 they used Haskell to create a model, which then served as their specification to help with the formal verification process [1][2].

> [1] https://dl.acm.org/doi/pdf/10.1145/1159842.1159850

> [2] https://www.sigops.org/s/conferences/sosp/2009/papers/klein-....


Just off the top of my head, places where a lot of images/documents might be modified quickly: web browser caches, photo management software, antivirus software.

I think it'd be easier to isolate applications and data like Qubes OS does, instead of trying to create a universal rule set.

https://www.qubes-os.org/intro/


> Conspiracy theory: This change is dictated by the Google AMP team that wants to take over the world without us knowing

I was just about to write this but I don't necessarily think it's that far off.

With signed exchanges, AMP pages have the ability to hide the fact you're accessing content through Google [1]. In 2016 Google wrote about testing 'mobile-first indexing' because more people are using mobile devices than desktop browsers [2].

[1] https://developers.google.com/search/docs/guides/about-amp#a... [2] https://webmasters.googleblog.com/2016/11/mobile-first-index...

If Google can control the URL narrative (keeping users from bouncing off AMP pages), it's one more way for them to act as a man-in-the-middle.


I wonder if they’ll eventually hide the URL path from extensions (for security) and serve ads off google.com. Even serving ads from somewhere under google.com/amp would probably cause problems for ad blockers. Or maybe extensions see the rewritten URL only, so CanSignHttpExchanges is a way of changing third party trackers and ads into first party.

Also nice to see DigiCert helping them out, but I'm not surprised, given that DigiCert's product lineup isn't much more than a test of how much of a sucker you are.


For seL4 they used Haskell to create a model, which then served as their specification to help with the formal verification process [1][2].

[1] https://dl.acm.org/doi/pdf/10.1145/1159842.1159850

[2] https://www.sigops.org/s/conferences/sosp/2009/papers/klein-...


Sure...what else ;)


There should also be an explicit call-out to both tooling and community. Even if a programming language ticks all the other boxes, without a sizable and stable community constantly pushing tooling forward, languages will fade, if not necessarily die.

In my past job I was a ColdFusion programmer for 15 years. ColdFusion ticks all the other boxes:

* Created in 1995 and is a very mature platform

* Easy to learn and productive

* I was very comfortable hiring multiple ColdFusion engineers over the years

But, the community moved on and the tooling is now nonexistent compared to other programming languages. ColdFusion will live on, but as a shadow of what it once was. Which is unfortunate because I actually enjoyed the language and platform.


I currently see it happening with Ruby. I used it as a scripting language for DevOps tooling, CLIs, and for small web apps, but without Rails.

What I see is Go pushing it out of the DevOps space, and Python is more popular as a general-purpose language. Tools that were written in Ruby (e.g. Puppet) are outdated, and new tools are written in Go. I still think Ruby is a superior language compared to Python, but because mathematicians used Python, and with the recent ML/big data hype, it got more popular and got better libraries, which caused an upward spiral.

While Ruby is the language I'm most comfortable with, and I don't see it going away in the next 10 years, I have to accept that first-party integrations will come later and later and there will be fewer third-party libraries. That is why I'm currently learning Go, and if anyone asks me what language they should learn first, I point them to Python.


This is avoidable by continuing to use Ruby. Just continue to use it and contribute to the community; it's that easy. Ruby still has big businesses using it -- GitHub, Airbnb, Shopify. There is no reason to believe it'll go the way of ColdFusion. The language ecosystem is much different than 20 years ago. There are so many languages all thriving. Even Perl is relatively healthy and is a fine choice for doing many things in the software space.


I came off a little bit negative in the end, but as I said, I don't think it's going away.

All these companies you listed use it with Rails, and that part of the ecosystem is alive and well, but I'm not interested in it.

A few years ago Ruby ruled the DevOps/Cloud space (which I work in), and a lot of tools were written in it, but with the dawn of containerization its former glory started to fade. Docker, k8s, and even the new GitHub CLI are written in Go. While I am happy to write Ruby code, I can't expect the same from my colleagues.

While professionally I don't think I will continue to use it much longer, I'm still planning to keep up with it. Before the lockdown, I started teaching Ruby at a local meetup group, and I can't wait for Ruby 3.


I don't think Ruby would ever be able to compete in that space. Even HashiCorp, the company that was born out of Vagrant, left Ruby and chose Go.

I am thinking that someday Crystal, post 1.0, might be able to put up a fight in that space.


IMO Ruby is already being left behind. HTTP/2 is a good example. Rails doesn't support it, and I can't find anything recent saying support will be added soon. Java is notorious for slow innovation, yet language-level support for HTTP/2 was added years ago and enabled for basically every popular framework. Same with Python, C#, Go, JS.

HTTP/2 is essential if you want good SEO, which makes Rails a non-starter for many projects already.


That doesn't make sense. AFAIK, with frameworks like Rails or Django you never expose their server directly to the Internet; you put NGINX in front of it. And NGINX talks to the backend code via UNIX sockets, so HTTP/2 support in NGINX is what matters.

And sooner rather than later you are going to need a load balancer anyway.
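To illustrate the setup being described, here's a minimal sketch of NGINX terminating HTTP/2 and proxying to an app server over a UNIX socket (the server name, certificate paths, and socket path are placeholders):

```nginx
server {
    listen 443 ssl http2;           # HTTP/2 on the client-facing side
    server_name example.com;        # placeholder

    ssl_certificate     /etc/ssl/example.com.crt;   # placeholder paths
    ssl_certificate_key /etc/ssl/example.com.key;

    location / {
        # NGINX speaks plain HTTP over a UNIX socket to the app
        # server (e.g. Puma or Gunicorn); only the browser-facing
        # connection is HTTP/2.
        proxy_pass http://unix:/run/app.sock;
        proxy_set_header Host $host;
    }
}
```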


That's not really good enough; you will have HTTP/1.1 between NGINX and the actual server. Many features require actual framework-level support, like server-side push, realtime streams, and gRPC.


Yes, but these features are not yet impacting SEO.


Do you think open source ColdFusion engines such as Lucee[0] are worth using/learning?

  [0] https://lucee.org/


No. Avoid ColdFusion - it's a language from a bygone era when Adobe was trying to be what Microsoft is now. They failed and the ColdFusion community will die. The community, while made up of great people that I am still friends with, by and large was not interested in improving, learning new techniques, or growing beyond ColdFusion. They wanted to learn a skill and cash in on it for a long time. The language is dead and their skills are becoming useless as a result.

Maybe in 15-20 years ColdFusion will become like COBOL or Ada, where there's a bunch of code running some necessary systems and nobody knows how to maintain them. But it's a long bet for a payoff that would be much higher if you spent your time somewhere else.


If you're thinking of learning CF, only do so with the understanding that it's now a niche product mostly used in government and large companies. Learn it either as a hobby or look at the Lucee source for educational purposes. Even my last company is moving away from CF.

About Lucee ...

I actually helped the company migrate away from Adobe ColdFusion to Lucee due to a change in Adobe's licensing. We migrated a large, 10-year-old application and all clients over to Lucee in about a year.

Lucee is a nice platform, but I'll warn that it's not as polished as Adobe CF, and there are some differences in features. You'll run into rough edges in their documentation and language implementation (especially the scripting language). One thing I really enjoyed was being able to download the source and figure out how my CF code was actually being compiled into Java. It took me a few days to understand the parts I needed, but I actually figured out some issues I was having. I was also able to build Lucee from source, just for learning, which was really nice.


The points under "Cultural improvements" can't be stressed enough:

* When only one member of that team is remote, they often suffer a combination of isolation ... and organizational burden ...

And their approach to foster inclusiveness is a great checklist:

* We nominated a site lead ... to be responsible for the overall happiness and productivity of the hub.

* We ensure that our leadership regularly visits the hub via Zoom meetings to lead discussions, answer questions, and provide a sense of connection.

* We encourage virtual coffee chats to promote a sense of belonging.

* We survey the team regularly and review feedback and people data, so that we can understand both the shared needs of our employees and the particular needs of a hub.

Remote doesn't have to mean work from home, and I can attest that having too few remote employees and not enough cultural investment is a recipe for failure.

An old employer had a satellite office with just a few employees. The main office was in a city in a different state. There were many occasions where we were in the dark about talks that had happened at HQ, activities, and even times when the owners would let folks leave early on Friday! Without dedicated leadership and effort to make sure employees are included, even a small satellite office can feel isolated and neglected.

In the end there were just 3 engineers total, all on different teams. I left a year after the company restructured, closing the satellite office and moving everyone to WFH.


To add another story... a previous company I worked at had an HQ of about 700 people in Chicago with satellite offices of 20ish people in a number of locations, one of which I worked at. After a couple years it was extremely obvious that the satellite offices were not only left out of important discussions and considerations from HQ, but were also treated very poorly in comparison when it came to bonuses and promotions. This despite many of the satellites being more percentage-profitable with higher growth than HQ. It led to a toxic environment and a personal lesson to be wary of working away from an HQ.


I would speculate that for companies these two statements will be mutually exclusive:

• Company saves money by not having to hire Linux sysadmins

• Company saves money by not having to pay for managed cloud products if they don't want to

As a developer I want to write code, not manage a Kubernetes installation. If my employer wants the most value from my expertise, they will either pay for a hosted environment to minimize my time managing it or hire dedicated staff to maintain an environment.


I worked for a brand & marketing company for 15 years, and I observed that most of my clients had pretty short memories when it came to how they felt about me. If the last few milestones were really great, they quickly forgot an incident. Obviously, the more impactful an incident, the longer/more positive the milestones had to be. An incident too impactful got you fired -- but, in general, this was my experience.

Even in its recent history Microsoft has repeated incidents, but it also has some very big positive milestones. Keep in mind, too, that some customers will only see the positive milestones.


Very true. People currently love VS Code, and it makes their lives much easier. I remember that for a long time .NET friends of mine would extol Microsoft just because Visual Studio worked really well with C#. Microsoft could do no wrong as long as they could seamlessly work on Windows apps.


Dismissing a successful but seemingly "dumb" person is a benefit to them. Now they can continue succeeding with less scrutiny.

For a long time I thought many politicians were "dumb" based on their public comments, propelled only by their connections. Now, when I look up a politician who's spouting objectively false or misleading statements and find they graduated from a top-tier university and have a JD, I realize these are not intellectually stupid people. They're skilled in their field, have drive, and have less empathy/morals than others.

Now, I watch these "dumb" people more closely.


Empathy/morals seem to be the key thing here.

I have been forced to walk away from a number of potentially very lucrative deals because I refused to scam little old ladies out of their life savings.


US politicians often act "dumb" because Americans mistrust intellectualism and "book smarts," but they trust plain-spoken cowboys who shoot from the hip and speak like common people.

Hillary Clinton attended Wellesley and Yale, and speaks like someone who did, and many Americans hate her. GWB attended Harvard and Yale, but he played up the stereotypical Texas country-boy persona and fumbled over his words, and got two terms. Donald Trump went to the Wharton business school, but speaks (and shitposts) at the level of a common Reddit troll, and he was elected in large part because of that.

You'd think Americans would stop falling for it at some point.


> Hillary Clinton attended Wellesley and Yale, and speaks like someone who did,

Hillary Clinton put on a terrible fake Arkansas accent for the beginning of her political career, talked about baking pies, and quoted country-western songs in interviews. The reason she uses her natural accent in her later career is because she has positioned herself as a "wonk," and often against Republicans who were doing folksy.

i.e. she's looking for a different set of Americans to "fall for it." Trying to appeal to an audience who thinks that everybody else is a sucker and they're the shrewd one.

When she goes to black churches, she puts it on again. Instantly starts dropping her g's and praising the lord like she would never do in a white church.


Do you think graduating from a top university makes someone smart? Do you think working at a top hospital makes you a great doctor? Etc. There is a distribution everywhere. At top universities it just might not be as wide as at some other places.

You should watch closely the people you admire. Donald Trump is as successful politician as one can be. He is as rich as one can be. He has a degree from a top university. But he's a useless person and dumb as a rock. If you try to follow his way you will get nowhere. Good luck.

