Hacker Newsnew | past | comments | ask | show | jobs | submit | petterroea's commentslogin

The student is attending college to get a job. Most students don't care about the course.

Probably around 50% of students in my year were only in it for the well paying jobs a prestigious degree like that could give them.

This has to be part of the threat model for cheating.


I am in my final year of my bachelor's in Software Engineering. I was (and mostly still am) very interested in both SWE and CS from various angles: I studied a decent bit of PL theory, I tried to get into systems programming, and I've built a bunch of "portfolio CRUD" software and had a short internship at a real company, with all of the above being roughly equally interesting to me. All this is to say I genuinely love the field so far.

However, the only benefit I've got from my local university is that it exempts me from military service while I study. Past year 2 (out of 4, a country-specific quirk) there was roughly one subject actually worth paying attention to, so I have also switched to a "just get a decent grade at any cost" mode, as most of the material we're studying (and especially most of the assignments we've done) has negative value in the real world.

Most of my peers consider me both more enthused and more knowledgeable than the average student, which mostly makes me realise that roughly 95% of my peers don't care about the contents of the courses.

All this is to say that, while grading is hard, the only thing that might get people to actually care is a proper course, no matter what threats you make.


“there was roughly one subject actually worth paying attention to”

I don’t know what country you live in, but I have gone to university myself. Saying that none (but one) of your subjects were worth paying attention to, and further that they have negative real-world value, is baffling.

Surely they teach math, history, literature, require you to do research from books, write essays, remember what you’ve read… None of that is worth anything?

Your assertion is baffling. Are you living in a weird totalitarian state where your education consists of active brainwashing?

Are you suggesting that your country performs some sort of undesirable indoctrination that you’re heroically resisting by not paying attention to the assignments?


Please note that I explicitly said "past year 2". I believe the first two years were fairly decent, especially for those whose basics were worse than mine (not that mine were that good).

Notes (because the reply might sound weird without them): I do live in an authoritarian pseudo-democratic country which I will not name for reasons; I do think some parts of the education system are effectively brainwashing (primarily the way we study history), but that doesn't affect my statements; I do think (or at least hope) that this is an issue with my university specifically and not with our education system as a whole; I do admit that I slightly exaggerated, and that technically speaking there was some useful material in more courses than one, but I stand by the opinion that the way we were taught makes only a few of them actually useful. Also, I use the word "teacher" because there is only one professor in our department, which should have been a warning sign from the beginning, but as this is a story about a local university, you can imagine I didn't have much choice given the circumstances of my life.

Now, the fun way to start this would be to mention the specifics first.

Our databases course (now extended into database administration) did have a little bit of theory, primarily normalization and transactions. However, in practice, most of what we were actually allowed to use in assignments was something you'd learn within 10 minutes of picking up SQL (very basic operations). We were not taught or (effectively) allowed to use constraints until last semester (4 semesters into learning about databases), there was not a single mention of joins or indices, and we had no discussion of ACID. It took us a year to even get close to data integrity. We were not allowed to use primary keys until the last semester. A lot of the course (the entire first year of it, really) was focused either on the FoxPro DBMS or on the visual parts of Microsoft Access, so much so that our assignments consisted of documenting our GUI navigation of the latter without letting us do actual database work. We did switch to Oracle SQL in semester 7, except that we are forced to work with Oracle APEX, and working with raw queries was basically self-sabotage due to the description requirements for the assignment write-ups.
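For readers who haven't taken a databases course: everything listed as missing above fits in a few lines. A minimal sketch with a hypothetical student/enrollment schema (illustrative only, not from the course), using Python's stdlib sqlite3:

```python
import sqlite3

# Illustrative sketch of the skipped concepts: primary keys, constraints,
# transactions, and a join. Schema and data are made up for the example.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite enforces FKs only if asked

conn.execute("""
    CREATE TABLE student (
        id   INTEGER PRIMARY KEY,   -- primary key: unique row identity
        name TEXT NOT NULL          -- NOT NULL constraint
    )""")
conn.execute("""
    CREATE TABLE enrollment (
        student_id INTEGER NOT NULL REFERENCES student(id),  -- foreign key
        course     TEXT NOT NULL
    )""")

# Transaction: both inserts commit together or not at all (atomicity, the A in ACID).
with conn:
    conn.execute("INSERT INTO student (id, name) VALUES (1, 'Ada')")
    conn.execute("INSERT INTO enrollment VALUES (1, 'Databases')")

# A join, reportedly never mentioned in four semesters of the course.
rows = conn.execute("""
    SELECT s.name, e.course
    FROM student s JOIN enrollment e ON e.student_id = s.id
""").fetchall()
print(rows)  # [('Ada', 'Databases')]
```

That this fits in one screen is part of the commenter's point about how little ground four semesters covered.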

As an extension of the above, we received a frankly ridiculous amount of scrutiny for everything BUT the actual database management. I was once forced to spend 2 hours staring at a Word document in Far Manager (which the teacher was obsessed with) because Microsoft Word glitched as I typed a word and misplaced an error underline, which triggered the teacher so much that he forced me to find the reason it happened, or my grade would be decreased. The same person made us spend the entire first lab class of this semester writing up what Oracle is as a company and what non-Oracle products are also named Oracle. This person alone soured my higher education to a degree that nearly made me quit university.

Our mobile development course consisted of the person running it making us choose a topic, having us implement a mobile application, and effectively screwing off for the rest of the semester. Given the total lack of guidance, I would argue this is about as useful as making us follow a Flutter (or Compose, or whatever) tutorial and write a 10k-word report on it. Also, a single mix-up of "phone" and "smartphone" was punishable by extra assignments, initially in the form of making us prepare a presentation on the difference between a telephone and a smartphone.

Our operating systems course was relatively decent, except that, due to the reduced hour count for the program, the only practical work was: writing up some Windows batch scripting commands; writing up some bash/coreutils commands; launching some Windows utilities from the command line and screenshotting the process. The lectures were decent, though, even if they were a fairly high-level overview of the OSes people use rather than of what an OS really is. Not having an assignment on multithreading was funny when we got one in our Java course.

Our neural networks course had us solve a set of quizzes about neural networks. We had no lectures, despite having had no proper introduction to what a neural network even is. The course was stolen from a paid one, which I know because, incidentally, after half-intentionally breaking the grading system of the LMS it was running on (tl;dr: Python ACE due to unescaped evaluation in code-runner tasks, go figure), I was tasked with rewriting this same course in a hardened way. The only benefit of this one was that I got paid for it, though you could argue that forcing us to learn on our own was technically useful under the guise of "you need to learn how to learn".
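A hypothetical sketch of the class of bug described (not the actual LMS code, which we don't have): a grader that passes submitted text to `eval` is arbitrary code execution, while `ast.literal_eval` only accepts literals.

```python
import ast

# A "submitted answer" that is really code. __import__('os').getcwd()
# is a harmless stand-in for an arbitrary attacker payload.
submitted = "__import__('os').getcwd()"

# Vulnerable grader: eval() executes whatever the student typed.
result = eval(submitted)
print(type(result))  # the attacker-controlled call actually ran

# Hardened grader: ast.literal_eval() parses literals only and
# raises ValueError on anything else, such as a function call.
try:
    ast.literal_eval(submitted)
except ValueError:
    print("rejected")

assert ast.literal_eval("[1, 2, 3]") == [1, 2, 3]  # literal answers still work
```

Real graders also need sandboxing, resource limits, etc.; this only illustrates why unescaped evaluation is game over.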

Our project management course's exam (or, well, pass/fail oral attestation?) had us talking about Windows COM, the Waterfall model and manual testing. There was a single mention of unit testing. The course material also assumed that debuggers could still only debug 16-bit code.

Our pre-diploma course project on project management forced us to pre-plan the whole application we were going to write: architecture, specific library structure, specific class hierarchies, specific classes, fields and methods, all while actively forbidding us from writing code. People were also forced to write up a database structure even if their projects did not call for one. And we had no choice of project topic, as those are supposed to be work we do for a company. My friend has a 9k-word write-up about an S3 cache microservice, as that was the only way to meet the requirements.

These are just some parts of the torment we've seen here, as I am only listing things from years 3 and 4 and ignoring years 1 and 2 (which had their own problems; writing a per-symbol C++ parser was a fun one). The history of bias and straight-up bullying from the tutors is long, documented, and not acted on. The only reason corruption isn't openly involved is that one of the people teaching here was taken to court for taking a bribe about a year before I got into the university.

On top of all this, we've learned nothing about actual system design, security, distributed systems, functional programming, Linux, ethics, embedded, or performance engineering. Our parallel programming class was just a quiz about OpenMP without an actual introduction. Our graphics class was us making models in Blender and pretty much nothing else. Our web development course forced us to write everything in Notepad in pure HTML4, and using JS was punishable. Our OOP class was so overfocused on C++ that we got `std::function` as an exam question because "well, it's a callable object, who cares that it's actually used for HOFs". Anything related to deployments and DevOps was only mentioned because one PhD student, forced to run a subject that was meant to be entirely about Windows Active Directory, made a proper "from zero to CRUD in production" course instead. It was arguably the most useful course in those 4 years for the majority of my peers, as it actually forced people to learn about CRUD workflows, frontend, REST API design and Docker.

I strongly believe that the way most subjects were taught actively harmed students, as we were not allowed to do our own research and use the results of it, while the lecture material was either mostly useless or grossly outdated and out of touch with reality, even though the subject structure on paper is pretty good and seemingly on par with normal universities.


I know many people who were in the exact same situation as you while at uni. I hope you find value.

For me, my hobbies probably gave me 2x more experience, but uni forced me to learn things I would never have learned by myself. It made me believe self-taught engineers are inherently flawed in that they only know what they themselves thought was important.

I'm sure you'll find value at the end, but I think you are valid in feeling you are wasting time.


It's pointless. Just an arms race of gimmicks. There's really no option besides making homework all optional, and putting 100% of the grade into in-person exams. I basically don't trust that any new graduate has earned their degree, and won't until schools do what's necessary to crush cheaters.

I agree with you in spirit, but the pre-LLM consensus was that exams were bad at measuring student skill and that students felt more fairly treated when their grade was the result of multiple assignments and projects. I think it's a shame we have to move away from that.

> exams were bad at measuring student skill

They are. I have a friend who was significantly smarter and more thorough in our studies but often got bad scores on exams because they couldn't concentrate under the pressure.


I also struggled with exams, but that's because my understanding was often shallow, due to a lack of effort to study and understand the material. I'm very suspicious of people who say they're smart but can't perform on exams. That said, there are plenty of ways to structure things to avoid this: weekly, easy, pass/fail exams that ensure you've read the material at a basic level or understood some basic concepts; lab work; presentations with live grilling from the professor to ensure you understand the topic.

I don’t think my friend would claim to be smart (and no, I’m not talking about myself in the third person to sound more convincing; I have a real person in mind). I say they are. I saw them in day-to-day work and they are both more knowledgeable and more productive than I am. It’s being put on the spot, with high stakes and limited time, that they had difficulty with.

> there's plenty of ways to structure things to avoid this

Sure, I was arguing specifically against GGP’s solution, i.e. betting everything on the finals.


Exams also rarely measured skill in the course, often just a subset of it. We would often spend the last month of each semester cramming for exams instead of studying the course material, because the course material wasn't that useful for the exams.

I rarely felt I got a lot out of courses, but I often felt I would if I got to study it properly


Isn’t that actually a valid thing to test? IMHO performing under pressure is a capability signal in itself.

Well, that is a way to test students’ ability to perform under pressure, but I’m adamant it’s not a fair assessment of their skill in the subject at hand, nor of how much they worked and improved during the course. On several occasions I got higher marks than my friend because of their anxiety issues, despite being a worse student and arguably a worse researcher (which is what we were studying to be).

If you can’t concentrate under pressure then you will not go very far in employment…

Huh? Not every job requires this trait, and even for those that do, it’s not something a nonlinear optics professor ought to evaluate.

Sure, it’s a nice quality to have and I find it useful at times: when it’s “suddenly” the last day to write a proposal, or when someone has to present at a conference. (However, these tasks require many other skills besides the ability to stay calm.) But I can’t agree that it is indispensable for a researcher.




I don’t want to be a CEO, mate.

Why would I give up my cushy place where I’m paid to do interesting stuff, for a stressful position full of management responsibilities? I swear, more people should learn the idea of lagom.


the course is now no longer cs/swe.

the course is now

"how to pass exams in cs/swe"


Better than "how to get a passing grade in cs/swe"

This is why I never trust blog posts any more. If a company logo is attached, it's just SEO garbage.

To be fair, this is the price you pay for sharing a GPU. It's probably fine for work that doesn't need to be done "now" and that you can just launch and run in the background. I bet some graphs showing when the GPU is busiest would be useful as well.

A businessman at a prior employer, sympathetic to my younger, naive "Microsoft sucks" attitude, told me something I remember to this day:

Microsoft is not a software company; they have never been experts at software. They are experts at contracts. They lead because their business machine excels at ticking the boxes necessary to win contract bids. The people who make purchasing decisions at companies aren't technical, and possibly don't even know a world outside Microsoft, Office, and Windows, after all.

This is how the sausage is made in the business world, and it changed how I perceived the tech industry. Good software (sadly) doesn't matter. Sales does.

This is why most of Norway currently runs on Azure, even though it is garbage, and even though every engineer I know who uses it says it is garbage. Because the people in the know don't get to make the decision.


I’d say they are very good at building platforms and locking everyone in. But they need a good platform first. Azure seems like their first platform that was kind of shitty from the beginning and did not improve much.

MBASIC was good and filled a void, so it got used widely from the beginning. The language was their first platform. Later came the developer tools, the IDE and compilers, which are still pretty solid if you ask me.

MS-DOS and Windows were their next platform. It started OK with DOS, because CP/M was not great either. But the stability of Windows sucked, so they brought in David Cutler’s team to make NT. It definitely captured the home/office market but didn’t do well in the server market.

Xbox was their third platform, which started very well, but we all know the story now.

Azure is their fourth platform: it started shitty and still isn’t good. The other platforms had strong vantage points, but Azure may not have one.


Those are mostly end-user or hosting platforms you mention (with their problems); what really makes MS tick is the enterprise platforms.

Windows networks, Active Directory, etc. Azure is the continuation of that: those who run AD often default to Azure (which offers, among other things, hosted or hybrid AD environments).


Yeah, those too. Sorry, I never worked with the MSFT stack in a corporate setting, except at my first company when my IT knowledge was still minimal.

That's true for Azure, where contracts are signed due to free credits given over Office and Windows usage.

However, there is a reason why everyone uses Office and Windows. Office is the only suite with a complete feature set (ask any accountant to move to Google Sheets). Windows is the only system that can effectively run on any hardware (Plug and Play), and it has been that way for decades.

This is due to superior software in the aspects that matter to customers.


People use Windows because Office runs on Windows, and Windows ran on any shitty cheap beige box. That has been the whole story since the 1990s.

On hardware: it's because Windows has a stable kernel ABI and makes it very simple for hardware vendors to ship proprietary drivers. Linux more or less forces everybody to upstream their device drivers, which is good and bad at the same time; DKMS is relatively new.

But yeah, the NT kernel is very nice, the problem with Windows is the userland.


I used to think that too.

But if you really look at it, the "comfort zone" problem isn't so big an issue that a few training workshops and a brief acclimatization period with another tool suite can't solve it. Making accountants move to Google Sheets is actually doable given enough incentive; there isn't so much a lack of features in Sheets relative to Excel as a difference in implementation. In fact, for many purposes Sheets and GSuite could even be the "superior software", if only one bothered to make good use of them.

The problem is more that companies hesitate to take the dive because they can't be sure any of the alternatives will stay stable in the long run. Google is infamous for abruptly shutting down applications, and none of the other competitors have built enough of a reputation yet to ensure long-term reliability.

Microsoft has been (and continues) riding its first-mover advantage as an immovable establishment for decades. It has worked until now, but who knows for how long.


The selling point of Excel is not the feature set, it's that people know Excel and are usually very resistant to learning something new.

As someone who’s compared spreadsheet feature sets, though: it’s also very much the feature set.

Well, in a way it is, of course, because if your reference is Excel, then you want the feature set of Excel.

Or what specifically do you mean?


Sheets and Numbers are spreadsheets. Excel is an application platform and programming language that’s convinced people it’s just a spreadsheet.

VBA, Power Query, structured references, newer formulas like XLOOKUP, dynamic array spill formulas, map/filter/reduce/lambda, various obscure financial functions.

Sheets and Calc don't have these.


The problem is that it encourages people to use Excel for things that should never be in a spreadsheet in the first place. If you're reaching for VBA, building complex Power Query pipelines, and writing nested LAMBDA functions just to process your data, imho you have outgrown Excel. Just because you can build an entire solution in Excel because you already know the interface doesn't mean you should...

Also, don't get me started on the newer functions such as XLOOKUP and dynamic arrays... Relational data belongs in a relational database. If you are joining tables and filtering massive arrays, you should be using standard SQL; it makes it so much easier to troubleshoot long term.
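To illustrate the point with made-up order/customer data (the tables and names here are hypothetical, not from the thread): the classic XLOOKUP pattern, matching an ID in one table to pull a value from another, is a one-line join in SQL.

```python
import sqlite3

# Illustrative data only. In Excel this lookup would be something like
# =XLOOKUP(B2, customers[customer_id], customers[name]), copied down a column.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders (order_id INTEGER PRIMARY KEY, customer_id INTEGER)")
db.execute("CREATE TABLE customers (customer_id INTEGER PRIMARY KEY, name TEXT)")
db.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 10), (2, 11)])
db.executemany("INSERT INTO customers VALUES (?, ?)", [(10, "Acme"), (11, "Globex")])

# The SQL equivalent handles every row at once and is easy to inspect.
rows = db.execute("""
    SELECT o.order_id, c.name
    FROM orders o JOIN customers c ON c.customer_id = o.customer_id
    ORDER BY o.order_id
""").fetchall()
print(rows)  # [(1, 'Acme'), (2, 'Globex')]
```

The query is declarative and testable, which is the "easier to troubleshoot long term" argument in practice.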


> Windows is the only system that can effectively run on any hardware

...as long as that hardware is Intel-based (and a select few ARM-based boards nowadays). And the reason it runs on all that hardware is Microsoft's business contracts with hardware vendors, not their software quality; that's immaterial, as Microsoft generally does not write the drivers.


Compare the experience of getting some random no-name device working on Linux or Mac with getting it working on Windows.

A lot of it is that the OS built a very complex yet consistent device compatibility system that was completely absent from all its competitors, who are still behind on that front; alternatively, credit the choice of kernel design architecture.


It's been about two decades since I used Windows on a computer I own, but I always had a much harder time getting hardware to work with Windows than I have with Linux. I still shudder when I remember tracking down drivers from different vendors, while avoiding the malware they shipped with, versus letting it just work.

edit:

I just remembered when I first used CUPS to configure a printer in 2003. It blew my mind how easy it was, and I think that was the moment I decided to start using Linux as my primary desktop. Pre-Novell SuSE at the time, if I'm remembering correctly.


This is in many ways a smart way to understand the problem, but it doesn't mean that Microsoft contracts leave you stuck with bad software. There are several verticals where Microsoft and Azure actually were smart and chose a better software product to sell on their platform than what they had in house.

One example is when they stopped trying to develop an inferior competitor to EMR and Dataproc and essentially outsourced the whole effort through a deal with Databricks. Because of this, I assume many enterprise Azure customers have better-running data solutions in that space than they would have had they gone with just AWS or GCP.

On the other hand, having worked for Microsoft on an Azure team, there are plenty of areas that critically need a rewrite (for dozens of different reasons) where such a rewrite never happens (or they just release some different product and tell those with different needs to migrate to it). Instead they keep building what can only be described as hot-fixes to meet urgent customer demands, which makes it even harder to eventually do that critical rewrite.


The Databricks thing was a ploy. They then pushed Azure Synapse Analytics and forced all internal teams to stop using Azure Databricks. Synapse was half-baked, and now they are pushing Microsoft Fabric, which is even less baked.

About a year ago the whole situation changed and Microsoft started pushing everyone to their own data engineering solution (Fabric), which back then was really half-baked.

An overly reductionist argument. They described any commercial software company, because in the end you sell or you die. Microsoft has incredible software people and incredible software that coexist with the shitty software people and shitty software.

Agree. You could say the exact same thing about Oracle, for example.

But that also means that if you as a user/customer can make choices based on technical merits, you'll have a significant advantage.

An advantage how? Maybe you'll have one or two more 9s of uptime than your competitors; does that actually move the needle on your business?

The biggest expense in software is maintenance. Better software means cheaper maintenance. If you actually want a significant cost advantage, software is the way to get it. Sadly, most business is about sales and marketing and has little to do with the cost or quality of what's being sold.

Why wouldn't it move the needle? Less time spent, less frustration, more performance, more resources focused on the business?

You massively underestimate the difference in employee productivity not having to fight user-hostile software every step of the way can make. Not to mention cascading cost savings and infra flexibility distancing yourself from Microsoft products can grant.

It will depend on each case and on what makes the marketed solution inferior. If it's overly complex, you will save development time. If it's unstable, you'll save debugging time. If it's bloated, you will save on hardware costs. Etc...

matters less than we would like it to

after all, startups/scaleups/bigtech companies that make a lot of money can run on Python for ages, or make infinite money with Perl scripts (cough, AWS)

and it matters even less in non-tech companies, because their competition is also 3 incompetent idiots stacked on top of each other in a business suit!

sure, if you are starting a new project fight for good technical fundamentals


Most customers don't really have the knowledge needed to make choices based on technical merits, and that's why the market works as it does. I'm willing to say 95% of people on HN have this knowledge and are therefore biased to assume others are the same way. It's classic XKCD 2501.

I think this is spot on. Everything at the R&D phase of a project indicates that an Azure service is going to work for the use case. I've been reading the docs and thought 'wow, this is perfect!'. Then you get to implementation and realize it's a buggy mess that barely does what you wanted in the first place, with a ton of caveats.

Of course that realization comes when you are already at the point of no return, probably by design.


My lesson came when European companies followed US tech into offshoring, and I saw how, from a business point of view, quality doesn't play any role as long as the software delivers.

Especially relevant when shipping software isn't the product the company sells.


The Finnish public sector is also a heavy Azure user. Their common ethos is that modern cloud services (i.e. Azure) are in many respects more secure than on-premises data centers, and in addition cost-effective and reliable.

I mean, if you ignore all the heaps of impressive software Microsoft does ship, sure.

It’s been a while. The underinvestment shows. Across the industry as well.

It's the same in Norway, and newspaper op-eds are going as far as saying things like "Now that we've learned we went too far, what do we do with the generation of kids we experimented on?". Food for thought.

Considering how many old people hold on to beliefs like women belonging at home doing the housework and/or people of color being violent, I'm not sure such a conversation is going to lead anywhere. We let them vote until they die, or authorize someone else to cast the vote for them (like my 90yo grandma does through my 65yo conservative/racist dad), as though they still have an understanding of society. A good understanding of climate change is barely 40 years old; if the news didn't tell them, then they just got no education on the world's biggest ongoing challenge. People have just been letting the old generation be because we can't figure out anything better (I'm also not sure what a good system would look like).

To be fair, in Tokyo I see a lot of ISPs pushing 5G routers. Many buildings have fiber pulled to the basement and then use VDSL for the last meters, and I bet they'd rather move everyone over to 5G than start actually installing proper fiber internet. In Norway, 5G has been advertised as something groundbreaking and radical. We have been told "now surgery is finally possible over mobile networks" (hospitals don't have fiber??) and similar. Very like Apple in the 2010s: "The iPad can now be used by (good person) to do (good thing)". But nobody cares; real users don't see any benefit.

A normal person will probably never notice the difference between 4G and 5G given what they use their phone for, and giving every household a proper fiber line is probably a much bigger quality-of-life improvement. But ISPs don't want that future. They want everyone connected to neighborhood hubs that don't require last-100m cables and expensive construction. The same can probably be said of Starlink. It's "good enough", and that's good enough to get sales. They don't care about the quality of the product they deliver, or whether fiber is superior. They care about sales.


I can't help but sense a level of arrogance when they launch their product by writing an obituary for a competitor. Is this what people feel when they make fun of the "(product here) killer"?

Yeah I totally get the rule. I use LLMs when developing. In fact, I've been out of Claude tokens for the week since Wednesday, but I use Claude specifically for the boring, simple stuff I don't really want to do, but that Claude can. I'm simply not interested in discussing anything LLMs are able to do, it's not interesting.

It makes sense that a programming subreddit first and foremost discusses programming (the skill). We can go complain about Claude somewhere else if we want to.


Following up, anecdotally, the people I talk to who are excited about LLM development usually either care more about product development, or don't have enough programming skill to see how bad the software is. Nothing wrong with either, but it can get tiresome.

> people I talk to who are excited about LLM development usually either care more about product development

This is an interesting thing I've also noticed in public hobbyist forums/discussion spaces where someone who is more interested in making a "product" clashes with people who are just there to talk about the activity itself. It's unfortunate that it happens but it will self-correct over time (like /r/programming here) and the LLM enthusiasts of Reddit will find another place to discuss ways of using them.


This is cool, OTel is getting somewhere.

I've found OTel to still have rough edges, and it's not yet the one-stop shop for telemetry we want it to be at $work. In particular, there's no good (Sentry-style) exception capturing yet. They also recently changed a lot of metric names (for good reason, it seems), which breaks most dashboards you find on the internet.

I have been warned by people that OTel isn't mature yet, and I find that to still be true, but it seems the maintainers are trying to do something about it nowadays.


I think the "issue" with OTel is that instrumentation is easy and free (both as in beer and as in freedom), but for the dashboarding part there are literally tens of different SaaS solutions, all more or less involved in OTel development itself, that want you to use their product (and pay $$$ for it). And even though you can go a loooong way with self-hosted Grafana + Tempo, even Grafana Labs are putting more and more new features behind the Grafana Cloud subscription.

Yeah. The auto-dashboard stuff Grafana Cloud is doing nowadays is cool (even if it's just Greg in the engineering department writing heuristics), but I can't help but feel pissed that the OSS dashboards for OTel on Grafana's website aren't even up to date.

I use Grafana because it has value for me both at work and as a hobby, but it's becoming more painful to use Grafana for hobby projects, so I agree with your point.


> but I can't help but feel pissed that the oss dashboards for otel on Grafanas website aren't even up to date.

Indeed. But nowadays you can just spend an hour with Claude Code and get a pretty slick Grafana dashboard for whatever you need.


Do you have any suggestions for alternatives, then (besides Sentry)? I do feel OTel has pretty wide support in general in terms of traces.

I know a lot of shops that prefer the Datadog stack, which apparently does have its own Sentry-like exception capturing system. To me, exception capturing is an obvious core feature, and it is humiliating to discuss OTel with people who agree, yet use Datadog and are satisfied.

Maybe we'll get a resurgence of the LimeWire-style pranks people are so nostalgic for.


I want Arnold to tell me about pizza again soooooooooo bad.
