Swiss Enterprise Software Engineer in 2024

A while ago, you could apparently speed it up by claiming your grandma would die if you didn’t get a good answer or if it was too slow.

Don’t know if this still works…

It’s fine-tuned to keep a polite tone, but it can certainly get racist or clearly passive-aggressive. Nonetheless, it really makes me laugh when people make a fuss about this. There are no thoughts behind an AI chat answer, nor a personality. It’s just a fuzzy search through the data it was trained on, and since that data was vacuumed up from the internet, there’s no guarantee that bad things won’t appear in a reply.

Behold the thousands of human “labelers” in countries with very low wages…

True, my comment was a bit sarcastic; maybe you’re not a native English speaker, sorry. I quite enjoy the conversations I have with ChatGPT because it definitely tries to maintain a helpful discussion rather than attacking me when it misinterprets my request. Maybe the idea of everyone having a robot that gets to know you and reacts empathetically will materialise. The mind boggles :wink:

No worries. English is not my mother tongue, but that doesn’t matter. My mind is fact-driven and processes incoming data coldly, so people often freak out about why I’m immune to (can’t be derailed by) the emotional side of a message. :wink: :person_shrugging:

1 Like

I find it good to try and concentrate on facts, but everyone has emotions. We can’t avoid them, can we? :wink:

I’ve been in IT for over 30 years, more than half of that in ‘enterprise’ IT. I think you’re really asking about AI as a tool for ‘code generation’. In pharma this will, imho, first hit the guys writing a lot of source for things like product research. But, imho, it won’t replace many jobs, because someone will still have to get the specs and validate the results; the process should speed up, though. In more mainstream functions there’s not much source code being built, as we’re generally relying on bought-in software, and there the vendors are promising improvements (particularly around easier data querying). Where it might have an impact is in ‘code’ for data mining, where, if properly trained, it might be able to create insights across currently disparate data sets. But here again the problem will be in it understanding the semantic value of otherwise indecipherable tables and columns. So, imho, AI isn’t your problem. But cheaper remote coders probably are.

If you can’t find something on Google, no AI model will help you solve it.

In other, non-code areas, and I suspect the same will happen to a lesser degree for coding, it will bring the quality of below-average workers up so that they become more competitive with better workers, which I think will ultimately create wage competition for them.

I agree with your observations on AI.
The up-skilling strategy for a generalist SWE is discussed less in the thread, apart from the obvious: learn to use AI to the level Google search has been used so far.
If we deviate a bit from the AI topic, what would be the additional skills to pick up to stay relevant? You know, the classical example often used is that HTML/CSS alone was a profession at one time, but soon such devs had to learn JS and more and more of standard software engineering.
So, if we speak of a generalist SWE, typically an enterprise back-end/full-stack kind of SWE, what would be the next skills to pick up so as not to die out like the dinosaurs =) Apart from the glorified AI-based searching.
I am thinking of:

  • becoming more of a DevOps engineer who can also do development, as that seems to be the trend
  • learning something really niche, like industrial SWE, AI (the actual work on ML etc.), or low-level programming in C++/Rust, but all of it is a bit of a risky bet? I.e. I am not sure there are more jobs like this than the small number of jobs in EE.

Is there a path that I am missing?

And one more thing: could it be that we have reached a point of saturation, in a way, where we don’t need better banking apps, better music/video streaming apps, better mail services, better social networks and better search engines, and a recession would result in a halt of investment in various “soft” product areas because they’re already “good enough”?

1 Like

One example I saw was where a guy was the best in the team, but with the introduction of generative AI a mediocre teammate essentially surpassed him. The best guy didn’t want to use AI, as the quality was less than what he produced and he had some artistic principles he wanted to uphold, but his boss loved the additional productivity and the quality was ‘good enough’. The guy quit soon after.

1 Like

Speaking of skills development, I believe it’s always essential to be an expert in something, a go-to person for any problem or question, and to constantly try new things with an open mind. To know what questions to ask (even of an AI) you need to cover the basics, and it also builds credibility if you have sound general judgement, which stems from broad experience.

Unfortunately, software development best practices are down the toilet nowadays. It’s an era of making things fast and good enough. Two to three years into product development the code becomes such a mess that the devs who wrote it seek management roles or move on to another “new project”. New hires come and go, until some ambitious team is formed to rewrite all the “legacy” stuff :joy: :joy:

2 Likes

This might be what’s coming in other industries as well, mightn’t it? On the positive side, a mediocre employee with a good handle on AI will be more productive than an excellent employee on her own. On the negative side, societies based on merit go out the window. Where do we end up? Communism, where we’re all “equally” skilled? Or do we slowly fade into oblivion while AI takes over? Possibly another thread on the impact of AI on society?

This is good if you know how to handle it, but it’s not required and can be seen as threatening by some.

Very true, some organizations (or just teams/departments) exhibit the monkeys syndrome: Organizational Culture and the 5 Monkeys Experiment | Intersol

In the end, either you manage to change the culture or you change jobs; either way, it’s a win long term.

1 Like

“a mediocre employee with a good handle on AI will be more productive than an excellent employee on her own”

Good news for the mediocre… but a mediocre employee, if they have a good handle on AI, isn’t mediocre. An excellent employee who doesn’t have a good handle on AI isn’t excellent.

I do use GPT-4 via Bing regularly. Sometimes it’s helpful, sometimes not. But I strongly doubt it will ever be able to take over my work.

It does raise a question about skills, though. If you have a skill, say drawing, you now have a much higher barrier to overcome before your drawing skills are so good that they exceed what AI can generate in seconds (and the quality of AI drawings is increasing all the time). How does this make the acquisition of such skills viable, when you will essentially get zero ‘benefit’ until you exceed a high and rising barrier?

People think that AI is not just computing; I’ve heard it said many times. I’m also not sure how well Bard and ChatGPT really learn. I’ve told them they’re wrong, they’ve accepted it, and then gone and repeated the same mistake instantly. Zero learning. They can do analytical tasks very well, but without enough input I find their interpretations of my requests nearly always wrong, whereas humans know how to read between the lines, like decisions based on what someone says with a given accent or in a given role. How does AI get to that?

1 Like