• 0 Posts
  • 72 Comments
Joined 1 year ago
Cake day: June 11th, 2023



  • Interesting article. But as a veteran developer the whole AI trend reminds me of the outsourcing trend back in the mid 2000s.

    Back then Western developers (especially junior and mid levels) were seen by many companies as a waste of money. “We can pay for three developing world developers for the price we pay for one American/European one! Why are we wasting our money?!”

    And so the huge wave of layoffs (fuelled also by the dot com bubble bursting and some other things) kicked off. And many companies contracted out work to India, etc. It was not looking good for us Western developers.

    But then the other shoe dropped. The code being sent back was largely absolute shite. Spaghetti code, disparate platforms bound together with proverbial duct tape, no architectural best practices, design anti-patterns, etc etc. And a lot of these systems started falling apart and required Western developers and support engineers to fix them up or outright replace them.

    Now, this isn’t a slight on Indian or other developing world developers. I’ve met lots of phenomenal programmers from that part of the world. And developers there have improved a lot, and now there are lots of solid options for outsourcing there. But there are still language and culture barriers that are a hurdle, even today.

    But I digress. My underlying point is that there are similarities between today’s situation and what has happened before. Now, it’s very possible LLMs will go to the next level in several years’ (or more) time. But I still think we are a ways away from having an AI engine that can build a complex, sophisticated system in a holistic way and have it capable of implementing the kinds of crazy, wacky, bizarre business rules that are often needed.

    Additionally, we’ve heard this whole “developers are going to be obsolete soon” thing before. For 20 years I’ve been hearing that self-writing code was just around the corner. But it wasn’t even close in reality. And even now it’s not just around the corner.

    No doubt, AI will hit a whole nother level at some point. The stuff you can do with ChatGPT and the like is insane, even right now (though as another article here on Lenny earlier today said, quite a lot of LLM code output is of suspect quality, to say the least). And I know the market is rough right now for greener developers. But I think we’re about to see history repeat itself.

    Some companies will lean heavily into AI to write code, with only a few seniors basically just curating it and slapping it together. And other companies will find a middle ground of having juniors and seniors use AI in a more limited and careful way. Those latter companies will fare a lot better with the end product, and they will also be better prepared with regard to tribal knowledge transfer (which is another topic altogether). And when that epiphany is realized it will become the default approach. At least for another 10-20 years, until AI can change things up again.


  • Anyone remember the short-lived Great War of the Messenger Apps? For a few months back around… '98? '99? MSN tried really hard to shoehorn its way into working with AIM. About every day there would be an update for MSN Messenger to allow it to work with AIM. Then AOL would fuck with their own protocol to ice out MSN users again.

    I think these shenanigans also impacted the Trillian messenger app, which up until then had been flying under the radar of messenger interoperability.

    I might be getting some of these details wrong.




  • Great article! This kind of thing fascinates me. I’ve thought about this topic quite a lot over the past decade or two. Mostly in the context of my own personal digital data and the stuff created by people I love and care about. But also on a wider level.

    I’ve been backing up what I consider my most important stuff (including writing, audio, and artwork) onto M-DISCs for several years. Each disc is supposed to last around 10,000 years. But realistically, because of the organic elements in the disc, they ‘only’ last for about 1,000 years.

    That should be fine from a longevity perspective (assuming the discs themselves don’t get destroyed, obviously). But there’s still the question of whether future generations would have the ability to extract that data, even if it’s still there on physical media. Would they have the devices and the know-how to read and parse them back into a useable format?

    I guess if we hit another dark age then there will probably be more pressing concerns anyway. But it makes me sad to think of all that lost content - not only mine but so many other people who have created interesting stuff. Especially when one realizes that, like the article says, a lot of the early Internet has already been lost. And quite a few of those creators are no longer alive.

    To paraphrase Roy Batty: all those creations have been lost, like tears in rain.



  • Next time I have to get a new TV I think I’ll just get a large computer monitor and stream content via an old mini PC with Linux installed on it. Not an ideal solution, but I’m so tired of this invasive bullshit. At least that will cut out some of its vectors.

    After the recent Roku TOS fiasco I’m done with them. If manufacturers won’t give us a viable situation we will make one ourselves.

    Anyone know a good OS setup for reduced-ad streaming? I know about Pi-hole, but I’m talking about a way of actually streaming content (in addition to blocking ads at or near the router level).