• 2 Posts
• 101 Comments
Joined 1 year ago · Cake day: June 17th, 2023

  • Yeah, I mean if you want to get picky, the actual i386 processor family hasn’t been supported by the Linux kernel since 2012, and was dropped by Debian in 2007.

    Most people weren’t particularly affected by that, seeing as the last i386 chip was released in (I think) 1989!

    Debian’s choice to refer to the whole x86-32 line as i386 has always been a weird historical quirk.



  • Not all mainframes are ancient; new models are still designed and sold to this day. And the brand spanking new mainframes may still be running COBOL code and other such antiquities, as many new mainframes are installed as upgrades for older mainframes and inherit a lot of legacy software that way.

    And to answer your question: a mainframe is just a server. It’s a specific type of server with a particular specialism for a particular set of use cases, but the basics of the underlying technology are no different from any other server. Old machines (mainframes or otherwise) will always consume far more power per instruction than a newer machine, so any old mainframes still chugging along out there are likely to be consuming a lot of power relative to the work they’re doing.

    The value of mainframes is that they tend to have enormous redundancy and very high performance characteristics, particularly in terms of data access and storage. They’re the machine of choice for things like financial transactions, where every transaction must be processed almost instantly, data loss is unacceptable, downtime must be nonexistent, and spikes in load are extremely unpredictable. For a use case like that, the over-engineering of a mainframe is exactly what you need, and well worth the money over the alternative of a bodged-together cluster of standard rack servers.

    See also machines like the HP NonStop line of fault-tolerant servers, which aren’t usually called mainframes but which share a kinship with them in terms of being enormously over-engineered and very expensive servers which serve a particular niche.


  • Patch@feddit.uk to Linux@lemmy.ml · Canonical changes the license of LXD to AGPL · edited 7 months ago

    Projects which choose BSD/Apache type licences do so fully in the knowledge that their code may be incorporated into projects with different licences. That’s literally the point: it’s considered a feature of the licence. These projects are explicitly OK with their code going proprietary, for example. If they weren’t OK with it, they’d use a GPL-type copyleft licence instead, as that’s conversely the literal point of those licences.

    Being mad about your Apache code being incorporated into a GPL project would make no sense, and certainly wouldn’t garner any sympathy from most people in the FOSS community.



  • If youtube would make ads consistent (30 second ad break at the halfway point for <10 min vids; 15 second ad break at 1/3 of the way through the video and a second one at 2/3 of the video for 10–30 min vids; etc)

    This seems like a crazy amount of ads to me. On live TV, I wouldn’t expect more than one ad break every 15 minutes of broadcast, with fewer on things like feature films. YouTube is mostly short-form content; there’s no reason why there couldn’t just be ads at the beginning for the vast majority of content, with only the longer videos needing a different approach. If you’re mostly watching <20 minute videos, you’re still getting a similar number of “ad breaks” per viewing hour.

    The idea of having a 30 second ad break 5 minutes into a 10 minute video would 100% be unacceptable to me.



  • You can at least pay (quite a lot less than a cable subscription)

    Well, I should bloody well hope so, considering you also get far less than cable.

    YouTube is still mostly amateur or indie content, most of it short-form, and most of it frankly just not very good. There’s still stuff on there worth watching, and I know some people really do consume a lot of content on there in the manner of watching TV back in the day, but objectively it really isn’t the same thing as professional studio content. I can watch some random guy in Ohio do a 15-minute review of some niche thing I’m interested in as much as anyone can, but there’s no way I’d consider that worth the same value as a long-form TV series or feature film.


  • There’s a huge difference between nudity in the neutral sense of “no clothes on” and erotica (which may or may not involve nudity, but usually does).

    My kids see nudity all the time. They see me and their mum nude in the mornings when we’re getting dressed, they see people nude in the changing rooms at the swimming pool, they undoubtedly see other kids nude at nursery in the course of the day. That’s all normal and healthy.

    That’s not the same as letting my 3 year old watch porn.

    Porn is a complex subject even for adults, and absolutely needs an adult perspective to contextualise it, understand it, and potentially recognise when something about it is seriously wrong. This is something that is perfectly reasonable to limit to adults.



  • Why do you keep deleting your messages and re-replying with essentially the same thing?

    I’ll repost my reply to your last deleted message:

    As someone who has never had any particular compunction about sailing the digital seven seas, and who generally has a liberal view of copyright laws and overly comprehensive intellectual property protections, I really don’t give a hoot about whether publicly accessible websites have been used as training data for a website-creating system.

    If you don’t want people/machines to read your intellectual property, don’t post it on the internet.





  • It really all depends on what we’re talking about when we say “gaming” tbh. Proton on Steam will run literally thousands of titles in one click, no configuration necessary, flawlessly. But thousands of titles isn’t all titles. If you’re a gamer who is happy to play what works and miss out on what doesn’t, there are enough games on Linux to keep you playing for a hundred lifetimes. But if you’ve got a specific competitive multiplayer game in mind that implements anti-cheat, or you want to play all the biggest AAA releases as soon as they come out, you’re going to have a less positive experience.

    And yeah, Nvidia on Linux can really suck, too. Anybody buying/building a rig with Linux in mind should steer well clear. If you’re talking about an existing machine with Nvidia then you might get lucky and have an easy, straightforward time, or you might find yourself straight in at the deep end with a crash course in Linux sysadmin…


  • I don’t know if it’s still this way, but a decade and more ago (when I last had any professional contact with Microsoft’s development) the company was effectively divided into two competing factions: the Office people and the Windows people. They had wildly different priorities for the shared tech stack, and mutually exclusive demands on each other’s products, and there was a constant bun fight over who got their way. The surprising thing is that, even by that era, the Office faction was the dominant one; that’s where the real money was.

    Then I gather the Azure faction was born and has completely dominated both, coming to account for a massive majority of the company’s profitable business.

    The gaming people (Xbox and whatnot) were always poor relations, if you’re wondering, and MS R&D was its own eccentric little world which seemed to exist entirely outside of the universe inhabited by any of the others.


  • The poor devs aren’t even saying “no”. They’re just saying “what the hell is going on and why didn’t you ask us about this first”.

    Pretty poor form for the OP to use a “KDE Developer”-badged account when they didn’t have any backing from the KDE developers to make the post. Makes it look a lot more official than it actually is.


  • It’s just fancy virtualization. It’s not really wildly different from KVM/QEMU going the other way.

    It’s hard to get too excited about it. It’s not going to replace real Linux builds, which dominate the server space in a way which is never going to be meaningfully challenged by “Linux in a VM under Windows”.

    Microsoft implementing WSL is a concession that they’ve lost the server market and aren’t getting it back, and that if they don’t want to lose the workstation market as well they need to make sure that Linux development can happen easily on Windows boxes. Their business case for it is clear, and it’s really not got anything to do with classic EEE tactics.



  • I use Windows at work (it is a corporate laptop) but I don’t use a single app which is Windows-only and irreplaceable. My current job isn’t technology-focused, and I don’t really use anything except standard office-related software.

    In my previous job I was a software engineer and also used Windows (same reason; corporate laptop) but again everything I used would have worked in Linux.

    People should use whatever platform works best for them. I’m a Linux user at heart, but I’m all for using Windows if that’s the right tool for the job. But it’s not a “grown-ups need Windows, only teenagers can use Linux” thing. Most working people would do fine with Linux or Mac.