• 0 Posts
  • 14 Comments
Joined 1 year ago
Cake day: June 27th, 2023




  • Funnily enough, one of the few legitimately impactful non-enterprise uses of AVX-512 I’m aware of is that it does a really good job of accelerating emulation of the Cell SPUs in RPCS3 (see the sketch below). But you’re absolutely right, those things are very funky and implementing their functions is by far the most difficult part of PS3 emulation.

    Luckily, I think most games either didn’t do much with them or left programming for them to middleware, so it would mostly be first- and second-party games that would need super-extensive customisation and testing. Sony could probably figure it out, if they were convinced there was sufficient demand and potential profit on the other side.
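
    As a taste of the AVX-512 point above, here’s a minimal sketch (my own illustration, not RPCS3’s actual code) of emulating the SPU’s “selb” bitwise-select instruction. Plain SSE needs three instructions for it; AVX-512’s vpternlog encodes the whole truth table in a single immediate:

    ```cpp
    #include <immintrin.h>

    // selb rt, ra, rb, rc on the SPU computes rt = (rc & rb) | (~rc & ra),
    // i.e. a bit-granular select between ra and rb controlled by rc.

    // Baseline SSE2 version: three instructions (and + andnot + or).
    __m128i spu_selb_sse2(__m128i ra, __m128i rb, __m128i rc) {
        return _mm_or_si128(_mm_and_si128(rc, rb), _mm_andnot_si128(rc, ra));
    }

    // AVX-512 version (requires AVX-512F + VL): a single instruction. The
    // immediate 0xD8 is the truth table for "take the bit from rb where rc
    // is 1, from ra where rc is 0".
    __m128i spu_selb_avx512(__m128i ra, __m128i rb, __m128i rc) {
        return _mm_ternarylogic_epi32(ra, rb, rc, 0xD8);
    }
    ```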


  • There are even rumours that the next version of Windows is going to inject a bunch of AI buzzword stuff into the operating system. Like, how is that going to make the user experience any more intuitive? Sounds like you’re just going to have to fight an overconfident ChatGPT wannabe that thinks it knows what you want to do better than you do, every time you try opening a program or saving a document.


  • The Xbox 360 was based on the same weird, in-order PowerPC core as the PS3 (the Cell’s PPE, a different beast from the out-of-order PowerPC 970), it just had three of them stuck together instead of one of them tied to seven weird SPEs. The TL;DR of how Xbox backwards compatibility has been achieved is that Microsoft’s whole approach with the Xbox has always been to create a PC-like environment which makes porting games to or from the Xbox simpler.

    The real star of the show here is the Windows NT kernel and DirectX. Microsoft’s core APIs have been designed to be portable and platform-agnostic since the beginning of the NT days (of course, that isn’t necessarily true of the rest of the Windows operating system we use on our PCs). Developers could still program their games mostly as though they were targeting a Windows PC using DirectX since all the same high-level APIs worked in basically the same way, just with less memory and some platform-specific optimisations to keep in mind (stuff like the 10MB of eDRAM, or that you could always assume three 3.2GHz in-order CPU cores with 2-way SMT).
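
    To make that concrete, here’s a purely hypothetical sketch of the model (render_frame and FrameStats are names I made up for illustration - this isn’t real Xbox or DirectX code): the bulk of the game targets one high-level API, while console-specific assumptions like the 10MB eDRAM limit live in small, clearly-marked blocks.

    ```cpp
    // Hypothetical illustration of shared code with per-platform tuning.
    struct FrameStats { int tiles_rendered = 0; };

    FrameStats render_frame() {
        FrameStats stats;
    #if defined(TARGET_XBOX360)
        // Console build: fixed hardware, so the 10MB of eDRAM can be assumed.
        // A 720p frame with MSAA does not fit in it, so render in tiles.
        const int tile_count = 3;
    #else
        // PC build: hardware varies, so take the generic single-pass path.
        const int tile_count = 1;
    #endif
        for (int t = 0; t < tile_count; ++t) {
            // The same high-level drawing code runs on every platform.
            ++stats.tiles_rendered;
        }
        return stats;
    }
    ```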

    Xbox 360 games on the Xbox One seem to be run through something akin to Dolphin’s “Übershaders” - in this case, per-game optimised modifications of an entire Xenos GPU stack implemented in software, running alongside the entire Xbox 360 operating environment in a hypervisor. This is aided by hardware-level support in the Xbox One’s CPU design for certain texture and audio formats common in Xbox 360 games, similarly to how Apple’s M-series SoCs implement optional x86-style memory ordering to greatly accelerate Rosetta 2.
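
    To make the memory-ordering analogy concrete, here’s a rough sketch (my illustration, not Apple’s or Microsoft’s actual implementation) of the cost that hardware support avoids. x86 guarantees total store order, so a translator running x86 code on a weakly-ordered ARM core has to emit barrier-carrying instructions for every translated memory access - unless the CPU can be switched into an x86-style ordering mode:

    ```cpp
    #include <atomic>

    // On x86, ordinary stores behave like release stores and ordinary loads
    // like acquire loads, so a faithful translation to a weakly-ordered CPU
    // must preserve that with explicit ordering on every access.

    void translated_store(std::atomic<int>& slot, int value) {
        // Compiles to stlr (store-release) on stock ARM64; a hardware TSO
        // mode would make a plain str correct instead.
        slot.store(value, std::memory_order_release);
    }

    int translated_load(std::atomic<int>& slot) {
        // Compiles to ldar (load-acquire) on stock ARM64; a plain ldr
        // suffices under a hardware TSO mode.
        return slot.load(std::memory_order_acquire);
    }
    ```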

    Microsoft’s APIs for developers to target tend to be fairly platform-agnostic - see Windows CE, which could run on anything from ARM handhelds to the Hitachi SH-4-powered Sega Dreamcast. This lets developers who are mostly experienced in coding for x86 PCs running Windows start writing programs (or games) for other platforms relatively easily using those APIs. It also has a beneficial side-effect: with their first-hand knowledge of those APIs, Microsoft can build compatibility layers on an x86 system that run code targeted at a different platform.



  • Centralisation in this instance refers to control over the network and standard itself rather than control over what’s posted on it. There’s no single authority that can unilaterally change how every Fediverse instance and system works - for example, there isn’t anyone who can decree that from now on Lemmy will no longer allow connections from Canada, or that nobody is allowed to post pictures of capybaras any more.

    It’s intended to prevent a /u/spez or Elon Musk situation where one asshole can bring down the entire ecosystem built around an API. Nothing stops anyone else from hosting their own instance if they dislike lemmy.world, whereas if you don’t like Twitter, you can’t just host your own copy of it.


  • Because the last decade has shown the rather terrible consequences of private, proprietary and profit-driven networks like Twitter and Meta’s various crap becoming de facto part of the common infrastructure of public life. A lot of public transit services publish service updates on Twitter. Most politicians have a Twitter presence. Many restaurants and small businesses don’t even have websites anymore - just Facebook pages.

    We want to stop exploitative for-profit entities from furthering their stranglehold on essential parts of everyday life. Nobody should be forced against their will to use crap like Facebook or Twitter, and that means advancing viable alternatives to those platforms that can fill the role they do in the internet era. If the “digital town square” idea is to live on, it should be as a commons like an actual town square, not a publicly-traded company or a billionaire’s personal cult compound.



  • I’d be fine paying Google for YouTube Premium if I could use it without being logged in. I’d take an access key for anonymous ad-free viewing for $20 a month. But Google is never going to offer that because the data-harvesting is the whole point of YouTube to them. Google is a data-slurping advertising company that dabbles in video, search and phones as side hustles.

    In any case, if they really do crack down on adblockers, there are always other methods of watching their videos ad-free, and if I really like a creator, I’ll subscribe to their Patreon or watch them on Nebula.


  • Possibly, now that we have much tighter integration between different chips using die-to-die interconnects like Apple’s “UltraFusion” and AMD’s “Infinity Fabric” to avoid the latency and microstutter issues that came with old-fashioned multi-GPU cards like the GTX 690 and Radeon HD 7990.

    As long as software can make proper use of the multiple processing units, I think multi-GPU cards have a chance to make a comeback… at least if anyone can actually afford the bloody things. Frankly, GPU pricing is a bit fucked at the moment even before we consider the idea of cards with multiple dies.


  • To be fair, a lot of these are accurate, or at least were at the time.

    • Multi-GPU just never caught on. There’s a reason you don’t see even the most hardcore gaming machines running SLI today.

    • The Wii’s novelty wore off fairly quickly (about the time Kinect happened), and it didn’t have much of a lasting impact on the gaming industry once mobile gaming slurped up the casual market.

    • Spore is largely forgotten, despite the enormous hype it had before release. It’s kind of the Avatar of video games.

    • It took years for 64-bit to become relevant to the average user (and hell, there are still devices being sold with only 4GB of memory even today!). Plenty of Core 2 Duo machines still shipped with 32-bit versions of Windows, and people didn’t notice or care: basically no apps average people used were 64-bit native back then, and since a 32-bit pointer can only address 2^32 bytes (4GB), you were lucky to have more memory than that in your entire machine, let alone need more than that for one program.

    • Battlestar Galactica (2003) fell off sharply after season 2 and its ending was some of the most insulting back-to-nature religious tripe that has ever had the gall to label itself as science-fiction.

    • Downloading movies over the internet ultimately fell through the cracks outside of piracy. Most people stream films and TV now, and people who want the extra quality tend to buy a Blu-ray disc rather than download from iTunes (can you even still do that with modern shows?)

    • I definitely know people who didn’t get an HDTV until 4K screens hit the market, and people still buy standard-def DVDs. Hell, they’re still outselling Blu-rays close to 20 years later. Calling HD a dud is questionable, but it was definitely not seen as a must-have by the general public, partly because that shit was expensive back in 2008.

    • The Eee PC and other netbooks were only good when they were running a lightweight operating system like Linux or Windows XP. Once Windows 7 Starter became the operating system of choice for netbooks, the user experience fell off a cliff and people tired of them. Which is a shame, because I love little devices like UMPCs.

    • The original iPhone was really limited for 2007. No third-party applications, no 3G support, no voice memos, you could only get it on a single carrier… the iPhone family did make a huge impact in the long run, but it wasn’t until the 3GS that it was a true competitor to something like a Symbian device.

    The only entry on this list that’s really off the mark is Facebook, which even at the time was quickly reshaping the world. And I say that as someone who hates Zuck’s guts and has proudly never had a Facebook account.