• 1 Post
  • 40 Comments
Joined 1 year ago
Cake day: June 18th, 2023



  • Urbanisation and deforestation are not the same as enshittification tho.

    It’s a bit unfortunate that “an increased degree to which something is shit” sounds like what the word should mean, and I suppose in a way it does.

    It’s nice to have a word that describes the investor-driven incentive to worsen a service/product in order to milk out more short-term revenue. The larger the market capture, the more of that can be pushed without an alternative becoming a threat.

    It’s the cycle of “provide a good quality service that makes everybody happy” -> market capture -> shareholders push for increased revenue at the expense of quality, as there is no competition.





  • There are two common types of laser printers. Those that use special paper that reacts to heat, such as receipt printers, would fit the description.

    The other laser printers… Hm, I don’t think your description is accurate either. It’s more that the laser electrically charges toner particles so that they jump onto a separate roller that then gets rolled onto the paper.

    I’m no expert though.



  • Absolutely. The ones you suggest are good examples.

    Good enough that, instead of “is/isn’t a programming language”, it becomes more of an “ah, so how do you define that, then?”. Now that I’ve had some sleep, one could argue I could have been nicer and suggested that approach for HTML as well. After all, words are just things that mean stuff and transfer a concept between people, provided they translate to the same(ish) idea. The moment the latter isn’t the case, they’re no longer very useful for the former.

    Most disagreements, I find, are just cases of different understandings. Discussions worth having are when both are correct but different, and both want to figure out why they differ. So, on second thought, I think I was appropriately rude.

    Both LaTeX and roff are Turing complete, but they are also DSLs with a somewhat narrow “domain”. It sounds exactly right that these blur the line between what is and isn’t a programming language. You could even argue that claiming one or the other is just one way of expressing how you understand that difference.


  • That’s such a weird point to make. Is it because to you, it seems like the line drawn is arbitrary? I cannot imagine any other reason. Certain words just mean certain things.

    Markup languages are exactly as much “programming” as you marking a word and hitting “bold”. Which is to say, nothing at all. People are wrong all the time, and I have a very limited amount of fucks to give when it happens.

    As for Scratch, it is a programming language. So, why would you think it’s a logical next step for me to say otherwise? Next, you’ll say something remarkably dumb in response. Resist the temptation, and do something more productive.









  • There is no good answer to that question. Too many don’t understand the importance of context/intention, and unfortunately they are also the same ones that get easily offended on behalf of everyone else. Part of the “but, the discussion should be more about meeee, and my feelings, and how offended I am”-mindset.

    Which is why “kid cosplaying as early days MJ” manages to offend people. Why you cannot watch the Community episode on D&D. Why they likely wouldn’t dare make Tropic Thunder today. Why many git branches are renamed to main, etc., ad nauseam.

    Given that the Unicode group could (and IMO should) define all things useful, I don’t see why “male-genitals-flaccid”, and variations, couldn’t be specified. It doesn’t require any application-level visual implementation, but the applications that do implement it, and have good reason to, can use a specification rather than make up their own. Not really a big deal, of course.

    If you can have

    “U+1F646 🙆 face with ‘OK’ gesture, described as a person with arms raised above the head forming a ‘circle’, interpreted as ‘OK sign’ (derived from the Japanese gesture for ‘OK’), intended as gender-neutral but represented as a woman on most platforms”

    then just throw in a penis or two. It’s… fine.



  • “I don’t think the Unix philosophy of having lots of small tools that do one thing and do it well that you compose together has ever been achieved”

    Why do you think this might be the case? It’s not remotely accurate, which suggests that you must understand it very differently than I do. To some extent, I am curious.

    I’ll give you a recent example, from just yesterday. I had a use case where some program had a memory leak, which would eventually lead to the system running out of memory. So, I “built a program that would monitor this and kill the process that used the most memory”. I don’t know how complicated this is in Windows and PowerShell, but it took about 2 minutes in Linux, and it very much leverages the Unix philosophy.

    Looks something like this:

    get_current_available_memory_mb() {
        # MemAvailable is reported in kB; divide by 1024 to get MB
        cat /proc/meminfo | grep MemAvailable | grep -oP '\d+' | xargs printf "%d / 1024\n" | bc
    }
    

    Functionality based on putting together very small pieces that do their things well.

    • /proc/meminfo is a virtual file that gives you access to information related to memory usage.
    • cat just outputs data from a file or stdin; here it reads that virtual file.
    • grep lets you filter stuff. The first one keeps the relevant line; the second strips out the number with a regex.
    • xargs does one thing well: it passes its stdin on to another command as arguments.
    • printf formats the output, here expressing the division by 1024 as the text “[number] / 1024”.
    • bc evaluates simple mathematical operations expressed in text.

    Result: one virtual file and five simple utilities, and you get the relevant data.

    The PID of the process using the most memory you can get with something like:

    ps aux --sort=-%mem | head -n2 | tail -n1 | awk '{print $2}'
    

    Same sort of breakdown: ps gives you access to process information, and handles sorting by memory usage. head -n2 just keeps the first two lines; the first one is a header, so tail -n1 keeps the second line. awk is used here to output only the second column, which is the PID. And you get the relevant data, again with simple tools that leverage the Unix philosophy.

    You then check if the available memory is below some threshold, and send a kill signal to the process if it is. The Unix way of thinking also stops you from adding an infinite loop to the script. You simply stop at making it do that one thing. That is: 1. check remaining memory; 2. if it is lower than X, kill that PID. Let’s call this “foo.sh”.
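
    A minimal sketch of what that “foo.sh” could look like, reusing the two pipelines above. The 512 MB threshold and the plain kill (SIGTERM) are placeholder choices, not something from the original script:

    #!/usr/bin/env bash
    # foo.sh: if available memory drops below a threshold, kill the biggest memory hog.
    # The 512 MB threshold and plain kill (SIGTERM) are placeholder choices.
    threshold_mb=512
    # Same pipeline as get_current_available_memory_mb above
    available_mb=$(cat /proc/meminfo | grep MemAvailable | grep -oP '\d+' | xargs printf "%d / 1024\n" | bc)
    if [ "$available_mb" -lt "$threshold_mb" ]; then
        # PID of the process currently using the most memory
        pid=$(ps aux --sort=-%mem | head -n2 | tail -n1 | awk '{print $2}')
        kill "$pid"
    fi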

    You get the “monitoring” aspect by just calling it with watch. Something like watch -n 2 -- ./foo.sh.

    And there you go. Every two seconds, it checks the available memory, and saves my system from freezing up. It took me 10 times longer to write this reply than to write the initial script.

    If memory serves me correctly, PowerShell also supports piping, so I would assume you could do similar things there. It would be weird not to, given how powerful it is.

    I could give you an endless list of examples. This isn’t so much a case of “has ever been achieved”, but… a fundamental concept, in use, all the time, by at least a dozen people. A dozen!

    Also yesterday, or it might have been Saturday, to give you another example: I scratched a different itch by setting up a script that monitors the clipboard for changes; if it changes and now matches a YouTube URL, it opens that URL in FreeTube. So, with that running, I can copy a YouTube URL from anywhere, and that program will immediately pop up and play the video. That, too, took about 2 minutes to do, and was also built using simple tools that do one thing, and one thing well.

    If you wanted it to also keep a local copy of that video somewhere, it wouldn’t be more effort than the 10 seconds it takes to also send that URL to yt-dlp. One tool, that does that one thing well. Want it to also notify you when that download is complete? Just add a line with notify-send "Done with the thing". What about the first example, if you want an OS-level notification that it killed the process? Just add a line with notify-send, the same tool that does that same one thing well.
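
    The clipboard watcher might look something like this sketch, assuming an X11 session with xclip installed and that the freetube command accepts a URL as an argument (the polling interval and the URL regex are just placeholder choices):

    #!/usr/bin/env bash
    # Watch the clipboard; when it changes to a YouTube URL, open it in FreeTube.
    last=""
    while true; do
        clip=$(xclip -selection clipboard -o 2>/dev/null)
        if [ "$clip" != "$last" ]; then
            last=$clip
            if echo "$clip" | grep -qE '^https?://(www\.)?(youtube\.com|youtu\.be)/'; then
                freetube "$clip" &
                # optional extras, as described above:
                # yt-dlp "$clip" && notify-send "Done with the thing"
            fi
        fi
        sleep 1
    done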

    None of this takes much effort once you get into it, because the basic tools are all the same, and they don’t change much. The whole workflow is also extremely iterative. In the first example, you just cat /proc/meminfo. Then you read it and identify the relevant line, so you add grep to filter out that line, and run the command again. It’s now a line containing the value, so you add another grep to strip out the number, and run it again. “Checks out”. So, you pipe that to printf, and you run it. If you fuck something up, no biggie; you just change it and run it again until that little step matches your expectations, and you move on.
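
    To make that iteration concrete, the build-up of the first example might look like this, each line being the previous one plus one more small tool, run and checked before moving on:

    # step 1: look at everything
    cat /proc/meminfo
    # step 2: keep only the relevant line
    cat /proc/meminfo | grep MemAvailable
    # step 3: strip out the number (still in kB)
    cat /proc/meminfo | grep MemAvailable | grep -oP '\d+'
    # step 4: turn it into a maths expression
    cat /proc/meminfo | grep MemAvailable | grep -oP '\d+' | xargs printf "%d / 1024\n"
    # step 5: evaluate it to get MB
    cat /proc/meminfo | grep MemAvailable | grep -oP '\d+' | xargs printf "%d / 1024\n" | bc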