• 5 Posts
  • 112 Comments
Joined 2 years ago
Cake day: June 12th, 2023

  • If you’ve always wanted to pursue CS, do CS.

    Honestly, there’s a lot of hype around AI. Companies are trying to figure out how to incorporate LLMs into their workflows, but no one has meaningfully succeeded yet beyond using them as an automated StackOverflow (which is usually wrong or outdated, just like StackOverflow). Yeah, startups will claim that tools like Cursor have saved them hundreds or thousands of working hours, but then they get burned when their AIs leave API keys in the code and write security flaws into their services. In the best case, they’ve created a nightmare codebase that will significantly raise the turnover rate for their software developers.

    If you are actually passionate about CS, get a CS degree and don’t use AI for problem solving. Maybe use it for debugging or concept explanations if it gets better, but don’t let it solve problems for you. Designing solutions to problems, thinking critically about their strengths/weaknesses, and working through them is exactly what a CS degree is supposed to teach you to do, so don’t throw that away by having AI do your work for you.


  • This is absolutely not true. Yes, the computer science field is constantly changing, which is exactly why having a strong grasp of the fundamentals is so beneficial. Any competent CS program will teach you how to approach programming in general (data structures, algorithms, design concepts, protocol design, etc.) instead of focusing directly on specific languages, precisely because technology changes so frequently.

    In my entire 4-year CS degree, I only took one class whose content was specific to a particular programming language or technology. That class was called “Programming in C++” and it was an optional elective. Sure, a lot (not all) of my classes were based on specific languages (Java, JS and its frameworks, Lisp, C, C++, Python, etc.), but the content in them was easily applicable to most general programming. In some of my classes we were free to use whichever language we wanted, as long as we could get the compiler running in the submission server’s Docker environment.

    Yes, you can probably still become a software developer if you are dedicated enough to learn on your own, but in the current job market getting a CS job is definitely not a given anymore, especially when you’ll be competing against thousands of other resumes with CS degrees on them. A CS degree will make that learning process a lot easier, and it will probably give you a more complete understanding of everything.


  • Signal is private in the sense that other people can’t intercept your messages, including Signal themselves. The Signal app is open-source, so you can be relatively certain it isn’t tracking your decrypted messages, unlike closed-source apps like WhatsApp, Facebook Messenger, or any other supposedly private social media.

    Signal is not anonymous from an account standpoint, because you need a phone number to sign up, even if you can choose not to display it in your account.



  • That’s entirely fair for the use case of a small script or plugin, or even a small website. I’d quickly get annoyed with Python if I had to use it for a larger project, though.

    TypeScript breaks down when you need it for a codebase longer than a few thousand lines of code. I use pure JavaScript on my personal website and it’s not that bad. At work, where the frontend I work on has 20,000 lines of TypeScript (not counting the HTML files), it’s a massive headache.


  • Zangoose@lemmy.world to Programmer Humor@lemmy.ml · Evil Ones

    > This is the case for literally all interpreted languages, and is an inherent part of them being interpreted.

    It’s actually the opposite. The idea of “types” is almost entirely made up by compilers and runtime environments (including interpreters). The only things assembly instructions actually care about are how many bits a binary value has and whether it should be treated as a floating point number, an integer, or a pointer (I’m oversimplifying here, but the point still stands). Assembly instructions only care about the data in the registers (or at an address in memory) that they operate on.
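
    To make that concrete, here’s a minimal TypeScript sketch of the same 32 bits producing two different values depending only on which “type” the reader assumes:

    ```typescript
    // Same 32 bits, two "types": the hardware doesn't care how you read them.
    const buf = new ArrayBuffer(4);
    const view = new DataView(buf);

    view.setFloat32(0, 3.14);        // store the bit pattern of a 32-bit float
    console.log(view.getFloat32(0)); // 3.140000104904175
    console.log(view.getInt32(0));   // 1078523331 (the same bits read as an integer)
    ```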

    There is no part of an interpreted language that requires it to lack type-checking. In fact, many languages use runtime environments for better runtime type diagnostics (e.g. Java and C#) that couldn’t be enforced at runtime in a purely compiled language like C or C++. Purely compiled binaries are pretty much the only environments where automatic runtime type checking can’t be added without basically recreating a runtime environment inside the binary (like languages such as Go do). The only interpreter that can’t have type-checking is your physical CPU.
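
    As a rough TypeScript sketch of the kind of check a runtime makes possible (the assertion function here is made up for illustration):

    ```typescript
    // A hand-rolled runtime type check; a runtime can offer this because
    // values carry type information alongside the raw bits.
    function assertString(value: unknown): asserts value is string {
      if (typeof value !== "string") {
        throw new TypeError(`expected string, got ${typeof value}`);
      }
    }

    const parsed: unknown = JSON.parse("42"); // a number at runtime
    assertString(parsed); // throws a TypeError here, at the boundary,
                          // instead of failing later deep in unrelated code
    ```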

    If you meant that it is inherent to the language in that it was intended, you could make the case that for smaller-scale languages like Bash, Lua, and in some cases Python, dynamic typing makes them better. Working with large, complex frontends is not one of those cases. Even if this were an intentional feature of JavaScript, the existence of TypeScript at all proves it was a bad one.

    However, while I recognize that can happen, I’ve literally never come across it in my time working with TypeScript. I’m not sure what third-party libraries you’re relying on, but the most popular OAuth libraries, ORMs, frontend component libraries, state management libraries, graphing libraries, etc. are all written in pure TypeScript these days.

    This next example doesn’t directly return any, but it’s more ubiquitous than the admittedly niche libraries the code I work on depends on: many HTTP request services in TypeScript will fill in missing fields as undefined, even when the typing shouldn’t allow for that, because the type requirement doesn’t actually exist at runtime. Languages like Kotlin, C#, and Rust would all error because deserialization failed when something that shouldn’t be nullable had an empty value. Java might also have options for this, depending on the serialization library used.
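
    For example, here’s a minimal sketch of that failure mode (the endpoint and the User shape are made up):

    ```typescript
    interface User {
      id: number;
      email: string; // the type claims this is always present
    }

    // Hypothetical endpoint; nothing validates the response shape at runtime
    async function getUser(): Promise<User> {
      const res = await fetch("https://example.com/api/user");
      return res.json() as Promise<User>; // a compile-time assertion, nothing more
    }

    const user = await getUser();
    // If the server omitted `email`, this compiles fine but throws at runtime:
    // TypeError: Cannot read properties of undefined (reading 'toLowerCase')
    console.log(user.email.toLowerCase());
    ```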


  • As a TypeScript dev, TypeScript is not pleasant to work with at all. I don’t love Java or C# but I’d take them any day of the week over anything JS-based. TypeScript provides the illusion of type safety without actually guaranteeing it, because all it takes is one random library you depend on that returns and takes in any instead of using generic types.

    Unlike pretty much any other statically typed language, compiled TypeScript does nothing to ensure typing at runtime, and it won’t error at all if something else gets passed in until you try to use a method or field the value doesn’t have. It just fails silently unless you add your own type checks to functions/methods that are already annotated as taking in your desired types. Languages like Java and C# would throw an exception immediately when you try to cast the value, and languages like Rust and Go wouldn’t even compile unless you either handle the case or panic at that exact location. Pretty much the only language that handles this worse is Python (and maybe Lua? I don’t really know much about Lua though).
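
    A minimal sketch of that silent failure (shout is just a made-up example function):

    ```typescript
    function shout(message: string): string {
      // No runtime check: the `message: string` annotation is erased when compiled
      return message.toUpperCase();
    }

    // Pretend this value came from an untyped library or a JSON payload
    const fromLibrary: any = 42;

    shout(fromLibrary); // compiles without complaint, then fails inside the call:
    // TypeError: message.toUpperCase is not a function
    ```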

    TL;DR: TypeScript in theory is very different from TypeScript in practice, and that difference makes it very annoying to use.

    Bonus meme: