To my knowledge, it’s a name that came out of the free software / GNU movement. So, “hackers” as in clever problem solvers, not those that break into insecure systems.
Yeah, learning Rust has given me greater appreciation for C/C++. Like, the selling feature of all three is that they don’t use a runtime, which means you’re not locked into that ecosystem. You can create libraries with them that can be used from virtually any other language.
It’s also easy to say that the performance of Java, Python et al is fine, but having a larger application start up in 1 rather than 20 seconds is still always appreciated.
To be honest, I’m not the best to ask about Python. I need more rigid languages for my daily job, so it’s much quicker for me to just throw down a small project in one of those.
I do know, though, that Python comes with Tkinter out of the box. People usually don’t praise that all too much, but it’s probably fine for small GUIs.
However, it’s almost certainly worse than Powershell/.NET for creating Windows-only GUIs.
If you’d like to write GUIs on the Linux side, then I would frankly recommend not doing that.
No Linux sysadmin wants a GUI to deal with. If you give them a CLI, then they can automate that, i.e. integrate it into yet another (probably Bash) script.
Not to mention that most Linux servers don’t even have a graphics stack installed…
People use Bash for quick and dirty scripts, because it’s pretty much just a few symbols in between all the commands that they know and use all the time anyways. You don’t really ‘learn’ Bash in a dedicated manner, you rather just pick up on tricks and tidbits over years.
For more than that, you’d use Python, Ruby or a full-fledged programming language.
Personally, I would even go so far as to say that Powershell hardly added anything new that wasn’t already covered by an existing programming language…
That’s quite a toxic response, ma’am, and I do not understand why.
That is a very specific usage of the word “master”. We can adjust that, while continuing to use “master” in all the cases where it has nothing to do with slavery.
Because of inheritance.
Yeah, you need experience with the intermediate architecture sizes and enough information about the concrete use case to be able to settle on something that’s not an extreme. Both of those are a rarity in programming…
Problem is, even if they are capable of explaining it, it’s basically our job to learn things 8 hours a day. Trying to catch up someone who doesn’t have that same job is nearly impossible. Well, and you still want to rant/tell about your day for social interaction purposes.
Like, my mum would also sometimes ask what my (programmer) workday was like and I’d start telling her that we had to deploy onto a really old Linux system. Wait, hang on, Linux is an operating system. And an operating system is the software that makes computers go. Do you know what “software” is? Hmm, it’s like…
…And yeah, basically one computer science lecture later, I still haven’t told anything about my workday.
Sometimes, I can try to leave out such words, like “we had to roll out our software onto a really old computer”, but then I can practically only say “that was really annoying”. To actually explain how I slayed the beast, I do need to explain the scene.
Wut? Are we talking about one of those “salads” with mayo, eggs, bacon strips, croutons, sugary dressing and whatnot?
Because if not, then cherry tomatoes are going to be pretty much the sweetest thing you’ll find for your salad. I’d definitely still call them healthy, but not more so than the other ingredients of a salad…
Yeah, it’s intentionally obscure. Basically, x86 assembly code is a way of telling a processor what to calculate, at a very low level.
So, it’s similar to programming languages, but those actually get translated into x86 assembly code, before it’s told to the processor. (“x86” is a certain processor architecture. Others exist, too, most prominently “ARM”.)
But yeah, even with me knowing that much, I’d need to guess what `ret` and `int3` might do. Everyone knows `jmp` and `nop`, though, of course. 🙃
But didn’t they have to retrofit structured programming into COBOL? As in, if-else, loops etc. didn’t exist in COBOL originally; it was all just GOTO.
I guess, what I’m asking is: Does “not that bad” mean still pretty awful, but perhaps not as awful as one would expect for its age…?
I wonder if this is another effect of LLMs. Maybe it’s just really easy and lucrative to generate and sell books for these old languages.
I have heard it before, albeit tongue-in-cheek. So, like the server can be “running”, it can also trip and fall over.
Rule of thumb, which I feel gets you 80% there:
- If you store data in a struct, you want that struct to have ownership of that data. So, avoid storing references in structs.
- If you need to pass data into a function, you usually want to pass it as a reference.
This makes it so you have your data stored in some place with the ownership and from there you just pass data down into functions as references. It forces you to structure your program like a tree (which is often a very good idea to begin with).
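A minimal sketch of that rule of thumb (the `Config` struct and `describe` function are made up for illustration): the struct owns its data outright, and functions further down the tree only borrow it.

```rust
// The struct owns its data: String and Vec<u16>, not &str or &[u16].
struct Config {
    name: String,
    ports: Vec<u16>,
}

// Functions lower in the tree borrow the data via a reference.
fn describe(config: &Config) -> String {
    format!("{} listens on {} port(s)", config.name, config.ports.len())
}

fn main() {
    let config = Config {
        name: String::from("server"),
        ports: vec![80, 443],
    };
    // Ownership stays here; we only lend the data out.
    println!("{}", describe(&config));
}
```

Because ownership never leaves `main`, there are no lifetime annotations to juggle, and the borrow checker has an easy time with it.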
You know, at some point in my career I thought it was kind of silly that so many programming languages optimize so much for speed.
But I guess that’s what you get for not doing it: people have to leave your ecosystem behind, spreading across NumPy/Polars, Cython, plain C/Rust and probably others. 🫠
They’ve declared it as WONTFIX, so unless you’re suggesting that OP creates a fork of numpy, that’s not going to work.
So many people here explaining why Python works that way, but what’s the reason for numpy to introduce its own boolean? Is the Python boolean somehow insufficient?
Well, yeah, but they do mean the exact same thing, hopefully: true or false
Although, thinking about it, someone above mentioned that the numpy `bool_` is an object, so I guess that is really: true or false or null/None
Man, can you imagine? You’ve written your paper. You generate the LLM-justified version and proof-read it all the way through. But then you realize you still need to add one more bit of info to your opening paragraph. The LLM will rewrite your entire paper once again and you get to proof-read it another time. 🫠