Answers to a reader's questions about bits of tech
It's time for a round of responses to reader feedback. This one is a little different in that it came in as a single blob of many questions from a single anonymous submitter. It's largely about technologies and programming languages, and I'll just warn you right now that I tend to be pretty boring in this regard. Read on if you must!
What is your take on AVX-512? Have you used it in any of your projects to accelerate your code? Have you dabbled in machine learning/deep learning? Is C++ still the language of your choice or are you using Python too? I would be surprised if you don't use Python because from what I hear, knowing Python or being able to learn it is pretty much a requirement for working at Google. Were you ever forced to learn a new language at any of your jobs or due to a client's requirements? How was the experience? How many programming languages are you fluent in? Do you prefer Intel or AMD these days? Have you done any GPGPU stuff? How about blockchain related stuff? I know. So many questions. I'm just curious :)
Taking them one at a time, then...
AVX-512? I had to look it up to be sure. It seems to be a bunch of Intel-specific extensions which let you crunch numbers in even more wild and wacky ways. Given that I don't write x86[-64] assembly directly, it's not anything that's deliberately been added to any of my projects. In the unlikely case that some compiler knows about it and decides to bake it into a binary, I *guess* it could be in there, but not because I asked for it.
The most likely case would be in the low-level GNU Radio number-crunching stuff (VOLK), but even then, it would be going on without me knowing much about it. I've never been much for graphics or anything of the sort, so all of the MMX, SSE, and other things which have come up over the past two decades are mostly off my radar.
Machine learning and deep learning? Nope, not really. This sounds like another number/math-heavy domain which is not really my bag in the first place. This ties back to the AVX question: the things I end up doing don't normally invoke any of that stuff.
C++ is still my default language for random stuff, not that I'm exactly building things en masse these days. I have my existing projects, and I've been experimenting with things like the "what if threads were no object" idea. Keep in mind that my choice of languages is driven by actual needs, just like it was six years ago.
I don't use Python if I can help it. There's a time and a place for Python, just like there's a time and a place for shell scripts, but you quickly exceed the limits of sanity with both. My last Python project of any scale was done on an existing codebase, in an environment where everything that wasn't Python was at best treated as if it didn't exist, and in some cases with active hostility. Rather than pull a "we have to rewrite this", I just stayed with the existing Python base and wrote everything that was needed inside of that.
Incidentally, this project is where I picked up my feelings about Python, Gunicorn, Gevent, and, by extension, Flask. The thing I had done with it was small and stupid, given that it was only being hit by humans and their web browsers, but the very same tech stack was being used for "services", and badly at that. It was incredibly inefficient and failed requests left and right, because it would do the previously described thing of accepting a pile of requests onto greenlets and then timing out a large number of them.
It should not surprise anyone that this system was based around *large* numbers of retries, and not just from the clients. There were also internal retry multipliers in place, such that when a backend system went down, its neighbors would ramp up their retries and boil it even harder. One request from the outside world could turn into dozens of requests on the inside, many of them wasted effort.
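To put rough numbers on it, here's a tiny sketch (my numbers, not theirs) of how those multipliers stack up when every layer in the call chain has its own retry budget:

    # Hypothetical figures for illustration: three layers between the outside
    # world and the backend, each willing to make up to three attempts per
    # call it receives.
    ATTEMPTS_PER_LAYER = 3
    LAYERS = 3

    worst_case = ATTEMPTS_PER_LAYER ** LAYERS
    print(f"1 external request -> up to {worst_case} attempts at the bottom")   # 27

Three layers of "just retry it" is all it takes to turn one click from the outside into a couple dozen hits on the thing that's already struggling.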
I won't even go into the scenario where service A would dispatch multiple identical requests to service B simultaneously, and would then take the first response and drop the second connection. When challenged, they called it an optimization. I called it a waste of resources and a rationalization of a terrible implementation.
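For the record, the pattern being defended there looks roughly like this sketch (mine, not their actual code): fire two identical requests, keep whichever comes back first, and abandon the other.

    import random
    import time
    from concurrent.futures import FIRST_COMPLETED, ThreadPoolExecutor, wait

    def call_service_b(payload):
        # Hypothetical stand-in for a real RPC or HTTP call to service B.
        time.sleep(random.uniform(0.05, 0.2))
        return f"response to {payload!r}"

    def hedged_call(payload):
        with ThreadPoolExecutor(max_workers=2) as pool:
            futures = [pool.submit(call_service_b, payload) for _ in range(2)]
            done, not_done = wait(futures, return_when=FIRST_COMPLETED)
            for f in not_done:
                # "Drop the second connection": best effort only. If it already
                # started, service B does the work anyway, which is the waste.
                f.cancel()
            return next(iter(done)).result()

    print(hedged_call("query 123"))

Every call pays for two requests whether the hedge helped or not.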
And people wonder why their cloud bills are so high, or why that app running on the phone in their hand seems so laggy at times. It's stuff like this.
Regarding Google and Python, well, yes, they did try to make me use it for a starter project way back in 2006. Given that I was having to learn the internal secret sauce at the same time, I decided that one unknown was enough and did it in C++ instead. It ended badly, in no small part because the reviewer seemed to have real issues with the fact that I rejected his precious Python.
I did work on a full-scale Python project at Google several years later, and got to see such amazing things as what happens when you add forks (without a following exec) to a program that's already running threads. I also got to see the general sense of "it just does that" when I remarked that we should not be able to ship something that amounts to this:
    if result:
        good_thing()
    else:
        syntax error
The fact that we couldn't catch this before shipping it to production and having it explode just blew my mind. As someone who had been compiling stuff for a very long time, I couldn't see why you would want to build a big system with many moving parts this way. It's completely avoidable!
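At least for the parse-level kind shown in that snippet, something as cheap as byte-compiling everything during the build and refusing to ship on failure would have caught it. Here's a minimal sketch (mine, with a made-up temp file standing in for the real handler); the typo-in-the-cold-branch variety needs a linter or actual test coverage on top, but the principle is the same.

    import py_compile
    import tempfile

    # A stand-in file with the same sort of bug as the snippet above.
    broken = "if result:\n    good_thing()\nelse:\n    this is not valid python\n"

    with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
        f.write(broken)
        path = f.name

    try:
        py_compile.compile(path, doraise=True)
    except py_compile.PyCompileError as err:
        print("refusing to ship:", err)   # fires at build time, not in production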
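And about the fork-without-a-following-exec thing a couple of paragraphs up: here's a minimal sketch (mine, not anything from that codebase) of why it's such a trap. After fork(), only the forking thread exists in the child, so a lock that some other thread happened to be holding stays locked forever.

    import os
    import threading
    import time

    lock = threading.Lock()

    def hold_lock():
        lock.acquire()          # this thread takes the lock and sits on it
        time.sleep(60)

    t = threading.Thread(target=hold_lock, daemon=True)
    t.start()
    time.sleep(0.1)             # give the thread a moment to actually grab it

    pid = os.fork()             # Unix only; the child inherits a locked lock
    if pid == 0:
        # Child: the thread holding the lock doesn't exist here, so nothing
        # will ever release it. Without the timeout, this would hang forever.
        print("child got lock:", lock.acquire(timeout=5))   # False, after 5 seconds
        os._exit(0)
    else:
        os.waitpid(pid, 0)

In a real program the lock is usually buried inside something like the logging module or a connection pool, which is why the resulting hangs look so wonderfully random.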
Have I been "forced" to learn a language for a job or a client? I guess? I mean, I went into Google in 2006 not knowing C++, not really, and I came out of it somewhat better than that (nobody ever really fully knows a language unless they wrote it, and maybe not even then). Years of doing stuff with it after that point has tightened it up quite a bit.
How many programming languages am I fluent in? Err, well, uh. Hmm. I guess the question is whether that involves looking at other things, or just conjuring it out of thin air on demand.
If you want something I can just do with a reasonable amount of depth, without looking at anything else, then that would be C++ and C.
Once you get into the realm of "would probably have to look something up", now you're talking about a great many other languages, many of which I used to use all day every day back when those were my "thing".
One such example is BASIC, in several variants: the "Commodore v2" flavor of MS BASIC, GW-BASIC, IBM's ROM BASIC found on their PC models, and yes, MS QuickBASIC on DOS. I can still grab any of those environments and do something dumb like making it print something in a loop, but the finer points might not be right there any more. It's been, what, 25 years or so since I last did any of this for real?
Another one: Turbo Pascal. Same thing here: I could probably get a program to compile and writeln a few dumb things to the display, but the details of stuff like file I/O (assign, rewrite, reset?) would require some refreshing glances.
The list just kind of goes on like this. I've also used Python, Java, Perl, JavaScript, and probably some other terrible things over the years to get something done. I wouldn't expect to be able to just write solid code in any of them right off the top of my head at this point. Sometimes that's from not bothering to get deep into it in the first place, and sometimes that's just from it evaporating over the years.
I generally assume that, given a need, I'll find a way to be useful in whatever context I land in. So, if you test me on what I know right now, you might get one answer, but if you look back on what I managed to do after the fact, you'll get something else entirely. Funny, given that interviews tend to look for the former and not the latter, right?
What do you know right now versus what can you do, ever? Those are very different things.
...
Regarding Intel vs. AMD, well, I hate hardware. I try to not keep track of it until it matters, like when a machine blows up and has to be replaced, or some new need calls for more CPU power than I have around now. Then I have to learn all of the new craziness the industry has come up with since the last time I looked and solve for another decent system.
I've had AMD-based machines and Intel-based machines. My only real preference in them was decidedly not wanting to burn down my house if the CPU thermal protection stuff failed. That one video of an AMD CPU starting to smoke when the heat sink was lifted off left a pretty big impression on me for several years afterward.
GPGPU-wise, well, again, not my thing. The combination of not doing graphics or math-heavy stuff and hating the world of hardware basically means I can just ignore whatever the latest nuttiness may be in that world. Not being a PC gamer or bitcoin miner also means I don't have to care about it.
All of those people who blew all that time gluing in add-on boards in the 90s, or doing one board for odd scan lines and another for evens, and fighting over FPS and all of that? Totally not my scene. I just stood off to the side and scoffed at the whole mess, and basically still do.
Just because some tech exists doesn't mean it's on my radar if it hasn't caused a problem or become the solution to a problem. There's just far too much stuff going on to even attempt otherwise.