
Not much on the Axe-Fx III compared to the other products. The Axe-Fx I and II had a lot of hand-coded assembly as do the FM3/9. The DSP used in the Axe-Fx III is very difficult to write assembly for. It's a VLIW processor so it's best to use the optimizing compiler and let it do the work. There are a bunch of intrinsics that we use that are essentially assembly instructions but you can call them from C/C++.
Added to the Wiki. It's interesting to hear, at least to us (ex-)programmers.

I remember reading a comment about the TI development tools and it sounded like you were frustrated with it. Has it improved?
 
Can you imagine if Stack Exchange didn't exist to do 100% of their thinking how those kids would survive???
Heh… I did a lot of work in one of the Stack Overflow areas and was one of the top contributors in several of them for years, and the gradual decline of the questions asked as the site got more popular was telling: people didn't try to solve their own problems or even to ask well-framed questions, and it paralleled what I saw in the office when we'd get new hires. Now I just curate and edit on SO and concentrate on my playing.
 
People didn't try to solve their own problems or even to ask well-framed questions, and it paralleled what I saw in the office when we'd get new hires.

I hear this 100%; I spent decades as a Network Administrator, and the first thing I'd ask students who were fresh out of college/school was whether they'd ever used a 'command line' to get things done, be it on a Linux system, routers, VPN servers, switches, proxy servers, etc.

That simple question was pretty much all that was required to judge the overall level of their current, and future, competence in the field.

Sometime around the late '90s, it became clear that without a fancy GUI front-end, the overwhelming majority were fully, and completely, lost.

I'd ask: so if the GUI went down on a remote system (which happened a lot), do you know how to SSH into the box and fix things at that level?
Sadly, I'd get the deer-in-the-headlights look more often than not.
 
Can you imagine if Stack Exchange didn't exist to do 100% of their thinking how those kids would survive???
Stack Overflow/Exchange is a great resource. The way I see it, in today's world you should know the fundamentals of your chosen language(s) well, but you should not have to care about things like "what is the most optimized way to loop through a data tree". That's the kind of stuff you should be able to just google and adapt to your own code.

The real difference is that some junior level programmers are not capable of the adapt part. They try to google for a solution and when presented with one, still don't understand how to adapt it to their own work. That's why mentoring when working is important because everyone starts somewhere, but I do feel that some very expensive projects I've been in have had too inexperienced developers involved in them. Welcome to the world of enterprises outsourcing development to the cheapest bidder.

The second type of person I tend to encounter is the Computer Science graduate. They are smart and driven, but tend to make overly complicated systems. I don't have a CS degree but I know I have been in the same spot in the past, making software architectures that I thought were smart but were just too complex for their own good.

I am a full stack web developer. Basically that's a fancy title for someone who can work with a wide variety of stuff. My work involves anything from user interfaces to mobile apps, backend architectures, cloud systems, databases, etc. In the past year I have been programming in TypeScript/JavaScript, Golang, PHP, Kotlin, Java and Swift for different clients.

As for past software being better, it's not that modern software is unoptimized. New software often performs fine. It's the legacy stuff that is the real issue. Something like Microsoft Office is such a behemoth at this point that it would benefit from being rewritten from scratch, but it would be way too costly to do so. Same deal for, say, Adobe software: it's gigabytes in size, while something like Affinity Photo can do many of the same tasks at a fraction of the size and cost while sporting better performance.

Modern software is also a lot more complicated. It needs to run on a variety of systems, scale nicely, and have high-res graphics according to current trends, and people are not willing to wait more than a few seconds for it to do things.

At the same time, there are folks above the developers who want to release things way before they are ready, to fit fiscal quarters and collect their end-of-year bonuses. That's especially true for games, which, based on reports from my friends in the industry, are often held together by shoestring and gum because there's immense pressure to get things out over doing things right.
 
I hear this 100%; I spent decades as a Network Administrator, and the first thing I'd ask students who were fresh out of college/school was whether they'd ever used a 'command line' to get things done, be it on a Linux system, routers, VPN servers, switches, proxy servers, etc.

That simple question was pretty much all that was required to judge the overall level of their current, and future, competence in the field.

Sometime around the late '90s, it became clear that without a fancy GUI front-end, the overwhelming majority were fully, and completely, lost.

I'd ask: so if the GUI went down on a remote system (which happened a lot), do you know how to SSH into the box and fix things at that level?
Sadly, I'd get the deer-in-the-headlights look more often than not.

We do have significantly more complicated and interconnected systems than we did in the 90's.

We also have significantly more systems to manage. People who can get down to that level are significantly harder to come by and the systems can't go completely unmanaged.

Your point isn't lost on me however. "Get off my lawn!!!" too :)
 
Yes, some very good points have been raised and articulated. The issues of the current state of software are certainly not solely attributable to developers, either experienced or noobs...I agree.

As brought up and expanded on here, the whole issue has many facets: bean counters insisting on release, schedule pressure, the need to generate revenue 'this quarter', the need to support legacy systems with no luxury of a complete, new rewrite, increased overall complexity/integration, etc.

There are certainly fantastic, knowledgeable developers doing great things these days, for sure. Affinity Photo, Reaper, Blender, numerous projects on GitHub, RawTherapee (a RAW photographic image processor), and dozens more are worthy of respect and stand on their own merit.

While I assuredly have benefited from mentors and guidance from various sources over the years, a key, and important, point that's been made by @laxu is being able to actually adapt, and apply, said guidance provided by Google/Stack Overflow, et al. I've always been thankful that one of my greatest assets has been my ability to 'make leaps of intuition' regarding technology, and never relied on being spoon-fed to sort out difficult issues in integration/implementation.

I've had to connect/commission systems/tech that was so new that many of the documentation pages simply had placeholders saying "Content To Be Forthcoming", which, obviously, doesn't help when the fancy new $50,000 router/VPN/firewall isn't working, right out of the box, as advertised.

Google/Stack Overflow, and even vendor tech support, cannot help as "we've never seen your particular deployment scenario before" and you have to figure it out...simple as that. I'd stress, get major thinky-pain, walk around the block, and eventually think of new things to try, different ways of doing things, all of that. Many of my workarounds, bug reports, and methods have made their way into official documentation/errata, etc.

So, I guess where I'm coming from, I've seen a steady decline in general overall competency in technology. And, while correlation isn't necessarily causation, IMO that all started around the late 90's - early 2000's, when a lot of techs/devs stopped thinking for themselves since they could solely rely on being able to be spoon-fed answers via the Internet. I definitely have benefited, and still do, from the vast resources the Internet has to offer and lean on it a lot, but will maintain that it's a crutch for too many...it's enabled a 'cut-and-paste' generation that stops dead if the exact solution to whatever problem they are having isn't fully spelled out for them.

LOL yes, rant over....now get off MY lawn!!
 
Not much on the Axe-Fx III compared to the other products. The Axe-Fx I and II had a lot of hand-coded assembly as do the FM3/9. The DSP used in the Axe-Fx III is very difficult to write assembly for. It's a VLIW processor so it's best to use the optimizing compiler and let it do the work. There are a bunch of intrinsics that we use that are essentially assembly instructions but you can call them from C/C++.
Nice, thanks for the insight! I am morbidly obsessed with how the firmwares are coded 😀
 
Indicative of everything wrong with software today. Everything is bloated, slow and buggy.

We have 16 core computers running at 4 GHz but things seem slower than 10 years ago because software is so bloated.

Most of the programs I use now seem like a step backwards. MS Office seems to get worse with every release. The latest IDE for the Analog Devices DSPs is atrocious. Slow, buggy and they charge money for it. The previous version, which was free, was 10 times better and 1/10th the size.
True story. In the past, people cared more about resources; every byte counted. But laziness is the engine of progress, and here are the consequences: endless resource consumption, memory leaks, garbage collectors. IDEs have become monsters: autocomplete, navigation, plugins, all in one.

You guys are so lucky to have been able to witness the birth of this progress. PCs were so expensive and hard to get, at least in my town. I got my first computer in 2007, during my first year of university. For half a year I had to write programs on paper and pray they would compile and work on exams.
 
Yes indeed. It would seem that the availability of ever-faster hardware and development frameworks, plus the ease of pushing out updates over time, have made developers/vendors lazy and inefficient. Pride, quality, and craftsmanship, as far as most modern software is concerned, are sorely lacking. I can remember the days when software was highly optimized and lean, and releases were tested thoroughly in-house, since getting patches/updates out required physical media and was difficult, slow, and expensive to propagate.

Games are the absolute worst offender these days; I see AAA titles going for ~$60-$80 that are extremely bug-ridden, clearly not functional, and incomplete, and it's stunning that they are released in the state they are...in many cases they are in what could charitably be called a 'pre-alpha' stage. Then they have the gall to push out DLC packs (more $$$) when the game itself is still far, far from even just 'working'. And don't get me started on the nefarious trend of 'pre-purchasing' titles that are months in the future...amazing that people still partake in that.

One bright spot in all of this is the most current version of Blender (v3.0, released last month), which is 3D modeling software. Its main rendering/ray-tracing engine, called Cycles, finally got its long-awaited, complete rewrite. The overall improvement in rendering efficiency/speed is astounding, almost an order of magnitude. Even on my older GTX 1080 GPU I can actually use the ray-tracing engine to model in the viewport in pretty much real time; while it's not ultra-snappy like the current RTX 30xx GPUs, the overall optimization is fantastic, and it's good to see some developers still striving to 'get it right'.



I hear you! I remember upgrading my Commodore PET 4016 (16K RAM) to a 4032 (32K RAM) by installing 16x1K RAM discrete chips onto the motherboard. They all came in a large tube, and after I carefully installed them (didn't want to bend any pins, etc.) I thought I was now able to take over the Universe since I now had a whopping 32K of RAM!!!

Me with my TRS-80 with 4k RAM getting jealous!
 
My first taste of coding was that rubbery keyboard that was wired to an Atari 2600 cartridge. I reckon I was about 7. I remember a friend of my brother’s would come over with a BOOK he had, and we would type the code in, and then save it onto an audio cassette, which took like 10 minutes to load. Sometimes the book had mistakes, and didn’t we feel just so badass fixing them!

Now, working on cloud stuff, everything is so complicated; it feels like hardly any of the code actually does anything: it's all Docker files and JSON manifests. I miss the simpler times of arguing with my brother about the difference between bits and bytes, squeezing everything into a single BASIC file, and having to think really hard about what you were typing because the turnaround was so long!
 
My first taste of coding was that rubbery keyboard that was wired to an Atari 2600 cartridge. I reckon I was about 7. I remember a friend of my brother’s would come over with a BOOK he had, and we would type the code in, and then save it onto an audio cassette, which took like 10 minutes to load. Sometimes the book had mistakes, and didn’t we feel just so badass fixing them!

Now, working on cloud stuff, everything is so complicated; it feels like hardly any of the code actually does anything: it's all Docker files and JSON manifests. I miss the simpler times of arguing with my brother about the difference between bits and bytes, squeezing everything into a single BASIC file, and having to think really hard about what you were typing because the turnaround was so long!
My first programming was on my dad's handheld HP calculator, kinda like BASIC as I recall, on a tiny 1x32 (I think) display. Good times.
 
To the folks ragging on the young developers, it's pure nonsense. They're no worse than developers have been since there were developers.

Yea, seeing how much 'stuff' we get from a mere ~10 MB is remarkable, especially these days when a simple PDF reader program is 100+ megabytes o_O

What makes you think there is anything simple about writing a PDF reader? They have a storage system, encryption, a scripting language, forms, support for all manner of vector and raster graphics decoders, embedded fonts, accessibility, PostScript, etc. And PDF files have been around for nearly 30 years; the combinatorial explosion of what must be supported is extremely daunting.
 
What makes you think there is anything simple about writing a PDF reader? They have a storage system, encryption, a scripting language, forms, support for all manner of vector and raster graphics decoders, embedded fonts, accessibility, PostScript, etc. And PDF files have been around for nearly 30 years; the combinatorial explosion of what must be supported is extremely daunting.

Time for a new document standard then....:cool:
 