8.9 MB ...

That's including all of the factory cabs and the extra MIDI sysex overhead too. The actual code is likely much smaller than that.
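For context on the "sysex overhead" part: MIDI SysEx data bytes may only use 7 bits (the MSB must be 0), so firmware shipped as a .syx file is usually re-packed, which inflates it. A minimal sketch of one common packing scheme, seven 8-bit bytes into eight 7-bit bytes (roughly 14% overhead before headers/checksums); this is a generic illustration, not necessarily Fractal's actual framing:

```c
#include <stdint.h>
#include <stddef.h>

/* Pack 8-bit data into 7-bit-safe MIDI bytes: each group of up to 7
 * source bytes becomes 1 "MSB byte" plus the 7 bytes with MSBs cleared.
 * 'out' needs ceil(n/7)*8 bytes. Returns the number of bytes written. */
size_t pack_7bit(const uint8_t *in, size_t n, uint8_t *out)
{
    size_t w = 0;
    for (size_t i = 0; i < n; i += 7) {
        size_t chunk = (n - i < 7) ? (n - i) : 7;
        uint8_t msbs = 0;
        for (size_t j = 0; j < chunk; j++)
            msbs |= (uint8_t)((in[i + j] >> 7) << j);  /* collect the MSBs */
        out[w++] = msbs;                     /* one byte carries up to 7 MSBs */
        for (size_t j = 0; j < chunk; j++)
            out[w++] = in[i + j] & 0x7F;     /* data bytes with MSB stripped  */
    }
    return w;
}
```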
 
Yeah, seeing how much 'stuff' we get from a mere ~10 MB is remarkable, especially these days when a simple PDF reader program is 100+ megabytes o_O
Indicative of everything wrong with software today. Everything is bloated, slow and buggy.

We have 16 core computers running at 4 GHz but things seem slower than 10 years ago because software is so bloated.

Most of the programs I use now seem like a step backwards. MS Office seems to get worse with every release. The latest IDE for the Analog Devices DSPs is atrocious. Slow, buggy and they charge money for it. The previous version, which was free, was 10 times better and 1/10th the size.
 
Is it true @FractalAudio that you hand-code the firmware in assembly?
Not much on the Axe-Fx III compared to the other products. The Axe-Fx I and II had a lot of hand-coded assembly as do the FM3/9. The DSP used in the Axe-Fx III is very difficult to write assembly for. It's a VLIW processor so it's best to use the optimizing compiler and let it do the work. There are a bunch of intrinsics that we use that are essentially assembly instructions but you can call them from C/C++.
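For the curious, here's roughly what "intrinsics you can call from C" look like in practice. A minimal sketch assuming a TI C6000-family VLIW DSP and its compiler intrinsics; the function and buffers are a hypothetical illustration, not Fractal's actual code:

```c
#include <c6x.h>   /* TI C6000 compiler intrinsics header */

/* Mix two Q31 fixed-point audio buffers with saturation. Each intrinsic
 * maps to a single machine instruction, but the optimizing compiler
 * handles scheduling and software pipelining across the VLIW units. */
void mix_q31(const int *restrict a, const int *restrict b,
             int *restrict out, int n)
{
    for (int i = 0; i < n; i++) {
        /* _sadd maps to the SADD instruction: saturating 32-bit add */
        out[i] = _sadd(a[i], b[i]);
    }
}
```

You get the exact instruction you wanted without hand-scheduling the eight execution units yourself, which is the hard part of writing VLIW assembly by hand.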
 
Compared to the 6-7 GB for the average program I install at work... yeah, amazing. Then again, when I started at this job back in '86, the AT PC I used had a 10 MB hard drive.
I started with the IBM S1: 8'' floppies holding 256 KB (512 KB if you had a double-sided drive). That was crazy! The RAM was about 16 KB.
 
Yeah, no one cares about performance anymore. It makes me sad as a developer.

PS If you want to see how much data can be packed into a file of less than 100 KB, you should check out the project called "kkrieger". I bet you'll be shocked.
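The trick kkrieger leans on is procedural generation: instead of shipping texture and mesh data, ship a tiny formula and regenerate the content at load time. A toy sketch of the idea in C (a plasma-style texture; this is a hypothetical example, not kkrieger's actual generator):

```c
#include <math.h>
#include <stdio.h>

#define W 256
#define H 256

int main(void)
{
    static unsigned char tex[H][W];
    for (int y = 0; y < H; y++)
        for (int x = 0; x < W; x++) {
            /* A few sine terms produce 64 KB of pixels from ~3 lines of code. */
            double v = sin(x * 0.06) + sin(y * 0.09)
                     + sin((x + y) * 0.04)
                     + sin(sqrt((double)(x * x + y * y)) * 0.05);
            tex[y][x] = (unsigned char)((v + 4.0) * 31.875); /* [-4,4] -> [0,255] */
        }
    /* Dump as ASCII PGM so the result can be viewed in an image viewer. */
    printf("P2\n%d %d\n255\n", W, H);
    for (int y = 0; y < H; y++)
        for (int x = 0; x < W; x++)
            printf("%d\n", tex[y][x]);
    return 0;
}
```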
 
Not much on the Axe-Fx III compared to the other products. The Axe-Fx I and II had a lot of hand-coded assembly as do the FM3/9. The DSP used in the Axe-Fx III is very difficult to write assembly for. It's a VLIW processor so it's best to use the optimizing compiler and let it do the work. There are a bunch of intrinsics that we use that are essentially assembly instructions but you can call them from C/C++.
After the chungnus X2, will it significantly reduce the code size?
 
And I guess that's why FM3/9 updates lag a few weeks/months behind the Axe III; it must be a huge amount of work to "translate" the code
Yikes, maintaining similar but not identical functionality across platforms that are that different to code for would give me the dontwannas.

Props to Fractal, once again.
 
And I guess that's why FM3/9 updates lag a few weeks/months behind the Axe III; it must be a huge amount of work to "translate" the code
I don’t want to imagine the work that goes into translating and maintaining the firmware. I wonder whether there aren't any cheaper, slower TI DSPs sharing the same architecture and instruction set? If those were used in the FM3/FM9, it would make life a lot easier for Fractal Audio.
 
Indicative of everything wrong with software today. Everything is bloated, slow and buggy.

Yes indeed. It would seem that the availability of ever-faster hardware and development frameworks, plus the ease of pushing out updates over time, have made developers/vendors lazy and inefficient. Pride, quality, and craftsmanship, as far as they relate to most modern software, are sorely lacking. I can remember the days when software was highly optimized and lean, and releases were tested thoroughly in-house, since getting patches/updates out required physical media and was difficult, slow, and expensive to propagate.

Games are the absolute worst offenders these days; I see AAA titles going for $60-$80 that are extremely bug-ridden, clearly not functional, incomplete, and it's stunning that they are released in the state they are...in many cases they are in what could charitably be called a 'pre-alpha' stage. Then they have the gall to push out DLC packs (more $$$) when the game itself is still far, far from even just 'working'. And don't get me started on the nefarious trend of 'Pre-Purchasing' titles that are months in the future...amazing that people still partake in that.

One bright spot in all of this is the most recent version of Blender (v3.0, released last month), the 3D modeling software. The main rendering/ray-tracing engine, called Cycles, finally received its long-awaited, complete rewrite. The overall improvement in rendering efficiency/speed is astounding, almost an order of magnitude. Even on my older GTX 1080 GPU I can actually use the ray-tracing engine to model in the viewport in pretty much real time; while it's not ultra-snappy like the current RTX 30xx GPUs, the overall optimization is fantastic, and it's good to see some developers still striving to 'get it right'.

Almost wet myself when I got a 16K RAM pack as a present from my dad.

I hear you! I remember upgrading my Commodore PET 4016 (16K RAM) to a 4032 (32K RAM) by installing 16x1K RAM discrete chips onto the motherboard. They all came in a large tube, and after I carefully installed them (didn't want to bend any pins, etc.) I thought I was now able to take over the Universe since I now had a whopping 32K of RAM!!!
 
Indicative of everything wrong with software today. Everything is bloated, slow and buggy.

We have 16 core computers running at 4 GHz but things seem slower than 10 years ago because software is so bloated.

Most of the programs I use now seem like a step backwards. MS Office seems to get worse with every release. The latest IDE for the Analog Devices DSPs is atrocious. Slow, buggy and they charge money for it. The previous version, which was free, was 10 times better and 1/10th the size.
Also applicable to websites: the data volume of a typical webpage is now a major concern.
 
 
Yeah, no one cares about performance anymore. It makes me sad as a developer.
Well, we do, but the tradeoffs between coding time, code size, and performance mean nothing to most companies; they'd rather throw fast virtual hosts at the problem. Where I used to write in assembly and C, most companies now use higher-level languages and seldom use anything that's compiled. It's economic pressure, but also that most programmers aren't comfortable working inside debuggers.

I'd rather take one of the computers used to host a bunch of virtual machines and write code in C or C++ and let the optimizing compiler chew on it. I remember trying to debug some code the optimizer had worked over; it was quite different from what I wrote initially, and the resulting assembler output was very good. I could have optimized some of it further, but 95% of it was good enough that there was no point. Ah, back in the days when programmers were real men.
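For anyone who wants to watch the optimizer work, an easy way is to compile a small function to assembly and compare the -O0 and -O2 output. A minimal example (any C compiler; flags shown are GCC/Clang style, and the function is just a toy to feed the optimizer):

```c
/* dot.c -- try:  cc -O0 -S dot.c   vs.   cc -O2 -S dot.c
 * then compare the two .s files: at -O2 the loop is typically
 * unrolled and rescheduled until it looks nothing like the source. */
#include <stddef.h>

float dot(const float *restrict a, const float *restrict b, size_t n)
{
    float acc = 0.0f;
    for (size_t i = 0; i < n; i++)
        acc += a[i] * b[i];   /* candidate for unrolling/vectorization */
    return acc;
}
```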
 