ARC is dead, long live ARC

First, forgive me if some of my thoughts seem a bit contradictory or confusing. My brain is still steaming... and I am not sure what to think...

Please, no...

What took you so long...

Nooooo....

But before I start rationalizing...

ARC IS NOT BROKEN, IT JUST NEEDS SOME POLISHING...

I have been saying for a long time that the dual memory management model is not good in the long term and that there should be only one memory management system. But by removing ARC from the Linux compiler (10.3), with plans to remove it from the mobile compilers as well, I am afraid Embarcadero has opted for the wrong one.

Yes, ARC in Delphi has some issues, but those issues are fixable. Manual memory management is an obsolete technology. It has some niche use cases, but for general-purpose programming, you don't want to waste time on memory management. You want to focus on the real functionality of your code.

To be fair, every memory management model has its strong and weak sides. Each one is better suited to some purposes than others. On top of that, the differences in how they work have a significant impact on how we structure and write our code. Being well versed in one does not automatically mean you can easily move around in another. But beyond a few hard-core facts, preferring one model over another is mostly a matter of opinion. Picking a memory management system is really picking your poison.

Classic Delphi Compiler

Quite often I hear that the Delphi classic compiler has a great memory management model, where you have a choice between manual memory management and ARC. In reality, it is a huge mess. Delphi developers like it because they are used to it. To outsiders, it is just plain horrible.

The choice between reference counted objects and non-counted objects is not really a choice, and there is a huge gap between them. You are bound by the model hard-coded into the classes you use and interact with. It is only good if you can keep the interaction between those two models to a minimum; otherwise you have to carefully balance and handcraft code to avoid pitfalls.
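
As a minimal sketch of one such pitfall (hypothetical types, not from the original post): holding both an object reference and an interface reference to the same reference-counted instance in the classic compiler easily leads to premature destruction.

type
  IGreeter = interface
    procedure SayHello;
  end;

  TGreeter = class(TInterfacedObject, IGreeter)
  public
    procedure SayHello;
  end;

procedure TGreeter.SayHello;
begin
  Writeln('Hello');
end;

procedure MixedModelPitfall;
var
  Obj: TGreeter;
  Intf: IGreeter;
begin
  Obj := TGreeter.Create;  // object reference - not counted
  Intf := Obj;             // interface reference - count goes to 1
  Obj.Free;                // destroys the instance behind the interface's back
  // Intf is now dangling; when it goes out of scope the compiler-inserted
  // _Release call operates on freed memory - a crash waiting to happen
end;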

Basically, the duality of memory management systems that is currently a problem with cross-compiler code is also present in the mixed manual + ARC memory model of the Windows compiler. And what is even worse, reference counting interferes with proper OOP abstraction models based on interfaces.

ARC Compiler

A full-ARC compiler solves all those problems. It completely unifies the memory management model: you can more easily use abstractions, you don't have to micromanage memory, you can write less code, and you can write cleaner code. You can more easily manage complex scenarios. I had hoped that the current dual-compiler mess was just a necessary evil on the way towards the long-term goal of having ARC, and only ARC, on all platforms.

Manual vs ARC

The only strong side of manual memory management is performance. There is no reference counting overhead. Every other aspect is an actual weakness. You have to write more code and take care of cleaning up each and every object. More code -> harder to read -> more places for bugs to hide.
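
A small sketch of that difference (TStringList is from System.Classes; the ARC variant assumes the ARC compiler):

uses
  System.Classes;

// Manual memory management: every object needs an explicit cleanup block
procedure ManualVersion;
var
  List: TStringList;
begin
  List := TStringList.Create;
  try
    List.Add('Hello');
    // ... real work ...
  finally
    List.Free;
  end;
end;

// Under the ARC compiler the same code needs no cleanup block;
// the instance is released when the last strong reference goes out of scope
procedure ArcVersion;
var
  List: TStringList;
begin
  List := TStringList.Create;
  List.Add('Hello');
  // ... real work ...
end;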

In more complex scenarios with shared object references, if you don't have ARC at your disposal you would have to invent it yourself. Great, you might say, because Delphi allows you to use ARC and you have a choice. However, that only plays well if you don't have to mix your models. Using ARC through interfaces also means duplicating declarations that would otherwise not be necessary. Properties cannot be backed by plain fields; they have to go through getter and setter methods. That is a whole lot of code to write. And all access to an object instance made through an interface reference involves virtual calls, which are more expensive than static calls or direct access to field-backed properties.
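
For example (a minimal sketch with a hypothetical IPerson type, just to show the duplication):

type
  // The interface cannot expose a field, so the property must be
  // backed by getter and setter methods...
  IPerson = interface
    function GetName: string;
    procedure SetName(const Value: string);
    property Name: string read GetName write SetName;
  end;

  // ...and the implementing class has to repeat all of those declarations,
  // while a plain class could simply expose the field directly
  TPerson = class(TInterfacedObject, IPerson)
  private
    FName: string;
    function GetName: string;
    procedure SetName(const Value: string);
  end;

function TPerson.GetName: string;
begin
  Result := FName;
end;

procedure TPerson.SetName(const Value: string);
begin
  FName := Value;
end;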

The bottom line is that the ability to choose might, in the end, cost you more in terms of performance or code complexity than code that runs entirely on the ARC compiler.

The weak side of ARC is some performance overhead introduced by the reference counting mechanism - incrementing and decrementing the count for strong references and tracking weak references. However, compared to GC, that performance overhead is fixed and deterministic. In other words, you can locate bottlenecks in your code and optimize them as necessary. Of course, optimizations are not always easy or possible. No matter what, ARC-based code will always be a bit slower than manually managed code.

If you consider that ARC in Delphi is not exactly new (it has existed for 20 years now) and how much technology has progressed in the meantime (you carry more power in your pocket than you had on your desktop back then), ARC performance costs are not as problematic as they might seem at first.

Another weak side is strong reference cycles. You have to think about how you write your code and avoid or break cycles manually by writing some code - either marking particular references as weak or niling them manually at a certain point. This is the feature most commonly used as a strong argument against ARC.
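
A minimal sketch of breaking a typical parent/child cycle under the ARC compiler, using the [weak] attribute (hypothetical types):

type
  TChild = class;

  TParent = class
  public
    Child: TChild;           // strong reference: the parent keeps the child alive
  end;

  TChild = class
  public
    [weak] Parent: TParent;  // weak back reference: does not keep the parent alive
                             // and is zeroed automatically when the parent is destroyed
  end;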

As someone who has written a substantial amount of code for ARC (both in Delphi and Objective-C/Swift), I can assure you that in real-life code this weakness is not an actual weakness. It is more of a strength. It does make you think and spend more time organizing your objects, but all that thinking results in better code - cleaner and easier to maintain. It pays off significantly in the long run (and even in the short term).

The point is, once you get the hang of it, you will be able to avoid cycles with ease and you will do that automatically, just like you automatically write cleanup code in manual memory management.

The strong side of ARC is just about everything else. It simplifies code in simple scenarios, it simplifies code in complex scenarios, it allows you to focus on actual functionality and not memory management.

What is actually wrong with the ARC compiler in Delphi

Two things - breaking compatibility with existing code, and performance issues.

When it comes to performance, some of the issues caused by unnecessary reference counting triggers could be resolved by better compiler optimizations, some by writing better (more ARC-friendly) code without breaking compatibility, and some would require backward-compatibility-breaking changes.

I covered some possible (non-breaking) ARC code optimizations in Optimizing ARC with unsafe references.
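
The idea, roughly (a sketch of the technique, not the code from that post): marking a reference as [unsafe] tells the ARC compiler to treat it as a plain pointer, so assignments to it do not touch the reference count.

type
  TNode = class
  public
    Next: TNode;
    Data: Integer;
  end;

function SumList(const Head: TNode): Integer;
var
  [unsafe] Current: TNode;   // plain, non-counted reference used only for traversal
begin
  Result := 0;
  Current := Head;
  while Current <> nil do
  begin
    Inc(Result, Current.Data);
    Current := Current.Next; // with a strong reference, every assignment here would
                             // increment one reference count and decrement another
  end;
end;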

Some breaking changes would be simple in terms of the actual code changes - as simple as changing method signatures and adding const to object parameters - but those would be a nightmare for anyone who has to maintain backward compatibility. I wrote more about this in Optimizing ARC the hard way.
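
For instance (a sketch, assuming the ARC compiler):

// Without const the ARC compiler increments the reference count on entry
// and decrements it on exit, for every single call
procedure Process(Item: TObject);
begin
  // ...
end;

// With const the reference count is left untouched for the duration of the call;
// the change itself is trivial, but it alters the signature, which is exactly
// the backward compatibility problem described above
procedure ProcessFaster(const Item: TObject);
begin
  // ...
end;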

Besides parameters, there is another huge compatibility-breaking problem: TComponent and its notification system, which is not designed to work with ARC. This system is also responsible for DisposeOf, which has no place in ARC as a memory management system - it was introduced only to allow TComponent to function properly under ARC.
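
A rough sketch of what that looks like under the ARC compiler (TComponent is from System.Classes):

uses
  System.Classes;

procedure DisposeOfSketch;
var
  Owner, Child: TComponent;
begin
  Owner := TComponent.Create(nil);
  Child := TComponent.Create(Owner); // the owner holds a strong reference to the child

  Child.DisposeOf; // the destructor runs immediately and the Notification chain fires,
                   // even though other strong references (here the Child variable) still
                   // exist; the instance lingers in a "disposed" state until they go away

  Owner.Free;      // under ARC, Free just nils this reference; the count drops to zero
                   // and the owner's destructor runs
end;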

While porting existing code to the current ARC compilers is not that hard, solving all ARC issues by breaking code compatibility would require considerably more effort from all parties involved. Also, from that point on, maintaining backward compatibility with older code in the same code base would be mission impossible.

If we could have a fresh start, without the burden of existing code and frameworks hanging over our heads, ARC would be a fine choice for a memory management system.

In the words of Kurt Vonnegut: "A step backward, after making a wrong turn, is a step in the right direction."

I am just not absolutely sure that introducing ARC actually was a wrong move in the first place.
