Preventing Dangling References with Design Patterns

Tags: c, design, design-patterns, reference

I have been thinking about a design for custom handles. The idea is to prevent clients from copying large objects around. A regular handle class would probably suffice for that, but it doesn't solve the dangling-reference problem:

If a client holds multiple handles to the same object and deletes the object through one of them, all the others become invalid without knowing it, so the client could read or write memory it should no longer have access to.

Is there a design pattern to prevent this from happening?


Two ideas:

  1. An observer-like pattern where the destructor of an object would notify all handles.

  2. "Handle handles" (does such a thing even exist?). All the handles don't really point to the object, but to another handle. When the object gets destroyed, this "master-handle" invalidates itself and therefore all that point to it.

Best Answer

C++11 introduced two new utility classes: std::shared_ptr and std::unique_ptr. If you need to control a resource through multiple handles, then wrap it in a std::shared_ptr, which is a reference-counted smart pointer that cleans up the resource when there are no more pointers to it. std::unique_ptr is designed for single ownership, but it too will automatically clean up the resource it holds when the handle itself is destroyed.
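
A minimal sketch of the shared-ownership behaviour (Widget stands in for the large object):

#include <memory>

struct Widget { /* some large object */ };

void example()
{
    auto a = std::make_shared<Widget>();  // reference count: 1
    auto b = a;                           // a second handle, count: 2
    a.reset();                            // count drops to 1, the object stays alive for b
}   // b goes out of scope, the count hits 0, and the Widget is destroyed exactly once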

While the _ptr suffix suggests they only manage pointers, both classes can be customized to manage any kind of resource that has to be explicitly given back to the OS. Here's a hypothetical example that uses FILE* handles from C (why you would use this instead of streams is up to you...):

#include <cstdio>   // FILE, fopen, fclose
#include <memory>   // std::shared_ptr

struct FILE_deleter {
    void operator() (FILE* fp) const {
        if (fp)            // fclose(nullptr) is undefined behaviour
            fclose(fp);
    }
};

void f()
{
    // The deleter is passed to the constructor and runs when the last copy of sp goes away.
    std::shared_ptr<FILE> sp(fopen("some-file.txt", "r"), FILE_deleter{});

    // Now do whatever you want with sp: make copies of it, pass it around, whatever.
}
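
If only one owner is ever needed, the same deleter works with std::unique_ptr as well. A sketch, reusing FILE_deleter from above (with unique_ptr the deleter type is a template parameter):

void g()
{
    std::unique_ptr<FILE, FILE_deleter> up(fopen("some-file.txt", "r"));

    // up cannot be copied, only moved, so there is always exactly one owner.
}   // the file is closed here (if fopen succeeded)

And for the specific problem of other handles noticing that the object is gone, std::weak_ptr paired with std::shared_ptr gives you non-owning handles that can be asked whether the object still exists.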