Plucked completely out of my arse, but let's assume 6 billion total URLs in goo.gl.
At roughly 1 KB per record (short code, target URL, a bit of metadata), that's about 6 TB total.
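Rough maths, for anyone who wants to poke at the assumptions (the ~1 KB per record is a pure guess):

    # Back-of-envelope: 6 billion records at an assumed ~1 KB each
    records = 6_000_000_000
    bytes_per_record = 1_000  # guess: short code + long URL + metadata
    total_tb = records * bytes_per_record / 1e12
    print(total_tb)  # 6.0 TB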
Of course, lots of those will be duplicates, and text compresses really well, so call it a 3 TB database. A purpose-built data structure (or just compression over a sorted dump) could shrink that further.
I don't understand why they can't just release a static copy - or a searchable version - so that people can easily and quickly dereference all the goo.gl links printed in papers and books.
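For what it's worth, a dump like that would be trivial to query locally. A minimal sketch, assuming a hypothetical tab-separated dump file (gool_dump.tsv, one "code<TAB>url" pair per line - the file name and format are made up):

    import sqlite3

    conn = sqlite3.connect("gool.db")
    conn.execute("CREATE TABLE IF NOT EXISTS links (code TEXT PRIMARY KEY, url TEXT)")

    # Load the hypothetical dump, skipping any malformed lines
    with open("gool_dump.tsv") as f:
        conn.executemany(
            "INSERT OR IGNORE INTO links VALUES (?, ?)",
            (line.rstrip("\n").split("\t", 1) for line in f if "\t" in line),
        )
    conn.commit()

    def resolve(code: str) -> str | None:
        # Indexed primary-key lookup: one short code -> original long URL
        row = conn.execute("SELECT url FROM links WHERE code = ?", (code,)).fetchone()
        return row[0] if row else None

    print(resolve("fqXzd7"))  # -> the original long URL, or None if unknown

SQLite gives you an indexed lookup in a single portable file; a sorted flat file with binary search would work just as well and compress better at rest.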