So that's a million academic citations deleted.
https://scholar.google.com/scholar?start=90&q=%22https://goo.gl/%22&hl=en&as_sdt=0,5
Hundreds of thousands of books with irrevocably broken links.
@Edent
I don't have much sympathy for folk who haven't used a DOI. They're not catchy, but have the clear purpose of avoiding exactly this problem.
Some back-of-the-fag-packet maths.
goo.gl URLs are 6 characters long, drawn from a-z, A-Z, and 0-9. That's a total of 62^6 combinations - about 56 billion.
The realistic max length of a URL is 1,024 bytes.
56 billion codes × 1,024 bytes gives an upper bound on the size of the database of around 58TB.
Various sources reckon the average length is about 80 chars. Let's call it 100 bytes. So about 5.7TB.
goo.gl only lasted 10 years (ancient for Google!). Bit.ly claimed to shorten 600 million links per month. I doubt Google was doing those numbers.
So…
Plucked completely out my arse, let's assume 6 billion total URLs in goo.gl.
That's about 600GB total.
Of course, lots of those will be duplicates. And text compresses really well.
So call it a 300GB database. Obviously, it would probably be a different data structure which could make it more efficient.
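For anyone who wants to check the arithmetic, here's the whole estimate as a few lines of Python. The inputs are the same guesses as above - 6-character case-sensitive alphanumeric codes, ~100 bytes per stored URL, and a made-up 6 billion total links:

```python
# Back-of-the-fag-packet storage estimate for the goo.gl database.
# All inputs are guesses from the text above, not real Google numbers.

ALPHABET = 62                       # a-z, A-Z, 0-9
CODE_LENGTH = 6
keyspace = ALPHABET ** CODE_LENGTH  # every possible goo.gl code

MAX_URL_BYTES = 1024                # pessimistic per-URL size
AVG_URL_BYTES = 100                 # generous average
TOTAL_LINKS = 6_000_000_000         # wild guess at links actually created

upper_bound_tb = keyspace * MAX_URL_BYTES / 1e12
avg_bound_tb = keyspace * AVG_URL_BYTES / 1e12
estimate_gb = TOTAL_LINKS * AVG_URL_BYTES / 1e9

print(f"keyspace: {keyspace:,}")                    # 56,800,235,584
print(f"upper bound: {upper_bound_tb:.1f} TB")      # 58.2 TB
print(f"avg-length bound: {avg_bound_tb:.1f} TB")   # 5.7 TB
print(f"realistic estimate: {estimate_gb:.0f} GB")  # 600 GB
```

Halve that last figure for duplicates and compression and you're in the hundreds-of-gigabytes range - trivial by Google's standards.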
I don't understand why they can't just release a static copy - or a searchable version - so that people can easily and quickly dereference all the goo.gl links printed in papers and books.
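To show how little is actually needed, here's a minimal sketch of what looking up a short link against such a static dump could look like. The tab-separated dump format and the sample data are entirely hypothetical - no such file exists:

```python
# Sketch: dereference a goo.gl link against a hypothetical static dump,
# assumed to be tab-separated lines of "code<TAB>long URL".
import io


def load_dump(fh):
    """Read a code -> URL mapping from a TSV stream."""
    mapping = {}
    for line in fh:
        code, url = line.rstrip("\n").split("\t", 1)
        mapping[code] = url
    return mapping


def dereference(mapping, short_url):
    """Resolve https://goo.gl/<code> against the static dump."""
    code = short_url.rsplit("/", 1)[-1]
    return mapping.get(code)


# Made-up example data:
dump = io.StringIO("abc123\thttps://example.com/paper.pdf\n")
table = load_dump(dump)
print(dereference(table, "https://goo.gl/abc123"))
# https://example.com/paper.pdf
```

A plain dict in memory is enough for illustration; a real release would presumably be a SQLite file or similar that librarians and archivists could query directly.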