March 26, 2007

McAdams makes the point that “time-based media require a lot of redundancy” and that hypertext links can avoid this. What I have seen, however, is the opposite. The New York Times, for example, places the same article in two, three, or more online “sections,” whereas it would appear in only a single print section. I understand how this can be useful (much as assigning a book multiple call numbers beats the old “mark it and park it” approach), but I still find it annoying to finish a featured article in the Education section only to find it featured again in Technology or wherever. Perhaps this will improve when the Times actually gets its “My Times” feature going.

Regarding the idea of breaking larger documents into hyperlinked smaller ones: I worry that this will drop material out of search results, because two search terms that once sat in the same document might end up in separate pieces. Is the solution to keep the material together for the search engine and separate for viewing?
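That last question can be made concrete with a small sketch. This is purely illustrative (the chunking scheme, function names, and sample text are my own invention, not anything McAdams or any search engine actually does): an article split into hyperlinked chunks can fail a multi-term search that the whole article would pass, unless the full text is indexed as one unit.

```python
# Illustrative sketch: serve an article as small hyperlinked chunks,
# but match searches against the *whole* text, so a query whose terms
# land in different chunks still finds the article.

def split_into_chunks(text, chunk_size=2):
    """Break an article into chunks of `chunk_size` paragraphs each."""
    paragraphs = [p for p in text.split("\n\n") if p.strip()]
    return ["\n\n".join(paragraphs[i:i + chunk_size])
            for i in range(0, len(paragraphs), chunk_size)]

def matches(query_terms, text):
    """Naive full-text match: every term must appear somewhere in the text."""
    lower = text.lower()
    return all(term.lower() in lower for term in query_terms)

# Invented three-paragraph "article" for the demonstration.
article = ("Hypertext began early.\n\n"
           "Links changed reading.\n\n"
           "Search changed finding.")
chunks = split_into_chunks(article)  # two chunks of up to 2 paragraphs

# The terms end up in different chunks: matching against the full
# article succeeds, while matching chunk-by-chunk fails.
terms = ["hypertext", "search"]
found_whole = matches(terms, article)                      # True
found_any_chunk = any(matches(terms, c) for c in chunks)   # False
```

The gap between `found_whole` and `found_any_chunk` is exactly the worry above: if only the chunks are indexed, the article disappears from that query.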

Although this was written in 1995, gratuitous linking is still a big problem. I see Wikipedia as a major offender here, and the online New York Times often has the stupidest links. I sometimes wonder whether a human made the decision or whether the links were generated automatically against some set of criteria (one article includes but a single link, for “Meat Loaf”). Is anyone minding the store? Is anyone even thinking about this at the NYT? Will we see a future where computers really are smart enough to determine what should be linked?
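It is easy to see how purely mechanical linking produces such oddities. Here is a hypothetical sketch (the topic list, URLs, and sample sentence are all invented; I have no idea how the Times actually generates its links): if the generator simply wraps whatever text happens to match a list of known topic pages, then an article whose only match is “Meat Loaf” gets exactly one link, however useless.

```python
# Hypothetical sketch of criteria-based automatic linking: scan the
# text against a list of "linkable" topic pages and wrap the first
# occurrence of each match in an HTML link. No editorial judgment
# is involved, which is how a lone oddball link can appear.
import re

# Invented topic-page list for illustration.
TOPIC_PAGES = {
    "Meat Loaf": "/topics/meat-loaf",
    "Education": "/topics/education",
}

def auto_link(text, topics=TOPIC_PAGES):
    """Wrap the first occurrence of each known topic in a link."""
    for name, url in topics.items():
        pattern = re.compile(re.escape(name))
        text = pattern.sub(f'<a href="{url}">{name}</a>', text, count=1)
    return text

sentence = "The singer Meat Loaf appeared at the benefit concert."
linked = auto_link(sentence)
# Only "Meat Loaf" matched the topic list, so it becomes the
# article's sole (and rather strange) link.
```

A human editor would ask whether the link helps the reader; the matcher only asks whether the string appears.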

Michael Fitzgerald

