Brian tutorial in PowerPoint

September 2, 2007

This gives an introduction to jazz computer discography, explaining the relationships between different entities.

April 15, 2007

Sigh – once again I find myself in the minority of the world. But I guess I should be used to it by now.

Google is good enough for 80% of the people 90% of the time; iPods and mp3 are the future of audio; keyword searching is the way to search the library catalog; Wikipedia is the go-to source for information. Whatever – just keep them away from me. I will, however, thank Google for allowing me to customize my search engine so I can never see hits from Wikipedia (and some others too). Boy, that’s refreshing.

I read the New Yorker article when it came out. At about the same time, The Onion had a piece that was funny despite being so basically accurate (issue 42, no. 30).

I think the public is getting dumber. I don’t claim Wikipedia is a cause of this; it’s just a symptom, and only one of many. Joey reads a book for school in 8th grade and, for some reason, feels that he should share his newly-acquired knowledge with the world. Joey doesn’t have the wisdom to know that he’s basically ignorant. He just *has* to express himself. In the bad old days, Joey had no outlet. Now he does. Is this an improvement? And why on earth would an expert have any interest in contributing or correcting when Joey can “improve” things the next day? I know I don’t have time to get into edit wars with folks. Even a well-written article is fair game for pinhead self-proclaimed editors who feel they need to tweak a sentence or add a paragraph. Two of the most important jobs of an editor overseeing a large project are deciding which topics get articles (and which don’t) and setting article length. Wikipedia lacks both and can’t possibly ever have them. Curt Sachs once said, “A scholarly work, as a work of art, needs integration. And integration is possible only where there is one man, one creative mind.” Something to ponder in the age of everybody-and-nobody-in-charge.

In 1993, I saw how the quality of user-supplied information plummeted when AOL let the unwashed masses onto Usenet (and God, please don’t get me started on WebTV). Honestly, it never recovered. Wikipedia never even had a period of higher quality – it started off as a lowest-common-denominator free-for-all, and the current talk of improving it (limiting who can edit, deciding whether people may hide behind fake identities, etc.) is simply trying to close the barn door after the horse has gone.

I can easily see the theoretical appeal of Wikipedia – a great way to disseminate and share information that isn’t limited by traditional gatekeepers like publishing companies; an instantly updateable, always-available resource that hooks into the other positive aspects of the Internet. The problem is that this assumes that people are by nature good and intelligent, and that the best information will rise to the top and drown out the bad. This is *far* too optimistic. If TV has taught me anything, it’s that the public as a whole is not interested in education, intellectualism, or research. Yet they seem to think they should be contributing to an encyclopedia.

In my opinion, the future of quality information on the web will not be on free-for-all sites, but on closely-monitored collaborative projects. These can be *edited* by an *editor* – someone controlling scope and style and ensuring consistency, so that readers of a large project can feel confident that the next article will match the previous one. Wiki technology might very well play a part in creating this, but the “anyone is an editor” concept will not. Just because anyone *can* do something doesn’t mean anyone *should*. A man’s got to know his limitations. An Italian professor friend of mine put it this way: “As long as I say ‘In ancient Greece…’ and am answered ‘My colleagues at Kalamazoo College…’ we are just not on the same page. Mine is several dozen times larger. And I won’t crop it.” Writing (or having the gall to edit) an encyclopedia article requires expertise, perspective (both geographic and historical), and writing skills.

But it seems that quality information is not what the majority desires. They don’t want the right answer, they just want an answer – double quick – and Wikipedia, Google, et al. oblige. Lord, give me refuge from the “wisdom” of crowds.

Michael Fitzgerald

April 9, 2007

Shari, it’s good that you don’t listen to commercial radio, because the whole payola thing is still very much with us. There were investigations and settlements in 2005 and 2006, and further cases remain under scrutiny. Radio ownership is also highly concentrated, with a few huge conglomerates owning the vast majority of stations, which makes the present day even more susceptible to corporate influence on what gets played. Live music is another area with big problems of this nature. Anyone interested might like to check out:

http://www.salon.com/ent/clear_channel/index.html

I found the Belkin article interesting in light of today’s ASIST presentation on Endeca. It would be worth investigating how the ease of presenting explicit facets (both subject-related and otherwise) affects users’ views. Of course, it’s clear that highly structured information systems (like library catalogs) are going to respond differently than unstructured ones (like the web).
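For anyone who missed the presentation, the mechanics behind explicit facets are easy to sketch. Here is a toy Python illustration (the catalog records and field names are entirely invented): a faceted system tallies the values of structured fields across the current result set and offers each value, with its count, as a one-click refinement.

```python
from collections import Counter

# Invented catalog records standing in for a structured result set.
records = [
    {"title": "Kind of Blue", "format": "CD", "subject": "Jazz", "decade": "1950s"},
    {"title": "A Love Supreme", "format": "LP", "subject": "Jazz", "decade": "1960s"},
    {"title": "The Köln Concert", "format": "LP", "subject": "Jazz", "decade": "1970s"},
    {"title": "Goldberg Variations", "format": "CD", "subject": "Classical", "decade": "1950s"},
]

def facet_counts(results, fields):
    """Tally how many records in the result set fall under each facet value."""
    return {f: Counter(r[f] for r in results if f in r) for f in fields}

# Each facet value would be shown as a clickable refinement with its count.
print(facet_counts(records, ["format", "subject", "decade"]))
```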

It seems to me that any expert (one who has read vast amounts on a subject) would probably know which words are likely to be found in a relevant article (as well as which are not). Maybe I hang out with a better class of clientele, because I’m not so skeptical as to say that “it will be the rare user who will understand these characteristics”.

Michael Fitzgerald

April 2, 2007

I was gratified to read the comments of both Li He and Stephen Sherman on the wide variety of patrons served by university libraries and how they should be considered. I’m frustrated by the idea, espoused by some, that user education is a dead end and should be simply avoided. But at the presentation on controlled vocabularies today, it was good to hear the idea stated that while these are perhaps most useful to experts, novices can benefit because they *learn* something about the domain and how it is structured. I would say that we should continue to educate our users in the different ways to accomplish the tasks with which they are faced. Throwing out what worked quite well last year just because there’s a hot new approach isn’t always productive. I would propose that there are hidden possibilities in the old ways which are often overlooked in the quest for the latest.

Having read and talked much of late about the Net Generation – “screenagers” is another term I recently heard that had a particular ring to it – I find that the discussion is often divisive. The constant focus is on defining the group by its differences, despite anecdotal evidence that many of the supposed traits are not universal among people born 1982-1991. Some people don’t like group work – no matter what generation. Collaborative work is not something that “these kids today” invented. I suspect that a lot of the defining is done from outside the group as well. Does “accommodating” a particular group of patrons simply create a self-fulfilling prophecy? How are our various user groups the *same*? Have there been developments in the old ways of doing things that could benefit everyone and should be considered? I’m not saying that we shouldn’t investigate such accommodations, but rather that there are other issues to consider.

In the wide world, not every employee is from the same age group. How often are Net Gen students asked to collaborate outside their peer group? If not often, does this simply reinforce the differences? If all your friends are slackers, how will you ever rise above them? One of the best courses I have had here at SILS was team-taught by two instructors of different ages and backgrounds. I feel that we need more sharing of methods between groups. Amazingly enough, “them old folks” managed to accomplish quite a bit back in the day, and you might recall that quote about Twain and his father. Are Net Gen students developing an appreciation for this accomplishment and the ways it was achieved, or is it viewed with the same snickering and rolling of eyes that greet Uncle Abner’s tales of when you could buy a good suit for a nickel after watching a newsreel, a cartoon, and a double feature? In 1970, Buckminster Fuller claimed an ancient pyramid contained the following inscription: “Our civilization is going to ruin; the young spend all day in the pub and have no respect for their elders.” Isn’t generational rebellion inherent, and, if so, how can we find ways to learn from each other? Or should we simply let the rebellion run its course until the wiser Net Gen kids realize there’s more to the world than they had assumed?

Michael Fitzgerald

March 26, 2007

McAdams makes the point that “time-based media require a lot of redundancy” and that hypertext links can avoid this. However, what I have seen is the opposite. For example, The New York Times places the SAME article in two, three, or more online “sections” (whereas it would appear in only a single print section). I understand how this can be useful (much like assigning multiple call numbers to a book versus the old “mark it and park it” approach), but I do find it annoying to finish a featured article in the Education section only to meet it again as a featured article in Technology or wherever. Perhaps when the Times actually gets the “My Times” feature going (see http://www.nytimes.com/mem/betamail.html ) this will improve.

Regarding the idea of breaking larger documents down into hyperlinked smaller ones – I worry that this will drop them from search results, because two search terms might end up on separate pages. Is the solution to keep the material together for the search engine and separate for viewing? A rough sketch of that idea follows.
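One hypothetical way to get both, sketched below in Python (the file names, section names, and text are invented placeholders): publish a single full-text page that a crawler can index, plus small per-section pages for human reading that link back to it.

```python
from pathlib import Path

# Invented placeholder sections; a real document would supply these.
sections = [("intro", "placeholder intro text"),
            ("methods", "placeholder methods text"),
            ("results", "placeholder results text")]

# The full document: every section on one page, so a query whose terms are
# scattered across sections still matches a single indexable document.
body = "\n".join(f'<h2 id="{sid}">{sid}</h2>\n<p>{text}</p>'
                 for sid, text in sections)
Path("full.html").write_text(f"<html><body>{body}</body></html>")

# The reading views: one small page per section, each linking onward and
# back to the full text.
for i, (sid, text) in enumerate(sections):
    nav = f'<a href="full.html#{sid}">full text</a>'
    if i + 1 < len(sections):
        nav += f' | <a href="{sections[i + 1][0]}.html">next</a>'
    Path(f"{sid}.html").write_text(
        f"<html><body><h1>{sid}</h1><p>{text}</p><p>{nav}</p></body></html>")
```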

Although this was written in 1995, gratuitous linking is still a big problem – I see Wikipedia as a major offender here and the online New York Times often has the stupidest links. I sometimes wonder if a human made the decision or if they were automatically generated based on some kind of criteria (see http://www.nytimes.com/2007/03/25/theater/25goodwin.html where they include but a single link – for “Meat Loaf”). Is anyone minding the store? Is anyone even thinking about this at NYT? Will we see a future where computers really are smart enough to determine what should be linked?
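I don’t know what the Times actually does, but a purely mechanical linker is easy to imagine. Here is a deliberately naive sketch in Python (the phrase list and URLs are invented): it links the first occurrence of any phrase it knows, with no notion of whether the link is useful – which is exactly how a review ends up with a lone link on “Meat Loaf”.

```python
import re

# Invented mapping of "linkable" phrases to article URLs; a real system
# would draw these from a database of covered topics.
linkable = {
    "Meat Loaf": "https://example.com/topic/meat-loaf",
    "Broadway": "https://example.com/topic/broadway",
}

def autolink(text, phrases):
    """Wrap the first occurrence of each known phrase in a link, blind to context."""
    for phrase, url in phrases.items():
        # Purely string-based matching; nothing asks whether the link helps.
        text = re.sub(re.escape(phrase),
                      f'<a href="{url}">{phrase}</a>', text, count=1)
    return text

print(autolink("The revue features songs made famous by Meat Loaf on Broadway.",
               linkable))
```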

Michael Fitzgerald

March 20, 2007

I viewed the EPIC 2014/2015 films after having attended the LAUNC-CH [Librarians’ Association of UNC-CH] conference last Monday, which had as its topic “Connecting with Millennials” and pointed out the ways in which people born 1982-1991 think differently about technology, information, and life in general. Afterwards, I went home and polished my collection of buggy whips. (see http://www.nieman.harvard.edu/reports/06-4NRwinter/p31-0604-dietrich.html )

A major difference was in privacy, and I think we’ve discussed a bit the level of personal information that is willingly supplied by millennials on myspace, etc. So I wasn’t all that surprised to hear from the “real live” millennials that they couldn’t understand why “old people” had all these concerns about privacy on the Internet. I definitely fall between the myspace folks who feel compelled to publish their every move and thought and the paranoid people who won’t shop online (or the perhaps more savvy people who make the effort to mess with the demographics by providing false information). Is there anything unappealing about the EPIC future that would make the typical user refrain from contributing? Or would the incredible possibility of constant, active contribution eventually lead to apathy, allowing only a few to hold sway (see the 1970 film “The Rise and Rise of Michael Rimmer”)?

The appeal of social networking still escapes me. But then again, the appeal of lots of popular things escapes me too. So I tend to agree with Jennifer – why on earth would I care what 50,000 Frenchmen bought? As far as I’m concerned, that’s a reason NOT to buy that. I feel sorry for the sheep/lemmings who get swept along in the hype of what’s hot. EPIC said that at its best, for a very limited number of people, the future holds a great breadth and depth of information but for most folks it’s trivia: largely untrue and sensational. I do tend to believe this – if American Idol can create something from nothing and people will shell out real money for “nonebrities,” it’s possible. When I see evidence to the contrary, I’ll have some hope for humanity.

Until then, what interests me is the small percentage of people who will not be satisfied with Google, Googlezon, or whatever is next. Like Doug, I think that if the individualists have power, they won’t settle for what’s popular. (Unlike Doug, I do read books, lots of them – but definitely not all bought at amazon and, as far as I can recall, never recommended by any computer.) As for the EPIC 2015 idea of everyone in the world posting real-time information tied to geographic location, etc. – I guess that could be useful for avoiding the crowds…

How does the Googlezon model sit with the idea of the collective as it has been imagined elsewhere? Who is John Galt? Or Howard Roark, for that matter?

Michael Fitzgerald

P.S. – the best thing I saw at the LAUNC-CH conference was a presentation that showed that millennials can do great things when presented with interesting material (from special collections) and lots of individualized instruction. So I guess they aren’t all that different from any other generation in the history of mankind. BTW, my apologies to all those reading this who I’ve been lumping into the pronoun “they”.

P.P.S. – Thanks to Grant Dickie for his work in making that conference run smoothly.

March 5, 2007

Regarding the accuracy of ready-reference answers –

I still have plenty of evidence in my own field of factual, ready-reference information that is incorrect, even on sites that are widely used as authoritative (even up to the level of OCLC and LOC). Are we dealing with a “tolerance” for errors – with 972 billion answers, is it OK if 0.01% of those are wrong? That’s a lot of wrong answers (but of course, way more right answers).
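For what it’s worth, the back-of-the-envelope arithmetic, taking those figures at face value:

```python
# 0.01% of 972 billion answers, expressed as a fraction.
answers = 972e9
error_rate = 0.0001
print(f"{answers * error_rate:,.0f} wrong answers")  # -> 97,200,000
```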

I think the propagation of (mis)information is a big concern. I’ve watched certain sites that license content slowly come to dominate. It worries me because I feel that those purchasing the licenses have even less expertise than those producing the content – and I’m not convinced the producers are all that fastidious. So what can be done about the independence of “factual” answers?

And I should say that this is not an Internet-only problem in my field. Books and CD-ROMs can be, and are, problematic in simple ready-reference situations (in specialized areas). It very much depends on the level of scrutiny out there.

This gives me plenty of ideas on how to document my own work on the web so as to be viewed more favorably *by those who consider similar criteria*, but it is also a reality check: for many people such things simply don’t matter.

I was also reminded of the early days of the web, when things like awards of quality were floating around. Having received a few of those myself, I tend to believe that they existed so that the awarders got links back from the awardees (thereby raising the awarders’ PageRank). Hmmmmm!
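That suspicion squares with how PageRank works: every inbound link counts as a vote, so handing out “awards” that come with a link back is a cheap way to harvest votes. A toy power-iteration sketch (the four-page web is invented) shows the awarder floating to the top:

```python
# Toy PageRank by power iteration over a tiny invented web.
links = {                    # page -> pages it links to
    "awarder": ["winner1"],
    "winner1": ["awarder"],  # the award badge links back to the awarder
    "winner2": ["awarder"],
    "winner3": ["awarder"],
}
pages = list(links)
damping = 0.85
rank = {p: 1 / len(pages) for p in pages}

for _ in range(50):  # iterate until the ranks settle
    new = {p: (1 - damping) / len(pages) for p in pages}
    for p, outs in links.items():
        for q in outs:
            new[q] += damping * rank[p] / len(outs)
    rank = new

# The awarder ends up with the highest rank, fed by its awardees' links.
print(sorted(rank.items(), key=lambda kv: -kv[1]))
```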

Michael Fitzgerald