The crux of the problem, as I see it, is that all of the "votes" (aka links) for page popularity are anonymous. This is why shady companies can make a business out of manipulating links. It's part of the broader internet trend of having ratings decided by sampling the entire internet population: movie reviews on IMDb, book reviews on Amazon, and so on.
Thanks for the well-thought-out reply, and the encouragement on the research. You'd be surprised how many computer science professors think that "trust" is unworthy of study. It was a good mental exercise for me to reply to the points you made. (I'm sure some of this text will end up in my thesis!) Don't feel that you have to read or respond to it all--I'm not trying to drag you into a big discussion that you probably don't have time for!
Your major points:
- "Boneheaded Decision Makers"
Here I'm relying on "the Wikipedia effect." A study found that graffiti in Wikipedia remains there an average of only 5 minutes before it is corrected. Similarly, within my proposed system I'm hoping that boneheads will be quickly detected and marked as untrustworthy. So if the bonehead is (for example) 3 hops away, then I just need anyone within 2 hops of me to notice. And in a social network, there is additional social pressure that Wikipedia doesn't have: no one wants to be the guy who trusted the bonehead and messed things up for everyone downstream! (E.g., imagine getting an email: "Hey Dave, why is your friend Mary saying that Claria.com software is good stuff?!")
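To make the hop argument concrete, here's a minimal sketch of the idea (the function names and the toy graph are my own illustration, not part of any real implementation): trust is a breadth-first walk out from me, and flagging a node as untrustworthy cuts off that node and everything only reachable through it.

```python
# Hypothetical sketch of hop-limited trust in a social network.
# Names (trusted_within, blocked) are illustrative assumptions.
from collections import deque

def trusted_within(graph, me, max_hops, blocked=frozenset()):
    """Everyone reachable from `me` within max_hops, never
    traversing into anyone marked as untrustworthy (blocked)."""
    seen = {me}
    frontier = deque([(me, 0)])
    while frontier:
        node, hops = frontier.popleft()
        if hops == max_hops:
            continue
        for friend in graph.get(node, []):
            if friend in blocked or friend in seen:
                continue
            seen.add(friend)
            frontier.append((friend, hops + 1))
    seen.discard(me)
    return seen

# Toy network: Dave trusts Mary, who (mistakenly) trusts the bonehead.
graph = {
    "me": ["Dave"],
    "Dave": ["Mary"],
    "Mary": ["bonehead"],
    "bonehead": ["spammer"],
}

# Before anyone notices, the bonehead (3 hops out) taints my view.
assert "bonehead" in trusted_within(graph, "me", 4)

# Once anyone within 2 hops of me (Dave or Mary) flags him, he and
# everyone downstream of him drop out of my trusted set.
assert "bonehead" not in trusted_within(graph, "me", 4, blocked={"bonehead"})
assert "spammer" not in trusted_within(graph, "me", 4, blocked={"bonehead"})
```

The point of the sketch is the asymmetry: the bonehead's damage propagates downstream, but a single flag by any closer neighbor severs the whole branch--which is exactly the social pressure the email example is about.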