[iDC] misogynist search engines?
ndrw
a at ndrw.net
Wed May 9 16:22:55 EDT 2007
I wasn't claiming that the algorithm appeared in a vacuum. I was
attempting to argue (perhaps foolishly) that there is no inherent
sexism in this particular technology. Rather, the sexism is a side
effect of the vast body of language the algorithm indexes.
Here are some fun statistics:
"he invented": "she invented": ratio:
google: 1,060,000 164,000 6.463
yahoo: 1,340,000 241,000 5.560
microsoft: 268,511 30,054 8.934
ask.com: 244,100 22,800 10.706
technorati 240,216 240,216 1
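For the record, each ratio above is just the "he invented" count
divided by the "she invented" count. A quick sketch that recomputes
them (counts hand-copied from the table, not fetched live):

```python
# Result counts hand-copied from the table above (not fetched live).
counts = {
    "google":     (1_060_000, 164_000),
    "yahoo":      (1_340_000, 241_000),
    "microsoft":  (268_511,   30_054),
    "ask.com":    (244_100,   22_800),
    "technorati": (240_216,   240_216),
}

for engine, (he, she) in counts.items():
    # ratio of "he invented" hits to "she invented" hits
    print(f"{engine:11s} {he / she:7.3f}")
```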
It is nice to know that Technorati's search algorithm is so precisely
egalitarian. Kidding aside, either all of these search engines are
sexist in extremely similar ways, or they are indexing similar data.
Why are we singling out Google? Because someone popularized an oddity
and posted it on Digg?
Back to Google: I'm getting different results than you are. Your
results seem to be based on googling the two words "she" and "cooked."
This is how Google treats generic searches, but it also includes pages
where the two words appear in unrelated places. All of the statistics
above were gathered by me, just now, and I made sure to include the
quotes around each phrase, making it a phrase search rather than a
keyword search.
Note that the keyword search she cooked returns 1,570,000 results,
whereas the phrase search "she cooked" returns 281,000.
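The difference is entirely in how the query string is written: quoting
the terms asks the engine for the exact adjacent phrase. A minimal
illustration using Python's urllib (the q= parameter form is the common
search-URL convention, not any particular engine's documented API):

```python
from urllib.parse import urlencode

# Keyword search: the engine may match "she" and "cooked"
# anywhere on a page, in unrelated places.
keyword = urlencode({"q": 'she cooked'})

# Phrase search: the surrounding quotes ask for the exact phrase.
phrase = urlencode({"q": '"she cooked"'})

print(keyword)  # q=she+cooked
print(phrase)   # q=%22she+cooked%22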
I also want to point out that you are making possibly subjective
assumptions about the usage of your keyword terms. I notice that the
first result of "she looked" is: 14 Responses to "Other Things Kiera
Knightley Wishes She Looked Like", but your arguments assume that all
your results use "looked" in the same sense. "She looked over the
ledge" is quite unrelated to "she looked thoughtful," and that
ambiguity will definitely skew readings of hundreds of thousands of
unknown search results.
Technicality aside, these revised statistics still support your
argument:

          cooked       looked    ratio
he       201,000    1,830,000    9.104
she      281,100    2,930,000   10.423
There is still a higher ratio of she-looked to she-cooked than of
he-looked to he-cooked, yet no "did you mean" suggestion appears. This
seems anomalous. My first thought, which I'm not sure how to verify,
is that there is a 250,000-result threshold below which a similar
spelling is suggested. Actually, I would assume that the threshold
would scale with how many alternative spellings are in the index.
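To make the speculation concrete: a sketch of the hypothesized
heuristic, using my phrase-search counts from above. This is purely my
guess at the behavior, not Google's actual "did you mean" logic, but it
is at least consistent with the observed results:

```python
# Purely speculative sketch of the threshold idea -- NOT Google's
# actual algorithm, just the heuristic being hypothesized above.
def suggest(query_hits: int, alternative_hits: int,
            threshold: int = 250_000) -> bool:
    """Suggest the similarly spelled alternative only when the original
    query falls below the threshold and the alternative is more common."""
    return query_hits < threshold and alternative_hits > query_hits

# "he cooked" (201,000 hits) vs "he looked" (1,830,000 hits):
print(suggest(201_000, 1_830_000))   # True  -> suggests "he looked"

# "she cooked" (281,100 hits) vs "she looked" (2,930,000 hits):
print(suggest(281_100, 2_930_000))   # False -> no suggestion
```

Note that under this guess, "she cooked" misses a suggestion only
because its 281,100 hits already exceed the hypothetical 250,000
cutoff, which matches what we both observed.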
I of course would never argue against you that many languages have
built-in, often political, agendas. As to the authoritarianism of the
English language, I have always enjoyed "Tense Present: Democracy,
English, and the Wars over Usage" by David Foster Wallace, even though
he is rather smug. I believe it's available online, although I'm not
sure how readable it is in web form.
Best,
Andrew Macfarlane
On May 9, 2007, at 1:01 PM, Sullivan wrote:
>
> When you search for "she cooked" it doesn't make any alternate
> suggestion, and you get 1.6 million results.
> Search for "he cooked" and you get just over 3 million results, but
> it gives you the alternate "Did you mean: he looked".
> "He looked" gives you 51.6 million hits.
> "She looked" gives you 32.7 million hits. Why didn't it ask me if I
> meant "she looked" when I put "she cooked"?
> There are only 17 times as many results for "he looked" as for "he
> cooked", but there are 20 times as many results for "she looked" as
> for "she cooked". Interesting.
>
> It's certainly questionable that it suggests you meant to look for
> men instead of women inventing things (a stereotypically male
> activity), or that you didn't really mean to look for men cooking
> but did for women (a stereotypically female activity). It does the
> same damn thing with "she created" (58.8 million hits), "she built"
> (56.5 million hits), and "she designed" (52.4 million hits), even
> though all of those result in millions of hits.
>
> Language, used in a largely patriarchal culture and historically
> structured largely by male power, is used in the interest of male
> dominance and gender inequality. As radfeminist46 posted in the
> comment section to this previous post:
> We learn to think of doctors, lawyers, scientists, and professors as
> male. The harmful assumption is that women aren't intelligent,
> hardworking, or rational enough to be in these occupations. We even
> unconsciously (sometimes consciously) encourage boys and young men
> to become scientists, doctors, and professors...to go into science
> and math. We tend to encourage girls and young women to go into
> teaching, nursing, and other service work. Since women's work is
> devalued in this society, nurses, teachers, secretaries, flight
> attendants, etc. are low-paying careers, which offer little room for
> advancement and mostly have low prestige. Remember that the average
> nursing salary didn't go up considerably until men started entering
> the profession.
>
> All of this is to say that when we refer to a doctor as a "woman
> doctor" we're saying that she is a doctor IN SPITE OF her gender.
> What we imply is that she has gone beyond what we expect of her and
> her "abilities" as a woman in becoming a medical professional. On
> the other hand, when we say "male nurse" we are implying that we do
> not expect men to work below their "abilities" and become nurses.
> After all, we all know that's a woman's job...or so the stereotype
> goes.
>
> Phrases like "Male nurse" don't hurt men or reinforce harmful
> stereotypes
> about men. Rather, they reinforce the old stereotype that service
> jobs and
> jobs where you help the "real" professional (i.e. doctor) are for
> women.
> This stereotype has real world consequences. Girls are channeled into
> occupations that pay less and have low prestige...which leads to
> the wage
> gap. Women who do go into professions dominated by men are seen as
> "bitches" who likely "slept their way to the top."
>
> In other words, for men, gender works to their benefit and for women,
> gender works against them.
>
> So, the question again is about why this algorithm is generating
> these particular results, who designed this algorithm, and in what
> social context? Does this mean something about the possibilities for
> seemingly neutral software to encode distinct cultural biases?
>
> This quote from Fatima Lasay:
> "Software cultures are cultures generated by programmers, designers
> and software users. As such, programmers, designers and software
> users interact with the social dimensions of software. Here, the
> social dimensions of software not only reflect but also are an
> extension of the social structure of a cultural group within which
> information is shared. A subservient society misunderstands and
> misuses the social dimensions of software. A subservient information
> society produces a productive yet docile information economy:
> subservience is the collective acquiescence of programmers,
> designers and software users to the corruption of a consumerist
> information society." (from Philippine BBS Culture, No Carrier)