[Kopimism] [sudo-discuss] Friday Filosophy: Software as Speech

Max Klein isalix at gmail.com
Mon Mar 11 00:39:56 PDT 2013


Anecdote: A human walks into a carpet shop wanting to buy carpet, and sees
two apparently identical carpets, one priced at $10/sqft and one at
$50/sqft. Perplexed, they ask the clerk what the difference between the
two carpets is. The clerk responds: if you can't tell the difference, then
to you there isn't a difference.

Analogy: If software ever passes "The Turing Test"[1], then it will appear
indistinguishable from a human to humans - and thus human enough to be
covered under human law.
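
For concreteness: the test itself is just a blind-labeling protocol. Here
is a toy sketch in Python - the judge, human, and machine are stand-in
lambdas I've made up for illustration, not any real test harness:

    import random

    def imitation_game(judge_guess, human_reply, machine_reply, questions):
        # Shuffle which party hides behind which label, so the judge
        # sees only "A" and "B" and their answers.
        replies = [human_reply, machine_reply]
        random.shuffle(replies)
        players = dict(zip(["A", "B"], replies))
        transcripts = {label: [(q, answer(q)) for q in questions]
                       for label, answer in players.items()}
        guess = judge_guess(transcripts)  # label the judge calls "machine"
        return players[guess] is not machine_reply  # True = machine passed

    # A judge who can't tell the difference does no better than a coin flip:
    passed = imitation_game(
        judge_guess=lambda t: random.choice(list(t)),
        human_reply=lambda q: "hmm, let me think about that...",
        machine_reply=lambda q: "hmm, let me think about that...",
        questions=["Is there a difference between these two carpets?"],
    )

Note that nothing inside the protocol inspects the machine; the only
evidence is the conversation - exactly like the carpets.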

Extra reading: the history of ELIZA[2], which I'll admit to having used as
a psychological therapist.
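
For flavor, here's a minimal Python sketch of the trick ELIZA ran on -
keyword matching plus pronoun reflection. The rules below are made up for
illustration; they're not Weizenbaum's actual DOCTOR script:

    import re

    # A few illustrative rules: match a keyword pattern, then echo the
    # rest of the sentence back inside a canned question.
    RULES = [
        (r"i need (.*)", "Why do you need {0}?"),
        (r"i am (.*)", "How long have you been {0}?"),
        (r"my (.*)", "Tell me more about your {0}."),
        (r"(.*)", "Please go on."),  # fallback keeps the conversation going
    ]

    # Pronoun reflection, so "my code" comes back as "your code".
    REFLECT = {"i": "you", "my": "your", "me": "you", "am": "are"}

    def reflect(text):
        return " ".join(REFLECT.get(w, w) for w in text.lower().split())

    def respond(utterance):
        for pattern, template in RULES:
            m = re.match(pattern, utterance.lower())
            if m:
                return template.format(*(reflect(g) for g in m.groups()))

    print(respond("I am worried about my future"))
    # -> How long have you been worried about your future?

The striking part is how little machinery that is - and how readily people
(myself included) treated the output as a conversation anyway.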

[1] http://en.wikipedia.org/wiki/Turing_test
[2] http://en.wikipedia.org/wiki/ELIZA


On 2 March 2013 12:25, Eddan <eddan at clear.net> wrote:

> Do Robots Have Rights? - I'm planning to submit that as a suggested
> session topic for the next Workshop Weekend.
>
> It seems to me that whether or not an autonomous system is a sentient
> being is a primary hurdle we have to clear before we can even answer,
> in a way that makes sense to us, the question of where responsibility
> should fall. I can't imagine computational entities will ever have
> intent in the sense contemporary society requires for us to call the
> damage they cause a crime - not only as a matter of the capacities of
> technical engineering, but even by definition of what we mean by (1)
> act and (2) intent, and what knowledge is in the context of both.
>
> As far as I can understand such a question in terms of motive, I think
> responsibility should rest on the anticipated capabilities of the
> technology its programmer(s)/designer(s) created. As for software
> malfunction liability, we have become convinced that that kind of
> analysis is too remote and unfairly misguided. I most definitely agree
> that it's hard to say what an engineer should have known, especially if
> the act was committed by some later iteration of the program in the
> autonomous system in the example. But I think we can get closer to
> confidence about reckless design, and even grossly negligent design -
> not to mention unconscionable design, which would make the best case
> for assigning liability to the designer.
>
>
>
> On Sat, Mar 2, 2013 at 11:58 AM, Steve Berl <steveberl at gmail.com> wrote:
>
>> Seems to me that the autonomous system is guilty of aiding and abetting
>> a crime, or conspiracy, or something like that. Either it's a sentient
>> being and must follow the law or risk punishment of some sort, or it
>> isn't, and Bob has to be responsible.
>>
>> -steve
>>
>>
>> On Fri, Mar 1, 2013 at 6:54 PM, Anon195714 <anon195714 at sbcglobal.net> wrote:
>>
>>>
>>>
>>> Yo's-
>>>
>>> Since I couldn't make it in person...
>>>
>>> Hypothetical:
>>>
>>> Assume the existence of intelligent computers that can make autonomous
>>> decisions, which many folks believe will become a reality in the near
>>> future.
>>>
>>> Alice Analyst publishes virus source code in an online computer
>>> security publication. So far that's clearly protected speech; nobody
>>> here would argue otherwise.
>>>
>>> Bob Badguy reads the article and types the code manually into a
>>> computer, with the overt or covert intent for the computer to broadcast the
>>> virus and infect other computers.
>>>
>>> Does it matter whether the computer into which Bob enters the virus
>>> source code is an ordinary computer that does what it's told, vs. an
>>> intelligent computer that has the capacity to make autonomous
>>> decisions?
>>>
>>> Clearly if the computer is an ordinary one that is not capable of
>>> autonomous decisions, then Bob's typing of the virus code into it would
>>> constitute an "action" rather than "speech," and would not be protected.
>>> He could be successfully prosecuted for unleashing the virus upon the
>>> world.
>>>
>>> But if the computer is an intelligent one that can make autonomous
>>> decisions, then could Bob rightfully claim that his typing of the virus
>>> code into that intelligent computer was _also_ protected speech, merely an
>>> exercise in communication with another sentient being, the same as Alice's
>>> original publication?
>>>
>>> -G.
>>>
>>>
>>> =====
>>>
>>>
>>>
>>> On 13-03-01-Fri 8:22 AM, Eddan Katz wrote:
>>>
>>> Dear Kopimists and the People who Love Them.
>>>
>>> For the featured Filo delicacy for Friday Filosophy, we will have
>>> potato burekas.
>>>
>>> I propose we talk about the difference between source code, object
>>> code, and executable code in regards to 1st Amendment protection. In other
>>> words, when is code speech and when is it a speech-act subject to less
>>> legal protection?
>>>
>>> Below is an excerpt from an essay by Lee Tien, a brilliant EFF
>>> attorney for more than a decade, on Software as Speech (2000). These two
>>> paragraphs are in the section: Viruses and other "dangerous" software.
>>>
>>> Of course, as always, we can talk about whatever else. Such as
>>> conscience and the unconscionable, perhaps.
>>>
>>> Lee Tien, Publishing Software as a Speech Act, Vol. 15 Berkeley Tech.
>>> Law Journal (2000)
>>> http://www.law.berkeley.edu/journals/btlj/articles/vol15/tien/tien.html
>>>
>>> Let’s return to the virus hypothetical.[192] The
>>> main concern lies in the fact that the software may be “diverted” toward
>>> unlawful purposes, regardless of the speaker’s intent. This concern is,
>>> however, not unique to software. It also applies to other types of
>>> information usable for mischief or harassment, whether highly technical
>>> like information about nuclear weapons, or utterly mundane like a person’s
>>> name, address or telephone number.
>>>
>>> Even if the virus author merely posts the source code and fails to
>>> release it in active form, the issue remains whether the posting was done
>>> with an intent to communicate. If the author claims that she intended it to
>>> communicate, we would need to examine the context to decide the
>>> plausibility of that claim. There will often be a plausible claim. There is
>>> no question that people study viruses and other dangerous software in order
>>> to prevent or relieve harm.[193] One
>>> way to control a virus is to publish its source code so that systems
>>> operators can disable or protect against it. Communicating a virus’ source
>>> code as part of such an effort qualifies as a speech act because the
>>> publisher intends to communicate how the virus works in a conventional way.
>>> In fact, one could imagine entire journals or Internet sites devoted to
>>> viruses and other dangerous software.[194] When
>>> such publications aim to alert the world to these dangers, their intent is
>>> clearly communicative.
>>>
>>>
>>> sent from eddan.com
>>>
>>>
>>>
>>>
>>> _______________________________________________
>>> sudo-discuss mailing list
>>> sudo-discuss at lists.sudoroom.org
>>> http://lists.sudoroom.org/listinfo/sudo-discuss
>>>
>>>
>>
>>
>> --
>> -steve
>>
>
>
> _______________________________________________
> Kopimism mailing list
> Kopimism at lists.sudoroom.org
> http://lists.sudoroom.org/listinfo/kopimism
>
>