
assassination?
"What would her motive be?" Fredda asked.
"Maybe she wanted to get Shelabas Quellam into office," Kresh said.
"Maybe she got tired of dealing with an overbearing Governor like Grieg.
Quellam has as much backbone as a bucketful of water. With him as Governor,
she could more or less run the planet herself."
"But Quellam would only succeed if Grieg was impeached and convicted,"
Fredda said. "As it is, Grieg's Designate becomes Governor."
"The story is that Quellam is the Designate," Kresh pointed out.
"But is the story true?" Fredda asked. "Suppose that's not true, and
Tonya Welton's intelligence is good enough to tell her that? Maybe she figured
Grieg was going to be thrown out of office, and didn't want Quellam in there.
Or maybe her intelligence people managed to find out who the Designate is, and
she decided she liked that person so much she wanted her or him to be Governor
right now. Or maybe she found out Grieg was about to choose a Designate she
didn't like as much as the present name, and took steps to put her choice in
office. Or maybe she wanted to precipitate such a shambles that she would have
a viable pretext for pulling her people out of this forsaken vermin hole. If
she wanted to abandon the planet and let everyone and everything on it die,
what difference if the Governor dies a little before everyone else?"
"Do you really think she was behind it?" Devray asked. "You both know her. You
make her sound like she's capable of practically anything. I can believe she's
no shrinking violet, but is she really that ruthless?"
"I think Tonya Welton is capable of doing whatever she believes to be
necessary," Kresh said. "Anything. But no, I don't think she did it. She's had
lots of chances to walk away from Inferno, and she hasn't. And if she wanted
to take over the planet, she wouldn't bother with this sort of hole-and-corner
stuff. She'd just bring in a fleet with all guns blazing. On the other hand,
that fleet could still show up anytime and there wouldn't be a lot we could do
about it."
"You've got a real positive attitude about all this, don't you?" Fredda asked.
"All right, so there's the diversionary fight. Meanwhile Bissal is waiting to
get in--"
"Excuse me, Dr. Leving, but I must interject," Donald said. "There was
another set of participants in the staged altercation. Aside from Tierlaw
Verick, they are, in fact, the only suspects we currently have in custody."
"In custody?" Kresh said. "We have suspects in custody?"
"Yes, sir. Caliban and Prospero. They surrendered to me personally about one
hour ago. I had only just returned from taking them into custody as I
arrived here for the briefing. A condition of their surrender was that I was
forced to agree that I would not reveal it to you until such time as I could
do so in front of Commander Devray and one other witness, though I do not know
the reason for that condition."
"Caliban and Prospero?" Fredda asked. "Why didn't you say something at the
start of the briefing session?"
"Sheriff Kresh ordered me to report on Ottley Bissal," Donald said.
But that weak excuse didn't fool Fredda. A robot as sophisticated as
Donald did not have to be that literal-minded in interpreting such an order.
Donald had a flair for the dramatic. Not surprising, considering that his job
was the solving of mysteries. Judging--quite rightly--that it would do no harm
to discuss other issues first, he had waited until the proper dramatic moment
to unleash his bombshell.
Or, to give a less anthropomorphic explanation, Donald understood human
psychology and knew that humans would give greater attention--and greater
credence--to his suspicions regarding the two robots if he waited until the
proper moment.
Fredda herself wasn't sure which explanation was right. Maybe Donald himself
didn't know. Humans didn't always know why they did things. Why should robots?
"Where are Caliban and Prospero?" Fredda asked.
"Under heavy guard in a storeroom similar to the one Bissal used as a hiding
place," Donald replied. "But with your permission, I would like to point out
several facts that strengthen the case against them."
"Very well," Kresh said.
"First, they were involved in the staged fight. If that in and of itself is
enough to cast suspicion on Tonya Welton, then it is enough to cast suspicion
on Caliban and Prospero."
"He's got a point," Kresh said. "No one seemed to think anything of their
actions at the time, but why were they obeying the Three Laws? Maybe just to
look good. Maybe not."
"You anticipate my next point, sir. The ambiguities of the New Laws might well
permit Prospero to be a willing participant in a murder."
"Donald!" Fredda said.
He turned and looked at her with a steady gaze. "I regret saying so, Dr.
Leving, particularly to you, the author of those Laws, but it is nonetheless
true. The New First Law says a robot must not harm a human--but says nothing
about preventing harm. A robot with foreknowledge of a murder is under no
compulsion to give anyone warning. A robot who witnesses a murder is not
compelled to prevent it.
"The New Second Law says a robot must 'cooperate' with humans, not obey them.
Which humans? Suppose there are two groups of humans, one intent on evil, the
other on good? How does a New Law robot choose?
"The New Third Law is the same as the old Third--but relative to the weakened
First and Second Laws, it is proportionately stronger. A so-called
New Law robot will all but inevitably value its own existence far more than
any true robot--to the detriment of the humans around it, who should be under
its protection.
"As for the New Fourth Law, which says a robot 'may do whatever it likes,' the
level of contradiction inherent in that statement is remarkable.
What does it mean? I grant that the verbal expression of robotic laws is far
less exact than the underlying forms of them as structured in a robot's brain,
but even the mathematical coding of the Fourth Law is uncertain."
"I meant it to be vague," Fredda said. "That is, I meant there to be a high
level of uncertainty. I grant there is a basic contradiction in a compulsory
instruction to act with free will, but I was forced to deal within the
framework of the compulsory, hierarchical nature of the first three of the
New Laws."
"But even so," Donald said. "The Fourth New Law sets up something utterly new
in robotics--an intralaw conflict. The original Three Laws often conflict with
each other, but that is one of their strengths. Robots are forced to balance
the conflicting demands; for example, a human gives an order for some vitally
important task that involves a very slight risk of minor harm to the human. A
robot that is forced to deal with such conflicts and then resolve them will
act in a more balanced and controlled fashion. More importantly, perhaps, it
can be immobilized by the conflict, thus preventing it from acting in
situations where any action at all would be dangerous.
"But the Fourth New Law conflicts with itself; and I can see no possible
benefit in that. It gives semi-compulsory permission for a robot to follow its
own desires--although a robot has no desires. We robots have no appetites, no