The Net As New Jerusalem, Part Two
(Second in a series)
"I have worked hard for nearly two years, for the sole purpose of infusing life into the human body," wrote Victor Frankenstein in Mary Shelley's great novel. "For this I had deprived myself of rest and health. I had desired it with an ardour that far exceeded moderation; but now that I had finished, the beauty of the dream vanished, and breathless horror and disgust filled my heart."
In our time, Victor Frankenstein would be in Silicon Valley, taking one meeting after another with venture capitalists, angling for a profile in Wired, and wrangling tens of millions for his new company, lifeinthebody.com (based in Cambridge, Mass.), which would begin commercial licensing of the discoveries of the Human Genome Project.
In contemporary America, Victor wouldn't have to hole up in a remote tower far from human observation. He could partner up with somebody out in the open and promise to create perfectly engineered babies, cure cancer, and stop aging. The venture capitalists would be drooling all over him.
Ethical morasses lie at the heart of modern-day technology, which is increasingly run by highly educated, wealthy elites who have little awareness that not everybody is as technologically inclined, equipped or advanced as they are. In this Jerusalem, half the country is still outside the gates.
But perhaps these ethical quandaries could form the foundation for a new kind of ethical and rational politics, one that addresses the social divisions between the techno haves and have-nots, the future of gene mapping, intellectual property questions, the use of nanotechnologies, the creation of ubiquitous and expensive technologies that are poorly designed or environmentally damaging, and the intrusion of government. As in Victor Frankenstein's time, we hear little public or civic discussion about these choices. They haven't surfaced during the presidential campaign and debates. They get crowded off front pages and TV newscasts by hype-laden stories about dotcom greed, crackers and sexual predators online.
We need an ethical framework for technology, and while I'm not a technologist, I'm happy to start the discussion by suggesting some opening questions to ask about new technological developments.
Do we need it?
Can we support it? Can the people who buy or use what we make get free, readily-available help?
Are new technologies open to peer review and scrutiny, that is, are the software, hardware, systems and design of new technologies available for public and other inspection in order to root out potential mistakes, problems and flaws?
Will everyone have equal access to new technologies, or will they become the property of corporate and social elites with specialized knowledge and lots of disposable income?
Do new technologies have unintended consequences? Have academic, business or civic analysts examined them? Have their ramifications been explained to the people affected (as in telling Victor Frankenstein's neighbors that a monster would soon be running around the community)?
Can technologies be created with consideration both for the environment and for consumers' convenience? Can batteries, parts, cartridges, support and service be standardized, so that consumers don't have to continually scramble? Can software and computer makers agree on ethical standards for their products' lifespans, so that people who invest in expensive technologies can be assured that they will last a few years, and that products and software for them will be available in the future?
Can the sale and licensing of gene research to private bio-tech corporations be halted until critical social issues can be discussed and resolved? The public has yet to grasp the consequences of such research falling into the hands of a few corporations, lulled as they are by scientific and political promises of cures for cancer, aging and heart disease.
Is downloading music or a novel theft? Do the ethics of copyright and intellectual property need reconsideration? Or elimination? Is there a more rational alternative to the Sonny Bono and Digital Millennium Copyright Acts?
How can we ensure that technology and software companies and Web sites prominently disclose privacy provisions and implications? It ought to be illegal to distribute people's personal information without their knowledge and permission.
Perhaps we should require a technological impact statement before new technologies are licensed, deployed or sold. Like the environmental impact statements designed to make people aware that their surroundings could be affected by construction or research projects, a TIS would ensure that before projects like the gene map are sold and distributed, ordinary people are aware of the technology and its possible impact on their lives and those of their children.
As long as it upholds morality I'm for it (Score:2)
Anything which we, as a decent Christian nation, can implement to ensure that the standards of morality are upheld in our children will be 100% wholeheartedly supported by me and many other concerned parents. We need to take a step back from the current crop of violent, exploitative technological "advances" and remember that our worth as human beings comes not from the number of gadgets we accrue, but from how closely we follow the teachings of our Lord Jesus Christ.
Realization of the reality of the internet. (Score:2)
I always thought that the thing the internet really needs to realize its potential is a declaration by the world's leaders (especially the US) that it is a kind of separate pseudo-reality that does not recognize political borders. The internet could be treated like international waters.
Forgive the redundancy of the title. (Score:1)
The courts will catch up. (Score:3)
No, I don't think I should. Silicon Valley isn't the new anything. A lot of people have tech jobs there. A lot of people have other sorts of jobs. Yes, technology has got to the point where we can pirate. Just because we can doesn't mean the courts are forced not to prosecute us; we are still responsible for our actions. Yes, gene research and biotechnology will have ramifications. Six months after an outcry, there'll be legislation (or at least there'll be lawyers).
While things change, it all stays the same, really. Yes, technology has an impact on things, but yes, the courts will catch up. We're not above the law. There's nothing stopping me going to kill someone, much like there's nothing stopping me pirating software or invading somebody's privacy. The difference is that the crime of murder has been around a lot longer.
It's too easy to think the world will suddenly melt down if you read too much stuff from the net (like this article). Whether we need it or want it, we'll get legislation. Nothing slides too much without being nailed down. We pay lawyers too much.
thenerd.
Technology Impact Statements? (Score:2)
OK, maybe someone shouted "Fire!" a long time ago, but we are part-way through an information revolution.
Besides, who really knows the true status quo at any given time?
Re:My summary of this article: (Score:1)
Re:Realization of the reality of the internet. (Score:2)
To have the Internet treated like international waters means that it would exist outside all countries' laws and therefore be immune to certain scare tactics from lawyers and such. The problem lies in the fact that the servers that house web-based content are not physically located more than 12 miles off the shore of every country. They are within the bounds, and therefore the laws, of a country. I cannot see the U.S., or any other country for that matter, ceding most legal rights to any dink with a server. Furthermore, I do not want to see extra-legal powers for AOL, as giving them the same status as international waters would do. (Instead of dealing with U.S. law, they would get to go to the U.N., that bastion of common sense and understanding.)
Just my 2 shekels.
Kierthos
...But not Jon Katz as William Blake (Score:3)
Answer: no, there is no problem actually defined here, though the tenor of the questions implies that the reader is supposed to believe there is one--without being told exactly what it is. The imagery and emotional hot-buttons pushed by Katz's choice of phrases have a vague neo-Luddite, Naderite ring to them, which floats away in the swamp of unanswered questions.
The answer to the questions is: Yes, and Maybe.
At the bottom we have a nice "perhaps":
"Perhaps we should require that before new technologies are licensed, deployed or sold, we need a technological impact statement. Like the environmental statements designed to make people aware that their surroundings could be affected by construction or research projects, a TIS would mean that before projects like the gene map are sold and distributed, ordinary people are aware of the technology and its possible impact on their lives and those of their children."
Translated:
"Perhaps we should have more government paperwork required before any innovative business is allowed to do anything, so that more bureaucrats can make a living, getting a power trip from saying "no", holding their hands out for bribes--er, campaign contributions--, and so that established business have yet another legal roadblock they can use to squash competition, and so that any fringe group that doesn't approve of your politics can use the process to shut you down regardless of the actual merits of your product or business."
Reality check: the problem with your "perhaps" is the same old one: WHO DECIDES? Who decides whether my product or business is permissible? Do you really want to hand over to a government body or political group or ANYONE AT ALL the power to FORBID you to research or invent something new?
(Dragoness hands Jon Katz a copy of "Atlas Shrugged", and crawls back into her lair.)
Re:Realization of the reality of the internet. (Score:1)
...or, The Modern Prometheus (Score:5)
Humanity's most precious possession is our most terrible curse: memory. We're adept at freeing genies from bottles, but inept at putting them back...or leaving them there even if we do manage to reimprison them.
It's easy to say we shouldn't pursue a technology that's not sustainable, not clean, not fair in the consequences it will bring into the world...but who are "we"? Certainly Jon Katz and I can agree not to dabble in things that will harm our neighbors or make the world less hospitable for our descendants, but will everyone else? When "we" say "The consequences of Technology X are not acceptable," how do we prevent "not-we" from having, and acting upon, a different opinion?
Lots of third world nations find it awfully suspicious that the major industrial powers are trying to limit CO2 emissions just when industrialization is starting to benefit the little guy. Sure, we know things now about the effects of CO2 on the environment that we didn't before, but how much comfort is that knowledge to the starving peasant who could have benefited from manufactured goods, but whose government has been bullied into signing an agreement not to use technologies damaging to the environment? How do "we" weigh a .001% greater chance of skin cancer for everyone in the world against the quality of life of a few million? How do we make amends for decisions of this nature that we've already made, and that we continue to make to this day?
New Clothing On The Same Old Argument (Score:1)
The real money is in solutions to these problems, which are in dreadfully short supply, unlike rants such as this one. Give me a break.
Technology's Benefits (Score:2)
When these technologies are developed and deployed they will meet the same fate as all technological innovation (unless we act to make a difference): Their distribution will be based on profit potential, politics, nationalism, etc. These technologies will not be distributed to the developing world, unless the developing world can suddenly afford to pay for them. And don't think AI will be able to solve any of these problems either. Since the root causes of our human misery (famine, poverty, disease, ignorance) are caused by us, an advanced AI will have as much trouble dealing with them as we do.
We know how to treat water so it does not transmit disease. We produce more than enough food to feed the people in the world. Medicines are available to prevent or treat the diseases that ravage many third-world countries. These problems (disease, hunger, ignorance) all have solutions that have been proven to work; so why do these situations continue to exist? They exist because we allow them to exist, they exist because our system of distributing these technologies has not kept pace with the technologies themselves.
In order for technology's greatest benefits to make an impact on the overall human condition we have to change the way our civilization operates. Unless we change our thinking, all the technology in the world will not save us.
We need to modify the distribution system for this new technology to really benefit humanity as a whole, and not just enrich the few already wealthy and powerful. How can we hope to get the benefits of 21st century ideas and innovation with 18th century distribution models?
No, we cannot slow the world down. (Score:3)
However, you fail to realize that there are indeed a great many people just as happy to be left behind as those who must be on the cutting edge.
Can we protect, standardize, legitimize, or organize everything, or mostly everything?
NO.
Should we?
NO.
Fact is, most of the people in this group who want to insulate, protect, coddle, smother, suffocate, and control are for the most part clotting egomaniacs who don't think anyone can think for themselves. Your own text tells me you are part of this elitist crowd which believes it must think for the poor underprivileged people.
There are cases where help is generally needed, but a lot of the world would cease having a reason to progress, let alone live, if someone was constantly speaking in their head: "don't step on the grass", "that's not your money", "tell the policeman everything", "those thoughts are evil".
To hell with your world, Jon. I prefer to have a chance to fuck up my own life as well as a chance to make it work.
Progress isn't predictable (Score:5)
In the early stages of the automobile, there were hundreds of manufacturers in the US, and lots of unsafe cars. Now there are the Big Three and cars are much safer, but do you think that during the early stages of the industry anyone could possibly have predicted what the automobile would become? In the early stages of any new technology, it's really rather impossible to predict future uses or outcomes.
Dynamite was supposed to render wars a thing of the past, due to its vast destructive power. I'll bet if you polled leading "experts" and concerned citizens at the time of its creation, most would have agreed with that prognosis.
My point is that it would be wonderful if we could truly understand the impact of new technologies before their introduction into society, but there are so many variables (human behavior, economic trends, interaction with other technologies, invaders from Mars...) that it's just not feasible to come to any real conclusions about the impact of a technology, other than the really obvious, immediate effects.
You seem to be saying that we should innovate within the framework of a vast plan, which is contrary to how innovation works best.
Declare it separate? (Score:2)
Biggest Contribution of Free Software Movement (Score:3)
If you think about it, this is the reverse of the problem that Jon started his article with. Lots of r&d is being done not because it's the right thing, but because there may be a great market for it (some researchers, I know, do things for the cool factor w/o thinking of impacts. Ah, well). I think that if we can export the ethics of Free Software to people, this will do the job.
The concept of "Zion" shows up in a few places other than the Matrix
A Storyteller in Zion, Orson Scott Card
Approaching Zion, Hugh Nibley (especially the chapter entitled "Work We Must But The Lunch Is Free")
(And, yes, there are plenty of differences between the Mormon and Free Software communities. But that doesn't mean something can't be learned by looking around).
Re:The courts will catch up. (Score:1)
The internet is no bigger than a combination of the telephone ("poll"/"pull" technology) and television ("push" technology), which admittedly were big. It's what I call an "enabling technology".
It's no more a technological panacea for all modern ills than the telephone was in its day.
One lesson we can learn from TV as a "push" technology is that a very low signal-to-noise-ratio medium evolves, as in the USA now.
However, for the subject of ethics, there are instant parallels that could be drawn, but they seem to be overlooked. OK, "wrongs" can be had/done, but...
A kid can view www.pr0n.jp, yeah, but he can phone 0047980084102398 for "big bosomed bimbos".
People can download a copy of "American Pie II, The Cherry's Revenge" illegally and ftp it to their mates. However, they could already set their dish to pick up the feed _to_ regional TV stations, and therefore get Babylon 5 intended for broadcast the next day not only early, but with all the adverts removed.
I could go on.
However, that would make me as boring as Jon Katz.
Why do I reply to his threads? They just wind me up.
FatPhil
Social Engineering (Score:1)
Would you please explain why this is such a bad thing? How do you justify your arguments for social engineering?
Like it or not, Jon, the very idea of progress means that some ideas are left behind in favor of others. This also means that some people are left behind in favor of others.
No elite group of society can answer the questions you raise in your article. It is folly to think otherwise. If people have the freedom to choose, the end result will be the best.
Most of your suggestions are just another example of liberal/communist social engineering.
What is the West (Score:2)
Frankenstein's inability to back off is thus a very apt metaphor.
Can we change course now? I don't know and the odds seem against it.
Is it worth it? I am not sure. I am sure that sooner or later the West will meet the tragic end that awaits all positive feedback loops. But the opposite of tragedy is not necessarily a happy ending; it may well be just a slow fade-out.
My $.02 (Score:2)
Let's look at some other countries. I have a big dilemma about whether to get a Palm III regular or color, and whether I am going to WAP-enable my website. People in third-world countries have problems like people with AIDS thinking they can cure themselves by having sex with a virgin, lack of food, and plague. Technology is already with the social elites.
Perhaps we should require a technological impact statement before new technologies are licensed, deployed or sold.
I think that this is a GREAT idea, not only in the tech sector but also for cloning, gene selection, abortion, etc. If we had some law or code that established a set of ethics or rules before we unveiled a new technology, that would be a great way to ensure proper use of the technology. Is it feasible? No, I don't think so, because other countries would develop the technology from our discoveries, people are not that patient, and we live in a CAPITALIST society. If it makes money, it survives. This idea would have to be a governmentally employed system, and then the government would be restricting the release of technology -- not a good thing, to say the least!
I think that most of the questions that Katz presents could be solved or answered with open sourcing. Someone comes out with a new hard drive? Release all the tech specs and let the public hack through it. The person could still retain copyright. This would make copyrights much more specific and not allow the huge, vague copyrights that are plaguing us now.
Your real question.... (Score:1)
I think the answer is yes we can.
Even though I am a pessimist first and foremost, I still believe that we can overcome our desire to be richer and more powerful than others, and get down to the basics of life, what our parents taught us: "share with others". My parents taught me this, and I'm trying to teach it to my children.
Not to say that it won't be a long and arduous journey, but we WILL get there. Hopefully before I die.
The wheel, for example? (Score:1)
Re:...or, The Modern Prometheus (Score:1)
Other than a utilitarian philosophy asking us to restrict ourselves to protect our environment, what incentive do individuals have to *not* act in their own self-interest in such matters? None. Legislation at a national level is obviously not a solution, but an oppressive local gov't isn't any better, just less well-armed.
Blatantly Offtopic (Score:1)
If you guys are leaving the country, don't let the door whack you on the butt on the way out.
But you CANNOT predict the uses perfectly. (Score:3)
How about the laser? Katz's Inquisition bans it because it's a death ray. Who at the time of the laser's invention realised it would end up in every home as an integral part of the CD player? No one. You simply can't guess the implications of an invention 100% accurately.
Re:christ (Score:1)
Maybe we could pay him to be a speech writer for Bush. Then we could "accidentally" shoot down their plane sometime and take out two birds with one stone (or maybe that would be two stones with one bird, or better yet two stones with one stone).
There was no sequel to Frankenstein (Score:2)
A gothic horror novel is probably not the fairest analogy to use when examining the evils and ethics of technology. This is a novel designed to horrify and play with emotions -- and Shelley did not necessarily intend to make social commentary on technology.
That said, if someone is going to speculate about what Dr. Frankenstein would do now, one has to wonder what would happen to Dr. Frankenstein after the novel ends. (Disclaimer: it's been 10 years since I last read Frankenstein.)
Would Dr. Frankenstein swear off science for life? Would he take his knowledge about infusing dead flesh with life and use it to cure gangrene? Would he continue his research, and if so, would the neighbouring families leave him alone or stone him to death?
See, the tricky part about science is that it's difficult to throw ethics onto something when you don't know what you can do. Once you know it can be done, then it's easier to see applications of this, which helps to determine the ethics of the situation.
Is this ideal? No, probably not. But is it any more ideal to stop all scientific investigation until we can guess all the possible moral ramifications, and then determine if we should proceed with research, only to determine that we can't do it at all, or that we can only do the things we don't want to do, or that we have to do the things we don't want to do for years before we can research the thing we DO want to do? (Excuse my run-on sentence.)
I put forth the classic example of splitting the atom, which has many possible applications, some of which help people, and some of which kill people.
Incidentally, if Jon Katz postulates the theory that a modern Dr. Frankenstein would hook up with a VC and create any harmful thing he wanted to, then I'd like to mention that a modern VC would also cut funding and support when the neighbours' families sued Dr. Frankenstein for $1.2 billion in punitive damages.
Duuh... (Score:1)
[snip]...a real aversion to intelligent discussion of issues.
Now hold on just one second! Personally, I have found many a good conversation going on here at Slashdot.
Why, just the other day I sat down at my Beowulf [beowulf.org] cluster with a nice toasty pants full of hot grits [satansbreath.com] to discuss relativity with Natalie Portman [natalie-portman.net].
Granted, it can be difficult to hold a conversation with a waif who is not only emaciated and vapid, but petrified, but I tell you, we were solving the mysteries [enterprisemission.com] of the UNIVERSE, I say!
Re:Of course everybody hates Katz! (Score:1)
Whether you agree with Katz or not, his articles are thought-provoking, which is why people hate him so much: actual thought really takes effort and is evidently painful to some people here.
Re:Dragoness on Her Knees to the Rich (Score:1)
From the other angle, a group of elite socialists telling you you have to submit to blood samples so the government may have your genetic profile on record. Oh, by the way, we might need to use some of your genetic material for our national gene bank.
Analysis of Technologies... (Score:3)
Thanks, but we already read Bill Joy in Wired (Score:1)
Until you have something fresh and interesting to say, go fishing or something and get some inspiration. You've spent the last six months flogging the same old issues, just slightly rehashed so as to constitute a different checksum than the article before.
Proofreader please! (Score:2)
Eh?
Re:Declare it separate? (Score:3)
The best we can hope for is that it will become more and more obvious just how artificial those real-world national boundaries are, and that, rather than regrouping along ethnic or ideological splinter lines, all humans will understand their commonalities and work to abolish those sorts of constraints. Once that happens, maybe we can talk about independence or freedom as personal liberties and what exactly that will mean for us all as citizens of Earth -- not just on the internet, but in all of life, which is the only way the internet or information will ever be truly free.
Re:Dragoness on Her Knees to the Rich (Score:1)
Point is, imo, it is more likely that you'll have to submit your blood samples, etc. under a right-wing govt. than otherwise. For our own good. I can't tell you how many nice, normal people think that suspending protection against unreasonable search and seizure is okay in the name of the war on drugs.
Sure, it's been a Democratic administration behind Carnivore, but don't tell me Republicans don't want exactly the same thing.
Re:Realization of the reality of the internet. (Score:1)
I don't know if you know about this, but the smallest nation in the world is not the Vatican; it's a nation called Sealand, which is actually a former British World War II sea fort. And they've decided to become a haven for internet privacy. Here's a quick description of the situation...
http://www.alternet.org/story.html?StoryID=9315 [alternet.org]
"The handful of crypto activists living in the world's smallest country is prepared for a blockade. Driven by a passion for Internet privacy, they've brought enough food, water and fuel for a year and moved to Sealand, a 25-yard-long steel and concrete former World War II fortress six miles from the English coast. In 1967, an eccentric former British major named Roy Bates declared Sealand a sovereign territory, eventually issuing his own stamps, flag and currency. Forty-four years later, Bates, now the crown prince of Sealand, has leased his island to a group of techno-libertarians and their start-up, a data sanctuary called HavenCo which promises cyber security and which may affront many of the world's major nations."
With any new technology... (Score:1)
Katz, please look up "individualist" (Score:1)
Perhaps we should require a technological impact statement before new technologies are licensed, deployed or sold. Like the environmental impact statements designed to make people aware that their surroundings could be affected by construction or research projects, a TIS would ensure that before projects like the gene map are sold and distributed, ordinary people are aware of the technology and its possible impact on their lives and those of their children.
This is not individualist. And it is not pro-technology. I can hardly think of a better way to ensure that technological development is restricted to the well-financed and well-connected.
Shame on you, Katz, you big poser.
Re:There was no sequel to Frankenstein (Score:2)
He would decompose. He dies at the end. ;-)
Argument context changes (Score:1)
But who has the impetus to look for solutions unless the conversation brings up the idea that the solutions need to be looked for?
Answers... (Score:1)
No, we don't need anything outside of Maslow's hierarchy of needs, the basic things required to support relatively normative human life. If technology helps us fill one of those needs, perhaps it can be needed.
> Can we support it? Can the people who buy or use what we make get free, readily-available help?
Who says they're entitled to free, readily-available help? A whole industry has arisen around support and training services. Can you become an ASE-certified mechanic without paying a dime? I don't think so. Does that mean you need to be at a mechanic's level to drive a car? Not at all.
It's up to the educational system to step up to the plate and add new technologies to the curriculum, as has happened for hundreds of years. Older folks may have to pay a bit more to get this new knowledge, but they won't die if they don't.
> Are new technologies open to peer review and scrutiny, that is, are the software, hardware, systems and design of new technologies available for public and other inspection in order to root out potential mistakes, problems and flaws?
That's what QA is for. I believe in Free Software and Open Source as much as anyone. You want to find flaws? Do like they do with cars... slam them into brick walls at 40mph. Or get Underwriters Labs or someone to test.
> Will everyone have equal access to new technologies, or will they become the property of corporate and social elites with specialized knowledge and lots of disposable income?
I don't see this as a problem. Not everyone can afford a car. Or a telephone. Or a house. Or a decent meal. This digital divide stuff is crap. What, the "haves" should give stuff to the "have nots" because it would be more fair, since the "have nots" didn't have the same opportunities?
My parents, and their parents, worked damn hard and sacrificed to provide each following generation a better life than their own. I will do the same for my children. I will not do the same for someone else's offspring. That's their responsibility.
At this point in my life, do I give to charity? No. Will I in the future, when I have a secure financial situation? Sure. Everyone deserves to have their basic needs met. People don't "need" to live in a mansion, drive a Porsche, own a computer, access the internet, or have a genetically engineered pet that's a perfect companion and teacher for their children.
Those that do are lucky; they have advantages. But to claim that having advantages isn't fair is tantamount to claiming life's not fair, that everyone should be equal.
The only way everyone would be equal, physically, mentally, and financially, would be if everyone was dead.
> Do new technologies have unintended consequences?
That's kind of silly. They always do.
> Have academic, business or civic analysts examined them? Have their ramifications been explained to the people affected (as in telling Victor Frankenstein's neighbors that a monster would soon be running around the community)?
The ramifications can't be known until the technologies have been in widespread use for a number of years. Medical technology will have to be tested, of course, which means guinea pigs. Heck, we're still trying to guess what the ramifications of the integrated circuit have been, and will be.
> Can technologies be created with consideration both for the environment and for consumers' convenience?
Yes, if consumers demand it and are willing to pay for it.
> Can batteries, parts, cartridges, support and service be standardized, so that consumers don't have to continually scramble?
Consumers scramble because they want the best. I won't begrudge an incompatibility for an edge in functionality. So the new Boo Bah Battery will only work in a Boo Bah Camcorder; it gets 3x as much life as any other cordless camcorder. If people find value in it, they will buy it.
> Can software and computer makers agree on ethical standards for their products' lifespans, so that people who invest in expensive technologies can be assured that they will last a few years, and that products and software for them will be available in the future?
Heh. No. Probably never. They're in business to make money. That involves giving people what they want, and charging them out the arse for it. Get over it already.
> Can the sale and licensing of gene research to private bio-tech corporations be halted until critical social issues can be discussed and resolved? The public has yet to grasp the consequences of such research falling into the hands of a few corporations, lulled as they are by scientific and political promises of cures for cancer, aging and heart disease.
Sure, but that means gene research will never go anywhere. If you think we can come to a resolution about the moral and ethical implications of gene research, you're living in a very different country than I am.
> Is downloading music or a novel theft? Do the ethics of copyright and intellectual property need reconsideration? Or elimination? Is there a more rational alternative to the Sonny Bono and Digital Millennium Copyright Acts?
I have problems with the idea of intellectual property only insofar as it goes to cover patents on processes or algorithms. I don't have any problem with it covering a creative work, like a novel or a piece of music.
Do I like the RIAA? No. Do I like many of the artists who, unfortunately, find themselves in a position to be employed by them to support their craft? Absolutely.
I think the micropayment model, the MP3.com model and the commission model are going to be the funding methods for most future art.
The problem with the RIAA, and even the MPAA, is that they, in large part, have a monopoly right now.
> How can we ensure that technology and software companies and Web sites prominently disclose privacy provisions and implications? It ought to be illegal to distribute people's personal information without their knowledge and permission.
It's simple. If you don't want your personal information to be distributed and made public, don't give it out! No one's forcing you to give real.com or whoever it is correct information.
> Perhaps we should require a technological impact statement before new technologies are licensed, deployed or sold. Like the environmental impact statements designed to make people aware that their surroundings could be affected by construction or research projects, a TIS would ensure that before projects like the gene map are sold and distributed, ordinary people are aware of the technology and its possible impact on their lives and those of their children.
OK, this would put a halt to all future progress. Do you really think that "ordinary" people will ever be fully aware of technology and its possible impact? People fear what they don't understand, and oftentimes people don't understand new things.
Momentous change is forced upon humanity at regular intervals. Adapt, or become a fossil.
The Key Question (Score:2)
I think this is perhaps the only question that is posed that has any real importance. All of the other questions are just extrapolations of this question to different isolated situations or subjective points of view.
If you view technology as simply human progression in our ability to facilitate our needs and desires, then obviously any progression is going to result in a consequence that hasn't been considered to its full extent. This arises from the fact that as we construct new and better ways of doing things, we also change ourselves!
Humanity is not an isolated entity, and it is not static in time. We constantly interact with our environment and with ourselves to bring about development on every scale, from the individual to the global community. So in the end I believe we will never realize the full extent to which our actions (in the form of technology) play out, and as such I believe that 'technology ethics' should not be a static value; it is something we must constantly bring into question. The biggest problem with the current state of affairs is that people try to apply outdated ethics to technological problems. What most people don't realize is that these ethics are being used in contexts for which they were never intended.
Another key problem is that for our technology to be evaluated properly, we must also have a society that can appreciate the things that truly matter, like the environment and true freedom, instead of short-term economic gains and power struggles. It will be interesting to see if we can manage to evolve as a society quickly enough to keep up with the pace of our technology -- otherwise you won't have to watch movies about the Matrix and whatnot... you'll be living it.
If the Internet is the new Jerusalem... (Score:1)
Yes, the term "New Jerusalem" is pointing out the fact that the Internet unites this motley crew that is the human race into one large melting pot. However, I fear the reactions that will inevitably take place which will produce disastrous results, much like combining a base with an acid, matter with antimatter, and time with antitime.
Troll? (Score:1)
Sealand, internet haven (Score:1)
Here is the first paragraph...
"The handful of crypto activists living in the world's smallest country is prepared for a blockade. Driven by a passion for Internet privacy, they've brought enough food, water and fuel for a year and moved to Sealand, a 25-yard-long steel and concrete former World War II fortress six miles from the English coast. In 1967, an eccentric former British major named Roy Bates declared Sealand a sovereign territory, eventually issuing his own stamps, flag and currency. Forty-four years later, Bates, now the crown prince of Sealand, has leased his island to a group of techno-libertarians and their start-up, a data sanctuary called HavenCo which promises cyber security and which may affront many of the world's major nations."
I thought this was very interesting; a possibility that I never considered before.
National purity (Score:2)
I used to dream of a world without national borders, but as I grow older it's becoming more and more obvious that we've gone nowhere for the last several millennia. My highly educated colleagues routinely speak in derogatory terms of members of other races, religions or sexual orientations and hinder their progression in society. As if I, born in the country, am somehow more worthy of a job or apartment. Assholes. Ironically, I've said this to their faces and still I am considered better than someone who might have immigrated from only a couple of miles outside our national borders. And then they get mad when I won't bash the "foreigner" on the basis of how they talk or how they won't "adapt" (= lose everything that reminds them of their original culture) to our culture.
I just read a book about Aldrich Ames, and the more I think about it, the more I am convinced that it would serve this asshole nation right if I ever managed to betray it in the most damaging way to "her" worst enemy. People would get hurt? Who cares, if they choose to serve such an idiotic cause?
Re:A true story! (Score:1)
We tried it. It didn't work. (Score:3)
What happened? The researchers moved abroad and carried on as normal.
No matter how 'unethical' any kind of research is, there is always going to be some jurisdiction willing to reap the possible financial rewards.
As soon as there is a big financial reward to all the opiate chemistry research going on in South America, or all that germ warfare research going on in Iraq, or the skin-colour-specific toxin research that was rumoured to be going on under the old South African regime, let's see how quickly we throw away our moral stance and cash in.
Am I scaremongering? No. I'm living in a city that was trashed by Nazi V2 rockets in the Second World War. Where did all the rocket researchers go after the war? They set up a little thing called NASA.
See my point?
Do the ethics police weigh up the peaceful results of the space programme (microchips, for example) and decide the deaths in London were worth it?
Re:Your real question.... (Score:1)
Maybe I'm just too much of a pessimist.
In the long, sad, history of bad ideas... (Score:5)
...this one takes the cake...
Perhaps we should require a technological impact statement before new technologies are licensed, deployed or sold. Like the environmental impact statements designed to make people aware that their surroundings could be affected by construction or research projects, a TIS would ensure that before projects like the gene map are sold and distributed, ordinary people are aware of the technology and its possible impact on their lives and those of their children.
The fact is, we can never predict where technology will take us with any degree of accuracy. Could the Wright Bros have said, "Well, in fifty years our little POS contraption will develop into hypersonic spy planes that could observe the Soviet Union, so maybe we should keep this secret so as not to upset the balance of power"?
Could Tim Berners-Lee have said, "Gee, this little Web thing could be used to distribute pornography, so maybe I should keep it secret for a while longer"?
Could Rob Malda have predicted that Slashdot would end up overloading web servers world-wide?
The answer to all is "no." To mandate some kind of TIS would be not only impossible but dangerous. We can't predict the course of technology; we can only adapt to it. Even without technology, life adapts; otherwise it wouldn't still be around. Let me put it succinctly:
The greatest risk is not taking one at all.
This is really not a helpful discussion (Score:1)
The recent origins of the current conflict can be traced back in particular to two events: the failure of the Palestinian authorities to ensure that Jewish holy sites under their care were treated with respect, and Arafat's continued brinksmanship in response to the most generous offers yet made by an Israeli premier, made at considerable risk to his own political position. In his uncompromising rejection of these terms, Arafat forced the end of that round of peace talks and pushed Israel's government to a more hardline stance.
But neither action can in any wise be used to justify the high level of Palestinian civilian casualties or the outright declaration of martial law with regard to Arab-Israeli citizens, which has extended to the house arrest of tens of thousands of civilians because of their ethnic status and the use of air strikes against the political opposition.
Re:Declare it separate? (Score:1)
There are no technical solutions that make people free, 'cause freedom is a political thing, not a technical thing. The sooner people get it out of their heads that some magical whizzy toy, rather than hard work and sacrifice, gets and keeps freedom, the better.
Jon Katz as Tech Planning Bureaucrat (Score:3)
Yuck.
Missing the point (Score:3)
I'm not declaring that I get the point, but I seem to have a different response to this article.
It seems to me that Katz's point isn't so much to declare that we should regulate and control new technologies. Instead, Katz's understated point is: given the wide range of information exchange available on the internet, can we have more civic debate on these issues based on better mutual understanding?
New Jerusalem, indeed.
I like the ideal of the internet as a distributed version of the public square where ideas are exchanged. In many ways, it should be the ideal forum for discussion and consensus building on just such issues as Katz raises here. One might hope that a tech guy could talk to a farmer and each get some clue about how policies and technologies affect each other's lives, just to grab a random thought-example.
But that isn't what happens on the net, and it isn't usually what happens in a public square. People of like mind tend to band together and reinforce their own opinions. We ghetto-ize ourselves online (even without push technology) in similar fashion.
Yes, there will always be the "should we, just because we can?" argument, just as there will always be the Darwinian response. The thing is -- and I think this thought lies behind much of Katz's writing -- how much are we (culturally and materially) sacrificing our long-term survival for short-term gains?
Re:It is not the 21st century (Score:1)
stop with the New Jerusalem bullshit (Score:2)
For the record, the phrase "new Jerusalem" appears exactly twice, both times in Revelation, chapters 3 and 21.
from chapter 3: 12: He who conquers, I will make him a pillar in the temple of my God; never shall he go out of it, and I will write on him the name of my God, and the name of the city of my God, the new Jerusalem which comes down from my God out of heaven, and my own new name.
from 21: 1: Then I saw a new heaven and a new earth; for the first heaven and the first earth had passed away, and the sea was no more.
2: And I saw the holy city, new Jerusalem, coming down out of heaven from God, prepared as a bride adorned for her husband;
3: and I heard a loud voice from the throne saying, "Behold, the dwelling of God is with men. He will dwell with them, and they shall be his people, and God himself will be with them;
4: he will wipe away every tear from their eyes, and death shall be no more, neither shall there be mourning nor crying nor pain any more, for the former things have passed away."
5: And he who sat upon the throne said, "Behold, I make all things new." Also he said, "Write this, for these words are trustworthy and true."
So, Katz, please tell me how this is a halfway respectable analogy. Are you just trying to throw in a smattering of apocrypha to get us all a bit more neurotic and worked up?
The only possible way this analogy works is if you firmly believe not that the genome project is a bad thing, but that God is now among us, leading us along and giving us the power of perfect creation and of bringing back that which is dead (extinct species). If that is your logic in using this analogy, then your above argument is completely asinine, since you're arguing AGAINST your analogy. By implying that the Net might be the "New Jerusalem", you are sanctifying the Net, since in the New Jerusalem, God is among us.
Bill Joy is not a Luddite ;) (Score:2)
He didn't preach any answers, as I doubt there are really any concrete ones. Morality is a system of blurry lines, with faith in the uncertain right answer. He seemed to suggest that the days of searching for the truth, without also thinking of the social, ethical, and other ramifications of such a search, are over. If you extrapolate out the amount of violence towards fellow human beings, the availability of information, and the destructive capability of technology that is closely related to information, things do not look good.
Think about this. People often release a virus on a computer because they can, just to show that the exploit is there. With biotech, people might release a virus that "runs" on the human body, just because it is possible, to show that the exploit exists. The information necessary to accomplish such a feat might be readily accessible from a dorm room, and all it would take is *one* disaffected, dejected individual with the inclination and know-how to do it. How about twenty 100% fatal modifications of the influenza virus, all released at once?
In the question/answer period afterwards, someone asked about using technology to combat technology. He answered that historically, destructive technologies almost always predate and outpower defensive ones, such as the bullet before the bullet-proof vest, or the atomic bomb before "Star Wars".
He didn't have any suggestions for a quick remedy, but he expressed interest in dialogue. He suggested bringing together people from many disciplines, including scientists, business people, teachers, religious leaders, and politicians, to discuss a way to get technology back into some human context. He felt that some technologists, who arrogantly assume that laypeople can't understand the technology and therefore shouldn't have any say in its development or use, might reexamine their position.
Overall, it was very eye-opening. I left with a deep respect for him for bringing these issues out into the open, when other "leaders" in the industry are more concerned with their own bottom line than the consequences of the process in which they participate.
If you subscribe to the almost prevalent theme these days that humanity is doomed, then by all means, continue on your current path. We are all responsible for the current and future states of humanity, and it will take a new level of awareness and consciousness to avert the almost certain disaster that awaits. The excuse that you give, and that I give, will give rise to the excuse that the next Dr. Frankenstein gives to himself when undergoing an experiment that shows just how weak our weakest link is. Love is the only answer, the only currency, and the only hope, so love the next person as you wish you could yourself. Please don't use my imperfection as an excuse to avoid your conscience. And if this language makes you uncomfortable, then I suggest you reexamine your position.
Babylon (Score:1)
Re:National purity (Score:2)
Re:Declare it separate? (Score:2)
And when it comes to figuring out what's on the servers, we have laws. Laws that most of the time say you are to be regarded as not guilty until proven guilty. And with some technical solutions (like Freenet), you cannot prove that a person actually posted some information or that the person actually downloaded it, nor can you delete it from the net (without shutting down _all_ servers). On Freenet, if I put a document there that you don't like, you can put a pistol to my head and there is still no way I can remove it anyway...
There are technical solutions that get freedom. Those are hard work. Freedom is political, but politics can become purely technical if technicians start arguing.
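(A rough sketch for the curious, and only a sketch: this is not Freenet's actual protocol, just a toy Python illustration of content-addressed storage, the idea those networks rely on. The Node class, insert and fetch names are made up for the example. The point is that a document's key is derived from its own bytes, nothing about the inserter is recorded, any node holding a copy can serve it, and "deleting" it would require every node to independently drop the key.)

    # Toy illustration only -- NOT Freenet's real protocol.
    # Content-addressed storage: the key is a hash of the document itself,
    # no inserter identity is stored, and the document stays retrievable
    # as long as any one node still holds the key.
    import hashlib
    from typing import Optional

    class Node:
        def __init__(self):
            self.store = {}  # hex key -> document bytes

        def insert(self, document: bytes) -> str:
            key = hashlib.sha256(document).hexdigest()
            self.store[key] = document  # nothing recorded about who inserted it
            return key

        def fetch(self, key: str) -> Optional[bytes]:
            return self.store.get(key)

    nodes = [Node() for _ in range(3)]
    for n in nodes:
        key = n.insert(b"a document somebody wants gone")

    nodes[0].store.clear()  # "removing" it from one server achieves nothing
    print(any(n.fetch(key) for n in nodes))  # True -- still retrievable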
Re:stop with the New Jerusalem bullshit (Score:2)
You're diverting the discussion from the argument to the semantics, which is akin to attacking an idea based upon its grammar.
And by replying, I'm helping you do it. Egads! The abyss has gazed too long into me! Monster, I have become! Cursed was the day of your birth, yea, and mine as well!
Issues with Group Ethics (Score:1)
Ethics are an individual thing. Choices about whether to pursue technologies or ideas are made by people who understand them, or the ideas behind them, well enough to see some potential new utility.
In contrast, institutionalized ethics as a group activity, in the form of ethics boards and impact committees, seem less useful. Medical ethicists argue over what's ethical here and now and later, but those views are fluid, depending on their institutional affiliation, their government, and the exigencies of the moment (and likely the grant money available). It's simply too much to believe that technologists in other areas would be much different.
It's interesting to note that a product like thalidomide, which basically became taboo in the 50s, has now been rehabilitated for exploration and use in different medical contexts 50 years later. Presumably we have a deeper and more mature understanding of what it does and how it works, but that implies that someone was poking around with it in a different context.
Technologies tend to be pursued whether they are ethically blessed or not. Technologies like cheap virus cookers and nuclear ballistic missiles are pursued and perfected by governments and willing individuals, as well as by any other group with the means and desire to make a controlling social impact. The only way to prevent (dangerous) technologies of this sort from spreading is to keep them secret. But the chances of any given technology being kept hidden seem slim to none given that
Finally, it seems particularly naive to think that all consequences of a technology will be determinable and evaluatable in advance by some well-meaning ethics body. Nature will have its way, and humans are far too creative to restrict themselves from exploring new or related ideas. "Unintended consequences" is a pretty good descriptor for quite a few of the major discoveries and disasters throughout history. Trying to eliminate them seems a fruitless task at best. Better to try to foresee the nastier ones and work aggressively to forestall them or blunt their effects before they turn from consequences into disasters. But that seems a dim hope as well.
Personally, I think it gets back to individuals making smart choices about what's ethical to pursue, what's ethical to share and when, and whether a particular technology (warhead design, nanobots, gene sequencing, AI) is basically beneficial or basically malevolent in the broader set of social contexts within which we work.
This novel was EXPLICITLY. . . (Score:1)
In fact, its genesis was a coffeehouse conversation about just these issues.
The fact that it was a gothic novel is almost incidental. That was merely the medium which Shelley chose to expose the ideas to the widest reading public.
KFG
Ethical questions can never be resolved (Score:2)
If you read arguments against genetic engineering by someone like Jeremy Rifkin, they exactly parallel arguments against vaccination. He thinks we should wait until all issues are resolved. But that implies that moral issues are like scientific facts; something that sufficient study and observation can clarify.
In fact moral issues in the real world are usually only resolved through the death of one faction or the other. We know slavery is bad now because all slave owners are dead and buried and no one holds that viewpoint anymore. But no "proof" of the evilness of slavery has been discovered that was unknown in the 1700's. The abolition of slavery was accomplished simply by pushing ahead with it despite objections and letting ex-slaveowners whine about it until they died off.
Moral issues can only be resolved after very thorough and widespread implementation, not before!
I only have ONE question (Score:2)
The internet is a tool for directly connecting all of humanity. That means it connects all of humanity's ethics and morals.
That means *the internet is only capable of the group ethics and morals of the mass of humanity.*
In other words, it is by its very nature a NULL factor.
KFG
WorLd Is AsS (Score:1)
Hrmmm.
Interesting read tho, I'm busy dreaming on right now...
Re:stop with the New Jerusalem bullshit (Score:1)
Re:Progress isn't predictable (Score:2)
Ah, but the safety of the modern automobile is due to outside activism. People like Ralph Nader monitored the developers of the technology and forced them to include safety features not because the average consumer was clamoring for them, but because they were necessary for public well-being.
We need external monitors for the new technologies as well. It was relatively easy to engineer seat belts, air bags, structural reinforcements, etc. after the cars had been around for 50 years, but developing safeguards for biotech after it has been released isn't guaranteed to be so easy.
Let's monitor and take precautions.
With proper monitoring, we don't have to be Kreskins, but the monitoring process, combined with careful ethical analysis, will make our predictions better. Just because our predictions can't be 100% accurate doesn't mean we shouldn't try.
Re:But you CANNOT predict the uses perfectly. (Score:1)
Re:stop with the New Jerusalem bullshit (Score:1)
(No i'm not!)
(Yes you are!)
(No i'm not!)
(Yes you are!)
(You're stupid!)
(I know you are but what am i?)
(Etc. I just thought I'd save us some time and carry the argument out to its logical conclusion. All in jest, sir.)
You are still attacking the analogy (which really has little substantive use in this article) rather than the actual content of the article.
Have some sympathy (Score:1)
Re:My summary of this article: (Score:1)
Re:In the long, sad, history of bad ideas... (Score:3)
Re:We tried it. It didn't work. (Score:2)
Re:Realization of the reality of the internet. (Score:1)
Barak suggested that Jerusalem and other holy places should be placed in the hands of a third party. Arafat correctly chose not to take his offer without putting the idea to his people. Neither side is ready to deal with that. Think of the way the US cries out about free speech; about the same thing would happen over religion.
Of course, Clinton just had to rush it, just to leave a mark on his way out.
Re:Who decides (Score:1)
"Design For Evil"
Any innocent product which becomes suddenly genocidal in the hands of a tyrant has been designed by a dangerous naif. Every design process is incomplete unless it takes into careful consideration what could be done with the product by a dictatorial megalomaniac in command of a national economy, a secret police, and a large army.
The above quote is from Bruce Sterling's Viridian Design Principles. [thehub.com.au] I am not saying that all designers must be beaten with blunt instruments until they think this way, but the world would be a better place if this kind of thinking was incorporated into the beginning of the design process.
Who decides? Ultimately, you do. You don't live in a bubble. If you're smart enough to build the thing, you're smart enough to anticipate how it'll be abused. If you're ethical enough to worry about Jon Katz indiscriminately handing out legal coercive tools, you're also ethical enough to worry about yourself indiscriminately handing out technological coercive tools.
Not Likely (Score:2)
Should we "allow" technological advances? How do you propose to stop them?
Even if we could, how do we decide which?
How do we predict the effects of unknown technologies? After all, they're "unknown". Duh!
Who are "we" in the preceeding questions, anyway? Any time you say "everybody agrees that
Biotech questions seem to be being decided by legal battles between a tiny group of technophobes and international megacorporations who think that they have billions, if not trillions of dollars at stake.
Look at the development of nuclear weapons. The peacenik types hate them, of course, but they ended the war with Japan (killing maybe 200K in Hiroshima and Nagasaki vs the estimated 1M American and 5M Japanese deaths that would have resulted from an invasion of Japan). After WWII, they prevented the Soviet Union from simply rolling over western Europe the same way they rolled over eastern Europe.
OK now, good or bad?
For another example, look at the Internet. Until the development of the Web, it was simply a geek toy. Once some smart guys in Switzerland (significant!) wrote a little program to make FTP easy to use, the whole thing exploded so fast that our hypothetical review body would have been left totally in the dust.
Frankly, the only really effective checks we have on new technology are our liability laws. Hurt people and you get your arse sued off.
Ask your local Almighty Buck (Score:1)
The downfall of a market economy is that the Almighty Buck has the last word 99% of the time. The other 1% is when people realize that it may be cheaper but not more environmentally friendly, or that they just LIKE doing such-and-such, or that the other thing isn't available in their neck of the woods.
Mr. Buck is an ugly master, and doesn't really care that there are monsters running around outside... they're cheaper than all the alternatives.
"There's a party," she said,
"We'll sing and we'll dance,
It's come as you are."
short comment (Score:1)
Technology (Score:1)
Technology itself cannot be 'ethically' evaluated (anyone study ethometrics?), only the uses of technology. Even if you did wish to block a particular technology, I think such an attempt would be rather ineffective unless the development of that technology required funds which only a world government could supply.
Even the ethical evaluation of technology is more than problematic, particularly since it would require a system of government based on ethics instead of stability for both evaluation and implementation.
The ramifications and consequences of new technologies can only be explained to those capable and willing to understand - this would require a complete revamp of our system of education. Understanding of technologies is currently limited and perforce leads to an 'elite' who understand them. I think it was Benjamin Franklin (?) who supported the idea of good education for all partly because it was necessary for an effectual democracy.
To make effective judgements people would need to be not only technically educated but ethically educated. Currently such education is random, slapdash and factional; the power base of most religions is that they teach the One True System of Ethics. In many countries ethics and religion are confused. Religions will, in general, resist the creation of an accepted system of ethics, partly because it would likely differ from their creed, but mostly because it would erode their power. If the negation of an ethical backdrop to events weakened western religions, what would the acceptance of secular ethics do?
When even elementary millennia-old technologies like birth control are still hotly contested on supposed ethical grounds, how can you assume that we can wait for contemporary issues to 'simmer down' to a consensus opinion?
Marios (was that inflammatory?)
sp - yarmulke nt (Score:1)
Pandora Box (Score:1)
Maybe I have 'little awareness that everybody isn't as technologically-inclined, -equipped or advanced as' myself. Despite that possibility, it appears that the rapid growth of technology (insert cliche) or paradigm shifts coupled with the internet has made this a moot point.
How can one hinder the rate at which technology emerges? If governments intervened whenever any open-source code came out, saying 'This is too advanced,' who here would not be pissed off? The same can be said for any technology. By holding back innovation to allow the masses to catch up, we ultimately squash the desire to innovate. Besides, how can you stop the application of ideas? Look at the dude (sorry, forgot his name) who built the 'Super Gun' for Iraq. He had the idea of launching satellites into space with very large guns. The US refused to fund him, so Hussein gave him funding to build a military version. Some would say he did a bad thing. I say he did what he had to for a passion that needed to be sated. Seriously, I would kill someone if I knew it would get us to Mars that much faster; would you?
Unfortunately some people will always be technologically ignorant, as has been the case throughout history. Rather than hindering technology, society should try to raise technological awareness through technology. (Yeah, yeah, catch-22, but you need a flame to build a fire!)
People who see the future before it happens are driven to it like moths to a flame, perhaps blindly. Nonetheless, the momentum they create is powerful and only slowed by tremendous pressure. Galileo, Copernicus, and other giants suffered for this drive, and still their names ring truer than any Pope or inquisitor. We cannot allow the masses to render that suffering for naught.
Already covered analogy (Score:2)
"Cyberpunk in the 90s" by Bruce Sterling [pugzine.com]
Or, to go musical, Bruce Cockburn:
Re:In the long, sad, history of bad ideas... (Score:2)
Re:In the long, sad, history of bad ideas... (Score:1)
Wow, that's a nice aphorism. Did you come up with that, or do you know who did? I can't find it on ag [aphorismsgalore.com].
Darwin is key... (Score:1)
Bottom line: if you can't keep up with the new world, you become extinct. Not that you shouldn't challenge technologies you don't agree with; go ahead. But if you stumble on the way, you will certainly be trampled by the flocks running after. This is life.
Re:In the long, sad, history of bad ideas... (Score:2)
I have no reason not to run around flapping my wings and clucking. I guess I should do that then.
Re:Realization of the reality of the internet. (Score:3)
Re:If the Internet is the new Jerusalem... (Score:2)
Yes there is (Score:2)
Re:In the long, sad, history of bad ideas... (Score:3)
If it can be abused, at least it's power. (Score:2)
The Net is the most powerful means to free association we've ever had. If it allows abuses, it allows reactions, which allow organization, which allows implementing checks and balances.
It wasn't until I became destitute, last year, that I discovered I was information-rich. A million sites, vast search engines, MySQL, PHP, Wikiwiki, you know the litany. Bottom line: Any group of people can now implement any protocol and any record-keeping system they want.
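To make the record-keeping point concrete, here's a minimal sketch, assuming Python and its built-in sqlite3 module as a stand-in for the MySQL/PHP stack named above; the database file, table and helper names are invented for illustration only, not taken from anyone's actual system.

    # Minimal shared-ledger sketch (illustrative; names are made up):
    # any small group can keep auditable records with nothing but free software.
    import sqlite3

    def open_ledger(path="group_records.db"):
        """Create (or open) a tiny ledger database."""
        conn = sqlite3.connect(path)
        conn.execute(
            """CREATE TABLE IF NOT EXISTS records (
                   id     INTEGER PRIMARY KEY AUTOINCREMENT,
                   member TEXT NOT NULL,
                   note   TEXT NOT NULL,
                   logged TIMESTAMP DEFAULT CURRENT_TIMESTAMP
               )"""
        )
        return conn

    def add_record(conn, member, note):
        """Append an entry; earlier entries are never overwritten."""
        conn.execute("INSERT INTO records (member, note) VALUES (?, ?)",
                     (member, note))
        conn.commit()

    def list_records(conn):
        """Return every entry, oldest first, for anyone in the group to audit."""
        return conn.execute(
            "SELECT member, note, logged FROM records ORDER BY id").fetchall()

    if __name__ == "__main__":
        db = open_ledger()
        add_record(db, "alice", "Filed the zoning objection with the city clerk")
        for row in list_records(db):
            print(row)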
If you want to fight abuses, OK, you have to deal with the constituted power structures. Learn how to prepare a legal case, how to lobby, how to use leisure instead of money, and how to beg for money anyway. But if you're serious, you're in a far better position than any pre-Net campaigners. And note how sensitive companies can be to mere attacks on their reputations.
Hegelian dialectic on fast-forward.
Re:Progress isn't predictable (Score:2)
Uh, Ralph Nader is a private citizen, just like the rest of us. The safety of automobiles became an issue for various reasons (Nader among them) and the populace pushed the legislators to make rules. And, for the most part, this works out fairly well. There have been some hiccups along the way, like mandatory air bags that actually increase the risk of death for smaller folks, but mostly things work fine.
However, if the government had jumped in at the beginning of the automobile revolution with legislation, we would still be riding horses around.
In other words, progress is inherently unsafe. It is impossible to know what new dangers each invention will bring. Tim Berners-Lee had no idea that his invention would create the Internet Predator; he just knew that it would solve his particular problem. Pairing inventors with bureaucrats who have the power to veto ideas because they might be unsafe would stop progress altogether. Not to mention the fact that a good portion of our true progress is made while our scientists are looking for faster ways for us to kill each other. You might not like the atom bomb, for example, but there certainly are some valid uses for atomic power and radioactive isotopes. And I certainly am glad that the United States ended up with this power first instead of some other country. The world would be a different place if the Nazis had developed this sort of weapon first. There is no way to put that particular genie back in the bottle, but there is little evidence that the world isn't better off because of its existence.
Now I agree that there should be "monitors," but I don't think that the government is likely to do a good job. Fortunately each and every one of us has the same power that Nader has. We can each speak up about the abuses that go on around us. There is nothing magical about what Nader did. And there is no guarantee that you could elect or appoint someone to do the same job. In fact, the second you have a designated whistle-blower, the whole process opens itself up to politics and corruption.
Re:Progress isn't predictable (Score:2)
That depends on what you're claiming it has become. If it's that the car would be the primary mode of transport for a large percentage of the population in industrialised countries, then that's exactly what Henry Ford said he wanted it to become.
IMHO the car is a perfect example of technology gone wrong. In cities it is the dominant mode of transport. People drive to and fro by themselves in cars capable of carrying 5 or more, at about 50 km/h in cars capable of doing 180, turning a finite resource into pollution at an alarming rate. The car is more bloated than Netscape 6; it is completely unsuitable for the majority of its tasks.
I did see in a newspaper last week some concept cars developed by Peugeot and (another manufacturer): small, fibreglass, two-seater and electric. Of course you'd be mad to drive one in case some daydreamer in a 1.5-tonne, 5-litre four-wheel drive on their way to the corner store ran over you.
I wish more people *had* thought about what they were doing with the bloody car.
[Disclaimer: I ride a push bike]
The Medium Is The Message (Score:2)
For instance, television is one-to-many information distribution. Benefit: a lot of people didn't have such good access to information before TV came along. Downside: those people have no way of putting information back into the system; it's one-way only. Message: the role of the public is to consume only, on every level.
Now, let's look at the Internet. Benefit: a lot of people now have two-way communication, even to the point that they can get 'slashdotted', their ideas given massive media exposure in various ways. Downside: this communication can be and is being spied on, recorded, censored, controlled and manipulated by powerful entities both in government and private industry. It is becoming routine to be spied on, controlled and manipulated by outside forces with no accountability. This is taken as natural and desirable by most.
Half a point to anyone who can spot _that_ message (as spelled out in classic literature). Hint: the concept of government we're accustomed to is being steadily replaced by this new concept, illustrating a significant shift in the world's power balance.
Progress (Score:2)
It's the facade of our heritage
The odor of our land
They speak of progress
In red, white and blue
It's the structure of the future
As demise comes seething through
It's progress 'til there's nothing left to gain
As the dearth of new ideas
Makes us wallow in our shame
So before you go to contribute more
To the destruction of this world you adore
Remember life on earth is but a flash of dawn
And we're all part of it as the day rolls on
And progress is a message that we send
One step closer to the future
One inch closer to the end
I say progress is a synonym of time
We are all aware of it but it's nothing we refine
And progress is a debt we all must pay
It's convenience we all cherish
It's pollution we disdain
And the cutting edge is dulling
Too many folks to plow through
Just keep your fuckin' distance
And it can't include you!
-Bad Religion, 1989
Moralities (Score:2)
Your statements are wrong in principle, and terribly so. Sincerely, you are trying to impose morals on things that couldn't care less what humans think of them.
In part I am a technocrat, so you may blame me for this. But I also know the value and the drawbacks of many scientific and technological advances from a human point of view. Science and technology are inhuman in their inner nature. They do not depend on you, me, the government or corporations. Whatever humans do to find a new law or create a new invention does not give these things a human character. It is quite unfortunate that Earth is probably too isolated from the rest of the Galaxy; I believe that if we had two or three neighbors, we would have a clearer picture of how a wheel, a car or a computer looks to a mind outside humanity.
You talk about responsibility in technology. What makes you think that technology should be responsible? It is humans who should be responsible. They either build nuclear stations that blow up, or nukes that bomb cities. The technology is mostly the same for both, and the human irresponsibility is at nearly the same level in both cases: a criminal case of playing with matches to see how things burn. But note that the problem is not in the matches. The problem is in how you use them. Please weigh every point of nuclear technology and tell me: should we forbid it? Yes? Cool, then we should have forbidden dynamite, cannon powder, and even fire. Why didn't our ancestors see it? One man makes a fire to roast a chicken, and another makes one to burn whichever people he has decided to hate.
So Frankenstein is probably not the best example you could have taken. It is more an indictment of human nature and morals than of technology. Maybe it would be better to choose Dr. Jekyll and Mr. Hyde as an example; there the technology is more to blame for freeing the monster.
Meanwhile you talk about the future of technology and boards, public discussion, committees. Do you know what they mean? The Inquisition, the mobs, and nothing less than the corporations, this last in a much broader sense than the "typical modern" corporation. Do you remember Galilei and Bruno? Do you remember what happened to the first car? Do you remember the masons of the Middle Ages and their construction skills?
The Judaic/Christian/Muslim tradition states that God said "You shall not kill." Well, that's the point. You may have your bare hand, a stick, a pistol or a nuke, but "you shall not kill." Correct, it is hard in our marvellous world to follow such a rule. I know how damn hard it is, especially when someone is trying to kill you. But the problem is in the human values you hold and how you apply them. I don't go through the streets shooting everyone I see. But I can't rule out using violent measures to stop an uncontrollable hooligan trying to cut my throat.
The same point goes for the human genome. If we truly care about our future, then we should forbid the uncontrolled use of genetics in humans. But if we care about solving serious health problems like hereditary defects, then we should modify genes. There is no other way.
And anyway, imagine that humans carry some percentage of artificiality. In fact they do, if only from the selectiveness of our own nature. So what makes modifying genes such a big difference?
Re:Dragoness on Her Knees to the Rich (Score:2)
Certainly beats you.
"Perhaps" we should let the executives at Firestone ("carnage on the freeways") Inc.
What you're conveniently forgetting is that it wasn't a government agency or some Naderesque "consumer advocate group" (read: "bloodsucking lawyers") that found out about the problem. It was State Farm Insurance, presumably run by the same sort of "greedheads" that you despise. In fact, they informed the NHTSA two years ago.
> or Union Carbide ("the dead of Bhopal") Inc. make these kinds of moral and ethical decisions for us.
And what you're also (again quite conveniently) omitting here, is that what happened in Bhopal in 1984 was not the result of any decisions made by "greedy" Union Carbide executives, but was caused by one of the workers sabotaging equipment at the plant. What's the glowing term you leftists use for that? "Direct industrial action", I believe?
> WHO DECIDES? Certainly we don't want the greedheads making those decisions, unless you are someone like Dragoness here, who serves as sycophant to the rich.
I'll trust a businessman greedy for money rather than a politician greedy for the power to run other peoples' lives, any day. No contest.
How about we have a little test. You total up all the people rounded up and shot/gassed/etc. by businesses during the past century, and I'll add up the number of similar folks treated thusly by governments, and we'll see who has a higher total. Better hurry: Hitler, Stalin, and Mao put the governments' "score" at around 60 million right off the bat.
Or instead, how about this: call up Bill Gates and tell him you use Linux. Then call up Janet Reno and tell her you use heroin. Let us know which call results in armed men kicking down your door.
Re:Who decides (Score:2)
If you work in the defence industry, or with military funding, then maybe you want to look at the uses of your stuff. I hope the guy who invented napalm has that on his conscience for ever - if there's no obvious use for your stuff except to kill painfully, then that's not something I could do. But if you had to second-guess everything, there'd be nothing left.
And think on: mil-tech isn't necessarily evil. GPS and the Internet are both military projects; GPS is still funded exclusively by the US military. The jet technology developed for fighters in the 40s and 50s drives the jumbo jet that takes you on holiday.
And there's other stuff too - how about medicine? The "truth drugs" out there have medical uses, and their "truth drug" properties are an accidental side-effect. The original, scopolamine, is a sedative and is also used in small quantities to combat motion sickness.
What's likely to trip us up is something we'd never think of. In films, think T2 - a computer becoming sentient and enslaving mankind is not something a geek thinks about when he's working on his latest chip design ("It's not every day you learn you're responsible for the deaths of 6 million people."). Robert Heinlein once wrote, "In the early 1900s, most futurists agreed the car would serve a purpose. Some saw that it would replace the horse. But none of them foresaw the change in mating habits of the American teenager which it caused." Would you have predicted 20 years back that the Internet would reach such a mainstream audience, given the average population of BBSes and their speed and reliability (or lack thereof)?
Anyone who claims to be able to spot all future uses of something is lying - it just isn't possible. Futurists are no more accurate than weather forecasters - once something's happening, they may (if they're lucky) be able to tell you with a reasonable chance of success which direction it's going to go in and how fast, but there's no way to predict anything new starting, and there's no way of knowing whether the butterfly you've just seen flap its wings is going to cause the next cyclone.
Grab.
Re:Realization of the reality of the internet. (Score:2)
This is not the same as "the UK recognizes Sealand as a sovereign nation."