spiraling towards contentless content
Day 0
Posted: 2005-10-31 09:32
No comment(s)
Author: Phil Gengler
Section: NaNoWriMo

Thirty days. Fifty thousand words. One novel. Such is the goal of NaNoWriMo, or the National Novel Writing Month contest, which starts tomorrow and spans the whole month of November.

Since I'm a glutton for punishment, apparently, I'll be giving this a shot. I'm hoping that by forcing myself to sit down and write every night, even if I don't make the 50,000 word target, I'll still come away having broken out of the rut I've been in since I started working full time. Assuming I remember to do it, I'll try to make (more or less) daily updates of progress here.

Getting (back) into gear
Posted: 2005-10-03 14:39
No comment(s)
Author: Phil Gengler
Section: Copyright

Well, it's that time again. The Library of Congress has begun the third rulemaking process for exemptions to section 1201(a)(1) of the DMCA with an announcement in the Federal Register. Public comments are being accepted through December 1, followed by reply comments through February 2, 2006, and then there will be public hearings again. After falling away from copyright-related stuff for a while (for all sorts of reasons), this might just be the incentive I need to get back in the game.

Quick summary of the last rulemaking: exemptions were granted for lists of sites blocked by filtering software, dongle-protected "abandoned" software, access-restricted e-books with no text-to-speech option, and certain types of old video games. Hopefully this round will yield some more successes.

It's not my fault! It's the (lack of a) law!
Posted: 2005-10-03 10:53
1 comment(s)
Author: Phil Gengler
Section: Rants

Over the weekend, the New York Times ran a story about a new Connecticut law that, in part, makes it illegal for drivers to talk on cell phones while driving. (Technically, the law makes it illegal to have a cellular phone "in the immediate proximity" of one's ear, as though holding a closed phone in your hand, which may then be near your ear since your elbow is resting next to the window, is the same as having a conversation into it.)

I fail to see what makes cell phones so big a distraction, especially compared to having a conversation with a passenger in the car. With a cell phone, at least, it's easier to resist the human tendency to look at the person we're talking to, so you should be better able to keep your eyes and attention on the road. Now, there may be some actual basis for saying that talking on a cell phone is riskier than talking to a passenger; I haven't read the studies claiming this, so I don't know whether they compare the risk with cell phones against normal, non-distracted driving, or against driving while carrying on other forms of communication. Equally important to the general public's perception of the issue, if not more so, is the play this information gets in the media: the message is that talking on a phone while driving is more likely to get you into an accident than undistracted driving (which is pretty much a foregone conclusion).

While I think this law is unnecessary, it isn't what has me so angry this morning. The last two paragraphs of the article are particularly infuriating:

And for some drivers, the law's extension beyond just talking was a welcome reminder that sharing the road requires concentration. Efrain Rosario, 22, an account manager for a Verizon Wireless store in Trumbull, said that on Aug. 10 he drove his 2005 Scion tC into the back of a Honda while reading a text message from a friend. If the law had been in place, he said, maybe he would have thought twice about looking down to read "What's up?"

"That 25-cent text message cost me a thousand dollars in damage," Rosario said, "and it ruined my night."

So, let me get this straight: this idiot thought he would be okay to check a text message while driving, got into an accident, and then claims that it's not his fault, because there wasn't a law prohibiting it? At some point, people have to take responsibility for their own actions; this guy chose to defy common sense and got what was coming to him. He should not be able to shirk responsibility for what he did, and he certainly has no basis for claiming there should be a law prohibiting that behavior. His defense that "had [a law] been in place ... maybe he would have thought twice" is the biggest load of bullshit I've seen lately outside of politics.

When there's an accident, and the cause was carelessness or recklessness, that's already illegal. If you drove around staring at the floor of your car the whole time, and got into an accident, that would be illegal. The fact that you weren't paying attention is not a defense; it's an indication that you ought to take responsibility for the fact that you screwed up. The same holds if you were looking into a mirror to change lanes, for example, and rear-ended the car in front of you. You weren't paying attention to the road in front of you, and you're at fault. The fact that you weren't paying attention is, again, not a defense. So there is ample precedent, not to mention common sense, supporting the idea that this guy is the one to blame, and that a new law is not required. I'm certainly not aware of any movement to prohibit drivers from checking their mirrors because it diverts their attention from the road in front of them.

I sincerely hope that people like Efrain Rosario are few and far between; the stories I've heard about people doing stupid things and then claiming the need for a law against them, however, aren't very encouraging. The purpose of government should not be to try and "protect" its citizens against anything and everything; to accept that as the point of government would be to give it the power to control every move we make, from when and where we can drive to what we can have for dinner, and how we can make it. To keep this from happening, we, the people, need to do our part not to put ourselves and others into situations where harm is likely, whether the behavior is illegal or not. We also need to ensure that we do not allow others to cede our rights to the government in exchange for misguided protections against the actions of a few.

Why I hate the world today, part two
Posted: 2005-08-16 12:37
No comment(s)
Author: Phil Gengler
Section: Rants

Some days, you take a look around the world and wonder how the hell humanity made it this far. In addition to everything else, now the US government has asked ICANN to put a hold on approving a new .xxx domain suffix. The process of creating the suffix, which was originally going to be completed later this week, has been underway for five years. The suffix itself has actually been approved by ICANN; the pending matter is a contract for a company to run the TLD.

But this isn't what bothers me the most. What really gets me is the response to this by the Family Research Council. Now, I disagree with nearly every point the group makes on just about every issue, and this is no different. My problem here is that I simply can't understand how people can possibly be so ignorant as to completely miss the point.

According to FRC's statement, "Pornographers will be given even more opportunities to flood our homes, libraries and society with pornography through the .XXX domain." I suppose the fact that any sites that move to a .xxx domain can be blocked with ease, while not affecting non-pornographic sites (a problem with many current filtering systems), means that somehow more pornography will get through.

"To some this [creating a .xxx suffix] may initially seem like a good idea but it is one that has been considered and rejected as ineffectual for years. This will NOT require pornographers who are on the .com domain to relocate to the .XXX domain." But for those that do relocate, it is much easier to block them by blocking all .xxx domains.

"The .XXX domain will increase not decrease porn on the Internet." Neither will not having it; at least when it exists, it's easier to filter out porn sites that are going to exist with or without a new domain suffix.
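To make the filtering point concrete: blocking by domain suffix amounts to a simple string check on the hostname, whereas content-based filters have to guess at what a page contains and routinely sweep up innocent sites. This is a minimal illustrative sketch (not anything from the FRC statement or ICANN), with a hypothetical blocklist:

```python
# Illustrative sketch: TLD-based blocking is a one-line suffix check.
from urllib.parse import urlparse

BLOCKED_TLDS = {"xxx"}  # hypothetical blocklist for this example

def is_blocked(url: str) -> bool:
    """Return True if the URL's hostname ends in a blocked TLD."""
    host = urlparse(url).hostname or ""
    return host.rsplit(".", 1)[-1].lower() in BLOCKED_TLDS

print(is_blocked("http://example.xxx/page"))  # True
print(is_blocked("http://example.com/page"))  # False
```

Note that this only works for sites that actually move to the new suffix, which is exactly the author's point: a voluntary .xxx domain makes those sites trivially easy to exclude without collateral damage.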

Just when I thought that people (especially unhinged religious nuts) couldn't possibly get any stupider in my eyes, they do something like this. Thanks for making me realize that, as a species, we're completely and totally doomed.

Shoot first and don't bother with the questions
Posted: 2005-08-16 10:27
No comment(s)
Author: Phil Gengler
Section: Politics

Just when you thought you couldn't hate the world any more than you already do, you read about the facts surrounding the shooting by police of an innocent man in a London Tube station (I talked a little about this in a previous post).

Initial reports about the shooting indicated that the man was wearing a heavy coat in the summer, had jumped over the turnstiles to gain entry, and had run from police. According to the report in The Observer, however, he was not wearing a heavy coat, he did not jump the entrance barrier (he used a regular fare card), and some witnesses say that he was not challenged by police, who some witnesses say did not identify themselves.

We may never know what actually happened inside that station, despite the "dozens of cameras" present inside. Police say that most of the cameras were not working, even after the push for more cameras after the July 7 bombings.

Furthermore, British police have publicly acknowledged a "shoot to kill" policy for people they believe are suicide bombers. Not only does this leave open the possibility of other innocent people being shot dead by police, it also won't help to stop an attack by anyone who has put some thought into it. As Bruce Schneier mentioned in the latest Crypto-gram newsletter: "It is entirely conceivable ... that suicide bombers will use the same kind of dead-man's trigger on their bombs." In that case, killing them would only cause the detonation of the bomb. (Granted, in that situation, apprehending them would likely have the same result; however, when someone is wrongly suspected of being a suicide bomber, if they're apprehended they aren't dead.)

Comments by Sir Ian Blair, the Metropolitan Police Commissioner, show that the policy hasn't been thought through very well. The Observer article has Blair describing the "shoot to kill" policy as the "'least worst' way of tackling suicide bombers" and noting that he "refuses to rule out other innocent people being shot."

If our own forces, the ones we expect to protect us from attacks and to keep us alive, are operating under a policy that they know might result in the killing of innocent people, why are people accepting this? We don't want to be killed by terrorists, but we're perfectly willing to allow ourselves to be killed by the police? What, then, is the difference?

Now, I know this is a British incident, and fortunately, does not (yet) reflect the policies (at least, the acknowledged policies) of any police force in the US. However, given the similarities between Britain and the US, I think it's important to keep an eye on what is happening there, as it may soon find its way to this side of the Atlantic.

Shoot first and ask questions later
Posted: 2005-07-23 18:26
No comment(s)
Author: Phil Gengler
Section: Politics

In the wake of two rounds of explosions in London's Tube rail system, there has been a response on the part of government here in America to increase the apparent security of the transportation system, especially in the crowded NY/NJ area. After the July 7 bombings, more police were assigned to make patrols of train stations up and down various lines, with some officers also riding on the trains. Following the (smaller) incident on July 21, law enforcement officials instituted a policy of performing "random" bag checks of passengers in the NYC subway and on the LIRR.

There has been quite a bit of discussion about this among 'railfans' at a forum I frequent. There was also a fairly active thread about it at MetaFilter. The majority of people whose opinions I've come across seem to feel that the searches are a "good idea," since even if they fail to stop an attack, they make people feel safer.

While I'm willing to concede that such searches may make people feel safer, they are not actually making anyone safer, especially since, with the subway and LIRR searches, people are free to decline to be searched and walk away. First, since the subway and railroads (in this area, at least) are "public transit," operated by local government, there are no grounds for preventing someone who pays the fare from riding. Being forced to consent to a search of a closed container (a backpack, for example) in order to gain access to public infrastructure seems a blatant violation of the Fourth Amendment protection against "unreasonable searches" (for the record, I feel the same way about being subjected to such searches when entering courthouses and other government buildings). Since the searches are conducted at random, or in the case of courthouses and government buildings, on everyone, there is no "probable cause" for them. The fact that someone is carrying a bag onto a train does not make them any more likely to commit a crime; thousands of commuters bring bags, briefcases, and so on onto trains every day, and have for years.

Second, allowing people who decline to be searched to walk away (instead of being detained, which I suppose is one bright spot in this whole mess) just makes it possible for someone actually looking to get a bomb onto a train to walk away and try again later, or at a different station. Speaking of probabilities, one comment on MetaFilter noted that even a mandatory search of passengers would have stopped only one of the four bombs in the first London attack.

Third, such measures open up other places where a terrorist could detonate a bomb. Rather than have to get it on board, one would just need to get inside the station, to the turnstile area (for the subway) and detonate it during rush hour. The stepped-up security in stations also makes it more likely that a terrorist would just avoid them and concentrate on a different target that does not have the same level of security (a bus, for example).

This has come up in a number of places (here, for example). All our responses to terrorism thus far have been reactive. After 9/11, airline security was stepped up, and then there were attacks on other targets, such as rail systems. Now that security has been stepped up there, movie theaters, as one example, could be next. After a theater was attacked, security at theaters would be stepped up, and then the next attack would be somewhere else, and so on. Given that most governments don't have the money to sustain even the current expanded security in the long run, it's obvious that we can't protect everything. I don't believe we would want to, even if we could. Even if every airport, train station, theater, museum, and office building had an armed patrol, that would still leave individual residences open; and short of permitting armed agents to live in our homes (or having such legislated upon us by the government, in contravention of the Third Amendment, since no war has been formally declared and our current "state of war" is a common understanding rather than a legal one), there is no closing that gap.

Much has been said about one's "right to live" being more important than any "right to privacy." I have to agree that the right to live is fundamental, since without being alive it is impossible to exercise any other rights. I do not believe, however, as others do, that a "right to live" is so important as to warrant overriding other rights; given the definition of a right as something inherent and not granted, like a privilege, that doesn't make much sense. It is important to balance all rights, not just a select few. It is also important to remember that our Constitution includes a Ninth Amendment, which reads "The enumeration in the Constitution, of certain rights, shall not be construed to deny or disparage others retained by the people." In contrast to those who claim that the Constitution does not recognize a right to privacy, the Ninth Amendment clearly does (despite how much this amendment has been gutted by the courts over the years).

It is commonly recognized that our expectation of privacy is significantly diminished when we enter a public place. Courts have ruled, for example, that it is not illegal to take pictures of someone in a public place. One thing to note, though, is that we choose how much we want to reveal about ourselves. Someone can cover their entire body from head to toe, and thus ensure that virtually nothing is learned about them. On the other hand, we could all walk around without any clothing whatsoever, and keep nothing about our appearance secret. I don't think that society as a whole is going to give up on clothing anytime soon, as we recognize that there are some personal things best left private. The same logic should apply to carrying something through a public area. If I'm carrying a copy of the "Anarchist's Cookbook" (or even an almanac) under my arm, I shouldn't expect people to avert their eyes and not look at it. If I'm carrying it in a closed bag, however, I have some expectation of privacy. After all, we don't generally tolerate random people searching through our closed bags. There is an expectation that what is placed out of public view was put there because the owner or carrier didn't want everyone to see it.

The same goes for carrying drugs, the only difference being that drugs have been labeled 'illegal' by the government. With the subway searches, the New York Times reports that "anyone found to be carrying illegal drugs or weapons will be subject to arrest." If the checks were confined solely to bombs, then it might not be so bad; with the scope of the search including drugs (and presumably other illegal items, like drug paraphernalia), however, this is a blatant violation of the "unreasonable search" clause. The Supreme Court has ruled that if something like drugs or weapons is found during the course of an otherwise legal search, the items can be used as evidence in a case about them. In those decisions, however, the items were found incidental to a search conducted either with a warrant or with consent. While these searches also require "consent," it's basically the same as an officer who wants to search someone's home telling the person, when consent is refused, that they can't live there. There is no "implied consent" law for carrying bags on the subway the way there is for a breathalyzer test when driving (which, incidentally, I think is blatantly unconstitutional).

Where do we draw the line? It's possible that a terrorist might load up a car with explosives and blow up a bridge, so should we all be required to consent to having our cars searched at random, or else be barred from the roads? You could walk over a bridge with explosives and detonate them, so should we have to consent to searches before we're allowed to walk on a public street? If we continue to allow our rights to be eroded by "security theater," as some have called the actions being taken, what right do we have to call this country the land of the free?

One more thing I want to cover goes back to the idea of the "right to live." On Friday, British police shot and killed a man in a Tube station. Reports indicate that the man was "acting suspiciously," by wearing a heavy coat on a summer day and avoiding eye contact with police. It is believed that the man was told by police to stop, but instead hurried to get on a train. He was shot five times at point-blank range and died. Today, the New York Times is carrying a story headlined "Britain Says Man Killed by Police Had No Tie to Bombings." In the name of security, an innocent man was shot and killed, by the "good guys." What I find most repulsive about the story is that he was shot at point-blank range, which means the officers were right behind him. Some witness reports indicate that the man had tripped; I don't know for certain whether this was the case, but in any event, if the police were that close to him, why didn't they take him into custody? There was nothing to be lost by it; if he was carrying a bomb, he was stopped before he could detonate it, and police would have the chance to learn something about the operation (certainly no less than if he were dead). If, as was the case, he was not carrying a bomb, he gets to go on living. What happened to this man's "right to live"?

Elsewhere, I posted a hypothetical that relates to the London situation. It goes like this: I'm out one day in early-to-mid fall, and the day is much warmer than I dressed for (perhaps it's just an unseasonably warm day, and I've got a bulky coat on because it's all I brought). I'm on my way back home from school, so I've got a backpack full of books and a laptop, and I'm in a bit of a rush because I've got a doctor's appointment, so I'm trying to weave in and out of people to get to the train before it leaves. Maybe at this point an officer yells "Stop," but I don't look up because I don't think it's directed at me, or I don't hear it.

Should the police shoot first and ask questions later? The only thing I may have done wrong was not turn around and respond to the officer yelling "Stop," but assuming I heard it, how was I supposed to know it was directed at me and not one of the people in the crowd?

Measures such as the bag searches do very little to make anyone more secure, or deter a terrorist attack. They do, however, have the effect of giving people the idea that such measures are necessary, and that they must give up some liberty to be safe. As the ubiquitous Ben Franklin quote goes, "They that can give up essential liberty to obtain a little temporary safety deserve neither liberty nor safety." Those that give up that liberty for the delusion of security also deserve neither liberty nor security, but those of us who do value our liberty must not let those people give up our liberty, for it is not theirs to give nor the government's to take.

The death of the English language?
Posted: 2005-07-07 20:26
No comment(s)
Author: Phil Gengler
Section: Stuff

Last week, there was an "Ask Slashdot" submission in which the author noted that he "noticed that a surprisingly large number of native English speakers, who are otherwise very technically competent, seem to lack strong English skills." As someone who served as an editor for a newspaper at a tech school, I certainly have an opinion on the matter. Unfortunately, I came across it the next day, and given that Slashdot stories tend to be rather short-lived, I figured I'd give the issue a more thorough treatment and post it here.

The author comments that "It baffles me that a culture so obsessed with technical knowledge and accuracy can demonstrate such little attention to detail when it comes to communicating that knowledge with others." First, this is a blatant generalization, as there are plenty of technically savvy people who make an effort to follow the "rules" of the language; I like to think I'm part of such a group. That said, my observation has been that such people are outnumbered by those who take a less structured approach, instead opting to use other means to get their ideas across.

By "other means," I mean any number of shortcuts that have been adopted and accepted by many people. For example, checking one's spelling is not often done for 'quick and dirty' forms of communication, such as an instant message, e-mail, or website comment. A number of mistakes arise from this, leading to common misspellings such as "teh" in place of "the." As real-time, text-based chat, be it IRC, IM, or something else, became more popular, shorthand was adopted for some of the more commonly used phrases. Otherwise meaningless arrangements of letters such as "lol," "bbl," and "afk" (to name a few) became part of the language used when communicating through the Internet.

One apparent result of the ability to communicate via the Internet has been an increase in the amount of "written" communication that takes place between two or more people. Many years ago, writing letters was effectively the only way to communicate with someone not in the immediate area. When the phone system grew, telephone calls became the more common way of having an informal (and in many cases, formal) conversation with someone. Writing a letter was reserved for more formal communications, and as a result, was subject to a higher attention to detail and accuracy. Having a voice conversation, as was possible with the telephone, has not been subject to the same standards of language use; I believe this is because once you have said something to someone, it cannot be taken back. With written communication, you have the chance to look it over as you compose it, as well as after, all before the content reaches the recipient. With this, it was expected that you would look it over and ensure its syntactic and semantic correctness.

With the Internet, and the ability to communicate quickly and cheaply through a primarily text-based medium, informal communication became written again. Ultimately, the fact that such communication is quick and cheap is, in my opinion, the reason that language standards tend not to be applied. Most electronic communication is fleeting; instant messages are rarely kept around any longer than a single chat session, in-game communications are not logged, and most e-mail messages are not saved. This is in contrast with letters, which were often saved. (I do acknowledge that, in total, electronic communications are in fact stored more often than physical correspondence, through caches and server logs; most of the time, though, the sender and the recipient never deal with any of these intermediaries.) As a result, mistakes are more easily forgotten, and corrections more easily made. Rather than thoughtfully compose a lengthy e-mail message, most people will send short messages with little or no thought, sending another message when something else crosses their mind.

I believe that I have digressed from the original point. The tendency toward short but frequent electronic communication is not limited solely to technical people; it afflicts nearly everyone who communicates via the Internet. In that, I disagree with the original author's view that such problems exist only in, or to a larger degree in, the technical community. I do find it more surprising that such an attitude is fostered among programmers, though, since programming languages are very strict in the syntax they allow. One of the comments responding to the original article touches on this point; the comment's author makes the distinction between a compiler, which can only understand things written in its syntax, and a human, who is capable of extracting mostly-correct meaning from sentences that don't conform to the rules of the language.

One of the comments replying to this makes a point of noting that "communicating effectively" with a human does not require rigid adherence to a set of rules. This argument is echoed in a number of other comments to the article; the idea is that, if the recipient is able to understand the meaning of what the originator was trying to communicate, then it does not matter how strictly the communication followed the "rules." In various threads up and down the discussion, proponents of this idea disagree with those who believe that there is more to communication than just understanding the idea.

I have mixed feelings about both ideas. On the one hand, I tend to feel that when something can be done just as well without following a set of arbitrary rules, the rules should probably go. On the other, as someone who tends to be pedantic about language use, I do agree that the way something is communicated conveys information above and beyond what was intended. Someone who is very lazy in their writing, who frequently misspells words and misuses grammar, can come across as someone who does not much care about what they are trying to say. This certainly depends on what is being communicated; for someone trying to make an effective point, I tend to believe that the presentation is important, though not as important as the content itself.

In other cases, it is hard to speak generally. When someone is writing an ephemeral comment, perhaps a not-too-funny joke, I tend to be much more tolerant of spelling and grammatical errors. When reading something that is meant to be "professional," such as a serious entry on a website or some sort of proposal, I expect that the author has expended the time and effort to proofread. When I see writing with misspellings, especially of common words, or sentences which take far too much effort to parse correctly, or missing or extraneous punctuation, it diminishes the quality of the entire work.

Now, I am certainly no less guilty of this than anyone else; for quite a while, I spelled the word "sentence" as "sentance," and there have been several instances when I failed to perform my "due diligence" on something posted here, only finding out about the mistake when I noticed a hit from Google on a misspelled word. Fortunately, this doesn't happen very often. I also admit that I tend to be very pedantic about language usage. One of my pet peeves is when someone uses the phrase "begs the question" where "raises the question" would be correct. The debate goes on about whether the common misuse of this phrase means it has become part of the English language (for which there is, in effect, no governing body).

At the heart of all this, however, is what I believe to be a root cause: a lack of proper education in the use of the language. While I imagine that a basic grasp of English is taught in all schools (I can't say for sure, and don't want to generalize), continuing education and reinforcement of those topics is lacking. From my discussions with others, it appears that my school district's curriculum was an exception, with an English course required through the eighth grade. Even there, very few people in the class took it seriously; there was a prevailing attitude among the students that they were being 'babied.' People seemed to feel that they already knew the language, and being made to learn more of it was something of an insult. Even in other classes, where writing assignments were given, very little was done to observe and correct most mistakes. Teachers would certainly correct the basic errors in a piece of writing, but as my experience as an editor showed, these corrections were often superficial. Poor sentence construction was frequently left uncorrected; you would be sure to find a red circle around the use of "it's" as a possessive, and run-on sentences and fragments would be marked, but you would rarely find suggestions on how to rearrange a sentence or a paragraph to make it clearer.

Once more students started chatting online, I think many of them found it easy to be lazy; certainly, that is the culture of the Internet. Unlike a school assignment, there usually isn't anyone who will correct the spelling or grammar of a forum post or instant message, and since similar mistakes are happening all around us without any major complaints, it is easy to get by without ensuring accuracy. The pseudonyms we can often hide behind make this even simpler, and can cause even people who usually strive for correctness to slip up; after all, it would generally take a lot of effort to connect something written on the Internet to a real person.

Unfortunately, I don't really see any way to correct this situation. With the Internet becoming a larger part of more students' lives at a younger age, seeing "Internet English" in so many places leads them to believe it is the right way to write (which can be seen when chat shorthand makes its way into school assignments), and so they keep using it, other kids see it and start using it, and the cycle continues. Whether this is ultimately good or bad for communication remains to be seen.

Sliding further from American principles
Posted: 2005-06-24 18:52
3 comment(s)
Author: Phil Gengler
Section: Politics

The Supreme Court dealt a huge blow to another American foundation yesterday, when it expanded the reach of the power of eminent domain. This comes not long after its ruling in Gonzales v. Raich, in which the court stomped all over states' rights.

This new case, Kelo v. New London, dealt with the issue of whether a government could, using the power of eminent domain, take property from a private owner and then turn it over to a private developer. More specifically, the question was whether such an action counts as "public use" under the Fifth Amendment, which states, in part, "... nor shall private property be taken for public use, without just compensation."

The particular phrasing is very important here. The amendment says "public use," not "public good." The majority opinion in this case, as well as some of the analysis I have heard on the subject, seems to miss this distinction. "Public use" clearly means use by the public, whereas "public good" is much more nebulous; it would include the collection of more tax revenue, which seems to be the point made by the city of New London, and with which the majority agreed.

This public benefit, however, is not "public use." In fact, it is not even an assurance of a public benefit. What the city of New London, and the private developers to whom the land is to be given, have is a plan for developing the particular area. The plan is not going to generate any tax revenue automatically, and for a time (once the private residences are demolished), the value of the land will actually decrease. It is not until something new is actually built on the site that the city will be able to collect more. However, there is no assurance that any of this will actually be carried through. It is entirely possible for the project to be halted, with the existing private homes demolished but nothing new constructed on the site.

As a related example (though not connected to the discussion of eminent domain), there was a plan in Asbury Park, NJ, to revitalize its waterfront with the construction of a 10-story building for luxury condos. Construction got underway, backed by large sums of money from the city, but was abandoned when the developer went bankrupt. To this day, the half-completed building still stands, an eyesore to the town. One Asbury Park official was quoted in the Star-Ledger as saying, "People hate it so much they would pay to blow it up."

The possibility of abandonment is far from the biggest problem I have with this (in fact, most redevelopment plans don't end this way; I merely included the example of Asbury Park to show that plans and promises do not necessarily provide certainty). What I am most outraged about is the blatant assault on private property rights and the huge potential for abuse by corporate entities.

One of the principles of our Constitution is the protection of private property. While it is implicit in the Constitution's heavy debt to the ideas of John Locke, it is made explicit in the Fifth Amendment: "nor shall private property be taken for public use, without just compensation." The founders recognized, however, that sometimes absolute private ownership could be a problem, and that sometimes the state could make better use of land or property for the benefit of all its citizens than could a private owner, hence the need for this clause's inclusion.

It is more than just the decision in the case that angers me. In the syllabus preceding the opinions, the court states that "Petitioners' proposal ... that economic development does not qualify as a public use is supported by neither precedent nor logic." I think it is perfectly logical that "public use" is different from public benefit. The property is still going to be privately owned, and in that sense, it is not "public use." Despite the fact that the hotel, restaurants, shops, and marina that are to be built on the property will be "open to the public" in the same sense as any other restaurant or store, they are still not public property. The owners retain the right to deny access to anyone they want, for whatever reason. That is not public property, nor can it be considered "public use." This hardly defies logic; in fact, I fail to see how it could be any clearer. I think a comment by radio host Jim Kerr (who, I think we can all agree, is hardly exploring legal nuances and technicalities) sums it up best: "It just seems un-American."

I mentioned earlier that I feel there is a huge potential for abuse here, particularly by corporate entities. There was a case in California City, CA, where the town wanted to take 700 acres of privately-owned, mostly desert land and turn it over to Hyundai for the construction of a test track. I haven't been able to follow up on this, so I don't know what has transpired since, but I do have a disturbing quote from the town's mayor, Larry Adams: "I think it's a matter of one or two [who] probably are true believers in the freedom of mankind, blah blah blah, and they don't like eminent domain."

What is deeply disturbing about that case is the attitude of the mayor. Rather than try to represent the interests of the people who would be affected, he simply dismisses them out of hand for wanting to hold on to their property.

Other potential for abuse certainly exists. Corporations can come into a town and make promises about bringing in new revenue and new jobs, and city councils and other elected officials, especially those in towns that are, like New London or Asbury Park, experiencing hard times, listen to these promises very intently. When merely making promises doesn't work, there are other methods that can be used to influence politicians to support the corporation's interest. In any case, if the corporation decides it wants just about anything, it is going to have the backing of the town (and now the expanded power of eminent domain) to help make sure it gets it. We could have, in effect, corporations taking property from individuals, with the full approval of the government.

Perhaps most disturbing of all is that we now have a federal government which can effectively assume any power it wants, and it can take anything from anyone and give it to anyone else it wants, and all of this is now sanctioned by the branch of government that is supposed to be a sanity check on the other two. This was specifically what the federal government was designed not to do. The shredding of our Constitution continues, and I'm not sure that America can (or should) survive it.

Does the Constitution still mean anything?
Posted: 2005-06-10 01:54
No comment(s)
Author: Phil Gengler
Section: Politics

By now, this topic has been covered a million times over in practically every other corner of the Internet. This isn't going to stop me from throwing my two cents out, too.

By a 6–3 decision, the Supreme Court ruled that the Constitution's "Commerce Clause" (Article I, Section 8, Clause 3, which reads, "[The Congress shall have Power] To regulate Commerce with foreign Nations, and among the several States, and with the Indian Tribes") gives the federal government the power (through the Controlled Substances Act, which states that marijuana has no medical value) to arrest and prosecute growers and users of medical marijuana, even in states with laws specifically allowing such a use.

This particular case, Gonzales v. Raich, involves two California women, Angel Raich and Diane Monson, who were both prescribed marijuana by their physicians. Monson grew her own marijuana plants, while Raich had plants provided to her at no charge by two unnamed individuals. Federal agents raided Monson's residence and destroyed the plants she was cultivating.

The case has been watched very intently, as it not only relates to the debate over marijuana, but also over that of federal power. The case came down to an interpretation of the Commerce Clause, and the resulting power of Congress to regulate 'interstate commerce.'

Despite the fact that neither plaintiff was engaged in any act of commerce (there was no economic exchange) and that the acts were entirely intrastate, the Court ruled that "Congress' Commerce Clause authority includes the power to prohibit the local cultivation and use of marijuana in compliance with California law." The argument is that allowing such legal uses (under state law) leads to a greater market for the drug, which in turn makes it both an interstate and a commerce issue.

It should come as no great surprise that I think this ruling completely misses the entire point of the Commerce Clause and of the Tenth Amendment. The Constitution explicitly gives Congress the power to regulate "Commerce ... among the several States." The Tenth Amendment ensures that "the powers not delegated to the United States by the Constitution ... are reserved to the States respectively." It is tortuous logic to stretch the Constitution to cover the government's actions here, and the fact that six members of our Supreme Court endorsed it (in addition to the members of Congress who voted in favor of such legislation, as well as the presidents who signed it and have enforced it) is heartbreaking.

By explicitly including only "Commerce ... among the several States," and not something along the lines of "activities affecting commerce among the several States," the authors of the Constitution made it clear that only those actions which are commerce and take place "among the several States" are subject to Congressional regulation. The Commerce Clause, along with the Tenth Amendment, is perhaps the most important "states' rights" provision of the Constitution. By interpreting it so broadly (and make no mistake, this case is not the first; one need only read the majority opinion here to see other citations used as precedent for such overreaching), the Court has further eroded the already weak rights of the states.

This angers me to no end. The Supreme Court is supposed to rein in the legislative and executive branches when they exceed their Constitutional authority, not selectively ignore certain parts of the Constitution to stay in line with the justices' personal beliefs (on this note, it is widely postulated that Scalia, who is typically in favor of more state power, joined the majority due to his personal feelings regarding marijuana). That such a thing has happened is, to me, a sign of the impending decline or collapse of the government. To me, this case (and the others in the same vein) is the essence of what some call "judicial activism." It is disturbing that many of the same people who decry such a thing support this decision, motivated by their own personal feelings regarding marijuana.

I feel that we are in a situation where all three branches of our federal government are in violation of the Constitution, and we no longer have any remedy, since the judiciary has shredded the Constitution to declare all of it legal. As I mentioned in a previous rant, we need to be alert to the dangers of incrementalism, or as some would call it, the "slippery slope." At each point along the way, the next step seems like a small and justified one based on all you have behind you. What gets lost in that is that all of these small steps start to add up, and without stepping back to look at the "big picture," we don't realize it. I think this country needs to stop for a moment and take a look at that big picture.

The federal government was created by the Constitution in order to make life better for the people of this country. By moving some powers (such as the power to make money, or to wage war) to a centralized entity, the states would be better able to deal with the local issues affecting their residents. The Tenth Amendment very clearly shows that the intent of the Framers was to leave power to the states, and to prevent the central government from growing too large and becoming too controlling. In this case, we have a state law specifically aimed at providing relief to a certain class of people (those whose doctors feel that marijuana is the only course of relief or treatment available to them). Against the opinions of medical doctors who find that marijuana provides effective relief, the federal government insists that marijuana has no medical value. By sticking to this position, the federal government is making life worse for some people, and no better for others.

The entire federal government has overstepped its bounds, and this is the sort of thing we must not be complacent about. If we are all content to merely sit back and watch, without organized and vocal opposition, this slide will only continue, and there will be nothing standing between the government and unchecked power.

America: is it 'Animal Farm?'
Posted: 2005-06-08 01:41
No comment(s)
Author: Phil Gengler
Section: Politics

"1984 is here." This basic idea can be found in countless blog entries and comments all over the Internet. Some people look at the techniques of the Bush administration and of Congress to 'fight the war on terror' and believe that we are moving much closer to the totalitarian society in George Orwell's 1984. While to some extent I agree with the sentiment, I think that Orwell's other popular work, Animal Farm, is a much closer analog to the way things are.

The way I read Animal Farm is as a warning about incrementalism: the danger of letting those in power continually implement small changes, each following from the previous, with the overall result being a broad change. I think this is a very close parallel to the actions of the United States government after 9/11 (which was the "revolution" that changed everything). As in Animal Farm, such an event united us all, just as the rebellion united all the animals on the farm. Once everyone is united, division emerges. In the case of the United States, it is the division between the two major parties; in the book, it is the constant disagreement between Napoleon and Snowball.

Once Napoleon ousted Snowball, and sufficiently convinced the other animals that Snowball was a traitor who remained a danger to the farm, his power grew. As needed, he had Squealer, his mouthpiece, mislead the other animals about the original commandments on which the rebellion was based. The parallel here is to our Constitution, the meaning of which continues to be debated and spun to this day.

Once Snowball is chased off the farm in disgrace (pursued by Napoleon's vicious attack dogs), he is used as the scapegoat for all the ills of the farm. When the windmill the animals devoted their time to building is destroyed, the blame is immediately laid on Snowball. From there, his location is always reported as whichever farm Napoleon is not currently seeking to trade with. At each turn, when negotiations break down, Napoleon claims that he was never serious, that his efforts were merely a ruse, and that, in fact, that particular farm has been harboring Snowball. To me, this parallels some of the actions of the current government, which, while saying one thing, later turns around and claims the opposite. For example, take the current case of the Downing Street memo, which flatly contradicts the administration's claims that it was, at all times, seeking ways to avoid going to war with Iraq.

One of the ways Napoleon keeps the animals from putting too much thought into his governing is, at any sign of doubt, to evoke memories of the farm under Mr. Jones, securing support by asking the animals (through Squealer), "Surely none of you wishes to see Jones back?" This is echoed in Bush's claims that "you are either with us, or with the terrorists," or that by allowing or disallowing certain things (such as dissent about government policy, or about the treatment of prisoners, and so on) the terrorists will have won.

Through all of this, Napoleon gradually tramples more and more of the basic principles the animal society lives by. As I have already mentioned, the seven commandments that were the basis of the society were changed to excuse whatever acts Napoleon had committed (or would soon commit): a ban on drinking alcohol later became a ban on drinking to 'excess,' and a ban on animals sleeping in beds was amended to state that animals were not to sleep in beds "with sheets." By labeling some people (including American citizens) as "enemy combatants" and denying them some of the basic protections of our Constitution (such as the right to counsel or to due process), the government is, in effect, altering some of the basic principles of this nation.

This does not mean that I don't believe some of the ideas of 1984 are making their way into our society. On the contrary, I feel that the intrusions of 1984 are the incremental steps that fill in the broad pattern of Animal Farm. It is of critical importance that we recognize when we are in the middle of such steps, and that we take action to halt them. Otherwise, we are likely to end up in a situation just like the end of Animal Farm, which closes with this line: "The creatures outside looked from pig to man, and from man to pig, and from pig to man again; but already it was impossible to say which was which."

All life is meaningless
Posted: 2005-03-24 01:03
4 comment(s)
Author: Phil Gengler
Section: Philosophy

As someone who seems incredibly prone to bouts of both optimism and pessimism, I think I have an interesting perspective on life. It may be because I'm currently feeling pessimistic, but life is basically meaningless. I think this was nicely summed up in the Law & Order episode "Empire"; I don't have the exact quote, but the gist of it is that a hundred years from now, people are not going to remember or care about what was said in a courtroom, but a particular stadium (which was being planned during the episode) will be remembered.

I certainly agree with the sentiment that the actions of today will almost certainly be forgotten in a hundred years' time; indeed, I think most things will be forgotten in even less time than that. I don't agree with the idea that a stadium, or any other structure, for that matter, is going to maintain its significance in the future, at least in the intended sense.

History is certainly not an authoritative record of events; there is bias in all of it, hence the quote that "history is written by the victors" to indicate that history slants to favor the winners of conflicts such as war. Through the years, as firsthand sources disappear, history depends greatly on what was recorded by the people who witnessed it. This carries a bias, as does the selection of what parts of it will be presented to the general public.

What's the point of all this? While we may name a stadium, or a building, or a statue after someone of significance now, perhaps to memorialize them or to commemorate their role in a particular event, years from now, the person it was named for will be largely forgotten. People may remember the particular circumstances and events surrounding the naming of the structure, but every other aspect of that person's life is basically neglected. For those who have their name attached to such things, this seems a good thing; after all, structures aren't named after someone because of the bad things they did, they're named for that person's positive contributions. Nearly all of the negative aspects are left out of consideration.

In that sense, buildings are a lot like history books (at least the ones that provide a general history, and not biographies). The people who receive mention in these books are known for something particularly famous, whether good or bad. Any other aspects of their lives are extraordinarily condensed, if they are mentioned at all. For example, few people today would deny that Abraham Lincoln was a good person, and every school history textbook I have encountered has portrayed him as a powerful figure who fought for the freedom of all men and the unity of America. While this is true, there is so much more to his life than that.

I'm not trying to take shots at Lincoln's reputation. I could make the same point with any other historical figure. What survives of them is largely their reputation, which is often shaped by one or two events of large significance.

I want to get back to my original point, though. Some say that being a historical figure gives a person some sense of immortality, and that they have such an effect as to shape life years after their death. Historical recognition does not provide immortality; at best, it may have the effect of "extending" one's life a few years beyond death. Once you move more than a few years past that, what survives is whatever historians (and the public) want to survive. At the time of the execution of a traitor, the general consensus may overwhelmingly be that the traitor absolutely deserved what they got; perhaps fifty years later, when the social atmosphere has changed, people may look back and see that person as a martyr. There are plenty of cases where someone has been put to death while in a very negative light, with the hope that their death would further whatever cause they stood for. At the same time, there are plenty of cases where people did not seek to be martyrs for a cause; they were killed because their opinions conflicted with those of society, and while they may have hoped for a change in the future, they were not expecting their death to be the basis of a movement. So when a person in the latter case is used in such a way, the supporters of the cause portray that person as a martyr. To strengthen their cause, they will attribute to that person all sorts of things which may or may not be true, and they will attempt to suppress facts which suggest anything to the contrary. Thus, the person died, and their life ended there. What carried on was the idea of the cause, with their name attached. The circumstances of their life have very little to do with the way their image is used later.

Generally, the people who die in such ways are only known because a cause chose to use them as martyrs. There is often not any event in that person's life that would otherwise have made them of significance. However the image of their life may be viewed later, their life was still meaningless.

Very few people who die are "resurrected" as martyrs for a cause. Very few people who live achieve significance enough to be remembered at all beyond the generation which knew them directly. The few people whose memory lasts longer than that still do not live on in immortality; the contortions of history see to that.

Many people would jump at the chance to be remembered for something they did. Plato attributed to Socrates the idea that "the unexamined life is not worth living," implying that if there is not some event that a person can be remembered for, by anyone, then their life was without purpose.

Largely, however, no one gets direct credit for the changes of society. History may associate someone (or some group of people) with a particular change, and to some extent, it may be correct. However, it is incredibly rare for any one person to singlehandedly change the views of a society. That one person may lead a group of people who are mostly responsible for the change, but the group is never credited. The individuals in that group may as well not exist outside of their roles as members of that group. Their lives are not recorded in history, nor are their lives examined. Yet every member of that group shares some portion of the responsibility for that change.

Changes in the views of society are rarely quick. They can often take years, and be so subtle as to go unnoticed at the time. The people in the group that effects a change may be long dead before it comes to be; these people died without knowing the result of their work. To them, their life was meaningless, as they failed to achieve any change in society.

In hindsight, we may see that this was not the case, but can we really say what a life meant when that meaning is often not realized until years after the person's death, if at all? The person who lived that life is still dead, and they died believing their life had no meaning. It is not possible for us to change the view of that person, but who else is capable of determining the meaning of that life?

This is some of what I consider when I contemplate the meaning of my life, or what I may achieve later. The inertia of society means that whatever I may do (and I may do nothing at all, as is the case with most people) is unlikely to change anything, and so I will live my entire life under the impression that it was meaningless. It is unlikely that I could accomplish anything large even among the group of people who know me directly; this group is small, and much more easily changed. If something I did were to have an effect on this group, it would likely happen during my lifetime. Once we move past that group, and even to some within that group, any view of me is flawed. I cannot think of anyone who knows everything about my life (excepting myself), and I do not think it likely that anyone ever will. Thus, any image of me is necessarily biased twice: once by what I choose to 'release' about myself, and again by the portions that will be recalled and exaggerated by those who know them.

With each day that goes by, I feel that my life matters less and less. Sheer probability dictates that I am unlikely to ever accomplish anything of significance, and my own view of self agrees. When I try to look at the small things, and to find meaning in them, I fail to see any of it as adding meaning to life. I sit at my computer and write this, yet it is a meaningless exercise; it will not make waves that will shake the foundations of society. It may influence the thoughts of a few random people, but it is unlikely it is going to redefine anyone's ethos. In fact, it is not likely that any significant number of people will even read it.

Everything in my life carries this same feeling; every action feels mundane, as though I am merely "going through the motions" and waiting for the inevitable death to come. I have the free will to do what I wish, but if none of it matters, then what is the point? I choose to try and stay reasonably fit, but for what purpose? Regardless of how fit or healthy I may stay, I am still going to die, and regardless of whether I am in perfect health or am comatose, there is no effective change in the meaning of my life.

It seems that for everything I do, where I may strive to be at the top, so that I might have some effect, there are always others who do the same thing, but do it better. I always view myself in competition with these people, because in reality, that is what life is. It is a competition to see who can do something to provide meaning to their life. Certainly, I could invent a specific categorization in such a way that I was the best, but this would have to be so narrowly defined as to exclude virtually everyone else.

Try as I might, I accept that I will never be the best at anything, at least not anything that matters. I cannot find anything I am remotely near the top of, and as such, likely to be able to use to effect change. Thus, it seems that my entire life will be meaningless. And so what's the point?

It's not like we liked our privacy anyway
Posted: 2005-03-13 14:21
No comment(s)
Author: Phil Gengler
Section: Privacy

In Congress, the push for a national ID and for an all-encompassing single-database system continues. In addition to being flawed ideas that will do nothing to increase our national security, in the wake of several recent data leaks it is clear these proposals will put more people at risk of having their identity stolen or their personal information leaked.

Last month, ChoicePoint released information on at least 140,000 people to criminals, who had posed as small business owners to obtain the data. The data that was released included names, addresses, Social Security numbers and credit reports.

Lexis-Nexis also leaked information, such as names, addresses, Social Security numbers and driver's license numbers, of nearly 32,000 of its customers.

On top of this are several leaks at colleges and universities. In January, the names and Social Security numbers of more than 30,000 students and faculty at George Mason University fell into the hands of attackers. Last year, the Georgia Institute of Technology and the University of Texas at Austin suffered similar break-ins and information leaks. More recently, a small-scale leak at Stevens put the names and Social Security numbers of 31 students into the open.

What all these examples show is that personal information is not particularly well-protected, and these are just some of the larger examples. The recent rise in identity theft is largely attributable to the increased storage of personal information, and the poor security (both technically and socially) that surrounds it. Now the federal government wants to combine all this data into a single source.

Any centralized federal database would be a single point of failure. Since it would contain personal information about every U.S. citizen (and likely information about travelers who enter the country, as well), it would be a prime target for cracking, whether just as a proof of concept or with the intent to steal and use the underlying data. The creation of such a massive data repository would require an intense amount of security and control over the use of the data inside, if that data is to be at all protected. What the ChoicePoint and Lexis-Nexis cases show us is that technical security solutions crumble at the feet of social engineering; the George Mason, Georgia Tech, and U of T cases highlight technical weaknesses.

The government database would ostensibly be used to verify identities for people boarding planes and for doing background checks. It seems virtually assured, however, that the information will increasingly be used for more and more routine things, as with Social Security numbers (which initially were only used for Social Security, and have now become a standard identifier). When all the agencies of the federal government, and likely agencies in each of the states, are accessing the information in this database, the risk for leaks increases, simply because of the number of people who have access to the database.

If anyone in any agency in any state can access the information with little or no technical restriction, it is only a matter of time until someone abuses that access and compiles (and possibly releases) lists of information.

It is also likely that the government will share information in the database with companies with which it has contracts (particularly those that require employees to have security clearances). This adds a whole new set of people who have access to the system.

The possibility exists that the government might allow any company to query the database, effectively making it a competitor to companies like ChoicePoint. If this happens, then the same risks that apply to companies like ChoicePoint (and the same problems which came to light with the recent leaks) also apply to the federal database.

The number of people who would have access to the database is a risk in and of itself. Once a certain number of people are given access, and these people are spread across agencies and states, it becomes virtually impossible to control access to the database. For sure, there would be some sort of login and authentication mechanism, and hopefully some sort of logging, but it is easy for someone to either intentionally or inadvertently give away their login information. From there, no amount of logging is going to be able to undo the compromise to the database. Once the information is out in the open, there just is no way to get it "back in the bottle."

Of course, these problems generally apply to a system that would allow real-time access. If the system required the submission of a certain form, with certain signatures, then certainly it would be somewhat easier to regulate access—assuming that the required information is valid, and that it is properly checked before any data is given out. This would just not work for some needs, in particular, checking passengers who have purchased tickets for a flight. There is a certain amount of processing time required for a request, so it would seem natural to batch them; but at what point would the airline have to cut off registration for a flight just so the check could be completed prior to the flight? Given the number of flights and airline passengers, any system like this would likely be overwhelmed and unable to return responses quickly enough in most cases. So it seems the only solution that would work would be a real-time one.

So we have a system that is not going to make us any safer, but will put more of our personal information at risk. It takes us another step closer to an Orwellian-type society, and we're going to stand for this?

Prof. Wetzel moderates wireless security panel
Posted: 2005-02-25 00:00
No comment(s)
Author: Phil Gengler
Section: The Stute

Wireless networks may be the way of the future, but their security is no sure thing. This was the message conveyed by participants of the "Attacks on and Security Measures for Ad Hoc Wireless Networks" panel, on Saturday, February 19.

The panel, part of the annual meeting of the American Association for the Advancement of Science (AAAS), was moderated by Stevens computer science professor Susanne Wetzel and dealt with the security of ad hoc wireless networks.

An ad hoc network is one where every device forwards traffic for every other; unlike traditional networks, ad hoc networks become more efficient as more nodes are added. This type of network is most common with small mobile devices.

In addition to Professor Wetzel, there were four other members on the panel: Markus Jakobsson and XiaoFeng Wang, both from Indiana University, Panos Papadimitratos from Virginia Tech, and Adrian Perrig from Carnegie Mellon University.

Professor Wetzel started things off, explaining the basics of ad hoc networks, as well as their advantages and disadvantages compared to traditional client/server networks. She described ad hoc networks as a "double-edged sword," due to the new opportunities and new challenges these networks bring.

From there, the topics centered mostly on the various types of attacks that could be made on these networks, as well as steps that could be taken to prevent them. Jakobsson focused mostly on stealth attacks, where a malicious node interferes with or intercepts network traffic without exposing itself. He noted the use of cryptography as a potential problem for ad hoc networks: because cryptographic operations are computationally expensive, an attacker can exploit them to mount denial-of-service attacks that exhaust a mobile device's limited processing power.

The most dangerous stealth attack is a "man in the middle" attack, where a malicious node listens to and potentially changes messages that pass through it. A node operating in this manner can gain access to sensitive information being transmitted across the network, and is similar to a "phishing" attack.

Wang focused on using game theory to improve cooperation between nodes on and security of ad hoc networks. The idea is that by "rewarding" nodes that follow the rules, and establishing a reputation system for devices, malicious nodes will be gradually edged out of the network.
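As a rough illustration of how such a scheme might work (this is a hypothetical sketch, not Wang's actual game-theoretic model), a node's reputation could be credited each time it is observed forwarding traffic and penalized when it is caught dropping packets, with low-reputation nodes shunned by the rest of the network:

```python
# Toy reputation tracker for an ad hoc network (illustrative only;
# the constants and eviction rule here are invented for the example).
REWARD = 1.0       # credit for forwarding a packet as expected
PENALTY = 3.0      # cost of being caught dropping a packet
EVICT_BELOW = 0.0  # nodes below this reputation are shunned

class ReputationTable:
    def __init__(self):
        self.scores = {}

    def observe(self, node: str, forwarded: bool) -> None:
        """Update a node's reputation based on observed behavior."""
        delta = REWARD if forwarded else -PENALTY
        self.scores[node] = self.scores.get(node, 0.0) + delta

    def trusted(self, node: str) -> bool:
        """A node is trusted until its reputation falls below the cutoff."""
        return self.scores.get(node, 0.0) >= EVICT_BELOW

table = ReputationTable()
for _ in range(3):
    table.observe("honest", forwarded=True)   # reputation climbs to 3.0
table.observe("malicious", forwarded=False)    # reputation drops to -3.0
print(table.trusted("honest"), table.trusted("malicious"))  # True False
```

Because the penalty outweighs the reward, a node cannot profitably alternate between cooperating and misbehaving, which is the basic game-theoretic pressure such systems rely on.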

Papadimitratos noted that attacker nodes will seek to "hit when it hurts," establishing themselves as reliable nodes and then dropping or intercepting potentially sensitive network traffic. He proposed a system that breaks messages into redundant parts and sends them along different routes.

He noted that if 50 percent of a network consisted of attacker nodes, and no system of redundancy were in place, only 35 percent of traffic would reach its intended destination without resending. With his system, however, this number increases to 93 percent. This has the disadvantage of requiring more transmissions, but "bandwidth is the price we need to pay for security," commented Papadimitratos.

Perrig sought to secure the routing protocols of a network, preventing an attacker node from halting a network by attracting all of its traffic. He broke attacks down into two classes: those by external attackers, who do not have access to any of the network's encryption keys, and those by internal attackers, who do. Stopping external attackers is a simple matter of authenticating each node to the network; this does not work against an internal attacker. An internal attacker could bring down a network by claiming to be the quickest route to all other nodes; nearly all traffic would then be sent to that node, which would not forward it. Perrig's solution would make it impossible for a node to hide its true distance from other nodes.

The talks were followed by a question and answer session, where Perrig explained that much of the focus has been on securing routing and messages, and not on the physical security of the network. This is because the frequencies can be jammed, and no security solution is going to be able to overcome that.

Jakobsson noted that the other attacks, particularly "man in the middle" attacks, pose a greater risk than jamming. Since jamming is an obvious attack, it can be easily observed, as opposed to the silent interception or alteration of traffic, which is stealthier and can also result in information being compromised.

Securing ad hoc wireless networks is one of the research areas for WiNSeC, the Wireless Network Security Center, in the Lieb building here at Stevens. It is also one of Professor Wetzel's research aims.

Your Liberty: Think of the children's privacy
Posted: 2005-02-25 00:00
No comment(s)
Author: Phil Gengler
Section: The Stute

How would you like it if you were tracked everywhere you went at Stevens? For students in the Brittan school district in California, this was a reality. Each student was required to carry a radio frequency identification (RFID) badge that could be used to track his or her movements.

The program was designed to ensure that all students were accounted for by providing school officials with an easy way to tell if a student was missing. It would then be easier to tell which students were cutting class, or perhaps even tell if a student had been kidnapped.

The program generated an intense backlash from parents, who felt that the tracking represented an unwarranted intrusion into the civil liberties of the children. It also prompted action from the American Civil Liberties Union, Electronic Frontier Foundation (EFF), and Electronic Privacy Information Center (EPIC).

The protests resulted in the school district disabling readers in the school's bathrooms, a measure the district must have felt was a compromise.

The program ended when InCom, the company providing the RFID tags and readers, abruptly ended its participation. InCom was actually paying the school for its participation, in the hopes that the company could sell the technology to other schools.

This casts doubt on the true intentions of those behind the program, such as principal Earnie Graham. Rather than acknowledge that students might actually be concerned about their privacy, he said, "I believe junior high students want to be stylish. This is not stylish." What about all the parents who were upset? Are they only concerned that their children do not look "stylish" anymore, Mr. Graham?

Assaults on our civil liberties do not only come from the federal government; state and local governments are perfectly capable of them as well, which I think this case shows. This was a blatant attempt to increase government surveillance of its citizens.

The fact that this was done 'in the name of the children' is disgusting. By requiring attendance at schools, and then requiring students to submit to being tracked, the government is, intentionally or not, making an impression on the children's minds.

It may not be the goal of the Brittan school district to raise a generation of children less averse to being tracked, but that is the effect of policies such as this. It is a step closer to a totalitarian state, where the government knows everything about everyone, including where they are at any time. This is downright un-American, and we must not let it happen.

Babbio Center on track for fall opening
Posted: 2005-02-25 00:00
No comment(s)
Author: Phil Gengler
Section: The Stute

Nearly four years after ground was broken, the Babbio Center is expected to be open for the upcoming fall semester.

The center was originally expected to open last fall, but construction was delayed in its early stages over concerns about the blasting being done to clear the site. This resulted in lawsuits between Stevens and the advocacy group Fund for a Better Waterfront, which were only recently resolved.

Even once it opens, however, the center will not be "complete." The top two floors will not be completely furnished; only the necessary fire and safety items, "whatever we need to get a Certificate of Occupancy," will be installed, according to Hank Dobbelaar, Vice President for Facilities/Support Services.

The Lawrence T. Babbio, Jr. Center for Technology Management will be the new home for the Howe School of Technology Management. The Howe school currently has its offices and classrooms on the third floor of the Morton-Peirce-Kidde complex.

The opening of the parking garage attached to the Babbio Center is another matter. Work on the garage is on hold pending more funding, with no definite timeline for its completion.

Funding is the main obstacle facing the garage, since Stevens already received the necessary variances from the Hoboken Planning Board last year. Once funding is obtained, Dobbelaar said, all that remains is the completion of a final plan, "which should take about a month," as well as obtaining the necessary construction permits from Hoboken.

Ground was broken on the Babbio site in October 2001, making it almost four years to the day between the groundbreaking and its formal opening.