This Wednesday, the Phoenix City Council takes up proposed changes to the City code that would regulate the use of unmanned aerial systems—drones—in city parks. Based on some of the public debate I’ve heard on the issue, anything adopted is likely to be based less on science than on a desire to stall a technology we don’t yet understand. But whatever. Somewhere, the FAA chuckles.
(I also have to wonder about the role drone restrictions will play in the STEM gap that already affects urban neighborhoods more than suburban ones. Commentary I’ve heard suggests that urban areas are simply “too congested” for recreational or other drone use, and that drones should be allowed only in the suburbs. So here would be yet another cutting-edge technology kept far away from urban schoolkids. I’m guessing future college freshmen will be less than competitive touting their kite-flying skills.)
This morning I read an interesting essay that explores humans’ convoluted relationship with machines. It may go some way toward explaining our often knee-jerk reaction against these strange contraptions that can do so much that we cannot.
Written by a Boston lawyer, the essay is part of a nationwide project called Future Tense that includes Arizona State University—it’s worth keeping track of. (As they describe it, “Future Tense [is] a collaboration among Arizona State University, New America, and Slate.”)
Attorney John Frank Weaver draws an analogy to humans’ evolving view of animal protections and suggests that a similar approach would benefit us in regard to machines. Like animals, machines—and how we treat them—say a lot about us, and those interactions have moral implications. And it’s not just the “cute” animals that need legal protections, he argues. We also need to safeguard the uglier machines. As he writes:
“[I]n focusing on laws that protect how we socialize with anthropomorphized robots, we need to make sure not to ignore plainer robots. They need legal protections, too. In fact, I have gone so far as to recommend that we should grant them limited legal personhood. It’s not because we should empathize with them—it’s because laws governing interactions with ugly bots could improve their utility and benefit to humans.”
Did someone say drone?
Read the whole piece here.