Smart City. Trick Question?

  • Writer: Joe Conte
  • Aug 24
  • 2 min read

Updated: Aug 31

It's the digital age: smart cars, smart phones, smart TVs. Why not have a smart city?


That may be about as much thought as many government officials give to the question of whether taxpayer money should be spent turning your home into one node in a web of Internet of Things devices.


Time and time again, new technology has been placed on a pedestal and offered as the solution to all of the modern problems we've built for ourselves. Smart cities are no exception. Too much traffic? Have some self-driving cars! Electricity not stable enough? Turn everyone's house into a microgrid! And while you're at it, install some high-definition cameras to collect data around the neighborhood (don't worry, only Jeff the IT guy can access the live feeds. It's totally secure!).


It's easy to make fun of, but the serious critiques of smart cities are entirely valid and plentiful.


Who's going to pay for updating all this tech when it's completely outdated in less than a decade? [1] How are we going to trust corporations with our data when the answer inevitably becomes outsourcing it to the private sector? How does all this tech even help us when the law itself has yet to catch up to the 21st century?


A smart city with dumb policies will do about as much good as a pen with no ink. No amount of tech can supplant knowledgeable human operators and a participating pool of residents. Mainly because, as much as it pains Sam Altman when I say this, technology doesn't understand us. Only we understand us. And we are undeniably biased: a bias that leaks into every tool we've ever made that attempts to use logic or reasoning like we do.


An old IBM training manual famously said: "A computer can never be held accountable, therefore a computer must never make a management decision" [2]. In the age of AI, we seem to have forgotten this. People are being interviewed by AI recruiters, and cities are considering using AI to power crime prevention software. But who will be on the hook when these systems inevitably go wrong? What if either of these systems is found to have an obvious racial bias?


Humanity's greatest strength is not just our ability to use tools, but our ability to discern which tool to use for a given task. Maybe I want AI to write me some basic emails, but I don't need it to "enhance" my favorite actor's performance, nor do I want big surveillance data to be used to adjust the price of a coffee as I walk up to Starbucks.


Joe Conte is a graduate student in Urban and Regional Planning at Rowan University.


References:

[1] Saxe, S. (2019). I’m an Engineer, and I’m Not Buying Into ‘Smart’ Cities. The New York Times.
