It thwarted the Bureau of Indian Standards’ plan to fix clear criteria for defining a smart city, and came out with a fuzzy ‘Liveability Index’
In reaction to what may be seen as a turf battle between the ministry and the bureau, the ministry decided it would draft its own standards. In November 2016, it released the draft for public comments, and asked the states to give their views on it. The standards were released in June 2017. Only, there were no actual standards to define a smart city. Instead, the ministry had devised the complex Liveability Index to rate and rank cities. The introduction to the index read:
“The Ministry of Urban Development has developed a set of ‘Liveability Standards in Cities’ to generate a Liveability Index and rate cities. The source of the Liveability Standards are the 24 features contained in the Smart City Proposals, which have been grouped into 15 categories. These categories are part of the four pillars of comprehensive development of cities.”
The index is designed not just to measure the outcomes of projects, events and technologies the government has already approved under various Smart City Plans, but to simultaneously promote them.
The bureau’s standards, by contrast, relied purely on assessing the end result of the mission, and its various projects and components. These standards were agnostic to how the targets were achieved, what technologies were used or projects implemented.
For example, when the bureau intended to measure if the city had become safer, it asked for data on the number of police personnel, number of homicides, rate of crime against women, response time of the police to crime scenes, and rate of violent crimes.
In contrast, the urban development ministry’s index asks whether the city has put up surveillance cameras all over.
Read without the fine print, the Liveability Index would only provide a plain rating of cities, say on a scale of one to 100, and nothing more. Users would have to pore over records, which may not be available publicly, to know what the index really measured and how. In contrast, the bureau had suggested an open data platform, where information on all the parameters would be released for the public to freely review, assess and comment on.