Intelligence is Open: Smart City versus Open City

In this paper we explore the impacts, current and potential, that new technologies have on city planning and management, comparing the different ways in which those impacts can be harnessed for the public good, for private profit or for a mixture of both. We argue that smart technologies do not necessarily yield a positive social product, and that the openness of information (in its different levels) plays an important part in maximizing the social product of new technologies applied to urban space. In the first part, we briefly discuss urban complexity and how technology can be used to make cities readable and actionable. In the second part, we analyse three technological (“smart”) initiatives related to urban planning: Waze, Uber and OpenStreetMap, examining the different processes by which information can be turned into use-value (and from there into exchange value). In the third part, we try to understand the economic process by which information is turned into capital through its restriction. We conclude by analysing the potential conflicts between the common good and the turning of information into capital, exploring some of the ways in which open data might be important in the process of making better cities.


Introduction
In this paper we explore the impacts, current and potential, that new technologies have on city planning and management, comparing the different ways in which those impacts can be harnessed for either the public good, for private profit or for a mixture of both. We argue that smart technologies do not necessarily yield a positive social product, and that the openness of information (in its different levels) plays an important part in maximizing the social product of new technologies applied to urban space.

What is information (and what is smart)?
The definition of information can be broad. For the purposes of this article, we will focus on a stricter sense of the word; that is, specific pieces of information that are relevant to the decision-making process in using, interpreting, managing and planning the city. Every decision taken when using and managing a city is based on specific (albeit vast) bits of data. These data constitute a very complex network of information that influences and is influenced by the actions of citizens and urban planners. We will treat the word 'information' as any processed data relating to how people use the city; where they live, where they work, how and when they move from one place to another, where and why they spend their time and their money, and so on. That definition of information is intimately related, then, to the definition of intelligence (or intelligibility). A Smart City (i.e. a city endowed with intelligence) is, then, a city that is capable of gathering, systematizing and applying information related to it. In that sense, any technology that gathers and makes use of information applied to the urban space is smart; and, by extension, any city where this technology is used becomes smarter by this use. Smart technology is not necessarily open; that is, some forms of it gather and process information for the exclusive use of a limited number of individuals or companies. We will see why this is relevant throughout this paper.

What is open?
The definition of 'open' is not absolute. Information can be accessible, but in a selective and unsystematised way; it can be downloadable, but readable only with the use of proprietary software; it can be openly available but in such complexity that it makes it difficult for regular people to understand and act upon. The Open Data Institute was cofounded by Sir Tim Berners-Lee to help create a framework of knowledge over which open data can be shared and systematised 1 . It has proposed the concept of the data spectrum that ranges from 'open' to 'shared' to 'closed', with given examples falling along any point of that spectrum (Broad, 2015). It has also proposed the '5 Star Open Data' scheme, where data can be described as being open in some level from one to five stars. We will not attribute a specific rating of openness for each case described in this paper; suffice to say that data can range from completely closed (inaccessible to the public) to completely open (accessible, readable, downloadable, interconnected), falling anywhere along that spectrum.

Use and exchange value in urban space
The process of gathering and systematizing urban information has played an important role in human development since the beginning of history. The citizens of Jane Jacobs's fictional first city, New Obsidian (Jacobs, 1969), probably took decisions based solely on their personal experiences, but as cities grew, each citizen became less capable of individually encompassing their full complexity. Tools through which some of that complexity is gathered, systematized, presented and acted upon are the basis of any action of urban planning and management. Maps are at the basis of these sets of tools; zoning, land uses and mobility systems are some of the different planning tools that are discussed based on maps and, conversely, have direct influence on them. The process of mapping cities is also a political tool: favelas (slums) in Brazil were notoriously left out of most official maps at least until the eighties (Magalhães, 2013). If they were not politically relevant for the governmental institutions making the maps, they did not have to be seen; and if they were not seen, they did not have to be acted upon.
Like other commodities, the exchange value of information varies according to its availability (Tregarthen & Rittenberg, 2000). The complexity of the process of building comprehensive and updated maps means this is an industry with a tendency towards oligopoly, since it is not feasible (or collectively efficient) to have several companies doing the same, massive amount of work. Navteq, until recently the largest digital mapping company, dedicated roughly half of its revenues to generating, expanding and updating its map database between 2002 and 2006 (Navteq, 2007). If there had been double the number of companies in this industry with similar commercial reach, Navteq's expenditures in database building would not have changed significantly, but its revenue would be roughly cut in half (assuming proportional market shares). The profit margins for the industry would virtually disappear. Even though it would probably serve a broader range of customers and, therefore, have an increased use-value overall, competition and the diminishing marginal exchange value (Jain, 2010) of the information being sold would very quickly absorb any marginal profits brought on by these new companies.
Even though smart technologies are often portrayed as uncontroversially positive and desirable, recent literature on smart cities has started to question this view, such as Against the Smart City (Greenfield, 2013) and Smart Cities (Townsend, 2013). This is not to say that smart technologies are necessarily bad. What we propose, instead, is that these technologies be analysed as to the amounts of public and private good that they generate. A starting premise is that the social product of each initiative be positive, with the ultimate goal of it being the best possible. This means, for instance, that if a technology is profitable and generates no negative externalities, it should not be opposed; and if it can generate positive externalities at the cost of some of that profit, this should be encouraged (through laws, regulations or direct actions). To that end, we present, in the following section, three case studies of smart technologies to understand some of the processes through which they operate and generate use and exchange value. They are neither competitors nor directly comparable, but they all have potential impacts on their users and they all generate significant externalities. Each study is presented with four subsections: a brief introduction; an analysis of the level of openness of each platform; a case study presenting data about each of them; and a comparison of the use and exchange values generated by each platform. The analysis does not intend to be exhaustive and should be deepened in further studies; furthermore, its accuracy depends on the availability of relevant information in each case, and should, therefore, be adjusted as that information changes and becomes more accurately and widely available. Our purpose is, rather, to understand the different incentives that different platforms offer in the process of converting information into use value.

OpenStreetMap: not-for-profit, free and crowdsourced mapping
The OpenStreetMap (OSM) initiative started in England in 2004. Its creator, Steve Coast, inspired by the way Wikipedia worked, created a collaborative project to put user-generated GPS data together (Coast, 2014). In 2006 the OpenStreetMap Foundation was established to promote the development of OSM maps. By 2008 the project had mapped about 29% of the area of England, where it started, with dense areas such as the city of London covering as much as 80% of the Ordnance Survey's database, the British government's official database (Haklay, 2010). A year later, in 2009, coverage in England had increased more than twofold, reaching 65% (Neis & Zielstra, 2014). In 2009, OSM's database surpassed TomTom's in Germany (Neis, Zielstra, & Zipf, 2011). As to accuracy, a study published in 2012 showed that OSM's data provided significantly shorter routes, due to broader or more precise coverage, than Navteq's or TomTom's databases in the four cities surveyed (Miami, San Francisco, Berlin and Munich) (Zielstra, 2012). A recent study comparing OSM's database to that of ATKIS, the German authority for topographic-cartographic information, found that in the surveyed area, in southern Germany, OSM's maps showed a completeness of over 80% and a correctness of over 90% for urban areas (Dorn, Törnros, & Zipf, 2015).

Openness
The 'opening up' of the mapping industry brought about by the widespread use of personal devices with GPS and an internet connection, starting in 2007, had two significant impacts. For one, it severely hurt the map licensing business: TomTom's revenue from licensing fell by 60% from 2009 to 2014 (TomTom, 2010, 2015). Navteq reported a net income of US$ 110 million for the fiscal year of 2006, its last annual report before being acquired by Nokia. By the second quarter of 2012, Nokia's Navteq operation was reporting operating losses close to € 100 million per quarter (Dediu, 2012). The second, and most significant, impact was the realignment of the process of mapping, comparable to the 'revolution' brought about in the industry with the opening up of GPS technology in 1983.
Geospatial information went from being largely lost in the abstract, during medieval times, to being processed and used mostly by governments, with their political incentives and biases. In the eighties, with the opening up of GPS technology, it became a commodity, which was captured and turned into exchange value. Then, around 2009, it began its move towards becoming a common good. While it is hard to put its social product in numbers, nearly no one argues against OSM. One could try to make the case that open data has cost the profits and jobs of the proprietary companies that dominated the sector until 2009. TomTom issued a statement in May of 2012 arguing against open source mapping, writing that 'mapping errors can be extremely dangerous' and citing 'recent studies' that have highlighted some of these dangers, but without pointing to any source. The statement featured a world map made of coins. It has since been pulled from their website, but is still accessible through the Internet Archive's Wayback Machine 2 (Internet Archive, 2015). There seems to be a consensus that the social benefits far outweigh the costs and lost profits of the move towards open sourced mapping.

Case Study
The fact that maps lost their market value is not to say that they lost use-value. On the contrary: if their use was, up to that point, restricted to companies that were willing and able to pay millions of dollars for their licensing, OSM's open licensing 3 meant that a vast array of platforms could be built over it, for free. That meant that access to current and comprehensive geospatial information was no longer restricted to companies and institutions that had enough commercial or governmental interests to justify buying a use license, but open to everyone. These two movements were, then, in opposite directions: as geospatial information lost its market value, it became available and useful for an increasingly wider range of people. In a sense, then, maps have slowly shifted from one end of the diamond-water paradox 4 spectrum to the other. As they became more widely available and more useful, they lost market value. The impact was more clearly felt in areas with little commercial appeal, especially poorer regions and countries. OSM's database was much more precise in richer areas than poorer ones in the first few years, as the first users were generally wealthier and mapped predominantly their own surroundings (Haklay, 2010). In the last few years, though, a host of humanitarian projects, as well as increasingly cheaper GPS devices and computers, have mapped poorer areas with more detail than governments and companies were ever able to do. After the 2010 earthquake in Haiti, several hundred users joined forces to map Haiti's main cities based on GPS data and aerial photographs with the objective of helping citizens and humanitarian institutions; as a result, the number of map nodes in Port-au-Prince increased 9-fold. Later, in 2013, after typhoon Yolanda hit the Philippines, over a thousand users again joined efforts to map the affected areas, with cities such as Tacloban having their node content increased by a factor of 11.5 (Palen, Soden, Anderson, & Barrenechea, 2015). The changes the mapping guidelines and interface went through after the experience in Haiti are as interesting as this collective mapping effort, and their effect could be felt in the Philippines three years later; this is the topic of 'Success & Scale in a Data-Producing Organization: The Socio-Technical Evolution of OpenStreetMap in Response to Humanitarian Events' (ibidem).
So not only did OSM serve as a platform over which people volunteered efforts to map socially fragile areas, but it also learnt from its experience and could do a better job the next time it was needed. Figure 1 shows the level of detail provided by each of the four largest digital map databases currently available in a poor area of Port-au-Prince, Haiti. The crowdsourced maps, on the bottom, show much more detail in these areas, with OpenStreetMap showing a level of detail since the aftermath of the 2010 earthquake that the commercial initiatives, on top, had not incorporated several years later.

Use value vs exchange value
We identify, therefore, two separate moments in the mapping industry. First, between 1983 and the early 2000s, mapping became an important industry as the two big companies in the sector started building digital maps. Even though the marginal value of maps decreased, the total value of the industry increased; then, as the process of mapping became easier and accessible to the general population, the market value of mapping information decreased accordingly, to the point where its marginal value became negative (i.e., the entire industry lost market value). This is explained by the economic principle that the exchange value of one unit of a commodity is equal to the marginal value of the last unit added to the market (Jain, 2010). That means that, regardless of how much investment was made to build an industry, the total exchange value of that industry's output is the multiplication of its total product by the exchange value of its latest addition. In this case, between 1983 and the early 2000s, the process of converting mapping information from the abstract into usable and comprehensive databases was costly and concentrated in the hands of the two companies that had the resources to do it. Because their maps had huge market value, they were able to finance their operations with large profit margins. The shift in the industry brought about by OpenStreetMap and other open mapping platforms meant that, with enough users, the process became increasingly cheaper until it became virtually free. With the marginal cost of mapping nearing zero, this meant not only that new information had a diminishing market value, but also that the existing databases lost their exchange value.
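The principle described above — that an industry's total exchange value is its total product multiplied by the marginal value of the latest addition — can be illustrated with a toy model. The linear demand curve and its parameters below are our own illustrative assumptions, not data from the mapping industry:

```python
def marginal_value(quantity, intercept=100.0, slope=1.0):
    """Illustrative linear demand curve: the value of the last unit
    supplied falls as total supply grows."""
    return max(intercept - slope * quantity, 0.0)

def industry_exchange_value(quantity):
    """Total exchange value = total product x marginal value of the
    latest addition, as described in the text."""
    return quantity * marginal_value(quantity)

# As output grows, total industry value first rises, then collapses:
for q in (20, 50, 80, 100):
    print(q, industry_exchange_value(q))
```

In this sketch, total value peaks at an intermediate level of supply and falls to zero once the marginal unit is worthless, which mirrors the trajectory described in the text: as crowdsourced mapping pushed the marginal cost (and price) of geospatial data towards zero, the existing databases lost their exchange value even as their use-value kept growing.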

Waze: a private navigation software 'paid for' with volunteered information
Another open-source mapping platform, called Freemap, was founded in Israel in 2006. Its founder, Ehud Shabtai, stated that he was 'tired' of having to pay thousands of dollars for map data and decided that 'the only way to develop something free is by creating a community that develops free maps by itself' (Rom, 2014). Its original Terms of Use stated that the 'aim of the project is to create, by the community users, a free digital database of the map of Israel, and to ensure its free content, update and distribution, for non-commercial usage, as convenient as possible'. Besides the map database, the community also forked 5 an open-source software, Roadmap, to launch a navigation app for mobile devices. In 2008, the creators of Freemap raised investment funds, started a corporation and renamed the project to Waze. The Waze navigation software remained open source, but the maps, though openly accessible and editable through a website interface, were made property of the company. As smartphones with data connections became more popular, Waze began capturing traffic speed information from users and integrating it into its routing server. Though the navigation app was bound to remain open by its open source code, which carried a GPL license 6 , the app was allegedly rewritten from scratch in 2012 and turned into closed source. While remaining free (though moving towards the 'closed' end of the data spectrum), Waze built a user base of nearly 50 million users by 2013, when it was acquired by Google after a bidding war with Facebook and Apple (Cohan, 2013). Waze's main asset was not its geospatial information, which had lost most of its market value (and of which Google had plenty); its main asset was its user base with all the traffic information they provide, passively or through specific reports (idem).

Openness
Waze currently sits somewhere in the middle of the spectrum between open and closed data. Any user can edit its maps, though with some restrictions: new users can only edit the area inside a radius of one mile around the paths they have driven while using Waze. Editors are ranked from level 1 through 6, according to their experience and number of edits, and some areas (such as important avenues and highways) are locked so that they can only be edited by users in higher levels. Any user passively uploads information about their average speed while they are driving; that information is then processed and used as reference so that the routing server can give faster routes, depending on the time of day and day of the week. Users can also report and comment on issues such as accidents, potholes, construction sites and so on. A lot of that information can be accessed through their website, but only on a user level; that is, it is possible to check route times between two different points at different times of the week, but not to download and process that information systematically 7 .
What Waze does, then, is capture traffic-related information from the abstract and turn it into use-value. It also records that information in a closed environment, which Google then uses to feed its own applications. By opening up some of the information it gathers, Waze becomes useful to users, who provide the application with more data; and by keeping some of that information closed, it provides its parent company with information which it can monetize in a number of ways (such as displaying ads, learning about its users' behaviour and so on). Waze also exchanges information with some public authorities it has deals with, such as the city of Rio de Janeiro, but the exact content and extent of that sharing is not disclosed (Machado, 2013).
The information Waze deals with is of vital importance in modern urban planning. Understanding urban mobility has been at the core of planning at least since Johann Heinrich von Thünen, a German economist, developed in the early 19th century different models around the economics of land use, transportation costs and marginal productivity (von Thünen, 1826). Von Thünen, however, only considered physical distances. Now traffic, transportation modes and types of roads are significant in determining the amount of time spent to overcome a given distance. The actual travel distances inside cities are an essential piece of information to accurately gauge density limits, design public transportation systems and decide where to build public infrastructure, such as schools, hospitals and parks. Several mathematical models have been proposed to try to estimate how traffic speed varies according to road characteristics, time of day, number of vehicles and so on; but it is nearly impossible to correctly estimate some of the variables at a given moment, since they can vary greatly depending on time of day, day of the year, number of cars sold, specific events, etc.
To that end, some cities have invested in the installation of traffic sensors aimed at measuring traffic speed and volumes in major roads; the Georgia Navigator system, for instance, has dedicated US$ 140 million in infrastructure to monitor and manage traffic conditions on 90 miles of highways in the metropolitan area of Atlanta (Excellence, 2008). These sensors are usually limited to main roads and provide data directly to authorities in charge of traffic management. The opportunity that Waze offers is to measure and record real time traffic data in entire cities, making it possible not only to estimate actual time distances inside a city, but the way these distances evolve in time. It establishes a two-way relationship with its users, by measuring traffic speed data at the same time as it distributes users through the fastest available route at any given time. This is particularly important because the marginal impact of each new car in a particular route increases exponentially beyond its saturation point. That is: if there are two possible routes between two points (route A and route B), where A is saturated but B is not, rerouting 5% of the cars from A to B might decrease travel times in A by 8 or 10% while not having a significant impact on travel times on route B. This sort of redistribution of traffic constitutes a 'soft layer' of urban planning that might play a significant role in making cities more efficient with little or no centralized investments.
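The intuition behind this asymmetry can be sketched with the Bureau of Public Roads (BPR) volume-delay function, a standard model in traffic engineering. Waze's actual algorithms are not public, and the free-flow times, capacities and volumes below are illustrative assumptions, not measured values:

```python
def travel_time(free_flow, volume, capacity, alpha=0.15, beta=4):
    """BPR volume-delay function: delay grows steeply once volume
    approaches and passes the road's capacity."""
    return free_flow * (1 + alpha * (volume / capacity) ** beta)

t0 = 10.0                      # free-flow time (minutes) on both routes
cap = 1000.0                   # capacity (vehicles/hour) on both routes
vol_a, vol_b = 1100.0, 500.0   # route A saturated (v/c = 1.1), B half-empty

shift = 0.05 * vol_a           # reroute 5% of A's traffic onto B
before_a = travel_time(t0, vol_a, cap)
after_a = travel_time(t0, vol_a - shift, cap)
before_b = travel_time(t0, vol_b, cap)
after_b = travel_time(t0, vol_b + shift, cap)

print(f"A: {before_a:.2f} -> {after_a:.2f} min")
print(f"B: {before_b:.2f} -> {after_b:.2f} min")
```

Because the delay term is a fourth power of the volume-to-capacity ratio, removing traffic from the saturated route A saves far more time than the same traffic costs on the uncongested route B, which is why this kind of redistribution tends to yield a positive sum for all drivers.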
Of course, reducing travel times is not an absolute priority in urban planning. It is part of the complex network of costs and benefits that include environmental, historical, social and other variables. In the case cited above, route B might be of historical or environmental significance, for instance, and rerouting cars from A to B might not be in the city's interest, even if it would make for a more efficient traffic network. That is why the openness of platforms such as Waze is important. If Waze kept all of its data closed, charging a subscription fee, its sole incentive would be to save its users' driving time regardless of any other factors. Because it is relatively open, it can serve as a backdrop for citizens, drivers and public authorities to negotiate and find solutions together on a case-by-case basis. Besides, being free attracts users, which makes the platform increasingly accurate.

Case Study
To illustrate Waze's capability of mediating conflicts in proportion to its openness, we have studied such a case using Waze's data. Bel Air is one of the richest residential areas in Los Angeles, but it is situated close to the busiest interstate highway in the USA, the I-405. Its residents have recently been complaining that, as Waze becomes more popular, more cars have been using Bel Air as a shortcut during peak times (Roberts, 2015). Without sufficient data, this type of issue can easily be reduced to a power struggle, with either side using arguments based on ideology, depending on whether they are interested in defending the rights of local dwellers or the efficiency of the city's traffic network in general. Actual data, therefore, is a key aspect of settling such matters.
Using Waze's website, we have measured average travel times between a point on Ventura Boulevard, to the north of Bel Air, and Sunset Boulevard, to the south, both through the I-405 and through Bel Air. The results are shown in Figure 2. We can see how traffic is at its peak between 07:30 and 09:00 for the southbound traffic, and between 16:30 and 18:30 for northbound traffic. It is also visible how traffic on the main route (I-405) is substantially faster through the day, but spills over to the alternative route (Roscomare Rd.) at peak times, particularly for the northbound traffic at the end of the afternoon. For the southbound traffic, however, average travel times on the alternative route did not change considerably at the peak hours of the morning, increasing by only 5.9% when compared to average travel times through the day. Northbound travel times on that route increased by 30.5% at their peak.

Figure 2. Average travel times (mm:ss) between Ventura Blvd. and Sunset Blvd. throughout the day, using either the main route (through I-405) or cutting through the residential area of Bel Air (through Roscomare Rd.). Source: Produced by the author using data manually extracted from Waze's website on September 1st, 2015.
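The percentage increases above are computed as the average peak travel time relative to the all-day average. A minimal sketch of that computation follows; the travel-time series used here is hypothetical, not our measured Waze data:

```python
def peak_increase(times, peak_start, peak_end):
    """Percentage increase of the average peak travel time over the
    all-day average travel time."""
    day_avg = sum(times) / len(times)
    peak_avg = sum(times[peak_start:peak_end]) / (peak_end - peak_start)
    return (peak_avg / day_avg - 1) * 100

# Hypothetical series of travel times (seconds) across 24 slots:
# a flat day with a four-slot morning peak.
series = [300] * 20 + [390] * 4
print(round(peak_increase(series, 20, 24), 1))  # peak vs. day average, in %
```

The same function applied to the measured series for each route and direction yields the 5.9% and 30.5% figures reported above.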

AESOP / YOUNG ACADEMICS NETWORK
Upon examining the map, we found that a user had edited one of the roads inside the alternative route, adding a turn restriction between Longbow Drive and Mulholland Drive for the time segment between 07:00 and 09:00. This restriction meant that Waze's routing server could not route traffic through Roscomare Road at those times, so its users would be instructed to either remain on the I-405 or take a longer alternative route. We posted about this restriction at Waze's Community Forum; one of the Country Managers for the USA drove to the area to check and found that there was no sign posting that restriction, meaning that it was probably made up in an attempt to divert rush hour traffic. He corrected the map error and set a lock, restricting changes to higher-level users. To measure the impact of the change, we started monitoring average travel times for one segment in each of the routes studied above: a southbound segment of the I-405 and a parallel segment that would have been avoided by through traffic during the restriction 8 . The results are shown in Figure 3.
While the peak for the I-405 segment remained relatively stable in the days following the map correction, with a slight increase of 5.9%, the alternative segment started to climb right after Waze's maps were updated, on September 4th, and steadily climbed to a rise of 80.7% after 18 days. Waze's routing server uses recent travel times to predict traffic flow and guide its users through the fastest routes, so it is likely that estimated travel times vary according to a moving average. This suggests that the made-up restriction that was in place had a significant role in reducing the amount of traffic through Bel Air during the morning rush hour. We cannot, however, assert this as a definitive conclusion without access to the raw data in Waze's servers, or their methodology for calculating average travel times, to assess the significance of our findings and what other elements might have had an impact on them.

The consequences of different levels of openness can be seen in this experience. First, the two-way relationship established between drivers allows traffic to be redistributed along existing city streets. This process, considering the exponential nature of the marginal cost of each new car on busy roads, tends to yield a positive sum; that is, taking one car from a busy avenue and routing it through an empty street is, ceteris paribus, beneficial to every driver. This benefit from Waze needs only a slight amount of openness; it can happen as long as drivers' mobile devices receive and send information to Waze's main server. This information does not need to be readable by other machines or by humans, meaning that it can happen even if it is less open than the one-star rating of the 5-Star Open Data scheme mentioned earlier.
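The speculation that Waze's estimates follow something like a moving average can be illustrated with an exponentially weighted update. The weight, the 100-second baseline and the 180-second observations below are invented for illustration; Waze's actual method is not public:

```python
def update_estimate(estimate, observed, weight=0.2):
    """Exponentially weighted moving average: blend the newest observed
    travel time into the running estimate."""
    return (1 - weight) * estimate + weight * observed

# A segment reopens to routing and observed travel times jump from a
# 100 s baseline to 180 s; the server's estimate climbs gradually,
# consistent with the steady 18-day rise we observed after the correction.
estimate = 100.0
for _ in range(18):
    estimate = update_estimate(estimate, 180.0)
print(round(estimate, 1))
```

Under such an update rule, a sudden change in real conditions appears in the estimates only gradually, which would explain why the alternative segment's reported travel times kept climbing for more than two weeks after the map was fixed.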
A further level of openness allows us to track current and historical information about travel times for any city with enough users; this has allowed us to analyse the dynamics of traffic in Los Angeles and to assess how different actions and policies influence the evolution of travel times in the studied areas. Although this information is not directly downloadable (which would make it 2-Star), Waze's servers allow for enough queries that we are able to extract data from them into spreadsheets through simple computer scripts. This allows citizens and urban planners to conduct studies and form opinions on topics that would previously be left to either expensive research or to guesswork.

A third level of openness allows users to edit maps to add roads, edit their attributes (such as type of road, routing priority and so on), set restrictions, correct errors and update data in near real-time. Even though the made-up restriction we found in Bel Air seems to have been added by an individual user, Waze's map and forums can be used as platforms to mediate negotiations between different actors, making it possible to adopt such policies to manage interests that would otherwise have to be settled in the political and ideological spheres.

Use value vs exchange value
We have seen how, in its current state, Waze can play an important role in establishing two-way information exchanges between different actors in the city, helping advance the role of the 'soft layer' of urban planning. This contribution is limited, however, from the point where Waze closes its data. We cannot, for instance, access and download historical data. Waze's algorithms for converting past drives into current predictions are not known, and so the significance of research like the one we have presented is limited. Other statistical information, such as traffic flow, origins and destinations of each drive or the number of accidents and other events reported by region, is not disclosed. Disclosing this kind of information in a systematic way could go a long way in helping make cities better, but it could also hurt Waze's current business model, which relies on the exclusivity of the information it harnesses to make use of it for its parent company, Google 9 .
To sum up, inasmuch as it is open, Waze tends to generate a positive social product for cities; and inasmuch as it is closed, it has the potential to turn that social product into private profit, even to the point where it potentially yields a negative social product for the city. If, for instance, Waze stopped providing traffic information openly and began charging for its services, there would be an incentive for it to route its paying customers through faster routes regardless of the external effects this might have, and no one would be able to measure or influence this process. Waze seems, from this analysis, to be at a point of the openness spectrum where it generates much more profit than OpenStreetMap, but less profit than it could if it were completely closed.

Uber: private drivers as a policy for urban mobility
Founded in 2009, Uber's platform connects passengers with drivers for hiring taxi-like trips inside certain urban areas. Its stated advantage over current taxis is that any citizen can become an 'independent contractor', setting their own conditions such as working hours, driving areas and who to pick up (Uber, 2015b). It also bypasses the current structure of call centres run by different companies, putting potential passengers in contact with drivers that are currently close to them and saving idle time and infrastructure costs. Since it uses a web-based information platform to exchange, in real-time, information about where passengers and drivers are currently located, mediating their relationship and storing a wide range of information about every trip, it fits our definition of smart technology.
On-demand transportation is an important part of a city's urban mobility platform. Since collective transportation is inherently generalist and does not usually offer direct, door-to-door trips, its usefulness depends on the availability of connected stations between the user's origin and destination, and on vehicles running around the times the user is looking to make the trip. While this is usually the case for most trips in dense urban areas, the occasional need for specific, direct trips between places, or at times when public transportation is not sufficiently available, makes for-hire car services an important complement to mass transit systems. As such, they are often regulated by local laws that set rates, service conditions and obligations, car specifications and so on. These policies have short- and long-term impacts on how citizens use public and private transportation modes. Several studies have assessed the elasticity 10 of taxi demand with regard to fares: Schaller (1999) found the elasticity in New York City to be -0.22 with regard to drivers' revenues, and somewhere

between -1.05 and -1.22 with regard to miles driven 11 . This is significant because it means, for instance, that a 10% decrease in taxi prices tends to increase drivers' revenue by 2.2% (due to increased demand) and the number of miles driven by a little over 10%. Because taxi services compete, to some degree, with all other forms of transportation, their availability and fare rates are often the object of public regulation. Following the stock market crash of 1929, there were as many as 30,000 private drivers in the streets of New York City by 1930 (Van Gelder, 1996). Some of them would drive 16 hours a day without picking up a single passenger, which led to the passing of the Haas Act in 1937, regulating the taxi industry and limiting the number of licences to 16,900 (Mathew, 2005). That number went down as some drivers failed to pay annual fees, and has not changed significantly since.
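The arithmetic behind these figures follows directly from the point-elasticity definition. The sketch below simply applies Schaller's reported elasticities to a 10% fare cut; it assumes, as a simplification, that the elasticities hold linearly over that range.

```python
def pct_change(elasticity, price_change_pct):
    """Point elasticity: % change in the response variable
    implied by a given % change in price."""
    return elasticity * price_change_pct

fare_cut = -10.0  # a 10% fare decrease

revenue = pct_change(-0.22, fare_cut)   # drivers' revenues: +2.2%
miles_lo = pct_change(-1.05, fare_cut)  # miles driven, lower bound
miles_hi = pct_change(-1.22, fare_cut)  # miles driven, upper bound

print(f"revenue: {revenue:+.1f}%")
print(f"miles driven: {miles_lo:+.1f}% to {miles_hi:+.1f}%")
```

The negative elasticities mean demand moves opposite to price, which is why a fare cut raises both revenue and miles driven.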
Uber started operating in 2011, offering services comparable to those of taxis. Drivers sign up as 'contractors' and drive their own cars, accepting or rejecting rides that are offered to them on smartphones provided by Uber (Fagin, 2014). Its legality has been disputed and has not yet been settled in most cities. Two main legal issues are usually cited: one is whether the service itself is legal, since taxi companies are regulated by specific laws; the other is whether its drivers should be considered employees. For the purposes of this paper, those issues are not immediately relevant; we will study Uber's social and economic impacts on cities and analyse how they relate to the platform's level of openness.

Openness
The consequences of Uber's operations in a city are not easy to measure, since it releases almost no data. The total number of active drivers in New York City remained unknown until 2015, when an article, funded by Uber, was published by Hall and Krueger (2015). Called An Analysis of the Labor Market for Uber's Driver-Partners in the United States, its stated focus is the labour market, with a special emphasis on comparing Uber drivers' earnings, demographics and work dynamics to those of taxi drivers. The mere comparison between Uber and the current taxi industry is misleading, however, since it leaves out externalities that might be significant for urban planning. The total number of drivers has an impact on wait times, which might restrict or induce demand; a rise in fares tends to attract more drivers, while lower rates tend to attract more passengers. General public transportation costs influence car ownership in the long run, with an elasticity of between 0.1 and 0.3 (Litman, 2013), and car ownership naturally affects demand for parking space, which, in turn, influences housing prices in urban areas. Uber can, then, have a positive effect, if it helps lower housing costs and broadens the range of citizens who can afford housing; or a negative one, if it draws people out of the public transportation network and into cars, making traffic worse for everyone. Actual data on Uber usage is, for this matter, of the utmost importance. As in The Problem of Social Cost (Coase, 1960), externalities play a very important part in understanding the issue.

Case Study
New York is a city of importance to this analysis, not only because it is one of the first cities where Uber started operating, but also because it releases most of its data through its NYC Open Data program 12 . This allows us to better estimate the complex consequences of Uber's operations. It is also the only city for which Uber has released statistics on hourly rides, which it did as a way to fight Mayor Bill de Blasio's proposal to cap the number of drivers

of ride-share companies (Tepper, 2015). Since Uber itself releases very little data, we will work from some assumptions to understand not necessarily its actual impacts, but its potential impacts on urban planning.
The most consequential policy Uber can adopt is its pricing model. Lower prices induce demand, while higher prices attract more drivers. Because of the exponential nature of the marginal impact that each new car has on traffic, two variables are essential in estimating Uber's impacts: the number of active drivers at any given time, and the proportion of trips that were induced by Uber (i.e., excluding trips that would still have happened in taxis or private cars). According to Uber's data, there is an average of 1,675 Uber cars on the streets of the Central Business District (CBD) of New York, with a peak of 4,510 for June and July of 2015 (Uber, 2015a). An article published by The Economist (2015), based on leaked data, estimates the proportion of trips stemming from induced demand at about 13% in the CBD. That is almost certainly an underestimation for rush hour, since Uber's rides are more concentrated in those hours than taxis' (Bialik, Flowers, Fischer-Baum, & Mehta, 2015). If we assume 13%, Uber adds the equivalent of 586 new taxi medallions to the CBD. A model developed by Charles Komanoff, called the 'Balanced Transportation Analyzer' (BTA), suggests that this addition results in a decrease in traffic flow of about 3.9% in the area (Komanoff, 2015). Though it may not seem like much, the model estimates that this equates to an aggregate social cost of US$ 260 million per year in wasted hours 13 , increased crash damage costs, air pollution, decreased bus speeds and a number of other factors that the BTA takes into account. If these numbers are correct, each of the estimated 586 Uber cars that induce demand generates US$ 30.35 per hour in gross revenue for its driver and US$ 7.59 per hour for Uber, while imposing externalities equating to a loss of US$ 50.65 per hour 14 . In other words, under these conditions, Uber costs more to society than it provides to its drivers and users, yielding a negative social product.
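These figures can be traced through a back-of-the-envelope sketch that simply reproduces the arithmetic above; all inputs are the estimates already cited (Uber's peak car count, The Economist's induced-demand share and the BTA's per-hour figures), not new data.

```python
peak_cars = 4510       # Uber cars in the NYC CBD at peak (Uber, 2015a)
induced_share = 0.13   # trips that would not otherwise happen (The Economist, 2015)

# Medallion-equivalents added by induced demand
induced_cars = round(peak_cars * induced_share)

driver_rev = 30.35     # gross revenue per induced car-hour, US$
uber_rev = 7.59        # Uber's cut per induced car-hour, US$
externality = 50.65    # BTA estimate of social cost per induced car-hour, US$

# Net social product per induced car-hour: what the trip generates
# for driver and platform, minus the externalities it imposes.
net_social_product = driver_rev + uber_rev - externality

print(induced_cars)                  # 586
print(round(net_social_product, 2))  # -12.71
```

A negative result means each induced car-hour destroys more value for the city than it creates for Uber and its drivers combined.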
Furthermore, Uber's pricing scheme differs from that of taxis. Taxis only charge a time-based fee when the vehicle is stopped or moving below a certain speed. Uber's model also charges a time-based fee, but it is lower and applies whether the car is moving or not. This means that, as traffic gets worse and cars spend more time stopped or moving slowly, Uber becomes comparatively cheaper than taxis. A 7-mile, 20-minute ride in NYC, for instance, currently costs US$ 24.72 (including a 15% tip). The same ride with Uber would cost US$ 26.05, or 5% more. If the same trip took 30 minutes instead, due to traffic delays, the taxi fare would rise to US$ 31.47, while Uber's would cost US$ 30.05, or 5% less. This creates a dual, perverse incentive: as Uber gets more popular, traffic gets worse; and as traffic gets worse, Uber becomes comparatively cheaper than taxis and thus gains more market share. It could be argued that the lower fares might lead drivers to avoid working during rush hour, resulting in longer waiting times and lower demand. However, Uber often offers a 'rush hour guarantee', covering an income of US$ 20 or 25 per hour if drivers made less than that in the period, as long as they have driven a certain number of hours 15 .
13 This takes into account the total number of hours saved by those who benefit from shorter wait and trip times due to the added cars. 14 Komanoff's spreadsheet is massive and uses several different data inputs to reach results like this one. It can be downloaded, studied and tuned according to different premises. As more data becomes available, it tends to become more accurate; the fact that its author makes it available as a spreadsheet, rather than publishing cherry-picked results according to his ideology, is a good testament to open data in general. 15 Although this policy is not publicly posted on Uber's website, third-party websites often reproduce the e-mails announcing such guarantees, such as http://www.driveubernj.com/amrush/ (accessed on 28/09/2015).
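The structural difference between the two pricing models can be sketched numerically. The rate values below (base fares, per-mile and per-minute charges, and the split between 'slow' minutes and total minutes) are illustrative assumptions, not Uber's or the taxi commission's actual published rates, so the fares will not exactly match the figures quoted above; what the sketch shows is the crossover itself: as delay grows, Uber's fare falls below the taxi's.

```python
def taxi_fare(miles, slow_minutes, base=2.50, per_mile=2.50,
              per_slow_min=0.50, tip=0.15):
    """Taxi meters charge time only while stopped or crawling."""
    fare = base + per_mile * miles + per_slow_min * slow_minutes
    return fare * (1 + tip)

def uber_fare(miles, minutes, base=3.00, per_mile=2.15, per_min=0.40):
    """Uber charges a (lower) per-minute rate for the whole trip."""
    return base + per_mile * miles + per_min * minutes

trip_miles = 7
# (total minutes, minutes spent stopped or crawling) for the same trip
for total_min, slow_min in [(20, 5), (30, 15)]:
    t = taxi_fare(trip_miles, slow_min)
    u = uber_fare(trip_miles, total_min)
    print(f"{total_min} min: taxi ${t:.2f}, uber ${u:.2f}, ratio {u / t:.2f}")
```

With these assumed rates, Uber is slightly more expensive on the free-flowing 20-minute trip but clearly cheaper once delays stretch the same trip to 30 minutes, reproducing the perverse incentive discussed above.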

The same dynamic applies to all but one city in a list of 23 American cities for which we have compiled official taxi and Uber rates. Furthermore, this policy is often more pronounced in cities that have worse traffic. In Washington, D.C., for instance, where commuters spend a yearly average of 82 hours stuck in traffic (Shrank, Eisele, Lomax, & Bak, 2015), the Uber-to-taxi fare ratio 16 drops from 0.77 when there is no traffic to 0.65 with a half-hour delay for the same trip, a drop of 15.6% in Uber's relative price. Conversely, in Cleveland, where yearly delays due to traffic amount to 38 hours per driver (55th in the country, compared to Washington's 1st place), Uber's relative price drop under the same conditions is just 4.3%.
In the table below, we have compared the cost of an Uber trip to that of a taxi trip in free-flow conditions for a 7-mile, 20-minute trip, and then multiplied the trip duration by the Travel Time Index 17 . We then compared the latter ratio to the former to measure how much cheaper Uber gets, compared to taxis, during rush hour in each city. Figure 4 shows a significant negative correlation (r = -0.42, p = 0.048) between the intensity of traffic in each city and the change in Uber's trip cost compared to taxis (which often have publicly regulated rates). This means that not only does Uber's pricing policy make it relatively cheaper than taxis during peak periods, but this policy is also more aggressive in cities where traffic is worse (and where, therefore, the marginal social cost of each new car at rush hour is higher).
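This correlation measure is a plain Pearson coefficient over (delay, price-change) pairs. In the sketch below, only the Washington and Cleveland pairs come from the figures cited above; the remaining pairs are illustrative placeholders, not our actual 23-city dataset, so the resulting r will not match the reported -0.42, only its negative sign.

```python
from statistics import mean, pstdev

def pearson_r(xs, ys):
    """Pearson correlation coefficient, computed from first principles."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)
    return cov / (pstdev(xs) * pstdev(ys))

# (yearly hours of delay per commuter, signed % change in Uber's
# relative price under delay). D.C. and Cleveland are from the text;
# the rest are made-up placeholders for illustration.
cities = [(82, -15.6), (38, -4.3), (80, -9.0),
          (50, -6.0), (63, -3.5), (45, -8.0)]
xs = [c[0] for c in cities]
ys = [c[1] for c in cities]

print(round(pearson_r(xs, ys), 2))
```

A negative r here means that the worse a city's traffic, the larger the drop in Uber's price relative to taxis, which is the relationship Figure 4 reports.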

Use value vs exchange value
This study helps put in context some of the discussion surrounding Uber's lobbying, which aims to ensure the company can operate unregulated. Uber has spent almost US$ 1 million on lobbying in California alone (Kokalitcheva, 2015). According to an article in The Washington Post, 'Uber's approach is brash and, so far, highly effective: It launches in local markets regardless of existing laws or regulations. It aims to build a large customer base as quickly as possible. When challenged, Uber rallies its users to pressure government officials, while unleashing its well-connected lobbyists to influence lawmakers.' (Helderman, 2014). This makes sense: from a strictly individual standpoint, Uber is advantageous in nearly every city, and more so during peak times. So we read that 'this big, aggressive company is using the almost $6 billion it has raised from venture capitalists and other investors to subvert the democratic process', but also that 'the biggest, scariest lobbying machine in the nation is hard at work lobbying to make most Americans' lives a little bit better' (Fox, 2015). As we have seen, for this view to hold, externalities, both present and long-term, must be ignored.
16 Considering the same trip cited above (7 miles, 20 minutes). 17 The average increase in trip delay during peak hours compared to free-flow hours, taken from the same report cited above.
Effectively measuring these externalities requires data; we have tried to estimate these numbers as accurately as we can with the information currently available. What we have found is that Uber can result in a positive social product, by offering a platform that cuts waiting time for passengers and idle time for drivers; or a severely negative one, when it offers incentives for drivers at times when the marginal social costs of each new car are significantly higher than what Uber makes for itself, its drivers or its users, as is the case in the CBD of New York City. Knowing which end of this spectrum is in place in each city, at each time of day, depends on data we currently do not have. What little information we have about Uber's operation was either leaked or made available through studies financed by the company itself. The platform is also closed in its 'upstream' sense; that is, Uber's policies are not directly influenced by users or drivers, most notably pricing policies, which are unilaterally defined by Uber with no possibility of discounts or additions on the part of the driver 18 . As we have seen from our studies, pricing policy plays an important part in determining the social outcome of Uber's operations.
It is apparent, however, that moving towards the 'open' end of the open data spectrum, both by releasing detailed information about its operation and by letting users, drivers and regulators have a say in its pricing policy, would hurt Uber's profits. Its current business model aims at striking a balance between drivers' revenue and trip costs in order to optimize its own revenues, regardless of the social impacts; opening this process up to society would certainly not increase those revenues further, and would, in many cases, hurt them.

Conclusion
(…) stop to consider how the so-called owners of the land got hold of it. They simply seized it by force, afterwards hiring lawyers to provide them with title-deeds. (…) they were quite frankly taking the heritage of their own countrymen, upon no sort of pretext except that they had the power to do so (George Orwell, 1945).
Common land did not have much value in feudal times, since it was not protected by a lord, as land in the fiefs was. As populations grew and societies transitioned from feudalism into national states, common land became increasingly useful for planting crops and raising cattle; that is, it went through a steady increase in its use value, though, being common, it had no exchange value. As that value became apparent, lords and owners of small parcels of land gradually enclosed common land, either through purchase or through the passing of laws, and often through some form of political influence (Thompson, 2002). Thus, common land became private property and gained exchange value. It stopped having use value for the general population, but started generating profit for its owner, in a process

that Karl Marx called 'the expropriation of the agricultural population from the land' (1889, p. 740).
Every city has an abstract layer of information that was, until very recently, hard to access, except by government agencies through heavy investment for specific purposes. The last few decades have seen the development and popularization of technologies such as the internet, GPS devices and smartphones, which play an increasingly ubiquitous part in harnessing that information and turning it into actionable data. This presents a unique step for urban planning, which can now use such data to design policies and transfer to the people decisions that were hitherto guided by broad political and ideological orientations rather than by case-by-case, democratic processes. From this standpoint, ideology alone is an inadequate tool for making complex decisions. Though 'smart' technologies have been widely praised for making this process possible, we have tried to show that even if the process of turning information into data necessarily generates value, it does not necessarily produce a positive-sum result for the common good.
What we have found is that smart technologies fall somewhere in a spectrum between completely open, where information is accessible and actionable upon by everyone; and completely closed, where the process of gathering information still happens but serves the purpose of designing a closed product.
The paradox of value plays an important role in this process: a graph of the availability and total value of a commodity generally follows a curve in which scarcity results in a high per-unit exchange value but a low total value (as with diamonds). As a commodity becomes more widely available, its marginal exchange value decreases, but its total value increases (as with oil, which is worth much less than diamonds per unit, but much more as a total industry). After a peak, its marginal value becomes negative and the total value of the commodity decreases, eventually nearing zero, as with atmospheric air. Air is more useful to mankind than oil but, because it is so widely available, its total exchange value is lower than that of oil; and oil is more useful to mankind than diamonds, though individually much less valuable.
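The curve described above can be made concrete with a toy linear demand schedule. The numbers are purely illustrative, not an empirical claim, but they reproduce the shape: high unit value and low total value under scarcity, a peak in total value at moderate abundance, and a total value near zero at saturation.

```python
def unit_exchange_value(quantity, a=100.0, b=1.0):
    """Toy demand curve: unit exchange value falls linearly as the
    commodity becomes more available, bottoming out at zero."""
    return max(0.0, a - b * quantity)

def total_exchange_value(quantity):
    """Total exchange value of the whole stock at a given availability."""
    return unit_exchange_value(quantity) * quantity

# Scarce (diamonds), abundant (oil), saturated (air)
for q, label in [(5, "diamonds"), (50, "oil"), (100, "air")]:
    print(label, unit_exchange_value(q), total_exchange_value(q))
```

With these parameters the 'diamonds' point has the highest unit value but a small total, the 'oil' point sits at the peak of total value, and the 'air' point has a total exchange value of zero despite its immense use value.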
Our hypothesis is that smart technologies follow a similar curve. Platforms closer to the open end of the spectrum, such as OpenStreetMap, appear to generate abundant use value while having nearly no exchange value, resulting in incontrovertible, if limited, common good. The OpenStreetMap Foundation has a little over £140,000 in equity and generates no direct profit 19 . It has over 2 million registered contributors, with about 10 thousand new registrations per month. As we get closer to the middle of the spectrum, as with Waze, both use value and exchange value are being generated; Waze was sold in 2013 for US$ 1.3 billion. It generates some profit by displaying ads and by providing information to its parent company, though the exact amounts are not disclosed by Google. Uber is nearer the closed end of the spectrum, since it barely releases any data, nor does it take direct input from either drivers or users on the main decisions and rules of its platform. It is currently valued at over US$ 50 billion (MacMillan & Demos, 2015) and expected to generate over US$ 2 billion in net revenues in 2015, with gross revenues up 271% from 2014 and expected to rise 141% in 2016, according to its own predictions (Zhang & Shih, 2015). It is already one of the biggest companies in the world. Although it has not yet gone public and therefore does not release quarterly balance sheets, recent bond term sheets show that it is operating at a loss of over US$ 400 million (per quarter, presumably) (Biddle, 2015). This loss is reportedly due to investment in expanding operations, mostly through the heavy subsidies it offers to users who sign up or invite others to sign up. In other words, Uber has been investing heavily in acquiring control of its market.
One very important aspect to take into consideration is the propensity for monopoly or oligopoly in industry sectors that deal with information. This is largely because each platform improves as its number of users grows: Waze, for instance, gets more accurate as its number of users and map editors increases, meaning that its leadership in the sector tends to be self-reinforcing.
We conclude, then, that the urban data that once sat largely unused in an abstract plane is, as with the common lands, now being turned into use and exchange value, this time by smart technologies. There is no opposition to that; that is, 'no one says "I don't like the smart thing and I prefer to be dumb"' (Sterling, 2014). But there is no guarantee that the way towards smart cities necessarily yields a positive social product; in some situations, the process might do more harm than good. This is, in economic terms, similar to the surplus value brought about by the industrial revolution, the distribution of which between capital and labour still animates political discussions to this date. The crux of the matter is not whether new technologies should exist, but rather how their benefits should be distributed (between capital, workers, users and the common good). Rolling back the wheel of time and uninventing smart technologies, as the Luddites tried at the beginning of the 19th century, is hardly a viable answer; a more productive path is making sure that the smart city serves not as an end in itself, but as a tool towards an open and collectively managed city.