Extract from ‘Geek Sublime: Writing Fiction, Coding Software’ by Vikram Chandra: “A geek hunched over a laptop tapping frantically at the keyboard, neon-bright lines of green code sliding up the screen – the programmer at work is now a familiar staple of popular entertainment. The clipped shorthand and digits of programming languages are familiar even to civilians, if only as runic incantations charged with world-changing power.
Computing has transformed all our lives but the processes and cultures that produce software remain largely opaque, alien, unknown. This is certainly true within my own professional community of fiction writers; whenever I tell one of my fellow authors that I supported myself through the writing of my first novel by working as a programmer and a computer consultant, I evoke a response that mixes bemusement, bafflement, and a touch of awe, as if I had just said that I could levitate.
Most of the artists I know – painters, film-makers, actors, poets – seem to regard programming as an esoteric scientific discipline; they are keenly aware of its cultural mystique, envious of its potential profitability and eager to extract metaphors, imagery, and dramatic possibility from its history, but coding may as well be nuclear physics as far as relevance to their own daily practice is concerned.
Many programmers, on the other hand, regard themselves as artists. Since programmers create complex objects and care not just about function but also about beauty, the argument goes, they are just like painters and sculptors. The best-known assertion of this notion is the 2003 essay “Hackers and Painters” by programmer and venture capitalist Paul Graham. “Of all the different types of people I’ve known, hackers and painters are among the most alike,” writes Graham. “What hackers and painters have in common is that they’re both makers. Along with composers, architects, and writers, what hackers and painters are trying to do is make good things.”
According to Graham, the iterative processes of programming – write, debug (discover and remove bugs, which are coding errors), rewrite, experiment, debug, rewrite – exactly duplicate the methods of artists. “The way to create something beautiful is often to make subtle tweaks to something that already exists, or to combine existing ideas in a slightly new way,” he writes. “You should figure out programs as you’re writing them, just as writers and painters and architects do.”
Attention to detail further marks good hackers with artist-like passion, he argues. “All those unseen details [in a Leonardo da Vinci painting] combine to produce something that’s just stunning, like a thousand barely audible voices all singing in tune. Great software, likewise, requires a fanatical devotion to beauty. If you look inside good software, you find that parts no one is ever supposed to see are beautiful too.”
This desire to equate art and programming has a lengthy pedigree. In 1972, famed computer scientist Butler Lampson published an editorial titled “Programmers as Authors” that began: “Creative endeavour varies greatly in the amount of overhead (i.e. money, manpower and organisation) associated with a project which calls for a given amount of creative work. At one extreme is the activity of an aircraft designer, at the other that of a poet. The art of programming currently falls much closer to the former than the latter. I believe, however, that this situation is likely to change considerably in the next decade.”
Boston's Building a Synergy Between City Hall & Startups
Gillis Bernard at BostInno: “Boston’s local government and startup scene want to do more than peacefully co-exist. They want to co-create. The people perhaps most credited with generating buzz around this trend are those behind the relatively new parking ticket app TicketZen. Cort Johnson, along with a few others from Terrible Labs, a Web and mobile app design consultancy in Chinatown, came up with the idea for the app after spotting a tweet from one of Boston’s trademark entrepreneurs. A few months back, ex-KAYAK CTO (and Blade co-founder) Paul English sent out a 140-character message calling for an easy, instantaneous payment solution for parking tickets, Johnson told BostInno.
The idea was that in the time it takes for Boston’s enforcement office to process a parking ticket, its recipient has already forgotten his or her frustration or misplaced the bright orange slip, thus creating a situation in which both parties lose: the local government’s collection process is held up and the recipient is forced to pay a larger fine for the delay.
With the problem posed and the spark lit, the Terrible Labs team took to building TicketZen, an app which allows people to scan their tickets and immediately send validation to City Hall to kick off the process.
“When we first came up with the prototype, [City Hall was] really excited and worked to get it launched in Boston first,” said Johnson. “But we have built a bunch of integrations for major cities where most of the parking tickets are issued, which will launch early this year.”
But in order to even get the app up-and-running, Terrible Labs needed to work with some local government representatives – namely, Chris Osgood and Nigel Jacob of the Mayor’s Office of New Urban Mechanics….
Since its inception in 2010, the City Hall offshoot has worked with all kinds of Boston citizens to create civic-facing innovations that would be helpful to the city at large.
For example, a group of mothers with children at Boston Public Schools approached New Urban Mechanics to create an app that shares when the school bus will arrive, similar to the MBTA’s app, which shows upcoming train times. The nonprofit then arranged a partnership with Vermonster LLC, a software application development firm in Downtown Boston, to create the Where’s My School Bus app.
“There’s a whole host of different backgrounds, from undergrad students to parents, who would never consider themselves to be entrepreneurs or innovators originally … There are just so many talented, driven and motivated folks that would likely have a similar interest in doing work in the civic space. The challenge is to scale that beyond what’s currently out there,” shared Osgood. “We’re asking, ‘How can City Hall do a better job to support innovators?’”
Of course, District Hall was created for this very purpose – supporting creatives and entrepreneurs by providing them a perpetually open door and an event space. Additionally, there have been a number of events geared toward civic innovation within the past few months targeting both entrepreneurs and government.
The former mayor Thomas Menino led the charge in opening the Office of Business Development, which features a sleek new website and focuses on providing entrepreneurs and existing businesses with access to financial and technical resources. Further, a number of organizations collaborated in early December 2013 to host a free-to-register event dubbed MassDOT Visualizing Transportation Hackathon to help generate ideas for improving public transit from the next generation’s entrepreneurs; just this month, the Venture Café and the Cambridge Innovation Center hosted Innovation and the City, a conference uniting leading architects, urban planners, educators and business leaders from different cities around the U.S. to speak to the changing landscape of civic development.”
Selected Readings on Personal Data: Security and Use
The Living Library’s Selected Readings series seeks to build a knowledge base on innovative approaches for improving the effectiveness and legitimacy of governance. This curated and annotated collection of recommended works on the topic of personal data was originally published in 2014.
Advances in technology have greatly increased the potential for policymakers to utilize the personal data of large populations for the public good. However, the proliferation of vast stores of useful data has also given rise to a variety of legislative, political, and ethical concerns surrounding the privacy and security of citizens’ personal information, both in terms of collection and usage. Challenges regarding the governance and regulation of personal data must be addressed in order to assuage individuals’ concerns regarding the privacy, security, and use of their personal information.
Selected Reading List (in alphabetical order)
- Ann Cavoukian – Personal Data Ecosystem (PDE) – A Privacy by Design Approach to an Individual’s Pursuit of Radical Control – a paper describing the emerging framework of technologies enabling individuals to hold greater control of their data.
- T. Kirkham, S. Winfield, S. Ravet and S. Kellomaki – A Personal Data Store for an Internet of Subjects – a paper arguing for a shift from the current service-oriented data control architecture to a system centered on individuals controlling their data.
- OECD – The 2013 OECD Privacy Guidelines – a privacy framework built around eight core principles.
- Paul Ohm – Broken Promises of Privacy: Responding to the Surprising Failure of Anonymization – a paper revealing the ease with which researchers have been able to reattach supposedly anonymized personal data to individuals.
- Jules Polonetsky and Omer Tene – Privacy in the Age of Big Data: A Time for Big Decisions – a paper proposing the development of a risk-value matrix for personal data in the big data era.
- Katie Shilton, Jeff Burke, Deborah Estrin, Ramesh Govindan, Mark Hansen, Jerry Kang, and Min Mun – Designing the Personal Data Stream: Enabling Participatory Privacy in Mobile Personal Sensing – a paper arguing for a reimagined system for protecting the privacy of personal data, moving past the out-of-date Codes of Fair Information Practice.
Annotated Selected Reading List (in alphabetical order)
Cavoukian, Ann. “Personal Data Ecosystem (PDE) – A Privacy by Design Approach to an Individual’s Pursuit of Radical Control.” Privacy by Design, October 15, 2013. https://bit.ly/2S00Yfu.
- In this paper, Cavoukian describes the Personal Data Ecosystem (PDE), an “emerging landscape of companies and organizations that believe individuals should be in control of their personal data, and make available a growing number of tools and technologies to enable this control.” She argues that, “The right to privacy is highly compatible with the notion of PDE because it enables the individual to have a much greater degree of control – ‘Radical Control’ – over their personal information than is currently possible today.”
- To ensure that the PDE reaches its privacy-protection potential, Cavoukian argues that it must practice The 7 Foundational Principles of Privacy by Design:
- Proactive not Reactive; Preventative not Remedial
- Privacy as the Default Setting
- Privacy Embedded into Design
- Full Functionality – Positive-Sum, not Zero-Sum
- End-to-End Security – Full Lifecycle Protection
- Visibility and Transparency – Keep it Open
- Respect for User Privacy – Keep it User-Centric
Kirkham, T., S. Winfield, S. Ravet, and S. Kellomaki. “A Personal Data Store for an Internet of Subjects.” In 2011 International Conference on Information Society (i-Society), 92–97. http://bit.ly/1alIGuT.
- This paper examines various factors involved in the governance of personal data online, and argues for a shift from “current service-oriented applications where often the service provider is in control of the person’s data” to a person-centric architecture where the user is at the center of personal data control.
- The paper delves into an “Internet of Subjects” concept of Personal Data Stores, and focuses on implementation of such a concept on personal data that can be characterized as either “By Me” or “About Me.”
- The paper also presents examples of how a Personal Data Store model could allow users to both protect and present their personal data to external applications, affording them greater control.
OECD. The 2013 OECD Privacy Guidelines. 2013. http://bit.ly/166TxHy.
- This report is indicative of the “important role in promoting respect for privacy as a fundamental value and a condition for the free flow of personal data across borders” played by the OECD for decades. The guidelines – revised in 2013 for the first time since being drafted in 1980 – are seen as “[t]he cornerstone of OECD work on privacy.”
- The OECD framework is built around eight basic principles for personal data privacy and security:
- Collection Limitation
- Data Quality
- Purpose Specification
- Use Limitation
- Security Safeguards
- Openness
- Individual Participation
- Accountability
Ohm, Paul. “Broken Promises of Privacy: Responding to the Surprising Failure of Anonymization.” UCLA Law Review 57 (2010): 1701. http://bit.ly/18Q5Mta.
- This article explores the implications of the “astonishing ease” with which scientists have demonstrated the ability to “reidentify” or “deanonymize” supposedly anonymous personal information.
- Rather than focusing exclusively on whether personal data is “anonymized,” Ohm offers five factors for governments and other data-handling bodies to use for assessing the risk of privacy harm: data-handling techniques, private versus public release, quantity, motive and trust.
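The mechanics behind such reidentification are often strikingly simple. As a toy sketch (the datasets, field names, and matching rule below are invented for illustration, not drawn from Ohm’s article), a linkage attack joins a “de-identified” dataset with a public record on shared quasi-identifiers:

```python
# Toy illustration of a linkage attack: a "de-identified" dataset is
# re-identified by joining it against a public record on quasi-identifiers
# (ZIP code, birth date, sex). All names and records here are invented.

anonymized_health = [
    {"zip": "02139", "dob": "1962-07-31", "sex": "F", "diagnosis": "asthma"},
    {"zip": "02139", "dob": "1980-01-15", "sex": "M", "diagnosis": "flu"},
]

public_voter_roll = [
    {"name": "J. Doe", "zip": "02139", "dob": "1962-07-31", "sex": "F"},
]

def reidentify(health, voters):
    """Match records whose quasi-identifiers coincide exactly."""
    matches = []
    for h in health:
        for v in voters:
            if (h["zip"], h["dob"], h["sex"]) == (v["zip"], v["dob"], v["sex"]):
                matches.append({"name": v["name"], "diagnosis": h["diagnosis"]})
    return matches

print(reidentify(anonymized_health, public_voter_roll))
# [{'name': 'J. Doe', 'diagnosis': 'asthma'}]
```

The point is that none of the health fields is a name, yet a unique combination of ordinary attributes can act as one.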
Polonetsky, Jules and Omer Tene. “Privacy in the Age of Big Data: A Time for Big Decisions.” Stanford Law Review Online 64 (February 2, 2012): 63. http://bit.ly/1aeSbtG.
- In this article, Tene and Polonetsky argue that, “The principles of privacy and data protection must be balanced against additional societal values such as public health, national security and law enforcement, environmental protection, and economic efficiency. A coherent framework would be based on a risk matrix, taking into account the value of different uses of data against the potential risks to individual autonomy and privacy.”
- To achieve this balance, the authors believe that, “policymakers must address some of the most fundamental concepts of privacy law, including the definition of ‘personally identifiable information,’ the role of consent, and the principles of purpose limitation and data minimization.”
Shilton, Katie, Jeff Burke, Deborah Estrin, Ramesh Govindan, Mark Hansen, Jerry Kang, and Min Mun. “Designing the Personal Data Stream: Enabling Participatory Privacy in Mobile Personal Sensing.” TPRC, 2009. http://bit.ly/18gh8SN.
- This article argues that the Codes of Fair Information Practice, which have served as a model for data privacy for decades, do not take into account a world of distributed data collection, nor the realities of data mining and easy, almost uncontrolled, dissemination.
- The authors suggest “expanding the Codes of Fair Information Practice to protect privacy in this new data reality. An adapted understanding of the Codes of Fair Information Practice can promote individuals’ engagement with their own data, and apply not only to governments and corporations, but software developers creating the data collection programs of the 21st century.”
- In order to achieve this change in approach, the paper discusses three foundational design principles: primacy of participants, data legibility, and engagement of participants throughout the data life cycle.
Civic Tech Forecast: 2014
Laura Dyson from Code for America: “Last year was a big year for civic technology and government innovation, and if last week’s Municipal Innovation discussion was any indication, 2014 promises to be even bigger. More than sixty civic innovators from both inside and outside of government gathered to hear three leading civic tech experts share their “Top Five” list of civic tech trends from 2013, and predictions for what’s to come in 2014. From responsive web design to overcoming leadership change, guest speakers Luke Fretwell, Juan Pablo Velez, and Alissa Black covered both challenges and opportunities. And the audience had a few predictions of their own. Highlights included:
Mark Leech, Application Development Manager, City of Albuquerque: “Regionalization will allow smaller communities to participate and act as a force multiplier for them.”
Rebecca Williams, Policy Analyst, Sunlight Foundation: “Open data policy (law and implementation) will become more connected to traditional forms of governance, like public records and town hall meetings.”
Rick Dietz, IT Director, City of Bloomington, Ind.: “I think governments will need to collaborate directly more on open source development, particularly on enterprise scale software systems — not just civic apps.”
Kristina Ng, Office of Financial Empowerment, City and County of San Francisco: “I’m excited about the growing community of innovative government workers.”
Hillary Hartley, Presidential Innovation Fellow: “We’ll need to address sustainability and revenue opportunities. Consulting work can only go so far; we must figure out how to empower civic tech companies to actually make money.”
An informal poll of the audience showed that roughly 96 percent of the group was feeling optimistic about the coming year for civic innovation. What’s your civic tech forecast for 2014? Read on to hear what guest speakers Luke Fretwell, Juan Pablo Velez, and Alissa Black had to say, and then let us know how you’re feeling about 2014 by tweeting at @codeforamerica.”
Big Data, Privacy, and the Public Good
Forthcoming book and website by Julia Lane, Victoria Stodden, Stefan Bender, and Helen Nissenbaum (editors): “The overarching goal of the book is to identify ways in which vast new sets of data on human beings can be collected, integrated, and analysed to improve evidence-based decision making while protecting confidentiality. …
Massive amounts of new data on human beings can now be accessed and analyzed. Much has been made of the many uses of such data for pragmatic purposes, including selling goods and services, winning political campaigns, and identifying possible terrorists. Yet “big data” can also be harnessed to serve the public good: scientists can use new forms of data to do research that improves the lives of human beings; federal, state, and local governments can use data to improve services and reduce taxpayer costs; and public organizations can use information to advocate for public causes.
Much has also been made of the privacy and confidentiality issues associated with access. A survey of statisticians at the 2013 Joint Statistical Meeting found that the majority thought consumers should worry about privacy issues, and that an ethical framework should be in place to guide data scientists. Yet there are many unanswered questions. What are the ethical and legal requirements for scientists and government officials seeking to serve the public good without harming individual citizens? What are the rules of engagement? What are the best ways to provide access while protecting confidentiality? Are there reasonable mechanisms to compensate citizens for privacy loss?
The goal of this book is to answer some of these questions. The book’s authors paint an intellectual landscape that includes the legal, economic and statistical context necessary to frame the many privacy issues, including the value to the public of data access. The authors also identify core practical approaches that use new technologies to simultaneously maximize the utility of data access while minimizing information risk. As is appropriate for such a new and evolving field, each chapter also identifies important questions that require future research.
The work in this book is also intended to be accessible to an audience broader than the academy. In addition to informing the public, we hope that the book will be useful to people trying to provide data access but protect confidentiality in the roles as data custodians for federal, state and local agencies, or decision makers on institutional review boards.”
It’s the Neoliberalism, Stupid: Why instrumentalist arguments for Open Access, Open Data, and Open Science are not enough.
Eric Kansa at LSE Blog: “…However, I’m increasingly convinced that advocating for openness in research (or government) isn’t nearly enough. There’s been too much of an instrumentalist justification for open data and open access. Many advocates talk about how it will cut costs and speed up research and innovation. They also argue that it will make research more “reproducible” and transparent so interpretations can be better vetted by the wider community. Advocates for openness, particularly in open government, also talk about the wonderful commercial opportunities that will come from freeing research…
These are all very big policy issues, but they need to be asked if the Open Movement really stands for reform and not just a further expansion and entrenchment of Neoliberalism. I’m using the term “Neoliberalism” because it resonates as a convenient label for describing how and why so many things seem to suck in Academia. Exploding student debt, vanishing job security, increasing compensation for top administrators, expanding bureaucracy and committee work, corporate management methodologies (Taylorism), and intensified competition for ever-shrinking public funding all fall under the general rubric of Neoliberalism. Neoliberal universities primarily serve the needs of commerce. They need to churn out technically skilled human resources (made desperate for any work by high loads of debt) and easily monetized technical advancements….
“Big Data,” “Data Science,” and “Open Data” are now hot topics at universities. Investments are flowing into dedicated centers and programs to establish institutional leadership in all things related to data. I welcome the new Data Science effort at UC Berkeley to explore how to make research data professionalism fit into the academic reward systems. That sounds great! But will these new data professionals have any real autonomy in shaping how they conduct their research and build their careers? Or will they simply be part of an expanding class of harried and contingent employees, hired and fired at the whims of creative destruction fueled by the latest corporate-academic hype-cycle?
Researchers, including #AltAcs and “data professionals”, need a large measure of freedom. Miriam Posner’s discussion about the career and autonomy limits of Alt-academic-hood helps highlight these issues. Unfortunately, there’s only one area where innovation and failure seem survivable, and that’s the world of the start-up. I’ve noticed how the “Entrepreneurial Spirit” gets celebrated a lot in this space. I’m guilty of basking in it myself (10 years as a quasi-independent #altAc in a nonprofit I co-founded!).
But in the current Neoliberal setting, being an entrepreneur requires a singular focus on monetizing innovation. PeerJ and Figshare are nice, since they have business models that are less “evil” than Elsevier’s. But we need to stop fooling ourselves that the only institutions and programs that we can and should sustain are the ones that can turn a profit. For every PeerJ or Figshare (and these are ultimately just as dependent on continued public financing of research as any grant-driven project), we also need more innovative organizations like the Internet Archive, wholly dedicated to the public good and not the relentless pressure to commoditize everything (especially their patrons’ privacy). We need to be much more critical about the kinds of programs, organizations, and financing strategies we (as a society) can support. I raised the political economy of sustainability issue at a recent ThatCamp and hope to see more discussion.
In reality, so much of the Academy’s dysfunctions are driven by our new Gilded Age’s artificial scarcity of money. With wealth concentrated in so few hands, it is very hard to finance risk-taking and entrepreneurialism in the scholarly community, especially to finance any form of entrepreneurialism that does not turn a profit in a year or two.
Open Access and Open Data would make so much more of a difference if we had the same kind of dynamism in the academic and nonprofit sector as we have in the for-profit start-up sector. After all, Open Access and Open Data can be key enablers to allow much broader participation in research and education. However, broader participation still needs to be financed: you cannot eat an open access publication. We cannot gloss over this key issue.
We need more diverse institutional forms so that researchers can find (or found) the kinds of organizations that best channel their passions into contributions that enrich us all. We need more diverse sources of financing (new foundations, better financed Kickstarters) to connect innovative ideas with the capital needed to see them implemented. Such institutional reforms will make life in the research community much more livable, creative, and dynamic. It would give researchers more options for diverse and varied career trajectories (for-profit or not-for-profit) suited to their interests and contributions.
Making the case to reinvest in the public good will require a long, hard slog. It will be much harder than the campaign for Open Access and Open Data because it will mean contesting Neoliberal ideologies and constituencies that are deeply entrenched in our institutions. However, the constituencies harmed by Neoliberalism, particularly the student community now burdened by over $1 trillion in debt and the middle class more generally, are much larger and very much aware that something is badly amiss. As we celebrate the impressive strides made by the Open Movement in the past year, it’s time we broaden our goals to tackle the needs for wider reform in the financing and organization of research and education.
This post originally appeared on Digging Digitally and is reposted under a CC-BY license.”
Online Video Game Plugs Players Into Real Biochemistry Lab
Science Now: “Crowdsourcing is the latest research rage—Kickstarter to raise funding, screen savers that number-crunch, and games to find patterns in data—but most efforts have been confined to the virtual lab of the Internet. In a new twist, researchers have now crowdsourced their experiments by connecting players of a video game to an actual biochemistry lab. The game, called EteRNA, allows players to remotely carry out real experiments to verify their predictions of how RNA molecules fold. The first big result: a study published this week in the Proceedings of the National Academy of Sciences, bearing the names of more than 37,000 authors—only 10 of them professional scientists. “It’s pretty amazing stuff,” says Erik Winfree, a biophysicist at the California Institute of Technology in Pasadena.
Some see EteRNA as a sign of the future for science, not only for crowdsourcing citizen scientists but also for giving them remote access to a real lab. “Cloud biochemistry,” as some call it, isn’t just inevitable, Winfree says: It’s already here. DNA sequencing, gene expression testing, and many biochemical assays are already outsourced to remote companies, and any “wet lab” experiment that can be automated will be automated, he says. “Then the scientists can focus on the non-boring part of their work.”
EteRNA grew out of an online video game called Foldit. Created in 2008 by a team led by David Baker and Zoran Popović, a molecular biologist and computer scientist, respectively, at the University of Washington, Seattle, Foldit focuses on predicting the shape into which a string of amino acids will fold. By tweaking virtual strings, Foldit players can surpass the accuracy of the fastest computers in the world at predicting the structure of certain proteins. Two members of the Foldit team, Adrien Treuille and Rhiju Das, conceived of EteRNA back in 2009. “The idea was to make a version of Foldit for RNA,” says Treuille, who is now based at Carnegie Mellon University in Pittsburgh, Pennsylvania. Treuille’s doctoral student Jeehyung Lee developed the needed software, but then Das persuaded them to take it a giant step further: hooking players up directly to a real-world, robot-controlled biochemistry lab. After all, RNA can be synthesized and its folded-up structure determined far more cheaply and rapidly than protein can.
Lee went back to the drawing board, redesigning the game so that it had not only a molecular design interface like Foldit, but also a laboratory interface for designing RNA sequences for synthesis, keeping track of hypotheses for RNA folding rules, and analyzing data to revise those hypotheses. By 2010, Lee had a prototype game ready for testing. Das had the RNA wet lab ready to go at Stanford University in Palo Alto, California, where he is now a professor. All they lacked were players.
A message to the Foldit community attracted a few hundred players. Then in early 2011, The New York Times wrote about EteRNA and tens of thousands of players flooded in.
The game comes with a detailed tutorial and a series of puzzles involving known RNA structures. Only after winning 10,000 points do you unlock the ability to join EteRNA’s research team. There the goal is to design RNA sequences that will fold into a target structure. Each week, eight sequences are chosen by vote and sent to Stanford for synthesis and structure determination. The data that come back reveal how well the sequences’ true structures matched their targets. That way, Treuille says, “reality keeps score.” The players use that feedback to tweak a set of hypotheses: design rules for determining how an RNA sequence will fold.
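The way “reality keeps score” can be pictured with a small sketch. RNA secondary structures are commonly written in dot-bracket notation, and a design’s measured structure can be compared position by position against its target. The scoring rule below is an illustrative assumption, not EteRNA’s actual code:

```python
# Illustrative sketch: score an RNA design by comparing its measured
# secondary structure (dot-bracket notation) against the target shape.
# '(' and ')' mark paired bases; '.' marks unpaired ones.

def match_score(target: str, measured: str) -> float:
    """Fraction of positions where the measured structure matches the target."""
    if len(target) != len(measured):
        raise ValueError("structures must be the same length")
    hits = sum(1 for t, m in zip(target, measured) if t == m)
    return hits / len(target)

target   = "((((....))))"  # a 12-base hairpin: 4-pair stem, 4-base loop
measured = "(((......)))"  # the synthesized molecule folded a shorter stem

print(match_score(target, measured))  # 10 of the 12 positions agree
```

The feedback loop the article describes amounts to maximizing a score like this one: players revise their design rules each week based on how far the synthesized molecules fell short of the target.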
Two years and hundreds of RNA structures later, the players of EteRNA have proven themselves to be a potent research team. Of the 37,000 who played, about 1000 graduated to participating in the lab for the study published today. (EteRNA now has 133,000 players, 4000 of them doing research.) They generated 40 new rules for RNA folding. For example, the players discovered that the junctions between different parts of the RNA structure—such as between a loop and an arm—are far more stable if enriched with guanines and cytosines, the strongest-bonding of the RNA base pairs. To see how well those rules describe reality, the humans then competed toe to toe against computers in a new series of RNA structure challenges. The researchers distilled the humans’ 40 rules into an algorithm called EteRNA Bot.”
How a New Science of Cities Is Emerging from Mobile Phone Data Analysis
MIT Technology Review: “Mobile phones have generated enormous insight into the human condition thanks largely to the study of the data they produce. Mobile phone companies record the time of each call, the caller and receiver ids, as well as the locations of the cell towers involved, among other things.
The combined data from millions of people produces some fascinating new insights into the nature of our society. Anthropologists have crunched it to reveal human reproductive strategies, a universal law of commuting and even the distribution of wealth in Africa.
Today, computer scientists have gone one step further by using mobile phone data to map the structure of cities and how people use them throughout the day. “These results point towards the possibility of a new, quantitative classification of cities using high resolution spatio-temporal data,” say Thomas Louail at the Institut de Physique Théorique in Paris and a few pals.
They say their work is part of a new science of cities that aims to objectively measure and understand the nature of large population centers.
These guys begin with a database of mobile phone calls made by people in the 31 Spanish cities that have populations larger than 200,000. The data consists of the number of unique individuals using a given cell tower (whether making a call or not) for each hour of the day over almost two months…. The results reveal some fascinating patterns in city structure. For a start, every city undergoes a kind of respiration in which people converge into the center and then withdraw on a daily basis, almost like breathing. And this happens in all cities. This “suggests the existence of a single ‘urban rhythm’ common to all cities,” say Louail and co.
During the week, the number of phone users peaks at about midday and then again at about 6 p.m. During the weekend the numbers peak a little later: at 1 p.m. and 8 p.m. Interestingly, the second peak starts about an hour later in western cities, such as Sevilla and Cordoba.
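The aggregation behind these “urban rhythm” profiles can be sketched in a few lines — a minimal illustration assuming records of (tower, hour of day, unique users), not the paper’s actual pipeline, which covers 31 cities over nearly two months:

```python
def hourly_profile(records):
    """records: iterable of (tower_id, hour_of_day, unique_users).
    Returns a 24-element city-wide activity profile, normalised to sum to 1."""
    totals = [0.0] * 24
    for _tower, hour, users in records:
        totals[hour] += users
    grand = sum(totals) or 1.0
    return [t / grand for t in totals]

def peak_hours(profile, n=2):
    """The n busiest hours — e.g. roughly 12:00 and 18:00 on Spanish weekdays."""
    return sorted(range(24), key=lambda h: profile[h], reverse=True)[:n]
```

Comparing weekday and weekend profiles, or profiles of eastern and western cities, is then a matter of running `peak_hours` on each subset of records.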
The data also reveals that small cities tend to have a single center that becomes busy during the day, such as the cities of Salamanca and Vitoria.
But it also shows that the number of hotspots increases with city size; so-called polycentric cities include Spain’s largest, such as Madrid, Barcelona, and Bilbao.
That could turn out to be useful for automatically classifying cities.
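One hedged sketch of how a hotspot count could drive such a classification (the threshold rule here is an illustrative assumption, not the paper’s criterion): call a tower a hotspot if its daytime load reaches some fraction of the busiest tower’s load, then label the city by how many hotspots it has.

```python
def count_hotspots(tower_loads, frac=0.5):
    """tower_loads: dict of tower_id -> daytime unique-user count.
    A hotspot is any tower carrying at least `frac` of the busiest tower's load."""
    if not tower_loads:
        return 0
    top = max(tower_loads.values())
    return sum(1 for load in tower_loads.values() if load >= frac * top)

def classify_city(tower_loads, frac=0.5):
    """Monocentric if activity concentrates at one hotspot, polycentric otherwise."""
    return "polycentric" if count_hotspots(tower_loads, frac) > 1 else "monocentric"
```

A Salamanca-like city with one dominant tower comes out monocentric; a Madrid-like city with several comparably busy towers comes out polycentric.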
There is a growing interest in the nature of cities, the way they evolve and how their residents use them. The goal of this new science is to make better use of these spaces, which more than 50 percent of the planet’s population inhabits. Louail and co show that mobile phone data clearly has an important role to play in this endeavor to better understand these complex giants.
Ref: arxiv.org/abs/1401.4540 : From Mobile Phone Data To The Spatial Structure Of Cities”
Google Hangouts vs Twitter Q&As: how the US and Europe are hacking traditional diplomacy
Wired (UK): “We’re not yet sure if diplomacy is going digital or just the conversations we’re having,” Moira Whelan, Deputy Assistant Secretary for Digital Strategy, US Department of State, admitted on stage at TedxStockholm. “Sometimes you just have to dive in, and we’re going to, but we’re not really sure where we’re going.”
The US has been at the forefront of digital diplomacy for many years now. President Obama was the first leader to sign up to Twitter, and has amassed the greatest number of followers among his peers at nearly 41 million. The account is, however, mainly run by his staff. It’s understandable, but demonstrates that there still remains a diplomatic disconnect in a country Whelan says knows it’s “ready, leading the conversation and on the cutting edge”.
In Europe, on the other hand, Swedish Minister for Foreign Affairs Carl Bildt carries out regular Q&As on the social network and is regarded as one of the most conversational leaders on Twitter and the best connected, according to the annual survey Twiplomacy. Our own William Hague is chasing Bildt with close to 200,000 followers, and is the world’s second most connected Foreign Minister, while David Cameron is active on a daily basis with more than 570,000 followers. London was in fact the first place to host a “Diplohack”, an event where ambassadors are brought together with developers and others to hack traditional diplomacy, and Whelan travelled to Sweden to take part in the third European event, the Stockholm Initiative for Digital Diplomacy, held 16-17 January in conjunction with TedxStockholm.
Nevertheless, Whelan, who has worked for the state for a decade, says the US is in the game and ready to try new things. Case in point being its digital diplomacy reaction to the crisis in Syria last year.
“In August 2013 we witnessed tragic events in Syria, and obviously the President of the United States and his security team jumped into action,” said Whelan. “We needed to bear witness and… very clearly saw the need for one thing — a Google+ Hangout.” With her tongue-in-cheek comment, Whelan was pointing out social media’s incredibly relevant role in communicating to the public what’s going on when crises hit, and in answering concerns and questions through it.
“We saw speeches and very disturbing images coming at us,” continued Whelan. “We heard leaders making impassioned speeches, and we ourselves had conversations about what we were seeing and how we needed to engage and inform; to give people the chance to engage and ask questions of us.
“We thought, clearly let’s have a Google+ Hangout. Three people joined us and Secretary John Kerry — Nicholas Kristof of the New York Times; Lara Setrakian, executive editor of Syria Deeply; and Andrew Beiter, a teacher affiliated with the Holocaust Memorial Museum who specialises in how we talk about these topics with our children.”
In the run up to the Hangout, news of the event trickled out and soon Google was calling, asking if it could advertise the session at the bottom of other Hangouts, then on YouTube ads. “Suddenly 15,000 people were watching the Secretary live — that’s by far the largest number we’d seen. We felt we’d tapped into something, we knew we’d hit success at what was a challenging time. We were engaging the public and could join with them to communicate a set of questions. People want to ask questions and get very direct answers, and we know it’s a success. We’ve talked to Google about how we can replicate that. We want to transform what we’re doing to make that the norm.”
Secretary of State John Kerry is, Whelan told Wired.co.uk later, “game for anything” when it comes to social media — and having the department leader enthused at the prospect of taking digital diplomacy forward is obviously key to its success.
“He wanted us to get on Instagram and the unselfie meme during the Philippines crisis was his idea — an assistant had seen it and he held a paper in front of him with the URL to donate funds to Typhoon Haiyan victims,” Whelan told Wired.co.uk at the Stockholm diplohack. “President Obama came in with a mandate that social media would be present and pronounced in all our departments.”
“[As] government changes and is more influenced away from old paper models and newspapers, suspenders and bow ties, and more into young innovators wanting to come in and change things,” Whelan continued, “I think it will change the way we work and help us get smarter.”
Social Physics
The quantitative study of human society and social statistics (Merriam-Webster).
When the US government announced in 2012 that it would invest $200 million in research grants and infrastructure building for big data, Farnam Jahanian, chief of the National Science Foundation’s Computer and Information Science and Engineering Directorate, stated that big data “has the power to change scientific research from a hypothesis-driven field to one that’s data-driven.” Using big data to provide more evidence-based ways of understanding human behavior is the mission of Alex (Sandy) Pentland, director of MIT’s Human Dynamics Laboratory. Pentland’s latest book illustrates the potential of what he describes as “social physics.”
The term was initially developed by Adolphe Jacques Quetelet, the Belgian sociologist and mathematician who introduced statistical methods to the social sciences. Quetelet expanded his views to develop a social physics in his book Sur l’homme et le développement de ses facultés, ou Essai de physique sociale. Auguste Comte, who coined the term “sociology”, adopted it (in the Social Physics volume of his Positive Philosophy) when he defined sociology as a study just as important as biology and chemistry.
According to Sandy Pentland, social physics is about idea flow, the way human social networks spread ideas and transform those ideas into behaviors. His book consequently aims to “extend economic and political thinking by including not only competitive forces but also exchanges of ideas, information, social pressure, and social status in order to more fully explain human behavior… Only once we understand how social interactions work together with competitive forces can we hope to ensure stability and fairness in our hyperconnected, networked society.”
The launch of the book is accompanied by a website that connects several scholars and explains the term further: “How can we create organizations and governments that are cooperative, productive, and creative? These are the questions of social physics, and they are especially important right now, because of global competition, environmental challenges, and government failure. The engine that drives social physics is big data: the newly ubiquitous digital data that is becoming available about all aspects of human life. By using these data to build a predictive, computational theory of human behavior we can hope to engineer better social systems.”