What Jelly Means


Steven Johnson: “A few months ago, I found this strange white mold growing in my garden in California. I’m a novice gardener, and to make matters worse, a novice Californian, so I had no idea what these small white cells might portend for my flowers.
This is one of those odd blank spots — I used to call them Googleholes in the early days of the service — where the usual Delphic source of all knowledge comes up relatively useless. The Google algorithm doesn’t know what those white spots are, the way it knows more computational questions, like “what is the top-ranked page for ‘white mold’?” or “what is the capital of Illinois?” What I want, in this situation, is the distinction we usually draw between information and wisdom. I don’t just want to know what the white spots are; I want to know if I should be worried about them, or if they’re just a normal thing during late summer in Northern California gardens.
Now, I’m sure I know a dozen people who would be able to answer this question, but the problem is I don’t really know which people they are. But someone in my extended social network has likely experienced these white spots on their plants, or better yet, gotten rid of them. (Or, for all I know, eaten them — I’m trying not to be judgmental.) There are tools out there that would help me run the social search required to find that person. I could just bulk-email my entire address book with images of the mold and ask for help. I could go on Quora, or a gardening site.
But the thing is, it’s a type of question that I find myself wanting to ask a lot, and there’s something inefficient about trying to figure out exactly the right tool to use to ask it each time, particularly when we have seen the value of consolidating so many of our queries into a single, predictable search field at Google.
This is why I am so excited about the new app, Jelly, which launched today. …
Jelly, if you haven’t heard, is the brainchild of Biz Stone, one of Twitter’s co-founders.  The service launches today with apps on iOS and Android. (Biz himself has a blog post and video, which you should check out.) I’ve known Biz since the early days of Twitter, and I’m excited to be an adviser and small investor in a company that shares so many of the values around networks and collective intelligence that I’ve been writing about since Emergence.
The thing that’s most surprising about Jelly is how fun it is to answer questions. There’s something strangely satisfying in flipping through the cards, reading questions, scanning the pictures, and looking for a place to be helpful. It’s the same broad gesture of reading, say, a Twitter feed, and pleasantly addictive in the same way, but the intent is so different. Scanning a Twitter feed while waiting for the train has the feel of “Here we are now, entertain us.” Scanning Jelly is more like: “I’m here. How can I help?”

Social media in crisis events: Open networks and collaboration supporting disaster response and recovery


Paper for the IEEE International Conference on Technologies for Homeland Security (HST): “Large-scale crises challenge the ability of public safety and security organisations to respond efficiently and effectively. Meanwhile, citizens’ adoption of mobile technology and rich social media services is dramatically changing the way crisis responses develop. Empowered by new communication media (smartphones, text messaging, internet-based applications and social media), citizens are the in situ first sensors. However, this entire social media arena is uncharted territory to most public safety and security organisations. In this paper, we analyse crisis events to draw narratives on social media relevance and describe how public safety and security organisations are increasingly aware of social media’s added value proposition in times of crisis. A set of critical success indicators to address the process of adopting social media is identified, so that social media information is rapidly transformed into actionable intelligence, thus enhancing the effectiveness of public safety and security organisations — saving time, money and lives.”

A permanent hacker space in the Brazilian Congress


Blog entry by Dan Swislow at OpeningParliament: “On December 17, the presidency of the Brazilian Chamber of Deputies passed a resolution that creates a permanent Laboratório Ráquer or “Hacker Lab” inside the Chamber—a global first.
Read the full text of the resolution in Portuguese.
The resolution mandates the creation of a physical space at the Chamber that is “open for access and use by any citizen, especially programmers and software developers, members of parliament and other public workers, where they can utilize public data in a collaborative fashion for actions that enhance citizenship.”
The idea was born out of a week-long hackathon (or “hacker marathon”) event hosted by the Chamber of Deputies in November, with the goal of using technology to enhance the transparency of legislative work and increase citizen understanding of the legislative process. More than 40 software developers and designers worked to create 22 applications for computers and mobile devices. The applications were voted on and the top three were awarded prizes.
The winner was Meu Congress, a website that allows citizens to track the activities of their elected representatives, and monitor their expenses. Runners-up included Monitora, Brasil!, an Android application that allows users to track proposed bills, attendance and the Twitter feeds of members; and Deliberatório, an online card game that simulates the deliberation of bills in the Chamber of Deputies.
The hackathon engaged the software developers directly with members and staff of the Chamber of Deputies, including the Chamber’s President, Henrique Eduardo Alves. Hackathon organizer Pedro Markun of Transparencia Hacker made a formal proposal to the President of the Chamber for a permanent outpost, where, as Markun said in an email, “we could hack from inside the leviathan’s belly.”
The Chamber’s Director-General has established nine staff positions for the Hacker Lab under the leadership of Cristiano Ferri Faria, who spoke with me about the new project.
Faria explained that the hackathon event was a watershed moment for many public officials: “For 90-95% of parliamentarians and probably 80% of civil servants, they didn’t know how amazing a simple app, for instance, can make it much easier to analyze speeches.” Faria pointed to one of the hackathon contest entries, Retórica Parlamentar, which provides an interactive visualization of plenary remarks by members of the Chamber. “When members saw that, they got impressed and wondered, ‘There’s something new going on and we need to understand it and support it.’”

How Big Should Your Network Be?


Michael Simmons at Forbes: “There is a debate happening between software developers and scientists: How large can and should our networks be in this evolving world of social media? The answer to this question has dramatic implications for how we look at our own relationship building…

To better understand our limits, I connected with the famous British anthropologist and evolutionary psychologist Robin Dunbar, creator of his namesake, Dunbar’s number.

Dunbar’s number, 150, is the suggested cognitive limit to the number of relationships we can maintain where both parties are willing to do favors for each other.


Dunbar’s discovery was in finding a very high correlation between the size of a species’ neocortex and the average social group size (see chart to right). The theory predicted 150 for humans, and this number is found throughout human communities over time….
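The prediction mentioned above comes from a log-log regression of mean group size on neocortex ratio (neocortex volume divided by the volume of the rest of the brain). A minimal sketch of the commonly cited form of that equation — the coefficients and the human neocortex ratio of roughly 4.1 are approximations from Dunbar's primate data, not figures given in this article:

```python
import math

def predicted_group_size(neocortex_ratio: float) -> float:
    """Dunbar's regression: log10(N) = 0.093 + 3.389 * log10(CR),
    where CR is the neocortex ratio and N the predicted group size."""
    return 10 ** (0.093 + 3.389 * math.log10(neocortex_ratio))

# Humans have an estimated neocortex ratio of about 4.1,
# which yields roughly 148 -- the basis of the "150" figure.
print(round(predicted_group_size(4.1)))
```

Plugging in ratios for other primates gives correspondingly smaller predicted group sizes, which is the correlation the chart illustrates.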
Does Dunbar’s Number Still Apply In Today’s Connected World?
There are two camps when it comes to Dunbar’s number. The first camp is embodied by David Morin, the founder of Path, who built a whole social network predicated on the idea that you cannot have more than 150 friends. Robin Dunbar falls into this camp and even did an academic study on social media’s impact on Dunbar’s number. When I asked for his opinion, he replied:

The 150 limit applies to internet social networking sites just as it does in face-to-face life. Facebook’s own data shows that the average number of friends is 150-250 (within the range of variation in the face-to-face world). Remember that the 150 figure is just the average for the population as a whole. However, those who have more seem to have weaker friendships, suggesting that the amount of social capital is fixed and you can choose to spread it thickly or thinly.

Zvi Band, the founder of Contactually, a rapidly growing, venture-backed, relationship management tool, disagrees with both Morin and Dunbar, “We have the ability as a society to bust through Dunbar’s number. Current software can extend Dunbar’s number by at least 2-3 times.” To understand the power of Contactually and tools like it, we must understand the two paradigms people currently use when keeping in touch: broadcast & one-on-one.

While broadcast email makes it extremely easy to reach lots of people who want to hear from us, it is missing personalization. Personalization is what transforms information diffusion into personal relationship building. To make matters worse, broadcast email open rates have halved over the last decade.

On the other end of the spectrum is one-on-one outreach. Research performed by Facebook data scientists shows that one-on-one outreach is extremely effective and explains why:

Both the offering and the receiving of the intimate information increases relationship strength. Providing a partner with personal information expresses trust, encourages reciprocal self-disclosure, and engages the partner in at least some of the details of one’s daily life. Directed communication evokes norms of reciprocity, so may obligate partner to reply. The mere presence of the communication, which is relatively effortful compared to broadcast messages, also signals the importance of the relationship….”

When Tech Culture And Urbanism Collide


John Tolva: “…We can build upon the success of the work being done at the intersection of technology and urban design, right now.

For one, the whole realm of social enterprise — for-profit startups that seek to solve real social problems — has a huge overlap with urban issues. Impact Engine in Chicago, for instance, is an accelerator squarely focused on meaningful change and profitable businesses. One of their companies, Civic Artworks, has set as its goal rebalancing the community planning process.

The Code for America Accelerator and Tumml, both located in San Francisco, morph the concept of social innovation into civic/urban innovation. The companies nurtured by CfA and Tumml are filled with technologists and urbanists working together to create profitable businesses. Like WorkHands, a kind of LinkedIn for blue collar trades. Would something like this work outside a city? Maybe. Are its effects outsized and scale-ready in a city? Absolutely. That’s the opportunity in urban innovation.

Scale is what powers the sharing economy and it thrives because of the density and proximity of cities. In fact, shared resources at critical density is one of the only good definitions for what a city is. It’s natural that entrepreneurs have overlaid technology on this basic fact of urban life to amplify its effects. Would TaskRabbit, Hailo or LiquidSpace exist in suburbia? Probably, but their effects would be minuscule and investors would get restless. The city in this regard is the platform upon which sharing economy companies prosper. More importantly, companies like this change the way the city is used. It’s not urban planning, but it is urban (re)design and it makes a difference.

A twist that many in the tech sector who complain about cities often miss is that change in a city is not the same thing as change in city government. Obviously they are deeply intertwined; change is mighty hard when it is done at cross-purposes with government leadership. But it happens all the time. Non-government actors — foundations, non-profits, architecture and urban planning firms, real estate developers, construction companies — contribute massively to the shape and health of our cities.

Often this contribution is powered through policies of open data publication by municipal governments. Open data is the raw material of a city, the vital signs of what has happened there, what is happening right now, and the deep pool of patterns for what might happen next.

Tech entrepreneurs would do well to look at the organizations and companies capitalizing on this data as the real change agents, not government itself. Even the data in many cases is generated outside government. Citizens often do the most interesting data-gathering, with tools like LocalData. The most exciting thing happening at the intersection of technology and cities today — what really makes them “smart” — is what is happening at the periphery of city government. It’s easy to bellyache about government, and certainly there are administrations that do not make data public (or shut it down), but tech companies that are truly interested in city change should know that there are plenty of examples of how to start up and do it.

And yet, the somewhat staid world of architecture and urban-scale design presents the most opportunity to a tech community interested in real urban change. While technology obviously plays a role in urban planning — 3D visual design tools like Revit and mapping services like ArcGIS are foundational for all modern firms — data analytics as a serious input to design matters has only been used in specialized (mostly energy efficiency) scenarios. Where are the predictive analytics, the holistic models, the software-as-a-service providers for the brave new world of urban informatics and The Internet of Things? Technologists, it’s our move.

Something’s amiss when some city governments — rarely the vanguard in technological innovation — have more sophisticated tools for data-driven decision-making than the private sector firms who design the city. But some understand the opportunity. Vannevar Technology is working on it, as is Synthicity. There’s plenty of room for the most positive aspects of tech culture to remake the profession of urban planning itself. (Look to NYU’s Center for Urban Science and Progress and the University of Chicago’s Urban Center for Computation and Data for leadership in this space.)…”

Can a Better Taxonomy Help Behavioral Energy Efficiency?


Article at GreenTechEfficiency: “Hundreds of behavioral energy efficiency programs have sprung up across the U.S. in the past five years, but the effectiveness of the programs — both in terms of cost savings and reduced energy use — can be difficult to gauge.
Of nearly 300 programs, a new report from the American Council for an Energy-Efficient Economy was able to accurately calculate the cost of saved energy from only ten programs….
To help utilities and regulators better define and measure behavioral programs, ACEEE offers a new taxonomy of utility-run behavior programs that breaks them into three major categories:
Cognition: Programs that focus on delivering information to consumers.  (This includes general communication efforts, enhanced billing and bill inserts, social media and classroom-based education.)
Calculus: Programs that rely on consumers making economically rational decisions. (This includes real-time and asynchronous feedback, dynamic pricing, games, incentives and rebates and home energy audits.)
Social interaction: Programs whose key drivers are social interaction and belonging. (This includes community-based social marketing, peer champions, online forums and incentive-based gifts.)
….
While the report was mostly preliminary, it also offered four steps forward for utilities that want to make the most of behavioral programs.
Stack. The types of programs might fit into three broad categories, but judiciously blending cues based on emotion, reason and social interaction into programs is key, according to ACEEE. Even though the report recommends stacked programs that have a multi-modal approach, the authors acknowledge, “This hypothesis will remain untested until we see more stacked programs in the marketplace.”
Track. Just like other areas of grid modernization, utilities need to rethink how they collect, analyze and report the data coming out of behavioral programs. This should include metrics that go beyond just energy savings.
Share. As with other utility programs, behavior-based energy efficiency programs can be improved upon if utilities share results and if reporting is standardized across the country instead of varying by state.
Coordinate. Sharing is only the first step. Programs that merge water, gas and electricity efficiency can often gain better results than siloed programs. That approach, however, requires a coordinated effort by regional utilities and a change to how programs are funded and evaluated by regulators.”

Using Social Media in Rulemaking: Possibilities and Barriers


New paper by Michael Herz (Cardozo Legal Studies Research Paper No. 417): “Web 2.0” is characterized by interaction, collaboration, non-static web sites, use of social media, and creation of user-generated content. In theory, these Web 2.0 tools can be harnessed not only in the private sphere but as tools for an e-topia of citizen engagement and participatory democracy. Notice-and-comment rulemaking is the pre-digital government process that most approached (while still falling far short of) the e-topian vision of public participation in deliberative governance. The notice-and-comment process for federal agency rulemaking has now changed from a paper process to an electronic one. Expectations for this switch were high; many anticipated a revolution that would make rulemaking not just more efficient, but also more broadly participatory, democratic, and dialogic. In the event, the move online has not produced a fundamental shift in the nature of notice-and-comment rulemaking. At the same time, the online world in general has come to be increasingly characterized by participatory and dialogic activities, with a move from static, text-based websites to dynamic, multi-media platforms with large amounts of user-generated content. This shift has not left agencies untouched. To the contrary, agencies at all levels of government have embraced social media – by late 2013 there were over 1,000 registered federal agency Twitter feeds and over 1,000 registered federal agency Facebook pages, for example – but these have been used much more as tools for broadcasting the agency’s message than for dialogue or obtaining input. All of which invites the question of whether agencies could or should directly rely on social media in the rulemaking process.
This study reviews how federal agencies have been using social media to date and considers the practical and legal barriers to using social media in rulemaking, not just to raise the visibility of rulemakings, which is certainly happening, but to gather relevant input and help formulate the content of rules.
The study was undertaken for the Administrative Conference of the United States and is the basis for a set of recommendations adopted by ACUS in December 2013. Those recommendations overlap with but are not identical to the recommendations set out herein.”

How could technology improve policy-making?


Beccy Allen from the Hansard Society (UK): “How can civil servants be sure they have the most relevant, current and reliable data? How can open data be incorporated into the policy making process now and what is the potential for the future use of this vast array of information? How can parliamentary clerks ensure they are aware of the broadest range of expert opinion to inform committee scrutiny? And how can citizens’ views help policy makers to design better policy at all stages of the process?
These are the kind of questions that Sense4us will be exploring over the next three years. The aim is to build a digital tool for policy-makers that can:

  1. locate a broad range of relevant and current information, specific to a particular policy, incorporating open data sets and citizens’ views particularly from social media; and
  2. simulate the consequences and impact of potential policies, allowing policy-makers to change variables and thereby better understand the likely outcomes of a range of policy options before deciding which to adopt.
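The second capability — letting policy-makers change variables and compare the likely outcomes of different options — can be pictured as a small parameterized model. This is a purely illustrative sketch: the variables, coefficients and outcome measure are invented for the example and are not part of Sense4us:

```python
from dataclasses import dataclass

@dataclass
class PolicyScenario:
    # Invented levers a policy-maker might adjust in a simulation.
    subsidy_rate: float        # fraction of cost subsidised, 0.0-1.0
    awareness_campaign: bool   # whether a public campaign runs

    def projected_uptake(self) -> float:
        """Toy model: baseline uptake plus the effect of each lever."""
        uptake = 0.10 + 0.50 * self.subsidy_rate
        if self.awareness_campaign:
            uptake += 0.05
        return min(uptake, 1.0)

# Compare two policy options before deciding which to adopt.
for scenario in (PolicyScenario(0.2, False), PolicyScenario(0.4, True)):
    print(scenario, round(scenario.projected_uptake(), 2))
```

The point of such a tool is not the particular numbers but the workflow: adjust inputs, re-run the model, and see projected outcomes side by side before committing to a policy.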

It is early days for open data and open policy making. The word ‘digital’ peppers the Civil Service Reform Plan but the focus is often on providing information and transactional services digitally. Less attention is paid to how digital tools could improve the nature of policy-making itself.
The Sense4us tool aims to help bridge the gap. It will be developed in consultation with policy-makers at different levels of government across Europe to ensure its potential use by a wide range of stakeholders. At the local level, our partners GESIS (the Leibniz-Institute for the Social Sciences) will be responsible for engaging with users at the city level in Berlin and in the North Rhine-Westphalia state legislature. At the multi-national level, Government to You (Gov2u) will engage with users in the European Parliament and Commission. Meanwhile, the Society will be responsible for national level consultation with civil servants, parliamentarians and parliamentary officials in Whitehall and Westminster, exploring how the tool can be used to support the UK policy process. Our academic partners leading on technical development of the tool are the IT Innovation Centre at Southampton University, eGovlab at Stockholm University, the University of Koblenz-Landau and the Knowledge Media Institute at the Open University.”

NESTA: 14 predictions for 2014


NESTA: “Every year, our team of in-house experts predicts what will be big over the next 12 months.
This year we set out our case for why 2014 will be the year we’re finally delivered the virtual reality experience we were promised two decades ago, the US will lose technological control of the Internet, communities will start crowdsourcing their own political representatives and we’ll be introduced to the concept of extreme volunteering – plus 10 more predictions spanning energy, tech, health, data, impact investment and social policy…
People powered data

The growing movement to take back control of personal data will reach a tipping point, says Geoff Mulgan
2014 will be the year when citizens start to take control over their own data. So far the public has accepted a dramatic increase in use of personal data because it doesn’t impinge much on freedom, and helps to give us a largely free internet.
But all of that could be about to change. Edward Snowden’s NSA revelations have fuelled a growing perception that the big social media firms are cavalier with personal data (a perception not helped by Facebook and Google’s recent moves to make tracking cookies less visible) and the Information Commissioner has described the data protection breaches of many internet firms, banks and others as ‘horrifying’.
According to some this doesn’t matter. Scott McNealy of Sun Microsystems famously dismissed the problem: “you have zero privacy anyway. Get over it.” Mark Zuckerberg claims that young people no longer worry about making their lives transparent. We’re willing to be digital chattels so long as it doesn’t do us any visible harm.
That’s the picture now. But the past isn’t always a good guide to the future. More digitally savvy young people put a high premium on autonomy and control, and don’t like being the dupes of big organisations. We increasingly live with a digital aura alongside our physical identity – a mix of trails, data, pictures. We will increasingly want to shape and control that aura, and will pay a price if we don’t.
That’s why the movement for citizen control over data has gathered momentum. It’s 30 years since Germany enshrined ‘informational self-determination’ in the constitution and other countries are considering similar rules. Organisations like Mydex and Qiy now give users direct control over a store of their personal data, part of an emerging sector of Personal Data Stores, Privacy Dashboards and even ‘Life Management Platforms’. 
In the UK, the government-backed Midata programme is encouraging firms to migrate data back to public control, while the US has introduced green, yellow and blue buttons to simplify the option of taking back your data (in energy, education and the Veterans Administration respectively). Meanwhile a parallel movement encourages people to monetise their own data – so that, for example, Tesco or Experian would have to pay for the privilege of making money out of analysing your purchases and behaviours.
When people are shown what really happens to their data now they are shocked. That’s why we may be near a tipping point. A few more scandals could blow away any remaining complacency about the near future world of ubiquitous facial recognition software (Google Glasses and the like), a world where more people are likely to spy on their neighbours, lovers and colleagues.
The crowdsourced politician

This year we’ll see the rise of the crowdsourced independent parliamentary candidate, says Brenton Caffin
…In response, existing political institutions have sought to improve feedback between the governing and the governed through the tentative embrace of crowdsourcing methods, ranging from digital engagement strategies and open government challenges to the recent stalled attempt to embrace open primaries by the Conservative Party (Iceland has been braver, designing its constitution by wiki). Though for many, these efforts are both too little and too late. The sense of frustration that no political party is listening to the real needs of people is probably part of the reason Russell Brand’s interview with Jeremy Paxman garnered nine million views in its first month on YouTube.
However a glimpse of an alternative approach may have arrived courtesy of the 2013 Australian Federal Election.
Tired of being taken for granted by the local MP, locals in the traditionally safe conservative seat of Indi embarked on a structured process of community ‘kitchen table’ conversations to articulate an independent account of the region’s needs. The community group, Voice for Indi, later nominated its chair, Cath McGowan, as an independent candidate. It crowdfunded her campaign finances and built a formidable army of volunteers through a sophisticated social media operation….
The rise of ‘extreme’ volunteering

By the end of 2014 the concept of volunteering will move away from the soup kitchen and become an integral part of how our communities operate, says Lindsay Levkoff Lynn
Extreme volunteering is about regular people going beyond the usual levels of volunteering. It is a deeper and more intensive form of volunteering, and I predict we will see more of these amazing commitments of ‘people helping people’ in the years to come.
Let me give you a few early examples of what we are already starting to see in the UK:

  • Giving a whole year of your life in service of kids. That’s what City Year volunteers do: young people (18-25) dedicate a year, full-time, before university or work to support head teachers in turning around the behaviour and academics of some of the most underprivileged UK schools.
  • Giving a stranger a place to live and making them part of your family. That’s what Shared Lives Plus carers do. They ‘adopt’ an older person or a person with learning disabilities and offer them a place in their family. So instead of institutional care, families provide the full-time care – much like a ‘fostering for adults’ programme. Can you imagine inviting someone to come and live with you?…

Buenos Aires, A Pocket of Civic Innovation in Argentina


Rebecca Chao in TechPresident: “…In only a few years, the government, civil society and media in Buenos Aires have actively embraced open data. The Buenos Aires city government has been publishing data under a creative commons license and encouraging civic innovation through hackathons. NGOs have launched a number of tech-driven tools and Argentina’s second largest newspaper, La Nación, has published several hard-hitting data journalism projects. The result is a fledgling but flourishing open data culture in Buenos Aires, in a country that has not yet adopted a freedom of information law.

A Wikipedia for Open Government Data

In late August of this year, the Buenos Aires government declared a creative commons license for all of its digital content, which allows it to be used for free, like Wikipedia content, with proper attribution. This applies to their new open data catalog, which allows users to visualize the data, examine apps that have been created using the data, and even includes a design lab for posting app ideas. Launched only in March, the catalog already contains fairly substantial data sets, including the salaries of city officials. The website also embodies the principles of openness in its design; it is built with open-source software and its code is available for reuse via GitHub.
“We were the first city in Argentina doing open government,” Rudi Borrmann tells techPresident over Skype. Borrmann is the Director of Buenos Aires’ Open Government Initiative. Previously, he was the social media editor at the city’s New Media Office but he also worked for many years in digital media…
While the civil society and media sectors have forged ahead in using open data, Borrmann tells techPresident that up in the ivory tower, openness to open data has been lagging. “Only technical schools are starting to create areas focused on working on open data,” he says.
In an interview with NYU’s govlab, Borrmann explained the significance of academia in using and pushing for more open data. “They have the means, the resources, the methodology to analyze…because in government you don’t have that time to analyze,” he said.
Another issue with open data is getting other branches of the government to modernize. Borrmann says that a lot of the Open Government’s work is done behind the scenes. “In general, you have very poor IT infrastructure all over Latin America” that interferes with the gathering and publishing of data, he says. “So in some cases it’s not about publishing or not publishing,” but about “having robust infrastructure for the information.”
It seems that the behind the scenes work is bearing some fruit. Just last week, on Dec. 6, the team behind the Buenos Aires open data website launched an impressive, interactive timeline, based on a similar timelapse map developed by a 2013 Knight-Mozilla Fellow, Noah Veltman. Against faded black and white photos depicting the subway from different decades over the last century, colorful pops of the Subterráneo lines emerge alongside factoids that go all the way back to 1910.”