Expanding Training on Data and Technology to Improve Communities


Kathryn Pettit at the National Neighborhood Indicators Partnership (NNIP): “Local government and nonprofit staff need data and technology skills to regularly monitor local conditions and design programs that achieve more effective outcomes. Tailored training is essential to help them gain the knowledge and confidence to leverage these indispensable tools. A recent survey of organizations that provide data and technology training documented current practices and how such training should be expanded. Four recommendations are provided to assist government agencies, elected leaders, nonprofit executives, and local funders in empowering workers with the necessary training to use data and technology to benefit their communities. Specifically, community stakeholders should collectively work to

  • expand the training available to government and nonprofit staff;
  • foster opportunities for sharing training materials and lessons;
  • identify allies who can enhance and support local training efforts;
  • and assess the local landscape of data and technology training.

Project Products

  • Brief: A summary of the current training landscape and key action steps for various sectors to ensure that local government and nonprofit staff have the data and technology skills needed for their civic missions.
  • Guide: A document for organizations interested in providing community data and technology training, including advice on how to assess local needs, develop training content, and fund these efforts.
  • Catalog: Example training descriptions and related materials collected from various cities for local adaptation.
  • Fact sheet: A summary of results from a survey on current training content and practices….(More)”

Braindates


Greg Oates at Skift: “C2 has evolved into a global benchmark for event design since launching in 2012, but the innovation extends beyond the creative mojo and significant resources invested in the physical layout.

The show works with Montreal-based E-180 to provide a digital matchmaking platform called Braindate for attendees to find other people with like-minded interests. It’s always a challenge at conferences to meet someone who might be a potential business partner or learning source, especially when you don’t know who those people are.

In recent years, dozens of emerging event technology companies have attempted to develop cloud-based personal connectivity platforms to help conference participants cull through their attendee lists. Few, however, have scaled to provide an effective solution for crowdsourced learning and networking at events.

Braindate is …designed to be integrated into an event app where attendees can post a one-sentence question or goal explaining their particular interest. Those posts can be tagged with any number of phrases, such as “event UX” or “smart city” or “experiential marketing” to help users streamline their search.

When someone sees a Braindate suggestion that looks appealing, he or she clicks on it to show a calendar with concurrent open slots in both people’s schedules. Then that person sends a request with a specific time to meet to discuss their shared interests. It’s also important to note that C2 registrants could schedule meetings pre-conference, although the bulk of activity kicked in on opening day.
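
The matching step described here boils down to intersecting two attendees’ free time. A minimal sketch, assuming half-hour slots (the function names and logic are illustrative, not E-180’s actual implementation):

```python
# A rough sketch of mutual-availability matching, assuming half-hour
# slots. Names and logic are illustrative, not E-180's actual code.
from datetime import datetime, timedelta

def free_slots(busy, day_start, day_end, slot=timedelta(minutes=30)):
    """Yield the start time of each open slot between day_start and
    day_end that does not overlap any busy (start, end) interval."""
    t = day_start
    while t + slot <= day_end:
        if not any(s < t + slot and t < e for s, e in busy):
            yield t
        t += slot

def mutual_slots(busy_a, busy_b, day_start, day_end):
    """Times open in both attendees' calendars -- the concurrent open
    slots a Braindate request could propose."""
    a = set(free_slots(busy_a, day_start, day_end))
    b = set(free_slots(busy_b, day_start, day_end))
    return sorted(a & b)

# Toy example: two attendees on an event day running 9:00-17:00.
day = datetime(2017, 5, 24)
start, end = day.replace(hour=9), day.replace(hour=17)
alice_busy = [(day.replace(hour=10), day.replace(hour=12))]
bob_busy = [(day.replace(hour=9), day.replace(hour=10, minute=30))]
print(mutual_slots(alice_busy, bob_busy, start, end)[:3])
```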

Other conferences have incorporated the Braindate platform into their event design and programming, including TED Women, Airbnb Open, Salesforce’s Dreamforce, and re:publica.

New this year at C2 Montreal, E-180 launched a Group Braindate scenario where a participant could schedule and host a meeting focusing on a specific topic with four other people. Interested parties would reserve one of the four open spaces, and then everyone participating could chat online within the specific Braindate page to curate the conversation ahead of time….

This year, Montreal-based PixMob developed a Web-based event platform for C2 called “Klik” — with the Braindate platform embedded in the user interface — instead of the downloadable iOS/Android event app typically used for large tech/media/marketing conferences.

The obvious benefit of a Web-based event platform is that people don’t have to download another dedicated app to their phone solely for the duration of the live experience. The trade-off, however, was lag whenever attendees tried to access Klik, and the site had to reload every time you moved from page to page. The portal also required users to log in repeatedly throughout the day.

That’s a big pain point when you’re running around an event floor trying to make quick decisions while paging through app sections on the fly, along with thousands of other people doing the same thing inside a highly congested space.

Attendees also couldn’t add personal meetings into the Klik event agenda, requiring them to shift constantly between the Klik schedule and their email schedule, and the content describing each session and activation was often minimal to the point of being useless….(More)”.

How Data Mining Facebook Messages Can Reveal Substance Abusers


Emerging Technology from the arXiv: “…Substance abuse is a serious concern. Around one in 10 Americans is a sufferer, which is why it costs the American economy more than $700 billion a year in lost productivity, crime, and health-care costs. So a better way to identify people suffering from the disorder, and those at risk of succumbing to it, would be hugely useful.

Bickel and co say they have developed just such a technique, which allows them to spot sufferers simply by looking at their social media messages such as Facebook posts. The technique even provides new insights into the way abuse of different substances influences people’s social media messages.

The new technique comes from the analysis of data collected between 2007 and 2012 as part of a project that ran on Facebook called myPersonality. Users who signed up were offered various psychometric tests and given feedback on their scores. Many also agreed to allow the data to be used for research purposes.

One of these tests asked over 13,000 users with an average age of 23 about the substances they used. In particular, it asked how often they used tobacco, alcohol, or other drugs, and assessed each participant’s level of use. The users were then divided into groups according to their level of substance abuse.

This data set is important because it acts as a kind of ground truth, recording the exact level of substance use for each person.

The team next gathered two other Facebook-related data sets. The first was 22 million status updates posted by more than 150,000 Facebook users. The other was even larger: the “like” data associated with 11 million Facebook users.

Finally, the team worked out how these data sets overlapped. They found almost 1,000 users who were in all the data sets, just over 1,000 who were in the substance abuse and status update data sets, and 3,500 who were in the substance abuse and likes data sets.
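
In code terms, working out the overlap is a set intersection on user IDs. A toy sketch with placeholder IDs (the real sets held thousands to millions of users):

```python
# The overlap step, reduced to set intersection on user IDs. The IDs
# below are placeholders, not study data.
survey_users = {101, 102, 103, 104}   # substance-use survey takers
status_users = {102, 103, 105}        # users with status updates
likes_users = {103, 104, 106}         # users with "like" data

in_all_three = survey_users & status_users & likes_users
survey_and_status = survey_users & status_users
survey_and_likes = survey_users & likes_users
print(len(in_all_three), len(survey_and_status), len(survey_and_likes))
```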

These users with overlapping data sets provide rich pickings for data miners. If people with substance use disorders have certain unique patterns of behavior, it may be possible to spot these in their Facebook status updates or in their patterns of likes.

So Bickel and co got to work first by text mining most of the Facebook status updates and then data mining most of the likes data set. Any patterns they found, they then tested by looking for people with similar patterns in the remaining data and seeing if they also had the same level of substance use.
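
The published paper specifies the actual features and models; the following is only a minimal sketch of the train-then-test pattern described above, with toy data and TF-IDF plus logistic regression standing in for the authors’ methods:

```python
# A hedged, minimal sketch of the general approach described above:
# learn text patterns from labeled users, then test on held-out users.
# Toy data throughout; TF-IDF + logistic regression stand in for
# whatever features and models Bickel and co actually used.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

# One concatenated status-update history per user, with a survey label.
statuses = [
    "up all night again, smoke break soon",
    "gym then smoothies with the team",
    "party was wild, need a cigarette",
    "quiet weekend reading and hiking",
    "bar crawl tonight, who is in",
    "early run before the farmers market",
]
uses_substance = [1, 0, 1, 0, 1, 0]  # placeholder ground truth

X_train, X_test, y_train, y_test = train_test_split(
    statuses, uses_substance, test_size=0.33, random_state=0,
    stratify=uses_substance)

model = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
model.fit(X_train, y_train)

# On real data, a held-out score like this is what the reported
# accuracies below correspond to; on six toy rows it is meaningless.
print(roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```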

The results make for interesting reading. The team says its technique was hugely successful. “Our best models achieved 86% for predicting tobacco use, 81% for alcohol use and 84% for drug use, all of which significantly outperformed existing methods,” say Bickel and co…. (More) (Full Paper: arxiv.org/abs/1705.05633: Social Media-based Substance Use Prediction).

How Twitter Is Being Gamed to Feed Misinformation


The New York Times: “…the biggest problem with Twitter’s place in the news is its role in the production and dissemination of propaganda and misinformation. It keeps pushing conspiracy theories — and because lots of people in the media, not to mention many news consumers, don’t quite understand how it works, the precise mechanism is worth digging into….Here’s how.

The guts of the news business.

One way to think of today’s disinformation ecosystem is to picture it as a kind of gastrointestinal tract…. Twitter often acts as the small bowel of digital news. It’s where political messaging and disinformation get digested, packaged and widely picked up for mass distribution to cable, Facebook and the rest of the world.

This role for Twitter has seemed to grow more intense during (and since) the 2016 campaign. Twitter now functions as a clubhouse for much of the news. It’s where journalists pick up stories, meet sources, promote their work, criticize competitors’ work and workshop takes. In a more subtle way, Twitter has become a place where many journalists unconsciously build and gut-check a worldview — where they develop a sense of what’s important and merits coverage, and what doesn’t.

This makes Twitter a prime target for manipulators: If you can get something big on Twitter, you’re almost guaranteed coverage everywhere….

Twitter is clogged with fake people.

For determined media manipulators, getting something big on Twitter isn’t all that difficult. Unlike Facebook, which requires people to use their real names, Twitter offers users essentially full anonymity, and it makes many of its functions accessible to outside programmers, allowing people to automate their actions on the service.

As a result, numerous cheap and easy-to-use online tools let people quickly create thousands of Twitter bots — accounts that look real, but that are controlled by a puppet master.

Twitter’s design also promotes a slavish devotion to metrics: Every tweet comes with a counter of Likes and Retweets, and users come to internalize these metrics as proxies for real-world popularity….

They may ruin democracy.

…. the more I spoke to experts, the more convinced I became that propaganda bots on Twitter might be a growing and terrifying scourge on democracy. Research suggests that bots are ubiquitous on Twitter. Emilio Ferrara and Alessandro Bessi, researchers at the University of Southern California, found that about a fifth of the election-related conversation on Twitter last year was generated by bots. Most users were blind to them; they treated the bots the same way they treated other users….

In a more pernicious way, bots give us an easy way to doubt everything we see online. In the same way that the rise of “fake news” gives the president cover to label everything “fake news,” the rise of bots might soon allow us to dismiss any online enthusiasm as driven by automation. Anyone you don’t like could be a bot; any highly retweeted post could be puffed up by bots….(More)”.

More professionalism, less populism: How voting makes us stupid, and what to do about it


Paper by Benjamin Wittes and Jonathan Rauch: “For several generations, political reform and rhetoric have been entirely one-directional: always more direct democracy, never less. The general belief holds that more public involvement will produce more representative and thus more effective and legitimate governance. But does increasing popular involvement in politics remedy the ills of our government culture? Is it the chicken soup of political reforms?

In a new report, “More professionalism, less populism: How voting makes us stupid, and what to do about it,” Brookings Senior Fellows Jonathan Rauch and Benjamin Wittes argue that the best way forward is to rebalance the reform agenda away from direct participation and toward intermediation and institutions. As the authors write, “Neither theory nor practice supports the idea that more participation will produce better policy outcomes, or will improve the public’s approbation of government, or is even attainable in an environment dominated by extreme partisans and narrow interest groups.”

Populism cannot solve our problems, Rauch and Wittes claim, because its core premises and reforms are self-defeating. Research has shown that voters are “irrationally biased and rationally ignorant,” and do not possess the specialized knowledge necessary to make complex policy judgments. Further, elections provide little by way of substantive guidance for policymakers and, even on its own terms, direct democracy is often unrepresentative. In the words of the authors, “By itself, building more direct input from the public into the functions of government is likely to lead to more fragmentation, more stalemate, more flawed policies—and, paradoxically, less effective representation.”

The authors are not advocating complacency about voter participation, much less restricting or limiting voting: “We are arguing that participation is not enough, and that overinvesting in it neglects other, more promising paths.”

To truly repair American democracy, Rauch and Wittes endorse a resurgence of political institutions, such as political parties, and substantive professionals, such as career politicians and experts. Drawing on examples like the intelligence oversight community, the authors assert that these intermediaries actually make democracy more inclusive and more representative than direct participation can do by itself. “In complex policy spaces,” the authors write, “properly designed intermediary institutions can act more decisively and responsively on behalf of the public than an army of ‘the people’ could do on its own behalf, [and are] less likely to be paralyzed by factional disputes and distorted by special-interest manipulation.”…(More) (Read the full paper here).

Data Journalism: How Not To Be Wrong


Winny de Jong: “At the intersection of data and journalism, lots can go wrong. Merely taking precautions might not be enough….

Half True Is False

But this approach is not totally foolproof.

“In data journalism, we cannot settle for ‘half-true.’ Anything short of true is wrong – and we cannot afford to be wrong.” Unlike fact-checking websites such as Politifact, which invented ‘scales’ for truthfulness, from false to true and everything in between, data journalism should always be true.

[Image: No Pants on Fire – Politifact’s Truth-O-Meter]

True but Wrong

But even when your story is true, Gebeloff said, you could still be wrong. “You can do the math correctly, but get the context wrong, fail to acknowledge uncertainties or not describe your findings correctly.”

Fancy Math

When working on a story, journalists should consider whether they use “fancy math” – think statistics – or “standard math.” “Using fancy math you can explore complex relationships, but at the same time your story will be harder to explain.”…

Targets as a Source

…To make sure you’re not going to be wrong, you should share your findings. “Don’t just share findings with experts, share them with hostile experts too,” Gebeloff advises. “Use your targets as a source. If there’s blowback, you want to know before publication – and include the blowback in the publication.”

How Not To Be Wrong Checklist

Here’s why you want to use this checklist, which is based on Gebeloff’s presentation: a half-truth is false, and data journalism should always be true. But just being true is not enough. Your story can be mathematically true but wrong in context or explanation. You should want your stories to be true and not wrong.

  1. Check your data carefully (a minimal pandas sketch follows this list):
    •  Pay attention to dates.
    •  Check for spelling and duplicates.
    •  Identify outliers.
    •  Statistical significance alone is not news.
    •  Prevent base-year abuse: if something is a trend, it should be true in general, not just when you cherry-pick a base year.
    •  Make sure your data represents reality.
  2. As you work, keep a data diary that records what you’ve done and how you’ve done it. You should be able to reproduce your calculations.
  3. Make sure you explain the methods you used – your audience should be able to understand how you found the story.
  4. Play offense and defense simultaneously. Go for the maximum possible story, but at all times think of why you might be wrong, or what your target would say in response.
  5. Use your targets as a source to find blowback before publication.
  6. As part of the proofing process, create a footnotes file. Identify each fact and give it a number. Then, for each fact, list which document it came from, how you know it and the proof. Fix what needs to be fixed.
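
Several of the checks in step 1 map directly onto a few lines of code. Purely as a starting point (assuming a tabular dataset; the file and column names below are invented for illustration):

```python
# A minimal pandas sketch of step 1's checks. "incidents.csv" and the
# column names are placeholders; adapt them to your own dataset.
import pandas as pd

df = pd.read_csv("incidents.csv", parse_dates=["date"])

# Pay attention to dates: flag records outside the expected window.
bad_dates = df[(df["date"] < "2007-01-01") | (df["date"] > "2017-12-31")]

# Check for duplicates (keep=False marks every copy, not just extras).
dupes = df[df.duplicated(keep=False)]

# Identify outliers, e.g. values more than 3 standard deviations out.
z = (df["amount"] - df["amount"].mean()) / df["amount"].std()
outliers = df[z.abs() > 3]

# Prevent base-year abuse: a claimed trend should hold for several
# plausible base years, not just a cherry-picked one.
yearly = df.groupby(df["date"].dt.year)["amount"].sum()
for base in yearly.index[:-1]:
    print(f"Change since {base}: {yearly.iloc[-1] / yearly.loc[base] - 1:+.1%}")
```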

Additional links of interest: the slides of Robert Gebeloff’s “How Not To Be Wrong” presentation, and the methodology notes and data from the “Race Behind Bars” series….(More)”

Digital Service Teams: Challenges and Recommendations for Government


Report by Ines Mergel: “The British government successfully pioneered the use of a national, semi-independent “surge team” to tackle large-scale technology-driven challenges facing it. The U.S. federal government adapted this approach to improve the success of its own operations in 2014, naming its top-level team the “U.S. Digital Service.” It then created a small internal software development and service organization, dubbed “18F,” to support both USDS and individual agencies. And individual agencies are creating their own internal digital service teams as well.

Dr. Mergel interviews leaders and users of digital service teams in the US and internationally to learn how they operate and the challenges they faced in creating a private-sector-like “startup” culture within government to foster innovation and attract top-level tech talent. Her report identifies six challenges and offers recommendations on actions that both digital service leaders and policymakers can take to ensure the sustainability and scaling of this novel approach in government….(More)”.

Routledge Handbook on Information Technology in Government


Book edited by Yu-Che Chen and Michael J. Ahn: “The explosive growth in information technology has ushered in unparalleled new opportunities for advancing public service. Featuring 24 chapters from foremost experts in the field of digital government, this Handbook provides an authoritative survey of key emerging technologies, their current state of development and use in government, and insightful discussions on how they are reshaping and influencing the future of public administration. This Handbook explores:

  • Key emerging technologies (e.g., big data, social media, the Internet of Things (IoT), GIS, smartphones, and mobile technologies) and their impacts on public administration
  • The impacts of the new technologies on the relationships between citizens and their governments, with a focus on collaborative governance
  • Key theories of IT innovations in government on the interplay between technological innovations and public administration
  • The relationship between technology and democratic accountability and the various ways of harnessing the new technologies to advance public value
  • Key strategies and conditions for fostering success in leveraging technological innovations for public service

This Handbook will prove to be an invaluable guide and resource for students, scholars and practitioners interested in this growing field of technological innovations in government….(More)”.

The People’s Verdict: Adding Informed Citizen Voices to Public Decision-Making


Book by Claudia Chwalisz: “In a post-Brexit world with populists on the rise, trust in government and politicians is in short supply. People claim to be tired of ‘experts’, and the divide between facts and opinion has been blurred. The art of offering simple solutions to complex problems is tipping the scale away from nuanced, multifaceted answers founded on compromise.

Within this context, governments nonetheless need to make difficult decisions, whether it is developing budgets, aligning priorities, or designing long-term projects. It is often impossible to make everybody happy, and the messy business of weighing trade-offs takes place.

While sometimes these tricky policy dilemmas are relegated to independent commissions or inquiries, or lately to referendums, a better method exists for solving them. This study of almost 50 long-form deliberative processes in Canada and Australia makes the case that adding informed citizen voices to public decision-making leads to more effective policies. By putting the problem to the people, giving them information, time to discuss the options, to find common ground and to decide what they want, public bodies gain the legitimacy to act on hard choices…(More)”

Permanent Campaigning in Canada


Book by Alex Marland, Thierry Giasson, and Anna Lennox Esselment: “Election campaigning never stops. That is the new reality of politics and government in Canada, where everyone from staffers in the Prime Minister’s Office to backbench MPs practise political marketing and communication as though the official campaign were still underway.

Permanent Campaigning in Canada examines the growth and democratic implications of political parties’ relentless search for votes and popularity and what a constant state of electioneering means for governance. With the emergence of fixed-date elections and digital media, each day is a battle to win mini-contests: the news cycle, public opinion polls, quarterly fundraising results, by-elections, and more. The contributors’ case studies – on political databases, the strategy behind online political communication, the politicization of government advertising, and the role of the PMO and political staff – reveal how political actors are using all the tools at their disposal to secure electoral advantage, including the use of public resources for partisan gain.

This is the first study of a phenomenon that has become embedded in Canadian politics and government. It reveals the extent to which political parties and political staff have embraced non-stop electioneering, and the consequences for our democratic processes and institutions….(More)”