The Tech Intellectuals


New Essay by Henry Farrell in Democracy: “A quarter of a century ago, Russell Jacoby lamented the demise of the public intellectual. The cause of death was an improvement in material conditions. Public intellectuals—Dwight Macdonald, I.F. Stone, and their like—once had little choice but to be independent. They had difficulty getting permanent well-paying jobs. However, as universities began to expand, they offered new opportunities to erstwhile unemployables. The academy demanded a high price. Intellectuals had to turn away from the public and toward the practiced obscurities of academic research and prose. In Jacoby’s description, these intellectuals “no longer need[ed] or want[ed] a larger public…. Campuses [were] their homes; colleagues their audience; monographs and specialized journals their media.”
Over the last decade, conditions have changed again. New possibilities are opening up for public intellectuals. Internet-fueled media such as blogs have made it much easier for aspiring intellectuals to publish their opinions. They have fostered the creation of new intellectual outlets (Jacobin, The New Inquiry, The Los Angeles Review of Books), and helped revitalize some old ones too (The Baffler, Dissent). Finally, and not least, they have provided the meat for a new set of arguments about how communications technology is reshaping society.
These debates have created opportunities for an emergent breed of professional argument-crafters: technology intellectuals. Like their predecessors of the 1950s and ’60s, they often make a living without having to work for a university. Indeed, the professoriate is being left behind. Traditional academic disciplines (except for law, which has a magpie-like fascination with new and shiny things) have had a hard time keeping up. New technologies, to traditionalists, are suspect: They are difficult to pin down within traditional academic boundaries, and they look a little too fashionable to senior academics, who are often nervous that their fields might somehow become publicly relevant.
Many of these new public intellectuals are more or less self-made. Others are scholars (often with uncomfortable relationships with the academy, such as Clay Shirky, an unorthodox professor who is skeptical that the traditional university model can survive). Others still are entrepreneurs, like technology and media writer and podcaster Jeff Jarvis, working the angles between public argument and emerging business models….
Different incentives would lead to different debates. In a better world, technology intellectuals might think more seriously about the relationship between technological change and economic inequality. Many technology intellectuals think of the culture of Silicon Valley as inherently egalitarian, yet economist James Galbraith argues that income inequality in the United States “has been driven by capital gains and stock options, mostly in the tech sector.”
They might think more seriously about how technology is changing politics. Current debates are still dominated by pointless arguments between enthusiasts who believe the Internet is a model for a radically better democracy, and skeptics who claim it is the dictator’s best friend.
Finally, they might pay more attention to the burgeoning relationship between technology companies and the U.S. government. Technology intellectuals like to think that a powerful technology sector can enhance personal freedom and constrain the excesses of government. Instead, we are now seeing how a powerful technology sector may enable government excesses. Without big semi-monopolies like Facebook, Google, and Microsoft to hoover up personal information, surveillance would be far more difficult for the U.S. government.
Debating these issues would require a more diverse group of technology intellectuals. The current crop are not diverse in some immediately obvious ways—there are few women, few nonwhites, and few non-English speakers who have ascended to the peak of attention. Yet there is also far less intellectual diversity than there ought to be. The core assumptions of public debates over technology get less attention than they need and deserve.”

Patients Take Control of Their Health Care Online


MIT Technology Review: “Patients are collaborating for better health — and, just maybe, radically reduced health-care costs…. Not long ago, Sean Ahrens managed flare-ups of his Crohn’s disease—abdominal pain, vomiting, diarrhea—by calling his doctor and waiting a month for an appointment, only to face an inconclusive array of possible prescriptions. Today, he can call on 4,210 fellow patients in 66 countries who collaborate online to learn which treatments—drugs, diets, acupuncture, meditation, even do-it-yourself infusions of intestinal parasites—bring the most relief.
The online community Ahrens created and launched two years ago, Crohnology.com, is one of the most closely watched experiments in digital health. It lets patients with Crohn’s, colitis, and other inflammatory bowel conditions track symptoms, trade information on different diets and remedies, and generally care for themselves.
The site is at the vanguard of the growing “e-patient” movement that is letting patients take control over their health decisions—and behavior—in ways that could fundamentally change the economics of health care. Investors are particularly interested in the role “peer-to-peer” social networks could play in the $3 trillion U.S. health-care market.

[Crohnology chart]

“Patients sharing data about how they feel, the type of treatments they’re using, and how well they’re working is a new behavior,” says Malay Gandhi, chief strategy officer of Rock Health, a San Francisco incubator for health-care startups that invested in Crohnology.com. “If you can get consumers to engage in their health for 15 to 30 minutes a day, there’s the largest opportunity in digital health care.”
Experts say when patients learn from each other, they tend to get fewer tests, make fewer doctors’ visits, and also demand better treatment. “It can lead to better quality, which in many cases will be way more affordable,” says Bob Kocher, an oncologist and former adviser to the Obama administration on health policy.”

Public Open Data: The Good, the Bad, the Future


At IDEALAB: “Some of the most powerful tools combine official public data with social media or other citizen input, such as the recent partnership between Yelp and the public health departments in New York and San Francisco for restaurant hygiene inspection ratings. In other contexts, such tools can help uncover and ultimately reduce corruption by making it easier to “follow the money.”
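The Yelp partnership above suggests a general pattern: join an official dataset to citizen-generated data on a shared key. What follows is a minimal, hypothetical sketch of that pattern in Python; the datasets, field names, and scores are invented for illustration, not drawn from the actual Yelp or health-department feeds.

# Hypothetical sketch: merge official inspection scores (public data)
# with citizen review ratings on a shared key. All records are invented.

inspections = [  # imagine an export from a city open-data portal
    {"business": "Taqueria Uno", "hygiene_score": 92},
    {"business": "Noodle House", "hygiene_score": 71},
]

reviews = [  # imagine aggregated ratings from a review site
    {"business": "Taqueria Uno", "avg_stars": 4.5},
    {"business": "Noodle House", "avg_stars": 4.0},
]

# Index one source by the join key, then walk the other.
stars_by_business = {r["business"]: r["avg_stars"] for r in reviews}
for row in inspections:
    stars = stars_by_business.get(row["business"], "n/a")
    print(f'{row["business"]}: hygiene {row["hygiene_score"]}, stars {stars}')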
Despite the opportunities offered by “free data,” this trend also raises new challenges and concerns, among them, personal privacy and security. While attention has been devoted to the unsettling power of big data analysis and “predictive analytics” for corporate marketing, similar questions could be asked about the value of public data. Does it contribute to community cohesion that I can find out with a single query how much my neighbors paid for their house or (if employed by public agencies) their salaries? Indeed, some studies suggest that greater transparency leads not to greater trust in government but to resignation and apathy.
Exposing certain law enforcement data also increases the possibility of vigilantism. California law requires the registration and publication of the home addresses of known sex offenders, for instance. Or consider the controversy and online threats that erupted when, shortly after the Newtown tragedy, a newspaper in New York posted an interactive map of gun permit owners in nearby counties.
…Policymakers and officials must still mind the “big data gap.” So what does the future hold for open data? Publishing data is only one part of the information ecosystem. To be useful, tools must be developed for cleaning, sorting, analyzing and visualizing it as well. …
For-profit companies and non-profit watchdog organizations will continue to emerge and expand, building on the foundation of this data flood. Public-private partnerships such as those between San Francisco and Appallicious or Granicus, startups created by Code for America’s Incubator, and non-partisan organizations like the Sunlight Foundation and MapLight rely on public data repositories for their innovative applications and analysis.
Making public data more accessible is an important goal and offers enormous potential to increase civic engagement. To make the most effective and equitable use of this resource for the public good, cities and other government entities should invest in the personnel and equipment — hardware and software — to make it universally accessible. At the same time, Chief Data Officers (or equivalent roles) should also be alert to the often hidden challenges of equity, inclusion, privacy, and security.”

From Potholes to Policies: Technology, Civic Engagement and the Path to Peer-Produced Governance


Chris Osgood and Nigel Jacob at Living Cities: “There’s been tremendous energy behind the movement to change the way that local governments use technology to better connect with residents. Civic hackers, Code for America Fellows, concerned residents, and offices such as ours, the Mayor’s Office of New Urban Mechanics in Boston, are working together to create a more collaborative environment in which these various players can develop new kinds of solutions to urban challenges…

These initiatives have shown a lot of promise. Now we need to build on these innovations to bring public participation into the heart of policymaking.
This is not going to happen overnight, nor is the path to changing the interface between citizens and government an obvious one. However, reflecting on the work we’ve done over the past few years, we are starting to see a set of design principles that can help guide our efforts. These are emergent, and so imperfect, but we share them here in the hopes of getting feedback to improve them:

  1. The reasons for engagement must be clear: It is incumbent on us as creators and purveyors of civic technologies to be crystal-clear about what policies we are trying to rewrite, why, and what role the public plays in that process. With the Boston Public Schools, the Community PlanIt game was built to engage residents both online and in person to co-design school performance metrics; the result was an approach that was different, and better, than what had originally been proposed, with less discord than was happening in traditional town hall meetings.
  2. Channels must be high-quality and appropriately receptive: When you use Citizens Connect to report quality-of-life issues in Boston, you get an email saying: “Thank you for reporting this pothole. It has now been fixed.” You can’t just cut and paste that email to say: “Thank you for your views on this policy. The policy has now been fixed.” The channel has to make it possible for the City to make meaning of and act on resident input, and then to communicate back to users what has been done and why. And as our friends at Code for America say, they must be “simple, beautiful and easy to use.” (A minimal sketch of this report-and-respond loop follows the list.)
  3. Transparency is vital: Transparency around how and why the process works fosters greater public trust in the system and consequently makes people more likely to engage. Local leaders must therefore be very clear up-front about these points, and communicate them repeatedly and consistently in the face of potential mistrust and misunderstanding.”
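The report-and-respond loop described in the second principle might look like the following minimal sketch. The Report class, the statuses, and the message templates are all hypothetical; nothing here is drawn from the actual Citizens Connect implementation.

# Hypothetical sketch of the report-and-respond loop in principle 2:
# a resident report moves through statuses, and each transition sends a
# plain-language update back to the reporter (print stands in for email).
from dataclasses import dataclass

MESSAGES = {
    "received": "Thank you for reporting this {kind}. We're on it.",
    "fixed": "Thank you for reporting this {kind}. It has now been fixed.",
}

@dataclass
class Report:
    kind: str       # e.g., "pothole"
    reporter: str   # where updates should be sent
    status: str = "received"

def update_status(report: Report, new_status: str) -> None:
    """Record the new status and notify the reporter."""
    report.status = new_status
    print(f"To {report.reporter}:", MESSAGES[new_status].format(kind=report.kind))

report = Report(kind="pothole", reporter="resident@example.com")
update_status(report, "fixed")

The hard part, as the principle itself suggests, is the middle step: a policy comment has no simple “fixed” state, so the status vocabulary and the messages would need to reflect how resident input actually shaped the outcome.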


Innovating to Improve Disaster Response and Recovery


Todd Park at OSTP blog: “Last week, the White House Office of Science and Technology Policy (OSTP) and the Federal Emergency Management Agency (FEMA) jointly challenged a group of over 80 top innovators from around the country to come up with ways to improve disaster response and recovery efforts.  This diverse group of stakeholders, consisting of representatives from Zappos, Airbnb, Marriott International, the Parsons School of Design, AOL/Huffington Post’s Social Impact, The Weather Channel, Twitter, Topix.com, Twilio, New York City, Google and the Red Cross, to name a few, spent an entire day at the White House collaborating on ideas for tools, products, services, programs, and apps that can assist disaster survivors and communities…
During the “Data Jam/Think Tank,” we discussed response and recovery challenges… Below are some of the ideas that were developed throughout the day. In the case of the first two ideas, participants wrote code and created actual working prototypes.

  • A real-time communications platform that allows survivors dependent on electricity-powered medical devices to text or call in their needs—such as batteries, medication, or a power generator—and connect those needs with a collaborative transportation network to make real-time deliveries.
  • A technical schema that tags all disaster-related information from social media and news sites – enabling municipalities and first responders to better understand all of the invaluable information generated during a disaster and identify where help is needed (a sketch of one possible record shape follows this list).
  • A Disaster Relief Innovation Vendor Engine (DRIVE) which aggregates pre-approved vendors for disaster-related needs, including transportation, power, housing, and medical supplies, to make it as easy as possible to find scarce local resources.
  • A crowdfunding platform for small businesses and others to receive access to capital to help rebuild after a disaster, including a rating system that encourages rebuilding efforts that improve the community.
  • Promoting preparedness through talk shows, working closely with celebrities, musicians, and children to raise awareness.
  • A “community power-go-round” that, like a merry-go-round, can be pushed to generate electricity and additional power for battery-charged devices including cell phones or a Wi-Fi network to provide community internet access.
  • Aggregating crowdsourced imagery taken and shared through social media sites to help identify where trees have fallen, electrical lines have been toppled, and streets have been obstructed.
  • A kid-run local radio station used to educate youth about preparedness for a disaster and activated to support relief efforts during a disaster that allows youth to share their experiences.”
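The “technical schema” bullet above is easiest to picture as a common record shape that every tagged post gets normalized into. The following sketch is invented for illustration; the workshop prototype’s real schema is not described in the post.

# Invented illustration of a disaster-information tagging schema: one
# normalized record shape for posts pulled from social media or news sites.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional, Tuple

@dataclass
class DisasterReport:
    source: str                                     # e.g., "twitter", "local-news"
    text: str                                       # original post or excerpt
    category: str                                   # e.g., "road-blocked", "power-outage"
    location: Optional[Tuple[float, float]] = None  # (lat, lon) if known
    observed_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))

report = DisasterReport(
    source="twitter",
    text="Tree down across Elm St, wires on the road",
    category="road-blocked",
    location=(42.36, -71.06),
)
print(report.category, report.location)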

How Mechanical Turkers Crowdsourced a Huge Lexicon of Links Between Words and Emotion


The Physics arXiv Blog: “Sentiment analysis on the social web depends on how a person’s state of mind is expressed in words. Now a new database of the links between words and emotions could provide a better foundation for this kind of analysis.


One of the buzzphrases associated with the social web is sentiment analysis. This is the ability to determine a person’s opinion or state of mind by analysing the words they post on Twitter, Facebook or some other medium.
Much has been promised with this method—the ability to measure satisfaction with politicians, movies and products; the ability to better manage customer relations; the ability to create dialogue for emotion-aware games; the ability to measure the flow of emotion in novels; and so on.
The idea is to entirely automate this process—to analyse the firehose of words produced by social websites using advanced data mining techniques to gauge sentiment on a vast scale.
But all this depends on how well we understand the emotion and polarity (whether negative or positive) that people associate with each word or combinations of words.
Today, Saif Mohammad and Peter Turney at the National Research Council Canada in Ottawa unveil a huge database of words and their associated emotions and polarity, which they have assembled quickly and inexpensively using Amazon’s Mechanical Turk crowdsourcing website. They say this crowdsourcing mechanism makes it possible to increase the size and quality of the database quickly and easily…. The result is a comprehensive word-emotion lexicon for over 10,000 words and two-word phrases, which they call EmoLex….
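In use, a lexicon like EmoLex reduces sentiment analysis to lookups and counting. The three-entry lexicon below is an invented stand-in (the real EmoLex associates each of its more than 10,000 entries with emotions and a positive or negative polarity), but the scoring pattern is the common lexicon-based one.

# Minimal sketch of lexicon-based emotion scoring. The tiny lexicon here
# is an invented stand-in for EmoLex, which maps each word or phrase to
# its associated emotions and polarity.
from collections import Counter

LEXICON = {
    "delay": {"anger", "negative"},
    "refund": {"joy", "trust", "positive"},
    "broken": {"sadness", "negative"},
}

def emotion_profile(text: str) -> Counter:
    """Count emotion/polarity labels over all lexicon words in the text."""
    counts = Counter()
    for word in text.lower().split():
        counts.update(LEXICON.get(word.strip(".,!?"), ()))
    return counts

print(emotion_profile("Broken screen, another delay, but the refund helped."))
# Counter({'negative': 2, 'anger': 1, 'sadness': 1, 'joy': 1, ...})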
The bottom line is that sentiment analysis can only ever be as good as the database on which it relies. With EmoLex, analysts have a new tool for their box of tricks.”
Ref: arxiv.org/abs/1308.6297: Crowdsourcing a Word-Emotion Association Lexicon

Creating Networked Cities


New Report by Alissa Black and Rachel Burstein, New America Foundation: “In April 2013 the California Civic Innovation Project released a report, The Case for Strengthening Personal Networks in California Local Governments, highlighting the important role of knowledge sharing in the diffusion of innovations from one city or county to another, and identifying personal connections as a significant source of information when it comes to learning about and implementing innovations.
Based on findings from CCIP’s previous study, Creating Networked Cities makes recommendations on how local government leaders, professional associations, and foundation professionals might promote and improve knowledge sharing through developing, strengthening and leveraging their networks. Strong local government networks support the continual sharing and advancement of projects, emerging practices, and civic innovation…Download CCIP’s recommendations for strengthening local government networks and diffusing innovation here.”

Linux Foundation Collaboration Gets Biological


eWeek: “The Linux Foundation is growing its roster of collaboration projects by expanding from the physical into the biological realm with OpenBEL (the Biological Expression Language). The Linux Foundation, best known as the organization that helps bring Linux vendors and developers together, is also growing its expertise as a facilitator for collaborative development projects…
OpenBEL got its start in June 2012 after being open-sourced by biotech firm Selventa. The effort now includes the participation of Foundation Medicine, AstraZeneca, the Fraunhofer Institute, Harvard Medical School, Novartis, Pfizer and the University of California at San Diego.
BEL offers researchers a language to clearly express scientific findings from the life sciences in a format that can be understood by computing infrastructure…
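For a sense of what “understood by computing infrastructure” means, BEL statements take a subject-relationship-object form that machines can parse. The statement and the toy parser below are illustrative only, a simplified cut of the grammar rather than the OpenBEL toolchain.

# Illustrative only: a toy parser for a simplified BEL-style statement of
# the form "<term> <relationship> <term>". Real BEL is far richer; see the
# OpenBEL documentation for the actual grammar.
import re

statement = "p(HGNC:TNF) increases p(HGNC:IL6)"  # simplified example

pattern = re.compile(r"^(\S+\(.+?\))\s+(\w+)\s+(\S+\(.+?\))$")
match = pattern.match(statement)
if match:
    subject, relation, obj = match.groups()
    print(f"{subject} --[{relation}]--> {obj}")
# p(HGNC:TNF) --[increases]--> p(HGNC:IL6)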
The Linux Foundation currently hosts a number of different collaboration projects, including the Xen virtualization project, the OpenDaylight software-defined networking effort, Tizen for mobile phone development, and OpenMAMA for financial services information, among others.
The OpenBEL project will be similar to existing collaboration projects in that the contributors to the project want to accelerate their work through collaborative development, McPherson explained.”

Twitter’s activist roots: How Twitter’s past shapes its use as a protest tool


Radio Netherlands Worldwide: “Surprised when demonstrators from all over the world took to Twitter as a protest tool? Evan “Rabble” Henshaw-Plath, a member of Twitter’s founding team, was not. Rather, he sees it as a return to its roots: Inspired by protest coordination tools like TXTMob, and shaped by the values and backgrounds of Twitter’s founders, he believes activist potential was built into the service from the start.

It took a few revolutions before Twitter was taken seriously. Critics claimed that its 140-character limit only provided space for the most trivial thoughts: neat for keeping track of Ashton Kutcher’s lunch choices, but not much else. It made the transition from Silicon Valley toy into Middle East protest tool seem all the more astonishing.
Unless, argues Twitter co-founder Evan Henshaw-Plath, you know the story of how Twitter came to be. Henshaw-Plath was the lead developer at Odeo, the company where Twitter began and which eventually became Twitter. TXTMob, an activist tool deployed during the 2004 Republican National Convention in the US to coordinate protest efforts via SMS, was, says Henshaw-Plath, a direct inspiration for Twitter.
Protest 1.0
In 2004, while Henshaw-Plath was working at Odeo, he and a few other colleagues found a fun side-project in working on TXTMob, an initiative by what he describes as a “group of academic artist/prankster/hacker/makers” that operated under the ostensibly serious moniker of Institute for Applied Autonomy (IAA). Earlier IAA projects included small graffiti robots on wheels that spray-painted slogans on pavements during demonstrations, and a pudgy talking robot with big puppy eyes made to distribute subversive literature to people who ignored less-cute human pamphleteers.
TXTMob was a more serious endeavor than these earlier projects: a tactical protest coordination tool. With TXTMob, users could quickly exchange text messages with large groups of other users about protest locations and police crackdowns….”

Weather Channel Now Also Forecasts What You'll Buy


Katie Rosman in the Wall Street Journal: “The Weather Channel knows the chance for rain in St. Louis on Friday, what the heat index could reach in Santa Fe on Saturday and how humid Baltimore may get on Sunday.
It also knows when you’re most likely to buy bug spray.
The enterprise is transforming from a cable network viewers flip to during hurricane season into an operation that forecasts consumer behavior by analyzing when, where and how often people check the weather. Last fall the Weather Channel Cos. renamed itself the Weather Co. to reflect the growth of its digital-data business.

The Atlanta-based company has amassed more than 75 years’ worth of information: temperatures, dew points, cloud-cover percentages and much more, across North America and elsewhere.
The company supplies information for many major smartphone weather apps and has invested in data-crunching algorithms. It uses this analysis to appeal to advertisers who want to fine-tune their pitches to consumers….
Weather Co. researchers are now diving into weather-sentiment analysis—how local weather makes people feel, and then act—in different regions of the country. To cull this data, Mr. Walsh’s weather-analytics team directly polls visitors to the Weather.com website, asking them about their moods and purchases on specific days.
In a series of polls conducted between June 3 and Nov. 4 last year, residents of the Northeast region responded to the question, “Yesterday, what was your mood for most of the day?”
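A toy version of that weather-sentiment analysis would join poll answers to the day’s conditions and compare averages, as in the sketch below. The responses and the 1-5 mood scale are invented; the article does not describe Weather Co.’s actual methodology beyond the polling.

# Invented illustration of weather-sentiment analysis: average
# self-reported mood (1 = worst, 5 = best) grouped by the day's weather.
from collections import defaultdict

poll_responses = [  # hypothetical poll answers joined to weather records
    {"condition": "sunny", "mood": 4},
    {"condition": "sunny", "mood": 5},
    {"condition": "rainy", "mood": 2},
    {"condition": "rainy", "mood": 3},
    {"condition": "humid", "mood": 3},
]

moods_by_condition = defaultdict(list)
for response in poll_responses:
    moods_by_condition[response["condition"]].append(response["mood"])

for condition, moods in sorted(moods_by_condition.items()):
    print(f"{condition}: mean mood {sum(moods) / len(moods):.1f}")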