Idea to retire: Leaders can’t take risks or experiment


David Bray at TechTank: “Technology is rapidly changing our world. Traditionally, a nation’s physical borders could mark the beginning of its sovereign space, but in the early to mid-20th century airplanes challenged this notion. Later on, space-based satellites began flying above all nations. By the early 21st century, smartphone technologies costing $100 or so gave individuals computational capabilities that dwarfed the multi-million dollar computers operated by large nation-states just three decades earlier.

In this period of exponential change, all of us across the public sector must work together, enabling more inclusive work across government workers, citizen-led contributions, and public-private partnerships. Institutions must empower positive change agents on the inside of public service to pioneer new ways of delivering superior results. Institutions must also open their data for greater public interaction, citizen-led remixing, and discussions.

Taken together, these actions will transform public service to truly be “We the (mobile, data-enabled, collaborative) People” working to improve our world. These actions all begin with creating creative spaces that give public service professionals opportunities to experiment and explore new ways of delivering superior results to the public.

21st Century Reality #1: Public service must include workspaces for those who want to experiment and explore new ways of delivering results.

The world we face now is dramatically different from the world of 50, 100, or 200 years ago. More technological change is expected to occur in the next five years than in the last 15 years combined. Advances in technology have blurred what traditionally was considered government, and consequently we must experiment and explore new ways of delivering results.

21st Century Reality #2: Public service agencies need, within reason, to be allowed to have things fail, and be allowed to take risks.

The words “expertise” and “experiments” have the same etymological root, which is “exper,” meaning “out of danger.” Whereas the motto in Silicon Valley and other innovation hubs around the world might be “fail fast and fail often,” such a model is not going to work for public service, where certain endeavors absolutely must succeed and cannot waste taxpayer funds.

The only way public sector technologists will gain the expertise needed to respond to and take advantage of the digital disruptions occurring globally will be to do “dangerous experiments” as positive change agents akin to what entrepreneurs in Silicon Valley also do….

21st Century Reality #3: Public service cannot be done solely by government professionals in a top-down fashion.

With the communication capabilities provided by smartphones, social media, and freely available apps, individual members of the public can voluntarily access, analyze, remix, and choose to contribute data and insights to better inform public service. Recognizing this shift from top-down to bottom-up activities represents the first step to the resiliency of our legacy institutions….

Putting a cultural shift into practice

Senior executives need to shift from managing those who report to them to championing and creating spaces for creativity within their organizations. Within any organization, change agents should be able to approach an executive, pitch new ideas, bring data to support these ideas, and, if a venture is approved, move forward with speed to transform public service away from our legacy approaches….

The work of public service also can be done by public-private partnerships acting beyond their own corporate interests to benefit the nation and local communities. Historically the U.S. has lagged other nations, like Singapore or the U.K., in exploring new innovative forms of public-private partnerships. This could change by examining the pressing issues of the day and considering how the private sector might solve challenging issues, or complement the efforts of government professionals. This could include rotations of both government and private sector professionals as part of public-private partnerships to do public service that now might be done more collaboratively, effectively, and innovatively using alternative forms of organizational design and delivery.

If public service returns to first principles – namely, what “We the People” choose to do together – new forms of organizing, collaborating, incentivizing, and delivering results will emerge. Our exponential era requires such transformational partnerships for the future ahead….(More)”

Direct democracy may be key to a happier American democracy


Benjamin Radcliff and Gregory Shufeldt in the Conversation: “Is American democracy still “by the people, for the people?” According to recent research, it may not be. Martin Gilens at Princeton University confirms that the wishes of the American working and middle class play essentially no role in our nation’s policy making. A BBC story rightly summarized this with the headline: US Is an Oligarchy, Not a Democracy.

However new research by Benjamin Radcliff and Gregory Shufeldt suggests a ray of hope.

Ballot initiatives, they argue, may better serve the interests of ordinary Americans than laws passed by elected officials….

Today, 24 states allow citizens to directly vote on policy matters.

This year, more than 42 initiatives have already been approved for the ballot in 18 states.

Voters in California will decide diverse questions including banning plastic bags, voter approval of state expenses greater than US$2 billion, improving school funding, and the future of bilingual education.

The people of Colorado will vote on replacing their current medical insurance programs with a single payer system, and in Massachusetts people may consider legalizing recreational marijuana….

However, many have pointed to problems with direct democracy in the form of ballot initiatives.

Maxwell Stearns at the University of Maryland, for example, writes that legislatures are better because initiatives are the tools of special interests and minorities. In the end, initiatives are voted upon by an unrepresentative subset of the population, Stearns concludes.

Others like Richard Ellis of Willamette University argue that the time-consuming process of gathering signatures introduces a bias toward moneyed interests. Some suggest this has damaged direct democracy in California, where professional petition writers and paid signature gatherers dominate the process. Moneyed interests also enjoy a natural advantage in having the resources that ordinary people lack to mount media campaigns to support their narrow interests.

To curb this kind of problem, bans on paying people per signature have been proposed in many states but have not yet passed any legislature. However, because Californians like direct democracy in principle, they have recently amended the process to allow for review and revision, and they require mandatory disclosures about the funding and origins of ballot initiatives.

Finally, some say initiatives can be confusing for voters, like the two recent Ohio propositions concerning marijuana, where one ballot proposition essentially canceled out the other. Similarly, Mississippi’s Initiative 42 required marking the ballot in two places for approval but only one for disapproval, resulting in numerous nullified “yes” votes.

Routes to happiness

Despite these flaws, our research shows that direct democracy might improve happiness in two ways.

One is through its psychological effect on voters, making them feel they have a direct impact on policy outcomes. This holds even if they may not like, and thus vote against, a particular proposition. The second is that it may indeed produce policies more consistent with human well-being.

The psychological benefits are obvious. When people are literally allowed to be the government, just as in ancient Athens, they develop higher levels of political efficacy. In short, they may feel they have some control over their lives. Direct democracy can give people political capital because it offers a means by which citizens may place issues on the ballot for popular vote, giving them an opportunity both to set the agenda and to vote on the outcome.

We think this is important today given America’s declining faith in government. Overall today only 19 percent believe the government is run for all citizens. The same percentage trusts government to mostly do what is right. The poor and working classes are even more alienated….(More)”

Core Concepts: Computational social science


Adam Mann at PNAS: “Cell phone tower data predicts which parts of London can expect a spike in crime (1). Google searches for polling place information on the day of an election reveal the consequences of different voter registration laws (2). Mathematical models explain how interactions among financial investors produce better yields, and even how they generate economic bubbles (3).

Figure: Using cell-phone and taxi GPS data, researchers classified people in San Francisco into “tribal networks,” clustering them according to their behavioral patterns. Students, tourists, and businesspeople all travel through the city in various ways, congregating and socializing in different neighborhoods. Image courtesy of Alex Pentland (Massachusetts Institute of Technology, Cambridge, MA).

Figure: Where people hail from in the Mexico City area, here indicated by different colors, feeds into a crime-prediction model devised by Alex Pentland and colleagues (6). Image courtesy of Alex Pentland (Massachusetts Institute of Technology, Cambridge, MA).

These are just a few examples of how a suite of technologies is helping bring sociology, political science, and economics into the digital age. Such social science fields have historically relied on interviews and survey data, as well as censuses and other government databases, to answer important questions about human behavior. These tools often produce results based on individuals—showing, for example, that a wealthy, well-educated, white person is statistically more likely to vote (4)—but struggle to deal with complex situations involving the interactions of many different people.


A growing field called “computational social science” is now using digital tools to analyze the rich and interactive lives we lead. The discipline uses powerful computer simulations of networks, data collected from cell phones and online social networks, and online experiments involving hundreds of thousands of individuals to answer questions that were previously impossible to investigate. Humans are fundamentally social creatures, and these new tools and huge datasets are giving social scientists insights into exactly how connections among people create societal trends or heretofore undetected patterns related to everything from crime to economic fortunes to political persuasions. Although the field provides powerful ways to study the world, it’s an ongoing challenge to ensure that researchers collect and store the requisite information safely, and that they and others use that information ethically….(More)”
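The “tribal networks” figure above hints at what this looks like in practice: reduce each person to a vector of behavioral features, then let a clustering algorithm group similar vectors. The sketch below is purely illustrative, not the researchers’ actual pipeline; the neighborhood features, group sizes, and the choice of k-means are all assumptions.

```python
# Illustrative sketch only: cluster people by mobility patterns, in the
# spirit of the "tribal networks" figure. All features are invented.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Synthetic data: each row is a person, each column the fraction of their
# observed check-ins that fall in one of five city neighborhoods.
students = rng.dirichlet([8, 1, 1, 1, 1], size=50)   # concentrated near campus
tourists = rng.dirichlet([1, 8, 8, 1, 1], size=50)   # downtown and the sights
business = rng.dirichlet([1, 1, 1, 8, 8], size=50)   # financial district
visits = np.vstack([students, tourists, business])

# K-means recovers the behavioral groups from the mobility features alone.
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(visits)
for k in range(3):
    members = visits[labels == k]
    print(f"cluster {k}: {len(members)} people, "
          f"mean neighborhood profile {members.mean(axis=0).round(2)}")
```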

Democracy Dashboard


The Brookings Democracy Dashboard is a collection of data designed to help users evaluate the performance of the political system and government in the United States. The Democracy Dashboard displays trends in democracy and governance in seven key areas: elections administration; democratic participation and voting; public opinion; institutional functioning in the executive, legislative, and judicial branches; and media capacity.

The dashboard—and accompanying analyses on the FixGov blog—provide information that can help efforts to strengthen democracy and improve governance in the U.S.

Data will be released on a rolling basis during 2016 and expanded in future election years. Scroll through the interactive charts below to explore data points and trends in key areas for midterm and presidential elections and/or download the data in Excel format here »….(More)”


7 Ways Local Governments Are Getting Creative with Data Mapping


Ben Miller at GovTech:  “As government data collection expands, and as more of that data becomes publicly available, more people are looking to maps as a means of expressing the information.

And depending on the type of application, a map can be useful for both the government and its constituents. Many maps help government servants operate more efficiently and save money, while others will answer residents’ questions so they don’t have to call a government worker for the answer…..

Here are seven examples of state and local governments using maps to help themselves and the people they serve.

1. DISTRICT OF COLUMBIA, IOWA GET LOCAL AND CURRENT WITH THE WEATHER

Washington, D.C. snow plow map

As Winter Storm Jonas was busy dropping nearly 30 inches of snow on the nation’s capital, officials in D.C. were working to clear it. And thanks to a mapping application they launched, citizens could see exactly how the city was going about that business.

The District of Columbia’s snow map lets users enter an address, and then shows what snow plows did near that address within a given range of days. The map also shows where the city received 311 requests for snow removal and gives users a chance to look at recent photos from road cameras showing driving conditions…..

2. LOS ANGELES MAPS EL NIÑO RESOURCES, TRENDS

El Niño Watch map

Throughout the winter, weather monitoring experts warned the public time and again that an El Niño system was brewing in the Pacific Ocean that looked to be one of the largest, if not the largest, ever. That would mean torrents of rain for a parched state that’s seen mudslides and flooding during storms in the past.

So to prepare its residents, the city of Los Angeles published a map in January that lets users see both decision-informing trends and the location of resources. Using the application, one can toggle layers that let them know what the weather is doing around the city, where traffic is backed up, where the power is out, where they can find sand bags to prevent flood damage and more….

3. CALIFORNIA DIVES DEEP INTO AIR POLLUTION RISKS

CalEnviroScreen

….So, faced with a legislative mandate to identify disadvantaged communities, the California Office of Environmental Health Hazard Assessment decided that it wouldn’t just examine smog levels — it would also take a look at the prevalence of at-risk people across the state.

The result is a series of three maps, the first two examining both factors and the third combining them. That allows the state and its residents to see where air pollution poses the greatest risk to the people most vulnerable to it….

4. STREAMLINING RESIDENT SERVICE INFORMATION

Manassas curbside pickup map

The city of Manassas, Va., relied on an outdated paper map and a long-time, well-versed staffer to answer questions about municipal curbside pickup services until it launched this map in 2014. The map allows users to enter their address, and then gives them easy-to-read information about when to put out various things on their curb for pickup.

That’s useful because the city’s fall leaf collection schedule changes every year. So the map not only acts as a benefit to residents who want information, but to city staff who don’t have to deal with as many calls.

The map also shows users the locations of resources they can use and gives them city phone numbers in case they still have questions, and displays it all in a popup pane at the bottom of the map.

5. PLACING TOOLS IN THE HANDS OF THE PUBLIC

A lot of cities and counties have started publishing online maps showing city services and releasing government data.

But Chicago, Boston and Philadelphia stand out as examples of maps that take the idea one step further — because each one offers a staggering number of choices for users.

Chicago’s new OpenGrid map, just launched in January, is a versatile map that lets users search for certain data like food inspection reports, street closures, potholes and more. That’s enough to answer a lot of questions, but what adds even more utility is the map’s various narrowing tools. Users can narrow searches to a zip code, or they can draw a shape on the map and only see results within that shape. They can perform sub-searches within results and they can choose how they’d like to see the data displayed.
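Under the hood, the draw-a-shape feature is a point-in-polygon query. OpenGrid’s actual code isn’t shown here; the following is a generic sketch of the idea, using the shapely library with invented coordinates and records.

```python
# Sketch of a "draw a shape, see only results inside it" query of the
# kind OpenGrid offers. Coordinates and records below are invented.
from shapely.geometry import Point, Polygon

# A rectangle the user might draw on the map (lon/lat vertices).
drawn_shape = Polygon([(-87.64, 41.87), (-87.62, 41.87),
                       (-87.62, 41.89), (-87.64, 41.89)])

# Open-data records, each tagged with its location.
reports = [
    {"id": 101, "type": "pothole",         "lon": -87.630, "lat": 41.880},
    {"id": 102, "type": "street closure",  "lon": -87.700, "lat": 41.900},
    {"id": 103, "type": "food inspection", "lon": -87.625, "lat": 41.885},
]

# Keep only the records whose location falls inside the drawn polygon.
inside = [r for r in reports
          if drawn_shape.contains(Point(r["lon"], r["lat"]))]
print([r["id"] for r in inside])  # -> [101, 103]
```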

Philadelphia’s platform makes use of buttons, icons and categories to help users sift through the spatially-enabled data available to them. Options include future lane closures, bicycle paths, flu shots, city resources, parks and more.

Boston’s platform is open for users to submit their own maps. And submit they have. The city portal offers everything from maps of bus stops to traffic data pulled from the Waze app.

6. HOUSTON TRANSFORMS SERVICE REQUEST DATA

Houston 311 service request map

A 311 service functions as a means of bringing problems to city staff’s attention. But the data itself only goes so far — it needs interpretation.

Houston’s 311 service request map helps users easily analyze the data so as to spot trends. The tool offers lots of ways to narrow the data down, and can isolate many different kinds of requests so users can see whether one problem is reported more often in certain areas.

7. GUIDING BUSINESS GROWTH

For the last several years, the city of Rancho Cucamonga, Calif., has been designing all sorts of maps through its Rancho Enterprise Geographic Information Systems (REGIS) project. Many of them have served specific city purposes, such as tracking code enforcement violations and offering police a command system tool for special events.

The utilitarian foundation of REGIS extends to its public-facing applications as well. One example is INsideRancho, a map built with economic development efforts in mind. The map lets users search and browse available buildings to suit business needs, narrowing results by square footage, zoning and building type. Users can also find businesses by name or address, and look at property exteriors via an embedded connection with Google Street View….(More)”

Methods of Estimating the Total Cost of Regulations


Maeve P. Carey for the Congressional Research Service: “Federal agencies issue thousands of regulations each year under delegated authority from Congress. Over the past 70 years, Congress and various Presidents have created a set of procedures agencies must follow to issue these regulations, some of which contain requirements for the calculation and consideration of costs, benefits, and other economic effects of regulations. In recent years, many Members of Congress have expressed an interest in various regulatory reform efforts that would change the current set of rulemaking requirements, including requirements to estimate costs and benefits of regulations. As part of this debate, it has become common for supporters of regulatory reform to comment on the total cost of federal regulation. Estimating the total cost of regulations is inherently difficult. Current estimates of the cost of regulation should be viewed with a great deal of caution. Scholars and governmental entities estimating the total cost of regulation use one of two methods, which are referred to as the “bottom-up” and the “top-down” approach.

The bottom-up approach aggregates individual cost and benefit estimates produced by agencies, arriving at a governmentwide total. In 2014, the annual report to Congress from the Office of Management and Budget estimated the total cost of federal regulations to range between $68.5 and $101.8 billion and the total benefits to be between $261.7 billion and $1,042.1 billion. The top-down approach estimates the total cost of regulation by looking at the relationship of certain macroeconomic factors, including the size of a country’s economy and a proxy measure of how much regulation the country has. This method estimates the economic effect that a hypothetical change in the amount of regulation in the United States might have, considering that economic effect to represent the cost of regulation. One frequently cited study estimated the total cost of regulation in 2014 to be $2.028 trillion, $1.439 trillion of which was calculated using this top-down approach. Each approach has inherent advantages and disadvantages.
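In code, the two methods reduce to very different computations: the bottom-up approach is a sum of published per-rule ranges, while the top-down approach is a regression against a proxy for the amount of regulation. The sketch below is illustrative only; the three per-rule ranges are invented (chosen so they sum to the OMB totals quoted above), and the cross-country figures are synthetic.

```python
# Illustrative contrast of the two estimation methods. The per-rule
# ranges and the cross-country data below are invented.
import numpy as np

# Bottom-up: aggregate the cost ranges agencies publish for each rule.
agency_cost_ranges = [(10.2, 15.8), (31.1, 44.0), (27.2, 42.0)]  # $ billions
low = sum(lo for lo, _ in agency_cost_ranges)
high = sum(hi for _, hi in agency_cost_ranges)
print(f"bottom-up total: ${low:.1f}B to ${high:.1f}B per year")

# Top-down: regress a macroeconomic outcome on a proxy measure of how
# much regulation each country has, then price a hypothetical change.
proxy = np.array([3.1, 4.5, 5.0, 6.2, 7.4])     # regulation index (synthetic)
growth = np.array([3.8, 3.2, 3.0, 2.5, 2.0])    # GDP growth, percent
slope, intercept = np.polyfit(proxy, growth, 1)
print(f"top-down: one proxy point ~ {abs(slope):.2f} points of growth forgone")
```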

The bottom-up approach relies on agency estimates of the effects of specific regulations and can also be used to estimate benefits, because agencies typically estimate both costs and benefits under current requirements so that they may be compared and evaluated against alternatives. The bottom-up approach does not, however, include estimates of costs and benefits of all rules, nor does it include costs and benefits of regulations that are not monetized—meaning that the bottom-up approach is likely an underestimate of the total cost of regulation. Furthermore, the individual estimates produced by agencies and used in the bottom-up approach may not always be accurate.

The top-down approach can be used to estimate effects of rules that are not captured by the bottom-up approach—such as indirect costs and costs of rules issued by independent regulatory agencies, which are not included in the bottom-up approach—thus theoretically capturing the whole universe of regulatory costs. Its results, however, hinge on a number of methodological challenges that are difficult, if not impossible, to overcome. The biggest challenge may be finding a valid proxy measure for regulation: proxy measures of the total amount of regulation in a country are inherently imprecise and cannot be reliably used to estimate macroeconomic outcomes. Because of this difficulty in identifying a suitable proxy measure of regulation, even if the total cost of regulation is substantial, it cannot be estimated with any precision. The top-down method is intended to measure only costs; measuring costs without also considering benefits does not provide the complete context for evaluating the appropriateness of a country’s amount of regulation.

For these and other reasons, both approaches to estimating the total cost of regulation have inherent—and potentially insurmountable—flaws….(More)”

Can We Use Data to Stop Deadly Car Crashes?


Allison Shapiro in Pacific Standard Magazine: “In 2014, New York City Mayor Bill de Blasio decided to adopt Vision Zero, a multi-national initiative dedicated to eliminating traffic-related deaths. Under Vision Zero, city services, including the Department of Transportation, began an engineering and public relations plan to make the streets safer for drivers, pedestrians, and cyclists. The plan included street re-designs, improved accessibility measures, and media campaigns on safer driving.

The goal may be an old one, but the approach is innovative: When New York City officials wanted to reduce traffic deaths, they crowdsourced and used data.

Many cities in the United States—from Washington, D.C., all the way to Los Angeles—have adopted some version of Vision Zero, which began in Sweden in 1997. It’s part of a growing trend to make cities “smart” by integrating data collection into things like infrastructure and policing.

Map of high crash corridors in Portland, Oregon. (Map: Portland Bureau of Transportation)

Cities have access to an unprecedented amount of data about traffic patterns, driving violations, and pedestrian concerns. Although advocacy groups say Vision Zero is moving too slowly, de Blasio has invested another $115 million in this data-driven approach.

Interactive safety map. (Map: District Department of Transportation)

De Blasio may have been vindicated. A 2015 year-end report released by the city last week analyzes the successes and shortfalls of data-driven city life, and the early results look promising. In 2015, fewer New Yorkers lost their lives in traffic accidents than in any year since 1910, according to the report, despite the fact that the population has almost doubled in those 105 years.

Below are some of the project highlights.

New Yorkers were invited to add to this public dialogue map, where they could list information ranging from “not enough time to cross” to “red light running.” The Department of Transportation ended up with over 10,000 comments, which led to 80 safety projects in 2015, including the creation of protected bike lanes, the introduction of leading pedestrian intervals, and the simplifying of complex intersections….

Data collected from the public dialogue map, town hall meetings, and past traffic accidents led to “changes to signals, street geometry and markings and regulations that govern actions like turning and parking. These projects simplify driving, walking and bicycling, increase predictability, improve visibility and reduce conflicts,” according to Vision Zero in NYC….(More)”
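Turning more than 10,000 free-form map comments into a ranked list of 80 projects is, at bottom, an aggregation problem. A minimal sketch of that kind of triage follows; the intersections and counts are invented, and only the issue categories echo the article.

```python
# Sketch of ranking locations by crowdsourced safety complaints.
# Issue categories mirror the article; all records are invented.
import pandas as pd

comments = pd.DataFrame([
    {"intersection": "Atlantic Ave & 4th Ave", "issue": "red light running"},
    {"intersection": "Atlantic Ave & 4th Ave", "issue": "red light running"},
    {"intersection": "Atlantic Ave & 4th Ave", "issue": "not enough time to cross"},
    {"intersection": "Queens Blvd & 63rd Dr",  "issue": "not enough time to cross"},
    {"intersection": "Queens Blvd & 63rd Dr",  "issue": "speeding"},
    {"intersection": "Canal St & Bowery",      "issue": "red light running"},
])

# Count complaints per intersection and issue, then rank the hot spots
# that might justify a signal change or a protected bike lane.
hot_spots = (comments.groupby(["intersection", "issue"]).size()
                     .rename("reports").reset_index()
                     .sort_values("reports", ascending=False))
print(hot_spots.to_string(index=False))
```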

2015 Philip Meyer Award winners for data-driven investigation


From IRE: “First Place: “Failure Factories” | Tampa Bay Times
Cara Fitzpatrick, Michael LaForgia, Lisa Gartner, Nathaniel Lash and Connie Humburg

The team used statistical analysis and linear regression of data from dozens of records requests to document how steady resegregation of Pinellas County schools left black children to fail at increasingly higher rates than anywhere else in Florida. The series focused on failures of school district officials to give the schools the support necessary for success. The judges praised the reporters for dogged work on a project that took 18 months to report and write, and noted that the results underscored what decades of sociological research has shown happens in racially segregated schools.

Second Place: “The Changing Face of America” | USA Today
Paul Overberg, Sarah Frostenson, Marisol Bello, Greg Toppo, and Jodi Upton 

The project was built around measurements across time of the racial and ethnic diversity of each of America’s more than 3,100 counties, going back to 1960 and projected ahead to 2060. The reporters used the results to reveal that high levels of diversity, once found only in a few Southern states and along the border with Mexico, had bloomed out into large areas of the upper Midwest and the Appalachians, for instance. Those results informed the assignments of reporters to find the local stories that illustrated those changes, with the results running in more than 100 Gannett papers and broadcast stations.

Third Place: “The Echo Chamber” | Thomson Reuters
Joan Biskupic, Janet Roberts and John Shiffman

The Reuters team analyzed the characteristics of more than 14,400 U.S. Supreme Court records from nine years’ worth of petitions seeking review by the Court. The analysis showed that 43% of cases eventually heard by the court came from a tiny pool of a few dozen lawyers who represent less than 1% of the more than 17,000 lawyers seeking such review. Further reporting showed that these elite lawyers, mostly representing large corporations, had strong personal connections with the justices, with about half of them having served as clerks to the justices….(More)”
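The Reuters finding is, in essence, a concentration measurement: what share of heard cases traces back to what share of the petitioning bar. A toy version of the computation, with a handful of invented records standing in for the 14,400 real ones:

```python
# Toy version of a concentration analysis like the one Reuters ran.
# Records are invented; only the 17,000-lawyer figure is from the text.
import pandas as pd

# One row per petition the Court agreed to hear, tagged by lead lawyer.
heard = pd.DataFrame({"lawyer": ["A", "A", "A", "B", "B",
                                 "C", "D", "E", "F", "G"]})
total_bar = 17000  # approximate number of lawyers seeking review

counts = heard["lawyer"].value_counts()
elite = counts.head(2).index                  # the repeat players
case_share = counts.loc[elite].sum() / len(heard)
bar_share = len(elite) / total_bar

print(f"{case_share:.0%} of heard cases came from "
      f"{bar_share:.2%} of the petitioning bar")
```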

Passive Philanthropy


PSFK: “What if you could cure cancer in your sleep? What if throwing out food meant feeding more people? What if helping coffee farmers in developing nations was as easy as a retweet? Today, businesses pay big money in order to reach the same audience as some viral tweets, and the same strategy is being applied to the reach and impact of social good campaigns. Nonprofits have also begun to leverage creative opportunities to spread awareness and raise funds to harness socially-aware citizens and rethink how social good is spread and executed. Take, for instance, an app that tracks exercise and donates to the charity of choice based on distance….

The DreamLab is a free app that turns smartphones into a research tool for cancer researchers at the Garvan Institute in Australia while their users are sleeping. Developed in conjunction with Vodafone, the app uses the processing power of idle phones as an alternative to supercomputers, which can be difficult to access. After downloading the app, participants simply open it and charge their phone. Once the phone reaches 95 percent charge, it gets to work, acting as a networked processor alongside other users with the app. Each phone solves a small piece of a larger puzzle and sends it back to Garvan.

If 1,000 people are using the app, cancer puzzles can be solved 30x faster.
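Mechanically, DreamLab-style volunteer computing is a work-queue pattern: a server splits one large job into small units, each idle phone pulls a unit, computes, and returns its piece for merging. The excerpt doesn’t describe the app’s actual protocol, so the sketch below is an assumed, simplified stand-in, with threads playing the role of charging phones.

```python
# Simplified sketch of the volunteer-computing pattern described above:
# split a big job into work units, farm them out, merge partial results.
# The real DreamLab protocol is not given here; everything is assumed.
from concurrent.futures import ThreadPoolExecutor

def split_into_units(job, unit_size):
    """Chop a large dataset into bite-sized work units."""
    return [job[i:i + unit_size] for i in range(0, len(job), unit_size)]

def solve_unit(unit):
    """Stand-in for the number-crunching one phone would do."""
    return sum(x * x for x in unit)

job = list(range(10_000))            # the "cancer puzzle", abstractly
units = split_into_units(job, 500)   # 20 work units

# Each worker thread plays the role of one idle, charging phone.
with ThreadPoolExecutor(max_workers=20) as phones:
    partials = list(phones.map(solve_unit, units))

print("merged result:", sum(partials))  # the server recombines the pieces
```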

As DreamLab researchers work toward finding a cure for cancer, Feeding Forward is working toward ending hunger. In America, hunger is not a problem of supply, but rather of distribution. Feeding Forward aims to solve this by connecting restaurants, grocery stores, caterers, and other businesses that are forced to throw away perishable food products with those in need.

Businesses simply need to post their excess food on the platform and a driver will come pick it up and deliver it to a food bank in need. Donors receive profiles of the people they helped and can also write off the donation as a charitable contribution for tax purposes. Since its launch in 2013, Feeding Forward has achieved a pick-up rate of 99 percent, distributing 780,000 pounds of food and saving businesses $3.9 million.

DreamLab and Feeding Forward put to use activities people are already doing, while One Big Tweet harnesses the power of people’s social media accounts as a fundraising strategy. The Cafédirect Producers’ Foundation is getting people to donate their Twitter followings to charity, asking them to sign up to post an automated tweet from a corporate sponsor that purchased the privilege at an auction for social good. The more people who donate their accounts, the higher the value of the tweet at auction. After four months, over 700 people with a collective reach of 3.2 million followers had signed up to help make the One Big Tweet worth $49,000. While the charity is still in search of a buyer, Cafédirect promises the tweet that will be sent out through participants’ accounts will happen only once and be “safe enough for your Gran to read.” All money from the sale will go directly to continuing the foundation’s work with coffee and tea farmers in Africa, Asia, and Latin America…(More)

The impact of open access scientific knowledge


Jack Karsten and Darrell M. West at Brookings: “In spite of technological advancements like the Internet, academic publishing has operated in much the same way for centuries. Scientists voluntarily review their peers’ papers for little or no compensation; the paper’s author likewise does not receive payment from academic publishers. Though most of the costs of publishing a journal are administrative, the cost of subscribing to scientific journals nevertheless increased 600 percent between 1984 and 2002. The funding for the research libraries that form the bulk of journal subscribers has not kept pace, leading to campaigns at universities including Harvard to boycott for-profit publishers.

Though the Internet has not yet brought down the price of academic journal subscriptions, it has led to some interesting alternatives. In 2015, the Twitter hashtag #icanhazPDF was created to request copies of papers located behind paywalls. Anyone with access to a specific paper can download it and then e-mail it to the requester. The practice violates the copyright of publishers, but puts papers in reach of researchers who would otherwise not be able to read them. If a researcher cannot read a journal article in the first place, they cannot go on to cite it, and citations raise the profile of the cited article and of the journal that published it. The publisher is thus caught between two conflicting goals: increasing the number of citations for its articles and earning revenue to stay in business.

Thinking outside the journal

A trio of University of Chicago researchers examines this issue through the lens of Wikipedia in a paper titled “Amplifying the Impact of Open Access: Wikipedia and the Diffusion of Science.” Wikipedia makes a compelling subject for scientific diffusion given its status as one of the most visited websites in the world, attracting 374 million unique visitors monthly as of September 2015. The study found that on English language articles, Wikipedia editors are 47 percent more likely to cite an article from an open access journal. Anyone using Wikipedia as a first source for information on a subject is thus more likely to encounter findings from open access journals. If readers click through the links to cited articles, they can read the actual text of these open access journal articles.

Given how much the federal government spends on scientific research ($66 billion on nondefense R&D in 2015), it has a large role to play in the diffusion of scientific knowledge. Since 2008, the National Institutes of Health (NIH) has required researchers who publish in academic journals to also publish in PubMed, an online open access journal. Expanding provisions like the NIH Public Access Policy to other agencies and to recipients of federal grants at universities would give the public and other researchers a wealth of scientific information. Scientific literacy, even on cutting-edge research, is increasingly important when science informs policy on major issues such as climate change and health care….(More)”