Post Authored by Communications Intern Olivia Brown
People are always choosing sides. And a lot of folks seem to cling to their positions with something akin to a death grip. Music-industry issues are certainly no exception. Whether the debate is about streaming royalty rates, copyright enforcement, crowdfunding, or whether Paul is dead (okay, maybe not that last one), people tend to take their positions and refuse to budge.
The human tendency to focus on information that conforms to one’s pre-existing beliefs, opinions, or theories is called “confirmation bias,” and it’s a very real thing. There’s probably a broader point to make here about politics. But even in the music business, this type of tunnel vision can limit productive discussions and informed solutions.
People have a natural tendency to try to make data fit their conclusions, instead of the other way around. Why is this? Probably because most of us are seeking confirmation that our opinions are correct and our behavior is justifiable. Take the debates surrounding unauthorized downloading, its impacts on the industry, and potential remedies. If you’re the RIAA, you probably don’t want to endorse data suggesting that unauthorized downloaders may also purchase a lot of music. Likewise, if you’re a free-culture-loving blogger, you’re likely to avoid highlighting data that might show piracy having a negative impact on your favorite artists. Psychologists tell us that such behavior isn’t necessarily malicious or dishonest, merely human: people piece together the scattered findings that bolster their own arguments, while ignoring other, possibly legitimate data that might call their beliefs into question.
So how can industry observers guard against confirmation bias and successfully navigate conflicting reports to come away with something closer to the real story? Here are a few suggestions.
Pay attention to scope and methodology.
Recently, we witnessed the highly polarized reception to a report [PDF] released by the European Commission’s Joint Research Centre, which found a slightly positive relationship between visits to illegal download sites and legal download sites, suggesting that unauthorized downloading did not harm digital sales. However, this study quickly came under fire for using web traffic as a metric instead of actual sales data, and focusing narrowly on download revenue while ignoring subscription services and ad-supported streaming.
Researchers themselves are evidently not exempt from confirmation bias. As Glenn Peoples at Billboard pointed out, the JRC researchers showed a tendency to cherry-pick the parts of other studies that supported their own findings, while disregarding findings that undercut their thesis. In doing so, they may have mischaracterized the research they cited.
Look at the body of research as a whole, not just single studies in isolation.
In debates on global warming, climate change deniers are quick to point to a couple of studies that fail to find evidence of climate change, or fail to attribute that change to humans. But these studies are a tiny fraction of the available research. What’s important in science is looking at the entire range of available research to see where, if anywhere, consensus exists. In other words, any new findings should be viewed in light of the body of research as a whole. To wit: a Carnegie Mellon University survey of the entire body of academic work found that “while some papers in the literature find no evidence of harm, the vast majority of the literature (particularly the literature published in top peer reviewed journals) finds evidence that piracy harms media sales.”
Avoid echo chambers.
When various parties mold interpretations of data to fit their opinions, it can lead to an “echo chamber” effect. Opinions are spread by influential figureheads and repeated back and forth. As Eli Pariser has argued, people love hearing their ideas affirmed, a dynamic which is heightened by the nature of online discussion. The same unquestioned concepts are repeated ad nauseam, while media outlets cater to specific, often highly oppositional, groups rather than taking a nuanced approach that can entertain a middle ground (which, in all likelihood, is closer to reality).
For example, it could very well be true that the people who engage in the highest rates of unauthorized downloading also purchase more music than the average citizen, who may be only casually interested in music. But that finding is not mutually exclusive with findings demonstrating negative impacts from unauthorized downloading on the industry as a whole. With complex, converging issues like these, carefully weighing a variety of evidence becomes ever more crucial.
Read past the headline.
In today’s fast-moving, traffic-baiting blogosphere, it’s common for stories about hot-button issues to be framed in ways that generate as many clicks as possible. Sometimes this obscures subtlety and nuance. We’ve seen this with reports on our own data, which have sometimes been done a disservice by inaccurate, misleading, or reductive headlines. A more balanced appraisal might not get as much traffic, but it would mean a better-informed readership.
If people are serious about addressing the industry problems at hand, they should perhaps consider stepping out of their ideological “iso booths” for a minute. Instead of pitching polar opposite arguments at each other, maybe let in some of the ambient noise. There could be something worth hearing… or even incorporating.