

Epistemology & Philosophy of Science 2020, vol. 57, no. 4, pp. 23-39 DOI: https://doi.org/10.5840/eps202057454

If science is a public good, why do scientists own it?

Steve Fuller - Auguste Comte Professor of Social Epistemology. University of Warwick. Coventry CV4 7AL, UK; e-mail: s.w.fuller@warwick.ac.uk

I argue that if science is to be a public good, it must be made one. Neither science nor any other form of knowledge is naturally a public good. And given the history of science policy in the twentieth century, it would be reasonable to conclude that science is in fact what economists call a 'club good'. I discuss this matter in detail in two contexts: (1) current UK efforts to create a version of the US DARPA that would focus on projects of larger, long-term societal interests - i.e. beyond the interests of the academic specialities represented in, say, the US NSF; (2) what I call the 'organized hypocrisy' involved in presenting science as a public good through the so-called 'peer review' process.

Keywords: science, public good, NSF, DARPA, organized hypocrisy, peer review


© Steve Fuller


The Problem: How to Make Knowledge a Public Good?

Both social democrats and neoliberals have struggled to make concrete sense of the attractive idea that science is a 'public good'. In the mid-twentieth century heyday of the welfare state, social democrats invoked the idea to explain why everyone's taxes should be used to subsidize the university education of a relatively small portion of the population. In this context, 'science as a public good' referred to the 'added value' that university-trained people provided to the rest of society through their knowledge, skills, etc. With the rise of the 'knowledge economy' in the 1980s, 'science as public good' became part of the neoliberal agenda as 'human capital' development, which meant that everyone should acquire academic credentials to make society more prosperous and productive.

However, it is not clear that either of these policies has made good long-term economic sense. The key problem is that science isn't naturally a public good but must be made such. Science naturally favours discovery and innovation, processes that create a strong epistemic distinction between those 'ahead' and 'behind' the arc of organized inquiry. Moreover, this distinction is promoted by a peer review process that routinely conflates judgements relating to 'quality control' in the strict sense (i.e. judgements about the adequacy of evidence and reasoning on behalf of knowledge claims) and judgements relating to the prestige or fashionableness of topics of inquiry or the theories used to explore them. While many follow Kuhn in believing that such 'streamlining' of inquiry is necessary for science to make progress, others - including Popper and Hayek - regard it as tantamount to 'rent-seeking' behaviour that turns science into a 'club good' (cf. Fuller 2019). If the latter are correct, then science in some sense needs to be 'demystified' as a form of knowledge in order to become a public good. Indeed, the state may need to reverse its role since the end of the Second World War and become a kind of 'epistemic trust-buster' in order to convert science into a public good.

The path-dependent understanding of 'science as a public good' is a product of neoclassical welfare economics of the post-Second World War period, epitomized by Paul Samuelson (1969), the MIT-based author of the most widely used economics textbook in the second half of the twentieth century. Its basic assumption is that 'public goods' are whatever everyone needs but no one is incentivized to fund - that is, goods subject to 'market failure', which in turn provides a justification for state funding. On this vision, the ability of states to influence markets is quite limited, perhaps reflecting an overly parochial sense of individual self-interest and a relatively unimaginative sense of how markets might be designed to work. Moreover, in practice, this approach - while ostensibly publicly oriented - resulted in the concentration of state funding in institutions and projects that the scientific elites regarded as most worthy. The guiding assumption was that funding the 'best people' was the most efficient way for science to benefit everyone, perhaps via a 'multiplier' or 'trickle down' effect, depending on whether you learned economics from John Maynard Keynes or Milton Friedman. In either case, the scientific establishment was largely in charge of how and to whom the money flowed, subject to largely sympathetic civil service oversight.
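In the standard textbook formulation associated with Samuelson's analysis (supplied here as a gloss on the neoclassical assumption, not as a quotation from the 1969 paper), a public good is efficiently provided when the sum of all individuals' marginal rates of substitution between the public and a private good equals the marginal rate of transformation:

$$\sum_{i=1}^{n} \mathrm{MRS}_i = \mathrm{MRT}$$

Since each individual would willingly pay only for his or her own marginal benefit, while the efficient level of provision depends on the sum of everyone's benefits, private markets systematically under-provide the good. That is the 'market failure' taken to license state funding.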

The UK precedent for this line of thought is the so-called Haldane Principle, named for the Liberal politician Viscount Richard Haldane, former Defence Minister and Lord Chancellor during the First World War. A distinguished idealist philosopher who delivered the first set of Gifford Lectures in Science and Religion at St. Andrews University, Haldane was among the original British popularisers of the 'new physics' that emerged in the early twentieth century. The conclusion of Haldane (1921) speaks of the state's need to monitor new developments in the disparate areas of science with the aim of seizing any opportunity to harness them for the greater public good. Earlier in that book, Haldane had written presciently of the potential for atomic physics to deliver incredible wartime and peacetime technologies. Arguably Winston Churchill's early acceptance of the atomic bomb as part of the future of warfare was due to Haldane's insight. However, Haldane also believed that universities were best suited to deciding these matters on behalf of the public interest. He had reached that judgement as part of a more general civil service reform in 1904. At that point, his primary concern was simply to devolve decision-making in the national interest to the level at which it could be effectively made. But this was before the 'Great War' and, more to the point, before the intensification of disciplinary specialisation within science, which began at the university level only after the Second World War.

I raise these historical points because while Haldane provided an intellectual horizon for understanding many of science's later achievements, he failed to anticipate that science would undergo a profound sociological transformation in the twentieth century. Organized inquiry effectively became an intellectual feudal estate, governed through 'peer review' by mutually recognizing disciplinary specialists whose primary interest is to extend research into areas that have already benefitted them, not necessarily the 'general public', however understood. I don't mean that peer reviewed research is somehow inimical to the public good - only that peer review is designed mainly to ensure the integrity of knowledge for those who are most likely to use it for research purposes. From that standpoint, at most it provides a quality control check on knowledge produced for the public good.

Nevertheless, peer review came to dominate state science policy across the world after the Second World War under the rubric of the 'linear model', which effectively delegated to scientific elites the responsibility for setting science's strategic objectives with regard to the public interest. To be sure, over time faith in this process has been eroded, a sign of which is the increasingly interdisciplinary turn in national research council funding around the world. It has effectively forced researchers to broaden the remit of their enquiries in order to satisfy such larger policy agenda rubrics as 'well-being' and 'sustainability'. The logic of this tendency has arguably reached its culmination in the UK government's (pre-pandemic) 2020 earmarking of £800 million over the five-year parliamentary term for what it bills as a 'high risk, high reward' research funding agency explicitly modelled on the fabled US Defence Department Advanced Research Projects Agency (DARPA), but without defence ('D') as the primary focus.

To be sure, the proposal could not come at a more challenging time. The combination of Brexit and the COVID-19 pandemic places an enormous strain on the public purse. Yet Boris Johnson led the Tories to a thumping parliamentary majority in the December 2019 general election on a platform that the UK should set its own terms on the world stage post-Brexit. This includes taking science seriously but not deferring to the received word of experts on what is and is not possible. The pandemic has unwittingly showcased this point, since when confronted with a new, potentially lethal virus there are no 'experts' in any strict sense. Many different bodies of knowledge shed light on what remains an inherently amorphous situation. The trick then is to organize this knowledge for the most enduring effect - that is, not only beating the virus into retreat but also providing a new field of inquiry on which later scientists and policymakers might build and draw. Indeed, this is exactly how epidemiology emerged in response to various microbial threats to industry, the military and public health in late nineteenth century Europe - a point to which I shall return below.

At a more general level, a UK ARPA could eliminate the final vestiges of the discipline-based feudalism that Viscount Haldane unwittingly unleashed in the early twentieth century and which the US set as the gold standard for science policy making around the world after the Second World War with the establishment of the National Science Foundation (NSF). Indeed, a UK ARPA offers a unique opportunity for the UK to set a clear world example in redefining what it means for science to be a genuinely 'public good'. The financial burdens that the COVID-19 pandemic has placed on the Treasury provide a further incentive. The likelihood of a tighter public purse in the near future will necessitate hard funding choices. This should inspire a McKinsey-like 'ground zero' auditing of whether current public investment in science is organized in such a way as to actually benefit the public. As with so many other things, our understanding of what it means for science to be a public good is 'path dependent', in the sense that it is conditioned by specific historical episodes, the response to which at the time has anchored subsequent judgements, which are now simply taken for granted. A UK ARPA could serve to break the path dependency in our thinking about 'science as a public good' if it is proposed as a replacement for the current science funding councils-based arrangement. This would ensure the right levels of investment to enable a UK ARPA to succeed.

Moreover, in light of the UK government's current interest in investing more equitably in science across the country, again in the spirit of 'knowledge as a public good', it is worth observing that the original Congressional proposal for the establishment of the NSF was exactly in that spirit. Indeed, as the US historian Daniel Kevles (1977) has pointed out, the NSF was designed to be an extension of FDR's New Deal, whereby the federal government would invest in university teaching and research only insofar as graduates and researchers agreed to perform a kind of 'national service', which amounted to bringing scientific know-how to socially and economically deprived regions of the country. The model for this initiative was the successful 'land grant universities' of the late nineteenth century, which contributed significantly to the development of America's rural backwaters. Interestingly, the New Dealers had instinctively regarded universities as monopoly capitalists of knowledge, largely because most academic research up to that point (the late 1930s) had been funded by large corporations and their affiliate foundations (e.g. Rockefeller, Carnegie, Ford, Sloan). Thus, the NSF was originally conceived as a kind of epistemic trust-busting operation to ensure that the nation's scientific capital was not concentrated in just a few places.

However, once the Democrats lost control of Congress to the Republicans after the Second World War, the more established academic interests took over, led by MIT Vice-President Vannevar Bush, whose Science: The Endless Frontier influentially made the case for the version of the NSF that exists today, largely on the back of the success of the Manhattan Project in producing the first atomic bomb [Bush, 1945]. It effectively launched the prevailing science policy mythology, which amplifies the professional scientific community's control over science in ways that undermine science as a public good. However, as we shall see, the lesson of the Manhattan Project was quite different: it would have been to establish a version of DARPA, which ended up happening almost a decade after the NSF, in direct response to the Soviet launch of the Sputnik space satellite. By that time, the 'basic/applied' science policy mythology had already set in.

NSF vs DARPA: The Struggle for the Legacy of the Manhattan Project

Most histories of science policy point to the establishment of the US National Science Foundation in 1950 as the moment when what is nowadays called 'basic research' or 'pure science' came to be something that the public was expected to fund. Governments have always taken an interest in scientific research, especially as a means to acquire military and economic advantage on the world stage. Just as Haldane was reforming UK national knowledge production in the context of civil service reform, Germany established the Kaiser Wilhelm (now Max Planck) Institutes, which forged the first 'triple helix' of state-university-industry collaboration so as to embed science more securely in the national interest.

But the NSF's rationale was different. It didn't start from the perspective of the national interest at all but from that of the academic research community, which was portrayed as best serving the national interest by being left to its own devices. But why would anyone think such a thing? The historical answer is the Manhattan Project. This makeshift gathering of top scientists not only produced the first successful atomic bomb but also gave Vannevar Bush an opportunity to elevate an already widespread public faith in science into a state ideology. It was a stroke of Machiavellian genius. After all, while the Nazis and the Soviets had loudly harnessed science to their respective causes, neither claimed that the national interest was being led by the science. But this was exactly what Bush proposed - and succeeded in realizing through the NSF, whose iconic status as a science funding agency remains to this day. Among the many downstream effects of this development is that in the current COVID-19 pandemic, democratic political leaders comfortably - albeit with varying degrees of credibility - justify their exercise of unprecedented powers by claiming to be 'led by the science'.

Of course, Bush himself would not necessarily have approved of today's rhetorical leveraging of science in public policy-making. Yet, the prospect of science setting the pace of politics was certainly intimated in his famed Science: The Endless Frontier, the work credited with inspiring the vision behind the NSF. In particular, Bush contributed to a narrative that was being spun by his ally Harvard President James Bryant Conant and especially Conant's protégés, Thomas Kuhn and Robert Merton, who were among the founders of today's history and sociology of science. According to this tale, the Nazis and the Soviets were destined to fail because they tried to force science to conform to the dictates of policy, resulting in a distorted understanding of reality. These totalitarian regimes were guilty not simply of bad strategy and tactics in advancing their objectives, or even of bad morals in the treatment of their own peoples. They were guilty of the ultimate crime: bad epistemology. In contrast, this narrative suggests, liberal societies are open to whatever the evidence says - with the proviso that what counts as evidence has been vetted by the relevant academic experts.

Thus, 'Always more science, but never used before its time' became a mantra that shored up the public's commitment to scientific autonomy, while constraining policymakers' sense of what is realizable, thereby safeguarding against the excesses and atrocities of totalitarian regimes. The NSF should be understood as a monument to this mentality. It involved a peculiar understanding of how liberal societies relate to science. Meanwhile, another understanding was brewing on the other side of the Atlantic. While the US Congress was debating the founding of the NSF, Karl Popper had begun to promote at the London School of Economics an alternative idea of the science-society relationship. Science for him was less about establishing a disciplined grip on reality than recognizing that our grip may not be as secure as we thought. It was more about Galileo's challenge of Church dogma than Newton's pronouncement of the laws of nature. Popper regarded science as the intellectual cradle of liberalism, the exemplar of the 'open society', a society able to change its collective mind with relative ease as new evidence comes to light. Here 'evidence' means an outcome or event that seriously undermines prior expectations, including those based on learned prejudice.

At first, Popper's vision looked like a highly romanticized version of Cold War liberal ideology. However, once the Soviet Union launched Sputnik, the first artificial satellite, in 1957, it became clear that he had captured something significant about the nature of science that the NSF's 'pure science' orientation had overlooked. Indeed, when President Eisenhower's Chief of Staff Sherman Adams canvassed the opinion of scientific leaders on the Sputnik launch, they reassured him that the US had already mastered the basic research behind the satellite's construction. However, the American media did not let the matter rest on that sanguine note. They turned Popperian. They suggested that the Soviets were now capable of a level of global surveillance that threatened US national security. This view was also shared by Eisenhower's Defence Secretary, Neil McElroy, who realized that the Soviets were envisaging outer space in a radically different way from the academic physicists, who understood Sputnik as simply a glorified application of their science.

This led McElroy to propose for the Defence Department an 'Advanced Research Projects Agency' - DARPA - focused on framing the nation's future scientific and technological needs (Belfiore 2009). It would not be about, say, improving current missile technology but about developing the 'next generation' of warfare before it is strictly needed. It would be less about winning the current game and more about setting the rules for the next game. This involved a different mindset toward science, one that was arguably truer to the practice of the Manhattan Project than the version of the NSF successfully promoted by Bush and his colleagues. The underlying principle that McElroy grasped was that the ultimate test of scientific knowledge is its capacity to reorganize around an unforeseen development, thereby converting a liability into a virtue.

After all, the US did not start trying to build an atomic bomb until the Princeton-based Albert Einstein tipped off FDR that the Nazis might already be heading down that path. The ensuing team of researchers who succeeded in building the first atomic bomb did not spontaneously self-organize into what was known as the 'Manhattan Project'. The US government, in consultation with the lead scientists, set the parameters of the project, including who was eligible to participate. The scientists then proceeded in an unprecedentedly free way. It resulted in massive cost overruns, relatively little oversight and high levels of uncertainty until the first bomb was detonated in a New Mexico desert. But this impressive achievement was hardly a triumph of 'basic research left to its own devices', notwithstanding Bush's successful PR campaign. The Manhattan Project was neither the product of discipline-based academic work nor the straightforward application of such work. It was a profoundly interdisciplinary project that involved not only physicists but also engineers and medical professionals. It took all concerned way outside their intellectual comfort zones.

In the end, the establishment of the Manhattan Project was about creatively learning from error, where 'error' means being blindsided. The same applies to the US response to Sputnik. The US failed to launch the first artificial space satellite because it had yet to realize that the Soviets had started a 'space race'. The US succeeded in developing the first atomic bomb only because it threw more resources at it than the Nazis, who had come up with the idea. In both cases, the US failed to name the game that it was playing with its opponents, even though it went on to win both games. DARPA was designed to prevent that from ever happening again. And by that standard, DARPA has been a sterling success. The internet, virtual reality and drones are among the many products of DARPA-based research that have changed the landscape of warfare - and much else, including the conduct of science itself. In short, DARPA would have set a better precedent for science policy than the NSF in the wake of the Manhattan Project.

Early in 2020, Mauro Ferrari loudly resigned as head of the European Research Council (ERC) - the European Union's answer to the NSF - on the grounds that it was not fit for purpose in tackling the COVID-19 pandemic. He complained about the ERC's 'lack of responsiveness', by which he meant its failure to see the pandemic as an opportunity for genuine scientific innovation. It is perhaps no accident that Ferrari was a pioneer in nanomedicine, a field that emerged in the early 2000s as part of a concerted policy effort on both sides of the Atlantic to harness various sciences in an 'enhancement' agenda designed to enable people to lead longer, healthier and more productive lives. Indeed, the landmark 2002 'converging technologies' report of Mihail Roco and William Sims Bainbridge made it seem for a while that the NSF itself might be heading in a more DARPA-like direction.

A quarter-century ago, the US political scientist and pioneer in empirical voter studies Donald Stokes (1997) dubbed this kind of science, which is literally the product of a state of emergency, 'Pasteur's Quadrant', named for the French founder of epidemiology, Louis Pasteur. Stokes would turn Vannevar Bush on his head. Whereas Bush believed that a science must reach a state of maturity on its own terms before it can fruitfully tackle real-world problems, Stokes suggested that signature scientific breakthroughs come from real-world problems challenging several disciplines at once to overcome the self-imposed limitations of their inquiries. The difference in raison d'être for the NSF and DARPA could not be cast more starkly.

Stokes had developed a 2 x 2 matrix of relationships between 'basic' and 'applied' research in the 1990s as part of a prospectus on the possible directions for post-Cold War US science policy. In Stokes's scheme, 'Bohr's quadrant' names pure basic research and 'Edison's quadrant' pure applied research, while 'Pasteur's Quadrant' - named for its exemplar Louis Pasteur - refers to research driven by 'applied' concerns that serve to steer 'basic' research. As Stokes rightly saw, the story of Pasteur's long-term contributions to science - not least in such pandemic-relevant fields as epidemiology and public health - reversed the narrative line of the science policy mythology. Moreover, Pasteur was hardly unique: the same pattern has also characterised the overall direction of travel for innovation in both science and technology in the twentieth century. The great private foundations (e.g. Rockefeller) and corporate R&D units (e.g. Bell Labs) had been the main drivers of the signature breakthroughs in molecular biology, behavioural science, neuroscience, as well as information and communication technology, including artificial intelligence research.

Of course, the researchers involved were academically well-trained. More importantly, academia was central to the normalization of these breakthroughs into the curriculum so that many more than the original funders could benefit. No one can reasonably deny the university's centrality to the conversion of these innovations into knowledge as a public good. However, when it comes to providing an environment for the actual conduct and evaluation of such cutting edge research, the record of universities - and especially of established academic disciplines - has been chequered, to say the least. The complaints of academic innovators about their home turf are legion and largely justified. They go beyond lack of time and funds. Peer review itself routinely conflates judgements about the validity of work on its own terms with judgements about its relevance to some larger discipline-based agenda, which in the end may matter only to other academics.

It would be ironic if 'basic research' has come to be no more than a euphemism for academic work that can be 'owned' by academic disciplines, especially in the context of tightened public funding. In this light, the UK government's proposed 'high risk, high reward' ARPA-style research agency should be seen as a direct challenge to the science policy mythology. Why should we presume that 'basic research' of the truly fundamental sort is more likely to come from the agendas of self-appointed 'basic researchers' than external exigencies of the sort that provide the basis for Pasteur's Quadrant?

Conclusion: Whither Science as a Public Good?

Ferrari's recent ERC resignation and the UK government's proposed ARPA-style 'high risk, high reward' research agency have thrown into sharp relief the distinction between 'basic' and 'applied' research today. All of this has been happening against the backdrop of the COVID-19 pandemic, and the squeeze that it will inevitably place on public finances across the world for the foreseeable future. Taken together, these developments create the perfect storm for radically rethinking why taxpayers should be funding research at all.

Indeed, Stephen Turner and Daryl Chubin (2020) have recently revealed a disconcerting feature about the sense of 'research impact' implied by the ideal of scientific autonomy that they defend. This ideal has been historically promoted by the academic establishment, typically in aid of elite science ('the next Einsteins'). Yet, it is far from clear that academia has been the natural home of 'scientific autonomy' in its broadest sense, namely, the free selection of the ends and means of research. Instead, what has been upheld in the name of 'scientific autonomy' has been the much narrower and self-serving idea of academically oriented scientists being autonomous from any constraints imposed by the rest of society. This effectively grants the academic establishment exclusive rights to impose all the constraints it wants on those who would claim to do 'science'. Thomas Kuhn's totalitarian sense of 'paradigm', inspired by Bush collaborator James Bryant Conant, concedes this point by making it a condition of being considered a 'scientist' that one self-presents as having undergone the appropriate forms of 'acculturation' (aka indoctrination) into science; hence the significance attached to 'peer review' [Fuller, 2000b, chaps. 3-4].

Put another way: The defence of 'scientific autonomy' has been really about the entitlement of a group of self-certifying academics to monopoly ownership over 'science', understood as the most highly valued form of knowledge in society. The terms on which the NSF was established, masterminded by Bush and Conant, were a victory for this vision. Thus, Conant's promotion of 'no science before its time' should be understood as a form of soft power, whereby science domesticates the passions of a populace increasingly impressed by its achievements. Plato would have approved. Turner and Chubin discuss the Habermas-inspired Finalizationist movement in 1970s-80s West German science policy thinking, which provides an interesting benchmark. The Finalizationists observed that scientists left to their own devices were likely to squander their entitlement to free inquiry by becoming increasingly self-involved in problems that removed them from larger societal concerns, even though they had already secured a body of usable 'public' knowledge. Truly 'autonomous' scientists should be able to opt out of disciplinary specialisation with impunity.

The historically contrasting agendas of the NSF and DARPA allow us to take stock of the terms under which knowledge production might be organized to render it a 'public good'. If one thinks of other public goods, they are better seen as utilities than as properties. They are resources to which everyone has access but which each person is expected to use somewhat differently, presumably to their own personal advantage. Indeed, the same public goods can enable rivals to compete more effectively against each other. This certainly applies to knowledge - and, in this respect (pace Audre Lorde), the master's tools can dismantle the master's house. Conversely, it doesn't make sense to say that someone's access to a public good is 'better' than someone else's, if the good is truly public. Someone may use a public good less advantageously than someone else, but the fault lies with the person, not with the good. This helps to explain the Humboldtian image of the academic as embodying the 'unity of research and teaching' in his or her person. As the academic renders those not involved in the research process capable of understanding and using cutting edge knowledge in their lives, s/he also demystifies the specialness of that knowledge, so that it doesn't turn into the latest principle of social stratification [Fuller, 2000a, chaps. 6-7; Fuller, 2002, chap. 1; Fuller, 2009, chap. 1].

Of course, this raises the question of the cultivation and maintenance of public goods, from which concerns related to the 'tragedy of the commons' derive. In that case, there may be reason to think about public goods as common property. Arguments for academic custodianship as a necessary feature of knowledge as a public good routinely trade on this idea. Just as stewardship is required to prevent environmental degradation through the overuse of common natural resources, so too, the argument goes, stewardship is required to ensure quality control in the production of knowledge. Today's fears about the spread of misinformation and 'fake news' - typically on the internet and social media - relate to this point. Unfortunately, such arguments have historically veered toward rentiership, effectively ceding to academics proprietorship of 'knowledge in the public interest' [Fuller, 2019].

However, much closer to the spirit of knowledge as a public good is to think of the late US Senator Daniel Moynihan's famous quip, 'You are entitled to your own beliefs but not your own facts', as applying equally ('symmetrically', in STS jargon) to experts and non-experts. In other words, once the 'fact' or 'statistic' is rendered public, anyone can try to use it to their advantage. It is up to opponents to block its usage in practice. I have described this as constitutive of the 'post-truth condition', which is an inevitable outcome of the democratisation of knowledge and perhaps the most vital way to understand knowledge as a public good [Fuller, 2018; 2020]. Thus, 'contextualisation' comes to be seen as a privileged securing of knowledge in order to restrict access to it. From this standpoint, academic credentialing authority and intellectual property rights don't look so different. I explore this matter further in the Appendix of this paper, 'Expertise as Organized Hypocrisy'.

So, one might wonder, is there really a 'tragedy of the commons' with regard to knowledge, after all? If not, the arguments for academic custodianship of knowledge as a public good are weakened. Here three considerations may be brought to bear:

1. The intuitive appeal of the tragedy of the commons is that nature's standards are non-negotiable: If you overuse or misuse a natural resource, its utility for subsequent users is simply diminished. This is not so obvious in the case of knowledge. After all, much scientific innovation has resulted from the 'metaphorical' transfer of concepts from one domain to another. Mathematical isomorphism has often facilitated the transfer, as the same equations are used in different domains (e.g. the inverse square law, the entropy principle; see the illustration after this list). Had the results of such transfers not been so fruitful, they would have been condemned as 'misapplications'. Indeed, the 'Science Wars' of the 1990s - most notably the Sokal Hoax - turned largely on the metaphorical extension of scientific concepts to 'non-scientific' domains that certain scientists regarded as unfruitful.

2. The need for knowledge to be 'contextualised' properly in order to be used properly is akin to saying that your animals can't graze on the commons unless they meet certain standards of 'grazing'. (European Union residents will understand the spirit of this proposal immediately!) Under those conditions, a public good is arguably converted into a 'club good'. To put it bluntly in the case of science, you need to think like an academic in order to use academic knowledge - and that may require getting a degree, credential or some other 'license'. This was the import of Moynihan's original quip, which has been flipped in the post-truth condition.

3. There is the embarrassing fact to which Kuhn drew attention under the rubric of 'paradigm': namely, that the dominant science is allowed to 'smash and grab' its own history to appropriate whatever past achievements support its current trajectory, regardless of whether the scientists behind them would have consented to that trajectory. Moreover, this 'smash and grab' approach is licensed only to those working within the established paradigm - not to scientists operating outside it. (Anyone party to the debates between evolutionists and creationists/intelligent design theorists over the nature of life will know what I mean.) In short, notwithstanding its avowed role in maintaining knowledge as a public good, 'science' - understood as the authoritative wing of knowledge - simply carries on a form of monopoly capitalism at the meta-level when dealing with its own past.
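To make the isomorphism invoked in point 1 concrete, consider a standard textbook pairing (an illustration, not part of Fuller's text): Newton's law of gravitation and Coulomb's law of electrostatics share the inverse-square form, while the Boltzmann-Gibbs entropy of statistical mechanics and Shannon's information entropy share a single functional form:

$$F_{\mathrm{grav}} = G\,\frac{m_1 m_2}{r^2}, \qquad F_{\mathrm{elec}} = k_e\,\frac{q_1 q_2}{r^2}$$

$$S = -k_B \sum_i p_i \ln p_i, \qquad H = -\sum_i p_i \log_2 p_i$$

In each pair the same mathematical structure does its work in two quite different domains, which is why the 'metaphorical' transfer of concepts proved fruitful rather than a 'misapplication'.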

In the end, once we take into account, first, that knowledge is a 'public good' in a sense that is closer to that of a utility than a property and, second, Humboldt's Enlightenment-oriented academic legacy, it follows that by 'universally accessible knowledge' we must be talking about life-enhancing skills - that is, in the spirit of the medieval 'trivium' (grammar, logic, rhetoric) and 'quadrivium' (arithmetic, geometry, music, astronomy), the modern version of which is captured in 'liberal arts' education. In this context, subject areas are valued less for their actual content than for the refinement of basic powers of speaking, reasoning, discernment and comportment that result from engaging with those areas in a sustained way. An interesting bridge moment from the medieval to the modern context was an idea first floated by Francis Bacon in his general reform of learning and picked up 150 years later by Denis Diderot during the first major wave of empirical interest in the workings of the brain [Darnton, 1984, chap. 5].

Bacon's proposal was basically that academic faculties should correspond not to subject matters at all (which only encouraged a proprietary vision of knowledge as 'domains') but to faculties of the mind. Moreover, true to his signature inductive spirit, Bacon advised that the academic faculties should be regularly revised as our scientific understanding of the mind's faculties improved. Diderot updated Bacon's proposal with the proto-neuroscience of his day and identified it with the Enlightenment's approach to education. The idea was pursued further by the German idealists - especially Hegel - in Humboldtian Berlin, and by Auguste Comte and his positivistic followers, among others. Today, we might extend our thinking of 'faculty' to include the sort of prosthetic extensions (e.g. silicon chip implants) associated with a 'cyborg' existence. In any case, this is the likely direction of travel for any future discussions of 'science as a public good'.

Appendix: Expertise as Organized Hypocrisy

When the late US Senator Daniel Patrick Moynihan quipped, 'You're entitled to your own beliefs but not your own facts', he forgot to explain that it was because the facts belong only to the experts. The hidden conceit is Platonic: The plebs have 'beliefs', the experts 'facts'. Moynihan's mode of address is characteristic of experts. What the experts say may be true but it's never the whole truth because they are hiding the context from you. In this sense, expertise is an especially potent form of organized hypocrisy.

To be sure, the experts themselves control only some of the relevant context. The part they control rests on everyone assuming that facts must be understood 'properly'. This part makes the expert appear like a custodian of some intellectual commons. Higher education certainly reinforces this perspective, though it has been seriously eroded over the past quarter century by the internet. But generally speaking, modern societies are structured to make it relatively easy for experts to control that part of the context.

The part that experts don't control is that contexts change over time and in unexpected ways, which means that the significance of the facts also changes. Here the experts want to ensure that these changes are publicized only once they are ready to take full advantage of them. This leads experts to be 'economical' with the truth in the manner of my Platonic gloss of Moynihan's quip. They want to be the first to announce that the context has changed, and therefore the facts are not quite as they had originally seemed. In practice, this means that most critiques, objections and anomalies are ignored, denied or otherwise marginalized - that is, until the experts have gained sufficient control of the situation to move forward in unity and strength.

None of what I have said denies that experts commit various errors and even frauds along the way. However, most of those errors don't seriously disturb the overarching expert narrative. Even the frauds are manageable in the long term, especially if they can be presented - in a tastefully distant future - as having anticipated findings confirmed by subsequent 'proper' research. That strategy served to salvage the careers of those eminent fraudsters, Galileo and Mendel. Arguably the true expertise of experts is their ability to manage what the sociologist Erving Goffman (1959) discerned as 'back stage' and 'front stage' in what amounts to the drama of organized inquiry. While Goffman was himself focused on everyday life in the modern world, his point extends quite naturally to the world of expertise.

Philosophers of science translate Goffman's dualist dramaturgy as the 'context of discovery' (aka back stage) and the 'context of justification' (aka front stage). The former is the idiosyncratic, perhaps even irrational way in which scientists actually make breakthroughs; the latter is the logically airbrushed, empirically photoshopped version in which those breakthroughs appear in peer-reviewed journal articles. Sociologists notoriously started pulling back the curtain behind science's front stage in the 1970s, emphasizing the considerable work involved in converting the antics of the laboratory into a stable piece of knowledge that can command public support [Latour and Woolgar, 1979; Knorr-Cetina, 1981].

Of course, the sociologists weren't telling the experts anything that they didn't already know. The experts just didn't want it to be more generally known. Facts don't emerge and are not maintained by themselves. Indeed, they need to be stage-managed. Such 'stage management' is largely about controlling the context in which they are presented as 'facts'. The intense publication rivalry that one witnesses in the academic world is mainly about trying to best capture something that many - if not most - of the relevant experts are already prepared to accept. They're just waiting for someone to present it perspicuously. The issue with establishing facts is not truth but authorization: Who gets to say it, not what gets said.

A doubled-down version of this approach appears in the need to possess the relevant academic credentials before one can even be a player in this game. To be sure, someone lacking a PhD is capable of making a major intellectual breakthrough. Arguably most such breakthroughs have been made by just such people, especially if inventions are included. However, it is unlikely nowadays that such people would have the 'know how' (savoir faire) to negotiate their way through the peer review process, that fountainhead of expert hypocrisy.

Peer review is the process by which established academics allow people to contribute to their collective knowledge enterprise. It typically involves a dual judgement about both the validity and the relevance of the work presented. Someone may submit an article that is valid on its own terms yet not relevant to the enterprise's direction of travel. A rejection on this basis at first sounds like a harmless sorting exercise. However, if the knowledge enterprise already enjoys the standing of 'expertise' in the larger society, then publishing a valid but 'irrelevant' article might subvert that standing by suggesting an alternative way of approaching matters over which the experts currently enjoy a monopoly.

Consider the terms on which the charter of the first modern scientific association, the Royal Society of London, was drawn up. They offer a blueprint of corporate autonomy - and the hypocrisy it embodies. In particular, the charter crystallized the mutual recognition by the state and the nascent scientific community of the benefit they could provide to each other if each stayed out of the other's business. Thus, forms of organized inquiry that potentially threatened the state's legitimacy - morals, politics, religion, rhetoric - were specifically banned in the Royal Society's charter. Agreement on this point was easier in 1660 than it might be today because England had just emerged from a long and deep civil war, culminating in the overthrow of the monarchy.

Here it is worth recalling a quip of that great early modern master of hypocrisy, US founding father Benjamin Franklin: 'If we don't hang together, we shall hang separately'. Franklin was talking about the need for the revolting American colonists to settle their differences over the wording of the Declaration of Independence before exposing the document to public scrutiny. In practice, this meant that the colonial delegates wrangled behind closed doors before presenting a finished text. While political philosophers like to present this episode as demonstrating the value and power of reaching consensus, 'consensus' is just a euphemism for organized hypocrisy [Fuller, 2000a, chap. 8].

By drawing a curtain over the back stage, Franklin and his fellow revolutionary wranglers inhibited any larger participation in what probably most concerned the target audience. This was exactly the spirit in which the Charter of the Royal Society was adopted. In that case, the curtain would be regularly drawn by the peer review process, which was originally a form of censorship but has subsequently been sublimated as self-censorship. The Royal Society's first editor, Henry Oldenburg, simply barred cognitively inflammatory references from entering the society's correspondence, the basis for its original 'proceedings'. Indeed, peer review has always involved the enforcement of 'political correctness', with the politics shifting over time and place [Bazerman, 1987].

Early in the Royal Society's history, Thomas Hobbes challenged Robert Boyle over the validity of his air-pump experiments, which allegedly demonstrated the existence of a vacuum in nature. Hobbes was very politically incorrect. He failed to respect the Royal Society's state-authorized hypocrisy. Specifically, Hobbes did not believe that the presence of several authorized observers at a single event was sufficient to establish a metaphysical truth. Moreover, his attack was very robust. Hobbes regarded experiments as pure theatre and provocatively compared Boyle's set-up to transubstantiation in the Roman Catholic Mass, whereby the mundane mixing of bread and wine is presented as the body and blood of Jesus Christ. The Royal Society's denial of membership to Hobbes effectively overruled his challenge, after which the curtain was drawn between the front and the back stage of the drama of science [Shapin and Schaffer, 1985]. And the rest is history - specifically, the history of institutionalized science, aka organized hypocrisy.

References / Список литературы

Bazerman, 1987 - Bazerman, C. Shaping Written Knowledge. Madison WI: University of Wisconsin Press, 1987, 356 pp.

Belfiore, 2009 - Belfiore, M. The Department of Mad Scientists: How DARPA Is Remaking Our World. New York: HarperCollins, 2009, 320 pp.

Bush, 1945 - Bush, V. Science: The Endless Frontier. Washington DC: Office of Scientific Research and Development, 1945, 204 pp.

Darnton, 1984 - Darnton, R. The Great Cat Massacre. New York: Basic Books, 1984, 320 pp.

Fuller, 2000a - Fuller, S. The Governance of Science. Milton Keynes: Open University Press, 2000, 192 pp.

Fuller, 2000b - Fuller, S. Thomas Kuhn: A Philosophical History for Our Times. Chicago: University of Chicago Press, 2000, 490 pp.

Fuller, 2002 - Fuller, S. Knowledge Management Foundations. Woburn MA: Butterworth-Heinemann, 2002, 288 pp.

Fuller, 2009 - Fuller, S. The Sociology of Intellectual Life. London: Sage, 2009, 192 pp.

Fuller, 2013 - Fuller, S. "Deviant Interdisciplinarity as Philosophical Practice: Prolegomena To Deep Intellectual History", Synthese, 2013, vol. 190, pp. 1899-1916.

Fuller, 2018 - Fuller, S. Post Truth: Knowledge as a Power Game. London: Anthem Press, 2018, 218 pp.

Fuller, 2019 - Fuller, S. "Against Academic Rentiership: a Radical Critique of the Knowledge Economy", Postdigital Science and Education, 2019, vol. 1, pp. 335-356.

Fuller, 2020 - Fuller, S. A Player's Guide to the Post-Truth Condition: The Name of the Game. London: Anthem Press, 2020, 250 pp.

Goffman, 1959 - Goffman, E. The Presentation of Self in Everyday Life. Garden City NY: Doubleday, 1959, 259 pp.

Haldane, 1921 - Haldane, R. The Reign of Relativity. Toronto: Macmillan, 1921.

Kevles, 1977 - Kevles, D. "The National Science Foundation and the Debate Over Postwar Research Policy, 1942-1945", Isis, 1977, vol. 68, pp. 5-26.

Knorr-Cetina, 1981 - Knorr-Cetina, K. The Manufacture of Knowledge. Oxford: Pergamon, 1981, 189 pp.

Latour and Woolgar, 1979 - Latour, B. and Woolgar, S. Laboratory Life. London: Sage, 1979, 271 pp.

Samuelson, 1969 - Samuelson, P. "Pure Theory of Public Expenditures and Taxation", in: J. Margolis and H. Guitton (eds.), Public Economics. New York: Macmillan, 1969, pp. 98-123.

Shapin and Schaffer, 1985 - Shapin, S. and Schaffer, S. Leviathan and the Air-Pump. Princeton: Princeton University Press, 1985, 448 pp.

Stokes, 1997 - Stokes, D. Pasteur's Quadrant: Basic Science and Technological Innovation. Washington DC: Brookings Institution Press, 1997, 196 pp.

Turner and Chubin, 2020 - Turner, S. and Chubin, D. "The Changing Temptations of Science", Issues in Science and Technology, 2020, vol. 36, no. 3, pp. 40-46.
