RUDN Journal of Public Administration
ISSN 2312-8313 (print), ISSN 2411-1228 (online). 2023. Vol. 10. No. 2. P. 269-285
http://journals.rudn.ru/
DOI: 10.22363/2312-8313-2023-10-2-269-285 EDN: GXDWLJ
Research article / Научная статья
Toward an Integrated Model for Public Technology Policy Analysis — a Taxonomy Useful for Determining Scope and Type of Analysis
Gunnar K.A. Njalsson
University of Lapland,
8 Yliopistonkatu, 96300 Rovaniemi, Finland
Abstract. How might the analysis of public technology policy be further systematised and made more objective so that important factors such as conscious subjects, interests, influence and prioritisation could better be taken into account by policy analysts and decision-makers? An integrated model designed to guide and structure the analysis process might comprise a solution to this problem. An adequately sophisticated yet concise and systematic framework of this sort would necessarily take into account the referential point of departure and alternative types of policy analysis as well as the nature, interests and priorities of those entities actually shaping public technology policy. The segment of the proposed integrated model which is concerned with built-in assumptions about the nature of technological development has been covered in a previous article [1]. The purpose of the current article is to develop and present one further part or segment of an integrated model for public technology policy analysis (IMTPA) and to demonstrate its methodological and analytical utility with central policy analysis documents from Canada during the period 1990-2005. This article shall limit itself to a part of the IMTPA concerned with the type and scope of public technology policy analysis to be undertaken — a methodology which might better guide and make more transparent both the policies being examined as well as the policy analysis process itself.
Keywords: Public administration, technological and innovation policy, development of scientific methods and models, Canadian technology policy
Conflicts of interest: The author declared no conflicts of interest.
Article history:
The article was submitted on 24.02.2023. The article was accepted on 10.04.2023.
For citation:
Njalsson G.K.A. Toward an Integrated Model for Public Technology Policy Analysis — a Taxonomy Useful for Determining Scope and Type of Analysis. RUDN Journal of Public Administration. 2023;10(2):269-285. https://doi.org/10.22363/2312-8313-2023-10-2-269-285
© Njalsson G.K.A., 2023
This work is licensed under a Creative Commons Attribution 4.0 International License https://creativecommons.org/licenses/by-nc/4.0/legalcode
Technology Policy and Policy Analysis
Policy and policy analysis have been defined in a number of different ways. Some of these definitions are worth a few words before we continue to the subject of public technology policy analysis. One dictionary defines policy as "a definite course or method of action selected from among alternatives and in light of given conditions to guide and determine present and future decisions" [2]. Another scholar [3] has expressed the definition in much simpler terms as "what governments choose to do or not to do".
There are many more definitions of the word. However, certain ideas appear with regularity in most of them. Firstly, policy has to do with alternatives and choices. Secondly, the definitions imply that someone is saying "yes" to some alternatives and "no" to others. One may safely derive from the definition that choices are being made based upon information about or perceptions of the alternatives. Both conscious action and deliberate or informed omission of action comprise, according to these definitions, what we refer to as policy.
However much can be understood from these definitions, they still raise many more questions. They are also very general. What, for example, is meant by "what governments do"?
For the purpose of this segment of the Integrated Model for Public Technology Policy Analysis we shall define policy as governmental or organisational plans, consultations, decisions, directives, resource allocation and deliberate action or inaction. This definition not only includes the choice aspect as well as the options of action or inaction; it also explicitly denotes activities which have the potential of leaving documentary traces and can thereby be examined.
The nature and task of technology policy analysis
Theories of the nature of technology development and innovation have tended to occupy a central role in the formulation of national technology and innovation policy [1]. In the case of technology policy analysis, these most often pertain to subjects such as the nature of technological development, how, why and in what context innovations occur and, indeed, what constitutes an innovation. Many of the theories can be collocated on a matrix ranging from varying extremes of techno-determinism1 or socio-determinism and even varying extremes of technology optimism2 or pessimism [1. P. 66]. Both technology policy and even the analysis of technology policy will in part draw upon the assumptions connected to these
1 Technological determinism is the degree to which it is assumed that technological development is largely independent of social or cultural considerations and that such development follows its own autonomous logic. Socio-determinism, on the other hand, is the degree to which it is assumed that technological development is a part of the broader cultural and social context of a society.
2 Tech-optimism is the degree to which a particular viewpoint contends without reserve or major qualifications that technological change will be of benefit for most of society. Reservations and critique on the part of a theory would imply a lower level of tech optimism than would a lack of reservations or critique.
varying views of technology development and of the desirability of different technological innovations. When attempting to create a clearer and more structured framework for analysing technology policy it is important to establish a specific and exclusive definition of technology policy. This should include what technology policy is and what it does.
What technology policy is
Simply stated, technology policy analysis is the systematic examination of what the government does with respect to the technology sector, why it does it and what effect this has [4. P. 1].
As is the case with policy analysis in general, technology policy analysis examines political choices and the effects of these choices on stakeholders. It is not a theory or methodology in itself, but rather a research field or a research approach often seeking to identify policy alternatives. When used to compare the costs and benefits of particular policy choices it becomes apparent that the conceptual roots of this approach [5; 6] can be traced to systems analysis and economics.
However, technology policy analysis is also a research approach which may be used to examine aspects which led to the formation of one or more policies, in other words, to study the process of policy determination [4. P. 5-6]. Technology policy analysis may go further than simply studying policy determination. The analyst may even attempt to examine the effects or results of a particular policy which has actually been implemented. In doing so, the analyst has turned his or her attention to the aspect of policy impact (ibid.) or policy outcomes [7].
When studying policy outcomes, a particular technology policy will be viewed by the analyst as the independent variable, while the effects that policy has on various societal institutions and activities (the labour market, healthcare, educational spending) become dependent variables. The technology policy is, in other words, being studied as the cause (or at least one of the causes) of certain effects in other policy areas or in society at large.
Technology policy analysis can be defined as above. However, much of the current literature refers to it in terms of standard tasks which it ideally should carry out. Examining what policy analysis is supposed to do may further clarify the exact nature of technology policy analysis as opposed to other research approaches.
What technology policy does
The definitions above make evident that cause and effect are of central importance to technology policy analysis. When examining a particular technology policy as having resulted from various factors in the policy process, for example the breadth and diversity of participation in the policy process or the prevalence of certain pressure groups in that process, the policy is viewed (at least in part) as a result of participatory factors. The participatory aspects of the policy process are viewed as independent variables. The various characteristics or aspects of the
resulting policy are viewed [8. P. 17] as dependent variables. According to some of the most central literature in policy analysis, the tasks carried out utilising technology policy analysis are most often the following:
1. define an issue or a problem;
2. determine technology policy objectives;
3. identify and rank technology policy alternatives;
4. evaluate the policy outcome.
Despite the common tendency to include choosing and implementing policy amongst the tasks, it is questionable whether such a task can logically be assigned to technology policy analysis. While defining problems, determining policy objectives, ranking alternatives and evaluating policy outcomes can clearly be seen as goals to be reached through a process of analysis, choosing and implementing policy in connection with the process of policy analysis is more problematic and constitutes a potential source of criticism.
In fact, this task may instead be the domain of decision-makers and stakeholders who have been provided with information on policy objectives and alternatives as a result of policy analysis. Academically, the policy analysis approach has been criticised mainly for having too aggressively replaced the democratic process (in particular public participation) with expert analysis [9. P. 349-353; 10. P. 523-531].
Weimer and Vining [5] have defined a list of goals for the process of policy analysis (including technology policy analysis) which clearly portray this research approach as supporting debate and the decision-making process.
Fig. 1. Problem-solution analysis. Problem analysis (left): define/understand the problem; choose goals; determine method or methods of solution. Solution analysis (right): choose evaluation criteria; specify and evaluate alternatives; recommend solutions. Adapted from Weimer and Vining (1999). Source: [5]
Starting from the top of the left column at defining and understanding a policy problem and moving downward and then to the right and upward in a counterclockwise direction, one progresses from problem formulation to the proposal of solutions in the policy analysis process. The aspect of greatest interest from the academic
vantage point of the current study is the division of the tasks of policy analysis into those which deliberate on problems and those which pertain to identifying and recommending solutions.
In sum, the current scholarship would indicate that technology policy analysis is used to define problems of relevance to the technology sector and perhaps society as a whole, to establish criteria for possible solutions to the problems and then to recommend certain solutions for implementation. Evaluation is also part of this mission and takes place most often after technology policy implementation.
Types of technology policy analysis
It is now apparent that technology policy analysis is the systematic examination of governmental or organisational plans, consultations, decisions, directives, resource allocation and deliberate action or inaction with respect to the technology sector. Such analysis can focus on policy determination (the formation of technology policy) or on policy outcomes. Technology policy analysis is not the study of new technologies. It is not the study of what innovations are or how they occur. Such questions are dealt with by multidisciplinary innovation studies including science and technology studies (STS). An integrated model for public technology policy analysis will for the sake of clarity relegate this form of scholarship to the realm of theories of technology development and innovation.
It is possible to devise a taxonomy of technology policy analysis dividing it into several types. This may be done according to particular methodological approaches or aims of the policy analysis. For example, an analysis carried out in accordance with econometric methods, examining the cost or utility of various technology policy options could be classified as a different type of policy analysis than an examination of the participatory aspects of technology policy and how these affect policy determination.
However, such distinctions may be less useful than a typology which clarifies for the user (and even the analyst) the nature and ontological or ideological point of departure of the entire analysis. This may better contribute to transparency, making potential readers or users aware of the possible need for complementary studies from different vantage points.
In the context of an integrated model for public technology policy analysis, it may be useful for the analyst to divide technology policy analysis into three types- objective, normative and speculative [1. P. 57-58]. Such a typology is based upon factors such as the ontology, criteria or indicators utilised as well as the goals of the analysis exercise.
Objective technology policy analysis
While policy can rarely, if at all, be said to be strictly objective (policy is largely based upon priorities, ideology and values), the study of public technology policy may at times simply be concerned with an observation of its characteristics or patterns.
Who is involved in deciding priorities for national investments in technology projects? Which companies receive the greatest amount of public R&D funding? What are the three highest national priorities in the area of technology spending in the context of programmes for the Information Society? Is government money being spent on centres of excellence and new clusters? How much? What are the primary
indicators used in order to define a digital economy? Has public spending in other policy areas decreased at the same time as it has increased in the area of technology? Do technology policy forums include members of the general public or of widely varying occupational groups? How does the national technology programme define the word 'innovation'?
The questions above are those which objective technology policy analysis would be used as a tool to answer. Here the policy analyst is not concerned with what is 'good' or 'bad', 'efficient' or 'inefficient', 'effective' or 'ineffective', 'democratic' or 'undemocratic', 'expensive' or 'inexpensive'. Characteristics of a particular national technology policy are simply being observed and measured. The analyst is not making any statements as to policy performance or as to what the policy should or should not be like or do. The observer is simply attempting to ascertain what is or was.
Of course, there are issues for debate even in the context of objective policy analysis. Can different policy analysts agree on the definitions used (i.e. clusters, innovation, centres of excellence, technology sector)? Are the indicators used to measure aspects of a national technology policy adequate for this purpose?
Whatever the outcome of such a debate, the discussion pertains to methodology and not to values, priorities or policy performance. Importantly, the observer is not attempting to influence the policy process. Judgements as to performance or the desirability of the particular policy are left to users of the information in the analysis.
Normative technology policy analysis
As soon as the focus of the examination turns to issues of utility, establishing objectives or priorities and setting up benchmarks, it has in effect become normative technology policy analysis. The same can be said for any analysis with the objective of "improving" or changing public policy as it relates to technology or of making changes to the policy process. Any analysis attempting to find the factors which led to the previous "success" of a certain policy, would also belong to this category.
The important aspect which sets this type of technology policy analysis apart from objective technology policy analysis is that normative technology policy analysis includes some idea of what a particular technology policy should be or should have been. It includes indicators, benchmarks and goals which are designed to measure "success" or "failure", "positive" or "negative" aspects of a particular policy. When a policy think tank or a political activist attempts to influence decision-makers by means of a particular analysis which supports a certain value-related point of departure and presents conclusions on what action should be taken in relation to public technology policy, this by definition constitutes normative technology policy analysis.
Such a study may even utilise the findings of objective technology policy analysis, but will add benchmarks, performance indicators or interpretations of results in terms of the desirable and the undesirable. Like objective technology policy analysis this type of analysis will observe aspects of public technology policy, past or present. However, unlike the former, it will attempt to influence the direction of policy.
Speculative technology policy analysis
The two previous types of technology policy analysis share two important aspects. Firstly, they each refer to actual observations of data about the policy process, whether it be participants, programmes, policy instruments or indicators of policy outcomes. Secondly, they are primarily concerned with the past and the present, depending on the type of study. Even a normative technology policy analysis will present judgements and recommendations based upon past or present events or observations.
Speculative technology policy analysis judges public technology policy and makes recommendations based upon a perceived or postulated future state of affairs. That future state of affairs need not be the result of simple guesswork. The future scenario will often be based upon the statistical process of extrapolation. Here, past and current trends, for instance in the characteristics of technologies or in the use of various technologies, are analysed and their continuation into the future is projected.
This process often relies upon technology forecasting, an activity carried out by analysts in Canadian, Finnish and Swedish ministries and particularly popular from the 1960s to the 1980s.
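To make the extrapolation step concrete, the following is a minimal sketch (in Python) of how an analyst might project a past trend a few years into the future using a simple least-squares line. The broadband-penetration figures and variable names are hypothetical illustrations by the present editor and are not drawn from any of the Canadian documents discussed below.

```python
# Illustrative only: a linear-trend projection of the kind a speculative
# technology policy analysis might rest on. All figures are hypothetical.

def linear_fit(xs, ys):
    """Ordinary least-squares slope and intercept for paired observations."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
            sum((x - mean_x) ** 2 for x in xs)
    return slope, mean_y - slope * mean_x

# Hypothetical observed trend: share of households with broadband access.
years = [2000, 2001, 2002, 2003, 2004, 2005]
share = [0.09, 0.15, 0.22, 0.28, 0.36, 0.41]

slope, intercept = linear_fit(years, share)

# Project the trend forward; a real exercise would add alternative scenarios
# (e.g. optimistic and pessimistic slopes) to manage the risk of surprises.
for future_year in (2006, 2008, 2010):
    projected = intercept + slope * future_year
    print(f"{future_year}: projected share {projected:.2f}")
```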
Is Finnish information technology policy up to the challenges of the future global society? Will Canada be able to compete with other OECD nations in the new 'digital era'? Does Sweden have the right technology industries to find a niche in the future integrated Europe?
Speculative technology policy analysis would be the typical type of policy analysis used to propose answers to questions such as these. While this type of policy analysis can collect current data and extrapolate upon it, it cannot actually collect data from the future. There are many events which it has no chance of foreseeing which can drastically alter the future scenario and render such an analysis completely worthless. This is, however, an inherent risk involved in all public and private planning exercises. For this reason, risk may be managed by positing several future scenarios, hoping that one of them will be adequately close to matching actual developments.
Regardless of how well designed and carefully calculated, this type of public technology policy analysis always involves a certain element of speculation. The particular emphasis of speculative technology policy analysis is to advance policy
recommendations based upon possible events in the future. Thus, it obviously attempts to influence public technology policy.
The dividing line between the three types of technology policy analysis discussed here may at times be very fluid. Some studies may use a combination of these types, depending upon the scope and nature of the study in question. However, based upon the factors presented here, policy studies can in general be placed into one primary category. In doing so, a better picture of the purpose and nature of the analysis is provided, while readers and users can better determine how to use the results and whether other types of analysis should be carried out to diversify the existing knowledge and provide multiple perspectives.
A summary of the types of technology policy analysis as well as the central factors used to distinguish them from one another can be found below in Figure 2.
Fig. 2. Technology policy analysis. Source: Own research, 2022
As depicted, objective technology policy analysis will tend to be carried out in specific types of studies such as general policy surveys, data books, case studies and observational studies which may utilise various indicators, without presenting benchmarks for performance or recommendations. All judgements as to proper objectives or policy performance are left to users of the analysis. Policy evaluations, performance assessments or planning exercises are all examples of normative policy analysis as are many think tank studies. These studies use benchmarks and recommend policies.
Examples of the types of studies which would likely utilise speculative technology policy analysis include policy assessments and policy planning studies
based upon technology forecasting or future studies for the purpose of meeting perceived future challenges. Such studies would be very likely to present clear policy recommendations and would seek to influence public policy in the area of technology.
The aspect of analytical scope
Included in this particular segment of the integrated model for public technology policy analysis is the aspect of scope. This should not be confused with the scope and limitations of a particular study as regards the time-frame, focus of the study or particular limitations such as those customarily presented in the beginning of an academic article or monograph. The aspect of analytical scope aids the analyst and users in understanding what has been examined and what has not been examined in relation to the complete process of policy determination, policy implementation and policy outcomes, as compared against a given complete scientific model encompassing everything that could be examined by a full-scope study.
The two simple distinctions made by Dye [4. P. 5-6] can be used as an example of a very basic model for policy analysis: the distinct components of policy determination and policy impact, which taken together constitute Dye's "model" of public policy and thereby all major aspects that can be examined by a full-scope policy analysis. If a hypothetical policy analysis were only to concern itself with the aspect of policy determination, then without the idea of analytical scope utilised within the context of Dye's complete model we might not automatically perceive important aspects excluded from the examination. However, with the complete model as our guide we know that Dye's model includes another major aspect: policy impact.
It can thus be said that our hypothetical policy study excludes policy impact from its analytical scope and that the analytical scope of the policy study is limited to the aspect or part of Dye's model, namely policy determination.
As the integrated model for public technology policy analysis is gradually created by uniting the individual segments including this one and thereby becomes a great deal more complex, the aspect of analytical scope will become an essential tool for placing a policy study into a systematic context and for denoting the possible limitations of a study or even current scholarship in relation to the entire model.
As is the case with the typology of technology policy analysis, analytical scope can be very beneficial to policy analysis users who must decide whether or not a study contains enough diversity of knowledge to be useful by itself. If not, those policy aspects clearly designated as having been left out of the analytical scope of the study can receive a separate examination. A policy study carried out in accordance with a scientific model and which includes every aspect or component of that model would be considered a full-scope analysis. Where the study only examines a limited number of the possible aspects or segments of that model, it would be considered a limited-scope analysis in relation to the entire model.
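As a purely illustrative aid, the two dimensions discussed so far, the type of analysis and its analytical scope relative to a reference model, can be represented as a small data structure. The sketch below assumes Dye's two-component model (policy determination and policy impact) as the reference model; all class, field and study names are the present editor's hypothetical shorthand for illustration, not part of the IMTPA itself.

```python
# A minimal sketch of the classification used in this segment, assuming
# Dye's two-component reference model. Names are illustrative only.
from dataclasses import dataclass
from enum import Enum


class AnalysisType(Enum):
    OBJECTIVE = "objective"        # observes and measures, no benchmarks
    NORMATIVE = "normative"        # benchmarks, goals, attempts to influence
    SPECULATIVE = "speculative"    # recommendations from a projected future


# Reference model against which scope is judged (here: Dye's two components).
REFERENCE_MODEL = {"policy determination", "policy impact"}


@dataclass
class PolicyStudy:
    label: str
    analysis_type: AnalysisType
    aspects_examined: set[str]     # subset of the reference model

    def is_full_scope(self) -> bool:
        """Full scope means every component of the reference model is covered."""
        return REFERENCE_MODEL <= self.aspects_examined

    def excluded_aspects(self) -> set[str]:
        """What a user would still need a complementary study for."""
        return REFERENCE_MODEL - self.aspects_examined


# Hypothetical example: a study that only examines policy determination.
study = PolicyStudy("hypothetical review", AnalysisType.NORMATIVE,
                    {"policy determination"})
print(study.is_full_scope())       # False -> limited-scope analysis
print(study.excluded_aspects())    # {'policy impact'}
```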
Evaluation of Previous Policy Analysis with the IMTPA
Documents used to analyse and evaluate Canadian technology policy during the period 1990-2005 can be analysed and classified systematically using the policy taxonomy segment of the integrated model for technology policy analysis. The period 1990-2005 was chosen because it was central and decisive for the formation of Canadian public policy relating to technological innovation and adoption of the then new technologies seen as crucial to an information society. Many of the policies adopted during this period form a crucial part of Canada's technology and innovation policy even today.
Canada is an important object of research for technology policies as the country is an industrialised OECD member state and has since the 1990s had an information society programme which has driven the country's technology and innovation policies. The transparency and accessibility of public policy documents, including online, has also rendered Canada an ideal object of research.
Since 1990, public policy analysis in the area of technology in Canada has been conducted by government bodies such as the Office of the Auditor General, Industry Canada, the Standing Committee on Industry, Science and Technology as well as the Advisory Council on Science and Technology (ACST). Additionally, external bodies such as the Council of Science and Technology Advisors (CSTA) have specialised in providing policy evaluation and guidance for Government of Canada internal activities in the area of technology development and proliferation.
Canadian government studies such as those carried out by the Office of the Auditor General (OAG) and Industry Canada have aimed to examine and evaluate the performance of federal technology policy in relation to the various goals stated in federal technology programmes, including Canada's previous Federal Science and Technology Strategy [11] which was approved in early 1996. This federal policy was in part a response to previous technology policy evaluation or analysis carried out by the OAG [12] in 1994.
The OAG study of 1994, which clearly fits the definition of a normative technology policy analysis document, briefly recounted the previous thirty years of federal technology strategies and planning and concluded in no uncertain terms that "there has been much activity but few results…" [12. Point 9.2]. But the normative nature of this central and highly critical evaluation of Canadian technology policy is further highlighted by its insistence that Canadian technology
policy should focus resources and attention upon activities and areas with the greatest potential payback [12. Point 9.3]. In addition, the main observations of the 1994 OAG analysis and evaluation of Canadian technology policy were that prior policy had not been clearly focused, that federal government as a whole had not been systematically involved in policy determination and evaluation, that parliament had no way to determine or judge the results of policy in this area and that indicators of technology policy results needed to be further developed to enable the effectiveness of national technology policy to be determined.
The Federal Science and Technology Review [11. P. 5] which was launched by the new Canadian government in 1994 and included consultations with both
experts and the Canadian public throughout various regions of the country was in itself a normative policy analysis exercise designed to result in a set of policy objectives. Those objectives were expressed in Canada's Federal Science and Technology Strategy for 1996 as set out in that same document. Between 1996 and 2004, the federal government published both normative performance analysis reports and objective data books on an annual basis to follow up on the results of the federal technology policy programme.
Both the normative and objective analysis in this context utilised indicators such as the percentage of national GDP invested in private sector, university and government research and development; the number of Canadian technical patents; distribution of scientific and technical articles by subject; Canadian share of global scientific and technical articles and the number of research scientists and engineers.
The primary federal advisory committees established for the purpose of providing expert advice in technology policy matters, such as the Advisory Council on Science and Technology (ACST) and the Council of Science and Technology Advisors (CSTA), which focuses on internal federal technology policy, each produced their own series of normative and speculative technology policy analysis reports. Between 1999 and 2005 the CSTA issued seven major reports on science and technology activities within the federal government. Not one of the CSTA reports was dedicated to the subject of information technology per se. However, since these reports were drafted for the purpose of providing the federal government with policy recommendations for its internal science and technology activities, the area of technology logically includes information technology activities on the general level. And these normative reports, in particular the LINKS, SCOPE [13] and BEST [14] reports, posited that the federal government has an active role to play as a facilitator of science and technology [15], that government needs to communicate interactively with citizens on issues of S&T policy [16] and that national S&T policy should be characterised by openness and accountability [13].
The seven reports issued by the CSTA during the period were also supported by independent studies carried out by consultants who were contracted by the CSTA to make representations with regard to the subject matter of each major CSTA report. However, since the studies of the independent contractors also did not deal specifically with the subject of federal information technology policy, their contents remain outside the scope of this review. Those government policy studies carried out by the ACST of the Prime Minister's Office during the period 2003-2005 were the result of that council's discovery roundtable workshops and meetings.
The content of the various background and roundtable reports [17; 18] is more varied in nature than those of the CSTA and includes topics ranging from diffusion and adoption of new technologies to the use of proactive government technology procurement as a tool to encourage through example the private sector to adopt new technologies. These roundtable reports would clearly be classified as speculative policy analysis, as they posited that a future state of affairs required the federal government to enact measures to assist the diffusion and adoption of new
technologies. This was to be done in order to promote the competitiveness of the Canadian business sector and to strengthen the national innovation system.
In addition to the above, three [19] of the background reports produced by or for the ACST attempted to provide normative policy analysis, focusing on the development of performance indicators and benchmarks. And one ACST background report [19] comes fairly close to providing objective technology policy analysis, making mere observations on how previous Canadian and foreign policies have gone about encouraging commercialisation of technological innovations.
Government-sponsored technology policy analysis in Canada has not only been complemented by the independent studies its various advisory bodies have contracted, but also by a limited number of academic or scholarly works since the mid-1990s. The study by Jennifer Jenson et al. examined Canadian information technology policy on an inter-provincial level and sought to evaluate how that policy had been integrated into practice within the educational sector. The data for this study were collected during the period 1999-2001, and it is important to note that their work entailed a small case study and was very limited in scope. Jenson et al. concluded that government-dictated information society policies and projects implemented in the education sector might not have been in touch with the reality of how information technologies are used by students and by teachers in the classroom.
Because it expressed evaluative views on how Canadian information technology policy had mostly failed to meet local needs in this particular sector and because it recommended changes to current policy, the Jenson study ought clearly to be considered a limited-scope, normative policy analysis focusing primarily on policy impact. In addition to focusing on S&T policy issues from vantage points ranging from the local to the federal and from S&T studies to the development and usability of indicators, the Centre for Policy Research on Science and Technology (CPROST) based at Simon Fraser University has produced a visible body of non-governmental and non-commercial innovation and technology policy analysis in the Canadian context. It is also one of the primary repositories of documents and raw data relating to the events leading up to and following the 1994 Federal Science and Technology Review referred to earlier in this section.
Several of the papers published by CPROST touch directly upon the subject of technology policy (in addition to science and research policy) and at more than a merely sub-national level. Of greatest interest to the present study is a report published in 2002 [21] which examined the background and nature of the 1994 Science and Technology Review undertaken by the Canadian government as a consultative precursor to new proposals for Canadian technology policy.
That study had the objective of finding out why the 1994 Federal Science and Technology Review was conducted and what the actual outcome was [21. P. 2]. This objective was part of a larger task which included gathering factual data regarding the 1994 Review for comparison with the then (2002) upcoming Science and Technology Review proposed by the Liberal government. Cruikshank and Holbrook indicated that the study had been intended as a "description of the
1994 federal S&T Review to provide a benchmark for measuring the new consultation process" [21. P. 2]. Methodologically, this study utilised content analysis of transcripts and notes from the various consultative meetings held in connection with the 1994 S&T Review as well as official reports from the government and interviews.
The study also utilised priority matching by examining the number of times a particular policy priority was brought forward by stakeholders during the consultative process and comparing this with those priorities expressed by the government at the conclusion of the 1994 S&T Review. The stakeholders supporting or introducing various policy priorities were profiled by their backgrounds, resulting in groupings such as "Academics" (largely scientists and technical experts), "Business", "Government" and "Other/Unknown" [21. P. 4]. In their analysis Cruikshank and Holbrook brought forth some interesting findings relating to policy determination and in particular regarding the participatory aspects [21. P. 12] of the 1994 Review. Their interview data indicated that "key officials in the government set the agenda for the national conference and hand-picked the participants to ensure the mix of views that they wanted".
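The priority-matching step described above is, in essence, a matter of counting mentions per stakeholder group. The brief sketch below illustrates that general idea with invented records; the mention counts do not reproduce the actual transcripts or tallies used by Cruikshank and Holbrook, although the group and priority labels echo those reported in the studies discussed here.

```python
# Illustrative only: counting how often each policy priority was raised,
# broken down by stakeholder profile, as in a priority-matching exercise.
# The records below are invented and do not reproduce the 1994 transcripts.
from collections import Counter, defaultdict

mentions = [
    ("Business",   "Improvement of Infrastructure"),
    ("Business",   "National S&T Competitiveness"),
    ("Government", "Excellence in S&T"),
    ("Government", "Technology Transfer"),
    ("Academics",  "Excellence in S&T"),
    ("Business",   "Improvement of Infrastructure"),
]

by_group: dict[str, Counter] = defaultdict(Counter)
for group, priority in mentions:
    by_group[group][priority] += 1

# Overall ranking of priorities across all stakeholder groups.
overall = Counter(priority for _, priority in mentions)
print(overall.most_common())

# Per-group support, which can then be compared with the priorities the
# government expressed at the conclusion of the consultative process.
for group, counts in by_group.items():
    print(group, counts.most_common(1))
```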
Most importantly, this study concluded that the 1994 Federal Science and Technology Review had, despite its seemingly broad and open process of consultations among stakeholders, only represented a public relations exercise by a federal government already intent on cutbacks of public monies to Science and Technology. Holbrook had previously analysed stakeholder voting patterns [22. P. 3-4] relating to Science and Technology policy priorities based upon information gleaned from the 1994 Federal Science and Technology Review. This 1996 study also divided stakeholders by profile into separate groups including Business, Education and Government.
These findings indicated that Business or Industry stakeholders gave highest support to policy priorities such as "Improvement of (Information Technological) Infrastructure" and "National S&T Competitiveness" while the strongest support for "Excellence in S&T", "Improvement of Infrastructure" and "Technology Transfer" came from stakeholders with a Government background or profile. Because the 2002 study was explicitly designed to create benchmarks for comparing the 1994 and upcoming 2002 consultations and their results (including "success or failure") and because that study clearly indicated that the 1994 Review had not met its own goals, the 2002 study by Holbrook and Cruikshank may aptly be classified as a normative policy analysis. It was purely concerned with the nature of policy determination in part through consultation with stakeholders. Therefore, the 2002 study was clearly a limited-scope analysis.
The 1996 study which focused on the same 1994 Federal Science and Technology Review must be said to have come closest to being an objective policy analysis as it solely concerned itself with the comparative policy support or priorities of the different categories of stakeholders. It was limited in scope, focusing on policy determination and priorities/interests of the various policy actors or elites. Therefore this study too can be considered a limited-scope policy analysis.
The government policy documents discussed and classified here, as well as the two external, non-government studies, have been collocated in Table 1 according to their characteristics as objective, normative or speculative policy analysis and as partial or full-scope analysis (utilising the aspects of policy determination and policy impact). Each document is named in accordance with its listing in the reference section of this article for better cross-referencing. From Table 1 it can be clearly seen that only two of the studies which pertain to policy determination (i.e. partial analysis) can be considered as objective policy analysis, while the majority of the government and other analyses covered are normative in nature. Only one of the normative analysis documents — that of the Auditor General's Office — could be interpreted as being a full-scope analysis. Of particular interest is the fact that no full-scope objective policy analysis could be identified from the government policy documents that were examined. This might have important implications when considering whether there is a need for a unified full-scope objective analysis with regard to this particular policy area.
Table 1
Summary Classification of Core Technology Policy Analysis Documents
Nature of policy analysis | Partial analysis (policy determination or policy impact) | Full-scope analysis (policy determination and policy impact)
Speculative | (Canada 2005 a); (Canada 2005 c); (Canada 2005 d) | not applicable
Normative | (Canada 1999 b); (Canada 2001 b); (Canada 2003 e); (Cruikshank and A. Holbrook 2002); (Canada 1994); (Canada 2003 b); (Canada 2003 c); (Canada 2003 f); (Jenson et al. 2007) | (Auditor General 1994)
Objective | (Canada 1996); (Canada 2003 a); (Canada 1999 d); (Canada 2000 d); (Canada 2002); (Canada 2003 d); (Canada 2004 b) | —
Note: Technology policy documents of central importance for Canadian technology policy are in bold. External non-government documents analysing Canadian technology policy and classified in Table 1 are in regular type.
Conclusions
In addition to being classified as tech-deterministic or socio-deterministic, technology policy analysis can be characterised as objective, normative or speculative in nature. An analysis can also be classified as limited or full-scope, for instance depending upon whether it considers aspects of policy determination or both policy
determination and policy impact. The utility of such a classification can become evident when there is a need to determine how comprehensive and varied past and current policy analysis documents have been and which aspects or analytical points of departure have been included or excluded. A more systematic and integrated model for the analysis of technology policy such as the proposed IMTPA could include a taxonomy or classification scheme for sorting policy documentation and clearly indicating whether there has been adequate variation and scope or whether government and other policy analysis has been concentrated to particular areas of the classification scheme.
This admittedly brief demonstration of the utility of this segment of the IMTPA, which includes classification according to policy type and scope, has shown that reviewed policy documents may tend to clump together in particular categories such as "normative, limited-scope analysis". While it does not explicitly indicate whether this is optimal or suboptimal, our demonstration has indicated that a collection of core policy documents and even private analysis may leave categories such as "objective, full-scope analysis" with minimal or no coverage. Such observations (assuming that they do not arise as a result of an inadequate sample of available documentation) can better indicate a clear area of prioritisation for future analysis and research. In future studies utilising this segment of the IMTPA, the national technology policy analysis profiles of several countries might be compared using this method and a possible correlation between these profiles and differences in national technology policies or priorities could be examined.
REFERENCES
1. Njalsson G.K.A. From Autonomous to Socially Conceived Technology: Toward a Causal, Intentional and Systematic Analysis of Interests and Elites in Public Technology Policy. Theoria. 2005;108:56-81.
2. Merriam-Webster's Desk Dictionary. Springfield: MA, Merriam-Webster; 1996. 634 p.
3. Dye T.R. Understanding Public Policy. Englewood Cliffs, NJ: Prentice-Hall; 1981. 368 p.
4. Dye T.R. Policy Analysis. What Governments Do, Why They Do It, and What Difference It Makes. University of Alabama Press; 1976. 13 p.
5. Weimer D.L., Vining A.R. Policy Analysis: Concepts and Practice. Upper Saddle River, N.J.: Prentice Hall; 1999. 417 p.
6. Stokey E., Zeckhauser R. A Primer for Policy Analysis. New York: W.W. Norton; 1978. 368 p.
7. Grumm J.G., Wasby S.L. The Analysis of Policy Impact. Lexington, Mass.: Lexington Books; 1981. 221 p.
8. Ruostetsaari I. Energiapolitiikka käännekohdassa: järjestöt ja yritykset vaikuttajina vapautuvilla energiamarkkinoilla. Tampere: University of Tampere; 1998. 235 p.
9. Walters L.C., Aydelotte J., Miller J. Putting More Public in Policy Analysis. Public Administration Review. 2000;60(4):349-359.
10. Tepper S.J. Setting Agendas and Designing Alternatives: Policymaking and the Strategic Role of Meetings. Review of Policy Research 2004; 21 (4): 523-542. https://doi.org/10.1111/j.1541-1338.2004.00092.x
11. Science and Technology for the New Century: A Federal Strategy. Technical report. Ottawa: Industry Canada; 1996. 38 p.
12. Report of the Auditor General of Canada / Rapport du vérificateur général du Canada. Technical report. Ottawa: Office of the Auditor General of Canada; 1994. 955 p.
13. Science Communications and Opportunities for Public Engagement (SCOPE). Technical report. Ottawa: Council of Science and Technology Advisors CSTA; 2003. 28 p.
14. Building Excellence in Science and Technology (BEST): the Federal Roles in Performing Science and Technology: A Report. Ottawa: Council of Science and Technology Advisors CSTA; 1999. 44 p.
15. Science Advice for Government Effectiveness (SAGE): A Report of the Council of Science and Technology Advisors. Ottawa: Council of Science and Technology Advisors CSTA; 1999. 15 p.
16. Science and Technology Excellence in the Public Service: A Framework for Excellence in Federally Performed Science and Technology. Technical report. Ottawa: Council of Science and Technology Advisors CSTA; 2001. 38 p.
17. The Diffusion and Adoption of Advanced Technologies in Canada: An Overview of the Issues. Advisory Council on Science and Technology; 2005. 23 p.
18. Government-led Industrial Assistance Programs. Technical report. Advisory Council on Science and Technology; 2005. 23 p.
19. Goals, Strategies and Priority-Setting for R&D and Commercialization: A Survey of International and Provincial Practices. Advisory Council on Science and Technology; 2003. 98 p.
20. Commercial Innovation: A Policy Stocktaking. Technical report. Advisory Council on Science and Technology; 2003. 23 p.
21. Cruikshank A., Holbrook A. The 1994 Federal Science and Technology Review. Vancouver; 2002. 19 p.
22. Holbrook J.A.D. Analysis of Voting Patterns and S&T Issues: The National S&T Conference, Ottawa, October 1994. Vancouver: Simon Fraser University: CPROST; 1996. 11 p.
Information about the author:
Gunnar K.A. Njalsson — Master of Public Administration, Licensed Vocational Instructor, Doctoral Candidate in Public Administration, University of Lapland (Finland) (e-mail: gnjalsso@ulapland.fi).
Toward an Integrated Model for Public Technology Policy Analysis — a Taxonomy Useful for Determining Scope and Type of Analysis
G.K.A. Njalsson
University of Lapland, 8 Yliopistonkatu, 96300 Rovaniemi, Finland
Abstract. The purpose of this article is to develop and present one further part or segment of an integrated model for public technology policy analysis (IMTPA) and to demonstrate its methodological and analytical utility using central policy analysis documents from Canada for the period 1990-2005. The article limits itself to the part of the IMTPA concerned with the type and scope of public technology policy analysis to be undertaken — a methodology which might better guide and make more transparent both the policies being examined and the policy analysis process itself. Methods. The article applies the proposed taxonomy to the most important policy documents which shaped Canadian information society policy in the period 1990-2005. The taxonomy sorts guiding documents, according to their nature and main content, into the categories of objective, normative or speculative policy analysis and, further, into limited-scope or full-scope policy analysis. Results. The policy analysis carried out by and for the Government of Canada was primarily normative analysis of limited scope. Although both objective and speculative analysis was conducted, objective analysis concentrated mainly on policy impact and to a much lesser extent on the process of policy determination, in which the participation and interests of stakeholders are an important factor. No objective full-scope analysis was carried out during the period under review.
Keywords: public administration, technological and innovation policy, development of scientific methods and models, Canadian technology policy
Conflicts of interest: The author declared no conflicts of interest.
Article history:
The article was submitted on 24.02.2023. The article was accepted on 10.04.2023.
For citation:
Njalsson G.K.A. Toward an Integrated Model for Public Technology Policy Analysis — a Taxonomy Useful for Determining Scope and Type of Analysis. RUDN Journal of Public Administration. 2023;10(2):269-285. https://doi.org/10.22363/2312-8313-2023-10-2-269-285
Information about the author:
Gunnar K.A. Njalsson — Master of Public Administration, Licensed Vocational Instructor, Doctoral Candidate in Public Administration, University of Lapland (Finland) (e-mail: [email protected]).