ARE THE INVESTMENTS IN PERFORMANCE MONITORING AND EVALUATION SYSTEM PAYING OFF? A CASE STUDY ON FREE STATE OUTCOME BASED PRIORITY 1: QUALITY BASIC EDUCATION MANAGEMENT APPROACH

By Masekhanya Alina Matsie

A Mini Dissertation submitted to the University of the Free State

MASTER IN DEVELOPMENT STUDIES

Supervisor: Professor J. Strauss

Approval Page

This research has been examined and is approved as meeting the required standards.

Supervisor: Professor J. Strauss, University of the Free State
External Examiner: Patrick Barmby, University of the Witwatersrand
Dean, Masters of Development Studies: Professor L.J.S. Botes, University of the Free State

Statement of Originality

This is to certify that, to the best of my knowledge, the work contained in this dissertation was carried out by the author at the University of the Free State between June 2013 and December 2014. I certify that the intellectual content of this dissertation is the product of my own work and that all the assistance received in preparing this dissertation, and the sources used, have been acknowledged.

Name: Masekhanya A. Matsie
Signature: ................

Acknowledgments

Foremost, I would like to express my sincere gratitude to my supervisor, Prof. J. Strauss, for the patient guidance and prompt responses to my questions and queries throughout my time as his student. I would like to thank the following enthusiasts and supporters: Prof. Moses Sindane, Dr Motsamai Motsoari, and Mr Andile Makapela, for their continued motivation and encouragement. A very special thanks goes to Ms. Geraldine Muthambala at the Department of Education; I recognise that this research would not have been possible without the efforts she made to facilitate interactions with participants. I must also express my gratitude to my best friend, Selone Nthethe, without whose love, encouragement and patience I would not have finished the study.
I would also like to thank my family for their support throughout my entire life and, in particular, my father and mother, Makhate and Dineo Matsie, for being my source of love and energy.

Abstract

For over 15 years of democracy, South Africa has developed and adopted a series of policies through which government strives to address divergent societal problems and, at the same time, enhance the wellbeing of all South Africans. Monitoring and evaluation (M&E) was introduced in the Presidency to strengthen management functions that are interactive and mutually supportive in government at all levels (The Presidency, 2010b). The Outcome Based Approach is a strategy used to institutionalise the Performance Monitoring and Evaluation (PME) system, and it is the fundamental management control for ensuring that government activities and processes are completed in a way that leads to the attainment of its goals. In the Free State, the introduction and implementation of the OBP approach began in 2011, following the guide to the Outcomes Approach from the Presidency that describes the government's PME system. This study sought to determine how the investments in the Performance Monitoring and Evaluation System are paying off through implementation focused on the intergovernmental implications. The study investigated the improvement the OBP approach has had on intergovernmental implications in quality basic education, rather than an individual focus by departments, and the changing behaviour and attitudes of participants in implementing the full delivery chain. Data for the study were collected using an exploratory design to gain insight into the situation, and grounded theory was applied to develop a hypothesis. The PME system is beneficial to government, as it improves how departments conduct the monitoring of their programmes.
However, the approach needs additional support for more effective planning and monitoring of the Outcome Based Priorities, as well as further institutionalisation of the approach across all lines of work done in departments. There was also recognition that M&E was not fully implemented within the departments; however, establishing connections between managers has helped M&E managers to work together and communicate with one another. The main purpose was to identify the challenges in the PME system in order to find possible solutions observed from the OBP approach implementation.

A list of Acronyms

DSD     Department of Social Development
FS      Free State
FSGDS   Free State Growth and Development Strategy
GDP     Gross Domestic Product
GWM&E   Government-Wide Monitoring & Evaluation
HODs    Heads of Department
MEC     Member of Executive Council
MPAT    Management Performance Assessment Tool
MOU     Memorandum of Understanding
MTSF    Medium Term Strategic Framework
M&E     Monitoring and Evaluation
NSDP    National Strategic Development Plan
OBP     Outcome Based Priorities
PGDS    Provincial Growth and Development Strategy
PME     Performance Monitoring and Evaluation
RDP     Reconstruction and Development Programme
SA      South Africa
SACR    Sports, Arts, Culture and Recreation
TOR     Terms of Reference
TWG     Technical Working Group

A list of Figures

Figure 1.1: The structure of the dissertation .......................................... 14
Figure 2.1: Organogram showing the development strategic policies in South Africa ......... 20
Figure 4.1: Research data collection and analysis process .......................................... 40

A list of Tables

Table 3.1: Classification of framework design types ..........................................
23
Table 3.2: Summary of the research design highlighting the research questions ............. 35
Table 4.1: Examples of conceptual categories from open coding .............................. 41
Table 4.2: Interpretation criteria ......................................................... 42
Table 4.3: Discussion of criteria findings ................................................. 43
Table 4.4: Rating on the improvement the OBP approach has had on changing the behaviour and attitudes of participants in implementing the full delivery chain ..................... 50
Table 4.5: Notional Category ............................................................... 53

A list of Appendices

Appendix 1: Questionnaire .................................................................. 74
Appendix 2: Conceptual categories from open coding ......................................... 75

Table of Contents

CHAPTER 1: INTRODUCTION AND RATIONALE ...................................................... 9
1.1 Introduction ........................................................................... 9
1.2 Background of the research ............................................................. 9
1.3 Problem Statement ...................................................................... 11
1.4 Research Objectives .................................................................... 12
1.5 Rationale and Justification of the study ............................................... 12
1.6 Limitations of the study ...............................................................
13
1.7 Key words and concepts ................................................................. 13
1.8 Chapter Outline ........................................................................ 13
CHAPTER 2: LITERATURE REVIEW AND BACKGROUND ................................................ 16
2.1 Introduction ........................................................................... 16
2.2 The education system history in South Africa ........................................... 16
2.2.1 The Education management system in South Africa ...................................... 17
2.2.2 Education expenditure in South Africa ................................................ 19
2.3 Free State Province Strategic Plan ..................................................... 20
2.4 Monitoring and Evaluation in the Free State Province ................................... 22
2.5 Introduction of Performance Monitoring and Evaluation .................................. 23
2.5.1 Outcome Based Priorities (OBP) approach used in PME system ........................... 24
2.5.2 Implementation and management through OBP approach ................................... 26
2.6 Conclusion ............................................................................. 27
CHAPTER 3: RESEARCH DESIGN ................................................................. 29
3.1 Introduction ........................................................................... 29
3.2 Sampling ............................................................................... 32
3.3 Data Collection Strategy ............................................................... 33
3.4 Questions for the individual interviews ................................................
35
3.6 Limitations ............................................................................ 37
3.7 Ethical Considerations ................................................................. 37
3.8 Conclusion ............................................................................. 38
CHAPTER 4: DATA COLLECTION AND ANALYSIS .................................................... 39
4.1 Introduction ........................................................................... 39
4.2 Response profile and rate .............................................................. 39
4.3 Open Coding ............................................................................ 41
4.3.1 Criteria of analysis ................................................................. 43
4.4 Axial Coding ........................................................................... 52
4.5 Selective Coding ....................................................................... 56
4.6 Conclusion ............................................................................. 56
CHAPTER 5: DISCUSSION AND RECOMMENDATIONS .................................................. 58
5.1 Introduction ........................................................................... 58
5.2 Discussion and Recommendations ......................................................... 58
5.3 Conclusion ............................................................................. 66
5.4 Recommendations for Future Studies ..................................................... 66
5.5 Final Conclusions ......................................................................
67
REFERENCES ................................................................................. 69

CHAPTER 1: INTRODUCTION AND RATIONALE

1.1 Introduction

Monitoring and evaluation (M&E) has taken on increasing importance in management functions that are interactive and mutually supportive. In South Africa, a Performance Monitoring and Evaluation (PME) cabinet was introduced in the Presidency to serve as a monitor of the performance of government at all levels (The Presidency, 2010b). The government has faced a continuing struggle to demonstrate progress made toward service delivery, with PME proposed as a solution to many of the dilemmas faced by the country. The Outcome Based Priorities (OBP) approach is used for managing the performance of government planning, thus improving the focus on outcomes for the whole country, as well as improving the provision of basic services. The purpose of monitoring activities in government interventions is to ensure that they are being accomplished as planned, and to correct any significant deviation from intended achievements (Robbins & Decenzo, 2004). The Outcome Based Approach is the fundamental management control for ensuring that government activities and processes are completed in a way that leads to the attainment of its goals. Performance Monitoring and Evaluation (PME) guides the Outcome Based Approach, mainly through the implementation process, reviewing overall performance in terms of input use, progress of programmes, and outputs.

1.2 Background of the research

In the Free State, the introduction and implementation of the OBP approach began in 2011, following the guide to the Outcomes Approach from the Presidency that describes the government's PME system.
The Outcomes Based Priorities (OBP) approach is a new approach in government, and the following are proposed as aimed at improving government (The Presidency, 2009: 15): the creation of a focus on sectors, rather than on departments, and the intergovernmental implications; the measurement of politically designated outcomes for accountability; priority given to a few sectors; and an emphasis on accountability throughout the service delivery chain. Following this new approach to performance monitoring and evaluation management in government, the study seeks to conduct an overview of the PME system through assessing:

• how the OBP approach is being used;
• how the PME system has improved Quality Basic Education management; and
• the barriers and challenges to using the OBP approach.

The institutionalisation of Performance Monitoring and Evaluation (PME) in the South African Government is a 2010 intervention necessitated by the need to counteract the legacy of poor performance in public service institutions. Monitoring and Evaluation (M&E) has become an increasingly important factor in the development process of government (Lahey, 2010: 17). The enforcement of PME in the South African Government was introduced by the Presidency, and it is applied in the implementation of five key priority areas, one of them being Quality Basic Education. Education has been seen as a catalyst to be focused on in order to improve the futures of children competing for better opportunities, such as good employment (The Presidency, 2009: 12). Since 2004, the public sector has performed well in implementing government programmes, initiatives and services, which were found to have improved, particularly for citizens whose quality of life was neglected under apartheid. However, the government has not performed optimally in relation to public perceptions and expectations (The Presidency, 2009: 4).
Post-1994, the all-inclusive South African Government inherited one of the most unequal societies in the world and has since been working to reverse the unjust situation to create a country that is better for all who live in it (Jansen & Taylor, 2003: 11). In 2009, the South African government was affected by service delivery protests which forced the newly appointed President, Jacob Zuma, to create a new administration on entering his term of office (Hamill, 2009: 34). Failure to produce what the South African president called 'visible and tangible socio-economic development' within a relatively short time frame led to new governmental machinery, namely a Performance Monitoring and Evaluation Unit, to enhance co-ordination and, ultimately, to facilitate delivery and improved governance at all levels (Hamill, 2009: 35). To create a government that was more effective in its actions and efficient in its activities, the Outcome Based Priorities (OBP) approach was introduced, and the implementation thereof focused on the strategic objectives derived from the 5-year electoral mandate period. The following areas were prioritised: Education, Health, Jobs, Rural Development and Safety (The Presidency, 2009: 12). Applying the OBP approach in government was needed to make it compulsory for attention to be given to the Full Delivery Chain, namely: Inputs; Activities; Outputs and Outcomes; and the need to do ordinary things well.

1.3 Problem Statement

Monitoring and Evaluation of government interventions is seen as vital for the control of projects that are transparent, resourceful, and performing well. The Free State provincial government's electronic, web-based M&E system started during 2004, before the national Government-Wide Monitoring & Evaluation (GWM&E) system framework was developed, but the fundamentals were the same (The Presidency, 2007). The monitoring and evaluation practices in the province could therefore easily be adapted in line with the national process.
The performance measurement efforts in government are highly administratively focused, particularly concerning planning, programming, and budgeting. Appraisals of government performance are concerned primarily with assessing the relationship of inputs to costs and the value of cost-reduction activities in these systems, adapting techniques from the larger field of management science (Heinrich, 2002: 712).

The Free State provincial government M&E system was focused on those administrative activities, namely: time, money and management activities. The system was created to systematise departments' planning and working together. Despite the substantial resources invested in this system to manage the integration of the province's programmes, working and planning in isolation amongst the departments still existed.

1.4 Research Objectives

The objectives of this study are therefore to:

• determine how the investments in the Performance Monitoring and Evaluation System are paying off through implementation focused on the intergovernmental implications;
• determine how the investments in the PME System are paying off through changed behaviour in implementing the full delivery chain;
• outline the best practices in Performance Monitoring and Evaluation systems during OBP 1 implementation, in order to guide and inform the processes of other OBPs implemented in the province;
• identify the best practices in Performance Monitoring and Evaluation systems in government from the literature; and
• make recommendations in order to lessen the challenges faced in the Performance Monitoring and Evaluation system.

1.5 Rationale and Justification of the study

The Outcome Based Approach was introduced in government, among other things, to instil the good management practices required to improve the efficiency of how government plans, works and implements its programmes.
The study is conducted to assess how the investments made in the Performance Monitoring and Evaluation System are paying off, through assessing the improvements the OBP approach has made in Quality Basic Education (OBP 1) management. Since its implementation in 2009, there is the requirement to conduct an assessment of how the approach has paid off through changing behaviour in implementing the full delivery chain, and to make recommendations which will lessen previously experienced challenges and can be considered for improvement in the next electoral cycle (2014-2019).

1.6 Limitations of the study

The OBP approach was used to address 12 outcomes in the country, focused on Quality Basic Education, Health, Rural Development, Job Creation and Safety. This study is focused only on the implementation of OBP 1 (Quality Basic Education) in the Free State, the findings and experiences of which may not be generalisable to all the other Outcomes applied.

1.7 Key words and concepts

Key words: Monitoring and Evaluation, Performance Monitoring and Evaluation, Outcome Based Priorities, Quality, Basic Education

1.8 Chapter Outline

The dissertation is arranged under the following chapters:

Chapter 1: Introduction and Rationale
It describes the problem and its background, the purpose and significance of the study, and the research design methodology, as well as providing the key words.

Chapter 2: Literature Review and Background
It discusses the literature review for the study. The literature review is compiled through the collection of relevant publications, books, legislation, documents, files, reports, and speeches that explore government management, the impact of monitoring and evaluation, various management techniques employed in different countries, and the OBP approach.

Chapter 3: Research Design
This chapter describes the research design and methodology of the study.
It describes and justifies the methods and processes that were employed to collect the data used in answering the research questions. The exploratory research method was chosen for the study.

Chapter 4: Data collection and analysis
This chapter presents the findings of the study. It describes the research study by highlighting the survey's outcomes, and discusses the findings of the surveys. Prior to the presentation of the findings of each investigative question, the criteria for data analysis and interpretation are set.

Chapter 5: Findings and Recommendations
This chapter presents the findings of the research study and the recommendations on how to mitigate the challenges faced during the OBP approach to managing Quality Basic Education (OBP 1).

The structure of the dissertation is summarised in Figure 1.1.

[Figure 1.1: Structure of the dissertation (source: own). The figure links five stages across the chapters: Chapter 1 (introduction, research objectives, rationale and justification of the study), Chapter 2 (literature review and background, including the history of M&E, the history of the education system in South Africa, and literature on the Outcome Based Approach), Chapter 3 (research design and methodology, with the research questions finalised), Chapter 4 (the main study conducted, with data collection and analysis), and Chapter 5 (discussion, recommendations and summary of evidence). The research questions shown are:
Q1: Changes in how you manage basic education.
Q2: Whether the PME system created a focus on the intergovernmental arrangement between the departments' contributions to OBP 1.
Q3: Changed culture in how you manage basic education, applied in your overall management responsibilities.
Q4: Whether the approach brought any changes (positive or negative) to how you do your overall work.
Q5: How the investments in the PME System paid off in changing your behaviour and attitudes in implementing the full delivery chain.
Q6: Requirements to improve the OBP approach.]

Figure 1.1 illustrates the link between the contextual research, the analysis of the current situation in implementing OBP 1, and the period of refining the methodology. The literature review runs concurrently with the creation of the research tool and the period of collating the research data. The evidence from the research is used to guide the next planning and implementation of OBP 1.

CHAPTER 2: LITERATURE REVIEW AND BACKGROUND

2.1 Introduction

The literature review presented is collected from relevant publications, books, legislation, documents, files, reports, and speeches that explore the related literature on government management processes. It also reviews the literature on the impact of monitoring and evaluation techniques on management employed in different countries. The sections to be presented are as follows: the history of the education system in South Africa; the Free State Province Strategic Plan; Monitoring and Evaluation in the Free State government; a critical view on government non-delivery performance; and the introduction of Performance Monitoring and Evaluation (PME).
2.2 The education system history in South Africa

Prior to 1994, South Africa experienced decades of social and economic discrimination against black South Africans that left a legacy of inequality along racial lines. The Bantu Education Act of 1952 ensured that blacks received an education that would limit their educational potential and relegate them to the working class. It further tied the content of learning to racial inequalities by preventing access to further education (Ocampo, 2004).

The education system in South Africa has been a central part of the country's reconstruction and development post-1994. Projects focused on two areas: firstly, to overcome the devastation of apartheid by providing a system of education that builds democracy, human dignity, equality and social justice; secondly, to have a system of lifelong learning that enabled South Africans of all races to respond to the enormous economic and social challenges (Steyn, 2001: 332). The South African government needed to provide all South Africans with basic education. This, according to the Bill of Rights in the Constitution of the country, is an obligation on the state, through reasonable measures, to progressively make education available and accessible (SA Constitution, 1996: 10). With the population roughly 78 percent black, 10 percent white, 9 percent coloured, and less than 3 percent Indian (Jansen & Taylor, 2003: 15), access to public education was limited and quality was poor. Black people were provided with substandard education (National Planning Commission, 2011: 14).

2.2.1 The Education management system in South Africa

The South African education structure consists of the National Department and Provincial Departments.
While the National Department of Education is responsible for preparing government policy on education and training, the Provincial Departments of Education are guided by these policies when setting their own priorities and implementation programmes (Education Strategic Plan, 2010). The Provincial Departments of Education promote the translation of the education and training policies of the Government, and the provisions of the Constitution, into a national framework. The creation of a single National Department of Education that replaced the nineteen racially, ethnically, and regionally divided "departments of education" was an accomplishment of the early years (Jansen & Taylor, 2003: 12). The National Department of Education is the overseer of the education system of the Republic of South Africa, and it took on this role in the first five years of educational reconstruction, which focused on systemic reform geared to dismantling apartheid-created structures and procedures (SAinfo Material, 2013).

The first phase focused on integrating formerly divided bureaucracies into a new system without a breakdown in service delivery. One national and nine provincial education departments were established (Education in SA, 2001: 5). With nine provincial departments (one for each of the nine provinces), there was a need to focus on their nature and role. Once the first phase of education reform, replacing minority rule, was finished, there was a need to improve the organisational cultures of working together in the altered system, to mould the new government in its procedures and to improve performance and outcomes, teamwork and customer-focused service (Education in SA, 2001: 7).

After five years of systemic transformation through the development of policies, the focus of the government shifted to policy implementation.
Makinde (2005: 63) states that policy implementation in developing countries can be halted by factors such as social, political, economic and administrative variables in policy formulation. Furthermore, corruption, a lack of continuity in government policies, and inadequate human and material resources can also lead to problems in policy implementation. After the policies on education reform were set, the next phase for the South African government was to urgently grow an educated and skilled population (Education in SA, 2001: 5). This phase entailed policy implementation dealing with the challenges in previously disadvantaged schools. "Tirisano" was launched to create a focus on nine priorities, which were addressed in five programmes (Steyn, 2001: 336). Programme 5 deals with the organisational effectiveness of the national and provincial systems. The programme had one priority: we must make our provincial systems work by making co-operative governance work (Implementation Plan for Tirisano, 2004: 21). Four projects were proposed to achieve this priority:

• Project 1: Integrated planning and budgeting processes - to develop planning tools to support the policy and budget processes through the alignment of national and provincial plans and budgeting (Implementation Plan for Tirisano, 2004: 21).
• Project 2: Monitoring, evaluation and accountability - to establish monitoring and evaluation mechanisms that enable the assessment of the performance of the education and training system, including the impact of implementation plans and strategies (Implementation Plan for Tirisano, 2004: 22).
• Project 3: Systems development and co-ordination - to ensure that integrated and functional administrative and management systems are established to support the policy, planning, budget and implementation processes (Implementation Plan for Tirisano, 2004: 22).
• Project 4: Organisational restructuring and human resource development strategy - to ensure the development of appropriate organisational and human resource capacities in the national and provincial departments in the context of co-operative governance (Implementation Plan for Tirisano, 2004: 22).

The principle of co-operative governance also came about in mid-1998, and this was for the National Department of Education to actively intervene in key transformation initiatives by reviewing mechanisms to strengthen intergovernmental relations (Education in SA Achievements, 2001: 15). The strengthening of intergovernmental relations was intended to create a better information flow between the national and provincial departments, regular monitoring and evaluation of provincial activities, and the submission of a quarterly report to the President.

In 2009, the National Department of Education was split into two ministries: Basic Education, and Higher Education and Training. OBP 1, which is focused on Basic Education, covers primary and secondary education as well as early childhood development centres (SAinfo Material, 2013).

2.2.2 Education expenditure in South Africa

Expenditure on education in South Africa (SA) is an area on which the government focuses to improve quality basic education. Education is one of the most significant long-term investments a country can make, and South Africa, compared with most other countries, gives it a very large slice of the public pie: around 20% of total state expenditure (National Planning Commission, 2011: 14). After 1994, there was a significant increase in education expenditure under the democratic government, but more money is still needed to address the huge backlogs left by 40 years of apartheid education (SAinfo Material, 2013).
Education expenditure in South Africa accounts for almost 6 percent of GDP (over 1 billion rand), which is among the highest rates of government investment, with South Africa's teachers among the highest paid in the world (National Planning Commission, 2011: 14). Jansen and Taylor (2003: 19) argue that the increase in the budget allocation has brought significant improvement in the education system. Equality in school improvements (e.g. in infrastructure and basic services) is seen as a crucial requirement for improving education in the country.
2.3 Free State Province Strategic Plan
This literature review covers the Free State strategic plans' approach and their implementation, and further discusses the intergovernmental coordination plan. The Free State Provincial Government used a strategic document called the Free State Growth and Development Strategy (FSGDS) from 2005 to 2009 as a framework for both public and private sector investment, indicating areas of opportunity and development priorities. The FSGDS took into consideration the provincial, national, and external stakeholders' plans for the province through being aligned with the National Strategic Development Plan (NSDP). Furthermore, it was responsible for plans on creating, promoting and supporting the environment, institutions and mechanisms crucial for shared growth and development (Provincial Growth Development Strategy [PGDS], 2005: 13). It was developed to address the legacies of the apartheid spatial economy; to promote integrated and sustainable development in order to turn the tide against poverty; and to create employment opportunities.
Figure 2: Organogram showing the development of strategic policies in South Africa (Source: Mafole, Department of the Premier, 2012)
The FSGDS post-dated many other government interventions formed at national level. The policy evolution outlined in Figure 2 provides an indication of how the policy imperative has evolved and embedded integrated planning. The figure reflects the policy changes from 1994 to the introduction of the 2009 policy on the Outcome Based Approach. In 2011, the National Development Plan (NDP) was adopted by government, with a 20-year time horizon. The development plan is broken down into five-year deliverables through the Medium Term Strategic Framework (MTSF), consisting of the Outcome Based Priorities of government to deliver on the NDP towards the 20-year horizon. During 2014, the current government came towards the end of the electoral cycle. As soon as the new administration is inaugurated, it is expected that the new MTSF will be adopted by the new Cabinet for implementation across all levels of government. The Free State Growth and Development Strategy (FSGDS) harnessed the government's endeavours and created a common development vision and direction for the province. It addressed the key social, economic, environmental and spatial imperatives in the province, to be followed by all spheres of government to effectively use scarce resources within the province, whilst addressing the real causes of development challenges. The FSGDS identified many investment opportunities and provided a platform to promote intergovernmental coordination between the three spheres of government (PGDS, 2005: 4). The intergovernmental coordination was maintained by a monitoring and evaluation process conducted through an electronic system. Achievements on the FSGDS' progress were managed and implemented through the use of this electronic system, which ensured an outcomes-based focus against set milestones for the FSGDS (PGDS, 2005: 18).
The Reconstruction and Development Policy (RDP) was established to integrate growth, development, redistribution and reconciliation during a period of significant institutional transformation and the introduction of new policies aligned with the new democratic constitution. This was a period which laid the foundations for delivery in all three spheres of government through a contract with the people. The period 2004-2009 was a time of building on the foundations laid previously, with accelerated and shared economic growth and development and an emphasis on job creation, infrastructure investments, poverty reduction and skills development. The current electoral cycle, 2009-2014, is a period of strengthening government coordination, planning and monitoring.
2.4 Monitoring and Evaluation in the Free State Province
Through the development of the FSGDS, the Free State Province set out to contribute to the vision of "a unified, prosperous Free State that fulfils the needs of all its people" (PGDS, 2005: 5). The province developed a website-based M&E system to support project initiatives and manage the information system for the FSGDS initiative. Cimdins and Skinkis (2011: 14) state that the development of policy on monitoring instruments needs to promote applicable and effective policy implementation and decision-making, based on the regular and systematic verification of resources, actions and results. The Free State M&E system provided regular and time-specific reporting on the progress of projects, and readily available and reliable information. As a prerequisite for an effective and efficient public sector, the M&E system was introduced to improve the achievement of the desired results of economic growth, social development, and poverty alleviation (Kusek & Rist, 2011: 2). Goldman and Nel (2005: 5) define monitoring as the routine checking of information on progress, so as to confirm that progress occurs in the defined direction.
The M&E system monitored the performance indicators defined in respect of the provincial priorities, strategies and objectives. This function was performed by means of traffic lights within the system to indicate progress. The separate departmental subsystems updated the overall provincial M&E system (PGDS, 2005: 170). The monitoring and evaluation system was required to ensure an outcomes-based focus against set milestones for the FSGDS; however, such movement towards a results-based approach in public sector management did not sufficiently exist. The information provided was unable to expand information systems beyond the traditional reporting on inputs, activities and outputs so as to include outcomes and impacts (Kusek & Rist, 2011: 1). A lack of focus on programme management created a breach in determining whether government interventions brought the desired outcomes at community level. The monitoring and evaluation system provided a data analysis component for use on all administrative levels (national, provincial, and local) but, unfortunately, the elaboration of solutions on the development planning information system could not be completed.
2.5 Introduction of Performance Monitoring and Evaluation
South Africa is the first country in the world to place monitoring and evaluation at cabinet level; this was a major step towards institutionalising M&E, rather than its remaining an aspiration (Hamill, 2009: 4). Performance Monitoring and Evaluation was introduced in the South African government to make it compulsory for government to give attention to the full delivery chain, namely inputs, activities, outputs and outcomes.
Performance Monitoring and Evaluation was needed to support:
• budget decision-making;
• performance-based budgeting;
• national, sectorial and subnational planning;
• the design of new policies and programmes;
• governments in their management; and
• the strengthening of accountability relationships (Mackay, 2006: 9).
Monitoring was needed to complete procedures, including the fixing of changes, evaluation, decision-making and implementation control (Cimdins and Skinkis, 2011: 15). The introduction of PME is focused on addressing the fundamentals of good governance, namely public sector accountability and transparency, government performance, and leadership and management (Seemela & Mkhonto, 2007: 1). The Performance Monitoring and Evaluation system is used to implement the 12 outcomes that collectively address the main strategic priorities of government (Guide to Outcomes Approach, 2010).
2.5.1 Outcome Based Priorities (OBP) approach used in the PME system
The Outcome Based Priorities (OBP) management approach was introduced to increase performance-evaluation activities at all government levels. Following the above-mentioned insufficiency of government services and delivery in the literature review, the Presidency saw the approach of doing things differently as a way to increase the impact of government interventions. The South African government identified 25 to 30 outcomes, which relate to the five priority areas. The Outcome Based Priorities (OBP) approach, as an implementer of PME in the country, is proposed not only to measure outcomes and outputs but also to guide the direction of policy implementation (The Presidency, 2009: 7). The introduction and application of fewer outcomes to be addressed through the OBP approach was to create a focus on key areas expected to make a significant change in government implemented programmes.
The introduction of Outcome Based Priorities (OBP) management gives attention to the full delivery chain, and the approach follows a four-step process (Guide to Outcomes Approach, 2010).
Step 1: Adaptation of a set of key strategic outcomes with measurable outputs and key activities.
In the Free State, work on identifying each department's strategic objectives aligned to the OBP document produced at national level commenced in 2009. This process formed part of Step 1, the adaptation of key strategic outcomes. Heads of Departments (HODs), as well as some senior executives in the provincial departments, worked together to develop a provincial OBP document. The OBP approach required planning in the province to function in a reverse manner: only after an outcome had been identified was the best way of achieving it determined (Guide to Outcomes Approach, 2010: 10). Furthermore, the OBP approach prescribed horizontal working together in government; all the departments that served or contributed towards the achievement of quality basic education worked together to address this outcome. The OBP approach addresses the sectorial planning and operational character of state administration, which is an obstacle in the implementation of development documents (Cimdins and Skinkis, 2011: 16). The work in Step 1 preconditioned good governance by bringing strategy-focused leadership that supports and encourages employee participation in policy planning and implementation (Mothae and Sindane, 2007: 2).
Step 2: Performance agreements between the President and Ministers which outline high-level outputs, metrics and key activities for each outcome.
One key point in the implementation of the OBP approach is to improve the accountability of ministers and other executives for the outcomes for which they are responsible.
Accountability has traditionally focused mainly on compliance with regulations and less frequently on service delivery outputs; this performance management operated at the departmental and individual levels (The Presidency, 2009: 16). The accountability concept, as explained by Sindane (2009: 4), is used to obtain answerability for ministers' actions and behaviour, to the extent that they are legally required to answer for their actions. The proposed new regime shifts the locus of accountability towards outcomes, both politically and administratively.
Step 3: Convert the high-level outputs and metrics into a detailed Delivery Agreement with the key partners who need to work together to achieve the outputs.
As previously explained, the main activity in policy implementation is planning and programming (Mothae and Sindane, 2007: 11). Step 3 of the OBP approach entailed a process of appropriate individuals drafting a detailed action plan and time frames for the OBP plan. These selected individuals formed a working group, named the Technical Working Group (TWG), which worked on the alignment of the outputs and metrics and the response to outcomes.
Step 4: The establishment of effective coordination structures that would allow the partners to the Delivery Agreement to work together for the next four years in coordinating the implementation of the outcomes, reviewing progress and deciding on interventions when required.
An implementation forum, as explained in the Guide to the Outcomes Approach (2009: 10), needed to be formed to coordinate the work towards achieving each OBP; these implementation forums were referred to as Technical Working Groups (TWGs) in the Free State. The TWGs consisted of a facilitator (a Department of the Premier manager); a coordinator from the leading department (Education for OBP 1); and supporting departments (Sports, Arts, Culture and Recreation, and Social Development for OBP 1).
The central objective of the TWG was to determine, for each outcome, whether interventions were relevant and effective, as well as to recommend alternatives which would ensure better results (Terms of Reference [TOR] for Technical Working Group, 2011: 3).
2.5.2 Implementation and management through the OBP approach
The TWGs facilitated an integrated approach to the performance management, monitoring and evaluation of each OBP. The implementation of the OBP commenced with all programme managers in the departments serving in OBP 1 linking their strategic objectives and measures to the identified outputs. Work took place on refining the national targets to provincial targets, and on refining the outputs, sub-outputs and metrics on which the departments were expected to provide progress feedback. The rules of the OBP approach require the monitoring and evaluation of development policies on all administrative levels, as the PME system aims at reporting progress, identifying lessons, and making improvements during the time frame of government interventions. During the OBP approach planning, the elaboration of development plans, implementation monitoring, and the coordination of state-level development planning documents took place. The approach worked out a unified methodology for the elaboration of development programmes at provincial level, further giving recommendations on the process of elaboration and the structure of development documents, as well as the order of implementation and monitoring.
Focus on the Free State OBP 1: Quality Basic Education: The Departments of Education; Police, Roads and Transport; Human Settlement; Corporate Governance and Traditional Affairs; Sports, Arts, Culture and Recreation; Social Development; and Public Works were initially determined as the departments that needed to work together to convert the high-level outputs and metrics into achievable metrics that would address the provincial need.
The following outputs were focused on:
1. Improve the quality of teaching and learning;
2. Undertake regular assessment to track progress;
3. Improve early childhood development; and
4. Ensure a credible outcome-focused planning and accountability system.
Amongst all the departments identified as supporting departments for OBP 1, only Social Development, and Sports, Arts, Culture and Recreation remained supporting departments. The other departments identified could not be directly linked as role players to the outputs and indicators that had to be responded to in OBP 1. Although 11 OBPs were implemented in the province, this study focused only on OBP 1.
2.6 Conclusion
South Africa has come a long way in determining and implementing policies to improve the quality and accessibility of basic education in the country. From 1994, the policies developed have given a clear direction as to where the country is heading in its decisions around improving education. The main focus since 2009 has been on improving policy implementation through the introduced Outcome Based Approach. The literature reviewed provides information mainly on what has previously been planned to improve basic education.
CHAPTER 3: RESEARCH DESIGN
3.1 Introduction
The research design of the study describes and justifies the methods and processes that were employed to collect the data used in answering the research question. In pursuit of answering the research question, which is: How successful is the Performance Monitoring and Evaluation (PME) system in improving Quality Basic Education management through the OBP approach?, the researcher was required to obtain appropriate information from the study questions asked of the research participants.
The research followed an exploratory study to acquire knowledge on the OBP approach implemented, so as to determine whether it has improved the management of quality basic education in the Free State. The exploratory design method is designed to gain insight into a situation of which little is known, or where the interest is in developing a hypothesis that would probably be tested in another research study (Diem, 2002). In conducting this study, the researcher sought to develop concepts more clearly on what has been improved and to establish priorities on how the management of the OBP approach could be further improved (Blumberg, 2005: 201). The methodology employed for the design of the study was the classification framework of design types used by Mouton (2011: 146). The classification assisted in providing direction on how the research would be conducted. The different types of research under the respective dimensions are highlighted below:
Table 3.1: Classification framework of design types
Dimension | Type
Dimension 1 - Ranging from empirical to non-empirical (conceptual) | Empirical; Non-empirical
Dimension 2 - Primary or new data collection versus analysing existing data | Primary; Existing data; Hybrid
Dimension 3 - Type of data, ranging from numeric to textual | Numeric; Textual; Combination
Dimension 4 - Degree of control or structure in design | High; Medium; Low
Source: Own
Dimension 1
In this study, empirical research was applied; this method is defined as making new factual discoveries or confirming the existence of previously hypothesised phenomena (Mouton, 2011: 113). The researcher used this method to gain knowledge on how the use of the OBP approach has improved the management of quality basic education by means of direct observation or experience.
Dimension 2
The study also made use of hybrid data.
Hybrid data means that the researcher used both primary data, which refers to data the researcher collects her/himself, and secondary data, which are data that already exist (Mouton, 2011: 69). The researcher collected information from the participants to obtain direct information on how the OBP approach has improved their ways of working when managing quality basic education. The secondary data were used to provide interpretation of the primary data, as well as providing other opinions on the OBP approach.
Dimension 3
The researcher used the textual method to conduct a subjective assessment of the participants' experiences, attitudes, opinions and behaviours. The textual method, scientifically known as the qualitative method, emphasises words rather than quantification in the collection and analysis of data (Bryman, 2008: 697).
Dimension 4
The study design had a low degree of control. The research was conducted using individual interviews, in an environment in which the respondents were allowed to respond freely to the topic of discussion. There was no control over the responses provided and no questions to limit the responses (open-ended questions). A cross-sectional research type was conducted; this method looks at what is happening at a moment in time, is comparable to taking a snapshot at a particular time, and is representative of what is happening at the time when the research is carried out (Bailey et al., 2005: 43). To assess the success of the OBP approach in improving OBP 1 management, the researcher viewed a realistic picture of what was going on at a particular point in time and compared it to any changes brought about through the introduction of PME. This method was selected by the researcher due to the advantages it offers, such as the low cost of performance, the shorter time taken to respond, and the relatively short time taken to gather data (Bailey et al., 2005: 43).
In using the cross-sectional study, the researcher was aware that the data collected may have been outdated, as it may have been overtaken by events, and it may also have drawn an artificial picture of what was going on at a particular point in time (Bailey et al., 2005: 43). The logic of this research followed the inductive generalisation process, which is defined as drawing inferences from specific observations (e.g. a sample of cases) to a theoretical population (Mouton, 2011: 117). Muzinda (2007: 84) further explains the inductive method as making careful observations, conducting experiments, rigorously analysing the data obtained, and producing new discoveries or theories to explain what is happening. The researcher collected data about the OBP approach to explain how the current PME system had improved the management of quality basic education. The inductive research process was favoured over the deductive research method because the researcher did not seek to inquire into the specific expectations of a hypothesis developed on the basis of general principles, in order to prove or disprove it, as is required in a deductive study (Neuman, 2006). Grounded theory was the inductive method of choice for developing inductive theories that are grounded in systematically gathered and analysed data. The researcher did not begin with a theory from which hypotheses are deduced, but with a field of study and a research question; what is relevant to this question is allowed to emerge during the research process.
3.2 Sampling
A population is the universe of units from which a sample is to be drawn (Bryman, 2008: 697). From the population of managers applying and using the OBP approach to manage OBP 1, the researcher selected a sample using a probability strategy.
A probability strategy is defined as one based on random selection, with a controlled procedure which ensures that each population element is given a non-zero chance of selection (Blumberg, Cooper & Schindler, 2005: 235). These sampling techniques are used to remove the sampling bias which occurs in non-probability sampling, where a study can be conducted in a population whose elements are unknown. Bryman (2008: 697) defines a sampling strategy as gathering data on a small part of the whole parent population, or sampling frame, which is used to inform what the whole picture is like. In this study, the researcher used a stratified sampling strategy. Researchers usually cannot make direct observations of every individual in the population; thus, in this study, observations to make inferences about the entire population were conducted through this method. The total population of managers participating in the management of OBP 1 - quality basic education - were all part of the Technical Working Group (TWG). The facilitation of the OBP is managed by the Department of Education, with the other supporting departments being those of the Premier; Social Development (DSD); and Sport, Art, Culture and Recreation (SACR). Of all the members (24) in the OBP 1 TWG, only 10 formed part of the research respondents, and they were stratified according to the departments to which they belonged. The 10 participants (6 Education, 1 DSD, 1 SACR, and 2 Premier) selected are responsible for managing the planning and implementation of OBP 1 and are tasked biannually with providing feedback on their progress. The constituted sample was chosen for the following reasons:
• It was convenient for the researcher, because the sample was reachable for conducting interviews and the size was adequate for following up in order to increase the response rate for the study.
• The information on the TWG comes from the Department of Education, as the facilitator of OBP 1 in the province.
They have information on all key stakeholders taking part in the OBP and are responsible for facilitating, monitoring and evaluating all the plans and activities taking place in OBP 1.
3.3 Data Collection Strategy
The data collection strategy highlights the techniques used in the study. The qualitative method in this study applied in-depth, unstructured interviews conducted individually. According to Bryman (2008: 473), the individual interview is a formal discussion between the interviewer and a person chosen specifically for the discussion. The goal is not to represent the population as a whole, but rather to gather diverse points of view. Individual interviews give respondents an opportunity to share their understanding of the OBP approach; the ideas and behavioural changes it brings to their work; their motivations for applying it; and its strengths and weaknesses, among other aspects. The reasons for the use of the individual interview are as follows (Bryman, 2008: 475):
• The researcher was interested in gaining access to mental representations, as well as practices that are deeply ingrained in people's minds and that can only rarely be expressed via a questionnaire or in a group setting. The research sought to know, per respondent, their general view on the OBP approach, as well as how it has enhanced their ways of doing things.
• A greater number of subjects can be explored in an interview, as the participants participate more freely without worrying about the sensitivity of the information that they provide. The open-ended questions allow the respondent to answer freely.
There are different methods that can be used to collect qualitative data from participants. Grounded theory was considered in this study, through open-ended questions used to guide the conversation. Thus, what is relevant to these questions is allowed to emerge during the research process.
Intensive interviewing is another qualitative data method that has long been used to gather data in various types of qualitative research. It permits an in-depth exploration of a particular topic or experience and is therefore a useful method for interpretive inquiry. An in-depth intensive interview fosters the elicitation of each participant's interpretation of his or her experience (Charmaz, 2006: 25). In an in-depth interview one listens, observes with sensitivity, and encourages the person to respond, with most of the talking being done by the participant. The intensive interview was not the choice for this initial study because it is a method used when a researcher is highly experienced and informed about the topic being researched. The individual interview was set up with each respondent, according to the departments participating in OBP 1, and each meeting was scheduled for an hour. The individual interview was the best option for this study because many of the respondents taking part in the OBP were senior managers and some were executives; any form of interview conducted in a group setting would not have been appropriate, as the research would have experienced low attendance. The individual interviews took place amongst the stratified participants, and the researcher conducted the interviews herself, using a recording device to capture the statements made. Bryman (2008: 476) states that the use of tape recordings allows the researcher to easily obtain the key points expressed by the participants and, therefore, prevents any difficulty experienced when writing them down. Mitigations to deal with the individual interviews' shortcomings were as follows:
• Qualitative data collected from interviews take longer to transcribe and analyse than survey, checklist, or test data. The researcher mitigated this problem by allocating enough time to analysing the data received from the study conducted.
• Although the same questions may be presented in the same manner during individual interviews, interviewees may not interpret them in the same way, and thus may respond quite differently to the questions. During the interviews, the researcher developed probing questions that assisted in elaborating the question and answer further, without trying to manipulate the interviewee's response.
3.4 Questions for the individual interviews
As mentioned in The Presidency (2009: 15), the Outcomes Based Approach is aimed at improvements in government performance through a focus on a sector, rather than on a department, and on the intergovernmental implications of changing behaviour and attitudes. The measurement is of politically designated outcomes for accountability; giving priority to a few sectors; and an emphasis on accountability throughout the service delivery chain. The study assessed: (i) the improvement the OBP approach had on intergovernmental implications, rather than an individual focus on departments in quality basic education; and (ii) the changing behaviour and attitudes of the participants. According to Krueger and Casey (2009: 7), the questions in the interviews are carefully determined and sequenced to make it easy for respondents to understand. In answering the research question: 'How successful is the Performance Monitoring and Evaluation (PME) system in improving Quality Basic Education management through the OBP approach?', the moderator led an interview of an hour focusing on the following questions:
Table 3.2: Summary of the research design highlighting the research questions
Investigative Questions | Type of data | Location of the data | Data Collection Instrument | Data Analysis
Q1: Has there been any change in how you manage basic education work, or work contributing to basic education? (i) | Responses | Responses from monitoring and evaluation officials | Recording device | Collating and description of best practices
Q2: Has the PME system created a focus on the intergovernmental arrangement between the departments' contributions to OBP 1? (i) | Responses | Responses from monitoring and evaluation officials | Recording device |
Q3: Has this changed the culture in how you plan and manage basic education, as applied in your overall management responsibilities? (i) | Responses | Responses from monitoring and evaluation officials | Recording device |
Q4: Has this approach brought any positive changes to how you do your overall work? (ii) | Responses | Responses from monitoring and evaluation officials | Recording device |
Q5: How have the investments in the PME system paid off in changing your behaviour in implementing the full delivery chain? (ii) | Responses | Responses from monitoring and evaluation officials | Recording device |
Q6: How can the OBP approach be improved? (i) | Responses | Responses from monitoring and evaluation officials | Recording device |
Source: Muzinda (2007: 98)
Table 3.2 is a summary of the research design, highlighting the research questions and the subsequent investigative questions, showing the source of data for each investigative question and the means of analysis. The sub-questions guided the research and were derived from the research problem. Responses to these questions answered the research question from which they were derived.
3.6 Limitations
Twelve OBPs, which focused on Basic Education, Health, Rural Development, Job Creation and Safety, were implemented during the first years of Performance Monitoring and Evaluation implementation in the country. This study focused only on the implementation of OBP 1 - Quality Basic Education - in the Free State, and its findings and experiences may not be generalisable to the other Outcomes applied. The data for the study were collected through the qualitative method; this approach of asking questions of participants may have led to the responses the researcher wanted, thus posing a major limitation to this study.
The qualitative method of data collection and analysis incorporates a wide range of different techniques and epistemological assumptions; thus, the careful selection of the appropriate qualitative method is important (Griffin, 2007). The study also has a relatively small number of participants, which raises the likelihood of its not being taken seriously by academic researchers or by policy makers.

3.7 Ethical Considerations

With the study using the qualitative approach, the following ethical issues were identified. By using the qualitative technique, the participants needed to be clearly informed about the purpose of the study. The introduction of Performance Monitoring and Evaluation (M&E) in the province is perceived negatively by many government officials as a form of 'policing' of what they do in their work; a brief explanation of this study therefore also entailed an explanation of the benefits of M&E. Honest and unbiased responses were encouraged by explaining the benefits the research could have for their organisation. Owing to some information that could be private about the organisation's work or its consideration of the approach, the participants' protection from harm came first. The right to anonymity was upheld through the use of either numbers or alphabetical letters during the interviews. Confidentiality regarding the origin of the information provided was maintained through the non-disclosure of data sub-sets, and the restriction of instruments that would have identified participants in the study.

3.8 Conclusion

There are various methods and tools used to gather data; the method proposed in this study was carefully considered by the researcher through an extensive analysis of existing methods.
At this point, the chapter has described the relationship between the literature review, methodology and design, in accordance with Figure 1.1, which illustrates the hierarchy of the study. Following the determination of the research method, the researcher developed a questionnaire that responded to the research topic.

CHAPTER 4: DATA COLLECTION AND ANALYSIS

4.1 Introduction

This chapter presents the data collection and the analysis of the information obtained during the research study. Data analysis usually begins when data collection begins. To better understand how the Performance Monitoring and Evaluation system was paying off in the quality basic education management approach, a qualitative method of inquiry was utilised, because the researcher's intent was to uncover rich and descriptive "meaning" (Bogdan & Biklen, 1992). Data analysis is conducted to reduce, organise and give meaning to the data. Researchers using grounded theory attempt to generate a theory that is closely related to the context of that which is being studied (Strauss & Corbin, 1990). The information collected is presented under the following themes: the response profile, and the response rate for each of the investigative questions that the guiding questionnaire sought to answer.

4.2 Response profile and rate

Individual interview appointments were made with ten (10) participating members of the OBP 1 Technical Working Group; these appointments consisted of participants who were strategically identified. All four (4) initially targeted departments, namely Education, Social Development, Sports Arts Culture and Recreation, and the Premier, were interviewed. The profile of the respondents consisted of six (6) participants from Education, one (1) from Social Development, one (1) from Sports Arts Culture and Recreation, and two (2) from the Premier.
There were four key features intended to be obtained through the use of the grounded theory method in this study (Urquhart, Lehmann & Myers, 2010): (1) theory building; (2) allowing the researcher prior knowledge, while neither leading him/her to pre-formulated hypotheses nor hindering observations based solely on data; (3) the entangling of analysis and conceptualisation through the core process of joint data collection and constant comparison; and (4) the selection of codes by a process of theoretical sampling.

Figure 4.1: Research data collection and analysis process (Source: Own)

Figure 4.1 depicts the process the researcher undertook to collect and assess the data: identifying units of meaning through a process of open coding, which seeks to establish conceptual categories. The interview records from each participant were transcribed line by line following the interviews and returned to the participants for confirmation of the drafted transcripts. The researcher also allowed participants to add information to questions that they felt had not been responded to as fully as they had wanted. Through this back-and-forth movement between validation of the responses received during the interview and confirmation by the participants, the researcher sought to strengthen the inputs and reduce biased findings. Following the confirmation of the interview data, a line-by-line analysis, using selective coding to identify the main categories, existing or new, would develop a descriptive narrative (theory) encompassing them. The analysis of the qualitative data was conducted by looking at recurring patterns and themes.
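The open-coding step described above, in which units of meaning are identified line by line and labelled with conceptual categories, can be sketched in a few lines of Python. The transcript fragments and category names below are illustrative stand-ins echoing Table 4.1, not the study's actual coded corpus.

```python
from collections import Counter

# Hypothetical units of meaning extracted line by line from interview
# transcripts, each labelled with a conceptual category (open coding).
coded_units = [
    ("Still feel there is a lot of need for leadership", "Strategic Leadership"),
    ("Also requires direction from the top", "Strategic Leadership"),
    ("We have managed to create foci in specific sectors", "Sector Focus"),
    ("The departments need to talk", "Integration Challenge"),
]

# Constant comparison: tally how often each conceptual category occurs
# across all transcripts, as in the '#' column of Table 4.1.
occurrences = Counter(category for _, category in coded_units)
print(occurrences.most_common(1))  # [('Strategic Leadership', 2)]
```

In practice each new unit of meaning is compared against the categories already created, and either assigned to one of them or used to open a new category; the tally then shows which categories recur most often.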
Through an inductive approach to analysing qualitative data, the researcher meticulously read through the statements made by the participants of the study, then collected the data by focusing on particular issues that were relevant and meaningful to the questionnaire, each constituting a unit of meaning.

4.3 Open Coding

Initial or open coding is the first step of data analysis. It is a way of identifying important words, or groups of words, in the data and then labelling them accordingly (Strauss & Corbin, 1990). The researcher used open coding to evaluate the transcripts and documents and the interviews of subjects, and then returned to the transcripts and documents to identify categories or properties of what was being studied. Next, the researcher used axial coding to compare the interviews in order to understand the central phenomenon, namely how the investments in the Performance Monitoring and Evaluation system are paying off in the quality basic education management approach. The open coding resulted in 34 conceptual categories, with examples of units of meaning from the original data and the number of occurrences (#). Grounded theory establishes the need for joint collection and analysis. Table 4.1 shows conceptual categories drawn from this phase; the full list of conceptual categories is presented in Annexure B. The underlined text indicates an initial relationship between conceptual categories and is explained in the next section. In this study there was only one round of data collection, and it was not possible to validate the conceptual categories.

Table 4.1: Examples of conceptual categories from open coding

ID | Conceptual Category | Example of Unit of Meaning | #
1 | Strategic Leadership | "Still feel there is a lot of need for leadership in terms of working together as a province." "Also requires direction from the top. The Office of the Premier needs to take into consideration when the planning cycle starts to then start facilitating OBP." | 5
2 | Strategic Document Merging | "Should just not see it (OBP) as one document but to align it with existing strategic/management documents, so it forms part of our ways of working." | 2
3 | Coordination Strengthening | "Premier as the centre of the approach to strengthen their coordination and their facilitation in terms of being consistent." | 1
4 | Coordinating Structures | "It is not through OBP that such interactions are created but it is also through the Social Cluster, and it is not necessarily captured in the Outcome-Based plan, but where operational, responsible managers work together in terms of some of these issues." "Other individual bilateral meetings are taking place between the different departments on operational matters pertaining to their delivery of a similar mandate. TWG was responsible for monitoring the implementation of OBP." | 2
5 | Work Streamlining | "Leader managed to streamline the things we do and focus on in the province." | 3
6 | Sector Focus | "We have managed to create foci in specific sectors." | 2
7 | Report Focus | "The APP that was produced is better than that of previous years, particularly on quality and credibility. We also emphasise the issue of evidence. When people produce their reports, they need to think clearly as to how they will provide evidence of such information. I think gradually, the OBP approach is getting institutionalised at the bottom." | 4
8 | Priority Focus Enhancement | "They were forced to focus on critical issues of government because they had to align with and report on them." | 2
9 | Integration Challenge | "The departments need to talk and this should happen prior to drawing up the departmental plans. We needed to talk to make sure that the inputs are accurately provided in terms of what is needed in the outcomes." | 6
10 | Interrelation | "They would come together and start to talk about these issues when we called them and not out of their own initiative." "Department is succeeding in using the approach on outcome focus, but externally, through working with other departments to deliver on the same outcome, we are struggling." | 6

Source: Own

4.3.1 Criteria of analysis

The conceptual codes created from the responses were used by the researcher to respond to the research questions, which sought to determine how the investments in the Performance Monitoring and Evaluation system were paying off. In this section, the researcher states the criteria of analysis for the data used to answer the investigative questions. For each question, the researcher analysed the respondents' responses using the ratings in Table 4.2, which shows the scoring criteria, namely: Positive response, Negative response, and Not applicable.

Table 4.2: Interpretation Criteria

Rating | Interpretation
Positive response > other ratings | Improvement of OBP approach on intergovernmental implications and changed behaviour and attitudes
Negative response > other ratings | No improvement of OBP approach on intergovernmental implications and changed behaviour and attitudes
Not applicable / Not familiar | No improvement of OBP approach on intergovernmental implications and changed behaviour and attitudes

Source: Own

A high positive-response rating per question demonstrates the improvement the OBP approach has had on intergovernmental implications and on the changed behaviour and attitudes of managers in M&E. Negative responses indicate a lack of such improvement. A high Not applicable or Not familiar score likewise indicates a lack of improvement the OBP approach has had on intergovernmental implications and the changed behaviour and attitudes of managers in M&E.
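The interpretation rule just described, under which a question indicates improvement only when its positive rating exceeds each of the other two ratings, can be expressed as a short sketch. The function name and result labels are illustrative, not part of the study's instrument.

```python
def interpret(positive, negative, not_applicable):
    """Apply the criterion of Table 4.2: improvement is indicated only
    when the positive rating exceeds each of the other two ratings."""
    if positive > negative and positive > not_applicable:
        return "Improvement"
    return "No improvement"

# Ratings for Q1-Q3 reported in Table 4.3 (ten respondents per question).
ratings = {"Q1": (6, 3, 1), "Q2": (7, 3, 0), "Q3": (5, 4, 1)}
results = {q: interpret(*r) for q, r in ratings.items()}
print(results)  # all three questions indicate improvement
```

Applying the rule to the Q1-Q3 ratings reproduces the interpretation column of Table 4.3, since each question's positive count exceeds both its negative and not-applicable counts.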
Following the definition of the criteria for the analysis of the questions, the findings are shown below under the following sections:

• Improvement the OBP approach has had on intergovernmental implications rather than an individual focus by departments in quality basic education.
• The changing behaviour and attitudes of participants in implementing the full delivery chain.

4.3.1.1 Improvement the OBP approach has had on intergovernmental implications rather than an individual focus by departments in quality basic education

This section shows the findings on the questions that sought to establish the improvement the OBP approach has had on intergovernmental implications, rather than an individual focus by departments in quality basic education. The section is divided into three (3) sub-sections, i.e.: Q1 Changes in the management of basic education work; Q2 Created focus on intergovernmental arrangement; and Q3 Changed culture in planning and managing basic education in management responsibilities. The rating for each of the questions is shown in the next table.

Table 4.3: Rating on the improvement the OBP approach has had on intergovernmental implications rather than an individual focus

Research Question | Positive Response | Negative Response | Not Applicable | Interpretation
Q1 | 6 | 3 | 1 | Improvement of OBP approach on intergovernmental implications
Q2 | 7 | 3 | - | Improvement of OBP approach on intergovernmental implications
Q3 | 5 | 4 | 1 | Improvement of OBP approach on intergovernmental implications

Source: Own

The table illustrates the score given to each question following the analysis of the data collected through the interviews. The result depicts improved intergovernmental implications through the OBP approach.
Q1, which enquired about changes in how departments manage basic education work and/or work contributing to basic education, sought to determine any changes in the managers' behaviour and attitudes in how they conduct their work and manage basic education inputs, comparing the Monitoring and Evaluation system prior to the introduction of PME in 2010 with the situation after its implementation. The following statements indicate the responses reporting positive changes in managing basic education:

• Plans are being identified and facilitated in key areas of impact.
• There has been an improvement in the quality of information received on basic education, as the Department no longer focuses only on outputs but also receives evidence for verification and validation.
• The PME process facilitated the alignment of the OBP indicators, derived from the Minister of Education's Action Plan 2014, with the departmental plan.
• Performance Monitoring and Evaluation has enabled the Department of the Premier, as a strategic leading department, to focus on critical aspects of government.

The key fundamental changes in how departments manage basic education work were reflected in improved plans developed to address key areas of impact, better planning information, streamlining, and the validation of information received on delivered outputs. There was an emphasis on how assessing the expected change in departmental plans, and the monitoring of outputs, was brought about. Although many of the respondents identified positive changes, a few respondents indicated that there had not been any change in how they manage their work or contribution towards basic education since the introduction of PME. The following reason was outlined:

• A lack of funding allocated to strengthen PME systems and the consolidation of plans required to create focus on the key areas of impact identified.
The allocation of funding to strengthen development systems is supported by Cimdins and Skinkis (2011), who, in their study on development policy monitoring issues and regional and local experiences in the Riga Region, observed the link between development priorities and budget planning as an essential precondition for the implementation of development policies. Resource allocations and the actual costs of implementing outcomes-based assessment often go uncalculated (Bresciani, 2006: 6). It is evident that an attempt to determine the actual costs of engaging in outcomes-based planning and assessment should be identified as a requirement for the departments to improve their management of basic education.

Q2, on whether the PME system created a focus on the intergovernmental arrangement between the departments' contributions to OBP 1, enquired whether the OBP approach had brought about collaboration towards improving basic education amongst the contributing departments. A high number of respondents responded positively to the approach, indicating that working together amongst the departments contributing to education had improved. A positive score was obtained from six participants, who responded clearly on the focus created on intergovernmental arrangements between departments contributing to OBP 1. Their responses outlined the following:

• A good relationship between departments delivering similar work was created through the outcomes interaction.
• Continuous interaction amongst the departments has improved, because previously it seldom took place.

The improvement in the intergovernmental arrangement amongst the departments delivering similar work implies that scarce resources were allocated well; thus, having to do more with less and doing it on time (The Presidency, 2009: 4).
The departments were able to work according to plan and, with limited resources, avoided duplicating what other departments were already doing. The following negative responses were said to be the cause of a lack of improvement in intergovernmental arrangements:

• Budget allocation in departments is insufficient, as it has not capacitated the departments to operate in an outcomes manner; planning is short-lived and cannot be focused on the long term.
• Intergovernmental relations between some departments were not well established through the TWG's group interaction, as such interaction involved only a few indicators.
• Planning in the TWG takes place at a high level, and this process takes no consideration of plans made in some programme units in departments.

The insufficient budget allocation in departments towards long-term plans (the 5-year strategic plan) maintains a short-lived focus on collective administrative data and outputs, rather than on outcomes or impacts over a longer period (Heinrich, 2002: 716). Planning in the TWG, which was considered high-level, consisted of indicators to which some programmes in departments were required to contribute for quality basic education, but which were not directly linked to them. Kravchuk and Schack (1996: 356) state that differing data definitions can complicate the interpretation of performance requirements of higher-level government departments when data definitions and collection methods change rapidly and in unanticipated ways. Problems can arise especially where the definitions of programme effectiveness (or success) diverge widely. In a bureaucratic environment, the distance from the individual programme level to the highest level of decision making is increased. The links between individual programme and organisational outcomes and performance grow more complex when the implemented programme efforts are not ultimately aimed at influencing performance at the highest level of the organisation.
Q3 Has this changed the culture in how you plan and manage basic education as applied to your overall management responsibilities? In Q3, the researcher sought to determine whether there had been improvements, or a lack thereof, since the introduction of OBP. The following responses reflect the improvements since the introduction of OBP:

• To perform at an optimum level, the education sector in the province was viewed at an outcomes level by contributing departments. All M&E practitioners of contributing departments came together to analyse how they had been doing things, in order to guide how they would start doing them differently.
• Monitoring was improved because, at the beginning of an undertaking, a manager and his down-line staff agreed on the measures to take regarding what they had planned and intended to deliver in a particular year.
• The same approach was applied in the Department of Education, i.e. the directorates that contribute towards improving education ensured that they delivered a credible report, as well as holding continuous bilateral meetings with M&E in order to reach a common understanding.

It is evident that there have been benefits to departments in how they manage their work, through a changed culture in how they plan and manage basic education within their overall management responsibilities. As stated by Heinrich (2002: 716), an outcomes-based performance standards system focuses management attention on those ends deemed most important, such as an emphasis on impacts over output goals.
The following responses outlined by the respondents reflect a lack of improvement since the introduction of OBP:

• The manner in which government plans and reports still focuses on responding to the needs and expectations of the executives, and not necessarily on the provincial impact; the focus is on outputs and measurable/tangible deliverables per department, rather than on what the province should look like.
• Government culture is predominantly driven by a silo mentality; when OBP 1 was introduced, it was looked at on its own, without an attempt to instil and institutionalise the approach as part of a way of working.
• The programmes proposed for the implementation of the PME system are many, and all have been allocated to one manager to facilitate; support and human resources are required to assist in implementation.

Not all respondents have experienced improvement since the introduction of OBP; to some, it has felt as if the system has brought them more work. Heinrich (2002: 712) suggests that the requirements for specific performance goals, plans and results have increased administrative constraints and elevated conflict among multiple levels of programme management.

4.3.1.2 The changing behaviour and attitudes of participants in implementing the full delivery chain

The introduction of the outcomes-based approach was intended not only to develop a skilled and well-motivated public service, but one that is proud of what it does and receives full recognition for delivering better quality services (The Presidency, 2009: 4). This section shows the findings on the questions that sought to determine whether the respondents had changed their behaviour and attitudes in implementing the full delivery chain.
It is divided into two (2) sub-sections: Q4 Positive change brought in how overall work is done; and Q5 Investments in the PME system paying off in changing behaviour on implementing the full delivery chain. The rating for each of the questions is shown in the next table.

Table 4.4: Rating on the improvement the OBP approach has had on changing the behaviour and attitudes of participants in implementing the full delivery chain

Research Question | Positive Response | Negative Response | Not Applicable | Interpretation
Q4 | 6 | 4 | - | Improvement in changed behaviour and attitudes
Q5 | 5 | 2 | 3 | Improvement in changed behaviour and attitudes
Q6 | - | - | - | -

Source: Own

The observation made from Table 4.4 is that the rating on the improvement of the OBP approach in changing the behaviour and attitudes of the participants in implementing the full delivery chain is high. The respondents were asked the following two questions, and their responses are explained below.

Q4 Has this approach brought any positive changes to how you do your overall work? The question asked whether participants were implementing the culture of performance monitoring and evaluation in all other management activities they perform in government. The responses on how their overall work was changed positively were:

• Positive changes have been seen in the past two years since the introduction of PME in the department, and a clean audit was obtained.
• The approach has brought a change to how the departments work, because they have been forced to talk to one another.
• Departments have been forced to focus on critical issues of government, because they had to align with and report on them.
• Positive change was seen in the Department of Education, where they embraced the technique and started to capacitate the programme managers on why data sourced for M&E was important. Issues and findings by managers were also shared in executive meetings.
The following responses were outlined by respondents as to how the approach has brought no changes to how they do their overall work:

• The identification of impact programmes needs to be prioritised in finance, as the departmental budget is used for operational activities.
• The level of discussions held in the TWG on outcomes needs to be clearly communicated to departmental managers in order not to overlook the planning arrangements already taking place amongst the departments. People participating in the structure do not implement and follow up on their discussions; they cannot communicate plans properly to the people responsible for the work.
• Beyond the OBP TWG, there are other intergovernmental structures which build working together amongst the departments. The TWG in Education consists of indicators of which 99% come from the department alone. Addressing these indicators in the TWG was not necessary, as an operational plan to address them already existed.

Cimdins and Skinkis (2011) state that different policy development levels have varied approaches to structures for defining specific aims and to the use of policy monitoring measures. To some respondents, the TWG required for implementing the PME system in OBP 1 felt like a repetition of structures already existing or taking place in departments.

Q5 How has the investment in the PME system paid off in changing your behaviour on implementing the full delivery chain? The researcher sought to determine any changes in managers' behaviour in implementing the full delivery chain. The following responses were outlined by respondents to indicate the PME system's pay-off:

• OBP inculcated in all stakeholders a focus that is not inward but province-focused and sector-focused. The sector focus embraced and entailed all the stakeholders delivering activities in one sector.
• We look at the outcome, because what we do in departments needs to reflect what is required to achieve a high impact in education.
• There is a link to focusing on a full delivery chain through plans that are aligned with the outcomes on which they are focused.

Although the system received some positive comments on its paying off in changing managers' behaviour in implementing the full delivery chain, the following comments were received to indicate that the system was not paying off:

• Focus is still placed on outputs and measurable/tangible deliverables per department, rather than on the outcome of results and the impact on the province as a whole. The focus of the TWG needs to be elevated to outcomes.
• The Department of Education is succeeding in using the outcome-focused approach internally, but externally, in working with other departments to deliver on the same outcome, improvement is required. Through the TWG, the departments contributing to education need to be measured and held accountable for the contribution they are required to make to improving education.

4.4 Axial Coding

Axial coding consists of identifying relationships among the open codes. Axial coding follows the development of a major category, although it may be at an early stage of development. Its purpose is to sort, synthesise and organise large amounts of data and reassemble them in new ways after open coding (Charmaz, 2006: 60). Axial coding in this study involved constantly comparing words and meanings in order to formulate common themes across the data, which were used in the conceptual categories. The purpose of axial coding is to identify categories or conditions that may be contributing to the subjects' inability to engage in the outcomes-based approach, and to identify specific strategies, conditions and contexts.
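The axial-coding step, in which related open codes are grouped under broader categories while their occurrence counts are carried along, can be sketched as follows. The groupings and counts echo two of the clusters that emerge later in the analysis; the data structure itself is an illustrative assumption, not the study's instrument.

```python
# Axial coding sketch: open codes (with their open-coding occurrence
# counts) grouped under broader notional categories.
axial_groups = {
    "Focus Divergence": {
        "Work Streamlining": 3, "Sector Focus": 2,
        "Report Focus": 4, "Priority Focus Enhancement": 2,
    },
    "Integration on Specific Sector": {
        "Integration Challenge": 6, "Interrelation": 6,
    },
}

# Summing the occurrences per group indicates which broad themes
# dominate the interview data.
theme_weights = {group: sum(codes.values()) for group, codes in axial_groups.items()}
print(theme_weights)
```

The per-group totals give a simple relevance signal for each theme, which is one way of deciding which categories are "grounded in the data" strongly enough to keep.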
The interpretations of the theory are validated and refined through theoretical integration, which consists of relating the resulting theory to other existing theories to generate confirmations, and even formalisations. Through axial coding, in categorising some codes, the researcher established simple connections between different categories, such as 'Work Streamlining' with 'Sector Focus', 'Report Focus' and 'Priority Focus Enhancement'. The grouped categories were chosen for their greater relevance in context, always seeking to be grounded in the data. Some cases did not result in grouping, owing to the weak connection found in the narrative of certain categories. After connecting the conceptual categories, notional categories were established.

Table 4.5: Notional categories

ID | Notional Category | Conceptual Category (#)
1 | A. Strategic Direction | Leadership (5)
2 | | Management Document Merging (2)
3 | | Coordination Strengthening (1)
4 | | Coordinating Structures (2)
5 | B. Focus Divergence | Streamline Work (3)
6 | | Sector Focus (2)
7 | | Report Focus (4)
8 | | Priority Focus Enhancement (2)
9 | C. Integration on Specific Sector | Integration Challenge (6)
10 | | Interrelation (6)
11 | | Middle Man (3)
12 | | Complementary Support (1)
13 | | Integration Improvement (2)
16 | | Silo Mentality (1)
14 | D. Communication | Interaction (3)
15 | | Communication Challenge (2)
17 | | Communication Improvement (2)
18 | | Informed Executives (1)
19 | E. Resources | Finance Detachment (1)
20 | | Finance Reduction (1)
21 | | Budget (2)
22 | | Funding Support (1)
23 | F. Planning Approach in Government | Planning (5)
24 | | Future Planning (3)
29 | | Agreement (1)
25 | G. Capacity Building | Capacity (3)
31 | | M&E Experience (1)
27 | H. Innovation | Innovation (1)
26 | I. Monitoring Functionality | Responsibility in OBP (1)
28 | | Monitoring (3)
30 | | Performance Measure (3)
32 | | M&E Demystification (2)

Source: Own

Table 4.5 shows the category identifier, the notional category name, and the related conceptual categories, with their respective identifiers (ID) on the left of the table, followed by the name and the number of occurrences in parentheses. The identified notional categories will be used to respond to Q6: How can the OBP approach be improved? The question sought to obtain information on how improvements could be made to the OBP approach in respect of undesired experiences. To facilitate the understanding of each notional category generated by relationships among conceptual categories, the narrative around each is described as follows.

The Business Dictionary (2014, online) defines strategic direction as a course of action that leads to the achievement of the goals of an organization's strategy. In Strategic Direction, 'leadership' is perceived as the key element required to improve the implementation of the PME system, through the 'strengthening of coordination' at provincial level, in order to achieve the merging of management and planning documents. The leadership required needs to build capacity, establish direction, and influence and align others toward a common goal, motivating and committing them to action and making them responsible for their performance (Sindane, 2007: 2).

In Focus Divergence, defined as to move or extend in different directions from a common point; to branch off (Dictionary.com, online), the introduction of the PME system has managed to 'create focus' through the 'streamlining of work' into specific sectors in the province. The 'report focus' has been improved through data that is credible and verifiable.
In Integration on Specific Sector, several 'challenges' were outlined, amongst them the 'silo mentality', the 'interrelation' of outputs to outcome indicators, and the role of the Department of the Premier as a 'middle man' to 'improve the integration'.

Similarly, communication amongst the participating members from different departments in the TWG was initially a 'challenge'. Communication is defined as the articulation of the organization's strategy, translated into simpler and actionable strategic objectives that provide clarity on what the organization intends to achieve (Sindane, 2009: 7). Continuous 'interaction' amongst the departments has 'improved' communication on how the contribution to quality basic education is achieved. The outcome of results in the TWG needs to be communicated to department executives to create awareness of those ends deemed most important.

In the Resources category, 'financial detachment' and the 'reduction' of finance in key programmes were identified as problems for the implementation of the PME system. Funding support for improving the focus on outcomes by departments, as well as budget allocation for high-impact programmes in prioritised sectors, was found to be crucial.

In the Planning Approach in Government category, 'planning' before the introduction of PME was done according to Treasury regulations and guidelines, wherein the focus of the development of Annual Performance Plans was on outputs. For 'future planning', joint planning within the departments is needed, particularly in departments contributing to one outcome.

Capacity building is defined as the planned development of (or increase in) knowledge, output rate, management, skills, and other capabilities of an organization through acquisition, incentives, technology, and/or training (BusinessDictionary.com).
In the Capacity Building category, the following aspects were identified as key to enabling the departments to operate fully in an outcome-orientated manner: the allocation of funds for departments to perform their functions; technology supporting systems for PME management; skills development training; and the allocation of human resources to relieve managers overloaded in M&E due to increased responsibilities. In the Innovation category, the outcome-based approach brought an initiative which allowed government to be 'innovative', but government officials' thinking and how they operate have not changed or shifted. In the Monitoring Functionality category, it was recognised that a 'performance measure' on identified key areas of impact was essential, and that to allow 'monitoring' of these areas of impact, the Office of the Premier requires a team of 4-5 people dedicated to monitoring them. M&E advocacy is needed so that managers are aware of what needs to be done to conduct monitoring and evaluation. A 'good practice' of using the information for M&E was also realised, and this could be obtained through demystifying M&E, which would prevent resistance to its implementation. The 'levels of managers in M&E' was seen as a contributing factor to getting other people to listen to the message of M&E. All departments contributing to OBP should take full accountability, and such 'responsibility in OBP' lies with the accounting officer (HOD) of the particular department.

4.5 Selective Coding

The selective coding technique was applied to extract the respondents' points of view about the PME system through the implementation of the Outcomes approach. Selective coding is defined as the process of choosing one category to be the core category and relating all other categories to that category (Strauss & Corbin, 1990). In developing a single storyline around everything else covered, a theory is developed to help decide, on analytic grounds, what is carried into the next stage of recommendations.
Thus, the categories Strategic Direction; Focus Divergence; Integration on Specific Sector; Communication; Resources; Planning Approach in Government; Capacity Building; Innovation; and Monitoring Functionality can be interpreted as essential to the success of improving the approach and to the satisfaction the respondents gained from its use. All the categories form part of a process that is incomplete without any one of them. The main category, or driver, is a conceptual idea under which all the categories are included. I conclude that the resulting main category is 'Means to support Planning and Monitoring and Evaluation for Outcome Based Priorities'.

4.6 Conclusion

It is evident that there have been beneficial changes to how departments manage basic education through the OBP approach. The notional coding technique has provided a view of what the respondents considered important in improving the approach. Thus, the categories Strategic Direction; Focus Divergence; Integration on Specific Sector; Communication; Resources; Planning Approach in Government; Capacity Building; Innovation; and Monitoring Functionality can be interpreted as essential to the success of the approach and the respondents' satisfaction with its use. The main category, namely Means to support Planning and Monitoring and Evaluation for Outcome Based Priorities, will help to interrogate and build the findings and recommendations in the next chapter.

CHAPTER 5: DISCUSSION AND RECOMMENDATIONS

5.1 Introduction

This chapter presents the discussion of the findings of the research and recommends how support for the planning and monitoring of Outcome Based Priority 1: Quality Basic Education (OBP 1) can take place.

5.2 Discussion and Recommendations

It is important that the findings of the study are briefly reiterated before recommendations are made.
The following were the research findings: The PME system is beneficial to government, as it improves how departments conduct the monitoring of their programmes. As a respondent stated: "There was an improvement in the quality of information on basic education because the department no longer focused only on the outputs count but also worked on receiving evidence for verification and validation as required". However, the approach needs additional support for more effective planning and monitoring of Outcome Based Priorities to take place, as well as further institutionalising of the approach across all lines of work done in departments. As another respondent declared: "When OBP 1 was introduced we looked at it in isolation and did not really try to incorporate it into our ways of working." This statement is attested to by the responses of those working with, and responsible for, M&E in the departments, who stated that they apply the approach but that managers responsible for programmes other than M&E still did not apply it. It must also be highlighted that the intention of the approach was not to improve how government applied the monitoring of its programmes, but to improve the focus on where planning should be a priority. Another aspect which respondents raised, in the area of capacity building, was that of equipping staff with skills, supporting systems and procedures, as well as the provision of resources, both human and financial. Regarding the strengthening of the coordination and integration of OBP, a respondent summed up: "Essential to the centrality of the approach is the need to strengthen coordination and facilitation in terms of being consistent. It is difficult as a lead department to organise the supporting departments' participation in these structures".
There was also recognition that M&E was not fully implemented within the departments, and that the establishment of connections amongst managers has assisted M&E managers to work together and communicate with one another. Through my observations of managers' contributions to indicators in OBP, I found that they were very concerned about the performance standards contained in the OBP documents, as this information was not linked to the targets set out in their annual performance plans. The absence of strong coalitions supporting a results orientation; measures that were not well linked to goals or consequences; and employee concerns that their responsibility is not commensurate with their authority, were all matters of concern. The researcher makes the following recommendations to address some of the key findings of the study.

• Strategic Direction

It is imperative for leadership in the PME system to hold the ability, process and action that involve the development and communication of a vision to be realised. Leadership should also influence, inspire and mobilise others within the various departments contributing to OBP for collective action towards the realisation of its vision. Leadership involves leaders and followers who are distinguished by different but complementary roles that are undertaken collectively for a common purpose. Therefore, systems and structures need to steer and gear efforts strategically towards the execution and attainment of public policies (Mothae & Sindane, 2007: 2). The champions of the PME system should also give guidance on how political principals (MECs) can measure their departments and instigate the necessary changes in outputs and measurable deliverables which impact on how the Free State Province is progressing.

• Focus Divergence

There are many issues causing a focus divergence in planning activities in government which become noticeable once the OBP approach is implemented.
Where organisational goals and performance measures diverge, a concerted bureaucratic effort across all levels of government should be considered. It is necessary to explore the relationship between alternative measures and to strengthen the link between delivered outputs and measured performance. Following the link created, empirical measures of performance should no longer be based on the earning levels of participants but rather on the improvement in outcomes achieved by departments.

• Integration on specific sectors

The Outcome Based Approach was introduced to strengthen government's ability to co-operate across the three levels of government and to work as a single delivery operation. In order to achieve this, the tasks related to attaining an outcome were grouped together as subsystems (Kravchuk & Schack, 1996: 353). The integration of performance measurement (through a Technical Working Group) in the context of intersectoral relationships was problematic. Perceptions as to the intended goals and benefits of particular programmes varied at each level of government, where individual work was well coordinated in coherent subunits and in accordance with each department's objectives and standard operating procedures. This process reflected the closeness of each department to the actual delivery of its service. To improve the integration amongst the departments contributing to an outcome, the approach needs to take into consideration how performance measurement for each department can be built into the system. To some degree there is a maintenance of homeostasis which respects the planning environment in departments, but the OBP approach requires the achievement of a satisfactory and acceptable level of integration and performance measuring that management can maintain at a reasonable level (Kravchuk & Schack, 1996: 353).
The Technical Working Group (TWG) needs to focus closely on building an outcome-based approach which considers the existing structures amongst the departments, so as to avoid 'reinventing the wheel'. As some respondents indicated, there are existing structures with signed MOUs to guide departments on how they work together. The system-wide decisions made through the TWG management structure, which pursues the overall goals, should also preserve the advantages of the component structures (Kravchuk & Schack, 1996: 353). Within the hierarchic arrangements of the OBP approach, interactions and interrelationships among departments' proposed programmes should be maximised by creating complementary relationships that increase the effects of outcome delivery. For example, the Department of Education should not keep children in classrooms during all hours of the school week, but should allocate some time for sport activities. It should be mandatory for this relationship to take place in order to complement the use of resources and facilities provided by the Department of Sport, Arts and Culture.

• Planning approach in government

The OBP approach should be aligned with government planning that consists of a clear, coherent mission, strategy and objectives. The performance measurement of all levels within the results-based chain should begin with a clear understanding of the policy objectives of the programme, or of multi-programme systems in departments (Kravchuk & Schack, 1996: 350). The daily operations of the officials (individually and collectively) in departments should contribute to the implementation and accomplishment of planning outputs clearly linked to outcomes (Mothae & Sindane, 2007: 03). The OBP approach should be used effectively as a policy measuring tool for increasing governmental accountability.
Accountability is defined as answerability for one's actions or behaviour, thus denoting a superior-subordinate relationship (Sindane, 2009: 04). Such accountability in the OBP approach can be created, first, by the building of standards. The standards are created from what the executives, namely the Members of the Executive Council (MECs) in the Free State Province, identify as priorities, in line with what exists in the National Development Plan (NDP). The OBP approach is a strategy implemented on a 5-year basis to deliver on targets set out in the NDP; departmental 5-year plans are therefore required to be linked to OBP. There is a need for the OBP approach to involve all the programme managers in the planning of the inputs for the PME system. The active involvement of the managers will mitigate the problems of collecting, monitoring and evaluating data from them. Some managers did not understand the basis of the indicators on which they were required to report, and their participation would have the added advantage of demonstrating accountability and strategy support (Muzinda, 2007: 145). Thus, standards set to measure monthly, quarterly or even half-yearly performance, and the comparison of actual performance against the set standards, require a collaborative effort (Sindane, 2009: 07). The OBP approach also requires a political champion who will advocate for its implementation in the province. While the Office of the Premier plays a crucial role as the coordinator of the PME system, a political executive should be identified to influence decisions taken and implemented in the province.

• Resources

There is a need to allocate resources for the implementation of outcomes. Resource allocation and management actions at different organisational levels have the potential to influence not only programme outcomes, but also the specific types of performance-management policies adopted (Heinrich, 2002: 717).
The following barriers, identified by Bresciani (2006: 6) as reasons why outcomes-based assessment programmes are not ubiquitously practised, should be taken into consideration: (a) time and (b) resources.

(a) Time

The allocation of time to tasks and activities at work is based on what is valued as a priority, or on what managers are told to value by those responsible for evaluating department performance (Bresciani, 2006: 6). Sufficient time needs to be allocated to activities and performance measuring in the Performance Monitoring and Evaluation system in order for all people to be given the same amount of time to focus on priorities, given the level of responsibilities pressing upon their time. Time should be well invested in the establishment of results-based planning; that is, the activities in planning should be linked to the outcome performance of each department. This measure will reduce work that is burdensome owing to the multiple focuses required for different activities.

(b) Resources

The following resources, namely funds, enabling tools for performance measures (an IT system) and human resources, need to be allocated to the PME system to allow easier application of the results-based method in the OBP approach. The following costs need to be covered to promote engagement between managers in outcomes-based assessment:

• Professional development of managers, in order for them to learn how to engage in quality outcomes-based assessment; and
• Enabling managers to reflect on what the outcomes-based assessment data are telling them about their programmes.

While departments are considering costs to improve the PME system's administrative process, they should also look into educating managers on how to implement effective, efficient, and enduring outcomes-based assessment (Bresciani, 2006: 6).
There is also a requirement for departments to be offered support through the allocation of an electronic system or other PME tools which can be used in outcomes-based assessment. With regard to human resource allocation, departments are required to review their organisational structures to determine whether more personnel are required for the institutionalisation of the PME system. Budget allocation should also be made for performance on the basis of positive, quantifiable changes in the outcome performance indicators. Managers should be enabled to understand assessment, as it is one of the key requirements for monitoring and evaluation. It was indicated that M&E was applied duplicitously: managers would not give objective results, since results which did not reflect positively on their programmes' performance would not be embraced by the executives. Moreover, there were other managers who did not believe that the PME system added any value to their day-to-day management duties. Careful consideration should be given to directing financial resources towards the capacity building of officials to further their understanding of M&E as a sound management practice.

• Communication

Communication improved between some departments working together towards delivering the same mandate. Programme managers, as a whole, should continuously be informed or made aware of the implications of their daily duties for the successful implementation and accomplishment of the PME system and other public policies (Sindane, 2007: 3). It is important to develop an effective communication strategy for the OBP approach that involves all managers responsible for its implementation. Furthermore, the dissemination of information emanating from the PME system should also be included in the communication strategy.

• Capacity Building

The findings identified a need to address the lack of expertise in the monitoring and evaluation of governmental programmes.
Some managers expressed the need for training in the aspects of monitoring and evaluation. The intervention should institute programmes that address not only M&E managers but also programme managers. It is imperative that the implementers of government programmes possess skills in monitoring and evaluation, so as to enhance the understanding of M&E as a management practice rather than merely a function, and to expand this capacity to all staff managers in order to reduce the burden of activities falling on M&E managers (Muzinda, 2007: 141). To reduce the perception that the OBP approach, as a management strategy, increases the complexity of modern government, departments have to allow for organisational learning through the outgrowth of the systems-analysis literature. This information should emphasise performance measurement as a way to help organisations better understand their processes and outcomes, allowing for a holistic, integrated analysis of the organisation's mission, goals and objectives in relation to its current performance (Kravchuk & Schack, 1996: 349).

• Innovation

The implementation of the OBP approach was perceived as unique and brought innovation to how departments assessed their programmes. The Office of the Premier is required to continuously seek ways to engage meaningfully in outcomes-based assessment programme review in order to find ways to improve managers' learning and development.

• Monitoring Functionality

With planning and some M&E functions already existing in the organisational functions of departments, the OBP approach was introduced to create a further focus on outcomes. As suggested by Heinrich (2002: 716), the monitoring functionality of the OBP approach needs to be elevated to a focus-centred performance system which measures programme outcomes.
Just as structures already exist for 'performance auditing', which is aimed at pointing out breakdowns in operational controls, the implementation of functional responsibilities, areas for cost reduction and operating improvements, and so forth, the OBP approach needs to operate at a level at which the system is designed to focus management attention (at all levels) on the central organisational objectives (or programme outcomes) and to lessen the government's need for costly process and compliance monitoring (Heinrich, 2002: 712). Learning may be accelerated through low-level individual and team efforts in operational controls. The strategic coordination, integration and application of what is learnt throughout the organisation, together with an understanding of the pitfalls in the design, reporting and analysis of performance measures, as well as the place of performance measures in the managerial armoury, should take priority in the OBP approach (Kravchuk & Schack, 1996: 349).

5.3 Conclusion

The findings on the implementation of the OBP approach have uncovered vast areas which need to be considered when working on improving the approach. All other operational structures existing prior to the development of the OBP approach should be recognised and should work in a complementary manner. In addition, it is to be expected that a change will occur in the ways in which things are done, particularly in how planning and monitoring take place, towards creating a focus on the outcomes. Such change can be brought about only through improving capacity in M&E practice, as well as through enforcing accountability in the manner proposed in the recommendations. The main activity required for building the system is the broadening of the government's efforts to realign the focus of government accountability and performance analysis away from activities and process measures, toward results or outcomes (Heinrich, 2002: 712).
Programme performance reports are required to focus on the outcome through the TWG structure, which will provide political accountability for results and the opportunity for increased responsiveness to programme managers.

5.4 Recommendations for Future Studies

The research study did not extend to comparing the monitoring and evaluation practices of the PME system with best practices, or with other best-performing departments, in order to determine how effectively the approach can be used to improve monitoring and evaluation. It is hoped that this study will constitute only the first step towards further investigation, as there are areas of the approach's implementation that are still not well understood. It has not been long since the approach was introduced, and an extensive study on other possible ways of improving the PME system in government should be considered.

5.5 Final Conclusions

Outcome Based Priorities performance management is intended to increase performance activities at all government levels (Heinrich, 2002: 712). While the intention of the study was to identify how much the PME system has improved the approach to managing basic education and to give recommendations, a theory has also emerged as to why the system is not pervasive in all departments. The findings of the study show the improvement the OBP approach has brought about in intergovernmental implications, rather than an individual focus by the departments participating in quality basic education, as well as the changed behaviour and attitudes of participants in implementing the full delivery chain. For full participation in the system, and to build public accountability, departments need to implement the approach in order to influence the behaviour of all individuals in the organisation. By emphasising the empowerment of all managers in the process of organisational learning, performance-improvement efforts will be sustained (Heinrich, 2002: 713).
Nonetheless, the study does not unequivocally state that outcomes-based performance-management systems are more effective than traditional approaches to bureaucratic control (that is, accountability for inputs and processes), but it does create a focus on all levels of the results-based chain. What is needed, then, is a framework for system-wide performance measurement that acknowledges the diversity of governmental goals, while providing information on aggregate efficiency and effectiveness. Ideally, this framework should measure inputs, processes, outputs and outcomes, as well as beneficiaries' satisfaction. Moreover, the system should serve the purposes of both continual performance assessment and long-term evaluation (Kravchuk & Schack, 1996: 349). In the light of these limitations, the most useful feedback that programme managers might receive from a performance-management system would be those measures put in place to increase their understanding of how their own policy and programme decisions are linked to the programme outcomes.

REFERENCES

Bailey, L., Vardulaki, K., Langham, J. & Chandramohan, D. 2005. Introduction to Epidemiology. England. McGraw-Hill Education.
Blumberg, B., Cooper, R.D. & Schindler, P.S. 2005. Business Research Methods, 2nd European Ed. London. McGraw-Hill.
Bresciani, M.J. 2006. Outcomes-based academic and co-curricular program review: A compilation of institutional good practices. Sterling, VA: Stylus Publications.
Bryman, A. 2008. Social research methods. 3rd edition. New York. Oxford University Press.
Bogdan, R.C. & Biklen, S.K. 1992. Qualitative research for education. New York. Allyn and Bacon.
Burns, N. & Grove, S.K. 1999. Understanding Nursing Research, 2nd Ed. Philadelphia. W.B. Saunders Company.
Capacity Building Definition. n.d. businessdictionary.com. Source from: http://www.businessdictionary.com/definition/capacity-building.html (24 November 2014).
Cimdins, R. & Skinkis, P. 2011.
Development policy monitoring issues - regional and local experiences in Riga Region. European Integration Studies No 5. Riga Region. University of Latvia. Source from: http://www.inzeko.ktu.lt/index.php/EIS/article/view/1070 (05 January 2013).
Charmaz, K. 2002. Construction of grounded theory: A practical guide through qualitative analysis. London. Sage Publications.
Department of the Premier. 2005. Free State Growth and Development Strategy (2005-2014). Chapter 7. Source from: http://app.spisys.gov.za/files/pula/topics/3037832/Provincial Strategies/Free State Province/file 84516279.pdf (22 April 2014).
Department of Education Strategic Plan. 2010. Annual Report.
Department of Performance Monitoring and Evaluation, National Planning Commission. Diagnostic Overview. 2011. Source from: http://www.npconline.co.za/pebble.asp?relid=33 (22 April 2014).
Department of the Premier. 2011. Terms of Reference (TOR) for Technical Working Group. Developed by Performance Monitoring and Evaluation Unit, Office of the Premier.
Free State Department of the Premier. 2010. Presentation by Mafole Mokalobe on Provincial Strategic Planning Alignment to National Development Plan (NDP). Developed by Provincial Strategic Planning, Policy and Research.
Diem, K. 2002. Using Research Methods to Evaluate Your Extension Program. Journal of Extension, 40(6), 412-415.
Divergence definition. n.d. Dictionary.com. Source from: http://dictionary.reference.com/browse/divergence (12 November 2014).
Education in South Africa: Achievements since 1994. ISBN 0-970-3911-2. Department of Education, May 2001. Source from: http://www.dhet.gov.za/LinkClick.aspx?fileticket=sOiQt9GSMpk%3D&tabid=92&mid=495 (19 August 2011).
Griffin, C. n.d. The advantages and limitations of qualitative research in psychology and education. UK. University of Bath. Source from: http://www.pseve.org/Annals el/UPLOAD/griffin2.pdf (01 May 2014).
Goldman, I. & Nel, E. 2005.
A Framework for Monitoring and Evaluation of Pro-Poor Local Economic Development. South Africa. The World Bank. Source from: http://siteresources.worldbank.org/INTLED/Resources/3396501144099718914/ProPoorMEExecSum.pdf (22 April 2014).
Guide to the Outcomes Approach. 2010. Department of Performance Monitoring and Evaluation. Source from: http://www.thepresidency.gov.za/dpme/docs/guideline.pdf (11 August 2011).
Hamill, J. 2009. South Africa under Zuma: restructuring or paralysis? Contemporary Review. University of Leicester. Source from: http://www.questia.com/read/1G1-228266027/south-africa-under-zuma-restructuring-or-paralysis.
Heinrich, C.J. 2002. Outcomes-Based Performance Management in the Public Sector: Implications for Government Accountability and Effectiveness. University of North Carolina-Chapel Hill. Wiley. Source from: http://www.jstor.org/stable/3110329 (12 May 2014).
Implementation Plan for Tirisano: January 2000-December 2004. Source from: http://10.145.145.8:9091/servlet/com.trend.iwss.user.servlet.sendfile?downloadfile=IRES-848587743-D6421F98-26207-25887-15 (14 May 2014).
Jansen, J. & Taylor, N. 2003. Educational Change in South Africa 1994-2003: Case Studies in Large-Scale Education Reform, Vol II. Education Reform and Management Publication Series. The World Bank. Source from: http://www.jet.org.za/publications/research/jansen-and-taylor-world-bank-report.pdf (14 May 2014).
Kravchuk, R.S. & Schack, R.W. 1996. Designing Effective Performance-Measurement Systems under the Government Performance and Results Act of 1993. Public Administration Review, Vol. 56, No. 4. United States of America. Wiley. Source from: http://www.jstor.org/stable/976376 (11 September 2014).
Krueger, R.A. & Casey, M.A. 2009.
Focus Groups: A practical guide for applied research. 4th Edition. United States of America. SAGE.
Kusek, J.Z. & Rist, R.C. 2011. Ten Steps to a Results-Based Monitoring and Evaluation System: A Handbook for Development Practitioners. Washington, D.C. The World Bank. Source from: http://www.oecd.org/derec/worldbank/35281194.pdf (19 August 2011).
Lahey, R. 2010. The Canadian M&E System: Lessons Learned from 30 Years of Development. Washington, DC. The World Bank. Source from: http://siteresources.worldbank.org/INTEVACAPDEV/Resources/ecd wp 23.pdf.
Mackay, K. 2006. Institutionalization of monitoring and evaluation systems to improve Public Sector Management: ECD working paper series No. 15. Washington, DC. The World Bank. Source from: http://siteresources.worldbank.org/INTEVACAPDEV/Resources/4585664-1253899870336/monitoring evaluation psm.pdf (19 August 2011).
Makinde, T. 2005. Problems of Policy Implementation in Developing Nations: The Nigerian Experience. Nigeria. Department of Public Administration, Obafemi Awolowo University, Ile-Ife. Source from: http://www.krepublishers.com/02-Journals/JSS/JSS-11-0-000-000-2005-Web/JSS-11-1-001-090-2005-Abst-PDF/JSS-11-1-063-069-2005-229-Makinde-T/JSS-11-1-063-069-2005-229-Makinde-T-Full-Text.pdf (12 May 2014).
Mothae, L. & Sindane, M. 2007. A Strategy-Focused Leadership for Improving Public Policy Implementation and Promoting Good Governance. Vol 42, no 5. University of the Free State. Journal of Public Administration.
Mouton, J. 2011. How to succeed in your Master's & Doctoral Studies: A South African Guide and Resource Book. Pretoria. Van Schaik.
Muzinda, M. 2007. Monitoring and evaluation practices and challenges of Gaborone-based local NGOs implementing HIV/AIDS projects in Botswana. University of Botswana. Source from: http://www.aau.org/sites/default/files/mmuzinda.pdf (22 April 2014).
Neuman, L. 2006. Social Research Methods: Qualitative and Quantitative Approaches. Boston. Allyn and Bacon.
Ocampo, M.L. 2004. A Brief History of Educational Inequality from Apartheid to the Present. Source from: http://web.stanford.edu/~jbaugh/saw/lizet Education Inequity.html (11 November 2014).
Provincial Growth and Development Strategy (PGDS) 2005-2014. 2005. Free State Province. Source from: http://app.spisys.gov.za/files/pula/topics/3037832/Provincial Strategies/Free State Province/file 84516279.pdf (11 April 2014).
Robbins, S.P. & Decenzo, D.A. 2004. Fundamentals of Management: Essential Concepts and Applications. Chapter 13, 4th Edition. McGraw Hill. New York.
Seemela, P. & Mkhonto, X. 2007. Fundamentals of Good Governance in the Public Sector. Vol 42, no 5. Journal of Public Administration. University of the Free State.
Sindane, M. 2009. Administrative Culture, Accountability and Ethics: Gateways in Search of the Best Public Service. Vol 44, no 3. Journal of Public Administration. University of the Free State.
Steyn, G.M. 2001. The changing principalship in South African schools. Department of Further Teacher Education. Unisa. Source from: http://uir.unisa.ac.za/bitstream/handle/10500/232/ar steyn changing principalship.pdf?sequence=1 (22 April 2014).
South African Constitution, 1996. Source from: http://www.info.gov.za/documents/constitution/1996/a108-96.pdf (23 April 2014).
SAInfo Material. 2013. Education in South Africa. 28 February 2013. Source from: http://www.southafrica.info/about/education/education.htm#.U1edo_mSyVM#ixzz2zhqtpVSN (12 May 2014).
Strauss, A. & Corbin, J. 1990. Basics of qualitative research: Grounded theory procedures and techniques. Newbury Park, CA: Sage.
The Business Dictionary. 2014, online. Source from: http://www.businessdictionary.com/definition/strategic-direction.html#ixzz3LGQODBoH (12 November 2014).
The education system in South Africa. n.d.
Source from: http://www.southafrica.info/about/education/education.htm#.U1edo_mSyVM (23 April 2014)

The Presidency. 2007. Policy Framework for the Government-Wide Monitoring & Evaluation System. (24 April 2014)

The Presidency. 2009. Improving Government Performance: Our Approach. Source from: http://www.thepresidency.gov.za/learning/gov_performance.pdf (19 August 2011)

Thomas, Vinod; Wang, Yan & Fan, Xibo. 2000. Measuring Education Inequality: Gini Coefficients of Education. Washington, D.C.: The World Bank. Source from: https://nicspaull.files.wordpress.com/2011/04/svdb-2007-apartheids-enduring-legacy-inequaliti.pdf (11 November 2014)

Urquhart, C., Lehmann, H. & Myers, M.D. 2010. Putting the theory back into grounded theory: guidelines for grounded theory studies in information systems. Information Systems Journal, vol. 20, pp. 357-381.

Appendix 1: Questionnaire

Each of these questions is illustrated below:

Q1 Have there been any changes in how you manage basic education work, or work contributing to basic education?
(i) (The researcher sought to determine any changes in managers' behaviour and attitudes in how they conduct their work and manage basic education.)
Q1.1 How did you experience the M&E system prior to the introduction of PME in 2010?
Q1.2 What is your experience of the PME system currently (since its implementation in 2010)?
Q1.3 Do you think there has been a change in the PME system over a period of time?
Q1.4 If there was change in the PME system, did it benefit the education system and specifically the OBP approach?

Q2 Has the PME system created a focus on intergovernmental arrangements between the departments' contributions to OBP 1?
(i) (The question inquires as to whether the OBP approach has brought about collaboration towards improving basic education amongst the contributing departments.)
Q3 Has this changed the culture in how you plan and manage basic education as applied in your overall management responsibilities?
(i) (The researcher sought to determine whether there has been any change, improvement or a lack thereof, since the introduction of OBP.)

Q4 Has this approach brought any positive changes to how you do your overall work?
(i) (The question aims to investigate whether participants are implementing the culture of performance monitoring and evaluation in all other management activities in government.)

Q5 How has the investment in the PME system paid off in changing your behaviour in implementing the full delivery chain?
(i) (The researcher sought to determine any changes in managers' attitudes in how they conduct their work and manage basic education.)

Q6 How can the OBP approach be improved?
(i) (The participant was asked this question in order for the researcher to obtain information on how improvements could be made in the OBP approach to address negative experiences.)

Appendix 2: Conceptual categories from open coding

ID, Conceptual Category (No. of occurrences), Example of Unit of Meaning:

1. Strategic Leadership (5): "Still feel there is a lot of requirement for leadership in terms of working together as a province." "Also requires direction from the top. The Office of the Premier needs to take note when the planning cycle takes place to then start facilitating OBP."

2. Strategic Document Merging (2): "Should just not see it (OBP) as one document but align it to existing strategic/management documents, so it forms part of our ways of working."

3. Coordination Strengthening (1): "Premier as the centre of the approach to strengthen their coordination and their facilitation in terms of being consistent."

4. Coordinating Structures (2): "It is not through OBP that such interactions are created, but it is also through the Social Cluster, and it is not necessarily captured in the Outcome-Based plan, but where operational, responsibility managers work together in terms of some of these issues." "Other individual bilaterals are taking place between the different departments on operational matters pertaining to their delivery of similar mandates. The TWG was responsible for monitoring the implementation of OBP."

5. Work Streamlining (3): "Leader managed to streamline the things we do and focus on in the province."

6. Sector Focus (2): "We have managed to create focus in specific sectors."

7. Report Focus (4): "The APP that was produced is better than that of previous years, particularly on quality and credibility. We also emphasise the issue of evidence. When people produce their reports, they need to think clearly as to how they will provide evidence of such information. I think gradually, the OBP approach is getting institutionalised at the bottom."

8. Priority Focus Enhancement (2): "They were forced to focus on critical issues of government because they had to align and to report on them."

9. Integration Challenge (6): "The departments need to talk and this should happen prior to drawing up the departmental plans. We needed to talk to make sure that the inputs are accurately provided in terms of what was needed in the outcomes."

10. Interrelation (6): "They would come together and start to talk about these issues when we called them and not out of their own initiative." "The department is succeeding in using the approach on outcome focus, but externally, through working with other departments to deliver on the same outcome, we are struggling."

11. Middle Man (3): "It is difficult as a lead department to organise the supporting departments to participate in these structures, whilst the Premier's office plays the role of big brother."

12. Complementary Support (1): "With regard to sport, SACR requires assistance in sport development. Education keeps children in schools for learning."

13. Integration Improvement (2): "I like the monitoring part because at the beginning I agree with my down-line staff on what we will be planning to do and deliver in a particular year. It has been beneficial to the department in how the directorate has been planning to create focus on what they deliver in their annual performance plans."

14. Good Relationship (3): "Through the outcomes interaction, we had a good relationship with the Department of Social Development through the development of ECDs, especially getting information on them on pre-Grade Rs."

15. Communication Challenge (2): "Basic things, such as communication between departments contributing to one outcome, were a difficult task."

16. Silo Mentality (1): "Government culture is predominately driven by silo mentality; when OBP 1 was introduced we looked at it alone and did not really try to instil and institutionalise it into also being part of our ways of working."

17. Communication Improvement (2): "Positively, this approach has brought a change in how the departments work, to a certain extent, because somehow departments were forced to talk to one another."

18. Informed Executives (1): "The technical team specify to the executives areas where the focus should be, to be able to see high-level impact."

19. Finance Detachment (1): "Planning divorced from resources halts the improvement of quality basic education."

20. Finance Reduction (1): "Allocation towards the delivery of education expenditure is further being reduced by finance."

21. Budget (2): "Budget allocation in departments is insufficient for the delivery of outputs per financial year, and this can clearly be reflected in the department's 5-year strategic plans."

22. Funding Support (1): "Dedicated funds for departments in which we want to improve services."

23. Planning (5): "Before it was introduced we planned only according to Treasury regulations and guidelines, i.e. development of the APP and the focus on outputs." "Missing is joint planning within the departments, particularly with working on contributions towards one outcome."

24. Future Planning (3): "Onwards from 2011-2013, we tried as the Department of Education to make sure that the indicators are in line with what is planned in the APP." "So far, we have got our plans aligned: from the APP, to the Strategic Plan, OBP and MTSF to the NDP."

25. Capacity (3): "Funding has not capacitated the departments to operate in an outcome manner; planning is short-lived and cannot be focused on in the long term." "This capacity requirement covers issues on training, as well as IT or supporting systems for management." "Since being the only manager in M&E, OBP and MPAT have been dumped on my desk for me to facilitate."

26. Responsibility in OBP (1): "Another point: it would be advisable that the departments (whether leading or supporting) take fuller accountability of OBP. OBP is only for the Dept of the Premier to facilitate, but the accountability lies with the accounting officer (HOD) of that particular department."

27. Innovation (1): "The outcome-based approach brought an initiative which allowed government to think outside the box, but government officials' thinking and how they operate has not changed/shifted."

28. Monitoring (3): "The Dept of the Premier should have a team of 4-5 people dedicated to monitoring commitments made in programmes which are focused on in OBP planning." "Yes, with regard to monitoring and evaluation, as well as to how we do things, I have seen a lot of positive changes."

29. Agreement (0): "The MOU is there to give direction on how they should operate, but they do not respect what is outlined in it; they go beyond it."