Masters Degrees (Computer Science and Informatics)
Browsing Masters Degrees (Computer Science and Informatics) by Issue Date
Item Open Access TCPlot: a network management tool to detect and graphically display faulty TCP conversations (University of the Free State, 1994-10) Kotzé, D. J.; McDonald, T.
Abstract (translated from the Afrikaans): While high-level protocols shield the user from underlying problems in the network, the network manager needs to become aware of latent problems at an early stage. Although tools, in the form of network management packages, exist to assist him with the identification and analysis of network faults, they are often inadequate. Existing network management packages make provision, to some extent, for the analysis of traffic on a network. Where a single connection has to be considered as a whole, however, reports on all packets on the network are regarded as sufficient. The responsibility for analysing these reports remains that of the network manager. This thesis describes the development of a program that makes the monitoring and analysis of a single TCP conversation on an Ethernet network possible. By simplifying the identification of faulty connections and making their graphical representation possible, TCPlot contributes to network management.
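The following is a minimal illustrative sketch, not code from the thesis, of the kind of per-conversation fault analysis TCPlot performs; it assumes one direction of a captured conversation is available as (timestamp, sequence number, payload length) tuples, and the function name and thresholds are invented.

```python
# Hypothetical sketch: flagging suspect behaviour in a single captured TCP conversation.
# A "packet" is assumed to be a (timestamp, seq, payload_len) tuple from one direction.

def find_faults(packets, gap_threshold=2.0):
    """Return (retransmissions, stalls) for one direction of a TCP conversation."""
    seen_segments = set()
    retransmissions = []
    stalls = []
    prev_time = None
    for time, seq, length in packets:
        segment = (seq, length)
        if length > 0 and segment in seen_segments:
            retransmissions.append((time, seq))      # same data sent again
        seen_segments.add(segment)
        if prev_time is not None and time - prev_time > gap_threshold:
            stalls.append((prev_time, time))          # conversation went quiet
        prev_time = time
    return retransmissions, stalls

# Example: a retransmission of seq 100 and a 3-second stall are both reported.
packets = [(0.00, 0, 100), (0.10, 100, 100), (0.35, 100, 100), (3.40, 200, 100)]
print(find_faults(packets))
```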
Item Open Access A comparison of different approaches towards a computerised information system for primary health care in the Free State (University of the Free State, 1996-11) Blignaut, Petrus Johannes; McDonald, T.
English: This study is undertaken in the light of the current importance of the Reconstruction and Development Programme (RDP) and the SA government's commitment to better primary health care (PHC) for everybody. Primary health care services in South Africa should be rendered as effectively and completely as possible with the manpower available. The government should therefore have exact knowledge about the current health situation in the country in order to make pro-active provision for better health services in the areas that need it most. Nursing management should thus have access to periodical reports regarding the incidence of epidemics, certain notifiable diseases, the death rate, general housing conditions and much more. It is therefore of the utmost importance that the service providers should capture and process statistical data accurately. This study firstly analyses the current situation with regard to data capturing, processing, presentation and utilisation. The analysis refers to the manual system of patient-carried records, tally sheets as well as the available infrastructure. Nursing management in the Free State has a long-term vision to implement a database system to service all fixed and mobile clinics. A complete patient record will be kept by the system and the complete clinical history of a patient will be available at each consultation. With such a system all the regular and ad hoc reports can be processed easily and accurately. This study then focuses on the process of computerising primary health care services. Some theoretical background on systems analysis and development is provided and thereafter three alternative approaches towards computerisation are proposed and investigated. For each of these proposals a prototype system was developed. The first prototype is based on a patient record approach and includes a complete set of health indicators as well as other demographic and clinical data. The second prototype is based on a minimum data set that leads to a more user-friendly system. Thirdly, a prototype system that is not based on a patient record but on a head-count approach was developed. This system resembles the current manual system of tally sheets. The three alternatives are compared with regard to the issues of practicality, flexibility, ease of use, accuracy and completeness of statistical reports and efficiency of time utilisation. It is concluded that the flexibility of a patient-record approach, although more time-consuming, is preferred to a head-count approach. Furthermore, the ease of use of the second alternative in a developing country with mostly computer-illiterate nurses makes it a much more feasible approach than a more comprehensive system. Ways in which a computerised system can be implemented in an environment with limited hardware resources are also investigated. The study concludes with a proposed model for the computerisation of primary health care in the Free State.
Item Open Access A GIS for flood damage control planning and estimation of flood damage (University of the Free State, 1998-11) Weepener, Harold Louw; Messerschmidt, H. J.; Viljoen, M. F.
English: Viljoen, Du Plessis and Booysen (1995) started in 1992 with the development of a flood damage simulation model (FLODSIM) for the Lower Orange River area. This model was based on GIS technology and was completed in 1994. The main shortcoming of this model was that it was location specific. A successive project was piloted in 1995 for the modification of the model so that it would be generally applicable in flood-prone areas. Weiss (1976) had already done extensive work on the estimation of flood damage for the Mfolozi floodplain and it was therefore decided to demonstrate the model on the Mfolozi floodplain. A Setup program was written to be able to adapt the model to the different situations of floodplains. The Setup program prompts the user to indicate the features that should be included in the model and then guides the user through a series of menus to define the variables that are required to include the specified features. Features that can be included in the model include a DTM, levees, contours, spot heights, cultivated fields, infrastructure and buildings. Flood damage can be computed for cultivated fields, infrastructure and buildings. Other enhancements of the model include programs for the manipulation of levees, loss functions for sugarcane and infrastructure, and a program that computes the flooded areas. Programs were also written to speed up the process of acquiring new hydraulic data after levees have been added or removed. This includes programs with which topographic data that are required by numerical flood models can be extracted from the DTM and programs with which the hydraulic data that were computed with the numerical flood model can be imported into FLODSIM. An interface with Mike 11 was also developed to illustrate the coupling between FLODSIM and numerical flood models.
In a literature study that was conducted to investigate different methods for obtaining data, it was found that data can either be acquired in digital form from another company or it has to be digitised from maps. When no data are available for the area, it may be collected in situ or by means of remote sensing. Remote sensing can also be of great value in updating data such as land use patterns that change over time. The sources that are used to derive DEM data vary from ground surveys, photogrammetry and existing contour maps to radar or laser altimetry.
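As a rough illustration of the flooded-area and loss-function calculations mentioned above (not FLODSIM code), the sketch below intersects a small invented DTM with a flood stage and applies a made-up depth-damage curve.

```python
import numpy as np

# Illustrative sketch only: flooded area and a depth-based crop loss from a DTM.
# Grid values, cell size, crop value and the loss curve are invented for the example.
dtm = np.array([[101.0, 100.5, 100.2],
                [100.8, 100.1,  99.7],
                [100.4,  99.9,  99.5]])      # ground elevation (m)
water_level = 100.3                           # simulated flood stage (m)
cell_area = 25.0 * 25.0                       # m^2 per DTM cell (assumed)

depth = np.clip(water_level - dtm, 0.0, None) # inundation depth per cell
flooded_area = np.count_nonzero(depth > 0) * cell_area

def crop_loss_fraction(d):
    """Hypothetical depth-damage (loss) function: full loss above 1 m of water."""
    return np.clip(d / 1.0, 0.0, 1.0)

value_per_cell = 1500.0                       # assumed crop value per cell
damage = float(np.sum(crop_loss_fraction(depth) * value_per_cell))
print(f"flooded area: {flooded_area:.0f} m^2, estimated damage: {damage:.2f}")
```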
Item Open Access The development of a virtual reality simulator for certain gastrointestinal endoscopic procedures (University of the Free State, 1999-05) Marais, Charles Claude; Tolmie, C. J.
English: A Virtual Reality Gastrointestinal Trainer/Simulator has been developed to enable the simulation of gastrointestinal procedures on a personal computer (PC). Virtual Reality (VR) techniques are used in the construction of this computer-based system to enable the user to practice basic identification, navigational and therapeutic skills. The system consists of a computer-based simulator, a 3-dimensional (3-D) tracker, an endoscope/endocamera and a life-size gastrointestinal model. A normal endoscope/endocamera is used with a hollow transparent life-size gastrointestinal model to provide maximum realism. The computer-based simulator contains a virtual 3-D model of the relevant gastrointestinal organ. Currently the stomach, esophagus and entry to the duodenum (upper G.I. region) are focused on. The position and orientation of the front tip of the endoscope/endocamera are tracked with the 3-D tracker. This data is relayed to the computer, which then calculates and displays the appropriate image on the computer screen as realistically as possible. The calculated image closely resembles the image which would be seen with a real endoscope/endocamera in a real patient. The image is continually updated in accordance with the movement of the endoscope/endocamera and the properties of the gastrointestinal model. Some of the main problems that had to be addressed during the development of the system are: obtaining a 3-D computer graphic model of the stomach with the same shape, size, colour and texture as a real stomach; the simulation of abnormal conditions like ulcers, and how they can be placed inside the 3-D computer graphic model; the simulation of therapeutic tools, like biopsy forceps; the implementation of realistic, but cheap, force feedback; and the deforming of the 3-D computer graphic model when the user touches the inside of the stomach with the tip of the endoscope/endocamera. The system is ideal for teaching, training, simulation, patient briefings and research. In this thesis the virtual reality system, its development and operation are described in detail.

Item Open Access 3D visualization of data from groundwater flow and transport models (University of the Free State, 2000-11) Bekker, Meintjes; Messerschmidt, H. J.; Chiang, W. H.
English: Groundwater flow and transport models produce large amounts of data, which the human brain cannot possibly grasp. Taking advantage of the natural abilities of the human vision system, 3D visualization is often the tool of choice for understanding and communicating conceptual models, verifying model input, understanding model output, explaining and communicating conclusions and recommendations, and motivating expenses.
A 3D visualization tool has therefore been developed for intelligence amplification of model data. The tool is based on a groundwater modeling system (Processing MODFLOW) and makes use of the results from existing groundwater flow (MODFLOW) and transport models (MT3DMS, PHT3D and RT3D). The Visualization Toolkit (vtk), a C++ class library for visualization, was used to render 3D geohydrological objects. Realistic scenes of 3D geospatial models and 3D distributions of geohydrological properties, such as hydraulic conductivity, heads and solute concentrations, can be rendered. The advantages of 3D visualization are made evident by applying the visualization tool to case studies.
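A minimal sketch of rendering a scalar field on a structured grid with the Visualization Toolkit, here via its Python bindings rather than the C++ library used in the thesis; the grid dimensions, the fake head values and all other parameters are invented for illustration only.

```python
import vtk  # Python bindings of the Visualization Toolkit

# Illustrative sketch, not the tool described above: colour a small structured grid
# by a scalar field standing in for hydraulic heads.
nx, ny, nz = 10, 10, 3
grid = vtk.vtkStructuredGrid()
grid.SetDimensions(nx, ny, nz)

points = vtk.vtkPoints()
heads = vtk.vtkDoubleArray()
heads.SetName("head")
for k in range(nz):
    for j in range(ny):
        for i in range(nx):              # i varies fastest, as vtkStructuredGrid expects
            points.InsertNextPoint(i * 100.0, j * 100.0, k * -10.0)
            heads.InsertNextValue(50.0 - 0.5 * i - 2.0 * k)   # invented head surface
grid.SetPoints(points)
grid.GetPointData().SetScalars(heads)

mapper = vtk.vtkDataSetMapper()
mapper.SetInputData(grid)
mapper.SetScalarRange(heads.GetRange()[0], heads.GetRange()[1])
actor = vtk.vtkActor()
actor.SetMapper(mapper)

renderer = vtk.vtkRenderer()
renderer.AddActor(actor)
window = vtk.vtkRenderWindow()
window.AddRenderer(renderer)
interactor = vtk.vtkRenderWindowInteractor()
interactor.SetRenderWindow(window)
window.Render()
interactor.Start()
```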
Item Open Access A comparative study on users' responses to graphics, text and language in a word processor interface (University of the Free State, 2006) Beelders, René Tanya; Blignaut, P. J.; McDonald, T.
English: The word processor, or some form of editor-based application, has become an integral tool for the many people who rely on computers on a daily basis. As such it has a wide and varied user base and must cater for a very diverse user group. Due to the heavy reliance on the word processor it is essential that it delivers pleasurable and efficient interaction to its users. Since its inception, the word processor has displayed the ability to evolve to continually exploit the increasing capabilities of technology. This study focused on furthering the improvement of word processor usability for a subset of South African word processor users. Specifically, it concentrated on the impact of graphics, text and language on the usability of a word processor. Graphics were incorporated into the interface by means of the icons currently found in the Microsoft Office package, which have been accepted as the industry standard, and the development of an alternative set of icons whose usability could be compared to that of the standard icons. Text was included in the interfaces in the form of menus and tooltips as well as text buttons which replaced the afore-mentioned pictorial icons and contained no graphical depiction of the associated function. The impact of language on the usability of a word processor was viewed strictly in terms of bilingual users and was achieved through translation of the text buttons, menus and tooltips into the predominant languages of the area. Comparative user testing was conducted through implementation of a scaled-down word processor application which could accommodate interchangeable interfaces and easy administration of preset tasks. Representative users were then required to complete a series of tasks on their respective pre-assigned interface, which conformed to one of the following general interface configurations: (a) an interface using either set of pictorial icons and excluding both menus and tooltips, thus containing no language component; (b) an interface in their first language, achieved through use of the text buttons, menus and/or tooltips; or (c) an English interface, where English was not their first language. A set of usability measures was identified which allowed the effectiveness, efficiency and satisfaction of the users to be compared between the different user interface configurations. These measurements were: (a) the score achieved for the test, based on a built-in difficulty index assigned to each task; (b) the satisfaction experienced during interaction with the application; and (c) for each task, the time, number of actions, number of errors and ratio of correct and incorrect answers. Analysis of the user testing found that no particular interface configuration exhibited increased efficiency, effectiveness, learnability or satisfaction and that users were able to adapt to a changed interface with ease once they had become accustomed to the word processor environment. Therefore, the final finding of the study was that provision of an interface in a bilingual user's first language neither significantly contributed to nor detracted from the application's usability. Similarly, neither of the pictorial icon sets nor the text buttons exhibited a significantly heightened level of usability. Therefore, none of the interface configurations could be recommended as the most usable. However, a number of recommendations concerning the usability of a word processor were proposed based on both the analysis of the tasks and observation of user interaction. Finally, based on user performance for each individual task, an icon was identified which appeared to be the best and most applicable for that function. The final recommended interface, the usability of which must still be empirically established, consisted of a combination of standard icons, alternative icons and text buttons.
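Purely as an illustration of how the per-task measurements listed above could be rolled up into a difficulty-weighted score, with invented task data and an assumed weighting rule:

```python
# Hypothetical sketch: aggregating per-task usability measures into a test score.
tasks = [
    # (difficulty_index, answered_correctly, time_in_seconds, actions, errors)
    (1, True,  35.0,  6, 0),
    (3, False, 90.0, 14, 3),
    (2, True,  48.0,  9, 1),
]

score     = sum(d for d, ok, _, _, _ in tasks if ok)     # difficulty-weighted score
max_score = sum(d for d, _, _, _, _ in tasks)
ratio_ok  = sum(ok for _, ok, _, _, _ in tasks) / len(tasks)
mean_time = sum(t for _, _, t, _, _ in tasks) / len(tasks)
print(f"score {score}/{max_score}, correct ratio {ratio_ok:.2f}, mean time {mean_time:.1f}s")
```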
Item Open Access A study to determine if experience with mouse-orientated computer games enhances the value that a user draws from an office package in a GUI environment (University of the Free State, 2006-05) Nel, Wynand; Blignaut, P. J.
English: Computer use is transforming the lives of many South Africans and is fast changing the way organisations communicate and do business. It also means that thousands of people in South Africa, from different cultures, races and age groups, are coming into contact with and using computers, either at home, at school or university, at the office and even in shopping malls. In order for a user to become computer literate he/she needs to know how to use the computer application effectively. This can only be achieved if he/she knows, inter alia, how to use the computer mouse as an input device. It has been noticed that many previously disadvantaged students (this includes all people that were discriminated against according to race, and includes all black and coloured people) have no idea of how to use a computer mouse. Even after they have been shown how to hold and move the mouse, many of them still struggle for some time to use the mouse effectively. They find it difficult to master the movement of the mouse cursor and they struggle to click the mouse buttons. Such a user may fall behind the rest of the students in a computer literacy class and often hinders the progress of the class as the lecturer has to give special attention to the struggling individual. The main focus of this study was to determine how long it takes a person to learn how to use a computer mouse effectively, and also, specifically in terms of mouse skills, whether mouse-orientated computer games enhance the value that a user draws from an office package in a graphical user interface environment. The study was done in two phases. In phase one the students played six mouse-orientated computer games. Three questions were investigated in this phase: Does race play a significant role? Do difficulty levels play a significant role? Does computer use frequency play a significant role?
Phase two of the study focused on only three of the computer games used in phase one, and also on Microsoft Word tests. Six questions were investigated in this phase: Is there a difference between the average total completion times for the two MS Word tests within a session? Is there a difference in the average total completion times between the different sessions? Is there a difference between the average total marks for the MS Word tests in any session? Is there a difference in the average total marks between the different sessions? Does the student's score in any of the three games remain constant through different attempts and sessions? Is there a correlation between the score that a user obtains in one of the games and the total completion time for the MS Word test in the different sessions? Various statistical tests were done on the captured data to answer the above questions. The tests included the analysis of variance (ANOVA), Tukey's test for honestly significant differences and Spearman's correlation. This study proved that playing mouse-orientated computer games improves a user's fine motor skills and enhances his/her computer mouse hand-eye coordination. Furthermore, it proved that three of the six mouse-orientated computer games enhanced the value that the users drew from the MS Word tests. A positive relationship between the scores of the games and the completion time of the MS Word tests was found, indicating that a high score in the games corresponded with a short completion time in the MS Word tests. The games provide a relaxed and enjoyable environment for users to improve their computer mouse skills, and users are able to gain more value from an office package within a short time.
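The statistical tests named above are all available in standard Python libraries; a small sketch with invented game-score and completion-time data:

```python
# Illustrative sketch of the tests named above (ANOVA, Tukey's HSD, Spearman);
# the data are invented for the example.
import numpy as np
from scipy.stats import f_oneway, spearmanr
from statsmodels.stats.multicomp import pairwise_tukeyhsd

session1 = np.array([520, 480, 610, 545, 590])   # MS Word completion times (s)
session2 = np.array([470, 455, 560, 500, 530])
session3 = np.array([430, 420, 510, 470, 495])

# One-way ANOVA across sessions, followed by Tukey's HSD for pairwise differences
print(f_oneway(session1, session2, session3))
times  = np.concatenate([session1, session2, session3])
groups = ["s1"] * 5 + ["s2"] * 5 + ["s3"] * 5
print(pairwise_tukeyhsd(times, groups))

# Spearman correlation between game scores and completion times
game_scores = np.array([1200, 1500, 900, 1100, 1000])
print(spearmanr(game_scores, session1))
```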
Item Open Access Some psychological and biographical predictors of computer proficiency: an analysis of the potential of a novice to become a good computer user (University of the Free State, 2006-08-22) Burger, Andries Johannes; Blignaut, P. J.; Huysamen, G. K.
English: As a result of the proliferation of computers throughout the business world, more and more demands are placed on workers to develop computer skills. There are a variety of training methods by means of which workers can obtain these much-needed skills. It is nevertheless true that with identical training methods, it is quite likely that different people will end up with different computer abilities. It was thus the primary objective of this study to investigate the role that certain biographical, psychological and cognitive variables play in the prediction of computer proficiency. The variables that were included as possible predictors were personality type, learning style, general anxiety, three-dimensional perceptual ability (spatial 3D), numerical ability, computer attitude, grade 12 final examination mark and mathematical ability. The secondary objective of this study was to determine whether computer attitude and its three components (computer anxiety, computer liking and computer confidence) were influenced by computer experience. Culture was taken into account as a moderator variable in both the primary and secondary studies. To ensure that all the research participants were on the same level of computer literacy, only students enrolled for the basic computer literacy course at the University of the Free State were used in the study.
Because the research was used to develop predictor formulas for computer proficiency, the research participants were tested early in February 2003, before the introductory computer literacy course commenced. This was to ensure that the participants' attitudes, abilities and feelings regarding computers were assessed prior to their exposure to computers. The only test that was repeated (on the same students) towards the end of the semester course was the so-called Computer Attitude Scale (CAS). Apart from measuring a person's attitude towards computers, the test also contains sub-tests that measure computer anxiety, computer liking and computer confidence. The researcher needed these retest scores to determine whether users' computer attitude, as well as the three mentioned components, had changed as more computer experience was gained. The primary study resulted in the formulation of two formulas which can be used to predict the computer proficiency of white and black students enrolled for an introductory computer literacy course. The prediction formula for the white students is made up of six variables: grade 12 final examination mark, computer confidence, the learning modes of abstract conceptualisation (AC) and concrete experience (CE), mathematical ability and the conscientiousness (C) domain of personality. The prediction formula for the black students is also made up of six variables: spatial 3D, the L, Q3 and Q4 scores of the IPAT Anxiety Scale, computer confidence and the learning mode of abstract conceptualisation (AC). It was thus found that different variables predict the computer proficiency of white and black students. The only variables that are shared by both formulas are computer confidence and the learning mode of abstract conceptualisation (AC). In contrast with previous research on the topic, a negative relationship between computer attitude and computer experience was found in the secondary study. The statistical results indicated that as the students gained more experience with computers their computer confidence and computer liking decreased while their computer anxiety increased. As these three constructs are the components of computer attitude, it was not surprising that computer attitude also decreased. Computers play an integral role in the lives of many individuals and therefore the improvement of computer skills is a continuous and important process. This study provided valuable inputs by identifying predictors of computer proficiency for students enrolled in an introductory computer literacy course.
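A prediction formula of the kind described above is in essence a linear regression equation; the sketch below fits one by least squares using invented data and only three of the named predictors, purely for illustration.

```python
# Illustrative sketch only: fitting a linear prediction formula for computer proficiency.
import numpy as np

# columns: grade 12 mark (%), computer confidence score, abstract conceptualisation score
X = np.array([[72, 38, 25],
              [65, 30, 20],
              [80, 42, 28],
              [58, 25, 18],
              [90, 45, 31]], dtype=float)
y = np.array([68, 55, 75, 50, 83], dtype=float)   # computer proficiency mark

# least-squares fit of y = b0 + b1*x1 + b2*x2 + b3*x3
A = np.column_stack([np.ones(len(X)), X])
coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
print("prediction formula coefficients:", np.round(coeffs, 3))
print("predicted mark for the first student:", A[0] @ coeffs)
```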
Item Open Access A comparative study to determine the optimum e-assessment paradigm for testing users' word processing skills (University of the Free State, 2008) Strauss, Hermanus Johannes; Blignaut, P. J.; Du Toit, E. R.
English: In recent times, people have become more and more reliant on computers on a daily basis. As a result, the need has arisen to optimise the task-related experience in terms of time efficiency, which demands effective training in software skills. To be more specific, word processing skills are currently considered essential in any field of work and are in high demand. This study focuses on determining the optimal paradigm (methods) to assess users' word processing skills.
One of the main reasons for this research was the fact that students at the University of the Free State (UFS) reported to the computer literacy course lecturer that they were dissatisfied with the virtual, simulated MS Word software environment used to assess (e-assess) their word processing skills electronically. This existing test system (ETS) at the UFS requires students to perform certain tasks and automatically checks whether the required end-result is obtained. However, this system is based on a simulated interface with limited functionality. As a result, the relevant information on software e-assessment systems was researched and a new software skills e-assessment application was developed accordingly. The aim was to develop a tool that would be able to assess students' word processing skills in the most reliable way possible. Another aim was to find methods of stimulating the learning process during the e-assessment of word processing skills. The newly developed e-assessment system, WordAssessor, is therefore based on the real MS Word environment. It requires students to perform certain tasks and automatically checks whether the said tasks have been performed correctly. WordAssessor allows students to explore the MS Word interface fully while being assessed. It even allows students to use trial and error to solve certain problems (tasks). To potentially stimulate learning further during e-assessment, WordAssessor presents students with a video solution for the questions they answered incorrectly, directly after the test. In order to assess the validity of the e-assessment methods employed by the WordAssessor system, comparative user testing was conducted. Students' word processing skills were assessed as part of their advanced computer literacy course, using the ETS, the WordAssessor system and a personalised test scenario (where no e-assessment tool was used). In addition, participating students were provided with a questionnaire to determine their reaction and preference with regard to the various elements of the assessment methods. By analysing the results in detail, it was found that the results of the personalised test scenario (PT) yielded the most reliable indication of students' true word processing skills, and could be used as a benchmark. Thereafter, the results of the WordAssessor test scenario were analysed to determine the correlation (relationship) with the results of the personalised test. The same type of correlation was performed between the results of the ETS and the PT. It was established that the WordAssessor results correlated significantly more closely with the PT results than the ETS results did. In the end (and after additional analysis) it was found that the methods employed by WordAssessor yielded the most reliable indication of students' true word processing skills when compared to the ETS. In addition, from the results of the post-assessment questionnaire, it was determined that students felt they learned more as a result of the video tutorials. Furthermore, they stated that they preferred video feedback over text- or paper-based feedback. They also stated that they preferred being assessed in a real software environment, as opposed to a simulation. It was recommended that a more flexible and realistic e-assessment approach (as demonstrated by the WordAssessor system) could be beneficial to students on several levels. Also, it was recommended that students be able to answer test questions in a way that suits them rather than being instructed as to the method of use. Finally, the use of highly detailed video tutorials directly following e-assessment (only for questions students answered incorrectly) was recommended.
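As an illustration of the kind of automatic end-result checking WordAssessor performs against the real document (this is not WordAssessor's code), the sketch below marks one invented rule in a submitted DOCX file using python-docx; the file name and the rule ("the heading must be bold and centred") are assumptions.

```python
# Hypothetical sketch: automatically checking a required end-result in a real .docx file.
from docx import Document
from docx.enum.text import WD_ALIGN_PARAGRAPH

def mark_heading_task(path, heading_text="Annual Report"):
    doc = Document(path)
    for para in doc.paragraphs:
        if para.text.strip() == heading_text:
            # run.bold is None when inherited from the style; treated as not bold here
            is_bold = all(run.bold for run in para.runs if run.text.strip())
            is_centred = para.alignment == WD_ALIGN_PARAGRAPH.CENTER
            return 1 if (is_bold and is_centred) else 0
    return 0   # the required heading was never typed

print(mark_heading_task("student_submission.docx"))
```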
Item Open Access The feasibility of an effective data warehousing solution for a tertiary institution (University of the Free State, 2008) Nazir, Amer Bin; McDonald, Theo
English: Even though industry in South Africa has utilised data warehousing technologies successfully for a number of years, tertiary institutions have lagged behind. This can in part be attributed to the high costs involved, many failures in the past and the fact that the decision makers of these institutions are unaware of what data warehousing is and the advantages it can bring. Several factors, however, are forcing tertiary institutions in the direction of data warehousing. They need all the help they can get to make this process as easy as possible. Most of the tertiary institutions that still survive today came through periods of tough rationalisations and mergers. In order to stay alive and competitive, they have grown through the years and have developed into large businesses in and of themselves. On the one hand they had to make ends meet with subsidies from government that became less and less, and on the other hand they had to provide more and more detailed statistics to the government. This change has resulted in a more business-like management of these institutions. Strategic decision making has now become of the utmost importance to tertiary institutions to meet the frequent changes in the government funding structure. The University of the Free State initially tried to accomplish this with an online transaction processing system developed in-house. These systems, however, are designed to optimise transactional processing, and the features which increase the efficiency of these systems are generally those which also make it difficult to extract information. When that did not work, a new online transaction processing system was bought from an international company at a huge cost. During the course of data transfer from the old to the new system (with a different database design), numerous data conversion errors generated anomalies and a lack of integrity in the database. The new system also proved inadequate to provide the necessary statistics required by the Department of Education.
A system was subsequently purchased that utilised ASCII files prepared by the online transaction processing system and generated fixed reports according to the Department of Education requirements. This system provided a workable solution, but with changes in requirements, new reports need to be developed continuously. It was also worthless for institutional planning and forecasting. This study reports the advantages and disadvantages of the current systems in use to provide statistics to the Department of Education. It then proposes a new system based on data warehousing principles. The dimensional star schema design for a data warehouse is provided. The methods used to transfer, load and extract data are discussed in detail. The data warehouse solution is then compared to the current solutions. The conclusion is that a data warehouse is a feasible solution for the strategic information problems tertiary institutions are facing today. An effective management information system using data warehousing can be developed in-house on a low budget, institutional data can be fitted into dimensional modelling star schemas, and error-free data can be provided to end-users by developing proper extraction, transformation and loading packages. The data surfaced to end-users from relational online analytical processing can provide statistics to government and can be used for general planning and forecasting purposes.
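The dimensional star schema referred to above can be illustrated with a toy example; the dimensions, measures and query below are invented and built in SQLite purely to show the shape of such a design, not the schema proposed in the study.

```python
# Illustrative sketch only: a miniature dimensional star schema in SQLite.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_student (student_key INTEGER PRIMARY KEY, gender TEXT, race TEXT);
CREATE TABLE dim_time    (time_key    INTEGER PRIMARY KEY, year INTEGER, semester INTEGER);
CREATE TABLE dim_course  (course_key  INTEGER PRIMARY KEY, faculty TEXT, course_code TEXT);
CREATE TABLE fact_enrolment (
    student_key INTEGER REFERENCES dim_student(student_key),
    time_key    INTEGER REFERENCES dim_time(time_key),
    course_key  INTEGER REFERENCES dim_course(course_key),
    credits     REAL,
    passed      INTEGER
);
""")
conn.execute("INSERT INTO dim_student VALUES (1, 'F', 'African')")
conn.execute("INSERT INTO dim_time VALUES (1, 2007, 1)")
conn.execute("INSERT INTO dim_course VALUES (1, 'Science', 'CSIS1614')")
conn.execute("INSERT INTO fact_enrolment VALUES (1, 1, 1, 16.0, 1)")

# A typical report: pass rate per faculty per year, straight off the star schema.
for row in conn.execute("""
    SELECT c.faculty, t.year, AVG(f.passed)
    FROM fact_enrolment f
    JOIN dim_course c ON f.course_key = c.course_key
    JOIN dim_time   t ON f.time_key   = t.time_key
    GROUP BY c.faculty, t.year"""):
    print(row)
```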
Item Open Access Using mobile learning applications to encourage active classroom participation: technical and pedagogical considerations (University of the Free State, 2011-05) Khomokhoana, Pakiso Joseph; Nel, L.
English: Higher education institutions are experiencing burgeoning growth in student enrolment. The subsequent increase in undergraduate class sizes means that the needs of individual students are no longer effectively addressed. Students are also less likely to actively participate in these large classes. There is a high probability that such students are less likely to be successful in their studies. In order to support the learning needs of the student population, there are various strategies and tools that can be used to encourage active classroom participation. This study investigated how mobile learning applications can be used to encourage active participation in large undergraduate Computer Science classes. The study identified the four main teaching and learning challenges that are experienced by lecturers and students in large undergraduate courses: a lack of resources, facilitation of student assessment and feedback, pressure to increase student throughput and the academic under-preparedness of students. In this study, the researcher established that it is not easy to address these challenges if a traditional teacher-centred approach is used. The main reason is that this approach is ineffective in supporting the construction of conceptual understanding by students. Upon consideration of various teaching and learning issues, a student-centred approach was identified as a more promising approach for quality teaching and successful learning in the 21st century. In a teaching and learning environment where a student-centred approach is practised, active classroom participation was identified as one viable solution that has the potential to lower the intensity of the four stated challenges. The researcher demonstrated how active classroom participation could mitigate the effects of these challenges.
Some of the active participation strategies identified from contemporary literature were also implemented by the lecturer in her classes. On realising that it is not easy to implement active classroom participation strategies, especially in large classes, the researcher opted for applications that could automate some of these strategies. He specifically decided to use mobile learning applications because, in this era, most students own cellular phones. The researcher believed that the existing applications could not help him to address the research questions and objectives of this study. He therefore opted for a custom-developed application, called MobiLearn. The technical and pedagogical usability of this application were then evaluated in terms of metrics established from the literature. Technical usability was evaluated in terms of 12 metrics and pedagogical usability was evaluated in terms of nine metrics. The study employed a mixed methods design, and the approach was mainly qualitative with some quantitative enhancements. Data was collected through focus group discussions held with voluntary participants from the selected population; a questionnaire survey; usage data extracted from the application; a face-to-face interview with the lecturer who used the MobiLearn application in her classes; as well as class attendance records. Qualitative data was analysed according to qualitative content analysis principles, while quantitative data was analysed by means of statistical analysis. The application was evaluated as both technically and pedagogically usable. It was also shown to have the potential to encourage active classroom participation for students who use it. Some students indicated that they experienced technical problems in accessing the MobiLearn application. They indicated that they were not motivated to use the application. To address the last (third) objective of this study, namely to mitigate problems such as those experienced by MobiLearn users, the study compiled a set of technical and pedagogical guidelines for best practices in the use of mobile learning applications to encourage active participation in similar contexts.
Item Open Access Assessing a brain-computer interface by evoking the auditory cortex through binaural beat (University of the Free State, 2013-01) Potgieter, Louwrens; De Wet, Lizette; Schall, Robert
English: Why can some people study, read books, and work while listening to music or with noise in the background, while other people simply cannot? This was the question that prompted this research study. The aim of this project was to assess the impact of binaural beats on participants during the performance of a task. The participants were exposed to different binaural beats that changed the dominant brainwaves while they were engaging in the task. A brain-computer interface was used to monitor the performance of the task, in which a Lego Mindstorm robot was controlled as it moved through a course. To accomplish the aim of the project, the effects of binaural tones on participants' task performance were investigated in relation to participants' levels of frustration, excitement, engagement, meditation and performance. Participants were monitored by means of an Emotiv EPOC neuroheadset. Although previous studies on binaural beats have been done, most of these studies were done on children with attention deficit-hyperactivity disorder (ADHD), with users performing everyday tasks. In these studies, time was the only metric used.
The researcher collected data by means of questionnaires that were completed by the participants to obtain personal information and measure the user experience. The aspects of frustration, excitement, engagement, meditation and performance were determined using the Emotiv headset in combination with the Emotiv software development kit, Microsoft Robotics Studio and software created by the researcher. After intensive statistical analysis, the researcher found that different sound frequencies did indeed affect user performance. Sessions where no sound frequency was applied were associated with more errors and longer time durations compared with all other frequencies. It can be concluded that invoking a participant's dominant brainwave by means of binaural tones can change his/her state of mind. This in turn can affect the long-term excitement, short-term excitement, engagement, meditation, frustration or performance of a participant while performing a task. Much remains to be learned, in particular regarding the combination of brain-computer interfaces and human-computer interaction. The possibility of new cutting-edge technologies that could provide a platform for further in-depth research is an exciting prospect.
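A binaural-beat stimulus of the kind used in the study is simply two pure tones whose frequencies differ by the desired beat frequency, one per ear. The sketch below writes such a stimulus to a stereo WAV file; the 400 Hz carrier, the 10 Hz beat (roughly the alpha band), the duration and the amplitude are all assumed for illustration and are not the study's actual stimuli.

```python
# Illustrative sketch: generate a 10 Hz binaural beat (400 Hz left ear, 410 Hz right ear).
import numpy as np
import wave

rate, seconds = 44100, 5
t = np.arange(rate * seconds) / rate
left  = np.sin(2 * np.pi * 400 * t)          # carrier for the left ear
right = np.sin(2 * np.pi * 410 * t)          # carrier offset by the beat frequency
stereo = np.empty(2 * left.size, dtype=np.int16)
stereo[0::2] = (left  * 32767 * 0.3).astype(np.int16)   # interleave L/R samples
stereo[1::2] = (right * 32767 * 0.3).astype(np.int16)

with wave.open("binaural_10hz.wav", "wb") as f:
    f.setnchannels(2)
    f.setsampwidth(2)          # 16-bit samples
    f.setframerate(rate)
    f.writeframes(stereo.tobytes())
```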
Item Open Access Comparing brain-computer interfaces across varying technology access levels (University of the Free State, 2014) Dollman, Gavin John; De Wet, L.; Beelders, T. R.
English: A brain-computer interface (BCI) is a device that uses neurophysiological signals measured from the brain to activate external machinery. BCIs have traditionally been used to enhance the standard of living of severely disabled patients. This has resulted in a shortage of data on how BCIs perform with able-bodied individuals. There has recently (2012) been a trend towards BCI research involving able-bodied users, but these studies are still too few to make a substantial impact. Additionally, traditional input methods are being replaced or supplemented by alternative natural modes of interaction, and these natural interactions have become known as natural user interfaces (NUIs). To investigate the suitability of a BCI as a NUI, this study used the Emotiv headset to provide direct measurement of a participant's performance while performing tasks similar to wheelchair manipulation, in order to determine whether a participant's access to traditional input methods influences their performance. Thus, the main aim of this study was to investigate the usability of an Emotiv for robot navigation. Additionally, the study aimed to discover whether a user's performance differed when using a keyboard compared to the Emotiv, as well as whether a user's performance improved in the short term through repetitive use of the Emotiv. In order to compare the usability of the Emotiv to a keyboard, the participants were placed into groups based on their exposure to traditional input methods. This was verified based on their individual expertise rating, which was a measure of frequency and length of use. The test instrument used consisted of a written program that navigated a pair of Mindstorm NXT robots across a custom-designed test course. Data was collected via usability testing which measured learnability, efficiency and effectiveness. Efficiency was measured as the time taken to complete a task, while effectiveness was a measure of the errors made by a participant when completing a task.
Results indicated that there was no significant difference between the groups' efficiency and effectiveness when using the Emotiv to complete a task. Thus, a user's previous experience with a traditional input method does not influence a user's performance with an Emotiv when navigating a robot. This result indicates that the interface is intuitive to use and, therefore, that the Emotiv could be suitable as a NUI. The results for the usability metrics efficiency and effectiveness indicated that there was a significant difference between performance with the Emotiv and with a keyboard. The results show that, with the Emotiv, participants took more time to complete a task and made more errors when compared to a keyboard. This discrepancy was attributed to cognitive theory, as it is believed that the participants violated their preformed schema, which affected their performance. However, the participants quickly became comfortable with the Emotiv, which supports the evidence that the interface is intuitive to use. No significant improvement was detected for either of the usability metrics, efficiency or effectiveness, with repetitive use of the Emotiv. Thus, repetitive use of the Emotiv to navigate a robot does not improve a user's performance over a short period of time. These results indicate that in terms of efficiency and effectiveness the keyboard is the superior interface. The results also revealed that a participant's performance is not affected by their exposure to traditional input methods when utilising a BCI. Thus, the Emotiv is intuitive to use and appears suitable for use as a NUI. This study proved that the Emotiv is an intuitive interface and can be used with little to no previous experience.
Item Open Access Comparing the sensor glove and questionnaire as measures of computer anxiety (University of the Free State, 2014-01) Nkalai, Tlholohelo Stephania; De Wet, L.
English: A vast amount of literature regarding computer anxiety exists. Consequently, a number of researchers have formulated different definitions of computer anxiety. Regardless of the numerous definitions, several researchers agree that computer anxiety involves emotional 'fear' or 'apprehension' when interacting or anticipating interaction with computers. Subsequently, some individuals who experience computer anxiety avoid using computers. This situation is undesirable because these days it is almost always a necessity for people to use computers in the workplace. It is therefore important to extensively investigate computer anxiety, including measures which can be implemented to mitigate it. Different findings about computer anxiety exist regarding the correlates gender, age, computer ownership, educational attainment and computer experience. For example, while some research findings state that females experience higher levels of computer anxiety than males, other research findings assert that males experience computer anxiety more than females. The contradictory findings regarding the correlates of computer anxiety could be attributed to the fact that most of the research studies which investigated computer anxiety relied solely on existing computer anxiety questionnaires. Using questionnaires exclusively poses various limitations, which include relying on the 'subjective' responses of the participants. This research study incorporated another measurement of computer anxiety in addition to an existing computer anxiety questionnaire named the Computer Anxiety Rating Scale.
This additional measurement was performed using an instrument that measured physiological signals of a participant. The instrument is called an Emotion RECognition system (EREC). It measures skin temperature, skin resistance and heart rate. Apart from the two mentioned, other data collection methods were used, namely pre-test and post-test self-developed questionnaires, observations and interviews. With various measurements incorporated in this study, computer anxiety was investigated taking into consideration the following research questions: To what extent does a sensor glove add value in measuring computer anxiety during usability testing when compared to anxiety questionnaires and observations? To what extent is computer anxiety influenced by age, gender, computer experience, educational attainment, and ownership of a personal computer according to the anxiety questionnaire and the sensor glove? From the findings of the study in relation to the first research question, it can be concluded that the sensor glove does not add value. Instead, the sensor glove may add value when measuring stress. This means that although the EREC sensor glove measures skin conductance, changes in skin conductance may indicate changes in stress levels rather than anxiety levels. Regarding the second research question, it can be concluded that computer anxiety was not influenced by age, gender, computer experience, educational attainment, or ownership of a personal computer according to the anxiety questionnaire and the sensor glove.
Item Open Access Assessing the use of a Brain-Computer Interface (BCI) in mathematics education: the case of a cognitive game (University of the Free State, 2015) Verkijika, Silas Formunyuy; De Wet, Lizette
English: South Africa currently faces a huge shortage of mathematics skills, a problem commonly referred to as the "math crisis". Researchers in South Africa have attributed the growing "math crisis" to the lack of cognitive functions among learners. However, existing solutions to the problem have overlooked the role of cognitive functions in improving mathematics aptitude. Moreover, even though cognitive functions have been widely established to have a significant influence on mathematics performance, there is surprisingly little research on how to enhance cognitive functions (Witt, 2011). Consequently, the primary objective of this study was to explore the impact of a BCI-based mathematics educational game as a tool for facilitating the development of cognitive functions that enhance mathematics skills in children. The choice of a BCI-based solution for enhancing cognitive functions stems from recent neuroscience literature that highlights the potential of BCIs as tools for enhancing cognitive functions. Existing neuroscience, psychological and mathematics education research has established a number of cognitive functions (working memory, inhibitory control, math anxiety, and number sense) that affect mathematics education. This study combined these existing paradigms with the BCI device to provide a technological solution for enhancing the basic cognitive functions that foster mathematics learning. Following these assertions, a BCI-based mathematics educational game was developed, taking into account the target population (children aged 9 to 16) and the important role of digital educational games in improving education (in this case mathematics education in particular). Using a within-subjects short-term longitudinal research design, this study established that a BCI-based mathematics educational game could be used to significantly enhance four basic cognitive functions (working memory, inhibitory control, math anxiety, and number sense). These four cognitive functions have been widely acknowledged as significant fundamental aspects of mathematics education. As such, adopting such a technological solution in South African schools could go a long way towards addressing the current "math crisis" by enabling educators and learners to address the issue of low cognitive functions. This study culminated in practical recommendations on how to address the "math crisis" in South Africa.
Item Open Access The usability of natural user interfaces for gameplay (University of the Free State, 2015) Fouche, Rouxan Colin; Beelders, T. R.; De Wet, L.
English: This study aimed to determine to what extent the usability of a two-dimensional game was influenced by the use of a Natural User Interface (NUI) as opposed to a traditional keyboard and mouse combination. Two multimodal NUIs were investigated during the study. The first NUI combination (BCIG) made use of the Peregrine gaming glove for the activation of commands, combined with the Emotiv's accelerometer for control of the cursor. The second NUI combination (BCIF) made use of facial expression recognition, offered by the Emotiv Brain-Computer Interface (BCI), as a method of command activation, in combination with the Emotiv's built-in accelerometer for cursor control. A shooting-genre game was developed and three tasks were included during development to simulate gaming actions. The first task used only stationary targets, the second task used predictable moving targets, whereas the third task made use of unpredictable moving targets. Since the Emotiv BCI allows for customisation of the accelerometer sensitivity settings, a pilot study was conducted to determine whether the low, medium or high sensitivity setting would provide the best cursor control. The low sensitivity resulted in the fastest gameplay overall as well as the least number of errors. It could thus be concluded that the lowest setting is the optimal setting, since it provided the most efficient control for three out of the four metrics tested. After implementing this result, user testing, which involved 5 sessions per participant (n=18), was conducted. Data for three metrics was gathered during user testing, which included data on effectiveness, efficiency and learnability. Post-test questionnaires were administered to assess the level of user satisfaction with each NUI. The results of this study indicated that there is a difference between the usability of the traditional input combination, the keyboard and mouse, and the two NUIs investigated in this study. With regard to the effectiveness and efficiency metrics the traditional input combination was found to be the best, closely followed by BCIG. The three interfaces showed dissimilar levels of improvement, with the traditional keyboard and mouse combination showing the least, followed by BCIG, while the best improvement was noticed for BCIF. By analysing the subjective data gathered from post-test questionnaires, it was found that participants initially enjoyed using BCIG, and after several sessions their level of satisfaction improved. In comparison, the participants initially experienced a slightly negative feeling towards BCIF, which then improved over several sessions to a positive overall response. In conclusion, the keyboard and mouse combination provided far more effective and efficient input, with one exception being the fast command activation when making use of the Peregrine glove, where the two interfaces compared well. It was found that a significant obstacle in the way of NUIs is the existing skill and acceptance that computer users have with the traditional interface combination. Consequently, for individuals to accept and migrate to a more natural interface, the new interface will have to provide more effective and efficient input than what is already achievable with the keyboard and mouse combination.
Item Open Access Graphical processing unit assisted image processing for accelerated eye tracking (University of the Free State, 2015-02) Du Plessis, Jean-Pierre Louis; Blignaut, P. J.
English: Eye tracking is a well-established tool utilised in research areas such as neuroscience, psychology and marketing. There are currently many different types of eye trackers available, the most common being video-based remote eye trackers. Many of the currently available remote eye trackers are either expensive or provide a relatively low sampling frequency. The goal of this dissertation is to present researchers with the option of an affordable high-speed eye tracker. The eye tracker implementation presented in this dissertation was developed to address the lack of low-cost high-speed eye trackers currently available. Traditionally, low-cost systems make use of commercial off-the-shelf components. However, the high frequency at which the developed system runs prohibits the use of such hardware. Instead, the affordability of the eye tracker has been evaluated relative to existing commercial systems. To facilitate these high frequencies, the eye tracker developed in this dissertation utilised the Graphical Processing Unit, Microsoft DirectX and HLSL in an attempt to accelerate eye tracking tasks, specifically the processing of the eye video. The final system was evaluated through experimentation to determine its performance in terms of accuracy, precision, trackability and sampling frequency. Through an experiment involving 31 participants, it was demonstrated that the developed solution is capable of sampling at frequencies of 200 Hz and higher, while allowing for head movements within an area of 10×6×10 cm. Furthermore, the system reports a pooled variance precision of approximately 0.3° and an accuracy of around 1° of visual angle for human participants. The entire system can be built for less than 700 euros and will run on a mid-range computer system. Through this study an alternative is presented for more accessible research in numerous application fields.
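Accuracy and precision of the kind reported above are computed from gaze samples expressed in degrees of visual angle; the sketch below uses invented samples and a simple centroid-based spread measure rather than the exact pooled-variance formulation used in the study.

```python
# Illustrative sketch: accuracy and precision for one fixation target, in degrees.
import numpy as np

target = np.array([10.0, 5.0])                       # target position (deg)
gaze = np.array([[10.8, 5.3], [10.9, 5.1], [11.1, 5.4],
                 [10.7, 5.2], [11.0, 5.3]])          # gaze samples (deg)

offsets = np.linalg.norm(gaze - target, axis=1)
accuracy = offsets.mean()                            # mean angular offset from the target

centroid = gaze.mean(axis=0)
precision = np.sqrt(np.mean(np.sum((gaze - centroid) ** 2, axis=1)))  # spread around centroid

print(f"accuracy ~{accuracy:.2f} deg, precision ~{precision:.2f} deg")
```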
Item Open Access A comparison of similarity metrics for e-assessment of MS Office assignments (University of the Free State, 2015-07) Marais, Willem Sterrenberg Jacobus; Blignaut, P. J.
English: Computerised assessment is prevalent in various disciplines where immediate and accurate feedback on students' assignments is required. It is used as an alternative to manual assessment of computer programming assignments, computer proficiency tests and free-text responses to questions. The implementation of the Office Open XML (OOXML) standard as the default document format for Microsoft Office instigated the development of alternative computerised assessment algorithms with the ability to assess word-processing documents in the DOCX format. Word-processing assignments are primarily assessed by comparing the final document, submitted by the student, to the ideal solution provided by the examiner. Research into the anatomy of OOXML-based documents delivered several alternative approaches with regard to the computerised assessment of DOCX document types. OOXML simplifies the evaluation process of word-processing documents by providing easily identifiable elements within the document structure. These elements can then be used to assess the content and formatting of the document to determine whether the solution submitted by the student matches the ideal solution provided by the examiner. By examining current OOXML-based algorithms, certain gaps in their implementation were identified. An alternative algorithm, dubbed the OOXML algorithm, which could alleviate these issues, is introduced.
It improves the assessment techniques of current OOXML-based algorithms by firstly simplifying the structure of the DOCX documents to ensure that the student's document and the examiner's solution conform to a homogeneous structure. It then identifies corresponding paragraphs between the student's document and the examiner's solution. Finally, the student's simplified document is assessed by comparing the content and formatting elements within the OOXML structure of the corresponding paragraphs with one another. To determine the accuracy and reliability of the proposed OOXML algorithm, it is compared with three established algorithms as well as manual assessment techniques. The three algorithms include a string comparison algorithm called fComp, the Levenshtein algorithm and a document difference algorithm implemented by a system called Word Grader. The same group of word-processing assignments is graded by the specified algorithms and manually assessed by multiple human markers. Analysis of the results of a quasi-experimental study concluded that the proposed OOXML algorithm and its element comparison metric not only produced more reliable results than the human markers, but also more accurate results than the human markers and the other selected document analysis algorithms.
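As a rough illustration of paragraph-by-paragraph comparison of content and formatting (not the OOXML algorithm itself), the sketch below pairs paragraphs of a student DOCX with a memorandum DOCX using python-docx; difflib's similarity ratio stands in for the metrics compared in the study, the file names and weights are invented, and the simple in-order pairing takes the place of the study's paragraph-matching step.

```python
# Hypothetical sketch: score a student document against a memo, paragraph by paragraph.
from difflib import SequenceMatcher
from docx import Document

def paragraph_features(doc):
    feats = []
    for p in doc.paragraphs:
        if p.text.strip():
            bold = any(run.bold for run in p.runs)
            feats.append((p.text.strip(), p.style.name, bold))
    return feats

def grade(student_path, memo_path):
    student = paragraph_features(Document(student_path))
    memo = paragraph_features(Document(memo_path))
    total = 0.0
    # zip pairs paragraphs in order; the study's algorithm matches them more robustly
    for (s_text, s_style, s_bold), (m_text, m_style, m_bold) in zip(student, memo):
        text_score = SequenceMatcher(None, s_text, m_text).ratio()
        format_score = (s_style == m_style) * 0.5 + (s_bold == m_bold) * 0.5
        total += 0.7 * text_score + 0.3 * format_score     # assumed weighting
    return total / max(len(memo), 1)

print(grade("student.docx", "memo.docx"))
```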