In today’s open, distributed environments, there is an increasing need for systems to assist the interoperation of tools and information resources. This paper describes a multi-agent system, DALEKS, that supports such activities for the information processing domain. With this system, information processing tasks are accomplished by the use of an agent architecture incorporating task planning and information agent matchmaking components. We discuss the characteristics of planning in this domain and describe how information processing tools are specified for the planner. We also describe the manner in which planning, agent matchmaking, and information task execution are interleaved in the DALEKS system. An example application taken from the domain of university course administration is provided to illustrate some of the activities performed in this system.
Download (PDF, 84 KB)
Analyses of landscape structure are used to test the hypothesis that remotely sensed images can be used as indicators of ecosystem conservation status. Vegetation types based on a classified SPOT satellite image were used in a comparison of paired reserves (conservation areas) and adjacent, more human-modified areas (controls). Ten reserves (average size 965 ha) were selected from upland tussock grasslands in Otago, New Zealand. While the numbers of vegetation types were equal and the size and shape distributions of patches within the overall landscapes were not significantly different, there was less ‘target’ vegetation in the controls. This vegetation occurred in smaller patches, and fewer of these patches contained ‘core areas’. These control ‘target’ patches were also less complex in shape than those in the adjacent reserves. These measures show that remotely sensed images can be used to derive large-scale indicators of landscape conservation status. An index is proposed for assessing landscape change, and conservation management issues are raised.
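The patch-level comparisons above rest on standard landscape metrics for shape complexity and core area. As a hedged illustration only (this is not the index the authors propose, and all values are hypothetical), the sketch below computes a common shape-complexity measure, which equals 1.0 for a circular patch and grows as the boundary becomes more convoluted, plus a crude buffer-based ‘core area’ estimate:

```python
import math

def shape_index(perimeter, area):
    """Shape complexity of a patch: 1.0 for a circle, larger for more
    convoluted outlines (perimeter relative to a circle of equal area)."""
    return perimeter / (2 * math.sqrt(math.pi * area))

def core_area(area, perimeter, buffer_width):
    """Rough inner 'core area' estimate: strip a buffer of the given
    width from the patch edge (crude ribbon approximation)."""
    return max(0.0, area - perimeter * buffer_width)

# A 100 m x 100 m square patch (1 ha)
print(round(shape_index(400.0, 10000.0), 3))   # ~1.128
print(core_area(10000.0, 400.0, 10.0))         # 6000.0
```

A more elongated or ragged patch of the same area would score a higher shape index and retain a smaller core, which is the direction of difference the abstract reports for the control areas.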
Download (PDF, 108 KB)
Multimedia technology allows a variety of presentation formats to portray instructions for performing a task. These formats include text, graphics, video, audio and photographs, used singly or in combination (Kawin, 1992; Hills, 1984; Newton, 1990; Bailey, 1996). As part of research at the Multimedia Systems Research Laboratory to identify a syntax for the use of multimedia elements, an experiment was conducted to determine whether text or video representations of task instructions were more effective at communicating those instructions (Norris, 1996). This paper reports on the outcome of that study.
The repair and assembly environment of a local whiteware manufacturer provided the study domain. The task chosen for the study was the replacement of a heating element in a cooktop oven. As no task instructions were available from the manufacturer, the study was conducted in two phases: Phase I was a cognitive task analysis of service technicians to determine the steps, as well as the cues and considerations, of the assembly task; Phase II evaluated the text and video representations of the task instructions. The next sections briefly describe the methodology and the results from the experiment.
Download (PDF, 44 KB)
This study is part of research that is investigating the notion that human performance in dynamic and intentional decision making environments, such as ambulance dispatch management, can be improved if information is portrayed in a manner that supports the decision strategies invoked to achieve the goal states of the process being controlled. Hence, in designing interfaces to support real-time dispatch management decisions, it is suggested that it would be necessary to first discover the goal states and the decision strategies invoked during the process, and then portray the required information in a manner that supports such a user group’s decision making goals and strategies.
The purpose of this paper is to report on the experiences gleaned from the use of a cognitive task analysis technique called Critical Decision Method as an elicitation technique for determining information portrayal requirements. This paper firstly describes how the technique was used in a study to identify the goal states and decision strategies invoked during the dispatch of ambulances at the Sydney Ambulance Co-ordination Centre. The paper then describes how the interview data was analysed within and between cases in order to reveal the goal states of the ambulance dispatchers. A brief description of the resulting goal states follows, although a more detailed description of the goal states and their resulting display concepts has been reported elsewhere (Wong et al., 1996b). Finally, the paper concludes with a set of observations and lessons learnt from the use of the Critical Decision Method for developing display design concepts in dynamic intentional environments.
Keywords: display design, cognitive task analysis, Critical Decision Method, ambulance dispatch management
Download (PDF, 100 KB)
Spatial data quality has become an issue of increasing concern to researchers and practitioners in the field of Spatial Information Systems (SIS). Clearly, the results of any spatial analysis are only as good as the data on which it is based. There are a number of significant areas for data quality research in SIS. These include topological consistency; consistency between spatial and attribute data; and consistency between spatial objects’ representation and their true representation on the ground. The last category may be subdivided into spatial accuracy and attribute accuracy. One approach to improving data quality is the imposition of constraints upon data entered into the database. This paper presents a taxonomy of integrity constraints as they apply to spatial database systems. Taking a cross-disciplinary approach, it aims to clarify some of the terms used in the database and SIS fields for data integrity management. An overview of spatial data quality concerns is given, and each type of constraint is assessed regarding its approach to addressing these concerns. Some indication of an implementation method is also given for each.
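One way to read such a taxonomy is as executable checks imposed at data entry. The minimal sketch below illustrates two of the constraint categories named above — topological consistency and consistency between spatial and attribute data — as simple predicates; the schema rule and the lake feature are invented for illustration and are not drawn from the paper:

```python
def ring_is_closed(ring):
    """Topological consistency: a polygon ring must start and end at the
    same vertex and contain at least four points (a triangle plus closure)."""
    return len(ring) >= 4 and ring[0] == ring[-1]

def attribute_consistent(feature, rules):
    """Spatial/attribute consistency: each geometry type must carry the
    attributes the schema rules demand for it."""
    required = rules.get(feature["type"], set())
    return required <= feature["attributes"].keys()

rules = {"lake": {"name", "mean_depth"}}   # hypothetical schema rule
lake = {"type": "lake",
        "attributes": {"name": "Wakatipu", "mean_depth": 230},
        "ring": [(0, 0), (4, 0), (4, 3), (0, 0)]}

print(ring_is_closed(lake["ring"]))        # True
print(attribute_consistent(lake, rules))   # True
```

In a real spatial database these checks would be enforced as integrity constraints or triggers at insert time rather than as ad-hoc functions.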
Keywords: database constraints, spatial data quality, system development, rules
Download (PDF, 128 KB)
Computer users employ a collection of software tools to support their day-to-day work. Often the software environment is dynamic with new tools being added as they become available and removed as they become obsolete or outdated. In today’s systems, the burden of coordinating the use of these disparate tools, remembering the correct sequence of commands, and incorporating new and modified programs into the daily work pattern lies with the user. This paper describes a multi-agent system, DALEKS, that assists users in utilizing diverse software tools for their everyday work. It manages work and information flow by providing a coordination layer that selects the appropriate tool(s) to use for each of the user’s tasks and automates the flow of information between them. This enables the user to be concerned more with what has to be done, rather than with the specifics of how to access tools and information. Here we describe the system architecture of DALEKS and illustrate it with an example in university course administration.
Keywords: agent architecture, software interoperability
Download (PDF, 72 KB)
Modelling the structure of data is an important part of any system analysis project. One problem that can arise is that there may be many differing viewpoints among the various groups that are involved in a project. Each of these viewpoints describes a perspective on the phenomenon being modelled. In this paper, we focus on the representation of developer viewpoints, and in particular on how multiple viewpoint representations may be used for database design. We examine the issues that arise when transforming between different viewpoint representations, and describe an architecture for implementing a database design environment based on these concepts.
Download (PDF, 232 KB)
In this paper, we describe the implementation of a database design environment (Swift) that incorporates several novel features: Swift’s data modelling approach is derived from viewpoint-oriented methods; Swift is implemented in Java, which allows us to easily construct a client/server based environment; the repository is implemented using PostgreSQL, which allows us to store the actual application code in the database; and the combination of Java and PostgreSQL reduces the impedance mismatch between the application and the repository.
Download (PDF, 108 KB)
Privacy is one of the most fundamental of human rights. It is not a privilege granted by some authority or state; it is, in fact, necessary for each human being’s normal development and survival. Nations that have in the past claimed, or currently claim, the authority or moral high ground to grant or deny privacy to their citizens are notable for their other human rights violations. This paper is centred on the above premise and offers the reader some good news and some bad news. Most important, it puts the reader on notice that our privacy is constantly under attack from one vested interest or another, and that each and every one of us must be vigilant in the protection of our private matters.
It is common in New Zealand to assume that anything secret is bad. This is an extremely naïve position for any intelligent individual to take. The old phrase “if you haven’t got anything to hide, then you shouldn’t mind…” is often used to intimidate, manipulate or coerce an individual into “confessing” or sharing information that he or she initially believes to be confidential, private or otherwise not for sharing with others. Secrecy is neither good nor bad in and of itself. It is merely a factual description of the condition of some information.
Now for some good news. There are a number of technological devices and procedures that can be used to enhance one’s privacy. The bad news is that most, if not all, can be easily defeated with other technological advances.
Software metrics are measurements of the software development process and product that can be used as variables (both dependent and independent) in models for project management. The most common types of these models are those used for predicting the development effort for a software system based on size, complexity, developer characteristics, and other metrics. Despite the financial benefits from developing accurate and usable models, there are a number of problems that have not been overcome using the traditional techniques of formal and linear regression models. These include the non-linearities and interactions inherent in complex real-world development processes, the lack of stationarity in such processes, over-commitment to precisely specified values, the small quantities of data often available, and the inability to use whatever knowledge is available where exact numerical values are unknown. The use of alternative techniques, especially fuzzy logic, is investigated and some usage recommendations are made.
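To make the fuzzy logic alternative concrete, here is a minimal sketch of how imprecise knowledge of system size could drive an effort prediction: triangular membership functions over size and a Sugeno-style weighted combination of rule outputs. The fuzzy set boundaries and effort values are hypothetical and are not taken from the paper:

```python
def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Hypothetical fuzzy sets over system size (KLOC) and the effort
# (person-months) each rule asserts -- illustrative values only.
rules = [
    (lambda s: tri(s, -10, 0, 20), 4.0),    # size is SMALL  -> low effort
    (lambda s: tri(s, 10, 30, 50), 18.0),   # size is MEDIUM -> medium effort
    (lambda s: tri(s, 40, 80, 120), 60.0),  # size is LARGE  -> high effort
]

def estimate_effort(size_kloc):
    """Sugeno-style inference: weight each rule's asserted effort by how
    strongly the size belongs to that rule's fuzzy set."""
    fired = [(m(size_kloc), effort) for m, effort in rules]
    total = sum(w for w, _ in fired)
    return sum(w * e for w, e in fired) / total if total else None

print(estimate_effort(15.0))   # 11.0: partly SMALL, partly MEDIUM
```

Because a size of 15 KLOC belongs partly to SMALL and partly to MEDIUM, the estimate blends the two rules rather than committing to a precisely specified value, which is exactly the over-commitment problem the abstract raises.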
Download (PDF, 88 KB)
The paper explores building profiles of Newsgroups from a corpus of Usenet e-mail messages, employing some standard statistical techniques as well as fuzzy clustering methods. A large set of data from a number of Newsgroups was analysed to elicit text attributes such as number of words, sentence length and other stylistic characteristics. Readability scores were also obtained using recognised assessment methods. These text attributes were used to build Newsgroup profiles. Three newsgroups, each with a similar number of messages, were selected from the processed sample for the analysis of two types of one-dimensional profiles: one by length of texts and the second by readability scores. Those profiles are compared with corresponding profiles of the whole sample, and also with those of a group of frequent participants in the newsgroups. Fuzzy clustering is used to create two-dimensional profiles of the same groups. An attempt is made to identify the newsgroups by defining centres of data clusters.
It is contended that this approach to Newsgroup profile analysis could facilitate a better understanding of computer-mediated communication (CMC) on the Usenet, which is a growing medium of informal business and personal correspondence.
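A two-dimensional profile of the kind described (message length against readability) can be built with fuzzy c-means clustering, in which each message belongs to every cluster to a graded degree and cluster centres emerge as membership-weighted means. The sketch below uses invented (length, readability) pairs, not the paper’s Usenet data:

```python
import math

def fcm(points, centres, m=2.0, iters=20):
    """Fuzzy c-means: each point gets a graded membership in every
    cluster; centres are recomputed as membership-weighted means."""
    for _ in range(iters):
        # Membership update: closer centres receive higher membership.
        u = []
        for p in points:
            d = [max(math.dist(p, c), 1e-9) for c in centres]
            u.append([1.0 / sum((d[i] / d[j]) ** (2 / (m - 1))
                                for j in range(len(centres)))
                      for i in range(len(centres))])
        # Centre update: weighted mean of all points, weights u^m.
        centres = []
        for i in range(len(u[0])):
            w = [row[i] ** m for row in u]
            centres.append(tuple(
                sum(wk * p[k] for wk, p in zip(w, points)) / sum(w)
                for k in range(len(points[0]))))
    return centres, u

# Hypothetical (message length, readability score) pairs
points = [(120, 60), (130, 62), (125, 58),   # short, easy messages
          (480, 30), (500, 28), (470, 33)]   # long, hard messages
centres, u = fcm(points, centres=[points[0], points[3]])
print([tuple(round(x) for x in c) for c in centres])
```

The defined cluster centres are then the candidate ‘identities’ of the groups: a new message can be assigned graded memberships against them rather than a hard label.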
(No abstract.)
Download (PDF, 60 KB)
(No abstract.)
The frequency and severity of computer-based attacks, such as viruses and worms, logic bombs, trojan horses, computer fraud, and plagiarism of code, are of increasing concern. To deal better with these problems, it is proposed that methods for examining the authorship of computer programs are necessary. This field is referred to here as software forensics. It involves author discrimination, identification and characterisation, as well as intent analysis. Borrowing extensively from the existing fields of linguistics and software metrics, software forensics can be seen as a new and exciting area for forensics to extend into.
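Authorship analysis of this kind typically starts from cheaply computed style indicators drawn from software metrics. The following sketch computes three such indicators — comment density, mean identifier length and a brace-placement habit — from raw C-style source text; this metric set is purely illustrative and is not claimed to be the one used in the paper:

```python
import re

def style_metrics(source):
    """Simple authorship indicators computed from raw source text:
    comment density, mean identifier length, brace placement habit."""
    lines = source.splitlines()
    comments = sum(1 for ln in lines if ln.strip().startswith("//"))
    idents = re.findall(r"[A-Za-z_][A-Za-z0-9_]*", source)
    cuddled = sum(1 for ln in lines
                  if ln.rstrip().endswith("{") and ln.strip() != "{")
    return {
        "comment_ratio": comments / max(len(lines), 1),
        "mean_ident_len": sum(map(len, idents)) / max(len(idents), 1),
        "cuddled_braces": cuddled,
    }

sample = """\
// compute total
int total(int a, int b) {
    return a + b;
}
"""
m = style_metrics(sample)
print(m["comment_ratio"], m["cuddled_braces"])   # 0.25 1
```

Vectors of such metrics, computed per program, can then feed the discrimination and identification tasks the abstract names, in the same way stylometric features are used for natural-language text.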
Keywords: authorship analysis, computer programming, malicious programs, software forensics, software metrics, source code
Fuzzy neural networks provide for the extraction of fuzzy rules from artificial neural network architectures. In this paper we describe a general method, based on statistical analysis of the training data, for the selection of fuzzy membership functions to be used in connection with fuzzy neural networks. The technique is first described and then illustrated by means of two experimental examinations.
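As a hedged sketch of the general idea — statistics of the training data selecting a membership function — the code below centres a triangular function at a feature’s sample mean with feet two standard deviations either side. The two-sigma spread and the triangular shape are assumptions for illustration, not the paper’s specific method:

```python
import statistics

def tri_from_stats(values, spread=2.0):
    """Derive a triangular membership function from sample statistics:
    peak at the mean, feet `spread` standard deviations either side."""
    mu = statistics.mean(values)
    sigma = statistics.stdev(values)
    a, b, c = mu - spread * sigma, mu, mu + spread * sigma
    def member(x):
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)
    return member

data = [4.8, 5.0, 5.2, 4.9, 5.1]   # a hypothetical training-data feature
f = tri_from_stats(data)
print(f(5.0))    # 1.0: full membership at the sample mean
print(f(6.0))    # 0.0: no membership far outside the data
```

Functions derived this way give the fuzzy neural network membership functions that are matched to the spread of the training data rather than chosen by hand.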
Download (PDF, 172 KB)