Departmental reports

The following is a list of technical reports authored by computer science faculty at Brock University. Many of these reports are preprints of papers submitted to journals, conferences, and workshops; others are student theses or work in progress. Please contact the authors for the most current versions of their research:

A Comparison of Knee Strategies For Hierarchical Spatial Clustering
Brian J. Ross, February 2018.

A comparative study of the performance of knee detection approaches for the hierarchical clustering of 2D spatial data is undertaken. Knee detection is usually performed on the dendrogram generated during cluster generation. For many problems, the knee is a natural indication of the ideal or optimal number of clusters for the given problem. This research compares the performance of various knee strategies on different spatial datasets. Two hierarchical clustering algorithms, single linkage and group average, are considered. Besides determining knees using conventional cluster distances, we also explore alternative metrics such as average global medoid and centroid distances, and F-score metrics. Results show that knee determination is difficult, and that the efficacy of knee strategies is very much problem dependent. Furthermore, knee determination is often more effectively applied to alternative distance metrics and F-scores. In summary, knee strategies are often a useful heuristic, but not a general solution, for optimal cluster detection.
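As a rough illustration of the kind of strategy being compared, one common knee heuristic locates the sharpest bend in the sequence of dendrogram merge distances via its largest second difference. The sketch below is a hypothetical simplification, not the report's exact method; the function name and data layout are invented for illustration.

```python
# Hypothetical knee heuristic: find the sharpest bend in a monotonically
# increasing sequence of dendrogram merge distances by taking the largest
# second difference. Illustrative only; the report compares several,
# more refined strategies.

def knee_by_second_difference(merge_distances):
    """Return the index in merge_distances at the sharpest bend."""
    second_diffs = [
        merge_distances[i + 1] - 2 * merge_distances[i] + merge_distances[i - 1]
        for i in range(1, len(merge_distances) - 1)
    ]
    # second_diffs[k] corresponds to position k + 1 in the original sequence
    return 1 + max(range(len(second_diffs)), key=second_diffs.__getitem__)

# merge distances that jump sharply after the third merge
print(knee_by_second_difference([1.0, 1.1, 1.2, 5.0, 5.1, 5.2]))
```

The index returned marks the last merge before distances jump, which in this toy sequence corresponds to the natural cut point of the dendrogram.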

A discrete representation of dicomplemented lattices
Ivo Düntsch, Léonard Kwuida and Ewa Orlowska, March 2017.

Dicomplemented lattices were introduced as an abstraction of Wille's concept algebras, which provide negations on a concept lattice. We prove a discrete representation theorem for the class of dicomplemented lattices. The theorem is based on a topology-free version of Urquhart's representation of bounded general lattices.

Mixed algebras and their logics
Ivo Düntsch, Ewa Orlowska and Tinko Tinchev, January 2016.

We investigate complex algebras of the form ⟨2^X, ⟨R⟩, [[S]]⟩ arising from a frame ⟨X, R, S⟩ where S ⊆ R, and exhibit their abstract algebraic and logical counterparts.

Online Image Classification Using Graphics Processing Unit-Based Genetic Programming
Mehran Maghoumi, Brian J. Ross, August 2016.

A texture classification vision system implemented with Graphics Processing Unit-based genetic programming is described. An online learning environment is implemented, in which genetic programming is automatically invoked when unclassified texture instances are present in an image stream. Once a segment is positively classified, the genetic programming classifier expression is reapplied to frames of the image stream. System performance is enhanced using population-parallel evaluation on a graphics processing unit. Various experiments with textures of varying difficulty were performed. Real-time performance was often seen in cases using four segments of texture data, as correct classifiers were evolved within seconds. In all cases, once evolved, classifiers were easily applied to the image stream in real time. This research shows that high-performance real-time learning environments for image classification are attainable with genetic programming.

Simplifying contextual structures
Ivo Düntsch and Günther Gediga, January 2015.

We present a method to reduce a formal context while retaining much of its information content. Although simple, our ICRA approach offers an effective way to reduce the complexity of concept lattices and/or knowledge spaces, changing only a little information in comparison to a competing model that uses fuzzy K-Means clustering.

Rough set clustering
Ivo Düntsch and Günther Gediga, January 2015.

We present a survey of clustering methods based on rough set data analysis.

A relational logic for spatial contact based on rough set approximation
Ivo Düntsch, Ewa Orlowska and Hui Wang, November 2015.

In previous work we presented a class of algebras enhanced with two contact relations, used in spatial reasoning on the basis of rough sets. In this paper we present a relational logic for such structures in the spirit of Rasiowa–Sikorski proof systems.

Virtual Photography Using Multi-Objective Particle Swarm Optimization
William Barry and Brian J. Ross, January 2014.

Particle swarm optimization (PSO) is a stochastic population-based search algorithm that is inspired by the flocking behaviour of birds. Here, a PSO is used to implement swarms of cameras flying through a virtual world in search of an image that satisfies a set of compositional constraints, for example, the rule of thirds and horizon line rules. To effectively process these multiple, and possibly conflicting, criteria, a new multi-objective PSO algorithm called the sum of ranks PSO (SR-PSO) is introduced. The SR-PSO is useful for solving high-dimensional search problems, while discouraging degenerate solutions that can arise with other approaches. Less user intervention is required for the SR-PSO, as compared to a conventional PSO. A number of problems using different virtual worlds and user-supplied constraints were studied. In all cases, solution images were obtained that satisfied the majority of given constraints. The SR-PSO was shown to be superior to other algorithms in solving the high-dimensional virtual photography problems studied.
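The sum-of-ranks idea behind the SR-PSO can be shown in miniature: each candidate is ranked independently on every objective, and the ranks are summed into one scalar score. The sketch below is an assumption-laden simplification of that idea; the function name and data layout are invented for illustration and are not the report's implementation.

```python
# Illustrative rank-sum scoring for multi-objective selection: rank each
# candidate separately on every objective (lower value = better, i.e.
# minimization), then sum the ranks. Lower total score = better candidate.

def sum_of_ranks(objective_matrix):
    """objective_matrix[i][j] = value of objective j for candidate i.
    Returns the rank-sum score of each candidate."""
    n = len(objective_matrix)
    m = len(objective_matrix[0])
    scores = [0] * n
    for j in range(m):
        # candidates ordered from best to worst on objective j
        order = sorted(range(n), key=lambda i: objective_matrix[i][j])
        for rank, i in enumerate(order, start=1):
            scores[i] += rank
    return scores

# three candidates, two objectives to minimize
print(sum_of_ranks([[0.1, 1.0], [0.3, 5.0], [0.2, 2.0]]))
```

Because every objective contributes only an ordinal rank, no hand-tuned objective weights are needed, which matches the reduced user intervention the abstract describes.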

Passive Solar Building Design Using Genetic Programming
Mohammad Mahdi Oraei Gholami and Brian J. Ross, January 2014.

Passive solar building design considers the effect that sunlight has on energy usage. The goal is to reduce the need for artificial cooling and heating devices, thereby saving energy costs. A number of competing design objectives can arise. Window heat gain during winter requires large windows. These same windows, however, reduce energy efficiency during nights and summers. Other model requirements add further complications, which creates a challenging optimization problem. We use genetic programming for passive solar building design. The EnergyPlus system is used to evaluate energy consumption. It considers factors ranging from model construction (shape, windows, materials) to location particulars (latitude/longitude, weather, time of day/year). We use a split grammar to build 3D models, and multi-objective fitness to evaluate the multiple design objectives. Experimental results showed that balancing window heat gain and total energy use is challenging, although our multi-objective strategy could find interesting compromises. Many factors (roof shape, material selection) were consistently optimized by evolution. We also found that geographic aspects of the location play a critical role in the final building design.

Feature Extraction Languages and Visual Pattern Recognition
Mehran Maghoumi and Brian J. Ross, January 2014.

Visual pattern recognition and classification is a challenging computer vision problem. Genetic programming has been applied towards automatic visual pattern recognition. An important factor in evolving effective classifiers is the suitability of the GP language for defining expressions for feature extraction and classification. This research presents a comparative study of a variety of GP languages suitable for classification. Four different languages are examined, which use different selections of image processing operators. One of the languages does block classification, which means that an entire region of pixels is classified simultaneously. The other languages are pixel classifiers, which determine classification for a single pixel. Pixel classifiers are more common in the GP-vision literature. We tested the languages on different instances of Brodatz textures, as well as aerial and camera images. Our results show that the most effective languages are pixel-based ones with spatial operators. However, as is to be expected, the nature of the image will naturally determine the effectiveness of the language used.

Real-Time Automatic Object Classification and Tracking using Genetic Programming and NVIDIA® CUDA™
Mehran Maghoumi, August 2014.

Genetic Programming (GP) is a widely used methodology for solving various computational problems. GP's problem solving ability is usually hindered by its long execution time for large and complex problems. In this thesis, GP is applied toward real-time computer vision. In particular, object classification and tracking using a parallel GP system is discussed. First, a study of suitable GP languages for object classification is presented. To this end, two main GP approaches for visual pattern classification were studied: block classifiers and pixel classifiers. According to the experiments, the pixel classifiers were generally more accurate and performed better. Next, using the results of these experiments, a suitable language was selected for the implementation of the real-time tracking system. The real-time system was implemented using NVIDIA CUDA. Synthetic video data was used in the experiments. The goal of the experiments was to evolve a unique classifier for each texture pattern that was present in the video. The experiments revealed that the system was capable of correctly classifying and tracking the textures in the video. Furthermore, the performance of the system was on par with real-time requirements.

Discrete dualities for n-potent MTL–algebras and 2-potent BL–algebras
Ivo Düntsch, Ewa Orłowska, Clint van Alten, September 2014.

Discrete dualities are developed for n-potent MTL–algebras and for 2-potent BL–algebras. That is, classes of frames, or relational systems, are defined that serve as dual counterparts to these classes of algebras. The frames defined here are extensions of the frames that were developed for MTL–algebras by Orlowska, Radzikowska (2009) and Orlowska, Rewitzky (2010); the additional frame conditions required are given here, along with the proofs that discrete dualities hold with respect to such frames. The duality also provides an embedding from an n-potent MTL–algebra, or 2-potent BL–algebra, into the complex algebra of its canonical frame, which is a complete algebra in the lattice sense.

Discrete PSO for the Uncapacitated Single Allocation Hub Location Problem
Alexander Bailey, Beatrice Ombuki-Berman, Stephen Asobiela, January 2013.

Efficient network design and optimization of transportation and distribution systems has significant economic and service efficiency implications for both the public and private sectors. We propose a new solution based on particle swarm optimization (PSO) for the uncapacitated single allocation hub location problem (USAHLP). Although various meta-heuristics have been proposed for this problem, to the authors' knowledge, this is the first attempt to use a PSO framework for this problem. An empirical study is done using well-known benchmark problems from the Civil Aeronautics Board and Australian Post data sets with node sizes of up to 200. The strength of the proposed approach is its simplicity and effectiveness; its solution quality is compared with that of other meta-heuristics. The proposed discrete PSO matches or out-performs the solution quality of the current best-known methods for the USAHLP.

On the homogeneous countable Boolean contact algebra
Ivo Düntsch and Sanjiang Li, March 2013.

In a recent paper, we have shown that the class of Boolean contact algebras (BCAs) has the hereditary property, the joint embedding property and the amalgamation property. By Fraïssé's theorem, this shows that there is a unique countable homogeneous BCA. This paper investigates this algebra and the relation algebra generated by its contact relation. We first show that the algebra can be partitioned into four sets {0}, {1}, K, and L, which are the only orbits of the group of base automorphisms of the algebra, and then show that the contact relation algebra of this algebra is finite, which is the first non-trivial extensional BCA we know of which has this property.

Generative Representations for Artificial Architecture and Passive Solar Performance
Adrian Harrington and Brian J. Ross, March 2013.

This paper explores how the use of generative representations improves the quality of solutions in evolutionary design problems. A genetic programming system is developed with individuals encoded as generative representations. Two research goals motivate this work. One goal is to examine Hornby's features and measures of modularity, reuse and hierarchy in new and more complex evolutionary design problems. In particular, we consider a more difficult problem domain where the generated 3D models are no longer constrained by voxels. Experiments are carried out to generate 3D models which grow towards a set of target points. The results show that the generative representations with the three features of modularity, regularity and hierarchy performed best overall. Although the measures of these features were largely consistent with those of Hornby, a few differences were found.

Our second research goal is to use the best performing encoding on some 3D modeling problems that involve passive solar performance criteria. Here, the system is challenged with generating forms that optimize exposure to the Sun. This is complicated by the fact that a model's structure can interfere with solar exposure to itself; for example, protrusions can block Sun exposure to other model elements. Furthermore, external environmental factors (geographic location, time of the day, time of the year, other buildings in the proximity) may also be relevant. Experimental results were successful, and the system was shown to scale well to the architectural problems studied.

Automatic Inference of Hierarchical Graph Models Using Genetic Programming with an Application to Cortical Networks
Alexander Bailey, Beatrice Ombuki-Berman, Mario Ventresca, March 2013.

The pathways that relay sensory information within the brain form a network of connections, the precise organization of which is unknown. Communities of neurons can be discerned within this tangled structure, with inhomogeneously distributed connections existing between cortical areas. Classification and modelling of these networks has led to advancements in the identification of unhealthy or injured brains; however, the current models used are known to have major deficiencies. Specifically, the community structure of the cortex is not accounted for in existing algorithms, and it is unclear how to properly design a more representative graph model. It has recently been demonstrated that genetic programming may be useful for inferring accurate graph models, although no study to date has investigated the ability to replicate community structure. In this paper we propose the first GP system for the automatic inference of algorithms capable of generating, to a high accuracy, networks with community structure. We utilize a common cat cortex data set to highlight the efficacy of our approach. Our experiments clearly show that the inferred graph model generates a more representative network than those currently used in the scientific literature.

A Scalability Study of Multi-Objective Particle Swarm Optimizers
Kyle Robert Harrison, Andries P. Engelbrecht, Beatrice Ombuki-Berman, March 2013.

Particle swarm optimization (PSO) is a well-known optimization technique originally proposed for solving single-objective, continuous optimization problems. However, PSO has been extended in various ways to handle multi-objective optimization problems (MOPs). The scalability of multi-objective PSO algorithms as the number of sub-objectives increases has not been well examined; most observations are for two to four objectives. It has been observed that the performance of multi-objective optimizers for a low number of sub-objectives cannot be generalized to problems with higher numbers of sub-objectives. With this in mind, this paper presents a scalability study of three well-known multi-objective PSOs, namely vector evaluated PSO (VEPSO), optimized multi-objective PSO (oMOPSO), and speed-constrained multi-objective PSO (SMPSO), with up to eight sub-objectives. The study indicates that as the number of sub-objectives increases, SMPSO scaled the best, oMOPSO scaled the worst, while VEPSO's performance was dependent on the knowledge transfer strategy (KTS) employed, with parent-centric recombination (PCX) based approaches scaling consistently better.

PRE and variable precision models in rough set data analysis
Ivo Düntsch and Günther Gediga, April 2013.

We present a parameter-free and monotonic alternative to the parametric variable precision model of rough set data analysis. The proposed model is based on the well-known PRE index lambda of Goodman and Kruskal. Using a weighted lambda model it is possible to define a two-dimensional space based on (rough) sensitivity and (rough) specificity, for which the monotonicity of sensitivity in a chain of sets is a nice feature of the model. As specificity is often monotone as well, the results of a rough set analysis can be displayed like a receiver operating characteristic (ROC) curve in statistics. Another aspect deals with the precision of the prediction of categories, normally measured by an index alpha in classical rough set data analysis. We offer a statistical theory for alpha and a modification of alpha which fits the needs of our proposed model. Furthermore, we show how expert knowledge can be integrated without losing the monotonic property of the index. Based on a weighted lambda, we present a polynomial algorithm to determine an approximately optimal set of predicting attributes. Finally, we exhibit a connection to Bayesian analysis. We present several simulation studies for the presented concepts. The current paper is an extended version of a paper presented at FedCSIS 2012.
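For orientation, Goodman and Kruskal's lambda is a proportional-reduction-in-error (PRE) index: it measures how much prediction error for a response variable is removed by knowing a predictor. A minimal unweighted sketch follows; the paper's weighted variant and its rough-set machinery are not shown, and the function name is invented for illustration.

```python
# Minimal sketch of the (unweighted) Goodman-Kruskal lambda PRE index,
# computed from a contingency table of counts.

def goodman_kruskal_lambda(table):
    """table[i][j]: count of cases with predictor category i and response
    category j. Returns lambda, the proportional reduction in prediction
    error for the response given the predictor."""
    total = sum(sum(row) for row in table)
    # error when always guessing the modal response category overall
    col_sums = [sum(row[j] for row in table) for j in range(len(table[0]))]
    e1 = total - max(col_sums)
    # error when guessing the modal response within each predictor category
    e2 = total - sum(max(row) for row in table)
    return (e1 - e2) / e1 if e1 else 0.0

# 2x2 table: knowing the predictor removes a sizable share of the error
print(goodman_kruskal_lambda([[30, 10], [5, 25]]))
```

A lambda of 0 means the predictor is useless for guessing the response; a lambda of 1 means it predicts the response perfectly.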

Genetic Programming for Non-Photorealistic Rendering
Maryam Baniasadi, June 2013.

This thesis focuses on developing an evolutionary art system using genetic programming (GP). The main goal is to produce new forms of evolutionary art that alter existing images into new non-photorealistic (NPR) styles, obtaining images that look like traditional media such as watercolour or pencil, as well as brand new effects. The approach permits GP to generate creative forms of NPR results. The GP language is extended with different techniques and methods inspired by NPR research, such as colour mixing expressions, image processing filters and painting algorithms. Colour mixing is a major new contribution, as it enables many familiar and innovative NPR effects to arise. Another major innovation is that many GP functions process the canvas (rendered image) while it is dynamically changing. Automatic fitness scoring uses aesthetic evaluation models and statistical analysis, and multi-objective fitness evaluation is used. Results showed a variety of NPR effects, as well as new, creative possibilities.

Standard Errors of Indices in Rough Set Data Analysis
Günther Gediga and Ivo Düntsch, September 2013.

The sample variation of indices for the approximation of sets in the context of rough set data analysis is considered. We consider the gamma and alpha indices, as well as lower and upper bound approximations of decision classes. We derive confidence bounds for these indices, as well as a two-group comparison procedure. Finally, we present procedures to compare the approximation quality of two sets within one sample.
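As background, the alpha index in classical rough set data analysis is the accuracy of approximation: the size of a set's lower approximation divided by the size of its upper approximation. A minimal sketch, with the equivalence classes of the indiscernibility relation given explicitly (the representation is an assumption for this illustration):

```python
# Sketch of rough set lower/upper approximations and the accuracy index
# alpha = |lower| / |upper|, with equivalence classes given as sets.

def rough_approximations(blocks, target):
    """blocks: equivalence classes (sets) of the indiscernibility relation;
    target: the set to approximate. Returns (lower, upper)."""
    lower, upper = set(), set()
    for block in blocks:
        if block <= target:          # block entirely inside the target
            lower |= block
        if block & target:           # block overlaps the target
            upper |= block
    return lower, upper

def alpha(blocks, target):
    """Accuracy of approximation of target with respect to blocks."""
    lower, upper = rough_approximations(blocks, target)
    return len(lower) / len(upper) if upper else 1.0

blocks = [{1, 2}, {3, 4}, {5, 6}]
print(alpha(blocks, {1, 2, 3}))
```

The report's concern is the sampling variation of such indices, i.e. how alpha and related quantities fluctuate when computed from a random sample rather than the full population.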

Relational Properties of Sequential Composition of Co-Algebras
Michael Winter and Peter Kempf, October 2013.

In this paper we define a sequential composition for arbitrary co-algebras in a Dedekind category. We show some basic algebraic properties of this operation up to bisimulation. Furthermore, we consider certain recursive equations and provide an explicit solution, i.e., a solution not based on an iterative process.

Passive Solar Building Design Using Genetic Programming
Mohammad Mahdi Oraei Gholami, October 2013.

Passive solar building design is the process of designing a building while considering sunlight exposure for receiving heat in winter and rejecting heat in summer. The main goal of a passive solar building design is to remove or reduce the need for mechanical and electrical systems for cooling and heating, thereby saving energy costs and reducing environmental impact. This research uses evolutionary computation to design passive solar buildings. Evolutionary design is used in many research projects to build 3D models for structures automatically. In this research, we use a mixture of split grammar and string rewriting for generating new 3D structures. To evaluate energy costs, the EnergyPlus system is used. This is a comprehensive building energy simulation system, which is used alongside the genetic programming system. In addition, genetic programming also considers other design and geometry characteristics of the building as search objectives, for example, window placement, building shape, size, and complexity. In passive solar designs, reducing the energy needed for cooling and heating are two objectives of interest. Experiments show that smaller buildings with no windows and skylights are the most energy efficient models. Window heat gain is another objective used to encourage models to have windows. In addition, window- and volume-based objectives are tried. To examine the impact of environment on designs, experiments are run on five different geographic locations. Both single-floor and multi-floor models are examined in this research. Solutions from the experiments were consistent with respect to materials, sizes, and appearance, and satisfied problem constraints in all instances.

Extension Properties of Boolean Contact Algebras
Ivo Düntsch and Sanjiang Li, February 2012.

We show that the class of Boolean contact algebras has the joint embedding property and the amalgamation property, and that the class of connected Boolean contact algebras has the joint embedding property but not the amalgamation property.

Evolutionary Approaches to the Generation of Optimal Error Correcting Codes
Daniel E. McCarney, Sheridan Houghten and Brian J. Ross, March 2012.

Error-correcting codes allow for reliable transmission of data over mediums subject to interference. They guarantee detection and recovery from a level of transmission corruption. Larger error-correcting codes increase the amount of error tolerable, which improves system reliability and performance. However, discovering optimal error-correcting codes for different size specifications is equivalent to the NP-Hard problem of determining maximum cliques of a graph.

In this research, three different binary error correcting code problems are considered. Both genetic algorithms (GA) and genetic programming (GP) are examined for generating optimal error correcting codes for these problems. A new chromosome representation for the GA system is examined, which shows some benefits in certain conditions. The use of GP is novel in this problem domain, and in combination with the Baldwin effect, it is shown to be a promising new approach for code discovery.
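The quantity such a search optimizes is tied to the code's minimum pairwise Hamming distance: a code with minimum distance d detects d-1 errors and corrects floor((d-1)/2). A small sketch of that measure follows; it is an illustrative fragment, not the thesis's actual fitness function.

```python
# Minimum pairwise Hamming distance of a binary code, the quantity an
# evolutionary search would try to maximize for a fixed code size and length.

def min_hamming_distance(codewords):
    """codewords: list of equal-length binary strings, at least two."""
    def hamming(a, b):
        return sum(x != y for x, y in zip(a, b))
    return min(
        hamming(codewords[i], codewords[k])
        for i in range(len(codewords))
        for k in range(i + 1, len(codewords))
    )

# the (3,1) repetition code: distance 3, so it corrects one bit error
print(min_hamming_distance(["000", "111"]))
```

Maximizing the number of codewords at a fixed length and minimum distance is what links this problem to finding maximum cliques, as the abstract above notes.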

Automatic Generation of Graph Models for Complex Networks by Genetic Programming
Alexander Bailey, Mario Ventresca, and Beatrice Ombuki-Berman, March 2012.

Complex networks have attracted a large amount of research attention, especially over the past decade, due to their prevalence and importance in our daily lives. Numerous human-designed models have been proposed that aim to capture and model different network structures, for the purpose of improving our understanding of real-life phenomena and their dynamics in different situations. Groundbreaking work in genetics, medicine, epidemiology, neuroscience, telecommunications, social science and drug discovery, to name some examples, has directly resulted. Because the graph models are human made (a very time consuming process) using a small subset of example graphs, they often exhibit inaccuracies when used to model similar structures. This paper represents the first exploration into the use of genetic programming for automating the discovery and algorithm design of graph models, representing a totally new approach with great interdisciplinary application potential. We present exciting initial results that show the potential of GP to replicate existing complex network algorithms.

Relation Algebras, Matrices, and Multi-Valued Decision Diagrams
Francis Atampore, Michael Winter, April 2012.

In this paper we want to further investigate the usage of matrices as a representation of relations within arbitrary heterogeneous relation algebras. First, we want to show that splittings do exist in matrix algebras assuming that the underlying algebra of the coefficients provides this operation. Second, we want to outline an implementation of matrix algebras using reduced ordered multi-valued decision diagrams. This implementation combines the efficiency of operations based on those data structures with the general matrix approach to arbitrary relation algebras.

On Deriving Conditional Diagnosability of Interconnection Networks
E. Cheng, L. Lipták, Ke Qiu, Z. Shen, May 2012.

We provide general techniques to estimate an upper bound of the conditional diagnosability of a graph G, and to prove that such a bound is also tight when a certain connectivity result is available for G. As an example, we derive the exact value of the conditional diagnosability for the (n, k)-star graph.

Towards Quantum Relation Algebras
Michael Winter, June 2012.

Propositional quantum logic is a non-classical logic system based on the structure of quantum mechanics. It is based on the concept of physical observations or experimental propositions. These propositions correspond to closed linear subspaces of a Hilbert space or, more abstractly, to elements of an orthomodular lattice. In this paper we want to start the investigation of relations that are based on such a logic system as a first step towards an algebraic version of quantum first-order logic using a relation algebraic approach.

Knowledge Transfer Strategies for Vector Evaluated Particle Swarm Optimization
Kyle Robert Harrison, Beatrice Ombuki-Berman, and Andries P. Engelbrecht, September 2012.

Vector evaluated particle swarm optimization (VEPSO) is a multi-swarm variant of the traditional particle swarm optimization (PSO) algorithm applied to multi-objective problems (MOPs). Each sub-objective is allocated a single sub-swarm, and knowledge transfer strategies (KTSs) are used to pass information between swarms. The original VEPSO used a ring KTS, and while VEPSO has been shown to be successful in solving MOPs, other algorithms have been shown to produce better results. One reason for VEPSO performing worse than other algorithms may be the inefficiency of the KTS used in the original VEPSO. This paper investigates new KTSs for VEPSO in order to improve its performance. The results indicated that a hybrid strategy using parent-centric crossover (PCX) on global best solutions generally leads to a higher hypervolume, while using PCX on archive solutions generally leads to a better distributed set of solutions.

Generative Aesthetically Pleasing Images in a Virtual Environment Using Particle Swarm Optimization
William Barry, October 2012.

This research focuses on generating aesthetically pleasing images in virtual environments using the particle swarm optimization (PSO) algorithm. The PSO is a stochastic population-based search algorithm that is inspired by the flocking behavior of birds. In this research, we implement swarms of cameras flying through a virtual world in search of an image that is aesthetically pleasing. Virtual world exploration using particle swarm optimization is considered to be a new research area and is of interest to both the scientific and artistic communities. Aesthetic rules such as rule of thirds, subject matter, colour similarity and horizon line are all analyzed together as a multi-objective problem to analyze and solve with rendered images. A new multi-objective PSO algorithm, the sum of ranks PSO, is introduced. It is empirically compared to other single-objective and multi-objective swarm algorithms. An advantage of the sum of ranks PSO is that it is useful for solving high-dimensional problems within the context of this research. Throughout many experiments, we show that our approach is capable of automatically producing images satisfying a variety of supplied aesthetic criteria.

Enabling and Measuring Complexity in Evolving Designs using Generative Representations for Artificial Architecture
Adrian Harrington, October 2012.

As the complexity of evolutionary design problems grows, so too must the quality of solutions scale to that complexity. In this research, we develop a genetic programming system with individuals encoded as tree-based generative representations to address scalability. This system is capable of multi-objective evaluation using a ranked-sum scoring strategy. We examine Hornby's features and measures of modularity, reuse and hierarchy in evolutionary design problems. Experiments are carried out, using the system to generate three-dimensional forms, and analyses of feature characteristics such as modularity, reuse and hierarchy were performed. This work expands on that of Hornby by examining a new and more difficult problem domain. The results from these experiments show that individuals encoded with those three features performed best overall. It is also seen that the measures of complexity conform to the results of Hornby. Moving forward with only this best performing encoding, the system was applied to the generation of three-dimensional external building architecture. One objective considered was passive solar performance, in which the system was challenged with generating forms that optimize exposure to the Sun. The results from these and other experiments satisfied the requirements. The system was shown to scale well to the architectural problems studied.

Genetic Programming for the Automatic Inference of Graph Models for Complex Networks
Alexander Bailey, Mario Ventresca, Beatrice Ombuki-Berman, October 2012.

Complex networks are becoming an integral tool for our understanding of an enormous variety of natural and artificial systems. A number of human-designed network generation procedures have been proposed that reasonably model specific real-life phenomena in structure and dynamics. Consequently, breakthroughs in genetics, medicine, epidemiology, neuroscience, telecommunications and the social sciences have recently resulted. A graph model is an algorithm capable of constructing arbitrarily sized networks, whose end structure will exhibit certain statistical and structural properties. The process of deriving an accurate graph model is very time intensive and challenging, and may only yield highly accurate models for very specific phenomena. An automated approach based on genetic programming was recently proposed by the authors. However, this initial system suffered from a number of drawbacks, including an under-emphasis on creating hub vertices, the requirement of user intervention to determine objective weights, and the arbitrary approach to selecting the most representative model from a population of candidate models. In this paper we propose solutions to these problems and show experimentally that the new system is a significant improvement, capable of reproducing existing common graph models from even a single small initial network.

Automatic and Interactive Evolution of Vector Graphics Images with Genetic Algorithms
Steven Bergen and Brian J. Ross, January 2011.

Vector graphics images are composed of lists of discrete geometric shapes, such as circles, squares, and lines. Vector graphics is popular in illustration and graphic design. The generation of vector images by evolutionary computation techniques, however, has been given little attention. This paper uses genetic algorithms to evolve vector images. By restricting the numbers of primitives and colour schemes used, stylized interpretations of target images are produced. Automatic evolution involves measuring the pixel-by-pixel colour distance between a candidate and target image. The JNetic evolutionary vector graphics system is described. JNetic supports automatic and user-guided evolution, chromosome editing, and high-detail masks. The user can paint masks over areas of the target image, which will be used to reproduce the high-detail features within those areas. The system has been successfully used by the authors as a creative tool.
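The pixel-by-pixel colour distance used for automatic evolution can be sketched as a mean per-pixel Euclidean distance in RGB space. The representation of images as flat lists of (r, g, b) tuples is an assumption made for this illustration, not JNetic's internal format.

```python
# Illustrative fitness measure for image-matching evolution: the mean
# per-pixel Euclidean colour distance between a candidate rendering and
# the target image. Lower values indicate a closer match.

def colour_distance(candidate, target):
    """candidate, target: equally sized lists of (r, g, b) tuples."""
    assert len(candidate) == len(target), "images must be the same size"
    total = 0.0
    for (r1, g1, b1), (r2, g2, b2) in zip(candidate, target):
        total += ((r1 - r2) ** 2 + (g1 - g2) ** 2 + (b1 - b2) ** 2) ** 0.5
    return total / len(target)

# identical two-pixel images have distance zero
print(colour_distance([(0, 0, 0), (255, 255, 255)],
                      [(0, 0, 0), (255, 255, 255)]))
```

A user-painted high-detail mask, as described above, could be layered onto this by weighting the per-pixel terms inside the loop more heavily within the masked region.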

Automatic Evolution of Conceptual Building Architectures
Corrado Coia and Brian J. Ross, January 2011.

An evolutionary approach to the automatic generation of 3D building topologies is presented. Genetic programming is used to evolve shape grammars. When interpreted, the shape grammars generate 3D models of buildings. Fitness evaluation considers user-specified criteria that evaluate different aspects of the model geometry. Such criteria might include maximizing the number of unique normals, satisfying target height requirements, and conforming to supplied shape contours. Multi-objective evaluation is used to analyze and rank model fitness, based on the varied user-supplied criteria. A number of interesting models complying with given geometric specifications have been successfully evolved with the approach. A motivation for this research application is that it can be used as a generator of conceptual designs, to be used as inspirations for refinement or further exploration.

Evolution of Architectural Floor Plans
Robert W.J. Flack, January 2011.

Layout planning is a process of sizing and placing rooms (e.g. in a house) while attempting to optimize various criteria. Often there are conflicting criteria such as construction cost, minimizing the distance between related activities, and meeting the area requirements for these activities. The process of layout planning has mostly been done by hand, with a handful of attempts to automate the process. This thesis explores some of these past attempts and describes several new techniques for automating the layout planning process using evolutionary computation. These techniques are inspired by the existing methods, while adding some of their own innovations. Additional experiments are done to test the possibility of allowing polygonal exteriors with rectilinear interior walls. Several multi-objective approaches are used to evaluate and compare fitness. The evolutionary representation and requirements specification used provide great flexibility in problem scope and depth, and are worth considering in future layout and design attempts. The system outlined in this thesis is capable of evolving a variety of floor plans conforming to functional and geometric specifications. Many of the resulting plans look reasonable even when compared to a professional floor plan. Additionally, polygonal and multi-floor buildings were also generated.

Fundamentals of Cochlear Nerve Signal Processing: Towards a New Generation of Cochlear Implant Devices
Vlad Wojcik and W. Gregory Wojcik, January 2011.

We present an engineering model of the inner ear focusing on the cochlea, together with the emerging cochlear nerve signal protocol suggesting a new generation of cochlear implants. We demonstrate mathematically that signals generated by the proposed electronic devices can be used as restorative signals feeding the cochlear nerve, and in audio pattern recognition, including audio location, perception, and perhaps even subjective enjoyment of music and comprehension of speech.

This paper was written for a multi-disciplinary audience of engineers, biologists, mathematicians, and computer and medical practitioners. Each of those groups might find some of its sections basic.

Evolution of Stochastic Bio-Networks Using Summed Rank Strategies
Brian J. Ross, February 2011.

Stochastic models defined in the stochastic pi-calculus are evolved using genetic programming. The interpretation of a stochastic model results in a set of time series behaviors. Each time series denotes changing quantities of components within the modeled system. The time series are described by their statistical features. This paper uses genetic programming to reverse engineer stochastic pi-calculus models. Given the statistical characteristics of the intended model behavior, genetic programming attempts to construct a model whose statistical features closely match those of the target process. The feature objectives comprising model behavior are evaluated using a multi-objective strategy. A contribution of this research is that, rather than use conventional Pareto ranking, a summed rank scoring strategy is used instead. Summed rank scoring was originally derived for high-dimensional search spaces. This paper shows that it is likewise effective for evaluating stochastic models with low- to moderate-sized search spaces. A number of stochastic models with oscillating behaviors were successfully evolved. Attempts on a larger-sized model were not successful. Reasons for its poor performance are likely due to poor choices in feature selection, and too many selected features and channels contributing to an overly difficult search space.
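The summed rank scoring strategy can be illustrated with a minimal sketch, assuming minimization on every objective and ignoring tie handling (the function name is hypothetical):

```python
def summed_ranks(population):
    """Score each candidate by summing its per-objective ranks.

    population: a list of objective vectors, one per candidate, where
    smaller objective values are better. For each objective, candidates
    are ranked 1 (best) to n (worst); a candidate's score is the sum of
    its ranks, so lower total scores indicate better candidates.
    """
    n = len(population)
    scores = [0] * n
    for obj in range(len(population[0])):
        order = sorted(range(n), key=lambda i: population[i][obj])
        for rank, i in enumerate(order, start=1):
            scores[i] += rank
    return scores

pop = [(1.0, 9.0), (2.0, 2.0), (8.0, 3.0)]
print(summed_ranks(pop))  # [4, 3, 5]: the middle candidate wins overall
```

Unlike Pareto ranking, this produces a total order over the population, which is one reason it scales to many objectives.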

Automatic Evolution of Conceptual Building Architectures
Corrado Coia, May 2011.

This thesis describes research in which genetic programming is used to automatically evolve shape grammars that construct three dimensional models of possible external building architectures. A completely automated fitness function is used, which evaluates the three dimensional building models according to different geometric properties such as surface normals, height, building footprint, and more. In order to evaluate the buildings on the different criteria, a multi-objective fitness function is used. The results obtained from the automated system were successful in satisfying the multiple objective criteria as well as creating interesting and unique designs that a human-aided system might not discover. In this study of evolutionary design, the architectures created are not meant to be fully functional and structurally sound blueprints for constructing a building, but are meant to be inspirational ideas for possible architectural designs. The evolved models are applicable for today’s architectural industries as well as in the video game and movie industries. Many new avenues for future work have also been discovered and highlighted.

Automatic Structure Generation using Genetic Programming and Fractal Geometry
Steve Bergen, December 2011.

Three dimensional model design is a well-known and studied field, with numerous real-world applications. However, the manual construction of these models can often be time-consuming to the average user, despite the advantages offered through computational advances. This thesis presents an approach to the design of 3D structures using evolutionary computation and L-systems, which involves the automated production of such designs using a strict set of fitness functions. These functions focus on the geometric properties of the models produced, as well as their quantifiable aesthetic value – a topic which has not been widely investigated with respect to 3D models. New extensions to existing aesthetic measures are discussed and implemented in the presented system in order to produce designs which are visually pleasing. The system itself facilitates the construction of models requiring minimal user initialization and no user-based feedback throughout the evolutionary cycle. The genetic programming evolved models are shown to satisfy multiple criteria, conveying a relationship between their assigned aesthetic value and their perceived aesthetic value. Exploration into the applicability and effectiveness of a multi-objective approach to the problem is also presented, with a focus on both performance and visual results. Although subjective, these results offer insight into future applications and study in the field of computational aesthetics and automated structure design.

Machine Perception and Learning from the Evolutionary Viewpoint: The Hyperball Algorithms (Take Two)
Vlad Wojcik and Behzad Salami, December 2010.

We present a set of highly parallel (fine grain), general pattern classification algorithms aimed at mimicking the pattern identification, classification and learning skills of animals. We first cover an idealized case of supervised learning from an infallible expert, followed by the realistic cases of learning from fallible experts, as well as the case of autonomous (i.e. self-supervised) learning. To ensure a wide range of applicability, our algorithms use only the basic concepts of mathematics: set theory and the theory of metric spaces. This methodology, already tested in the computer vision domain, allows for the creation of expert systems capable of continuous learning, of forgetting less useful facts, and of demonstrating curiosity and initiative in maintaining their adaptation to an environment evolving sufficiently slowly.

As proof of concept we offer a sketch of automated deep sky observation in search of unknown objects, together with a detailed solution to the problem of automated mineral identification in petrographic thin sections. These illustration problems seem intractable using traditional means.

The Evolution of Higher-level Biochemical Reaction Models
Brian J. Ross, December 2010.

Computational tools for analyzing biochemical phenomena are becoming increasingly important. Recently, high-level formal languages for modeling and simulating biochemical reactions have been proposed. These languages make the formal modeling of complex reactions accessible to domain specialists outside of theoretical computer science. This research explores the use of genetic programming to automate the construction of models written in one such language. Given a description of desired timecourse data, the goal is for genetic programming to construct a model that might generate the data. The language investigated is Kahramanogullari’s and Cardelli’s PIM language. The PIM syntax is defined in a grammar-guided genetic programming system. All time series generated during simulations are described by statistical feature tests, and the fitness evaluation compares feature proximity between the target and candidate solutions. Target PIM models of varying complexity are used as target expressions for genetic programming. Results were very successful in all cases. One reason for this success is the compositional nature of PIM, which is amenable to genetic program search.

Splitting Atoms in Relational Algebras
Prathap Siddavaatam and Michael Winter, December 2010.

Splitting atoms in a relation algebra is a common tool to generate new algebras from old ones. This includes constructing non-representable algebras from representable structures. The known method of splitting atoms does not allow the starting algebra to contain bijections other than the identity. This is a major drawback of this method because interesting candidates in mereotopology do contain such bijections. An ad-hoc splitting was done in those examples, and the results have been published in several papers. With this paper we want to start a thorough investigation of possible splitting methods.

A First-Order Calculus for Allegories
Bahar Aameri and Michael Winter, December 2010.

In this paper we present a language and a first-order calculus for formal reasoning about relations based on the theory of allegories. Since allegories are categories, the language is typed in Church-style. We show soundness and completeness of the calculus and demonstrate its usability by presenting the RelAPS system, a proof assistant for relational categories based on the calculus presented here.

A First-Order Characterization of Allen’s Interval Algebra
Michael Winter, December 2010.

In this paper we are interested in the order structure induced by the relations ‘before’ and ‘meets’ of Allen’s interval algebra. We provide an abstract definition of this kind of order structure called an interval order. We prove that every interval order is isomorphic to the corresponding order on a set of finite intervals over a linear order without endpoints. As a consequence of this characterization we obtain that any partition of the set of relations on a set consisting of 13 relations that satisfy the composition table of Allen’s algebra is indeed an algebra of intervals.
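On concrete intervals, represented here as endpoint pairs over the integers (an illustrative encoding, not the paper's abstract formalism), the two generating relations look as follows:

```python
def before(a, b):
    """Interval a ends strictly before interval b begins."""
    return a[1] < b[0]

def meets(a, b):
    """Interval a ends exactly where interval b begins."""
    return a[1] == b[0]

x, y, z = (0, 2), (2, 5), (6, 9)
print(meets(x, y))   # True: x ends where y starts
print(before(x, z))  # True: there is a gap between x and z
print(before(x, y))  # False: 'meets' excludes 'before'
```

The remaining eleven of Allen's thirteen relations (overlaps, during, starts, finishes, equals, and the converses) are definable from endpoint comparisons in the same style.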

Functions Definable by Arithmetic Circuits
Ian Pratt-Hartmann and Ivo Düntsch, January 2009.

An arithmetic circuit is a labelled, directed graph specifying a cascade of arithmetic and logical operations to be performed on sets of non-negative integers. In this paper, we consider the definability of functions by means of arithmetic circuits. We prove two negative results: the first shows, roughly, that a function is not circuit-definable if it has an infinite range and sub-linear growth; the second shows, roughly, that a function is not circuit-definable if it has a finite range and fails to converge on certain `sparse’ chains under inclusion. We observe that various functions of interest fall under these descriptions.
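The lifting of an arithmetic operation to sets of non-negative integers, the basic ingredient of such circuits, is easy to state concretely (an illustrative fragment, not the paper's formal machinery):

```python
def lifted_add(A, B):
    """Lift + from numbers to sets: {a + b | a in A, b in B}."""
    return {a + b for a in A for b in B}

evens = {0, 2, 4}
odds = {1, 3}
print(sorted(lifted_add(evens, odds)))  # [1, 3, 5, 7]
```

A circuit cascades such lifted operations with the Boolean set operations; the definability questions above ask which functions on sets such cascades can express.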

Stonian p-Ortholattices: A new approach to the mereotopology RT0
Torsten Hahmann, Michael Gruninger and Michael Winter, April 2009.

This paper gives an isomorphic representation of the subtheories RT-, RT-EC, and RT of Asher and Vieu’s first-order ontology of mereotopology RT0. It corrects and extends previous work on the representation of these mereotopologies. We develop the theory of p-ortholattices — lattices that are both orthocomplemented and pseudocomplemented — and show that the identity (x • y)* = x* + y* defines the natural class of Stonian p-ortholattices. Equivalent conditions for a p-ortholattice to be Stonian are given. The main contribution of the paper consists of a representation theorem for RT- as Stonian p-ortholattices. Moreover, it is shown that the class of models of RT-EC is isomorphic to the non-distributive Stonian p-ortholattices, and a representation of RT is given by a set of four algebras, one of which needs to be a subalgebra of the present model. As a corollary we obtain that Axiom (A11) — the existence of two externally connected regions — is in fact a theorem of the remaining axioms of RT.

Timed Contact Algebras
Ivo Düntsch and Michael Winter, April 2009.

Timed contact algebras constitute an approach to a temporal version of a region based theory of space. In the general theory the underlying model of time does not have any structure, i.e. time is neither ordered nor required to be discrete or continuous. In this paper we want to investigate contact structures with a betweenness relation on points in time. Furthermore, we are interested in the relationship of a continuity axiom, an axiom of construction, and the fine structure of the time relation.

Cardinal Addition in Distributive Allegories
Yasuo Kawahara and Michael Winter, April 2009.

In this paper we want to develop an addition operation on an abstract notion of cardinality of relations in a distributive allegory. Assuming suitable extra structure on the underlying allegory we are going to define addition and investigate its basic properties.

Cylindric algebras and relational databases
Ivo Düntsch, April 2009.

An interpretation of the relational data model and various dependencies is given within the framework of cylindric algebras. It turns out that there is a natural translation of relational databases into cylindric set lattices, and that Codd’s relational operators have a simple interpretation in these structures. Consequently, various database constraints can be expressed in the equational language, and recent results in equational logic have been able to shed some light on the expressiveness and axiomatizability of these constraints.

Complements in Distributive Allegories
Michael Winter, April 2009.

It is known in topos theory that the axiom of choice implies that the topos is Boolean. In this paper we want to prove and generalize this result in the context of allegories. In particular, we will show that partial identities do have complements in distributive allegories with relational sums and total splittings, assuming the axiom of choice. Furthermore, we will discuss possible modifications of the assumptions used in that theorem.

On the Skeleton of Stonian p-Ortholattices
Michael Winter, Torsten Hahmann and Michael Gruninger, April 2009.

Boolean Contact Algebras (BCAs) establish the algebraic counterpart of the mereotopology induced by the Region Connection Calculus. Similarly, Stonian p-ortholattices serve as a lattice-theoretic version of the ontology RT- of Asher and Vieu. In this paper we study the relationship between BCAs and Stonian p-ortholattices. We show that the skeleton of every Stonian p-ortholattice is a BCA, and, conversely, that every BCA is isomorphic to the skeleton of a Stonian p-ortholattice. Furthermore, we prove the equivalence between algebraic conditions on Stonian p-ortholattices and the axioms C5, C6, and C7 for BCAs.

Choosing the root node of a quadtree
Xiang Yin, Ivo Düntsch and Günther Gediga, May 2009.

Granular computing is closely related to the depth of the detail of information with which we are presented, or choose to process. In spatial cognition and image processing such detail is given by the resolution of a picture. The quadtree representation of an image offers a quick look at the image at various stages of granularity, and successive quadtree representations can be used to represent change. In this paper we present a heuristic algorithm to find a root node of a region quadtree which reduces the number of leaves when compared with the standard quadtree decomposition.
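For background, the standard region quadtree decomposition that the heuristic aims to improve upon can be sketched as follows; the root-selection algorithm itself is not reproduced here, and the function name is hypothetical:

```python
def quadtree_leaves(img, x, y, size):
    """Count the leaves of a region quadtree over a square binary image.

    A block becomes a leaf when all of its pixels share one value;
    otherwise it splits into four quadrants. img is a list of rows and
    size is a power of two.
    """
    vals = {img[y + j][x + i] for j in range(size) for i in range(size)}
    if len(vals) == 1 or size == 1:
        return 1
    h = size // 2
    return sum(quadtree_leaves(img, x + dx, y + dy, h)
               for dy in (0, h) for dx in (0, h))

img = [[0, 0, 1, 1],
       [0, 0, 1, 1],
       [0, 0, 0, 0],
       [0, 0, 0, 0]]
print(quadtree_leaves(img, 0, 0, 4))  # 4: one leaf per quadrant
```

Choosing a different root (equivalently, translating the image within the decomposition grid) can merge uniform regions that the fixed decomposition splits, which is the effect the proposed heuristic aims for.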

User-Guided Evolution of Granular Synthesis
Corrado Coia and Brian J. Ross, July 2009.

Granular synthesis is a form of audio synthesis that consists of breaking audio into tiny millisecond chunks. This paper describes the EGDE system. EGDE (Evolutionary Granular Delay Environment) is a software plug-in that permits its granular delay effects to be time synchronized with the host application. Its interface features an interactive genetic algorithm, to be used for user exploration of granular synthesis parameters. Interactive evolution is ideal in this application, as it permits a user to interactively explore a wide variety of effects that arise with different combinations of the granular delay parameters. A goal of EGDE is to provide a granular synthesis interface with an intuitive and efficient means of auditioning and rating effect parameters, while minimizing user exhaustion. This is particularly important with granular synthesis, since many parameter settings will be undesirable to most listeners. Typically, each parameter set in the population can be auditioned in a matter of a few seconds or less. A minimalist 3-value evaluation scheme lets the user either clone, use, or delete candidate parameter sets. Mutation and crossover rates can also be fine-tuned as desired. The interface lets the user directly tweak parameters, thus permitting direct user editing of chromosomes. The evolutionary interface could be easily adapted to other musical applications in the future, for example, generalized synthesis engines.

Evolutionary Synthesis of Stochastic Gene Network Models using Feature-based Search Spaces
Janine Imada, November 2009.

A feature-based fitness function is applied in a genetic programming system to synthesize stochastic gene regulatory network models whose behaviour is defined by a time course of protein expression levels. Typically, when targeting time series data, the fitness function is based on a sum-of-errors involving the values of the fluctuating signal. While this approach is successful in many instances, its performance can deteriorate in the presence of noise. This thesis explores a fitness measure determined from a set of statistical features characterizing the time series’ sequence of values, rather than the actual values themselves. Through a series of experiments involving symbolic regression with added noise and gene regulatory network models based on the stochastic pi-calculus, it is shown to successfully target oscillating and non-oscillating signals. This practical and versatile fitness function offers an alternate approach, worthy of consideration for use in algorithms that evaluate noisy or stochastic behaviour.

Functions Definable by Numerical Set-Expressions
Ian Pratt-Hartmann and Ivo Düntsch, December 2009.

A numerical set-expression is a term specifying a cascade of arithmetic and logical operations to be performed on sets of non-negative integers. If these operations are confined to the usual Boolean operations together with the result of lifting addition to the level of sets, we speak of additive circuits. If they are confined to the usual Boolean operations together with the result of lifting addition and multiplication to the level of sets, we speak of arithmetic circuits. In this paper, we investigate the definability of sets and functions by means of additive and arithmetic circuits, occasionally augmented with additional operations.

Complex Algebras of Arithmetic
Ivo Düntsch and Ian Pratt-Hartmann, December 2009.

An arithmetic circuit is a labeled, acyclic directed graph specifying a sequence of arithmetic and logical operations to be performed on sets of natural numbers. Arithmetic circuits can also be viewed as the elements of the smallest subalgebra of the complex algebra of the semiring of natural numbers. In the present paper we investigate the algebraic structure of complex algebras of natural numbers and make some observations regarding the complexity of various theories of such algebras.

The Lattice of Contact Relations on a Boolean Algebra
Ivo Düntsch and Michael Winter, January 2008.

Contact relations on an algebra have been studied since the early part of the previous century, and have recently become a powerful tool in several areas of artificial intelligence, in particular, qualitative spatial reasoning and ontology building. In this paper we investigate the structure of the set of all contact relations on a Boolean algebra.

A discrete duality between apartness algebras and apartness frames
Ivo Düntsch and Ewa Orlowska, January 2008.

Apartness spaces were introduced as a constructive counterpart to proximity spaces which, in turn, aimed to model the concept of nearness of sets in a metric or topological environment. In this paper we introduce apartness algebras and apartness frames intended to be abstract counterparts to the apartness spaces of Bridges and Vita, and we prove a discrete duality for them.

Using Feature-based Fitness Evaluation in Symbolic Regression with Added Noise
Janine Imada and Brian Ross, April 2008.

Symbolic regression is a popular genetic programming (GP) application. Typically, the fitness function for this task is based on a sum-of-errors, involving the values of the dependent variable directly calculated from the candidate expression. While this approach is extremely successful in many instances, its performance can deteriorate in the presence of noise. In this paper, a feature-based fitness function is considered, in which the fitness scores are determined by comparing the statistical features of the sequence of values, rather than the actual values themselves. The set of features used in the fitness evaluation are customized according to the target, and are drawn from a wide set of features capable of characterizing a variety of behaviours. Experiments examining the performance of the feature-based and standard fitness functions are carried out for non-oscillating and oscillating targets in a GP system which introduces noise during the evaluation of candidate expressions. Results show strength in the feature-based fitness function, especially for the oscillating target.
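A minimal sketch of feature-based fitness, comparing statistical signatures of two series rather than their raw values, might look like this; the three features here are assumptions for illustration, whereas the paper draws on a wider, target-customized feature set:

```python
import statistics

def features(series):
    """A small statistical signature of a time series."""
    return (statistics.mean(series),
            statistics.pstdev(series),
            max(series) - min(series))

def feature_distance(a, b):
    """Sum of absolute differences between two feature signatures."""
    return sum(abs(x - y) for x, y in zip(features(a), features(b)))

target = [0.0, 1.0, 0.0, 1.0]
noisy = [0.1, 0.9, 0.1, 0.9]
print(round(feature_distance(target, noisy), 3))  # 0.3
```

Because the signature summarizes the series, point-wise noise perturbs the score far less than a sum-of-errors over the raw values would.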

Probabilistic Granule Analysis
Ivo Düntsch and Günther Gediga, May 2008.

We present a semi-parametric approach to evaluate the reliability of rules obtained from a rough set information system by replacing strict determinacy by predicting a random variable which is a mixture of latent probabilities obtained from repeated measurements of the decision variable. It is demonstrated that the algorithm may be successfully used for unsupervised learning.

Task Performance Metrics on Liquid Crystal Displays
Dave Bockus and Jenny Quay, July 2008.

Twenty participants performed a selection and targeting task while performance indices were recorded. User performance varied as the size and resolution of the screen changed during task performance. Performance was calculated using formulas represented in the Fitts’ and Hick’s models. Results show that the largest screen with the highest resolution had good performance times, but a smaller screen with a lower resolution also yielded good performance times. The results indicate that bigger is not always better when working on LCD screens, and that the user can become optically challenged when trying to work with a high resolution on a smaller screen. Results show that there is a relationship between the perceived font size on the screen and the users’ ability to perform comprehension and recognition while performing certain tasks.
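For reference, the Fitts' model mentioned above relates movement distance and target width to an index of difficulty in bits; the common Shannon formulation can be sketched as follows (standard background, not necessarily the exact formulas used in the study):

```python
import math

def index_of_difficulty(distance, width):
    """Fitts' index of difficulty (Shannon formulation), in bits."""
    return math.log2(distance / width + 1)

def throughput(distance, width, movement_time):
    """Throughput in bits per second for one targeting movement."""
    return index_of_difficulty(distance, width) / movement_time

# A 480 px movement to a 32 px target completed in 0.8 s:
print(index_of_difficulty(480, 32))  # 4.0 bits
print(throughput(480, 32, 0.8))      # 5.0 bits/s
```

Changing screen size or resolution changes the effective distance and width in such formulas, which is how per-display performance indices become comparable.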

Towards Automated Derivation in Theory of Allegories
Joel Glanfield, July 2008.

We provide an algorithm that automatically derives many provable theorems in the equational theory of allegories (ALL). This was accomplished by noticing properties of an existing decision algorithm that could be extended to provide a derivation in addition to a decision certificate. We also suggest improvements and corrections to previous research in order to motivate further work on a complete derivation mechanism. The results presented here are significant for those interested in relational theories, since we essentially have a subtheory where automatic proof-generation is possible. This is also relevant to program verification since relations are well-suited to describe the behaviour of computer programs. It is likely that extensions of the theory of allegories are also decidable and possibly suitable for further expansions of the algorithm presented here.

Algorithms for Speedy Visual Recognition and Classification of Patterns Formed on Rectangular Imaging Sensors
Vlad Wojcik and Pascal Comte, July 2008.

The real-time tasks of image interpretation, pattern clustering, recognition and classification necessitate quick analysis of registered patterns. Current algorithms and technological approaches are slow and rigid. Within an evolutionary context we present here a novel, biologically inspired solution to these problems, with the aim of accelerating processing. Our massively parallel algorithms are applicable to patterns recorded on rectangular imaging arrays of arbitrary sizes and resolutions. The systems we propose are capable of continuous learning, as well as of selectively forgetting less important facts, and can thus maintain their adaptation to an environment evolving sufficiently slowly. A case study of visual signature recognition illustrates the presented ideas and concepts.

Machine Perception and Learning from the Evolutionary Viewpoint: The Hyperball Algorithms
Vlad Wojcik and Behzad Salami, Sept 2008.

We present a set of highly parallel (fine grain), general pattern classification algorithms aimed at mimicking the pattern identification, classification and learning skills of animals. We first cover an idealized case of supervised learning from an infallible expert, followed by the realistic cases of learning from fallible experts, as well as the case of autonomous (i.e. self-supervised) learning. To ensure a wide range of applicability, our algorithms use only the basic concepts of mathematics: set theory and the theory of metric spaces. The proposed methodology allows for the creation of expert systems capable of continuous learning and of forgetting less useful facts, and so of maintaining their adaptation to an environment evolving sufficiently slowly.

As proof of concept we offer a sketch of automated deep sky observation in search of unknown objects, together with a detailed solution to the problem of automated mineral identification in petrographic thin sections. These illustration problems seem intractable using traditional means.

A multi-modal logic for disagreement and exhaustiveness
Ivo Düntsch and Beata Konikowska, January 2007.

The paper explores two basic types of relations between objects of a Pawlak-style information system generated by the values of some attribute of those objects: disagreement (disjoint sets of values) and exhaustiveness (sets of values adding up to the whole universe of the attribute). Out of these two fundamental types of relations, most other types of relations on objects of an information system considered in the literature can be derived – as, for example, indiscernibility, similarity and complementarity. The algebraic properties of disagreement and indiscernibility relations are explored, and a representation theorem for each of these two types of relations is proved.

The notions of disagreement and exhaustiveness relations for a single attribute are extended to relations generated by arbitrary sets of attributes, yielding two families of relations parametrized by sets of attributes. They are used as accessibility relations to define a multi-modal logic with modalities corresponding to the lower and upper approximation of a set in Pawlak’s rough set theory. Finally, a complete Rasiowa-Sikorski deduction system of that logic is developed.

Arrow Categories
Michael Winter, January 2007.

Goguen categories were introduced as a suitable categorical description of L-fuzzy relations, i.e., of relations taking values from an arbitrary complete Brouwerian lattice L instead of the unit interval [0,1] of the real numbers. In this paper we want to study the algebraic structures derived from Goguen categories by replacing its second-order axiom by some weaker versions.

Using Genetic Programming to Synthesize Monotonic Stochastic Processes
Brian Ross, April 2007.

The automatic synthesis of stochastic concurrent processes is investigated. We use genetic programming to automatically evolve a set of stochastic pi-calculus expressions that generate execution behaviour conforming to some supplied target behaviour. We model the stochastic pi-calculus in a grammatically-guided genetic programming system, and we use an efficient interpreter based on the SPIM abstract machine model by Phillips and Cardelli. The behaviours of target systems are modelled as streams of numerical time series for different variables of interest. We were able to successfully evolve stochastic pi-calculus systems that exhibited the target behaviours. Successful experiments considered target processes with continuous monotonic behaviours.

Waste Collection Vehicle Routing Problem with Time Window using Multi-objective Genetic Algorithms
Beatrice Ombuki-Berman, Andrew Runka and Franklin Hanshar, May 2007.

We study a waste collection vehicle routing problem with time windows (VRPTW) complicated by multiple disposal trips and driver’s lunch breaks. Recently Kim et al. [1] introduced and addressed this problem using an extension of the well-known Solomon’s insertion approach, and a clustering-based algorithm. We propose and present the results of an initial study of a multi-objective genetic algorithm for the waste collection VRPTW using a set of benchmark data from real-world problems obtained by Kim et al.
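One building block of any VRPTW fitness evaluation is checking a route against its time windows; a minimal sketch with hypothetical names follows (this is illustrative, not Kim et al.'s algorithm or the authors' system):

```python
def route_feasible(route, travel, windows, service=0.0):
    """Check time-window feasibility of a route (a list of stop ids).

    travel[(i, j)] is the travel time from stop i to stop j, and
    windows[s] is the (earliest, latest) allowed service start at s.
    A vehicle arriving early waits until the window opens.
    """
    t = 0.0
    for prev, cur in zip(route, route[1:]):
        t += travel[(prev, cur)]
        earliest, latest = windows[cur]
        if t > latest:          # arrived after the window closed
            return False
        t = max(t, earliest) + service
    return True

travel = {("depot", "a"): 2.0, ("a", "b"): 3.0}
windows = {"a": (0.0, 4.0), "b": (6.0, 8.0)}
print(route_feasible(["depot", "a", "b"], travel, windows))  # True
```

A multi-objective GA could then treat window violations, total distance, vehicle count, disposal trips and lunch breaks as separate objectives rather than folding them into a single penalty.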

An Ordered Category of Processes
Michael Winter, Oct 2007.

Processes can be seen as relations extended in time. In this paper we want to investigate this observation by deriving an ordered category of processes. We model processes as co-algebras of a relator on a Dedekind category, up to bisimilarity. On those equivalence classes we define a lower semi-lattice structure and a monotone composition operation.

Cardinality in Allegories
Yasuo Kawahara and Michael Winter, Nov 2007.

In this paper we want to investigate two notions of the cardinality of relations in the context of allegories. The different axiom systems are motivated by the existence of injective and surjective functions, respectively. In both cases we provide a canonical cardinality function and show that it is initial in the category of all cardinality functions over the given allegory.

Safe Control Architectures for Mobile Robots Interacting with Humans
Jerzy A. Barchanski, Nov 2007.

We first review the principal concepts of system safety. Then we analyze the hazards of human–robot interaction. We focus on the hazard of a robot colliding with a human and devise an appropriate robot behavior mitigating this hazard. After reviewing the principal classes of robot control architectures, we evaluate their effectiveness in collision avoidance. In conclusion, we recommend one of the hybrid control architectures, which appears to be the safest according to our analysis.

Remarks on lattices of contact relations
I. Düntsch and Michael Winter, February 2006.

In the present report, we study collections of contact relations on a fixed Boolean algebra and show that they can be provided with a rich lattice structure. This follows from a representation theorem which associates with each contact relation a closed graph on the ultrafilter space of the underlying Boolean algebra and vice versa. We also consider collections of special contact relations which have gained some importance in qualitative spatial reasoning.

Search Space Analysis of Recurrent Spiking and Continuous-time Neural Networks
Mario Ventresca and Beatrice Ombuki, February 2006.

The problem of designing recurrent continuous-time and spiking neural networks is NP-hard. A common practice is to utilize stochastic searches, such as evolutionary algorithms, to automatically construct acceptable networks. The outcome of the stochastic search is related to its ability to navigate the search space of neural networks and discover those of high quality. In this paper we investigate the search space associated with designing the above recurrent neural networks in order to determine which network should be easier to design automatically via a stochastic search. Our investigation uses two popular dynamic systems problems as benchmarks: (1) the Hénon map and (2) the inverted pendulum.

Weak Relational Products
Michael Winter, March 2006.

The existence of relational products in categories of relations is strongly connected with the representability of that category. In this paper we propose a canonical weakening of the notion of a relational product. Unlike the strong version, any (small) category of relations can be embedded into a suitable category providing all weak relational products. Furthermore, we investigate the categorical properties of the new construction and prove several (weak) versions of propositions well known for relational products.

A Relation Algebraic Theory of Bisimulations
Michael Winter, March 2006.

In this paper we develop an algebraic/categorical theory of bisimulations using relational methods. We define a general notion that is capable of handling different versions of bisimilarity in a common context. Furthermore, the approach relates bisimulations to the notion of a covering known from the theory of graphs and their applications in topology and complex analysis.

Betweenness and comparability obtained from binary relations
Ivo Düntsch and Alasdair Urquhart, March 2006.

We give a brief overview of the axiomatic development of betweenness relations, and investigate the connections between these and comparability graphs. Furthermore, we characterize betweenness relations induced by reflexive and antisymmetric binary relations, thus generalizing earlier results on partial orders. We conclude with a sketch of the algorithmic aspects of recognizing induced betweenness relations.

Epistasis in Multi-Objective Evolutionary Recurrent Neuro-Controllers
Mario Ventresca and Beatrice Ombuki-Berman, October 2006.

This paper presents an information-theoretic analysis of the epistatic effects present in evolving recurrent neural networks: how do gene-gene interactions change as the evolutionary process progresses, what does this reveal about problem difficulty, and to what extent does the environment influence epistasis? Our investigation concentrates on multi-objective evolution, where the major task to be performed is broken into sub-tasks which are then used as our objectives. Our results show that the behavior of epistasis during the evolutionary process is strongly dependent on the environment. Experimental results are presented for a path-finding robot application using continuous-time and spiking neuro-controllers.

Search Difficulty of Two-Connected Ring-based Topological Network Designs
Beatrice Ombuki-Berman and Mario Ventresca, November 2006.

Ring-based network design problems have many important applications, especially in the fields of logistics and telecommunications. This paper focuses on the recently proposed two-connected network with bounded rings problem. We investigate the search difficulty of both the Euclidean edge length and unit edge length versions of this problem by performing an information-theoretic fitness landscape analysis for several problem instances. Our results further confirm the hypothesis that the unit edge length version of the problem is indeed more difficult. The investigation also reveals that smaller ring-size bounds lead to more difficult problems for both the Euclidean and unit edge length versions.

Relation algebras and their application in qualitative spatial reasoning
I. Düntsch, February 2005.

Qualitative temporal and spatial reasoning is in many cases based on binary relations such as before, after, starts, contains, contact, part of, and others derived from these by relational operators. The calculus of relation algebras is an equational formalism; it tells us which relations must exist, given several basic operations, such as Boolean operations on relations, relational composition and converse. Each equation in the calculus corresponds to a theorem, and, for a situation where there are only finitely many relations, one can construct a composition table which can serve as a look-up table for the relations involved. Since the calculus handles relations, no knowledge about the concrete geometrical objects is necessary. In this sense, relational calculus is “pointless”.

Relation algebras were introduced into temporal reasoning by Allen [1] and into spatial reasoning by Egenhofer and Sharma [32]. The calculus of relation algebras is also well suited to handle binary constraints as demonstrated e.g. by Ladkin and Maddux [55]. In the present paper I will give an introduction to relation algebras, and an overview of their role in qualitative temporal and spatial reasoning.

Evolutionary Algorithms for Optimal Error-Correcting Codes
W. Haas and S. Houghten, May 2005.

The maximum possible number of codewords in a q-ary code of length n and minimum distance d is denoted Aq(n,d). It is a fundamental problem in coding theory to determine this value for given parameters q, n and d. Codes that attain the maximum are said to be optimal. Unfortunately, for many different values of these parameters, the maximum number of codewords is currently unknown: instead we have a known upper bound and a known lower bound for this value.

In this paper, we investigate the use of different evolutionary algorithms for improving lower bounds for given parameters. We relate this problem to the well-known Maximum Clique Problem. We compare the performance of the evolutionary algorithms to Hill Climbing, Beam Search, Simulated Annealing, and greedy methods. We found that the GAs outperformed all other algorithms in general; furthermore, the difference in performance became more significant when considering harder test cases.
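The greedy baseline mentioned above can be illustrated with a lexicode-style construction (a sketch under our own assumptions, not the implementation used in the report): scan all q-ary words of length n in lexicographic order and keep each word whose Hamming distance to every retained codeword is at least d. The size of the resulting code is a lower bound on Aq(n,d).

```python
from itertools import product

def hamming(u, v):
    """Number of positions in which two equal-length words differ."""
    return sum(a != b for a, b in zip(u, v))

def greedy_code(q, n, d):
    """Lexicode-style greedy construction: the size of the returned code
    is a lower bound on Aq(n, d).  Exhaustive over all q^n words, so only
    practical for small parameters."""
    code = []
    for w in product(range(q), repeat=n):
        if all(hamming(w, c) >= d for c in code):
            code.append(w)
    return code
```

For instance, `greedy_code(2, 5, 3)` yields a binary code of size 4, matching the known value A2(5,3) = 4; for harder parameter sets the greedy bound generally falls short of optimal, which is where evolutionary search becomes attractive.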

Evolving Dynamic Bayesian Networks with Multi-objective Genetic Algorithms
B.J. Ross and E. Zuviria, May 2005.

A dynamic Bayesian network (DBN) is a probabilistic network that models interdependent entities that change over time. Given example sequences of multivariate data, we use a genetic algorithm to synthesize a network structure that models the causal relationships that explain the sequence. We use a multi-objective evaluation strategy with a genetic algorithm. The multi-objective criteria are a network’s probabilistic score and structural complexity score. Our use of Pareto ranking is ideal for this application, because it naturally balances the effect of the likelihood and structural simplicity terms used in the BIC network evaluation heuristic. We use a simple structural scoring formula, which tries to keep the number of links in the network approximately equivalent to the number of variables. We also use a simple representation that favours sparsely connected networks similar in structure to those modeling biological phenomena. Our experiments show promising results when evolving networks ranging from 10 to 30 variables, using a maximal connectivity of between 3 and 4 parents per node. The results from the multi-objective GA were superior to those obtained with a single-objective GA.

On Problems in Polymorphic Object-Oriented Languages With Self Types and Matching
M. Winter, June 2005.

Subtyping is a basic concept in object-oriented languages. It supports subsumption but, unfortunately, it does not support inheritance of binary methods, i.e., methods taking another argument of type Self, the same type as the object itself. For this reason, a relation on recursive object types, called matching, has been proposed. This relation does not support subsumption, but it allows binary methods to be inherited. Two different definitions of matching, called F-bounded and higher-order subtyping, have been proposed and discussed. It was shown that the higher-order interpretation has better theoretical properties, i.e., it leads to a reflexive and transitive matching relation. In this paper we concentrate on two problems in languages with self types and matching based on the higher-order interpretation. We show that the flexibility of self types may not allow the programmer to define certain classes and/or methods which are based on constant values. Furthermore, the higher-order interpretation, especially in the context of bounded quantification, is too restrictive. We argue that a language should be based on both versions of matching and a notion of a type This distinguished from the type Self.

Time-Dependent Contact Structures in Goguen Categories
M. Winter, June 2005.

In this paper we focus on a time-extended theory of contact. It turns out that a suitable theory can be defined using an L-valued or L-fuzzy version of a contact relation. We study this structure in the context of Goguen categories, a suitable categorical formalization of L-valued/fuzzy relations.

Weak Contact Structures
I. Düntsch and M. Winter, June 2005.

In this paper we investigate weak contact relations C on a lattice L, in particular, the relation between various axioms for contact, and their connection to the algebraic structure of the lattice. Furthermore, we will study a notion of orthogonality which is motivated by a weak contact relation in an inner product space. Although this is clearly a spatial application, we will show that, in case L is distributive and C satisfies the orthogonality condition, the only weak contact relation on L is the overlap relation; in particular no RCC model satisfies this condition.

Bounds on Optimal Edit Metric Codes
S.K. Houghten, D. Ashlock and J. Lennarz, July 2005.

The edit distance between two strings is the minimal number of substitutions, deletions, or insertions required to transform one string into another. An error correcting code over the edit metric includes features from deletion-correcting codes as well as the more traditional codes defined using Hamming distance. Applications of edit metric codes include the creation of robust tags over the DNA alphabet. While codes over the edit metric are analogous to similar codes over the Hamming metric, little of the beautiful theory survives. The block structure of a word is its partition into maximal subwords composed of a single character. The size of a sphere about a word in the edit metric is heavily dependent on the block structure of the word, creating a substantial divergence from the theory for the Hamming metric.

This paper explores the theory underlying edit metric codes for small alphabets. An optimal code is one with the maximum possible number of codewords for its length and minimum distance. We provide tables of bounds on code sizes for edit codes with short length and small alphabets. We present several heuristics for constructing codes.
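The edit distance defined above is computed by the standard dynamic-programming recurrence; a minimal illustrative sketch (not code from the report):

```python
def edit_distance(a: str, b: str) -> int:
    """Minimal number of substitutions, insertions, or deletions
    transforming string a into string b (Levenshtein distance)."""
    # prev[j] holds the distance between a[:i-1] and b[:j]; cur builds row i.
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, start=1):
        cur = [i]
        for j, cb in enumerate(b, start=1):
            cur.append(min(
                prev[j] + 1,               # delete ca
                cur[j - 1] + 1,            # insert cb
                prev[j - 1] + (ca != cb),  # substitute (free if equal)
            ))
        prev = cur
    return prev[len(b)]
```

Over the DNA alphabet, for example, `edit_distance("ACGT", "AGT")` is 1 (one deletion), whereas the Hamming distance is undefined for words of unequal length.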

Relational semantics through duality
E. Orlowska, I. Rewitzky and I. Düntsch, July 2005.

In this paper we show how classical duality results, extended to a Duality via Truth, contribute to the development of relational semantics for various modal-like logics. In particular, we present a Duality via Truth for some classes of information algebras and frames. We also show that the full categorical formulation of classical duality extends to a full Duality via Truth.

A Novel Variation Operator for More Rapid Evolution of DNA Error Correcting Codes
D. Ashlock and S. Houghten, July 2005.

Error correcting codes over the edit metric have been used as embedded DNA markers in at least one sequencing project. The algorithm used to construct those codes was an evolutionary algorithm whose fitness function has exponential time complexity. Presented here is a substantially faster evolutionary algorithm for locating error correcting codes over the edit metric that exhibits equivalent or only slightly inferior performance on test cases. The new algorithm can produce codes for parameters where the run-time of the earlier algorithm was prohibitive. The new algorithm is a novel type of evolutionary algorithm using a greedy algorithm to implement a variation operator. This variation operator is the sole variation operator used and has unary, binary, and k-ary forms. The unary and binary forms are compared, with the binary form found to be superior. Population size and the rate of introduction of random material by the variation operator are also studied. A high rate of introduction of random material and a small population size are found to work best.

Relational attribute systems II : Reasoning with relations in information structures
I. Düntsch, G. Gediga and Ewa Orlowska, October 2005.

We describe deduction mechanisms for various types of databases with incomplete information, in particular relational attribute systems, which we have introduced earlier.

Rough relation algebras revisited
I. Düntsch and M. Winter, October 2005.

Rough relation algebras arise from Pawlak’s information systems by considering as objects ordered pairs over a fixed set X. Thus, the subsets to be approximated are binary relations over X, and hence we have at our disposal not only the set-theoretic operations, but also the relational operators of composition, converse, and the identity relation. In the present paper, a continuation of an earlier one, we investigate the structure of abstract rough relation algebras further.

Dynamic Vehicle Routing using Genetic Algorithms
F.T. Hanshar and B.M. Ombuki, November 2005.

Many difficult combinatorial optimization problems have been modeled as static problems. However, in practice many problems are dynamic, and some decisions have to be made before all the design data are known. For example, in the Dynamic Vehicle Routing Problem (DVRP), new customer orders appear over time, and routes must be reconfigured while executing the current solution. Montemanni et al. treated the DVRP as an extension of the standard vehicle routing problem (VRP) by decomposing it into a sequence of static VRPs, which they then solved with an ant colony system (ACS) algorithm. This paper presents a genetic algorithm (GA) methodology for the DVRP model employed in the ACS. The effectiveness of the proposed GA is evaluated using a set of benchmarks found in the literature. Compared with a tabu search approach implemented herein and the aforementioned ACS, the proposed GA performs better in minimizing travel costs.

Construction of Boolean contact algebras
I. Düntsch and M. Winter, January 2004.

We consider Boolean algebras endowed with a contact relation; these are abstractions of Boolean algebras of regular closed sets together with Whitehead’s connection relation, in which two non-empty regular closed sets are connected if they have a non-empty intersection. These are standard examples of structures used in qualitative reasoning, mereotopology, and proximity theory. We exhibit various methods by which such algebras can be constructed and give several non-standard examples, the most striking being a countable model of the Region Connection Calculus in which every proper region has infinitely many holes.

Multi-objective Genetic Algorithms for Vehicle Routing Problem with Time Windows
B. Ombuki, B.J. Ross and F. Hanshar, January 2004.

The Vehicle Routing Problem with Time Windows (VRPTW) is an extension of the capacity-constrained Vehicle Routing Problem (VRP). The VRPTW is NP-complete, and instances with 100 customers or more are very hard to solve optimally. We represent the VRPTW as a multi-objective problem and present a genetic algorithm solution using the Pareto ranking technique. We use a direct interpretation of the VRPTW as a multi-objective problem, in which the two objective dimensions are the number of vehicles and total cost (distance). An advantage of this approach is that it is unnecessary to derive weights for a weighted-sum scoring formula, which prevents the introduction of solution bias towards either problem dimension. We argue that the VRPTW is most naturally viewed as a multi-objective problem, in which vehicles and cost are of equal value, depending on the needs of the user. A result of our research is that the multi-objective optimization genetic algorithm returns a set of solutions that fairly considers both of these dimensions. Our approach is quite effective: it provides solutions competitive with the best known in the literature, as well as new solutions that are not biased toward the number of vehicles. A set of well-known benchmark data is used to compare the effectiveness of the proposed method for solving the VRPTW.
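Pareto ranking of this kind can be sketched in a few lines (an illustrative Python sketch with hypothetical objective tuples of the form (vehicles, distance), both minimized; not the report's implementation):

```python
def dominates(a, b):
    """a dominates b if it is no worse in every objective and strictly
    better in at least one (all objectives minimized)."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def pareto_rank(objs):
    """Goldberg-style Pareto ranking: rank 1 is the non-dominated front,
    rank 2 is the front after removing rank 1, and so on."""
    ranks = [0] * len(objs)
    rank, remaining = 1, set(range(len(objs)))
    while remaining:
        front = {i for i in remaining
                 if not any(dominates(objs[j], objs[i])
                            for j in remaining if j != i)}
        for i in front:
            ranks[i] = rank
        remaining -= front
        rank += 1
    return ranks
```

Because a solution with fewer vehicles but longer total distance is incomparable to one with more vehicles but shorter distance, both survive in the first front, which is how the bias of a weighted sum is avoided.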

Rough set data representation using binary decision diagrams
A. Muir, I. Düntsch and G. Gediga, February 2004.

A new information system representation, which inherently captures indiscernibility, is presented. The basic structure of this representation is a Binary Decision Diagram. We offer testing results for converting large data sets into a Binary Decision Diagram information system representation, and show how indiscernibility can be efficiently determined. Furthermore, a Binary Decision Diagram is used in place of a relative discernibility matrix to allow more efficient determination of the discernibility function than previous methods. The current focus is to build an implementation that aids in understanding how binary decision diagrams can improve Rough Set Data Analysis methods.
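The indiscernibility relation at the heart of this representation is easy to state directly: two objects are indiscernible with respect to a set of attributes if they take the same value on each of them. A plain-Python sketch (the BDD machinery of the report is not shown, and the data layout is our own assumption):

```python
from collections import defaultdict

def indiscernibility_classes(table, attributes):
    """Partition objects into indiscernibility classes: objects with
    identical values on the chosen attributes fall into the same class.

    table: dict mapping object name -> dict of attribute -> value.
    """
    classes = defaultdict(list)
    for obj, row in table.items():
        key = tuple(row[a] for a in attributes)
        classes[key].append(obj)
    return list(classes.values())
```

Coarsening the attribute set can only merge classes, which is the monotonicity that discernibility-function computations exploit.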

Ant Colony Optimization for Job Shop Scheduling Problem
M. Ventresca and B.M. Ombuki, February 2004.

This paper presents an application of the ant colony optimization metaheuristic to the job shop scheduling problem (JSSP). A pheromone alteration strategy is introduced which improves the basic ant system by utilizing the behaviour of artificial ants to perform local search. Experiments using well-known JSSP benchmark instances show that this approach improves on the performance of the basic ant system and is competitive with another recently proposed extension of the ant system, the MAX-MIN algorithm.

Safety aspects of autonomous robot software development
J.A. Barchanski, February 2004.

This paper is concerned with safety aspects of autonomous robot software development. Autonomous robots may operate unattended, and unsafe operation may cause significant human, economic, or mission losses. Similar problems were encountered early on in manufacturing automation, but autonomous robots may change their behavior and operate in much less controlled environments. We concentrate in this paper on the safety of robot control software. This software allows unprecedented complexity of robotic systems, which goes beyond the ability of current engineering techniques for assuring acceptable risk. Accidents arise due to robot autonomy or adaptability, or as a result of interactions among the components of the robot control architecture. We briefly discuss safety aspects of these features. We conclude the paper with a description of a robot software development process that takes safety constraints into account.

On the direct scaling approach of eliciting aggregated fuzzy information: The psychophysical view
G. Gediga, I. Düntsch and J. Adams-Webber, March 2004.

We describe an experiment in which we elicit aggregated fuzzy information, and link it to the “direct scaling” approach of S.S. Stevens (1957). We also re-analyze earlier experiments by Zimmermann and Zysno (1980) and compare the results to the proposed scaling method. Our results seem to indicate that a compensatory operator is not always necessary to model aggregated fuzzy information, and that humans internally use a simple rescaling.

Relational representation theorems for some lattice-based structures
I. Düntsch, E. Orlowska, A.M. Radzikowska and D. Vakarelov, March 2004.

Parsing probabilistic context free languages with multi-objective genetic algorithms
R. Lefuel and B.J. Ross, May 2004.

A genetic algorithm for the design of two-connected networks with bounded rings
M. Ventresca and B. Ombuki, July 2004.

A genetic algorithm is proposed for designing minimum-cost two-connected networks such that the shortest cycle to which each edge belongs does not exceed a given length. We provide numerical results based on randomly generated graphs found in the literature and compare the solution quality with that of tabu search and branch and bound. The results demonstrate the effectiveness of our algorithm. Except for our previous preliminary work, this paper is among the first to document the implementation of a genetic algorithm for the design of two-connected networks with the added constraint of bounded rings.

An effective genetic algorithm for the multi-depot vehicle routing problem
B. Ombuki and F. Hanshar, October 2004.

Efficient routing and scheduling of vehicles has significant economic implications for both the public and private sectors. This paper considers the multi-depot vehicle routing problem (MDVRP), an important extension of the vehicle routing problem. Surprisingly, we identified only one effective genetic algorithm (GA) that can compete with the powerful tabu search methods designed for the MDVRP. We present a simple but efficient GA that employs an indirect encoding and an adaptive inter-depot exchange strategy for the MDVRP with capacity and route-length restrictions. The algorithm is tested on a set of 23 classic MDVRP benchmark problems with 50 to 360 customers. The GA’s ability to find good solutions demonstrates that solution quality for the MDVRP does not strongly depend on the clustering algorithm used to initially assign customers to depots. Compared to the current best-known GA for the problem, our GA finds new best GA solutions for 17 of the 23 instances. Overall, the GA compares well with existing non-GA meta-heuristics, obtaining better or competitive results.

Region-based theory of discrete spaces: A proximity approach
I. Düntsch and D. Vakarelov, November 2004.

We introduce Boolean proximity algebras as a generalization of Efremovic proximities which are suitable in reasoning about discrete regions. Following Stone’s representation theorem for Boolean algebras, it is shown that each such algebra is isomorphic to a substructure of a complete and atomic Boolean proximity algebra.

Approximation operators in qualitative data analysis
I. Düntsch and G. Gediga, January 2003.

A large part of qualitative data analysis is concerned with approximations of sets on the basis of relational information. In this paper, we present various forms of set approximations via the unifying concept of modal-style operators. Two examples indicate the usefulness of the approach.

Lattice-based relation algebras and their representability
I. Düntsch, E. Orlowska and A.M. Radzikowska, March 2003.

Region-based theory of discrete spaces: A proximity approach
I. Düntsch and D. Vakarelov, March 2003.

We introduce Boolean proximity algebras as a generalization of Efremovic proximities which are suitable in reasoning about discrete regions. Following Stone’s representation theorem for Boolean algebras, it is shown that each such algebra is isomorphic to a substructure of a complete and atomic Boolean proximity algebra.

Subject-Based File-Link System for the Web
T. Arakawa and J.A. Barchanski, April 2003.

The system described in this paper is designed for organizing files on the Internet by subject and for sharing information among a group of people. The system consists of a server program and a client-side browser. The server program is a module of an Apache web server, while the client-side browser is a web page with functions enabled by JavaScript. The system creates XML files, each of which acts as a node in a logical tree structure, and stores them on the server. Each node represents a directory or a file of the tree structure. The system can be used by a small to medium group of people who want to share information specific to the group, or it can be used to organize web documents as a private directory collection.

Procedural 3D Texture Synthesis Using Genetic Programming
A. Hewgill and B.J. Ross, April 2003.

The automatic synthesis of procedural textures for 3D surfaces using genetic programming is investigated. Genetic algorithms employ a search strategy inspired by Darwinian natural evolution. Genetic programming uses genetic algorithms on tree structures, which are interpretable as computer programs or mathematical formulae. We use a texture generation language as a target language for genetic programming, and then use it to evolve textures having particular characteristics of interest. The texture generation language used here includes operators useful for texture creation, for example, mathematical operators, and colour and noise functions. In order to be practical for 3D model rendering, the language includes primitives that access surface information for the point being rendered, such as coordinate values, normal vectors, and surface gradients. A variety of experiments successfully generated procedural textures that displayed visual characteristics similar to the target textures used during training.

Relation algebras and their application in qualitative spatial reasoning
I. Düntsch, September 2003.

A Representation Theorem for Boolean Contact Algebras
I. Düntsch and M. Winter, September 2003.

We prove a representation theorem for Boolean contact algebras which implies that the axioms for the Region Connection Calculus (RCC) are complete for the class of subalgebras of the algebras of regular closed sets of weakly regular connected T1 spaces.

An Algorithm for Adaptive Maximization of Speedup
V. Wojcik and J. Martin, September 2003.

We describe an algorithm for workload evaluation and auto-adaptive maximization of speedup. Emphasis is put on the exploitation of medium- and fine-grain algorithmic parallelism. The simulation results suggest that significant speedups are achievable for communications-bound algorithms. Implications for the design of programming languages and supercomputers are suggested.

Investigation of Perfect Ternary Arrays PTA(60,25)
P. Becker, S. Houghten and W. Haas, October 2003.

Perfect ternary arrays are closely related to difference sets and group invariant weighing matrices. A previous paper by one of the authors demonstrated strong restrictions on difference sets with parameters (120, 35, 10). A perfect ternary array with energy 25 and order 60, PTA(60,25), would obey an equation analogous to the difference set equation for (120, 35, 10). A perfect ternary array PTA(n,k) is equivalent to a group-developed weighing matrix in a group of order n. There is one known example of a weighing matrix developed over a nonsolvable group of order 60; no solvable examples are known. In this paper, we describe a search for weighing matrices (and corresponding perfect ternary arrays) developed over solvable groups of order 60. We analyze the quotient structure for each group. Techniques from representation theory, including a new viewpoint on complementary quotient images, are used to restrict possible weighing matrices. Finally, we describe a partially completed computer search for such matrices and arrays.

Relational Unsharpness and Processes
P. Kempf and M. Winter, November 2003.

We propose a canonical weakening of the notion of a relational product suitable for describing unsharpness effects of the parallel composition of processes. In contrast to relational products, in this new construction the unsharpness property may still hold even though it is totally defined and the algebra is representable.

Meta-heuristics for the Job Shop Scheduling Problem
M. Ventresca and B.M. Ombuki, December 2003.

Meta-heuristics have been shown to provide effective approximate solutions for difficult NP-hard combinatorial optimization problems. This paper presents a comparative study of a genetic algorithm (GA, introduced in previous work) with simulated annealing (SA), tabu search (TS), and hybrid methods for the job-shop scheduling problem (JSSP). The approaches are tested on well-known job shop benchmark problems. We show that SA and TS outperform the GA; however, neither simulated annealing nor tabu search is uniformly superior to the other. Hybridization of the genetic algorithm with tabu search is shown to yield the most promising solution quality for the JSSP instances investigated here.

Skill Set Analysis in Knowledge Structures
G. Gediga and I. Düntsch, May 2002.

We extend the theory of knowledge structures by taking into account information about the skills a subject has. In the first part of the paper we exhibit some structural properties of the skill-problem relationship and consequences for the interpretation of concurrent theories in terms of the skill theory. The second part of the paper offers a test theory based on skill functions: We present measurements for the data consistency of the skill-problem relationship, and estimate abilities in terms of lower and/or upper boundaries of problem states and skills, given a special instance of the skill-problem relationship.

On model evaluation, indices of importance, and interaction values in rough set analysis
G. Gediga and I. Düntsch, May 2002.

Like most data models, “Computing with words” uses a mix of methods to achieve its aims, including several measurement indices. In this paper we discuss some proposals for such indices in the context of rough set analysis and present some new ones. In the first part we investigate several classical approaches based on approximation quality and the drop in approximation quality when leaving out elements. We show that using the approximation quality index is sensible in terms of admissibility, and present additional indices for the usefulness and significance of an approximation. The analysis of a “drop” is reinterpreted in terms of model comparison, and a general framework for all these concepts is presented. In the second part of the paper we present an example of how using similar nomenclature in the theory of Choquet-type aggregations of fuzzy measurements and in rough set approximation quality, without regard to the fine structure of the underlying model assumptions, can suggest connections where there are none. On a more positive note, we show that so-called qualitative power and interaction indices, which are structurally similar to quantitative Choquet-type aggregations, can be used in the context of rough set analysis. Furthermore, we suggest a suitable entropy-based measure which enables the use of qualitative power and interaction indices as an approximation.

Maximum consistency of incomplete data via non-invasive imputation
G. Gediga and I. Düntsch, May 2002.

In this paper we describe an algorithm to impute missing values from given data alone, without representational or other assumptions, and analyse its performance. Our approach is based on non-numeric, rule-based data analysis. In contrast to statistical procedures, such analysis offers no straightforward way to define loss functions or a likelihood function; these rest on statistical presuppositions which are not given in rule-based data analysis. Therefore, other optimisation criteria must be used. A simple criterion is the demand that the rules of the system should be maximally consistent: filling a missing entry with a value should result in a rule which is consistent with the other rules of the system. Our algorithm imputes missing values in an attribute vector x by presenting a list of possible values drawn from the set of all vectors y which do not contradict x, i.e. which have the same entries wherever both are defined.
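The non-contradiction criterion can be sketched as follows (an illustrative reconstruction, not the authors' implementation): candidate values for a missing entry of x are the values observed at that position among records that agree with x wherever both are defined.

```python
# Illustrative sketch of non-invasive imputation: missing entries are None,
# and candidates come only from records that do not contradict x.

def compatible(x, y):
    """True if x and y have equal entries wherever both are defined."""
    return all(a == b for a, b in zip(x, y)
               if a is not None and b is not None)

def candidate_values(x, pos, data):
    """Values observed at position `pos` among records compatible with x."""
    return sorted({y[pos] for y in data
                   if y[pos] is not None and compatible(x, y)})

data = [("a", 1, "p"), ("a", 1, "q"), ("b", 2, "p")]
x = ("a", None, "q")
print(candidate_values(x, 1, data))  # -> [1]
```

Only ("a", 1, "q") is compatible with x here, so 1 is the sole candidate; a record disagreeing on a defined entry is never consulted.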

Approximation quality for sorting rules
G. Gediga and I. Düntsch, May 2002.

The paper presents a method for computing an approximation quality for sorting rules of type “nominal -> nominal” (NN), “nominal -> ordinal” (NO), and “ordinal -> ordinal” (OO), and its generalisation to “nominal, ordinal -> ordinal” (NO-O) rules. We provide a significance test for the overall approximation quality, and a test for the partial influence of attributes, based on the bootstrap technique. We also study a competing model and show that it is susceptible to local perturbations in the data set and dissociated from the theory it claims to support.

Boolean algebras arising from information systems
I. Düntsch and E. Orlowska, May 2002.

We investigate the algebraic counterparts of modal-style logics which arise from Pawlak’s information systems.

Tangent circle algebras
I. Düntsch and M. Roubens, May 2002.

In relational reasoning, one is concerned with the algebras generated by a given set of relations, when one allows only basic relational operations such as the Boolean operations, relational composition, and converse. According to a result by A. Tarski, the relations obtained in this way are exactly the relations which are definable in the three-variable fragment of first-order logic. Thus, a relation algebra is a first indicator of the expressive power of a given set of relations. In this paper, we investigate relation algebras which arise in the context of preference relations. In particular, we study the tangent circle orders introduced by Abbas and Vincke (1996).

A Hybrid Search Based on Genetic Algorithms and Tabu Search for Vehicle Routing
B.M. Ombuki, M. Nakamura and M. Osamu, May 2002.

We present a hybrid search technique based on metaheuristics for approximately solving the vehicle routing problem with time windows (VRPTW). The approach is two-phased: a global customer clustering phase based on genetic algorithms (GAs), and a post-optimization local search phase based on tabu search (TS). We also devise a new crossover operator for the VRPTW and compare its performance with two well-known crossover operators for the VRPTW and related problems. Computational experiments show that the GA is effective in setting the number of vehicles to be used, while tabu search is better suited for reducing the total distance traveled by the vehicles. Through our simulations, we conclude that the hybrid search technique is more suitable for multi-objective optimization of the VRPTW than applying either the GA or tabu search independently.

A Comparative Study of Search Techniques Applied to the Minimum Distance Problem of BCH Codes
J.L. Wallis and S.K. Houghten, May 2002.

BCH codes have been shown to be excellent error-correcting codes among codes of short lengths. The error-correcting capability of a code is directly related to its minimum distance. It is computationally very difficult to determine the true minimum distance of BCH codes. We analyze and compare the behaviour of different heuristic search techniques when applied to the problem of finding the true minimum weight of BCH codes. A basic genetic algorithm significantly outperformed hill-climbing, tabu search and hybrid techniques.

The Extended Quadratic Residue Code is the only (48, 24, 12) Self-Dual Doubly-Even Code
S.K. Houghten, C.W.H. Lam, L.H. Thiel, and J.A. Parker, May 2002.

The largest minimum weight of a self-dual doubly-even binary (n, k, d) code is d = 4*floor(n/24) + 4. Of such codes with length divisible by 24, the Golay Code is the only (24, 12, 8) code, the Extended Quadratic Residue Code is the only known (48, 24, 12) code, and there is no known (72, 36, 16) code. One may partition the search for a (48, 24, 12) self-dual doubly-even code into 3 cases. A previous search assuming one of the cases found only the Extended Quadratic Residue Code. In this paper we examine the remaining 2 cases. Separate searches assuming each of the remaining cases found no codes and thus the Extended Quadratic Residue Code is the only doubly-even self-dual (48, 24, 12) code.
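The bound quoted in the abstract can be checked directly. A minimal sketch computing d = 4*floor(n/24) + 4 for the three lengths the abstract discusses:

```python
# Upper bound on the minimum weight of a self-dual doubly-even binary
# code of length n, as stated in the abstract: d = 4*floor(n/24) + 4.

def max_min_weight(n):
    return 4 * (n // 24) + 4

# The three lengths divisible by 24 mentioned in the abstract:
for n in (24, 48, 72):
    print(n, max_min_weight(n))  # 24 -> 8, 48 -> 12, 72 -> 16
```

These values match the Golay (24, 12, 8) code, the Extended Quadratic Residue (48, 24, 12) code, and the still-unknown (72, 36, 16) code.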

Optimal Ternary (10, 7) Error-Correcting Codes
M.J. Letourneau and S.K. Houghten, May 2002.

We describe a series of exhaustive computer searches to determine the maximum number of words in ternary codes of length 10 and minimum distance 7, and to find all such optimal codes. We show that there are exactly 10 inequivalent (10, 14, 7) ternary codes. We prove A_3(10, 7) = 14 by showing that none of the (10, 14, 7)_3 codes can be extended to a (10, 15, 7)_3 code, thereby also giving A_3(11, 7).
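The minimum distance underlying such searches is the smallest pairwise Hamming distance over all codewords. A minimal sketch of the exhaustive pairwise check (illustrative only; the reported searches are far more sophisticated):

```python
# Minimum distance of a code = smallest Hamming distance between any
# two distinct codewords; here checked exhaustively over all pairs.
from itertools import combinations

def hamming(u, v):
    """Number of coordinates in which u and v differ."""
    return sum(a != b for a, b in zip(u, v))

def min_distance(code):
    return min(hamming(u, v) for u, v in combinations(code, 2))

# A toy ternary code (alphabet {0, 1, 2}) of length 4, not one of the
# optimal codes from the paper:
code = [(0, 0, 0, 0), (1, 1, 1, 0), (2, 2, 0, 1)]
print(min_distance(code))  # -> 3
```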

Evolving Protein Motifs Using a Stochastic Regular Language with Codon-Level Probabilities
B.J. Ross, May 2002.

Experiments involving the evolution of protein motifs using genetic programming are presented. The motifs use a stochastic regular expression language that uses codon-level probabilities within conserved sets (masks). Experiments compared basic genetic programming with Lamarckian evolution, as well as the use of “natural” probability distributions for masks obtained from the sequence database. It was found that Lamarckian evolution was detrimental to the probability performance of motifs. A comparison of evolved and natural mask probability schemes is inconclusive, since these strategies produce incompatible characterisations of motif fitness as used by the genetic programming system.

Hyperspectral Image Analysis Using Genetic Programming
B.J. Ross, A.G. Gualtieri, F. Fueten, and P. Budkewitsch, May 2002.

Genetic programming is used to evolve mineral identification functions for hyperspectral images. The input image set comprises 168 images from different wavelengths ranging from 428 nm (visible blue) to 2507 nm (shortwave infrared), taken over Cuprite, Nevada, with the AVIRIS hyperspectral sensor. A composite mineral image indicating the overall reflectance percentage of three minerals (alunite, kaolinite, buddingtonite) is used as a reference or “solution” image. The training set is manually selected from this composite image. The task of the GP system is to evolve mineral identifiers, where each identifier is trained to identify one of the three mineral specimens. A number of different GP experiments were undertaken, which parameterized features such as thresholded mineral reflectance intensity and target GP language. The results are promising, especially for minerals with higher reflectance thresholds (more intense concentrations).

Parallel Software Engineering: a Tutorial for the State of Mind
V. Wojcik, May 2002.

This is the first of two essays, introducing the reader to performance and programming issues in parallel processing. A list of benefits and difficulties associated with parallel processing is presented. Some computer architectures are compared. Only minimal experience in traditional, sequential programming on the part of the reader is required. The programming issues are illustrated with simple examples. Finally, a parallel programming methodology is recommended.

A Library of Anticipatory Random Number Generators
V. Wojcik, May 2002.

The goal of this paper is to offer a quick parallel software engineering tutorial while creating an anticipatory random number generating tool. In this approach several difficulties were overcome: the generating task, coexisting with the user program, had to be made aware a priori of the required distributions. Also, this task must be capable of handling possible error conditions. This poses some difficulties, since the task is unaware of the nature of the simulation problem. This difficulty is compounded if the task is to be capable of accepting user-defined distributions: in such a situation it is impossible to predict all error conditions that might arise. The task terminates itself once it is no longer needed by the user program.

This paper focuses on the parallel software engineering aspects of the problem, assuming that the numerical principles of random number generation are well-known.

Modal-style Operators in Qualitative Data Analysis
G. Gediga and I. Düntsch, May 2002.

We explore the modal possibility operator and its dual necessity operator in qualitative data analysis, and show that it (quite literally) complements the derivation operator of formal concept analysis; we also propose a new generalisation of the rough set approximation operators. As an example for the applicability of the concepts we investigate the Morse data set which has been frequently studied in multidimensional scaling procedures.
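The operators in question have standard set-theoretic definitions over a binary relation R: the possibility operator maps X to the objects with at least one R-successor in X, and its dual necessity operator to the objects all of whose R-successors lie in X. A minimal sketch (textbook definitions, not the paper's generalisation):

```python
# Possibility <R>X = {x : R(x) intersects X};
# necessity   [R]X = {x : R(x) is a subset of X}.
# R is encoded as a dict mapping each object to its set of successors.

def possibility(R, X, U):
    """Objects related to at least one element of X."""
    return {x for x in U if R.get(x, set()) & X}

def necessity(R, X, U):
    """Objects all of whose R-successors lie in X."""
    return {x for x in U if R.get(x, set()) <= X}

U = {1, 2, 3}
R = {1: {1, 2}, 2: {3}, 3: set()}
X = {1, 2}
print(possibility(R, X, U))  # -> {1}
print(necessity(R, X, U))    # -> {1, 3}
```

Note that 3, having no successors at all, vacuously satisfies the necessity operator but not the possibility operator; the two coincide only when R is serial.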

A Fast Randomisation Test for Rule Significance
G. Gediga and I. Düntsch, May 2002.

Randomisation is a method to test the statistical significance of a symbolic rule; it is, however, very expensive. In this note we present a sequential randomisation test which dramatically reduces the number of steps needed for a conclusion.
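The idea of cutting the randomisation short can be sketched crudely: shuffle the labels, recompute the statistic, and stop as soon as enough shuffled statistics reach the observed one, since at that point the rule is clearly not significant and further shuffles are wasted. This is only an illustrative early-stopping scheme, not the paper's sequential procedure.

```python
# Crude early-stopping randomisation test (illustrative sketch only).
import random

def sequential_randomisation_test(stat, data, labels, max_iter=1000,
                                  exceed_limit=10, seed=0):
    """Estimate a p-value by label shuffling, stopping early once
    `exceed_limit` shuffled statistics reach the observed statistic."""
    rng = random.Random(seed)
    observed = stat(data, labels)
    exceed = 0
    for i in range(1, max_iter + 1):
        shuffled = labels[:]
        rng.shuffle(shuffled)
        if stat(data, shuffled) >= observed:
            exceed += 1
            if exceed >= exceed_limit:
                return (exceed + 1) / (i + 1)  # early stop
    return (exceed + 1) / (max_iter + 1)

def mean_diff(data, labels):
    """Difference of group means, used here as the test statistic."""
    g1 = [d for d, l in zip(data, labels) if l]
    g0 = [d for d, l in zip(data, labels) if not l]
    return sum(g1) / len(g1) - sum(g0) / len(g0)

p = sequential_randomisation_test(mean_diff, [0, 0, 0, 10, 10, 10],
                                  [0, 0, 0, 1, 1, 1])
print(p)
```

For this clearly separated sample only about one shuffle in twenty matches the observed statistic, so the test stops long before exhausting its iteration budget.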

Real-Time Competitive Evolutionary Computation
A. Hewgill, June 2002.

In this paper, a system of evolutionary computation will be presented that uses real-time competition to evolve individuals that effectively utilize the environment in which the competition takes place. This system can be used to create computer-controlled enemies for gaming systems with little programmer effort. The evolved individuals will be able to react to changes in the environment in real-time, which is necessary for most user interactive applications.

Procedural Texture Evolution Using Multiobjective Optimization
B.J. Ross and H. Zhu, July 2002.

This paper investigates the application of evolutionary multiobjective optimization to two-dimensional procedural texture evolution. Genetic programming is used to evolve procedural texture formulae. Earlier work used multiple feature tests during fitness evaluation to rate how closely a candidate texture matches visual characteristics of a target texture image. These feature test scores were combined into an overall fitness score using a weighted sum. This paper extends this research by replacing the weighted sum with a Pareto ranking scheme, which preserves the independence of feature tests during fitness evaluation. Three experiments were performed: a pure Pareto ranking scheme, and two Pareto experiments enhanced with population divergence strategies. One divergence strategy scores individuals using their nearest-neighbour distance in feature space. Another scheme uses a normalized, ranked measurement of nearest-neighbour distance. A result of this work is that acceptable textures can be evolved much more efficiently with multiobjective (MOP) evolution than with the earlier weighted-sum approach. The ability to create a diverse selection of Pareto solutions is advantageous in this problem domain, since the acceptability of a final texture is ultimately a subjective, aesthetic decision by the user.
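Pareto ranking itself is standard and can be sketched briefly (textbook definitions, not the paper's GP system): an individual dominates another if it is no worse on every feature test and strictly better on at least one, and ranking peels off successive non-dominated fronts.

```python
# Pareto ranking sketch: rank 0 is the non-dominated front, rank 1 the
# front after removing rank 0, and so on.  All objectives are maximised.

def dominates(a, b):
    """a dominates b: no worse everywhere, strictly better somewhere."""
    return (all(x >= y for x, y in zip(a, b))
            and any(x > y for x, y in zip(a, b)))

def pareto_rank(pop):
    """Return the front index of each individual in `pop`."""
    ranks = {}
    remaining = set(range(len(pop)))
    r = 0
    while remaining:
        front = {i for i in remaining
                 if not any(dominates(pop[j], pop[i])
                            for j in remaining if j != i)}
        for i in front:
            ranks[i] = r
        remaining -= front
        r += 1
    return [ranks[i] for i in range(len(pop))]

# Three candidates scored on two feature tests:
print(pareto_rank([(1.0, 0.2), (0.5, 0.9), (0.4, 0.1)]))  # -> [0, 0, 1]
```

Unlike a weighted sum, the first two candidates are incomparable and both survive on the first front, which is what preserves the independence of the feature tests.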

rmath User’s and Technical Guide
M.J. Letourneau, August 2002.

This manual describes rmath, a library of C code which can be used to represent arbitrary length vectors of Galois field elements. It is an efficient representation in terms of both storage space and running time. It currently supports GF(2), GF(3), GF(4), and GF(5), and is designed to be extensible to any other Galois field. In terms of operations, rmath supports: addition of two vectors; multiplication of two vectors; determining the weight of a vector; determining the distance between two vectors; generation of vectors according to a lexicographic ordering; comparison of vectors according to a lexicographic ordering; production of all linear combinations of one or more vectors. It also has a series of support functions for I/O and efficient storage.
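For prime fields the listed operations are simple component-wise modular arithmetic. The following Python sketch is a hypothetical analogue for illustration only, not the library's C API, and it omits GF(4), which requires polynomial rather than modular arithmetic:

```python
# Hypothetical Python analogue of rmath's prime-field vector operations
# (GF(p) for prime p); not the library's actual C interface.

def vec_add(u, v, p):
    """Component-wise sum in GF(p)."""
    return tuple((a + b) % p for a, b in zip(u, v))

def vec_mul(u, v, p):
    """Component-wise product in GF(p)."""
    return tuple((a * b) % p for a, b in zip(u, v))

def weight(u):
    """Hamming weight: number of nonzero components."""
    return sum(a != 0 for a in u)

def distance(u, v, p):
    """Hamming distance = weight of the difference vector."""
    return weight(vec_add(u, tuple((-b) % p for b in v), p))

u, v = (1, 2, 0, 1), (2, 2, 1, 0)
print(vec_add(u, v, 3))   # -> (0, 1, 1, 1)
print(weight(u))          # -> 3
print(distance(u, v, 3))  # -> 3
```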

Optimal Ternary (11, 7) and (14, 10) Codes
M.J. Letourneau and S.K. Houghten, September 2002.

We use exhaustive computer searches to show that there are exactly 36 codewords in an optimal ternary (11, 7) code and exactly 13 codewords in an optimal ternary (14, 10) code. We also enumerate optimal ternary (14, 10) codes and show that there are exactly 6151 inequivalent such codes.

A Proximity Approach to Some Region-Based Theories of Space
D. Vakarelov, G. Dimov, I. Düntsch, and B. Bennett, November 2002.

This paper is a continuation of earlier work (Dimiter Vakarelov, Ivo Düntsch, and Brandon Bennett. A note on proximity spaces and connection based mereology. In C. Welty and B. Smith, editors, Proceedings of the 2nd International Conference on Formal Ontology in Information Systems (FOIS’01), pages 139-150. ACM, 2001). The notion of local connection algebra, based on the primitive notions of connection and boundedness, is introduced. It is slightly different from, but equivalent to, Roeper’s notion of region-based topology. The similarity between the local proximity spaces of Leader and local connection algebras is emphasized. Machinery, analogous to that introduced by Efremovic, Smirnov and Leader for proximity and local proximity spaces, is developed. This permits us to give new proximity-type models of local connection algebras, to obtain a representation theorem for such algebras, and to give a new, shorter proof of the main theorem of Roeper’s paper. Finally, the notion of MVD-algebra is introduced. It is similar to Mormann’s notion of enriched Boolean algebra, based on a single mereological relation of interior parthood. It is shown that MVD-algebras are equivalent to local connection algebras. This means that the connection relation and boundedness can be incorporated into a single relation, mereological in nature. In this way a formalization of the Whiteheadian theory of space based on a single mereological relation is obtained.

Local Search Genetic Algorithms for the Job Shop Scheduling Problem
B. Ombuki and M. Ventresca, November 2002.

In previous work, we developed three deadlock removal strategies for the job shop scheduling problem (JSSP) and proposed a hybridized genetic algorithm for it. While the genetic algorithm (GA) gave promising results, its performance depended greatly on the choice of deadlock removal strategies employed. This paper introduces a genetic algorithm based scheduling scheme that is deadlock free. This is achieved through the choice of chromosome representation and genetic operators. We propose an efficient solution representation for the JSSP in which the job task ordering constraints are easily encoded. Furthermore, a problem specific crossover operator that ensures solutions generated through genetic evolution are all feasible is also proposed. Hence, both checking of the constraints and repair mechanism can be avoided, thus resulting in increased efficiency. A mutation-like operator geared towards local search is also proposed which further improves the solution quality. Lastly, a hybrid strategy using the genetic algorithm reinforced with a tabu search is developed. An empirical study is carried out to test the proposed strategies using benchmark data.

Searching for Search Algorithms: Experiments in Meta-search
B.J. Ross, December 2002.

The conventional approach to solving optimization and search problems is to apply a variety of search algorithms to the problem at hand, in order to discover a technique that is well-adapted to the search space being explored. This paper investigates an alternative approach, in which search algorithms are automatically synthesized for particular optimization problem instances. A language composed of potentially useful basic search primitives is derived. This search language is then used with genetic programming to derive search algorithms. The genetic programming system evaluates the fitness of each search algorithm by applying it to a binary-encoded optimization problem (Traveling Salesman), and measuring the relative performance of that algorithm in finding a solution to the problem. It is shown that the evolved search algorithms often display consistent characteristics with respect to the corresponding problem instance to which they are applied. For example, some problem instances are best suited to hill-climbing, while others are better adapted to conventional genetic algorithms. As is to be expected, the search algorithm derived is strongly dependent on the scale and representation of the problem explored, the amount of computational effort allotted to the overall search, and the search primitives available for the algorithm. Additionally, some insights are gained into issues of genetic algorithm search. A novel “memetic crossover” operator was evolved during the course of this research.

An Examination of Lamarckian Genetic Algorithms
C. Wellock and B.J. Ross, July 2001.

In keeping with the spirit of Lamarckian evolution, variations on a simple genetic algorithm are compared, in which each individual is optimized prior to evaluation. Four different optimization techniques in all are tested: random hillclimbing, social (memetic) exchange, and two techniques using artificial neural nets (ANNs). These techniques are tested on a set of three sample problems: an instance of a minimum-spanning tree problem, an instance of a travelling salesman problem, and a problem where ANNs are evolved to generate a random sequence of bits. The results suggest that in general, social exchange provides the best performance, consistently outperforming the non-optimized genetic algorithm; results for other optimization techniques are less compelling.

Edge Detection of Petrographic Images Using Genetic Programming
B.J. Ross, F. Fueten, and D. Yashkir, January 2000.

Gentropy: Evolutionary 2D Texture Generation
A. Wiens and B.J. Ross, May 2000.

Autonomous Life Agent Using Recurrent Neural Networks and Genetic Algorithms
T. Abou-Assaleh and J. Zhang, November 2000.

The Search for a (46, 6, 1) Block Design
S.K. Houghten, L.H. Thiel, J. Janssen, and C.W.H. Lam, November 2000.

Probabilistic Pattern Matching and the Evolution of Stochastic Regular Expressions
B.J. Ross, May 1999.

Logic-based Genetic Programming with Definite Clause Translation Grammars
B.J. Ross, June 1999.

The Effects of Randomly Sampled Training Data on Program Evolution
B.J. Ross, November 1999.

Automatic Mineral Identification Using Genetic Programming
B.J. Ross, F. Fueten, and D. Yashkir, December 1999.

Design Issues of Wireless Communication for Cooperative Mobile Robot Teams
J. Barchanski, September 1998.

Real-Time Streaming Video for Ethernet LANs
J. Barchanski, January 1997.

On Random Shuffles & Subsets
T. Jenkyns and D. McCarthy, September 1997.

The Evolution of Concurrent Programs
B.J. Ross, July 1996.

Simulation Study of Digital Video Transfer on an Ethernet LAN
J. Barchanski, November 1996.

On Resolution of Ambiguity of Raster Images and addendum
V. Wojcik, January 1995.

A Process Algebra for Stochastic Music Composition
B.J. Ross, February 1995.

Loopless Gray Code Algorithms
T. Jenkyns, July 1995.

A Symbiosis of Animation and Music
R. Pringle and B.J. Ross, December 1995.

A Process Algebra for Sequential and Concurrent Logic Programming
B.J. Ross, June 1994.

The Inductive Inference of Finite Interleaving with Synchronization
B.J. Ross, June 1994.

Running Programs Backwards: the Logical Inversion of Imperative Computation
B.J. Ross, June 1994.