## Monday, June 30, 2003

gold mine!!!! VB source code: http://www.joyofvb.com/Supplementary%20Pages/JVB_ListSourceCode.htm

Pseudocode for GA from http://www.willamette.edu/~levenick/cs448/lectures/L15_GAdetails.html

    create initial population
    while (!done) {
        evaluate fitness for each individual
        select mating pool
        apply genetic operators to mating pool to create new individuals
        replace worst individuals with new individuals
    }
    report results
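The loop above maps almost line-for-line onto code. A minimal Python sketch, assuming a bit-string encoding and one-point crossover (neither is specified in the pseudocode), with simple truncation selection standing in for the mating-pool step:

```python
import random

def run_ga(fitness, n_bits=16, pop_size=30, pool_size=10, generations=50):
    # create initial population: random bit strings
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        # evaluate fitness for each individual (sorted worst-first)
        scored = sorted(pop, key=fitness)
        # select mating pool: here simply the fittest pool_size individuals
        pool = scored[-pool_size:]
        # apply genetic operators (crossover + mutation) to create new individuals
        children = []
        while len(children) < pool_size:
            a, b = random.sample(pool, 2)
            cut = random.randrange(1, n_bits)        # one-point crossover
            child = a[:cut] + b[cut:]
            i = random.randrange(n_bits)
            child[i] ^= random.random() < 0.1        # occasional point mutation
            children.append(child)
        # replace worst individuals with the new individuals
        pop = scored[pool_size:] + children
    return max(pop, key=fitness)

best = run_ga(fitness=sum)  # toy "count the ones" fitness
```

With the toy fitness above, `best` drifts toward the all-ones string; swapping in a real fitness function is the only change needed.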

Pseudocode for select mating pool (probabilistically, on the basis of fitness)

given a list of individuals, each with a fitness and a runningSum field:

    sum = 0;
    for each individual {
        sum += fitness
        individual.runningSum = sum
    }

    iterate p times (p is the number of individuals in the mating pool) {
        r = random number between 0 and sum-1
        start with the first individual on the list
        while (individual.runningSum < r)
            go to the next individual
        add the current individual to the mating pool
    }

Pseudocode for Genetic Programming from http://216.239.37.100/search?q=cache:PdDGrBurg9wJ:www.evalife.dk/publications/RKU_CEC2002_smooth_operators.pdf+pseudocode+for+Genetic+Programming&hl=en&ie=UTF-8

    Main {
        initialise population P(0)
        evaluate P(0)
        t = 0
        while (t < max generations) {
            t = t + 1
            Recombine P(t-1) --> P(t)
            Mutate P(t) --> P(t)
            Evaluate P(t)
            Tournament selection P(t) --> P(t)
        }
    }
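The "Tournament selection P(t) --> P(t)" step can be sketched as below; the tournament size k = 2 is an assumption, since the paper's setting isn't given here:

```python
import random

def tournament_select(population, fitness, k=2):
    # run one k-way tournament per slot: sample k distinct individuals,
    # keep the fittest; repeat until a same-sized population is formed
    return [max(random.sample(population, k), key=fitness)
            for _ in range(len(population))]

pop = [1, 5, 2, 9, 3, 7]
survivors = tournament_select(pop, fitness=lambda x: x)
# the overall weakest individual can never win a 2-way tournament,
# so 1 is guaranteed to be absent from survivors
```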


from Vitorino Ramos, On the Implicit and on the Artificial - Morphogenesis and Emergent Aesthetics in Autonomous Collective Systems, in ARCHITOPIA Book / Catalogue, Art, Architecture and Science, J.L. Maubant and L. Moura (Eds.), pp. 25-57, Ministério da Ciência e Tecnologia, Feb. 2002

...Synergy is a ubiquitous phenomenon in nature and human societies alike. One well-known example is provided by the emergence of self-organization in social insects, via direct (mandibular, antennation, chemical or visual contact, etc.) or indirect interactions. The latter types are more subtle and were defined by Grassé as stigmergy to explain task coordination and regulation in the context of nest reconstruction in Macrotermes termites. An example could be provided by two individuals, who interact indirectly when one of them modifies the environment and the other responds to the new environment at a later time. In other words, stigmergy could be defined as a typical case of environmental synergy. Grassé showed that the coordination and regulation of building activities do not depend on the workers themselves but are mainly achieved by the nest structure: a stimulating configuration triggers the response of a termite worker, transforming the configuration into another configuration that may trigger in turn another (possibly different) action performed by the same termite or any other worker in the colony. Another illustration of how stigmergy and self-organization can be combined into more subtle adaptive behaviors is recruitment in social insects. Self-organized trail laying by individual ants is a way of modifying the environment to communicate with nest mates that follow such trails. It appears that task performance by some workers decreases the need for more task performance: for instance, nest cleaning by some workers reduces the need for nest cleaning. Therefore, nest mates communicate to other nest mates by modifying the environment (cleaning the nest), and nest mates respond to the modified environment (by not engaging in nest cleaning); that is stigmergy...

...Théraulaz and Bonabeau described, for instance, a model of nest building in wasps, in which wasp-like agents are stimulated to deposit bricks when they encounter specific configurations of bricks: depositing a brick modifies the environment and hence the stimulatory field of other agents. These asynchronous automata (designed by an ensemble of algorithms) move in a 3D discrete space and behave locally in space and time on a pure stimulus-response basis. There are other types of examples (e.g. collective prey transport), yet stigmergy is also present: ants change the perceived environment of other ants (their cognitive map, according to Chialvo and Millonas), and in every example, the environment serves as the medium of communication.

What all these examples have in common is that they show how stigmergy can easily be made operational. As mentioned by Bonabeau, that is a promising first step to design groups of artificial agents which solve problems: replacing coordination (and possibly some hierarchy) through direct communications by indirect interactions is appealing if one wishes to design simple agents and reduce communication among agents. Another feature shared by several of the examples is incremental construction: for instance, termites make use of what other termites have constructed to contribute their own piece. In the context of optimization (though not used in the present works), incremental improvement is widely used: a new solution is constructed from previous solutions (see the ACO paradigm, Dorigo et al.). Finally, stigmergy is often associated with flexibility: when the environment changes because of an external perturbation, the insects respond appropriately to that perturbation, as if it were a modification of the environment caused by the colony's activities. In other words, the colony can collectively respond to the perturbation with individuals exhibiting the same behavior. When it comes to artificial agents, this type of flexibility is priceless: it means that the agents can respond to a perturbation without being reprogrammed in their intrinsic features to deal with that particular instability. The system organizes itself in order to deal with new object classes (conceptual ideas translated to the computer in the form of basic 2D/3D forms), or even new sub-classes. This task can be performed in real time, and in robust ways due to the system's redundancy...


from Ethnography of Artificial Culture: Specifications, Prospects, and Constraints by Nicholas Gessler

Abstract

In a recent article I discussed some of the possible research benefits for anthropology of applying the computational paradigm of artificial life (AL) to the scientific study of cultural evolution. I referred to this program as artificial culture (AC). Culture, in this view, comprises not only the cognitive processes of individuals (what they think), but includes their behaviors (what they do) and the products of their labors (what they make) as well. AC comprises a population of individual agents, each with its own sensors, cognizers, and actuators, interacting with other agents, with products of their own manufacture, and with an external world containing objects and limited resources situated on a grid. All inanimate objects are given a materiality constrained by a physics, and animate objects are further constrained by a metabolism. Further specifications, potentials and constraints are discussed here for a simplified implementation called T.I.E.R.S. (Trade, Information Exchange, and Risk Sharing). Along with robot sociology, this strategy may help to clarify the role of the individual in society, the dynamics of cooperation and competition, and the general epistemology of emergence.


Space Syntax Glossary by Björn Klarqvist

File URL http://www.arch.chalmers.se/tema/stadsbyggnad/glossary.pdf

download into pda


georgio's "web as human stigmergy" refers to Clay Shirky's "In Praise of Evolvable Systems" (about the internet, 1996)

Despite the Web's ability to usurp the advantages of existing services, this is a story of inevitability, not of perfection. Yahoo and Lycos have taken over from Gopher and WAIS as our meta-indices, but the search engines themselves, as has been widely noted, are pretty lousy ways to find things. The problem that Gopher and WAIS set out to solve has not only not been solved by the Web, it has been made worse. Furthermore, this kind of problem is intractable because of the nature of evolvable systems.

THREE RULES FOR EVOLVABLE SYSTEMS

Evolvable systems -- those that proceed not under the sole direction of one centralized design authority but by being adapted and extended in a thousand small ways in a thousand places at once -- have three main characteristics that are germane to their eventual victories over strong, centrally designed protocols.

Only solutions that produce partial results when partially implemented can succeed. The network is littered with ideas that would have worked had everybody adopted them. Evolvable systems begin partially working right away and then grow, rather than needing to be perfected and frozen. Think VMS vs. Unix, cc:Mail vs. RFC-822, Token Ring vs. Ethernet.

What is, is wrong. Because evolvable systems have always been adapted to earlier conditions and are always being further adapted to present conditions, they are always behind the times. No evolving protocol is ever perfectly in sync with the challenges it faces.

Finally, Orgel's Rule, named for the evolutionary biologist Leslie Orgel -- "Evolution is cleverer than you are". As with the list of the Web's obvious deficiencies above, it is easy to point out what is wrong with any evolvable system at any point in its life. No one seeing Lotus Notes and the NCSA server side-by-side in 1994 could doubt that Lotus had the superior technology; ditto ActiveX vs. Java or Marimba vs. HTTP. However, the ability to understand what is missing at any given moment does not mean that one person or a small central group can design a better system in the long haul.

Centrally designed protocols start out strong and improve logarithmically. Evolvable protocols start out weak and improve exponentially. It's dinosaurs vs. mammals, and the mammals win every time. The Web is not the perfect hypertext protocol, just the best one that's also currently practical. Infrastructure built on evolvable protocols will always be partially incomplete, partially wrong and ultimately better designed than its competition.


Stigmergy, a term coined by the French biologist Pierre-Paul Grassé [2], is interaction through the environment.

Self-Organization in social insects often requires interactions among insects: such interactions can be direct or indirect. Direct interactions are the "obvious" interactions: antennation, trophallaxis (food or liquid exchange), mandibular contact, visual contact, chemical contact (the odor of nearby nestmates), etc. Indirect interactions are more subtle: two individuals interact indirectly when one of them modifies the environment and the other responds to the new environment at a later time. Such an interaction is an example of stigmergy. [1, p.14]


## Saturday, June 28, 2003

## Friday, June 27, 2003

## Monday, June 23, 2003

stigmergy search http://www.google.com/search?hl=en&lr=&ie=UTF-8&oe=UTF-8&q=stigmergy+wasp&spell=1

guy theraulaz translated http://cognition.ups-tlse.fr/_guy/guy.html

use babelfish online translation FR to ENG http://babelfish.altavista.com/babelfish/tr


## Tuesday, June 17, 2003

in the morning, the voronoi can have more points added by user clicks. very neat, and we are happy since we don't even care much; we're just using the most basic principle to generate the voronoi.

in the afternoon p managed to have two modules in: the voronoi and alpha syntax, and runs it together.

so visually it's a bit like alpha syntax and voronoi superimposed.

the next thing he wants to do is to distinguish x's and y's by colouring.

what we would like is to colour the voronoi instead of the alpha syntax

the problem:

the voronoi is made of Acad lines instead of polygons.

so, can we map and then read lines as polygons?

start the polygonization search here http://www.google.com/search?hl=en&lr=&ie=UTF-8&oe=UTF-8&q=pseudocode+polygonization&spell=1
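One answer, ahead of the search results: if every vertex in the line soup is shared by exactly two lines (i.e. the lines genuinely bound closed loops), polygons can be recovered by walking neighbour to neighbour until the walk closes. A hypothetical sketch under that assumption:

```python
from collections import defaultdict

def segments_to_polygons(segments):
    # segments: list of ((x1, y1), (x2, y2)) pairs; assumes every vertex
    # appears in exactly two segments, so each connected piece is one loop
    nbrs = defaultdict(list)
    for a, b in segments:
        nbrs[a].append(b)
        nbrs[b].append(a)
    seen = set()
    polygons = []
    for start in nbrs:
        if start in seen:
            continue
        loop, prev, cur = [start], None, start
        while True:
            seen.add(cur)
            nxt = next(p for p in nbrs[cur] if p != prev)  # don't walk backwards
            if nxt == start:
                break
            loop.append(nxt)
            prev, cur = cur, nxt
        polygons.append(loop)
    return polygons

square = [((0, 0), (1, 0)), ((1, 0), (1, 1)), ((1, 1), (0, 1)), ((0, 1), (0, 0))]
polys = segments_to_polygons(square)  # one loop of four vertices
```

Real Acad output would need vertex snapping first (endpoints that should coincide often differ by rounding), and voronoi vertices shared by three cells break the degree-two assumption, so this only handles the simplest case.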


## Monday, June 16, 2003

http://www-2.cs.cmu.edu/~quake/tripaper/triangle2.html

from that page:

A triangular mesh generator rests on the efficiency of its triangulation algorithms and data structures, so I discuss these first. I assume the reader is familiar with Delaunay triangulations, constrained Delaunay triangulations, and the incremental insertion algorithms for constructing them. Consult the survey by Bern and Eppstein [2] for an introduction.

There are many Delaunay triangulation algorithms, some of which are surveyed and evaluated by Fortune [7] and Su and Drysdale [18]. Their results indicate a rough parity in speed among the incremental insertion algorithm of Lawson [11], the divide-and-conquer algorithm of Lee and Schachter [12], and the plane-sweep algorithm of Fortune [6]; however, the implementations they study were written by different people. I believe that Triangle is the first instance in which all three algorithms have been implemented with the same data structures and floating-point tests, by one person who gave roughly equal attention to optimizing each. (Some details of how these implementations were optimized appear in Appendix A.)

Table 1 compares the algorithms, including versions that use exact arithmetic (see Section 4) to achieve robustness, and versions that use approximate arithmetic and are hence faster but may fail or produce incorrect output. (The robust and non-robust versions are otherwise identical.) As Su and Drysdale [18] also found, the divide-and-conquer algorithm is fastest, with the sweepline algorithm second. The incremental algorithm performs poorly, spending most of its time in point location. (Su and Drysdale produced a better incremental insertion implementation by using bucketing to perform point location, but it still ranks third. Triangle does not use bucketing because it is easily defeated, as discussed in the appendix.) The agreement between my results and those of Su and Drysdale lends support to their ranking of algorithms.

An important optimization to the divide-and-conquer algorithm, adapted from Dwyer [5], is to partition the vertices with alternating horizontal and vertical cuts (Lee and Schachter's algorithm uses only vertical cuts). Alternating cuts speed the algorithm and, when exact arithmetic is disabled, reduce its likelihood of failure. One million points can be triangulated correctly in a minute on a fast workstation.

from http://astronomy.swin.edu.au/~pbourke/terrain/triangulate/

The triangulation algorithm may be described in pseudo-code as follows.

    subroutine triangulate
        input : vertex list
        output : triangle list
        initialize the triangle list
        determine the supertriangle
        add supertriangle vertices to the end of the vertex list
        add the supertriangle to the triangle list
        for each sample point in the vertex list
            initialize the edge buffer
            for each triangle currently in the triangle list
                calculate the triangle circumcircle center and radius
                if the point lies in the triangle circumcircle then
                    add the three triangle edges to the edge buffer
                    remove the triangle from the triangle list
                endif
            endfor
            delete all doubly specified edges from the edge buffer
                (this leaves the edges of the enclosing polygon only)
            add to the triangle list all triangles formed between the point
                and the edges of the enclosing polygon
        endfor
        remove any triangles from the triangle list that use the supertriangle vertices
        remove the supertriangle vertices from the vertex list
    end

The above can be refined in a number of ways to make it more efficient. The most significant improvement is to presort the sample points by one coordinate; the coordinate used should be the one with the greatest range of samples. If the x axis is used for presorting, then as soon as the x component of the distance from the current point to the circumcircle center is greater than the circumcircle radius, that triangle need never be considered for later points, as further points will never again be on the interior of that triangle's circumcircle. With the above improvement, the running time of the algorithm presented here grows with the number of points as approximately O(N^1.5).

The time taken is relatively independent of the input sample distribution; a maximum of 25% variation in execution times has been noticed for a wide range of naturally occurring distributions as well as special cases such as normal, uniform, contour and grid distributions.

The algorithm does not require a large amount of internal storage: it needs only one internal array, a logical array of flags identifying those triangles that no longer need be considered. If memory is available, another speed improvement is to save the circumcircle center and radius for each triangle as it is generated instead of recalculating them for each added point. It should be noted that if sufficient memory is available for the above and other speed enhancements, then the increase in execution time is almost a linear function of the number of points.
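The pseudocode above is the incremental (Bowyer-Watson) algorithm and transcribes into Python almost line by line. The supertriangle sizing below is a rough assumption, and the presorting and circumcircle-caching refinements just described are omitted:

```python
import math

def circumcircle(a, b, c):
    # center and radius of the circle through points a, b, c
    ax, ay = a; bx, by = b; cx, cy = c
    d = 2 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by))
    ux = ((ax**2 + ay**2) * (by - cy) + (bx**2 + by**2) * (cy - ay)
          + (cx**2 + cy**2) * (ay - by)) / d
    uy = ((ax**2 + ay**2) * (cx - bx) + (bx**2 + by**2) * (ax - cx)
          + (cx**2 + cy**2) * (bx - ax)) / d
    return (ux, uy), math.hypot(ax - ux, ay - uy)

def triangulate(points):
    # determine the supertriangle (sized to enclose all sample points)
    m = 3 * max(abs(v) for p in points for v in p) + 1
    sp = [(-m, -m), (m, -m), (0, m)]
    triangles = [tuple(sp)]
    for p in points:                       # for each sample point
        edges, keep = [], []
        for tri in triangles:
            center, r = circumcircle(*tri)
            if math.hypot(p[0] - center[0], p[1] - center[1]) < r:
                # point lies in the circumcircle: buffer edges, drop triangle
                edges += [(tri[0], tri[1]), (tri[1], tri[2]), (tri[2], tri[0])]
            else:
                keep.append(tri)
        # delete doubly specified edges, leaving the enclosing polygon only
        boundary = [e for e in edges
                    if edges.count(e) + edges.count((e[1], e[0])) == 1]
        triangles = keep + [(e[0], e[1], p) for e in boundary]
    # remove triangles that use the supertriangle vertices
    return [t for t in triangles if not any(v in sp for v in t)]

tris = triangulate([(0, 0), (3, 0), (0, 3), (1, 1)])
```

For the four points above (one interior to the hull of the other three) this yields the three Delaunay triangles fanning around (1, 1). Robustness for near-degenerate circumcircles is exactly where the exact-arithmetic discussion in the Triangle excerpt earlier comes in.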


## Sunday, June 15, 2003

seems to me that the problem is not just the perimeter, but the middle as well. i have been debugging the program after changing the type definition of structure 'delaunay' to hold the indices into the array 'original points', to see if all the points are represented in the permutations; sometimes they are (when it is ok) and sometimes they are not (when it has holes in it). so the problem is that the program does generate all permutations but does not guarantee that all points eventually end up in the stored array of triangles. i guess that we need not just one loop, but possibly a series of passes, but how to do it?

p

## Thursday, June 12, 2003

http://www.iue.tuwien.ac.at/publications/PhD%20Theses/fleischmann/diss.html

from that page:

Institute of Microelectronics TU Wien Research Links

## Wednesday, June 11, 2003

so, what is the nature of research in this field???

research in architecture

"... The ultimate goal of Architectural Research is to provide a general and inter-subjectively acceptable knowledge about basic relations between architecture and man ..."

http://home.worldcom.ch/~negenter/021ArResOrigE.html

from that page:

--------------------------------------------------------------------------------

THE ORIGINS OF ARCHITECTURAL RESEARCH

--------------------------------------------------------------------------------

For many architects who have their educational backgrounds in the euphoric age of -> Modernism with its great cult figures like LeCorbusier, Mies van der Rohe, Frank Lloyd Wright etc., the name Pruitt-Igoe was a tremendous shock. The age of Modernism was at its end. Pruitt-Igoe was the name of a large modern district in St. Louis, Missouri, consisting of mostly 14 storied, clean, geometrical, highly functional buildings, designed by a famous architect, Minoru Yamasaki. The settlement was built according to modern ideals of the times and won an award of the American Institute of Architects in 1951. On July 15th 1972 at 3.52 p.m. large parts were bombed down. Not by an air strike during war - it was peacefully dynamited! But the results were the same: ruins. After a very short time, after only 20 years, it had become rubbish. The crime rate was high, social costs rose and it had become a slum area.

Twenty years is a very very short 'life expectancy' for buildings in view of the tremendous investments! Compare it with pre-modern architecture which, for example in European Cities survived over hundreds of years and still is, in Italy or France, the main attraction for much tourism! Its inhabitants earn money from it! There must be something wrong with the way we conceive architecture and urbanism. -> Modern Architecture and Urbanism

Charles Jencks rhetorically used the Pruitt-Igoe incident to declare ' the death of modern architecture ' and to call out ' Post-Modernism' . But this was a very superficial manoeuvre. Modernism was reduced to a merely timely phenomenon, simply a ' style ' (a term which modernism had vehemently rejected), and a new 'style' was propagated, in fact a semiotically diluted, 'metaphorical' historism.

Not everyone among the architects jumped into this shallow mainstream. There were many people who started to THINK about architecture and urbanism in new ways. Architecture and urbanism were questioned fundamentally, the constitutive structure of these domains in the field of modern knowledge was analysed. The architectural researchers became aware of things which supported their doubts that there was a deeper relation between gigantic failures like Pruitt-Igoe (which is only the tip of an iceberg) and the way we learn, teach and practice architecture today.

Knowledge is based on what one knows. Imagine somebody who calls himself a botanist, but except for orchids he knows nothing about plants. Or, imagine a zoologist who only knows about beautiful animals or butterflies for instance, or a doctor who only cares for beautiful people (some do). The architect considers himself a builder, but does not know what a building is. He only knows about pyramids, palaces and cathedrals (beautiful buildings!) -> Aesthetic definition of architecture , -> Post-medieval myth of the creator-genius (in architecture and art).

We probably would not trust a doctor who only cares for 'beautiful people', evidently because human life and particularly sickness is much wider than just aesthetics. The doctor who only cares for 'beautiful people' would indicate to us, that he has not much serious understanding for his profession and we would consider him a kind of playboy of his own passions.

The role of the architect in modern society is not far from that! Architects are trained today in relatively isolated educational domains (architectural schools) to cultivate their own passions for beauty-design and thus never realise that their knowledge is extremely limited: they think that man is happy if they design beautiful buildings for him. But, aesthetics is not only a very narrow aspect of our modern environment, it is also a very vague and subjective term. Anybody can make his own ready-made theory of beauty in architecture, and then 'build' his 'beauty' into our vital environment. We have to deal with it, even if it makes us sick. If we are lucky, someone blows it up 20 years later. -> Pruitt-Igoe. The unhealthy differentiation of styles and schools in architecture is tremendously increasing all over the world. Jencks counted more than 100 different styles recently. They deconstruct and deform our urban environments into deserts of virtual vanities and increasingly inhumane spaces.

In other words: there are some reasons to assume that the breakdown of modern architecture and urbanism has something to do with a lack of knowledge. The question arises whether the relatively closed and autocratic system of the art critique and the architect is enough to steer the system of our increasingly growing 'built environment'? Is this autocratic cybernetic system, in fact, a very outdated thing? Is it based on a profanised post-medieval myth, the Renaissance myth of the great human creator genius who invents new worlds for the present elite? Are they financing his 'art' into our daily living spaces? Is it enough if we rely on the architect-creator's highpriest, the art historian, who teaches us to sing the songs of what is good and what is evil in our cities? Is this structure maybe at the roots of the failure of the -> gigantic 1 : 1 scale experiment called 'modern architecture' ?

In fact, architecture and urbanism have been rationalised in all dimensions, streamlined with geometry, mathematical proportions, industrialised materials and technological reasoning, etc., but the myth survived. What building and dwelling objectively is in the widest sense in relation to man, was never researched with scientific methods.

This is where architectural research comes in. In 1969 Amos Rapoport's book 'Built Form and Culture' went around the world and opened the view on an enormous manifold of worldwide architectural traditions. In the past 25 years about 2000-3000 professionals of various disciplines, including architects, got involved in research into traditional (or vernacular) architecture and settlements all over the world (Architectural Ethnology, EVAW; IASTE). The knowledge about architecture has increased tremendously beyond the conventional knowledge of the art historians' 'pyramids, palaces and cathedrals'. This knowledge starts to understand aesthetics and the structure of built space in its anthropological dimensions, as a metalinguistic domain which was swept away by modernism. Modern architecture leaves us 'alone'. Modern society pays for its gigantic architectural and urbanistic opportunism with tremendous social costs.

Some architect's offices have already realised the new trend, particularly those who are engaged on a worldwide level with projects in non-Western cultures. They increasingly employ or engage 'cultural architects ' or 'architectural anthropologists ' as consultants to avoid new disasters in the sense of Pruitt-Igoe.

What will be unavoidable in the near future is an intensification of architectural research . The 'art of healing' in medical practice would be unthinkable today without a tremendous apparatus of research. Architecture and urbanism are at least as important as health, maybe the basis of it. But architecture has no research. One of the largest sectors of any national economy has no scientific base at all. The architect still thinks absolutely subjectively with his pencil! Consequently there are - nearly - as many 'styles' as there are architects. It is not only a tremendous waste to 'design' each building subjectively, it throws our urban spaces into a chaos of unrelated formalisms.

The ultimate goal of Architectural Research is to provide a general and inter-subjectively acceptable knowledge about basic relations between architecture and man. The design of buildings and architecture - as history shows - is one of the faster intercultural diffusions, and one - in the long run - with most important impacts. It would be much cheaper to invest in research beforehand than to repeat Pruitt-Igoe on the global level.

--------------------------------------------------------------------------------

Back to the Homepage

http://www.hozien.com/mih/arg/toc.htm

from that page: (table of content)

A Rulebook for Arguments

Table of Contents

Preface

Introduction

I Composing a Short Argument: Some General Rules

(1) Distinguish premises and conclusion

(2) Present your ideas in a natural order

(3) Start from reliable premises

(4) Use definite, specific, concrete language

(5) Avoid loaded language

(6) Use consistent terms

(7) Stick to one meaning for each term

II Argument by Example

(8) Is there more than one example?

(9) Are the examples representative?

(10) Background information is crucial

(11) Are there counterexamples?

III Arguments by Analogy

(12) Analogy requires a relevantly similar example

IV Arguments from Authority

(13) Sources should be cited

(14) Are the sources informed?

(15) Are the sources impartial?

(16) Cross-check sources

(17) Personal attacks do not disqualify a source

V Arguments about Causes

(18) Does the argument explain how cause leads to effect?

(19) Does the conclusion propose the most likely cause?

(20) Correlated events are not necessarily related

(21) Correlated events may have a common cause

(22) Either of two correlated events may cause the other

(23) Causes may be complex

VI Deductive Arguments

(24) Modus Ponens

(25) Modus Tollens

(26) Hypothetical Syllogism

(27) Disjunctive Syllogism

(28) Dilemma

(29) Reductio ad absurdum

(30) Deductive arguments in several steps

VII Composing an Argumentative Essay:

A. Exploring the Issue

(A1) Explore the arguments on all sides of an issue

(A2) Question and defend each argument's premises

(A3) Revise and rethink arguments as they emerge

VIII Composing an Argumentative Essay:

B. Main Points of the Essay

(B1) Explain the question

(B2) Make a definite claim or proposal

(B3) Develop your arguments fully

(B4) Consider objections

(B5) Consider alternatives

IX Composing an Argumentative Essay:

C. Writing

(C1) Follow your outline

(C2) Keep the introduction brief

(C3) Give your arguments one at a time

(C4) Clarify, clarify, clarify

(C5) Support objections with arguments

(C6) Don't claim more than you have shown

X Fallacies

The Two Great Fallacies

Directory of the Fallacies

Appendix

Definition

Uses of definition

Dictionary definitions

Precising definitions

Essential definitions

For Further Study

Previous

Main

Next

http://lists.canonical.org/pipermail/kragen-tol/1999-July/000436.html

from that page: (some problem with voronoi diagram)

minimal spanning trees and Delaunay triangulations

Kragen Sitaker kragen@pobox.com

Thu, 8 Jul 1999 16:35:25 -0400 (EDT)

--------------------------------------------------------------------------------

This is probably a well-known result, or maybe it's just wrong, but I was excited when I discovered it this morning.

It seems that every geometric minimal spanning tree of a set of points must be a subgraph of every Delaunay triangulation of that set of points.

i want this book: Illustrating Evolutionary Computation with Mathematica, by Christian Jacob

review of it at http://www.ercb.com/brief/brief.0221.html

from that page:

Genetic Programming

Illustrating Evolutionary Computation with Mathematica, by Christian Jacob, is the English translation of the German original, Principia Evolvica, Simulierte Evolution mit Mathematica (dpunkt.verlag 1997).

Presenting to us the intriguing yet stilted artwork generated by evolutionary algorithms, Principia says interesting things about our ability to represent the world digitally, things of interest not just to specialists in the field, but to every serious programmer. Furthermore, computer caricature of the processes from which arise all life on earth is fraught with existential ambiguity in light of concurrent advances in biotechnology. The battle of Waterloo was won on the playing fields of Eton, one might say.

I don't mean to imply that this is a book of computer art. Principia is a masterpiece of a computer-science text. It's about programmers discovering ways to splash the gene pool of an evolving system and speed up the rate of evolution towards the desired computational result. Mutation, selection ... and what happens when the drift isn't in the desired direction? Smash it! Randomize afresh. This sheds new light on the Flood as eugenics.

While genetic algorithms are not new, it's startling to see how far a computer scientist can go towards explaining his thought processes in terms of Darwin. Breaking down complex problems into evolutionary algorithms requires an understanding of evolution as it occurred in the tangible world. There's an underlying principle here, as the Greeks used to say, one that has even political import when you consider there are school boards which believe science can be taught without teaching evolution.

Evolvica (http://www2.informatik.uni-erlangen.de/~jacob/Evolvica/Evolvica-Intro.html) is a Mathematica-based tutorial, programming, and experimentation environment for evolutionary computing that was created by the author and contains the examples in the book. It can be downloaded from http://www.cpsc.ucalgary.ca/njacob/IEC. If you don't own a license for Mathematica, there's the MathReader viewer for Mathematica which lets you view the system and examples; download it from http://www.wolfram.com/products/mathreader/.

The richness of Illustrating Evolutionary Computation with Mathematica has to be seen rather than described. I can't pretend to have finished this book; it will sit on my bedside bookshelf for years, no doubt, to be thumbed through a few of its 575 pages at a time and the wealth of ideas, equations, models and illustrative examples savored and assimilated.

Morgan Kaufmann Publishers went all out on this one, the layout is both lavish and tidy. That's only what this book deserved, which must have been at least half a decade in the authoring. If you have any interest in genetic algorithms, cellular automata, or simulated annealing this is the must-read of the year.

-- Jack J. Woehr (http://www.softwoehr.com)

--------------------------------------------------------------------------------

Copyright © 2001 Electronic Review of Computer Books

Created 7/4/2001 / Last modified 7/4/2001 / webmaster@ercb.com

http://documents.wolfram.com/v4/AddOns/StandardPackages/DiscreteMath/ComputationalGeometry.html

from that page:

Wolfram Research "Discrete Computational Geometry" FULL PAGE voronoi codes

## Tuesday, June 10, 2003

More generally, the formula for finding the number of combinations of k objects you can choose from a set of n objects is:

n_C_k = n! / (k! (n - k)!)

Combinations & Permutations Programs

The Algorithm For N-Choose-R in High-Level Pseudocode:

Let k = min(r, n - r)

answer = 1

multiplier = n

divisor = 1

while divisor <= k {

answer = (answer * multiplier) div divisor # this division is always exact

decrement the multiplier

increment the divisor

}

following reading: the craft of research

http://www.dcs.gla.ac.uk/~johnson/teaching/research_skills/research.html

from that page:

What is Research in Computing Science?

Chris Johnson

Glasgow Interactive Systems Group (GIST),

Department of Computer Science, Glasgow University,

Glasgow, G12 8QQ.

Tel: +44 141 330 6053

Fax: +44 141 330 4913

EMail:johnson@dcs.gla.ac.uk

This paper argues that the expanding scope of `computing science' makes it difficult to sustain traditional scientific and engineering models of research. In particular, recent work in formal methods has abandoned the traditional empirical methods. Similarly, research in requirements engineering and human computer interaction has challenged the proponents of formal methods. These tensions stem from the fact that `Computing Science' is a misnomer. Topics that are currently considered part of the discipline of computing science are technology rather than theory driven. This creates problems if academic departments are to impose scientific criteria during the assessment of PhDs. It is, therefore, important that people ask themselves `What is Research in Computing Science' before starting on a higher degree.

This paper is intended as a high level introduction for first year research students or students on an advanced MSc course. It should be read in conjunction with Basic Research Skills in Computing Science.

Keywords: research skills, computing science.

**********

hi Prof. Gero, how are you?

I've been looking around your site and publications.

It raised a question in my mind; perhaps you could help.

First, the research field you're in is multidisciplinary, correct me if I'm wrong: it is about architecture, and the method is computing.

So how do you help your students build the claims and evidence to support this, given that architecture and computing are entirely different fields? From what I know, different fields have different ways of providing evidence.

cheers,

choesnah


## Sunday, June 08, 2003

http://goanna.cs.rmit.edu.au/~gl/classes/TriangulationApplet.java

comments included are:

1. Edge class. Edges have two vertices, s and t, and two faces, l (left) and r (right). The triangulation representation and the Delaunay triangulation algorithms require edges.

2. Circle class. Circles are fundamental to computation of Delaunay triangulations. In particular, an operation which computes a circle defined by three points is required.

3. Triangulation class. A triangulation is represented as a set of points and the edges which form the triangulation.

4.
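The three classes described in the applet's comments might be sketched in Python like this. The field names s, t, l, and r come from the comments above; the methods are my own guess at a minimal interface, not the applet's actual code:

```python
class Edge:
    """An edge with two vertices (s, t) and two adjacent faces (l, r)."""
    def __init__(self, s, t, left=None, right=None):
        self.s, self.t = s, t
        self.l, self.r = left, right

class Circle:
    """A circle given by center (cx, cy) and radius; supports an inside test,
    which is the operation Delaunay algorithms need most."""
    def __init__(self, cx, cy, radius):
        self.cx, self.cy, self.radius = cx, cy, radius
    def inside(self, x, y):
        # Strictly inside: squared distance to the center below radius squared.
        return (x - self.cx) ** 2 + (y - self.cy) ** 2 < self.radius ** 2

class Triangulation:
    """A set of points plus the edges that form the triangulation."""
    def __init__(self):
        self.points = []
        self.edges = []
    def add_edge(self, edge):
        self.edges.append(edge)
```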

http://www.ics.uci.edu/%7Eeppstein/gina/scot.drysdale.html#alg

Algorithms for Computing Voronoi Diagrams

--------------------------------------------------------------------------------

There are literally hundreds of different algorithms for constructing various types of Voronoi diagrams. I will describe two of the simplest.

The first algorithm inserts the points one at a time into the diagram. Whenever a new point comes in, we need to do three things. First, we need to figure out which of the existing Voronoi cells contains the new site. Second, we need to "walk around" the boundary of the new site's Voronoi region, inserting new edges into the diagram. Finally, we delete all the old edges sticking into the new region.

The second and third steps are the harder ones. It's possible that each new site's Voronoi cell will touch all the old cells. Thus, in the worst case, we end up spending linear time on each cell, or O(n^2) time overall. However, it has been proven that if you insert the points in random order, the expected time is only O(n log n), regardless of which set of points we're given.

A divide and conquer algorithm for constructing Voronoi diagrams was discovered by Shamos and Hoey. Split the points into two halves: the leftmost n/2 points, which we'll color blue, and the rightmost n/2 points, which we'll color red. Recursively compute the Voronoi diagram of the two halves. Finally, merge the two diagrams by finding the edges that separate the blue points from the red points.

The last step can be done in linear time by the "walking ant" method. An ant starts down at -infinity, walking upward along the path halfway between some blue point and some red point. The ant wants to walk all the way up to +infinity, staying as far away from the points as possible. Whenever the ant gets to a red Voronoi edge, it turns away from the new red point. Whenever it hits a blue edge, it turns away from the new blue point. There are a few surprisingly difficult details left to deal with, like how the ant knows where to start, and how you know which edge the ant will hit next. (The interested reader is strongly encouraged to consult the standard computational geometry literature for solutions to these details.)
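Neither of those algorithms is needed just to see a Voronoi diagram. A brute-force sketch that labels each grid cell with its nearest site (my own illustration, not the insertion or divide-and-conquer method described above) makes the cell structure visible directly from the definition:

```python
def voronoi_grid(sites, width, height):
    """Label each (x, y) grid cell with the index of its nearest site.

    Brute force: O(width * height * len(sites)), fine for small pictures.
    """
    grid = []
    for y in range(height):
        row = []
        for x in range(width):
            # Nearest site by squared Euclidean distance (no sqrt needed).
            nearest = min(
                range(len(sites)),
                key=lambda i: (x - sites[i][0]) ** 2 + (y - sites[i][1]) ** 2,
            )
            row.append(nearest)
        grid.append(row)
    return grid

# Two sites: the grid splits along the perpendicular bisector between them.
sites = [(1, 1), (6, 6)]
grid = voronoi_grid(sites, 8, 8)
```

Printing the rows of `grid` shows the two Voronoi cells as regions of 0s and 1s.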

Delaunay Triangulations

--------------------------------------------------------------------------------

The Delaunay triangulation is the dual of the Voronoi diagram. To build the Delaunay triangulation, draw a line segment between any two sites whose Voronoi regions share an edge. This procedure splits the convex hull of the sites into triangles.

The Delaunay triangulation has a number of nice properties. For example, each point is connected to its nearest neighbor by an edge in the triangulation. Since the Delaunay triangulation is a planar graph, it has at most 3n-6 edges, where n is the number of sites. So once you have the Delaunay triangulation, if you want to find the closest pair of sites, you only have to look at 3n-6 pairs, instead of all n(n-1)/2 possible pairs.

Here are some nice properties of the Delaunay triangulation:

- It's dual to the Voronoi diagram, so computing one automatically gives you the other.

- The Empty Circle Property: if you draw a circle through the vertices of ANY Delaunay triangle, no other sites will be inside that circle.

- It's a planar graph. By Euler's formula, it has at most 3n-6 edges and at most 2n-5 triangles. This property can be used to reduce many problems with quadratic size (like closest pair) down to linear size in the time it takes to construct the triangulation.

- It contains "fat" triangles, in the sense that the minimum angle of any Delaunay triangle is as large as possible. In fact, if you write down the list of all angles in the Delaunay triangulation, in increasing order, then do the same thing for any other triangulation of the same set of points, the Delaunay list is guaranteed to be lexicographically larger.
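The Empty Circle Property is usually checked without constructing the circle explicitly, via the standard in-circle determinant test. A sketch (this is the textbook predicate, not code from the pages quoted above):

```python
def in_circle(a, b, c, d):
    """Return a positive value if point d lies strictly inside the circumcircle
    of the counterclockwise triangle (a, b, c), negative if outside, and zero
    if d lies exactly on the circle. Points are (x, y) tuples."""
    # Build the 3x3 matrix of the lifted, d-translated points.
    m = []
    for px, py in (a, b, c):
        dx, dy = px - d[0], py - d[1]
        m.append([dx, dy, dx * dx + dy * dy])
    # 3x3 determinant, expanded along the first row.
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
            - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
            + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))
```

A Delaunay algorithm uses this predicate to verify (or restore, by edge flips) the empty-circle condition for every triangle.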


## Friday, June 06, 2003

http://cgm.cs.mcgill.ca/~soss/cs644/projects/perrier/

from that page:

The Projection Method for Finding Nearest Neighbors

Pattern Recognition (308-644B)

School of Computer Science, McGill University, Winter 1999

Term Project prepared by Jean-Sébastien Perrier (perrier@ai.polymtl.ca)

and Benoit Anctil (benoit@maude.biomed.mcgill.ca)
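The projection method's core idea is to project the points onto one axis, sort, and scan outward from the query's position, stopping once the projected distance alone exceeds the best full distance found. A simplified sketch of that idea (my own toy version, not the term project's program):

```python
import bisect

def nearest_neighbor(points, q):
    """Nearest neighbor of q among points, via projection onto the x-axis."""
    pts = sorted(points)               # sort by x (then y, for ties)
    i = bisect.bisect_left(pts, q)     # q's insertion position in x-order
    best, best_d2 = None, float("inf")
    lo, hi = i - 1, i                  # frontiers walking left and right
    while lo >= 0 or hi < len(pts):
        for j in (lo, hi):
            if 0 <= j < len(pts):
                dx = pts[j][0] - q[0]
                if dx * dx < best_d2:  # candidate's x-distance still promising
                    d2 = dx * dx + (pts[j][1] - q[1]) ** 2
                    if d2 < best_d2:
                        best, best_d2 = pts[j], d2
        # Stop when both frontiers are farther in x alone than the best
        # full distance: no unexamined point can beat it.
        if (lo < 0 or (pts[lo][0] - q[0]) ** 2 >= best_d2) and \
           (hi >= len(pts) or (pts[hi][0] - q[0]) ** 2 >= best_d2):
            break
        lo -= 1
        hi += 1
    return best
```

The early-exit test is what makes projection pay off: on well-spread points, only a narrow vertical slab around q is ever examined.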


finding the circle through 3 given points (i.e., their circumcircle)

http://mcraefamily.com/MathHelp/GeometryConicSectionCircleEquationGivenThreePoints.htm
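A sketch of the construction that page describes: the circumcenter is equidistant from the three points, which reduces to two linear equations with the standard closed-form solution below (my own transcription of that formula, not code from the page):

```python
def circle_from_3_points(a, b, c):
    """Return (center_x, center_y, radius) of the circle through a, b, c.

    Raises ValueError if the points are collinear (no unique circle exists).
    """
    ax, ay = a
    bx, by = b
    cx, cy = c
    # d is twice the signed area of triangle abc; zero means collinear.
    d = 2 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by))
    if d == 0:
        raise ValueError("points are collinear")
    ux = ((ax**2 + ay**2) * (by - cy) + (bx**2 + by**2) * (cy - ay)
          + (cx**2 + cy**2) * (ay - by)) / d
    uy = ((ax**2 + ay**2) * (cx - bx) + (bx**2 + by**2) * (ax - cx)
          + (cx**2 + cy**2) * (bx - ax)) / d
    r = ((ax - ux) ** 2 + (ay - uy) ** 2) ** 0.5
    return ux, uy, r

# The circle through (0,0), (2,0), (0,2) is centered at (1,1), radius sqrt(2).
print(circle_from_3_points((0, 0), (2, 0), (0, 2)))
```

This is exactly the "Circle class" operation the triangulation applet's comments call for.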
