This network is trained to add two numbers together. The network has been built with three hidden nodes. The extreme values are duplicated as validating examples so that the learning process terminates as soon as they are correct. To use this method, the Controls need to be adjusted so that the option 'Stop when 100.00% of the validating examples are Correct after rounding' is set.
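The idea can be sketched outside EasyNN-plus. Because addition is a linear function of its inputs, even a single linear unit trained by gradient descent will learn it. This minimal sketch is not the EasyNN-plus algorithm (which uses hidden nodes); it only shows the principle of learning the weights from examples.

```python
import random

# Minimal sketch: learn y = a + b with a single linear unit trained by
# stochastic gradient descent. (EasyNN-plus uses a network with hidden
# nodes; a linear unit is enough here because addition is linear.)
random.seed(0)
examples = [(random.uniform(0, 1), random.uniform(0, 1)) for _ in range(100)]

w1, w2, bias = 0.0, 0.0, 0.0
lr = 0.1
for epoch in range(200):
    for a, b in examples:
        pred = w1 * a + w2 * b + bias
        err = pred - (a + b)        # gradient of squared error w.r.t. pred
        w1 -= lr * err * a
        w2 -= lr * err * b
        bias -= lr * err

print(round(w1, 2), round(w2, 2), round(bias, 2))  # weights near 1.0, bias near 0.0
```

After training, the unit predicts the sum of any pair in range, e.g. `w1 * 2 + w2 * 3 + bias` is close to 5.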
This sample demonstrates the Associating facility. The examples are taken from discarded till receipts at a small general groceries and provisions store. Pairs and clusters of associated items in the shopping baskets can be found using Action > Start Associating. In this sample the associations can indicate shopping habits. Most of the important item pairs are close to each other in the store, indicating the influence of layout on which items end up in the shopping basket. The various types of soup show this association quite clearly. Some important item pairs are not close together in the store but are used together later, such as Milk and Custard Powder. A few of the associations are difficult to explain.
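The pair-finding part of association analysis can be sketched as a co-occurrence count over baskets. The basket contents below are hypothetical stand-ins for the till-receipt data; EasyN­N-plus's own Associating algorithm is not documented here.

```python
from collections import Counter
from itertools import combinations

# Hypothetical basket contents; the real sample uses till-receipt data.
baskets = [
    {"Milk", "Bread", "Custard Powder"},
    {"Milk", "Custard Powder", "Tomato Soup"},
    {"Tomato Soup", "Chicken Soup", "Bread"},
    {"Milk", "Custard Powder"},
]

# Count how often each pair of items appears in the same basket.
pair_counts = Counter()
for basket in baskets:
    for pair in combinations(sorted(basket), 2):
        pair_counts[pair] += 1

print(pair_counts.most_common(1))  # [(('Custard Powder', 'Milk'), 3)]
```

The most frequent pairs are the candidate associations; clusters can be built by joining pairs that share an item.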
Some East African Cichlid fishes are difficult to classify because many of the characteristics that are normally measured for classification purposes lie in overlapping ranges. The network has been trained using the published descriptions of 16 different, but closely related, cichlids. Ten characteristics have been measured in each case. In this sample only one example has been used for each species. Periodic validating with the same data is used to terminate the learning process as soon as 100% correct classification is reached. It is more usual to use different validating and training data. After the network has been trained to be 100% correct on the training examples it can be used to identify other specimens.
For any cichlid experts using EasyNN-plus, the characteristics measured are: SL - standard length, BD - body depth, HL - head length, PD - preorbital depth, CP - caudal length, UJL - upper jaw length, LJL - lower jaw length, LL - lateral line scales, D spines - dorsal spiny rays, D branched - dorsal branched rays. The descriptions used are taken from 'A revision of the Haplochromis and related species (Pisces: Cichlidae) from Lake George, Uganda' by P.H. Greenwood (1973). The Haplochromis and related genera have since been revised, so the names used in this example have probably changed.
This is a simple planning example. It shows how a neural network can be used to estimate the digging rate of one to four diggers under a variety of conditions. The output of the network is the number of kilograms per hour dug from a hole. The inputs produce a number of conflicts. Diggers work best when they have a spade to dig with, so four diggers with one spade perform only a little better than one digger with one spade. Rocky soil is more difficult to dig, but rocks weigh more than soil and we are interested in the weight rather than the volume. The amount of clay also affects the digging rate; clay weighs more than soil too, but not as much as rocks. Cold soil is difficult to dig. Frozen soil is very difficult to dig. On the other hand, diggers need more rest breaks when the temperature is high. If the diameter of the hole is too small, the number of diggers who can work down the hole is limited. However, the diggers who are not digging must be resting, so high temperature is then less of a problem.
This network is trained using a number of fixed discount breakpoints so that intermediate discount prices can be estimated.
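The relationship the network approximates can be sketched as piecewise-linear interpolation between the breakpoints. The breakpoint values below are hypothetical; the sample's actual discount schedule is not given here.

```python
# Hypothetical breakpoints: (quantity, discount %). The trained network
# approximates the piecewise-linear relationship between them.
breakpoints = [(1, 0.0), (10, 5.0), (50, 10.0), (100, 15.0)]

def estimate_discount(qty):
    """Linearly interpolate between the two surrounding breakpoints."""
    if qty <= breakpoints[0][0]:
        return breakpoints[0][1]
    if qty >= breakpoints[-1][0]:
        return breakpoints[-1][1]
    for (q0, d0), (q1, d1) in zip(breakpoints, breakpoints[1:]):
        if q0 <= qty <= q1:
            return d0 + (d1 - d0) * (qty - q0) / (q1 - q0)

print(estimate_discount(30))  # 7.5
```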
This demonstrates the use of the Seek buttons. A retail grocer adjusts his prices according to the wholesale prices every morning when he collects his stock for the day. He always tries to make a reasonable profit but he knows from experience that to maximize his profit he cannot just increase his prices. When his prices reach a certain level his profit starts to decrease because his item sales begin to decrease. The neural network has learned the relationships between some prices and the resulting daily turnover and profit. Using Seek High the grocer can see which price combinations produce the maximum profit or maximum turnover.
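Seek High can be sketched as a search over the input space for the combination that maximises the network's output. The profit model below is a hypothetical stand-in for the trained network's predictions, built so that sales fall as prices rise and profit peaks at intermediate prices.

```python
from itertools import product

# Hypothetical profit model standing in for the trained network: sales
# of each item fall as its price rises, so profit peaks in the middle.
def profit(bread_price, milk_price):
    bread_sales = max(0.0, 100 - 60 * bread_price)
    milk_sales = max(0.0, 80 - 40 * milk_price)
    return (bread_price - 0.40) * bread_sales + (milk_price - 0.50) * milk_sales

# Seek High: scan the input space and keep the combination with the
# highest output.
prices = [round(0.40 + 0.05 * i, 2) for i in range(20)]
best = max(product(prices, prices), key=lambda p: profit(*p))
print(best, round(profit(*best), 2))
```

Neither the lowest nor the highest prices win; the search finds the interior peak, which is the point of the Seek buttons.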
This network has been trained using journey times and the routes from work to home. It is a short journey through a very busy area with variable traffic. The objective is to find the fastest route home at a given time. The departure and arrival times are in decimal hours, not hours and minutes. The journey has five common points, including the start and end. The names of the roads indicate the route taken on each stage of the journey between adjacent common points. The example journeys cover a period of one year, so the most recent traffic and road conditions influence the neural network. To find the fastest route home using Query, the departure time (in decimal hours) is Set in the input list and locked by clicking its Lock in the list. Then the Seek Low facility can be used to find the earliest arrival time, and thus the fastest route home for that departure time.
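With the departure time locked, Seek Low amounts to searching the remaining inputs for the combination giving the lowest arrival time. The road names and stage times below are hypothetical; in the sample the times come from the trained network's predictions rather than a fixed table.

```python
from itertools import product

# Hypothetical stage times (decimal hours) for each road choice on the
# three middle stages of the journey between adjacent common points.
stages = [
    {"High St": 0.10, "Ring Rd": 0.08},
    {"Mill Ln": 0.15, "Canal Rd": 0.12},
    {"Park Ave": 0.05, "Station Rd": 0.09},
]

departure = 17.5                      # locked input, in decimal hours

# Seek Low: try every route combination and keep the earliest arrival.
best_route, best_arrival = None, float("inf")
for roads in product(*(s.items() for s in stages)):
    arrival = departure + sum(t for _, t in roads)
    if arrival < best_arrival:
        best_route = [name for name, _ in roads]
        best_arrival = arrival

print(best_route, round(best_arrival, 2))
```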
This is a simple example showing how EasyNN-plus can be used to estimate house prices. It has been trained using a small number of houses around the Stockport (UK) area. It uses the information commonly found in estate agents' advertisements. A very small subset of the descriptive items has been chosen as inputs, along with the location of the house. The location inputs are postal area codes for Stockport (SK1 to SK8). Some of the training cases are incomplete (indicated by a '?' in the training example) and are allowed to default. The network produces quite accurate results when tested against agents' data for houses in a similar area in the same year (1993).
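One simple way incomplete examples can be allowed to default is to substitute the column mean of the known values. Whether EasyNN-plus uses exactly this rule is an assumption; the sketch only illustrates the idea of defaulting a '?' entry.

```python
# Hypothetical training rows with '?' marking missing values. Each '?'
# is defaulted to the mean of the known values in its column.
rows = [
    {"bedrooms": 3,   "garage": 1,   "price": 62000},
    {"bedrooms": "?", "garage": 0,   "price": 48000},
    {"bedrooms": 4,   "garage": "?", "price": 75000},
]

for column in ("bedrooms", "garage"):
    known = [r[column] for r in rows if r[column] != "?"]
    default = sum(known) / len(known)
    for r in rows:
        if r[column] == "?":
            r[column] = default

print(rows[1]["bedrooms"], rows[2]["garage"])  # 3.5 0.5
```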
The data in this example is from Fisher, R.A., "The use of multiple measurements in taxonomic problems", Annals of Eugenics, 7, Part II, 179-188 (1936); also in "Contributions to Mathematical Statistics" (John Wiley, NY, 1950). The data has been used many times to demonstrate how neural networks (and other techniques) can be used for classification. 50 examples of each of three species of Iris are used to train the neural network. Two of the three species are not linearly separable from each other. After training, the network classifies 100% of the examples correctly.
All of the two-input logical functions are simulated using this neural network.
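Most of the two-input logical functions can each be computed by a single threshold unit with hand-chosen weights; only XOR and XNOR are not linearly separable and need a hidden layer. A minimal sketch of the single-unit cases:

```python
def step(x):
    return 1 if x >= 0 else 0

def unit(a, b, w1, w2, bias):
    """One threshold unit: fires when w1*a + w2*b + bias >= 0."""
    return step(w1 * a + w2 * b + bias)

# Hand-chosen weights for some of the two-input logical functions.
AND  = lambda a, b: unit(a, b,  1,  1, -1.5)
OR   = lambda a, b: unit(a, b,  1,  1, -0.5)
NAND = lambda a, b: unit(a, b, -1, -1,  1.5)
NOR  = lambda a, b: unit(a, b, -1, -1,  0.5)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, AND(a, b), OR(a, b), NAND(a, b), NOR(a, b))
```

A trained network learns equivalent weights instead of having them set by hand.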
Three-bit odd-parity example, using validating examples to stop training as soon as 100% correct results are achieved.
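The training set for this sample can be generated mechanically. A sketch, assuming the convention that the output is 1 when an odd number of the input bits are set:

```python
from itertools import product

# Generate the full three-bit odd-parity truth table: output 1 when an
# odd number of the input bits are set.
examples = [(bits, sum(bits) % 2) for bits in product((0, 1), repeat=3)]

for bits, parity in examples:
    print(bits, parity)
```

Parity is a hard test for small networks because flipping any single bit flips the output.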
Four-bit odd-parity example, trained without validating examples until a low error is achieved.
This network uses random data for both training and validating. It demonstrates that a neural network can be trained to fit random data, but that no amount of training will produce good validating results.
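The effect can be reproduced with any model that memorises its training data; a 1-nearest-neighbour predictor is the simplest stand-in for an over-trained network. Training accuracy is perfect, but accuracy on fresh random data stays near chance:

```python
import random

# Random inputs with random labels. A 1-nearest-neighbour predictor
# memorises the training set perfectly, but on fresh random data it
# can do no better than chance.
random.seed(42)

def random_examples(n):
    return [([random.random() for _ in range(4)], random.randint(0, 1))
            for _ in range(n)]

train = random_examples(40)
valid = random_examples(40)

def predict(x, data):
    """Label of the nearest training example to x."""
    return min(data, key=lambda ex: sum((a - b) ** 2
                                        for a, b in zip(ex[0], x)))[1]

train_acc = sum(predict(x, train) == y for x, y in train) / len(train)
valid_acc = sum(predict(x, train) == y for x, y in valid) / len(valid)
print(train_acc, valid_acc)  # training is perfect; validating stays near chance
```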
A bitmap image of 42 forged signatures has been split into individual parts using the Fragment facility. Each fragment contains one signature and has been associated with a Boolean value in the adjacent Output column. The signatures have been assessed by the potential victim of the forgery to see which are good fakes, that is, which ones would fool him. Ten Example rows have been chosen at random and set to Querying so they do not take part in the training. These Querying examples are used to test the network after training. The network was then created starting with 10 hidden nodes and allowed to grow. It converges quickly when the hidden nodes are incremented to 11. The excluded column named "reference" allows the Querying results to be compared with the original values. The network gives the correct result for 9 of the 10 Querying examples. It gets fg000002 wrong.
In this network one of the input columns and the output column contain text data. It is a simple species classification network. It shows the problem that can occur when text is used in an output column: intermediate values cannot be calculated. The network has to be trained to a very low error before it classifies the species correctly.
The standard "exclusive or" test that most neural network systems use for demonstration purposes. This sample is used in the interactive Getting Started exercises.
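XOR is the classic demonstration because it is not linearly separable: no single threshold unit can compute it, but a two-layer network with hand-chosen weights can. A minimal sketch (a trained network learns equivalent weights):

```python
def step(x):
    return 1 if x >= 0 else 0

def xor(a, b):
    """Hand-weighted two-layer network for XOR: two hidden threshold
    units computing OR and NAND feed an AND output unit."""
    h_or   = step(a + b - 0.5)
    h_nand = step(-a - b + 1.5)
    return step(h_or + h_nand - 1.5)

print([xor(a, b) for a in (0, 1) for b in (0, 1)])  # [0, 1, 1, 0]
```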