Artificial Intelligence (CS607) Handouts – Lecture 11

We kept the information about the tree traversal in memory (in the queues), so we know the links that have to be followed to reach the goal. Another thing we noticed in the previous chapter is that we perform a sequential search through the search space.
To speed up these techniques, we can follow a parallel approach: start from multiple locations (states) in the solution space and search the space in parallel.
Our goal is to reach the top, irrespective of how we get there. We apply different operators at a given position and move in the direction that gives us improvement (more height). What if, instead of starting from one position, we climb the hill from several different positions, as indicated in the diagram below? In other words, we start several independent search instances from different locations. Now consider improving this with a collaborative approach, where these instances interact and evolve by sharing information in order to solve the problem.
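The multi-start idea can be sketched in a few lines. This is a minimal illustration, not the handout's implementation: the landscape `height`, the neighbour function, and the number of starting positions are all invented for the example.

```python
import random

def hill_climb(height, start, neighbours, max_steps=1000):
    """Greedy hill climbing: keep moving to the best neighbour
    until no neighbour is higher than the current position."""
    current = start
    for _ in range(max_steps):
        best_next = max(neighbours(current), key=height)
        if height(best_next) <= height(current):
            break  # local maximum (or plateau) reached
        current = best_next
    return current

# A toy 1-D landscape (an assumption for illustration) with two
# peaks: a lower one at x = 20 and the summit at x = 70.
def height(x):
    return max(0, 10 - abs(x - 20)) + max(0, 15 - abs(x - 70))

def neighbours(x):
    return [x - 1, x + 1]

# Multiple independent instances: climb from several random starting
# positions and keep the best summit any instance reaches.
random.seed(0)
starts = [random.randint(0, 100) for _ in range(10)]
summits = [hill_climb(height, s, neighbours) for s in starts]
best = max(summits, key=height)
```

Each instance here is still fully independent; the collaborative step, where instances share information, is what genetic algorithms add.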
You will soon find out what we mean by "interact" and "evolve". It is possible to implement parallelism in the sense that the instances interact and evolve to solve the problem. Such implementations are known as genetic algorithms (GA). The genetic algorithm technique comes from the concept of evolution. The following paragraph gives a brief overview of evolution and introduces some terminology, to the extent that we will require for further discussion of GA.
Individuals (animals or plants) produce a number of offspring (children) which are almost, but not entirely, like themselves. Some of these offspring may survive to produce offspring of their own, and some will not. Over time, generations become better and better adapted to survive. A genetic algorithm applies the same idea to search: each search instance is a path through the search space, and at each step the current states of different pairs of these paths are combined to form new paths. This way the search paths do not remain independent; instead they share information with each other and thus try to improve the overall performance across the complete search space.
Inheritance carries the notion of receiving something, some attribute, from a parent, while mutation refers to a small random change. We will explain these two terms as we discuss the solutions to a few problems through GA. The following table shows which steps correspond to what.

Initial Population
    Basic GA: start with a population of randomly generated attempted solutions to a problem.
    Problem 1 solution: create randomly generated computer words.
Evaluation Function
    Basic GA: evaluate each of the attempted solutions.
We will incorporate inheritance later in the example. Notice that mutation can be as simple as flipping one bit (or any number of bits) at random. We keep repeating the algorithm until we either get the required word, a bit string of all ones, or we run out of time. Hence GA is at times used to get an optimal solution given some parameters. Consider that the given points are as follows.
It is not necessary in the above example that you get a solution with 0 badness. If we keep iterating and run out of time, we may simply present the solution with the least badness as the best solution found in that many iterations on this data.
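The mutation-only procedure for the bit-word problem can be sketched as follows. The word length, population size, and survivor count are illustrative assumptions; the handout does not fix them here.

```python
import random

random.seed(1)
WORD_LEN = 8    # illustrative; the handout's word length is not specified here
POP_SIZE = 10   # illustrative population size

def fitness(word):
    # Evaluation function: count the 1-bits; the goal is a word of all ones.
    return sum(word)

def mutate(word):
    # Mutation: flip a single bit at a random position.
    i = random.randrange(len(word))
    child = list(word)
    child[i] ^= 1
    return child

# Initial population: randomly generated computer words.
population = [[random.randint(0, 1) for _ in range(WORD_LEN)]
              for _ in range(POP_SIZE)]
for generation in range(1000):   # upper limit: stop if we run out of time
    population.sort(key=fitness, reverse=True)
    if fitness(population[0]) == WORD_LEN:
        break  # required word (all ones) found
    # Selection: keep the fitter half, refill with mutated copies of it.
    survivors = population[:POP_SIZE // 2]
    population = survivors + [mutate(random.choice(survivors))
                              for _ in survivors]
best = max(population, key=fitness)
```

If the loop exhausts its iteration budget instead, `best` is simply the least-bad individual found, exactly as described above.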
So far, the only way to introduce variation was through mutation (random changes). Assuming that each organism has just one chromosome, a new offspring is produced by forming a new chromosome from parts of the chromosomes of both parents.
Let us repeat the bit-word example, but this time using crossover instead of mutation. The simplest way to perform this crossover is to join the head of one individual to the tail of the other, as shown in the diagram below. In the bit-word problem, the two-parent, no-mutation approach, if it succeeds, is likely to succeed much faster, because up to half of the bits change each time, not just one bit.
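Head-to-tail crossover can be sketched as below; the function name and the fixed crossover point are just for illustration.

```python
import random

def crossover(parent_a, parent_b, point=None):
    """Single-point crossover: the head of one parent is joined to
    the tail of the other, giving two children."""
    if point is None:
        point = random.randrange(1, len(parent_a))
    child1 = parent_a[:point] + parent_b[point:]
    child2 = parent_b[:point] + parent_a[point:]
    return child1, child2

a = [1, 1, 1, 1, 0, 0, 0, 0]
b = [0, 0, 0, 0, 1, 1, 1, 1]
c1, c2 = crossover(a, b, point=4)   # c1 is all ones, c2 is all zeros
```

Note how one crossover can change many bits at once, which is exactly why this approach can succeed faster than single-bit mutation.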
However, with no mutation, it may not succeed at all. By pure bad luck, maybe none of the first randomly generated words has, say, bit 17 set to 1. Then there is no way a 1 could ever occur in that position. Another problem is lack of genetic diversity: maybe some of the first generation did have bit 17 set to 1, but none of them were selected for the second generation. The best technique in general turns out to be a combination of both, i.e. crossover together with mutation. Let us now apply GA to a classic problem. It is called the Eight Queens Problem.
The problem is to place 8 queens on a chess board so that none of them can attack any other. A chess board can be considered a plain board with eight columns and eight rows, as shown below. The possible cells that a queen can move to, when placed in a particular square, are shown in black shading. We will use the representation shown in the figure below.
The 8 digits, one per column, specify the index of the row where the queen is placed. For example, the sequence 2 6 8 3 4 5 3 1 tells us that in the first column the queen is placed in the second row, in the second column the queen is in the 6th row, and so on, until in the 8th column the queen is in the 1st row.
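A small sketch of this representation: the `render` helper (a name invented here) draws the board for the handout's example sequence 2 6 8 3 4 5 3 1.

```python
def render(individual):
    """Draw the board with row 1 at the bottom; individual[c] is the
    1-based row of the queen standing in column c + 1."""
    lines = []
    for row in range(8, 0, -1):
        lines.append(" ".join("Q" if individual[col] == row else "."
                              for col in range(8)))
    return "\n".join(lines)

board = [2, 6, 8, 3, 4, 5, 3, 1]   # the handout's example individual
picture = render(board)
```

Printing `picture` shows, for instance, the queen of column 3 on the top row (row 8) and the queen of column 8 on the bottom row (row 1), matching the reading given above.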
Now we need a fitness function: a function by which we can tell which board position is nearer to our goal. Since we are going to select the best individuals at every step, we need a method to rate these board positions (individuals).
One fitness function can be to count the number of pairs of queens that are not attacking each other. An example of how to compute the fitness of a board configuration is given in the diagram below.
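A sketch of a fitness function in this spirit. Note that the example values used later (2, 3, and 8 for a complete solution) fit a per-queen count, i.e. the number of queens that no other queen attacks, rather than a count of non-attacking pairs (which would peak at 28), so the per-queen version is sketched here.

```python
def fitness(individual):
    """Number of queens that no other queen attacks (8 for a full
    solution); individual[c] is the 1-based row of column c's queen."""
    n = len(individual)
    safe = 0
    for i in range(n):
        attacked = any(
            individual[i] == individual[j]                       # same row
            or abs(individual[i] - individual[j]) == abs(i - j)  # same diagonal
            for j in range(n) if j != i)
        if not attacked:
            safe += 1
    return safe
```

For example, `fitness([1, 5, 8, 6, 3, 7, 2, 4])` is 8, since that arrangement is a valid solution, while a board with every queen in row 1 scores 0. Queens always occupy distinct columns in this representation, so column attacks need not be checked.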
Suppose the individuals (board positions) chosen for crossover are as shown, where the numbers 2 and 3 in the boxes to the left and right give the fitness of each board configuration, and the green arrows mark the queens that no other queen can attack.
The following diagram shows how we apply crossover. Hence we now have a total of 4 candidate solutions. Depending on their fitness, we will select the best two. The diagram below shows how we select the best two on the basis of their fitness: the vertical oval shows the children, and the horizontal oval shows the selected individuals, which are the fittest ones according to the fitness function. Similarly, the mutation step can be done as under.
You might as well decide to flip 1, 2, 3 or k bits at random positions; GA is very much a randomized technique. This process is repeated until an individual with the required fitness level is found. If no such individual is found, the process is repeated until the overall fitness of the population, or one of its individuals, gets very close to the required fitness level. An upper limit on the number of iterations is usually imposed to end the process in finite time.
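Putting selection, crossover, and mutation together for the Eight Queens problem gives a loop like the following. The population size, iteration limit, and always-mutate policy are illustrative choices, not prescribed by the handout; fitness counts queens that no other queen attacks.

```python
import random

random.seed(4)

def fitness(ind):
    # Queens that no other queen attacks; 8 means a complete solution.
    n = len(ind)
    return sum(
        not any(ind[i] == ind[j] or abs(ind[i] - ind[j]) == abs(i - j)
                for j in range(n) if j != i)
        for i in range(n))

def crossover(a, b):
    # Head of one parent joined to the tail of the other.
    p = random.randrange(1, 8)
    return a[:p] + b[p:], b[:p] + a[p:]

def mutate(ind):
    # Move one randomly chosen queen to a random row.
    child = list(ind)
    child[random.randrange(8)] = random.randint(1, 8)
    return child

population = [[random.randint(1, 8) for _ in range(8)] for _ in range(50)]
best = max(population, key=fitness)
start_fitness = fitness(best)
for _ in range(2000):            # upper limit ends the process in finite time
    if fitness(best) == 8:
        break                    # an individual with the required fitness
    population.sort(key=fitness, reverse=True)
    parents = population[:25]    # selection: the fitter half survives
    children = []
    while len(children) < 25:
        c1, c2 = crossover(random.choice(parents), random.choice(parents))
        children += [mutate(c1), mutate(c2)]
    population = parents + children[:25]
    best = max(population, key=fitness)
```

Because the fitter half is carried over unchanged, the best fitness never decreases from one generation to the next; if the iteration limit is hit first, `best` is simply the fittest individual found so far.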
One of the solutions to the problem is shown as under; its fitness value is 8. The following flow chart summarizes the genetic algorithm.