The wiki should be a brief overview of your presentation/paper. Include pictures or video if applicable and, by all means, be creative. In short: summarize the paper, include pictures related to it, and provide at least one link related to the topic.
- Second page -
a. Summarize the key points in the article. Use this article: http://eprints.lincoln.ac.uk/id/eprint/22407/1/Pre-print.pdf
b. Reflect on specific insights or research findings in the articles that are critical for a deeper understanding of the topic
c. Compare or contrast with a theoretical perspective or ideas from other readings such as your textbook.
d. Discuss a personal bias or assumption that is supported/challenged by the reading.
e. Discuss any difference in thinking you now have about the role of factors such as gender, race, social class, age, and ethnicity in relation to the topic.
Finally, we ran our efficiency experiment using a fixed item count (N) of 40 over 5,000 iterations (Figure 10). Here, MIMIC dominates the other three algorithms, quickly converging to a fitness of roughly 4,000. Interestingly, all of the algorithms converge rather quickly, suggesting that this problem's search space contains many isolated local optima that are hard to escape. The convergence also suggests that additional iterations would not increase the model's fitness; in fact, fewer than 800 iterations were needed for all four algorithms to converge.

Figure 10. Knapsack fitness results compared to optimization algorithm iterations.

The high performance of the MIMIC algorithm on this type of problem can be attributed to its underlying approach. MIMIC attempts to model the search space's probability distribution, finding an inner 'structure' within the data. This allows it to use these assumptions to find points it deems 'good' with a level of insight, rather than selecting them randomly as the other algorithms do. In a problem like Knapsack, where certain combinations of items produce local optima with poor fitness, the algorithm is able to avoid these regions and converge to a better local (or global) optimum.

Conclusion

In this paper, we have looked at four unique randomized optimization algorithms and compared their performance on entirely different problems. To summarize, we will review where each of our algorithms excelled and, further, what the solutions to our problems tell us about the overall application of randomized optimization.

Optimization Algorithms

Randomized Hill Climbing comes with two distinct advantages: the algorithm itself is incredibly simple, and, given enough iterations, a properly implemented version will eventually find the global optimum.
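The simplicity of Randomized Hill Climbing can be illustrated with a minimal sketch on the Knapsack problem. The function names, parameter values, and the single-bit-flip neighborhood below are illustrative assumptions, not the paper's actual implementation:

```python
import random

def knapsack_fitness(solution, values, weights, capacity):
    """Total value of the selected items, or 0 if the weight limit is exceeded."""
    total_value = sum(v for v, bit in zip(values, solution) if bit)
    total_weight = sum(w for w, bit in zip(weights, solution) if bit)
    return total_value if total_weight <= capacity else 0

def randomized_hill_climbing(values, weights, capacity, iterations=5000, seed=0):
    """Flip one random bit per iteration; keep the neighbor if fitness does not drop."""
    rng = random.Random(seed)  # fixed seed for reproducibility (an illustrative choice)
    n = len(values)
    current = [rng.randint(0, 1) for _ in range(n)]  # random starting solution
    best_fitness = knapsack_fitness(current, values, weights, capacity)
    for _ in range(iterations):
        neighbor = current[:]
        i = rng.randrange(n)
        neighbor[i] = 1 - neighbor[i]  # toggle one randomly chosen item in/out
        f = knapsack_fitness(neighbor, values, weights, capacity)
        if f >= best_fitness:  # accept ties so the search can move sideways
            current, best_fitness = neighbor, f
    return current, best_fitness
```

A run on a small instance, e.g. `randomized_hill_climbing([10, 5, 8, 3], [4, 2, 3, 1], capacity=6)`, shows the trade-off discussed above: each step is trivially cheap, but with only single-bit moves the search can stall on a local optimum, which is exactly the weakness MIMIC's distribution modeling is meant to avoid.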