rate of temperature decrease. Using a 70/30 train/test split, we ran a grid search over these two hyperparameters, training for 1,000 iterations per configuration; the results appear in Table 2. Note that we set the lowest initial temperature at 1,000, since too low an initial temperature would keep the algorithm from expanding its search across the full scope of the problem, reducing it to basic hill climbing, as mentioned previously.

                     Initial Temperature
Cooling Factor    1,000     1.0E5     1.0E7     1.0E9
0.25              0.909     0.913*    0.901     0.909
0.5               0.894     0.908     0.910     0.890
0.75              0.847     0.898     0.845     0.879
0.99              0.441     0.441     0.441     0.441

Table 2. Grid search on Simulated Annealing cooling factor and initial temperature. Entries give the final testing accuracy (as a fraction) after 1,000 iterations; the best accuracy is marked with an asterisk.

The grid search makes clear that an initial temperature of 100,000 (1.0E5) and a cooling factor of 0.25 offer the best performance for the model. Next, we generated a curve showing the model's training- and testing-set classification accuracy over 1,000 iterations (Figure 2).

Figure 2. NN feed-forward accuracy over 1,000 iterations of the Simulated Annealing algorithm.

The curve generated for our Simulated Annealing model is strikingly similar to the one generated using Randomized Hill Climbing: the algorithm runs into the same delayed reaction and accuracy anomalies, though at different iterations. This can be attributed to the algorithm's similarity to Randomized Hill Climbing, as well as to evidence that interactions among the attributes within the neural network's hidden layer are largely responsible for the behavior. Running 1,000 iterations of the Simulated Annealing algorithm took 19.56 seconds, placing it among the fastest of the randomized optimization algorithms. Looking forward, the accuracy would likely converge higher if the algorithm were given more iterations to run. Aside from this, the
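The role the two hyperparameters play can be seen in a minimal sketch of simulated annealing with the standard Metropolis acceptance rule and a geometric cooling schedule. This is an illustrative, generic sketch, not our actual implementation: the `loss` and `neighbor` callables are assumptions, and the default hyperparameters simply mirror the grid-search winner (initial temperature 1.0E5, cooling factor 0.25).

```python
import math
import random

def simulated_annealing(loss, neighbor, x0, t0=1.0e5, cooling=0.25, iters=1000):
    """Minimize `loss` starting from `x0`.

    neighbor(x) proposes a perturbed copy of the current solution.
    The temperature is multiplied by `cooling` each iteration, so a
    factor near 1 (e.g. 0.99) keeps the search random for longer,
    while a small factor (e.g. 0.25) quickly reduces the algorithm
    to hill climbing.
    """
    x, fx, t = x0, loss(x0), t0
    best, fbest = x, fx
    for _ in range(iters):
        cand = neighbor(x)
        fcand = loss(cand)
        # Accept improvements outright; accept worse moves with
        # probability exp(-(f_cand - f_x) / T), so high T explores widely.
        if fcand < fx or random.random() < math.exp(-(fcand - fx) / t):
            x, fx = cand, fcand
        if fx < fbest:
            best, fbest = x, fx
        t = max(t * cooling, 1e-12)  # geometric cooling, floored to avoid /0
    return best, fbest
```

With a cooling factor of 0.99 the temperature stays high for most of the run, so nearly every move is accepted and the search behaves like a random walk, which is consistent with the uniformly poor 0.441 accuracy in that row of Table 2.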