Week 6 Homework - Common Errors Summary

Document Date: Week 6 Homework Assessment
Purpose: Summary of recurring mistakes to present to students in class


📊 Overall Statistics

  • Total students assessed: 19
  • Passing (≥50%): 19 students (100%)
  • Partial (<50%): 0 students
  • Critical error frequency: Missing timing measurements affected 7+ students

🔴 Critical Error #1: Missing Timing Measurements

Error Description:

Many students did not measure computation time for all three optimization methods; the grid-search timing was the one most often missing.

Wrong Implementation:

% Grid search without timing
s_grid = linspace(0.05, 0.5, 200);
SSE = zeros(size(s_grid));
for i = 1:length(s_grid)
    SSE(i) = obj(s_grid(i));
end
[~, idx_min] = min(SSE);
s_hat_grid = s_grid(idx_min);

% fminsearch with timing
tic;
[x_hat, fval] = fminsearch(obj_x, x0);
time_fminsearch = toc;

% fmincon with timing
tic;
[s_hat_fcon, fval_con] = fmincon(obj, s0, [], [], [], [], 0, 1, [], opts);
time_fmincon = toc;
% ❌ WRONG: Grid search not timed!

Correct Implementation:

% Grid search WITH timing
tic;
s_grid = linspace(0.05, 0.5, 200);
SSE = zeros(size(s_grid));
for i = 1:length(s_grid)
    SSE(i) = obj(s_grid(i));
end
[~, idx_min] = min(SSE);
s_hat_grid = s_grid(idx_min);
time_grid = toc;  % ✅ CORRECT: Time grid search

% fminsearch with timing
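% (obj_x and x0 are assumed to be the objective and starting point in the
%  unconstrained, sigmoid-reparameterized variable; see the reparameterization
%  notes later in this document)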
tic;
[x_hat, fval] = fminsearch(obj_x, x0);
time_fminsearch = toc;

% fmincon with timing
tic;
[s_hat_fcon, fval_con] = fmincon(obj, s0, [], [], [], [], 0, 1, [], opts);
time_fmincon = toc;

% Display all timing results
fprintf('Time (Grid search): %.4f seconds\n', time_grid);
fprintf('Time (fminsearch): %.4f seconds\n', time_fminsearch);
fprintf('Time (fmincon): %.4f seconds\n', time_fmincon);

Why This Is Critical:

  • Cannot compare speed across methods without timing
  • Speed comparison is an explicit homework requirement
  • Demonstrates understanding of computational efficiency
  • Frequency: Affected 7+ students (most common error)

🔴 Error #2: Objective Plot Missing All Three Optima

Error Description:

The objective-function plot shows only the grid-search curve and does not mark the optima found by fminsearch and fmincon.

Wrong Implementation:

figure;
plot(s_grid, SSE, 'LineWidth', 1.5);
xlabel('s'); ylabel('SSE(s)');
title('Objective function: SSE vs s');
saveas(gcf, 'Figures/SSE_vs_s.png');
% ❌ WRONG: Only shows grid search, missing fminsearch and fmincon optima!

Correct Implementation:

figure;
plot(s_grid, SSE, 'LineWidth', 1.8); hold on;
plot(s_hat_grid, obj(s_hat_grid), 'ro', 'MarkerSize', 8, 'LineWidth', 1.5);
plot(s_hat_fminsearch, obj(s_hat_fminsearch), 'rs', 'MarkerSize', 8, 'LineWidth', 1.5);
plot(s_hat_fmincon, obj(s_hat_fmincon), 'bd', 'MarkerSize', 8, 'LineWidth', 1.5);
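% (s_hat_fminsearch is assumed to be the back-transformed estimate obtained
%  from x_hat, and s_hat_fmincon corresponds to s_hat_fcon in earlier snippets)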
xlabel('Savings rate s'); ylabel('SSE');
title('Objective vs s (Grid, fminsearch, fmincon)');
legend('Objective','Grid min','fminsearch','fmincon','Location','best');
grid on;
saveas(gcf, fullfile('Figures','SSE_vs_s.png'));
% ✅ CORRECT: Shows all three optima with distinct markers

Why This Matters:

  • Homework explicitly requires showing all three methods' optima
  • Demonstrates that all methods converge to similar solutions
  • Visual confirmation of calibration results
  • Frequency: Affected 4-5 students

🟠 Error #3: Missing optimoptions for fmincon

Error Description:

Calling fmincon without an explicit optimoptions object relies on the defaults, which print solver messages to the command window.

Wrong Implementation:

[s_hat_fcon, fval_con] = fmincon(obj, s0, [], [], [], [], 0, 1, []);
% ❌ WRONG: No optimoptions specified; default display settings print solver output

Correct Implementation:

opts = optimoptions('fmincon','Display','off');
[s_hat_fcon, fval_con] = fmincon(obj, s0, [], [], [], [], 0, 1, [], opts);
% ✅ CORRECT: Explicit options with Display='off'

Why This Matters:

  • Cleaner output without unnecessary iteration display
  • Professional code presentation
  • Allows customization of optimization parameters
  • Frequency: Affected 3-4 students

🟠 Error #4: Figures Saved to Wrong Location

Error Description:

Saving figures to the current directory instead of the Figures/ folder.

Wrong Implementation:

saveas(gcf, 'SSE_vs_s.png');  % ❌ WRONG: Saves to current directory
saveas(gcf, 'fit_vs_data.png');

Correct Implementation:

if ~exist('Figures','dir'), mkdir('Figures'); end
saveas(gcf, fullfile('Figures', 'SSE_vs_s.png'));  % ✅ CORRECT: Saves to Figures/
saveas(gcf, fullfile('Figures', 'fit_vs_data.png'));

Why This Matters:

  • Organization: all figures in one location
  • Cleaner workspace
  • Easier to find and review figures
  • Frequency: Affected 2-3 students

🟡 Error #5: Missing Comparison Comments

Error Description:

Not providing written comments comparing the three optimization methods in terms of speed, robustness, and similarity of results.

Required Elements:

% Comments should address:
% 1. Which solver is faster?
% 2. Which is more robust?
% 3. Do they give similar s*?

Good Example:

% Comparison Comments:
% 1. Speed: fminsearch and fmincon are much faster than grid search,
%    requiring fewer function evaluations (~20-50 vs 200).
% 2. Robustness: fmincon is most robust with direct constraint handling.
%    Grid search always works but is slow for fine grids.
% 3. Similarity: All three methods give nearly identical s* values,
%    confirming the objective function is well-behaved.

Why This Matters:

  • Demonstrates understanding of method trade-offs
  • Explicit homework requirement
  • Shows critical thinking about optimization
  • Frequency: Affected 1-2 students

🟡 Error #6: Incomplete Model Fit Plot

Error Description:

The model-fit plot shows only one method instead of all three for comparison.

Better Implementation:

% Plot all three methods for comparison
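% (y_fit_grid, y_fit_fminsearch, y_fit_fmincon are assumed to be simulated
%  output paths, e.g., from the solow_simulate helper evaluated at each
%  estimated savings rate)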
figure;
plot(1:T, y_data, 'k-', 'LineWidth', 1.5, 'DisplayName', 'Data'); hold on;
plot(1:T, y_fit_grid, 'r--', 'LineWidth', 1.5, 'DisplayName', 'Grid Search');
plot(1:T, y_fit_fminsearch, 'g-.', 'LineWidth', 1.5, 'DisplayName', 'fminsearch');
plot(1:T, y_fit_fmincon, 'b:', 'LineWidth', 1.5, 'DisplayName', 'fmincon');
xlabel('Time (t)'); ylabel('Output y_t');
title('Model Fit: Data vs All Three Methods');
legend('Location','best'); grid on;
% ✅ Shows comparison of all three methods

Why This Matters:

  • Visual confirmation that all methods produce similar fits
  • Demonstrates understanding of calibration results
  • More informative than single-method plot
  • Frequency: Observed in some submissions (not critical, but the full comparison is recommended)

πŸ“ Good Practices Observed

βœ… Excellent Implementations:

  1. Complete timing measurements: Several students timed all three methods correctly
  2. Comprehensive plots: Many students showed all three optima on objective plot
  3. Excellent comparison comments: Some students provided detailed, thoughtful comparisons
  4. Professional figure management: Most students saved figures properly in Figures/ folder
  5. Model fit comparisons: Several students plotted all three methods for visual comparison

✅ Strong Code Organization:

  • Clear separation of methods (grid search, fminsearch, fmincon)
  • Good documentation and comments
  • Proper use of helper functions (solow_simulate)
  • Organized output with formatted fprintf statements
  • Table outputs for comparison (observed in some submissions; see the sketch after this list)
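
For reference, a minimal sketch of such a comparison table, assuming the variable names used in the snippets above (s_hat_fminsearch stands for the savings-rate estimate recovered from fminsearch):

% Sketch: formatted comparison table (variable names assumed from earlier snippets)
methods = {'Grid search', 'fminsearch', 'fmincon'};
s_hats  = [s_hat_grid, s_hat_fminsearch, s_hat_fcon];   % estimated savings rates
times   = [time_grid, time_fminsearch, time_fmincon];   % seconds from tic/toc
fprintf('%-12s %10s %12s\n', 'Method', 's_hat', 'Time (s)');
for k = 1:numel(methods)
    fprintf('%-12s %10.4f %12.4f\n', methods{k}, s_hats(k), times(k));
end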

✅ Good Mathematical Understanding:

  • Proper understanding of sigmoid reparameterization for fminsearch
  • Correct handling of bounds in fmincon
  • Understanding of calibration objective (SSE minimization)
  • Discussion of speed vs robustness trade-offs
  • Recognition that all methods should converge to similar solutions

✅ Outstanding Features:

  • Some students included error percentages (observed in template solutions)
  • Comprehensive output with iterations and function evaluations tracked
  • Professional figure styling with proper labels and legends
  • Multiple figure formats (PNG and PDF) saved

🎯 Key Teaching Points for Class Discussion

1. Timing All Methods

  • Always time all three methods for fair comparison
  • Use tic/toc to measure execution time
  • Grid search timing is essential (not just optimizers)
  • Display timing in output for transparency

2. Complete Visualizations

  • Objective plot must show all three optima with distinct markers
  • Use different marker types (circles, squares, diamonds) and colors
  • Include legend to identify each method
  • Model fit plot should ideally show all three methods for comparison

3. Optimization Options

  • Always specify optimoptions for fmincon
  • Set Display='off' for cleaner output
  • Can customize tolerances and the algorithm if needed (see the sketch after this list)
  • Demonstrates understanding of optimization controls
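
A minimal sketch of customized options (the tolerance values and algorithm choice are illustrative, not homework requirements; option names follow current MATLAB releases):

% Illustrative: tighter tolerances and an explicit algorithm choice
opts = optimoptions('fmincon', ...
    'Display', 'off', ...              % suppress solver messages
    'Algorithm', 'sqp', ...            % alternative to the default interior-point
    'OptimalityTolerance', 1e-10, ...
    'StepTolerance', 1e-12);
[s_hat_fcon, fval_con] = fmincon(obj, s0, [], [], [], [], 0, 1, [], opts);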

4. File Organization

  • Create Figures/ directory at start of script
  • Always save figures to Figures/ folder, not current directory
  • Use fullfile() for cross-platform compatibility
  • Save in appropriate formats (PNG, PDF, or both; see the sketch after this list)
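
A minimal sketch of saving one figure in both formats (the file name is illustrative):

% Illustrative: save the current figure as both PNG and PDF in Figures/
if ~exist('Figures','dir'), mkdir('Figures'); end
saveas(gcf, fullfile('Figures', 'fit_vs_data.png'));
saveas(gcf, fullfile('Figures', 'fit_vs_data.pdf'));
% print(gcf, fullfile('Figures', 'fit_vs_data.png'), '-dpng', '-r300') is an
% alternative that also controls resolution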

5. Comparison Comments

  • Address all three homework questions explicitly
  • Discuss speed: which method is fastest and why
  • Discuss robustness: which handles constraints/edge cases best
  • Discuss similarity: do the results converge to the same solution
  • Show critical thinking about method trade-offs

6. Reparameterization Understanding

  • fminsearch requires a reparameterization (e.g., a sigmoid) to enforce bounds; see the sketch after this list
  • Understand why: fminsearch is unconstrained, needs mapping
  • fmincon directly handles bounds, no reparameterization needed
  • Grid search is transparent but slower
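
A minimal sketch of the idea, matching the 0-1 bounds used in the fmincon calls earlier (the exact transform and variable names in individual submissions may differ):

% Sketch: map an unconstrained x to s in (0,1) via a sigmoid so that
% fminsearch can search freely over x without violating the bounds on s
sig   = @(x) 1 ./ (1 + exp(-x));      % maps the real line to (0,1)
obj_x = @(x) obj(sig(x));             % objective in the unconstrained variable
x0    = 0;                            % corresponds to s = 0.5
[x_hat, fval] = fminsearch(obj_x, x0);
s_hat_fminsearch = sig(x_hat);        % back-transform to the savings rate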

📈 Grade Distribution Summary

  • ✅ Passing (≥50% correct): 19 students (100%)
  • ⚠️ Partial (<50% correct): 0 students
  • ❌ Incorrect/Incomplete: 0 students

Most common grade: ✅ (Passing)

Key takeaway: All students who submitted Week 6 homework passed (100% pass rate). Most students demonstrated solid understanding of optimization methods and calibration. The most common issue was missing timing measurements, especially for grid search. Overall performance was excellent, with many outstanding submissions demonstrating professional-level implementation and analysis.


💡 Recommendations for Future Classes

  1. Emphasize timing requirements - Make it explicit that ALL methods need timing
  2. Provide example of complete objective plot - Show template with all three optima markers
  3. Highlight comparison comment requirements - Provide structure/guidelines for comments
  4. Include file organization checklist - Ensure Figures/ folder creation and proper saving
  5. Demonstrate optimoptions usage - Show proper fmincon syntax with options
  6. Encourage model fit comparison plots - Show all three methods together for visual confirmation

📚 Excellent Submission Examples

Outstanding Features Observed:

  1. Comprehensive Timing:
    • All three methods timed with detailed output
    • Timing displayed in formatted table
    • Clear comparison of execution times
  2. Professional Visualizations:
    • All three optima clearly marked on objective plot
    • Model fit shows all three methods with distinct styles
    • High-quality figures with proper formatting
    • Both PNG and PDF formats saved
  3. Excellent Comparison Comments:
    • Directly addresses all homework questions
    • Thoughtful discussion of speed vs robustness
    • Recognition of method strengths and limitations
    • Clear conclusion about similarity of results
  4. Advanced Features (see the sketch after this list):
    • Error tracking (percentage difference from true value)
    • Iteration and function evaluation counts
    • Exit flag reporting
    • Comprehensive formatted output tables
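
A minimal sketch of how such diagnostics can be captured, reusing the handles and options from the earlier snippets (exitflag and the output structure are standard extra return values of fminsearch and fmincon):

% Capture exit flags and solver diagnostics via the extra return values
[x_hat, fval, exitflag_fs, output_fs] = fminsearch(obj_x, x0);
[s_hat_fcon, fval_con, exitflag_fc, output_fc] = ...
    fmincon(obj, s0, [], [], [], [], 0, 1, [], opts);
fprintf('fminsearch: exitflag=%d, iterations=%d, funcCount=%d\n', ...
    exitflag_fs, output_fs.iterations, output_fs.funcCount);
fprintf('fmincon:    exitflag=%d, iterations=%d, funcCount=%d\n', ...
    exitflag_fc, output_fc.iterations, output_fc.funcCount);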

🔄 Comparison with Week 5

Week 5 vs Week 6:

  • Week 5 had more critical algorithmic errors (bisection function value updates)
  • Week 6 errors are more about completeness and presentation
  • Week 6 had a 100% pass rate vs Week 5's 81% pass rate
  • Week 6 students demonstrated better understanding of requirements overall
  • Week 6 errors are easier to fix (add timing, complete plots) vs Week 5's fundamental algorithm issues