# Column Generation in Linear Programming and the Cutting Stock Problem | by Bruno Scalia C. F. Leite | Jun, 2023

Let us start the Python implementation of the cutting stock problem, in which the LP relaxation is solved to optimality by column generation, and then an integer model restricted to the patterns produced so far is solved. We will use numpy for linear algebra operations, pandas to work with dataframes, scipy for the optimization algorithms, and matplotlib to visualize the cutting patterns in a plot.

```python
import numpy as np
import pandas as pd
from scipy.optimize import linprog
import matplotlib.pyplot as plt
```

Let us import a dataset that is a modified version of the problem proposed in the JuMP documentation.

```python
dataset = pd.read_csv("data.txt", sep=" ")
```

And let us instantiate the problem parameters.

```python
# Total width
W = 100.0

# Width and amount associated with each demand
w = dataset.w.values
d = dataset.d.values

# LP parameters
A = np.eye(dataset.shape[0]) * (W // w)
c = np.ones_like(w)
```

Notice that to initialize the A matrix, I introduced simple cutting patterns that produce the maximum feasible number of rolls for each width from the demands. Suppose there is some demand for rolls of width 24. It would lead to an initial cutting pattern with a coefficient of 4, given the total width W of 100.
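As a quick sanity check, here is a minimal sketch of how these diagonal initial patterns come out. The widths below are made up for illustration; in the article they come from the dataset.

```python
import numpy as np

W = 100.0
w = np.array([24.0, 45.0, 60.0])  # assumed example widths, not the real dataset

# Each initial pattern cuts as many rolls of a single width as fit in W,
# so the diagonal of A holds floor(W / w[i]) for each width.
A = np.eye(w.size) * (W // w)
print(A.diagonal())  # [4. 2. 1.]
```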

Now let us define a function to solve the subproblem given the total width W, a vector of widths w associated with each demand, and the current vector of dual variables duals.

```python
def solve_knapsack(W, w, duals):
    return linprog(
        -duals,
        A_ub=np.atleast_2d(w),
        b_ub=np.atleast_1d(W),
        bounds=(0, np.inf),
        integrality=1,
    )
```

In the main loop, we will solve each restricted master problem using the linprog function from scipy. Both the A matrix and the demand vector are passed as their negatives because, by convention, scipy handles only "less than or equal to" inequalities.

```python
linprog(c, A_ub=-A, b_ub=-d, bounds=(0, None))
```

The solution stores the negative of the dual variables associated with the demands in the ineqlin.marginals attribute.
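A small self-contained sketch (with made-up numbers, not the article's dataset) shows how those duals come out of scipy's result object:

```python
import numpy as np
from scipy.optimize import linprog

A = np.diag([4.0, 2.0])    # two assumed initial patterns
c = np.ones(2)             # each pattern consumes one roll
d = np.array([8.0, 6.0])   # assumed demands

# Minimize c @ x subject to A @ x >= d, written as -A @ x <= -d for scipy.
sol = linprog(c, A_ub=-A, b_ub=-d, bounds=(0, None))

# ineqlin.marginals holds the shadow prices of b_ub, which are nonpositive
# in a minimization; negating them yields the nonnegative demand duals.
duals = -sol.ineqlin.marginals
print(duals)  # [0.25 0.5 ]
```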

Exploring duality concepts, one could also obtain the dual variables by solving the following LP.

```python
linprog(-d, A_ub=A.T, b_ub=c, bounds=(0, None))
```

And let us put it all together in the main loop.

```python
# Initial solution
sol = linprog(c, A_ub=-A, b_ub=-d, bounds=(0, None))
sol_dual = linprog(-d, A_ub=A.T, b_ub=c, bounds=(0, None))
diff = np.abs(sol_dual.x + sol.ineqlin.marginals).sum()
print(f"Compare duality difference: {diff}")

# Iterate
for _ in range(1000):
    duals = -sol.ineqlin.marginals
    price_sol = solve_knapsack(W, w, duals)
    y = price_sol.x
    if 1 + price_sol.fun < -1e-4:
        print(f"Iteration: {_}; Reduced cost: {(1 + price_sol.fun):.3f}")
        A = np.hstack((A, y.reshape((-1, 1))))
        c = np.append(c, 1)
        sol = linprog(c, A_ub=-A, b_ub=-d, bounds=(0, None))
    else:
        break
```

In the end, let us try both rounding up the solution of the linear relaxation and solving an integer model restricted to the columns produced in the LP.

```python
sol_round = linprog(c, A_ub=-A, b_ub=-d, bounds=(0, np.inf), integrality=0)
print(f"Rounding solution {np.ceil(sol_round.x).sum()}")

sol = linprog(c, A_ub=-A, b_ub=-d, bounds=(0, np.inf), integrality=1)
print(f"Integer solution: {sol.x.sum()}")
```

Which should return:

• Rounding solution 339.0
• Integer solution: 334.0

In this case, we could improve the result by about 1.5% just by imposing integrality constraints on the LP rather than rounding up the relaxation. Just a spoiler for those willing to try Branch & Price: 334 is the exact solution for this instance.

At last, let us try and visualize the new cutting patterns:

```python
fig, ax = plt.subplots(figsize=[7, 3], dpi=100)
hmap = ax.imshow(A > 1e-6, cmap="Blues")
plt.axis('off')
fig.tight_layout()
plt.show()
```
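Besides the heatmap, each pattern can also be printed as a mapping from width to the number of rolls it cuts. A minimal sketch with made-up widths and patterns (in the article, w and A come from the column generation run above):

```python
import numpy as np

# Assumed example data for illustration only.
w = np.array([24.0, 45.0])
A = np.array([[4.0, 1.0],
              [0.0, 1.0]])

# Each column of A is one pattern: entry i counts rolls of width w[i].
patterns = []
for j in range(A.shape[1]):
    cuts = {w[i]: int(A[i, j]) for i in range(len(w)) if A[i, j] > 1e-6}
    patterns.append(cuts)
    print(f"Pattern {j}: {cuts}")
```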

And we solved the cutting stock problem using column generation.