Mastering the Entropy Weight Method: Theory, Steps, and Python Example
This article introduces the entropy weight method for multi‑criteria evaluation, explains its theoretical basis, outlines the calculation steps, presents a supplier‑selection case study, and provides a complete Python implementation to compute indicator weights and overall scores.
Entropy Weight Method
The entropy weight method is a commonly used technique in comprehensive evaluation models that applies information entropy theory to assign weights to criteria.
In this method, the entropy value of each evaluation indicator reflects how dispersed its values are across the objects: a smaller entropy means the indicator varies more between objects, carries more discriminating information, and therefore deserves a larger weight, while a larger entropy means its values are nearly uniform and contribute little to telling the objects apart. By normalizing the redundancy values 1 − e, the weight of each indicator can be derived.
The advantage of the entropy weight method is that it derives weights purely from the dispersion in the data itself, avoiding subjective weighting and the inconsistency it can introduce, and it has been widely applied in multi‑indicator comprehensive evaluation and decision analysis.
Implementation Process
Assume there are m evaluation objects and n evaluation indicators, with x_ij the value of object i on indicator j. The calculation steps are:
1. Standardize the original data (for example, min‑max scaling each indicator column to [0, 1]), then convert each column to proportions p_ij = x_ij / Σ_i x_ij.
2. Compute the normalized information entropy of indicator j: e_j = −(1 / ln m) Σ_i p_ij ln p_ij, which lies in [0, 1].
3. Derive the weight of each indicator from its entropy: w_j = (1 − e_j) / Σ_k (1 − e_k), so that the weights sum to 1.
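The steps above can be sketched directly in Python. This is a minimal illustration, assuming all indicators are benefit-type (larger is better) and using min‑max scaling with a small epsilon to avoid log(0); the function name `entropy_weights` is ours, not from any library:

```python
import numpy as np

def entropy_weights(X):
    """Entropy weights for a benefit-type data matrix X (m objects x n indicators)."""
    m = X.shape[0]
    # Step 1: min-max standardize each column to [0, 1],
    # then convert each column to proportions p_ij (epsilon avoids log(0))
    Z = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))
    P = (Z + 1e-12) / (Z + 1e-12).sum(axis=0)
    # Step 2: normalized entropy e_j = -(1 / ln m) * sum_i p_ij * ln p_ij
    e = -(P * np.log(P)).sum(axis=0) / np.log(m)
    # Step 3: weights from the redundancy 1 - e_j, normalized to sum to 1
    d = 1 - e
    return d / d.sum()

X = np.array([[5, 20, 80], [6, 15, 90], [4, 18, 85]], dtype=float)
print(entropy_weights(X))  # three weights that sum to 1
```

Indicators whose standardized values are spread out get a small entropy and hence a large weight; a nearly constant column gets a weight close to zero.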
Case Study and Python Implementation
Suppose a company needs to select a new supplier for a raw material, considering factors such as price, delivery time, and quality. These factors have different importance, so the company must determine the weight of each factor for a comprehensive evaluation.
The following table lists the scores of three candidate suppliers on the three criteria:
Supplier      Price   Delivery Time   Quality
Supplier A    5       20              80
Supplier B    6       15              90
Supplier C    4       18              85
<code>import pandas as pd
import numpy as np
from scipy.stats import entropy

# Define evaluation indicators and supplier data
indicators = ['Price', 'Delivery Time', 'Quality']
suppliers = ['Supplier A', 'Supplier B', 'Supplier C']

# Data matrix (rows: suppliers, columns: indicators)
data = np.array([[5, 20, 80], [6, 15, 90], [4, 18, 85]])
m = data.shape[0]  # number of evaluation objects

# Calculate the normalized information entropy of each indicator;
# scipy's entropy() converts each column to proportions internally
entropy_list = []
for j in range(len(indicators)):
    entropy_list.append(entropy(data[:, j]) / np.log(m))

# Compute entropy weights: w_j = (1 - e_j) / sum_k (1 - e_k)
weight_list = np.array([1 - e for e in entropy_list])
weight_list = weight_list / weight_list.sum()

# Calculate comprehensive scores for each supplier
scores = []
for i in range(len(suppliers)):
    score_i = 0
    for j in range(len(indicators)):
        score_i += weight_list[j] * data[i, j]
    scores.append(score_i)

# Identify the best supplier
best_supplier = suppliers[np.argmax(scores)]
print("Entropy weights for each indicator:", weight_list)
print("Comprehensive scores for each supplier:", scores)
print("Best supplier is:", best_supplier)
</code>
The code implements the entropy weight method to evaluate multiple suppliers across several criteria, calculates each indicator's weight, computes the overall scores, and selects the supplier with the highest score.
First, a 3×3 NumPy array stores the scores of each supplier on price, delivery time, and quality. Pandas and NumPy are imported for data handling.
Next, the information entropy of each indicator is calculated and stored in entropy_list: each column is converted to proportions p_ij = x_ij / Σ_i x_ij, the Shannon entropy −Σ_i p_ij ln p_ij is computed, and the result is divided by the logarithm of the number of evaluation objects so that it lies in [0, 1].
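As a side note, scipy.stats.entropy normalizes its input to proportions internally, which is why the listing can pass a raw positive column. A quick check against a hand-computed Shannon entropy, using the Price column as an example:

```python
import numpy as np
from scipy.stats import entropy

col = np.array([5.0, 6.0, 4.0])    # the Price column
p = col / col.sum()                # proportions p_i
manual = -(p * np.log(p)).sum()    # Shannon entropy with natural log
assert np.isclose(entropy(col), manual)  # scipy normalized internally
print(manual / np.log(3))          # normalized entropy, in [0, 1]
```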
Then, the entropy weights are derived by subtracting each entropy from 1 and normalizing so that the sum of weights equals 1.
Afterwards, the comprehensive score for each supplier is obtained by summing the products of the indicator weights and the supplier's scores for those indicators.
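The nested loop above is simply the dot product of each supplier's data row with the weight vector; a minimal vectorized sketch (the weights here are illustrative placeholders, not the computed entropy weights):

```python
import numpy as np

data = np.array([[5, 20, 80], [6, 15, 90], [4, 18, 85]], dtype=float)
weights = np.array([0.2, 0.3, 0.5])  # illustrative weights, not the computed ones

# Matrix-vector product: one weighted sum per supplier row
scores = data @ weights
print(scores)
```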
Finally, np.argmax identifies the supplier with the highest total score, and the results are printed.
Note that it is the weight calculation, not the entropy itself, that includes the normalization step ensuring the indicator weights sum to 1. Note also that this example treats every criterion as benefit-type; in practice, cost-type indicators such as price and delivery time (where lower is better) should be inverted during standardization before computing entropy.
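Price and delivery time are cost-type indicators (lower is better). One common adjustment, sketched here under the assumption that min‑max standardization is used, inverts cost-type columns so that "lower is better" becomes "higher is better" before the entropy step:

```python
import numpy as np

X = np.array([[5, 20, 80], [6, 15, 90], [4, 18, 85]], dtype=float)
cost_cols = {0, 1}  # Price and Delivery Time: lower is better

Z = np.empty_like(X)
for j in range(X.shape[1]):
    col = X[:, j]
    if j in cost_cols:
        # cost indicator: the smallest value maps to 1
        Z[:, j] = (col.max() - col) / (col.max() - col.min())
    else:
        # benefit indicator: the largest value maps to 1
        Z[:, j] = (col - col.min()) / (col.max() - col.min())
print(Z)
```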
That concludes this article.
Model Perspective
Insights, knowledge, and enjoyment from a mathematical modeling researcher and educator. Hosted by Haihua Wang, a modeling instructor and author of "Clever Use of Chat for Mathematical Modeling", "Modeling: The Mathematics of Thinking", "Mathematical Modeling Practice: A Hands‑On Guide to Competitions", and co‑author of "Mathematical Modeling: Teaching Design and Cases".