Tag: dimensionality reduction

0 views collected under this tag.

Python Programming Learning Circle
Jan 2, 2025 · Artificial Intelligence

A Comprehensive Guide to Dimensionality Reduction Algorithms with Python Implementations

This article introduces eleven classic dimensionality reduction techniques—including PCA, LDA, MDS, LLE, and t‑SNE—explains their principles, advantages, and limitations, and provides complete Python code examples and resources for each method, making it a valuable guide for beginners in machine learning and data mining.

Data Mining · PCA · Python
0 likes · 17 min read
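As a quick taste of one of the techniques this guide surveys, here is a minimal t‑SNE sketch using scikit‑learn (assumed installed), embedding the 4‑dimensional Iris measurements into two dimensions:

```python
from sklearn.datasets import load_iris
from sklearn.manifold import TSNE

X = load_iris().data  # 150 samples, 4 features

# Embed into 2 dimensions; perplexity must stay below the sample count
emb = TSNE(n_components=2, perplexity=30, random_state=0).fit_transform(X)
print(emb.shape)  # (150, 2)
```

The 2‑D embedding can then be scatter‑plotted to inspect how well the three Iris species separate.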
php中文网 Courses
Oct 23, 2024 · Artificial Intelligence

Data Dimensionality Reduction and Feature Extraction with PHP

This article explains the concepts of data dimensionality reduction and feature extraction in machine learning and demonstrates how to implement them in PHP using the PHP‑ML library, including installation, data preprocessing, PCA-based reduction, and feature extraction with token vectorization and TF‑IDF.

Feature Extraction · PCA · PHP-ML
0 likes · 5 min read
php中文网 Courses
Jun 13, 2024 · Artificial Intelligence

Using PHP for Data Dimensionality Reduction and Feature Extraction

This article explains the importance of data dimensionality reduction and feature extraction in machine learning, and provides a step‑by‑step guide with PHP code examples—including library installation, data preprocessing, PCA‑based reduction, and feature selection techniques—demonstrating how to handle large datasets efficiently.

Feature Extraction · PCA · PHP
0 likes · 6 min read
Model Perspective
May 29, 2024 · Artificial Intelligence

How to Build Word Vectors from Scratch: A Step‑by‑Step Guide

This article explains the fundamentals of word vectors in NLP, walks through constructing them via co‑occurrence matrices and dimensionality reduction, demonstrates the process with a concrete example and Python code, and evaluates the resulting embeddings using cosine similarity.

NLP · Python · SVD
0 likes · 7 min read
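The pipeline this article describes (co‑occurrence counts, then dimensionality reduction, then cosine‑similarity evaluation) can be sketched in a few lines of NumPy on a toy corpus; the corpus, window size, and 2‑D target dimension here are illustrative choices, not the article's:

```python
import numpy as np

# Toy corpus; count window-1 co-occurrences
corpus = [["i", "like", "nlp"], ["i", "like", "deep", "learning"], ["i", "enjoy", "flying"]]
vocab = sorted({w for sent in corpus for w in sent})
idx = {w: i for i, w in enumerate(vocab)}

M = np.zeros((len(vocab), len(vocab)))
for sent in corpus:
    for i, w in enumerate(sent):
        for j in (i - 1, i + 1):
            if 0 <= j < len(sent):
                M[idx[w], idx[sent[j]]] += 1

# Truncated SVD: keep the top-2 singular directions as 2-D word vectors
U, S, Vt = np.linalg.svd(M)
vecs = U[:, :2] * S[:2]

def cos(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cos(vecs[idx["like"]], vecs[idx["enjoy"]]))
```

Words appearing in similar contexts ("like" and "enjoy" here) should end up with a higher cosine similarity than unrelated pairs.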
Model Perspective
May 20, 2024 · Artificial Intelligence

How Dimensionality Reduction and Graph Theory Simplify Complex Systems

The article explains how dimensionality reduction techniques—such as PCA, LDA, and t‑SNE—combined with graph theory can transform high‑dimensional data into simpler, low‑dimensional representations, enabling clearer analysis of complex systems like neural networks and image data, and enhancing machine‑learning efficiency.

Data Visualization · dimensionality reduction · graph theory
0 likes · 6 min read
Model Perspective
Mar 21, 2023 · Artificial Intelligence

Master Linear Discriminant Analysis (LDA) with Python: Theory & Code

This article explains Linear Discriminant Analysis (LDA) as a pattern‑recognition technique that projects data onto a low‑dimensional space to maximize class separation, details its mathematical formulation with between‑class and within‑class scatter matrices, and provides a complete Python implementation using scikit‑learn on the Iris dataset, including visualization of the results.

LDA · Linear Discriminant Analysis · Python
0 likes · 6 min read
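The article's scikit‑learn‑on‑Iris setup can be sketched in a few lines; this is a minimal version of that workflow, not the article's full implementation:

```python
from sklearn.datasets import load_iris
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

X, y = load_iris(return_X_y=True)

# With 3 classes, LDA can project onto at most 2 discriminant axes
lda = LinearDiscriminantAnalysis(n_components=2)
X_lda = lda.fit_transform(X, y)
print(X_lda.shape)  # (150, 2)
print(lda.explained_variance_ratio_)
```

Unlike PCA, LDA uses the class labels `y`, so the projection maximizes between‑class separation rather than raw variance.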
Model Perspective
Mar 3, 2023 · Fundamentals

Unlock Hidden Patterns: A Practical Guide to Factor Analysis with Python

This article explains factor analysis, a statistical technique for uncovering common factors underlying observed variables, contrasts it with PCA, walks through its procedural steps and adequacy tests, and provides a hands‑on Python implementation using the factor_analyzer library, complete with visualizations and factor‑rotation methods.

Python · data preprocessing · dimensionality reduction
0 likes · 10 min read
Model Perspective
Feb 8, 2023 · Artificial Intelligence

Mastering Feature Selection: From Filters to Embedded Methods in Python

This article explains why feature selection is crucial for machine learning, outlines the general workflow, compares filter, wrapper, embedded, and synthesis approaches, and provides practical Python examples—including Pearson correlation, chi‑square tests, mutual information, variance selection, recursive elimination, L1 regularization, and PCA—complete with code snippets and visualizations.

Python · dimensionality reduction · feature selection
0 likes · 20 min read
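Two of the filter methods the article covers (variance selection and the chi‑square test) can be sketched with scikit‑learn; the Iris data and the thresholds here are illustrative stand‑ins for the article's own examples:

```python
from sklearn.datasets import load_iris
from sklearn.feature_selection import SelectKBest, VarianceThreshold, chi2

X, y = load_iris(return_X_y=True)

# Filter step 1: drop near-constant features
X_var = VarianceThreshold(threshold=0.1).fit_transform(X)

# Filter step 2: keep the 2 features most associated with the label (chi-square)
X_sel = SelectKBest(chi2, k=2).fit_transform(X_var, y)
print(X_sel.shape)  # (150, 2)
```

Filter methods like these score features independently of any model; the wrapper and embedded approaches the article compares (recursive elimination, L1 regularization) instead consult a trained estimator.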
Model Perspective
Jan 8, 2023 · Artificial Intelligence

Unlock Hidden Patterns: A Deep Dive into Unsupervised Learning Techniques

This article introduces unsupervised learning, covering its motivation, Jensen's inequality, key clustering methods such as EM, k‑means, hierarchical clustering, evaluation metrics, and dimensionality‑reduction techniques like PCA and ICA, providing clear explanations and illustrative diagrams.

EM algorithm · ICA · PCA
0 likes · 8 min read
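Of the clustering methods surveyed, k‑means is the quickest to demonstrate; this sketch uses synthetic blobs (an illustrative dataset, not one from the article) with scikit‑learn:

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Two well-separated synthetic blobs of 50 points each
X = np.vstack([rng.normal(0, 0.5, (50, 2)), rng.normal(5, 0.5, (50, 2))])

km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
print(km.cluster_centers_.round(1))  # centers near (0, 0) and (5, 5)
```

k‑means is itself a special case of the EM idea the article develops: the label assignment is the E‑step and the centroid update is the M‑step.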
Model Perspective
Dec 30, 2022 · Fundamentals

How PCA Transforms Supplier Evaluation with Weighted Scores

This article explains the Principal Component Analysis (PCA) method, outlines its step‑by‑step weighting algorithm, and demonstrates a complete Python implementation that converts supplier metrics into objective scores using scikit‑learn.

PCA · Python · data analysis
0 likes · 9 min read
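The weighting idea (score each supplier on every principal component, then weight components by their explained‑variance ratios) can be sketched as follows; the 4×4 supplier matrix and its column meanings are hypothetical, not data from the article:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Hypothetical supplier metrics: rows = suppliers, cols = (cost, quality, delivery, service)
X = np.array([
    [0.82, 0.90, 0.75, 0.88],
    [0.61, 0.70, 0.95, 0.64],
    [0.90, 0.55, 0.60, 0.71],
    [0.45, 0.80, 0.85, 0.92],
])

Z = StandardScaler().fit_transform(X)
pca = PCA()
scores = pca.fit_transform(Z)

# Composite score: each component weighted by its explained-variance ratio
composite = scores @ pca.explained_variance_ratio_
ranking = np.argsort(-composite)
print(composite, ranking)
```

Because the weights come from the data's own variance structure, the resulting ranking avoids subjectively chosen metric weights.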
Model Perspective
Dec 14, 2022 · Fundamentals

Mastering PCA with SPSS: Step‑by‑Step Guide to Data Reduction

This guide explains PCA fundamentals, walks through suitability checks like KMO and Bartlett’s test, details step‑by‑step SPSS operations, and demonstrates how to interpret eigenvalues, scree plots, and rotated component matrices to extract meaningful factors from questionnaire data.

Bartlett Test · KMO Test · PCA
0 likes · 16 min read
Model Perspective
Sep 1, 2022 · Fundamentals

Master Factor Analysis in Python: From Theory to Practical Implementation

This article explains the origins and core concepts of factor analysis, outlines its algorithmic steps, demonstrates how to perform the analysis using Python's factor_analyzer library—including data preparation, adequacy tests, eigenvalue selection, rotation, and visualization—culminating in extracting new latent variables.

Python · data science · dimensionality reduction
0 likes · 10 min read
Model Perspective
Aug 24, 2022 · Fundamentals

Unlocking Data Insights: How Principal Component Analysis Simplifies Complex Variables

Principal Component Analysis (PCA) reduces high‑dimensional data to a few uncorrelated components by maximizing variance, enabling noise reduction, visualization, and efficient modeling, with practical steps—including data standardization, covariance matrix computation, eigenvalue extraction, and component selection—illustrated through a clothing‑size measurement case study.

PCA · data analysis · dimensionality reduction
0 likes · 9 min read
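The four steps listed in this summary map directly onto NumPy operations; here is a minimal sketch on random data (standing in for the article's clothing‑size measurements), with the 80% variance cutoff as an illustrative choice:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))  # placeholder measurement matrix

# 1. Standardize each variable
Z = (X - X.mean(axis=0)) / X.std(axis=0)

# 2. Covariance matrix of the standardized data
C = np.cov(Z, rowvar=False)

# 3. Eigen-decomposition; sort eigenvalues in descending order
vals, vecs = np.linalg.eigh(C)
order = np.argsort(vals)[::-1]
vals, vecs = vals[order], vecs[:, order]

# 4. Keep enough components to explain ~80% of the variance
explained = np.cumsum(vals) / vals.sum()
k = int(np.searchsorted(explained, 0.8) + 1)
components = Z @ vecs[:, :k]
print(k, components.shape)
```

Projecting the standardized data onto the retained eigenvectors yields the uncorrelated principal‑component scores the article describes.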
Sohu Tech Products
Mar 17, 2021 · Big Data

Understanding Simhash: From Traditional Hash to Random Projection LSH

This article explains the principles and implementation of Simhash, covering the shortcomings of traditional hash functions, the use of cosine similarity, random projection for dimensionality reduction, locality‑sensitive hashing, and practical optimizations for large‑scale duplicate detection.

Algorithm · Big Data · Locality Sensitive Hashing
0 likes · 24 min read
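The random‑projection LSH idea at the heart of this article fits in a few lines of NumPy: each random hyperplane contributes one signature bit, and Hamming distance between signatures approximates angular (cosine) distance. The dimensions and the `signature`/`ham` helper names are illustrative:

```python
import numpy as np

rng = np.random.default_rng(42)
dim, bits = 64, 16

# Random hyperplanes: the sign of each projection gives one signature bit
planes = rng.normal(size=(bits, dim))

def signature(v):
    return (planes @ v > 0).astype(int)

a = rng.normal(size=dim)
b = a + 0.05 * rng.normal(size=dim)   # near-duplicate of a
c = rng.normal(size=dim)              # unrelated vector

# Hamming distance between signatures approximates angular distance
ham = lambda s, t: int(np.sum(s != t))
print(ham(signature(a), signature(b)), ham(signature(a), signature(c)))
```

The near‑duplicate pair differs in far fewer signature bits than the unrelated pair, which is what makes bit signatures usable for large‑scale duplicate detection.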
JD Tech Talk
Nov 30, 2020 · Big Data

Scalable Time Series Similarity Search in Big Data: Partitioning, Dimensionality Reduction, and LSH Approaches

This article examines the challenges of performing time‑series similarity queries on massive datasets and presents three scalable solutions—partition‑based indexing, dimensionality‑reduction using MinHash, and a combined approach with Locality Sensitive Hashing—to reduce computation while preserving similarity accuracy.

Big Data · LSH · MinHash
0 likes · 10 min read
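The MinHash step mentioned in this summary can be sketched with the standard library alone: each salted hash function keeps the minimum hashed element of a set, and the fraction of matching signature slots estimates Jaccard similarity. The salting scheme and helper names here are illustrative, not the article's implementation:

```python
import random

def minhash(items, num_hashes=64, seed=0):
    """MinHash signature of a set: per hash function, the minimum hashed value."""
    rnd = random.Random(seed)
    salts = [rnd.getrandbits(32) for _ in range(num_hashes)]
    return [min(hash((salt, x)) for x in items) for salt in salts]

def estimate_jaccard(sig_a, sig_b):
    # Fraction of matching signature slots estimates the Jaccard similarity
    return sum(a == b for a, b in zip(sig_a, sig_b)) / len(sig_a)

a = set(range(0, 100))
b = set(range(20, 120))  # true Jaccard = 80 / 120 ≈ 0.667
print(estimate_jaccard(minhash(a), minhash(b)))
```

Comparing fixed‑length signatures instead of raw series is what reduces the pairwise computation on massive datasets.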
TAL Education Technology
Sep 17, 2020 · Artificial Intelligence

Comprehensive Guide to Feature Engineering and Data Preprocessing for Machine Learning

This article provides an extensive overview of feature engineering, covering feature understanding, cleaning, construction, selection, transformation, and dimensionality reduction techniques, illustrated with Python code using the Titanic dataset, and offers practical guidelines for improving data quality and model performance in machine learning projects.

Python · Titanic dataset · data preprocessing
0 likes · 44 min read
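Two of the steps this guide covers, cleaning and feature construction, can be sketched with pandas; the mini DataFrame below only mimics Titanic‑style columns and is not the actual dataset:

```python
import pandas as pd

# Hypothetical mini-frame mimicking Titanic-style columns
df = pd.DataFrame({
    "Age": [22.0, None, 38.0, None, 29.0],
    "Sex": ["male", "female", "female", "male", "male"],
    "Fare": [7.25, 71.28, 8.05, 13.00, None],
})

# Cleaning: fill missing numeric values with the column median
df["Age"] = df["Age"].fillna(df["Age"].median())
df["Fare"] = df["Fare"].fillna(df["Fare"].median())

# Construction: encode a categorical column as dummy variables
df = pd.get_dummies(df, columns=["Sex"], drop_first=True)
print(df.isna().sum().sum(), list(df.columns))
```

Median imputation and one‑hot encoding are just two of the tools in the workflow; the article goes on to cover selection, transformation, and dimensionality reduction on the same data.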
Qunar Tech Salon
Jan 15, 2019 · Artificial Intelligence

Introduction to PCA with scikit-learn: A Dimensionality Reduction Tutorial

This article explains why dimensionality reduction is needed, introduces scikit-learn's PCA class and its parameters, provides step‑by‑step Python code examples for generating data, visualizing samples, computing variance ratios, and applying different n_components settings, and closes with the mathematical intuition and algorithmic workflow of Principal Component Analysis.

PCA · Python · dimensionality reduction
0 likes · 12 min read
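The `n_components` behavior this tutorial discusses is easy to see on Iris: passing a float asks scikit‑learn's `PCA` to keep just enough components to explain that fraction of the variance. A minimal sketch:

```python
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA

X = load_iris().data

# n_components can be an int, or a float meaning "explain this variance fraction"
pca = PCA(n_components=0.95)
X_r = pca.fit_transform(X)
print(X_r.shape, pca.explained_variance_ratio_)
```

On Iris, the first component alone explains roughly 92% of the variance, so the 0.95 threshold retains two components.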
Meiyou UED
Dec 1, 2015 · Fundamentals

Unlocking Insights: How Exploratory Factor Analysis Simplifies Complex Data

This article introduces exploratory factor analysis as a powerful dimensionality‑reduction method, explains its historical origins, describes its relationship to confirmatory factor analysis, and demonstrates its practical use in consumer‑value research by extracting four interpretable factors.

consumer research · dimensionality reduction · exploratory factor analysis
0 likes · 4 min read