<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>https://wikimd.com/index.php?action=history&amp;feed=atom&amp;title=Linear_discriminant_analysis</id>
	<title>Linear discriminant analysis - Revision history</title>
	<link rel="self" type="application/atom+xml" href="https://wikimd.com/index.php?action=history&amp;feed=atom&amp;title=Linear_discriminant_analysis"/>
	<link rel="alternate" type="text/html" href="https://wikimd.com/index.php?title=Linear_discriminant_analysis&amp;action=history"/>
	<updated>2026-04-20T21:31:51Z</updated>
	<subtitle>Revision history for this page on the wiki</subtitle>
	<generator>MediaWiki 1.44.2</generator>
	<entry>
		<id>https://wikimd.com/index.php?title=Linear_discriminant_analysis&amp;diff=6320568&amp;oldid=prev</id>
		<title>Prab: CSV import</title>
		<link rel="alternate" type="text/html" href="https://wikimd.com/index.php?title=Linear_discriminant_analysis&amp;diff=6320568&amp;oldid=prev"/>
		<updated>2025-02-18T05:04:25Z</updated>

		<summary type="html">&lt;p&gt;CSV import&lt;/p&gt;
&lt;table style=&quot;background-color: #fff; color: #202122;&quot; data-mw=&quot;interface&quot;&gt;
				&lt;col class=&quot;diff-marker&quot; /&gt;
				&lt;col class=&quot;diff-content&quot; /&gt;
				&lt;col class=&quot;diff-marker&quot; /&gt;
				&lt;col class=&quot;diff-content&quot; /&gt;
				&lt;tr class=&quot;diff-title&quot; lang=&quot;en&quot;&gt;
				&lt;td colspan=&quot;2&quot; style=&quot;background-color: #fff; color: #202122; text-align: center;&quot;&gt;← Older revision&lt;/td&gt;
				&lt;td colspan=&quot;2&quot; style=&quot;background-color: #fff; color: #202122; text-align: center;&quot;&gt;Revision as of 05:04, 18 February 2025&lt;/td&gt;
				&lt;/tr&gt;&lt;tr&gt;&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot; id=&quot;mw-diff-left-l58&quot;&gt;Line 58:&lt;/td&gt;
&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot;&gt;Line 58:&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;* Fisher, R. A. (1936). &amp;quot;The Use of Multiple Measurements in Taxonomic Problems&amp;quot;. Annals of Eugenics. 7 (2): 179–188.&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;* Fisher, R. A. (1936). &amp;quot;The Use of Multiple Measurements in Taxonomic Problems&amp;quot;. Annals of Eugenics. 7 (2): 179–188.&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;{{dictionary-stub1}}&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot;&gt;&lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;{{dictionary-stub1}}&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td colspan=&quot;2&quot; class=&quot;diff-side-deleted&quot;&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;&amp;lt;gallery&amp;gt;&lt;/ins&gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td colspan=&quot;2&quot; class=&quot;diff-side-deleted&quot;&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;File:Linear_discriminant_analysis_plot.png|Linear Discriminant Analysis Plot&lt;/ins&gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td colspan=&quot;2&quot; class=&quot;diff-side-deleted&quot;&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;File:Fisher2classes.png|Fisher&#039;s Linear Discriminant for Two Classes&lt;/ins&gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td colspan=&quot;2&quot; class=&quot;diff-side-deleted&quot;&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;File:4class3ddiscriminant.png|3D Discriminant for Four Classes&lt;/ins&gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td colspan=&quot;2&quot; class=&quot;diff-side-deleted&quot;&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;File:3dProjections.png|3D Projections in Linear Discriminant Analysis&lt;/ins&gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td colspan=&quot;2&quot; class=&quot;diff-side-deleted&quot;&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;&amp;lt;/gallery&amp;gt;&lt;/ins&gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;

&lt;/table&gt;</summary>
		<author><name>Prab</name></author>
	</entry>
	<entry>
		<id>https://wikimd.com/index.php?title=Linear_discriminant_analysis&amp;diff=5266135&amp;oldid=prev</id>
		<title>Prab: CSV import</title>
		<link rel="alternate" type="text/html" href="https://wikimd.com/index.php?title=Linear_discriminant_analysis&amp;diff=5266135&amp;oldid=prev"/>
		<updated>2024-02-24T13:21:54Z</updated>

		<summary type="html">&lt;p&gt;CSV import&lt;/p&gt;
&lt;p&gt;&lt;b&gt;New page&lt;/b&gt;&lt;/p&gt;&lt;div&gt;== Linear Discriminant Analysis ==&lt;br /&gt;
&lt;br /&gt;
Linear Discriminant Analysis (LDA) is a statistical technique used in machine learning and pattern recognition to find a linear combination of features that characterizes or separates two or more classes of objects or events. It is a supervised learning algorithm that has been widely used in various fields, including computer vision, bioinformatics, and finance.&lt;br /&gt;
&lt;br /&gt;
=== Overview ===&lt;br /&gt;
&lt;br /&gt;
LDA aims to find a linear projection of the data that maximizes the separation between different classes while minimizing the variance within each class. The goal is to reduce the dimensionality of the data while preserving the discriminatory information. By projecting the data onto a lower-dimensional space, LDA can effectively classify new instances based on their feature values.&lt;br /&gt;
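In practice, LDA is available in standard libraries. As an illustrative sketch (assuming scikit-learn is installed; this example is not part of the original article), the Iris dataset can be projected and classified in a few lines:&lt;br /&gt;
&lt;br /&gt;
```python
# Illustrative sketch using scikit-learn's LDA on the Iris dataset.
# Assumes scikit-learn is installed; parameter choices are examples only.
from sklearn.datasets import load_iris
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

X, y = load_iris(return_X_y=True)
lda = LinearDiscriminantAnalysis(n_components=2)
X_low = lda.fit_transform(X, y)   # project 4-D features onto 2 discriminants
print(X_low.shape)                # (150, 2)
preds = lda.predict(X)            # LDA also acts as a classifier
```
Note that `n_components` can be at most one less than the number of classes, which is why two components suffice for the three Iris species.&lt;br /&gt;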
&lt;br /&gt;
=== Mathematical Formulation ===&lt;br /&gt;
&lt;br /&gt;
Let&amp;#039;s consider a dataset with *n* samples and *d* features, where each sample belongs to one of *k* classes. The goal of LDA is to find a transformation matrix **W** that maps the original *d*-dimensional feature space to a lower-dimensional space, such that the between-class scatter is maximized and the within-class scatter is minimized.&lt;br /&gt;
&lt;br /&gt;
The between-class scatter matrix **SB** is the sum, over the classes, of the outer products of the differences between each class mean and the overall mean, weighted by class size:&lt;br /&gt;
&lt;br /&gt;
**SB** = Σ *Ni*(*mi* - *m*)(*mi* - *m*)^T,&lt;br /&gt;
&lt;br /&gt;
where *mi* is the mean vector of class *i*, *Ni* is the number of samples in class *i*, and *m* is the overall mean vector.&lt;br /&gt;
&lt;br /&gt;
The within-class scatter matrix **SW** is defined as the sum of the per-class scatter matrices:&lt;br /&gt;
&lt;br /&gt;
**SW** = Σ**Si**,&lt;br /&gt;
&lt;br /&gt;
where **Si** = Σ(**x** - *mi*)(**x** - *mi*)^T is the scatter matrix of class *i* (its covariance matrix up to a normalizing factor), with the inner sum running over the samples **x** of class *i*.&lt;br /&gt;
&lt;br /&gt;
The objective of LDA is to find the transformation matrix **W** that maximizes the ratio of the determinant of the projected between-class scatter, **W**^T **SB** **W**, to the determinant of the projected within-class scatter, **W**^T **SW** **W**. The columns of the optimal **W** are found by solving the generalized eigenvalue problem:&lt;br /&gt;
&lt;br /&gt;
**SB** **W** = λ**SW** **W**,&lt;br /&gt;
&lt;br /&gt;
where each column of **W** is a generalized eigenvector and λ is its corresponding eigenvalue.&lt;br /&gt;
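For the special case of two classes, this problem has a well-known closed form: the single discriminant direction is proportional to **SW**^-1(*m1* - *m2*). A small numerical sketch (assuming NumPy and SciPy, using synthetic data; not part of the original article) checks this against the generalized eigenproblem:&lt;br /&gt;
&lt;br /&gt;
```python
# Two-class check: the leading generalized eigenvector of SB w = lambda SW w
# is proportional to the closed-form direction SW^-1 (m1 - m0). Illustrative
# sketch on synthetic data only.
import numpy as np
from scipy.linalg import eigh, solve

rng = np.random.default_rng(1)
X0 = rng.normal(0.0, 1.0, (40, 2))   # class 0 samples
X1 = rng.normal(2.0, 1.0, (40, 2))   # class 1 samples
m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
m = np.vstack([X0, X1]).mean(axis=0)

SW = (X0 - m0).T @ (X0 - m0) + (X1 - m1).T @ (X1 - m1)
SB = 40 * np.outer(m0 - m, m0 - m) + 40 * np.outer(m1 - m, m1 - m)

vals, vecs = eigh(SB, SW)                 # generalized symmetric eigenproblem
w_eig = vecs[:, np.argmax(vals)]          # top generalized eigenvector
w_closed = solve(SW, m1 - m0)             # closed-form direction

# cosine of the angle between the two directions: close to 1.0 up to
# numerical error (sign is arbitrary, hence the abs)
cos = abs(w_eig @ w_closed) / (np.linalg.norm(w_eig) * np.linalg.norm(w_closed))
```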
&lt;br /&gt;
=== Steps of Linear Discriminant Analysis ===&lt;br /&gt;
&lt;br /&gt;
1. Compute the mean vectors and covariance matrices for each class.&lt;br /&gt;
2. Compute the between-class scatter matrix **SB** and the within-class scatter matrix **SW**.&lt;br /&gt;
3. Solve the generalized eigenvalue problem **SB** **W** = λ**SW** **W** to obtain the eigenvectors and eigenvalues.&lt;br /&gt;
4. Sort the eigenvectors in descending order based on their corresponding eigenvalues.&lt;br /&gt;
5. Select the top *m* eigenvectors to form the transformation matrix **W**; for *k* classes, at most *k* - 1 eigenvalues are nonzero, so *m* is at most *k* - 1.&lt;br /&gt;
6. Project the data onto the lower-dimensional space using **W**.&lt;br /&gt;
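The steps above can be condensed into a short function. This is an illustrative sketch, not a reference implementation; it assumes NumPy and SciPy, and weights **SB** by class size (one common convention):&lt;br /&gt;
&lt;br /&gt;
```python
# Minimal LDA sketch following the steps above. Illustrative only; assumes
# NumPy and SciPy are available and SW is nonsingular.
import numpy as np
from scipy.linalg import eigh

def lda_fit(X, y, n_components):
    classes = np.unique(y)
    d = X.shape[1]
    m = X.mean(axis=0)                    # overall mean
    SB = np.zeros((d, d))                 # between-class scatter
    SW = np.zeros((d, d))                 # within-class scatter
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        diff = (mc - m).reshape(-1, 1)
        SB += Xc.shape[0] * (diff @ diff.T)   # weighted by class size
        SW += (Xc - mc).T @ (Xc - mc)
    # solve the generalized eigenproblem SB w = lambda SW w
    eigvals, eigvecs = eigh(SB, SW)
    order = np.argsort(eigvals)[::-1]     # sort eigenvalues descending
    return eigvecs[:, order[:n_components]]

# usage: W = lda_fit(X, y, m); X_low = X @ W projects onto the discriminants
```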
&lt;br /&gt;
=== Applications ===&lt;br /&gt;
&lt;br /&gt;
Linear Discriminant Analysis has various applications in different domains. Some of the notable applications include:&lt;br /&gt;
&lt;br /&gt;
- Face Recognition: LDA has been widely used in face recognition systems to extract discriminative features from facial images and improve classification accuracy.&lt;br /&gt;
- Bioinformatics: LDA has been applied to analyze gene expression data and identify genes that are differentially expressed between different biological conditions.&lt;br /&gt;
- Finance: LDA has been used in financial markets to classify and predict stock price movements based on various financial indicators.&lt;br /&gt;
&lt;br /&gt;
=== Conclusion ===&lt;br /&gt;
&lt;br /&gt;
Linear Discriminant Analysis is a powerful technique for dimensionality reduction and classification. By finding a linear projection that maximizes the separation between classes, LDA can effectively classify new instances based on their feature values. Its applications span across various fields, making it a valuable tool in machine learning and pattern recognition.&lt;br /&gt;
&lt;br /&gt;
== See Also ==&lt;br /&gt;
* [[Principal Component Analysis]]&lt;br /&gt;
* [[Support Vector Machines]]&lt;br /&gt;
* [[Naive Bayes Classifier]]&lt;br /&gt;
&lt;br /&gt;
== References ==&lt;br /&gt;
* Fisher, R. A. (1936). &amp;quot;The Use of Multiple Measurements in Taxonomic Problems&amp;quot;. Annals of Eugenics. 7 (2): 179–188.&lt;br /&gt;
{{dictionary-stub1}}&lt;/div&gt;</summary>
		<author><name>Prab</name></author>
	</entry>
</feed>