
Portal:Mathematics

From Wikipedia, the free encyclopedia

The Mathematics Portal

Mathematics is the study of representing and reasoning about abstract objects (such as numbers, points, spaces, sets, structures, and games). Mathematics is used throughout the world as an essential tool in many fields, including natural science, engineering, medicine, and the social sciences. Applied mathematics, the branch of mathematics concerned with application of mathematical knowledge to other fields, inspires and makes use of new mathematical discoveries and sometimes leads to the development of entirely new mathematical disciplines, such as statistics and game theory. Mathematicians also engage in pure mathematics, or mathematics for its own sake, without having any application in mind. There is no clear line separating pure and applied mathematics, and practical applications for what began as pure mathematics are often discovered. (Full article...)

Featured articles, which represent some of the best content on the English Wikipedia, are displayed here.

Selected image

Graph showing two sets of four points, each set perfectly fit by a trend line with positive slope; the set of points on the left is higher and the set on the right lower, so the entire collection of points is best fit by a trend line with negative slope.
Simpson's paradox (also known as the Yule–Simpson effect) states that an observed association between two variables can reverse when considered at separate levels of a third variable (or, conversely, that the association can reverse when separate groups are combined). Shown here is an illustration of the paradox for quantitative data. In the graph the overall association between X and Y is negative: as X increases, Y tends to decrease when all of the data is considered, as indicated by the negative slope of the dashed line. When the blue and red points are considered separately (two levels of a third variable, color), however, the association between X and Y is positive within each subgroup (positive slopes of the blue and red lines; the effect in real-world data is rarely this extreme).

The paradox is named after the British statistician Edward H. Simpson, who first described it in 1951 in the context of qualitative data, although similar effects had been mentioned by Karl Pearson (and coauthors) in 1899 and by Udny Yule in 1903. One famous real-life instance of Simpson's paradox occurred in the UC Berkeley gender-bias case of the 1970s, in which the university was sued for gender discrimination because its graduate schools had a higher admission rate for male applicants than for female applicants (and the difference was statistically significant). The effect was reversed, however, when the data was split by department: most departments showed a small but significant bias in favor of women. The explanation was that women tended to apply to competitive departments with low rates of admission even among qualified applicants, whereas men tended to apply to less-competitive departments with high rates of admission among qualified applicants. (Splitting by department was the more appropriate way of looking at the data, since it is individual departments, not the university as a whole, that admit graduate students.)
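The reversal described above can be reproduced numerically. The following is a minimal sketch using NumPy; the data values are invented for illustration and are not the data shown in the image.

```python
import numpy as np

# Invented data: within each colour group the association between x and y
# is positive, but the combined data trend downward.
blue_x = np.array([1.0, 2.0, 3.0, 4.0])
blue_y = np.array([8.0, 9.0, 10.0, 11.0])   # upper-left cluster, slope +1
red_x = np.array([6.0, 7.0, 8.0, 9.0])
red_y = np.array([1.0, 2.0, 3.0, 4.0])      # lower-right cluster, slope +1

def slope(x, y):
    """Least-squares slope of the regression of y on x."""
    return np.polyfit(x, y, 1)[0]

print("blue slope:    ", slope(blue_x, blue_y))   # +1.0
print("red slope:     ", slope(red_x, red_y))     # +1.0

all_x = np.concatenate([blue_x, red_x])
all_y = np.concatenate([blue_y, red_y])
print("combined slope:", slope(all_x, all_y))     # -1.0: the association reverses
```

Each subgroup has a positive fitted slope, yet pooling the groups gives a negative slope, which is exactly the reversal the paradox describes.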

Good articles

  These are Good articles, which meet a core set of high editorial standards.

Did you know (auto-generated)

More did you know


Selected article


In this shear transformation of the Mona Lisa, the central vertical axis (red vector) is unchanged, but the diagonal vector (blue) has changed direction. Hence the red vector is said to be an eigenvector of this particular transformation and the blue vector is not.
Image credit: User:Voyajer

In mathematics, an eigenvector of a transformation is a vector, different from the zero vector, which that transformation simply multiplies by a constant factor, called the eigenvalue of that vector. Often, a transformation is completely described by its eigenvalues and eigenvectors. The eigenspace for a factor is the set of eigenvectors with that factor as eigenvalue, together with the zero vector.

In the specific case of linear algebra, the eigenvalue problem is this: given an n by n matrix A, what nonzero vectors x in ℝn exist such that Ax is a scalar multiple of x?

The scalar multiple is denoted by the Greek letter λ and is called an eigenvalue of the matrix A, while x is called the eigenvector of A corresponding to λ. These concepts play a major role in several branches of both pure and applied mathematics — appearing prominently in linear algebra, functional analysis, and to a lesser extent in nonlinear situations.
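As a concrete illustration of the eigenvalue problem posed above, the following sketch uses NumPy's numpy.linalg.eig; the matrix A is chosen arbitrarily for illustration and is not taken from the article.

```python
import numpy as np

# A 2-by-2 matrix chosen arbitrarily for illustration.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# numpy.linalg.eig returns the eigenvalues and a matrix whose columns are
# the corresponding (unit-length) eigenvectors.
eigenvalues, eigenvectors = np.linalg.eig(A)

for lam, v in zip(eigenvalues, eigenvectors.T):
    # Verify the defining property A x = lambda x (up to floating-point error).
    print(f"lambda = {lam:.1f}, residual A@v - lambda*v = {A @ v - lam * v}")
```

For this matrix the eigenvalues are 3 and 1, and applying A to either eigenvector simply rescales it by the corresponding eigenvalue, as the near-zero residuals confirm.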

It is common to prefix the natural name for the kind of vector with eigen- rather than saying eigenvector: for example, eigenfunction if the eigenvector is a function, eigenmode if it is a harmonic mode, eigenstate if it is a quantum state, and so on. Similarly for the eigenvalue, e.g. eigenfrequency if the eigenvalue is (or determines) a frequency. (Full article...)


Subcategories



Topics in mathematics

General · Foundations · Number theory · Discrete mathematics
Algebra · Analysis · Geometry and topology · Applied mathematics

Index of mathematics articles


WikiProjects

The Mathematics WikiProject is the center for mathematics-related editing on Wikipedia. Join the discussion on the project's talk page.

In other Wikimedia projects

The following Wikimedia Foundation sister projects provide more on this subject:
