"The physicists — Stephen Parke of Fermi National Accelerator Laboratory, Xining Zhang of the University of Chicago and Peter Denton of Brookhaven National Laboratory — had arrived at the mathematical identity about two months earlier while grappling with the strange behavior of particles called neutrinos.
They’d noticed that hard-to-compute terms called “eigenvectors,” describing, in this case, the ways that neutrinos propagate through matter, were equal to combinations of terms called “eigenvalues,” which are far easier to compute.
Moreover, they realized that the relationship between eigenvectors and eigenvalues — ubiquitous objects in math, physics and engineering that have been studied since the 18th century — seemed to hold more generally.
Although the physicists could hardly believe they’d discovered a new fact about such bedrock math, they couldn’t find the relationship in any books or papers. So they took a chance and contacted Tao, despite a note on his website warning against such entreaties.
“To our surprise, he replied in under two hours saying he’d never seen this before,” Parke said.
A week and a half later, the physicists and Tao posted a paper online reporting the new formula.
Eigenvectors and eigenvalues are ubiquitous because they characterize linear transformations: operations that stretch, squeeze, rotate or otherwise change all parts of an object in the same way. These transformations are represented by rectangular arrays of numbers called matrices. One matrix might rotate an object by 90 degrees; another might flip it upside down and shrink it in half.
Matrices do this by changing an object’s “vectors” — mathematical arrows that point to each physical location in an object.
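To make that concrete, here is a minimal sketch (in Python with NumPy, my own illustration rather than anything from the article) of two such matrices acting on a vector: one rotates it by 90 degrees about the vertical axis, the other flips it upside down and shrinks it in half.

```python
import numpy as np

# A 3x3 matrix that rotates vectors by 90 degrees about the z-axis.
rotate_90 = np.array([[0.0, -1.0, 0.0],
                      [1.0,  0.0, 0.0],
                      [0.0,  0.0, 1.0]])

# A matrix that flips the vertical axis and shrinks everything by half.
flip_and_shrink = 0.5 * np.diag([1.0, 1.0, -1.0])

v = np.array([1.0, 2.0, 3.0])   # an arrow pointing to one location in the object
print(rotate_90 @ v)            # [-2.  1.  3.]   the location after the quarter turn
print(flip_and_shrink @ v)      # [ 0.5  1.  -1.5] flipped upside down and half the size
```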
A matrix’s eigenvectors — “own vectors” in German — are those vectors that stay aligned in the same direction when the matrix is applied. Take, for example, the matrix that rotates things by 90 degrees around the x-axis: The eigenvectors lie along the x-axis itself, since points falling along this line don’t rotate, even as everything rotates around them.
A related matrix might rotate objects around the x-axis and also shrink them in half. How much a matrix stretches or squeezes its eigenvectors is given by the corresponding eigenvalue — in this case, ½. (If an eigenvector doesn’t change at all, the eigenvalue is 1.)
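A quick numerical check of this example (again my own sketch with NumPy, not from the article): the matrix that rotates by 90 degrees about the x-axis and shrinks everything by half has exactly one real eigenvalue, 0.5, and its eigenvector points along the x-axis.

```python
import numpy as np

# Rotate 90 degrees about the x-axis, then shrink everything by half.
rotate_x_and_shrink = 0.5 * np.array([[1.0, 0.0,  0.0],
                                      [0.0, 0.0, -1.0],
                                      [0.0, 1.0,  0.0]])

eigenvalues, eigenvectors = np.linalg.eig(rotate_x_and_shrink)
print(eigenvalues)   # 0.5 plus a complex-conjugate pair 0.5j and -0.5j

# The eigenvector belonging to the real eigenvalue 0.5 lies along the x-axis,
# the one direction that stays aligned while everything else swings around it.
real_index = np.argmin(np.abs(eigenvalues - 0.5))
print(eigenvectors[:, real_index].real)   # roughly [1. 0. 0.] (up to sign)
```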
Eigenvectors and eigenvalues are independent, and normally they must be calculated separately starting from the rows and columns of the matrix itself. College students learn how to do this for simple matrices. But the new formula differs from existing methods. “What is remarkable about this identity is that at no point do you ever actually need to know any of the entries of the matrix to work out anything,” said Tao.
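As a textbook-style illustration of the usual route (my own example, not from the article): for the 2-by-2 matrix with rows (2, 1) and (1, 2), one first finds the eigenvalues by solving det(A - λI) = (2 - λ)^2 - 1 = 0, which gives λ = 1 and λ = 3, and then, in a separate step, solves (A - λI)v = 0 for each λ to get the eigenvectors (1, -1) and (1, 1). The new identity instead reads off eigenvector information directly from lists of eigenvalues.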
The identity applies to “Hermitian” matrices, which transform eigenvectors by real amounts (as opposed to those that involve imaginary numbers), and which thus apply in real-world situations. The formula expresses each eigenvector of a Hermitian matrix in terms of the matrix’s eigenvalues and those of the “minor matrix,” a smaller matrix formed by deleting a row and column of the original one.
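Concretely, the identity says that for an n-by-n Hermitian matrix A with eigenvalues λ_1, ..., λ_n and unit-length eigenvectors v_1, ..., v_n, the squared magnitude of the j-th entry of v_i, multiplied by the product of (λ_i - λ_k) over all k other than i, equals the product of (λ_i - μ_k) over the eigenvalues μ_k of the minor matrix formed by deleting row and column j. A minimal numerical check, using a random Hermitian matrix of my own choosing rather than anything from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
X = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
A = (X + X.conj().T) / 2                    # a random 4x4 Hermitian matrix

lam, V = np.linalg.eigh(A)                  # eigenvalues lam[i], eigenvectors in columns V[:, i]

i, j = 2, 1                                 # check entry j of eigenvector i
minor = np.delete(np.delete(A, j, axis=0), j, axis=1)
mu = np.linalg.eigvalsh(minor)              # eigenvalues of the minor matrix

lhs = abs(V[j, i])**2 * np.prod([lam[i] - lam[k] for k in range(n) if k != i])
rhs = np.prod(lam[i] - mu)
print(np.isclose(lhs, rhs))                 # True: the entry's size follows from eigenvalues alone
```

Note that no entry of A appears on either side of the final comparison; only the two lists of eigenvalues do.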
It’s unusual in mathematics for a tool to appear that’s not already associated with a problem, he said. But he thinks the relationship between eigenvectors and eigenvalues is bound to matter."
Quanta Magazine/Natalie Wolchover