New research from the University of Waterloo is making inroads on one of the biggest problems in theoretical computer science. The way to make progress, according to Cameron Seth, a PhD researcher working in the field of algorithmic approximation, is by breaking the problem down into smaller pieces.
"Everyone working in computer science and mathematics knows about the 'P vs. NP' problem," Seth says. "It's one of the notorious Millennium Prize Problems: so famous and so difficult that solving one will earn you a million dollars."
To understand the crux of the "P vs. NP" problem, imagine an enormous jigsaw puzzle or a Sudoku puzzle. It would be a 'P' problem if a computer could solve it relatively quickly, whereas it would be an 'NP' problem if it were extremely difficult to solve, but a provided solution could be quickly verified.
For example, a Sudoku puzzle might take someone a long time to solve - perhaps hours - but once a solution is provided, it only takes seconds to check that all the columns and rows have the right numbers.
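That asymmetry between solving and checking can be made concrete. The sketch below is a hypothetical illustration, not drawn from the research itself: verifying a completed 9x9 Sudoku grid takes one fast pass over its rows, columns, and boxes, while finding a solution from a blank or partial grid is a far harder search problem.

```python
def is_valid_sudoku(grid):
    """Verify a completed 9x9 grid: every row, column, and 3x3 box
    must contain the digits 1 through 9 exactly once."""
    target = set(range(1, 10))
    rows = [grid[r] for r in range(9)]
    cols = [[grid[r][c] for r in range(9)] for c in range(9)]
    boxes = [[grid[br + i][bc + j] for i in range(3) for j in range(3)]
             for br in range(0, 9, 3) for bc in range(0, 9, 3)]
    return all(set(unit) == target for unit in rows + cols + boxes)

# A known-valid grid built from a standard shifting pattern,
# used here only to exercise the checker.
solution = [[(r * 3 + r // 3 + c) % 9 + 1 for c in range(9)] for r in range(9)]
```

The checker runs in time proportional to the size of the grid; nothing in it hints at how to *construct* a solution, which is exactly the gap the P vs. NP question asks about.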
"With P vs. NP the question that's keeping everyone up at night is whether every solution that can be checked quickly is also a problem that can be solved quickly. Is every problem that's easy to verify also easy to solve?"
This lingering question has practical implications for significant research and development in cryptography, AI and optimization. The most common methods of encryption used for sensitive networks of all kinds rely on the assumption that there are problems that are extremely difficult to solve but easy to verify. That's the underlying logic for everything from online passwords to secure banking transfers.
Rather than tackling the problem head-on, Seth's research instead looks for approximate solutions to related problems.
"What I'm doing is looking at a series of smaller problems that are related to the broader P vs. NP problem. Essentially, what I'm asking is if we can solve these other related problems," Seth says.
His recent research in graph algorithms, for example, imagines a huge network with an enormous number of connections, such as one might find in a massive online social networking app. Seth carves out a smaller piece of the network and asks what that smaller piece of the puzzle can teach us about the whole.
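The sampling idea can be illustrated with a toy sketch. The code below is a hypothetical illustration of the general approach, not Seth's actual tester: it samples a handful of vertices from a large graph, brute-forces the largest independent set (a group of mutually unconnected nodes) inside that tiny induced subgraph, and uses the resulting fraction as a clue about the whole network.

```python
import random
from itertools import combinations

def sample_induced_subgraph(n, edges, k, seed=0):
    """Pick k of the n vertices uniformly at random and keep only
    the edges that fall entirely inside the sample."""
    rng = random.Random(seed)
    sample = rng.sample(range(n), k)
    keep = set(sample)
    induced = [(u, v) for (u, v) in edges if u in keep and v in keep]
    return sample, induced

def largest_independent_set(vertices, edges):
    """Brute-force search, feasible only because the sample is tiny."""
    edge_set = {frozenset(e) for e in edges}
    for size in range(len(vertices), 0, -1):
        for subset in combinations(vertices, size):
            if all(frozenset(pair) not in edge_set
                   for pair in combinations(subset, 2)):
                return size
    return 0

# Toy "social network": 60 users, where users 0-29 all know each other
# and users 30-59 know nobody (a planted independent set of half the graph).
n = 60
edges = [(u, v) for u in range(30) for v in range(u + 1, 30)]
sample, induced = sample_induced_subgraph(n, edges, k=12)
estimate = largest_independent_set(sample, induced) / len(sample)
```

The estimate from the 12-vertex sample approximates the true independent-set density of the full graph without ever reading most of its edges, which is the spirit of learning about the whole from a small piece.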
This technical innovation provides a combinatorial tool, which can then help solve complex optimization problems. Such tools reduce a huge number of possible combinations to a manageable subset. Seth's fundamental research is, in this sense, chipping away at these much more daunting problems for computer science.
"What I'm doing in my research is not exactly trying to find one solution, but to decide whether a near-solution exists, and what this might teach us about the whole set of problems like it."
The paper, "A Tolerant Independent Set Tester," was presented at the 2025 ACM Symposium on Theory of Computing (STOC 2025).