Centre for Forensic Neuroscience
April 12, 2026 • Polygraph Protocols

From Quill to Algorithm: The Surprising History of the Polygraph

By Dr. Keith Ashcroft

The modern polygraph, a mainstay of police procedurals and spy thrillers, is a device that has always lived on the edge of science and spectacle.

For decades, it was seen as a "black box"—a mysterious machine that could supposedly peer into the human soul by tracking a racing heart or a sweaty palm. But while the public imagination was captured by flickering needles on rolling paper, a quieter, more profound revolution was taking place. The truth is, the polygraph has evolved dramatically, transforming from an analog instrument prone to human error into a sophisticated digital system supercharged by powerful statistical reasoning. This is the story of how computerization and Bayesian theory brought the lie detector into the 21st century.

Part I: The Analog Origins and the "Black Box"

The polygraph's journey began in the early 20th century. The first major breakthrough came from an unlikely source: William Moulton Marston, a Harvard psychologist who would later achieve fame as the creator of the comic book character Wonder Woman. In the 1910s, Marston developed a test that measured systolic blood pressure, believing it correlated with deception. But it was John Larson, a Berkeley police officer and medical student, who in 1921 built the first true "polygraph" (meaning "many writings")—a device that could continuously record blood pressure, pulse, and respiration on a single piece of paper. Then came Leonarde Keeler, who in the late 1920s and 1930s refined the machine, making it portable and adding the critical component of measuring the galvanic skin response (GSR)—the electrical conductivity of sweat on the skin—which is a powerful indicator of emotional arousal.

For decades, this was the state of the art. An examiner would ask a series of questions while a set of needles traced jagged lines across a scrolling roll of paper. The process was as much art as science. The examiner would then manually "score" the chart, visually comparing the physiological reactions to different types of questions. This analog system had two major flaws. First, the interpretation was subjective, leading to potential bias and inconsistent results from one examiner to the next. Second, the conclusions were often presented as binary—the subject either "passed" or "failed"—which was a dangerous oversimplification.

Part II: The Digital Revolution in Deception Detection

The first major leap forward came with computerization. Starting in the 1990s, the clunky, ink-stained analog machines gave way to sleek, digital polygraph systems. This shift wasn't just about aesthetics; it was a fundamental change in capability. Digital sensors could capture physiological data with far greater precision, and computers could now be used to process that data algorithmically.

This led to the development of automated scoring systems like the Objective Scoring System (OSS-3) and PolyScore™. These algorithms, trained on vast datasets of confirmed truthful and deceptive examinations, were designed to remove the subjectivity of the human eye. Instead of an examiner guessing if one "blip" looked bigger than another, the software would apply a logistic regression formula or linear discriminant analysis to numerically calculate a probability of deception. Studies showed that these computer algorithms could match or even exceed the accuracy of human examiners, with some achieving accuracy rates between 85% and 92% in controlled conditions. The "black box" was becoming a transparent, data-driven instrument.
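To make the idea of algorithmic scoring concrete, here is a minimal sketch of how a logistic regression model turns physiological features into a probability of deception. The feature names, weights, and bias below are invented for illustration; they are not the actual parameters of OSS-3 or PolyScore™, whose validated feature sets and coefficients are proprietary or published elsewhere.

```python
import math

def deception_probability(features, weights, bias):
    """Logistic regression: map a weighted sum of physiological
    features to a probability of deception between 0 and 1."""
    z = bias + sum(w * x for w, x in zip(weights, features))
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical standardized reaction measurements for one relevant-vs-
# comparison question contrast: electrodermal response, cardiovascular
# amplitude change, and respiration line length (illustrative only).
features = [1.2, 0.8, -0.3]
weights = [1.5, 0.9, -1.1]   # assumed weights, not a real system's
bias = -0.5

p = deception_probability(features, weights, bias)
print(f"Estimated probability of deception: {p:.2f}")  # about 0.91
```

The point of the logistic form is exactly what the text describes: instead of an examiner judging whether one "blip" looks bigger than another, the model outputs a continuous, reproducible probability that can be compared across examiners and cases.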

Part III: The Bayesian Breakthrough—Context is Everything

As impressive as computerization was, it still left a critical question unanswered: What does a "positive" test result actually mean in the real world? This is where Bayesian theory entered the picture. A Bayesian approach forces us to think about probability in a more nuanced way. It recognizes that the result of a test (like a polygraph) is not an absolute truth. Its meaning is entirely dependent on the context—specifically, the base rate of the condition you're looking for.

Consider a simple example. Let's say a polygraph is 90% accurate—that is, it correctly flags 90% of deceptive subjects and correctly clears 90% of truthful ones. That sounds great. But imagine you use it to screen 10,000 people for a security risk, and only 1 in 1,000 (10 individuals) are actually guilty. You would expect the test to correctly identify 9 of the 10 guilty individuals. However, of the 9,990 innocent people, it would falsely label 10% (999 people) as deceptive. In this scenario, a "positive" polygraph result would be wrong more than 99% of the time! As the National Academies of Sciences notes, the trade-off between false positives and false negatives can be mathematically calculated using Bayes' theorem, and it's starkly different in a low-base-rate screening scenario than in a high-base-rate criminal investigation.
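The screening example above can be worked through directly with Bayes' theorem. The short sketch below assumes, as the example does, that the 90% figure applies both to catching the guilty (sensitivity) and to clearing the innocent (specificity):

```python
def positive_predictive_value(sensitivity, specificity, base_rate):
    """Bayes' theorem: P(guilty | positive test result)."""
    true_pos = sensitivity * base_rate              # guilty and flagged
    false_pos = (1 - specificity) * (1 - base_rate) # innocent but flagged
    return true_pos / (true_pos + false_pos)

population = 10_000
base_rate = 1 / 1_000   # 10 genuinely guilty individuals in 10,000

ppv = positive_predictive_value(0.90, 0.90, base_rate)

print(f"True positives:  {0.90 * base_rate * population:.0f}")        # 9
print(f"False positives: {0.10 * (1 - base_rate) * population:.0f}")  # 999
print(f"P(guilty | failed test) = {ppv:.3f}")                         # 0.009
```

A failed test here implies guilt with probability of only about 0.9%, which is the "wrong more than 99% of the time" figure in the text. Raising the base rate, as in a focused criminal investigation, raises that posterior probability dramatically.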

This statistical rigor is the final piece of the puzzle. A 2023 study applying a Bayesian approach to real-world polygraph data found that a deceptive outcome significantly increases the posterior probability of guilt, providing roughly three times more information than a layperson's judgment. However, another comprehensive Bayesian analysis highlighted that there is still a "high degree of uncertainty" around polygraph results, a fact that is often hidden when only a point estimate of accuracy is given.

Conclusion: The Future is a Probability

Today's polygraph is no longer the simple "lie detector" of popular lore. It is a complex instrument that has been revolutionized by two powerful forces: computerization, which brought objective, algorithmic scoring, and Bayesian theory, which brought the critical context of base rates to the interpretation of results. The modern examiner uses a computer not just to record data, but to run sophisticated pattern-recognition software that outputs a statistical likelihood of deception. And when that result is presented, it should be weighed using a Bayesian framework, acknowledging the inherent uncertainties.

The polygraph has evolved from a subjective analog gadget to a digital, data-driven tool for probabilistic assessment. And as the fields of artificial intelligence and machine learning continue to advance, the next chapter of this fascinating history—perhaps moving beyond the polygraph itself—is already being written.