Friday, September 27, 2013

Brains On Trial: Neuroscience Has Limited Use In The Courtroom, Scientists Say

In “Brains on Trial,” a PBS special hosted by actor Alan Alda that premiered earlier this month, brain-scanning techniques are used to probe the memories, facial recognition, emotions, biases and intentions of witnesses, jurors, judges and participants in a fictional convenience store robbery. The show offers a glimpse of how science and the law might intersect in the future.


But it's too soon for science to claim a victory against crime. Brain-based evidence is already used to establish the extent of brain injuries in workers’ compensation disputes and injury lawsuits, for example. Yet most neuroscientists concede that current knowledge of what happens amid all that gray matter is far from the point where we can distinguish truth from lies, or convict people based on the confounding clues our brains offer about what we think and what we have done.


Your Mouth Says No, But Your Brain Says Yes


Despite their ubiquity on TV cop shows, polygraphs – the “lie detectors” that measure changes in pulse, blood pressure and other physiological markers – are largely scientifically discredited. But neuroscience is offering a different, potentially better path for separating truth from falsehoods: using changes in brain activity to measure recognition.


For example, a suspect could be shown a barrage of faces, one of which might be a suspected accomplice to or victim of a crime. His brain activity during this exercise could give away his relative familiarity with these people. Or he could be presented with pictures of a variety of settings, including the scene of the crime. If the brain activity patterns indicate that the suspect recognizes some aspect of the crime, that might be an important clue.
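To make the idea concrete, here is a minimal sketch of how such a recognition test might be scored, using simulated data in place of real fMRI recordings. The voxel counts, the size of the “recognition” signal and the choice of classifier are all illustrative assumptions, not a description of any validated forensic procedure.

```python
# Illustrative sketch only: simulated "scans" stand in for real fMRI voxel
# patterns, and all sizes and signal strengths below are assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
N_VOXELS, N_TRIALS = 200, 100   # hypothetical voxels per scan, scans per condition

# Assume recognition adds a small mean shift to a subset of voxels.
signal = np.zeros(N_VOXELS)
signal[:20] = 0.5

familiar = rng.normal(0.0, 1.0, (N_TRIALS, N_VOXELS)) + signal
unfamiliar = rng.normal(0.0, 1.0, (N_TRIALS, N_VOXELS))

X = np.vstack([familiar, unfamiliar])
y = np.array([1] * N_TRIALS + [0] * N_TRIALS)   # 1 = familiar face

# Cross-validated accuracy: how often the activity pattern alone
# predicts whether the viewed face was familiar.
clf = LogisticRegression(max_iter=1000)
print(f"accuracy: {cross_val_score(clf, X, y, cv=5).mean():.2f}")  # well above 0.5 here
```

On this toy data the classifier performs well above chance, which is the kind of laboratory result the researchers describe; the open question is whether anything like it holds up outside the scanner room.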


However, the researchers studying the cerebral architecture of recognition are quick to point out its limitations in the courtroom. Anthony Wagner, a cognitive neuroscientist at Stanford University, has pondered the applicability of his field to the legal system for years. In a guide to neuroscience for judges [PDF], most of Wagner’s chapter outlines the limitations of research on lie detection with functional MRIs (known as fMRIs).


“I tend to be conservative and urge caution,” Wagner says.


Just as there are ways to beat polygraph tests (controlling one’s breathing and staying calm seem to be the most popular and effective), Wagner thinks there could be countermeasures that fool brain-scan-based lie detectors. The regions of the brain implicated in recognition and truth-telling will “light up” for other reasons as well – usually when a subject is making some sort of mental effort, such as the effort involved in trying to fool the test. So, if you want to beat a brain-scanning test, the trick might be as simple as doing arithmetic in your head to skew the results, or concentrating on something neutral – the taste of oatmeal, or the beige wallpaper in your dentist’s office – when you’re presented with an incriminating image or statement.


In a forthcoming paper, Wagner discusses the results of recent experiments that tested whether people could “beat the technology” of neuroimaging tests. He described the experiments at a meeting of the Cognitive Neuroscience Society this past April.


In the middle of a memory test, Wagner and his team paused the experiment and explained its purpose to participants – to see whether brain patterns could indicate that they were looking at a photograph of a familiar or an unfamiliar face. Then they told the subjects to try to think about a familiar face whenever they saw a new one, and to focus on some single aspect of a familiar face and try to see it as new. In the first half of the test, the scientists could use the brain patterns alone to correctly classify faces as novel or familiar to the subjects; in the second half, they couldn’t.
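Continuing the toy sketch above (again, every quantity here is an assumption, not Wagner’s actual data or analysis), one way to picture that result is to model countermeasures as erasing the class-specific signal, so a classifier that succeeded on the first half of the test falls to chance on the second:

```python
# Toy continuation of the earlier sketch: model countermeasures as removing
# the recognition signal. Not Wagner's actual data or analysis.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
N_VOXELS, N_TRIALS = 200, 100
signal = np.zeros(N_VOXELS)
signal[:20] = 0.5

def make_block(with_signal):
    """Simulate one block of familiar/unfamiliar scans."""
    fam = rng.normal(0.0, 1.0, (N_TRIALS, N_VOXELS))
    if with_signal:
        fam += signal                      # honest viewing: signal present
    unfam = rng.normal(0.0, 1.0, (N_TRIALS, N_VOXELS))
    X = np.vstack([fam, unfam])
    y = np.array([1] * N_TRIALS + [0] * N_TRIALS)
    return X, y

X_train, y_train = make_block(True)        # first half: no countermeasures
X_test1, y_test1 = make_block(True)
X_test2, y_test2 = make_block(False)       # second half: countermeasures applied

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"first half:  {clf.score(X_test1, y_test1):.2f}")   # well above chance
print(f"second half: {clf.score(X_test2, y_test2):.2f}")   # near 0.5
```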


Another complication, Wagner says, is that our memories aren’t statically coded in our brains – they can change over time. Your recollection of an event is what’s called an “episodic memory”; you might remember various details about last week’s trip to the bar, such as the color of the dress you were wearing, the bartender’s beard or the song playing when your date walked in. Then there’s “semantic memory,” which is more akin to knowledge; it’s the stuff that seems almost innate, like your mother’s face or the multiplication tables. It’s still unclear whether the two types of memory are stored in the same parts of the brain, or whether there are differences in how they might manifest on a brain scan.


“In courts, by the time an individual is testifying, that individual isn’t just having that moment of experience,” Wagner says. “By the time the case gets to court, those one-shot events aren’t one-shot events anymore” in the mind of the person, he adds.


Between the experience and the testimony, the subject will likely have had to repeat his account to multiple parties – lawyers, reporters, law enforcement – and in the process, an episodic memory might become more of a semantic one. This doesn’t necessarily bear on the truth of the memory, but it is possible that the semantic memory might be “coded” differently and stored in a different area of the brain from an episodic memory. In that case, the scan from an initial interrogation might not match a follow-up examination.


Other studies have found success in using fMRIs to distinguish false statements from true ones. But one problem with extrapolating from these studies is that, in this kind of controlled research, the lies are fed to the participants by the examiners. That hardly resembles real-world situations in which people are actually trying to cover up the truth, according to Massachusetts Institute of Technology neuroscientist Nancy Kanwisher.


“Saying something that’s false because someone told you to isn’t like telling a lie,” Kanwisher says.


Most scientists will argue that there would need to be some gold-standard study of fMRI lie detection in real-world situations before it could be admissible in court. But creating such a test would be extremely difficult, perhaps impossible, according to Kanwisher.


“There’d have to be real stakes,” Kanwisher says. “We’re not talking about $50 and a psychology experiment; we’re talking life in prison.”


A scientist would have to find people accused of very serious crimes and perform the experiment with the participants knowing, or at least believing, that the data could be used in a court case. The researchers would also have to have some extremely solid outside evidence that could establish whether the subject was lying or not. Not exactly feasible.


The difference between laboratory settings and the real world raises the risk of convicting an innocent person with neuroscience. If you stick a person in an fMRI scanner and ask her where she was on the night of November 17, a spike of brain activity in a certain area isn’t necessarily an indicator that her answer is a lie. She might be struggling to remember that night, or upset that she's been dragged into a police station. There’s a huge number of variables that could skew the results.


Even if there were some accurate neurological way of detecting lies, many would still consider it unethical. The American Civil Liberties Union has opposed polygraph machines since the 1950s, and not primarily based on how effective the technology is.


“Even if the polygraph were to pass an acceptable threshold of reliability, or a more accurate lie-detection technology were to come along, we would still oppose it because of the unacceptable violation of civil liberties it represents,” ACLU policy analyst Jay Stanley wrote in August 2012. “We view techniques for peering inside the human mind as a violation of the 4th and 5th Amendments, as well as a fundamental affront to human dignity.”


My Brain Made Me Do It


Judges, when they hand down a sentence, can consider how external factors like child abuse or emotional state may have affected a convicted criminal’s judgment. Neuroscience just might be a way of quantifying those external factors a bit more.


Bob Desimone, the director of the McGovern Institute for Brain Research at MIT, would like to remind you that 100 percent of human activity, whether it’s eating cheese or committing murder, stems from our brains. But it’s only in some rare cases that a person’s brain state is a truly mitigating factor. Where do we draw the line between someone who’s in control of his brain and someone whose brain is driving his behavior? It’s much easier to make the case for a brain-based defense when there’s some sort of obvious brain damage. Desimone points to a convicted pedophile whose aberrant sexual desires seemed to stem from an egg-sized brain tumor pressing on his right frontal lobe. When the tumor was removed, the patient’s urges vanished.


“Is it possible that people will raise the defense of ‘My brain made me do it’? Probably,” Desimone says. But only in rare instances, he notes.


Such pleas could be akin to more-advanced versions of the insanity defense. Desimone points out that, despite popular misconceptions, acquittal by reason of insanity occurs in only a tiny fraction of cases. (A 1991 study in the Journal of the American Academy of Psychiatry and the Law [PDF] of insanity pleas entered in eight states across the U.S. found the defense was used in less than 1 percent of all felony indictments. Of the defendants who pleaded insanity, only a little over a quarter were acquitted – which puts successful insanity acquittals at roughly a quarter of one percent of felony indictments overall.)


In the case of the pedophile, whether or not his tumor completely excuses his behavior is something of a philosophical question. Did the tumor alter the man’s behavior, or release the inhibitions that kept his existing desires in check? That’s not an easy question to answer with an fMRI machine.


“The biggest problem the average person has is not understanding the difference between a biological explanation and a biological excuse,” Desimone says.

