
More than 5 million people across the world started out life as a sperm and an egg in a petri dish. Yet for every in vitro fertilization success story, there have been at least as many failures. Today the procedure works about 40 percent of the time for women under 35; it gets worse the older you get. But researchers and companies are hoping that a set of more experimental methods will improve those odds by hacking biology itself.

Last summer, a 32-year-old Greek woman, who’d previously undergone two operations for endometriosis and four unsuccessful cycles of IVF, once again returned to the surgical table to have a thin needle threaded through her vagina to retrieve eggs from her ovaries. But unlike in her earlier IVF attempts, this time fertility specialists did not inseminate them with her partner’s sperm right away. Instead the doctors at the Institute of Life, in Athens, took a donor’s eggs, stripped them of their nuclei, and inserted the patient’s DNA in their place. Then the modified eggs were inseminated. The resulting embryos—a combination of genetic material from three people—were transferred to the Greek woman’s womb, leading to her first successful pregnancy.

She is now 28 weeks along with a baby boy, according to a Spanish company called Embryotools, which announced the pregnancy earlier this month. The fertility tech firm is collaborating with the Institute of Life to conduct the first known human trial of the procedure, called mitochondrial replacement therapy (MRT), for treating infertility. Their pilot study in Greece will eventually enroll 25 women under the age of 40 who’ve failed to conceive using conventional methods of IVF. It’s the largest test yet of the controversial new method of procreation.

Unlike conventional IVF, which is essentially a numbers game to get a viable embryo, MRT promises to actually improve the quality of older eggs, which can take on damage as they age. If it proves to be safe and effective—a big if—it could radically change women’s prospects of having children later in life.

Fertility doctors first started messing around with the idea for MRT in the late ’90s in clinics in New York and New Jersey on a hunch that some people struggle to get pregnant because of defects in the jelly-like cytoplasm of their eggs. By 2001, the technique, often dubbed “three-person IVF,” produced a reported 30 births. Shortly after, the US Food and Drug Administration stepped in with warning letters, abruptly bringing such work in the American infertility scene to a standstill.

From the FDA’s point of view, embryos created using MRT represent an abrupt departure from nature’s normal course. The agency claims that they should be regulated like a drug or gene therapy, because these new, untested genetic relationships pose a considerable risk. While the amount of donor DNA makes up just a tiny fraction of the resulting embryo—about 0.2 percent—the potential health impacts of having any amount of donor DNA are still poorly understood. In the US, that ignorance stems in part from the fact that scientists are prevented from using federal funds for research on embryos that could result in their harm or destruction.

Critics argue that it’s unethical to expose unborn children to these unknowns when infertile parents have other options for starting a family, such as egg donation and adoption. “The potential risks of this procedure for the babies are significant but unclear; its potential value in treating infertility is inconclusive,” says Stanford bioethicist Hank Greely, who wrote about MRT in his book The End of Sex. “For now, I wouldn’t do it.”

Where the case for MRT is more compelling (to ethicists and regulators) is in preventing mitochondrial diseases. Mitochondria, the structures that float in the cytoplasm providing power to human cells, have their own DNA, separate from the DNA coiled inside chromosomes. Mutations in mitochondrial DNA can lead to debilitating, often fatal conditions that affect about one in 6,500 people worldwide. Because babies inherit all their mitochondria from the female egg—sperm lose theirs during the act of reproduction—preventing mitochondrial disease could be as simple as swapping out one egg’s mitochondria for another’s. Studies in monkeys and human cell lines have mostly supported the idea, though in some worrying cases the donor mitochondria have been shown to revert back to the mutated form.

In February, British authorities granted doctors at Newcastle University the go-ahead to begin a study assessing how well MRT could help two women affected by mitochondrial diseases conceive healthy children. The UK is the first country to legalize the use of MRT, but only for women with heritable mitochondrial disease, and only under strict oversight. Australia is also considering legislation to approve the procedure in limited cases.

In the US, such trials are effectively banned. But that hasn’t stopped the most determined MRT defenders from trying it in places with looser laws.

In 2016, a New York-based infertility specialist named John Zhang reported using MRT to facilitate the birth of a healthy baby boy at a clinic in Mexico. Valery Zukin, a fertility doctor in Kiev, Ukraine, says he has used MRT in seven successful births since May 2017, with three more on the way. Zukin says he received approval for a five-year research program from Ukrainian health authorities, but he has not registered the trial with the European clinical trial database, and he is charging patients for the procedure: $8,000 for Ukrainians and $15,000 for foreigners. In December 2017 he formed a company with Zhang to make it easier for interested Americans to access the procedure in Ukraine.

Still, the lack of a rigorous trial leaves questions about how safe and effective the procedure really is. Those gaps in knowledge are what Nuno Costa-Borges, scientific director and cofounder of Embryotools, hopes to address in his Greek study. “The only thing missing from the debate is what happens to the babies,” Costa-Borges says. “There’s no other way of testing that than to transfer the embryos. But we need to do it in a strict, well-controlled study that is scientifically rigorous.”

Some critics may not be swayed by the study’s design, which won’t have a conventional control group. Embryotools is calling it a “pilot trial,” instead. The reason, Costa-Borges says, is that the women they have been recruiting have already failed conventional IVF many times before and may not have many chances left. Those unsuccessful IVF cycles will serve as the control group.

“Comparing to historical controls is better than nothing, but it’s not ideal,” says Paula Amato, an OB-GYN at Oregon Health and Science University, where much of the modern mitochondrial replacement therapy work has been pioneered by Shoukhrat Mitalipov. She says it’s always possible that some of these women might have gotten pregnant on their next round of IVF, even without MRT. But she applauds the Embryotools team for doing something to generate meaningful data. “In the fertility field, innovations have often been adopted prior to having evidence that it works, and that’s a problem.”

As in many countries, the laws in Spain and Greece aren’t exactly clear regarding the legality of mitochondrial replacement therapy. The procedure is neither explicitly prohibited nor approved. Costa-Borges says his team decided to conduct their trial in Greece because that’s where their long-time clinical partner, the Institute of Life, has its facilities. So far, the country has been playing along.

His team received approval in Greece at the end of 2016 but only began recruiting human patients last year after completing another battery of in vitro safety tests. “We are not rushing,” Costa-Borges says. “The technology has a lot of potential, but we want to move cautiously. We don’t want to create any false expectations.” So far, he says, his team has prepared the eggs from eight additional patients. Now that the first pregnancy has crossed into the third trimester, Costa-Borges says, his team is considering moving forward with the eight others.

In addition to showing that the early stages of MRT can be conducted safely, the technique’s proponents also need to assuage critics with longer-term data on how the children develop. To that end, Embryotools is working with a pediatric hospital in Greece to monitor the health of all the babies born from its study until they are 18 years old. The company is also exploring creating a registry of every child born using MRT technologies to help track their health outcomes over their lifespan as compared to naturally conceived babies. Such a database was never established for conventional IVF births, for legal and ethical reasons.

But given the raised stakes of such genetic alterations, the idea might gain traction this time around. Just as IVF redefined the biological boundaries of baby-making four decades ago, MRT is poised to write the next chapter in human reproductive history. Even at the current pace of MRT births, pretty soon it’s going to be easy to lose count.



On Monday, the buzz of machinery echoed through SpaceX’s Hawthorne-based manufacturing facility as SpaceX president Gwynne Shotwell introduced a quartet of astronauts, each decked out in NASA blues. Behind them, tucked inside a clean room, was their ticket to low-Earth orbit: SpaceX’s Crew Dragon, still naked without its stark white outer shell.

So far, every SpaceX Dragon capsule has only carried cargo to and from the International Space Station. But that will change when NASA’s Commercial Crew program launches its astronauts—the first to leave from US soil since 2011. The first Crew Dragon is set to take off in November as part of an uncrewed flight test, and if all goes according to plan, a crew of two astronauts—Doug Hurley and Bob Behnken—will launch to the ISS for a two-week stay in April 2019. The next team, Victor Glover and Mike Hopkins, will take off some time after that.

Now that the first two crews have been announced, Behnken and Hurley—both veteran shuttle pilots who have been working on the project since 2015—will begin training on the vehicle itself. Or at least a simulacrum of it: Part of that training will happen in a two-seater cockpit simulator, located just above the clean room.

SpaceX’s new cockpit design will take more onboarding than you think. NASA’s astronauts are used to the space shuttle’s vast array of more than 1,000 buttons and switches, but the crew will control the Dragon with the help of just three touch screen control panels and two rows of buttons. Touch screens in space, you say? Yes, really: The astronauts’ new spacesuits, a one-piece design that’s more wetsuit than pumpkin suit, also come with conductive leather gloves that will allow them to control the screens.

The displays will both provide the crew with orbital flight tracking and give them control over the craft. Though the vehicle is designed to be autonomous, crews will have the ability to manually fly the Dragon and fire thrusters for minor course corrections. After astronauts select commands on the touch screen, the analog buttons, shielded by a clear covering, will execute them. The buttons are also used to handle emergencies: One button under the far left panel extinguishes a fire, while a large pull-and-twist handle, located under the center screen and marked “EJECT,” arms the vehicle’s launch escape system.

Learning the control panel is just the beginning. While Dragon will have both autonomous systems and a ground crew as backup, its first crews will still have to be prepared for any scenario. That’s where SpaceX’s full-scale simulator comes into play. The replica located upstairs in the astronaut training area at the Hawthorne facility comes outfitted with seats, control panels, flight software, and life-support systems, allowing SpaceX crew trainers to put the astronauts through increasingly complex failures—who knows, maybe even their own version of the Kobayashi Maru.

Outside the cavernous rocket-building warehouse, SpaceX is working on another hallmark of its strategy: reusing more of its rocket’s components, in particular the payload fairing, also known as the nose cone. Tethered to a dock in the Port of Los Angeles, nestled among the many freighters and fishing vessels, is one of the more recent additions to SpaceX’s fleet: a boat named Mr. Steven. SpaceX aims to use the vessel to recover the fairings, historically a one-use component, as they navigate themselves back to Earth after separating from the rocket.

Each fairing—a $6 million piece of hardware—accounts for one tenth of the price of the entire Falcon 9 rocket, and SpaceX can save a bundle if it can scoop up the fairing before it lands in the ocean. Here’s where the aerospace company’s fleet of recovery vessels comes into play. Essentially a mobile catcher’s mitt, Mr. Steven is outfitted with a yellow net that spans nearly 40,000 square feet. So far, Mr. Steven’s recovery attempts have been unsuccessful, but on Monday, SpaceX conducted tests intended to help engineers better understand the properties of Mr. Steven’s net.

Visible in the net was one of the fairing’s two halves, attached to a crane that repeatedly lifted and lowered it to help engineers understand how the net behaves while loaded down. SpaceX wouldn’t want to catch a fairing, only to have it crash through the net and onto the ship’s deck.

Mr. Steven’s next trip out to sea will be in late September as SpaceX prepares to launch the Argentinian Earth-observing satellite SAOCOM 1A. There’s a lot riding on this launch: It will mark the company’s first attempted ground landing on the West Coast; all of its previous landings out of Vandenberg have touched down on one of the company’s drone ships. If SpaceX manages to recapture both the rocket booster and the fairing, it’ll save an estimated $37 million.

The Unknowability of the Next Global Epidemic


Disease X

n. A dire contagion requiring immediate attention—but which we don’t yet know about.

In 2013 a virus jumped from an animal to a child in a remote Guinean village. Three years later, more than 11,000 people in six countries were dead. Devastating—and Ebola was a well-studied disease. What may strike next, the World Health Organization fears, is something no doctor has ever heard of, let alone knows how to treat. It’s come to be known as Disease X.

Since René Descartes adopted the letter x to denote a variable in his 1637 treatise on geometry, it has suggested unknowability: the mysterious nature of x-rays, the uncertain values of Generation X, the conspiratorial fantasies of The X-Files. It’s also been used as code for experimental—in the names, for instance, of fighter jets and submarines. That’s an apt association: Disease X may leapfrog from animals to humans like Ebola, but it could instead be engineered in a lab by some rogue state.

Still, far from asking us to resign ourselves to an unpredictable future horror, Disease X is a warning to prepare for the worst possible scenario as best we can. It calls for nimble response teams (a critical failure in the Ebola epidemic) and broad-spectrum solutions. The WHO has solicited ideas for “platform technologies,” like plug-and-play systems that can create new vaccines in months instead of years. As Descartes showed us in mathematics, only by identifying an unknown can we begin to find an answer.

In a field at the edge of the University of Minnesota’s St. Paul campus, half a dozen students and lab technicians glance up at the darkening afternoon skies. The threatening rainstorm might bring relief from the 90-degree August heat, but it won’t help harvest all this wheat. Moving between the short rows, they cut out about 100 spiky heads, put them in a plastic container, and bring them back to a growling Vogel thresher parked at the edge of the plot. From there, they bag and label the grains before loading them in a truck to take back to James Anderson’s lab for analysis.

Inside those bags, the long-time wheat breeder is hoping to find wheat seeds free of a chalky white fungus, Fusarium head blight, that produces a poisonous toxin. He’s looking for new genes that could make wheat resistant to one of the most devastating plant diseases in the world. Anderson runs the university’s wheat breeding program, one of dozens in the US dedicated to improving the crop through generations of traditional breeding, and increasingly, with the aid of genetic technologies. Today his toolbox got a lot bigger.

In a Science report published Thursday, an international team of more than 200 researchers presents the first high-quality, complete sequence of the bread wheat genome. Like a physical map of the monstrous genome—wheat has five times more DNA than you do—the fully annotated sequence provides the location of over 107,000 genes and more than 4 million genetic markers across the plant’s 21 chromosomes. For a staple crop that feeds a third of the world’s population, it’s a milestone that may be on par with the day its domestication began 9,000 years ago.

“Having breeders take the information we’ve provided to develop varieties that are more adapted to local areas is really, we think, the foundation of feeding our population in the future,” says Kellye Eversole, the executive director of the International Wheat Genome Sequencing Consortium, the public-private research team that worked for more than a decade to complete the sequence. Eversole says the consortium, founded in 2005, set out to help improve crop traits for a changing world.

Breeding programs like Anderson’s are constantly on the hunt for wheat strains that will meet the needs of farmers facing tough economic and environmental realities. A 2011 study in Science showed that rising temperatures are already causing declines in wheat production. More recent research in Nature suggests that trend is only going to get worse, with a 5 percent decline in wheat yields for every one degree Fahrenheit uptick.

So what kinds of traits make for better wheat? The ability to grow in hotter climates is a plus. And disease resistance is nice. But farmers have other priorities, too. “When selecting a variety they’re looking at yield first, lodging resistance second, and protein content third,” says Anderson. Lodging is when a wheat stalk gets bent over, collapsing under its own weight. Stalk strength is one way to counteract that. But breeders have to be careful to balance those traits with others, like nutritional composition. “We’re trying to build disease resistance into a total package that’s going to be attractive to a grower,” says Anderson.

Building that total package is still a slow, labor-intensive process. Breeders painstakingly pluck out pollen-producing parts from each tiny “spikelet” on a wheat stem so they can fertilize each one with pollen from plants with other desirable traits, repeating that process thousands of times each year. Then they screen and select for traits they want, which requires testing how well thousands of individual plants perform over the growing season. In Anderson’s lab, which focuses on Fusarium head blight resistance, that means spraying test field plots with fungal spores and seeing which ones don’t die. It’s only in the last three years that he’s used gene sequencing technologies to help produce more survivors.

The plot his crew was harvesting on Tuesday was what’s called a “training population”: 500 individual plants selected to represent a larger group of 2,500 that have had their genomes sequenced. By combining the genetic data with their field performance data, Anderson can make better predictions about which plants will have the best disease resistance, and which genetic backgrounds will confer that trait.
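
That prediction step is, at its core, a regression from marker genotypes to field performance. The sketch below is only a minimal illustration of the idea, using made-up numbers and a plain ridge penalty; it is not Anderson’s actual model or software, and every figure in it (marker counts, penalty, resistance scores) is assumed for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training population: 500 plants scored in the field for
# blight resistance, each genotyped at 2,000 markers (0/1/2 alternate alleles).
n_train, n_markers = 500, 2000
X_train = rng.integers(0, 3, size=(n_train, n_markers)).astype(float)
y_train = rng.normal(size=n_train)  # stand-in for measured resistance scores

# Ridge regression: shrink marker effects so thousands of markers can be
# estimated from a few hundred phenotyped plants.
lam = 10.0
beta = np.linalg.solve(X_train.T @ X_train + lam * np.eye(n_markers),
                       X_train.T @ y_train)

# Predict resistance for the larger set of sequenced but unscored plants,
# then keep the most promising candidates for the next breeding cycle.
X_candidates = rng.integers(0, 3, size=(2500, n_markers)).astype(float)
predicted = X_candidates @ beta
top_candidates = np.argsort(predicted)[-20:]
print(top_candidates)
```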

To line up those gene sequences, graduate students in Anderson’s lab used an earlier, rougher version of the reference genome, which the IWGSC published in 2014. “That’s made it much easier to identify good DNA markers that serve as tags for genes that we’re interested in tracking,” says Anderson. The sequence published today, which covers 94 percent of the genome, as opposed to 61 percent, will be even more useful for tying specific traits to specific genes and starting to make tweaks.

The question then becomes, how exactly to make those tweaks? Unlike corn, soybean, and canola, no one has yet commercialized genetically modified wheat. Anderson at one time was working with Monsanto on a Roundup-Ready wheat, but the multinational dropped out of the partnership in 2004. “It was mostly cultural, and trade-related,” says Anderson. “The world wasn’t ready for it at the time, and it’s probably still not ready for it.” Some of the largest historical importers of US wheat—Europe and Japan—have been the most hostile to genetically modified foods.

“Bread is the heart of the meal,” says Dan Voytas, a fellow University of Minnesota plant scientist and the co-founder of the gene-editing agricultural company Calyxt. “It’s kind of sacred, in the public perception.” Calyxt is among a bumper crop of start-ups racing to bring the first gene-edited products to market; it’s growing a new high-fiber wheat in its sealed greenhouses, located just a few miles away from Anderson’s test plots. Newer technologies like Crispr, zinc fingers, and TALENs don’t yet face the same cultural resistance or as much red tape as first-generation GMOs. A USDA ruling in March of this year declared the agency would mostly decline to regulate gene-edited plants, provided they didn’t introduce any wildly distant genetic material.

The new reference genome promises to accelerate the development of new genetically engineered products, in addition to the crops coming out of traditional breeding programs. But as with most genomic work, the new wheat map has plenty of room for more detail. So far it’s mostly been manually annotated—meaning researchers went through by hand and added all the information they know about where genes are and how they function. Technology could help speed that up: The IWGSC is looking for support to fund a deep learning annotation pilot project. “We’re in a totally different time where we can try some of these high-risk, high-reward approaches that might give us 80 or 90 percent of the information and then we can go in and fill the gaps,” says Eversole. With its whopping 14.5 billion base pairs, the wheat genome might actually require some non-human intelligence to help it reach its full potential.

There was a time many years ago when cars guzzled gas like beer, teenagers raced them on Friday nights, and Detroit automakers boasted about their vehicles' ever-increasing horsepower and speed. Since then, cars have become safer, cleaner and more efficient, mostly as a result of tougher standards from Washington.

A new Trump administration proposal might bring that half a century of vehicular progress screeching to a halt, some experts say, by shrugging its collective shoulders at the growing danger of climate change and the fuel-efficiency standards designed to combat it. The White House wants to freeze future auto emissions standards and ban California from making its own tougher rules for carbon emissions from vehicles.

First, a look at the numbers. A little-noticed report issued by the National Highway Traffic Safety Administration (NHTSA) predicts that the Earth’s temperature will rise a whopping 7° Fahrenheit (4° Celsius) by 2100, assuming that little or nothing is done to reverse emissions of carbon dioxide and other greenhouse gases. The current Paris climate accords call for nations to pledge to keep warming below 3.6° F (2° C) by century’s end.

The Trump administration’s climate change scenario would likely entail catastrophic melting of ice sheets in Greenland and Antarctica, causing rising sea levels that would flood low-lying coastal areas from Maine to Texas—not to mention warmer oceans that could spawn ever-stronger hurricanes alongside pockets of inland drought, and a collapse of agriculture in many areas.

The NHTSA report came up with these doomsday numbers to argue that automobile and truck tailpipe emissions after 2020 will have such a small global impact on overall greenhouse gases that it's not worth tightening the screws on Detroit automakers. “What they are saying is we are going to hell anyhow, what difference does it make if we go a little faster,” says David Pettit, a senior attorney at the Natural Resources Defense Council. “That’s their theory of how they are dealing with greenhouse gas emissions.”

Pettit and others say the NHTSA report and the Trump administration’s proposal to roll back future tailpipe emissions standards would allow Detroit to build bigger, thirstier cars than would have been permitted under President Obama-era rules. Pettit notes that he has gone from driving a 7-miles-per-gallon Chrysler in the late 1960s to a Chevy Bolt today, largely as the result of stricter federal standards that require automakers to sell clean cars alongside their SUVs and trucks.

In addition to throwing up its hands at climate change, the Trump administration also argues that continuing to increase fuel economy requirements will make the overall vehicle fleet less safe, because people will continue to drive older cars longer than they otherwise would. The argument is that the higher price tags on more fuel-efficient cars will deter consumers from buying new vehicles equipped with more advanced technology that also improves safety. But Giorgio Rizzoni, director of the Center for Automotive Research at Ohio State University, says the administration has it backwards. His study of the past 40 years concludes that safety and fuel efficiency have grown at the same time.

If the Trump administration rules are passed, American carbuyers might end up seeing vehicles with less advanced technology on the dealer lot than overseas buyers, says Austin Brown, executive director of the UC Davis Policy Institute for Energy, Environment and the Economy. “The cars would look the same on the outside, but they would burn more gasoline, cost more money and create more emissions,” he says. That's because US cars with weaker fuel standards won't be sold on worldwide markets, he adds.

The Trump administration held public meetings on the proposal this week in Fresno, California; Dearborn, Michigan; and Pittsburgh. The deadline for written comments is October 23.

A Clever and Simple Robot Hand


If you want to survive the robot apocalypse—the nerd joke goes—just close the door. For all that they’re great at (precision, speed, consistency), robots still suck at manipulating door handles, among other basic tasks. Part of the problem is that they have to navigate a world built for humans, designed for hands like ours. And those are among the most complex mechanical structures in nature.

Relief for the machines, though, is in sight. Researchers at the University of Pisa and the Italian Institute of Technology have developed a stunningly simple, yet stunningly capable robotic hand, known as the SoftHand 2, that operates with just two motors. Compare that to the Shadow Dexterous Hand, which is hypnotizingly skillful, but also has 20 motors. The SoftHand promises to help robots get a grip at a fraction of the price.

Like other robot hands out there, the SoftHand uses “tendons,” aka cables, to tug on the fingers. But they’re arranged in a fundamentally different way. Instead of a bunch of cables running to individual fingers, it uses just one cable that snakes through pulleys in each finger. Which gives it a bit less dexterity, but also cuts down on cost and power usage. And that’s just fine: There’s no such thing as a one-technique-fits-all robotic manipulator. More complex robot hands will undoubtedly have their place in certain use cases, as might SoftHand.

To create this hand, the researchers originally built a simpler SoftHand with just one motor. “The idea is that when you turn the motor, the length of the tendon shrinks and in this way you force the hand to close,” says roboticist Cosimo Della Santina, who helped develop the system.

Let out the tendon and the fingers once again unfurl into a flat palm, thanks to elasticity in the joints. It works great if you want to, say, grip a ball. But because the fingers move more or less in unison, fine manipulation isn’t possible.

By adding one more motor, SoftHand 2 ups the dexterity significantly. Take a look at the images above. Each end of the tendon—which still snakes through all the fingers—is attached to one of two motors in the wrist. If you move the motors in the same direction, the tendon shortens, and you get the gestures in the top row: A, B, C, and D. Same principle as the original SoftHand.

But run the motors in opposite directions, and something more complex unravels in E, F, G, and H. In this case, one motor lets out the tendon, while the other reels it in. “If you have a tendon moving through a lot of pulleys, the tension of the tendon is not constant,” says Della Santina.

If one motor is pulling, the tension on that end of the tendon will be higher. If the other is letting out the tendon, the tension on that end will be lower. By exploiting tension this way, the SoftHand requires far fewer cables than your typical robotic hand, yet can still get all those fingers a-wiggling.
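
Put another way, the two motors act like two control knobs: their common motion sets how much of the shared tendon is reeled in (overall closure), while their differential motion shifts tension toward one end, biasing which fingers flex most. The toy model below is only a cartoon of that mapping; the real hand’s mechanics and controller aren’t described at this level of detail, so the weighting scheme here is invented purely for illustration.

```python
# Toy illustration of a two-motor, single-tendon hand (not the SoftHand 2's
# actual controller). motor_a and motor_b are signed spool rotations;
# positive values reel the tendon in.

def finger_flexion(motor_a: float, motor_b: float, n_fingers: int = 5):
    closure = (motor_a + motor_b) / 2.0   # common mode: total tendon take-up
    bias = (motor_a - motor_b) / 2.0      # differential mode: tension imbalance
    flexions = []
    for i in range(n_fingers):
        # Fingers near the high-tension end flex more when bias > 0,
        # fingers near the opposite end flex more when bias < 0.
        end_weight = 1.0 - 2.0 * i / (n_fingers - 1)  # +1 at one end, -1 at the other
        flexions.append(max(0.0, closure + bias * end_weight))
    return flexions

print(finger_flexion(1.0, 1.0))    # motors together: all fingers close evenly
print(finger_flexion(1.0, -1.0))   # motors opposed: flexion skews toward one side
```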

Take a look at the GIF above and you can see the difference an extra motor makes. That’s one motor in the hand on the left, and two in the hand on the right. The former sort of brute-forces it, collapsing all its fingers around the ball. The latter, though, can more deliberately pinch the ball, thanks to the differences in tension of the tendon. Same principle below with the bank note.

Given that it’s working with just two motors, SoftHand can pull off an impressive array of maneuvers. It can extend an index finger to unlatch a toolbox or slide a piece of paper off a table. It can even unscrew a jar. All of it on the (relative) cheap. Because lots of motors = lots of money.

“For robots to learn and do cool stuff, we need cheap, reliable, and complex systems,” says Carnegie Mellon University roboticist Lerrel Pinto, who works on robot manipulation. “I think their hand strikes this balance,” he adds, but the real test is whether other researchers find uses for it. “Can it be used to autonomously learn? Is it reliable and robust over thousands of grasps? These questions remain unanswered.”

So SoftHand has promise, but more complicated robotic manipulators like the Shadow Dexterous Hand still have lots to offer. The SoftHand might be good for stereotyped behaviors, like unscrewing jars, while the Shadow and its many actuators might adapt better to more intricate tasks.

Fist bumps, though? Leave that to old Softie.

Police departments around the country are getting increasingly comfortable using DNA from non-criminal databases in the pursuit of criminal cases. On Tuesday, investigators in North and South Carolina announced that a public genealogy website had helped them identify two bodies found decades ago on opposite sides of the state line as a mother and son; the boy’s father, who is currently serving time on an unrelated charge, has reportedly confessed to the crime. It was just the latest in a string of nearly two dozen cold cases cracked open by the technique—called genetic genealogy—in the past nine months.

This powerful new method for tracking potential suspects through forests of family trees has been made possible, in part, by the booming popularity of consumer DNA tests. The two largest testing providers, Ancestry and 23andMe, have policies in place to prevent law enforcement agencies from directly accessing the genetic data of their millions of customers. Yet both companies make it possible for customers to download a digital copy of their DNA and upload the file to public databases where it becomes available to police. Searches conducted on these open genetic troves aren’t currently regulated by any laws.

But that might not be true for much longer, at least in Maryland. Last month, the state’s House of Delegates introduced a bill that would ban police officers from using any DNA database to look for people who might be biologically related to a strand of offending, unknown DNA left behind at a crime scene. If it passes, Maryland investigators would no longer have access to the technique first made famous for its role in cracking the Golden State Killer case.

Maryland has been a leader in genetic privacy since 2008, when the state banned the practice of so-called “familial searches.” This method involves comparing crime scene DNA with genetic registries of convicted felons and arrestees, in an attempt to identify not only suspects but their relatives. Privacy advocates argue that this practice turns family members into “genetic informants,” a violation of the Fourth Amendment. A handful of other states, including California, have also reined in the practice. But only Maryland and the District of Columbia outlawed familial search outright.

“Everyday law enforcement should never trump the Constitution,” says delegate Charles Sydnor III, an attorney and two-term Democrat from Baltimore. Sydnor is sponsoring the current bill, which would expand Maryland’s protections of its residents’ DNA even further. “If the state doesn’t want law enforcement searching databases full of its criminals, why would it allow the same kind of search conducted on citizens who haven’t committed any crimes?”

But opponents of House Bill 30 dispute that the two methods share anything in common. At a hearing for the proposed law in late January, Chevy Chase police chief John Fitzgerald, speaking on behalf of Maryland chiefs and sheriffs, called the bill a “mistake” that would tie investigators’ hands. “A search is a government intrusion into a person’s reasonable expectation of privacy,” he said. Because public databases house DNA from people who have freely consented to its use, as opposed to being compelled by police, there can be no expectation of privacy, said Fitzgerald. “Therefore, there is no search.”

Here is where it might help to have a better idea of how genetic genealogy works. Some police departments enlist the help of skilled sleuths like Barbara Rae-Venter, who worked on both the Golden State Killer and recent North and South Carolina murder cases. But most hire a Virginia-based company called Parabon. Until last spring, Parabon was best known for its work turning unknown DNA into forensic sketches. In May it began recruiting people skilled in the art of family-tree-building to form a unit devoted to offering genetic genealogy services to law enforcement.

The method involves extracting DNA from a crime scene sample and creating a digital file made up of a few hundred thousand letters of genetic code. That file is then uploaded to GEDMatch, a public warehouse of more than a million voluntarily uploaded DNA files from hobby genealogists trying to find a birth parent or long-lost relative. GEDMatch’s algorithms hunt through the database, looking for any shared segments of DNA and adding them up. The more DNA shared between the crime scene sample and any matches, the closer the relationship.
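
The “adding them up” step is worth making concrete: matching engines total the lengths of the DNA segments two profiles share, usually measured in centimorgans, and a larger total implies a closer relationship. The sketch below mimics that bookkeeping with invented segment lengths and deliberately coarse thresholds; GEDMatch’s real algorithms work on raw genotype files and use far more careful statistics than this.

```python
# Illustrative only: the segment lengths and relationship bands below are
# made up and much coarser than anything a real matching service would use.

def total_shared_cm(shared_segments, min_segment=7.0):
    """Sum shared segment lengths (centimorgans), ignoring tiny segments
    that are often spurious matches."""
    return sum(seg for seg in shared_segments if seg >= min_segment)

def rough_relationship(total_cm):
    # Very rough bands; real relationship ranges overlap heavily.
    if total_cm > 2300:
        return "parent/child or full sibling"
    if total_cm > 1300:
        return "grandparent, aunt/uncle, or half sibling"
    if total_cm > 500:
        return "roughly first-cousin range"
    if total_cm > 90:
        return "roughly second- to third-cousin range"
    if total_cm > 0:
        return "distant cousin"
    return "no meaningful match"

# Hypothetical shared segments between a crime-scene profile and one database user.
segments = [41.2, 23.5, 12.8, 9.1, 6.0]
total = total_shared_cm(segments)
print(f"{total:.1f} cM shared -> {rough_relationship(total)}")
```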

Parabon’s genealogists take that list of names and, using public records like the US Census, birth and death certificates, newspaper clippings, and social media, build out family networks that can include many thousands of individuals. They then narrow down the list to a smaller cohort of likely suspects, which they pass on to their law enforcement clients. In both genetic genealogy and familial search, these lists of relatives generated by shared DNA are treated as leads, for police to investigate further using conventional detective work.

It’s easy to understand why law enforcement agencies in Maryland would want to halt the bill in its tracks. Last year, Parabon helped police in Montgomery and Anne Arundel Counties arrest suspects in two cold cases—a home invasion that turned deadly and a serial rapist who targeted elderly victims. The company declined to disclose how many open cases it is currently pursuing with the state, saying only that it has working relationships with a number of police departments across Maryland. The cost of each case varies, based on the number of hours Parabon’s genealogists put into the search, but on average it runs about $5,000.

Parabon’s CEO, Steven Armentrout, who also spoke out against the bill at the hearing last month, suggests that forensic genetic genealogy is no different than police knocking on doors. “A lead is a lead, whether it’s generated by a phone tip or security camera footage or a consenting individual in a public DNA database.” When police canvass a neighborhood after a crime has taken place, some people will decline to answer, while others will speak freely. Some might speak so freely that they implicate one of their neighbors. “How is this any different?” Armentrout asks.

The difference, say privacy advocates, is that genetic genealogy has the potential to ensnare many more innocent people in a net of police suspicion based solely on their unalterable biology. Today, more than 60 percent of Americans of European ancestry can be identified using open genetic genealogy databases, regardless of whether they’ve ever consented to a DNA test themselves. Experts estimate it will be only a few years before the same will hold true for everyone residing in the US.

“There isn’t anything resembling consent, because the scope of information you can glean from these types of genetic databases is so extensive,” says Erin Murphy, a law professor at New York University. Using just a criminal database, a DNA search would merely add up stutters of junk DNA, much like identifying the whorls on a pair of fingerprints. DNA in databases like GEDMatch, however, can tell someone what color eyes you have, or if you have a higher-than-average risk of certain kinds of cancer. The same properties that make that kind of data so much better at accurately identifying distant relatives also make it far more sensitive in the hands of police.

Until very recently, GEDMatch was police investigators’ only source of consumer DNA data. Companies like 23andMe and Ancestry have policies to rebuff requests by law enforcement. But last week Buzzfeed revealed that another large testing firm, Family Tree DNA, has been working with the FBI since last fall to test crime scene samples. The arrangement marked the first time a commercial company has voluntarily cooperated with authorities. The news came as a shock to Family Tree DNA customers, who were not notified that the company’s terms of service had changed.

“That’s why a bill like this is so important,” says Murphy. Because the fine print can change at any time, she argues that demanding more transparency from companies or more vigilance from consumers is insufficient. She also points out that the proposed legislation should actually encourage Marylanders to engage in recreational genomics, because they can worry less about the prying eyes of law enforcement.

But despite Maryland’s history, HB 30 faces an uphill battle. In part, that’s because the ban this time around has been introduced as a stand-alone measure. The 2008 prohibition was folded into a larger package that expanded DNA collection from just-convicted felons to anyone arrested on suspicion of a violent crime. The other hurdle is that in 2008 there were not yet any well-publicized familial search success stories. With resolutions to several high-profile cold cases, forensic genetic genealogy has already captured the public imagination.

Delegate Sydnor, who comes from a law enforcement family—his father is a probation officer, and he has one uncle who is a homicide detective and another who is an FBI agent—says he wants to catch criminals as much as anyone. He just wants to do it the right way. “DNA is not a fingerprint,” he says. “A fingerprint ends with you. DNA extends beyond you to your past, present, and future. Before we decide if this is the route we really want to take, citizens and policymakers have to have a frank and honest conversation about what we’re really signing up for.” Over the next few months, that’s exactly what Marylanders will do.

If you want to watch sunrise from the national park at the top of Mount Haleakala, the volcano that makes up around 75 percent of the island of Maui, you have to make a reservation. At 10,023 feet, the summit provides a spectacular—and very popular, ticket-controlled—view.

Just about a mile down the road from the visitors center sits “Science City,” where civilian and military telescopes curl around the road, their domes bubbling up toward the sky. Like the park’s visitors, they’re looking out beyond Earth’s atmosphere—toward the sun, satellites, asteroids, or distant galaxies. And one of them, called the Panoramic Survey Telescope and Rapid Response System, or Pan-STARRS, just released the biggest digital astro data set ever, amounting to 1.6 petabytes, the equivalent of around 500,000 HD movies.

From its start in 2010, Pan-STARRS has been watching the 75 percent of the sky it can see from its perch and recording cosmic states and changes on its 1.4 billion-pixel camera. It even discovered the strange 'Oumuamua, the interstellar object that a Harvard astronomer has suggested could be an alien spaceship. Now, as of late January, anyone can access all of those observations, which contain phenomena astronomers don’t yet know about and that—hey, who knows—you could beat them to discovering.

Big surveys like this one, which watch swaths of sky agnostically rather than homing in on specific stuff, represent a big chunk of modern astronomy. They are an efficient, pseudo-egalitarian way to collect data, uncover the unexpected, and allow for discovery long after the lens cap closes. With better computing power, astronomers can see the universe not just as it was and is but also as it's changing, by comparing, say, how a given part of the sky looks on Tuesday to how it looks on Wednesday. Pan-STARRS's latest data dump, in particular, gives everyone access to the in-process cosmos, opening up the "time domain" to all earthlings with a good internet connection.

Pan-STARRS, like all projects, was once just an idea. It started around the turn of this century, when astronomers Nick Kaiser, John Tonry, and Gerry Luppino at Hawaii’s Institute for Astronomy suggested that relatively “modest” telescopes—hooked to huge cameras—were the best way to image large skyfields.

Today, that idea has morphed into Pan-STARRS, a many-pixeled instrument attached to a 1.8-meter telescope (big optical telescopes may measure around 10 meters). It takes multiple images of each part of the sky to show how it’s changing. Over the course of four years, Pan-STARRS imaged the heavens above 12 times, using five different filters. These pictures may show supernovae flaring up and dimming back down, active galaxies whose centers glare as their black holes digest material, and strange bursts from cataclysmic events. “When you visit the same piece of sky again and again, you can recognize, ‘Oh, this galaxy has a new star in it that was not there when we were there a year or three months ago,’” says Rick White, an astronomer at the Space Telescope Science Institute, which hosts Pan-STARRS’s archive. In this way, Pan-STARRS is a forerunner of the massive Large Synoptic Survey Telescope, or LSST, which will snap 800 panoramic images every evening with a 3.2-billion-pixel camera, capturing the whole sky twice a week.
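
What White describes is, computationally, image differencing: line up two exposures of the same field, subtract one from the other, and anything that appeared, vanished, or brightened shows up as a strong residual. Here is a bare-bones sketch of that idea on synthetic data; a real survey pipeline also has to align the images, match their blurring, and calibrate their brightness, none of which is shown here.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two hypothetical, already-aligned exposures of the same patch of sky.
epoch1 = rng.normal(100.0, 5.0, size=(256, 256))       # sky background + noise
epoch2 = epoch1 + rng.normal(0.0, 5.0, size=(256, 256))
epoch2[120, 80] += 400.0                                # inject a new transient source

# Subtract and flag pixels whose residual stands far above the noise.
diff = epoch2 - epoch1
threshold = 10.0 * diff.std()
ys, xs = np.nonzero(diff > threshold)
for y, x in zip(ys, xs):
    print(f"candidate transient at pixel ({y}, {x}), residual {diff[y, x]:.0f} counts")
```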

Plus, by comparing bright dots that move between images, astronomers can uncover closer-by objects, like rocks whose path might sweep uncomfortably close to Earth.

That latter part is interesting not just to scientists but also to the military. “It’s considered a defense function to find asteroids that might cause us to go extinct,” White says. That's (at least part of) why the Air Force, which also operates a satellite-tracking system on Haleakala, pushed $60 million into Pan-STARRS’s development. NASA, the state of Hawaii, a consortium of scientists, and some private donations ponied up the rest.

But when the telescope first got to work, its operations hit some snags. Its initial images were about half as sharp as they should have been, because the system that adjusted the telescope’s mirror to make up for distortions wasn’t working right.

Also, the Air Force redacted parts of the sky. It used software called Magic to detect streaks of light that might be satellites (including the US government's own). Magic masked those streaks, essentially placing a dead-pixel black bar across that section of sky, to “prevent the determination of any orbital element of the artificial satellite before the images left the [Institute for Astronomy] servers,” according to a recent paper by the Pan-STARRS group. The article says the Air Force dropped the requirement in December 2011. The magic was gone, and the scientists reprocessed the original raw data, removing the black boxes.

The first tranche of data, from the world’s most substantial digital sky survey, came in December 2016. It was full of stars, galaxies, space rocks, and strangeness. The telescope and its associated scientists have already found an eponymous comet, crafted a 3D model of the Milky Way’s dust, unearthed way-old active galaxies, and spotted everyone’s favorite probably-not-an-alien-spaceship, ’Oumuamua.

The real deal, though, entered the world late last month, when astronomers publicly released and put online all the individual snapshots, including auto-generated catalogs of some 800 million objects. With that data set, astronomers and regular people everywhere (once they've read a fair number of help-me files) can check out a patch of sky and see how it evolved as time marched on. The curious can do more of the “time domain” science Pan-STARRS was made for: catching explosions, watching rocks, and squinting at unexplained bursts.

Pan-STARRS might never have gotten its observations online if NASA hadn't seen its own future in the observatory's massive data pileup. That 1.6-petabyte archive is now housed at the Space Telescope Science Institute in Maryland, in a repository called the Mikulski Archive for Space Telescopes. The institute is also the home of bytes from Hubble, Kepler, GALEX, and 15 other missions, mostly belonging to NASA. “At the beginning they didn’t have any commitment to release the data publicly,” White says. “It’s such a large quantity they didn’t think they could manage to do it.” The institute, though, welcomed this outsider data in part so it could learn how to deal with such huge quantities.

The hope is that Pan-STARRS’s freely available data will make a big contribution to astronomy. Just look at the discoveries people publish using Hubble data, White says. “The majority of papers being published are from archival data, by scientists that have no connection to the original observations,” he says. That, he believes, will hold true for Pan-STARRS too.

But surveys are beautiful not just because they can be shared online. They’re also A+ because their observations aren’t narrow. In much of astronomy, scientists look at specific objects in specific ways at specific times. Maybe they zoom in on the magnetic field of pulsar J1745–2900, or the hydrogen gas in the farthest reaches of the Milky Way’s Perseus arm, or that one alien spaceship rock. Those observations are perfect for that individual astronomer to learn about that field, arm, or ship—but they’re not as great for anything or anyone else. Surveys, on the other hand, serve everyone.

“The Sloan Digital Sky Survey set the standard for these huge survey projects,” says White. Sloan, which started operations in 2000, is on its fourth iteration, collecting light with telescopes at Apache Point Observatory in New Mexico and Las Campanas Observatory in Northern Chile. From the early universe to the modern state of the Milky Way’s union, Sloan data has painted a full-on portrait of the universe that, like those creepy Renaissance portraits, will stick around for years to come.

Over in a different part of New Mexico, on the high Plains of San Agustin, radio astronomers recently set the Very Large Array’s sights on a new survey. Having started in 2017, the Very Large Array Sky Survey is still at the beginning of its seven years of operation. But astronomers don't have to wait for it to finish its observations, as happened with the first Pan-STARRS survey. “Within several days of the data coming off the telescope, the images are available to everybody,” says Brian Kent, who since 2012 has worked on the software that processes the data. That's no small task: For every four hours of skywatching, the telescope spits out 300 gigabytes, which the software then has to make useful and usable. “You have to put the collective smarts of the astronomers into the software,” he says.

Kent is excited about the same kinds of time-domain discoveries as White is: about seeing the universe at work rather than as a set of static images. Including the chronological dimension is hot in astronomy right now, from these surveys to future instruments like the LSST and the massive Square Kilometre Array, a radio telescope that will spread across two continents.

By watching for quick changes in their observations, astronomers have sought and found comets, asteroids, supernovae, fast radio bursts, and gamma-ray bursts. As they keep capturing a cosmos that evolves, moves, and bursts forth—not one trapped forever in whatever pose they found it—who knows what else they'll unearth.

Kent, though, is also psyched about the idea of bringing the universe to more people, through the regular internet and more formal initiatives, such as one that has, among other projects, helped train students from the University of the West Indies and the University of the Virgin Islands to dig into the data.

“There’s tons of data to go around,” says White. “And there’s more data than one person can do anything with. It allows people who might not use the telescope facilities to be able to be brought into the observatory.”

No reservations required.



The laser is a tool of many talents, as the Nobel committee well knows. On Tuesday morning in Stockholm, its members announced the year’s physics prize and rattled off a short list of the technologies it has made possible: barcodes, eye surgery, cancer treatment, welding, cutting materials more precisely than a scalpel. They failed to acknowledge the whimsy it has brought cat owners, although one committee member did mention laser light shows.

The laser’s resume keeps growing. This year, the Nobel committee awarded the prize in physics to three scientists who invented two groundbreaking ways to use them: Arthur Ashkin for developing a technique for grabbing and studying microscopic objects, known as optical tweezers, and Donna Strickland and Gérard Mourou for inventing a method that now allows scientists to produce intense pulses of light that, for a billionth of a billionth of a second, contain more power than the entire U.S. electricity grid. These laser techniques have transformed medical procedures, manufacturing, and biology research, the committee said. The three will share the 9 million kronor (about $1 million) prize, with Ashkin receiving half, and Strickland and Mourou splitting the other half.

At 96, Ashkin is the oldest ever Nobel recipient. He developed optical tweezers in 1970, while he was a scientist working at Bell Labs. He found that, if you focus a laser in a specific way, you can create a sweet spot in the beam where certain microscopic beads can reside harmlessly and motionlessly. By affixing proteins or other tiny biological objects onto the bead, you can precisely steer and prod them. Since Ashkin’s invention, scientists have trapped and played with individual viruses, bacteria, proteins, DNA, and more using the tweezers. They allow scientists to grab and sort individual cells, for example, and to watch cells called phagocytes engulf bacteria to keep humans healthy. Scientists have even used the tweezers to measure the forces during mitosis—a cell dividing into two.

In particular, optical tweezers let scientists study the springiness and bendiness of biological structures, says physicist Michelle Wang of Cornell University. The tweezers are delicate enough to stretch a DNA molecule. Wang uses optical tweezers to study the twisting motion of motor proteins, which are structures that move molecules and other objects around the body.

Strickland and Mourou’s invention—a technique known as chirped pulse amplification—allows scientists to amplify laser light into the petawatts, which is more power than a trillion solar panels under direct sunlight. Prior to their innovation in 1985, this level of intensity was impossible. The beam was so powerful that it would destroy parts of the laser itself, says physicist Arvinder Sandhu of the University of Arizona. Strickland, who now works at the University of Waterloo, and Mourou of the École Polytechnique near Paris, figured out how to first mellow out the laser pulse—deliver the photons in a slow stream—and then squish them together into a sharp burst in another part of the laser that could handle the intensity.
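
The logic behind stretching first is simple arithmetic: peak power is pulse energy divided by pulse duration, so spreading the same energy over a pulse thousands of times longer drops the peak power by the same factor while the light passes through the amplifier, and recompressing it afterward brings the peak power back up. A back-of-the-envelope illustration, with numbers chosen only to show the scaling rather than to describe Strickland and Mourou’s actual setup:

```python
# Peak power = pulse energy / pulse duration.
energy_joules = 1.0        # assumed pulse energy
compressed_s = 30e-15      # an assumed 30-femtosecond compressed pulse
stretch_factor = 10_000    # chirped stretching before amplification

peak_stretched = energy_joules / (compressed_s * stretch_factor)  # ~3e9 W
peak_compressed = energy_joules / compressed_s                    # ~3e13 W

print(f"stretched pulse peak power:  {peak_stretched:.1e} W (safe to amplify)")
print(f"recompressed peak power:     {peak_compressed:.1e} W (tens of terawatts)")
```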

The lasers don’t emit continuously at such high power; instead, this level of intensity endures as briefly as a billionth of a billionth of a second (an “attosecond”). These pulses are especially useful because they can precisely cut away materials like biological tissue without damaging its surroundings. That’s why they’re used in corrective eye surgery, says Sandhu. Similarly, industrial processes use it to carefully cut construction materials—“just like it machines the eye,” said Strickland during the prize announcement.

Like the optical tweezers, these short pulses can also be used to observe microscopic processes. Sandhu points the short bursts at exotic new materials, using it somewhat like a camera. In particular, he is interested in studying what electrons do inside a material, as they determine the material’s bulk properties, such as its electrical conductivity, magnetism, and melting point. The intensity of the light rips electrons off atoms in the material so that Sandhu can study their behavior. Short pulses provide a benefit like a fast shutter speed: the shorter the pulse, the more clearly you can capture what the electrons are doing in slow motion.

This year’s award is especially notable because the Nobel committee recognized both Mourou and Strickland, says Sandhu. When they developed the technique, Strickland was Mourou’s graduate student, a demographic whose achievements have historically been overlooked by the Nobel committee. “Junior researchers play an important role in physics,” he says. “It’s good to see that it’s not just advisers being recognized.”

It’s also “very heartening” that the committee gave the prize to a woman physicist this year, says Sandhu. Strickland is the first woman to receive the physics Nobel in 55 years. Only three women—Marie Curie, Maria Goeppert-Mayer, and now Strickland—have won the Nobel Prize in physics. “Is that all? Really?” said Strickland during the announcement. That brings the total fraction of female physics laureates to about 1.5 percent. For comparison, the American Physical Society found that in 2017, women earned 21 percent of undergraduate physics degrees.

“Obviously, we need to celebrate women physicists because we’re out there,” said Strickland. “Hopefully in time it’ll start moving forward at a faster rate.” At the very least, shortly after the announcement, someone finally made Strickland a page on Wikipedia.

When Charles Darwin articulated his theory of evolution by natural selection in On the Origin of Species in 1859, he focused on adaptations—the changes that enable organisms to survive in new or changing environments. Selection for favorable adaptations, he suggested, allowed ancient ancestral forms to gradually diversify into countless species.

That concept was so powerful that we might assume evolution is all about adaptation. So it can be surprising to learn that for half a century, a prevailing view in scholarly circles has been that it’s not.

Selection isn’t in doubt, but many scientists have argued that most evolutionary changes appear at the level of the genome and are essentially random and neutral. Adaptive changes groomed by natural selection might indeed sculpt a fin into a primitive foot, they said, but those changes make only a small contribution to the evolutionary process, in which the composition of DNA varies most often without any real consequences.

But now some scientists are pushing back against this idea, known as neutral theory, saying that genomes show much more evidence of evolved adaptation than the theory would dictate. This debate is important because it affects our understanding of the mechanisms that generate biodiversity, our inferences about how the sizes of natural populations have changed over time and our ability to reconstruct the evolutionary history of species (including our own). What lies in the future might be a new era that draws from the best of neutral theory while also recognizing the real, empirically supported influence of selection.

An ‘Appreciable Fraction’ of Variation

Darwin’s core insight was that organisms with disadvantageous traits would slowly be weeded out through negative (or purifying) selection, while those with advantageous features would reproduce more often and pass those features on to the next generation (positive selection). Selection would help to spread and refine those valuable traits. For most of the first half of the 20th century, population geneticists largely attributed genetic differences between populations and species to adaptation through positive selection.

But in 1968, the famed population geneticist Motoo Kimura resisted the adaptationist perspective with his neutral theory of molecular evolution. In a nutshell, he argued that an “appreciable fraction” of the genetic variation within and between species is the result of genetic drift — that is, the effects of randomness in a finite population — rather than natural selection, and that most of these differences have no functional consequences for survival and reproduction.

The following year, the biologists Jack Lester King and Thomas Jukes published “Non-Darwinian Evolution,” an article that likewise emphasized the importance of random genetic changes in the course of evolution. A polarized debate subsequently emerged between the new neutralists and the more traditional adaptationists. Although everyone agreed that purifying selection would weed out deleterious mutations, the neutralists were convinced that genetic drift accounts for most differences between populations or species, whereas the adaptationists credited them to positive selection for adaptive traits.

Much of the debate has hinged on exactly what Kimura meant by “appreciable fraction” of genetic variation, according to Jeffrey Townsend, a biostatistician and professor of evolutionary biology at the Yale School of Public Health. “Is that 50 percent? Is it 5 percent, 0.5 percent? I don’t know,” he said. Because Kimura’s original statement of the theory was qualitative rather than quantitative, “his theory could not be invalidated by later data.”

Nevertheless, neutral theory was rapidly adopted by many biologists. This was partly a result of Kimura’s reputation as one of the most prominent theoretical population geneticists of the time, but it also helped that the mathematics of the theory was relatively simple and intuitive. “One of the reasons for the popularity of the neutral theory was that it made things a lot easier,” said Andrew Kern, a population geneticist now at the University of Oregon, who contributed an article with Matthew Hahn, a population geneticist at Indiana University, to a special issue of Molecular Biology and Evolution celebrating the 50th anniversary of neutral theory.

To apply a neutral model of evolution to a population, Hahn explained, you don’t have to know how strong selection is, how large the population is, whether mutations are dominant or recessive, or whether mutations interact with other mutations. In neutral theory, “all of those very hard parameters to estimate go away.”

The only key input required by the neutral model is the product of the population size and the mutation rate per generation. From this information, the neutral model can predict how the frequency of mutations in the population will change over time. Because of its simplicity, many researchers adopted the neutral model as a convenient “null model,” or default explanation for the patterns of genetic variation they observed.
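
To make that logic concrete, here is a toy Wright-Fisher simulation in Python, the textbook flavor of drift model that the neutral null hypothesis describes; the population sizes are arbitrary, and the code illustrates the idea rather than any specific analysis by these researchers.

```python
# Toy Wright-Fisher model of neutral genetic drift: each generation, 2N gene
# copies are drawn at random from the previous generation, so a neutral
# allele's frequency wanders by chance alone until it is lost or fixed.
import numpy as np

rng = np.random.default_rng(42)

def drift_trajectory(n_individuals, start_freq, max_generations):
    """Frequency of one neutral allele in a diploid population of constant size."""
    n_copies = 2 * n_individuals
    freq, history = start_freq, [start_freq]
    for _ in range(max_generations):
        freq = rng.binomial(n_copies, freq) / n_copies   # pure sampling noise, no selection
        history.append(freq)
        if freq in (0.0, 1.0):                           # allele lost or fixed
            break
    return history

# Drift is stronger in small populations: the same allele, started at 50%,
# is quickly fixed or lost when N is small but barely moves when N is large.
for n in (100, 10_000):
    finals = [round(drift_trajectory(n, 0.5, 1_000)[-1], 3) for _ in range(5)]
    print(f"N = {n:>6}: frequencies after up to 1,000 generations: {finals}")
```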

Some population geneticists were not convinced by Kimura’s argument, however. For instance, John Gillespie, a theoretical population geneticist at the University of California, Davis (and Kern’s doctoral adviser), showed in the early 1970s that some natural selection-based models could explain patterns observed in nature as well as neutral models, if not better.

More fundamentally, even when there aren’t enough data to disprove a neutral-theory null model, it doesn’t mean that natural selection isn’t happening, said Rebekah Rogers, an evolutionary geneticist at the University of North Carolina, Charlotte. “Any time you have limited data, the arguments get really fierce,” she said.

For decades, that was the crux of the problem: Kimura had proposed neutral theory at a time before inexpensive sequencing technology and the polymerase chain reaction became available, when gene sequence data were sparse. There was no simple way to broadly prove or disprove its tenets except on theoretical grounds because we didn’t know enough about genomic variation to resolve the dispute.

Strong Feelings About Neutrality

Today, 50 years after Kimura’s article, more affordable genomic sequencing and sophisticated statistical methods are allowing evolutionary theorists to make headway on quantifying the contribution of adaptive variation and neutral evolution to species differences. In species like humans and fruit flies, the data have revealed extensive selection and adaptation, which has led to strong pushback against Kimura’s original idea, at least by some researchers.

“The ubiquity of adaptive variation both within and between species means that a more comprehensive theory of molecular evolution must be sought,” Kern and Hahn wrote in their recent article.

Although the vast majority of researchers agree that strict neutrality as originally formulated is false, many also point out that refinements of neutral theory have addressed weaknesses in it. One of the original shortcomings was that neutral theory could not explain the varying patterns of genome evolution observed among species with different population sizes. For instance, species with smaller population sizes have on average more mutations that are deleterious.

To address this, one of Kimura’s students, Tomoko Ohta, now professor emeritus at Japan’s National Institute of Genetics, proposed the nearly neutral theory of molecular evolution in 1973. This modified version of neutral theory suggests that many mutations are not strictly neutral, but slightly deleterious. Ohta argued that if population sizes are large enough, purifying selection will purge them of even slightly deleterious mutations. In small populations, however, purifying selection is less effective and allows slightly deleterious mutations to behave neutrally.
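
The quantitative core of that argument is the interplay between the selection coefficient s and the population size N. A short sketch using Kimura’s standard diffusion approximation for the fixation probability of a new mutation (assuming additive selection and an effective population size equal to the census size; the numbers are purely illustrative) shows how the same slightly deleterious mutation behaves almost neutrally in a small population yet is efficiently purged from a large one.

```python
# Fixation probability of a new mutation with additive selection coefficient s
# in a diploid population of size N (Kimura's diffusion approximation).
import math

def fixation_probability(s, n):
    """P(fixation) for a new mutation starting at frequency 1/(2N), assuming Ne = N."""
    if s == 0:
        return 1 / (2 * n)                                    # strictly neutral limit
    return (1 - math.exp(-2 * s)) / (1 - math.exp(-4 * n * s))

s = -1e-4  # a slightly deleterious mutation
for n in (1_000, 10_000, 1_000_000):
    p_neutral = 1 / (2 * n)
    p = fixation_probability(s, n)
    print(f"N = {n:>9,}: P(fix) = {p:.2e}  "
          f"(neutral expectation {p_neutral:.2e}, ratio {p / p_neutral:.2f})")
```

When N times s is much smaller than 1, the mutation’s fate is governed by chance, just as in the strictly neutral case; once N times s is large, purifying selection almost never lets it reach fixation.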

Nearly neutral theory also had problems, Kern said: It did not explain, for example, why the rate of evolution varies as observed among different lineages of organisms. In response to such challenges, Ohta and Hidenori Tachida, now a professor of biology at Kyushu University, developed yet another variation of the nearly neutral model in 1990.

Opinions about the standing of nearly neutral theory can still differ sharply. “The predictions of nearly neutral theory have been confirmed very well,” said Jianzhi Zhang, who studies the evolution of genomes at the University of Michigan and also contributed to the special issue of Molecular Biology and Evolution.

Kern and Hahn disagree: Nearly neutral theory “didn’t explain much from the start and then was shuffled around in attempts to save an appealing idea from the harsh glare of data,” Kern wrote in an email.

How Much Evolves Neutrally?

For Townsend, the ongoing debate between neutralists and selectionists isn’t particularly fruitful. Instead, he said, “it’s just a quantitative question of how much selection is going on. And that includes some sites that are completely neutral, and some sites that are moderately selected, and some sites that are really strongly selected. There’s a whole distribution there.”

When Townsend first started studying cancer about a decade ago after training as an evolutionary biologist, he saw that cancer biologists had begun to study mutations at a level of detail that could reveal information about mutation rates at individual sites in the genome. That’s precious information that most population geneticists don’t get from the wild populations they study. Yet few cancer biologists study natural selection, and that’s what Townsend has brought to the cancer field with his background in evolutionary biology.

In a paper published in late October in the Journal of the National Cancer Institute, Townsend and his Yale colleagues presented the results of their evolutionary analysis of mutations in cancers. “What we’ve been able to do is actually quantify, on a site-by-site basis, what the selection intensities of different mutations are,” he said. Cancer cells are rife with mutations, but only a small subset of those are functionally important to the cancer. The selection intensities reveal how important the different mutations are for driving growth in an individual case of cancer—and therefore which ones would be most promising as therapeutic targets.

“This quantification of the selection intensity is absolutely essential, I think, to guiding how we treat cancer,” Townsend said. “My point is, doctors today are encountering the question: Which drug should I give to this patient? And they don’t have a quantification of how important the mutations that those drugs target actually are.” Someday, Townsend hopes, this evolutionary framework will offer a genetic basis for choosing the right drug and even predicting how a specific tumor might develop resistance to a treatment.

Although identifying the mutations under the strongest selection is clearly useful, selection can also have subtle but consequential indirect effects on regions of the genome neighboring the target of selection.

The first hint of these indirect effects came in the 1980s and ’90s with the advent of the polymerase chain reaction, a technique that enabled researchers to look at nucleotide-level variation in gene sequences for the first time. One thing they discovered was an apparent correlation between the level of genetic variation and the rate of recombination at any specified region of the genome.

Recombination is a process in which the maternal and paternal copies of chromosomes exchange blocks of DNA with each other during meiosis, the production of sperm and egg cells. These recombinations shuffle genetic variation throughout the genome, splitting up alleles that might have previously been together.

By 2005, researchers could get whole-genome data from a variety of organisms, and they started to find this apparent correlation between levels of genetic variation and the rates of recombination everywhere, Kern said. That correlation meant that forces beyond direct purifying selection and neutral drift were creating differences in levels of variation across the genomic landscape.

Kern argues that the differences in the rates of recombination across the genome reveal a phenomenon called genetic hitchhiking. When beneficial alleles are closely linked to neighboring neutral mutations, natural selection tends to act on all of them as a unit.

Genetic hitchhiking meant that evolutionary geneticists suddenly had a whole new force called linked selection to worry about, Kern said. If only 10 percent of the genome is under direct selection in a population, then linked selection means that a much larger percentage—maybe 30 or 40 percent—might show its effects.

And if that is true, then selection for adaptive variants indirectly shapes neighboring genomic regions, leading to “a situation where neutral alleles have their frequencies determined by more than genetic drift, and instead have a new layer of stochasticity induced by selection,” Kern explained by email: Linked selection would produce more variance between generations than one would expect under neutrality.
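
A toy simulation helps make hitchhiking tangible. The sketch below assumes a deliberately simplified haploid Wright-Fisher population with complete linkage (no recombination) and made-up parameters: a neutral marker allele that happens to start on the same chromosomes as a new beneficial mutation gets dragged from 10 percent to nearly 100 percent frequency as the beneficial allele sweeps, a far larger change than drift alone would produce over the same stretch of time.

```python
# Toy hitchhiking sketch: two fully linked loci in a haploid Wright-Fisher
# population. One locus carries a beneficial mutation under positive selection;
# the other carries a neutral "marker" allele that sweeps along with it.
import numpy as np

rng = np.random.default_rng(7)
N = 10_000   # haploid population size
s = 0.05     # selective advantage of the beneficial allele

# Haplotype counts keyed by (has beneficial allele, has neutral marker).
# The marker starts at 10% overall; the beneficial allele starts at 1%,
# with every copy of it sitting on a marker-bearing chromosome.
counts = {(0, 0): 9_000, (0, 1): 900, (1, 1): 100, (1, 0): 0}

def marker_frequency(c):
    return (c[(0, 1)] + c[(1, 1)]) / N

start = marker_frequency(counts)
for generation in range(1, 401):
    haps = list(counts)
    sizes = np.array([counts[h] for h in haps], dtype=float)
    fitness = np.array([1 + s * h[0] for h in haps])      # selection acts only on the beneficial locus
    probs = sizes * fitness / (sizes * fitness).sum()
    counts = dict(zip(haps, rng.multinomial(N, probs)))   # drift: resample N offspring
    if counts[(1, 0)] + counts[(1, 1)] in (0, N):         # sweep lost or fixed
        break

print(f"neutral marker frequency: {start:.2f} -> {marker_frequency(counts):.2f} "
      f"after {generation} generations, purely through linkage")
```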

Zhang points out that linked neutral mutations are still neutral. They might be hitchhiking with beneficial alleles, but that linkage is random—they could as easily be linked to harmful alleles and weeded out through “background selection.” So the neutral mutations’ fate is still determined by chance.

Kern agrees: The neutral mutations are still neutral—but they are not behaving as neutral theory would predict. Purifying selection at linked sites, he wrote, would “add noise to allele frequencies beyond drift,” while background selection and hitchhiking would lead to less genetic variation than under neutrality.

Neutral Models and Human Evolution

“While neutral models have without doubt begat tremendous theoretical fruits … the explanatory power of the neutral theory has never been exceptional,” Kern and Hahn wrote in their paper. “Five decades after its proposal, in the age of cheap genome sequencing and tremendous population genomic data sets, the explanatory power of the neutral theory looks even worse.”

In humans, recent evidence suggests “there’s a lot more adaptation than we ever thought was present,” Kern said. Recent human evolution is largely a history of migrations to new geographical locations where humans encountered new climates and pathogens to which they had to adapt. In 2017, Kern published a paper showing that most human adaptations arose from existing genetic variation within the genome, not novel mutations that spread rapidly through the population.

Even so, only about 1 percent of the human genome actually codes for proteins, said Omar Cornejo, an evolutionary genomicist at Washington State University. Maybe about 20 percent of the genome regulates when and where those coding regions are expressed. But that still leaves about 80 percent of the genome with unknown function.

Parts of this noncoding portion of the genome are riddled with repetitive DNA sequences, caused by transposable genetic elements, or transposons, that copy and insert themselves throughout the genome. According to Irina Arkhipova, a molecular evolutionary geneticist who studies the role of transposons at the Marine Biological Laboratory at the University of Chicago, “this portion of the genome is quintessentially neutral in Kimura’s sense,” even if some fraction of those transposons do affect the expression of genes. Because of this, neutral models applied to the nonfunctional regions of the genome can be used to infer the demographic history of human populations (and a variety of other organisms) quite accurately, Cornejo said.

Kern disagrees. “I’d argue we have no idea if we are accurately estimating human demographic history,” he wrote in an email. If you computationally simulate a population evolving neutrally, then methods for estimating demography will work; but introduce linked selection, and those methods fail.

Kern is agnostic about what percentage of the human genome is functional, but he believes that genetic linkage touches a large—yet unknown—fraction of the genome. With the accumulating evidence for adaptation in the human genome, it seems likely that some large fraction of the genome would be subject to the effects of linked selection, he suggested. “We just don’t know how large that fraction is.”

A recent paper in eLife by Fanny Pouyet and her computational-geneticist colleagues at the University of Bern and the Swiss Institute of Bioinformatics attempts to pin down that number. “Up to 80-85 percent of the human genome is probably affected by background selection,” the authors wrote.

After they additionally accounted for biased changes in genes that recombination can introduce during DNA repair, they concluded that less than 5 percent of the human genome evolved by chance alone. As the editors of eLife noted in their summary of the paper, “This suggests that while most of our genetic material is formed of non-functional sequences, the vast majority of it evolves indirectly under some type of selection.”

This estimate may creep even higher as biologists learn to recognize more subtle hints of selection. The new frontier in population genomics focuses on traits such as height, skin color and blood pressure (among many others) that are polygenic, meaning that they result from hundreds or thousands of genes acting in concert. Selection for greater height, for instance, requires piling up changes at many dispersed genes to have an effect. Similarly, when farmers select strains of corn for higher yields, the impact generally shows up in many genes simultaneously.

But detecting polygenic adaptations in natural populations is a “very tricky business,” according to Kern, because those multitudes of genes are likely to be interacting in complex, nonlinear ways. The statistical methods for spotting those suites of changes are only beginning to be developed. To Kern, it will involve learning to appreciate “a whole other flavor” of adaptation because it will involve many small changes in individual mutation frequencies that collectively matter to natural selection.

In other words, it’s yet another non-neutral mechanism affecting genome evolution. As useful as the neutral theory has been in its various forms over the past half-century, the future of evolutionary theory may well depend on finding ever-better ways to do the hard work of figuring out exactly how—and how much—selection is inexorably shaping our genomes after all.

Original story reprinted with permission from Quanta Magazine, an editorially independent publication of the Simons Foundation whose mission is to enhance public understanding of science by covering research developments and trends in mathematics and the physical and life sciences.