Dogs Pass a Smell Test: Locating Ancient Buried Human Remains

(p. D1) On a sunny summer day in Croatia several years ago, an archaeologist and two dog handlers watched as two dogs, one after another, slowly worked their way across the rocky top of a wind-scoured ridge overlooking the Adriatic Sea.

. . .

Panda, a Belgian Malinois with a “sensitive nose,” according to her handler, Andrea Pintar, had begun exploring the circular leftovers of a tomb when she suddenly froze, her nose pointed toward a stone burial chest. This was her signal that she had located the scent of human remains.

Ms. Pintar said the hair on her arms rose. “I was skeptical, and I was like, ‘She is kidding me,’” she recalled thinking about her dog that day.

Archaeologists had found fragments of human bone and teeth in the chest, but these had been removed months earlier for analysis and radiocarbon dating. All that was left was a bit of dirt, the stone slabs of the tomb and the cracked limestone of the ridge.

. . .

(p. D6) . . . the experiment in Croatia marked the start of one of the most careful inquiries yet carried out of an unusual archaeological method. If such dogs could successfully locate the burial sites of mass executions, dating from World War II through the conflicts in the Balkans in the 1990s, might they be effective in helping archaeologists find truly ancient burials?

. . .

That “test run” was the beginning of a careful study on whether human-remains detection dogs could be an asset to archaeologists. Setting up a controlled study was difficult. Dr. Glavaš had to learn the scientific literature, such as scent theory, far outside the standard confines of archaeology; the same was true for Ms. Pintar and the field of archaeology.

. . .

“I think dogs are really capable of this, but I think it’s a logistical challenge,” said Adee Schoon, a scent-detection-animal expert from the Netherlands who was not involved in the study. “It’s not something you can replicate again and again. It’s hard to train.”

And, as Dr. Schoon noted, dogs are “great anomaly detectors.” Something as subtle as recently disturbed soil can elicit a false alert from a dog that is not rigorously trained.

Nonetheless, the team returned to the necropolis for the first controlled tests in September 2015, and again a full year later. Both times, they used all four of Ms. Pintar and Mr. Nikolić’s cadaver dogs: Panda, Mali, a third Belgian Malinois and a German shepherd. They worked them on both known and double-blind searches, in areas where nobody knew if tombs were located.

The dogs located four tombs new to the archaeologists. Dr. Glavaš had suspected that a fifth site might hold a burial chest, and the dogs’ alerts, combined with excavation, proved her suspicion correct.

In September 2019, the Journal of Archaeological Method and Theory published the results of their study: “This research has demonstrated that HRD dogs are able to detect very small amounts of specific human decomposition odor as well as to indicate to considerably older burials than previously assumed,” Dr. Glavaš and Ms. Pintar wrote.

Dr. Schoon, who researches and helps create protocols to train scent-detection animals worldwide, said the Iron Age necropolis study was nicely designed and “really controlled.”

. . .

Cadaver dogs are also helping archaeologists at some especially challenging sites. Mike Russo and Jeff Shanks, archaeologists with the National Park Service’s Southeast Archeological Center, had created at least 14 test holes near a promising site in northwest Florida that had been flattened during an earlier era of less diligent archaeology. They found nothing.

“We knew where it should be, but when we went there, there was absolutely no mound,” Mr. Russo said.

They then asked Suzi Goodhope, a longtime cadaver-dog handler in Florida, to bring her experienced detection dog, Shiraz, a Belgian Malinois, to the site in 2013. Shiraz and Ms. Goodhope worked the flat, brushy area for a long time. Then, Shiraz sat. Once.

“I was pretty skeptical,” Mr. Shanks said.

Nonetheless, the archaeologists dug. And dug. They went down nearly three feet — and there they found a human toe bone more than 1,300 years old.

Passing sniff tests

What is the future of using human-remains detection dogs as a noninvasive tool in archaeology?

Some archaeologists, forensic anthropologists, geologists, scientists — and even H.R.D. dog handlers who know how challenging the work is — say they have great potential. But challenges abound.

Although researchers are learning ever more about the canine olfactory system, they are still trying to pinpoint what volatile organic compounds in human remains are significant to trained dogs.

. . .

Detection dogs also must be trained for archaeology with more consistency. Often humans are the limiting factor. Sometimes, Dr. Schoon said, she can almost see a dog thinking, “Is that all you want me to do? I can do much more!”

For the full story see:

Cat Warren. “Sniffing Out New (Old) Digs.” The New York Times (Tuesday, May 19, 2020 [sic]): D1 & D6.

(Note: ellipses added.)

(Note: the online version of the story was updated May 25, 2020 [sic], and has the title “When Cadaver Dogs Pick Up a Scent, Archaeologists Find Where to Dig.”)

The academic article documenting that dogs are able to use their hypercapable noses to smell ancient human remains is:

Glavaš, Vedrana, and Andrea Pintar. “Human Remains Detection Dogs as a New Prospecting Method in Archaeology.” Journal of Archaeological Method and Theory 26, no. 3 (Sept. 2019): 1106-24.

“Fiennes Is Superb” as “Calmly Eccentric Self-Taught Scholar”

(p. A13) Every now and then a film comes along—not a great one, necessarily—that makes you deeply glad. It’s how I feel about “The Dig.”

. . .

The dig in question has come to be called Sutton Hoo, after its site on the banks of a tidal river in Suffolk. The film, directed by Simon Stone and adapted by Moira Buffini from a John Preston novel about the discovery, plunges us into the adventure by following an unassuming gent named Basil Brown (Ralph Fiennes) as he bicycles his way to the fairly imposing house of Edith Pretty (Carey Mulligan), a widow eager to investigate a mysterious group of mounds on her property. The project calls for an archaeologist—not Indiana Jones, necessarily, but someone with more extensive training than Basil, who was, in real life, the man who made the discovery, and who describes himself here with laconic pride as a lifelong excavator. Yet the nation is preparing for war, no archaeologists are available and Basil will have to do.

Thus does “The Dig” deftly address issues of class—Basil knows more about the history and texture of Suffolk’s soil than any credentialed expert a museum might have sent—while giving us a prime example of an archetype dear to English films, the calmly eccentric self-taught scholar (who of course smokes a pipe). Mr. Fiennes is superb in the role—you’ll be glad to watch him digging away with his shovel, and you’ll be thrilled, as I was, when, after digging for a good while, he shows up at Edith’s door and says, his voice quivering with emotion, “I think you’d better come and see.” (The emotional spectrum of the cinematography, by Mike Eley, ranges from solemn to ecstatic.)

For the full review see:

Joe Morgenstern. “Unearthing a Glittering Tale.” The Wall Street Journal (Friday, Jan. 29, 2021 [sic]): A13.

(Note: ellipsis added.)

(Note: the online version of the review has the date January 28, 2021 [sic], and has the title “‘The Dig’: Unearthing a Glittering Tale.”)

Human Ancestor 1.45-Million Years Ago Was a Victim of Cannibalism

Modern capitalism is sometimes criticized as inferior to a long-ago golden age. But a past golden age is a myth: human ancestors suffered from cannibalism and other violence.

(p. D3) In today’s scholar-eat-scholar world of paleoanthropology, claims of cannibalism are held to exacting standards of evidence. Which is why more than a few eyebrows were raised earlier this week over a study in Scientific Reports asserting that a 1.45-million-year-old fragment of shin bone — found 53 years ago in northern Kenya, and sparsely documented — was an indication that our human ancestors not only butchered their own kind, but were probably, as an accompanying news release put it, “chowing down” on them, too.

The news release described the finding as the “oldest decisive evidence” of such behavior. “The information we have tells us that hominids were likely eating other hominids at least 1.45 million years ago,” Briana Pobiner, a paleoanthropologist at the Smithsonian’s National Museum of Natural History and first author of the paper, said in the news release.

. . .

Dr. Pobiner, an authority on cut marks, had spied the half-tibia fossil six summers ago while examining hominid bones housed in a Nairobi museum vault. She was inspecting the fossil for bite marks when she noticed 11 thin slashes, all angled in the same direction and clustered around a spot where a calf muscle would have attached to the bone — the meatiest chunk of the lower leg, Dr. Pobiner said in an interview.

She sent molds of the scars to Michael Pante, a paleoanthropologist at Colorado State University and an author on the study, who made 3-D scans and compared the shape of the incisions with a database of 898 tooth, trample and butchery marks. The analysis indicated that nine of the markings were consistent with the kind of damage made by stone tools. Dr. Pobiner said that the placement and orientation of the cuts implied that flesh had been stripped from the bone. From those observations she extrapolated her cannibalism thesis.

“From what we can tell, this hominin leg bone is being treated like other animals, which we presume are being eaten based on lots of butchery marks on them,” Dr. Pobiner said. “It makes the most sense to presume that this butchery was also done for the purpose of eating.”

. . .

. . ., clear proof of systematic cannibalism among hominids has emerged in the fossil record. The earliest confirmation was uncovered in 1994 in the Gran Dolina cave site of Spain’s Atapuerca Mountains. The remains of 11 individuals who lived some 800,000 years ago displayed distinctive signs of having been eaten, with bones displaying cuts, fractures where they had been cracked open to expose the marrow and human tooth marks.

Among our other evolutionary cousins now confirmed to have practiced cannibalism are Neanderthals, with whom humans overlapped, and mated, for thousands of years. A study published in 2016 reported that Neanderthal bones found in a cave in Goyet, Belgium, and dated to roughly 40,000 B.C. show signs of being butchered, split and used to sharpen the edges of stone tools. Patterns of bone-breakage in Homo antecessor, considered the last common ancestor of Neanderthals and Homo sapiens, suggest that cannibalism goes back a half-million years or more.

For the full story see:

Franz Lidz. “For Paleoanthropology, Cannibalism Can Be Clickbait.” The New York Times (Tuesday, June 11, 2024): D3.

(Note: ellipses added.)

(Note: the online version of the story was updated July 3, 2023, and has the title “Cannibalism, or ‘Clickbait’ for Paleoanthropology?”)

The study in Scientific Reports mentioned above is:

Pobiner, Briana, Michael Pante, and Trevor Keevil. “Early Pleistocene Cut Marked Hominin Fossil from Koobi Fora, Kenya.” Scientific Reports 13, no. 1 (2023): 9896.

Since Wood Tools Are Rarely Preserved, “Preservation Bias Distorts Our View of Antiquity”

(p. D3) In 1836, Christian Jürgensen Thomsen, a Danish antiquarian, brought the first semblance of order to prehistory, suggesting that the early hominids of Europe had gone through three stages of technological development that were reflected in the production of tools. The basic chronology — Stone Age to Bronze Age to Iron Age — now underpins the archaeology of most of the Old World (and cartoons like “The Flintstones” and “The Croods”).

Thomsen could well have substituted Wood Age for Stone Age, according to Thomas Terberger, an archaeologist and head of research at the Department of Cultural Heritage of Lower Saxony, in Germany.

“We can probably assume that wooden tools have been around just as long as stone ones, that is, two and a half or three million years,” he said. “But since wood deteriorates and rarely survives, preservation bias distorts our view of antiquity.” Primitive stone implements have traditionally characterized the Lower Paleolithic period, which lasted from about 2.7 million years ago to 200,000 years ago. Of the thousands of archaeological sites that can be traced to the era, wood has been recovered from fewer than 10.

Dr. Terberger was team leader of a study published last month in the Proceedings of the National Academy of Sciences that provided the first comprehensive report on the wooden objects excavated from 1994 to 2008 in the peat of an open-pit coal mine near Schöningen, in northern Germany. The rich haul included two dozen complete or fragmented spears (each about as tall as an N.B.A. center) and double-pointed throwing sticks (half the length of a pool cue) but no hominid bones. The objects date from the end of a warm interglacial period 300,000 years ago, about when early Neanderthals were supplanting Homo heidelbergensis, their immediate predecessors in Europe. The projectiles unearthed at the Schöningen site, known as Spear Horizon, are considered the oldest preserved hunting weapons.

For the full story see:

Franz Lidz. “In the Stone Age, Wood Was Pivotal, a Study Says.” The New York Times (Tuesday, May 7, 2024): D3.

(Note: the online version of the story was updated May 6, 2024, and has the title “Was the Stone Age Actually the Wood Age?”)

Terberger’s co-authored academic paper mentioned above is:

Leder, Dirk, Jens Lehmann, Annemieke Milks, Tim Koddenberg, Michael Sietz, Matthias Vogel, Utz Böhner, and Thomas Terberger. “The Wooden Artifacts from Schöningen’s Spear Horizon and Their Place in Human Evolution.” Proceedings of the National Academy of Sciences 121, no. 15 (2024): e2320484121.

Seeds of Plant Mostly Used for Pain Relief in Roman Era, Found Stashed in Buried Bone in “Far-Flung” Province

A couple of thousand years ago some humans had figured out how to use a medicinal plant for effective pain relief. And they did so without having conducted randomized double-blind clinical trials. And no agency of the government blocked them from easing their pain.

(p. D2) . . ., Mr. van Haasteren was cleaning the mud from yet another bone when something unexpected happened: Hundreds of black specks the size of poppy seeds came pouring out from one end.

The specks turned out to be seeds of black henbane, a potently poisonous member of the nightshade family that can be medicinal or hallucinogenic depending on the dosage.  . . .

This “very special” discovery provides the first definitive evidence that Indigenous people living in such a far-flung Roman province had knowledge of black henbane’s powerful properties, said Maaike Groot, an archaeozoologist at the Free University of Berlin and a co-author of a paper published in the journal Antiquity last month describing the finding.

The plant was mostly used during Roman times as an ointment for pain relief, although some sources also reference smoking its seeds or adding its leaves to wine.

For the full story see:

Rachel Nuwer. “Psychedelic Stash: Ancient Seeds Courtesy of a Doctor, or a Doctor Feel Good.” The New York Times (Tuesday, April 9, 2024): D2.

(Note: ellipses added.)

(Note: the online version of the story has the date March 21, 2024, and has the title “Long Before Amsterdam’s Coffee Shops, There Were Hallucinogenic Seeds.”)

The academic paper co-authored by Groot and mentioned above is:

Groot, Maaike, Martijn van Haasteren, and Laura I. Kooistra. “Evidence of the Intentional Use of Black Henbane (Hyoscyamus niger) in the Roman Netherlands.” Antiquity 98, no. 398 (2024): 470-85.

During Black Death Only 7 of 21 Regions of Europe Had Catastrophic Decline in Agricultural Activity

(p. D4) In the mid-1300s, a species of bacteria spread by fleas and rats swept across Asia and Europe, causing deadly cases of bubonic plague. The “Black Death” is one of the most notorious pandemics in historical memory, with many experts estimating that it killed roughly 50 million Europeans, the majority of people across the continent.

“The data is sufficiently widespread and numerous to make it likely that the Black Death swept away around 60 percent of Europe’s population,” Ole Benedictow, a Norwegian historian and one of the leading experts on the plague, wrote in 2005. When Dr. Benedictow published “The Complete History of the Black Death” in 2021, he raised that estimate to 65 percent.

But those figures, based on historical documents from the time, greatly overestimate the true toll of the plague, according to a study published on Thursday [Feb. 10, 2022]. By analyzing ancient deposits of pollen as markers of agricultural activity, researchers from Germany found that the Black Death caused a patchwork of destruction. Some regions of Europe did indeed suffer devastating losses, but other regions held stable, and some even boomed.

. . .

Losing half the population would have turned many farms fallow. Without enough herders to tend livestock, pastures would have become overgrown. Shrubs and trees would have taken over, eventually replaced by mature forests.

If the Black Death did indeed cause such a shift, Dr. Izdebski and his colleagues reasoned, they should be able to see it in the species of pollen that survived from the Middle Ages. Every year, plants release vast amounts of pollen into the air, and some of it ends up on the bottom of lakes and wetlands. Buried in the mud, the grains can survive sometimes for centuries.

To see what pollen had to say about the Black Death, Dr. Izdebski and his colleagues picked out 261 sites across Europe — from Ireland and Spain in the west to Greece and Lithuania in the east — that held grains preserved from around 1250 to 1450.

In some regions, such as Greece and central Italy, the pollen told a story of devastation. Pollen from crops like wheat dwindled. Dandelions and other flowers in pastureland faded. Fast-growing trees like birch appeared, followed by slow-growing ones like oaks.

But that was hardly the rule across Europe. In fact, just seven out of 21 regions the researchers studied underwent a catastrophic shift. In other places, the pollen registered little change at all.

. . .

Monica Green, an independent historian based in Phoenix, speculated that the Black Death might have been caused by two strains of the bacteria Yersinia pestis, which could have caused different levels of devastation. Yersinia DNA collected from medieval skeletons hints at this possibility, she said.

In their study, Dr. Izdebski and his colleagues did not examine that possibility, but they did consider a number of other factors, including the climate and density of populations in different parts of Europe. But none accounted for the pattern they found.

“There is no simple explanation behind that, or even a combination of simple explanations,” Dr. Izdebski said.

. . .

“What we show is that there are a number of factors, and it’s not easy to predict from the beginning which factors will matter,” he said, referring to how viruses can spread. “You cannot assume one mechanism to work everywhere the same way.”

For the full essay see:

Carl Zimmer. “Questioning the Toll Of a 1300s Pandemic.” The New York Times (Tuesday, February 15, 2022 [sic]): D4.

(Note: ellipses, and bracketed date, added.)

(Note: the online version of the essay was updated Feb. 15, 2022 [sic], and has the title “Did the ‘Black Death’ Really Kill Half of Europe? New Research Says No.”)

The book cited above as over-estimating the death toll of the Black Death is:

Benedictow, Ole J. The Complete History of the Black Death. Martlesham, UK: Boydell Press, 2021.

The academic article co-authored by Izdebski and mentioned above is:

Izdebski, A., P. Guzowski, R. Poniat, L. Masci, J. Palli, C. Vignola, M. Bauch, C. Cocozza, R. Fernandes, F. C. Ljungqvist, T. Newfield, A. Seim, D. Abel-Schaad, F. Alba-Sánchez, L. Björkman, A. Brauer, A. Brown, S. Czerwiński, A. Ejarque, M. Fiłoc, A. Florenzano, E. D. Fredh, R. Fyfe, N. Jasiunas, P. Kołaczek, K. Kouli, R. Kozáková, M. Kupryjanowicz, P. Lagerås, M. Lamentowicz, M. Lindbladh, J. A. López-Sáez, R. Luelmo-Lautenschlaeger, K. Marcisz, F. Mazier, S. Mensing, A. M. Mercuri, K. Milecka, Y. Miras, A. M. Noryśkiewicz, E. Novenko, M. Obremska, S. Panajiotidis, M. L. Papadopoulou, A. Pędziszewska, S. Pérez-Díaz, G. Piovesan, A. Pluskowski, P. Pokorny, A. Poska, T. Reitalu, M. Rösch, L. Sadori, C. Sá Ferreira, D. Sebag, M. Słowiński, M. Stančikaitė, N. Stivrins, I. Tunno, S. Veski, A. Wacnik, and A. Masi. “Palaeoecological Data Indicates Land-Use Changes across Europe Linked to Spatial Heterogeneity in Mortality During the Black Death Pandemic.” Nature Ecology & Evolution 6, no. 3 (March 2022): 297-306.

Archeologist Claims Ancient Egyptians Had Advanced Medical Knowledge

(p. A17) Ancient Egyptian doctors were the first to explore and treat cancer, according to scientists who examined two skulls with tumors and found evidence they had been operated on.

. . .

It might never be possible to know whether these two ancient Egyptians were treated for cancer while they were alive, Camarós said. But given the civilization’s advanced medical knowledge—historical and archaeological records show they built prostheses, put in dental fillings and treated traumatic injuries—he is convinced their physicians were attempting surgical interventions.

“They even had a word for tumor,” he said. “And they knew it was something people were dying from.”

For the full story, see:

Aylin Woodward. “Ancient Egyptians Were First To Treat Cancer.” The Wall Street Journal (Thursday, May 30, 2024): A16.

(Note: ellipsis added.)

(Note: the online version of the story has the date May 29, 2024, and has the title “Ancient Egyptians Were First to Treat Cancer. Skulls Show Evidence of Surgery.”)

The study co-authored by Camarós and summarized above is:

Tondini, Tatiana, Albert Isidro, and Edgard Camarós. “Case Report: Boundaries of Oncological and Traumatological Medical Care in Ancient Egypt: New Palaeopathological Insights from Two Human Skulls.” Frontiers in Medicine 11 (2024).

Egyptians May Have Tried Surgery on Brain Cancer 4,600 Years Ago

(p. D2) Scientists led by Edgard Camarós, a paleopathologist at the University of Santiago de Compostela in Spain, were studying an approximately 4,600-year-old Egyptian skull when they found signs of brain cancer and its treatment.

. . .

Using a microscope, he and Tatiana Tondini of the University of Tübingen in Germany and Albert Isidro of the University Hospital Sagrat Cor in Spain, the study’s other authors, found cut marks around the skull’s edges surrounding dozens of lesions that earlier researchers had linked to metastasized brain cancer. The shape of the cuts indicated that they had been made with a metal tool. This discovery, reported in a study published Wednesday [May 29, 2024] in the journal Frontiers in Medicine, suggests that ancient Egyptians studied brain cancer using surgery. If the cuts were made while the person was alive, they may have even attempted to treat it.

. . .

The new discovery not only expands scientific knowledge of Egyptian medicine, it may also push back the timeline of humanity’s documented attempts to treat cancer by up to 1,000 years.

For the full story see:

Jordan Pearson. “An Ongoing Search: In an Ancient Egyptian Skull, Evidence of a Cancer Treatment.” The New York Times (Tuesday, June 4, 2024): D2.

(Note: ellipses, and bracketed date, added.)

(Note: the online version of the story has the date May 29, 2024, and has the title “Ancient Skull With Brain Cancer Preserves Clues to Egyptian Medicine.” Where the wording of the versions differs, the passages quoted above follow the online version.)

The study co-authored by Camarós, and mentioned above, is:

Tondini, Tatiana, Albert Isidro, and Edgard Camarós. “Case Report: Boundaries of Oncological and Traumatological Medical Care in Ancient Egypt: New Palaeopathological Insights from Two Human Skulls.” Frontiers in Medicine 11 (2024). DOI: 10.3389/fmed.2024.1371645.

On the antiquity of cancer, see also:

Haridy, Yara, Florian Witzmann, Patrick Asbach, Rainer R. Schoch, Nadia Fröbisch, and Bruce M. Rothschild. “Triassic Cancer—Osteosarcoma in a 240-Million-Year-Old Stem-Turtle.” JAMA Oncology 5, no. 3 (March 2019): 425-26.

Species Shifting Their Range Due to Climate Change May Have Enabled the “Playing Around With Resources” That Invented Farming

(p. D6) In the 1990s, archaeologists largely concluded that farming in the Fertile Crescent began in Jordan and Israel, a region known as the southern Levant. “The model was that everything started there, and then everything spread out from there, including maybe the people,” said Melinda A. Zeder, a senior research scientist at the Smithsonian National Museum of Natural History.

But in recent years, Dr. Zeder and other archaeologists have overturned that consensus. Their research suggests that people were inventing farming at several sites in the Fertile Crescent at roughly the same time. In the Zagros Mountains of Iran, for example, Dr. Zeder and her colleagues have found evidence of the gradual domestication of wild goats over many centuries around 10,000 years ago.

People may have been cultivating plants earlier than believed, too.

In the 1980s, Dani Nadel, then at Hebrew University, and his colleagues excavated a 23,000-year-old site on the shores of the Sea of Galilee known as Ohalo II. It consisted of half a dozen brush huts. Last year, Dr. Nadel co-authored a study showing that one of the huts contained 150,000 charred seeds and fruits, including many types, such as almonds, grapes and olives, that would later become crops. A stone blade found at Ohalo II seemed to have been used as a sickle to harvest cereals. A stone slab was used to grind the seeds. It seems clear the inhabitants were cultivating wild plants long before farming was thought to have begun.

“We got fixated on the very few things we just happened to see preserved in the archaeological record, and we got this false impression that this was an abrupt change,” Dr. Zeder said. “Now we really understand there was this long period where they’re playing around with resources.”

Many scientists have suggested that humans turned to agriculture under duress. Perhaps the climate of the Near East grew harsh, or perhaps the hunter-gatherer population outstripped the supply of wild foods.

But “playing around with resources” is not the sort of thing people do in times of desperation. Instead, Dr. Zeder argues, agriculture came about as climatic changes shifted the ranges of some wild species of plants and animals into the Near East.

Many different groups began experimenting with ways of producing extra food, which eventually enabled them to start a new way of life: settling down in more stable social groups.

For the full story see:

Carl Zimmer. “The First Farmers.” The New York Times (Tuesday, October 18, 2016 [sic]): D1 & D6.

(Note: the online version of the story has the date Oct. 17, 2016 [sic], and has the title “How the First Farmers Changed History.”)

The 2015 study co-authored by Dani Nadel and mentioned above is:

Snir, Ainit, Dani Nadel, Iris Groman-Yaroslavski, Yoel Melamed, Marcelo Sternberg, Ofer Bar-Yosef, and Ehud Weiss. “The Origin of Cultivation and Proto-Weeds, Long before Neolithic Farming.” PLOS ONE 10, no. 7 (July 22, 2015): e0131422.

Common Ritualistic Human Sacrifice Detracts from the Myth of the Past as a Golden Age

(p. D2) One thing that’s definitely gotten better over time: not as much ritualistic human sacrifice.

. . .

The authors list some run-of-the-mill techniques for human sacrifice, but others they mention are more, let’s say, specific: being crushed under a newly built canoe, or being rolled off the roof of a house and then decapitated.

For the full story see:

Tatiana Schlossberg. “Hierarchies: A Grisly Social Order.” The New York Times (Tuesday, April 5, 2016 [sic]): D2.

(Note: ellipsis added.)

(Note: the online version of the story has the date April 4, 2016 [sic], and has the title “Why Some Societies Practiced Ritual Human Sacrifice.” Where the versions differ, in the passages quoted above I follow the more detailed account in the online version.)

The article quoted above references the following academic article:

Watts, Joseph, Oliver Sheehan, Quentin D. Atkinson, Joseph Bulbulia, and Russell D. Gray. “Ritual Human Sacrifice Promoted and Sustained the Evolution of Stratified Societies.” Nature 532, no. 7598 (April 4, 2016): 228-31.

Observations of Non-Credentialed Citizens Add to “Scientific” Knowledge

(p. D5) In 1811, a 12-year-old girl named Mary Anning discovered a fossil on the beach near her home in southwestern England — the first scientifically identified specimen of an ichthyosaur, a dolphin-like, ocean-dwelling reptile from the time of the dinosaurs. Two centuries later, less than 50 miles away, an 11-year-old girl named Ruby Reynolds found a fossil from another ichthyosaur. It appears to be the largest marine reptile known to science.

Ms. Reynolds, now 15, and her father, Justin Reynolds, have been fossil hunting for 12 years near their home in Braunton, England. On a family outing in May 2020 to the village of Blue Anchor along the estuary of the River Severn, they came across a piece of fossilized bone set on a rock.

“We were both excited as we had never found a piece of fossilized bone as big as this before,” Mr. Reynolds said. His daughter kept searching the beach, he added, “and it wasn’t long before she found another much larger piece of bone.”

They took home the fragments of bone, the largest of which was about eight inches long, and began their research. A 2018 paper provided a hint at what they’d found: In nearby Lilstock, fossil hunters had discovered similar bone fragments, hypothesized to be part of the jaw bone of a massive ichthyosaur that lived roughly 202 million years ago. However, the scientists who’d worked on the Lilstock fossil had deemed that specimen too incomplete to designate a new species.

Mr. Reynolds contacted those researchers: Dean Lomax, at the University of Bristol, and Paul de la Salle, an amateur fossil collector. They joined the Reynolds family on collecting trips in Blue Anchor, digging in the mud with shovels. Ultimately, they found roughly half of a bone that they estimate would have been more than seven feet long when complete.

. . .

Dr. Lomax said that this discovery also highlighted the importance of amateur fossil collectors. “If you have a keen eye, if you have a passion for something like that, you can make discoveries like this,” he said.

Ruby Reynolds said: “I didn’t realize when I first found the piece of ichthyosaur bone how important it was and what it would lead to. I think the role that young people can play in science is to enjoy the journey of exploring as you never know where a discovery may take you.”

For the full story see:

Kate Golembiewski. “Huge Ocean Reptile From Dinosaur Days.” The New York Times (Tuesday, April 30, 2024): D5.

(Note: ellipses added.)

(Note: the online version of the story has the date April 17, 2024, and has the title “An 11-Year-Old Girl’s Fossil Find Is the Largest Known Ocean Reptile.” Where there is a difference in wording between the versions, the passages quoted above follow the online version.)

Lomax co-authored an article with Justin Reynolds and Ruby Reynolds that described and named the huge ocean reptile:

Lomax, Dean R., Paul de la Salle, Marcello Perillo, Justin Reynolds, Ruby Reynolds, and James F. Waldron. “The Last Giants: New Evidence for Giant Late Triassic (Rhaetian) Ichthyosaurs from the UK.” PLOS ONE 19, no. 4 (2024): e0300289.