Biology

Study: “Smarter” dogs think more like humans to overcome their biases

who’s a smart doggo? —

Both the shape of a dog’s head and cognitive ability determine degree of spatial bias.

Look at this very good boy taking a test to determine the origin of his spatial bias for a study on how dogs think.

Eniko Kubinyi

Research has shown that if you point at an object, a dog will interpret the gesture as a directional cue, unlike a human toddler, who will more likely focus on the object itself. It’s called spatial bias, and a recent paper published in the journal Ethology offers potential explanations for why dogs interpret the gesture the way they do. According to researchers at Eötvös Loránd University in Hungary, the phenomenon arises from a combination of how dogs see (visual acuity) and how they think, with “smarter” dog breeds prioritizing an object’s appearance as much as its location. This suggests that the smarter dogs’ information processing is more similar to that of humans.

The authors wanted to investigate whether spatial bias in dogs is sensory or cognitive, or a combination of the two. “Very early on, children interpret the gesture as pointing to the object, while dogs take the pointing as a directional cue,” said co-author Ivaylo Iotchev. “In other words, regardless of the intention of the person giving the cue, the meaning for children and dogs is different. This phenomenon has previously been observed in dogs using a variety of behavioral tests, ranging from simple associative learning to imitation, but it had never been studied per se.”

Their experimental sample consisted of dogs used in a previous 2018 study plus dogs recruited specifically for the new study, for a total of 82 dogs. The most common breeds were border collies (19), vizslas (17), and whippets (6). Each animal was brought into a small empty room with its owner and one of the experimenters present; the experimenter stood 3 meters away from the dog and owner. During a training period with different plastic plates, the dogs learned to associate either an object’s features or its spatial location with the presence or absence of food. The dogs were then tested on a series of tasks.

An object feature conditioning test involving a white round plate and a black square plate.

I.B. Iotchev et al., 2023

For instance, one task gave the dogs a maximum of 50 trials to learn the location of a treat, which was always on either the left or the right plate. For another task, the experimenter placed a white round plate and a black square plate in the middle of the room. The dogs were exposed to each plate in semi-random order but only received food from one type of plate. Learning was measured by how quickly each dog ran to the correct plate.

Once the dogs learned those first two tasks, they were given another more complicated task in which either the direction or the object was reversed: if the treat had previously been placed on the right, now it would be found on the left, and if it had previously been placed on a white round plate, it would now be found on the black square one. The researchers found that dogs learned faster when they had to choose the direction, i.e., whether the treat was located on the left or the right. It was harder for the dogs to learn whether a treat would be found on a black square plate or a white round plate.
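To make the design concrete, here is a toy Python sketch of the bookkeeping behind such a reversal test; the consecutive-correct criterion and the per-trial success probabilities are illustrative assumptions, not the study’s actual protocol:

```python
import random

def trials_to_criterion(p_correct: float, criterion: int = 5, max_trials: int = 50) -> int:
    """Count trials until `criterion` consecutive correct choices occur,
    modeling each trial as a coin flip with success probability `p_correct`.
    Returns max_trials if the criterion is never met. (Illustrative only;
    not the study's exact learning criterion.)"""
    streak = 0
    for trial in range(1, max_trials + 1):
        if random.random() < p_correct:
            streak += 1
            if streak == criterion:
                return trial
        else:
            streak = 0
    return max_trials

# Dogs learned the spatial (left/right) rule faster than the object-feature
# rule, which would show up as fewer trials to criterion:
print(trials_to_criterion(p_correct=0.8))  # e.g., spatial task
print(trials_to_criterion(p_correct=0.6))  # e.g., object-feature task
```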

The shorter a dog’s head, the higher the “cephalic index” (CI).

I.B. Iotchev et al., 2023

Next, the team needed to determine differences between the visual and cognitive abilities of the dogs in order to learn whether the spatial bias was sensory or cognitively based, or both. Selective breeding of dogs has produced breeds with different visual capacities, so another aspect of the study involved measuring the length of a dog’s head, which prior research has shown is correlated with visual acuity. The metric used to measure canine heads is known as the “cephalic index” (CI), defined as the head’s maximum width divided by its maximum length, multiplied by 100.
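As a quick illustration, here is a minimal sketch of that calculation in Python; the skull measurements are hypothetical placeholders, not values from the study:

```python
def cephalic_index(max_width_mm: float, max_length_mm: float) -> float:
    """Cephalic index: (maximum head width / maximum head length) x 100."""
    return max_width_mm / max_length_mm * 100

# Hypothetical measurements (not from the study): a long-headed
# (dolichocephalic) skull vs. a short-headed (brachycephalic) one.
print(cephalic_index(60, 115))  # ~52: longer head, lower CI
print(cephalic_index(95, 110))  # ~86: shorter head, higher CI
```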

The shorter a dog’s head, the more similar its visual acuity is to human vision. That’s because short-headed dogs have a higher concentration of retinal ganglion cells in the center of their field of vision, making their vision sharper and giving them binocular depth perception. The testing showed that dogs with better visual acuity who also scored higher on the series of cognitive tests exhibited less spatial bias. This suggests that canine spatial bias is not simply a sensory matter but is also influenced by how dogs think: “smarter” dogs have less spatial bias.

As always, there are a few caveats. Most notably, the authors acknowledge that their sample consisted exclusively of dogs from Hungary kept as pets, and thus their results might not generalize to stray dogs, for example, or dogs from other geographical regions and cultures. Still, “we tested their memory, attention skills, and perseverance,” said co-author Eniko Kubinyi. “We found that dogs with better cognitive performance in the more difficult spatial bias task linked information to objects as easily as to places. We also see that as children develop, spatial bias decreases with increasing intelligence.”

Ethology, 2023. DOI: 10.1111/eth.13423

Contact-tracing software could accurately gauge COVID-19 risk

As it turns out, epidemiology works —

Time spent with infected individuals is a key determinant of risk.

It’s summer 2021. You rent a house in the countryside with a bunch of friends for someone’s birthday. The weather’s gorgeous that weekend, so mostly you’re all outside—pool, firepit, hammock, etc.—but you do all sleep in the same house. And then on Tuesday, you get an alert on your phone that you’ve been exposed to SARS-CoV-2, the virus that causes COVID-19. How likely are you to now have it?

To answer that question, a group of statisticians, data scientists, computer scientists, and epidemiologists in the UK analyzed 7 million people who were notified that they were exposed to COVID-19 by the NHS COVID-19 app in England and Wales between April 2021 and February 2022. They wanted to know if—and how—these app notifications correlated to actual disease transmission. Analyses like this can help ensure that an app designed for the next pathogen could retain efficacy while minimizing social and economic burdens. And it can tell us more about the dynamics of SARS-CoV-2 transmission.

Over 20 million quarantine requests

The NHS COVID-19 app was active on 13 to 18 million smartphones per day in 2021. It used Bluetooth signals to estimate the proximity between those smartphones while maintaining privacy and then alerted people who spent 15 minutes or more at a distance of 2 meters or less from a confirmed case. This led to over 20 million such alerts, each of which came with a request to quarantine—quite a burden.

The researchers found that the app did, in fact, accurately translate the duration and proximity of a COVID-19 exposure to a relevant epidemiological risk score. The app assessed a contact’s risk by multiplying the length of contact, the proximity of contact, and the infectiousness of the index case as determined by how long it had been since the index case started showing symptoms or tested positive.
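In outline, that scoring rule is a product of three factors. Here is a minimal sketch of the idea in Python, with made-up weighting functions standing in for the app’s actual calibrated ones:

```python
def risk_score(duration_min: float, distance_m: float, days_since_onset: float) -> float:
    """Toy contact-risk score: duration x proximity weight x infectiousness.
    The weighting functions are illustrative placeholders, not the NHS
    COVID-19 app's calibrated values."""
    proximity_weight = 1.0 / max(distance_m, 0.5) ** 2  # closer contact -> higher weight
    # Toy infectiousness curve: peaks near symptom onset, decays over ~10 days.
    infectiousness = max(0.0, 1.0 - abs(days_since_onset) / 10)
    return duration_min * proximity_weight * infectiousness

# A long, close contact with a recently symptomatic case scores high...
print(risk_score(duration_min=120, distance_m=1.0, days_since_onset=1))  # 108.0
# ...while a brief, distant contact scores low.
print(risk_score(duration_min=5, distance_m=2.0, days_since_onset=8))    # 0.25
```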

There was an increasing probability of reported infection as the app’s risk score increased: more of the contacts whom the app deemed to be at high transmission risk went on to test positive for COVID-19 within the following two weeks than those who were notified but assigned lower risk levels. (That counts only positive tests reported through the app. Some of the high-risk people probably did not test at all, did not report their test results, or did not report them within the allotted time, so this is likely an underestimate of the correlation between notified risk and infection.)

More exposure = higher risk

When the researchers separated the factors contributing to the risk of an exposure, they found that duration was the most important indicator. Household exposures accounted for 6 percent of all contacts but 41 percent of transmissions.

One caveat: The app didn’t record any contextual variables that are known to affect transmission risk, such as whether people lived in an urban or rural area, whether the contact occurred indoors or outdoors, whether it took place during the week or over the weekend, and whether anyone involved was vaccinated. Including such data could make risk assessment more accurate.

Based on their work, the researchers suggest that an “Amber Alert” stage could have been introduced to the app, in which people deemed to have an interim degree of risk would be guided to get a PCR test rather than immediately jumping to quarantine. Including this intermediate Amber Alert population could have significantly reduced the socioeconomic costs of contact tracing while retaining its epidemiological impact or could have increased its effectiveness for a similar cost. Performing analyses like this early on in the next pandemic to determine how it is transmitted might minimize illness and strain on society.
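Extending the sketch above, a tiered scheme amounts to adding a middle band between “no action” and “quarantine.” The cutoff values below are entirely hypothetical:

```python
def alert_level(score: float, amber: float = 50.0, red: float = 200.0) -> str:
    """Map a contact-risk score to a tiered response (thresholds hypothetical)."""
    if score >= red:
        return "red: quarantine"
    if score >= amber:
        return "amber: take a PCR test; quarantine only if positive"
    return "green: no action"

print(alert_level(108.0))  # amber: take a PCR test; quarantine only if positive
print(alert_level(0.25))   # green: no action
```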

Nature, 2023. DOI: 10.1038/s41586-023-06952-2

Human brain cells put much more energy into signaling

Being human is hard —

Signaling molecules help modulate the brain’s overall activity.

Indian elephants have larger brains than we do (obviously). Mice have a higher brain-to-body mass ratio, and long-finned pilot whales have more neurons. So what makes humans—and more specifically, human brains—special?

As far as organs go, human brains certainly consume a ton of energy—almost 50 grams of sugar, or 12 lumps, every day. This is one of the highest energy demands relative to body metabolism known among species. But what uses up all of this energy? If the human brain is the predicted size and has the predicted number of neurons for a primate of its size, and each individual neuron uses comparable amounts of energy to those in other mammals, then its energy use shouldn’t be exceptional.

The cost of signaling

A group of neuroscientists speculated that the amount of signaling that takes place within the human brain might account for its heightened energy needs. A consequence of this would be that brain regions that are more highly connected and do more signaling would use more energy.

To test their hypothesis, the scientists started by imaging the brains of 30 healthy, right-handed volunteers between 20 and 50 years old. The imaging took place at two separate institutions, and it allowed the researchers to correlate a given brain region’s energy use (as measured by glucose metabolism) with its level of signaling and connectivity. They found that energy use and signaling scaled in tandem in all 30 brains, but certain regions stood out: signaling pathways in certain areas at the front of the brain require almost 70 percent more energy than those in sensory-motor regions.
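Conceptually, the analysis reduces to a region-by-region correlation between energy use and connectivity. Here is a minimal sketch of that kind of computation in Python, with randomly generated stand-ins for the actual imaging-derived measurements:

```python
import numpy as np

rng = np.random.default_rng(0)
n_regions = 100

# Made-up per-region values standing in for the real imaging measurements:
connectivity = rng.random(n_regions)  # signaling/connectivity level per region
energy_use = 2.0 * connectivity + rng.normal(0.0, 0.2, n_regions)  # scales with it

# Pearson correlation across regions between energy use and connectivity:
r = np.corrcoef(connectivity, energy_use)[0, 1]
print(f"region-wise correlation: r = {r:.2f}")  # close to 1 for this toy data
```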

The frontal cortex is one of the regions that expanded the most during human evolution. According to Robert Sapolsky, “What the prefrontal cortex is most about is making tough decisions in the face of temptation—gratification postponement, long-term planning, impulse control, emotional regulation. The PFC is essential for getting you to do the right thing when it is the harder thing to do.” This is the stuff that humans must constantly contend with. And energetically, it is extraordinarily costly.

Increased modulation is also key for cognition

It is not only signaling that takes energy; modulating that signaling, ensuring that it occurs at the appropriate levels and only at the appropriate times, costs energy, too.

Using the Allen Human Brain Atlas, these researchers looked at gene activity in the frontal cortex. They found elevated activity of neuromodulators and their receptors. The authors note that “the human brain spends excessive energy on the long-lasting regulation of (fast) neurotransmission with (slow) neuromodulators such as serotonin, dopamine, or noradrenaline.” And also endogenous opiates. “This effect is more about setting the tone of general excitability than transferring individual bits of information,” they write.

Once they correlated energy use to signaling and slow-acting neuromodulation in the cortex, the last thing the scientists did was look at the Neurosynth project, which maps cognitive functions to brain regions. Lo and behold, the energy-hogging, highly connected, strongly modulated, and evolutionarily expanded parts of the cortex are the same ones involved in complex functions like memory processing, reading, and cognitive inhibition. This supports their idea of “an expensive signaling architecture being dedicated to human cognition.”

Science Advances, 2023. DOI: 10.1126/sciadv.adi7632

What happens in a crow’s brain when it uses tools? 

This is your brain on tools —

Researchers trace the areas of the brain that are active when birds are using tools.

Sure, they can use tools, but do they know where the nearest subway stop is?

“A thirsty crow wanted water from a pitcher, so he filled it with pebbles to raise the water level to drink,” goes a famous Aesop fable. While this tale is thousands of years old, animal behaviorists still use this challenge to study corvids (a group that includes crows, ravens, jays, and magpies) and their use of tools. In a recent Nature Communications study, researchers from a collaboration of universities across Washington, Florida, and Utah used a radioactive tracer to see which parts of American crows’ brains were active when the birds used stones to obtain food from the bottom of a water-filled tube.

Their results indicate that the motor learning and tactile control centers were activated in the brains of the more proficient crows, while the sensory and higher-order processing centers lit up in the brains of less proficient crows. These results suggest that competence with tools is linked to certain memories and muscle control, which the researchers claimed is similar to a ski jumper visualizing the course before jumping.

The researchers also found that out of their avian test subjects, female crows were especially proficient at tool usage, succeeding in the challenge quickly. “[A] follow-up question is whether female crows actually have more need for creative thinking relative to male crows,” elaborates Loma Pendergraft, the study’s first author and a graduate student at the University of Washington, who wants to understand if the caregiving and less dominant role of female crows gives them a higher capacity for tool use.

While only two species of crow (the New Caledonian crow and the Hawaiian crow) inherently use twigs and sticks as foraging tools, this study also suggests that other crow species, like the American crow, have the neural flexibility to learn to use tools.

A less invasive look at bird brains

Due to their unique behaviors, complex social structures, and reported intelligence, crows have fascinated animal behaviorists for decades. Scientists can study crows’ brains by injecting 18F-fluorodeoxyglucose (FDG), a radioactive tracer, and then using positron emission tomography (PET) scans to see which brain areas were activated during different tasks.

“FDG-PET is a method we use to remotely examine activity throughout the entire brain without needing to do any surgeries or implants,” explained Pendergraft. “It’s like [a functional] MRI.” The FDG-PET method is non-invasive, and because the crows aren’t required to sit still while the tracer is taken up, it minimizes the stress the crows feel during the experiment. In the Nature Communications study, Pendergraft and his team anesthetized the crows before scanning them.

FDG is also used in medical imaging, for applications such as diagnosing Alzheimer’s disease or screening for cancerous tissue. “Basically, the body treats it as glucose, a substance needed for cells to stay alive,” Pendergraft added. “If a body part is working harder than normal, it’s going to need extra glucose to power the additional activity. This means we can measure relative FDG concentrations within the brain as a proxy for relative brain activity.”

Worm’s rear end develops its own head, wanders off to mate

Butt what? —

The butt even grows its own eyes, antennae, and brain.

From left to right, the head of an actual worm, and the stolon of a male and female.

Some do it horizontally, some do it vertically, some do it sexually, and some asexually. Then there are some organisms that would rather grow a butt that develops into an autonomous appendage equipped with its own antennae, eyes, and brain. This appendage will detach from the main body and swim away, carrying gonads that will merge with those from other disembodied rear ends and give rise to a new generation.

Wait, what in the science fiction B-movie alien star system is this thing?

Megasyllis nipponica really exists on Earth. Otherwise known as the Japanese green syllid worm, it reproduces by a process known as stolonization, which sounds like the brainchild of a sci-fi horror genius but evolved in some annelid (segmented) worms to give future generations the best chance at survival. What was still a mystery (until now) was exactly how that bizarre appendage, or stolon, could form its own head in the middle of the worm’s body. Turns out this is a wonder of gene regulation.

Butt how?

Led by evolutionary biologist and professor Toru Miura of the University of Tokyo, a team of scientists discovered the genetic mechanism behind the formation of the stolon. It starts with Hox genes. These are a set of genes that help determine which segments of an embryo will become the head, thorax, abdomen, and so on. In annelid worms like M. nipponica, different Hox genes regulate the segments that make up the worm’s entire body.

Miura and his colleagues expected the activity of Hox genes to differ between the anterior and posterior of the worm. Instead, they found that it is not the Hox genes that control the stolon’s segments but gonad development that alters their identity. “These findings suggest that during stolonization, gonad development induces the head formation of a stolon, without up-regulation of anterior Hox genes,” the team said in a study recently published in Scientific Reports.

The anterior part, or stock, of M. nipponica is neither male nor female. The worm has organs called gonad primordia on the underside of its posterior end. When the primordia start maturing into oocytes or testes, head-formation genes (different from the Hox genes), which are also responsible for forming a head in other creatures, become active in the middle of the stock body.

This is when the stolon starts to develop a head. Its head grows a cluster of nerve cells that serve as a brain, along with a central nervous system that extends throughout its body. The stolon’s own eyes, antennae, and swimming bristles also emerge.

Left behind

Before a stolon can take off on its own, it has to develop enough to be fully capable of swimming autonomously and finding its way to another stolon of the opposite sex. The fully developed stolon looks like an alien being attached to the rest of the worm’s body. Besides its own nervous system and something comparable to a brain, it has two pairs of bulging eyes, two pairs of antennae, and its own digestive tube. Those eyes are enlarged for a reason: the swimming stolon often needs to navigate murky waters.

The antennae of the stolon can sense the environment around them, but the researchers suggest that they have a more important function—picking up on pheromones released by the opposite sex. The stolon still isn’t an exact duplication of the stock. It doesn’t have some of the worm’s most sophisticated features, such as a digestive tube with several specialized regions, probably because its purpose is exclusively to spawn. It dies off soon after.

So what could have made stolonization evolve in the first place? Further research is needed, but for now it is thought that this strange capability might have arisen in some annelid worms when expression of the head-forming genes shifted further down the body. Why that shift evolved to begin with is still unknown.

The worm also regenerates stolons at a high rate, which may give it the best chance at propagating its species. Hold onto your butts.

Scientific Reports, 2023. DOI: 10.1038/s41598-023-46358-8
